We’ve all heard the arguments that our lives have become irrevocably mediated by screens and camera phones — that the more we document and publish moments, the less we actually live them. So when Elise Hu over at All Tech Considered got a Narrative Clip in the mail, I was curious.
This discreet, wearable camera (here’s what it looks like) is supposed to get you out from behind the screen and into your life — by automatically snapping a photo every 30 seconds so you don’t have to. It then saves the scenes it algorithmically deems important and trashes the rest.
If you don’t trust the Clip’s editorial sensibility, you can also tap it twice to take a photo manually. So I wore it around the office for an afternoon and let it roll. Later, I went for a run and attempted a photo series: I wanted to document every instance of street art I passed on the trail, as well as every passer-by. Here is the single shot of street art that was saved (obscured by my finger):
Here’s the only runner that survived the algorithm:
The vast majority of the other 500-ish images looked a lot like this:
Some warpy buildings:
Some trippy trees:
It’s kinda cool-looking, if you’re into experimental photo projects. But I can’t say it was a time-saver (I still looked through every photo to find these), or that it’s reliable (there’s no telling whether it will capture the right moments), or that it delivers on the promise of “photographic memory.” If you do have memories grounded in photos, it’s probably because you actively decided to capture that scene, or you’re in it, or you once held it in your hands. Remove all that, and it sounds like an algorithm for more digital clutter.
Here’s one interesting photo the Narrative Clip did take: a photo of me taking a photo with my iPhone — of a beached boat named “Hi Hopes” — which I then posted to Instagram. It’s almost as if it knew.