Google Clips gets better at capturing candids of hugs and kisses (which is not creepy, right?) – TechCrunch
Google Clips’ AI-powered “smart camera” just got even smarter, Google announced today, revealing improved functionality around Clips’ ability to automatically capture specific moments – like hugs and kisses. Or jumps and dance moves. You know, in case you want to document all your special, private moments in a totally non-creepy way.
I kid, I kid!
Well, not entirely. Let me explain.
Look, Google Clips comes across to me more as a proof-of-concept device that showcases the power of artificial intelligence applied to the world of photography than as a breakthrough consumer device.
I’m the target market for this camera – a parent and a pet owner (and look how cute she is) – but I have no desire at all for a smart camera designed to capture those tough-to-photograph moments, even though neither my kid nor my pet will sit still for pictures.
I’ve tried to articulate this feeling, and I find it’s hard to say exactly why I don’t want this thing. It’s not because the photos are automatically uploaded to the cloud or made public – they are not. They are saved to the camera’s 16 GB of onboard storage and can be reviewed later with your phone, where you can then choose to keep them, share them, or delete them. And it’s not even entirely because of the price point – though, arguably, even with the recent $50 discount, it’s quite an expensive toy at $199.
Maybe it’s just the camera’s premise.
That in order for us to fully enjoy a moment, we have to capture it. And because some moments are so difficult to capture, we spend too much time with phone-in-hand, instead of actually living our lives – like playing with our kids or throwing the ball for the dog, for example. And that the only solution to this problem is more technology. Not just putting the damn phone down.
What also irks me is the broader idea behind Clips that all our precious moments have to be photographed or saved as videos. They do not. Some are meant to be ephemeral. Some are meant to be memories. In aggregate, our hearts and minds tally up all these little life moments – a hug, a kiss, a smile – and then turn them into feelings. Bonds. Love. It’s okay to miss capturing every single one.
I’m telling you, it’s okay.
At the end of the day, there are only a few times I would have even considered using this product – when my baby was taking her first steps, and I was worried it would happen while my phone was away. Or maybe some big event, like a birthday party, where I wanted candids but had too much going on to take photos. But even in these moments, I’d rather prop my phone up and turn on a “Google Clips” camera mode than shell out hundreds for a dedicated device.
You may feel differently. That’s cool. To each their own.
Anyway, what I think is most interesting about Clips is the actual technology. That it can view things captured through a camera lens and determine the interesting bits – and that it’s already getting better at this, only months after its release. That we’re teaching AI to understand what’s actually interesting to us humans, with all our subjective opinions. That sort of technology has all kinds of practical applications beyond a physical camera that takes spy shots of Fido.
The improved functionality is rolling out to Clips with the May update, and will soon be followed by support for family pairing, which will let multiple family members connect the camera to their own devices to view content.
Here’s an intro to Clips, if you missed it the first time. Note that it’s currently on sale for $199. Yeah, already. Hmmm.