6 months of dev time for one interaction. Many companies would never choose to take such a bold path. I applaud the team for their focus on user privacy as well as their incredibly innovative solution. I'm impressed!
Thank you, this is something we've been working on for some time, as you noted.
We like to cook up crazy ideas like this and then let them simmer, knowing that eventually, we'll find a use for them!
Couldn't have said it better: bold and expensive. Applause to the team for their persistence, thorough work, and dedication on all aspects!
This is effin awesome!
Loving it both in terms of thinking outside the box, and execution. It's things like this that make me excited about work. Top marks guys!
Very creative thinking. This is much better than the "notch drawer" nonsense.
Anyone concerned that Apple won't approve this?
They won't even check it, sadly!
Beme (before it shut down) had a similar concept: it used the front-facing camera to activate the rear-facing camera. So they certainly have approved similar interactions in the past.
I thought beme used iPhone’s proximity/light sensor.
Yep, it used the proximity sensor, not the camera.
I'm impressed. Clever solution, and I'm glad they put a focus on battery life. Merely having the camera app open is usually very power hungry. I'd love to see a comparison of battery life with this enabled vs not. If their claims are true, this is incredible.
True. I thought the same about the battery.
Really smart idea. Way to work around your limitations. Would love for this to become built into iOS.
It's a clever idea, but I do see a problem of increased image-processing load and battery drain. Cameras are not cheap in terms of energy consumption.
Now that is clever.
This is ridiculously amazing.
The context in which it's used is interesting - I'd really like to see the onboarding of this feature. In terms of accessibility for other apps and other audiences - rather than struggling with "which menu gets the camera working?!", simply tap on the thing itself.
Uh. This is wonderful.
Did you consider the ambient light sensor? I guess it does the exact same thing, but I'm not sure how it handles different lighting conditions.
Having never used Luna Display before, I'm left wondering how you would have used the home button in the first place. Also, why not use the left and right edges of the screen? While not scientific, on the handful of apps I tried, the left and right edges were both available (no interactions occurred while swiping). Not only would this have called up the edge menu in a consistent and discoverable way, it could work for both left- and right-handed individuals by having the menu appear no matter which edge you chose.
Clever, but it seems like you'd constantly be coating your front-facing camera with oil and fingerprints. Millennials will not be happy about this.