
rafark

macrumors 68000
Original poster
Sep 1, 2017
1,746
2,944
As I mentioned in another post, most people think of the TrueDepth camera system as just Face ID and poop Animoji; however, the technology has the potential to be used in many useful ways beyond the dark and scary things people usually think of.

I imagine:

-Facial gestures. You're given a confirmation screen/alert and you can accept it by nodding your head up and down or cancel it by turning your head left to right. Or maybe you're watching a movie and get a new notification that lets you reply or dismiss it the same way, by moving your head up/down or left/right.

-Eye tracking. Perhaps an interface that uses your gaze to adapt the contents of the screen.

-Facial gestures and behavioral learning. You're browsing your social feed and the app uses your facial expressions to build up a profile. When you smile at a picture or post, the app uses that post to find similar posts to show you in the future. Same thing but the opposite when you make an angry or uncomfortable expression while viewing a certain post (see the rough sketch after this list). This technology could really serve as the foundation for a system that knows what you like and what you don't. Basically, the computer can now get to know you deeply.

-Photography apps. What about an app that can change your facial expression in any image of yourself?

-Games and FX apps.
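
To give an idea of what the smile part of the behavioral-learning idea could look like for a developer, here's a minimal sketch using ARKit's face-tracking blend shapes. This is just an illustration under my own assumptions: the ExpressionReader class, the 0.5 threshold, and the print statement are placeholders I made up, not an existing API beyond ARKit itself.

import ARKit

// Minimal sketch: use the TrueDepth camera via ARKit face tracking to
// guess whether the user is smiling at whatever is on screen right now.
// "ExpressionReader" and the 0.5 threshold are illustrative placeholders.
final class ExpressionReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit calls this every time the tracked face anchor updates.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Blend-shape coefficients run from 0.0 (neutral) to 1.0 (fully expressed).
            let left = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            let right = face.blendShapes[.mouthSmileRight]?.floatValue ?? 0
            if (left + right) / 2 > 0.5 {
                // Hypothetical hook: tell the feed the user reacted positively.
                print("User appears to be smiling at the current post")
            }
        }
    }
}

The same blendShapes dictionary exposes dozens of other coefficients (jaw, brows, blinks), so the gesture-confirmation idea above could be built on the same data.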

The possibilities are endless, and the technology has the potential to bring to the table things we've only seen in science fiction movies.

So what else do you think can be done with the TrueDepth camera?
 

KGB7

Suspended
Jun 15, 2017
925
753
Rockville, MD
What if I’m watching a movie on the TV while using my phone? That would make the tech kind of useless.
There are many examples I can bring up of why face gestures or eye tracking would not work while I’m multitasking.
 

Rigby

macrumors 603
Aug 5, 2008
6,225
10,170
San Jose, CA
-Facial gestures. You're given a confirmation screen/alert and you can accept it by nodding your head up and down or cancel it by turning your head left to right. Or maybe you're watching a movie and get a new notification that lets you reply or dismiss it the same way, by moving your head up/down or left/right.
How would that be easier or better than tapping a button on the screen?
-Eye tracking. Perhaps an interface that uses your gaze to adapt the contents of the screen.
I don't see how a dot projector would help with that. Eye tracking can be done with a regular camera (see e.g. here).
-Facial gestures and behavioral learning. You're browsing your social feed and the app uses your facial expressions to build up a profile. When you smile at a picture or post, the app uses that post to find similar posts to show you in the future. Same thing but the opposite when you make an angry or uncomfortable expression while viewing a certain post. This technology could really serve as the foundation for a system that knows what you like and what you don't. Basically, the computer can now get to know you deeply.
Creep factor is too high. I usually don't make faces at my phone anyway.
-Photography apps. What about an app that can change your facial expression in any image of yourself?
Sounds like a variation of the poop emoji. ;)
-Games and FX apps.
More specifically?

Maybe someone will come up with a brilliant idea, but I'm not convinced that the system is all that useful in its current limited form. It would be cool if it could be used as a general 3D object scanner, but apparently the ML model is trained for faces only.
 

fred98tj

macrumors 6502a
Jul 9, 2017
575
380
Central Luzon, Philippines
Some of this is old tech, such as eye tracking. The Canon A2E (a 35mm film camera) had eye tracking years ago. It would follow your eye in the viewfinder and focus on whatever you looked at. Worked well. And that came out in 1992.
 

44267547

Cancelled
Jul 12, 2016
37,642
42,491
Some of this is old tech, such as eye tracking. The Canon A2E (a 35mm film camera) had eye tracking years ago. It would follow your eye in the viewfinder and focus on whatever you looked at. Worked well. And that came out in 1992.

But I imagine this new TrueDepth camera is far more advanced and will offer many more capabilities for tracking and recognition compared to Canon's. And I doubt many today were aware the technology you are referring to existed back in 1992.
 

DNichter

macrumors G3
Apr 27, 2015
9,385
11,183
Philadelphia, PA
I trust that developers will come up with a ton of useful things to use the TrueDepth camera for. It's very advanced tech that nobody else has right now. I'll leave it up to them, though, as I'm not that clever.
 