
subjonas

macrumors 603
Feb 10, 2014
5,554
5,883
This probably isn’t something a third party could do since it’s more system-wide—but as someone who draws digitally, I wish you could take any window in VP (especially the virtual Mac display), and pin it precisely to a physical flat surface so that you can draw on that virtual window accurately with pen pressure sensitivity. Maybe this would be possible with an Apple Pencil, but I’d think it would require a special stylus made for the VP that can be more accurately tracked. (A rumor does say Apple may be working on a stylus for the VP.) This is probably another reason why only Apple would be able to implement this.
And not just stationary surfaces, but also a portable slate. I would think this portable slate (and maybe stationary surfaces too) would also need special hardware to help the VP track its exact position, so that the virtual window stays perfectly aligned to the physical surface. Maybe this could be done with an iPad if it can somehow communicate its exact position to the VP, in which case I'm sure a regular Apple Pencil would suffice, since the iPad would know exactly where the Pencil is with Hover.
The key to this and probably the hardest part is super accurate tracking of the surface and the stylus. If it’s not perfectly accurate, it would probably be useless to me.

I wish for this because, from what I hear, looking at a physical screen (like a Cintiq) through the VP pass-through is not good, especially for long periods. As someone who wanted to use the VP mainly to aid with my work, this is unfortunately probably a deal-breaker.
 

ddvmor

macrumors newbie
Nov 1, 2011
24
8
Less an app and more of a quality-of-life feature suggestion: if you look at your (linked) iPhone in pass-through, it could automatically overlay the phone's screen with a sharper, higher-resolution version than the pass-through itself is capable of.
 

riverfreak

macrumors 68000
Jan 10, 2005
1,828
2,289
Thonglor, Krung Thep Maha Nakhon
Less an app and more of a quality-of-life feature suggestion: if you look at your (linked) iPhone in pass-through, it could automatically overlay the phone's screen with a sharper, higher-resolution version than the pass-through itself is capable of.
Or, if you look at your phone through your Optic ID-unlocked VP, shouldn't your phone unlock too?
 

jclardy

macrumors 601
Oct 6, 2008
4,164
4,393
DAG visualization/ navigation would be amazing (as would any visualization of data relationships / data enrichment).

You just do the dev in your normal dev enviro, yes? I don’t develop mobile apps on my iPhone…
Your iPhone doesn't have a keyboard and infinite screens around your space. This is the same argument that was used against the iPad, which would actually make an incredible portable development machine if Apple gave it just a hint of development support. I use Swift Playgrounds on the iPad a lot for quick UI sketches, as it is honestly much faster than setting up a full project in Xcode on an M1 Mac.
You can do that on a laptop or desktop. You don’t need an IDE or terminal on device.

Nobody writes iPhone apps on an iPhone. But you know that already?

Oh, and for clarity, it’s spatial.
The AVP is a vision of the future of computing, why should people have to tote around a laptop and a Vision Pro to build software? In 5-10 years when Vision goggles can fit in your pocket and you have a portable folding keyboard/trackpad, why would you use a laptop at that point?
 
  • Like
Reactions: Night Spring

subjonas

macrumors 603
Feb 10, 2014
5,554
5,883
Did anyone say a lightsaber? If not: a lightsaber. But of course it needs to work with other VP users who also have the app. I'm not sure Apple has set up that kind of multi-VP functionality yet.
 
  • Like
Reactions: Chuckeee

subjonas

macrumors 603
Feb 10, 2014
5,554
5,883
This probably isn’t something a third party could do since it’s more system-wide—but as someone who draws digitally, I wish you could take any window in VP (especially the virtual Mac display), and pin it precisely to a physical flat surface so that you can draw on that virtual window accurately with pen pressure sensitivity. Maybe this would be possible with an Apple Pencil, but I’d think it would require a special stylus made for the VP that can be more accurately tracked. (A rumor does say Apple may be working on a stylus for the VP.) This is probably another reason why only Apple would be able to implement this.
And not just stationary surfaces, but also a portable slate. I would think this portable slate (and maybe stationary surfaces too) would also need special hardware to help the VP track its exact position, so that the virtual window stays perfectly aligned to the physical surface. Maybe this could be done with an iPad if it can somehow communicate its exact position to the VP, in which case I'm sure a regular Apple Pencil would suffice, since the iPad would know exactly where the Pencil is with Hover.
The key to this and probably the hardest part is super accurate tracking of the surface and the stylus. If it’s not perfectly accurate, it would probably be useless to me.

I wish for this because, from what I hear, looking at a physical screen (like a Cintiq) through the VP pass-through is not good, especially for long periods. As someone who wanted to use the VP mainly to aid with my work, this is unfortunately probably a deal-breaker.
A more general use version of the above feature would be:
the ability to anchor any virtual object (such as an app window) to any portable physical object (such as a laptop or drawing tablet, although the object may need special hardware to track).

In my case, I would use it not only to position a virtual drawing app window onto a physical drawing tablet as described in the above post, but also to position other floating windows around the tablet, essentially virtually extending the size of the tablet for elements that don’t require a physical surface for pen accuracy and pressure sensitivity. For example, if I could, I would move my drawing app’s tool bar window and layers window off of the tablet into virtual space since those can be easily selected with touch if needed (I know this isn’t currently possible for multiple reasons). And as I move the tablet, or as I myself move with the tablet, all the floating windows stay locked in position relative to the tablet, easily within reach.

That’s my use case, but I think the general feature of locking virtual objects to portable physical ones could be useful for a wide range of use cases.
 
Last edited:
  • Like
Reactions: Tdude96

Piggie

macrumors G3
Feb 23, 2010
9,121
4,022
AR pets that run around your home realistically.
You need to feed them every day and can teach them tricks:
rolling a ball, fetching a toy.
An absolute no-brainer of an app that someone needs to create.
 
  • Like
Reactions: Tdude96

fatTribble

macrumors 65816
Sep 21, 2018
1,420
3,893
Ohio
I’d like to see the ability to create your own Environments and then share those publicly with location data included. Then looking at a map you could see what the view would be like from that location. I think this might have been done already in other forms but would look sharp on AVP.
 

CrysisDeu

macrumors 6502a
Sep 16, 2018
600
876
Spatial user interfaces for controlling smart home devices:
For example, if I want to lower the light a bit, I can:
1. look at the lights and turn the Digital Crown to control the brightness
2. not look at the lights and point my hand towards the lights, and drag down to lower the brightness

If I want to close the window blinds, I can see a virtual blind, drag it to how far I want it closed, and the real blind will go to that position.

If I need my robot vacuum to clean a certain area, I can draw out a real-life area and ask it to clean it.

If I am cooking and have dirty hands, I can turn a virtual knob to control the range hood and faucet

Basically, not needing to know what a smart home device is named, or having to locate it in the Home app, in order to control devices. It feels like having magic powers to control things at a distance.
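As a toy sketch of the drag-to-dim mapping in point 2 above (the `Dimmable` type and the gesture units are invented for illustration; nothing here touches real HomeKit APIs):

```swift
import Foundation

// Toy sketch: a vertical drag (or Digital Crown turn) maps to a
// brightness change, clamped to the valid 0...1 range.

struct Dimmable {
    var brightness: Double  // 0.0 (off) ... 1.0 (full)

    // delta: positive = drag up / crown forward, in "full sweeps"
    // (a delta of 1.0 spans the whole brightness range).
    mutating func adjust(by delta: Double) {
        brightness = min(1.0, max(0.0, brightness + delta))
    }
}

var lamp = Dimmable(brightness: 0.75)
lamp.adjust(by: -0.25)  // drag down a bit
print(lamp.brightness)  // 0.5
```

Clamping matters here: a big flick of the hand should pin the light at fully off or fully on rather than wrapping or erroring.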
 

CrysisDeu

macrumors 6502a
Sep 16, 2018
600
876
Less an app and more of a quality-of-life feature suggestion: if you look at your (linked) iPhone in pass-through, it could automatically overlay the phone's screen with a sharper, higher-resolution version than the pass-through itself is capable of.
Better yet, it should do this for ALL Apple devices: iPhone, Apple Watch, iPad, and even Apple TV.
And then add ornaments to the screens; for example, more info surrounding the Apple Watch when you raise your wrist, Stage Manager OUTSIDE of the mirrored Mac screen, or iPad-sized apps on iPhones (like a digital foldable).
 
  • Like
Reactions: martens

subjonas

macrumors 603
Feb 10, 2014
5,554
5,883
Something where our solar system is displayed and an autonomous space launch is shown sped up in time to view the trajectory of the spacecraft. Example: New Horizons' launch to the Pluto flyby. Augment the display with vectors of the spacecraft's coordinate system, the Earth's and Sun's coordinate systems, and the other planets' coordinate systems.
I think that those who make the star gazer apps can take it to a whole new level. They could suspend the user in space at their current latitude and longitude and give them a full view of their surroundings.
I think a great use case of VR and maybe also AR is to just be able to appreciate scales that aren't feasible to experience in the real world.

Maybe this is what you two were getting at, and maybe something like this already exists for Quest, but how amazing would it be to be able to go on a tour of (or freely traverse) a to-scale representation of the solar system, to get a true appreciation and understanding for its size. (And of course the more beautiful the better.) According to this video, if the Earth was the size of a marble, the diameter of the solar system would be 7 miles (to the orbit of the outer planet Neptune).
This is an incredible video where they went out to a desert to recreate the solar system to that 7 mile scale--I recommend watching.

And of course, not just the solar system, but beyond.

But not just the big, but also the small. To see the scale of cells, molecules, atoms, nuclei, particles, even theoretical strings (granted at these levels, visual representation would require artistic license).

And 3D space can also be used to represent time scale. This video shows a great visual representation in the desert of the history of time of our universe, I believe made by the same creators as the solar system video. I recommend watching this video also.

It's one thing to see scale represented in 2D, but it's hard to appreciate scale through a small window. To truly appreciate scale I think one needs to be plopped in it, and even explore it.
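For what it's worth, the 7-mile figure is roughly consistent with the real numbers. A back-of-the-envelope check in Swift, using rounded astronomical constants:

```swift
import Foundation

// Back-of-the-envelope check of the video's numbers: shrink Neptune's
// orbit to a 7-mile diameter and see how big Earth and the Sun become.
// Constants are rounded real-world values.

let neptuneOrbitRadiusKm = 4.5e9    // ~30 AU
let earthDiameterKm      = 12_742.0
let sunDiameterKm        = 1.39e6
let milesToKm            = 1.609344

// Scale factor that maps Neptune's orbit diameter onto 7 miles.
let modelRadiusKm = 7.0 * milesToKm / 2.0
let scale = modelRadiusKm / neptuneOrbitRadiusKm

let earthModelCm = earthDiameterKm * scale * 100_000  // km -> cm
let sunModelM    = sunDiameterKm * scale * 1_000      // km -> m

print(String(format: "Earth: %.1f cm, Sun: %.1f m", earthModelCm, sunModelM))
// Earth comes out marble-sized (~1.6 cm) and the Sun under two
// meters across, matching the desert recreation in the video.
```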
 

michisto

macrumors newbie
Aug 27, 2011
8
28
I think a great use case of VR and maybe also AR is to just be able to appreciate scales that aren't feasible to experience in the real world.

Maybe this is what you two were getting at, and maybe something like this already exists for Quest, but how amazing would it be to be able to go on a tour of (or freely traverse) a to-scale representation of the solar system, to get a true appreciation and understanding for its size. (And of course the more beautiful the better.) According to this video, if the Earth was the size of a marble, the diameter of the solar system would be 7 miles (to the orbit of the outer planet Neptune).
This is an incredible video where they went out to a desert to recreate the solar system to that 7 mile scale--I recommend watching.

And of course, not just the solar system, but beyond.

But not just the big, but also the small. To see the scale of cells, molecules, atoms, nuclei, particles, even theoretical strings (granted at these levels, visual representation would require artistic license).

And 3D space can also be used to represent time scale. This video shows a great visual representation in the desert of the history of time of our universe, I believe made by the same creators as the solar system video. I recommend watching this video also.

It's one thing to see scale represented in 2D, but it's hard to appreciate scale through a small window. To truly appreciate scale I think one needs to be plopped in it, and even explore it.
Love the video! What they are doing should actually be possible with solAR for Apple Vision Pro: https://apps.apple.com/us/app/solar-solar-system-in-ar/id1286558019?platform=appleVisionPro

I'm part of the developer team of solAR. Unfortunately, we don't live in the US and therefore couldn't get our hands on an Apple Vision Pro yet, so I can't tell for sure how well it works in practice. But we implemented the app so that exactly what is done in the video is possible with solAR: you can choose realistic scales, which renders the whole solar system so small that all the planets and moons are invisible at first. But if you have an area of 7 miles, you could technically scale the whole solar system up and end up with the Earth the size of a marble and the sun in the center with a diameter of a meter, just like in the video. If you try that, I would be dying to know how it goes and what we should improve to make solAR even better :)
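A quick illustration of why a realistic-scale rendering starts out invisible (the 2 m model size below is my assumption for a room-sized rendering, not solAR's actual default):

```swift
import Foundation

// Fit Neptune's orbit (diameter ~9e9 km) into a room-sized 2 m model
// and compute how big Earth ends up. Constants are rounded.

let neptuneOrbitDiameterKm = 9.0e9
let earthDiameterKm = 12_742.0

let scale = 2.0 / (neptuneOrbitDiameterKm * 1_000)  // model meters per real meter
let earthModelMicrometers = earthDiameterKm * 1_000 * scale * 1e6

print(String(format: "Earth at room scale: %.1f micrometers", earthModelMicrometers))
// ~2.8 micrometers: far smaller than anything a display pixel can show,
// so the planets are effectively invisible until you scale the model up.
```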
 
  • Like
Reactions: Tdude96

HDFan

Contributor
Jun 30, 2007
6,638
2,879
Spectacular visuals!

1. The "fast" object rotation speed needs to be much faster; maybe 1 minute max per rotation.

2. Somehow I lost the controls when viewing Venus. They didn't come up when I pinched, and I had to exit out to the main VP menu.

3. Solar system views are edge-on; you have to move them to the floor to see them from the top. I would like to be able to rotate them to a top-down view while looking straight ahead.

4. Outer solar system: the control window disappeared, and I had to force quit and reopen.

5. It needs some way to be more informative and to allow deeper dives. Right now there's really no reason to use it for more than a few minutes.

6. It is similar to some other apps, such as Space Vision.

Unfortunately we don't live in the US and therefore couldn't get our hands on an Apple Vision Pro yet.

Amazing job without even having a VP. I admire your initiative!
 

AppliedMicro

macrumors 68020
Aug 17, 2008
2,257
2,612
For example, if I want to lower the light a bit, I can:
1. look at the lights and turn the Digital Crown to control the brightness
2. not look at the lights and point my hand towards the lights, and drag down to lower the brightness

If I want to close the window blinds, I can see a virtual blind, drag it to how far I want it closed, and the real blind will go to that position.
Do you care about the lights or window blinds when you have illuminated displays strapped to your head?
If I need my robot vacuum to clean a certain area, I can draw out a real-life area and ask it to clean it.
Sounds like a pretty dumb home, futuristically speaking.

Why would you put on a headset and open up the robot vacuum companion app to (virtually) draw that area, when you can just finger-point to your robot vacuum in the real world?

Or, better, use its voice recognition or a virtual assistant to just call it out: “Robovac! Please clean below the dining table in my living room”. A smart home would have the area spatially mapped anyway.

I’d even argue that a really smart home should recognise the “need to clean certain areas” automatically.
 

Jeehut

macrumors newbie
Apr 20, 2021
16
28
Mainz, Germany
I have several ideas that are not possible yet, such as an app that shows things like the plumbob from the Sims games on top of humans around you for a fun effect. But I asked Apple engineers and they confirmed that this is not currently possible due to lack of person / skeleton / head / face detection APIs.

But other ideas I've had which were possible, I already built and shipped. Such as:

"Posters: Discover Movies at Home"


"Guided Guest Mode: Device Demo"


I have one more idea that I've started exploring, but it requires a device for testing. I've visited the Apple Developer labs in Munich for 3 days already, but I need more time. Gonna wait for the device to come to Germany before I continue working on it. Hope it'll be as soon as May!

Generally, I make all my multi-platform apps available on Vision Pro as well. Apart from its 3D / AR / VR capabilities, it's also just a huge screen, which makes it like an "enhanced iPad" for productivity purposes, but with window support like on the Mac. I just have to mix the views from iPad & Mac to get my apps working on Vision Pro. In other words: I believe that every app idea that works on iPad & Mac should also be on Apple Vision.
 