In-person buyers of the $3,500 headset will be required to book an appointment to customize the experience
www.tomsguide.com
According to a shady web tip from China, Apple's Vision Pro headset will be released in the last week of January, per Wall Street Insights, a Chinese outlet.
www.econotimes.com
It also requires prescription lens inserts rather than being wearable over glasses, so scalpers will need to make sure they're not selling to anyone who requires vision correction.
Have you not noticed that an iPad doesn't have multiple user switching? It's a single device, for a single person. Vision Pro is going to be MUCH more locked down than iPads, because its biometric eye tracking is inherently more invasive, and it carries a rigged, textured facial model of its owner for generating the wearer's end of a FaceTime call.
It can remotely access a single, mirrored, 4K-maximum-resolution version of one screen on your Mac.
That is nothing like using a display plugged directly into a computer in terms of frame rate and responsiveness. I have both ARD and AirPlay displays running right here and now, and neither is the equivalent of a directly connected display.
You can tell wireless remote mirroring isn't good enough for primary use because, funnily enough, Apple uses cabled connections for its own displays, and you can't make an AirPlay display your only screen.
"Virtually no lag" is still noticeable lag. It is simply dishonest to pretend that a VNC session from an M2 iPad, which is essentially what the Vision Pro is, will provide an experience closer to using a Mac directly than to using a Mac via VNC.
I don't know if you've ever tried, for example, putting VLC full screen on an AirPlay display and then playing a video on it, but the experience is garbage.
The iPhone launch was almost entirely faked: pre-recorded video and animations, every single interaction carefully sequenced, and multiple devices wired to a switched output feeding the projection system, passed off as a single device running live UI.
You don't seem to understand how Apple constructs its SoC devices. The M2 iPad Pro, the M2 MacBook, and the M2 Vision Pro are all more or less the same device with different I/O options and support chipsets: a hinge sensor for the Mac, a touch sensor for the iPad. It's the same processor and the same RAM; they just boot different OS variants.
Yes, I would prefer macOS... because it's a more productive & flexible environment. "Make it more like a Mac" has been the overwhelming demand from iPad users since day 1.
It's called Vision Pro. Funny that there seem to be people claiming it will be a professional tool when it suits their argument, but when it's pointed out that it doesn't do things current professional headsets can, suddenly it's a consumer device?
Guess what - it's an iPad, it's an iPad you wear on your face. It's descended from the iPad paradigm of being a limited, locked-down computing experience, it features apps which are primarily single-screen contained, just like iPad apps, and it lets you VNC into a Mac to see a single screen from that Mac, just like an iPad.
I know about using three-dimensional, headset-based computing devices to do actual professional work that is inherently three-dimensional. Apple's primary focus seems to be using a headset to do 2D, iPad-style computing in an environment that floats a bunch of iPad screens around you to make multitasking easier, because multitasking has been a weakness of the iPad since day 1.
Perhaps the difference between us is that I have actual experience with this stuff, and you have seen a promotional video and some YouTube influencers.