For me the biggest issue though is that it can't fulfil its primary use cases:
Want it for productivity? It can't run macOS applications, and if you want to use your actual Mac, it can't do multiple monitors.
Want it for entertainment? People want to enjoy photos, videos and movies with other people, and it can't include them. Even if they have a Vision Pro, I haven't yet seen any sign of an ability for multiple people to do these things together.
All up, it all seems far more immature and dev-kit stage than I was expecting.
This is one of those things that Apple never claimed was supported, and yet there's something about that behavior that feels like such a natural intuitive implication to the technology that a lot of people feel alarmed or even cheated when they realize it's not possible (yet). It's been funny to watch the various discussion threads as people pop up talking about their shocked realization and disappointed feelings.
Update: I did realize when watching the WSJ video that the "mirrored" display actually appeared to have greater "resolution" (more pixels in height and width) than what she had on her laptop. So that's something.
However, the best review I’ve found that actually conveys what is possible and what it is like to use is Brian Tong’s 55-minute review video: https://youtu.be/GkPw6ScHyb4
I’m not familiar with him, but unlike other reviews I’ve seen, he spends less time evaluating or summarizing, and more time trying to actually use the device. I didn’t even realize that you can seamlessly use your Mac to control your visionOS apps, for example.
But what is the account situation like?
For years I’ve been complaining that I can’t easily use my private iPad with my company Mac because they have separate Apple IDs. Things like sidecar for a quick virtual whiteboard are basically impossible.
AirPods have gotten better over the years where today I can freely switch between devices belonging to different Apple IDs with the same AirPods.
But is the Vision Pro like that as well? It would seem weird to exclude the not-so-small group of people working from home but with company MacBooks.
I am hoping we will see a lot of experimentation in the coming years, and I am excited for what the Apple ecosystem will bring to the table. That said, from what I have seen so far this does not seem to be a revolution compared to the current offerings, but an evolution on various fronts, without addressing the killer app question.
Oh sorry, that's from CNET's review of the first iPhone in 2007: https://www.cnet.com/tech/mobile/original-iphone-review/
It's way too early to tell if this product line will succeed in the long term. Will the first gen Vision Pro be a runaway success? Of course not! Will later generations look as obvious as the iPhone does now? I sure hope so!
For comparison, Apple sold 1.4 million iPhones in 2007. Supposedly Apple is expecting to sell around 500k Vision Pro units this year. Given the 3x price difference (in 2024 dollars), that effectively means the first gen Vision Pro is expected to bring the same revenue as the first gen iPhone.
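Spelling out that back-of-envelope math (using the comment's own 3x price ratio and unit counts, not any official Apple figures):

    // Back-of-envelope only; the 3x ratio and unit counts are the figures
    // quoted above, not Apple's reported numbers.
    let visionProPrice = 3_499.0                          // USD
    let iPhone2007PriceIn2024Dollars = visionProPrice / 3.0
    let visionProRevenue = 500_000 * visionProPrice                  // ≈ $1.75B
    let iPhoneRevenue = 1_400_000 * iPhone2007PriceIn2024Dollars     // ≈ $1.63B
    print(visionProRevenue, iPhoneRevenue)                // same ballpark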
We all have rosy retrospection about how great and obvious the first iPhone or first iPod was, but honestly nobody had any idea if Apple's crazy bet would pay off. We all agreed it was magical tech, but it was expensive, had tons of limitations, and nobody really needed it. Sound familiar?
All I know is betting against Apple has rarely paid off. They do have failures too though and this is clearly technologically more ambitious than any other launch, so who knows! And honestly that's what makes this launch most exciting.
It's been so long since I've had child-like wonder about some new technology that I'm just glad Apple took a chance on launching such a crazy device, even if I don't know what to do with it... yet.
1. How does it compare to a high-end monitor for text editing/programming, web browsing, watching non-VR video, playing non-VR games? Is it better or not?
2. Is the resolution, latency, FoV and lack of color fringing good enough for it to be indistinguishable from reality in both passthrough and VR modes? If not, how far off is it exactly?
3. Can you run VR games on a PC with multiple desktop GPUs and stream to it? How does it compare to current high-end and ultra-high-end VR headsets?
I feel like being able to see everything open at once would be incredibly distracting. I like how I can swipe between app screens on my desktop so that I'm only focusing on one app at a time.
Of course I imagine there are some applications where it's useful, but to me it just feels like the Apple Vision Pro is just a very large screen and they haven't quite figured out what to do with it.
Even with just one 4K floating screen I think it would be a winner for me, but I'm also really excited to see what people come up with in visionOS itself. I think for a while the sweet spot (for me) will be using a virtual monitor and a handful of visionOS apps as well. Eventually I hope to be able to pull macOS windows out of a fixed box and arrange them wherever I want, but I'm fully aware that might not be this year or even next year.
Reaching out with my whole arm to click, or pinch to zoom, is a HORRIBLE user experience. (For most things - clearly VR games and such will be fine).
It seems I can connect an ordinary keyboard and mouse to it though, so I could see using it as essentially a really large monitor. As long as I can point and click with my mouse independent of my eyes or head position, and my text-entry focus doesn't move without permission.
This whole product is 90% of the way there. But the next 9% is just as hard as that original 90%. Apple is releasing it now to push it to 99%. It'll get better.
- The Vision Pro is the best VR headset that can be built today, with massive investment (rumoured at $5 billion), competent staff, and a hefty price. It is miles ahead of the competition.
- It’s still not enough for most, if any, practical uses, apart from films maybe. The technical requirements for really useful VR are still largely out of reach, and will be for at least the next 5-10 years.
If they'd made something like "Vision Air" that was essentially Meta's Ray-Ban Smart Glasses plus AR for $999 they'd sell millions, they'd have a market for app developers, they'd convince people there was a point to having them. That's an actual interesting product and it's closer to where they want to be anyway.
This feels broadly like a staff retention project, or some kind of market positioning strategy or supply chain capture strategy or something. I just can't believe this product was the point of all the time, effort, and money Apple spent on it.
It's sad that the fastest thing we have now is log.Printf("asdasdf") and grepping the logs on the pod :)
Debugging multithreaded applications is as difficult as it was 50 years ago; maybe VR debuggers will allow us to debug complex interconnected systems or models in a more intuitive environment.
Also, I think having hundreds of chats with some LLMs to investigate specific parts of the code/docs fits very nicely with infinite screen space, and using your eyes to focus instead of alt-tabbing.
I have my fingers crossed for some insane tooling advancements in the next years.
Does anyone know of any re-inventing the IDE in VR projects that are worth following?
What is required is a device as thin as sunglasses, which basically does AR things. That will be the game changer, and I think it will be done by another company, because it feels like Apple forced this one, and that's sad.
I already imagine how people from 5 years in the future will be sharing photos of the current Vision Pro asking "Remember when this was the best VR headset hhh??"
Sure, VR headsets existed before Apple's foray into that segment, but so did laptops, smartphones, tablet computers, smartwatches and Bluetooth headsets.
And if one is to learn from history, all of these product categories were significantly improved after Apple entered their respective markets.
Could also be very handy when working on planes, and for working from outdoor locations like parks (which I love to do). I find I'm most productive working from planes, but if there's a seat in front (non exit row), looking down at a laptop for several hours at a sharp angle can cause some neck pain. This would allow me to look straight ahead.
The device is a simulation of the dream device that can overlay UI on top of your vision without you looking any different to those around you, I wonder how far away that is.
Does it? What about GoPros or similar action cams, which have been used for over a decade, and cost less than a tenth of the price?
> A: The Vision Pro wasn’t designed to be worn with glasses. Instead you have to order prescription Zeiss optical inserts for $99. The two monocle-looking pieces snap right into place.
It's $99 for readers (non-prescription) and $149 for a prescription. Very odd that she would have gotten that detail wrong since it's so easy to check and it's been repeated so often in coverage of the AVP.
Also, it's hard not to enjoy any review that includes the word "bejeezus."
The thing I most want to know about this device that went totally unmentioned is:
how sharp is the text? Or, how many windows of code can I comfortably see at once and is it more or less than my 30-something-inch monitor?
TBH, that's what will sway me one way or the other on this... my current monitor might have been $3500 nearly 20 years ago. It was $1000+ when I bought it in 2012. If this (or one of its near term successors) could replace it in a way that is better, I'm kind of interested.
And the battery life is brutal. Extremely first generation.
But in this case, I would probably try this device during half a day if someone gave it to me, but I feel zero desire to own or use one.
It does not enable anything that I can't do better with a real computer, a TV or a smartphone.
I'm sure it supports Airplay, but being able to plug it in to a real monitor with a wire, wouldn't that make the purchase a bit more appealing?
Very much like the XReal glasses, but with the resolution of the Quest line.
I loved my XReals. The biggest issue with them was it had no head/position tracking to keep the screens in one place.
The technology I’d throw my money at is glasses that:
- look like fashionable glasses. Not that glasshole crap from Google. Or Vision Pro goggles weighing a kilo. XReal are fashionable enough that most people didn’t notice them when I was using them in public.
- Add an unobtrusive display only visible to you, not to bystanders. Text display ought to be crisp and readable.
- Built in speakers audible only to you, not to bystanders.
- track head position so windows can stay floating in space instead of moving with my head. That's what sells the immersion of a virtual object.
- some form of finger tracking to select and tap on things. Again, private; I shouldn't need to put them in front of my face.
- pretty darn good contextual AI that has an understanding of the world around me. Show me GPS directions. When I meet someone, recall their notes so I can have a meaningful conversation. Fact-check my conversations on the fly. Be the bicycle of my mind.
Basically, the thing gives you some screens fixed in world space. You have to be mostly stationary for the effect to work. So it won't replace walking around staring at your cell phone. A lot of thought has gone into the interface for this sort of thing. That's the real achievement here, and that matters.
Workplace of the near future - a room full of people, crammed together as on an airplane or in a theater, each wearing a headset. No more desks!
It's still too heavy, though. If it's tethered to a belt-mounted unit, why isn't the electronics down there, instead of in a brick on the back of your head? The headgear part needs to drop to no worse than swim goggles.
This thing could have been useful on day one for lots of people, from drone operators to cinematographers to programmers. But Apple's sad fear of I/O has crippled yet another product.
People should expect that a $3500 video-display device has a way to get video into it.
Is it to watch movies on a 100" screen on an airplane? The XReal weighs 75 grams and does that for four hundred dollars. It has an HDMI port too, so I can play my Steam Deck or Switch.
Where are the cool AR apps? Where is the contextual data popping up as I drive my car? None of that exists!
Journalists are the easiest to delight so if they can’t even do that…
All that said, I do think they will succeed eventually with this product, with years of using us as beta testers and shrinking the size of it.
From my experience with VR, the only thing I've been absolutely and definitively blown away by was sim racing. Before getting a PSVR2 I had played roughly 70 hours of Gran Turismo 7, now I'm at over 500 hours in the five months I've had a PSVR2. I want to buy a PC-compatible headset to get access to better pedals, wheels, and simulators (games) than I can get on the PS5. It's quite easy to sink many thousands of dollars into the experience. I believe the margin requirements are there for Apple (eventually) when considering the potential for professional training solutions.
But! Without taking ownership of something, creating some IP (intellectual property) to provide a killer app, and inspire other segments, they're going to be hard pressed to make inroads and get this to scale.
The spatially placed timers are one of the most compelling use cases I’ve seen so far, but I wouldn’t want to wear 1.3+ pound goggles plus battery for that.
The Vision Pro is like Google Glass: a non-essential luxury item that seems cumbersome.
I don't think this will make it TBH. But perhaps it does. And if so, seems a bit dystopian.
And while you're strapped to those items you can't use keyboard or anything else with your hands. You can't scratch an itch or put something in your mouth. Your hands are occupied and you feel like you're in jail. Surprisingly you actually want to have hands.
This is the single most important innovation in Apple Vision. It gives you hands.
Pilots use AR and skiing is a lot like flying. I hope this sort of technology gets so common in my lifetime that companies can bother to build it into legit ski goggles.
I went back to my Arch laptop instead.
Unless the problem is Apple fans having too much disposable income, in which case I guess this is an excellent solution XD
That's... embarrassing.
This should have been a dev kit.
VR enthusiasts, which I guess now includes Apple, assume there's some natural, obvious mass market demand for such a solution if only the tech is good enough.
I'm telling you that this mass appeal doesn't exist. Nobody cares about AR/VR. We've had AR on iOS for a decade and other than navigation and Pokemon, there's no killer apps, no ecosystem.
I took my colleagues to a VR center, where they could experiment with lots of setups, try out games and experiences. They had a great time but nobody spoke about it ever again and nobody bought any device. And these people work in tech.
The tech isn't the problem. The entire interaction simply isn't a fit.
I cannot imagine having to do this review. It's:
- Likely to be read and seen by a substantial number of people
- On the highest of high-tech devices
- From a company that splits the world into Fans and Haters
- Where the real question is how does it feel, and how do I look?
- And for heaven's sake should I spend $3,500+?
So much potential for embarrassment!
She did a great job, and identified its killer app: home 3D movies.
Selfies are so passé when you can actually re-experience something, and perhaps notice what you missed before, and reinforce memories. It's ideal for oldsters to experience trips that their grandchildren take.
Gating factors: My concern is mainly that this will need another generation of chip miniaturization to get the headset lighter AND more performant AND using less power -- and may require a faster bus between the chips.
Win/lose factors: Probably enough people will buy it to make it not a financial mistake. But the big win is that developers will have to start solving the problems with this device, and only those using this device will have solutions under their belt. To get to VR + 3D media, go through Apple?
The MacBook went from an amazing piece of tech to a useless piece of RSI-inducing crap. You have to buy from System76 to get anything like the old MacBooks.
The phone, well, I was forced into an upgrade this year (as part of the watch fiasco above) and was amazed how underwhelmed I was. I went from an X to a 15 and as far as I can tell, the camera is better and it's a little lighter. That's not enough for the money I had to shell out.
All in all, the company has seemed to be going downhill to me for at least 5-8 years. But, I'm just one consumer of many. Other people seem very amazed by these things which just do nothing for me.
This is what it will be like in 10 years. The current generation (teens and adults) wandering around the streets with their heads sunk into their phones will be replaced by people wearing "specs" 24/7.
In the next 20-25 years, I would not be surprised if it becomes no more than contact lenses.
Monitors will likely be a thing of the past... even giant TVs in the living room. Same for the (physical) keyboard and mouse, being replaced by IR equivalents that are just as good/responsive.
The technology is not there yet. This is a glimpse of what our everyday life, and the office we work in, will look like.
Interesting to see where it goes.
I feel like that _has_ to be why all major companies are pushing hard on vision products like this. They know AI is coming and need to be there first, so that their glasses are efficient and adopted by the time models become easy additions.
Presumably the main concern is the uncanny not-very-lifelike nature that's putting people off and the rest is self-consciousness. Maybe they need longer scanning to build a more accurate facial movement model?
I still find it very odd when the eyes the device displays to others have such a raised position compared to the true eye position. If I had one, I would rather something static went there or nothing, and maybe overlay that with a gaze direction indicator "spot" instead of faux eyes.
I won’t be surprised if some Android manufacturer like Xiaomi releases something like this.
Also, makes sense why Meta kept the form factor because they never had their own mobile device!
As with most Apple products, just like the Apple Watch and the iPhone, I don't think this first gen product is a good buy. Wait for the 2nd gen unless you just want to play with the tech.
1. The Vision Pro will definitely replace all kinds of displays, such as TVs, projectors, etc., because it provides a completely immersive viewing experience
2. Beyond that, due to latency limitations and noise issues in low-light situations, it will still be an entertainment toy rather than a productivity tool
Or, I dunno. Clean the living room? You can even watch a movie while you do that!
One thing I really notice about this stuff is that unlike mobile phones and TVs, this will not be for everyone. There's a big chunk of society that does not want home automation, VR/AR, fitbits, smartwatches, etc etc. Me included.
Seems like kind of fun if you're into that sort of thing, though.
People forget that the brand Apple carries so much freaking weight.
People buy the Apple lifestyle. Which now includes AR goggles.
Call me crazy, but if Apple can't figure this out, maybe the concept itself is flawed.
This has been known for at least 30 years in the eye-tracking business, and it even has a name: the Midas Touch problem.
Is an AR productivity tool really so hard? Apple owns the whole stack here. Nintendo can do Mario Kart AR in your living room with an RC car, but I can't get unlimited AR desktops for software development etc?
As far as viewing with other people, this doesn’t seem like an insurmountable challenge. They have the theaters, they have Personas, they have spatial audio, and other Apple devices have features for watching content together with friends. Put them all together and it seems like if several friends had Vision Pro they could feel like they were sitting in a theater room together while watching a movie. I’m not saying this will be easy, but it seems like all the building blocks are there. The Personas are probably the big weak point, especially looking at someone next to you, but with the focus on the movie, I think that’s probably the least important part.
This has always been the case and this technology has been around for a while. I'm surprised Apple would have chosen to use it for user input.
lol, literally the only use case I could convince myself to spend this money on (hey, it's cheaper than an XDR).
Even then I was having trouble convincing myself, since all indications were you wouldn't want to wear this more than a few hours at a time, so ultimately you still need the real physical monitors for the other 80% of your workday.
Whatever, it at least gives a startup an opportunity to build something unique - it's just sad to see your old friend start going senile.
There’s a video circulating of someone cooking while wearing it and gingerly pinching a virtual timer and placing it on a pot of boiling pasta.
It looks so stupid that I couldn’t help but laugh out loud.
Maybe younger people might think differently but for me, this stuff is dead on arrival because of simply how uncool and stupid it makes you look when you use it.
Everything you've said is reminiscent of the reviews of the first iPhone.
The reviews haven't mentioned it, but SharePlay [1] is OS-level functionality and the press releases mention using it with movies, music, and games.
[1]: https://developer.apple.com/videos/play/wwdc2023/10087/
Most of Apple offerings are good:
Watch
iPad
Mac
iPhone
services
Are they really expecting this to just be a hard problem initially that they get better at over time? When is the last time they launched a "so so" product?
Not on a plane they don’t. Not in a hotel room on a business trip.
IMHO, the Vision Pro is for being somewhere when you’re nowhere; not for being somewhere when you’re already somewhere.
This is not accurate; FaceTime has SharePlay. Any app that leverages it can build a synced entertainment experience. Examples out of the box: Apple TV, Freeform, Apple Music.
I wish the SharePlay SDK weren't tied to FaceTime, since it limits you to people whose iCloud email or phone number you have. Big Screen on Quest was a great app that leveraged the idea of multiple users in the same VR session based on shared interests; Quest just lacked the quality.
https://support.apple.com/guide/iphone/shareplay-watch-liste...
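For anyone curious what that looks like at the API level, here's a minimal sketch against Apple's GroupActivities framework (which backs SharePlay); the "MovieNight" activity and its metadata are made up for illustration:

    import GroupActivities

    // "MovieNight" is a hypothetical activity type; an app defines its own like this.
    struct MovieNight: GroupActivity {
        var metadata: GroupActivityMetadata {
            var meta = GroupActivityMetadata()
            meta.title = "Movie Night"
            meta.type = .watchTogether
            return meta
        }
    }

    // Offer the activity on the current FaceTime call (or prompt to start one).
    func startMovieNight() async throws {
        let activity = MovieNight()
        if case .activationPreferred = await activity.prepareForActivation() {
            _ = try await activity.activate()
        }
    }

Each participant's copy of the app then receives the resulting GroupSession and is responsible for keeping playback (or whatever shared state) in sync.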
I wrote a long comment [1] months ago when the Vision was first announced expressing my skepticism about the use of eye tracking, based on my personal experience with the tech. At the end I said, "maybe I'm wrong." Turns out I wasn't.
Why would you want presence in an action movie? Cool, in an exciting sort of way.
But presence in recording of your family? That's powerful stuff, for everyone!
That feels like the long-term hook. iCloud to handle obscene storage amounts as a service. iPhone to generate new recordings. Vision to play back recordings.
And the dastardly brilliant part is... the more 3D video you record... the more valuable a Vision is to you.
>> Capturing -- One of the more remarkable things to watch? Your own home 3-D movies. Apple introduced “spatial video” for the iPhone 15 Pro a few months ago, and I started recording my sons with it. Watching the videos in 3-D in the headset now is almost like reliving the moment. The Vision Pro also captures these videos and photos—you just hold down a button on the top left.
Do you mean a Mac's external monitor visible through the vision pro AR view?
At the end of the day, this is Apple testing the waters and trying to get a positive cash flow to help offset significant R&D...what they're showing is pretty impressive in a number of ways, even as it's lacking in others.
I don’t know why, but while I feel multiple monitors help my productivity a lot in Windows and Linux, I find myself not caring as much in macOS as long as the screen is big enough. I think it has to do with my habits around how I use the windowing in each. I tend to tessellate and arrange windows in macOS, while I tend to maximize or lock them to screen edges in Windows.
Maybe Apple expects every person to buy one. Didn't Facebook recently try something similar?
> cameras are still cameras, and displays are still displays.
Anyone remotely familiar with the state of development in those areas would be aware that “even Apple” can’t cheat Reality (punintentionally).
Those left still raving about and/or hoping for a game changer will be greatly disappointed - or are only in it for the line going up.
The whole concept will be a niche product for many years to come and will stay an isolating experience.
iPod was the size of a beefy wallet, but was good enough.
iPhone was glorified plastic and websites looked like crap, no app store. But hey, it worked well enough.
That said... this isn't accessibly priced, and what's the hook? If this had launched at the same time Pokemon Go or WoW was taking off alongside it, it'd get the social momentum all the other options had.
Also, it's better than the competition in key ways, but are they differentiating ways...? AR/VR could very well take off, but it's not this year.
No idea if it does this, but the obvious use case is for people who aren't physically present - but letting them somehow share a physical space. It could potentially be awesome for friends/partners who live far apart.
First Gen is usually awful
This is not awful and maybe even closer to 2nd gen
Everything starts somewhere
Most things are ready for the masses by the 3rd or 4th gen.
Eh... I prefer empty cinemas
No loud babies, no popcorn sounds, no people explaining the plot to people not paying attention, just me and the world of the film. Bliss.
This is a disingenuous argument. Your other points are much more valid than this one. You don’t have a VR headset to interact with other people in the same room. If you want to watch a movie with other people around you, there are many other (cheaper) ways to do that (and Apple can sell you a nice AppleTV to do it).
Or, how often do we believe other people are clicking something without looking at it?
I'm examining this for myself... it's hard to feel organic while I'm actively focusing on it, but I at least glance at my mouse pointer's target while traversing the pointer towards it across my screens.
OSX had the opportunity to follow that path before settling on the “render windows, capture the screen, compress the image, send it over the network to be decompressed” VNC-style remote access that’s bog-standard today, and if they had Vision Pro would be set up to be an absolute mind-blowing macOS experience.
With Stage Manager on macOS now, it feels like they have all the primitives in place to "transpose" macOS Stage Manager window textures to visionOS / the iPadOS foundation.
Though this will be tricky to get right for all apps. It will be interesting to see if it's a macOS App Store-only feature/API, opt-in, or some other option.
So even though you have a sequestered Mac output alongside Vision apps, you can use the same controls for all them simultaneously. This should help in the interim.
But this is sufficient for many use cases (or at least, mine). I pre-ordered one with the idea that my main work will be on the 4K monitor, with most of my superfluous apps floating around as native visionOS apps. That's mail, a web browser, and zoom, which all have apps now, and Slack, which I could just use Safari for but may have a native app in the future.
It feels like the Vision Pro would definitely be a great replacement for people who (want to) buy multiple expensive monitors, but it doesn't fully reach that potential today, and mostly because of software? Although rendering 3 or 4 virtual workspaces through ad-hoc Wifi at 4K 60fps+ low-latency would certainly be a huge challenge.
He said he could wear it 45 mins before needing to take it off, that it was overstimulating so you need to slow down how quickly you use apps and move things on screen, and that gestures also were fatiguing. You could tell he was trying to be fair but positive.
Headsets just haven't cracked this nut yet, and though the tech may advance somewhat, there may be limitations inherent to the form factor. Even if it gets really lightweight, the issues of overstimulation, headaches, and the amount of neck movement implied may keep these products in a niche. (I say this as someone super excited about the AVP)
For everyone used to using their computers all day long wanting to do it in a headset, don’t throw your macbooks away just yet.
Either you can't sign in with your personal Apple account, or you shouldn't (because MDM). So the only way to access anything associated with iCloud is what is available on the iCloud web portal; which is a horrible experience. You can't do sidecar. You can't do airdrop, copy-paste, continuity camera, nothing.
I've only ever used Macs in a professional environment. I've also always had a Mac and iPhone as personal devices. But I've never made the jump toward saying "Ok, I'm actually using iCloud seriously now" for this single reason. The best Google Cloud experience is available in a web browser, which I can be signed in to on everything. Google Drive is everywhere. The list goes on.
It's such a crystalline example of why Apple's walled garden actually hurts themselves.
It's actually far worse. There's a single user and a "guest mode", but for the AR/VR to work there's a calibration step, which means that a guest has to go through that step every single time they want to use the device. It might be fine for a real guest using it once, but it would be basically impossible to share the device with someone else. Having to set up the device every single time you use it sounds absolutely terrible.
Locking your customers into your ecosystem? Fine, whatever. But restricting usage in such a way even within the ecosystem!?
It's been said for years but the iPad could be so much more than a mere media consumption device if it weren't for short-term-profit driven design decisions.
Maybe they do better with the Vision Pro.
Nah. I've tried using the Quest 3 for various things, and at the OS level it just fundamentally sucks for anything but fully immersive games. Multitasking support is close to nonexistent, the hand tracking is crap, the UX is mediocre for even basic things like "sit on the couch and watch a movie", the system is end-to-end filled with small annoyances every time you use it, and there are no signs that Meta has even recognized these problems, let alone have any plans to improve on them.
The difference in functionality is night and day, even including the downsides noted by reviewers.
So some might argue it is better even.
Third party immersive apps might break that rule, of course.
I can see the current XR headsets shrinking down to the size of a pair of glasses within a couple decades, but contact lenses would require a giant leap forward in technology.
You'd need displays, electronics, sensors and batteries that can all fit together inside a paper-thin plane, and also, somehow, be invisible. Not saying it will never happen but we are really talking sci-fi tech at this point.
Holy moly that website (https://www.viture.com/). It looks like a huge amount of development effort went into it but it's almost completely useless. I went looking for detailed information about the product and was relieved to find the "learn more" button (which I'd initially mentally dismissed as a GDPR cookie compliance button), but it just opens an autoplaying marketing video with all the controls disabled.
(If you do want to read about the product their Kickstarter page was a bit more digestible/normal for me: https://www.kickstarter.com/projects/viture/viture-game-and-... .)
As an anchoring point, Apple couldn't figure out a better solution for movie watching than Google's or Amazon's offering (cheaper yet better than the Apple TV), and I don't think the concept is flawed in any way.
If anything, it's probably healthy to see Apple stretch and hit some walls, try new ideas that might or might not succeed, and overall experiment in the open instead of always assuming they've got the perfect solution.
Reading these reviews, I just keep thinking -- yes Apple does this better, and that is interesting, but... the Quest 3 can almost do all of these at a fraction of the price.
And the biggest investments -- screen and cameras -- need to see real use cases yet. I am not convinced that people are actually going to replace their monitors or TVs with Vision Pro and wearing this for many hours straight at this point.
I have a similar complaint with my Apple Watch and my corporate issued laptop. When I am using my own computer (mac mini) I love how easy it is to use my watch to login, use it to approve actions, etc. However when it comes to my company laptop I have to type my password in repeatedly. It would be awesome if the watch could be linked to both IDs to make this much more seamless.
This is kinda what Managed Apple IDs are for - the work 'owns' the Apple ID it puts into its management profile and can set policy. Apps write into a separate storage container which the company could remote wipe, without affecting the rest of your personal data. If they want to disable things like sidecar, they can do it.. for the corporate apps/accounts/web domains.
I'd generally assume the multi-user aspect is worse (because of face shields and prescription inserts), so generalized multi-account support is pretty low on the priority list.
Ended up creating a new account that was part of my family.
These devices are going to have your sweat, makeup, odours etc on them.
So you're really not going to want to share a device with anyone else.
What the fuck. The fact that an apple ID is even involved is absurd. Should be able to just Bluetooth to any device.
(My partner is corpo; I'm startup, but have worked at corpos. No thanks.)
Better to keep it all owned by the company, in my opinion, and have them issue you an iPad for this express purpose.
The moment you use different Apple IDs you lose a lot of nice features of Apple's products.
For example, "killer apps/content" never arrived for 3D TVs and they have largely disappeared from the market. Same with various "waggling" input technologies like the Wiimote and Kinect. There were some compelling uses, like Wii Sports, but these were pretty limited and many other uses of these in games was a case of Nintendo shoehorning the technology into the game.
I think the best pessimist argument is the one offered by Folding Ideas in his metaverse video[1]: Text is really, really useful, and a virtual 3D space is not a good environment for either creating or consuming textual content.
The reason the software doesn't exist is because compelling hardware doesn't exist for it to run on, so nobody bothers to write it.
Apple is imagining this device will be used for productivity but it's still painful to actually wear for long periods. We're a long way from being limited by software instead of hardware.
In 1988-94, the CPUs available in desktop computers were substantially more advanced than the widely used operating systems. Windows 3 and Mac System 6/7 didn't support pre-emptive multitasking, memory protection, or many other features that define a modern OS.
Maybe we'll look back at today's Quest and Vision Pro as similar transitional devices with one foot stuck in the old paradigm, running old-style software.
I’ve also heard about players spending a lot of time in Counter-Strike-style games like Pavlov.
At this point it seems like there’s a TON of things to do in VR (and I’m gonna be honest, there were a ton of experiences too on the Quest 1 when I had it).
I’m just waiting for more live shows and concerts that I can attend from the Quest personally.
It's a 600+ gram headset with a battery on a leash for $3,500. I wouldn't say the hardware is mainstream-ready or fulfilling its side of the contract yet.
So the hardware is not good enough yet. It will be good enough when I basically don't care, just like I don't care with glasses.
How about enabling AIs to create layouts of information on behalf of the user? Like, what if an AI could arrange all of your information for you in a scheme derived from Archy?
Totally agree. I'm waiting for a usable Virtual Desktop app to come out. All the ones I have tried which work on my cheap WMR headset fall short of having floating app windows in view.
I guess there is one of those which works on Meta Quest, but not PC headsets. That's really what you need to be effective working in VR. Just like is mentioned the Apple headset supports.
I agree that progress has been slow in the consumer space and meaningful long-term adoption of VR has been confined to a few niches; that isn't necessarily an indictment of the long-term prospects for VR, because desktop computers spent much longer in that stage than most people remember.
In enterprise, I think things are more advanced and some user groups have decisively gone through the one-way door for some applications. I think the best example is architecture. If you've done a couple of client presentations in VR, you just aren't going back to showing renders on a flat screen, because immersing the client in a physical space is that powerful. It's not just a sales tool, but a communications tool - clients can understand and respond to the environment intuitively and give much better feedback as a result.
Industrial and clinical training is less clearly one-way, but I think we're very close in a lot of areas. AR is still less developed than VR, but I do think we're on the cusp of something significant - a sufficiently comfortable standalone AR headset with sufficiently high-quality passthrough can deliver training experiences that can't practically be replicated through other means.
I think one of the most interesting areas of development is in psychiatry. It's still early days, but we're starting to see real, meaningful benefits in RCTs for VR-based therapy of disorders like phobia and PTSD. Some of the most compelling results have been in the very sickest patients - people with psychosis, who often find it impossible to engage with conventional psychotherapy.
https://www.psy.ox.ac.uk/research/oxford-cognitive-approache...
I don't think it's remotely likely that VR will ever replace flat screens, but I do think that VR is slowly growing into a niche but durable HCI platform. Tablets are a reasonable analogy - a lot of people see them as a failure, but they still sell in serious volume and they're often a much better form-factor for specific applications than either a phone or a laptop, especially in industry. Tablets didn't change the world, but nor are they likely to go away.
Probably a dozen other companies launched similar devices already. Apple is hardly going out on a crazy limb here. This is their classic iterative refinement of what other people already did.
But I do agree with the first point - the flaws in this gen 1 has very little bearing on the long term success of it as a product category. But I would argue it works both ways, in that to the extent it is successful in the niche that buys it, you can't tell yet if it will break out to mass appeal. We just don't know.
> Fortunately, we can report that on the whole, the touchscreen and software interface are easier to use than expected. What's more, we didn't miss a stylus in the least. Despite a lack of tactile feedback on the keypad, we had no trouble tapping our fingers to activate functions and interact with the main menu.
What I’m seeing in the reviews of AVP say the opposite about many aspects:
> There is a built-in virtual keyboard so you can type in thin air. But it will drive you mad for anything longer than a short message. And selecting smaller buttons with a pinch should be a carnival game. I started getting real work done once I paired the Vision Pro with a Bluetooth keyboard and mouse.
I agree it’s still too early to tell, but the best thing that I see being mentioned is movie watching which is something all the other headsets already do as well. The AR aspect seems to be a unique aspect, but I wonder if there will be safety issues that prevent things like cooking and doing other tasks assisted with AR from truly taking off.
Streaming games from a PC to a standalone headset over WiFi has been proven to work with the Quest, but that has proper controllers.
I recommend the video review as well. Seeing the video call between Nilay, Joanna, and MKB shows how much the tech has advanced but also how much it still needs to evolve to be at the level of FaceTime Video.
You can shove the app into a corner of your room, out of your view, and it will be there when you walk over to that corner later. That's part of the idea of "spatial computing" - if we can associate computer objects with real locations in space, maybe we can better harness spatial memory and stuff like that when we interact with them.
I don't think you'll be able to pull windows out of the Mac screen, but the apps you might need are in visionOS anyway, like Safari or Messages.
I think my dream would be dual 4k monitors, or maybe a double wide?
Just look with their eyes and then gesture with the hand.
Lots of reviewers discuss how the hands can be in low, resting positions.
Theoretically (I don't have primary experience to draw from), this UX could potentially involve even less wrist motion.
That's also the main reason nobody uses the Quest 3's hand tracking despite it being really good. Why would I wave my hands around when I can achieve this with micro-movements of my thumb?
I really don't think this is correct. It's a bit better at some things than any of its competition, and it's a little worse in some ways too. Other than the displays I don't think you can say it's substantially ahead at all.
Thank you.
"What's a computer©®™"
- Learning instruments in a guitar-hero way (Piano, guitar, drums)
- Cooking with timers and recipes right in front of you (will be even more doable with better internal displays in the future)
- Coding with virtual displays on-demand. This is another thing where we still need more resolution to make it really doable.
- Watching movies. Obviously a solo way of doing this but I could see it being big.
- (once these are much lighter and less intrusive) I could see these being huge for virtual workouts like Yoga, weight lifting, etc.
Also, regarding your question, I'm trying to think of what the "killer app" is for a currently successful device - the iPhone. I mean, the camera? Texting? Most people use TikTok a ton, but I wouldn't consider that a killer app. I think it's more that the device provides a home for a bunch of different apps.
> According to an exclusive report from The New York Post, NBA Commissioner Adam Silver “said the league is working with Apple to bring a tech-enhanced viewing experience” to its upcoming headset.
When asked about it, he told the outlet: “We’re working very closely with Apple.”
https://www.tomsguide.com/news/nba-games-could-be-apple-visi...
If there is one group that has a track record for laying down piles of money for hardware like giant televisions and expensive streaming services, it's sports aficionados.
Can’t imagine doing work is the killer app, not while wearing a headset is more cumbersome than opening a laptop.
Movies on airplanes. I’m guessing these will become virtually ubiquitous in the front cabin within a few years.
For a lot of people, Beat Saber and similar games are a killer app for the Quest. It can be good for making exercise fun and accessible at the same time.
The other one that could be huge IMO is attending real-life events like sport, concerts, shows, etc.
It could not even copy-paste. It had horrible internet.
Vision Pro may be the first OS that does things "the right way" in AR/VR too. Big difference is that iPhone was a real consumer device, this seems more like a development device. Which may be a smart move, because after all: (Phone) Apps made Apple win, and not enough apps made Windows lose.
"The displays have other limitations: the field of view isn’t huge, and the essential nature of looking at tiny displays through lenses makes that field of view feel even smaller. Apple won’t tell me the exact number, but the Vision Pro’s field of view is certainly smaller than the Quest 3’s 110 horizontal degrees. That means there are fairly large black borders around what you’re seeing, a bit like you’re looking through binoculars."
https://www.theverge.com/24054862/apple-vision-pro-review-vr...
The widest FOV headset you can currently buy is the XTAL 3 at 180 degrees, and it's huge, despite being a PC-tethered design that doesn't need to make space for a SoC or battery.
https://www.xtal.pro/product/xtal-3-mr
We're a few breakthroughs away from having full immersion and a reasonable form factor at the same time.
In reality, it will collect dust.
It needs to "do work" faster, or at least more conveniently, than a computer can.
Just needing to put the thing on is a massive hurdle for regular adoption.
I bet Luckey had something like this working in the early days but abandoned it because it wouldn't work for games. What a shame
Also, AR glasses being used by someone operating a motor vehicle sounds like a recipe for an entirely different kind of “killer” app.
This is an utterly bizarre stance.
I can't possibly imagine buying a product, any product, let alone a $1,000+ product, purely based on their marketing, when you admit that you don't even like the product.
Most people, when marketing convinces them to buy something, aren't really consciously aware of it. Not only are you consciously aware of it, but you acknowledge that you don't like it, and buy it anyways.
You also seem to forget the huge success of not-so-essential AirPods and to a lesser extent, Apple Watches.
I wanted to see how it is possible but sure enough, I found a paper from 1995 that cited even older research about this.
Let's say I want to click on the "reply" button below this text box. If I'm perfectly honest, I DO look at the button for a moment, then I move the mouse pointer over to it. But then right before clicking, my eyes switch back to the content I've created to observe that my click is having the desired effect on it.
I'm not actually looking at the button at the moment I click on it, but I DID look at it just a few milliseconds prior to the click. Why can't the UI just keep track of what I looked at a few milliseconds ago, to figure out that I actually wanted to click on the button, and not in the center of some text box?
One issue could be maybe I thought for a moment about replying but then changed my mind and decided to edit the content some more. But the UI has decided that I meant to click the "reply" button and so now it's been submitted prematurely. Yeah, I can see the problem now. The position of the mouse cursor is meaningful when clicking, and the Vision OS doesn't have a cursor. Cursors are important.
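As a thought experiment, that "remember what I looked at a moment ago" heuristic is easy to sketch; this is purely hypothetical and not how visionOS actually resolves pinches:

    import Foundation

    // Hypothetical sketch: resolve a pinch against what was fixated in the last
    // few hundred milliseconds, not the exact instant of the pinch.
    struct GazeSample {
        let elementID: String
        let timestamp: TimeInterval
    }

    struct GazeClickResolver {
        var history: [GazeSample] = []
        let lookback: TimeInterval = 0.3   // assumed 300 ms window

        mutating func record(_ elementID: String, at time: TimeInterval) {
            history.append(GazeSample(elementID: elementID, timestamp: time))
            history.removeAll { time - $0.timestamp > lookback }
        }

        // On pinch, prefer the most recently fixated element that is actually clickable.
        func resolvePinch(at time: TimeInterval, isClickable: (String) -> Bool) -> String? {
            history
                .filter { time - $0.timestamp <= lookback && isClickable($0.elementID) }
                .max { $0.timestamp < $1.timestamp }?
                .elementID
        }
    }

Note that the failure mode described above still applies: glance at "reply", change your mind, and a pinch inside the lookback window would still fire the button.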
What you don't get is Mac windows intermingled with AVP windows, it's just your Mac screen as one window that you can move about. It sounds good though, and there has been a fair amount of movement in screen sharing over the last few macOS releases which suggests more could be coming here (like multi-screen support).
You can't because it's computationally impossible. There is simply no computing device that can render unlimited high-res desktops at 60Hz per eye.
Not to mention the need to stream all that data to the headset - since you're not going to put a high-end graphics card needed to even attempt this in anything approaching a wearable form factor. Good luck getting multiple 4k or even just HD streams between your laptop and your AR headset over Wi-Fi.
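For a rough sense of scale (every number below is an assumption for illustration, not a measurement):

    // Rough bandwidth arithmetic only; all figures are assumptions.
    let uncompressedBps = 3840.0 * 2160.0 * 60 * 24   // 4K @ 60 fps, 8-bit RGB
    // ≈ 11.9 Gbps per stream uncompressed, so heavy compression is mandatory.

    let perDesktopMbps = 150.0     // assumed bitrate for low-latency, text-sharp 4K capture
    let desktops = 4.0
    let usableWifiMbps = 800.0     // optimistic real-world Wi-Fi 6 throughput
    let linkFraction = desktops * perDesktopMbps / usableWifiMbps
    // ≈ 0.75: four crisp 4K desktops eat most of the link before you even
    // count the cost of rendering them on the host in the first place.
    print(uncompressedBps / 1e9, linkFraction)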
It really isn't though, at least not for long; as of mid-2023 there are publicly showcased compact, lightweight prototypes with 240° FoV.
This can also be achieved with 3rd party apps. On Quest, this is already a reality with Big Screen.
It's basically what the SimulaVR guys are aiming at, and I'm surprised Apple didn't go this way with their Mac integration. Especially because the native visionOS apps do seem to behave just that way.
I think all the wear comes if you’re sitting up wearing it, though. For passive consumption (or thinking between moments of work), you can just lean back or even lay down, which will take the pressure off your head/face.
We all look stupid staring at our black rectangles, with notches at the top, with little headphone stems sticking out of our ears. It looks stupid at first and then you get over it
Sort of virtual assistant.
That could be useful to people to avoid burning or forgetting stuff.
I'm so old I remember the N-Gage 1st gen being ridiculed for the "sidetalking" feature.
Now we have millionaires on TV talking to their phones like it's a piece of bread they are about to take a bite of, and nobody bats an eye.
The first iPhone, yeah, it had some detractors, but I don't think the kinds of criticisms the parent poster gave ever really applied to the iPhone. To succeed, the iPhone didn't have to be this utopian product; it just had to be more useful than its main competitor, which was dumbphones. People who complained that it was missing features the Blackberry had were working from an unstated major premise that the iPhone was initially targeted at enterprise users, and I think that everyone who wasn't too busy being a pundit to see how the world works could see that that quite transparently wasn't the case. There was even a time period where I had both an iPhone for personal use and a Blackberry for work.
And I think the criticism about entertainment is spot-on. By contrast, despite being extraordinarily limited compared to even the very next model, the first iPhone was fantastic for entertainment, precisely because it was good for fostering shared experiences. It didn't take long after the device came out before you'd see groups of people clustered around an iPhone, looking at photos together on that big, vibrant, gorgeous screen. That was something that none of its competitors could do. And you better bet that people saw that happening and started wanting to have one of their own so they could have fun, too.
I do think we're still in the "wait and see" phase for this product, but, unlike some of the original iPhone criticisms or cmdrtaco's original dismissal of the iPod, the criticisms this article points out feel really personally relevant to me.
- incredible amounts of hype
- loving the design
- loving the touchscreen and input (directly contrasting folks worrying about the eye tracking now)
- a sense (at least from CNET and PCMag) that it's really just an overgrown iPod, so they keep comparing it to an iPod (compared to the Vision, where folks get that it's a new category for Apple and have good comps outside it anyway)
There are definitely similarities in terms of complaining about missing features that Apple's probably going to add soon anyway (the keyboard showing up in portrait and stuff). Lots of complaints about not supporting Flash, but we know how that went. Also, apparently the headphone jack position was annoying.
What I'm not seeing in the current Vision reviews - and maybe it's impossible to see this in real time - is some feature that has the chance to change literally everything, which people aren't able to comprehend just yet. These reviews being relatively dismissive of the whole web-browsing-on-your-phone thing is absolutely hilarious in hindsight. The only similar thing in the Vision is - the passthrough eye thing, maybe? Nothing else seems particularly baffling.
I'm glad I read some of those reviews. The vibe I'm getting is: the iPhone was doing something fundamentally weird with this whole smartphone thing that reviewers just didn't get, so they kept reviewing it as an iPod with really bad voice calling and a browser, and being confused by all the hype. The Vision, though? It's a VR/mixed reality headset, we know what those are like, and Apple didn't throw any real curveballs.
I do think future generations of AVP will do well. Iterating and applying learnings and customer feedback will make this a good product.
I think it's meant to be used with a Mac for most productivity use cases. That's how I intend to use mine: VSCode, terminals, compiling all happening on my nearby laptop with the Vision Pro as "just" a 4K monitor, and then extra apps like Slack, Zoom, Safari, Mail, music, etc, floating around as native visionOS windows.
In addition, it can be used as an iPad-like media consumption device, e.g., on an airplane, but I see that as an additional (and for me only supplementary) use case.
I think it's worried about not being able to apply a 30% tax on third party software.
Another comment mentions that you’re confined to the host computer’s “screen” and can’t break applications away from that rectangle. But you could imagine that being a possibility in the not-too-distant future.
I don't think that is true. I think 'most people' don't need a computer for more than what an iPad can do, and that your use case is more exceptional than common.
Computers as we think of them, with a somewhat permissive operating system that lets you execute whatever you want, are probably going to decrease in relevance to somewhere akin to where they were in the 90s -- incredibly powerful devices useful to some people who need them for particular reasons or who just like using them. Everyone else will be fine with whatever the equivalent of the smartphone/tablet OS is.
Meta isn't really opening up their headset, since the whole point of that adventure for them was to have their 'iPhone'-type kingdom.
Our best bet might just be Microsoft, but they gave up on Windows Mixed Reality, and I don't know if they're really gearing up to jump into this product space.
I'm fascinated by the idea of the Vision Pro, but not sure I'm ready to shell out for what is basically just a really great movie-watching experience. I can see the movie pretty well on my 4K TV, and I'm not sure the improvement is worth the cash.
That's exactly what corporations like Apple want their devices to be.
Apple didn’t fix any of them.
They're just a consumer goods company. Sometimes they make good things people don't appreciate at first. Sometimes they make bad things that people don't appreciate ever.
Their track record is better than the mean, but comparing every criticism of a first-gen Apple product to the iPod/iPhone launches is unserious. Of course some people panned any given new thing on Earth.
And this isn't even in response to someone predicting that AVP would fail, but just that the 1st-gen AVP is an immature product. The 1st-gen iPhone was an immature product! It's delusional (and discrediting to Apple!) to think the iPod, iPhone, OS X, Intel Macs, M1 Macs, etc. were as mature at launch as the later iterations we associate those technologies with now.
This, however, is coming after a decade of existing AR/VR consumer electronics and still misses the mark.
Btw, don’t forget visionOS 2.0 is just 18 weeks away. (Source: WWDC is every June and every platform gets a version bump, as we saw with watchOS launching in April then getting a 2.0 immediately after at WWDC.)
The first Watch was awful. I love my Series 8.
Don't get me started on the first Mac...
The first two iPhones weren't as innovative as people make them out to be, just more polished than other Symbian phones with cameras and internet; it really took off with apps in the third iteration.
I think the Vision Pro has lots of opportunities in the next iterations; early users will provide feedback this gen.
All of them! The first version of the iPhone, iPad, iPod, watch, AirPods...
They all had similar reviews. "Seems like a tech preview, not really ready for general use, too expensive", etc.
Or maybe it’ll be another homepod
But nope, nothing more interesting came, and OS updates ruined performance so badly that I happily returned it to my employer 2 years later and opted not to buy my own until 2021 (for exercise and notifications only).
That might not be their primary goal, but the device could appeal to that demographic either way if the other features are appealing enough.
[0] https://thehill.com/policy/healthcare/4085828-a-record-share...
I think about all the apps I'm running and switching between on my computer now, using shortcuts and toolbars and docks to arrange, hide, and switch between them. Everyone using multiple apps on visionOS just looks chaotic.
I once had a second portrait monitor next to my ultrawide. I had to get rid of it because it was just too tiring to be constantly turning my head so far to look at it. It didn't work out.
I cannot imagine how uncomfortable it would be if each app needed to be in a different physical space that required turning my head to use. Painful.
At a deeper level it depends a lot on the question: does Apple want this? If they do, then all of these issues will be solved over time. But if they actually see macOS as a legacy integration, then they simply aren't going to invest in encouraging people to use it. I'm waiting to see indications of which way they are going to play it.
Vision has both of those. People will conveniently ignore all the downsides of it, like they do with current Apple products.
"Ensemble (formerly MacCast, before the lawyers had something to say about it) bridges windows from your Mac directly into visionOS, letting you move, resize, and interact with them just like you would with any other native app. It's wireless, like Mac Virtual Display, but without the limitations of resolution or working in a flat plane."
And this is quite a hard limitation, since the Mac has to actually render those windows and then stream them to the goggles over radio. So, without quite a bit of magic, there's a limited number of pixels the Mac can draw and send.
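A rough back-of-the-envelope calculation (my assumed numbers, not Apple's published specs) shows why raw pixels can't just be shipped over the air:

```typescript
// Back-of-the-envelope: why the Mac can't just ship raw pixels wirelessly.
// Assumed figures (not Apple's published specs): a 5120x2880 virtual display,
// 60 frames per second, 24-bit color.
const width = 5120;
const height = 2880;
const bytesPerPixel = 3; // 24-bit RGB, ignoring alpha/HDR
const fps = 60;

const rawBytesPerSecond = width * height * bytesPerPixel * fps;
const rawGbps = (rawBytesPerSecond * 8) / 1e9;

console.log(`Uncompressed: ~${rawGbps.toFixed(1)} Gbit/s`); // ~21.2 Gbit/s
// Real-world Wi-Fi throughput is an order of magnitude below that, so the
// stream has to be heavily compressed and the pixel budget capped somewhere.
```

The actual protocol obviously compresses, but the gap between raw frame data and realistic Wi-Fi throughput is the limitation being described.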
I tried eye tracking as a primary input method some years ago in another setting, and I very much did not like the experience.
I also started thinking about it reading the reviews, and the main cases to me are:
- checking something before committing to an action: for instance, rereading the product name before pushing the purchase button. The pointer is already on the button, and I keep it there while checking the order, so I just need to click.
- focus switch: pushing another window to the forefront doesn't need a super accurate click. I assume most people eyeball it like me and will click on an emptyish part of the window from the corner of their eye. Same for moving the focus away.
- scroll-and-type situations: mostly when using a document on the side while taking notes. My eyes and focus will be primarily on one side (with quick glances at the other), while the mouse/trackpad movement will be on another.
I think we'll discover a lot more instances of this.
It fell apart as UIs got richer, browsers in particular: they're entirely composited in-app and not via GDI, because GDI isn't an expressive enough interface. So you end up shipping a lot of bitmaps, and to optimize you need to compress them. At that point you might as well compress the whole screen.
https://www.anandtech.com/show/3972/nvidia-gtc-2010-wrapup/3
This has been done many times before (see e.g. X Windows) and has known downsides. Off the top of my head:
- You need the same fonts installed on both sides for native font rendering to work
- Applications that don't use native drawing functions will tend to be very chatty, making the total amount of data larger than VNC/rdesktop/&c. style "send compressed pictures"
- Detaching and re-attaching to an application is hard to get right, so it's either disallowed or buggy.
I guess one downside is that your pipe has to be lossless, and there's no way to recover from a broken pipe (unless you keep a shadow copy of the window state, have a protocol for resynchronizing from that, and a way to ensure you don't get out of sync).
1. The receiving side has to have at least as much rendering power as the original side, since it will be the one actually having to render things on screen. This is always going to be the opposite case with any kind of glasses, where you'll always want to put as little compute as possible for weight and warmth reasons.
2. Each application actually has to send draw instructions instead of displaying photos or taking direct control of the graphics hardware itself. Few, if any, modern applications work like this for any significant part of their UI.
Applications which bypass the native APIs to render their window contents, in particular video players or games, get a compressed streamed video which has very decent performance. The video quality seems to be dynamic as well, so if there's a scene with very few changes you can see the quality progressively improve.
All of this is done per window, so a small VLC window playing a video in a corner gets the video treatment, while everything else still works like native UI.
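To make the idea concrete, here is a hypothetical sketch of that per-window split. None of these names are real Apple APIs; it's just the shape of the decision as I understand it from the observed behavior:

```typescript
// Hypothetical sketch of per-window routing. Windows drawn through the native
// UI toolkit can be mirrored as drawing commands (crisp, cheap), while windows
// that paint their own pixels (video players, games) fall back to a
// quality-adaptive compressed video stream.
type WindowUpdate =
  | { kind: "draw-commands"; windowId: number; payload: Uint8Array }
  | { kind: "video-frame"; windowId: number; payload: Uint8Array };

interface SourceWindow {
  id: number;
  usesNativeDrawing: boolean;
  drawCommands(): Uint8Array; // serialized toolkit draw calls
  framePixels(): Uint8Array;  // raw pixels, for self-rendered content
}

function encode(
  win: SourceWindow,
  compress: (px: Uint8Array) => Uint8Array // e.g. a video encoder
): WindowUpdate {
  return win.usesNativeDrawing
    ? { kind: "draw-commands", windowId: win.id, payload: win.drawCommands() }
    : { kind: "video-frame", windowId: win.id, payload: compress(win.framePixels()) };
}
```

The point is that the decision happens per window, which is why a VLC window and a native text editor can coexist with very different streaming behavior.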
How did you know it sends windows hooks? Was it some sort of binary serialization?
They just haven’t done it.
EDIT: -5* doesn't make sense, this is the most polite way you can point out that getting macOS apps windowed on visionOS has ~0 to do with double-buffered windows on iPad OS. n.b. I didn't use half-baked, OP did.
I use my iPad so sporadically that it could easily be the house iPad, but I’m signed in with my email and so on it can’t be.
Zoom has a good Airplay sharing feature that works well in this situation.
But I get what GP means -- I do have a corporate profile, and I made my own @corporation.com Apple ID, but what do I do to use sidecar? Either log out of my personal iCloud on the iPad (gross) or log in to my personal iCloud on my work computer (grosser)
If Apple approves it, of course. This is one of my major concerns; there's a lot of potentially useful functionality that could be implemented, but you have to jump through the app store hoops and hope that Apple doesn't decide that it conflicts with their idea of what you should be allowed to do.
At first people think "wow it's so awesome I can be sitting on the moon while I browse the web". But after a bit of time you just get tired, and I think it's precisely because your whole brain is working in overdrive to understand the unnatural environment you are in. None of this manifests explicitly but at the end of it, when people are faced with the choice of putting the headset on or not, they just "feel" like it's a lot of effort.
I say all this as someone who does regularly spend 1-2 hours working in Immersed with multiple giant screens up. And I love it as a break, a way to focus or just relieve the boredom of working in the same space day in day out. But even I feel this effect of it being tiring and not keen to do it for 8 hours a day. And the minute you say that, you lost the use case of this being your "only" computer / replacing your laptop, so it's actually kind of crucial to its central justification as a replacement for a computer or a 'new kind' of computer.
[1]: https://www.bobovr.com/products/bobovr-m3-pro
A lot of the physical downsides here are basically self-inflicted by companies trying really hard to hide the "nerd factor" necessary for comfort, to the detriment of the actual user experience.
Having demoed VR at my old office I can tell you that the range of reactions varies from an immediate "nope" and having to take the headset off to being able to stay in it for a significant amount of time with no discomfort.
And those who work in VR report that their coworkers, and just about everyone they talk to customer-wise, feel the same way.
I know there are people who really want to, or think they do, but most would rather just use screens until maybe such a time the form factor becomes a pair of eye glasses.
Assuming the Vision Pro screen sharing works using the same stuff, I have high hopes.
Unfortunately Apple is charging $200 per extra facial interface though.
However, pairing an audio device is an exchange of settings and encryption keys, and Apple will sync that pairing to your entire account. Hold your AirPods near your iPhone and tap the button to create the initial pairing, and they start working with your Mac and Apple TV.
I don't miss multiple monitors so much, but I do often wish for a larger screen. Not enough to put one in my space, though. That's where my interest in the Vision Pro lies - simply a way to project large, high-fidelity, 2d screens.
The "freely switch" here is referring to the W-chip multi-device support that will on the fly switch between any number of Apple devices based on what's actively being used at the time, without needing to do any manual connection stuff.
Other non-proprietary Bluetooth devices will generally do 2 devices at most, and getting that to work right with microphone input settings can be kind of a nightmare.
If your work is on the traditional model of perimeter protection and trusted intranet, a non-work device can't join the network as you have correctly pointed out. If your work is on the newer BeyondCorp style model, switching to a second account on your computer is going to invalidate the device trust needed to access work resources.
The main one being a complete separation of calls, messages, calendar, notes and reminders. For my own sake more than for my employers sake.
And many employees with company phones already have that separation. iPhone and Mac is not that uncommon to provide for employees. But an iPad on top? I think that’s gonna be much harder to find
And edit: a Vision Pro on top…
I'm still full of myself for postponing getting a 3D TV enough times that the technology died.
> Same with various "waggling" input technologies like the Wiimote and Kinect.
... but I have a PS Move gathering dust somewhere. Which I even preordered.
Hasn't VR taken over waggle? I don't think you can say it's disappeared when the VR install base is in the tens of millions.
That was partly because RAM was over $100/MB (Nominal; ~$230 inflation adjusted) in 1990. Additionally, in the IBM compatible world, many people didn't have a 386 at that point.
Also, minor nitpick on the dates; 1993 saw OS/2 2.1 and NT 3.1, both of which had preemptive multitasking and memory protection.
I see it in sci-fi all the time: someone flicks/flings a video playing on a device to move it to a larger display surface, and it kills me that we actually have the technology to do stuff like this right now... but because every company works in its own interest and they don't work together to create standards, we don't get to have fun uses of tech like that.
It is like VR is currently stuck being Kinect in terms of sales and stickiness, while Meta and Apple would both like it to be at least like the Wii, or ideally the iPad.
Personally I have found social experiences to have the best long-term appeal (e.g. Racket NX or Drop Dead with friends), but even there I am not sure these apps have sufficient mainstream appeal.
The discontinued WMR Portal, essentially the Windows desktop in VR, was so far the only software that tried to be a full workspace in VR. But even that was missing a lot of important features, and Microsoft gave up on it years ago and never made it accessible to non-Microsoft headsets. It's currently scheduled for removal from Windows.
The Vision Pro seems very similar to WMR Portal so far, with a few key improvements, like allowing apps to add 3D objects into a shared space.
But other notable ones include Synth Riders, FitXR, OhShape, Pistol Whip, Thrill of the Fight, and (maybe) Gorilla Tag. And this list is far from exhaustive.
VR is pretty good for fitness just because it can make exercising more interesting, comparable to sports without the need to coordinate with other people (and it's easy to do inside your house, if you have at least a 2m x 2m open space). Major downsides would be having that space available and sweat inside the headset.
WebGPU and WebXR are the two big enablers going forward. With WebGPU, developers have a common way to access hardware and that's a big deal across all your devices. A common way to access the hardware that gets you real-time 3D graphics, machine learning, crypto, etc. that works on your phone, tablet, laptop, headset, whatever is a big deal. And it's not just for anyone with Apple gear, but anyone with a compatible browser. Think generative AI/ML streaming Gaussian splats to your retinas via a browser. That's where we're headed.
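As a rough illustration, both entry points are already reachable from a browser today. A minimal, feature-detected sketch (TypeScript, with an `any` cast so it compiles without the WebGPU/WebXR type packages) might look like this:

```typescript
// Minimal sketch of the two entry points mentioned above. Browser support is
// still uneven, so everything is feature-detected.
async function initGpuAndXr() {
  const nav = navigator as any; // avoids needing @webgpu/types / WebXR typings

  // WebGPU: a common way to reach the GPU for rendering and compute.
  const adapter = await nav.gpu?.requestAdapter();
  const device = await adapter?.requestDevice();
  if (!device) {
    console.warn("WebGPU not available in this browser");
    return;
  }

  // WebXR: ask whether an immersive (headset) session is possible at all.
  const vrSupported = await nav.xr?.isSessionSupported?.("immersive-vr");
  if (vrSupported) {
    // In practice this must be called from a user gesture (e.g. a button click).
    const session = await nav.xr.requestSession("immersive-vr");
    console.log("Immersive session started", session);
  }
}
```

Whether Safari on the headset exposes all of this out of the box is a separate question, but the point stands: the common layer is the browser, not any one vendor's SDK.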
Need an OR to explore a phobia of surgery? Sketchfab has you covered: https://sketchfab.com/3d-models/charite-university-hospital-...
Windows even has support for indirect display devices - I'm not sure how OpenXR or SteamVR handles those, however.
It turned out the killer Apple Watch feature was fitness, and I don’t see why it couldn’t have been here.
They’re like talking paintings in the Haunted Mansion ride with bunch of blur and depth of field. It’s way too weird. Joanna’s looks a bit like she’s been stuck as the replacement person in the Mona Lisa.
A static picture of you or maybe your Memoji (remember those?) would be far preferable.
I’m surprised Apple is shipping Personas.
lol, what kind of review says that? It would be like someone who doesn't even use any kind of vision correction saying "I just used it without glasses and it was fine."
(edit: the review is actually quite good. but that line was bizarre)
> the Vision Pro has the best video passthrough I’ve ever seen on the sharpest VR displays any normal person will ever come across
> the eye and hand tracking control system… is light years beyond any other consumer hand or eye tracking systems
It calls out other aspects similarly.
Eh I think there are a couple of important differences.
The first difference is that the iPhone wasn't 300% the price of the contemporary Nokia N95. It was actually around 30% cheaper ($500 vs. $730). The Vision Pro is $3,500 and its closest competition is the Quest Pro at $1,200. I don't think anyone would argue that the experience justifies the price difference.
The second difference is that the iPhone foretold how people were going to use phones in the future. The technology was good and convincing enough that it--to my sorrow--killed other ideas instantly. The Vision Pro obviously hasn't done this. People are buying the state of the art when they buy the Vision Pro, but people were buying the future when they bought the iPhone, and everyone knows it.
And I mean, inventing the future is a bonkers high bar. I admit we shouldn't be holding any company--including Apple--to that standard. It was a confluence of a talented, experienced team, world events, and technological progress. I'm saying I find it hard to square the fact that buying the future (the iPhone) cost less than the then state of the art, whereas Apple's clearly not created the future here, and priced it way out of bounds. It would be like if the iPhone ran the best version of Symbian and had the best resistive touch screen or physical keyboard that could possibly be made, and then they sold it for $2k. We wouldn't even be talking about this product as a real product if it weren't Apple selling it. We'd assume only rich people would buy it and it would impact the culture not at all.
Also, that's what's happening! Well, that's the best case. The probable actual outcome is people not only get used to the idea that AR/VR has hit a ceiling and is really expensive, but a few really obnoxious people wear the Vision Pro to cafes and everyone gets the idea that it's extremely uncool. This thing is at best a dev kit and at worst a bauble.
[0]: https://indianapolis.craigslist.org/search/cta?hasPic=1&max_...
Or real workouts. I want to be able to have floating text to read on my runs, and real time biometric data directly in my field of view rather than on my watch would be cool too.
What would VR add here? For guitar-hero style instrument learning, there are already Yousician, Simply Piano / Guitar, Gibson and Fender apps, and quite a few others.
I do agree; just watching a sporting event on a projector where the athletes are life-sized is excellent, and on-field cameras already provide a better view of the game than any seat.
Now make it more immersive, and the trick of immersion is very cool. Like The Sphere, that immersion is next level.
Add sports betting to the Vision experience; that would be a great side-car app for this. (I am not pro-betting, but I see the use case.)
I'm wondering how that translates to VR.. do they just teleport you around the arena? That seems like it would be a bit jarring if not altogether sickening.
I couldn't see carrying one of these instead of just an iPad for movies. (But then I'm a very light packer.)
“There is a lot of very complicated display scaling going on behind the scenes here, but the easiest way to think about it is that you’re basically getting a 27-inch Retina display, like you’d find on an iMac or Studio Display. Your Mac thinks it’s connected to a 5K display with a resolution of 5120 x 2880, and it runs macOS at a 2:1 logical resolution of 2560 x 1440, just like a 5K display. (You can pick other resolutions, but the device warns you that they’ll be lower quality.) That virtual display is then streamed as a 4K 3560 x 2880 video to the Vision Pro, where you can just make it as big as you want. The upshot of all of this is that 4K content runs at a native 4K resolution — it has all the pixels to do it, just like an iMac — but you have a grand total of 2560 x 1440 to place windows in, regardless of how big you make the Mac display in space, and you’re not seeing a pixel-perfect 5K image.”
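Restating the quoted pipeline as arithmetic (all of the numbers here come from the review above):

```typescript
// The quoted display pipeline, restated as arithmetic.
const backingStore = { w: 5120, h: 2880 }; // what macOS thinks it's driving
const scaleFactor = 2;                     // standard "Retina" 2:1 HiDPI scaling
const logical = {
  w: backingStore.w / scaleFactor,
  h: backingStore.h / scaleFactor,
};
console.log(logical); // { w: 2560, h: 1440 } -- the space you can place windows in

// That backing store is then downscaled and streamed to the headset as a
// roughly-4K video, so you never see a pixel-perfect 5K image, no matter how
// large you stretch the virtual screen in space.
```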
edit: Yes I know you can build apps before they're in the store
It's more like a 1080p monitor. The virtual monitor only covers a small part of the Vision Pro's display. You can compensate a bit for the lack of resolution by making the virtual screen bigger or by leaning in, but none of that gives you a 4K display.
To take proper advantage of the VR environment you really need the ability to pull apps out into their own windows, as then you can move lesser-used apps into your peripheral vision and leave only the important stuff right in front of you. You also miss out on the verticality that VR offers when you are stuck with a virtual 16:9 screen.
?!? The audio quality wasn't somehow different, but the feature set surrounding the phone sucked: it didn't have ringtone profiles / caller groups, and so was pretty infuriating versus a basic Nokia phone, in addition to having a fraction of the battery life, a worse form factor to hold up to your ear, and a fundamentally more fragile construction. It also was pretty pathetic at text messages, as it didn't have any support for MMS (which was already shipping on competing devices). Notably and critically, even a cheap Nokia candy bar phone could do the most basic texting feature of sending a single message to multiple people at once... somehow the iPhone shipped without that? Even the most basic things like "at least you'll finally be able to store a ton of text messages, since the device had a LOT of storage" somehow were thwarted: Apple implemented some arbitrary limit of, I think, 1000 messages before it started deleting old ones!! I just can't believe you think the original iPhone was a better phone than its contemporary competitors.
I continue to say that these PDAs will be 5" touchscreens, allowing you to play games -- games far more powerful than the Mega Drive, which you can play with your friends from anywhere in the world. The PDA has a far superior, higher-resolution screen beyond 640x480 and can render 3D images much faster than the supercomputers of today (i.e. SGI workstations).
All of these claims... alongside explaining what a PDA is, which you can fit in your pocket and which won't need charging for 5+ hours, etc...
Even people who do see the possibility will likely spend most of their efforts explaining that their RC cars don't last 30 minutes without recharging the batteries, so this won't work either!
Of course, by PDA (Personal Digital Assistant) I refer to what we call smartphones today.
Point is - I try to be realistic about future tech even in the next 20-25 years.. but I leave some areas open to being more "advanced" than we anticipate.
Yes, contact lenses are a stretch for the "next 20-25 years", but let's not ignore the suggestion. Young kids in 2040 will laugh at how we managed to wear these "heavy" VR headsets. Who knows how this type of tech will evolve and mix with other tech.
This is a solution that creates 20 new problems.
> too much credit to a corporation
What does this mean? Are indie developers supposed to do a better job than a corp (see: a group of people) with billions in R&D?
Literally the only cloud drive product I know of which doesn't work on my corporate laptop is iCloud Drive, because the EMM gave a checkbox to set a flag. As a result, a huge portion of built-in collaborative features and apps just don't work. I have paid seats in other products only to regain functionality lost by that checkbox.
Honestly headset weight is more of an issue for 4+ hour working sessions than exercise.
Maybe it's Apple trying to push the limits of this tech. The A16 Bionic is something like 40% slower than the M2 across various computational benchmarks, so I won't be surprised to see this form factor show up in Android devices soon.
And Apple is supposed to be the master at taking a bit longer to nail it.
If anything they just inadvertently demonstrated how flawed strapping a screen to your face is if you're trying to do something.
This is something I've been wondering about. The Vision Pro's displays are 4K per eye, for the entire field of view. The monitor I'm sitting in front of is 5K, and takes up quite a bit less than my full field of view.
Surely the virtual Mac screen (and everything, I guess) is gonna be substantially lower resolution than traditional high-DPI screens at normal viewing distances?
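A rough pixels-per-degree comparison makes the gap concrete. The headset figures below are estimates (Apple doesn't publish per-eye resolution or exact field of view), and the monitor is assumed to be a 27-inch 5K panel viewed from about 60 cm:

```typescript
// Rough pixels-per-degree (PPD) comparison. Headset numbers are estimates,
// not published specs; the monitor is a 27" 16:9 5K panel at ~60 cm.
const deg = (rad: number) => (rad * 180) / Math.PI;

// Vision Pro, estimated: ~3660 horizontal pixels per eye over ~100 degrees FOV.
const headsetPpd = 3660 / 100; // ~37 px/deg

// 27" 16:9 monitor: ~59.7 cm wide, 5120 horizontal pixels, viewed at 60 cm.
const monitorWidthM = 0.597;
const viewingDistanceM = 0.6;
const monitorFovDeg = 2 * deg(Math.atan(monitorWidthM / 2 / viewingDistanceM)); // ~53 deg
const monitorPpd = 5120 / monitorFovDeg; // ~97 px/deg

console.log({ headsetPpd, monitorFovDeg, monitorPpd });
// Roughly 2.5-3x fewer pixels per degree than a 5K monitor at a normal viewing
// distance, so yes: the virtual Mac screen is noticeably lower in effective
// resolution, however big you make it.
```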