Apple Vision Pro…. Anybody buying?

Has anybody here started recording videos on their iPhone 15 Pro/Pro Max in anticipation of Vision Pro? Played a little with it and WOW… file sizes are large (to be expected, in fairness). But knowing something and actually experiencing it in person are two different things.
Gotta think that if Vision Pro takes off, it will do wonders for sales of higher-storage-capacity iPhones!
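
For a rough sense of the storage math, here's a back-of-the-envelope sketch in Swift. The ~130 MB per minute figure is the commonly reported capture rate for iPhone 15 Pro spatial video (1080p/30), and the 100 GB of free space is an example; both are assumptions, so plug in your own numbers.

```swift
import Foundation

// Back-of-the-envelope storage math for spatial video.
// Assumptions (not Apple's official numbers): roughly 130 MB per minute
// of 1080p/30 spatial capture, and 100 GB of free space on the phone.
let megabytesPerMinute = 130.0
let freeSpaceGB = 100.0

let minutesOfFootage = (freeSpaceGB * 1_000) / megabytesPerMinute
print(String(format: "~%.0f minutes (about %.1f hours) of spatial video fit",
             minutesOfFootage, minutesOfFootage / 60))
// Prints roughly "~769 minutes (about 12.8 hours)". Plenty in isolation,
// but the footage shares the phone with photos, 4K video, and apps, which
// is why higher-capacity models start to look attractive.
```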
 
I'm wondering how a person wearing an AVP and, say, doing a virtual museum visit navigates from area to area of the museum.

Sitting on a couch, I'm assuming the head movements (left, right, up, and down) that let you look around a museum space from a static position (your couch) are handled by solid-state MEMS inertial sensors (accelerometers and gyros from the likes of Analog Devices) plus the AVP's tracking cameras.

But how is "walking forward" (to go to a different area/space in a museum) handled from your couch? Perhaps finger gestures (pointing in a direction) or snaps detected by AVP cameras?
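
One way it could plausibly work (purely a guess, not Apple's documented approach): gaze at a spot in the virtual museum, tap your fingers together, and the app slides the scene toward you as if you had stepped forward. A minimal visionOS-style sketch; the view name and the 0.75 m step size are made up for illustration.

```swift
import SwiftUI
import RealityKit

struct MuseumNavigationView: View {
    // Root entity standing in for the loaded museum scene.
    @State private var museumRoot = Entity()

    var body: some View {
        RealityView { content in
            // A real app would load the museum model here and give it
            // collision + input-target components so taps can hit it.
            content.add(museumRoot)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { _ in
                    // Crude "step forward": slide the whole scene 0.75 m
                    // toward the viewer, who sits at the origin facing -Z.
                    museumRoot.position.z += 0.75
                }
        )
    }
}
```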

Any thoughts?
 
Good question. How about the gesture where the middle and index fingers are moved back and forth to simulate walking? I’m sort of kidding, but maybe this could work.
 
Great idea! And the rate at which your two fingers are moving back and forth would move you faster/slower.

But what about a Michael Jackson-style backwards moonwalk? :)
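
For fun, here's what the finger-walking idea could look like as an input mapping: estimate the cadence of the fingertip's back-and-forth swing and scale it into a movement speed, with a negative stride for the moonwalk. Everything here (the sample type, the 0.4 m stride) is hypothetical; real fingertip positions would come from the headset's hand tracking.

```swift
import Foundation

/// One hand-tracking sample: a timestamp plus the fingertip's forward/back
/// offset (metres) relative to the palm. Hypothetical data shape.
struct FingerSample {
    let time: TimeInterval
    let offset: Double
}

/// Estimate "walking" speed from how fast the fingers are swinging:
/// count zero crossings of the offset signal to get a cadence (Hz), then
/// scale by an arbitrary stride length. A negative stride walks you backwards.
func locomotionSpeed(samples: [FingerSample],
                     strideMetres: Double = 0.4) -> Double {
    guard samples.count > 2 else { return 0 }
    var crossings = 0
    for (a, b) in zip(samples, samples.dropFirst())
        where a.offset.sign != b.offset.sign {
        crossings += 1
    }
    let duration = samples.last!.time - samples.first!.time
    guard duration > 0 else { return 0 }
    let cadence = Double(crossings) / 2.0 / duration   // full cycles per second
    return cadence * strideMetres                      // metres per second
}

// Example: moonwalk by passing a negative stride.
// let speed = locomotionSpeed(samples: recentSamples, strideMetres: -0.4)
```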
 
I'm just not exactly seeing the killer feature here for me. It's a bit like the Apple Watch in that way.

And as someone who can't seem to find headphones that aren't a huge pain to wear for more than 30 minutes, it's hard for me to get jazzed about yet another "fits the average person" gadget that may or may not fit me.
I agree with you. So far it seems to mainly revolve around placing 2D iPad apps in a 3D space. The closest I come to seeing a "killer" app is media consumption (watching videos in a simulated theater, etc.) and 3D photos. But it's got to do a lot more than that to be worth the money for me.
 
Although I won't be purchasing the 1st-gen AVP, I'm looking forward to seeing how realistically and convincingly it can place a user (me) in a dynamic space with other people; say in a museum, in a football stadium watching a game, on a beach, in a library, hiking the Grand Canyon, in a Chinese New Year parade, etc.

I'm also interested in learning what was accomplished through the collaboration with Stanford University's AR/VR laboratory over the last seven years. I'm hoping that will have led to Apple developing an AR-builder-type application, letting people (non-programmers) design their own AR apps, similar to how the 4th Dimension database product let customers design custom database programs with custom UIs. That was a great product.

When the 2nd-gen AVP is released (probably next year), I'm looking forward to removing my own appendix, being able to summon up loads of medical/surgical reference information to guide me through the procedure. :)
 
@Citysnaps
I was going to message you privately to say this, but then I saw you commented here, so I figured I'd say it here! I wanted to commend you for dealing with all of the constant morons who plague every MacRumors blog post on this product. It is so irritating to read what they write, but you kept going, so kudos for putting up with that. I hope you told some of the actually decent people in those threads about this forum! Anyway, thanks for pushing back on the constant trolling those few seem to do.
 
Anyone know which prescription matters for these things (for those of us who have both near and far prescriptions)? Sounds like near? I scheduled my eye exam for this Friday, just in case I waver and decide to buy this thing next week.
 
Hard pass. I imagine it will be great for gamers, but not much for productivity.

I'm not singling you out personally; I just want to address this alongside some similar comments on here.



This is not a gaming device. I understand that, because of its physical form, it's often compared to what's already on the market, which are VR headsets in the form of game consoles, but this is not that.

The best way to explain it is with the terminology itself: blending digital content into your physical space, or spatial computing. It supports immersive experiences (the closest thing to VR, but nothing like what VR has traditionally offered), but those are optional. This is a product built to be in your world and keep you in your world (augmented reality, or AR). It is not something you go into to escape life; on the contrary, it is designed to enhance it.

Facebook's Oculus Quest is most often compared to this, again for the reason I stated at the beginning. But these two products could not be more different. It's like comparing an Xbox to a desktop PC: they're used for different purposes. Just because Facebook is trying to brand itself as visionary by bolting on a feature or two from Apple's product doesn't make its product a spatial computer.

There have been two major interfaces in the market: the text-driven UI and the graphical UI (GUI). The text-driven UI started the entire personal computing revolution with the Apple II, putting a personal computer on the desks of many who had barely even heard of a computer. Then the GUI came to the mass market in 1984 with the Mac, and since then all existing products have been various forms of GUI, all different but fundamentally the same. The question has arisen: what comes next, after the GUI? After all, whether mouse, click wheel, or multi-touch, they are all fundamentally GUI-driven. This is where this product comes into play: it is to the new paradigm what the Mac was to the GUI. This is not the iPhone, which can be considered an extremely advanced, forward-thinking refinement of the GUI. This is the beginning of a new UI paradigm.


When the Mac came out, as ridiculous as it sounds to me now, I'm sure there were naysayers about the GUI: it's a toy, you can't do things as fast as with typed commands, it's not as advanced, etc. But what the GUI did bring was an entirely new way to do computing, something that has revolutionized and created entire industries. Think about everything around you: nearly everything is driven by a GUI now, and there is very little not touched by this concept. Entire industries exist, like on-demand accommodation and travel, where you can book a taxi, a plane, and a hotel room from YOUR room, with all the pictures and videos of where you wish to go, in an instant. These sorts of experiences would never be possible on a text-driven UI like the Apple II's. You could never see, hear, or touch (with multi-touch) your photos and videos and bring them anywhere with you. You couldn't video call your family, friends, or anyone else. These are only a few examples, but you get the point: the GUI has revolutionized the entire world.

But at the beginning it wasn't really that way (not that I was alive then, but that's the great thing about the Internet: you get to learn about things you weren't alive for). The Mac launched not with millions of apps that knew what a GUI was truly for, but with a small palette of apps designed by Apple that provided functionality while also demonstrating what you could do with a GUI. For the first time you could draw, or see beautiful fonts like in a book.

But the thing is, much like certain critiques today, the app landscape of that time was built for the text UI, where everything thought possible and useful on a PC was defined by what you could do with command-line prompts and a text box: lines, letters, and numbers. Technically speaking, the Mac was a new thing, and yet most of the apps of the day arguably ran better on an Apple II than on a Mac. If your app is a spreadsheet, you're not going to think much of the potential of a GUI, right? At best, the GUI of the time just recreated it; maybe it looked a little cooler, but it was fundamentally the same.

That's the thing. All apps today are designed for the GUI and are limited by the boundaries of a screen, a mouse, or a touch screen, whether it's one piece of glass or a foldable, rollable device. They are all fundamentally built for the GUI. Which means GUI apps will, of course, be supported here, just as text-UI apps like spreadsheets were supported on the Mac, but none of them are (yet) taking advantage of the new UI paradigm. That's why a lot of apps, like Messages or Safari, look like a "2D window in 3D space" (though not exactly, because they take advantage of the new paradigm more than that).
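
To make the "2D window in 3D space" vs. new-paradigm distinction concrete, here is a minimal visionOS-style sketch: a conventional SwiftUI window sits alongside an immersive space whose content is laid out in room coordinates rather than inside a window. The "Gallery" ID and the placeholder sphere are made up for illustration.

```swift
import SwiftUI
import RealityKit

@main
struct SpatialDemoApp: App {
    var body: some Scene {
        // The familiar "2D window in 3D space": an ordinary SwiftUI window
        // the wearer can place anywhere in the room.
        WindowGroup {
            ContentView()
        }

        // The new part: an immersive space whose content lives in room
        // coordinates instead of inside a window.
        ImmersiveSpace(id: "Gallery") {
            RealityView { content in
                // Hypothetical placeholder content: a sphere floating
                // about a metre in front of the wearer.
                let sphere = ModelEntity(mesh: .generateSphere(radius: 0.2))
                sphere.position = [0, 1.5, -1]
                content.add(sphere)
            }
        }
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        // Tapping the button moves from "app in a window" to content
        // anchored in the user's actual space.
        Button("Enter gallery") {
            Task { _ = await openImmersiveSpace(id: "Gallery") }
        }
    }
}
```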
So when you ask, “what’s the point?” I tell you to look at this:

Reliving your memories of those you care about; the ability to see them three-dimensionally, so much more than a traditional picture that it's been described as peering into someone's life.
Not getting “the best seats in the house,” but getting the ONLY seats in the house; the ability to see a play being shown elsewhere, a sporting event in another part of the country, or experiences like diving into the shallow water of a tropical island with unique animals, all from a perspective impossible for a group of people to share in real-life — all captured with 3D, 8K, HDR video that wraps around you.
Seeing and learning about things you could never normally have access to; the ability to see something you're learning about, look inside it, and watch, right in front of you, the real-time interactions of atoms and molecules, or what the inside of an animal looks like, and learn how all of these things work in ways a textbook or even an iPad can't fully convey.

The possibilities of a fully three-dimensional interface have barely begun to be explored, and I hope I've demonstrated a bit of how vast they are. This is more than playing a game on a floating screen in front of you; it is about revolutionizing how the whole world works, all over again.

The era of spatial computing is here.
 
Anyone know which prescription matters for these things (for those of us who have both near and far prescriptions)? Sounds like near? I scheduled my eye exam for this Friday, just in case I waver and decide to buy this thing next week.
Apparently Joz from Apple has a prescription but reportedly doesn't need to use lenses; his eyes work fine with it. So I suppose it will depend on each person. That said, it sounds like Apple is offering inserts for both near- and far-sighted people. The UI is designed to be out of reach to encourage remote interaction, which means that if you can't see clearly more than an arm's length away, you won't be able to see the UI well.

Essentially, if you have a prescription you should absolutely use it here. You may be able to see some things without it, but your eyes may strain more quickly than they would viewing it through a corrective lens. That's really great timing that you have an eye exam lol!
 
Sure, but the issue is that some of us are both near- and far-sighted (and wear bifocals or progressive lenses). So the question is which prescription is the one that matters. Since they offer "reader" lenses, I assume what matters is near vision, but I'm not sure.
 
Best is to ask Apple to be certain, but from what I understand the distance prescription is more important, given the way the UI is designed. You can get up close to it and interact with it like multi-touch, but Apple says it's designed to be out of reach, to encourage you to sit or stand a few feet away and gaze at the UI.

Edit: I've learned more just now. If you can only choose one (and Apple says choosing one part of a bifocal prescription is fine), I'm definitely going with the distance prescription.
 
This is going to be a big pass for me. The lenses on my glasses are the biggest line-item expense, and my prescription changes at least every two years. I imagine the custom lenses for prescription wearers are going to be a big expense, and one that will need replacing every couple of years.

I'll wait until some of this shakes out after the release before making my determination on this potential issue.
 
I’ve done a little more research on it, based on competing VR goggles. Looks like “distance” is what matters, but the effective focal distance is usually something under a couple meters (not clear what Apple’s is). So even if you need distance correction, you may not need anything at that distance. Guess I’ll try one in the store and see how blurry things are. Would be nice if it offered some knob like my cameras do where I can just turn the dial until everything is sharp.
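
If you want to sanity-check that with basic optics: whether you need correction depends on whether the virtual image sits beyond your uncorrected far point. A quick sketch; the 1.3 m focal distance and the -2.0 D prescription are assumptions for illustration, not Apple's numbers.

```swift
import Foundation

/// Vergence (in dioptres) of an object at a given distance in metres.
/// Optics refresher: dioptres = 1 / distance(m).
func vergence(atMetres d: Double) -> Double { 1.0 / d }

/// Far point of an uncorrected myopic eye, in metres.
/// A -2.00 D eye focuses clearly only out to 1 / 2.00 = 0.5 m.
func farPoint(sphere: Double) -> Double { 1.0 / abs(sphere) }

// Assumed, not confirmed by Apple: the optics place the virtual image
// somewhere around 1.3 m. Plug in your own prescription.
let displayFocalDistance = 1.3          // metres (assumption)
let distancePrescription = -2.0         // dioptres (example myope)

let demand = vergence(atMetres: displayFocalDistance)    // ≈ 0.77 D
let clearOutTo = farPoint(sphere: distancePrescription)  // 0.5 m

// If the virtual image sits beyond your uncorrected far point, the
// distance prescription is the one that matters.
print(displayFocalDistance > clearOutTo
      ? "Needs distance correction (or inserts)"
      : "May be usable without correction")
```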
 
This is going to be a big pass for me. The lenses on my glasses are the biggest line-item expense, and my prescription changes at least every two years. I imagine the custom lenses for prescription wearers are going to be a big expense, and one that will need replacing every couple of years.

I'll wait until some of this shakes out after the release before making my determination on this potential issue.
Well that’s fine, but I’m confused. Are you aware of the pricing that was announced today or not yet? Because it’s not $600 as some fools rumored lol
 
So even if you need distance correction, you may not need anything at that distance. Guess I’ll try one in the store and see how blurry things are. Would be nice if it offered some knob like my cameras do where I can just turn the dial until everything is sharp.
Correct, and I believe that's what Joz was saying from his personal experience. I'm presuming no one here likes me, but I hope I helped a bit by giving you some information.
Also, some headsets do offer something like that, but they are severely limited and, from what I've heard, just plain bad. Apple does have patents on exactly the kind of thing you're talking about, though. So hopefully in a future generation!
 
Best is to ask Apple to be certain, but from what I understand the distance prescription is more important, given the way the UI is designed. You can get up close to it and interact with it like multi-touch, but Apple says it's designed to be out of reach, to encourage you to sit or stand a few feet away and gaze at the UI.

Edit: I've learned more just now. If you can only choose one (and Apple says choosing one part of a bifocal prescription is fine), I'm definitely going with the distance prescription.
I just have this mental image of being in the Apple Store wearing this thing on my head with the employee going “one or two? … two or three?” “Uhhhh one I think?” 🙃
 