STEREOSCOPE

Samsung’s Galaxy XR, Meta’s Moves, And Avatar’s 3D Return

Byron Diffenderffer, Anthony Vasiliadis Season 1 Episode 15


We unpack Samsung’s Galaxy XR strategy, Meta’s push for premium licensed content, and Apple’s VisionOS upgrades that aim to close real usability gaps. Along the way, we test practical tools—from room scanning to video glasses—that make spatial creation and viewing feel ready.

• Samsung Galaxy XR specs, display claims, pricing, launch plan
• Android XR platform, Snapdragon XR2+ Gen 2 performance
• Sony micro‑OLEDs, persistence and smearing explained
• Galaxy phone spatial capture and 4K30 pipeline
• Meta Connect: Blumhouse Enhanced Cinema on Quest
• Avatar 3 teaser quality, frame rate, and MetaTV curation
• MetaTV strategy shift and licensed content implications
• Meta Hyperscape room scanning, fidelity, and workflows
• Ray‑Ban display glasses with EMG wristband input
• VisionOS 26: PSVR2 controllers, widgets, 90 Hz hands
• Native 180/360 playback, macOS spatial rendering
• Steam “Frame” standalone rumors and streaming
• Viture Pro XR impressions, stereo preview, and use cases



SPEAKER_00:

Hi. Welcome to the Stereoscope Podcast, number 15. Here we are, back in the studio. It's been a while. We've got some stuff cooking. Anthony's been doing a lot of projects. I've been incredibly busy. Sean, what have you been up to? At home. At home. And outside my home. Hey, you shot a couple movies over the summer.

SPEAKER_01:

That's true. They're gonna be out there, and it's so much fun to do them in the world of Portland, Oregon, where we make things in Oregon. Yeah, we are makers here, that's for sure.

SPEAKER_00:

Movies? Yeah. So we're back in the studio. We've got a bunch of news today, a bunch of new stuff. It feels like we're right on the tip of the next generation of XR and VR stuff. Yeah. Because all the stuff that has been cooking for a while is finally starting to release. Or there are strong rumors of release. Yeah, for sure. The first one straight out of the gate, and it feels like we should know more at this point, but we don't, especially because this was supposed to launch last year, is Samsung's Project Moohan, reported to be releasing October 21st of 2025, this year. We've talked about it a bunch because we first heard about it two years ago.

SPEAKER_02:

Yeah, it's been hotly anticipated.

SPEAKER_00:

And they've been working on it for I think five or six years. Supposedly, uh, Samsung's XR headset was supposed to come out like three years ago. Yeah.

SPEAKER_02:

Similarly to when the iPhone came out, they're like, oh, and they went back to the drawing board after the AVP came out. So delays. Yeah.

SPEAKER_00:

So it's gonna, as we've said before, it's gonna be running Android XR. This is a co-branded Google slash Samsung headset. It's running the Snapdragon XR2 Plus Gen 2, which is the next iteration of the same chip that's in the Quest 3. It's a little bit more powerful. So it's not a generational leap in processing power, but it is more performant. Mind you, it's gonna be running more stuff.

SPEAKER_02:

So yeah, I'm actually, this is like the biggest sticking point for me on this whole device. What sticking point?

SPEAKER_00:

What kind of stuffs? Yeah, I mean, it's also running high-resolution Sony micro OLEDs. They're not using their own. We were talking about this. There was some speculation whether they were using their own Samsung OLEDs, but it looks like they're using the Sony ones. But supposedly these screens are going to look better than the AVP. We don't really know how.

SPEAKER_02:

What does better mean? How do they define that?

SPEAKER_00:

I'm guessing that they'll probably find some way to deal with the persistence issue and the smearing because they've had some time to sort of iterate and it seems like they are.

SPEAKER_01:

Explain to me why less smearing is better. Smearing? Yes. For those who don't know what persistence is.

SPEAKER_00:

So, persistence. OLED panels, the way they work is that they turn on and off, and sometimes they don't do it quickly enough, so it creates a smearing effect in the headset.

SPEAKER_02:

Basically, refresh rate is you know more for the gamer side of things. Like it's refresh rate, basically.

SPEAKER_00:

And it's a baked-in, quintessential issue with all OLED headsets. There is some technology to minimize that, and these micro OLEDs, we think, are specifically designed to minimize it. So that's... well.

SPEAKER_02:

Yeah, so basically, if you demo the AVP or own one, you move around and you see a sort of fuzzy, ghosty thing happening every time you move your head. That's the persistence we're talking about. Yeah. If you're lucky you didn't notice it. I noticed it immediately and it ruined the entire experience for me.
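For a rough sense of why persistence reads as smear: the blur is roughly how far your head rotates while the pixels are still lit. A minimal back-of-the-envelope sketch, with illustrative numbers rather than measured specs for any headset:

```python
# Rough illustration: perceived smear = head rotation speed x time the pixels stay lit.
# Numbers are illustrative, not measured specs for any particular display.

def smear_degrees(head_speed_deg_per_s: float, persistence_ms: float) -> float:
    """Angular smear of a static virtual object while one frame stays lit."""
    return head_speed_deg_per_s * (persistence_ms / 1000.0)

head_speed = 100.0  # deg/s, a casual head turn

for label, persistence_ms in [("full persistence @ 90 Hz", 11.1),
                              ("low persistence", 2.0)]:
    print(f"{label}: ~{smear_degrees(head_speed, persistence_ms):.2f} deg of smear")
# full persistence @ 90 Hz: ~1.11 deg of smear
# low persistence: ~0.20 deg of smear
```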

SPEAKER_00:

That being said, it's also gonna have eye tracking, hand tracking, multimodal input, so it'll have, you know, Google Gemini baked in. And it's also got the external tethered battery, which we can see up there. Supposedly this battery is hot swappable, though. So that's a big thing.

SPEAKER_02:

That's a good improvement.

SPEAKER_00:

The other thing that we're hearing from lots of the reporting is that the headset has finally got a specific price range locked down. It's gonna be between $1,800 and $3,000.

SPEAKER_02:

Yeah, if you can call that specific. USD.

SPEAKER_00:

Yeah. So yeah, and that's USD. And I think the last time we talked, there wasn't a specific number like that locked down.

SPEAKER_01:

What was the last thing that cost upwards of that 3,000? Is it the AVP?

SPEAKER_02:

Yeah.

SPEAKER_01:

Yeah. Yeah.

SPEAKER_00:

Starts at $3,000... or, $3,500, yeah. Starts at $3,500. After everything, for most people that get it, it's like $4,200. Yeah. So the fact that they're launching it between $1,800 and $3,000 means that they want to undercut Apple. They probably wanted to launch it for more, I'm guessing, but they realize that nobody's going to pay that, because look at how many AVPs were sold.

SPEAKER_02:

Sure. Or they're just actually targeting the rumored price point of the next AVP. The competitor.

SPEAKER_00:

Because the next AVP is going to be... I would guess that the Apple Vision Pro competitor will launch at $1,500.

SPEAKER_02:

Oh, yeah. See, and the strongest rumor I've heard is $2,500, but either way, significantly less. Sure. I mean, I guess I always want it to be less. Of course. And maybe it will be. And that's right within that window, right?

SPEAKER_01:

Yeah. It definitely feels like Coach, Burberry, Prada. Like, where's your price point for that kind of thing? What do you want? Well, and these are luxury devices.

SPEAKER_02:

Yeah. I mean, yeah, once you go over $1,000, it's a luxury device.

SPEAKER_01:

Yeah. Or $10,000. That's a camera. That's like one of the cameras that we're using here; three of them combined would be close to that. Yeah.

SPEAKER_00:

And the other thing you have to keep in mind is that they're only targeting 100,000 launch units, and the first batch of units is only going to be available in South Korea, sort of the inverse of the Apple Vision Pro rollout, which makes sense because Samsung is a South Korean company. And it seems like they're explicitly trying to have the headset hit most of the problem places of the AVP. Like it's smaller, it's lighter. I mean, even from photos, you can tell that it generally looks less bulky. I think that little flap on the top is a little bit misleading. But supposedly it's heavily compressible.

SPEAKER_02:

So yeah, I've heard it's really comfortable.

SPEAKER_00:

Yes, that's been the... Also, there's better weight distribution, I've heard. Well, what's interesting is that it has much more of an open-periphery design than even the Apple Vision Pro does. Because the Apple Vision Pro creates a literal seal around your face. This, if you can see, right? There's actually not very much blocking light at the bottom. Supposedly there is a bit you can get that closes that off, but it doesn't launch with it.

SPEAKER_02:

Yeah, interesting.

SPEAKER_00:

So the other interesting thing that has leaked out around the same time, and gives more context to Project Moohan, is that Samsung is preparing to launch a feature for its flagship phones which allows a similar 3D capture to what Apple rolled out for their phones.

SPEAKER_01:

Yeah, like they're doing what Apple forgot to do. Well, Apple did do. Yeah, but they did it after they forgot to do it.

SPEAKER_02:

No, no. They launched that feature with the AVP.

SPEAKER_01:

Oh, but nobody knew.

SPEAKER_02:

Yeah, I guess not. I mean, I guess they forgot. We've been using it the whole time.

SPEAKER_01:

Yeah. Yeah, the point is that, like, the idea of capturing on your phone and viewing it right there... we're a lot closer to having that be one thing, one thing that they want to sell, that they want to put out there on the demo floor. It's not like this is a completely immersive experience just for consumers asking, where's the content? So Samsung wants to be part of that conversation.

SPEAKER_00:

So a Samsung-watching site first found this feature; it was leaked in an existing phone. They took the app and sideloaded it onto current flagship phones and it worked. There's also some documentation within the app itself that says this content is designed for viewing on Galaxy XR headsets. So that pretty much tells us what the headset is going to be called. Yeah. So it looks like Project Moohan is gonna be called Samsung Galaxy XR, which of course it is. Yeah. Yeah, to infinity and beyond. I don't know why it took them two years to come up with that name.

SPEAKER_02:

Well, they probably didn't. They just, I don't know, maybe there's some sort of mystique around like, ooh, it's project, blah, blah, blah. And so like it makes it feel like a skunk works thing.

SPEAKER_00:

I had heard though that they were tooling with the name even as recently as like two or three months ago.

SPEAKER_02:

If they really put a lot of effort and ended up at Galaxy XR, then that's kind of pathetic. That feels like Samsung. It's the most obvious name possible.

SPEAKER_00:

That feels like... I mean, that's the company who named their foldable the Samsung Fold. Galaxy Fold. Galaxy Fold. Yeah. So I don't know.

SPEAKER_02:

It's an obvious name, but that's what I'm saying. If they spent a lot of time on it, then that's fine.

SPEAKER_01:

That's it's the familiar brand. Like Galaxy has been around forever. When I think of a Samsung Galaxy, I think of their phone.

SPEAKER_02:

Yeah.

SPEAKER_01:

You know? I don't think of a TV, right? I think of like the phone. Yeah.

SPEAKER_02:

It is their mobile device.

SPEAKER_00:

Uh, also, supposedly it works on all their current high-end models. And the leak suggests that the 3D video will be able to go up to 4K 30, which currently the iPhone's native integration does not do. The native app. That's what's interesting.

SPEAKER_02:

So if you use Spatialify or whatever, however you want to say it, it has a 4K uh 30 mode. 4K 30 mode.

SPEAKER_00:

You've said that there are some quality differences between them.

SPEAKER_02:

Yeah, well, the main difference... I like the way that the Apple app renders, for whatever reason. The biggest difference, though, and I think why they only advertise 1080, is that in the native app it's stabilized. In Spatialify, you can turn that off; basically, whenever you turn on that feature, the image gets bigger. Interesting. So I think that's what's actually happening.
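A minimal sketch of why crop-based stabilization trades away resolution. The crop margin here is a made-up illustrative number, not Apple's actual figure:

```python
# Crop-based stabilization needs margin to shift the frame around, so the
# delivered image is a crop of the full sensor readout.
# crop_margin is illustrative, not a real spec.

def stabilized_output(sensor_w: int, sensor_h: int, crop_margin: float):
    """Resolution left after reserving crop_margin on each edge for stabilization."""
    return (int(sensor_w * (1 - 2 * crop_margin)),
            int(sensor_h * (1 - 2 * crop_margin)))

w, h = stabilized_output(3840, 2160, crop_margin=0.10)
print(f"4K readout with 10% margins -> {w}x{h} usable")   # 3072x1728
print("...already well below native 4K before any extra headroom for bigger shakes")
```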

SPEAKER_01:

Because they, you know, really don't want you capturing... Oh, they don't want you to barf. They don't want amateur VR virtuographers to create barfy material. Right. Yeah. Yeah.

SPEAKER_02:

And most people are gonna be walking around while filming, because they don't know. And it's not immersive; I mean, this is spatial, so it's not as sickness-inducing, but still, you want that video to be stable.

SPEAKER_01:

Uh but that's just kind of gonna it's baked in into the leak.

SPEAKER_02:

Like, it is good to see that metadata leak, basically.

SPEAKER_01:

Feature parity happening with Samsung.

SPEAKER_02:

Yeah. It is good. I mean, it seems to be one of the more powerful features. When people talk about what makes them emotional about the AVP, it's capturing these home-movie, memory-type things and being able to relive them.

SPEAKER_00:

And I will say, I don't have an iPhone, so I don't have access to that feature. But I know that if I had it on an Android device, and it looks like I will, I would absolutely be using it.

SPEAKER_02:

Yeah, it's great. I mean, I used it on a recent trip to Chucky. Loved it. I also wonder if...

SPEAKER_00:

Where'd you go, Jesse? I'm curious to see what their implementation of the dual-camera setup is. Are they going to be using something similar? Like, is one going to be a fisheye? Because those Samsung cameras, there are like six lenses on there.

SPEAKER_02:

Yeah, and see, the question about that is, it can't be every phone; it has to only be phones that have the right arrangement of cameras. Yeah. So if the cameras are horizontal across the vertical phone, then that won't work.

SPEAKER_03:

Yeah.

SPEAKER_02:

Or maybe they'll make it so you can film this way. But, you know, depending on the phone, you won't be able to do it. Just like with the iPhones, you have to have at least a regular iPhone with the two lenses.

SPEAKER_00:

It seems to me that you could probably figure out which which phones this is coming to based on the cameras, yeah. The the layout of the lenses.

SPEAKER_02:

Yeah. If they have the same like one's a fisheye, one's a telephoto lens, yeah. You're gonna have the same thing.
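For context on why the lens arrangement matters: two lenses separated horizontally form a stereo baseline, and the depth cue is the pixel disparity between the two views. A rough sketch with illustrative numbers, not any real phone's specs:

```python
# Disparity between the two views of a stereo pair: focal_px * baseline / depth.
# The baseline and focal length below are illustrative assumptions.

def disparity_px(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Pixel offset between left/right views for a point at a given depth."""
    return focal_px * baseline_m / depth_m

focal_px = 2500.0   # assumed main-camera focal length in pixels on a 4K-wide frame
baseline = 0.020    # assumed 20 mm between the two lenses

for depth in (0.5, 2.0, 10.0):
    print(f"subject at {depth:>4} m -> ~{disparity_px(focal_px, baseline, depth):.0f} px disparity")
# Small phone baselines still give plenty of depth up close, very little at distance.
```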

SPEAKER_01:

So you would be able to walk into the store after this, like and see the camera. Like, you know, we have three eyes. Is it four eyes now? Is it six eyes? Like, oh, that's a six-side camera slide.

SPEAKER_00:

It's a six-eyed camera. Yes. See, you can see the Project Moohan logo in the top right corner there, or top left corner there. That's got the exact same... move farther away.

SPEAKER_02:

Yeah.

SPEAKER_01:

I mean Samsung, they really don't even really make it uh this is the back of the camera. I mean like the front of the camera, you know? Oh, sorry, this is the front of the camera with the little gold uh headset logo that shows I'm doing a spatial video, I'm capturing spatial. But if you're on the other side and you can see someone on the street and you see they have six cams, then you know that that is a virtual camera.

SPEAKER_02:

Maybe, maybe not.

SPEAKER_01:

Maybe. Maybe, maybe not.

SPEAKER_02:

Because it all you need is two. So like as long as there's two.

SPEAKER_01:

Yeah, exactly. So yeah, you never know until you see yourself on the internet with your you know, cat videos in 3D. It's about time.

SPEAKER_00:

It's about time we flooded the internet with 3D cats. Alright, so next: Meta Connect 2025 was last week, and a bunch of stuff came out. Zuck is Zucking as usual, but some cool things were announced there, and this is something that I'm pretty into. At Meta Connect 2025, Blumhouse announced a new app for Quest 3 and Quest 3S called Blumhouse Enhanced Cinema. This is pretty cool. This application, available soon in the US, will allow users to watch the full-length feature films M3GAN and The Black Phone on a large virtual screen. The experience is designed to be more immersive than a standard viewing, with special effects from the movie extending beyond the virtual screen and into the user's surrounding environment. That's pretty cool.

SPEAKER_02:

Sounds very cool.

SPEAKER_00:

Yeah. I hope this is cool. Just generally, I mean, the implementation of an official place to watch movies, any officially licensed movie watching that extends beyond the screen. Yes. You know, keyword official.

SPEAKER_02:

Like, licensed content has surprisingly been hard to come by.

SPEAKER_00:

Yeah. I mean, you can watch Amazon Prime in the headset. Yeah, you know, at like 1080 or less or something like that. 1080. But it's a good one. On the most recent version of the app, they increased the bit rate. So that's nice. It's still heavily compressed; I think it's some iteration of HEVC. So it looks good, compression isn't that bad, dynamic range is decent. Did you see that this is Blumhouse? Yeah, this is Blumhouse. Is that the horror studio?

SPEAKER_01:

Wait, Jason Blum has a deal to do 3D movies? Or to put out the same movies... but I didn't mean to use the word 3D.

SPEAKER_00:

So the way it works in this implementation is that you'll have your movie in your living room, right? And then there are gonna be effects that happen, synced up at points in the timeline. Like, you know, the horned-mask guy from The Black Phone, played by Ethan Hawke, will, I don't know, walk by your screen at one hour, two minutes, or something like that. I don't know, I'm speculating. Or the blood will splatter on the wall.

SPEAKER_01:

Yeah, who knows?

SPEAKER_00:

Something like that. Maybe there's an explosion. That's what they used in the frame you saw above us; they showed an explosion extending out from the screen. Literally gonna bleed out. That's a very interesting and sort of experiential use for these types of things, especially for horror movies.

SPEAKER_02:

I was gonna say, especially for existing media. It's retrofitting existing media, which is cool.

SPEAKER_00:

And I think that's a good use case. There were a couple things that I saw, like The Matrix. They're doing some ScreenX-type thing in Hollywood where you actually are watching the movie inside of a dome, and then all of a sudden you're in the dojo, and each scene in the movie is synced up to a rendered environment. So you're watching the movie, and you can actually go to a movie theater and see this. Yeah. And it's a specific dome theater that they have in LA. Interesting. And it's got the movie. Oh, yeah. And then you're also in the white room with the weapon racks, and then you're in the desert of the real, etc. It's like extending the screen.

SPEAKER_01:

Yeah. Yeah. So this is that same sort of thing. Didn't they do The Wizard of Oz in that dome? That's a different thing. Yeah, that's at the Sphere. That's at the Sphere. Yeah. And that thing's different than this. Yeah, but just to paint a picture for someone who doesn't know the difference: with that dome, they took The Wizard of Oz and created extra elements, like what happens at the yellow brick road, going all the way around. Yeah, it's a similar type of implementation. But this is, yeah, similar in spirit.

SPEAKER_00:

But this is like for your living room. You don't have to go to a movie theater to do it. And I think that's cool. I think stuff like this is the type of thing that will get a lot of normies into the headset because they're like, hey, I love horror movies. I can get spooked in real life, man. Hey, stop talking about me. I'm right here.

SPEAKER_02:

Yeah, why why do all normies talk like that? I don't know.

SPEAKER_00:

It's a normies speak, obviously. Obviously, come on.

SPEAKER_02:

Obviously.

SPEAKER_00:

So it's uh the first time we've seen something like this done with like a full movie, and that's pretty cool. Yeah. I I appreciate that.

SPEAKER_01:

Yeah, I'm looking forward to more Blum House movies in life. Um especially.

SPEAKER_00:

And it's not the first time that Blumhouse has worked with VR stuff, because they've done their horror-verse social world in Horizon Worlds, and yeah. All of those are based on their other franchises. There's also, yeah, the Eli Roth thing. I mean, I don't think they're associated, but... Yeah. So this one is actually... this is interesting. We just watched this. Also announced, you can see behind us, James Cameron was at Meta Connect 2025 again, and he announced a 3D teaser trailer for Avatar: Fire and Ash. And then we watched it, and this is both a really cool thing and also incredibly frustrating, because it was really good. It was probably the best looking 3D... well, probably the best looking trailer I've ever seen in the Quest 3, ever.

SPEAKER_01:

Honestly, it's not even close. I saw it too, and I was shocked. I was blown away. This is not a video game. This is beyond... this is thousands of hours of rendering. This is thousands and thousands and thousands of hours of Zoe Saldaña in every microsecond. And I get that 3D is a thing, but just because you can see a floaty jellyfish stingray monster in a massive landscape, and machines taking them down, and then fighting them back, and then we shoot back, cut back to Zoe Saldaña all down and being like, but we should hate them... I mean, it has an emotional gravity to it, but then I'm like, wait a minute. This is even better.

SPEAKER_00:

It's also the best implementation of the 3D that I've ever seen from the Avatar movies. I've seen all the Avatar movies in 3D, and I used to work at a movie theater that showed Avatar in 3D back in the day, Living Room Theaters in Portland, Oregon. And I probably saw Avatar in 3D two or three times, and those were active shutter glasses. And if you've ever used any type of theater 3D before, every implementation of 3D dims the picture.

SPEAKER_03:

Yeah. This didn't happen.

SPEAKER_00:

Whether it's the active shutter or the polarized lenses, you're effectively cutting your light in half. Cutting your light in half. And not just that, but, and this is another thing that I noticed, that trailer was not at 24 frames per second. Correct. Yeah, I don't know if this is true, but it didn't feel like it was 24.

SPEAKER_02:

No, it was minimum 48. I think it might have actually gone 60, but like but it still had the cinematic quality.

SPEAKER_01:

Like it absolutely did.

SPEAKER_02:

See, I might be a little bit more sensitive to that. So to me, it looked more like a video game. Yeah, like I recognize that you sort of need that for this format, but I can't help it; especially when you combine high frame rates with computer-generated graphics, it just screams video game. It definitely looked more... I'm an old man.

SPEAKER_00:

I agree with you. No, I mean, I agree with you, but the problem is that with 3D in general, motion blur kills the 3D effect. 100%. And there's none of that in this. Well, and the interesting thing is that the 48 frames, if this is in fact at 48 frames per second, which we highly suspect it is, or 60... the 48 frames per second was created to minimize the loss of light for the shutter glasses.

SPEAKER_02:

Right, as in 48 versus 60.

SPEAKER_00:

Yeah, yeah. Versus 24.

SPEAKER_02:

Well, right. I mean, because like 24 was like the reason the way they arrived at 24.

SPEAKER_00:

So you're getting two frames per shutter cycle rather than one frame per shutter.

SPEAKER_02:

Well, you're getting one frame per eye per shutter. So you're still getting 48, but it's 24 and 24, you know. Yes. But yeah, I mean, they arrived at 24 back in the day because that was about as fast as it had to be to give you persistence of vision. Yeah. Any slower and you notice it, and any more is a waste. So it's a similar thing: you can't really go any slower than 48 and get the effect.

SPEAKER_00:

But exactly. But here in this implementation, you're still getting more frames per eye because it's the shutters aren't happening.

SPEAKER_02:

Well, right, yeah.

SPEAKER_00:

This is more native, so there's no weird blanking. So the 48 is interesting, because it feels less necessary on a headset, since you're getting pure image data per eye. Right. But it still looks really good.
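To make the bookkeeping above concrete, here's a minimal sketch of shutter glasses versus per-eye panels, using the frame rates discussed; it's an upper bound, not a measurement of any specific system:

```python
# Active-shutter 3D alternates left/right frames, and each eye's shutter is
# closed roughly half the time: each eye gets half the frames and, at best,
# half the light. Per-eye panels in a headset don't pay either cost.

def shutter_3d(projector_fps: int):
    per_eye_fps = projector_fps / 2   # frames actually shown to each eye
    duty_cycle = 0.5                  # fraction of time each eye sees light (upper bound)
    return per_eye_fps, duty_cycle

for fps in (48, 60):
    per_eye, duty = shutter_3d(fps)
    print(f"shutter glasses @ {fps} fps total -> {per_eye:.0f} fps per eye, <= {duty:.0%} of the light")

print("headset @ 48 fps content -> 48 fps per eye, no shutter light loss")
```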

SPEAKER_02:

And that's the thing. Like, I think we're just it's gonna be the slow rewriting of what we determine and label cinematic. And also, people are so used to video games at this point. That's what I'm saying. That's why I say I'm an old man. Like, I have not played I didn't grow up playing video games the same way.

SPEAKER_00:

Like, my laptop is capable of 144 hertz. Yeah. And I prefer playing video games at that frame rate if I can get there.

SPEAKER_01:

But the thing about this trailer: okay, if you've ever played a video game and you've seen a cutscene, right? The cutscene is somewhat different from the actual gameplay. There's a difference; a lot of times they'll render the cutscenes at 30 frames per second. But this trailer felt like actual gameplay footage and actual cutscene footage at once. And that experience, I think, if you're doing something as broadly appealing as Avatar, works, because you've got dad, mom, and their adopted child. And they all are like, this is like a game. Yeah. Now we can all be part of it.

SPEAKER_03:

Yeah. How can you tell?

SPEAKER_01:

Exactly. Can you can you tell the difference?

SPEAKER_00:

Can anyone tell the difference? Can you tell the difference?

SPEAKER_01:

Well, and one of the other interesting implementations, it feels like it's it's a seamless. And one of the other interesting metaphors is about this usage. Let's adopt this technology. We better clip this.

SPEAKER_00:

And the other thing about the rendering on Quest is that this is by far the highest resolution I've ever seen anything rendered on the headset. So they have to be doing some magic under the hood to make this happen, versus what they've been able to do in the past. Because this felt more like 4K. I'd like to know what the source file for this was.

SPEAKER_03:

Yeah.

SPEAKER_00:

Because it looked like it was 4K, but you can't actually get 4K PPD for a film on the Quest 3. Or if you can, it's still simulated 4K.

SPEAKER_02:

Yeah. Like you can't, because the screen's not 4K.

SPEAKER_01:

Right. Yeah.

SPEAKER_02:

Yeah, that's interesting.

SPEAKER_01:

Not per eye. How is this interesting, in the sense of the cinematic immersive... like the experience of being familiar with your TV at 4K and then getting 4K in your headset? Well, that's kind of the question, right?

SPEAKER_02:

It relates because we're talking about perceived resolution versus actual literal resolution. Yeah. And since you literally can't have 4K native resolution in that view window in the headset, but it kind of looks like it's 4K, how are they doing that?

SPEAKER_03:

Yeah.

SPEAKER_02:

Because it's not just a straight 1080p file or a 4K. It could just be a 4K file. Maybe that's what it looks like when you oversample it.

SPEAKER_00:

It feels like it was a 4K file that they got to render at a very high perceived PPD. So this gets into some of the stuff that we've been talking about.

SPEAKER_02:

Right. So it's one of those things that like we'd have to just look under the hood to see how to do that.

SPEAKER_00:

And there's no way that we are going to be able to know without breaking the trailer that we watched.
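Some rough angular-resolution bookkeeping makes the perceived-versus-native point concrete. The panel and FOV figures below are the commonly cited approximate Quest 3 numbers, not anything confirmed about this trailer's pipeline:

```python
# Why "4K" on a headset is perceived rather than native: count how many panel
# pixels actually span a virtual screen. Figures are approximate, commonly cited
# Quest 3 numbers, not official rendering details.

panel_w_px = 2064          # per-eye horizontal pixels (approx.)
horizontal_fov_deg = 110   # per-eye horizontal FOV (approx.)
ppd = panel_w_px / horizontal_fov_deg
print(f"average ~{ppd:.1f} pixels per degree")             # roughly 19 PPD

screen_span_deg = 60       # a big virtual screen filling 60 degrees of your view
panel_px_across = ppd * screen_span_deg
print(f"a {screen_span_deg} deg screen spans ~{panel_px_across:.0f} panel pixels, "
      "far fewer than a 3840-wide source")
# So a 4K master gets downsampled at display time; the '4K look' comes from a
# clean, high-bitrate source plus supersampling/sharpening, not native 4K per eye.
```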

SPEAKER_02:

Where's that hosted?

SPEAKER_00:

It was on MetaTV. Yeah. The new MetaTV app. Yeah. There is some speculation that Meta may have changed their rendering tech for the MetaTV app. The existing MetaTV app has been deprecated. Really? So all of our videos, going forward, are not available on the MetaTV app anymore. All that stuff is just gone. Yeah.

SPEAKER_02:

They've gone fully curated, they've lost, they've ditched all the user-generated content.

SPEAKER_00:

Unless it's if it was in-house or uh partner produced content. That stuff is gone.

SPEAKER_02:

Yeah. Does that fine, I guess, whatever. It's like it's understandable and also kind of frustrating, but also like not that frustrating.

SPEAKER_00:

Yeah. You know, we weren't making anything on that anyway. But we got some of our all-time highest view counts on that. Some of our videos did absolutely spectacularly well on MetaTV. And then some of them, you know, didn't move at all.

SPEAKER_02:

Yeah. From a larger standpoint, of like, it seems like everybody's going towards curated content. And I think overall that's a better plan. Yeah. Because some of the stuff that people put out there sucks and it turns users off of viewing that content.

SPEAKER_00:

But here's the thing: right now the MetaTV app is effectively a front end for a bunch of content. There's certain stuff, like it'll populate Amazon Prime content, like movies that you have. Oh, interesting.

SPEAKER_02:

And then this is how the Apple TV app on the Apple TV works.

SPEAKER_00:

So here's the thing, though: if you try to watch an Amazon Prime video, it doesn't render it in that player. It takes you to the Amazon app.

SPEAKER_02:

It's almost exactly how the Apple TV app on the Apple TV works. It's just frustrating, Apple, that you did that whole branding thing. But that's what it is: you connect your other services, and it'll show you HBO or something.

SPEAKER_00:

But then there are certain things, like this trailer, that were rendered in the Amazon... sorry, in the MetaTV player.

SPEAKER_01:

Namely in the MetaTV player, not Amazon. So are you saying that different content is siloed? They've deprecated the app, they've siloed a lot of proprietary shit. Can we use that word at this point? Proprietary shit. And that's how they pulled it off. That's how James Cameron put together a deal. Remember a few episodes ago, we covered how instead of going to... he-who... to a different company. To Apple. Yeah. They went to this company, and they've been locked in for at least half a year.

SPEAKER_00:

I think it's most likely just that Meta told their in-house content team to reach out to any and all possible people that would be interested in providing content for their service. And James Cameron said, how much are you going to pay me? Right. And that amount was enough for him to go, okay, yes. Because all we've seen is this trailer. It doesn't necessarily mean that we're going to see the full film released on here. And this is what's so frustrating about this relationship between the movie and the new MetaTV: despite the fact that we are seeing this trailer officially hosted on the MetaTV app, there is no guarantee that we will be able to see the final movie in the headset. Yeah. Especially since the movie is by Disney. However, it was announced at Meta Connect... and the movie's only coming December 19th.

SPEAKER_01:

Yeah.

SPEAKER_00:

Well, they announced at Meta Connect that Disney Plus, and Dolby Atmos and Dolby Vision, are coming to Quest 3 soon. Why did it take this long? And why did it take their biggest competitor coming out to prove to them that this is what they needed to do?

SPEAKER_01:

And I think there are a lot of different answers to that question.

SPEAKER_00:

We were asking, and other users, power users, or even just normies, were asking: where is the real content? And the answer is, we're not James Cameron, we don't count.

unknown:

Yeah.

SPEAKER_01:

Also, the tech then, like, it was really DIY, hacky. It was a different time, you know?

SPEAKER_00:

Like, imagine a small space colony with minimal resources and laggy communications... you know, it's like, how many Quest headsets are out there in the wild versus how many Apple Vision Pros are there in the world? I've seen some walking around in the woods.

SPEAKER_01:

Like and I'm like, wait, come back, come back. No, oh okay. I guess I'll I'll come back later when it's feeding time. And I s and every time I see a wild quest, I go, Are you the same as the other quest? So I bring out my treats and I make sure my Wi-Fi is narrow beans.

SPEAKER_00:

Supposedly there are nearly... And they always come back. ...nearly 20 million Quest headsets in the world. There are probably fewer than a million Apple Vision Pros in the world.

SPEAKER_01:

When this was new: one million AVPs versus 20 million, and James Cameron went to them.

SPEAKER_00:

Yeah. So that's what I'm saying. Obviously Meta needed some sort of legitimacy, and it was their biggest rival getting into the game that proved that legitimacy. And I think that's absurd and just a lack of vision. And I don't think it's surprising that the Zuck is constantly playing catch-up.

SPEAKER_01:

He wants to remain part of the conversation. Yeah, but the table was set already by Apple. And... I mean, a board of directors still runs that company.

SPEAKER_02:

I know he can't be fired and all that good stuff, but, you know, running a company, especially in a fully capitalistic society, means certain things. The priority of the company is to make money, not to make a good product. And so it takes a certain push. That's why I think there were probably people at Meta that were like, yeah, we want to do this, and the answer was, well, we don't see a reason to do that. Then Apple does it, and oh, all of a sudden we see a reason.

SPEAKER_00:

We do have to reiterate that once upon a time you could actually buy movies through a Meta slash Facebook slash Samsung portal, back on the Gear VR and early in the Rift days, through an official app, but they deprecated that. And I think most likely what happened is all of the various powers that be were like, well, we gave you the license to do this, and there's no money. Nothing happened.

SPEAKER_02:

We didn't make a bajillion dollars, so let's just close it all down.

SPEAKER_00:

Why are we funding these theaters? Things have changed, the tech is dramatically more usable. The trailer that I just watched was comparable in quality to Apple Vision Pro 3D movies, for the first time ever from an officially licensed 3D movie trailer. And I really hope they do the whole movie that way. Because it's a tease, especially if they're getting Disney Plus. There hasn't been any communication on whether or not the 3D content... because on Apple Vision Pro, all of Disney's 3D movies are available through their Disney Plus app. There hasn't been any communication about whether or not the Disney Plus app that is launching later on Quest 3, or on Meta devices, is going to include that 3D content. If it isn't, you just never know, then let's take a moment: cool, I can watch The Mandalorian in my headset. You know?

SPEAKER_02:

But Disney's like, in in five years, we're gonna come out with our own headsets.

SPEAKER_00:

We're just gonna and you know, Kimmel's coming back tomorrow, so we'll see if the what the cultural consensus is on whether or not Disney is cancelled.

SPEAKER_02:

Whatever whatever it is, it'll be different by the time this comes out. Yep, almost certainly.

SPEAKER_01:

I just want a gin style headset that I can put on to block out the world that we're living in. But also be part of it. Yeah.

SPEAKER_00:

You you own a quest three. It's not the same.

SPEAKER_02:

But it is he wants the matrix.

unknown:

All right.

SPEAKER_02:

This is the ignorance is bliss scene. I've heard of these words before.

SPEAKER_00:

I guess, though, okay, this is interesting. Meta did say that this trailer is just the beginning of how fans will be able to experience the world of Pandora.

SPEAKER_01:

The world in Avatar, not the app.

SPEAKER_00:

We'll have more trailers for you in the future.

SPEAKER_03:

Right.

SPEAKER_00:

Alright. So this next one, moving on, this is sort of another continuation of stuff from last year. So last year this old chestnut.

SPEAKER_01:

If you're not aware, the cameras on the outside of the Quest can also capture your surroundings. And Meta's developing an entire way of immersifying and creating spaces by scanning your environment, and the fidelity is improving month to month.

SPEAKER_00:

Yeah, it's really good. So Meta is rolling out the capture part of Meta Horizon Hyperscape. Like, that's a real room. Last year they launched an app where you could look at existing spaces, some are like a Meta office or some maker space, and they were very impressive. You could walk around one-to-one in an environment, and it absolutely felt like you were there. Like it felt like photogrammetry. I'm guessing it's using Gaussian splats, but they've released their own tech so that you can do that for yourself. Actually, before the podcast started, I took the headset and I scanned our space. So hopefully, if that works correctly, I had some trouble uploading ours thus far, but you should be able to see a demo of our podcast space in a few days, if you want.
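We're only guessing that Hyperscape is built on Gaussian splatting, but for flavor, the core step that kind of renderer runs per splat is projecting each 3D Gaussian into a 2D screen-space ellipse. A minimal sketch of that standard math, with arbitrary example values:

```python
import numpy as np

# Sketch of the per-splat projection step in a Gaussian-splat renderer (EWA-style):
# take a 3D Gaussian's mean and covariance and compute its 2D screen-space footprint.
# This is generic textbook math, not Meta's implementation; the values are arbitrary.

def project_gaussian(mean_world, cov_world, view_R, view_t, focal):
    """Return the 2D mean (pixels from image center) and 2x2 covariance of a splat."""
    mean_cam = view_R @ mean_world + view_t              # world -> camera space
    x, y, z = mean_cam
    # Jacobian of the pinhole projection (u, v) = (focal*x/z, focal*y/z)
    J = np.array([[focal / z, 0.0, -focal * x / z**2],
                  [0.0, focal / z, -focal * y / z**2]])
    cov_cam = view_R @ cov_world @ view_R.T              # rotate covariance into camera space
    cov_2d = J @ cov_cam @ J.T                           # 2x2 screen-space covariance
    mean_2d = np.array([focal * x / z, focal * y / z])
    return mean_2d, cov_2d

mean = np.array([0.2, 0.0, 2.0])      # a splat 2 m in front of the camera
cov = np.diag([0.01, 0.01, 0.04])     # anisotropic blob, in metres squared
mean_2d, cov_2d = project_gaussian(mean, cov, np.eye(3), np.zeros(3), focal=800.0)
print("screen-space mean (px):", mean_2d.round(1))
print("screen-space covariance:\n", cov_2d.round(2))
```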

SPEAKER_01:

All this being your home. It took a while. It took a while. It took you like a half an hour.

SPEAKER_00:

Yeah, it took me almost a half an hour.

SPEAKER_02:

You can do this all in 30 seconds. It's like more like 30 minutes.

SPEAKER_00:

Yeah.

SPEAKER_02:

Walking over.

SPEAKER_00:

It's multi-stage too, because I'm going around like looking at everything to create the mesh, but then you actually have to go in and like get really close to stuff to get the detail pass. And then you have to look at this the ceiling, and then you have to upload it. And based on the size of the thing, it says it could take up to eight hours for it to render online.

SPEAKER_01:

On a geological timescale, that's nothing compared to where we were 30 years ago. But also, the upload failed, so it doesn't even matter if I can't upload it to the cloud. Well, try and try again. They've iterated this tech to the point where it's impressive. Like, this is a big room, you know, and the lighting is nice, but 30 years ago, right? Not a thing. A total moonshot.

SPEAKER_00:

So the other thing about these is that these spaces, they're stored on a cloud, and then they're you download them and they're reconstructed in real time on the device.

SPEAKER_03:

Yeah.

SPEAKER_00:

And their capture tech is based on the new rendering engine that they're using for Horizon, because they created their own proprietary game engine for Horizon. That seems interesting, but we'll have to see; I'm skeptical. They have promised a lot already that they have yet to deliver on. But supposedly these Hyperscape scenes will be usable by other people, not just yourself. So you could use one as the basis for a Horizon world and invite people to your own home, that type of thing, which is pretty cool.

SPEAKER_02:

Yeah. I would use this. I tried to do this last year when I was scouting locations for the feature film that I shot back in October. Tried to do the Gaussian splatting thing with Grazia or whatever, did not work, never got it to work right.

SPEAKER_00:

Oh dang.

SPEAKER_02:

I'm still looking for something like this. This sounds like the easier, actually usable solution: no weird app, you know, it's all one ecosystem. Because you were using a different app on the phone and then uploading the splat to Grazia to try to...

SPEAKER_00:

Using that to scout a location, oh my god.

SPEAKER_02:

So that way I could be like, here, my gaffer, like look at look at the space. Oh yeah, we can put lights in the ceiling right there.

SPEAKER_00:

Like, whoa. Huge for me. And you could even start planning out shots that way. Yeah. Put the camera here, put the lights there. So cool.

SPEAKER_02:

Yes. Oh, yeah. And then especially if you have like multiple headsets and you get other people into that space, you could literally be like collaborating with like actors and the director and the DP all in one space. Wow.

SPEAKER_00:

And, why not, I wonder if you could even export that and use it for virtual photography or something like that. It's probably not high-res enough for us to do it.

SPEAKER_01:

Well, yeah, the end use case is really strong here for just trad filmmaking, right? You'll save time, you'll save money, you'll save on insurance.

SPEAKER_00:

It's just like, if you can literally transport your producer into the space and be like, hey, this is our location, right? This is how much we're paying for it per day, blah, blah, blah. And he's just like, A plus. Most times you have to pay money for the location just to walk it, or you get maybe a couple hundred photos, but yeah. You know, like, oh, is there power on set, etc. That's really cool.

SPEAKER_01:

Uh yeah, yeah.

SPEAKER_00:

Or you're like Seth Rogen in The Studio, which I just started watching. Incredibly funny. The producer visits. Oh, treasure. Anyways. Alright, so moving on. This was the big one at Meta Connect, and this is sort of a big deal in general for the larger XR industry. This is sort of a culmination of a lot of different things over a long time. We've been approaching this as a consumer device for, I think...

SPEAKER_02:

I don't know what you mean. A couple decades now.

SPEAKER_00:

And that's the Meta Ray-Ban Display glasses. And there's a lot here, there's a lot to unpack. The display glasses are sort of a hybrid of two devices. You've got smart glasses with a monocular display: a 20-degree field of view, 5,000-nit display with a 600 by 600 resolution, which works out to an actually pretty impressive pixels per degree.
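The pixels-per-degree claim follows straight from the quoted specs; a one-liner to make it explicit:

```python
# Pixels per degree from the quoted specs: 600 px across a 20-degree field of view.
display_px = 600
fov_deg = 20
print(f"~{display_px / fov_deg:.0f} pixels per degree")   # ~30 PPD from those numbers
```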

SPEAKER_01:

5,000 nits, that's enough to see it outside. It's only a HUD, and it's not a tracked HUD. But 5,000 nits, that's enough to see it outside. That's bright. That's incredible.

SPEAKER_00:

My laptop has a maximum of 600 nits. Yeah. So yes, that is very, very bright for outside.

SPEAKER_01:

This is like Times Square, like levels of light.

SPEAKER_00:

Yeah. That's sure.

SPEAKER_01:

Well, those specs being the baseline... you need it to be able to see outside. So the idea is to take it out. Like, it's a fashion item. Yeah.

SPEAKER_00:

It's a it's a statement. It's a focal statement.

SPEAKER_02:

They just showed this little demo of somebody walking through the park there.

SPEAKER_00:

And all of all the influencers and journalists that I've read who have used this device, they took it outside and they all were sort of incredibly impressed that they were able to use it outside effectively.

SPEAKER_02:

Yeah. Like, they can do walking directions. So this seems like a pretty good representation of the amount of brightness. And I actually did see somebody manage to film it.

SPEAKER_00:

While somebody wasn't looking, they took the glasses off and put their camera right up to the display, and they were able to get it.

SPEAKER_02:

Oh, cool.

SPEAKER_00:

And it's the only one that I've seen of of it. Of an actual and it does, it looks just like this.

SPEAKER_01:

Yep.

SPEAKER_00:

And it's impressive. The other element is that there's a second device that comes with this, and it's an sEMG wristband. It's right there. You can see it.

SPEAKER_02:

And yeah, we saw this last year at Connect, the announcement with the Orion prototype.

SPEAKER_00:

It's surface electromyography. What it does is read the electrical impulses in your wrist to be able to tell exactly what movements your hand is making. There's also an IMU in the wristband itself, so it can measure inertial measurement unit data.
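This isn't Meta's actual pipeline, but the classic first steps of turning a raw sEMG signal into a gesture event are rectify, smooth into an envelope, and threshold. A minimal sketch on a synthetic signal:

```python
import numpy as np

# Toy sEMG gesture detection: synthetic noise plus a short "pinch" burst of
# muscle activity, then a moving-RMS envelope and a simple threshold.
# Illustrative only; not how Meta's wristband actually decodes gestures.

rng = np.random.default_rng(0)
fs = 1000                                   # samples per second
t = np.arange(2 * fs) / fs
emg = 0.02 * rng.standard_normal(t.size)    # baseline noise
burst = (t > 0.8) & (t < 1.0)               # 200 ms of activity, i.e. a pinch
emg[burst] += 0.3 * rng.standard_normal(burst.sum())

def envelope(signal, fs, window_ms=50):
    """Moving RMS of the rectified signal."""
    win = int(fs * window_ms / 1000)
    smoothed = np.convolve(signal ** 2, np.ones(win) / win, mode="same")
    return np.sqrt(smoothed)

env = envelope(emg, fs)
active = env > 0.05                          # threshold above the noise floor
onsets = np.flatnonzero(np.diff(active.astype(int)) == 1)
print("gesture onset(s) near t =", (onsets / fs).round(2), "s")   # ~0.8 s
```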

SPEAKER_02:

I love the subtlety of movement that you can do. Like, it doesn't have to be seen by a camera, it can be down by your side. That's really neat. Yeah. I think, of all the technology, that's the coolest to me.

SPEAKER_00:

Yeah. Well, also, it seems to me like they've been developing this sEMG band tech for five or six years, and they were just like, we have to pair it with another device to be able to sell it.

SPEAKER_02:

Right.

SPEAKER_00:

Yeah.

SPEAKER_02:

Start selling it now, not in ten years or whenever Orion comes out.

SPEAKER_00:

But from the people that have used it, from what I've heard, a lot of people are almost more impressed by the wristband than the glasses. The glasses. Though I have heard a lot of people say that they were more impressed by the glasses than they thought they were going to be, but still. And especially Heaney from UploadVR mentioned that it still feels just like a single monocular display.

SPEAKER_02:

I love how tall like makes your Markbook really small.

SPEAKER_01:

Yeah, even though well, that's like a good foot of distance higher than that. They're not that he's only about maybe 10 inches. They're not that much.

SPEAKER_00:

I mean, I will say I am very, very interested in this product conceptually. I don't think I would ever buy one. Definitely not a monocular one, but I could absolutely see myself buying the inevitable 2027 binocular refresh.

SPEAKER_02:

Yeah. The thing that's interesting, especially some of the use cases they show, like somebody walking through the park and like reading text messages. Does anybody really want that? Like I think people do.

SPEAKER_00:

But the like I think I actually am one of those people.

SPEAKER_02:

Like the walk in the park is like that's me getting away from tech and away from like text and I would absolutely use the Spotify integration. Yeah.

SPEAKER_00:

And especially if I'm in a in like a home, a place away from home, the navigational stuff. Yeah. Absolutely.

SPEAKER_02:

That's what I'm saying. There are definitely use cases that I can see that are awesome. But at the same time, it's like even me with a watch, right? I totally understand. I often take this off. I don't want to be disturbed.

SPEAKER_01:

I'm just not going to wear the... If that's the case, then just leave your glasses at home. Right. But yeah, I think this is a high-end luxury item, right? You know, if you could put your prescription in it, you can travel and you can be heads up. Like, I don't want to be looking down if I'm in a place that I'm unfamiliar with. I want to be up here and not only take it in, but just know where I'm at, so that if I have to go down the street again without my phone or glasses, I just know the way. So this seems to be the idea. If it's 5,000 nits, you're meant to be outside. Yeah.

SPEAKER_00:

Then you have the other element, an iteration of the AI capabilities of the existing Ray-Bans, right? So these have dual cameras in them. Part of the thing is that they've realized some people don't like responding to the Ray-Bans. They don't like talking to them. Yeah, that's right.

SPEAKER_02:

And they don't like I would be one of those people.

SPEAKER_00:

Uh I would definitely be one of those people. I've tried, I I think it's so awkward doing voice commands in public. I just literally don't do them. I don't really do them that much unless I'm at home, and I don't really do them that much.

SPEAKER_01:

Yeah. Okay. So period.

SPEAKER_00:

But that's a bad use case scenario. An example of a bad use case scenario in your life: if you're on the bus and you're like, hey Meta, blah blah blah, it tells everyone on the bus that I'm a tech influencer, I probably have thousands of dollars of stuff you could rob me for, and this is where I live.

SPEAKER_02:

Yeah.

SPEAKER_00:

Yeah.

SPEAKER_03:

Yeah.

SPEAKER_02:

And it's just obnoxious.

SPEAKER_00:

I mean And it's obnoxious. And I'm somebody who if I don't want to hear your stupid phone playing music at the bar. Correct.

SPEAKER_02:

At the bar, at the park, at any fucking where actually. Yeah.

SPEAKER_00:

I don't want to hear your stupid stuff. Turn the volume down, put in your buds.

SPEAKER_01:

Yes. I don't want to hear your colorful metaphors. Yeah.

SPEAKER_00:

So yeah. Anyways. The interesting thing about these glasses is that they're really the first full-color display glasses that have been available at this price point, with this functionality, and at this size. Yeah. I don't even know if there's another pair of full-color display glasses that has launched; I think all of them have been green or red, especially the stuff that you don't need a wire for. Like this is... Yeah. And these are standalone. Supposedly it's running the same chip as the existing Ray-Ban glasses, so they're not going to be particularly performant, especially with all the extra stuff they're doing. But I see a split opinion on this. Some people think these are gonna bomb. I do not think they're gonna bomb. I think they're actually gonna be a huge hit.

SPEAKER_01:

Where are they gonna put them? Where are they gonna lay them out? Like, what do you mean? Like, where is the entry point for this? Is it online? If you want to try on glasses, you gotta put them on your face. Oh yeah, yeah.

SPEAKER_00:

I mean, that is a big that is a huge problem with all video glasses, is it's it's experiential and you can't really demo it. But I think that's why you know their existing implementation of the UI.

SPEAKER_02:

Well, and also the fact that they're partners with Luxottica. So you can get them in Luxottica stores.

SPEAKER_00:

Stores. And I mean I've never set foot in a Ray-Ban store, but I actually have. Actually, one time I did, I was at Universal Studios. Yeah. And they had a Ray-Ban store. And I walked in and I was like, oh hey, that's cool. I can't.

SPEAKER_02:

I didn't even know there was a Ray-Ban store, just as a sunglasses store. Yeah, either way.

SPEAKER_01:

So if you're at the Ray-Ban store at Universal, there might be an augmented experience that Universal bakes into their store experience. Yeah.

SPEAKER_00:

Now that you mention that, during Meta Connect they actually showed a video of Disney employees walking around Disneyland interacting with, not these, but the existing Ray-Ban Meta glasses, doing real-time AI interactions, like, hey, what is this? And it shows the Matterhorn, and it was like, oh, the Matterhorn was built in 1960-something, you know.

SPEAKER_02:

It's like your own personal audio tour that you didn't.

SPEAKER_00:

Yeah, and I mean, that's pretty cool, actually. Imagine walking around the Louvre with that, you know, or the Smithsonian, and being like, hey... rather than buying this... Well, actually, they're ending the 3DS relationship. Certain museums around the world partnered with Nintendo over the last 15, 20 years, using DSes and 3DSes with pre-built-in guides. So you'd walk up with headphones plugged in, walk up to the thing, scan the QR code, and it would take you to the relevant audio.

SPEAKER_01:

I didn't know that Nintendo had that relationship with... Well, they're ending it. Yeah, yeah. I read something about it a couple of days ago. This is the new version of that as well. A lot of museums do this thing where they have a walkie-talkie-style player and a headset, and then you basically walk around the museum.

SPEAKER_00:

Well, and if you had a new version of the glasses, you could do live translation too. So like if you went up to like a placard at you know a museum in Japan, you could just scan the placard and it would do a live translation in front of you in real time, which is I think these type of accessibility elements and these types of like these are the things that I think a lot of people will see as the the aspirational tech and like the the big not evil tech, you know?

SPEAKER_01:

Sure. And aspirational... I think it should be baseline when it comes to accessibility. But it is a little pricey at $799.

SPEAKER_00:

So yeah, but I mean there are some like luxury glasses that are like three times that cost.

SPEAKER_02:

Yeah, it's it's gonna find a market, I think, no problem.

SPEAKER_00:

Yeah, I mean, it's less than a smartphone, like a high-end smartphone. So yes, it is expensive. It's more expensive than the Quest 3, but a certain type of person has already been buying these types of products at half the price. So if it's got a display built in, double the price doesn't seem outside the realm. Yeah, let's get on the plane. Not just that, but I'm guessing there's gonna be a small use case for consumer-facing enterprise solutions, like the Disney one, where you could buy this $800 device rather than buying a $10,000 device for your enterprise solution with XYZ capability in it; this stupid dumb thing can solve what those 50-grand high-end solutions do, versus, you know... Right. I can't think of anything immediately off the top of my head, but I know those use cases exist. So that's cool. They were even saying that the sEMG wristbands eventually will have the ability to work as a keyboard on flat surfaces.

SPEAKER_02:

Yeah, I saw them demo like a little one-handed typing or two hands.

SPEAKER_00:

Uh well.

SPEAKER_02:

Oh, they probably want to. There's no reason why you couldn't have two of them.

SPEAKER_00:

Yeah. They probably want to do an upsell on a second one. Supposedly the wristband will eventually be usable with existing VR devices.

SPEAKER_02:

That's what I'm I'm excited to have that with a quest. That would be awesome.

SPEAKER_00:

And then next. So we sort of touched on visionOS 26 on our last episode, but it's actually out now. And there's a bunch of features that were not necessarily announced in the beta that are actually out now. So we talked about this before, but PSVR2 Sense controller support finally came. That's big. That means developers will be able to make full-fat VR games. And it seems like it's built into the SDK. And honestly, having played around with this in Unity myself, if there's an SDK for it, getting controllers working in most VR games at this point, that's like half of what the devs do. Yeah. So it shouldn't be hard; it's almost a one-click integration at this point. I don't know what the differences are with PolySpatial because I haven't used PolySpatial, but I'm guessing it's not difficult. They've also got the Logitech stylus, like we were talking about a couple months ago. I still haven't used one of these, but I think conceptually they're just so cool. Yeah, so there's support for that finally.

SPEAKER_02:

Yeah, persistent widgets, right? These are the widgets that can stay even when you these are cool, yeah.

SPEAKER_00:

The persistent... and they stay there. So actually, the new Quest home environment has something very similar to this. Your home environment can have persistent widgets; like, I put Spotify on the wall, and then it just lived there.

SPEAKER_03:

Right.

SPEAKER_00:

But not just that. I use Qobuz, the high-res music streaming service, and I was able to do the same thing with its web player, just the web app. I was able to just pin it. So any web app that can run in the Quest 3 browser, you can pin wherever you want and have it as a persistent widget.

SPEAKER_02:

So I haven't gotten that update yet.

SPEAKER_00:

You haven't got that update yet?

SPEAKER_02:

It's still like on like three or something like that.

SPEAKER_00:

Well, the persistent widget stuff is actually fun. It felt like I had a boombox in the corner. Nice. Anyway, sort of gimmicky, but yeah. Next are the improved Personas. I actually saw a clip of this earlier, and they're so much better than they used to be. Yeah. Before it was blotchy, like you had a soft focus on your face, and now it looks like somebody removed the soft-focus filter.

SPEAKER_02:

Yeah, there it is. Big improvement. It's so much better. Still slightly on the uncanny valley side, but so much better than it used to be.

SPEAKER_00:

It feels usable now. It's usable. It felt like a gimmick before.

SPEAKER_02:

Yeah. It looks real but overly AI, like you took a filter and overdid it a little bit. So it's pretty good.

SPEAKER_01:

I imagine that if you knew the person you were looking at, the uncanny valley might stand out a little bit. Oh, for sure.

SPEAKER_02:

Yes. 100%. Yeah.

SPEAKER_01:

And that's that's the goal here. Yeah, but how many people do I know with Apple Vision Pros, am I right? Exactly.

SPEAKER_02:

What is it? I am not a rich man.

SPEAKER_00:

John Krasinski and his office. Literally all the people I know who have Apple Vision Pros, I have not met in real life.

SPEAKER_01:

Yeah. But soon you could feel like you are meeting them if you get on this new spatial tech. Look at this. Look at this. Is this a real building here?

SPEAKER_00:

Or is it like Yeah, that's the Apple headquarters. It is, right? Yeah, it is. Yeah. But that's the spaceship.

SPEAKER_01:

Yeah. Right. That's the Death Star.

SPEAKER_00:

So other than that, they also have local SharePlay, which means multiple users can huddle around the same virtual environment. And I think it's baked in at the OS level, so if you pulled out a 3D model, everybody could look at it. It's like 3D AirDrop. There's been a similar implementation on Quest devices for quite some time, but it's finally on the Apple Vision Pro. They also finally introduced 90 Hz hand tracking, which developers had been begging for since the beginning, because it was locked at 30 Hz prior to that. That's huge. The hand tracking on Apple Vision Pro was good. What? Nothing.

SPEAKER_01:

This is good because it means you can do things like play drums. It's a lot closer to just reading your hands and fingers, and you can do fine, detailed stuff.

SPEAKER_00:

The hand tracking on the Apple Vision Pro was good, but the hand tracking on Quest was dramatically better and has been for a long time. Now it's to the point where you could actually build even a gaming experience around it. Yes. Where before, 30 Hz was not enough.

SPEAKER_01:

It's better for surgery-type stuff. Better for fine operational work.
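For context on what developers actually consume here: on visionOS, hand data comes through ARKit's HandTrackingProvider, and the update rate (now reportedly 90 Hz) is decided by the system, not the app. A minimal sketch of reading one joint, leaving out the authorization prompt and the immersive-space setup a real app also needs:

```swift
import ARKit
import simd

// Minimal sketch: stream hand anchors and print the index fingertip position.
// Must run while the app has an open immersive space and hand-tracking permission.
func trackHands() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()

    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked,
              let indexTip = anchor.handSkeleton?.joint(.indexFingerTip)
        else { continue }

        // World-space transform = hand anchor transform * joint-in-hand transform.
        let transform = anchor.originFromAnchorTransform * indexTip.anchorFromJointTransform
        print("\(anchor.chirality) index tip at \(transform.columns.3)")
    }
}
```

The 90 Hz versus 30 Hz difference shows up purely in how often that loop receives updates, which is what makes things like drumming or fine manipulation feel plausible.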

SPEAKER_00:

They also introduced a thing called spatial scenes, a new generative AI feature that lets you create multi-perspective volumetric scenes from a single 2D photo.

SPEAKER_01:

So I can take a picture, upload it, and then it'll be like, this is my house.

SPEAKER_02:

Yeah. I think what they're doing here is using LiDAR information along with that single photo.

SPEAKER_00:

Because all the iPhones have LiDAR built in now, right?

SPEAKER_02:

Not all, uh, just the Pros.

SPEAKER_00:

The Pros do, okay. Yeah. That's cool. I mean, it's a good idea.

SPEAKER_02:

I will say, they actually show it here. Like when you pin a photo, this is what they're talking about.

SPEAKER_00:

Yeah.

SPEAKER_02:

So you get some of this 3D depth information.

SPEAKER_00:

So it gives you a slight, fake parallax. So that when you have it... exactly.

SPEAKER_02:

When you have it on your wall as a widget and you move around and look at it, there's some depth to it. So that's cool.

SPEAKER_00:

That's pretty cool. I do think this sort of perceptual reconstruction of 2D elements into 3D is the direction all XR content is going, because having a depth map that matches the way the human brain perceives 3D dramatically increases the effectiveness of the 3D and diminishes sim sickness and eye strain with 3D content.

SPEAKER_01:

Okay. So, projecting out three to four years, this technology is going to make this stuff a little easier to put on. Like, less sickness.

SPEAKER_00:

It'll make it easier to distribute and easier to watch long term, because rather than having a baked-in stereoscopic 3D effect, it'll be dynamic.

SPEAKER_01:

Okay. But like the difference between taking a picture and then seeing what it might look like. What would the Mona Lisa look like?

SPEAKER_00:

These things are always gonna have uh artifacts, right?

SPEAKER_01:

Well, I know, but that's just the first part of it. It's making it easier for people to get in without feeling sim sickness, right? Yeah. That's the idea. And the entry barrier point isn't that they won't feel it at all, it's that it'll decrease it. Sure. But they get past go. Yeah.

SPEAKER_00:

Well, because what it's doing is creating a full mesh, so your eyes can adjust dynamically to where they're focusing rather than having a fixed focus.
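To put a number on that comfort argument: the screen parallax for a point at distance Z from the viewer, with eye separation b and the convergence (display) plane at distance D, is roughly

```latex
% Back-of-the-envelope stereo parallax on the display plane.
% P = 0 on the screen plane, P -> b at infinity, P < 0 in front of the screen.
P \;=\; b\,\frac{Z - D}{Z} \;=\; b\left(1 - \frac{D}{Z}\right)
```

With a baked-in stereo pair, that parallax is frozen at capture time for one assumed viewing geometry; with a per-pixel depth map or mesh, the renderer can recompute and clamp P every frame for the actual viewer, which is the mechanism behind the eye-strain and sim-sickness claim. (This is a back-of-the-envelope model on our part, not Apple's published math.)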

SPEAKER_01:

Yeah. And this is some real art school stuff we're talking about here.

SPEAKER_00:

And it's important. I was sort of dismissive at first of Apple's implementation of the Apple Immersive Video format and their other spatial formats, but long term I have absolutely come around to their way of thinking, because traditional 3D video has been quite limiting.

SPEAKER_01:

So Anthony, it's also gigantic.

SPEAKER_00:

It still is. Yeah.

SPEAKER_01:

Yeah. So Anthony, have you ever been into a museum and gone, that Italian woman is definitely looking at me no matter where I go? To the left, to the right, she's definitely looking at me. And then you realize she's looking at everybody, because she is a painting.

SPEAKER_00:

Made hundreds of years ago. I would imagine she's not looking at anyone, because she's a painting and not a human person. So you see where I'm coming from? The entry point. They also finally, and this is a long time coming, got native support for 180 and 360 degree content.

SPEAKER_02:

Yeah. Explain the difference here, because obviously we had this content available on the headset before. So what do you mean? Is this just, like, embedded in websites, or...

SPEAKER_00:

Before, if you wanted to watch our videos on an Apple Vision Pro, you had to have a specific app or some hacked-together workaround to watch them natively on the device.

SPEAKER_02:

So there's no YouTube VR on the AVP.

SPEAKER_00:

No.

SPEAKER_02:

Right, okay.

SPEAKER_00:

And the only exceptions were people who created their own apps for their own content. To fake it. Yeah, yeah.

SPEAKER_01:

And now you can, natively. Wait, wait, we were getting faked on the Apple Vision Pro? You couldn't use an official YouTube app and just watch? Whoa, whoa, whoa. This is huge. I'm still not sure that you can. But we have an audience that wanted to wrap our content like that, that built... oh, I don't know whether or not people actually did this for us.

SPEAKER_02:

Yeah, no, they definitely didn't.

SPEAKER_00:

Well, let's pretend they did. Creators were begging for this for a long time, asking how this was not implemented. Because Apple wanted to push their proprietary format, that's why. And Apple Immersive Video is an interesting format. In some ways I don't think it goes far enough; in other ways it's generally pretty cool. Like, stereo delta is a good way to encode. But a lot of people are going to continue to shoot in traditional 180 and 360 formats for a long time, in perpetuity. It's not going anywhere.
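For what the native support looks like in practice: playback still goes through the standard system player, and the projection is supposed to come from metadata in the media file rather than from app code. A minimal sketch, assuming the clip is tagged the way visionOS expects (an untagged file would just play back flat); the URL is a placeholder:

```swift
import AVKit
import SwiftUI

// Minimal sketch: hand a 180/360 clip to the system player and let the OS
// handle projection. Assumes the file carries the projection metadata
// visionOS 26 expects; we haven't verified coverage across formats.
struct ImmersivePlayerView: UIViewControllerRepresentable {
    let url: URL  // placeholder: a locally stored VR180/360 clip

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = AVPlayer(url: url)
        controller.player?.play()
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
}
```

The point is there's no custom sphere geometry or shader involved anymore; before this, that wrapper app was exactly what creators had to build themselves.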

SPEAKER_02:

I mean, there's no affordable way to do otherwise, right?

SPEAKER_00:

Yeah. Not currently.

SPEAKER_02:

Not that you could even call this affordable, really.

SPEAKER_00:

Yeah.

SPEAKER_02:

Not to the average person, anyway.

SPEAKER_00:

It also turns out they finally instituted macOS spatial rendering. This is effectively PC VR: you can link your headset to a more performant, or possibly more performant, Mac to push XR experiences to the device. I don't really know if anything built currently can take advantage of that, because Steam barely even runs on Macs anymore. Yeah. Unless it's virtualizing Windows or something. Yeah.

SPEAKER_02:

I'm not sure now.

SPEAKER_01:

I even tried to download Cyberpunk 2077 onto my Mac using the Steam version. And I got it to install on a separate hard drive, but I couldn't get it to run.

SPEAKER_00:

Yeah, exactly. So we'll see down the line what happens with it. I'm almost certain this is mostly for enterprise creations, people who have super ultra-high-res models in, you know, Blender and Nuke and stuff.

SPEAKER_01:

I've got a picture of the Enterprise-D. Don't you want to experience what it's like up close?

SPEAKER_00:

Yeah, and their dev environment is Macs instead of Windows or something. That's what I think it'll most likely be used for, for a while, because there are almost zero PC VR gaming experiences on Mac.

SPEAKER_01:

So yeah, we haven't quite reached that point in the timeline.

SPEAKER_00:

All right. Also, iPhone integration, which I think is pretty cool. You can unlock your phone and answer its calls from within the headset. I'm surprised they haven't had that since the beginning. To me that's late. And also, it knows what your phone looks like. Why couldn't it simulate your phone? Right. Yeah.

SPEAKER_02:

I'm assuming it's the unlock part, I think. That's the key. Right.

SPEAKER_03:

Oh, yeah, the encryption face scan. Yeah.

SPEAKER_02:

Right. Because you're wearing a headset. They're probably doing a trusted-device thing, like, I know this is your Vision Pro or whatever.

SPEAKER_00:

Well, they also scan your eyes, so... right.

SPEAKER_02:

But I'm saying they tied that together instead of just relying on the scanner on the phone itself.

SPEAKER_00:

Oh, you ended up getting the new iPhone? Oh yeah.

SPEAKER_02:

I went orange. So why not?

SPEAKER_00:

Also, game controller passthrough. A new visual feature shows your real hands and arms holding physical gamepads. That's pretty cool.

SPEAKER_01:

Well, what if my gamepad is, like, zero good? Yeah, what if I want it to be this big, you know, or this big?

SPEAKER_00:

I don't think that's what this is about. And then there's some other cool stuff. Oh, this is a big one: developers gain the ability to use Apple's on-device large language models within their own visionOS apps. This seems interesting. So if, say, you built an app and wanted it to run some LLM instructions on the device, there's now an SDK to be able to do that.
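Based on what Apple has shown with its Foundation Models framework, the app-side call is small; this sketch assumes the same API is exposed to visionOS apps as on Apple's other platforms, which we haven't tested on-device:

```swift
import FoundationModels

// Minimal sketch: prompt the on-device model from app code.
// Assumption: FoundationModels is available to visionOS apps the same way
// Apple has demonstrated elsewhere; the session runs locally on the device.
func summarizeSceneNotes(_ notes: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the user's scene notes in one sentence."
    )
    let response = try await session.respond(to: notes)
    return response.content
}
```

Where that actually executes (Neural Engine versus GPU or CPU) is the system's call, which is relevant to the NPU question that comes up next.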

SPEAKER_02:

But you mean like just passing through commands basically? Yeah.

SPEAKER_00:

Leveraging the NPU on the device rather than using like the GPU or the CPU.

SPEAKER_01:

Why would we need to use the neural processor in order to build anything? Why wouldn't the app already have access?

SPEAKER_00:

Say you wanted to brute-force some really heavy math that the GPU isn't optimized for.

SPEAKER_01:

You were Robert Downey Jr.

SPEAKER_00:

And you needed to figure out how to time travel. So, NPUs are highly optimized for running certain kinds of math quickly at low power. GPUs aren't designed to run it the same way.

SPEAKER_02:

It's sort of like the difference between a CPU and a GPU.

SPEAKER_01:

It's like the difference between charcoal and graphite. Graphite is really more specialized.

SPEAKER_02:

It's specialized for this specific type of calculation. Right.

SPEAKER_00:

And generally, there are certain tasks that are much better done on the NPU than on the other processing units, especially in real time. That's the biggest element, because a lot of the AI features that need to happen quickly go through the NPU.

SPEAKER_01:

Is this a collaborative use case? I mean, is this best for collaboration or for solo work? Because that, to me, is interesting. What do you mean? Imagine you needed to use the LLM in a group setting to figure out a problem that you all have shared expertise on. Is that a thing that could be... That's not really what this is. It's a good idea, though.

SPEAKER_02:

Because they did specifically talk about the LLM and not the NPU or the neural engine. So maybe there is a way you could collaborate: you've got a few people with AVPs somewhere, you're collaborating on something, and you might be able to tap into a shared LLM that's networked somehow. I mean, it's unclear what they mean by this.

SPEAKER_00:

Well, because if you're using the LLM, then you're gaining access to the NPU.

SPEAKER_02:

Right. But it could mean both, it could mean one or the other.

SPEAKER_00:

They didn't say NPU though, actually. Right. Which may just mean developers get access to the LLM implementation. Exactly.

SPEAKER_02:

So again, we don't really know what this means yet. Either way, it's interesting, and anything that extends functionality is good. There's also a new environment.

SPEAKER_00:

It's Jupiter, it's big, it's cool. You saw it earlier. And then look-to-scroll. For accessibility, you just look at the bottom of the window and it scrolls. Oh my god. That's huge, and that's what we all said. That's it.

SPEAKER_02:

That's awesome.

SPEAKER_00:

Yeah. And then team device sharing. Tell us more about what you think this is.

SPEAKER_02:

Because it specifically says enterprise users, right? They used the word enterprise. I haven't looked into this at all, but my speculation is that you need something like an Apple server or a managed enterprise environment, because it specifically said teams, too. So I think you have to have infrastructure in place to unlock this particular functionality. They still really, really want you to buy four AVPs, right?

SPEAKER_00:

Like one for everybody. So it's like a LAN party that makes them lots of money.

SPEAKER_02:

Yeah, yeah, exactly. It's like if we had an environment where we're running all macOS stuff and we have our own server. I don't think they would have used the words teams and enterprise if it wasn't something like that.

SPEAKER_00:

Yeah, so all this stuff is ready to go out of the box now, for the most part. September generally seems like the biggest news month for XR stuff, and this was a pretty big one. The Samsung headset is gonna make or break the next couple of years, I think. Yeah. But there might be another one. Well, we already know. Another, uh, slide? It's the Steam Frame.

SPEAKER_02:

I was like, there's no slide for that.

SPEAKER_00:

We didn't cover the Steam Frame leak because the implication was that the headset was gonna launch, but then it didn't. So yeah.

SPEAKER_02:

Heavy rumors that didn't turn out.

SPEAKER_00:

So I guess we just gotta keep waiting.

SPEAKER_02:

Next episode, maybe?

SPEAKER_00:

Yeah, I mean, we can talk a little bit about it. Steam Frame is supposedly the name of the headset, and all the leaks that have come out say it's almost certainly a standalone headset and not a split-rendering device like was thought before, and that split rendering is literally just going to come from the next iteration of the Steam boxes. The device itself is going to be standalone.

SPEAKER_01:

It's like what Xreal is doing with the Beam, and what Meta is doing with the onboard pocket battery and all that. The Steam Deck is gonna serve that function for split rendering. People who have Steam are already in that ecosystem, so effectively you have a standalone headset, but if you own a Steam box, you'll be able to use it with that too.

SPEAKER_00:

Yeah, and they even filed a patent for their wireless streaming dongle, which uses Wi-Fi 6E. So that's cool. The Steam Frame looks like it's closer, but who the hell knows when it's gonna come out. Slow and steady. Last thing I want to say is I picked up the Viture Pro XR a couple months ago on sale, sort of because I was getting FOMO over the Xreal glasses that were released, and these were a cheaper alternative. But here's the thing: I love these things, and they've actually made the use case for some things a lot easier.

SPEAKER_02:

Yes. I was very skeptical of video glasses, and then I used these, especially with the iPhone. You plug them right in, you go into Spatialify or whatever, and you can directly watch your spatial videos without any conversion or anything like that. They just play back perfectly in the glasses, which is very convenient. You don't have to wait to upload to a headset or any of that nonsense.

SPEAKER_00:

We also got them to work with, and this is a sneak peek for our upcoming video on the Q3U, the QooCam 3 Ultra VR180 mod. Spoiler alert: we loved it. It's a great camera, you should just get it. But we were able to get the glasses working with it, and I think we may have been the first people outside of Shiang to get it working. The implementation is that you plug the glasses directly into your QooCam 3 Ultra and get a live preview in stereo, and it works flawlessly.

SPEAKER_02:

I wouldn't say flawlessly. I mean it's pretty close.

SPEAKER_00:

It works well, it works really well. It's just that the resolution is a little low.

SPEAKER_02:

And it's not just that. You kind of have to cross your eyes to get a good feel for what you're looking at, whereas if you're looking at other stuff that's formatted correctly for the glasses, it's comfortable. With this, you do get a live preview, like a box: the little preview that's on the QooCam's display is replicated in each eye. So you just kind of have to cross your eyes and then you get a good 3D preview.

SPEAKER_00:

There are a lot of really great stereo uses for these glasses. My biggest problem, though, is that the Android implementation of the SpaceWalker app sucks. You need to make it better. It's borderline unusable currently.

SPEAKER_02:

Pleasantly surprised on iOS.

SPEAKER_00:

Yeah. But generally, the actual usage of the glasses is great. For example, I plugged them into my Logitech G Cloud last night, which is a little Steam Deck-like streaming device, and I played Alan Wake's American Nightmare on GeForce Now through my glasses, through my Logitech G Cloud, streaming over the internet with barely perceptible lag, and it felt like I had a TV this big right in front of me. It worked perfectly, and I played for about three hours in bed last night.

SPEAKER_02:

Yeah, I'm definitely impressed with the resolution and the clarity and the brightness of the display.

SPEAKER_00:

Yeah. So, Viture, if you want to send us any other devices, like the new Luma or anything, to test out, we'd love to do that. We'd even make a video about it if you wanted. That's true. And this is not paid; it's just because I like the device and I'm impressed by them. So yeah.

SPEAKER_01:

I like how you can adjust the diopters, and that there's a built-in focus adjustment here. I mean, that's good.

SPEAKER_00:

It's useful up to a point for me, because I have astigmatism, so I had to get these inserts. I got them from HonsVR, and I have to call out HonsVR here. This is actually the second version of these inserts I got. The first ones weren't good; they did not work. However, I contacted them and said, hey, this isn't working for me, and they got me a replacement in less than a week. Nice. They were in my hands less than a week later, and they were perfect. So shout out to HonsVR's support team for rectifying that and completely blowing me away. That was cool. Anyway, thank you for paying attention to us after we, you know, didn't show up for a couple of months; we've been incredibly busy. Thanks a lot. See you around.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.


Voices of VR

Kent Bye