STEREOSCOPE
Welcome to the STEREOSCOPE Podcast, the place where we dive deep into everything immersive video. From VR180 and 3D 360 to spatial video, volumetric capture, and photogrammetry, we cover it all. Our show is dedicated to covering the latest news, best practices, and workflows that are essential to the immersive video community. The VR industry has been a major force behind the rapid growth of this medium, and we are excited to showcase how it impacts immersive video. Every episode, we feature two videos created by our talented community members to inspire and showcase the amazing work being done in this space. Join us in the next phase of cinema as we gaze through the STEREOSCOPE.
Blackmagic 12K URSA Immersive, Gracia Volumetric Capture, Logitech MX Ink for Meta Quest, and Apple TV+ Immersive Content Rollout
Curious about how the latest tech advances are shaping the future of VR storytelling and immersive video? Tune in to the sixth iteration of the Stereoscope podcast, where we eschew our usual studio setting to embrace the fresh air of Portland. We'll explore Apple's groundbreaking announcements from WWDC, engage with the Gracia volumetric capture platform, and share our experiences using the Luma AI app's Gaussian splatting technology. With industry players like DeoVR showing interest, we'll also ponder the potential and challenges of volumetric capture in various media forms.
Ever wondered what Apple's next move in the VR market might be? This episode takes a deep dive into the next iteration of Apple Vision Pro (AVP), including potential design tweaks and supply chain hurdles. We also spotlight Apple TV+'s foray into immersive video content, featuring a highly anticipated documentary series and a significant fictional project directed by Oscar-winner Edward Berger. These advancements signal a pivotal moment for scripted VR content and set the stage for a new era of immersive storytelling.
Looking to stay ahead in immersive video production? We cover the latest from Blackmagic, highlighting their soon-to-be-released 12K Ursa Immersive camera and its seamless integration with DaVinci Resolve. From tackling the evolving landscape of high-resolution cameras to discussing Apple's strategic ties with Vimeo for professional content distribution, this episode is packed with insights. We also share a few crumbs about our ambitious new project. Join us on this journey and support our vision for the Stereoscope podcast as your go-to resource for cutting-edge immersive filmmaking insights.
Hey, welcome to the sixth iteration of the Stereoscope podcast. Obviously, we are out of the dungeon. Yes, into nature.
Speaker 2:The weather has improved here in Portland, so we're trying to do a change of venue a little bit, because, you know, the camera's mobile, we're mobile.
Speaker 1:Yeah, we can walk around, we can take things outside. Got some diffusion going on up there and generally it's been a couple months.
Speaker 2:A couple months.
Speaker 1:We had some projects we needed to get under our belts. You had some work, and then we've got some stuff cooking that took up quite a bit of our time recently. We'll touch on that a little bit towards the end, but we're not going to go into too much detail. All right, so it seems like there has been a lot of news, especially since we haven't done a podcast in a while, and a lot of it just in the last couple of weeks. The big stuff for me, anyway. We're going to touch on the WWDC announcements from Apple, but there is also some stuff that happened recently with the announcement of this new Gracia volumetric capture platform. The timing is very interesting, because there's this app called Luma AI, and it's effectively a Gaussian splat recording system. Just last week I heard about it and I was like, hmm, let's see.
Speaker 1:So I got my phone out and I took a little video of my Quest 3, and I got a Gaussian splat of effectively my kitchen and my living room with the Quest 3 in it. And, holy hell, it was a lot easier than I thought it was going to be. I mean, it took a while to render, but it was really conceptually interesting.
Speaker 2:Impressive results for not using LiDAR or anything, just pure image capture and layering and processing.
Speaker 1:And the other funny thing was, I'd watched a Corridor Crew episode specifically on it; that's how I found out about the Luma AI app. Corridor Crew are a group of VFX YouTubers. They're all industry people; they've worked on big stuff and small stuff. Their whole episode was about Gaussian splatting and how the technology is really coming around. That being said, this new platform comes out and they're offering an entire Gaussian splatting volumetric capture platform for watching this stuff back, and so it kicked off a discussion with us about what its use will be and how relevant it is for the industry, because I do know that there are going to be some major players in the industry, specifically DeoVR, that are interested in volumetric playback as a platform. What do you think?
Speaker 2:Yeah, I mean, for DeoVR I'd say it's more than an interest. They really just invested heavily in volumetric capture.
Speaker 2:It's hard to say about the broader industry, because it's a larger industry. We tend to focus on what we do, which is VR video and telling stories in VR video, and in that way I don't see a lot of application to what we do for volumetric capture. But I think there is a place for it, for sure.
Speaker 1:I mean, I got into a Reddit thread and was talking to some people, and you know, I have some skepticism towards volumetric capture. It's interesting, and I think it will be useful for VR specifically. I mean, there was that game that came out, The 7th Guest VR. It's a remake of a legendary horror point-and-click adventure game from the 1990s, and the game studio used volumetric capture for all the performances. For games, that's very interesting to me, because it means that you can capture live performances and then put those performances into a game where people can walk around them. Now, from a storytelling perspective, that's interesting for games. For passive media? Congratulations, you've just invented live theater.
Speaker 2:Yeah.
Speaker 1:And I think the main use case will be porn, of course.
Speaker 2:I can only imagine, you're going to run into some really awkward occlusion. And again, I don't know, I mean, all that thrusting, there's a lot of overlap, slapping, just a lot of overlap. So yeah, and even there, is it that much? Part of me wonders, and I'll probably eat my words at some point, but it feels like the 3D TV of the industry, in the sense that it's neat, and then it's just a thing you don't use again, specifically in terms of video capture for passive experiences.
Speaker 1:I think, yeah, for passive?
Speaker 2:For passive experiences, I think, yeah, because it's only going to be interesting so many times to walk around somebody doing a dance number or whatever in your living room. Beyond that, how many people have that much space to do it anyway?
Speaker 1:Yeah, I think in terms of an interactive medium, there definitely are some conceptually interesting use cases there, but we'll have to see. I also don't want to be one of those people who completely dismiss something just because they don't understand its usefulness, because we know some people that have that same sort of mindset, and it's quite limiting.
Speaker 2:Yeah, and frustrating. So maybe we just can't see the future, and I'm old. But so far, proper volumetric capture requires a lot of light and cameras and green screen. You can't just set that up and say, oh, let's capture a play, you know, speaking of theater.
Speaker 1:Yeah, in volumetric, that's definitely not happening anytime soon anyway, especially for, like, mood setting and that type of thing. You can't; it requires a massive amount of light, plus a lot of green. I'm sure there will be some people who figure out some of these limitations that we are possibly just not seeing. Maybe you can do tracked LiDAR, where you can just track the moving subjects and filter everything else out.
Speaker 2:Yeah. I also didn't see that with Gaussian splatting.
Speaker 1:The way that a lot of this works, especially with Luma AI, is that the mesh itself doesn't have any lighting data. So, interestingly, you can't relight it, because it's baked in.
Speaker 2:The lighting is baked into the texture, right. Any shadow that's captured is permanent.
Speaker 1:You can relight the mesh, but the texture has all the lighting data baked in. I'm sure somebody will figure out a way to...
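The baked-lighting limitation described above can be shown with a toy sketch (illustrative numbers only, not from the episode): a capture only stores the product of surface color and shading per pixel, so two different scenes can produce identical textures, which is why relighting afterwards is ill-posed.

```python
# Toy illustration of "baked" lighting: a captured texture stores only
# albedo * shading per pixel, so the two factors cannot be recovered later.
def captured(albedo, shading):
    # One "pixel" per list entry; values are made-up illustrative numbers.
    return [round(a * s, 6) for a, s in zip(albedo, shading)]

# Scene A: darker surface, brighter light, one shadowed pixel
tex_a = captured([0.4, 0.4, 0.4], [1.0, 1.0, 0.5])
# Scene B: brighter surface, dimmer light, different shadow pattern
tex_b = captured([0.8, 0.8, 0.4], [0.5, 0.5, 0.5])

print(tex_a == tex_b)  # True: identical textures from different scenes
```

Since both scenes bake down to the same texture, no tool can tell which shadows were in the lighting and which were in the surface, matching the point that any captured shadow is permanent.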
Speaker 2:Well, maybe not. I mean, you'd need LiDAR; that's probably the only way to do it. Without LiDAR, you're basically using the shadows to capture the depth.
Speaker 1:Okay, so the next announcement was the Logitech MX Ink.
Speaker 1:It's a stylus designed for Meta Quest headsets that supports both 2D and 3D creativity; it's for creating content in 3D. There are two places where I think this is heavily relevant and has already been alluded to via other projects. The first is the Sony creativity-focused headset that was announced a couple of months ago, and that one included a stylus.
Speaker 1:That was effectively the same sort of concept as this Logitech MX Ink. It's tracked in 3D space, so you can create models, you can trace existing 3D objects, or you can use it as a 2D input method in a 3D space, if you want to write on a virtual canvas or a piece of paper or something like that. It's got buttons, it's tracked, and it has pressure sensitivity, which is super important. It's also supported at a system level by Meta, and that is a really big one. It's one of the things we've said about Apple: they hint at a lot of these creativity use cases, the Apple Vision Pro as a production platform, where Meta hasn't really had any of that same functionality. However, I will say that Meta did sort of allude to something like this previously with the Quest Pro controllers that had a stylus tip, a recently deprecated and removed feature.
Speaker 1:But yes, exactly. They were sort of implying that they want people to be able to draw in a 3D space. And here's the thing: you can already do most of what this device does just with your controller. But it's clunky, it doesn't have the same design aesthetic, and it doesn't have the same, I think, sort of use case. I want to say this is going to be more precise.
Speaker 2:It looks easier to handle, too. I like playing with Gravity Sketch, sort of holding the button and doing the whole thing. It works, but it's not great.
Speaker 1:So this is actually pretty interesting to me, and we've even discussed, for our upcoming project, the viability of doing effectively 3D spatial storyboards, and a device like this would make that a lot more approachable. You could still do it with controllers, but this seems like a much...
Speaker 2:I wouldn't even attempt it with controllers. Granted, I guess, now that I know about this, I wouldn't attempt it with controllers.
Speaker 1:And you can already pre-order it. It's 130 bucks and it launches in September.
Speaker 2:Yeah, I think there's like a 20% off coupon. That's cool. I'll probably pick this up, but I'll wait for reviews; I'm not going to pre-order it. I'm hopeful, though, because it pairs as, like, a third controller. You don't have to unpair anything, it just works.
Speaker 1:So I actually think I would like this on airplanes, instead of having a controller, just as an interface. It doesn't look as goofy, it feels lighter, and you don't have the strap.
Speaker 1:That's a good point. So, I actually really like controllers; I think they offer a level of input that hand tracking still can't match. Hand tracking is still fairly clunky, and it requires light, which a lot of times is not available on an airplane. Something like a stylus is sort of an in-between that I think still works really well. And you don't look as goofy.
Speaker 2:Yeah, and there's my silly use case scenario: I'm often obviously consuming video on the headset, so I don't need all the functionality of the controller. And if my cat is on my lap and I have the controller, that string, he can't help himself, so I remove the string. I know, but then I need it later.
Speaker 1:Right. I'm always afraid I'm going to lose it.
Speaker 1:So, you know, that is a $110 fix for that. It's goofy, I'll admit, but it's just one of many use cases. All right, now we're going to launch into a lot of Apple. This is effectively the WWDC part of the show, and there's a lot to go over; we sort of split it up into three parts. This specific part wasn't announced at WWDC, but UploadVR reported on it a couple of days ago, and it's hugely relevant: there are rumors, which UploadVR has been able to confirm with their own sources, that they are deprecating the Apple Vision Pro. I wouldn't use that word.
Speaker 2:Well, not deprecating; they're prioritizing development of a lighter version of the Apple Vision Pro.
Speaker 1:Over an AVP 2, yeah. Which I don't think is as shocking as it seems; it's 100% exactly what I expected. It's also what I expected, I will say. It means that this was always a dev kit.
Speaker 2:Which is what they said it was.
Speaker 1:Well, they didn't say it was a dev kit.
Speaker 2:Oh really? Everybody else said it was a dev kit. Except for Apple, yeah.
Speaker 1:So yeah, and the new headset is going to be cheaper. I'm guessing it's going to be, like, $1,250 or $1,500.
Speaker 2:Yeah, they said it'll cost about the same as a high-end iPhone, which is up to $1,600.
Speaker 1:So yeah, and I'm guessing they'll come in with a lower SKU at, like I said, $1,250 or so, but then you can spec that up to $2,000, you know, for 8 gigabytes of RAM.
Speaker 2:This doesn't mean the AVP is dead by any means. It just means their next move is down market before they go back up market. Yeah.
Speaker 1:That's all that means. They put it on hold; they didn't kill it. I think it's most likely that it's going to be a very long time before we see the Apple Vision Pro 2.
Speaker 2:Yeah, because this one is rumored for the end of 2025. But I think that's fine. I think they did what they needed to do with this, and they're going to go down market, and it sounds like maybe with minimal cuts. Hopefully they make it lighter, with less glass and metal and stuff. But it sounds like they're not going to go with lower-end screens; they're just trying to up the supply of them to make them cheaper. And then probably losing the screen on the front.
Speaker 1:And a lot of that is those screens they've developed. They've run into so many supply chain issues with them, because they're incredibly difficult to make, and supposedly they lost about a third of them in the production process, which is massive. That has to be incredibly frustrating. And they're not even that great; they have great colors, but that refresh rate thing kills me.
Speaker 2:Yeah, I admit.
Speaker 1:Supposedly the next technology they're working on is going to have much lower persistence because of a new screen technology, and that will effectively minimize the motion blur on the Apple Vision Pro headsets. Because right now, anything with moving your head, or at a high refresh rate, really destroys the quality. Got great blacks, though. Definitely breaks the immersion.
Speaker 2:Yeah, it's like, once you're aware of the screen, things start falling apart real quick. I mean it's like the new screen door effect.
Speaker 1:Yeah, because at this point the screen door effect has been effectively eliminated. I can very, very occasionally notice it on my Quest 3, but honestly, not really.
Speaker 2:No, you have to look, you have to look for it, whereas for me the refresh thing kind of jumped out.
Speaker 1:I actually have been sort of amazed at how good my Quest 3 looks recently, and they keep improving it, too, especially because the resolution bump from the Quest 2 to the Quest 3 wasn't really that much. But the lenses absolutely were. So next up, this was WWDC stuff, and there was a lot there. I think this part is really interesting: they're releasing a lot more Apple TV+ immersive video content, which is effectively what they should have launched with. There are currently only seven total Apple Immersive Videos, not including the trailer for all of them. The fact that they launched with only seven, and they're only between five and twenty minutes long, is sort of absurd as far as I'm concerned. But they only have so many production partners, because, and this is something we've talked about, there are only so many studios that know how to make this content at all, especially with budgets. So there is almost guaranteed a production backlog, because they can only produce so many of these things at the same time. I'm guessing the majority of that is NextVR, with maybe five teams within the one overarching NextVR team, and then maybe they're outsourcing a few other of these projects. Mind you, this is all speculation.
Speaker 1:Yeah, so the upcoming Apple Immersive Video content is another episode of Adventure, called Ice Driving. Wild Life got new episodes for elephants and sharks, and that Wild Life one was in the trailer we watched during our demo; it was by far the most impressive. Elevated got Maine and Hawaii. And then there's a Red Bull extreme sports series, which I think is going to be really cool. I've actually been thinking about doing some similar stuff, because I think skate videos in VR would be really cool, especially going down to Burnside Skatepark. That is a very dynamic location.
Speaker 1:Yeah, then a short video from The Weeknd, a new concert series, and then, hmm, Combat Ready; it doesn't really say what that is. And then the most interesting one, a new project called Submerged. It's the first scripted, I'm guessing fictional, Apple Immersive Video short, by an Oscar-winning director, Edward Berger, who directed All Quiet on the Western Front. That is the big one, because all the stuff that's come out on these platforms thus far has effectively been documentary. I mean, the dinosaur stuff is obviously not real, but it's based on science facts, so, right, still documentary. But this one is actually scripted and fictional, which is hugely relevant for us, because we're heavily interested in that filmmaking style, but it has been largely avoided by the larger community, other than a few projects by, like, Atlas V. The Faceless Lady, that type of thing.
Speaker 1:I really want to watch this when it comes out. I'm going to have to try and get my hands on an Apple Vision Pro to see it. What do you think about these projects?
Speaker 2:I mean, it's obviously great; the more content, the better. And it makes me wonder, was this always part of the strategy? Just put out a few things, enough for people to play with when they first get it? Or was the response to the content a lot stronger than they had expected, and now they're scrambling to make more of it because there's a huge demand?
Speaker 1:I'm guessing these projects take a long time to produce, especially because there's almost certainly a lot of VFX in all of them, and that stuff takes a while. So I'm guessing this was always the plan, to sort of stagger it, because they know how good these videos are. I mean, we watched them and they were really good.
Speaker 2:We obviously are fans of the format. But the comments I've seen on Facebook and places like that are people asking, where can I see more? I need more content.
Speaker 1:Yeah, I consistently see this narrative. Mind you, it's also probably self-selection bias, but I do feel like there is a very strong demand for more high-quality immersive videos. And there just aren't enough people with a budget to create this stuff, because there isn't a good place to distribute it unless you have a celebrity attached, which is frustrating as filmmakers. But we're trying to cross that Rubicon.
Speaker 2:So yeah, yeah. It's encouraging, though, because the more that these big companies are spending on content, the better for everybody.
Speaker 1:Yeah, and it's definitely not as dire as it used to be. I think right now all the major players are probably ramping up production of more content, especially when Google launches. You know, I wrote a post on LinkedIn yesterday calling out Meta and effectively saying that they should create a VR video accelerator program. If they work directly with creators to create a platform for onboarding, one that covers best practices, shooting styles, the hardware, the production pipeline, they could really benefit, because creating this stuff is very, very different from creating traditional video. Most flat filmmakers, when they first get into this space, are totally bewildered, and for good reason, because there are a lot of pitfalls and a lot of things you're not going to be able to do in the same way. Things don't transfer, especially if you're on a gimbal.
Speaker 2:Yeah, please don't do that.
Speaker 1:So this space can benefit from these things, especially because it'll show people what you can do and how to do it. For the most part, you can figure it out, you can reverse engineer it. And with that, we're going to go into what is effectively our major discussion for the video, which is also WWDC: they announced this huge bombshell.
Speaker 2:Yeah, even though it was, I think, all of maybe 30 seconds or something like that.
Speaker 1:Yeah, well, they left the actual details up to Blackmagic, which is totally fine.
Speaker 2:But yeah, as everybody already knows by now, they announced the Blackmagic URSA Immersive camera, which will come soon, along with a couple of lenses from Canon.
Speaker 1:But we'll get to that. Alongside that, they also dropped the bomb that they'll be working directly with Blackmagic to create creator tools for use in DaVinci Resolve, which is absolutely huge.
Speaker 2:Yeah, that's a big part of what I've got here: they've created a new end-to-end workflow. That's probably the biggest news to me. I mean, the camera is awesome, but the workflow matters, for us especially, because I already use DaVinci Resolve for post on these videos, using Andrew's KartaVR tools to do our mapping and whatnot. So to have that built in; and I'm also a Blackmagic user, so I'm already a big fan of Blackmagic RAW. The fact is that these lenses are calibrated, and the per-lens metadata is in the camera from the factory and is carried all the way through to the file. So you were telling me earlier that with this camera system, with DaVinci Resolve and Apple Immersive Video, you're no longer going to need to recreate the lens profile.
Speaker 1:Yes, in Resolve; that's the goal.
Speaker 2:So one of the big things that we do with KartaVR, and what you would also do in Mistika if you're a Mistika user, is aligning your disparities, right?
Speaker 2:So here's the quote from the actual announcement, because there's a lot of information: "The custom lens system is designed specifically for the URSA Cine's large format image sensor," which is the thing I'm going to get to in a minute, "with extremely accurate positional data that's read and stored at the time of manufacturing. This immersive lens data, which is mapped, calibrated and stored per eye, then travels through post-production in the Blackmagic RAW file itself." Wow. So that tells me that all the little things we have to do right now with the R5 C and the dual fisheye lens, mapping it, calibrating it, making sure I have that map file ready any time we're shooting, and sometimes having to change the map file, all of that will just be moot; we just won't have to do it. This is a big deal, because tiny little problems like this take up the majority of the time in the production pipeline.
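For listeners unfamiliar with the mapping step being replaced: tools like KartaVR apply an STMap-style remap, where a per-eye calibration map tells each output pixel which source pixel to sample. Here's a minimal sketch of the idea; the 2x2 "image" and the map values are made up, and real maps use filtered sampling rather than nearest-neighbor.

```python
# Minimal sketch of an STMap-style remap: each output pixel looks up
# normalized (u, v) source coordinates from a calibration map.
def remap(image, stmap):
    h, w = len(image), len(image[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            u, v = stmap[y][x]                 # normalized source coords in [0, 1)
            sx = min(int(u * w), w - 1)        # nearest-neighbor sample
            sy = min(int(v * h), h - 1)
            out[y][x] = image[sy][sx]
    return out

image = [[1, 2],
         [3, 4]]
# Hypothetical map that mirrors the image left-to-right
stmap = [[(0.75, 0.25), (0.25, 0.25)],
         [(0.75, 0.75), (0.25, 0.75)]]
print(remap(image, stmap))  # [[2, 1], [4, 3]]
```

What Blackmagic describes is, in effect, shipping this map per eye, calibrated at the factory and embedded in the Blackmagic RAW file, so nobody has to build or manage it by hand.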
Speaker 1:It bogs down the workflow so dramatically. I mean, I don't know why something like this doesn't already exist.
Speaker 2:Well, the demand had to be there, or somebody with deep pockets, like Apple, had to step in and convince them and make it happen, or partner with them, I mean, share the cost of it, because this is not going to be a cheap camera to build and develop. And the fact that they're doing it means they're going all in; I don't see any corners cut. It's a professional camera.
Speaker 1:This is not a prosumer device. This is significantly more expensive than the R5 C in every way, not just the camera, but the actual production capability. You're going to need a very, very mature production studio to pull off working with this.
Speaker 2:Yeah, all the way through, even just when it comes to camera support, because the camera is much larger than a DSLR or mirrorless camera, which is already quite a bit more to deal with than, say, a small Insta360 EVO or something like that, which I can fit in my pocket. So, you know, you're getting a lot for it, but it is going to put it in an area that most people won't be able to mess with. I'm here for it, though, because of the quality that is possible here. So I'm curious about this specific thing where they say "large format image sensor." With the R5 C, obviously it works well, but we're doing both images on a single 8K sensor and splitting them. Here, where's my spec sheet? It's 8,160 by 7,200 per eye, at 90 frames per second, with 16 stops of dynamic range. So not only do you have the resolution that you need, but spreading it over a larger sensor means you're using more of the lens, which means fewer optical clarity issues.
Speaker 1:You're now getting close to the, the optimal resolution for near photo, realistic wonder one yeah like what, what was described by that researcher at valve, you know, almost 15 years ago or 10 years ago now, yeah, and so we've never had anything even approaching this type of quality in this field before. I mean, it's the type of thing that I mean. I'm guessing that apple is using something similar, but they don't even have this yet yeah, I mean, it's not.
Speaker 2:Yeah, I mean, it is still in development. And the one thing that's really interesting to me is the fact that it is a custom lens, a new thing that they've built, so I'm curious where the optics came from. None of this is known yet; we just know that it's a custom-built stereoscopic fixed lens.
Speaker 1:There was a little bit of talk that they were iZugar lenses, but could be; we're not sure. That's also speculation.
Speaker 2:Yeah, none of that's known. We don't know the speed yet; frankly, we don't even know the field of view. We just know that it's designed for Apple's 180 format, so it should be at least 180 degrees, but we still don't know. So there's a few things we don't know, but what we do know is very, very encouraging. This seems to be built on the new 12K URSA that hasn't been released yet, but it's coming out soon.
Speaker 1:It's coming out very soon, and they've already announced a lot of the feature set on that one.
Speaker 2:Whereas with the URSA Immersive, you know, it's just a press release and we're trying to glean as much as we can; for the 12K we actually have a spec sheet. So it looks to be based on the 12K body at the very least. All of that's very familiar; it uses the same media module, which I think is very interesting. The camera comes with an eight-terabyte SSD memory module, which looks to be maybe a couple of M.2 drives in a RAID 0, not sure, but there are no existing cards for it; you just use the memory module.
Speaker 1:Is there a card that can record at this rate? Probably not.
Speaker 2:Yeah, we don't know for sure, because the other thing with Blackmagic RAW is you have different quality settings, like compression ratios, so there's a way you could possibly do it. But we don't know. There's a lot we don't know. It's just interesting, because the regular 12K URSA also works this way, with one less sensor and probably two-thirds of the resolution. So based on that we can glean: the 12K is a fifteen-thousand-dollar camera with no other accessories, but that is ready to shoot. There's no eyepiece or battery, but it does come with the memory module, which is $1,700 on its own. But it's only got the one sensor.
Speaker 1:And a lens, right. So, depending on the cost of developing the lens... I mean, for context, the dual fisheye VR lens was, what, two grand?
Speaker 2:Yeah. And if it is a large format sensor, that means the lenses are going to be bigger; the elements will be bigger, because they're covering double the sensor.
Speaker 1:So most likely the lens is going to be much more integrated into the price; it's going to be much more a part of the price structure.
Speaker 2:Yes, for sure, which is again new for Blackmagic.
Speaker 2:They're not exactly a lens company. But the one thing about Blackmagic is they probably have the fairest pricing in the market; most things cost way more than this for way less camera. They seem to be about disruption. I mean, every camera they sell comes with a license for DaVinci Resolve, and that's not just the version that's out right now, that's a perpetual license, forever. They're kind of great about this sort of thing. So I am so happy that, of all the companies they chose to partner with, it was Blackmagic. I think this camera is going to come in under $30,000. I'm going to guess somewhere around $25,000, maybe, just based on the 12K being $15,000, but it could be less.
Speaker 1:Well, you said part of that is that you don't have to buy the memory module. It's included, right. It's ready to shoot, right.
Speaker 2:Right. Now, it is in other ways a hungrier camera. It comes with a B-mount battery and a 250-watt power supply.
Speaker 1:Oh my God, so this is a thirsty boy. You need a generator for that damn thing. And that B-mount...
Speaker 2:So that's a 24-volt battery system. So yeah, your typical 12-volt Gold Mount batteries aren't going to work, presumably anyway. I mean, maybe they'll come up with a shark fin for it or something. But other than that it's ready to go out of the gate, and eight terabytes is supposed to get you just over two hours of footage.
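As a sanity check on those storage figures: an 8 TB module yielding just over two hours implies a rough sustained data rate. The capacity and runtime are the numbers mentioned above; everything else here is back-of-envelope arithmetic, not a Blackmagic spec.

```python
def sustained_rate_gb_per_s(capacity_tb: float, runtime_hours: float) -> float:
    """Average write rate in GB/s for a card of capacity_tb filled in runtime_hours."""
    capacity_gb = capacity_tb * 1000  # decimal TB, as storage vendors market it
    return capacity_gb / (runtime_hours * 3600)

rate = sustained_rate_gb_per_s(8, 2)
print(f"~{rate:.2f} GB/s sustained")  # ~1.11 GB/s

# Minutes of footage per terabyte at that rate:
minutes_per_tb = 1000 / rate / 60
print(f"~{minutes_per_tb:.0f} minutes per TB")  # ~15 minutes per TB
```

At roughly a gigabyte per second, that is the kind of throughput the whole pipeline, cards, readers, NAS, has to keep up with, which is the point being made about production infrastructure.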
Speaker 1:So that's what we were just describing: the type of production house, the access to resources you're going to need to work with this data, is significant. Yeah, significant. I mean, it's currently out of our capability, quite a bit out of our capability, but you know, we're getting there.
Speaker 2:But very, very exciting. And then, yeah, the workflow stuff that's built in, proxies and things like that. If you have the infrastructure in place for this, you can be uploading either your raw files or your proxy files while you're shooting, so the editors can start working on your stuff through the whole cloud system that they've announced previously. So they're really going for a professional workflow here and, again, not cutting any corners. You've got 12G-SDI.
Speaker 1:You've got 10-gig Ethernet. Well, and a lot of these features, I'm sure, were developed in tandem with professionals saying, we can't use these unreliable systems anymore. You can't go to clients with a hodgepodge. You can't go to clients and say, I'm sorry, the ST mapper keeps crashing. Right. Yeah, our custom software that some dude wrote for free isn't working anymore, or...
Speaker 2:Whatever. Or, you know, an update killed it, or whatever. Like, that stuff doesn't work, and we've all been struggling with that this whole time with a hodgepodge of support. So this is taking an existing, or basically existing, camera system and just extending it, which is again awesome. One other thing about this I wanted to mention: even the fact that it has two screens. Again, this is a proper cinema camera. It's got a fold-out screen for the operator, and it's still got a screen on the other side for the assist. The fact that they left that in, they're emphasizing that this is designed for professional workflows, and I'm here for it.
Speaker 1:Yeah, you even said that you were able to figure out that it still does have the ND filter.
Speaker 2:It looks like it does. So again, this is speculation, but if you look at the body, because basically from the lens back it's the 12K, there are a couple of buttons on the top of the body, plus and minus, that are for your ND filters on the 12K. And those buttons are still there, in the pictures anyway. So I'm crossing my fingers, and it would make a lot of sense: you've got a fixed-lens system, and the only way you can have NDs on that is built into the body. So it looks like they really thought of everything.
Speaker 1:This makes me very happy that this exists, because it legitimizes a lot of things about this medium in ways that we hadn't really. It's sort of like the. It's probably even bigger in scope than the announcement of the canon r5c dual fish eye mount yeah, I mean, well, that was yeah.
Speaker 2:How do you compare this? Okay, so that lens is kind of as disruptive as the 5D Mark II was when it came out in traditional filmmaking, you know, because that just shattered...
Speaker 1:It shattered everything.
Speaker 2:I mean, it birthed DSLR filmmaking. Yeah, there was nothing before. I tell this story a lot because I think it's hilarious: I was doing 35-millimeter depth-of-field adapters, had just put together a whole kit with an expensive 35-millimeter adapter, and then that camera was announced and made it completely irrelevant overnight.
Speaker 1:It was like a neutron bomb, yeah.
Speaker 2:Yeah. So that's what this lens has done for the past couple of years. So now it's like, okay, well, I don't know what you would call this equivalent to.
Speaker 1:This is sort of like, maybe, the arrival of the RED.
Speaker 2:Yeah, yeah, yeah. Actually, it's probably more the promise, because RED was always supposed to be "obsolescence obsolete." It was supposed to be affordable, and at the time I guess it was by comparison. You know, you're talking about 20 grand instead of 100 grand for a camera. But this is probably going to be the real disruption in those professional pipelines.
Speaker 1:Another thing that was sort of nestled in there during the WWDC announcement, sort of head-scratching, but in retrospect it makes a lot of sense given the market situation, was that Apple's going to be partnering with Vimeo. Yeah, a platform that I have not heard mentioned in years. Yeah, which was trying really, really hard to be the anti-YouTube for a long time. And still, you know... I kid. I still see a lot of professional portfolios on Vimeo, because it's more of a streamlined professional platform. Yeah. And actually, Vimeo has had support for 360 videos for a while.
Speaker 1:I'm unsure if they ever created VR180 support. I don't believe they have. But this sort of negates that, because they're going to be working directly with Apple: it's going to support MV-HEVC, it's going to support the Apple Immersive Video format. And this is all happening because YouTube and Apple are not friends. They've been spurned. Google and Apple have a very, very complicated history, and currently it seems that, other than the billions Google pays to be the default in Safari, that's pretty much it. Yeah. And it has a lot to do with the fact that Google and Samsung are working on their own...
Speaker 2:Yeah, a competing headset. Yeah, and they've just been sleeping on VR. I mean, the fact that they had to bring features back because they were deprecating stuff. So now we're back to 8K and all this. Nobody wants to bet on somebody who's going to be changing their mind all the time: okay, it's out, no, it's in, it's out, now we're back. Yeah. And Google had their own VR headset.
Speaker 1:They were ahead, they had one 10 years ago, and the actual software implementation of YouTube on the Daydream platform was miles ahead of Facebook's software implementation. It was a much, much more robust feature set, and the software was more stable. But because it didn't sell, you know, it wasn't an instant hit, they deprecated it after about a year. So if I was any other creator or company trying to get into that space, I would run away from them at the earliest chance. Yeah, I mean, it is a little confusing. Why Vimeo? Because I haven't seen Vimeo relevant in a long time. But they do have the infrastructure for it, for sure. Yeah, they have the streaming tech, they have the cloud rendering tech.
Speaker 2:So yeah, and it's not already riddled with ads, you know. It is a different platform, and it is kind of just like, why not Vimeo, right?
Speaker 1:I think that's just it, it's the only other viable alternative. And I've always sort of wondered why Apple didn't decide to just do it themselves and create their own user-generated content platform. It's just not their bag.
Speaker 2:It's kind of interesting. I kind of agree, but I also understand why they didn't. I mean, they just make certain bets, and that's never really been their thing.
Speaker 1:Yeah, no, I sort of get why they haven't done it, but it would solve a lot of problems for them.
Speaker 2:I don't know if it would, because, I mean, then they become... So a big part of Apple's thing is that they're about privacy, and you're paying for an experience on the device, right, which means no ads. Yeah, it just means no ads. And YouTube has always been, and was intended to be, an ad-delivery platform. They're an ad company. That's what they do. It's what they do.
Speaker 2:It's what they do, and so it's sort of against the ethos of Apple in that way, where you spend more on the device, but it's not subsidized by ads, and it doesn't have apps on it that you cannot remove because Facebook paid to have that thing permanently on the phone or whatever. That's been the whole thing. So I think they're just steering clear of that. And also, they've always been about high quality, and user-generated content, up until recently, has not been high quality. Yeah, you know. Hence why they're really protective already about the immersive content. Yeah, and who's creating it.
Speaker 1:Well, and the interesting thing is Meta TV. Meta has their own user-generated content platform, and they don't do anything with it.
Speaker 2:Yeah, but there's no curation. There's no curation. Well, there is a little bit of curation.
Speaker 1:Actually, in fact, I think there might be a little too much curation. Yeah, maybe. But what they're not doing is creating enough of an onboarding platform for there to really be something that takes off. A lot of the stuff it seems like they're creating is very low production cost and dubiously creative. Yeah, we'll just say they're going for quantity over quality.
Speaker 2:Yeah, and clearly Apple's going the other way, because there are seven things.
Speaker 1:So there's got to be a middle ground. And I think when Samsung and Google launch their headset, I'm guessing that YouTube will probably have a better system in place, because historically YouTube has... they've created their own content, they've sponsored YouTube series, they even have their own YouTube Originals platform, though I have heard that they've dialed back a lot of that content. But if they're going to be launching an immersive platform, they're going to need immersive videos to launch on it, and I'm guessing that they're probably batching that stuff up in the background. Again, this is speculation. I think the camera and the Blackmagic Design partnership is incredibly tactical, I think it's a very smart move. I don't know what the user-generated content on Vimeo is going to look like.
Speaker 2:Yeah, which actually kind of leads to the next, or the last, point that I have. The other two things that were announced were these two Canon lenses. One of them looks just to be a scaled-down version of the 5.2mm: a 3.9mm lens designed for APS-C-size cameras. This is interesting because presumably it's going to be cheaper. I actually don't know.
Speaker 2:It's not actually out yet, but the design seems to be for the R7. And the other thing that's interesting is they keep saying it's going to be 4K 30 frames, but the R7 can do 60 frames. So there are questions, like, is it going to be some weird thing where they lock it down via the firmware integration so it'll only do 30 for some reason? Maybe that's just a typo, who knows. Yeah, but the main thing is that this is for small-format cameras, which means only so much resolution is possible. It's up to 4K, right? Yeah, it's 4K, which, at least on the R7... if somebody can come out with an 8K APS-C-size sensor, then maybe we'd be... Well, and you know, I watched Hugh's review of it, and there was some stuff that I didn't see in the review.
Speaker 1:But it seems like he says it's sharper in the center, and the resolution is lower, but it also has a lower field of view, so the PPD is about the same.
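The PPD (pixels per degree) point above can be sketched numerically: a lower-resolution lens can match a higher-resolution one if its field of view shrinks proportionally. The numbers below are illustrative only, not the actual specs of either Canon lens.

```python
def ppd(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Pixels per degree across the horizontal field of view (per eye)."""
    return horizontal_pixels / horizontal_fov_deg

# Hypothetical comparison: wider lens with more pixels vs. a narrower,
# lower-resolution one covering a smaller field of view.
wide = ppd(4096, 190)    # e.g. a full 190-degree fisheye capture
narrow = ppd(3072, 144)  # e.g. a smaller sensor with a narrower view
print(round(wide, 1), round(narrow, 1))  # 21.6 21.3 -- roughly the same PPD
```

That is why "lower resolution but lower field of view" can still look equally sharp per degree in the headset.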
Speaker 2:Right. So it's an interesting thing, and I'm glad they're doing it. It means there's more than likely going to be a version two of the 5.2mm that incorporates these things. It's got autofocus. It's momentary autofocus, but still, that's awesome. It's not continuous, but it's just a nice thing to not sit there hunting: you can give a half-press and you've got focus lock. And the inter-lens focal adjustment is also electronic, and not this fiddly tool situation, so that's cool.
Speaker 2:The other biggest thing for me is the screw-on filters on the back. Yeah, because you know we had to spend a pretty good amount of money, and it got really scary, setting a magnetic frame on the inside of the R5C so that we could have NDs on the R5C. So this is great. I mean, granted, technically the 5.2 has a way to put an ND back there, but it's got to be a gelatin filter that you screw down, and that's not real.
Speaker 1:Yeah, I'm not doing that. Well, and then the last lens... Canon has sort of been hinting at this. We've shown pictures of it in previous episodes. Yeah, it looks a little different, but that's the spatial... yeah, the spatial video lens, 7.8 millimeter.
Speaker 2:Yeah, it's even slower, which is interesting. This one's an f/4 lens, STM I want to say, but there's very little about this one. There is some interesting stuff about it, though. The thing I thought was really interesting is it's still using... So we've all been speculating that on the iPhone 15 they sort of kludged the spatial shooting, with how close the lenses are together, and this lens has what looks to be the same or a very, very similar interaxial distance. Yeah, lenses right next to each other, just like that. So that's really interesting. But here's the thing from the press release. They use this language, which I think is really interesting: it "features a field angle that is similar to a person's field of view, which enables videographers to naturally capture memorable moments."
Speaker 1:That's an interesting word soup.
Speaker 2:Yeah, they're very clearly obfuscating something there, because "field angle" sounds like field of view to some degree. But does that mean Apple is really sticking with that narrow spacing for spatial? Because my understanding is that on the AVP...
Speaker 1:They are eye-distance apart. So that's interesting. But either way, this will naturally create a hypostereo effect, right? Right?
Speaker 2:Yeah, somewhat hypo, yeah. So there's been all this speculation that Apple's doing post-processing with the LiDAR and this and that to make it look more comfortable, but this kind of indicates that that's not true at all. We'll see what comes out of this. But I think it's interesting, and this goes along with the Vimeo thing: apparently there's going to be a market (I don't know that I see it yet) for square 3D videos. Not immersive, just 3D content, because that's what this generates.
Speaker 1:Yeah, I don't know, I disagree. I think there will be a market for it, and we've even... I didn't say there wasn't.
Speaker 2:Oh. Oh, I'm saying that's what they're saying it is. I don't know.
Speaker 1:Oh, okay.
Speaker 2:Like I don't know that it's marketable to me.
Speaker 1:I see, yeah, I see.
Speaker 2:Yeah, but no, clearly they're throwing money at it. Yeah, I don't know. I mean, I think it's interesting. We obviously use spatial videos as, like, B-cam-type inserts in our immersive videos.
Speaker 1:And they work, and it works really, really well. Yeah, don't get me wrong, I've been quite pleased with them.
Speaker 2:But I just wonder if it's another... because it's like a half measure, right? Like, without the immersion, is it that interesting? I think it is. I think it's interesting for memories and the way that Apple has pitched it.
Speaker 1:I also think it's really interesting for looking at objects with a sense of depth. Because, and we've talked about this in more filmmaking-specific terms, what is the value of shooting in VR180 when sometimes you don't want to see the things around you? Why wouldn't you just shoot it flat? I think a lot of that has to do with a sense of depth. And here's the thing: even the stereoscopic effect alone, I think, creates a sense of presence.
Speaker 2:It does.
Speaker 1:That is not replicated by flat filming.
Speaker 2:Yeah, no, there's no question. To counter that idea: it's not just seeing around you, it's the fact that you move your head in the video, like you're still looking around the video a little bit, and when you're looking at a flat frame, that's kind of broken. I will say what is interesting about this is it does allow you to film in a more traditional manner, i.e. moving around on a gimbal a lot. Yeah, like you're not making people sick with spatial videos, so maybe it's a gateway, and/or, you know, one of those things.
Speaker 2:It's a lot harder, yeah. It's one of those things... unless you're blowing it up, right? Because you're not messing with...
Speaker 1:I mean I definitely have had.
Speaker 2:I've felt eye strain, maybe, but not... Well, no.
Speaker 1:I have watched some 3D movies on a really large screen and felt nauseous before.
Speaker 2:But that's a 3D movie, so that's different, right? Because several 3D movies were not captured in 3D.
Speaker 1:They were interpolated.
Speaker 2:And you can see that in those, like there are slices to it.
Speaker 1:And you can very clearly see it, like my buddies at Legend 3D used to do, yeah, back in the day.
Speaker 2:And I know those are the ones that are hard to watch. You really have to tweak them to get them comfortable to look at, but the ones that were shot natively usually work pretty well, for me at least. Actual spatial video shot on the iPhone works, and I've never been, like, sick. Again, you might have eye strain, but only if you blow it up big enough. It's one of those funny things with the brain: you have to anchor something. But yeah, and I will say another pro for spatial shooting is you can get closer to objects, right? Much closer, much, much closer. And because of that hypostereo effect, there's some cool... it's almost like a macro lens for immersive content.
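The "get closer" point follows from a common stereography rule of thumb (the 1/30 rule): the nearest comfortable subject distance is roughly 30 times the interaxial (lens separation). This is a general guideline, not a spec for any of the cameras discussed, and the interaxial values below are illustrative.

```python
def nearest_comfortable_cm(interaxial_mm: float, ratio: float = 30.0) -> float:
    """Approximate nearest comfortable subject distance, in centimeters,
    using the 1/30 stereography rule of thumb."""
    return interaxial_mm * ratio / 10  # mm -> cm

print(nearest_comfortable_cm(63))  # human-IPD rig (~63 mm): 189.0 cm
print(nearest_comfortable_cm(20))  # narrow, phone-like spacing: 60.0 cm
```

A narrower interaxial lets the subject sit far closer before the stereo disparity becomes uncomfortable, which is exactly the "macro lens" effect described above.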
Speaker 1:The reason we know about this so much is we're working on a project that utilizes the hypostereo effect to great results.
Speaker 2:Yeah, which is a pain on the iPhone because of the two different sensors and apertures and everything.
Speaker 1:That's why I've been sort of keeping my eye on other spatial video cameras, something that has more consistency between the two sensors. Yeah, so in that way I would be interested in this spatial lens, if it's affordable.
Speaker 2:Yeah, because we could use that for inserts instead of an iPhone. Or at least maybe the next iPhone won't be doing this sort of kludge thing with two different cameras, because it stinks. Well, and I think we have just a little bit more, just a tiny bit of stuff to talk about.
Speaker 1:That is relevant to this. So, people have been asking us what our next thing is for a while. You know, Adam sort of drove it into us on one of the podcast episodes, and we're working on something right now, and it's a big swing for us.
Speaker 1:It's a big project, it's very ambitious. It's not crazy, it's doable, but like with anything, it's going to require a lot of resources, a lot of time, a lot of money. Not a crazy amount of money, yeah, but definitely a budget, and we're putting that together right now. We applied for a grant from the Oregon Film Fund, and that itself is a swing. We might not get it, we might get it, who knows, but we're planning around it.
Speaker 1:We're planning around it just in case it doesn't go that way. But effectively, we're going to be working on a crowdfunding campaign that's going to launch later this year. We're still very much in the pre-production phase. We want to make it high quality, we want to make it good, and so we want to really make sure that everything is taken care of, and we're going to need help from our more out-there resources. I think with this type of project, especially for small filmmakers, we hope that it's in everyone's best interest, because this is a scripted fictional project, and we want it to be really good, and we want it to be our calling card. What do you think, Anthony?
Speaker 2:What he said, I mean, yeah. It is all those things. There's a lot of effort in the writing. We've worked on a lot of projects of similar size.
Speaker 1:This one is a little bit bigger, and it's ours this time. We're not showing up on set for a different producer, so it's a lot more ambitious. But it's something that we need to do, and we hope that you will help us do it, and we'll have more details in the future. We want to be sort of vague right now because we're still putting it together, but it's a big thing for us, and when we get closer to the timeline of it, it may interrupt the production of the podcast, because it's going to take precedence, for sure. But we'll also be using the podcast as a platform to sort of bring people in, because we think it's important for our viewers to understand that this is...
Speaker 1:This is a podcast about filmmaking, and not just talking about other people doing filmmaking. Yeah, part of the reason we created this was we knew eventually we would get to this point, and we wanted to be able to share our capabilities and our ambitions, and pitfalls, yeah, with the viewers. Because, you know, there are other podcasts that talk about movies and behind-the-scenes and production and screenwriting and that type of thing, and that's more of the vein that we've always wanted to be in. We talk about the technical stuff because it's relevant to the industry, but we want to be more holistic about the production of immersive video filmmaking.
Speaker 2:Yep.
Speaker 1:Yep, so I really hope that you can help us out when the time comes, because we're going to need it.
Speaker 2:Yeah, yeah, it's something we believe in, and we think a good number of y'all believe in it. Exactly. So, you know, if we want more content, we've got to pull together and try to make it happen.
Speaker 1:Yeah, and with that, we're bringing the Stereoscope podcast to a close.