
VR and non-VR gaming
Foveated streaming! That's a great idea. Foveated rendering is complicated to implement with current rendering APIs in a way that actually improves performance, but foveated streaming seems like a much easier win that applies to all content automatically. And the dedicated 6 GHz dongle should do a much better job at streaming than typical wifi routers.
> Just like any SteamOS device, install your own apps, open a browser, do what you want: It's your PC.
It's an ARM Linux PC that presumably gives you root access, in addition to being a VR headset. And it has an SD card slot for storage expansion. Very cool, should be very hackable. Very unlike every other standalone VR headset.
> 2160 x 2160 LCD (per eye) 72-144Hz refresh rate
Roughly equivalent resolution to Quest 3 and less than Vision Pro. This won't be suitable as a monitor replacement for general desktop use. But the price is hopefully low. I'd love to see a high-end option with higher resolution displays in the future, good enough for monitor replacement.
> Monochrome passthrough
So AR is not a focus here, which makes sense. However:
> User accessible front expansion port w/ Dual high speed camera interface (8 lanes @ 2.5Gbps MIPI) / PCIe Gen 4 interface (1-lane)
Full color AR could be done as an optional expansion pack. And I can imagine people might come up with other fun things to put in there. Mouth tracking?
One thing I don't see here is optional tracking pucks for tracking objects or full body tracking. That's something the SteamVR Lighthouse tracking ecosystem had, and the Pico standalone headset also has it.
More detail from the LTT video: Apparently it can run Android APKs too? Quest compatibility layer maybe? There's an optional accessory kit that adds a top strap (I'm surprised it isn't standard) and palm straps that enable using the controllers in the style of the Valve Index's "knuckles" controllers.
> Foveated streaming! That's a great idea.
Back when I was in Uni, so late 80s or early 90s, my dad was Project Manager on an Air Force project for a new F-111 flight simulator, when Australia upgraded the avionics on their F-111 fighter/bombers.
The sim cockpit had a spherical dome screen and a pair of Silicon Graphics Reality Engines. One of them projected an image across the entire screen at a relatively low resolution. The other projector was on a turret that pan/tilted with the pilot's helmet, and projected a high resolution image, but only in a perhaps 1.5m circle directly in front of where the helmet was aimed.
It was super fun being the project manager's kid, and getting to "play with it" on weekends sometimes. You could see what was happening while wearing the helmet and sitting in the seat if you tried - mostly by intentionally pointing your eyes in a different direction to your head - but when you were "flying around" it was totally believable, and it _looked_ like everything was high resolution. It was also fun watching other people fly it, and being able to see where they were looking, and where they weren't looking while the enemy was sneaking up on them.
I'll share a childhood story as well.
Somewhere between '93 and '95 my father took me abroad to Germany and we visited a gaming venue. It was packed with typical arcade machines, games where you sit in a cart holding a pistol and shoot things on the screen while the cart moves all over the place simulating a bumpy ride, etc.
But the highlight was a full 3D experience shooter. You got yourself into a tiny ring, with a 3D headset and a single puck held in your hand. Rotate the puck and you move. Push the button and you shoot. Look around with your head. Most memorable part - you could duck to avoid shots! The game itself, as I remember it, was fully wireframe, akin to Q3DM17 (The Longest Yard) minus the jump pads, but the layout was kind of similar. The player held a dart gun - you had a single shot and you had to wait until the projectile decayed or connected with another player.
I'm not entirely sure if the game was multiplayer or not.
I often come back to that memory because shortly after, within that same time frame, my father took me to a computer fair where I had the opportunity to play Doom/Hexen with the VFX1 (or whatever it was called), and it was supposed to revolutionize the world the way AI is supposed to now.
Then there was a P5 glove with jaw dropping demo videos of endless possibilities of 3D modelling with your hands, navigating a mech like you were actually inside, etc.
It never came.
That sounds like you're describing Dactyl Nightmare. [1] I played a version where you were attacking pterodactyls instead of other players, but it was more or less identical. That experience is what led me to believe that VR would eventually take over. I still, more or less, believe it even though it's yet to happen.
I think the big barriers remain price and experiences that focus more on visual fidelity than on gameplay. An even bigger problem is that high-end visual fidelity tends to result in motion sickness and other side effects in a substantial chunk of people. But I'm sticking to my guns there - one day VR will win.
It is precisely that! My version was wireframe and I can't recall the dragon, but everything else is exactly like I remembered it!
For me this serves as an example.
A few years later VFX1 was the hype, years later Oculus, etc.
But 3D graphics in general - as seen in video games - are similar: minus recent things like Lumen, it's still stuff from Graphics Gems in the '80s-'90s, just on silicon.
Same thing is happening now to some degree with AI.
Nah, people spend $700 on consoles; the biggest barrier is comfort.
As long as the headsets are heavy, I won't get one, no matter how great the graphics are or how good the game is
I played that game in Berlin in the late 90s. There were four such pods, iirc, and you could see the other players. The frame rate was about 5 frames per second, so it was borderline unplayable, but it was fun nevertheless.
Later, I found out that it was a game called "Dactyl Nightmare" that ran on Amiga hardware.
Maybe something like this?
https://en.wikipedia.org/wiki/Virtuality_(product)
I think I played the 1000CS or similar in a bar or arcade at some point in the early '90s.
Yes!
The booth depicted in the 1000CS image looks exactly how I recall it, and the screenshot looks very similar to how I remember the game (minus the dragon, and mine was fully wireframe). The map layout has that Q3DM17 vibe I was talking about.
Isn't it crazy that we had this tech in ~'91 and it's still just not there yet?
On a similar note - around that time, mid '90s, my father also took me to CeBIT. One building was almost fully occupied by Intel or IBM, and they had different sections dedicated to all sorts of cool stuff. One I won't forget was straight out of Minority Report, only many years earlier.
They had a whole section dedicated to showcasing a "smart watch". Imagine Casio G-Shock but with Linux. You could navigate options by twisting your wrist (up or down the menu) and you would press the screen or button to select an option.
They had different scenarios built in the form of an amusement park - from a restaurant where you would walk in with your watch - it would talk to the relay at the door and download the menu for you, just so you could twist your wrist to select your meal and order it without human interaction, and... leave without interaction as well, because the relay at the door would charge you based on your prior selection.
Or - and this was straight out of Minority Report - an airport scenario, where you would disembark at your destination and walk past a big screen that would talk to your watch and display travel information for you, asking whether you'd like to order a taxi to your destination, based on your data.
Oh wow, I also played with this one in what might have been a COMDEX, in the 90s.
I remember the game was a commercially available shooter though, but the machine was exactly the same, with the blue highlights.
>It never came.
Everything you described and more is available from modern home VR devices you can purchase right now.
Mecha, planes, Skyrim, cinema screens. In VR, with custom controllers or a regular controller if you want that. Go try it! It's out and it's cheap and it's awesome. Set IPD FIRST.
[flagged]
My dad had an Apple Newton.
That's really cool. My first job out of college was implementing an image generator for the landing signal officer simulator on the USS Nimitz, also using SGI hardware. I would have loved to have seen the final product in person but sadly never had the chance.
I remember there was a flight simulator project that had something like that, or even it was that.
It was called ESPRIT, which I believe stood for Eye Slaved Programmed Retinal Insertion Technique.
> 2160 x 2160 LCD (per eye) 72-144Hz refresh rate
I question the idea that we couldn't create a special-purpose video codec that handles this without trickery. The "per eye" part sounds spooky at first, but how much information typically differs between these frames? The mutual information is probably 90%+ in most VR games.
If we were to enhance something like x264 to encode the 2nd display as a residual of the 1st display, this could become much more feasible from a channel capacity standpoint. Video codecs already employ a lot of tricks to make adjacent frames that are nearly identical occupy negligible space.
This seems very similar (identical?) to the problem of efficiently encoding a 3d movie:
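As a toy illustration of the residual idea (not a real codec - the gradient frame, the 4-pixel shift, and the abs-sum "bits" proxy are all made up for the sketch):

```python
import numpy as np

# Toy "left eye" frame: a smooth gradient standing in for typical scene content.
x = np.linspace(0, 1, 160)
y = np.linspace(0, 1, 120)
left = (np.outer(y, x) * 255).astype(np.int16)

# Toy "right eye": the same scene under a small horizontal disparity.
right = np.roll(left, shift=4, axis=1)

# A stereo-aware codec would entropy-code the left frame plus this residual
# instead of two full frames. Abs-sum is a crude proxy for bits spent.
residual = right - left
print(np.abs(residual).sum() / np.abs(right).sum())  # a small fraction of 1.0
```

Real encoders (MV-HEVC, H.264 MVC) do this with disparity-compensated prediction rather than a raw subtraction, but the channel-capacity argument is the same.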
I'm entirely unfamiliar with the vr rendering space, so all I have to go on is what (I think) your comment implies.
Is the current state of VR rendering really just rendering and transporting two video streams independent of each other? Surely there has to be at least some academic prior art on the subject, no?
Foveated streaming is cool. FWIW the Vision Pro does that for their Mac virtual display as well, and it works really well to pump a lot more pixels through.
It's the same number of pixels though, just with reduced bitrate for unfocused regions, so you save time in encoding, transmitting, and decoding, essentially reducing latency.
With foveated rendering, the number of rendered pixels is actually reduced.
At least when we implemented this in the first version of Oculus Link, the way it worked is that the image was distorted (AADT [1]) into a deformed texture before compression and then regenerated as rectilinear after decompression, as a cheap and simple way to emulate fixed foveated rendering. So there isn't some kind of adaptive bitrate that applies fewer bits outside the fovea region; it achieves a similar result by giving that region fewer pixels in the image being compressed. Doing adaptive bitrate would work too (and maybe even better), but encoders (especially HW-accelerated ones) don't support that.
Foveated streaming is presumably the next iteration of this where the eye tracking gives you better information about where to apply this distortion, although I’m genuinely curious how they manage to make this work well - eye tracking is generally high latency but the eye moves very very quickly (maybe HW and SW has improved but they allude to this problem so I’m curious if their argument about using this at a low frequency really improves meaningfully vs more static techniques)
[1] https://developers.meta.com/horizon/blog/how-does-oculus-lin...
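A minimal sketch of that fixed-foveation trick (not the actual AADT math, which uses a smooth warp rather than a hard crop; the function names, the 2x factor, and the window size here are all illustrative): keep a full-resolution window where the fovea is assumed to be, subsample everything else before encoding, and undo it after decode.

```python
import numpy as np

def foveate(frame, cx, cy, r, factor=2):
    """Full-res fovea window plus a subsampled periphery (which, for
    simplicity, redundantly includes the fovea region too)."""
    fovea = frame[cy - r:cy + r, cx - r:cx + r].copy()
    periphery = frame[::factor, ::factor].copy()
    return fovea, periphery

def reconstruct(fovea, periphery, cx, cy, r, factor=2):
    """Upsample the periphery, then paste the sharp fovea window on top."""
    full = periphery.repeat(factor, axis=0).repeat(factor, axis=1)
    full[cy - r:cy + r, cx - r:cx + r] = fovea
    return full

frame = np.arange(256 * 256, dtype=np.uint32).reshape(256, 256)
fovea, periphery = foveate(frame, cx=128, cy=128, r=32)

# Fraction of pixels that actually get encoded and streamed.
print((fovea.size + periphery.size) / frame.size)  # 0.3125
```

The encoder never sees the distinction - it just compresses a smaller image, which is why this works with stock hardware encoders that have no per-region bitrate control.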
Although your eye moves very quickly, your brain has a delay in processing the completely new frame you switched to. It's very hard to look left and right with your eyes and read something quickly changing on both sides.
That depends on the specifics of the encode/decode pipeline for the streamed frames. Could be the blurry part actually is lower res and lower bitrate until it's decoded, then upscaled and put together with the high res part. I'm not saying they do that, but it's an option.
It's the same number of pixels rendered, but it lets you reduce the amount of data sent, thereby allowing you to send more pixels than you would have been able to otherwise.
I think it works really well to pump the same amount of pixels, just focusing them on the more important parts.
Always PIP, Pump Important Pixels
It lets you pump more pixels in a given bandwidth window.
People are conflating rendering (which is not what I’m talking about) with transmission (which is what I’m talking about).
Lowering the quality outside the in focus sections lets them reduce the encoding time and bandwidth required to transmit the frame over.
Foveated streaming is wild to me. Saccades are commonly as low as 20-30ms when reading text, so guaranteeing that latency over 2.4 GHz seems Sisyphean.
I wonder if they have an ML model doing partial upscaling until the eyetracking state is propagated and the full resolution image under the new fovea position is available. It also makes me wonder if there's some way to do neural compression of the peripheral vision optimized for a nice balance between peripheral vision and hints in the embedding to allow for nicer upscaling.
I worked on a foveated video streaming system for 3D video back in 2008, and we used eye tracking and extrapolated a pretty simple motion vector for eyes and ignored saccades entirely. It worked well, you really don't notice the lower detail in the periphery and with a slightly over-sized high resolution focal area you can detect a change in gaze direction before the user's focus exits the high resolution area.
Anyway that was ages ago and we did it with like three people, some duct tape and a GPU, so I expect that it should work really well on modern equipment if they've put the effort into it.
It is amazing how many inventions duck tape found its way into.
Foveated rendering very clearly works well with a dedicated connection, with predictable latency. My question was more about the latency spikes inherent in an ISM general-use band combined with foveated rendering, which would make the effects of the latency spikes even worse.
They're doing it over 6GHz, if I understand correctly, which with a dedicated router gets you to a reasonable latency with reasonable quality even without foveated rendering (with e.g. a Quest 3).
With foveated rendering I expect this to be a breeze.
Even 5.8 GHz is getting congested. There's a dedicated router in this case (a USB fob), but you still have to share spectrum with the other devices. And at the 160 MHz symbol rate mode on Wi-Fi 6, you only have one channel in the 5.8 GHz spectrum that needs to be shared.
You're talking about "Wi-Fi 6" not "6 GHz Wi-Fi".
"6 GHz Wi-Fi" means Wi-Fi 6E (or newer) with a frequency range of 5.925–7.125 GHz, giving 7 non-overlapping 160 MHz channels (which is not the same thing as the symbol rate, it's just the channel bandwidth component of that). As another bonus, these frequencies penetrate walls even less than 5 GHz does.
I live on the 3rd floor of a large apartment complex. 5 GHz Wi-Fi is so congested that I can get better performance on 2.4 in a rural area, especially accounting for DFS troubles in 5 GHz. 6 GHz is open enough I have a non-conflicting 160 MHz channel assigned to my AP (and has no DFS troubles).
Interestingly, the headset supports Wi-Fi 7 but the adapter only supports Wi-Fi 6E.
Not so much of an issue when neighbors with paper-thin walls see that 6 GHz as a -87 dBm signal.
That said, in the US it is 1200 MHz wide, i.e. 5.925 GHz to 7.125 GHz.
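The channel count follows from simple arithmetic over that band:

```python
band_mhz = 7125 - 5925   # US 6 GHz band: 5.925-7.125 GHz
print(band_mhz)          # 1200 MHz of spectrum
print(band_mhz // 160)   # 7 non-overlapping 160 MHz channels
```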
The real trick is not over complicating things. The goal is to have high fidelity rendering where the eye is currently focusing so to solve for saccades you just build a small buffer area around the idealized minimum high res center and the saccades will safely stay inside that area within the ability of the system to react to the larger overall movements.
Picture demonstrating the large area that foveated rendering actually covers as high or mid res: https://www.reddit.com/r/oculus/comments/66nfap/made_a_pic_t...
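A back-of-envelope way to size that buffer (all figures below are ballpark assumptions, not anyone's published specs): vision is largely suppressed during saccades anyway, so the margin mainly has to cover smooth pursuit and tracking error accumulated over the system's end-to-end latency.

```python
# Ballpark figures, not measured specs.
pursuit_velocity_dps = 30.0   # deg/s, a typical smooth-pursuit speed
latency_s = 0.030             # eye tracker + render + stream + display

margin_deg = pursuit_velocity_dps * latency_s
print(round(margin_deg, 2))   # ~0.9 deg of extra high-res radius needed
```

Even with generous assumptions, the required margin is on the order of a degree or two, which is tiny next to the oversized high/mid-res region shown in the linked picture.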
It was hard for me to believe as well, but streaming games wirelessly on a Quest 2 was totally possible and surprisingly latency-free once I upgraded to Wi-Fi 6 (a few years ago).
It works a lot better than you’d expect at face value.
At 100fps (mid range of the framerate), you need to deliver a new frame every 10ms anyway, so a 20ms saccade doesn't seem like it would be a problem. If you can't get new frames to users in 30ms, blur will be the least of your problems, when they turn their head, they'll be on the floor vomiting.
> Saccades are commonly as low as 20-30ms when reading text
What sort of resolution are one's eyes actually resolving during saccades? I seem to recall that there is at the very least a frequency reduction mechanism in play during saccades
During a saccade you are blind. Your brain receives no optical input. The problem is measuring/predicting where the eye will aim next and getting a sharp enough image in place over there by the time the movement ends and the saccade stabilizes.
Yeah. I'd love to understand how they tackle saccades. To be fair, they do mention they're on 6 GHz - not sure if they support 2.4, although I doubt the frequency of the data radio matters here.
I would guess that the “foveated” region that they stream is larger than the human fovea, large enough to contain the saccades movement (with some good-enough probability).
Saccades AFAIK can jump the eye to an arbitrary position, which adds to the latency of finding the iris; basically the software ends up having to look through the entire image to reacquire the iris, whereas normally it does so incrementally relative to the previous position.
Are you really sure overrendering the fovea region would really work?
They use a 6 GHz dongle.
> Roughly equivalent resolution to Quest 3 and less than Vision Pro. This won't be suitable as a monitor replacement for general desktop use. But the price is hopefully low.
Question: what is the criterion for deciding this to be the case? Could you not just move your face closer to the virtual screen to see finer details?
There's no precise criterion, but the usual measure is ppd (pixels per degree), and it needs to be high enough that detailed content (such as text) displayed at a reasonable size is clearly legible without eye strain.
> "Could you not just move your face closer to the virtual screen to see finer details?"
Sure, but then you have the problem of, say, using an IMAX screen as your computer monitor. The level of head motion required to consume screen content (i.e., a ton of large head movements) would make the device very uncomfortable quite quickly.
The Vision Pro has about ~35ppd and generally people seem to think it hits the bar for monitor replacement. Meta Quest 3 has ~25ppd and generally people seem to think it does not. The Steam Frame is, specs-wise, much closer to Quest 3 than Vision Pro.
There are some software things you can do to increase legibility of details like text, but ultimately you do need physical pixels.
Even the vision pro at 35ppd simply isn't close to the PPD you can get from a good desktop monitor (we can calculate PPD for desktop monitors too, using size and viewing distance).
Apple's "retina" HiDPI monitors typically have PPD well beyond 35 at ordinary viewing distances, even a 1080p 24 inch monitor on your desk can exceed this.
For me personally, 35ppd feels about the minimum I would accept for emulating a monitor for text work in a VR headset, but it's still not good enough for me to even begin thinking about using it to replace any of my monitors.
Oh yeah for sure. Most people seem to accept that 35ppd is "good enough" but not actually at-par with a high quality high-dpi monitor.
I agree with you - I would personally consider 35ppd to be the floor for usability for this purpose. It's good in a pinch (need a nice workstation setup in a hotel room?) but I would not currently consider any extant hardware as full-time replacements for a good monitor.
I think there is a missing number here: the angular resolution of human eyeballs is believed to be ~60 ppd (some believe it's more like 90).
We get by with lower resolution monitors with lower pixel density all the time.
I think part of getting by with a lower PPD is the IRL pixels are fixed and have hard boundaries that OS affordances have co-evolved with.
(pixel alignment via lots of rectangular things - windows, buttons; text rendering w/ that in mind; "pixel perfect" historical design philosophy)
The VR PPD is in arbitrary orientations, which will lead to more aliasing. macOS kinda killed their low-dpi experience via bad aliasing as they moved to the hi-dpi regime. Now we have svg-like rendering instead of screen-pixel-aligned baked rasterized UIs.
I'm not sure most of us do anymore - see my 1080p/24 inch example.
No one who has bought almost any MacBook in the last 10 years or so has had PPD this low either.
One can get by with almost anything in a pinch, it doesn't mean its desirable.
Pixel density != PPD either, although increasing it can certainly help PPD. Lower density desktop displays routinely have higher PPD than most VR headsets - viewing distance matters!
Not only would it be a chore to constantly lean in closer to different parts of your monitor to see full detail, but looking at close-up objects in VR exacerbates the vergence-accommodation mismatch issue, which causes eye strain. You would need varifocal lenses to fix this, which have only been demonstrated in prototypes so far.
Couldn't you get around that by having a "zoom" feature on a very large but distant monitor?
Yes. You can make a low-resolution monitor (like 800x600px, once upon a time a usable resolution) and/or provide zoom and panning controls
I've tried that combination in an earlier iteration of Lenovo's smart glasses, and it technically works. But the experience you get is not fun or productive. If you need to do it (say to work on confidential documents in public) you can do it, but it's not something you'd do in a normal setup
Yes, but that can create major motion sickness issues - motion that does not correspond to the user's actual physical movements creates a dissonance that is expressed as motion sickness for a large portion of the population.
This is the main reason many VR games don't let you just walk around and opt for teleportation-based movement systems - your avatar moving while your body doesn't can be quite physically uncomfortable.
There are ways of minimizing this - for example some VR games give you "tunnel vision" by blacking out peripheral vision while the movement is happening. But overall there's a lot of ergo considerations here and no perfect solution. The equivalent for a virtual desktop might be to limit the size of the window while the user is zooming/panning.
For a small taste of what using that might be like turn on screen magnification on your existing computers. It's technically usable but not particularly productive or pleasant to use if you don't /have/ to use it.
This all sounds a bit like the “better horse” framing. Maybe richer content shouldn’t be consumed as primarily a virtualized page. Maybe mixing font sizes and over sized text can be a standard in itself.
It's just about what pixels-per-degree will get you close to a modern IRL setup. It's obviously enough for 80-char consoles, but you'd need to dip into large fonts for a desktop.
I did the math on this site and I'd have to hunch less than a foot from the screen to hit 35 PPD on my work-provided ThinkPad X1 Carbon with a 14" 1920x1200 screen. My usual distance is nearly double that, so normally I'm at roughly 70 ppd.
https://phrogz.net/tmp/ScreenDensityCalculator.html#find:dis...
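The same math is easy to reproduce directly (the panel width follows from the 14" diagonal and 16:10 aspect ratio; the 12" and 24" viewing distances are my guesses at "hunched" vs. normal):

```python
import math

def ppd(h_pixels, screen_width_in, distance_in):
    """Approximate pixels per degree at the center of the screen."""
    fov_deg = math.degrees(2 * math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# 14-inch 16:10 panel: horizontal width = diagonal * 16 / sqrt(16^2 + 10^2)
width_in = 14 * 16 / math.hypot(16, 10)   # ~11.9 in

print(round(ppd(1920, width_in, 12)))  # 36 ppd hunched about a foot away
print(round(ppd(1920, width_in, 24)))  # 69 ppd at a normal two feet
```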
And foveated streaming has a 1-2ms wireless latency on modern GPUs according to LTT. Insane.
That's pretty quick. I've heard that in ideal circumstances Wi-Fi 6 can get close to 5ms and Wi-Fi 7 can get down to 2ms.
It's impressive if they're really able to get below 2ms motion-to-photon latency, given that modern consumer headsets with on-device compute are also right at that same 2ms mark.
Wow, that's just 1 frame of latency at 60 fps.
Edit: Nevermind, I'm dumb. 1/60th of a second is 16 milliseconds, not 1.6 milliseconds.
No, that's between 0.06 and 0.12 frames of latency at 60 fps. It's not even a frame at 144 Hz (1s/144 ≈ 7ms).
Much less than that; 1 frame is 16ms.
60 fps is 16.67 ms per frame.
> Roughly equivalent resolution to Quest 3 and less than Vision Pro. This won't be suitable as a monitor replacement for general desktop use.
The real limiting factor is more likely to be having a large headset on your face for an extended period of time, combined with a battery that isn't meant for all-day use. The resolution is fine. We went decades with low resolution monitors. Just zoom in or bring it closer.
The battery isn't an issue if you're stationary, you can plug it in.
The resolution is a major problem. Old-school monitors used old-school OSes that did rendering suitable for the displays of the time. For example, anti-aliased text was not typically used for a long time. This meant that text on screen was blocky, but sharp. Very readable. You can't do this on a VR headset, because the pixels on your virtual screen don't precisely correspond with the pixels in the headset's displays. It's inevitably scaled and shifted, making it blurry.
There's also the issue that these things have to compete with what's available now. I use my Vision Pro as a monitor replacement sometimes. But it'll never be a full-time replacement, because the modern 4k displays I have are substantially clearer. And that's a headset with ~2x the resolution of this one.
> There's also the issue that these things have to compete with what's available now. [...] But it'll never be a full-time replacement, because the modern 4k displays I have are substantially clearer.
What's available now might vary from person to person. I'm using a normal-sized 1080p monitor, and this desk doesn't have space for a second monitor. That's what a VR headset would have to compete against for me; just having several virtual monitors might be enough of an advantage, even if their resolution is slightly lower.
(Also, I have used old-school VGA CRT monitors; as could be easily seen when switching to a LCD monitor with digital DVI input, text on a VGA CRT was not exactly sharp.)
VR does need a lot of resolution when trying to display text.
You can get away with less for games where text is minimized (or very large).
The weight on your face is half that of the Quest 3; they put the rest of the weight on the back, which perfectly balances it on your head. It's going to be super comfortable.
Yeah, many people already use something like the BoboVR alternative head strap for the Quest 3, which has an additional battery pack in the back that helps balance out the weight of the device in front.
Which doubles the weight on your head, which increases the inertia you feel when moving around playing active games. The Frame is half the weight on your face, so active games are going to be a lot more comfortable.
Whether or not we used to walk to school uphill both ways, that won't make the resolution fine.
To your point, I'd use my Vision Pro plugged in all day if it was half the weight. As it stands, its just too much nonsense when I have an ultrawide. If I were 20 year old me I'd never get a monitor (20 year old me also told his gf iPad 1 would be a good laptop for school, so,)
One problem is that in most settings a real monitor is just a better experience for multiple reasons. And in a tight setting like an airplane where VR monitors might be nice, the touch controls become more problematic. "Pardon me! I was trying to drag my screen around!"
> (20 year old me also told his gf iPad 1 would be a good laptop for school, so,)
Yikes. How'd that relationship end up? Haha.
Lol, I laughed then 20 seconds later started taking this literally: I think that was July, it had been two years, and it was over by November (presumably due to my other excellent qualities!) (all joking aside, for younger members in our audience, it was sweet and she was around in my life for at least another decade)
2K x 2K doesn't sound low-res; it's like Full HD, but with twice the vertical resolution. My monitor is 1080p.
Never tried VR set, so I don't know if that translates similarly.
Your 2K monitor occupies something like a 20-degree field of view from a normal sitting position/distance. The 2K resolution in a VR headset covers the entire field of view.
So effectively your 1080p monitor has ~6x the pixel density of the VR headset.
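Rough numbers behind that comparison (the FOV figures are assumptions, not official specs: a headset horizontal FOV of roughly 110 degrees, and a monitor filling roughly 20 degrees of view):

```python
monitor_ppd = 1920 / 20    # ~2K monitor across ~20 deg of view -> 96 ppd
headset_ppd = 2160 / 110   # 2160 px across a ~110 deg headset FOV -> ~20 ppd

print(round(monitor_ppd / headset_ppd, 1))  # ~4.9x, same ballpark as above
```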
Thank you for explaining, it makes sense now.
The problem is that the 2K square is spread across the whole FOV of the headset, so when it's replicating a monitor, unless that monitor is ridiculously close to your face, a lot of those pixels are 'wasted' in comparison to a real monitor with similar stats.
Totally true, but unlike a real monitor you can drag a virtual monitor close to your face without changing the focal distance, meaning it's no harder on your eyes. (Although it is harder on your neck.)
Why hasn't Meta tried this, given the huge amount of R&D they've put into VR, and that they literally had John Carmack on the team in the past?
They prioritized cost, so they omitted eye tracking hardware. They've also bet more on standalone apps rather than streaming from a PC. These are reasonable tradeoffs. The next Quest may add eye tracking, who knows. Quest Pro had it but was discontinued for being too expensive.
We'll have to wait on pricing for Steam Frame, but I don't expect them to match Meta's subsidies, so I'm betting on this being more expensive than Quest. I also think that streaming from a gaming PC will remain more of a niche thing despite Valve's focus on it here, and people will find a lot of use for the x86/Windows emulation feature to play games from their Steam library directly on the headset.
It will be interesting to see how the X86 emulation plays out. In the Verge review of the headset they mentioned stutters when playing on the headset due to having to 'recompile x86 game code on the fly', but they may offer precompiled versions which can be downloaded ahead of time, similar to the precompiled shaders the Steam Deck downloads.
If they get everything working well I'm guessing we could see an ARM powered Steam Deck in the future.
Despite the fact it uses a Qualcomm chip, I'm curious on whether it retains the ability to load alternative OS's like other Steam hardware.
> Despite the fact it uses a Qualcomm chip, I'm curious on whether it retains the ability to load alternative OS's like other Steam hardware.
I think it should: we have Linux support/custom operating systems on Snapdragon 8 Gen 2 devices right now today, and the 8 Gen 3 has upstream support already AFAIK
If you mean foveated streaming - It’s available on the Quest Pro with Steam Link.
What do you mean? What part have they not tried?
I use a 1920x1080 headset as a monitor replacement. It's absolutely fine. 2160x2160 will be more than workable as long as the tracking is on point.
> But the price is hopefully low.
The main value of Meta VR and AR products is the massive price subsidy which is needed because the brand has been destroyed for all generations older than Alpha.
The current price estimate for the Steam Frame is $1200 vs Quest 3 at $600 which is still a very reasonable price given the technology, tariffs, and lack of ad invading privacy
Quest 3 is $499 and Quest 3S is $299 in the US
> Very cool, should be very hackable. Very unlike every other standalone VR headset.

That might be the reason I'm going to buy it. I want to support this, and Steam has done a lot to get gaming on Linux going.
I guess there's a market for this but I'm personally disappointed that they've gone with the "cram a computer into the headset" route. I'd much rather have a simpler, more compact dumb device like the Bigscreen Beyond 2, which in exchange should prove much lighter and more comfortable to wear for long time periods.
The bulk and added component cost of the "all in one" PC/headset models is just unnecessary if you already have a gaming PC.
I'm personally quite hyped to see the first commercially available Linux-based standalone VR headset announced. This thing is quite a bit lighter than any of the existing "cram a computer in" solutions.
Strictly speaking the mobile Oculus/Meta Go/Quest headsets were Linux/Android-based: you can run a Termux terminal with Fedora/Ubuntu on them and use an Android VNC/X app to run the 2D graphical part. But I share your SteamOS enthusiasm.
Yeah, this is exactly what I've been waiting for for quite a long time. I'm very excited.
They crammed a computer into the headset, but UNLIKE Meta's offerings, this is indeed an actual computer you can run linux on. Perhaps even do standard computer stuff inside the headset like text editing, Blender modeling, or more.
As a current and frequent user of this form factor (Pico 4, with the top strap, which the Steam Frame will also have as an option, over Virtual Desktop) I can assure you that it's quite comfortable over long periods of time (several hours). Of course it will ultimately depend on the specific design decisions made for this headset, but this all looks really good to me.
Full color passthrough would have been nice though. Not necessarily for XR, but because it's actually quite useful to be able to switch to a view of the world around you with very low friction when using the headset.
There's always going to be a computer in it to drive it. It's just a matter of how generalised it is and how much weight/power consumption it's adding.
It's nice to have some local processing for tracking and latency mitigation. The cost from there to a full computer on the headset is marginal, so you might as well do that.
You can get a Beyond if that's what you want. It's an amazing device, and will be far more comfortable and higher resolution than this one. Valve has supported Bigscreen in integrating Lighthouse tracking, and I hope that they continue that support by somehow allowing them to integrate the inside-out tracking they've developed for this device in the next version of the Beyond.
That would probably add a lot of extra weight and it would need to make the device bigger.
I agree. Hopefully Bigscreen continues making hardware. I still have the original Bigscreen Beyond and I'm very happy with it, besides the glare.
From the review section:
Nikos asks — Q: Linux Desktop support?
A: Hi, Linux is not officially supported but can absolutely work with the Beyond 2. I'd suggest joining the Bigscreen Beyond Discord server for more information. Thanks, Bigscreen Support Team
---
Rant: they have disabled text selection for the reviews for some inexplicable reason.
It's using SteamVR, so it should work.
It's super light compared to the Quest 3: half the weight is on your face, and the rest is on the back, which balances the headset. The Bigscreen Beyond isn't wireless and has a narrower field of view.
> has a narrower field of view.
On the beyond 2, only by 2 degrees horizontally. I don't think that would even be noticeable.
I was worried about the built in computer as well, but then I found out it's only 185g. It is 78g more than the Bigscreen Beyond 2, but it's still pretty light.
I once lived in a place that had a bathroom with mirrors that faced each other. I think I convinced myself that not only is my attention to detail more concentrated at the center, but that my response time was also fastest there (can anyone confirm that?).
So this gets me thinking. What would it feel like to correct for that effect? Could you use the same technique to essentially play the further parts early, so it all comes in at once?
Kinda a harebrained idea, I know, but we have the technology, and I'm curious.
Peripheral vision is extremely good at spotting movement at low resolution and moving the eye to look at it.
I don't know if it's faster, but it's a non-trivial part of the experience.
Yeah, I've heard and noticed that as well (I thought about adding a note about it to my original comment). But what I'm curious about is the timing. What I suspect is that the periphery is more sensitive to motion but still lags slightly behind the center of focus. I'm not sure if it depends on how actively you are trying to focus. I'd love to learn more about this, but I didn't find anything when I looked online a bit.
It's good enough to see flickering on crt monitors at 50-60hz for some people.
I can see the spinning color wheels inside cheaper projectors as rapidly-changing rainbow lights leaking out of their ventilation grilles, but only with peripheral vision and mostly only if I'm moving my head at the same time.
> Foveated streaming! That's a great idea.
It would be interesting to see⁰ how that behaves when presented with weird eyes like mine, or worse. Mine don't always point the same way, and which one I'm actually looking through can be somewhat arbitrary from one moment to the next…
Though the flapping between eyes is usually in the presence of changes, however minor, in required focal distance, so maybe it wouldn't happen as much inside a VR headset.
----
[0] Sorry not sorry.
Have a look at this video by Dave2D: https://youtu.be/356rZ8IBCps. In his hands-on, he was very impressed with foveated streaming.
Yet this is shaping up to be one of the most interesting VR releases
How the hell would foveated streaming even work? It seems physically impossible: tracking where your eye is looking, sending that information to a server, having it process the frame, and then streaming it back.
The data you're sending out is just the positions and motion vectors of the pupils, and you probably only need about 16 bits for each of those numbers, for 2 eyes. So the equivalent of two floating-point numbers on a particular channel, or 32 bits at minimum. Any lag can be compensated for by simply extrapolating along the motion vectors.
It actually makes a lot of sense!
Eye-tracking hardware and software specifically focus on low latency, e.g. an FPGA close to the sensor. The packets they send are also ridiculously small (e.g. 2 numbers as the x, y positions of the pupils), so... I can see that happening.
Sure, eyes move very, VERY fast, but if you do a relatively small amount of compute on dedicated hardware, it can go quite fast while remaining affordable.
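As a rough sketch of how small that upstream packet could be (the field layout, 16-bit scaling, and 500 Hz rate here are my own assumptions for illustration, not Valve's actual protocol):

```python
import struct

def pack_gaze(x, y, dx, dy):
    """Pack a normalized gaze position (x, y) and motion vector (dx, dy),
    each in [-1, 1], into 16-bit signed ints: an 8-byte packet."""
    scale = 32767
    return struct.pack("<4h", *(round(v * scale) for v in (x, y, dx, dy)))

def unpack_gaze(packet):
    """Inverse of pack_gaze: recover the four normalized values."""
    return tuple(v / 32767 for v in struct.unpack("<4h", packet))

pkt = pack_gaze(0.25, -0.5, 0.01, 0.0)
print(len(pkt))  # 8 bytes per update; even at 500 Hz that's only ~4 KB/s upstream
```

Even doubling that for per-eye data, the upstream gaze channel is negligible next to the downstream video stream, which is why the latency budget, not bandwidth, is the hard part.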
It just needs to be less impossible than not doing it. I.e. sending a full frame of information must be an even more impossible problem.
> Mouth tracking?
What a vile thought in the context of the steam… catalogue.
I'm guessing its main use case will be VRChat, syncing mouths to avatars.
They're probably thinking of it in comparison to the Apple Vision Pro, which attempts to do some facial tracking of the bottom of your face to inform its 'Personas'. Notably, it still fails quite badly on bearded people, where it can't see the bottom half of the face well.
I gathered as much, but still.
Funny enough the Digital Foundry folks put a Gabe quote about tongue input in their most recent podcast.
This is the first standalone headset with an open ecosystem. That's a big deal.
Meta Quests & Apple Visions require developer verification to run your own software, and provide no root access, which has slowed down innovation significantly.
Valve giving users root access out of the box is huge. It puts the headset in the same category as a real PC.
Praise be to gaben
> first standalone headset with an open ecosystem
What about the Lynx XR1? It runs Android, sure, but it is officially rooted (details: https://lynx.miraheze.org/wiki/Rooting_Process ) and can run Linux proper (details: https://wiki.postmarketos.org/wiki/Lynx_R1_(lynx-r1) ), even if experimentally.
There is but one issue with the Lynx XR1: no one really got it. A few backers randomly received a few units, but many others (including myself) are still waiting for their device to arrive (and will most likely wait forever).
This has a serious impact on the developer ecosystem - there are still a few people who got their devices and are doing interesting work, but with so few users actually having devices the community is too small for much progress to be expected.
It's kinda similar to the old Jolla Tablet. It was a very interesting device (an x86 tablet running an open Linux distro in 2013!) but it ended up in too few hands due to funding issues, and the number of Sailfish OS apps actually supporting the tablet (e.g. big screens, native x86 builds, etc.) reflected that.
> many others (including myself) are still waiting for their device
Sucks, sorry to hear that :(
Yes we released our headset with root access and an open bootloader. We are going to announce our next headset in a couple of months :-)
Not to mention Meta abandoned the Quest 1 very quickly. I bought a game when it came out and never got around to playing it (had kids). I tried to play it recently and it no longer even works! £30 down the drain, thanks Zuck.
I guess I can't complain too much given that I got it for free.
I bought an Oculus Go last year for €30. Its support was dropped quite some time ago, and you can only activate developer mode and sideloading through an old version of the Meta Horizons app [1]. But if you do that, there are 71 GiB of games to explore on the Internet Archive [2]. Some need patching to remove an online check against a server that no longer exists, but that is easy enough to do with a (regrettably Windows-only) tool someone published.
The Go is not the best headset, of course, but the games are a different style because of the 3DoF tracking without cameras: somewhat slower paced and played sitting down. A style I personally like more.
You can also unlock the device to get root on it [3], which is quite neat, although there doesn't seem to be any homebrew scene at all. Not even the most bare-bones launcher that doesn't require a Meta login.
[1] That doesn't even seem intentional, but it does mean that once the old version of the app can't communicate with Meta servers anymore, any uninitialized Go turns into a brick.
[2] https://archive.org/details/gear-vr-oculus-go
[3] https://developers.meta.com/horizon/blog/unlocking-oculus-go...
That's not quite true. When did you get your free Quest 1? Meta only officially stopped allowing devs to support those devices in January of this year, which IMO is not nice, but probably necessary to put resources toward newer devices, since the Quest 1 was extremely outdated and very hard to keep supporting.
It launched in May 2019, so it got almost 6 years of updates, and if you have one, you can still install older versions of existing apps that choose to support it (which admittedly is very rare). I shut off support for my game back in 2024 when they recommended it, since the device is less than half as powerful as the Quest 2, very few users still had one, and the Q1 was a hard target to hit performance-wise vs. newer devices.
If you spend $50 to get a Quest 2 you'll get a couple years of updates, or even better, spend $299 to get a 3S, which is an amazing piece of kit and will probably be supported for at least 5 more years since it just came out.
> £30 down the drain, thanks Zuck.
I'm sure he put it to good use. Like 500ms worth of upkeep for one of his yachts.
Sorry, maybe I missed it, but how do you know the ecosystem is open?
From the link we don't know if the OS can be changed (it might be locked down like many Android phones) or if a connected machine is required to run their DRM/Steam. The drivers may also not be open source.
It's SteamOS and SteamVR - you can run arbitrary aarch64 Linux binaries that talk to SteamVR and they should just work
Yep, I'm back into VR with this move, especially if the price is closer to $500 than $1000.
Unless the lenses/displays are bad, but I figure we would have heard by now?
From a cursory look, it seems SteamVR is intended to be used with their DRM platform and isn't open source. Maybe it's a bit less limiting vs. Meta's offering?
I wouldn't characterize this as an "open ecosystem", though.
The key takeaway is that you will rebuild the drivers less often:
1) The stack is mature now, we know what features can exist.
2) For me it's about having the same stack as on an RK3588 SBC, so I don't need to download many GB of Android software just to build/run the game.
The distance to getting an open-source driver stack will probably be shorter because of these 2 things, meaning OpenVR/SteamVR being closed is less of a long-term issue.
Android is also just Linux, but I can't install Debian on my phone.
Android isn't "just Linux". It's a heavily modified kernel, often paired with a closed-source bootloader, and the claim is completely untrue for userspace, which incorporates stuff from other OSes (BSDs, etc.). There are huge amounts of blobs.
Yes, there technically is a Linux kernel, but if it's "just Linux" then macOS is "just FreeBSD", because grep -V tells you so, because it has dtrace, because you run (ran?) Docker with effectively FreeBSD's bhyve, etc.
If you wanna spin it even further, neither Safari nor Chrome nor any other WebKit browser is "just Konqueror" because they took the layout engine code from KDE (KHTML).
And you can totally install Debian and even OpenBSD, etc. on a Steam Deck and at least the advertisement seems to indicate it won't be all that different for the VR headset.
The problem is that you're talking about the Linux desktop ecosystem, whereas the OP could be talking about the kernel. Both are "just Linux" (and the fact that we've not evolved our nomenclature to differentiate the two is surprising). Also, FWIW, the Android kernel is no longer heavily modified; most of the custom stuff has been upstreamed.
SteamOS at its base is just Arch with Steam and some additional software installed.
Great username for this type of comment.
Even just having direct access to hardware APIs is already a big win. On the Oculus Quest, the closest you can get is running with WebXR, but WebXR suffers from all the performance problems of web platforms (and from bugs in Meta's software: the recent Quest browser has a bug that prevents you from disabling spatial audio, rendering it unusable for watching video at all).
I just want a "dumb" headset that I can use as a portable private display for my laptop.
That's it.
I don't need 3D, I don't need VR, I don't need weirdass controllers trying to be special. Just give me a damn simple monitor the size of my eyes.
Fuck off with your XR OSes and "vision" for XR, not even Apple could get it fully right, the people in charge everywhere are too out of touch and have no clue where the fuck to go after smartphones.
HUD glasses kind of suck since having a display oriented to your head is uncomfortable. Adding 3DOF tracking only partially solves that, so you go 6DOF to maximize optical/vestibular comfort. Now you're rendering a virtual display within a virtual environment, but look at all that wasted space! So add more virtual monitors! Now you need some mechanism to manage them, so you add that and now you have a windowing system... so why are you rendering virtual monitors with fixed space desktops when you can just be rendering the application windows themselves?
The best portable private display for your laptop will inevitably be a 6DOF tracked headset with an XR native desktop.
Yes, sorry about my excessive use of French in the comment. I didn't mean it has to be a fixed 1:1 slab of the real-space screen; desktop app windows in XR space would be ideal, but none of the products seem to be able to get it all right yet.
Apple's visionOS comes close but it's crippled by the trademark Apple overcontrolling.
Then this is actually much closer than previous headsets?
There is a lot going on to render the desktop in a tracked 3D space, all that has to happen somewhere. If you're expecting to plug a HDMI cable into a headset and have a good time then I think you're underestimating how much work is being done.
OpenVR and OpenXR are really great software layers that help that all work out.
I am currently writing this from an Xreal One Pro. I think it fits what you are asking for.
I don't understand your comment. What you're describing has existed for years.
Well, maybe nothing compatible with MacBooks, then, with just a USB-C plug-and-play experience? Or I presumed it didn't exist.
When I look up the actual release dates on viable head mounted displays, it turns out I'm wrong: not years, more like "year."
You should check out the xreal one!
There you go https://www.sightful.com/
My Viture AR glasses are just a dumb display with an accelerometer, and work extremely well.
Frame is obviously the main headline here, but they're also launching a new SteamOS mini PC and a new controller.
https://store.steampowered.com/sale/steammachine
https://store.steampowered.com/sale/steamcontroller
No prices listed for any of them yet, as far as I can tell.
Oh hell yes. There was a leak of specs (via a benchmarking database) of an upcoming machine from Valve and I had my fingers crossed that it was a mini PC and not some VR thingy, saw this thread, and was sad for a moment before I spotted this post.
6x as powerful as the Steam Deck (which I use plugged in 98% of the time anyway; I'd have bought a Steam Deck 2, but I'm glad I get the option to put money toward more performance instead of a battery and screen that I don't use) is great. Not a lot of games I want to play won't run well at least at 1080p with specs like that.
What is the draw of the Steam machine though? They say the price is comparable to similarly specced PC. So why not just buy/build any mini PC? There's plenty of options for that
A good while back I abandoned PC gaming because I was sick of driver issues, compatibility, and always having to update hardware to play the next game. Instead, I embraced consoles and haven't considered PC gaming since then. This, however, has me reconsidering that. I want it to "just work". When I want to play games, I don't want to deal with all of that other crap. I'm old, ain't nobody got time for that.
It's wild how much experiences can vary. That's the nature of PCs, I suppose, which is what you're trying to avoid.
I've had no driver or compatibility issues in longer than I can remember. Maybe Vista?
I also rarely upgrade because playing at console level settings means I can easily get effectively the same lifetime out of my hardware. Though I do tend to upgrade a little earlier than console users still leaning a bit more towards the enthusiast side.
I mean I just don't see the difference between this and getting any PC and slapping SteamOS on it.
As someone who has been building my PCs for decades, I have to admit seeing some appeal here:
It's apparently small, quiet, capable, and easy.
I'll keep building my own, but most people don't, and the value of saved time and reduced hassle should not be underestimated.
If comparing this device to other pre-built systems, consider that this one is likely to be a first class target for game developers, while others are not.
Some people really don't want to spend time exchanging parts when the memory they buy turns out to be incompatible or that the GPU doesn't fit the sleek mITX case. There's a lot of research to ensure all parts are compatible and optimal when building a PC - for some it's time that could be better spent on using the PC instead of building one.
It's tiny. It runs SteamOS which is built to be used with a controller on a TV. And it will probably be a performance target for many developers.
But I think the biggest feature might be the quick suspend and resume. Every modern console has that, but not PCs. You can try to put a computer to sleep, but many games won't like that.
My Windows desktop doesn’t like that. It wakes instantly, no idea why.
Not to mention windows laptops waking up in bags or backpacks in the middle of the night seemingly for the only purpose of burning themselves up.
It's a console basically. It comes ready to play without much maintenance needed from the user.
One can argue consoles are PCs whose manufacturers try super hard not to let you root them.
This Steam Machine here is a PC with Steam preinstalled for a console-like setup and direct boot into your game library, but it's still a PC.
The point is, computers are computers I guess ;)
I love SFF PCs, but you can’t get the same density as a manufacturer doing a fully bespoke design. Just look at those innards: no space is wasted.
Yeah the heatsink filling the whole silicon-less volume is… something.
I've spent plenty of time building custom PCs, but life changes, and that's really not something I have any interest in doing any more.
There are plenty of people who just want to play games without researching what CPU and video card to buy.
For me it would be the small size and CEC capability. A custom-built PC currently can't use CEC over HDMI for a seamless experience with the entire home theater like a console can.
The experience of using a custom build is terrible.
The best experience you can get at the moment is Steam's Big Picture mode, and that doesn't give you pause/resume; you'll also sometimes need a keyboard & mouse to solve issues, plus you need to manage the whole OS yourself, etc.
Valve's SteamOS, which already runs on the Steam Deck, gives you all the QoL you expect out of a console: pause/resume with a press of the power button, complete control via controller, and a fully managed OS.
What's missing are "in experience" native apps like Netflix/Apple TV/etc., as well as support for certain games which are blocked on anti-cheat.
My wife is a research scientist who uses Linux in her day job, but she isn't interested in dealing with any nonsense when she's relaxing at the end of the day. The Steam Deck has been a wonder for her: suddenly she's playing the same games as me with none of the hassle. The Steam Machine will suddenly open a bunch of my friends and family up to PC games as well.
It won't be long until you can put SteamOS on any machine you make yourself, but the Steam Machine will serve as reference and "default" hardware for the majority.
Lots of companies tried to recreate the Steam Deck and quite frankly, they're just not as good as the original.
SteamOS is a super controller-friendly desktop that would be right at home in a living room. Like the Deck, the Steam Machine could become a target profile for developers.
PC gaming on the couch at last
Snapdragon doesn't really have a good history of supporting proper desktop games; Windows on ARM had kinda bad compatibility. It seems the aim is to have most games just be playable, like with the Deck. Fingers crossed, but I have some reservations.
Their new mini PC isn't ARM (the Frame is, though); it's AMD hardware like the Steam Deck. It appears to be x86 and should play basically anything in my library at 1080p or higher, as long as it works under SteamOS.
Real shame it's only 60Hz at 4K. There's a gap in the market for good 4K@120Hz streaming.
Hoping the next Apple TV will do it.
Edit: the updated specs claim it can do this, but it's limited to HDMI 2.0.
(rewriting this comment because the spec sheet has seemingly been updated)
Looks like it can do 4K 120Hz, but since it's limited to HDMI 2.0, it will have to rely on 4:2:0 chroma subsampling to get there. Unfortunately, the lack of HDMI 2.1 might be down to politics: the RDNA3 GPU they're using should support it in hardware, but the HDMI Forum has blocked AMD from releasing an open-source HDMI 2.1 implementation.
https://arstechnica.com/gadgets/2024/02/hdmi-forum-to-amd-no...
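The bandwidth arithmetic backs this up. A back-of-the-envelope check, assuming 8-bit color and HDMI 2.0's roughly 14.4 Gbps effective data rate after 8b/10b encoding (real signals also need blanking intervals, which this ignores):

```python
# Rough check: can HDMI 2.0 carry 4K @ 120 Hz?
HDMI20_EFFECTIVE_GBPS = 14.4  # 18 Gbps raw minus 8b/10b encoding overhead

def signal_gbps(width, height, fps, bits_per_pixel):
    """Raw pixel data rate in Gbps for a given mode."""
    return width * height * fps * bits_per_pixel / 1e9

# 4:4:4 keeps full chroma (24 bpp at 8-bit); 4:2:0 averages out to 12 bpp
full_chroma = signal_gbps(3840, 2160, 120, 24)  # ~23.9 Gbps: doesn't fit
subsampled = signal_gbps(3840, 2160, 120, 12)   # ~11.9 Gbps: fits

print(f"4:4:4 needs {full_chroma:.1f} Gbps, 4:2:0 needs {subsampled:.1f} Gbps")
```

So at 4K120 the full-chroma signal overshoots the link by a wide margin, while 4:2:0 squeezes under it, which is exactly the trade-off the spec sheet implies.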
It seems it supports DP 1.4 as well, so perhaps you could get an adapter if your display only supports HDMI 2.1
I'm not sure that would work. From what I can tell, the adapters are basically dumb straight-through cables; they aren't converting anything. It's the actual GPU that outputs an HDMI signal over the DisplayPort connector, and the adapter then rewires it into an HDMI-shaped connector.
... but isn't it using a wireless dongle to connect to the headset to the PC so HDMI doesn't get involved?
It seems to me the wireless is pretty important. I have an MQ3 and I have the link cable. For software development I pretty much have to plug the MQ3 into my PC, and it's not so bad to wander around the living room looking at a Mars boulder from all sides and such.
For games and apps that involve moving around, particularly things like Beat Saber or Supernatural, the standalone headset has the huge advantage of having no cable. If I have a choice between buying a game on Steam or the MQ3 store, I'm likely to buy the MQ3 version because of the convenience and freedom of standalone. A really good wireless link changes that.
> but isn't it using a wireless dongle to connect to the headset to the PC so HDMI doesn't get involved?
I'm talking about the Steam Machine here. In theory you could pipe 4k120 to the headset assuming there's enough wireless bandwidth, yeah.
So, in the specs for the mini PC, it claims the video out can do 4K @ 120Hz (even faster over DisplayPort). I assume the 4K @ 60Hz you saw is from the "4K gaming at 60 FPS with FSR" line.
I reckon it can probably stream at 4K@120 if it can game at half that.
Interesting. I also saw HDMI 2.0 - I guess it’s technically possible but with subsampling?
This is not true, from the specs:
HDMI 2.0
Up to 4K @ 120Hz
Supports HDR, FreeSync, and CEC
I have zero doubts the device can do 4K @ 120Hz streaming, hardware-wise. In the end it is just a normal Linux desktop.
Considering how much they talk about foveated rendering, I think it might not be constrained by the traditional limitations of screens: instead of sending a fixed-resolution image at whatever frequency, it'll send a tiny but highly detailed image where your eyes are focusing, with the rest being considerably lower resolution.
Or that's what I think; I may be completely wrong.
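As an illustration of why that would help, here's a toy pixel-count comparison (the patch and downscale sizes are my own made-up numbers, not Valve's actual encoder parameters):

```python
def pixels_per_frame(width, height):
    return width * height

# Naive full-res frame vs. a foveated split: a small full-res patch where
# the eye is looking, plus the whole frame encoded at quarter resolution.
full = pixels_per_frame(2160, 2160)
fovea = pixels_per_frame(512, 512)                  # high-detail gaze region
periphery = pixels_per_frame(2160 // 4, 2160 // 4)  # downscaled background
foveated = fovea + periphery

print(f"foveated frame is {foveated / full:.1%} of the full-res pixel count")
```

With those assumptions the encoder only has to compress and transmit roughly a tenth of the pixels per frame, which is the kind of headroom that makes high-refresh wireless streaming plausible.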
Where are you getting this number? I'm not seeing it on the specs page.
It's confusing right now because people on the Steam Machine post are commenting on the Frame, and vice versa here.
This is for the Steam Machine, not the headset. It's mentioned in the CPU & GPU section.
I am incredibly excited for the new controller. The OG Steam Controller for me was unmatched; I could never play any first-person game on anything other than mouse and keyboard, not to mention it allowed playing RTSes or point-and-clicks from the couch.
When they cancelled production I bought 8.
The controller looks pretty cool for sure; my biggest fear is the d-pad, though. I hope they go for a clicky feel like on the latest Xbox controllers, and not the mushy feel you get on the DualSense or even the 8BitDo Pro 2, which, for me, really is the only thing missing from those. I'm more of a "d-pad in the top left" kind of guy, but I want it to be clicky like on the Xbox controllers :( We'll see!
I'm with you on the d-pad. I've never found better d-pads outside of retro-focused controllers from companies like 8BitDo, so when I want to play a retro game with a d-pad I just grab one of those and use my Steam Controller for everything else.
A bit off topic, but I was wondering how much bigger the Steam Machine is compared to the Mac mini M4, since that's what I have and is my frame of reference. Obviously comparing apples to oranges and only talking about physical volume, not features, compatibility, price, personal preferences, etc.
Mac mini M4: 127 x 127 x 50 mm = 0.8 L
Steam Machine: 156 x 162 x 152 mm = 3.8 L
That's 4.76 times more volume.
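Checking the arithmetic, using the dimensions quoted above:

```python
def litres(w_mm, d_mm, h_mm):
    """Volume of a box given in millimetres, converted to litres."""
    return w_mm * d_mm * h_mm / 1e6  # mm^3 -> L

mac_mini = litres(127, 127, 50)        # ~0.81 L
steam_machine = litres(156, 162, 152)  # ~3.84 L

print(f"{steam_machine / mac_mini:.2f}x the volume")  # → 4.76x the volume
```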
> Obviously comparing apples to oranges
Or is it “comparing apples to steam engines”?
Given that Valve are the ones who released the Orange Box, methinks the original comparison is valid
It's only a little bigger than Mac Studio.
9.5 x 19.7 x 19.7 cm = 3,687 cm³
and half the size of my SFFPC @ 8.3L
> Frame is obviously the main headline here
Why? VR headsets are a dying fad of the 2020s. Way more excited for SteamOS on ARM.
... which likely wouldn't have happened if they didn't want to put a computer inside their VR headset. The Steam Machine is x86, given that it's an AMD processor.