Programming in VR

Published Fri Feb 09 2024 · Updated Sat Feb 10 2024

With the Vision Pro becoming available this month, I have been thinking about VR more. I got an Oculus Quest 3 in December and have had fun playing games with it, but I am curious about using it for more regular computer tasks. Virtual displays have a lot of potential to offer a great developer experience, so today I decided to write some code (and this post) from inside my Quest headset.

Virtual Desktop Apps

On the Quest I first tried the built-in browser. It worked better than expected, especially without controllers, but was obviously limited since I could only use a browser. From there I tried Meta’s Horizon Workrooms. This app was pretty cool; I really liked the virtual keyboard it overlaid on my laptop’s physical keyboard, so I could find the keys even though the headset hides them. I could see this app being great for meetings: the meeting room setup was awesome, with a shared projector screen and whiteboarding tools. Positioning screens, however, did not feel very good in Horizon.

Next I tried Immersed, another free app on the Quest store. It could be my anti-Meta bias, but I find Immersed to be the better user experience. Overall the app is certainly a little less polished, but the hand menus and monitor placement are more intuitive. I also liked the 3D space I worked in more; Immersed has a larger catalog of environments, though some appear to be premium.

Neovim

A keyboard-only workflow is incredible in VR. You could be productive with a mouse as well, but it is so easy to get work done with my hands glued to the home row. Working this way has made me reconsider what my dream productivity office would look like: a wireless keyboard in my lap and a comfortable headset are all I want and need.

Stacked monitors in VR

I am enjoying the stacked display layout shown above. This screenshot was taken while lying back on my couch; using VR lets you place screens comfortably in any environment.

Rough Edges

  1. Screen resolution on my Quest is good, but not great. While working there is also a small but noticeable delay, which is to be expected over a wireless connection. Plugging the headset into my computer may be a worthwhile compromise, especially since I sit while working.

Update: Immersed has a “low latency mode” that improved my input delay. On my M2 MacBook, three displays caused freezing, but two run smoothly.

  2. I experienced eye strain after only about an hour of work. That could be because I have not previously been a heavy VR user, but I believe this comfort issue applies broadly for now.

  3. My ideal productivity setup for VR uses hand controls instead of physical controllers. Typing on a keyboard while in hand control mode occasionally results in grabbing or clicking something I didn’t mean to. Hand tracking is not the most reliable; I have many moments that remind me of trying to aim a Wiimote, but with my hands.

Conclusion

I am still skeptical about whether this is “the future” of development and productivity environments. In the meantime, it is a great option for portable productivity.

Long work sessions do not feel feasible for me right now, though maybe I will adapt with time. For now we are still emulating 2D screens in a 3D environment, and I’m interested to see whether we get any VR-specific developer tools or environments. Whatever the next step beyond virtual screens turns out to be, it will be cool!