This is going to be a very strange post in which I describe pretty pictures but don't actually show you what I'm looking at. Why? Because I'm using a VR headset, and I can't yet get the final media into a shareable format.
I now have a computer capable of VR. This is almost as big a jump as getting the standalone Quest headset itself, because the graphical capabilities of the PC far exceed the high-end smartphone level of the headset alone.
One of the first things I tried was to examine the tiny handful of models I've uploaded to Sketchfab. On the Quest by itself, these are barely functional. The framerate and/or tracking are lousy, and though you can get the general idea, the experience is unpleasant. Not so with the PC, which easily handles much more complex models than these. So I can walk around my model as though it was actually there in my living room. I can even interactively rescale it with the thumbsticks. But of course, using Sketchfab isn't very convenient, especially given the pathetic limitations imposed on uploads.
That's where FRELLED comes in*. I used Blender 2.79 for this, partly out of ignorance. When I first looked at Eevee, back during the early test builds for 2.80, it wasn't up to much. Textures loaded slowly, and even simple tests of FRELLED were unsuccessful, with the view essentially re-rendering whenever anything at all changed. That completely breaks the main benefit of FRELLED, which is that the view should update instantaneously (that is, at > 25 fps) and in real time. Since Blender 2.8+ lacked any other realtime capability at the time, I stuck with 2.79.
* I've now managed to confirm that this works both on Windows and Mac. There's still quite a bit to do, but it's getting closer and closer to being released into the wild.
But nowadays Eevee is massively more powerful. For scenes as simple as those FRELLED constructs, Eevee renders in true real time, not the pseudo-realtime of before. It doesn't have the problem of transparent materials needing to be ordered correctly, and - maybe best of all - you can adjust the brightness and contrast of materials in real time as well. That makes it a dramatic, wholesale improvement over Blender 2.79's capabilities, not a poor substitute with a few fringe benefits.
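Concretely, here's a minimal sketch of what such a material looks like in the Blender 2.9x Python API - the node layout and names are illustrative assumptions, not FRELLED's actual internals:

```python
import bpy

# Illustrative sketch (Blender 2.9x API): an alpha-blended Eevee material
# whose brightness/contrast can be tweaked live in the viewport.
# "SliceMaterial" is a placeholder name.
mat = bpy.data.materials.new(name="SliceMaterial")
mat.use_nodes = True
mat.blend_method = 'BLEND'  # Eevee alpha blending; no manual depth sorting

nodes, links = mat.node_tree.nodes, mat.node_tree.links
tex = nodes.new("ShaderNodeTexImage")       # e.g. a FITS channel map
bc = nodes.new("ShaderNodeBrightContrast")  # live brightness/contrast control
links.new(tex.outputs["Color"], bc.inputs["Color"])
links.new(bc.outputs["Color"], nodes["Principled BSDF"].inputs["Base Color"])
links.new(tex.outputs["Alpha"], nodes["Principled BSDF"].inputs["Alpha"])

bc.inputs["Bright"].default_value = 0.2     # changes appear instantly in Eevee
bc.inputs["Contrast"].default_value = 0.5
```

Change those last two values and the viewport updates immediately - no re-rendering, no re-ordering of the transparent geometry.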
Unfortunately, the change in the Python API from Blender 2.79 to 2.8+ is significant enough that I can't just directly convert everything. So FRELLED version 5 is already looking obsolete compared to a planned version 6, though, mercifully, the conversion will be nowhere near as drastic as the upgrade from Blender 2.49 (essentially a complete re-write of all 11,000 lines of code). But there's one capability of Eevee which I simply had to try out: virtual reality.
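For a flavour of why the conversion isn't mechanical, here are a couple of typical renamings (illustrative examples only - the real differences run much deeper than this):

```python
import bpy

obj = bpy.data.objects.new("Example", None)  # any object will do

# Blender 2.79 idioms (no longer valid):
#   bpy.context.scene.objects.link(obj)
#   obj.select = True
#   bpy.context.scene.objects.active = obj

# Blender 2.8+ equivalents:
bpy.context.collection.objects.link(obj)
obj.select_set(True)
bpy.context.view_layer.objects.active = obj
```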
Getting this to work was remarkably easy. I started by importing isosurfaces and adding a few lights. Then I plugged in the headset, enabled Oculus Link, and hit "start VR session" in Blender. And it just worked. I had a greyscale surface of M33 floating in front of me that I could walk around. Or at least partway around, given the limited length of the cable (I've got the wireless version of Virtual Desktop running, but so far that only works with SteamVR and I haven't figured out how to run it with the Oculus software yet).
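For anyone wanting to try this, the relevant feature is Blender's bundled "VR Scene Inspection" add-on; assuming I have the 2.9x module and operator names right, it can also be driven from Python:

```python
import bpy

# Enable the bundled "VR Scene Inspection" add-on and toggle a session.
# Requires an OpenXR runtime (here, the Oculus one via Link) to be running.
bpy.ops.preferences.addon_enable(module="viewport_vr_preview")
bpy.ops.wm.xr_session_toggle()  # same as pressing the start/stop VR session button
```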
From there it was a simple matter of playing with Eevee's materials to get something shinier. For my purposes I barely need any lights - I can do everything with the material preview. In an hour or so I had this floating in front of me. I could even stick my head inside it, though Blender gets sluggish if I do.
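The sort of tweak involved is just nudging Principled BSDF inputs; a hedged example, with a made-up object name standing in for the imported isosurface:

```python
import bpy

# Make an isosurface shiny: metallic with low roughness. Assumes the
# object exists and already has a node-based material.
mat = bpy.data.objects["M33_isosurface"].active_material
bsdf = mat.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Metallic"].default_value = 0.9
bsdf.inputs["Roughness"].default_value = 0.15
```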
Now of course that needs some more effort to come up with nicer materials, but the proof of concept is solid. I was so impressed by how well this worked that I began to wonder if it might even be possible to do the full volumetric display of FRELLED. And soon I found that yes, yes it is. I wrote a couple of short Python scripts to automate most of the process. So now I get to see M33 in its full volumetric glory, rendered as a cube about half a metre across that I can walk right around.
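The scripts themselves are short. This isn't the actual FRELLED code, but a sketch of the general idea - one alpha-blended plane per channel map, stacked into a cube (the slice count, size and file paths are placeholders):

```python
import bpy

# Stack N textured planes along the z axis to form a volumetric data cube.
# Assumes the channel maps are pre-exported as slice_000.png, slice_001.png, ...
N, size = 100, 0.5  # number of channel slices, cube size in metres

for i in range(N):
    bpy.ops.mesh.primitive_plane_add(size=size,
                                     location=(0, 0, (i / (N - 1) - 0.5) * size))
    plane = bpy.context.active_object
    mat = bpy.data.materials.new(name=f"slice_{i:03d}")
    mat.use_nodes = True
    mat.blend_method = 'BLEND'   # transparent slices, no depth sorting needed
    tex = mat.node_tree.nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images.load(f"/data/slices/slice_{i:03d}.png")  # placeholder path
    bsdf = mat.node_tree.nodes["Principled BSDF"]
    mat.node_tree.links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])
    mat.node_tree.links.new(tex.outputs["Alpha"], bsdf.inputs["Alpha"])
    plane.data.materials.append(mat)
```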
The main limitation appears to be proximity. From around 0.5 m away the frame rate is very good. Get much closer, though, and it drops sharply. You can stick your head inside, but it's not much fun. I'm not sure why this is, but I guess it's a limitation of Blender. Still, even this is more than sufficient for outreach.
And in some ways this is even simpler than the old process. Blender 2.79 had problems ordering materials, so that transparent surfaces weren't rendered correctly from behind. This meant an elaborate series of forward and reverse images with a background script deciding which ones to show based on the viewing angle. This isn't necessary in 2.91, where I can just show everything at once. I might eventually use a simpler version of the script to deal with larger data sets, but for small ones it isn't needed at all.
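For the record, the gist of that old angle-dependent script, rewritten in modern API terms purely for illustration (the collection names are hypothetical):

```python
import bpy
from mathutils import Vector

# Show the "forward" or "reverse" image stack depending on which side of
# the slice planes the camera sits. In the 2.79 version this ran every
# frame; here it's hooked to frame changes just as an illustration.
def update_slice_visibility(scene, depsgraph=None):
    cam_pos = scene.camera.matrix_world.translation
    normal = Vector((0.0, 0.0, 1.0))      # slice planes face +Z
    in_front = cam_pos.dot(normal) > 0.0  # camera on the +Z side of an origin-centred cube?
    for obj in bpy.data.collections["forward_slices"].objects:
        obj.hide_viewport = obj.hide_render = not in_front
    for obj in bpy.data.collections["reverse_slices"].objects:
        obj.hide_viewport = obj.hide_render = in_front

bpy.app.handlers.frame_change_post.append(update_slice_visibility)
```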
Is there any practical, scientific benefit to this though? Honestly, I dunno. Personally I think the more visualisation techniques we have access to, the better. When you actually see it, I don't think there's any question that this is an inherently better viewing experience than a flat screen. The greater immersion helps you focus on features you might never have noticed otherwise (though I won't really know this until it's developed enough to use in anger). Granted, VR could still benefit from lighter, cheaper headsets with higher resolution, but this was once true of television as well. If there's one piece of tech that's come closest to the sci-fi predictions of the last few decades, then VR is surely a leading contender.
Of course, at this stage it's nice for outreach but useless for science. Still, the current technology appears adequate to the point that it's the software which is now the chief bottleneck. Greater native integration of VR hardware in Blender would be nice, though I'd prefer some format which could be easily shared online without having to give away the actual .blend file*. But it's entirely feasible to conceive of sticking on a headset and doing all the standard analysis in VR, with negligible additional effort compared to using an ordinary monitor. In fact it's already possible, in that this could be developed on a timescale of weeks or months - certainly not years.
The only major practical issue, though, may not be the hardware so much as the space requirement. We're not going to be giving up 2D screens any time soon. Whether anyone will feel that the capabilities of VR are so beneficial as to give it dedicated areas (at least in astronomy) is something we're just going to have to find out by experiment. Personally I think it's something well worth exploring.