Sister blog of Physicists of the Caribbean. Shorter, more focused posts specialising in astronomy and data visualisation.

Friday 22 February 2019

Visualising isosurfaces in Blender for funzies

Most of the astronomy data sets I work with are three dimensional. Sometimes I use numerical simulations where we have all three spatial dimensions, but most of the time I use radio telescope data where the third axis is velocity. That doesn't mean it isn't useful to visualise in 3D though, as long as we're careful not to confuse it with true distance.

Finding ways to visualise the full volumetric data has been a long-term project of mine. It's relatively easy to show small parts of the data, but showing hundreds of millions of voxels is more of a challenge. Especially if you want to do it in realtime. I found a way to do this ages ago and have been banging on about it ever since.


Various techniques for displaying more limited aspects of the data have been around for even longer. For example, you could clip all the voxels above and below some brightness level and greatly reduce the amount of data you need to display. While useful for analysis, this has always worried me in terms of the discovery side of things : if you want to find what's in your data, ideally you don't want to chuck any of it away. The problem is that you've got to display things in a sensible way : just because something is present doesn't mean it will be visible. The more data you have to display, the harder this is. And as I've been discovering recently, sometimes the best way to find new sources is through good old-fashioned (3D) contour plots.
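The clipping approach can be sketched in a few lines of numpy. This is just an illustration on a made-up noise cube, not the actual reduction pipeline : the threshold of three times the rms is an assumption for the example.

```python
import numpy as np

# Hypothetical data cube: (velocity, y, x) voxels of brightness values.
rng = np.random.default_rng(42)
cube = rng.normal(loc=0.0, scale=1.0, size=(50, 64, 64))

# Keep only voxels above some brightness level (here 3x the noise rms),
# discarding everything fainter - the clipping technique described above.
rms = cube.std()
mask = cube > 3.0 * rms
clipped = np.where(mask, cube, 0.0)

print(mask.sum(), "of", cube.size, "voxels survive the cut")
```

The surviving fraction is tiny, which is exactly why clipping speeds up display - and also why it risks throwing away the faint stuff you wanted to discover.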

Part of a hydrogen data cube from the Virgo cluster. Each stringy blob is a galaxy, with each contour being at the same brightness level but in different slices of the data.
What we see is that in some sources all the contours are circular, but in others some contours have little extensions. Because these extensions tend to be very faint, they're much harder to spot in the full volumetric data, whereas they stand out very clearly in contour plots - making this a great way to find things we couldn't see otherwise.

But these simple contours are crude. As you can see by the stringy appearance above, they plot each slice of the data independently. It's like the sort of 3D models you might make in school : kinda cool if they're done well, but not really ideal. It would definitely be preferable to plot true isosurfaces of constant brightness levels, which would smooth everything out.


That's what I've been experimenting with today. Plotting contours is relatively easy - I just use the standard matplotlib package to generate the lines, then extract these into Blender* and extrude them to create the thickened lines you see above. Interpolating the missing data between each slice to fill in the missing surface, though... that seems much harder.

* Saving the plots as PostScript files, which store the positions of each vertex in a convenient ASCII format.
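If you'd rather skip the PostScript step, matplotlib will hand you the contour vertices directly. A minimal sketch, using a hypothetical Gaussian blob as a stand-in for one slice of a data cube :

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt

# Hypothetical 2D slice of a data cube: a Gaussian blob as a stand-in source.
y, x = np.mgrid[-3:3:64j, -3:3:64j]
slice2d = np.exp(-(x**2 + y**2))

# Draw a single contour at a chosen brightness level...
level = 0.5
cs = plt.contour(slice2d, levels=[level])

# ...then pull out the vertex positions of each contour line. These (x, y)
# points, with the slice's channel number as z, are what ends up in Blender.
segments = cs.allsegs[0]          # list of (N, 2) vertex arrays for this level
verts = [np.array(seg) for seg in segments]
print(len(verts), "contour line(s);", verts[0].shape[0], "vertices in the first")
```

Writing those vertices to a text file gives you much the same ASCII point list as the PostScript route, without any file-format parsing.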

There is in fact code for Blender to convert point clouds into surfaces, which I tried a while back without getting nice results. I've finally figured out what I was doing wrong. First, the object scale needs to be applied, otherwise the skinning distance becomes meaningless. Second, the points need to be evenly sampled and reasonably dense, otherwise the poor thing can't work out how to join them up. The main problem with my data is that it varies strongly from slice to slice (channel to channel, for radio enthusiasts), leaving huge gaps in the resulting meshes. I found I could sometimes get around this by scaling down the mesh in the z-axis in edit mode, but this only helped a little. A better method was to interpolate extra channels so the data is sampled at more points, making it easier for the script to fit a true surface.
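The channel interpolation itself is straightforward - one way is a linear zoom along the velocity axis only, leaving the spatial axes untouched. A sketch with scipy on a hypothetical small cube (the 4x upsampling factor is just an example) :

```python
import numpy as np
from scipy.ndimage import zoom

# Hypothetical cube: 10 velocity channels of a 32x32 image.
rng = np.random.default_rng(0)
cube = rng.random((10, 32, 32))

# Interpolate 4x as many channels along the velocity (z) axis only.
# Denser sampling between slices makes it easier for a skinning script
# to join the contours up into a surface.
upsampled = zoom(cube, zoom=(4, 1, 1), order=1)  # order=1 : linear interpolation

print(cube.shape, "->", upsampled.shape)
```

Remember this is purely cosmetic : the interpolated channels contain no new information, they just stop the skinning code from leaving gaps between widely-spaced slices.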

The first data set I tried this on was of a complex of HI clouds we're currently studying in Virgo, and I'd rather wait until the paper is submitted before showing that. The second one I tried was the older data of M33. This is a huge, very bright source so you see lots of different structures at different intensity levels, making it an ideal test for nested isosurfaces.

Full volumetric data. M33 is in the centre. The other stuff you can see around it is emission from the Milky Way.
Rendering the contours as thin lines shows a lot more of the detail, and also makes it easier to cut out the foreground Milky Way clouds. Here the contours are all at the same flux value, with each one at a different velocity channel.
Joining the contours forms a true isosurface. There are quite a few artefacts, however, as Blender isn't able to work out the normal directions for all the faces correctly.
Finally, we can do the same at a series of different intensity levels and make the isosurfaces transparent to show the full range of the different structures.
This is still at proof-of-concept stage. I had to modify the original code to only plot line contours rather than the surface meshes, manually run the interpolation code and scale down the meshes to get everything to work. So there's currently far too much manual intervention to use this for any kind of analysis, and the meshes aren't all that clean either. Getting this into a scientifically useful product will take a lot more work, but it's fun to play with and certainly presentation-worthy if not science ready. What might be especially fun for outreach is to upload the models for online viewing, which is MUCH easier for surfaces than for volumetric data.
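For comparison, the standard way to extract true isosurfaces directly from a cube - rather than skinning per-channel contours - is marching cubes. This isn't the method used above, but scikit-image provides it, and the resulting mesh can be exported (e.g. as PLY or OBJ) and opened in Blender. A sketch on a hypothetical Gaussian blob :

```python
import numpy as np
from skimage import measure

# Hypothetical cube: a 3D Gaussian blob as a stand-in for a bright source.
z, y, x = np.mgrid[-2:2:40j, -2:2:40j, -2:2:40j]
cube = np.exp(-(x**2 + y**2 + z**2))

# Extract a single isosurface of constant brightness. Running this at a
# series of levels gives the nested surfaces shown above.
verts, faces, normals, values = measure.marching_cubes(cube, level=0.5)
print(len(verts), "vertices,", len(faces), "faces")
```

Marching cubes also computes consistent face orientations, which would sidestep the normal-direction artefacts that Blender's skinning produces.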


EDIT : Now uploaded as an interactive Sketchfab model !

