Sister blog of Physicists of the Caribbean. Shorter, more focused posts specialising in astronomy and data visualisation.

Friday, 21 September 2018

M33 VR looking shiny and nice


Some major improvements to the M33 VR render. It's a low resolution video, but that really doesn't matter because the data is low resolution anyway. The much better colour scheme means you see a lot more detail in this one, and the data range now shows enough noise to give a better sense of depth. Plus it's just much prettier.

Unfortunately I forgot to render this in the .mp4 format required for YouTube so you'll probably still have to download this one, and it's only suitable for headsets. I'll try and get the YouTube version working next week.

I think this would be a very nice way to give a tour through a data cube. A full AGES data cube (e.g. https://www.youtube.com/watch?v=1YWGZhXe_gA) would be a lot of fun, but that would mean exceeding the number of image textures that Blender < 2.78 can handle. So either I reinstall Linux on my work machine, or I try and get the image sequences rendering as Cycles volumetrics instead of textured planes.
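For the curious, this is roughly what the textured-plane trick looks like in Blender's Python API: one semi-transparent, emissive plane per channel map, stacked along the velocity axis. This is only a minimal sketch assuming Blender 2.8+ node-based materials and a hypothetical directory of pre-exported slice images - it is not the actual FRELLED code, which targets older Blender versions (hence the texture limit above).

```python
# Minimal sketch: one textured, transparent plane per slice of a data cube.
# Assumes Blender 2.8+ and a hypothetical directory of channel-map images
# named e.g. slice_0000.png, slice_0001.png, ...
import os
import bpy

SLICE_DIR = "/path/to/slices"      # hypothetical location of the image sequence
SPACING = 0.1                      # distance between consecutive planes

files = sorted(f for f in os.listdir(SLICE_DIR) if f.endswith(".png"))

for i, fname in enumerate(files):
    # One plane per channel map, stacked along the z (velocity) axis.
    bpy.ops.mesh.primitive_plane_add(size=2.0, location=(0.0, 0.0, i * SPACING))
    plane = bpy.context.active_object

    # Simple emission + transparency material driven by the image.
    mat = bpy.data.materials.new(name="slice_%04d" % i)
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    nodes.clear()

    tex = nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images.load(os.path.join(SLICE_DIR, fname))

    emit = nodes.new("ShaderNodeEmission")
    transp = nodes.new("ShaderNodeBsdfTransparent")
    mix = nodes.new("ShaderNodeMixShader")
    out = nodes.new("ShaderNodeOutputMaterial")

    # Use the image brightness both as the emitted colour and as the mix
    # factor, so blank sky stays see-through and signal glows.
    links.new(tex.outputs["Color"], emit.inputs["Color"])
    links.new(tex.outputs["Color"], mix.inputs["Fac"])
    links.new(transp.outputs["BSDF"], mix.inputs[1])
    links.new(emit.outputs["Emission"], mix.inputs[2])
    links.new(mix.outputs["Shader"], out.inputs["Surface"])

    mat.blend_method = 'BLEND'     # EEVEE/viewport transparency; ignored by Cycles
    plane.data.materials.append(mat)
```

A few hundred planes keep the object count low while the images themselves carry the millions of voxels, which is why frames render in seconds rather than hours.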

4 comments:

  1. How do you do this!!?? I've done a few simulations of n-body gravitation. When I get up to about 2000 massive bodies it takes like 5 seconds to render 1 frame. And that would be on a 4 GHz machine.

  2. This is all in Blender. Blender is fine with a few thousand individual objects, though a few tens of thousands will be difficult. On the other hand the vertex count can be much higher - a few million is generally fine.

    The way this is rendered is to use a textured image plane for each slice of the data. That way a few hundred objects can hold the information from tens or even hundreds of millions of voxels. This one is rendering in ~20 seconds per frame, I think.
    http://www.rhysy.net/frelled-1.html

  3. Rhys Taylor - Read through some of your stuff. Very impressive! You really are a full-fledged Blender artist and scientist!

    For the simulation stuff I see you use Python almost exclusively. I've used C++ and JavaScript and they both seem to give about the same level of performance. In the C++ case I didn't really bother with graphics: I compute n time slices & then draw one pixel in an on-screen canvas to represent a planet. I've used WebGL with JavaScript, and at about 300 objects the n^2 thing gets me.

    I'm wondering if OpenCL would help? I'd hate to try it & find it's the same problem.

    Something I haven't tried is using a set of balanced trees, where I place strongly interacting bodies in a single node, carry out local interactions between the members within a node, & then do node-to-node interactions where each node and parent node is treated as a single object. The idea is to use parallel processing & the CPU cache (L1 & L2) on blocks of object descriptors as efficiently as possible, and to reduce the number of interactions to O(n log n).

  4. Jack Martinelli - Thanks!

    For simulations, all the computations are done externally (usually in Fortran). I use Blender's internal Python to display the output (and some analysis) but not to do the computations themselves. I'm not much of a theorist, so to an extent I use the simulation code as a black box and "just" have to visualise the output. The "gf" SPH code has treecode gravity, so it scales as n log(n) rather than n^2 (there's a rough sketch of the tree idea below the comments). IIRC this is also true for the gas in FLASH (which is a grid-based hydrocode) but not for its particles. I've never tried to write such a code myself; I think I managed a simple three-body numerical simulator many years ago but that's about it.

    The advantage of using Blender or some other dedicated software is that all the difficult visualisation stuff is already taken care of. If all you need to do is display particles, I would highly recommend an approach like this. I believe there's code somewhere on my website (the particle display code is included in FRELLED but I think there's still a standalone script somewhere).

    Though I would have thought that for displaying simple particles, the speed limitation is going to be the gravitational computations rather than the visuals. For example, this test-particle JavaScript code has no problem displaying a few thousand particles:
    http://portia.astrophysik.uni-kiel.de/~koeppen/JS/GalaxyViewer.html

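For anyone wanting to experiment with the tree idea discussed in the comments, here is a minimal 2D Barnes-Hut-style sketch in Python. Each quadtree node stores the total mass and centre of mass of everything inside it; distant nodes are treated as single point masses, bringing the force calculation down to roughly O(n log n). This is purely illustrative - it is not the gf or FLASH treecode, it assumes distinct particle positions, and it does no time integration.

```python
# A minimal 2D Barnes-Hut sketch: distant groups of bodies are approximated
# by their total mass at their centre of mass, giving ~O(n log n) force sums.
import numpy as np

class Node:
    """A square patch of space holding one body or four child patches."""
    def __init__(self, cx, cy, half):
        self.cx, self.cy, self.half = cx, cy, half   # centre and half-width
        self.mass = 0.0
        self.com = np.zeros(2)        # centre of mass of everything inside
        self.body = None              # (mass, position) if this is a leaf
        self.children = None

    def insert(self, m, pos):
        if self.children is None and self.body is None:
            self.body = (m, pos)                      # empty leaf: keep body here
        else:
            if self.children is None:                 # occupied leaf: subdivide
                self._split()
                old_m, old_pos = self.body
                self.body = None
                self._child_for(old_pos).insert(old_m, old_pos)
            self._child_for(pos).insert(m, pos)
        # keep running totals for the point-mass approximation
        self.com = (self.com * self.mass + pos * m) / (self.mass + m)
        self.mass += m

    def _split(self):
        h = self.half / 2.0
        self.children = [Node(self.cx + dx * h, self.cy + dy * h, h)
                         for dx in (-1.0, 1.0) for dy in (-1.0, 1.0)]

    def _child_for(self, pos):
        return self.children[2 * int(pos[0] > self.cx) + int(pos[1] > self.cy)]

    def accel(self, pos, theta=0.5, eps=1e-3, G=1.0):
        """Gravitational acceleration at pos from everything in this node."""
        if self.mass == 0.0:
            return np.zeros(2)
        d = self.com - pos
        r = np.sqrt(d @ d) + eps                      # softened distance
        # Far-away (or single-body) nodes act as one point mass; otherwise recurse.
        if self.children is None or (2.0 * self.half / r) < theta:
            return G * self.mass * d / r**3
        return sum(c.accel(pos, theta, eps, G) for c in self.children)

# Usage: build the tree once per timestep, then query it for each body.
rng = np.random.default_rng(0)
positions = rng.uniform(-1.0, 1.0, size=(2000, 2))
masses = np.ones(len(positions))

root = Node(0.0, 0.0, 1.0)
for m, p in zip(masses, positions):
    root.insert(m, p)
accelerations = np.array([root.accel(p) for p in positions])
```

The per-body queries in the final line are independent of each other, which is where the parallel processing mentioned above would naturally slot in.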
