Sister blog of Physicists of the Caribbean. Shorter, more focused posts specialising in astronomy and data visualisation.

Wednesday 26 October 2016

The MDAR is completely normal in standard cosmology

A New Combatant Joins The Cat Fight And Tries Very Hard To Be Nice To Everyone

Well, at least this one is a respectable paper (submitted but not yet accepted) and not an angry rant.

To recap, recently it was shown that there's a very strong relation between the density of normal matter in a galaxy and its rotational speed. This has made a lot of people very angry, and was widely regarded as a bad move.

... or for those of you not following this regularly, it's a problem because the dynamics of galaxies - how fast they rotate and suchlike - should be dominated by their dark matter. Normal matter only makes up a few percent of their mass, so it shouldn't be able to affect their rotation speed much at all. That is, if you accept dark matter in the first place. The main alternative is that our understanding of gravity (or even dynamics in general) is fundamentally flawed. Modified gravity theories like MOND can explain this "mass discrepancy-acceleration relation" (the MDAR of the title) very well without dark matter, whereas it's not obvious whether the standard model can account for it.

Recently there was a paper submitted showing that actually, when you account for all the complex physics of the normal matter, yes you can. But there was a problem. That paper only used 18 simulations of galaxies, all of similar mass, so it didn't show whether the relation could also be reproduced for smaller galaxies, as the observations require. Nor did it do much to refute MOND, which works over a huge range of masses. Unfortunately Milgrom, who came up with the idea of MOND, took the results very personally and wrote a nasty public letter pointing out the flaws with rather more force than is generally necessary.

This new paper improves things with a much larger sample of simulated galaxies, 150-200 or so, which are re-simulated using varying parameters for the so-called "sub-grid physics". Essentially some aspects of the physics (like how much energy stars inject into the gas) occur on scales too small to resolve directly in the simulation, so they have to be manually calibrated. Since this calibration is uncertain, they perform simulations with a range of possible values. And their dark matter masses vary by a factor of at least 1000, so it should be good enough to address Milgrom's, err, concerns. I'm not sure whether the baryon/dark matter ratio varies sufficiently, though I suspect it does.

What they find is that yes, again, you can reproduce this relationship just fine with standard cosmology. Varying the complex sub-grid physics doesn't make much difference - in this paper it's just a natural scaling relation rather than anything more complex. True, this doesn't rule out MOND (and this paper is much nicer in tone, stating that the possibility of MOND should be taken seriously), but it seems that this test just isn't useful to choose between MOND and the standard model.

It will be interesting to see the response of McGaugh, who first raised the issue with his silly claim of having discovered "a new law of nature". Now it looks increasingly likely that there's nothing particularly deep or profound about this observation after all.
https://arxiv.org/abs/1610.07663

Tuesday 25 October 2016

A gloriously angry rant about the MDAR

The Cat Fight Continues

Recently a paper by McGaugh et al. described a tight correlation between the density of normal matter in galaxies and how fast they rotate. They claimed, quite correctly, that this is a challenge for standard models, which (naively) seem to predict that rotation speed depends on dark matter content. Normal matter (in standard cosmology) makes up only a small fraction (< 10% or so) of a galaxy's total mass, so it can't play much of a role in setting rotation speed.

Unbeknownst to me, the McGaugh result was immediately and sharply criticised by Milgrom, the creator of Modified Newtonian Dynamics - a theory which modifies gravity as an alternative to dark matter. Unfortunately, instead of attacking some of the genuine silliness in the McGaugh paper ("We've discovered a new law of nature !" - yeah, sure, whatever) Milgrom instead vents his wrath upon the fact that McGaugh isn't supporting MOND strongly enough. Which McGaugh is quite right not to do. The McGaugh paper is silly to play up the importance of the results so strongly, but quite correct and careful to emphasise that there are different interpretations of the results.

Then there was that paper on Friday by Keller & Wadsley which claimed that this result can be explained in standard models after all. This new, umm, response by Milgrom challenges the challenges. But this article (on astro-ph only, not submitted to any journal) degenerates in its first line from merely being a bit dramatic (as McGaugh was) to downright unprofessional : "Keller and Wadsley (2016) have smugly suggested..."

Sigh. Milgrom has some good points, but unfortunately it's a ludicrous over-reaction to a challenging paper. "They then jump to the conclusion that ΛCDM is “fully consistent” with..." No, they don't. "And so, by further unwarranted extrapolation, they seem to imply...." Come on. You can't write "they seem to imply" in a professional article. We've reached blog-esque ranting here, not a serious rebuttal.

However, Milgrom does have a very good point that the K&W paper only simulates 18 galaxies of similar masses, whereas the problematic relation described in McGaugh covered 153 galaxies spanning a huge range of masses. According to Milgrom, the reason the K&W paper gets a good result is that it only deals with particles in a very particular acceleration regime, which is quite different to what you'd find in low-mass galaxies (MOND's predictions are more complicated - acceleration doesn't just depend on total mass). So if they included dwarf galaxies they might get a totally different result. This is a valid criticism, though it does not follow that they definitely would get a different result. Worse though, "it could be a result of various adjustments in the simulations over the years, which tended to make them look, in some restricted regards, like observed galaxies."

Ouch.

Milgrom makes a further, and in my view totally ridiculous, criticism that K&W try to simulate the evolution of galaxies. Yes, he really criticises them for this, as though trying to simulate galaxy evolution were a hopeless and silly endeavour : "The simulation in question attempt to treat very complicated, haphazard, and unknowable events and processes taking place during the formation and evolution histories of these galaxies". Oh come off it. This is true regardless of what theory of gravity you adopt - you have to make some guesses and assumptions in order to make progress !

Cat-fighting aside, I think there are some valid points here - but the sharp response to an unpublished, un-refereed paper was just plain silly and unnecessary. We certainly haven't seen the last response to this - watch this space.
https://arxiv.org/abs/1610.07538

Sunday 23 October 2016

Geometrical Jiggery-Pokery


More fun with the all-sky HI data... sort of.

Recently I posted what the data looks like if you assume velocity is the same as distance. Of course it isn't really, but transforming it into true distance isn't so easy.

The data from the telescope consists of maps of the sky, each one at a slightly different line-of-sight velocity. Knowing the position and velocity of each point in the map, it's possible to convert this into distance with some fairly ugly trigonometry.

There are some intrinsic limitations to this that can't be avoided. For instance, if the gas is closer to the centre of the Galaxy than the Sun, the equations give two equally valid solutions and there's no easy way to decide which is correct. So that data has to be chucked out. Another is that if you're looking directly towards or away from the Galactic centre, you don't get meaningful velocity information - we can only measure velocity along our line of sight, but at those angles the gas is moving entirely across the sky except for some small random motions. So data at angles close to the centre and anti-centre needs to be thrown out too.
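For anyone wanting to try this themselves, here's a rough sketch of the usual flat-rotation-curve version of the conversion (in Python, for gas in the Galactic plane only; the values of R0 and V0 are standard-ish assumptions, and the real calculation is messier than this) :

```python
import numpy as np

R0 = 8.5    # kpc, assumed Sun - Galactic centre distance
V0 = 220.0  # km/s, assumed (flat) Galactic rotation speed

def kinematic_distance(l_deg, v_lsr):
    """Possible line-of-sight distances (kpc) for gas at Galactic
    longitude l_deg with line-of-sight velocity v_lsr (km/s),
    assuming a flat rotation curve and gas in the plane (b = 0)."""
    l = np.radians(l_deg)
    sinl = np.sin(l)
    if abs(sinl) < 1e-3:
        return []  # towards centre/anti-centre : no velocity information
    # Galactocentric radius of the emitting gas
    R = R0 * V0 * sinl / (v_lsr + V0 * sinl)
    if R <= 0:
        return []  # "forbidden" velocity - unphysical for this rotation curve
    disc = R**2 - (R0 * sinl)**2
    if disc < 0:
        return []  # velocity beyond the tangent point
    root = np.sqrt(disc)
    near = R0 * np.cos(l) - root
    far = R0 * np.cos(l) + root
    if R < R0:
        # inside the solar circle : near/far ambiguity, two valid answers
        return [d for d in (near, far) if d > 0]
    return [far]   # outer Galaxy : unique solution
```

Both failure modes described above fall out naturally : sin(l) → 0 towards the centre and anti-centre kills the velocity information, while R < R0 (gas inside the solar circle) gives the two equally valid near/far solutions.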

Then there's the problem of how to display the data. Previously I tried to convert the raw velocity cube to a distance cube by applying the equations to each pixel in the original data. So knowing position, velocity and intensity of any point gives a corresponding position, distance and intensity in the new data cube.

This creates an extra problem. Although the original data is fully sampled (that is, each map of the sky at each velocity channel is complete - every pixel has measured values), that isn't automatically the case for the output distance cube. To simplify, imagine that position x,y corresponds to i,j in the new data set. The problem is essentially that position x+1,y+1 corresponds to something like i+2,j+2 - there are gaps. You could try interpolating the missing values, but it's not so easy - and you still lose data, because in some regions it turns out that multiple points in the original coordinates correspond to the same point in the new coordinates. You need the resolution to be adaptive.
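The problem is easy to demonstrate with a toy one-dimensional version : push a regular grid of input pixels through any nonlinear mapping (the power law here is purely illustrative, not the real velocity-to-distance equations) and count the output cells that end up empty or multiply-occupied.

```python
import numpy as np

x = np.arange(100)                # input pixel coordinates
y = (x ** 1.5 / 10).astype(int)   # a hypothetical nonlinear transform
hits = np.bincount(y)             # input pixels landing in each output cell

gaps = int(np.sum(hits == 0))     # output cells left with no data at all
pileups = int(np.sum(hits > 1))   # output cells where inputs collide
```

Where the transform stretches the grid you get gaps, and where it compresses the grid several input pixels pile into one output cell - exactly the two kinds of data loss described above.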

Which is where this funky geometry comes in. What we have here are a series of planes in Blender, each one corresponding to a different velocity channel. By transforming the vertices to the corresponding distance, the faces between them automatically interpolate the missing regions. The animation just overlays each successive velocity channel converted into distance.

.... or at least that's the idea. I'm really not sure if this is working correctly. The funky shape might be a consequence of the non-trivial equations, or I might have set something up wrong. It's very hard to visualise what the equations look like in 3D, which is what this is supposed to help with. This requires further thought, but it's quite nice to watch (the flickering dark shadows are Blender rendering artifacts).

Saturday 22 October 2016

LAB versus HI4PI


You've probably seen the press release going around about the "most detailed map of the Milky Way". Strictly speaking it's the most sensitive, highest-resolution all-sky HI survey of the Milky Way, but I think we can forgive the simplified headline in this instance.

Here's a little comparison between the previous Leiden/Argentine/Bonn survey, the new HI4PI survey, and the GALFA-HI survey. It's a bit crude - HI4PI* is definitely better than LAB, but I probably haven't done it justice. More details would probably show up if I chose better settings. Still, GALFA-HI is clearly still far superior in terms of resolution.

* I can't help but read this not as "4 pi", as in 4 pi steradians as it's intended, but "for P. I." as in "principal investigator"...

I was hoping to use the new data to remake the Hydrogen Sky project - last time I relied heavily on GALFA-HI, because the low resolution of LAB doesn't give such nice results. The problem with GALFA is that it's an Arecibo survey, so it has fantastic resolution and sensitivity but limited coverage. Which makes it rather tricky to find photographs of the correct orientation to overlay the data with any kind of accuracy.

Although HI4PI is better than LAB, I'm not sure it's worth the effort. The LAB data is only 250 MB, while HI4PI is 32 GB (higher spatial and velocity resolution, and 32-bit data files instead of 16-bit). I'm not even sure my computer (even the one at work) could load a 32 GB file into memory, and the improvement wouldn't be that dramatic. That's not to say the survey isn't much better for science - just not so much better for visualisation, unfortunately.
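If I do ever tackle the full cube, one option would be to memory-map the file rather than load it outright, so only the slices actually accessed get read into RAM. A toy sketch (the tiny stand-in file and its shape are made up for illustration; for the real HI4PI FITS files, astropy's memmap option does the same job) :

```python
import numpy as np

# Write a tiny stand-in cube to disk; the real thing would be the
# (velocity, latitude, longitude) cube, far too big for memory.
shape = (10, 64, 64)
np.arange(np.prod(shape), dtype=np.float32).reshape(shape).tofile("toy_cube.dat")

# Memory-map it : the OS pages in only what gets touched.
cube = np.memmap("toy_cube.dat", dtype=np.float32, mode="r", shape=shape)
channel = np.array(cube[5])   # pull out a single velocity channel
```

This way you could render the cube channel-by-channel without ever holding all 32 GB at once, though whether that's worth the hassle is another matter.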

Friday 21 October 2016

FRELLED version 4.3

It's been over a year since the last update and there aren't that many changes. Still, here it is...
http://www.rhysy.net/Resources/FRELLEDArchive/v4/20161021/4.3ReleaseNotes.txt
http://www.rhysy.net/frelled-1.html

Let battle commence !

I predict a scientific cat fight on arXiv in the next few days.

A few weeks ago there was a claim that the rotation curve of galaxies scales very precisely with the density of the observable matter. This isn't a new result (see point 2 here) but the new paper apparently confirms it with much greater precision, so it can't be dismissed as an observational error. As an observer, I'm skeptical of that because errors are always larger than people claim.

But let's assume it's true. If so, it's no doubt very interesting. There's no obvious reason why the density of the observable matter should correlate so well with the rotation curve. Density of normal matter should make very little difference to how fast it's rotating around the centre of the galaxy since that should be dominated by the much greater mass of the dark matter. And it's not obvious why the dark matter should have strong enough density variations to cause wiggles in the rotation curve anyway - especially drops in the rotation speed, which are really hard to explain if the dark matter is the smooth distribution of particles it's supposed to be.

So could the whole dark matter edifice just be wrong ? Sure, that's always an option - but it's not a very likely one. The reason I don't often comment on major press releases is because they're invariably overblown, and this case was a particularly bad example ("...argue that the relation they've found is tantamount to a new natural law." - oh please, seriously ?). And a mere month later, this paper challenges those claims very strongly.

The problem is that even without knowing the details, galaxy formation is messy. Gas can be heated by collisions or from stars and supernovae. Its cooling depends on its density and chemical composition, which depends on the star formation activity. It's got complicated magnetic fields doing all kinds of jiggery-pokery. And if there's dark matter, it's also being strongly affected by a much more massive component. So the gas and stars are subject to (potentially) very strong selection effects - what we're seeing is just the tip of the iceberg, biasing our view of the Universe.

The problem has been, for many years, that it's computationally very difficult to simulate both the complicated physics of normal matter and the apparently much simpler behaviour of the dark matter, on a large scale. That's starting to change as computational power improves, although for sure the physics used is still limited. This new paper uses these improved simulations to show that the "new law" can actually be very well explained entirely using conventional physics. In fact it appears that this can be explained using only the collapse and cooling of the gas - nothing else is needed.

Is the debate over ? Hell no. I'd be amazed if there wasn't a response from the original authors very soon. The new paper predicts there should be a strong dependency of this relation on the distance (and hence cosmic epoch) of a galaxy, since galaxies earlier in the Universe were forming stars at a faster rate. The original authors didn't comment on that. The new paper also doesn't really comment on the small-scale wiggles in individual galaxies, which modified gravity theories have no problems with (by design rather than due to predictive power). But their main result is very, very close to the earlier observation.

So CDM survives yet another challenge, and I maintain my 80:20 balance in favour of dark matter over some other explanation.
https://arxiv.org/abs/1610.06183

Sunday 16 October 2016

How bad was my hotel in Grenoble ? THIS bad

I reviewed my hotel of last week for booking.com. Well, they did ask.

[This remains by far my worst hotel experience ever. Not all science trips are as glamorous as the one in the castle, but most of them are a lot nicer than this !]

Monday 10 October 2016

IRAM interferometry school

With the exception of a single one-hour lecture by a VLBI expert at Arecibo, this is by far the best course on interferometry I've yet come across. Everything is clearly explained, step-by-step, starting from the over-simplified basics and gradually raising the complexity. Even with my shamefully poor mathematical skills I'm able to follow every step. The lectures are paced at just the right speed and (unlike every single other "conference" I've ever been on) there's a break between each one. Plenty of time to absorb what's been explained. No trying to cram unnecessary volumes of information or the seemingly malevolent attempts to exhaust everyone that plagues most workshops. It's about 10x better than NRAO's synthesis imaging summer school.

The downside to being in the lovely Grenoble with the wonderful lectures is the crappy "hotel". It's actually more like a low-grade (bordering on "dingy") hostel that charges hotel prices. Rooms are badly painted (I literally could have done a better job - for a start I wouldn't have chosen a mixture of deep pink and military grey), there's no wardrobe, free wifi is very slow, the sink light works entirely at random, the shower takes about 5 minutes (not exaggerating) to reach a state of tolerable warmth, light switch requires getting out of bed, "full breakfast" really means "a full croissant", cleanliness is not high, bed is squeaky (albeit comfortable), and the shared toilet (has a bizarre self-cleaning seat) has less legroom than any aircraft toilet. All in all, rather unpleasant.

Tomorrow I will celebrate my birthday by learning about calibration principles and atmospheric phase correction techniques.
http://www.iram-institute.org/EN/content-page-331-7-67-331-0-0.html

Thursday 6 October 2016

Classy workshop is classy

This years' ALMA all-hands meeting is in Lisbon, Portugal. And a very classy affair it is too.


And the workshop freebie... an official Atacama Large Millimetre Array USB fan ! Well I guess it will come in handy in another 7-8 months.... not much use for a fan in Prague in October.


Immediately after this : a school in Grenoble teaching us the wonders of interferometry.

Monday 3 October 2016

Today's schedule :

2am : Fall asleep.
5am : Wake up for no reason.
6am : Fall asleep again.
8am : Wake up with desperate urge to fall asleep again but a pressing need to go to work. Stagger around in a daze. Somehow reach work by ~9:30am.
10:45am : After final checks, submit paper on data visualisation. Make final changes to second paper.
12:15pm : No time for lunch. Go directly to all-afternoon seminar.
5:30pm : Return from seminar. Throw co-authors' comments to the wind and submit second paper.

And now to pack for a work-related trip to Lisbon tomorrow followed immediately by a work-related trip to Grenoble. Normal services will be resumed on 15th October. Until then, posts will be intermittent and responses sporadic. But hey, I'll have at least three papers published by the end of the year (unless referee is horrendous) and probably a fourth. Sleep would be nice though.

Back from the grave ?

I'd thought that the controversy over NGC 1052-DF2 and DF4 was at least partly settled by now, but this paper would have you believe otherwise...