When something seems a little mysterious or we just don’t understand what is going on, we like to describe it with the adjective ‘dark’.
This is one of the reasons the term ‘dark matter’ was coined. It was first proposed to explain an anomaly observed in the rotational velocities of galaxies: the gas and dust at the outer edges of a galaxy rotate just as fast as the gas and dust near its center. This anomaly was first noted in 1978 by Vera Rubin and W. Kent Ford, who made precise measurements utilizing a new instrument that Ford himself had designed. At first they thought their data could be erroneous, but their results were corroborated by subsequent observations of galactic rotational velocities, confirming that there was indeed a discrepancy between what was expected and what was observed!
To determine the mass of a celestial body, some assumptions must be made about how things are expected to work – that is, the known laws of physics are taken into consideration. These laws are then applied to what is observed so that the mass can be determined. As direct measurements of the mass of a celestial body cannot be made, astronomers instead observe the emitted light and look at how it changes with time. An analysis of how the light changes with time reveals the dynamics of the system, e.g. the velocity, which, based on the laws of physics, allows the mass to be determined.
For example, in the case of galaxies, rotational velocities are calculated by measuring the Doppler shift of the light spectra. These calculated velocities are then plotted against their respective distances from the galactic center, producing a rotation curve.
To measure the Doppler shift, astronomers disperse the light using a spectrograph (e.g. a prism or diffraction grating), allowing the spectral lines – which represent electron transitions between orbitals – to be observed. Each spectral line has a specific rest wavelength depending on the atomic transition, but the observed wavelength will be shifted – shortened or lengthened – depending on whether the gas is moving towards or away from the astronomer along the line of sight. This technique is used to measure the rotational velocities.
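As a rough illustration, for speeds well below that of light the line-of-sight velocity follows directly from the fractional wavelength shift, v ≈ c·Δλ/λ. Here is a minimal Python sketch using the hydrogen 21-cm line as the rest wavelength; the shifted wavelength is an illustrative value, not a real measurement:

```python
# Sketch: recovering a line-of-sight velocity from a Doppler-shifted
# spectral line (non-relativistic approximation, v << c).

C = 299_792_458.0  # speed of light, m/s


def radial_velocity(lambda_observed: float, lambda_rest: float) -> float:
    """Line-of-sight velocity in m/s; positive means receding (redshift)."""
    return C * (lambda_observed - lambda_rest) / lambda_rest


# Illustrative example: the hydrogen 21-cm line, lengthened slightly
# by gas receding from us at 220 km/s (an assumed value).
lambda_rest = 0.2110611405       # m, rest wavelength of the 21-cm line
lambda_obs = lambda_rest * (1 + 220e3 / C)

v = radial_velocity(lambda_obs, lambda_rest)
print(f"recession velocity ≈ {v / 1000:.0f} km/s")  # prints ≈ 220 km/s
```

Repeating this measurement for gas at many radii from the galactic center is what produces the rotation curve described above.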
To determine the rotation curve of a galaxy we need to observe the light it emits. However, the visible light from stars suffers from interstellar extinction – the light cannot penetrate the galactic dust clouds – and is therefore not the best source of galactic light. Instead, neutral hydrogen, which exists in low-density regions of the interstellar medium and emits light with a wavelength of 21 cm – known as the hydrogen 21-cm line – is used. In the case of neutral hydrogen, the 21-cm (1420 MHz) radiation comes from the hyperfine transition between the two levels of the hydrogen ground state.
Now, from the known laws of physics one expects the velocity to change with distance. In the case of a rigid or homogeneous system – like that assumed for the galactic nucleus – the velocity is proportional to the distance, i.e. the velocity increases with radius. Beyond the bulk of the visible mass, however, Keplerian dynamics predicts that the velocity should fall off with the square root of the radius.
So, taking all this into account, we would expect a galaxy rotation curve somewhat like this:
However, even with these more accurate observations using the 21-cm hydrogen line, the resulting rotation curve is not as expected:
So how can we explain this rotation curve – why does the velocity increase and then flatten? Based on what we just discussed, the mass term in the equation governing the Keplerian dynamics would need to NOT be constant. That is, the mass would need to increase with radius, keeping the velocity approximately constant. As no extra mass is observed, it was proposed that the extra mass must be a different kind of mass that is undetectable, i.e. ‘dark’.
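To make this concrete, here is a minimal Python sketch contrasting the Keplerian expectation, v(r) = √(GM/r), with the enclosed mass M(r) = v²r/G that a flat curve implies – note how the required mass grows linearly with radius. The visible mass and flat speed are illustrative round numbers, not fits to real data:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # one kiloparsec in metres


def v_keplerian(r_m: float, m_enclosed_kg: float) -> float:
    """Circular velocity if essentially all the mass sits inside radius r."""
    return math.sqrt(G * m_enclosed_kg / r_m)


def mass_from_flat_curve(r_m: float, v_flat: float) -> float:
    """Enclosed mass implied by a flat curve: M(r) = v^2 r / G, i.e. M ∝ r."""
    return v_flat**2 * r_m / G


# Illustrative, assumed numbers for a large spiral galaxy:
M_visible = 1.0e11 * M_SUN   # rough luminous mass
v_flat = 220e3               # observed flat rotation speed, m/s

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * KPC
    v_exp = v_keplerian(r, M_visible) / 1e3
    m_req = mass_from_flat_curve(r, v_flat) / M_SUN
    print(f"r = {r_kpc:>2} kpc: Keplerian v ≈ {v_exp:5.0f} km/s, "
          f"mass needed for flat curve ≈ {m_req:.1e} M_sun")
```

The Keplerian speed falls steadily with radius, while the mass needed to keep the curve flat keeps growing well past the visible mass – the gap between the two is what was attributed to dark matter.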
Now although this idea of dark matter seemed new, it had already been proposed by the likes of Zwicky – who back in 1933, through the study of galactic velocities in the Coma Cluster, concluded that the total mass required to hold the Cluster together is about 400 times larger than what is observed. This was then expounded upon by the work of many others and is still an active area of research. Today it is calculated that ~85% of all matter is ‘dark’ matter.
Shortly after the idea of dark matter first gained momentum, several theories were put forward proposing possible sources, e.g. ‘dark’ objects formed in the early epochs of the Universe; dark remnants of Population III stars (the original stars formed after the big bang and thus composed entirely of primordial gas) such as white dwarfs, neutron stars or black holes; and exotic elementary particles such as massive neutrinos.
The idea of massive neutrinos was soon put to rest when the mass of the electron neutrino was measured to be ~30 eV and an argument based on the Pauli exclusion principle – which states that two or more identical fermions cannot simultaneously occupy the same quantum state – showed that individual galaxy halos could not be made of neutrinos with masses that small.
A plethora of candidates were subsequently put into the limelight, their main characteristic being that they are not detected in the known electromagnetic wavebands. It was thus assumed that such a dark matter candidate would not interact electromagnetically to any significant degree, interacting only via gravity or some other force that is as weak as, or weaker than, the weak nuclear force. That is, they are very quiet and don’t communicate too well – much like some scientists.
In the early universe the materialization of particles and anti-particles from radiant energy through pair production – and their subsequent destruction through annihilation – was in equilibrium. That is, the production rates for both particles and photons were the same as their destruction rates, such that no photon or particle was permanent; they just continuously fluctuated in and out of existence. As the universe cooled, the energy was no longer sufficient for pair production, and the numbers of particles and photons decreased until the particle interaction probability reached a critical low, at which point annihilation ceased and the number density of particles stabilized. For a given particle, the number density at which this stabilization – or ‘freeze-out’ – occurs depends on the particle’s mass. A dark matter candidate would need to be sufficiently massive and slow moving (sub-relativistic) to clump together and form the structure we observe today. This is the general view of dark matter and is referred to as the Cold Dark Matter model.
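A standard order-of-magnitude result of this freeze-out picture is that the surviving relic abundance scales inversely with the particle’s annihilation cross-section, roughly Ωh² ≈ 3×10⁻²⁷ cm³ s⁻¹ / ⟨σv⟩. A back-of-the-envelope sketch (a rule of thumb only, not a full Boltzmann-equation calculation) showing why a weak-scale cross-section lands near the observed dark matter density, the so-called ‘WIMP miracle’:

```python
# Rule-of-thumb relic abundance from thermal freeze-out:
#   Omega * h^2 ~ 3e-27 cm^3 s^-1 / <sigma v>
# The stronger a particle annihilates, the fewer survive freeze-out.

def relic_abundance(sigma_v_cm3_s: float) -> float:
    """Order-of-magnitude relic density Omega*h^2 for a thermal relic."""
    return 3e-27 / sigma_v_cm3_s


# A typical weak-interaction annihilation cross-section (assumed value):
weak_scale = 3e-26  # cm^3 / s

print(f"Omega h^2 ~ {relic_abundance(weak_scale):.2f}")  # prints ~ 0.10
```

This comes out close to the measured dark matter density Ω_dm h² ≈ 0.12, which is why particles interacting via the weak force looked like such natural candidates.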
Supersymmetry – one of the candidate theories for quantum gravity, which focuses on the relationship between ordinary matter particles (fermions) and ‘force carrying’ entities (bosons) – predicts new elementary particles that fit the description of a Weakly Interacting Massive Particle (WIMP), e.g. higgsinos, sneutrinos, squarks and selectrons.
The lightest of the stable supersymmetric particles is the neutralino, which happens to have a calculated number density approximately equal to the known density of dark matter. The neutralino is thus the most likely WIMP candidate, with a mass within the energy range that can be probed at particle accelerators such as the Large Hadron Collider (LHC). Note that any detection of a WIMP at the LHC would not be direct; instead it would take the form of missing energy of a specific order. However, as yet no such detection has been made.
Direct detection of a WIMP would be the optimum confirmation of dark matter; however, as they are weakly interacting, the probability that they will interact is extremely low, let alone that we would detect an interaction in such a small energy range. Nevertheless, there are many experiments dedicated to detecting the interaction of a WIMP with atomic matter. Depending on the material of the detector, e.g. silicon, germanium, sodium iodide etc., phonons – vibrations in the atomic lattice – and/or scintillation – luminescence from ionized electrons – can be detected. To reduce background events, these experiments operate deep underground and at extremely cold temperatures, where they are shielded from cosmic rays and thermal excitations are minimized. Although numerous experiments are actively searching, e.g. the Deep Underground Science and Engineering Laboratory (DUSEL), the Large Underground Xenon experiment (LUX), the Sudbury Neutrino Observatory Laboratories (SNOLAB) and the China Jinping Underground Laboratory (CJPL), to date no WIMP has been detected.
These underground experimental methods allow for another source of indirect detection – the detection of neutrinos. If we assume that WIMPs are the dark matter particle and exist in the halos of galaxies, then they would have been passing through our local surroundings, and at some point in the last several billion years they would have been scattered by nuclei. This loss in energy would have trapped the WIMPs in the gravitational well of the Sun and/or Earth until the number density increased sufficiently for annihilation to occur. Annihilation of WIMPs results in high-energy neutrinos, so based on this reasoning you would expect a stream of neutrinos to be emanating from the Sun. Neutrinos produced in nuclear reactions in the solar core have a much lower energy than the neutrinos produced through WIMP annihilation. Furthermore, these higher-energy neutrinos interact in the Earth’s atmosphere, producing muons. However, muons are also created through cosmic ray interactions with the Earth’s atmosphere, so again, to reduce background events, the detectors are placed deep underground. Detectors such as the Antarctic Muon And Neutrino Detector Array (AMANDA), the South Pole Neutrino Observatory (IceCube) and the Astronomy with a Neutrino Telescope and Abyss environmental RESearch project (ANTARES) are all searching for a signal, but these methods have also been to no avail.
Annihilation of WIMPs also produces gamma rays, which has been another focus of the search for these elusive dark matter particles. This annihilation is expected to take place in galactic halos and could be detected through an excess of gamma rays. However, distinguishing between gamma rays due to annihilation and those from various astrophysical sources has proven difficult. For example, a recent study using data from the Large Area Telescope on NASA’s Fermi Gamma-ray Space Telescope has suggested that the excess gamma-ray emission may instead come from a well-known population of pulsars.
The myriad of experiments and search techniques have so far revealed no positive results, so it looks like the most promising candidate – the WIMP – is finally facing defeat, and a team of leading scientists attending a workshop on new ideas in dark matter have been encouraged to look elsewhere and widen their perspective.
One alternative candidate that may well make up a fraction of this missing mass is the MACHO (MAssive Compact Halo Object). Unlike WIMPs, MACHOs are baryonic and come in the form of astronomical objects such as Jupiter-mass objects, brown dwarfs, black hole remnants of early generation stars, primordial black holes, neutron stars and white dwarfs. If the galactic halo were filled with objects such as these, they would not be detected through the emission or absorption of light. Detection of a MACHO could instead be made through the phenomenon known as microlensing, where the light from a distant star is magnified when a MACHO-type object passes in front of it.
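For a point source, the microlensing magnification depends only on the source–lens separation u, measured in units of the Einstein radius: A(u) = (u² + 2)/(u√(u² + 4)). As the MACHO drifts across the line of sight, u shrinks and the background star temporarily brightens. A short sketch:

```python
import math


def magnification(u: float) -> float:
    """Point-source microlensing magnification; u is the source-lens
    angular separation in units of the Einstein radius."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))


# As the lens approaches the line of sight, the star brightens;
# u = 1 (magnification ≈ 1.34) is the conventional detection threshold.
for u in (2.0, 1.0, 0.5, 0.1):
    print(f"u = {u:3.1f}: magnification = {magnification(u):.2f}")
```

Surveys of this kind (monitoring millions of stars for such brightening events) found far too few events for MACHOs to account for all of the halo mass.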
However, non-baryonic matter must have a much higher density (by a factor of ~5) than baryonic matter: gravity acting on baryonic matter alone is too weak to grow the present structures from the smooth initial conditions observed in the cosmic microwave background (CMB). Additional mass would speed up the process, but only if it didn’t interact with light in the same way ordinary “baryonic” matter does. It was thus concluded that the majority of dark matter is most likely non-baryonic, and that if MACHOs exist they are only responsible for a small fraction of dark matter, primarily in the halos of spiral galaxies.
Axions are another viable dark matter candidate. As opposed to WIMPs, which are hypothesized to have been created thermally in the early universe, axions are suggested to have been created non-thermally during a phase transition event. They were first proposed in the 1970s to explain the strong CP problem – C as in charge (conjugation) and P as in parity (spatial inversion). The problem is the question of why quantum chromodynamics (QCD) does not seem to break CP symmetry when in principle it permits such a violation.
The fact that we live in a matter-dominated universe indicates that the laws of physics are not the same for matter and antimatter, and that there must have been a violation of this fundamental symmetry of nature – known as CP violation. However, although this is the case for weak interactions, it is not the case for electromagnetic and strong interactions, and this is thus known as the strong CP problem. Axions were postulated by Roberto Peccei and Helen Quinn to account for this, where the axion’s potential would exactly cancel out a CP-violating term introduced into the QCD calculations. Since 2016 the Axion Dark Matter Experiment has been trying to tune a microwave antenna to the broadcast frequency of dark matter – to no avail. The latest study, conducted at the Paul Scherrer Institute (PSI), also comes up null. The Ultra Cold Neutron source (UCN) at the PSI is primarily being utilized to determine the electric dipole moment of the neutron, but measurements over time could reveal a fluctuation at a consistent frequency – which would be indicative of an interaction between the neutron and the hypothetical axion particle.
Axions are 10,000 trillion-trillion times less massive than an electron and could theoretically condense into a Bose-Einstein condensate, and thus be the hypothesized superfluid dark matter responsible for galaxy rotation, as opposed to the normal dark matter responsible for galaxy clusters. However, this has since been ruled out on the basis that axion self-interactions are weak and attractive, while superfluid dark matter particles are required to be repulsive.
Let’s start from the premise that everything is connected, in which case there would not be a particle that only interacts via gravity. In fact, maybe, just maybe, there is no ‘dark’ matter particle after all – and instead it’s the physics that we need to look at.
This is the premise of Mordehai Milgrom, who came up with a theory of modified Newtonian dynamics (MOND). The central idea of MOND is scale-invariant physics – that is, physics that does not change across scales. Seems obvious, right? However, this is not the general consensus, which currently states that the laws of physics change at different scales, i.e. you have quantum mechanics at the quantum scale and general relativity at large scales.
What if this was not the case and instead we live in a scale invariant Universe i.e. a unified physics view where quantum gravity is the physics across all scales?
This is in line with the remark by Dirac (1973) that “… the equations expressing the basic laws of Physics should be invariant under the widest possible group of transformations. This is the case of the Maxwell equations of electrodynamics which in absence of charges and currents show the property of scale invariance”.
Milgrom makes an interesting step and introduces scale-invariant physics at low accelerations, below a critical limit that is defined within the framework of the model. From this approach the MOND model successfully predicts the observed dynamical effects of galaxies, from dwarfs to ellipticals and spirals, and also correctly predicts the correlation between the luminosity of a galaxy and its rotation rate, known as the Tully-Fisher relation.
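In the deep-MOND regime (accelerations far below the critical limit, a₀ ≈ 1.2×10⁻¹⁰ m/s²), the rotation speed becomes flat and satisfies v⁴ = GMa₀ – exactly the v⁴ ∝ M scaling of the baryonic Tully-Fisher relation. A sketch with an assumed, Milky-Way-like baryonic mass (an illustrative value, not a fit):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
A0 = 1.2e-10       # m/s^2, Milgrom's critical acceleration a0


def v_flat_mond(m_baryonic_kg: float) -> float:
    """Asymptotic flat rotation speed in the deep-MOND regime:
    v^4 = G * M * a0, i.e. the baryonic Tully-Fisher scaling."""
    return (G * m_baryonic_kg * A0) ** 0.25


# Assumed baryonic (visible) mass of a Milky-Way-like galaxy:
M_b = 1.0e11 * M_SUN

print(f"predicted flat speed ≈ {v_flat_mond(M_b) / 1e3:.0f} km/s")  # ≈ 200 km/s
```

No dark halo enters the calculation: the flat speed follows from the visible mass alone, which is why MOND reproduces observed rotation curves so economically.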
This underlying scale-invariant framework of MOND, where gravity is an emergent property, is also put forward by Erik Verlinde in his theory of emergent gravity. Verlinde’s conceptual model utilizes quantum information theory, string theory and black hole physics to suggest that space-time and gravity emerge together from entangled quantum interactions. This emergent gravity contains an additional force which, according to Verlinde, can explain the observed phenomena in galaxies and clusters currently attributed to dark matter.
Andre Maeder has presented a similar theory that assumes scale invariance. Much like MOND, he defines a limit where scale invariance is applicable at large scales (i.e. low accelerations in MOND). Then, much like Verlinde, he finds an additional force that opposes gravity. This force is only significant at very low densities, i.e. on Earth the force would be too small to make any measurable difference, but on the galactic scale it is strong enough to hold the rotating galaxies together with no need for dark matter. Maeder’s model utilizes a new co-ordinate system and successfully explains the rotation rates of individual galaxies and the surprisingly high velocities of galaxies in galaxy clusters, as well as the accelerating expansion generally attributed to dark energy.
These models of Milgrom, Verlinde and Maeder all offer an interesting take on dark matter: instead of looking for a new particle, the physics is addressed to account for the attributes of dark matter. However, not everyone is convinced, as noted by David Spergel, who says Maeder’s theory – or any new theory that attempts to explain away dark matter – would have to fit all cosmological data. For example, the cleanest evidence for the existence of dark matter is the CMB – the radiation left over from the big bang – whose observed temperature fluctuations cannot be explained without dark matter. According to Spergel, the Cold Dark Matter model fits the data really well, and Maeder’s theory does not. However, the validity of the Cold Dark Matter model has itself just been challenged in a recent paper by a team of astrophysicists studying the kinematics of satellite galaxies.
In any case, Spergel also believes that Maeder’s theory would not be able to explain the gravitational lensing effects that are observed around some galaxies and can only be explained by additional mass – dark mass! Interestingly, Maeder reaches a similar conclusion about MOND, stating that as it is a classical theory it cannot be contained within the cosmological model.
There are many more alternative theories of gravity that attempt to remove the need for dark matter and dark energy, but these models are still works in progress and cannot explain all the features of dark matter yet.
Also, and possibly most importantly, these models place limits on the scale invariance. If we are assuming a unified approach with scale invariance, where any dilation or contraction of space would not change the physics, then you would expect this to be valid across all scales.
Haramein’s unified physics, in his generalized holographic model, offers such an approach, where scale invariance is valid across all scales. In his model the mass of all particles, stars, galaxies and all systems emerges from the central black hole that sustains each of these systems. The missing mass required to explain the observational anomalies is thus no longer needed in the form of a particle and can instead be predicted directly from the physics of the generalized holographic approach. The generalized holographic model successfully predicts the mass of the proton (read addendum here) and the electron – so what about the mass of larger-scale systems?
The discrepancy between the vacuum density at the quantum scale and at the cosmological scale – known as the vacuum catastrophe – was resolved utilizing the generalized holographic model, where the changing density is a direct result of scale-invariant physics. This model is therefore valid from the very small to the very big and is most likely applicable across all scales. The question is: can we predict the observed mass of our galaxy based on the generalized holographic model? Watch this space!