Higgs boson’s “little brother” probably never existed


The inflaton has not been seen in the decay of B+ mesons at the LHCb. (Courtesy: LHCb Collaboration, CERN)

The hypothetical inflaton is almost certainly not the particle behind the universe’s rapid expansion soon after the Big Bang. This is according to an international collaboration of physicists working at the LHCb experiment on the Large Hadron Collider (LHC) at CERN, who have been looking for traces of the inflaton in the decay of B+ mesons.

Back in 1981, Alan Guth proposed a new model of the early universe to explain why it looks the same in all directions today. He theorized that after the Big Bang the universe initially expanded slowly, allowing time for matter to interact and the temperature to level out. Then there was a very short, extremely fast expansion of space–time, which happened so rapidly that the universe now appears uniform throughout.

For such an expansion to take place, however, there must have been a force field behind it. “A new [force] field always means the existence of a particle that is the carrier of the effect,” explains team member Marcin Chrzaszcz from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN). For a while, it was thought that this particle was the Higgs boson; however, when it was observed in 2012, the boson was too heavy to be the correct candidate. So theoreticians proposed a new particle called the inflaton, which had the properties of the Higgs boson but a smaller mass.

To search for the inflaton, physicists looked at the decay of B+ mesons, which sometimes decay into K+ mesons and Higgs bosons. According to quantum mechanics, the near-identical nature of the “brother” particles means that they transform and oscillate between each other, so the Higgs boson should then convert into the inflaton. Rather than directly measuring the inflaton or Higgs, the LHCb detects their decay into a muon and antimuon.

“Depending on the parameter describing the frequency of the inflaton–Higgs oscillation, the course of B+ meson decay should be slightly different,” Chrzaszcz explains. “We found nothing. We can therefore say with great certainty that the light inflaton simply does not exist.” The work is presented in Physical Review D.
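The “oscillation” invoked here can be pictured with the textbook two-state mixing formula. The following is a generic quantum-mechanical sketch in natural units (ħ = c = 1), not the specific parameterization used in the LHCb analysis: for two states that mix with angle θ and have mass splitting Δm, the probability that a state produced as the Higgs-like particle is later found as the inflaton χ is

```latex
P_{h \to \chi}(t) = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m\, t}{2}\right)
```

The rate of the decay chain through the inflaton then depends on the mixing parameter, which is why a null search translates into limits on the light-inflaton hypothesis across the mass range probed.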


via physicsworld.com

From generation to generation


by Robert Kowalewski from Nature Physics 11, 705–706 (2015) doi:10.1038/nphys3464


A new measurement from the LHCb experiment at CERN’s Large Hadron Collider impinges on a puzzle that has been troubling physicists for decades: the breaking of the symmetry between matter and antimatter.


Experimental constraints on the unitarity triangle. Each band shows the allowed region (at 95% confidence level, CL) based on specific measured quantities. The quantities η̄ and ρ̄ are functions of the Cabibbo-Kobayashi-Maskawa (CKM) matrix elements, which allow the triangle to have a base of unit length oriented along the ρ̄ axis. The angles α, β and γ correspond to the blue and tan bands, and are measured from matter-antimatter-violating asymmetries in B meson decay. The circular arcs centred on (1, 0) show the constraints from the mass differences, Δmd and Δms, measured in studies of B–B̄ oscillations. Measurements of matter-antimatter violation in the kaon system determine εK, which is a measure of the admixture of the CP-even eigenstate in the long-lived neutral kaon, and result in the green band. The dark green semi-circle centred on (0, 0) shows the constraint from the measurement of the ratio |Vub|/|Vcb|, where Vub describes the transition of a b quark to a u quark. Image courtesy of the CKMfitter group.

We learn early that the matter in and around us is made up of three particles: electrons, and the up and down quarks found in nuclei. Add in the electron neutrino and we also account for nuclear fission and fusion and the stellar furnace that fuels life on Earth. But nature is not that simple. It replicates this four-particle structure in ‘generations’ of heavier, but otherwise similar, particles. The first evidence for this was the discovery of the muon in 1936. Other second-generation particles were subsequently discovered, as was another unexpected phenomenon: the violation of matter-antimatter (CP) symmetry in neutral kaons(1). Now, writing in Nature Physics, the LHCb collaboration(2) provides fresh evidence to fuel the ongoing discussion surrounding CP violation.
In 1973, Makoto Kobayashi and Toshihide Maskawa proposed a mechanism whereby mixing between the mass and weak eigenstates of quarks would, if there were three generations, result in an irreducible complex phase that could be responsible for CP violation(3).
The discovery of the first third-generation particle, the tau lepton(4), came a year later, followed in 1977 by the discovery of the third-generation ‘b’ quark(5). With the advent of high-intensity electron-positron colliders at the start of the twenty-first century, studies of CP violation in the decays of B mesons (which contain a b quark) at the BaBar and Belle experiments validated Kobayashi and Maskawa’s proposal, for which they shared the 2008 Nobel Prize in Physics.
The CKM matrix – introduced by Kobayashi and Maskawa, following the formative work of Nicola Cabibbo – describes the mixing of quark mass and weak eigenstates in the standard model of particle physics. It is unitary and can be fully specified with four parameters: three real angles and one complex phase. This unitarity condition is the basis for a set of testable constraints in the form of products of complex numbers that sum to zero – for example, $V_{ud}^* V_{ub} + V_{cd}^* V_{cb} + V_{td}^* V_{tb} = 0$, where Vub describes the transition of a b quark to a u quark. The triangle in Fig. 1 provides a convenient graphical representation of this equation. The unitarity condition connects a large set of measurable quantities in the standard model, including CP-violating asymmetries, which depend on the complex phase, and mixing strengths, which are magnitudes such as |Vub| and |Vcb|. In the standard model, all the bands corresponding to the different measurements in Fig. 1 should overlap at a unique point, which they do at the current level of precision. The presence of new particles or interactions would contribute to these measurable quantities in different ways, resulting in bands that fail to converge at a point. The ratio of matrix elements |Vub|/|Vcb| corresponds to the length of the side of the ‘unitarity triangle’ opposite the angle labelled β, which is well determined from measured CP-violating asymmetries. The precise determination of this ratio is a crucial ingredient in providing sensitivity to new particles and interactions.
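To make the triangle relation concrete, here is a minimal numerical sketch in Python, assuming approximate Wolfenstein parameter values (the numbers are illustrative stand-ins, not the CKMfitter fit results shown in the figure). It builds the CKM matrix to third order in λ, checks that the three terms above sum to nearly zero, and recovers the apex coordinates.

```python
import numpy as np

# Illustrative Wolfenstein parameters (approximate values, assumed for
# this sketch only; not the fitted results plotted in Fig. 1).
lam, A, rho, eta = 0.2253, 0.81, 0.13, 0.35

# CKM matrix to O(lambda^3) in the Wolfenstein parameterization.
V = np.array([
    [1 - lam**2/2,                 lam,          A*lam**3*(rho - 1j*eta)],
    [-lam,                         1 - lam**2/2, A*lam**2],
    [A*lam**3*(1 - rho - 1j*eta), -A*lam**2,     1.0],
], dtype=complex)

# Orthogonality of the first and third columns gives the triangle relation
# V_ud* V_ub + V_cd* V_cb + V_td* V_tb = 0.
terms = [V[i, 0].conj() * V[i, 2] for i in range(3)]
print("sum of unitarity-triangle terms:", sum(terms))  # ~0, up to O(lambda^5)

# Dividing by -V_cd* V_cb rescales the triangle to a unit-length base from
# (0, 0) to (1, 0); the conjugate is taken because we wrote the conjugated
# form of the relation. The apex lands near (rho-bar, eta-bar).
apex = (-(terms[0] / terms[1])).conjugate()
print("triangle apex ~ (%.3f, %.3f)" % (apex.real, apex.imag))
```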
Experiments at electron-positron colliders have measured |Vub| and |Vcb| for many years using two complementary methods based on the decays of a B meson to an electron or muon, its associated neutrino and one or more strongly interacting particles. The first method measures exclusive final states whose decay rates are proportional to |Vqb|² (where q = u, c), and uses lattice quantum chromodynamics (QCD) calculations of form factors to determine |Vqb|. The second, inclusive, method requires only the presence of an electron or muon and sums over many exclusive final states. These summed rates are also proportional to |Vqb|²; the determination of |Vqb| in this case relies on perturbative QCD calculations and auxiliary measurements. Although these two methods have improved significantly in precision over the years, the values determined for both |Vub| and |Vcb| from the inclusive method persistently exceed those from the exclusive method by two to three standard deviations. This has prompted speculation that the familiar left-handed charged weak interaction has a right-handed counterpart that contributes to this difference.
With this backdrop, the new measurement of the ratio |Vub|/|Vcb| from the LHCb experiment at CERN’s Large Hadron Collider (LHC) is a welcome addition to the literature(2). It is based on a different exclusive decay mode than can be measured at the electron-positron collider experiments, namely that of a baryon containing b, u and d quarks (a heavier version of the neutron) that decays into a proton, a muon and a neutrino. Particle physicists have been surprised that these decays, where the missing neutrino prevents reliance on kinematic constraints, can be distinguished from the huge background inherent in proton-proton collisions at the LHC. This new result, which makes use of very precise spatial measurements of the decay vertices of short-lived particles and uses innovative analysis techniques, is a noteworthy achievement.
What have we learned? The new experimental information, instead of resolving the inclusive-exclusive puzzle, deepens it. The measurement and corresponding lattice QCD calculation lead to a value for |Vub|/|Vcb| that is lower than both the pre-existing exclusive and inclusive determinations. The consistency of the three determinations with a single value is only 1.8%, indicating that particle physicists have more work to do in this area. On a more positive note, the LHCb measurement, when combined with previous measurements, strongly disfavours the hypothesis of a right-handed weak interaction.
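The quoted consistency figure is the kind of number that comes out of a χ² compatibility test across the determinations. Here is a minimal sketch of such a test, using invented placeholder values rather than the published measurements:

```python
import numpy as np
from scipy import stats

# Hypothetical |Vub|/|Vcb| determinations as (central value, 1-sigma error).
# These numbers are placeholders for illustration, not the published values.
measurements = np.array([
    [0.095, 0.005],   # stand-in for the pre-existing exclusive determination
    [0.105, 0.005],   # stand-in for the inclusive determination
    [0.083, 0.006],   # stand-in for the new LHCb baryonic-mode result
])

vals, errs = measurements[:, 0], measurements[:, 1]
weights = 1.0 / errs**2

# Inverse-variance weighted average of the three determinations.
avg = np.sum(weights * vals) / np.sum(weights)

# Chi-square of the values about the average; the p-value quantifies how
# consistent the three determinations are with a single underlying number.
chi2 = np.sum(weights * (vals - avg)**2)
ndof = len(vals) - 1
p_value = stats.chi2.sf(chi2, ndof)
print(f"average = {avg:.4f}, chi2/ndof = {chi2:.1f}/{ndof}, p = {p_value:.3f}")
```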


(1) Christenson, J. H., Cronin, J. W., Fitch, V. L. & Turlay, R. Phys. Rev. Lett. 13, 138–140 (1964).
(2) The LHCb Collaboration. Nature Phys. 11, 743–747 (2015).
(3) Kobayashi, M. & Maskawa, T. Prog. Theor. Phys. 49, 652–657 (1973).
(4) Perl, M. L. et al. Phys. Rev. Lett. 35, 1489–1492 (1975).
(5) Herb, S. W. et al. Phys. Rev. Lett. 39, 252–255 (1977).

Physics paper sets record with more than 5,000 authors


by Davide Castelvecchi from Nature doi:10.1038/nature.2015.17567


Detector teams at the Large Hadron Collider collaborated for a more precise estimate of the mass of the Higgs boson.

Thousands of scientists and engineers have worked on the Large Hadron Collider at CERN.


A physics paper with 5,154 authors has — as far as anyone knows — broken the record for the largest number of contributors to a single research article.
Only the first nine pages in the 33-page article, published on 14 May in Physical Review Letters(1), describe the research itself — including references. The other 24 pages list the authors and their institutions.
The article is the first joint paper from the two teams that operate ATLAS and CMS, two massive detectors at the Large Hadron Collider (LHC) at CERN, Europe’s particle-physics lab near Geneva, Switzerland. Each team is a sprawling collaboration involving researchers from dozens of institutions and countries.
By pooling their data, the two groups were able to obtain the most precise estimate yet of the mass of the Higgs boson — nailing it down to ±0.25%.
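The gain from pooling is, in essence, inverse-variance weighting. The toy sketch below assumes uncorrelated uncertainties and stand-in input numbers; the real ATLAS–CMS combination also has to track correlated systematic uncertainties.

```python
# Two hypothetical Higgs-mass measurements (central value, total error, GeV).
# Stand-ins for illustration; not the actual combination inputs, which
# separate statistical from correlated systematic uncertainties.
m1, s1 = 125.36, 0.41
m2, s2 = 124.70, 0.34

# Inverse-variance weighted combination: the combined error is always
# smaller than either input error.
w1, w2 = 1 / s1**2, 1 / s2**2
m = (w1 * m1 + w2 * m2) / (w1 + w2)
s = (w1 + w2) ** -0.5

print(f"combined mass = {m:.2f} +/- {s:.2f} GeV ({100 * s / m:.2f}%)")
```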
Robert Garisto, an editor of Physical Review Letters, says that publishing the paper presented challenges above and beyond the already Sisyphean task of dealing with teams that have thousands of members. “The biggest problem was merging the author lists from two collaborations with their own slightly different styles,” Garisto says. “I was impressed at how well the pair of huge collaborations worked together in responding to referee and editorial comments,” he adds.
Too big to print?
Every author name will also appear in the print version of the Physical Review Letters paper, says Garisto. By contrast, the 2,700-odd author list for a Nature paper on rare particle decays that was published on 15 May(2) will not appear in the June print version, but will only be available online.
Some biologists were upset this week about a genomics paper with more than 1,000 authors(3), but physicists have long been accustomed to “hyperauthorship” (a term credited to information scientist Blaise Cronin at Indiana University Bloomington(4)).
An article published in 2008 about the CMS experiment at the LHC(5), before the machine started colliding protons, became the first paper to top 3,000 authors, according to Christopher King, editorial manager of Thomson Reuters ScienceWatch. The paper that announced the ATLAS team’s observation of the Higgs particle in 2012 had 2,932 authors, of whom 21 were listed as deceased(6).


(1) Aad, G. et al. (ATLAS Collaboration, CMS Collaboration) Phys. Rev. Lett. 114, 191803 (2015).
(2) CMS Collaboration & LHCb Collaboration Nature http://dx.doi.org/10.1038/nature14474 (2015).
(3) Leung, W. et al. Genes Genomes Genet. 5, 719–740 (2015).
(4) Cronin, B. JASIST 52, 558–569 (2001).
(5) The CMS Collaboration et al. J. Instrum. 3, S08004 (2008).
(6) ATLAS Collaboration Phys. Lett. B 716, 1–29 (2012).

Proton smasher spots rare particle decays


by Daria Zieminska from Nature (2015) doi:10.1038/nature14520


The extremely rare decays of particles known as neutral B mesons have been observed at CERN’s Large Hadron Collider. The result may be a glimpse of physics beyond that of the standard model of particle physics.
For more than three decades, physicists have been looking for the decay of the ‘strange B meson’ particle into a pair of muons, the heavy cousins of electrons. The process is incredibly rare, and harder to find than the famous Higgs particle, the discovery of which at the Large Hadron Collider at CERN, near Geneva, Switzerland, was celebrated worldwide in 2012.

The standard model of elementary particle physics(1) makes an exact prediction of the number of particle-decay events researchers should observe in an experiment. Anything more than the predicted value means potential trouble for the standard model. In a paper published on Nature’s website, researchers working on the CMS and LHCb collaborations(2) at the Large Hadron Collider describe a joint analysis of data from proton collisions that set the decay rate of the strange B meson at about three in one billion — in agreement with the standard-model prediction. However, they find that the decay rate of another type of neutral B meson, the ‘non-strange’ B meson, is at odds with the expectation from the standard model.
The standard model is at a crossroads. It has been very successful in describing elementary particles and their interactions, but such particles make up only 4% of the known Universe. The theory does not provide a candidate for the dark matter that binds galaxies together and makes up one-quarter of the cosmos. Nor does it accommodate dark energy, the remaining, unknown component of the Universe that is causing it to expand at an accelerated rate. It also does not explain the preponderance of matter over antimatter. Lastly, it delivers a worrisome warning that the Universe is probably unstable, ready to collapse in a ‘big crunch’.
Many models have been proposed to solve some of these problems. One of the most compelling ideas for unknown physics beyond that of the standard model is supersymmetry(3), affectionately called SUSY. Supersymmetry states that, for every known particle, there is a twin ‘superparticle’ of much higher mass. These superparticles could in principle be produced in colliders. They should quickly decay to lighter superparticles and ordinary particles, except for the lightest superparticle, which should be stable — and that is SUSY’s candidate particle for dark matter.
Physicists have been searching for SUSY superparticles for years, so far with no success. In the absence of direct observations, they watch for discrepancies of measurements of particle properties from standard-model predictions. The decay of neutral B mesons to muons (Fig. 1) is a sensitive test of the standard model because the model predicts the decay rate with good precision. B mesons are made up of one quark and a ‘bottom’ antiquark, the antimatter partner to the quark; quarks are the elementary building blocks of protons and neutrons, and come in six flavours (up, down, strange, charm, top and bottom). There are two kinds of neutral B meson, which have no charge. One type, the strange B meson (Bs0), contains a bottom antiquark and a strange quark. The other, the non-strange B meson (B0), has a bottom antiquark paired with a down quark. The decay of the neutral B mesons to a pair of muons would mean that the bottom antiquark and its quark partner annihilate, and that the energy released in the process is given to the muons.

The CMS and LHCb collaborations have accelerated and smashed together beams of protons travelling in opposite directions in the Large Hadron Collider at CERN, near Geneva, Switzerland, producing neutral B mesons, among many other particles. The authors observed the extremely rare decay of the strange neutral B meson (Bs0) to two oppositely charged muons (μ+ and μ−) with high statistical significance.


But the standard model forbids the annihilation of quarks of different flavours, so it predicts the decay of the neutral B mesons into muons through an intermediate process that involves the exchange of a top quark between the quarks and the emission of two W bosons (elementary particles that mediate the weak nuclear force). The decay of the strange B meson is expected to occur by this process in about four parts in one billion, and that of the non-strange B meson in about one part in ten billion. However, if yet-unknown SUSY superparticles are exchanged between the quarks in addition to the top-quark exchange, these decay rates will be greatly enhanced relative to the standard-model rate.
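To see why branching fractions this small demand enormous datasets, here is a rough order-of-magnitude sketch; the production numbers and efficiency below are assumptions for illustration only, since real analyses work with measured cross-sections and carefully calibrated efficiencies.

```python
# Standard-model branching fractions quoted in the text.
br_bs = 3.7e-9    # Bs0 -> mu+ mu-, about four parts in one billion
br_b0 = 1.1e-10   # B0  -> mu+ mu-, about one part in ten billion

# Assumed production counts and overall efficiency (illustrative only).
n_bs_produced = 1e11   # hypothetical number of Bs0 mesons produced
n_b0_produced = 4e11   # hypothetical number of B0 mesons produced
efficiency = 0.01      # hypothetical trigger-plus-selection efficiency

print("expected Bs0 -> mu mu events:", br_bs * n_bs_produced * efficiency)
print("expected B0  -> mu mu events:", br_b0 * n_b0_produced * efficiency)
# -> a handful of signal events out of hundreds of billions of decays
```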
The decay rate of the strange B meson observed by the CMS and LHCb collaborations confirms the standard-model prediction. That is good news for the standard model, but not such good news for physics beyond it. However, the decay of the non-strange B meson, which the authors also observed, albeit with a lower statistical significance than obtained for the strange B meson, exceeded the standard-model expectation by almost fourfold — something to watch in the years to come.
CMS and LHCb are two of seven particle detectors at the Large Hadron Collider. Their designs follow different concepts. CMS is a large cylinder (21.6 metres long and 14.6 metres in diameter) in which two counter-propagating beams of protons collide and give rise to neutral B mesons, among many other particles. LHCb is specifically designed to study B mesons, which tend to stay close to the line of the beam pipe. Unlike the CMS detector, which surrounds the proton collision point, the LHCb detector is a stack of instruments stretching for 20 metres along the beam pipe on one side of the collision point. But the two teams adopted a similar strategy to analyse their data. Both groups selected particle events that involved two oppositely charged muons travelling from a common point, which is displaced by a few hundred micrometres from the point at which the protons collide. The events associated with the decay of a neutral B meson are a small fraction of initial candidates. The rest are random pairs of muons originating from other, more common processes.
To separate the signal of the neutral B meson from background events, the teams each built a ‘decision tree’ — a sequence of binary splits of data into signal-like and background-like parts. The system ‘learns’ to distinguish between signal and background by ‘training’ on a simulated sample of the signal and on a sample of real data representing background events. For the selected signal-like events, the researchers deduced the mass of the parent particles using the momenta and directions of travel of the two muons. They then compared the spectrum of the deduced masses with that predicted for a sum of two bell-shaped curves corresponding to the two kinds of neutral B meson, strange and non-strange, and a smooth background.
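A minimal sketch of both ingredients, assuming toy data and invented discriminating variables (real analyses use detector-level quantities and typically ensembles of boosted trees): a single scikit-learn decision tree trained on labelled signal and background samples, plus the dimuon invariant-mass reconstruction from the muons’ momenta.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(seed=1)
M_MU = 0.1057  # muon mass in GeV

def invariant_mass(p1, p2):
    """Invariant mass of a muon pair, deduced from the two 3-momenta (GeV)."""
    e1 = np.sqrt(M_MU**2 + np.sum(p1**2, axis=1))
    e2 = np.sqrt(M_MU**2 + np.sum(p2**2, axis=1))
    p_tot = p1 + p2
    return np.sqrt((e1 + e2)**2 - np.sum(p_tot**2, axis=1))

# Toy events with three invented discriminating variables (stand-ins for
# e.g. vertex displacement, muon isolation, pointing angle); the shapes
# are made up for this sketch.
n = 20000
sig = rng.normal(loc=[1.0, 0.8, 0.9], scale=0.4, size=(n, 3))  # simulated signal
bkg = rng.normal(loc=[0.0, 0.0, 0.0], scale=0.6, size=(n, 3))  # background-like data

X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = signal, 0 = background

# A decision tree: a sequence of binary splits of the data into signal-like
# and background-like parts, 'trained' on labelled samples.
tree = DecisionTreeClassifier(max_depth=4).fit(X, y)
print("fraction of true signal kept:", (tree.predict(sig) == 1).mean())

# Mass reconstruction for one example muon pair (momenta chosen arbitrarily).
p_mu_plus = np.array([[2.0, 0.5, 20.0]])
p_mu_minus = np.array([[-1.5, -0.3, 15.0]])
print("dimuon invariant mass (GeV):", invariant_mass(p_mu_plus, p_mu_minus)[0])
```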
The two collaborations had previously performed this type of analysis, and each reported their results in separate publications(4, 5). But it was only the combination of data from the two experiments that allowed the researchers to observe with high statistical significance the decay of the strange B meson. In the process, the researchers identified, and corrected, issues with the previous analyses. In particular, they isolated and subtracted a background from the decay of a particle called a bottom Lambda baryon that mimics the signal of a neutral B meson.
Studies of B-meson decays will continue in the coming years. The Large Hadron Collider has just restarted after a two-year break for upgrades, and will soon accelerate proton beams to an energy of 13 teraelectronvolts (TeV), increased from the 8-TeV level reached before the upgrades. The proton beams will also be more tightly focused and will collide at a higher rate than that achieved so far. Both experiments will collect a large number of rare events, and should eventually find which path away from the standard model nature has chosen.


(1) The Standard Model
(2) CMS Collaboration & LHCb Collaboration. Nature http://dx.doi.org/10.1038/nature14474 (2015).
(3) Supersymmetry
(4) Chatrchyan, S. et al. (CMS Collaboration) Phys. Rev. Lett. 111, 101804 (2013).
(5) Aaij, R. et al. (LHCb Collaboration) Phys. Rev. Lett. 111, 101805 (2013).

The mass of a top


by Peter Skands from Nature 514, 174–176 (09 October 2014) doi:10.1038/514174a

A measurement of the mass of the heftiest-known elementary particle, the top quark, which exists for less than a trillionth of a trillionth of a second, sheds light on the ultimate fate of our Universe, although ambiguities cloud its interpretation.

Writing in Physical Review Letters, researchers working on the D0 experiment (Abazov et al.(1)) at the Tevatron accelerator at Fermilab near Chicago, Illinois, report the most precise single measurement so far of the mass of the heaviest-known elementary particle, the top quark. The result concludes an exciting 20-year saga: from the joint discovery of the top quark by the D0 Collaboration(2) and its competitor the CDF Collaboration(3), to a measurement of the top quark’s mass with a precision better than 0.5%. A similar result from the CDF experiment is to be expected, updating their 2012 result(4).
The top quark is one of six types of quark predicted by the standard model of particle physics; quarks are elementary particles that make up composite particles such as protons and neutrons. The top quark existed in the extremely hot conditions of the early Universe and can be recreated artificially by large particle accelerators such as the Tevatron. The D0 experiment takes its name from its location on the accelerator ring. According to the D0 measurement, a top quark weighs 187.85 ± 0.82 atomic units (174.98 ± 0.76 gigaelectronvolts c⁻² in particle-physics units, with c being the speed of light), just shy of the mass of a gold atom. Unlike atoms, however, the top quark is elementary, and acquires its mass by interacting with the elusive, omnipresent Higgs field, the telltale evidence of which, the Higgs boson, was famously discovered(5, 6) in 2012.
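The conversion between the two sets of quoted units is simple arithmetic, using the standard factor 1 u ≈ 0.931494 GeV c⁻²:

```python
# Convert the D0 top-quark mass from GeV/c^2 to atomic mass units (u),
# using 1 u = 0.931494 GeV/c^2.
GEV_PER_U = 0.931494

m_top_gev = 174.98
m_top_err = 0.76

print(f"top mass: {m_top_gev / GEV_PER_U:.2f} "
      f"+/- {m_top_err / GEV_PER_U:.2f} u")
# -> about 187.85 +/- 0.82 u, matching the figures quoted above
```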
Briefly stated, the presence of the Higgs field in the Universe causes an increase in the potential energy of all particles except photons, gluons and possibly neutrinos. The extra potential energy is equivalent to a mass, and the size of this mass is proportional to the strength of the Higgs field (called its vacuum expectation value, a universal constant) and to the size of each particle’s ‘Higgs charge’, which determines how strongly each particle interacts with the Higgs field. For the top quark, this charge is called the top-quark Yukawa coupling, named after the Japanese physicist and Nobel laureate Hideki Yukawa.
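In the usual conventions this relationship can be written down explicitly. As a quick worked sketch, taking v ≈ 246.22 GeV for the Higgs vacuum expectation value and the D0 mass quoted above:

```latex
m_t = \frac{y_t\, v}{\sqrt{2}}
\quad\Rightarrow\quad
y_t = \frac{\sqrt{2}\, m_t}{v}
\approx \frac{\sqrt{2} \times 174.98~\mathrm{GeV}}{246.22~\mathrm{GeV}}
\approx 1.00
```

That the top-quark Yukawa coupling comes out so close to unity is one reason the top quark is regarded as a uniquely sensitive probe of the Higgs sector.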
The fact that the top quark has by far the largest mass among elementary particles implies that it has by far the largest Yukawa coupling, and this in turn gives rise to some of the most significant quantum fluctuations in nature. At the quantum level, the Higgs field constantly fluctuates into pairs of ‘virtual’ particles and antiparticles, which are allowed a brief existence by Heisenberg’s uncertainty principle. Because of the huge top-quark Yukawa coupling, the fluctuations involving the top quark affect the shape of the Higgs potential (which describes the potential energy of the Higgs field as a function of the field strength). In fact, making the strong assumption that there are no as-yet-undiscovered particles, the Higgs field seems to exist in a local minimum of the potential(7), which would make the Universe as we know it unstable. Fortunately, the calculated lifetime of the Universe comfortably exceeds its present age. So, rest assured, the Universe will not decay tomorrow. But the desire to ascertain its ultimate fate is a key reason for accurately determining the top-quark mass: differences of just 5% in this mass make the difference between stability and instability. With modern experimental measurements such as that from D0 reaching precisions better than 1%, it is now largely a matter of improving the delicate theoretical calculations to settle the issue.
The measurement is far from trivial. First, in collisions of protons and antiprotons at the Tevatron, top quarks are created predominantly by the strong nuclear force, which conserves ‘quark number’, and so they must be created in pairs (top plus antitop). And they decay within one-trillionth of a picosecond (1 picosecond is 10⁻¹² s) into bottom quarks and W bosons, the latter being elementary particles that mediate the weak nuclear force. Bottom (b) quarks undergo a process called fragmentation and turn into jets: sprays of nuclear particles (hadrons) arranged in fractal-like patterns. D0 identifies these ‘b-quark jets’ with around 65% efficiency by using the fact that b-quarks travel about one centimetre before decaying, leaving a detectable ‘displaced vertex’. The W bosons decay immediately, either to a charged lepton (electron, muon or tauon) accompanied by a neutrino or an antineutrino, or to a quark jet and an antiquark jet. Putting all this together, the final states are therefore complex and are classified into three main categories according to the decay products of the two W bosons: all-jets, leptons and jets, and dileptons.
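The channel fractions quoted below follow from the tree-level W branching ratios, on the assumption that only electrons and muons are counted as analysis ‘leptons’ (tau decays are typically handled separately); a back-of-the-envelope sketch:

```python
# Tree-level, the W decays democratically into 9 'doublets':
# e-nu, mu-nu, tau-nu, plus two quark doublets in three colours each.
br_lep = 2 / 9   # W -> e nu or mu nu (taus treated separately, an assumption)
br_had = 6 / 9   # W -> quark-antiquark jets

# A top-quark pair yields two W bosons, so the channel fractions are:
all_jets = br_had * br_had
lepton_jets = 2 * br_lep * br_had
dilepton = br_lep * br_lep

print(f"all-jets:     {all_jets:.0%}")      # ~44%, 'half of the events'
print(f"leptons+jets: {lepton_jets:.0%}")   # ~30%
print(f"dileptons:    {dilepton:.0%}")      # ~5%, quoted as 4% in the text
```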

The collisions of protons (p) and antiprotons (p̄) in the Tevatron accelerator can, among many other possible reactions, produce a pair of top quarks, one top quark (t) and one antitop quark (t̄), which both decay rapidly to lighter particles. By measuring the energies and momenta of these particles accurately, the masses of the original top quarks can be reconstructed. The specific decay pattern shown here corresponds to the ‘leptons + jets’ channel used by the D0 Collaboration measurement(1), in which the top and the antitop each decay differently. The antitop quark decays into a negatively charged W boson (W−), which in turn decays to a charged lepton (l; an electron, muon or tauon) and an antineutrino (ν̄). The top quark decays into a positively charged W boson (W+), which decays into quark (q) and antiquark (q̄) jets. Both decays also produce a bottom (b)-quark jet.


The all-jets category represents half of the events, but there are significant non-top ‘multi-jet’ backgrounds, and accurate energy calibrations of the measured particle jets are challenging. Dileptons can be measured very precisely, but they occur only 4% of the time, and the two escaping neutrinos (one of which is actually an antineutrino) leave an irreducible ambiguity in the determination of the top-quark mass.
The best of both worlds is obtained in the ‘leptons + jets’ channel, which was used by the D0 Collaboration. The rate of occurrence of this channel (30%) is reasonable, there are fewer jets and lower non-top backgrounds than in the all-jets case, and momentum conservation accurately constrains the energy and direction of the single escaping neutrino.
The masses of the original top quarks are encoded in the energies and momenta of their decay products. The calibration techniques to extract the ‘true’ top-quark mass from the raw data have evolved enormously since 1995. In the D0 analysis, the set of measured variables is compared with state-of-the-art theoretical calculations of the likelihood of combined signal and background events for several reference top-quark mass values, to find the one that maximizes the overall likelihood.
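Below is a toy version of this template idea, with a one-dimensional Gaussian stand-in for the full signal-plus-background likelihood; the true mass, resolution and event count are invented placeholders, not D0 values.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# Toy per-event top-mass estimates, generated at an assumed true mass with
# an invented resolution (both placeholders for illustration).
true_mass, resolution = 175.0, 12.0   # GeV
data = rng.normal(true_mass, resolution, size=2500)

def log_likelihood(m_hyp):
    """Gaussian template log-likelihood for a hypothesized top mass."""
    return np.sum(-0.5 * ((data - m_hyp) / resolution) ** 2)

# Scan reference mass values and keep the one maximizing the likelihood,
# mirroring the template-comparison strategy described above.
hypotheses = np.arange(165.0, 185.0, 0.05)
scan = np.array([log_likelihood(m) for m in hypotheses])
best = hypotheses[np.argmax(scan)]
print(f"best-fit mass: {best:.2f} GeV "
      f"(statistical precision ~ {resolution / np.sqrt(len(data)):.2f} GeV)")
```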
The accuracy of the theoretical calculations therefore affects the measurement precision, and elaborate cross-checks are necessary to estimate the added uncertainty. Imperfections in the description of the jet-fragmentation process and the modelling of the energies and momenta of the top quarks and their decay products are especially important. The D0 analysis achieves an unprecedented systematic uncertainty (experimental plus theoretical) of just 0.5 GeV c⁻².
The Tevatron collider was shut down in 2011, and this measurement is the final word from D0 on the leptons + jets channel, comprising about 2,500 collision events. Meanwhile, the ATLAS and CMS experiments at the Large Hadron Collider (LHC) at CERN near Geneva, Switzerland, have already published(8, 9) measurements rivalling those of CDF(4) and D0. When the LHC starts up again in 2015, top-quark production rates will skyrocket to thousands of times the Tevatron level.
Given the high precision being attained by these experiments, it is ironic that theorists are not quite sure how to interpret the measured values. In quantum field theory, particle masses can be defined in many slightly different ways, and currently there is no consensus about exactly which definition the experimental measurements correspond to, leaving a 1% ambiguity when converting between the measured mass and the Yukawa coupling. Because this is larger than the precision on the raw measurement, this is an issue that urgently needs to be resolved before we can truly claim that we know the mass of a top.


(1) Abazov, V. M. et al. Phys. Rev. Lett. 113, 032002 (2014).
(2) Abachi, S. et al. Phys. Rev. Lett. 74, 2632–2637 (1995).
(3) Abe, F. et al. Phys. Rev. Lett. 74, 2626–2631 (1995).
(4) Aaltonen, T. et al. Phys. Rev. Lett. 109, 152003 (2012).
(5) ATLAS Collaboration et al. Phys. Lett. B 716, 1–29 (2012).
(6) CMS Collaboration et al. Phys. Lett. B 716, 30–61 (2012).
(7) Elias-Miró, J. et al. Phys. Lett. B 709, 222–228 (2012).
(8) The ATLAS Collaboration. Eur. Phys. J. C 72, 2046 (2012).
(9) The CMS Collaboration. J. High Energy Phys. 12, 105 (2012).