Last week the Washington Post ran a story by Carolyn Johnson with the headline “A breakthrough in timekeeping: Progress toward a nuclear clock.” I didn’t see it when it ran, but a colleague asked me about it at happy hour, so I looked it up. It’s a good piece— the grad student anecdote that frames it is an all-timer—and I did a thread about it on Bluesky (in large part to see what the response would be on a new social network), but it’s a big enough subject to be worth a blog post as well.
The actual paper is super technical, as ultra-precise spectroscopy articles tend to be— a lot of fiddly details about how they identified and eliminated tiny sources of error. It’s also got a bunch of lightly tossed-off details— exciting one of the relevant transitions with the seventh harmonic of something, etc.— that hide superhuman accomplishments. The NIST Time and Frequency Division, and particularly the Jun Ye group, is full of laser jocks whose names are spoken in awed tones by other researchers in the field.
The question my colleague had, though, is “Why is this particular technology considered a big deal?” There’s a bit of the usual stuff in the article— clocks are connected to navigation, and ultra-precise clocks offer opportunities to search for exotic physics— but that doesn’t entirely explain why thorium specifically rates a splashy story in the WaPo. Which is an opening for me to dust off some stuff from A Brief History of Timekeeping and explain a few of the whys of atomic clock technology.
A key bit of background for all of this is that the phrase “atomic clock” is a tiny bit of a misnomer, in that the “tick” of the clock doesn’t take place in the atoms themselves, but in light. The current definition of the second is the time required for 9,192,631,770 oscillations of a particular type of light, so an “atomic clock” is ultimately a microwave source— you count the oscillations of the light that comes out, and that gives you a way to quantify the passage of time.
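To make "counting oscillations" concrete, here's a trivial sketch: if you could tally the cycles of the microwave signal, elapsed time is just the count divided by the defined frequency. (This is an illustration of the definition, not of how real clock hardware works.)

```python
# The SI second is defined as 9,192,631,770 oscillations of the cesium
# hyperfine transition frequency -- this number is exact, by definition.
CESIUM_HZ = 9_192_631_770

def elapsed_seconds(cycles: int) -> float:
    """Convert a count of microwave oscillations into elapsed time."""
    return cycles / CESIUM_HZ

# One full second's worth of ticks:
print(elapsed_seconds(9_192_631_770))  # → 1.0
```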

The atoms come into this as a reference to ensure that the light is at the correct frequency. Quantum mechanics tells us that electrons in atoms occupy discrete energy states— in the not-perfect-but-gets-the-concept-across picture of an atom like a tiny solar system, these are combinations of distance from the nucleus, speed in the orbit, and spin of the electron and nucleus— and absorb and emit light when they move between those states. The light frequency is determined by the energy difference between the states, which is absolutely fixed by the laws of physics for a given atom.
So a cesium atomic clock is a microwave source whose frequency is checked against one of the frequencies of light that cesium atoms are allowed to absorb and emit. Loosely speaking, you take a little bit of the light, shine it on some cesium atoms, and see if they change states. If they all change, you know you’ve got the right frequency; if some of them don’t, you tweak the frequency a little bit and try again. This check-and-tweak process gets you a microwave source whose frequency is known to match the cesium frequency to such a degree of precision that two of these clocks would need to run continuously for something like a billion years before the accumulated difference between them added up to one second.
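The check-and-tweak loop is essentially a feedback servo, and can be caricatured in a few lines of code. Everything here is made up for illustration— the line shape, the step size, the probe scheme— it's a toy, not how a real cesium fountain clock steers its oscillator:

```python
import math

TRUE_FREQ = 9_192_631_770.0  # the atomic resonance (unknown to the clock)

def excitation_fraction(f: float, width: float = 50.0) -> float:
    """Toy line shape: the fraction of atoms that change state,
    peaked when the probe frequency hits the resonance."""
    return math.exp(-((f - TRUE_FREQ) / width) ** 2)

def servo(f_guess: float, step: float = 10.0, iterations: int = 200) -> float:
    """Crude check-and-tweak: probe just above and below the current
    guess, and move toward whichever side excites more atoms."""
    f = f_guess
    for _ in range(iterations):
        if excitation_fraction(f + step) > excitation_fraction(f - step):
            f += step
        else:
            f -= step
    return f

locked = servo(9_192_631_500.0)
print(abs(locked - TRUE_FREQ) < 20)  # the loop settles near the resonance
```

Real clocks use far cleverer interrogation schemes (Ramsey interferometry, among other things), but the basic logic— compare the source against the atoms, nudge it toward agreement, repeat forever— is the same.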
That seems awfully good, but isn’t actually the best we can do. The Jun Ye group at NIST in Boulder is one of a handful of groups running experimental clocks that are considerably better. These use strontium atoms held in an optical lattice, and the “tick” of the clock is the oscillation of a red laser rather than a microwave source. The precision of these requires even crazier analogies than the cesium standards— running times exceeding the current age of the universe before gaining or losing a second.
What makes the difference here? The big win is that the frequency of the “ticking” is a lot higher— around 429,000,000,000,000 Hz for the light associated with strontium. All else being equal— that is, if you have the ability to measure both frequencies to the same number of decimal places— a clock with a higher absolute frequency will have smaller uncertainty than one with a lower absolute frequency. That “all else being equal” is doing a ton of work and involves a couple of Nobel Prizes for the development of the necessary technology, but if you’re running a precision metrology group staffed with awe-inspiring laser jocks, you can measure the strontium frequency at a level not too far off that of the cesium frequency, so that factor of 47,000 increase in the light frequency gets you a much better clock.
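The "higher frequency wins" argument is just arithmetic, and worth seeing with numbers. Suppose (purely for the sake of the example) both frequencies could be measured with the same 1 Hz absolute uncertainty; the strontium value below is rounded:

```python
# Same absolute uncertainty, very different fractional uncertainty.
cesium_hz = 9.192631770e9   # exact, by definition of the second
strontium_hz = 4.29e14      # ~429 THz, the strontium clock laser (rounded)

delta_hz = 1.0  # hypothetical: measure each to the same 1 Hz

print(f"cesium fractional uncertainty:    {delta_hz / cesium_hz:.1e}")
print(f"strontium fractional uncertainty: {delta_hz / strontium_hz:.1e}")
print(f"frequency ratio: {strontium_hz / cesium_hz:,.0f}")  # ~47,000
```

The fractional uncertainty— the thing that determines how long the clock runs before it's off by a second— improves by the full ratio of the frequencies.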
So, what’s the big deal about thorium? Well, the thorium transition in question is a change of nuclear states, not electronic ones— a redistribution of energy stored in the 229 protons and neutrons making up the thorium nucleus— but the same basic physics applies. The frequency of the emitted light depends on the energy difference between nuclear states in the same way as the frequency of light emitted in an ordinary atomic clock depends on the change in the energy of the electrons.
Nuclear states, though, have their energies determined in part by the strong nuclear interaction, which as the name suggests is much stronger than the electromagnetic interaction that determines the energy of electron states. Which means that the light emitted by nuclei tends to be much higher in frequency— gamma rays or x-rays, not visible light or microwaves. The thorium transition here is actually freakishly low for a nuclear transition, meaning that it’s merely in an incredibly inconvenient range of ultraviolet light— but still light, allowing it to be studied with the same optical techniques used for the strontium clocks.
Sure, everything involving the laser has to be done under vacuum, but, y’know, no guts, no glory. If you have a group that can make a laser at the right frequency— see above— the frequency of the thorium transition is another factor of almost five higher than the strontium frequency. Which means another big potential increase in the precision of a clock based on thorium. On top of that, there are benefits from the nuclear nature of the transition— reduced sensitivity to some external perturbations, and increased sensitivity to some of the exotic physics things that you might look for with an ultra-precise clock. So it’s an exciting system for the kind of people who get excited by precision metrology.
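The "factor of almost five" is the same arithmetic again, with rough round-number frequencies (the thorium transition sits near 148 nm in the vacuum ultraviolet):

```python
# Rough round numbers only, for the ratio.
strontium_hz = 4.29e14  # ~429 THz, strontium clock laser
thorium_hz = 2.02e15    # ~2020 THz, the thorium-229 nuclear transition

print(f"{thorium_hz / strontium_hz:.1f}")  # → 4.7
```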
People have been trying to observe the thorium transition for almost a quarter-century now, and it’s only within the last couple of years that its energy got nailed down well enough for people to start looking for it with lasers. The paper that prompted the WaPo story is the next step in that process: comparing the thorium frequency to the strontium one at twelve decimal places. Their spectroscopy is sensitive enough to see a little nuclear physics in action— they can resolve several closely spaced energy sublevels that come about because of the nuclear structure. It’s an impressive bit of work.
It’s still a long way from being a competitive frequency standard, let alone a clock (the distinction between them being, roughly, that a clock is operated continuously while a frequency standard only checks in some of the time). But it’s a major development in a rapidly evolving subfield, and definitely deserves some hype.
There’s a little bit of physics explaining to get your week off to a good start. If you like this kind of thing, here’s a button to get it in your inbox:
And if you have further questions or want to quibble with any of my explanations, the comments will be open: