Unprecedented optical clock network lays groundwork for redefining the second (phys.org)
20 points by wglb 4 days ago | 12 comments
wglb 4 days ago [-]
Referenced article in Optica: https://opg.optica.org/optica/abstract.cfm?doi=10.1364/OPTIC...
adrian_b 3 days ago [-]
Also "Roadmap towards the redefinition of the second"

https://iopscience.iop.org/article/10.1088/1681-7575/ad17d2

towards which these experiments are a step.

zokier 7 hours ago [-]
I would have thought that the frequencies (or their ratios) of atomic clocks could somehow be calculated from fundamental physics: that the energy levels of the electron shells could be determined from the configuration of the nucleus (how many protons/neutrons it has), and the transition frequency from those. But apparently that is not the case. I guess the number of particles in these atoms is far too high for our current quantum models to compute?
fsh 4 hours ago [-]
The accuracy of atomic clocks is much better than our understanding of fundamental physics.

The best calculable atomic system is atomic hydrogen, and state-of-the-art quantum electrodynamics calculations reach a relative accuracy of around 1E-13 for its energy levels. However, already at the 1E-10 level, the structure of the proton becomes significant, and it currently cannot be calculated from first principles. Instead, the proton size is taken as a free parameter that is determined from measurements.

In contrast, the best realizations of the SI second are caesium fountain clocks which achieve relative uncertainties in the 1E-16 range. Clocks based on optical transitions (rather than microwave transitions) have now broken the 1E-18 barrier. Calculating atomic structure to this level is currently completely unthinkable, even for a system as simple as hydrogen.
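
To put rough numbers on that ladder of corrections, here is a quick Python sketch (my own illustration, using scipy.constants) comparing the hydrogen 1S-2S frequency at the two simplest levels of theory against the published measured value (approximately 2.466061413187035e15 Hz; treat the exact digits as approximate):

    # Hydrogen 1S-2S transition frequency at successive levels of theory.
    # Each refinement buys only a few more digits -- nowhere near the
    # 1E-16 (Cs fountain) or 1E-18 (optical clock) level mentioned above.
    from scipy.constants import Rydberg, c, m_e, m_p

    measured = 2.466061413187035e15  # Hz, published 1S-2S value (approximate)

    nu_bohr = 0.75 * Rydberg * c            # Bohr model, infinite nuclear mass
    nu_rmass = nu_bohr * m_p / (m_p + m_e)  # finite-nuclear-mass (reduced-mass) correction

    for label, nu in (("Bohr", nu_bohr), ("reduced mass", nu_rmass)):
        print(f"{label:>12}: {nu:.6e} Hz, off by {abs(nu - measured) / measured:.0e}")
    # -> off by ~5e-4, then ~1e-5; the remainder is relativity plus QED
    #    (Lamb shift), and proton structure enters near 1E-10, as noted above.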

staunton 6 hours ago [-]
It's always a question of what precision you want.

For really any physical system, describing it at ever greater precision makes more and more effects relevant, until you can't calculate it anymore (or your theory itself breaks down). In this case, the precision they need is extremely high, so this is a problem.

For the vast majority of systems, there's no point in going there because the precision of experiments is too low (which means that the experiments feature even more poorly controlled effects which would be unreasonable to model).

perlgeek 6 hours ago [-]
Our models use single atoms (or single molecules), and for those we have pretty good models that we can solve numerically, at least.

In a gas, the atoms or molecules interact only weakly, so you just get some well-known effects like line broadening due to the thermal motion of the particles.
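
For a sense of scale, here is a minimal sketch of that thermal (Doppler) broadening; using the Cs clock transition at room temperature is my own illustrative choice:

    # Doppler (thermal-motion) line broadening, FWHM:
    #   delta_nu = nu0 * sqrt(8 * k * T * ln(2) / (m * c^2))
    # Illustrative numbers for the Cs-133 clock transition at 300 K.
    import math
    from scipy.constants import k, c, atomic_mass

    nu0 = 9_192_631_770          # Hz, Cs hyperfine transition (defines the SI second)
    m = 132.905 * atomic_mass    # kg, approximate mass of a Cs-133 atom
    T = 300                      # K, room temperature

    fwhm = nu0 * math.sqrt(8 * k * T * math.log(2) / (m * c**2))
    print(f"Doppler FWHM ~ {fwhm / 1e3:.1f} kHz ({fwhm / nu0:.1e} fractional)")
    # -> roughly 10 kHz, i.e. ~1e-6 of the line -- one reason fountain
    #    clocks use laser-cooled atoms.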

But you really still want experimental validation before you declare any of these as a new standard, for a whole variety of reasons:

* it's often complicated to calculate multiple excitations

* you might forget something in the models, like isotope ratios

* the models don't really give you a good sense of how impurities in your materials will affect the clocks

* there might be some practical issues, like glass (used in the optical fibers) not being a very good medium for some frequencies of light that would otherwise look promising as a time standard

... and so on.

tsimionescu 3 hours ago [-]
> Our models use single atoms (or single molecules), and for those we have pretty good models that we can solve numerically, at least.

We can only solve these with assumptions, like treating protons and neutrons as indivisible, perfectly spherical particles with experimentally determined sizes - even though we know very well that they are in fact collections of quarks and gluons whose size and shape are fully determined by more fundamental interactions. We are nowhere near a point where we could compute anything about a whole hydrogen atom using only the Standard Model and no other assumptions. Quantum chromodynamics is far too complex to allow for a perfect simulation like this.

zokier 4 hours ago [-]
The thing that piqued my curiosity was this note from the paper:

> This strongly suggests that the recommended frequency value for the secondary representation of the second is offset from the unperturbed transition frequency by approximately twice its assigned uncertainty of 1.3×10^-15.

> the recommended frequency value is strongly dominated by a single absolute frequency measurement [53], which in light of recent results is to be considered suspect.

So I guess we don't have a usable theoretical reference value here.

ludicrousdispla 9 hours ago [-]
... a man with one clock knows what time it is, but a man with two can never be sure.
adrian_b 7 hours ago [-]
Unless he uses the ticks coming from the two clocks to generate a tick signal whose frequency is the geometric mean of the two tick frequencies (possibly a weighted mean, when one clock is known to be better than the other), and then displays the time by counting the ticks of the synthetic signal.

This is how the redefinition of the second will work: by using many different kinds of optical clocks instead of the single cesium-based microwave clock used now.

In fact, the big laboratories today have many atomic clocks whose frequencies are averaged to compute the time, even when the clocks are of the same kind. International Atomic Time (TAI) is computed by averaging the clocks of all the important laboratories.
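
A minimal sketch of such a combining scheme; the inverse-variance weighting and the example numbers are my own illustrative assumptions:

    # Weighted geometric mean of several clocks' tick frequencies.
    # Weighting each clock by 1/sigma^2 of its (fractional) uncertainty
    # lets a better clock dominate the synthetic tick signal.
    import math

    def combine(freqs_hz, sigmas):
        """Weighted geometric mean: exp of the weighted mean of log-frequencies."""
        weights = [1.0 / s**2 for s in sigmas]
        log_mean = sum(w * math.log(f) for w, f in zip(weights, freqs_hz)) / sum(weights)
        return math.exp(log_mean)

    # Two hypothetical clocks; the second is known to be more stable:
    print(combine([9_192_631_770.002, 9_192_631_769.999], [1e-3, 2.5e-4]))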

torcete 3 hours ago [-]
I wonder why they use the geometric mean. Are the clocks expected to have spurious noisy ticks?
IAmBroom 3 hours ago [-]
All quantum things do.