Complex software models are used to understand the results from the Large Hadron Collider. These include simulations of the particle physics in the proton-proton collisions, as well as of the material and geometry of the detectors and the strength of the various magnetic fields. As more data are accumulated, the required precision of this software increases.
A recent review recommended that the number of decimal places used to represent numbers in the software should be increased. This applies to mathematical constants such as e and pi, as well as to physical constants and the measured dimensions of the detectors. So far, so routine. But when more decimal places were added to pi, a strange effect was noticed: the alignment of charged-particle tracks across detector boundaries actually got worse with the more precise value, and the agreement between simulation and data deteriorated slightly too.
This really should not happen – more precision should mean better alignment and better agreement.
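To get a feel for why a tiny change in pi moves tracks at all, here is a toy sketch in Python. It is not the actual LHC reconstruction code, and the names (arc_position, PI_LOW) are invented for illustration; it just shows that swapping a truncated pi for the full double-precision value shifts a predicted track position by about a micrometre, which happens to be the scale at which detector alignment operates.

```python
import math

# Toy illustration (not real LHC software): the predicted position of a
# charged particle bending along a circular arc depends on the value of
# pi used to convert the bend angle.
PI_LOW = 3.14159     # truncated value, as might lurk in old code
PI_HIGH = math.pi    # full double-precision value

def arc_position(radius_m, fraction_of_turn, pi_value):
    """Return (x, y) after traversing a fraction of a circular turn."""
    angle = 2.0 * pi_value * fraction_of_turn
    return radius_m * math.cos(angle), radius_m * math.sin(angle)

# A 1 m radius track after a quarter turn, once with each value of pi.
x_lo, y_lo = arc_position(1.0, 0.25, PI_LOW)
x_hi, y_hi = arc_position(1.0, 0.25, PI_HIGH)

# The shift in x is roughly a micrometre: tiny, but alignment constants
# are tuned at exactly this scale, so every extrapolated track moves.
print(f"shift in x: {abs(x_hi - x_lo):.2e} m")
print(f"shift in y: {abs(y_hi - y_lo):.2e} m")
```

In a sketch like this the shift is systematic rather than random, which is why it could masquerade as a misalignment of the detector.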
Boring scientists say this is probably evidence that some physicists don’t know how to write proper code. However, string theorists have pointed out that a firm prediction of string theory is the existence of extra space-time dimensions. In a space that is curved into a higher dimension, the apparent value of pi (the measured ratio of a circle's circumference to its diameter) can deviate from its flat-space value. And thus the LHC may have proved that they were right all along. More data are needed before we can be sure.
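(To be fair to the string theorists, there is a kernel of truth buried in that claim: in a curved space the circumference-to-diameter ratio you actually measure really does differ from pi. A standard textbook example, nothing to do with the LHC:)

```latex
% On a 2-sphere of radius R, a circle of geodesic radius r has
% circumference C = 2 pi R sin(r/R), so the ratio a surveyor measures is
\[
  \frac{C}{2r} \;=\; \pi\,\frac{\sin(r/R)}{r/R} \;<\; \pi,
\]
% recovering the flat-space pi only in the limit r/R -> 0.
```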
Of course, there's a much more interesting explanation involving the numbers "4" and "1" that would explain this mathematical oddity as well...