One reads here and there that these experiments were carried out with detection rates close to 100%, and in any case above 50%. Nothing could be further from the truth. The laboratory data produced in front of numerous observers, and published, show enormous imbalances between the two arms, that is to say, enormous rates of singletons and of double non-detections. The typical EPR-Bell detection rate is far below 1%.
It is only after computer processing, based on theories that are surprising when applied here, that the data reach the announced rates. It goes without saying that these are explicitly non-compliant conditions, since the rate in question is the ratio of the pairs received and analyzed to the total number of pairs emitted. Can we then speak of observations of quantum-like correlations, of whatever nature they might be?
The papers are easily accessible, but a written request is often needed to obtain the raw data. We must thank the teams who share their data and take the risk of criticism, such as this one. In their defense, the press releases are often written by communications staff.
For example, let us take the data available on arXiv for a given experiment claiming 92%, which is supposed to close the detection loophole. The detectors analyze a single channel, so the only possible outcomes are “+” and “0”, where “0” lumps together non-detections and “−” results. This alone should be enough to narrow the scope of the findings. Let us go on.
The angles: \(a_1 = 94.4°\), \(a_2 = 62.4°\) for Alice, and \(b_1 = -6.5°\), \(b_2 = 25.5°\) for Bob.
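As a quick check, the setting differences implied by these angles, which label the pair counts below, can be computed directly (a minimal Python sketch; the variable names are mine):

```python
# Polarizer settings quoted above (degrees)
a1, a2 = 94.4, 62.4   # Alice
b1, b2 = -6.5, 25.5   # Bob

# Absolute setting differences, used as subscripts for the counts N below
for name, a, b in [("a1-b1", a1, b1), ("a1-b2", a1, b2),
                   ("a2-b1", a2, b1), ("a2-b2", a2, b2)]:
    print(f"{name}: {abs(a - b):.1f} deg")
```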
\(N_{11}^{++}\) is the number of “++” pairs obtained with polarizer settings \(a_1\) and \(b_1\); the subscript in degrees is the corresponding setting difference.
Number of emitted pairs: \(N = 3,502,784,150\)

Pairs per setting, and detected coincidences:
\(N_{11} = N_{100.9°} = 875,683,790;\quad N_{11}^{++} = N_{100.9°}^{++} = 141,439\)

\(N_{12} = N_{68.9°} = 875,518,074;\quad N_{12}^{+0} = N_{68.9°}^{+0} = 67,941\)

\(N_{21} = N_{68.9°} = 875,882,007;\quad N_{21}^{0+} = N_{68.9°}^{0+} = 58,742\)

\(N_{22} = N_{36.9°} = 875,700,279;\quad N_{22}^{++} = N_{36.9°}^{++} = 8,392\)
The complements of \(N_{11}^{++}\) are \(N_{11}^{00}\), \(N_{11}^{0+}\), and \(N_{11}^{+0}\).
First, billions of pairs were emitted, yet fewer than a million coincidences were detected.
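The overall rate can be checked in a few lines of Python, a sketch using only the counts quoted above:

```python
# Pairs emitted per setting (they sum to the N quoted above)
pairs = [875_683_790, 875_518_074, 875_882_007, 875_700_279]
# Detected coincidences per setting
coinc = [141_439, 67_941, 58_742, 8_392]

N = sum(pairs)         # 3,502,784,150 emitted pairs
detected = sum(coinc)  # 276,514 coincidences in total
rate = detected / N
print(f"{detected} coincidences out of {N} pairs -> {rate:.4%}")
# roughly 0.008%, i.e. far below 1%
```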
Try to evaluate the resulting quantities: Malus's law, the detection rate, the error margins, and so on. Are they consistent? What correlation values in \(\cos^2\) follow from the estimates of the complements? Speaking of 92% seems reckless. The hypothesis of natural fair sampling does not come out of this any stronger.
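As a starting point for that exercise, here is a hedged sketch comparing the per-setting coincidence fractions with a \(\cos^2\) (Malus-type) dependence on the setting difference; the grouping and normalization choices are mine, and other conventions are possible:

```python
import math

# Setting differences (computed from the quoted angles, in degrees)
# mapped to (coincidences, emitted pairs) from the counts above
data = {
    100.9: (141_439, 875_683_790),
    68.9:  (67_941 + 58_742, 875_518_074 + 875_882_007),  # two settings share 68.9
    36.9:  (8_392, 875_700_279),
}

for angle, (n_coinc, n_pairs) in data.items():
    fraction = n_coinc / n_pairs
    malus = math.cos(math.radians(angle)) ** 2
    print(f"{angle:6.1f} deg  observed fraction {fraction:.2e}   cos^2 {malus:.3f}")
```

The reader can then judge whether the observed fractions track any \(\cos^2\) curve once the complements are estimated.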
To close the fair-sampling loophole with scientific proof, one must produce an experiment with a pair-detection rate above 83% together with a \(\cos^2\) curve. The more conservative would settle for 75% plus a little beyond the error margin.
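The 83% figure matches the standard detection-efficiency bound for a CHSH test with maximally entangled states, \(\eta > 2(\sqrt{2}-1) \approx 82.8\%\); a one-line check:

```python
import math

# Minimal detection efficiency for a loophole-free CHSH test
# with maximally entangled states: eta > 2/(1 + sqrt(2)) = 2(sqrt(2) - 1)
eta_min = 2 * (math.sqrt(2) - 1)
print(f"eta_min = {eta_min:.4f}")  # about 0.828, i.e. roughly 83%
```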
There is no need to run billions of trials; it would be better to take the time to record hundreds or thousands of results slowly, since there are few possible angle differences. Is it so difficult, for a fundamental experiment, to set up a device recording one event per second for one month?
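The order of magnitude is easy to verify: one event per second over a month already yields millions of records, far more than the few hundred thousand coincidences in the dataset above.

```python
# One recorded event per second, for a 30-day month
events = 60 * 60 * 24 * 30
print(events)  # 2,592,000 events
```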