How does one argue in the 21st century that the thesis of determinism is true?
Determinism: The world is governed by (or is under the sway of) determinism if and only if, given a specified way things are at a time t, the way things go thereafter is fixed as a matter of natural law.
Causal Determinism (Stanford Encyclopedia of Philosophy)
As already noted in #13, the thesis of determinism can only be true if the postulate of realism tested by the Bell and Leggett-Garg inequalities is true. Realism is the assumption that the properties of quanta exist in definite states prior to, and independent of, measurement. These experiments consistently find that the inequalities are violated, thereby refuting the postulate of realism:
Bell's inequality is established based on local realism. The violation of Bell's inequality by quantum mechanics implies either locality or realism or both are untenable. Leggett's inequality is derived based on nonlocal realism. The violation of Leggett's inequality implies that quantum mechanics is neither local realistic nor nonlocal realistic. The incompatibility of nonlocal realism and quantum mechanics has been currently confirmed by photon experiments.
Testing Leggett's Inequality Using Aharonov-Casher Effect : Scientific Reports
Here are more such findings:
Strong Loophole-Free Test of Local Realism
Quantum mechanics at its heart is a statistical theory. It cannot with certainty predict the outcome of all single events, but instead it predicts probabilities of outcomes. This probabilistic nature of quantum theory is at odds with the determinism inherent in Newtonian physics and relativity, where outcomes can be exactly predicted given sufficient knowledge of a system. Einstein and others felt that quantum mechanics was incomplete. Perhaps quantum systems are controlled by variables, possibly hidden from us [2], that determine the outcomes of measurements. If we had direct access to these hidden variables then the properties of quantum systems would not need to be treated probabilistically. De Broglie’s 1927 pilot-wave theory was a first attempt at formulating a hidden variable theory of quantum physics [3]; it was completed in 1952 by David Bohm [4, 5]. While the pilot-wave theory can reproduce all of the predictions of quantum mechanics, it has the curious feature that hidden variables in one location can instantly change values because of events happening in distant locations. This seemingly violates the locality principle from relativity, which says that objects cannot signal one another faster than the speed of light. In 1935 the nonlocal feature of quantum systems was popularized by Einstein, Podolsky, and Rosen [6], and is something Einstein later referred to as “spooky actions at a distance”[7]. But in 1964 John Bell showed that it is impossible to construct a hidden variable theory that obeys locality and simultaneously reproduces all of the predictions of quantum mechanics [8]. Bell’s theorem fundamentally changed our understanding of quantum theory and today stands as a cornerstone of modern quantum information science.
Bell’s theorem does not prove the validity of quantum mechanics, but it does allow us to test the hypothesis that nature is governed by local realism. The principle of realism says that any system has pre-existing values for all possible measurements of the system. In local realistic theories, these pre-existing values depend only on events in the past lightcone of the system. Local hidden variable theories obey this principle of local realism. Local realism places constraints on the behavior of systems of multiple particles--constraints that do not apply to entangled quantum particles. This leads to different predictions that can be tested in an experiment known as a Bell test. In a typical two-party Bell test, a source generates particles and sends them to two distant parties, Alice and Bob. Alice and Bob independently and randomly choose properties of their individual particles to measure. Later, they compare the results of their measurements. Local realism constrains the joint probability distribution of their choices and measurements. The basis of a Bell test is an inequality that is obeyed by local realistic probability distributions but can be violated by the probability distributions of certain entangled quantum particles [8]. A few years after Bell derived his inequality, new forms were introduced by Clauser, Horne, Shimony and Holt [9], and Clauser and Horne [10] that are simpler to experimentally test.
In a series of landmark experiments, Freedman and Clauser [11] and Aspect, Grangier, Dalibard, and Roger [12–14] demonstrated experimental violations of Bell inequalities using pairs of polarization-entangled photons generated by an atomic cascade.
https://www.researchgate.net/profil...trong-Loophole-Free-Test-of-Local-Realism.pdf
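The CHSH logic described in the quote above can be sketched numerically. What follows is a minimal toy illustration, not a model of the actual experiments quoted here: it compares the quantum correlation for photons entangled with correlated polarizations, E(a, b) = cos 2(a − b), against a simple deterministic local-hidden-variable model, and shows that only the quantum correlations exceed the local-realist bound |S| ≤ 2.

```python
import math
import random

# CHSH quantity: S = E(a,b) + E(a',b) + E(a',b') - E(a,b').
# Local realism requires |S| <= 2; quantum mechanics reaches 2*sqrt(2)
# (Tsirelson's bound) at the settings below.

def quantum_chsh():
    """CHSH value from the quantum correlation E(a,b) = cos(2*(a-b))
    for photons entangled with correlated polarizations."""
    E = lambda x, y: math.cos(2 * (x - y))
    a, a2, b, b2 = 0.0, math.pi / 4, math.pi / 8, 3 * math.pi / 8
    return E(a, b) + E(a2, b) + E(a2, b2) - E(a, b2)

def lhv_chsh(n=100_000, seed=0):
    """CHSH value for a toy local-hidden-variable model: each photon pair
    carries a shared random polarization angle lam, and each detector
    deterministically outputs sign(cos(2*(setting - lam)))."""
    rng = random.Random(seed)
    a, a2, b, b2 = 0.0, math.pi / 4, math.pi / 8, 3 * math.pi / 8
    out = lambda setting, lam: 1 if math.cos(2 * (setting - lam)) >= 0 else -1
    total = 0
    for _ in range(n):
        lam = rng.uniform(0, math.pi)
        # Per pair, this combination is always +2 or -2, so the average
        # can never exceed 2 in magnitude -- the essence of Bell's bound.
        total += (out(a, lam) * out(b, lam) + out(a2, lam) * out(b, lam)
                  + out(a2, lam) * out(b2, lam) - out(a, lam) * out(b2, lam))
    return total / n

S_q = quantum_chsh()   # 2*sqrt(2) ≈ 2.83: violates the local-realist bound
S_l = lhv_chsh()       # stays within |S| <= 2, whatever the hidden variable does
```

No local-hidden-variable assignment can do better than the bound, because each individual pair contributes exactly ±2 to the CHSH combination; the violation by entangled particles is what the quoted experiments measure.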
Cosmic Bell Test: Measurement Settings from Milky Way Stars
Bell’s theorem states that some predictions of quantum mechanics cannot be reproduced by a local-realist theory. That conflict is expressed by Bell’s inequality, which is usually derived under the assumption that there are no statistical correlations between the choices of measurement settings and anything else that can causally affect the measurement outcomes. In previous experiments, this “freedom of choice” was addressed by ensuring that selection of measurement settings via conventional “quantum random number generators” was spacelike separated from the entangled particle creation. This, however, left open the possibility that an unknown cause affected both the setting choices and measurement outcomes as recently as mere microseconds before each experimental trial. Here we report on a new experimental test of Bell’s inequality that, for the first time, uses distant astronomical sources as “cosmic setting generators.” In our tests with polarization-entangled photons, measurement settings were chosen using real-time observations of Milky Way stars while simultaneously ensuring locality. Assuming fair sampling for all detected photons, and that each stellar photon’s color was set at emission, we observe statistically significant ≳7.31σ and ≳11.93σ violations of Bell’s inequality with estimated p values of ≲1.8×10^−13 and ≲4.0×10^−33, respectively, thereby pushing back by ∼600 years the most recent time by which any local-realist influences could have engineered the observed Bell violation.
Phys. Rev. Lett. 118, 060401 (2017) - Cosmic Bell Test: Measurement Settings from Milky Way Stars
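For readers unfamiliar with the σ notation in the abstract above, the quoted p values are consistent with the one-sided tail probability of a standard normal distribution (assuming that is the convention used, which is common for such significance reports):

```python
import math

def gaussian_tail(z):
    """One-sided upper-tail probability of a standard normal at z sigma:
    p = (1/2) * erfc(z / sqrt(2))."""
    return 0.5 * math.erfc(z / math.sqrt(2))

p1 = gaussian_tail(7.31)    # ~1.4e-13, consistent with the quoted <~ 1.8e-13
p2 = gaussian_tail(11.93)   # ~4e-33,  consistent with the quoted <~ 4.0e-33
```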
Violation of the Leggett-Garg Inequality in Neutrino Oscillations
The original goal of LGI [Leggett-Garg Inequality] tests was to demonstrate macroscopic coherence -- that is, that quantum mechanics applies on macroscopic scales up to the level at which many-particle systems exhibit decoherence [3, 8–12]. For this reason, a major focus of recent LGI research has been scaling up to tests with macroscopic systems. Notably, Zhou et al. [9] recently reported finding LGI violation caused by quantum coherence in macroscopic crystals.
LGI tests have another purpose: to test “realism,” the notion that physical systems possess complete sets of definite values for various parameters prior to, and independent of, measurement. “Realism” is often encoded in hidden-variable theories, which allow for systems that are treated as identical according to quantum mechanics to be fundamentally distinguishable through a hidden set of parameters that they possess, such that any measurement on a system reveals a pre-existing value [13]. LGI violations imply that such hidden-variable (or “realistic”) alternatives to quantum mechanics cannot adequately describe a system’s time evolution. Experiments using few-particle systems can test “realism” even if they do not directly address macrorealism [13–18].
https://www.researchgate.net/profil...cillations/links/56c3ef1008ae60234250be1d.pdf
Violation of the Leggett-Garg Inequality in Neutrino Oscillations
The Leggett-Garg inequality, an analogue of Bell’s inequality involving correlations of measurements on a system at different times, stands as one of the hallmark tests of quantum mechanics against classical predictions. The phenomenon of neutrino oscillations should adhere to quantum-mechanical predictions and provide an observable violation of the Leggett-Garg inequality. We demonstrate how oscillation phenomena can be used to test for violations of the classical bound by performing measurements on an ensemble of neutrinos at distinct energies, as opposed to a single neutrino at distinct times. A study of the MINOS experiment’s data shows a greater than 6σ violation over a distance of 735 km, representing the longest distance over which either the Leggett-Garg inequality or Bell’s inequality has been tested.
Phys. Rev. Lett. 117, 050402 (2016) - Violation of the Leggett-Garg Inequality in Neutrino Oscillations
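The structure of the Leggett-Garg bound can be illustrated with a generic two-level oscillation (a hypothetical toy model, not the MINOS neutrino analysis itself). For dichotomic measurements at three equally spaced phases, idealized quantum mechanics gives two-time correlators of the form C = cos(ω Δt), and the Leggett-Garg quantity K3 = C12 + C23 − C13 must satisfy K3 ≤ 1 under macrorealism, yet the quantum prediction reaches 3/2:

```python
import math

def K3(phi):
    """Leggett-Garg quantity K3 = C12 + C23 - C13 for a two-level system
    measured at three equally spaced phases phi = w * tau, where the
    two-time correlators are C(dt) = cos(w * dt)."""
    return 2 * math.cos(phi) - math.cos(2 * phi)

# Macrorealism (realism plus noninvasive measurability) bounds K3 <= 1;
# quantum mechanics maximally violates this at phi = pi/3.
K_max = K3(math.pi / 3)   # = 1.5 > 1: the classical bound is violated
```

The neutrino version quoted above replaces measurements on one system at different times with measurements on an ensemble at different energies, but the inequality being violated has this same form.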