Monday, January 26, 2009

AWT, emergence and Hardy's paradox

Recently, fundamental experimental evidence for Hardy's paradox was reported, which essentially means that quantum mechanics is no longer a purely statistical theory governed by Bell inequalities. The informal reading of this paradox is simple: if no combination of mutually non-commuting quantities can be measured with certainty, how can we be certain of that claim itself? Might some combination exist which violates such uncertainty? In this way the uncertainty principle of quantum mechanics violates itself in the background, thereby enabling so-called "weak" measurements.
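
A minimal numerical sketch of Hardy's logic (an illustrative textbook-style choice of state and measurement bases, not the photonic setup of the experiment mentioned above): take the two-qubit state (|00⟩ + |01⟩ + |10⟩)/√3 and measure each qubit either in the computational basis or along |−⟩ = (|0⟩ − |1⟩)/√2. The four joint probabilities printed below cannot be reproduced by any local hidden-variable assignment, and it is this kind of counterfactual chain that the weak-measurement experiments address.

```python
import numpy as np

# Two-qubit Hardy-type state (|00> + |01> + |10>) / sqrt(3)
psi = np.array([1, 1, 1, 0], dtype=complex) / np.sqrt(3)

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
minus = (ket0 - ket1) / np.sqrt(2)               # |->

def proj(v):
    """Projector |v><v|."""
    return np.outer(v, v.conj())

def prob(p1, p2):
    """Joint probability of projector p1 on qubit 1 and p2 on qubit 2."""
    return float(np.real(psi.conj() @ np.kron(p1, p2) @ psi))

# Hardy's four conditions:
print("P(A1=1, A2=1) =", prob(proj(minus), proj(minus)))  # = 1/12 > 0
print("P(A1=1, B2=0) =", prob(proj(minus), proj(ket0)))   # = 0, so A1=1 implies B2=1
print("P(B1=0, A2=1) =", prob(proj(ket0), proj(minus)))   # = 0, so A2=1 implies B1=1
print("P(B1=1, B2=1) =", prob(proj(ket1), proj(ket1)))    # = 0, contradicting both implications
```

In roughly one run out of twelve both A-measurements give 1; local realism would then force both B-values to be 1, yet the joint probability of that outcome is exactly zero.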

This was demonstrated recently for entangled photon pairs, and it can serve as evidence that even photons have a distinct "shape", a manifestation of the photon's rest mass. This is because the explicit formulation of quantum mechanics neglects gravitational phenomena and the rest mass concept in the background: by the Schrödinger equation, every free particle should gradually disperse throughout the whole Universe, which clearly contradicts everyday observation. Such behavior is effectively prohibited by the acceleration following from the omnidirectional expansion of the Universe, i.e. the gravity potential, so that every localizable particle has a nonzero surface curvature and is conditionally stable at the human scale. From the nested character of Aether fluctuations it follows that more than a single level of "weak" measurement should be achievable here. After all, the fact that we can interact with other people and objects without becoming completely entangled with them can serve as evidence that "weak" observation is quite common at the human scale.
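
To see how fast the Schrödinger dispersion mentioned above would act, the textbook result for a free Gaussian wave packet, σ(t) = σ0 √(1 + (ħt / 2mσ0²)²), is enough; the short Python sketch below assumes an electron with an arbitrarily chosen initial width of 1 Å.

```python
import numpy as np

hbar = 1.054571817e-34   # reduced Planck constant [J*s]
m_e = 9.109e-31          # electron mass [kg]

def packet_width(sigma0, t, m=m_e):
    """Width of a free Gaussian wave packet after time t
    (free Schrödinger equation, no confining potential)."""
    return sigma0 * np.sqrt(1.0 + (hbar * t / (2.0 * m * sigma0**2))**2)

sigma0 = 1e-10           # initial width ~ 1 angstrom (illustrative choice)
for t in (0.0, 1e-15, 1e-12, 1e-9, 1.0):
    print(f"t = {t:8.1e} s  ->  sigma = {packet_width(sigma0, t):.3e} m")
```

Within a picosecond the packet is already thousands of times wider than an atom, which is why any mechanism that keeps particles localized has to be added on top of the bare equation.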

By AWT, every strictly causal theory violates itself in a more or less distant perspective due to emergence phenomena. While the classical formulation of general relativity appears self-consistent (being strictly based on a single causality arrow), a deeper analysis reveals that the derivation of the Einstein field equations neglects the stress-energy contribution of the gravitational field itself (as argued by Yilmaz, Heim, Bekenstein and others), which follows from mass-energy equivalence. This makes relativity an implicit and infinitely fractal theory in the same way as quantum mechanics (its AdS/CFT dual). For example, gravitational lensing, the multiple event horizons of charged black holes and/or the dark matter phenomena can serve as evidence of spontaneous symmetry breaking of time arrows and of the manifestation of quantum uncertainty and supersymmetry within relativity. This uncertainty leads to a landscape of many solutions for every quantum field or quantum gravity theory built on a combination of mutually inconsistent (i.e. different) postulates.

Such behavior follows Gödel's incompleteness theorems, by which the formal proof of rules valid for sufficiently large sets of natural numbers becomes more difficult than the rules themselves, and thus unresolvable by its very nature. This is a consequence of emergence, which introduces a principal dispersion into the observation of large causal objects and/or phenomena; this dispersion cannot be avoided, or such artifacts wouldn't be observable anymore. In this way every strictly formal (i.e. sequential-logic based) proof of a natural law becomes violated in a more or less distant perspective, in line with the "More is Different" thesis. AWT demonstrates that this emergence accompanies the causal (i.e. transverse-wave based) spreading of energy through a large system of scale-invariant symmetry fluctuations (unparticles), which behave like soap foam with respect to the spreading of light and enable the universe (and all objects inside it) to be observed from both the extrinsic and the intrinsic perspective simultaneously. The mutual interference of these two perspectives leads to the quantization of observable reality, which is intrinsically chaotic and extrinsically causal by its very nature.

In this connection it's useful (and sometimes entertaining) to follow the deductions of formally thinking theorists such as Lubos Motl, whose strictly formal thinking leads him into deep contradiction and confrontation with common sense, and occasionally with the rest of the world. It may appear somewhat paradoxical that a fanatic proponent of string theory, which introduced the duality concept into physics, has such deep problems with dual or plural thinking. The paradox is logical enough, though, once we realize how complex string theory is and how strictly formal the thinking required for its comprehension must be.

By such way, "emergence group" of dense Aether theory makes understanding of observable reality quite transparent and easy task at sufficiently general level. It still doesn't mean, here's not still a lotta things to understand at the deeper levels, dedicated to individual formal theories.

1 comment:

Zephir said...

A commercial computer is judged by how fast it solves a given problem. So when Google was deciding whether to buy the D-Wave Two in 2013, speed was central: "Several times we'd tried to get Google to buy one of our machines," Rose explains. "The first two failed. One of the conditions of the purchase on this third [try] was that Google be allowed to set a bunch of acceptance criteria." What these tests revealed is that as D-Wave packs more qubits onto its chip--from the 128-qubit D-Wave-1 to the current 512 qubits to the thousands that will be needed to take on "hard" optimization and machine learning problems--the time needed to solve the problem appears to rise exponentially, in precisely the same way it does on a conventional computer. In other words, there appears to be no speedup over existing machines on the very problems the D-Wave is intended to solve.
At the moment when the computational power of classical computers is already limited by the Heisenberg uncertainty principle, the power of quantum computers cannot beat it anyway. Quantum computers are very rough and approximate, and to replicate the precision of classical ones you would need to average their results many times over - so in the end you arrive at the same if not worse performance (i.e. the same product of computational precision and computational speed).
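
A rough sketch of that averaging argument, under the simplifying assumption that each run of a noisy analog or quantum device returns the true answer plus independent random noise: the statistical error shrinks only as 1/√N, so every extra digit of precision costs about a hundred times more repetitions, and the product of precision and speed stays roughly where it started.

```python
import numpy as np

rng = np.random.default_rng(0)

true_value = 1.0
noise_sigma = 0.1        # per-run noise of the hypothetical noisy device
time_per_run = 1.0       # arbitrary time unit per repetition

for n_runs in (1, 100, 10_000, 1_000_000):
    # Average n_runs noisy readings; the error shrinks only as 1/sqrt(n_runs).
    samples = true_value + noise_sigma * rng.standard_normal(n_runs)
    measured_error = abs(samples.mean() - true_value)
    expected_error = noise_sigma / np.sqrt(n_runs)
    total_time = n_runs * time_per_run
    print(f"runs = {n_runs:>9}  expected error ~ {expected_error:.1e}  "
          f"measured {measured_error:.1e}  time = {total_time:.0e}")
```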