Saturday, March 14, 2009

Inconsistency of general relativity

Concerning general relativity (GR), there exists confusion in the literature regarding the nature of the gravitational field. Einstein identified the existence of gravity with the inertial motion of accelerating bodies (i.e. bodies in free-fall), whereas contemporary physicists identify the existence of gravity with space-time curvature (i.e. tidal forces). The interpretation of gravity as a curvature of space-time is an interpretation Einstein actually did not agree with, due to his rivalry with Minkowski - in a similar way to his later rejection of the existence of black holes and gravitational waves (1, 2). The contemporary interpretation of GR attributed to Einstein by mainstream propaganda in fact differs quite a lot from GR as it stood in Einstein's time. Einstein once wrote: "Mach's idea finds its full development in the ether of the general theory of relativity. According to this theory the metrical qualities of the continuum of space-time differ in the environment of different points of space-time, and are partly conditioned by the matter existing outside of the territory under consideration."

By AWT, every theory which uses more than a single postulate (a nonzero-rank implication tensor, to be more specific) becomes intrinsically inconsistent in a more or less distant perspective; Peano arithmetic (K. Gödel, 1931) and general relativity (GR) are no exceptions. According to the textbook [Kerr, Kerr & Ruth 1999, 621], GR uses the following postulates:
  1. The actions of inertia and gravity (i.e. the forces, or even all observations dependent on the inertial reference frame) are indistinguishable from each other - the so-called weak or strong equivalence principle.
  2. Four-dimensional space-time is curved as a result of the presence of mass (the mass-energy equivalence principle is apparently assumed in the background).
  3. Objects take the shortest path between two points in space-time, the so-called geodesics (the principle of least action of Newtonian dynamics, extended to 4D space-time).
To obscure any connection of GR to Newtonian physics, mainstream propaganda omits the fact that without incorporating the gravitational constant via Newton's inverse-square law we could not derive Einstein's field equations at all. In this way, Newton's gravitational law makes general relativity dependent on and derived from Newtonian physics, rather than vice versa.
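
For reference (not part of the original argument), the place where Newton's constant enters is the weak-field limit: the coupling in the field equations is fixed by demanding that slow-motion, weak-gravity solutions reproduce the Poisson equation of Newtonian gravity:

$$ G_{\mu\nu} = \frac{8\pi G}{c^{4}}\,T_{\mu\nu}, \qquad \nabla^{2}\Phi = 4\pi G\rho \quad\text{(Newtonian limit)} $$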

The Newtonian motivations of GR are collected here. General relativity is AdS-CFT dual to Einstein's original "refractive" approach to gravitational light-bending and to various Varying Speed of Light (VSL) theories, like quantum mechanics (QM), which represent the extrinsic perspective of relativity phenomena. Locally, GR appears as a very general theory if we consider energy spreading in transversal waves as the only source of information. Unfortunately, because the strictly local and causal perspective is only an idealized model of reality, every look into the Universe's future or history undeniably violates GR. Compared to second-order theories like quantum gravity or string theory, this violation can nevertheless be minimized to an arbitrarily low value by narrowing the scope of observation, by introducing a De Sitter background-dependent Poincaré group (Kerr or Cartan geometry), and/or by introducing tachyon interactions in hidden dimensions - usually done unconsciously through "implicit higher-order effects" (the extended GR of Heim, Yilmaz or Bekenstein). In addition, we should distinguish an inconsistency of the GR postulates from inconsistencies of its formal theorems, which often use additional approximations in the background (for example, by ignoring the mass-energy equivalence principle in the case of Einstein's field equations).

The most classical example of the above inconsistency is gravitational lensing, which is a manifestation of quantum uncertainty, as it splits the result of observing a remote object into an odd number of images. Since GR is a strictly causal theory in 4D spacetime, it doesn't allow any manifestation of quantum uncertainty or Lorentz symmetry violation until additional time dimensions are included. This inconsistency manifests most pronouncedly in the cosmological constant problem, because the prediction of the cosmological constant by GR differs from that of QM by roughly 120 orders of magnitude, thus violating the correspondence of GR and QM.
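
A minimal back-of-the-envelope sketch of that discrepancy, assuming a naive Planck-scale cutoff for the quantum vacuum energy density and an observed dark-energy density of roughly 6×10⁻¹⁰ J/m³ (both inputs are my own assumptions, not values from the post):

```python
# Back-of-the-envelope estimate of the cosmological constant discrepancy.
# Assumes a naive Planck-scale cutoff for the QFT vacuum energy density and
# an observed dark-energy density of ~6e-10 J/m^3 (both are rough inputs).
import math

hbar = 1.054571817e-34   # J*s
c = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2

# Planck energy density: E_Planck / l_Planck^3 = c^7 / (hbar * G^2)
rho_planck = c**7 / (hbar * G**2)   # J/m^3
rho_observed = 6e-10                # J/m^3, approximate dark-energy density

ratio = rho_planck / rho_observed
print(f"Planck-scale vacuum density : {rho_planck:.3e} J/m^3")
print(f"Observed vacuum density     : {rho_observed:.3e} J/m^3")
print(f"Discrepancy                 : ~10^{math.log10(ratio):.0f}")
```

Running this gives a ratio of about 10¹²³, the usual "worst prediction in physics" figure.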

Another source of inconsistencies is the local character of the equivalence principle. Inertial and gravitational action can always be distinguished if we consider 4D space-time only. Because every gravitating body inside the observable Universe must be of finite size and therefore always has a center of mass, we can always distinguish gravitational action from acceleration by using a pair of plumb bobs at nonzero distance. The acceleration force resulting from omni-directional space-time collapse or expansion involves higher dimensions, though, so it too can exhibit a center of action. By AWT, the gravitational field of a massive object can be interpreted as an acceleration force resulting from an inhomogeneity in the omni-directional expansion of the Universe. Therefore GR appears fine only if we consider additional dimensions, thus violating 4D causality again. Here we can read about Kip Thorne's method of showing that GR contradicts its own equivalence principle.
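
A toy illustration of the plumb-bob argument, assuming Newtonian gravity around a spherical Earth (the numbers are illustrative, not from the post): two plumb lines a distance d apart both point at the center of mass, so they converge by an angle of roughly d/R, which no uniform acceleration can reproduce.

```python
# Toy illustration of the two-plumb-bob test: in a central gravity field the
# plumb lines converge toward the center of mass; under uniform acceleration
# they stay parallel. Distances are illustrative, not from the original text.
import math

R_earth = 6.371e6      # m, mean Earth radius
d = 10.0               # m, separation of the two plumb bobs (assumed)

# Convergence angle between the two plumb lines (small-angle approximation)
theta = d / R_earth    # radians
print(f"Convergence angle: {theta:.2e} rad "
      f"({math.degrees(theta) * 3600:.3f} arcseconds)")
# Any nonzero convergence distinguishes a central gravity field from pure acceleration.
```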

Due to the extreme gradient of space-time curvature, the nonlocal character of the equivalence principle is especially pronounced at the Planck scale and in the case of black holes, where it manifests itself through various violations of GR from the extrinsic perspective. The classical Einstein field equations suffer from an additional inconsistency because the mass-energy equivalence was neglected during their derivation. This leads to additional supersymmetric phenomena, like precession, the Zeeman effect, fragmentation of the event horizon and surface-tension effects of the gravity field gradient, manifested by dark matter and the Pioneer spacecraft anomaly in particular.

The limited speed of light and the omnidirectional expansion of the Universe itself are sufficient to explain the Pioneer anomaly and the violation of Newton's law. The acceleration of the Pioneer anomaly agrees well with the prediction of MOND theory, a = H·c = (8 ± 1)×10⁻¹⁰ m/s², where H is the Hubble constant. Alternatively, we can consider it a dragging effect of the background CMB photon field; in AWT these explanations are mutually dual. This duality illustrates how the violation of the gravitational law at spacecraft scale is related to the violation of gravity at the Casimir force scale. Because the gravitational law remains violated by the Casimir force only at small distances, such a violation of Newton's law for small accelerations means the equivalence principle is violated for general relativity, too.
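
A quick numerical check of the H·c estimate, assuming H0 ≈ 70 km/s/Mpc (the post does not specify a value, so this is a round assumed input):

```python
# Quick numerical check of the a = H*c estimate quoted above.
# H0 = 70 km/s/Mpc is an assumed round value; the post does not specify one.
c = 2.99792458e8            # m/s
Mpc = 3.0857e22             # m
H0 = 70e3 / Mpc             # 1/s  (70 km/s per megaparsec)

a = H0 * c
print(f"H0 = {H0:.3e} 1/s")
print(f"a  = H0 * c = {a:.2e} m/s^2")
# ~7e-10 m/s^2, the same order as the reported Pioneer anomaly ~8.7e-10 m/s^2
```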

While relativity bothers with the intrinsic perspective only, it neglects the fact that the gravity field at the center of every gravitating body is zero, because of zero space-time curvature there. Therefore every massive object exhibits an inflexion point of space-time curvature, or of the gravity force, at some larger distance. By relativity, on the contrary, the highest curvature should appear exactly at the center, which leads to the Schwarzschild solution with a black hole singularity at the center - which is apparently unphysical, though. The Kerr solution is only a partial improvement of this problem: it merely uses a toroidal symmetry of the singularity instead of a spherical one. We can see that the requirement of zero gravity force at the center of every gravitating object leads to the requirement of a weakly repulsive gravitational force at a distance and to dark matter phenomena, which are manifestations of quantum gravity by their very nature. It's symptomatic of mainstream physics that whole generations of relativists were not bothered by this trivial and apparent paradox.
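
A Newtonian sketch of the field profile being discussed, assuming a uniform-density sphere (an idealization of my own, not from the post): the field is zero at the center, rises linearly to the surface, and then falls off as 1/r².

```python
# Newtonian gravity profile of a uniform-density sphere: g(0) = 0 at the
# center, g grows linearly up to the surface, then falls off as 1/r^2.
# The sphere's mass and radius are illustrative values, not from the text.
G = 6.67430e-11          # m^3 kg^-1 s^-2
M = 5.97e24              # kg   (Earth-like mass, assumed)
R = 6.371e6              # m    (Earth-like radius, assumed)

def g(r):
    """Gravitational acceleration at distance r from the center."""
    if r < R:
        return G * M * r / R**3      # interior: only the enclosed mass pulls
    return G * M / r**2              # exterior: usual inverse-square law

for r in (0.0, 0.5 * R, R, 2 * R):
    print(f"r = {r/R:4.1f} R  ->  g = {g(r):8.3f} m/s^2")
```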


While the inertial properties of the electron and positron are the very same, their behavior in a gravitational field suffers from CP symmetry violation - the antiparticles are attracted by a weaker force than the particles. Antineutrinos should even be expelled by gravity almost completely because of their negative curvature. Axions could even be of negative mass, i.e. a product of tachyon condensation, being solitons - like vortex rings, they explode into photons when halted.

The concept of Feinberg's tachyons brings an even deeper inconsistency into GR. In the sense of classical GR, gravity cannot affect itself, so the gravity field cannot curve the path of gravitational waves. Because the path of photons is curved, this would mean that gravitational waves propagate like tachyons along a shorter path than is generally allowed for photons - which is consistent with AWT, but not with GR in its classical, i.e. Einstein's, formulation, while extended GR allows this by introducing a more general reference frame in hyperspace. As a strictly causal theory in 4D, general relativity allows neither tachyons and/or gravitational geons without introducing the Universe's expansion into hyperspace, nor the formation of gravitational waves, because no environment can generate waves by itself without the presence of objects composed of / living in compacted dimensions, which could serve as a source of inertia. In GR, only a gravitational geon could serve as such a source, stabilized against its collapse by the omni-directional expansion of space-time into hidden dimensions. We can see that general relativity requires the concept of hidden dimensions in the background to work consistently, in the same way as quantum mechanics, which predicts the expansion of all wave packets without the gravitational potential of the Universe's collapse.

From this perspective it is interesting that Einstein obstinately refused the concept of both black holes and gravitational waves (original source removed?), although he should have known about the necessity of the omni-directional expansion of the Universe in relativity quite well, since he called his cosmological constant the "biggest blunder of his life".

Is dark matter composed of antimatter?

The origin of the matter-antimatter asymmetry has been a nagging problem of mainstream physics and cosmology for many years. By AWT, antimatter never disappeared from the observable Universe; it was only finely divided (evaporated) into streaks of dark matter. These streaks surround all massive bodies, including galaxies and stars, so that the concentration of antiparticles is especially easy to detect around the black holes at the centers of most large galaxies. This prediction was presented two years ago, and the recent ATIC and PAMELA observations now seem to support it, although the source is being interpreted as a result of WIMP annihilation. This is particularly because contemporary particle physics is quite desperate for funding: guessing that dark matter consists of exotic particles may mean more funding to search for such particles.


The AWT explanation doesn't consider the WIMP mechanism, though, because WIMPs are supersymmetric bosons mediating the surface-tension forces of strangelets, i.e. large dense clusters of particles, which are supposed to form dark matter here. Such clusters would be of primordial origin, i.e. something like microscopic black holes in the S1/Z2 Randall-Sundrum (RS) braneworld model, which currently remains an unverified hypothesis, as the only stable strangelet-like particles known so far are atomic nuclei. It's highly improbable that swarms of such objects would be responsible for the observed dark matter effects like gravitational lensing or the Pioneer anomaly - and even if they existed, it's not necessary to consider their bosons (i.e. WIMPs) the main source of dark matter gravitation, because no WIMP can exist standalone without its superpartner. Such an explanation therefore just illustrates the deep conceptual confusion of mainstream physics, which is overspecialized, so it cannot confront and reconcile inconsistently labeled ideas coming from various areas of physics.

Instead of this, AWT explains the mechanism of the primordial origin of antimatter more naturally. We can consider black holes as a dense mixture of hot matter and antimatter in the form of a graviton foam, analogous to the foam of density fluctuations formed briefly after the condensation of a supercritical fluid. In this foam, the particles are foam excitations tied to the inner surface gradients of the foam bubbles, while the antiparticles are formed by the outer surface of the membranes (string theory uses a concept of string loops attached to branes instead). But these particle gradients are never quite equivalent: the inner gradients are always of smaller diameter and higher curvature, and therefore more stable than the outer surface gradients. This is a general consequence of the fact that we are always smaller than the observable Universe generation, no matter how far we can observe into it, so the particles forming the environment are always of opposite chirality compared to the particles forming the objects within it.
Currently the quantum foam forming the vacuum is heavily expanded and its bubbles are formed by very thin walls, so there is only a rather subtle difference between the behavior of particles and antiparticles, as characterized by the so-called CP symmetry violation - but at the beginning of the Universe the situation was completely different. The graviton foam was very dense there and it was formed by rather spherical bubbles with thick walls. Inside such a foam, the particles of antimatter were readily transformed into particles of matter, in the same way that small bubbles in a foam collapse in favor of the larger ones, or small flakes of snow sublime in favor of large crystals, thus making the snow "grainy". If such a mixture cooled slowly, it would transform completely back into radiation again - but we were lucky: due to the fast expansion of space-time during the inflation era, a subtle portion of the matter and antimatter particles was left unreacted, so we are still here, surrounded by voluminous streaks of dark matter composed of sparse unreacted particles of matter and antimatter, kept separated against gravitational collapse by EM charge interactions (note the connection of AWT to H. Alfvén's ambiplasma cosmology here).
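
The bubble part of the analogy is ordinary curvature-driven coarsening, added here for reference: by the Young-Laplace relation, the excess pressure inside a bubble of radius $r$ with surface tension $\gamma$ is

$$ \Delta P = \frac{2\gamma}{r} \quad\text{(single interface; } \tfrac{4\gamma}{r}\text{ for a thin two-sided film)}, $$

so smaller bubbles sit at higher pressure and shrink in favor of larger ones.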


Because of their negative curvature, the antiparticles tend to move toward the gravity field, so they separate from the resulting condensate of matter and surround it, in the same way that the particles forming the outer gradient of a soap bubble membrane surround those forming its inner surface. The Standard Model, in combination with inflationary cosmology, provides a mainstream explanation of this evolution. At its very beginning the Universe was very dense, so that the weak nuclear force responsible for the matter-antimatter interaction was a long-distance interaction there. Because, as we know, matter annihilates with antimatter in direct contact, i.e. whenever the weak interaction takes place, matter annihilated with antimatter readily. But because the Universe expanded fast during inflation, the weak nuclear force became a short-distance interaction (and has remained so until now) - so that a subtle portion of the antimatter particles remained safely separated from the rest of the matter.

A similar phenomenon occurs during the accretion of matter and during explosions of collapsars, i.e. quasars, neutron and quark stars. While matter and antimatter annihilate in direct contact with the formation of gamma radiation (i.e. an EM field), in the presence of a strong electromagnetic or gravitational field this process can be reversed via so-called matter-antimatter pair production. A strong magnetic field causes a separation of the resulting particles and antiparticles near magnetars and spinning black holes, whereas a gravity field is generally less effective in doing this, but it can separate even particles which differ only by their lepton charge, not just their EM charge. In general, massive particles are attracted to the inner part of the gravity field gradient of positive curvature, whereas antiparticles are attracted by the field gradient beyond its inflexion point, i.e. by its negative curvature. This resulted in the separation of matter and antimatter particles. Note that the same mechanism (only applied at a different dimensional scale) could explain the chirality distribution observed in nearby galaxies along streaks of dark matter and the polarization of the CMB.


While such a process appears completely different from the perspective of classical physics, it in fact has a number of analogies there, because it's driven by the curvature of surface gradients in all cases. During fast condensation, a mixture of water droplets of different sizes appears. But the smaller droplets always have a higher vapor pressure, so they evaporate in favor of the larger ones. When such a condensation occurs as a consequence of fast vapor expansion (for example in a Wilson cloud chamber expanded by a piston), the smaller droplets travel to a large distance from the center of expansion until they evaporate completely. Afterwards they form areas of more saturated vapor surrounding the more "lucky" droplets, which condensed at rest and which would otherwise tend to evaporate. By AWT, the dimensional scale separating matter evaporation from condensation roughly corresponds to the human dimensional scale and to the wavelength of CMB photons, as it brings the highest complexity into the Universe's evolution. Particles smaller than CMB photons would condense into particles of matter, whereas the larger particles would expand into streaks of dark matter. From a certain point of view, antimatter lives along an inverse/reciprocal time arrow inside the observable Universe - it evaporates while the normal matter condenses, which is the reason why antimatter is finely dispersed over the whole Universe.
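
The droplet part of the analogy is the Kelvin equation, added here for reference: the equilibrium vapor pressure $p(r)$ over a droplet of radius $r$ rises as the radius shrinks,

$$ \ln\frac{p(r)}{p_{\mathrm{sat}}} = \frac{2\gamma V_m}{r R T}, $$

where $\gamma$ is the surface tension, $V_m$ the molar volume of the liquid, $R$ the gas constant and $T$ the temperature - so small droplets evaporate in favor of large ones, exactly the coarsening behavior described above.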


The above example can be generalized in terms of unsteady-state thermodynamics, as described by the Le Chatelier principle. For example, when we cool a sooting candle flame by blowing cold wind on it or by inserting a cold teaspoon, we obtain a thin layer of soot. In this analogy, the flame corresponds to the collapsar radiating energy, and the soot and oxygen are analogous to common matter and antimatter. As we know, the particles of these components react readily in mutual contact (i.e. they "annihilate") - but when separated quickly from the flame by cooling of the mixture or by adiabatic expansion of the environment, they can remain intact, and the fast cooling freezes this metastable equilibrium, preserving the original components.

By AWT we can find analogies to the matter-antimatter separation even in biological and social evolution, where a so-called division of labor exists between different molecules and sexes, which exhibit a sort of chirality, too. The AWT model explains why this chirality was important during the formation of life in liposomes at the human scale. Currently the energy density of society is relatively high, which enables a high level of emancipation. But at the very beginning of human society, the survival / fitness of the community depended heavily on the division of labor, so even today men travel to jobs at a distance while women work near their homes - although they're no longer required to do so. But despite the low level of symmetry violation, a complete exchange of male/female roles would lead to the annihilation of social structures (the family condensate in particular), in the same way that switching the roles of left-handed proteins and right-handed sugars would lead to the immediate death of most living organisms. These analogies help us understand how deeply chirality works inside our Universe at different dimensional scales.

AWT considers the existence of chirality a byproduct of the Universe's inflation, i.e. a consequence of its fast cooling after the brane collision, but an analogous process occurs at most dimensional scales, for example during the collapse of stars and black holes, just in a more or less pronounced way. The concept of omnidirectional Universe expansion makes it possible to describe this process in terms of spatialized time, i.e. as a process of nonuniform propagation of transversal and longitudinal energy waves through hyperspace, but the removal of space-time symmetry prevents us from using many temporal analogies which could simplify a deeper understanding of this unsteady process.

We can only ask why astronomers, who have been looking for the missing antimatter in the Universe since the very beginning of Big Bang theory (Lemaître 1931), did not consider the possibility that dark matter may be composed of antimatter from the very beginning of dark matter observations (Fritz Zwicky, 1933). Put another way - why should the public pay scientists for seventy years of research into other explanations if they ignore the simplest one? This situation isn't new, and it just illustrates that science needs thorough public control in the same way as every other branch of society. The arrogance of mainstream science proponents is out of place here, as it often covers up incompetence.

Tuesday, March 10, 2009

Is reality classical?

This post is a reflection on a recent analysis by Marco Frasca named "Ballentine and the decoherence program", where he disputes the question of "environmental decoherence".

Aether Wave Theory interprets reality as nested density fluctuations of a hypothetical Boltzmann gas, i.e. the Aether. It's a neoclassical local view consistent with the neighboring reality, from which it follows that when sighting along the time dimension into the past (cosmic scale) or the future (Planck scale), relativity or quantum mechanics phenomena will emerge. The general relativity perspective has its Aether background in the correspondence of Einstein's field equations and the thermodynamic state equation δQ = T dS, which was proven by T. Jacobson. For example, AWT interprets Newton's first law of inertia and relativistic motion along geodesics in a gravity field as a sort of diffusion, and gravitational lensing as an optical lensing phenomenon.
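
For reference, Jacobson's derivation applies the Clausius relation across local Rindler horizons, with the Unruh temperature of the accelerated observer and an entropy proportional to the horizon area; demanding that it hold for every such horizon yields the Einstein field equations:

$$ \delta Q = T\,dS, \qquad T = \frac{\hbar a}{2\pi c k_B}, \qquad S = \frac{k_B c^{3} A}{4 G \hbar} \;\;\Longrightarrow\;\; G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}}\,T_{\mu\nu} $$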



In a dual way, the quantum mechanics perspective has its background in the Thomas-Fermi approximation, which recovers the semi-classical limit for large QM systems. The Hamilton-Jacobi equations for this Hamiltonian are then the same as the geodesics on the Riemann manifold, and the Hamiltonian flow can be interpreted as a diffusive motion along geodesics as well. In accordance with this, most quantum mechanics phenomena (like the result of the double-slit experiment) can easily be interpreted by semi-classical models in AWT; quantum entanglement is no exception.
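
For reference, the semi-classical limit mentioned here runs through the WKB ansatz: writing the wave function as $\psi = A\,e^{iS/\hbar}$ in the Schrödinger equation and letting $\hbar \to 0$ leaves the classical Hamilton-Jacobi equation for the phase $S$:

$$ \frac{\partial S}{\partial t} + \frac{(\nabla S)^{2}}{2m} + V = \frac{\hbar^{2}}{2m}\frac{\nabla^{2} A}{A} \;\;\xrightarrow{\;\hbar\to 0\;}\;\; \frac{\partial S}{\partial t} + \frac{(\nabla S)^{2}}{2m} + V = 0 $$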



As we already know, every combination of these two theories (string theory and quantum gravity theory in particular) leads to a landscape of an infinite number of solutions, thus reflecting the intrinsic randomness of the Aether reality anyway. Neither relativity nor quantum mechanics, based as they are on empirically ad-hoc postulates, has its own intersubjectively accepted interpretation, with the exception of the famous dictum "Shut up and calculate" (probably attributed to Feynman by mistake), which basically replaces the understanding of reality by its numerical regression. Does it mean reality is non-classical just because it can be described, in distant and rare cases, by two (or more) mutually inconsistent boundary theories? Should we ignore the parallelistic and holistic approach to understanding reality just because multicomponent systems cannot be handled well by the consecutive logic of formal math? Should we replace the local description of reality by its description from a distant past or future perspective? Is the principal limitation of formal language supposed to become a crucial problem in general understanding? Is transversal wave motion along a single time arrow the only way of spreading information?

Well, I don't think so - nevertheless this question still has great significance for formally thinking individuals, who don't care about a consistent description of reality as long as it doesn't threaten their pet theory. What they fear for is their social credit and the further grant support of their theories, like the ultra-conservative proponents of the Holy Church in the medieval era. This is, in my opinion, the main problem mainstream science has with the Aether concept as a whole. For such people, the lack of a general and transparent understanding of reality is highly welcome in the background - every deeper understanding may reveal their theories as naively ad-hoc and inconsistent.

Such an approach is nothing very new in human history, indeed. From this perspective, scientists behave like the priests or medicine men of the modern era - adherence to their incomprehensible formal models brings them meritocracy, social respect and safe jobs separated from public control, and helps them survive more easily in human society. As a result, the contemporary learning system has many connections to the novitiate of sectarian communities - high-school novices are purposely trained in formal thinking with minimal connection to the underlying physical models, and they're not allowed to publish until they pass various tests of "compatibility in thinking".

Please note that this stance is of an emergent nature: with the exception of rare individuals, most scientists are people who are "rather" open to further progress in "other" areas, at least proclamatively. But their competitive behavior and bias toward a formal description of reality accumulate through society, thus forming a driving force which becomes a solid brake on further understanding at large scales. Those who know how slowly a magnetic field penetrates a superconductor (the Meissner-Ochsenfeld effect) may understand why I consider every rigid sectarian community a boson condensate and/or a black hole, as observed/interacted with from outside.

William James: "A great many people think they're thinking when they are merely rearranging their prejudices."