Friday, October 16, 2009

Can a Time-travelling Higgs sabotage the LHC?

This post is motivated by a recent paper by H. Nielsen and M. Ninomiya and the related NewScientist article and New York Times essay, in which any organized effort to find the Higgs boson would be inherently predestined to fail by the laws of thermodynamics and quantum mechanics. The article proposes this aspect of time-travel behavior as an explanation of why the U.S. Congress stopped the funding of the SSC in 1993, and why the LHC itself suffered an embarrassing meltdown shortly after starting up last year. This story illustrates that in contemporary science every nonsense can be promoted, provided it's supported by formal math, thus evading the accusation of crackpotism, which obliged some formally thinking bloggers to vindicate this generally accepted difference between speculation and crackpottery. Anyway, as a result of the ongoing discussion, arXiv has reclassified the related papers into the "less serious" General Physics section.

The problem with the commonly used justification of physical models by abstract math and/or even computer simulations is indeed a violation of the causal hierarchy, in which formal models are always based on predicate logic, not vice versa. Therefore, if the underlying model is proven logically wrong, then all the formal derivations based on it become wrong as well, as the destiny of some formally brilliant though logically misunderstood models has demonstrated clearly (the hollow Earth theory, the geocentric model of epicycles, the interpretation of the luminiferous Aether model by the Michelson-Morley experiments, etc.). In Aether theory the Higgs model plays no significant role as a causal background, because AWT assumes there are infinitely many levels of space-time compactification, which manifest in the real world as many complex high-dimensional interactions inside complex ecosystems, like the Borneo jungle or human society. Constrained string theory models of twelve or twenty-six dimensions cannot be considered the ultimate causal background of the Universe for practical reasons, and the Higgs boson background of the Standard Model even less so, because the observable world is apparently richer and more dimensional than these models consider.

In addition, the Higgs model is too vague to be considered seriously, because it has more than a single formulation: the Higgs model in classical physics is based on different phenomena than the Higgs-Anderson model in boson condensates, and its technical derivation consists in a mere reshuffling of degrees of freedom by transforming the Higgs Lagrangian in a gauge-invariant manner. The well-known "hierarchy problem" implies that quantum corrections can make the mass of the Higgs particle arbitrarily large, since virtual particles with arbitrarily large energies are allowed in quantum mechanics. Therefore, in my opinion, physicists are just mixing various concepts and mechanisms at each level of the physical model's derivation, from the phenomenological to the formal one, which effectively leads to the prediction of many types of Higgs bosons of different rest mass and behavior, thus making such a hypothesis untestable.
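
For illustration, the standard one-loop expression behind this hierarchy-problem claim (a textbook formula quoted here for reference; the exact numerical prefactor depends on conventions) reads schematically:

$$\Delta m_H^2 \;\sim\; -\frac{y_t^2}{8\pi^2}\,\Lambda_{\mathrm{UV}}^2$$

where $y_t$ is the top Yukawa coupling and $\Lambda_{\mathrm{UV}}$ is the cutoff energy up to which virtual particles are allowed. The correction grows with the square of the cutoff, which is exactly why the Higgs mass can be driven arbitrarily high.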

We are facing this conceptual confusion clearly at the moment when mainstream physics presents some discrete predictions about the Higgs boson. Each particle that couples to the Higgs field has a Yukawa coupling, too. The mass of a fermion is proportional to its Yukawa coupling, meaning that the Higgs boson couples most strongly to the most massive particles. This means that the most significant corrections to the Higgs mass will originate from the heaviest particles, most prominently the top quark. From the Standard Model it follows that the top quark, to which the Higgs boson couples through the Yukawa coupling of its left- and right-handed components, has nearly the same rest mass (173.1 ± 1.3 GeV/c²) as the one predicted for the Higgs boson (178.0 ± 4.3 GeV/c²). We can compare the way in which the Higgs is supposed to be proved and detected at the LHC:

[Figure: the expected Higgs boson production and decay signature at the LHC]

And the way in which the formation of top-quark pairs was already evidenced and detected at Fermilab:

[Figure: top-quark pair production and its dilepton decay channel at Fermilab]

Because the observation agrees well both in the Higgs mass and in the expected decay mechanism, it basically means the Higgs boson has already been observed as the dilepton channel of top-quark pair decay, and no further research is necessary, much less investments into LHC experiments, from the perspective of evidence for this particular Higgs boson model, which indeed falsifies the above hypothesis of Nielsen & Ninomiya as well. Of course, the conflict of many research interests with the needs of society keeps these connections secret more effectively than any thinkable model of a time-traveling Higgs could. In other words, physicists didn't recognize the duality of the heaviest particle of matter (the top quark) and the Higgs boson, in a similar way to how they didn't recognize the duality of the most lightweight photons and gravitational waves at the opposite side of the energy density spectrum.
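
As a quick sanity check of the claimed numerical agreement (my own back-of-envelope arithmetic, using the standard tree-level relation between a fermion mass and its Yukawa coupling, with the Higgs vacuum expectation value $v \approx 246$ GeV):

$$m_t = \frac{y_t\, v}{\sqrt{2}} \approx 173.1~\mathrm{GeV}/c^2 \;\Rightarrow\; y_t \approx 0.99, \qquad \frac{|178.0 - 173.1|}{\sqrt{4.3^2 + 1.3^2}} \approx \frac{4.9}{4.5} \approx 1.1\,\sigma$$

so the two quoted values are indeed statistically compatible at roughly the one-sigma level.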

This stance is nothing very new in contemporary physics, which often looks for evidence in incorrect places, while neglecting or even refusing clear evidence from the dual view of AWT. We can compare it to the search for the event horizon during travel into a black hole, while from a more distant/general perspective it's evident we have crossed it already. The "unsuccessful" search for the luminiferous Aether, while ignoring the dense Aether model, is the iconic case of this confusion, but we can find many other analogies here. For example, scientists are looking for evidence of Lorentz symmetry violation and hidden dimensions in violations of the gravitational law, while ignoring the Casimir force, or they're trying to search for gravitational waves while filtering the noise out of the detectors, just because they don't understand their subject at a transparent, intuitive level.

Apparently, the additional cost of research and the general confusion of lay society are the logical consequence of this collective ignorance, while it keeps many scientists in their safe jobs and salaries in the same way as the mysticism of the Catholic Church in the medieval era, so I don't believe in comprehension and subsequent atonement in real time.

Monday, October 12, 2009

Rachel Bean: GR is probably (98%) wrong

This post is motivated by a recent finding of Rachel Bean, who found that various WMAP, 2MASS, SDSS and COSMOS data concerning the Sachs-Wolfe effect, galaxy distributions, the weak lensing shear field and the cosmic expansion history don't fit the general theory of relativity (GR for short). The reactions of Sean Carroll and/or Lubos Motl are careful, as one might expect: "well, this could be challenging, but probably irrelevant, because GR has proved itself so many times, but science should care about such details, mumbojumbo..."

Jeez, but how was GR derived eighty years ago? This theory puts an equivalence between the curvature of space and the spatial distribution of the energy of the gravitational potential, as borrowed from Newton's theory (because we really have no better source for the dependence of the gravitational potential on distance than the centuries-old gravitational law). So, if we know the mass of an object, we can compute the spatial distribution of the potential energy, so we can compute the spatial distribution of the space-time curvature: end of story (of GR). Or not?
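
For concreteness, this is the standard weak-field dictionary the above chain relies on (a textbook relation, quoted here for reference): the Newtonian potential of a mass $M$ fixes the leading term of the space-time metric,

$$\varphi(r) = -\frac{GM}{r}, \qquad g_{00} \approx -\left(1 + \frac{2\varphi(r)}{c^2}\right)$$

so the mass gives the potential, and the potential gives the curvature.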

Not at all, because from the very same theory it follows that energy density is equivalent to mass density by the E = mc² formula, so we are facing a new distribution of matter in space, which should lead to another distribution of space-time curvature and gravitational potential energy, which leads to another distribution of matter, and so on, recursively. Such an implicit character of GR was never mentioned in the classical field theory of GR and the corresponding textbooks, so it's nothing strange that it violates all the observations available by now. But it's still a prediction of the GR postulates, and it fits well the fractal, implicit character of the Universe and AWT; it just requires deriving Einstein's field equations more consequently and thoroughly.
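
To make the recursion explicit, here is a toy numerical sketch of it (my own illustration under Newtonian assumptions, not a derivation from the GR field equations): a spherical source whose external field energy, E = GM²/2R, is converted via E = mc² into extra source mass and fed back until self-consistency. All numbers and names are illustrative.

```python
# Toy fixed-point iteration: Newtonian gravity with the field's own
# energy density fed back as a mass source (illustrative sketch only).

G = 6.674e-11  # gravitational constant [m^3 kg^-1 s^-2]
c = 2.998e8    # speed of light [m/s]

def self_coupled_mass(M0, R, tol=1e-15, max_iter=100):
    """Iterate M -> M0 + (field-energy mass) until self-consistent.

    For a spherical source of bare mass M0 and radius R, the Newtonian
    field energy outside R is E_field = G*M^2/(2*R); E = m*c^2 turns it
    into extra source mass, which changes the field, and so on.
    """
    M = M0
    for i in range(1, max_iter + 1):
        M_new = M0 + G * M**2 / (2 * c**2 * R)  # add field-energy mass
        if abs(M_new - M) < tol * M:
            return M_new, i
        M = M_new
    return M, max_iter

# Example: solar values (bare mass [kg] and radius [m])
M_sun, R_sun = 1.989e30, 6.96e8
M_eff, steps = self_coupled_mass(M_sun, R_sun)
print(f"converged in {steps} iterations")
print(f"relative correction: {(M_eff - M_sun) / M_sun:.3e}")
```

For solar values the fixed point is reached within a few iterations and the relative correction is of order 10⁻⁶, which shows why the recursion converges for ordinary objects instead of running away.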

Wow, this could be a real breakthrough in physics and a challenging task for a new Einstein, or not? Of course not, and here we come to the real problem of contemporary science: such an approach is fifty years old already, and it's even used in dark matter theory, in fact. Such a modification would lead to the quantization of gravity and the long-awaited quantum gravity; the only problem for formally thinking physmatics is that it brings quantum chaos into the ordered world of formal relativity too, as there is a (nearly) infinite number of ways to derive it, and all of them are still only approximations of the real situation. The names like Cartan, Evans, Heim, Yilmaz, J. Bekenstein or Rudi V. Nieuwenhove are all dealing with this approach in a more or less straightforward form, but this cannot change the thinking of incompetent, though loudly blogging people, who invested two or more years of their life into learning GR derivations until they became "productive" with them (as measured by the number of articles published), so now they simply have no time and/or mental capacity to understand something new, let alone to extrapolate it.

Of course, it's not just a problem of a few disoriented bloggers, but of the inertia of the whole mainstream community, whose size prohibits the introduction of new ideas and which has chosen the formal approach to classical theories as a salary generator for its safe life. In this way, every new idea or derivation is simply forgotten, until it's revealed again in another, slightly different connection, when everyone appears surprised: how is it possible that GR isn't working properly?