Wednesday, September 21, 2005

Point --> Line --> Plane <---> Point <-- String <-- Brane

Under the heading of Klein's Ordering of the Geometries:

A theorem which is valid for a geometry in this sequence is automatically valid for the ones that follow. The theorems of projective geometry are automatically valid theorems of Euclidean geometry. We say that topological geometry is more abstract than projective geometry, which in turn is more abstract than Euclidean geometry.

Now the usual thinking here has been placed under intense scrutiny by the introduction of a new way in which to look at "geometry", one that has gone through a "revision" in thinking.

New trigonometry is a sign of the times

Lubos Motl introduces this topic and link in his blog entry, and this has caused great consternation in how I am seeing things. I see Lisa Randall might counter this in terms of what the brain is capable of, in line with this revisionary seeing and the comparative examples of the geometry Lubos links.

Dangling Particles, by LISA RANDALL
Published: September 18, 2005, New York Times

Lisa Randall:
Most people think of "seeing" and "observing" directly with their senses. But for physicists, these words refer to much more indirect measurements involving a train of theoretical logic by which we can interpret what is "seen." I do theoretical research on string theory and particle physics and try to focus on aspects of those theories we might experimentally test. My most recent research is about extra dimensions of space. Remarkably, we can potentially "see" or "observe" evidence of extra dimensions. But we won't reach out and touch those dimensions with our fingertips or see them with our eyes. The evidence will consist of heavy particles known as Kaluza-Klein modes that travel in extra-dimensional space. If our theories correctly describe the world, there will be a precise enough link between such particles (which will be experimentally observed) and extra dimensions to establish the existence of extra dimensions.

But first, before I get to the essence of the title of my blog entry, I'd like to prep the mind for what is seemingly a consistent move towards geometry that has its basis in applicability to physics, and move through GR to a vast new comprehension in non-Euclidean geometries. Must we now move backwards from what we had gained in insight, or was it recognition of the "length scales" that makes us ask how such a dynamical view could ever be assigned to the Euclidean description under the guise of brane world recognitions?

Moving Backwards?

What exactly do I mean here?

Well, the idea is that if you move to fifth dimensional views, there are ways to wrap this within our "Brains":) We then see that the dynamical nature of our neurons has found acceptable ways in which to see this brane feature, as well as approaches using new processes in geometrical considerations, like those linked by Lubos.

Dealing with 5D world

Thomas Banchoff is instrumental here in showing us that fifth dimensional views can be utilized on our computer screens. Such comparisons, reduced to a two dimensional frame, make it very easy to accept this new way in which to attack the dynamical nature of reality.

How indeed could our computer screen act as a liaison with the reality of our world, when seen through screen imagery effects, where all the rules of order have been safely applied for inspection and consistency in physics approaches?
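The kind of dimensional reduction Banchoff describes can be sketched in a few lines of code. This is only a toy illustration, not Banchoff's actual software: it projects the sixteen vertices of a 4-D hypercube down to the two dimensions of a screen by repeatedly applying a simple perspective divide. The function names and the viewer distance are my own illustrative choices.

```python
# A minimal sketch (not Banchoff's actual software) of reducing a
# higher-dimensional object to a two-dimensional frame: project the
# 16 vertices of a 4-D hypercube down to 2-D by repeatedly dropping
# one coordinate with a simple perspective divide.
from itertools import product

def project_down(point, viewer_distance=3.0):
    """Perspective-project an n-D point to (n-1)-D by scaling the
    first n-1 coordinates by the distance along the last axis."""
    *rest, last = point
    scale = viewer_distance / (viewer_distance - last)
    return tuple(c * scale for c in rest)

def to_2d(point):
    """Apply the projection until only two coordinates remain."""
    while len(point) > 2:
        point = project_down(point)
    return point

# Vertices of the 4-D hypercube, each coordinate in {-1, +1}.
vertices_4d = list(product((-1.0, 1.0), repeat=4))
screen_points = [to_2d(v) for v in vertices_4d]

print(len(screen_points))                       # 16 vertices
print(all(len(p) == 2 for p in screen_points))  # each is now 2-D
```

The same loop works from any number of dimensions, which is the point of the screen analogy: the reduction to the flat frame is mechanical, while the structure survives it.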


  1. Hi Plato,

    Current teaching of general relativity, as causing a flat surface like a rubber sheet to curve into a manifold, is unhelpful to further progress in unifying quantum space with gravitation, since physical space fills volume, not surface area.

Georges Louis LeSage, between 1747-82, explained gravity classically as a shadowing effect of space pressure by masses. The speculative, non-quantitative mechanism was published in French and is available online (G.S. LeSage, Lucrece Newtonien, Nouveaux Memoires De L’Academie Royal de Sciences et Belle Letters, 1782, pp. 404-31). Because gravity depends on the mass within the whole earth’s volume, LeSage predicted that the atomic structure was mostly void, a kind of nuclear atom which was confirmed by Rutherford’s work in 1911. LeSage argued that there is some kind of pressure in space, and that masses shield one another from the space pressure, thus being pushed together by the unshielded space pressure on the opposite side. Feynman explained that the major advance of general relativity, the contraction term, shortens the radius of every mass. He does not derive the equation, but I've now modified my page to begin with this.

The 4-D spacetime as a brane on 5-D seems to be the same thing as saying that the 4-D world is a hologram of 5-D. The key thing to keep in mind is that the extra dimensions must be rolled up, so this brane and hologram analogy is a mathematical equivalence.

    There is no more basis for thinking of branes like a sheet than there is for thinking of gravitation in general relativity as causing an indentation in a two dimensional rubber sheet.

    The analogy is a poor one, and explains why manifold speculations have led nowhere.

    It is very difficult to make progress on more classical and clear alternatives like the LeSage mechanism. Traditionally this is simply ignored on the basis that a physical mechanism requires a spacetime fabric, and the assumption usually made is that the fabric is like a gas.

    This gas would simply fill in any geometric shadows between an apple and the masses of particles in the earth, preventing gravity from working. In addition, the gas would have viscosity and impose drag, slowing down the planets in a way which is contrary to knowledge of the solar system.

Therefore, if the LeSage mechanism is right, the source of gravity is a type of light speed radiation which is not stopped by the surface of a planet, but can penetrate through it and be stopped only by a very small shielding area of each fundamental particle.

    LeSage recognised this to some extent (not perhaps the light speed of the radiation), and turned around the obvious objections to make the prediction of the nuclear atom 163 years before it was widely accepted as a fact.

    It is a pity that there seems to be no way to draw anybody's attention to the error in the Hubble interpretation. Spacetime implies that light-observed distance is equivalent mathematically to time past. I cannot see why the Hubble expansion constant is only expressed as velocity/distance, and not velocity/time past which has units of acceleration and is useful to the LeSage problem.

    It seems as if the whole subject needs to be reviewed and presented with great clarity before anyone will take it seriously. I've revised my internet pages on the subject, but Google has now removed them from being listed. It seems that the editor of New Scientist thinks he can suppress news not only through not publishing it, but also by getting Google to stop listing sites which contain extracts from his emails.

    I've given up on arXiv, PRL, etc. As far as they are concerned, this subject is a waste of time and not worth reviewing.

    Best wishes,

  2. Hi Plato,

The two explanations to the gravity force/hierarchy problem which you mention above seem untestable.

The first is that the weakness of gravity can be 'explained away by saying that it leaks into the extra dimensions'. This looks like weak (dare I say crackpot?) science to me, because it cannot be checked quantitatively.

    The second explanation, the Randall-Sundrum world, where gravity is weak because one dimension is stretched out, also lacks a quantitative prediction that can test it easily.

The supposedly crackpot idea of LeSage, with a proper interpretation of the Hubble expansion in the big bang, gives testable predictions. It would be compatible with superstring theory if, as 't Hooft suggests, the 5th dimension in M-theory is the spacetime fabric, filling 3-D space with graviton radiation.

It is vital to note that gravity goes at the speed of light, not instantly, so the relevant picture of the big bang is the observed one in which the velocity of recession appears to increase with time past. Thus recession speeds vary from zero locally towards c (radius divided by time past, c = RH) at the edge of the universe, while the corresponding times vary from 15,000 million years (t = 1/H) toward zero with increasing observable distance. This implies acceleration of a = dv/dt = (RH)/(1/H) = RH^2 = cH = 6 x 10^-10 ms^-2. Although a very small acceleration, a large amount of mass is involved so the force is very big and can induce much larger accelerations by implosion (focussing of the inward force of the big bang on small areas around shielding masses). The recession speed increases with distance partly because we are looking back to earlier times, and partly because we are also looking to bigger distances. When the universe was a cloud of hydrogen gas it was expanding under pressure. Now that it has clumped into galaxies, that mechanism no longer exists, so it is now coasting, and affected only by forces like gravity.
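As a quick back-of-the-envelope check, the cH figure quoted above follows from the comment's own numbers. This sketch takes the 15,000-million-year Hubble time straight from the comment; the constants are round illustrative values, not precision data.

```python
# Back-of-the-envelope check of a = cH, using the comment's
# Hubble time of 15,000 million years (t = 1/H).
c = 2.998e8                    # speed of light, m/s
seconds_per_year = 3.156e7
hubble_time = 15e9 * seconds_per_year   # t = 1/H, in seconds
H = 1.0 / hubble_time          # Hubble parameter, 1/s
a = c * H                      # the acceleration cH, m/s^2
print(a)                       # roughly 6e-10 m/s^2, as quoted
```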

    Best wishes,

  3. Nigel,

    You must look into this site for information about the "missing energy". This is not something that is made up for the heck of it.

I follow the definitions of what theoretical means, and in this case the experiment has evidence to support this.

It is easy to hand out labels of crackpotism, and it seems they are flying wild from all sides. But let's put this aside, and follow through.

The powerful LHC will greatly improve the chances for detecting missing-energy events and other prominent extra-dimension effects.

There are ways in which to perceive these extra dimensions. Think about this very carefully.

    Ask yourself this question.

If it takes so much energy to produce these particle results from collisions, then all of the energy should be accounted for.

    Is this correct?

  4. Hi Plato,

    My reply is the revised paper:

Please check Lunsford's 6-dimensional space-time: will the fifth dimension still be the spacetime fabric by the holographic conjecture?

    Rueda and Haisch, Physical Review A v49 p 678 (1994), showed that the virtual radiation of electromagnetism can cause inertial mass, and in Annalen der Physik v14 p479 they do the same for gravity in general relativity. The virtual radiation acts on fundamental particles of mass, quarks and electrons, which are always charged. It doesn’t ignore all the quarks in a neutron just because they have no net charge. A gauge boson going at light speed doesn’t discriminate between neutrons and protons, only the fundamental quarks inside them. Therefore, the background field of virtual radiation pressure besides causing inertia (and the contraction of moving objects in the direction of motion) also causes gravity (and the contraction in the direction of gravitational fields, the reduction in GR).

    The coupling constant for electromagnetism is then naturally related to gravity. Between similar charges, the electric field causing the radiation pressure adds up like a series of batteries. In any line, there will be approximately equal numbers of both charges, so the sum will be zero. The only way it can add up is by a drunkard’s walk, where the statistics show the net charge will be the square root of the number of similar charges in the universe. Since there are 10^80 charges, electromagnetism will be 10^40 times stronger than gravity. Attraction is due to opposite charges screening each other and being pushed together by the radiation from the surrounding universe, while repulsion is due to the fact that nearby charges exchange gauge bosons which aren’t redshifted by cosmic expansion (and so produce a mutual recoil), while the radiation pushing them together is red-shifted by the big bang and so is weaker.
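The drunkard's-walk estimate in the paragraph above can be checked numerically: the net sum of N random +1/-1 charges has a typical (root-mean-square) size of sqrt(N), which is exactly what scales the 10^80 charges down to the 10^40 ratio. A minimal Monte Carlo sketch follows; the step count and trial count are illustrative choices of mine.

```python
# Monte Carlo check of the drunkard's-walk claim: the net sum of
# N random +1/-1 charges is typically sqrt(N), not N and not 0.
import math
import random

random.seed(0)   # reproducible illustration

N = 10_000       # charges along one line (illustrative)
TRIALS = 2_000   # random arrangements to average over

total = 0.0
for _ in range(TRIALS):
    net = sum(random.choice((-1, 1)) for _ in range(N))
    total += net * net
rms = math.sqrt(total / TRIALS)   # RMS net charge

print(rms)                 # close to sqrt(N) = 100
print(math.sqrt(10**80))   # 1e+40: the 10^40 factor in the text
```

So the square-root scaling itself is standard statistics; whether it applies to the charges of the universe in the way the comment suggests is the speculative step.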

    Lunsford’s CERN document server paper discounts the historical attempts by Kaluza, Pauli, Klein, Einstein, Mayer, Eddington and Weyl. It proceeds to the correct unification of general relativity and Maxwell’s equations, finding 4-d spacetime inadequate: ‘… We see now that we are in trouble in 4-d. The first three [dimensions] will lead to 4th order differential equations in the metric. Even if these may be differentially reduced to match up with gravitation as we know it, we cannot be satisfied with such a process, and in all likelihood there is a large excess of unphysical solutions at hand. … Only first in six dimensions can we form simple rational invariants that lead to a sensible variational principle. The volume factor now has weight 3, so the possible scalars are weight -3, and we have the possibilities [equations]. In contrast to the situation in 4-d, all of these will lead to second order equations for the g, and all are irreducible - no arbitrary factors will appear in the variation principle. We pick the first one. The others are unsuitable … It is remarkable that without ever introducing electrons, we have recovered the essential elements of electrodynamics, justifying Einstein’s famous statement …’

    D.R. Lunsford shows that 6 dimensions in SO(3,3) should replace the Kaluza-Klein 5-dimensional spacetime, unifying GR and electromagnetism: ‘One striking feature of these equations ... is the absent gravitational constant - in fact the ratio of scalars in front of the energy tensor plays that role. This explains the odd role of G in general relativity and its scaling behavior. The ratio has conformal weight 1 and so G has a natural dimensionfulness that prevents it from being a proper coupling constant - so this theory explains why ordinary general relativity, even in the linear approximation and the quantum theory built on it, cannot be regularized.’

    Best wishes,