## Sunday, February 28, 2010

Logic is the art of thinking; grammar, the art of inventing symbols and combining them to express thought; and rhetoric, the art of communicating thought from one mind to another, the adaptation of language to circumstance. — Sister Miriam Joseph

Painting by Cesare Maccari (1840-1919), Cicero Denounces Catiline.

In medieval universities, the trivium comprised the three subjects taught first: grammar, logic, and rhetoric. The word is a Latin term meaning “the three ways” or “the three roads” forming the foundation of a medieval liberal arts education. This study was preparatory for the quadrivium. The trivium is implicit in the De nuptiis of Martianus Capella, although the term was not used until the Carolingian era when it was coined in imitation of the earlier quadrivium.[1] It was later systematized in part by Petrus Ramus as an essential part of Ramism.

# Formal grammar

A formal grammar (sometimes simply called a grammar) is a set of rules of a specific kind, for forming strings in a formal language. The rules describe how to form strings from the language's alphabet that are valid according to the language's syntax. A grammar does not describe the meaning of the strings or what can be done with them in whatever context — only their form.

Formal language theory, the discipline which studies formal grammars and languages, is a branch of applied mathematics. Its applications are found in theoretical computer science, theoretical linguistics, formal semantics, mathematical logic, and other areas.

A formal grammar is a set of rules for rewriting strings, along with a "start symbol" from which rewriting must start. Therefore, a grammar is usually thought of as a language generator. However, it can also sometimes be used as the basis for a "recognizer"—a function in computing that determines whether a given string belongs to the language or is grammatically incorrect. To describe such recognizers, formal language theory uses separate formalisms, known as automata theory. One of the interesting results of automata theory is that it is not possible to design a recognizer for certain formal languages.

Parsing is the process of recognizing an utterance (a string in natural languages) by breaking it down to a set of symbols and analyzing each one against the grammar of the language. Most languages have the meanings of their utterances structured according to their syntax—a practice known as compositional semantics. As a result, the first step to describing the meaning of an utterance in language is to break it down part by part and look at its analyzed form (known as its parse tree in computer science, and as its deep structure in generative grammar).
# Logic

As a discipline, logic dates back to Aristotle, who established its fundamental place in philosophy. The study of logic is part of the classical trivium.

Averroes defined logic as "the tool for distinguishing between the true and the false"[4]; Richard Whately, as "the Science, as well as the Art, of reasoning"; and Frege, as "the science of the most general laws of truth". The article Definitions of logic provides citations for these and other definitions.

Logic is often divided into two parts, inductive reasoning and deductive reasoning. The first is drawing general conclusions from specific examples, the second drawing logical conclusions from definitions and axioms. A similar dichotomy, used by Aristotle, is analysis and synthesis. Here the first takes an object of study and examines its component parts, the second considers how parts can be combined to form a whole.
Logic is also studied in argumentation theory.[5]

## Tuesday, February 23, 2010

### Calorimetric Equivalence Principle Test

With Stefan shutting down the blog temporarily, I thought to gather my thoughts here.

# Gravitomagnetism

This approximate reformulation of gravitation as described by general relativity makes a "fictitious force" appear in a frame of reference different from that of a moving, gravitating body. By analogy with electromagnetism, this fictitious force is called the gravitomagnetic force, since it arises in the same way that a moving electric charge creates a magnetic field, the analogous "fictitious force" in special relativity. The main consequence of the gravitomagnetic force, or acceleration, is that a free-falling object near a massive rotating object will itself rotate. This prediction, often loosely referred to as a gravitomagnetic effect, is among the last basic predictions of general relativity yet to be directly tested.
Indirect validations of gravitomagnetic effects have been derived from analyses of relativistic jets. Roger Penrose had proposed a frame dragging mechanism for extracting energy and momentum from rotating black holes.[2] Reva Kay Williams, University of Florida, developed a rigorous proof that validated Penrose's mechanism.[3] Her model showed how the Lense-Thirring effect could account for the observed high energies and luminosities of quasars and active galactic nuclei; the collimated jets about their polar axis; and the asymmetrical jets (relative to the orbital plane).[4] All of those observed properties could be explained in terms of gravitomagnetic effects.[5] Williams’ application of Penrose's mechanism can be applied to black holes of any size.[6] Relativistic jets can serve as the largest and brightest form of validations for gravitomagnetism.
A group at Stanford University is currently analyzing data from the first direct test of GEM, the Gravity Probe B satellite experiment, to see if they are consistent with gravitomagnetism.


While I am not as far along in terms of the organization of your thought process (inexperienced in terms of the education), I am holding the ideas of Mendeleev in mind as I look at this topic you've gathered. And Newton as well, but not in the way one might have deferred to him as the basis of gravity research.

It is more on the idea of what we can create in reality given all the elements at our disposal. This is also the same idea in mathematics, that all the information is there and only has to be discovered. Such a hierarchy in thinking is also the idea of geometrical presence stretched to higher dimensions, as one would point to matter assumptions as to a higher order present in the development of the material of Earth as to the planet.

***

Uncle Al,

Overview: A parity calorimetry test offers a 33,000-fold improvement in EP anomaly sensitivity in only two days of measurements.

We are not so different.... This quest may not be apparent to many, yet it is a simple question about what is contracted to help understand "principles of formation." These had been theoretically developed in terms of the genus figures (Stanley Mandelstam), and we understand that this progression has been mathematically slow.

So we scientifically build this experimental progression.

But indeed, it's a method in terms of moving from "the false vacuum to the true?" What is the momentum called toward materialization?

Such an emergent feature, while discussing some building block model, gives some indication of a "higher order principle" that is not clearly understood, while from a condensed matter theorist's point of view, this is an emergent feature?

Best,

Bordeaux, France is 44.83 N

http://www.mazepath.com/uncleal/lajos.htm#b7
***

According to general relativity, the gravitational field produced by a rotating object (or any rotating mass-energy) can, in a particular limiting case, be described by equations that have the same form as the magnetic field in classical electromagnetism. Starting from the basic equation of general relativity, the Einstein field equation, and assuming a weak gravitational field or reasonably flat spacetime, the gravitational analogs to Maxwell's equations for electromagnetism, called the "GEM equations", can be derived. GEM equations compared to Maxwell's equations in SI are:[7] [8][9][10]

| GEM equations | Maxwell's equations |
| --- | --- |
| $\nabla \cdot \mathbf{E}_\text{g} = -4 \pi G \rho_\text{g}$ | $\nabla \cdot \mathbf{E} = \frac{\rho_\text{em}}{\epsilon_0}$ |
| $\nabla \cdot \mathbf{B}_\text{g} = 0$ | $\nabla \cdot \mathbf{B} = 0$ |
| $\nabla \times \mathbf{E}_\text{g} = -\frac{\partial \mathbf{B}_\text{g}}{\partial t}$ | $\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}$ |
| $\nabla \times \mathbf{B}_\text{g} = -\frac{4 \pi G}{c^2} \mathbf{J}_\text{g} + \frac{1}{c^2} \frac{\partial \mathbf{E}_\text{g}}{\partial t}$ | $\nabla \times \mathbf{B} = \frac{1}{\epsilon_0 c^2} \mathbf{J}_\text{em} + \frac{1}{c^2} \frac{\partial \mathbf{E}}{\partial t}$ |

where:

• $\mathbf{E}_\text{g}$ is the gravitational (gravitoelectric) field and $\mathbf{B}_\text{g}$ the gravitomagnetic field;
• $\rho_\text{g}$ is the mass density and $\mathbf{J}_\text{g}$ the mass current density;
• $\rho_\text{em}$ and $\mathbf{J}_\text{em}$ are the electric charge and current densities;
• $G$ is the gravitational constant, $\epsilon_0$ the vacuum permittivity, and $c$ the speed of light.
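As a rough numerical aside (my own sketch, not from the source): matching $\nabla \cdot \mathbf{E}_\text{g} = -4\pi G \rho$ against $\nabla \cdot \mathbf{E} = \rho_\text{em}/\epsilon_0$ suggests an effective "gravitational permittivity" of magnitude $1/(4\pi G)$, and a few lines of Python show how enormously this differs from $\epsilon_0$:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
eps0 = 8.854e-12   # vacuum permittivity, F/m

# Matching div(E_g) = -4*pi*G*rho against div(E) = rho_em/eps0
# gives an effective "gravitational permittivity" (the sign encodes
# that like masses attract rather than repel):
eps_g = 1.0 / (4 * math.pi * G)

print(f"eps_g ~ {eps_g:.3e}  vs  eps0 = {eps0:.3e}")
print(f"ratio eps_g/eps0 ~ {eps_g / eps0:.1e}")
```

The huge ratio (around 10^20 in SI units, before even accounting for the relative sizes of typical charges and masses) is one way of seeing why gravitomagnetic effects are so hard to detect directly.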

## Monday, February 22, 2010

### Physicists Discover How to Entangle at High Temperatures

While I do not like to just echo the world of information, it is important to me to see how we can use entanglement to give us information about quantum gravity. Is it possible?

Entanglement is the weird quantum process in which two objects share the same existence. So a measurement on one object immediately influences the other, no matter how far apart they may be.
Entanglement is a strange and fragile thing. Sneeze and it vanishes. The problem is that entanglement is destroyed by any interaction with the environment and these interactions are hard to prevent. So physicists have only ever been able to study and exploit entanglement in systems that do not interact easily with the environment, such as photons, or at temperatures close to absolute zero where the environment becomes more benign.

In fact, physicists believe that there is a fundamental limit to the thermal energies at which entanglement can be usefully exploited. And this limit is tiny, comparable to the very lowest temperatures.
Today, Fernando Galve at the University of the Balearic Islands in Spain and a few buddies show how this limit can be dramatically increased. The key behind their idea is the notion of a squeezed state.
In quantum mechanics, Heisenberg's uncertainty principle places important limits on how well certain pairs of complementary properties can be observed. For example, the more accurately you measure position, the less well you can determine momentum. The same is true of energy and time and also of the phase and amplitude of a quantum state.

Physicists have learnt how to play around with these complementary observables to optimise the way they make measurements. They've discovered that they can trade their knowledge of one complementary observable for an improvement in the other. See more here: Physicists Discover How to Entangle at High Temperatures
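As a toy illustration of that trade-off (my own sketch, not from the paper): a squeezed vacuum state reduces the variance of one quadrature by $e^{-2r}$ at the cost of inflating the other by $e^{+2r}$, leaving the product pinned at the Heisenberg bound:

```python
import math

hbar = 1.0  # natural units

def quadrature_variances(r):
    """Variances of the two quadratures of a squeezed vacuum state
    with squeezing parameter r."""
    var_x = (hbar / 2) * math.exp(-2 * r)  # squeezed quadrature
    var_p = (hbar / 2) * math.exp(+2 * r)  # anti-squeezed quadrature
    return var_x, var_p

for r in (0.0, 0.5, 1.0):
    vx, vp = quadrature_variances(r)
    # the product saturates the uncertainty bound (hbar/2)^2 for any r
    print(f"r={r}: var_x={vx:.4f}, var_p={vp:.4f}, product={vx * vp:.4f}")
```

No information is gained for free: the knowledge traded away from one observable shows up as extra uncertainty in the other.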

## Saturday, February 20, 2010

### Economy, as Science

A shift in paradigm can lead, via the theory-dependence of observation, to a difference in one's experiences of things and thus to a change in one's phenomenal world. — On Thomas Kuhn

Control the information, you control the people? :) Again, as heartfelt and idealistic as you can become in your efforts, it's not enough to cry out in political verbiage, because you'll always end up with another person saying that it is only a political perspective. That it is the progressive conservative you don't like, and their leader? It's not enough.

So what do you do?

Do you succumb to the frustration that what is moving as a sub-culture working from the inside/out, is the idea that you can build a better consensus from what is moving the fabric of society to know that we can change the outcome as to what Canada shall become as well?

They (a conspiratorial thought) act as a force that is undermining the public perception, while society has not grasped the full understanding of what has been done to them. Society having been cast to fighting at the "local level to advance a larger agenda?"

Does it not seem that once you occupy the mind in such close quarter conflagrations that mind has been circumvented from the larger picture?

Pain, and emotional turmoil does this.

Historically, once the fire has been started, like some phoenix, a new cultural idealism manifests as to what the individual actually wants, when they are in full recognition that "as a force" they moved forward in a democratic compunction, as a government in waiting, to advance the principles by which it can stand as the public mind.

However, the incommensurability thesis is not Kuhn's only positive philosophical thesis. Kuhn himself tells us that “The paradigm as shared example is the central element of what I now take to be the most novel and least understood aspect of [The Structure of Scientific Revolutions]” (1970a, 187). Nonetheless, Kuhn failed to develop the paradigm concept in his later work beyond an early application of its semantic aspects to the explanation of incommensurability. The explanation of scientific development in terms of paradigms was not only novel but radical too, insofar as it gives a naturalistic explanation of belief-change. Naturalism was not in the early 1960s the familiar part of philosophical landscape that it has subsequently become. Kuhn's explanation contrasted with explanations in terms of rules of method (or confirmation, falsification etc.) that most philosophers of science took to be constitutive of rationality. Furthermore, the relevant disciplines (psychology, cognitive science, artificial intelligence) were either insufficiently progressed to support Kuhn's contentions concerning paradigms, or were antithetical to them (in the case of classical AI). Now that naturalism has become an accepted component of philosophy, there has recently been interest in reassessing Kuhn's work in the light of developments in the relevant sciences, many of which provide corroboration for Kuhn's claim that science is driven by relations of perceived similarity and analogy to existing problems and their solutions (Nickles 2003b, Nersessian 2003). It may yet be that a characteristically Kuhnian thesis will play a prominent part in our understanding of science.
I would advance that the word "science" in quote above, be changed to "economy."

What paradigmatic solution has been advanced that such a thing can turn over the present equatorial function assigned to the public mind, so that we will be in better control of our destinies as Canadians?

Precursors to such changes are revolutions in the thought patterns established as functionary pundits of money-oriented societies. They have become "fixed to a particular agenda." Rote systems assumed and brought up in, extolled so that the highest moral obligation is to live well and, on the way, fix ourselves to debt-written obligations that shall soon overcome the sensibility of what it shall take to live?

Forced upon them is the understanding that we had become a slave to our reason, and a slave to a master disguised as what is healthy and knows no boundaries? A capitalistic dream.

Update:

# Money Supply and Energy: Is The Economy Inherently Unstable?

## Tuesday, February 16, 2010

### Article From New York Times and More

Brookhaven National Laboratory

HOT A computer rendition of 4-trillion-degree Celsius quark-gluon plasma created in a demonstration of what scientists suspect shaped cosmic history.

In Brookhaven Collider, Scientists Briefly Break a Law of Nature

The Brookhaven scientists and their colleagues discussed their latest results from RHIC in talks and a news conference at a meeting of the American Physical Society Monday in Washington, and in a pair of papers submitted to Physical Review Letters. “This is a view of what the world was like at 2 microseconds,” said Jack Sandweiss of Yale, a member of the Brookhaven team, calling it, “a seething cauldron.”

Among other things, the group announced it had succeeded in measuring the temperature of the quark-gluon plasma as 4 trillion degrees Celsius, “by far the hottest matter ever made,” Dr. Vigdor said. That is 250,000 times hotter than the center of the Sun and well above the temperature at which theorists calculate that protons and neutrons should melt, but the quark-gluon plasma does not act the way theorists had predicted.
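The quoted comparison is easy to sanity-check (a sketch; the solar-core temperature of roughly 1.5×10^7 K is my assumption, not stated in the article — and at 4 trillion degrees the difference between Celsius and kelvin is negligible):

```python
T_qgp = 4e12        # quark-gluon plasma temperature, kelvin
T_sun_core = 1.5e7  # approximate temperature at the center of the Sun, kelvin

ratio = T_qgp / T_sun_core
print(f"QGP is ~{ratio:,.0f} times hotter than the Sun's core")
```

The result lands in the neighborhood of the article's "250,000 times hotter" figure.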

Instead of behaving like a perfect gas, in which every quark goes its own way independent of the others, the plasma seemed to act like a liquid. “It was a very big surprise,” Dr. Vigdor said, when it was discovered in 2005. Since then, however, theorists have revisited their calculations and found that the quark soup can be either a liquid or a gas, depending on the temperature, he explained. “This is not your father’s quark-gluon plasma,” said Barbara V. Jacak, of the State University at Stony Brook, speaking for the team that made the new measurements.

It is now thought that the plasma would have to be a million times more energetic to become a perfect gas. That is beyond the reach of any conceivable laboratory experiment, but the experiments colliding lead nuclei in the Large Hadron Collider outside Geneva next winter should reach energies high enough to see some evolution from a liquid to a gas.

***

Violating Parity with Quarks and Gluons
by Sean Carroll of Cosmic Variance
This new result from RHIC doesn’t change that state of affairs, but shows how quarks and gluons can violate parity spontaneously if they are in the right environment — namely, a hot plasma with a magnetic field.

So, okay, no new laws of physics. Just a much better understanding of how the existing ones work! Which is most of what science does, after all.

***

# Quark–gluon plasma

A QGP is formed at the collision point of two relativistically accelerated gold ions in the center of the STAR detector at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory.

A quark-gluon plasma (QGP) or quark soup[1] is a phase of quantum chromodynamics (QCD) which exists at extremely high temperature and/or density. This phase consists of (almost) free quarks and gluons, which are the basic building blocks of matter. Experiments at CERN's Super Proton Synchrotron (SPS) first tried to create the QGP in the 1980s and 1990s: the results led CERN to announce indirect evidence for a "new state of matter"[2] in 2000. Current experiments at Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC) are continuing this effort.[3] Three new experiments running on CERN's Large Hadron Collider (LHC), ALICE,[4] ATLAS and CMS, will continue studying properties of QGP.


## General introduction

The quark-gluon plasma contains quarks and gluons, just as normal (baryonic) matter does. The difference between these two phases of QCD is that in normal matter each quark either pairs up with an anti-quark to form a meson or joins with two other quarks to form a baryon (such as the proton and the neutron). In the QGP, by contrast, these mesons and baryons lose their identities and dissolve into a fluid of quarks and gluons.[5] In normal matter quarks are confined; in the QGP quarks are deconfined.
Although the experimental high temperatures and densities predicted as producing a quark-gluon plasma have been realized in the laboratory, the resulting matter does not behave as a quasi-ideal state of free quarks and gluons, but, rather, as an almost perfect dense fluid.[6] Actually the fact that the quark-gluon plasma will not yet be "free" at temperatures realized at present accelerators had been predicted already in 1984 [7] as a consequence of the remnant effects of confinement.

### Why this is referred to as "plasma"

A plasma is matter in which charges are screened due to the presence of other mobile charges; for example: Coulomb's Law is modified to yield a distance-dependent charge. In a QGP, the color charge of the quarks and gluons is screened. The QGP has other analogies with a normal plasma. There are also dissimilarities because the color charge is non-abelian, whereas the electric charge is abelian. Outside a finite volume of QGP the color electric field is not screened, so that volume of QGP must still be color-neutral. It will therefore, like a nucleus, have integer electric charge.
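The modified Coulomb's law mentioned above can be sketched numerically (my own illustration of Debye screening in an ordinary abelian plasma, not from the article): the bare $q/r$ potential picks up an exponential factor $e^{-r/\lambda_D}$, so the charge effectively seen at distance $r$ decays:

```python
import math

def coulomb(q, r):
    """Bare Coulomb potential (constants folded into q for simplicity)."""
    return q / r

def screened(q, r, debye_length):
    """Screened (Yukawa-type) potential: mobile charges in the plasma
    exponentially suppress the field beyond the Debye length."""
    return (q / r) * math.exp(-r / debye_length)

q, lam = 1.0, 1.0  # illustrative units
for r in (0.5, 1.0, 2.0, 5.0):
    print(f"r={r}: bare={coulomb(q, r):.3f}, screened={screened(q, r, lam):.3f}")
```

In the QGP it is the non-abelian color charge, rather than the electric charge, that gets screened in this way.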

### How the QGP is studied theoretically

One consequence of this difference is that the color charge is too large for the perturbative computations which are the mainstay of QED. As a result, the main theoretical tool for exploring the theory of the QGP is lattice gauge theory. The transition temperature (approximately 175 MeV) was first predicted by lattice gauge theory. Since then lattice gauge theory has been used to predict many other properties of this kind of matter. The AdS/CFT correspondence is a new and interesting conjecture allowing insights into the QGP.

### How it is created in the lab

The QGP can be created by heating matter up to a temperature of 2×10^12 kelvin, which amounts to 175 MeV per particle. This can be accomplished by colliding two large nuclei at high energy (note that 175 MeV is not the energy of the colliding beam). Lead and gold nuclei have been used for such collisions at CERN SPS and BNL RHIC, respectively. The nuclei are accelerated to ultrarelativistic speeds and slammed into each other while Lorentz contracted. They largely pass through each other, but a resulting hot volume called a fireball is created after the collision. Once created, this fireball is expected to expand under its own pressure, and cool while expanding. By carefully studying this flow, experimentalists hope to put the theory to test.
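The two figures quoted above are consistent: dividing 175 MeV by Boltzmann's constant gives the stated temperature (a sketch of the arithmetic using standard constants):

```python
k_B = 1.380649e-23   # Boltzmann constant, J/K
eV = 1.602177e-19    # joules per electronvolt

E = 175e6 * eV   # 175 MeV expressed in joules
T = E / k_B      # equivalent temperature in kelvin
print(f"T ~ {T:.2e} K")
```

This comes out at roughly 2×10^12 K, matching the figure in the text.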

### How the QGP fits into the general scheme of physics

QCD is one part of the modern theory of particle physics called the Standard Model. Other parts of this theory deal with electroweak interactions and neutrinos. The theory of electrodynamics has been tested and found correct to a few parts in a trillion. The theory of weak interactions has been tested and found correct to a few parts in a thousand. Perturbative aspects of QCD have been tested to a few percent. In contrast, non-perturbative aspects of QCD have barely been tested. The study of the QGP is part of this effort to consolidate the grand theory of particle physics.
The study of the QGP is also a testing ground for finite temperature field theory, a branch of theoretical physics which seeks to understand particle physics under conditions of high temperature. Such studies are important to understand the early evolution of our universe: the first hundred microseconds or so. While this may seem esoteric, this is crucial to the physics goals of a new generation of observations of the universe (WMAP and its successors). It is also of relevance to Grand Unification Theories or 'GUTS' which seek to unify the four fundamental forces of nature.

## Expected properties

### Thermodynamics

The cross-over temperature from the normal hadronic to the QGP phase is about 175 MeV, corresponding to an energy density of a little less than 1 GeV/fm^3. For relativistic matter, pressure and temperature are not independent variables, so the equation of state is a relation between the energy density and the pressure. This has been found through lattice computations, and compared to both perturbation theory and string theory. This is still a matter of active research. Response functions such as the specific heat and various quark number susceptibilities are currently being computed.

### Flow

The equation of state is an important input into the flow equations. The speed of sound is currently under investigation in lattice computations. The mean free path of quarks and gluons has been computed using perturbation theory as well as string theory. Lattice computations have been slower here, although the first computations of transport coefficients have recently been concluded. These indicate that the mean free time of quarks and gluons in the QGP may be comparable to the average interparticle spacing: hence the QGP is a liquid as far as its flow properties go. This is very much an active field of research, and these conclusions may evolve rapidly. The incorporation of dissipative phenomena into hydrodynamics is another recent development that is still in an active stage.

### Excitation spectrum

Does the QGP really contain (almost) free quarks and gluons? The study of thermodynamic and flow properties would indicate that this is an over-simplification. Many ideas are currently being evolved and will be put to test in the near future. It has been hypothesized recently that some mesons built from heavy quarks (such as the charm quark) do not dissolve until the temperature reaches about 350 MeV. This has led to speculation that many other kinds of bound states may exist in the plasma. Some static properties of the plasma (similar to the Debye screening length) constrain the excitation spectrum.

## Experimental situation

Those aspects of the QGP which are easiest to compute are not the ones which are the easiest to probe in experiments. While the balance of evidence points towards the QGP being the origin of the detailed properties of the fireball produced in the RHIC, this is the main barrier which prevents experimentalists from declaring a sighting of the QGP. For a summary see 2005 RHIC Assessment.
The important classes of experimental observations are

## Formation of quark matter

In April 2005, formation of quark matter was tentatively confirmed by results obtained at Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC). The consensus of the four RHIC research groups was that they had created a quark-gluon liquid of very low viscosity. However, contrary to what was at that time still the widespread assumption, it is yet unknown from theoretical predictions whether the QCD "plasma", especially close to the transition temperature, should behave like a gas or liquid[8]. Authors favoring the weakly interacting interpretation derive their assumptions from lattice QCD calculations, where the entropy density of quark-gluon plasma approaches the weakly interacting limit. However, since both the energy density and correlations show significant deviations from the weakly interacting limit, it has been pointed out by many authors that there is in fact no reason to assume a QCD "plasma" close to the transition point should be weakly interacting, like electromagnetic plasma (see, e.g., [9]).

## Friday, February 12, 2010

### The Last Question by Isaac Asimov

The problem of heat can be a frustrating one if one must contend with computer chips, and how this may have resulted in a reboot of the machine (or its death) into a better state of existence than what was previously used in working model form.

So the perfection is the very defining model of a super race that is devoid of all the trappings of human form, that can be ruled by the mistakes of combining body parts, from the Frankenstein sense to what the new terminator models have taken over... but they are not human?

“You ask Multivac. I dare you. Five dollars says it can’t be done.”
Adell was just drunk enough to try, just sober enough to be able to phrase the necessary symbols and operations into a question which, in words, might have corresponded to this: Will mankind one day without the net expenditure of energy be able to restore the sun to its full youthfulness even after it had died of old age?
Or maybe it could be put more simply like this: How can the net amount of entropy of the universe be massively decreased?
Multivac fell dead and silent. The slow flashing of lights ceased, the distant sounds of clicking relays ended.
Then, just as the frightened technicians felt they could hold their breath no longer, there was a sudden springing to life of the teletype attached to that portion of Multivac. Five words were printed: INSUFFICIENT DATA FOR MEANINGFUL ANSWER.

***

## Timeframe for heat death

From the Big Bang through the present day and well into the future, matter and dark matter in the universe is concentrated in stars, galaxies, and galaxy clusters. Therefore, the universe is not in thermodynamic equilibrium and objects can do physical work.[11], §VID. The decay time of a roughly galaxy-mass (10^11 solar masses) supermassive black hole due to Hawking radiation is on the order of 10^100 years,[12] so entropy can be produced until at least that time. After that time, the universe enters the so-called dark era, and is expected to consist chiefly of a dilute gas of photons and leptons.[11], §VIA. With only very diffuse matter remaining, activity in the universe will have tailed off dramatically, with very low energy levels and very large time scales. Speculatively, it is possible that the universe may enter a second inflationary epoch, or, assuming that the current vacuum state is a false vacuum, the vacuum may decay into a lower-energy state.[11], §VE. It is also possible that entropy production will cease and the universe will achieve heat death.[11], §VID.
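A sketch of where that 10^100-year figure comes from (my own check, using the standard Hawking evaporation-time formula $t \approx 5120\,\pi\,G^2 M^3 / (\hbar c^4)$ for a black hole of mass $M$):

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.0546e-34   # reduced Planck constant, J s
c = 2.998e8         # speed of light, m/s
M_sun = 1.989e30    # solar mass, kg
year = 3.156e7      # seconds per year

M = 1e11 * M_sun  # a roughly galaxy-mass black hole

# Hawking evaporation time, assuming emission of massless quanta only
t = 5120 * math.pi * G**2 * M**3 / (hbar * c**4)
print(f"t ~ {t / year:.1e} years")
```

The cubic dependence on mass means the result indeed lands on the order of 10^100 years, consistent with the text.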

***

### Creating the Perfect Human Being or Maybe.....

..... a Frankenstein? :)

Artificial Intelligence (AI) is the intelligence of machines and the branch of computer science which aims to create it. Textbooks define the field as "the study and design of intelligent agents,"[1] where an intelligent agent is a system that perceives its environment and takes actions which maximize its chances of success.[2] John McCarthy, who coined the term in 1956,[3][4] defines it as "the science and engineering of making intelligent machines."
The field was founded on the claim that a central property of humans, intelligence—the sapience of Homo sapiens—can be so precisely described that it can be simulated by a machine.[5] This raises philosophical issues about the nature of the mind and limits of scientific hubris, issues which have been addressed by myth, fiction and philosophy since antiquity.[6] Artificial intelligence has been the subject of breathtaking optimism,[7] has suffered stunning setbacks[8][9] and, today, has become an essential part of the technology industry, providing the heavy lifting for many of the most difficult problems in computer science.
AI research is highly technical and specialized, deeply divided into subfields that often fail to communicate with each other.[10] Subfields have grown up around particular institutions, the work of individual researchers, the solution of specific problems, longstanding differences of opinion about how AI should be done and the application of widely differing tools. The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception and the ability to move and manipulate objects.[11] General intelligence (or "strong AI") is still a long-term goal of (some) research.[12]

## Wednesday, February 10, 2010

### GOCE delivering data for best gravity map ever

30 September 2009
Following the launch and in-orbit testing of the most sophisticated gravity mission ever built, ESA’s GOCE satellite is now in ‘measurement mode’, mapping tiny variations in Earth’s gravity in unprecedented detail.

The ‘Gravity field and steady-state Ocean Circulation Explorer’ (GOCE) satellite was launched on 17 March from northern Russia. The data now being received will lead to a better understanding of Earth’s gravity, which is important for understanding how our planet works.

It is often assumed that gravity exerts an equal force everywhere on Earth. However, owing to factors such as the rotation of the planet, the effects of mountains and ocean trenches, and density variations in Earth’s interior, this fundamental force is not quite the same all over.

Credit:ESA

Over two six-month uninterrupted periods, GOCE will map these subtle variations with extreme detail and accuracy. This will result in a unique model of the ‘geoid’ – the surface of an ideal global ocean at rest.

A precise knowledge of the geoid is crucial for accurate measurement of ocean circulation and sea-level change, both of which are influenced by climate. The data from GOCE are also much needed to understand the processes occurring inside Earth. In addition, by providing a global reference to compare heights anywhere in the world, the GOCE-derived geoid will be used for practical applications in areas such as surveying and levelling. See more here and here

See: Plato's Nightlight Mining Company is claiming Aristarchus Crater and Surrounding Region, and the rest is history :)

## Saturday, February 06, 2010

### A New Time Travel Scenario?

Black Hole-Powered Jet of Electrons and Sub-Atomic Particles Streams From Center of Galaxy M87

NASA's Hubble Space Telescope Yields Clear View of Optical Jet in Galaxy M87

A NASA Hubble Space Telescope (HST) view of a 4,000-light-year-long jet of plasma emanating from the bright nucleus of the giant elliptical galaxy M87. This ultraviolet-light image was made with the European Space Agency's Faint Object Camera (FOC), one of two imaging systems aboard HST. This photo is being presented on Thursday, January 16th, at the 179th meeting of the American Astronomical Society in Atlanta, Georgia.

M87 is a giant elliptical galaxy with an estimated mass of 300 billion suns. Located 52 million light-years away at the heart of the neighboring Virgo cluster of galaxies, M87 is the nearest example of an active galactic nucleus with a bright optical jet. The jet appears as a string of knots within a widening cone extending out from the core of M87. The FOC image reveals unprecedented detail in these knots, resolving some features as small as ten light-years across.

According to one theory, the jet is most likely powered by a 3-billion-solar-mass black hole at the nucleus of M87. Magnetic fields generated within a spinning accretion disk surrounding the black hole spiral around the edge of the jet. The fields confine the jet to a long narrow tube of hot plasma and charged particles. High-speed electrons and protons accelerated near the black hole race along the tube at nearly the speed of light. When electrons are caught up in the magnetic field they radiate, in a process called synchrotron radiation. The FOC image clearly resolves these localized regions of electron acceleration, which seem to trace out the spiral pattern of the otherwise invisible magnetic field lines. A large bright knot located midway along the jet shows where the blue jet disrupts violently and becomes more chaotic. Farther out from the core, the jet bends and dissipates as it rams into a wall of gas, invisible but present throughout the galaxy, which the jet has plowed up in front of itself.

HST is ideally suited for studying extragalactic jets.
The Telescope's UV sensitivity allows it to clearly separate a jet from the stellar background light of its host galaxy. What's more, the FOC's high angular resolution is comparable to the sub-arcsecond resolution achieved by large radio telescope arrays.
See: Hubble Site

***

Willem Jacob van Stockum (November 20, 1910 – June 10, 1944) was a mathematician who made an important contribution to the early development of general relativity.

Van Stockum was born in Hattem in the Netherlands. His father was a mechanically talented officer in the Dutch Navy. After the family (less the father) relocated to Ireland in the late 1920s, Willem studied mathematics at Trinity College, Dublin, where he earned a gold medal. He went on to earn an M.A. from the University of Toronto and his Ph.D. from the University of Edinburgh.

In the mid nineteen thirties, van Stockum became an early enthusiast of the then new theory of gravitation, general relativity. In 1937, he published a paper which contains one of the first exact solutions in general relativity which modeled the gravitational field produced by a configuration of rotating matter, the van Stockum dust, which remains an important example noted for its unusual simplicity. In this paper, van Stockum was apparently the first to notice the possibility of closed timelike curves, one of the strangest and most disconcerting phenomena in general relativity.
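As a hedged sketch of the solution mentioned above: one common form of the van Stockum dust metric, in cylindrical coordinates $(t, r, \phi, z)$ with rotation parameter $a$ (conventions vary between treatments), is

$$ds^2 = -\left(dt + a r^2\, d\phi\right)^2 + r^2\, d\phi^2 + e^{-a^2 r^2}\left(dr^2 + dz^2\right).$$

Expanding the first term gives $g_{\phi\phi} = r^2\left(1 - a^2 r^2\right)$, so the closed circles of constant $t$, $r$, $z$ become timelike wherever $r > 1/a$: these are precisely the closed timelike curves whose possibility van Stockum was apparently the first to notice.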
***
The chronology protection conjecture is a conjecture by the physicist Stephen Hawking that the laws of physics are such as to prevent time travel on all but sub-microscopic scales. Mathematically, the permissibility of time travel is represented by the existence of closed timelike curves.

***

## Tipler Cylinder

An Overview and Comparison by Dr. David Lewis Anderson

A Tipler Cylinder uses a massive and long cylinder spinning around its longitudinal axis. The rotation creates a frame-dragging effect and regions of closed timelike curves, traversable in a way that could, in principle, achieve subluminal time travel to the past.

***

We see a pulsar, then, when one of its beams of radiation crosses our line-of-sight. In this way, a pulsar is like a lighthouse. The light from a lighthouse appears to be "pulsing" because it only crosses our line-of-sight once each time it spins. Similarly, a pulsar "pulses" because we see bright flashes every time the star spins.
See: Pulsars

## Thursday, February 04, 2010

### Perspective of the Theoretical Scientist

Most people think of "seeing" and "observing" directly with their senses. But for physicists, these words refer to much more indirect measurements involving a train of theoretical logic by which we can interpret what is "seen."- Lisa Randall

There are certain advantages to the theoretical perspective: however abstract its concepts, it can portray the world through images, almost impressionistic ones, that help the mind reach a state of acceptance. So that had to be explained first.

Cubist art revolted against the restrictions that perspective imposed. Picasso's art shows a clear rejection of perspective, with women's faces viewed simultaneously from several angles. Picasso's paintings show multiple perspectives, as though they were painted by someone from the 4th dimension, able to see all perspectives simultaneously.

Cubist Art: Picasso's painting 'Portrait of Dora Maar'

P. Picasso Portrait of Ambrose Vollard (1910)

M. Duchamp Nude Descending a Staircase, No. 2 (1912)

J. Metzinger Le Gouter/Teatime (1911)

The appearance of figures in cubist art --- which are often viewed from several directions simultaneously --- has been linked to ideas concerning extra dimensions:

It is as if we are looking at the subject from a larger perspective: standing outside the image, we see that it is capable of illuminating many angles at once. This helped us see that the image derives from a much larger understanding than what is solidified in the everyday world we live in.

For the artist it was a bold move to understand that perspective could help us see Mona Lisa's smile as moving with us as we move around. The challenge, then, was to appreciate the value of this artistic push into how we see, and to recognize that the road non-Euclidean geometry took was met by people in much the same way, culminating in a geometrical transition of form.

Hyperspace: A Scientific Odyssey

A look at the higher dimensions, by Michio Kaku

"Why must art be clinically “realistic?” This Cubist “revolt against perspective” seized the fourth dimension because it touched the third dimension from all possible perspectives. Simply put, Cubist art embraced the fourth dimension. Picasso's paintings are a splendid example, showing a clear rejection of three dimensional perspective, with women's faces viewed simultaneously from several angles. Instead of a single point-of-view, Picasso's paintings show multiple perspectives, as if they were painted by a being from the fourth dimension, able to see all perspectives simultaneously. As art historian Linda Henderson has written, “the fourth dimension and non-Euclidean geometry emerge as among the most important themes unifying much of modern art and theory."

Then it quickly comes home to mind that maybe what is given, let's say in the context of Lee Smolin's road to quantum gravity, will help us see the value of describing "the space of an interior" by what is happening on the screen/label.

Spacetime in String Theory

More than just a Bekenstein image, this illustrates a conformal approach to describing the contents of the tomato soup can from its label.

Campbell's Soup Can by Andy Warhol Exhibited in New York (USA), Leo Castelli Gallery

It was necessary to see that the geometries used here were helping to shape perspective not only around "time travel" but as a means of making mathematical perspective actually mean something in relation to understanding our world: a way to describe abstract concepts correlated with the progression of those mathematics. Klein's ordering of geometries then takes on a new meaning as we move deeper into the world we all know and love.

In 1919, Kaluza sent Albert Einstein a preprint --- later published in 1921 --- that considered the extension of general relativity to five dimensions. He assumed that the 5-dimensional field equations were simply the higher-dimensional version of the vacuum Einstein equation, and that all the metric components were independent of the fifth coordinate. The latter assumption came to be known as the cylinder condition. This resulted in something remarkable: the fifteen higher-dimensional field equations naturally broke into a set of ten formulae governing a tensor field representing gravity, four describing a vector field representing electromagnetism, and one wave equation for a scalar field. Furthermore, if the scalar field was constant, the vector field equations were just Maxwell's equations in vacuo, and the tensor field equations were the 4-dimensional Einstein field equations sourced by an EM field. In one fell swoop, Kaluza had written down a single covariant field theory in five dimensions that yielded the four-dimensional theories of general relativity and electromagnetism. Naturally, Einstein was very interested in this preprint.
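The counting in the paragraph above can be sketched as follows (conventions for the parametrization vary between treatments). The symmetric 5-dimensional metric is written in terms of 4-dimensional fields as

$$\hat{g}_{AB} = \begin{pmatrix} g_{\mu\nu} + \phi^2 A_\mu A_\nu & \phi^2 A_\mu \\ \phi^2 A_\nu & \phi^2 \end{pmatrix}, \qquad \mu, \nu = 0, \dots, 3,$$

which accounts for all fifteen independent components: ten in the tensor $g_{\mu\nu}$ (gravity), four in the vector $A_\mu$ (electromagnetism), and one in the scalar $\phi$. Under the cylinder condition $\partial_5 = 0$, with $\phi$ held constant, the vacuum equations $\hat{R}_{AB} = 0$ split into the 4-dimensional Einstein equations sourced by the Maxwell stress-energy tensor together with the source-free Maxwell equations $\nabla^\mu F_{\mu\nu} = 0$.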

I quickly divert attention to the world of Thomas Banchoff because it is an extraordinary move away from all that we know is safe. Is it not lost to some computer-animated world that one engages, losing the self in the process? It also shows that what Lee Smolin tried to distance himself from was in fact seeking to be understood in this way: a concurrent agreement that theorists were trying to arrive at a consensus of different approaches saying the same thing.

Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling to compute their results. Monte Carlo methods are often used in simulating physical and mathematical systems. Because of their reliance on repeated computation of random or pseudo-random numbers, these methods are most suited to calculation by a computer and tend to be used when it is unfeasible or impossible to compute an exact result with a deterministic algorithm.[1]

Monte Carlo simulation methods are especially useful in studying systems with a large number of coupled degrees of freedom, such as fluids, disordered materials, strongly coupled solids, and cellular structures (see cellular Potts model). More broadly, Monte Carlo methods are useful for modeling phenomena with significant uncertainty in inputs, such as the calculation of risk in business. These methods are also used to evaluate definite integrals, particularly multidimensional integrals with complicated boundary conditions. It is a widely successful method in risk analysis when compared with alternative methods or human intuition. When Monte Carlo simulations have been applied in space exploration and oil exploration, actual observations of failures, cost overruns and schedule overruns are routinely better predicted by the simulations than by human intuition or alternative "soft" methods.[2]
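A minimal sketch of the idea described above: estimating π by repeated random sampling, using the ratio of points that land inside a quarter circle inscribed in the unit square. The function name and sample count are illustrative, and only the Python standard library is used.

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    """Estimate pi by uniform random sampling in the unit square.

    The quarter circle of radius 1 covers a fraction pi/4 of the
    square, so 4 * (hits / samples) converges to pi as n grows.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

estimate = monte_carlo_pi(100_000)
print(f"pi is approximately {estimate}")
```

The statistical error shrinks only as 1/√n, which is exactly why these methods shine for the multidimensional integrals mentioned above: the convergence rate does not degrade with the number of dimensions, unlike grid-based quadrature.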
For me it had to make some sense: a transference from that artistic impressionism helps direct the mind to the ways and means by which quantum gravity was being inspected in terms of Monte Carlo methods. These had, in my mind, a surface value contributing to an accumulated acceptance of the geometry and methods used to model this understanding.

So you understand now how we arrived at an interpretation of the value of, let's say, Dyson's opinion about how we might view Riemann's Hypothesis?

Dyson, one of the most highly regarded scientists of his time, poignantly informed the young man that his findings into the distribution of prime numbers corresponded with the spacing and distribution of energy levels of a higher-ordered quantum state. From "Mathematics Problem That Remains Elusive—And Beautiful" by Raymond Petersen
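The correspondence Dyson noticed has a precise statement: Montgomery's pair-correlation conjecture holds that the suitably normalized spacings between non-trivial zeros of the Riemann zeta function have the pair-correlation function

$$R_2(x) = 1 - \left(\frac{\sin \pi x}{\pi x}\right)^2,$$

which is exactly the pair correlation of eigenvalues of large random Hermitian matrices drawn from the Gaussian Unitary Ensemble, the standard statistical model for the energy levels of heavy quantum systems.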

***

### DNA Computing

DNA computing is a form of computing which uses DNA, biochemistry and molecular biology, instead of the traditional silicon-based computer technologies. DNA computing, or, more generally, molecular computing, is a fast-developing interdisciplinary area. Research and development in this area concerns theory, experiments and applications of DNA computing. See: DNA computing

***

Clifford of Asymptotia is hosting a guest post by Len Adleman: Quantum Mechanics and Mathematical Logic.

Today I’m pleased to announce that we have a guest post from a very distinguished colleague of mine, Len Adleman. Len is best known as the “A” in RSA and the inventor of DNA-computing. He is a Turing Award laureate. However, he considers himself “a rank amateur” (his words!) as a physicist.

Len Adleman: For a long time, physicists have struggled with perplexing “meta-questions” (my phrase): Does God play dice with the universe? Does a theory of everything exist? Do parallel universes exist? As the physics community is acutely aware, these are extremely difficult questions and one may despair of ever finding meaningful answers. The mathematical community has had its own meta-questions that are no less daunting: What is “truth”? Do infinitesimals exist? Is there a single set of axioms from which all of mathematics can be derived? In what many consider to be on the short list of great intellectual achievements, Frege, Russell, Tarski, Turing, Gödel, and other logicians were able to clear away the fog and sort these questions out. The framework they created, mathematical logic, has put a foundation under mathematics, provided great insights and profound results. After many years of consideration, I have come to believe that mathematical logic, suitably extended and modified (perhaps to include complexity theoretic ideas), has the potential to provide the same benefits to physics. In the following remarks, I will explore this possibility.