
Showing posts with label SNO. Show all posts

Friday, July 06, 2012

The Bolshoi simulation

A virtual world?

 The more complete the database, the more accurate the simulation one can achieve. The point, though, is that you have to capture scientific processes through calorimeter measurements, just as you do at the LHC.

So these backdrops are processes for identifying particles as they approach Earth or are produced on Earth. See Fermi's capture of gamma rays from thunderstorms, and one might have asked how Fermi's imaging would have looked had it been pointed toward the Fukushima Daiichi nuclear disaster?

So the idea here is: how do you map particulates as a measure of natural processes? Does the virtual world lack the depth of measurement with which correlations exist in the natural world? Why? Because it asks the designers of computation and memory to map the results of the experiments directly. So who designs the experiments to meet the data?

 How did they know the energy range in which the Higgs boson would be detected?





The Bolshoi simulation is the most accurate cosmological simulation of the evolution of the large-scale structure of the universe yet made ("bolshoi" is the Russian word for "great" or "grand"). The first two of a series of research papers describing Bolshoi and its implications have been accepted for publication in the Astrophysical Journal. The first data release of Bolshoi outputs, including output from Bolshoi and also the BigBolshoi or MultiDark simulation of a volume 64 times bigger than Bolshoi, has just been made publicly available to the world's astronomers and astrophysicists. The starting point for Bolshoi was the best ground- and space-based observations, including NASA's long-running and highly successful WMAP Explorer mission that has been mapping the light of the Big Bang in the entire sky. One of the world's fastest supercomputers then calculated the evolution of a typical region of the universe a billion light years across.

The Bolshoi simulation took 6 million cpu hours to run on the Pleiades supercomputer—recently ranked as seventh fastest of the world's top 500 supercomputers—at NASA Ames Research Center. This visualization of dark matter is 1/1000 of the gigantic Bolshoi cosmological simulation, zooming in on a region centered on the dark matter halo of a very large cluster of galaxies. Credit: Chris Henze, NASA Ames Research Center. See: Introduction: The Bolshoi Simulation



Snapshot from the Bolshoi simulation at a red shift z=0 (meaning at the present time), showing filaments of dark matter along which galaxies are predicted to form.
CREDIT: Anatoly Klypin (New Mexico State University), Joel R. Primack (University of California, Santa Cruz), and Stefan Gottloeber (AIP, Germany).
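The "z=0" label on the snapshot can be made concrete with a small sketch (my own illustration, not part of the Bolshoi pipeline): in cosmology the scale factor a relates to redshift z by a = 1/(1+z), so z = 0 corresponds to the present day.

```python
# Illustrative only (not from the Bolshoi code): convert a snapshot's
# redshift label z to the cosmic scale factor a, normalized so a = 1 today.

def scale_factor(z):
    """Scale factor a = 1 / (1 + z); a = 1 at the present day (z = 0)."""
    return 1.0 / (1.0 + z)

print(scale_factor(0))  # 1.0 -> the present-day snapshot shown above
print(scale_factor(1))  # 0.5 -> the universe at half its present linear scale
```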
 THREE “BOLSHOI” SUPERCOMPUTER SIMULATIONS OF THE EVOLUTION OF THE UNIVERSE ANNOUNCED BY AUTHORS FROM UNIVERSITY OF CALIFORNIA, NEW MEXICO STATE UNIVERSITY



Pleiades Supercomputer

 MOFFETT FIELD, Calif. – Scientists have generated the largest and most realistic cosmological simulations of the evolving universe to date, thanks to NASA’s powerful Pleiades supercomputer. Using the "Bolshoi" simulation code, researchers hope to explain how galaxies and other very large structures in the universe have changed since the Big Bang.

To complete the enormous Bolshoi simulation, which traces how the largest galaxies and galaxy structures in the universe were formed billions of years ago, astrophysicists at New Mexico State University, Las Cruces, New Mexico, and the University of California High-Performance Astrocomputing Center (UC-HIPACC), Santa Cruz, Calif., ran their code on Pleiades for 18 days, consuming millions of hours of computer time and generating enormous amounts of data. Pleiades is the seventh most powerful supercomputer in the world.
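As a rough consistency check (my own arithmetic, not from the press release): 6 million CPU-hours spread over an 18-day wall-clock run implies the average number of processor cores the job occupied.

```python
# Back-of-envelope check (my arithmetic): average core count implied by
# the quoted 6 million CPU-hours and 18-day wall-clock run on Pleiades.

cpu_hours = 6_000_000      # total CPU time quoted for Bolshoi
wall_hours = 18 * 24       # 18 days of wall-clock time

avg_cores = cpu_hours / wall_hours
print(f"{avg_cores:.0f} cores on average")  # roughly 14,000 cores
```

This lands squarely in the "tens of thousands of processors" regime that NASA cites for Pleiades jobs.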

“NASA installs systems like Pleiades, that are able to run single jobs that span tens of thousands of processors, to facilitate scientific discovery,” said William Thigpen, systems and engineering branch chief in the NASA Advanced Supercomputing (NAS) Division at NASA's Ames Research Center.
See: NASA Supercomputer Enables Largest Cosmological Simulations



See Also: Dark matter’s tendrils revealed

Thursday, October 27, 2011

ICECUBE Blogging Research Material and more

In regards to Cherenkov Light

Thinking outside the box. See: A physicist in the cancer lab

Ackerman became interested in physics in middle school, reading popular science books about quantum mechanics and string theory. As an undergraduate at the Massachusetts Institute of Technology, she traveled to CERN, the European particle physics laboratory near Geneva, to work on one of the detectors at the Large Hadron Collider, the most powerful particle collider in the world. Then she spent a summer at SLAC working on BaBar, an experiment investigating the universe’s puzzling shortage of antimatter, before starting her graduate studies there in 2007.

 Linking Experiments(Majorana, EXO); How do stars create the heavy elements? (DIANA); What role did neutrinos play in the evolution of the universe? (LBNE). In addition, scientists propose to build a generic underground facility (FAARM) ...

 Dialogos of Eide: Neutrinoless Double Beta Decay: COBRA · CUORICINO and CUORE · EXO · GERDA · MAJORANA · MOON · NEMO-3 and SuperNEMO · SNO+. See Also: Direct Dark Matter Detection.

Also From my research:

  1. Neutrinoless Double Beta Decay
  2. A first look at the Earth interior from the Gran Sasso underground laboratory
  3. Mysterious Behavior of Neutrinos sent Straight through the Earth
    *** 
     
The ICECUBE blog put up some links that I wanted to go through to see what is happening there. Their links are provided at the bottom of this blog post. For each of their links I have provided additional information above, explored in concert.

Thursday, September 29, 2011

Cassiopeia A

In conclusion, we have a rich panorama of experiments that all make use of neutrinos as probes of exotic phenomena as well as processes which we have to measure better to gain understanding of fundamental physics as well as gather information about the universe. See: Vernon Barger: perspectives on neutrino physics, May 22, 2008


This image presents a beautiful composite of X-rays from Chandra (red, green, and blue) and optical data from Hubble (gold) of Cassiopeia A, the remains of a massive star that exploded in a supernova. Evidence for a bizarre state of matter has been found in the dense core of the star left behind, a so-called neutron star, based on cooling observed over a decade of Chandra observations. The artist's illustration in the inset shows a cut-out of the interior of the neutron star where densities increase from the crust (orange) to the core (red) and finally to the region where the "superfluid" exists (inner red ball). X-ray: NASA/CXC/UNAM/Ioffe/D. Page, P. Shternin et al.; Optical: NASA/STScI; Illustration: NASA/CXC/M. Weiss. See Also: Superfluid and superconductor discovered in star's core

Illustration of Cassiopeia A Neutron Star
This is an artist's impression of the neutron star at the center of the Cassiopeia A supernova remnant. The different colored layers in the cutout region show the crust (orange), the higher density core (red) and the part of the core where the neutrons are thought to be in a superfluid state (inner red ball). The blue rays emanating from the center of the star represent the copious numbers of neutrinos that are created as the core temperature falls below a critical level and a superfluid is formed.
(Credit: Illustration: NASA/CXC/M.Weiss)


X-ray and Optical Images of Cassiopeia A
Two independent research teams studied the supernova remnant Cassiopeia A, the remains of a massive star 11,000 light-years away whose explosion would have appeared about 330 years ago as observed from Earth. Chandra data are shown in red, green and blue along with optical data from Hubble in gold. The Chandra data revealed a rapid decline in the temperature of the ultra-dense neutron star that remained after the supernova. The data showed that it had cooled by about 4% over a ten-year period, indicating that a superfluid is forming in its core.
(Credit: X-ray: NASA/CXC/UNAM/Ioffe/D.Page,P.Shternin et al; Optical: NASA/STScI)
***

See: Galactic Neutrino Communications

Friday, April 22, 2011

Geo-neutrinos



The main geophysical and geochemical processes that have driven the evolution of the Earth are strictly bound by the planet's energy budget. The current flux of energy entering the Earth's atmosphere is well known: the main contribution comes from solar radiation (1.4 × 10³ W m⁻²), while the energy deposited by cosmic rays is significantly smaller (10⁻⁸ W m⁻²). The uncertainties on terrestrial thermal power are larger – although the most quoted models estimate a global heat loss in the range of 40–47 TW, a global power of 30 TW is not excluded. The measurements of the temperature gradient taken from some 4 × 10⁴ drill holes distributed around the world provide a constraint on the Earth's heat production. Nevertheless, these direct investigations fail near the oceanic ridge, where the mantle content emerges: here hydrothermal circulation is a highly efficient heat-transport mechanism.
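To put these fluxes on a common footing, here is a back-of-envelope comparison (my own arithmetic, using the figures quoted above): multiplying each flux by the Earth's cross-sectional area gives total powers that can be set against the 40–47 TW of terrestrial heat loss.

```python
import math

# My own arithmetic, using the flux figures quoted in the text above.
R_earth = 6.371e6                      # Earth radius in metres
cross_section = math.pi * R_earth**2   # area intercepting incoming radiation

solar_flux = 1.4e3    # W m^-2, solar radiation at the top of the atmosphere
cosmic_flux = 1e-8    # W m^-2, energy deposited by cosmic rays

solar_power = solar_flux * cross_section    # ~1.8e17 W
cosmic_power = cosmic_flux * cross_section  # ~1.3e6 W
geothermal = 44e12                          # W, middle of the 40-47 TW range

print(f"solar:            {solar_power:.1e} W")
print(f"cosmic rays:      {cosmic_power:.1e} W")
print(f"terrestrial heat: {geothermal:.1e} W")
```

Solar input exceeds the geothermal output by a factor of a few thousand, while cosmic rays are utterly negligible, which is why the terrestrial contribution is so hard to pin down directly.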

The generation of the Earth’s magnetic field, its mantle circulation, plate tectonics and secular (i.e. long lasting) cooling are processes that depend on terrestrial heat production and distribution, and on the separate contributions to Earth’s energy supply (radiogenic, gravitational, chemical etc.). An unambiguous and observationally based determination of radiogenic heat production is therefore necessary for understanding the Earth’s energetics. Such an observation requires determining the quantity of long-lived radioactive elements in the Earth. However, the direct geochemical investigations only go as far as the upper portion of the mantle, so all of the geochemical estimates of the global abundances of heat-generating elements depend on the assumption that the composition of meteorites reflects that of the Earth.
See: Looking into the Earth’s interior with geo-neutrinos

Thursday, July 29, 2010

Lighting up the dark universe


Image ...
The CHASE detector. The end of the magnet (orange) can be seen on the right.

Exploring our dark universe is often the domain of extreme physics. Traces of dark matter particles are searched for by huge neutrino telescopes located underwater or under Antarctic ice, by scientists at powerful particle colliders, and deep underground.  Clues to mysterious dark energy will be investigated using big telescopes on Earth and experiments that will be launched into space.
But an experiment doesn’t have to be exotic to explore the unexplained. At the International Conference on High Energy Physics, which ended today in Paris, scientists unveiled the first results from the GammeV-CHASE experiment, which used 30 hours’ worth of data from a 10-meter-long experiment to place the world’s best limits on the existence of dark energy particles.
CHASE, which stands for Chameleon Afterglow Search, was constructed at Fermilab to search for hypothetical particles called chameleons. Physicists theorize that these particles may be responsible for the dark energy that is causing the accelerating expansion of our universe.

“One of the reasons I felt strongly about doing this experiment is that it was a good example of a laboratory experiment to test dark energy models,” says CHASE scientist Jason Steffen, who presented the results at ICHEP. “Astronomical surveys are important as well, but they’re not going to tell us everything.” CHASE was a successor to Fermilab’s GammeV experiment, which searched for chameleon particles and another hypothetical particle called the axion.

See: Lighting up the dark universe by Katie Yurkewicz Posted in ICHEP 2010

See Also: Backreaction: Detection of Dark Energy on Earth? - Improbable

Sunday, December 13, 2009

SuperCDMS An Improvement on Detection

So far no WIMP interaction has been observed, so the sensitivity needs to be improved further. This will be achieved by increasing the total detector mass (and with this the probability that a WIMP interacts in the detector) and at the same time reducing the background and improving the discrimination power. This effort started in 2009 under the name SuperCDMS.

The first set of new detectors has been installed in the experimental setup at Soudan and has been operating since summer 2009. First tests show that the background levels are in the expected range. Over the course of the next year, all CDMS detectors will be replaced by the new larger detectors. The active mass will increase by more than a factor of three, to about 15 kg.
See: CDMS and SuperCDMS Experiments
***


It has been known since the 1930s that a significant part of the mass of the universe is invisible. This invisible material has been named dark matter. Weakly Interacting Massive Particles (WIMPs) are considered one of the most convincing explanations for this phenomenon. See: SuperCDMS Queen's Home
***



SNOLAB is an underground science laboratory specializing in neutrino and dark matter physics. Situated two km below the surface in the Vale Inco Creighton Mine near Sudbury, Ontario, Canada, SNOLAB is an expansion of the existing facilities constructed for the Sudbury Neutrino Observatory (SNO) solar neutrino experiment. SNOLAB follows on the important achievements in neutrino physics made by SNO and other underground physics measurements. The primary scientific emphasis at SNOLAB will be on astroparticle physics, with the principal topics being:
Low Energy Solar Neutrinos;
Neutrinoless Double Beta Decay;
Cosmic Dark Matter Searches;
Supernova Neutrino Searches.

***




The Sudbury Neutrino Laboratory, located two kilometres below the surface, is the site of groundbreaking international research.

 The 17-metre-wide SNO detector in Vale Inco’s Creighton Mine.
Ernest Orlando Lawrence Berkeley National Laboratory

Update:
Latest Results in the Search for Dark Matter
Thursday, December 17, 2009

 

 

Dark Matter Detected, or Not? Live Blogging the Seminar

by JoAnne

Monday, January 07, 2008

What is Dark Energy?

I am recreating this post for other viewers who are currently looking at the subject of dark energy, along with some resources for further reading.

All events shown here (except KEK test detector) were generated by Monte-Carlo simulation program, written by Clark. The visualizing software which produced the detector images was written by Tomasz.


While the sun was easily recognizable, was it the building of Monte Carlo patterns in computer technology, developed from the SNO work, that made such views easily discernible?

Imagine putting all that information through a single point? That "point" is important in terms of the energy perspective. It reveals something very interesting about our universe.

If such experiments as listed here are to be considered in the "forward perspective," then what do you think we have gained in our understanding of supersymmetry? Yes indeed, the understanding is amazing on reading what is given to us below in the Interactions.org links.

The complexity of the information seems, well, like "LIGO information" being transcribed into a working image of the cosmos? The complexity of all that information/energy is being processed through the LHC experiment. Consider its energy values, and all that is being produced as "particle constituents," and yes, there is more.

By correlating our understanding of cosmic particle collisions with the experiments at the LHC, we learn much about the universe.

Quantum physics has revealed a stunning truth about “nothing”: even the emptiest vacuum is filled with elementary particles, continually created and destroyed. Particles appear and disappear, flying apart and coming together, in an intricate quantum dance. This far-reaching consequence of quantum mechanics has withstood the most rigorous experimental scrutiny. In fact, these continual fluctuations are at the heart of our quantum understanding of nature.

The dance of quantum particles has special significance today because it contributes to the dark energy that is driving the universe apart. But there’s a problem: the vacuum has too much energy. A naive theoretical estimate gives an amount about 10¹²⁰ times too large to fit cosmological observations. The only known way to reduce the energy is to cancel contributions of different particle species against each other, possibly with a new symmetry called supersymmetry. With supersymmetry the result is 10⁶⁰ times better—a huge improvement, but not enough. Even with supersymmetry, what accounts for the other 60 orders of magnitude is still a mystery.
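The size of that mismatch can be sketched with a back-of-envelope calculation (my own numbers, using a crude Planck-scale cutoff; the precise exponent depends on conventions, which is why quotes of the discrepancy range around 10¹²⁰):

```python
import math

# Order-of-magnitude sketch (my own numbers) of the vacuum energy problem:
# compare the vacuum energy density naively expected from quantum field
# theory, cut off at the Planck scale, with the observed dark energy density.

hbar = 1.055e-34   # J s
c = 3.0e8          # m/s
G = 6.674e-11      # m^3 kg^-1 s^-2

# Planck-scale energy density ~ c^7 / (hbar * G^2)
rho_planck = c**7 / (hbar * G**2)   # roughly 1e113 J/m^3

# Observed dark energy density, roughly 70% of the critical density
rho_dark = 6e-10                    # J/m^3

ratio = rho_planck / rho_dark
# With these inputs the exponent comes out near 123; "about 10^120"
# in the text reflects a different choice of cutoff conventions.
print(f"discrepancy: ~10^{round(math.log10(ratio))}")
```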

Physics theory predicts that one of the most important particles in the quantum vacuum is the Higgs particle. The Higgs pervades the vacuum, slowing the motion of particles, giving them mass, and preventing atoms from disintegrating. Since it fills the vacuum, the Higgs itself contributes to the embarrassing factor of 10¹²⁰.

The next accelerators are opening a window on the pivotal role of symmetry in fundamental physics. New discoveries will teach us about the role of the Higgs particle and supersymmetry in defining the vacuum. Such discoveries are key to understanding what tames the quantum vacuum, a topic that is fundamental to any real understanding of the mysterious dark energy that determines the destiny of our cosmos.


It took me a long time to get to the very point made in terms of the supersymmetric valuation: understanding how what existed "before" was transformed into being, by presenting another possibility on the other side.

"In fact, these continual fluctuations are at the heart of our quantum understanding of nature."

The only known way to reduce the energy is to cancel contributions of different particle species against each other, possibly with a new symmetry called supersymmetry.


It had to be taken down to a reductionistic point of view in order for this to make any sense. You needed experiments in which this was made possible. Without them, how could we be "led by science"?

Conclusions


Particle physics is in the midst of a great revolution. Modern data and ideas have challenged long-held beliefs about matter, energy, space and time. Observations have confirmed that 95 percent of the universe is made of dark energy and dark matter unlike any we have seen or touched in our most advanced experiments. Theorists have found a way to reconcile gravity with quantum physics, but at the price of postulating extra dimensions beyond the familiar four dimensions of space and time. As the magnitude of the current revolution becomes apparent, the science of particle physics has a clear path forward. The new data and ideas have not only challenged the old ways of thinking, they have also pointed to the steps required to make progress. Many advances are within reach of our current program; others are close at hand. We are extraordinarily fortunate to live in a time when the great questions are yielding a whole new level of understanding. We should seize the moment and embrace the challenges.


A new LHC experiment is born; is it an effect of what existed before? And what comes after?

Yes, the idea is that the universe was not born from colliding particles, but from the supersymmetric valuation that existed in the universe at the very beginning. You had to know how to get there. Such events are still feasible, and are being produced cosmologically, as we see evidenced in the "fast forward" experiment.


Dark Matter and Dark Energy: from the Universe to the Laboratory-Conclusion

Monday, April 09, 2007

Blackhole evaporation: What's New From the Subatomic-Sized Holes ?

...the creative principle resides in mathematics. In a certain sense therefore, I hold it true that pure thought can grasp reality, as the ancients dreamed.
Albert Einstein
See: What is Cerenkov Radiation?

Are we being "politically correct" (a sociological observation) when we change the wording from "microstate blackhole production" to "subatomic-sized holes"? Perhaps to infer the desired difference between cosmological blackholes and what we see quickly evaporating at subatomic size, to be revealed in a footprint?

David Kestenbaum, NPR-Alvaro De Rujula is a physicist at CERN, the world's largest particle physics laboratory. Three hundred feet below his desk, workers are building a massive particle accelerator that will be capable of reproducing energies present just after the big bang.

Let's pretend that the reporting was not so good back in 1999, and the information we had then caused some needless concerns? Good reporting already existed in terms of what the dark matter was doing. Now it's okay if someone else says it, and reveals all the dark matter info on Wikipedia. How nice:) You're credible?

Was there any evidence to think a method was already determined "back then" and has become part of the process of discovery?

Bad reporting?

At first, bad reporting, producing fear in the public mind?

In recent years the main focus of fear has been the giant machines used by particle physicists. Could the violent collisions inside such a machine create something nasty? "Every time a new machine has been built at CERN," says physicist Alvaro de Rujula, "the question has been posed and faced." August 1999

Peter Steinberg, when at Quantum diaries, lead us through this.

The creepy part of these kind of discussions is that one doesn't say that RHIC collisions "create" black holes, but that nucleus-nucleus collisions, and even proton-proton collisions, are in some sense black holes, albeit black holes in some sort of "dual" space which makes the theory easier.


Alvaro was the one who put "James Blodgett of Risk assessment" at ease in regards to strangelets. Now, could strangelets have been considered a consequence of the evaporation? Does this not look similar?

deconstruction: event display
Usually all physicists see are the remnants of a new particle decaying into other types of particles. From that, they infer the existence of the new species and can determine some of its characteristics.
See: Neutrino Mixing Explained in 60 seconds

Now everything is safe and cozy with these subatomic-sized holes which would simply evaporate. :) How would you know "what is new" after the subatomic holes had evaporated? Are sterile neutrinos new?

While these paragraphs have been selective, they show that experimental processes are being used and detective work applied.

Current evidence shows that neutrinos do oscillate, which indicates that neutrinos do have mass. The Los Alamos data revealed a muon anti-neutrino cross over to an electron neutrino. This type of oscillation is difficult to explain using only the three known types of neutrinos. Therefore, there might be a fourth neutrino, which is currently being called a "sterile" neutrino, which interacts more weakly than the other three neutrinos.
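The oscillations described in the quote are usually captured by the standard two-flavour formula (a textbook expression, not the specific analysis of the Los Alamos data; the parameter values below are purely illustrative):

```python
import math

# Standard two-flavour neutrino oscillation probability (textbook formula):
#   P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with theta the mixing angle, dm2 the mass-squared splitting in eV^2,
# L the baseline in km, and E the neutrino energy in GeV.

def osc_prob(theta, dm2, L, E):
    """Probability that a neutrino of flavour a is detected as flavour b."""
    return math.sin(2 * theta)**2 * math.sin(1.27 * dm2 * L / E)**2

# Illustrative (hypothetical) parameter values only:
print(osc_prob(theta=0.6, dm2=2.4e-3, L=732, E=17))
```

The appearance of a fourth, "sterile" neutrino would show up as oscillation data that cannot be fit by any choice of the three-flavour parameters in formulas like this one.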

Are any add-on experimental processes at CERN with regard to the LHC reflected in this second paragraph?

"We find," Chiao said, "that a barrier placed in the path of a tunneling particle does not slow it down. In fact, we detect particles on the other side of the barrier that have made the trip in less time than it would take the particle to traverse an equal distance without a barrier -- in other words, the tunnelling speed apparently greatly exceeds the speed of light. Moreover, if you increase the thickness of the barrier the tunneling speed increases, as high as you please.

See Gran Sasso

So while one may think I have some "new process" to make the world happy, it is nothing of the sort. It is interpreting the current theoretical models in regards to current experimental research.

For some reason some scientists think that one can be devoid of this reasoning and apply it to any model or person, while the scientists and lay people already know what is required.

This has been reflected time and again through the interactions of scientists with the public. What is one to think when one scientist calls another scientist devoid of such reason, while he works to develop the string theory model? They don't like that, do they?:)


So do you think that Clifford of Asymptotia is practising what he did not like in Peter Woit's summation of the state of affairs in string theory? That while criticizing him, he was doing the same thing to others? I laughed when I came across the censoring post on Not Even Wrong, and that is why I had to write my new article on censoring.

I have never seen such "happy trigger fingers" for the deletion of posts that would contradict the statements Clifford could make about another person, or what Peter Woit could say about "Clifford censoring" statements. Peter provides a forum where those who feel shafted can voice their displeasure?:)

Don't worry, Peter, I certainly won't be crying on your blog. Deletion knows its boundaries in terms of censoring there too.:) But anyway, on to the important stuff.

This summer, CERN gave the starting signal for the long-distance neutrino race to Italy. The CNGS facility (CERN Neutrinos to Gran Sasso), embedded in the laboratory's accelerator complex, produced its first neutrino beam. For the first time, billions of neutrinos were sent through the Earth's crust to the Gran Sasso laboratory, 732 kilometres away in Italy, a journey at almost the speed of light which they completed in less than 2.5 milliseconds. The OPERA experiment at the Gran Sasso laboratory was then commissioned, recording the first neutrino tracks. See Strangelets and Strange Matter
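The "less than 2.5 milliseconds" figure is easy to verify (my own arithmetic): 732 km covered at essentially the speed of light.

```python
# Quick check of the CERN-to-Gran Sasso flight time quoted above:
# neutrinos travel the 732 km baseline at (essentially) the speed of light.

c = 299_792_458      # m/s, speed of light
baseline = 732e3     # m, CERN to Gran Sasso through the Earth's crust

t_ms = baseline / c * 1e3
print(f"{t_ms:.2f} ms")  # ~2.44 ms, i.e. just under 2.5 ms
```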

Tunnelling in the string theory landscape

Now it may not seem so odd that I would place a string theory landscape picture up for review, and have one think about hill climbers and valley crossers. Would it be wrong not to include the "potential hills" and the thought of the "blackhole horizon"? It was theoretically appealing, as a thought experiment, to think about what could traverse those potential hills. We had to use "a mechanism" to help us understand how the crossover point was being established and how "new universes" begin to unfold. New particle creation from such collision processes had to be established first, both at CERN and with high-energy particles from space. IceCube was to be the backdrop for the footprint, and the resulting Cerenkov radiation from that collision process?

One needed to see such experiments as taking place currently, to help us see the gist of where science is taking us on our journeys. So you had to be able to trace this process in action back to the insecurities of our ignorance, in relation to sub-atomic sized holes...ahem...dualities?

So you had to know that the collision process would detail some "crossover point" for consideration? What this means is that "after the collision process" you are given a new particle with which to work.

You need to be able to capture this "new particle," and the mediums with which this is done are the barriers that supply the backdrop for the footprint, to what can be traversed in faster-than-light potentials. Again, Gran Sasso, and let's not forget IceCube.

Cross over point

Is it not important to see the experimental process as a natural one?

Bringing the Heavens down to Earth

If mini black holes can be produced in high-energy particle interactions, they may first be observed in high-energy cosmic-ray neutrino interactions in the atmosphere. Jonathan Feng of the University of California at Irvine and MIT, and Alfred Shapere of the University of Kentucky have calculated that the Auger cosmic-ray observatory, which will combine a 6000 km2 extended air-shower array backed up by fluorescence detectors trained on the sky, could record tens to hundreds of showers from black holes before the LHC turns on in 2007. See here


So here we are talking about the "before" and "after" and we had not spoken about the point of exchange here? If I were to tell you that such a reductionistic process had taken us to the limits what the heck could this mean? That we had indeed found the transference point of energy to matter, matter to energy and we say it may be the perfect fluids that supplies us this "anomalistic behaviour" with which we will introduce the GR? Talk about Navier-stokes in relation to the perfect fluid and what and how something can traverse through and come out on the other side?