Showing posts with label Fukushima. Show all posts

Monday, September 08, 2014


Geant4 is a toolkit for the simulation of the passage of particles through matter. Its areas of application include high energy, nuclear and accelerator physics, as well as studies in medical and space science. The two main reference papers for Geant4 are published in Nuclear Instruments and Methods in Physics Research A 506 (2003) 250-303, and IEEE Transactions on Nuclear Science 53 No. 1 (2006) 270-278. See: Geant4

See: Applications 


Geant4 [1][2] (for GEometry ANd Tracking) is a platform for "the simulation of the passage of particles through matter," using Monte Carlo methods. It is the successor of the GEANT series of software toolkits developed by CERN, and the first to use object-oriented programming (in C++). Its development, maintenance and user support are taken care of by the international Geant4 Collaboration. Application areas include high energy physics and nuclear experiments, medical, accelerator and space physics studies. The software is used by a number of research projects around the world.
The Geant4 software and source code are freely available from the project web site. Until version 8.1 (released June 30, 2006), no specific software license for its use existed; Geant4 is now provided under the Geant4 Software License.

Muon Tomography (cont)

Such is the case with an article on muon tomography, titled New Muon Detector Could Find Hidden Nukes. The article appeared a few days ago on Wired. It is centered on Lisa Grossman's interview with Marcus Hohlmann, a colleague from the Florida Institute of Technology. In a nutshell, the article explains how muon particles from cosmic rays can be used to detect heavy elements (as in nuclear fuel) hidden in transport containers. And what makes it even more appealing is that the technology used is a spin-off from particle physics experiments. See: Muon Tomography: Who Is Leading The Research?
See Also:


Beginning next year, two detectors (shown here in green) on either side of Fukushima Daiichi’s Unit 2 will record the path of muons (represented by the orange line) that have passed through the reactor. By determining how the muons scatter between the detectors, scientists will compile the first picture of the damaged reactor’s interior. See:
Particle physics to aid nuclear cleanup
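The scattering measurement described above can be sketched numerically. A minimal illustration, assuming each muon is recorded as an incoming and an outgoing straight track on either side of the reactor (all numbers and names here are illustrative, not taken from the actual Fukushima detectors):

```python
import math

def scattering_angle(d_in, d_out):
    """Angle (radians) between incoming and outgoing muon directions.

    Multiple Coulomb scattering grows with the atomic number Z of the
    traversed material, so larger angles hint at dense, high-Z material
    such as nuclear fuel debris.
    """
    dot = sum(a * b for a, b in zip(d_in, d_out))
    norm = math.sqrt(sum(a * a for a in d_in)) * math.sqrt(sum(b * b for b in d_out))
    # Clamp to guard against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# Illustrative tracks: one muon barely deflects (low-Z path),
# another scatters noticeably (possible high-Z material in its path).
straight = scattering_angle((0, 0, 1), (0.001, 0, 1))
deflected = scattering_angle((0, 0, 1), (0.05, 0.02, 1))
print(deflected > straight)  # True: the larger angle flags the denser path
```

Accumulating many such angles along many muon paths is what lets the detectors build up a density picture of the reactor's interior.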

The progression of muon tomography is an interesting subject in relation to the tools that can help us understand issues we face here on Earth: situations that need new diagnostic approaches to extreme environments, for example measuring rock density, magma flows, or even the interiors of nuclear reactors.

One has to learn to follow the "links that are dropped," which trace a thread of evolution. These help one understand how such technologies came to be used to take measurements in those extreme situations. "Sensor-ability" then takes on a new meaning when informed by current scientific research and understanding in particle physics.

Friday, July 06, 2012

The Bolshoi simulation

A virtual world?

The more complex the database, the more accurate the simulation. The point, though, is that you have to capture scientific processes through calorimeter examinations, just as you do at the LHC.

So these backdrops are part of the process of identifying particles as they approach Earth or are produced on Earth. See Fermi's capture of thunderstorms, and one might ask how Fermi's imaging would have looked had it been pointed toward the Fukushima Daiichi nuclear disaster.

So the idea here is: how do you map particulates as a measure of natural processes? Does the virtual world lack the depth of measurement with which correlation can exist in the natural world? Why? Because it asks the designers of computation and memory to directly map the results of the experiments. So who designs the experiments to meet the data?

How did they know the energy range in which the Higgs boson would be detected?

The Bolshoi simulation is the most accurate cosmological simulation of the evolution of the large-scale structure of the universe yet made ("bolshoi" is the Russian word for "great" or "grand"). The first two of a series of research papers describing Bolshoi and its implications have been accepted for publication in the Astrophysical Journal. The first data release of Bolshoi outputs, including output from Bolshoi and also the BigBolshoi or MultiDark simulation of a volume 64 times bigger than Bolshoi, has just been made publicly available to the world's astronomers and astrophysicists. The starting point for Bolshoi was the best ground- and space-based observations, including NASA's long-running and highly successful WMAP Explorer mission that has been mapping the light of the Big Bang in the entire sky. One of the world's fastest supercomputers then calculated the evolution of a typical region of the universe a billion light years across.

The Bolshoi simulation took 6 million CPU hours to run on the Pleiades supercomputer—recently ranked as seventh fastest of the world's top 500 supercomputers—at NASA Ames Research Center. This visualization of dark matter is 1/1000 of the gigantic Bolshoi cosmological simulation, zooming in on a region centered on the dark matter halo of a very large cluster of galaxies. Chris Henze, NASA Ames Research Center. See: Introduction: The Bolshoi Simulation

Snapshot from the Bolshoi simulation at a redshift z=0 (meaning at the present time), showing filaments of dark matter along which galaxies are predicted to form.
CREDIT: Anatoly Klypin (New Mexico State University), Joel R. Primack (University of California, Santa Cruz), and Stefan Gottloeber (AIP, Germany).
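The caption's note that "redshift z=0" means the present time follows from the standard relation between redshift and the cosmic scale factor, a = 1/(1+z). A tiny illustrative sketch:

```python
def scale_factor(z):
    """Cosmic scale factor a = 1/(1+z): a=1 today (z=0), smaller in the past."""
    return 1.0 / (1.0 + z)

print(scale_factor(0))  # 1.0 -> the present epoch
print(scale_factor(1))  # 0.5 -> the universe was half its current linear scale
```

So a z=0 snapshot shows the simulated universe as it would look today, the latest output of the run.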

Pleiades Supercomputer

MOFFETT FIELD, Calif. – Scientists have generated the largest and most realistic cosmological simulations of the evolving universe to date, thanks to NASA's powerful Pleiades supercomputer. Using the "Bolshoi" simulation code, researchers hope to explain how galaxies and other very large structures in the universe have changed since the Big Bang.

To complete the enormous Bolshoi simulation, which traces how the largest galaxies and galaxy structures in the universe were formed billions of years ago, astrophysicists at New Mexico State University in Las Cruces, New Mexico, and the University of California High-Performance Astrocomputing Center (UC-HIPACC) in Santa Cruz, Calif., ran their code on Pleiades for 18 days, consuming millions of hours of computer time and generating enormous amounts of data. Pleiades is the seventh most powerful supercomputer in the world.

“NASA installs systems like Pleiades, that are able to run single jobs that span tens of thousands of processors, to facilitate scientific discovery,” said William Thigpen, systems and engineering branch chief in the NASA Advanced Supercomputing (NAS) Division at NASA's Ames Research Center.
See: NASA Supercomputer Enables Largest Cosmological Simulations

See Also: Dark matter’s tendrils revealed

Tuesday, March 15, 2011

Turn Fermi Toward Japan Skies

Tokyo Electric Power Co./AP: View of the damaged No. 4 unit of the Fukushima nuclear plant in northeastern Japan.

Workers abandoned Japan's quake-stricken nuclear plant on the verge of meltdown Tuesday when increasing radiation levels made it too dangerous to remain. See: Japan nuclear crisis: Workers halt desperate struggle to stop meltdown at Fukushima plant

Sometimes the tools we use to measure events in space, such as satellites, can also be used to help detect the flow patterns of radiation emissions from the nuclear reactors affected by the earthquakes in Japan.

2011 Japanese Earthquake and Tsunami

A massive 8.9/9.0 magnitude earthquake hit the Pacific Ocean near northeastern Japan at around 2:46 p.m. on March 11 (JST), causing widespread damage, blackouts, fires and a tsunami. On this page we are providing information regarding the disaster and damage, with real-time updates.

The large earthquake triggered a tsunami warning for countries all around the Pacific Ocean.