|The Hubble Volume Simulations|
HV page in Michigan
The Hubble Volume Project is a joint effort of the Virgo Consortium
and collaborators in the U.S., Canada, the U.K., and Germany.
To study the formation of clusters of galaxies, filaments and
void structures, a significant fraction of the entire observable
universe is modelled and simulated using one billion (10^9) mass
particles. This is the largest such computer simulation ever done.
The 512-processor T3E parallel computer in Garching was used
for this simulation, which produced almost a terabyte of output data
in some 70 hours.
This page contains information on the Hubble Volume Simulations; for those interested in the simulation data, instructions for obtaining the data sets are given below. Please note that the pictures on this page are freely available with suitable credit and reference.
An important aspect of the Hubble Volume Simulations is that data are output along a light cone, so the evolution of clustering is incorporated in the output data (see the picture below).
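To connect light-cone output to what is actually observed, each output redshift can be mapped to a comoving distance from the observer. Here is a minimal sketch of that mapping for a flat LCDM model; the density parameters below are illustrative defaults, not necessarily the exact values used in the simulations (see the papers for those):

```python
import numpy as np

def comoving_distance(z, om=0.3, ol=0.7, n=10001):
    """Comoving distance to redshift z in Mpc/h for flat LCDM:
    D_C = (c/H0) * integral_0^z dz'/E(z'),  E(z) = sqrt(om*(1+z)^3 + ol).
    Simple trapezoidal integration; om/ol are assumed illustrative values."""
    if z == 0.0:
        return 0.0
    c_over_h0 = 2997.92458                      # c/H0 in Mpc/h (c / 100 km/s)
    zs = np.linspace(0.0, z, n)
    inv_e = 1.0 / np.sqrt(om * (1.0 + zs) ** 3 + ol)
    dz = zs[1] - zs[0]
    # trapezoid rule: sum of interior points plus half the endpoints
    return c_over_h0 * dz * (inv_e.sum() - 0.5 * (inv_e[0] + inv_e[-1]))

# a structure recorded on the light cone at z = 0.5 sits at
# roughly 1300 Mpc/h comoving distance in this cosmology
d_half = comoving_distance(0.5)
```

A particle is written to the light-cone output at the moment its world line crosses this distance-redshift relation, which is why clustering at large distances in the wedge pictures appears at an earlier evolutionary stage.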
A narrow wedge showing the evolution of the clustering:
Get this picture in various color combinations!
An unsmoothed version of the two wedges combined.
Deep wedge of LCDM simulation
Credit for all light-cone pictures: Gus Evrard and Andrzej Kudlicki, ref. 1
|Images of cluster populations along light-cone|
|Images of snapshots|
Pic. 1: This picture shows the projected matter distribution in a slab of
2000 x 2000 x 20 (Mpc/h)^3 taken from a snapshot
(particle positions fixed at a single instant) of the tauCDM model.
If you click on the picture you get it at its original size. Watch out: it is huge.
Snapshot pictures in various sizes.
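The projection behind Pic. 1 is conceptually simple: select particles in a thin slab and bin their positions into a 2D surface-density map. A minimal sketch (the box size and slab depth match the caption; the pixel count is an arbitrary choice):

```python
import numpy as np

def project_slab(pos, box=2000.0, depth=20.0, npix=256):
    """Project particles with 0 <= z < depth onto a 2D surface-density
    map of npix x npix pixels, as in the snapshot picture above.
    pos is an (N, 3) array of positions in Mpc/h."""
    sel = pos[:, 2] < depth                      # keep the thin slab only
    img, _, _ = np.histogram2d(pos[sel, 0], pos[sel, 1],
                               bins=npix, range=[[0, box], [0, box]])
    return img

# demo on uniform random points; real snapshot data would show
# the filaments, voids and superclusters described in the text
rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 2000.0, size=(100000, 3))
img = project_slab(pos)
```

Displaying `img` on a logarithmic intensity scale is what brings out the faint filaments against the bright clusters in pictures like these.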
Here is another representation which gives a much better impression of
the clustering. There are some huge voids and superclusters!
Click here to get this picture in gzipped PS format.
Credit: Hugh Couchman, ref.3
|List of the available data|
The following data sets, except the cluster catalogues, are very large, so it is practically impossible to download them via the internet or FTP. By selecting data types and clicking "GO" you will get information on the data, such as the number of pieces, size, and magnetic tape format, along with a detailed description. Please read the description carefully and send your request as instructed.
Cluster catalogues (downloadable)
Snapshot: identified with the FOF algorithm, linking parameter b = 0.2 for tCDM and 0.164 for LCDM, ref. 2.
tCDM snapshot z=0.0 (36 MB)
LCDM snapshot z=0.0 (42 MB)
tCDM snapshot z=0.5
LCDM snapshot z=0.5
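The friends-of-friends (FOF) idea behind the snapshot catalogues is to link any two particles closer than b times the mean interparticle spacing and call each connected set a cluster. A toy brute-force sketch of that idea, assuming a periodic box (the production halo finder is far more efficient than this O(N^2) pair loop):

```python
import numpy as np

def fof_groups(pos, box, b=0.2):
    """Toy friends-of-friends: link particle pairs separated by less than
    b times the mean interparticle spacing; return a group label per particle."""
    n = len(pos)
    link = b * box / n ** (1.0 / 3.0)            # linking length
    parent = np.arange(n)                        # union-find forest

    def find(i):                                 # root with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):                           # brute-force pair search
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            d -= box * np.round(d / box)         # minimum-image (periodic) wrap
            if np.dot(d, d) < link * link:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj              # union the two groups
    return np.array([find(i) for i in range(n)])

# two tight clumps of three particles each, far apart: two groups expected
pts = np.array([[1.0, 1.0, 1.0], [1.01, 1.0, 1.0], [1.0, 1.01, 1.0],
                [5.0, 5.0, 5.0], [5.01, 5.0, 5.0], [5.0, 5.01, 5.0]])
labels = fof_groups(pts, box=10.0, b=0.2)
```

At fixed b the FOF cluster boundary roughly traces a fixed density contour, which is why the two cosmologies above use slightly different linking parameters.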
Lightcone: identified with the SO algorithm, density threshold 200, ref. 1. See also the DESCRIPTION (HTML/txt).
tCDM narrow (160 KB)
LCDM deep (430 KB)
tCDM octant A (10 MB)
LCDM octant A (22 MB)
tCDM octant B (10 MB)
LCDM octant B (11 MB)
tCDM sphere A (29 MB)
LCDM sphere A (40 MB)
tCDM sphere B (28 MB)
LCDM sphere B (40 MB)
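The spherical-overdensity (SO) algorithm used for the light-cone catalogues grows a sphere around a density peak and assigns the cluster the mass inside the radius where the enclosed mean density falls to the threshold (here 200) times the mean density. A toy sketch of that criterion, assuming a known center and a periodic box (the production code also has to locate the peaks):

```python
import numpy as np

def so_mass(pos, center, box, delta=200.0, mpart=1.0):
    """Toy spherical-overdensity mass: particles within the outermost
    radius at which the enclosed mean density still exceeds delta times
    the mean density of the box. pos is (N, 3); mpart is the particle mass."""
    d = pos - center
    d -= box * np.round(d / box)                 # minimum-image (periodic) wrap
    r = np.sort(np.linalg.norm(d, axis=1))       # sorted radii from the center
    rho_mean = len(pos) * mpart / box ** 3
    n_enc = np.arange(1, len(r) + 1)             # enclosed particle count
    vol = 4.0 / 3.0 * np.pi * np.maximum(r, 1e-12) ** 3
    rho_enc = n_enc * mpart / vol                # enclosed mean density
    inside = rho_enc >= delta * rho_mean
    if not inside.any():
        return 0.0
    k = np.nonzero(inside)[0][-1]                # outermost radius above threshold
    return n_enc[k] * mpart

# demo: a 100-particle clump in a uniform 1000-particle background
rng = np.random.default_rng(0)
bg = rng.uniform(0.0, 10.0, size=(1000, 3))
clump = 5.0 + 0.05 * rng.standard_normal((100, 3))
pos = np.vstack([bg, clump])
m200 = so_mass(pos, center=np.array([5.0, 5.0, 5.0]), box=10.0)
```

Because the mass definition depends only on a density contrast, SO masses are straightforward to compare with X-ray and lensing mass estimates of observed clusters.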
Others: some data available in Michigan, ref. 1.
Here is an example of proper credit for these Hubble data.
"The simulations in this paper were carried out by the Virgo Supercomputing Consortium using computers based at the Computing Centre of the Max-Planck Society in Garching and at the Edinburgh Parallel Computing Centre. The data are publicly available at http://www.mpa-garching.mpg.de/NumCos"
Back to MPA/ NumCos
Last modified: Wed Oct 31 12:20:17 MET 2001 by Virgo Administrator