Re: error using zoom ICs in multiple files

From: Volker Springel <volker_at_MPA-Garching.MPG.DE>
Date: Sat, 05 Aug 2006 19:15:19 +0200

Hi Michele,

It looks like the npartTotal[6] fields in your file ics_mt_try.dat.1 are
not set correctly (they are all 0, but should give the total number of
particles of each type); they are only set properly in ics_mt_try.dat.0.

This is presumably due to a bug in an old version of pgenIC, which you
apparently obtained from somebody... A possible work-around is to set
NumFilesWrittenInParallel=1 in Gadget2.
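Alternatively, the broken header can be repaired in place by copying the npartTotal block from the good file into the bad one. A minimal sketch under the same assumptions as above (format-1 little-endian files, npartTotal at file offset 100 behind the 4-byte Fortran record marker):

```python
import struct

# Byte offset of npartTotal inside a format-1 snapshot file:
# 4-byte Fortran record marker, then within the 256-byte header:
# npart (6 x uint32 = 24) + massarr (6 x double = 48)
# + time (8) + redshift (8) + flag_sfr (4) + flag_feedback (4) = 96.
NPART_TOTAL_OFFSET = 4 + 96

def read_npart_total(path):
    """Read the six npartTotal values from a format-1 header."""
    with open(path, "rb") as f:
        f.seek(NPART_TOTAL_OFFSET)
        return struct.unpack("<6I", f.read(6 * 4))

def copy_npart_total(src, dst):
    """Copy npartTotal from src (the good file) into dst (the broken one)."""
    values = read_npart_total(src)
    with open(dst, "r+b") as f:
        f.seek(NPART_TOTAL_OFFSET)
        f.write(struct.pack("<6I", *values))
    return values
```

After patching, every file of the set carries the same npartTotal values, which is what Gadget2 expects when reading a multi-file IC set.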

Volker



Michele Trenti wrote:
> Hi All,
>
> I am trying to start a Gadget2 run using zoom initial conditions.
> Everything is fine if the ICs are in a single file, but I get an error if
> the particles are spread across more than one file. On the other hand, if
> I use only one type of particle, I can read and run multiple IC files
> with no problems.
>
> My question is whether there is a specific convention to follow when
> preparing zoom ICs split across multiple files. I used P-GenIC.
>
> The problem seems to be that All.MaxPart = 0 in one of the reading
> tasks (see log below).
>
> Thanks in advance for your help,
>
> Michele
>
> ---------------------------------------
> carina> mpiexec -n 2 ./source/Gadget2 cluster.param
>
> This is Gadget, version `2.0'.
>
> Running on 2 processors.
>
> found 1 times in output-list.
>
> Allocated 40 MByte communication buffer per processor.
>
> Communication buffer has room for 806596 particles in gravity computation
> Communication buffer has room for 327680 particles in density computation
> Communication buffer has room for 262144 particles in hydro computation
> Communication buffer has room for 243854 particles in domain decomposition
>
>
> Hubble (internal units) = 0.1
> G (internal units) = 43007.1
> UnitMass_in_g = 1.989e+43
> UnitTime_in_s = 3.08568e+16
> UnitVelocity_in_cm_per_s = 100000
> UnitDensity_in_cgs = 6.76991e-22
> UnitEnergy_in_cgs = 1.989e+53
>
> Task=0 FFT-Slabs=64
> Task=1 FFT-Slabs=64
>
> Allocated 32.25 MByte for FFT kernel(s).
>
>
> Allocated 41.3557 MByte for particle storage. 80
>
>
> reading file `./ics_mt_try.dat.0' on task=0 (contains 403813 particles.)
> distributing this file to tasks 0-0
> Type 0 (gas): 0 (tot= 0000000000) masstab=0
> Type 1 (halo): 248832 (tot= 0000373248) masstab=0.0317631
> Type 2 (disk): 27880 (tot= 0000049040) masstab=2.03284
> Type 3 (bulge): 127101 (tot= 0000255285) masstab=16.2627
> Type 4 (stars): 0 (tot= 0000000000) masstab=0
> Type 5 (bndry): 0 (tot= 0000000000) masstab=0
>
>
> reading file `./ics_mt_try.dat.1' on task=1 (contains 273760 particles.)
> distributing this file to tasks 1-1
> Type 0 (gas): 0 (tot= 0000000000) masstab=0
> Type 1 (halo): 124416 (tot= 0000000000) masstab=0.0317631
> Type 2 (disk): 21160 (tot= 0000000000) masstab=2.03284
> Type 3 (bulge): 128184 (tot= 0000000000) masstab=16.2627
> Type 4 (stars): 0 (tot= 0000000000) masstab=0
> Type 5 (bndry): 0 (tot= 0000000000) masstab=0
>
> too many particles
> task 1: endrun called with an error level of 1313
>
>
> [cli_1]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 1313) - process 1
> rank 1 in job 31 carina.stsci.edu_35299 caused collective abort of all
> ranks
> exit status of rank 1: return code 33
> ----------------------------------------------
>
> If I print All.MaxPart when the "too many particles" condition occurs, I
> get:
>
> -----------------
> too many particles
> All.MaxPart = 0
> -----------------
>
> I compiled the code with:
>
> OPT += -DPERIODIC
> OPT += -DUNEQUALSOFTENINGS
> OPT += -DPEANOHILBERT
> OPT += -DWALLCLOCK
> OPT += -DPMGRID=128
> OPT += -DPLACEHIGHRESREGION=3
> OPT += -DENLARGEREGION=1.2
> OPT += -DSYNCHRONIZATION
>
> OPTIMIZE = -O2 -Wall -g
>
> -------------------------------
>
> and extracted from the parameter file:
>
> % Code options
>
> ICFormat 1
> SnapFormat 1
> ComovingIntegrationOn 1
>
> TypeOfTimestepCriterion 0
> OutputListOn 1
> PeriodicBoundariesOn 1
>
> % Softening lengths
>
> SofteningHalo 15.0
> SofteningDisk 30.0
> SofteningBulge 60.0
>
> SofteningHaloMaxPhys 5.0
> SofteningDiskMaxPhys 10.0
> SofteningBulgeMaxPhys 20.0
>
>
>
> Michele Trenti
> Space Telescope Science Institute
> 3700 San Martin Drive Phone: +1 410 338 4987
> Baltimore MD 21218 U.S. Fax: +1 410 338 4767
>
>
> " We shall not cease from exploration
> And the end of all our exploring
> Will be to arrive where we started
> And know the place for the first time. "
>
> T. S. Eliot
>
>
>
>
Received on 2006-08-05 19:15:20