Re: domain decomposition problem

From: Arman Khalatyan <arm2arm_at_googlemail.com>
Date: Wed, 18 Apr 2007 07:59:57 +0200

You should increase PartAllocFactor.
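
For example, in the parameter file (the value 2.5 here is purely illustrative; how much headroom you actually need depends on how unevenly the particles distribute across tasks):

```
% Memory allocation
PartAllocFactor      2.5      % was 1.5; allows more particles per task
TreeAllocFactor      0.8
BufferSize           50       % in MByte
```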


Arman.



On 4/17/07, Javiera Guedes <javiera_at_ucolick.org> wrote:
>
>
> Thanks Yves,
>
> I have tried increasing the TreeAllocFactor, but I got the same error. I
> ran the same simulation before on my desktop computer for a few hours
> and it ran just fine. I don't understand why it doesn't run as smoothly
> on a supercomputer cluster....
> -Javiera
>
> Yves Revaz wrote:
> >
> > Javiera,
> >
> > This problem typically occurs when the distance between two or more
> > particles is too small compared to the system size. In this case, the
> > tree must be very deep, and the allocated memory is not large enough
> > to contain it.
> >
> > If possible, you should increase the memory allocated for the tree
> > construction by increasing the parameter TreeAllocFactor.
> >
> > However, in some cases, especially when you have a very dissipative
> > component, some particles may form clumps and it will be difficult to
> > avoid the problem.
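
A rough way to see the scaling (an illustrative sketch, not Gadget-2's actual allocator):

```python
import math

# Illustrative sketch: an oct-tree must keep refining until the two
# closest particles end up in different cells. A cell at level k has
# side L / 2**k, so separating a pair at distance d requires roughly
# log2(L / d) levels, and every extra level costs additional nodes.
def levels_to_separate(box_size, min_separation):
    """Approximate tree depth needed to put two particles at distance
    min_separation into separate cells of a box of side box_size."""
    return math.ceil(math.log2(box_size / min_separation))

# A dissipative clump with separations of 1e-6 box lengths forces about
# 20 levels of refinement, far deeper than a smooth distribution needs:
print(levels_to_separate(1.0, 1e-6))  # -> 20
```

This is why a fixed TreeAllocFactor that is fine for a smooth distribution can be exhausted once a dissipative component collapses into tight clumps.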
> >
> > Regards,
> >
> > Yves
> >
> >
> >
> >
> > Javiera Guedes wrote:
> >
> >>
> >> Hi all,
> >>
> >> I'm trying to use Gadget2 to evolve a galaxy with halo, gas, and star
> >> particles. I was running this on a single processor until I got an
> >> account at the Pleiades, a brand new supercomputer at UCSC. I get the
> >> following error:
> >>
> >> **********************************************************
> >> This is Gadget, version `2.0'.
> >>
> >> Running on 8 processors.
> >>
> >> Allocated 50 MByte communication buffer per processor.
> >>
> >> Communication buffer has room for 1008246 particles in gravity
> >> computation
> >> Communication buffer has room for 409600 particles in density
> >> computation
> >> Communication buffer has room for 327680 particles in hydro computation
> >> Communication buffer has room for 304818 particles in domain
> >> decomposition
> >>
> >>
> >> Hubble (internal units) = 4.13479
> >> G (internal units) = 0.999652
> >> UnitMass_in_g = 6.30712e+48
> >> UnitTime_in_s = 1.27586e+18
> >> UnitVelocity_in_cm_per_s = 6.91e+07
> >> UnitDensity_in_cgs = 9.20417e-30
> >> UnitEnergy_in_cgs = 3.01153e+64
> >>
> >> Task=0 FFT-Slabs=32
> >> Task=1 FFT-Slabs=32
> >> Task=2 FFT-Slabs=32
> >> Task=3 FFT-Slabs=32
> >> Task=4 FFT-Slabs=32
> >> Task=5 FFT-Slabs=32
> >> Task=6 FFT-Slabs=32
> >> Task=7 FFT-Slabs=32
> >>
> >> Allocated 8.0625 MByte for FFT kernel(s).
> >>
> >>
> >> Allocated 20.0963 MByte for particle storage. 80
> >>
> >> Allocated 4.09163 MByte for storage of SPH data. 84
> >>
> >>
> >> reading file `/home/javiera/ICs/MW1.bh.000.gadget' on task=0
> >> (contains 1404834 particles.)
> >> distributing this file to tasks 0-7
> >> Type 0 (gas): 272409 (tot= 0000272409) masstab=0
> >> Type 1 (halo): 951539 (tot= 0000951539) masstab=0
> >> Type 2 (disk): 0 (tot= 0000000000) masstab=0
> >> Type 3 (bulge): 0 (tot= 0000000000) masstab=0
> >> Type 4 (stars): 180886 (tot= 0000180886) masstab=0
> >> Type 5 (bndry): 0 (tot= 0000000000) masstab=0
> >>
> >> reading done.
> >> Total number of particles : 0001404834
> >>
> >> allocated 0.0762939 Mbyte for ngb search.
> >>
> >> Allocated 15.6342 MByte for BH-tree. 64
> >>
> >> domain decomposition...
> >> NTopleaves= 834
> >>
> >> *No domain decomposition that stays within memory bounds is possible.*
> >>
> >> **************************************************************
> >>
> >> The relevant options on my parameter files are:
> >>
> >> % Memory allocation
> >>
> >> PartAllocFactor 1.5
> >> TreeAllocFactor 0.8
> >> BufferSize 50 % in MByte
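
For scale, the "Allocated 20.0963 MByte for particle storage. 80" line in the log above follows directly from these numbers (a back-of-envelope check; I am assuming the trailing 80 in that line is bytes per particle):

```python
# Back-of-envelope check of the log line "Allocated 20.0963 MByte for
# particle storage. 80" (assuming the trailing 80 is bytes/particle).
n_total = 1404834        # total particles in the initial conditions
n_tasks = 8              # MPI tasks
part_alloc_factor = 1.5  # from the parameter file

# Each task reserves room for PartAllocFactor * (N / NTask) particles:
max_per_task = part_alloc_factor * n_total / n_tasks
storage_mb = max_per_task * 80 / 1024**2

print(int(max_per_task))     # ~263406 particle slots per task
print(round(storage_mb, 4))  # ~20.0963 MByte, matching the log
```

So if clustering pushes any single task above roughly 263,000 particles, the domain decomposition cannot stay within memory bounds, which is why raising PartAllocFactor is the usual fix.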
> >>
> >>
> >> And my makefile options are:
> >> OPT += -DUNEQUALSOFTENINGS
> >> OPT += -DPEANOHILBERT
> >> OPT += -DWALLCLOCK
> >> OPT += -DSYNCHRONIZATION
> >>
> >> Any suggestions would be much appreciated.
> >>
> >> Thanks,
> >> -- Javiera
> >>
> >> -----------------------------------
> >> Javiera Guedes
> >> UCSC Astronomy Department
> >> javiera_at_astro.ucsc.edu
> >> ph: (831) 459-5722 fax: (831) 459-5265
> >> -----------------------------------
> >>
> >>
> >>
> >>
> >> -----------------------------------------------------------
> >> If you wish to unsubscribe from this mailing, send mail to
> >> minimalist_at_MPA-Garching.MPG.de with a subject of: unsubscribe
> >> gadget-list
> >> A web-archive of this mailing list is available here:
> >> http://www.mpa-garching.mpg.de/gadget/gadget-list
> >
> >
> >
>
>
>
>
>
>
Received on 2007-04-18 08:00:01

This archive was generated by hypermail 2.3.0 : 2023-01-10 10:01:30 CET