Re: Maximum Number of Tree Nodes Reached

From: Volker Springel <volker_at_MPA-Garching.MPG.DE>
Date: Fri, 10 Nov 2017 08:35:04 +0100

Hi Jared,

Your particle data is stored in single precision (float), right? In that case you can get a coordinate collision between two particles - they are mapped to the very same (discrete) floating-point coordinates. This is not terribly likely for the particle number you run, but it can occur if you are unlucky. Now, the tree construction algorithm of gadget2 always tries to split particles into nodes of their own. This fails in such a case and leads to code termination with the error you got. The code can wiggle around this if you allow it to modify the node association of particles once the node side length has fallen below a small fraction of the softening length. For this to work in your case, you need to have specified a softening for particle type 0... Maybe this is set to zero? In that case, give it a sensible value (like the one for type 1) and retry.

Regards,
Volker


> On 10. Nov 2017, at 06:00, Jared Coughlin <Jared.W.Coughlin.29_at_nd.edu> wrote:
>
> Hello! I've run three dark-matter-only simulations with N = 1024^3 particles and L = 40 Mpc/h. Each simulation was started from the same initial conditions (generated using 2LPTic) and run on 72 processors (each node of the supercomputer I'm using has 24 processors and 256 GB of RAM, so I'm using 3 full nodes). I save five snapshots. However, once the simulations are done (they all run successfully), I need to know the density field, so I have a quick code that simply reads each snapshot and changes the particle types from 1 to 0. I then use gadget's density calculation on each snapshot to get smoothing lengths and densities for all of my particles. The reason I'm doing the whole simulation as pure N-body is run time.
>
> For two of my simulations (the non-standard dark energy ones), this works like a charm and gives me what I need for each of the ten snapshots in total. However, for the cosmological constant simulation, only four out of the five snapshots run successfully. The snapshot at z = 3 fails with the error "task 57: maximum number 25836911 tree-nodes reached." I've tracked this down to the while(1) loop in force_treebuild_single() in forcetree.c. I should also note that I've run plenty of cosmological constant simulations using this method in the past, with the only difference being that those had fewer particles (I've even run one at the same resolution successfully, 428^3 particles in a 15 Mpc/h box).
>
> I've read the other questions concerning this error on the mailing list, but their solutions do not appear to work for me. I increased TreeAllocFac to 1.5 without changing the number of processors, and then I also increased the number of processors to 96, and it still fails. I've been attempting to understand the inner workings of the tree construction for the last several days, but I'm getting nowhere fast.
>
> What's really confusing me is why all the other snapshots worked just fine. I cannot for the life of me figure out what is different about this one, so if anyone has any thoughts on what causes this error, I would love to hear them. Thank you very much for your time!
>
> Sincerely,
> -Jared
>
>
>
> -----------------------------------------------------------
>
> If you wish to unsubscribe from this mailing, send mail to
> minimalist_at_MPA-Garching.MPG.de with a subject of: unsubscribe gadget-list
> A web-archive of this mailing list is available here:
> http://www.mpa-garching.mpg.de/gadget/gadget-list
Received on 2017-11-10 08:35:02

This archive was generated by hypermail 2.3.0 : 2023-01-10 10:01:32 CET