Re: endrun called with errorlevel 1

From: Volker Springel <volker_at_MPA-Garching.MPG.DE>
Date: Sat, 01 Mar 2008 19:35:20 +0100

Ammar Ahmad Awan wrote:
> Hi all,
>
> I am trying to execute the gassphere example and I get an error after a
> few timesteps:
>
>
> Begin Step 83, Time: 0.691406, Systemstep: 0.00292969
> domain decomposition...
> NTopleaves= 344
> work-load balance=1 memory-balance=1
> domain decomposition done.
> begin Peano-Hilbert order...
> Peano-Hilbert done.
> Start force computation...
> Tree construction.
> task 0: maximum number 441 of tree-nodes reached.
> for particle 341
> task 0: endrun called with an error level of 1
>
>
> [chenab1:04227] MPI_ABORT invoked on rank 0 in communicator
> MPI_COMM_WORLD with errorcode 1
> task 2: endrun called with an error level of 1
>
>
> [chenab1:04229] MPI_ABORT invoked on rank 2 in communicator
> MPI_COMM_WORLD with errorcode 1
> task 1: endrun called with an error level of 1
>
>
> [chenab1:04228] MPI_ABORT invoked on rank 1 in communicator
> MPI_COMM_WORLD with errorcode 1
> task 3: endrun called with an error level of 1
>
>
> [chenab1:04230] MPI_ABORT invoked on rank 3 in communicator
> MPI_COMM_WORLD with errorcode 1
>
> I have changed a few options in the gassphere.param file, like
> increasing/decreasing PartAllocFactor, but can't figure out the problem.
>

Hi,

This should go away if you increase TreeAllocFactor.
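·
For example, in gassphere.param you could raise the value along these
lines (the number shown is only illustrative, not a recommended
default; pick whatever your run turns out to need):

    % raised to give the tree more room for nodes
    TreeAllocFactor      2.0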

Since the code always needs a certain number of tree-nodes for the
top-level tree structure, running a very small simulation on
several/many CPUs tends to require a larger setting of TreeAllocFactor
to provide enough storage space for the tree.
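·
As a rough back-of-the-envelope illustration (the scaling below is an
approximation, not the exact internal bookkeeping): the per-task node
budget grows roughly like TreeAllocFactor * PartAllocFactor * N/NTask.
In your log, only 441 tree-nodes were allocated on task 0, while the
top-level tree alone already has 344 leaves, so almost no room is left
for the nodes of the particle tree itself. Increasing TreeAllocFactor
raises that budget.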

Cheers,
Volker
Received on 2008-03-01 19:33:58
