Dear gadget2 users,
I'm a postgraduate student learning how to run Gadget-2.0.7.
I understand from the user guide that I can supply my IC in HDF5 format, and my tutor said I can use an HDF5 snapshot file as the IC. So I tried using a snapshot produced from galaxy.param and galaxy_littleendian.dat, but it doesn't work and returns an error level of 1.
I wonder whether I am using the wrong file format, or whether I should write a new HDF5 IC in some other way.
Here's the error it returns:
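In case it matters, this is the layout I believe the HDF5 reader expects. It is only a minimal sketch using h5py; the group name PartType2 comes from the error log below, while the header attributes, dataset names, and all values are my assumptions from the user guide, so please correct me if any are wrong:

```python
# Sketch of a minimal HDF5 IC containing only 400 disk (type 2) particles.
# The group/dataset names and header attributes are my assumptions from
# the user guide and the error log -- Gadget-2 may require further header
# fields (flags, cosmology) that are omitted here.
import numpy as np
import h5py

n = 400  # number of disk particles, as in my test file
rng = np.random.default_rng(0)

with h5py.File("test_ic.hdf5", "w") as f:
    header = f.create_group("Header")
    npart = np.array([0, 0, n, 0, 0, 0], dtype=np.int32)
    header.attrs["NumPart_ThisFile"] = npart
    header.attrs["NumPart_Total"] = npart.astype(np.uint32)
    header.attrs["NumPart_Total_HighWord"] = np.zeros(6, dtype=np.uint32)
    header.attrs["MassTable"] = np.zeros(6, dtype=np.float64)
    header.attrs["Time"] = 0.0
    header.attrs["Redshift"] = 0.0
    header.attrs["BoxSize"] = 0.0
    header.attrs["NumFilesPerSnapshot"] = 1

    # Per-particle data for type 2; positions are random placeholders so
    # no two particles coincide (made-up values, not a physical disk).
    part = f.create_group("PartType2")
    part.create_dataset("Coordinates",
                        data=rng.uniform(-1.0, 1.0, (n, 3)).astype(np.float32))
    part.create_dataset("Velocities",
                        data=np.zeros((n, 3), dtype=np.float32))
    part.create_dataset("ParticleIDs",
                        data=np.arange(1, n + 1, dtype=np.uint32))
    part.create_dataset("Masses",
                        data=np.full(n, 1.0e-4, dtype=np.float32))
```

Is this roughly the structure Gadget-2 expects, or am I missing required fields?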
reading file `./test.hdf5' on task=0 (contains 400 particles.)
distributing this file to tasks 0-0
Type 0 (gas): 0 (tot= 0000000000) masstab=0
Type 1 (halo): 0 (tot= 0000000000) masstab=0
Type 2 (disk): 400 (tot= 0000000400) masstab=0
Type 3 (bulge): 0 (tot= 0000000000) masstab=0
Type 4 (stars): 0 (tot= 0000000000) masstab=0
Type 5 (bndry): 0 (tot= 0000000000) masstab=0
reading done.
Total number of particles : 0000000400
allocated 0.0762939 Mbyte for ngb search.
Allocated 0.796875 MByte for BH-tree. 64
domain decomposition...
done with snapshot.
Setting next time for snapshot file to Time_next= 0.1
Begin Step 0, Time: 0, Systemstep: 0
domain decomposition...
NTopleaves= 127
work-load balance=1 memory-balance=1
domain decomposition done.
begin Peano-Hilbert order...
Peano-Hilbert done.
Start force computation...
Tree construction.
task 0: endrun called with an error level of 1
NTopleaves= 127
work-load balance=1 memory-balance=1
domain decomposition done.
begin Peano-Hilbert order...
Peano-Hilbert done.
Begin Ngb-tree construction.
Ngb-Tree contruction finished
Setting next time for snapshot file to Time_next= 0
writing snapshot file...
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
task 0: maximum number 480 of tree-nodes reached.
for particle 335
Allocated 0.0389099 MByte for particle storage. 68
reading file `./test.hdf5' on task=0 (contains 400 particles.)
distributing this file to tasks 0-0
Type 0 (gas): 0 (tot= 0000000000) masstab=0
Type 1 (halo): 0 (tot= 0000000000) masstab=0
Type 2 (disk): 400 (tot= 0000000400) masstab=0
Type 3 (bulge): 0 (tot= 0000000000) masstab=0
Type 4 (stars): 0 (tot= 0000000000) masstab=0
Type 5 (bndry): 0 (tot= 0000000000) masstab=0
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5Gdeprec.c line 326 in H5Gopen1(): unable to open group
major: Symbol table
minor: Can't open object
#001: H5Gint.c line 320 in H5G__open_name(): group not found
major: Symbol table
minor: Object not found
#002: H5Gloc.c line 430 in H5G_loc_find(): can't find object
major: Symbol table
minor: Object not found
#003: H5Gtraverse.c line 861 in H5G_traverse(): internal path traversal failed
major: Symbol table
minor: Object not found
#004: H5Gtraverse.c line 641 in H5G_traverse_real(): traversal operator failed
major: Symbol table
minor: Callback failed
#005: H5Gloc.c line 385 in H5G_loc_find_cb(): object 'PartType2' doesn't exist
major: Symbol table
minor: Object not found
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5Ddeprec.c line 244 in H5Dopen1(): not a location
major: Invalid arguments to routine
minor: Inappropriate type
#001: H5Gloc.c line 253 in H5G_loc(): invalid object ID
major: Invalid arguments to routine
minor: Bad value
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5Dio.c line 140 in H5Dread(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5D.c line 415 in H5Dclose(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5Ddeprec.c line 244 in H5Dopen1(): not a location
major: Invalid arguments to routine
minor: Inappropriate type
#001: H5Gloc.c line 253 in H5G_loc(): invalid object ID
major: Invalid arguments to routine
minor: Bad value
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5Dio.c line 140 in H5Dread(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5D.c line 415 in H5Dclose(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5Ddeprec.c line 244 in H5Dopen1(): not a location
major: Invalid arguments to routine
minor: Inappropriate type
#001: H5Gloc.c line 253 in H5G_loc(): invalid object ID
major: Invalid arguments to routine
minor: Bad value
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5Dio.c line 140 in H5Dread(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5D.c line 415 in H5Dclose(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5Ddeprec.c line 244 in H5Dopen1(): not a location
major: Invalid arguments to routine
minor: Inappropriate type
#001: H5Gloc.c line 253 in H5G_loc(): invalid object ID
major: Invalid arguments to routine
minor: Bad value
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5Dio.c line 140 in H5Dread(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5D.c line 415 in H5Dclose(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
#000: H5G.c line 811 in H5Gclose(): not a group
major: Invalid arguments to routine
minor: Inappropriate type
task 0: endrun called with an error level of 1
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
Process name: [[31325,1],0]
Exit code: 1
In addition, I am using version 1.8.13 of the HDF5 package, but I don't think that is the problem; the same error also happens with HDF5 1.6.10.
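Since the log says object 'PartType2' doesn't exist, I also tried listing the top-level groups of the file. Here is the small h5py check I used (the demo file and its contents are made up just so the snippet runs on its own; in my case I open './test.hdf5' instead):

```python
import numpy as np
import h5py

# Build a tiny demo file so the listing below is self-contained; in my
# case I would open './test.hdf5' rather than creating this one.
with h5py.File("demo.hdf5", "w") as f:
    f.create_group("Header")
    f.create_group("PartType2").create_dataset(
        "Coordinates", data=np.zeros((4, 3), dtype=np.float32))

# List what the file actually contains at the top level.
with h5py.File("demo.hdf5", "r") as f:
    print(sorted(f.keys()))  # ['Header', 'PartType2']
    if "PartType2" in f:
        print(sorted(f["PartType2"].keys()))  # ['Coordinates']
```

If my test.hdf5 shows no PartType2 group here, does that mean the snapshot was not written with HDF5 output enabled?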
Best wishes,
wood
Received on 2022-11-16 13:42:59