I have a problem with read_ic.
I need to initialize a new vector variable for every particle (some kind of normal vector, coming from geometrical considerations). I am currently trying to add a new vector block to the Gadget2 read_ic.
I read the previous posts about ICs and the documentation: from what I understood, the input and output formats for Gadget are the same, and they are handled in the same places in the code.
The Fortran routine with which I generate the IC file seems to be fine: I assign the new vector components to every particle in the same way I assign their positions. It compiles successfully, and the values I print out are what I expect.
The problem shows up when I try to add the new block to Gadget2 (read_ic). In the recent past I added some scalar quantities (which I also wanted in the output) without any problem at all, so I don't understand why I am having these difficulties with a vector.
I updated the following things:
allvars.h
- #define IO_NBLOCKS 21
- the new entry in enum iofields:
enum iofields	/*!< this enumeration lists the defined output blocks in snapshot files. Not all of them need to be present. */
{
  IO_POS,
  IO_VEL,
  IO_ID,
  IO_MASS,
  IO_U,
  IO_RHO,
  .....
  IO_Pnorm
};  ----> 21 fields in total, matching IO_NBLOCKS
- Pnorm itself is defined in extern struct particle_data as:
  double Pnorm[3];
io.c
I added the same cases as for POS in the various functions of io.c (because the two blocks should be handled similarly). For example:
case IO_Pnorm:
strncpy(Tab_IO_Labels[IO_Pnorm], "Pnorm ", 4);
break;
I also added the new vector quantity to int blockpresent(enum iofields blocknr), because I don't want it in the snapshot output:
#ifndef OUTPUTPnorm
if(blocknr == IO_Pnorm)
return 0;
#endif
read_ic.c
case IO_Pnorm: /* normal vector */
for(n = 0; n < pc; n++)
for(k = 0; k < 3; k++)
P[offset + n].Pnorm[k] = *fp++;
break;
It compiles and seems to be fine. But then, when I execute, I get this error message:
reading file `../ICs/3Dtropfen' on task=0 (contains 4682 particles.)
incorrect block-sizes detected!
Task=0 blocknr=3 blksize1=56184 blksize2=0
task 0: endrun called with an error level of 1889
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1889.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
I am really confused: I don't understand why I get this, and I guess I am not clear about what blksize1 and blksize2 actually are. How can I get past this problem?
If someone has an idea about it and can explain why I get this problem only with vectors, that would be fantastic.
Thanks a lot in advance.
Kind regards,
Marzia
Received on 2013-06-26 15:17:59