TACC: Starting up job 4244477
TACC: Starting parallel tasks...

Shared memory islands host a minimum of 28 and a maximum of 28 MPI ranks.
We shall use 16 MPI ranks in total for assisting one-sided communication (1 per shared memory node).

[Gadget-4 ASCII-art banner]

This is Gadget, version 4.0.
Git commit 30019281bed6dcfb8d018c09095aeaa1e6ff5042, Wed Aug 11 12:22:28 2021 +0200

Code was compiled with the following compiler and flags:
mpicxx -std=c++11 -Wwrite-strings -Wredundant-decls -Woverloaded-virtual -Wcast-qual -Wcast-align -Wpointer-arith -Wmissing-declarations -g -Wall -W -O3 -march=native -I/opt/apps/gcc9_1/hdf5/1.10.4/x86_64/include -I/opt/apps/gcc9_1/gsl/2.6/include -I/opt/apps/gcc9_1/impi19_0/fftw3/3.3.8/include -Ibuild -Isrc

Code was compiled with the following settings:
    DOUBLEPRECISION=1
    FOF
    FOF_GROUP_MIN_LEN=32
    FOF_LINKLENGTH=0.2
    FOF_PRIMARY_LINK_TYPES=2
    FOF_SECONDARY_LINK_TYPES=1+16+32
    GADGET2_HEADER
    IDS_64BIT
    NTYPES=6
    PERIODIC
    PMGRID=128
    SELFGRAVITY
    SUBFIND

Running on 432 MPI tasks.

BEGRUN: Size of particle structure       136  [bytes]
BEGRUN: Size of sph particle structure   192  [bytes]
BEGRUN: Size of gravity tree node        104  [bytes]
BEGRUN: Size of neighbour tree node      192  [bytes]
BEGRUN: Size of subfind auxiliary data   104  [bytes]

-------------------------------------------------------------------------------------------------------------------------
AvailMem:       Largest = 186318.00 Mb (on task= 108), Smallest = 186195.86 Mb (on task= 297), Average = 186277.99 Mb
Total Mem:      Largest = 191386.89 Mb (on task= 108), Smallest = 191386.89 Mb (on task=   0), Average = 191386.89 Mb
Committed_AS:   Largest =   5191.03 Mb (on task= 297), Smallest =   5068.89 Mb (on task= 108), Average =   5108.90 Mb
SwapTotal:      Largest =      0.00 Mb (on task=   0), Smallest =      0.00 Mb (on task=   0), Average =      0.00 Mb
SwapFree:       Largest =      0.00 Mb (on task=   0), Smallest =      0.00 Mb (on task=   0), Average =      0.00 Mb
AllocMem:       Largest =   5191.03 Mb (on task= 297), Smallest =   5068.89 Mb (on task= 108), Average =   5108.90 Mb
avail /dev/shm: Largest =  95257.31 Mb (on task= 297), Smallest =  95257.25 Mb (on task=  51), Average =  95257.29 Mb
-------------------------------------------------------------------------------------------------------------------------
Task=0 has the maximum committed memory and is host: c203-034.frontera.tacc.utexas.edu
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Obtaining parameters from file 'param.txt':
InitCondFile                      lcdm_gas_littleendian.dat
OutputDir                         output
SnapshotFileBase                  snap
OutputListFilename                outputs_lcdm_gas.txt
ICFormat                          3
SnapFormat                        3
TimeLimitCPU                      180000
CpuTimeBetRestartFile             7200
MaxMemSize                        3000
TimeBegin                         0.0909091
TimeMax                           1
ComovingIntegrationOn             1
Omega0                            0.2814
OmegaLambda                       0.7186
OmegaBaryon                       0.0464
HubbleParam                       0.697
Hubble                            0.1
BoxSize                           400000
OutputListOn                      0
TimeBetSnapshot                   1.83842
TimeOfFirstSnapshot               0.047619
TimeBetStatistics                 0.05
NumFilesPerSnapshot               1
MaxFilesWithConcurrentIO          1
ErrTolIntAccuracy                 0.012
CourantFac                        0.15
MaxSizeTimestep                   0.025
MinSizeTimestep                   0
TypeOfOpeningCriterion            1
ErrTolTheta                       0.9
ErrTolThetaMax                    1
ErrTolForceAcc                    0.0025
TopNodeFactor                     2.5
ActivePartFracForNewDomainDecomp  0.01
DesNumNgb                         64
MaxNumNgbDeviation                3
UnitLength_in_cm                  3.08568e+21
UnitMass_in_g                     1.989e+43
UnitVelocity_in_cm_per_s          100000
GravityConstantInternal           0
DesLinkNgb                        20
SofteningComovingClass0           1.5
SofteningComovingClass1           1.5
SofteningComovingClass2           1.5
SofteningComovingClass3           1.5
SofteningComovingClass4           1.5
SofteningComovingClass5           1.5
SofteningMaxPhysClass0            1.5
SofteningMaxPhysClass1            1.5
SofteningMaxPhysClass2            1.5
SofteningMaxPhysClass3            1.5
SofteningMaxPhysClass4            1.5
SofteningMaxPhysClass5            1.5
SofteningClassOfPartType0         0
SofteningClassOfPartType1         0
SofteningClassOfPartType2         0
SofteningClassOfPartType3         0
SofteningClassOfPartType4         0
SofteningClassOfPartType5         0
ArtBulkViscConst                  1
MinEgySpec                        0
InitGasTemp                       1000
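A couple of the quoted parameters can be cross-checked immediately. A minimal sketch in Python (not Gadget code; values copied from the listing above):

    # Cosmology and time span from param.txt
    Omega0, OmegaLambda = 0.2814, 0.7186
    TimeBegin, TimeMax = 0.0909091, 1.0

    # Omega0 already includes the baryons, so a flat universe requires
    # Omega0 + OmegaLambda = 1.
    assert abs(Omega0 + OmegaLambda - 1.0) < 1e-9

    # With ComovingIntegrationOn=1, "Time" is the expansion factor a,
    # so the run spans from z = 1/a - 1 (~10, since a = 1/11) down to z = 0.
    print(1.0 / TimeBegin - 1.0, 1.0 / TimeMax - 1.0)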
MALLOC: Allocation of shared memory took 15.4055 sec

BEGRUN: Hubble (internal units)  = 0.1
BEGRUN: h                        = 0.697
BEGRUN: G (internal units)       = 43018.7
BEGRUN: UnitMass_in_g            = 1.989e+43
BEGRUN: UnitLength_in_cm         = 3.08568e+21
BEGRUN: UnitTime_in_s            = 3.08568e+16
BEGRUN: UnitVelocity_in_cm_per_s = 100000
BEGRUN: UnitDensity_in_cgs       = 6.76991e-22
BEGRUN: UnitEnergy_in_cgs        = 1.989e+53
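The derived units above follow entirely from the three base units set in param.txt. A sketch that reproduces them (using a generic cgs value of G, which may differ slightly from the constant Gadget compiles in):

    # Re-derive the BEGRUN unit conversions from the three base units.
    UnitLength_in_cm = 3.08568e21            # 1 kpc
    UnitMass_in_g = 1.989e43                 # 1e10 solar masses
    UnitVelocity_in_cm_per_s = 1e5           # 1 km/s

    UnitTime_in_s = UnitLength_in_cm / UnitVelocity_in_cm_per_s
    UnitDensity_in_cgs = UnitMass_in_g / UnitLength_in_cm**3
    UnitEnergy_in_cgs = UnitMass_in_g * UnitVelocity_in_cm_per_s**2

    G_cgs = 6.674e-8                         # cm^3 g^-1 s^-2
    G_internal = G_cgs * UnitMass_in_g * UnitTime_in_s**2 / UnitLength_in_cm**3

    print(UnitTime_in_s)        # 3.08568e+16, as printed
    print(UnitDensity_in_cgs)   # ~6.77e-22, as printed
    print(UnitEnergy_in_cgs)    # 1.989e+53, as printed
    print(G_internal)           # ~4.302e+04, i.e. the printed 43018.7
                                # up to the precise value of G adopted
    # Hubble = 0.1 is simply H0 = 100 h km/s/Mpc expressed in km/s/kpc.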
READIC: filenr=0, 'output/snap_086.hdf5' contains:
READIC: Type 0 (gas):   64130279  (tot= 64130279)  masstab= 0
READIC: Type 1:         68158041  (tot= 68158041)  masstab= 0
READIC: Type 2:                0  (tot=        0)  masstab= 0
READIC: Type 3:                0  (tot=        0)  masstab= 0
READIC: Type 4:          1540436  (tot=  1540436)  masstab= 0
READIC: Type 5:                1  (tot=        1)  masstab= 0
READIC: Reading file `output/snap_086.hdf5' on task=0 and distribute it to 0 to 431.
READIC: reading block 0 (Coordinates)...
READIC: reading block 1 (Velocities)...
READIC: reading block 2 (ParticleIDs)...
READIC: reading block 3 (Masses)...
READIC: reading block 4 (InternalEnergy)...
READIC: reading block 5 (Density)...
Dataset Density not present for particle type 0, using zero.
READIC: reading block 6 (SmoothingLength)...
Dataset SmoothingLength not present for particle type 0, using zero.
READIC: reading done. Took 7.8447 sec, total size 6370.83 MB, corresponds to effective I/O rate of 812.12 MB/sec
READIC: Total number of particles : 133828757

INIT: Testing ID uniqueness...
INIT: success. took=0.0704687 sec

DOMAIN: Begin domain decomposition (sync-point 0).
DOMAIN: Sum=4 TotalCost=4 NumTimeBinsToBeBalanced=1 MultipleDomains=4
DOMAIN: Increasing TopNodeAllocFactor=0.08 new value=0.104
DOMAIN: Increasing TopNodeAllocFactor=0.104 new value=0.1352
DOMAIN: Increasing TopNodeAllocFactor=0.1352 new value=0.17576
DOMAIN: Increasing TopNodeAllocFactor=0.17576 new value=0.228488
DOMAIN: Increasing TopNodeAllocFactor=0.228488 new value=0.297034
DOMAIN: Increasing TopNodeAllocFactor=0.297034 new value=0.386145
DOMAIN: Increasing TopNodeAllocFactor=0.386145 new value=0.501988
DOMAIN: Increasing TopNodeAllocFactor=0.501988 new value=0.652585
DOMAIN: Increasing TopNodeAllocFactor=0.652585 new value=0.84836
DOMAIN: Increasing TopNodeAllocFactor=0.84836 new value=1.10287
DOMAIN: Increasing TopNodeAllocFactor=1.10287 new value=1.43373
DOMAIN: Increasing TopNodeAllocFactor=1.43373 new value=1.86385
DOMAIN: Increasing TopNodeAllocFactor=1.86385 new value=2.423
DOMAIN: Increasing TopNodeAllocFactor=2.423 new value=3.1499
DOMAIN: Increasing TopNodeAllocFactor=3.1499 new value=4.09487
DOMAIN: Increasing TopNodeAllocFactor=4.09487 new value=5.32333
DOMAIN: NTopleaves=18306, determination of top-level tree involved 6 iterations and took 0.650465 sec
DOMAIN: we are going to try at most 503 different settings for combining the domains on tasks=432, nnodes=16
DOMAIN: total_cost=4 total_load=2
DOMAIN: best solution found after 1 iterations by task=196 for nextra=25, reaching maximum imbalance of 1.12958|1.1311
DOMAIN: combining multiple-domains took 0.0127207 sec
DOMAIN: exchange of 69698478 particles
DOMAIN: particle exchange done. (took 0.316543 sec)
DOMAIN: domain decomposition done. (took in total 0.990447 sec)

PEANO: Begin Peano-Hilbert order...
PEANO: done, took 0.0720377 sec.

INIT: Converting u -> entropy  All.cf_a3inv=729
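The cf_a3inv value pins down the expansion factor at this sync-point, and the number is easy to check. A sketch (the entropy convention A = (gamma-1) u / rho_phys^(gamma-1) is the standard Gadget entropic-function form, stated here as background rather than read off the log):

    # All.cf_a3inv is 1/a^3, the comoving-to-physical density conversion.
    cf_a3inv = 729.0
    a = cf_a3inv ** (-1.0 / 3.0)
    print(a, 1.0 / a - 1.0)   # a = 1/9 = 0.1111..., i.e. z = 8, consistent
                              # with "writing snapshot file #86 @ time 0.111111"

    gamma = 5.0 / 3.0
    def u_to_entropy(u, rho_comoving):
        # convert comoving density to physical, then u to the entropy variable
        rho_phys = rho_comoving * cf_a3inv
        return (gamma - 1.0) * u / rho_phys ** (gamma - 1.0)
    print(u_to_entropy(1000.0, 0.5))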
FOF: Begin to compute FoF group catalogue... (presently allocated=172.527 MB)
FOF: Comoving linking length: 11.3647
FOFTREE: Ngb-tree construction done. took 0.161189 sec <numnodes>=54159.9 NTopnodes=20921 NTopleaves=18306
FOF: Start linking particles (presently allocated=189.416 MB)
FOF: linking of small cells took 0.00183059 sec
FOF: local links done (took 0.229191 sec, avg-work=0.19936, imbalance=1.14121).
FOF: Marked=176184 out of the 68158041 primaries which are linked
FOF: begin linking across processors (presently allocated=191.147 MB)
FOF: have done 5722 cross links (processed 176184, took 0.00846926 sec)
FOF: have done 2167 cross links (processed  82065, took 0.00539543 sec)
FOF: have done  480 cross links (processed  15318, took 0.00491691 sec)
FOF: have done  103 cross links (processed   3677, took 0.00476298 sec)
FOF: have done   13 cross links (processed    620, took 0.00462231 sec)
FOF: have done    4 cross links (processed    113, took 0.0045977 sec)
FOF: have done    0 cross links (processed     25, took 0.00460884 sec)
FOF: Local groups found.
FOF: primary group finding took = 0.27869 sec
FOF: Start finding nearest dm-particle (presently allocated=189.416 MB)
FOF: fof-nearest iteration 1: need to repeat for 62191058 particles. (took = 0.123336 sec)
FOF: fof-nearest iteration 2: need to repeat for 48853982 particles. (took = 0.147595 sec)
FOF: fof-nearest iteration 3: need to repeat for 24537858 particles. (took = 0.156258 sec)
FOF: fof-nearest iteration 4: need to repeat for 5822055 particles. (took = 0.100167 sec)
FOF: fof-nearest iteration 5: need to repeat for 578231 particles. (took = 0.0318054 sec)
FOF: fof-nearest iteration 6: need to repeat for 10968 particles. (took = 0.0065263 sec)
FOF: done finding nearest dm-particle
FOF: attaching gas and star particles to nearest dm particles took = 0.574034 sec
FOF: compiling local group data and catalogue took = 0.0256362 sec
FOF: Total number of FOF groups with at least 32 particles: 117018
FOF: Largest FOF group has 1174 particles.
FOF: Total number of particles in FOF groups: 133828757
FOF: group properties are now allocated.. (presently allocated=177.495 MB)
FOF: computation of group properties took = 0.00733598 sec
FOF: start assigning group numbers
FOF: Assigning of group numbers took = 0.0749606 sec
FOF: Finished computing FoF groups. Complete work took 1.16873 sec (presently allocated=172.625 MB)
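Friends-of-friends links particles into one group whenever they sit within a linking length of each other; the shrinking "cross links" counts above are exactly the iterative merging of group fragments that straddle processor boundaries. The core idea is a union-find over close pairs. A toy sketch (brute-force O(N^2), no periodic wrapping, no tree walk; all names are mine, this is not Gadget's implementation):

    import numpy as np

    def fof_groups(pos, linking_length):
        n = len(pos)
        parent = list(range(n))

        def find(i):
            # path-halving find
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i in range(n):
            for j in range(i + 1, n):
                if np.linalg.norm(pos[i] - pos[j]) < linking_length:
                    parent[find(i)] = find(j)   # union the two fragments

        groups = {}
        for i in range(n):
            groups.setdefault(find(i), []).append(i)
        return list(groups.values())

    rng = np.random.default_rng(1)
    pos = rng.uniform(0.0, 100.0, size=(300, 3))
    print(len(fof_groups(pos, linking_length=5.0)))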
SUBFIND: We now execute a parallel version of SUBFIND.
TREE: Full tree construction for all particles. (presently allocated=176.732 MB)
GRAVTREE: Tree construction done. took 0.662758 sec <numnodes>=80784 NTopnodes=20921 NTopleaves=18306 tree-build-scalability=0.741625
SUBFIND: finding total densities around all particles
SUBFIND: ngb iteration  1: need to repeat for 133595022 particles. (took 2.66853 sec)
SUBFIND: ngb iteration  2: need to repeat for 133067599 particles. (took 2.55704 sec)
SUBFIND: ngb iteration  3: need to repeat for 131858357 particles. (took 2.42547 sec)
SUBFIND: ngb iteration  4: need to repeat for 129391323 particles. (took 2.28108 sec)
SUBFIND: ngb iteration  5: need to repeat for 125289399 particles. (took 2.15857 sec)
SUBFIND: ngb iteration  6: need to repeat for 119666712 particles. (took 2.03625 sec)
SUBFIND: ngb iteration  7: need to repeat for 112940838 particles. (took 1.9176 sec)
SUBFIND: ngb iteration  8: need to repeat for 105187962 particles. (took 1.83247 sec)
SUBFIND: ngb iteration  9: need to repeat for 96435637 particles. (took 1.76308 sec)
SUBFIND: ngb iteration 10: need to repeat for 86808628 particles. (took 1.67346 sec)
SUBFIND: ngb iteration 11: need to repeat for 75926437 particles. (took 1.57216 sec)
SUBFIND: ngb iteration 12: need to repeat for 64103110 particles. (took 1.43418 sec)
SUBFIND: ngb iteration 13: need to repeat for 51869486 particles. (took 1.27108 sec)
SUBFIND: ngb iteration 14: need to repeat for 40058765 particles. (took 1.08167 sec)
SUBFIND: ngb iteration 15: need to repeat for 29442692 particles. (took 0.874151 sec)
SUBFIND: ngb iteration 16: need to repeat for 20313557 particles. (took 0.662103 sec)
SUBFIND: ngb iteration 17: need to repeat for 13011673 particles. (took 0.472288 sec)
SUBFIND: ngb iteration 18: need to repeat for 7856346 particles. (took 0.315903 sec)
SUBFIND: ngb iteration 19: need to repeat for 4609123 particles. (took 0.20775 sec)
SUBFIND: ngb iteration 20: need to repeat for 2707391 particles. (took 0.131857 sec)
SUBFIND: ngb iteration 21: need to repeat for 1640347 particles. (took 0.089306 sec)
SUBFIND: ngb iteration 22: need to repeat for 1008031 particles. (took 0.0609091 sec)
SUBFIND: ngb iteration 23: need to repeat for 593901 particles. (took 0.0400968 sec)
SUBFIND: ngb iteration 24: need to repeat for 326364 particles. (took 0.0269066 sec)
SUBFIND: ngb iteration 25: need to repeat for 160732 particles. (took 0.0190829 sec)
SUBFIND: ngb iteration 26: need to repeat for 66628 particles. (took 0.0123942 sec)
SUBFIND: ngb iteration 27: need to repeat for 22971 particles. (took 0.00718713 sec)
SUBFIND: ngb iteration 28: need to repeat for 6166 particles. (took 0.00336028 sec)
SUBFIND: ngb iteration 29: need to repeat for 1156 particles. (took 0.00184684 sec)
SUBFIND: ngb iteration 30: need to repeat for 149 particles. (took 0.000932939 sec)
SUBFIND: ngb iteration 31: need to repeat for 12 particles. (took 0.000387046 sec)
SUBFIND: ngb iteration 32: need to repeat for 1 particles. (took 0.000148196 sec)
SUBFIND: iteration to correct primary neighbor count and density estimate took 29.6075 sec
SUBFIND: Number of FOF halos treated with collective SubFind algorithm = 0
SUBFIND: Number of processors used in different partitions for the collective SubFind code = 0
SUBFIND: (The adopted size-limit for the collective algorithm was 133828757 particles, for threshold size factor 0.6)
SUBFIND: The other 117018 FOF halos are treated in parallel with serial code
SUBFIND: subfind_distribute_groups() took 0.00169567 sec
SUBFIND: particle balance=1.12818
SUBFIND: subfind_exchange() took 0.468302 sec
SUBFIND: particle balance for processing=1.00054
SUBFIND: subdomain decomposition took 0.104342 sec
SUBFIND: serial subfind subdomain decomposition took 0.102939 sec
SUBFIND: subfind_hbt_single_group() processing for Ngroups=271 took 15.2255 sec
SUBFIND: root-task=0: Serial processing of halo 0 took 15.482
SUBFIND: Processing overall took (total time=15.8771 sec)
SUBFIND: 0 out of 166655 decisions, fraction 0, were influenced by previous subhalo size

ALLOCATE: Changing to MaxPart = 516318
ALLOCATE: Changing to MaxPartSph = 247416

SUBFIND: subfind_exchange() (for return to original CPU) took 0.526436 sec
TREE: Full tree construction for all particles. (presently allocated=159.063 MB)
GRAVTREE: Tree construction done. took 1.49373 sec <numnodes>=80784 NTopnodes=20921 NTopleaves=18306 tree-build-scalability=0.741625
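The shrinking "need to repeat" counts in the ngb iterations reflect a per-particle search-radius adjustment: each radius is grown or shrunk until it encloses roughly DesNumNgb=64 neighbours (within MaxNumNgbDeviation=3), and only the particles still off target are retried. An illustrative sketch with a KD-tree (not Gadget's algorithm or data structures; names and the crude update rule are mine):

    import numpy as np
    from scipy.spatial import cKDTree

    def adjust_radii(pos, target=64, tol=3, h0=1.0, max_iter=32):
        tree = cKDTree(pos)
        h = np.full(len(pos), h0)
        active = np.arange(len(pos))
        for it in range(max_iter):
            counts = np.array([len(tree.query_ball_point(pos[i], h[i]))
                               for i in active])
            bad = np.abs(counts - target) > tol
            # crude multiplicative update for particles still off target:
            # neighbour count scales roughly with the enclosed volume h^3
            h[active[bad]] *= (target / np.maximum(counts[bad], 1)) ** (1 / 3)
            active = active[bad]
            print(f"iteration {it + 1}: need to repeat for {len(active)} particles")
            if len(active) == 0:
                break
        return h

    rng = np.random.default_rng(0)
    adjust_radii(rng.uniform(0.0, 10.0, size=(2000, 3)), h0=0.5)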
SUBFIND: SO iteration  1: need to repeat for 117018 halo centers. (took 0.0109537 sec)
SUBFIND: SO iteration  2: need to repeat for 117018 halo centers. (took 0.0103134 sec)
SUBFIND: SO iteration  3: need to repeat for 117018 halo centers. (took 0.00989716 sec)
SUBFIND: SO iteration  4: need to repeat for 117018 halo centers. (took 0.00953674 sec)
SUBFIND: SO iteration  5: need to repeat for 117018 halo centers. (took 0.00933746 sec)
SUBFIND: SO iteration  6: need to repeat for 117018 halo centers. (took 0.00929437 sec)
SUBFIND: SO iteration  7: need to repeat for 117018 halo centers. (took 0.00924491 sec)
SUBFIND: SO iteration  8: need to repeat for 117018 halo centers. (took 0.00928667 sec)
SUBFIND: SO iteration  9: need to repeat for 117018 halo centers. (took 0.00926791 sec)
SUBFIND: SO iteration 10: need to repeat for 117018 halo centers. (took 0.00921445 sec)
SUBFIND: SO iteration 11: need to repeat for 117018 halo centers. (took 0.00921459 sec)
SUBFIND: SO iteration 12: need to repeat for 117018 halo centers. (took 0.00923201 sec)
SUBFIND: SO iteration 13: need to repeat for 117018 halo centers. (took 0.00924559 sec)
SUBFIND: SO iteration 14: need to repeat for 117018 halo centers. (took 0.00923215 sec)
SUBFIND: SO iteration 15: need to repeat for 117018 halo centers. (took 0.00923679 sec)
SUBFIND: SO iteration 16: need to repeat for 1380 halo centers. (took 0.00917938 sec)
SUBFIND: SO iteration 17: need to repeat for 1 halo centers. (took 0.000396833 sec)
SUBFIND: SO iteration  1: need to repeat for 117018 halo centers. (took 0.00916271 sec)
SUBFIND: SO iteration  2: need to repeat for 117018 halo centers. (took 0.00877118 sec)
SUBFIND: SO iteration  3: need to repeat for 117018 halo centers. (took 0.00893205 sec)
SUBFIND: SO iteration  4: need to repeat for 117018 halo centers. (took 0.0089971 sec)
SUBFIND: SO iteration  5: need to repeat for 117018 halo centers. (took 0.00906754 sec)
SUBFIND: SO iteration  6: need to repeat for 117018 halo centers. (took 0.00907866 sec)
SUBFIND: SO iteration  7: need to repeat for 117018 halo centers. (took 0.00911629 sec)
SUBFIND: SO iteration  8: need to repeat for 117018 halo centers. (took 0.00911361 sec)
SUBFIND: SO iteration  9: need to repeat for 117018 halo centers. (took 0.009064 sec)
SUBFIND: SO iteration 10: need to repeat for 117018 halo centers. (took 0.00907252 sec)
SUBFIND: SO iteration 11: need to repeat for 117018 halo centers. (took 0.00911886 sec)
SUBFIND: SO iteration 12: need to repeat for 117018 halo centers. (took 0.00907128 sec)
SUBFIND: SO iteration 13: need to repeat for 117018 halo centers. (took 0.00911571 sec)
SUBFIND: SO iteration 14: need to repeat for 117018 halo centers. (took 0.00910813 sec)
SUBFIND: SO iteration 15: need to repeat for 117018 halo centers. (took 0.00909661 sec)
SUBFIND: SO iteration 16: need to repeat for 633 halo centers. (took 0.00910881 sec)
SUBFIND: SO iteration  1: need to repeat for 117018 halo centers. (took 0.00916684 sec)
SUBFIND: SO iteration  2: need to repeat for 117018 halo centers. (took 0.00981477 sec)
SUBFIND: SO iteration  3: need to repeat for 117018 halo centers. (took 0.0096956 sec)
SUBFIND: SO iteration  4: need to repeat for 117018 halo centers. (took 0.00938869 sec)
SUBFIND: SO iteration  5: need to repeat for 117018 halo centers. (took 0.00924874 sec)
SUBFIND: SO iteration  6: need to repeat for 117018 halo centers. (took 0.00924934 sec)
SUBFIND: SO iteration  7: need to repeat for 117018 halo centers. (took 0.00923643 sec)
SUBFIND: SO iteration  8: need to repeat for 117018 halo centers. (took 0.00920347 sec)
SUBFIND: SO iteration  9: need to repeat for 117018 halo centers. (took 0.00920658 sec)
SUBFIND: SO iteration 10: need to repeat for 117018 halo centers. (took 0.00922497 sec)
SUBFIND: SO iteration 11: need to repeat for 117018 halo centers. (took 0.00919971 sec)
SUBFIND: SO iteration 12: need to repeat for 117018 halo centers. (took 0.00921769 sec)
SUBFIND: SO iteration 13: need to repeat for 117018 halo centers. (took 0.00926133 sec)
SUBFIND: SO iteration 14: need to repeat for 117018 halo centers. (took 0.00919584 sec)
SUBFIND: SO iteration 15: need to repeat for 117018 halo centers. (took 0.00922898 sec)
SUBFIND: SO iteration 16: need to repeat for 1411 halo centers. (took 0.00920328 sec)
SUBFIND: SO iteration 17: need to repeat for 1 halo centers. (took 0.00042942 sec)
SUBFIND: SO iteration  1: need to repeat for 117018 halo centers. (took 0.00912228 sec)
SUBFIND: SO iteration  2: need to repeat for 117018 halo centers. (took 0.0100539 sec)
SUBFIND: SO iteration  3: need to repeat for 117018 halo centers. (took 0.00954266 sec)
SUBFIND: SO iteration  4: need to repeat for 117018 halo centers. (took 0.00985569 sec)
SUBFIND: SO iteration  5: need to repeat for 117018 halo centers. (took 0.00983474 sec)
SUBFIND: SO iteration  6: need to repeat for 117018 halo centers. (took 0.00983793 sec)
SUBFIND: SO iteration  7: need to repeat for 117018 halo centers. (took 0.00984814 sec)
SUBFIND: SO iteration  8: need to repeat for 117018 halo centers. (took 0.00982604 sec)
SUBFIND: SO iteration  9: need to repeat for 117018 halo centers. (took 0.00982 sec)
SUBFIND: SO iteration 10: need to repeat for 117018 halo centers. (took 0.00987458 sec)
SUBFIND: SO iteration 11: need to repeat for 117018 halo centers. (took 0.00982929 sec)
SUBFIND: SO iteration 12: need to repeat for 117018 halo centers. (took 0.00982999 sec)
SUBFIND: SO iteration 13: need to repeat for 117018 halo centers. (took 0.00981043 sec)
SUBFIND: SO iteration 14: need to repeat for 117018 halo centers. (took 0.00985639 sec)
SUBFIND: SO iteration 15: need to repeat for 117018 halo centers. (took 0.0098881 sec)
SUBFIND: SO iteration 16: need to repeat for 67005 halo centers. (took 0.00996483 sec)
SUBFIND: SO iteration 17: need to repeat for 361 halo centers. (took 0.0065505 sec)
SUBFIND: SO iteration 18: need to repeat for 8 halo centers. (took 0.000227591 sec)
SUBFIND: determining spherical overdensity masses took 0.611696 sec
SUBFIND: assembled and ordered groups and subhalos (took 0.0629924 sec)
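The four SO iteration blocks correspond to the four overdensity definitions written to the catalogue below (Mean200, Crit200, Crit500, TopHat200): for each halo centre the radius is iterated until the mean enclosed density crosses the chosen threshold times the reference density. A sketch of that enclosed-density criterion on sorted particle radii (illustrative only; the function name and the toy halo are mine):

    import numpy as np

    def so_radius(r_sorted, m_part, delta, rho_ref):
        """r_sorted: particle radii from the halo centre, ascending."""
        m_enc = m_part * np.arange(1, len(r_sorted) + 1)
        rho_enc = m_enc / (4.0 / 3.0 * np.pi * r_sorted**3)
        inside = np.nonzero(rho_enc >= delta * rho_ref)[0]
        if len(inside) == 0:
            return 0.0, 0.0
        k = inside[-1]                  # outermost radius still above threshold
        return r_sorted[k], m_enc[k]    # (R_Delta, M_Delta)

    rng = np.random.default_rng(2)
    r = np.sort(rng.power(0.5, 5000) * 300.0)   # centrally concentrated toy halo
    R200, M200 = so_radius(r, m_part=1.0, delta=200.0, rho_ref=1e-5)
    print(R200, M200)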
FOF/SUBFIND: writing group catalogue file: 'output/fof_subhalo_tab_086' (file 1 of 1)
FOF/SUBFIND: writing group catalogue block 0 (GroupLen)...
FOF/SUBFIND: writing group catalogue block 1 (GroupMass)...
FOF/SUBFIND: writing group catalogue block 2 (GroupPos)...
FOF/SUBFIND: writing group catalogue block 3 (GroupVel)...
FOF/SUBFIND: writing group catalogue block 4 (GroupLenType)...
FOF/SUBFIND: writing group catalogue block 5 (GroupOffsetType)...
FOF/SUBFIND: writing group catalogue block 6 (GroupMassType)...
FOF/SUBFIND: writing group catalogue block 7 (GroupAscale)...
FOF/SUBFIND: writing group catalogue block 8 (Group_M_Mean200)...
FOF/SUBFIND: writing group catalogue block 9 (Group_R_Mean200)...
FOF/SUBFIND: writing group catalogue block 10 (Group_M_Crit200)...
FOF/SUBFIND: writing group catalogue block 11 (Group_R_Crit200)...
FOF/SUBFIND: writing group catalogue block 12 (Group_M_Crit500)...
FOF/SUBFIND: writing group catalogue block 13 (Group_R_Crit500)...
FOF/SUBFIND: writing group catalogue block 14 (Group_M_TopHat200)...
FOF/SUBFIND: writing group catalogue block 15 (Group_R_TopHat200)...
FOF/SUBFIND: writing group catalogue block 16 (GroupNsubs)...
FOF/SUBFIND: writing group catalogue block 17 (GroupFirstSub)...
FOF/SUBFIND: writing group catalogue block 18 (SubhaloGroupNr)...
FOF/SUBFIND: writing group catalogue block 19 (SubhaloRankInGr)...
FOF/SUBFIND: writing group catalogue block 20 (SubhaloLen)...
FOF/SUBFIND: writing group catalogue block 21 (SubhaloMass)...
FOF/SUBFIND: writing group catalogue block 22 (SubhaloPos)...
FOF/SUBFIND: writing group catalogue block 23 (SubhaloVel)...
FOF/SUBFIND: writing group catalogue block 24 (SubhaloLenType)...
FOF/SUBFIND: writing group catalogue block 25 (SubhaloOffsetType)...
FOF/SUBFIND: writing group catalogue block 26 (SubhaloMassType)...
FOF/SUBFIND: writing group catalogue block 27 (SubhaloCM)...
FOF/SUBFIND: writing group catalogue block 28 (SubhaloSpin)...
FOF/SUBFIND: writing group catalogue block 29 (SubhaloVelDisp)...
FOF/SUBFIND: writing group catalogue block 30 (SubhaloVmax)...
FOF/SUBFIND: writing group catalogue block 31 (SubhaloVmaxRad)...
FOF/SUBFIND: writing group catalogue block 32 (SubhaloHalfmassRad)...
FOF/SUBFIND: writing group catalogue block 33 (SubhaloHalfmassRadType)...
FOF/SUBFIND: writing group catalogue block 34 (SubhaloIDMostbound)...
FOF/SUBFIND: writing group catalogue block 35 (SubhaloParentRank)...
FOF/SUBFIND: Group catalogues saved. took = 16.86 sec, total size 77.9039 MB, corresponds to effective I/O rate of 4.62065 MB/sec
SUBFIND: Subgroup catalogues saved. took = 16.8613 sec
SUBFIND: Finished with SUBFIND. (total time=66.2475 sec)
SUBFIND: Total number of subhalos with at least 20 particles: 282838
SUBFIND: Largest subhalo has 1174 particles/cells.
SUBFIND: Total number of particles/cells in subhalos: 133828757

FOF: preparing output order of particles took 1.35517 sec

SNAPSHOT: writing snapshot file #86 @ time 0.111111 ...
SNAPSHOT: writing snapshot file: 'output/snap_086' (file 1 of 1)
SNAPSHOT: writing snapshot rename 'output/snap_086.hdf5' to 'output/bak-snap_086.hdf5'
SNAPSHOT: writing snapshot block 0 (Coordinates)...
SNAPSHOT: writing snapshot block 1 (Velocities)...
SNAPSHOT: writing snapshot block 2 (ParticleIDs)...
SNAPSHOT: writing snapshot block 3 (Masses)...
SNAPSHOT: writing snapshot block 4 (InternalEnergy)...

Abort(806969615) on node 226 (rank 226 in comm 0): Fatal error in PMPI_Finalize: Other MPI error, error stack:
PMPI_Finalize(214)...............: MPI_Finalize failed
PMPI_Finalize(159)...............:
MPID_Finalize(1288)..............:
MPIDI_OFI_mpi_finalize_hook(1892): OFI domain close failed (ofi_init.c:1892:MPIDI_OFI_mpi_finalize_hook:Device or resource busy)

[identical Abort(806969615)/PMPI_Finalize error stack from rank 196 elided]

SNAPSHOT: writing snapshot block 5 (Density)...
[identical Abort(806969615)/PMPI_Finalize error stacks from ranks 45, 183, 411, 75, 363, 125, 390, 258, 324, and 297 elided]
[identical Abort(806969615)/PMPI_Finalize error stacks from ranks 447, 95, and 161 elided]

SNAPSHOT: writing snapshot block 6 (SmoothingLength)...

[identical Abort(806969615)/PMPI_Finalize error stack from rank 213 elided]

SNAPSHOT: done with writing snapshot. Took 25.8678 sec, total size 5328.56 MB, corresponds to effective I/O rate of 205.992 MB/sec

endrun called, calling MPI_Finalize()
bye!
[identical Abort(806969615)/PMPI_Finalize error stacks from ranks 74, 268, 305, 421, 24, 430, 19, 442, and 21 elided]

TACC: MPI job exited with code: 15
TACC: Shutdown complete. Exiting.
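Note that both outputs completed before the MPI_Finalize failures: the log records "Group catalogues saved" and "done with writing snapshot", and the OFI errors only appear during communicator teardown. The written catalogue can be sanity-checked against the counts quoted above; a sketch with h5py (the dataset names are exactly the blocks listed in the log, while the "Group"/"Subhalo" HDF5 group layout is the usual Gadget-4 convention and is an assumption here, not something shown in the log):

    import h5py

    with h5py.File("output/fof_subhalo_tab_086.hdf5", "r") as f:
        gmass = f["Group/GroupMass"]       # block 1 above
        smass = f["Subhalo/SubhaloMass"]   # block 21 above
        print(len(gmass))    # expect 117018 FOF groups
        print(len(smass))    # expect 282838 subhalos
        print(gmass[:5])     # in the internal mass unit (1e10 Msun, see BEGRUN)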