TACC: Starting up job 4244607
TACC: Starting parallel tasks...
Shared memory islands host a minimum of 28 and a maximum of 28 MPI ranks.
We shall use 32 MPI ranks in total for assisting one-sided communication (1 per shared memory node).

  [Gadget-4 ASCII art banner]

This is Gadget, version 4.0.
Git commit 30019281bed6dcfb8d018c09095aeaa1e6ff5042, Wed Aug 11 12:22:28 2021 +0200

Code was compiled with the following compiler and flags:
mpicxx -std=c++11 -Wwrite-strings -Wredundant-decls -Woverloaded-virtual -Wcast-qual -Wcast-align -Wpointer-arith -Wmissing-declarations -g -Wall -W -O3 -march=native -I/opt/apps/gcc9_1/hdf5/1.10.4/x86_64/include -I/opt/apps/gcc9_1/gsl/2.6/include -I/opt/apps/gcc9_1/impi19_0/fftw3/3.3.8/include -Ibuild -Isrc

Code was compiled with the following settings:
    DOUBLEPRECISION=1
    FOF
    FOF_GROUP_MIN_LEN=32
    FOF_LINKLENGTH=0.2
    FOF_PRIMARY_LINK_TYPES=2
    FOF_SECONDARY_LINK_TYPES=1+16+32
    GADGET2_HEADER
    IDS_64BIT
    NTYPES=6
    PERIODIC
    PMGRID=128
    SELFGRAVITY
    SUBFIND

Running on 864 MPI tasks.
BEGRUN: Size of particle structure 136 [bytes]
BEGRUN: Size of sph particle structure 192 [bytes]
BEGRUN: Size of gravity tree node 104 [bytes]
BEGRUN: Size of neighbour tree node 192 [bytes]
BEGRUN: Size of subfind auxiliary data 104 [bytes]
-------------------------------------------------------------------------------------------------------------------------
AvailMem:       Largest = 186476.28 Mb (on task= 297), Smallest = 186398.93 Mb (on task= 432), Average = 186427.07 Mb
Total Mem:      Largest = 191515.34 Mb (on task=  27), Smallest = 191515.20 Mb (on task=   0), Average = 191515.33 Mb
Committed_AS:   Largest =   5116.40 Mb (on task= 432), Smallest =   5039.05 Mb (on task= 297), Average =   5088.25 Mb
SwapTotal:      Largest =      0.00 Mb (on task=   0), Smallest =      0.00 Mb (on task=   0), Average =      0.00 Mb
SwapFree:       Largest =      0.00 Mb (on task=   0), Smallest =      0.00 Mb (on task=   0), Average =      0.00 Mb
AllocMem:       Largest =   5116.40 Mb (on task= 432), Smallest =   5039.05 Mb (on task= 297), Average =   5088.25 Mb
avail /dev/shm: Largest =  95321.44 Mb (on task=  27), Smallest =  95321.34 Mb (on task=  22), Average =  95321.42 Mb
-------------------------------------------------------------------------------------------------------------------------
Task=0 has the maximum commited memory and is host: c106-162.frontera.tacc.utexas.edu
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Obtaining parameters from file 'param.txt':
InitCondFile  lcdm_gas_littleendian.dat
OutputDir  output
SnapshotFileBase  snap
OutputListFilename  outputs_lcdm_gas.txt
ICFormat  3
SnapFormat  3
TimeLimitCPU  180000
CpuTimeBetRestartFile  7200
MaxMemSize  3000
TimeBegin  0.0909091
TimeMax  1
ComovingIntegrationOn  1
Omega0  0.2814
OmegaLambda  0.7186
OmegaBaryon  0.0464
HubbleParam  0.697
Hubble  0.1
BoxSize  400000
OutputListOn  0
TimeBetSnapshot  1.83842
TimeOfFirstSnapshot  0.047619
TimeBetStatistics  0.05
NumFilesPerSnapshot  1
MaxFilesWithConcurrentIO  1
ErrTolIntAccuracy  0.012
CourantFac  0.15
MaxSizeTimestep  0.025
MinSizeTimestep  0
TypeOfOpeningCriterion  1
ErrTolTheta  0.9
ErrTolThetaMax  1
ErrTolForceAcc  0.0025
TopNodeFactor  2.5
ActivePartFracForNewDomainDecomp  0.01
DesNumNgb  64
MaxNumNgbDeviation  3
UnitLength_in_cm  3.08568e+21
UnitMass_in_g  1.989e+43
UnitVelocity_in_cm_per_s  100000
GravityConstantInternal  0
DesLinkNgb  20
SofteningComovingClass0  1.5
SofteningComovingClass1  1.5
SofteningComovingClass2  1.5
SofteningComovingClass3  1.5
SofteningComovingClass4  1.5
SofteningComovingClass5  1.5
SofteningMaxPhysClass0  1.5
SofteningMaxPhysClass1  1.5
SofteningMaxPhysClass2  1.5
SofteningMaxPhysClass3  1.5
SofteningMaxPhysClass4  1.5
SofteningMaxPhysClass5  1.5
SofteningClassOfPartType0  0
SofteningClassOfPartType1  0
SofteningClassOfPartType2  0
SofteningClassOfPartType3  0
SofteningClassOfPartType4  0
SofteningClassOfPartType5  0
ArtBulkViscConst  1
MinEgySpec  0
InitGasTemp  1000
MALLOC: Allocation of shared memory took 15.7881 sec
BEGRUN: Hubble (internal units) = 0.1
BEGRUN: h = 0.697
BEGRUN: G (internal units) = 43018.7
BEGRUN: UnitMass_in_g = 1.989e+43
BEGRUN: UnitLenth_in_cm = 3.08568e+21
BEGRUN: UnitTime_in_s = 3.08568e+16
BEGRUN: UnitVelocity_in_cm_per_s = 100000
BEGRUN: UnitDensity_in_cgs = 6.76991e-22
BEGRUN: UnitEnergy_in_cgs = 1.989e+53
READIC: filenr=0, 'output/snap_086.hdf5' contains:
READIC: Type 0 (gas): 62181444 (tot= 62181444) masstab= 0
READIC: Type 1: 71646920 (tot= 71646920) masstab= 0
READIC: Type 2: 0 (tot= 0) masstab= 0
READIC: Type 3: 0 (tot= 0) masstab= 0
READIC: Type 4: 607 (tot= 607) masstab= 0
READIC: Type 5: 0 (tot= 0) masstab= 0
READIC: Reading file `output/snap_086.hdf5' on task=0 and distribute it to 0 to 863.
READIC: reading block 0 (Coordinates)...
READIC: reading block 1 (Velocities)...
READIC: reading block 2 (ParticleIDs)...
READIC: reading block 3 (Masses)...
READIC: reading block 4 (InternalEnergy)...
READIC: reading block 5 (Density)...
Dataset Density not present for particle type 0, using zero.
READIC: reading block 6 (SmoothingLength)...
Dataset SmoothingLength not present for particle type 0, using zero.
READIC: reading done. Took 4.2206 sec, total size 6363.41 MB, corresponds to effective I/O rate of 1507.7 MB/sec
READIC: Total number of particles : 133828971
INIT: Testing ID uniqueness...
INIT: success. took=0.0639115 sec
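[Aside, not part of the job output: the file read above, 'output/snap_086.hdf5', is a format-3 (HDF5) snapshot, and the per-type blocks are the ones named in the READIC lines (Coordinates, Velocities, ParticleIDs, Masses, InternalEnergy, ...). A minimal inspection sketch in Python, assuming h5py is available and assuming the usual Gadget-4 HDF5 layout with a 'Header' group and 'PartType0', 'PartType1', ... groups:]

    import h5py  # assumed post-processing dependency, not part of Gadget-4 itself

    # Inspect the snapshot file named in the READIC lines above.
    with h5py.File("output/snap_086.hdf5", "r") as f:
        header = dict(f["Header"].attrs)          # e.g. Time, BoxSize, NumPart_ThisFile
        print("scale factor a  =", header.get("Time"))
        print("per-type counts =", header.get("NumPart_ThisFile"))

        gas = f["PartType0"]                      # "Type 0 (gas)" in the log above
        print("gas blocks:", list(gas.keys()))    # should include Coordinates, Velocities, ...
        pos = gas["Coordinates"][:5]              # code length units (UnitLength_in_cm = 3.08568e+21 cm, i.e. kpc/h)
        u = gas["InternalEnergy"][:5]             # per-particle internal energy
        print(pos)
        print(u)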
DOMAIN: Begin domain decomposition (sync-point 0).
DOMAIN: Sum=4 TotalCost=4 NumTimeBinsToBeBalanced=1 MultipleDomains=4
DOMAIN: Increasing TopNodeAllocFactor=0.08 new value=0.104
DOMAIN: Increasing TopNodeAllocFactor=0.104 new value=0.1352
DOMAIN: Increasing TopNodeAllocFactor=0.1352 new value=0.17576
DOMAIN: Increasing TopNodeAllocFactor=0.17576 new value=0.228488
DOMAIN: Increasing TopNodeAllocFactor=0.228488 new value=0.297034
DOMAIN: Increasing TopNodeAllocFactor=0.297034 new value=0.386145
DOMAIN: Increasing TopNodeAllocFactor=0.386145 new value=0.501988
DOMAIN: Increasing TopNodeAllocFactor=0.501988 new value=0.652585
DOMAIN: Increasing TopNodeAllocFactor=0.652585 new value=0.84836
DOMAIN: Increasing TopNodeAllocFactor=0.84836 new value=1.10287
DOMAIN: Increasing TopNodeAllocFactor=1.10287 new value=1.43373
DOMAIN: Increasing TopNodeAllocFactor=1.43373 new value=1.86385
DOMAIN: Increasing TopNodeAllocFactor=1.86385 new value=2.423
DOMAIN: Increasing TopNodeAllocFactor=2.423 new value=3.1499
DOMAIN: Increasing TopNodeAllocFactor=3.1499 new value=4.09487
DOMAIN: Increasing TopNodeAllocFactor=4.09487 new value=5.32333
DOMAIN: NTopleaves=32467, determination of top-level tree involved 6 iterations and took 0.453219 sec
DOMAIN: we are going to try at most 503 different settings for combining the domains on tasks=864, nnodes=32
DOMAIN: total_cost=4 total_load=2
DOMAIN: best solution found after 1 iterations by task=174 for nextra=64, reaching maximum imbalance of 1.11163|1.112
DOMAIN: combining multiple-domains took 0.0239091 sec
DOMAIN: exchange of 71647527 particles
DOMAIN: particle exchange done. (took 0.273154 sec)
DOMAIN: domain decomposition done. (took in total 0.757555 sec)
PEANO: Begin Peano-Hilbert order...
PEANO: done, took 0.0312567 sec.
INIT: Converting u -> entropy   All.cf_a3inv=729
FOF: Begin to compute FoF group catalogue... (presently allocated=85.9265 MB)
FOF: Comoving linking length: 11.3647
FOFTREE: Ngb-tree construction done. took 0.0522017 sec =76541.6 NTopnodes=37105 NTopleaves=32467
FOF: Start linking particles (presently allocated=97.4436 MB)
FOF: linking of small cells took 0.000861734 sec
FOF: local links done (took 0.0716068 sec, avg-work=0.062517, imbalance=1.11896).
FOF: Marked=177218 out of the 71646920 primaries which are linked
FOF: begin linking across processors (presently allocated=98.441 MB)
FOF: have done 18296 cross links (processed 177218, took 0.00548942 sec)
FOF: have done 3785 cross links (processed 90910, took 0.00389636 sec)
FOF: have done 589 cross links (processed 13269, took 0.00293314 sec)
FOF: have done 107 cross links (processed 2008, took 0.00278222 sec)
FOF: have done 17 cross links (processed 294, took 0.00313962 sec)
FOF: have done 1 cross links (processed 46, took 0.00321911 sec)
FOF: have done 0 cross links (processed 2, took 0.00278702 sec)
FOF: Local groups found.
FOF: primary group finding took = 0.101375 sec
FOF: Start finding nearest dm-particle (presently allocated=97.4436 MB)
FOF: fof-nearest iteration 1: need to repeat for 61106306 particles. (took = 0.0461802 sec)
FOF: fof-nearest iteration 2: need to repeat for 55808189 particles. (took = 0.0488041 sec)
FOF: fof-nearest iteration 3: need to repeat for 40413227 particles. (took = 0.0524913 sec)
FOF: fof-nearest iteration 4: need to repeat for 18500262 particles. (took = 0.0457338 sec)
FOF: fof-nearest iteration 5: need to repeat for 3358271 particles. (took = 0.0264445 sec)
FOF: fof-nearest iteration 6: need to repeat for 77936 particles. (took = 0.00805781 sec)
FOF: done finding nearest dm-particle
FOF: attaching gas and star particles to nearest dm particles took = 0.231734 sec
FOF: compiling local group data and catalogue took = 0.0137114 sec
FOF: Total number of FOF groups with at least 32 particles: 2701440
FOF: Largest FOF group has 98 particles.
FOF: Total number of particles in FOF groups: 133828924
FOF: group properties are now allocated.. (presently allocated=89.3033 MB)
FOF: computation of group properties took = 0.00699127 sec
FOF: start assigning group numbers
FOF: Assigning of group numbers took = 0.378369 sec
FOF: Finished computing FoF groups. Complete work took 1.02368 sec (presently allocated=86.9807 MB)
SUBFIND: We now execute a parallel version of SUBFIND.
TREE: Full tree construction for all particles. (presently allocated=92.1148 MB)
GRAVTREE: Tree construction done. took 0.217455 sec =92892.2 NTopnodes=37105 NTopleaves=32467 tree-build-scalability=0.601021
SUBFIND: finding total densities around all particles
SUBFIND: ngb iteration 1: need to repeat for 133502208 particles. (took 0.421617 sec)
SUBFIND: ngb iteration 2: need to repeat for 132979965 particles. (took 0.367997 sec)
SUBFIND: ngb iteration 3: need to repeat for 132148019 particles. (took 0.393721 sec)
SUBFIND: ngb iteration 4: need to repeat for 130699005 particles. (took 0.347767 sec)
SUBFIND: ngb iteration 5: need to repeat for 127995768 particles. (took 0.31791 sec)
SUBFIND: ngb iteration 6: need to repeat for 122750422 particles. (took 0.317295 sec)
SUBFIND: ngb iteration 7: need to repeat for 112757688 particles. (took 0.309916 sec)
SUBFIND: ngb iteration 8: need to repeat for 96940525 particles. (took 0.277599 sec)
SUBFIND: ngb iteration 9: need to repeat for 80302707 particles. (took 0.22647 sec)
SUBFIND: ngb iteration 10: need to repeat for 69537532 particles. (took 0.170321 sec)
SUBFIND: ngb iteration 11: need to repeat for 64661470 particles. (took 0.132892 sec)
SUBFIND: ngb iteration 12: need to repeat for 62923790 particles. (took 0.115914 sec)
SUBFIND: ngb iteration 13: need to repeat for 62395372 particles. (took 0.113803 sec)
SUBFIND: ngb iteration 14: need to repeat for 62245360 particles. (took 0.110376 sec)
SUBFIND: ngb iteration 15: need to repeat for 62201047 particles. (took 0.108533 sec)
SUBFIND: ngb iteration 16: need to repeat for 62187445 particles. (took 0.101871 sec)
SUBFIND: ngb iteration 17: need to repeat for 62182892 particles. (took 0.0951289 sec)
SUBFIND: ngb iteration 18: need to repeat for 62180472 particles. (took 0.0885739 sec)
SUBFIND: ngb iteration 19: need to repeat for 62178133 particles. (took 0.0829925 sec)
SUBFIND: ngb iteration 20: need to repeat for 62174458 particles. (took 0.0798387 sec)
SUBFIND: ngb iteration 21: need to repeat for 62168538 particles. (took 0.0800547 sec)
SUBFIND: ngb iteration 22: need to repeat for 62157141 particles. (took 0.0812891 sec)
SUBFIND: ngb iteration 23: need to repeat for 62136212 particles. (took 0.087622 sec)
SUBFIND: ngb iteration 24: need to repeat for 62097191 particles. (took 0.0865958 sec)
SUBFIND: ngb iteration 25: need to repeat for 62025649 particles. (took 0.0906851 sec)
SUBFIND: ngb iteration 26: need to repeat for 61894045 particles. (took 0.0968591 sec)
SUBFIND: ngb iteration 27: need to repeat for 61657446 particles. (took 0.107519 sec)
SUBFIND: ngb iteration 28: need to repeat for 61247377 particles. (took 0.115718 sec)
SUBFIND: ngb iteration 29: need to repeat for 60565327 particles. (took 0.137256 sec)
SUBFIND: ngb iteration 30: need to repeat for 59487711 particles. (took 0.155136 sec)
SUBFIND: ngb iteration 31: need to repeat for 57859346 particles. (took 0.192281 sec)
SUBFIND: ngb iteration 32: need to repeat for 55485301 particles. (took 0.206695 sec)
SUBFIND: ngb iteration 33: need to repeat for 52116129 particles. (took 0.220655 sec)
SUBFIND: ngb iteration 34: need to repeat for 47549894 particles. (took 0.220922 sec)
SUBFIND: ngb iteration 35: need to repeat for 41650836 particles. (took 0.208357 sec)
SUBFIND: ngb iteration 36: need to repeat for 34552616 particles. (took 0.181545 sec)
SUBFIND: ngb iteration 37: need to repeat for 26695605 particles. (took 0.155375 sec)
SUBFIND: ngb iteration 38: need to repeat for 18844936 particles. (took 0.129048 sec)
SUBFIND: ngb iteration 39: need to repeat for 11884377 particles. (took 0.0995623 sec)
SUBFIND: ngb iteration 40: need to repeat for 6506808 particles. (took 0.0725766 sec)
SUBFIND: ngb iteration 41: need to repeat for 3009688 particles. (took 0.0490681 sec)
SUBFIND: ngb iteration 42: need to repeat for 1133585 particles. (took 0.0266533 sec)
SUBFIND: ngb iteration 43: need to repeat for 330904 particles. (took 0.0145178 sec)
SUBFIND: ngb iteration 44: need to repeat for 73388 particles. (took 0.00614001 sec)
SUBFIND: ngb iteration 45: need to repeat for 11118 particles. (took 0.00233944 sec)
SUBFIND: ngb iteration 46: need to repeat for 1199 particles. (took 0.000765329 sec)
SUBFIND: ngb iteration 47: need to repeat for 64 particles. (took 0.000316476 sec)
SUBFIND: iteration to correct primary neighbor count and density estimate took 7.00999 sec
SUBFIND: Number of FOF halos treated with collective SubFind algorithm = 0
SUBFIND: Number of processors used in different partitions for the collective SubFind code = 0
SUBFIND: (The adopted size-limit for the collective algorithm was 133828971 particles, for threshold size factor 0.6)
SUBFIND: The other 2701440 FOF halos are treated in parallel with serial code
SUBFIND: subfind_distribute_groups() took 0.00912996 sec
SUBFIND: particle balance=1.10997
SUBFIND: subfind_exchange() took 0.241885 sec
SUBFIND: particle balance for processing=1.00042
SUBFIND: subdomain decomposition took 0.051296 sec
SUBFIND: serial subfind subdomain decomposition took 0.0506343 sec
SUBFIND: subfind_hbt_single_group() processing for Ngroups=3127 took 33.3328 sec
SUBFIND: root-task=0: Serial processing of halo 0 took 33.4566
SUBFIND: Processing overall took (total time=34.3419 sec)
SUBFIND: 0 out of 1540 decisions, fraction 0, where influenced by previous subhalo size
ALLOCATE: Changing to MaxPart = 258160
ALLOCATE: Changing to MaxPartSph = 119950
SUBFIND: subfind_exchange() (for return to original CPU) took 0.499244 sec
TREE: Full tree construction for all particles. (presently allocated=79.234 MB)
GRAVTREE: Tree construction done. took 0.828859 sec =92892.2 NTopnodes=37105 NTopleaves=32467 tree-build-scalability=0.601021
SUBFIND: SO iteration 1: need to repeat for 2701395 halo centers. (took 0.119492 sec)
SUBFIND: SO iteration 2: need to repeat for 2701395 halo centers. (took 0.168713 sec)
SUBFIND: SO iteration 3: need to repeat for 2701395 halo centers. (took 0.147812 sec)
SUBFIND: SO iteration 4: need to repeat for 2701395 halo centers. (took 0.156589 sec)
SUBFIND: SO iteration 5: need to repeat for 2701395 halo centers. (took 0.0885332 sec)
SUBFIND: SO iteration 6: need to repeat for 2701395 halo centers. (took 0.0762344 sec)
SUBFIND: SO iteration 7: need to repeat for 2701395 halo centers. (took 0.211214 sec)
SUBFIND: SO iteration 8: need to repeat for 2701395 halo centers. (took 0.144994 sec)
SUBFIND: SO iteration 9: need to repeat for 2701395 halo centers. (took 0.139236 sec)
SUBFIND: SO iteration 10: need to repeat for 2701395 halo centers. (took 0.122531 sec)
SUBFIND: SO iteration 11: need to repeat for 2701395 halo centers. (took 0.265688 sec)
SUBFIND: SO iteration 12: need to repeat for 2701395 halo centers. (took 0.298356 sec)
SUBFIND: SO iteration 13: need to repeat for 2701395 halo centers. (took 0.14147 sec)
SUBFIND: SO iteration 14: need to repeat for 2701395 halo centers. (took 0.251652 sec)
SUBFIND: SO iteration 15: need to repeat for 2701395 halo centers. (took 0.125856 sec)
SUBFIND: SO iteration 16: need to repeat for 286211 halo centers. (took 0.100646 sec)
SUBFIND: SO iteration 17: need to repeat for 2577 halo centers. (took 0.0529489 sec)
SUBFIND: SO iteration 1: need to repeat for 2701395 halo centers. (took 0.0527214 sec)
SUBFIND: SO iteration 2: need to repeat for 2701395 halo centers. (took 0.0326588 sec)
SUBFIND: SO iteration 3: need to repeat for 2701395 halo centers. (took 0.0918346 sec)
SUBFIND: SO iteration 4: need to repeat for 2701395 halo centers. (took 0.0772595 sec)
SUBFIND: SO iteration 5: need to repeat for 2701395 halo centers. (took 0.182887 sec)
SUBFIND: SO iteration 6: need to repeat for 2701395 halo centers. (took 0.287085 sec)
SUBFIND: SO iteration 7: need to repeat for 2701395 halo centers. (took 0.294954 sec)
SUBFIND: SO iteration 8: need to repeat for 2701395 halo centers. (took 0.334419 sec)
SUBFIND: SO iteration 9: need to repeat for 2701395 halo centers. (took 0.219598 sec)
SUBFIND: SO iteration 10: need to repeat for 2701395 halo centers. (took 0.279569 sec)
SUBFIND: SO iteration 11: need to repeat for 2701395 halo centers. (took 0.145756 sec)
SUBFIND: SO iteration 12: need to repeat for 2701395 halo centers. (took 0.0314593 sec)
SUBFIND: SO iteration 13: need to repeat for 2701395 halo centers. (took 0.0536656 sec)
SUBFIND: SO iteration 14: need to repeat for 2701395 halo centers. (took 0.131609 sec)
SUBFIND: SO iteration 15: need to repeat for 2701395 halo centers. (took 0.248261 sec)
SUBFIND: SO iteration 16: need to repeat for 168574 halo centers. (took 0.182253 sec)
SUBFIND: SO iteration 17: need to repeat for 1283 halo centers. (took 0.0907862 sec)
SUBFIND: SO iteration 1: need to repeat for 2701395 halo centers. (took 0.206573 sec)
SUBFIND: SO iteration 2: need to repeat for 2701395 halo centers. (took 0.135138 sec)
SUBFIND: SO iteration 3: need to repeat for 2701395 halo centers. (took 0.287292 sec)
SUBFIND: SO iteration 4: need to repeat for 2701395 halo centers. (took 0.286481 sec)
SUBFIND: SO iteration 5: need to repeat for 2701395 halo centers. (took 0.566033 sec)
SUBFIND: SO iteration 6: need to repeat for 2701395 halo centers. (took 0.302142 sec)
SUBFIND: SO iteration 7: need to repeat for 2701395 halo centers. (took 0.154169 sec)
SUBFIND: SO iteration 8: need to repeat for 2701395 halo centers. (took 0.23372 sec)
SUBFIND: SO iteration 9: need to repeat for 2701395 halo centers. (took 0.052154 sec)
SUBFIND: SO iteration 10: need to repeat for 2701395 halo centers. (took 0.0609387 sec)
SUBFIND: SO iteration 11: need to repeat for 2701395 halo centers. (took 0.0575057 sec)
SUBFIND: SO iteration 12: need to repeat for 2701395 halo centers. (took 0.0329836 sec)
SUBFIND: SO iteration 13: need to repeat for 2701395 halo centers. (took 0.106186 sec)
SUBFIND: SO iteration 14: need to repeat for 2701395 halo centers. (took 0.160139 sec)
SUBFIND: SO iteration 15: need to repeat for 2701395 halo centers. (took 0.153755 sec)
SUBFIND: SO iteration 16: need to repeat for 290122 halo centers. (took 0.260611 sec)
SUBFIND: SO iteration 17: need to repeat for 2594 halo centers. (took 0.3197 sec)
SUBFIND: SO iteration 1: need to repeat for 2701395 halo centers. (took 0.652913 sec)
SUBFIND: SO iteration 2: need to repeat for 2701395 halo centers. (took 0.434402 sec)
SUBFIND: SO iteration 3: need to repeat for 2701395 halo centers. (took 0.130323 sec)
SUBFIND: SO iteration 4: need to repeat for 2701395 halo centers. (took 0.198115 sec)
SUBFIND: SO iteration 5: need to repeat for 2701395 halo centers. (took 0.092116 sec)
SUBFIND: SO iteration 6: need to repeat for 2701395 halo centers. (took 0.162294 sec)
SUBFIND: SO iteration 7: need to repeat for 2701395 halo centers. (took 0.276188 sec)
SUBFIND: SO iteration 8: need to repeat for 2701395 halo centers. (took 0.465096 sec)
SUBFIND: SO iteration 9: need to repeat for 2701395 halo centers. (took 0.404165 sec)
SUBFIND: SO iteration 10: need to repeat for 2701395 halo centers. (took 0.27706 sec)
SUBFIND: SO iteration 11: need to repeat for 2701395 halo centers. (took 0.348348 sec)
SUBFIND: SO iteration 12: need to repeat for 2701395 halo centers. (took 0.408882 sec)
SUBFIND: SO iteration 13: need to repeat for 2701395 halo centers. (took 0.0699003 sec)
SUBFIND: SO iteration 14: need to repeat for 2701395 halo centers. (took 0.0367462 sec)
SUBFIND: SO iteration 15: need to repeat for 2701395 halo centers. (took 0.0264526 sec)
SUBFIND: SO iteration 16: need to repeat for 2350770 halo centers. (took 0.0234774 sec)
SUBFIND: SO iteration 17: need to repeat for 171135 halo centers. (took 0.0261096 sec)
SUBFIND: SO iteration 18: need to repeat for 21027 halo centers. (took 0.00870475 sec)
SUBFIND: determining spherical overdensity masses took 12.7823 sec
SUBFIND: assembled and ordered groups and subhalos (took 0.7751 sec)
FOF/SUBFIND: writing group catalogue file: 'output/fof_subhalo_tab_086' (file 1 of 1)
FOF/SUBFIND: writing group catalogue block 0 (GroupLen)...
FOF/SUBFIND: writing group catalogue block 1 (GroupMass)...
FOF/SUBFIND: writing group catalogue block 2 (GroupPos)...
FOF/SUBFIND: writing group catalogue block 3 (GroupVel)...
FOF/SUBFIND: writing group catalogue block 4 (GroupLenType)...
FOF/SUBFIND: writing group catalogue block 5 (GroupOffsetType)...
FOF/SUBFIND: writing group catalogue block 6 (GroupMassType)...
FOF/SUBFIND: writing group catalogue block 7 (GroupAscale)...
FOF/SUBFIND: writing group catalogue block 8 (Group_M_Mean200)...
FOF/SUBFIND: writing group catalogue block 9 (Group_R_Mean200)...
FOF/SUBFIND: writing group catalogue block 10 (Group_M_Crit200)...
FOF/SUBFIND: writing group catalogue block 11 (Group_R_Crit200)...
FOF/SUBFIND: writing group catalogue block 12 (Group_M_Crit500)...
FOF/SUBFIND: writing group catalogue block 13 (Group_R_Crit500)...
FOF/SUBFIND: writing group catalogue block 14 (Group_M_TopHat200)...
FOF/SUBFIND: writing group catalogue block 15 (Group_R_TopHat200)...
FOF/SUBFIND: writing group catalogue block 16 (GroupNsubs)...
FOF/SUBFIND: writing group catalogue block 17 (GroupFirstSub)...
FOF/SUBFIND: writing group catalogue block 18 (SubhaloGroupNr)...
FOF/SUBFIND: writing group catalogue block 19 (SubhaloRankInGr)...
FOF/SUBFIND: writing group catalogue block 20 (SubhaloLen)...
FOF/SUBFIND: writing group catalogue block 21 (SubhaloMass)...
FOF/SUBFIND: writing group catalogue block 22 (SubhaloPos)...
FOF/SUBFIND: writing group catalogue block 23 (SubhaloVel)...
FOF/SUBFIND: writing group catalogue block 24 (SubhaloLenType)...
FOF/SUBFIND: writing group catalogue block 25 (SubhaloOffsetType)...
FOF/SUBFIND: writing group catalogue block 26 (SubhaloMassType)...
FOF/SUBFIND: writing group catalogue block 27 (SubhaloCM)...
FOF/SUBFIND: writing group catalogue block 28 (SubhaloSpin)...
FOF/SUBFIND: writing group catalogue block 29 (SubhaloVelDisp)...
FOF/SUBFIND: writing group catalogue block 30 (SubhaloVmax)...
FOF/SUBFIND: writing group catalogue block 31 (SubhaloVmaxRad)...
FOF/SUBFIND: writing group catalogue block 32 (SubhaloHalfmassRad)...
FOF/SUBFIND: writing group catalogue block 33 (SubhaloHalfmassRadType)...
FOF/SUBFIND: writing group catalogue block 34 (SubhaloIDMostbound)...
FOF/SUBFIND: writing group catalogue block 35 (SubhaloParentRank)...
FOF/SUBFIND: Group catalogues saved. took = 26.911 sec, total size 1010.21 MB, corresponds to effective I/O rate of 37.5391 MB/sec
SUBFIND: Subgroup catalogues saved. took = 26.9175 sec
SUBFIND: Finished with SUBFIND. (total time=83.7036 sec)
SUBFIND: Total number of subhalos with at least 20 particles: 2702932
SUBFIND: Largest subhalo has 98 particles/cells.
SUBFIND: Total number of particles/cells in subhalos: 133826522
FOF: preparing output order of particles took 1.2046 sec
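[Aside, not part of the job output: the catalogue saved above ('output/fof_subhalo_tab_086', written in format 3, so presumably with an .hdf5 extension) can be checked against the totals the log reports (2701440 FOF groups, largest group 98 particles, 2702932 subhalos). A minimal sketch, assuming the usual Gadget-4 HDF5 catalogue layout with 'Group' and 'Subhalo' groups whose dataset names match the blocks listed above:]

    import h5py  # assumed post-processing dependency

    # Read back a few of the catalogue blocks written above.
    with h5py.File("output/fof_subhalo_tab_086.hdf5", "r") as f:
        group_len = f["Group/GroupLen"][:]         # FOF group sizes (block 0 above)
        m_crit200 = f["Group/Group_M_Crit200"][:]  # SO masses (block 10 above)
        sub_mass = f["Subhalo/SubhaloMass"][:]     # subhalo masses (block 21 above)

    print("FOF groups:        ", len(group_len))   # expect 2701440
    print("largest FOF group: ", group_len.max())  # expect 98 particles
    print("subhalos:          ", len(sub_mass))    # expect 2702932
    print("largest M_Crit200: ", m_crit200.max())  # code mass units (UnitMass_in_g = 1.989e+43 g, i.e. 1e10 Msun/h)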
SNAPSHOT: writing snapshot file #86 @ time 0.111111 ...
SNAPSHOT: writing snapshot file: 'output/snap_086' (file 1 of 1)
SNAPSHOT: writing snapshot rename 'output/snap_086.hdf5' to 'output/bak-snap_086.hdf5'
SNAPSHOT: writing snapshot block 0 (Coordinates)...
SNAPSHOT: writing snapshot block 1 (Velocities)...
SNAPSHOT: writing snapshot block 2 (ParticleIDs)...
SNAPSHOT: writing snapshot block 3 (Masses)...
SNAPSHOT: writing snapshot block 4 (InternalEnergy)...
SNAPSHOT: writing snapshot block 5 (Density)...
SNAPSHOT: writing snapshot block 6 (SmoothingLength)...
Abort(806969615) on node 280 (rank 280 in comm 0): Fatal error in PMPI_Finalize: Other MPI error, error stack:
PMPI_Finalize(214)...............: MPI_Finalize failed
PMPI_Finalize(159)...............:
MPID_Finalize(1288)..............:
MPIDI_OFI_mpi_finalize_hook(1892): OFI domain close failed (ofi_init.c:1892:MPIDI_OFI_mpi_finalize_hook:Device or resource busy)
SNAPSHOT: done with writing snapshot. Took 7.71495 sec, total size 5306.26 MB, corresponds to effective I/O rate of 687.789 MB/sec
endrun called, calling MPI_Finalize()
bye!
[The same PMPI_Finalize / OFI domain close error stack is subsequently reported by ranks 446, 181, 274, 569, 150, 671, 97, 688, 518, 473, 767, 797, 859, 375, 425, 116, 42, 182, 276, 599, 659, 89, 506, 772, 809, 365, 8, 437, 604, 75, 361, 9, 408, 252, 621, 510, 773, 3, 260, 642, 527, 777, 749, 266, 624, 528, 235, 15, and 19.]
TACC: MPI job exited with code: 15
TACC: Shutdown complete. Exiting.