计算化学公社

Title: Help: STEOM-DLPNO-CCSD calculation fails with Fatal error in MPI_Recv

Author: waker    Time: 2023-1-12 08:44
Title: Help: STEOM-DLPNO-CCSD calculation fails with Fatal error in MPI_Recv
Hi everyone, I am using STEOM-DLPNO-CCSD to compute excited-state energy levels, but the job ends with the error "Fatal error in MPI_Recv: Invalid tag, error stack". The input file is attached. How can I fix this? I have run other molecules with the same input file before without this problem.
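(Note: the attached input file is not reproduced in this thread. As a rough sketch for context only, a minimal STEOM-DLPNO-CCSD excited-state input of this kind might look like the following; the geometry, basis sets, number of roots, and resource settings are illustrative assumptions, not the poster's actual file.)

! STEOM-DLPNO-CCSD def2-SVP def2-SVP/C def2/J RIJCOSX TightSCF

%maxcore 3000          # memory per process in MB

%pal
  nprocs 8             # number of parallel processes
end

%mdci
  nroots 5             # number of excited states requested
end

* xyz 0 1
  C   0.0000   0.0000   0.0000
  O   0.0000   0.0000   1.2050
  H   0.0000   0.9430  -0.5870
  H   0.0000  -0.9430  -0.5870
*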

Output:


----------------------
ACTIVE SPACE SELECTION
----------------------
Calculating the RI metric                              ... done

job aborted:
[ranks] message

[0] fatal error
Fatal error in MPI_Recv: Invalid tag, error stack:
MPI_Recv(buf=0x0000008B6BE92580, count=1, dtype=0x4c000836, src=1, tag=-38, MPI_COMM_WORLD, status=0x0000008B6BE92560) failed
Invalid tag, value is -38

[1] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x00000034038923B8, count=1, dtype=0x4c000836, dest=0, tag=-38, MPI_COMM_WORLD) failed
Invalid tag, value is -38

[2] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000008C0CC91E78, count=1, dtype=0x4c000836, dest=1, tag=-37, MPI_COMM_WORLD) failed
Invalid tag, value is -37

[3] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000007825092118, count=1, dtype=0x4c000836, dest=2, tag=-36, MPI_COMM_WORLD) failed
Invalid tag, value is -36

[4] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000D135A92618, count=1, dtype=0x4c000836, dest=3, tag=-35, MPI_COMM_WORLD) failed
Invalid tag, value is -35

[5] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000009E62891FE8, count=1, dtype=0x4c000836, dest=4, tag=-34, MPI_COMM_WORLD) failed
Invalid tag, value is -34

[6] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x00000052DBE921F8, count=1, dtype=0x4c000836, dest=5, tag=-33, MPI_COMM_WORLD) failed
Invalid tag, value is -33

[7] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000008447292398, count=1, dtype=0x4c000836, dest=6, tag=-32, MPI_COMM_WORLD) failed
Invalid tag, value is -32

[8] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000009A93691FF8, count=1, dtype=0x4c000836, dest=7, tag=-31, MPI_COMM_WORLD) failed
Invalid tag, value is -31

[9] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000008EF7292048, count=1, dtype=0x4c000836, dest=8, tag=-30, MPI_COMM_WORLD) failed
Invalid tag, value is -30

[10-11] terminated

[12] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x00000033A52920B8, count=1, dtype=0x4c000836, dest=11, tag=-27, MPI_COMM_WORLD) failed
Invalid tag, value is -27

[13] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000D1692923B8, count=1, dtype=0x4c000836, dest=12, tag=-26, MPI_COMM_WORLD) failed
Invalid tag, value is -26

[14-15] terminated

[16] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000BED30925A8, count=1, dtype=0x4c000836, dest=15, tag=-23, MPI_COMM_WORLD) failed
Invalid tag, value is -23

[17] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000002A66692568, count=1, dtype=0x4c000836, dest=16, tag=-22, MPI_COMM_WORLD) failed
Invalid tag, value is -22

[18] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x00000035A4E92058, count=1, dtype=0x4c000836, dest=17, tag=-21, MPI_COMM_WORLD) failed
Invalid tag, value is -21

[19] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x00000064E8891F28, count=1, dtype=0x4c000836, dest=18, tag=-20, MPI_COMM_WORLD) failed
Invalid tag, value is -20

[20] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000BEDCE92058, count=1, dtype=0x4c000836, dest=19, tag=-19, MPI_COMM_WORLD) failed
Invalid tag, value is -19

[21] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000BFE9692608, count=1, dtype=0x4c000836, dest=20, tag=-18, MPI_COMM_WORLD) failed
Invalid tag, value is -18

[22] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000A4FBE92268, count=1, dtype=0x4c000836, dest=21, tag=-17, MPI_COMM_WORLD) failed
Invalid tag, value is -17

[23] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000E8B2C91FA8, count=1, dtype=0x4c000836, dest=22, tag=-16, MPI_COMM_WORLD) failed
Invalid tag, value is -16

[24] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000004C62C92598, count=1, dtype=0x4c000836, dest=23, tag=-15, MPI_COMM_WORLD) failed
Invalid tag, value is -15

[25] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000003A942923E8, count=1, dtype=0x4c000836, dest=24, tag=-14, MPI_COMM_WORLD) failed
Invalid tag, value is -14

[26-31] terminated

[32] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000008AA5292288, count=1, dtype=0x4c000836, dest=31, tag=-7, MPI_COMM_WORLD) failed
Invalid tag, value is -7

[33] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000F1FCE92168, count=1, dtype=0x4c000836, dest=32, tag=-6, MPI_COMM_WORLD) failed
Invalid tag, value is -6

[34-35] terminated

[36] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x00000077F18921B8, count=1, dtype=0x4c000836, dest=35, tag=-3, MPI_COMM_WORLD) failed
Invalid tag, value is -3

[37] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x0000007249891ED8, count=1, dtype=0x4c000836, dest=36, tag=-2, MPI_COMM_WORLD) failed
Invalid tag, value is -2

[38-79] terminated

---- error analysis -----

[0-9,12-13,16-25,32-33,36-37] on DESKTOP-BLC1267
mpi has detected a fatal error and aborted C:\orca\orca_mdci_mpi.exe

---- error analysis -----

ORCA finished by error termination in MDCI
Calling Command: mpiexec -np 80  C:\orca\orca_mdci_mpi.exe DLPNO.mdciinp.tmp DLPNO
[file orca_tools/qcmsg.cpp, line 465]:
  .... aborting the run  

Author: shalene    Time: 2023-1-12 12:34
You are already using CCSD, and you still choose an SVP basis set?
Author: waker    Time: 2023-1-12 13:31
shalene wrote on 2023-1-12 12:34: You are already using CCSD, and you still choose an SVP basis set?

With TZVP my computer cannot handle the job; there is not enough storage space.
Author: wzkchem5    Time: 2023-1-12 16:00
waker wrote on 2023-1-12 06:31: With TZVP my computer cannot handle the job; there is not enough storage space.

Better to use a double-hybrid functional with a triple-zeta basis than STEOM-DLPNO-CCSD with a double-zeta basis.
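(For reference, a double-hybrid/triple-zeta excited-state input along these lines might look like the sketch below; the functional, basis sets, number of roots, and the file name molecule.xyz are illustrative assumptions, and keyword handling can differ between ORCA versions.)

! RI-B2PLYP def2-TZVP def2-TZVP/C def2/J RIJCOSX TightSCF

%maxcore 4000          # memory per process in MB

%pal
  nprocs 8
end

%tddft
  nroots 5             # excited states; for double hybrids ORCA adds a perturbative doubles correction (version-dependent)
end

* xyzfile 0 1 molecule.xyz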
Author: wzkchem5    Time: 2023-1-12 16:00
Is this problem reproducible?
Author: waker    Time: 2023-1-12 21:41
wzkchem5 wrote on 2023-1-12 16:00: Is this problem reproducible?

I tried switching to a triple-zeta basis, and it still fails with an error:

job aborted:
[ranks] message

[0] fatal error
Fatal error in MPI_Recv: Invalid tag, error stack:
MPI_Recv(buf=0x000000BB2FA92180, count=1, dtype=0x4c000836, src=1, tag=-70, MPI_COMM_WORLD, status=0x000000BB2FA92160) failed
Invalid tag, value is -70

[1] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000B7E0891F88, count=1, dtype=0x4c000836, dest=0, tag=-70, MPI_COMM_WORLD) failed
Invalid tag, value is -70

[2] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000F303C92558, count=1, dtype=0x4c000836, dest=1, tag=-69, MPI_COMM_WORLD) failed
Invalid tag, value is -69

[3] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000100AC92108, count=1, dtype=0x4c000836, dest=2, tag=-68, MPI_COMM_WORLD) failed
Invalid tag, value is -68

[4] fatal error
Fatal error in MPI_Send: Invalid tag, error stack:
MPI_Send(buf=0x000000A260091F38, count=1, dtype=0x4c000836, dest=3, tag=-67, MPI_COMM_WORLD) failed
Invalid tag, value is -67

[5-111] terminated

---- error analysis -----

[0-4] on DESKTOP-BLC1267
mpi has detected a fatal error and aborted C:\orca\orca_mdci_mpi.exe

---- error analysis -----

ORCA finished by error termination in MDCI
Calling Command: mpiexec -np 112  C:\orca\orca_mdci_mpi.exe DLPNO.mdciinp.tmp DLPNO
[file orca_tools/qcmsg.cpp, line 465]:
  .... aborting the run
Author: wzkchem5    Time: 2023-1-12 21:51
waker wrote on 2023-1-12 14:41: I tried switching to a triple-zeta basis, and it still fails with an error:
job aborted:

On the same machine, does running STEOM-DLPNO-CCSD on a very small molecule also fail?
Author: waker    Time: 2023-1-12 23:24
wzkchem5 wrote on 2023-1-12 21:51: On the same machine, does running STEOM-DLPNO-CCSD on a very small molecule also fail?

I ran an ethane molecule and there was no error.
Author: wzkchem5    Time: 2023-1-13 14:13
waker wrote on 2023-1-12 16:24: I ran an ethane molecule and there was no error.

Then check whether the maxcore setting is the problem. Too small (less than the memory the calculation needs) does not work, and too large (maxcore*nprocs approaching or exceeding the machine's physical memory) does not work either.
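(As an illustration of this rule of thumb only: on a hypothetical machine with 128 GB of RAM, settings like the following keep maxcore*nprocs well below the physical memory. The numbers are assumptions, not a recommendation for this particular job.)

%maxcore 6000          # MB per process; 6000 MB x 16 = 96 GB, below the assumed 128 GB of RAM

%pal
  nprocs 16            # far fewer processes than the -np 80 / -np 112 seen in the failing runs
end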
Author: waker    Time: 2023-1-17 08:44
wzkchem5 wrote on 2023-1-13 14:13: Then check whether the maxcore setting is the problem. Too small (less than the memory the calculation needs) does not work, and too large (maxcore*nprocs approaching ...

Thank you! After adjusting maxcore and nprocs, the calculation does indeed run now.




Welcome to 计算化学公社 (http://bbs.keinsci.com/) Powered by Discuz! X3.3