计算化学公社 (Computational Chemistry Commune)

[ORCA] ORCA 4.1.1 aborts with an error

1#
OP | Last edited by hanshan on 2019-4-4 13:54

I downloaded two builds of ORCA 4.1.1: one is orca_4_1_1_linux_x86-64_shared_openmpi313,
the other is orca_4_1_1_linux_x86-64_openmpi313.
Both of them fail with the errors below.
How can this be solved? The OpenMPI lib directory is already configured
(a quick way to double-check which OpenMPI is actually picked up is sketched after the log).
@sobereva

[ad01:28484] PMIX ERROR: UNPACK-PAST-END in file unpack.c at line 206
[ad01:28484] PMIX ERROR: UNPACK-PAST-END in file unpack.c at line 147
[ad01:28484] PMIX ERROR: UNPACK-PAST-END in file client/pmix_client.c at line 223
[ad01:28482] PMIX ERROR: UNPACK-PAST-END in file unpack.c at line 206
[ad01:28482] PMIX ERROR: UNPACK-PAST-END in file unpack.c at line 147
[ad01:28482] PMIX ERROR: UNPACK-PAST-END in file client/pmix_client.c at line 223
[ad01:28482] OPAL ERROR: Error in file pmix2x_client.c at line 109
[ad01:28483] PMIX ERROR: UNPACK-PAST-END in file unpack.c at line 206
[ad01:28483] PMIX ERROR: UNPACK-PAST-END in file unpack.c at line 147
[ad01:28483] PMIX ERROR: UNPACK-PAST-END in file client/pmix_client.c at line 223
[ad01:28483] OPAL ERROR: Error in file pmix2x_client.c at line 109
[ad01:28484] OPAL ERROR: Error in file pmix2x_client.c at line 109
[ad01:28485] PMIX ERROR: UNPACK-PAST-END in file unpack.c at line 206
[ad01:28485] PMIX ERROR: UNPACK-PAST-END in file unpack.c at line 147
[ad01:28485] PMIX ERROR: UNPACK-PAST-END in file client/pmix_client.c at line 223
[ad01:28485] OPAL ERROR: Error in file pmix2x_client.c at line 109
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[ad01:28484] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[ad01:28482] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[ad01:28485] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
-------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[ad01:28483] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[23038,1],2]
  Exit code:    1
--------------------------------------------------------------------------

ORCA finished by error termination in GTOInt
Calling Command: mpirun -np 4  /home/zz09/bin/orca_4_1_1_linux_x86-64_openmpi313/orca_gtoint_mpi to3.int.tmp to3
[file orca_tools/qcmsg.cpp, line 458]:
  .... aborting the run

[file orca_tools/qcmsg.cpp, line 458]:
  .... aborting the run
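A PMIX "UNPACK-PAST-END" failure during MPI_Init usually indicates a version mix-up: the mpirun that launches the job and the libmpi/libpmix that the ORCA helper binaries load at run time come from different OpenMPI installations (see reply 2# below). A minimal diagnostic sketch, using only standard tools and the binary path taken from the log above:

which mpirun                  # the launcher the shell actually finds first
mpirun --version              # an *_openmpi313 build expects "Open MPI" 3.1.x here
echo $LD_LIBRARY_PATH         # the library search path the MPI helpers will use
# confirm which libmpi the failing helper binary actually resolves to
ldd /home/zz09/bin/orca_4_1_1_linux_x86-64_openmpi313/orca_gtoint_mpi | grep -i mpi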

3#
OP (Author) | Posted on 2019-4-5 08:53:20
Quote: sobereva, posted on 2019-4-5 05:34:
An MPI problem. Usually this means that the MPI currently enabled on the machine and the MPI that the ORCA build itself requires are not the same library and the same version.

After many attempts I finally found that switching to
orca_4_1_0_linux_x86-64_openmpi215,
i.e. using OpenMPI 2.1.5, lets the calculation run.
The problem is presumably that OpenMPI 3.1.3 is not supported.
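For anyone reproducing this fix, here is a rough sketch of building OpenMPI 2.1.5 from source and making it the enabled MPI before launching orca_4_1_0. The /opt/openmpi-2.1.5 prefix is only a placeholder, and the URL assumes the standard open-mpi.org release layout:

wget https://download.open-mpi.org/release/open-mpi/v2.1/openmpi-2.1.5.tar.gz
tar xf openmpi-2.1.5.tar.gz && cd openmpi-2.1.5
./configure --prefix=/opt/openmpi-2.1.5    # placeholder install prefix
make -j4 && make install                   # /opt may need root; any writable prefix works
# make this OpenMPI the one in effect, e.g. by adding to ~/.bashrc:
export PATH=/opt/openmpi-2.1.5/bin:$PATH
export LD_LIBRARY_PATH=/opt/openmpi-2.1.5/lib:$LD_LIBRARY_PATH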


2#
sobereva (Administrator) | Posted on 2019-4-5 05:34:56
An MPI problem. Usually this means that the MPI currently enabled on the machine and the MPI that the ORCA build itself requires are not the same library and the same version.
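In practice this means the version encoded in the ORCA build name (3.1.x for an *_openmpi313 build) must be the OpenMPI found first on PATH and LD_LIBRARY_PATH when the job starts, and parallel ORCA must be invoked through the absolute path of the orca driver. A sketch with /opt/openmpi-3.1.3 as a placeholder prefix and to3.inp as an assumed input file name:

export PATH=/opt/openmpi-3.1.3/bin:$PATH               # placeholder prefix
export LD_LIBRARY_PATH=/opt/openmpi-3.1.3/lib:$LD_LIBRARY_PATH
mpirun --version                                       # sanity check: should now report 3.1.3
# parallel runs require the full path to the orca driver, which then calls mpirun itself
/home/zz09/bin/orca_4_1_1_linux_x86-64_openmpi313/orca to3.inp > to3.out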
