Author: 啦啦黑还黑

[CP2K] Step-by-step CP2K build guide: GNU 9.3 + Intel MKL + OpenMPI [video included]


61# Posted on 2021-4-19 11:22:14
Last edited by biogon on 2021-4-19 15:37
Quoting ShuangfeiZhu (2020-9-22 16:02):
Teachers, everything before this went fine, but then I hit the error below. How should I fix it?

(base) [room@localhost cp2k-7.1.0]$ make -j  ...

In the /cp2k/exts directory, run:

  git clone https://github.com/cp2k/dbcsr.git

If you then get fypp-related errors, run the following in /cp2k/exts/dbcsr:

  git submodule update --init



62# Posted on 2021-4-27 15:23:58
Quoting haibeih (2020-9-10 20:29):
Borrowing this thread for a reminder: the gcc version matters most. Note in particular that when building CP2K this way, Python also has to be built with the gcc version recommended on the official site ...

Hello, I'd like to ask you a question: when I run make for CP2K, popt builds fine, but psmp fails with errors. Moreover, the resulting cp2k.popt crashes when run in CPU parallel with mpirun -np N, while it runs fine on a single CPU. I suspect this is an OpenMPI problem (following your suggestion I installed the latest 4.0.5 myself). Have you run into this before, and how did you solve it? Thanks.
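A quick way to narrow this down is to check that the OpenMPI found at run time is the same build (and built with the same compiler) as the one used for compilation. A minimal diagnostic sketch; the cp2k.popt path below is a placeholder, not taken from the post:

  # Which OpenMPI does mpirun come from, and which compilers built it?
  which mpirun
  mpirun --version
  ompi_info | grep -i compiler

  # Which MPI shared libraries does the binary actually resolve at run time?
  ldd /path/to/cp2k.popt | grep -i mpi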


63# Posted on 2021-4-27 15:58:07
Quoting myyang (2021-4-27 15:23):
Hello, I'd like to ask you a question: when I run make for CP2K, popt builds fine, but psmp fails with errors. Moreover, the resulting c ...

Which compiler did you use to build OpenMPI?


64# Posted on 2021-8-2 20:57:36
Dear teachers, I'm currently building CP2K 8.1 with the toolchain method, and everything goes smoothly until the QUIP step.

The build output itself gives very little information:
...
==================== Installing QUIP ====================
QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705.tar.gz is found
Installing from scratch into /home/bin/cp2k_pool/cp2k-8.1/tools/toolchain/install/quip-1ff93b3400b83e804f0f2e857e70c5e4133d9705
ERROR: (./scripts/install_quip.sh, line 100) Non-zero exit code detected.

So I went through make.log and found that the errors are all unresolved references to MPI symbols:
...
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_dblprec'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Isend'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_f2c'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_f2c'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_vector'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Alltoall'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Finalize'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Sendrecv'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_op_sum'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_c2f'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_size'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_real'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Group_free'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_indexed'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Barrier'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_commit'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_c2f'
/opt/rh/devtoolset-9/root/usr/libexec/gcc/x86_64-redhat-linux/9/ld: /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_rank'
collect2: error: ld returned 1 exit status
make[1]: *** [quip] Error 1
make[1]: Leaving directory `/home/bin/cp2k_pool/cp2k-8.1/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/build/linux_x86_64_gfortran'
make: *** [Programs] Error 2

People online have hit similar situations (not while building CP2K), and the fix was to compile with Intel MPI (no explanation of the cause was given). But I'm running on AMD CPUs and nothing else here was built with Intel MPI, so I'm worried the two builds won't be compatible, and I also don't know whether Intel MPI would actually be faster than OpenMPI on AMD cores.

Is there any way to solve this? Thanks!
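For context, the undefined symbols above (MPI_Isend, ompi_mpi_real, and so on) are provided by OpenMPI's own libmpi, so the QUIP link step is simply not pulling in the OpenMPI libraries that libmkl_blacs_openmpi_lp64 expects. A rough workaround sketch, assuming OpenMPI's compiler wrappers are on PATH (this is not the toolchain's own fix):

  # Show the link flags that provide the missing MPI symbols
  mpicc --showme:link
  mpifort --showme:link

  # Appending those flags to the failing QUIP link line (or linking with mpifort
  # instead of plain gfortran) lets libmkl_blacs_openmpi_lp64.so resolve its
  # MPI references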


65# Posted on 2021-8-2 23:35:11
Quoting learnerNONE (2021-8-2 20:57):
Dear teachers, I'm currently building CP2K 8.1 with the toolchain method, and everything goes smoothly until the QUIP step. The build output itself gives very little information: ...

Does this solve your problem?

http://bbs.keinsci.com/thread-24469-1-1.html


66# Posted on 2021-8-3 09:50:39
Quoting abin (2021-8-2 23:35):
Does this solve your problem?

http://bbs.keinsci.com/thread-24469-1-1.html

Thank you for the reply, abin!

I read the tutorial you linked, and it is indeed very quick and convenient, but it says at the end that this is the official precompiled build, and I worry that costs some performance. With my current approach only QUIP is left unresolved; it's just one step away, so if possible I'd still like to get QUIP built within the same workflow. I've already spent a lot of time climbing out of pitfalls, and giving up now would feel like a waste.


67# Posted on 2021-8-3 10:05:28
Quoting learnerNONE (2021-8-3 09:50):
Thank you for the reply, abin!

I read the tutorial you linked, and it is indeed very quick and convenient, but it says at the end that this is the official precompiled ...

I did mention that if you cannot get a locally optimized build to work yourself, you should fall back on the precompiled version.

As for performance, test it yourself and see.

If you go with the Intel suite and build the whole stack locally with optimization, pay particular attention to building every single component with the Intel compilers.

Version 8.2.0 has specific requirements on the libxc and COSMA versions.

I tried the Intel toolchain and a few regression tests failed. 7.1, on the other hand, passes everything and can use the AVX-512 instruction set.
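If one does go the all-Intel route, a minimal sketch of keeping the whole stack on a single Intel installation could look like this (the path is illustrative, modeled on the 2020.4.304 layout seen earlier in this thread):

  # Load one Intel installation for compilers, MKL and (if installed) Intel MPI,
  # so every component is built against the same toolchain
  source /opt/intel/compilers_and_libraries_2020.4.304/linux/bin/compilervars.sh intel64

  # Confirm the expected tools are picked up before running the CP2K toolchain
  which icc ifort
  icc --version && ifort --version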


68# Posted on 2021-8-3 10:53:47
Quoting learnerNONE (2021-8-2 20:57):
Dear teachers, I'm currently building CP2K 8.1 with the toolchain method, and everything goes smoothly until the QUIP step. The build output itself gives very little information: ...

https://github.com/cp2k/cp2k/issues/1617

Lucien1234, is that you?

On RHEL, devtoolset-9 should be gcc 9.1.1, shouldn't it?

Since you are building it yourself, have a look at the official CP2K compiler support page:
https://www.cp2k.org/dev:compiler_support
On x86_64, GCC 5.5, 7.3, 7.5, 8.3, 8.4, 9.3 and 10.2 are all fine.

The rough cause is that the OpenMPI or MPICH you built with your compiler is incompatible with the MKL you are using.

Recommended approach:
switch to an officially supported compiler version,
use the MKL version that matches the officially recommended Intel compiler,
and rebuild OpenMPI or MPICH.

Alternatively, I can get this working for you.
cp2k 8.1 and 8.2, GNU + MKL, pass all regression tests.

Best regards.
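Rebuilding OpenMPI against a supported GCC, as recommended above, might look roughly like this (version numbers and install paths are illustrative):

  # Enable a supported GCC, here devtoolset-9 as in the log above
  source /opt/rh/devtoolset-9/enable

  # Configure, build and install OpenMPI with exactly that compiler set
  cd openmpi-4.0.5
  ./configure CC=gcc CXX=g++ FC=gfortran --prefix=$HOME/opt/openmpi-4.0.5
  make -j 8 && make install

  # Put this OpenMPI first on PATH/LD_LIBRARY_PATH before re-running the toolchain
  export PATH=$HOME/opt/openmpi-4.0.5/bin:$PATH
  export LD_LIBRARY_PATH=$HOME/opt/openmpi-4.0.5/lib:$LD_LIBRARY_PATH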


69# Posted on 2021-8-11 21:17:21
Quoting abin (2021-8-3 10:53):
https://github.com/cp2k/cp2k/issues/1617

Lucien1234, ...

Thank you very much for the patient reply, abin!

Yes, that's me; I also opened an issue on GitHub.

It may be a version issue, but I'm not sure, because gcc --version really does report 9.3.1. I later tried installing CP2K 7.1 following Dr. Liu's method and found that the OpenMPI version there is 4.0.1; with that version, QUIP together with the 2020 MKL installs fine.

My Intel compiler license has expired and I don't have a new one yet; I can only try again once I manage to get one...

After many failed attempts at installing 8.1, I've decided to give up and just use 7.1...


70# Posted on 2021-9-3 20:24:15
Dr. Liu, the server's default MPI is Intel MPI. Do I need to comment it out or switch to OpenMPI? With CP2K built following your method, multi-core parallel runs are actually slower than a single core, so I suspect something went wrong during compilation.
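One thing worth checking in this situation is whether the OpenMPI-built cp2k binary is being launched with the system's Intel MPI mpirun; mixing the two can leave you with N independent single-rank processes, which would explain parallel runs being slower than serial. A rough check, with placeholder paths:

  # The mpirun used at run time should belong to the OpenMPI CP2K was built with
  which mpirun
  mpirun --version   # should report Open MPI, not Intel MPI

  # If the toolchain installed its own OpenMPI, put it first on PATH at run time
  export PATH=/path/to/cp2k/tools/toolchain/install/openmpi-<version>/bin:$PATH
  export LD_LIBRARY_PATH=/path/to/cp2k/tools/toolchain/install/openmpi-<version>/lib:$LD_LIBRARY_PATH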


71# Posted on 2021-9-4 19:33:57
Dr. Liu, hello. My system is CentOS Linux release 8.4.2105, and I want to install CP2K 8.2. I ran:

  > --math-mode=mkl \
  > --with-openmpi=install \
  > --with-scalapack=no \
  > --with-ptscotch=install \
  > --with-parmetis=install \
  > --with-metis=install \
  > --with-superlu=install \
  > --with-pexsi=install \
  > --with-quip=install \
  > --with-plumed=install


but it reports errors: the metis and parmetis options are not recognized:

(./install_cp2k_toolchain.sh) Unknown flag: --with-metis=install
(./install_cp2k_toolchain.sh) Unknown flag: --with-parmetis=install

Is this because my gcc version is 8.4.1 rather than 9?

Thanks.



72# Posted on 2021-9-4 19:50:13
I also cannot install the scl packages. It reports:

  sudo yum install centos-release-scl scl-utils-build
  Last metadata expiration check: 1:01:59 ago on Sat 04 Sep 2021 06:39:54 AM EDT.
  No match for argument: centos-release-scl
  Error: Unable to find a match: centos-release-scl
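On CentOS 8 the centos-release-scl/devtoolset packages are no longer shipped; the rough equivalent is the gcc-toolset packages in the AppStream repository. A sketch (assuming AppStream is enabled):

  sudo dnf install gcc-toolset-9
  # Open a shell with gcc 9 ahead of the system gcc 8.4
  scl enable gcc-toolset-9 bash
  gcc --version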


73# Posted on 2021-9-5 06:38:34
Quoting hitvip (2021-9-4 19:33):
Dr. Liu, hello. My system is CentOS Linux release 8.4.2105, and I want to install CP2K 8.2. I ran:

For the CP2K 8 series you don't need those two flags; the gcc 8.4 that ships with the system is fine. See the sketch below for the adjusted command.
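For reference, the invocation above without the two unsupported flags would then look something like this (a sketch; double-check the available options with ./install_cp2k_toolchain.sh --help for your CP2K version):

  ./install_cp2k_toolchain.sh \
    --math-mode=mkl \
    --with-openmpi=install \
    --with-scalapack=no \
    --with-ptscotch=install \
    --with-superlu=install \
    --with-pexsi=install \
    --with-quip=install \
    --with-plumed=install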



74# Posted on 2021-9-5 10:07:39
Last edited by hitvip on 2021-9-5 10:18
Quoting 喵星大佬 (2021-9-5 06:38):
For the CP2K 8 series you don't need those two flags; the gcc 8.4 that ships with the system is fine.

Thank you very much for the answer. As you suggested, I dropped those two packages and built with the gcc 8.4.1 that ships with CentOS 8. Everything else now goes through, but QUIP still fails to build.

The error is: ERROR: (./scripts/stage6/install_quip.sh, line 107) Non-zero exit code detected.
I searched the forum and someone else has hit this (http://bbs.keinsci.com/thread-20432-1-1.html), but no solution was given. Could anyone take a look? Thanks.

This is the toolchain command I used:

  ./install_cp2k_toolchain.sh \
  --with-openmpi=install \
  --with-cmake=install \
  --with-libint=install \
  --with-ptscotch=install  \
  --with-pexsi=install \
  --with-superlu=install \
  --with-quip=install \
  --with-plumed=install \
  --with-sirius=install \
  --with-cosma=install


I then checked the log of the failed QUIP build; it seems to be an MKL-related problem. Part of the log:
  1. Making Programs

  2. ********************************************
  3. rm -f /home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/build/linux_x86_64_gfortran/Makefile
  4. cp /home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/src/Programs/Makefile /home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/build/linux_x86_64_gfortran/Makefile
  5. make -C /home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/build/linux_x86_64_gfortran QUIP_ROOT=/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd VPATH=/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/src/Programs -I/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd -I/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/arch
  6. make[1]: Entering directory '/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/build/linux_x86_64_gfortran'
  7. gfortran  -x f95-cpp-input -ffree-line-length-none -ffree-form -fno-second-underscore -fPIC -g  -I/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/src/libAtoms -I/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/src/fox/objs.linux_x86_64_gfortran/finclude -O3  -DGETARG_F2003 -DGETENV_F2003 -DGFORTRAN -DFORTRAN_UNDERSCORE -D'GIT_VERSION="NOT_A_GIT_REPOSITORY"'  -D'QUIP_ARCH="linux_x86_64_gfortran"' -D'SIZEOF_FORTRAN_T=2' -DHAVE_PRECON -DHAVE_QR  -c  /home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/src/Programs/vacancy_map_mod.f95 -o vacancy_map_mod.o
  8. gfortran  -x f95-cpp-input -ffree-line-length-none -ffree-form -fno-second-underscore -fPIC -g  -I/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/src/libAtoms -I/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/src/fox/objs.linux_x86_64_gfortran/finclude -O3  -DGETARG_F2003 -DGETENV_F2003 -DGFORTRAN -DFORTRAN_UNDERSCORE -D'GIT_VERSION="NOT_A_GIT_REPOSITORY"'  -D'QUIP_ARCH="linux_x86_64_gfortran"' -D'SIZEOF_FORTRAN_T=2' -DHAVE_PRECON -DHAVE_QR  -c  /home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/src/Programs/quip.f95 -o quip.o
  9. gfortran  -o quip  quip.o vacancy_map_mod.o -L. -lquiputils -lquip_core  -latoms  -O3  -L/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/src/fox/objs.linux_x86_64_gfortran/lib -lFoX_sax -lFoX_wxml -lFoX_utils -lFoX_common -lFoX_fsys  -L/home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64 -Wl,-rpath=/home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64 -lmkl_scalapack_lp64 -Wl,--start-group -lmkl_gf_lp64 -lmkl_sequential -lmkl_core -lmkl_blacs_openmpi_lp64 -Wl,--end-group -lpthread -lm -ldl  
  10. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Waitall'
  11. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_comm_null'
  12. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Abort'
  13. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Waitany'
  14. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_f2c'
  15. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Wtime'
  16. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_dup'
  17. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_double'
  18. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_free'
  19. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_create'
  20. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Group_incl'
  21. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Attr_get'
  22. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_dblcplex'
  23. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Init'
  24. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_create_struct'
  25. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Bcast'
  26. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Alltoallv'
  27. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Cart_create'
  28. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Scatterv'
  29. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Initialized'
  30. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_free'
  31. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Iprobe'
  32. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Testall'
  33. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Allgatherv'
  34. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_group'
  35. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Cart_sub'
  36. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_split'
  37. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Send'
  38. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Allreduce'
  39. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_packed'
  40. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_op_max'
  41. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_op_maxloc'
  42. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_op_min'
  43. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Rsend'
  44. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_byte'
  45. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Irecv'
  46. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_contiguous'
  47. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Recv'
  48. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_free'
  49. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_comm_world'
  50. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Address'
  51. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_compare'
  52. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_float'
  53. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Reduce'
  54. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Pack'
  55. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_request_null'
  56. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_cplex'
  57. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_unsigned_short'
  58. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_c2f'
  59. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Gatherv'
  60. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Wait'
  61. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Allgather'
  62. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_double_int'
  63. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_create'
  64. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_int'
  65. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Pack_size'
  66. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_long_long_int'
  67. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Unpack'
  68. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_integer'
  69. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Test'
  70. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_dblprec'
  71. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Isend'
  72. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_f2c'
  73. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_f2c'
  74. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_ub'
  75. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_vector'
  76. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Alltoall'
  77. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Finalize'
  78. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Sendrecv'
  79. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_op_sum'
  80. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_c2f'
  81. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_size'
  82. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_real'
  83. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Group_free'
  84. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_indexed'
  85. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Barrier'
  86. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_commit'
  87. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_c2f'
  88. /home/pfg/intel/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_rank'
  89. collect2: error: ld returned 1 exit status
  90. make[1]: *** [Makefile:71: quip] Error 1
  91. make[1]: Leaving directory '/home/pfg/Apps/cp2k-8.2/tools/toolchain/build/QUIP-b4336484fb65b0e73211a8f920ae4361c7c353fd/build/linux_x86_64_gfortran'
  92. make: *** [Makefile:155: Programs] Error



75# Posted on 2021-9-5 16:37:58
Quoting hitvip (2021-9-5 10:07):
Thank you very much for the answer. As you suggested, I dropped those two packages and built with the gcc 8.4.1 that ships with CentOS 8, but now ...

Just don't use MKL and it will work...
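In other words, let the toolchain build its own math stack instead of linking MKL. A sketch of the corresponding change to the command from #74, switching the math mode to OpenBLAS (option names should be verified against ./install_cp2k_toolchain.sh --help):

  ./install_cp2k_toolchain.sh \
    --math-mode=openblas \
    --with-openmpi=install \
    --with-cmake=install \
    --with-libint=install \
    --with-ptscotch=install \
    --with-pexsi=install \
    --with-superlu=install \
    --with-quip=install \
    --with-plumed=install \
    --with-sirius=install \
    --with-cosma=install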
