计算化学公社

Author: sobereva

[CP2K] How to install the CP2K first-principles program on Linux

31# Posted on 2021-2-19 18:48:27
sobereva, posted 2021-2-19 07:26:
I will write a dedicated post in the next few days.

In short, load any file supported by Multiwfn that contains structural information (preferably one that also contains cell ...

Thank you very much, Prof. sob!

32# Posted on 2021-2-19 20:18:53
gog, posted 2021-2-19 12:47:
Compile openmpi with root privileges first.

Is compiling openmpi myself necessarily related to ending up without popt?

33# Posted on 2021-2-19 21:12:38
gog, posted 2021-2-19 12:47:
Compile openmpi with root privileges first.

I recompiled once more:

    ./install_cp2k_toolchain.sh --with-mkl=system --with-openmpi=install --with-libsmm=install --with-ptscotch=install --with-superlu=install --with-pexsi=install --with-plumed=install


I used the VERSION values CP2K itself provides, without adding popt on my own:

    make -j 40 ARCH=local VERSION="ssmp psmp"


This produced the following in the exe/local directory:

    memory_utilities_unittest.ssmp
    parallel_rng_types_unittest.ssmp
    graph.ssmp
    memory_utilities_unittest.psmp
    parallel_rng_types_unittest.psmp
    graph.psmp
    grid_miniapp.ssmp
    grid_unittest.ssmp
    grid_miniapp.psmp
    grid_unittest.psmp
    dumpdcd.ssmp
    xyz2dcd.ssmp
    xyz2dcd.psmp
    dumpdcd.psmp
    libcp2k_unittest.ssmp
    cp2k.ssmp
    cp2k.sopt -> cp2k.ssmp        # symlink
    cp2k_shell.ssmp -> cp2k.ssmp  # symlink
    cp2k.psmp
    libcp2k_unittest.psmp
    cp2k_shell.psmp -> cp2k.psmp  # symlink
    cp2k.popt -> cp2k.psmp        # symlink


As you can see, version 8.1 by default uses symlinks to provide the psmp build as the popt build.
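In practice the distinction should hardly matter: as far as I understand, running the psmp binary with a single OpenMP thread behaves the same as the former MPI-only popt build. A minimal sketch (the input file name is just an example):

    # One OpenMP thread per MPI rank mimics the old popt behavior
    export OMP_NUM_THREADS=1
    mpirun -np 16 exe/local/cp2k.popt -i H2O-128.inp -o H2O-128.out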

34# Posted on 2021-2-19 21:39:41
乐平, posted 2021-2-19 20:18:
Is compiling openmpi myself necessarily related to ending up without popt?

It was probably dropped as the software was updated. In the replies to Dr. Liu's post, a build compiled with Intel MPI had no popt either.

35# Posted on 2021-2-19 22:59:59
sobereva, posted 2021-2-19 07:35:
https://www.cp2k.org/version_history
I do not think it counts as a very obvious improvement.

Understood; thank you, director.

36# Posted on 2021-2-19 23:13:10
Last edited by 乐平 on 2021-2-19 17:14

Prof. sob, may I ask how cross-node parallelism should be set up? I am using openmpi 4.0.1.

For example, each node of my server has 16 cores, and
mpirun -np 16 cp2k.popt -i H2O-128.inp -o H2O-128.out
runs fine.

If I want to span two nodes and use 32 cores, how should it be set up?

I tried
mpirun -np 32 cp2k.popt -i H2O-128.inp -o H2O-128.out
or
mpirun -np 32 cp2k.psmp -i H2O-128.inp -o H2O-128.out

but neither job would run.

37# (Thread author) Posted on 2021-2-20 15:22:04
乐平, posted 2021-2-19 23:13:
Prof. sob, may I ask how cross-node parallelism should be set up? I am using openmpi 4.0.1.

For example, each node of my server has 16-cor ...

https://www.open-mpi.org/faq/?category=running#mpirun-specify-hosts
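In short (a minimal sketch of what that FAQ entry describes; the hostnames below are made up), mpirun has to be told which machines to use, either on the command line or through a hostfile:

    # Option 1: list the hosts directly, 16 slots each
    mpirun -np 32 -H node1:16,node2:16 cp2k.popt -i H2O-128.inp -o H2O-128.out

    # Option 2: put them in a hostfile
    cat > hosts <<EOF
    node1 slots=16
    node2 slots=16
    EOF
    mpirun -np 32 --hostfile hosts cp2k.popt -i H2O-128.inp -o H2O-128.out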

38# Posted on 2021-2-20 16:35:28
sobereva, posted 2021-2-20 09:22:
https://www.open-mpi.org/faq/?category=running#mpirun-specify-hosts

Thanks, though I don't think I fully followed it. At that link, section (9) is about running MPI on certain specified hosts, and the next section (10) is about running MPI across multiple hosts.
The Open MPI site explains it in such a complicated way; it is really unfriendly to beginners...



I modified my PBS script a bit, and it can now run across two nodes, as follows.

    #!/bin/bash
    #PBS -N cp2k_H2O-512
    #PBS -l nodes=2:ppn=16
    #PBS -j n
    #PBS -e ${PBS_JOBNAME}.e
    #PBS -o ${PBS_JOBNAME}.o
    #PBS -q v3

    module load gcc/9.3.1
    module load openmpi/4.0.1
    source /public1/apps/cp2k-8.1/tools/toolchain/install/setup

    cd $PBS_O_WORKDIR
    EXEC=/public1/apps/cp2k-8.1/exe/local/cp2k.popt

    mpirun -np 32 $EXEC -i H2O-512.inp -o H2O-512.out



The changes are as follows:
[1] #PBS -l nodes=2:ppn=16
Before the change this was nodes=1, so the job could only run on a single node.

[2] mpirun -np 32
32 = 2 × 16, i.e. the number of nodes times the CPU cores per node. Previously I had only changed this line without changing item [1], which is why the job would not run.

39# Posted on 2021-3-24 16:22:05
The QUIP module will not compile:
==================== Installing QUIP ====================
QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705.tar.gz is found
Installing from scratch into /share/apps/soft/cp2k81/tools/toolchain/install/quip-1ff93b3400b83e804f0f2e857e70c5e4133d9705
ERROR: (./scripts/install_quip.sh, line 100) Non-zero exit code detected.

make.log contains the following errors:

Making Programs

********************************************
rm -f /share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/build/linux_x86_64_gfortran/Makefile
cp /share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/src/Programs/Makefile /share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/build/linux_x86_64_gfortran/Makefile
make -C /share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/build/linux_x86_64_gfortran QUIP_ROOT=/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705 VPATH=/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/src/Programs -I/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705 -I/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/arch
make[1]: Entering directory `/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/build/linux_x86_64_gfortran'
gfortran  -x f95-cpp-input -ffree-line-length-none -ffree-form -fno-second-underscore -fPIC -g  -I/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/src/libAtoms -I/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/src/fox/objs.linux_x86_64_gfortran/finclude -O3  -DGETARG_F2003 -DGETENV_F2003 -DGFORTRAN -DFORTRAN_UNDERSCORE -D'GIT_VERSION="NOT_A_GIT_REPOSITORY"'  -D'QUIP_ARCH="linux_x86_64_gfortran"' -D'SIZEOF_FORTRAN_T=2' -DHAVE_PRECON -DHAVE_QR  -c  /share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/src/Programs/vacancy_map_mod.f95 -o vacancy_map_mod.o
gfortran  -x f95-cpp-input -ffree-line-length-none -ffree-form -fno-second-underscore -fPIC -g  -I/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/src/libAtoms -I/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/src/fox/objs.linux_x86_64_gfortran/finclude -O3  -DGETARG_F2003 -DGETENV_F2003 -DGFORTRAN -DFORTRAN_UNDERSCORE -D'GIT_VERSION="NOT_A_GIT_REPOSITORY"'  -D'QUIP_ARCH="linux_x86_64_gfortran"' -D'SIZEOF_FORTRAN_T=2' -DHAVE_PRECON -DHAVE_QR  -c  /share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/src/Programs/quip.f95 -o quip.o
gfortran  -o quip  quip.o vacancy_map_mod.o -L. -lquiputils -lquip_core  -latoms   -O3  -L/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/src/fox/objs.linux_x86_64_gfortran/lib -lFoX_sax -lFoX_wxml -lFoX_utils -lFoX_common -lFoX_fsys  /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_scalapack_lp64.so -Wl,--start-group /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_gf_lp64.so /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_sequential.so /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_core.so /opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so -Wl,--end-group -lpthread -lm -ldl  
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Waitall'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_comm_null'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Abort'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Waitany'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_f2c'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Wtime'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_dup'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_double'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_free'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_create'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Group_incl'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_get_attr'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_dblcplex'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Init'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_create_struct'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Bcast'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Alltoallv'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Cart_create'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Scatterv'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Initialized'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_free'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Iprobe'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Testall'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Allgatherv'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_group'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Cart_sub'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_split'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Send'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Allreduce'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_packed'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_op_max'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_op_maxloc'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_op_min'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Rsend'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_byte'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Irecv'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_contiguous'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Recv'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_free'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_comm_world'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_compare'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_float'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Reduce'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Pack'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_request_null'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_cplex'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_unsigned_short'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_c2f'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Gatherv'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Get_address'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Wait'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Allgather'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_create_resized'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_double_int'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_create'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_int'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Pack_size'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_long_long_int'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Unpack'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_integer'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Test'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_dblprec'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Isend'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_f2c'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_f2c'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_vector'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Alltoall'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Finalize'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Sendrecv'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_op_sum'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Op_c2f'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_size'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `ompi_mpi_real'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Group_free'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_indexed'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Barrier'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_commit'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Type_c2f'
/opt/intel/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.so: undefined reference to `MPI_Comm_rank'
collect2: error: ld returned 1 exit status
make[1]: *** [quip] Error 1
make[1]: Leaving directory `/share/apps/soft/cp2k81/tools/toolchain/build/QUIP-1ff93b3400b83e804f0f2e857e70c5e4133d9705/build/linux_x86_64_gfortran'
make: *** [Programs] Error 2

Could the teachers please take a look at what might be the cause?

40# (Thread author) Posted on 2021-3-25 07:02:38
ball2006, posted 2021-3-24 16:22:
The QUIP module will not compile:
==================== Installing QUIP ====================
QUIP-1ff93b3400 ...

There is a problem with the MPI library linking. Check your MPI installation and environment settings.

If this is the only component that fails and you do not run calculations with empirical potentials, you can also skip installing QUIP.
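As a starting point for that check (a sketch, not a definitive diagnosis): the failing link command invokes gfortran directly against MKL's BLACS-for-Open-MPI library without pulling in any MPI libraries, so it is worth confirming that the toolchain's Open MPI is the one actually found:

    # Which Open MPI installation is picked up first?
    which mpirun mpifort
    # Show what the Fortran wrapper would really invoke (Open MPI syntax)
    mpifort --showme
    # Re-source the toolchain environment, then retry the failed step
    source /share/apps/soft/cp2k81/tools/toolchain/install/setup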

41# Posted on 2021-3-25 11:11:29
OK, Prof. sob.
I will keep looking into it; many thanks.

42# Posted on 2021-3-26 15:33:15
Teacher, this problem appeared when compiling with make -j 4. How can it be solved?
Traceback (most recent call last):
  File "/calc/msi/cp2k-8.1/tools/build_utils/check_archives.py", line 69, in <module>
    main()
  File "/calc/msi/cp2k-8.1/tools/build_utils/check_archives.py", line 49, in main
    output = check_output([ar_exe, "t", archive_fn], encoding="utf8")
  File "/usr/local/lib/python3.5/subprocess.py", line 626, in check_output
    **kwargs).stdout
  File "/usr/local/lib/python3.5/subprocess.py", line 693, in run
    with Popen(*popenargs, **kwargs) as process:
TypeError: __init__() got an unexpected keyword argument 'encoding'
make[2]: *** [makedep] 错误 1
make[2]: *** 正在等待未完成的任务....
Traceback (most recent call last):
  File "/calc/msi/cp2k-8.1/tools/build_utils/check_archives.py", line 69, in <module>
    main()
  File "/calc/msi/cp2k-8.1/tools/build_utils/check_archives.py", line 49, in main
    output = check_output([ar_exe, "t", archive_fn], encoding="utf8")
  File "/usr/local/lib/python3.5/subprocess.py", line 626, in check_output
    **kwargs).stdout
  File "/usr/local/lib/python3.5/subprocess.py", line 693, in run
    with Popen(*popenargs, **kwargs) as process:
TypeError: __init__() got an unexpected keyword argument 'encoding'
make[2]: *** [makedep] 错误 1
make[2]: *** 正在等待未完成的任务....
Resolving dependencies ...
Resolving dependencies ...
make[1]: *** [ssmp] 错误 2
make[1]: *** 正在等待未完成的任务....
make[1]: *** [psmp] 错误 2
make: *** [all] 错误 2

43# (Thread author) Posted on 2021-3-26 19:32:26
yiranfengbai, posted 2021-3-26 15:33:
Teacher, this problem appeared when compiling with make -j 4. How can it be solved?
Traceback (most recent call last):
  File "/cal ...

Not enough information to judge. You need to state very clearly which system, which compiler, and exactly how you compiled.

Also, do not install the system with Chinese as its language.

44# Posted on 2021-3-26 22:44:36
sobereva, posted 2021-3-26 19:32:
Not enough information to judge. You need to state very clearly which system, which compiler, and exactly how you compiled.

Also, do not install the system with Chinese as its language.

Sorry, teacher. The system is CentOS 7.6 with gcc 9.3.1. After I deleted that keyword, the problem has stopped appearing for now, though I am not sure whether this is correct.
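For what it is worth, the traceback itself suggests the cause: it shows /usr/local/lib/python3.5, while the encoding= keyword that check_archives.py passes to subprocess only exists from Python 3.6 onward. Deleting the keyword sidesteps the error; pointing the build at a newer interpreter is likely the cleaner fix. A sketch (the Python path below is hypothetical):

    # check_archives.py's encoding= call needs Python >= 3.6
    python3 --version
    # Put a newer interpreter first on PATH, then rebuild
    export PATH=/opt/python3.8/bin:$PATH
    make -j 4 ARCH=local VERSION="ssmp psmp"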

45# Posted on 2021-3-30 20:06:01
Last edited by abin on 2021-4-17 21:37
ball2006, posted 2021-3-24 16:22:
The QUIP module will not compile:
==================== Installing QUIP ====================
QUIP-1ff93b3400 ...

As I recall, intel2020u2 worked without problems. A correction: with intel2020u2, certain calculations run into memory-blowup issues, so use it with caution.

intel2020u4 and u3 both have plenty of pitfalls.

Also, I have precompiled builds of cp2k v7.1 and v8.1, optimized for E5 v3/v4 or for newer AVX2/AVX512 hardware respectively. They are available for a fee; deployment requires root access on the target machine.
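If you are not sure which of those builds matches a given machine, the CPU's instruction-set flags can be checked directly (a sketch; typical Linux systems):

    # List the AVX-related flags the CPU reports
    grep -oE 'avx512[a-z]*|avx2|avx' /proc/cpuinfo | sort -u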
