[Columbus] Installing Columbus with OpenMolcas
Zihan Pengmei
zpengmei at uchicago.edu
Thu Jun 9 02:27:27 EDT 2022
Dear all,
I just successfully compiled it with OpenMolcas. I really appreciate all of your help here! It is really important to configure Global Arrays and compile it against Intel MKL first, then separately use this GA build to compile the serial and the parallel OpenMolcas, and finally come back to Columbus. Some versions of the Intel compiler and MPI did not work out for me.
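For anyone hitting the same wall, the GA step looked roughly like the commands below. Please treat this as a sketch rather than a recipe: the paths and environment variables ($I_MPI_ROOT, $GAROOT) are placeholders for our cluster's locations, and the port flag is my reading of the GA configure options, so check everything against ../configure --help of your GA version.

  # Build Global Arrays first, with the same Intel compiler, Intel MPI and MKL
  # that will later be used for OpenMolcas and Columbus.
  cd ga-5.x && mkdir build && cd build
  ../configure CC=icc F77=ifort MPICC=mpiicc MPIF77=mpiifort \
      --with-mpi=$I_MPI_ROOT \
      --with-mpi-pr \
      --with-blas=-mkl --with-lapack=-mkl \
      --enable-static --disable-shared \
      --prefix=$GAROOT
  make && make install
  # The static libraries (libga.a, libarmci.a) should then sit in $GAROOT/lib.
  # --with-mpi-pr selects the MPI-PR port; for single-node use, --with-sockets
  # may be the simpler choice (see Felix's comment further down in the digest).

With GA in place, build the serial and the parallel OpenMolcas against it, and only then run the Columbus installation.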
Again, much appreciated for the help!
Zihan
From: Columbus <columbus-bounces at lists.osc.edu> on behalf of columbus-request at lists.osc.edu <columbus-request at lists.osc.edu>
Date: Wednesday, June 8, 2022 at 2:47 PM
To: columbus at lists.osc.edu <columbus at lists.osc.edu>
Subject: Columbus Digest, Vol 112, Issue 8
Today's Topics:
1. Re: Installing Columbus with OpenMolcas (Zihan Pengmei)
----------------------------------------------------------------------
Message: 1
Date: Wed, 8 Jun 2022 19:46:43 +0000
From: Zihan Pengmei <zpengmei at uchicago.edu>
To: "columbus at lists.osc.edu" <columbus at lists.osc.edu>
Subject: Re: [Columbus] Installing Columbus with OpenMolcas
Message-ID:
<LV2PR11MB59972C957FCAB075244426CBBAA49 at LV2PR11MB5997.namprd11.prod.outlook.com>
Content-Type: text/plain; charset="windows-1252"
Hi all,
Thank you all for the kind advice! Sadly, I still can't work this out today; a brief summary:
1. I did a fresh install of GA, as Felix pointed out
2. I freshly compiled two OpenMolcas builds from Felix's GitLab, one sequential (--compiler intel -mkl $MKLROOT) and one parallel (--compiler intel -mkl $MKLROOT -mpi -GA $GAROOT), using exactly the same compiler and the newly installed GA (spelled out in the sketch after this list)
3. The cpan, standard, and grad installations are successful
4. The parallel installation fails
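Spelled out, the two builds in point 2 were configured roughly as follows. I write ./configure for the setup script of Felix's Columbus-enabled OpenMolcas branch; substitute the actual script name and paths on your system, with $MKLROOT and $GAROOT pointing at the MKL and GA installations:

  # sequential build -> referenced by the MOLCAS keyword in install.config
  ./configure --compiler intel -mkl $MKLROOT

  # MPI + GA build, against the same GA install -> referenced by PMOLCAS
  ./configure --compiler intel -mkl $MKLROOT -mpi -GA $GAROOT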
I attached the config and logs here. I tried to supply either MOLCAS or PMOLCAS, but they didn't work.
Much appreciated for all the effort and support from the community!
Zihan
Install.config:
GREP /bin/grep
GMAKE gmake
TAR /bin/tar
RANLIB /usr/bin/ranlib
CPPL cpp -E -traditional
BLASLIBRARY -Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_ilp64.a ${MKLROOT}/lib/intel64/libmkl_sequential.a ${MKLROOT}/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl
LAPACKLIBRARY -Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_ilp64.a ${MKLROOT}/lib/intel64/libmkl_sequential.a ${MKLROOT}/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl
MACHINEID linux64.ifc
GUNZIP /bin/gunzip
GACOMMUNICATION MPI
MPI_MAINDIR /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274
MPI_CC /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/bin/mpicc
MPI_LD /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/bin/mpif77 -z muldefs
MPI_FC /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/bin/mpif90 -z muldefs
PSCRIPT
GAVERSION GA53
VMOLCAS 8.0
MOLCAS /Apps/COLUMBUS/parallel/OpenMolcas/builds/intel_normal_mkl
PMOLCAS /Apps/COLUMBUS/parallel/OpenMolcas/builds/intel_normal_mpi_ga_mkl
MPI_LIBS /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/lib/libmpi.a
MPI_STARTUP /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/bin mpirun -np _NPROC_ _EXE_ _EXEOPTS_
COLUMBUS /Apps/COLUMBUS/parallel/Columbus
DALTON /Apps/COLUMBUS/parallel/Columbus/source/dalton
COLUMBUSVERSION 7.0
INSTALLOPTION NOGA_INSTALL
Install.log.parallel:
===== ./colinstall.sh: pciudg =====
Apps/COLUMBUS/parallel/Columbus/makefile:22: INCFILE= Apps/COLUMBUS/parallel/Columbus/machine.cfg/linux64.ifc
Apps/COLUMBUS/parallel/Columbus/makefile:50: GAVERSION=GA53:
Apps/COLUMBUS/parallel/Columbus/makefile:65: PMAKE= yes
Apps/COLUMBUS/parallel/Columbus/makefile:69: setting DKEYWORDSLOCAL 1 to -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI
Apps/COLUMBUS/parallel/Columbus/makefile:70: setting CPPDIRLOCAL 1 to -I Apps/COLUMBUS/parallel/Columbus/source/dalton/include -DCOLUMBUS -DSYS_LINUX -DPTR64 -Df90 -DVAR_BLAS3 -DINT64 -DPARALLEL -DMPI
Apps/COLUMBUS/parallel/Columbus/makefile:81: setting DKEYWORDSLOCAL 3 to -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64
Apps/COLUMBUS/parallel/Columbus/makefile:82: LIBMOLCAS set to $LIBMOLCAS
Apps/COLUMBUS/parallel/Columbus/makefile:85: PLIBS= Apps/COLUMBUS/parallel/Columbus/libga.a Apps/COLUMBUS/parallel/Columbus/libarmci.a /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/lib/libmpi.a
Apps/COLUMBUS/parallel/Columbus/makefile:90: setting PLIBS to Apps/COLUMBUS/parallel/Columbus/libga.a Apps/COLUMBUS/parallel/Columbus/libarmci.a /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/lib/libmpi.a
Apps/COLUMBUS/parallel/Columbus/makefile:91: setting LIBS to Apps/COLUMBUS/parallel/Columbus/colib.a Apps/COLUMBUS/parallel/Columbus/blaswrapper.a -Wl,--start-group /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl -Wl,--start-group /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl
Apps/COLUMBUS/parallel/Columbus/makefile:92: setting LIBS2 to Apps/COLUMBUS/parallel/Columbus/colib.a Apps/COLUMBUS/parallel/Columbus/libmolcas_col.a Apps/COLUMBUS/parallel/Columbus/blaswrapper.a -Wl,--start-group /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl -Wl,--start-group /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl
Apps/COLUMBUS/parallel/Columbus/makefile:105: cpp WITH_REDIRECTON ::
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_data.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 molcaswrapper.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_matutil.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_gautil.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_ioutil.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_drtutil.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_utils.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O0 -g -noautomatic ciudg_loop.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_setup.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 maksortmod.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 cisrtmod.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_diag.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_allin.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_fourex_mod.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_threx.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_twoext.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_onext.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_spinorbit.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_oneden.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_twoden.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg_main.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 drivercid.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 driver.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 driversize.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 driverwrap.F90
ifort -c -nowarn -mkl=sequential -free -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3 -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50 -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64 -i8 -O2 ciudg.F90
$MPI_LD -o pciudg.x ciudg_data.o molcaswrapper.o ciudg_matutil.o ciudg_gautil.o ciudg_ioutil.o ciudg_drtutil.o ciudg_utils.o ciudg_loop.o ciudg_setup.o maksortmod.o cisrtmod.o ciudg_diag.o ciudg_allin.o ciudg_fourex_mod.o ciudg_threx.o ciudg_twoext.o ciudg_onext.o ciudg_allin.o ciudg_diag.o ciudg_spinorbit.o ciudg_oneden.o ciudg_twoden.o ciudg_main.o drivercid.o driver.o driversize.o driverwrap.o ciudg.o Apps/COLUMBUS/parallel/Columbus/libdalton2.a Apps/COLUMBUS/parallel/Columbus/libga.a Apps/COLUMBUS/parallel/Columbus/libarmci.a /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/lib/libmpi.a -Wl,--start-group /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl Apps/COLUMBUS/parallel/Columbus/colib.a
Apps/COLUMBUS/parallel/Columbus/libmolcas_col.a Apps/COLUMBUS/parallel/Columbus/blaswrapper.a -Wl,--start-group /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl -Wl,--start-group /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl
/usr/lib/gcc/x86_64-redhat-linux/8/../../../../lib64/crt1.o: In function `_start':
(.text+0x24): undefined reference to `main'
molcaswrapper.o: In function `molcaswrapper_mp_mcget_iarray_':
molcaswrapper.F90:(.text+0xbc): undefined reference to `for_check_mult_overflow64'
molcaswrapper.F90:(.text+0x12c): undefined reference to `for_alloc_allocatable'
molcaswrapper.F90:(.text+0x21f): undefined reference to `for_dealloc_allocatable'
...
blaswrapper.f:(.text+0x653): undefined reference to `_intel_fast_memcpy'
/project/lgagliardi/shared/Apps/COLUMBUS/parallel/Columbus/blaswrapper.a(blaswrapper.o): In function `icopy_wr_':
blaswrapper.f:(.text+0x833): undefined reference to `_intel_fast_memcpy'
collect2: error: ld returned 1 exit status
gmake: *** [/project/lgagliardi/shared/Apps/COLUMBUS/parallel/Columbus/makefile:388: pciudg.x] Error 1
+ for i in $*
+ cp pciudg.x /project/lgagliardi/shared/Apps/COLUMBUS/parallel/Columbus/pciudg.x
cp: cannot stat 'pciudg.x': No such file or directory
From: Columbus <columbus-bounces+zpengmei=uchicago.edu at lists.osc.edu> on behalf of columbus-request at lists.osc.edu <columbus-request at lists.osc.edu>
Date: Wednesday, June 8, 2022 at 8:43 AM
To: columbus at lists.osc.edu <columbus at lists.osc.edu>
Subject: Columbus Digest, Vol 112, Issue 7
Today's Topics:
1. Re: Installing Columbus with OpenMolcas (Felix Plasser)
----------------------------------------------------------------------
Message: 1
Date: Wed, 8 Jun 2022 13:42:00 +0000
From: Felix Plasser <F.Plasser at lboro.ac.uk>
To: "columbus at lists.osc.edu" <columbus at lists.osc.edu>
Subject: Re: [Columbus] Installing Columbus with OpenMolcas
Message-ID: <e6e72104-3234-81ce-a822-8ab312f5315a at lboro.ac.uk>
Content-Type: text/plain; charset="utf-8"
Hi, just a quick comment on this:
I would use "sockets" for single-node usage.
MPI-PR is better for multinode usage but, as far as I remember, MPI-PR
requires a slight modification in the source code (at some point we
should include that as a preprocessor option).
TCGMSG is more of a historical option, I think.
-Felix
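In the GA build itself, this choice corresponds (as far as I understand the configure options) to the port flags chosen at configure time; a minimal sketch, to be checked against ../configure --help of the GA version in use:

  ../configure --with-sockets ...   # sockets port, for single-node runs
  ../configure --with-mpi-pr  ...   # MPI-PR port, for multi-node runs (needs the
                                    # small source modification mentioned above)
  # --with-tcgmsg / --with-tcgmsg-mpi select the legacy TCGMSG options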
On 07/06/2022 22:01, Hans Lischka via Columbus wrote:
>
>
> Maybe you should request "static". There is a lib directory (or
> something similar) in the GA installation; you should find the libraries
> there.
> Don't use tcgmsg! Use MPI-PR
>
> On 6/7/2022 2:47 PM, Zihan Pengmei via Columbus wrote:
>>
>> Hi Felix,
>>
>> I followed the instructions, reinstalled the GA, and redownloaded the code,
>> but the error is still there. I think the major problem is that I don't have
>> those GA library files (libma, libglobal, libtcgmsg-mpi.a) after building
>> the GA package. I used the following command:
>>
>> ../configure MPICC=mpiicc CC=icc F77=ifort F90=ifort MPICXX=mpiicpc
>> MPIF77=mpiifort MPIF90=mpiifort CXX=icpc
>> --with-mpi=/software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274
>> --with-ma --with-global --with-tcgmsg --with-tcgmsg-mpi --with-sockets
>> --with-blas=-mkl --with-lapack=-mkl
>>
>> Any suggestions?
>>
>> Thanks,
>>
>> Zihan
>>
>> From: Zihan Pengmei <zpengmei at uchicago.edu>
>> Date: Tuesday, June 7, 2022 at 10:02 AM
>> To: columbus at lists.osc.edu <columbus at lists.osc.edu>
>> Subject: Re: Installing Columbus with OpenMolcas
>>
>> Hi all,
>>
>> Much appreciated for all the advice! I actually found this problem later
>> and successfully compiled the sequential version, but failed at
>> installing PARALLEL.
>>
>> I built an OpenMolcas with the intel_GA_mkl_mpi options and gave the
>> keyword PMOLCAS with its path. Just to clarify, I am building the 7.0.2
>> source code, and install.log.parallel warned that it can't find pciudg
>> under the source folder. I then copied the pciudg folder from the 7.0
>> source code, but that triggered the following error:
>>
>> (It seems those GA libraries are still needed, but I didn't find them
>> anywhere in my GA build; is there any special flag that I should add
>> when compiling?)
>>
>> (Sorry for bothering you with this; I am running on a cluster, so it's
>> not easy to use the precompiled 7.1 version.)
>>
>> ===== ./colinstall.sh: pciudg =====
>>
>> /zpengmei/columbus/Columbus/makefile:22: INCFILE=
>> /zpengmei/columbus/Columbus/machine.cfg/linux64.ifc
>>
>> /zpengmei/columbus/Columbus/makefile:50: GAVERSION=GA58:
>>
>> /zpengmei/columbus/Columbus/makefile:65: PMAKE= yes
>>
>> /zpengmei/columbus/Columbus/makefile:69: setting DKEYWORDSLOCAL 1 to
>> -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3
>> -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50
>> -DMOLCAS_LABEL10 -DPARALLEL -DMPI
>>
>> /zpengmei/columbus/Columbus/makefile:70: setting CPPDIRLOCAL 1 to -I
>> /zpengmei/columbus/Columbus/source/dalton/include -DCOLUMBUS -DSYS_LINUX
>> -DPTR64 -Df90 -DVAR_BLAS3 -DINT64 -DPARALLEL -DMPI
>>
>> /zpengmei/columbus/Columbus/makefile:81: setting DKEYWORDSLOCAL 3 to
>> -DJSC -DUNIX -DLINUX -DMILSTD1753 -DBIT64 -DFORMATDOLLAR -DBLAS2 -DBLAS3
>> -DPIPEMODE -DF90 -DF95 -DINT64 -DMOLCAS_INT64 -DNOENVSUPPORT -DGA50
>> -DMOLCAS_LABEL10 -DPARALLEL -DMPI -DMOLCAS -DMOLCAS_INT64
>>
>> /zpengmei/columbus/Columbus/makefile:82: LIBMOLCAS set to $LIBMOLCAS
>>
>> /zpengmei/columbus/Columbus/makefile:85: PLIBS=
>> /zpengmei/columbus/Columbus/libglobal.a /zpengmei/columbus/Columbus/libarmci.a
>> /zpengmei/columbus/Columbus/libtcgmsg-mpi.a /zpengmei/columbus/Columbus/libma.a
>> /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/lib/libmpi.a
>>
>> /zpengmei/columbus/Columbus/makefile:90: setting PLIBS to
>> /zpengmei/columbus/Columbus/libglobal.a /zpengmei/columbus/Columbus/libarmci.a
>> /zpengmei/columbus/Columbus/libtcgmsg-mpi.a /zpengmei/columbus/Columbus/libma.a
>> /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/lib/libmpi.a
>>
>>
>> /zpengmei/columbus/Columbus/makefile:91: setting LIBS to
>> /zpengmei/columbus/Columbus/colib.a /zpengmei/columbus/Columbus/blaswrapper.a
>>
>> -Wl,--start-group
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a
>>
>> -Wl,--end-group -lpthread -lm -ldl -Wl,--start-group
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a
>>
>> -Wl,--end-group -lpthread -lm -ldl
>>
>> /zpengmei/columbus/Columbus/makefile:92: setting LIBS2 to
>> /zpengmei/columbus/Columbus/colib.a /zpengmei/columbus/Columbus/libmolcas_col.a
>> /zpengmei/columbus/Columbus/blaswrapper.a
>>
>> -Wl,--start-group
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a
>>
>> -Wl,--end-group -lpthread -lm -ldl -Wl,--start-group
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a
>>
>> -Wl,--end-group -lpthread -lm -ldl
>>
>> /zpengmei/columbus/Columbus/makefile:105: cpp WITH_REDIRECTON ::
>>
>> $MPI_LD -o pciudg.x ciudg_data.o molcaswrapper.o ciudg_matutil.o
>> ciudg_gautil.o ciudg_ioutil.o ciudg_drtutil.o ciudg_utils.o ciudg_loop.o
>> ciudg_setup.o maksortmod.o cisrtmod.o ciudg_diag.o
>> ciudg_allin.o ciudg_fourex_mod.o ciudg_threx.o ciudg_twoext.o
>> ciudg_onext.o ciudg_allin.o ciudg_diag.o ciudg_spinorbit.o
>> ciudg_oneden.o ciudg_twoden.o ciudg_main.o drivercid.o driver.o
>> driversize.o driverwrap.o ciudg.o
>> /zpengmei/columbus/Columbus/libdalton2.a /zpengmei/columbus/Columbus/libglobal.a
>> /zpengmei/columbus/Columbus/libarmci.a /zpengmei/columbus/Columbus/libtcgmsg-mpi.a
>> /zpengmei/columbus/Columbus/libma.a
>> /software/intel/parallel_studio_xe_2018_update4/impi/2018.4.274/intel64/lib/libmpi.a
>>
>> -Wl,--start-group
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a
>>
>> -Wl,--end-group -lpthread -lm -ldl
>> /zpengmei/columbus/Columbus/colib.a /zpengmei/columbus/Columbus/libmolcas_col.a
>> /zpengmei/columbus/Columbus/blaswrapper.a
>>
>> -Wl,--start-group
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a
>>
>> -Wl,--end-group -lpthread -lm -ldl -Wl,--start-group
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_intel_ilp64.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_sequential.a
>>
>> /software/intel/parallel_studio_xe_2020_update1/mkl/lib/intel64/libmkl_core.a
>>
>> -Wl,--end-group -lpthread -lm -ldl
>>
>> gfortran: error: /zpengmei/columbus/Columbus/libglobal.a: No such file or directory
>>
>> gfortran: error: /zpengmei/columbus/Columbus/libtcgmsg-mpi.a: No such file or directory
>>
>> gfortran: error: /zpengmei/columbus/Columbus/libma.a: No such file or directory
>>
>> gmake: *** [/zpengmei/columbus/Columbus/makefile:388: pciudg.x] Error 1
>>
>> + for i in $*
>>
>> + cp pciudg.x /zpengmei/columbus/Columbus/pciudg.x
>>
>> cp: cannot stat 'pciudg.x': No such file or directory
>>
>> Much appreciated!
>>
>> Zihan Pengmei
>>
>> From: Columbus <columbus-bounces+zpengmei=uchicago.edu at lists.osc.edu>
>> on behalf of columbus-request at lists.osc.edu
>> <columbus-request at lists.osc.edu>
>> Date: Tuesday, June 7, 2022 at 9:18 AM
>> To: columbus at lists.osc.edu <columbus at lists.osc.edu>
>> Subject: Columbus Digest, Vol 112, Issue 2
>>
>>
>>
>> Today's Topics:
>>
>>    1. Re: Installing Columbus with OpenMolcas (Hans Lischka)
>>    2. Re: Installing Columbus with OpenMolcas (Felix Plasser)
>>
>>
>> ----------------------------------------------------------------------
>>
>> Message: 1
>> Date: Mon, 6 Jun 2022 13:03:40 -0500
>> From: Hans Lischka <hans.lischka at univie.ac.at>
>> To: columbus at lists.osc.edu
>> Subject: Re: [Columbus] Installing Columbus with OpenMolcas
>> Message-ID: <a4fb80f2-4d90-0e3a-19db-59944b0ac590 at univie.ac.at>
>> Content-Type: text/plain; charset="UTF-8"; format=flowed
>>
>> Hi Zihan,
>>
>> Here is my interpretation of the problem:
>> The file libma.a is missing in the loading process (it should be at
>> /zpengmei/columbus/Columbus/libma.a). This is a GA library. You should
>> find this file (and two others, like libga.a and a third one) in the
>> lib directory of the GA installation. Copy them to the main Columbus
>> directory.
>>
>> Best regards, Hans
>>
>> On 6/5/2022 9:35 PM, Zihan Pengmei via Columbus wrote:
>>> Dear Columbus developers,
>>>
>>> I am trying to compile Columbus 7.0.2, interfacing with OpenMolcas and
>>> SHARC, on the cluster (with the Intel compiler, MKL 2020, and Intel MPI
>>> 2018.4), but I get the following complaint:
>>>
>>> e_2020_update1/mkl/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread
>>> -lm -ldl /zpengmei/columbus/Columbus/libma.a
>>>
>>> ifort: error #10236: File not
>>> found:'/zpengmei/columbus/Columbus/libma.a'
>>>
>>> gmake: *** [/zpengmei/columbus/Columbus/makefile:345: molcas.a] Error 1
>>>
>>> Just to clarify, I did a git clone and compiled the OpenMolcas for
>>> Columbus 7 from GitLab: one parallel version with Intel MPI, MKL, the
>>> Intel compiler, and GA 5.8.2, and another serial version with the Intel
>>> compiler and MKL.
>>>
>>> In install.config, I tried both the MOLCAS and PMOLCAS keywords with the
>>> corresponding versions, but neither of them worked out, and this warning
>>> came up when installing standard.
>>>
>>> I checked openmolcas/libs and ga/libs and I didn't find libma.a
>>> anywhere. Any help would be very useful!
>>>
>>> Thanks,
>>>
>>> Zihan Pengmei
>>>
>>>
>>
>>
>>
>> ------------------------------
>>
>> Message: 2
>> Date: Tue, 7 Jun 2022 14:17:34 +0000
>> From: Felix Plasser <F.Plasser at lboro.ac.uk>
>> To: "columbus at lists.osc.edu" <columbus at lists.osc.edu>
>> Subject: Re: [Columbus] Installing Columbus with OpenMolcas
>> Message-ID: <0d3fc882-a1be-0890-4057-c315ab687fca at lboro.ac.uk>
>> Content-Type: text/plain; charset="utf-8"
>>
>> Hi, I just remembered what this is. libma.a is only relevant for older
>> versions of Molcas. You can deactivate libma.a by setting in
>> install.config:
>>
>> VMOLCAS 8.0
>>
>> But, in any case, I will make this the default. I don't think that
>> Columbus would be compatible with MOLCAS < 8 anyway ...
>>
>> -Felix