QM/MM simulation crashed due to Segmentation fault

Hello everyone,

I am trying to run a QM/MM minimization using Gromacs 2022.1/CP2K 9.1, but it fails with a segmentation fault.

Steepest Descents:
Tolerance (Fmax) = 1.00000e+03
Number of steps = 1000
srun: error: s02r2b66: tasks 2-3,5,9,11: Segmentation fault
srun: Terminating job step 23458162.0
slurmstepd: error: *** STEP 23458162.0 ON s02r2b66 CANCELLED AT 2022-06-10T09:31:34 ***
srun: error: s02r2b66: tasks 0-1,4,6-8,10: Terminated
srun: error: s02r2b67: tasks 12-23: Terminated
srun: Force Terminated job step 23458162.0

My system is simple: a lipid bilayer with water, and the QM region is 14 water molecules. I suspect the problem is related to the system setup that Gromacs prepares for the CP2K calculation, specifically the QM box size. When I used:

qmmm-cp2k-qmmethod = PBE

I got a QM box of 46 × 46 × 46 Å³, while the size of the QM region measured in VMD is only about 13.3 × 9.3 × 27.5 Å³:

Main< (Chetan) 4 % measure minmax $qm
{30.100000381469727 15.4399995803833 19.190000534057617} {43.41999816894531 24.739999771118164 46.66999816894531}
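
(That is, from the minmax values: 43.42 - 30.10 ≈ 13.3 Å in x, 24.74 - 15.44 ≈ 9.3 Å in y, and 46.67 - 19.19 ≈ 27.5 Å in z.)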

Then I tried qmmm-cp2k-qmmethod = INPUT and set up a smaller QM box, but the simulation crashed for the same reason.

I don't know how Gromacs determines the QM box size. In another simple test, an ACE-Ala-NME dipeptide in water, the QM box size was set correctly and the simulation ran successfully.

So it seems the crash is caused by the large QM box that Gromacs requires, which in turn needs a large amount of memory. I don't understand why the QM box size is determined differently for the two systems.
I used 96 cores for the simulation; by default, 2 GB/core of memory is assigned to the job on the cluster.
The memory should be sufficient, as I have used the same number of cores to run QM/MM simulations with CP2K alone.

Could someone give some comments and suggestions? Thanks a lot!

All the best,
Qinghua

I tried switching off the SETTLE constraints on all water molecules, and the simulation crashed all the same.
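
(For context, with the standard Gromacs water topologies this is typically done through the FLEXIBLE define, assuming the water .itp has the usual #ifdef FLEXIBLE block around its [ settles ] section, i.e. adding

define = -DFLEXIBLE

to the mdp file.)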

The LJ parameters for the water hydrogens are zero. Should they be changed to non-zero values, so that the electron density does not overlap too much with them? Could this be a reason? Thanks!

All the best,
Qinghua

I tried another system, an enzyme + a ligand + water, and chose the ligand (40 atoms) as the QM region, using

qmmm-cp2k-qmmethod = PBE

The QM box size obtained by Gromacs is:
&CELL
A 19.496 0.000 0.000
B 0.000 19.496 0.000
C 0.000 0.000 19.496
PERIODIC XYZ
&END CELL

Still, the simulation crashed; the Slurm output file just reports a segmentation fault, and no error is reported in the CP2K output.

Here are the CP2K input file generated by Gromacs and the tpr file for the minimization:

https://www.dropbox.com/s/c6plag6kzjzzfxv/em.tpr?dl=0

I would really appreciate it if someone could help me fix this.

All the best,
Qinghua

This is correct behavior: the automatic box is determined as 1.5 times the largest distance in your QM system.
The QM system should be as compact as possible, and it is not recommended to mix it with MM atoms. Also, QM waters tend to fly away from the box, so your system will crash sooner or later.
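
For example, from the VMD minmax output above the QM waters span roughly 13.3 × 9.3 × 27.5 Å, so the largest inter-atomic distance is on the order of the bounding-box diagonal, about 32 Å; 1.5 × 32 Å ≈ 48 Å, which is in the same range as the ~46 Å cubic box that was generated. Likewise, 19.496 Å / 1.5 ≈ 13 Å is a plausible largest distance for the 40-atom ligand.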

That system works fine for me.

Have you tried systems from the tutorial? Are they working fine?

Thanks Dmitry!

The 14 water molecules are inside the lipid bilayer, so they are basically confined. If they fly away, I may put a wall potential on the QM box.

Does the Gromacs/CP2K interface only accept a cubic QM box? In my case, couldn't I just add an extra 5-7 Å of space around my QM region? 1.5 times the largest distance of the QM region leads to a very big QM box, which makes the QM calculation time-consuming.

All the best,
Qinghua

Thanks Dmitry!

I did not try the systems from the tutorial, as the files on GitHub are not available at the moment.
But I set up a system of ACE-Ala-NME in water, and it worked fine.

Very strange. I tried using 4 nodes (192 cores) again, but it failed anyway. Do you think there might be a problem with the installation (Gromacs 2022.1 + CP2K 9.1 + PLUMED 2.8)? How did you run the minimization from the em.tpr I provided? Thanks a lot!

All the best,
Qinghua

By default, yes. But you can use an arbitrary box with qmmm-cp2k-qmmethod = INPUT.
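
For example, roughly like this (the option names should be checked against the Gromacs 2022 QM/MM documentation; the file names, index-group name and cell values below are only placeholders):

qmmm-cp2k-active   = true
qmmm-cp2k-qmgroup  = QMatoms
qmmm-cp2k-qmmethod = INPUT

Then pass your own CP2K input to grompp, e.g.

gmx grompp -f em.mdp -c conf.gro -p topol.top -n index.ndx -qmi custom_cp2k.inp -o em.tpr

and inside custom_cp2k.inp you are free to set whatever cell you like:

&CELL
  A 20.000  0.000  0.000
  B  0.000 20.000  0.000
  C  0.000  0.000 35.000
  PERIODIC XYZ
&END CELL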

I just ran it straightforwardly with my installation, which is also Gromacs 2022.1 + CP2K 9.1 + PLUMED 2.8. Try running your em_cp2k.inp file with CP2K itself to check whether the problem lies in the CP2K installation.
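
For example, something along these lines (the binary name and launch command depend on your cluster setup; the output file name is just a placeholder):

srun cp2k.psmp -i em_cp2k.inp -o em_cp2k.out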

Thanks Dmitry!

After some more tests, it turned out that the installation had some problem; the simulations ran well on another cluster with a fresh installation.

For the QM/MM calculation, I understand that the QM part is computed by CP2K and the MM part by Gromacs. How is the QM-MM interaction calculated? I guess it is handled by both Gromacs and CP2K, since CP2K/GEEP is used, but CP2K does not read the Gromacs topology file. Could you give a few more details on how the communication between Gromacs and CP2K works in terms of the QM-MM interactions? Thanks a lot!

All the best,
Qinghua

The QM-MM coupling is indeed done with GEEP; CP2K only needs the point charges on the MM atoms, which are provided within the “_cp2k.pdb” file.
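
For orientation, the relevant pieces of the generated CP2K input look roughly like this (a sketch only; check the actual *_cp2k.inp that grompp writes, and note that the coordinate file name below is just a placeholder):

&FORCE_EVAL
  &QMMM
    ECOUPL GAUSS            ! GEEP electrostatic coupling
    ...
  &END QMMM
  &SUBSYS
    &TOPOLOGY
      CHARGE_EXTENDED TRUE  ! read the MM point charges from the extended column of the PDB
      COORD_FILE_FORMAT PDB
      COORD_FILE_NAME em_cp2k.pdb
    &END TOPOLOGY
  &END SUBSYS
&END FORCE_EVAL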

Thanks Dmitry!

I am running a QM/MM steered MD simulation using Gromacs 2022.1/CP2K 9.1/PLUMED 2.8.

During the simulations, PLUMED writes out the collective variables and biases, but the biases (including the wall potentials) provided by PLUMED seem to have no influence on the system; it behaves like free dynamics.

For this installation, PLUMED was patched into both CP2K and Gromacs, but PLUMED was activated through Gromacs. I am wondering whether there is some conflict. Any comments? Thanks a lot!
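
(By "activated through Gromacs" I mean at run time via the -plumed option that the PLUMED patch adds to mdrun, roughly like this, with the file names being placeholders:

gmx mdrun -deffnm smd -plumed plumed.dat)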

My files (smd.zip) are in Dropbox.

I have not used PLUMED with GROMACS myself, but QM/MM should not affect its functionality at all. Try using it without QM/MM to check whether everything is OK with it.

Thanks, I tried it without QM/MM, but it is the same: the metadynamics potentials were not passed to Gromacs either (a metadynamics simulation on the Ala dipeptide).
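
For reference, a minimal plumed.dat for such a dipeptide test looks roughly like this (the reference structure name and the hill parameters are placeholders):

MOLINFO STRUCTURE=dipeptide.pdb
phi: TORSION ATOMS=@phi-2
psi: TORSION ATOMS=@psi-2
metad: METAD ARG=phi,psi HEIGHT=1.2 SIGMA=0.35,0.35 PACE=500
PRINT ARG=phi,psi,metad.bias FILE=COLVAR STRIDE=100

The deposited bias should visibly push the sampled phi/psi values around, but here the dynamics looked unbiased.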

I am wondering whether this is related to how I installed Gromacs:

cmake … \
  -DBUILD_SHARED_LIBS=OFF \
  -DGMXAPI=OFF \
  -DGMX_INSTALL_NBLIB_API=OFF \
  -DGMX_DOUBLE=ON \
  -DGMX_MPI=OFF \
  -DGMX_FFT_LIBRARY=fftw3 \
  -DGMX_CP2K=ON \
  -DCMAKE_INSTALL_PREFIX=~/programs/Gromacs-CP2K/Gromacs2022.1 \
  -DCP2K_DIR=~/programs/Gromacs-CP2K/cp2k/lib/local/psmp/ \
  -DGMX_EXTERNAL_BLAS=ON \
  -DGMX_EXTERNAL_LAPACK=ON

Could some of the disabled features have an influence on PLUMED? I will try an installation of just Gromacs 2022.1 and PLUMED 2.9-dev. Thanks!

All the best,
Qinghua

Hello,

After some tests, it turns out that there are some bugs in the PLUMED interface (v2.8). Thanks!

All the best,
Qinghua