===== System Information =====
==== Software ====
The Forge was built with Rocks 6.1.1, which is built on top of CentOS 6.8. With the Forge we made the conversion from PBS/Maui to SLURM as our scheduler and resource manager.

==== Hardware ====
==Scratch Directories==

Each user will get a scratch directory created for them at /mnt/stor/scratch/$USER; an alias, `cdsc`, has also been created so users can cd directly to this location. As with all storage, scratch space is not backed up; moreover, because data in this location is impermanent, it is actively cleaned to keep the storage from filling up. This space is intended for temporary files that your programs create and that you may need to keep for only a short time after a calculation completes. The volume is a high-speed, network-attached scratch space. There are currently no quotas on the directories in this scratch space, but if the 20TB volume filling up becomes a problem we will have to implement quotas.

Along with the networked scratch space, there is also local scratch on each compute node, in /tmp, for use during calculations. There is no quota placed on this space, and it is cleaned regularly as well, but files stored there are only available to processes executing on the node on which they were created. That means if you create a file in /tmp in a job, you won't be able to see it from the login node, and other processes won't be able to see it if they are running on a different node than the process which created the file.
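As an illustration, a hypothetical job fragment (the file names are placeholders, not a Forge-specific recipe) can do its heavy I/O in node-local /tmp and then copy anything worth keeping to the network scratch space before the job ends:

<code bash>
# Hypothetical job fragment -- paths follow the conventions described above.
WORKDIR=$(mktemp -d /tmp/${USER}_job.XXXXXX)   # node-local scratch in /tmp
cd "$WORKDIR"

# ... run the calculation here, writing temporary files into $WORKDIR ...

# Copy only the results you need to the network scratch space,
# remembering that scratch is cleaned and not backed up.
cp result.dat /mnt/stor/scratch/$USER/
cd /
rm -rf "$WORKDIR"                              # clean up the node-local space
</code>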
An important concept for running on the cluster is modules. Unlike a traditional computer, where you can run every program from the command line once it is installed, on the cluster we install programs to a central "repository", so to speak, and you load only the ones you need as modules. To see which modules are available for you to use, type "module avail". Once you find the module you need, type "module load <module>", where <module> is the name you found in the module avail list. You can see which modules you already have loaded by typing "module list".

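For example, the workflow above looks like this in a shell session (the module name here is only illustrative):

<code bash>
module avail                # list every module you can load
module avail matlab         # narrow the listing to one program
module load matlab/2018b    # load a specific version
module list                 # show what is currently loaded
module unload matlab/2018b  # drop a module when you are done with it
</code>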
Here is the output of module avail as of 02/01/2019:
<file>
---------------------------------------- /share/apps/modulefiles/mst-app-modules ----------------------------------------
AnsysEM/17/17 SAC/101.2 ansys/17.0 curl/7.63.0 gnuplot matlab/2012 msc/nastran/2014 rlwrap/0.43 trinity
AnsysEM/19/19 SAC/101.6a (D) ansys/18.0 dirac gperf/3.1 matlab/2013a msc/patran/2014 rum valgrind/3.12
CORE/06.05.92 SAS/9.4 ansys/18.1 dock/6.6eth gridgen matlab/2014a (D) mseed2sac/2.2 samtools/0.1.19 valgrind/3.14 (D)
CST/2014 (D) SEGY2ASCII/2.0 ansys/18.2 dock/6.6ib (D) gromacs matlab/2016a namd/2.9 scipy vasp/4.6
CST/2015 SEISAN/10.5 ansys/19.0 espresso gv/3.7.4 matlab/2017a namd/2.10 (D) siesta vasp/5.4.1 (D)
CST/2016 SU2/3.2.8 ansys/19.2 feh/2.28 hadoop/1.1.1 matlab/2017b namd/2.12b spark vim/8.0.1376
CST/2017 SU2/4.1.3 apbs fio/3.12 hadoop/1.2.1 matlab/2018a nwchem/6.6 sparta visit/2.8.2
CST/2018 SU2/5.0.0 bison/3.1 flex/2.6.0 hadoop/2.6.0 (D) matlab/2018b octave/3.8.2 spin visit/2.9.0 (D)
GMT/4.5.17 SU2/6.0.0 (D) bowtie freefem++ hk/1.3 mcnp/mcnp-nic (D) octave/4.2.1 (D) starccm/6.04 vmd
GMT/5.4.2 (D) SeisUnix/44R2 cadence/cadence fun3d/custom (D) htop/2.0.2 mcnp/6.1 openssl/1.1.1a starccm/7.04 voro++/0.4.6
GMTSAR/gmon.out SeismicHandler/2013a calcpicode fun3d/12.2 imake mctdh overture starccm/8.04 (D) vulcan
GMTSAR/5.6.test a2ps/4.14 casino fun3d/12.3 jdupes/1.10.4 metis/5.1.0 packmol starccm/10.04 x11/1.20.1
GMTSAR/5.6 (D) abaqus/6.11-2 cmg fun3d/12.4 lammps/9Dec14 molpro/2010.1.nightly parallel/20180922 starccm/10.06 xdvi/22.87.03
JWM/2.3.7 abaqus/6.12-3 (D) comsol/4.3a fun3d/2014-12-16 lammps/17Nov16 molpro/2010.1.25 parmetis/4.0.3 starccm/11.02 yade/gmon.out
LIGGGHTS/3.8 abaqus/6.14-1 comsol/4.4 (D) gamess lammps/23Oct2017 (D) molpro/2012.1.nightly (D) proj starccm/11.04 yade/2017.01a (D)
OpenFoam/2.3.1 abaqus/2016 comsol/5.2a_deng gaussian lammps/24Jul17 molpro/2012.1.9 psi4 starccm/12.02 zpaq/7.15
OpenFoam/2.4.x abaqus/2018 comsol/5.2a_park gaussian-09-e.01 lammps/30July16 molpro/2015.1.source.block qe starccm/12.06
OpenFoam/5.x abinit/8.6.3 comsol/5.2a_yang gccmakedep/1.0.3 linpack molpro/2015.1.source.s qmcpack/3.0 starccm/13.04
OpenFoam/6.x (D) abinit/8.10.1 (D) comsol/5.2a_zaeem gdal lz4/1.8.3 molpro/2015.1.source qmcpack/2014 tecplot/2013
ParaView accelrys/material_studio/6.1 comsol/5.3a_park gdk-pixbuf/2.24.1 madagascar/2.0 molpro/2015.1 qmcpack/2016 (D) tecplot/2014 (D)
QtCreator/Qt5 amber/12 comsol/5.4_park geos maple/16 moose/3.6.4 quantum-espresso/6.1 tecplot/2016.old
R/3.1.2 amber/13 (D) cp2k git/2.20.1 maple/2015 (D) mpb-meep rdkit/2017-03-3 tecplot/2016
R/3.4.2 ansys/14.0 cplot gmp maple/2016 mpc rdseed/5.3.1 tecplot/2017r3
R/3.5.1 (D) ansys/15.0 (D) cpmd gnu-tools maple/2017 msc/adams/2015 rheotool/3.0 tecplot/2017

---------------------------------------- /share/apps/modulefiles/mst-compiler-modules ----------------------------------------
cilk/5.4.6 cmake/3.9.4 gmake/4.2.1 gnu/5.4.0 gnu/7.1.0 gpi2/1.1.1 (D) intel/2015.2.164 (D) java/9.0.4 (D) ninja/1.8.2 pgi/RCS/11.4,v python/2.7.14 python/3.5.2 python/3.7.1 (D)
cmake/3.2.1 (D) cmake/3.11.1 gnu/4.9.2 (D) gnu/6.3.0 gnu/7.2.0 intel/2011_sp1.11.339 intel/2018.3 julia/0.6.2 perl/5.26.1 python/gmon.out python/2.7.15 python/3.6.2
cmake/3.6.2 cmake/3.13.1 gnu/4.9.3 gnu/6.4.0 gpi2/1.0.1 intel/2013_sp1.3.174 java/8u161 mono/3.12.0 pgi/11.4 python/2.7.9 python/3.5.2_GCC python/3.6.4

---------------------------------------- /share/apps/modulefiles/mst-mpi-modules ----------------------------------------
intelmpi mvapich2/pgi/11.4/eth openmpi/1.10.7/intel/2011_sp1.11.339 openmpi/2.0.3/gnu/7.2.0 (D) openmpi/2.1.1/gnu/7.2.0 (D) openmpi/intel/13/eth
mvapich2/2.2/intel/2015.2.164 mvapich2/pgi/11.4/ib (D) openmpi/1.10.7/intel/2013_sp1.3.174 openmpi/2.0.3/intel/2011_sp1.11.339 openmpi/2.1.1/intel/2011_sp1.11.339 openmpi/intel/13/ib (D)
mvapich2/2.2/intel/2018.3 (D) openmpi openmpi/1.10.7/intel/2015.2.164 (D) openmpi/2.0.3/intel/2013_sp1.3.174 openmpi/2.1.1/intel/2013_sp1.3.174 openmpi/intel/15/eth
mvapich2/2.3/gnu/7.2.0 openmpi/1.10.6/gnu/4.9.3 openmpi/1.10/gnu/4.9.2 openmpi/2.0.3/intel/2015.2.164 openmpi/2.1.1/intel/2015.2.164 openmpi/intel/15/ib (D)
mvapich2/2.3/intel/2018.3 openmpi/1.10.6/gnu/5.4.0 openmpi/1.10/intel/15 openmpi/2.0.3/intel/2018.3 (D) openmpi/2.1.1/intel/2018.3 (D) openmpi/pgi/11.4/eth
mvapich2/gnu/4.9.2/eth openmpi/1.10.6/gnu/6.3.0 openmpi/2.0.2/gnu/4.9.3 openmpi/2.1.0/gnu/4.9.3 openmpi/2.1.5/intel/2018.3 openmpi/pgi/11.4/ib (D)
mvapich2/gnu/4.9.2/ib (D) openmpi/1.10.6/gnu/7.1.0 openmpi/2.0.2/gnu/5.4.0 openmpi/2.1.0/gnu/5.4.0 openmpi/3.1.2/intel/2018.3 rocks-openmpi
mvapich2/intel/11/eth openmpi/1.10.6/gnu/7.2.0 (D) openmpi/2.0.2/gnu/6.3.0 openmpi/2.1.0/gnu/6.3.0 openmpi/gnu/4.9.2/eth rocks-openmpi_ib
mvapich2/intel/11/ib (D) openmpi/1.10.6/intel/2011_sp1.11.339 openmpi/2.0.2/gnu/7.1.0 (D) openmpi/2.1.0/gnu/7.1.0 (D) openmpi/gnu/4.9.2/ib (D)
mvapich2/intel/13/eth openmpi/1.10.6/intel/2013_sp1.3.174 openmpi/2.0.2/intel/2011_sp1.11.339 openmpi/2.1.0/intel/2011_sp1.11.339 openmpi/intel/11/backup
mvapich2/intel/13/ib (D) openmpi/1.10.6/intel/2015.2.164 (D) openmpi/2.0.2/intel/2013_sp1.3.174 openmpi/2.1.0/intel/2013_sp1.3.174 openmpi/intel/11/eth
mvapich2/intel/15/eth openmpi/1.10.7/gnu/6.4.0 openmpi/2.0.2/intel/2015.2.164 (D) openmpi/2.1.0/intel/2015.2.164 (D) openmpi/intel/11/ib (D)
mvapich2/intel/15/ib (D) openmpi/1.10.7/gnu/7.2.0 (D) openmpi/2.0.3/gnu/6.4.0 openmpi/2.1.1/gnu/6.4.0 openmpi/intel/13/backup

---------------------------------------- /share/apps/modulefiles/mst-lib-modules ----------------------------------------
CUnit/2.1-3 cryptopp/7.0.0 glibc/2.23 (D) libtiff/5.3.0 netcdf/openmpi_ib/gnu/4.3.2 (D) python/modules/theano/0.8.0
Imlib2/1.4.4 docbook/4.1.2 glpk/4.63 libxcb/0.4.0 netcdf/openmpi_ib/intel/3.6.2 python/modules/tinyarray/1.0.5
Qt5/5.6.3 eigen/3.3.4 gtk-doc/1.9 libxcomposite/0.4.4 netcdf/openmpi_ib/intel/4.3.2 (D) readline/7.0-alt
Qt5/5.11.3 (D) expat/2.2.6 gts/0.7.6 libxdmcp/1.1.2 pixman/0.34.0 readline/7.0 (D)
arpack fftw/3.3.6/gnu/7.1.0 hdf/4/gnu/2.10 libxfont/1.5.4 python/modules/argparse/1.4.0 sgao
atlas/gnu/3.10.2 fftw/mvapich2_ib/gnu4.9.2/2.1.5 hdf/4/intel/2.10 libxkbfile/1.0.9 python/modules/decorator/4.0.11 snappy/1.1.7
atlas/intel/3.10.2 fftw/mvapich2_ib/gnu4.9.2/3.3.4 (D) hdf/4/pgi/2.10 libxpm/3.5.12 python/modules/hostlist/1.15 vtk/6.1
autoconf-archive/2018.03.13 fftw/mvapich2_ib/intel2015.2.164/2.1.5 hdf/5/mvapich2_ib/gnu/1.8.14 libxtst/1.2.2 python/modules/kwant/1.0.5 vtk/8.1.1 (D)
boost/gnu/gmon.out fftw/mvapich2_ib/intel2015.2.164/3.3.4 (D) hdf/5/mvapich2_ib/intel/1.8.14 loki/gmon.out python/modules/lasagne/1 xbitmaps/1.1.2
boost/gnu/1.51.0 fftw/mvapich2_ib/pgi11.4/2.1.5 hdf/5/mvapich2_ib/pgi/1.8.14 loki/0.1.7 (D) python/modules/matplotlib/1.4.2 xcb/lib/1.13
boost/gnu/1.53.0 fftw/mvapich2_ib/pgi11.4/3.3.4 (D) hdf/5/openmpi_ib/gnu/1.8.14 mkl/2011_sp1.11.339 python/modules/mpi4py/2.0.1 xcb/proto/1.13
boost/gnu/1.55.0 (D) fftw/openmpi_ib/gnu4.9.2/2.1.5 hdf/5/openmpi_ib/intel/1.8.14 mkl/2013_sp1.3.174 python/modules/mpmath/0.19 xcb/util/0.4.0
boost/gnu/1.62.0 fftw/openmpi_ib/gnu4.9.2/3.3.4 (D) hdf/5/openmpi_ib/pgi/1.8.14 mkl/2015.2.164 (D) python/modules/networkx/1.11 xcb/util/image/0.4.0
boost/gnu/1.65.1-python2 fftw/openmpi_ib/intel2015.2.164/2.1.5 lapack/3.8.0 mkl/2018.3 python/modules/nolearn/0.6 xcb/util/keysyms/0.4.0
boost/gnu/1.65.1 fftw/openmpi_ib/intel2015.2.164/3.3.4 (D) libdrm/2.4.94 ncurses/6.1 python/modules/numpy/1.10.0 xcb/util/renderutil/0.3.9
boost/gnu/1.69.0 fftw/openmpi_ib/pgi11.4/2.1.5 libepoxy/1.5.2 netcdf/mvapich2_ib/gnu/3.6.2 python/modules/obspy/1.0.4 xcb/util/wm/0.4.1
boost/intel/1.55.0 (D) fftw/openmpi_ib/pgi11.4/3.3.4 (D) libfontenc/1.1.3 netcdf/mvapich2_ib/gnu/4.3.2 (D) python/modules/opencv/3.1.0 xorg-macros/1.19.1
boost/intel/1.62.0 freeglut/3.0.0 libgbm/2.1 netcdf/mvapich2_ib/intel/3.6.2 python/modules/pylab/0.1.4 xorgproto/2018.4
cgns/openmpi_ib/gnu (D) glibc/2.14 libqglviewer/2.7.0 netcdf/mvapich2_ib/intel/4.3.2 (D) python/modules/scipy/0.16.0 xtrans/1.3.5
cgns/openmpi_ib/intel glibc/2.18 libqglviewer/2.7.1 (D) netcdf/openmpi_ib/gnu/3.6.2 python/modules/sympy/1.0

Where:
 D: Default Module
</file>

==== Abaqus ====
  * Default Version = 6.12.3
  * Other versions available: 2019, 2018, 2016, 6.14.1, 6.11.2
<code>
module load abaqus/2019
module load abaqus/2018
module load abaqus/2016
module load abaqus/6.14-1
module load abaqus/6.12-3
</code>

This example is for 6.12
==== Abaqus 2016 ====

<code>
module load abaqus/2016
</code>

<file bash abaqus2016.sbatch>
rlm_roam
</file>

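The file above is truncated in this revision. As a rough sketch only (the job name, input deck, and resource numbers are placeholders, and license and queue details will differ on the Forge), an Abaqus batch submission generally looks something like:

<file bash abaqus_example.sbatch>
#!/bin/bash
#SBATCH --job-name=abaqus_test
#SBATCH --nodes=1
#SBATCH --ntasks=8
#SBATCH --time=12:00:00
#SBATCH --export=ALL

module load abaqus/2016
# "interactive" keeps abaqus in the foreground so SLURM tracks the run;
# myinput.inp is a placeholder input deck.
abaqus job=mytest input=myinput.inp cpus=8 interactive
</file>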
====Comsol====

===Licensing===

The licensing scheme for Comsol is seat based, which means that if you have only one license and it is installed on both your workstation and the Forge, you can use only one or the other at any point in time. If your workstation has the license checked out when your Forge job starts, the job will fail to get a license and will fail to run.

This problem is compounded when you are working with other users in a group that shares a handful of seats. You will need to coordinate with them on when each of you will run so that you don't run into license problems.

===Running Batch===

Running Comsol with the Forge as your solver is fairly straightforward. First, load your module (module names for Comsol differ by which license pool you are using):
<code>
module load comsol/5.4_$pool
</code>

Then create your batch file for submission.

<file bash comsol.sbatch>
#!/bin/bash
#SBATCH --job-name=comsol_test
#SBATCH --nodes=1
#SBATCH --ntasks=20
#SBATCH --mem=20000
#SBATCH --time=60:00:00
#SBATCH --export=ALL

comsol -np 20 batch -inputfile multilayer.mph -outputfile laminate_out.mph
</file>

Please note that Comsol is a memory-intensive application; you will likely have to adjust the values for mem and ntasks to suit what your simulation needs.

We advise creating the input file on a Windows workstation and using the Forge for solving the simulation; however, running interactively should be possible inside an interactive job with X forwarding enabled.

====CST====
This contains the output of the finished job.\\
==== Matlab ====

**IMPORTANT NOTE**
Currently campus has 100 Matlab seat licenses to be shared between the Forge and research desktops. There are certain times of the year when Matlab usage is quite high. License check-out is on a first come, first served basis. If you are not able to get a Matlab license, you might consider using GNU Octave; it is available on the Forge and will do much of what Matlab will do.

Matlab is available to run in batch form or interactively on the cluster.
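As a rough sketch of the batch form (the script name and resource numbers are placeholders, not a Forge-tested recipe), a Matlab job can be submitted with a file along these lines:

<file bash matlab_example.sbatch>
#!/bin/bash
#SBATCH --job-name=matlab_test
#SBATCH --ntasks=1
#SBATCH --time=1:00:00
#SBATCH --export=ALL

module load matlab/2018b
# Run myscript.m (a placeholder) without the desktop GUI, then exit.
matlab -nodisplay -nosplash -r "myscript; exit"
</file>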
===Python 3===
There are many modules available for Python 3. However, unlike Python 2.7, all of the Python modules are installed in the python directory rather than being separate modules. Currently, the newest Python 3 version available on the Forge is Python 3.7.1. To see a current list of available Python versions, run the command <code>module avail python</code> To see a list of all Python modules available for a particular Python version, load that Python version and run <code>pip list</code> Users are welcome to install additional modules on their account using <code>pip install --user <package name></code>

**NOTE**
The default versions of Python on the Forge do **NOT** have pip installed. You will have to load a Python module to get pip functionality.
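Putting this together in one session (the version and package name are only examples):

<code bash>
module load python/3.7.1      # loaded python modules provide pip
pip list                      # packages bundled with this python version
pip install --user requests   # install an extra package under your home directory
</code>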
====QMCPack====
