pub:forge [2018/06/21 16:07] weaverjon [Abaqus 2016]
pub:forge [2022/05/06 20:15] (current)
===== System Information =====
==== Software ====
The Forge was built with Rocks 6.1.1, which is built on top of CentOS 6.8. With the Forge we made the conversion from using PBS/Maui to using SLURM as our scheduler and resource manager.
==== Hardware ====

===Storage===
==General Policy Notes==

None of the cluster attached storage available to users is backed up in any way by us; this means that if you delete something and don't have a copy somewhere else, **it is gone**. Please note that the data stored on cluster attached storage is limited to Data Class 1 and 2 as defined by [[ https:// ]].

==Home Directories==

The Forge home directory storage is available from an NFS share, meaning your home directory is the same across the entire cluster. This storage provides 14 TB of raw storage, 13 TB of which is available to users, limited to 50 GB per user; this can be expanded upon request with proof of need. **This volume is not backed up, and we do not provide any data recovery guarantee in the event of a storage system failure.** System failures where data loss occurs are rare, but they do happen. All this to say, you **should not** be storing the only copy of your critical data on this system.
==Scratch Directories==

Each user will get a scratch directory created for them at /

Along with the networked scratch space, there is always local scratch on each compute node for use during calculations in /tmp. There is no quota placed on this space, and it is cleaned regularly as well, but files stored in this space are only available to processes executing on the node on which they were created. Meaning if you create a file in /tmp in a job, you won't be able to see it on the login node, and other processes won't be able to see it if they are on a different node than the process which created the file.
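A typical pattern with node-local /tmp can be sketched as follows. This is only an illustration; the directory and file names are hypothetical, and on the Forge these commands would normally live inside your sbatch script:

```shell
# Stage work through node-local scratch, then copy results back to shared storage.
WORKDIR=$(mktemp -d /tmp/scratch.XXXXXX)   # unique working dir on this node's local disk
echo "input data" > "$WORKDIR/input.dat"   # stand-in for copying real input from your home directory
# ... run your solver against the files in $WORKDIR here ...
cp "$WORKDIR/input.dat" ./results.dat      # copy anything you need back before the job ends
rm -rf "$WORKDIR"                          # clean up; /tmp is per-node and cleaned regularly anyway
```

Copying results back before the job finishes matters because once the job ends you have no guarantee of landing on the same node again.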
==Leased Space==

If home directory and scratch space availability aren't enough for your storage needs, we also lease out quantities of cluster attached space. If you are interested in leasing storage, please contact us. If you are already leasing storage but need a reference guide on how to manage it, please go [[ pub: ]].
==== Policies ====
An important concept for running on the cluster is modules. Unlike a traditional computer, where you can run every program from the command line after installing it, on the cluster we install the programs to a main "
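As a rough illustration of the concept (every path and tool name below is made up for the demo, not real Forge software), loading a module essentially amounts to putting a centrally installed program onto your PATH for the current session:

```shell
# Toy demo: software lives in a central tree; "loading" it prepends its bin dir to PATH.
SOFT=$(mktemp -d)/demo/bin                  # stand-in for the central software tree
mkdir -p "$SOFT"
printf '#!/bin/sh\necho hello from mytool\n' > "$SOFT/mytool"
chmod +x "$SOFT/mytool"                     # a fake installed program
export PATH="$SOFT:$PATH"                   # roughly what 'module load mytool' automates
mytool                                      # now resolvable from the command line
```

Real modules also set library paths and other environment variables, and `module unload` reverses the changes, but the PATH manipulation above is the core idea.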
Here is the output of module avail as of 02/01/2019:
<code>
------------------------------------------------- / -------------------------------------------------
AnsysEM/17/17             SAC/101.2
CST/2014        (D)       SEGY2ASCII/2.0       ansys/18.2         dock/6.6ib
CST/2016                  SU2/3.2.8            ansys/19.2
CST/2017                  SU2/4.1.3
CST/2018                  SU2/5.0.0            bison/3.1          flex/2.6.0         hadoop/
GMT/4.5.17
GMT/5.4.2
GMTSAR/5.6.test
GMTSAR/5.6
JWM/2.3.7                 abaqus/6.12-3  (D)   comsol/4.3a        fun3d/2014-12-16
LIGGGHTS/3.8
OpenFoam/2.3.1            abaqus/
ParaView
QtCreator/Qt5             amber/
R/3.1.2                   amber/
R/3.4.2                   ansys/14.0           cplot              gmp                maple/2016         mpc
R/3.5.1         (D)       ansys/15.0     (D)   cpmd
------------------------------------------------- / -------------------------------------------------
cilk/5.4.6                cmake/3.9.4          gmake/
cmake/3.2.1     (D)
cmake/3.6.2               cmake/3.13.1         gnu/4.9.3          gnu/6.4.0          gpi2/1.0.1
intel/2013_sp1.3.174      java/8u161           mono/3.12.0        pgi/11.4
------------------------------------------------- / -------------------------------------------------
intelmpi
mvapich2/2.2/intel/2015.2.164
mvapich2/2.2/
mvapich2/2.3/
mvapich2/gnu/4.9.2/eth
mvapich2/gnu/
mvapich2/intel/
mvapich2/
------------------------------------------------- / -------------------------------------------------
CUnit/
Qt5/5.11.3
atlas/gnu/
boost/gnu/1.51.0
boost/gnu/1.65.1-python2
</code>

Where:
   (D):  Default Module
==== Abaqus ====
  * Default Version = 6.12.3
  * Other versions available: 2019, 2018, 2016, 6.14.1, 6.11.2
<code>
module load abaqus/
module load abaqus/
module load abaqus/
module load abaqus/6.14-1
module load abaqus/
</code>
This example is for 6.12
==== Abaqus 2016 ====

<code>
module load abaqus/2016
</code>

<file bash abaqus2016.sbatch>
  * Default Version = 15.0
  * Other versions available: 14.0, 17.0, 18.0, 18.1, 18.2, 19.0, 19.2
=== Running the Workbench ===
  * Default Version = 17.0
  * Other installed version:
  * Also Called Ansys Electronics Desktop
rlm_roam
</code>
====Comsol====

===Licensing===

The licensing scheme for Comsol is seat based, which means that if you only have 1 license and you have it installed on both your workstation and the Forge, you can only use one or the other at any point in time. If your workstation has the license checked out when your Forge job starts, your job will fail to get a license and will fail to run.

This problem only gets compounded when you are also working with other users inside a group with a handful of shared seats. You will need to coordinate with them on when you and they will run so that you don't run into license problems.

===Running Batch===

Running Comsol with the Forge as your solver is fairly straightforward. First, load your module (module names for Comsol differ by which license pool you are using):

<code>
module load comsol/
</code>

Then create your batch file for submission.

<file bash comsol.sbatch>
#SBATCH --job-name=comsol_test
#SBATCH --nodes=1
#SBATCH --ntasks=20
#SBATCH --mem=20000
#SBATCH --time=60:
#SBATCH --export=ALL

comsol -np 20 batch -inputfile multilayer.mph -outputfile laminate_out.mph
</file>

Please note that Comsol is a memory intensive application,

We also advise creating the input file on a Windows workstation and using the Forge for simulation solving; however, running interactively should be possible inside an interactive job with X forwarding enabled.
====CST====
  * (Computer Simulation Technology)
  * Default version = 2014
  * Other versions available = 2015, 2016, 2017, 2018

Job Submission information
Default version = 2015
Other installed versions: 2016, 2017

Example:\\
==== Matlab ====

**IMPORTANT NOTE**
Currently campus has 100 Matlab seat licenses to be shared between the Forge and research desktops.

Matlab is available to run in batch form or interactively on the cluster.
  * Default version = 2014a
  * Other installed version:
=== Interactive Matlab ===
===Python 3===

There are many modules available for Python 3. However, unlike Python 2.7, all of the Python modules are installed in the Python directory rather than being separate modules. Currently, the newest Python 3 version available on the Forge is Python 3.6.4. To see a current list of available Python versions, run the command <

***NOTE***
The default versions of Python on the Forge do ***NOT*** have pip installed.
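One common workaround, sketched below under the assumption that a python3 with the standard venv module is on your PATH (the install location used here is hypothetical; something like $HOME/myenv is typical on a cluster), is a personal virtual environment, which provides its own pip without any system install:

```shell
# Create a personal virtual environment; its pip is independent of the system python.
VENV=$(mktemp -d)/myenv        # hypothetical location for the demo
python3 -m venv "$VENV"        # the venv ships with its own pip
. "$VENV/bin/activate"         # 'python' and 'pip' now resolve to the venv's copies
pip --version                  # works even though the system python lacks pip
```

Packages installed with this pip land inside the venv, so they also sidestep any need for administrator rights on the cluster.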
====QMCPack====
Other working versions:
  * 13.04
  * 12.06
  * 11.04
  * 11.02.009