
Sun Compilers

1. Sun Compilers

To use the Sun compilers, ClusterTools and the Intel compilers, run the following on the command line:

user@login02:~> module add sunstudio

user@login02:~> module add clustertools

user@login02:~> module add intel-XE/11.1   (or intel-XE/12.0 or intel-XE/13.0)

These commands add the Sun compilers and ClusterTools (including an MPI library built to run over InfiniBand, together with the MPI compiler wrappers) to your path.
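
As a quick sanity check after loading the modules, you can ask the shell which mpicc it now finds and compile a small MPI test program. This is only a sketch: hello.c is a placeholder source file name, not something provided on the system.

user@login02:~> which mpicc
user@login02:~> mpicc -o hello hello.c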

Path to SUN MPI compilers:

Code Name   Directory                                           Notes
mpicc       /opt/gridware/sun-hpc-ct-8.2-Linux-sun/bin/mpicc    MPI Sun C compiler
mpicxx      /opt/gridware/sun-hpc-ct-8.2-Linux-sun/bin/mpicxx   MPI Sun C++ compiler
mpif77      /opt/gridware/sun-hpc-ct-8.2-Linux-sun/bin/mpif77   MPI Sun Fortran 77 compiler
mpif90      /opt/gridware/sun-hpc-ct-8.2-Linux-sun/bin/mpif90   MPI Sun Fortran 90 compiler

 

2. GNU Compilers

Path to GNU compilers:

gcc: /usr/bin/gcc

gfortran: /usr/bin/gfortran
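
As a minimal sketch of serial compilation with these compilers (the source file names and the -O2 optimisation flag are only illustrative):

user@login02:~> gcc -O2 -o myprog myprog.c
user@login02:~> gfortran -O2 -o myprog_f myprog.f90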

Path to GNU MPI compilers:

Code Name   Directory                                           Notes
mpicc       /opt/gridware/sun-hpc-ct-8.2-Linux-gnu/bin/mpicc    MPI gcc C compiler
mpicxx      /opt/gridware/sun-hpc-ct-8.2-Linux-gnu/bin/mpicxx   MPI g++ C++ compiler
mpif77      /opt/gridware/sun-hpc-ct-8.2-Linux-gnu/bin/mpif77   MPI gfortran Fortran 77 compiler
mpif90      /opt/gridware/sun-hpc-ct-8.2-Linux-gnu/bin/mpif90   MPI gfortran Fortran 90 compiler

 

3. Intel Compilers

Path to Intel MPI compilers:

Code Name   Directory                                             Notes
mpicc       /opt/gridware/sun-hpc-ct-8.2-Linux-intel/bin/mpicc    MPI icc C compiler
mpicxx      /opt/gridware/sun-hpc-ct-8.2-Linux-intel/bin/mpicxx   MPI icpc C++ compiler
mpif77      /opt/gridware/sun-hpc-ct-8.2-Linux-intel/bin/mpif77   MPI ifort Fortran 77 compiler
mpif90      /opt/gridware/sun-hpc-ct-8.2-Linux-intel/bin/mpif90   MPI ifort Fortran 90 compiler

 


FAQ

SUN Cluster

Q: How do I log in to the SUN cluster?
A: Logging in - via Secure Shell

Q: How do I change my password?
A: Logging in - Changing your password

Q: What are the compilers on SUN?
A: Compilers on SUN

Q: How do I Submit/Run Jobs?
A: Submit/Run on Sun


Running Jobs

PBS Workload Manager

All jobs on the GPU and SUN clusters are scheduled by PBS Pro.

How to submit on the GPU cluster:

1. Compile your code.

2. Run your submit script. (Please click here to view example scripts for customization; a minimal sketch is also shown below, after the Moab Job Submit command.)

Please note that you need an MPI installation in order to run an MPI program. The system-installed MPI is under /GPU/opt/open-mpi-new/.

Export the MPI paths using the following commands, or add these lines to your .profile:

 

1. export PATH=/GPU/opt/open-mpi-new/bin:$PATH

2. export LD_LIBRARY_PATH=/GPU/opt/open-mpi-new/lib:$LD_LIBRARY_PATH

 

Partitions available on GPU:

1. C2070

2. C1060

 

Moab Job Submit:

msub scriptname -l feature=feature-name

This allows users to submit jobs directly to Moab.
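
For illustration only, a minimal GPU submit script might look like the sketch below. The job name, node and processor counts, walltime, script file name (my_gpu_job.pbs) and program name (my_mpi_program) are placeholders and should be adapted to your job; the example scripts linked above remain the reference.

#!/bin/bash
#PBS -N my_gpu_job
#PBS -l nodes=1:ppn=8
#PBS -l walltime=01:00:00
# NOTE: the job name, resource request, walltime and program name are placeholders
cd $PBS_O_WORKDIR
export PATH=/GPU/opt/open-mpi-new/bin:$PATH
export LD_LIBRARY_PATH=/GPU/opt/open-mpi-new/lib:$LD_LIBRARY_PATH
mpirun -np 8 ./my_mpi_program

The script would then be submitted to one of the partitions listed above with, for example:

user@login02:~> msub my_gpu_job.pbs -l feature=C2070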


 


Moab job submit on SUN:

1. Compile your code. (Please click here for instructions on how to compile on SUN.)

2. Run your submit script. (Please click here to view example scripts for customization.)

Partitions available on SUN:

1. nehalem

2. westmere

3. dell

4. sparc

5. test

6. viz

Moab Job Submit:

msub scriptname -l feature=feature-name

This allows users to submit jobs directly to Moab.
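
For example, to submit to one of the SUN partitions listed above (the script name myjob.pbs and the choice of partition are placeholders):

user@login02:~> msub myjob.pbs -l feature=westmere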



How to cancel jobs

To cancel jobs on the GPU and SUN clusters:

mjobctl -c jobid

This command selectively cancels the specified job(s) (active, idle, or non-queued) from the queue.
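
For example, if the scheduler reported your job ID as 12345 (a placeholder), the job would be cancelled with:

user@login02:~> mjobctl -c 12345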


Debugging

The following debugger wrappers are available for MPI programs: mpirun_dbg.dbx, mpirun_dbg.ddd and mpirun_dbg.gdb.


Monitoring

For monitoring on the nodes, use one of:

  • nmon
  • vmstat
  • top
  • xloadl (X11)

and, of course, ps and free.
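
Typical invocations look like the following (the vmstat interval and report count are only illustrative):

vmstat 5 10    # one report every 5 seconds, 10 reports
free -m        # memory usage in megabytes
top            # interactive process monitor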


Scientific codes running on the Sun cluster

Sun Microsystems cluster

GROMACS, version 4.0.5, installed in /opt/gridware/gromacs
A molecular dynamics package primarily designed for biomolecular systems. For more information, click HERE.

DL_POLY, versions 3.07 and 2.18, installed in /opt/gridware/dlpoly
A general-purpose serial and parallel molecular dynamics simulation package. This version of DL_POLY has a wider range of structure optimisation features to help with setting up the starting configuration. For more information about this code, click HERE; to see the script for running DL_POLY 3.07 and 2.18 on the Sun, e1350 and BG/P systems, click HERE.

EMBOSS, version 6.2.0, installed in /opt/gridware/EMBOSS
EMBOSS is an open-source software package developed to meet the needs of the molecular biology community. For more information about this package, click HERE.

ATLAS, version 3.9, installed in /opt/gridware/atlas3.9
ATLAS (Automatically Tuned Linear Algebra Software) is an automatically tuned linear algebra library. For more information about this software, click HERE.

GAUSSIAN, version g09, installed in /opt/gridware/gaussian
Gaussian is an electronic structure calculation package. For more information about this software, click HERE; to see the script for running GAUSSIAN on the Sun, e1350 and BG/P systems, click HERE.

SEADAS, version 6.1, installed in /opt/gridware/SeaDas
SeaDAS is a comprehensive image analysis package. For more information about this package, click HERE.


 

Graphical Processing Unit (GPU)

EMBOSS, version 6.3.1, installed in /GPU/opt/emboss-intel-new
Intel compilation: EMBOSS is an open-source software package developed to meet the needs of the molecular biology community. For more information on how to run EMBOSS on the GPU cluster, click HERE.

EMBOSS, version 6.3.1, installed in /GPU/opt/emboss-gcc-6.3
GCC compilation: EMBOSS is an open-source software package developed to meet the needs of the molecular biology community. For more information on how to run EMBOSS on the GPU cluster, click HERE.

NAMD, version 2.8, installed in /GPU/opt/namd/NAMD_2.8_Source/
NAMD is a free-of-charge molecular dynamics simulation package written using the Charm++ parallel programming model, noted for its parallel efficiency and often used to simulate large systems (millions of atoms). For more information, click HERE.

 


