
Compiling and Linking Codes

Sun compilers

Please see this page for information on Sun compilers, ClusterTools and Intel compilers.

Compilers and Libraries

Code Name          Version   Directory                        Notes
------------------ --------- -------------------------------- ------------------------------------------------
gcc                4.5.1     /opt/gridware/compilers          with gmp
zlib               1.2.7     /opt/gridware/compilers          with gcc
ImageMagick        6.7.9     /opt/gridware/compilers          with intel 2012
NCO                4.2.1     /opt/gridware/compilers          with gcc-4.5.1, intel 11 and openmpi-1.4.2-intel
netcdf-gnu         4.1.2     /opt/gridware/libraries          with gcc
netcdf-intel       4.1.2     /opt/gridware/libraries          with intel 2012
mvapich2 (r5668)   1.8       /opt/gridware/libraries          with intel 2012
mvapich            2.1.8     /opt/gridware/libraries          with gcc
HDF5               1.8.9     /opt/gridware/compilers          with intel 11.1
OpenMPI            1.6.1     /opt/gridware/compilers/OpenMPI  with intel 2012
OpenMPI            1.6.1     /opt/gridware/compilers/OpenMPI  with gcc
FFTW               3.3.2     /opt/gridware/libraries          with intel 2012, using mvapich2 (r5668) MPI lib
FFTW               2.1.5     /opt/gridware/libraries          with intel 2012, using mvapich2 (r5668) MPI lib

 


 


Last Updated on Tuesday, 03 June 2014 14:16


Logging In

Guidelines on the following topics are given below.

 

CHPC Use Policies

Please make sure you have read and signed the CHPC Use Policy and returned it. Chances are you have already done so to get to this point.


Logging in via Secure Shell

CHPC systems use the UNIX operating system. A readme file covering all our clusters is available for download.

Most systems have an SSH client that may be used to log in to the CHPC. Linux and MacOS systems have this as standard, while PuTTY is a free downloadable client for MS-Windows.

Log in to the relevant system (GPU or Sun cluster) with your SSH client, optionally adding the -X command-line argument to enable X-windows display forwarding back to your local host. For example:

To log in to the GPU cluster (from anywhere on the internet):

ssh username@gpu.chpc.ac.za


Sun cluster login using Linux

1. Login from anywhere on the internet:

ssh username@sun.chpc.ac.za

2. Login from within the CSIR network:

ssh username@sun.chpc.ac.za

Login via PuTTY

1. Open putty.exe
2. Category: Session
3. Under "Host Name (or IP address)", enter:
   • sun.chpc.ac.za (from anywhere on the internet)
   • or: gpu.chpc.ac.za
4. Port: 22
5. Connection type: SSH
6. Saved Sessions: e.g. CHPC-SUN or CHPC-GPU
7. Close window on exit: Only on clean exit
8. Click Open
9. Enter your username [press Enter]
10. Enter your password [press Enter]

This will connect you to a shell on the login node of the cluster. From here you will be able to conduct almost all of your activities.


Directories

In unix, the root directory, / (forward slash), is the base of the file system. Other disk systems may be mounted at mount points under the root directory; these are normally on disk subsystems separate from the system directories containing the libraries and programs.

The directory in which a user's login session starts is the home directory.

In commands it may also be referred to by a short form, using the tilde symbol, ~.

The tilde is expanded by the shell to refer to the full directory path of the home directory, typically /GPU/home/username (GPU) or /export/home/username (Sun). This directory is owned by the user and contains files enabling correct startup of the user's session such as setting shell variables.

The current working directory may be referred to by its full pathname or . (dot), while the parent directory which is one level up is referred to by .. (double dot).

You may change to your home directory by typing cd on its own. Alternatively, you may refer to files in your home directory from a different working directory by using the tilde shortcut, e.g.:

cat ~/myfile.text

to display the contents of the file in /GPU/home/username/myfile.text (GPU) OR /export/home/username/myfile.text (Sun) on the console.

Tip: to change your working directory to the previous directory, type cd -
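The shortcuts above can be exercised in a short session; the demo directory name below is purely illustrative:

```shell
cd ~                # jump to the home directory
mkdir -p demo       # create an illustrative subdirectory
cd demo             # descend into it
cd ..               # .. takes you to the parent directory
cd - > /dev/null    # - returns to the previous directory (demo)
pwd                 # the printed path ends in /demo
```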


File permissions

In unix, file permissions for reading, writing and executing may be specified for three classes: owner, group and world. In this way access may be controlled. The chmod command changes a file or directory's permissions, and the chown command changes its ownership.
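As a short sketch of chmod in practice (the file name is illustrative): numeric modes set all three classes at once, while symbolic modes such as g+r adjust a single class.

```shell
touch results.dat        # create an illustrative file
chmod 600 results.dat    # owner: read+write; group and world: no access
chmod g+r results.dat    # add read permission for the group (now 640)
ls -l results.dat        # permission column reads -rw-r-----
```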


Disk space

The unix disk free command df shows filesystem free space and mount points. The '-h' command-line switch makes the output easier for a human to read.
For example, to show all free space on the GPU cluster:

% df -h

Filesystem            Size  Used Avail Use% Mounted on
/dev/md0               49G   18G   29G  38% /
tmpfs                  12G     0   12G   0% /dev/shm
/dev/gpfs              14T  942G   13T   7% /GPU

For example, to show all free space in Sun cluster:
% df -h

Filesystem                                      Size  Used Avail Use% Mounted on
/dev/sda3                                       119G   45G   68G  40% /
udev                                            7.9G  188K  7.9G   1% /dev
/dev/sda1                                       130M   25M   98M  21% /boot
172.17.203.15:/mnt/gridware                     1.6T  680G  865G  45% /opt/gridware
172.17.203.15:/mnt/home                         1.9T  717G  1.1T  41% /export/home
172.17.203.50:/scratch/work                     3.6T  1.8T  1.9T  50% /scratch/work
172.17.203.50:/scratch/home                     2.0T  210G  1.8T  11% /scratch/home
172.17.195.20@o2ib0:172.17.195.21@o2ib0:/lfs01   72T   15G   68T   1% /lustre/SCRATCH1
172.17.195.21@o2ib0:172.17.195.20@o2ib0:/lfs02   72T   13T   56T  19% /lustre/SCRATCH2
172.17.195.20@o2ib0:172.17.195.21@o2ib0:/lfs03   72T   38T   31T  55% /lustre/SCRATCH3
172.17.195.21@o2ib0:172.17.195.20@o2ib0:/lfs04   72T  2.9T   66T   5% /lustre/SCRATCH4

To show the total disk usage of the current directory, use the unix command du:
du -sh .

To show the total usage of a specified directory:
du -sh directoryname


Changing your password on the GPU or Sun cluster

To change your password, log in to gpu.chpc.ac.za (GPU) or sun.chpc.ac.za (Sun).

Type passwd (yppasswd on the GPU cluster) and follow the prompts: first enter your existing password, then the new password. You will be prompted twice for the new password to ensure correctness.

For example (GPU):

yppasswd
Please enter old (i.e. current) password:
Please enter new password:
Please re-enter new password:

For example (SUN cluster):

NB: You'll be asked to enter a new password and to confirm it.

username@login02:~>passwd username
Changing password for username.
Old Password:
New Password:
Reenter New Password:
Changing NIS password for username on batch01.
Password changed.

 

Choose a "strong" password, with mixed case alphabetic characters and digits. As per the CHPC agreements, please keep your password private and change it immediately if you suspect it has become known to anyone else.

If you have forgotten the password or otherwise cannot log in, you will have to request that your password be reset by the CHPC system admin.


Changing your login shell

To change your shell, type 'chsh' and follow the prompts. A list of valid shells is available in the text file /etc/shells, although the bash shell will suffice for most operations. Current shells include csh, ksh, tcsh and zsh.

For example, to change your shell to bash (note that chsh expects the full path to the shell):

chsh -s /bin/bash


Changing bash command line editing mode

By default, the bash shell is set up to use VI-style editing keys. To change, run the following, or add the following to your .bashrc file to execute on login:

set -o emacs

Useful keys in this mode:

  • ^a Start of line
  • ^e End of line
  • ^w Delete previous word
  • Up/Down arrow access command history

To revert to VI mode

set -o vi


Command-line completion

Pressing Tab performs automatic completion, both for commands available on the shell's PATH (the variable that determines which directories are searched for programs) and for file and directory names.
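In bash you can inspect the completions Tab would offer with the compgen builtin; the prefixes below are just examples:

```shell
compgen -c ls | sort -u | head   # commands on PATH beginning with "ls"
compgen -d / | head              # directories completing the prefix "/"
echo "$PATH"                     # the directories searched for commands
```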


Last Updated on Monday, 19 November 2012 13:31


CHPC Newsletter

First Edition

 

A note from the Director


The CHPC endeavours to communicate with its user base through multiple platforms, including the website and our new newsletter. As an indication of this, the centre’s website will be changing soon! Do not be alarmed: users can expect the same online support facility, as well as more interactive media in the form of a blog facility for research information sharing and a social media interface.

The newsletter will be emailed on a quarterly basis and will carry the latest developments at the CHPC, profiles of researchers and the nature of the work they are doing, among other things. We hope our users will utilise this newsletter as a tool to stay abreast of the latest developments at the CHPC.

Happy Sithole
Director
 

CHPC National Meeting

It is that time again when we are hard at work to bring together our CHPC community, leaders of industry in HPC and technology vendors. The meeting aims to gauge international trends in HPC applications, look at what South African researchers are doing and determine a way of keeping the country competitive (Industrial Advisory Council) in this industry. The national meeting takes place from 3 to 7 December 2012 at the Durban International Conference Centre. The 3rd and 4th of December will cover tutorials and forums on HPC, and 5 to 7 December will constitute the main conference days. The theme of this year’s conference is “HPC and Data Applications for Increased Impact on Research”, and our intention is to highlight successful applications of HPC.

I am excited to announce that the conference will host the finale of the first South African Student Cluster Build Competition. Four teams of five will compete against each other, and the winning team will represent South Africa at the 2013 International Supercomputing Conference in Germany. Thanks to a generous R150 000.00 sponsorship from Dell, the winning team will also travel to Dell’s headquarters in Austin, Texas, to meet Dell’s HPC development team and learn from them.

The Hotseat Industrial Session will take place on Friday, 07 December 2012, and proved to be a favoured session among last year’s conference delegates. Vendors have booked their places and are gearing up to face the scrutiny of our inquisitors.

I urge you to register for this conference by visiting www.chpcconf.co.za. The call for contributions closes on 26 October 2012.
 

NAG/CHPC Partnership

The centre has partnered with NAG (the Numerical Algorithms Group), a United Kingdom-based company, to assist CHPC users with their codes. The aim of this partnership is to help users tweak and scale their codes on the CHPC infrastructure.

The NAG High Performance Computing services include, among others: focused computer science and engineering (CSE) projects, mentoring and training of local CSE personnel, and advice and support in procurement processes. As part of this partnership, NAG and the CHPC hosted a workshop for the centre’s infrastructure users in September.

The aim of the workshop was to allow users to share their experiences with the codes they are running and to see how these could be optimised. As an example of the kind of service CHPC users can now expect, NAG took a user’s personally developed “Particle-In-Cell Simulations” code and optimised and parallelised it. The user employs the code, written in C++, to simulate waves in an electron-beam plasma. Initially, the code was compiled on the CHPC cluster with OpenMP and was limited to running on 8 processors (one node). The aim was to run on multiple nodes by introducing MPI. After the introduction of MPI, the code was successfully scaled from one to eight nodes (12 to 96 cores), with a run completing in 9.1 seconds, a notable achievement for the research.
 

First South African Cyber Infrastructure Committee Meeting

In September, the first meeting of the committee and sector working groups for the development of a national integrated cyber-infrastructure system was held at the CHPC offices in Cape Town.

The committee has been established to investigate international cyber-infrastructure best practice which is optimally applicable to South Africa and appropriately advise the Minister of Science and Technology on a model which will maximise the impact, sustainability and effective governance and management of the SA National Cyber-Infrastructure System. The expected outcome is that the Minister will be informed as to how this important initiative should be optimally institutionalised.

Currently, the main components of the core South African national cyber-infrastructure arrangement are: the Centre for High Performance Computing (CHPC), the South African National Research Network (SANReN), the Data Intensive Research Infrastructure of South Africa (DIRISA) formerly known as VLDB and the SAGrid Initiative. Outside of these, other parties own and manage diverse other components of the broader SA cyber-infrastructure ecosystem.
 

Researcher's Corner

A Shining Star Arises Through CHPC Facilitated Research

Dr Regina Maphanga is a senior researcher at the Materials Modelling Centre of the University of Limpopo. She has won several awards in recognition of her work and is the 2010 recipient of the National Science and Technology Forum (NSTF) Award in the category Distinguished Black Female Researcher over 2-5 years. This was for her contribution to computational modelling of materials, in particular electrolytic manganese dioxide.

Regina is from a rural village called Ngwanallela in GaMatlala, about 70 km west of Polokwane. She has always been an academic achiever: she was exempted from grade 6 during her primary schooling and finished matric at the age of 16. Her very first use of a computer came during her honours degree, which she passed with distinction, before going on to complete a Master’s and a Doctorate in Physics, specialising in computational modelling of materials.

She describes computational modelling as a relatively new research method which combines theory and experimental research to calculate the properties of materials. Instead of the laboratory equipment and samples used in traditional experiments, computational modelling makes use of computers and mathematical models to solve problems. The various methods, grounded in theory, can be used to bridge the gaps between fundamental science and industrial application. They can be applied to a variety of materials and used to understand the properties of complex materials, making this an attractive approach in the many fields where experimental data is hard or impossible to obtain.

Her research work is based on computer simulations and EXAFS experiments on electrolytic manganese dioxide, a positive electrode (cathode) material used in alkaline batteries. Ab initio and atomistic simulations (energy minimisation and molecular dynamics techniques) are used to simulate the materials. She uses a state-of-the-art and rare technique called the amorphisation and re-crystallisation (A and R) method: during the simulation, the material is allowed to adopt an amorphous configuration and the calculations are prolonged until the material re-crystallises. Prolonged dynamical simulations result in re-crystallisation of the structure, together with the evolution of the structural features observed experimentally. Hence the technique has been found appropriate for the simulation of complex materials.

Regina’s research findings have been presented at national and international conferences and published in journals and conference proceedings. She currently supervises postgraduate students.

Other achievements and awards:

  • Selected by the IAP (InterAcademy Panel on International Issues) as a Young Scientist to represent South Africa at the World Economic Forum’s Annual Meeting of the New Champions in Dalian, China (2011)
  • Selected as a member of the Global Young Academy, the voice of young scientists around the world (2011)
  • Finalist of the LOREAL/UNESCO “For Women in Science” South African Programme Fellowship (2006)
  • Recipient of a Special Mention Award of the LOREAL/UNESCO “For Women in Science” Fellowship (2006)

Regina is a long-time user of the CHPC, owing to the computationally intensive nature of her research. “The CHPC became very handy when we were starting with the projects on Large Scale Simulations, and it provided us with the computing power and resources we required to carry out our simulations. It is still making a huge difference and making it possible for us to progress with our work,” she says.

 

Last Updated on Tuesday, 16 October 2012 16:36

