
HPC Basics

HPC Modules

• HPC uses a modules application to provide access to modular dependencies

– libraries: netcdf, hdf5, etc.

– executables: python, ifort

• More details are available on the HPC wiki
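As a preview, the most common `module` subcommands look like the following. These are standard Environment Modules commands, but the module names and version numbers shown are only examples; your site's names may differ.

```shell
module help            # overview of module subcommands
module avail           # list every module the site provides
module avail intel     # list the versions of one module, e.g. "intel"
module load intel      # add the default intel version to your environment
module load netcdf/4.2 # load a specific version (example version number)
module list            # show which modules are currently loaded
module unload intel    # remove a module from the current session
```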

Exercise 1: HPC Modules

Libraries for air quality models are provided through the HPC's module system. The module system is a convenient way for the HPC to offer standard tools to users. Standard needs include compilers and libraries. Compilers turn source code into programs; this is useful because you download model code and want a program that simulates the atmosphere. Libraries are reusable pieces of code that are useful independent of any particular program. For example, CMAQ and GEOS-Chem both use NetCDF as a file format. NetCDF provides libraries written in Fortran, C, and Java that allow any program to interface with these files. As a result, programs like ArcGIS have adopted NetCDF as an input format, and more broadly, any software that reads NetCDF can work with model output files. Programs “link” to these libraries when they are compiled and, sometimes, when they run. For you, this means that NetCDF must be available both when you compile your model and when it runs. For convenience, we will make these libraries and compilers available in your environment. Most air quality models compile most easily with the Intel compilers.
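For example, compile-time and run-time linking against NetCDF can be sketched like this. This is a minimal sketch: `read_model.f90` is a hypothetical source file, `nc-config` is a helper program shipped with NetCDF that reports compile flags (newer installations may use `nf-config` for the Fortran flags), and exact flags vary by site.

```shell
module load intel netcdf   # make the compiler and library visible

# Compile-time linking: nc-config reports the include and library flags
# so ifort can find the NetCDF Fortran interface.
ifort -o read_model read_model.f90 $(nc-config --fflags) $(nc-config --flibs)

# Run-time linking: the shared NetCDF libraries must also be found
# when the program runs; ldd shows which ones it will load.
ldd ./read_model | grep -i netcdf
```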

1. Login to HPC via PuTTY

2. Execute the command `module`

3. Use `module help` to find out

(a) how to list available modules? _______________________

(b) how to list versions of an available module? _______________________

(c) how to load a module? _______________________

4. List the available modules; press “space” to go down a page. If you need to go back up, press “q” and start over.

(a) What is the first section listed (— section name —)? __________________

(b) What is the version of NetCDF? ________________

(c) What is the version of HDF5? ________________

5. List the available versions of the Intel compiler, assuming the module's name is simply “intel”

6. Add the default version of intel using the command from 3c

7. What commands would you execute to add intel, python, netcdf, and hdf5 as modules?

8. Execute `ifort`, what do you get?

9. Execute `python --version`, what do you get?

10. Execute `ncdump`, what do you get?

11. Execute `h5dump`, what do you get?

12. Close your PuTTY session and reopen it

13. Execute `ifort`, what do you get?

14. Execute `python --version`, what do you get?

15. Execute `ncdump`, what do you get?

16. Execute `h5dump`, what do you get?
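One possible answer to question 7, with the checks from questions 8–11, might look like the following. The module names match the exercise, but whether defaults load cleanly (and the exact version strings) depends on the site.

```shell
# Question 7: load the four modules; most sites also accept them in one line.
module load intel
module load python
module load netcdf
module load hdf5
# ...or equivalently:  module load intel python netcdf hdf5

# Questions 8-11: each tool should now be on your PATH, printing a usage
# message or version string rather than "command not found".
which ifort ncdump h5dump
python --version
```

After closing PuTTY (question 12), a fresh login shell starts without these modules, which is what the repeated checks in questions 13–16 are meant to reveal.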



Environment: Current and Default

• Any variables you define exist in your environment.

• Your environment is the context for your Linux commands.

• The environment is lost when you log out.

• This section will show you how to make your environment persist from session to session.
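The “environment is lost” idea can be demonstrated with plain bash: a variable set in one shell is invisible to a child shell unless it is exported, and even exported variables vanish at logout unless a startup file recreates them. (`AQ_TUTORIAL_VAR` is just an example name.)

```shell
AQ_TUTORIAL_VAR=hello                        # defined only in the current shell
bash -c 'echo "${AQ_TUTORIAL_VAR:-unset}"'   # child shell: prints "unset"
export AQ_TUTORIAL_VAR                       # exported: inherited by children
bash -c 'echo "${AQ_TUTORIAL_VAR:-unset}"'   # child shell: prints "hello"
# A brand-new login (like closing and reopening PuTTY) starts from scratch,
# so persistent settings must live in ~/.bash_profile or ~/.bashrc.
```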

Exercise 2: Creating an environment for compiling/running a model

On the HPC system, your default SHELL is bash. bash can read files from your home directory to configure your default environment. bash can read from ~/.bash_profile or ~/.bashrc. For this class, I want your default environment to include intel/2012, python/2.7.3, hdf5/1.8.9, and netcdf/4.2. Take the commands from question 7 above and put them into your ~/.bash_profile file. You should be able to navigate to this file using FileZilla and then edit following the same steps we used in the “GNU Make” exercise.
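Assuming the versions named above, the lines added to ~/.bash_profile might look like this; adjust the version strings to whatever `module avail` shows on your system.

```shell
# ~/.bash_profile -- read by bash for login shells
# Load the class modules so every new session has them (versions are examples).
module load intel/2012
module load python/2.7.3
module load hdf5/1.8.9
module load netcdf/4.2
```

Loading hdf5 before netcdf can matter on sites where the netcdf module depends on it.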

1. Open FileZilla

2. Navigate to your home directory on the remote machine

3. Right click on .bash_profile and choose “View/Edit” (if it doesn't exist, what should you do?)

4. Add the `module load ` commands from question 7 above.

5. Execute `ifort`, what do you get?

6. Log out of any putty sessions you have open and log back in

7. Execute `ifort`, what do you get?

8. Execute `python --version`, what do you get?

9. Execute `ncdump`, what do you get?

10. Execute `h5dump`, what do you get?

11. Compare results from this section to the previous section.


