This page is no longer actively maintained and will soon be archived. Please visit our new page at https://docs.hpc.wvu.edu for up-to-date information.
WVU Research Computing Resources
Welcome to the WVU Research Computing (RC) High Performance Computing (HPC) Wiki. This wiki contains supplemental information to the RC HPC main site, mainly specific system information and instructions for using the clusters.
- WVU Research Computing Website - General Overview
- HPC HelpDesk - Request help from the HPC team and request HPC access
- Wiki - Documentation for end-users performing certain tasks
To request help, create a new ticket at the Research Computing HPC Help Desk web page. You are welcome to e-mail any member of the RC HPC team directly, but since we are not always at our desks, the ticket system guarantees that your support question will be seen by someone currently available.
High Performance Computing Clusters
- Mountaineer mountaineer.hpc.wvu.edu
Older cluster with 32 compute nodes using Westmere X5650 processors at 2.67 GHz. To be decommissioned after the deployment of Thorny Flat.
- Spruce Knob spruce.hpc.wvu.edu
Current production cluster with 176 compute nodes using Sandy Bridge, Ivy Bridge, Haswell, and Broadwell processors.
- Thorny Flat tf.hpc.wvu.edu
Future cluster, to be deployed around fall 2018. Compute nodes will most likely use Skylake Gold 6138 processors or similar.
Software and Libraries
- Using Modulefiles
- Installing Python and R packages
- Python virtual environments
- Installing packages compiled from source
- Genetic Alignment with BSseeker and Bowtie2
- GNU Compiler Collection
- Python 2 and 3 and scientific libraries
- ANSYS Products
- Mountaineer Batch Queues
- Spruce Batch Queues
- Sample Job Scripts
- Using X Windows applications
- Tricks to clean the Scratch
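As a quick illustration of the batch-queue, modulefile, and sample-job-script topics linked above, here is a minimal sketch of a PBS/Torque job script. The job name, queue name, module name, and script name are hypothetical placeholders, not actual Spruce Knob values; consult the linked pages for the queues and modules available on each cluster.

```
#!/bin/bash
#PBS -N example_job          # job name (placeholder)
#PBS -q standby              # queue name is an assumption; see the Batch Queues pages
#PBS -l nodes=1:ppn=4        # request one node with four cores
#PBS -l walltime=01:00:00    # one hour wall-clock limit
#PBS -m ae                   # e-mail when the job aborts or ends

# Start in the directory from which qsub was invoked
cd "$PBS_O_WORKDIR"

# Load software through modulefiles (module name is illustrative)
module load python

python my_script.py          # hypothetical user script
```

Save this as, for example, `job.pbs` and submit it from a login node with `qsub job.pbs`; `qstat -u $USER` then shows the job's status in the queue.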
- Intro to HPC
- Intro to Linux
- Intro to PBS/Torque
- 2014 HPC Summer Institute
- 2017 CS Graduate Seminar -- Intro to WVU Research Computing and XSEDE
- 2018 Seminar for Timothy Driscoll class