LTS Research Computing Services

We offer a variety of platforms and software for research and high performance computing. Users may choose from a range of services depending upon their requirements and interests.

All users of Lehigh's research computing systems must obtain an account, tied to their standard Lehigh University userid, for their exclusive use.

Service Level 1 (no charge)

A Service Level 1 HPC account provides the entire university community with batch-scheduled access to Maia (32-core, 128GB SMP). Access to Maia is provided by the Polaris gateway host, where users can create their batch scripts for submission. No direct SSH access to Maia is supported, and GUI-based codes cannot be run there.

Each HPC Service Level 1 user has a home directory storage quota of 5GB. Scratch storage of 4TB is available on Maia (at /scratch/userid) for temporary storage of data while jobs are running. The scratch filesystem cannot be used for long-term storage of any data: all data older than 14 days will be deleted.
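Because the purge is purely age-based, users can check which of their scratch files are at risk before the 14-day cutoff. A minimal sketch (the actual purge mechanism is run site-side; the function name and early-warning threshold here are just illustrative):

```python
import os
import time

PURGE_DAYS = 14  # scratch retention policy stated above


def files_older_than(root, days):
    """Yield paths under `root` not modified in the last `days` days."""
    cutoff = time.time() - days * 86400
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                yield path
```

Running, say, `files_older_than('/scratch/userid', 10)` would list files a few days away from deletion, leaving time to copy anything worth keeping back to the home directory.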

Maia will reach its targeted end of life on June 30, 2017. As of January 2017, no decision has been made on retiring or replacing Maia. Please keep this in mind if you are using Maia for computational work supporting grant-funded research.

Account Request: To obtain an HPC account on Maia, please use the Account Request Form and select "FREE Linux command-line computing."

Service Level 2 (fee-based allocations)

A Service Level 2 HPC account provides access to Sol, a 55-node condo cluster for research. A faculty member may obtain access to Sol by purchasing a minimum allocation of 50,000 core-hours, or service units (SUs), for $500. Additional allocation can be purchased in increments of 10,000 SUs for $100 each. An allocation cycle begins on Oct. 1 and ends on Sep. 30 of the following year; unused SUs will not roll over to the next allocation cycle. At launch, there are 1,400,000 SUs available per allocation cycle. Faculty members who require more computing time, or who want a guaranteed share of the total available computing time annually, should consider a Condo Investment.
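In other words, the cost scales linearly above the 50,000-SU minimum. A quick sketch of the arithmetic (the function name is ours; rates are those stated above):

```python
def sol_allocation_cost(total_sus):
    """Dollar cost of a Sol allocation, per the published rates."""
    BASE_SUS, BASE_COST = 50_000, 500   # minimum allocation
    STEP_SUS, STEP_COST = 10_000, 100   # additional-allocation increment
    if total_sus < BASE_SUS:
        raise ValueError("minimum allocation is 50,000 SUs")
    extra = total_sus - BASE_SUS
    if extra % STEP_SUS:
        raise ValueError("additional SUs are sold in 10,000-SU increments")
    return BASE_COST + (extra // STEP_SUS) * STEP_COST
```

For example, an 80,000-SU allocation would cost $500 for the base 50,000 SUs plus three $100 increments, i.e. $800.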

The faculty member or PI can request additional accounts sharing the same allocation for an annual charge of $50 (each). Each service level 2 user has a home directory quota of 150GB.

Account Request: To obtain an HPC account, an allocation, or both on Sol, the faculty member sponsoring the user account should contact the Manager of Research Computing for more information.


Condo Investments

Faculty, Departments, Centers, and Colleges can invest in Sol by purchasing additional compute nodes to support their research, thereby increasing the overall capacity of Sol. Such investors (Condo Investors) will be provided with an annual allocation proportional to their investment that can be shared with their collaborators. Condo Investors who need more computing time than their investment provides can purchase additional allocation, if available, in blocks of 10,000 SUs for $100. These increments must be expended during the allocation cycle in which they were purchased and cannot be rolled over to the next cycle.

A Condo Investor can request additional accounts sharing their allocation for an annual charge of $50 (each).

Prospective investors should review the Condo Program before contacting HPC about investing in Sol.


Additional Storage: Additional storage (Level 1 & 2) is available by purchasing a Ceph project volume at $200/TB/year. To request additional home directory storage, please submit a request to
Definition of One Core-hour/Service Unit/SU: 1 hour of computing on 1 core. The base compute node on Sol with 20 cores will consume 20 SUs per hour of computing.
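The SU charge for a job is therefore just cores multiplied by wall-clock hours. A one-line sketch (the function name is ours):

```python
def su_charge(cores, hours):
    """Service units consumed: 1 SU = 1 hour of computing on 1 core."""
    return cores * hours

# e.g. a 20-core base node on Sol running for one hour:
# su_charge(20, 1) == 20 SUs, matching the definition above.
```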
More Information: The Research Computing wiki contains details on using the HPC resources and can be accessed from on-campus locations or via VPN. Polaris and Service Level 2 resources are accessible via SSH. SSH clients are available on all Linux distributions and on macOS; Windows SSH/SFTP clients are available from

Teaching uses of Research Computing Systems

Faculty members considering use of research computing facilities for teaching purposes should submit a request at least eight weeks prior to the class start date with an anticipated enrollment count, a proposed syllabus, and details of their proposed use of HPC systems.

Service Level 1 Research Computing Resource

There is no charge to use the Level 1 research computing resource. A Level 1 account gives access to Maia (32-core, 128GB SMP) for batch-scheduled use. Access to Maia is provided by the Polaris gateway host, where users can create their batch scripts for submission. No direct SSH access to Maia is supported, and GUI-based codes cannot be run there. All students registered for or auditing the course will need to request an account by visiting the Account Request page. Each account will have a home directory quota of 5GB. If additional storage is required for coursework, a Ceph Project volume will need to be requested at $200/TB/year. To request additional storage, please submit a request to with the class roster. Instructors who use Maia for credited courses should note that no reservations will be created to accommodate increased usage for completing assignments, projects, and other course-related workloads.

Service Level 2 Research Computing Resource

These accounts are typically associated with a rostered course and last for the duration of that course (up to one semester). The course instructor can request these accounts for their students, and the department offering the course is responsible for paying the associated fees.

A course allocation provides 1TB of Ceph space and an SU allocation based on the number of students in the course. The fee is broken down as follows:

  • 1 TB Ceph space: $200

  • Charge per student: $15

    • Provides 500 SU/student.

    • A course with 10 students will thus have a total allocation of 5000 SU that is shared among all students in the course.

  • Additional allocations can be purchased in units of

    • 1,000 SUs for $10 each, up to 10,000 additional SUs, and

    • 10,000 SUs for $100 each for allocation sizes above 10,000 SUs.

Instructors requiring assistance with estimating total or per-student SU requirements for the course should contact Research Computing Staff at least four weeks prior to the beginning of the semester.
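As a rough planning aid, the fee schedule above can be sketched in a few lines. The function names are ours, and the reading of the additional-SU tiers is our assumption: extras up to 10,000 SUs in 1,000-SU blocks at $10, larger totals in 10,000-SU blocks at $100.

```python
def course_allocation_sus(students, extra_sus=0):
    """Total course SU allocation: 500 SUs per student plus any extras."""
    return 500 * students + extra_sus


def course_fee(students, extra_sus=0):
    """Course fee in dollars, from the published schedule (tier
    interpretation for extra SUs is an assumption, see lead-in)."""
    fee = 200 + 15 * students            # 1 TB Ceph space + per-student charge
    if extra_sus:
        if extra_sus <= 10_000:
            blocks = -(-extra_sus // 1_000)    # round up to 1,000-SU blocks
            fee += 10 * blocks
        else:
            blocks = -(-extra_sus // 10_000)   # round up to 10,000-SU blocks
            fee += 100 * blocks
    return fee
```

For the 10-student example above, `course_allocation_sus(10)` gives the 5,000-SU shared pool and `course_fee(10)` gives $350 ($200 Ceph plus $150 in per-student charges).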



Usage Policy: Student accounts cannot be shared and will remain active until two weeks after the end of the semester. All compute-intensive tasks must be submitted via the batch scheduler. A compute-intensive task is defined as any operation on the HPC resource other than editing, copying, moving, or deleting files; submitting and monitoring jobs; and issuing simple commands such as ls, cp, mv, mkdir, rm, tail, tar, gzip/gunzip, more, cat, and less. On request, LTS Research Computing staff will guest lecture on how to use the resource, write and submit job scripts, and monitor jobs. All student data not saved in the Ceph project space will be purged when accounts are deactivated.


Ceph Storage Resource

Faculty can request a Ceph project to provide storage resources for coursework independent of accounts on Research Computing clusters. The cost of a Ceph allocation is $200/TB/year and must be paid by the department offering the course. Any request for a Ceph volume for coursework will need to be accompanied by the following:

  1. Total storage space requested.
  2. List of students (rostered or auditing), instructor, teaching assistants, and support staff (if any) who need access.
  3. Type of project volume, Managed or Open (see Ceph FAQ).
    By default, a Managed volume will be created, with each account given 10GB of personal space and any remaining space shared among all users listed in item 2 above.
  4. Access to the Ceph volume will be for the current semester only plus an additional two weeks following the end of the semester to allow students to backup/transfer their data.

Acknowledging use of LTS Research Computing Services

Please acknowledge Lehigh University in publications, reports and presentations that utilize LTS Research Computing Services with the following statement:

Portions of this research were conducted with research computing resources provided by Lehigh University.

Last Updated: January 24, 2017