The HPC model enables Iowa State University to:
- Increase the scale and availability of HPC resources for researchers,
- Develop a model that is affordable to researchers and sustainable by Iowa State,
- Leverage economies of scale in high-performance computing and reduce overhead and duplicative costs without negatively impacting the science,
- Build a community of need around an emerging technology in an efficient manner,
- Enhance the research infrastructure at Iowa State, not only to encourage more aggressive research but also to have facilities available that will help recruit and retain premier faculty, and
- Create an environment where those who do not have existing grant funding can obtain development resources.
HPC research at Iowa State University totals approximately $3.5 million per year, funded by research grants from the Department of Energy, the National Science Foundation, the National Institutes of Health, and other granting organizations. The following are a few examples of the research being done with the support of this facility.
Submitting a Proposal?
For estimating purposes, please use the following information when developing your proposal budget. The final cost charged to the grant award will be actual rather than estimated. Note: ISU research groups can share the cost of nodes or storage. A short sketch for turning these figures into a rough estimate follows the hardware and storage details below.
Dual 32-core Intel Ice Lake processors, 512 GB of shared memory, 1.6 TB of NVMe 3DWPD SSD local disk space, a Mellanox EDR (100 Gbit) InfiniBand card and cable, and 1 Gbit Ethernet and cable, plus a 5-year parts warranty. Nodes come in groups of 4, with 2 hot-swappable redundant power supplies providing power for all 4.
$39,000 (per file system, for 262 TB of disk space with local backup)
Storage is purchased as a file server with redundant power supplies and two 44-bay JBOD enclosures attached. Each enclosure has redundant fans and power supplies, all of which are hot-swappable. All disks are 12 Gb/s 7200 RPM enterprise SAS drives. The storage is configured so that yesterday's copy of the data is retained in case of file system corruption.
This provides a 262 TB zpool. The file servers are connected via 100 Gbit InfiniBand to provide fast access to the data.
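As a rough guide to combining these figures, the following is a minimal estimation sketch. The $39,000-per-262-TB storage price is taken from the figures above; the per-node price is not listed on this page, so the `cost_per_node` value (and the `estimate_hpc_budget` helper itself) is purely illustrative and should be replaced with a current quote from ISU HPC staff.

```python
import math

# Minimal budget-estimation sketch based on the figures above.
# The storage price ($39,000 per 262 TB file system with local backup)
# comes from this page; the per-node price is NOT listed here, so
# cost_per_node is a placeholder to be replaced with a current quote.
STORAGE_COST_PER_FILESYSTEM = 39_000   # USD
STORAGE_TB_PER_FILESYSTEM = 262        # TB per file system

def estimate_hpc_budget(num_nodes: int, storage_tb: float, cost_per_node: float) -> dict:
    """Return a rough cost breakdown; the final charge to the award is actual, not estimated."""
    # Storage is sold per whole file system, so round partial needs up to full units.
    filesystems = math.ceil(storage_tb / STORAGE_TB_PER_FILESYSTEM)
    storage_cost = filesystems * STORAGE_COST_PER_FILESYSTEM
    node_cost = num_nodes * cost_per_node
    return {"nodes": node_cost, "storage": storage_cost, "total": node_cost + storage_cost}

# Example: 8 compute nodes at a placeholder price and ~300 TB of storage (2 file systems).
print(estimate_hpc_budget(num_nodes=8, storage_tb=300, cost_per_node=15_000))
```

Because storage is quoted per whole 262 TB file system, partial needs round up to a full unit, which is a natural place for research groups to share cost as noted above.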
NSF "Facilities, Equipment and Other Resources" Information
Facilities, Equipment, and Other Resources
[Other resources here]
The research will be supported in part by the Nova cluster, available at Iowa State University as part of an MRI grant from the National Science Foundation. The equipment comprises the following; a rough tally of the aggregate capacity is sketched after the list.
- 140 Intel nodes, 36-48 cores/node, 192-384 GB/node
- 40 AMD nodes, 64 cores/node, 512 GB/node
- 54 Intel nodes, 64 cores/node, 512 GB/node
- 3 Intel nodes, 64-96 cores/node, 3 TB/node
- 25 GPU nodes, 36-64 cores/node, 192-512 GB/node, 1-2 V100 or 4-8 A100 GPU cards
- A head node, 16 I/O nodes, and one data transfer node
- 102 TB of shared scratch disk space and 2 PB of RAID-6 long-term NFS disk space
- GigE and EDR (100 Gbit) InfiniBand interconnects for each node (via a stacked Ethernet switch and a 324-port InfiniBand switch)
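If a facilities statement needs aggregate numbers rather than the per-group breakdown, the short tally below simply restates the node counts, core ranges, and memory ranges listed above (GPU card counts are omitted because they vary by node type). It is a convenience sketch, not an official capacity figure.

```python
# Quick tally of the Nova compute capacity listed above.
# Each entry is (node count, (min cores, max cores), (min GB, max GB)) per node group.
node_groups = [
    (140, (36, 48), (192, 384)),    # Intel nodes
    (40,  (64, 64), (512, 512)),    # AMD nodes
    (54,  (64, 64), (512, 512)),    # Intel nodes
    (3,   (64, 96), (3072, 3072)),  # large-memory Intel nodes (3 TB)
    (25,  (36, 64), (192, 512)),    # GPU nodes
]

total_nodes = sum(n for n, _, _ in node_groups)
min_cores = sum(n * cores[0] for n, cores, _ in node_groups)
max_cores = sum(n * cores[1] for n, cores, _ in node_groups)
min_mem_tb = sum(n * mem[0] for n, _, mem in node_groups) / 1024
max_mem_tb = sum(n * mem[1] for n, _, mem in node_groups) / 1024

print(f"{total_nodes} compute nodes, {min_cores:,}-{max_cores:,} cores, "
      f"{min_mem_tb:.0f}-{max_mem_tb:.0f} TB of RAM")
```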
Preparing a Paper or Publication?
If you are preparing a paper or publication that references research that used ISU HPC resources, please add the following acknowledgment:
"The research reported in this paper is partially supported by the HPC@ISU equipment at Iowa State University, some of which has been purchased through funding provided by NSF under MRI grants number 1726447 and MRI2018594."