SNIC Medium Storage 2021

Decided

This round has been closed as all proposals have been handled.

Proposals are evaluated monthly throughout the year. To apply, you must be a scientist in Swedish academia, at least at the level of assistant professor.


Resources

Resource | Centre | Total Requested | Upper Limit | Available | Unit | Note
Cephyr NOBACKUP | C3SE | 240 748 | 40 000 | 200 000 | GiB | Storage attached to Alvis, Vera and Hebbe at C3SE
Project storage based on Ceph with a total usable capacity of 5 PiB.
  • 14 storage servers, each with 3 NVMe drives (for database and journal).
  • 7 JBODs, each with 42 × 14 TiB HDDs.
Nobackup | HPC2N | 968 300 | 30 000 | 300 000 | GiB
When a Storage Project has been approved, it is important to make sure that all files belonging to the project are moved from each member's personal directory under /pfs/nobackup/home/ to the new project storage directory (a minimal sketch of such a move follows at the end of this entry). 30 days after the approval of the project storage, the quota for each user in the /pfs/nobackup/home file system will be reduced to 25 GB, just enough to keep certain user-specific configuration files. Project storage is by default only accessible by the members of the project.

Active project storage without backup for local HPC2N projects.
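As an illustration of the move described above, here is a minimal Python sketch. The username and the project storage path (/proj/nobackup/myproject) are placeholders, not paths taken from HPC2N documentation; use the directory actually assigned to your project.

```python
#!/usr/bin/env python3
"""Sketch: move project data from a personal nobackup directory to project storage."""
import shutil
from pathlib import Path

# Personal nobackup home, as described in the HPC2N note above ("myuser" is a placeholder).
personal_dir = Path("/pfs/nobackup/home") / "myuser"
# Hypothetical project storage directory; replace with the path assigned to your project.
project_dir = Path("/proj/nobackup/myproject")

# Example subdirectories under the personal area that belong to the project.
project_items = ["simulations", "results"]

for name in project_items:
    src = personal_dir / name
    dst = project_dir / name
    if src.exists() and not dst.exists():
        print(f"moving {src} -> {dst}")
        shutil.move(str(src), str(dst))
```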
Centrestorage nobackup | LUNARC | 638 000 | 40 000 | 600 000 | GiB
Centre Storage | NSC | 1 624 420 | 50 000 | 500 000 | GiB | Storage on Centre Storage at NSC for compute projects allocated time on NSC resources.
Centre Storage @ NSC is designed for fast access from compute resources at NSC. The purpose is to provide storage for active data for SNIC compute projects allocated time on compute resources at NSC. Apply for a file quota of 1 million files unless you have a good motivation for more (a file-counting sketch follows at the end of this entry).

Project storage for NAISS as well as LiU Local projects with compute allocations on resources hosted by NSC.

Centre Storage @ NSC consists of one IBM ESS GL6S building block and one IBM ESS 5000 SC4 building block.

In total there are 946 spinning hard disks and a small number of NVRAM devices and SSDs which act as a cache to speed up small writes. The total disk space that is usable for storing files is approximately 6.9 PiB.
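Since the entry above recommends applying for a file quota of 1 million files unless a larger quota can be motivated, a small sketch such as the following can help estimate the file count of an existing dataset before submitting a proposal; the dataset path is a placeholder.

```python
#!/usr/bin/env python3
"""Sketch: count files in a dataset to judge whether the default 1 million file quota is enough."""
import os

root = "/path/to/existing/dataset"  # placeholder; point this at your own data

count = 0
for _dirpath, _dirnames, filenames in os.walk(root):
    count += len(filenames)

print(f"{count} files under {root}")
if count > 1_000_000:
    print("More than 1 million files: motivate a larger file quota in the proposal.")
```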

Klemming | PDC | 923 000 | 100 000 | 1 000 000 | GiB | Storage attached to Beskow and Tegner at PDC
When a storage allocation has been approved, any data belonging to the project needs to be moved from the users' 'nobackup' directories into the project directory as soon as possible. 30 days after the storage allocation starts, a 25 GiB quota will be enforced in the 'nobackup' directories of the users that belong to the allocation (see the sketch at the end of this entry). More information about project directories in Klemming can be found at https://www.pdc.kth.se/support/documents/data_management/lustre.html.

Project storage for NAISS as well as PDC projects with compute allocations on resources hosted by PDC.

Klemming is designed for fast access from compute resources at PDC. It uses the Lustre parallel file system, which is optimized for handling data from many clients at the same time. The total size of Klemming is 12 PB.
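As a rough check before the 25 GiB quota takes effect, a sketch like the following sums up what is left in a user's 'nobackup' directory; the path is a hypothetical example, not taken from PDC documentation.

```python
#!/usr/bin/env python3
"""Sketch: report how much data remains in a 'nobackup' directory compared with the 25 GiB quota."""
from pathlib import Path

QUOTA_BYTES = 25 * 1024**3  # 25 GiB

nobackup = Path("/cfs/klemming/nobackup/myuser")  # hypothetical example path

total = sum(f.stat().st_size for f in nobackup.rglob("*") if f.is_file())

print(f"{total / 1024**3:.1f} GiB in {nobackup}")
if total > QUOTA_BYTES:
    print("Above the 25 GiB quota: move project data into the project directory.")
```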

dCache | Swestore | 132 | 100 | 1 000 | TiB

Swestore is a Research Data Storage Infrastructure, intended for active research data and operated by the National Academic Infrastructure for Supercomputing in Sweden, NAISS.

The storage resources provided by Swestore are made available free of charge for academic research funded by VR and Formas through open calls, so that the best Swedish research is supported and new research is facilitated.

The purpose of Swestore allocations, granted by the National Allocations Committee (NAC), is to provide large-scale data storage for “live” or “working” research data, also known as active research data.

See the documentation at: https://docs.swestore.se
iRODS | Swestore | 150 | 100 | 200 | TiB | This resource is no longer active as it will be taken out of operation by the end of 2021.
National storage infrastructure for large-scale research data built on iRODS technology (www.irods.org).
Crex 1 | UPPMAX | 4 147 800 | 100 000 | 3 000 000 | GiB | UPPMAX storage system attached to Rackham and Snowy, suitable for active data.
Only apply if you have a compute project or proposal at UPPMAX. Not suitable for long-term data storage. Backup is available; use "nobackup" in directory names to exempt data from backup (a sketch illustrating this convention follows at the end of this entry). Note: allocations on Crex are decided strictly on a monthly basis, and proposals submitted after the 23rd are decided the following month. See this page for detailed information: https://www.uppmax.uu.se/support/getting-started/applying-for-storage/

Active data storage for Rackham projects. Primarily for life science projects.
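To illustrate the "nobackup" naming convention mentioned above, here is a toy sketch that classifies paths as backed up or exempt; the example paths are hypothetical and the exact rule applied by UPPMAX may differ.

```python
#!/usr/bin/env python3
"""Sketch: classify paths as backed up or backup-exempt based on the 'nobackup' naming convention."""
from pathlib import Path

def is_backed_up(path: Path) -> bool:
    """True if no path component contains 'nobackup'."""
    return not any("nobackup" in part for part in path.parts)

# Hypothetical example paths within a Crex project area.
examples = [
    Path("/proj/myproject/scripts/run.sh"),
    Path("/proj/myproject/nobackup/raw/sample_001.fastq"),
]
for p in examples:
    print(p, "-> backed up" if is_backed_up(p) else "-> exempt from backup")
```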
