GFI&SKD Storage

Latest revision as of 08:35, 13 May 2024

Organized via UiB ITA network storage (NetApp), GFI&SKD rents a total quota of 450 TB of disk storage (as of May 2024).

The storage areas are available from these main folder paths on cyclone (local access) and on any other GFI&SKD Linux client connected to the UiB standard cable network (network access via NFS):

- /Data/gfi  and /Data/skd

For GFI&SKD Windows and Mac clients inside the UiB network, the storage areas are accessed via one server, with two different paths for GFI and SKD. The paths are as follows, and you can also mount subfolders of these:

- From Win10, for backup folders use:
           \\klient.uib.no\felles\matnat\gfi and \\klient.uib.no\felles\matnat\skd
- From macOS, for backup folders use:
              smb://uib.no;userid@klient.uib.no/felles/matnat/gfi (or change the tail gfi to skd).
  Please include the entire domain path uib.no rather than just uib, also on Macs.

A how-to on connecting from Windows/Mac is published by UiB ITA (just change the string to the correct server and path):

* Win7/Win10: How to connect to a network share

* Mac OS X: Connecting to your network share

The storage is organised by priority: larger shared folders for modelling data, group and project folders with and without backup, and folders for individual use with or without backup. Because backup of large quantities of data carries considerable annual cost, users and groups need to consider which data need a daily backup and which are reproducible and reasonably secure within a modern storage system without backup.

As a general rule, we recommend backup of all data files that cannot be reproduced by re-executing an analysis script or re-running a model, and that cannot be downloaded from other locations. Examples of files to place in backup folders are scripts for starting a model run, MATLAB and Python scripts for data analysis, manuscripts, and so on. Examples of files that can remain without backup are results from model runs for which scripts are available to re-run, data files downloaded from other servers, and plots that can be recreated by running a script.

Most folders assigned for group access have general read-write access at group level. Owners of individual folders and files may still have set exclusive individual access on them. If other members of the assigned group find this inconvenient, please contact the formal owner of the file or folder.

Individual quotas (inside /Data/gfi/users and /Data/skd/users) are set by default to 200 GB and adjusted on request according to individual needs - normally < 1 TB, with respect for other users' needs and limited by the maximum size of the parent folder. Larger amounts should be considered for their own project space. Project and group quotas are managed by configured maximum limits on the main project/group folders.

Access to top-level folders is often set read-only to keep folder conventions & structure strict and tidy. When in need of storage space and/or adjustments and customisation, please submit an issue by email to: driftATgfi.uib.no.

Please note that the work folder is meant for short-term storage - any file or folder older than 30 days inside the work folder will be deleted automatically and without further notice.
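The 30-day rule can be checked from the command line. A minimal sketch using a scratch directory (the real cleanup runs on the work folders themselves, e.g. /Data/skd/work, and its exact implementation is not published here):

```shell
# Demo of the 30-day rule in a temporary directory; assumes GNU touch/find
# (as on cyclone). On the real system the same -mtime test would apply to
# the work folder paths instead of "$demo".
demo=$(mktemp -d)
touch -d '40 days ago' "$demo/old_run.nc"   # stale file, past the 30-day limit
touch "$demo/fresh_plot.png"                # recent file, kept
# Files untouched for more than 30 days - candidates for automatic deletion:
find "$demo" -type f -mtime +30
rm -rf "$demo"
```

Running the same `find … -mtime +30` test on your own work subfolder shows which of your files are next in line for deletion.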

Sizes, quotas, purpose etc. for the variety of folders inside GFI&SKD storage are given below.

{| class="wikitable"
!colspan="7"|GFI&SKD Storage folders inside /Data/gfi/ - (from leo.hpc.uib.no)
|-
!colspan="1"|Subfolder
!colspan="1"|Purpose
!colspan="1"|Projid
!colspan="1"|Size limit
!colspan="1"|Group assign
!colspan="1"|User quota
!colspan="1"|Backup
|-
|share
|shared model data
|48
|20 TB
|GFI & SKD
|
|
|-
|share/era5
|shared era5 data
|48
|150 TB
|GFI & SKD
|
|
|-
|scratch
|individual and group
|79
|25 TB
|GFI
|
|
|-
|work
|shared short time (30 days), symbolic link -> /Data/skd/work
|50
|28.75 TB
|GFI & SKD
|
|
|-
|met
|group share
|45
|15 TB
|MET
|
|
|-
|spengler
|group share
|46
|15 TB
|SPENGLER
|
|
|-
|exprec
|group share
|47
|5 TB
|EXPREC
|
|
|-
|metno
|daily & arch. met.no prognosis & obs
|80
|1.5 TB
|GFI
|
|
|-
|Undisposed
|Undisposed storage
|
|8.5 TB
|GFI
|
|
|-
!colspan="1"|Total
!colspan="1"|Size of GFI storage - (from leo.hpc.uib.no)
!colspan="1"|
!colspan="1"|240 TB
!colspan="1"|GFI & SKD
!colspan="1"|
!colspan="1"|
|}
{| class="wikitable"
!colspan="7"|GFI&SKD Storage folders inside /Data/gfi/ - (from UiB SAN - felles.uib.no)
|-
!colspan="1"|Subfolder
!colspan="1"|Purpose
!colspan="1"|Projid
!colspan="1"|Size limit
!colspan="1"|Group assign
!colspan="1"|User quota
!colspan="1"|Backup
|-
|users
|individual with backup
|None
|20 TB
|GFI
|*
|*
|-
|projects
|various project shares (sum of subfolders)
|None
|4.25 TB
|GFI
|
|*
|-
|projects/metdata
|project group share
|None
|1.5 TB
|METDATA
|
|*
|-
|projects/farlab
|project group share
|None
|1 TB
|FARLAB
|
|*
|-
|projects/isomet
|project group share
|None
|0.25 TB
|ISOMET
|
|*
|-
!colspan="1"|Total
!colspan="1"|Size of GFI storage - (from UiB SAN - felles.uib.no)
!colspan="1"|
!colspan="1"|24.25 TB
!colspan="1"|GFI & SKD
!colspan="1"|
!colspan="1"|
|}


{| class="wikitable"
!colspan="7"|GFI&SKD Storage folders inside /Data/skd/
|-
!colspan="1"|Subfolder
!colspan="1"|Purpose
!colspan="1"|Projid
!colspan="1"|Size limit
!colspan="1"|Group assign
!colspan="1"|User quota
!colspan="1"|Backup
|-
|share
|Symbolic link -> /Data/gfi/share/ - shared model data
|48
|
|GFI & SKD
|
|
|-
|skd-share
|Several subfolders /Data/gfi/share/* -> /Data/skd/skd-share/.
|66
|51.25 TB
|GFI & SKD
|
|
|-
|scratch
|individual and group share
|67
|20 TB
|SKD
|
|
|-
|work
|shared short time (30 days)
|50
|28.75 TB
|GFI & SKD
|
|
|-
|stormrisk
|group share
|83
|5 TB
|STORMRISK
|
|
|-
|projects
|various project shares
|49
|15 TB
|SKD
|
|
|-
|users
|individual with backup - This is from UiB SAN felles.uib.no
|None
|5 TB
|SKD
|*
|*
|-
!colspan="1"|Total
!colspan="1"|Size of SKD storage
!colspan="1"|
!colspan="1"|125 TB
!colspan="1"|GFI & SKD
!colspan="1"|
!colspan="1"|
|}


To monitor your quota (on folders from leo.hpc.uib.no), you can run the following command from an ssh login on cyclone:

- lfs quota -hp <projid> /shared/
- example for /Data/gfi/scratch: lfs quota -hp 79 /shared/

To monitor your 30 GB home folder quota on cyclone, you can run the following command from an ssh login on cyclone:

- lfs quota -hp $(id -u) /shared/

or just

- lfs quota -hp $(id -u) $HOME
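For scripted monitoring, the usage line of `lfs quota` can be parsed and compared against the limit. A minimal sketch assuming the typical `lfs quota -h` column layout (Filesystem, used, quota, limit, ...); the sample output below is illustrative, not captured from cyclone, and exact columns can vary by Lustre version:

```python
# Sketch: parse 'lfs quota -hp <projid> /shared/' output and report usage.
# The sample text is a hypothetical example of the command's output.
import re

_UNITS = {"k": 1e-9, "M": 1e-6, "G": 1e-3, "T": 1.0}  # suffix -> terabytes

def _to_tb(field: str) -> float:
    """Convert a human-readable size field such as '18.5T' to TB."""
    m = re.fullmatch(r"([\d.]+)([kMGT]?)", field)
    if not m:
        raise ValueError(f"unexpected size field: {field!r}")
    value, unit = m.groups()
    return float(value) * _UNITS.get(unit, 1e-12)  # bare number -> bytes

def parse_lfs_quota(output: str, mountpoint: str = "/shared/"):
    """Return (used_tb, limit_tb) from 'lfs quota -h'-style output."""
    for line in output.splitlines():
        fields = line.split()
        if fields and fields[0] == mountpoint:
            return _to_tb(fields[1]), _to_tb(fields[3])
    raise ValueError("mountpoint not found in quota output")

# Illustrative output for projid 79 (/Data/gfi/scratch, 25 TB limit):
sample = """\
Disk quotas for prj 79 (pid 79):
     Filesystem    used   quota   limit   grace   files   quota   limit   grace
       /shared/   18.5T     25T     25T       -  123456       0       0       -
"""
used, limit = parse_lfs_quota(sample)
print(f"used {used:.1f} TB of {limit:.0f} TB ({100 * used / limit:.0f}%)")
# -> used 18.5 TB of 25 TB (74%)
```

In a real script, the `sample` text would come from running the `lfs quota` command above via `subprocess.run(...)` on cyclone.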