GFI&SKD Storage

Organized via the UiB ITA network storage (NetApp), GFI&SKD rents a total quota of 450 TB of disk storage (as of May 2024).

The storage areas are available from these main folder paths on cyclone (local access) and on any other GFI&SKD Linux client connected to the UiB standard cable network (network access via NFS):

- /Data/gfi  and /Data/skd
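
To verify that these mounts are in place on a Linux client (and see how full they are), a quick check from a terminal:

  - df -h /Data/gfi /Data/skd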

For GFI&SKD Windows and Mac clients inside the UiB network, the storage areas are accessed via one server and two different paths, one for GFI and one for SKD. The paths are as follows, and you can mount subfolders of these:

- From Win10, backup folders use:
           \\klient.uib.no\felles\matnat\gfi and \\klient.uib.no\felles\matnat\skd
- From macOS, backup folders use:
              smb://uib.no;userid@klient.uib.no/felles/matnat/gfi (or change the tail gfi to skd).
  Please include the entire domain uib.no, rather than just uib, on Macs as well.
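
The shares can also be mounted from a command line instead of the graphical dialogs - a minimal sketch, with userid standing in for your UiB username (change the tail gfi to skd as needed):

  - From a Windows command prompt, map the GFI share to a drive letter:
             net use Z: \\klient.uib.no\felles\matnat\gfi
  - From a macOS terminal, open the share (Finder will prompt for credentials):
             open "smb://uib.no;userid@klient.uib.no/felles/matnat/gfi"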

How-to guides for connecting from Windows/Mac are published by UiB ITA (just change the string to the correct server and path):

* Win7/Win10: How to connect to a network share

* Mac OS X: Connecting to your network share

The storage is organised by priority: larger shared folders for modelling data, group and project folders with and without backup, and folders for individual use with or without backup. Because backup of large data quantities comes with considerable annual cost, users and groups need to consider which data need a daily backup and which are reproducible and reasonably safe within a modern storage system without backup.

As a general rule, we recommend backup of all data files that cannot be reproduced by re-executing an analysis script, re-running a model, or downloading from other locations. Examples of files to place in backup folders are scripts for starting a model run, Matlab and Python scripts for data analysis, manuscripts, and so on. Examples of files that can remain without backup are results from model runs that can be reproduced from available scripts, data files downloaded from other servers, and plots that can be recreated by running a script.

Most folders assigned for group access have general read-write access at group level. Owners of individual folders and files may nevertheless have set exclusive individual access on them. If other members of the assigned group find this inconvenient, please contact the formal owner of the file or folder.
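
If you are the owner and want to re-open such files for the assigned group, this can be done from an ssh login, for example (the folder path here is a hypothetical example - adjust it to your own):

  - chmod -R g+rwX /Data/gfi/met/myfolder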

Individual quotas (inside /Data/gfi/users and /Data/skd/users) are set by default to 200 GB and adjusted on request to individual needs - normally < 1 TB, with due respect for other users' needs and limited by the maximum size of the parent folder. Larger amounts should be given their own project space. Project and group quotas are managed via configured maximum limits on the main project/group folders.

Access to top-level folders is often set read-only to keep folder conventions and structure strict and tidy. When in need of storage space and/or adjustments and customisation, please submit an issue by email to: driftATgfi.uib.no.

Please note that "the work-folder" is meant for short time storage - Any file or folder older than 30 days inside "the work-folder" will be deleted automatically and without further notice.

Sizes, quotas, purposes etc. for the various folders inside the GFI&SKD storage are given below.

{| class="wikitable"
|+ GFI&SKD Storage folders inside /Data/gfi/ - (from leo.hpc.uib.no)
! Subfolder !! Purpose !! Projid !! Size limit !! Group assign !! User quota !! Backup
|-
| share || shared model data || 48 || 20 TB || GFI & SKD || ||
|-
| share/era5 || shared era5 data || 48 || 150 TB || GFI & SKD || ||
|-
| scratch || individual and group || 79 || 25 TB || GFI || ||
|-
| work || shared short time (30 days), symbolic link -> /Data/skd/work || 50 || 28.75 TB || GFI & SKD || ||
|-
| met || group share || 45 || 15 TB || MET || ||
|-
| spengler || group share || 46 || 15 TB || SPENGLER || ||
|-
| exprec || group share || 47 || 5 TB || EXPREC || ||
|-
| metno || daily & arch. met.no prognosis & obs || 80 || 1.5 TB || GFI || ||
|-
| Undisposed || Undisposed storage || || 8.5 TB || GFI || ||
|-
! Total !! Size of GFI storage - (from leo.hpc.uib.no) !! !! 240 TB !! GFI & SKD !! !!
|}
{| class="wikitable"
|+ GFI&SKD Storage folders inside /Data/gfi/ - (from UiB SAN - felles.uib.no)
! Subfolder !! Purpose !! Projid !! Size limit !! Group assign !! User quota !! Backup
|-
| users || individual with backup || None || 20 TB || GFI || * || *
|-
| projects || various project shares (sum of subfolders) || None || 4.25 TB || GFI || || *
|-
| projects/metdata || project group share || None || 1.5 TB || METDATA || || *
|-
| projects/farlab || project group share || None || 1 TB || FARLAB || || *
|-
| projects/isomet || project group share || None || 0.25 TB || ISOMET || || *
|-
! Total !! Size of GFI storage - (from UiB SAN - felles.uib.no) !! !! 24.25 TB !! GFI & SKD !! !!
|}


{| class="wikitable"
|+ GFI&SKD Storage folders inside /Data/skd/
! Subfolder !! Purpose !! Projid !! Size limit !! Group assign !! User quota !! Backup
|-
| share || Symbolic link -> /Data/gfi/share/ - shared model data || 48 || || GFI & SKD || ||
|-
| skd-share || Several subfolders /Data/gfi/share/* -> /Data/skd/skd-share/ || 66 || 51.25 TB || GFI & SKD || ||
|-
| scratch || individual and group share || 67 || 20 TB || SKD || ||
|-
| work || shared short time (30 days) || 50 || 28.75 TB || GFI & SKD || ||
|-
| stormrisk || group share || 83 || 5 TB || STORMRISK || ||
|-
| projects || various project shares || 49 || 15 TB || SKD || ||
|-
| users || individual with backup - from UiB SAN felles.uib.no || None || 5 TB || SKD || * || *
|-
! Total !! Size of SKD storage !! !! 125 TB !! GFI & SKD !! !!
|}


To monitor your quota (on folders from leo.hpc.uib.no), you can run the following command from an ssh login on cyclone:

- lfs quota -hp <projid> /shared/
- example for /Data/gfi/scratch: lfs quota -hp 79 /shared/
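
If you do not know the projid of a folder, newer Lustre clients can usually report it directly - a sketch, assuming the lfs project subcommand is available on cyclone:

  - lfs project -d /Data/gfi/scratch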

To monitor your 30 GB home-folder quota on cyclone, run the following from the same ssh login:

- lfs quota -hp $(id -u) /shared/

or just

- lfs quota -hp $(id -u) $HOME