
Resource Limits and Data Retention

Allocations and limits on processing resources apply to both NRAO staff and observers.

Quota Limits

Observers are limited to 5TB of space in their data area on the Lustre filesystem (e.g. /lustre/aoc/observers/<username> or /lustre/naasc/observers/<username>). User data in large project areas is accounted for via separate group quotas and does not count toward the user's quota. You can view your current usage with the lfs command.

lfs quota -h -g <group> <filesystem>

<group> is the group owner of your data area on Lustre. For nm-* and cv-* style accounts, the group is the same as your username. <filesystem> is /lustre/aoc for NMASC (nmpost) and /lustre/naasc for NAASC (cvpost).

The following shows the quota for the account nm-4386.

$ lfs quota -h -g nm-4386 /lustre/aoc
Disk quotas for group nm-4386 (gid 24386):
Filesystem    used   quota   limit   grace   files   quota   limit   grace
/lustre/aoc  5.2T*     5T      6T 6d22h49m58s 296       0       0       -

To accommodate brief processing spikes, Lustre provides a one-week grace period during which usage may exceed the 5TB soft quota by up to 20%, i.e. up to the 6TB hard limit. Writing above 6TB, or failing to drop back below 5TB before the grace period expires, triggers an I/O error.

The example above shows an account that is over the 5TB soft quota (marked by the trailing "*" on the used column) but still below the 6TB hard limit, with 6 days and 22 hours remaining in the one-week grace period.
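That trailing "*" makes the over-quota state easy to detect in a script. The following sketch parses the example output shown above (reproduced in a variable for illustration); on a real cluster you would substitute the live output of the lfs quota command for your own group:

```shell
# Warn if a Lustre group is over its soft quota, flagged by a trailing "*"
# on the "used" column of `lfs quota` output. The sample text below is the
# example output from this page; replace it with:
#   quota_output=$(lfs quota -h -g "$GROUP" /lustre/aoc)
quota_output='Disk quotas for group nm-4386 (gid 24386):
Filesystem    used   quota   limit   grace   files   quota   limit   grace
/lustre/aoc  5.2T*     5T      6T 6d22h49m58s 296       0       0       -'

# Pull the "used" field from the filesystem line.
used=$(printf '%s\n' "$quota_output" | awk '$1 == "/lustre/aoc" {print $2}')

case "$used" in
  *\*) echo "WARNING: over soft quota (used $used), grace period running" ;;
  *)   echo "OK: within quota (used $used)" ;;
esac
```

A cron job built on this pattern can warn you well before the grace period runs out and writes start failing.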


Cluster Resource Limits

For interactive jobs, users are limited to one compute node. Interactive nodes are assigned for 1 to 14 days and can only be accessed by the reserving account.

During periods of increased pressure, users may be asked to release idle nodes to allow broader community access.

For batch requests, users are asked to limit in-flight batch jobs to one node's worth of jobs.
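One way to stay within one node's worth of in-flight jobs, assuming the cluster scheduler is Slurm and (for illustration only) nodes with 16 cores, is a throttled job array; the script name, core count, and throttle value below are hypothetical placeholders to adapt to your actual allocation:

```shell
#!/bin/sh
# Illustrative throttled Slurm job array: 100 single-core tasks, but the
# %16 suffix caps concurrent tasks at 16, i.e. one hypothetical 16-core
# node's worth of work in flight at any time.
#SBATCH --array=1-100%16
#SBATCH --cpus-per-task=1

# my_pipeline.sh is a stand-in for your actual processing script.
./my_pipeline.sh "$SLURM_ARRAY_TASK_ID"
```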

Requests for increased access duration, storage space, and compute nodes should be submitted as a ticket to https://help.nrao.edu (Data archive and Download department) and will be reviewed by designated NMASC or NAASC staff on a case-by-case basis.


Data Retention

External accounts, along with any data products, will be removed two weeks after the completion of a processing request. You will receive an email warning prior to account deletion. In the event of multiple processing requests, the account expiration date will be set by the last request to complete.


Large Proposals

Observers associated with large proposals (greater than 200 hours) who plan to use NRAO computing resources may request the creation of a project area on the Lustre filesystem and a Unix group for shared data access among proposal members.

Project area size limits will be negotiated with NRAO at the time of the request to match project data rates and imaging plans, but will typically be 10TB. Quotas for projects are enforced via the project group.
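Because the quota is enforced on the project group, shared files must carry that group rather than each member's personal group. A common way to arrange this (a sketch; the directory and group names are illustrative stand-ins, not actual NRAO paths) is to set the setgid bit on the project directory so new files inherit its group:

```shell
# Sketch: make files created in a shared project area count against the
# project group quota instead of each user's personal quota.
# "demo_project_area" stands in for a real /lustre/.../projects/<name> path.
PROJ_DIR=demo_project_area
mkdir -p "$PROJ_DIR"
# chgrp <projectgroup> "$PROJ_DIR"   # run with the real project group on Lustre
chmod g+s "$PROJ_DIR"                # setgid: files created below inherit the group
touch "$PROJ_DIR/calibrated.ms"      # hypothetical data product
```

With the setgid bit set, output written by any proposal member lands in the project group and is accounted under the project quota, which `lfs quota -h -g <projectgroup> <filesystem>` then reports.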
