Setting up ARCHER2-JASMIN archiving in the nesting suite

The agent will persist, possibly for weeks.
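(For reference, and assuming this refers to ssh-agent: a typical way to start a long-lived agent on the ARCHER2 login node and load the key used for JASMIN transfers is something like the following - the key filename here is just an example.)

    eval $(ssh-agent)               # start the agent in this shell
    ssh-add ~/.ssh/id_rsa_jasmin    # load the private key used for JASMIN transfers
    ssh-add -l                      # confirm the key is now held by the agent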

I don’t see a connection error in /home/n02/n02/shakka/cylc-run/u-cz478/log/job/19991231T1200Z/archive_files_Arctic/20

(I can’t see into the polarres GWS.)

Awesome, cheers.

Really? That’s strange.

This is what I’m getting:

[FAIL] rsync -aLv /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_m3h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_mlev_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_day_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_3ht_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_i6h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_m6h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_mi1h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_plev_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_6ht_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmpXkWng4/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_i3h_000 shakka@hpxfer1.jasmin.ac.uk:/gws/nopw/j04/polarres/ella/evaluation_run/Arcticfiles # return-code=255, stderr=
[FAIL] ssh: connect to host hpxfer1.jasmin.ac.uk port 22: Connection timed out
[FAIL] rsync: connection unexpectedly closed (0 bytes received so far) [sender]
[FAIL] rsync error: unexplained error (code 255) at io.c(228) [sender=3.2.3]
[FAIL] ! /gws/nopw/j04/polarres/ella/evaluation_run/Arcticfiles [compress=None, t(init)=2023-09-01T18:05:56Z, dt(tran)=0s, dt(arch)=130s, ret-code=255]
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_3ht_000 (ut_3ht_000)
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_6ht_000 (ut_6ht_000)
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_day_000 (ut_day_000)
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_i3h_000 (ut_i3h_000)
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_i6h_000 (ut_i6h_000)
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_m3h_000 (ut_m3h_000)
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_m6h_000 (ut_m6h_000)
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_mi1h_000 (ut_mi1h_000)
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_mlev_000 (ut_mlev_000)
[FAIL] !	MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_plev_000 (ut_plev_000)
2023-09-01T18:08:08Z CRITICAL - failed/EXIT

…just noticed we’re looking at different cycles - this is 20000101T1200Z

It appears not to be running on the login node (see the job.out file):

Suite    : u-cz478
Task Job : 20000101T1200Z/archive_files_Arctic/10 (try 10)
User@Host: shakka@nid002219     <====  this is a compute node
It should say something like:
Suite    : u-cy223
Task Job : 19991231T1200Z/archive_files_rsync/24 (try 2)
User@Host: grenvill@ln04

Did you do a rose suite-run --reload after adding method = background?
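(For anyone following along: in Cylc 7 terms, running the archiving task directly on the login node rather than submitting it to the compute nodes typically corresponds to something like the following in the suite's suite.rc/site settings - the exact place this is set in the nesting suite may differ.)

    [[archive_files_Arctic]]
        [[[job]]]
            batch system = background

After changing it, rose suite-run --reload needs to be run from the suite's working copy (typically ~/roses/u-cz478) for the change to take effect.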

Hi Grenville,

I thought I had, but maybe I was in the wrong directory. It's now running on the login node, but it is still failing to authenticate properly:

[FAIL] rsync -aLv /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_m3h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_mlev_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_day_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_3ht_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_i6h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_m6h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_mi1h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_plev_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_6ht_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_i3h_000 shakka@hpxfer1.jasmin.ac.uk:/gws/nopw/j04/polarres/ella/evaluation_run/Arcticfiles # return-code=255, stderr=
[FAIL] 
[FAIL]             Access to this system is monitored and restricted to
[FAIL]             authorised users.   If you do not have authorisation
[FAIL]             to use  this system,  you should not  proceed beyond
[FAIL]             this point and should disconnect immediately.
[FAIL] 
[FAIL]             Unauthorised use could lead to prosecution.
[FAIL] 
[FAIL]     (See also - http://www.stfc.ac.uk/aup)
[FAIL] 
[FAIL] shakka@hpxfer1.jasmin.ac.uk: Permission denied (publickey,gssapi-keyex,gssapi-with-mic).
[FAIL] rsync: connection unexpectedly closed (0 bytes received so far) [sender]
[FAIL] rsync error: unexplained error (code 255) at io.c(228) [sender=3.2.3]

Can you run the rsync on the command line?

rsync -aLv /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_m3h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_mlev_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_day_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_3ht_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_i6h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_m6h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_mi1h_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_plev_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_6ht_000 /work/n02/n02/shakka/cylc-run/u-cz478/work/20000101T1200Z/archive_files_Arctic/tmp7Ui0ft/MetUM_PolarRES_Arctic_11km_20000101T1200Z_ut_i3h_000 shakka@hpxfer1.jasmin.ac.uk:/gws/nopw/j04/polarres/ella/evaluation_run/Arcticfiles
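(If that rsync fails from the command line with the same Permission denied message, it may be worth first checking that ssh itself works non-interactively from the same login node, e.g.:)

    ssh-add -l                                   # list the keys currently held by the agent
    ssh shakka@hpxfer1.jasmin.ac.uk hostname     # should print the JASMIN host name without prompting for a passphrase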

I can run rsync successfully when I transfer the files I want directly from the directory they're in (i.e. /share/cycle/) to the directory I want on JASMIN (/gws/nopw/j04/polarres/ella/evaluation_run/Arctic - note it's not Arcticfiles), but I'm guessing there's something wrong with the renaming convention, because a) the files should be called something like MetUM_PolarRES_Arctic_11km_< cycle >_out_3ht_000 (i.e. 'out' rather than 'ut' with the 'o' missing), and b) the target directory is slightly wrong.

Hi Ella

Re missing output from the LAMs (for u-cz478)

Items output on ATM_SOIL and DSNOWTILE are from section 0 and have space code 3 (see https://code.metoffice.gov.uk/doc/um/vn13.0/papers/umdp_C04.pdf, section C.2), which says:

3 Section 0, 33 or 34 items only: primary field unavailable to STASH which is addressed in the dump
and D1. This is the case for fields which are not full horizontal fields, especially those compressed
onto land points only

I guess the suite should use STASH from section 8 instead (I note that the glm uses SOIL MOISTURE CONTENT IN A LAYER, section 8 item 223, rather than section 0 item 9).

u-cz478 doesn’t have STASH on TILES or SOIL (but maybe it’s the same problem elsewhere)

Grenville
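(For illustration, switching the soil-moisture request from the section 0 item to the section 8 item would look roughly like the entry below in the UM app's rose-app.conf - the index string and the domain/time/usage profile names are placeholders, not the suite's actual ones.)

    [namelist:umstash_streq(08223_example)]
    dom_name='DIAG'
    isec=8
    item=223
    package=''
    tim_name='T3HR'
    use_name='UPA'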

Aha! I will try and find some alternatives and see if this solves the problem. Hold that thought…

Update: I managed to get all the alternative outputs to work, apart from snow depth. I'm also missing all the clear-sky fluxes, but I'm not going to worry about those because they were an optional add-on anyway.

Snow depth seems to be calculated from snow mass anyway, so in theory I suppose I could post-process it if absolutely necessary…
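(As a rough sketch of that post-processing: depth in metres comes from snow mass in kg m-2 divided by an assumed bulk snow density in kg m-3, e.g. 100 kg m-2 at an assumed 250 kg m-3 gives about 0.4 m - though the density value would need care, since it varies a lot between fresh and compacted snow.)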