UKESM1.1 AMIP run -- u-dp730

Hi,
I have made a copy of u-dp730 (UKESM1.1 AMIP) @ 13.8 as u-dr157. I’ve set the owner to simontett, the account to n02-TERRAFIRMA, the site to ARCHER2 and the queue to Standard. When I try to run it I get an error, KeyError: ‘suite.rc’, with output (output#1) shown below. One thing that strikes me is that CYLC_VERSION is 7.8.12, but I think this job is cylc 8. So what should I do?
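
A check I can run on puma2 (a sketch – assuming, as I understand from the CMS cylc 8 pages, that the cylc wrapper there selects the version from $CYLC_VERSION, and that cylc 8 workflows have a flow.cylc rather than a suite.rc):

    # does the copy look like cylc 7 (suite.rc) or cylc 8 (flow.cylc)?
    ls ~/roses/u-dr157/suite.rc ~/roses/u-dr157/flow.cylc 2>/dev/null

    # force cylc 8 for this shell before validating
    export CYLC_VERSION=8
    cylc validate ~/roses/u-dr157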

I also tried UM13.7 (cylc 7, I think) via u-dl478, with a local copy u-dr159. I set the site to ARCHER2, the queue to standard, the user name to tetts, the owner to simontett and the account to n02-TERRAFIRMA. When I submit this one it fails with a timeout error from the ssh to ARCHER2… See output#2 for the log from this attempt. I’ve already tried deleting the ssh environment, thereby forcing a restart of my ssh-agent.
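
For the record, roughly what I did to reset the agent (a sketch – the key file name is a placeholder; the last line reproduces the non-interactive test rose uses, per output#2):

    eval "$(ssh-agent -s)"          # restart the agent
    ssh-add ~/.ssh/id_rsa_archer2   # placeholder: whatever your ARCHER2 key is called
    ssh -oBatchMode=yes -oConnectTimeout=8 login.archer2.ac.uk true   # should exit 0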

Simon

output#1 ====================================

[INFO] 2025-06-30T13:21:30+0100 cylc get-global-config -i [hosts][localhost]run\ directory
[INFO] 2025-06-30T13:21:31+0100 cylc get-global-config -i [hosts][localhost]work\ directory
[INFO] 2025-06-30T13:21:31+0100 Configuration: /home4/home/n02-puma/tetts/roses/u-dr157/
[INFO] 2025-06-30T13:21:31+0100 file: rose-suite.conf
[INFO] 2025-06-30T13:21:31+0100 cylc --version
[INFO] 2025-06-30T13:21:31+0100 export CYLC_VERSION=7.8.12
[INFO] 2025-06-30T13:21:31+0100 export ROSE_ORIG_HOST=puma2.archer2.ac.uk
[INFO] 2025-06-30T13:21:31+0100 export ROSE_SITE=
[INFO] 2025-06-30T13:21:31+0100 export ROSE_VERSION=2019.01.3
[INFO] 2025-06-30T13:21:31+0100 create: log.20250630T122131Z
[INFO] 2025-06-30T13:21:31+0100 delete: log
[INFO] 2025-06-30T13:21:31+0100 symlink: log.20250630T122131Z <= log
[INFO] 2025-06-30T13:21:31+0100 tar -czf log.20250630T114826Z.tar.gz log.20250630T114826Z
[INFO] 2025-06-30T13:21:31+0100 log.20250630T114826Z.tar.gz <= log.20250630T114826Z
[INFO] 2025-06-30T13:21:31+0100 delete: log.20250630T114826Z/
[INFO] 2025-06-30T13:21:31+0100 create: log/suite
[INFO] 2025-06-30T13:21:31+0100 create: log/rose-conf
[INFO] 2025-06-30T13:21:31+0100 svn info --non-interactive
[INFO] 2025-06-30T13:21:31+0100 svn status --non-interactive
[INFO] 2025-06-30T13:21:31+0100 svn diff --internal-diff --non-interactive
[INFO] 2025-06-30T13:21:32+0100 git describe
[INFO] 2025-06-30T13:21:32+0100 symlink: rose-conf/20250630T132131-run.conf <= log/rose-suite-run.conf
[INFO] 2025-06-30T13:21:32+0100 symlink: rose-conf/20250630T132131-run.version <= log/rose-suite-run.version
[INFO] 2025-06-30T13:21:32+0100 fcm info --xml fcm:um.xm_tr/rose-stem/ana/mule_cumf.py@vn13.6

output#2 ====================================
[INFO] 2025-06-30T13:38:28+0100 chdir: /home/n02/n02/tetts/roses/u-dr159/
[INFO] 2025-06-30T13:38:28+0100 Configuration: /home4/home/n02-puma/tetts/roses/u-dr159/
[INFO] 2025-06-30T13:38:28+0100 file: rose-suite.conf
[INFO] 2025-06-30T13:38:28+0100 export CYLC_VERSION=7.8.12
[INFO] 2025-06-30T13:38:28+0100 export ROSE_ORIG_HOST=puma2.archer2.ac.uk
[INFO] 2025-06-30T13:38:28+0100 export ROSE_SITE=
[INFO] 2025-06-30T13:38:28+0100 export ROSE_VERSION=2019.01.3
[INFO] 2025-06-30T13:38:28+0100 create: log.20250630T123828Z
[INFO] 2025-06-30T13:38:28+0100 delete: log
[INFO] 2025-06-30T13:38:28+0100 symlink: log.20250630T123828Z <= log
[INFO] 2025-06-30T13:38:28+0100 tar -czf log.20250630T123040Z.tar.gz log.20250630T123040Z
[INFO] 2025-06-30T13:38:28+0100 log.20250630T123040Z.tar.gz <= log.20250630T123040Z
[INFO] 2025-06-30T13:38:28+0100 delete: log.20250630T123040Z/
[INFO] 2025-06-30T13:38:28+0100 create: log/suite
[INFO] 2025-06-30T13:38:28+0100 create: log/rose-conf
[INFO] 2025-06-30T13:38:28+0100 svn info --non-interactive
[INFO] 2025-06-30T13:38:28+0100 svn status --non-interactive
[INFO] 2025-06-30T13:38:28+0100 svn diff --internal-diff --non-interactive
[INFO] 2025-06-30T13:38:28+0100 git describe
[INFO] 2025-06-30T13:38:28+0100 symlink: rose-conf/20250630T133828-run.conf <= log/rose-suite-run.conf
[INFO] 2025-06-30T13:38:28+0100 symlink: rose-conf/20250630T133828-run.version <= log/rose-suite-run.version
[INFO] 2025-06-30T13:38:28+0100 fcm info --xml fcm:um.xm_tr/rose-stem/ana/mule_cumf.py@vn11.4
[INFO] 2025-06-30T13:38:29+0100 unchanged: ana/mule_cumf.py
[INFO] 2025-06-30T13:38:29+0100 source: svn://puma2.archer2.ac.uk/um.xm/main/trunk/rose-stem/ana/mule_cumf.py@71790 (fcm:um.xm_tr/rose-stem/ana/mule_cumf.py@vn11.4)
[INFO] 2025-06-30T13:38:29+0100 unchanged: app
[INFO] 2025-06-30T13:38:29+0100 source: /home4/home/n02-puma/tetts/roses/u-dr159/app
[INFO] 2025-06-30T13:38:29+0100 unchanged: bin
[INFO] 2025-06-30T13:38:29+0100 source: /home4/home/n02-puma/tetts/roses/u-dr159/bin
[INFO] 2025-06-30T13:38:29+0100 unchanged: meta
[INFO] 2025-06-30T13:38:29+0100 source: /home4/home/n02-puma/tetts/roses/u-dr159/meta
[INFO] 2025-06-30T13:38:29+0100 unchanged: rose-suite.info
[INFO] 2025-06-30T13:38:29+0100 source: /home4/home/n02-puma/tetts/roses/u-dr159/rose-suite.info
[INFO] 2025-06-30T13:38:29+0100 unchanged: site
[INFO] 2025-06-30T13:38:29+0100 source: /home4/home/n02-puma/tetts/roses/u-dr159/site
[INFO] 2025-06-30T13:38:29+0100 unchanged: suite-tests-graph.rc
[INFO] 2025-06-30T13:38:29+0100 source: /home4/home/n02-puma/tetts/roses/u-dr159/suite-tests-graph.rc
[INFO] 2025-06-30T13:38:29+0100 unchanged: suite-tests-runtime.rc
[INFO] 2025-06-30T13:38:29+0100 source: /home4/home/n02-puma/tetts/roses/u-dr159/suite-tests-runtime.rc
[INFO] 2025-06-30T13:38:32+0100 ssh -oBatchMode=yes -oStrictHostKeyChecking=no -oConnectTimeout=8 -n login.archer2.ac.uk env\ ROSE_VERSION=2019.01.3\ CYLC_VERSION=7.8.12\ bash\ -l\ -c\ '"$0"\ "$@"'\ rose\ suite-run\ -vv\ -n\ u-dr159\ --run=run\ --remote=uuid=13ae56d2-257c-4e2f-8eb9-0be07d253328,now-str=20250630T123828Z

Simon

u-dr159 needs

        [[[remote]]]
            host = $(rose host-select archer2)

(not what it currently has)
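
You can check what that resolves to directly on puma2 (assuming rose is on your PATH there):

    rose host-select archer2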

For cylc 8 jobs, please see Cylc 8 on PUMA2 and ARCHER2.

Grenville

Hi Grenville,
for the 13.8/cylc 8 workflow (u-dr157) I got it running :-), but then it failed with a bunch of errors suggesting it cannot connect to the ARCHER2 login nodes.
Simon

2025-06-30T14:23:08Z INFO - platform: ln02 - remote init (on ln02)
2025-06-30T14:23:09Z WARNING - platform: ln02 - Could not connect to ln02.
* ln02 has been added to the list of unreachable hosts
* remote-init will retry if another host is available.
2025-06-30T14:23:09Z INFO - platform: ln04 - remote init (on ln04)
2025-06-30T14:23:09Z INFO - [19790101T0000Z/fcm_make_pp_archive_host/01:preparing] submitted to localhost:background[1510991]
2025-06-30T14:23:10Z INFO - [19790101T0000Z/fcm_make_pp_archive_host/01:preparing] => submitted
2025-06-30T14:23:10Z WARNING - platform: ln04 - Could not connect to ln04.
* ln04 has been added to the list of unreachable hosts
* remote-init will retry if another host is available.
2025-06-30T14:23:10Z INFO - platform: ln03 - remote init (on ln03)
2025-06-30T14:23:11Z WARNING - platform: ln03 - Could not connect to ln03.
* ln03 has been added to the list of unreachable hosts
* remote-init will retry if another host is available.
2025-06-30T14:23:11Z INFO - platform: ln01 - remote init (on ln01)
2025-06-30T14:23:12Z WARNING - platform: ln01 - Could not connect to ln01.
* ln01 has been added to the list of unreachable hosts
* remote-init will retry if another host is available.
2025-06-30T14:23:12Z ERROR - [jobs-submit cmd] (remote init)
[jobs-submit ret_code] 1
2025-06-30T14:23:12Z ERROR - [19790101T0000Z/remote_setup/01:preparing] submission failed

Simon

I think ln04 is being used for OS upgrade testing - I have had one or two fails there, but a retrigger worked.

What do you get on puma2 when you type
ssh ln01 (or ln02, ln03)?
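
A test closer to what cylc actually does, since it connects non-interactively (a sketch, using the same BatchMode option that appears in output#2):

    ssh -oBatchMode=yes ln01 true   # repeat for ln02, ln03, ln04
    echo $?                         # 0 means the passwordless hop works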

Applied that correction in site/archer2.rc and things get submitted, so partial success. I then got failures in fcm_make_um & fcm_make_pp, with fails in make extract. I turned off all the testing (I don’t need that, right?), tried again, and got the same errors.
Output from fcm_make_um below.

Simon

Suite : u-dr159
Task Job : 19790101T0000Z/fcm_make_um/01 (try 1)
User@Host: tetts@puma2.archer2.ac.uk

2025-06-30T15:10:15Z INFO - started
[INFO] Configuration: /home/n02/n02/tetts/cylc-run/u-dr159/app/fcm_make_um/
[INFO] file: rose-app.conf
[INFO] optional key: archer2
[INFO] optional key: (archer2)
[INFO] export COUPLER=none
[INFO] export DR_HOOK=false
[INFO] export PATH=/home4/home/n02-puma/fcm/metomi/rose-2019.01.3/bin:/home/n02/n02/tetts/cylc-run/u-dr159/bin:/home/n02/n02/tetts/cylc-run/u-dr159/bin:/home/n02/n02/tetts/cylc-run/u-dr159:/home4/home/n02-puma/fcm/metomi/cylc-7.8.12/bin:/home/n02/n02/tetts/cylc-run/u-dr159/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/home/n02/n02/p2local/bin:/home/n02/n02/fcm/metomi/bin:/home/n02/n02/p2local/bin:/home/n02/n02/fcm/metomi/bin
[INFO] export casim_rev=um13.7
[INFO] export casim_sources=
[INFO] export compile_atmos=preprocess-atmos\ build-atmos
[INFO] export compile_recon=preprocess-recon\ build-recon
[INFO] export config_revision=
[INFO] export config_root_path=fcm:um.xm_br/dev/simonwilson/vn12.1_archer2_compile
[INFO] export config_type=atmos
[INFO] export eccodes=false
[INFO] export extract=extract
[INFO] export fcflags_overrides=
[INFO] export gwd_ussp_precision=double
[INFO] export jules_rev=um13.7
[INFO] export jules_sources=
[INFO] export land_surface_model=jules
[INFO] export ldflags_overrides_prefix=
[INFO] export ldflags_overrides_suffix=
[INFO] export ls_precipitation_precision=double
[INFO] export mirror=mirror
[INFO] export mpp_version=1C
[INFO] export netcdf=true
[INFO] export openmp=true
[INFO] export optimisation_level=safe
[INFO] export platagnostic=false
[INFO] export platform_config_dir=ncas-ex-cce
[INFO] export portio_version=2A
[INFO] export prebuild=
[INFO] export shumlib_rev=um13.7
[INFO] export shumlib_sources=
[INFO] export socrates_rev=um13.7
[INFO] export socrates_sources=
[INFO] export stash_version=1A
[INFO] export thread_utils=false
[INFO] export timer_version=3A
[INFO] export ukca_rev=um13.7
[INFO] export ukca_sources=
[INFO] export um_rev=vn13.7
[INFO] export um_sources=
[INFO] unchanged: fcm-make.cfg
[INFO] source: /home/n02/n02/tetts/cylc-run/u-dr159/app/fcm_make_um/file/fcm-make.cfg
[INFO] export ROSE_TASK_MIRROR_TARGET=ln03:cylc-run/u-dr159/share/fcm_make_um
[INFO] export MIRROR_TARGET=ln03:cylc-run/u-dr159/share/fcm_make_um
[init] make # 2025-06-30T15:10:19Z
[info] FCM 2021.05.0 (/home4/home/n02-puma/fcm/metomi/fcm-2021.05.0)
[init] make config-parse # 2025-06-30T15:10:19Z
[info] config-file=/home4/home/n02-puma/tetts/cylc-run/u-dr159/work/19790101T0000Z/fcm_make_um/fcm-make.cfg
[info] config-file= - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/ncas-ex-cce/um-atmos-safe.cfg@130172
[info] config-file= - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/um-atmos-common.cfg@130172
[info] config-file= - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/ncas-ex-cce/inc/parallel.cfg@130172
[info] config-file= - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/ncas-ex-cce/inc/external_paths.cfg@130172
[info] config-file= - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/common.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/coupler/none.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/stash/1A.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/portio/2A.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/mpp/1C.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/timer/3A.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/land_surface/jules.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/ls_precip/double.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/ussp/double.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/recon_mpi/parallel.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/drhook/false.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/openmp/true.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/mkl/false.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/eccodes/false.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/netcdf/true.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/thread_utils/false.cfg@130172
[info] config-file= - - - - - svn://puma2.archer2.ac.uk/um.xm/main/branches/dev/simonwilson/vn12.1_archer2_compile/fcm-make/inc/options/platagnostic/false.cfg@130172
[info] config-file=/tmp/KRmATu-fcm-make-args.cfg
[done] make config-parse # 3.1s
[init] make dest-init # 2025-06-30T15:10:22Z
[info] dest=tetts@puma2.archer2.ac.uk:/home/n02/n02/tetts/cylc-run/u-dr159/share/fcm_make_um
[info] mode=incremental
[done] make dest-init # 0.0s
[init] make extract # 2025-06-30T15:10:22Z
[FAIL] make extract # 0.0s
[FAIL] make # 3.2s

For u-dr159 the error is [FAIL] ukca: name-spaces declared but not used – has this job run before?

It is apparently the UKESM1.1 AMIP version… Luke Abrahams pointed me at the standard jobs:
https://code.metoffice.gov.uk/trac/UKESM/wiki/UKESM1.1StandardJobs. I need 13.7+ as I want to be able to modify UKCA aerosol parameters.
Simon

u-dr159 is using a vn12.1 config branch – that won’t work with UM13.7. Delete /home/n02/n02/tetts/roses/u-dr159/app/fcm_make_um/opt/rose-app-archer2.conf

(I’m not sure what else won’t be right - maybe refer the suite to Luke?)

Hi Grenville,
Don’t I need something else here?
config_root_path=fcm:um.xm_br/dev/simonwilson/vn13.7_archer2_compile

???
Though, running grep vn on the whole suite gives all kinds of versions… The ones that look a bit worrying are:
/home/n02/n02/tetts/roses/u-dr159/suite.rc: UM_VN = 11.4
/home/n02/n02/tetts/roses/u-dr159/rose-suite.conf:source=fcm:um.xm_tr/rose-stem/ana/mule_cumf.py@vn11.4
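
(For reference, the sort of grep I ran – a sketch:)

    cd /home/n02/n02/tetts/roses/u-dr159
    grep -rn vn . | grep -v '\.svn'   # crude: lists every pinned version string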

Simon

On further reflection, you should abandon u-dr159. Try a copy of u-dp730 or u-do575 (see bottom table in https://code.metoffice.gov.uk/trac/UKESM/wiki/UKESM1.1StandardJobs) - they may work straight off (I’m trying u-dp730, having never run it before)

I am (in parallel) trying u-dp730 – though cylc 8 requires some more work to understand how it works, and might make other things difficult for me :( My job there is u-dr157, and you were right about the login nodes just being a temporary glitch. The job has been failing in fcm_make_pp_archive_host and seems to be running tests. I’ve turned off archiving and testing for now and am waiting for it to stop. Is there a “die now” option in cylc version 8…
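
(From my reading of the cylc 8 docs – so treat this as a sketch – the stop options look like:)

    cylc stop u-dr157          # wait for active tasks to finish, then shut down
    cylc stop --kill u-dr157   # kill active jobs first, then shut down
    cylc stop --now u-dr157    # shut down at once, leaving active jobs running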

Once I’ve got the job working I will probably turn archiving back on… Hopefully, by then you will have a fix for archiving. It seems to be complaining about make mirror, if that helps!

Simon

Looks like the model has run 3 months (??). I expected to find data in share/data/History_Data, but a.pm*pm data seems to be in /work/n02/n02/tetts/cylc-run/u-dr157/run1/share/cycle/19790101T0000Z. Is that a change from cylc 7??

Looking at the graph for the run I see that fcm_make2_pp_archive_host depends on fcm_make_pp_archive_host (which failed) but nothing seems to depend on fcm_make2_pp_archive_host. Should I worry about that?
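
(One way I’ve found to poke at task dependencies from the command line – a sketch, assuming cylc show behaves as documented:)

    # prints the prerequisites and outputs of that task instance
    cylc show u-dr157//19790101T0000Z/fcm_make2_pp_archive_host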

I am getting failures from pp_transfer where it tries to use globus. I imagine that is because I’ve not done any globus setup…

[WARN] file:atmospp.nl: skip missing optional source: namelist:script_arch
[WARN] [SUBPROCESS]: Command: globus transfer --format unix --jmespath task_id --recursive --fail-on-quota-errors --sync-level checksum --label u-dr157/19790101T0000Z --verify-checksum --notify off 3e90d018-0d05-461a-bbaf-aab605283d21:/work/n02/n02/tetts/cylc-run/u-dr157/run1/share/cycle/19790101T0000Z a2f53b7f-1b4e-4dce-9b7c-349ae760fee0:/work/xfc/vol5/user_cache/tetts/archive/u-dr157/19790101T0000Z
[SUBPROCESS]: Error = 4:
MissingLoginError: Missing login for Globus Transfer.
Please run:

globus login

I tried globus login on an ARCHER2 login node and on puma – both failed with a globus not found error. I assume I need to follow the instructions at Globus - ARCHER2 User Documentation. Is there anything else I should look at? I suspect documentation on the archive script would be useful too…
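
(My guess at the recipe from those instructions – a sketch:)

    module load globus-cli   # on an ARCHER2 login node
    globus login             # starts a browser-based authentication flow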

I am also getting errors about mail – how do I stop the workflow trying to email me when things go wrong…

I thought I had turned off archiving and testing, but the test cases all ran even though I had turned them off… I think it is because I did cylc play rather than cylc vip. Correct?
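
(My understanding of the difference, from the cylc 8 docs – a sketch:)

    cylc play u-dr157          # restarts the installed run; edits in ~/roses are NOT picked up
    cylc vip ~/roses/u-dr157   # validate + install + play: re-installs from source first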

Simon

P.s. I’m trying to run the ssh instructions at the bottom of the cylc 8 documentation. It hangs when I try to connect from my desktop Windows machine using PowerShell…
(base) PS M:> ssh -N -L 04076:localhost:04076 tetts@login.archer2.ac.uk
Enter passphrase for key 'C:\Users\stett2/.ssh/id_rsa_archer2':
(tetts@login.archer2.ac.uk) Expecting a code for login.archer2.ac.uk:tetts

TOTP code: 932067

Hi Simon,

If you want to transfer data to JASMIN (i.e. run pptransfer) you will need to set up Globus first.

fcm_make_pp_archive_host is the task that installs the scripts for the JDMA task on JASMIN. If you wish to archive your data to JASMIN Elastic Tape you will need to make sure your environment is set up to access JASMIN, as per the instructions here: Migration of UM Data to JASMIN Elastic Tape. Whilst you are setting up/testing the suite I would suggest turning off the JDMA task.

You can ignore the warning/error messages you see from mail. They do not affect the running of the suite.

cylc vr does the equivalent of a cylc 7 cylc reload.
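
So, to pick up changes made to a running suite, something like (a sketch):

    cylc vr u-dr157   # validate against source, reinstall, then reload/restart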

You might find this cylc7 to cylc8 cheat sheet useful: Cheat Sheet — Cylc 8.4.3 documentation

Regards,
Ros.

P.s. Regarding the data question - it’s the same as for cylc7. When the model is running data is put into the share/data/History_Data directory. Postproc then “archives” it. In this case to the share/cycle/<cyclepoint> directory ready for transfer to JASMIN.

Hi Ros, I don’t think I want to automatically archive to tape – just transfer the data across for subsequent processing and visualisation. Setting up globus now!
Simon

Globus was going fine until I tried step #5, "Run CLI-based check".

On a clean ARCHER2 session:

tetts@ln01:~> module load globus-cli
tetts@ln01:~> globus session show
For information on your primary identity or full identity set see
globus whoami

Username                     ID                                    Auth Time
tetts@accounts.jasmin.ac.uk  0b19c31d-bdc8-49fb-8b23-a9a6f707b4ff  2025-07-01 11:26 BST
stett2@ed.ac.uk              5c4dcee1-2ece-4a42-9e6c-9fb2c08dc9c4  2025-07-01 11:29 BST
tetts@safe.archer2.ac.uk     9fb892d1-f77e-4a99-a04b-01695186fa3a  2025-07-01 11:30 BST
tetts@ln01:~> globus ls 3e90d018-0d05-461a-bbaf-aab605283d21:/~/
The resource you are trying to access requires you to re-authenticate.
message: Missing required data_access consent

Please use "globus session update" to re-authenticate with specific identities.

From the session show I think I am authenticated on all accounts. I’ve tried --all, and also stett2@ed.ac.uk and tetts@safe.archer2.ac.uk explicitly, and got the same results…

Simon

Hi Simon,

This is a bug in globus which has been fixed in a newer version of the CLI. I have now been told this is installed on ARCHER2, but it is not the default.

Please try module load globus-cli/3.35.2

If that solves the problem please let me know and I will update the instructions and postproc module accordingly.

Cheers,
Ros.

That worked… See below. The globus CLI does not work on JASMIN, though; I think I need to create a Python virtual env and install it there. See JASMIN Help Site - Globus Command-Line Interface. Is there a module load for it? Or some standard NCAS Python stuff…
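
(In case I do need it there, the sort of thing that JASMIN page describes – a sketch, with the venv path my own choice:)

    # on a JASMIN sci machine
    python3 -m venv ~/globus-cli-venv     # path is arbitrary
    source ~/globus-cli-venv/bin/activate
    pip install globus-cli
    globus login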

tetts@ln01:~> globus session show
For information on your primary identity or full identity set see
globus whoami

Username                     ID                                    Auth Time
tetts@accounts.jasmin.ac.uk  0b19c31d-bdc8-49fb-8b23-a9a6f707b4ff  2025-07-01 11:26 BST
stett2@ed.ac.uk              5c4dcee1-2ece-4a42-9e6c-9fb2c08dc9c4  2025-07-01 11:29 BST
tetts@safe.archer2.ac.uk     9fb892d1-f77e-4a99-a04b-01695186fa3a  2025-07-01 11:30 BST
tetts@ln01:~> globus whoami --linked-identities
For information on which identities are in session see
globus session show

stett2@ed.ac.uk
tetts@safe.archer2.ac.uk
tetts@accounts.jasmin.ac.uk
tetts@ln01:~> globus ls 3e90d018-0d05-461a-bbaf-aab605283d21:/~/
The collection you are trying to access data on requires you to grant consent for the Globus CLI to access it.

Please run:

globus session consent 'urn:globus:auth:scope:transfer.api.globus.org:all[*https://auth.globus.org/scopes/3e90d018-0d05-461a-bbaf-aab605283d21/data_access]'

to login with the required scopes.
tetts@ln01:~> globus session consent 'urn:globus:auth:scope:transfer.api.globus.org:all[*https://auth.globus.org/scopes/3e90d018-0d05-461a-bbaf-aab605283d21/data_access]'
Please authenticate with Globus here:

Log In using Globus

Enter the resulting Authorization Code here: JMx73LxGS0iD3IZ95Lg5bdf7XBycwL

You have successfully updated your CLI session.

tetts@ln01:~> globus ls 3e90d018-0d05-461a-bbaf-aab605283d21:/~/

Hi Simon,

Glad that fixed the problem.

For transfer of data from ARCHER2 to JASMIN within a suite, you don’t run globus on JASMIN – those instructions only involve running it on ARCHER2.

Cheers,
Ros

Thanks. Did the last bit – made the symlink.
How do I control the archiver? I know where I want the data to go on JASMIN, and I don’t want to put it on tape.
And do I need to change the archiver following the UM Post-processing App? See Pp_transfer and cylc 8, where I’ve got a problem.