Problem with conversion to pp for UM timeseries output

I am running the regional nesting suite on Monsoon and have set it up to output timeseries information at a point (so I can better compare to observations). It runs fine and outputs the timeseries to a fields file, but the conversion to pp fails, so I have to archive the fields file rather than a pp file. This means that I lose some of the metadata, which makes analysis tricky, as I hope to run the nesting suite for a whole year.

If I run mule-convpp on one of the fields files, I get the following error: "skipping field validation due to irregular lbcode", and the resulting pp file just includes the first hour of the first timeseries variable.
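In case it helps pin down which fields the converter is refusing, something along these lines (just a rough sketch, assuming the mule python module is available in your environment; the file path is only a placeholder) will list the lbcode and STASH code of every field in a fields file:

import mule

# Sketch: list the lbcode and STASH code (lbuser4) of each field, so the
# fields carrying the "irregular" (timeseries) lbcode can be identified.
ff = mule.FieldsFile.from_file("path/to/fields_file")
for i, field in enumerate(ff.fields):
    print(i, "stash:", field.lbuser4, "lbcode:", field.lbcode)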

My suite id is u-dg348 and the file where I request the time series is here: /home/d03/nahav/roses/u-dg348/app/um/opt/rose-app-stashpack6.conf

Any ideas on a way round this are most welcome!

Natalie

Please point to a file that doesn't convert.

Grenville

Sorry - here’s an example:

/home/d03/nahav/cylc-run/u-dg348/share/cycle/20230612T1800Z/UK/ukv_ITE/RAL3P2_MURK_MORUSES/um/umnsaa_roissy000

Hi Natalie

um-convpp seemed to work

$UMDIR/vn11.2/xc40/utilities/um-convpp umnsaa_roissy000 ~/umnsaa_roissy000.pp
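For what it's worth, if you do end up converting a whole year of these by hand, a loop along these lines (only a sketch; the glob pattern and output naming are illustrative and would need adjusting to your cycle directories) would run that utility over each fields file:

import glob
import os
import subprocess

# Sketch: convert each timeseries fields file to pp with um-convpp.
convpp = os.path.join(os.environ["UMDIR"], "vn11.2", "xc40", "utilities", "um-convpp")
for fields_file in sorted(glob.glob("umnsaa_roissy*")):
    if fields_file.endswith(".pp"):
        continue  # skip anything already converted
    subprocess.run([convpp, fields_file, fields_file + ".pp"], check=True)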

Any good?

Thanks Grenville.

When you run the suite it seems to do the conversion and create a pp file, but when you look in it using xconv it only has the first hour's worth of timesteps for the first timeseries variable.

I had to dig quite deep in the log files to find the error message "skipping field validation due to irregular lbcode".

Also, the archiving to MASS doesn't like this converted pp file.

If your converted pp file does have all the timesteps and variables in it then that is great! I guess I will just need to make sure I use the same version of um-convpp as you.

Natalie

Well, xconv shows all the time steps, and the variables seem OK (when converted to netCDF to inspect).
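(If it's useful, a quick way to list what ended up in the converted netCDF, assuming the netCDF4 python module is to hand, is something like the following; the file name is just an example.)

from netCDF4 import Dataset

# Sketch: print each variable with its dimensions and shape.
ds = Dataset("umnsaa_roissy000.nc")
for name, var in ds.variables.items():
    print(name, var.dimensions, var.shape)
ds.close()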

Well, that is good news! Let me see if I can point the nesting suite at that version of um-convpp.

I now have pp files which have all the variables and time steps in - thank you!

Unfortunately, iris won't load these files. I get "ValueError: Unknown IB value for extra data: 0".
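In case it helps whoever looks at this, a rough way to see which field iris trips over (just an untested sketch using the low-level pp reader rather than iris.load; the file name is an example) is:

import iris.fileformats.pp as pp

# Sketch: step through the pp fields one at a time and report where the
# "Unknown IB value for extra data" error is raised.
fields = pp.load("umnsaa_roissy000.pp")
index = 0
while True:
    try:
        field = next(fields)
    except StopIteration:
        break
    except ValueError as err:
        # the exception ends the generator, so only the first failure shows up
        print("field", index, "raised:", err)
        break
    print(index, field.stash, field.lbcode)
    index += 1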

If I convert the pp to netCDF using /projects/um1/Xconv/xconv1.94 then iris will load the netCDF fine. When I call a convsh script to convert the pp files from umpp, I get the following error:

Error can only extract data with dimensions 1 4
Requested dimensions are 1 2

Error in writefile
    while executing
"writefile $outformat $outfile $fieldlist"
    ("foreach" body line 10)
    invoked from within
"foreach infile $argv {

# Replace input file extension with .nc to get output filename

   set outfile [file tail [file rootname $infile].nc]

# Read…"
    (file "/home/d03/nahav/roses/u-dg348/bin/pp_to_nc_timeseries.tcl" line 22)

I am using /projects/um1/Xconv/convsh/. Am I just using the wrong version, as manually converting in xconv seems fine?

Thanks for your help in advance!

Natalie

I have found a version of convsh that now works.

Thanks for your help with this.

Natalie

Glad you have got this working - between us, we should alert the IRIS team that it doesn’t appear to handle lbcode properly.

Happy to contact the iris team and report this. Do you have an email address for them?

Natalie