# PyPRECIS

PyPRECIS is the Python-based training environment for Met Office PRECIS training courses.

## Overview

PyPRECIS is principally designed as a learning tool to facilitate the processing of regional climate model (RCM) output. It is designed to be used in conjunction with taught workshops in an instructor-led environment. The name PyPRECIS is a reference to the initial version of these notebooks, which were designed for analysis of data from the PRECIS model; the training is now designed to be more general.

PyPRECIS is built on Jupyter Notebooks, with data processing performed in Python, making use of Iris. A conda environment is provided to install these packages, along with their dependencies. A guide containing instructions on how to install the conda environment can be found here.
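The environment file shipped with the repository is authoritative; purely as an illustration, a minimal environment along these lines (the name and package pins below are assumptions, not the project's actual list) would provide the core stack:

```yaml
# Hypothetical sketch of an environment.yml for the notebooks;
# consult the file in the repository for the real package list.
name: pyprecis-environment
channels:
  - conda-forge
dependencies:
  - python=3.8
  - iris
  - jupyter
  - numpy
  - matplotlib
```

Such a file would be installed with `conda env create -f environment.yml` and activated with `conda activate pyprecis-environment`.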

The data analysed in the first set of notebooks is from the CORDEX-Core simulations, which provide an ensemble of high-resolution (at least 25 km) regional climate change information. Further information about CORDEX-Core can be found on the CORDEX website, and a special issue of Climate Dynamics gives more information about this data. There is also a set of notebooks which analyse the 20CR-DS data set covering China.

## Contents

The teaching elements of PyPRECIS are contained in the notebooks directory. The primary worksheets are:

| Worksheet | Aims |
| --- | --- |
| 1 | • Identify and list the names of CORDEX output data in netCDF format using standard Linux commands<br>• Use basic Iris commands to load data files, and view Iris cubes<br>• Use Iris commands to merge netCDF files<br>• Take a subset of the data based on a date range<br>• Save the output as netCDF files |
| 2 | • Apply basic statistical operations to Iris cubes<br>• Plot information from Iris cubes |
| 3 | • Extract specific regions of interest from large datasets<br>• Apply more advanced statistical operations to multi-annual data<br>• Produce your own data processing workflow |
| 4 | • Calculate differences and percentage differences across cubes<br>• Plot cubes using different plotting methods and with an appropriate colour scale<br>• Create time series anomalies of precipitation and temperature |
| 5 | • Have an appreciation for working with daily model data<br>• Understand how to calculate some useful climate extremes statistics<br>• Be aware of some coding strategies for dealing with large data sets |
| 6 | An extended coding exercise designed to allow you to put everything you've learned into practice |

Additional tutorials specific to the CSSP 20th Century reanalysis dataset:

| Worksheet | Aims |
| --- | --- |
| CSSP 1 | • How to use a cloud-based platform to analyse the 20CR-DS dataset<br>• Setting up a Python environment |
| CSSP 2 | • How to load data into xarray format<br>• How to convert the data from xarray into Iris cube format<br>• How to perform basic cube operations |
| CSSP 3 | • Calculate and visualise annual and monthly means<br>• Calculate and visualise seasonal means<br>• Calculate mean differences (anomalies) |
| CSSP 4 | • Calculate the frequency of wet days<br>• Calculate percentiles<br>• Calculate some useful climate extremes statistics |
Three additional worksheets are available for use by workshop instructors:

- `makedata.ipynb`: Provides scripts for preparing raw model output for use in notebook exercises.
- `worksheet_solutions.ipynb`: Solutions to worksheet exercises.
- `worksheet6example.ipynb`: Example code for Worksheet 6.

## Data

For information on how to access the CORDEX-Core data used in these worksheets, see: CORDEX: How to access the data. Most CORDEX data is available for unrestricted use, but some is provided for non-commercial use only. Before you download any CORDEX data, you must ensure you are aware of the Terms of Use for the data you are accessing.

Data relating to the CSSP 20CRDS tutorials is held online in an Azure Blob Storage Service. To access this data, users will need a valid shared access signature (SAS) token. The data is in Zarr format and the total volume is ~2 TB. The data is stored at hourly, 3-hourly, 6-hourly, daily and monthly frequencies, held separately under the metoffice-20cr-ds container on MS-Azure. Monthly data only is also available via Zenodo.

## Contributing

Information on how to contribute can be found in the Contributing guide. Please also consult CONTRIBUTING.ipynb for information on formatting the worksheets in Jupyter Notebooks. Note that we do not currently make use of Jupyter Lab, as it does not support the types of HTML formatting we use in Jupyter Notebooks.

## Licence

PyPRECIS is licensed under the BSD 3-Clause licence for use outside of the Met Office.

Met Office
© British Crown Copyright 2018 - 2022, Met Office