{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Finch usage\n", "\n", "Finch is a WPS server for climate indicators, but also has a few utilities to facilitate data handling. To get started, first instantiate the client. Here, the client will try to connect to a local or remote finch instance, depending on whether the environment variable `WPS_URL` is defined." ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "import os\n", "\n", "import xarray as xr\n", "from birdy import WPSClient\n", "\n", "pavics_url = \"https://pavics.ouranos.ca/twitcher/ows/proxy/finch/wps\"\n", "url = os.environ.get(\"WPS_URL\", pavics_url)\n", "verify_ssl = True if \"DISABLE_VERIFY_SSL\" not in os.environ else False\n", "wps = WPSClient(url, verify=verify_ssl)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "The list of available processes can be displayed using the magic ? command (`wps?`). Similarly, help about any individual process is available using ? or the `help` command." ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Help on method frost_days in module birdy.client.base:\n", "\n", "frost_days(tasmin=None, thresh='0 degC', freq='YS', month=None, season=None, check_missing='any', missing_options=None, cf_compliance='warn', data_validation='raise', variable=None, output_name=None, output_format='netcdf', csv_precision=None, output_formats=None) method of birdy.client.base.WPSClient instance\n", " Number of days where the daily minimum temperature is below a given threshold.\n", " \n", " Parameters\n", " ----------\n", " tasmin : ComplexData:mimetype:`application/x-netcdf`, :mimetype:`application/x-ogc-dods`\n", " NetCDF Files or archive (tar/zip) containing netCDF files. 
Minimum surface temperature.\n", " thresh : string\n", " Freezing temperature.\n", " freq : {'YS', 'MS', 'QS-DEC', 'AS-JUL'}string\n", " Resampling frequency.\n", " month : {'1', '2', '3', '4', '5', '6', '7', '8', '9', '10', ...}integer\n", " Months of the year over which to compute indicator.\n", " season : {'DJF', 'MAM', 'JJA', 'SON'}string\n", " Climatological season over which to compute indicator.\n", " check_missing : {'any', 'wmo', 'pct', 'at_least_n', 'skip', 'from_context'}string\n", " Method used to determine which aggregations should be considered missing.\n", " missing_options : ComplexData:mimetype:`application/json`\n", " JSON representation of dictionary of missing method parameters.\n", " cf_compliance : {'log', 'warn', 'raise'}string\n", " Whether to log, warn or raise when inputs have non-CF-compliant attributes.\n", " data_validation : {'log', 'warn', 'raise'}string\n", " Whether to log, warn or raise when inputs fail data validation checks.\n", " variable : string\n", " Name of the variable in the input files.\n", " output_name : string\n", " Filename of the output (no extension).\n", " output_format : {'netcdf', 'csv'}string\n", " Choose in which format you want to receive the result. CSV actually means a zip file of two csv files.\n", " csv_precision : integer\n", " Only valid if output_format is CSV. If not set, all decimal places of a 64 bit floating precision number are printed. 
If negative, rounds before the decimal point.\n", " \n", " Returns\n", " -------\n", " output : ComplexData:mimetype:`application/x-netcdf`, :mimetype:`application/zip`\n", " The format depends on the 'output_format' input parameter.\n", " output_log : ComplexData:mimetype:`text/plain`\n", " Collected logs during process run.\n", " ref : ComplexData:mimetype:`application/metalink+xml; version=4.0`\n", " Metalink file storing all references to output files.\n", "\n" ] } ], "source": [ "# NBVAL_IGNORE_OUTPUT\n", "\n", "help(wps.frost_days)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To compute an indicator, we first need to specify the netCDF file to use as input. To compute `frost_days`, we need a time series of daily minimum temperature. Here we'll use a small test file. Note that we're using an OPeNDAP link, but it could also be a URL to a netCDF file, or the path to a local file on disk. We then simply call the indicator. The response is an object that can poll the server to inquire about the status of the process. This object can use two modes:\n", " - synchronous: it will wait for the server's response before returning; or\n", " - asynchronous: it will return immediately, but without the actual output from the process.\n", "\n", "Here, since we're applying the process to a small test file, we use the default synchronous mode. For long computations, use the asynchronous mode to avoid time-out errors. The asynchronous mode is activated by setting the `progress` attribute of the WPS client to `True`."
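, "\n", "As a sketch, the asynchronous pattern could look like the block below (not executed in this notebook). The polling calls (`isComplete`, `checkStatus`) come from the underlying `owslib.wps.WPSExecution` response and are an assumption here, as is the `resp_async` name; only the `progress` attribute is documented above:\n", "\n", "```python\n", "# Sketch only (not executed): switch the existing client to asynchronous mode.\n", "wps.progress = True\n", "\n", "# The call now returns before the computation finishes.\n", "# `tasmin` is the input defined in the next cell.\n", "resp_async = wps.frost_days(tasmin)\n", "\n", "# Poll the server until the process completes (assumed owslib API):\n", "# while not resp_async.isComplete():\n", "#     resp_async.checkStatus(sleepSecs=2)\n", "```"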
] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "tasmin = \"https://pavics.ouranos.ca/twitcher/ows/proxy/thredds/dodsC/birdhouse/testdata/flyingpigeon/cmip3/tasmin.sresa2.miub_echo_g.run1.atm.da.nc\"\n", "resp = wps.frost_days(tasmin)" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Process status: ProcessSucceeded\n", "Link to process output: https://pavics.ouranos.ca/wpsoutputs/finch/16a77b02-e616-11ee-aab6-484d7ef8da38/frost_days_sres_a2_experiment_20460101_20650101.nc\n" ] } ], "source": [ "print(\"Process status: \", resp.status)\n", "urls = resp.get()\n", "print(\"Link to process output: \", urls.output)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `get` method returns a `NamedTuple` object with all the WPS outputs, either as references to files or actual content. To copy the file to the local disk, use the `getOutput` method. " ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
<pre>&lt;xarray.DataArray 'frost_days' (time: 20, lat: 6, lon: 7)&gt;\n", "[840 values with dtype=float64]\n", "Coordinates:\n", "    height   float64 ...\n", "  * lat      (lat) float64 42.68 46.39 50.1 53.81 57.52 61.23\n", "  * lon      (lon) float64 281.2 285.0 288.8 292.5 296.2 300.0 303.8\n", "  * time     (time) object 2046-01-01 00:00:00 ... 2065-01-01 00:00:00\n", "Attributes:\n", "    units:          days\n", "    cell_methods:   time: minimum (interval: 30 minutes) time: sum over days\n", "    history:        tas=max(195,tas) applied to raw data; min of 194.73 detec...\n", "    standard_name:  days_with_air_temperature_below_threshold\n", "    long_name:      Number of days where the daily minimum temperature is bel...\n", "    description:    Annual number of days where the daily minimum temperature...</pre>" ], "text/plain": [ "<xarray.DataArray 'frost_days' (time: 20, lat: 6, lon: 7)>\n", "[840 values with dtype=float64]\n", "Coordinates:\n", "    height   float64 ...\n", "  * lat      (lat) float64 42.68 46.39 50.1 53.81 57.52 61.23\n", "  * lon      (lon) float64 281.2 285.0 288.8 292.5 296.2 300.0 303.8\n", "  * time     (time) object 2046-01-01 00:00:00 ... 2065-01-01 00:00:00\n", "Attributes:\n", "    units:          days\n", "    cell_methods:   time: minimum (interval: 30 minutes) time: sum over days\n", "    history:        tas=max(195,tas) applied to raw data; min of 194.73 detec...\n", "    standard_name:  days_with_air_temperature_below_threshold\n", "    long_name:      Number of days where the daily minimum temperature is bel...\n", "    description:    Annual number of days where the daily minimum temperature..." ] }, "execution_count": 5, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# NBVAL_IGNORE_OUTPUT\n", "\n", "import tempfile\n", "\n", "fn = tempfile.NamedTemporaryFile()\n", "resp.getOutput(fn.name, identifier=\"output\")\n", "ds = xr.open_dataset(fn.name, decode_timedelta=False)\n", "ds.frost_days" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "birdy's `get` function offers a more user-friendly option: setting the `asobj` argument to `True` will directly download all the output files and return the outputs as Python objects. This mechanism, however, does not allow passing additional keyword arguments, such as the `decode_timedelta` needed here." ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
<pre>&lt;xarray.Dataset&gt;\n", "Dimensions:     (lat: 6, lon: 7, time: 20)\n", "Coordinates:\n", "    height      float64 ...\n", "  * lat         (lat) float64 42.68 46.39 50.1 53.81 57.52 61.23\n", "  * lon         (lon) float64 281.2 285.0 288.8 292.5 296.2 300.0 303.8\n", "  * time        (time) object 2046-01-01 00:00:00 ... 2065-01-01 00:00:00\n", "Data variables:\n", "    frost_days  (time, lat, lon) timedelta64[ns] ...\n", "Attributes: (12/20)\n", "    comment:                  Spinup: restart files from end of experiment 20...\n", "    title:                    MIUB  model output prepared for IPCC Fourth Ass...\n", "    cmor_version:             0.96\n", "    institution:              Canadian Centre for Climate Services (CCCS)\n", "    source:                   ECHO-G(1999): atmosphere: ECHAM4 (T30L19) with ...\n", "    contact:                  Canadian Centre for Climate Services\n", "    ...                       ...\n", "    id:                       pcmdi.ipcc4.miub_echo_g.sresa2.run1.atm.da\n", "    history:                  Mon Aug  1 11:43:58 2011: ncks -4 -L 7 -d lat,4...\n", "    NCO:                      4.0.9\n", "    climateindex_package_id:  https://github.com/Ouranosinc/xclim\n", "    product:                  derived climate index\n", "    institute_id:             CCCS</pre>" ], "text/plain": [ "<xarray.Dataset>\n", "Dimensions:     (lat: 6, lon: 7, time: 20)\n", "Coordinates:\n", "    height      float64 ...\n", "  * lat         (lat) float64 42.68 46.39 50.1 53.81 57.52 61.23\n", "  * lon         (lon) float64 281.2 285.0 288.8 292.5 296.2 300.0 303.8\n", "  * time        (time) object 2046-01-01 00:00:00 ... 2065-01-01 00:00:00\n", "Data variables:\n", "    frost_days  (time, lat, lon) timedelta64[ns] ...\n", "Attributes: (12/20)\n", "    comment:                  Spinup: restart files from end of experiment 20...\n", "    title:                    MIUB  model output prepared for IPCC Fourth Ass...\n", "    cmor_version:             0.96\n", "    institution:              Canadian Centre for Climate Services (CCCS)\n", "    source:                   ECHO-G(1999): atmosphere: ECHAM4 (T30L19) with ...\n", "    contact:                  Canadian Centre for Climate Services\n", "    ...                       ...\n", "    id:                       pcmdi.ipcc4.miub_echo_g.sresa2.run1.atm.da\n", "    history:                  Mon Aug  1 11:43:58 2011: ncks -4 -L 7 -d lat,4...\n", "    NCO:                      4.0.9\n", "    climateindex_package_id:  https://github.com/Ouranosinc/xclim\n", "    product:                  derived climate index\n", "    institute_id:             CCCS" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# NBVAL_IGNORE_OUTPUT\n", "\n", "out = resp.get(asobj=True)\n", "out.output" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.4" } }, "nbformat": 4, "nbformat_minor": 4 }