Merge pull request #532 from ecmwf/develop
Prepare release 0.11.0
sandorkertesz authored Nov 20, 2024
2 parents bb84b9e + 837f240 commit c1fbe14
Showing 225 changed files with 36,114 additions and 4,045 deletions.
29 changes: 2 additions & 27 deletions .github/workflows/cd-pypi.yml
@@ -5,32 +5,7 @@ on:
tags:
- '**'

# jobs:
# pypi:
# uses: ecmwf-actions/reusable-workflows/.github/workflows/cd-pypi.yml@v2
# secrets: inherit

jobs:
deploy:
if: ${{ github.ref_type == 'tag' }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4

- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: 3.x

- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install build wheel twine
- name: Build and publish
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
run: |
python -m build
twine upload dist/*
uses: ecmwf-actions/reusable-workflows/.github/workflows/cd-pypi.yml@v2
secrets: inherit
9 changes: 9 additions & 0 deletions .github/workflows/ci.yml
@@ -8,16 +8,25 @@ on:
- 'develop'
tags-ignore:
- '**'
paths-ignore:
- "docs/**"
- "README.md"

# Trigger the workflow on pull request
pull_request:
paths-ignore:
- "docs/**"
- "README.md"

# Trigger the workflow manually
workflow_dispatch:

# Trigger after public PR approved for CI
pull_request_target:
types: [labeled]
paths-ignore:
- "docs/**"
- "README.md"

jobs:
# Run CI including downstream packages on self-hosted runners
62 changes: 25 additions & 37 deletions .github/workflows/legacy-ci.yml
@@ -24,44 +24,32 @@ defaults:
shell: bash -l {0}

jobs:
pre-commit:
if: ${{ !github.event.pull_request.head.repo.fork && github.event.action != 'labeled' || github.event.label.name == 'approved-for-ci' }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
with:
ref: ${{ github.event.pull_request.head.sha || github.ref }}
- uses: actions/setup-python@v4
with:
python-version: 3.x
- uses: pre-commit/[email protected]
# unit-tests-no-eccodes:
# name: unit-tests (3.10)
# if: ${{ !github.event.pull_request.head.repo.fork && github.event.action != 'labeled' || github.event.label.name == 'approved-for-ci' }}
# runs-on: ubuntu-latest

unit-tests-no-eccodes:
name: unit-tests (3.10)
if: ${{ !github.event.pull_request.head.repo.fork && github.event.action != 'labeled' || github.event.label.name == 'approved-for-ci' }}
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
with:
ref: ${{ github.event.pull_request.head.sha || github.ref }}
- name: Install Conda environment with Micromamba
uses: mamba-org/setup-micromamba@v1
with:
environment-file: tests/environment-unit-tests.yml
environment-name: DEVELOP
channels: conda-forge
cache-environment: true
cache-env-key: ubuntu-latest-3.10-no-eccodes
create-args: >-
python=3.10
- name: Install package
run: |
python -m pip install --no-deps -e .
micromamba remove eccodes
- name: Run tests without eccodes
run: |
python -m pytest -v -m 'no_eccodes'
# steps:
# - uses: actions/checkout@v3
# with:
# ref: ${{ github.event.pull_request.head.sha || github.ref }}
# - name: Install Conda environment with Micromamba
# uses: mamba-org/setup-micromamba@v1
# with:
# environment-file: tests/environment-unit-tests.yml
# environment-name: DEVELOP
# channels: conda-forge
# cache-environment: true
# cache-env-key: ubuntu-latest-3.10-no-eccodes
# create-args: >-
# python=3.10
# - name: Install package
# run: |
# python -m pip install --no-deps -e .
# micromamba remove eccodes
# - name: Run tests without eccodes
# run: |
# python -m pytest -v -m 'no_eccodes'


documentation:
12 changes: 12 additions & 0 deletions .github/workflows/python-pull-request.yml
@@ -0,0 +1,12 @@
name: Code Quality checks for PRs

on:
push:
pull_request:
types: [opened, synchronize, reopened]

jobs:
quality:
uses: ecmwf-actions/reusable-workflows/.github/workflows/qa-precommit-run.yml@v2
with:
skip-hooks: "no-commit-to-branch"
9 changes: 6 additions & 3 deletions .gitignore
@@ -214,9 +214,10 @@ docs/examples/*.zip
docs/examples/_*
docs/examples/earthkit_use_cases/*.grib
docs/examples/_fdb
docs/examples/*.db
docs/examples/*.json
docs/examples/*.geojson
docs/experimental/_*
docs/experimental/*.grib*
docs/experimental/*.png


# PyBuilder
.pybuilder/
@@ -364,3 +365,5 @@ notebooks/data/*/
_dev

test.db

*.idx
51 changes: 18 additions & 33 deletions .pre-commit-config.yaml
@@ -1,27 +1,29 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
rev: v5.0.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: trailing-whitespace # Trailing whitespace checker
- id: end-of-file-fixer # Ensure files end in a newline
- id: check-json
- id: check-yaml
- id: check-yaml # Check YAML files for syntax errors only
args: [--unsafe, --allow-multiple-documents]
- id: check-toml
# - id: check-added-large-files
- id: debug-statements
# - id: check-added-large-files
- id: debug-statements # Check for debugger imports and py37+ breakpoint()
- id: mixed-line-ending
- id: no-commit-to-branch # Prevent committing to main / master
# - id: check-merge-conflict # Check for files that contain merge conflict
- id: no-commit-to-branch # Prevent committing to main / master
- id: check-merge-conflict # Check for files that contain merge conflict
exclude: /README\.rst$|^docs/.*\.rst$
- repo: https://github.com/PyCQA/isort
rev: 5.13.0
rev: 5.13.2
hooks:
- id: isort
args:
- -l 110
- --force-single-line-imports
- --profile black
- repo: https://github.com/psf/black
rev: 24.4.2
rev: 24.8.0
hooks:
- id: black
args: [--line-length=110]
@@ -30,12 +32,9 @@ repos:
hooks:
- id: blackdoc
additional_dependencies: [black==23.3.0]
- repo: https://github.com/PyCQA/flake8
rev: 6.1.0
hooks:
- id: flake8
exclude: xr_engine_profile_rst\.py
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.4.6
rev: v0.6.9
hooks:
- id: ruff
exclude: '(dev/.*|.*_)\.py$'
@@ -54,25 +53,11 @@ repos:
hooks:
- id: pretty-format-yaml
args: [--autofix, --preserve-quotes]
# - id: pretty-format-toml
# args: [--autofix]
# - repo: https://github.com/b8raoult/pre-commit-docconvert
# rev: "0.1.4"
# hooks:
# - id: docconvert
# args: ["numpy"]
- repo: https://github.com/PyCQA/pydocstyle.git
rev: 6.1.1
hooks:
- id: pydocstyle
additional_dependencies: [toml]
exclude: tests|docs
- repo: https://github.com/b8raoult/optional-dependencies-all
rev: "0.0.6"
- repo: https://github.com/sphinx-contrib/sphinx-lint
rev: v1.0.0
hooks:
- id: optional-dependencies-all
args: ["--inplace", "--exclude-keys=ci,dev,docs,test", "--group=dev=all,docs,test"]
- id: sphinx-lint
- repo: https://github.com/tox-dev/pyproject-fmt
rev: "2.1.3"
rev: "v2.5.0"
hooks:
- id: pyproject-fmt
4 changes: 3 additions & 1 deletion docs/api.rst
@@ -43,7 +43,9 @@ CSV

- :py:class:`~data.readers.csv.CSVReader`


Xarray engine
--------------
- :py:class:`~data.utils.xarray.engine.EarthkitBackendEntrypoint`

Other
--------
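The new Xarray engine entry above documents the xarray backend entrypoint shipped with earthkit-data. A minimal usage sketch, assuming the backend is registered under the engine name "earthkit" and using a placeholder GRIB path:

import xarray as xr

# Open a GRIB file through the earthkit-data backend entrypoint
# ("test.grib" is a placeholder path; the engine name "earthkit" is assumed)
ds = xr.open_dataset("test.grib", engine="earthkit")
print(ds)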
11 changes: 10 additions & 1 deletion docs/conf.py
@@ -38,6 +38,8 @@
"sphinx.ext.intersphinx",
"autoapi.extension",
"sphinx_issues",
"sphinx_tabs.tabs",
"sphinx_copybutton",
"earthkit.data.sphinxext.xref",
"earthkit.data.sphinxext.module_output",
]
@@ -67,6 +69,8 @@
# Path to GitHub repo {group}/{project} (note that `group` is the GitHub user or organization)
issues_github_path = "ecmwf/earthkit-data"

# sphinx_tabs configuration
# sphinx_tabs_disable_css_loading = True

# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]
@@ -95,6 +99,7 @@
html_logo = "_static/earthkit-data.png"

xref_links = {
"botocore": ("botocore", "https://botocore.amazonaws.com/v1/documentation/api/latest/index.html"),
"cfgrib": ("cfgirb", "https://github.com/ecmwf/cfgrib"),
"covjsonkit": ("covjsonkit", "https://github.com/ecmwf/covjsonkit"),
"earthkit": ("earthkit", "https://earthkit.readthedocs.io/en/latest/"),
@@ -121,10 +126,14 @@
),
"odb": ("ODB", "https://odc.readthedocs.io/en/latest/content/introduction.html"),
"pyodc": ("pyodc", "https://github.com/ecmwf/pyodc"),
"s3cmd": ("s3cmd", "https://s3tools.org/s3cmd"),
}


intersphinx_mapping = {"pandas": ("https://pandas.pydata.org/docs/", None)}
intersphinx_mapping = {
"pandas": ("https://pandas.pydata.org/docs/", None),
"xarray": ("https://docs.xarray.dev/en/latest/", None),
}


def setup(app):
1 change: 1 addition & 0 deletions docs/examples/NUTS_RG_20M_2021_3035.geojson

Large diffs are not rendered by default.

1 change: 0 additions & 1 deletion docs/examples/cache.ipynb
@@ -21,7 +21,6 @@
"metadata": {},
"outputs": [],
"source": [
"import earthkit.data\n",
"from earthkit.data import settings, cache"
]
},
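The cache notebook now imports only what it uses (settings and cache). A minimal sketch of inspecting and temporarily changing the cache policy, assuming the settings key "cache-policy" and the policy value "user" behave as in recent earthkit-data releases:

from earthkit.data import settings

# Show the current cache policy ("cache-policy" is assumed to be a valid key)
print(settings.get("cache-policy"))

# Temporarily switch to a user-managed cache within a block
with settings.temporary("cache-policy", "user"):
    print(settings.get("cache-policy"))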
10 changes: 8 additions & 2 deletions docs/examples/data_from_stream.ipynb
@@ -164,7 +164,13 @@
{
"cell_type": "markdown",
"id": "brilliant-struggle",
"metadata": {},
"metadata": {
"editable": true,
"slideshow": {
"slide_type": ""
},
"tags": []
},
"source": [
"Having finished the iteration there is no data available in *ds*. We can close the stream:"
]
@@ -199,7 +205,7 @@
"tags": []
},
"source": [
"When we use the :py:meth:`batched <data.readers.grib.index.GribFieldList.batched>` method we can iterate throught the stream in batches of fixed size. In this example we create a stream and read 2 fields from it at a time."
"When we use the :py:meth:`batched <data.readers.grib.index.GribFieldList.batched>` method we can iterate through the stream in batches of fixed size. In this example we create a stream and read 2 fields from it at a time."
]
},
{
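The wording fix above concerns batched() iteration over a stream source. A minimal sketch of that pattern, assuming a GRIB file served over HTTP (the URL is a placeholder):

import earthkit.data

# Read the source as a stream; fields are consumed as they arrive
ds = earthkit.data.from_source(
    "url",
    "https://example.com/data.grib",  # placeholder URL
    stream=True,
)

# batched(2) yields FieldLists of at most 2 fields until the stream is exhausted
for batch in ds.batched(2):
    print(len(batch), [f.metadata("shortName") for f in batch])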
8 changes: 4 additions & 4 deletions docs/examples/fdb.ipynb
@@ -86,7 +86,7 @@
"tags": []
},
"source": [
"By default we retrieve data from FDB with :ref:`from_source() <data-sources-fdb>` as a stream."
"By default we retrieve data from an :ref:`FDB <data-sources-fdb>` source with :ref:`from_source() <data-sources-fdb>` as a stream."
]
},
{
@@ -1232,9 +1232,9 @@
],
"metadata": {
"kernelspec": {
"display_name": "pyfdb",
"display_name": "dev_ecc",
"language": "python",
"name": "pyfdb"
"name": "dev_ecc"
},
"language_info": {
"codemirror_mode": {
@@ -1246,7 +1246,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.12"
"version": "3.10.13"
}
},
"nbformat": 4,
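The fdb notebook edit clarifies that from_source() consumes an FDB source as a stream by default. A minimal sketch, assuming a locally configured FDB and a purely illustrative MARS-style request (keys and values are placeholders):

import earthkit.data

request = {
    "class": "od",
    "stream": "oper",
    "type": "fc",
    "levtype": "sfc",
    "date": "20240101",
    "time": "0000",
    "step": 0,
    "param": "2t",
}

# The FDB source is streamed by default, so the fields can be iterated over once
ds = earthkit.data.from_source("fdb", request)
for f in ds:
    print(f)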
(Remaining changed files not shown.)
