
Add code coverage calculation to pre-commit stages #321

Closed
delucchi-cmu opened this issue Nov 27, 2023 · 3 comments
Labels
enhancement New feature or request

Comments

@delucchi-cmu
Contributor

I'm bummed when everything in pre-commit passes locally, then the GitHub workflow lets me know that there's one new line that doesn't have test coverage.

Always aiming for 100% is a high bar, but even a warning that you're reducing coverage would be great!

@delucchi-cmu delucchi-cmu added the enhancement New feature or request label Nov 27, 2023
@drewoldag
Collaborator

It would be nice if we could output the total coverage during the pre-commit step. That would give the user a heads-up that the GitHub workflow might fail.
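One way to turn that heads-up into something enforceable (a sketch, assuming the project runs its tests through pytest-cov): the `--cov-fail-under` flag makes pytest exit non-zero when total coverage drops below a threshold, which in turn fails the pre-commit hook. The `95` threshold and the `pyproject.toml` placement here are illustrative, not from the thread:

```toml
# pyproject.toml -- hypothetical sketch; assumes pytest-cov is installed.
# --cov-fail-under makes pytest exit with a non-zero status when total
# coverage falls below the threshold, so the pre-commit hook fails too.
[tool.pytest.ini_options]
addopts = "--cov=./src --cov-report=term-missing:skip-covered --cov-fail-under=95"
```

A hard threshold is stricter than a warning, so it may be more than this issue asks for; the flag can also be passed directly on the pytest command line instead.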

@delucchi-cmu
Contributor Author

We're already calculating the code coverage amounts, so this would be a matter of twiddling with the output/verbosity options to get something useful. The best I've come up with is below, basically using the pytest command:

$ python -m pytest -qq --cov=./src --cov-report=term-missing:skip-covered

$ pre-commit
Check template version...................................................Passed
- hook id: check-lincc-frameworks-template-version
- duration: 0.08s
Clear output from Jupyter notebooks..................(no files to check)Skipped
Prevent main branch commits..............................................Passed
Check for large files....................................................Passed
Validate pyproject.toml..............................(no files to check)Skipped
Run isort............................................(no files to check)Skipped
pylint (python files in src/)........................(no files to check)Skipped
pylint (test-ish python files).......................(no files to check)Skipped
Run unit tests...........................................................Passed
- hook id: pytest-check
- duration: 4.27s

........................................................................ [ 24%]
........................................................................ [ 48%]
........................................................................ [ 72%]
........................................................................ [ 96%]
...........                                                              [100%]

---------- coverage: platform linux, python 3.10.13-final-0 ----------
Name                                                             Stmts   Miss  Cover   Missing
----------------------------------------------------------------------------------------------
src/hipscat/catalog/association_catalog/association_catalog.py      42      3    93%   71-72, 86
src/hipscat/catalog/healpix_dataset/healpix_dataset.py              53      2    96%   91-92
----------------------------------------------------------------------------------------------
TOTAL                                                             1744      5    99%

50 files skipped due to complete coverage.

Build documentation with Sphinx..........................................Passed

I like it:

  • will show up every time you run pre-commit, even if the tests pass
  • will show the line numbers of uncovered lines, and suppress all other text
  • the individual tests that are passing are shown only as individual dots.

I don't like it:

  • I can't get the dots to go away, but keep the coverage summary
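For reference, the pytest invocation above would be wired into pre-commit as a local hook along these lines (a hypothetical sketch: the `pytest-check` id and "Run unit tests" name match the output shown earlier, but the remaining fields are assumptions, not the project's actual config):

```yaml
# .pre-commit-config.yaml -- sketch of a local pytest hook with coverage output.
- repo: local
  hooks:
    - id: pytest-check
      name: Run unit tests
      entry: python -m pytest -qq --cov=./src --cov-report=term-missing:skip-covered
      language: system   # use the project's own environment, where pytest lives
      pass_filenames: false
      always_run: true   # run the suite even when no Python files changed
```

`verbose: true` could be added to the hook so the coverage table prints even when the hook passes, which is the "show up every time" behavior described above.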

@drewoldag
Collaborator

We have codecov in CI actions, so we'll leave this as is for now.
