test: handle warnings generated by our own tests #1469
Conversation
Validate PR title: I really think that "Unit tests:" looks like
I feel like I get this wrong about 50% of the time 😆.
Yeah, a bunch of parameters were added in 3.11, and I'd forgotten that. Maybe I ought to default to running tox with 3.8 locally - it's much more likely that it would catch things than using newer Python.
Interesting thought - feel free to add it to the team discussion topics. I do agree that at times (like this one) it's more informative to immediately see "ah, this is bad on [3.8 or macOS or ..]". At other times, when something is just generally wrong (hopefully only drafts, if one is properly running tox locally) it does feel wasteful to run the whole matrix when it's going to fail every time.
I wish there was a more elegant way, but I guess given that Harness is deprecated, this will do.
The downside is that we won't notice if Harness is accidentally used in new tests, isn't it? I suppose that would get caught by human review.
Our tests currently end with a very large list of warnings. This PR fixes that:

- `PendingDeprecationWarning`s telling us that Harness is going away and we should use Scenario. This seems like something to simply ignore in our own tests, so I've added a `-W` argument to Python to silence them in bulk.
- A `test_main` test raises a warning about controller storage going away - there's another test that verifies that the warning happens, and the two tests are otherwise identical. I've merged the two - we don't need them both.
- `SecretInfo` objects can still be created without providing the model UUID. I've added a `pytest.warns` to (a) ensure that the warning is raised, and (b) stop it appearing in the tox output.
- `SecretInfo` objects. This is a one-line fix, so I've added it here - it stops deprecation warnings, but is really more about the code than the tests; since it's so small I think it's ok here.
- The `context.on` Scenario tests check that pre- and post-series upgrade events work, and those raise warnings. I've split them off into a separate test that verifies (and silences) the warning.

Fixes #1408
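The bulk-silencing approach in the first bullet can be sketched as follows. The `emit` function here is a hypothetical stand-in for a Harness-based test; only the warning category (`PendingDeprecationWarning`) comes from the PR, and filtering in-process with `warnings.simplefilter` is equivalent to passing `-W ignore::PendingDeprecationWarning` to the interpreter:

```python
import warnings


def emit():
    # Stand-in for a Harness-based test that triggers the library's
    # deprecation notice (the message text here is illustrative).
    warnings.warn("Harness is deprecated; use Scenario", PendingDeprecationWarning)


# Ignoring the whole category in bulk, as -W ignore::PendingDeprecationWarning
# does for the interpreter; record=True lets us confirm nothing got through.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore", PendingDeprecationWarning)
    emit()

print(len(caught))  # 0 - the warning was silenced
```

Note that this silences every `PendingDeprecationWarning`, including ones from accidental new Harness usage - which is the trade-off raised in the review comment above.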
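The `pytest.warns` pattern from the `SecretInfo` bullet can be sketched like this. The `SecretInfo` class below is a minimal hypothetical stand-in (the real ops class has a richer signature); the pattern of asserting the warning while keeping it out of the tox output is what the PR describes:

```python
import warnings

import pytest


class SecretInfo:
    # Hypothetical stand-in for ops' SecretInfo: warns when model_uuid
    # is omitted, mirroring the deprecation described in the PR.
    def __init__(self, id, model_uuid=None):
        if model_uuid is None:
            warnings.warn(
                "SecretInfo without model_uuid is deprecated",
                DeprecationWarning,
            )
        self.id = id
        self.model_uuid = model_uuid


def test_secret_info_warns_without_model_uuid():
    # pytest.warns both asserts that the warning fires and captures it,
    # so it no longer appears in the pytest/tox warning summary.
    with pytest.warns(DeprecationWarning):
        SecretInfo(id="secret:abc")


test_secret_info_warns_without_model_uuid()  # passes: warning raised and captured
```

Because `pytest.warns` fails the test if the warning is *not* raised, this doubles as a regression test for the deprecation behaviour itself.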