Group workflows by trigger type and initial commit for the integration tests workflow #611
Conversation
Force-pushed from 51ce373 to 1cc7334
This is looking great! Let's figure out the TLS errors tomorrow.
      - name: Run integration tests
        env:
          CODEGATE_PROVIDERS: "copilot"
          CA_CERT_FILE: "/home/runner/work/codegate/codegate/codegate_volume/certs/ca.crt"
You might be able to fix the ca stuff with something like this:
cp ca.crt /usr/local/share/ca-certificates/codegate.crt
update-ca-certificates
I don't know if sudo would be needed; do these run as root on GitHub?
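A rough sketch of that suggestion as a workflow step (the step name is an assumption; the cert path matches the one used in the workflow above; GitHub-hosted Ubuntu runners execute steps as the non-root runner user, but passwordless sudo is available):

```yaml
- name: Trust the CodeGate CA certificate
  run: |
    # Copy the generated CA cert into the system store and refresh the trust bundle
    sudo cp /home/runner/work/codegate/codegate/codegate_volume/certs/ca.crt \
      /usr/local/share/ca-certificates/codegate.crt
    sudo update-ca-certificates
```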
We should be able to either run the tests without installing the cert, or skip the CA_CERT_FILE argument entirely if we install the cert into the system store.
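A minimal sketch of the second option, assuming a previous step installed the cert into the runner's system trust store (the run command is a placeholder, not the actual workflow content):

```yaml
- name: Run integration tests
  env:
    CODEGATE_PROVIDERS: "copilot"
    # CA_CERT_FILE intentionally omitted; the client relies on the system trust store
  run: |
    # ...existing integration test command, unchanged...
```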
Force-pushed from 9779706 to 4782697
Force-pushed from 3bce730 to 22cc684
@@ -12,15 +16,33 @@ jobs:
  docker-image:
    name: Check docker image build
    runs-on: ubuntu-latest
    env:
      IMAGE_NAME: stacklok/codegate
      IMAGE_TAG: dev
We don't push this image do we? It's just a tag used for the artifact that we upload and then download?
We don't, no. This is just so the image has a reference that we can then use to save it as a tar file.
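In other words, the tag only serves as a local handle between steps, roughly like this (a sketch; step names, the tarball name, and the artifact name are assumptions, not the actual workflow content):

```yaml
- name: Build the image
  run: docker build -t "$IMAGE_NAME:$IMAGE_TAG" .
- name: Save the image to a tarball
  run: docker save -o image.tar "$IMAGE_NAME:$IMAGE_TAG"   # referenced by tag, never pushed
- name: Upload the image as an artifact
  uses: actions/upload-artifact@v4
  with:
    name: image
    path: image.tar
```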
          name: sqlite_data
          name_is_regexp: true
          skip_unpack: false
          if_no_artifact_found: ignore
why do we use this parameter? Wouldn't this cause the integration tests to be ignored if the artifact is not built?
I took this part from the image-publish workflow to solve one of the issues I faced: codegate did not have a populated db, so the invokehttp test was failing because of that. With this step I'm adding the latest db.
That said, I don't think the missing-artifact case is expected here, since we call a workflow that does produce an artifact.
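For reference, these inputs line up with the dawidd6/action-download-artifact action (an assumption here, based on the image-publish workflow the step was taken from); a sketch of the full step:

```yaml
- name: Download the latest sqlite_data artifact
  uses: dawidd6/action-download-artifact@v6
  with:
    workflow: import_packages.yml    # assumed name of the workflow producing the artifact
    name: sqlite_data
    name_is_regexp: true
    skip_unpack: false
    if_no_artifact_found: ignore     # do not fail the job if no matching artifact exists
```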
Interesting, so the CI images are expected to contain not the full db but only the contents of https://github.com/stacklok/codegate/tree/main/data. I worked around that in the initial integration tests patch by adding invokehttp to the JSON files, which is not ideal; if we can copy the db instead, that's much better. I'd just be careful that we are not spending too much time or resources on it.
We were looking at this issue with Pankaj and Michelangelo, and it turns out the dataset was not getting imported while building the image. We discussed that using the latest dataset instead of the minimal one should not have any performance implications (and if it does, it would be nice to catch that, right?).
Another concern was that a test might rely on a given package being malicious while its maintainers have since improved its score, so that is no longer the case. Though possible, that will rarely happen, hence the decision.
@rdimitrov not related to this patch, but IIRC you mentioned on Slack that if one of these test cases fails, the whole test run wouldn't fail. Do we have an issue for that?
I don't think we have, but I should have fixed it as part of this PR (check the changes on
Oh, I missed those, I was just reading the workflows. Sloppy review...
thank you for answering my questions during the review!
@dependabot rebase
Signed-off-by: Radoslav Dimitrov <[email protected]>
This PR groups some of the existing workflows by trigger type and adds an initial workflow for the integration tests.
Changes:
Reasoning: