[Bug] Leaked access-token on #718

Open
dturn opened this issue May 11, 2020 · 0 comments

dturn commented May 11, 2020

Bug report

There was an error during a deployment that caused the kubeconfig.yml to be logged. The kubeconfig included an access-token. While access-tokens are short lived, we shouldn't be logging them. Ideally we'd sanitize the sensitive information in the kubeconfig.yml rather than suppressing all of it, possibly by not printing the auth-provider: section of each user, or by not logging the users section at all.

Error: Error loading config file "/app/config/kubeconfig.yml": v1.Config.Clusters: []v1.NamedCluster: v1.NamedCluster.Cluster: v1.Cluster.CertificateAuthorityData: decode base64: illegal base64 data at input byte 748, error found in #10 byte of ...|"}}]}|..., bigger context ...|"}}]}|...
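As a rough sketch of the kind of sanitization I have in mind (the method and key names here are hypothetical, not krane's actual API), we could redact the credential fields from the parsed kubeconfig before anything is written to the logs:

```ruby
require 'yaml'

# Hypothetical helper: strip credentials from a parsed kubeconfig hash before
# it is ever included in log or error output. Dropping the whole `users`
# section is the bluntest option; redacting just auth-provider/token/key
# fields keeps more context for debugging.
def sanitize_kubeconfig(config)
  sanitized = Marshal.load(Marshal.dump(config)) # deep copy so the original is untouched
  (sanitized["users"] || []).each do |entry|
    user = entry["user"] || {}
    %w(auth-provider token client-key-data).each do |key|
      user[key] = "<REDACTED>" if user.key?(key)
    end
  end
  sanitized
end

raw = YAML.safe_load(File.read("/app/config/kubeconfig.yml"))
puts YAML.dump(sanitize_kubeconfig(raw)) # safe to include in error output
```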

Expected behavior:

access-token should never be logged.

Actual behavior:

access-token is logged as part of the full kubeconfig.yml file.

Version(s) affected:

Likely all versions, but this happened on 1.0.0.pre.1.

Steps to Reproduce

It's unclear how to reproduce this exact failure. An invalid config file might be enough on its own, but I haven't confirmed this.
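For reference, a minimal kubeconfig sketch that should hit the same base64 decode path (untested; every value here is made up for illustration):

```yaml
# Hypothetical kubeconfig: the certificate-authority-data is deliberately not
# valid base64, which should trigger the same "decode base64: illegal base64
# data" error when the file is loaded.
apiVersion: v1
kind: Config
clusters:
- name: broken
  cluster:
    server: https://example.invalid
    certificate-authority-data: "not*valid*base64"
contexts:
- name: broken
  context:
    cluster: broken
    user: broken
current-context: broken
users:
- name: broken
  user:
    token: "dummy-token"
```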

dturn added the 🪲 bug and 🔒 security labels on May 11, 2020
ghost added the krane [ProdX-GSD] label on Jun 15, 2020