
Improve performance with batching #23

Open
theblazehen opened this issue Feb 20, 2022 · 4 comments · May be fixed by #30

@theblazehen

Currently it uploads only one log line per HTTP request, which does not work well for high-velocity logs.

The queued handler could drain all the available lines from the queue and upload everything outstanding in a single request, instead of popping and uploading one item at a time.
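
A minimal sketch of that drain-and-batch loop, assuming the handler's worker thread reads from a standard `queue.Queue`; the `upload_batch` callable and the `max_batch_size` limit are hypothetical placeholders for however the handler actually sends records:

```python
import queue


def consume(log_queue: queue.Queue, upload_batch, max_batch_size: int = 100):
    """Drain the queue and send all outstanding records in one HTTP request."""
    while True:
        batch = [log_queue.get()]  # block until at least one record arrives
        try:
            # Grab everything else already queued, up to the batch limit,
            # without blocking.
            while len(batch) < max_batch_size:
                batch.append(log_queue.get_nowait())
        except queue.Empty:
            pass
        upload_batch(batch)  # one request for the whole batch
        for _ in batch:
            log_queue.task_done()
```

The batch size cap keeps a single request from growing unboundedly when the producer outpaces the uploader; records beyond the cap are simply picked up on the next loop iteration.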

@tomashornak commented Apr 8, 2022

This would be a nice feature; otherwise I'm a little afraid to use it in production on high-load web servers.
Did you come to any solution, @theblazehen?

@theblazehen (Author)

@tomashornak Afraid not. I decided to just log to journald with https://pypi.org/project/systemd-logging/ and use promtail from there, but I haven't tested it at scale yet.

@nathabit

> @tomashornak Afraid not. I decided to just log to journald with https://pypi.org/project/systemd-logging/ and use promtail from there, but I haven't tested it at scale yet.

How are you propagating `unique_labels`, tags, etc. in your solution?

@fullonic linked a pull request (#30) on Nov 8, 2022 that will close this issue
@LuisGMM commented Nov 26, 2024

Is there any news about this?

It is true that in an async environment under some load, a logger that can batch logs becomes necessary :)
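
For the asyncio case, the same drain-and-batch idea carries over; a minimal sketch, assuming records arrive on an `asyncio.Queue` and that `upload_batch` is a hypothetical coroutine doing the actual HTTP send:

```python
import asyncio


async def consume(log_queue: asyncio.Queue, upload_batch, max_batch: int = 100):
    """Drain the queue and send all outstanding records in one request."""
    while True:
        batch = [await log_queue.get()]  # wait for the first record
        # Collect anything else already queued; no awaits here, so the
        # empty() check is safe with a single consumer task.
        while len(batch) < max_batch and not log_queue.empty():
            batch.append(log_queue.get_nowait())
        await upload_batch(batch)  # one request per batch
```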
