Describe the bug
VictoriaMetrics seems to be unable to scrape a line longer than 262144 bytes without manual intervention. Apparently, the problem is that the postgres exporter exposes the query text for postgres_statements_query_info as is, whether it is an INSERT, SELECT, etc. When that happens, VM reports the error below:
VictoriaMetrics/lib/promscrape/scrapework.go:390 cannot scrape "http://db3.node:9108/metrics" (job "postgres_exporter", labels {__vm_filepath="/etc/victoria-metrics/postgres_exporter_targets.json", cluster="pg_cluster", instance="db.node:9108", job="postgres_exporter"}): cannot read Prometheus exposition data: too long line: more than 262144 bytes
Steps to reproduce
Run the postgres exporter and make it expose a line of at least 262144 bytes, then try to scrape it with either VictoriaMetrics or vmagent.
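For reference, the oversized line can be reproduced without a real postgres exporter by serving a synthetic metric whose label value exceeds the 262144-byte limit. The handler below is only a minimal sketch for reproduction, not part of pgSCV or VictoriaMetrics:

package main

import (
	"fmt"
	"net/http"
	"strings"
)

func main() {
	// Serve a single Prometheus-style line whose label value alone
	// exceeds the 262144-byte limit reported by VictoriaMetrics.
	http.HandleFunc("/metrics", func(w http.ResponseWriter, r *http.Request) {
		// ~380 KB of fake "query text" (hypothetical payload for the repro)
		longQuery := strings.Repeat("SELECT 1 UNION ALL ", 20000)
		fmt.Fprintf(w, "postgres_statements_query_info{query=%q} 1\n", longQuery)
	})
	http.ListenAndServe(":9108", nil)
}

Pointing a scrape job at this endpoint should trigger the same "too long line" error as the real exporter does.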
Expected behavior
Presumably, once a line reaches, say, 131072 bytes, it should be split into two or more lines so that VictoriaMetrics can process it further. Alternatively, a workaround that simply ignores lines longer than that limit would also be acceptable.
My guess is that a setting to fine-tune the maximum accepted line length would also be an improvement here.
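To illustrate the "ignore oversized lines" idea, here is a rough sketch of a relay that could sit between the exporter and the scraper and drop lines above a threshold. This is not an existing VictoriaMetrics feature; maxLineLen, the upstream exporter URL and the listen port are hypothetical values chosen for the example:

package main

import (
	"bufio"
	"io"
	"net/http"
)

const maxLineLen = 131072 // hypothetical threshold from the suggestion above

func main() {
	// Proxy /metrics from the exporter, skipping lines longer than maxLineLen
	// so the scraper never sees the oversized postgres_statements_query_info line.
	http.HandleFunc("/metrics", func(w http.ResponseWriter, r *http.Request) {
		resp, err := http.Get("http://db3.node:9108/metrics") // assumed exporter endpoint
		if err != nil {
			http.Error(w, err.Error(), http.StatusBadGateway)
			return
		}
		defer resp.Body.Close()

		sc := bufio.NewScanner(resp.Body)
		sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // allow reading long lines
		for sc.Scan() {
			line := sc.Bytes()
			if len(line) > maxLineLen {
				continue // drop the oversized line instead of failing the whole scrape
			}
			w.Write(line)
			io.WriteString(w, "\n")
		}
	})
	http.ListenAndServe(":9109", nil)
}

The scrape target would then point at the relay instead of the exporter; a built-in threshold flag in VM/vmagent would make such a shim unnecessary.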
pgSCV startup options
Started with listen_address: ipv4:9108
services:
  "postgres":
    service_type: "postgres"
    conninfo: "postgres://mgmt:/s/@IPv4:5432/table"
    databases: "db"
    disable_collectors:
Errors and Logs
Same as above
Environment (please complete the following information):