Replies: 1 comment
Does importing only 10 rows also throw this error?
A recent project needs data analysis. The dataset is a CSV with 15,000 columns and about 600k rows.
The service was started with docker-compose, following the official documentation.
The table has already been created.
I sampled 10 rows and sent them via Stream Load, using the HTTP Java sample code. The request fails with the following error:
```
report error status: Memory limit exceeded: Memory limit exceeded:<consuming tracker:<Load#Id=77497c22fc9dfed0-e53a7ce321a62d96>, failed alloc size 1.01 MB, exceeded tracker:<Load#Id=77497c22fc9dfed0-e53a7ce321a62d96>, limit 2.00 GB, peak used 2.00 GB, current used 2.00 GB>, executing msg:<execute:<ExecNode:VFILE_SCAN_NODE (id=0)>>. backend 172.20.80.3 process memory used 2.73 GB, limit 12.40 GB. If query tracker exceed, `set exec_mem_limit=8G` to change limit, details see be.INFO. to coordinator: TNetworkAddress(hostname=172.20.80.2, port=9020), query id: 77497c22fc9dfed0-e53a7ce321a62d96, instance id: 77497c22fc9dfed0-e53a7ce321a62d97
```

I then ran both of the following, but the same error persists. How else should the request be modified?

```sql
set exec_mem_limit=8G;
set global exec_mem_limit=8G;
```
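For reference, below is a minimal sketch of the Stream Load HTTP request in Java (`java.net.http`, Java 11+), not the official sample code. The host is taken from the error log; the credentials, database, and table names are placeholders. One thing worth trying, since `exec_mem_limit` is a session variable for queries, is the `load_mem_limit` Stream Load property, which (in Doris versions that support it) raises the memory limit of the load job itself:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class StreamLoadRequest {
    // Placeholders: adjust host, credentials, db, and table to your cluster.
    static final String FE_HOST = "172.20.80.2";   // FE address from the error log
    static final String USER_PASS = "root:";       // hypothetical credentials
    static final String DB = "example_db";         // hypothetical database name
    static final String TABLE = "example_table";   // hypothetical table name

    static HttpRequest build(String csvBody) {
        String auth = Base64.getEncoder()
                .encodeToString(USER_PASS.getBytes(StandardCharsets.UTF_8));
        return HttpRequest.newBuilder()
                .uri(URI.create("http://" + FE_HOST + ":8030/api/"
                        + DB + "/" + TABLE + "/_stream_load"))
                .header("Authorization", "Basic " + auth)
                .header("label", "sample_load_001")  // must be unique per load
                .header("column_separator", ",")
                // Per-load memory limit in bytes (8 GB here), if the Doris
                // version supports the load_mem_limit property.
                .header("load_mem_limit", String.valueOf(8L * 1024 * 1024 * 1024))
                .PUT(HttpRequest.BodyPublishers.ofString(csvBody))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = build("1,a\n2,b\n");
        System.out.println(req.uri());
        System.out.println(req.headers().firstValue("load_mem_limit").orElse(""));
    }
}
```

Note that Stream Load also expects the `Expect: 100-continue` header, which `java.net.http` treats as restricted; actually sending the request may require `-Djdk.httpclient.allowRestrictedHeaders=expect`, or a client such as Apache HttpClient.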