Consider removing fiber_pool.js #14
> You're suggesting I can just keep creating new …
As soon as a fiber finishes running, its stack (which is the bulk of the allocated memory) is returned to a shared pool. Garbage collection just cleans up the V8 handle, which won't be a measurable performance cost in any application.
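A minimal sketch of the pooling idea described here, assuming the `fibers` npm package (the helper names `runInPooledFiber`, `makeFiber`, and `idleFibers` are invented for illustration; this is not the actual fiber_pool.js code):

```js
// Simplified illustration of fiber pooling (not the real fiber_pool.js):
// a fiber that finishes a task parks itself on an idle list and is resumed
// with the next task, so its stack is reused instead of reallocated.
// Error handling and pool-size limits are omitted.
var Fiber = require('fibers');

var idleFibers = [];

function makeFiber() {
  return Fiber(function (task) {
    while (true) {
      task.fn(task.arg);              // run the current task on this fiber's stack
      idleFibers.push(Fiber.current); // hand the fiber (and its stack) back to the pool
      task = Fiber.yield();           // park until the pool resumes us with a new task
    }
  });
}

function runInPooledFiber(fn, arg) {
  var fiber = idleFibers.pop() || makeFiber();
  fiber.run({ fn: fn, arg: arg });
}

runInPooledFiber(function (x) { console.log('first task', x); }, 1);
runInPooledFiber(function (x) { console.log('second task', x); }, 2); // reuses the same fiber
```

The question raised in this issue is whether such a JS-level pool buys anything when node-fibers already pools coroutine stacks natively.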
We are experiencing some performance issues in chimp, which uses the meteor-promise package. I used simple profiling (https://nodejs.org/en/docs/guides/simple-profiling/) and here are the results (notice fiber_pool.js showing up for node 7.3.0):
node 6.3.1 + meteor-promise 0.7.4 [Bottom up (heavy) profile]:

```
 Note: percentage shows a share of a particular caller in the total
 amount of its parent calls.
 Callers occupying less than 2.0% are not shown.

  ticks parent  name
  17100   47.8%  ___mac_get_pid

   2570    7.2%  UNKNOWN
    107    4.2%    LazyCompile: *readableAddChunk _stream_readable.js:148:26
    106   99.1%      LazyCompile: *onread net.js:524:16

   1465    4.1%  /Users/and_re/.nvm/versions/node/v6.3.1/bin/node
    243   16.6%    v8::internal::Builtins::~Builtins()
  ...
```

node 7.3.0 + meteor-promise 0.7.4 [Bottom up (heavy) profile]:

```
 Note: percentage shows a share of a particular caller in the total
 amount of its parent calls.
 Callers occupying less than 2.0% are not shown.

  ticks parent  name
  94616   18.4%  UNKNOWN
   9887   10.4%    v8::internal::Builtin_HandleApiCall(int, v8::internal::Object**, v8::internal::Isolate*)
   9873   99.9%      LazyCompile: *<anonymous> /Users/and_re/Projects/chimp-issue-6.3.1/node_modules/chimp/node_modules/meteor-promise/fiber_pool.js:14:36
   6132    6.5%    LazyCompile: ~t native promise.js:31:7
   1831   29.9%      LazyCompile: *<anonymous> /Users/and_re/Projects/chimp-issue-6.3.1/node_modules/chimp/node_modules/meteor-promise/fiber_pool.js:14:36
   3933    4.2%    LazyCompile: *<anonymous> /Users/and_re/Projects/chimp-issue-6.3.1/node_modules/chimp/node_modules/meteor-promise/fiber_pool.js:14:36

  17599    3.4%  v8::internal::JSObjectWalkVisitor<v8::internal::AllocationSiteUsageContext>::StructureWalk(v8::internal::Handle<v8::internal::JSObject>)
  17592  100.0%    v8::internal::Runtime_CreateObjectLiteral(int, v8::internal::Object**, v8::internal::Isolate*)
  12182   69.2%      LazyCompile: ~CreateResolvingFunctions native promise.js:29:34
   3764   30.9%        LazyCompile: *NewPromiseCapability native promise.js:227:30

  17358    3.4%  bool v8::internal::ScavengingVisitor<(v8::internal::MarksHandling)1, (v8::internal::PromotionMode)1, (v8::internal::LoggingAndProfiling)0>::SemiSpaceCopyObject<(v8::internal::AllocationAlignment)0>(v8::internal::Map*, v8::internal::HeapObject**, v8::internal::HeapObject*, int)
  ...
```

node 7.3.0 + meteor-promise 0.8.0 [Bottom up (heavy) profile]:

```
 Note: percentage shows a share of a particular caller in the total
 amount of its parent calls.
 Callers occupying less than 2.0% are not shown.

  ticks parent  name
  65986   12.6%  UNKNOWN
   9780   14.8%    v8::internal::Builtin_HandleApiCall(int, v8::internal::Object**, v8::internal::Isolate*)
   9767   99.9%      LazyCompile: *<anonymous> /Users/and_re/Projects/chimp-issue/node_modules/xolvio-sync-webdriverio/node_modules/meteor-promise/fiber_pool.js:14:36
   1455    2.2%    LazyCompile: ~<anonymous> /Users/and_re/Projects/chimp-issue/node_modules/xolvio-sync-webdriverio/node_modules/meteor-promise/fiber_pool.js:84:40
   1455  100.0%      LazyCompile: *FiberPool.run /Users/and_re/Projects/chimp-issue/node_modules/xolvio-sync-webdriverio/node_modules/meteor-promise/fiber_pool.js:69:23
   1454   99.9%        LazyCompile: ~result /Users/and_re/Projects/chimp-issue/node_modules/xolvio-sync-webdriverio/node_modules/meteor-promise/promise_server.js:134:25
   1454  100.0%          LazyCompile: *PromiseHandle native promise.js:94:23
   1454  100.0%            LazyCompile: ~<anonymous> native promise.js:108:27

  18446    3.5%  v8::internal::JSObjectWalkVisitor<v8::internal::AllocationSiteUsageContext>::StructureWalk(v8::internal::Handle<v8::internal::JSObject>)
  18437  100.0%    v8::internal::Runtime_CreateObjectLiteral(int, v8::internal::Object**, v8::internal::Isolate*)
  12345   67.0%      LazyCompile: ~CreateResolvingFunctions native promise.js:29:34
   5060   41.0%        LazyCompile: *Promise native promise.js:47:23
   5059  100.0%          LazyCompile: *FiberPool.run /Users/and_re/Projects/chimp-issue/node_modules/xolvio-sync-webdriverio/node_modules/meteor-promise/fiber_pool.js:69:23
   5056   99.9%            LazyCompile: ~result /Users/and_re/Projects/chimp-issue/node_modules/xolvio-sync-webdriverio/node_modules/meteor-promise/promise_server.js:134:25
   3702   30.0%        LazyCompile: ~<anonymous> native promise.js:194:27
   1078   29.1%          Stub: CallApiCallbackStub {2}
   1078  100.0%            LazyCompile: *_tickDomainCallback internal/process/next_tick.js:108:31
   3583   29.0%        LazyCompile: *NewPromiseCapability native promise.js:227:30
   3583  100.0%          LazyCompile: *then native promise.js:302:21
   3583  100.0%            LazyCompile: *Promise.then /Users/and_re/Projects/chimp-issue/node_modules/chimp/node_modules/cucumber/node_modules/meteor-promise/promise_server.js:13:37
   6076   33.0%      LazyCompile: ~result /Users/and_re/Projects/chimp-issue/node_modules/xolvio-sync-webdriverio/node_modules/meteor-promise/promise_server.js:134:25
   6075  100.0%        LazyCompile: *PromiseHandle native promise.js:94:23
   6075  100.0%          LazyCompile: ~<anonymous> native promise.js:108:27
   1783   29.3%            Stub: CallApiCallbackStub {2}

  17611    3.4%  v8::internal::Heap::IteratePromotedObjectPointers(v8::internal::HeapObject*, unsigned char*, unsigned char*, bool, void (*)(v8::internal::HeapObject**, v8::internal::HeapObject*))
  ...
```
Fibers already maintains a pool on the backend, so it seems like you're just wasting cycles here. See coroutine.cc.

You can tweak the pool size dynamically with `Fiber.poolSize`; the default is 120.
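A minimal sketch of adjusting that setting, assuming the `fibers` package is installed (the value 240 is arbitrary):

```js
var Fiber = require('fibers');

console.log(Fiber.poolSize); // 120 by default
Fiber.poolSize = 240;        // let node-fibers keep more coroutine stacks around for reuse
```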