When a user logs in, a pod is created for them automatically. However, if the user restarts the pod with the Start my server button while it is still starting, the hub keeps waiting for the old pod to start, causing the following errors:
[W 2017-11-28 14:48:49.440 JupyterHub base:473] User cponce is slow to start (timeout=10)
[I 2017-11-28 14:48:49.441 JupyterHub log:122] 302 POST /hub/login?next= → /hub/ (@10.10.3.197) 10126.28ms
[I 2017-11-28 14:48:49.470 JupyterHub log:122] 302 GET /hub/ → /hub/home ([email protected]) 3.78ms
[I 2017-11-28 14:48:49.490 JupyterHub log:122] 200 GET /hub/home ([email protected]) 6.32ms
14:49:00.058 - info: [ConfigProxy] 200 GET /api/routes
[W 2017-11-28 14:49:00.071 JupyterHub proxy:320] Adding missing route for /user/gcorrea/ (<jupyterhub.objects.Server object at 0x7fffecad70f0>)
[I 2017-11-28 14:49:00.072 JupyterHub proxy:231] Adding user gcorrea to proxy /user/gcorrea/ => http://jovialhub-78fb774fd5-vjfdr:54498
14:49:00.076 - info: [ConfigProxy] Adding route /user/gcorrea -> http://jovialhub-78fb774fd5-vjfdr:54498
14:49:00.077 - info: [ConfigProxy] 201 POST /api/routes/user/gcorrea
14:49:03.132 - error: [ConfigProxy] Proxy error: Error: connect ECONNREFUSED 10.101.1.100:54498
at Object.exports._errnoException (util.js:1018:11)
at exports._exceptionWithHostPort (util.js:1041:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1086:14)
14:49:03.133 - error: [ConfigProxy] 503 GET /user/gcorrea/
[I 2017-11-28 14:49:03.140 JupyterHub log:122] 200 GET /hub/error/503?url=%2Fuser%2Fgcorrea%2F (@10.101.1.100) 2.32ms
[I 2017-11-28 14:49:07.859 JupyterHub log:122] 302 GET / → /hub (@10.10.18.219) 0.98ms
[I 2017-11-28 14:49:08.466 JupyterHub log:122] 302 GET /hub → /hub/login (@10.10.18.219) 0.92ms
[I 2017-11-28 14:49:08.496 JupyterHub log:122] 200 GET /hub/login (@10.10.18.219) 1.63ms
14:49:11.195 - error: [ConfigProxy] Proxy error: Error: connect ECONNREFUSED 10.101.1.100:54498
at Object.exports._errnoException (util.js:1018:11)
at exports._exceptionWithHostPort (util.js:1041:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1086:14)
14:49:11.195 - error: [ConfigProxy] 503 GET /user/gcorrea
[I 2017-11-28 14:49:11.199 JupyterHub log:122] 200 GET /hub/error/503?url=%2Fuser%2Fgcorrea (@10.101.1.100) 2.06ms
[I 2017-11-28 14:49:12.876 JupyterHub log:122] 302 GET /user/cponce/ → /hub/user/cponce/ (@10.10.3.197) 0.94ms
[I 2017-11-28 14:49:12.895 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:49:12.904 JupyterHub log:122] 200 GET /hub/user/cponce/ ([email protected]) 8.48ms
[I 2017-11-28 14:49:14.099 JupyterHub log:122] 302 GET / → /hub (@10.10.3.195) 0.85ms
[I 2017-11-28 14:49:14.128 JupyterHub log:122] 302 GET /hub → /user/gcorrea/ ([email protected]) 6.27ms
14:49:14.142 - error: [ConfigProxy] Proxy error: Error: connect ECONNREFUSED 10.101.1.100:54498
at Object.exports._errnoException (util.js:1018:11)
at exports._exceptionWithHostPort (util.js:1041:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1086:14)
14:49:14.143 - error: [ConfigProxy] 503 GET /user/gcorrea/
[I 2017-11-28 14:49:14.146 JupyterHub log:122] 200 GET /hub/error/503?url=%2Fuser%2Fgcorrea%2F (@10.101.1.100) 1.79ms
[I 2017-11-28 14:49:18.013 JupyterHub base:670] cponce is pending spawn
14:49:19.010 - error: [ConfigProxy] Proxy error: Error: connect ECONNREFUSED 10.101.1.100:54498
at Object.exports._errnoException (util.js:1018:11)
at exports._exceptionWithHostPort (util.js:1041:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1086:14)
14:49:19.011 - error: [ConfigProxy] 503 GET /user/gcorrea/
[I 2017-11-28 14:49:19.014 JupyterHub log:122] 200 GET /hub/error/503?url=%2Fuser%2Fgcorrea%2F (@10.101.1.100) 1.88ms
[I 2017-11-28 14:49:22.599 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:49:27.714 JupyterHub base:670] cponce is pending spawn
14:49:29.459 - error: [ConfigProxy] Proxy error: Error: connect ECONNREFUSED 10.101.1.100:54498
at Object.exports._errnoException (util.js:1018:11)
at exports._exceptionWithHostPort (util.js:1041:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1086:14)
14:49:29.459 - error: [ConfigProxy] 503 GET /user/gcorrea/
[I 2017-11-28 14:49:29.462 JupyterHub log:122] 200 GET /hub/error/503?url=%2Fuser%2Fgcorrea%2F (@10.101.1.100) 1.90ms
[I 2017-11-28 14:49:32.835 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:49:37.968 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:49:43.082 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:49:48.199 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:49:53.433 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:49:54.748 JupyterHub log:122] 302 GET / → /hub (@10.10.3.177) 0.86ms
[I 2017-11-28 14:49:55.002 JupyterHub log:122] 302 GET /hub → /hub/login (@10.10.3.177) 0.85ms
[W 2017-11-28 14:49:55.134 JupyterHub base:201] Invalid or expired cookie token
[W 2017-11-28 14:49:55.134 JupyterHub base:201] Invalid or expired cookie token
[W 2017-11-28 14:49:55.136 JupyterHub base:201] Invalid or expired cookie token
[I 2017-11-28 14:49:55.136 JupyterHub log:122] 200 GET /hub/login (@10.10.3.177) 2.42ms
[I 2017-11-28 14:49:57.138 JupyterHub base:345] User logged in: cvalenzu
[I 2017-11-28 14:49:58.580 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:01.269 JupyterHub log:122] 200 GET /hub/api (@10.101.0.108) 1.23ms
[I 2017-11-28 14:50:03.715 JupyterHub base:670] cponce is pending spawn
[W 2017-11-28 14:50:07.189 JupyterHub base:473] User cvalenzu is slow to start (timeout=10)
[I 2017-11-28 14:50:07.190 JupyterHub log:122] 302 POST /hub/login?next= → /hub/ (@10.10.3.177) 10126.52ms
[I 2017-11-28 14:50:07.255 JupyterHub log:122] 302 GET /hub/ → /hub/home ([email protected]) 3.28ms
[I 2017-11-28 14:50:07.274 JupyterHub log:122] 200 GET /hub/home ([email protected]) 6.43ms
[I 2017-11-28 14:50:08.863 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:11.171 JupyterHub log:122] 302 GET /user/cvalenzu/ → /hub/user/cvalenzu/ (@10.10.3.177) 0.90ms
[I 2017-11-28 14:50:11.207 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:50:11.212 JupyterHub log:122] 200 GET /hub/user/cvalenzu/ ([email protected]) 7.58ms
[I 2017-11-28 14:50:14.016 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:16.401 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:50:19.133 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:21.554 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:50:24.373 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:26.777 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:50:29.738 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:31.949 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:50:34.862 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:37.158 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:50:40.001 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:43.211 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:50:45.153 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:49.356 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:50:50.269 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:50:55.225 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:50:55.409 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:51:00.543 JupyterHub base:670] cponce is pending spawn
[I 2017-11-28 14:51:02.230 JupyterHub base:670] cvalenzu is pending spawn
[I 2017-11-28 14:51:05.157 JupyterHub log:122] 200 GET /hub/home ([email protected]) 9.13ms
[I 2017-11-28 14:51:05.714 JupyterHub base:670] cponce is pending spawn
Here we can see that the hub is waiting for the following users:
User       Pod IP
cponce     10.10.3.197
cvalenzu   10.10.3.177
gcorrea    10.10.3.195
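To cross-check what the hub itself believes, the JupyterHub REST API can also be queried for the user models; the pending field in each model shows which servers the hub still considers to be spawning. This is only a sketch: it assumes an admin API token is available in $ADMIN_TOKEN and that the hub API is reachable through the jupyterhub-internal service on port 8081 shown below.
# hypothetical check of the hub's own view of pending spawns
curl -s -H "Authorization: token $ADMIN_TOKEN" http://jupyterhub-internal:8081/hub/api/users | python -m json.tool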
However, if we execute kubectl get svc,pods -n jovial -o wide, we get the following:
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE SELECTOR
svc/jupyterhub-external NodePort 10.100.0.1 <none> 8000:30000/TCP 32m app=jovial-deployment
svc/jupyterhub-internal ClusterIP 10.100.0.2 <none> 8081/TCP,8001/TCP 32m app=jovial-deployment
NAME READY STATUS RESTARTS AGE IP NODE
po/jovialhub-78fb774fd5-grs28 1/1 Running 0 32m 10.101.1.108 cassaca-node005
po/jupyter-cponce 1/1 Running 0 31m 10.101.0.109 cassaca-node004
po/jupyter-cvalenzu 1/1 Running 0 15s 10.101.1.109 cassaca-node005
po/jupyter-gcorrea 1/1 Running 0 31m 10.101.3.106 cassaca-node007
po/jupyter-mosorio 1/1 Running 0 1d 10.101.1.104 cassaca-node005
po/jupyter-pcampana 1/1 Running 0 1d 10.101.0.107 cassaca-node004
The workaround for this is to restart the hub deployment; that way all of the hub's internal state about the pods is invalidated, but it means every user will be logged out when the hub comes back up.
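For illustration only, deleting the hub pod is enough to make the deployment recreate it (the pod name here is taken from the kubectl output above and changes on every restart):
# forces the deployment to recreate the hub pod; all user sessions are lost
kubectl -n jovial delete pod jovialhub-78fb774fd5-grs28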