Run from JupyterLab / Jupyter Notebook #80

Closed

ahartikainen opened this issue Aug 6, 2019 · 19 comments
@ahartikainen
Contributor

Running httpstan from JupyterLab / Jupyter Notebook fails because Jupyter is already running an asyncio event loop:

~\miniconda3\envs\stan3\lib\asyncio\base_events.py in run_forever(self)
    427             raise RuntimeError(
--> 428                 'Cannot run the event loop while another loop is running')
    429         self._set_coroutine_wrapper(self._debug)

RuntimeError: Cannot run the event loop while another loop is running

Using IPython works (I'm running this on Windows; Python crashes when I exit the interpreter, but I can first save the results to netCDF with ArviZ and use them later).
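For reference, this RuntimeError can be reproduced with the standard library alone: asyncio refuses to start a second event loop from inside a running one, which is effectively the situation inside a Jupyter cell (a minimal sketch, unrelated to httpstan itself):

```python
import asyncio

async def start_second_loop():
    # Inside a running loop (as Jupyter cells effectively are), trying
    # to start another loop raises the error from the traceback above.
    other = asyncio.new_event_loop()
    try:
        other.run_forever()
    except RuntimeError as exc:
        return str(exc)
    finally:
        other.close()

print(asyncio.run(start_second_loop()))
# Cannot run the event loop while another loop is running
```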

@ahartikainen
Contributor Author

Could we run the httpstan server with subprocess.Popen / subprocess.call?

@stale

stale bot commented Oct 22, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the wontfix label Oct 22, 2019
@stale stale bot closed this as completed Oct 29, 2019
@riddell-stan riddell-stan reopened this Dec 14, 2019
@riddell-stan
Contributor

riddell-stan commented Dec 14, 2019

We need to figure out a solution. Perhaps we could create our own event loop?

Python 3.7 also has a new function, asyncio.get_running_loop(), which could be useful if we want to detect whether we are running in a Jupyter notebook environment.

Example of creating a new event loop in a thread: https://gist.github.com/lars-tiede/01e5f5a551f29a5f300e
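Both ideas can be sketched with the standard library alone: asyncio.get_running_loop() detects whether a loop is already running, and a worker thread with its own fresh loop sidesteps the conflict. A minimal sketch (fake_sample is a hypothetical stand-in for httpstan's coroutines, not its real API):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

async def fake_sample():
    # hypothetical stand-in for an httpstan coroutine
    await asyncio.sleep(0)
    return 42

def has_running_loop() -> bool:
    """Detect the Jupyter situation via asyncio.get_running_loop()."""
    try:
        asyncio.get_running_loop()
        return True
    except RuntimeError:
        return False

def run_in_fresh_loop(coro):
    """Run a coroutine on a new event loop in a worker thread.

    Safe even when the calling thread already has a running loop
    (the Jupyter case), because the worker thread has none.
    """
    with ThreadPoolExecutor(max_workers=1) as pool:
        return pool.submit(asyncio.run, coro).result()

print(has_running_loop())                # False at plain-script top level
print(run_in_fresh_loop(fake_sample()))  # 42
```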

@riddell-stan riddell-stan added bug and removed wontfix labels Dec 14, 2019
@riddell-stan riddell-stan added this to the 3.0.0b1 milestone Dec 14, 2019
@riddell-stan riddell-stan modified the milestones: 3.0.0b1, 3.0 Apr 16, 2020
@riddell-stan
Contributor

riddell-stan commented Jun 24, 2020

Things work for me with IPython and Python REPL. The code we have written is entirely valid. I think this is a bug against JupyterLab/Jupyter Notebook.

@riddell-stan riddell-stan removed the bug label Jun 24, 2020
@ahartikainen
Contributor Author

Is this still failing with Jupyter?

@riddell-stan
Contributor

Yes.

Closing this because the underlying issue (asyncio and jupyter notebook) is well documented elsewhere. It looks like the "official" workaround is to install an earlier version of tornado, see jupyter/notebook#3397 .

@ahartikainen
Contributor Author

ahartikainen commented Jul 27, 2020

I have dug into this a bit deeper.

Edit: the code is run from Windows 10 in a JupyterLab interface running on WSL (Ubuntu 18). Running similar code from the terminal causes no problems.

The following code / "hack" lets me compile the model and start the sampling.

import multiprocessing
multiprocessing.set_start_method('spawn', True)
import stan

import concurrent.futures
def exec_async(func, *args, **kwargs):
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
        future = executor.submit(func, *args, **kwargs)
        return future.result()

posterior = exec_async(stan.build, schools_code, data=schools_data)
fit = exec_async(posterior.sample, num_chains=4, num_samples=1000)

But the fit fails randomly with DecodeError: Error parsing message (the try/except was added by me):

~/miniconda3/envs/stan/lib/python3.8/site-packages/stan/model.py in go()
    174                         if resp.status != 200:
    175                             raise RuntimeError((await resp.json())["message"])
--> 176                         stan_outputs.append(tuple(extract_protobuf_messages(await resp.read())))
    177                 for stan_output in stan_outputs:
    178                     assert isinstance(stan_output, tuple), stan_output

~/miniconda3/envs/stan/lib/python3.8/site-packages/stan/model.py in extract_protobuf_messages(fit_bytes)
    130                 next_pos, pos = varint_decoder(fit_bytes, pos)
    131                 try:
--> 132                     msg.ParseFromString(fit_bytes[pos : pos + next_pos])
    133                 except Exception as e:
    134                     print(e)

This seems to happen at random points.

Here is a msg that fails to parse (first, info on where in the fit it occurs):

previous loc:

pos, pos+next_pos, next_pos =  276691 277529 838

failure loc:

pos, pos+next_pos, next_pos = 277531 278369 838
b'\x08\x04\x12\x12\n\x04lp__\x1a\n\n\x08\xc2\x98e\xef\x83\x99C\xc0\x12\x1b\n\raccept_stat_\xbc\x04\x08\x03\x12\x12\n\x04lp__\x1a\n\n\x08\x94\xb1\xe8Q\x12pC\xc0\x12\x1b\n\raccept_stat__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\xf0?\x12\x18\n\nstepsize__\x1a\n\n\x08\x1f\x82\x81\x9dEu\xd5?\x12\x19\n\x0btreedepth__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x08@\x12\x1a\n\x0cn_leapfrog__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x1c@\x12\x19\n\x0bdivergent__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x00\x00\x12\x16\n\x08energy__\x1a\n\n\x08\x88\x14\xa7\x9f\xc93E@\x12\x10\n\x02mu\x1a\n\n\x08\x99\x01\xec\x8c3d!@\x12\x11\n\x03tau\x1a\n\n\x08\x1d\x06V)\xb0\xec\x07@\x12\x13\n\x05eta.1\x1a\n\n\x08>w\x87\x89\x99D\xf5?\x12\x13\n\x05eta.2\x1a\n\n\x08^[\x90\x1c\xda\xd1\xe6\xbf\x12\x13\n\x05eta.3\x1a\n\n\x08HgjR\xa2\xfe\xd5\xbf\x12\x13\n\x05eta.4\x1a\n\n\x08m\xbf\xd1,\xaa\xde\xe6?\x12\x13\n\x05eta.5\x1a\n\n\x08\xb8\xa6sRU\xb9\x98\xbf\x12\x13\n\x05eta.6\x1a\n\n\x08\ta\xd6\x00*(\xfb\xbf\x12\x13\n\x05eta.7\x1a\n\n\x08\xf4Z0\xdc\xd9\xc6\xc6?\x12\x13\n\x05eta.8\x1a\n\n\x08_\xd8\x7f\xb9\xba`\xf1?\x12\x15\n\x07theta.1\x1a\n\n\x08\xc2\x96\xe6:\x82W)@\x12\x15\n\x07theta.2\x1a\n\n\x08\x99\xe3u\x13\x98@\x1a@\x12\x15\n\x07theta.3\x1a\n\n\x08\x02g\x87.\xfa\xab\x1e@\x12\x15\n\x07theta.4\x1a\n\n\x08\x8c\x03I$\x80\xaa%@\x12\x15\n\x07theta.5\x1a\n\n\x08\xccpdd;?!@\x12\x15\n\x07theta.6\x1a\n\n\x08\xb6\x89\xf8fV\xf5\x0c@\x12\x15\n\x07theta.7\x1a\n\n\x08\xc8O\xed\xd4\xa9t"@\x12\x15\n\x07theta.8\x1a\n\n\x08+\xcc\\-;\xe3\'@\xbc\x04\x08\x03\x12\x12\n\x04lp__\x1a\n\n\x08\x84\x0fR\x82f\x08C\xc0\x12\x1b\n\raccept_stat__\x1a\n\n\x08\x9a\xd4:\xd8\xe64\xd5?\x12\x18\n\nstepsize__\x1a\n\n\x08\x1f\x82\x81\x9dEu\xd5?\x12\x19\n\x0btreedepth__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x10@\x12\x1a\n\x0cn_leapfrog__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00.@\x12\x19\n\x0bdivergent__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x00\x00\x12\x16\n\x08energy__\x1a\n\n\x08\x1e\xfc\x8b1\xcc 
G@\x12\x10\n\x02mu\x1a\n\n\x08Y\x14\xf5r\x9b\x95"@\x12\x11\n\x03tau\x1a\n\n\x08\xf6KBM\xda\xf9+@\x12\x13\n\x05'

and here is the same message, but with the 1000 bytes before and after it (whitespace added by me):

b'\x07g_eta.2\x1a\n\n\x08^sa\xef\xadZ\xf2\xbf\x12\x15\n\x07g_eta.3\x1a\n\n\x08L\x9d\xbf\xf6!\x15\xd7\xbf\x12\x15\n\x07g_eta.4\x1a\n\n\x08\xfd\x11\xfeC\xf6\xb1\xed?\x12\x15\n\x07g_eta.5\x1a\n\n\x08\xb1\'i\xfb\xb7\x80\xf3?\x12\x15\n\x07g_eta.6\x1a\n\n\x08\xf0\xdb\x13\xbaG\x1b\x9d?\x12\x15\n\x07g_eta.7\x1a\n\n\x08\xa4=\xab\xbb\x8d\xc2\xd3?\x12\x15\n\x07g_eta.8\x1a\n\n\x08\xe8!\x85\xc0\xe3~\xe0?\xc6\x06\x08\x04\x12\x12\n\x04lp__\x1a\n\n\x08\x16\xac\t\xcc\x7fFC\xc0\x12\x1b\n\raccept_stat__\x1a\n\n\x08>\xd3W\x01\x93\xbe\xee?\x12\x18\n\nstepsize__\x1a\n\n\x08\x1f\x82\x81\x9dEu\xd5?\x12\x19\n\x0btreedepth__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x08@\x12\x1a\n\x0cn_leapfrog__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00.@\x12\x19\n\x0bdivergent__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x00\x00\x12\x16\n\x08energy__\x1a\n\n\x08\x99\x84\xef"wKD@\x12\x10\n\x02mu\x1a\n\n\x08s\x91\xff\x9a(4 @\x12\x11\n\x03tau\x1a\n\n\x08\x89\x0c\xe6\xe6\xd8\xe5\xfe?\x12\x13\n\x05eta.1\x1a\n\n\x08\xf9\xdb7\xf4\xd0\x8b\xe1?\x12\x13\n\x05eta.2\x1a\n\n\x08"&h\x01b\x03\xf0?\x12\x13\n\x05eta.3\x1a\n\n\x08\x9ft\x8d\xe8l\x91\xe1?\x12\x13\n\x05eta.4\x1a\n\n\x08\xe9\xc17}2\x06\xd6?\x12\x13\n\x05eta.5\x1a\n\n\x08\xf2\xa4\x95\x19TT\xea\xbf\x12\x13\n\x05eta.6\x1a\n\n\x08h\x16\xc6\xea\x1dE\xf6\xbf\x12\x13\n\x05eta.7\x1a\n\n\x08d\x90\xe4\xe2\xc1t\xf1\xbf\x12\x13\n\x05eta.8\x1a\n\n\x08\xc6\xda\r\x85\xb2\xad\xd6\xbf\x12\x12\n\x04p_mu\x1a\n\n\x08\xd8\x8dF`\x8b\x8c\xc3?\x12\x13\n\x05p_tau\x1a\n\n\x08\xc0\xfeEu4\xaad\xbf\x12\x15\n\x07p_eta.1\x1a\n\n\x08@cAB\x10\x1f\xf0?\x12\x15\n\x07p_eta.2\x1a\n\n\x08N\xf0\x19\xa5\xd3\xda\xe7?\x12\x15\n\x07p_eta.3\x1a\n\n\x08\x10\x06p[\x93\xcf\xd5?\x12\x15\n\x07p_eta.4\x1a\n\n\x08>\x1dyoMN\xf0\xbf\x12\x15\n\x07p_eta.5\x1a\n\n\x08\x1a\x83\xab\xcc\x1a\xd3\xf3\xbf\x12\x15\n\x07p_eta.6\x1a\n\n\x08D\xe2(\xe3\x08|\xbd?\x12\x15\n\x07p_eta.7\x1a\n\n\x08`?\x8e\xa8\x80,\xa7\xbf\x12\x15\n\x07p_eta.8\x1a\n\n\x08\x84\xbf\rL@\x9a\xdb\xbf\x12\x12\n\x04g_mu\x1a\n\n\x08m\xd3\x03\xf6Y;\xb6\xbf\x12\x13\n\x05g_tau\x
1a\n\n\x08\x9eX\xd8pp-\xea?\x12\x15\n\x07g_eta.1\x1a\n\n\x08p\x85G\x87X\xcb\xab?\x12\x15\n\x07g_eta.2\x1a\n\n\x08`\x83i\x17K\xbe\xf7?\x12\x15\n\x07g_eta.3\x1a\n\n\x08\x9aG\x04\x8cug\xee?\x12\x15\n\x07g_eta.4\x1a\n\n\x08\x9d\x10\xd4e\xf5Y\xe1?\x12\x15\n\x07g_eta.5\x1a\n\n\x08\xa0\xa7\xc3\xd9\xe1\xfd\xe0\xbf\x12\x15\n\x07g_eta.6\x1a\n\n\x08\xd3\xee\x1a+b\x8c\xf8\xbf\x12\x15\n\x07g_eta.7\x1a\n\n\x08f\x9clHuW\x02\xc0\x12\x15\n\x07g_eta.8\x1a\n\n\x08~\xcfF\x14\xb2Q\xdf\xbf\xc6\x06


\x08\x04\x12\x12\n\x04lp__\x1a\n\n\x08\xc2\x98e\xef\x83\x99C\xc0\x12\x1b\n\raccept_stat_\xbc\x04\x08\x03\x12\x12\n\x04lp__\x1a\n\n\x08\x94\xb1\xe8Q\x12pC\xc0\x12\x1b\n\raccept_stat__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\xf0?\x12\x18\n\nstepsize__\x1a\n\n\x08\x1f\x82\x81\x9dEu\xd5?\x12\x19\n\x0btreedepth__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x08@\x12\x1a\n\x0cn_leapfrog__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x1c@\x12\x19\n\x0bdivergent__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x00\x00\x12\x16\n\x08energy__\x1a\n\n\x08\x88\x14\xa7\x9f\xc93E@\x12\x10\n\x02mu\x1a\n\n\x08\x99\x01\xec\x8c3d!@\x12\x11\n\x03tau\x1a\n\n\x08\x1d\x06V)\xb0\xec\x07@\x12\x13\n\x05eta.1\x1a\n\n\x08>w\x87\x89\x99D\xf5?\x12\x13\n\x05eta.2\x1a\n\n\x08^[\x90\x1c\xda\xd1\xe6\xbf\x12\x13\n\x05eta.3\x1a\n\n\x08HgjR\xa2\xfe\xd5\xbf\x12\x13\n\x05eta.4\x1a\n\n\x08m\xbf\xd1,\xaa\xde\xe6?\x12\x13\n\x05eta.5\x1a\n\n\x08\xb8\xa6sRU\xb9\x98\xbf\x12\x13\n\x05eta.6\x1a\n\n\x08\ta\xd6\x00*(\xfb\xbf\x12\x13\n\x05eta.7\x1a\n\n\x08\xf4Z0\xdc\xd9\xc6\xc6?\x12\x13\n\x05eta.8\x1a\n\n\x08_\xd8\x7f\xb9\xba`\xf1?\x12\x15\n\x07theta.1\x1a\n\n\x08\xc2\x96\xe6:\x82W)@\x12\x15\n\x07theta.2\x1a\n\n\x08\x99\xe3u\x13\x98@\x1a@\x12\x15\n\x07theta.3\x1a\n\n\x08\x02g\x87.\xfa\xab\x1e@\x12\x15\n\x07theta.4\x1a\n\n\x08\x8c\x03I$\x80\xaa%@\x12\x15\n\x07theta.5\x1a\n\n\x08\xccpdd;?!@\x12\x15\n\x07theta.6\x1a\n\n\x08\xb6\x89\xf8fV\xf5\x0c@\x12\x15\n\x07theta.7\x1a\n\n\x08\xc8O\xed\xd4\xa9t"@\x12\x15\n\x07theta.8\x1a\n\n\x08+\xcc\\-;\xe3\'@\xbc\x04\x08\x03\x12\x12\n\x04lp__\x1a\n\n\x08\x84\x0fR\x82f\x08C\xc0\x12\x1b\n\raccept_stat__\x1a\n\n\x08\x9a\xd4:\xd8\xe64\xd5?\x12\x18\n\nstepsize__\x1a\n\n\x08\x1f\x82\x81\x9dEu\xd5?\x12\x19\n\x0btreedepth__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x10@\x12\x1a\n\x0cn_leapfrog__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00.@\x12\x19\n\x0bdivergent__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x00\x00\x12\x16\n\x08energy__\x1a\n\n\x08\x1e\xfc\x8b1\xcc 
G@\x12\x10\n\x02mu\x1a\n\n\x08Y\x14\xf5r\x9b\x95"@\x12\x11\n\x03tau\x1a\n\n\x08\xf6KBM\xda\xf9+@\x12\x13\n\x05


eta.1\x1a\n\n\x08<\xad\xca\x82XX\xfd?\x12\x13\n\x05eta.2\x1a\n\n\x08\x10m\xcc615\xb1\xbf\x12\x13\n\x05eta.3\x1a\n\n\x08\xb9\x89r\xec\xe3\x0b\xf2\xbf\x12\x13\n\x05eta.4\x1a\n\n\x08O\xfc\xec(\x92\xa8\xea\xbf\x12\x13\n\x05eta.5\x1a\n\n\x08\xe4tB\xc0\x84c\xe1\xbf\x12\x13\n\x05eta.6\x1a\n\n\x08\xac\xd6\x1b\xd5f\xaf\xf4\xbf\x12\x13\n\x05eta.7\x1a\n\n\x08\xd5\x88\xfeT\x92\xbf\xf4?\x12\x13\n\x05eta.8\x1a\n\n\x08\x14[a\xfd\xdfX\xe0?\x12\x15\n\x07theta.1\x1a\n\n\x08\xa5\xd0\xa1\x04<yA@\x12\x15\n\x07theta.2\x1a\n\n\x08\xb9\xda\xd1\xd83\xb4 @\x12\x15\n\x07theta.3\x1a\n\n\x08\x84\xbf\x82\xe0\x88\xf0\x19\xc0\x12\x15\n\x07theta.4\x1a\n\n\x08,\x98\x87\xb6\x15\xe3\x02\xc0\x12\x15\n\x07theta.5\x1a\n\n\x08\xf8\x06\x89~\xf3\x0e\xfb?\x12\x15\n\x07theta.6\x1a\n\n\x08_S\xb9\xfee\x95!\xc0\x12\x15\n\x07theta.7\x1a\n\n\x08\xba\x81\x03gqn;@\x12\x15\n\x07theta.8\x1a\n\n\x08\xfe>L\xc2\x1dp0@\xbc\x04\x08\x03\x12\x12\n\x04lp__\x1a\n\n\x08\x8f&\xcfa\xc4YD\xc0\x12\x1b\n\raccept_stat__\x1a\n\n\x084</z\xbc\xe3\xed?\x12\x18\n\nstepsize__\x1a\n\n\x08\x1f\x82\x81\x9dEu\xd5?\x12\x19\n\x0btreedepth__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x08@\x12\x1a\n\x0cn_leapfrog__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00.@\x12\x19\n\x0bdivergent__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x00\x00\x12\x16\n\x08energy__\x1a\n\n\x08 \xe1\xbf\x86\xd3cG@\x12\x10\n\x02mu\x1a\n\n\x08\xaf\xc8\xff\x02\xe7|\x16@\x12\x11\n\x03tau\x1a\n\n\x08\xeb\x18\xdf\xe1G#\x0b@\x12\x13\n\x05eta.1\x1a\n\n\x08U\xd7\xa6\x12\x01\xcf\xf6\xbf\x12\x13\n\x05eta.2\x1a\n\n\x08o\xfe\xed-\xa8\xc0\xe2?\x12\x13\n\x05eta.3\x1a\n\n\x08\xad<\xad\xfb\xc8#\xd2?\x12\x13\n\x05eta.4\x1a\n\n\x08$\xc90ci\xf7\xe4?\x12\x13\n\x05eta.5\x1a\n\n\x08\xfb\x92\\T\xe0\x19\xf1?\x12\x13\n\x05eta.6\x1a\n\n\x08\xae\\\x0f+\x0e\x18\xdd\xbf\x12\x13\n\x05eta.7\x1a\n\n\x08\xb8&\xe9\'\xec5\xd9?\x12\x13\n\x05eta.8\x1a\n\n\x08\x95\xf2\x85\x94\x1ao\xff?\x12\x15\n\x07theta.1\x1a\n\n\x08 
\xbam\xae\xc3(\xe9?\x12\x15\n\x07theta.2\x1a\n\n\x08\xfa\x8eBb\x84p\x1e@\x12\x15\n\x07theta.3\x1a\n\n\x08\x9f%\xe7`sU\x1a@\x12\x15\n\x07theta.4\x1a\n\n\x08\x00k\xf2R\xd6`\x1f@\x12\x15\n\x07theta.5\x1a\n\n\x08\x0b\x88\xd4\x80\xcb~"@\x12\x15\n\x07theta.6\x1a\n\n\x08\x92\xea[\x1b\xcfQ\x10@\x12\x15\n\x07theta.7\x1a\n\n\x08\xb6\x98c\xba9\xd5\x1b@\x12\x15\n\x07theta.8\x1a\n\n\x08\x84I\xe3\xc8\xa6\x92(@_\x1a\n\n\x08\xe0\x08\xf0\xb1I\xcf\xef?\x12\x18\n\nstepsize__\x1a\n\n\x08\x1f\x82\x81\x9dEu\xd5?\x12\x19\n\x0btreedepth__\x1a\n\n\x08\x00\x00\x00\x00\x00\x00\x10@\x12\x1a\n\x0cn_leapfr'

It somehow looks like accept_stat__ appears twice (but the first occurrence has only one trailing underscore)?
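For context, the pos / next_pos values above come from length-prefixed framing: each protobuf message in the response body is preceded by a base-128 varint giving its length in bytes. A minimal sketch of such a decoder (an illustration only, not httpstan's actual varint_decoder):

```python
def decode_varint(data: bytes, pos: int):
    """Decode a base-128 varint starting at data[pos].

    Returns (value, new_pos): the decoded integer and the offset
    just past the varint bytes.
    """
    result = 0
    shift = 0
    while True:
        byte = data[pos]
        result |= (byte & 0x7F) << shift  # low 7 bits carry payload
        pos += 1
        if not (byte & 0x80):             # high bit clear: last byte
            return result, pos
        shift += 7

# 838, the message length printed in the debug output above,
# encodes as the two bytes 0xC6 0x06 (visible as \xc6\x06 in the dump)
print(decode_varint(b'\xc6\x06', 0))  # (838, 2)
```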

@ahartikainen
Contributor Author

I noticed that there is an error in the terminal (for the Jupyter case):

[libprotobuf ERROR google/protobuf/wire_format_lite.cc:577] String field 'stan.WriterMessage.Feature.name' contains invalid UTF-8 data when parsing a protocol buffer. Use the 'bytes' type if you intend to send raw bytes.

@riddell-stan
Contributor

riddell-stan commented Jul 27, 2020

I'd be open to discussing whether or not it's a good practice to spend time working around another package's bug --- especially when that bug is clearly identified and widely acknowledged. In general, I would not be inclined to support the practice.

@ahartikainen
Contributor Author

I'm not suggesting we change anything.

I'm just keeping a log of what is going on. There is probably a way to fix this at some point.

If there is no way to run the code in Jupyter, then at least my own use is quite limited (given that I code in Jupyter 99+% of the time). It would also restrict tutorial formats, or force users into subprocess tricks.

@riddell-stan
Contributor

riddell-stan commented Jul 28, 2020

Thanks. I appreciate the clarification.
What about pinning Jupyter's tornado to a lower version (in a virtualenv)? Doesn't that solve the problem cleanly? (I also imagine that Jupyter will eventually fix the problem.)

@ahartikainen
Contributor Author

That might work. Not sure if JupyterLab supports it.

@ahartikainen
Contributor Author

ahartikainen commented Jul 29, 2020

I finally cracked it. Silly mistake on my end (an indentation error with a return statement, which caused some interference between the threads or something similar).

So to get sampling working in Jupyter (Notebook / Lab) with recent Tornado, one needs to do the following:

  1. Set the multiprocessing start method to spawn before importing anything that uses multiprocessing (e.g. import stan)
  2. Run the function in a new thread (see the exec_async function)

    import multiprocessing
    multiprocessing.set_start_method('spawn', True)
    
    import concurrent.futures

    def exec_async(func, *args, **kwargs):
        with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
            future = executor.submit(func, *args, **kwargs)
        return future.result()

    import stan

    ....

    # notice functions are not called on these lines
    posterior = exec_async(stan.build, schools_code, data=schools_data)
    fit = exec_async(posterior.sample, num_chains=4, num_samples=1000)

Not sure if there is a way to inject this into pystan; probably not.
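To see why the functions are passed uncalled: exec_async forwards the callable and its arguments to the worker thread, where the actual call happens. A quick sanity check with a stand-in callable (pow here is hypothetical, in place of stan.build / posterior.sample):

```python
import concurrent.futures

def exec_async(func, *args, **kwargs):
    # Submit the callable (not its result!) to a one-thread pool;
    # the call itself runs on the worker thread.
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
        future = executor.submit(func, *args, **kwargs)
    return future.result()

# pow stands in for stan.build / posterior.sample
print(exec_async(pow, 2, 10))  # 1024
```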

@ahartikainen
Contributor Author

I created this little wrapper, which can be imported in Jupyter as

import stan_jupyter as stan

The wrapper itself:

import multiprocessing
multiprocessing.set_start_method('spawn', True)
del multiprocessing

from concurrent.futures import ThreadPoolExecutor as _ThreadPoolExecutor

def _exec_async(func, *args, **kwargs):
    with _ThreadPoolExecutor(max_workers=1) as executor:
        future = executor.submit(func, *args, **kwargs)
    return future.result()

from importlib import reload
import stan as _stan
_stan = reload(_stan) # force reload if stan was imported previously
del reload

def _inject_posterior(posterior):
    """Inject posterior.sample with call in thread."""
    posterior._sample = posterior.sample
    
    def sample(**kwargs):
        """"""
        return _exec_async(posterior._sample, **kwargs)
    
    sample.__doc__ += posterior.sample.__doc__
    posterior.sample = sample

def build(program_code, data=None, random_seed=None):
    """"""
    posterior = _exec_async(_stan.build, program_code, data=data, random_seed=random_seed)
    _inject_posterior(posterior)
    return posterior

build.__doc__ += _stan.build.__doc__
try:
    __version__ = _stan.__version__
except AttributeError:
    pass

P.S. Running Jupyter from WSL works fine, but when running it from WSL2 you need to set the correct IP for it, and the IP is dynamic. The following works (where python should be the main Python you use; if you use the system Python, change python to python3):

jupyter lab --ip $(python -c "import subprocess; print(subprocess.run(['hostname', '-I'], capture_output=True).stdout.strip().decode('utf-8'))")

@riddell-stan
Contributor

riddell-stan commented Aug 7, 2020 via email

@ahartikainen
Contributor Author

It does sound fine.

@nrakocz

nrakocz commented Oct 28, 2020

> [quotes the stan_jupyter wrapper and WSL note from the comment above]

Hi,
I'm using stan_jupyter but I still encounter DecodeError: Error parsing message.
It happened to me at the end of sampling.

DecodeError                               Traceback (most recent call last)
<ipython-input-22-b17b9817b7b6> in <module>
----> 1 fit = posterior3.sample(num_chains=4, num_samples=1000)

~/.conda/envs/pystan3/lib/python3.8/site-packages/stan_jupyter/__init__.py in sample(**kwargs)
     30     def sample(**kwargs):
     31         """"""
---> 32         return _exec_async(posterior._sample, **kwargs)
     33 
     34     # Update sample function docstring with posterior.sample docstring.

@ahartikainen
Contributor Author

Hi, please raise an issue here: https://github.com/ahartikainen/pystan-jupyter

I'm not 100% sure that running the Stan process in a thread is the best option (which is what pystan-jupyter does). Have you tried nest_asyncio?
