Support for prioritizing testing according to time since last test #35

Open — wants to merge 123 commits into base: master

Commits
4691e72
Add support for recording last time tested, as well as giving priorit…
boazbk Aug 18, 2020
4159e1b
Notebook with priority testing
boazbk Aug 18, 2020
fd46ae2
Fixed bug with sorting
boazbk Aug 18, 2020
51d3ad0
Remove testing notebook from git tracking
boazbk Aug 18, 2020
7da545a
Revert "Remove testing notebook from git tracking"
boazbk Aug 18, 2020
9800f6f
Remove testing notebook from git tracking
boazbk Aug 18, 2020
c974004
Add support for logging history, as well as as policy to stop simulat…
boazbk Aug 19, 2020
4435382
Realized that "obvious testing" can already be realized using the `ra…
boazbk Aug 19, 2020
309f789
Realized that "obvious testing" can already be realized using the `ra…
boazbk Aug 19, 2020
b686d5f
fix bug
boazbk Aug 19, 2020
581e48c
adding correct attributes, giving stopping policy access to history
boazbk Aug 19, 2020
50b4f51
stopping policy returns True to stop
boazbk Aug 19, 2020
f7823ae
fix indent bug
boazbk Aug 19, 2020
d4b60d7
fix bug
boazbk Aug 19, 2020
b38d244
fix bug
boazbk Aug 20, 2020
7411f1b
wip
boazbk Aug 20, 2020
977b9e5
ignore temp files
boazbk Aug 20, 2020
ae6d270
Added ability to introduce different number of exposures per group, a…
boazbk Aug 20, 2020
5921093
comment typos
boazbk Aug 24, 2020
fb69f6e
Added functions to map dictionary of history into pandas DataFrame an…
boazbk Aug 24, 2020
66baff0
Keep track of number of nodes, add more fields in logging
boazbk Aug 24, 2020
cb78cc9
Support for running many executions in parallel
boazbk Aug 24, 2020
19f49c9
Merge branch 'parallel' into latest
boazbk Aug 24, 2020
5695582
Added support for "tracing out" parameters to `run_tti_sim` so don't …
boazbk Aug 24, 2020
f6b5984
Added support for "tracing out" parameters to `run_tti_sim` so don't …
boazbk Aug 24, 2020
da5eebb
wip
boazbk Aug 24, 2020
f659e77
add gitignore
boazbk Aug 24, 2020
aa328a0
ignore cached files
boazbk Aug 24, 2020
e5cd795
ignore cached files
boazbk Aug 24, 2020
ccb6919
Merge remote-tracking branch 'origin/latest' into latest
boazbk Aug 24, 2020
6b9bfe6
wip
boazbk Aug 25, 2020
c6424c4
Ability to express functions that output first element (useful for gr…
boazbk Aug 25, 2020
c7e68dd
Merge remote-tracking branch 'origin/latest' into latest
boazbk Aug 25, 2020
ac3e99d
Ability to express functions that output first element (useful for gr…
boazbk Aug 25, 2020
54351d6
fix bugs in parallel run
boazbk Aug 25, 2020
3d98f82
Suppress warning
boazbk Aug 25, 2020
ad9dcc7
wip
boazbk Aug 25, 2020
24a06ed
wip
boazbk Aug 25, 2020
f08f89e
Log numTested_random
boazbk Aug 27, 2020
d1ec520
Merge remote-tracking branch 'origin/latest' into latest
boazbk Aug 27, 2020
9fb502c
move stopping policy check
boazbk Aug 27, 2020
d2a0280
make time the log of the index
boazbk Aug 27, 2020
74b0b9d
make time the log of the index
boazbk Aug 27, 2020
272f002
ensured there is an event every day, added option to continue simulat…
boazbk Aug 27, 2020
f9073c1
fix bug in time
boazbk Aug 27, 2020
fa5e2a5
Merge branch 'latest' of https://github.com/boazbk/seirsplus into latest
boazbk Aug 28, 2020
76a7c20
Log results at first detection
boazbk Aug 31, 2020
53ac015
Allow float parameters
boazbk Aug 31, 2020
438a123
Logging positive tests
boazbk Sep 1, 2020
9668554
Deferring execution for parallel runs, generate graph with isolation …
boazbk Sep 1, 2020
0380313
Progress bar for parallel run
boazbk Sep 1, 2020
e253b99
Fix bug
boazbk Sep 1, 2020
27c5703
Fix bug
boazbk Sep 1, 2020
38c6ae4
Test
boazbk Sep 1, 2020
705078c
For debugging
boazbk Sep 1, 2020
26f5ace
For debugging
boazbk Sep 1, 2020
f5670d1
For debugging
boazbk Sep 1, 2020
977874b
More robust type check for pickling
boazbk Sep 1, 2020
37b923d
fix bug
boazbk Sep 1, 2020
18a8e81
allow non disjoint isolation groups
boazbk Sep 1, 2020
647e12b
allow non disjoint isolation groups
boazbk Sep 1, 2020
0989ce6
log first detection time, separate out from isolation logic
boazbk Sep 1, 2020
e4315d5
debugging
boazbk Sep 1, 2020
8d738ce
fix bug
boazbk Sep 1, 2020
e51d13f
improved logging
boazbk Sep 1, 2020
0450223
fix typo
boazbk Sep 1, 2020
913dad1
fix div by zero error
boazbk Sep 1, 2020
1622d28
fix bug
boazbk Sep 1, 2020
a58ccee
fix bug
boazbk Sep 1, 2020
ce1c16d
fix bug
boazbk Sep 1, 2020
da7a9b6
make list of dictionaries
boazbk Sep 1, 2020
5a7094a
make list of dictionaries
boazbk Sep 1, 2020
f9ca71b
make list of dictionaries
boazbk Sep 1, 2020
6ed10c3
make list of dictionaries
boazbk Sep 1, 2020
7826dce
import sns plt
boazbk Sep 1, 2020
28dff91
typo
boazbk Sep 1, 2020
c55ba6b
track overallInfected
boazbk Sep 3, 2020
0ce715e
log number unquaranteened infectious
boazbk Sep 3, 2020
5c6878e
Don't make parameters to strings
boazbk Sep 3, 2020
5c55001
Improved logging
boazbk Sep 3, 2020
59a75c4
missing import
boazbk Sep 3, 2020
8368f0c
missing import
boazbk Sep 3, 2020
1a9992f
added support for decreasing budget as pool of people to be tested de…
boazbk Sep 3, 2020
abf3ebd
fix bug
boazbk Sep 3, 2020
9e1624e
Fix scaled average in logging
boazbk Sep 8, 2020
87c5720
Support for policies to stop after detection or to test everyone afte…
boazbk Sep 8, 2020
e729aa0
fix typo
boazbk Sep 8, 2020
8baab94
add missing argument
boazbk Sep 8, 2020
cd0b3fd
fix bug in interval scaling
boazbk Sep 8, 2020
2347cd4
fix bug in interval scaling
boazbk Sep 8, 2020
928d194
fix bug in interval scaling
boazbk Sep 8, 2020
f8c2a9c
fix bug in interval scaling
boazbk Sep 8, 2020
d304abe
fix another bug in interval scaling
boazbk Sep 8, 2020
ba29a9a
fix another bug in interval scaling
boazbk Sep 8, 2020
2e5f1bf
fix another bug in interval scaling
boazbk Sep 8, 2020
04dee3a
fix another bug in interval scaling
boazbk Sep 8, 2020
17db0b2
drop printing
boazbk Sep 8, 2020
1ec9ef3
another log bug
boazbk Sep 8, 2020
8b4bdf0
violin alignment
boazbk Sep 10, 2020
707aeeb
modified parenthesis
boazbk Sep 10, 2020
2d24ac6
fix tick alignment
boazbk Sep 10, 2020
b3f030b
longer strings
boazbk Sep 10, 2020
874694a
fix typo
boazbk Sep 10, 2020
ce82dcf
Import all of networkx graph gen routines
boazbk Sep 11, 2020
bc707c1
Allow digraphs
boazbk Sep 11, 2020
6d6e26a
Allow digraphs
boazbk Sep 11, 2020
5c760d8
Add support for single introduction
boazbk Sep 11, 2020
e555086
Logging bug
boazbk Sep 11, 2020
7ab0ba4
Introduction of exposures code
boazbk Sep 11, 2020
b92b28c
fix log
boazbk Sep 11, 2020
88c22d9
frequency
boazbk Sep 14, 2020
23f664e
update cadence cycle length
boazbk Sep 14, 2020
1a39763
update cadence cycle length
boazbk Sep 16, 2020
fad92a2
Enable skip asym/sym
boazbk Oct 15, 2020
22542af
fix bug
boazbk Oct 15, 2020
663e5d4
fix typo
boazbk Oct 19, 2020
3bc53d3
fix typo
boazbk Oct 19, 2020
b91b22d
fix typo
boazbk Oct 19, 2020
8bdca3c
fix typo
boazbk Oct 19, 2020
b0d0f25
Merge branch 'latest'
boazbk Nov 16, 2020
b8cf1aa
Update readm
boazbk Nov 16, 2020
f954bd8
Update readme
boazbk Nov 16, 2020
040f167
Update readme
boazbk Nov 16, 2020
Binary file removed .DS_Store
Binary file not shown.
3 changes: 3 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -2,3 +2,6 @@
*.swp

.DS_Store

/seirsplus/.ipynb_checkpoints/
*.pyc
8 changes: 8 additions & 0 deletions .idea/.gitignore

Some generated files are not rendered by default.

19 changes: 19 additions & 0 deletions README.md
@@ -1,3 +1,22 @@

This is a fork of [the SEIRS+ package](https://github.com/ryansmcgee/seirsplus) by Ryan Seamus McGee.

The main additions to the package are the following:

* Support for running multiple simulations in parallel using the file `parallel_run.py`

* More support for logging results of simulations as `pandas` DataFrames.

* More flexibility in testing policies, in particular for asymptomatic testing of sub-groups.

* Some fixes for low-prevalence situations, ensuring that there is an event in the simulation loop every day even when there are few or no infected people.

This fork was made by Boaz Barak for the paper "Optimizing testing policies for detecting COVID-19 outbreaks" by Janni Yuval, Mor Nitzan, Neta Ravid Tannenbaum, and Boaz Barak.
See [the repository boazbk/testingstrategies](https://github.com/boazbk/testingstrategies) for examples of how to use it.
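The parallel-run workflow expects a pickled list of parameter dictionaries (this is what `save_to_file` and `main` in `parallel_run.py` read and write). A minimal sketch of preparing such a file follows; the parameter keys here are placeholders, not necessarily valid `ExtSEIRSNetworkModel` or `run_tti_sim` arguments:

```python
import pickle

# Hypothetical parameter dictionaries: in real use the keys should match
# arguments of ExtSEIRSNetworkModel or run_tti_sim; any other key is
# treated as a descriptive field by parallel_run.run.
to_run = [
    {"label": "daily testing", "budget": 10},
    {"label": "weekly testing", "budget": 2},
]

# save_to_file in parallel_run.py pickles the list like this:
with open("torun.pickle", "wb") as handle:
    pickle.dump(to_run, handle, protocol=pickle.HIGHEST_PROTOCOL)

# The script would then be launched with something like:
#   python parallel_run.py --torun torun.pickle --realizations 5 --savename data
with open("torun.pickle", "rb") as handle:
    loaded = pickle.load(handle)
print(len(loaded))  # 2
```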

The remainder of this README is taken from the original SEIRS+ package.


# SEIRS+ Model Framework

This package implements models of generalized SEIRS infectious disease dynamics with extensions that allow us to study the effect of social contact network structures, heterogeneities, stochasticity, and interventions, such as social distancing, testing, contact tracing, and isolation.
Binary file removed examples/.DS_Store
Binary file not shown.
2 changes: 1 addition & 1 deletion examples/Extended_SEIRS_Workplace_TTI_Demo.ipynb
@@ -3279,4 +3279,4 @@
},
"nbformat": 4,
"nbformat_minor": 4
}
}
Binary file removed images/.DS_Store
Binary file not shown.
Binary file removed seirsplus/.DS_Store
Binary file not shown.
6 changes: 0 additions & 6 deletions seirsplus/.ipynb_checkpoints/Untitled-checkpoint.ipynb

This file was deleted.

163 changes: 133 additions & 30 deletions seirsplus/models.py

Large diffs are not rendered by default.

186 changes: 186 additions & 0 deletions seirsplus/parallel_run.py
@@ -0,0 +1,186 @@
## Support for running many simulation executions in parallel

import sys

import os
p = os.path.dirname(os.path.abspath(__file__))
sys.path.append(p)


from models import *
from networks import *
from sim_loops import *
from utilities import *
import collections
import random
import pandas as pd  # used below to build the results DataFrame

import pickle
import inspect
import networkx
import argparse
import string
from networkx import *


try:
from p_tqdm import p_umap # https://github.com/swansonk14/p_tqdm
except ImportError:
print("Please install p_tqdm package via pip install p_tqdm")
print("Preparing code for parallel run will work, but running the script will not")
def p_umap(*L):
raise Exception("Package p_tqdm not found")



class Defer:
"""Class for deferring computation
Defer(f,positional and keyword arguments) stores f's name and the arguments for later evaluation"""
def __init__(self,f,*args,**kwds):
self.f_name = f.__name__
self.args = args
self.kwds = kwds

def __str__(self):
res = self.f_name+"("
res += ", ".join([str(a) for a in self.args])
if self.args and self.kwds:
res+=", "
if self.kwds:
res += ", ".join([str(k)+"="+str(v) for k,v in self.kwds.items()])
return res + ")"  # close the parenthesis opened above

def eval(self):
f = globals()[self.f_name]
args_ = [unpack(a) for a in self.args]
kwds_ = {k:unpack(v) for k,v in self.kwds.items()}
return f(*args_,**kwds_)

def unpack(O):
if "Defer" in type(O).__name__:
return O.eval()
return O


def generate_workplace_contact_network_(*args,**kwds):
"""Helper function to produce graph + isolation groups"""
G, cohorts, teams = generate_workplace_contact_network(*args,**kwds)
return G, list(teams.values())

def generate_workplace_contact_network_deferred(*args,**kwds):
"""Returns a deferred execution of the function that generates the graph and isolation groups"""
return Defer(generate_workplace_contact_network_,*args,**kwds)



def run(params_, keep_model = False):
"""Run an execution with given parameters"""
params = { key: unpack(val) for key,val in params_.items() }
# replace a key/value pair of the form (k1,k2,k3):(v1,v2,v3) with k1:v1, k2:v2, k3:v3, etc.
# useful if several keys depend on the same deferred computation
for key in list(params.keys()):
if isinstance(key,tuple):
L = params[key]
if not isinstance(L,(list,tuple)):
raise Exception("Value " + str(L) + " of type " + str(type(L)) + " is not a list or tuple")
if len(L) != len(key):
raise Exception("Key " + str(key) + " should have same length as value " + str(L))
for i,subkey in enumerate(key):
params[subkey] = L[i]
del params[key]

if ('G_Q' not in params) or (not params['G_Q']):
params['G_Q'] = networkx.classes.function.create_empty_copy(params["G"]) # default quarantine graph is empty
desc = { "run_id" : "".join(random.choices(string.ascii_lowercase, k=8)) } # unique random id string to help in aggregating
model_params = {}
run_params = {}
for k, v in params.items():
if k in inspect.signature(ExtSEIRSNetworkModel).parameters:
model_params[k] = v
elif k in inspect.signature(run_tti_sim).parameters:
run_params[k] = v
else:
desc[k] = v
desc.update({key : make_compact(val) for key,val in model_params.items() })
desc.update({key : make_compact(val) for key,val in run_params.items() })
if ("verbose" in params_) and params_["verbose"]:
print("Parameters :", desc)
model = ExtSEIRSNetworkModel(**model_params)
hist = collections.OrderedDict()
run_tti_sim(model, history=hist, **run_params)
df, summary = hist2df(hist,**desc)
m = model if keep_model else None
return df,summary, m

def run_(T):
# single parameter version of run - returns only summary with an additional "model"
T[0]["verbose"] = False # no printouts when running in parallel
df, summary,model = run(T[0],T[1])
summary["model"] = model
return summary.to_dict()

def parallel_run(to_do, realizations= 1, keep_in = 0):
"""Take a list of dictionaries of model and run parameters and run each for the given number of realizations in parallel.
The first keep_in realizations of each parameter setting also keep their model object."""
print("Preparing list to run", flush=True)
run_list = [(D, r < keep_in) for r in range(realizations) for D in to_do]
#print(f"We have {mp.cpu_count()} CPUs")
#pool = mp.Pool(mp.cpu_count())
print("Starting execution of " +str(len(run_list)) +" runs", flush=True)
rows = list(p_umap(run_,run_list))
#rows = list(pool.map(run_, run_list))
print("done", flush=True)
df = pd.DataFrame(rows)
return df

def save_to_file(L,filename = 'torun.pickle'):
"""Save list of parameter dictionaries to run"""
with open(filename, 'wb') as handle:
pickle.dump(L, handle, protocol=pickle.HIGHEST_PROTOCOL)

def read_from_file(prefix= 'data'):
"""Load and concatenate the zipped pickle chunks saved by main()"""
i = 1
chunks = []
while os.path.exists(prefix+"_"+str(i)+".zip"):
print("Loading chunk "+ str(i), flush=True)
chunks.append(pd.read_pickle(prefix+"_"+str(i)+".zip"))
i += 1
return pd.concat(chunks)


def main():
parser = argparse.ArgumentParser()
parser.add_argument("--torun", default = "torun.pickle", help="File name of list to run")
parser.add_argument("--realizations", default = 5, type=int, help="Number of realizations")
parser.add_argument("--savename", default="data", help="File name to save resulting data (with csv and zip extensions)")
args = parser.parse_args()
print("Arguments:")
for arg in vars(args):
print(arg,":", getattr(args, arg))
print("Loading torun", flush=True)
with open(args.torun, 'rb') as handle:
torun = pickle.load(handle)
print("Loaded", flush=True)
data = parallel_run(torun, args.realizations)
print("Saving csv", flush=True)
data.to_csv(args.savename+'.csv')
chunk_size = 100000
print("Saving split parts", flush=True)
i = 1
for start in range(0, data.shape[0], chunk_size):
print("Saving pickle " + str(i), flush = True)
temp = data.iloc[start:start + chunk_size]
fname = args.savename+"_"+str(i)+".zip"
temp.to_pickle(fname)
i += 1
fname = args.savename + "_" + str(i) + ".zip"
if os.path.exists(fname): # so there is no confusion that this was the last part
os.remove(fname)
print("Done", flush=True)




if __name__ == "__main__":
# execute only if run as a script
main()
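The `Defer`/`unpack` pattern in `parallel_run.py` can be exercised on its own. The sketch below re-creates a minimal version of the class (with the closing parenthesis added in `__str__`) together with a stand-in function `build_graph`, since the real graph generators are not needed to show the mechanism:

```python
# Minimal re-creation of the Defer/unpack pattern: store a function's name
# plus its arguments now, look the function up in globals() and call it
# later (e.g. inside a worker process). build_graph is a hypothetical
# stand-in for generate_workplace_contact_network_.

def build_graph(n, seed=0):
    # stand-in for a graph generator; returns a trivial "graph"
    return list(range(n))

class Defer:
    def __init__(self, f, *args, **kwds):
        self.f_name = f.__name__
        self.args = args
        self.kwds = kwds

    def __str__(self):
        parts = [str(a) for a in self.args]
        parts += [str(k) + "=" + str(v) for k, v in self.kwds.items()]
        return self.f_name + "(" + ", ".join(parts) + ")"

    def eval(self):
        f = globals()[self.f_name]
        args_ = [unpack(a) for a in self.args]
        kwds_ = {k: unpack(v) for k, v in self.kwds.items()}
        return f(*args_, **kwds_)

def unpack(obj):
    # recursively evaluate deferred computations, pass anything else through
    return obj.eval() if isinstance(obj, Defer) else obj

d = Defer(build_graph, 3, seed=1)
print(str(d))     # build_graph(3, seed=1)
print(unpack(d))  # [0, 1, 2]
```

Deferring the call this way keeps the parameter dictionaries picklable even when the graph itself would be expensive or awkward to serialize up front.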
