Merge branch 'new_Arkheia_export' into lgn_playground_2
rozsatib committed May 7, 2024
2 parents 3634eb0 + 9ebbd88 commit b869bb7
Showing 12 changed files with 298 additions and 36 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/fast-model-mpi-explosion.yml
@@ -20,7 +20,7 @@ jobs:
           pip3 install pytest pytest-cov pytest-randomly coverage black
           # TODO: Remove fixed numpy and pynn versions after the PyNN pull request
           # https://github.com/NeuralEnsemble/PyNN/pull/762 is accepted
-          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba
+          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba numpyencoder sphinx
       - name: Download and install imagen
         run: |
           git clone https://github.com/antolikjan/imagen.git
2 changes: 1 addition & 1 deletion .github/workflows/fast-model-mpi.yml
@@ -20,7 +20,7 @@ jobs:
           pip3 install pytest pytest-cov pytest-randomly coverage black
           # TODO: Remove fixed numpy and pynn versions after the PyNN pull request
           # https://github.com/NeuralEnsemble/PyNN/pull/762 is accepted
-          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba
+          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba numpyencoder sphinx
       - name: Download and install imagen
         run: |
           git clone https://github.com/antolikjan/imagen.git
2 changes: 1 addition & 1 deletion .github/workflows/fast-model-stepcurrentmodule.yml
@@ -19,7 +19,7 @@ jobs:
           pip3 install pytest pytest-cov pytest-randomly coverage black
           # TODO: Remove fixed numpy and pynn versions after the PyNN pull request
           # https://github.com/NeuralEnsemble/PyNN/pull/762 is accepted
-          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython psutil future requests elephant pytest-xdist pytest-timeout junitparser numba
+          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython psutil future requests elephant pytest-xdist pytest-timeout junitparser numba numpyencoder sphinx
       - name: Download and install imagen
         run: |
2 changes: 1 addition & 1 deletion .github/workflows/fast-model.yml
@@ -19,7 +19,7 @@ jobs:
           pip3 install pytest pytest-cov pytest-randomly coverage black
           # TODO: Remove fixed numpy and pynn versions after the PyNN pull request
           # https://github.com/NeuralEnsemble/PyNN/pull/762 is accepted
-          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba
+          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba numpyencoder sphinx
       - name: Download and install imagen
         run: |
2 changes: 1 addition & 1 deletion .github/workflows/lsv1m-model.yml
@@ -19,7 +19,7 @@ jobs:
           pip3 install pytest pytest-cov pytest-randomly coverage black
           # TODO: Remove fixed numpy and pynn versions after the PyNN pull request
           # https://github.com/NeuralEnsemble/PyNN/pull/762 is accepted
-          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba
+          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba numpyencoder sphinx
       - name: Download and install imagen
         run: |
2 changes: 1 addition & 1 deletion .github/workflows/unit-tests.yml
@@ -19,7 +19,7 @@ jobs:
           pip3 install pytest pytest-cov pytest-randomly coverage black
           # TODO: Remove fixed numpy and pynn versions after the PyNN pull request
           # https://github.com/NeuralEnsemble/PyNN/pull/762 is accepted
-          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba
+          pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython pynn==0.10.0 psutil future requests elephant pytest-xdist pytest-timeout junitparser numba numpyencoder sphinx
       - name: Download and install imagen
         run: |
2 changes: 1 addition & 1 deletion README.rst
@@ -51,7 +51,7 @@ ____________

 Now you can install all other dependencies in this protected environment::

-    pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython psutil future requests elephant pytest-xdist pytest-timeout junitparser numba
+    pip3 install numpy==1.23.5 scipy mpi4py matplotlib quantities lazyarray interval Pillow param==1.5.1 parameters neo==0.12.0 cython psutil future requests elephant pytest-xdist pytest-timeout junitparser numba numpyencoder sphinx

 Next we will manually install several packages. It is probably best if you create a separate
 directory in an appropriate place, where you will download and install the packages from.
36 changes: 25 additions & 11 deletions mozaik/controller.py
@@ -12,6 +12,7 @@
 import time
 from datetime import datetime
 import logging
+from mozaik.tools.json_export import save_json, get_experimental_protocols, get_recorders, get_stimuli

 logger = mozaik.getMozaikLogger()

@@ -139,20 +140,28 @@ def prepare_workflow(simulation_name, model_class):


     if mozaik.mpi_comm.rank == mozaik.MPI_ROOT:
-        # Let's store the full and modified parameters, if we are the 0 rank process
-        parameters.save(Global.root_directory + "parameters", expand_urls=True)
-        import pickle
-        f = open(Global.root_directory+"modified_parameters","wb")
-        pickle.dump(str(modified_parameters),f)
-        f.close()
+        # Store simulation run info, if we are the 0 rank process,
+        # with several components to be stored/filled in later during the simulation run
+        sim_info = {
+            'submission_date' : None,
+            'run_date': datetime.now().strftime('%d/%m/%Y-%H:%M:%S'),
+            'simulation_run_name': simulation_run_name,
+            'model_name': simulation_name,
+            "model_description": model_class.__doc__,
+            'results': {"$ref": "results.json"},
+            'stimuli': {"$ref": "stimuli.json"},
+            'recorders': {"$ref": "recorders.json"},
+            'experimental_protocols': {"$ref": "experimental_protocols.json"},
+            'parameters': {"$ref": "parameters.json"},
+        }
+        save_json(sim_info, Global.root_directory + 'sim_info.json')
+        save_json(parameters.to_dict(), Global.root_directory + 'parameters.json')
+        save_json(modified_parameters, Global.root_directory + 'modified_parameters.json')
+        recorders = get_recorders(parameters.to_dict())
+        save_json(recorders, Global.root_directory + 'recorders.json')

     setup_logging()

-    if mozaik.mpi_comm.rank == mozaik.MPI_ROOT:
-        # Let's store some basic info about the simulation run
-        f = open(Global.root_directory+"info","w")
-        f.write(str({'model_class' : str(model_class), 'model_docstring' : model_class.__doc__,'simulation_run_name' : simulation_run_name, 'model_name' : simulation_name, 'creation_data' : datetime.now().strftime('%d/%m/%Y-%H:%M:%S')}))
-        f.close()
     return sim, num_threads, parameters

 def run_workflow(simulation_name, model_class, create_experiments):
@@ -264,4 +273,9 @@ def run_experiments(model,experiment_list,parameters,load_from=None):
     logger.info('Simulator run time: %.0fs (%d%%)' % (simulation_run_time, int(simulation_run_time /total_run_time * 100)))
     logger.info('Mozaik run time: %.0fs (%d%%)' % (mozaik_run_time, int(mozaik_run_time /total_run_time * 100)))

+    experimental_protocols = get_experimental_protocols(data_store)
+    stimuli = get_stimuli(data_store, parameters.store_stimuli, parameters.input_space)
+    save_json(experimental_protocols, Global.root_directory + 'experimental_protocols.json')
+    save_json(stimuli, Global.root_directory + 'stimuli.json')
+
     return data_store
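The new `mozaik/tools/json_export.py` module is among the 12 changed files but is not shown in this excerpt. A minimal `save_json` consistent with the calls above might look like the sketch below; the `default=str` fallback is an assumption (the commit also adds the `numpyencoder` dependency, so the actual module may delegate NumPy-type conversion to that package instead):

```python
import json

def save_json(data, path):
    """Write `data` to `path` as indented JSON.

    Objects the stdlib encoder cannot serialize (e.g. NumPy scalars)
    are converted with str() so the export does not raise on exotic types.
    """
    with open(path, "w") as f:
        json.dump(data, f, indent=4, default=str)
```

Any dictionary written this way (such as `sim_info` above) round-trips cleanly through `json.load`.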
15 changes: 5 additions & 10 deletions mozaik/meta_workflow/parameter_search.py
@@ -7,6 +7,8 @@
 import re
 from mozaik.cli import parse_parameter_search_args
 from mozaik.tools.misc import result_directory_name
+import json
+from mozaik.tools.json_export import save_json

 class ParameterSearchBackend(object):
     """
@@ -183,10 +185,7 @@ def run_parameter_search(self):

     counter=0
     combinations = self.generate_parameter_combinations()
-
-    f = open(mdn + '/parameter_combinations','wb')
-    pickle.dump(combinations,f)
-    f.close()
+    save_json(combinations, mdn + '/parameter_combinations.json')

     for combination in combinations:
         combination['results_dir']='\"\'' + os.getcwd() + '/' + mdn + '/\'\"'
@@ -250,9 +249,7 @@ def parameter_search_run_script_distributed_slurm(simulation_name,master_results
     core_number : int
         How many cores to reserve per process.
     """
-    f = open(master_results_dir+'/parameter_combinations','rb')
-    combinations = pickle.load(f)
-    f.close()
+    with open(master_results_dir + '/parameter_combinations.json') as f: combinations = json.load(f)

     # first check whether all parameter combinations contain the same parameter names
     assert len(set([tuple(set(comb.keys())) for comb in combinations])) == 1 , "The parameter search didn't occur over a fixed set of parameters"
@@ -295,9 +292,7 @@ def parameter_search_run_script_distributed_slurm_IoV(simulation_name,master_res
     core_number : int
         How many cores to reserve per process.
     """
-    f = open(master_results_dir+'/parameter_combinations','rb')
-    combinations = pickle.load(f)
-    f.close()
+    with open(master_results_dir + '/parameter_combinations.json') as f: combinations = json.load(f)

     # first check whether all parameter combinations contain the same parameter names
     assert len(set([tuple(set(comb.keys())) for comb in combinations])) == 1 , "The parameter search didn't occur over a fixed set of parameters"
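This file's changes swap pickle for JSON when persisting parameter combinations. The round trip is worth spelling out, since `json.load` takes an open file object rather than a path string. A self-contained sketch (directory layout and parameter names are hypothetical stand-ins for what `generate_parameter_combinations` produces):

```python
import json
import os
import tempfile

# Hypothetical parameter combinations; the real ones map mozaik parameter
# paths to candidate values.
combinations = [
    {"sheets.l4.density": 100, "trial": 1},
    {"sheets.l4.density": 200, "trial": 1},
]

mdn = tempfile.mkdtemp()
path = os.path.join(mdn, "parameter_combinations.json")

# Writing: unlike the old pickle file, the result is human-readable
# and diff-friendly.
with open(path, "w") as f:
    json.dump(combinations, f, indent=4)

# Reading back: json.load expects a file object, not a path string.
with open(path) as f:
    loaded = json.load(f)
```

The trade-off of the migration is that only JSON-native value types (numbers, strings, lists, dicts, booleans, None) survive the round trip unchanged.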
24 changes: 24 additions & 0 deletions mozaik/tools/distribution_parametrization.py
@@ -15,6 +15,7 @@
 from collections import OrderedDict
 import mozaik
 import sys
+import json

 def load_parameters(parameter_url,modified_parameters=ParameterSet({})):
     """
@@ -201,3 +202,26 @@ def instanciate_objects(self, classes):
         else:
             v.instanciate_objects(classes = classes)

+    def to_dict(self):
+        """
+        Cast to json serializable dictionary while dealing with PyNNDistributions
+        and MozaikExtendedParameterSets. Use a stack to avoid explicit recursion.
+        """
+        import copy
+        d = copy.deepcopy(dict(self))
+        stack = [(d, k, v) for k, v in d.items()]
+
+        while stack:
+            parent, key, value = stack.pop()
+            if isinstance(value, dict):
+                stack.extend((value, k, v) for k, v in value.items())
+            elif isinstance(value, PyNNDistribution):
+                parent[key] = {
+                    'class_name': 'PyNNDistribution',
+                    'params': {
+                        'name': value.__dict__['name'],
+                    },
+                }
+                for pk, pv in value.__dict__['parameters'].items():
+                    parent[key]['params'][pk] = pv
+        return d
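The stack-based traversal added in `to_dict` can be exercised standalone. The sketch below substitutes a stand-in `FakeDistribution` class for `PyNNDistribution` (which is not defined in this excerpt) but applies the same iterative replacement logic to a nested dictionary:

```python
import copy
import json

class FakeDistribution:
    """Stand-in for PyNNDistribution, purely for illustration."""
    def __init__(self, name, **parameters):
        self.name = name
        self.parameters = parameters

def to_plain_dict(d):
    """Iteratively replace FakeDistribution leaves with plain dicts,
    mirroring the stack-based traversal in to_dict()."""
    d = copy.deepcopy(d)
    stack = [(d, k, v) for k, v in d.items()]
    while stack:
        parent, key, value = stack.pop()
        if isinstance(value, dict):
            # Descend into nested dicts without recursion.
            stack.extend((value, k, v) for k, v in value.items())
        elif isinstance(value, FakeDistribution):
            params = {'name': value.name}
            params.update(value.parameters)
            parent[key] = {'class_name': 'PyNNDistribution', 'params': params}
    return d

nested = {'sheet': {'tau_m': FakeDistribution('normal', mu=10.0, sigma=2.0)}}
flat = to_plain_dict(nested)
json.dumps(flat)  # serializable once the distribution leaves are flattened
```

Using an explicit stack rather than recursion keeps arbitrarily deep parameter trees safe from Python's recursion limit; the order in which leaves are rewritten does not matter because each replacement is independent.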