Here is a quick script I have used for this before.
# This script generates predictions for the gbq_qr_hhs_only model for all reference dates.
# Retrospective model fits are generated using the data that would have been available in
# real time.
#
# To maintain transparency about which model outputs were and were not generated in
# real time, these model outputs are stored in flusion/retrospective-hub.
# This script should be run with code/gbq as the working directory:
# python retrospective-experiments/gbq_qr_hhs_only.py
import datetime
import os
from multiprocessing import Pool  # available for an optional parallel run; see the sketch below


def run_command(command):
    """Run a shell command via the system shell."""
    os.system(command)


# Weekly reference dates, going back 77 weeks from 2024-10-07.
missing_dates = [
    (datetime.date(2024, 10, 7) + datetime.timedelta(-i * 7)).isoformat()
    for i in range(77)
]

# Retrospective model outputs are stored under flusion/retrospective-hub.
output_root = '../../retrospective-hub/model-output'

# Build one command per as-of date and run them sequentially.
commands = [
    f'uv run get_covid_clade_counts.py --as-of={date}'
    for date in missing_dates
]

for command in commands:
    run_command(command)
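Since Pool is imported but not used by the loop above, here is a minimal sketch of a parallel variant. The worker count of 4 is an arbitrary assumption, and it presumes each --as-of run is independent of the others:

import datetime
import os
from multiprocessing import Pool


def run_command(command):
    """Run a shell command via the system shell."""
    os.system(command)


if __name__ == '__main__':
    missing_dates = [
        (datetime.date(2024, 10, 7) + datetime.timedelta(-i * 7)).isoformat()
        for i in range(77)
    ]
    commands = [
        f'uv run get_covid_clade_counts.py --as-of={date}'
        for date in missing_dates
    ]
    # 4 workers is an arbitrary choice; each worker runs one command at a time.
    with Pool(processes=4) as pool:
        pool.map(run_command, commands)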
I (Evan) may need to do the actual s3 upload for these files.
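For reference, a minimal sketch of what that upload could look like with boto3. The bucket name and key prefix are hypothetical placeholders, not values from this thread, and it assumes AWS credentials are already configured:

import os

import boto3

# Hypothetical bucket and prefix: the real destination is not specified in this thread.
BUCKET = 'example-flusion-bucket'
PREFIX = 'model-output'
LOCAL_ROOT = '../../retrospective-hub/model-output'

s3 = boto3.client('s3')
for dirpath, _, filenames in os.walk(LOCAL_ROOT):
    for name in filenames:
        local_path = os.path.join(dirpath, name)
        # Mirror the local directory layout under the chosen S3 prefix.
        rel_path = os.path.relpath(local_path, LOCAL_ROOT).replace(os.sep, '/')
        s3.upload_file(local_path, BUCKET, f'{PREFIX}/{rel_path}')

The same result could also be achieved with the AWS CLI by running aws s3 sync against that directory.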
elray1 changed the title from "run script to create intermediate data files and upload to s3" to "run script to create eval data files for past weeks and upload to s3" on Nov 13, 2024.