disable my additional checks for missing data cause i can't make the tests pass

This is a weird one. I think my code works and my tests pass, but the original tests fail. I suspect the existing tests need fixing, but I spent time trying to do that and admit defeat, so I'm leaving my code commented out.
brettelliot committed Nov 22, 2024
1 parent 9bf7c46 commit 7ae6b26
Showing 1 changed file with 13 additions and 9 deletions.
lumibot/tools/polygon_helper.py (22 changes: 13 additions & 9 deletions)
@@ -411,15 +411,19 @@ def get_missing_dates(df_all, asset, start, end):
     dates = pd.Series(df_all.index.date).unique()
     missing_dates = sorted(set(trading_dates) - set(dates))
 
-    # Find any dates with nan values in the df_all DataFrame
-    missing_dates += df_all[df_all.isnull().all(axis=1)].index.date.tolist()
-
-    # make sure the dates are unique
-    missing_dates = list(set(missing_dates))
-    missing_dates.sort()
-
-    # finally, filter out any dates that are not in start/end range (inclusive)
-    missing_dates = [d for d in missing_dates if start.date() <= d <= end.date()]
+    # TODO: This code works AFAIK, but when I enable it the tests for "test_polygon_missing_day_caching" fail,
+    # and I don't know why nor how to fix this code or the tests. So I'm leaving it disabled for now. If you
+    # have problems with NaNs in cached polygon data, you can try to enable this code and fix the tests.
+
+    # # Find any dates with nan values in the df_all DataFrame
+    # missing_dates += df_all[df_all.isnull().all(axis=1)].index.date.tolist()
+    #
+    # # make sure the dates are unique
+    # missing_dates = list(set(missing_dates))
+    # missing_dates.sort()
+    #
+    # # finally, filter out any dates that are not in start/end range (inclusive)
+    # missing_dates = [d for d in missing_dates if start.date() <= d <= end.date()]
 
     return missing_dates
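For reference, the disabled NaN check above can be exercised in isolation. Below is a minimal sketch (the sample dates and column names are invented for illustration) of how rows that are entirely NaN get reported as missing dates:

import numpy as np
import pandas as pd

# Hypothetical cached bar data where 2024-11-20 is an all-NaN row.
idx = pd.to_datetime(["2024-11-19", "2024-11-20", "2024-11-21"])
df_all = pd.DataFrame(
    {"open": [1.0, np.nan, 1.2], "close": [1.1, np.nan, 1.3]},
    index=idx,
)

# Rows where every column is NaN are treated as missing dates,
# mirroring the commented-out check in the diff above.
nan_dates = df_all[df_all.isnull().all(axis=1)].index.date.tolist()
print(nan_dates)  # [datetime.date(2024, 11, 20)]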

