
Merge pull request #18 from anissa111/animation
Change image data to be over a larger time span
anissa111 authored Nov 18, 2023
2 parents 7bdc3b1 + 9d83a09 commit 77d245b
Showing 55 changed files with 98,717 additions and 108,543 deletions.
207,254 changes: 98,713 additions & 108,541 deletions notebooks/animation.ipynb

Large diffs are not rendered by default.

25 binary files not shown.

4 diffs not rendered.
6 changes: 4 additions & 2 deletions notebooks/scripts/goes-getter.py
@@ -6,12 +6,14 @@
 r = requests.get(base_url)
 soup = BeautifulSoup(r.text, 'html.parser')
 
-# get links to 20 images
+# get all image urls
 urls = []
 for link in soup.find_all('a'):
-    if link.get('href').endswith('416x250.jpg') and len(urls) < 30:
+    if link.get('href').endswith('416x250.jpg'):
         urls.append(link.get('href'))
 
+# get a day's worth of hourly images
+urls = urls[::12][:24]
 # download images
 for url in urls:
     im = requests.get(base_url + url).content
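The new `urls[::12][:24]` line appears to assume the scraped list is in chronological order with roughly one image every 5 minutes, so every 12th entry is one frame per hour and the first 24 of those span a day. A minimal sketch of that subsampling, using made-up filenames rather than the actual GOES directory listing:

```python
# Illustrative only: suppose the server lists one 416x250.jpg every 5 minutes,
# i.e. 12 images per hour, 288 per day, in chronological order.
urls = [f"image_{i:03d}_416x250.jpg" for i in range(288)]

# Every 12th image is one per hour; keep the first 24 to cover one day.
hourly = urls[::12][:24]

print(len(hourly))   # 24 frames
print(hourly[1])     # the 12th listed image, one hour in
```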

0 comments on commit 77d245b
