made edits to snowex file and fixed broken notebook
mikala-nsidc committed Aug 15, 2024
1 parent 33ea2b2 commit 84475f7
Showing 1 changed file with 7 additions and 14 deletions.
21 changes: 7 additions & 14 deletions book/tutorials/Data_access/earthaccess_snowex.ipynb
@@ -10,7 +10,7 @@
  "\n",
  "This notebook demonstrates how to search for and download NSIDC DAAC server hosted SnowEx 2023 data using the `earthaccess` package. SnowEx mission data have not yet migrated to the cloud and continue to be hosted at the NSIDC DAAC server. \n",
  "\n",
- "As an example data collection, we use SnowEx23 Mar23 IOP Snow Depth Measurements, Version 1 (Snex23_MAR23_SD) over the Fairbanks, AK, field site. The data are stored in csv format with a metadata-rich header. \n",
+ "As an example data collection, we use SnowEx23 Mar23 IOP Snow Depth Measurements, Version 1 (Snex23_MAR23_SD) over the Alaska field sites. The data are stored in csv format with a metadata-rich header. \n",
  "\n",
  "We use `earthaccess`, an open source package developed by Luis Lopez (NSIDC developer) and a community of contributors, to allow easy search of the NASA Common Metadata Repository (CMR) and download of NASA data collections. It can be used for programmatic search and access for both _DAAC-hosted_ and _cloud-hosted_ data. It manages authenticating using Earthdata Login credentials. `earthaccess` can be used to find and access both DAAC-hosted and cloud-hosted data in just **three** lines of code. See [https://github.com/nsidc/earthaccess](https://github.com/nsidc/earthaccess).\n",
  "\n",
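The notebook text above notes that the snow depth data are stored in CSV format with a metadata-rich header. As a minimal sketch of working with such a file, assuming a `#`-prefixed header convention (the actual Snex23_MAR23_SD header layout may differ):

```python
import csv
import io

def split_metadata_csv(text, comment_prefix="#"):
    """Split a CSV file with a commented metadata header into
    (metadata_lines, rows). The '#' prefix is an assumption here;
    the real SnowEx files may mark their header differently."""
    meta, data_lines = [], []
    for line in text.splitlines():
        if line.startswith(comment_prefix):
            meta.append(line.lstrip(comment_prefix).strip())
        elif line.strip():
            data_lines.append(line)
    rows = list(csv.reader(io.StringIO("\n".join(data_lines))))
    return meta, rows

# Illustrative file content, not real SnowEx data.
sample = "# Site: Alaska\n# Version: 1\ndepth_cm,lat,lon\n42,64.86,-147.85\n"
meta, rows = split_metadata_csv(sample)
print(meta)     # ['Site: Alaska', 'Version: 1']
print(rows[0])  # ['depth_cm', 'lat', 'lon']
```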
@@ -106,8 +106,6 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
- "In this case there are 91 collections that have the keyword SnowEx hosted at the NSIDC-DAAC.\n",
- "\n",
  "The `search_datasets` method returns a python list of `DataCollection` objects. We can view the metadata for each collection in long form by passing a `DataCollection` object to print or as a summary using the `summary` method. We can also use the `pprint` function to Pretty Print each object.\n",
  "\n",
  "We will do this for the first 10 results (objects)."
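The "first 10 results" pattern the cell describes can be sketched as follows; plain dicts stand in here for the `DataCollection` objects that `earthaccess.search_datasets` actually returns:

```python
from pprint import pprint

# Placeholder results; earthaccess.search_datasets(keyword="SnowEx")
# would return DataCollection objects rather than these dicts.
results = [{"short-name": f"Collection{i}", "version": "1"} for i in range(25)]

# Pretty Print only the first 10 results, as the notebook does.
for collection in results[:10]:
    pprint(collection)
```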
@@ -165,7 +163,7 @@
  "source": [
  "### Narrow Search using a Spatial Filter\n",
  "\n",
- "Here we are going to use a bounding box for the Fairbanks study area to find SnowEx collections."
+ "Here we are going to use a bounding box for the Alaska study areas to find SnowEx collections."
  ]
  },
  {
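A CMR-style bounding box is four numbers: lower-left longitude, lower-left latitude, upper-right longitude, upper-right latitude. A minimal sketch of the spatial filter idea, assuming illustrative Alaska coordinates (not necessarily the ones used in the notebook):

```python
# (lower_left_lon, lower_left_lat, upper_right_lon, upper_right_lat);
# these Alaska coordinates are illustrative only.
bounding_box = (-160.0, 58.0, -140.0, 72.0)

def in_bbox(lon, lat, bbox):
    """Check whether a point falls inside a CMR-style bounding box.
    earthaccess passes such a tuple via its bounding_box= keyword."""
    llx, lly, urx, ury = bbox
    return llx <= lon <= urx and lly <= lat <= ury

# Fairbanks, AK falls inside this box; Denver, CO does not.
print(in_bbox(-147.72, 64.84, bounding_box))  # True
print(in_bbox(-104.99, 39.74, bounding_box))  # False
```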
@@ -202,11 +200,13 @@
  ]
  },
  {
- "cell_type": "markdown",
+ "cell_type": "code",
+ "execution_count": null,
  "metadata": {},
+ "outputs": [],
  "source": [
  "## Step 3: Search for data granules (files)\n",
- "First we will search for the SnowEx 23 Mar23 IOP Snow Depth Measurement collection using its short name: Snex23_Mar22_SD. "
+ "First we will search for the SnowEx 23 Mar23 IOP Snow Depth Measurement collection using its short name: Snex23_Mar23_SD. "
  ]
  },
  {
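The commit corrects the collection short name from Snex23_Mar22_SD to Snex23_Mar23_SD, which matters because a short-name lookup only returns the collection it names exactly as registered. A toy sketch of that behavior, with placeholder strings standing in for the collections that `earthaccess.search_data(short_name=...)` would query through CMR (the real CMR matching rules may be more lenient about case):

```python
# Placeholder collection short names, for illustration only.
collections = ["Snex23_MAR23_SD", "SNEX23_SSA_SO", "SNEX21_TS_SP"]

def find_by_short_name(collections, short_name):
    """Return collections whose short name matches exactly;
    a mistyped name such as 'Snex23_Mar22_SD' matches nothing."""
    return [c for c in collections if c == short_name]

print(find_by_short_name(collections, "Snex23_MAR23_SD"))  # ['Snex23_MAR23_SD']
print(find_by_short_name(collections, "Snex23_Mar22_SD"))  # []
```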
@@ -237,14 +237,7 @@
  "## Step 4: Download Data Locally\n",
  "For this section, we will download the granule above from the SnowEx23 Mar23 IOP Snow Depth Measurements collection locally. <br>\n",
  "\n",
- "We'll download the file into a separate folder named \"Downloads\", which will be created for us, if it doesn't already exist."
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "Finally, we will go ahead and download the granule locally to this notebook into a separate folder named \"tmp\" (folder will be created for you if it doesn't already exist)."
+ "We'll download the file into a separate folder named \"tmp\", which will be created for us, if it doesn't already exist."
  ]
  },
  {
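The "created for us, if it doesn't already exist" behavior described above is the standard idempotent directory-creation pattern. A minimal sketch; the download call itself is shown only as a comment, since it requires Earthdata Login credentials and network access:

```python
import os

# Create the download folder if it doesn't already exist; calling this
# again on an existing folder is harmless thanks to exist_ok=True.
os.makedirs("tmp", exist_ok=True)

# With credentials configured, the download would then look like:
#   earthaccess.download(granule_results, local_path="tmp")
print(os.path.isdir("tmp"))  # True
```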
