This combined repository includes tutorials and code resources provided by the NASA National Snow and Ice Data Center Distributed Active Archive Center (NSIDC DAAC). The tutorials are Python-based Jupyter notebooks that provide guidance on working with various data products, including how to access, subset, transform, and visualize data. Each tutorial can be accessed by navigating to the /notebooks folder of this repository. Please see the README files associated with each individual tutorial folder for more information on that tutorial and its learning objectives. Please note that all branches outside of Main should be considered in development and are not supported.
These notebooks demonstrate how to search for and access ICESat-2 data from the NASA Earthdata Cloud:
Accessing and working with ICESat-2 Data in the Cloud
Originally presented to the UWG (User Working Group) in May 2022, this tutorial demonstrates how to search for ICESat-2 data hosted in the Earthdata Cloud and how to directly access it from an Amazon Web Services (AWS) Elastic Compute Cloud (EC2) instance using the earthaccess package.
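The core pattern looks roughly like the sketch below; the ATL06 short name, date range, and bounding box are illustrative placeholders rather than the parameters used in the tutorial itself.

```python
# Minimal sketch of the cloud-access pattern, assuming an Earthdata Login
# account. The ATL06 short name, date range, and bounding box are
# illustrative placeholders, not the tutorial's actual parameters.
import earthaccess
import xarray as xr

earthaccess.login()  # reads ~/.netrc or prompts for Earthdata credentials

# Search CMR for cloud-hosted ICESat-2 land-ice height granules.
results = earthaccess.search_data(
    short_name="ATL06",
    cloud_hosted=True,
    temporal=("2022-05-01", "2022-05-31"),
    bounding_box=(-50.0, 68.0, -48.0, 70.0),  # lon_min, lat_min, lon_max, lat_max
)

# From an EC2 instance in us-west-2, open the granules directly from S3
# (no download) and read one beam group with xarray.
files = earthaccess.open(results)
ds = xr.open_dataset(files[0], group="gt1l/land_ice_segments", engine="h5netcdf")
print(ds)
```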
Plotting ICESat-2 and CryoSat-2 Freeboards
This notebook demonstrates plotting ICESat-2 and CryoSat-2 data on the same map from within an AWS EC2 instance. ICESat-2 data are accessed via "direct S3 access" using earthaccess. CryoSat-2 data are downloaded to our cloud instance from their FTP storage location and accessed locally.
Processing Large-scale Time Series of ICESat-2 Sea Ice Height in the Cloud
This notebook uses several libraries, including earthaccess, h5coro, and geopandas, to performantly search for, access, read, and grid ATL10 data over the Ross Sea, Antarctica. The notebook provides further guidance on how to scale this analysis to the entire continent, running the same workflow from a script that can be run from your laptop using Coiled.
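A simplified sketch of just the gridding step is shown below; it assumes the ATL10 freeboard segments have already been read into a pandas DataFrame (the notebook itself uses h5coro for the reads) with hypothetical latitude, longitude, and freeboard columns.

```python
# Simplified sketch of the gridding step only: assume the ATL10 freeboard
# segments have already been read (the notebook uses h5coro for this) into
# a pandas DataFrame with hypothetical latitude/longitude/freeboard columns.
import pandas as pd
import geopandas as gpd

def grid_freeboard(df: pd.DataFrame, cell_km: float = 25.0) -> pd.DataFrame:
    """Average freeboard onto a regular grid in Antarctic Polar Stereographic."""
    # Build point geometries in lat/lon and project to EPSG:3031 so that
    # grid cells are defined in metres.
    gdf = gpd.GeoDataFrame(
        df,
        geometry=gpd.points_from_xy(df.longitude, df.latitude),
        crs="EPSG:4326",
    ).to_crs("EPSG:3031")

    # Snap each point to its grid cell and average freeboard per cell.
    cell_m = cell_km * 1000.0
    gdf["x_bin"] = (gdf.geometry.x // cell_m) * cell_m
    gdf["y_bin"] = (gdf.geometry.y // cell_m) * cell_m
    return gdf.groupby(["x_bin", "y_bin"], as_index=False)["freeboard"].mean()
```

Scaling this up is largely a matter of running the same function over many granules in parallel, which is where Coiled comes in.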
Download, crop, resample, and plot multiple GeoTIFFs
This tutorial guides you through programmatically accessing and downloading GeoTIFF files from the NSIDC DAAC to your local computer. We then crop and resample one GeoTIFF based on the extent and pixel size of another GeoTIFF, and plot one on top of the other; a brief code sketch follows the data set list below.
We will use two data sets from the NASA MEaSUREs (Making Earth System Data Records for Use in Research Environments) program as an example:
- MEaSUREs Greenland Ice Mapping Project (GrIMP) Digital Elevation Model from GeoEye and WorldView Imagery, Version 2 (NSIDC-0715)
- MEaSUREs Greenland Ice Velocity: Selected Glacier Site Velocity Maps from InSAR, Version 4 (NSIDC-0481)
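As a rough illustration of the crop-and-resample step, the sketch below uses rioxarray with placeholder file names standing in for locally downloaded GeoTIFFs; the tutorial's own code may differ.

```python
# Rough sketch of the crop/resample/plot workflow with rioxarray. The file
# names are placeholders for locally downloaded NSIDC-0715 (DEM) and
# NSIDC-0481 (velocity) GeoTIFFs.
import matplotlib.pyplot as plt
import rioxarray

dem = rioxarray.open_rasterio("grimp_dem.tif", masked=True).squeeze()
vel = rioxarray.open_rasterio("glacier_velocity.tif", masked=True).squeeze()

# Crop and resample the DEM onto the velocity map's grid (extent, pixel
# size, and projection) in one step.
dem_matched = dem.rio.reproject_match(vel)

# Plot the velocity map on top of the resampled DEM.
fig, ax = plt.subplots(figsize=(8, 8))
dem_matched.plot(ax=ax, cmap="Greys_r", add_colorbar=False)
vel.plot(ax=ax, cmap="viridis", alpha=0.6)
plt.show()
```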
Snow Depth and Snow Cover Data Exploration
Originally demonstrated through the NASA Earthdata Webinar "Let It Snow! Accessing and Analyzing Snow Data at the NSIDC DAAC" on May 6, 2020, this tutorial provides guidance on how to discover, access, and couple snow data across varying geospatial scales from NASA's SnowEx, Airborne Snow Observatory, and Moderate Resolution Imaging Spectroradiometer (MODIS) missions. The tutorial highlights the ability to search and access data by a defined region, and combine and compare snow data across different data formats and scales using a Python-based Jupyter Notebook.
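The webinar predates the earthaccess package, but the region-based search idea it highlights can be sketched with earthaccess today; MOD10A1 is the MODIS/Terra daily snow cover product, and the bounding box and dates below are placeholders rather than the webinar's actual study area.

```python
# Present-day sketch of region-based search and download with earthaccess
# (the webinar itself predates this package). MOD10A1 is the MODIS/Terra
# daily snow cover product; the bounding box and dates are placeholders.
import earthaccess

earthaccess.login()

granules = earthaccess.search_data(
    short_name="MOD10A1",
    temporal=("2020-03-01", "2020-03-07"),
    bounding_box=(-108.3, 38.9, -107.8, 39.2),  # lon_min, lat_min, lon_max, lat_max
)

# Download the matching granules for local analysis.
files = earthaccess.download(granules, local_path="./data")
print(files)
```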
Getting the most out of NSIDC DAAC data: Discovering, Accessing, and Harmonizing Arctic Remote Sensing Data
Originally presented during the 2019 AGU Fall Meeting, this tutorial demonstrates the NSIDC DAAC's data discovery, access, and subsetting services, along with basic open source resources used to harmonize and analyze data across multiple products. The tutorial is provided as a series of Python-based Jupyter Notebooks, focusing on sea ice height and ice surface temperature data from NASA’s ICESat-2 and MODIS missions, respectively, to characterize Arctic sea ice.
Global land ice velocities (ITS_LIVE)
The Inter-mission Time Series of Land Ice Velocity and Elevation (ITS_LIVE) project facilitates ice sheet, ice shelf, and glacier research by providing a globally comprehensive and temporally dense multi-sensor record of land ice velocity and elevation with low latency. Scene-pair velocities were generated from satellite optical and radar imagery.
The notebooks in this project demonstrate how to search for and access ITS_LIVE velocity pairs and provide a simple example of how to build a data cube.
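A rough sketch of reading an ITS_LIVE datacube with xarray and zarr is shown below; the S3 URL is a placeholder (real cube URLs come from the ITS_LIVE catalog), and the velocity variable name "v" is an assumption.

```python
# Rough sketch of reading an ITS_LIVE datacube with xarray + zarr. The S3
# URL below is a placeholder (real cube URLs come from the ITS_LIVE
# catalog), and the velocity variable name "v" is an assumption.
import s3fs
import xarray as xr

fs = s3fs.S3FileSystem(anon=True)  # ITS_LIVE cubes are publicly readable
cube_url = "s3://its-live-data/datacubes/example_cube.zarr"  # placeholder

ds = xr.open_zarr(fs.get_mapper(cube_url), consolidated=True)

# Pull a velocity time series at one grid cell (projected x/y indices)
# and plot it.
series = ds["v"].isel(x=100, y=100)
series.plot(marker=".", linestyle="none")
```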
Caution
The IceFlow notebooks and supporting code have some known problems and users
should exercise caution. It is likely that users will run into errors while
interacting with the notebooks. Requests for ITRF transformations are not
currently working as expected. We recommend users look at the corrections
notebook for information about how to apply ITRF transformations to data
themselves. IceFlow is currently under maintenance, and we hope to resolve
some of these issues soon.
Harmonized data for pre-IceBridge, ICESat, and IceBridge data sets. These Jupyter notebooks are interactive documents designed to teach students and researchers interested in cryospheric sciences how to access and work with airborne altimetry and related data sets from NASA’s IceBridge mission, and satellite altimetry data from the ICESat and ICESat-2 missions, using the NSIDC IceFlow API.
The Binder button above allows you to explore and run the notebook in a shared cloud computing environment without the need to install dependencies on your local machine. Note that this option will not directly download data to your computer; instead the data will be downloaded to the cloud environment.
- Install Docker. Use the left-hand navigation to select the appropriate install depending on operating system.
- Download the NSIDC-Data-Tutorials repository from Github.
- Unzip the file, and open a terminal window in the NSIDC-Data-Tutorials folder's location.
- From the terminal window, launch the docker container using the following command, replacing [path/notebook_folder] with your path and notebook folder name:
docker run --name tutorials -p 8888:8888 -v [path/notebook_folder]:/home/jovyan/work nsidc/tutorials
Example:
docker run --name tutorials -p 8888:8888 -v /Users/name/Desktop/NSIDC-Data-Tutorials:/home/jovyan/work nsidc/tutorials
Or, with docker-compose:
docker-compose up
If you want to mount a directory with write permissions, you need to grant the container the same permissions as those on the directory to be mounted and tell it that it has "root" access (within the container). This is important if you want to persist your work or download data to a local directory rather than just inside the Docker container. Run the example command below for this option:
docker run --name tutorials -e NB_UID=$(id -u) --user root -p 8888:8888 -v /Users/name/Desktop/NSIDC-Data-Tutorials:/home/jovyan/work nsidc/tutorials
The initialization will take some time and will require 2.6 GB of space. Once the startup is complete you will see a line of output similar to this:
To access the notebook, open this file in a browser:
file:///home/jovyan/.local/share/jupyter/runtime/nbserver-6-open.html
Or copy and paste one of these URLs:
http://4dc97ddd7a0d:8888/?token=f002a50e25b6f623aa775312737ba8a23ffccfd4458faa6f
or http://127.0.0.1:8888/?token=f002a50e25b6f623aa775312737ba8a23ffccfd4458faa6f
If you started your container with the `-d`/`--detach` option, check `docker logs tutorials` for this output.
- Open up a web browser and copy one of the URLs as instructed above.
- You will be brought to a Jupyter Notebook interface running through the Docker container. The left side of the interface displays your local directory structure. Navigate to the work folder of the NSIDC-Data-Tutorials repository folder. You can now interact with the notebooks to explore and access data.
- Install Docker.
- Download the NSIDC-Data-Tutorials repository from Github.
- Unzip the file, and open a terminal window (use Command Prompt or PowerShell, not PowerShell ISE) in the NSIDC-Data-Tutorials folder's location.
- From the terminal window, launch the docker container using the following command, replacing [path\notebook_folder] with your path and notebook folder name:
docker run --name tutorials -p 8888:8888 -v [path\notebook_folder]:/home/jovyan/work nsidc/tutorials
Example:
docker run --name tutorials -p 8888:8888 -v C:\notebook_folder:/home/jovyan/work nsidc/tutorials
Or, with docker-compose:
docker-compose up
If you want to mount a directory with write permissions, you need to grant the container the same permissions as those on the directory to be mounted and tell it that it has "root" access (within the container):
docker run --name tutorials --user root -p 8888:8888 -v C:\notebook_folder:/home/jovyan/work nsidc/tutorials
The initialization will take some time and will require 2.6 GB of space. Once the startup is complete you will see a line of output similar to this:
To access the notebook, open this file in a browser:
file:///home/jovyan/.local/share/jupyter/runtime/nbserver-6-open.html
Or copy and paste one of these URLs:
http://(6a8bfa6a8518 or 127.0.0.1):8888/?token=2d72e03269b59636d9e31937fcb324f5bdfd0c645a6eba3f
If you started your container with the `-d`/`--detach` option, check `docker logs tutorials` for this output.
- Follow the instructions and copy one of the URLs into a web browser and hit return. The address should look something like this:
http://127.0.0.1:8888/?token=2d72e03269b59636d9e31937fcb324f5bdfd0c645a6eba3f
- You will now see the NSIDC-Data-Tutorials repository within the Jupyter Notebook interface. Navigate to /work to open the notebooks.
- You can now interact with the notebooks to explore and access data.
Note: If you already have conda or mamba installed, you can skip the first step.
- Install mambaforge (Python 3.9+) for your platform from the mamba documentation.
- Download the NSIDC-Data-Tutorials repository from Github by clicking the green 'Code' button located at the top right of the repository page and clicking 'Download Zip'. Unzip the file, and open a command line or terminal window in the NSIDC-Data-Tutorials folder's location.
- From a command line or terminal window, install the required environment with the following commands:
Linux
mamba create -n nsidc-tutorials --file binder/conda-linux-64.lock
OSX
mamba create -n nsidc-tutorials --file binder/conda-osx-64.lock
Windows
mamba create -n nsidc-tutorials --file binder/conda-win-64.lock
You should now see that the dependencies were installed and the environment is ready to be used.
Activate the environment with
conda activate nsidc-tutorials
Launch the notebook locally with the following command:
jupyter lab
This should open a browser window with the JupyterLab IDE, showing your current working directory in the left-hand navigation. Navigate to the tutorial folder of your choice and click on its associated *.ipynb files to get started.
Although the nsidc-tutorials environment should run all the notebooks in this repository, we also include tutorial-specific environments that contain only the dependencies for that tutorial. If you don't want to "pollute" your conda environments and you are only going to work with one of the tutorials, we recommend using these instead of the nsidc-tutorials environment. The steps to install them are exactly the same, but the environment files are inside the environment folder in each of the tutorials, e.g. for ITS_LIVE:
cd notebooks/itslive
mamba create -n nsidc-itslive --file environment/conda-linux-64.lock
conda activate nsidc-itslive
jupyter lab
This creates a pinned environment that should be fully reproducible across platforms.
NOTE: Sometimes Conda environments change (break) even with pinned dependencies. If you run into an issue with dependencies for the tutorials, please open an issue and we'll try to fix it as soon as possible.
This software is developed by the National Snow and Ice Data Center with funding from multiple sources.