Putting It All Together

Overview

Teaching:

Exercises:

Questions:

  • Can you show me an example?

Objectives:

  • Create a simple workflow using a cloud VM and cloud object storage.

  • Understand the basics of Identity and Access Management (IAM).

  • Add collaborators to a Bucket with appropriate permissions.

A Research Computational and Data Workflow - Drew's Story

Drew needs to do some analysis on satellite data. They need the data itself (satellite images stored in the cloud), computational resources (a virtual machine), some software (we will supply this), and a place to store the results (cloud object storage). We will assemble all of these pieces in the cloud and work through a simple example.

Create a VM

For this exercise we will use the VM that we created in the Intro to Cloud Compute lesson.

Connect to the VM

Recall the VM's public IP address and connect to the instance using the Cloud Shell in the web console. If Cloud Shell is not already open, click the Activate Cloud Shell icon in the top blue bar (the button that looks like a box containing a greater-than sign followed by an underscore). In Cloud Shell, ssh to the VM you created earlier (the default username is "azureuser").

ssh <username>@<Public IP address>
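
For example, if your username is azureuser and your VM's public IP address were 20.120.8.231 (an illustrative address; substitute your own), the command would be as follows.

ssh azureuser@20.120.8.231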

Get Example Code

We will now use git to download the example Landsat analysis code into your home directory. For those of you who are unfamiliar with git, it is a version control tool that makes it easy to manage and share files collaboratively.

cd ~
git clone https://github.internet2.edu/CLASS/CLASS-Examples.git

Now change directory to CLASS-Examples/azure-landsat, where our analysis code is located.

cd ~/CLASS-Examples/azure-landsat/

List the contents of the directory with the following command.

ls -l

Verify that landsat-hls-azure.py and setup.sh are listed.

Run the setup.sh script to set up the environment by installing the packages needed for the data analysis workflow.

./setup.sh
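
If you are curious, setup.sh will contain something along these lines (an illustrative sketch only; the actual script in the repository is authoritative, and its package list may differ).

#!/bin/bash
# Illustrative sketch -- see the real setup.sh in the repository.
# Update the package index, install pip, then install the Python
# libraries a raster analysis workflow like this typically needs.
sudo apt-get update
sudo apt-get install -y python3-pip
pip3 install numpy matplotlib rasterio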

Get and Process Landsat Data

Let's take a quick look at the Python script landsat-hls-azure.py. This code finds the HLS (Harmonized Landsat Sentinel-2) tiles corresponding to a given latitude and longitude at a specific time. Using the tiles stored on Azure, it combines different spectral bands to create an image file of the satellite data in PNG format.
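
You can page through the script yourself with the following command (press q to quit).

less landsat-hls-azure.py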

Run the analysis script with the following command.

./landsat-hls-azure.py

This will produce the file test.png, an image of the combined tiles we requested.

ls -l
test.png
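
The commands that follow assume the shell variables $STGE_ACCT and $BLOB_CONT still hold the names of the storage account and blob container you created in the earlier storage lesson. If you have opened a new Cloud Shell session since then, set them again, substituting your own names.

STGE_ACCT=<your storage account name>
BLOB_CONT=<your container name>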

Upload the image we just created to the storage container.

az storage blob upload --account-name $STGE_ACCT --container-name $BLOB_CONT --name test.png --file test.png --auth-mode login

Verify that the blob is in the blob container.

az storage blob list --account-name $STGE_ACCT --container-name $BLOB_CONT --output table --auth-mode login

Now go back to the console. Open the hamburger menu and navigate to the Storage account blade. In the left-hand menu under Data storage, click Containers. On this page you should see the container we created earlier; click on it. On the container page, click Change access level in the menu towards the top of the page. Open the Public access level pull-down menu and select Blob (anonymous read access for blobs only). Note the message warning you that blobs in this container will be readable anonymously, then click OK. Now click on the blob (test.png) that we uploaded to view it.
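
If you prefer the command line, the same access-level change can be made with az storage container set-permission. Note that this particular operation does not support --auth-mode login, so the CLI will need permission to look up the account key.

az storage container set-permission --account-name $STGE_ACCT --name $BLOB_CONT --public-access blob

Once anonymous access is enabled, the image is reachable in a browser at a URL of the form https://<storage account name>.blob.core.windows.net/<container name>/test.png.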

Now delete the blob (the test.png file).

az storage blob delete --account-name $STGE_ACCT --container-name $BLOB_CONT --name test.png --auth-mode login

Delete the storage container.

az storage container delete --account-name $STGE_ACCT --name $BLOB_CONT --auth-mode login
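
To confirm the container is gone, list the containers that remain in the account.

az storage container list --account-name $STGE_ACCT --output table --auth-mode login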

Delete the storage account.

az storage account delete -n $STGE_ACCT
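
This command will prompt you to confirm the deletion; add --yes to skip the prompt. You can verify that the account is gone by listing the storage accounts in your subscription.

az storage account list --output table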