Streamline the examples
Tom Scavo authored Oct 30, 2016
1 parent 601c7e8 commit 57666f6
Showing 1 changed file with 31 additions and 26 deletions.
README.md
@@ -54,18 +54,19 @@ The following examples show how to use the script to create some cron jobs on in
 
 ### Example #1
 
-Consider the following URLs:
+The goal is to transform InCommon metadata into the following CSV file:
 
-```Shell
-xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
-resource_url=https://incommon.org/federation/metadata/all_IdP_DisplayNames.csv
-```
+* https://incommon.org/federation/metadata/all_IdP_DisplayNames.csv
 
-The latter resource is used to construct a [List of IdP Display Names](https://spaces.internet2.edu/x/2IDmBQ) in the spaces wiki.
+The above resource is used to construct a [List of IdP Display Names](https://spaces.internet2.edu/x/2IDmBQ) in the spaces wiki.
 
-Let's build an automated process that transforms the SAML metadata at ``xml_location`` into the CSV file at ``resource_url``. Schedule the following process to run every hour on incommon.org:
+Suppose there is an automated process that transforms the main InCommon metadata aggregate into the CSV file at the above URL. Specifically, let's suppose the following process runs every hour on incommon.org:
 
 ```Shell
+# determine the metadata location
+xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
+
+# create the resource
 xsl_file=$LIB_DIR/list_all_IdP_DisplayNames_csv.xsl
 resource_file=/tmp/all_IdP_DisplayNames.csv
 $BIN_DIR/http_xsltproc.sh -F -o $resource_file $xsl_file $xml_location
@@ -76,7 +77,7 @@ if [ $exit_code -gt 1 ]; then
     exit $exit_code
 fi
 
-# resource_dir is the target web directory for web resources
+# move the resource to the web directory
 resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
 mv $resource_file $resource_dir
 ```
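The hourly schedule described above is presumably driven by cron. A minimal sketch of such a crontab entry, assuming the steps are wrapped in a script; the script path and log path are assumptions, not part of this commit:

```shell
# Hypothetical crontab entry: run the resource-building script at the top of
# every hour, appending stdout and stderr to a log file.
0 * * * * /usr/local/bin/build_resources.sh >> /var/log/build_resources.log 2>&1
```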
@@ -85,19 +86,20 @@ Observe that the command ``http_xsltproc.sh -F`` forces a fresh SAML metadata fi
 
 ### Example #2
 
-This example is similar to the previous example except that two resources are created. Consider the following URLs:
+The goal is to transform InCommon metadata into the following pair of CSV files:
 
-```Shell
-xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
-resource1_url=https://incommon.org/federation/metadata/all_RandS_IdPs.csv
-resource2_url=https://incommon.org/federation/metadata/all_RandS_SPs.csv
-```
+* https://incommon.org/federation/metadata/all_RandS_IdPs.csv
+* https://incommon.org/federation/metadata/all_RandS_SPs.csv
 
-The latter pair of resources are used to construct the [List of Research and Scholarship Entities](https://spaces.internet2.edu/x/ZoUABg) in the spaces wiki.
+The above resources are used to construct the [List of Research and Scholarship Entities](https://spaces.internet2.edu/x/ZoUABg) in the spaces wiki.
 
-Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` into the CSV files at ``resource1_url`` and ``resource2_url``. Specifically, let's suppose the following process runs every hour on incommon.org:
+Suppose there is an automated process that transforms the main InCommon metadata aggregate into the CSV files at the above URLs. Specifically, let's suppose the following process runs every hour on incommon.org:
 
 ```Shell
+# determine the metadata location
+xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
+
+# create the first resource
 xsl_file=$LIB_DIR/list_all_RandS_IdPs_csv.xsl
 resource1_file=/tmp/all_RandS_IdPs.csv
 $BIN_DIR/http_xsltproc.sh -F -o $resource1_file $xsl_file $xml_location
@@ -108,6 +110,7 @@ if [ $exit_code -gt 1 ]; then
     exit $exit_code
 fi
 
+# create the second resource
 xsl_file=$LIB_DIR/list_all_RandS_SPs_csv.xsl
 resource2_file=/tmp/all_RandS_SPs.csv
 $BIN_DIR/http_xsltproc.sh -C -o "$resource2_file" "$xsl_file" "$xml_location"
@@ -118,7 +121,7 @@ if [ $exit_code -gt 1 ]; then
     exit $exit_code
 fi
 
-# resource_dir is the target web directory for web resources
+# move the resources to the web directory
 resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
 mv $resource1_file $resource2_file $resource_dir
 ```
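The create-then-check pattern that repeats in these examples could be factored into a small helper. A sketch, assuming `$BIN_DIR` is set as in the examples; the function name `build_resource` is hypothetical and not part of the original scripts:

```shell
#!/bin/sh
# build_resource MODE XSL_FILE OUT_FILE XML_LOCATION
# Runs http_xsltproc.sh and aborts the calling script on a hard failure.
# Exit codes 0 and 1 are tolerated, matching the checks in the examples above.
build_resource () {
    mode=$1; xsl_file=$2; out_file=$3; xml_location=$4
    "$BIN_DIR/http_xsltproc.sh" "$mode" -o "$out_file" "$xsl_file" "$xml_location"
    exit_code=$?
    if [ "$exit_code" -gt 1 ]; then
        exit "$exit_code"
    fi
    return "$exit_code"
}

# usage, mirroring Example #2:
# build_resource -F "$LIB_DIR/list_all_RandS_IdPs_csv.xsl" /tmp/all_RandS_IdPs.csv "$xml_location"
```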
@@ -127,19 +130,20 @@ Observe the commands ``http_xsltproc.sh -F`` and ``http_xsltproc.sh -C``. The fo
 
 ### Example #3
 
-This example is very similar to the previous example. Consider the following URLs:
+The goal is to transform InCommon metadata into the following pair of CSV files:
 
-```Shell
-xml_location=http://md.incommon.org/InCommon/InCommon-metadata-export.xml
-resource1_url=https://incommon.org/federation/metadata/all_exported_IdPs.csv
-resource2_url=https://incommon.org/federation/metadata/all_exported_SPs.csv
-```
+* https://incommon.org/federation/metadata/all_exported_IdPs.csv
+* https://incommon.org/federation/metadata/all_exported_SPs.csv
 
-The latter pair of resources are used to construct the [List of Exported Entities](https://spaces.internet2.edu/x/DYD4BQ) in the spaces wiki.
+The above resources are used to construct the [List of Exported Entities](https://spaces.internet2.edu/x/DYD4BQ) in the spaces wiki.
 
-Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` (i.e., the Export Aggregate) into the CSV files at ``resource1_url`` and ``resource2_url``. Specifically, let's suppose the following process runs every hour on incommon.org:
+Suppose there is an automated process that transforms the InCommon export aggregate into the CSV files at the above URLs. Specifically, let's suppose the following process runs every hour on incommon.org:
 
 ```Shell
+# determine the metadata location
+xml_location=http://md.incommon.org/InCommon/InCommon-metadata-export.xml
+
+# create the first resource
 xsl_file=$LIB_DIR/list_all_IdPs_csv.xsl
 resource1_file=/tmp/all_exported_IdPs.csv
 $BIN_DIR/http_xsltproc.sh -F -o $resource1_file $xsl_file $xml_location
@@ -150,6 +154,7 @@ if [ $exit_code -gt 1 ]; then
     exit $exit_code
 fi
 
+# create the second resource
 xsl_file=$LIB_DIR/list_all_SPs_csv.xsl
 resource2_file=/tmp/all_exported_SPs.csv
 $BIN_DIR/http_xsltproc.sh -C -o "$resource2_file" "$xsl_file" "$xml_location"
@@ -160,7 +165,7 @@ if [ $exit_code -gt 1 ]; then
     exit $exit_code
 fi
 
-# resource_dir is the target web directory for web resources
+# move the resources to the web directory
 resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
 mv $resource1_file $resource2_file $resource_dir
 ```
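Before the final `mv` publishes the files, a defensive version of these scripts might verify that each generated CSV is nonempty. A sketch; the helper name `check_nonempty` is hypothetical and not part of the original scripts:

```shell
#!/bin/sh
# check_nonempty FILE...
# Returns nonzero if any argument is missing or empty, so the caller can
# skip the mv into the web directory rather than publish a bad resource.
check_nonempty () {
    for f in "$@"; do
        if [ ! -s "$f" ]; then
            echo "error: $f is missing or empty" >&2
            return 1
        fi
    done
    return 0
}

# usage, mirroring Example #3:
# check_nonempty "$resource1_file" "$resource2_file" \
#     && mv "$resource1_file" "$resource2_file" "$resource_dir"
```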
