Commit b366483: Add two examples
Tom Scavo committed Oct 23, 2016

```Shell
xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
resource_url=https://incommon.org/federation/metadata/all_IdP_DisplayNames.csv
```

The latter resource is used to construct a List of IdP Display Names in the spaces wiki.

Let's build an automated process that transforms the SAML metadata at ``xml_location`` into the CSV file at ``resource_url``. Schedule the following process to run every hour on incommon.org:

```Shell
# the XSL script and the shell script are included in the md-transforms repository
xsl_file=$LIB_DIR/list_all_IdP_DisplayNames_csv.xsl
resource_file=/tmp/all_IdP_DisplayNames.csv
$BIN_DIR/http_xsltproc.sh -F -o "$resource_file" "$xsl_file" "$xml_location"
exit_code=$?
[ $exit_code -eq 1 ] && exit 0 # short-circuit if 304 response
if [ $exit_code -gt 1 ]; then
    echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
    exit $exit_code
fi

# resource_dir is the target web directory for web resources
resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
mv $resource_file $resource_dir
exit 0
```

Observe that the command ``http_xsltproc.sh -F`` forces a fresh SAML metadata file. If the server responds with ``304 Not Modified``, the process terminates without updating the resource file.
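The hourly schedule might be wired up with a cron entry along these lines (the script path and log file are illustrative, not part of this repository):

```Shell
# m h dom mon dow   command
0 * * * * /opt/md-transforms/bin/update_displaynames.sh >> /var/log/md-transforms.log 2>&1
```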

### Example #2

Consider the following URLs:

```Shell
xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
resource1_url=https://incommon.org/federation/metadata/all_RandS_IdPs.csv
resource2_url=https://incommon.org/federation/metadata/all_RandS_SPs.csv
```

The latter pair of resources is used to construct the List of Research and Scholarship Entities in the spaces wiki.

Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` into the CSV files at ``resource1_url`` and ``resource2_url``. Specifically, let's suppose the following process runs every hour on incommon.org:

```Shell
xsl_file=$LIB_DIR/list_all_RandS_IdPs_csv.xsl
resource1_file=/tmp/all_RandS_IdPs.csv
$BIN_DIR/http_xsltproc.sh -F -o "$resource1_file" "$xsl_file" "$xml_location"
exit_code=$?
[ $exit_code -eq 1 ] && exit 0 # short-circuit if 304 response
if [ $exit_code -gt 1 ]; then
    echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
    exit $exit_code
fi

xsl_file=$LIB_DIR/list_all_RandS_SPs_csv.xsl
resource2_file=/tmp/all_RandS_SPs.csv
$BIN_DIR/http_xsltproc.sh -C -o "$resource2_file" "$xsl_file" "$xml_location"
exit_code=$?
[ $exit_code -eq 1 ] && exit 0 # short-circuit if not cached
if [ $exit_code -gt 1 ]; then
    echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
    exit $exit_code
fi

# resource_dir is the target web directory for web resources
resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
mv $resource1_file $resource2_file $resource_dir
exit 0
```

Observe the commands ``http_xsltproc.sh -F`` and ``http_xsltproc.sh -C``. The former forces a fresh SAML metadata file, as in the previous example. The latter goes directly to the cache. If the file is not in the cache (which is highly unlikely at this point, since the first command just fetched it), the process terminates without updating any resource files.
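A minimal sketch of the cache-only idea behind ``-C`` (the cache layout and keying scheme here are assumptions for illustration; the real ``http_xsltproc.sh`` may do this differently):

```Shell
# look up a previously fetched copy of a URL in a local on-disk cache;
# print the cached path, or fail with status 1 on a cache miss
cache_lookup() {
    url=$1
    cache_dir=${CACHE_DIR:-/tmp/md-cache}
    # key the cache entry on a checksum of the location URL (hypothetical scheme)
    key=$(printf '%s' "$url" | cksum | cut -d ' ' -f 1)
    cache_file="$cache_dir/$key"
    [ -f "$cache_file" ] || return 1   # miss: mirrors the exit-code-1 convention above
    printf '%s\n' "$cache_file"
}
```

With a layout like this, ``-C`` never touches the network, which is why a miss is highly unlikely immediately after a ``-F`` call has populated the cache.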

### Example #3

This example is very similar to the previous one. Consider the following URLs:

```Shell
xml_location=http://md.incommon.org/InCommon/InCommon-metadata-export.xml
resource1_url=https://incommon.org/federation/metadata/all_exported_IdPs.csv
resource2_url=https://incommon.org/federation/metadata/all_exported_SPs.csv
```

The latter pair of resources is used to construct the List of Exported Entities in the spaces wiki.

Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` (i.e., the Export Aggregate) into the CSV files at ``resource1_url`` and ``resource2_url``. Specifically, let's suppose the following process runs every hour on incommon.org:

```Shell
xsl_file=$LIB_DIR/list_all_IdPs_csv.xsl
resource1_file=/tmp/all_exported_IdPs.csv
$BIN_DIR/http_xsltproc.sh -F -o "$resource1_file" "$xsl_file" "$xml_location"
exit_code=$?
[ $exit_code -eq 1 ] && exit 0 # short-circuit if 304 response
if [ $exit_code -gt 1 ]; then
    echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
    exit $exit_code
fi

xsl_file=$LIB_DIR/list_all_SPs_csv.xsl
resource2_file=/tmp/all_exported_SPs.csv
$BIN_DIR/http_xsltproc.sh -C -o "$resource2_file" "$xsl_file" "$xml_location"
exit_code=$?
[ $exit_code -eq 1 ] && exit 0 # short-circuit if not cached
if [ $exit_code -gt 1 ]; then
    echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
    exit $exit_code
fi

# resource_dir is the target web directory for web resources
resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
mv $resource1_file $resource2_file $resource_dir
exit 0
```

The commands ``http_xsltproc.sh -F`` and ``http_xsltproc.sh -C`` behave exactly as described in the previous example.

## Compatibility

The executable scripts are compatible with GNU/Linux and Mac OS. The library files are written in XSLT 1.0.
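As a quick sanity check that an XSLT 1.0 processor is available, ``xsltproc`` (part of libxslt) can be run against a throwaway stylesheet; this sketch fabricates its own input and exits quietly if ``xsltproc`` is not installed:

```Shell
# verify the XSLT 1.0 toolchain with a minimal stylesheet and document
command -v xsltproc >/dev/null 2>&1 || { echo "xsltproc not found" >&2; exit 0; }
cat > /tmp/hello.xsl <<'EOF'
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/">hello from XSLT 1.0</xsl:template>
</xsl:stylesheet>
EOF
printf '<root/>\n' > /tmp/hello.xml
xsltproc /tmp/hello.xsl /tmp/hello.xml
```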