From ff283b54f46ce81e4d2aa061ceafa3f01c6de2c8 Mon Sep 17 00:00:00 2001
From: Tom Scavo
Date: Sat, 22 Oct 2016 20:01:36 -0400
Subject: [PATCH 1/9] Tweak example

---
 README.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 36062b5..b701793 100644
--- a/README.md
+++ b/README.md
@@ -70,7 +70,7 @@ xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
 resource_url=https://incommon.org/federation/metadata/all_IdP_DisplayNames.csv
 ```
 
-Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` into the CSV file at ``resource_url``. Specifically, let's suppose the following process runs every hour on www.incommon.org:
+Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` into the CSV file at ``resource_url``. Specifically, let's suppose the following process runs every hour on incommon.org:
 
 ```Shell
 # the XSL script and the shell script are included in the md-transforms repository
@@ -78,7 +78,8 @@ xsl_file=$LIB_DIR/list_all_IdP_DisplayNames_csv.xsl
 resource_file=/tmp/all_IdP_DisplayNames.csv
 $BIN_DIR/http_xsltproc.sh -F -o "$resource_file" "$xsl_file" "$xml_location"
 exit_code=$?
-if [ $exit_code -ne 0 ]; then
+[ $exit_code -eq 1 ] && exit 0 # short-circuit if 304 response
+if [ $exit_code -gt 1 ]; then
 echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
 exit $exit_code
 fi

From 234963a72a33dca085af603b813df2430fae9402 Mon Sep 17 00:00:00 2001
From: Tom Scavo
Date: Sun, 23 Oct 2016 16:22:59 -0400
Subject: [PATCH 2/9] Remove contents section

---
 README.md | 28 +++++++++++++---------------
 1 file changed, 13 insertions(+), 15 deletions(-)

diff --git a/README.md b/README.md
index b701793..a576c2b 100644
--- a/README.md
+++ b/README.md
@@ -2,20 +2,6 @@ XSLT transformations of SAML metadata
 
-## Contents
-
-Executables:
-
-* http_xsltproc.sh
-
-Library files:
-
-* list_all_IdP_DisplayNames_csv.xsl
-* list_all_IdPs_csv.xsl
-* list_all_RandS_IdPs_csv.xsl
-* list_all_RandS_SPs_csv.xsl
-* list_all_SPs_csv.xsl
-
 ## Installation
 
 Download the source, change directory to the source directory, and install the source into ``/tmp`` as follows:
 
 ```Shell
@@ -34,7 +20,19 @@ $ export LIB_DIR=$HOME/lib
 $ ./install.sh $BIN_DIR $LIB_DIR
 ```
 
-An installation directory will be created if it doesn't already exist.
+An installation directory will be created if one doesn't already exist.
In any case, the following files will be installed:
+
+```Shell
+$ ls -1 $BIN_DIR
+http_xsltproc.sh
+
+$ ls -1 $LIB_DIR
+list_all_IdP_DisplayNames_csv.xsl
+list_all_IdPs_csv.xsl
+list_all_RandS_IdPs_csv.xsl
+list_all_RandS_SPs_csv.xsl
+list_all_SPs_csv.xsl
+```
 
 ## Overview

From b3664838bfd4e13a14e632adc5d9492c2b8a2b44 Mon Sep 17 00:00:00 2001
From: Tom Scavo
Date: Sun, 23 Oct 2016 16:27:55 -0400
Subject: [PATCH 3/9] Add two examples

---
 README.md | 93 +++++++++++++++++++++++++++++++++++++++++++++++++++++--
 1 file changed, 90 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index a576c2b..9e3d9b2 100644
--- a/README.md
+++ b/README.md
@@ -68,10 +68,11 @@ xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
 resource_url=https://incommon.org/federation/metadata/all_IdP_DisplayNames.csv
 ```
 
-Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` into the CSV file at ``resource_url``. Specifically, let's suppose the following process runs every hour on incommon.org:
+The latter resource is used to construct a List of IdP Display Names in the spaces wiki.
+
+Let's build an automated process that transforms the SAML metadata at ``xml_location`` into the CSV file at ``resource_url``.
Schedule the following process to run every hour on incommon.org:
 
 ```Shell
-# the XSL script and the shell script are included in the md-transforms repository
 xsl_file=$LIB_DIR/list_all_IdP_DisplayNames_csv.xsl
 resource_file=/tmp/all_IdP_DisplayNames.csv
 $BIN_DIR/http_xsltproc.sh -F -o "$resource_file" "$xsl_file" "$xml_location"
@@ -82,7 +83,7 @@ if [ $exit_code -gt 1 ]; then
 exit $exit_code
 fi
 
-# the resource_dir is the target web directory for the resource_file
+# resource_dir is the target web directory for web resources
 resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
 mv $resource_file $resource_dir
 exit 0
@@ -90,6 +91,92 @@ exit 0
 
 Observe that the command ``http_xsltproc.sh -F`` forces a fresh SAML metadata file. If the server responds with ``304 Not Modified``, the process terminates without updating the resource file.
 
+### Example #2
+
+Consider the following URLs:
+
+```Shell
+xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
+resource1_url=https://incommon.org/federation/metadata/all_RandS_IdPs.csv
+resource2_url=https://incommon.org/federation/metadata/all_RandS_SPs.csv
+```
+
+The latter pair of resources are used to construct the List of Research and Scholarship Entities in the spaces wiki.
+
+Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` into the CSV files at ``resource1_url`` and ``resource2_url``. Specifically, let's suppose the following process runs every hour on incommon.org:
+
+```Shell
+xsl_file=$LIB_DIR/list_all_RandS_IdPs_csv.xsl
+resource1_file=/tmp/all_RandS_IdPs.csv
+$BIN_DIR/http_xsltproc.sh -F -o "$resource1_file" "$xsl_file" "$xml_location"
+exit_code=$?
+[ $exit_code -eq 1 ] && exit 0 # short-circuit if 304 response
+if [ $exit_code -gt 1 ]; then
+ echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
+ exit $exit_code
+fi
+
+xsl_file=$LIB_DIR/list_all_RandS_SPs_csv.xsl
+resource2_file=/tmp/all_RandS_SPs.csv
+$BIN_DIR/http_xsltproc.sh -C -o "$resource2_file" "$xsl_file" "$xml_location"
+exit_code=$?
+[ $exit_code -eq 1 ] && exit 0 # short-circuit if not cached
+if [ $exit_code -gt 1 ]; then
+ echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
+ exit $exit_code
+fi
+
+# resource_dir is the target web directory for web resources
+resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
+mv $resource1_file $resource2_file $resource_dir
+exit 0
+```
+
+Observe the commands ``http_xsltproc.sh -F`` and ``http_xsltproc.sh -C``. The former forces a fresh SAML metadata file as in the previous example. The latter goes directly to cache. If the file is not in the cache (which is highly unlikely), the process terminates without updating any resource files.
+
+### Example #3
+
+This example is very similar to previous example. Consider the following URLs:
+
+```Shell
+xml_location=http://md.incommon.org/InCommon/InCommon-metadata-export.xml
+resource1_url=https://incommon.org/federation/metadata/all_exported_IdPs.csv
+resource2_url=https://incommon.org/federation/metadata/all_exported_SPs.csv
+```
+
+The latter pair of resources are used to construct the List of Exported Entities in the spaces wiki.
+
+Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` (i.e., the Export Aggregate) into the CSV files at ``resource1_url`` and ``resource2_url``. Specifically, let's suppose the following process runs every hour on incommon.org:
+
+```Shell
+xsl_file=$LIB_DIR/list_all_IdPs_csv.xsl
+resource1_file=/tmp/all_exported_IdPs.csv
+$BIN_DIR/http_xsltproc.sh -F -o "$resource1_file" "$xsl_file" "$xml_location"
+exit_code=$?
+[ $exit_code -eq 1 ] && exit 0 # short-circuit if 304 response
+if [ $exit_code -gt 1 ]; then
+ echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
+ exit $exit_code
+fi
+
+xsl_file=$LIB_DIR/list_all_SPs_csv.xsl
+resource2_file=/tmp/all_exported_SPs.csv
+$BIN_DIR/http_xsltproc.sh -C -o "$resource2_file" "$xsl_file" "$xml_location"
+exit_code=$?
+[ $exit_code -eq 1 ] && exit 0 # short-circuit if not cached
+if [ $exit_code -gt 1 ]; then
+ echo "ERROR: http_xsltproc.sh failed with status code: $exit_code" >&2
+ exit $exit_code
+fi
+
+# resource_dir is the target web directory for web resources
+resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
+mv $resource1_file $resource2_file $resource_dir
+exit 0
+```
+
+The commands ``http_xsltproc.sh -F`` and ``http_xsltproc.sh -C`` behave exactly as described in the previous example.
+
 ## Compatibility
 
 The executable scripts are compatible with GNU/Linux and Mac OS. The library files are written in XSLT 1.0.

From 5aa1e78e683e70da3785fafed14d8d7b85827a1c Mon Sep 17 00:00:00 2001
From: Tom Scavo
Date: Sun, 23 Oct 2016 16:41:12 -0400
Subject: [PATCH 4/9] Add links

---
 README.md | 25 +++++--------------------
 1 file changed, 5 insertions(+), 20 deletions(-)

diff --git a/README.md b/README.md
index 9e3d9b2..c78b90e 100644
--- a/README.md
+++ b/README.md
@@ -36,28 +36,13 @@ list_all_SPs_csv.xsl
 
 ## Overview
 
-Bash script ``http_xsltproc.sh`` is a wrapper around the ``xsltproc`` command-line tool. Unlike ``xsltproc``, this script fetches the target XML document from an HTTP server. See the inline help file for details:
+Bash script ``http_xsltproc.sh`` is a wrapper around the ``xsltproc`` command-line tool. Unlike ``xsltproc``, script ``http_xsltproc.sh`` fetches the target XML document from an HTTP server.
See the inline help file for details:
 
 ```Shell
 $ $BIN_DIR/http_xsltproc.sh -h
 ```
 
-Here's an example of script usage:
-
-```Shell
-$ MD_LOCATION=http://md.incommon.org/InCommon/InCommon-metadata.xml
-$ $BIN_DIR/http_xsltproc.sh $LIB_DIR/list_all_IdP_DisplayNames_csv.xsl $MD_LOCATION | head
-IdP Display Name,IdP Entity ID,IdP Discovery,Registrar ID
-"Ohio State University",urn:mace:incommon:osu.edu,show,https://incommon.org
-"Cornell University",https://shibidp.cit.cornell.edu/idp/shibboleth,show,https://incommon.org
-"University of California - Office of the President",urn:mace:incommon:ucop.edu,show,https://incommon.org
-"University of California-Irvine",urn:mace:incommon:uci.edu,show,https://incommon.org
-"University of Washington",urn:mace:incommon:washington.edu,show,https://incommon.org
-"Internet2",urn:mace:incommon:internet2.edu,show,https://incommon.org
-"University of California-San Diego",urn:mace:incommon:ucsd.edu,show,https://incommon.org
-"Georgetown University",https://shibb-idp.georgetown.edu/idp/shibboleth,show,https://incommon.org
-"Case Western Reserve University",urn:mace:incommon:case.edu,show,https://incommon.org
-```
+The following examples illustrate usage of the script.
 
 ### Example #1
 
@@ -68,7 +53,7 @@ xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
 resource_url=https://incommon.org/federation/metadata/all_IdP_DisplayNames.csv
 ```
 
-The latter resource is used to construct a List of IdP Display Names in the spaces wiki.
+The latter resource is used to construct a [List of IdP Display Names](https://spaces.internet2.edu/x/2IDmBQ) in the spaces wiki.
 
 Let's build an automated process that transforms the SAML metadata at ``xml_location`` into the CSV file at ``resource_url``.
Schedule the following process to run every hour on incommon.org:
@@ -101,7 +86,7 @@ resource1_url=https://incommon.org/federation/metadata/all_RandS_IdPs.csv
 resource2_url=https://incommon.org/federation/metadata/all_RandS_SPs.csv
 ```
 
-The latter pair of resources are used to construct the List of Research and Scholarship Entities in the spaces wiki.
+The latter pair of resources are used to construct the [List of Research and Scholarship Entities](https://spaces.internet2.edu/x/ZoUABg) in the spaces wiki.
 
 Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` into the CSV files at ``resource1_url`` and ``resource2_url``. Specifically, let's suppose the following process runs every hour on incommon.org:
@@ -144,7 +129,7 @@ resource1_url=https://incommon.org/federation/metadata/all_exported_IdPs.csv
 resource2_url=https://incommon.org/federation/metadata/all_exported_SPs.csv
 ```
 
-The latter pair of resources are used to construct the List of Exported Entities in the spaces wiki.
+The latter pair of resources are used to construct the [List of Exported Entities](https://spaces.internet2.edu/x/DYD4BQ) in the spaces wiki.
 
 Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` (i.e., the Export Aggregate) into the CSV files at ``resource1_url`` and ``resource2_url``. Specifically, let's suppose the following process runs every hour on incommon.org:

From 73a3d27c9f3c172f2c14c701b0f47fee4f5b8c95 Mon Sep 17 00:00:00 2001
From: Tom Scavo
Date: Sun, 23 Oct 2016 17:08:50 -0400
Subject: [PATCH 5/9] Specify env vars for the examples

---
 README.md | 16 ++++++++++++----
 1 file changed, 12 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index c78b90e..533a55f 100644
--- a/README.md
+++ b/README.md
@@ -36,13 +36,21 @@ list_all_SPs_csv.xsl
 
 ## Overview
 
-Bash script ``http_xsltproc.sh`` is a wrapper around the ``xsltproc`` command-line tool.
Unlike ``xsltproc``, script ``http_xsltproc.sh`` fetches the target XML document from an HTTP server. See the inline help file for details:
+Bash script ``http_xsltproc.sh`` is a wrapper around the ``xsltproc`` command-line tool. Unlike ``xsltproc``, the ``http_xsltproc.sh`` script fetches the target XML document from an HTTP server using HTTP Conditional GET [RFC 7232]. If the server responds with 200, the script caches the resource and returns the response body. If the server responds with 304, the script returns the cached resource instead. See the inline help file for details:
 
 ```Shell
 $ $BIN_DIR/http_xsltproc.sh -h
 ```
 
-The following examples illustrate usage of the script.
+The ``http_xsltproc.sh`` script requires two environment variables. ``CACHE_DIR`` is the absolute path to the cache directory (which may or may not exist), whereas ``LIB_DIR`` specifies a directory containing various helper scripts.
+
+For example, let's use the library installed in the previous section and specify the cache as follows:
+
+```Shell
+$ export CACHE_DIR=/tmp/cache
+```
+
+The following examples show how to use the script to create some cron jobs on incommon.org.
 
 ### Example #1
 
@@ -78,7 +86,7 @@ Observe that the command ``http_xsltproc.sh -F`` forces a fresh SAML metadata fi
 ### Example #2
 
-Consider the following URLs:
+This example is similar to the previous example except that two resources are created. Consider the following URLs:
 
 ```Shell
 xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
 resource1_url=https://incommon.org/federation/metadata/all_RandS_IdPs.csv
 resource2_url=https://incommon.org/federation/metadata/all_RandS_SPs.csv
 ```
@@ -117,7 +125,7 @@ mv $resource1_file $resource2_file $resource_dir
 exit 0
 ```
 
-Observe the commands ``http_xsltproc.sh -F`` and ``http_xsltproc.sh -C``. The former forces a fresh SAML metadata file as in the previous example. The latter goes directly to cache. If the file is not in the cache (which is highly unlikely), the process terminates without updating any resource files.
+Observe the commands ``http_xsltproc.sh -F`` and ``http_xsltproc.sh -C``.
The former forces a fresh SAML metadata file as in the previous example; the latter goes directly to cache. If the file is not in the cache (which is highly unlikely), the process terminates without updating any resource files.
 
 ### Example #3

From 373fd8922db5c5e52004f9b9e2a417c01d7be1ac Mon Sep 17 00:00:00 2001
From: Tom Scavo
Date: Sun, 30 Oct 2016 06:55:26 -0400
Subject: [PATCH 6/9] adjust exit code

---
 README.md | 5 +----
 1 file changed, 1 insertion(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 533a55f..cdc4b0c 100644
--- a/README.md
+++ b/README.md
@@ -79,7 +79,6 @@ fi
 # resource_dir is the target web directory for web resources
 resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
 mv $resource_file $resource_dir
-exit 0
 ```
 
 Observe that the command ``http_xsltproc.sh -F`` forces a fresh SAML metadata file. If the server responds with ``304 Not Modified``, the process terminates without updating the resource file.
@@ -122,14 +121,13 @@ fi
 # resource_dir is the target web directory for web resources
 resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
 mv $resource1_file $resource2_file $resource_dir
-exit 0
 ```
 
 Observe the commands ``http_xsltproc.sh -F`` and ``http_xsltproc.sh -C``. The former forces a fresh SAML metadata file as in the previous example; the latter goes directly to cache. If the file is not in the cache (which is highly unlikely), the process terminates without updating any resource files.
 
 ### Example #3
 
-This example is very similar to previous example.
+This example is very similar to the previous example.
Consider the following URLs:
 
 ```Shell
 xml_location=http://md.incommon.org/InCommon/InCommon-metadata-export.xml
 resource1_url=https://incommon.org/federation/metadata/all_exported_IdPs.csv
 resource2_url=https://incommon.org/federation/metadata/all_exported_SPs.csv
 ```

From 601c7e8d5f8a26a7ae5f5d891039e05dd40b9a70 Mon Sep 17 00:00:00 2001
From: Tom Scavo
Date: Sun, 30 Oct 2016 09:56:14 -0400
Subject: [PATCH 7/9] Remove quotes from examples

---
 README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index cdc4b0c..7dc47c6 100644
--- a/README.md
+++ b/README.md
@@ -68,7 +68,7 @@ Let's build an automated process that transforms the SAML metadata at ``xml_loca
 ```Shell
 xsl_file=$LIB_DIR/list_all_IdP_DisplayNames_csv.xsl
 resource_file=/tmp/all_IdP_DisplayNames.csv
-$BIN_DIR/http_xsltproc.sh -F -o "$resource_file" "$xsl_file" "$xml_location"
+$BIN_DIR/http_xsltproc.sh -F -o $resource_file $xsl_file $xml_location
 exit_code=$?
 [ $exit_code -eq 1 ] && exit 0 # short-circuit if 304 response
 if [ $exit_code -gt 1 ]; then
@@ -100,7 +100,7 @@ Suppose there is an automated process that transforms the SAML metadata at ``xml
 ```Shell
 xsl_file=$LIB_DIR/list_all_RandS_IdPs_csv.xsl
 resource1_file=/tmp/all_RandS_IdPs.csv
-$BIN_DIR/http_xsltproc.sh -F -o "$resource1_file" "$xsl_file" "$xml_location"
+$BIN_DIR/http_xsltproc.sh -F -o $resource1_file $xsl_file $xml_location
 exit_code=$?
 [ $exit_code -eq 1 ] && exit 0 # short-circuit if 304 response
 if [ $exit_code -gt 1 ]; then
@@ -142,7 +142,7 @@ Suppose there is an automated process that transforms the SAML metadata at ``xml
 ```Shell
 xsl_file=$LIB_DIR/list_all_IdPs_csv.xsl
 resource1_file=/tmp/all_exported_IdPs.csv
-$BIN_DIR/http_xsltproc.sh -F -o "$resource1_file" "$xsl_file" "$xml_location"
+$BIN_DIR/http_xsltproc.sh -F -o $resource1_file $xsl_file $xml_location
 exit_code=$?
 [ $exit_code -eq 1 ] && exit 0 # short-circuit if 304 response
 if [ $exit_code -gt 1 ]; then

From 57666f646d0b2e80335413b817e64116f48aa43e Mon Sep 17 00:00:00 2001
From: Tom Scavo
Date: Sun, 30 Oct 2016 10:15:11 -0400
Subject: [PATCH 8/9] Streamline the examples

---
 README.md | 57 ++++++++++++++++++++++++++++++-------------------------
 1 file changed, 31 insertions(+), 26 deletions(-)

diff --git a/README.md b/README.md
index 7dc47c6..2fbdb28 100644
--- a/README.md
+++ b/README.md
@@ -54,18 +54,19 @@ The following examples show how to use the script to create some cron jobs on in
 ### Example #1
 
-Consider the following URLs:
+The goal is to transform InCommon metadata into the following CSV file:
 
-```Shell
-xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
-resource_url=https://incommon.org/federation/metadata/all_IdP_DisplayNames.csv
-```
+* https://incommon.org/federation/metadata/all_IdP_DisplayNames.csv
 
-The latter resource is used to construct a [List of IdP Display Names](https://spaces.internet2.edu/x/2IDmBQ) in the spaces wiki.
+The above resource is used to construct a [List of IdP Display Names](https://spaces.internet2.edu/x/2IDmBQ) in the spaces wiki.
 
-Let's build an automated process that transforms the SAML metadata at ``xml_location`` into the CSV file at ``resource_url``. Schedule the following process to run every hour on incommon.org:
+Suppose there is an automated process that transforms the main InCommon metadata aggregate into the CSV file at the above URL.
Specifically, let's suppose the following process runs every hour on incommon.org:
 
 ```Shell
+# determine the metadata location
+xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
+
+# create the resource
 xsl_file=$LIB_DIR/list_all_IdP_DisplayNames_csv.xsl
 resource_file=/tmp/all_IdP_DisplayNames.csv
 $BIN_DIR/http_xsltproc.sh -F -o $resource_file $xsl_file $xml_location
@@ -76,7 +77,7 @@ if [ $exit_code -gt 1 ]; then
 exit $exit_code
 fi
 
-# resource_dir is the target web directory for web resources
+# move the resource to the web directory
 resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
 mv $resource_file $resource_dir
 ```
@@ -85,19 +86,20 @@ Observe that the command ``http_xsltproc.sh -F`` forces a fresh SAML metadata fi
 ### Example #2
 
-This example is similar to the previous example except that two resources are created. Consider the following URLs:
+The goal is to transform InCommon metadata into the following pair of CSV files:
 
-```Shell
-xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
-resource1_url=https://incommon.org/federation/metadata/all_RandS_IdPs.csv
-resource2_url=https://incommon.org/federation/metadata/all_RandS_SPs.csv
-```
+* https://incommon.org/federation/metadata/all_RandS_IdPs.csv
+* https://incommon.org/federation/metadata/all_RandS_SPs.csv
 
-The latter pair of resources are used to construct the [List of Research and Scholarship Entities](https://spaces.internet2.edu/x/ZoUABg) in the spaces wiki.
+The above resources are used to construct the [List of Research and Scholarship Entities](https://spaces.internet2.edu/x/ZoUABg) in the spaces wiki.
 
-Suppose there is an automated process that transforms the SAML metadata at ``xml_location`` into the CSV files at ``resource1_url`` and ``resource2_url``. Specifically, let's suppose the following process runs every hour on incommon.org:
+Suppose there is an automated process that transforms the main InCommon metadata aggregate into the CSV files at the above URLs. Specifically, let's suppose the following process runs every hour on incommon.org:
Specifically, let's suppose the following process runs every hour on incommon.org:
 
 ```Shell
+# determine the metadata location
+xml_location=http://md.incommon.org/InCommon/InCommon-metadata.xml
+
+# create the first resource
 xsl_file=$LIB_DIR/list_all_RandS_IdPs_csv.xsl
 resource1_file=/tmp/all_RandS_IdPs.csv
 $BIN_DIR/http_xsltproc.sh -F -o $resource1_file $xsl_file $xml_location
@@ -108,6 +110,7 @@ if [ $exit_code -gt 1 ]; then
 exit $exit_code
 fi
 
+# create the second resource
 xsl_file=$LIB_DIR/list_all_RandS_SPs_csv.xsl
 resource2_file=/tmp/all_RandS_SPs.csv
 $BIN_DIR/http_xsltproc.sh -C -o "$resource2_file" "$xsl_file" "$xml_location"
@@ -118,7 +121,7 @@ if [ $exit_code -gt 1 ]; then
 exit $exit_code
 fi
 
-# resource_dir is the target web directory for web resources
+# move the resources to the web directory
 resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
 mv $resource1_file $resource2_file $resource_dir
 ```
@@ -127,19 +130,20 @@ Observe the commands ``http_xsltproc.sh -F`` and ``http_xsltproc.sh -C``. The fo
 ### Example #3
 
-This example is very similar to the previous example. Consider the following URLs:
+The goal is to transform InCommon metadata into the following pair of CSV files:
 
-```Shell
-xml_location=http://md.incommon.org/InCommon/InCommon-metadata-export.xml
-resource1_url=https://incommon.org/federation/metadata/all_exported_IdPs.csv
-resource2_url=https://incommon.org/federation/metadata/all_exported_SPs.csv
-```
+* https://incommon.org/federation/metadata/all_exported_IdPs.csv
+* https://incommon.org/federation/metadata/all_exported_SPs.csv
 
-The latter pair of resources are used to construct the [List of Exported Entities](https://spaces.internet2.edu/x/DYD4BQ)
+The above resources are used to construct the [List of Exported Entities](https://spaces.internet2.edu/x/DYD4BQ) in the spaces wiki.
 
 Suppose there is an automated process that transforms the InCommon export aggregate into the CSV files at the above URLs. Specifically, let's suppose the following process runs every hour on incommon.org:
 
 ```Shell
+# determine the metadata location
+xml_location=http://md.incommon.org/InCommon/InCommon-metadata-export.xml
+
+# create the first resource
 xsl_file=$LIB_DIR/list_all_IdPs_csv.xsl
 resource1_file=/tmp/all_exported_IdPs.csv
 $BIN_DIR/http_xsltproc.sh -F -o $resource1_file $xsl_file $xml_location
@@ -150,6 +154,7 @@ if [ $exit_code -gt 1 ]; then
 exit $exit_code
 fi
 
+# create the second resource
 xsl_file=$LIB_DIR/list_all_SPs_csv.xsl
 resource2_file=/tmp/all_exported_SPs.csv
 $BIN_DIR/http_xsltproc.sh -C -o "$resource2_file" "$xsl_file" "$xml_location"
@@ -160,7 +165,7 @@ if [ $exit_code -gt 1 ]; then
 exit $exit_code
 fi
 
-# resource_dir is the target web directory for web resources
+# move the resources to the web directory
 resource_dir=/home/htdocs/www.incommonfederation.org/federation/metadata/
 mv $resource1_file $resource2_file $resource_dir
 ```

From 907deba0376dd7389b54e9e4b9c5a934fbdbd193 Mon Sep 17 00:00:00 2001
From: Tom Scavo
Date: Sun, 30 Oct 2016 11:37:54 -0400
Subject: [PATCH 9/9] Call out a dependency

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 2fbdb28..a590d33 100644
--- a/README.md
+++ b/README.md
@@ -4,6 +4,8 @@ XSLT transformations of SAML metadata
 
 ## Installation
 
+The scripts in this repository depend on a [Bash Library](https://github.internet2.edu/InCommon/bash-library) of basic scripts.
Download and install the latter before continuing.
+
 Download the source, change directory to the source directory, and install the source into ``/tmp`` as follows:
 
 ```Shell