Submission Procedure for XSLT/XPath Test Suites


Table of Contents

Introduction
Purpose of this Guide
Scope of this Guide
Audience of this Guide
Related Documents
1. Creating a Test Catalog: A Walk-through of the Collection Catalog DTD
<test-suite>
Syntax
Purpose
Usage
<test-catalog>
Syntax
Purpose
Usage
Attribute: submitter
<creator>
Syntax
Purpose
Usage
Examples
<date>
Syntax
Purpose
Usage
<test-case>
Syntax
Purpose
Usage
Attribute: id
Attribute: category
Examples
<file-path>
Syntax
Purpose
Usage
Examples
<creator>
Syntax
Purpose
Usage
Examples
<date>
Syntax
Purpose
Usage
Examples
<purpose>
Syntax
Purpose
Usage
Examples
<elaboration>
Syntax
Purpose
Usage
<spec-citation>
Syntax
Purpose
Usage
Attribute: spec
Attribute: version
Attribute: type
Attribute: place
Attribute: version-drop
Attribute: errata-add
Attribute: errata-add-date
Attribute: errata-drop
Attribute: errata-drop-date
Examples
<discretionary>
Syntax
Purpose
Usage
The Catalog of Discretion
Attribute: name
Attribute: behavior
Examples
<gray-area>
Syntax
Purpose
Usage
Attribute: name
Attribute: behavior
Examples
<scenario>
Syntax
Purpose
Usage
Attribute: operation
Attribute: message
Attribute: error
<input-file>
Syntax
Purpose
Usage
Attribute: role
Example
<output-file>
Syntax
Purpose
Usage
Attribute: role
Attribute: compare
Example
<param-set>
Syntax
Purpose
Usage
Attribute: name
Attribute: type
Attribute: value
Example
<console>
Syntax
Purpose
Usage
2. Validating the Test Catalog
3. Packaging the Files
Directory Structure
The VALIDATE Directory
The TEST Directory
How to Package Your Submission
4. Updating a Submission

Introduction

The OASIS XSLT/XPath Conformance Committee ("the Committee") collects test cases to include in its XSLT/XPath Conformance Test Suite. This document explains how to submit a catalog of test cases for consideration by the Committee.

The submission must include the following items:

  • a catalog of test cases, in the form of an XML instance conforming to the OASIS Collection Catalog DTD (collcat.dtd).

  • a file tree of test cases, including (allegedly) correct results

The first chapter of this document walks through the DTD in document order, clarifying the specific requirements of the XSLT/XPath Conformance Committee.

The second chapter of this document explains how to validate the catalog.

The third chapter explains how to package the files that make up the test suite.

The fourth chapter explains how to re-submit or update a test catalog.

Purpose of this Guide

The purpose of this guide is to provide submitters with everything they need to know to prepare the following items for submission:

  • test case(s)

  • reference output

  • test catalog

If you need help figuring out what to submit, see the separate Submission Guidelines document (subguide.htm).

Scope of this Guide

This guide instructs you on how to prepare and submit test cases for XSLT/XPath conformance.

This includes the following:

  • gathering the input and output files

  • creating the test catalog according to the Collection Catalog DTD

  • validating the submission

Note

The Committee recommends that you set aside a clean directory tree to unpack the OASIS package and organize all the files to be submitted.

Audience of this Guide

This document is addressed to test submitters, using the second person ("you").

Related Documents

All submitters should read the following documents:

  • Submission Guidelines and Policy (subguide.htm)

  • Catalog of Discretion (xsltdisc.htm)

For advanced information, submitters may also read:

  • xsltcfg.htm

1. Creating a Test Catalog: A Walk-through of the Collection Catalog DTD

The Collection Catalog DTD allows the following information to be associated with a test case:

  • Identification

    • who submitted the test case?

    • when was the test case submitted?

    • who created the test case?

  • Description

    • what is the purpose of the test case?

    • which specific provisions of the specification(s) are under test in this case?

    • which version of the specification is being tested?

    • what errata, if any, are being tested?

  • Filtering & Discretionary Choices

    • should the test case be executed with a given processor?

  • Operational Parameters

The high-level structures in the Collection Catalog DTD are used to group and identify a set of test cases.

A test catalog will contain one test-case element for each test case. All tests from one submitter should be collected into a test-catalog element. The Committee will review the catalogs it receives, and will publish selected catalogs as a test-suite.

<test-suite>

Syntax

<!ELEMENT test-suite ( test-catalog+ )>

Purpose

A test-suite element is the document element for a submitter's test catalog.

Usage

The final test suite published by the Committee will contain several test catalogs, one from each submitter. In contrast, a submitter's test-suite must contain only one test-catalog.

Note

Some confusion can arise because the DTD allows more than one test-catalog in a test-suite. Keep in mind that the Collection Catalog DTD is used not only by test submitters; it will also be used by the Committee to process the test cases it receives, and to publish the final test suite.
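To make the relationship concrete, a minimal submitter's catalog might be shaped as follows. This skeleton is illustrative only: the DOCTYPE line and element content are assumptions, and the required children of test-case are elided (each is described in the sections below).

```xml
<?xml version="1.0"?>
<!DOCTYPE test-suite SYSTEM "collcat.dtd">
<test-suite>
  <test-catalog>
    <creator>Abner Bentley Conundrum Corporation</creator>
    <test-case id="first-test" category="XSLT-Result-Tree">
      <!-- file-path, purpose, spec-citation, scenario, etc. -->
    </test-case>
  </test-catalog>
</test-suite>
```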

<test-catalog>

Syntax

<!ELEMENT test-catalog ( creator, date?, test-case* )>

<!ATTLIST test-catalog

submitter ID #IMPLIED>

Purpose

The test-catalog element is a container for the test cases from one submitter.

Attribute: submitter

The submitter attribute value will be assigned by the Committee during the production process. It will be a globally unique string used to identify the person or organization submitting the test suite.

The value of the submitter attribute will also be used as the name of the top-level directory which will contain all test cases from that submitter. For example, if the Committee assigns "ABC" as the identifier for ABC Corporation, then all test cases submitted by ABC Corporation will be stored in a directory named ABC, and the submitter attribute is "ABC".

Note

Any value supplied by the submitter is subject to change by the Committee.

<creator>

Syntax

<!ELEMENT creator ( #PCDATA )>

Purpose

The creator element that is a child of test-catalog identifies the person or organization submitting the test catalog.

Usage

While the Committee will assign the value of the submitter attribute, the submitting body can identify itself in the creator element. The submitter can decide what information to include in this element. It could include company name and URL, or personal name and email address, for example.

Note

Do not confuse this creator element with the creator child of test-case. The latter identifies the individuals who created an individual test case.

Examples

The Committee may assign the value "ABC" or "X123" to the submitter attribute, as a unique identifier of the submitter. The creator element allows the submitter to provide a fuller description of itself, such as:

<creator>Abner Bentley Conundrum Corporation</creator>

<date>

Syntax

<!ELEMENT date ( #PCDATA )>

Purpose

The date element indicates the date on which the test catalog is submitted to OASIS.

Usage

The Committee will add the date element to track the date the catalog is received from the submitter.

Note

The submitter can also assign a date to each test case individually. This allows, for example, the submitter to record when a test case was last modified.

<test-case>

Syntax

<!ELEMENT test-case ( file-path , creator* , date? , purpose , elaboration? , spec-citation+ , discretionary? , gray-area? , scenario )>

<!ATTLIST test-case

id ID #REQUIRED

category NMTOKEN #REQUIRED>

Purpose

A catalog can contain any number of test cases. Each test-case element should contain the information necessary to locate and run one test case.

Usage

The test-case element is a container for information about where the test files are located, their purpose and how to use them. The attributes on test-case identify the case uniquely and associate it with a Committee-defined category.

Attribute: id

The purpose of the id attribute is to identify a test case uniquely within the catalog.

Note

When assembling a test suite, the Committee will prefix the id value with the value of the submitter attribute and the file-path element. This will ensure that the id values are unique across the published test suite.
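As a sketch of that uniqueness scheme, the prefixed id could be built as below. The exact joining convention is an assumption here; the Committee defines the real one during production.

```python
# Sketch of how suite-wide unique ids might be formed; the "/" join
# is an assumed convention, not the Committee's documented format.
def suite_id(submitter, file_path, case_id):
    """Prefix a catalog-local id with the submitter id and file-path."""
    return "/".join([submitter, file_path, case_id])

print(suite_id("ABC", "functions/math/round", "first-test"))
# -> ABC/functions/math/round/first-test
```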

Attribute: category

Each test case must be associated with one of the categories that have been defined by the Committee. The categories are listed in the xsltcfg.xml file, in the categories element. No other values are permitted; if a test applies to more than one category, the attribute value should be specified as "Mixed".

In the xsltcfg.xml file, each category element identifies one category, in its id attribute. The associated p element provides a brief description of the category.

In the Collection Catalog DTD, the category attribute allows each test case to be associated with a Committee-defined category, without imposing constraints on test names or directory structure.

The following table lists the categories initially defined for XSLT/XPath tests. However, submitters should always consult the newest copy of the xsltcfg.xml file for an up-to-date list.

Table 1.1. Allowable Categories of Test Cases

CATEGORY                  DEFINITION
Mixed                     Tests that fit into more than one category
XPath-Core-Function       All XPath functions
XPath-Data-Model          All XPath not covered by the categories listed here
XPath-Expression          Operators, type conversion
XPath-Location-Path       Axes, node tests, predicates
XSLT-Data-Manipulation    Sort, for-each, conditionals, variables, keys
XSLT-Data-Model           Treatment of source XML, text and whitespace, entities
XSLT-Extendability        Functions and instructions related to extendability
XSLT-Output               Output, message
XSLT-Result-Tree          Creation of nodes in result
XSLT-Structure            Stylesheet/transform, namespace, import, include
XSLT-Template             Matching, call named, priority
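A submitter could pre-check each test case's category attribute against this table before validating the catalog. The hard-coded set below is only a snapshot; the authoritative list is the categories element in xsltcfg.xml.

```python
# Snapshot of the categories from Table 1.1; consult xsltcfg.xml for
# the authoritative, current list before submitting.
ALLOWED_CATEGORIES = {
    "Mixed", "XPath-Core-Function", "XPath-Data-Model", "XPath-Expression",
    "XPath-Location-Path", "XSLT-Data-Manipulation", "XSLT-Data-Model",
    "XSLT-Extendability", "XSLT-Output", "XSLT-Result-Tree",
    "XSLT-Structure", "XSLT-Template",
}

def check_category(value):
    """Return True if value is one of the Committee-defined categories."""
    return value in ALLOWED_CATEGORIES
```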

Examples

Example 1.1. test-case Tag

<test-case id="first-test" category="XSLT-Result-Tree">

<file-path>

Syntax

<!ELEMENT file-path (#PCDATA)>

Purpose

The file-path element specifies a path, starting with a directory that is directly under the submitter's root directory (i.e., the directory named in the submitter attribute). The path must end with the directory containing the file that has been designated as the "principal" stylesheet file. For those cases without a separate stylesheet, use the directory containing the principal data file. (See the role attribute for more information about how to designate one file as the principal input file.)

Usage

If the file-path element contains a multi-level file path, it must use forward slashes as separators, but it should not begin or end with a slash. The string shall begin with the name of a directory that is directly within the submitter's root directory, and it must end with the name of the directory that contains the principal input file(s). The stylesheet and XML data may be in different directories, in which case the directory containing the stylesheet should be specified in this element. For those cases without a separate stylesheet, use the directory containing the principal data file. (This identifies the directory that will be the current directory for running the individual test case.)

Names in the file path must not contain spaces, must start with a letter, and should be of reasonable length.

Examples

Example 1.2. file-path tag

<file-path>functions/math/round</file-path>
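The naming rules above (forward slashes, no slash at either end, segments that start with a letter and contain no spaces) can be sketched as a validity check. This is a hypothetical helper, and the "reasonable length" rule is left unchecked.

```python
import re

# Each path segment must start with a letter and contain no spaces;
# the whole path uses forward slashes, with none at either end.
_SEGMENT = re.compile(r"[A-Za-z][^ /]*$")

def check_file_path(path):
    """Sketch of a validity check for file-path content (length not checked)."""
    if not path or path.startswith("/") or path.endswith("/"):
        return False
    return all(_SEGMENT.match(seg) for seg in path.split("/"))

print(check_file_path("functions/math/round"))   # True
print(check_file_path("/functions/math/round"))  # False: leading slash
```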

<creator>

Syntax

<!ELEMENT creator (#PCDATA)>

Purpose

The optional and repeatable creator element is intended to identify individual test contributors. It serves two purposes: it allows the submitting organization to track the creator(s) of each test case, and it allows the individual test creators to get public credit for their work.

Note

Do not confuse this creator element with the creator child of the test-catalog element. The latter identifies the creator of the entire test catalog: that is, the submitter.

Usage

In the published test suite, the content of the creator element will be published verbatim.

Examples

Example 1.3. creator tag

<creator>Raffi</creator>

<date>

Syntax

<!ELEMENT date (#PCDATA)>

Purpose

The date element is optional in the DTD, but the Committee will need to use it if a submitter re-submits a test case. Therefore, the use of this element is encouraged.

Usage

In accordance with ISO-8601, the date format shall be yyyy-mm-dd or yyyy/mm/dd.

Note

When processing the catalog, the Committee will strip out the "-" or "/" separators to allow numeric date comparisons.

Examples

Example 1.4. date tag

<date>2002-01-17</date>
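The separator stripping that the Committee applies before comparing dates can be sketched as follows; date_key is a hypothetical helper name, not part of any Committee tooling.

```python
# Strip the "-" or "/" separators so dates compare as plain numbers,
# as the note above describes the Committee doing.
def date_key(date_text):
    """Turn '2002-01-17' or '2002/01/17' into the integer 20020117."""
    return int(date_text.replace("-", "").replace("/", ""))

assert date_key("2002-01-17") == date_key("2002/01/17") == 20020117
assert date_key("2002-01-18") > date_key("2002-01-17")
```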

<purpose>

Syntax

<!ELEMENT purpose (#PCDATA)>

Purpose

The purpose and elaboration elements are closely related: only purpose is required. It should succinctly describe the reason for the test.

Usage

The purpose element shall contain a text string of 255 characters or less, with no new-lines. This text will be used in the final test suite document to briefly summarize each test.

If the creator or submitter feels the need to elaborate on the information in the purpose element, comments and/or the elaboration element may be used.

Examples

Example 1.5. purpose tag

<purpose>Show that 0 equals -0.</purpose>

<elaboration>

Syntax

<!ELEMENT elaboration %prose;>

Note

See the Prose DTD (prose.dtd) for an up-to-date content model. Initially, the prose model is very simple: (p | em | strong )*

Purpose

The optional elaboration element allows the submitter to provide a more detailed description of the test's purpose.

Note

This element can be used to explain non-obvious techniques. Remember that the Committee will be reviewing many tests; any help understanding the non-obvious ones will be greatly appreciated.

Usage

If the submitter feels the need to elaborate on the information in the purpose element, the elaboration element may be used.

<spec-citation>

Syntax

<!ELEMENT spec-citation EMPTY>

<!ATTLIST spec-citation

spec NMTOKEN #REQUIRED

version CDATA #REQUIRED

type NMTOKEN #REQUIRED

version-drop NMTOKEN #IMPLIED

errata-add NMTOKEN #IMPLIED

errata-add-date CDATA #IMPLIED

errata-drop NMTOKEN #IMPLIED

errata-drop-date CDATA #IMPLIED

place CDATA #REQUIRED>

Purpose

For each test case, the submitter must clearly identify which specification(s) are being tested.

The submitter must also supply a pointer to the provisions within each specification that are involved in a test. The more precise the pointer, the better. A precise pointer will decrease the need for an elaboration element and will improve the inversion from the specification to the test cases.

The pointer information may be used to generate and display citations.

Note

Each test case should involve only one testable assertion. See the Submission Policy Guide for a more detailed definition of "assertion".

Each pertinent specification should be cited by version number and (if relevant) errata status.

Some tests are developed in response to an erratum, as opposed to the specification itself. The release of an errata document may require new test cases, and it may make other tests obsolete. Submitters should keep in mind that a version of a specification can have any number of errata issued against it (including none). In addition, an erratum issued against one version of a specification does not apply to later versions of that specification.

Usage

To cite a specification unambiguously, the submitter must indicate the name of the specification (spec) and its version (version). The submitter must also indicate the kind of pointer used to isolate the relevant portion of the specification (type) and provide the pointer itself (place).

The values of these attributes may function as components in a displayable (and possibly linkable) citation to the specification.

For example, given the following attribute values:

spec="XSLT"
version="1999-11-16"
type="section"
place="7.7 Numbering"

the following pointer could be displayed:

Doc: http://www.w3.org/TR/1999/REC-xslt-19991116 Section: 7.7 Numbering

See the citation-specifications element in the xsltcfg.xml document for more information about the specifications that the Committee recognizes, and the permitted methods of citing these specifications. The following Attribute sections (spec, version, type, etc.) provide instructions on how to read this part of the xsltcfg.xml document.
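A hypothetical sketch of how these four attribute values could be assembled into the displayable pointer shown above; the URL pattern is inferred from that single XSLT example and is an assumption, not a documented rule.

```python
# Assemble a displayable citation from the spec-citation attributes.
# The w3.org URL layout (year directory, dashes stripped) is inferred
# from the one example above, not from any specification.
def display_citation(spec, version, type_, place):
    doc = "http://www.w3.org/TR/%s/REC-%s-%s" % (
        version[:4], spec.lower(), version.replace("-", ""))
    return "Doc: %s %s: %s" % (doc, type_.capitalize(), place)

print(display_citation("XSLT", "1999-11-16", "section", "7.7 Numbering"))
# -> Doc: http://www.w3.org/TR/1999/REC-xslt-19991116 Section: 7.7 Numbering
```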

Attribute: spec

XSLT/XPath conformance tests can be related to one of three Recommendations or corresponding Errata. Therefore, the spec attribute value must match one of the following five values:

  1. XSLT

  2. XPath

  3. XML-stylesheet

  4. XSLT-errata

  5. XPath-errata

Note

Always consult the latest version of the xsltcfg.xml document for an up-to-date list of allowable specifications.

Attribute: version

This attribute identifies the version of the specification being tested.

The version attribute facilitates inequality tests, so its value must be numeric.

If more than one specification is implicated by the test, the main specification should always be cited. If the test is really about XPath or XML-Stylesheet, try to use only the base features of XSLT (version 1.0 without any errata).

If the test concerns an erratum, then the version attribute should specify the version of the base specification (e.g., XSLT 1.0), and the version of the errata should be specified in the appropriate errata-* attribute.

Attribute: type

There are three types of pointers that a submitter can use to identify the relevant portions of specifications. The three possible values are:

  1. type="section", to point to a numbered section.

    Since a section pointer is human-legible and not intended for automated linking, it should be used only when neither of the other two types of pointers is possible.

  2. type="anchor", to point to an HTML-style anchor.

    This type of pointer is machine-readable, used for linking directly into HTML source information.

    For example, given a citation to the XPath Recommendation, with type="anchor" and place="number", the resulting pointer becomes:

    http://www.w3.org/TR/1999/REC-xpath-19991116.xml#number.

  3. type="XPointer", to use an ID from the specification document, optionally followed by an XPath expression to qualify it further.

    This type of pointer is machine-readable, used for linking directly into XML source information.

    For example, given a citation to the XPath Recommendation, with type="XPointer" and place="id(number)/ulist[2]/item[2]/p[1]/text()[1]", the resulting pointer becomes:

    http://www.w3.org/TR/1999/REC-xpath-19991116.xml#pointer(id(number)/ulist[2]/item[2]/p[1]/text()[1]).

Note

Always consult the latest version of the xsltcfg.xml document for an up-to-date list of allowable types.

Attribute: place

The place attribute should contain a precise pointer to the part of the specification covered by the test. You must use a pointer that corresponds to the value of the type attribute. For example, if type="section", then the place attribute should specify a section number.

Attribute: version-drop

The version-drop attribute indicates a version of the specification that makes the test case unnecessary.

If the version-drop attribute is specified, the value must be numerically greater than the value of the version attribute.

Attribute: errata-add

The errata-add (or errata-add-date) attribute indicates that the test case became necessary when the errata document was issued.

The submitter has a choice of identifying the pertinent erratum by errata number or date, because some Working Groups do not number their errata. All dates in a test-suite should be in ISO-8601 format: yyyy-mm-dd.

Errata are issued against a particular version of a Recommendation. Therefore, the submitter must specify the version of the base specification, plus the errata number or date.

Attribute: errata-add-date

The errata-add (or errata-add-date) attribute indicates that the test case became necessary when the errata document was issued.

The submitter has a choice of identifying the pertinent erratum by errata number or date, because some Working Groups do not number their errata. All dates in a test-suite should be in ISO-8601 format: yyyy-mm-dd.

Attribute: errata-drop

The errata-drop (or errata-drop-date) attribute indicates that the test case is no longer necessary, now that the errata document has been issued. The value of the errata-drop (or errata-drop-date) attribute must always be numerically greater than the value of the errata-add (or errata-add-date) attribute.

The submitter has a choice of identifying the pertinent erratum by errata number or date, because some Working Groups do not number their errata. All dates in a test-suite should be in ISO-8601 format: yyyy-mm-dd.

Attribute: errata-drop-date

The errata-drop (or errata-drop-date) attribute indicates that the test case is no longer necessary, now that the errata document has been issued. The value of the errata-drop (or errata-drop-date) attribute must always be numerically greater than the value of the errata-add (or errata-add-date) attribute.

The submitter has a choice of identifying the pertinent erratum by errata number or date, because some Working Groups do not number their errata. All dates in a test-suite should be in ISO-8601 format: yyyy-mm-dd.

Examples

Example 1.6. spec-citation tag

<spec-citation spec="XPath" version="1.0" type="XPointer" place="id('strip')/p[3]/text()[1]"/>

<discretionary>

Syntax

<!ELEMENT discretionary ( discretionary-choice )+ >

<!ELEMENT discretionary-choice EMPTY >

<!ATTLIST discretionary-choice

name NMTOKEN #REQUIRED

behavior NMTOKEN #REQUIRED>

Purpose

The Committee has identified and named the areas in the XSLT and XPath Recommendations where the processor's behavior is left up to the discretion of the developer. If a test case examines a processor's response to one or more discretionary items, you must indicate this with the discretionary element.

The discretionary element will be used by testers to filter out irrelevant test cases when a test suite is assembled. Typically, if a test case mentions a discretionary item, the test case would be excluded from a rendered test suite for those processors that have made a different choice for that item. It would be included for those processors that have made the same choice as the one indicated in the test.

On the other hand, if a test case does not mention a discretionary item, the tester should assume that the test case does not depend on that specific discretionary choice. Therefore, the test case would be included for any choice made on that item.

For example, if a test case specifies the discretionary item "attribute-name-not-Qname" and the behavior "raise-error", then a test lab will likely exclude it when testing a processor that has implemented the "ignore" behavior.

Submitters may, if they wish, submit a separate test case for each branch of a multiple choice. In that case, each test must have its own distinct test name and reference output file.

See the Catalog of Discretion (xsltdisc.htm) for details about each discretionary item that has been identified by the Committee.

Usage

If a test case examines a processor's behavior in one or more discretionary areas, the submitter must create a discretionary-choice element for each area.

This information will allow a test lab to exclude tests that deal with discretionary items or behaviors not implemented by the processor being tested.

The Catalog of Discretion

The Committee has cataloged the discretionary choices that are available to the processor developer. Each entry in the catalog has been assigned a name, and the allowed choices have been identified by keywords. These names and keywords must be used as the values of the name and behavior attributes, respectively.

Attribute: name

For each discretionary-choice element, the name attribute must specify the name of the discretionary area. See the Catalog of Discretion for a list of allowable discretionary-choice names.

Attribute: behavior

For each discretionary-choice element, the behavior attribute must specify the choice being tested. The Committee has created keywords to identify possible behaviors. See the Catalog of Discretion for a list of behaviors allowed for each discretionary choice.

Examples

Example 1.7. discretionary element

<discretionary>

<discretionary-choice name="two-attribute-set-same-attribute" behavior="choose-last"/>

</discretionary>

<gray-area>

Syntax

<!ELEMENT gray-area ( gray-area-choice )+ >

<!ELEMENT gray-area-choice EMPTY>

<!ATTLIST gray-area-choice

name NMTOKEN #REQUIRED

behavior NMTOKEN #REQUIRED>

Purpose

The Committee is currently identifying and cataloging the gray areas it finds in the XSLT-related specifications, and defining some allowable actions for each gray area, which can be used as a basis for writing tests. See the xsltcfg.xml file for the current list of gray areas.

The gray-area element will be used by testers to filter out irrelevant test cases when a test suite is assembled. Typically, if a test case mentions a gray area, the test case would be excluded from a rendered test suite for those processors that have made a different choice for that gray area. It would be included for those processors that have made the same choice as the one indicated in the test.

On the other hand, if a test case does not mention a gray area, the tester should assume that the test case does not depend on any specific behavior by the processor. Therefore, the test case would be included for any choice made on that item.

Usage

The gray-area element must be used when a test case examines a processor's response to a vague area in the specification. This is not to be confused with discretionary areas, where the specification offers clear choices to the processor developer.

In the xsltcfg.xml file, the Committee has cataloged the gray-area choices that are available to the processor developer. Each entry in the catalog has been assigned a name, and the allowed choices have been identified by keywords. These names and keywords must be used as the values of the name and behavior attributes, respectively.

Submitters may, if they wish, submit a separate test case for each branch of a two-way choice. In that case, each test must have its own distinct test name and reference output file.

Note

Errata should clear up some gray areas, so the submitter should monitor new releases and indicate any relevant errata document using the errata-add attribute on spec-citation.

For example, if a case tests a vague area, and the vagueness is later addressed in an erratum, then the submitter should resubmit the test, with the erratum specified. This will allow the same test to be used in examining the processor's conformance to the errata.

Attribute: name

The name attribute must specify the name of the gray area being tested.

See the file xsltcfg.xml for a list of allowable gray-area names. In each gray-area element, the value of the id attribute is the name of the gray area. For example, in <gray-area id="gray1">, the gray-area's name is "gray1".

Attribute: behavior

For each gray-area named, the behavior attribute must specify the choice being tested. The Committee has created keywords to identify possible behaviors.

See the file xsltcfg.xml for a list of allowable behaviors for each gray area. Each gray-area element has one or more choice descendants. The value of the value attribute on the choice element is the keyword for an allowable behavior. For example, if there are two such elements (<choice value="ignore"> and <choice value="raise-error">), then the value of the behavior attribute must be either "ignore" or "raise-error". No other values will be accepted by the Committee.
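Under the structure just described, a submitter could extract the allowable behaviors with a few lines of XML parsing. The fragment below is a hypothetical stand-in for the real xsltcfg.xml content, which should be re-read before each submission.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment in the shape described above; the authoritative
# list lives in xsltcfg.xml and may differ.
sample = """
<gray-areas>
  <gray-area id="gray1">
    <choice value="ignore"/>
    <choice value="raise-error"/>
  </gray-area>
</gray-areas>
"""

root = ET.fromstring(sample)
# Map each gray-area name to the set of allowable behavior keywords.
allowed = {
    ga.get("id"): {c.get("value") for c in ga.iter("choice")}
    for ga in root.iter("gray-area")
}
print(allowed == {"gray1": {"ignore", "raise-error"}})  # True
```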

Note

Be sure to update your copy of the xsltcfg.xml file regularly, since it may change. At the time of this writing, the xsltcfg.xml file was part of a submissions prototype, and the gray area choices were not complete.

Examples

Example 1.8. gray-area element

<gray-area>

<gray-area-choice name="gray1" behavior="ignore"/>

</gray-area>

<scenario>

Syntax

<!ELEMENT scenario ( input-file* , output-file* , param-set* , console? )>

<!ATTLIST scenario

operation NMTOKEN #REQUIRED

message ( message ) #IMPLIED

error ( error ) #IMPLIED>

Purpose

The scenario element defines the parameters needed to run the test, such as:

  • inputs

  • outputs

  • instructions on how to run the test

  • indication of how to evaluate the results

Usage

Each test case requires a scenario element. It is a container element that allows the submitter to list and describe the input and output files, parameters, etc. See the sections for each of those elements below.

The names of individual input and output files are provided as the content of the input-file and output-file elements. The full path to the principal input file, for example, is generated by concatenating the following values: suite/submitter/file-path/principal-input-file. (The "principal input file" is the stylesheet, if there is one, otherwise the XML data file, that is provided as an argument when the processor is invoked.)
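That concatenation can be sketched directly; the directory and file names below are illustrative, and the helper name is hypothetical.

```python
import posixpath

# Join suite root, submitter id, file-path content, and the principal
# input file name from the scenario, using forward slashes throughout.
def principal_input_path(suite, submitter, file_path, input_file):
    return posixpath.join(suite, submitter, file_path, input_file)

print(principal_input_path("suite", "ABC", "functions/math/round", "round.xsl"))
# -> suite/ABC/functions/math/round/round.xsl
```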

Attribute: operation

The operation attribute indicates how to run the test. For example, if operation="standard", one XML document is used as the input document ( <input-file role="principal-data">), a stylesheet file is applied (<input-file role="principal-stylesheet">), and the output is captured in a file that can be compared to a "reference output" file.

The operations element in the xsltcfg.xml document identifies the valid operations that can be performed in an XSLT/XPath test. Initially, this attribute can take one of the following five values:

Table 1.2. Allowable Operations

OPERATIONDEFINITION
standard

Indicates that there are 2 principal inputs (the XML input document and the stylesheet) and 1 principal output.

embedded

Indicates that the XML input document contains embedded styling; there should be no separate stylesheet file.

If a separate XSL stylesheet exists, it should be ignored.

external-param

Indicates that parameters are passed in; otherwise, an external-param test is the same as a standard test.

In other words, the processor should be launched with parameters that are set via whatever mechanism the processor supports.

execution-error

Indicates that there are 2 principal inputs, but the test should not generate an output file.

result-analysis

Indicates that some interaction is required during the test; otherwise, a result-analysis test is the same as a standard test.

Note

See the xsltcfg.xml file for an up-to-date list of allowable operations.

Attribute: message

The message attribute is optional. If it is used, it must contain an exact string that should be found in either the standard output or the standard error stream.

Attribute: error

The error attribute is optional. Its presence indicates that the test should generate an error. If used, it must contain an exact string that should be found in the standard error stream. (See below for a way to express the substance of an error instead of the exact message.)

<input-file>

Syntax

<!ELEMENT input-file ( #PCDATA ) >

<!ATTLIST input-file

role NMTOKEN #REQUIRED>

Purpose

Each input-file element provides the name of one input file. The role of each file is specified by the role attribute.

Usage

The value of the input-file element must be the exact name of a file used in the test.

For a principal input file, the input-file element must specify only a filename, without a path. If the principal input file is a stylesheet, the principal data file may be referenced by a relative path, allowing the re-use of shared data files. For supplemental input files, the input-file element may supply a path, relative to the one in the file-path element.

Attribute: role

The input and output files must be assigned roles. The allowable values for the role attribute are listed in the xsltcfg.xml file. See the roles element (a child of scenarios) for details. The value of the role attribute must match the id attribute on one of the role elements.

The p element provides a description of each role, including whether it applies to an input file or an output file.
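Based on the description above, the relevant portion of xsltcfg.xml can be expected to look roughly like the sketch below. The role ids match the documented examples, but the descriptive text inside the p elements is illustrative, not copied from the actual file:

```xml
<scenarios>
  <roles>
    <role id="principal-data">
      <p>Input file: the main XML document to be transformed.</p>
    </role>
    <role id="principal-stylesheet">
      <p>Input file: the main XSLT stylesheet.</p>
    </role>
  </roles>
</scenarios>
```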

Note

As shown below, the role attribute defines some files as "principal" files. These are the files that are named when the processor is invoked (as command-line or API arguments).

Examples of valid roles for an input file are: principal-data, principal-stylesheet, supplemental-data, supplemental-stylesheet, and supplemental-params.

Example

Example 1.9. <input-file> element

<input-file role="principal-data">test1.xml</input-file>
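A supplemental input may carry a relative path, as described under Usage above. The following entry is hypothetical (both the path and the filename are invented for illustration):

```xml
<input-file role="supplemental-stylesheet">modules/common.xsl</input-file>
```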

<output-file>

Syntax

<!ELEMENT output-file ( #PCDATA ) >

<!ATTLIST output-file

role NMTOKEN #REQUIRED

compare NMTOKEN #REQUIRED>

Purpose

Each output-file element provides the name of one output file. Some tests will not expect any output files; in that case, there will be no output-file element.

The role of each file is specified by the role attribute.

The method by which the output file is to be evaluated is conveyed by the compare attribute.

Usage

The value of the output-file element must be the exact name of a file used in the test.

For the principal output file, the output-file element must specify only the filename, without a path. Since XSLT 1.0 does not allow supplemental output files, the "supplemental" designation should be used only for captured console output.

Attribute: role

The input and output files must be assigned roles. The allowable values for the role attribute are listed in the roles element of the xsltcfg.xml file. The value of the role attribute in the Collection Catalog instance must match the id attribute on one of the role elements in the xsltcfg.xml file.

In the xsltcfg.xml file, the p element provides a description of each role, including whether it applies to an input file or an output file.

Examples of valid roles for an output file are principal and supplemental.

Attribute: compare

The compare attribute of the output-file element describes how to evaluate the outcome of the test; that is, it indicates how to compare the expected output to the actual output.

The allowable values for the compare attribute are listed in the comparisons element of the xsltcfg.xml file. The value of the compare attribute in the Collection Catalog instance must match the id attribute on one of the comparison elements in the xsltcfg.xml file.

In the xsltcfg.xml file, the p element provides a description of each possible method of comparison.

Examples of valid methods of comparison are shown in the following table.

Table 1.3. Allowable Methods of Comparison

XML

The output should be an XML file.

HTML

The output should be an HTML file.

Text

The output should be a text file.

Manual

The output cannot be automatically verified.

This value should be used sparingly, for generate-id() and system-property() output.

Ignore

The test should not produce any output.

An "Ignore" value indicates that the test should not have produced any output file. You may still want to designate a name for such a file, just in case it gets produced in error.

Example

Example 1.10. output-file element

<output-file role="principal" compare="XML">test1.out</output-file>

<param-set>

Syntax

<!ELEMENT param-set EMPTY>

<!ATTLIST param-set

name NMTOKEN #REQUIRED

type NMTOKEN #REQUIRED

value CDATA #REQUIRED>

Purpose

Note

The param-set element type is in its preliminary stages of development, so it is subject to change.

In addition to input files, some processors may use command-line parameters or parameters of their API. The param-set element allows these parameters to be specified in the collection catalog.

Usage

Use one param-set element for each parameter.

Attribute: name

The name attribute specifies the name of the parameter.

Attribute: type

The type attribute indicates the format of the parameter value. The allowable values are found in the xsltcfg.xml file, in the parameter-types element. The value of the type attribute in the collection catalog must match the value of the id attribute on one of the parameter-type elements in the xsltcfg.xml file.

In the xsltcfg.xml file, the p elements provide a description of each type of parameter.

Examples of allowable parameter types are: string, number, and boolean.

Attribute: value

The value attribute specifies the value of the parameter.

Example

Example 1.11. <param-set> element

<param-set name="param-a" type="number" value="1"/>

<console>

Syntax

<!ELEMENT console (%prose;)>

Note

See the latest version of the Prose DTD (prose.dtd) for an up-to-date content model. Initially, it is defined simply: (p | em | strong )*

Purpose

The console element is optional. Its presence signifies that the console output should be captured as the test is run.

Usage

The console element can contain a description of the anticipated result on a console or ancillary output device separate from any principal or supplemental expected outputs. This item may be used to describe the gist of an error message (as opposed to the error attribute on the scenario element, which specifies an exact string from an error message).
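For example, a test whose error message need only convey a particular idea (rather than match an exact string) might describe the expected console output like this; the wording is illustrative:

```xml
<console>
  <p>The processor should complain that the named template
  cannot be found.</p>
</console>
```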

2. Validating the Test Catalog

You must validate your submission before submitting it to the Committee, to ensure that it is configured properly for the XSLT/XPath Conformance Test Suite.

This chapter explains how to use the SYSTEM/SUPPORT/valcat.xsl file to validate your submission. The following steps should be done in the order specified.

  1. Choose a short name for your submission. To accommodate the various platforms that may be used by test labs, this name must be no longer than 8 characters, with no spaces.

  2. Make a subdirectory of that name under the VALIDATE directory, and another one under the TESTS directory.

    Note

    As mentioned in Chapter 1, this name corresponds to the submitter attribute of the test-catalog element and must be approved by the Committee to ensure uniqueness. The Committee therefore reserves the right to change the name you have chosen for your submission.

    In the example submission files, the name "XampleCo" was chosen as the submission name. Therefore, there are two subdirectories by that name, one under VALIDATE and the other under TESTS.

  3. In the subdirectory you have created under VALIDATE, create an XML file containing the data about your test cases. This XML file must conform to the Collection Catalog DTD (collcat.dtd), according to the guidelines in Chapter 1.

    The name of this file is up to you, but should be obvious to testers. For example, you may want to use the subdirectory name (as in XampleCo.xml) or a generic name like testcat.xml.

  4. In the subdirectory you have created under VALIDATE, create a second XML file. This will be smaller than the Collection Catalog instance, because it will contain only the submission header.

    Pattern this new file after the sample file Validate/XampleCo/subsgood.xml. You can copy subsgood.xml, rename it casesub.xml, and change the following element:

    <submission href="testgood.xml" submitter="XampleCo" date="2001-12-07"/>

    so that it matches the catalog name, submitter name, and date used in your main Collection Catalog instance from Step 3. (Chapter 1 explains how to use the markup in the Collection Catalog DTD.)

  5. Copy the test.bat file from VALIDATE/XampleCo to the subdirectory you created under VALIDATE in Step 2 above, and edit it. Alternatively, you could create your own equivalent script for your operating system. In either case, the script must do the following:

    • validate the two XML files, casesub.xml and the catalog of test cases, that you have created in Steps 3 and 4;

    • run the valcat transformation on the catalog file to produce a report of any problems.

    You must also consider the following:

    • The example batch file (test.bat) calls validation and XSLT via batch files. You can create files that work for you, based on whatever XSLT processor and XML parser you are using.

    • It is strongly recommended that the XSLT output go to a file. That will give you a checklist of problems.

    • Two example catalogs are provided: testgood.xml, a clean catalog, and testbad.xml, which contains every problem that valcat knows how to find. You can use them to debug your test.bat against the XampleCo files first.

    • If your xsl.bat adds filetype extensions, then test.bat shouldn't add them.

    • xsl.bat may need a path to the XML and XSL executables.

    • test.bat may need a path to the xml.bat and xsl.bat files.

    • If your XSLT processor validates its XML input, you could use it in xml.bat.

    • [This is obvious, but stated as a reminder to check.] Change the names of the XML files being validated to whatever names you have used.

  6. Once you have test.bat working correctly with XML/XSL subsidiary batch files, run it in the subdirectory you created under VALIDATE, and read the resulting err-good.xml (or whatever filename you assigned to the valcat output).

    If no problems are reported, you are ready to submit your test cases.

    If there are problems, fix your catalog of test cases as indicated in the err-good.xml file.

3. Packaging the Files

This chapter explains how to package the files that make up your submission. Before submitting your package, you must also validate it according to the directions in the preceding chapter.

Directory Structure

Package the files according to the directory structure shown in this section.

When you unpacked the downloaded package, you obtained the following subdirectories:

  • VALIDATE

    Work area for test-case catalogs, with a subdirectory for each submitter.

  • DOCS

    Explanatory documents from OASIS.

  • SYSTEM

    Scripts, DTDs, stylesheets, configuration data, and other files that define the OASIS testing framework. The SUPPORT subdirectory contains files that apply to all test regimes, and the XSLT subdirectory contains files specific to XSLT/XPath testing.

  • TESTS

    Contains all the test cases and reference output, with a subdirectory for each submitter.

The VALIDATE Directory

Purpose

The VALIDATE directory is for your catalog and submissions file.

Usage

The VALIDATE directory must contain a subdirectory for each submitter.

In the example files provided, the submitter is XampleCo, so there is one XampleCo subdirectory.

The TESTS Directory

Purpose

The TESTS directory is for your test cases and reference output files.

Usage

The TESTS directory must contain a subdirectory for each submitter. The subdirectory you create for yourself serves as the staging area for your submitted tests and data.

In the example files provided, the submitter is XampleCo, so there is one XampleCo subdirectory. In the steps below, we will use the string "YourName" to represent the name of your directory in TESTS, which must match the submitter attribute of the test catalog.

How to Package Your Submission

  1. Put your test inputs somewhere in the subdirectory you've created under TESTS.

    You can create any structure you want under this new directory. For example, you may wish to create TESTS/YourName/Input/ for input files, or you may wish to have a subdirectory for each category of tests. Use the file-path element in the catalog of test cases to record the path under YourName where the principal input is located.

  2. Put your proposed reference output files, in raw form, somewhere under the TESTS/YourName directory.

    For example, you may wish to create TESTS/YourName/Output/ for output files.

  3. Make sure that the data in the test case catalog accurately reflects the location of the files from Steps 1 and 2. Use the TESTS/YourName directory as the base point for the file-path value.

  4. Put the test case catalog and the small submissions file (renamed casesub.xml) directly in the TESTS/YourName directory.

  5. Zip up the YourName directory tree using a recent version of PKZip, WinZip, or a similar utility.

  6. Submit the zip file via our upload page.
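As a hypothetical illustration of Steps 1 through 4: if your inputs are placed in TESTS/YourName/Input/ and your reference outputs under TESTS/YourName/Output/, the corresponding catalog entries could read as follows. All filenames are invented, the exact element order follows the Collection Catalog DTD, and (per Chapter 1) the output-file element names only the file, not its path:

```xml
<file-path>Input</file-path>
<scenario operation="standard">
  <input-file role="principal-data">case1.xml</input-file>
  <input-file role="principal-stylesheet">case1.xsl</input-file>
  <output-file role="principal" compare="XML">case1.out</output-file>
</scenario>
```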

4. Updating a Submission

Sometimes you may need to re-submit a test catalog. In this case, the following rules must be observed:

  • the names of new tests (<test-case id="new-name">) must not conflict with the names of tests previously submitted.

  • changed tests must have a newer date (<test-case><date>newer-date</date></test-case>) than the tests they replace.

  • if an entire test catalog is re-submitted, the Committee must be able to determine easily and unambiguously which tests have changed since the previous submission.
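Under these rules, a corrected version of a previously submitted test keeps its original id but carries a later date. The id, category value, and date below are invented for illustration:

```xml
<test-case id="existing-test-01" category="XSLT-Structure">
  <date>2002-03-15</date>
  <!-- remaining children unchanged from the earlier submission -->
</test-case>
```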