Test Lab Guidelines - DRAFT 0.45

I. General Information

A. This package includes test cases, reference output for comparison, and catalog data that you can use to generate various scripts. The catalog data is in XML, and the scripts can be generated via XSLT. We include a standalone sufficiency test (MkFinder.xsl) so you can determine whether your XSLT processor can generate the anticipated scripts.
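
For illustration, the following is a minimal sketch of applying a script-generating stylesheet to the catalog from Python through the third-party lxml library. The file names ("catalog.xml", "make-scripts.xsl", "run-tests.sh") are placeholders rather than the actual names used in this package; consult the package documentation, and MkFinder.xsl itself, for the real invocation and parameters of the sufficiency test.

    from lxml import etree

    # Parse the catalog data and a script-generating stylesheet
    # (placeholder file names).
    catalog = etree.parse("catalog.xml")
    stylesheet = etree.parse("make-scripts.xsl")

    # Compile and apply the transform; the output is the generated script.
    transform = etree.XSLT(stylesheet)
    script = transform(catalog)

    # Write the generated script out as produced by the stylesheet.
    with open("run-tests.sh", "w") as out:
        out.write(str(script))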

B. The expected "reference" results from the submitters will be in XML, HTML, or text format, as will the user's actual results. These raw results will be converted to an XML document with an indirect representation of the content in InfoSet-style markup, always in UTF-8 encoding. The XSLT/XPath Conformance Committee will supply the user with the "reference" results, and the user will apply an Information Set Analysis mechanism to produce the User's Results Description of the actual test results run on a particular processor. The XML results can be canonicalized, and the user can perform a byte-wise or text comparison. For example, if the result is an XML document, the user can apply an XSLT stylesheet supplied in the XSLT/XPath Conformance Testing package, using the processor being tested or another processor, to produce an XML InfoSet representation of that result. Serializing both the committee-supplied expected results and the actual results with Canonical XML (or any consistent serializer) allows for easy comparison. Direct XML or HTML comparators can be used, if available.
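
As a concrete illustration of the XML case above, the sketch below uses the third-party lxml library to apply an InfoSet-producing stylesheet to an actual result, canonicalize both the expected and actual InfoSet documents, and compare them byte for byte. The file names ("infoset.xsl", "expected-infoset.xml", "actual-output.xml") are placeholders; the actual stylesheet is the one supplied in the XSLT/XPath Conformance Testing package.

    from lxml import etree

    # Apply the InfoSet-producing stylesheet to the actual test output
    # (placeholder file names throughout).
    transform = etree.XSLT(etree.parse("infoset.xsl"))
    actual_infoset = transform(etree.parse("actual-output.xml"))

    # Canonicalize both documents (Canonical XML) so that insignificant
    # serialization differences do not affect the comparison.
    expected_c14n = etree.tostring(etree.parse("expected-infoset.xml"),
                                   method="c14n")
    actual_c14n = etree.tostring(actual_infoset, method="c14n")

    # A byte-wise comparison of the canonical forms decides pass/fail.
    print("PASS" if expected_c14n == actual_c14n else "FAIL")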

C. This document is currently more of an outline than a complete explanation. It is included with the test suite because it contains numerous suggestions and ideas that help you make the best use of the suite.

II. Unpacking and setting up the test system

A. This document explains
A'. Other documents explain
B. Planning your file system layout
C. Planning your test operations

III. Getting an XSLT processor ready for testing

A. The Test Lab needs to get the following information from the processor developer(s)
B. Ensure that the processor works (see the sketch after this list)
C. Using XSLT as part of the installation
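
For item B, a quick smoke test is to run a trivial identity transform and check that the processor reproduces its input. The sketch below does this for a processor reachable from Python through the third-party lxml library (i.e., libxslt); a command-line processor would be exercised through its own invocation instead, so treat this only as an illustration of the idea.

    from lxml import etree

    # A trivial identity transform: if the processor cannot reproduce its
    # input, something is wrong with the installation.
    identity = etree.XML("""
    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="@*|node()">
        <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
      </xsl:template>
    </xsl:stylesheet>""")

    doc = etree.XML("<doc><p>smoke test</p></doc>")
    result = etree.XSLT(identity)(doc)

    # The canonical form of the output should match the canonical input.
    assert etree.tostring(result, method="c14n") == \
        etree.tostring(etree.ElementTree(doc), method="c14n")
    print("processor responds to a basic transform")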

IV. Running the tests and evaluating a processor

A. Setup and planning
B. Rendering the test suite for a given processor and errata level
C. Running the applicable test cases through the processor
D. Comparing the results of this run against the reference output (see the sketch after this list)
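
Items B through D above amount to a loop over the rendered test cases. The sketch below shows one possible shape of that loop for a processor driven through the third-party lxml library, covering only the case where the reference output is XML; the manifest file and its three-column layout (stylesheet, source document, reference output) are hypothetical, since the scripts actually rendered from the catalog define the real layout and invocation.

    import csv
    from lxml import etree

    def canonical(tree):
        # Serialize a parsed tree as Canonical XML for byte-wise comparison.
        return etree.tostring(tree, method="c14n")

    passed = failed = 0
    # Hypothetical manifest produced when rendering the test suite:
    # one row per applicable test case.
    with open("rendered-tests.csv", newline="") as manifest:
        for stylesheet, source, reference in csv.reader(manifest):
            transform = etree.XSLT(etree.parse(stylesheet))
            actual = transform(etree.parse(source))
            if canonical(actual) == canonical(etree.parse(reference)):
                passed += 1
            else:
                failed += 1
    print(f"{passed} passed, {failed} failed")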