Formalize conformance test description format #1520

Open
tedepstein opened this issue Apr 2, 2018 · 5 comments

Comments

@tedepstein
Contributor

tedepstein commented Apr 2, 2018

Discussed on TSC Meeting: April 2, 2018:

Rules and test cases in a conformance test suite might have a descriptive structure, something like this:

  • Rule description (human-readable)
  • Reference to the relevant part of the spec.
  • Set of test cases derived from this rule:
    • Example input
    • Expected result, as one of the following:
      • pass
      • error (corresponds to MUST, MUST NOT in the spec)
      • warning (corresponds to SHOULD, SHOULD NOT in the spec)
    • Comments (optional)

OpenAPI implementations, including editors, generators, documentation formats, etc., are expected to validate the content, correctly detect error and warning conditions (without false positives on valid content), and expose and handle errors and warnings in whatever way is appropriate for that implementation.
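As a concrete illustration of the descriptive structure above, here is a minimal Python sketch of how a rule and its test cases might be encoded. The class and field names, the example rule, and the sample documents are all hypothetical, not part of any agreed format:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# error corresponds to MUST / MUST NOT; warning to SHOULD / SHOULD NOT.
EXPECTED_RESULTS = {"pass", "error", "warning"}

@dataclass
class TestCase:
    input_document: str            # example input (e.g. OAS3 JSON/YAML text)
    expected_result: str           # one of EXPECTED_RESULTS
    comments: Optional[str] = None # optional notes on the case

    def __post_init__(self):
        if self.expected_result not in EXPECTED_RESULTS:
            raise ValueError(f"unknown expected result: {self.expected_result}")

@dataclass
class Rule:
    description: str               # human-readable rule description
    spec_ref: str                  # reference to the relevant part of the spec
    test_cases: List[TestCase] = field(default_factory=list)

# Illustrative rule only; the spec reference text is a placeholder.
rule = Rule(
    description="The root object must declare the `openapi` version field.",
    spec_ref="OAS 3.0, OpenAPI Object",
    test_cases=[
        TestCase('{"openapi": "3.0.1", "info": {}, "paths": {}}', "pass"),
        TestCase('{"info": {}, "paths": {}}', "error",
                 comments="Missing required `openapi` field."),
    ],
)
```

Because the structure is plain data, the same rule descriptions could equally be serialized as JSON or YAML for a machine-readable suite.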

Some goals, and possibly non-goals, pending discussion:

  • Create a definitive conformance test suite as the set of all of these rules and test cases.
  • Use a uniform descriptive structure for the test cases in the suite, something like the preceding bullet list.
  • Make the rule and test case descriptions machine-readable.
  • Make the rules machine-executable by adding structured representations of each rule, which could take different forms:
    • Where the rule can be expressed in JSON Schema, include it in a schema. That schema could be designed as the standard schema for OpenAPI, or given a more limited scope as an informative addition to the conformance test suite.
    • In other cases, or possibly in all cases, state the rule as a logical assertion in an expression language, preferably one that can be embedded or interpreted in different programming languages.
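To make the "logical assertion" idea concrete, here is a minimal Python sketch of a rule stated as a predicate over the parsed document. The predicate, function name, and sample documents are illustrative assumptions; in practice the assertion would be written in a portable expression language rather than any one host language:

```python
import json

def openapi_field_declared(doc: dict) -> bool:
    """Hypothetical MUST-level rule: the root object declares an
    `openapi` version string."""
    return isinstance(doc.get("openapi"), str)

valid = json.loads('{"openapi": "3.0.1", "info": {}, "paths": {}}')
invalid = json.loads('{"info": {}, "paths": {}}')

print(openapi_field_declared(valid))    # True
print(openapi_field_declared(invalid))  # False
```

The same assertion could be carried in the test suite as a string and evaluated by each implementation's embedding of the chosen expression language.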

Consider all of the above as a straw-man proposal, for discussion.

@tedepstein
Contributor Author

@darrelmiller, please have a look & see if this captures the ideas we discussed on the call today.

BTW:

  • I know the idea of executable rules is a stretch, and is probably separable from the other goals. Just thought I'd mention it anyway since it could be valuable in this context.
  • From the little bit that I know of behavior-driven development (BDD) and specification by example, what I'm describing kind of sounds like one or both of those things. If anyone has experience with BDD practices and frameworks, and thinks BDD is applicable here, please chime in.

@darrelmiller
Member

@tedepstein Looks great to me. I'm going to add a note to tomorrow's agenda to discuss what next steps we can take. If you are able to make the meeting, it would be great to hear your thoughts on how we can progress.

@cmheazel
Contributor

The Open Geospatial Consortium (OGC) runs a compliance test program at http://cite.opengeospatial.org/. This program is supported by the Team Engine test server at https://github.com/opengeospatial/teamengine. Team Engine executes test scripts written in TestNG (http://testng.org/doc/index.html). As part of its Testbed 14 initiative, the OGC will develop compliance tests for its OpenAPI-based WFS 3.0 spec. This work may provide us with most of what we need for a general-purpose OpenAPI compliance test. Any interest?

@tedepstein
Contributor Author

@darrelmiller and @cmheazel, sorry to have missed your earlier comments. My Gmail filter was a bit overzealous, and I just corrected that.

I'll be on Monday's call to discuss next steps, and I'll take a look at what the OGC has done. Thanks!

@tedepstein
Contributor Author

tedepstein commented May 2, 2018

My live notes from Monday's TSC call:

OAS3 Certification Testing 2018-04-30.pptx

Text content, for SEO & inline reading:

  • End State: Certification Process Overview

    • TSC
      • Standardize rules
      • Specify test cases
      • Publish guidelines and automated testing tool
    • Provider
      • Implement testing tool adapter to inject test inputs, inspect results
      • Self-certify
      • Submit for verification
    • TSC
      • Connect test driver to Provider implementation
      • Run automated certification process, report results
  • Start with OpenAPI 3.0 Document “Consumers”

    • Overall Categories:
      • Consumers (OAS3 Document Readers)
      • Producers (OAS3 Document Writers)
      • Generators (OAS3 -> other artifact)
    • OpenAPI “Consumers” are easiest. Start with these.
      • Sub-Categories:
        • Parser
        • Editor
        • Linter/Validator
        • Code generator that uses OAS3 as input (certify input phase)
      • Expectations
        • All valid input is handled without errors
        • Invalid input produces expected errors & warnings
  • Questions, Challenges…

    • Supporting multiple implementation languages
    • Error Codes
      • Practically a prerequisite for automatable verification
      • Format?
        • Distinguished String, e.g. prefix: “OAS3-____________”
    • How to verify “representation” of valid input
      • How can we tell that the consumer correctly consumed valid input, and translated it to an appropriate internal representation?
      • Conclusion: this is in “producer” scope.
    • Next phase: How to test OpenAPI “Producers”:
      • Documentation formats
      • Generators (using OpenAPI as an input)
      • Converters (from other formats to OpenAPI)
  • Rules and Test Cases for Validation

    • Rule description (human-readable)
    • Reference to the relevant part of the spec.
    • Error Code
    • Set of test cases derived from this rule:
      • Example input
      • Expected result, as one of the following:
        • pass
        • error (with code, corresponds to MUST, MUST NOT in the spec)
        • warning (with code, corresponds to SHOULD, SHOULD NOT in the spec)
      • Comments (optional)
  • Existing Validators
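The certification flow in the notes above (provider implements a testing-tool adapter; the TSC's driver injects test inputs and inspects results) can be sketched in Python. The adapter interface, the "OAS3-" error-code prefix, and the toy consumer are all assumptions for illustration, not an agreed design:

```python
import json
from typing import Callable, Dict, List, Tuple

# A provider adapter maps an input document to the list of error codes
# the consumer implementation reports for it.
Adapter = Callable[[str], List[str]]

def run_certification(adapter: Adapter,
                      cases: List[Tuple[str, List[str]]]) -> Dict[str, int]:
    """Feed each test case to the adapter and compare reported error codes
    against expectations. Valid input must produce no errors; invalid input
    must produce exactly the expected codes (no false positives/negatives)."""
    results = {"passed": 0, "failed": 0}
    for document, expected_codes in cases:
        reported = adapter(document)
        if sorted(reported) == sorted(expected_codes):
            results["passed"] += 1
        else:
            results["failed"] += 1
    return results

def toy_adapter(document: str) -> List[str]:
    """Stand-in for a real consumer: reports a hypothetical error code
    when the root `openapi` field is missing."""
    doc = json.loads(document)
    return [] if "openapi" in doc else ["OAS3-EXAMPLE-001"]

cases = [
    ('{"openapi": "3.0.1"}', []),            # valid: expect no errors
    ('{"info": {}}', ["OAS3-EXAMPLE-001"]),  # invalid: expect one code
]
print(run_certification(toy_adapter, cases))  # {'passed': 2, 'failed': 0}
```

Distinguished error codes are what make this comparison automatable, which is why the notes call them practically a prerequisite for verification.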

4 participants