# Data-Driven Tests

datacur8 is primarily tested with data-driven fixtures under `tests/<case>/`. The fixtures are the test contract. A case is considered incomplete if required files or snapshots are missing, and the test suite must fail in that situation.
## Table of contents

- Philosophy
- Required Case Structure
- `expected/` File Reference
- Fixture Completeness Rules Enforced by Tests
- Success vs Failure Cases
- Documentation Example Fixtures (`example_*`)
- Authoring Checklist
## Philosophy

- Test behavior and fixture completeness are both enforced.
- A case that is missing required snapshots is a broken test, even if the CLI behavior being exercised would otherwise pass.
- Failure cases are first-class test cases. An invalid `.datacur8` or an invalid data file is valid test input when the expected results are captured under `expected/`.
## Required Case Structure

Every top-level folder under `tests/` is treated as a test case.

```
tests/<case>/
  .datacur8            # required (may intentionally be invalid)
  ... input files ...
  expected/            # required
    validate.exit      # required
    validate.args      # optional
    validate.stdout    # optional
    validate.stderr    # optional
    export/...         # required when validate.exit == 0 and outputs are configured
    tidy/...           # required for tidy cases
```
## `expected/` File Reference

### `expected/validate.exit` (required)

- The expected exit code for `datacur8 validate`.
- Parsed as a single integer.
- Common values:
  - `0`: validation succeeded
  - `1`: config/discovery failure
  - `2`: data validation failure
### `expected/validate.args` (optional)

- Extra CLI args appended to `validate`.
- Typical use: `--format json` for stable, machine-readable error snapshots.
- Whitespace-separated.
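These two control files are cheap to consume. As a minimal sketch of how a harness might read them (Python, with hypothetical helper names — not the suite's actual implementation):

```python
from pathlib import Path

def read_expected_exit(case_dir: str) -> int:
    """Parse expected/validate.exit as a single integer."""
    text = Path(case_dir, "expected", "validate.exit").read_text(encoding="utf-8")
    return int(text.strip())

def read_expected_args(case_dir: str) -> list[str]:
    """Return extra validate args, whitespace-separated; empty if the file is absent."""
    path = Path(case_dir, "expected", "validate.args")
    return path.read_text(encoding="utf-8").split() if path.is_file() else []
```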
### `expected/validate.stdout` (optional)

- Snapshot of `validate` stdout.
- Used most often with `--format json`.
- Compared as JSON (structural equality), not raw text.
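Structural comparison means that formatting and key order differences never cause a mismatch. An illustrative Python helper (the function name is an assumption, not the suite's API):

```python
import json
from pathlib import Path

def stdout_matches(expected_path: str, actual_stdout: str) -> bool:
    """Compare a stdout snapshot against actual output as parsed JSON,
    so whitespace and key ordering are irrelevant."""
    expected = json.loads(Path(expected_path).read_text(encoding="utf-8"))
    return expected == json.loads(actual_stdout)
```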
### `expected/validate.stderr` (optional)

- Snapshot of `validate` stderr.
- Compared line-by-line (order-insensitive for non-empty lines).
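Order-insensitive comparison of non-empty lines can be sketched as follows (again a hypothetical helper, shown only to pin down the semantics):

```python
def stderr_matches(expected: str, actual: str) -> bool:
    """Compare stderr line-by-line, ignoring empty lines and line order."""
    def lines(text: str) -> list[str]:
        # Keep only non-empty lines, sorted so ordering does not matter.
        return sorted(line for line in text.splitlines() if line.strip())
    return lines(expected) == lines(actual)
```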
### `expected/export/...` (conditionally required)

- Required when:
  - `expected/validate.exit` is `0`, and
  - `.datacur8` declares one or more `types[].output.path`.
- Must include a snapshot file for every configured output path.
- Example:

  ```
  expected/
    export/
      out/
        teams.json
        services.jsonl
  ```
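Comparing produced outputs against these snapshots amounts to walking the `expected/export/` tree and diffing bytes. A hedged sketch (the function name and the separate "produced" directory are assumptions about how a harness might be organized):

```python
from pathlib import Path

def export_mismatches(case_dir: str, produced_dir: str) -> list[str]:
    """Return relative paths under expected/export/ whose produced
    counterpart is missing or has different content."""
    root = Path(case_dir, "expected", "export")
    bad = []
    for snap in sorted(root.rglob("*")):
        if snap.is_file():
            rel = snap.relative_to(root)
            produced = Path(produced_dir, rel)
            if not produced.is_file() or produced.read_bytes() != snap.read_bytes():
                bad.append(rel.as_posix())
    return bad
```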
### `expected/tidy/...` (required for tidy cases)

- Contains the expected post-`tidy --write` file content.
- Paths are relative to the case root, mirrored under `expected/tidy/`. Example: `expected/tidy/data/w1.yaml`
- The integration suite also runs plain `tidy` (check mode) for the same fixture and asserts:
  - files are not rewritten in check mode
  - exit code is non-zero when the snapshot differs from the original input
  - diff output is emitted
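The check-mode assertions above can be sketched like this (a Python sketch under stated assumptions: a `datacur8` binary on `PATH`, and helper names that are illustrative rather than the suite's real API):

```python
import subprocess
from pathlib import Path

def files_snapshot(root: str) -> dict:
    """Map relative path -> bytes for every file under root, to detect rewrites."""
    base = Path(root)
    return {p.relative_to(base).as_posix(): p.read_bytes()
            for p in base.rglob("*") if p.is_file()}

def check_tidy_check_mode(case_dir: str) -> None:
    """Run plain `tidy` and assert the check-mode guarantees described above."""
    before = files_snapshot(case_dir)
    result = subprocess.run(["datacur8", "tidy"], cwd=case_dir,
                            capture_output=True, text=True)
    # Check mode must never rewrite files, regardless of outcome.
    assert files_snapshot(case_dir) == before, "check mode must not rewrite files"
    # If any expected/tidy snapshot differs from the original input,
    # tidy must report it via a non-zero exit and emit a diff.
    snap_root = Path(case_dir, "expected", "tidy")
    differs = any(
        Path(case_dir, p.relative_to(snap_root)).read_bytes() != p.read_bytes()
        for p in snap_root.rglob("*") if p.is_file()
    )
    if differs:
        assert result.returncode != 0, "a diff must produce a non-zero exit"
        assert result.stdout or result.stderr, "diff output should be emitted"
```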
## Fixture Completeness Rules Enforced by Tests
The integration suite includes a fixture meta-test that fails when:
- a `tests/<case>/` directory is missing `.datacur8`
- `expected/` is missing
- `expected/validate.exit` is missing
- `expected/export/` exists but contains no files
- `expected/tidy/` exists but contains no files
- `validate.exit == 0` and a configured `output.path` has no matching `expected/export/...` snapshot
This prevents silent skips and partial fixtures.
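As a hedged sketch of what such a meta-test boils down to (Python; `fixture_problems` is a hypothetical name, and the `output_paths` argument stands in for the `types[].output.path` values parsed from `.datacur8`, which is not shown here):

```python
from pathlib import Path

def fixture_problems(case_dir: str, output_paths=()) -> list[str]:
    """Apply the completeness rules above and return a list of violations."""
    case, problems = Path(case_dir), []
    if not (case / ".datacur8").is_file():
        problems.append("missing .datacur8")
    expected = case / "expected"
    if not expected.is_dir():
        return problems + ["missing expected/"]
    exit_file = expected / "validate.exit"
    if not exit_file.is_file():
        problems.append("missing expected/validate.exit")
    # export/ and tidy/ may be absent, but must not be empty if present.
    for sub in ("export", "tidy"):
        subdir = expected / sub
        if subdir.is_dir() and not any(p.is_file() for p in subdir.rglob("*")):
            problems.append(f"expected/{sub}/ exists but contains no files")
    # A passing case with configured outputs needs a snapshot per output path.
    if exit_file.is_file() and exit_file.read_text().strip() == "0":
        for out in output_paths:
            if not (expected / "export" / out).is_file():
                problems.append(f"missing expected/export/{out}")
    return problems
```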
## Success vs Failure Cases

### Success cases

- `expected/validate.exit` is `0`
- If outputs are configured, `expected/export/...` snapshots are required
- Add `expected/tidy/...` when the case is intended to exercise `tidy` (used for both check mode and `--write`)
### Failure cases

- A non-zero `expected/validate.exit` is expected
- `.datacur8` may be invalid (schema/semantic errors), or the data may be invalid
- Prefer `expected/validate.args` with `--format json` plus `expected/validate.stdout`, so that failure intent is explicit and stable
## Documentation Example Fixtures (`example_*`)

Use `tests/example_*` for fixtures that back examples shown in user-facing docs. This keeps examples traceable and makes documentation coverage auditable.
Current examples include:

- `tests/example_readme_quick_start_success`
- `tests/example_readme_quick_start_unique_id_failure`
- `tests/example_readme_quick_start_path_file_mismatch_failure`
- `tests/example_readme_quick_start_foreign_key_failure`
- `tests/example_examples_team_service_registry_success`
- `tests/example_examples_team_service_registry_foreign_key_failure`
- `tests/example_examples_csv_product_catalog_success`
- `tests/example_examples_csv_product_catalog_foreign_key_failure`
- `tests/example_examples_csv_product_catalog_type_conversion_failure`
- `tests/example_examples_strict_mode_enabled_failure`
- `tests/example_examples_strict_mode_force_failure`
- `tests/example_examples_multi_format_export_json`
- `tests/example_examples_multi_format_export_yaml`
- `tests/example_examples_multi_format_export_jsonl`
Behavior-focused condition examples can also use `example_*` naming (for example, `tests/example_conditions_*`) when they exist to illustrate a specific error mode.
## Authoring Checklist

Before committing a new fixture:

- Add `.datacur8`
- Add input files for the scenario
- Add `expected/validate.exit`
- Add `expected/validate.args` and `expected/validate.stdout` for failure cases when possible
- Add `expected/export/...` for every configured `output.path` when validation succeeds
- Add `expected/tidy/...` when testing `tidy`
- Do not keep generated outputs in the case root (store snapshots under `expected/export/...` instead)
- Prefer one clearly named behavior per case