
Part VIII: The Verdict — Where Typed Specs Sit

Eight approaches. No silver bullet. The question is not "which one?" but "which layer does each one cover?"

The Landscape

Every approach in this series occupies a different point on two axes: how automated is the linking? and how type-safe is the link?

[Diagram: the eight approaches plotted on two axes, automation of the link (x) and type safety of the link (y)]

The bottom-left quadrant is where most teams live: informal, convention-based, discipline-dependent. Moving right adds automation. Moving up adds type safety. The top-right corner is where the compiler does the work.

What Typed Specs Trade Away

Typed specifications sit in the top-right corner. That position comes with tradeoffs — real ones, not theoretical:

Cross-Functional Readability

A PM can read this:

Scenario: Admin can assign roles to other users
  Given an admin user "alice"
  And a viewer user "bob"
  When alice assigns the editor role to bob
  Then bob should have the editor role

A PM cannot read this:

abstract adminCanAssignRoles(): ACResult;

Typed specs are developer tools. If cross-functional readability is a hard requirement — regulatory environments, QA-driven teams, products where PMs co-author specifications — BDD's natural-language format is genuinely superior.
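To make the comparison concrete, here is a sketch of what the full class around that one-line signature might look like. The names (ACResult, AdminRoleAssignment) are illustrative, not the series' actual API; doc comments recover some of Gherkin's readability, but a reader still has to open a .ts file:

```typescript
// Illustrative sketch of a typed feature spec; names are invented.
type ACResult = "verified" | "pending";

/** Feature: admins can manage role assignments. */
abstract class AdminRoleAssignment {
  /** Given an admin and a viewer, the admin can grant the editor role. */
  abstract adminCanAssignRoles(): ACResult;

  /** A viewer attempting the same assignment is rejected. */
  abstract viewerCannotAssignRoles(): ACResult;
}

// A concrete test class must implement every AC, or it will not compile.
class AdminRoleAssignmentTests extends AdminRoleAssignment {
  adminCanAssignRoles(): ACResult {
    return "verified";
  }
  viewerCannotAssignRoles(): ACResult {
    return "pending"; // not yet automated
  }
}
```

The compiler guarantees no AC is silently dropped, but the trade stands: this is legible to developers, not to PMs.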

Management Dashboards

Test management platforms (Allure, TestRail, Zephyr) provide real-time web dashboards with historical trends, pass rate charts, and per-release comparisons. Typed specs produce console output and JSON files.

If management needs dashboards they can access without touching the codebase — and they usually do — a test management platform delivers what typed specs don't.

Enterprise ALM Integration

Enterprise teams need traceability that spans systems: Jira epics → features → tests → deployments → incidents. ALM tools (Azure DevOps, Jira + plugins) provide this end-to-end chain.

Typed specs cover one link: feature → test. They don't connect to Jira epics upstream or to deployment pipelines downstream. In heavily regulated environments (healthcare, finance, aerospace), this gap matters.

Multi-Language Support

Typed specs require a language with a type system strong enough to enforce the chain:

  • TypeScript: keyof T + decorators (works)
  • C#: nameof() + typeof() + attributes (works, arguably better)
  • Java: annotations + reflection (partial — no keyof equivalent)
  • Python: no compile-time enforcement (would need a linter)
  • Go, Rust, Ruby: no decorator/attribute system that fits this pattern

If your team uses Python or Go, typed specs in their current form aren't available. You'd need a language-specific adaptation or a different approach entirely.
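For concreteness, here is a minimal sketch of the keyof mechanism from the TypeScript row above (decorators omitted; the verifies helper and class names are invented for illustration):

```typescript
// Hypothetical sketch: linking a test to an AC via keyof.
type ACResult = { passed: boolean; evidence?: string };

abstract class RoleManagement {
  abstract adminCanAssignRoles(): ACResult;
  abstract viewerCannotAssignRoles(): ACResult;
}

// `keyof F` means the compiler rejects any name that is not
// actually a member of the feature class.
function verifies<F>(ac: keyof F, run: () => boolean): ACResult {
  return { passed: run(), evidence: String(ac) };
}

// Compiles: "adminCanAssignRoles" is a member of RoleManagement.
const ok = verifies<RoleManagement>("adminCanAssignRoles", () => true);

// Would NOT compile: typo in the AC name.
// const bad = verifies<RoleManagement>("adminCanAsignRoles", () => true);
```

Python and Go have no compile-time analogue of keyof, which is why the list above marks them unavailable.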

Honest Limitations of Typed Specifications

Beyond the tradeoffs above, typed specs have specific limitations:

Regex-based scanning (TypeScript version). The compliance scanner uses regex to parse feature definitions and decorator references. It works because the file format is strict, but it's fundamentally less robust than AST-based analysis. Edge cases in formatting could theoretically fool it.
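The fragility is easy to see in a sketch. Assuming the strict one-abstract-method-per-AC format the series describes (the pattern and sample source are invented for illustration):

```typescript
// Hypothetical source file in the strict typed-spec format.
const source = `
abstract class AdminRoleAssignment {
  abstract adminCanAssignRoles(): ACResult;
  abstract viewerCannotAssignRoles(): ACResult;
}
`;

// Matches "abstract <name>(): ACResult;". Brittle by design: a line
// break mid-signature or a comment between tokens would be skipped
// silently, which an AST-based scanner would handle correctly.
const AC_PATTERN = /abstract\s+(\w+)\(\):\s*ACResult;/g;

const acs = Array.from(source.matchAll(AC_PATTERN), (m) => m[1]);
// acs is ["adminCanAssignRoles", "viewerCannotAssignRoles"]
```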

No historical trends. The scanner produces a point-in-time snapshot. "How has coverage changed over the last 10 releases?" requires saving JSON snapshots and building a trend viewer. This is on the V2 roadmap but not built yet. The tspec product series designs the Diem-backed persistence and dashboard that close this gap.
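The snapshot-plus-trend workflow is simple in principle. A sketch of the core computation, with a made-up snapshot shape (the real scanner's JSON fields may differ):

```typescript
// Hypothetical snapshot shape; persist one per release, then diff.
interface Snapshot {
  release: string;
  covered: number; // ACs verified by a linked test
  total: number;   // ACs declared in feature specs
}

const history: Snapshot[] = [
  { release: "1.8", covered: 14, total: 20 },
  { release: "1.9", covered: 17, total: 20 },
  { release: "2.0", covered: 20, total: 20 },
];

// The "trend viewer" reduced to its essence: coverage percentage
// per release plus the delta against the previous release.
const trend = history.map((s, i) => ({
  release: s.release,
  pct: Math.round((100 * s.covered) / s.total),
  delta: i === 0 ? 0 : s.covered - history[i - 1].covered,
}));
```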

Proven at small scale. The TypeScript version is battle-tested on one project with 20 features and one developer. The C# version is designed for enterprise scale but hasn't been battle-tested at 50-developer scale. The tspec product series designs the multi-repo federation, multi-tenant, and multi-language backend architecture for CAC40-scale deployment.

Single-repo only (TypeScript version). The scanner reads the local filesystem. Cross-repo feature sharing requires npm packaging, which isn't built into the design. The C# version handles this with NuGet, but that's a separate system. The tspec product solves this with multi-repo federation: each repo pushes independently, one Diem dashboard aggregates everything.

Developer-owned specifications. The abstract classes are written by developers, for developers. There's no workflow for PMs to review or approve specifications without reading code. The requirements-human-side post explores collaboration patterns, but the tooling assumes developer authorship.

No manual test tracking. Every AC must be verified by an automated test. If some ACs are verified by manual QA (exploratory testing, usability testing), typed specs can't track them. Test management platforms handle this; typed specs don't.

The Decision Guide

If your situation is..., consider...:

  • PM needs to read and co-author specs → BDD (Gherkin). Natural-language .feature files that non-developers can review and edit.
  • Regulatory compliance requires audit trails → Jira + Allure/TestRail/Zephyr. Auditing, historical records, dashboards for compliance officers.
  • API-first product, no UI → OpenAPI + contract testing. The API spec IS the feature spec; schema validation covers everything.
  • .NET enterprise monorepo, 10+ developers → C# Roslyn source generators. Full semantic analysis, IDE diagnostics, NuGet distribution, hierarchy enforcement.
  • TypeScript project, small-to-medium team → Typed Specifications. Minimal infrastructure (375 lines), compiler-checked, zero external dependencies.
  • Mixed stack (TS frontend + .NET backend) → Both TS typed specs and C# Roslyn. Same philosophy, different mechanics; each covers its own stack.
  • Legacy project, first step toward traceability → xUnit traits + directory conventions. Lowest-friction starting point; upgrade to typed specs later.
  • Large team, needs management visibility → Jira + test management platform + typed specs. Layer them: Jira for workflow, the platform for dashboards, typed specs for compiler enforcement.
  • No tests yet → Write tests first. None of these approaches help until you have tests to link to.

The Layering Insight

The most important takeaway from this comparison: these approaches are not mutually exclusive. They operate at different layers and cover different concerns.

┌─────────────────────────────────────────────────────────┐
│                    Project Management                    │
│              Jira / Azure DevOps / Linear                │
│    (workflow, priorities, sprints, stakeholder view)     │
├─────────────────────────────────────────────────────────┤
│                  Feature Specification                    │
│         Typed Specs (TS) / Roslyn (C#) / BDD            │
│    (features as types, ACs as methods, compile-time)    │
├─────────────────────────────────────────────────────────┤
│                  API Contract Layer                       │
│            OpenAPI / AsyncAPI / Pact                      │
│    (endpoint shape, schema validation, boundaries)       │
├─────────────────────────────────────────────────────────┤
│                 Test Organization                         │
│        Tags / Categories / Directory Conventions         │
│    (filtering, grouping, discoverability)                │
├─────────────────────────────────────────────────────────┤
│                  Test Execution                           │
│           Playwright / Vitest / xUnit / NUnit            │
│    (run tests, assert results, produce reports)          │
├─────────────────────────────────────────────────────────┤
│                  Reporting & History                      │
│           Allure / TestRail / Zephyr                     │
│    (dashboards, trends, historical tracking, audits)    │
└─────────────────────────────────────────────────────────┘

Each layer has a primary concern. No single tool covers all layers. The healthy question is: "which layers do we need, and what covers each one?"

A team might use:

  • Jira for project management and workflow
  • Typed specs for requirement-test traceability (compiler-enforced)
  • OpenAPI for API contracts between services
  • Directory conventions for test file organization
  • Playwright + Vitest for test execution
  • Allure for management dashboards and historical trends

No conflict. No overlap. Each tool does what it's best at.

Or a lean team might use:

  • Linear for project management
  • Typed specs for everything in the middle
  • Vitest for test execution

Simpler. Fewer moving parts. The typed specs compliance report is the dashboard, the traceability matrix, and the quality gate — all in one.
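Using the compliance report as a quality gate can be sketched in a few lines. The report shape here is assumed for illustration, not the scanner's actual JSON schema:

```typescript
// Hypothetical report shape produced by the compliance scanner.
interface Report {
  totalACs: number;
  verifiedACs: number;
}

// Gate: fail the build when verified coverage drops below the threshold.
function gate(report: Report, threshold = 100): boolean {
  const pct = (100 * report.verifiedACs) / report.totalACs;
  return pct >= threshold;
}

const report: Report = { totalACs: 20, verifiedACs: 19 };
if (!gate(report)) {
  console.error("Spec compliance below threshold");
  // process.exitCode = 1;  // enable in a real CI script
}
```

Wired into CI, this is the "quality gate" role: the same JSON that serves as dashboard and traceability matrix also blocks merges.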

The Core Question

This series began with a question every team faces: how do you know which tests verify which features?

Eight approaches answer it at different levels of rigor:

  Approach                  Level of rigor
  Wiki matrices             Human memory
  Directory conventions     File system convention
  xUnit traits / tags       String metadata
  Jira / ADO / Linear       String IDs in external system
  Allure / TestRail         String IDs in external platform
  BDD / Gherkin             Structured text parsed at runtime
  OpenAPI / Contract        Schema validated at runtime
  Typed Specs (TS)          Type system checked at compile time
  Roslyn Source Gen (C#)    Semantic model checked at compile time

The further down the table, the less discipline you need to maintain the link. Wiki matrices require perfect human discipline. Typed specs require zero discipline — the compiler maintains the link, and the scanner quantifies completeness.

That's not because typed specs are "better" in some abstract sense. It's because they move the enforcement from human convention to compiler constraint. And compiler constraints don't forget, don't get tired, and don't skip steps when the deadline is tomorrow.


For the full implementation walkthrough, see Onboarding Typed Specifications: From Zero to Compiler-Verified Features.

For the C# variant, see Requirements as Code in C# and CMF Part IX: Requirements DSL.


Previous: Part VII: C# Roslyn Source Generators
Back to: Overview and Comparison Matrix