
The Requirements/Feature Tracking DSL

The sixth M2 DSL bridges the gap between business requirements and implementation. It provides typed requirement references in code, hierarchical feature tracking, lambda-based acceptance criteria validation, and automatic tracing that maps features to code, tests, and API documentation.

Entities: From Epic to Implementation

The Requirements DSL models a complete feature hierarchy:

Epic (high-level initiative, e.g., "Multi-Tenant Support")
├── Feature (deliverable behavior, e.g., "Tenant isolation")
│   ├── Story (user-facing work, e.g., "Implement tenant-scoped query filtering")
│   │   └── Task (implementer's work, e.g., "Add TenantId to DbContext queries")
└── Bug (defect, e.g., "Queries leak data across tenants")
    └── (optional) Story/Task (fixes for the bug)

Each entity tracks:

  • Lifecycle state: Draft → Proposed → Approved → InProgress → Quality → Translation → Review → Done
  • Acceptance criteria: Behavioral tests in natural language
  • Validation lambdas: Executable acceptance criteria (source-generated validators)
  • Test coverage: Which tests validate which requirements
  • Implementation mapping: Which classes/methods implement which requirements

Example: Defining a Feature with Acceptance Criteria

[Feature(
    Id = "FEATURE-156",
    Title = "User role-based access control",
    ParentEpics = new[] { "EPIC-42" },
    Priority = RequirementPriority.Critical,
    Owner = "auth-team",
    AcceptanceCriteria = new[] {
        "Admins can assign roles to users",
        "Viewers can only read content",
        "Role changes take effect immediately after logout/login",
        "Roles are audited in the system log"
    },
    ValidationLambdas = new[] {
        "(user) => user.Roles.Count > 0 ? Result.Success() : Result.Failure(\"User must have at least one role\")",
        "(user) => user.Roles.All(r => r.IsActive) ? Result.Success() : Result.Failure(\"User has inactive roles\")"
    },
    EstimatedPoints = 5
)]
public partial class RoleAssignmentFeature { }

The generator produces:

  • Constants: Requirements.FEATURE_156 = "FEATURE-156" (type-safe reference)
  • Validators: RoleAssignmentValidator : IRequirementValidator<User> (source-generated)
  • Metadata: Serializable feature hierarchy and state machine
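It is worth seeing once what that generated validator might look like. The sketch below is a hedged reconstruction, not the real generated file: the ValidationResult, User, and Role types are simplified stand-ins introduced only to make the example self-contained.

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch of the generated output. ValidationResult, User and
// Role are simplified stand-ins; the framework's real types are richer.
public sealed record ValidationResult(bool IsSuccess, string FailureMessage)
{
    public static ValidationResult Success() => new(true, "");
    public static ValidationResult Failure(string message) => new(false, message);
}

public interface IRequirementValidator<T>
{
    ValidationResult Validate(T subject);
}

public class Role { public bool IsActive { get; init; } = true; }
public class User { public List<Role> Roles { get; } = new(); }

// One guard clause per ValidationLambdas entry, short-circuiting on the
// first failure, mirroring the two lambdas declared on the [Feature].
public sealed class RoleAssignmentValidator : IRequirementValidator<User>
{
    public ValidationResult Validate(User user)
    {
        if (user.Roles.Count == 0)
            return ValidationResult.Failure("User must have at least one role");
        if (!user.Roles.All(r => r.IsActive))
            return ValidationResult.Failure("User has inactive roles");
        return ValidationResult.Success();
    }
}
```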

Implementing Requirements in Domain Code

Domain code decorates classes and methods with [Implements] to map implementation to requirements:

[Implements(Requirements.FEATURE_156)]
[Implements("EPIC-42")] // References parent epic too
public class RoleAssignmentService
{
    [Implements("FEATURE-156.AC.1")] // Specific acceptance criterion
    public async Task AssignRoleAsync(User user, Role role)
    {
        // Implementation details
    }

    public async Task<Result<User, InvalidOperationException>> OnLoginAsync(User user)
    {
        // Validate user roles against acceptance criteria
        var validator = new RoleAssignmentValidator();
        var result = validator.Validate(user);
        if (!result.IsSuccess)
            return Result<User, InvalidOperationException>.Failure(
                new InvalidOperationException(result.FailureMessage));

        // Implementation continues...
    }
}

Four Forms of Tracing

The Requirements DSL automatically generates four tracing outputs:

1. Code Comment Generation

/// <summary>
/// Implements:
///   - FEATURE-156: User role-based access control (AC.1, AC.2)
///   - EPIC-42: Enterprise platform capabilities
///
/// Acceptance Criteria:
///   AC.1: Admins can assign roles to users
///   AC.2: Viewers can only read content
/// </summary>
public async Task AssignRoleAsync(User user, Role role) { }

IDE tooltips and generated API documentation now show requirement context.

2. Runtime Audit Trail

An IRequirementAuditLog interceptor logs method execution with requirement context:

// Internal logging (DI-injected)
await _auditLog.LogImplementationAsync(
    requirementId: "FEATURE-156",
    typeName: "RoleAssignmentService",
    methodName: "AssignRoleAsync",
    parameters: new { user = "alice@example.com", role = "Admin" },
    result: successOrError,
    executionTimeMs: 42,
    occurredAt: DateTime.UtcNow
);

This enables post-mortem questions such as "Which code change satisfied requirement XYZ?" and "How many times did this requirement execute in production today?"

3. Test Coverage Report

[TestFor("FEATURE-156")]
[TestFor("FEATURE-156.AC.1")]
public class RoleAssignmentTests
{
    [Fact]
    public async Task AdminCanAssignRoleToUser()
    {
        // Test implementation
    }

    [Fact]
    public async Task ViewerRoleCannotModifyContent()
    {
        // Test implementation
    }
}

The generator produces RequirementTestCoverage mapping:

FEATURE-156 → [RoleAssignmentTests.AdminCanAssignRoleToUser, RoleAssignmentTests.ViewerRoleCannotModifyContent]
FEATURE-156.AC.1 → [RoleAssignmentTests.AdminCanAssignRoleToUser]

CI can fail if coverage drops below thresholds.
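The mapping above is enough to compute a coverage percentage mechanically. A minimal sketch of the kind of check such a CI step could run, assuming the generated requirement-to-tests mapping is available as a dictionary; CoverageGate and MeetsThreshold are invented names for illustration:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical CI-side check: given the generated requirement → tests
// mapping, report whether the fraction of requirements that have at least
// one test meets a threshold. Names here are illustrative, not the real API.
public static class CoverageGate
{
    public static bool MeetsThreshold(
        IReadOnlyCollection<string> allRequirements,
        IReadOnlyDictionary<string, string[]> requirementToTests,
        double minimumPercent)
    {
        // A requirement counts as tested only if it maps to a non-empty test list.
        int tested = allRequirements.Count(r =>
            requirementToTests.TryGetValue(r, out var tests) && tests.Length > 0);
        double percent = allRequirements.Count == 0
            ? 100.0
            : 100.0 * tested / allRequirements.Count;
        return percent >= minimumPercent;
    }
}
```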

4. API Documentation

OpenAPI/GraphQL schemas are auto-decorated:

[OpenApiOperation(
    Summary = "Assign a role to a user",
    Description = "Implements FEATURE-156: User role-based access control (AC.1)"
)]
public ActionResult AssignRole([FromBody] AssignRoleRequest request) { }

Swagger/GraphQL docs show which APIs implement which features.

Lifecycle State Machine

All requirements follow a strict state machine:

Draft ──→ Proposed ──→ Approved ──→ InProgress
             │                          │
             └──→ Rejected              ├──→ Quality ──→ Translation ──→ Review ──→ Done
                                        │                                  │
                                        └───────────── Blocked ────────────┘

Source-generated RequirementLifecycleStateMachine enforces valid transitions and prevents invalid state changes (e.g., jumping from Draft to Done).
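Conceptually, that enforcement is a whitelist of edges. A minimal sketch of how it can work, with names invented for this example; the CMF's actual generated machine (shown later in this part) is richer:

```csharp
using System.Collections.Generic;

// Illustrative only: enforcement as a whitelist of (from, to) edges.
// Block/Unblock edges are omitted here for brevity.
public enum ReqState
{
    Draft, Proposed, Approved, Rejected, InProgress,
    Quality, Translation, Review, Done, Blocked
}

public static class RequirementLifecycle
{
    private static readonly HashSet<(ReqState, ReqState)> Allowed = new()
    {
        (ReqState.Draft, ReqState.Proposed),
        (ReqState.Proposed, ReqState.Approved),
        (ReqState.Proposed, ReqState.Rejected),
        (ReqState.Approved, ReqState.InProgress),
        (ReqState.InProgress, ReqState.Quality),
        (ReqState.Quality, ReqState.Translation),
        (ReqState.Translation, ReqState.Review),
        (ReqState.Review, ReqState.Done),
    };

    // Anything not on the whitelist — e.g. Draft straight to Done — is rejected.
    public static bool CanTransition(ReqState from, ReqState to) =>
        Allowed.Contains((from, to));
}
```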

Integration with DDD and Testing

Requirements validators integrate seamlessly with domain aggregates:

[AggregateRoot]
public class User
{
    public async Task<Result<User, DomainException>> AssignRoleAsync(Role role)
    {
        // Inject the validator
        var validator = ServiceProvider.GetRequiredService<IRequirementValidator<User>>();
        var result = validator.Validate(this);

        if (!result.IsSuccess)
            return Result<User, DomainException>.Failure(
                new DomainException(result.FailureMessage));

        // Apply domain changes
        _roles.Add(role);
        _events.Add(new RoleAssignedEvent(Id, role));
        return Result<User, DomainException>.Success(this);
    }
}

Unit tests can validate requirements independently:

[Fact]
[TestFor("FEATURE-156.AC.3")]
public void RoleChangeTakesEffectAfterRelogin()
{
    // Arrange
    var user = new User("alice");
    user.AssignRole(Role.Admin);

    // Act & Assert
    var validator = new RoleAssignmentValidator();
    var result = validator.Validate(user);
    result.IsSuccess.Should().BeTrue();
}

Constant Registry for Type Safety

All requirements are exposed as const strings, preventing typos:

public static class Requirements
{
    public const string EPIC_MULTI_TENANT = "EPIC-123";
    public const string EPIC_ENTERPRISE = "EPIC-42";
    public const string FEATURE_AUTH = "FEATURE-156";
    public const string STORY_JWT = "STORY-789";
    public const string TASK_VALIDATOR = "TASK-201";
    public const string BUG_DATA_LEAK = "BUG-999";
}

// Usage in code
[Implements(Requirements.FEATURE_AUTH)] // Type-safe, autocomplete support
public class RoleService { }

Coverage Analysis

The IRequirementCoverageAnalyzer service provides analytics:

var coverage = await _analyzer.GetCoverageReportAsync();

Console.WriteLine($"Total requirements: {coverage.TotalRequirements}");
Console.WriteLine($"Implemented: {coverage.ImplementationPercentage}%");
Console.WriteLine($"Tested: {coverage.TestCoveragePercentage}%");
Console.WriteLine($"Fully covered: {coverage.FullyCoveredCount}");

Reports identify:

  • Unimplemented requirements (no [Implements] found)
  • Untested requirements (no [TestFor] found)
  • Orphaned tests (tests with no corresponding requirement)
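All three reports reduce to set differences over the attribute scans. An illustrative sketch, with invented names (GapReport is not the real analyzer API):

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: the three coverage gaps as set differences.
// Inputs stand in for the generator's [Feature]/[Implements]/[TestFor] scans.
public static class GapReport
{
    public static (string[] Unimplemented, string[] Untested, string[] OrphanedTests) Compute(
        ISet<string> declaredRequirements,     // every declared requirement ID
        ISet<string> implementedRequirements,  // IDs seen in [Implements]
        IDictionary<string, string> testToRequirement) // test name → [TestFor] ID
    {
        var tested = testToRequirement.Values.ToHashSet();
        return (
            // Declared but never implemented anywhere.
            declaredRequirements.Except(implementedRequirements).OrderBy(x => x).ToArray(),
            // Declared but no test claims to cover them.
            declaredRequirements.Except(tested).OrderBy(x => x).ToArray(),
            // Tests pointing at IDs that do not exist.
            testToRequirement.Where(kv => !declaredRequirements.Contains(kv.Value))
                             .Select(kv => kv.Key).OrderBy(x => x).ToArray());
    }
}
```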

External Integration

The IRequirementRegistry outputs serializable metadata, enabling:

  • Jira sync: Automatically update Jira issues with code locations
  • Test frameworks: Get requirements for TDD/BDD frameworks
  • Observability: Publish requirement traces to APM systems
  • Markdown reports: Generate traceability matrices for stakeholders

Cross-DSL Integration

The Requirements DSL is the only one of the six that touches every other DSL. Its job is not to manage tickets — that is what Jira is for — but to make every other DSL aware of which business intent it satisfies. The integration points are concrete:

  • DDD. Hook: [Implements] on a class, method, or invariant. Emits: XML doc comments on the generated entity, plus audit log entries on every command/event mentioning the feature ID.
  • Workflow. Hook: Gate("AllAcceptanceCriteriaMet") on a [Transition]. Emits: a transition guard that resolves the relevant feature, runs its validators, and blocks the transition if any AC fails.
  • Admin. Hook: [ImplementsAcceptance] on an [AdminField]. Emits: inline help text on the form field rendering the AC's prose, plus the FluentValidation rule from the validator lambda.
  • Pages. Hook: [Implements] on a [PageWidget]. Emits: OpenAPI metadata on the widget's REST endpoint, plus a footer block in the design TUI listing the implemented features.
  • Content. Hook: [ImplementsAcceptance] on a [ContentBlock] definition. Emits: block-level documentation in the StreamField editor and a JSON-schema annotation.
  • Shared kernel. Hook: error code constants tagged with [FromRequirement(...)]. Emits: a reverse map from ORD-001 to FEATURE-103.AC.1, surfaced in the API inventory report.

A concrete example of the most important hop — the workflow gate. Recall the Approve transition from Part 12:

[Transition(From = "Review", To = "Approved", Label = "Approve")]
[RequiresRole("Reviewer")]
[Gate("AllAcceptanceCriteriaMet")]
public partial Result Approve(Product p);

Gate("AllAcceptanceCriteriaMet") is a built-in gate the requirements generator wires up automatically. The compiler sees that the gate name matches a [RequirementGate] attribute provided by Cmf.Requirements.Lib, then emits a Result factory in the workflow state machine that walks every feature whose [Implements] graph touches the Product aggregate, runs each of their validators against the entity instance, and aggregates the failures into a single Result.Failure("GATE-001", ...). Bob, the reviewer in the Day 4 walkthrough, sees the 422 Unprocessable Entity because his Product is missing the value the validator checks — not because someone hand-wrote a guard.
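The aggregation step at the heart of that emitted guard can be sketched as follows. GateResult, AcceptanceGate, and the Func-based validator shape are simplifications assumed for this example; the real generated code works against the IRequirementValidator family:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch of the emitted gate body: run every resolved validator
// against the entity and fold all failures into a single result.
public sealed record GateResult(bool IsSuccess, string Error)
{
    public static GateResult Success() => new(true, "");
    public static GateResult Failure(string code, string detail) => new(false, $"{code}: {detail}");
}

public static class AcceptanceGate
{
    // Each validator returns null on success or a failure message.
    public static GateResult Evaluate<T>(T entity, IEnumerable<Func<T, string?>> validators)
    {
        var failures = validators
            .Select(v => v(entity))
            .Where(msg => msg is not null)
            .ToList();
        return failures.Count == 0
            ? GateResult.Success()
            : GateResult.Failure("GATE-001", string.Join("; ", failures));
    }
}
```

The design point is that every validator runs, so the reviewer sees all failing acceptance criteria at once rather than fixing them one 422 at a time.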

Compile-Time Coverage Thresholds

Coverage analysis at runtime is useful for dashboards, but the more interesting integration is at compile time. The CMF analyzer family CMF4xx enforces requirements policy directly in the build:

  • CMF401 (default: error): a [Feature] with no [Implements] anywhere in the solution
  • CMF402 (default: warning): a [Feature] with [Implements] but no [TestFor] (raise to error in CI)
  • CMF403 (default: warning): a [TestFor] referencing a feature ID that does not exist
  • CMF404 (default: error): an [Implements] referencing a feature ID that does not exist
  • CMF410 (default: error): a feature in Lifecycle = Done whose validator lambda fails to compile
  • CMF411 (default: error): a workflow [Gate("AllAcceptanceCriteriaMet")] whose target aggregate has no implementing features
  • CMF420 (default: warning): a [Feature] with Priority = Critical and Lifecycle = Draft for more than N days (configurable)

These are ordinary Roslyn analyzers, so they show up as IDE squiggles and in dotnet build output. The thresholds live in cmfconfig.json:

{
  "requirements": {
    "minimumImplementationCoveragePercent": 100,
    "minimumTestCoveragePercent": 80,
    "criticalFeaturesMustBeTested": true,
    "promoteWarningsToErrorsInCI": true
  }
}

The CI step is then a one-liner — cmf validate exits with non-zero if any rule trips, which keeps the policy at the same layer as the type checker rather than buried in a dashboard nobody reads.

Lifecycle State Machine — Generated

The earlier ASCII diagram is hand-drawn for readability, but the actual state machine is a generated RequirementLifecycleStateMachine.g.cs. Like every other workflow in the CMF, the transitions are typed:

public sealed partial class RequirementLifecycleStateMachine
{
    public Result<RequirementLifecycleState> Transition(
        RequirementLifecycleState from,
        RequirementLifecycleEvent evt) => (from, evt) switch
    {
        (Draft,    Propose)        => Proposed,
        (Proposed, Approve)        => Approved,
        (Proposed, Reject)         => Rejected,
        (Approved, BeginWork)      => InProgress,
        (InProgress, EnterQuality) => Quality,
        (Quality, NeedsTranslation) => Translation,
        (Translation, ReadyForReview) => Review,
        (Review, Ship)             => Done,
        (_, Block)                 => Blocked,
        (Blocked, Unblock)         => _stateBeforeBlock, // the generator also emits a field recording the state at the moment Block fired
        _ => Result.Failure($"REQ-100: invalid transition {from} → {evt}")
    };
}

This means a feature whose state is mutated through the runtime API cannot land in an invalid state — the same guarantees that protect the Order aggregate's OrderStatus protect the requirements layer too. The two state machines are produced by the same Stage 4 generator and share the same Result<T> plumbing.

Where This DSL Stops, and the Feature-Tracking Series Begins

This part has shown how the Requirements DSL integrates with the rest of the CMF: gates, analyzers, audit, OpenAPI tags, traceability reports. The deeper architecture of the requirements layer itself — the abstract record / abstract method pattern, the IRequirementValidator interface family, the [ForRequirement] / [Verifies] / [TestsFor] triple, the REQ1xxREQ4xx analyzer families, and the post-test quality gates — is documented in a dedicated nine-part series, Feature Tracking. Everything shown here builds on those primitives without re-explaining them.
