You have a compliance engine that catches regressions at compile time. You have a scanner that produces structured data. Now what? You stare at JSON? No. You build a dashboard — and you generate it from the same content model that defines your features.

The Problem With Compliance Reports

Most compliance tools dump text. A wall of green checkmarks and red crosses in a terminal. Fine for a developer running a quick check. Useless for a PM trying to understand sprint progress. Useless for a team lead trying to spot trends. Useless for an auditor who needs to see coverage history over time.

The real question is not "did we pass?" but "where are we, how did we get here, and where are we going?"

That requires seven views. Not seven hand-built pages — seven views generated from the same content model that defines your features, ACs, and workflow states.

Before: Console Output

Diagram

After: Auto-Generated Dashboard

Diagram

Every panel in that wireframe is generated by the Diem Admin DSL. The developer writes [AdminModule] on a FeatureSnapshot type. The generator builds the pages.


1. Coverage Matrix

The Coverage Matrix is the landing page. An [AdminModule] listing on FeatureSnapshot produces a paginated table where every row is a feature and every column tells a story.

Columns: Feature ID, Name, Priority, Level (Epic/Feature/Story), Total ACs, Covered ACs, Coverage %, Status. Color-coded cells — green when coverage exceeds 90%, yellow between 50% and 90%, red below 50%. You see the health of your entire product in one glance.
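The color-coding rule above is simple enough to sketch. This is an illustrative helper, not Diem's actual theme API; the function name `coverageColor` is an assumption made here for clarity.

```typescript
// Hypothetical helper mirroring the thresholds described above:
// green above 90%, yellow from 50% to 90%, red below 50%.
type CellColor = 'green' | 'yellow' | 'red';

function coverageColor(coveragePercent: number): CellColor {
  if (coveragePercent > 90) return 'green';
  if (coveragePercent >= 50) return 'yellow';
  return 'red';
}
```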

Click any row to drill into the Feature Detail view. Filter by priority (Critical, High, Medium, Low), by hierarchy level, by team assignment, by sprint tag. The filters are generated from the content model's enum fields — add a new priority level to the DSL, and the filter dropdown updates automatically.

@AdminModule({
  entity: FeatureSnapshot,
  listColumns: ['id', 'name', 'priority', 'coveragePercent', 'status'],
})
export class FeatureCoverageModule {}

That single decorator generates the entire listing page, including pagination, sorting, and the color-coding rules defined in the theme configuration.

2. Kanban Board

The Workflow DSL defines lifecycle states: Draft, Proposed, Approved, InProgress, Quality, Translation, Review, Done. The Kanban board renders each state as a column. Feature cards show the feature name, coverage percentage, and current assignee.

Drag a card from InProgress to Quality. The dashboard triggers a gate evaluation. If the quality gate requires 90% AC coverage and the feature sits at 72%, the card bounces back to InProgress with an explanation: "Quality gate failed: coverage 72% < required 90%. 3 ACs uncovered: searchReturnsResults, filterByDate, paginationWorks."

No ambiguity. No negotiation. The gate is defined in code, evaluated by the scanner, enforced by the board. The PM sees the bounce. The developer sees the specific ACs. Everyone moves forward.
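A gate evaluation of this kind can be sketched as a pure function. The snapshot shape, gate signature, and message format below are assumptions for illustration, not the actual Diem Workflow DSL API.

```typescript
// Illustrative gate check: a feature may enter Quality only if its
// AC coverage meets the configured threshold.
interface FeatureSnapshot {
  name: string;
  coveragePercent: number;
  uncoveredAcs: string[];
}

interface GateResult {
  passed: boolean;
  reason?: string; // explanation shown when the card bounces back
}

function evaluateQualityGate(
  feature: FeatureSnapshot,
  requiredCoverage = 90,
): GateResult {
  if (feature.coveragePercent >= requiredCoverage) return { passed: true };
  return {
    passed: false,
    reason:
      `Quality gate failed: coverage ${feature.coveragePercent}% < required ` +
      `${requiredCoverage}%. ${feature.uncoveredAcs.length} ACs uncovered: ` +
      feature.uncoveredAcs.join(', '),
  };
}
```

Because the gate is a function of scanner data, the board, the CLI, and the CI pipeline can all call the same check and agree on the answer.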

3. Trend Charts

The scanner runs on every build. Each run produces a ScanResult snapshot stored in Diem's content store. The Trend Charts widget queries that history.

Line charts show coverage percentage per feature over the last 30 days. You spot the feature that was at 95% two weeks ago and dropped to 60% after a refactor. Area charts show total ACs covered versus uncovered across the entire product — the gap should shrink over time. Heatmaps show coverage by team. Team A is consistently green. Team C has a red band across the last three sprints. That is a conversation starter, not a blame tool.

The data is already there. The scanner produces it on every run. The dashboard simply renders what already exists.
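Extracting a per-feature line-chart series from stored snapshots is a small projection. The `ScanResult` shape below is an assumption; Diem's actual content-store schema may differ.

```typescript
// Illustrative: one scan snapshot per build, keyed by feature ID.
interface ScanResult {
  timestamp: string; // ISO date of the scan
  coverage: Record<string, number>; // featureId -> coverage %
}

// One feature's coverage history, oldest first, ready for a line chart.
function coverageSeries(
  history: ScanResult[],
  featureId: string,
): Array<[string, number]> {
  return history
    .filter(r => featureId in r.coverage)
    .sort((a, b) => a.timestamp.localeCompare(b.timestamp))
    .map(r => [r.timestamp, r.coverage[featureId]]);
}
```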

4. Hierarchy Tree

The Content DSL defines parent references. Epic contains Features. Feature contains Stories. Story contains Tasks. Bug links to its target feature via Bug<TTarget>. The Hierarchy Tree renders this as an expandable tree view.

Click an Epic node — it expands to show its Features. Click a Feature — Stories appear underneath. Click any node and the right panel shows its ACs with coverage status. Bugs appear under their target feature with a distinct icon, so you can see at a glance which features have open defects.

This is not a flat list. It is the actual requirement hierarchy, rendered from the same parent-child relationships defined in the content model. Add a new Story under a Feature in code, and the tree updates on the next scan.
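Building that tree from flat parent references is a standard single-pass grouping. The node shape below is illustrative; the Content DSL's real types are richer.

```typescript
// Illustrative: flat content items with parent references, as the
// Content DSL's parent-child links imply.
interface ContentItem {
  id: string;
  parentId: string | null; // null = root (e.g. an Epic)
}

interface TreeNode {
  id: string;
  children: TreeNode[];
}

function buildTree(items: ContentItem[]): TreeNode[] {
  const byId = new Map<string, TreeNode>(
    items.map(i => [i.id, { id: i.id, children: [] }]),
  );
  const roots: TreeNode[] = [];
  for (const item of items) {
    const node = byId.get(item.id)!;
    const parent = item.parentId ? byId.get(item.parentId) : undefined;
    if (parent) parent.children.push(node);
    else roots.push(node); // no parent in scope -> treat as root
  }
  return roots;
}
```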

5. Feature Detail

The detail view is where work happens. Click any feature from the matrix, the tree, or the Kanban board, and you land here.

The ACs table lists every acceptance criterion. Each row shows: AC name (from the abstract method name), description (from JSDoc), covered/uncovered status, linked test names (from [FeatureTest] attributes), and the last test result (pass/fail/skip with timestamp). Green rows are covered and passing. Red rows are uncovered or failing.

Below the ACs table: a state history timeline showing every lifecycle transition with timestamps and who triggered it. The current assignee. Comment threads where PMs and developers discuss specific ACs (stored in Diem, not in code). Related bugs and support tickets linked via Bug<TTarget>.

This is the single source of truth for a feature's status. Not a Jira ticket that someone forgot to update. Not a Confluence page that drifted from reality. The actual state, derived from code and scanner data.
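The row coloring rule in the ACs table can be made precise with a small sketch. The `AcRow` field names are assumptions made here, not Diem's actual schema; the rule encodes "green = covered and passing, everything else red."

```typescript
// Hypothetical row model for the ACs table.
interface AcRow {
  name: string; // from the abstract method name
  covered: boolean; // does a [FeatureTest] link to this AC?
  lastResult: 'pass' | 'fail' | 'skip' | null; // null = never run
}

// Green rows are covered and passing; uncovered, failing,
// skipped, or never-run rows render red.
function rowColor(ac: AcRow): 'green' | 'red' {
  return ac.covered && ac.lastResult === 'pass' ? 'green' : 'red';
}
```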

6. Sprint View

Filter the entire dashboard by sprint tag. The Sprint View shows progress bars per feature — how many ACs are covered out of total for features tagged to this sprint. A coverage trend line within the sprint shows whether the team is converging toward 100% or stalling.

A burndown-style chart shows ACs remaining (uncovered) over time. Unlike story-point burndowns that depend on estimation accuracy, AC burndowns measure actual verified behavior. When the line hits zero, the sprint's features are fully covered. No interpretation needed.
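An AC burndown is just the count of uncovered ACs per scan, plotted over time. The types below are illustrative assumptions.

```typescript
// Illustrative: per-scan snapshot of the sprint's uncovered ACs.
interface SprintScan {
  timestamp: string; // ISO date of the scan
  uncoveredAcs: string[]; // uncovered ACs across the sprint's features
}

// Burndown points, oldest first: ACs remaining at each scan.
function acBurndown(scans: SprintScan[]): Array<[string, number]> {
  return scans
    .slice()
    .sort((a, b) => a.timestamp.localeCompare(b.timestamp))
    .map(s => [s.timestamp, s.uncoveredAcs.length]);
}
```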

7. Diff View

Select two scan snapshots — last Tuesday versus today, or last release versus current HEAD. The Diff View shows what changed between them.

New features added since the baseline. ACs removed (someone deleted a requirement — that should be intentional and visible). Coverage changes per feature — went from 85% to 92%, good. Regressions — features that went from green to red, meaning previously-covered ACs are now uncovered. Regressions get a prominent red banner because they represent lost ground.

This is the view you open in a sprint retrospective. It answers "what did we accomplish?" with data, not memory. It answers "what broke?" with specifics, not suspicion.
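The diff itself is a comparison of two coverage maps. The snapshot shape below is an assumption, not Diem's actual ScanResult schema; the categories match the ones described above.

```typescript
// Illustrative: a snapshot reduced to featureId -> coverage %.
interface Snapshot {
  coverage: Record<string, number>;
}

interface SnapshotDiff {
  added: string[]; // features only in the newer snapshot
  removed: string[]; // features only in the baseline
  improvements: string[]; // coverage rose since the baseline
  regressions: string[]; // coverage dropped -- lost ground
}

function diffSnapshots(baseline: Snapshot, current: Snapshot): SnapshotDiff {
  const diff: SnapshotDiff = {
    added: [], removed: [], improvements: [], regressions: [],
  };
  for (const id of Object.keys(current.coverage)) {
    if (!(id in baseline.coverage)) diff.added.push(id);
    else if (current.coverage[id] > baseline.coverage[id]) diff.improvements.push(id);
    else if (current.coverage[id] < baseline.coverage[id]) diff.regressions.push(id);
  }
  for (const id of Object.keys(baseline.coverage)) {
    if (!(id in current.coverage)) diff.removed.push(id);
  }
  return diff;
}
```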


Why Generation Matters

None of these seven views are hand-built. The Admin DSL generates listing pages from [AdminModule]. The Workflow DSL generates the Kanban columns from lifecycle states. The Content DSL generates the hierarchy tree from parent references. The scanner generates the data. The dashboard renders it.

Change the content model — add a field, add a state, add a hierarchy level — and the dashboard updates. No frontend tickets. No design reviews for a new column. The dashboard is a projection of the content model, and the content model is the single source of truth.

That is the difference between a reporting tool and an integrated platform. Reporting tools display data someone else collected. An integrated platform generates both the data and the display from the same source.


Previous: Part VII: Enterprise Integration | Next: Part IX: The Stakeholder View