The Meta-Pattern — Documentation as Compiler Output
"Declare intent with a type. Let the compiler produce the artifact. The only variable is what gets generated — code, enforcement, or documentation."
The Grand Unification Table
Four blog series. One pattern. Five steps. Different outputs.
| Series | Step 1: Declare | Step 2: SG Reads | Step 3: SG Emits | Step 4: Analyzer Guards | Step 5: Output |
|---|---|---|---|---|---|
| CMF | [AggregateRoot] | DDD SG | entities, EF, repos | DDD analyzer | Application code |
| Contention | [Injectable] et al. | Per-domain SG | registrations, validators | Per-domain analyzer | Correct code |
| Typed Specs | abstract class Feature | Compliance scanner | Coverage matrix | Build warning | Traceability data |
| Auto-Doc | Document&lt;T&gt; | Doc SG | .md, .json, .yaml, .mermaid | DOC analyzer | Documentation |
| Ops | [DeploymentOrchestrator] | Ops SGs | DAG, runbook, probes | OPS analyzer | Infrastructure artifacts |
The pattern is always:
Attribute (declare intent)
→ Source Generator (read intent, produce artifact)
→ Analyzer (guard boundaries, report violations)
→ Output (code, docs, infra — depends on the SG)

The only thing that changes across series is what the SG emits. The infrastructure is the same. The developer experience is the same. The compiler does the work.
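The whole four-stage flow fits in a few lines. Here is a minimal model in TypeScript (the language of this site's own build pipeline); every name in it is invented for illustration, not part of any real API:

```typescript
// Minimal model of Declare → Generate → Guard → Output.

// Step 1: Declare — intent captured as typed metadata (an attribute in C#).
interface AggregateDecl {
  name: string;
  boundedContext: string;
  compositions: string[];
}

// Step 3: Generate — the "SG" turns the declaration into an artifact.
function generateRepository(decl: AggregateDecl): string {
  return `class ${decl.name}Repository { /* CRUD for ${decl.name} */ }`;
}

// Step 4: Guard — the "analyzer" reports violations instead of emitting code.
function analyze(decl: AggregateDecl): string[] {
  const diagnostics: string[] = [];
  if (decl.compositions.length === 0) {
    diagnostics.push(`DDD001: aggregate ${decl.name} has no compositions`);
  }
  return diagnostics;
}

const order: AggregateDecl = {
  name: "Order",
  boundedContext: "Ordering",
  compositions: ["OrderLine", "PaymentInfo"],
};

console.log(generateRepository(order)); // Step 5: Output — the generated artifact
console.log(analyze(order));            // → [] (no violations)
```

The C# version swaps the interface for an attribute, the function for a Source Generator, and the string array for Roslyn analyzer diagnostics; the shape is identical.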
One Attribute, Many Outputs
A single [AggregateRoot] attribute on Order now drives outputs from five different series:
```csharp
[AggregateRoot(BoundedContext = "Ordering")]
[ForRequirement(typeof(OrderPaymentFeature))]
[Injectable(Lifetime.Scoped)]
public partial class Order
{
    [EntityId] public partial OrderId Id { get; }
    [Composition] public partial IReadOnlyList<OrderLine> Lines { get; }
    [Composition] public partial PaymentInfo Payment { get; }

    [DomainEvent(typeof(OrderPaidEvent))]
    public Result MarkAsPaid(PaymentReference reference, PaymentMethod method) { ... }
}
```

| Generator | Reads | Emits |
|---|---|---|
| DDD SG (CMF) | [AggregateRoot], [Composition], [DomainEvent] | Order.g.cs, OrderRepository.g.cs, OrderConfiguration.g.cs |
| DI SG (Contention) | [Injectable] | OrderDiRegistration.g.cs |
| Doc SG — Architecture | [AggregateRoot], [Composition] | docs/Order/architecture.md, Order.mermaid |
| Doc SG — Traceability | [ForRequirement] | docs/Order/requirements.md |
| Doc SG — API | [TypedEndpoint] on endpoints referencing Order | docs/Order/api.md |
| Doc SG — Deployment | [DeploymentOrchestrator] referencing Order | docs/Order/deployment/runbook.md, grafana.json, prometheus.yaml |
One class. Three attributes. Seven generators. Twenty files. Zero manual documentation.
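The fan-out is mechanical: every registered generator receives the same declaration and emits its own file set. A sketch in TypeScript, with invented names, using a few of the generators from the table:

```typescript
// One declaration, many generators: each consumer emits its own artifacts.
type Generator = (decl: { name: string }) => string[];

const generators: Record<string, Generator> = {
  ddd: d => [`${d.name}.g.cs`, `${d.name}Repository.g.cs`, `${d.name}Configuration.g.cs`],
  di: d => [`${d.name}DiRegistration.g.cs`],
  docArchitecture: d => [`docs/${d.name}/architecture.md`, `${d.name}.mermaid`],
  docTraceability: d => [`docs/${d.name}/requirements.md`],
};

// The build runs every generator against the same declaration.
const files = Object.values(generators).flatMap(g => g({ name: "Order" }));
console.log(files.length); // → 7 (from just these four generators)
```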
The Gotcha Revisited
Part I stated the gotcha: to document, first you type.
Now we see the full picture:
The Investment
Typing is the investment. Adding [AggregateRoot], [ForRequirement], [DeploymentOrchestrator], [HealthCheck] to your code takes time. Each attribute is a declaration of intent that costs a few seconds to write.
The Payoff
Everything else is free:
- Architecture diagrams → generated
- Traceability matrices → generated
- API documentation → generated
- Deployment runbooks → generated
- Grafana dashboards → generated
- Prometheus alerts → generated
- Kubernetes probes → generated
- Helm values → generated
- Release notes → generated
- Cross-references → generated
The investment is one-time per concept. The payoff compounds with every build.
This Website Proved It
This site started with TypeScript interfaces (TocItem, Heading, TocSection) and a build pipeline (build-static.ts). The investment was typing the content metadata. The payoff was auto-generated toc.json, meta tags, JSON-LD, Mermaid SVGs, and a fully navigable SPA.
The C# version scales the same principle to enterprise: DDD aggregates, requirements, API contracts, deployments, observability — all typed, all auto-documented.
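In miniature, the site's pipeline reduces to "serialize the typed metadata into an artifact". A sketch with stand-in field names, not the site's real interfaces:

```typescript
// Stand-in for the site's TocItem interface; the concrete fields
// here are assumptions for illustration.
interface TocItem {
  slug: string;
  title: string;
  headings: string[];
}

const pages: TocItem[] = [
  { slug: "meta-pattern", title: "The Meta-Pattern", headings: ["The Grand Unification Table"] },
  { slug: "auto-doc", title: "Auto-Doc", headings: ["One Attribute, Many Outputs"] },
];

// The build step serializes the typed metadata into toc.json;
// the SPA navigates from the artifact, never from hand-written config.
const tocJson = JSON.stringify(pages, null, 2);
console.log(tocJson);
```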
Document<Document<>> — The Recursive Proof
The Document DSL documents itself. Document<Document<>> generates the Document DSL's own reference documentation:
- List of all registered strategies and their output formats
- All active Document&lt;&gt; declarations in the solution
- All DOC001–DOC005 analyzer diagnostics with descriptions
- Coverage: which DSLs have documentation strategies, which don't
If you can document the documenter, the pattern is self-sustaining. No external tool needed. No separate documentation system. The typed system documents itself, including the system that does the documenting.
This is not a gimmick — it has practical value:
- New team members see exactly which documentation strategies exist
- The reference updates automatically when a new strategy is registered
- Missing strategies are visible (e.g., "Ops.Configuration has attributes but no DocumentStrategy")
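That coverage check is just a set difference: DSLs that expose attributes minus DSLs with a registered strategy. A sketch with invented names:

```typescript
// DSLs that expose attributes vs. DSLs with a DocumentStrategy registered.
// Both lists are hypothetical examples.
const dslsWithAttributes = ["DDD", "Requirements", "Ops.Deployment", "Ops.Configuration"];
const dslsWithStrategies = ["DDD", "Requirements", "Ops.Deployment"];

// Any DSL in the first set but not the second has no documentation coverage.
const missing = dslsWithAttributes.filter(d => !dslsWithStrategies.includes(d));
console.log(missing); // → ["Ops.Configuration"]
```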
When Auto-Doc Goes Too Far
Honest limitations:
What Auto-Doc Cannot Generate
- Narrative context — "Why did we choose this payment provider?" has no attribute
- Decision records — architectural decisions are human reasoning, not typed metadata
- Tutorial content — step-by-step guides require a narrative voice
- User-facing documentation — end-user docs need empathy and tone, not traceability
The Human Role Shifts
Auto-documentation doesn't eliminate the human. It shifts the role from author to editor:
| Before (Manual) | After (Auto-Doc) |
|---|---|
| Write the architecture diagram | Review the generated diagram |
| Maintain the traceability matrix | Verify the generated matrix is complete |
| Update the deployment runbook | Validate the generated runbook matches reality |
| Create Grafana dashboards by hand | Review the generated dashboard panels |
The human writes the "why." The compiler writes the "what."
The Decision Matrix
| Content Type | Auto-generate? | Why |
|---|---|---|
| Traceability matrix | Yes | Pure data: requirement → code → test |
| Entity diagrams | Yes | Pure structure: compositions, events |
| API contracts | Yes | Pure specification: methods, routes, types |
| Deployment runbook | Yes | Pure procedure: ordered steps, rollback |
| Infrastructure configs | Yes | Pure data: probes, alerts, dashboards |
| Architecture Decision Records | No | Human reasoning, trade-off analysis |
| Onboarding guides | No | Requires narrative voice and sequencing |
| Product documentation | No | Requires empathy, user persona awareness |
DX Recap — The Three-Line Promise
For any team using typed DSLs:
```xml
<!-- Line 1: Add the Document DSL -->
<ProjectReference Include="Cmf.Document.Lib" />
```

```csharp
// Line 2: Declare what to document
[assembly: DocumentSuite<Order>(CrossReference = true)]
```

```shell
# Line 3: Build
dotnet build
# → 15+ documentation files generated
# → Grafana dashboard ready to import
# → Prometheus alerts ready to deploy
# → Kubernetes probes ready to apply
```

No configuration files. No template systems. No documentation plugins. The types you already wrote ARE the documentation source. The Source Generator you already use IS the documentation engine.
The Compiler as Technical Writer
Documentation is no longer a human activity separate from development. It is a compiler output — as automatic as IL generation, as reliable as type checking, as up-to-date as the code itself.
The Build as Documentation Pipeline
dotnet build doesn't just compile your application. It generates your documentation, your infrastructure artifacts, your compliance reports, and your release notes. One command. One source of truth.
The PR as Changelog
When a developer adds a [Feature] or modifies an [AggregateRoot], the Document DSL regenerates the affected docs. The PR diff shows both the code change and the documentation change. The reviewer sees the impact on both code and docs in one place.
The Release as Runbook
When a new [DeploymentOrchestrator] is tagged for release, the generated runbook IS the release procedure. The generated Helm values ARE the deployment configuration. The generated Prometheus alerts ARE the monitoring setup. Nothing else to write.
Series Summary
| Part | What You Learned |
|---|---|
| I | The gotcha: to document, first type. C# IS the DSL infrastructure. This website proves it. |
| II | The dev DSLs (DDD, Requirements, Testing, API) already have the metadata. The Artifact Graph finds all consumers. |
| III | The Document DSL: Document<T> with pluggable strategies. Document<Document<>> documents itself. Onboarding in 3 lines. |
| IV | The Ops Meta-DSL: 5 sub-DSLs (Deployment, Migration, Observability, Configuration, Resilience), each a standalone NuGet. |
| V | Composing all 5 Ops sub-DSLs in one deployment class. OPS001–OPS012 analyzers catch cross-DSL violations. |
| VI | Generated artifacts: not just Markdown — Grafana JSON, Prometheus YAML, K8s probes, Helm values. |
| VII | Full lifecycle: DocumentSuite<T> cross-references everything. Release notes from git diff. Pipeline from build to static site. |
| VIII | Complete example: one feature, 20 generated files, zero manual documentation. |
| IX | The meta-pattern: one pattern, four series. Declare → Generate → Guard → Output. The compiler is the technical writer. |
The investment is typing. The payoff is everything else.