
Part I: Design Philosophy

"Small, sharp tools." -- Doug McIlroy

Why build yet another set of framework libraries? The answer is that most .NET "framework" packages are monoliths. You install one NuGet package and get 400 services, half of which you never use. You get a base class that infects your entire domain model. You get startup code that registers things you have never heard of into your DI container. You get transitive dependencies on packages you did not ask for and do not want. And when you need to upgrade, you upgrade everything or nothing, because the monolith does not believe in boundaries.

FrenchExDev takes the opposite approach: nine small, focused packages that compose because they share principles, not a base class.

Each package solves exactly one problem. FrenchExDev.Net.Options handles absence. FrenchExDev.Net.Guard handles validation. FrenchExDev.Net.Mediator handles dispatch. They compose through shared types -- Result<T>, Option<T> -- and shared DI conventions via the [Injectable] source-generated attribute, not through direct references to each other. You can use any subset. You can use one. You can use all nine. The choice is yours, and the framework does not punish you for choosing.

This is not a new idea. The Unix philosophy has advocated for small, composable tools since the 1970s. But the .NET ecosystem has historically drifted in the opposite direction: toward large, integrated frameworks that do everything. ASP.NET gives you a web server, a DI container, configuration, logging, routing, model binding, authentication, authorization, and CORS in one package. EF Core gives you an ORM, migrations, change tracking, lazy loading, and query translation in one package. MediatR gives you command dispatch, notification dispatch, pipeline behaviors, and stream requests in one package.

Each of these choices has merit. There are genuine advantages to integrated frameworks: consistent versioning, shared conventions, documentation that covers the whole surface. But there is a cost: you take all of it or none of it. You cannot use MediatR's pipeline behaviors without also using MediatR's notification dispatch. You cannot use EF Core's change tracking without also pulling in migrations.

FrenchExDev's nine packages reject that tradeoff. You can use the Guard package without the Saga package. You can use the Option package without the Mediator package. You can use all nine together, and they compose cleanly -- but the composition is opt-in, not forced.

This article lays out the five design principles that every pattern in the framework shares. These principles are not aspirational statements on a wiki page. They are structural constraints enforced by the package graph, the project layout, the source generators, and the test infrastructure. If a principle is not enforceable, it is not a principle -- it is a suggestion.


The Five Principles

What follows are the five non-negotiable design decisions that shaped every line of code across all nine packages. They are listed in order of impact: the first principle (composability) constrains the package graph; the second (testing) constrains the project structure; the third (source generation) constrains how DI registration and mapping work; the fourth (Result<T>) constrains the error-handling vocabulary; and the fifth (netstandard2.0) constrains the target framework and API surface.


Principle 1: Composability Over Completeness

Each package does one thing. This is not a slogan. It is a dependency constraint.

The package graph is shallow. Most packages depend only on FrenchExDev.Net.Result -- the Result<T> type that represents success or failure -- and nothing else. A few packages depend on nothing at all. No package depends on more than two other FrenchExDev packages.

Here is what that means in practice. When you install FrenchExDev.Net.Guard, your dotnet restore output looks like this:

Restored FrenchExDev.Net.Guard (1 dependency)
  └── FrenchExDev.Net.Result

Two packages. That is it. You do not pull in Saga. You do not pull in Outbox. You do not pull in an EF Core dependency because some other pattern in the framework happens to need it. The Guard package knows about Result<T> because it needs to return validation failures as Result<T>. It does not know about Option<T>, IClock, IMediator, or anything else.

Compare this to what happens with a typical enterprise framework package:

Restored Enterprise.Framework (47 dependencies)
  ├── Enterprise.Framework.Core
  ├── Enterprise.Framework.Abstractions
  ├── Enterprise.Framework.Data
  │   ├── Microsoft.EntityFrameworkCore (5.0.0)
  │   ├── Microsoft.EntityFrameworkCore.SqlServer
  │   └── Microsoft.EntityFrameworkCore.Relational
  ├── Enterprise.Framework.Messaging
  │   ├── RabbitMQ.Client
  │   └── Polly
  ├── Enterprise.Framework.Caching
  │   └── StackExchange.Redis
  ... (33 more)

You wanted a validation helper. You got Redis. That is what happens when composability is not a structural constraint.

How the packages compose

The nine packages compose through two mechanisms: shared types and shared DI conventions.

Shared types means that when Guard.ToResult returns a Result<T>, and when Option<T>.ToResult() converts None to a Result<T> failure, and when ISagaStep<T>.ExecuteAsync returns a Task<Result>, they are all returning the same Result<T> from the same FrenchExDev.Net.Result package. There is no Guard.Result and Option.Result and Saga.Result. There is one Result<T>, and every pattern speaks it.
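
Concretely, that agreement means consumer code can treat failures from different packages interchangeably. Here is a minimal sketch, assuming the API shapes named above (Guard.ToResult, Option&lt;T&gt;.ToResult, ISagaStep&lt;T&gt;.ExecuteAsync); the variables orderId, orderOption, step, context, and logger stand in for application code:

```csharp
// Sketch: three different packages, one shared Result type.
// Guard: a failed validation rule becomes a Result failure.
Result<int> validated = Guard.ToResult(orderId)
    .Positive("OrderId must be positive");

// Option: absence becomes a Result failure of the same shape.
Result<Order> found = orderOption.ToResult("Order not found");

// Saga: a step's outcome is the same Result, just without a payload.
Result stepOutcome = await step.ExecuteAsync(context, cancellationToken);

// Because the type is shared, one branch handles all three failure sources.
if (validated.IsFailure || found.IsFailure || stepOutcome.IsFailure)
    logger.LogWarning("Pipeline failed -- same error vocabulary everywhere");
```

No adapter code, no per-package error hierarchies: one IsFailure check covers validation, absence, and orchestration alike.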

Shared DI conventions means that every pattern registers itself into IServiceCollection using the [Injectable] attribute and its companion source generator. You do not write services.AddMediator() and services.AddSaga() and services.AddOutbox() as separate extension methods that each take different configuration lambdas. You write services.AddMyAppInjectables() -- a single, source-generated method that registers everything in the assembly, including all mediator handlers, saga steps, outbox processors, and domain services.

This is composability through protocol agreement, not through inheritance. The packages do not extend each other. They speak the same language.

What you can safely omit

Because the graph is shallow, you can use any subset without penalty:

If you need... Install... Transitive deps
Null elimination FrenchExDev.Net.Options FrenchExDev.Net.Result
Input validation FrenchExDev.Net.Guard FrenchExDev.Net.Result
Discriminated unions FrenchExDev.Net.Union (none)
Deterministic time FrenchExDev.Net.Clock (none)
Object mapping FrenchExDev.Net.Mapper (none)
CQRS dispatch FrenchExDev.Net.Mediator FrenchExDev.Net.Result
Event streams FrenchExDev.Net.Reactive System.Reactive
Multi-step orchestration FrenchExDev.Net.Saga FrenchExDev.Net.Result
Transactional outbox FrenchExDev.Net.Outbox FrenchExDev.Net.Result

Every cell in the "Transitive deps" column is either one package or zero. That is the structural guarantee.

The anti-pattern this avoids

The monolith anti-pattern is not just about package size. It is about coupling velocity. When you install a monolith framework, every update to any part of the framework forces an update to the entire framework. If the team that maintains the EF Core integration makes a breaking change, your validation code breaks too -- even though validation has nothing to do with EF Core.

With nine small packages, the blast radius of any breaking change is exactly one package. The Guard team (which is also me, but the principle holds for any team) can ship a breaking change to Guard without affecting Saga, Outbox, or any other package. Consumers upgrade one package at a time, on their own schedule.

This is the Unix philosophy applied to NuGet: do one thing, do it well, compose through shared protocols.

Versioning independence

Because the packages are independent, they version independently. FrenchExDev.Net.Guard might be at version 3.1.0 while FrenchExDev.Net.Saga is at version 1.4.0. This is not a bug -- it is the natural consequence of packages evolving at different rates.

Guard has been stable for two years. Its API surface is small and well-understood. It does not change often. Saga is newer and still evolving. Its API changes more frequently as edge cases are discovered in production.

In a monolith framework, both packages would be at the same version (say, 5.2.0), even though Guard has not changed since 5.0.0. Consumers would install a "new version" of Guard that contains zero changes, just because the monolith bumped its version. Worse, the changelog for 5.2.0 would list changes to Saga that Guard consumers do not care about.

Independent versioning means:

  • Changelogs are accurate. The changelog for FrenchExDev.Net.Guard 3.1.0 contains only Guard changes.
  • Upgrades are targeted. You upgrade the package that changed, not the entire framework.
  • Risk is proportional. Upgrading a stable package (Guard) carries less risk than upgrading an evolving package (Saga). Independent versions make that risk visible in the version number.

The composition test

How do you know these packages actually compose? Every release runs a composition integration test that installs all nine packages into a single project, registers them all via [Injectable], and exercises a workflow that touches all nine:

  1. Guard validates the incoming request
  2. Option looks up the customer (might not exist)
  3. Union routes to one of three processing paths
  4. Clock timestamps the operation
  5. Mapper transforms domain objects to DTOs
  6. Mediator dispatches the command through a behavior pipeline
  7. Saga orchestrates a three-step fulfillment process
  8. Outbox guarantees the domain event is published
  9. Reactive streams the event to analytics subscribers

If any package change breaks the composition, the integration test fails and the release is blocked. This is not a manual check. It is an automated gate.


Principle 2: Testing as a First-Class Package

Every single pattern ships a companion .Testing package. Not as an afterthought. Not as a set of examples in a README. As a real, versioned, NuGet-published package that you install into your test projects.

Here is the complete list:

Pattern Testing Package Key Test Doubles
Options FrenchExDev.Net.Options.Testing OptionAssertions (ShouldBeSome, ShouldBeNone, ShouldBeSomeAnd)
Union FrenchExDev.Net.Union.Testing UnionAssertions (ShouldBe<T>, ShouldNotBe<T>, ShouldMatchExhaustively)
Clock FrenchExDev.Net.Clock.Testing FakeClock (Advance, SetUtcNow, deterministic timers)
Mapper FrenchExDev.Net.Mapper.Testing MapperAssertions (ShouldMapTo, ShouldMapProperty, ShouldIgnoreProperty)
Mediator FrenchExDev.Net.Mediator.Testing FakeMediator (Setup<TRequest, TResponse>, WasSent, WasPublished)
Reactive FrenchExDev.Net.Reactive.Testing TestEventStream<T> (records every published event for later assertion)
Saga FrenchExDev.Net.Saga.Testing InMemorySagaStore (ConcurrentDictionary-backed, no database needed)
Outbox FrenchExDev.Net.Outbox.Testing InMemoryOutbox (ConcurrentBag-backed, captures messages for assertion)

Eight .Testing packages. Eight sets of purpose-built test doubles. Zero Moq. Zero NSubstitute. Zero runtime proxies.

Why fakes belong in a separate package

There are three reasons.

Reason 1: No test infrastructure in production assemblies. When FakeClock lives in the main FrenchExDev.Net.Clock package, every application that uses the clock in production ships with test infrastructure it will never use. That is dead code in production. Worse, it is dead code that references test-oriented patterns (mutable state, public setters) that have no place in a production assembly.

Reason 2: No Moq dependency. If the framework provides its own fakes, consumers do not need to install a mocking library to test code that uses the framework. FakeMediator does everything Mock<IMediator> does, but with a purpose-built API that knows about commands, queries, and notifications. You write fakeMediator.Setup<GetOrderQuery, OrderDto>(query => orderDto) instead of mock.Setup(m => m.SendAsync(It.IsAny<GetOrderQuery>(), It.IsAny<CancellationToken>())).ReturnsAsync(orderDto). The framework-provided fake is shorter, safer, and harder to misconfigure.

Reason 3: The framework author designs the test double. When you mock IMediator with Moq, you have to guess the correct way to set up the mock. Does SendAsync accept a CancellationToken? Is the return type Task<TResponse> or Task<Result<TResponse>>? You look at the interface, squint at the generic constraints, and hope you got it right. When the framework author provides FakeMediator, the test double is designed by the same person who designed the interface. It knows the invariants. It validates them.

The naming convention

Every .Testing package follows the same naming pattern:

FrenchExDev.Net.{Pattern}.Testing

This is a convention, not a coincidence. The build enforces that every src/FrenchExDev.Net.{Pattern}/ directory has a sibling src/FrenchExDev.Net.{Pattern}.Testing/ directory. If you create a new pattern and forget the .Testing project, the build fails.
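
The article does not show the enforcement mechanism itself. One plausible sketch -- hypothetical file name, target name, and path layout, not the actual build script -- is a shared MSBuild target that errors out when the sibling directory is absent:

```xml
<!-- Directory.Build.targets (hypothetical sketch): fail the build when a
     pattern project under src/ has no companion .Testing project. -->
<Project>
  <Target Name="RequireCompanionTestingProject"
          BeforeTargets="CoreCompile"
          Condition="!$(MSBuildProjectName.EndsWith('.Testing'))">
    <Error Condition="!Exists('$(MSBuildThisFileDirectory)src\$(MSBuildProjectName).Testing')"
           Text="Pattern '$(MSBuildProjectName)' is missing its companion src\$(MSBuildProjectName).Testing project." />
  </Target>
</Project>
```

Because the check runs on every build of every pattern project, the gate cannot be forgotten the way a wiki checklist can.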

What the test doubles look like in practice

Here is a test that uses FakeClock to verify that a subscription expires after 30 days:

[Fact]
public void Subscription_expires_after_thirty_days()
{
    // Arrange
    var clock = new FakeClock(new DateTimeOffset(2026, 1, 1, 0, 0, 0, TimeSpan.Zero));
    var subscription = new Subscription(clock.UtcNow, duration: TimeSpan.FromDays(30));

    // Act
    clock.Advance(TimeSpan.FromDays(31));

    // Assert
    subscription.IsExpired(clock.UtcNow).Should().BeTrue();
}

No DateTime.UtcNow. No Thread.Sleep. No flaky tests that depend on wall-clock time. The clock is a value you control.

Here is a test that uses FakeMediator to verify that a controller sends the correct command:

[Fact]
public async Task PlaceOrder_sends_command_with_correct_total()
{
    // Arrange
    var fakeMediator = new FakeMediator();
    var expectedResult = Result<OrderId>.Success(new OrderId(42));
    fakeMediator.Setup<PlaceOrderCommand, OrderId>(cmd => expectedResult);

    var controller = new OrderController(fakeMediator);

    // Act
    var result = await controller.PlaceOrder(new PlaceOrderRequest
    {
        CustomerId = 7,
        Items = new[] { new LineItem("SKU-001", quantity: 2, unitPrice: 9.99m) }
    });

    // Assert
    fakeMediator.WasSent<PlaceOrderCommand>(cmd =>
        cmd.CustomerId == 7 &&
        cmd.Items.Count == 1 &&
        cmd.Items[0].Sku == "SKU-001");
}

No Mock<IMediator>. No It.IsAny<>. No Verify(Times.Once()). The fake knows what a command is. The assertion reads like a specification.

Here is a test that uses OptionAssertions to verify a lookup result:

[Fact]
public void FindUser_returns_some_when_user_exists()
{
    // Arrange
    var repository = new InMemoryUserRepository();
    repository.Add(new User(42, "Alice"));

    // Act
    var result = repository.FindById(42);

    // Assert
    result.ShouldBeSome();
    result.ShouldBeSomeAnd(user => user.Name == "Alice");
}

[Fact]
public void FindUser_returns_none_when_user_does_not_exist()
{
    var repository = new InMemoryUserRepository();

    var result = repository.FindById(999);

    result.ShouldBeNone();
}

ShouldBeSome, ShouldBeNone, ShouldBeSomeAnd. The assertions say what they mean. There is no ceremony.

Here is a test that uses InMemorySagaStore to verify saga persistence without a database:

[Fact]
public async Task Saga_persists_state_between_steps()
{
    // Arrange
    var store = new InMemorySagaStore();
    var orchestrator = new SagaOrchestrator<OrderContext>(store, steps: new ISagaStep<OrderContext>[]
    {
        new ValidateInventoryStep(),
        new ChargePaymentStep(),
        new ShipOrderStep()
    });

    // Act
    var context = new OrderContext(OrderId: 42, Total: 99.99m);
    var result = await orchestrator.ExecuteAsync(context);

    // Assert
    result.IsSuccess.Should().BeTrue();

    var persisted = await store.GetAsync<OrderContext>(context.SagaId);
    persisted.ShouldBeSomeAnd(saga =>
        saga.State == SagaState.Completed &&
        saga.CurrentStep == 3);
}

InMemorySagaStore is backed by a ConcurrentDictionary. It implements ISagaStore exactly like the real EF Core or Redis store would, but without any infrastructure. The test runs in milliseconds, not seconds. It runs in CI without a database connection string. It runs on a developer's laptop without Docker.

That is the value of shipping test doubles as a first-class package. The framework author builds the fake once, correctly, and every consumer benefits.

Here is a test that uses InMemoryOutbox to verify that domain events are captured during a save operation:

[Fact]
public async Task Domain_events_are_captured_in_outbox()
{
    // Arrange
    var outbox = new InMemoryOutbox();
    var order = new Order(42, total: 99m);
    order.AddDomainEvent(new OrderPlaced(order.Id, order.Total));

    // Act
    await outbox.AddAsync(order.DomainEvents, CancellationToken.None);

    // Assert
    outbox.Messages.Should().HaveCount(1);
    outbox.Messages[0].EventType.Should().Be(nameof(OrderPlaced));
    outbox.Messages[0].ProcessedAt.Should().BeNull(); // Not yet published
}

InMemoryOutbox captures the messages in a ConcurrentBag. The test can inspect the count, the event types, the payload, and the processing state. No EF Core. No database transaction. Just a bag of messages.

And here is a test that uses MapperAssertions to verify that a source-generated mapper correctly maps all properties:

[Fact]
public void OrderMapper_maps_all_required_properties()
{
    var mapper = new OrderToOrderDtoMapper();
    var order = new Order(42, "Alice", 99.99m);

    var dto = mapper.Map(order);

    mapper.ShouldMapProperty(order, dto, o => o.Id, d => d.Id);
    mapper.ShouldMapProperty(order, dto, o => o.CustomerName, d => d.Customer);
    mapper.ShouldMapProperty(order, dto, o => o.Total, d => d.Total);
}

ShouldMapProperty takes two lambda expressions -- one for the source property, one for the target property -- and asserts that they produce the same value. If a mapping is wrong or missing, the assertion fails with a message that names both properties: Expected Order.CustomerName ("Alice") to map to OrderDto.Customer, but got "".

These are not trivial convenience methods. They are carefully designed assertions that produce diagnostic error messages. When a test fails at 2 AM in CI, the error message should tell you what went wrong without needing to reproduce the failure locally. Framework-provided assertions achieve this because the framework author knows what information is relevant to each failure mode.


Principle 3: Source Generation Over Reflection

Three patterns in the framework use Roslyn source generators:

  1. [Injectable] -- scans classes decorated with [Injectable], emits an Add{Assembly}Injectables() extension method that registers every service into the DI container
  2. Mapper -- scans [MapFrom] and [MapTo] attributes on DTOs, emits IMapper<TSource, TTarget> implementations with property-by-property mapping
  3. (Future) Mediator -- handler discovery, currently convention-based, planned migration to source-generated registration

The decision to use source generation instead of reflection was not a stylistic preference. It was a response to three concrete problems with reflection-based DI registration and object mapping.

Problem 1: Hidden performance costs

Reflection-based DI registration typically works like this at startup:

// Somewhere in a library you installed
services.AddMediatR(cfg => cfg.RegisterServicesFromAssembly(typeof(Program).Assembly));

That single line triggers a full assembly scan. Every type in the assembly is loaded. Every type is checked for interface implementations. Every matching type is registered. On a large assembly with 500 types, this can take 200ms or more -- and that is before the first HTTP request.

Source-generated registration looks like this at startup:

// Source-generated extension method — no reflection, no scanning
services.AddMyAppInjectables();

That method is a static list of services.AddScoped<IFoo, Foo>() calls. There is no reflection. There is no assembly scanning. There is no Type.GetInterfaces(). The method body was generated at compile time by the Roslyn incremental generator. It runs in microseconds, not milliseconds.

Problem 2: Debugging black boxes

When a reflection-based mapper throws MissingMemberException: Property 'FullName' not found on type 'UserDto', you get a stack trace that points into the mapper library's internal reflection code. You cannot step through the mapping logic because the mapping logic does not exist as source code. It exists as a runtime-generated delegate or an expression tree compiled into IL.

When a source-generated mapper has a bug, you open UserToUserDtoMapper.g.cs in your IDE and read it:

// <auto-generated />
public sealed class UserToUserDtoMapper : IMapper<User, UserDto>
{
    public UserDto Map(User source)
    {
        return new UserDto
        {
            Id = source.Id,
            FullName = source.FirstName + " " + source.LastName,  // line 12
            Email = source.Email
        };
    }
}

You can set a breakpoint on line 12. You can step through it. You can read it. The generated code is not hidden in a framework DLL. It is in your project, under obj/Debug/net10.0/generated/, and your IDE indexes it like any other source file.

Problem 3: Startup latency

In serverless environments (AWS Lambda, Azure Functions), cold start time matters. Every millisecond of startup is a millisecond of latency on the first request. Reflection-based DI registration contributes directly to cold start time because it performs CPU-intensive type scanning during ConfigureServices.

Source-generated registration contributes zero to cold start time beyond what is already required to execute a static method call. The "scanning" happened at compile time. The generated method is just a sequence of AddScoped / AddSingleton / AddTransient calls.

Incremental generators, not V1 generators

The source generators in FrenchExDev use the Roslyn incremental generator API (IIncrementalGenerator), not the older V1 generator API (ISourceGenerator). This distinction matters for IDE performance.

V1 generators run on every keystroke. Every time you type a character in any file, every V1 generator in the solution re-executes. For a generator that scans 200 classes, this means 200 type resolutions on every keystroke. The IDE becomes sluggish. Autocomplete lags. Red squiggles appear and disappear randomly.

Incremental generators run only when their inputs change. The [Injectable] generator registers a syntax provider that filters for class declarations with specific attributes. If you edit a file that does not contain [Injectable], the generator does not run. If you edit a file that does contain [Injectable], the generator re-runs only for that file's contribution to the model, and the emitter only re-emits if the model actually changed.

The practical effect: you can have 500 [Injectable]-decorated classes in a solution, and editing a non-decorated class triggers zero generator work. The IDE stays responsive.
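
For readers unfamiliar with the incremental API, here is a compressed sketch of how such a pipeline is wired. The attribute's metadata name is taken from the article; everything else (model shape, scope handling, hint file name) is simplified and should not be read as the real FrenchExDev generator:

```csharp
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp.Syntax;

[Generator]
public sealed class InjectableGeneratorSketch : IIncrementalGenerator
{
    public void Initialize(IncrementalGeneratorInitializationContext context)
    {
        // Filtered provider: Roslyn invokes this only for nodes that actually
        // carry the attribute, so edits to undecorated files trigger no work.
        var services = context.SyntaxProvider.ForAttributeWithMetadataName(
            "FrenchExDev.Net.Injectable.Attributes.InjectableAttribute",
            predicate: static (node, _) => node is ClassDeclarationSyntax,
            // Project down to a small value-equatable model; if the model is
            // unchanged between edits, the pipeline skips re-emission.
            transform: static (ctx, _) =>
            {
                var impl = (INamedTypeSymbol)ctx.TargetSymbol;
                var service = impl.Interfaces.FirstOrDefault() ?? impl;
                return (Service: service.ToDisplayString(),
                        Impl: impl.ToDisplayString());
            });

        // Emit one registration line per model into a single extension method.
        context.RegisterSourceOutput(services.Collect(), static (spc, models) =>
        {
            var lines = string.Join("\n", models.Select(m =>
                $"        services.AddScoped<global::{m.Service}, global::{m.Impl}>();"));
            spc.AddSource("Injectables.g.cs", $$"""
                // <auto-generated />
                namespace Microsoft.Extensions.DependencyInjection;

                public static class InjectableExtensionsSketch
                {
                    public static IServiceCollection AddInjectables(this IServiceCollection services)
                    {
                {{lines}}
                        return services;
                    }
                }
                """);
        });
    }
}
```

The transform step returning a tuple matters: tuples compare by value, which is what lets the incremental pipeline detect "nothing meaningful changed" and stop before emission.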

The [Injectable] source generator in action

Here is a class decorated with [Injectable]:

using FrenchExDev.Net.Injectable.Attributes;

namespace MyApp.Orders;

[Injectable(Scope = Scope.Scoped)]
public sealed class OrderRepository : IOrderRepository
{
    private readonly AppDbContext _db;

    public OrderRepository(AppDbContext db) => _db = db;

    public async Task<Option<Order>> FindByIdAsync(int id)
    {
        var entity = await _db.Orders.FindAsync(id);
        return entity is not null
            ? Option<Order>.Some(entity.ToDomain())
            : Option<Order>.None();
    }
}

The class implements IOrderRepository. The attribute says Scope = Scope.Scoped. The source generator reads both facts and emits:

// <auto-generated />
#nullable enable

namespace Microsoft.Extensions.DependencyInjection;

public static class MyAppInjectableExtensions
{
    public static global::Microsoft.Extensions.DependencyInjection.IServiceCollection AddMyAppInjectables(
        this global::Microsoft.Extensions.DependencyInjection.IServiceCollection services)
    {
        services.AddScoped<global::MyApp.Orders.IOrderRepository, global::MyApp.Orders.OrderRepository>();
        return services;
    }
}

One class. One registration line. No reflection. The generator detected that OrderRepository implements IOrderRepository, so it registered the pair. If the class had implemented two interfaces, there would be two registration lines. If the class had implemented zero interfaces, the generator would register the class as itself (services.AddScoped<OrderRepository>()).

This is the entire pipeline: attribute in source code, Roslyn reads it at compile time, generated extension method appears in obj/. At runtime, you call services.AddMyAppInjectables() in Program.cs, and every [Injectable]-decorated class in the assembly is registered.

The Mapper source generator preview

The Mapper pattern uses the same incremental generator architecture for a different purpose: generating IMapper<TSource, TTarget> implementations from attribute annotations. Here is a quick preview (Part VI covers this in full):

[MapFrom(typeof(Order))]
public sealed class OrderDto
{
    public int Id { get; init; }

    [MapProperty(nameof(Order.CustomerName))]
    public string Customer { get; init; } = "";

    public decimal Total { get; init; }

    [IgnoreMapping]
    public string DisplayLabel => $"Order #{Id} - {Customer}";
}

The source generator reads [MapFrom(typeof(Order))], matches properties by name and type, respects [MapProperty] for renames and [IgnoreMapping] for exclusions, and emits:

// <auto-generated />
public sealed class OrderToOrderDtoMapper : IMapper<Order, OrderDto>
{
    public OrderDto Map(Order source)
    {
        return new OrderDto
        {
            Id = source.Id,
            Customer = source.CustomerName,
            Total = source.Total
        };
    }
}

No reflection. No expression trees. No AutoMapper.CreateMap<Order, OrderDto>() at startup. The mapping is a plain C# method that the compiler checks at build time. If you rename CustomerName to ClientName in the Order class, the generated mapper breaks at compile time -- not at runtime when a user hits the endpoint.

What the generator does NOT do

The generator does not use reflection. It does not call Assembly.GetTypes(). It does not use Activator.CreateInstance(). It does not emit IL. It reads the Roslyn semantic model -- the same model the compiler uses to check your code -- and writes C# source code. The generated code is plain C# that you could have written by hand. The generator just writes it for you, correctly, every time.


Principle 4: Result<T> as the Lingua Franca

All patterns that can fail return Result<T> or integrate with it. This is not a suggestion. It is a structural decision enforced by the return types of the interfaces.

Here is where Result<T> appears across the nine packages:

Pattern Where Result appears
Guard Guard.ToResult returns Result<T> on validation failure
Option Option<T>.ToResult() converts None to Result<T>.Failure()
Saga ISagaStep<T>.ExecuteAsync returns Task<Result>
Saga ISagaStep<T>.CompensateAsync returns Task<Result>
Mediator Handlers typically return Result<TResponse> by convention
Outbox IOutboxProcessor.ProcessAsync returns Task<Result>

The consequence is that error handling has a single vocabulary across the entire framework. When a Guard validation fails, when an Option is empty, when a saga step fails, when a mediator handler rejects a command -- they all express failure in the same way: as a Result<T> with IsFailure == true.

What this looks like in a pipeline

Consider a request handler that validates input, looks up an entity, and performs an operation:

public async Task<Result<OrderDto>> Handle(GetOrderQuery query, CancellationToken ct)
{
    // Guard returns Result<int> — validation failure is a Result
    return await Guard.ToResult(query.OrderId)
        .NotDefault("OrderId must be provided")
        .Positive("OrderId must be positive")

        // Option returns Result<Order> — absence is a Result
        .BindAsync(async id =>
        {
            var option = await _repository.FindByIdAsync(id);
            return option.ToResult("Order not found");
        })

        // Mapper is pure transformation — but still in the Result pipeline
        .Map(order => _mapper.Map(order));
}

Three packages in one pipeline: Guard, Option, and Mapper. The pipeline flows through Result<T> the entire way. If the guard fails, the option lookup never runs. If the option is None, the mapper never runs. Each step short-circuits on failure, and the caller receives a single Result<OrderDto> that is either a success with the DTO or a failure with a message.

No exceptions were thrown. No try-catch blocks were written. No null was returned. The types enforce the error handling.

Why not just use exceptions?

Exceptions are for exceptional situations -- things that should not happen during normal execution. A missing order is not exceptional. It is a normal, expected outcome of a lookup. An invalid input is not exceptional. It is a normal, expected outcome of user input.

When you use exceptions for expected failures, you pay three costs:

  1. Performance: Throwing an exception allocates an Exception object, captures a stack trace, and unwinds the call stack. This costs 10x to 100x more than returning a Result<T>.
  2. Readability: The method signature says Task<OrderDto>, which implies it always returns an OrderDto. It lies. It might throw. The caller has to read the documentation (if it exists) to know what exceptions to catch.
  3. Composability: You cannot compose exceptions in a pipeline. You cannot Map a thrown exception. You cannot Bind two methods that throw different exceptions into a single pipeline. You have to catch each one individually.

Result<T> eliminates all three costs. It is a value type -- in the typical case it lives on the stack and adds no GC pressure. The method signature says Task<Result<OrderDto>>, which tells the truth: this operation can succeed or fail. And you can Map, Bind, Match, and compose results in arbitrary pipelines because they are just values.

The unifying effect

Because every pattern speaks Result<T>, you can connect patterns without adapter code. Guard flows into Option flows into Saga flows into Mediator. Each pattern produces or consumes Result<T>, and the pipeline connects naturally.

This is what "lingua franca" means: a shared language that eliminates translation. No pattern invents its own error type. No pattern requires you to convert between Guard.Error and Saga.Error and Mediator.Error. There is one error type: Result<T> with a failure message. Every pattern speaks it.

Result through a saga

The saga pattern demonstrates how Result<T> flows through multi-step orchestration. Each saga step returns Task<Result>. If any step fails, the orchestrator triggers compensation in reverse order -- and each compensation step also returns Task<Result>:

public sealed class ChargePaymentStep : ISagaStep<OrderContext>
{
    private readonly IPaymentGateway _gateway;

    public ChargePaymentStep(IPaymentGateway gateway) => _gateway = gateway;

    public async Task<Result> ExecuteAsync(OrderContext context, CancellationToken ct)
    {
        var chargeResult = await _gateway.ChargeAsync(
            context.CustomerId,
            context.Total,
            ct);

        return chargeResult.Match(
            onSuccess: transactionId =>
            {
                context.PaymentTransactionId = transactionId;
                return Result.Success();
            },
            onFailure: error => Result.Failure($"Payment failed: {error}"));
    }

    public async Task<Result> CompensateAsync(OrderContext context, CancellationToken ct)
    {
        if (context.PaymentTransactionId is null)
            return Result.Success(); // Nothing to compensate

        return await _gateway.RefundAsync(context.PaymentTransactionId, ct);
    }
}

The gateway returns Result<TransactionId>. The step maps it to Result. The orchestrator checks IsSuccess. If false, it runs CompensateAsync on every previously completed step, in reverse. Every step in the chain -- forward and backward -- speaks Result<T>. There is no special saga error type. There is no SagaStepException. There is just Result.

Result in controller responses

At the API boundary, Result<T> translates directly to HTTP status codes:

[HttpGet("{id}")]
public async Task<IActionResult> GetOrder(int id)
{
    var result = await _mediator.SendAsync(new GetOrderQuery(id));

    return result.Match(
        onSuccess: order => Ok(order),
        onFailure: error => error switch
        {
            "Order not found" => NotFound(error),
            _ => BadRequest(error)
        });
}

The handler returns Result<OrderDto>. The controller matches on success or failure. Success maps to 200. Failure maps to 404 or 400 depending on the message. The Result<T> type flows from the deepest layer of the application (the repository returning Option<T>.ToResult()) all the way to the HTTP response, without a single exception being thrown.


Principle 5: netstandard2.0 -- Run Everywhere, Even Legacy

Core packages target netstandard2.0. This is the broadest .NET target available: it runs on .NET Framework 4.6.1 and later, .NET Core 2.0 and later, .NET 5 through .NET 10, Xamarin, Unity, and Mono.

This was a deliberate decision, not a default. The temptation to target net10.0 and use every new language feature is real. Pattern matching with is not null, collection expressions, primary constructors, required properties -- these are all things that make C# code more concise and readable. But they are not available on netstandard2.0.

The decision came from a simple observation: many organizations run .NET Framework 4.8 in production and will for years to come. If the Option pattern requires .NET 10, those organizations cannot use it. If the Guard pattern requires .NET 8, those organizations cannot use it. By targeting netstandard2.0, the foundational patterns -- Option, Union, Guard, Result -- work everywhere.

How conditional compilation bridges the gap

Where a newer API provides a better implementation, the code uses #if directives:

public override int GetHashCode()
{
#if NETSTANDARD2_0
    // netstandard2.0 does not have HashCode.Combine
    return unchecked((_index * 397) ^ (_value?.GetHashCode() ?? 0));
#else
    return HashCode.Combine(_index, _value);
#endif
}

The two implementations serve the same purpose: combining _index and _value into a single hash code. (They do not produce identical numeric values -- they do not need to, because hash codes only have to be stable within a single process.) The netstandard2.0 path uses the manual multiply-and-xor idiom that .NET developers have written since the early 2000s. The modern path uses HashCode.Combine, which is cleaner and has better distribution properties. Both are correct. The consumer does not need to know which path is taken.

Another common pattern:

#if NETSTANDARD2_0
    // netstandard2.0 does not have Span<T> overloads
    public static Result<string> NotNullOrWhiteSpace(string? value, string message)
    {
        if (string.IsNullOrWhiteSpace(value))
            return Result<string>.Failure(message);
        return Result<string>.Success(value!);
    }
#else
    public static Result<string> NotNullOrWhiteSpace(
        string? value,
        [CallerArgumentExpression(nameof(value))] string? paramName = null,
        string? message = null)
    {
        if (string.IsNullOrWhiteSpace(value))
            return Result<string>.Failure(message ?? $"'{paramName}' must not be null or whitespace.");
        return Result<string>.Success(value!);
    }
#endif

On modern .NET, [CallerArgumentExpression] captures the parameter name automatically, so the error message says 'customerEmail' must not be null or whitespace without the caller passing the name explicitly. On netstandard2.0, the caller must provide the message. Same functionality, different ergonomics.
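To illustrate the difference at a call site (assuming the two overloads above):

```csharp
// net8.0+ overload: the parameter name is captured by the compiler.
var modern = Guard.NotNullOrWhiteSpace(customerEmail);
// On failure: "'customerEmail' must not be null or whitespace."

// netstandard2.0 overload: the caller supplies the message explicitly.
var legacy = Guard.NotNullOrWhiteSpace(customerEmail, "Customer email must not be empty.");
```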

What does NOT target netstandard2.0

Not every package can target netstandard2.0. Some patterns depend on APIs that did not exist before .NET 8:

  • FrenchExDev.Net.Clock -- targets net8.0+ -- wraps TimeProvider, which was introduced in .NET 8
  • FrenchExDev.Net.Outbox.EntityFramework -- targets net10.0 -- uses EF Core 10.0 interceptors
  • FrenchExDev.Net.Reactive -- targets net8.0+ -- uses System.Reactive 6.0 features

The principle is: target netstandard2.0 when you can. Target a specific framework when you must. Never target a higher framework than the API surface requires.

The multi-targeting build

Each package's .csproj uses multi-targeting to build both netstandard2.0 and modern frameworks:

<PropertyGroup>
    <TargetFrameworks>netstandard2.0;net8.0;net10.0</TargetFrameworks>
    <LangVersion>latest</LangVersion>
    <Nullable>enable</Nullable>
</PropertyGroup>

This means a single NuGet package contains three TFM folders. When a consumer restores the package on .NET 10, they get the net10.0 binary with full modern API access. When a consumer restores on .NET Framework 4.8, they get the netstandard2.0 binary with the conditional compilation fallbacks. NuGet handles the selection automatically. The consumer does not need to think about it.

The polyfill strategy

For APIs that are missing on netstandard2.0 but essential for clean code, the framework uses internal polyfills:

#if NETSTANDARD2_0
namespace System.Diagnostics.CodeAnalysis
{
    [AttributeUsage(AttributeTargets.Parameter)]
    internal sealed class NotNullWhenAttribute : Attribute
    {
        public NotNullWhenAttribute(bool returnValue) => ReturnValue = returnValue;
        public bool ReturnValue { get; }
    }
}
#endif

This polyfill makes [NotNullWhen(true)] available on netstandard2.0, so the nullable analysis works correctly even on older frameworks. The attribute is internal and lives in the same namespace as the real attribute, so the compiler treats it identically. On net8.0 and net10.0, the polyfill is compiled away and the real BCL attribute is used.

Other polyfills include:

  • [CallerArgumentExpression] -- enables parameter name capture in Guard methods
  • IsExternalInit -- enables init property setters for record types
  • Index and Range -- enables ^1 and .. range syntax

These are small, well-understood polyfills that the .NET community has standardized on. They add no runtime cost (they are attributes, not code) and they enable modern C# syntax on older frameworks.
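The IsExternalInit polyfill, for example, is a single empty marker type -- the compiler only checks that the type exists. This is the community-standard shape, not code specific to this framework:

```csharp
#if NETSTANDARD2_0
// Declaring this marker type lets the compiler accept `init` accessors
// and record types on netstandard2.0. It carries no behavior at runtime.
namespace System.Runtime.CompilerServices
{
    internal static class IsExternalInit { }
}
#endif
```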

When netstandard2.0 is not enough

There are cases where targeting netstandard2.0 would require too many compromises. The Clock pattern is the clearest example.

.NET 8 introduced TimeProvider, a proper abstraction for time that the BCL team designed after years of the community building their own IClock interfaces. TimeProvider supports not just GetUtcNow() but also CreateTimer(), GetTimestamp(), and GetElapsedTime(). It is the official way to abstract time in modern .NET.

Building IClock on top of TimeProvider means the Clock pattern benefits from the BCL's design work, including timer integration, timestamp precision, and future BCL improvements. But TimeProvider does not exist on netstandard2.0. Building a polyfill for TimeProvider would be a massive undertaking -- it touches the timer infrastructure, the threading model, and the BCL's internal time representation.

So the Clock pattern targets net8.0+. This is the right tradeoff: organizations on .NET Framework 4.8 are unlikely to need a FakeClock for their legacy code (they have bigger problems), and organizations on .NET 8+ get a clock that integrates natively with the BCL.
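A minimal sketch of what building IClock on TimeProvider can look like -- the interface and class shapes here are illustrative assumptions, not the package's exact surface:

```csharp
public interface IClock
{
    DateTimeOffset UtcNow { get; }
}

// Production clock: delegates to the BCL abstraction (net8.0+).
public sealed class SystemClock : IClock
{
    private readonly TimeProvider _provider;

    public SystemClock(TimeProvider? provider = null)
        => _provider = provider ?? TimeProvider.System;

    public DateTimeOffset UtcNow => _provider.GetUtcNow();
}
```

Because the clock wraps TimeProvider rather than calling DateTime.UtcNow directly, timers and timestamps created through the same provider stay consistent with the reported time.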


The Package Graph

The nine packages form a directed acyclic graph. The diagram below shows every dependency between FrenchExDev packages. Dotted lines indicate optional platform-specific extensions.

Diagram
Nine packages, five arrows — the sparsity of the FrenchExDev package graph is the physical manifestation of Principle 1: composability shows up as the absence of dependencies.

Read the diagram from bottom to top. The behavioral patterns (Mediator, Reactive, Saga, Outbox) sit at the top of the application architecture. They orchestrate workflows, dispatch commands, stream events, and guarantee delivery. Below them, the infrastructure patterns (Clock, Mapper) provide cross-cutting services. Below those, the foundational patterns (Option, Union, Guard) provide type-safety primitives. At the base, the core types (Result<T>, [Injectable]) provide the shared vocabulary.

The key observation is how few arrows there are. Nine packages, five dependency arrows. Most packages are independent. You can install Mapper without Clock. You can install Union without anything else. The graph is sparse by design.

Notice what is not in the diagram:

  • No arrow from Mediator to Saga. A mediator handler might orchestrate a saga, but the Mediator package does not depend on the Saga package. The composition happens in your application code, not in the framework.
  • No arrow from Option to Guard. Guard can validate an Option<T> (e.g., Guard.Against.None(option)), but the Guard package does not depend on the Option package. The method uses the Result<T> type as the shared vocabulary.
  • No arrow from Clock to anything. Clock is a leaf node. It has zero dependencies on other FrenchExDev packages. It provides IClock and FakeClock. That is all it does.
  • No central "Core" package that everyone depends on. Result<T> is a dependency of some packages, but not all. Union, Clock, and Mapper do not need it. They operate independently.

This sparsity is the physical manifestation of Principle 1. Composability is not just a design goal -- it is visible in the dependency graph as the absence of arrows.


The Source Generation Pipeline

The [Injectable] source generator follows a four-stage pipeline that transforms attribute annotations into registration code. This is the same architecture used by the Mapper source generator (with different inputs and outputs).

Diagram
The [Injectable] source generator as a four-stage pipeline — the same incremental architecture the Mapper generator reuses, turning a single attribute into real, debuggable registration code.

Stage 1: Attribute detection. The Roslyn incremental generator registers a syntax provider that filters for class declarations with [Injectable] or [InjectableDecorator] attributes. This is a fast syntactic check -- it does not resolve types or check semantics. It just says: "This class has an attribute that looks like it might be [Injectable]."

Stage 2: Syntax analysis. For each candidate class, the generator reads the attribute's named arguments (Scope, As, Key, TryAdd) from the syntax tree. It also collects the class name, namespace, and the list of directly implemented interfaces from the class declaration.

Stage 3: Semantic model. The generator resolves the syntax to the Roslyn semantic model. This is where it confirms that the attribute is actually FrenchExDev.Net.Injectable.Attributes.InjectableAttribute (not some other attribute with the same name). It resolves interface types to their fully qualified names. It checks whether the class is a generic type definition (open generic). It reads the assembly-level [InjectableDefaults] attribute if present.

Stage 4: Emit. The generator builds an InjectableEmitModel containing every service and decorator to register, then passes it to InjectableEmitter.EmitMicrosoft() (or EmitSimpleInjector(), or EmitDryIoc() -- the generator supports three DI containers). The emitter writes plain C# source code: a static class with a single extension method that contains one registration call per service.

The generated file appears in the project's obj/ directory:

obj/Debug/net10.0/generated/
    FrenchExDev.Net.Injectable.Microsoft.SourceGenerator/
        FrenchExDev.Net.Injectable.Microsoft.SourceGenerator.InjectableGenerator/
            InjectableExtensions.g.cs

Your IDE indexes it. IntelliSense completes it. The debugger steps through it. It is real source code, just written by a machine.
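A rough sketch of what the emitted file contains -- the real emitter fully qualifies every type with global::; this is simplified for readability:

```csharp
// <auto-generated/>
// Simplified sketch of InjectableExtensions.g.cs for an assembly named MyApp.
using Microsoft.Extensions.DependencyInjection;

public static class InjectableExtensions
{
    public static IServiceCollection AddMyAppInjectables(this IServiceCollection services)
    {
        // One registration call per [Injectable] class found in the assembly.
        services.AddScoped<MyApp.IOrderRepository, MyApp.OrderRepository>();
        services.AddSingleton<MyApp.CacheWarmer>();
        return services;
    }
}
```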


The [Injectable] Attribute in Depth

The [Injectable] attribute is the cornerstone of the DI registration story across all nine patterns. Every saga step, every mediator handler, every mapper implementation, every domain service -- they all use [Injectable] to declare how they should be registered.

The attribute definition

namespace FrenchExDev.Net.Injectable.Attributes;

[AttributeUsage(AttributeTargets.Class | AttributeTargets.Interface, Inherited = false, AllowMultiple = true)]
public sealed class InjectableAttribute : Attribute
{
    /// <summary>Service lifetime in the DI container.</summary>
    public Scope Scope { get; set; } = Scope.Transient;

    /// <summary>
    /// Explicit service interfaces to register as.
    /// When null, auto-detects: registers for each directly implemented interface;
    /// self only if none.
    /// All types must be interfaces — the analyzer enforces this.
    /// </summary>
    public Type[]? As { get; set; }

    /// <summary>
    /// Service key for keyed DI registration (.NET 8+).
    /// When set, generates AddKeyed{Scope} calls inside a #if NET8_0_OR_GREATER block.
    /// </summary>
    public string? Key { get; set; }

    /// <summary>
    /// When true, generates TryAdd{Scope} instead of Add{Scope}.
    /// The service is only registered if no existing registration for the same
    /// service type exists.
    /// </summary>
    public bool TryAdd { get; set; }
}

The Scope enum

namespace FrenchExDev.Net.Injectable.Attributes;

public enum Scope
{
    Transient,
    Scoped,
    Singleton
}

Three lifetimes, matching the three lifetimes in Microsoft.Extensions.DependencyInjection. Transient is the default: a new instance per resolution. Scoped means one instance per scope (typically one per HTTP request). Singleton means one instance for the lifetime of the application.

Auto-detection of service types

When you do not specify the As property, the source generator auto-detects which interfaces to register. The rules are:

  1. If the class implements one or more interfaces directly, the generator registers the class for each interface. "Directly" means the interface appears in the class declaration, not in a base class.

  2. If the class implements zero interfaces, the generator registers the class as itself (self-registration).

// Implements IOrderRepository → registered as IOrderRepository
[Injectable(Scope = Scope.Scoped)]
public sealed class OrderRepository : IOrderRepository { ... }
// Generated: services.AddScoped<IOrderRepository, OrderRepository>();

// Implements IReader AND IWriter → registered for both
[Injectable(Scope = Scope.Scoped)]
public sealed class FileStore : IReader, IWriter { ... }
// Generated: services.AddScoped<IReader, FileStore>();
//            services.AddScoped<IWriter, FileStore>();

// Implements no interfaces → self-registration
[Injectable(Scope = Scope.Singleton)]
public sealed class CacheWarmer { ... }
// Generated: services.AddSingleton<CacheWarmer>();

Explicit service types with As

When you need to register a class for specific interfaces (not all the ones it implements), use the As property:

// OrderService implements IOrderReader, IOrderWriter, and IDisposable
// But we only want to register it as IOrderReader and IOrderWriter
[Injectable(Scope = Scope.Scoped, As = new[] { typeof(IOrderReader), typeof(IOrderWriter) })]
public sealed class OrderService : IOrderReader, IOrderWriter, IDisposable { ... }
// Generated: services.AddScoped<IOrderReader, OrderService>();
//            services.AddScoped<IOrderWriter, OrderService>();
// (IDisposable is NOT registered — it was excluded by the explicit As list)

The analyzer enforces that every type in the As array is an interface. If you accidentally write As = new[] { typeof(OrderService) } (a class, not an interface), you get a compile-time diagnostic:

INJECT001: Type 'OrderService' in 'As' must be an interface.

Keyed DI with Key

.NET 8 introduced keyed services in Microsoft.Extensions.DependencyInjection. The Key property generates keyed registration calls:

[Injectable(Scope = Scope.Scoped, Key = "primary")]
public sealed class PrimaryDatabase : IDatabase { ... }

[Injectable(Scope = Scope.Scoped, Key = "replica")]
public sealed class ReplicaDatabase : IDatabase { ... }

The generator emits keyed registration inside a #if NET8_0_OR_GREATER block, so the same source code compiles on .NET 6 (without keyed DI) and .NET 8+ (with keyed DI):

// Generated:
#if NET8_0_OR_GREATER
        services.AddKeyedScoped<global::MyApp.IDatabase, global::MyApp.PrimaryDatabase>("primary");
        services.AddKeyedScoped<global::MyApp.IDatabase, global::MyApp.ReplicaDatabase>("replica");
#endif

On .NET 6, those lines are compiled away. On .NET 8+, they register the keyed services.
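Consuming the keyed registrations then uses the standard .NET 8 resolution mechanisms (this example assumes the IDatabase registrations above):

```csharp
public sealed class ReportingService
{
    private readonly IDatabase _replica;

    // Injects the IDatabase registered with Key = "replica".
    public ReportingService([FromKeyedServices("replica")] IDatabase replica)
        => _replica = replica;
}

// Or imperatively, from an IServiceProvider:
// var primary = provider.GetRequiredKeyedService<IDatabase>("primary");
```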

Idempotent registration with TryAdd

When building a library that might be consumed by multiple assemblies, you want idempotent registration: register the service only if no one else has registered it yet. The TryAdd property handles this:

[Injectable(Scope = Scope.Singleton, TryAdd = true)]
public sealed class DefaultEventSerializer : IEventSerializer { ... }

The generator emits TryAddSingleton instead of AddSingleton:

// Generated:
services.TryAddSingleton<global::MyApp.IEventSerializer, global::MyApp.DefaultEventSerializer>();

If the consuming application registers its own IEventSerializer before calling the library's AddInjectables(), the TryAdd registration is skipped. The library provides a default; the consumer can override it.
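Concretely, the override works by ordering -- the consumer registers its implementation first. (AddMyLibInjectables stands in for whatever the library's generated method is actually named, and ProtobufEventSerializer is an illustrative custom implementation.)

```csharp
// Consumer's Program.cs: the explicit registration comes first, so it wins.
services.AddSingleton<IEventSerializer, ProtobufEventSerializer>();

// The library's generated TryAddSingleton sees the existing
// IEventSerializer registration and leaves it alone.
services.AddMyLibInjectables();
```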

The [InjectableDecorator] attribute

The decorator pattern is common in DI: you want to wrap an existing service with additional behavior (logging, caching, metrics) without changing the original implementation. The [InjectableDecorator] attribute makes this declarative:

namespace FrenchExDev.Net.Injectable.Attributes;

[AttributeUsage(AttributeTargets.Class, Inherited = false, AllowMultiple = false)]
public sealed class InjectableDecoratorAttribute : Attribute
{
    /// <summary>The service type (interface) this decorator wraps.</summary>
    public Type ServiceType { get; }

    /// <summary>
    /// Ordering hint when multiple decorators target the same service.
    /// Lower values are applied first (innermost). Default is 0.
    /// </summary>
    public int Order { get; set; }

    public InjectableDecoratorAttribute(Type serviceType)
    {
        ServiceType = serviceType;
    }
}

Here is how you use it:

// The original service
[Injectable(Scope = Scope.Scoped)]
public sealed class OrderRepository : IOrderRepository
{
    private readonly AppDbContext _db;
    public OrderRepository(AppDbContext db) => _db = db;

    public async Task<Option<Order>> FindByIdAsync(int id)
    {
        var entity = await _db.Orders.FindAsync(id);
        return entity is not null
            ? Option<Order>.Some(entity.ToDomain())
            : Option<Order>.None();
    }
}

// A caching decorator (innermost — applied first)
[InjectableDecorator(typeof(IOrderRepository), Order = 0)]
public sealed class CachingOrderRepository : IOrderRepository
{
    private readonly IOrderRepository _inner;
    private readonly IMemoryCache _cache;

    public CachingOrderRepository(IOrderRepository inner, IMemoryCache cache)
    {
        _inner = inner;
        _cache = cache;
    }

    public async Task<Option<Order>> FindByIdAsync(int id)
    {
        return await _cache.GetOrCreateAsync($"order:{id}", async entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return await _inner.FindByIdAsync(id);
        });
    }
}

// A logging decorator (outermost — applied second)
[InjectableDecorator(typeof(IOrderRepository), Order = 1)]
public sealed class LoggingOrderRepository : IOrderRepository
{
    private readonly IOrderRepository _inner;
    private readonly ILogger<LoggingOrderRepository> _logger;

    public LoggingOrderRepository(IOrderRepository inner, ILogger<LoggingOrderRepository> logger)
    {
        _inner = inner;
        _logger = logger;
    }

    public async Task<Option<Order>> FindByIdAsync(int id)
    {
        _logger.LogDebug("Finding order {OrderId}", id);
        var result = await _inner.FindByIdAsync(id);
        result.Switch(
            some: order => _logger.LogDebug("Found order {OrderId}", id),
            none: () => _logger.LogDebug("Order {OrderId} not found", id));
        return result;
    }
}

The resolution order is: LoggingOrderRepository wraps CachingOrderRepository wraps OrderRepository. The Order property controls the nesting: lower values are innermost (closer to the real implementation).

The generated code for decorators is more involved than simple registration. The generator removes the original registration, builds a factory lambda that resolves the inner service and passes it to the decorator constructor, and re-registers the decorator with the same lifetime:

// Generated decorator wiring:
{
    var inner = services.LastOrDefault(d => d.ServiceType == typeof(global::MyApp.IOrderRepository));
    if (inner is not null)
    {
        services.Remove(inner);
        services.Add(new global::Microsoft.Extensions.DependencyInjection.ServiceDescriptor(
            typeof(global::MyApp.IOrderRepository),
            sp => global::Microsoft.Extensions.DependencyInjection.ActivatorUtilities
                .CreateInstance<global::MyApp.CachingOrderRepository>(
                    sp,
                    inner.ImplementationFactory is not null
                        ? inner.ImplementationFactory(sp)
                        : global::Microsoft.Extensions.DependencyInjection.ActivatorUtilities
                            .CreateInstance(sp, inner.ImplementationType!)),
            inner.Lifetime));
    }
}

This is the kind of code you would have to write by hand in Startup.cs if you did not have the generator. It is correct, but it is tedious and error-prone. The generator writes it once, correctly, for every decorator in the assembly.

The [InjectableDefaults] assembly-level attribute

When most services in an assembly share the same scope, you can set a default at the assembly level:

// In AssemblyInfo.cs or any file
[assembly: InjectableDefaults(Scope = Scope.Scoped)]

Now every [Injectable] class in the assembly that does not explicitly set Scope will be registered as Scoped instead of the built-in default of Transient:

// This class gets Scoped (from assembly default) because Scope is not explicitly set
[Injectable]
public sealed class OrderRepository : IOrderRepository { ... }

// This class gets Singleton (explicitly set, overrides assembly default)
[Injectable(Scope = Scope.Singleton)]
public sealed class CacheWarmer { ... }

The generator tracks whether Scope was explicitly set on each attribute instance. If it was, the explicit value wins. If it was not, the assembly default applies. This is a small feature, but it eliminates 30 lines of Scope = Scope.Scoped in a typical web application assembly where almost everything is scoped.

Multi-container support

The source generator does not assume Microsoft.Extensions.DependencyInjection. It supports three DI containers:

  1. Microsoft.Extensions.DependencyInjection -- services.AddScoped<IFoo, Foo>()
  2. Simple Injector -- container.Register<IFoo, Foo>(Lifestyle.Scoped)
  3. DryIoc -- container.Register<IFoo, Foo>(Reuse.ScopedOrSingleton)

Each container has its own source generator NuGet package:

  • FrenchExDev.Net.Injectable.Microsoft.SourceGenerator
  • FrenchExDev.Net.Injectable.SimpleInjector.SourceGenerator
  • FrenchExDev.Net.Injectable.DryIoc.SourceGenerator

You install the one that matches your container. The attributes are the same. The generated code differs.

For Microsoft DI, the generated method is named Add{AssemblyName}Injectables and extends IServiceCollection. For Simple Injector, it extends Container. For DryIoc, it extends IContainer. The method body uses each container's native registration API -- no adapters, no abstractions over containers.

The before-and-after

Here is what a typical Program.cs looks like without [Injectable]:

// BEFORE — manual DI registration
var builder = WebApplication.CreateBuilder(args);

// Infrastructure
builder.Services.AddSingleton<IClock>(SystemClock.Instance);
builder.Services.AddSingleton<IEventSerializer, JsonEventSerializer>();
builder.Services.AddSingleton<CacheWarmer>();

// Repositories
builder.Services.AddScoped<IOrderRepository, OrderRepository>();
builder.Services.AddScoped<ICustomerRepository, CustomerRepository>();
builder.Services.AddScoped<IProductRepository, ProductRepository>();
builder.Services.AddScoped<IInventoryRepository, InventoryRepository>();
builder.Services.AddScoped<IPaymentRepository, PaymentRepository>();

// Domain services
builder.Services.AddScoped<IOrderService, OrderService>();
builder.Services.AddScoped<IPricingService, PricingService>();
builder.Services.AddScoped<IInventoryService, InventoryService>();
builder.Services.AddScoped<IPaymentGateway, StripePaymentGateway>();
builder.Services.AddScoped<IShippingCalculator, FedExShippingCalculator>();

// Mediator handlers
builder.Services.AddScoped<ICommandHandler<PlaceOrderCommand, OrderId>, PlaceOrderHandler>();
builder.Services.AddScoped<ICommandHandler<CancelOrderCommand, Unit>, CancelOrderHandler>();
builder.Services.AddScoped<IQueryHandler<GetOrderQuery, OrderDto>, GetOrderHandler>();
builder.Services.AddScoped<IQueryHandler<ListOrdersQuery, IReadOnlyList<OrderDto>>, ListOrdersHandler>();

// Saga steps
builder.Services.AddScoped<ISagaStep<OrderContext>, ValidateInventoryStep>();
builder.Services.AddScoped<ISagaStep<OrderContext>, ChargePaymentStep>();
builder.Services.AddScoped<ISagaStep<OrderContext>, ShipOrderStep>();

// Decorators (order matters, must be registered manually)
builder.Services.Decorate<IOrderRepository, CachingOrderRepository>();
builder.Services.Decorate<IOrderRepository, LoggingOrderRepository>();

// Behaviors
builder.Services.AddScoped<IBehavior<PlaceOrderCommand, OrderId>, ValidationBehavior>();
builder.Services.AddScoped<IBehavior<PlaceOrderCommand, OrderId>, LoggingBehavior>();

// ... 20 more lines for a medium-sized application

That is 30+ lines of registration code that must be kept in sync with the actual class implementations. Add a new repository? Remember to register it. Change a scope? Find the right line. Rename an interface? Update the registration. Forget any of these? Runtime InvalidOperationException on the first request.

Here is the same application with [Injectable]:

// AFTER — source-generated DI registration
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddMyAppInjectables(); // One line. Everything registered.

One line. The source generator scanned every [Injectable] and [InjectableDecorator] attribute in the assembly and generated the complete registration method. If you add a new repository with [Injectable(Scope = Scope.Scoped)], the generator picks it up on the next compile. If you rename an interface, the generator follows the rename. If you forget the attribute, the service is not registered -- and you get a clear InvalidOperationException at startup that says which service is missing, not a silent null somewhere deep in the application.

The generated method is deterministic: same source code in, same registration code out. You can diff the generated file between commits to see exactly what changed in your DI registrations. You can review it in a pull request. You can read it when debugging.


The .Testing Convention

Every .Testing package in the framework follows the same anatomy. This consistency means that once you learn how to use FrenchExDev.Net.Options.Testing, you already know the shape of FrenchExDev.Net.Saga.Testing. The patterns repeat.

Category 1: Fakes

Fakes are in-memory implementations of framework interfaces that provide controlled behavior for testing.

  • FakeClock -- replaces IClock / SystemClock -- key API: Advance(TimeSpan), SetUtcNow(DateTimeOffset)
  • FakeMediator -- replaces IMediator -- key API: Setup<TRequest, TResponse>(), WasSent<T>(), WasPublished<T>()

Fakes differ from mocks in one critical way: they have real behavior. FakeClock actually tracks time. When you call Advance(TimeSpan.FromHours(1)), the clock's UtcNow property advances by one hour. Any timer created by the clock fires at the correct simulated time. This is not a mock that returns a canned value -- it is a deterministic simulation.

var clock = new FakeClock(DateTimeOffset.UnixEpoch);

clock.UtcNow.Should().Be(DateTimeOffset.UnixEpoch);

clock.Advance(TimeSpan.FromDays(365));

clock.UtcNow.Should().Be(DateTimeOffset.UnixEpoch.AddDays(365));

FakeMediator also has real behavior: it matches incoming requests against registered setups and returns the configured response. It records every sent command and every published notification for later assertion:

var mediator = new FakeMediator();

// Setup: when this query is sent, return this result
mediator.Setup<GetUserQuery, UserDto>(query =>
    Result<UserDto>.Success(new UserDto(query.UserId, "Alice")));

// Act: the system under test sends the query
var result = await mediator.SendAsync(new GetUserQuery(42));

// Assert: the result is what we configured
result.IsSuccess.Should().BeTrue();
result.Value.Name.Should().Be("Alice");

// Assert: the query was actually sent
mediator.WasSent<GetUserQuery>(q => q.UserId == 42).Should().BeTrue();

Category 2: Assertions

Assertions are extension methods that provide domain-specific assertion vocabulary for framework types.

  • OptionAssertions -- targets Option<T> -- ShouldBeSome(), ShouldBeNone(), ShouldBeSomeAnd(predicate)
  • UnionAssertions -- targets OneOf<T1,T2,...> -- ShouldBe<T>(), ShouldNotBe<T>(), ShouldMatchExhaustively(...)
  • MapperAssertions -- targets IMapper<S,T> -- ShouldMapTo<T>(), ShouldMapProperty(...), ShouldIgnoreProperty(...)

These assertions replace generic .Should().Be() calls with assertions that understand the type:

// Without OptionAssertions — generic, unclear intent
result.Should().NotBeNull();
result.IsSome.Should().BeTrue();
result.Match(user => user.Name, () => "").Should().Be("Alice");

// With OptionAssertions — specific, clear intent
result.ShouldBeSome();
result.ShouldBeSomeAnd(user => user.Name == "Alice");

The assertion version is a line shorter and communicates intent directly. When it fails, the error message says Expected Option to be Some, but was None -- not Expected True but got False.

Category 3: Test doubles (persistent stores)

Test doubles are in-memory implementations of persistence interfaces that behave like the real store but without infrastructure.

  • InMemorySagaStore -- replaces ISagaStore (EF Core, Redis, etc.) -- backed by ConcurrentDictionary<string, SagaInstance>
  • InMemoryOutbox -- replaces IOutboxStore (EF Core) -- backed by ConcurrentBag<OutboxMessage>

These are not mocks. They are real implementations that store data in memory. InMemorySagaStore persists saga instances across multiple calls to SaveAsync and GetAsync. InMemoryOutbox accumulates outbox messages across multiple calls to AddAsync. They are thread-safe (backed by concurrent collections) and they implement the same interface as the production store.
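A minimal sketch of the idea behind such a store -- the member names and shapes here are illustrative, not the package's actual API (for instance, the real GetAsync returns an Option):

```csharp
// Illustrative in-memory store: real behavior, no infrastructure.
public sealed class InMemorySagaStoreSketch
{
    // Thread-safe backing store, keyed by saga id.
    private readonly ConcurrentDictionary<string, object> _instances = new();

    public Task SaveAsync(string sagaId, object instance)
    {
        _instances[sagaId] = instance;
        return Task.CompletedTask;
    }

    public Task<object?> GetAsync(string sagaId)
        => Task.FromResult<object?>(
            _instances.TryGetValue(sagaId, out var value) ? value : null);
}
```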

This means your integration tests can exercise the full saga orchestration pipeline -- multiple steps, compensation on failure, state persistence between steps -- without a database:

[Fact]
public async Task Compensation_reverses_completed_steps_on_failure()
{
    var store = new InMemorySagaStore();
    var steps = new ISagaStep<OrderContext>[]
    {
        new AlwaysSucceedsStep(),
        new AlwaysSucceedsStep(),
        new AlwaysFailsStep()  // Third step fails
    };

    var orchestrator = new SagaOrchestrator<OrderContext>(store, steps);
    var context = new OrderContext(OrderId: 1, Total: 50m);

    var result = await orchestrator.ExecuteAsync(context);

    // The saga failed
    result.IsFailure.Should().BeTrue();

    // But the first two steps were compensated
    var saga = await store.GetAsync<OrderContext>(context.SagaId);
    saga.ShouldBeSomeAnd(s => s.State == SagaState.Compensated);
}

No SQL Server. No Redis. No Docker. No connection string in CI. The test runs in under 10 milliseconds.

Category 4: Event recorders

TestEventStream<T> records every event published to a reactive event stream. This lets you assert on the sequence, count, and content of events without subscribing with real handlers:

[Fact]
public void Order_events_are_published_in_sequence()
{
    var stream = new TestEventStream<OrderEvent>();
    var service = new OrderService(stream);

    service.PlaceOrder(new Order(42, total: 99m));

    stream.RecordedEvents.Should().HaveCount(3);
    stream.RecordedEvents[0].Should().BeOfType<OrderCreated>();
    stream.RecordedEvents[1].Should().BeOfType<InventoryReserved>();
    stream.RecordedEvents[2].Should().BeOfType<PaymentCharged>();
}

TestEventStream<T> implements IEventStream<T>, so it drops into any code that accepts the interface. The RecordedEvents property is an IReadOnlyList<T> that you can assert against with any assertion library.
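A sketch of the recorder idea, assuming IEventStream<T> exposes a Publish method; the real interface likely has more members:

```csharp
using System.Collections.Generic;

// Simplified stand-in for the framework's stream interface.
public interface IEventStream<T>
{
    void Publish(T @event);
}

public sealed class TestEventStream<T> : IEventStream<T>
{
    private readonly List<T> _recorded = new();

    // Every published event is appended in publication order for later assertions.
    public IReadOnlyList<T> RecordedEvents => _recorded;

    public void Publish(T @event) => _recorded.Add(@event);
}
```

The recorder does nothing with the events except remember them, which is exactly what a sequencing test needs.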

Why this matters

The .Testing convention means three things for consumers of the framework:

  1. Production code never depends on test infrastructure. The FrenchExDev.Net.Clock package does not contain FakeClock. The FrenchExDev.Net.Saga package does not contain InMemorySagaStore. Test doubles live in separate assemblies that are only referenced by test projects.

  2. Tests never need Moq, NSubstitute, or FakeItEasy for framework types. Every interface that the framework exposes has a purpose-built test double. You can still use a mocking library for your own interfaces, but you never need one for IClock, IMediator, ISagaStore, or IEventStream<T>.

  3. Every test double is designed by the framework author. The person who wrote IClock also wrote FakeClock. The person who wrote ISagaStore also wrote InMemorySagaStore. The test double knows the interface's invariants, edge cases, and threading requirements because the same developer designed both sides.


Structural Consistency

Every pattern follows the same project structure. This is enforced by the monorepo's build system, not by documentation.

{Pattern}/
├── src/
│   ├── FrenchExDev.Net.{Pattern}/               # Core library (netstandard2.0 or net8.0+)
│   ├── FrenchExDev.Net.{Pattern}.Testing/        # Test doubles and assertions
│   └── FrenchExDev.Net.{Pattern}.{Platform}/     # Platform-specific extensions
├── tests/
│   └── FrenchExDev.Net.{Pattern}.Tests/          # Unit tests
└── README.md

Every pattern has a src/ directory with the core library and the .Testing library. Some patterns have a platform-specific extension (e.g., FrenchExDev.Net.Outbox.EntityFramework for EF Core integration). Every pattern has a tests/ directory with a single test project.

The naming convention is rigid:

  • Core library: FrenchExDev.Net.{Pattern} -- this is the NuGet package consumers install.
  • Testing library: FrenchExDev.Net.{Pattern}.Testing -- this is the NuGet package consumers install in test projects.
  • Platform extension: FrenchExDev.Net.{Pattern}.{Platform} -- this is an optional NuGet package for platform-specific integration.
  • Test project: FrenchExDev.Net.{Pattern}.Tests -- this is never published to NuGet.

This consistency has practical benefits:

  • Discoverability: If you know the pattern name, you know every package name. FrenchExDev.Net.Saga, FrenchExDev.Net.Saga.Testing, FrenchExDev.Net.Saga.Tests. No guessing.
  • Onboarding: A new developer who has worked with one pattern already knows the project layout of every other pattern. The src/ and tests/ directories are always in the same place. The .Testing package is always named the same way.
  • Automation: The CI pipeline discovers all pattern directories, builds them in parallel, and runs their tests. It does not need per-pattern configuration because every pattern follows the same structure.

The platform-specific extension directory is optional. Most patterns do not have one. The ones that do:

Pattern Platform Extension Purpose
Outbox FrenchExDev.Net.Outbox.EntityFramework EF Core SaveChangesInterceptor for transactional outbox
Saga FrenchExDev.Net.Saga.EntityFramework EF Core persistence for ISagaStore
Mapper FrenchExDev.Net.Mapper.SourceGenerator Roslyn source generator for [MapFrom]/[MapTo]
Injectable FrenchExDev.Net.Injectable.Microsoft.SourceGenerator Roslyn source generator for Microsoft DI
Injectable FrenchExDev.Net.Injectable.SimpleInjector.SourceGenerator Roslyn source generator for Simple Injector
Injectable FrenchExDev.Net.Injectable.DryIoc.SourceGenerator Roslyn source generator for DryIoc

The core library never depends on the platform extension. The platform extension depends on the core library. This keeps the dependency arrow pointing in the right direction: core is stable, extensions are volatile.

The consistency payoff

When every pattern follows the same structure, certain operations become mechanical:

Adding a new pattern. To add a tenth pattern (say, Retry), you create the same directory structure: Retry/src/FrenchExDev.Net.Retry/, Retry/src/FrenchExDev.Net.Retry.Testing/, Retry/tests/FrenchExDev.Net.Retry.Tests/. The build system picks it up automatically. The CI pipeline discovers it. The NuGet publishing script packages it. No manual configuration needed.

Finding the test double for any interface. If you need the test double for ISagaStore, you know it is in FrenchExDev.Net.Saga.Testing. If you need the test double for IEventStream<T>, you know it is in FrenchExDev.Net.Reactive.Testing. The naming convention eliminates searching.

Reviewing a pull request. When a PR modifies the Mapper pattern, the reviewer knows exactly where to look: Mapper/src/FrenchExDev.Net.Mapper/ for the change, Mapper/tests/FrenchExDev.Net.Mapper.Tests/ for the tests, and Mapper/src/FrenchExDev.Net.Mapper.Testing/ for any test double updates. The structure is the same as every other pattern the reviewer has seen.

Debugging a production issue. When a saga step fails in production, the developer knows the saga code lives in Saga/src/FrenchExDev.Net.Saga/, the saga store lives in Saga/src/FrenchExDev.Net.Saga.EntityFramework/, and the saga tests (which demonstrate correct usage) live in Saga/tests/FrenchExDev.Net.Saga.Tests/. No searching, no guessing.

Open generic support

One detail worth calling out in the structural story: the [Injectable] source generator handles open generic types. This matters for patterns like Mediator and Mapper, where handlers and mappers are generic:

[Injectable(Scope = Scope.Scoped)]
public sealed class Repository<T> : IRepository<T> where T : class, IAggregateRoot
{
    private readonly AppDbContext _db;

    public Repository(AppDbContext db) => _db = db;

    public async Task<Option<T>> FindByIdAsync(int id)
    {
        var entity = await _db.Set<T>().FindAsync(id);
        return entity is not null
            ? Option<T>.Some(entity)
            : Option<T>.None();
    }
}

The generator detects that Repository<T> is an open generic (it has unresolved type parameters) and emits typeof() registration instead of closed generic registration:

// Generated for open generic:
services.AddScoped(typeof(global::MyApp.IRepository<>), typeof(global::MyApp.Repository<>));

This single registration handles IRepository<Order>, IRepository<Customer>, IRepository<Product>, and every other aggregate root type. The DI container resolves the correct closed generic at runtime. The generator handles this transparently -- the developer just puts [Injectable] on the class, same as any other service.
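To see what the single open-generic registration does at runtime, here is a self-contained example against Microsoft.Extensions.DependencyInjection, with simplified stand-ins for IRepository<T> and Repository<T>:

```csharp
using Microsoft.Extensions.DependencyInjection;

// Simplified stand-ins for the repository example above.
public interface IRepository<T> where T : class { }
public sealed class Repository<T> : IRepository<T> where T : class { }

public sealed class Order { }
public sealed class Customer { }

public static class Program
{
    public static void Main()
    {
        var services = new ServiceCollection();

        // One open-generic registration, as the generator emits it.
        services.AddScoped(typeof(IRepository<>), typeof(Repository<>));

        using var provider = services.BuildServiceProvider();
        using var scope = provider.CreateScope();

        // The container closes the generic per requested type at runtime.
        var orders = scope.ServiceProvider.GetRequiredService<IRepository<Order>>();
        var customers = scope.ServiceProvider.GetRequiredService<IRepository<Customer>>();
    }
}
```

Both resolutions succeed from the one registration; no per-type configuration is ever written by hand.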


Design Decisions That Did Not Make the Cut

Not every idea survives contact with the five principles. Several features were proposed and rejected during the design of the framework. Understanding why they were rejected is as informative as understanding what was accepted.

A shared FrenchExDev.Net.Core package. The initial design had a single Core package that contained Result<T>, Option<T>, and common extensions. This was rejected because it violated Principle 1: a consumer who only needed Option<T> would also pull in Result<T> and the extensions. Splitting Result<T> and Option<T> into their own packages made the dependency graph sparser. The cost was two NuGet packages instead of one. The benefit was that consumers pay only for what they use.

Abstract base classes for handlers. An early version of the Mediator pattern had CommandHandler<TCommand, TResponse> as an abstract base class instead of ICommandHandler<TCommand, TResponse> as an interface. The base class provided convenience methods (Ok(), Fail()) and automatic logging. This was rejected because base classes create coupling: every handler inherits from the framework, and if the base class changes, every handler recompiles. Interfaces are cheaper to evolve. The convenience methods moved to extension methods on Result<T>, which any handler can use without inheriting anything.
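One possible shape for those extension methods; the Result<T> stand-in and the Ok/Fail signatures here are assumptions for illustration, not the framework's actual API:

```csharp
// Simplified stand-in for the framework's Result<T>.
public readonly struct Result<T>
{
    public bool IsSuccess { get; }
    public T? Value { get; }
    public string? Error { get; }
    private Result(bool ok, T? value, string? error) { IsSuccess = ok; Value = value; Error = error; }
    public static Result<T> Success(T value) => new(true, value, null);
    public static Result<T> Failure(string error) => new(false, default, error);
}

public static class ResultExtensions
{
    // Any handler can use these without inheriting from a framework base class.
    public static Result<T> Ok<T>(this T value) => Result<T>.Success(value);
    public static Result<T> Fail<T>(this string error) => Result<T>.Failure(error);
}

// Usage inside a hypothetical handler:
//   return user.Ok();
//   return "User not found".Fail<User>();
```

The convenience survives, but the coupling does not: a handler that never calls Ok() or Fail() carries no trace of them.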

Runtime decorator registration. An early version of the decorator system used runtime reflection to discover decorators: services.AddDecorators(Assembly.GetExecutingAssembly()). This was rejected because it violated Principle 3 (source generation over reflection). The [InjectableDecorator] attribute and its source-generated registration replaced the runtime scan with compile-time code generation.

A single AddFrenchExDev() method. The most frequently requested feature was a single extension method that registers all nine patterns at once. This was rejected because it violates Principle 1: calling AddFrenchExDev() would register patterns you do not use, and it would require a dependency from one package to all eight others. Instead, each assembly has its own Add{Assembly}Injectables() method, and you call the ones you need. If you use five patterns in five assemblies, you call five methods. That is five lines of code. It is not a burden.
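In practice the per-assembly calls look like this. The method and assembly names below are hypothetical stubs standing in for what the source generator would emit, following the Add{Assembly}Injectables() convention:

```csharp
using Microsoft.Extensions.DependencyInjection;

// Stubs standing in for source-generated per-assembly registration methods.
public static class OrdersInjectables
{
    public static IServiceCollection AddOrdersInjectables(this IServiceCollection s) => s; // generated registrations would go here
}

public static class BillingInjectables
{
    public static IServiceCollection AddBillingInjectables(this IServiceCollection s) => s;
}

public static class Program
{
    public static void Main()
    {
        var services = new ServiceCollection();

        // One explicit call per assembly you actually use -- no umbrella method.
        services.AddOrdersInjectables()
                .AddBillingInjectables();
    }
}
```

Each call registers only that assembly's [Injectable] types, so the container never sees a pattern you did not opt into.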

These rejections were not easy. Each rejected feature had genuine appeal. But each one violated a principle, and the principles are non-negotiable. That is what it means to have principles instead of guidelines.


What This Series Covers

This article covered the five design principles that shape every pattern in the framework. The next nine articles each focus on a single pattern: what problem it solves, how the API works, how to test it with the companion .Testing package, how it integrates with DI via [Injectable], and how it composes with the other patterns. The final article shows all nine patterns working together in one scenario.

Part II: The Option Pattern -- Representing the absence of a value without resorting to null. Option<T> as a sealed record with Some and None cases, exhaustive Match and Switch, and the full functional extension landscape: Map, Bind, Filter, Tap, OrDefault, OrElse, Zip, Contains. Async pipelines on Task<Option<T>>. LINQ query syntax. Collection extensions including Haskell-inspired Sequence and Traverse. Bidirectional conversion with Result<T>. Testing with OptionAssertions: ShouldBeSome, ShouldBeNone, ShouldBeSomeAnd.

Part III: The Union Pattern -- Discriminated unions in C# without language support. OneOf<T1,T2>, OneOf<T1,T2,T3>, and OneOf<T1,T2,T3,T4>: byte discriminator, private storage, implicit conversions, exhaustive Match and Switch, TryGet<T>, IEquatable. When to use Union vs Option vs Result. A real-world example modeling payment methods (CreditCard, BankTransfer, Crypto) as a closed set of types that the compiler enforces exhaustive handling for.

Part IV: The Guard Pattern -- Three prongs of defensive programming. Guard.Against throws exceptions at API boundaries for fail-fast semantics. Guard.ToResult returns Result<T> for functional pipelines that compose validation into chains. Guard.Ensure asserts internal invariants with InvalidOperationException. [CallerArgumentExpression] captures parameter names automatically on modern .NET. Every method returns the validated value for inline chaining. Fourteen built-in checks from Null to UndefinedEnum.

Part V: The Clock Pattern -- A deterministic time abstraction wrapping .NET 8+ TimeProvider with a domain-oriented API. SystemClock.Instance for production. FakeClock for testing with Advance(TimeSpan) and SetUtcNow(DateTimeOffset). Properties: UtcNow, Now(TimeZoneInfo), Today. Methods: CreateTimer, Delay. Why DateTime.UtcNow is a hidden dependency that makes tests non-deterministic and how to eliminate it with a single interface.

Part VI: The Mapper Pattern -- Source-generated, zero-reflection object mapping. IMapper<in TSource, out TTarget> with variance. Five attributes: [MapFrom], [MapTo], [MapProperty], [IgnoreMapping], [OneWay]. What the source generator emits. Three-layer mapping: Domain to DTO to Persistence Entity. Testing with MapperAssertions: ShouldMapTo, ShouldMapProperty, ShouldIgnoreProperty. A detailed comparison with AutoMapper: what you gain (compile-time safety, debuggability, zero startup cost) and what you lose (runtime configuration, convention-based mapping).

Part VII: The Mediator Pattern -- CQRS dispatch with IMediator, SendAsync, and PublishAsync. Marker interfaces: ICommand<T> and IQuery<T>. IBehavior<TRequest, TResult> pipeline middleware for cross-cutting concerns: logging, validation, authorization, transactions. INotification with three publish strategies: Sequential, Parallel, FireAndForget. Testing with FakeMediator: Setup, WasSent, WasPublished. A comparison with MediatR: what differs and why.

Part VIII: The Reactive Pattern -- Domain event streams wrapping System.Reactive with domain vocabulary. IEventStream<T> backed by Subject<T>. Ten operators: Filter, Map, Merge, Buffer, Throttle, DistinctUntilChanged, Take, Skip, OfType. The ObservableEventStream<T> adapter for full Rx interop. Testing with TestEventStream<T> that records every published event for later assertion.

Part IX: The Saga Pattern -- Multi-step orchestration with SagaOrchestrator<TContext> and ISagaStep<T>. The SagaContext state machine: Pending, Running, Completed, Compensating, Compensated, Failed. Forward execution, reverse-order compensation on failure. ISagaStore persistence with SagaInstance. Testing with InMemorySagaStore. A real-world example: order fulfillment with three steps (inventory, payment, shipping) and three matching compensations (release, refund, cancel).

Part X: The Outbox Pattern -- Solving the dual-write problem. OutboxMessage with retry tracking. EF Core OutboxInterceptor as a SaveChangesInterceptor that captures domain events from IHasDomainEvents entities. IOutboxProcessor for background publishing. The key guarantee: domain state and events are committed in the same database transaction, so you never have state without events or events without state. Testing with InMemoryOutbox.

Part XI: Composition -- All nine patterns working together in a single subscription renewal scenario. Guard validates at the API boundary. Option models the subscription lookup. Union models three renewal paths (standard, upgrade, downgrade). Mapper transforms between domain and DTO layers. Clock calculates expiration dates. Mediator dispatches through a behavior pipeline. Saga orchestrates payment and provisioning with compensation. Outbox guarantees the renewal event is published. Reactive streams feed analytics subscribers. The complete DI registration with [Injectable]. The integration test with all eight .Testing packages.


Let's start with the most fundamental pattern -- representing the absence of a value without resorting to null.
