I reviewed the original API and found several problems ranging from minor to serious. I have listed the more impactful ones here.
I prefer a more explicit separation of concerns between state and logic (background reading here), and the original was mixing these a lot.
For example, not only is the `Product` class a domain model, it is also a DTO, and is also used for database persistence.
This couples the domain model to both the database and the API, making it difficult to change one without affecting the others. The domain model should be the most stable part of the codebase, able to change without rippling into the rest of it.
One of the first code smells I noticed in the `Product` class is how it tracks itself with `IsNew`. Logically this is a property that should be tracked by a change tracker. A product has a name, description, price, and... the knowledge of whether it's new or not... and the ability to execute SQL queries.
As described above, one of my major concerns is the mixing of state and logic in the domain models. On a related note, they also have lots of inline SQL.
In my experience this is rarely a good long-term solution compared to using an ORM, SQL stored in `.sql` files, or even a library like Dapper.
Furthermore, some of the SQL is open to SQL injection attacks. For example:
```csharp
var conn = Helpers.NewConnection();
conn.Open();
var cmd = new SqlCommand($"delete from productoption where id = '{Id}'", conn);
cmd.ExecuteReader();
```
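The standard fix is to parameterise the query so the value can never be interpreted as SQL. A minimal sketch of the same delete, assuming the `Helpers.NewConnection()` helper and `Id` property from the original code (note it also uses `ExecuteNonQuery`, which is the appropriate call for a delete):

```csharp
// Parameterised version: @id is sent as data, not spliced into the SQL text.
using var conn = Helpers.NewConnection();
conn.Open();
using var cmd = new SqlCommand("delete from productoption where id = @id", conn);
cmd.Parameters.AddWithValue("@id", Id);
cmd.ExecuteNonQuery();
```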
The `Helpers` class contains a hardcoded connection string. Furthermore, it uses `HttpContext.Current.Server.MapPath("~/App_Data")`. This should be injected instead, with the value coming from `appsettings.json`. Even in the original version, this value should have been read from `Web.config`.
The original version shows no clear boundaries, patterns, or structure. In fact everything is dumped into `Models` and suffers from all of the previous problems.
The code does not have any tests; it is also in such a poor state that it can't really be tested without effectively rewriting it.
```shell
cd xero-interview-task
dotnet run --project .\src\Refactored.Api
```

Swagger should now be running.
From the original README it was clear that the original implementation was written poorly on purpose with the goal of refactoring (or building a new implementation) to improve the code quality.
After reviewing the code, the state of the project (for example, its use of .NET 4.5), and the numerous problems and technical debt described above, I decided to build a new implementation from scratch.
I built the new API with .NET 7.0/ASP.NET Core 7.0, so you'll need to ensure you have the latest .NET 7.0 SDK installed to run the project.
Being a fan of maintainable and testable code, as well as functional programming, clean code patterns, and dependency injection, I opted to use the following structure, popularised in DDD (Domain-Driven Design) circles.
Please note that for the sake of the time limit mentioned in the original README, I have skipped several important areas such as aggregate roots, unit tests, etc.
Later in this README I discuss how I would add tests.
The project is structured as follows:
- `Refactored.Api` - The API project. This is where all the parts come together, and is the binary that is run.
- `Refactored.Domain` - The domain project. This contains the domain models and business logic.
- `Refactored.Infrastructure` - The infrastructure project. This contains infrastructure code such as the database context, Entity Framework configuration, etc.
- `Refactored.Application` - The application project. This is the core containing all the actual business logic. This is an especially important part as by design it has no dependencies on the other projects, which allows the application to be reused anywhere.
I highly recommend watching this talk to understand the advantages and motivation behind this structure: https://www.youtube.com/watch?v=dK4Yb6-LxAk&t=1s
I opted for Entity Framework Core as the ORM. It has been some time since I used EF (having used EF 5 in the past), so I saw this as a good opportunity to get up to speed on its latest developments. Additionally, I wanted this interview task to run with little friction when it was reviewed, so I also used SQLite. The single-file database is conveniently in this repository, with the connection string pointing at it.
I found some surprising edge cases with SQLite and its EF driver, such as `decimal` and `Guid` being stored as `TEXT`. Furthermore, querying on primary keys that are a `Guid` is not supported (the query returns null). After reading through some open GitHub issues I found a fix for this, which involves `ToString()`; I suspect this leads to very sub-optimal generated SQL.
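One way of expressing that workaround is an explicit value converter, so the `Guid`/`TEXT` round-trip is declared once in configuration rather than scattered through queries. A minimal sketch, assuming a `Product` entity with a `Guid` key named `Id`:

```csharp
// Hypothetical sketch: store the Guid key as TEXT via an explicit value
// converter, so key comparisons happen on the string representation.
public void Configure(EntityTypeBuilder<Product> builder)
{
    builder.Property(product => product.Id)
        .HasConversion(
            guid => guid.ToString(),   // CLR type -> provider value (TEXT)
            text => Guid.Parse(text)); // provider value -> CLR type
}
```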
I prefer configuration over often-inflexible attributes, so I usually create EF configuration classes. For example, this is one of them.
```csharp
public class ProductOptionConfiguration : IEntityTypeConfiguration<ProductOption>
{
    public void Configure(EntityTypeBuilder<ProductOption> builder)
    {
        builder.Property(productOption => productOption.Name)
            .HasMaxLength(100)
            .IsRequired();

        builder.Property(productOption => productOption.Description)
            .HasMaxLength(1000)
            .IsRequired();
    }
}
```
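These configuration classes can then be discovered automatically rather than applied one by one. A sketch of the wiring, assuming a hypothetical `RefactoredDbContext`:

```csharp
public class RefactoredDbContext : DbContext
{
    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Scans the assembly for every IEntityTypeConfiguration<T>
        // implementation (such as ProductOptionConfiguration) and applies it.
        modelBuilder.ApplyConfigurationsFromAssembly(
            typeof(ProductOptionConfiguration).Assembly);

        base.OnModelCreating(modelBuilder);
    }
}
```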
Experience has shown me that auditability often becomes a requirement eventually: sometimes simple created/updated properties are adequate; sometimes an entire audit log is required for regulatory reasons.
I decided to showcase one way of adding simple auditing without needing to manually remember to set these values. I've created several variations of this in the past, but I recently found a very nice example of EF interception online that I decided to use here.
```csharp
public class AuditableEntitySaveChangesInterceptor : SaveChangesInterceptor
{
    private readonly IDateTime dateTime;

    public AuditableEntitySaveChangesInterceptor(IDateTime dateTime)
    {
        this.dateTime = dateTime;
    }

    public override InterceptionResult<int> SavingChanges(DbContextEventData eventData, InterceptionResult<int> result)
    {
        UpdateEntities(eventData.Context);
        return base.SavingChanges(eventData, result);
    }

    public override ValueTask<InterceptionResult<int>> SavingChangesAsync(DbContextEventData eventData, InterceptionResult<int> result, CancellationToken cancellationToken = default)
    {
        UpdateEntities(eventData.Context);
        return base.SavingChangesAsync(eventData, result, cancellationToken);
    }

    public void UpdateEntities(DbContext? context)
    {
        if (context == null) return;

        foreach (var entry in context.ChangeTracker.Entries<BaseAuditableEntity>())
        {
            if (entry.State == EntityState.Added)
            {
                entry.Entity.CreatedAt = this.dateTime.Now;
            }

            if (entry.State == EntityState.Added || entry.State == EntityState.Modified)
            {
                entry.Entity.UpdatedAt = this.dateTime.Now;
            }
        }
    }
}
```
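For the interceptor to fire, it has to be registered with the `DbContext`. A sketch of one way to wire it up through DI so it receives its `IDateTime` dependency (the context name and connection string variable are illustrative):

```csharp
services.AddScoped<AuditableEntitySaveChangesInterceptor>();

services.AddDbContext<RefactoredDbContext>((provider, options) =>
{
    options.UseSqlite(connectionString);
    // Resolve the interceptor from the container so its dependencies
    // (IDateTime) come from DI rather than being constructed manually.
    options.AddInterceptors(
        provider.GetRequiredService<AuditableEntitySaveChangesInterceptor>());
});
```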
As I'm using EF, migrations were simple to add. This allows for the database to be created and updated as the domain models change.
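For reference, the typical workflow with the `dotnet-ef` tool looks like this (the migration name and project path are illustrative, and the tool needs to be installed first):

```shell
# Create a migration from the current model, then apply it to the database
dotnet ef migrations add AddProductOption --project .\src\Refactored.Infrastructure
dotnet ef database update --project .\src\Refactored.Infrastructure
```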
As described previously, I used a functional-like approach to the project structure. This implies that the entities inside `Refactored.Domain` are pure and are not aware of concerns such as serialisation, deserialisation, and database persistence.
I created DTOs to achieve this. The general flow is: `API <-DTO-> Controller <-DTO-> Business Logic <-Entity-> Database`.
Rather than manually copying properties, I used AutoMapper.
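With AutoMapper, each mapping is declared once in a `Profile` and the property copying is handled for you. A minimal sketch, with hypothetical `Product`/`ProductDto` types:

```csharp
public class ProductProfile : Profile
{
    public ProductProfile()
    {
        // Maps matching properties by name; ReverseMap also allows
        // mapping from the DTO back to the entity.
        CreateMap<Product, ProductDto>().ReverseMap();
    }
}
```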
I use dependency injection to achieve loose coupling and testability. The composition root of each project is defined in its respective `ConfigureServices.cs`. Ultimately, these are then registered in the entry point of the API project. As a nice convenience, each composition root is expressed as an extension method, allowing this:
```csharp
public void ConfigureServices(IServiceCollection services)
{
    services.AddInfrastructureServices(Configuration);
    services.AddApplicationServices();
    ...
}
```
There were several features I decided to skip over that I would have liked to add if the README had a longer time limit.
- Updating Products and Product Options. I decided a better use of the time would be to showcase my approach to project structure, software design, and best practices.
- Validation. Although I planned on using FluentValidation (this is a really great library) I decided to skip this for the same reason as above.
- Unit tests. I would have liked to add unit tests to showcase my approach to testing. I have a small list of libraries I use for this:
  - https://nsubstitute.github.io/ - I prefer this over Moq as I find its approach more readable and intuitive.
  - https://github.com/shouldly/shouldly - A much nicer way of asserting via extension methods and other declarative patterns.
  - https://autofixture.github.io/ - A very powerful library that helps apply TDD principles and reduces setup noise, in a reusable way.
- I've mentioned it above and in code comments, but I'd have liked to take my usual approach of making error states more explicit with `Result<T,V>` type objects instead of throwing exceptions.
In addition to all of the above, there are some further patterns and design choices I usually follow that I would like to apply.
- As previously mentioned, I like functional programming as I believe it leads to better quality code. Rather than the common practice of throwing exceptions around for clearly non-exceptional situations, I would use a `Result` monad for cases such as a product not being found. I tend to use this library for that: https://github.com/vkhorikov/CSharpFunctionalExtensions
- Logging! Although I added some minor logging, my usual approach is to use Microsoft.Extensions.Logging alongside Serilog to achieve structured logging.
- I would have liked to add documentation to the code. I usually use DocFX to generate documentation from XML comments.
- When I first received the tech test I decided I would build a React-based SPA to interact with the API, but given the time limit I did not.
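As an illustration of the `Result`-based style mentioned above, here is a sketch using CSharpFunctionalExtensions (the repository, mapper, and DTO names are hypothetical):

```csharp
public async Task<Result<ProductDto>> GetProduct(Guid id)
{
    var product = await this.repository.Find(id);

    // "Not found" is an expected outcome, so it is modelled as a failed
    // Result rather than a thrown exception.
    return product == null
        ? Result.Failure<ProductDto>($"Product {id} not found")
        : Result.Success(this.mapper.Map<ProductDto>(product));
}
```

Callers then branch on `result.IsSuccess` (or chain with `Bind`/`Map`) instead of wrapping calls in try/catch.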