So, there I was, reviewing a pull request from one of our junior devs. Bless their heart, they were building a fantastic feature, but the code... let's just say it was a bit... enthusiastic. No generic services, DRY principles were taking a vacation, and the whole thing felt like a house of cards waiting to collapse.
That's when it hit me: we need a guide. A simple, straightforward, "here's how you build a real .NET Core Web API" kind of guide.
This blog post is that guide.
If you're a fresher, or someone with zero knowledge about .NET Core Web APIs, and want to learn how to set up a robust, maintainable, and scalable project, you've come to the right place. We'll cover everything from project setup to essential concepts, data access, messaging, security, and even deployment.
By the end of this post, you'll have a solid foundation for building your own .NET Core Web APIs with confidence. Let's dive in!
## Project Setup
First, make sure you have the .NET SDK installed. You can download it from the official .NET website.
Once you have the SDK, you can create a new .NET Core Web API project using the following command in your terminal:
```bash
dotnet new webapi -o MyWebApi
cd MyWebApi
```
This will create a new project with a basic structure. Let's take a look at the key files and folders:
- Program.cs: This is the entry point of the application. It configures the application's services and middleware.
- appsettings.json: This file contains configuration settings for the application, such as connection strings and API keys.
- Controllers: This folder contains the API controllers, which handle incoming requests and return responses.
- Properties: This folder contains the launchSettings.json file, which configures how the application is launched in development.
To keep our project organized, let's create the following folders:
- Services: This folder will contain the business logic of the application.
- Models: This folder will contain the data models.
- DTOs: This folder will contain the Data Transfer Objects (DTOs).
- Middleware: This folder will contain custom middleware components.
Your project structure should now look like this:
```text
MyWebApi/
├── Controllers/
├── Services/
├── Models/
├── DTOs/
├── Middleware/
├── appsettings.json
├── Program.cs
└── MyWebApi.csproj
```
## Essential Concepts
Now that we have our project set up, let's dive into some essential concepts that are crucial for building a robust and maintainable .NET Core Web API.
### Dependency Injection (DI)
Dependency Injection (DI) is a design pattern that allows us to develop loosely coupled code. In .NET Core, DI is a first-class citizen, and it's heavily used throughout the framework.
To configure DI, we register our services with the `IServiceCollection` in the `Program.cs` file. Here's an example:

```csharp
builder.Services.AddTransient<IMyService, MyService>();
```
This registers the `MyService` class as a transient service, meaning a new instance is created every time it's requested.
There are three main service lifetimes:
- Transient: A new instance is created every time the service is requested.
- Scoped: One instance is created per HTTP request.
- Singleton: A single instance is shared for the lifetime of the application.
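To make the difference concrete, here's how all three lifetimes are registered side by side. The interfaces and classes are illustrative placeholders, not types from this project:

```csharp
// Transient: a new instance per resolution — good for lightweight, stateless helpers.
builder.Services.AddTransient<IEmailFormatter, EmailFormatter>();

// Scoped: one instance per HTTP request — the usual choice for services
// that depend on a DbContext (which is itself registered as scoped).
builder.Services.AddScoped<IProductService, ProductService>();

// Singleton: one instance for the entire application — must be thread-safe.
builder.Services.AddSingleton<IClock, SystemClock>();
```

A common pitfall: never inject a scoped service (such as a `DbContext`) into a singleton, or you'll capture a single request's instance forever.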
### Global Usings
Global using directives allow you to import namespaces globally across your project, reducing the need to add using statements to individual files.
To use global using directives, create a file named `GlobalUsings.cs` in your project and add the following code:

```csharp
global using System;
global using System.Collections.Generic;
global using System.Linq;
```
### Models
Models represent the data entities in your application. They are typically simple classes with properties that map to database columns.
Here's an example of a model class:
```csharp
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public decimal Price { get; set; }
}
```
### Data Transfer Objects (DTOs)
Data Transfer Objects (DTOs) are used to transfer data between the API and the client. They help to decouple the API from the data model, allowing you to change the data model without affecting the API.
Here's an example of a DTO class:
```csharp
public class ProductDTO
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}
```
### AutoMapper
AutoMapper is a library that simplifies the process of mapping objects from one type to another. It can be used to map model classes to DTO classes, and vice versa.
To configure AutoMapper, you need to create a mapping profile:
```csharp
public class MappingProfile : Profile
{
    public MappingProfile()
    {
        CreateMap<Product, ProductDTO>();
        CreateMap<ProductDTO, Product>();
    }
}
```
Then, register AutoMapper in your `Program.cs` file:

```csharp
builder.Services.AddAutoMapper(typeof(MappingProfile));
```
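Once registered, you can inject `IMapper` anywhere and convert between the two types. A quick sketch (`ProductReader` is an illustrative class, not part of the project structure above):

```csharp
public class ProductReader
{
    private readonly IMapper _mapper;

    public ProductReader(IMapper mapper) => _mapper = mapper;

    public ProductDTO ToDto(Product product)
    {
        // Uses the CreateMap<Product, ProductDTO>() configuration from MappingProfile;
        // properties missing from the DTO (like Description) are simply dropped.
        return _mapper.Map<ProductDTO>(product);
    }
}
```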
## Data Access Layer
The data access layer is responsible for interacting with the database. We'll use the Generic Repository and Unit of Work patterns to create a flexible and testable data access layer.
### Generic Repository Pattern
The Generic Repository pattern provides an abstraction over the data access logic, allowing you to easily switch between different data sources without modifying the rest of the application.
First, let's define a generic repository interface:
```csharp
public interface IGenericRepository<T> where T : class
{
    Task<T> GetByIdAsync(int id);
    Task<IEnumerable<T>> GetAllAsync();
    Task<T> AddAsync(T entity);
    Task UpdateAsync(T entity);
    Task DeleteAsync(T entity);
}
```
Next, let's create a generic repository implementation:
```csharp
public class GenericRepository<T> : IGenericRepository<T> where T : class
{
    private readonly AppDbContext _context;

    public GenericRepository(AppDbContext context)
    {
        _context = context;
    }

    public async Task<T> GetByIdAsync(int id)
    {
        return await _context.Set<T>().FindAsync(id);
    }

    public async Task<IEnumerable<T>> GetAllAsync()
    {
        return await _context.Set<T>().ToListAsync();
    }

    // Note: the repository deliberately does not call SaveChangesAsync.
    // Persisting is deferred to the Unit of Work (introduced next), so that
    // multiple operations can be committed together in one transaction.
    public async Task<T> AddAsync(T entity)
    {
        await _context.Set<T>().AddAsync(entity);
        return entity;
    }

    public Task UpdateAsync(T entity)
    {
        _context.Set<T>().Update(entity);
        return Task.CompletedTask;
    }

    public Task DeleteAsync(T entity)
    {
        _context.Set<T>().Remove(entity);
        return Task.CompletedTask;
    }
}
```
### Unit of Work Pattern
The Unit of Work pattern provides a way to group multiple database operations into a single transaction. This ensures that all operations are either committed or rolled back together, maintaining data consistency.
First, let's define a unit of work interface:
```csharp
public interface IUnitOfWork : IDisposable
{
    IGenericRepository<Product> Products { get; }
    Task<int> CompleteAsync();
}
```
Then, let's create a unit of work implementation:
```csharp
public class UnitOfWork : IUnitOfWork
{
    private readonly AppDbContext _context;

    public UnitOfWork(AppDbContext context)
    {
        _context = context;
        Products = new GenericRepository<Product>(_context);
    }

    public IGenericRepository<Product> Products { get; private set; }

    public async Task<int> CompleteAsync()
    {
        return await _context.SaveChangesAsync();
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}
```
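To see the pattern in action, here's a minimal sketch of a service that groups several operations behind the unit of work (`CatalogSeeder` is an illustrative name, not part of the project above):

```csharp
public class CatalogSeeder
{
    private readonly IUnitOfWork _unitOfWork;

    public CatalogSeeder(IUnitOfWork unitOfWork) => _unitOfWork = unitOfWork;

    public async Task SeedAsync()
    {
        // Both inserts are tracked by the same underlying DbContext...
        await _unitOfWork.Products.AddAsync(new Product { Name = "Keyboard", Price = 49.99m });
        await _unitOfWork.Products.AddAsync(new Product { Name = "Mouse", Price = 19.99m });

        // ...and CompleteAsync flushes any changes still pending on that
        // shared context in a single SaveChangesAsync call.
        await _unitOfWork.CompleteAsync();
    }
}
```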
### Base Service
Now, let's create a base service that uses the generic repository and unit of work:
```csharp
public interface IBaseService<T, TDTO> where T : class
{
    Task<IEnumerable<TDTO>> GetAllAsync();
    Task<TDTO> GetByIdAsync(int id);
    Task<TDTO> AddAsync(TDTO dto);
    Task UpdateAsync(int id, TDTO dto);
    Task DeleteAsync(int id);
}
```
```csharp
public class BaseService<T, TDTO> : IBaseService<T, TDTO> where T : class
{
    private readonly IGenericRepository<T> _repository;
    private readonly IUnitOfWork _unitOfWork;
    private readonly IMapper _mapper;

    public BaseService(IUnitOfWork unitOfWork, IMapper mapper)
    {
        _unitOfWork = unitOfWork;
        _mapper = mapper;
        // Our unit of work only exposes a Products repository, so resolve it for T here
        // (it matches when T is Product). A fuller IUnitOfWork would expose a generic
        // Repository<T>() accessor instead.
        _repository = _unitOfWork.Products as IGenericRepository<T>
            ?? throw new InvalidOperationException($"No repository available for {typeof(T).Name}.");
    }

    public async Task<IEnumerable<TDTO>> GetAllAsync()
    {
        var entities = await _repository.GetAllAsync();
        return _mapper.Map<IEnumerable<TDTO>>(entities);
    }

    public async Task<TDTO> GetByIdAsync(int id)
    {
        var entity = await _repository.GetByIdAsync(id);
        return _mapper.Map<TDTO>(entity);
    }

    public async Task<TDTO> AddAsync(TDTO dto)
    {
        var model = _mapper.Map<T>(dto);
        await _repository.AddAsync(model);
        await _unitOfWork.CompleteAsync();
        // Map the saved entity back so database-generated values (such as the Id)
        // reach the caller, instead of echoing the input DTO.
        return _mapper.Map<TDTO>(model);
    }

    public async Task UpdateAsync(int id, TDTO dto)
    {
        var model = await _repository.GetByIdAsync(id);
        _mapper.Map(dto, model);
        await _unitOfWork.CompleteAsync();
    }

    public async Task DeleteAsync(int id)
    {
        var entity = await _repository.GetByIdAsync(id);
        await _repository.DeleteAsync(entity);
        await _unitOfWork.CompleteAsync();
    }
}
```
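None of this resolves until the data-access pieces are registered in `Program.cs`. A sketch of the wiring, assuming an EF Core `AppDbContext` with the SQL Server provider (`Microsoft.EntityFrameworkCore.SqlServer`) and a connection string named `Default` in `appsettings.json` — adjust both to your setup:

```csharp
// DbContext is scoped by default: one instance per HTTP request.
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseSqlServer(builder.Configuration.GetConnectionString("Default")));

// Open-generic registration covers IGenericRepository<Product>, IGenericRepository<Order>, etc.
builder.Services.AddScoped(typeof(IGenericRepository<>), typeof(GenericRepository<>));

builder.Services.AddScoped<IUnitOfWork, UnitOfWork>();
builder.Services.AddScoped<IProductService, ProductService>();
```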
## Asynchronous Messaging
Asynchronous messaging lets different parts of your application communicate without blocking one another, which can improve performance and scalability.
We'll use Azure Service Bus and MassTransit to implement asynchronous messaging in our project.
### Azure Service Bus
Azure Service Bus is a fully managed enterprise integration message broker. It can be used to decouple applications and services.
To configure Azure Service Bus, you need to create a Service Bus namespace in the Azure portal and obtain a connection string.
Then, add the following NuGet package to your project:
```bash
Install-Package Azure.Messaging.ServiceBus
```
### MassTransit
MassTransit is a free, open-source, lightweight message bus for .NET. It provides a simple and easy-to-use API for sending and receiving messages.
To configure MassTransit, add the following NuGet packages to your project:
```bash
Install-Package MassTransit
Install-Package MassTransit.Azure.ServiceBus.Core
Install-Package MassTransit.Newtonsoft
```
Then, configure MassTransit in your `Program.cs` file:

```csharp
builder.Services.AddMassTransit(x =>
{
    x.UsingAzureServiceBus((context, cfg) =>
    {
        cfg.Host("your_service_bus_connection_string");

        cfg.ReceiveEndpoint("my_queue", e =>
        {
            e.Consumer<MyConsumer>();
        });
    });
});
```
This configures MassTransit to use Azure Service Bus as the transport and registers a consumer for the `my_queue` queue.
### Defining Messages, Consumers, and Publishers
To define a message, create a simple class:
```csharp
public class MyMessage
{
    public string Text { get; set; }
}
```
To define a consumer, create a class that implements the `IConsumer<T>` interface:
```csharp
public class MyConsumer : IConsumer<MyMessage>
{
    public async Task Consume(ConsumeContext<MyMessage> context)
    {
        Console.WriteLine($"Received message: {context.Message.Text}");
    }
}
```
To publish a message, inject the `IPublishEndpoint` interface:
```csharp
public class MyService
{
    private readonly IPublishEndpoint _publishEndpoint;

    public MyService(IPublishEndpoint publishEndpoint)
    {
        _publishEndpoint = publishEndpoint;
    }

    public async Task SendMessage(string text)
    {
        await _publishEndpoint.Publish(new MyMessage { Text = text });
    }
}
```
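Wiring this into an HTTP endpoint is straightforward. A hedged sketch — the controller name and route are assumptions, not part of the project so far:

```csharp
[ApiController]
[Route("api/[controller]")]
public class MessagesController : ControllerBase
{
    private readonly IPublishEndpoint _publishEndpoint;

    public MessagesController(IPublishEndpoint publishEndpoint)
    {
        _publishEndpoint = publishEndpoint;
    }

    [HttpPost]
    public async Task<IActionResult> Post([FromBody] string text)
    {
        // Publish routes by message type: every consumer subscribed to
        // MyMessage (here, MyConsumer on my_queue) will receive a copy.
        await _publishEndpoint.Publish(new MyMessage { Text = text });

        // 202 Accepted fits fire-and-forget semantics: the work happens later.
        return Accepted();
    }
}
```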
## Filtering and Sorting
Filtering and sorting are essential features for any API that returns a list of data. They allow clients to easily find and order the data they need.
We'll use Sieve to implement filtering and sorting in our project.
### Sieve
Sieve is a library that provides a simple and flexible way to implement filtering, sorting, and pagination in ASP.NET Core APIs.
To configure Sieve, add the following NuGet package to your project:
```bash
Install-Package Sieve
```
Then, register Sieve in your `Program.cs` file:
```csharp
builder.Services.AddScoped<ISieveProcessor, SieveProcessor>();
```
To use Sieve in your API endpoints, inject the `ISieveProcessor` interface and use the `Apply` method to apply filtering and sorting to your data:
```csharp
public class ProductsController : ControllerBase
{
    private readonly ISieveProcessor _sieveProcessor;
    private readonly IProductService _productService;

    public ProductsController(ISieveProcessor sieveProcessor, IProductService productService)
    {
        _sieveProcessor = sieveProcessor;
        _productService = productService;
    }

    [HttpGet]
    public async Task<IActionResult> Get([FromQuery] SieveModel sieveModel)
    {
        var products = await _productService.GetAllAsync();
        var filteredProducts = _sieveProcessor.Apply(sieveModel, products.AsQueryable()).ToList();
        return Ok(filteredProducts);
    }
}
```
This code applies filtering and sorting to the `products` collection based on the values in the `sieveModel` object.
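With Sieve's default operator syntax, clients drive all of this from the query string. A few illustrative requests against the endpoint above (property names come from the `Product` model; each one must be marked filterable/sortable for the filter to apply):

```text
GET /api/products?sorts=-Price                    sort by Price, descending
GET /api/products?filters=Name@=phone             Name contains "phone"
GET /api/products?filters=Name==Laptop&sorts=Name&page=1&pageSize=10
```

Here `==` is Sieve's equality operator, `@=` means "contains", a leading `-` reverses the sort, and `page`/`pageSize` come from `SieveModel` for pagination.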
To enable filtering and sorting for specific properties, you can use the `[Sieve]` attribute in your model classes:
```csharp
public class Product
{
    public int Id { get; set; }

    [Sieve(CanFilter = true, CanSort = true)]
    public string Name { get; set; }

    public string Description { get; set; }
    public decimal Price { get; set; }
}
```
## Security
Security is a critical aspect of any API. We'll use JWT (JSON Web Token) authentication to secure our API endpoints.
### JWT Authentication
JWT Authentication is a stateless authentication mechanism that uses JSON Web Tokens (JWTs) to verify the identity of users.
To configure JWT authentication, add the following NuGet package to your project:
```bash
Install-Package Microsoft.AspNetCore.Authentication.JwtBearer
```
Then, configure JWT authentication in your `Program.cs` file:

```csharp
builder.Services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(options =>
    {
        options.Authority = "https://your-auth-provider.com";
        options.Audience = "your-api-audience";
    });
```

This configures JWT bearer authentication against the specified authority and audience. On .NET 6, also make sure `app.UseAuthentication();` appears before `app.UseAuthorization();` in the pipeline, otherwise the `[Authorize]` attribute below will reject every request.
To protect API endpoints, use the `[Authorize]` attribute:
```csharp
[Authorize]
[HttpGet]
public async Task<IActionResult> Get()
{
    // Only authenticated users can access this endpoint
    return Ok();
}
```
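Inside a protected endpoint you can read the caller's identity from the `User` property, which the JWT bearer middleware populates from the validated token. A small sketch (the `me` route is an assumption; which claims exist depends on your auth provider):

```csharp
[Authorize]
[HttpGet("me")]
public IActionResult Me()
{
    // ClaimsPrincipal built from the token's claims by the bearer middleware.
    var userId = User.FindFirst(ClaimTypes.NameIdentifier)?.Value;
    var name = User.Identity?.Name;

    return Ok(new { userId, name });
}
```

This requires `using System.Security.Claims;` for `ClaimTypes`.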
## Middleware
Middleware components are executed in the request pipeline and can be used to perform various tasks, such as logging, exception handling, and authentication.
To create custom middleware, you need to create a class that implements the `IMiddleware` interface or follows a specific convention.
### Exception Handling Middleware
Exception handling middleware can be used to catch unhandled exceptions and return appropriate error responses.
Here's an example of exception handling middleware:
```csharp
public class ExceptionMiddleware : IMiddleware
{
    private readonly ILogger<ExceptionMiddleware> _logger;

    public ExceptionMiddleware(ILogger<ExceptionMiddleware> logger)
    {
        _logger = logger;
    }

    public async Task InvokeAsync(HttpContext context, RequestDelegate next)
    {
        try
        {
            await next(context);
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "An unhandled exception occurred.");

            context.Response.StatusCode = 500;
            context.Response.ContentType = "application/json";

            var errorResponse = new
            {
                message = "An unhandled exception occurred.",
                // Use the request's trace identifier so clients can correlate
                // the error response with server-side logs.
                traceId = context.TraceIdentifier
            };

            // System.Text.Json ships with ASP.NET Core; no extra package needed.
            await context.Response.WriteAsync(JsonSerializer.Serialize(errorResponse));
        }
    }
}
```
To register the middleware, add the following code to your `Program.cs` file:

```csharp
builder.Services.AddTransient<ExceptionMiddleware>();
// ...
app.UseMiddleware<ExceptionMiddleware>();
```
### Logging Middleware
Logging middleware can be used to log request and response information for debugging and monitoring.
Here's an example of logging middleware:
```csharp
public class LoggingMiddleware : IMiddleware
{
    private readonly ILogger<LoggingMiddleware> _logger;

    public LoggingMiddleware(ILogger<LoggingMiddleware> logger)
    {
        _logger = logger;
    }

    public async Task InvokeAsync(HttpContext context, RequestDelegate next)
    {
        // Structured logging templates keep Method/Path/StatusCode queryable
        // as separate fields instead of baking them into the message string.
        _logger.LogInformation("Request: {Method} {Path}", context.Request.Method, context.Request.Path);
        await next(context);
        _logger.LogInformation("Response: {StatusCode}", context.Response.StatusCode);
    }
}
```
To register the middleware, add the following code to your `Program.cs` file:

```csharp
builder.Services.AddTransient<LoggingMiddleware>();
// ...
app.UseMiddleware<LoggingMiddleware>();
```
## Health Checks
Health checks are used to monitor the health of your application. They let orchestrators and load balancers detect problems and take action, such as restarting or removing an unhealthy instance.
Basic health checks ship with ASP.NET Core itself, so no extra package is needed. Configure them in your `Program.cs` file:

```csharp
builder.Services.AddHealthChecks();

app.UseHealthChecks("/health");
```
This adds health checks to the application and exposes a health check endpoint at `/health`.
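You can also register custom checks. A minimal sketch with a trivial liveness check plus an illustrative dependency check (the check names and the 1 GB threshold are arbitrary placeholders):

```csharp
builder.Services.AddHealthChecks()
    // Trivial liveness check: if this runs at all, the process is up.
    .AddCheck("self", () => HealthCheckResult.Healthy())
    // Illustrative resource check that degrades instead of failing outright.
    .AddCheck("disk-space", () =>
    {
        var root = Path.GetPathRoot(Environment.CurrentDirectory)!;
        var freeBytes = new DriveInfo(root).AvailableFreeSpace;

        return freeBytes > 1_000_000_000
            ? HealthCheckResult.Healthy("Plenty of disk space.")
            : HealthCheckResult.Degraded("Less than 1 GB free.");
    });
```

The `/health` endpoint aggregates all registered checks into a single status.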
## Dockerization
Docker is a platform for building, shipping, and running applications in containers. Containers provide a consistent and isolated environment for your application, making it easy to deploy and scale.
To dockerize your .NET Core Web API, you need to create a Dockerfile in the project root directory. Here's an example Dockerfile:
```dockerfile
FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS base
WORKDIR /app
EXPOSE 80

FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
WORKDIR /src
COPY ["MyWebApi.csproj", "."]
RUN dotnet restore "./MyWebApi.csproj"
COPY . .
RUN dotnet build "MyWebApi.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "MyWebApi.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "MyWebApi.dll"]
```
This Dockerfile defines the steps for building and running your application in a container.
To build the Docker image, run the following command in your terminal:
```bash
docker build -t mywebapi .
```
To run the Docker container, run the following command:
```bash
docker run -d -p 8080:80 mywebapi
```
This will run the Docker container in detached mode and map port 8080 on your host machine to port 80 on the container.
## CI/CD with GitHub Workflow
CI/CD (Continuous Integration/Continuous Deployment) is a set of practices that automate the build, test, and deployment process. We'll use GitHub Actions to set up a CI/CD pipeline for our .NET Core Web API.
To create a GitHub workflow, add a file named `.github/workflows/main.yml` to your repository. Here's an example workflow file:
```yaml
name: CI/CD

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      - name: Setup .NET Core
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '6.0.x'

      - name: Install dependencies
        run: dotnet restore

      - name: Build
        run: dotnet build --configuration Release

      - name: Test
        run: dotnet test --configuration Release

      - name: Publish
        run: dotnet publish -c Release -o /tmp/publish

      - name: Deploy to Azure App Service
        uses: azure/webapps-deploy@v2
        with:
          app-name: your-app-name
          slot-name: production
          publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE }}
          package: /tmp/publish
```
This workflow defines the steps for building, testing, and deploying your application to Azure App Service.
To configure the workflow, you need to:

- Replace `your-app-name` with the name of your Azure App Service.
- Add a secret named `AZURE_WEBAPP_PUBLISH_PROFILE` to your GitHub repository containing the publish profile for your Azure App Service.
## Example Controller with CRUD Operations
To demonstrate how to use the concepts we've covered, let's create a simple controller for managing products.
First, create a new controller class named `ProductsController.cs`:
```csharp
[Route("api/[controller]")]
[ApiController]
public class ProductsController : ControllerBase
{
    private readonly IProductService _productService;
    private readonly ISieveProcessor _sieveProcessor;

    public ProductsController(IProductService productService, ISieveProcessor sieveProcessor)
    {
        _productService = productService;
        _sieveProcessor = sieveProcessor;
    }

    [HttpGet]
    public async Task<IActionResult> Get([FromQuery] SieveModel sieveModel)
    {
        var products = await _productService.GetAllAsync();
        var filteredProducts = _sieveProcessor.Apply(sieveModel, products.AsQueryable()).ToList();
        return Ok(filteredProducts);
    }

    [HttpGet("{id}")]
    public async Task<IActionResult> Get(int id)
    {
        var product = await _productService.GetByIdAsync(id);
        if (product == null)
        {
            return NotFound();
        }
        return Ok(product);
    }

    [HttpPost]
    public async Task<IActionResult> Post([FromBody] ProductDTO productDto)
    {
        var product = await _productService.AddAsync(productDto);
        return CreatedAtAction(nameof(Get), new { id = product.Id }, product);
    }

    [HttpPut("{id}")]
    public async Task<IActionResult> Put(int id, [FromBody] ProductDTO productDto)
    {
        await _productService.UpdateAsync(id, productDto);
        return NoContent();
    }

    [HttpDelete("{id}")]
    public async Task<IActionResult> Delete(int id)
    {
        await _productService.DeleteAsync(id);
        return NoContent();
    }
}
```
This controller implements the basic CRUD operations for managing products. It uses the `IProductService` interface to access the business logic and the `ISieveProcessor` interface to apply filtering and sorting.
To create the `IProductService` interface and implementation, add a new file named `IProductService.cs` in the `Services/Interface` directory:
```csharp
public interface IProductService : IBaseService<Product, ProductDTO>
{
}
```
Then, create a new file named `ProductService.cs` in the `Services/Implementation` directory:
```csharp
public class ProductService : BaseService<Product, ProductDTO>, IProductService
{
    public ProductService(IUnitOfWork unitOfWork, IMapper mapper) : base(unitOfWork, mapper)
    {
    }
}
```
This implements the `IProductService` interface by inheriting the base service class.
## Conclusion
Congratulations! You've now learned how to set up a robust .NET Core Web API project with best practices. We've covered everything from project setup to essential concepts, data access, messaging, security, and deployment.
Remember, this is just a starting point. There's always more to learn and explore. I encourage you to experiment with the code, try out different patterns, and dive deeper into the topics that interest you most.
Here are some resources that you may find helpful:
- .NET Documentation
- ASP.NET Core Documentation
- AutoMapper Documentation
- MassTransit Documentation
- Sieve Documentation
Happy coding!
Top comments (10)
Couple of things:
Overall, much more decent than the majority.
Still, I must clobber the use of Entity Framework here.
Why Entity Framework is Bad for ASP.Net
The DbContext class, as I already mentioned, is a Unit of Work implementation. This contradicts the general contract of RESTful API's.
In a RESTful API, we don't do multi-entity modifications, and Unit of Work is all about transacting multiple entity modifications. It should therefore be clear that the whole UoW thing that EF brings to the table is wasted code.
Furthermore, the UoW features collide more often than not with REST implementations. We end up using tricks like `AsNoTracking()` to bail out of UoW stuff like entity tracking.

In short, Dapper is 1000x better than EF for a RESTful HTTP server.
I think you're mixing up REST design with how databases work. In a proper REST API, the resources you expose represent business concepts, not individual database tables. Microsoft even recommends not designing APIs that mirror your database structure and you can check that here: learn.microsoft.com/en-us/azure/ar....
For example, creating an order through /api/orders might touch multiple tables: the Orders table, the OrderLines table, maybe even the Customers table to update a balance. That’s normal, and it's exactly where EF’s Unit of Work pattern is useful. It helps make sure that all these changes happen in a single transaction. If one part fails, nothing gets saved. That’s not wasted code, that’s protecting your data.
As for AsNoTracking(), it's not a hack or workaround. It's just a way to improve performance when you're only reading data and don’t need tracking. EF gives you the option to choose when you want change tracking and when you don’t.
Dapper is great for certain scenarios, but saying EF is bad for REST just isn’t true. It depends on what you're building.
Hello! Good points. Indeed, if you work that way, I do see some better use of the UoW pattern that EF implements. Still, it is not enough. I'll elaborate:
In a scenario where you have a REST application whose entity is multi-table, you claim that EF's UoW implementation is good. I'll go as far and say that yes, I agree that we must ensure the health of the data, and UoW does that.
But, do you know that I can accomplish that with `TransactionScope` and just use Dapper sequentially, as if it were separate entities? Like 95+% of the time, this is all I need. Why suffer the pains of the full EF just to avoid a `using` block that defines a `TransactionScope` object?

Will I need in a REST server that the DbContext tracks individually-retrieved entities? No. (Give me more EF features to see if they are needed, lol! I can't think of more since I never use EF.)
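(For readers following along, the pattern described here might look like the sketch below. The connection string, table names, and `Order`/`OrderLine` types are illustrative placeholders, and Dapper's `ExecuteAsync` is assumed.)

```csharp
using System.Transactions;
using Dapper;
using Microsoft.Data.SqlClient;

public async Task CreateOrderAsync(Order order, IEnumerable<OrderLine> lines)
{
    // AsyncFlowOption.Enabled lets the ambient transaction flow across awaits.
    using var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled);
    await using var connection = new SqlConnection("<connection-string>");

    await connection.ExecuteAsync(
        "INSERT INTO Orders (Id, CustomerId) VALUES (@Id, @CustomerId)", order);

    // Dapper runs the statement once per element when given a sequence.
    await connection.ExecuteAsync(
        "INSERT INTO OrderLines (OrderId, Sku, Qty) VALUES (@OrderId, @Sku, @Qty)", lines);

    // Commit; if Complete() is never called, disposal rolls everything back.
    scope.Complete();
}
```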
Then there's another issue: One must retrieve the entity with EF before being able to update it (don't remember why right now, but that's what my head is telling me). I can, on the other hand, avoid this round-trip with Dapper + SQL.
I have had 2 instances in production code where an included property generates an endless loop in serialization, crashing the HTTP server. The endpoint, needless to say, did not need the property, but the model included it and so EF fetched it.
Etc. I bet that if I put my head to work on it, I can come up with more nasty little things EF does that REST servers don't need.
As for your argument for `AsNoTracking()`: it is an optimization until you have to use it or your back-end code throws. I have only ever had to use it to avoid the exception of reading the same entity twice with the same context.

So all in all, EF, in the best case scenario, is just overkill. Worst case scenario, it's your personal nightmare.
I don’t disagree that you can do most of this with Dapper, just like you could with raw SQL. And to be honest, I really like Dapper too. It gives you a lot of control and performance.
That said, if you look at the example below you’ll see how EF can be more maintainable in projects with large, complex entities. It needs less code, handles relationships, cascades and transactions out of the box, and you don’t have to manually maintain SQL scripts, which can get messy and harder to trace (e.g., where a property is used, how joins are handled, etc.). Of course you have to pay it with some configuration or attributes on your data models.
“You must retrieve an entity before updating it.”
You don’t, actually. EF supports detached updates and manual state setting. You can attach an entity and mark it as modified.
“I’ve had issues with EF loading too much data.”
That’s not EF’s fault that’s a modelling or projection issue. With proper DTOs or Select() projections, you control exactly what’s fetched and returned. Things like ProjectTo<> help here too.
I’m not saying EF is better than Dapper. They’re just different tools. EF has been through years of production use, has improved a lot over those years, and if it helps you keep your codebase healthier and easier to evolve, there’s no reason not to use it.
example with EF:
and the same example in dapper:
LOL! Why did my comment come out with some random big text in the middle??? Apologies for that. I did not preview it. Hehe.
I'll be back with you... Paid work is in the middle ATM.
The Dapper Code Sample
The above is your code, but corrected and enhanced. Dapper, when given an array of objects, runs the SQL statement once per array item. Therefore, there's no need for the `foreach` loop.

The method to commit the transaction is `Complete()`, not `Commit()`. At least that's what I remember. Sorry, I've been deep to the neck with front-end and TS code lately.

The `TransactionScope` class auto-rollbacks on disposal unless `Complete()` has been called, in which case it auto-commits. This means that you don't need the `try..catch` blocks.

After these modifications, both your EF-based example and this amended Dapper example contain 17 lines of code. I am assuming you were trying to prove that using Dapper was more verbose. This should disprove it.
The SQL Argument
I personally prefer to fine-tune complex queries in code, but I must grant you, LINQ-to-SQL is cool.
Well, it seems that it is now possible with Dapper, via Dommel! Dommel claims to provide this (or an equivalent) capability. I haven't had the opportunity to try it out. I can't wait!
Extra Round Trips to the Database with EF
I remembered the case: imagine an entity that has at least one property that can only be updated according to its current value. An example is the `Status` property of a workflow item (like a Jira ticket), where you can't go to "In Progress" unless the current status is "Defined".

With EF, one MUST read the item (meaning one full round trip to the database), validate the update request to the `Status` property, then respond accordingly (400 BAD REQUEST or 200 OK). Am I mistaken in thinking that this costs an extra round trip, no way out? Don't give me the raw SQL solution in EF; if I wanted raw SQL I would have used Dapper. As far as I can tell, unless I write the SQL myself to do the validation and saving in one round trip to the database, EF requires an extra round trip. If you can give me a solution that avoids the round trip while not using raw SQL, I'll concede and will be happy, as I will have learned a new thing.

EF Loads Too Much Data

Sure, you can say it's the developer's fault for not knowing how to properly project with EF. No doubt. But do you know that more than likely the projection error would not have happened with good ol' SQL and Dapper? The complexities added by EF need to be "worth it". These complexities around configuration and avoiding pitfalls come with very little (not to say zero) gratification.
My Conclusion
I am a bit more strongly swayed to say that Dapper is definitely better than EF for RESTful services. Like I am "this close" to declare EF enemy #1 of REST, hehe.
This is because there is not one feature EF offers that I sigh for while writing REST. Seriously. Zero nostalgia.
You're right about the Dapper example, I actually generated it with GPT and didn’t double-check it properly. Your version is definitely more succinct.
That said, I still find the EF example easier to maintain over time. I really like the IDE support when working with strongly-typed models compared to dealing with raw SQL strings.
Also, I hadn't heard of Dommel; it looks cool, thanks for sharing that!
Extra Round Trips to the Database
Since EF Core 7.0, you can use ExecuteUpdate() to avoid that round trip:
learn.microsoft.com/en-us/ef/core/...
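(For reference, the EF Core 7+ API mentioned here lets the conditional update from the ticket example run as one statement. The `dbContext`/`WorkItems` names are illustrative:)

```csharp
// Single round trip: the WHERE clause enforces the status precondition,
// and the update only applies to rows where it holds.
var updated = await dbContext.WorkItems
    .Where(w => w.Id == id && w.Status == "Defined")
    .ExecuteUpdateAsync(s => s.SetProperty(w => w.Status, "In Progress"));

// updated == 0 means the precondition failed, so respond 400 Bad Request;
// otherwise respond 200 OK.
```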
EF Loading Too Much Data
I think that kind of projection issue is less about EF itself and more about how familiar the developer is with the tool. Just like someone inexperienced with EF might write inefficient queries, the same can happen with Dapper and SQL.
"This is because there is not one feature EF offers that I sigh for while writing REST. Seriously. Zero nostalgia."
That's fair: if you don't miss anything from EF while building REST APIs, then Dapper clearly fits your use case well.

Personally, I do find some of EF's features helpful, especially in more complex domains. Things like relationship handling, cascade rules, automatic transactions, model validation, and built-in concurrency support can reduce boilerplate and improve maintainability. And I also prefer the IDE support when navigating models versus scanning through SQL strings, especially when the project grows.
That said, I’m not saying EF is better than Dapper. They’re just different tools with different trade-offs. In some projects, I’ve used both for example, EF Core for writes and Dapper for reads in a CQRS setup.
This view is pretty aligned with what the Dapper team says themselves:
learndapper.com/dapper-vs-entity-f...
At the end of the day, it just depends on your needs. Use whatever helps you keep your codebase clean and maintainable.
Correct. It is a matter of skill.
Also correct, but it is incomplete. You are saying this:
What's wrong with this train of thought? It assumes that EF can always write efficient queries for you. Your train of thought is assuming or implying that people that use EF are exempt from the SQL problem. They are not because EF CAN produce bad SQL too, so you, as a developer that uses EF, also need to know SQL. Your argument is invalid because it stands on a false premise.
So, having demonstrated that SQL skill is needed regardless of your choice between Dapper and EF, that argument can be discarded as "background noise" that is present in all scenarios. In other words: "No matter your ORM, the developer must know SQL".
This takes us back to the fact that EF still comes with EXTRA complexities not found in Dapper, which is the point of my argument: To properly project in EF requires extra skills. This extra effort should have a payoff. There's none. Why, then, incur the extra effort? This is where people start listing EF features, most of which I can bounce back to them.
About ExecuteUpdate(): it's a good addition. I read the docs, and I cannot help but think that you're writing way, way more code with lambdas. The example there:
This, to produce:
I don't know. The idea of lambda expressions is to "free" people from writing SQL, but this is starting to look scarier than SQL.
Anyway, other than that, good that now there's a way to skip the round trip.
Yes, I agree. I like EF for Windows Forms and WPF. EF really shines there. I can use most of the features there, especially with entity tracking and data binding. For REST, Dapper.
Yeah, you're totally right that you still need to know SQL even if you're using EF. I'd never recommend someone use it without understanding how SQL works under the hood. So I get the point you're making: it's not like EF magically saves you from knowing the database layer.
And honestly, I kind of agree with the spirit of your reply. EF is a tool with its own set of tradeoffs, just like anything else. You have to weigh them before deciding to use it. For me, I've found more upsides than you mentioned. For example, in a new project I can move really fast: set up Identity quickly, use DbSets directly without writing generic repos, scaffold the database if needed, generate migrations, and have a decent structure from the start.
So for me, when I weigh the tradeoffs, EF often helps me get up and running quicker and keeps things maintainable. I totally understand if someone feels differently, though. I just thought the original comment came off a bit too harsh, like EF has no place in REST APIs at all. That felt extreme, since it actually can be a good fit depending on the project and team.
At the end of the day, I think we’re mostly on the same page. Use whatever works best for your situation.
You CAN make EF work in REST. You will have to learn tricks and things to overcome issues. If you know EF already, you might feel this is no chore. If you don't, you'll definitely feel it.
If your existing project is tied to EF and is too big to bail out, go with it. It will work. If you're starting up, know that EF has a significantly steeper learning curve without sufficient benefits in the REST world. Most benefits can be reaped in WPF or WinForms with property data binding, a concept not needed in REST. <--- This is my stance, in a nutshell.
Yes, DbSet implements the Repository pattern. No surprise that you don't need a generic Repository implementation since DbSet already fulfills it.