.NET 8: Handle Long-Running Tasks in Applications

Sometimes an application needs to run a prolonged task in the background. The usual answer today is to hand the work off to a cloud service, but that means extra deployment and maintenance effort. In this article, CodeLink shows a straightforward way to handle long-running tasks inside a .NET 8 application itself.

We recently built a .NET 8 web API running on AWS Fargate behind an AWS API Gateway. One endpoint triggers a task that synchronizes data between internal services. The process initially took up to 10 minutes per request, and as our data volume grows, it may run even longer.

With processing times that long, requests risk hitting the AWS API Gateway's 30-second timeout. To address this, we implemented a 'fire and forget' mechanism: the endpoint triggers the task and immediately returns the trigger result, without waiting for the task to finish.
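
To make the shape of this concrete, here is a minimal sketch of such an endpoint. The controller name, route, and the dispatch call are illustrative; the rest of this article is about what that dispatch should look like.

[ApiController]
[Route("api/[controller]")]
public class SyncController : ControllerBase
{
    [HttpPost("trigger")]
    public IActionResult TriggerSync()
    {
        // Hand the long-running work off to a background mechanism
        // (the options discussed below), without awaiting it here.
        // _dispatcher.StartSync();  // hypothetical dispatch call

        // 202 Accepted: the request was received and is being processed.
        return Accepted(new { status = "Sync started" });
    }
}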

The task runs only on release days to keep data in sync across our services. At the customer's request, we avoided introducing new infrastructure, which would have meant a lengthy review process; the existing infrastructure handles the task without operational issues.

Available options:

  1. Use Task.Run.
  2. Use an external library.
  3. Use Background tasks.

Use Task.Run

Task.Run was the first idea that came to mind when the request arrived. Further discussion, however, made it clear that this would be a genuinely long-running task, and Task.Run consumes ThreadPool threads, which can hurt the application's performance once many long-running tasks are in flight.

public void DoSomeThing()
{
    // Fire and forget: the returned Task is never awaited or observed.
    Task.Run(() =>
    {
        LongRunningTask();
    });
}

Error handling inside Task.Run also deserves care. On a previous project, we lost considerable time debugging a production issue because an exception thrown inside Task.Run was never handled; since nothing awaited or observed the task, the error was silently swallowed.
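
If you do go with Task.Run, the lesson we took away is to handle and log errors inside the delegate itself, because nothing outside ever awaits the task. A minimal sketch, assuming an injected _logger:

public void DoSomeThing()
{
    Task.Run(() =>
    {
        try
        {
            LongRunningTask();
        }
        catch (Exception ex)
        {
            // Without this, the exception lives on an unobserved Task and is effectively lost.
            _logger.LogError(ex, "Long-running task failed");
        }
    });
}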

Another consideration: when you discard the Task returned by an async method, you must manage the lifetime of the injected services yourself. Those services may be disposed of once the request completes, even if your task is still running.

private readonly IMyService _myService;

[HttpPost]
public void StartLongRunningTask()
{
    // The returned Task is discarded, so the request completes immediately
    // while ExecuteAsync keeps running in the background.
    _ = _myService.ExecuteAsync();
}

For example, suppose IMyService depends on transient services. Once the application returns the response, those services are disposed of because the request scope is considered complete.

Use an external library

Libraries such as Hangfire or Quartz.NET are a good fit for scheduled or persistent background jobs, but for a simple data synchronization task that needs no delays, batches, or continuations, an external library is more than we need. A simpler approach is sufficient here.

Use Background tasks

After exploring the options, we decided to make the long-running task part of the application's life cycle and run it in the background. The idea is a straightforward queue: requests enqueue work items, and a background worker dequeues and processes them, so simultaneous requests are executed sequentially. A SemaphoreSlim signals the worker whenever a new item arrives, so it never spins on an empty queue.

public interface IBackgroundTaskQueue
{
    void QueueBackgroundWorkItem(Func<CancellationToken, ValueTask> workItem);
    Task<Func<CancellationToken, ValueTask>?> DequeueAsync(CancellationToken cancellationToken);
}

public class BackgroundTaskQueue : IBackgroundTaskQueue
{
    private readonly ConcurrentQueue<Func<CancellationToken, ValueTask>> _workItems = new();

    // Starts at 0: the worker blocks in DequeueAsync until something is enqueued.
    private readonly SemaphoreSlim _signal = new(0);

    public void QueueBackgroundWorkItem(Func<CancellationToken, ValueTask> workItem)
    {
        ArgumentNullException.ThrowIfNull(workItem);

        _workItems.Enqueue(workItem);

        // Wake up the worker: one release per enqueued item.
        _signal.Release();
    }

    public async Task<Func<CancellationToken, ValueTask>?> DequeueAsync(CancellationToken cancellationToken)
    {
        // Wait until at least one work item is available (or the host is shutting down).
        await _signal.WaitAsync(cancellationToken);
        _workItems.TryDequeue(out var workItem);

        return workItem;
    }
}
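
As a side note, if you would rather not manage the semaphore yourself, the System.Threading.Channels API that ships with .NET can back the same interface; the channel takes care of the signalling internally. A sketch of that variant (add a using for System.Threading.Channels):

public class ChannelBackgroundTaskQueue : IBackgroundTaskQueue
{
    private readonly Channel<Func<CancellationToken, ValueTask>> _channel =
        Channel.CreateUnbounded<Func<CancellationToken, ValueTask>>();

    public void QueueBackgroundWorkItem(Func<CancellationToken, ValueTask> workItem)
    {
        ArgumentNullException.ThrowIfNull(workItem);

        // TryWrite always succeeds on an unbounded channel.
        _channel.Writer.TryWrite(workItem);
    }

    public async Task<Func<CancellationToken, ValueTask>?> DequeueAsync(CancellationToken cancellationToken)
    {
        // Completes as soon as an item is available, honouring cancellation.
        return await _channel.Reader.ReadAsync(cancellationToken);
    }
}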

Now, we will leverage Microsoft.Extensions.Hosting.BackgroundService to implement the background task. This service will consume the queue that we created above and execute the task from the queue.

public class RetryPolicy
{
    public int MaxRetries { get; set; } = 3;
    public TimeSpan DelayBetweenRetries { get; set; } = TimeSpan.FromSeconds(5);
}

public class BackgroundTaskService(
    IBackgroundTaskQueue backgroundTaskQueue,
    ILogger<BackgroundTaskService> logger,
    RetryPolicy retryPolicy)
    : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // Blocks here until a work item is enqueued.
            var workItem = await backgroundTaskQueue.DequeueAsync(stoppingToken);

            if (workItem == null) continue;

            var retryCount = 0;

            while (retryCount < retryPolicy.MaxRetries)
            {
                try
                {
                    await workItem(stoppingToken);
                    break;
                }
                catch (Exception ex)
                {
                    retryCount++;
                    logger.LogWarning(ex, "Work item failed (attempt {Attempt} of {MaxRetries})",
                        retryCount, retryPolicy.MaxRetries);

                    if (retryCount < retryPolicy.MaxRetries)
                    {
                        await Task.Delay(retryPolicy.DelayBetweenRetries, stoppingToken);
                    }
                    else
                    {
                        logger.LogError(ex, "Work item failed after {MaxRetries} attempts, giving up",
                            retryPolicy.MaxRetries);
                    }
                }
            }
        }
    }
}

We added a basic retry policy that allows customization of the maximum number of retries and delays between attempts.

To wire this up, register the services in your dependency injection setup, for example in Program.cs via builder.Services, or in ConfigureServices if your project uses a Startup class.

services.AddHostedService<BackgroundTaskService>();
services.AddSingleton<IBackgroundTaskQueue, BackgroundTaskQueue>();
services.AddSingleton(new RetryPolicy
{
    MaxRetries = 3,
    DelayBetweenRetries = TimeSpan.FromSeconds(5)
});

The setup for the background task is now complete. Inject IBackgroundTaskQueue into your service to enqueue tasks. If a task needs scoped or transient services, also inject IServiceScopeFactory and create a dedicated scope for the task.

public void AddOperation()
{
    backgroundTaskQueue.QueueBackgroundWorkItem(async _ => { await Execute(); });
}

private async Task Execute()
{
    try
    {
        // Create a scope owned by the task, so its services are not tied to the request.
        using var scope = serviceScopeFactory.CreateScope();
        var myService = scope.ServiceProvider.GetRequiredService<IMyService>();

        await myService.ExecuteAsync();
    }
    catch (Exception ex)
    {
        logger.LogError(ex, "Error while working on the operation");
    }
}
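
For completeness, here is roughly how the pieces fit together at the endpoint. The enclosing service name (SyncOperationService, the class containing AddOperation above) and the route are illustrative and assumed to be registered in DI, but the flow matches what we ended up with: enqueue the operation and return immediately.

[ApiController]
[Route("api/[controller]")]
public class SyncController(SyncOperationService syncOperationService) : ControllerBase
{
    [HttpPost("start")]
    public IActionResult Start()
    {
        // Enqueue the work and return right away; the gateway gets its response
        // well within the timeout while BackgroundTaskService does the real work.
        syncOperationService.AddOperation();

        return Accepted(new { status = "Synchronization queued" });
    }
}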

Summary

In conclusion, CodeLink has demonstrated a streamlined way to manage long-running tasks in a .NET 8 application without adding new infrastructure. Our web API on AWS Fargate synchronizes data between internal services through a 'fire and forget' endpoint: work is enqueued via IBackgroundTaskQueue, a hosted BackgroundService processes it with a configurable retry policy, and IServiceScopeFactory provides a dedicated scope for any transient services a task needs.

Happy coding!
