

The body of the GetUsers method posts the ids as JSON and deserializes the returned users:

    var response = await client.PostAsync(
        "users/ids",   // endpoint address is an assumption
        new StringContent(JsonConvert.SerializeObject(ids), Encoding.UTF8, "application/json"));
    var users = JsonConvert.DeserializeObject<IEnumerable<UserDto>>(await response.Content.ReadAsStringAsync());
    return users;

Notice that the endpoint for getting multiple users is a POST. This is because the payload we send can be big and might not fit in a query string, so it is good practice to use POST in such a case. Code that fetches users in batches in parallel looks like this:

    public async Task<IEnumerable<UserDto>> GetUsersInParallelInWithBatches(IEnumerable<int> userIds)
    {
        const int batchSize = 100; // example value – tune it for your case
        var batches = userIds
            .Select((id, index) => new { id, index })
            .GroupBy(x => x.index / batchSize, x => x.id);
        var tasks = batches.Select(batch => client.GetUsers(batch));
        var results = await Task.WhenAll(tasks);
        return results.SelectMany(users => users);
    }
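For completeness, here is a minimal sketch of what the server side of such a batch endpoint could look like. The controller shape and the IUserRepository abstraction are assumptions, not the post's actual implementation:

```csharp
[ApiController]
[Route("users")]
public class UsersController : ControllerBase
{
    // hypothetical data-access abstraction over the MSSQL database
    private readonly IUserRepository repository;

    public UsersController(IUserRepository repository) => this.repository = repository;

    // POST users/ids – ids travel in the JSON body,
    // so the payload is not limited by query-string length
    [HttpPost("ids")]
    public async Task<ActionResult<IEnumerable<UserDto>>> GetUsers([FromBody] IEnumerable<int> ids)
    {
        var users = await repository.GetUsersByIds(ids); // one round-trip for the whole batch
        return Ok(users);
    }
}
```

Fetching the whole batch in one database query is what makes this endpoint pay off: the per-request overhead is paid once instead of once per user.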
Asynchronous means that requests will not block the main thread, which can go further with the execution. But if you look at how the requests are executed in time, you will see that they still run one after another. Running in parallel is the key here, because you can make many requests in the same time that a single request takes. The code can look like this:

    public async Task<IEnumerable<UserDto>> GetUsersInParallel(IEnumerable<int> userIds)
    {
        var tasks = userIds.Select(id => client.GetUser(id));
        var users = await Task.WhenAll(tasks);
        return users;
    }

WhenAll is a beautiful creation that waits for tasks with the same result type and returns a list of their results. A drawback here is exception handling: when something goes wrong you will get an AggregateException with possibly multiple inner exceptions, but you will not know which task caused them. This is way better than before, but it's still not impressive. The thing that slows down the process is thread handling: executing 1000 requests at the same time will try to create or utilize 1000 threads, and managing them has a cost.

Let's run requests in parallel, but smarter. The idea here is to do parallel requests, but not all at the same time:

    public async Task<IEnumerable<UserDto>> GetUsersInParallelFixed(IEnumerable<int> userIds)
    {
        const int batchSize = 100; // example value – tune it for your case
        var users = new List<UserDto>();
        int numberOfBatches = (int)Math.Ceiling((double)userIds.Count() / batchSize);
        for (int i = 0; i < numberOfBatches; i++)
        {
            var currentIds = userIds.Skip(i * batchSize).Take(batchSize);
            var tasks = currentIds.Select(id => client.GetUser(id));
            users.AddRange(await Task.WhenAll(tasks));
        }
        return users;
    }

This gives a slightly better result, because the framework needs to handle fewer threads at the same time and is therefore more effective. You can manipulate the batch size and figure out what works best for you.

The proper solution needs some modifications in the API. You won't always have the ability to change the API you are calling, but changes on both sides can get you even further. It is not effective to fetch users one by one when we need to fetch thousands of them, so to further enhance performance we need to create an endpoint specific to our use case – in this case, fetching many users at once. The new endpoint shows up in Swagger, and the client code for fetching users starts like this:

    public async Task<IEnumerable<UserDto>> GetUsers(IEnumerable<int> ids)
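One way to soften the AggregateException drawback of Task.WhenAll is to keep the task-to-id pairing yourself and inspect the faulted tasks afterwards. This is a sketch, not the post's code; it assumes the GetUser client method shown earlier:

```csharp
// pair each id with its task so failures can be traced back
var tasks = userIds
    .Select(id => new { Id = id, Task = client.GetUser(id) })
    .ToList();
try
{
    await Task.WhenAll(tasks.Select(t => t.Task));
}
catch
{
    // await surfaces only the first exception; check every task to find all culprits
    foreach (var t in tasks.Where(t => t.Task.IsFaulted))
    {
        Console.WriteLine($"Request for user {t.Id} failed: {t.Task.Exception?.InnerException?.Message}");
    }
}
```

The trick is that Task.WhenAll never loses the individual tasks: each one still carries its own Exception property, so materializing the list before awaiting lets you map every failure back to the id that caused it.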
I wrapped a single call in a UsersClient class:

    public class UsersClient
    {
        private readonly HttpClient client = new HttpClient();

        public async Task<UserDto> GetUser(int id)
        {
            // the address is an assumption – point it at your API
            var response = await client.GetAsync($"users/{id}");
            var user = JsonConvert.DeserializeObject<UserDto>(await response.Content.ReadAsStringAsync());
            return user;
        }
    }

Asynchronous programming in C# is very simple: you just use the async / await keywords in your methods and magic happens.

    public async Task<IEnumerable<UserDto>> GetUsersSynchronously(IEnumerable<int> userIds)
    {
        var users = new List<UserDto>();
        foreach (var id in userIds)
        {
            users.Add(await client.GetUser(id));
        }
        return users;
    }

Fetching users one by one like this turns out to be the slowest approach. This is because although it is asynchronous programming, it doesn't mean the requests are done in parallel.
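To actually compare the approaches, each method can be timed with a Stopwatch. A minimal sketch, assuming the method names used above:

```csharp
using System.Diagnostics;

// time one approach; swap in GetUsersInParallel etc. to compare
var stopwatch = Stopwatch.StartNew();
var users = await client.GetUsersSynchronously(userIds);
stopwatch.Stop();
Console.WriteLine($"Fetched {users.Count()} users in {stopwatch.ElapsedMilliseconds} ms");
```

Running the same measurement against each of the four methods with the same 1000 ids gives a like-for-like comparison.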

So the task here is to write a method that calls this endpoint and fetches 1000 users by their ids as fast as possible.
Here is my post on how to build an app and deploy it to Azure:

And a post about a custom data source in Application Insights:

The API fetches users from a plain old MSSQL database. An ASP.NET Core app can be quickly deployed and tested in a real hosting environment: I deployed it to Azure using App Services and it was ready for testing in less than two hours. I was also able to debug it remotely and check its work in Application Insights.

I want to make 1000 requests! How can I make them really fast? Let's have a look at four approaches and compare their speed. In order to test the different methods of handling requests, I created a very simple ASP.NET Core API that returns a user by his id.
