When I publish a Blazor Server app to Azure, Visual Studio shows a message that says:
Your application is making use of SignalR. For environments that need to scale we strongly recommend adding a dependency on Azure SignalR Service.
However, my app works just fine as it is, without Azure SignalR Service. So I was wondering whether it really makes sense to integrate it, or if it's just a way for Microsoft to squeeze a few extra dollars from our pockets...
Has anyone tried deploying a Blazor Server app with and without Azure SignalR Service to test whether there is any actual difference in performance? What kind of advantage should I expect from it?
SignalR is a must in Blazor Server. If you don't want that, switch to Blazor WebAssembly; then you can choose to use SignalR if you want, or just call plain controllers.
Azure SignalR Service simplifies the process of adding real-time web functionality to applications over HTTP. This real-time functionality allows the service to push content updates to connected clients, such as a single page web or mobile application.
Blazor Server. With the Blazor Server hosting model, the app is executed on the server from within an ASP.NET Core app. UI updates, event handling, and JavaScript calls are handled over a SignalR connection using the WebSockets protocol. The state on the server associated with each connected client is called a circuit.
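As a rough illustration of that plumbing, this is approximately what the default .NET 6/7 Blazor Server template wires up in Program.cs (a minimal sketch, not an exact copy of any particular template):

```csharp
// Minimal Blazor Server startup (ASP.NET Core 6/7 style Program.cs).
// MapBlazorHub() exposes the SignalR hub that carries UI updates and
// browser events for each circuit.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRazorPages();
builder.Services.AddServerSideBlazor();   // registers the circuit/SignalR infrastructure

var app = builder.Build();

app.UseStaticFiles();
app.UseRouting();

app.MapBlazorHub();                       // every connected browser holds a SignalR connection here
app.MapFallbackToPage("/_Host");          // host page name as in the default template

app.Run();
```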
There are a few variables in play here, so nobody can tell you "Above X clients, you need to use a SignalR service." Depending on how your solution is provisioned, one component or another may be the limiting factor.
For example, the App Service limits show the maximum number of WebSockets per Web App instance. For the Basic tier, it's 350. When you need 351, your options are to scale up to the Standard tier (which removes the per-instance WebSocket cap), scale out to more instances, or offload the connections to Azure SignalR Service.
After you go to the Standard service tier and scale out to multiple Web App instances, you can get pretty far hosting SignalR yourself. We've run over 5K concurrently connected clients this way with four Standard S3 instances. Four is a misleading number because we needed the horsepower for other portions of our app, not just SignalR.
Hosting SignalR yourself imposes some constraints, and there are various creative ways you can hang yourself. For example, with SignalR on .NET Core you're required to use ARR affinity in a multi-instance environment. That sucks. And I once implemented tight-polling reconnect after a connection was closed from the front end. It was fun when our servers went down for over two minutes, came back up, and we had a few thousand web browsers tight-polling to reconnect. And in a Standard-tier Web App, it's really hard to get a handle on just what percentage of memory and CPU the WebSocket connections are consuming.
So after all of this, the answer is "it depends on a lot of things." Having done this both ways, I'd go ahead and use the SignalR service.
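For what it's worth, switching a Blazor Server app over to the service is a small code change. A minimal sketch, assuming the Microsoft.Azure.SignalR NuGet package is installed and the connection string is available in configuration (Azure:SignalR:ConnectionString):

```csharp
// Sketch: route the Blazor Server SignalR traffic through Azure SignalR Service.
// AddAzureSignalR() picks up the connection string from configuration
// ("Azure:SignalR:ConnectionString") or the corresponding environment variable.
builder.Services.AddServerSideBlazor();
builder.Services.AddSignalR().AddAzureSignalR(options =>
{
    // Blazor Server keeps circuit state on a specific server instance, so the
    // guidance is to pin each client to the server that owns its circuit.
    options.ServerStickyMode = Microsoft.Azure.SignalR.ServerStickyMode.Required;
});

// The rest of the pipeline (MapBlazorHub, etc.) stays the same; clients now
// connect to the service endpoint instead of directly to your Web App instances.
```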
I know this is an old question, but I wanted to add some valuable information regarding costs.
The cost of Azure SignalR Service can increase drastically and quickly. Messages are counted in 2 KB increments, so a 10 KB message counts as 5 messages from a billing perspective.
With the free tier you get up to 20 concurrent connections and 20k messages per day. The standard tier allows 1,000 concurrent connections per unit and includes 1 million messages per day per unit; each unit is $49 a month, and anything over that is billed at $1 per additional million messages.
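To put those numbers together, here is some purely illustrative arithmetic (the traffic figures are made up; the rates are the ones quoted above):

```csharp
// Hypothetical traffic against the quoted standard-tier rates:
// billed in 2 KB chunks, $49/month per unit, 1M included messages/day/unit,
// $1 per extra million messages.
int payloadBytes      = 10_000;                                    // one 10 KB message
int billedPerMessage  = (int)Math.Ceiling(payloadBytes / 2048.0);  // -> 5 billable messages

long messagesPerDay   = 200_000;                                   // made-up app traffic
long billedPerDay     = messagesPerDay * billedPerMessage;         // 1,000,000 billed messages/day

int units             = 1;
long includedPerDay   = units * 1_000_000L;
long overagePerDay    = Math.Max(0L, billedPerDay - includedPerDay);

decimal monthlyCost = units * 49m + overagePerDay * 30 / 1_000_000m; // overage at $1/million

Console.WriteLine($"~${monthlyCost}/month");  // $49 here, but doubling the traffic adds ~$30/month in overage
```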
This may not seem like a lot, but I have seen a service accrue more than $3,000 worth of SignalR service in just 7 days.