How many web services?

Tags: c#, asp.net, azure

I have a web service that looks like this:

public class TheService : System.Web.Services.WebService
{
  [WebMethod(EnableSession = true)]
  public string GetData(string Param1, string Param2) { ... }
}

In other words, the service is contained in a single class with one public method, plus a private method that reads from the database.

The issue I'm facing is scalability. I'm building a web app that should support 1,000 daily users, each making about 300-500 calls a day to the web service, so roughly 300,000 to 500,000 requests per day. I need to add 9 more calls to the web service, and some of these calls will involve database writes.

My question is this: am I better off creating 9 separate web services, or should I continue with the one service I have and add the other methods? Or maybe something different and better entirely. I'm planning to deploy the application on Azure, so I'm not really concerned about hardware, just the application side of things.

frenchie asked Dec 17 '22 07:12

2 Answers

I wouldn't base this decision on volume, or on performance/scalability reasons. You won't get much, if any, performance benefit from keeping the methods lumped together or from separating them. Any grouping or filtering that can be done with the services grouped one way can also be done with them grouped the other way. The ability to partition between servers will be the same, too.

Design

Instead, I would focus on making your code understandable and maintainable. Group your services how they make the most sense architecturally within your program, from a problem-domain perspective (as opposed to a solution-domain perspective).
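For example, domain-based grouping might split methods across service classes like this (the class and method names here are purely illustrative, not from your code; the method bodies are elided the same way as in your question):

```csharp
// Hypothetical grouping by problem domain rather than by call count:
public class OrderService : System.Web.Services.WebService
{
    [WebMethod(EnableSession = true)]
    public string GetOrder(string orderId) { ... }

    [WebMethod(EnableSession = true)]
    public string SaveOrder(string orderData) { ... }  // a write-path method
}

public class AccountService : System.Web.Services.WebService
{
    [WebMethod(EnableSession = true)]
    public string GetAccount(string accountId) { ... }
}
```

The point isn't the specific split; it's that each service class should read as one coherent concept to someone browsing the code.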

Since you're free to group them how you want, I recommend you read up on SOLID, which is a set of guiding principles for creating software architecture.

One principle that is particularly relevant here is the Interface Segregation Principle, which holds that "many client specific interfaces are better than one general purpose interface."
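In C# terms, ISP might look like the following sketch (the interface and method names are made up for illustration):

```csharp
// One general-purpose interface forces every client to depend on
// methods it never calls:
public interface IDataService
{
    string GetData(string param1, string param2);
    void WriteData(string data);
    string GetReport(string reportId);
}

// Client-specific interfaces keep each consumer coupled only to
// what it actually uses:
public interface IDataReader
{
    string GetData(string param1, string param2);
}

public interface IDataWriter
{
    void WriteData(string data);
}
```

A read-only client then takes an `IDataReader` and is unaffected when the write side changes.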

Performance and scalability

Since you mentioned performance and scalability being a concern, I recommend you follow this plan:

  • Determine how long you can go until you can next patch/maintain the software
  • Determine your expected load, both average and peak requests-per-time (you've determined the average), and how much you expect this traffic to grow over the period you can go without patching/maintaining the software
  • Create a model describing exactly which calls will be made and in which ratios (per time and per server)
  • Create automation that mirrors these models as closely as you can. Model both average and peak traffic, and push beyond your highest expected load
  • Profile your code, DB, network traffic, and disk traffic while running this automation
  • Identify the bottlenecks and decide whether they are within acceptable tolerance
  • Optimize your bottlenecks as required, then repeat from the profiling step
  • For the next release of your software, repeat from the top to add scenarios/load/automation
  • Perform regression testing using your existing tests, altered to fit the new scale
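To put numbers on the "expected load" step: 500,000 requests spread over 86,400 seconds is only about 5.8 requests/second on average, but real traffic won't be uniform, so your model should stress well above that. A very rough load-generation sketch (the endpoint URL, request counts, and concurrency level are all assumptions you'd replace with values from your own model):

```csharp
// Crude load-generation sketch for the profiling steps above.
// Average from the question: ~500,000 req / 86,400 s ≈ 5.8 req/s;
// drive peak traffic well past that.
using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

class LoadSketch
{
    static async Task Main()
    {
        var client = new HttpClient();
        int totalRequests = 1000;  // scale this up toward and past peak load
        int concurrency = 20;      // simultaneous simulated callers

        var sw = Stopwatch.StartNew();
        var workers = new Task[concurrency];
        for (int w = 0; w < concurrency; w++)
        {
            workers[w] = Task.Run(async () =>
            {
                for (int i = 0; i < totalRequests / concurrency; i++)
                {
                    // Hypothetical endpoint; point this at your real service.
                    await client.GetAsync(
                        "http://localhost/TheService.asmx/GetData?Param1=a&Param2=b");
                }
            });
        }
        await Task.WhenAll(workers);
        sw.Stop();
        Console.WriteLine($"{totalRequests} requests in {sw.Elapsed.TotalSeconds:F1}s");
    }
}
```

While this runs, profile the server side (CPU, DB, network, disk) rather than trusting the client-side timing alone.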
Merlyn Morgan-Graham answered Dec 29 '22 17:12


Splitting the web methods into several web services won't help you here; load balancing will.

Icarus answered Dec 29 '22 17:12