
Suggestions on posting huge string to web service

Below is my array; each record has the following shape:

var.child.Cars1 = { name: null, operation: 0, selected: false }

In the array above, the selected property represents the checked/unchecked state of a checkbox, and I am posting the array to a web service (WCF) as a string using JSON.stringify.

The array contains 2,000–4,000 records, and the user can check/uncheck the checkboxes.

Now suppose there are 4,000 records in the array, of which 2,000 are checked and 2,000 are unchecked. In my web service I process only the checked records, so I remove the records whose selected value is false.

With 4,000 records the JSON string is huge, and because of that I get an error from the web service end:

  Error: (413) Request Entity Too Large

The reason I am not filtering out the records with selected set to false on the client is that I thought it would create a lot of overhead in the browser and could even hang it, so right now I do the filtering on the server side.

So my question is: should I filter out the records with selected set to false on the client side and post only the 2,000 checked records, or is what I am doing the right way?

My concern is that posting such a huge JSON string takes time, but filtering out the records with selected set to false will also put a lot of load on the browser.

So I am not sure whether I am doing this right or wrong.

Can anybody please guide me on this?

asked Mar 04 '17 by Learning-Overthinker-Confused


2 Answers

Generally, if you start running into large-request-size issues like this, it is a good opportunity to look for ways to optimize rather than override. Many suggestions simply circumvent the problem by raising limits; fewer look at improving the design so the payload itself is more lightweight.

Here are some possibilities:

Have you considered paging the request(s)? This would allow you to load and send the data asynchronously in smaller pieces as needed, preventing any single request from taking too long, improving the responsiveness of the website, and reducing the memory burden on both the client and the server. You can preemptively load data as the user scrolls and, if the process takes a while, give the user some feedback so they know more data is being loaded.
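As a minimal sketch of that idea applied to the upload side, posting in fixed-size batches instead of one huge request (using fetch for brevity; the endpoint URL /api/cars/save and the batch size are placeholders, not part of the original service):

    // Post the records in fixed-size batches instead of one huge request.
    var BATCH_SIZE = 500; // placeholder size; tune it to stay under the server's limit

    async function postInBatches(records) {
        for (var i = 0; i < records.length; i += BATCH_SIZE) {
            var batch = records.slice(i, i + BATCH_SIZE);
            var response = await fetch('/api/cars/save', { // placeholder URL
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify(batch)
            });
            if (!response.ok) {
                throw new Error('Batch starting at index ' + i + ' failed with status ' + response.status);
            }
        }
    }

Each batch stays well under the request-size limit, and a failed batch can be retried without resending everything.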

Have you considered shortening the property names, making them less descriptive but reducing the footprint of each object? For example:

Your current model:

{ name: null, operation: 0, selected: false }

A simplified model:

{ n: null, o: 0, s: false }

An approach like this makes the JSON itself harder to read, but JSON isn't meant solely to be read by people; it's meant to serialize data, and the loss of readability can be offset by documenting your model. Doing this may reduce the amount of data being sent by up to 30%.
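A rough sketch of that mapping, combined with dropping the unselected records before serializing (which the question also considers); cars stands in for whatever variable actually holds the records:

    // Keep only the checked records and shrink the property names before serializing.
    // "cars" is a placeholder for the variable that actually holds the 4,000 records.
    function buildPayload(records) {
        return records
            .filter(function (r) { return r.selected; })
            .map(function (r) { return { n: r.name, o: r.operation, s: r.selected }; });
    }

    var payload = JSON.stringify(buildPayload(cars));

Filtering and mapping a few thousand small objects is typically fast in the browser, and since only checked records survive the filter, the s property could arguably be dropped entirely, shrinking the payload further.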

I cannot provide a complete solution, because you will have to ask yourself some hard questions about what you're trying to achieve, who will be consuming the data, and what the best way to get there is.

In addition, I would strongly question why a process requires a user to interact with 2,000+ records at once. I am not trying to criticize, but you may need to take a critical look at the business process behind what you're trying to achieve, because there may be serious issues with repetitiveness, stress on the user, and more, all of which affect how effective and useful your application is to the end user. For example, are there ways you can break the task into smaller, less tedious blocks so that the end user doesn't stare at 4,000 checkboxes for two hours?

This may not be the answer you're looking for, as it opens up a huge number of additional questions, but hopefully it will help you start to formulate the questions that shape the final answer.

answered by Inari


A quick fix might be to increase the server's allowed content length. In the web.config this would look approximately like the following.

<configuration>
    <system.web>
        <!-- ASP.NET request limit; maxRequestLength is specified in kilobytes -->
        <httpRuntime maxRequestLength="2147483647" />
    </system.web>
    <system.webServer>
        <security>
            <requestFiltering>
                <!-- IIS request filtering limit; maxAllowedContentLength is specified in bytes -->
                <requestLimits maxAllowedContentLength="2147483647" />
            </requestFiltering>
        </security>
    </system.webServer>
</configuration>
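Because the service in the question is WCF, the 413 can also come from the binding's own message-size limit rather than IIS/ASP.NET, in which case maxReceivedMessageSize needs raising as well. This is only a sketch assuming a webHttpBinding; the binding name largeMessageBinding is a placeholder, and your endpoint would need to reference it via bindingConfiguration:

    <system.serviceModel>
        <bindings>
            <webHttpBinding>
                <!-- "largeMessageBinding" is a placeholder name; reference it from your endpoint's bindingConfiguration -->
                <binding name="largeMessageBinding"
                         maxReceivedMessageSize="2147483647"
                         maxBufferSize="2147483647" />
            </webHttpBinding>
        </bindings>
    </system.serviceModel>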
answered by TJM