
Does Caching always enhance performance?

I have a number of sites running PHP and MySQL, most of them MediaWiki, and I need to improve their performance. However, I'm only allowed to use a limited percentage of the CPU.

The best thing I can think of to improve performance is to enable caching. However, I'm confused: does that really enhance performance overall, or does it just improve speed?

My reasoning is: if caching uses files, then it takes extra processing to read the contents of those files. If it uses SQL tables, then it takes extra processing to query those tables as well; the response time may be shorter, but the CPU usage will be higher.

Is that correct or not? Does caching consume more CPU to deliver faster results, or does it improve performance overall?
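For example, something like this file-based approach is what I have in mind (just a rough sketch to illustrate the idea, with made-up paths, not code from my actual sites):

    // Rough illustration of a file-based cache (hypothetical path and key).
    $requestKey = $_SERVER['REQUEST_URI'];
    $cacheFile  = '/tmp/cache/' . md5($requestKey) . '.html';
    $maxAge     = 300; // seconds

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        // Cache hit: this still costs some CPU and disk I/O to read the file.
        echo file_get_contents($cacheFile);
    } else {
        // Cache miss: do the expensive PHP/MySQL work, then store the result.
        $html = '<html>...expensive page built here...</html>';
        file_put_contents($cacheFile, $html);
        echo $html;
    }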

asked Sep 27 '12 by Tamer Shlash


2 Answers

At the most basic level, caching should be used to store the results of CPU-intensive processes. For example, if you have a server-side image handler that creates an image on the fly (say a thumbnail and a larger preview), you don't want that operation to happen on every request - you'd want to run the process once and store the result; then every subsequent request gets the saved result.
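As a rough sketch of that idea in PHP (hypothetical paths and sizes, assuming the GD extension is available):

    // Serve a cached thumbnail if it exists; otherwise generate and store it once.
    $source = '/var/www/images/photo.jpg';      // hypothetical original image
    $thumb  = '/var/www/cache/photo_150.jpg';   // hypothetical cached thumbnail

    if (!is_file($thumb)) {
        // CPU-intensive part: only runs on the first request.
        [$w, $h] = getimagesize($source);
        $ratio   = 150 / $w;
        $img     = imagecreatefromjpeg($source);
        $small   = imagescale($img, 150, (int) round($h * $ratio));
        imagejpeg($small, $thumb, 85);
        imagedestroy($img);
        imagedestroy($small);
    }

    // Every request after the first just streams the stored file.
    header('Content-Type: image/jpeg');
    readfile($thumb);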

This is obviously a hugely over-simplified description of basic caching, and the image example works well here because you don't have to worry about stale data, i.e. how often will the actual image change? In your case, databases are hugely different. If you cache data, how can you guarantee that there won't be an instant mismatch between your real data and your cached data? Also, querying a database is not always a CPU-intensive task (granted, you have to consider how the database is designed in terms of indexing, table size, etc.), but in most cases querying a well-designed database is far more intensive on disk I/O than on CPU cycles.
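To make that stale-data trade-off concrete, here is a rough sketch of caching a query result with a short expiry, so the cached copy can lag behind the real data by at most the TTL (hypothetical connection details and table, assuming PDO):

    // Cache a query result for a short time; accept that it may be
    // up to $ttl seconds out of date with respect to the real table.
    $ttl       = 60; // seconds the cached copy is considered fresh
    $cacheFile = sys_get_temp_dir() . '/popular_pages.cache';

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        $rows = unserialize(file_get_contents($cacheFile));   // cache hit
    } else {
        $pdo  = new PDO('mysql:host=localhost;dbname=wiki', 'user', 'pass');
        $rows = $pdo->query('SELECT page_title, page_counter
                               FROM page
                              ORDER BY page_counter DESC
                              LIMIT 10')->fetchAll(PDO::FETCH_ASSOC);
        file_put_contents($cacheFile, serialize($rows));      // refresh the cache
    }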

First, you need to look at your database design, and second at your queries. For example: are you normalizing your database correctly, are your queries trawling through huge amounts of data when you could just archive, are you joining tables on non-indexed fields, are your WHERE clauses filtering on fields that could be indexed (IN is particularly bad in these cases)?
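As a rough way to check whether a suspect query can use an index at all (a sketch assuming mysqli, with made-up table and column names):

    // EXPLAIN shows whether MySQL can use an index or has to scan the whole table.
    $db  = new mysqli('localhost', 'user', 'pass', 'wiki');
    $res = $db->query("EXPLAIN SELECT * FROM revision WHERE rev_user_text = 'SomeUser'");

    while ($row = $res->fetch_assoc()) {
        // 'type' = ALL and 'key' = NULL usually mean a full table scan,
        // i.e. a candidate for an index such as:
        //   ALTER TABLE revision ADD INDEX idx_rev_user_text (rev_user_text);
        print_r($row);
    }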

I recommend you get hold of a query analyzer and spend some time optimizing your table structure and queries to find that bottleneck before looking into more drastic changes.

answered Sep 29 '22 by Paul Aldred-Bann

Reference: http://msdn.microsoft.com/en-us/library/ee817646.aspx

Performance: Caching techniques are commonly used to improve application performance by storing relevant data as close as possible to the data consumer, thus avoiding repetitive data creation, processing, and transportation. For example, storing data that does not change, such as a list of countries, in a cache can improve performance by minimizing data access operations and eliminating the need to recreate the same data for each request.
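A sketch of that idea in PHP, assuming the APCu extension is installed (the countries table and connection details are made up):

    // Keep a rarely-changing list in shared memory instead of querying it every time.
    $countries = apcu_fetch('country_list', $hit);

    if (!$hit) {
        $pdo       = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
        $countries = $pdo->query('SELECT code, name FROM countries ORDER BY name')
                         ->fetchAll(PDO::FETCH_ASSOC);
        apcu_store('country_list', $countries, 86400); // cache for a day
    }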

Scalability: The same data, business functionality, and user interface fragments are often required by many users and processes in an application. If this information is processed for each request, valuable resources are wasted recreating the same output. Instead, you can store the results in a cache and reuse them for each request. This improves the scalability of your application because as the user base increases, the demand for server resources for these tasks remains constant. For example, in a Web application the Web server is required to render the user interface for each user request. You can cache the rendered page in the ASP.NET output cache to be used for future requests, freeing resources to be used for other purposes.
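The same idea expressed in PHP terms (there is no direct equivalent of the ASP.NET output cache, so this is a hand-rolled sketch using output buffering, with hypothetical paths):

    // Capture the rendered page once and replay it for later requests.
    $cacheFile = '/var/cache/app/' . md5($_SERVER['REQUEST_URI']) . '.html';

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < 600) {
        readfile($cacheFile);        // serve the page rendered earlier
        exit;
    }

    ob_start();
    echo '<html>...expensive page rendered here...</html>';   // the real page-building code
    $html = ob_get_clean();

    file_put_contents($cacheFile, $html);
    echo $html;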

Caching data can also help scale the resources of your database server. By storing frequently used data in a cache, fewer database requests are made, meaning that more users can be served.

Availability: Occasionally the services that provide information to your application may be unavailable. By storing that data in another place, your application may be able to survive system failures such as network latency, Web service problems, or hardware failures. For example, each time a user requests information from your data store, you can return the information and also cache the results, updating the cache on each request. If the data store then becomes unavailable, you can still service requests using the cached data until the data store comes back online.
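A sketch of that fallback pattern (hypothetical connection details, table, and cache path, assuming PDO):

    // Refresh the cache on every successful request; if the data store is down,
    // fall back to the last cached copy instead of failing outright.
    $cacheFile = '/var/cache/app/recent_changes.json';

    try {
        $pdo  = new PDO('mysql:host=localhost;dbname=wiki', 'user', 'pass');
        $rows = $pdo->query('SELECT rc_title, rc_timestamp FROM recentchanges LIMIT 20')
                    ->fetchAll(PDO::FETCH_ASSOC);
        file_put_contents($cacheFile, json_encode($rows));    // keep the cache current
    } catch (PDOException $e) {
        if (!is_file($cacheFile)) {
            throw $e;                                          // nothing cached yet
        }
        $rows = json_decode(file_get_contents($cacheFile), true);  // serve the stale copy
    }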

answered Sep 29 '22 by Amrish Prajapati