Our company is changing web platforms, and we would like to preserve our Google search results, so we are planning to put 301 redirects into our .htaccess file.
My concern is that if I put in all these redirects (probably 3,000 to 5,000 total), the server will slow down as it makes all those checks on every request.
Does anyone know if having a .htaccess file this large will cause any problems? I have a pretty fast server for the site (8 cores), so I have a fair amount of horsepower available.
I doubt it would noticeably slow down the server. But check it out first.
Create a 5,000-line .htaccess file in a www/temp folder with rewrite rules like the ones you will be using. See how long it takes to access a page with and without the .htaccess file.
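A quick sketch of how you might generate such a test file, assuming placeholder URL patterns (your real old and new URLs would come from your migration mapping):

```python
# Sketch: generate a 5,000-rule .htaccess you can drop into a temp
# directory to benchmark request latency with and without it.
# The /old-page-N.html -> /new-page-N/ patterns are placeholders,
# not your real URLs.
NUM_RULES = 5000

lines = [
    f"Redirect 301 /old-page-{i}.html /new-page-{i}/"
    for i in range(1, NUM_RULES + 1)
]

with open("htaccess-test", "w") as f:
    f.write("\n".join(lines) + "\n")

print(len(lines))  # number of rules written
```

Copy the result to www/temp/.htaccess, then time a batch of requests (for example with `ab` or `curl -w '%{time_total}'`) against that directory with and without the file in place.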
The other answers have some good suggestions, but if you don't wind up using some alternative to rewrite rules, I would strongly suggest putting these lines in the main server configuration file instead of a .htaccess file. That way Apache will only parse them once, when it starts up, and it can simply reference an internal data structure rather than having to check for the .htaccess file on every request. In fact, the Apache developers recommend against using .htaccess files at all, unless you don't have access to the main server configuration. If you don't use .htaccess files, you can set

AllowOverride None

in the main configuration, and then Apache doesn't even have to spend time looking for the files at all. On a busy server this could be a useful optimization.
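As a rough illustration, a virtual host along these lines would disable .htaccess lookups and keep the redirects in config that is parsed once at startup (the hostname and paths here are placeholders):

```apache
# Hypothetical vhost: redirects live in the main config, not .htaccess
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/html

    <Directory /var/www/html>
        # Apache skips the per-request .htaccess lookup entirely
        AllowOverride None
    </Directory>

    # Redirect rules parsed once at startup
    Redirect 301 /old-page.html /new-page/
    Redirect 301 /legacy/about.php /about/
    # ... remaining redirects ...
</VirtualHost>
```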
Another thing you could consider doing (in conjunction with the above) is using the RewriteMap directive to "outsource" the URL rewriting to an external program. You could write this external program to, say, store the old URLs in a hash table, or whatever sort of optimization is appropriate.
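For instance, a minimal external-program map might look like the following sketch. The protocol is what `RewriteMap ... "prg:..."` expects: Apache writes one lookup key per line on stdin, and the program answers with one line on stdout (`NULL` for no match). The example table is a stand-in for your real URL mapping:

```python
#!/usr/bin/env python3
# Sketch of a RewriteMap external program, wired up in the server
# config with something like:
#   RewriteMap redirects "prg:/usr/local/bin/redirect-map.py"
# (the path and the table contents below are placeholders).
import sys

# In practice you would load this dict from your migration data;
# a dict lookup is O(1) regardless of how many URLs you have.
REDIRECTS = {
    "/old-page.html": "/new-page/",
    "/legacy/about.php": "/about/",
}

def lookup(key: str) -> str:
    """Return the new URL for a key, or NULL per the prg: protocol."""
    return REDIRECTS.get(key, "NULL")

if __name__ == "__main__":
    for line in sys.stdin:
        # One answer per request; flush so Apache isn't left waiting.
        sys.stdout.write(lookup(line.strip()) + "\n")
        sys.stdout.flush()
```

Alternatively, for a static list of this size, a `dbm:` RewriteMap built with `httxt2dbm` gives you hashed lookups without running a separate process.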