What techniques or tools are recommended for finding broken links on a website?
I have access to the logfiles, so could conceivably parse these looking for 404 errors, but would like something automated which will follow (or attempt to follow) all links on a site.
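Since you already have the logfiles, a quick first pass is to scan them for 404s together with the referrer (the referrer tells you which page carries the broken link). A minimal sketch, assuming the common Apache/Nginx "combined" log format; adjust the regex if your server logs a different layout:

```python
import re

# Pattern for the Apache/Nginx "combined" log format. The exact field layout
# is an assumption about your server config -- tweak it to match your logs.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
    r'(?: "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)")?'
)

def find_404s(lines):
    """Yield (requested path, referrer) for every 404 response in the log."""
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m and m.group("status") == "404":
            yield m.group("path"), m.group("referrer") or "-"
```

The referrer column is the useful part: a `-` usually means the visitor typed the URL or an external site links to it, while a referrer on your own domain points you straight at the page to fix. The drawback of this approach is that it only surfaces broken links someone has already clicked, which is why a crawler is still worth running.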
The Wayback Machine Chrome extension detects dead pages and offers to show you an archived copy. When you hit a 404 or a Page Not Found message, a slightly dated but otherwise intact version of the page is the next best thing.
The Sitechecker website crawler scans your site for broken links and reports how to correct them: you can inspect the anchors pointing at 404 pages and fix them immediately. It is a web tool, so it runs from any operating system, and it can crawl a site built on any CMS.
For Chrome there is the Hexometer extension.
See LinkChecker for Firefox.
For macOS there is Integrity, a tool which can check URLs for broken links.
For Windows there is Xenu's Link Sleuth.
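If you'd rather script it yourself, the tools above all boil down to the same loop: fetch a page, extract its links, check each one, and recurse into links that stay on your site. A minimal stdlib-only sketch in Python, with no robots.txt handling, rate limiting, or parallelism (real checkers do all three), where `check_site` and `max_pages` are names I've made up for illustration:

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urldefrag, urljoin
from urllib.request import Request, urlopen

class LinkExtractor(HTMLParser):
    """Collect the href target of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_site(start_url, max_pages=200):
    """Breadth-first crawl under start_url; return {broken URL: referring page}."""
    seen, broken, queue = {start_url}, {}, [(start_url, None)]
    pages_crawled = 0
    while queue and pages_crawled < max_pages:
        url, referrer = queue.pop(0)
        internal = url.startswith(start_url)
        try:
            with urlopen(Request(url, headers={"User-Agent": "linkcheck"})) as resp:
                body = resp.read() if internal else b""
        except (HTTPError, URLError):
            broken[url] = referrer  # 404/500, DNS failure, refused connection
            continue
        if not internal:
            continue  # external links are checked but not crawled
        pages_crawled += 1
        parser = LinkExtractor()
        parser.feed(body.decode("utf-8", "replace"))
        for href in parser.links:
            # Resolve relative links and drop #fragments before deduplicating.
            target = urldefrag(urljoin(url, href)).url
            if target.startswith("http") and target not in seen:
                seen.add(target)
                queue.append((target, url))
    return broken
```

Run it as, say, `check_site("https://example.com/")` and it returns each unreachable URL mapped to the page that links to it, which is exactly what you need to fix the link at the source.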