
How to serve regional content to search engine spiders?

Tags:

seo

I currently have a single domain, www.mydomain.com.

This performs geolocation and shows you the content that is relevant to your country.

This means that search engines also get served regional content when indexing. However, my problem is that google.co.uk is showing the U.S. content, because my geolocation identifies Googlebot as being in the U.S.

What can I do about this?

How do I tell the bot that there is different content for different countries?

I don't want to have multiple domain suffixes (e.g. mydomain.de, mydomain.co.uk), and I'd rather not use subdomains, although I will if I have to (but then how does Google know that uk.mydomain.com is the UK version of us.mydomain.com, rather than listing both in both regions?).

Are there any alternatives? Do bots identify what region they're crawling for?

asked Jan 25 '26 21:01 by Andrew Bullock
1 Answer

You can use either subdomains or subfolders (mydomain.com/uk/), but in either case you should do the following two things:

  1. Add rel="alternate" hreflang="x" link elements to the <head> section of each version to specify its language and country. Alternatively, you can do the same thing with an XML sitemap (a sitemap sketch follows the example below).

  2. Verify the different subdomains/subfolders as different "sites" in Google Webmaster Tools, and then set a geographic target for each one of them.

Hreflang example:

<link rel="alternate" href="http://mydomain.com/uk/" hreflang="en-gb" />
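If you use the XML sitemap route instead of (or in addition to) the <head> links, the annotations look roughly like the sketch below. This is a minimal sketch assuming a hypothetical subfolder layout with /us/ and /uk/ versions; adjust the URLs to your own structure. Each URL lists all of its alternates, including itself, and the annotations must be reciprocal (the UK page references the US page and vice versa), otherwise Google may ignore them.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- U.S. version: lists itself and the UK alternate -->
  <url>
    <loc>http://mydomain.com/us/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="http://mydomain.com/us/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://mydomain.com/uk/" />
  </url>
  <!-- UK version: same set of alternates, so the annotations stay reciprocal -->
  <url>
    <loc>http://mydomain.com/uk/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="http://mydomain.com/us/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://mydomain.com/uk/" />
  </url>
</urlset>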
answered Jan 29 '26 12:01 by Arttu Raittila
