I currently have a single domain www.mydomain.com
This performs geolocation and shows you the content that is relevant to your country.
This means that search engines are also served regional content when they index the site. My problem is that google.co.uk is showing the U.S. content, because my geolocation identifies Googlebot's IP as being in the U.S.
What can I do about this?
How do I tell the bot that there is different content for different countries?
I don't want to have multiple domain suffixes (e.g. mydomain.de, mydomain.co.uk)
and I'd rather not use subdomains, although I will if I have to. (But how would Google know that uk.mydomain.com is the UK version of us.mydomain.com, rather than listing both sites in both regions?)
Are there any alternatives? Do bots identify what region they're crawling for?
You can use either subdomains or subfolders (mydomain.com/uk/), but in either case you should do the following two things:
Add rel="alternate" hreflang="x" link elements in the <head> section of each version's pages to specify the language and the country. Alternatively, you can declare the same relationships in an XML sitemap (see the examples below).
Verify the different subdomains/subfolders as different "sites" in Google Webmaster Tools, and then set a geographic target for each one of them.
Hreflang example:
<link rel="alternate" href="http://mydomain.com/uk/" hreflang="en-gb" />
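Note that hreflang annotations need to be reciprocal: every version should list every other version (and itself), otherwise Google may ignore them. A minimal sketch, assuming a hypothetical layout with the U.S. content at mydomain.com/us/ and the UK content at mydomain.com/uk/:

<!-- In the <head> of the pages on both the /us/ and /uk/ versions -->
<link rel="alternate" href="http://mydomain.com/us/" hreflang="en-us" />
<link rel="alternate" href="http://mydomain.com/uk/" hreflang="en-gb" />
<link rel="alternate" href="http://mydomain.com/" hreflang="x-default" />

The x-default line is optional; it tells Google which version to serve to visitors outside the targeted countries.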
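If you prefer the XML sitemap route mentioned above, the equivalent annotations go on each URL entry using the xhtml:link element. A sketch using the same hypothetical URLs (not a complete sitemap):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- Each <url> entry repeats the full set of alternates, including itself -->
  <url>
    <loc>http://mydomain.com/us/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="http://mydomain.com/us/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://mydomain.com/uk/" />
  </url>
  <url>
    <loc>http://mydomain.com/uk/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="http://mydomain.com/us/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://mydomain.com/uk/" />
  </url>
</urlset>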