
Will using HTTP and HTTPS on the same domain affect SEO [closed]

Tags:

http

https

seo

Will serving the same website over both protocols on the same domain cause it to be indexed twice by search engines and treated as duplicate content?

If so, what is the best way to avoid this negative SEO effect? Should I serve HTTPS on a separate subdomain, such as https://ssl.example.com instead of https://www.example.com, with a different docroot? Or is it enough to set up the necessary 301 redirects from https to http for non-secure content, and vice versa, on the same docroot? What is the best practice here?

MMSs asked Dec 08 '13 06:12

People also ask

Does changing HTTP to HTTPS affect SEO?

The answer is no. According to Mueller, “there's nothing against linking to sites like these.” Even though linking to insecure pages does not affect SEO, the HTTPS protocol has been a well-known ranking factor since 2014.

Can a website have both HTTP and HTTPS?

They can both be open at the same time, and can even serve different websites; in effect they are two different sites. To avoid this, you can simply close down port 80, or alternatively make sure that the website served on port 80 always sends a redirect to the HTTPS website.
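As a minimal sketch of that redirect, assuming an Apache server with mod_rewrite enabled and `www.example.com` as a placeholder hostname:

```apache
# Hypothetical .htaccess sketch: send all port-80 (HTTP) traffic to HTTPS.
RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

(Closing port 80 entirely would instead be done in the server or firewall configuration, not in .htaccess.)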

Is HTTP and HTTPS the same domain?

The two are essentially the same, in that both of them refer to the same “hypertext transfer protocol” that enables requested web data to be presented on your screen. But, HTTPS is still slightly different, more advanced, and much more secure. Simply put, HTTPS protocol is an extension of HTTP.

Why HTTP should not be used?

The problem is that HTTP data is not encrypted, so can be intercepted by third parties to gather data passed between the two systems. This can be addressed by using a secure version called HTTPS, where the S stands for Secure.


1 Answer

See this thorough discussion on the topic to address concerns specific to your site:

Moz Q&A: Duplicate Content and http and https?

In summary:

"Google considers https:// and http:// two different sites and may inconsistently index one vs. the other and/or penalize for duplicate content... https indexed pages that don't require https typically come from a bot entering an https required page and traversing outward (shopping cart, etc.)"

(Note: obviously, if something on your site requires https:, you need to be careful with any of the following.)

The easiest solution may be to use absolute canonical links in your site.

E.g. `<a href='https://www.example.com/securepage/'>` where needed, or regular `http://` links otherwise.
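For instance, a sketch of such links (the hostnames and paths here are placeholders, not from the original answer):

```html
<!-- Absolute links pin the scheme explicitly -->
<a href="https://www.example.com/securepage/">Account</a>  <!-- secure page -->
<a href="http://www.example.com/about/">About</a>          <!-- regular page -->

<!-- A rel="canonical" tag can additionally tell engines which URL to index -->
<link rel="canonical" href="http://www.example.com/about/">
```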

Solutions

  1. "Be smart about the site structure: to keep the engines from crawling and indexing HTTPS pages, structure the website so that HTTPs are only accessible through a form submission (log-in, sign-up, or payment pages). The common mistake is making these pages available via a standard link (happens when you are either ignorant or not aware that the secure version of the site is being crawled and indexed)."

  2. "Use Robots.txt file to control which pages will be crawled and indexed."

  3. "Use .htaccess file. Here’s how to do this:"

    Create a file named robots_ssl.txt in your root.

    Add the following to your .htaccess:

    ```
    RewriteCond %{SERVER_PORT} 443 [NC]
    RewriteRule ^robots\.txt$ robots_ssl.txt [L]
    ```
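The robots_ssl.txt served to HTTPS requests by that rule could then block crawling entirely; a minimal sketch (blocking everything is an assumption — loosen it if some HTTPS pages should be indexed):

```
# robots_ssl.txt — returned in place of robots.txt on port 443
User-agent: *
Disallow: /
```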
    
  4. Remove yourdomain.com:443 from the webmaster tools if the pages have already been crawled.

  5. For dynamic pages like PHP, try:

    ```php
    <?php
    if ($_SERVER["SERVER_PORT"] == 443) {
        echo '<meta name="robots" content="noindex,nofollow">';
    }
    ?>
    ```

  6. Dramatic solution (may not always be possible): 301 redirect the HTTPS pages to the HTTP pages – with hopes that the link juice will transfer over.
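A sketch of such a redirect in .htaccess, again assuming Apache with mod_rewrite; the excluded `/checkout/` path is a placeholder for pages that genuinely require TLS:

```apache
# Hypothetical sketch: 301-redirect HTTPS pages back to HTTP,
# except paths that actually require a secure connection.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/checkout/
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```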

cerd answered Nov 15 '22 04:11