Does Switching from HTTP to SSL HTTPS Affect Website Ranking?


The reason I’ve written this post is that last Wednesday one of our sites started redirecting to https, which was alarming because I noticed most of that site’s keyword rankings going down. So I searched for possible solutions, applied one, and now I’m going to share what I found over the internet.


The redirection of the hypertext transfer protocol (http) to https has become a real concern for webmasters anxious about its effect on a site’s ranking and, ultimately, on the regular course of business. It is a problem that is still routinely mishandled, and if SEO practitioners don’t resolve it, it leads nowhere but to failure. A thorough understanding of the matter is therefore a prerequisite for fixing it and rescuing the website’s performance.

Nevertheless, some webmasters may still be totally unaware that this phenomenon is already affecting how their pages are being indexed, especially by Google. The trouble begins when a search engine starts to index pages under the protocol with an “S” appended: not to pluralize it, but enough to demoralize the entire functioning of the site. For instance, when Google discovers that your site’s links carry the HTTPS prefix in the URL, it can lead to duplicate content, which is a grave issue.

This is just one ordinary symptom that can lead an amateur site administrator to a mind-boggling diagnosis; you can waste time and effort pushing the ranking up while the gravity of this problem keeps pulling it down. That is why early prevention of this destructive “online disease” is always the most effective cure for the safety and security of normal operation.

Usually the hassle with this prefix starts when a blogger or site administrator, trying to secure the site, adds the secure extension to the common protocol. That is no good at all, because once a search engine discovers one page with https, it can begin to crawl the entire site under the same prefix, especially when a webmaster relies solely on relative links. A relative link is one that uses a short version of the URL in the “href” attribute of the anchor tag, and therefore includes neither the protocol prefix nor the domain name.

Meanwhile, for a better understanding: http is the request-and-response process that internet agents follow so that data can be provided easily, consistently, and accurately between the servers that hold information and the clients that access it. It is used to access html pages, but since clients may be exchanging confidential data with a server, tight security is needed to prevent unfiltered access. Because of that, https, the secure protocol, was purposely created by Netscape to safeguard transactions by keeping the gate of the main access.

On the other hand, https is mostly identical to http, since it follows the same fundamental protocol, but using https instead of http signals that an encrypted connection is requested. Another notable distinction is the default port: 80 for http and 443 for https. https works by transmitting normal http interactions over an encrypted channel, so the data cannot be viewed by any outside party.
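To make the port distinction concrete, here is a small Python illustration (example.com is just a placeholder domain):

```python
from urllib.parse import urlsplit

# Default ports for the two schemes discussed above.
DEFAULT_PORTS = {"http": 80, "https": 443}

def default_port(url):
    """Return the default port implied by a URL's scheme."""
    return DEFAULT_PORTS[urlsplit(url).scheme]

print(default_port("http://example.com/page"))   # 80
print(default_port("https://example.com/page"))  # 443
```

The port never appears in the URL unless it differs from these defaults, which is why the scheme alone tells the browser where to connect.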

Additionally, to host https connections, a server needs a key certificate containing the secret key information along with verification of the owner’s identity. These certificates are counter-checked by a third party to ensure the key’s security. https is employed in many situations, such as login pages for banking, business corporations, and other private or public entities that handle confidential transactions. The protocol can also be a disadvantage when not applied properly, so it is an indispensable duty of users to have “extra senses” when evaluating doubtful certificates and disclosing personal information on the internet.

In a critical case, when a site administrator serves an entire website over https, it can limit visitors’ access, and some mobile devices cannot reach its pages. This is contrary to the common assumption that the secure prefix can be optimized with excellent results and that Google will treat the content as normal.

What happens in practice is the complete opposite of that claim. As already mentioned, https has caused content issues on various sites. Specifically, it happens once the search engine discovers https URLs on a website, whether through a mere accident in the system or through the malicious acts of other internet users, and it often occurs on business sites.

It has even been suggested that unscrupulous competitors use this as a stealth strategy: they post links to a rival’s site with https so that a duplicate of its content gets indexed, consequently decreasing the rival’s ranking.

With these reasons at hand, you need to know your options when the problem persists and competitors do not cease to poison your business. The following discussions should help you formulate wise strategies as a shield:

Why is HTTPS a Serious Problem?

It is a grave matter because when a search engine detects that a page is published at multiple URLs, it can lower that page’s rank more than you might imagine. As one of the canonicalization problems, it may likewise affect the depth of the crawl. To better understand this issue, I’ve drawn on Maria Nikishyna’s post on SEO tools for HTTP and HTTPS from Search Engine Journal, and here’s what I found:

(1) Duplicate Content and Canonicalization – because the http and https protocols are treated as separate sites, there is a real possibility that the main site will be penalized for duplicate content. Generally, when the search engine detects the duplication, it keeps one version and rejects the other. So what’s the solution for this issue? Just keep the following in mind:

Be wise with the structure of the site – this is necessary to keep search engines from crawling and indexing https pages. Structure the website so that https pages are reachable only through form submissions (sign-up, log-in, etc.), and do not make them available through standard links;

Employ a Robots.txt file – this is intended to control which pages should be crawled and indexed;

Utilize an .htaccess file – the procedure is as follows: (1) create a file named robots_ssl.txt in your root, and (2) add the following code to your .htaccess:

RewriteCond %{SERVER_PORT} 443 [NC]
RewriteRule ^robots.txt$ robots_ssl.txt [L]
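The robots_ssl.txt served in place of robots.txt on port 443 would typically block all crawling; a minimal sketch:

```
User-agent: *
Disallow: /
```

This way, crawlers that fetch robots.txt over https are told to stay out, while the normal robots.txt still governs the http version of the site.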

Detach from the webmaster tools – do this after the pages have already been crawled;

For dynamic pages like php, try:

<?php
if ($_SERVER["SERVER_PORT"] == 443) {
    echo '<meta name="robots" content="noindex,nofollow">';
}
?>

Redirect the HTTPS pages to the HTTP pages – since the basic solutions may not always be effective, this one could work. Just try it and hope that the link value eventually transfers over;
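A sketch of that redirect in .htaccess, assuming Apache with mod_rewrite enabled and using www.example.com as a placeholder for your own domain:

```apache
RewriteEngine On
# If the request arrived over SSL, 301-redirect it to the plain http URL.
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

If some sections (a sign-up or payment form, say) must stay on https, add RewriteCond exclusions for those paths before the rule.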

Have some site portions configured to use SSL – this allows data to travel between the browser and the server over a secured (encrypted) connection. Note, however, that the URLs of these pages still begin with https instead of http to signify the secure protocol.

Remove https pages with Webmaster Tools if you already have them in the index.

(2) Linking – the problem of https can also be directly connected with links since, as noted earlier, other site administrators can use it against their competitors. The following may serve as a shield or solution:

Make links to the http version of the pages – in doing this, you will need to make sure that vital pages are accessible over http;

Keep a separate log file for the https domain and write a script that regularly e-mails you the links found there;
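One minimal sketch of that idea in Python, assuming a combined-format access log and example.com as your own domain (the log format, the domain, and the function names are assumptions for illustration; the original post doesn’t show its script):

```python
import re

# Grab the quoted referrer field of a combined-format access-log line:
# ... [timestamp] "REQUEST" status bytes "referrer" "user-agent"
REFERRER = re.compile(r'\] "[^"]*" \d+ \S+ "(?P<ref>[^"]*)"')

def external_referrers(log_lines, own_domain="example.com"):
    """Collect referrers from other sites, i.e. who is linking to these pages."""
    refs = set()
    for line in log_lines:
        m = REFERRER.search(line)
        if m and m.group("ref") not in ("-", "") and own_domain not in m.group("ref"):
            refs.add(m.group("ref"))
    return sorted(refs)

sample = [
    '1.2.3.4 - - [10/Oct/2013:13:55:36] "GET /page HTTP/1.1" 200 2326 "http://rival-site.com/links" "Mozilla/5.0"',
    '1.2.3.5 - - [10/Oct/2013:13:56:01] "GET / HTTP/1.1" 200 1024 "http://example.com/home" "Mozilla/5.0"',
]
print(external_referrers(sample))  # ['http://rival-site.com/links']
```

Run this against the https log only, and the output is exactly the list of outside pages linking to your secure URLs, which you can then review or pipe into an e-mail.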

Keep ordinary sections under http only – this is necessary to reduce the possibility of internet users linking to https.

How to detect problems with canonicalization?

To detect it, I used Google Webmaster Tools, but I also searched for other ways to find this issue, and here’s a helpful post, Ian Laurie’s Detect canonicalization problems (part 2 of 3); below are his suggestions:

(1) Use Google Webmaster Tools – click ‘Diagnostics’ and then ‘HTML Suggestions’, and check for pages with the same title tags. This lets you identify the problem as early as possible and apply immediate solutions.

(2) Utilize a Link Checker – with this, you can also list pages with duplicate title tags; canonicalization problems may be detected here as well.

(3) Employ Search Results – searching for your own title tags is not easy, so the following steps can help: (a) go to a page on your site and copy a phrase you think is distinctive, then (b) search for “your phrase” in any search engine; if you see the same item more than once, you may really have a problem.
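The duplicate-title check above can be sketched in a few lines of Python, given a mapping of URL to title (the URLs here are placeholders):

```python
from collections import defaultdict

def duplicate_titles(pages):
    """Group URLs by title tag; keep only titles shared by 2+ pages."""
    groups = defaultdict(list)
    for url, title in pages.items():
        groups[title].append(url)
    return {t: sorted(urls) for t, urls in groups.items() if len(urls) > 1}

pages = {
    "http://example.com/page": "My Page",
    "https://example.com/page": "My Page",   # the https duplicate
    "http://example.com/other": "Other Page",
}
print(duplicate_titles(pages))
# {'My Page': ['http://example.com/page', 'https://example.com/page']}
```

A title shared between an http and an https URL for the same path is exactly the canonicalization symptom being described.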

How do you redirect SSL https to http?

The following are two common ways to do it, from a helpful post by Aaron Wall on SEO Book; find the full version here: htaccess, 301 Redirects & SEO: Guest Post by NotSleepy.

(1) Make the following PHP snippet and incorporate it into each page, so that pages served over https output a noindex meta tag like the one shown earlier:

<?php
if (isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on') {
    echo '<meta name="robots" content="noindex,nofollow">' . "\n";
}
?>

(2) Cloak your robots.txt file.
If a visitor comes in over https and is one of the identified bots like googlebot, you should display: User-agent: * Disallow: /; otherwise, display your normal robots.txt. To do this, use your .htaccess file to treat .txt files as PHP or another dynamic language, then write the cloaking code.
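The cloaking decision itself is simple; here it is as a plain Python sketch (the real implementation would live in the dynamically served robots.txt, so this is only to illustrate the logic):

```python
def robots_txt(is_https, is_known_bot):
    """Serve a blocking robots.txt to bots arriving over https; normal otherwise."""
    if is_https and is_known_bot:
        return "User-agent: *\nDisallow: /\n"   # block the whole https site
    return "User-agent: *\nDisallow:\n"          # normal: allow everything

print(robots_txt(True, True))   # the blocking version for https bots
```

Note that this is cloaking in the literal sense (serving bots different content than users), so it should be applied only to robots.txt and only for this deduplication purpose.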

You might also find it helpful to visit the following sites for other suggested solutions:
Blocking & Removing Pages Indexed with HTTPS by Rainbodesign

What if you’re going to optimize a website with https?

Well, to do that, you need to consider the following, which I’ve drawn from Doug Williams’ How To SEO Secure HTTPS Pages:

Page Load Speed – https is server-intensive, and that can greatly affect a page’s load time; slow-loading pages can affect Google rankings. Hence, make sure the website is hosted on a high-performance server, and minimize the sizes of images and other files that could slow the loading down further.

URL Canonicalization – http and https are treated entirely separately by search engines, so make sure to set up 301 redirects from the http to the https versions of your site.
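Assuming Apache with mod_rewrite, and www.example.com as a placeholder for your own domain, such a 301 redirect might look like:

```apache
RewriteEngine On
# Send every plain-http request to the https version of the same URL.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```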

XML Sitemaps – use the https version of the URLs in the sitemap for the secure pages; note that some sitemap generators cannot parse https pages.

Robots.txt – this should be served under the https protocol for it to work.

Additionally, here’s a video of Matt Cutts answering “Can switching to HTTPS harm ranking?”

So if you’re switching everything to https, is there really a chance of ranking? The answer is simply yes. Look at the logic of PayPal, which Matt also cited as an example; what is important is that you have a consistent strategy for it.

This is why it is always a webmaster’s duty to stay aware of current issues that could greatly affect their sites, to avoid being penalized by Google and to maintain the tight security of the whole SEO operation.


This is exactly what happened to our client’s website. We had SSL on that site for a while, but just for the payment portal; somehow it forced the homepage to SSL, and Google decided to index that page instead of the regular one. Google reads http and https differently, and this is the reason we lost rankings for 9 keywords that previously ranked in the top 10.

How did we fix it?
Ans: We removed the payment portal and the SSL, and did a 301 redirect from the SSL homepage to the regular one. After 2 days we saw gradual improvement in rankings.


Al Gomez

SEO Consultant, Online Marketer & Blogger, Web Developer & DLINKERS Founder.

“Choose a job you love, and you will never have to work a day in your life.”