Matt Cutts on Google Using a Set Standard for Manually Removing Webspam

Google on webspam

Google’s fight against spam has been evident since the day the Penguin algorithm was rolled out. It weeded out websites that were considered spam or engaged in unethical practices. It was also proven that Google can’t be tricked even with advertorial pages, as discussed in the Interflora case.

In such cases, you would want to know how Google, or the Googlers performing manual actions, determine which websites are spam. Do they use a set of standards, or do they simply remove a website when it looks and smells like spam to them? You may share the sentiments of the questioner, Billy from Sacramento.

And Matt Cutts answered his question in this video.

Matt Cutts was definite when he said they don’t penalize websites just because they look and smell like spam. In my time in SEO, I have come to understand that Google does not publish a set of standards for determining spam. It does, however, provide clear guidelines on how ethical SEO should be practiced. Violating these quality guidelines can make your website subject to penalties, whether for low quality, spamming activity, or deceptive techniques. To put it more simply, anything that falls outside these quality guidelines can put a website in a critical position with Google.

There can be many violations of the quality guidelines; he mentioned things like cloaking, sneaky redirects, and hidden text or links. Some SEO professionals manipulate Google rankings this way, thinking they can escape its algorithms. Since no algorithm is 100% perfect, some websites may avoid algorithmic penalties, yet they must still take extra precaution against Google’s manual actions of removing or banning spammy or low-quality websites. Google will not let you get away with it.

Matt Cutts mentioned several things about how Googlers handle manual actions:

• They are trained for consistency.

• They use direct coaching techniques, or “shadowing,” to correct mistakes and give instructions immediately.

• Where there are gray areas about spam, the team meets to ensure consistent results.

• Google reviews the manual penalties set by Googlers.

• Random checks are also applied to spot-check the quality of their output.

• They don’t look at just one aspect. They make judgments based on the overall performance of the website, considering repeated spamming, hacking, malware, and other violations.

True enough, SEO embodies Aristotle’s philosophy that the whole is greater than the sum of its parts. Google judges the websites on the internet fairly: those that follow the guidelines will be rewarded, while those that do not will be penalized. On the flip side, consider your own SEO campaign. Don’t just count the things you did right; look at the whole picture of how you have been doing SEO.

You might think I am being repetitive, but really the way to succeed with SEO and Google is to boil your campaigns down to the basic principles of quality:

• Always think of users’ welfare more than the search engines’.

• Deception will drag you down to the bottom of the SERPs.

• Eliminate tricks and manipulation of Google rankings. Google always finds a way of discovering how you climbed up the ranks.

• Create a website that offers readership, authority, credibility, uniqueness, and interaction.

The search engine industry is far from its infancy. It has innovated to serve users with the utmost quality. Google’s guidelines show not only how to avoid being considered spam, but how to succeed without spamming. Take the positive view and you’ll keep up with changes and updates without losing your spot in Google’s search engine results pages.

Thoughts? Share them below.

Al Gomez

SEO Consultant, Online Marketer & Blogger, Web Developer & DLINKERS Founder.

“Choose a job you love, and you will never have to work a day in your life.”