The Seven Deadly Sins of SEO: What You Should Avoid At Your Website

Anastasia Eugene  |  12 Mar, 2018

There are a bunch of mistakes that can occur in SEO practice: mistakes in content marketing, copywriting, on-page SEO, link building and so on. But some mistakes can be genuinely dangerous for your website. You may already know them, but we think a refresher will be useful for all SEO newcomers. So let's break down the seven deadly sins of SEO!

Ignoring Page Speed


Google's push toward mobile began as early as 2015 with Accelerated Mobile Pages (AMP), and the mobile-first index followed soon after. AMP pages are now very important for on-page SEO, and as many authoritative websites have noted, using them is more than worthwhile for faster page loads.

Furthermore, as Google's webmasters have said, page speed (especially on mobile devices) affects ranking in the organic results. Why is this so? It's simple: the more people who visit your website and stay there, the more attractive your website looks to Google. Ignore page speed, and your rankings will suffer.
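As a rough first check, you can time how long your page's HTML takes to download. This is only a sketch: it measures the HTML transfer alone, not rendering or assets, and real audits should use a tool like Google's PageSpeed Insights. The URL and threshold here are placeholders.

```python
import time
from urllib.request import urlopen

def time_page_load(url: str) -> float:
    """Roughly time how long a page's HTML takes to download.

    Measures only the HTML transfer, not rendering, images or scripts;
    use a full audit tool (e.g. PageSpeed Insights) for real numbers.
    """
    start = time.monotonic()
    with urlopen(url, timeout=10) as resp:
        resp.read()
    return time.monotonic() - start

# Example (hypothetical URL and threshold):
# if time_page_load("https://example.com/") > 2.0:
#     print("HTML alone takes over 2 seconds; investigate.")
```

Even this crude number is useful for spotting regressions: run it before and after a change and compare.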

Cloaking


All the major search engines compete to make their search results as relevant, up to date and informative as possible. For a search engine to be considered effective, and therefore gain users, it relies on its reputation for providing the right information for any given search term.

They’re right for assuming this. Imagine you were looking for some tips on how to clean your windows, and you used a search engine you’re unfamiliar with. If you visited a site through this new search engine, and it brought you to a website on adult porn – you wouldn’t be too happy, would you? In fact, you’d probably dismiss the search engine as useless, and wouldn’t bother to use it again.

That’s why search engines take an issue known as ‘cloaking’ so very seriously. If their livelihoods depend on the search results being accurate and informative, search engines have a duty to their own business ethics – as well as their customers – to frown upon cloaking, and they do. Do it, and your website will be removed from search results and most likely blacklisted.

So what is cloaking? Cloaking is the practice of writing code so that human visitors to your website see something very different from what a search engine bot crawling it sees. If you cloak effectively, you could indeed disguise your adult site as something as harmless as window cleaning, and you'd benefit from a good SEO ranking. You'd also, unfortunately, ruin the search engine's results, and they can't be having that. When it comes to cloaking: avoid.
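To make the concept concrete, here is what a crude cloak looks like in code: content is chosen by sniffing the visitor's User-Agent string. This is an illustration of what to recognize and avoid, not something to deploy; the bot marker strings are assumptions for the example.

```python
# Illustration only: a crude user-agent cloak, the kind of thing
# search engines penalize. Shown so you can recognize it, never use it.
def choose_content(user_agent: str) -> str:
    # Assumed crawler markers; real bot detection is more elaborate,
    # which is exactly why cloakers eventually get caught.
    bot_markers = ("Googlebot", "bingbot", "DuckDuckBot")
    if any(marker in user_agent for marker in bot_markers):
        return "keyword-stuffed page served only to crawlers"
    return "the page real visitors actually see"
```

Google also crawls with browser-like user agents and compares what it sees, so this trick is both unethical and short-lived.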

Duplicate Content


Among those well versed in internet marketing, duplicate content is something of a sticky issue. The exact nature of the problem lies in what constitutes duplicate content: some internet marketers insist that anything previously published on any other website qualifies, while others say it only matters when the same text is repeated on the same website.

The exact definition isn't settled, and the search engines are not particularly forthcoming on the issue. However, if you are found to be using duplicate content on your website and a search engine takes issue with it, you can kiss a good ranking with that search engine goodbye.

It is more likely, though not certain, that the duplicate content rule applies to text reused within the same site. You should not, for example, create lots of pages that all use the same article with no changes. This is the lesser version of duplicate content, though some marketers still insist that search engines frown on the same article or text being reused from anywhere on the internet, and that doing so will trigger a duplicate content penalty.

The idea, of course, is to avoid plagiarism and for search engines to avoid publishing results that show the same text over and over again. To be absolutely sure you’re not committing the duplicate content sin, always write and use original content, both within your website and externally. That way, you can be sure – no matter who is right and wrong in the debate – that you aren’t going to be penalized for it.
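If you want a quick self-audit, you can compare two page texts for near-duplication with the standard library. This is a minimal sketch: the 0.9 threshold is an arbitrary choice, and search engines use far more sophisticated (and undisclosed) signals.

```python
from difflib import SequenceMatcher

def near_duplicate(text_a: str, text_b: str, threshold: float = 0.9) -> bool:
    """Flag two page texts as near-duplicates when their similarity
    ratio (0.0 to 1.0) meets an arbitrary threshold.
    """
    ratio = SequenceMatcher(None, text_a, text_b).ratio()
    return ratio >= threshold
```

Running this across your own pages pairwise is a cheap way to catch accidental self-duplication before a search engine does.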

Linking To Bad Sites


Have you ever heard the phrase 'falling in with a bad crowd'? Well, if you link to websites that search engines consider 'bad', that's the search engine optimization equivalent of falling in with a bad crowd. While your website may not be intrinsically 'bad' in itself, if you promote (by linking) sites that violate the terms and conditions of major search engines, you'll be tarred with the same brush. While it's unlikely your site will be completely blacklisted, you may see a sharp fall in your rankings, or be removed from the search results altogether.

This, of course, begs the question: how do you know what a 'bad' site is? After all, if someone links to you, you're probably going to want to do the decent thing and return the favor. That's what so much of website building, networking and promotion is all about, right? So how can you be sure you're not destroying your own search engine chances by linking to a poor site that search engines consider bad?

It’s tricky, but the basic answer is to use your gut. How does the website look? Does it look professionally designed, properly maintained? Is the content unique, or does it all sound familiar, or is the English terribly written?

On a more technical basis, you can check the site's metrics (for example, Moz's DA/PA or Majestic's TF/CF), and also its standing with Alexa. This should give you a good sense of the website in question's general standing, and whether or not it's the kind of crowd you want to be associating with. Also familiarize yourself with Google's terms of service, and scan the site for any obvious violations. If it passes, feel free to post a link back.
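Before you can vet anything, you need a list of who you are actually linking to. A small sketch with the standard library can pull the external links out of a page for manual review; the domain and sample HTML below are placeholders.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkCollector(HTMLParser):
    """Collect links pointing off your own domain, so each outbound
    destination can be vetted by hand before you keep the link."""

    def __init__(self, own_domain: str):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links have no host; same-domain links are skipped.
        if host and self.own_domain not in host:
            self.outbound.append(href)

# Placeholder HTML standing in for one of your own pages:
sample_html = '<a href="https://example.org/tips">tips</a><a href="/about">about</a>'
collector = OutboundLinkCollector("mysite.com")
collector.feed(sample_html)
```

Feed it each page of your site and you have a checklist of neighbors to inspect.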

Hosting Viruses, Malware or Other Nasties


This may seem obvious; no search engine is going to rank you well in their search results if their bots discover that there is spyware, malware, viruses or any other kind of internet nasties contained within your website. In fact, if a bot does discover such content, your site will most likely be removed and blacklisted for good.

So that’s simple – and most of you won’t even be considering hosted that kind of content anyway, so there’s nothing to worry about, right? Perhaps wrong. Many sites are subject to hacking, which leads to them being infected with the nasties that search engines (and internet users in general, for that matter) hate so much. Even sites with thoroughly strong security can be hacked and infected, quite without the owner’s knowledge. So you could be merrily promoting your site, working on its content and ensuring your SEO is tip top, but you may not be aware that your site is infected and only a few steps away from being blacklisted forevermore.

There are a few things you can do to prevent this. The first is obvious but crucial: visit your site regularly with your anti-virus running, and check that everything seems okay. Secondly, you can get a good idea of what other people think of your site by installing a browser add-on called Web of Trust. This displays a ring in one of three colors near the browser menu for each website: green means the website is 'safe', orange means 'doubtful' and red means 'avoid this site'. These ratings are user generated, so the add-on lets you check that no one is experiencing problems with your site.
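You can also grep your own pages for the telltale fingerprints of a hack. This is a crude heuristic, not a real malware scanner, and the two patterns below are illustrative assumptions: injected zero-size iframes and classic eval/unescape obfuscation are common signs, but a clean result proves nothing.

```python
import re

# Crude heuristic patterns, not a substitute for a real scanner.
SUSPICIOUS_PATTERNS = [
    r'<iframe[^>]+(?:width|height)\s*=\s*["\']?0',  # hidden zero-size iframe
    r'eval\s*\(\s*unescape\s*\(',                   # classic JS obfuscation
]

def looks_injected(page_html: str) -> bool:
    """Return True if the HTML matches any known-bad pattern above."""
    return any(re.search(p, page_html, re.IGNORECASE)
               for p in SUSPICIOUS_PATTERNS)
```

Run it over your saved page sources after every deployment; a hit means inspect the page by hand immediately.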

Title Stacking

When it comes to search engine optimization, one of the most useful tools in a web developer's arsenal is the <title> tag within HTML code. Unlike articles, which must be built around keywords (a procedure that is never easy), the <title> tag is a section of code you can pack with your keywords, all without having to worry about context, readability, and everything else an article needs. The extra bonus is that your main page can have a <title> tag full of keywords, and keywords are often hard to fit on a simple "welcome to this website" page.

The usefulness of the <title> tag is also one of its major problems. The tag is so powerful, so influential, and so easy to use that those employing shady black hat SEO techniques quickly learned how to manipulate it. They discovered that by using more than one pair of <title> and </title> tags in a web page's HTML, they could fit in many more keywords, and thus rise up the search rankings. Using multiple sets of <title> tags is, understandably, known as title stacking.

Get caught doing it by search engines, and you'll be dropped from the search results quicker than you can say 'jack rabbit'. It might work for a while, but the overall quality and reliability of your site will soon be called into question, because you will get caught. Use one set of <title> tags only, and keep the keywords relevant to your site.
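Checking your own pages for accidental title stacking is easy with the standard library. A minimal sketch (the sample HTML is a deliberately bad placeholder):

```python
from html.parser import HTMLParser

class TitleCounter(HTMLParser):
    """Count <title> tags so you can verify each page has exactly one."""

    def __init__(self):
        super().__init__()
        self.title_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.title_count += 1

# Placeholder page that commits the title-stacking sin:
page = "<html><head><title>One</title><title>Two</title></head></html>"
counter = TitleCounter()
counter.feed(page)
# counter.title_count greater than 1 means the page should be fixed.
```

Run this over every template after edits; a count other than exactly one is a bug either way.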

Avoid Black Hat Techniques


They appear every so often on internet marketing forums: people claiming to have discovered a foolproof "black hat" search engine optimization technique. Their technique, available for a price, will supposedly propel your website to the top of the search engine listings, and of course they guarantee you'll never get caught.

Now, think about it. While we'd all like to believe there are methods that can get us to number one in Google with no effort whatsoever, it just isn't true. Google is huge, and it's smart. There's no denying that those employing "black hat" techniques (a phrase describing methods that violate the terms of service of Google or other search engines) may experience success at first, but it won't be long term. Not ever. In fact, they'll be lucky if it works for a few days.

Let’s say these people, these forum peddlers, really had discovered a flawless technique to guarantee themselves top of the pile picks in search engine results. Do you think they’d be selling their method for a couple of bucks on forums? No, of course not. If their method really worked, they’d be creating small affiliate websites in every profitable niche, working their SEO black hat magic and sitting back to watch the profits roll in. Furthermore, the more they publicize their method, the more likely it is that Google will discover it – so why would they risk it?

They wouldn’t, because these methods don’t exist. Avoid them. Don’t waste money, both on purchasing the method and the subsequent building and use of method on a website, on something that is doomed to fail.
