This is a guest post from Servando Silva of Stream SEO, one of the smartest marketers I know. He’s one of the first people I go to when I have questions about my own projects, as he understands online marketing at a much deeper level than others.
While there are a ton of SEO case studies showing how to increase traffic and rank new websites, there aren’t as many where people talk about how they recovered penalized websites.
Honestly, recovering a penalized website can be hard if you don’t know what you’re doing, and there’s always a possibility Google won’t care anyway and the domain is doomed.
That’s why a lot of people simply move to a new project/domain once they have a problem they can’t solve.
Today I’d like to show you a recent case study of a website which had been dead pretty much forever, but had the potential to start growing once we fixed a few things.
Many of the problems this website had are well-known mistakes people still make from time to time because they don’t understand SEO keyword research and how the Google algorithm works.
In this article, I’ll show you exactly how this site recovered from a Google Penalty and now receives thousands of visitors per month.
What Is A Google Penalty | Types Of Google Penalties
Before I go into the specifics of why this website was penalized and how we recovered it, I want to give you a brief overview of what a Google Penalty is and why websites are penalized.
A Google Penalty is a negative impact on a website’s search appearance when it goes against Google’s established search engine optimization guidelines.
When a website is penalized, it loses its search ranking and stops appearing in Google Search results.
As a result, it stops getting traffic from search engines.
There are two types of penalties:
- Algorithmic Penalties: These penalties come into effect as a result of a change in Google’s algorithms. Their impact is widespread, applying to all the websites indexed in Google Search. For example, Google Penguin was an algorithmic update that penalized websites using over-optimized anchor text for link building.
- Manual Penalties: A manual penalty occurs when someone from Google’s webspam team manually checks a website and penalizes it for malpractices. The penalized website is notified via email and a message in its Google Search Console account.
Both manual and algorithmic penalties can vary in their effect depending on the intensity of your site’s violations.
The Top Reasons For Google Penalties
To recover a website from a Google penalty, you first need to understand why it was penalized.
The most common reasons why websites are penalized by Google and downgraded in search results are:
Low Quality/Thin/Copied Content
Websites that copy content from other sites or publish low quality, thin, or spun content are routinely penalized by Google Search algorithms.
These are mostly low-quality websites that are looking to rank quickly for their target keywords without actually putting in the hard work or offering any real value to the user.
Their content doesn’t make sense and does not answer the user’s questions.
As a result, Google drops them from its search index.
Spammy Backlink Profile
A poor backlink profile is among the biggest reasons why websites are penalized. Google considers backlinks as votes of confidence from one site to another.
Generally, the more quality backlinks a site has, the higher it ranks.
But Google strictly prohibits websites from artificially building backlinks instead of earning them with high-quality content.
Most of the sites that are penalized for backlinks have links coming in from Private Blog Networks (PBNs), links that are clearly paid for, spammy links created through bots and software, or irrelevant links coming from sites that they have nothing to do with.
Overoptimized Anchor Text
Over-optimized anchors means using the keywords you want to rank for as the anchor text of your backlinks every time you create a link on another site, whether it makes sense or not.
This clearly shows your links were not earned but artificially created, which results in a penalty.
Poor User Experience
If your site is not built using a responsive website design or if it takes too long to load, the user experience suffers.
As a result, people bounce from your site without finding it useful.
Naturally, Google downgrades such sites in its search rankings.
How I Recovered A Website From Google Penalty
Now that you know why sites are penalized, let me tell you how I identified a penalized site, found its core issues, and then resolved them to recover its lost rankings.
However, I can say many of the problems this website had also come down to how Google has evolved in the last few years, focusing more on the quality of content as well as the quality of links.
So, in case you have a website that has been active for at least 5 years but the traffic has tanked or it’s declining every month, this case study could help you as many things that DON’T work nowadays used to work 5-10 years ago.
For security and competitive reasons, I won’t reveal the URL of the website shown in this case study, but we’ll show everything else, including the link profile, traffic stats, and more.
Here’s the graph I just pulled from Ahrefs to show you the organic traffic evolution of this particular site:
As you can see, the organic traffic is about to reach 6,500 visits per month, and a lot of keywords are ranking on the first page of Google.
This trend started back in Q4 2018.
Before that, the site was pretty much dead, with a few visits here and there but nothing that could really earn money. I’m sure the website wasn’t even covering the cost of its hosting and domain name at that point.
Now let’s have a closer look at the last 12 months so you can clearly see when the traffic started coming up:
The trend started around September, but it really took off around November 2018. If you look closely at the graph above, you’ll see the site already had a lot of keywords ranked between pages 2-10 of Google, but pretty much none of them were reaching page 1, which is where the traffic and value are.
Here’s a screenshot of the 12 months historical data from Google Search Console. The screenshot is taken from a Spanish account but as you can see, it matches the data from Ahrefs.
And just because people love to see Google Analytics screenshots, here’s a pic of the same data from Google Analytics with all the historical data from 2018:
I wanted to show you all the details of the website because as you can see, the site is not new at all.
It had a few hundred visits every month (some organic, some from other sources), and it also had a ton of keywords ranked on pages 2-10 of Google pulling very little traffic. So, this is not a new website created in the second half of 2018 that just started gaining traction a few months later.
This is an old website that has been around for almost 10 years, and its traffic had completely tanked without gaining any traction. The website itself has more than 100 articles, but it used to have more than 300 at the beginning of 2018. We will talk about that later.
Now let me tell you a story.
This website wasn’t created by me. It’s owned by one of my friends, Javier, who lives in Spain. He bought it a couple of years ago, but it never got traffic, so he kind of forgot about it for a while until we decided to do this experiment and try to revive it.
Now, as for the metrics of this website, you can see from Ahrefs that it has low-level authority and a ton of backlinks.
The first problem we noticed was that the ratio of backlinks to referring domains was too damn high: 20,000 backlinks from 200 domains means each domain, on average, had 100 backlinks pointing to this site.
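That ratio check is easy to automate against any backlink export. Here’s a minimal sketch; the “suspicious” threshold of 20 links per domain is my own assumption for illustration, not an official guideline.

```python
# Sanity check on a backlink profile: a very high ratio of total
# backlinks to referring domains often signals sitewide (footer,
# sidebar) or spammy links rather than editorially earned ones.
# The threshold of 20 is an arbitrary, illustrative assumption.

def backlink_ratio(total_backlinks, referring_domains, threshold=20):
    """Return (average links per referring domain, looks suspicious?)."""
    avg = total_backlinks / referring_domains
    return avg, avg > threshold

# The case-study site: 20,000 backlinks from 200 referring domains.
avg, suspicious = backlink_ratio(20_000, 200)
print(avg, suspicious)  # 100 links per domain on average -> suspicious
```

A healthy, naturally earned profile tends to sit much closer to a handful of links per referring domain.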
In short, this usually means the backlinks come from spammy or low-quality sites, or from forums, profile pages, and other places where you can drop a link and it duplicates itself across several pages.
The Domain Rating (9) clearly shows the backlinks don’t carry much authority, yet they were enough for a ton of keywords to be indexed and ranked between page 2 and page 10 of the SERPs.
Ahrefs has historical data going back to 2013, and there we discovered the ratio of low-quality links pointing to this site was even worse 4 years ago: the site had almost 100,000 referring pages coming from just 200 domains.
Also, I want to be clear that we did not build any new backlinks to this site to try to revive it. We didn’t get a single good-quality link to push the rankings to page 1, because we thought the problem was the site itself and not just the low-quality links.
After all, the site was already indexed and ranking; it just wouldn’t reach the first page for any keyword.
Our plan to revive the website
Since this website wasn’t really worth much, we decided to experiment with a plan consisting of two parts.
Step 1. Check the website and delete all the content that wasn’t ranking, or that Google would consider thin or low quality. The same went for low-quality backlinks.
Step 2. If the first part gave results (fortunately it did), we could move to the second phase of the plan which was to invest some money in it, write more content and build some high quality backlinks to boost the rankings.
Now that we’re getting 6,400 visitors per month I think it is time to move to phase 2, but that might be worth another post in the future. For now, let’s focus on part 1, which is how we pretty much revived the site from having a hundred visits per month to more than 6,000 visits per month.
To reach this goal, we mainly did three things on the website:
- Clean all thin content
- Update articles
- Disavow low quality links
Let’s quickly explain each one now:
Cleaning thin content
This was one of the main problems of the website. The previous owners had no idea about how Google works, so they decided to fill up the website with a ton of low quality, short articles and hoped for the best.
Several things were going on here. Keyword cannibalization, duplicated content, empty pages, and more were keeping this site from prospering after 7 years of existence. The on-page SEO was a total mess, and that can hurt a lot when you have a backlink profile like the one I showed earlier in this case study.
We decided to delete most of the posts in the blog that had super low-quality content or a word count of 400 words or less.
By doing this we reduced the number of posts available from more than 300 to around 120. So, we ended up deleting close to 200 posts completely as they were super low quality and many of them had less than 300 words!
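The audit itself is simple to script. Here’s a minimal sketch of the thin-content pass; in practice you’d feed it from a WordPress export or the REST API, and the sample posts below are made up for illustration.

```python
# Flag posts under a word-count threshold as deletion candidates.
# 400 words is the cutoff used in this case study; the post bodies
# here are synthetic examples, not real content.

THIN_WORD_COUNT = 400

def find_thin_posts(posts, min_words=THIN_WORD_COUNT):
    """posts: dict of {slug: body text}. Returns slugs to review/delete."""
    return [slug for slug, body in posts.items()
            if len(body.split()) < min_words]

posts = {
    "short-review": "word " * 250,    # 250 words -> thin
    "in-depth-guide": "word " * 1200, # 1,200 words -> keep
}
print(find_thin_posts(posts))  # ['short-review']
```

Anything the script flags still deserves a human look before deletion, since a short post can occasionally be the right answer to a query.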
We resubmitted the sitemap and waited a few weeks before we started to see some extra traffic. This was a good signal.
As part of clearing thin content, we then decided to deal with pages that could look like duplicates to Google. But instead of just deleting those, we used the Yoast SEO plugin to deindex all the category, author, and tag pages that were also indexed in the SERPs.
Tags were very popular 10 years ago (remember tag clouds?) and everybody used them. But nowadays I avoid them like the plague in WordPress.
After a few months passed we ended up deleting most of the tags once they were deindexed (we still have some more to go but eventually we’ll get there).
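After flipping archives to noindex in a plugin, it’s worth verifying the robots meta tag is actually being emitted on those pages. Here’s a minimal standard-library sketch; the sample HTML below mimics what a plugin like Yoast outputs and is not fetched from a real site.

```python
# Check whether a page's HTML carries a noindex robots meta tag,
# e.g. <meta name="robots" content="noindex, follow">.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "robots":
            if "noindex" in attrs.get("content", ""):
                self.noindex = True

def is_noindexed(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

tag_page = '<head><meta name="robots" content="noindex, follow"></head>'
post_page = '<head><meta name="robots" content="index, follow"></head>'
print(is_noindexed(tag_page), is_noindexed(post_page))  # True False
```

Running a check like this over your tag and category URLs catches the common mistake of a caching layer still serving the old, indexable markup.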
The Google Search Console panel doesn’t show more than 3 months of index coverage history, but you can see how the number of excluded pages (gray bars) increased a lot once we marked the tags and categories as noindex and excluded them from the sitemap:
Also, here’s the graph of the valid indexed pages after the cleanup. Right now, we have a bit more than 110 pages indexed but you can see 2 months ago we still had close to 180 pages indexed.
This number decreases every week and every time this happens we get rewarded with more traffic as Google detects we’re erasing and deindexing all of the low quality pages we had indexed before.
The whole process should take a few more months but eventually we’ll just have the post pages and home page indexed in the SERPs. Google can take several months to clean this up, but we’re patient and we don’t want to force things.
Updating articles
Another thing we did was update the articles we kept, adding more information to provide more value. Most of the main articles that were ranking on page 2 were updated with more content. Nowadays the word count easily ranges from 1,000 to 3,000 words per article.
Of course, there are a few articles where we didn’t really have anything else to add, but at least we tried to keep them at 500+ words each.
We didn’t mess with the published dates or anything else, but that might be worth trying in the future.
Disavowing low-quality links
The last thing we did was disavow a few domains that were pointing tons of referring pages at our site.
Since the site doesn’t have much authority I wasn’t sure if this was necessary but Javier insisted on doing it as it was an easy task.
The website has around 200 referring domains, so it wasn’t difficult to use SEMrush and Ahrefs to find which domains were the spammiest and lowest quality, and then submit them in a disavow file.
Many of the disavowed websites are auto-generated content from bots and scrapers, so I don’t think they would hurt a site with good authority, but in this case it was something we could test.
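The disavow file itself is just plain text in the format Google’s Disavow Links tool expects: `#` comment lines, `domain:` prefixes for whole domains, and bare URLs for individual pages. Here’s a small sketch that builds one; the spammy domains listed are made-up examples.

```python
# Build a disavow file in the format accepted by Google's Disavow
# Links tool: '#' comments, 'domain:example.com' for whole domains,
# and plain URLs for individual pages. The domains below are
# fictional placeholders, not real spam sources.

def build_disavow_file(spam_domains, spam_urls=()):
    lines = ["# Disavow file generated from SEMrush/Ahrefs review"]
    lines += [f"domain:{d}" for d in spam_domains]
    lines += list(spam_urls)  # individual URLs are listed as-is
    return "\n".join(lines) + "\n"

content = build_disavow_file(
    ["spammy-example.com", "bot-links-example.net"],
    ["https://forum-example.org/profile/12345"],
)
print(content)
```

Disavowing at the `domain:` level is usually the safer choice for sites whose spammy links duplicate themselves across hundreds of pages, as they did here.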
If you have a big site with good authority, you’ll most likely see scrapers and duplicated content from low-quality websites, but they won’t hurt your rankings because you already have powerful links from high-authority sites.
As you can see, after all the changes we started doing in Q3 and part of Q4, the site started gaining some traction and it has been growing ever since:
During October and November many of the keywords started reaching the first page of Google and the site went from a hundred visits to 1,000+ visits per month.
December came and the site reached close to 3,000 visitors per month, which was great for the shopping season. This is an Amazon Associates website after all.
During January 2019 we reached about 4,000 visits and it looks like February might end up with close to 7,000 visits.
The website is ranking for many high-volume keywords on the first page of Google, and many of them are buying-intent keywords:
We still don’t know how much growth potential the site has without adding some powerful backlinks, but we’ll keep testing during this year and see what happens.
What we’ve already started doing is adding some new articles to let Google know this domain isn’t forgotten. So far, we’ve added around 10 articles in 2019, and some of them are already ranking on the 2nd page of Google.
I know most people like to see numbers, so here’s a screenshot of the revenue generated by the Amazon Associates program in 2018:
The traffic comes mainly from Spanish-speaking countries, including Spain, Mexico, and Chile, but those earnings are mostly from Spain. We plan to add other Amazon accounts soon, if possible, to increase the revenue.
Now I mentioned having a spike of traffic during November-December was great because that’s when people are in shopping mode and Amazon sites usually benefit a lot from it.
The site started making $10-20 per month in Q3, but it reached close to $300 in December thanks to the Christmas season. Earnings have dropped since the beginning of 2019 to $100-200 per month, but traffic is still growing, so it should be good in the long run.
I like to start monetizing my sites once they reach 1,000 visits per month.
To be honest, I’m not a huge fan of Amazon Associates, and I think there are plenty of other ways to monetize a website where my skills are put to better use. But in this case, since I wasn’t the one who bought the website and didn’t choose its niche, I can’t do much to change it.
What I’ve noticed though is ranking for keywords in other languages (not English) is way easier and can be very profitable as well.
The link profile of this domain is still pretty weak, and it has a lot of room to grow if we add some powerful backlinks. Ahrefs scores the difficulty of most of the keywords we’re ranking for below 4.
The same keywords in English usually have a 10-20 difficulty score, which means we would need a ton more authority to rank in the first page.
Would you like to see an update a few months later once we add some powerful links?
Please let me know if you have any comments or questions below!