
How to Prevent Your Blog from Being Copied by Others

In Blogging by Fathi Arfaoui | Last Updated: September 16th, 2017

For a long time, bloggers have been told that the only way to rank well in search engines is to create high-quality content and then wait for web crawlers to index it. That delay is exactly what web scrapers exploit to outrank your original content, simply because their sites have higher PageRank or are crawled more frequently.

If you run a new blog, publish fresh articles, and work hard on them, these people can take your work and rank for it without your knowledge. The result is a site full of content but with no traffic. The reason? Google sees your site as the one with copied content, even though it is the original source. This is not the general rule, but it happens a lot, especially to new sites without links or authority.

Stop others from copying your website content

Use Google Search Console

Of course, you can use any of the many plagiarism-checking tools out there, but what do they tell Google? Nothing. What you actually need to do is tell Google that you have published a new blog post or page and that it should be crawled as fast as possible. The easiest way to do that is through Google Webmaster Tools (Search Console), which includes a tool called Fetch as Google. First, log in to your Google Webmaster dashboard, then find the tool under the “Crawl” menu, as in the screenshot below.

Webmaster Tools

After clicking the “Fetch as Google” option, you will see a form like the example below, where you enter the URL of the new post, article, or page. Note that Google fills in your domain name automatically; you only need to add the last part of the URL.

Protecting Original Blog Content

As you can see in the example above, there are two options, but you only need the first one. The second option, “FETCH AND RENDER”, shows you exactly how the Google crawler sees your page.

Anyway, use the first one, and Googlebot will visit that page instantly or within the next few minutes, depending on how busy the crawlers are at that moment. In general, crawling is instant.
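If you are curious what a Googlebot request to your page looks like before you submit it, you can preview one yourself from the command line. This is only a rough sketch, not part of the Fetch as Google tool: it sends a HEAD request using Googlebot’s public user-agent string (the URL below is a placeholder you would replace with your own post).

```shell
# Placeholder page URL: swap in your real post URL.
URL="https://example.com/my-new-post/"
# Googlebot's documented desktop user-agent string.
UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Show the exact command that would be sent.
echo "curl -A \"$UA\" -I \"$URL\""

# Uncomment to actually request the headers as "Googlebot":
# curl -A "$UA" -I "$URL"
```

Keep in mind that a real Googlebot visit also executes rendering, so this check only confirms that your server responds normally to the crawler’s user-agent; it is not a substitute for the “FETCH AND RENDER” option.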

Google will then show you the crawl status of that page; it should be green with a “Complete” status, as in the following screenshot. If you get errors, read this post, which covers the common crawl errors in Google’s tools. Now click the “Submit to index” button.

Indexing a new page

Google allows you to submit up to 500 individual URLs per month, which is enough for sites and blogs of any size, and this is the best option for crawling a single URL at a time. The second option lets you submit only 10 URLs per month, but it crawls the submitted page together with all of its linked posts and pages at once; you only need it when you have a new site or a lot of content that you believe is not indexed in Google at all.
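If you publish many posts at once and would rather not submit each URL by hand, another way to nudge Google to recrawl is to ping it with your sitemap. At the time of writing, Google accepts a simple GET request to its sitemap ping endpoint. A minimal shell sketch, assuming your sitemap lives at https://example.com/sitemap.xml (a placeholder):

```shell
# Placeholder sitemap location: replace with your blog's real sitemap URL.
SITEMAP="https://example.com/sitemap.xml"

# Google's sitemap ping endpoint takes the sitemap URL as a query parameter.
PING_URL="https://www.google.com/ping?sitemap=${SITEMAP}"
echo "$PING_URL"

# Uncomment to actually send the ping (prints the HTTP status code):
# curl -s -o /dev/null -w "%{http_code}\n" "$PING_URL"
```

This does not count against the Fetch as Google quota; it simply tells Google your sitemap has fresh entries, so it complements the single-URL submission above rather than replacing it.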

So, use the first option, as in the next example, to submit that single page to the index.

Single URL crawling

As we all know, Google does not guarantee to index every page you submit; however, new, original content is generally indexed fast. In my experience, if your blog or website has good content and has been online for at least a few months, indexing is nearly instant.

That’s it: you have submitted your latest blog post to Google’s index, and that protects your content naturally. Within minutes of publishing, Google will crawl and index your post. So when scrapers pick up the same content via RSS or another feed and republish it, Google already knows that you are the original publisher and that their copy is the duplicate.

Don’t worry about scrapers once your site becomes popular

Once your site becomes popular and authoritative, it will be crawled every hour or every few hours, which guarantees that your newest posts are found and crawled quickly. At that point, you no longer need this tool unless something goes wrong with indexing.

If you find that someone has republished your content and is ranking for it as if they were the original publisher, use the Google Copyright Removal tool and submit the exact URL of your page, or of any other page you own that you believe was copied.

That way, no one can outrank your posts just because they have better PageRank or authority, and you still get full credit for your hard work. Please share this post with your friends and with anyone starting a new blog; it will save them from wasting time on content that ends up published by others, leaving them with nothing but a duplicate-content label. It all comes down to authority and speed: Google needs to find your content first and recognize it as the original before anyone else can pass it off as their own.

About Fathi Arfaoui

Fathi Arfaoui is a physicist, blogger, and the founder and owner of Trustiko.com. He shares business, blogging, WordPress, and web-safety tips to help build better websites and blogs, along with online marketing strategies and recommendations.
