61% Of All Website Traffic Is Bots – How This Impacts Small Businesses





These days, you may be getting more non-human visitors to your website than human ones — and you may not even be aware of it. Some of those non-human visitors may be there to do your site harm.

If the idea of non-human visitors conjures up an image of Arnold Schwarzenegger as the Terminator coming to take down your site, relax: it’s nothing quite so dramatic. These non-human visitors are “bots.”

A recent report by Incapsula says that bot visits to sites are now up to 61.5% of total website traffic. This is a lot, so it pays to know what these bots are and what they are potentially up to.

What is a Bot?

A bot is a small robot-like software app that roams the Internet, jumping from site to site.  There are good bots and bad bots.

A good bot is sent out by another site (say, Google) to collect information or to perform a specific task, and it jumps from site to site via the links on each site.

The good kind are typically search engine bots that index your site, such as the GoogleBot. Search engine bots account for 31% of total website traffic, which is roughly half of that 61.5% figure.  You can pretty much trust and forget these good bots. They will do their business and then quickly be on their way. However, still keep one eye on them, because some bad bots have been known to disguise themselves as GoogleBots.
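
Incidentally, Google’s published way of checking whether a visitor claiming to be GoogleBot is genuine is a reverse DNS lookup on the visitor’s IP address, followed by a forward lookup to confirm the answer. Here is a minimal sketch of that check in Python (the IP address shown is only an example you might pull from your server’s access log):

import socket

def is_real_googlebot(ip_address):
    """Verify a claimed GoogleBot visitor with a reverse + forward DNS lookup."""
    try:
        # Reverse lookup: genuine GoogleBots resolve to googlebot.com or google.com
        hostname, _, _ = socket.gethostbyaddr(ip_address)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup: the hostname must resolve back to the same IP address
        return ip_address in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        # No DNS record at all -- treat the visitor as not being GoogleBot
        return False

print(is_real_googlebot("66.249.66.1"))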

It’s the other 30.5% — the bad bots — that you need to be concerned about. Here’s where you start to enter a murky side of the Internet — bots that could potentially do your website and business harm.

Scrapers are bots that will kick in your website door and steal all of your content. That content is then passed back to the scraper owner, who passes it off as his own (and probably tries to profit from it by putting advertisements on his page).  Maybe this very article will end up being scraped. It’s impossible to tell in advance, which is why you should monitor mentions of your site and brand name by regularly searching Google to see what pops up elsewhere online.  By monitoring, you may be able to get your stolen content taken down.

Spammers are another type of entity using bad bots.  If you run a blog, for instance, you should be monitoring your comments daily. Otherwise, your pages will quickly fill up with spam such as links to drugs, get-rich-quick schemes and other dodgy offers. It’s like having walk-in visitors come into your store, dump trash on your floor and leave.  You certainly wouldn’t allow that to happen, and you wouldn’t leave the trash there.  If you are using WordPress for your site, a spam blocker called Akismet will block 99.9% of your spam (see the diagram above for how Akismet works). But nothing is perfect in life, so still check your comments section religiously.
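
Under the hood, the plugin sends each incoming comment to Akismet’s service, which replies with a spam or not-spam verdict. As a rough sketch of that documented comment-check call outside WordPress — the API key, blog URL and comment details below are placeholders you would replace with your own:

import requests  # third-party HTTP library: pip install requests

AKISMET_KEY = "your-akismet-api-key"  # placeholder
BLOG_URL = "http://www.example.com/"  # placeholder: your site's front page

def looks_like_spam(user_ip, user_agent, author, content):
    """Ask Akismet's comment-check endpoint whether a comment looks like spam."""
    response = requests.post(
        f"https://{AKISMET_KEY}.rest.akismet.com/1.1/comment-check",
        data={
            "blog": BLOG_URL,
            "user_ip": user_ip,
            "user_agent": user_agent,
            "comment_type": "comment",
            "comment_author": author,
            "comment_content": content,
        },
    )
    # Akismet answers with the literal string "true" (spam) or "false" (not spam)
    return response.text.strip() == "true"

if looks_like_spam("203.0.113.7", "Mozilla/5.0", "Anon", "Cheap pills, click here!"):
    print("Hold this comment for moderation")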

The bad bots you should really be concerned about are the ones used by hackers. Some people love nothing more than to break into a site, take it down, deface it, destroy the files, and change the login details so you can’t get it back.  Bad bots may be designed to open the door for those hackers.  You can help protect yourself against this by backing up your site daily, locking out IP addresses after a certain number of failed logins, and even employing Google Authenticator to provide a second layer of protection.  Another thing you can do is use a service that blocks bad bots from doing their thing before they have a chance to trash your site.
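
To illustrate the “lock out IP addresses after a certain number of failed logins” idea, here is a small, purely hypothetical sketch of the bookkeeping involved. In practice a WordPress plugin (such as Limit Login Attempts, mentioned in the comments below) or your host’s firewall does this for you; the threshold and names here are made up for illustration:

from collections import defaultdict

MAX_FAILED_ATTEMPTS = 5             # hypothetical threshold
failed_attempts = defaultdict(int)  # failed login count per IP address
blocked_ips = set()                 # IPs that are currently locked out

def record_login_attempt(ip_address, success):
    """Track failed logins per IP and lock the IP out after too many failures."""
    if success:
        failed_attempts.pop(ip_address, None)  # reset the counter on success
        return
    failed_attempts[ip_address] += 1
    if failed_attempts[ip_address] >= MAX_FAILED_ATTEMPTS:
        blocked_ips.add(ip_address)

def is_blocked(ip_address):
    """Check this before even showing the login form."""
    return ip_address in blocked_ips

# Five bad passwords in a row from the same address triggers a lockout
for _ in range(5):
    record_login_attempt("198.51.100.23", success=False)
print(is_blocked("198.51.100.23"))  # True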

Having a website is a must today, but the world has some unpleasant individuals in it, so stay on the alert!

Image: Incapsula

Mark O'Neill Mark O'Neill is a staff writer for Small Business Trends, covering software and social media. He is a freelance journalist who has been writing for over 25 years, and has successfully made the leap from newspapers and radio onto the Internet. From 2007-2013, he was the Managing Editor of MakeUseOf.com.

22 Reactions
  1. The first response I have is “Don’t fall in love with your analytics.” This shows that a good portion of visits aren’t people, which increases bounce rates, decreases time on site, etc. Make sure you’ve got metrics that reflect real people, like conversion rates, signups, etc.

    • I agree that metrics regarding real visitors are important, as that is where the development of a site lies. But at the same time, you need to keep a close watch on the bots, for the reasons outlined in the article. Otherwise you may very well find that you don’t have a site at all!

    • But I guess you really cannot prevent Googlebots from visiting your site. These bots have to crawl your site so that they can gather information about it on a regular basis.

  2. 30.5% bad bots is a huge amount.
    Websites should take all kinds of actions to prevent bad bots.

  3. Thanks for sharing, Mark. You need to keep one thing in mind: Akismet does not prevent bots from visiting your website. It only stops spam comments. Bot visits will still register with whatever analytics tool you’re using.

    For WordPress, to help block bots out I would recommend trying the Bad Behavior plugin. Also, all our websites are behind Cloudflare, so not only does it block bots, but we also see accurate analytics compared to JavaScript-based analytics (like Google Analytics), and we even see a breakdown of threats.

    Thought I’d mention this.

  4. As far as I know analytics does not count bot visits. But nevertheless analytics is not the only problem that they pose.
    Agreeing with Mark, content theft is one of the biggest problems, as far as I understand, where social media is concerned.
    Great to know about Akismet. Hope to see more solutions like that.
    A great read overall.

  5. There’s one of my blogs where the traffic was higher than usual last month. I hadn’t written on it for a bit. When I logged in, I noticed there were LOTS of spam comments (moderated, of course). So (sigh), that’s where I guess a lot of the traffic came from.

    I do have Akismet though, which helps. And I also have Limit Login Attempts, which locks out anyone who makes too many failed attempts to log in to that blog.

    In any case, it is a worry that such a large proportion of traffic is bad bots.

  6. Mark,

    I agree – bots are a big problem. One more trouble caused by bots: they drain your server resources – for nothing, literally.

    One of the ways to reduce the aggressiveness of the bots is by blocking the bad ones. This page (not mine) lists some of the bad bots you’ll want to block. Consult with your techie guy on how to block them.

    http://www.askapache.com/htaccess/blocking-bad-bots-and-scrapers-with-htaccess.html

    • Thanks for the link, Ivan. Though I appreciated spam comments being blocked via Akismet, I wish there was a WordPress plug-in that actually blocked bad spam bots. That would be neat – very useful.

      • ebele, see my comment above. I do mention a plugin, Bad Behavior, that does block bots, and I also mention Cloudflare. Hope that helps 🙂

      • Hi Viktor…

        Thanks. Bad Behavior, huh? For bad behavior 🙂

        Checking it out now as we speak: four-star rating, free to download, I’m sold! Gonna download it and give it a shot.

        Thanks! Will look into Cloudflare as well.

  7. Identifying and blocking bad spam bots is a must if you are concerned with optimizing your website for search engines. A second good practice is maintaining a backup of the website at all times in case the site is hacked.

  8. Akismet doesn’t block most bots. If you rely on it on a popular blog, you’ll either waste a ton of time wading through what it throws into spam to find comments like mine (which consistently end up there), or, if you delete spam without checking it, you will lose comments from some of your most regular commenters.

    That is why I don’t moderate comments in any blog that doesn’t run the GrowMap anti-spambot plugin that Andy Bailey @CommentLuv named after my blog. If you’ve seen the checkbox to click that you’re not a spammer or that you’re human or any other variation, you’ve seen this plugin in action.

  9. I use Akismet to block spam and I still get a hell of a lot of it every day. There are 900+ spam comments waiting to be deleted.