
What is Bot Traffic?


Bot traffic is essentially non-human traffic to a website. Bots are used extensively by online services to collect data from the internet and to enhance our user experience. Your search results on Google would be more like AltaVista or AOL if it wasn’t for bots (if you’re old enough to get those references, you’ll remember that search results pre-Google were pretty rubbish).

For things like repetitive tasks or scraping huge amounts of data, bots far outperform humans.

With this ability to perform repetitive tasks quickly, bots can be used for both good and bad.

“Good” bots can, for example, check websites to ensure that all links work. “Bad” bots, on the other hand, can be unleashed to target websites with enough traffic to overwhelm and take down the site (a DDoS attack).
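As a concrete illustration of a “good” bot, here is a minimal link-checker sketch in Python. The target URL is a placeholder, and a real checker would also throttle its requests:

```python
# Minimal "good bot" sketch: fetch a page, extract its links,
# and report any that no longer resolve. URL is illustrative.
import requests
from urllib.parse import urljoin
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(page_url):
    """Fetch a page and print any links that fail to load."""
    html = requests.get(page_url, timeout=10).text
    parser = LinkExtractor()
    parser.feed(html)
    for href in parser.links:
        url = urljoin(page_url, href)
        try:
            # HEAD is cheap; some servers reject it, and a fuller
            # checker would fall back to GET in that case.
            status = requests.head(url, timeout=10, allow_redirects=True).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            print(f"BROKEN: {url} -> {status}")

check_links("https://example.com/")
```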

As bots are just programmed scripts, they can perform any number of functions. Search engines such as Google, for example, use bots to crawl the web and fetch and analyze information, which in turn lets them keep search results updated and relevant.

For end users simply browsing websites, bot traffic isn’t really an issue.

But for site owners, bot traffic is critical: whether it’s making sure Google crawls your site properly, keeping your analytics accurate, maintaining the health and performance of your website, or preventing malicious behavior against your site and ads.

The fact is that more than half of all web traffic is bot traffic. What’s disturbing, however, is that 28.9% of all traffic is thought to be from automated and malicious sources.

Different Types Of Bot Traffic

To understand bot traffic more fully, it helps to look at its various kinds, from web crawlers for search engines like Google to malicious bots used to attack websites.

“Good Bots”

[Image: Not all bots are bad]
  • SEO: Search engine crawler bots crawl, catalogue and index web pages, and the results are used by search providers like Google to provide their service
  • Website Monitoring: These bots monitor websites and website health for issues like loading times, downtime, and so on
  • Aggregation: These bots gather information from various websites or parts of a website and collate it in one place
  • Scraping: Within this category, there are both “good” and “bad” bots. These bots “scrape” or “lift” information from websites, for example phone numbers and email addresses. Scraping (when legal, of course) can be used for research, but it can also be used to copy information illegally; a minimal scraping sketch follows this list
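To make the scraping category concrete, here is a minimal sketch that pulls email addresses out of a single page. The URL is a placeholder, and a legitimate scraper should respect robots.txt and the site’s terms:

```python
# Minimal scraping sketch: fetch one page and extract email addresses.
# The target URL is illustrative; a polite scraper checks robots.txt first.
import re
import requests

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scrape_emails(url):
    html = requests.get(url, timeout=10).text
    return sorted(set(EMAIL_RE.findall(html)))

print(scrape_emails("https://example.com/contact"))
```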

“Bad Bots”

  • Spam: Here, bots are used for spam purposes, often within the “comments” section of websites or to send you those phishing emails from Nigerian princes
  • DDoS: Bots can be used to take down your site with a denial-of-service attack, often a coordinated one launched from many machines at once
  • Ad Fraud: Bots can be used to click on your ads automatically, often together with fraudulent websites, to boost the payout for ad clicks
  • Ransomware and other malicious attacks: Bots can be used to unleash all kinds of havoc, including ransomware attacks that encrypt devices, usually in exchange for a payout to ‘unlock’ them

Read more about the different types of cyber crime here.

How To Detect Bot Traffic?

Detecting bot traffic is a great first step in ensuring that you’re getting all the benefits of the good bots (like appearing in Google’s search results) while preventing the bad bots from costing you.

When figuring out how to detect bot traffic, the best place to start is with Google Analytics.

If you have wondered to yourself, “Can I see bot traffic in my analytics account?”, the answer is: you can certainly get an indication of it. You need to know what to look out for, and even then you may not find a smoking gun.

The key ratios to keep track of here are:

  • Bounce Rate
  • Page Views
  • Page Load Metrics
  • Avg Session Duration

Bounce rate is expressed as a percentage and shows the visitors to your website who navigate away after viewing only one page. Humans are most likely to arrive on your site (from a search engine result, for example) and then click through to explore your offering. A bot isn’t interested in exploring your site, so it will “hit” one page and leave. A high bounce rate is a strong indicator of bot traffic.

[Image: A high bounce rate is an indicator of bot traffic]

Page views are almost the reverse of this. The average visitor might view a few pages on your site and then move on. If you suddenly see sessions where 50 or 60 pages are being viewed, this is most likely not human traffic.

Page load metrics are also really important to monitor. If load times suddenly slow down and your site is feeling sluggish, this could indicate a jump in bot traffic, or even a DDoS (Distributed Denial of Service) attack using bots. A tech solution might be required in some cases (more about this below), but watching load times is a good first step in detecting bots.

Average session duration will tell you a lot about how users from different sources are interacting with the site. In the image below, the Microsoft Corp Network is most likely bringing non-human traffic: two seconds is classic for bot clicks.

[Image: Average session duration by traffic source]
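As a rough illustration of how these ratios can be combined, here is a sketch that flags traffic sources exported from an analytics tool. The field names and thresholds are assumptions made for the example, not fixed rules:

```python
# Rough sketch: flag traffic sources whose metrics look bot-like.
# Field names and thresholds are illustrative assumptions, not fixed rules.
sources = [
    {"source": "google / organic", "bounce_rate": 0.42,
     "pages_per_session": 3.1, "avg_session_secs": 95},
    {"source": "microsoft corp network", "bounce_rate": 0.97,
     "pages_per_session": 1.0, "avg_session_secs": 2},
    {"source": "unknown referrer", "bounce_rate": 0.31,
     "pages_per_session": 58.0, "avg_session_secs": 40},
]

def bot_signals(s):
    """Return the list of suspicious signals for one traffic source."""
    reasons = []
    if s["bounce_rate"] > 0.90:
        reasons.append("very high bounce rate")
    if s["pages_per_session"] > 50:
        reasons.append("implausibly many page views per session")
    if s["avg_session_secs"] < 5:
        reasons.append("near-zero session duration")
    return reasons

for s in sources:
    reasons = bot_signals(s)
    if reasons:
        print(f"{s['source']}: suspect ({', '.join(reasons)})")
```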

How To Stop Bots From Crawling My Site

When figuring out how to stop bots from crawling your site, it’s important to keep in mind that some bots are good; that is, you want them crawling your site. It’s possible to prevent all bots from engaging with your website, but that also means you’ll fall out of Google’s search results, for instance.

Your first stop is your robots.txt file. This is a simple text file that gives guidelines to bots visiting your page in terms of what they can and can’t do. Without a robots.txt file, any bot will be able to visit your page. You can also set up your file so that no bots can visit your page (although see the warning above).
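As an illustration, a minimal robots.txt might welcome all well-behaved bots while keeping them out of a private directory, and shut out one known offender entirely (the paths and the “BadBot” name are examples):

```
# Example robots.txt: the paths and user agents are illustrative.
# Allow all well-behaved bots, but keep them out of /private/.
User-agent: *
Disallow: /private/

# Block one specific crawler from the whole site.
User-agent: BadBot
Disallow: /
```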

The “middle ground” is to put rules in place, and the good news is that the “good” bots will abide by these. The bad news, however, is that the “bad” bots will disregard these rules entirely.

When it comes to the “bad” bots, you’ll need to engage a tech solution. This is where a CDN (Content Delivery Network) service comes in. One of the advantages of a good CDN is the protection it can provide against malicious bots and DDoS attacks. Two of the most common are Cloudflare and Akamai, which can stop some bots from crawling your site. As Cloudflare themselves say, “Cloudflare’s data sources will help reduce the number of bad bots and crawlers hitting your site automatically (not all)”.

There are also purpose-built anti-bot solutions that can be installed, but it’s important to note that while most of these can protect your website relatively well, they cannot protect you outside of it, for example your ads on search engines and other properties.

Another more tedious (and less effective) option is to manually block IPs where you know the traffic is bot-related. One trick is to check the geographic origin of your traffic: if it usually comes from the US and Europe and you suddenly see a lot of IPs from the Philippines, it could be a bot or a click farm.
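As a sketch of what a manual IP blocklist can look like in application code, the snippet below checks a visitor’s IP against blocked ranges using Python’s standard ipaddress module; the ranges are made-up documentation addresses. In practice, rules like this usually live in your firewall, web server, or CDN rather than in the application:

```python
# Sketch of a manual IP blocklist. The blocked ranges are
# reserved documentation subnets standing in for bad sources.
import ipaddress

BLOCKED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_blocked(client_ip):
    """Return True if the client IP falls inside any blocked range."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in net for net in BLOCKED_RANGES)

print(is_blocked("203.0.113.7"))  # True
print(is_blocked("8.8.8.8"))      # False
```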

Why Is It Important To Protect Your Ads?

One of the biggest threats to your ad campaigns, and by extension to the future of your business, is bot traffic. CHEQ and the University of Baltimore economics department showed that even opportunistic bots are set to cost businesses $35 billion in 2021.

Bots can be programmed to click on your ads, leaving chaos in their wake: draining your AdWords account, causing Google to rate your ad’s performance as poor, stopping your ad from being displayed while competitors’ ads are featured prominently, and skewing conversion rates until your analytics are meaningless.

In today’s digital advertising industry, bots are both a huge help and a potential source of serious damage. Taking a proactive approach to PPC protection is the only way to ensure that your ad campaigns are safe. All ad managers should use third-party software to determine how their traffic is being affected by bot activity.

A third party expert will help you mitigate this problem by tracking and blocking the bad guys and empty clicks.

ClickCease is the industry leading click fraud prevention software, highly rated by marketing professionals and business owners. To block fraud on your PPC ads, including bot traffic, sign up for your free trial of ClickCease.


