Web site analytics packages record what real people do on a site, but most web site traffic comes from other computers, often with nefarious intent.
Incapsula, a provider of cloud-based security for web sites, released a study today showing that 51% of web site traffic comes from automated software programs, and that the majority of it is potentially damaging: automated exploits from hackers, spies, scrapers, and spammers.
The company says that, typically, only 49% of a web site's visitors are actual humans, and that the non-human traffic is mostly invisible because analytics software doesn't show it: JavaScript-based analytics count only visitors that execute the tracking script, which most bots never do.
This means that web sites carry a large hidden cost burden in terms of bandwidth, increased risk of business disruption, and worse.
Here's a breakdown of an average web site's traffic:
- 5% is hacking tools searching for an unpatched or new vulnerability in a web site.
- 5% is scrapers.
- 2% is automated comment spammers.
- 19% is from spies collecting competitive intelligence.
- 20% is from search engines, which is non-human traffic but benign.
- 49% is from people browsing the Internet.
The data was collected from a sample of 1,000 web sites enrolled in the Incapsula service.
I spoke with Marc Gaffan, co-founder of Incapsula. "Few people realize how much of their traffic is non-human, and that much of it is potentially harmful," he said.
Incapsula offers a service aimed at securing small and medium-sized businesses. Its global network of nine data centers analyzes all traffic to a customer's site, blocking harmful exploits in real time while also speeding up page load times by caching content closer to users.
"Because we have thousands of web sites as customers, we spot exploits way ahead of others, and we can then block them for all our customers. That's the benefit of scale. We also maintain a virtual patch service that prevents harmful exploits days, and sometimes weeks, before a patch is ready."
There is no software or hardware installation required by the customer: a small change to a web site's DNS records directs traffic through Incapsula's data centers. Analytics and search engine rankings are unaffected by the change.
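For readers curious how such a DNS cutover might be sanity-checked, here is a minimal Python sketch, using only the standard library, that confirms a domain now resolves into a provider's network. The domain and address prefix are placeholders for illustration, not real Incapsula values.

```python
import socket

# Hypothetical values for illustration only; these are NOT real Incapsula addresses.
SITE = "www.example.com"        # placeholder domain
PROVIDER_PREFIX = "192.0.2."    # documentation address range, standing in for the provider's network

def resolves_to_provider(hostname: str) -> bool:
    """Return True if every address for the hostname falls in the provider's range."""
    # gethostbyname_ex returns (canonical name, alias list, IP address list)
    addresses = socket.gethostbyname_ex(hostname)[2]
    return bool(addresses) and all(addr.startswith(PROVIDER_PREFIX) for addr in addresses)

if __name__ == "__main__":
    verdict = "is" if resolves_to_provider(SITE) else "is NOT"
    print(f"{SITE} {verdict} routing through the provider's network")
```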
Web sites are significantly faster because the company caches content and keeps it close to where users are located.
An important aspect of the service is that it complies with the Payment Card Industry Data Security Standard (PCI DSS), which is essential for online merchants: they risk losing the ability to process credit card payments if they don't meet its strict requirements.
The company offers a free service for sites with less than 25 GB of monthly bandwidth, and premium plans start at $49 a month.
Foremski's Take: I'm curious to try this service. Looking at my server logs, I get hit by about 28 robots daily, and while some are from legitimate sources such as Google, Yahoo, and Microsoft, the majority are unidentified and, together, they use as much as one-third of my bandwidth.
This means the human user experience suffers, because my server spends its resources dealing with all the non-human traffic these software programs generate.
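As a rough illustration of how one might put numbers on this from server logs, here is a small Python sketch that tallies bytes served per user-agent class in a combined-format access log. The file name, crawler names, and bot heuristics are assumptions for illustration; real bot classification is considerably messier.

```python
import re
from collections import Counter

# Matches the tail of a combined-log-format line: ... "request" status bytes "referer" "user-agent"
LOG_PATTERN = re.compile(r'.*" \d{3} (?P<bytes>\d+|-) "[^"]*" "(?P<agent>[^"]*)"$')

KNOWN_CRAWLERS = ("Googlebot", "bingbot", "Slurp")       # Slurp is Yahoo's crawler
BOT_HINTS = ("bot", "crawler", "spider", "scraper")      # crude heuristic, illustrative only

def classify(agent: str) -> str:
    """Sort a user-agent string into one of three coarse buckets."""
    lowered = agent.lower()
    if any(name.lower() in lowered for name in KNOWN_CRAWLERS):
        return "known crawler"
    if any(hint in lowered for hint in BOT_HINTS):
        return "unidentified bot"
    return "likely human"

def tally(path: str) -> Counter:
    """Sum response bytes per user-agent class across the log file."""
    bytes_by_class = Counter()
    with open(path) as log:
        for line in log:
            match = LOG_PATTERN.match(line.rstrip())
            if not match:
                continue  # skip lines that are not in combined log format
            sent = match.group("bytes")
            bytes_by_class[classify(match.group("agent"))] += 0 if sent == "-" else int(sent)
    return bytes_by_class

if __name__ == "__main__":
    for label, total in tally("access.log").most_common():  # assumed log file name
        print(f"{label}: {total / 1e6:.1f} MB")
```

Comparing the "unidentified bot" total against the overall sum is one quick way to see whether bots really are eating a third of a site's bandwidth.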
Incapsula's ability to block exploits before a patch is available is another attractive feature. I don't have time to keep up with the many security patches sent out, and installing and upgrading multiple programs is a chore I'd rather do without.
More info: Incapsula In The News