Tuesday, May 20, 2014

Are Computer Bots Getting Too Smart?

Undetectable robot invaders. Hackers are not quite there yet, but they are getting closer. This article addresses the current state of bot technology: how bots help make the internet more efficient, and how they are also used for devious purposes. Do you know what bots you’re dealing with?



Bot Proliferation



Generally speaking, internet bots are not created to cause harm. They travel across cyberspace performing repetitive tasks at speeds far greater than those of any human user. Many of these benevolent bots are search engine crawlers, tasked with logging content from countless web pages. These good bots account for about 32% of total web activity.
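To make this concrete, here is a minimal sketch (in Python) of how a site might recognize well-known crawlers from the User-Agent header. The crawler names listed are illustrative examples, not a complete list, and user agents can be spoofed, so real-world verification also relies on stronger checks such as reverse-DNS lookups.

# Minimal sketch: flagging well-known crawler user agents.
# The substrings below are illustrative, not exhaustive, and user agents
# can be spoofed, so production checks add reverse-DNS verification.
KNOWN_CRAWLER_SUBSTRINGS = ["Googlebot", "Bingbot", "Baiduspider", "YandexBot"]

def looks_like_known_crawler(user_agent):
    """Return True if the User-Agent header mentions a well-known crawler."""
    return any(token in user_agent for token in KNOWN_CRAWLER_SUBSTRINGS)

print(looks_like_known_crawler("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
print(looks_like_known_crawler("Mozilla/5.0 (Windows NT 10.0; Win64)"))     # False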


Because these bots gather information, they require some basic access to a website’s server in order to do their job. Unfortunately, hackers have homed in on this process of a website letting down its defenses and use it to gain entry for their own malicious bots. Once a bot reaches the application layer (Layer 7) of the network stack, it can take part in a DDoS, or Distributed Denial of Service, attack with the intention of taking down the victim’s server.
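A common first line of defense against this kind of application-layer flood is per-client rate limiting. The sketch below (Python) illustrates the idea with a simple sliding window; the window length and request budget are assumed values for illustration, not recommended settings.

import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # sliding window length (assumed value)
MAX_REQUESTS = 50     # requests allowed per client per window (assumed value)

_history = defaultdict(deque)  # client IP -> timestamps of recent requests

def allow_request(client_ip, now=None):
    """Return True if this request fits within the client's rate budget."""
    now = time.time() if now is None else now
    recent = _history[client_ip]
    # Drop timestamps that have fallen out of the sliding window.
    while recent and now - recent[0] > WINDOW_SECONDS:
        recent.popleft()
    if len(recent) >= MAX_REQUESTS:
        return False  # looks like a flood; reject or challenge the client
    recent.append(now)
    return True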



DDoS Zombie Bots



In order to wield the kind of firepower needed to take down a commercial website, hackers gather herds of bots called “botnets”. The ‘bots’ in this case are computers infected with malware or Trojans, or headless (user-less) browsers. Once infected, a computer’s processing power is put to malicious use. The hacker (sometimes called a “shepherd”) can send these bots to attack a single server with the click of a button. Most serious botnets have tens of thousands of ‘zombie’ bots at their disposal.


Zombie Bot Evolution



Some types of DDoS bots and other harmful bots are easily filtered out by firewalls or simple hardware appliances, but recent research shows that these tools are fast becoming obsolete. The pace of hacker innovation is causing concern across the cyber security landscape.



Online security service provider Incapsula recently released a DDoS Threat Landscape report covering web activity over the last 15 months that lends insight into the current state of website vulnerability. Incapsula has identified bots that are now capable of accepting cookies and executing JavaScript, two behaviors commonly used to filter bots from human visitors.
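For readers unfamiliar with those filtering methods, here is a rough sketch (Python) of a cookie challenge of the kind such filters rely on. The token scheme and names are assumptions for illustration, not Incapsula’s actual mechanism: the server issues a token that a real browser stores and echoes back, while a primitive bot never returns it.

import hmac, hashlib, secrets

SECRET_KEY = secrets.token_bytes(32)  # per-deployment secret (illustrative)

def issue_challenge_token(client_ip):
    """Token the challenge page asks the client to store in a cookie."""
    return hmac.new(SECRET_KEY, client_ip.encode(), hashlib.sha256).hexdigest()

def passes_cookie_challenge(client_ip, cookie_value):
    """True only if the client echoed back the exact token it was issued."""
    if not cookie_value:
        return False  # primitive bots typically never send the cookie back
    expected = issue_challenge_token(client_ip)
    return hmac.compare_digest(cookie_value, expected)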


According to the report, close to 30% of the bots encountered accepted cookies, and 0.8% executed JavaScript. Although these numbers are still small compared to the remaining “primitive” bots scouring the web, the trend paints a grim picture of what’s to come. Once a new type of malicious bot technology becomes available, it is often adopted by the rest of the hacking community at alarming speed.



Defense against ‘Smart’ Bots



In the face of these advancements in bot technology, the cyber defense industry is fighting to stay ahead of the curve. Online security services must answer a difficult question: how to distinguish between bots and human users without deploying CAPTCHAs or other invasive measures that tend to hinder the user experience.
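One transparent approach is to score each session on signals the server can observe without interrupting the visitor. The sketch below (Python) combines a few such signals; the signals, weights, and threshold are assumptions for illustration, not any vendor’s actual algorithm.

def bot_suspicion_score(requests_per_minute, ran_javascript,
                        accepted_cookies, sent_common_headers):
    """Higher score means more bot-like; range is roughly 0.0 to 1.0."""
    score = 0.0
    if requests_per_minute > 60:   # far faster than a human browses
        score += 0.4
    if not ran_javascript:         # failed a JavaScript challenge
        score += 0.3
    if not accepted_cookies:       # failed a cookie challenge
        score += 0.2
    if not sent_common_headers:    # e.g. missing Accept-Language
        score += 0.1
    return score

# A session failing every check scores 1.0 and could be blocked or sent a
# stronger challenge; a normal browser session scores close to 0.0.
print(bot_suspicion_score(120, ran_javascript=False,
                          accepted_cookies=False, sent_common_headers=False))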


