Hack Me: A Geopolitical Analysis of the Government Use of Surveillance Software


In summer 2015, a South Korean intelligence officer identified only as Lim was found alone on a mountain road, slumped over in his car. Beside his body were a piece of burnt coal that had emitted a fatal dose of carbon monoxide and a three-page suicide note. Lim had reportedly succumbed to the pressure surrounding his work implementing controversial tracking software within the South Korean National Intelligence Service (NIS). The note suggests that as Lim lay in the car, one Italian company was clearly on his mind: ‘Hacking Team.’

A few days earlier, in a major twist of irony, Milan-based Hacking Team was itself hacked. More than 400GB of internal data was extracted by unknown perpetrators, who then gleefully used the organization’s own Twitter account to break the news. For those outside the industry, Hacking Team—made up of a small group of sophisticated hackers and programmers—exists to develop customized malware that gathers intelligence against desired targets. They market their services to entities around the world, who then deploy the software against their own adversaries.

Security researchers and journalists alike have been combing the data to identify the technical methodologies adopted by this secretive organization. Zero-day vulnerabilities found in the dump were quickly integrated into well-known exploit kits. Microsoft was forced to deploy an out-of-band update to address a memory-corruption flaw in the Adobe Type Manager font driver that ships with Windows, and Mozilla temporarily blocked Adobe Flash in Firefox until patches could be released.

The bigger picture is even more worrisome. The organization’s customers allegedly include nations such as the US and Italy, but there are also reports of sales to governments with questionable human rights records or histories of political oppression.

Lim’s suicide note points to this murkiness. Addressing the NIS agency director, the vice director and the bureau chief, he wrote: “My excessive ambition at work appears to have caused today’s situation.” Lim was reportedly responsible for purchasing and implementing Hacking Team’s Remote Control System surveillance software for use against the country’s North Korean neighbor. Official accounts indicate that the note maintains the technology was never used against domestic targets; however, the NIS is currently the subject of what South Korean officials call a ‘field investigation’ into allegations that similar technology was used to spy on the public ahead of the 2012 presidential election.

Players in the Game

In 2008, while financial markets were imploding, companies within the surveillance industry were seeing their revenues soar. Hacking Team and Gamma International were still developing their spy-tool infrastructure, but Florida-based defense contractor Harris Corp. was already marketing its second-generation IMSI-catcher known as the StingRay II to U.S. law enforcement and federal agencies.

IMSI-catchers are often devices roughly the size of a small suitcase that mimic legitimate cellphone relay towers. They are designed to overpower nearby legitimate cell towers, forcing cellphones in the area to connect and relay information that can be used to locate a subject of interest. IMSI-catchers can be deployed to home in on suspects known to reside in a small area, or grouped into clusters to cover larger areas, say an entire city, and monitor the communications of an expanded pool. The nature of the design dictates that all communications within range of the device are captured and analyzed: everyone falls within the net, not just the designated targets.
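To see that dragnet effect in data terms, the sketch below filters a bulk capture of subscriber identifiers down to a target list only after the fact. It is a minimal illustration; the record fields, identifier values and filtering step are hypothetical and not drawn from any vendor's documentation.

```python
# Minimal sketch of the dragnet problem: an IMSI-catcher-style collection
# records every handset that attaches to the fake cell, and any filtering to
# designated targets happens only after collection. All values are hypothetical.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class CaptureRecord:
    imsi: str         # subscriber identity presented by the handset
    timestamp: float  # when the handset attached to the fake cell
    signal_dbm: int   # rough proxy for distance from the device


def filter_to_targets(capture: list[CaptureRecord],
                      target_imsis: set[str]) -> list[CaptureRecord]:
    """Keep only records belonging to designated targets. Note the asymmetry:
    `capture` already contains targets and bystanders alike."""
    return [rec for rec in capture if rec.imsi in target_imsis]


if __name__ == "__main__":
    capture = [
        CaptureRecord("310150123456789", 1438387200.0, -71),  # bystander
        CaptureRecord("310150987654321", 1438387201.5, -64),  # person of interest
        CaptureRecord("310150555000111", 1438387203.2, -80),  # bystander
    ]
    hits = filter_to_targets(capture, {"310150987654321"})
    print(f"captured {len(capture)} handsets, {len(hits)} matched the target list")
```

The point of the sketch is simply that the selection step comes after everyone in range has already been recorded.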

Public awareness of the Harris StingRay II product emerged in 2011 when reports first surfaced of an ‘off the grid hacker’ named Daniel Rigmaiden. In 2008, having been charged with tax fraud, Rigmaiden walked into a lawyer's office complaining of ‘government rays being sent into his living room.’ No one was willing to hear his case, so Rigmaiden decided to defend himself. He spent years gathering formal transcripts and documents from local meetings to glean possible insights into the technology responsible for putting him behind bars.

After reportedly analyzing over 15,000 documents pertaining to various cell technologies and devices, Rigmaiden finally discovered references to new ‘investigative techniques’ that—through the use of FOIA requests and the assistance of the ACLU and EFF—helped shed light on the elusive technology that was being marketed to federal and state agencies. Rigmaiden eventually lost his Fourth Amendment challenge to the use of the StingRay device, but the public debate regarding the use of such technologies had just started.

Sting in the Tale

To prevent discovery of such products’ capabilities, their developers can use the power of contract law and non-disclosure agreements (NDAs) to maintain strict confidentiality with their clients. These restrictions are often so stringent that they mandate withholding information from official documents and even from other government officials. And, of course, they prevent federal and state officials from disclosing any information about, or even acknowledging the existence of, the StingRay system.

Here’s how this plays out in the real world. In one Florida case, a man facing a near-certain four years in prison for armed robbery saw his sentence reduced to six months’ probation through a plea bargain, struck after prosecutors were ordered to disclose information about the use of the StingRay device to defense attorneys. In another case, St. Louis prosecutors dropped a total of 14 charges against four men accused of first-degree robbery and other crimes after the officer involved was scheduled to give a deposition about the techniques used during the investigation.

There’s no end to the irony here. The NDA requirements attached to the very products marketed to protect citizens from crime essentially allowed accused criminals to go free, because the contracts prevented disclosure of information about the products used to gather the evidence. Despite public discourse and loud objections from the likes of the ACLU and EFF, law enforcement agencies are not alone in using surveillance tools to monitor large swaths of the population. Many governments are doing the same thing.

Tortured Justification

On July 14, Hacking Team CEO David Vincenzetti released a statement on the company website detailing the attack on its systems while attempting to sway public opinion regarding the exposed tactics. First he promoted the “comprehensive,” “easy to use” and “powerful” surveillance capabilities of the company’s product line, then sought to justify its actions by pointing out that the company sells its products only to approved government entities. Keep in mind, the information released in the Hacking Team data dump shows that countries are spending millions to thwart attacks against their own infrastructure while using the very same tools against their adversaries. Vincenzetti went on to list some former clients: Russia, Ethiopia and Sudan.

Several days later, Chief Marketing and Communications Officer Eric Rabe released a follow-up statement in which he claimed that “there is only one violation of law in this entire episode, and that one is the criminal attack on Hacking Team.”

Coincidentally, the day after Hacking Team released its initial statement about the hack of its internal systems, the FBI announced the arrest of Morgan Culbertson, a 20-year-old FireEye intern and mobile malware researcher, and the dismantling of the Android-based Dendroid malware toolkit, which he is accused of developing and selling on the recently shuttered Darkode marketplace. While full details have yet to emerge, initial reports indicate that Culbertson only developed and sold the malware toolkit to interested parties, and may have had no plans to use it himself. Basically, he wrote a piece of malware and sold it to those who wanted it, just like Hacking Team.

The Dendroid malware is an Android-based package that allows its operators to gather personal information from infected devices: visited websites, keystrokes, passwords and more. Programs of this kind, known as Remote Access Trojans (RATs), have been around for years; familiar names include Poison Ivy, Back Orifice and Sub7. Most RATs are similar in that they harvest usernames and passwords through keystroke logging and include mechanisms that help them evade detection by anti-virus programs.
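As a purely defensive illustration of the other side of that cat-and-mouse game, the minimal sketch below matches a host's outbound connections against a list of indicators. It is not any vendor's actual detection logic; the command-and-control addresses are placeholders from documentation IP ranges, and the psutil-based sweep is simply one way a defender might look for a known RAT beaconing home.

```python
# Minimal defensive sketch: sweep a host's outbound TCP connections against
# a list of hypothetical RAT command-and-control (C2) indicators.
# The indicator values below are placeholders, not real threat intelligence.
import psutil  # third-party: pip install psutil

# Hypothetical (IP, port) indicators for known RAT families.
KNOWN_C2_ENDPOINTS = {
    ("203.0.113.10", 4444),   # documentation-range address, illustrative only
    ("198.51.100.77", 8080),  # documentation-range address, illustrative only
}


def find_suspicious_connections():
    """Return (pid, process name, remote endpoint) for any connection that
    matches the indicator list. May require elevated privileges on some OSes."""
    hits = []
    for conn in psutil.net_connections(kind="tcp"):
        if conn.raddr and (conn.raddr.ip, conn.raddr.port) in KNOWN_C2_ENDPOINTS:
            try:
                name = psutil.Process(conn.pid).name() if conn.pid else "unknown"
            except psutil.Error:
                name = "unknown"
            hits.append((conn.pid, name, f"{conn.raddr.ip}:{conn.raddr.port}"))
    return hits


if __name__ == "__main__":
    for pid, name, remote in find_suspicious_connections():
        print(f"possible RAT beacon: pid={pid} ({name}) -> {remote}")
```

In practice, of course, this is exactly the kind of signature-style check that RAT authors continually update their tooling to slip past.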

Future Ethical Concerns

Given the nature of international terrorism, some can justifiably see these tools as necessary weapons of war, but that's a gross oversimplification. Unlike, say, surgical strikes with guided missiles, unpatched exploits can affect all computer users regardless of country, origin or intent. If zero-day exploits are to be compared to weapons used by the armed forces, then they need similar restraints and definitions, perhaps something closer to the controls placed on weapons of mass destruction.

That brings us back to the suicide of the South Korean official, which prompted this investigation. His was surely not the only tragedy; there have likely been many abuses enabled by the use of these technologies by oppressive governments. That's why it's time to draw up a new roadmap. These digital tools are by nature secretive, and so are many of the institutions using them. But does the security industry have a duty to consider the customer base for these tools?

So why is Hacking Team, which develops and sells malware to oppressive nations, any better than an individual who develops and sells malware to unsavory online characters? If anything, the purchase, sale and use of zero-day exploits poses far greater dangers than an individual marketing an Android-based malware toolkit. This is the core of the ethical concern plaguing this field: activities illegal in one country may be legal in another, and most digital advances don't respect geographic boundaries anyway.

On July 21, just two weeks after the Hacking Team breach, a mysterious post appeared on the /r/hacking subreddit claiming to be interested in starting a new company with an agenda similar to Hacking Team's. The post, which was active for less than 24 hours, stated: “Creating a hacking team. Must have at least a basic understanding. Add Leo Da Vinci on line, with an android phone if interested.”

Obligations

The battle is raging. Organizations that develop and market government-only surveillance tools will continue to grow and thrive, and so will those attempting to expose such groups' secrets, along with those who want to exploit the technology for nefarious purposes.

The use of zero-day exploits by commercial entities for financial gain is also a dangerous practice that potentially jeopardizes everyone. If companies that use exploits to compromise targets of national interest come to learn that other, less savory individuals are using the same exploits for illicit purposes, do they have an obligation to disclose the problem?

Maybe the next big data dump will give us the answer.
