
The black art of digital forensics

Forensics experts can no longer trust the PC system clock, much less present its breadcrumb-like trail in court as irrefutable evidence.
Peter Sommer, London School of Economics
Peter Wood, First Base Technologies

Digital data forensics – as a science – has been around for more than 25 years, with its first high-profile user being Dr Alan Solomon, who founded S&S International in 1983 to produce software for early MS-DOS-based PCs.

After founding S&S, Alan went on to develop his company’s expertise in data recovery and, as history tells us, PC viruses and defences against them.

By the late 1980s, Alan’s expertise had moved into digital forensics and, as a world renowned expert, he appeared as an expert witness in many legal cases of the day.

One of Doc Solly’s (as Alan Solomon later became known) mainstay digital forensic building blocks was the PC system clock, whose timestamp has become the central argument in many civil and criminal litigation cases over the years.

PC system clocks, however, are not a reliable source of forensic data. In a 2007 study, Florian Buchholz and Brett Tjaden – two professors at James Madison University in Virginia – found that more than a quarter of the web servers on the internet had clocks that were incorrect by more than 10 seconds.

As a result, digital forensics experts can no longer trust the PC system clock, much less present its breadcrumb-like trail in court as irrefutable evidence.

Timestamps do, however, present experts with a body of circumstantial evidence, and that – in many ways – is what the art of forensics is all about: the marriage, analysis and eventual interpretation of a group of data sets gleaned from one or more computer systems at one or more moments in time.

Whilst Dr Solly leaned heavily on the PC timestamp in the early IT court cases of the 1980s, it is now recognised in digital forensics circles that a system timestamp can be forged.

Log files, on the other hand, offer us a lot more information. Each time a file is modified, accessed or has its metadata changed, modern computer systems will update the file’s so-called ‘MAC times’.

Popular digital forensic tools such as EnCase, FTK and Sleuth Kit have the ability to read all of the MAC times within a computer system and sort them to create a single timeline.

Thanks to this data, investigators can use these timelines to work out which files an unauthorised intruder has browsed or modified.
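The idea behind such a timeline can be sketched in a few lines of Python. This is a minimal illustration of the concept, not how EnCase, FTK or Sleuth Kit work internally: it walks a directory, collects each file's three MAC timestamps from `os.stat`, and merges them into one chronologically sorted list.

```python
import datetime
import os
import pathlib
import tempfile

def mac_timeline(root):
    """Collect the modified/accessed/metadata-changed (MAC) times of
    every file under `root` and merge them into one sorted timeline."""
    events = []
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # unreadable entry: skip rather than abort
            for kind, ts in (("modified", st.st_mtime),
                             ("accessed", st.st_atime),
                             ("metadata", st.st_ctime)):
                events.append(
                    (datetime.datetime.fromtimestamp(ts), kind, path))
    events.sort()  # oldest event first, as in a forensic timeline view
    return events

# Demonstrate on a throwaway directory so the example is self-contained.
with tempfile.TemporaryDirectory() as tmp:
    pathlib.Path(tmp, "evidence.txt").write_text("hello")
    for when, kind, path in mac_timeline(tmp):
        print(when.isoformat(), kind, path)
```

Note that every timestamp here is only as trustworthy as the clock that wrote it, which is precisely the problem the article describes.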

It’s worth noting that, because PC system clocks are so often set incorrectly, most digital forensic tools allow the investigator to apply a time offset (a ‘delta’) when analysing the data. The bad news? The delta value is rarely constant.

In their six-month analysis of more than 8000 web servers, Buchholz and Tjaden found that systems with the wrong time frequently drifted or jumped around in unpredictable ways.

Some systems, they noted, would get steadily slower or faster, and then jump back to the correct time.

Other systems were solid in the rate that time passed, but they were off-beam from the correct time by minutes, hours, days or even years.
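A hypothetical sketch of the delta correction makes both cases concrete. The first function handles the simple case of a fixed offset; the second makes the simplifying assumption that the clock drifted linearly between two reference points where the true delta was measured – which, as Buchholz and Tjaden's findings show, is itself only an approximation.

```python
import datetime

def apply_delta(recorded, delta_seconds):
    """Correct a timestamp by a fixed clock offset:
    true time = recorded time - delta (positive delta = clock ran fast)."""
    return recorded - datetime.timedelta(seconds=delta_seconds)

def apply_drifting_delta(recorded, t0, delta0, t1, delta1):
    """Linearly interpolate the offset between two reference points at
    which the true delta was measured -- a simplifying assumption, since
    real clocks can also jump around unpredictably."""
    frac = (recorded - t0) / (t1 - t0)
    delta = delta0 + frac * (delta1 - delta0)
    return recorded - datetime.timedelta(seconds=delta)

logged = datetime.datetime(2009, 6, 1, 12, 0, 0)
print(apply_delta(logged, 90))  # clock observed 90 seconds fast

t0 = datetime.datetime(2009, 6, 1, 0, 0, 0)
t1 = datetime.datetime(2009, 6, 2, 0, 0, 0)
# Clock drifted from 60 s fast to 120 s fast over one day.
print(apply_drifting_delta(logged, t0, 60, t1, 120))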

Some systems also followed the wrong rules for summer and winter time changes (e.g. daylight saving time, British Summer Time) and some servers returned a different ‘wrong time’ each time they were polled.
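A short sketch shows why the wrong summer-time rules matter: the same wall-clock time corresponds to different UTC instants in winter and summer, so a machine applying the wrong rules stamps events a full hour off. This uses Python's standard `zoneinfo` module (3.9+), purely as an illustration of the offsets involved.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

london = ZoneInfo("Europe/London")

winter = datetime(2009, 1, 15, 12, 0, tzinfo=london)  # GMT
summer = datetime(2009, 7, 15, 12, 0, tzinfo=london)  # BST

print(winter.utcoffset())  # 0:00:00
print(summer.utcoffset())  # 1:00:00
```

An investigator reconciling file times against an external reference therefore has to know not just the clock's delta, but which time-zone and DST rules the machine believed it was following.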

The jigsaw

Good though long-standing digital forensic tools like EnCase and FTK are, there are some new kids on the block, including ForensicSoft’s SAFE and Evidence Talks’ SPEKTOR.

Released in May of this year, SAFE stands for System Acquisition Forensics Environment and is a new Windows-based digital forensic platform specifically designed to support the expanding needs of computer forensic, computer security, and litigation support professionals.

US-based ForensicSoft claims that its digital forensic software allows investigators to acquire, preview and analyse digital evidence to such a degree that it can be presented in a court of law.

Unlike conventional digital forensic boot disks that use basic protection techniques – such as mounting drives as read-only – ForensicSoft says its SAFE platform uses the firm’s SAFE Block technology to block all disks at the physical level.

This, the company says, allows a forensically sound preview, exploration and capture of the digital evidence.

SAFE is unusual in that it is not designed to usurp professional digital forensic applications such as EnCase and FTK; instead, it allows users to run those investigative packages secure in the knowledge that the underlying operating system (Windows PE) is locked down.

Closer to home, Milton Keynes-based Evidence Talks has just released SPEKTOR, which it describes as a digital forensic triage utility for the police and law enforcement communities.

Rather than run the utility on a laptop, SPEKTOR is a self-contained unit that uses embedded firmware for security. It’s billed as generating digital forensic evidence capable of being produced in court, without extensive IT training on the part of the investigative officer.

A touch screen on the SPEKTOR control pod allows an operator to forensically wipe, verify and configure reusable SPEKTOR Collectors.

Once configured – a process the firm says takes just a few seconds – the SPEKTOR Collector can acquire digital forensic data from target PCs, Apple Macs, removable USB, Firewire and memory card devices in just a few minutes.

According to Andrew Sheldon, Evidence Talks’ managing director, “any data that is collected by the SPEKTOR system is protected from unauthorised users”, with relevant data and reports stored in separate protected areas on the Collector device.

From Windows-based PCs, SPEKTOR automatically extracts forensically useful data from the registry, including comprehensive profile settings, details of previously attached USB devices, recent file activity, network settings, installed software and online storage details.

Interestingly, SPEKTOR also includes a ‘remote forensics’ facility that allows users to seek assistance from remote colleagues via a secure, audited network connection which can run across a 3G mobile network if required.

Limitations

Collating the data using applications like those mentioned above is only part of the skills of an IT forensics specialist.

According to Professor Peter Sommer, a forensic and IT security specialist, most forensic utilities do a great deal, but only within a fixed set of preset tasks.

Professor Sommer, who is a visiting professor in the Information Systems Integrity Group in the Department of Management at the London School of Economics, says that digital forensics utilities can be used by almost anyone with a minimum of training.

In these situations, he says, there is a danger of a user simply being a GUI (graphical user interface) jockey, but not someone who can interpret the digital forensics data that is actually collated.

“You may not be aware of how the data is actually achieved”, he says.

The problem with this is that a well-briefed barrister, or similar legal professional, can pick apart – in front of a jury – an argument built on data collected by a digital forensic application, with disastrous effects.

Because of this, Professor Sommer argues that investigating officers using digital forensic applications must understand how their software operates and, if required, explain how it operates to a civil or criminal court.

The problem of interpreting the digital forensics data, he says, “is getting worse as the cost of hard drives is steadily falling” and, as a result, average hard drive sizes are getting larger.


So what about informed peers? Is that the solution to beating the inexperienced operator allegation in court?

Not really, says the professor. Whilst someone on a large police investigation team in London can turn to senior peers, outside the capital the investigatory teams are rarely large enough for newer officers to get peer-based training from their senior colleagues.

Audit trail

Peter Wood, partner/chief of operations with penetration specialist First Base Technologies, is less of a proponent of painstakingly recording all relevant data within a civil forensics/security investigation.

“Five years ago, it was all about making meticulous records and including that data in your report to the companies that had hired you. Now they mainly want to know what’s wrong and have some guidance on how to fix the loopholes”, he says.

Obviously, says Wood, who is also an ISACA conference committee member, you need to create an audit trail in your investigation, but the penetration testing/investigation industry has changed markedly in the last five years. Many companies are now less interested in the fact they have an IT security problem, and more concerned with how to fix it.

Along the way, he says, if an investigator has taken meticulous notes, “that’s a nice-to-have option, rather than a must-have”, which is something that most companies are embracing these days.

On the civilian penetration investigation front, he tells Infosecurity, most companies are looking for a headline summary of their vulnerabilities, rather than painstakingly accurate investigator reports.

“Having said that, if the company has poor levels of security in its initial penetration test analysis, then we would look at the main problems and then come back later to talk about their options.”

The only exception to this strategy by corporate professionals, says Wood, is where PCI DSS (Payment Card Industry Data Security Standard) requirements mandate that all problems – no matter how small they are – must be fixed following the IT forensics and penetration testers’ first sweep of a system.

Otherwise, he says, it’s mainly about best practice, more than anything.

The legal perspective

According to Alistair Kelman, a barrister and legal counsel with more than two decades of experience in IT legal battles – both criminal and civil – the science of data forensics remains very much a black art, owing to the lack of information coming from companies like Microsoft.

Microsoft Vista, he says, is a classic example of this. The operating system has been a runaway success for users, but prying information from Microsoft’s developers on the software front, he says, is almost impossible.

As a result, he says, digital forensics investigators have had to put their deerstalker hats on and investigate the operating system to a very granular level.

There are, he adds, two sides to this issue. On the one hand, investigators have a lot of tools at their fingertips and, in most cases, can help a company’s staff get to grips with the scale of their problem.

As a result, he tells Infosecurity, whilst many digital forensics specialists say they need to record everything meticulously in an investigation, there is often very little real need for such granular data.

A good digital forensics investigator, he argues, adopts an IPSEC approach to their investigations.

“Firstly you need to Identify the data you are looking into, then you Preserve that data, as well as Selecting the data you want to Examine at a later stage”, he says. “Finally, once you’ve examined the data for privilege and relevance as part of your investigation, you need to Classify the data as the final stage in your research.”

After the IPSEC steps, he claims, it’s usually a simple matter to classify the data collated and then analyse it fully.
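Kelman does not spell out how the Preserve step is performed, but one common technique in practice is to take a cryptographic fingerprint of the evidence at acquisition time, so any later tampering is detectable. A minimal sketch, assuming SHA-256 as the hash of choice:

```python
import hashlib
import pathlib
import tempfile

def fingerprint(path, algo="sha256", chunk_size=1 << 20):
    """Hash a file in fixed-size chunks so even large evidence images
    never have to fit in memory at once."""
    digest = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

# Demonstrate on a throwaway file standing in for an evidence image.
with tempfile.TemporaryDirectory() as tmp:
    evidence = pathlib.Path(tmp, "image.bin")
    evidence.write_bytes(b"abc")
    print(fingerprint(evidence))
```

Recomputing the digest at trial and matching it against the value recorded at seizure is what lets an examiner assert that the Selected and Examined data is still the data that was Preserved.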

Good digital forensics, he says, is not rocket science, but it does take a lot of thought to be able to complete an investigation and research all the relevant angles thoroughly.

“The bottom line with good IT forensic analysis is that you need to think about what data you have and how you can use it to your best advantage”, he says.

“Some data may be irrelevant, some data may be repetitious, and some might be more relevant than you might first think.”
