If you ask any UK-based digital forensic analyst who provides services to the criminal justice system what the biggest challenge of the last 12 months has been, the answer is always the same: ISO17025.
Those mere three letters and five numbers have caused great consternation and sweat on the brow for many forensic teams.
Since October 2017, the quality standard that sets out the ‘general requirements for the competence of testing and calibration laboratories’ has been a mandatory requirement for all digital forensic labs whose work feeds into the criminal justice system.
But nine months or so down the line, there is still much discussion within the industry about whether the introduction of this quality standard has been beneficial.
Some see ISO17025 as an opportunity to create structure and standardisation within an industry that has previously been described by some as the ‘wild west of forensics’.
Others see it as an unnecessary burden on already overworked and under-resourced laboratories; yet more red tape taking examiners away from their examinations.
In my experience, however, most people are in favour of a quality accreditation - in principle - but have issues with the appropriateness of the standard or the way in which it is being assessed.
The bar has been raised (which is a good thing). But has it been raised to the right level, and in the right place?
Where is it going wrong?
We are now starting to see the effect of ISO17025 on laboratories up and down the country. But, as with all quality standards, we will never truly be able to quantify the benefits: the examination failures that the new standards prevented, and the miscarriages of justice those failures might have caused, will never be known.
However, the negative effects of ISO17025 are becoming evident. One we are seeing with increasing frequency is the difficulty laboratories have in adapting to changes in consumer technology and in adopting new tools that could potentially yield better results.
No forensic discipline is static; improvements in techniques, technology and interpretation are constantly happening across the board. Digital forensics, however, suffers in the sense that the subject of analysis is a constantly moving target.
DNA and fingerprints have been the same for millennia. But it is estimated that approximately 5,000 to 6,000 apps are released on each of the App Store and Google Play every day. That is 10,000 to 12,000 potential new sources of evidence every day, any of which could be crucial to an investigation. And labs need to be able to adapt and evolve to ensure they are getting the most, and the highest quality, evidence from their examinations.
This, of course, is the situation in its extreme. But there are very real examples of where laboratories have been unable to adapt to changes in consumer technology because of their quality accreditations.
Labs are on the backfoot
In September 2017, Apple began rolling out a new file system to macOS devices. This file system, named APFS, fundamentally changes the way that data is stored on Apple computers.
APFS brought with it a number of challenges as well as new opportunities for forensic examiners.
Many existing tools were unable to fully interpret the new file system, and APFS also introduced a number of new artefacts (for example, snapshots that preserve previous versions of a file) that had the potential to be immensely useful to an investigation.
Because we work with laboratories to assist with technology selection, we initiated conversations with our customers to help them adapt their technology so that APFS file systems could be examined effectively.
Some of our customers had in place technology that could cope with APFS. However, several were aware of the issue but did not have the resources to bring any additional tools into their quality accreditation, and so had no choice but to accept that they were unable to handle these types of exhibits effectively.
Deploying a new tool into a laboratory requires that the tool be validated for all of its intended purposes. This means not just testing that the tool behaves as expected, but also creating test devices containing data that is representative of real-life submitted exhibits.
And once the tool is validated, it will need to be re-validated every time it is updated, or whenever a new consumer device is released that the tool has not previously been tested against.
This rigmarole takes a great deal of time and, of course, costs money. And it means that many labs lack the resources required to adopt new technology and techniques.
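To give a flavour of what that validation testing involves, the core check can be sketched in a few lines: seed known files onto a test device, run the tool, and compare what it recovers against the known ground truth. Everything below (file names, data, function names) is illustrative only, not any lab's actual procedure or any real tool's output.

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest used as the ground-truth fingerprint for each seeded file."""
    return hashlib.sha256(data).hexdigest()

# Files seeded onto the test device before imaging (illustrative data).
seeded = {"notes.txt": b"meeting at 9", "photo.jpg": b"\xff\xd8fake-jpeg"}
ground_truth = {name: sha256(data) for name, data in seeded.items()}

def validate(extracted: dict, expected: dict) -> list:
    """Compare the tool's recovered files against ground truth; list discrepancies."""
    failures = []
    for name, want in expected.items():
        got = extracted.get(name)
        if got is None:
            failures.append(f"missing: {name}")
        elif sha256(got) != want:
            failures.append(f"hash mismatch: {name}")
    return failures

# Simulate a tool run that recovered one file correctly and missed the other.
tool_output = {"notes.txt": b"meeting at 9"}
print(validate(tool_output, ground_truth))  # -> ['missing: photo.jpg']
```

The sketch shows why the overhead compounds: every tool update, and every new device type, means building fresh seeded data sets and re-running comparisons like this across all intended uses of the tool.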
Raising standards is a good thing. But…
Don’t get me wrong. This is not to say that quality accreditations are a bad thing, and I agree that any tool used should be reliable and validated.
However, the ISO17025 framework - as it is currently being deployed and assessed - is arguably not compatible with digital forensics.
The set-up cost and overhead of validating tools, and of maintaining that validation, is significant, and resource-constrained laboratories are having to think carefully about their tool selection.
For many, it’s no longer a case of choosing the best tool for the job, but rather a tool that has been validated.
Taking this to its logical conclusion means that forensic laboratories will have standardised processes for what they do – great!
However, unless additional resources are committed to validating new tools, over time laboratories will become increasingly limited in what they can do.
So what’s the solution?
Benefit could be gained by allowing laboratories to share validation data.
As it stands right now, every lab is responsible for validating their own tools and creating the data to do the testing.
This, to me, is illogical, and a total waste of time and money.
In the UK, we trust the Medicines and Healthcare products Regulatory Agency (MHRA) to oversee the clinical trials and testing of new drugs before they are released to market. To expect every hospital using a drug to repeat the clinical trials and re-validate the results itself would be ridiculous and cost-prohibitive.
Yet this is the model we are expecting digital forensic laboratories to follow.
Allowing labs to share validation data, either directly or through a centralised body, would enable laboratories to pick their technologies based on functionality, rather than on the time and cost of adoption and maintenance.
It would raise standards while reducing the burden, and would make sure that labs still get to use the best tool for the job, even in an ever-changing world.
Rob Savage is the Chief Technology Officer with the digital forensic specialists, Avatu. He can be contacted on firstname.lastname@example.org or 01296 621121.