The buzz surrounding the Casey Anthony trial has resurfaced (yet again), amid media reports that detectives overlooked Google searches on a computer primarily used by Casey Anthony. You can conduct your own internet search to read the various articles and stories published online about this recent development. At the conclusion of the trial in the summer of 2011, one of the detectives assigned to the case gave an exclusive interview to Digital Forensic Source regarding the trial and digital forensics; you can read the DFSource posts here.
Observations
(Disclaimer: Any information and/or opinions contained in this blog post should NOT be considered legal advice. As always, consult with an attorney at law for legal advice.)
Hindsight is always 20/20, and it is easy to sit back and play Monday morning quarterback. The defense expert and fellow digital forensic examiner Larry Daniel of Guardian Digital Forensics stated that he did locate this evidence and that it was brought to the defense team's attention prior to the trial. Regardless of why investigators with the OCSO didn't find this digital evidence before trial, the fact remains that our criminal justice system is an adversarial system. Generally, inculpatory evidence, such as these searches, is not required to be brought to the State's attention by the Defense, because the burden falls upon the State to prove its case beyond a reasonable doubt. If the State had located exculpatory evidence (i.e., Google searches that would tend to prove the defendant's innocence), it would have been obligated to notify the Defense of that evidence. However, this can be open to interpretation and is not always black and white. Rules of evidence vary by state.
Lessons Learned
And with great power comes great responsibility. As a lethal forensicator, are you asking yourself what lessons can be learned from this case? There are many takeaways you can apply to your DFIR workflow. Were you thorough and complete in your forensic methodology? Did a peer review your findings? Are you focused on being an objective digital fact finder in every forensic exam? Did you answer the original question posed before initiating the forensic examination? Do you regularly reference an internal or external workflow chart, or something similar to the SANS DFIR poster [artifact analysis], to ensure your forensic examination is focused and targets the necessary digital artifacts?

The best and most powerful DFIR tool is the one that resides between one's ears. Forensic examinations are only as complete and thorough as the digital forensic examiner. We have all had the light bulb moment, or reached a roadblock where we make a quick phone call, send an e-mail to one of the DFIR lists, or post a tweet to our DFIR followers; moments later, we get our answer and think to ourselves, "why didn't I think of that?" We have all executed a knockout in a forensic exam, called upon a fellow forensic examiner for peer review, and then watched him or her find a few more artifacts we missed, leaving us unable to understand how we missed them. We all have the same or similar methodologies, yet the lens or instrument we use in conducting our exams varies, which is important to keep in mind. I am confident that the detectives and examiners in this case, with all the digital evidence media involved, looked the evidence and analyses over several times. If this was the only machine where an examiner forgot to complete Firefox browser analysis, then that one missed opportunity is all anyone remembers.
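To make the Firefox point concrete: Firefox keeps browsing history in an SQLite database (places.sqlite, in the moz_places table), so the kind of search-history review at issue here is ultimately just a query against a working copy of that file. Below is a minimal sketch, not any case's actual evidence: the rows are fabricated for illustration, and the keyword_hits helper is hypothetical, but the moz_places column names follow Firefox's real schema.

```python
import sqlite3

def keyword_hits(conn, keyword):
    """Return (url, title) rows whose URL or title contains keyword."""
    like = f"%{keyword}%"
    cur = conn.execute(
        "SELECT url, title FROM moz_places WHERE url LIKE ? OR title LIKE ?",
        (like, like))
    return cur.fetchall()

# Fabricated in-memory stand-in for a places.sqlite copy -- a real exam
# would query a working copy of the evidence file, never the original.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE moz_places (
    id INTEGER PRIMARY KEY, url TEXT, title TEXT,
    visit_count INTEGER, last_visit_date INTEGER)""")
conn.executemany(
    "INSERT INTO moz_places (url, title, visit_count, last_visit_date) "
    "VALUES (?, ?, ?, ?)",
    [("https://www.google.com/search?q=sample+term",
      "sample term - Google Search", 1, 0),
     ("https://www.mozilla.org/", "Mozilla", 3, 0)])

for url, title in keyword_hits(conn, "search?q="):
    print(url, "|", title)
```

The point is less the code than the checklist item it represents: if a browser is installed on the machine, its history database belongs in the exam workflow.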
And now a football analogy to put things in perspective: nobody remembers how many points a team puts up on the scoreboard or how badly it defeats its opponent. People remember the score at the end of the game. People also remember the game-winning field goal that a kicker misses, costing his team the game. Everything else that happens during a game is forgettable for the most part, but each play and each possession contributes to that final score and to the W or L added to your team's record.
A Rubik's Approach
In digital forensics, the dozens of other artifacts the examiner found may be relevant, but the one artifact the examiner failed to obtain during the forensic examination becomes what is relevant and what people (a jury) remember. So the question is: what are you doing to ensure your forensic findings have been validated? Just as there are many ways to solve a Rubik's Cube, there are different ways to conduct a forensic examination and arrive at the same conclusive results (or at least there should be). As forensic examiners, utilize an array of forensic tools to validate your findings. It doesn't matter whether you prefer Windows-based or Linux-based tools, commercial or FOSS (Free and Open Source Software) tools; our results should be precise and reach the same conclusion. As we follow the scientific method, are we stopping to think and try a different approach? For example, suppose the question I am trying to answer is: did the suspect plug this Acme Corporation removable USB device into this machine? After conducting my analysis and concluding that the suspect did plug the Acme Corp. removable USB device into this machine, am I reversing my hypothesis? Ask yourself: what artifacts would support the hypothesis that the suspect did NOT plug this Acme Corporation removable USB device into this machine?
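The cross-tool validation idea above can be sketched very simply. This is an illustrative sketch, not any particular tool's API: the diff_findings helper and the artifact labels are hypothetical. The idea is that when two independent tools examine the same evidence, anything only one of them reports gets flagged for manual re-examination before it ever reaches a report.

```python
def diff_findings(tool_a, tool_b):
    """Compare artifact sets recovered by two independent tools.

    Returns (agreed, only_in_a, only_in_b); anything outside the
    agreed set warrants manual re-examination before reporting.
    """
    a, b = set(tool_a), set(tool_b)
    return a & b, a - b, b - a

# Hypothetical USB-connection artifacts recovered by two different tools.
tool_a = {"USBSTOR key: Acme_Corp_Flash", "setupapi.log entry",
          "MountedDevices entry"}
tool_b = {"USBSTOR key: Acme_Corp_Flash", "MountedDevices entry",
          "LNK file on volume E:"}

agreed, only_a, only_b = diff_findings(tool_a, tool_b)
print("Validated by both tools:", sorted(agreed))
print("Re-examine (tool A only):", sorted(only_a))
print("Re-examine (tool B only):", sorted(only_b))
```

A disagreement between tools does not mean either is wrong; it means the examiner, not the tool, must resolve the discrepancy, which is exactly the Rubik's Cube point above.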
As a digital forensic community, we took many lessons and takeaways from the Casey Anthony trial. Whether you have never testified in court and are looking for videos of actual courtroom digital forensic testimony, want to review a digital forensic report, or just want to view some of the artifacts released in this case, there is something here for every digital forensic examiner to learn. This was a high-profile, publicized trial that relied heavily upon digital evidence, bringing the shortcomings, successes, and transparency of digital forensics into the public eye.