Monday, November 26, 2012

A Rubik's Approach: Casey Anthony Google Searches Overlooked & Lessons Learned


The buzz surrounding the Casey Anthony trial has resurfaced (yet again) amid media reports that Google searches were overlooked by detectives on a computer primarily used by Casey Anthony. You can conduct your own internet search to read the various articles and stories published online regarding this development. At the conclusion of the trial in the summer of 2011, one of the detectives assigned to this case gave an exclusive interview to Digital Forensic Source about the trial and digital forensics; you can read the DFSource posts here.


Observations
(Disclaimer: Any information and/or opinions contained in this blog post should NOT be considered legal advice. As always, consult with an attorney at law for legal advice.)

Hindsight is always 20/20, and it is easy to sit back and play Monday morning quarterback. The defense expert and fellow digital forensic examiner, Larry Daniel of Guardian Digital Forensics, stated that he did locate this evidence and that it was brought to the defense team's attention prior to the trial. Regardless of why investigators with the OCSO didn't find this digital evidence prior to the trial, the fact remains that our criminal justice system is an adversarial system. Generally, inculpatory evidence, such as these searches, is not required to be brought to the State's attention by the Defense, as the burden falls upon the State to prove its case beyond a reasonable doubt. If the State had located exculpatory evidence (i.e., Google searches that would tend to prove the defendant's innocence), it would have been obligated to notify the Defense of this evidence. However, this can be open to interpretation and is not always black and white. Rules of evidence vary by state.

Lessons Learned
And with great power comes great responsibility. As a lethal forensicator, are you asking yourself what lessons can be learned from this case? There are many takeaways here that you can apply to your DFIR workflow. Were you thorough and complete in your forensic methodology? Did a peer review your findings? Are you focused on being an objective digital fact finder in every forensic exam? Did you answer the original question posed, prior to initiating a forensic examination? Do you regularly reference an internal/external workflow chart, or something similar to the SANS DFIR Poster [artifact analysis], to ensure your forensic examination is focused and targets the necessary digital artifacts?

The best and most powerful DFIR tool is the one that resides between one's ears. Forensic examinations are only as complete and thorough as the digital forensic examiner. We have all had the light bulb moment, or have hit a roadblock where we make a quick phone call, send an e-mail to one of the DFIR lists, or post a tweet to our DFIR followers; moments later, the answer arrives and we think to ourselves, "why didn't I think of that?" We have all executed a knockout in a forensic exam, called upon a fellow forensic examiner for peer review, and watched him or her find a few more artifacts we missed, leaving us unable to understand how we missed them. We all have the same or similar methodologies, yet the lens or instrument we use in conducting our exams varies, which is important to keep in mind. I am confident that the detectives/examiners in this case, with all the digital evidence media that were involved, looked the evidence and analyses over several times. But if this was the one machine where an examiner failed to complete Firefox browser analysis, then that one missed opportunity is all anyone remembers.

And now a football analogy to put things in perspective: nobody remembers how many points a team puts up on the scoreboard or how badly they defeat their opponent. People remember the score at the end of the game. People also remember the game-winning field goal that a kicker misses, costing his team the game. Everything else that happens during a game is, for the most part, forgettable, yet each play and each possession contributes to that final score and to the W or L added to your team's record.
 

A Rubik's Approach
In digital forensics, the dozens of other artifacts the examiner did find may be relevant, but the one artifact the examiner failed to obtain during the forensic examination is what becomes relevant and what people (a jury) remember. So the question is: what are you doing to ensure your forensic findings have been validated? Just as there are many ways to solve a Rubik's Cube, there are different ways to conduct a forensic examination and arrive at the same conclusive results (or at least there should be). As forensic examiners, utilize an array of forensic tools to validate your findings. It doesn't matter whether you prefer Windows-based or Linux-based tools, commercial or FOSS (Free and Open Source Software) tools; our results should be precise and reach the same conclusion. As we follow the scientific method, are we stopping to think and trying a different approach? For example, suppose the question I am trying to answer is: did the suspect plug this Acme Corporation removable USB device into this machine? After I have conducted my analysis and concluded that the suspect did plug the Acme Corp. removable USB device into this machine, am I reversing my hypothesis? Ask yourself: what artifacts would support a hypothesis that the suspect DID NOT plug this Acme Corporation removable USB device into this machine?
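To make that concrete, here is a minimal sketch of one such check, using the third-party python-registry library against a SYSTEM hive exported from the image. The hive path and device label below are hypothetical; the presence, or absence, of a matching USBSTOR key is one artifact bearing on either hypothesis:

```python
# A sketch only: enumerate USBSTOR device keys from an exported SYSTEM hive.
# Requires the third-party python-registry library (pip install python-registry).
from Registry import Registry

SYSTEM_HIVE = "evidence/SYSTEM"   # hypothetical path to the exported hive
DEVICE_LABEL = "acme"             # hypothetical vendor substring to match

reg = Registry.Registry(SYSTEM_HIVE)
# Note: a complete exam would resolve CurrentControlSet from the Select key
# and walk every control set, not just ControlSet001.
usbstor = reg.open("ControlSet001\\Enum\\USBSTOR")

hits = []
for device in usbstor.subkeys():           # e.g. Disk&Ven_Acme&Prod_Flash&Rev_1.00
    if DEVICE_LABEL in device.name().lower():
        for instance in device.subkeys():  # one subkey per device serial number
            hits.append((device.name(), instance.name(), instance.timestamp()))

if hits:
    for name, serial, ts in hits:
        print(f"{name}  serial={serial}  key last written {ts}")
else:
    print("No matching USBSTOR entries found.")
```

A single registry key proves nothing on its own, of course; the point is to enumerate the artifacts each hypothesis predicts (setupapi.log, MountedDevices, link files) and go look for every one of them.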

The digital forensic community can take many lessons and takeaways from the Casey Anthony trial. Whether you have never testified in court and are looking for videos of actual courtroom digital forensic testimony, want to review a digital forensic report, or just want to view some of the artifacts released in this case, there is something here for every digital forensic examiner to learn from. It was a high-profile, publicized trial that relied heavily upon digital evidence, bringing the shortcomings, successes, and transparency of digital forensics into the public eye.

3 comments:

Unknown said...

I have posted the raw Mozilla history files I received recently in this case at the following two links:

History from non-password-protected "casey" account


History from password-protected "owner" account


Readers are free to download the files and, using their favorite forensic toolkits, determine for themselves whether the suffocation search was done at approximately 1:50 PM on June 16, 2008, as the defense maintains, or at approximately 2:50 PM, as I believe.
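For readers working through the raw files: Firefox of that era stored visit times inside the Mork records as PRTime values, microseconds since the Unix epoch in UTC, so the 1:50 PM vs. 2:50 PM question ultimately comes down to how a tool converts and localizes those values. A minimal conversion sketch follows; the sample value is illustrative, chosen to decode to 1:50 PM EDT, and is not taken from the evidence:

```python
# A minimal sketch, not a Mork parser: convert a Firefox PRTime value
# (microseconds since the Unix epoch, in UTC) to local Eastern time.
from datetime import datetime, timedelta, timezone

prtime = 1213638600000000  # hypothetical PRTime value (microseconds)

utc = datetime.fromtimestamp(prtime / 1_000_000, tz=timezone.utc)
edt = utc.astimezone(timezone(timedelta(hours=-4)))  # EDT (UTC-4) in June 2008
print(utc.isoformat(), "->", edt.isoformat())  # ...T17:50+00:00 -> ...T13:50-04:00
```

An off-by-one-hour result from a timezone or DST mistake is exactly the kind of discrepancy at issue here.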

The point is, the forensic tools that the OCSO and defense relied on prior to trial (NetAnalysis and CacheBack) had their Mork parsing engines completely rewritten after the trial in response to the errors that were uncovered during trial. Both tools now agree on such critical issues as record count, visit count, and time stamps.

The versions of the tools that the defense and prosecution relied on prior to trial produced incorrect time stamps. This error is as critical as the OCSO's neglecting to examine the file altogether. They certainly had the file, because I obtained it from them.
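A crude way to check that kind of cross-tool agreement yourself is to export each tool's parsed records to CSV and diff them. A sketch under assumed file and column names (both exports are assumed to use the same timestamp format; normalize first if they do not):

```python
# Diff two tools' CSV exports of the same history file.
# File names and column names are hypothetical; adjust to each tool's export.
import csv

def records(path, url_col, time_col):
    """Return the set of (url, timestamp) pairs from a tool's CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        return {(row[url_col], row[time_col]) for row in csv.DictReader(f)}

a = records("netanalysis_export.csv", "URL", "Last Visited")
b = records("cacheback_export.csv", "URL", "Visit Time")

print(f"tool A: {len(a)} records, tool B: {len(b)} records")
for url, ts in sorted(a - b):
    print("only in tool A:", ts, url)
for url, ts in sorted(b - a):
    print("only in tool B:", ts, url)
```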

For completeness, the raw history file recovered from unallocated space containing the infamous Google searches for "chloroform" can be found here: https://dl.dropbox.com/u/39580638/Deleted%20Mozilla%20History%20%282%29.dat

H. Carvey said...

Good post, with some very interesting questions.

But really...how many examiners out there open themselves to peer review? I think that this is one of the biggest things missing in our "profession"...lack of engaging with others, to include peer review.

And I know what all the excuses are...including that if a defense attorney finds out that you asked a question, they can grill you on the stand. But which is better...to ask a question and from that develop an answer and a better understanding of the issue, or to not ask and simply not know?

Unknown said...

John - Thank you for posting this.

My thoughts - Digital forensic examiners should rely on multiple eyewitnesses to confirm or dispute findings.

For example, if tool X reports 1:50 PM from an internet record (index.dat, for example), the forensic examiner should be looking in TIF (Temporary Internet Files) to see if there is an HTML file with a created (C) date around the same date/time, and then also cross-referencing the images, scripts, etc. from that webpage that would have been hitting the system as well. They can also do a full temporal analysis on the drive to see if there are any inconsistencies/consistencies that support the tool's finding. Of course this would be burdensome for every internet finding, but for the really important "events" on the system the examiner should not be blindly relying on a tool's output.
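A minimal sketch of that TIF cross-check, assuming the Temporary Internet Files directory has been exported from the image and is being examined on a Windows workstation (the path and target time below are hypothetical):

```python
# List exported TIF files whose creation time falls near a target timestamp.
# On Windows, os.stat().st_ctime is the creation time; on Linux it is the
# inode-change time, so run this on Windows or read $SI/$FN with a forensic tool.
import os
from datetime import datetime, timedelta

TIF_ROOT = r"E:\export\Temporary Internet Files"  # hypothetical export path
TARGET = datetime(2008, 6, 16, 13, 50)            # tool-reported local time
WINDOW = timedelta(minutes=5)

for dirpath, _dirs, files in os.walk(TIF_ROOT):
    for name in files:
        full = os.path.join(dirpath, name)
        created = datetime.fromtimestamp(os.stat(full).st_ctime)
        if abs(created - TARGET) <= WINDOW:
            print(created.isoformat(), full)
```

Hits for the page's HTML plus its images and scripts clustered inside the window corroborate the tool's timestamp; an empty window is a flag to dig deeper.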