Monday, November 26, 2012

A Rubik's Approach: Casey Anthony Google Searches Overlooked & Lessons Learned

The buzz surrounding the Casey Anthony trial has resurfaced (yet again), amid a media report that Google searches were overlooked by detectives on a computer primarily used by Casey Anthony. You can conduct your own internet search to read the various articles and stories published online regarding this recent development. At the conclusion of the trial in the summer of 2011, one of the detectives assigned to the case gave an exclusive interview with Digital Forensic Source regarding the trial and digital forensics; you can read those DFSource posts here.

(Disclaimer: Any information and/or opinions contained in this blog post should NOT be considered legal advice. As always, consult with an attorney at law for legal advice.)

Hindsight is always 20/20, and it is easy to sit back and play Monday morning quarterback. The defense expert and fellow digital forensic examiner, Larry Daniel of Guardian Digital Forensics, stated that he did locate this evidence and that it was brought to the defense team's attention prior to the trial. Regardless of why investigators with the OCSO didn't find this digital evidence prior to the trial, the fact remains that our criminal justice system is an adversarial system. Generally, inculpatory evidence, such as these searches, is not required to be brought to the attention of the State by the Defense, as the burden falls upon the State to prove its case beyond a reasonable doubt. If the State had located exculpatory evidence (i.e., Google searches that would tend to prove the defendant's innocence), it would have been obligated to notify the Defense of this evidence. However, this can be open to interpretation and is not always black & white; rules of evidence vary by state.

Lessons Learned
And with great power comes great responsibility. As a lethal forensicator, are you asking yourself what lessons can be learned from this case? There are many takeaways from this case that you can apply to your DFIR workflow. Were you thorough and complete in your forensic methodology? Did a peer review your findings? Are you focused on being an objective digital fact finder in every forensic exam? Did you answer the original question posed prior to initiating the forensic examination? Do you regularly reference an internal/external workflow chart, or something similar to the SANS DFIR Poster [artifact analysis], to ensure your forensic examination is focused and targeting the necessary digital artifacts?

The best and most powerful DFIR tool is the one that resides between one's ears. Forensic examinations are only as complete and thorough as the digital forensic examiner. We have all had the light bulb moment, or reached a roadblock where we make a quick phone call, send an e-mail to one of the DFIR lists, or post a tweet to our DFIR followers; moments later, we get our answer and think to ourselves..."why didn't I think of that?" We have all executed a knockout in a forensic exam, called upon a fellow forensic examiner for peer review, and then watched him or her find a few more artifacts we missed, unable to understand how we missed them. We all have the same or similar methodologies, yet the lens or instrument we use in conducting our exams varies, which is important to keep in mind. I am confident that the detectives/examiners in this case, with all the digital evidence mediums involved, looked the evidence and analyses over several times. But if this was the only machine where an examiner forgot to complete Firefox browser analysis, then that one missed opportunity is all anyone remembers.
And now a football analogy to put things in perspective: nobody remembers how many points a team puts up on the scoreboard or how badly they defeat their opponent. People remember the score at the end of the game. People also remember the game-winning field goal that a kicker misses, costing the team the game. Everything else that happens during a game is largely forgettable, but each play and each possession contributes to that final score and to a W or L being added to your team's record.

A Rubik's Approach
In digital forensics, the dozens of other artifacts the examiner found may be relevant, but the one artifact the examiner failed to obtain during the forensic examination is what becomes relevant and what people (a jury) remember. So the question is: what are you doing to ensure your forensic findings have been validated? Just as there are many ways to solve a Rubik's Cube, there are different ways to conduct a forensic examination and arrive at the same conclusive results (or at least there should be). As forensic examiners, utilize an array of forensic tools to validate your findings. It doesn't matter if you prefer Windows-based vs. Linux-based tools, or commercial vs. FOSS (Free and Open Source Software) tools; our results should be precise and reach the same conclusion. As we follow the scientific method, are we stopping to think and try a different approach? For example, suppose the question I am trying to answer is: did the suspect plug this Acme Corporation removable USB device into this machine? After I have conducted my analysis and concluded that the suspect did plug the Acme Corp. removable USB device into this machine, am I reversing my hypothesis? Ask yourself: what artifacts would support a hypothesis that the suspect did NOT plug this Acme Corporation removable USB device into this machine?
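As one illustration of testing such a hypothesis, a minimal sketch might scan a Windows setupapi log for the first-install record of a specific device. The device serial, log excerpt, and function name below are hypothetical, for illustration only; real setupapi.dev.log files are far larger and messier, and this is not a substitute for validated tooling:

```python
import re

# Hypothetical serial number and minimal log excerpt for illustration only;
# a real setupapi.dev.log contains many more sections and fields.
SAMPLE_LOG = (
    ">>>  [Device Install (Hardware initiated) - "
    "USBSTOR\\Disk&Ven_Acme&Prod_Flash&Rev_1.00\\ACME123456&0]\n"
    ">>>  Section start 2012/11/20 14:03:22.123\n"
    "<<<  Section end 2012/11/20 14:03:25.456\n"
)

def find_first_install(log_text, serial):
    """Return the 'Section start' timestamp of the first device-install
    record whose hardware ID contains `serial`, or None if absent."""
    pattern = re.compile(
        r">>>\s+\[Device Install \(Hardware initiated\) - .*"
        + re.escape(serial)
        + r".*\]\s*\n>>>\s+Section start ([\d/ :.]+)"
    )
    match = pattern.search(log_text)
    return match.group(1).strip() if match else None

# Supports both directions of the hypothesis: a hit suggests the device
# was installed; a miss (None) is one data point for the reverse theory.
print(find_first_install(SAMPLE_LOG, "ACME123456"))  # 2012/11/20 14:03:22.123
print(find_first_install(SAMPLE_LOG, "OTHER999"))    # None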

As a digital forensic community, there were many lessons and takeaways from the Casey Anthony trial. Whether you have never testified in court and are looking for videos of actual courtroom digital forensic testimony, want to review a digital forensic report, or just want to view some of the artifacts released in this case, there is something here for every digital forensic examiner to learn from. This was a high-profile, heavily publicized trial that relied extensively on digital evidence, bringing the shortcomings, successes, and transparency of digital forensics into the public eye.

Wednesday, June 27, 2012

2012 Forensic 4cast Digital Forensic Awards-- Results

The 2012 Forensic 4cast Digital Forensic Awards were just streamed LIVE here at the SANS Digital Forensics and Incident Response Summit.

Computer Forensic Hardware Tool of the Year
Digital Forensic Article of the Year
Phone Forensic Software Tool of the Year
Digital Forensic Podcast of the Year
Digital Forensic Book of the Year
Computer Forensic Software Tool of the Year
Digital Forensic Blog of the Year
Phone Forensic Hardware Tool of the Year
Digital Forensic Organization of the Year
Digital Forensic Examiner of the Year

For more information on the Forensic 4cast Awards, click here. Hats off to Lee Whitfield for putting together another successful awards ceremony. Remember, these awards are 100% community driven, with the nominees & winners voted on by the DFIR community. This year's awards alone drew 4,000 votes!
If you want to catch the remainder of the talks here at the #DFIRSummit, check out SANS' LiveStream (no parking, as there are only 50 spots available).

Friday, June 22, 2012

Preview: #DFIRSummit 2012

SANS Digital Forensics and Incident Response Summit 2012 is just days away and begins June 26th. The pre-summit training courses began June 20th and are going on now. The agenda has been posted, as well as the speaker line-up. The 2012 Forensic 4cast Awards are at 9AM Central, Wednesday June 27th and will be streamed LIVE from the Summit (link will be posted later). Lee Whitfield posted to Twitter that the awards are secured under lock and key in an undisclosed location.
I will be attending the Summit and am looking forward to reconnecting with old friends and meeting new ones. Make sure that you follow us on Twitter @DFSource for live updates from the Summit. #DFIRSummit is the official hashtag of the SANS Digital Forensics and Incident Response Summit.

#DFIRSummit & Volatility Goodness:
Lee Whitfield, Brian Moran, Cindy Murphy, and Tom Yarrish delivered the unofficial Forensic 4cast Awards via Google+ Live Hangout. In case you missed it, here are the results:
  • Forensic dance of the year – Hal Pomeranz 
  • Shiniest head in digital forensics – Rob Lee
  • Wussiest punch in DFIR – Dave Kovar
  • Forensic Tool of the year – Mark McKinnon
  • Forensic beard of the year – David Cowen
  • Best photoshop of Lee Whitfield – J-Michael Roberts, Brian Moran, Mark McKinnon
  • Most confusing presentation – Hal Pomeranz
  • Handsomest male forensicator – Joseph Shaw
  • Prettiest female forensicator – Lee Whitfield
  • The “Not Bad for an Old Guy” Award – Ovie Carroll
  • Most unpronounceable name in forensics – Kristinn Gudjonsson

See you in Austin and make sure to follow the Summit happenings via our Twitter feed @DFSource and #DFIRSummit, the official hash tag of the Summit. 

Wednesday, February 8, 2012

Internet Evidence Finder vs NetAnalysis vs CacheBack

A recent conversation on Google+ among several digital forensic professionals raised the question: how does Internet Evidence Finder version 5 stack up against some of the other commercial tools? After some discussion, I contacted JadSoftware, and the company stated that it was in the process of completing internal testing of IEF v5 and comparing it to other commercial tools. JadSoftware offered to provide our readers with those results to publish here on the blog (see below). NetAnalysis and CacheBack have been contacted, as of the time of this blog post, about performing their own tests; if either vendor provides its own testing results, that information will be posted here.

Editor's Note: Digital Forensic Source does not endorse commercial digital forensic tools. This information is being shared as a service to the digital forensic community and is being provided "as-is", as testing results completed by the vendor (JadSoftware). DF Source did beta test version 5 and provide feedback to the vendor. As always, conduct your own testing on your DF tools.

The browser testing results, produced by JadSoftware and shared with Digital Forensic Source, are being posted as a service to the digital forensic community. These results have not been independently tested by Digital Forensic Source.

Click Here for the Browser Forensic Tools Comparison Chart provided by JadSoftware

(For a listing of other files posted to the Digital Forensic Source, click here.)

Adam Belsher (CEO of JadSoftware) stated, "Some of our key differentiators include:

·Carving for IE, Chrome, Opera, and Firefox (Cacheback doesn’t do any carving and NetAnalysis is limited.)
·Single search for all artifacts (rather than having to do multiple searches)
·We search in more areas on the hard drive (i.e.  hiberfile.sys with decompression, pagefile.sys, live RAM captures etc.)
·We support more artifacts within the browsers (i.e. Chrome- top sites, credit card data, favicons etc.)"
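To illustrate the kind of carving described above in general terms (this is not JadSoftware's actual implementation, and the pattern, function name, and sample bytes are illustrative assumptions), a bare-bones carver might pull printable http/https strings straight out of raw bytes such as a pagefile or unallocated clusters:

```python
import re

# Matches ASCII http/https strings of reasonable length; a real carver
# would also handle UTF-16 strings and browser-specific record formats.
URL_PATTERN = re.compile(rb"https?://[\x21-\x7e]{4,200}")

def carve_urls(raw):
    """Carve candidate URL strings from a raw byte stream and return
    them as a sorted, de-duplicated list of str."""
    return sorted({m.group().decode("ascii") for m in URL_PATTERN.finditer(raw)})

# Simulated slice of raw data (e.g. read from pagefile.sys):
blob = b"\x00\x07https://example.com/search?q=test\x00junk\x02http://foo.bar/a\xff\xfe"
print(carve_urls(blob))
# ['http://foo.bar/a', 'https://example.com/search?q=test']
```

Carving like this works on any byte source, which is why searching hiberfil.sys, pagefile.sys, and RAM captures can surface browser activity that the browser's own databases no longer hold.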

Commentary: On a side note, thank you, JadSoftware, for following up on the inquiries regarding IEF and how it currently compares to other commercial browser forensic tools. It is a great thing for the digital forensic community when vendors respond promptly to the needs of examiners. As budgets continue to be a focus for governments, academia, and companies, the digital forensic vendors that are truly service-oriented and driven by the requests of their customers (the forensic analysts) will be the ones that succeed. Sharing information, research, tool testing, etc. benefits the entire digital forensic community.

Thursday, January 19, 2012

DFIROnline: Connecting the DFIR Community (January Meetup)

DFIROnline is an opportunity to meet up monthly (informally) with other DFIR practitioners. It is held on the third Thursday of every month @ 2000 hours ET. If you missed tonight's meet-up, the archive will be posted soon. Props to Mike Wilkinson for taking the initiative to organize (organise, for Lee Whitfield) DFIROnline.
Leave comments or feedback from tonight's meet-up. These informal meet-ups are a great idea, and the digital forensics community was out in force this evening. Participation (# of attendees) tripled compared to the first DFIROnline meet-up in December, which is archived and can be viewed here. Great presentations this evening by Harlan Carvey on "Malware Detection with an Acquired Image" and Eric Huber on "The Advanced Persistent Threat or: How I Learned to Stop Worrying and Love DF/IR". {Be a sheep dog!}
I enjoy watching Harlan present. He always delivers a practical presentation (something that can be implemented into your DFIR toolkit/processes now) that the DFIR analyst/investigator can understand and deploy immediately. Eric also delivered a great presentation on the APT: what it is and what it is not, drawing upon his knowledge of history to demonstrate and define the APT. Make sure you catch the next DFIROnline meet-up on Thursday, February 16, 2012.

Checkout the future line-up already scheduled for this year:

  • Feb 16, 2012 – Peter Coons and John Clingerman: Case studies in e-discovery; Jon Rajewski: TBA
  • Mar 15, 2012 – Hal Pomeranz: Linux Forensics for non-Linux users; Corey Harrell: Ripping Volume Shadow Copies – Tracking User Activity

Remember to follow the #DFIROnline hashtag on Twitter.