U.S. Department of Justice, Office of Justice Programs; National Institute of Justice. The Research, Development, and Evaluation Agency of the U.S. Department of Justice

Backlogs and Their Impact on the Criminal Justice System

Listen to the NIJ Conference 2010 panel discussion.

Gerry LaPorte: Good afternoon, everybody. I hope everybody enjoyed their lunch. My name is Gerry LaPorte, and I am the Forensic Policy Program Manager within the Office of Investigative and Forensic Sciences, which is within the National Institute of Justice.

It's my pleasure today to have a very distinguished panel seated here to my left. Mr. Joseph Peterson is going to handle the bios and kind of introduce each of the speakers and so forth, but my job here is just to moderate and then to introduce Joe, and then he'll kind of take it over from that point.

Mr. Peterson is a professor and director of the School of Criminal Justice and Criminalistics at California State University, Los Angeles. For the past 35 years, Mr. Peterson's research has monitored the evolution of forensic science, documenting its growth potential as well as its shortcomings. Mr. Peterson's 2002 and 2005 reports, Census of Publicly Funded Forensic Crime Laboratories, for the Bureau of Justice Statistics, have documented high caseloads, long backlogs, and severe budgetary and personnel needs. His NIJ-sponsored research has also examined the role and impact of forensic evidence at key decision points in the judicial process, such as arrest, charging, determination of guilt or innocence, and sentencing.

His current research examines the backlog of sexual assault kits in crime laboratories, which is the theme of this session. This is obviously a very, very hot issue right now, so I am going to let Joe take over.

Joseph L. Peterson: Thank you, Gerry, and good afternoon to everybody.

I should note that I was involved in those censuses of crime labs. Actually, it was Kevin Durose who published the 2005 data a couple of years ago, so I want to acknowledge him for that.

We have a wonderful panel, and we're going to be addressing different aspects of this backlog problem. And I am going to wind up at the end and just give you an overview of the research that we're doing in Los Angeles in terms of looking at the results of the testing of cases that were in the backlogs of the Los Angeles Police Department and Sheriff's Department and tell you some other things that we're doing.

But we're going to start our panel discussion today with Mr. Kevin Strom, who is a senior research scientist at RTI International. Actually, I got to know Kevin when he worked for the Bureau of Justice Statistics a number of years ago. His research interests include law enforcement responses to community violence and the impact of forensic science on the criminal justice system. He's published widely, in both government reports and academic journals. And most recently, Kevin and colleagues did some very interesting survey work on the backlog issue of forensic evidence, and he's going to speak to that issue right now.

Kevin?

Kevin Strom: Thank you.

So, today, I'm going to talk about a survey we completed for NIJ last year. This survey was focused on forensic evidence processing in state and local law enforcement agencies. A lot of the information out there is based on forensic backlogs in crime labs, and this particular effort was focused on sort of the other side of the fence: what's going on in law enforcement agencies, how are agencies processing and moving these cases forward, what's the decision making, how is the evidence maintained, and what may be some solutions for making these things work more efficiently.

So the final report came out in October 2009. It's available on the NIJ website, and there has also been one article based on this work that appeared in Criminology & Public Policy last month.

So what do we know about evidence backlogs more generally? As I mentioned, forensic backlogs within crime laboratories are relatively well established. These trends have fluctuated over time. There is some evidence that suggests recent funding through NIJ has resulted in some reductions, but as a whole, the BJS surveys have shown that as of 2002, there was a backlog of over 260,000 cases, and in 2005, the subsequent survey showed that these backlogs had increased by 24 percent.
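That growth figure is a simple percentage increase. As a back-of-the-envelope sketch (the 260,000 base and the 24 percent growth are the figures quoted above; the resulting 2005 total is only implied by the talk, not stated in it):

```python
# Implied 2005 crime lab backlog, given the 2002 BJS census figure of just
# over 260,000 backlogged cases and the reported 24 percent increase.
backlog_2002 = 260_000
growth_rate = 0.24
backlog_2005 = backlog_2002 * (1 + growth_rate)
print(f"{backlog_2005:,.0f}")  # 322,400
```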

NIJ also conducted a study earlier in the 2000s that looked at unanalyzed evidence in law enforcement agencies. So these were defined as cases that were unsolved, that had forensic evidence associated with them, but for one reason or another never went to a crime lab for analysis.

This earlier study was focused on biological evidence only, and it found that there were over 50,000 unsolved homicides and nearly 170,000 unsolved rapes that contained biological evidence but were never sent to a crime lab. They also estimated about 264,000 unsolved property crime cases that contained biological evidence. So this is sort of the foundation for some of this work. The goal for NIJ was to conduct this survey of law enforcement agencies and also to look at some things in more detail: what type of evidence was involved and what were the different capacity issues for these agencies.

So the primary objective was to estimate the number of unsolved violent cases, homicide and rape, and property cases that contained some form of forensic evidence, not just DNA, but that were not submitted to a crime laboratory for some particular reason. We also looked at estimates of the types of forensic evidence within these cases and also described the capabilities and procedures in law enforcement agencies for processing, submitting and retaining evidence.

Today, I'm going to focus mainly on the results that look at the unanalyzed cases and then also talk a little bit about the capacity issues within the agencies.

A little bit of background on the survey: It was a nationally representative survey. We sampled more than 3,000 state and local law enforcement agencies. All large law enforcement agencies, those with 100 or more sworn officers, were included with certainty in the sample, and then we selected a stratified sample of smaller agencies.

It was a multi-mode data collection: web, mail, fax, telephone. As you can imagine, this is not very easy information to collect. Some law enforcement agencies have this information readily available, with databases that can pull it out, but many do not. So, in those cases, we asked them to approximate these numbers for us as best they could, and we worked with them to try to limit the burden on them in providing this information.

We also worked with an expert panel of forensic scientists, law enforcement agents, researchers and others to develop the survey instrument and to follow up with specific agencies as needed, and, of course, NIJ was involved throughout the process.

A little bit on response modes: Overall, there was a 73 percent response rate. Nearly half of the responses came by web, about 40 percent through hard copy mail, 11 percent by fax, and we did some telephone follow-up, but very few people completed by phone. Mostly, the telephone was used to prompt agencies to respond. And anybody that has followed survey research knows that web-based responses, which used to be a lower proportion, have grown steadily over time.

This is just a screenshot of the web-based survey site, where agencies could log on with a unique password and enter in information; the website also allowed us to track responses in real time.

As for the response rates by agency type, as I mentioned, the response rate was 73 percent overall; municipal police departments had a 75 percent response rate and state police agencies 63 percent. Also, what you see by agency size is that the larger agencies were most likely to respond; essentially, smaller sheriff's departments had the lowest response rate overall.

So what do the data show? We asked about a five-year period, from 2003 to 2007. There were an estimated nearly 4,000 homicide cases that contained some form of forensic evidence but that were not submitted to a crime lab for analysis. This represented about 14 percent of all estimated unsolved homicide cases.

For rapes, there were about 27,500 estimated cases that again contained some form of forensic evidence but were not submitted to a laboratory; this represented about 18 percent of unsolved cases.

For property crimes, it was nearly five million, which is not a surprise considering the sheer number of property crimes out there, and that represented about 23 percent of unsolved cases with unanalyzed forensic evidence.

So the numbers are fairly large. The magnitude of these is debatable in terms of the sheer scale. Based on the amount of evidence that flows through the system, some folks have looked at these numbers and said, “Well, these aren't very surprising; in fact, they could be a little bit low,” whereas others have had the opposite viewpoint.

When you look at the types of forensic evidence contained in these backlog cases, and this is just for homicide and rape cases, about 40 percent contained some form of DNA evidence. Trace evidence was included in about 27 percent of these cases, 26 percent contained some form of latent prints, and firearms and tool marks were in 23 percent. So DNA, while not involved in all these cases, was the most likely form of evidence to be contained in these unanalyzed homicide and rape cases.

When looking at the breakouts by agency size, as one can imagine, larger police agencies accounted for more than 80 percent of the unanalyzed homicides, but the same was not necessarily true for rape and property cases especially. Smaller agencies, those with fewer than 50 officers, accounted for about 30 percent of those cases. Collectively, agencies with 100 or fewer sworn officers accounted for about 40 percent of the unanalyzed rape cases. So the perception that we can just focus on large agencies to improve some of these issues is, I think, misleading, especially for rape cases.

This just shows a little bit more by agency size and agency type. Again, you see that municipal police departments accounted for most of the homicide and rape cases, and, most notably, state police agencies accounted for about one in 10 unsolved rape cases.

So what are some reasons why this evidence isn't moving through the system? Perhaps the most common reason was that there was no suspect identified in the case; in some instances, this may be an issue with the investigator not fully understanding the value of the evidence, or there may be some type of procedural issue that does not allow them to submit the case forward. In some cases, there was an issue with the prosecutor: 15 percent indicated that the analysis had not been requested by the prosecutor and that was the reason the evidence hadn't been submitted. And in 12 percent, the suspect had been identified but not formally charged. So these investigative-type issues are affecting the flow of evidence in some of these cases.

Laboratory resource and timeliness issues, while not as prevalent, were also noted. The inability of the laboratory to produce timely results was reported in 11 percent of the instances, insufficient funding for analysis in 9 percent, and the lab not accepting evidence due to backlog issues in 6 percent. And remember, these are based on the perceptions of the law enforcement agency.

Finally, some other factors that inhibit the submission of evidence: 24 percent indicated the suspect had been adjudicated without the forensic evidence, 17 percent reported that they were uncertain of the usefulness of forensic evidence in that particular case, and 2 percent were uncertain where to send the evidence for analysis.

A little bit about evidence retention: We also asked about evidence retention policies within the law enforcement agency. Less than half of agencies reported having a policy for preserving biological evidence in cases in which the defendant was found guilty. One in five agencies were unsure whether they had such a policy, and where the agency did have a policy, the investigating law enforcement agency was responsible for storing that evidence in the vast majority of cases.

The same was true for unsolved cases. So the bottom line is that in the vast majority of instances, for both unsolved and solved cases, the burden of retaining, storing and tracking that evidence normally comes down to the law enforcement agency.

So what are some of the study implications, some of the more generalizable issues that come out from this study? One is certainly that law enforcement agencies continue to face substantial forensic backlogs for homicide, rape and also property crime cases. One in seven unsolved homicide cases, one in five rape cases and one in four property cases that contain some form of forensic evidence were not submitted to a crime lab for analysis. Those backlogs were not just limited to large agencies, especially for rape cases, which is an important point.

It also indicates that some additional training is required, along with enhanced policies regarding the use of forensic evidence. It shouldn't be just up to one investigator or a single person within an agency to make all these decisions; there should be more checks and balances about how cases move forward.

I should also mention that policies that go to the complete other extreme and require that all evidence be submitted under all circumstances are, I think, also a little bit suspect in terms of really preserving the resources available in the system.

Some law enforcement agencies continue to have the mindset that forensic evidence is beneficial primarily for prosecuting crimes, not for developing new leads in investigations, and, again, sometimes there are procedural issues within the crime lab or the prosecutor's office that really contribute to that mindset.

Another really important point was about information systems and the ability to track evidence within the law enforcement agency and also as it moves on to the crime lab and then to adjudication. Only about four in 10 law enforcement agencies reported having a computerized system in place capable of tracking forensic evidence in inventory.

This is a major area of need. We really need to get better at how we track evidence, especially as it moves from one part of the system to the other; we also need more guidelines, documentation and resources for evidence processing. Policies must take into account the resources available in law enforcement agencies. As I mentioned, in most instances they're responsible for tracking and maintaining this evidence over time, and more assistance and more resources need to be provided to allow them both to maintain this evidence and to better understand what evidence can be discarded.

Finally, just to review: training, creating computerized systems, improving storage capacity, but then also a system-wide approach to improve coordination among the police, the labs and the prosecutors. This could include dedicated staff responsible for case management, regular teams for case review, and computerized systems, but really a coordinated approach, as opposed to each entity looking at its own forensic evidence and, as it moves on, it becoming someone else's problem. So, a more coordinated approach to how this evidence is thought through, and more checks and balances to decide what evidence should proceed and what would have the most utility for solving and investigating these crimes.

Thank you.

[Applause.]

Joseph L. Peterson: Thank you, Kevin.

I think what we'll do is we'll wait for questions until the very end. I think we may have time for some questions and for some discussion of the panelists.

Our next presenter is Dean Gialamas. He is the director of the Los Angeles Sheriff's Crime Lab in Los Angeles, California. That's a jurisdiction that serves over six million residents, and it's an institution fully accredited by the American Society of Crime Laboratory Directors.

Dean is the immediate past president of ASCLD and is also a member of the Consortium of Forensic Science Organizations, and I'm proud to say he's a graduate of our program at Cal State. He will discuss the background of the backlog in Los Angeles County, how it formed there, and what they're actually doing about it.

Dean?

Dean M. Gialamas: Thank you, Joe. Good afternoon, everyone.

I am going to take you through a little historical perspective of Los Angeles County, and as I give you a little, brief overview of what I'm going to talk about today, I'll talk about how the backlog was created, so to speak, why for me on my soapbox, backlogs are actually a false metric. It's really something that I find as a meaningless number. We'll review some of the statistics from the program and then what the next steps are for us at L.A.

Just a quick overview, so everybody understands what L.A. County comprises. It's a pretty large county, probably larger than a majority of the states in the union: 11 million people, 4,000 square miles. We have 88 cities with 47 independent police departments that operate within L.A. County, and we provide service, specifically for DNA, to every one of those agencies except for one, the City of Los Angeles, which has its own crime lab. In fact, the data I'm going to present today covers all the areas of Los Angeles County excluding the city. If we added the city, these numbers would easily double, just because of the numbers the city has been dealing with.

A little bit about the lab: We not only provide service to the sheriff's department but to all those cities as well, about 50 other law enforcement agencies when you factor in things like the Cal State University police or the Amtrak police and all these other interesting police departments that operate at a local, state and federal level. Our lab has about 300 sworn and professional staff, and we get about 80,000 evidence submissions a year. So it's a pretty large operation, which, when you see the numbers, will give you some perspective.

So how did the backlog emerge in L.A.? Well, it started with some local attention. Our partners across the way at the city were going through their sexual assault kit backlog program. It caught the attention of the L.A. Times several times. Then Human Rights Watch got involved and wrote a very detailed report about what was going on in the City of Los Angeles, trying to raise awareness. They saw L.A. as a place to start, and they have pledged to move forward all across this country. So don't worry; you'll get your chance, too.

[Laughter.]

Gialamas: Concerns were also raised by the local rape treatment center operators, and all these things culminated at the sheriff's office, at which point a policy decision was made. The sheriff's policy decision was: if a sexual assault kit is collected, then it will be tested, end of discussion. That is the effective triage, and that is exactly what we have now in both the City of Los Angeles and Los Angeles County. If a kit is obtained, it will be tested, regardless of merit or other issues or circumstances in the case.

So we then needed to get a handle on how many kits we actually had, and then worry about what was going to come in the door tomorrow. So we initiated a hand count, an inventory of all the freezers. We got parkas for everyone, off they marched, and they did an inventory and counted 6,113 sexual assault kits that were in local storage at one location or another.

Now, while that inventory was being completed, we also generated a database. We did that because we did not have a laboratory information management system (LIMS) capable of integrating information from the independent law enforcement agencies with the crime lab. So we needed a way of tracking these kits, and our existing LIMS was not capable of doing it. This was the first of many databases created to help us manage this particular problem.

So, as we started going through the data, we had to keep in mind that this was a hand count of everything that exists. Because some of these kits may have been returned to those storage locations already analyzed, the question was how many of those kits had previously been worked, and that number ended up being just over 1,400. That left our magic starting number of 4,675 unanalyzed sexual assault kits that were now part of our backlog, with the breakdown that you see. And if you can imagine what happened in the laboratory: we went from having about 25 cases that were pending and needed to be done according to the investigators and law enforcement to having 4,675 cases, just in a matter of days; hence my reason for why backlog is a meaningless number.

Why is it meaningless? Well, first of all, backlog is really a measurement of inputs; it's not a measurement of productivity at all. In fact, it doesn't even measure efficiency, and that's unfortunately what a lot of policymakers get wrapped up with.

A good example of that is the NIJ convicted offender program. As many of you know, for the last several years NIJ has graciously funded convicted offender backlog reduction programs, and from the start of that funding until today, the backlogs have actually grown, not decreased, and it is not because people aren't working. In fact, we have gotten more efficient. It's that more states have added either arrestee laws or additional laws on the books and are now collecting more and more samples.

So a backlog is a static number. It is really a slice in time, and it doesn't give you a full perspective on efficiency. It also doesn't tell you anything about productivity or turnaround time, which is oftentimes what even we ourselves, as crime lab managers and supervisors, get wrapped up in.

A good example of that is Orange County, California, which had very little or no backlog. Through programs initiated with batching and other things, we went from processing about 3,000 samples a year to over 13,000 samples a year, and during that same time frame, our backlog went from essentially zero to over 2,500 cases waiting to be done, simply because of property crimes. So, even though our turnaround time and our productivity increased four- and fivefold, our backlog went through the roof.

So policymakers, unfortunately, relying on backlogs are looking at this magic number and want to see it go away, and they don't realize that there's really no connection between inputs, productivity and the number they are asking about in public. So that is really an educational piece that we have to be wary of.

I only mention this because it's a soapbox item for me, because we get wrapped up in it, too, and I think we need, as professionals in this field, to stop using that term, or at least to define what it means when we do use it, because it can be viewed erroneously and actually come back to bite us.

Backlogs are not controlled by the laboratory. Because they're inputs, they're controlled by our client users; they're controlled by crime rates, which can go up and down. I don't know about you, but our property crime rates are plummeting in Los Angeles, and yet our level of submissions in the laboratory continues to increase. Again, we're a victim of our own success. So we have to be careful of the terms we use.

Well, going back to the L.A. problem: when we were in the survey phase, we decided to triage the cases based on their potential probative value. We designed an audit questionnaire that went out to all the law enforcement agencies. We asked for a two-week turnaround time, which, for the most part, proved okay, but for larger agencies or those that had a significant number of kits, like in the thousands, that proved to be an inadequate amount of time, for the very same reasons that Kevin just described about getting surveys returned: not enough data, no LIMS, no database in existence to track this kind of information.

We did have about a 75 percent return rate overall, and we followed up with a personal touch on those we didn't get any response from. Here are the categories that we broke the cases down into and found useful. Category one was unknown-suspect cases; those would be given priority. And I might mention that, unlike other agencies' experience, very few of these cases were what we would typically call the “stranger rape” situation; even where the suspect was unknown, the investigator often really had no desire to pursue the case in the first place. The other categories were known-suspect cases, DA rejects, cases that had been adjudicated, incomplete audit returns, and some in which the elements of the crime had not even been established.

So here's the overall breakdown, and I hope those in the back can read some of the numbers; I tried to get this as large as I could. Of the 6,113 kits, we ended up with about 6,073 cases that actually met the merits of working under this program, with an approximate breakdown of 31 percent that had been analyzed and 69 percent that had not been analyzed in our inventory. Focusing on the 69 percent, this is the breakdown by category: only 20 percent of those cases were actually unknown suspect. All the rest fell into the other categories, such as known suspect or DA reject. These were essentially cases that wouldn't typically be worked by most crime labs, because most of the time investigators wouldn't be submitting them; there was really no probative value in knowing that information.

For example, many of the known-suspect cases happened to be situations where the assailant was known. There was no question that a sexual act of some kind had occurred; it was just an issue of consent. Well, a DNA test isn't going to resolve the issue of consent; that's an investigative process. So conducting the DNA test on those types of cases was not probative in the sense of the investigation, which is why many of these cases were never submitted in the first place.

It'll be very interesting to see over time how this data shakes out, and we will get to that a little bit later, as Dr. Joe Peterson will talk about some of the work that Cal State L.A. is pursuing on that.

Now, we ran into the obvious dilemma that most of you would have. When you go from having 25 cases to 4,675, you just don't have the resources to do that overnight. So we chose the outsourcing route. We now use a total of seven laboratories that help us out: five contract private laboratories, plus assistance from two public partners, the California DOJ and Marshall University, who are assisting us with the testing.

It did take some time to get this ramped up. It took time to get contracts developed and in place, to establish the metrics we needed, and to deal with some of the laboratory audits, which were just some of the start-up procedural issues to get through.

We started off sending about 20 to 60 kits a month in the early stages. We are now actually sending more than 500 kits per month out to these private laboratories. As you can imagine, it's a huge undertaking; it's a lot of work to be putting through and getting out to these entities.

As for our funding, in case you wonder where this is all coming from, it is actually paid for in part by NIJ grant funding. Thank you very much, NIJ.

[Laughter.]

Gialamas: There's also local Prop 69 funding, which is a DNA funding component in California. For those that aren't from California, there was a bill passed several years ago under which, for every felony conviction, one dollar out of every 10 dollars collected in fine money gets held back for the purposes of DNA testing; 75 percent of that stays local, and 25 percent of it goes to the California DOJ to help deal with the convicted offender programs there. And then we also had department funds.
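The Prop 69 split just described can be sketched with illustrative numbers (the one-in-ten hold-back and the 75/25 local/state split are the figures from the talk; the $500,000 fine total below is a hypothetical figure for illustration only):

```python
# Illustrative Prop 69 allocation: $1 of every $10 collected in felony fine
# money is held back for DNA testing; of that hold-back, 75 percent stays
# local and 25 percent goes to the California DOJ.
fines_collected = 500_000  # hypothetical total fine revenue
dna_holdback = fines_collected * 0.10
local_share = dna_holdback * 0.75   # retained by the local program
state_share = dna_holdback * 0.25   # sent to California DOJ
print(local_share, state_share)  # 37500.0 12500.0
```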

Now, with every crisis comes opportunity, and we had our crisis; fortunately for us, we got our opportunity as well. This resulted in an additional $2.3 million from our local board of supervisors to ensure that this problem doesn't resurface. We have actually been able to accomplish everything on our project to date without even using that funding. We don't know whether it'll still be there or not, given the budget cuts we are facing, but at least at this point, we have not dipped into that allocation.

We did request additional personnel. We saw the bottlenecks that would be coming, and the significant one was sending all these cases out: when they come back in, somebody has got to review the data, and somebody has got to spend the time to upload them. So we projected that and asked for additional personnel, and we now have six new DNA analysts and one additional supervisor to help out with this dilemma. And then we just recently received a grant award from NIJ, which will help us with our continuing backlog efforts.

So what are some of the interesting numbers to date? Well, so far we have outsourced over 94 percent of the identified untested sexual assault kits. As some of my L.A. colleagues will know, the meltdown happened on a date they will never forget, November 1, 2008. It's a date etched in their memory. And since that time, we actually can see the light at the end of the tunnel. It's starting to feel like, wow, we're going to get through this.

So that's 4,028 cases sent out of the 4,271. Of those, 41 percent were negative on the biological screen and 59 percent were positive, and that's a very interesting number, because I have always held, empirically, that roughly 60 to 70 percent of those cases come up positive, and here is actual data, over thousands of cases, which kind of confirms what we've known all along. Of the 59 percent that were positive, 37 percent qualified for CODIS upload, 27 percent failed to meet the upload criteria, and the other 36 percent are pending data review. And you might think that 36 percent is relatively high, but that's only 430 cases, and we're sending 500 cases out a month; that's really only about one month's worth of data to be reviewed and entered. So it's really not that far behind, even though the percentage may make it seem rather high.

Well, what does this mean from the public perspective? I always love to ask that question, and as a taxpayer, I guess my first question is: how much did this wonderful plan cost us? Well, so far, we've spent $1.7 million to test these kits. On average, and this is just a really rough average, not the true cost of a kit, if you just divide that $1.7 million by the number of kits we've done, that's about $850 a kit. So this is not inexpensive work, and that figure does not include any of the in-house time that we spend on data tracking, on searching for these kits, on identifying the samples out of the kit that we are going to remove and send off, and on the data review that happens afterwards. So, when we look at the total cost of this program, it is going to far exceed that number, but that's where we are so far.
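The speaker's rough per-kit average is just total spend divided by kits completed. As a sketch (the kit count of 2,000 is back-solved from the quoted $1.7 million and roughly $850-per-kit figures; it is an assumption, not a number given in the talk):

```python
# Rough average outsourcing cost per sexual assault kit: total spend to date
# divided by kits tested. The kit count is an assumed figure, chosen to be
# consistent with the speaker's $1.7M total and ~$850/kit average.
total_spend = 1_700_000
kits_tested = 2_000  # assumed, not stated in the talk
avg_cost_per_kit = total_spend / kits_tested
print(avg_cost_per_kit)  # 850.0
```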

And so the next logical question, as our local politicians like to know, is, well, great, how many crimes have we solved? This is a bit misleading; the answer is two. We have had two cold hits that have actually led to a prosecution. Now, over the course of time, including our backlog program and the regular program, we have had about 150 hits. A majority of them are still under investigation, but of the 60 or so that they have actually pursued, most of them were DA rejects. Most of them were rejected because the case had already been adjudicated. It's only two so far that have actually led to something.

Now, this presents a very interesting socioeconomic question about the value of pursuing all of this work for those two cases. It is very hard to look at these two and say that they're not important, and I don't want to be misunderstood here, because if we have brought resolution to two victims who have gone through a horrific incident, then that is time and money well spent.

I think what we need to ask ourselves — and, again, this is, I think, the interesting study that Dr. Peterson will be doing — is this: instead of just doing them all wholesale, is there some scientific way in which we should proceed through cases? Is there some real merit to saying we're going to triage cases in this fashion because we have data to show that you tend to get more probative value and better results when you proceed in this fashion? So that'll be a very interesting study to pursue.

A couple of things about what worked and what didn't. First of all, taking the time to inventory and fully assess the problem: we used existing performance metrics to carefully project our time. We told our board of supervisors it was going to take a certain amount of time, and we're actually going to complete this well ahead of that; however, we're not telling them that at this point. We are telling them we're going to meet it on time, but our goal is to be done well ahead of where we had planned.

Our immediate chain of command at our local division level really understood the issues, and that helped bridge some of the gaps between the upper department executives who really didn't understand this process. Anecdotally, the sheriff thought, well, we could just do like we do in jails, right, just bring in a platoon, lay out a bunch of temporary tables; let's have an assembly line. We should get this done over the weekend, right? Uh huh.

We had highly dedicated staff that were motivated to get this done.

Some things that didn't work: I mentioned already that we didn't have a LIMS, a laboratory information management system, and that was a huge impediment to us. We were designing databases on the fly, and if we could go back, we'd do it all over again. But we are where we are, and it is what it is.

We did not account for evidence staff, and that was really a failure on our part. With 4,000 more items that now have to be transacted multiple times back and forth, we really should have thought of adding additional staff to our evidence control function.

And then, of course, what didn't work was convincing the executives that this really wasn't the best strategy for pursuing this testing; but, again, when it's political, time is of the essence, and so getting those answers quickly is important.

So what do we look forward to? We look forward to dealing with our increased productivity with robotics and new technology. Automation and technology are going to revolutionize what we do. We're getting that in now.

We have some new submission forms, so we are trying the triage process that we worked on in this series. We're doing that now with cases that are coming in, since everything that's collected will be tested. We're moving toward batching, so we should see some significant increases, much like other labs, usually on the order of two, three, four times the productivity.

We're still struggling with increased follow-up by investigators on CODIS hits. That, I know, is an issue that plagues many different jurisdictions, and we're no different. We do need to increase productivity on our CODIS reviews, and I guess I'm daring to tap dance on some of the current hot-button issues right now. I don't want to suggest that we need changes to the way CODIS is being dealt with, but one of the things we're looking at is whether or not we can find some true software components that can help us out with these mixture interpretations.

Now, there are good systems out there, and they do well with single-source samples or two-person mixtures, but, beyond two people, they really just don't cut the mustard. And I think we need some groundbreaking work to pursue that, and we're actually working with some individuals now who are real prodigies with computerized systems. I joke with my staff; I've told them, “You know, if these guys can design missile systems to put missiles on a target thousands of miles away and be within a few inches, then, by God, they've got to be able to interpret a few DNA profiles. They can't be that hard.”

And then future limitations will be turnaround time, and, actually, Kevin touched on this a little bit as well, and I think it's really the speed of information. And there is actually a whole session on some of this later, but I think the future limitation to crime labs is not going to be so much the technology, because I think it's here; it's the speed of information flow.

When we do time-and-motion studies, a significant amount of time — and I am talking 40, 50, sometimes 60 percent of our time — goes to moving information from one area to another: copying information, biographical information for a report, putting data onto worksheets. That's the kind of stuff that's going to kill us in productivity, and until we get some real systems in place to make this faster, we're not going to see leaps and bounds in productivity, because the automation is here. It's all about moving that information flow.

And, lastly, our goal is all the same, right? It's to provide information. We are not concerned necessarily about whom we help, but, ideally, we want to hold those who've committed crimes accountable, and for those who have been wrongly accused, we want to assist in exonerating them and freeing them from the hold of law enforcement.

So that's my review of the, quote/unquote, “backlog problem” in L.A. Thank you for your time.

[Applause.]

Joseph L. Peterson: Thank you, Dean.

Our final presenter today is Jeff Nye, who is the DNA technical leader with the Michigan State Police Forensic Science Division. Jeff has extensive experience, 15 years or more, in the field of forensic DNA testing. He currently oversees the technical operations of the forensic biology units for seven Michigan State Police laboratories and their databasing unit, with 65 additional staff, and he is also — this is no small task — the project manager of the Detroit Police Department backlog sexual assault kit project.

So I give you Jeff Nye.

Jeffrey Nye: Well, I could make this really simple and probably just follow Dean's talk by saying “ditto” because I think we're kind of going through the process.

[Laughter.]

Nye: But I'll go through the talk as it is.

I wanted to give you a little bit of an idea, an outline of my talk, and, basically, the point I want to make is much like what Dean was saying: these backlogs were not created overnight. So I want to set the clock back a little bit to 2008, give you an idea of where we were in 2008 and of the Detroit Police Department Crime Laboratory, then fast forward to today and let you know exactly where we're at, talk at some length about the CSC kit backlog in the city of Detroit, and then talk a little bit about some pending legislation and possible solutions to a couple of technology issues.

So, as was stated, Michigan — everybody is familiar with the geography of Michigan; it's the mitten state. We have seven laboratories within the state, geographically positioned throughout the state. Three of those, appearing in red with the asterisk, actually do DNA testing. The other four laboratories have serology screening available to them.

In 2008, the important point to point out is that we had 265 people within our Forensic Science Division. We had 13 serologists and 17 DNA analysts, which is a really small number, and so we were not in fantastic shape in 2008 to handle a situation like the Detroit Police Department Crime Laboratory.

In 2008, we completed just a shade under 7,300 biology cases; that's about 250 per person per year. And we still carried a backlog in 2008, just shy of 3,000 cases.

How did we meet the demand with the small number of people that we had? Much like Dean had mentioned — and I'll reflect that here as well — we have a very dedicated, high-performing staff, and I think that that can be said for just about every laboratory that's out there as well as the law enforcement agencies. Everybody is very, very dedicated and very interested in moving things along. It's just getting everybody to speak to the same message and work together.

We also used a combination of outsourcing and overtime. We outsource a fairly significant amount of casework, and we have a standing offer to our scientists of 30 hours per pay period — that's every two weeks — of overtime. I have worked in the division for 15 years, and I think we've had that standing offer of 30 hours per pay period for probably 10 of the 15 years. And as we hire in nice young people who are single, that overtime looks pretty good.

[Laughter.]

Nye: It doesn't last particularly long. After about three or four years, they're pretty much burned out on the overtime. And even though 30 hours of overtime is available, our average consumption is about six hours every two weeks, just to give you an idea of how burdened they are.

Out of the 30 people that we did have in 2008, eight were actually funded federally through NIJ, so we can attribute only about 22 of our positions in forensic biology to state funding. We, too, have looked at capacity improvements through automation, and we've become very, very active with General Motors, which, of course, is a huge entity within the state of Michigan, on process mapping, which a lot of people have been using in forensic biology. Process mapping has been a very big endeavor, and I think everybody could agree that if anyone is an expert in an assembly-line type of situation, it is probably General Motors.

I heard a talk a little bit ago; an assistant to state Senator Cropsey was speaking this morning to some of the statistics in Michigan. We have the moniker of having three of the 10 most violent cities in the country: Saginaw, Flint and Detroit, all associated with the automobile industry. And then I'm going to focus just a little bit on the city of Detroit, historically, and I think everybody can read the news just like I can.

Historically, there were about two million people within the city. They are down to about 900,000, maybe closer to 850,000 people, so it's definitely a city in decline, and it's a very violent city: 400 to 500 homicides per year and somewhere in the neighborhood of 2,500 sexual assaults per year within the city limits.

The Detroit Police Department held the only other accredited crime laboratory in the state. The only discipline within their crime laboratory in 2008 that was accredited was DNA. It was a very large discipline. It had two scientists and one technical leader.

And they had other disciplines in firearms, controlled substances and trace chemistry. Their latent print unit was not part of their laboratory system; it was part of the investigative unit. As I said, biology was the only accredited discipline, and they dealt with issues similar to everybody else's. Their facility was in very poor condition. They obviously had some staffing issues, and I won't elaborate, but there were some significant communication issues between law enforcement and the prosecutor's office.

Their annual capacity in 2008 was about 350 cases a year, and it doesn't take a Harvard degree in math to see that when you have 400 homicides a year and close to 2,500 sexual assaults, they were hardly touching what was available to them. In a five-year period, they put 582 profiles into CODIS. Two hundred and twenty of those, as a side note, were the result of outsourcing; 360 were the result of testing that they did within their own laboratory system.

They closed. I think that's probably pretty common knowledge. What prompted the closure was a defense expert in firearms, who actually was a former Michigan State Police forensic firearms examiner, noting an inconsistency on a case. It was a very typical shooting within the city of Detroit: an assault weapon discharged about 30 rounds of ammunition in a homicide, and the firearms examiner said that it all came from the same weapon, when, believe it or not, it came from about two, if not three, different weapons. As a result of that fallout, the state police was asked to do an audit of their laboratory system, and the audit noted a 10 percent error rate within their firearms unit. That was a sufficient amount of information for the Wayne County Prosecutor's Office to close that laboratory down; the closure covered all disciplines, and it was immediate. There was no opportunity to prepare for the closure of that laboratory.

Subsequently, from that, they requested additional audits of the laboratory. Any time one of the cases goes forward to court that was processed within that laboratory, it's reanalyzed by our staff. They are doing a five-year audit of the firearms unit as well as a one-year audit of all other disciplines. And we did transfer their CODIS unit to the state police, and there were some technical difficulties with that just from differences in DNA platform.

What did we do in response to that? We hired 12 new scientists: 10 analysts plus two supervisors. I just released 10 of those for actual casework on Friday of last week. They've been hired for about a year, went through a very rigorous training program, and, hopefully, they're going to be very productive here in the short term.

In order to accommodate the additional staff that we hired, we had to do a lot of facility renovation within our locations. Just to give you an idea of some of the things we had to consider that we didn't necessarily understand when they first closed down: even just transferring evidence to our facility. The Detroit Police Department became the largest submitter of evidence to the state police crime labs overnight, and just process mapping that evidence submission and receipt was incredible. We receive approximately 100 to 125 cases per day, five days a week, year-round, from the Detroit Police Department.

As I said, we get a single point of entry. That is an updated number that I have there. We had to find a place to actually store evidence, and I have a picture of some pod storage units that we had to purchase for one of our laboratories. We had to hire a coordinator to move evidence throughout the state. We have two laboratories within the metro Detroit area that are close by, but the amount of evidence that is coming in, it actually impacts all seven of our laboratories. So we move evidence around the entire state. We have become the FedEx of forensics, I think.

We prioritize our cases by court order. Routinely, most every day, we are getting a court order from the Wayne County Prosecutor's Office to prioritize cases, and to give you an idea of the geography of Michigan, the farthest laboratory away from Detroit is actually 10 hours one direction. So getting to court and getting evidence to court can sometimes be a little bit difficult.

This is a quick photo of these pod storage units. We actually purchased two for our Northville laboratory, which is about 20 minutes outside the city of Detroit, and this is a view from the inside. It's just stacked with evidence coming in from the city of Detroit.

Now we're 21 months in. I hate to tell you, but our DNA turnaround time is 318 days. It has done nothing but get longer. We have large discrepancies from one area of the state to another. On the western side of the state where they have sufficient resources, we are at about a two- to three-month turnaround time, and when we move cases around, because obviously you want to normalize the situation across the state, basically what happens is you get a lot of law enforcement agencies where they were receiving very nice turnaround times of two to three months; now we are looking at 318 days, and the fingers all point towards the city of Detroit. So it does get a little bit difficult there.

In 2009, just to give you an idea, we processed 1,609 biology cases for the city of Detroit and entered more than 500 profiles into CODIS, and we had 227 associations out of those 500 profiles that went in, 84 of which were sexual assaults. And I already mentioned that we have 12 new scientists, who will account for about 2,500 to 2,600 cases this coming year.

Basically, what we found when we took over the Detroit Police Department Crime Laboratory activities is that many things within the Detroit Police Department needed our assistance, ranging from collection and proper handling of evidence to proper submission of evidence, much like Dean was saying. You take a vested interest in how the evidence is collected and how it is submitted because, when you're receiving over 100 cases a day, quality stuff in means quality stuff out, and so we took a big interest in that. And then we had one of our administrative commanders do a tour of their evidence facilities, and he found 10,559 sexual assault kits in their property room. That was six months ago. As recently as two weeks ago, the Wayne County Prosecutor's Office testified in front of Congress that that number might be as high as 15,000.

A couple of things to point out about those sexual assault kits: They represent all of the kits that they have in storage, much like Dean was saying. We are going through a process right now. We're actually trying to figure out what the status is on each and every one of them, and that is definitely a daunting task.

The other thing to point out is that this isn't an issue of whether they should have been submitted for testing or not. The Detroit Police Department holds that they were all properly investigated and all properly handled, but when you look at the numbers, when they have 2,500 to 3,000 sexual assault kits a year and the Detroit Police Department Crime Laboratory did about 10 percent of that, I think that is a pretty low number for analysis. The kits date back to about 1993, basically.

So what we've decided to do — and we have been working on this for about the past six months — is, when you have over 10,000 CSC kits and you're looking at the possibility of actually having to process them, the first thing is to put together a focus group, a stakeholders group, and we decided to take a very broad-based approach. Obviously, the Michigan State Police, the Wayne County Prosecutor's Office and the Detroit Police Department are key members of that group.

We also reached out to the Prosecutors Association of Michigan, federal partners; there are advocacy groups through the Michigan Domestic Violence Prevention and Treatment Board and a whole host of groups to get input on how to handle this particular issue.

Basically, what we did, the first thing, is we process mapped it: How are we going to look at all these kits? What is the process that we are going to go through? We felt that it was really important to have a process because that's the way that you get everybody to adhere to looking at them the same way and not veering from one process to another. And I'll point out, too, that this document is a living document. It changes, it seems like, monthly for sure.

What we have decided to do is … actually, we termed this the “400 Project.” In order to get a scope of work out of this, we have worked with Michigan State University's Center for Statistics. To get a 95 percent confidence interval for predicting what the full 10,559 will look like, we need a random sample of 400 sexual assault kits. So we're actually in that process right now: we have randomly selected 400 sexual assault kits, we're going through evaluating all of them and processing them for serology and DNA, and then we'll take the data we get, give it back to the Michigan State University Center for Statistics, and they will project what the full 10,559 is going to look like.
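[The 400 figure described above is in line with a standard sample-size calculation. As a rough sketch — the exact method MSU's Center for Statistics used is not stated here, and the 5 percent margin of error is an assumption — Cochran's formula with a finite-population correction, at 95 percent confidence and a worst-case proportion of 0.5, gives about 371 kits for a population of 10,559, so rounding up to 400 simply adds headroom:]

```python
import math

def cochran_sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Cochran's formula with a finite-population correction.

    Minimum sample size to estimate a proportion (worst case p = 0.5)
    within +/- margin at the confidence level implied by z.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population sample size (~384)
    n = n0 / (1 + (n0 - 1) / population)        # shrink for the finite population
    return math.ceil(n)

# 10,559 kits, 95% confidence (z = 1.96), hypothetical +/-5% margin:
print(cochran_sample_size(10_559))  # 371
```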

Basically, what this is going to do is allow us to create a business plan for garnering sufficient funding and resources for the laboratory analysis, the investigative resources, the prosecutorial resources and the advocacy resources to actually deal with this particular issue, and much of that funding is available through Recovery Act money as well as NIJ backlog reduction funding.

Much like Dean, we are collecting probably more information than what we'll ever use. Some of it is more just to cover ourselves for questions that might be asked in the future, but it'll help us out in order to project as well as to hopefully answer any questions that might come.

Some of the metrics we're looking at: we're going to evaluate the cases prior to submitting them for DNA, and we have a team of prosecutors, investigators and crime victim advocates, and the Prosecutors Association of Michigan is actually looking at each and every one of these cases prior to submission to the laboratory. This is being done on the initial 400 cases; we may or may not carry it over to the full 10,500 cases, but we're certainly going to do it on the first 400. The reason for that is really a sort of audit among the DPD laboratory, our laboratory and the Wayne County Prosecutor's Office of what actually needs to be processed, because they are saying the cases were all investigated properly.

Some of the things we're going to look at are whether the cases were previously adjudicated (and remember that I said these cases run the gamut of every stage a case can be in) and whether the suspect was identified previously through means other than DNA but his or her profile is available in CODIS as an offender. A lot of this is the willingness of the victim to actually prosecute; the advocates are very interested in looking at that part of it. And there are some very simple things about the prosecutability of the cases: many of these don't even have police reports associated with them. The only information that we would have is a medical history form that is within the sexual assault kit.

For the laboratory side of things, much like what Dean was mentioning, we're going to be capturing information on how many are positive for the presence of male DNA, how many have a positive male result on the actual STR typing, how many are eligible for CODIS and how many are ultimately identified to an offender within the database, and this will help us develop our business plan for how many resources and how much effort we need based on where cases fall out of the system and on the type of evidence we're getting.

Oddly enough, Dean and I didn't talk about our talks ahead of time, but I've got almost the same thing that you had. It's a huge challenge; there is no doubt. And two years ago I would have never anticipated that I was going to be dealing with a project like this, but there are certainly some challenges associated with it. We certainly have very limited resources. We've hired these 10 additional people, but that barely touches the amount of evidence that's coming in.

There are very different viewpoints from one focus group to another, which has been challenging but very interesting. Outsourcing casework, as Dean covered, has its own challenges. Law enforcement agencies elsewhere in the state have issues very similar to Detroit's, only the numbers are a little smaller, and by doing this, we're going to be setting a new level of service that hopefully will be duplicated throughout the state. And with the challenges come some rewards.

I've built some incredible relationships. What I'm finding is that within the city of Detroit, no matter what your viewpoint is on the situation, whether you're an advocate or with a law enforcement agency or one of the attorneys handling it, they're all an incredibly dedicated and professional group to work with.

And the other thing I wanted to point out, as we talk about backlogs and very large numbers — and I know it'll fall on ears that are sympathetic to this — is, let's not forget that each one of those sexual assault kits actually represents an individual with a name and a story, and I think that's an important thing to point out. And, again, it's also an opportunity to make an incredible impact on a city that is very impoverished right now.

As far as pending legislation, there is one piece pending in front of the state legislature. It is being proposed by Senator Tupac Hunter, and it's very different from the legislation that you see in some other places, which requires that sexual assault kits actually be submitted to the laboratory. This is more of a notification piece of legislation. It is called the Sexual Assault Victims Rights Act, and basically, what it does is keep the sexual assault victim up to speed on exactly what is going on with his or her case all the way through the process, from identifying an individual to having samples in CODIS, and all that kind of information.

And solutions, much like what Dean was mentioning: there are certainly some IT solutions here. One is that we have a lot of different LIMS, court systems, information systems and all that, but they don't talk to each other, and it would be very useful to have systems that actually could, so that we're not doing quite as much wasted work.

Submission policies are a huge thing, and then, of course, as we move evidence throughout the state, video testimony is a huge endeavor of ours. All seven of our laboratories have video testimony capabilities; it's just a matter of getting them out to the courts. We have already amended the court rules to accept video testimony as testimony; it's just that a lot of defense experts are not accepting of it. And then, finally, these CSC kits in the city of Detroit caught us a little bit by surprise, and it would be beneficial to have some sort of CSC kit tracking system, so that you could actually see where a backlog is building before it grows to numbers of this magnitude.

And then that's my contact information. Thank you very much.

[Applause.]

Joseph L. Peterson: Thank you, Jeff, and thanks to all the panelists.

I would like to conclude with a brief overview of what we're in the process of doing in Los Angeles: trying to describe, evaluate and tabulate the results of the testing that's being outsourced at this time.

As Dean said, we had around 11,000 or more backlogged kits, combining both the Los Angeles Police Department and the Sheriff's Department, back on, I think, November 1, 2008. As for the reasons for the backlog — and I use the term advisedly, as Dean is shooting looks at me — truly, the responsibility for the backlog lies with the detective, with the investigator who failed to request an analysis. So the backlog wasn't really in the laboratory; it was at the property room stage, preceding submission of that evidence to the laboratory, and I think the work that Kevin has done illustrates that as well.

A decision was made by the sheriff and by the chief of the Los Angeles Police Department to test all the backlogged kits and, because of resource limitations, to outsource that testing to these labs — plus Cal DOJ, which I forgot — in varying amounts depending upon the contracts that each of the departments and agencies worked out.

As for the objectives of our study — we received NIJ funding to undertake this evaluation — we wanted to start with a review of the sexual assault literature, and there is considerable literature there. It's mostly social science: studies that, over the last 30 years, have appeared in various criminological and criminal justice journals. But they tell us a lot, and I'll talk some about that, too.

Another objective was to sample and describe and evaluate the results of the tests being performed by these private testing laboratories. What were the results, in fact? What were the primary characteristics of these outsourced cases? And this next point is a real challenge, is to determine the investigative and judicial outcomes of these cases.

Because of limitations with laboratory LIMS and with our criminal justice system generally, it's very difficult to track down the numbers of these cases that led to an arrest or some sort of adjudication in the courts. The audit that the Los Angeles Sheriff's Department did certainly assisted us in moving toward that goal, but it remains a tremendous challenge. Still, we felt that was one question we wanted to try to answer: “Well, what was the outcome of these cases?”

And the next point: we wanted to compare judicial outcomes once these backlogged cases had been examined. What are the outcomes of those cases versus cases where testing was immediate — where the kits had not been backlogged and had been examined in perhaps six months, perhaps nine months, the normal turnaround time in those laboratories? Did we see different outcomes in those cases? And then, lastly, what were the outcomes in those cases where no testing was done at all? Were there substantial, identifiable differences among these three groups of cases?

And then, as has been discussed, too, by Dean and Jeff, can we help to develop future prioritization criteria; that is, should it just be the decision of the detective or should it be a collaboration between the detective and the laboratory? Should we involve the prosecutor as well, since the prosecutor's wishes and particularly their idea of the convictability of a given case looms large in this whole operation of what is done in the way of forensic testing? So we can't disregard that.

What have we found looking at the literature about what indicates which cases result in arrest and conviction? This is really prior to the work we're doing right now, but looking at that literature — and those of you who are familiar with Violence Against Women, the journal, and other journals will know it — one factor is a weapon: was a weapon used in that case? That is a key factor in predicting whether the case leads to an arrest and a conviction.

Why? I think it has something to do with the perceived seriousness of the case on the part of the investigator. The same goes for physical injuries suffered by the victim — not to mention the psychological and, of course, some physical trauma of the sexual assault itself — were there documentable injuries that could be photographed? It could then be very clear to the individuals in charge, the investigators, the prosecutors, the courts, just how grave or serious the offense was.

Was there penetration? Some of these studies have tried to determine that. Some of it was self-report on the part of the victim. Others were based on corroboration through scientific evidence — that is, a finding of semen.

Was there a sexual assault kit taken? Even without an analysis of the kit, several studies have shown that cases in which the victim went to the hospital and had a sexual assault kit taken were more likely to result in an arrest, prosecution and conviction than others. If I may interpret that, it's another indication to the investigator that this is a real case, a serious case. The victim is doing everything she can to further the interests of the investigation, and that is one indication, once again, even without the analysis of the evidence — the fact that she went through that process.

And the value, interesting enough, of the kit varied inversely with the prosecutor's assessment of credibility; that is, the value of the kit seemed to be greater where the credibility of the victim was questionable or was lower; that is, the physical evidence piece of it could buttress that case and could elevate the overall credibility of the case, if you will.

So there are some very interesting social demographic dimensions to this that I personally feel the crime laboratories need to be in touch with in terms of evaluating the case and valuing the evidence when it comes in and making a decision whether this case is going to be investigated and examined or not.

We found in other … I think the survey work that the sheriff's department did … that, largely, these backlog cases were situations where there was a reluctance on the part of the investigators to request an analysis. Either they didn't think it was necessary, or they were concerned about the limited resources in the lab, or perhaps a combination of those things.

In a high percentage of these cases, victims and assailants have known associations. They're in a dating relationship, they're family members, they're intimates, they're acquaintances of some sort. In this other NIJ study that we just finished, we found that in a random sample of rape cases in five jurisdictions, 80 percent of the victims knew their assailant — 80 percent. These were cases where there was an investigation, including both those where physical evidence was gathered and those where it was not. So you have to look at that issue as well.

There are many claims of consensual intercourse. Detectives may conclude that a rape did not occur. The victim may be uncooperative and/or judged not credible, and prosecutors may decline to file charges.

This OCJP form in California is the form that's filled out when the victim is examined and the sexual assault kit is taken, and what we're trying to do is incorporate some of these facts and look ultimately at the test results. Once the testing of the evidence is done, can we say something about where on the victim that evidence originated? Is the DNA evidence or other physical evidence recovered from certain portions of the victim's body typically more valuable than others? Most of these kits are taken from victims; occasionally, they are taken from suspects, and we hope to be able to say something about that as well.

We're going to be tabulating the number of cases in which semen was identified, where DNA was identified, the male profile determined, those that were uploaded into Cal DNA databank and CODIS, as well as hits in cases that may associate this person with other incidents that he may have participated in.

I've talked with Ken and Larry Blanton and others that, you know, the serial acquaintance rapist — is there such a phenomenon? Can we document that — can we truly associate this person who maybe wasn't prosecuted in this particular case, but if it could be shown that this individual was associated with other cases of this type, the prosecutor might determine to proceed against that individual? We are looking at, of course, CODIS information and CODIS hits.

Just real quickly, in our NIJ study — this was apart from this particular study — one thing of interest: We looked at the fraction of rape cases in which evidence was collected, submitted and examined. It was collected in 64 percent of these rape cases, it was submitted to a crime laboratory in 32 percent, and it was actually examined in 19 percent. I think that again speaks to these issues that Kevin Strom and others have looked at. We need to look at this process from the point at which the evidence is gathered, when it's submitted and when it's examined, and the reasons for that.

So it's a complex process, and I don't need to tell you all that, but I think many of these social science organizational determinants are things that the crime lab community are going to have to come to grips with and try to incorporate some of these factors into their decision-making.

Thank you very much.

[Applause.]

Date Modified: July 12, 2010