By Mike Fry

Know Your Stuff: How To Audit Animal Shelter Statistics



People working to assess animal shelters have relied heavily on a simple statistic to measure shelter performance: the Live Release Rate, or LRR, which is the percentage of animals that leave the shelter alive, relative to total outcomes. In an ideal world, computing this number would be simple and straightforward. Unfortunately, in the strange world of animal sheltering, it is anything but, because there is no standardized way of entering animal records or of consolidating and reporting them. Understanding these and other issues related to data collection and reporting is essential for anyone trying to assess an animal shelter's performance.
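For readers who prefer to see the arithmetic spelled out, here is a minimal sketch of the basic calculation in Python. The outcome counts are hypothetical, purely for illustration.

```python
# Minimal sketch of the basic Live Release Rate (LRR) calculation.
# The outcome counts below are hypothetical and purely illustrative.
outcomes = {
    "adoption": 620,
    "return_to_owner": 180,
    "transfer_out": 90,
    "euthanasia": 100,
    "died_in_care": 10,
}

live_outcomes = (outcomes["adoption"]
                 + outcomes["return_to_owner"]
                 + outcomes["transfer_out"])
total_outcomes = sum(outcomes.values())

lrr = live_outcomes / total_outcomes * 100
print(f"LRR: {lrr:.1f}%")  # 89.0% with these numbers
```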

The closest thing there is to any sort of standardized reporting format is commonly known as the Asilomar Accords, a document written in 2004 by about two dozen people who appear to have gone out of their way to hide some killing in animal shelters. As a result, "the Accords" put forward an approach that allows animal shelters to exclude a wide range of deaths from their LRR calculations. For example, the Accords allow shelters to exclude the deaths of animals that were euthanized at the request of an animal's owner. They also allow shelters to leave out any animals that died while in their care.

Those two factors alone can dramatically inflate an animal shelter's LRR in ways that are misleading, because few people, including government officials who oversee animal shelters, understand that shelters may be omitting large numbers of deaths from their outcome reports. In fact, I have yet to speak with a government official responsible for overseeing a shelter that uses Asilomar's approach to statistics reporting who knew or understood that many deaths were being excluded from the shelter's calculations. That fact alone demonstrates two major flaws with the Accords: they are both inaccurate and misleading.

Clearly, comparing a shelter that uses that approach to a shelter that documents and reports all deaths can make the more transparent organization look like it is doing relatively poorly; doing so is not comparing apples to apples. The problems with Asilomar, however, get far worse than that, because the Accords also encourage animal shelters to engage in various practices that conceal significant numbers of deaths and dramatically inflate the LRR.
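To make the effect of those exclusions concrete, here is a small sketch, again with hypothetical numbers, comparing an LRR that counts every death against an Asilomar-style LRR that drops owner-requested euthanasia and died-in-care animals from the calculation.

```python
# Hypothetical outcome counts for one year at one shelter.
live = 890              # adoptions, returns to owner, transfers out
ore_deaths = 60         # "Owner Requested Euthanasia"
died_in_care = 20       # died in kennels or foster care
other_euthanasia = 80   # all other euthanasia

# Counting every death:
full_lrr = live / (live + ore_deaths + died_in_care + other_euthanasia) * 100

# Asilomar-style: ORE and died-in-care deaths disappear from the math.
asilomar_lrr = live / (live + other_euthanasia) * 100

print(f"All deaths counted: {full_lrr:.1f}%")    # ~84.8%
print(f"Asilomar-style:     {asilomar_lrr:.1f}%")  # ~91.8%
```

With these made-up numbers, the same shelter gains roughly seven points simply by leaving deaths out of the calculation.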

I have found, for example, that it is common practice for animal shelters that report their statistics using Asilomar to pressure people surrendering their pets into signing "euthanasia" paperwork. Frequently, the shelters are not completely honest with pet owners when doing so. Two approaches I have personally observed are described below:

The "Check Box" Approach

This approach uses a simple checkbox on pet surrender forms that gives the shelter authorization to euthanize a pet. When a pet comes in that the shelter believes may be difficult to adopt, and the person surrendering the pet has not checked the box, the staff say things like, "In the event that we need to euthanize this pet, we need you to check this box." Then, after the box is checked, if the shelter destroys the pet, it categorizes the death as an "Owner Requested Euthanasia" (ORE), even if the person who surrendered the pet didn't actually request euthanasia. Sometimes the staff go even further and simply check the box on the form themselves.

The Separate Form Approach

Like the check box approach described above, this approach centers on shelter staff pushing people into signing "euthanasia" paperwork, often without the owners realizing what they are authorizing. Shelters that operate this way will frequently give people surrendering pets separate paperwork that authorizes euthanasia, even if the pet is healthy. Then, if the shelter euthanizes those pets, it counts them as ORE, and those pets are eliminated from the shelter's LRR calculation.

It is not as if the shelters don't know what they are doing. When consulting with animal shelters about statistics and reporting, No Kill Learning makes the following recommendations:

1) Discontinue using Asilomar Accords to compute the LRR.

2) Discontinue filtering ORE from the reports.

3) Evaluate pets surrendered by owners for euthanasia like any other pet and determine the outcome based on the evaluation, rather than the request.

When they receive these recommendations, it is not uncommon for shelter staff to respond with statements like, "But that is how we make our numbers look good... what are we going to do if we change that?" It is, therefore, clear that shelters are using the Asilomar Accords in order to inflate their reported LRR.

Furthermore, when shelters discontinue these practices, it is common for the number of reported OREs to drop dramatically. The practices described so far can have a profound impact on the LRR, inflating it by as much as 10% or even more. Unfortunately, they are only some of the common practices that make animal shelters look like they are doing better than they really are. Nathan Winograd of the No Kill Advocacy Center recently described several others. It is important to note that not all shelters that do some of these things are doing so in order to manipulate their outcome statistics. It should also be said that some of these practices are common in both No Kill and traditional animal shelters.

A short summary of Winograd's list is below:

1) Not counting animals that die while at the shelter or in foster care.

2) Not counting animals surrendered for "euthanasia."

3) Not counting late-term abortions.

4) Not counting all neonates (un-weaned babies).

5) Double [or triple or more] counting foster animals as live outcomes.

6) Transferring animals to other agencies where they are killed.

7) Calling themselves No Kill when they are not.

Some of these practices are so commonplace in animal shelters that many shelter staff believe they represent "how it is supposed to be done," and do not necessarily understand the implications these practices have on the reported LRR. A perfect example of this is Winograd's Number 5, "Double counting foster animals."

Double or Triple (or more) Counting Foster Animals

When it comes to double (or triple or more) counting fosters, I have only personally observed this being done by users of Chameleon Software. However, given that Chameleon is the most commonly used shelter software system, that isn't very comforting, especially since Chameleon users report that the company actually trained them to count fosters in this way.

Shelters that do this record an "outcome" whenever an animal goes into foster care, even though going to foster is not a final outcome. Then, when the animal comes back from foster, whether for adoption or for spay/neuter surgery, it is re-entered into the system as a new intake. If the animal is sent to another foster home, another outcome is logged, and when it comes back, the pet is entered yet again.
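Here is a rough sketch of how that record-keeping style pads the books. The litter and its movements are invented for illustration; they are not taken from any shelter's actual data.

```python
# Hypothetical movement history for a single litter of five kittens that
# bounced between the shelter and foster homes before being adopted.
movements = [
    ("intake", "stray"),          # original intake of the litter
    ("outcome", "foster"),        # sent to foster home #1
    ("intake", "foster return"),  # back to the shelter for vaccines
    ("outcome", "foster"),        # sent to foster home #2
    ("intake", "foster return"),  # back for spay/neuter surgery
    ("outcome", "adoption"),      # the one real, final outcome
]
litter_size = 5

# Counting every movement, the way described above:
intakes = sum(litter_size for kind, _ in movements if kind == "intake")
live_outcomes = sum(litter_size for kind, _ in movements if kind == "outcome")
print(intakes, live_outcomes)    # 15 intakes and 15 "live outcomes" for 5 animals

# Counting each animal once:
print(litter_size, litter_size)  # 5 intakes and 5 live outcomes
```

Because every foster round trip adds "live" outcomes to the totals, the padding pulls the reported LRR toward 100% regardless of what happens to the other animals in the building.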

Beyond a doubt, this practice can have a profound impact on the LRR. It artificially inflates both the intake and live outcome numbers. If the shelter provides its raw computer data for review (many do not), this practice is relatively easy to spot. Take this report, for example, from Memphis Animal Services (MAS). On Page 1 of the report, near the top, 16 animals are listed as intakes from "Foster." On Page 2, 67 "outcomes" are listed as "Foster." A couple of screen captures from the report are included below.


Chameleon Software intake report from Memphis Animal Services.

Chameleon Software outcome report from Memphis Animal Services.

Note: the total column in the outcome report is the second from the left, as opposed to the far right, as it is in the intake report.

When asked to explain this double counting, Memphis Animal Services responded, in part, by saying this is how they were trained by Chameleon to enter foster animals. They also said they were working with a No Kill consultant who also recommends this approach. Given those facts, it should be no surprise that this practice is very commonplace in animal shelters all over the USA and results in dramatically inflated intake and live outcome numbers.

Since the date of the above report, MAS has changed their reporting format and has said they are now "filtering out" these duplicate entries. While doing so would make their public reports more accurate, it also means that their public reports no longer match the reports generated by their computer system. It also means that auditing their reports is impossible unless you know exactly how they have changed the data before publishing it. I probably don't have to say it, but changing the data before publishing it is, in and of itself, potentially problematic.

This way of entering "fosters" as multiple intakes and outcomes is done by both No Kill and traditional animal shelters, and it is my opinion that few of these shelters fully understand the impact this practice has on the overall LRR.

Not Counting All Neonates

This is another item on Winograd's list that is very widespread in both No Kill and traditional shelters. In my experience, failing to count all neonatal (young, un-weaned) animals (particularly kittens) happens innocently enough, via a scenario that goes something like this: a pregnant cat goes into foster care and then gives birth at the foster home. The shelter does not see the kittens or enter them in the computer system until the mom cat and kittens come back to the shelter for vaccinations and other veterinary work.

That scenario is common enough that many, if not most, animal shelters don't bother to enter ANY neonatal kittens in their computer systems until those kittens are back at the shelter for sterilization in preparation for adoption. That means any kittens that died before then are generally not counted, even if the shelter's intake report lists a large number of kittens "born in shelter."

While that might not sound significant, neonatal kittens can make up a huge portion of shelter intake for several months of the year, which is why the spring and summer months are generally referred to in the industry as "kitten season." Furthermore, neonatal kittens are the most fragile lives seen in animal shelters. According to the veterinary literature, the mortality rate for kittens born to healthy mother cats in ideal situations is about 10%. Given that animal shelters are frequently dealing with mother cats and kittens in less-than-ideal situations, it is reasonable to assume that the mortality rate for neonatal kittens in shelters is likely to be higher than 10%. Yet many animal shelters of all kinds report live release rates for neonatal kittens that are higher than you would expect to see simply accounting for natural mortality. The most likely explanation is that they are not counting all of the kittens that died.
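A back-of-the-envelope sketch, with invented counts, shows how big the gap can be between counting kittens from birth and entering only the survivors.

```python
# Hypothetical: 200 kittens born in foster care during "kitten season".
kittens_born = 200
natural_mortality = 0.10                                # ~10% even for healthy litters
kitten_deaths = int(kittens_born * natural_mortality)   # 20
survivors_adopted = kittens_born - kitten_deaths        # 180

# Counting every kitten from birth:
full_lrr = survivors_adopted / kittens_born * 100           # 90.0%

# Entering kittens only when they return for sterilization:
reported_lrr = survivors_adopted / survivors_adopted * 100  # 100.0%

print(f"All kittens counted:    {full_lrr:.1f}%")
print(f"Only survivors entered: {reported_lrr:.1f}%")
```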

Needless to say, animal shelters need to take extra steps and do extra work to ensure they are counting each of those lives and deaths. Yet, if they do, their LRR will look worse than that of a shelter that does not. There is little incentive, therefore, for shelters to do that extra work.

Auditing Shelter Statistics

One thing becomes clear when thinking about these issues: an animal shelter's LRR means nothing without knowing more about its data entry and other record keeping practices. An animal shelter reporting an 89% LRR could be performing dramatically better than one reporting an LRR of 94% or higher, particularly if the first shelter is counting every life and counting it only once.

A reasonable shelter audit needs to look far past the reported numbers and the computer system. It needs to include a review of a host of intake and data entry practices. Without accounting for those factors, the reported LRR means nearly nothing.
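As one concrete illustration of what looking past the reported numbers can mean, here is a hedged sketch of how an auditor might recompute an LRR from a shelter's raw event records, counting each animal once and each death as a death. The record layout and event names are invented for the example; real exports from systems like Chameleon will look different.

```python
# Each record is (animal_id, event). Invented data, for illustration only.
raw_records = [
    ("A1", "intake"), ("A1", "outcome_foster"), ("A1", "intake_foster_return"),
    ("A1", "outcome_adoption"),
    ("A2", "intake"), ("A2", "outcome_ore"),            # "owner requested" euthanasia
    ("A3", "intake"), ("A3", "outcome_died_in_care"),
    ("A4", "intake"), ("A4", "outcome_adoption"),
]

LIVE = {"outcome_adoption", "outcome_return_to_owner", "outcome_transfer"}
DEATHS = {"outcome_ore", "outcome_euthanasia", "outcome_died_in_care"}

# Keep only each animal's final live-or-dead outcome; foster trips are ignored.
final_outcome = {}
for animal_id, event in raw_records:
    if event in LIVE or event in DEATHS:
        final_outcome[animal_id] = event

live = sum(1 for event in final_outcome.values() if event in LIVE)
deaths = sum(1 for event in final_outcome.values() if event in DEATHS)
print(f"Audited LRR: {live / (live + deaths) * 100:.1f}%")  # 50.0% for this data
```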

Conclusion

The Asilomar Accords is a document that was written by a small group of people more than a decade ago. At the time, it might have been considered a reasonable first step toward an open and transparent approach to reporting animal shelter outcomes. That is about the best a person could reasonably say about it, because it is neither open nor transparent. Even as a first step it has largely failed: there has been virtually no meaningful follow-up or updating in the 13 years since it was originally written. That puts elected officials who oversee animal shelters, as well as advocates seeking shelter reforms, in a difficult position: they can't know whether their shelters are being fully transparent without a detailed audit of all of these practices. Many questions need to be asked and answered. It is insufficient, for example, to ask a shelter whether or not it counts all of its neonatal kittens. Detailed questions about when and how animals are entered need to be asked, and the answers verified with a review of actual practices as well as the raw computer data from which the shelter's reports are generated.

For many years, it has been generally concluded that animal shelters with an LRR higher than 90% might be No Kill. But No Kill is not the number 90%. No Kill means not ending the lives of healthy or treatable pets. How that really translates into a fully transparent LRR in different contexts is, I believe, largely unknown. Therefore, I believe there are more important questions that need to be asked to evaluate whether or not a shelter is truly No Kill:

• Are animals being destroyed that had rescue groups willing to save?

• Is the shelter asking rescue groups for help for special needs animals? If so, who are they asking and how?

• Does the shelter have good relationships with the local rescue groups?

• Is the shelter providing veterinary treatment for pets with treatable medical conditions?

The answers to these kinds of questions, in combination with an audit of its statistics, can determine whether or not a shelter is No Kill. Anything less than that probably can't, no matter how great the shelter's LRR looks.
