We called it a “climate of fear.” Between all the muggings, thefts, and “peeping tom” incidents, campus crime during the 2010-2011 school year at the University of Illinois felt out of control—and it was putting the campus community on edge. It wasn’t so much that students were witnessing or experiencing criminal activity on a regular basis; rather, a slew of emails with “CRIME ALERT” in the subject line was pouring into our inboxes at an alarming rate.

But by the end of the first semester, my journalism classmates and I made a shocking discovery: Despite the uproar, it turned out that there wasn’t actually any “crime wave.” The campus community was being misled.

This discovery grew out of an idea from one of my journalism professors, Eric Meyer (you may recognize that name as the Kansas newspaper publisher whose office and mother’s home were raided in 2023 over his paper’s investigation of a local restaurant owner). Inspired by all the crime talk, Professor Meyer suggested that our journalism design class do something different that semester: build an interactive, data-driven news website all about campus crime. So for the rest of the semester, 40 other journalism students and I divvied up story ideas and got to work.

In the end, our reporting revealed that despite the hype over crime alerts, campus crime wasn’t increasing…as much as the crime alerts being sent to students were increasing. Even though campus police were required by law to collect and publish information about certain campus crimes, the police chief went above and beyond the law that particular semester and padded the alerts with additional crimes that the department typically didn’t, and wasn’t required to, report. Because the department never explained these changes to the campus community, the alerts gave the false impression that crime was occurring a lot more often than it used to.

More often than not, when you see a crime stat or crime report, it’s being used to tell a misleading story.


Essentially, the campus police were voluntarily sending more alerts while simultaneously entertaining the idea that the campus was becoming less safe. Despite the panic and a growing climate of fear, it wasn’t until after our reporting that campus police conceded that crime alerts were being sent to students in excess of what was required. This came well after campus outcry and a university commitment of $1 million to security enhancements and additional police staff.

Our class project, campuscrime.net, would go on to win first place for online news reporting in the Society of Professional Journalists’ Mark of Excellence competition that year. A new group of students created a follow-up site the next semester, probing even more stats.

But, for me, the most memorable takeaway from that project was this: Crime stats, and the people who cite them, are often misleading the public. In fact, I’d venture to say that, more often than not, when you see a crime stat or crime report, it’s being used to tell a misleading story!

HOW CRIME STATS LIE

Crime and fear are good building blocks for a juicy narrative. Whether it’s politicians, news reporters, or activists doing it, crafting crime narratives is an easy grab for political, monetary, or social gain. This is why every local news broadcast begins with crime stories, and every local politician has some grand plan for reducing crime, whether or not crime is actually rising.

And people eat it up!

In reality, crime stats are never as simple or straightforward as people make them seem. There are too many variables that can be mixed and matched to support whatever story the person citing the stats wants. In this post, I explain exactly how people, including journalists, do this using six common variables:

Time

Time is an important element of crime data, but it’s malleable. People who use crime data can cherry-pick the best time frame that supports whatever story they want to tell. And this creates misleading narratives.

Take the year-over-year (YOY) metric. It’s the most common time frame people use to report crime statistics. People often use it with the phrase “compared to the same time last year,” as in “crime is up now, compared to the same time last year.” The problem with the YOY metric is that it’s often meaningless without context. Simply comparing crime between two moments in time without looking at the moments surrounding them can be misleading. So what if crime is up now, “compared to the same time last year”? What if the crime rate was even higher in each of the five years before, or what if it decreases next year?

Let’s look at a real example of how people use the YOY variable to tell a misleading crime story. In June 2024, The New York Post published an article with the headline “NYC gun violence spikes as summer heat hits the five boroughs: ‘It’s that time of the year.’” The gist of the story is that New Yorkers should prepare for a “long, hot summer” of violence because crime for one week in June was higher than in the same week the year before.

The article claims “shootings spiked in the Big Apple last week by a whopping 50% compared to the same period last year.” For that one-week period in June 2024, there were two dozen shootings in New York City, compared to 16 shootings in the same timespan in 2023. The story warns readers that “it’s that time of the year.” It even quotes an anonymous police officer who says, “The summer hasn’t even officially started yet it’s already getting out of hand.”

The story leads New Yorkers to believe that there is some crime spike that they should be really worried about heading into the summer. In reality, the article is using a very narrow one-week time frame to tailor the crime stats to its misleading narrative.

What happens when we zoom out a bit and get a wider view?

Looking at the full month of June, we see there were only five more NYC shootings in June 2024 compared to June 2023; nothing close to the “whopping 50%” increase during the one-week period the article focuses on. Zoom out even further to quarterly crime reports and you’ll see that the total number of shootings for Q2 (which includes the months of April, May, and June) was roughly the same in 2024 as in 2023 (and technically speaking, there was one fewer shooting in Q2 2024).

Now, let’s zoom out even further and look at the year-to-date numbers (which include shootings from the start of the year up to the date the story was reported). Several paragraphs deep, the story admits that there was a decline in “year-to-date shootings.” It glosses over that point a bit, but, according to the numbers it cites, the number of shootings between January 2024 and June 2024 was indeed 7 percent lower than during the same time frame in 2023.

So in June 2024 New York City shootings increased, stayed the same, and decreased, all at the same time, depending on which time frame you look at. This shows that time can be used to support almost any crime stat narrative—just zoom in or out far enough until it shows you what you want.
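To make this concrete, here’s a minimal sketch in Python. Only the one-week figures come from the Post article; the monthly and year-to-date totals below are placeholders I invented to be consistent with the changes the article reports, not real NYPD numbers.

    # Toy demonstration: the same city, the same June, three different stories,
    # depending on the comparison window. Only the one-week figures come from
    # the article; the rest are invented for illustration.

    def pct_change(current, prior):
        """Percent change from the prior period to the current period."""
        return (current - prior) / prior * 100

    one_week_2023, one_week_2024 = 16, 24    # from the Post article
    june_2023, june_2024 = 90, 95            # invented: "five more shootings"
    ytd_2023, ytd_2024 = 420, 390            # invented: ~7 percent lower

    print(f"One week:     {pct_change(one_week_2024, one_week_2023):+.0f}%")  # +50%
    print(f"Full month:   {pct_change(june_2024, june_2023):+.1f}%")          # +5.6%
    print(f"Year to date: {pct_change(ytd_2024, ytd_2023):+.1f}%")            # -7.1%

Same data, three different headlines, depending entirely on where you draw the window.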

That’s why the year-to-date time frame (the one the New York Post article glossed over) is the better method. It accounts for all crimes for the year, up until the date of reporting, then compares that to the same time frame of another year. But even the year-to-date time frame can be misleading, as criminal justice scholar Jerry Ratcliffe explained on his website. He found that year-to-date (YTD) crime stats don’t provide enough data to draw conclusions about statistically significant trends until at least October. And sure enough, New York City’s Q3 crime report, released in October, shows that shootings, year-to-date, were 9 percent lower in 2024 compared to 2023—so much for that “long, hot summer” of violence.

Ratcliffe argues that year-to-date stats are more accurate by the end of the year. But even then, it’s important to zoom out and compare that year to several years before. For instance, in Ratcliffe’s own example using Philadelphia crime data: While crime in 2015 ended the year up 13 percent from the year before, the three most recent years still had the lowest crime of the decade.

Per capita

Sometimes people show crime stats “per capita,” meaning the amount of crime has been adjusted to account for population size (i.e., crimes per X number of people). That’s because larger areas (e.g., big cities) will probably have more crime than smaller areas based on the sheer volume of people in the region. People often use per capita rates to prevent population size from distorting the data, but sometimes using per capita can distort the data, too.

Take this example from Chicago’s WBEZ and the Chicago Sun-Times. The article is about crime and safety on Chicago’s transit system (the CTA). The reporters analyzed CTA crimes, specifically “crime per rider trip.” They found that CTA crime per rider was increasing, even though the total number of CTA crimes stayed relatively flat. The reason is that ridership was decreasing, causing “crime per rider” to technically increase. Therefore, they concluded, “The more people who ride public transit, the safer it tends to be.”

Unfortunately, this is a tiny bit misleading. The reporters actually acknowledge the problem, yet still devote a significant portion of the article (including the headline and lead) to the stat, pushing the narrative that “the number of violent crimes has remained stable but the number of people riding the CTA has dropped, increasing the odds of a rider becoming a victim.” Here’s why that’s misleading:

On its face, the stat is technically true: the statistical odds of a rider becoming a victim of a violent crime are lower if more people ride the trains. However, this doesn’t make the transit system “safer.” That depends on how you define “safer.” The data shows that the transit system is no safer with more riders than with fewer, because the total number of crimes (and thus the number of victims) stays the same. So adding more riders will not decrease crime; it simply dilutes the existing crime across more trips and exposes new riders to potentially becoming victims.
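A toy calculation makes the mechanism obvious. The figures below are invented for illustration and are not actual CTA data:

    # Invented figures for illustration only; not actual CTA data.
    crimes = [600, 600, 600]            # total crimes per year: flat
    rides = [250e6, 180e6, 120e6]       # annual rider trips: falling

    for year, (c, r) in enumerate(zip(crimes, rides), start=2019):
        rate = c / r * 1e6              # crimes per million rides
        print(f"{year}: {c} crimes, {r / 1e6:.0f}M rides -> {rate:.1f} per million rides")

    # The per-ride rate climbs from 2.4 to 5.0 even though the number of
    # crimes (and victims) never changes.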

The problem with the story’s interpretation of the data is that it helps other interested parties like the transit authority, police department, and politicians spin the numbers by touting that crime per trip is going down or that the CTA needs more riders to stay safe, when in reality the amount of crime on the CTA system has stayed the same.

Even outside the context of transit ridership, explaining overall crime using per capita has issues. People sometimes use per capita crime rates to give an audience a sense of the likelihood that an average person in the population might become a victim of crime. But this assumes that everyone in the population has an equal chance of becoming a crime victim. In reality, a whole host of things, including gender and neighborhood, can alter an individual’s odds of becoming a crime victim.

Another huge problem with per capita crime rates is that per capita formulas often define population based on residency. However, not all crime happens to, or is committed by, residents, especially in areas with heavy tourism.

Another issue occurs when people use per capita crime rates to compare different cities or areas. The thought process is that larger cities with more people will obviously have more crime, so to normalize the data, just convert total crimes to total crimes per capita. However, a 2021 study found that this approach can produce misleading results. The problem is that the per capita formula (total crimes divided by total population) assumes that crime increases at the same rate as population. But the study shows that crime and population have a non-linear relationship: Crime increases with population, but usually at increasing rates—and this varies by type of crime. To demonstrate this, the researchers compared a ranking of cities based on crime per capita to a ranking based on a formula that adjusts for population size without assuming a linear relationship. They found that the final rankings for the two lists were very different. They concluded that using “per capita” can therefore distort crime rate comparisons. So take per capita crime rate rankings like this one with a grain of salt!
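As a rough sketch of the idea (not the study’s exact method or data), you can fit a log-log regression of crime on population and rank cities by their residuals (how much more or less crime they have than their size alone predicts) rather than by crimes divided by population:

    # Sketch of a scale-adjusted crime comparison, assuming a power-law
    # relationship crime ~ population**beta. The data are invented and the
    # method is a simplification, not the study's exact approach.
    import numpy as np

    populations = np.array([100_000, 500_000, 1_000_000, 5_000_000])
    crimes = np.array([300, 2_200, 5_500, 40_000])

    # Fit log(crime) = alpha + beta * log(population).
    beta, alpha = np.polyfit(np.log(populations), np.log(crimes), 1)
    print(f"beta = {beta:.2f}")  # beta > 1: crime grows faster than population

    # Per capita ranking implicitly assumes beta == 1.
    per_capita = crimes / populations

    # Scale-adjusted ranking: deviation from the fitted power law.
    expected = np.exp(alpha) * populations ** beta
    residual = np.log(crimes / expected)

    print("Worst-to-best, per capita:    ", np.argsort(-per_capita))  # [3 2 1 0]
    print("Worst-to-best, scale-adjusted:", np.argsort(-residual))    # [2 0 3 1]

In this made-up example, the biggest city tops the per capita list, but once you account for how crime scales with city size, a mid-sized city actually has the most crime for its size.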

Geography

Geography is another way people misrepresent crime. Crime stats that fail to zoom in on smaller geographies obscure the areas that aren’t experiencing the same crime rates as everywhere else. Saying crime is decreasing in the U.S., for instance, means little to the people who live in cities where crime is indeed increasing.

Overly broad geographies also erase some of the systemic issues behind crime rates. Saying that a city like Chicago has a high homicide rate doesn’t mean that everyone in that city is equally susceptible to violent crime. Chicago’s history of residential segregation partially explains why more than 50 percent of the city’s murders so far this year have happened in only six of its 77 neighborhoods (community areas), according to my analysis. So broad geographies can whitewash crime and present it as a region-wide problem, when in reality it’s often an indicator of social inequities that disproportionately affect smaller populations.

Language and context

The way people describe crime stats can really distort and misrepresent them. This includes language that associates crime (or certain types of crime) with specific groups of people. Take crime involving people who are migrants, for example: Lately, when politicians talk about unprecedented levels of undocumented immigration and border crossings, many, including presidential candidates, link border-crossing immigrants to crime. Unfortunately, even some mainstream journalists parrot politicized language about “migrant crime.”

The problem is there is no such thing as “migrant crime.” What even is “migrant crime”? What makes “migrant crime” different from any other crime? It would make sense to point out a crime suspect’s citizenship status if crimes committed by people who are migrants occurred at higher rates, or if they increased the crime rate, but it turns out that there is no evidence of that in the U.S. Even locally in places like Chicago and New York City, which have received thousands of migrants bused in from Texas, data doesn’t support the claim that people who are migrants are causing crime waves. In Chicago, most migrant arrests are for non-violent violations like traffic violations and theft, according to the Chicago Tribune. And in New York City, while the New York Post claimed that migrants accounted for 75 percent of arrests in Manhattan’s Midtown, it failed to provide a substantial source for that stat, or to report what types of crimes the arrests represented. Meanwhile, murders and shootings are down in the city. Violent crime, including rape, murder, and shootings, has been decreasing since 2022, according to the New York Times. Moreover, research shows that immigrants are less likely to commit crime than U.S.-born residents.

People, especially journalists, who associate a group of people with crime automatically discredit themselves. That’s because it’s highly unlikely that all or even most people in that group are actually associated with any crime. While the statistics alone might be accurate, when we try to characterize crime data by groups of people it’s very easy to inadvertently imply that all or even most people in a group should be associated with it. This is true even with a half-hearted “not all of them” disclaimer, because if it’s “not all of them,” or even “most of them,” why mention the group at all?

 “Black people are 13 percent of the population, but commit more than 50 percent of America’s murders.”

Let’s take, for example, the infamous 13 percent meme about black people and crime. The saying goes something like “Black people are 13 percent of the population, but commit more than 50 percent of America’s murders.” The statement is trying to show that, according to FBI data, black suspects account for more than 50 percent of arrests for murder, even though black people are only 13 percent of the U.S. population. Instead, it’s strongly suggesting that all or even most black people should therefore be associated with crime. 

This is not logical.

There were about 1 million violent crimes in 2022. Even if 100 percent of the offenders in those crimes were black (they obviously weren’t), and even if every crime had a different offender, that would still represent less than 3 percent of the black population. That is, the overwhelming majority of black people, roughly 98 percent of them, did not commit any violent crime. The language and context surrounding the 13 percent statistic therefore become extremely problematic. The way people use the stat makes harmful implications about black people as a whole, even though the stat itself only represents a tiny portion of the community.
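The back-of-envelope arithmetic is easy to check. (The population figure below is an approximation I’m supplying for illustration; roughly 46 million Americans identify as black.)

    # Back-of-envelope check of the claim above, under the extreme (and false)
    # assumptions that every violent crime had a distinct offender and that
    # every offender was black. The population figure is approximate.
    violent_crimes = 1_000_000        # rough FBI estimate for 2022
    black_population = 46_000_000     # approximate U.S. black population

    worst_case_share = violent_crimes / black_population
    print(f"{worst_case_share:.1%}")  # ~2.2%, i.e., ~98% committed no violent crime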

Racists cite the 13 percent stat in bad faith, but there are countless examples of mainstream news media also using language that erroneously links crime to racial groups, especially black and Hispanic people. Let’s revisit the 2010 campus crime debacle I talked about at the top of this post. Amid the “climate of fear” at the University of Illinois at Urbana-Champaign, the increase in crime alerts sent to students fueled racial tensions, because many of the crime alerts said the assailants were black. Many black students, myself included, subsequently experienced racial profiling on campus. A lot of this was documented on our project site campuscrime.net, including editorials from people who admitted to the racial profiling.

Because many of the assailants in the crime alerts were described as black and many of the victims were white, rumors of a legendary game called “polar bear hunting” spread. Polar bear hunting comes from an unfounded claim that groups of black youth make a game out of targeting unsuspecting white people and knocking them unconscious. In my research, I could not find evidence of anyone but concerned residents, police, and media using this term or claiming it was a real thing, and the origin stories were often hazy and contradictory. Local police officials also denied it was happening at the time.

Yet, without any evidence of their own, the local newspaper the News-Gazette ran a series of stories linking local crimes involving white male victims to the mythical race-based polar bear hunting game. Even though police told the Gazette there was no evidence that the attacks were related to any racist game, the Gazette went ahead and published a typo-laden editorial calling for local police to acknowledge that the attacks were racially motivated. The Gazette even went so far as to interview a black business owner to answer for why so many black youth appeared to be attacking white men. The print-edition headline for that article was apparently: “Embarrassing to our community.”

The editorial featured a map of all the batteries in Champaign, Illinois, claiming each one involved white male victims and black attackers.

The data behind the map turned out to be extremely spotty. Joel Gillespie, a reporter for the Champaign digital magazine Smile Politely, looked closer at the News-Gazette’s map of attacks and found that the paper was being misleading with its racialized characterizations of the crimes. First, the map itself contradicted the paper’s claim that all the victims were white and all the attackers were black: It showed that some of the victims were not white. Second, Gillespie looked at batteries in the sister city of Urbana and found that, of hundreds of batteries involving white victims, the assailants were just as likely to be white as black. Gillespie was unable to get demographic data from Champaign police, but it’s highly unlikely that adjacent Champaign, which shares the campustown with Urbana, would be significantly different. Third, the Gazette’s map highlighted only 20 batteries in the span of 42 days, but in the year before there were 564 batteries in Champaign, averaging about 47 per month. So it appeared as though the Gazette cherry-picked the data, only mapping incidents involving black attackers and white male victims, and ignoring the others.

Finally, Gillespie reached out to the Champaign police and requested the same data the Gazette used for its editorial. The data he got back included five additional incidents that weren’t on the Gazette’s map. Each of those incidents involved female victims and/or non-black or race-unknown attackers. This suggests that the Gazette simply mapped only the crimes that fit its racialized narrative, ignoring all the other incidents that contradicted it.

Aside from the glaring journalistic ethics issues, the News-Gazette’s coverage pinpoints the danger of using racial groups to characterize criminal activity. When people become so fixated on proving a narrative that they focus only on one group, they risk missing all the other groups doing the same thing. They end up creating a fake trend. Gillespie summed it up best: “Media-driven hysteria which assigns a storyline to statistically-insignificant small sample sizes, cherry-picked data, and attempts to assign motivation and connection to crimes which are [sic] ultimately appear to be unrelated (except in their demographic makeup), is an extremely unhealthy way to approach these issues.”

Poor word choice in crime stat coverage doesn’t have to be racialized to be problematic. People use all sorts of terms that falsely characterize crime stats or link them to nonexistent trends. One example that comes to mind is “subway crime.” New York City reporters often cover “subway crime” as if it’s some unique and rising epidemic. This makes people terrified to ride the subways. But violent crimes on subways are relatively rare. Another example is “smash and grab” retail theft. Coverage often falsely implies the viral videos of young people breaking into retail stores en masse and stealing things are related or part of a growing trend, when there is little data to back up that claim.

When we aren’t careful with how we describe crime, we risk making a small number of unlinked incidents appear to be part of some larger, growing, unprecedented trend that might not actually exist. We might then implicate entire groups of people who, in reality, mostly have nothing to do with crime. 

Frequency

One reason crime appears to be out of control is because news organizations constantly cover it. Multiple studies have found that news media devote a disproportionate amount of coverage to the most violent crimes, compared to actual crime rates. As a result, people overestimate how close they are to mayhem and underestimate how safe they are.

Every local news broadcast I watch begins with lengthy crime and mayhem coverage. A lot of journalists sniff out a grisly crime story, narrativize it, and then broadcast it, giving the impression that it can likely happen to anyone watching or reading, too. In reality, it happened to one person, out of millions, that day, and now journalists, politicians, activists, and other stakeholders are amplifying it. Doing this every single day can give the impression that crime is a bigger problem than it is. It makes people feel like crime is happening all around them and they will be the next victim.

The truth is that most people are not criminals. People committed a little more than 1 million violent crimes in 2022, according to FBI data. But there are more than 330 million people in the United States. That means that even if every one of those violent crimes were committed by a different person, that’s only about 0.3 percent of the population.

Journalists rarely remind their audiences that just because the media keeps talking about crime doesn’t mean crime is actually happening more often than usual.

Data collection

There’s an ongoing political debate about whether crime in the United States is up or down. The reason for the disagreement is data collection—or a lack thereof. Some politicians, like Republican Donald Trump, claim violent crime in the country is rising. Others, like former Democratic presidential candidate Kamala Harris, say it’s going down. Who is right?

The FBI is usually the main source of national crime data. Police departments around the country typically send their data to the bureau, which tabulates and publishes the results in quarterly and annual reports. The problem is that in 2021 the FBI transitioned to a new reporting system, which required law enforcement agencies to spend time training staff and updating their systems. The transition caused huge delays for some agencies, so 2021 national crime data from the FBI only accounts for about 60 percent of agencies, the lowest response rate in decades. Some of the missing data included numbers from the biggest police departments in the country, including the NYPD and LAPD. So data for that year is incomplete and basically meaningless—but that hasn’t stopped some politicians from citing it.

When Harris says violent crime is down she is referring to the fact that, despite the incomplete 2021 data, overall violent crime in the U.S. has been trending downward since the 1990s. But Trump appears to be fixated on the fact that some agencies, big ones, did not report data in 2021.

Other issues make FBI data incomplete. Notably, local law enforcement agencies are not always required to send data to the FBI, and about 24 percent usually don’t. Even when agencies do report, there are so many ways to define crime and collect crime data that the numbers they submit might not paint an accurate picture of crime in the country. For instance, before they are sent to the FBI, crimes like shoplifting or petty larceny sometimes get upgraded to other crimes like robbery. And some victims might not report crimes at all.

The National Crime Victimization Survey (NCVS) is supposed to address the latter issue. It’s a survey asking Americans if they’ve been victims of crime in the past six months. But it has data collection issues of its own. As a survey, it relies on a sample, and is therefore subject to margins of error. Trump appeared to be referring to NCVS data in September 2024 when he posted on social media that violent crime was up 40 percent in 2023 compared to 2020. That’s technically true, according to the NCVS, but it’s misleading to compare 2023 to 2020 while skipping over 2022. In fact, crime actually decreased slightly from 2022 to 2023, according to the NCVS.
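Here’s a quick sketch of the base-year trick. The rates below are placeholders I invented to match the direction of the changes described above, not the NCVS’s published estimates:

    # Base-year cherry-picking with invented victimization rates (per 1,000
    # people); these are NOT the NCVS's published figures.
    rate = {2020: 16.0, 2022: 23.5, 2023: 22.5}

    def change(base, current):
        return (rate[current] - rate[base]) / rate[base] * 100

    print(f"2020 -> 2023: {change(2020, 2023):+.0f}%")  # ~ +41%: "crime is up 40%!"
    print(f"2022 -> 2023: {change(2022, 2023):+.0f}%")  # ~ -4%: crime actually fell

Pick an unusually low base year, and almost any recent year will look like a crime wave.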

Sourcing is another problem. Sometimes police departments don’t have data on hyper-specific crimes. Most departments classify shoplifting, for example, as larceny, which can also include things like theft from cars and pickpocketing. So instead, many journalists and politicians rely on industry groups to collect this data using their own methods. This can be very risky.

In 2023, for instance, American retailers started claiming that rampant shoplifting and organized retail crime were causing store closures, profit losses, and locked-up merchandise. This became a buzzy headline. For months, shoplifting and viral images of groups of young people rushing into stores and stealing things dominated news coverage.

Turns out the data behind the media frenzy was very fishy. The primary source was a report from an industry lobbying group, the National Retail Federation (NRF). Investigative reporters discovered that the report was misrepresenting “shrink,” a retail industry term that refers to inventory losses, including shoplifting, internal theft, and supply chain errors. The reporters found that only about 36 percent of shrink was due to customer shoplifting, and even that didn’t appear to be increasing much. They concluded that the data was too dubious for anyone to conclude that mass organized retail crime was causing the industry’s financial woes. Some academic researchers also cast doubt on the industry’s shoplifting claims. The NRF eventually retracted its statements about shrink and retail theft. Soon after, executives who had told shareholders that shoplifting was to blame for profit losses walked back their comments. Eventually, the NRF stopped publishing shrink data altogether.

THE RAMIFICATIONS

Crime data is very messy. That doesn’t stop politicians, law enforcement officials, or journalists from bombarding the public with misleading statistics. Unfortunately, this obsession with crime and crime data has a number of bad consequences:

Widening the perception gap: Crime overall has been declining in America since the 1990s, but Gallup polls show that almost every year for more than 20 years Americans say they believe that there is “more crime in the U.S. than there was a year ago.” This disconnect between perception and reality is known as a “perception gap.” Politicians, law enforcement officials, and journalists exploit the perception gap when they use some of the crime stat tricks I mentioned above. In the end, people are left more afraid of the world than they need to be.

Perpetuating generalizations: Sloppy crime stats can perpetuate stereotypes, and unfairly link groups of people to crime. They can oversimplify crime and the people who commit it, often totally ignoring a lot of the socioeconomic factors that contribute to criminal behavior. People are left believing that some people commit crime simply because they are a member of a certain group, and in some cases they begin to treat those people accordingly.

Costly safety theater: Poorly presented crime stats often lead to costly “solutions” that give the appearance of addressing crime while actually being incapable of reducing it. Take my campus crime example: Responding to public outcry over the apparent crime wave, the University of Illinois earmarked $1 million for campus “security enhancements,” including new locks for dorms. But only a small percentage of the reported crimes actually occurred in dorms, and as we later found out, there wasn’t much of a crime wave. Another example: Following a number of reports about crime on New York City subways in 2024, officials announced a number of security measures, including a weapons scanner pilot program. Under the program, riders were randomly scanned by detectors in search of weapons. The pilot program, which was actually free for the city, ended with no arrests and more than 100 riders caught in false-positive scans. This was despite subway crime actually going down in 2024. The program, and ones like it, are basically what I call “safety theater”: highly publicized and expensive actions, policies, and toys designed to respond to the perception of crime increases rather than any real crime increases.

Tough-on-crime posturing and over-policing: American crime has been declining since the 1990s, but in a speech in Pennsylvania in September 2024, then-presidential candidate Donald Trump infamously called for what some saw as a “purge” (like the movie) to deal with shoplifters. He said he needed just “one really violent day” to deal with them: “One rough hour, and I mean real rough. The word will get out and it will end immediately,” he said. This tough-on-crime rhetoric is an example of how the false perception of crime waves can spark calls for extreme policies that might lead to over-policing and the criminalization of marginalized groups. Another example: Despite immigrants being less likely to commit crimes than U.S.-born citizens, and hardly any evidence showing a “migrant crime wave,” a series of high-profile crimes in New York City involving undocumented immigrants led some New York lawmakers to call for local police to resume the banned practice of collaborating with immigration enforcement. That would require the repeal of a law that forbids police from helping ICE officials detain migrants without a court order from a federal judge. One more example: The mass incarceration stemming from the U.S.’s “war on drugs” was a direct result of “tough-on-crime” policies between the 1970s and 1990s. Research shows that during this era black people were incarcerated for drug crimes at higher rates than white people, even though white people were equally likely to use and sell drugs.

CLOSING REMARKS

Crime stats are virtually meaningless—at least in the way people usually use them. They are little seeds that people with ulterior motives plant to grow big, scary narratives that consume people by playing into their fears. At the same time, people can’t seem to get enough of crime tales and crime stats.

But bad crime stats have major consequences. They shape the way we vote, the way we maneuver in our communities, and the way we view and treat certain types of people in our communities. 

To avoid getting duped, people should scrutinize crime stats more often. Journalists, especially, should be careful with how they cover crime and present crime data so that they don’t dupe people or help others dupe people. Ideally, news organizations would cut back significantly on their crime coverage, but I don’t see that happening because so much traditional news reporting revolves around crime and mayhem (“if it bleeds, it leads”).

For readers and news consumers, here’s a suggestion: Instead of over-relying on crime stats to gauge personal safety, just always use common-sense safety precautions in public. And don’t let crime stats shape your view of the world. At the end of the day, crime stats and the people who cite them often mislead, lie, or fail to tell the full story. Hopefully, if people have a better understanding of how this happens and why, they might realize that, ironically, crime stats are often bad indicators of any one person’s safety level. Knowing this, maybe they can navigate their communities with a little more confidence and peace of mind.