Impact Brief: Through Data, a Model for Improving Police Tactics

Data from traffic stops can help law enforcement understand racial disparities in their interactions. | Credit: Stocksy/Sean Locke

Jennifer Eberhardt was riding as a passenger in a police car when all hell broke loose in Oakland. It was late 2014 and protesters had crowded city streets in response to a spate of fatal police shootings of unarmed African Americans around the country.

Downtown was strewn with shards of broken glass and trash, and walls were covered in spray paint. Officers hit the streets in riot gear. At one point, protesters blocked a nearby highway, forcing the cruiser Eberhardt was riding in to drive against stopped traffic.

Eberhardt, a Stanford social psychologist and MacArthur “genius grant” winner, had asked to ride along on a patrol for research she was doing on the very problem now visible in its extreme: the tense relations between the Oakland Police Department and the community it serves.

Eberhardt was retained by the City of Oakland to analyze data on police stops as part of a 2003 settlement in a civil rights case involving officer misconduct. She found that Oakland officers, both black and white, were stopping substantially more African Americans than whites.

Today, Eberhardt’s research and collaboration with fellow social scientists, police departments, and policymakers are helping law enforcement agencies nationwide use data to examine and address racial disparities in their interactions with the communities they serve.

Eberhardt wanted to watch officers in action to see things the way they did. “Ride-alongs help illuminate the complexity of issues around race and also the nuances,” she said.

And what she saw on the night of the protests brought to life much of the data she was already looking at. Oakland had given her data from 28,000 electronic reports detailing routine traffic and pedestrian stops over a 13-month period. At her request, she was also given access to footage from cameras worn by the Oakland officers, a potentially rich source of data at a time when few police departments in the country were using them.

The scope of the work called for more social scientists than just Eberhardt. She pulled in Dan Jurafsky, a professor of linguistics and computer science; Benoît Monin, a professor of organizational behavior and statistics at the Graduate School of Business; and researchers from SPARQ, a program within Stanford’s psychology department that works with government, businesses, and nonprofits to design solutions to real-world problems. Hazel Markus, a SPARQ faculty director along with Eberhardt, was also part of the team.

That body-camera footage formed the basis of a seminal study on body-worn cameras in policing. The study not only demonstrated differences in how police, regardless of their own race, treated black and white drivers during traffic stops; it was also the first to use machine-learning techniques to analyze police officer language captured from body camera footage.

SHOWING RESPECT: ‘SIR’ VS. ‘MAN’

The insights were remarkable for their granularity. Jurafsky and his researchers had developed a first-of-its-kind algorithm that could analyze the Oakland PD data for indicators of how officers treated the community members they pulled over, broken down by race.

They relied on existing linguistic theory about respect and how it is conveyed. This meant programming software not only to recognize the use of “sir” or “ma’am,” but also to flag far more ambiguous language, like “be careful” or “slow down,” that signals concern for a motorist’s safety.
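
To make the approach concrete, here is a minimal sketch in Python of keyword-based scoring for a single utterance. It is only an illustration of the general idea, not the Stanford team’s actual model, which applied machine-learning techniques to transcripts of body-camera footage; the word lists and the respect_score function here are hypothetical.

    # A rough, keyword-based respect score for one officer utterance.
    # The marker lists below are illustrative assumptions, not the study's
    # actual features, which were grounded in linguistic theories of respect
    # and fit with machine-learning models.

    FORMAL_ADDRESS = {"sir", "ma'am"}          # assumed markers of formal address
    INFORMAL_ADDRESS = {"man", "bro", "dude"}  # assumed informal address terms
    SAFETY_PHRASES = ("be careful", "slow down", "drive safe")  # assumed safety cues

    def respect_score(utterance: str) -> int:
        """Crude score: +1 for formal address, +1 for safety language,
        -1 for informal address. Real models weight many more features."""
        text = utterance.lower()
        words = set(text.replace(",", " ").replace(".", " ").split())
        score = 0
        if words & FORMAL_ADDRESS:
            score += 1
        if words & INFORMAL_ADDRESS:
            score -= 1
        if any(phrase in text for phrase in SAFETY_PHRASES):
            score += 1
        return score

    print(respect_score("License and registration, sir. Drive safe now."))  # 2
    print(respect_score("Hands on the wheel, man."))                        # -1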

The algorithm found large disparities in levels of respect based on race. Officers were far more likely to use “sir” or “ma’am,” and to express concern for safety when addressing whites. But they called African Americans “bro” and “man,” and were less likely to offer apologies or reassurance. More recent research found similar disparities in the tone of voice officers used depending on the race of the motorist.

In a separate study, the Stanford team had found that Oakland officers were far more likely to stop, search, handcuff, and arrest African Americans than whites—even when they controlled for variables like neighborhood racial demographics and crime rates.

And in surveys Stanford researchers conducted with Oakland residents, they uncovered widespread distrust: African Americans and Hispanics tended to view the police as untrustworthy and ineffective, while whites and Asians were more likely to see officers as trustworthy, capable, and racially impartial.

The disparity did not appear to be the result of overt discrimination, nor could Eberhardt and her collaborators directly measure implicit bias. Rather, the differences in treatment by race seemed to be rooted in the practices, beliefs, and culture that pervaded the department.

LOOKING AHEAD TO BROADER IMPACT

The insights—which led to 50 recommendations for new practices, policies, and procedures within the Oakland PD—were just the beginning. The city then asked the Stanford team to stay on to help senior leadership in the department implement the reforms. This included developing and conducting new officer training programs on procedural justice and implicit bias. It also required winning over a police force that, like others around the country, saw data as more of a cudgel than a learning opportunity.

Oakland PD began holding biweekly meetings with 15 officers of every rank. There, Eberhardt and Monin listened as officers offered their perspectives. The officers said they spoke casually to African Americans because they thought saying “man” was a better way to connect with them than “sir.”

They also said that their stops weren’t motivated by a driver’s race; often, they explained, the officer had prior knowledge linking the person to a crime. Unfortunately, the data on traffic and pedestrian stops didn’t capture such intelligence-led policing, so it was difficult to track.

At the Stanford team’s recommendation, Oakland changed its stop forms to allow officers to indicate whether their decision to pull over a driver or pedestrian was intelligence led. There are steps, too, for verifying that an intelligence-led stop was warranted. According to the latest stop data, nearly 40% of all stops are now made based on prior information.

Tracking intelligence-led stops seemed to have a direct impact on policing. According to department data, Oakland officers made only 19,000 stops in 2018, a 40% drop from the prior year, and African Americans were pulled over 43% less often. At the same time, Oakland’s crime rate fell, which suggests officers were being more judicious about whom they pulled over and why.
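
For readers curious how such figures are computed, the short sketch below shows the year-over-year arithmetic on made-up counts; apart from the 19,000 total reported above, the numbers and field names are assumptions chosen only to illustrate the percentage changes described here, not Oakland PD’s actual data.

    # Hypothetical year-over-year comparison; the 2017 counts and the
    # African American counts are invented for illustration. The arithmetic
    # mirrors statements like "a 40% drop in total stops" and "43% fewer
    # stops of African American drivers."

    stops = {
        2017: {"total": 31_500, "african_american": 19_500},  # assumed counts
        2018: {"total": 19_000, "african_american": 11_100},  # 2018 AA count assumed
    }

    def percent_change(old: int, new: int) -> float:
        """Percentage change from old to new (negative means a decline)."""
        return (new - old) / old * 100

    for group in ("total", "african_american"):
        change = percent_change(stops[2017][group], stops[2018][group])
        print(f"{group}: {change:+.0f}%")
    # total: -40%
    # african_american: -43%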

Officers have come to understand and better appreciate the power of data, says LeRonne Armstrong, an Oakland deputy chief of police.

“Law enforcement didn’t think research had much value because it didn’t take into account the officer perspective,” he said. “But Dr. Eberhardt and her team were willing to stick around and explain why their data mattered, and to listen in return.”

Ronald Davis, who served as executive director of President Obama’s Task Force on 21st Century Policing and consults with thousands of police agencies around the country, credits Eberhardt with helping Oakland police develop some of the most progressive methods for dealing with racial disparities. “My hope is that Eberhardt and her team inspire a generation of academics and researchers to do similar work—or even just half of what they have done,” he said.

 

This issue brief describes how teams of researchers and leaders in government, business, and nonprofits can work together to generate new ideas, insights, and solutions to make progress on social problems. Jennifer Eberhardt, the Stanford University Morris M. Doyle Centennial Professor, was a member of Stanford Impact Labs’ design team, serves on the SIL Faculty Advisory Board and receives support through Stanford Impact Labs Start-Up Impact Lab Funding. This brief was written prior to the launch of Stanford Impact Labs to show how new evidence and insights developed jointly by scholars and external practitioners can inform policies and programs to improve lives.

Stanford Impact Labs invests in highly motivated teams of researchers and practitioners from government, business, nonprofit organizations, and philanthropy. These teams—impact labs—work together on social problems they choose and where practical progress is possible. With financial capital and professional support from Stanford Impact Labs, they can rapidly develop, test, and scale new solutions to social problems that affect millions of people worldwide.

Learn more about the work Stanford Impact Labs is investing in at impact.stanford.edu.