
Inaccurate automation: legal concerns put the brakes on police use of AI chatbots to write crime reports

Normally, Oklahoma City police Sgt. Matt Gilmore would have grabbed his laptop and spent another 30 to 45 minutes writing up a report about the search. But this time he had artificial intelligence write the first draft.

Pulling from all the sounds and radio chatter picked up by the microphone attached to Gilmore's body camera, the AI tool produced a report in eight seconds.

"It was a better report than I could have ever written, and it was 100% accurate. It flowed better," Gilmore said. It even documented a fact he didn't remember hearing: another officer's mention of the color of the car the suspects ran from.

The Oklahoma City Police Department is one of a handful experimenting with AI chatbots to produce the first drafts of incident reports. Officers who have tried it are enthusiastic about the time-saving technology, while some prosecutors, police watchdogs and legal scholars have concerns about how it could alter a fundamental document of the criminal justice system, one that plays a role in who gets prosecuted or imprisoned.

Built with the same technology as ChatGPT and sold by Axon, best known for developing the Taser and as the dominant U.S. supplier of body cameras, the tool could become what Gilmore describes as another "game changer" for police work.

"They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate," said Axon founder and CEO Rick Smith, describing the new AI product, called Draft One, as having drawn the "most positive reaction" of any product the company has introduced.

"Now, there's certainly concerns," Smith added. In particular, he said, district attorneys prosecuting a criminal case want to be sure that police officers, not solely an AI chatbot, are responsible for authoring their reports, because the officers may have to testify in court about what they witnessed.

"They never want to get an officer on the stand who says, 'The AI wrote that, I didn't,'" Smith said.

AI technology is not new to police agencies, which have adopted algorithmic tools to read license plates, recognize suspects' faces, detect gunshot sounds and predict where crimes might occur. Many of those applications have come with privacy and civil-rights concerns and attempts by legislators to set safeguards. But the introduction of AI-generated police reports is so new that there are few, if any, guardrails guiding their use.

Concerns about society's racial biases and prejudices getting built into AI technology are just part of what Oklahoma City community activist Aurelius Francisco finds "deeply troubling" about the new tool.

"The fact that the technology is being used by the same company that provides Tasers to the department is alarming enough," said Francisco, a co-founder of the Foundation for Liberating Minds in Oklahoma City.

He said automating those reports would "ease the police's ability to harass, surveil and inflict violence on community members."

Before trying out the tool in Oklahoma City, police officials showed it to local prosecutors, who advised some caution before using it on high-stakes criminal cases. For now, it is only used for minor incident reports that don't lead to someone getting arrested.

"So no arrests, no felonies, no violent crimes," said Oklahoma City police Capt. Jason Bussert, who handles information technology for the 1,170-officer department.

That's not the case in another city, Lafayette, Indiana, where Police Chief Scott Galloway said all of his officers can use Draft One on any type of case, and that it has been "incredibly popular" since the pilot began earlier this year.

Or in Fort Collins, Colorado, where police Sgt. Robert Younger said officers are free to use it on any type of report, though they found it doesn't work well on patrols of the city's downtown bar district because of an "overwhelming amount of noise."

Along with using AI to analyze and summarize audio recordings, Axon experimented with computer vision to summarize what is "seen" in the video footage, before quickly realizing that the technology was not ready.

"Given all the sensitivities around policing, around race and other identities of the people involved, that's an area where I think we're going to have to do some real work before we would introduce it," said Smith, the Axon CEO, describing some of the tested responses as not "overtly racist" but insensitive in other ways.

Those experiments prompted Axon to focus squarely on audio in the product it unveiled in April at its annual conference for police officials.

The technology relies on the same generative AI model that powers ChatGPT, made by San Francisco-based OpenAI. OpenAI is a close business partner of Microsoft, which is Axon's cloud computing provider.

"We use the same underlying technology as ChatGPT, but we have access to more knobs and dials than an actual ChatGPT user would have," said Noah Spitzer-Williams, who manages Axon's AI products. Turning down the "creativity dial" helps the model stick to the facts so that it "doesn't embellish or hallucinate in the same ways that you would find if you were just using ChatGPT on its own," he said.
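The "creativity dial" Spitzer-Williams describes corresponds, in public chat-model APIs, to a sampling parameter usually called temperature: the closer it is to zero, the more deterministic and fact-bound the output. Axon has not published Draft One's internals, so the following is only a minimal sketch of that idea using OpenAI's public Python client; the model name, system prompt and transcript are invented for the example.

    # Illustrative sketch only -- Axon has not published Draft One's internals.
    # Shows the general idea of a "creativity dial": the temperature parameter
    # of a chat-completion call turned all the way down.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Hypothetical body-camera audio transcript, invented for this demo.
    transcript = "Dispatch: suspects fled on foot from a blue sedan near 5th and Main..."

    response = client.chat.completions.create(
        model="gpt-4o-mini",   # assumed model; the one Axon uses is not public
        temperature=0,         # the "creativity dial" at its lowest setting
        messages=[
            {"role": "system",
             "content": ("Draft a narrative incident report using only facts "
                         "stated in the transcript. Do not invent or embellish "
                         "details.")},
            {"role": "user", "content": transcript},
        ],
    )

    print(response.choices[0].message.content)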

Axon declined to say how many police departments are using the technology. It is not the only vendor; startups such as Policereports.ai and Truleo pitch similar products. But given Axon's deep relationship with the police departments that buy its Tasers and body cameras, experts and police officials expect AI-generated reports to become ubiquitous in the coming months and years.

Before that happens, legal scholar Andrew Ferguson would like to see more public discussion of the benefits and potential harms. For one thing, the large language models behind AI chatbots are prone to making up false information, a problem known as hallucination that could insert convincing and hard-to-notice falsehoods into a police report.

"I am concerned that the automation and the ease of the technology would cause police officers to be less careful with their writing," said Ferguson, a law professor at American University who is working on what is expected to be the first law review article on the emerging technology.

Ferguson said a police report is important in determining whether an officer's suspicion "justifies someone's loss of liberty." It is sometimes the only testimony a judge sees, especially for misdemeanor crimes.

Human-generated police reports also have flaws, Ferguson said, but it is an open question which is more reliable.

For some officers who have tried it, the tool is already changing how they respond to a reported crime. They narrate what is happening so the camera better captures what they want to put in writing.

As the technology catches on, Bussert expects officers to become "more and more verbal" in describing what is in front of them.

After Bussert loaded the video of a traffic stop into the system and pressed a button, the program produced a narrative-style report in conversational language that included dates and times, just as an officer would have typed from his notes, all based on audio from the body camera.

"It was literally seconds," Gilmore said, "and it was done to the point where I was like, 'I have nothing to change.'"

At the end of the report, the officer must click a box indicating that it was generated with the use of AI.