California State Sen. Jesse Arreguín, who wrote SB 524, said in a statement. “AI hallucinations happen at significant rates, and what goes in a police report can influence whether or not the state takes away someone’s freedom.”

AI hallucinations — false or misleading outputs delivered as facts — are a threat not only to individuals’ freedom but to the public’s trust in law enforcement, Kazensky said. To keep these at bay, agencies need to vet AI tools as diligently as they would officer recruits and put procedures in place to ensure officers are checking their output.

AI tools trained on internal data are safer than those that scoop up data from the internet, Kazensky said. He called the latter a quagmire of disinformation that can lead to hallucinations. He’s also concerned about data captured by a video camera, but not witnessed by an officer, being entered into reports.

“It’s not that it didn’t happen,” Kazensky said. “It’s just a question of, if I didn’t see it, should it be in my police report?”

Law enforcement agencies must be open and transparent about how and why they’re using AI, Kazensky said. Local government leaders can help ensure that happens.

“There is a part for city and county government leaders to play in getting law enforcement agencies tools that improve their processes, but they need to do it in a way that’s open and fair and communicative with everyone,” he said. “Not everyone’s going to love it. There’s always going to be some contingency that says absolutely not. So, you have to listen to as many voices and address as many concerns as you can.”