(The Center Square) – A task force created by the Washington Legislature via SB 5838 and overseen by the Attorney General’s Office to study artificial intelligence is examining its impact on many areas of life, including public safety.
For the task force subcommittee handling that issue, the question is whether AI will improve public safety by fast-tracking information processing, or pose a threat through the malicious use of publicly collected information.
For some subcommittee members such as Redmond Police Chief Darrell Lowe, AI could help law enforcement identify suspects or pinpoint key information related to a criminal investigation.
“We can take 14 hours of the video having run through a program in about 45 minutes,” he told colleagues at an Aug. 23 meeting. “But I don’t have a person sitting there watching 14 hours worth a video even at, you know, speed and a half, double speed. That’s where I think some of the benefits are.”
However, he emphasized that AI handling of facial recognition software data is not going to lead to a Minority Report form of enforcement. “A lot of people have that as an anchor point and think, ‘Oh, well, we know we’re going to arrest you because the machine said you did something [or] are going to do something 30 years from now.’”
Another type of data that could be handled by AI is that collected by automatic license plate readers. According to Lowe, the city of Redmond is considering their use.
He described them as “an additional tool to help with my enforcement to keep community safe.”
Yet he noted, “there’s a big concern around privacy and privacy rights, etc. And that is a valid concern, but I think what’s missing from the conversation is…that there is no expectation of privacy in public. If you’re driving on a public street, there is no expectation of privacy. People are being recorded all the time. I would submit that from a government perspective, at least my experience in Washington, has been such that the guardrails are very narrow and very clearly defined. We want to have safe communities and technology is the way to do it, because that is a less expensive way to go about it.”
For officers worried their positions will be replaced by patrolling AI robots, Lowe remarked, “human police officers are not going to go away. AI will just help us do our jobs more efficiently and better, which ultimately benefits us all.”
Task force member Sen. Matt Boehnke, R-Kennewick, said that “we need to at least notify constituents that their faces are being used if it’s in a public space versus private, at least notify…that is being scanned or used in a facial recognition software like Clear or like a license plate [reader] or any of those types that you’d mentioned. That concerns me a lot about how the models are being trained.”
Within the private sector, AI could be used by businesses to track and identify serial offenders who rob from their establishments. Washington Retail Association State & Local Government Affairs Manager Crystal Leatherman, who co-chairs the task force subcommittee, told colleagues that “there is a very practical use to AI. When you have a private company or a private store that has the ability to capture the images of the people that are coming in, wreaking havoc economically and criminally on these companies, it would be very beneficial to then know…this individual went into a store and rolled out (a) shopping cart full of things and you have that person’s face. That is very helpful.”
She added that if the same serial offender goes into another area and tries to rob from another store, but “it’s within that same store chain, then they know that this person has been responsible for this type of theft. They can then start to watch, perhaps call local law enforcement ahead of time.”
Others on the subcommittee expressed fears over other uses of AI, such as the creation of deepfake photos and videos, which could potentially be used to fabricate criminal evidence.
Magda Balazinska is a member of the Database, Data Science, and UW Reality Lab groups at the University of Washington and the co-founder of the Northwest Database Society (NWDS). She told the subcommittee that “AI has now advanced to the point that it’s possible to generate videos of someone, perhaps someone famous or perhaps someone who’s not famous at all, someone that we happen to know. And those videos can be very realistic and can be very difficult to distinguish.”
“I think this is a challenge both for minors, but I think also people above the age of 18 can also be harmed – not in the same way, of course, but in the way that deepfakes can be used to attack, can be used to harass, can be used to mislead.”