For those just catching up, Azure OpenAI Service is Vole's managed, enterprise-focused wrapper around OpenAI's models. As of Wednesday, the updated terms of service are explicit: no Azure OpenAI Service for US police work, and that includes OpenAI's text and speech analysis models.
Vole has also thrown in a rule that's got global law enforcement in its sights, specifically saying a big no-no to "real-time facial recognition technology" on mobile cameras, like body cams or dashcams, when it is used to try to identify someone out in the wild.
However, Microsoft has left itself some wiggle room. The blanket ban on Azure OpenAI Service applies only to US law enforcement; international police forces are not covered. Nor does the mobile camera rule touch facial recognition done with stationary cameras in controlled environments, such as a back office, although US police remain barred from any facial recognition use under the terms.
This move is pretty much in line with Microsoft's and its buddy OpenAI's recent approach to AI deals with the folks in uniform.
Just last week, Axon, the Taser folks, dropped news about their latest gizmo: an AI tool, built on OpenAI's GPT-4 Turbo, that takes body cam audio and whips up a draft police report on the fly. It is unclear whether Axon accesses GPT-4 Turbo through Azure OpenAI Service, and so whether Microsoft's move changes any of that.