Microsoft Bans US Police Departments From Using Enterprise AI Tool

An anonymous reader quotes a report from TechCrunch: Microsoft has changed its policy to ban U.S. police departments from using generative AI through the Azure OpenAI Service, the company's fully managed, enterprise-focused wrapper around OpenAI technologies. Language added Wednesday to the terms of service for Azure OpenAI Service prohibits integrations with Azure OpenAI Service from being used "by or for" police departments in the U.S., including integrations with OpenAI's text- and speech-analyzing models. A separate new bullet point covers "any law enforcement globally," and explicitly bars the use of "real-time facial recognition technology" on mobile cameras, like body cameras and dashcams, to attempt to identify a person in "uncontrolled, in-the-wild" environments. [...]

The new terms leave wiggle room for Microsoft. The complete ban on Azure OpenAI Service usage pertains only to U.S., not international, police. And it doesn't cover facial recognition performed with stationary cameras in controlled environments, like a back office (although the terms prohibit any use of facial recognition by U.S. police). That tracks with Microsoft's and close partner OpenAI's recent approach to AI-related law enforcement and defense contracts.
Last week, Taser maker Axon announced a new tool that uses AI built on OpenAI's GPT-4 Turbo model to transcribe audio from body cameras and automatically turn it into a police report. It's unclear whether Microsoft's updated policy is a response to Axon's product launch.

Comments:
  • Are they going to ban license plate reading/lookup too if it's run through Azure? Traffic cameras? This is outside Microsoft's decision making IMHO. They either accept money for services rendered or they don't. Discriminating against US law enforcement should not be an option and should actually be illegal (again IMHO). We're supposed to trust the government. If that's an issue, then fix the root of the problem. Disallowing police from using technologies is not the right fix.

    • " This is outside Microsoft's decision making IMHO"

      It isn't.

      Dollars to doughnuts that this is however just a ploy to get more money.

      • Did you miss the 'IMHO' part? Tell me why a public company gets to decide which government agencies they sell products to, while enjoying the security of said government agencies. They are being hostile to the US police force and I won't stand for it...

        • Well, Googlers don't want Google selling products to the Department of Defense in the USA... do you support them in that decision as well?
          • Of course not. Seriously, anyone who's not a citizen with that stance should feel free to GTFO. Employees who are citizens with that stance should not have any say at all and again, are free to GTFO of that company. Activists inside a company should not get to drive the boat.

            Really though? Not selling to US Police Departments is a logical and valid stance now?

            • some time back there was this movie about some lady who owned a shop in NYC and she was in trouble with the IRS for only paying them the taxes that she felt supported the things that she wanted her taxes to support, hence (for example) she did not pay that part of her taxes that she felt would support the DoD. It appeared that lots of US citizens felt that that was a great movie... I wish I could recall the name... anyway, I was not happy with that movie for multiple reasons but one of which was that one do
              • some time back there was this movie about some lady who owned a shop in NYC and she was in trouble with the IRS for only paying them the taxes that she felt supported the things that she wanted her taxes to support, hence (for example) she did not pay that part of her taxes that she felt would support the DoD.

                Seems like the IRS reply to that is simple- "ok, your $x in taxes will be directed towards the programs you support, and a matching amount of other people's taxes will be redirected from those program

        • Hmm... while I'm cautious about AI usage by police and govt against the people... in this case, their discrimination against the US vs everyone else is quite suspicious.

          Microsoft gets a LOT of money from the US at all levels of govt... I'd be curious to see how long that continues.

        • Did you miss the 'IMHO' part? Tell me why a public company gets to decide which government agencies they sell products to, while enjoying the security of said government agencies. They are being hostile to the US police force and I won't stand for it...

          Well, someone being hostile to the police is a nice change of pace from the police being hostile towards citizens. I'll give 'em a pass on this one.

        • Tell me why a public company gets to decide which government agencies they sell products to

          That isn't what's happening. They're deciding what purposes the software is fit for. They aren't doing anything to actually prevent them from doing it (they're not Apple), but they are stating that doing so is a violation of the license, and therefore they can wash their hands of the liability.

        • by Anonymous Coward

          Ah. "IMHO" now means "You're not allowed to disagree with me". Riiiiiight...

          Yeah, you sound like one of those "Blue Lives Matter" proto-fascists.

      • " This is outside Microsoft's decision making IMHO"

        It isn't.

        Dollars to doughnuts that this is however just a ploy to get more money.

        My first thought at the headline was "They're developing some licensing scheme for law enforcement that will be *MUCH* more expensive." Governments are easy to sway into throwing massive piles of money at tech gadgets that might, maybe, sometimes, make policing easier. This has to be a lead-up to an announcement about special law-enforcement-only access that's exactly the same except for the licensing.

        • My first thought at the headline was "They're developing some licensing scheme for law enforcement that will be *MUCH* more expensive."

          Agreed. That was my first thought too. My second thought was that they're saying, "Don't use the public interface! It logs everything. Here, use this special LEO interface that doesn't leave a paper trail."

      • by EvilSS ( 557649 )
        I don't think it is. Microsoft has been super paranoid when it comes to possible blowback from how companies use their AI services. They add a ton of extra content moderation to the Azure OpenAI services above what OpenAI does, and have a huge list of restrictions on what kinds of apps you can build and on who those apps are used by (internal vs external users). I was playing around with the Azure Dall-E model and it bounced one of my prompts for violating their moderation restrictions. That prompt: A red car. (Sketch below.)
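
        For anyone curious, here's a rough, hypothetical sketch of what that kind of rejection can look like from the client side, assuming the openai>=1.x Python SDK and its AzureOpenAI client; the endpoint, key, API version, and deployment name are placeholders, not real values:

            # hypothetical sketch -- placeholders throughout, not a real configuration
            from openai import AzureOpenAI, BadRequestError

            client = AzureOpenAI(
                azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder
                api_key="YOUR_KEY",                                       # placeholder
                api_version="2024-02-01",                                 # assumed API version
            )

            try:
                result = client.images.generate(
                    model="dall-e-3",   # name of your image deployment (assumption)
                    prompt="A red car",
                    n=1,
                )
                print(result.data[0].url)
            except BadRequestError as err:
                # Azure's added content-moderation layer refuses filtered prompts with
                # a 400-class error; the exact payload varies, so just surface it here
                print("Prompt rejected by the content filter:", err)
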
    • by taustin ( 171655 ) on Thursday May 02, 2024 @04:37PM (#64443218)

      Given the (in)accuracy of things like facial recognition, and the inability of police to use it correctly, I'd think a ban on police use of it should properly come from the higher-ups in the department, but maybe Microsoft is trying to avoid legal liability for the kind of misuse that we see so often.

      • If the police themselves make that decision (or even up higher in government) that's fine. A US company deciding to BAN a service for US Law enforcement is not OK. That's not their call to make.

        • by taustin ( 171655 )

          Seems to me, forcing a company to do business with someone they don't want to do business with would violate the 13th amendment.

          But out them for it, sure, and let them eat the consequences. (And if the consequences aren't negative, then the company isn't the problem, the government is.)

        • by kmoser ( 1469707 )
          The police tend to do what they want, without repercussions, so it's a moot point anyway.
        • What if a white supremacist group wanted to buy the software to identify opponents? Would you want Microsoft to have the right to refuse to sell to them? IMO they should have the right to refuse service to whomever they choose.
      • Given the (in)accuracy of things like facial recognition, and the inability of police to use it correctly, I'd think a ban on police use of it should properly come from the higher-ups in the department, but maybe Microsoft is trying to avoid legal liability for the kind of misuse that we see so often.

        Exactly. This is Microsoft's way of saying this technology is not fit for these particular purposes. LEOs will probably still use it regardless, but at least the courts will know that they were warned and broke the AUP/TOS anyway when it inevitably comes up in trials, hopefully making it a net liability for evidence purposes.

      • Think of the PR nightmare when someone is misidentified and the cops go George Floyd on them. A significant portion of the populace will blame Microsoft for the death. Or when our overzealous prosecutors ram through a conviction and an innocent person spends years in prison as a result. If I were in that position at Microsoft I would refuse to sell to them as well.
    • by tlhIngan ( 30335 )

      Are they going to ban license plate reading/lookup too if it's run through Azure? Traffic cameras? This is outside Microsoft's decision making IMHO. They either accept money for services rendered or they don't. Discriminating against US law enforcement should not be an option and should actually be illegal (again IMHO). We're supposed to trust the government. If that's an issue, then fix the root of the problem. Disallowing police from using technologies is not the right fix.

      That's probably it. First, using

    • We're supposed to trust the government.

      We would, if our government were trustworthy.

      This is outside Microsoft's decision making IMHO.

      It most certainly is within the scope of any company's rights to refuse service to anyone the company finds unconscionable, including (and especially) U.S. law enforcement. The latter has no legal authority to compel service from any company.

    • by sjames ( 1099 )

      AI isn't really appropriate for the way the police would likely use it, and that's a potential liability for MS if they don't forbid it.

      The last thing they need is for the AI to 'creatively' add in some details resulting in an innocent person's unreasonable ordeal at the hands of the police. People who suffer that tend to want to sue, for understandable reasons. If the police claim immunity and a judge/patsy buys that, that leaves the software vendor as a fat target.

      • This is exactly it. The direction for them to include such statements in their TOS came directly from Legal for sure.

    • >> Disallowing police from using technologies is not the right fix.

      Seems like a very prudent fix to me. MS "reaffirmed its ban on U.S. police departments from using generative AI for facial recognition through Azure OpenAI Service, the company’s fully managed, enterprise-focused wrapper around OpenAI tech." Excellent, I don't want any agency with the power of the police to have or be using that tech.

      'explicitly bars the use of “real-time facial recognition technology” on mobile camer

    • Because it's not useful. Everything an AI outputs is generated. It looks useful, but if it hallucinates even once it destroys a life.
    • They either accept money for services rendered or they don't.

      That's for MS to dictate, not you. It's their product, which they license, and they are entitled by law to change the TOS.

    • IMO, in a free society the owner (Microsoft) is free to associate with whomever they choose, and cannot be forced into associating with entities they find undesirable. Yes, there are a whole lot of bad things that can happen there, but overall it's a positive.
  • The term "real-time facial recognition technology" is trivial to side step. A random delay between 1-3 seconds of the video feed will probably make it non-real-time by most software definitions.
    • I'm not sure there is an actual "software definition." It's more of a marketing term, since every computer process of any kind requires some processing time.

      • by gweihir ( 88907 )

        There is an engineering definition: It just means "answer within a specified maximum response time". Note that real-time does not mean "fast" or "right now" or anything like that. Hence no, it cannot be sidestepped that easily.

        • Interesting, I've been an engineer for 35 years, and I've never seen this definition. Can you point to a source?

    • by gweihir ( 88907 )

      You need to look up what "real-time" means. Hint: It does not mean "immediately" or "fast".

      • Guaranteed within a certain amount of time. However, if you try to pitch a "real time instrument cluster" to NHTSA as "the speedometer is guaranteed to show accurate speed within 100 years of the car being at that speed", they will fail you on their "real time speed display" requirement. (Sketch below.)
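
        A toy sketch of that deadline-based sense of "real-time" (the 0.5 s deadline and the identify_face() stub are made-up placeholders, not anything from Microsoft's terms or any real product):

            import time

            DEADLINE_SECONDS = 0.5  # assumed maximum allowed response time

            def identify_face(frame):
                # stand-in for any recognition model; just simulates processing time
                time.sleep(0.2)
                return "match"

            def real_time_identify(frame):
                start = time.monotonic()
                result = identify_face(frame)
                if time.monotonic() - start > DEADLINE_SECONDS:
                    # deadline missed: under this definition the system is not
                    # real-time, even though an answer eventually arrived
                    return None
                return result

            print(real_time_identify(frame=None))  # "match" while under the deadline

        Under that definition, whether a 1-3 second delay takes a system out of "real-time" depends entirely on the specified deadline, not on the delay itself.
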
  • Don't paint all police forces with the American brush. They're not all terrible. In some countries there's a reasonable balance, and I can see some bulk data services being useful. In some countries the police are actually trying to make the place better, not just to exert their control over others, as is often the way they are portrayed in American media.

    If someone were to do wrong, I would want them to be located; I don't think (here anyway) that the police are to be feared. Might actually be a loss

    • ", as is often the way they are portrayed in American media" Found your problem.

    • For the most part, the various levels of American law enforcement are reasonable, helpful, and rational. But that is not what gets you on the news.

      • (I agree with you on the seriousness of law enforcement and the media's preference for showcasing a few bad examples.) I have a serious question: what is the likelihood of police forces in the USA making use of face recognition informally, "just because we can"? From my perspective (non-USA) I expect it should require legislation, because they'll have to use the data in court anyway, and even parallel construction can be declared null in court if the original piece of information was not obtained through legal means

      • No.

        Reasonable, helpful, and rational are not words anyone should use to describe American cops. They're out of control thugs.

      • Feel free to go to India, China, any country in Africa, or Eastern Europe, or even deal with the French, Spanish, and UK police forces. The only somewhat nicer ones are in the Nordic and Germanic-language countries, although they are getting more aggressive now that their societies are becoming less homogenous, and the Nordic and Germanic forces are more likely to infringe upon your rights through the legal process, as there is minimal to no defense against invalid charges.

  • The potential abuses of this real time facial recognition tech are obvious.

    Security cameras are easily placed and video from them recorded and stored indefinitely as it is. Police can have somewhat reasonable justifications for putting them out in places where there are a lot of pedestrians. And then there are the body cams on roving cops. Add facial recognition to that and there can easily be a database of everywhere you show your face in public. And then there's the cellphone location tracking which many

  • I would love to see it used by the feds to catch fraud and waste.

  • Please, there are plenty of other companies, cough, defense contractors, cough cough, who will step up and offer AI services to police agencies worldwide.
