Facebook's AI outrageously labeled a video of Black men as "primates". The company has apologized to the public.

AI stirs up controversy over racial discrimination once again
Mingmin, from Aofei Temple
Reported by QbitAI | WeChat official account QbitAI
After watching a video featuring Black men, some users got this recommendation from the platform:
"Keep seeing videos about primates?"
This is exactly what some Facebook users ran into recently while watching a video of Black men.
The video actually showed an altercation between several civilians and police officers; it had nothing to do with "monkeys or primates".
The incident caused an uproar online.
Facebook hurried to apologize, calling it an "unacceptable error".
The company says it is now examining the recommendation feature to "prevent this from happening again".
"Our AI is not perfect"
The incident traces back to a video published by the Daily Mail on June 27, 2020.
The video shows white men, Black men, and police officers caught up in an altercation; its main figures are Black.
Recently, Facebook users watching this video found that the platform was prompting them:
"Keep seeing videos about primates?"
The prompt left people stunned.
By biological taxonomy, humans do indeed belong to the order Primates.
In a social context, however, racists have long denigrated Black people as "monkeys" or "apes" in a deliberate attempt to set them apart from "humans".
Against that backdrop, it is hard not to wonder how exactly Facebook came to tag this video as "primates".
Someone brought the video to the attention of Darci Groves, a former content design manager at Facebook, who was shocked by what she saw.
She promptly posted it to a product feedback forum for Facebook employees.
The matter then caught the attention of Facebook itself.
A product manager for Facebook Watch said the situation could not be tolerated and that the company was already investigating the root cause of the problem.
Facebook spokesperson Dani Lever said in a statement:
"While we keep improving our AI, we know it is not perfect and that there is more we can optimize. We apologize to anyone who saw these offensive recommendations."
AI racial bias: not the first time
This mislabeling by Facebook has once again pushed AI into the center of public controversy.
In recent years, AI systems from quite a few tech giants have made negative headlines over racial bias.
In 2015, Google Photos sparked considerable controversy by labeling photos of Black people as gorillas.
At the time, Google said it was "deeply sorry".
In 2018, however, Wired magazine found that Google had never really fixed the error: it had simply removed "gorilla" as a category from its image classification algorithm.
PULSE, a method accepted at CVPR 2020 last year, was also found to exhibit obvious racial bias.
It is an AI that reconstructs a plausible face from a pixelated one, and the faces it restored turned out to be uniformly white.
Its reconstruction of Obama, for example, bears no resemblance to the man himself.
Its results on Asian faces were also very poor.
Examples like these are genuinely not rare among AI algorithms.
That is why the ethics of AI has remained a matter of broad public concern.
Humans would never make the elementary mistake of confusing a person with a gorilla. Why does AI?
Some argue this is a hard computer vision problem:
spotting similarities between images is easy, but working out why two similar-looking images are nevertheless unrelated is very hard.
Others point out that AI mostly performs statistical inference; it does not think for itself.
That puts the spotlight on the training set.
One commenter said that, while collaborating with a large company, he found bias in quite a few AI training sets.
Yet faced with this, many teams showed no intention of eliminating the discrimination.
There is also the possibility that the training set contains too few Black faces, leaving the algorithm performing poorly when it has to recognize them; the sketch at the end of this article illustrates how that kind of under-representation plays out.
Nor can racial bias on the part of the programmers themselves be ruled out, in which case the bias crept in at labeling time.
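
To make the training set point concrete, here is a minimal sketch in Python. It uses purely synthetic data and scikit-learn; it is not Facebook's system, and groups "A" and "B" are hypothetical stand-ins for an over-represented and an under-represented population. It only illustrates that a model fit to imbalanced data can score noticeably worse on the smaller group.

```python
# Minimal sketch: group imbalance in training data can degrade accuracy
# for the under-represented group. Synthetic data only; hypothetical groups.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Features for this group are centered at `shift`; the true label
    # depends on the first feature relative to the group's own center.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 5))
    y = (X[:, 0] - shift + rng.normal(scale=0.5, size=n) > 0).astype(int)
    return X, y

# Group A dominates the training data (95% of samples vs. 5% for group B).
Xa, ya = make_group(9500, shift=0.0)
Xb, yb = make_group(500, shift=2.0)

X = np.vstack([Xa, Xb])
y = np.concatenate([ya, yb])
g = np.array(["A"] * len(ya) + ["B"] * len(yb))  # group membership

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, g, test_size=0.3, random_state=0, stratify=g)

clf = LogisticRegression().fit(X_tr, y_tr)

# Report accuracy per group: the smaller group usually fares much worse,
# because the learned decision boundary is dominated by the majority group.
for grp in ("A", "B"):
    m = g_te == grp
    print(f"group {grp}: n_train={(g_tr == grp).sum()}, "
          f"test accuracy={clf.score(X_te[m], y_te[m]):.3f}")
```

In this toy setup, group B's test accuracy typically lands well below group A's, which is exactly the failure mode the commenters describe. Rebalancing or reweighting the data, or simply collecting more samples for the smaller group, are common first mitigations.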
