| Downloads | Citations | Reads |
| 9,718 | 210 | 1,084 |
Abstract: The rapid rise of artificial intelligence has triggered an unimaginable revolution in human affairs, yet increasingly serious algorithmic discrimination has also provoked widespread concern and anxiety. AI algorithms are by no means a purely technical matter; the political, security, and ethical risks they carry cannot be ignored. As algorithms continue to reshape human production and ways of life, ensuring that they conform to human ethics and morality is becoming ever more important. In fact, algorithmic bias not only produces serious social consequences such as racial and gender discrimination; in key areas or specific situations it can also infringe on citizens' rights and freedoms, and even endanger their lives. Only through good governance of algorithmic bias can a sound environment be created for the healthy development of algorithms and the algorithm industry.
Basic information:
CLC number: D523; TP18
Citation:
[1] 汪怀君, 汝绪华. Artificial intelligence algorithmic discrimination and its governance [J]. 科学技术哲学研究 (Studies in Philosophy of Science and Technology), 2020, 37(02): 101-106.
Funding:
National Social Science Fund of China project "Ethical Research on Women's Symbolic Consumption from the Perspective of Ecofeminism" (16BZX107); Fundamental Research Funds for the Central Universities project "Research on Marx's Thought of the 'Double Liberation of Humanity and Nature'" (19CX04032B)