Racial algorithms and the thinking of coloniality as a barrier to human rights: A new form of structuring racial discriminations

Cited: 0
Authors
Machado Jaborandy, Clara Cardoso [1 ]
de Melo, Stephanny Resende [1 ]
Affiliations
[1] Univ Tiradentes Sergipe UNIT SE, Aracaju, SE, Brazil
Source
QUAESTIO IURIS | 2023 / Vol. 16 / No. 02
Keywords
Artificial Intelligence; Racial Algorithms; Coloniality; Discriminations; Human Rights;
DOI
10.12957/rqi.2023.65152
CLC Classification
D9 [Law]; DF [Law];
Subject Classification
0301 ;
Abstract
The use of new technologies and the hyperconnectivity between people created a need for faster data responses, bringing algorithms and artificial intelligence to prominence; problems arose, however, when racist biases appeared in system results and decisions. Since these systems are used in both the public and private spheres, including public security systems, it became necessary to debate discourses on the neutrality of software. The idea of coloniality is also linked to this problem, generating selectivity and social harm. The objective is to analyze why racial biases occur and injure human rights, and to verify how, and to what extent, coloniality thinking correlates with these practices. It concludes that racial algorithms are not neutral: they merely replicate what human beings trained them on, and therefore the worldview of their creators, making it necessary to regulate artificial intelligence and to democratize technologies and debates about ethnic diversity. The methodology adopted a qualitative, exploratory approach, with a methodological procedure of documentary and bibliographic research.
Pages: 635-657
Page count: 23