
Socio-technical approach is needed to mitigate bias in AI, NIST report argues

16 March 2023 | Hi-network.com

In a recently published report titled Towards a standard for identifying and managing bias in artificial intelligence, the US National Institute of Standards and Technology (NIST) argues that machine learning processes and data are not the only sources of bias in artificial intelligence (AI). While computational and statistical sources of AI bias are important, human and systemic biases are relevant as well. 'Systemic biases result from institutions operating in ways that disadvantage certain social groups, such as discriminating against individuals based on their race. Human biases can relate to how people use data to fill in missing information, such as a person's neighbourhood of residence influencing how likely authorities would consider the person to be a crime suspect.' The report argues in favour of a socio-technical approach to mitigating bias in AI, and introduces guidance for addressing three key challenges for bias mitigation: datasets, testing and evaluation, and human factors.

Tags: Artificial intelligence
