From a technical perspective, there is a straightforward answer: we can often trace issues back to the data used to build the algorithm. Developing a system to diagnose a medical condition or identify voice commands takes a large amount of labeled data to train the algorithm. Marginalized people and problems are regularly left out of industry-standard datasets, leading to measurably worse algorithmic performance compared to privileged groups.

But a purely technical answer hides the social root of the problem. Bias in existing datasets, and the lack of other critical datasets, is at its core about who has the power and resources to envision and build a better future. In a recent article for Nature, AI researcher Pratyusha Kalluri sums up the problem: it is not uncommon now for AI experts to ask whether an AI is 'fair' and 'for good'. But 'fair' and 'good' are infinitely spacious words that any AI system can be squeezed into. The question to pose is a deeper one: how is AI shifting power?

As a practical example, the Food and Agriculture Organization estimates that 800 million people, or 78 percent of the world's poorest, are harmed by agriculture data gaps. Without locally representative data, it is difficult for engineers to build AI tools to help farmers in their communities plant and manage their crops. This in turn contributes to low productivity and unstable incomes. There is the potential to build tools that shift power in favor of marginalized people, but without access to the resources to build training datasets, this is difficult.