Melissa Harris-Perry: This is The Takeaway. I'm Melissa, and I am a Trekkie, and not just an OG Captain Kirk and Mr. Spock Trekkie. I also love Star Trek: The Next Generation. Now, my fave is Worf obviously, but I also have a bit of a soft spot for the android Data as well. Do you remember the episode where the issue of Data's rights was on trial?

?Data: Webster's 24th-century dictionary, fifth edition, defines an android as an automaton made to resemble a human being.

Melissa Harris-Perry: It's just a little pop culture moment, but it was also a prophetic glimpse into a future we now inhabit. Artificial intelligence is not a distant possibility. Data may not be our pale-faced Starfleet colleague, but artificial intelligence is here. It's living and working among us, making decisions about whether we can rent an apartment, buy a house, get credit, board a flight, or even wash our hands in a public restroom. Just like Data, the android from Star Trek, the artificial intelligence that uses or misuses our personal data to make critical decisions about our lives, well, it was made by us, by people. We introduced into this artificial intelligence all of the biases that have long characterized our social and political lives. Artificial intelligence is powerful, but it is not neutral.

Timnit Gebru: My name is Timnit Gebru, and I am the founder and executive director of DAIR.

Melissa Harris-Perry: DAIR is short for the Distributed AI Research Institute. You may remember first hearing about Timnit back in late 2020, when she left her job at Google in a high-profile departure. Timnit says she was fired from her job as co-lead of Google's Ethical AI team after she expressed concerns about the company's insufficient efforts to create meaningful racial diversity. She says the company asked her to retract a paper she co-authored about racial bias in artificial intelligence. Now, late last year, Timnit made headlines again when she announced the formation of a new research institute, Distributed AI Research, or DAIR.

Timnit Gebru: It is basically just supposed to be an AI research institute. If we were to do research on artificial intelligence, how would we do that research in a way that's not extractive, that is not exploitative, and in a way that hopefully can be more beneficial to people at the margins? That's not what's currently happening: this research is being done in a way that's very extractive and exploitative. The way in which not just commercialized products but research itself is produced is actually harmful to many people around the world.

Melissa Harris-Perry: She shared some examples of the methodological biases in AI research.

Timnit Gebru: There are many examples of how the way research in AI is done is harmful to marginalized communities. If you think of the current paradigm, for instance, it is based on large datasets and large compute power, on datasets that are scraped, and if you look, many parts of that pipeline are harmful.