Thesis advisor: Jessica Marshall
May 21, 2019
What does the future feel like when we are judged and discriminated against by machines? What happens when we recognize the discrimination but do nothing to correct it? If people cast judgment on me based on how I look and what I wear, how would a machine treat me?
DAIQUAN is an installation that emulates the possible dangers of machine learning systems fed biased information. In recent years, machines have become more connected and smarter thanks to advances in data science and neural networks. But they are also known to discriminate against minorities because of the biased datasets they are trained on. The goal of this project is to highlight how human bias in datasets shapes the future of artificial intelligence, and to stress why that bias matters as these systems are deployed.