© 2018 by Sohaila Mosbeh.


The AI Fad

Class 03 - AI Now's Discriminating Systems and OpenAI


I wanted to write about both articles together since they both seem to hit the core issues: Who creates the AI? How accessible is it to users? And what's the environment like in the spaces that create these technologies?


What I gained from these articles is that one cannot create something new if it's still built upon a corrupt foundation. If discrimination in AI exists, it's because the main issue was never addressed in the first place; we ignore it and keep building upon it, then act very offended or surprised at how the tools behave. If sexism exists in the workplaces of the actual corporations that develop these technologies, then of course it will seep its way into the tools we use. "...these patterns of discrimination and exclusion reverberate well beyond the workplace into the wider world." I would argue the opposite: because of these preconceived notions and already existing racism and sexism, discrimination reverberates its way into the work environment, which can then result in conceptually corrupt tech tools that we put back out into the world, and from there the cycle continues. It's like trying to cure a disease by treating only the symptoms and not going beyond that. So at this point, how can AI act as a non-discriminatory tool if the environment it grows up in is corrupt to begin with?


"...With the same targeted audience and without the advertisers intending or being aware, ads are delivered in a manner that aligns with gender and racial stereotypes." Something within this phrasing did not sit right with me: advertisers are fully aware of how their ads are being used and distributed. Case in point, Instagram ads. They are distributed by targeting the gender the advertiser wants, along with the age group and even location, so the claim that advertisers are unaware or not intending this is very unbelievable and almost fairy-tale-like. Advertisers are the perfect example of "it's not the tool's fault, it's the way it's created and used." There is always the option of opting out of these labeling mechanisms when spreading the word about a product or service, but if that doesn't serve the quick and easy way of making more money, why would they even consider it?


"Amodei explains...AI systems safe. This includes making sure that they reflect human values, can explain the logic behind their decisions, and can learn without harming people in the process." This is a perfect example of someone's personal biases embedded in technology. It's such a moral and philosophical dilemma: we are humans and thus forever flawed. If there is ever a solution, it is maybe to accept that but keep trying to fix what is wrong and talk about it openly. It's not a one-time fix; it's constant upkeep and updating.


“But to me, it feels like they are doing something a little bit right,” Rhodes says. “I got a sense that the folks there are earnestly trying.”



Invisible Women:

Upon researching more under the broad topic of women's rights and place in society these days, I stumbled on a very interesting topic in an episode called "Invisible Women" from a podcast I listen to, 99% Invisible. In the episode, the host Roman Mars interviews Caroline Criado Perez, who wrote a book called Invisible Women: Data Bias in a World Designed for Men. She opens up the discussion by giving an example of snow plowing patterns in a small town in Sweden. The town's approach was to clear major roads first, then smaller local streets, but researchers discovered that men's and women's driving patterns were different. Men mostly drove to and from work, while women had more complex travel patterns from running errands and family-related trips. So when the city council got hold of these results, they changed the order in which the streets were plowed, and in doing so it resulted in fewer ER cases at the hospital. Perez highlights that of course the original approach wasn't harmful on purpose; it was just the lack of research and data collection on women that resulted in this, despite good intentions.


"At the root of all these design problems which don't take women into account is the fallacy of the default male. The idea that when we picture a default human when we design something, we picture a man."

This really struck a chord with me: how true it is when I think about it, and how very strange. How did it come to be this way? Is it the cultural background I grew up in? What will happen if we take this into consideration? How beneficial can it be for everyone, not just women? Educating others on this topic and bringing awareness to it seems to be the solution. I agree with Perez that conducting a lot of research on women and collecting data is the first step to achieve that, and the second, I would add, is to actually ask women and design to their needs, not presume an existing design will just do because it already works for one type of person. "Of course, when these designs aren't adopted, the women are sometimes blamed for their failure to adapt to the design."