
Racial Bias In Digital Art

by Gill Bartels-Quansah (ARTE Intern)



Digital art is at the forefront of modern media. As technology expands, artists are finding new ways to tell stories that have traditionally been rendered with paper, pencil, paint, and photography, using tools like Adobe Photoshop, AutoDraw, and Runway ML. Beyond those tools, technology is also expanding into art made with 3D modeling.


One of these methods is computer-generated imagery, or CGI. Responsible for films like Toy Story and Star Wars, CGI has enabled realistic 3D depictions of humans and the world in film and television. CGI also allows technology to be combined with traditional elements of art (drawing, sketching, etc.), as seen in films like Spider-Man: Into the Spider-Verse.


However, as new forms of art and technology emerge, so does the implicit racial bias embedded in those technologies. To explore racial bias in CGI, we must first step back and look at how people of color have traditionally been depicted in older art forms.


Still of Miles Morales from the animated movie, "Spider-Man: Into the Spider-Verse."
Spider-Man: Into the Spider-Verse

The History


Mainstream media solely uplifted Eurocentric art well into the 20th century. Titus Kaphar speaks on this, particularly on how the painting techniques and subjects taught to most artists often erase stories of diverse people.


One of those techniques is called “skin glow.” “Skin glow” is a lighting and color style in film, photography, and animation meant to emulate white European skin. It originated in painting, where it is common in depictions of white subjects. In this style, light enters a translucent object, scatters beneath the surface, and exits in a different direction, which lets the features of the subject read clearly. This is evident in the painting Woman Holding a Balance, by Johannes Vermeer.


However, skin glow does not depict darker skin the same way. For Black skin, the lighting style “shine” (called specular reflection in CGI) depicts darker skin without washing out the person’s skin tone or the colors surrounding them. Shine is a mirror-like, reflective style in which light bounces off the surface and exits in a single direction. This is evident in paintings like Ibrahima Scaho II by Kehinde Wiley.

“Shine” in Ibrahima Scaho II (2017) by Kehinde Wiley; “Glow” in Woman Holding a Balance (1664) by Johannes Vermeer


Because mainstream media usually highlights white subjects, other styles like “shine” are overlooked when painting is taught. Combined with the lack of diversity in computer science and other STEM disciplines, the same racial biases present in traditional art forms have become encoded in the technologies we use today.


The Technology


Particularly in computer graphics, the concept of “skin glow” continues to dominate the algorithms designed to generate 3D depictions of humans. When CGI first came to market in the 1970s, the algorithms used to depict humans modeled only white skin. They used a technique called subsurface light transport, which reproduces the same light-scattering effect that “skin glow” paintings portray. In fact, if you Google “realistic cgi human,” it may take a while before you find a person of color.
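To make the “glow” versus “shine” distinction concrete in code, here is a minimal Python sketch. It uses the classic Lambertian (diffuse) and Phong (specular) shading terms as rough stand-ins for subsurface “glow” and mirror-like “shine”; real subsurface light transport models are far more involved, and the vectors and exponent below are arbitrary illustration values.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

# One shading point: surface normal, direction to the light, direction to the camera.
n = normalize(np.array([0.0, 0.0, 1.0]))
l = normalize(np.array([0.3, 0.2, 1.0]))
v = normalize(np.array([-0.2, 0.1, 1.0]))

# "Glow" stand-in: a Lambertian diffuse term. Scattered light leaves in all
# directions, so brightness depends only on how directly the light arrives,
# not on where the viewer stands.
glow = max(np.dot(n, l), 0.0)

# "Shine" stand-in: a Phong specular term. Mirror-like reflection sends light
# out in one direction, so brightness spikes only when that reflected
# direction lines up with the viewer.
r = 2.0 * np.dot(n, l) * n - l        # reflect the light direction about the normal
shine = max(np.dot(r, v), 0.0) ** 32  # the exponent controls how tight the highlight is

print(f"glow (diffuse) term:   {glow:.3f}")
print(f"shine (specular) term: {shine:.3f}")
```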


This bias doesn’t just extend to skin color, but to hair texture as well. There is a reason it has taken so long for Type 4 hair, a pattern of very tight curls, to be depicted in animation: the original hair-rendering algorithms could only render Type 1, or straight, hair.
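As a rough illustration of why straight-hair code doesn’t transfer, here is a small Python sketch that models a single strand as a helix; the parameters are invented for illustration, and real curl-pattern rendering is far more sophisticated.

```python
import numpy as np

def strand(curl_radius, turns, length, samples=2000):
    """Points along a hair strand modeled as a simple helix.

    curl_radius=0 gives a straight strand (Type 1); a small radius with
    many turns gestures toward a tight coil (Type 4).
    """
    t = np.linspace(0.0, 1.0, samples)
    angle = 2.0 * np.pi * turns * t
    return np.stack([curl_radius * np.cos(angle),
                     curl_radius * np.sin(angle),
                     length * t], axis=1)

def arc_length(points):
    return np.linalg.norm(np.diff(points, axis=0), axis=1).sum()

straight = strand(curl_radius=0.0, turns=0, length=10.0)  # Type 1: a line
coily = strand(curl_radius=0.3, turns=40, length=10.0)    # tightly coiled

# A tight coil packs several times more fiber (and geometry to simulate)
# into the same root-to-tip distance, one reason straight-hair models break.
print(f"straight strand fiber length: {arc_length(straight):.1f}")
print(f"coily strand fiber length:    {arc_length(coily):.1f}")
```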


The Implication


The algorithmic racial bias in computer graphics doesn’t just affect film and media, but other forms of technology as well. Computing, by its nature, reuses existing code to speed up development, so a single biased algorithm can affect facial recognition, augmented reality, voice assistants like Alexa or Google Assistant, automatic sinks, and much more. That can put people in genuinely dangerous situations, for example when facial recognition misidentifies someone. This is why racial bias in algorithms is such a pressing matter.


Four mugshot photos labeled "high risk" or "low risk" with corresponding numbers and lists of prior and subsequent offenses.
From "COMPAS Software Results," Julia Angwin et al. (2016)

One example of the harm of racial bias in code is the risk assessment algorithm COMPAS, from the company Northpointe. Used in courtrooms, the algorithm scores how likely a defendant is to commit a future crime. In ProPublica’s Machine Bias investigation (Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner), it was found that the algorithm disproportionately rated Black defendants as likely to commit future crimes.


The photo above shows a few examples of what the algorithm determined. In the photo on the bottom left, Dylan Fugett (a white man), whose prior offense was attempted burglary, was rated low risk, whereas Bernard Parker (a Black man), whose prior was resisting arrest without violence, was rated high risk; both had been arrested for drug possession. Yet Fugett was arrested three more times for drug possession after the initial arrest, while Parker never offended again.


The study found that Black defendants were nearly twice as likely as white defendants to be falsely flagged as high risk. Risk assessment scores often carry weight in sentencing and parole decisions, so this disproportionate scoring leads to higher incarceration rates and longer parole for Black people. Conversely, it can allow people who go on to commit harm to face no consequences for their actions. In the long run, this inflates the incarceration rate of Black people and allows disproportionate sentencing for minor crimes to occur more often, leading to worsened mental health and unjust imprisonment.
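To make that statistic concrete, here is a minimal Python sketch of the false positive rate comparison at the heart of ProPublica’s analysis: among people who did not reoffend, what fraction was still labeled high risk in each group? The records below are invented for illustration, not ProPublica’s data.

```python
# Each record: (group, labeled_high_risk, reoffended). Invented numbers,
# chosen only to illustrate the two-to-one disparity ProPublica reported.
records = [
    ("black", True,  False), ("black", True,  False), ("black", False, False),
    ("black", True,  True),  ("black", False, True),
    ("white", True,  False), ("white", False, False), ("white", False, False),
    ("white", True,  True),  ("white", False, True),
]

def false_positive_rate(records, group):
    # Of the people in this group who did NOT reoffend, how many were
    # nevertheless labeled high risk?
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for group in ("black", "white"):
    print(f"{group}: false positive rate = {false_positive_rate(records, group):.2f}")
```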


The Solution


There are many ways to address the growing problem of racial bias in media and other forms of technology. Computer scientists like Timnit Gebru, Joy Buolamwini, and Deborah Raji are advocating for racial justice in computer science and AI. Newer techniques like deep learning are also being used to generate art and media in place of older ones. But most importantly, representation in both STEM and the visual arts can help solve issues of racial bias in media. If the people creating our art and our algorithms reflect the diversity of the world, we could see a world in which art, media, and technology reflect all people.



Colorful AI-created portrait in profile of a Black man with a beard and mustache in a shirt and tie.
Gasoline Dreams created by Nettrice Gaskins using Deep Dream Generator v.2











