Web Exclusives: Careers
Using Computers to Combat Unconscious Bias
Mary (Molly) Carnes, a physician and researcher at the University of Wisconsin-Madison, is developing an interactive, avatar-based computer program to help academic faculty recognize and reduce their unconscious biases with the goal of increasing diversity in the sciences. The project is one of six funded by the NIH Director's ARRA-Funded Pathfinder Award to Promote Diversity in the Scientific Workforce.
I recently talked to her about her new project and its goals.
What does your research focus on?
I've been working on ways to change the cultural norms of academic institutions, particularly the unconscious assumptions and implicit biases we all have about groups of people. Even against our own conscious intention, these assumptions are a major, if not the major, impediment to achieving the kind of workforce diversity we want in science, technology, engineering and mathematics, or STEM.
You're particularly interested in working with faculty. Why?
When you want to change the cultural norms of an institution, you have to change the attitudes and behaviors of the individuals within it. In academic STEM, the faculty are the drivers of change. They're teaching and mentoring the students, and they're establishing the norms in the laboratory. If you start with the faculty, other people will follow.
Why did you decide to design a computer program to address the problem?
The current methods we're using are just not working. Right now, we give people data. But we clearly know that doesn't work—otherwise nobody would be smoking, nobody would be overweight and everybody would be exercising.
Our brainstorming group came up with the idea of a game-like interactive computer program as another tool we can use to help faculty recognize what these implicit biases look like and give them evidence-based strategies that we know help reduce implicit bias.
How will you convince people to participate?
People who study popular games like The Sims and Civilization know there has to be some challenge that is authentic and meaningful to the game player. There has to be feedback, you have to pique their curiosity, you have to give them enough information but not so much that they find it boring. Part of our challenge is to think: What kind of situations would engage faculty?
We don't know yet, because we haven't started storyboarding. But one thing we're thinking about is engaging faculty in how to become a more effective teacher. You could choose a computer avatar teacher who is a small Asian woman and observe how students behave toward you, then compare that with how they behave when your avatar is a Caucasian man or an African-American man. So you would really step into the shoes of that person. This technique is called perspective-taking, and it's known to be a really effective means of reducing bias.
Can a computer program really help people reduce their unconscious bias?
We know from evidence-based research and social psychology that certain things reduce implicit bias. One example is perspective-taking. A game-based format is ideal for that.
Another example is called counter-stereotype imaging. That means seeing positive exemplars from a stigmatized group. You could easily populate a program with positive exemplars.
Another is individuating. That means getting more individual information about a person from a stigmatized group to prevent stereotyped assumptions. When interviewers and other evaluators let stereotypes fill in the blanks, it works against women, ethnic and racial minorities, and people with disabilities in STEM.
How would that work in an avatar-based format?
You might be interviewing someone, and we would provide the opportunity for you to see how your mind makes assumptions based on partial information. The program could give immediate and informative feedback and allow a redo. Allowing people to redo something in such a safe environment is a very effective way of teaching.
Say one of the options is to interview someone you're trying to recruit. You could go in first as the interviewer and make bias errors and fix them. Then you could go in as the person being interviewed and experience what that person is feeling in response to the interviewer's mistakes.
What do you ultimately hope the program will achieve?
I hope it increases diversity so that the scientific workforce and leadership across the board reflect the demographics of our country. Until that happens, we're really not taking full advantage of all perspectives and all talent.
Do you plan to expand the project beyond UW-Madison?
Once it's developed, we hope the program will prove effective and can be disseminated to other institutions. But even if we only affect UW-Madison, which has more than 1,500 STEM faculty and graduates 500 STEM Ph.D.s every year, the overall impact would be considerable.