Popcorn Hack #1:
Provide an example of a movie, TV show, video game, or software that demonstrates bias and specify who is affected by it. Explain a potential cause of this bias.
An example of bias in software is facial recognition technology, such as early versions of Amazon Rekognition and other AI-based facial recognition systems. Studies have shown that these systems often exhibit racial bias, misidentifying people of color—especially Black individuals and women—at significantly higher rates than white men. A potential cause of this bias is a lack of diversity in the training data used to develop these AI models. If the datasets consist primarily of images of lighter-skinned individuals, the system becomes less accurate at recognizing faces from underrepresented groups. Additionally, biases in the developers’ perspectives and testing processes may contribute to the issue, as historically marginalized groups may not be adequately considered during development.
Popcorn Hack #2:
Think about a time when you felt a technology didn’t work well for you. What was the issue, and how did it make you feel? Write a short paragraph describing the experience and suggest one way the technology could be improved to be more inclusive.
I once had trouble unlocking my laptop using facial recognition while wearing my glasses. The system worked fine when I wasn’t wearing them, but it struggled to recognize me with them, forcing me to enter my passcode manually. I rely on my glasses daily, and it made the feature feel unreliable for people who wear them regularly. To make facial recognition more inclusive, the technology could be trained on a wider range of facial variations, including different lighting conditions, glasses, and other accessories, ensuring it works consistently for all users.
Popcorn Hack #3:
Imagine you’re designing a fitness tracking app. How could bias sneak into your app’s recommendations or performance evaluations? Think about users with different physical abilities, ages, or health conditions. What features could you add to ensure the app is fair and inclusive for all users?
Many calorie-tracking apps account only for height and weight when determining the best fitness plan for users. However, apps could incorporate BMI, health conditions, dietary restrictions, or different body compositions in their analysis to provide a more accurate and inclusive option. If the app assumes all users share the same baseline fitness level, it may overestimate progress for someone with a chronic illness or recovery limitations, leading to demotivation. Age is another factor that is often overlooked: if calorie goals and activity targets are designed for younger adults, they may be unrealistic for older users with different metabolic rates and physical capabilities. Users should be able to set personalized fitness targets based on ability, age, and health conditions. Instead of one-size-fits-all goals, the app should use machine learning to adapt to individual progress, ensuring users are evaluated against their own growth rather than broad averages.
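To make the idea concrete, here is a minimal sketch of a calorie target that adjusts for age, sex, activity level, and health conditions rather than a single fixed goal. The basal-rate formula is the standard Mifflin-St Jeor equation; the condition modifiers and their values are hypothetical placeholders for illustration, not medical guidance.

```python
# Sketch: personalized daily calorie target instead of a one-size-fits-all goal.
# Condition adjustment values below are illustrative assumptions, not medical advice.

def bmr_mifflin_st_jeor(weight_kg, height_cm, age, sex):
    """Basal metabolic rate (kcal/day) via the Mifflin-St Jeor equation."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age
    return base + 5 if sex == "male" else base - 161

# Standard activity multipliers; condition scaling factors are hypothetical.
ACTIVITY = {"sedentary": 1.2, "light": 1.375, "moderate": 1.55, "active": 1.725}
CONDITION_ADJUST = {"recovering_from_injury": 0.9, "chronic_fatigue": 0.85}

def daily_target(weight_kg, height_cm, age, sex, activity, conditions=()):
    """Calorie goal scaled for activity level and any limiting conditions."""
    target = bmr_mifflin_st_jeor(weight_kg, height_cm, age, sex) * ACTIVITY[activity]
    for c in conditions:
        target *= CONDITION_ADJUST.get(c, 1.0)  # scale goals down, not one fixed bar
    return round(target)
```

Because age and condition feed directly into the goal, an older user or someone in recovery is measured against a target that fits them, which is the fairness point the paragraph above makes.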
Homework Hack #1:
Choose a digital tool, app, or website you use regularly. It could be a social media platform, a shopping site, or a streaming service.
Identify Potential Bias: Are there any patterns in the recommendations or interactions that might suggest bias? Does the platform cater well to different user groups (e.g., age, gender, language, accessibility)? Analyze the Cause: What might be causing this bias? Consider data collection, algorithm design, or lack of diverse testing. Propose a Solution: Suggest one way the developers could reduce bias and make the platform more inclusive.
Pinterest is an online platform that allows users to post pictures to build a personalized aesthetic on their profile. However, its recommendation algorithm often reinforces certain aesthetic and cultural norms, which leads to bias in content suggestions. For example, the platform frequently recommends female-centric content such as fashion, beauty, and home decor, making it harder for users looking for more traditionally male-oriented content.
Analyze the cause: If the majority of users creating and saving pins fit a certain demographic, the platform naturally prioritizes their preferences. Furthermore, if the training data primarily consists of a certain age, gender, or cultural group, the algorithm may fail to reflect broader diversity.
Solution: Pinterest should actively diversify its training data to include a wider range of user preferences, ethnic backgrounds, and gender representations. Furthermore, users should have more control over their feed preferences (e.g., toggling filters for diversity, gender-neutral recommendations).
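The user-controlled feed preferences suggested above could be sketched as a re-weighting step applied to candidate pins before ranking. Everything here is hypothetical: the tag names, toggle names, and weight values are illustrative assumptions, not Pinterest's actual system.

```python
# Hypothetical sketch: re-weight candidate pins according to user toggles
# (e.g., gender-neutral recommendations, diversified aesthetics) before ranking.

def reweight(candidates, prefs):
    """candidates: list of (pin_id, score, tags); prefs: dict of user toggles."""
    out = []
    for pin_id, score, tags in candidates:
        if prefs.get("gender_neutral") and "gendered" in tags:
            score *= 0.5   # demote strongly gendered suggestions (illustrative weight)
        if prefs.get("diversify") and "mainstream_aesthetic" in tags:
            score *= 0.8   # leave room for underrepresented styles (illustrative weight)
        out.append((pin_id, score))
    return sorted(out, key=lambda p: p[1], reverse=True)
```

Keeping the adjustment as an explicit, user-visible step (rather than buried in the model) gives users the feed control the solution calls for, without retraining the underlying recommender.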