If you play a videogame and avoid or never meet a particular queer character, do they exist in the game for you?
What kind of world does Spotify—through its algorithmic sorting of millions of users’ desires, through our aggregated listening—produce for us to hear?
Industry is already using data to remake culture. To reverse the tide—to make culture more equitable—we need to decode that data for ourselves.
“There is nothing shocking or radical about ending an economic practice that has too many negative externalities.”
“Campaigns matter in part because of who meets whom, and because of the social networks that are shaped by that campaign as well as shaping it.”
“If we want technologies that will not undermine our humanity, social analysts must join with other researchers.”
Since all data can now be used for immigration enforcement, universities cannot assume that collecting data on their students is safe.
How have data-centric systems perpetuated racial capitalism, and how have different communities, particularly in the global South, resisted this datafication?
Whose values get embedded into the algorithms that increasingly govern our lives? How are these data infrastructures complicating what it means to be human?
What harms can result from AI and automation, and how might we address and prevent those harms?
How has data been used to organize labor, and how do we make ourselves visible to data-centric systems?
How do people show up in data, and what are some of the inequalities that can result from data collection?
How long has human life been quantified as data, and in what contexts? What are some major implications of humanity being measured as data?
When the internet is in everything, its problems are everywhere.
“Start-ups: they need philosophers, political theorists, historians, poets. Critics.”
What kind of social space are we inhabiting when we’re online? How do practices like data collection, data brokering, and surveillance underwrite the “free” services we enjoy?
Where did the internet come from? Who gets left out of dominant stories about its origins? And what can history teach us about how to make the internet better?
Does the relationship between power and AI mean that all people will be monitored all the time?
Racial categories are, by definition, unequal categories. They reflect not universal truths but historical processes that have linked racial status to ...
I will discuss three concepts in this talk: first, the idea of design justice; second, how people are already resisting oppressive AI; and third, the ten principles of design justice ...