If we start to use social media data sets to take the pulse of a nation or understand a crisis – or actually use it to deploy resources – we are getting a skewed picture of what is happening. – Kate Crawford
If you’re not thinking about the way systemic bias can be propagated through the criminal justice system or predictive policing, then it’s very likely that, if you’re designing a system based on historical data, you’re going to be perpetuating those biases. – Kate Crawford
Data will always bear the marks of its history. That is human history held in those data sets. – Kate Crawford
If you have rooms that are very homogeneous, that have all had the same life experiences and educational backgrounds, and they’re all relatively wealthy, their perspective on the world is going to mirror what they already know. That can be dangerous when we’re making systems that will affect so many diverse populations. – Kate Crawford
We should have equivalent due-process protections for algorithmic decisions as for human decisions. – Kate Crawford
While massive datasets may feel very abstract, they are intricately linked to physical place and human culture. And places, like people, have their own individual character and grain. – Kate Crawford
As we move into an era in which personal devices are seen as proxies for public needs, we run the risk that already-existing inequities will be further entrenched. Thus, with every big data set, we need to ask which people are excluded. Which places are less visible? What happens if you live in the shadow of big data sets? – Kate Crawford
As AI becomes the new infrastructure, flowing invisibly through our daily lives like the water in our faucets, we must understand its short- and long-term effects and know that it is safe for all to use. – Kate Crawford
The fear isn’t that big data discriminates. We already know that it does. It’s that you don’t know if you’ve been discriminated against. – Kate Crawford
We should always be suspicious when machine-learning systems are described as free from bias if they’ve been trained on human-generated data. Our biases are built into that training data. – Kate Crawford