This piece, titled Eyes of Texas, was commissioned by Interfaces for their publication “Watch Dogs: A Zine about Community Surveillance & Policing,” following a workshop on the topic hosted by Grassroots Leadership.
About the Piece
Eyes of Texas explores three hypothetical digital data profiles of the same person under separate lights: what we know, what we don’t know, and what we could know.
Under Capitalism, our demographics and content engagement data are compiled by AI to match us with affinity or interest categories, increasing the efficacy of the ads we are served. This is the data collection we know happens, through census demographics and social media profiles, as well as the algorithm-based matching we can see reflected in our ad settings on Facebook (Meta) or in the visitor data Google Analytics provides to site owners. We’ve also heard stories about how Facebook has used this data to perform experiments on our emotional well-being. Meta publishes developer APIs for its ad targeting, and Google and DeepMind have open-sourced many of their machine-learning tools, meaning anyone can experiment with these building blocks and piggyback off of them to construct nearly any kind of targeting algorithm you can imagine.
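The matching described above can be pictured as a very small sketch: engagement signals are tallied against keyword sets for each interest category, and the best-scoring categories become the “affinities” a person is lumped into. The category names and keywords here are invented for illustration; real platforms use far richer models than keyword counting.

```python
# Toy sketch of affinity-category matching. Category names and
# keywords are hypothetical; this only illustrates the basic idea
# of scoring a person's signals against predefined categories.

AFFINITY_CATEGORIES = {
    "Outdoor Enthusiasts": {"hiking", "camping", "kayak", "trail"},
    "Home Cooks": {"recipe", "sourdough", "skillet", "meal-prep"},
    "Pet Lovers": {"adoption", "kibble", "groomer", "leash"},
}

def match_affinities(engagement_signals, categories=AFFINITY_CATEGORIES):
    """Rank categories by how many of a person's signals they share."""
    scores = {
        name: len(keywords & set(engagement_signals))
        for name, keywords in categories.items()
    }
    # Keep only categories with at least one overlapping signal,
    # strongest match first.
    return sorted(
        ((name, score) for name, score in scores.items() if score > 0),
        key=lambda pair: pair[1],
        reverse=True,
    )

profile = ["hiking", "trail", "recipe", "concert"]
print(match_affinities(profile))
```

A profile that mentions hiking and trails twice-over lands in “Outdoor Enthusiasts” first, with weaker affinities trailing behind, which is roughly the shape of what ad settings pages expose back to us.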
Under Criminalization, you’ll find similar topics, but run through the algorithm of white-sanctioned obedience and policing. Where capitalism uses “social justice” for marketing, these policing AIs log “BLM”-related events, posts, and activities as threats and flag the person for terrorist activity or anti-American sentiment. ARIC collects data from criminal records, juvenile detention centers, public data, school districts, utility companies, social media, and “threat liaison officers,” which include over 300 trained civilians on top of police. Aside from what has been made public through the 2020 BlueLeaks hack and legally mandated reporting, we don’t know the full extent of what “crime affinity categories” we are being lumped into. The examples listed are based on what has been disclosed and on conclusions I drew from my background in UX research and analytics.
On the reverse, we find Community Reciprocity, or the future we could know, where AIs are trained to use our data for mutual aid rather than exploitation. Criminal activity often stems from people who simply need help: financially, psychologically, behaviorally, or socially. This technology could be matching people to resources when they need them, facilitating the distribution of wealth, providing outlets for socialization in an increasingly impersonal world, and fostering creative empowerment to get everyone on the road to their best lives. It’s not a novel ideal; it’s just not a profitable one. We see it pop up after major weather disasters in independent, citizen-led assistance programs, when abundance is shared through “buy nothing” groups on Facebook, or when Discord communities encourage safe spaces for mental health transparency. Non-profits, clubs, and other community organizations have been doing this work for decades.
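The reciprocity imagined here is the same matching mechanism pointed in a different direction: signals of need scored against community resources instead of ad categories. As before, the resource names and keywords are invented purely for illustration.

```python
# Toy sketch of the same matching idea aimed at mutual aid: a
# person's stated needs are matched to community resources instead
# of ad categories. All names and keywords are hypothetical.

COMMUNITY_RESOURCES = {
    "Food Pantry": {"groceries", "hunger", "meal"},
    "Peer Counseling": {"anxiety", "isolation", "grief"},
    "Rent Assistance Fund": {"eviction", "rent", "utilities"},
}

def match_resources(need_signals, resources=COMMUNITY_RESOURCES):
    """Return resources whose keywords overlap a person's needs."""
    hits = []
    for name, keywords in resources.items():
        overlap = keywords & set(need_signals)
        if overlap:
            hits.append((name, sorted(overlap)))
    # Strongest match (most overlapping needs) first.
    return sorted(hits, key=lambda pair: len(pair[1]), reverse=True)

print(match_resources(["rent", "utilities", "anxiety"]))
```

The point of the sketch is that nothing in the mechanism itself requires exploitation; the same scoring that routes ads or flags “threats” could just as easily route someone facing eviction toward a rent fund.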
Big data is out there. It isn’t going to go away. The time, energy, and resources poured into collecting data for the purpose of building a social case against a person are outrageous. Furthermore, when systems like this are built to exploit a population, BIPOC and immigrant folks tend to be targeted most aggressively. Instead of our data being used to make us feel lacking, anxious, depressed, and hopeless, it could be used to create networks of strength, prosperity, and support. Imagine a world where funding were used to benefit all, not to create criminals. It takes just one person willing to use data for good to start transforming lives and promoting a better world for everyone.
This 5″x7″ digital artwork was drawn via tablet in Procreate and took over 41 hours to complete.
Model: Sadé Lawson
Silicon Hills Have Eyes (PDF)
Austin’s Big Secret (PDF)
Complete List of Facebook Ad Targeting Categories (as of Jan 2018): https://twowheelsmarketing.com/blog/facebook-ads-targeting-options-list/
Complete List of Google’s Affinity Categories (as of June 2019): https://brianswanick.com/blog/what-does-affinity-category-mean/
Others were sourced from reviewing Google Analytics Affinity and In-Market Segment data from my own websites and Facebook ad category settings from my personal profile.