Privacy as a Feminist Issue
kjam · Jul 31, 2019
Do you feel like you are in control of your privacy? Do you wish today's privacy functioned like it did in times past, when details about you (your photos, your old social media posts) were not a simple search away? Have you ever felt unsafe because of information available about you that you don't control? The increased digitization of our private and public lives, and the monetization of that information, has created a privacy economy, where privacy is a form of privilege, more available to those of us who traditionally hold power (either economically or socially).

Many of us trade our information for services. And because of our privilege, this usually works out for us. Siri recognizes our command, we can easily order food and housekeeping, we get an Uber home or to work, our smart lock lets us into our house. When we, as privileged "techies", give up our privacy to get a good or service, we are usually pleasantly surprised by the outcome. And privacy is more available to us -- so long as we are willing to pay for it. If we want to block targeted advertisements, control what data is stored and where, or opt out of sharing, these choices usually come with a monetary transaction.

But for many less privileged people, exchanging data comes at a personal cost. Poorer residents trade massive quantities of personal data for basic services. Women of color become objects of fetishization or are written out of search results entirely. To get discounts or free access to services, people fill out endless surveys or hand over personal details and browsing history to consumer research companies that resell the resulting reports for thousands of dollars.
Many mobile and tablet apps profit by selling and trading user data, or by monetizing it through targeted advertisements.

For women and people who menstruate, this assault on privacy continues with the femtech revolution, where data is often the "payment" for services like cycle tracking and other health features. Historically, women have been underserved when it comes to data collection. They are underrepresented in medical studies, their bodies are less understood, and drug trials can be run without including them -- even for medication that may later be prescribed to them. Because privacy correlates with other forms of privilege, the most at-risk groups (people of color, immigrants and LGBTIQ* folks) pay a significantly higher cost in these interactions.

For women and non-gender-conforming folks online, lack of privacy acts as a weapon. "Doxxing", revenge porn, discriminatory advertisements and other online attacks on privacy overwhelmingly target these groups and make the internet a dangerous place, where one's private life and personal information are used to manipulate, extort, and threaten.

The increase in startups focused on the needs of women, non-binary folks and LGBTIQ* individuals brings new possibilities: to better understand how our bodies work, how we engage online, how we make financial decisions, or which technologies suit us. Currently, all of this data comes at an individual cost of privacy. We trust that the company, application or data collector is going to do the right thing. But too often, that hasn't been the case. As we've seen with the growing inequalities in our world, often abetted by new technologies, we cannot trust tech companies, even those with good intentions. And so we are left in a conundrum -- do we forfeit our privacy and control of our data in order to contribute to better health care, comfort, and services for ourselves and people like us?
Or do we disengage and discard potential progress in exchange for our privacy?

There is a growing demand for an alternative -- a way that we can share our data more safely, ethically and privately. There is a growing demand for user control, for user rights, where each of us is given the option to determine how, why and where our data is used. People want to know whether it can be sold, whether it can be used for AI, and when it should expire. The most recent iteration of these calls focuses on data trusts: where you can define and make your data available for commercial or non-commercial research and development, where you can remove your data at will, and where you truly control what data is available and to whom.

No single solution is likely to address all of our privacy concerns. Understanding that privacy operates as a form of privilege allows us to plan how we'd like to address and manage the societal consequences of this unequal resource. Given our record of innovation and ingenuity, we can hopefully provide more equal access to privacy and shift the balance of data control and power from the corporation to the individual.

--

Katharine Jarmul is a privacy advocate, AI skeptic and machine learning engineer. She is passionate about demystifying concepts like "AI" and machine learning and is concerned about the current security and privacy practices in the field. Since 2014, she has run a consulting and advisory company based in Berlin, working on machine learning and data engineering with a focus on natural language processing. She also helped found KIProtect, a company working on new encryption methods for machine learning tasks.