Privacy as a Feminist Issue

Do you feel like you are in control of your privacy? Do you wish privacy functioned as it did in times past, when details about you, your photos, and your old social media posts were not a simple search away? Have you ever felt unsafe because of information about you that you don't control? The increasing digitization of our private and public lives, and the monetization of that information, has created a privacy economy, where privacy is a form of privilege, more available to those of us who traditionally hold power (either economically or socially).

Many of us trade our information for services. And because of our privilege, this usually works out for us. Siri recognizes our command, we can easily order food and housekeeping, we get an Uber home or to work, our smart lock lets us into our house. When we, as privileged "techies", relinquish our privacy to get a good or service, we are usually pleasantly surprised by the outcome. And privacy is more available to us, so long as we are willing to pay for it. If we want to block targeted advertisements, control what data is stored and where, or opt out of sharing, these options usually come with a monetary transaction.

But for many less privileged people, exchanging data comes at a personal cost. Poorer residents trade massive quantities of personal data for basic services. Women of color become objects of fetishization or are written out of search results entirely. To get discounts or free access to services, people fill out endless surveys or hand over personal details and browsing history to consumer research companies, which resell the resulting reports for thousands of dollars.
Many mobile and tablet apps profit by selling and trading user data, or by leveraging that data through targeted advertisements. For women and people who menstruate, this assault on privacy continues with the femtech revolution, where data is often the "payment" for services like cycle tracking or other health support. Historically, women have been underserved when it comes to data collection: they are underrepresented in medical studies, their bodies are less understood, and drug trials can be run without including them, even for medication that may later be prescribed to them. With privacy acting as a correlate of other forms of privilege, the most at-risk groups (people of color, immigrants, and LGBTIQ* folks) pay a significantly higher cost in these interactions.

For women and non-gender-conforming folks online, lack of privacy acts as a weapon. "Doxxing", revenge porn, discriminatory advertisements, and other online attacks on privacy overwhelmingly target these groups and make the internet a dangerous place, where one's private life and personal information are used to manipulate, extort, and threaten.

The increase in startups focused on the needs of women, non-binary folks, and LGBTIQ* individuals brings new possibilities to better understand how our bodies work, how we engage online, how we make financial decisions, or which technologies suit us. Currently, all of this data comes at an individual cost in privacy. We trust that the company, application, or data collector is going to do the right thing. But too often, that hasn't been the case. As we've seen with the growing inequalities in our world, often abetted by new technologies, we cannot blindly trust tech companies, even those with good intentions. And so we are left in a conundrum: do we forfeit our privacy and control of our data in order to contribute to better health care, comfort, and services for ourselves and people like us?
Or do we disengage and discard potential progress in exchange for our privacy?

There is a growing demand for an alternative: a way that we can more safely, ethically, and privately share our data. There is a growing demand for user control over data, for user rights, where each of us is given the option to determine how, why, and where our data is used. People want to know whether it can be sold, whether it can be used for AI, and when it should expire. The most recent iteration of these calls focuses on data trusts: where you can define and make your data available for commercial or non-commercial research and development, where you can remove your data at will, and where you indeed control what data is available and to whom.

No single solution is likely to address all of our privacy concerns. Understanding that privacy operates as a form of privilege allows us to plan how we'd like to address and manage the societal consequences of this unequal resource. Given our record of innovation and ingenuity, we can hopefully provide more equal access to privacy and shift the balance of data control and power from the corporation to the individual.

Katharine Jarmul is a privacy advocate, AI skeptic, and machine learning engineer. She is passionate about demystifying concepts like "AI" and machine learning, and is concerned about current security and privacy practices in the field. Since 2014, she has run a consulting and advisory company based in Berlin, working on machine learning and data engineering with a focus on natural language processing. She also helped found KIProtect, a company working on new encryption methods for machine learning tasks.
Thank you, Katharine, for writing for us! If you have a story to share, or know someone who does, let us know at [email protected] or DM me. We'd love to feature it.
Thank you for sharing! This is certainly a thought-provoking hot topic, and it's one that I think about every time I ask Alexa for something... and also when Alexa hears anything resembling her name (e.g. Alex, Alexis) while we're watching a TV series or movie. When I consider how much I'm willing to trade in exchange for all the conveniences of modern society, I'm actually not sure; it will probably have to come to a head in some sort of spectacular fashion, which is unfortunate.
Thank you for writing this. It was eye-opening, to say the least, and brought to light so many ideas I hadn't previously considered. I regularly think about the data I'm sharing and how it will impact me, but I had never considered those less privileged and how they are exploited through data collection. What you've shared is not surprising, and I'm grateful that it's becoming a point of discussion, as I'm sure many other people don't think of it at all due to privilege. I agree that understanding privacy is an excellent first step, as is informing people of exploitation.
Such an important point - Moody Month is a London/NYC-based business whose mission is to provide femtech services around menstruation and mood tracking with secure data. Would love to get your thoughts on them? I've just started using -
I've been using drip, which is created by some friends I know in Berlin and is fully open-source with no data sent to the cloud. It is funded via grants from the German government and the EU for open health data / open health research.

In a quick scan of Moody Month's privacy policy, it looks fairly standard for these apps. They say that they "anonymise" data before it is shared with researchers. I've actually worked with a team doing this before, and sometimes only the bare minimum is done; it depends on the team's knowledge of anonymization. They also say they might share your data with marketers, but that you can opt out of that. If you want to do so, you should contact [email protected].

Finally, they say they might share your data with any potential buyers of the company. I think this is the biggest threat to most of these apps: that a larger company with more data about you will buy it (ahem, your health insurance provider), and this data will be linked, aggregated, and used to make healthcare decisions about you.

However, as of now, they are still independent! So, if you write them and share concerns, hopefully they are thinking similar things. :) Being informed of the issues and reading the privacy policies when possible is a great way to start moving the conversation forward. Since GDPR, all privacy policies are *supposed* to be written in non-legal terms and be understandable. If you find a hard one, feel free to ping me and I'm happy to read it and try to help! (I am not a lawyer though!! But I do know a bit about this field. :) )
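To make the "bare minimum" anonymization point concrete, here is a small illustrative sketch (the records and field names are entirely hypothetical, not from any real app): even after names are removed, combinations of quasi-identifiers like ZIP code and birth year can still single people out. A simple k-anonymity check measures this; k = 1 means at least one record is unique and therefore re-identifiable.

```python
from collections import Counter

# Hypothetical "anonymised" records: names removed, quasi-identifiers kept.
records = [
    {"zip": "10115", "birth_year": 1990, "cycle_len": 28},
    {"zip": "10115", "birth_year": 1990, "cycle_len": 28},
    {"zip": "10117", "birth_year": 1985, "cycle_len": 31},  # unique combination
    {"zip": "10119", "birth_year": 1992, "cycle_len": 27},  # unique combination
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "cycle_len")

def k_anonymity(rows, keys):
    """Return the smallest group size when rows are grouped by the given keys.

    k == 1 means some person is uniquely identifiable from the
    quasi-identifiers alone, even though names were stripped.
    """
    groups = Counter(tuple(row[key] for key in keys) for row in rows)
    return min(groups.values())

print("k-anonymity:", k_anonymity(records, QUASI_IDENTIFIERS))
```

In this toy dataset the last two records are unique, so the check reports k = 1: an attacker with an outside dataset linking ZIP and birth year to names could re-identify those people. Proper anonymization generalizes or suppresses fields until every group reaches an acceptable k.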
So spot on! Thank you.

"Understanding privacy operates as a form of privilege allows us to plan how we’d like to address and manage societal consequences of this unequal resource."

This rings so true. When I discuss data privacy concerns with others, so often their response is, "Well, I have nothing to hide, so who cares?" That response captures a fundamental misunderstanding of the risk at hand.

Recently, I've been trying to work with my employer to reform the company policy around doing a criminal background check through a third-party company with alarming terms of service. This company's terms dictate that they can investigate you in any capacity, at any time, and share their findings with anyone, well outside the original request to check for a criminal record. And it's all legal, because we have insanely weak data protection laws. I shopped around for an alternative, and I found that pretty much all companies selling consumer reports have uneditable terms that give them carte blanche with your data. In the end, I ordered a criminal background check through the California Department of Justice. It was literally the only way to comply with my company's requirement without signing my rights away.

I want to note that I take no issue with employers performing criminal background checks. I agree with the policy, as I don't want to work with someone who might be dangerous. That said, there is almost no process for doing this where employees' data and privacy are protected. We need data protection laws so badly.
I just want to add that NPR aired a great piece on data tracking by smart home devices yesterday, which is available in article and podcast form. Even for someone familiar with data privacy issues, it's eyebrow-raising stuff.