
What we don’t get about hardware: social justice

As a robotics engineer, I have been on many projects building the first robot of its kind. But I like to check in on what technology does for my close family and friends, because I find that my work does not always intersect with their daily use of technology. My big question lately has been: if I’m not working on technology that helps my community, who am I building it for, and what do I know about the communities I am impacting by working on this project? Not having an immediate answer to this question has made me curious about the conversations we are having around sustainability, bias and social justice when it comes to new products in robotics.

Business models and build plans are a hurdle for robotics teams in big companies. The robots I’m talking about are largely autonomous and sometimes move on mobile bases. It’s a big deal to get these things moving on their own with some form of machine learning software. Today, we have frameworks that make it easier to get started. One is ROS, the Robot Operating System, which comes with a simulation package that lets you do everything a physical robot would do, but simulated on your computer screen. Robotics is becoming a tiny bit easier to visualize and program.

Putting the physical robot together is also getting easier. The NVIDIA Jetson series and GPUs are small enough to put on most robot platforms. With a battery and a camera or two, it becomes a moving, seeing computation machine. For sure, Bluetooth has its limits, wires don’t have an excellent lifespan, and in general, hardware just loves to fail at the worst possible time. But it can and does work.
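To make that concrete, here is a minimal sketch (illustrative only, not taken from any particular project; the topic name, node name and speeds are common ROS conventions rather than anything specific to the post): a small ROS 2 Python node that publishes velocity commands. The same node can drive a base simulated on your screen or a physical one, because both typically subscribe to the same command topic.

```python
# Minimal ROS 2 (rclpy) sketch: publish velocity commands that a mobile base,
# simulated or physical, can follow. '/cmd_vel' and the speeds below are
# conventional, illustrative choices - adjust for your own robot.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist


class DriveForward(Node):
    def __init__(self):
        super().__init__('drive_forward')
        # Most mobile-base drivers (real or simulated) accept Twist messages here.
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.timer = self.create_timer(0.1, self.tick)  # 10 Hz command loop

    def tick(self):
        msg = Twist()
        msg.linear.x = 0.2   # gentle forward speed, in m/s
        msg.angular.z = 0.0  # no turning
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = DriveForward()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```

The nice part is that swapping between simulation and hardware is usually just a matter of which driver is listening on the other end of that topic.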
Often when I start a robotics project, answering the question of who we are building for is a little murky. There may be a hand wave in the direction of some underserved community, such as the elderly and disabled, or even the busy and overworked. It’s exciting at first to think we are going to build robots for people who need them, but then comes the reality. Robot building is very costly, with many parts thrown away and research time that gets wasted. It’s pretty easy to tell whether a team is going to be able to stay afloat based on how patient management is with rising costs.

So here is the social justice part.

Consumables

Every day, trash bins are being emptied across teams that are developing and prototyping new robots. We are leaving a heavy waste footprint. Some things can’t be recycled, like screws, wires and tape - the materials that literally hold robots together. I personally believe we need to get to an ultra-futuristic state of making hardware waste obsolete. Hardware needs to be so easy to create and to break down for recycling that even a 5-year-old could do it. (I have no idea how to do this just yet, but it’s an important idea for the conservation of our planet.)

Who are we designing for?

Robotics needs to provide more solutions for marginalized communities. We run the risk of automating the lives of the people who happen to exist in the companies, labs or spaces where robotics is being developed, while leaving the needs of poor and underserved communities behind. For example, these robotics labs are not located in most neighborhoods, let alone rural areas. They are concentrated in dense urban cities like San Francisco and Pittsburgh, and I don’t see often enough developers and engineers who start a project in order to address the experiences of people from areas tens, hundreds or thousands of miles away. This extends to creating any new technology in general.

We need to balance our teams with developers and engineers who have a personal and community-driven stake in what is being created. That way, they can be on the inside of the conversations that decide how much effort should be put into building features or accessibility, because their experience informs them of the programming and build-related roadblocks their community faces. There is room for badass engineers who don’t have a community-driven stake in how a product works; at that point they provide the support to make something profitable, but they don’t need to drive product fit and functionality.

Bias and awareness

We can’t get away from the digital reality of our society, which brings us more online and dependent on systems of data. We are increasingly getting only the information we want through targeted ads. Simply put, we are building ourselves into our own silos. And I think that is sad. We can’t ever hope to build products that scale globally if we don’t turn our new data science powers into tools for creating good tech. Robotics also presents an intersectionality problem. We need to be quicker to understand faulty computer vision algorithms that don’t detect dark skin tones, and to use explainable AI to review the implicit bias coded into our software.

In my mind, being invested in sustainable systems that positively affect the communities they are deployed in is social justice work as an engineer and a developer. And some areas of tech expansion might need to slow down until we can understand the impact on our most vulnerable communities. Our tech is evolving into autonomous vehicles, robots, wearables and rockets, delivering experiences that will shape society’s future and, even more importantly, the future of our own communities.

--

Camille Eddy is a robotics engineer and tech expert. She studies bias in AI and uses her international speaking platform to teach engineers, testers and non-technical founders the use cases for de-biasing the algorithms released today. She serves on the board of two nonprofit STEM outreach organizations based in San Francisco and San Diego, California, and regularly coaches students and early career professionals through her website.
Thank you, Cami, for sharing your story with us! If you have a story to tell or know someone who does, please reach out to us via DM.
YES. YES. and YES. Love this take.

Question: do you believe there is a commercial outcome that can drive bringing robotics and AI to marginalized communities? What is the incentive (beyond social good) to do this work?
Hi Lauren,

While social good, in my mind, is what these companies claim to be doing anyway (i.e. Google wants to connect the world through information), there are also tons of articles about the buying power of underserved communities. In a tech sense, "underserved" means that these people are not filling the average customer base/interviews/profiles for these products - and in my opinion, that is almost always because of OTHER reasons besides customer fit. Think about Black women's buying power in the beauty industry (https://www.huffpost.com/entry/black-women-nielsen-report_n_59c3fec2e4b06f93538d3a05): they are a great fit for these products and are still underserved by the beauty industry.

On top of that idea, here is a general list of other articles that talk about the buying power of people with disabilities, Black people, zip codes, and some other ways (huge) populations of underserved communities are locked out of being a part of the product development conversation:

https://www4.uwm.edu/eti/PurchasingPower/purchasing.htm
http://www.aetna.com/producer/aetnalink/2008-12/link4q_08_niche.html
https://www.air.org/system/files/downloads/report/Hidden-Market-Spending-Power-of-People-with-Disabilities-April-2018.pdf
https://www.diversityinc.com/growth-black-buying-power-continues/

Thanks for your question. I hope this idea impacts the way you develop products. We can all own a part of this solution.

Camille
I LOVE it when people are so aware + thoughtful about the design process. I totally agree with this: “Robotics needs to provide more solutions for marginalized communities.” This statement also applies to software!

Innovation often has unintended side effects, with certain segments of the population reaping benefits while others continue to suffer from a growing list of disadvantages. The climate crisis is a salient example of this, with poorer communities suffering the brunt of climate change + pollution.

To this end, do you have a “social responsibility” checklist for your robotics projects? I believe @jyoung has developed such a list to encourage thoughtful and responsible design — Josie, would love to hear your insight on this!
Hi Quinn,

I love the idea of a social responsibility checklist. This is what I have been doing:

1st level:
- Does a product cross an ethical line for my community that results in them being targeted or treated in a discriminatory manner?
- Does it rely on certain cultural and physical identities that are broadly represented by my community (e.g. skin tone, voice accent, neighborhood resources like malls, retail stores, access to hardware, computers, tech skills, etc.) in order to accurately and fully serve my community? And if it doesn't or can't, do we have a good reason for why we are still making the product in this way?
- How accessible are members of my community, outside of me, to be reached for testing/conversation/release (as in, does the team know who they are making this product for, and can they get accurate data on how the product is performing for my community's segment of the customer base)?

2nd level:
- Am I aware of any of the answers above negatively impacting a community I am not a member of?
- Can we get/create a testing group for that community? Or at least start a conversation with qualified and vetted stakeholders (as in NOT token stakeholders who are just there to show we talked to someone)?

3rd level:
- If we are using a data-rich testing group, who is missing from the conversation? What groups are easier to test for, and WHY?

Thanks for your question, Quinn! I hope this is interesting and usable. We all can own a part of this solution!

Camille
I work in the social impact space and I would love to share this with my consulting group (and possibly others) - the "checklist" - would you be okay with that? Happy to give credit obviously...
YES. So many times yes. It's the responsibility of all of us in tech to ask ourselves these questions before we start a project. Especially agree with your statement that "some areas of tech expansion might need to slow down until we can understand the impact on our most vulnerable communities." Like Quinn said, there are so often unintended consequences in tech (and other industries) that take time to manifest. Do you have any recommendations for striking the balance between quickly deploying new technologies that have the potential to help our communities and taking the time to more fully understand their effects?
Hi RM,

I think the first way we can strike a balance is to allow the limited deployment of tech, but not in the way we are used to. If we can't be near enough to monitor the effects of our technology, we need to have people in those physical locations who understand when the norm for a community deviates in a harmful way.

For example, Google Maps was unaware that a false avalanche warning was re-routing people around Stanley, Idaho, a rural community that depends on tourist traffic: https://www.ktvb.com/article/news/local/google-maps-error-fixed-but-it-has-hurt-the-town-of-stanley/277-449766175

I was at Google when this happened, and I was contacted by a friend from my hometown of Boise, Idaho, who wanted to know if I could help get a fix. Within an hour of seeing her message, I was able to get in contact with someone on the Google Maps team, which was not the team I was a member of. But the problem had been occurring for months, and the people in Stanley couldn't get hold of the appropriate stakeholders.

Thanks for your comment! I hope that idea is impactful for you as you build your own products. We all can own a piece of this solution.
Thank you for sharing this! Very insightful thoughts. I think one thing we need to be mindful of, even in the very initial planning stages of bringing technology into other communities and countries from the Western world, is to wrestle with whether our technologies will actually help these communities. Will technology truly bring happiness, or are we performing a modern-day colonization by bringing what we believe is superior into their world? I liked your point that "We need to balance our teams with developers and engineers who have a personal and community-driven stake in what is being created." To add to that, we need to include people from the communities we want to bring the technology to, in order to understand their needs. In doing "social justice," what truly is justice? I argue that it is not necessarily what those of us in high-tech societies think it is.
Thanks for sharing great insights, Camille! The hardware industry needs more engineers like you.