What we don’t get about hardware: social justice
As a robotics engineer I have been on many projects that are building the first robot of its kind. But I like to check in on what technology does for my close family and friends, because I find that my work does not always intersect with their daily use of technology. My big question lately has been: if I’m not working on technology that helps my community, who am I building it for, and what do I know about the communities I am impacting by working on this project? Not having an immediate answer to this question has made me curious about the conversations we are having around sustainability, bias and social justice when it comes to new products in robotics.

Business models and build plans are a hurdle for robotics teams in big companies. The type of robots I’m talking about are largely autonomous and sometimes move on mobile bases. It’s a big deal to get these things moving on their own with some form of machine learning software. Today, we have frameworks that make it easier to get started. One is ROS, or Robot Operating System, which comes with a simulation package that lets you do everything a physical robot would do, simulated on your computer screen. Robotics is becoming a tiny bit easier to visualize and program.

Putting the physical robot together is also getting easier. The NVIDIA Jetson series and GPUs are small enough to fit on most robot platforms. Add a battery and a camera or two, and it becomes a moving, seeing computation machine. To be sure, Bluetooth has its limits, wires don’t have an excellent lifespan, and in general hardware just loves to fail at the worst possible time. But it can and does work.

Often when I start a robotics project, the answer to the question of who we are building for is a little murky. There may be a hand wave in the direction of some underserved community, such as the elderly and disabled, or even the busy and overworked.
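To make the simulation idea above concrete: this is not ROS itself (the simulator that ships with ROS does far more), just a minimal plain-Python sketch of the kind of kinematic update a mobile-base simulator steps through each timestep, using the standard unicycle model. All names and numbers here are illustrative.

```python
import math

def step(x, y, theta, v, omega, dt):
    """Advance a unicycle-model mobile robot one timestep.

    x, y: position in meters; theta: heading in radians;
    v: forward speed (m/s); omega: turn rate (rad/s); dt: timestep (s).
    """
    x += v * math.cos(theta) * dt   # move along the current heading
    y += v * math.sin(theta) * dt
    theta += omega * dt             # rotate by the commanded turn rate
    return x, y, theta

# Drive straight ahead for 1 second in 0.1 s steps.
x, y, theta = 0.0, 0.0, 0.0
for _ in range(10):
    x, y, theta = step(x, y, theta, v=1.0, omega=0.0, dt=0.1)
print(round(x, 2), round(y, 2))  # robot ends up ~1 m down the x-axis
```

A real simulator wraps a loop like this with sensor models, physics and visualization, which is what makes it possible to develop autonomy software before the physical robot exists.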
It’s exciting at first to think we are going to build robots for people who need them, but then comes the reality. Robot building is very costly, with many parts thrown away and research time that gets wasted. It’s pretty easy to tell whether a team is going to stay afloat based on how patient management is with rising costs.

So here is the social justice part.

Consumables

Every day, trash bins are being emptied across teams who are developing and prototyping new robots. We are leaving a heavy waste footprint. Some things can’t be recycled, like screws, wires and tape - materials that literally hold robots together. I personally believe we need to get to an ultra-futuristic state of making hardware obsolete. Hardware needs to be so easy to create and break down for recycling that even a 5-year-old could do it. (I have no idea how to do this just yet, but it’s an important idea for the conservation of our planet.)

Who are we designing for?

Robotics needs to provide more solutions for marginalized communities. We run the risk of automating the lives of those who happen to exist in the companies, labs or spaces where robots are being developed, while leaving the needs of poor and underserved communities behind. These robotics labs are not located in underserved neighborhoods or rural areas. They are concentrated in dense urban cities like San Francisco and Pittsburgh, and I don’t often enough see developers and engineers start a project in order to address the experiences of people from areas tens, hundreds or thousands of miles away.

This extends to creating any new technology in general. We need to balance our teams with developers and engineers who have a personal, community-driven stake in what is being created, so they can be on the inside of the conversations that decide how much effort goes into building features or accessibility - because their experience informs them of the programming and build-related roadblocks their community faces.
There is room for badass engineers who don’t have a community-driven stake in how a product works; they provide the support to make something profitable, but they don’t need to drive product fit and functionality.

Bias and awareness

We can’t get away from the digital reality of our society, which brings us more online and more dependent on systems of data. We are increasingly getting only the information we want through targeted ads. Simply put, we are building ourselves into our own silos, and I think that is sad. We can’t ever hope to build products that scale globally if we don’t turn our new data science powers into tools for creating good tech. Robotics also presents an intersectionality problem. We need to be quicker to understand faulty computer vision algorithms that don’t detect dark skin tones, and to use explainable AI to review implicit bias coded into our software.

In my mind, being invested in sustainable systems that positively affect the communities they are deployed in is social justice work as an engineer and a developer. And some areas of tech expansion might need to slow down until we understand the impact on our most vulnerable communities. Our tech is evolving into autonomous vehicles, robots, wearables and rockets, delivering to us experiences that will shape society’s future and, even more importantly, the future of our own communities.

--

Camille Eddy is a robotics engineer and tech expert. She studies bias in AI and uses her international speaking platform to teach engineers, testers and non-technical founders the use cases of the de-biasing algorithms released today. She serves on the boards of two nonprofit STEM outreach organizations, based in San Francisco, California and San Diego, California, and regularly coaches students and early career professionals through her website.