Sarah Rudek: Progress, Politics, and Programming

  • Sarah Rudek introduced herself as the founder of Shieldmaiden Studios, a freelance developer, and a self-described cyborg with an implantable cardioverter defibrillator. She is a perpetual techno-optimist who strives to make the future accessible, fair, and smart for all.

    Taking the audience back to 2016, Sarah reminded us of that year's famous deaths, the Brexit referendum, and a US election whose result was supposed to be a near-certainty for one candidate and a humiliation for the other campaign. What went wrong?

    “Aren’t computers unerringly rational and objective?” asked Sarah.

    Yet supposedly blind algorithms can reinforce human prejudices. Facial recognition technology works best on white men, because the datasets used to train many implementations were not representative of the variety of human skin tones and facial structures.

    “Products inherit the biases of the developers and testers.” Lack of representation in datasets can undermine fabulous new technologies and render them biased: “technology is extraordinarily good at spreading human biases.”

    It’s not just facial recognition. Sarah showed a video clip demonstrating an automatic soap dispenser that failed to work when a black-skinned hand was held under it, while it spewed out soap for white hands.
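    The mechanism Sarah describes, a system tuned to a skewed training set that then fails for under-represented groups, can be sketched with a toy simulation. Everything below is hypothetical (the feature, the groups, and the 95% acceptance rule are invented for illustration, not drawn from any real product): a one-dimensional "detector" picks its decision threshold from pooled training data dominated by one group, and the minority group pays the price at test time.

```python
# Toy illustration (hypothetical, not any real system): a 1-D "detector"
# whose acceptance threshold is fit to training data dominated by group A.
import random

random.seed(0)  # make the sketch reproducible

def sample(group, n):
    # Invented feature: group "A" examples score around 0.7, group "B" around 0.4
    centre = 0.7 if group == "A" else 0.4
    return [random.gauss(centre, 0.1) for _ in range(n)]

# Skewed training set: 950 examples from group A, only 50 from group B
train = sample("A", 950) + sample("B", 50)

# "Training": choose a threshold that accepts 95% of the pooled training data.
# Because group A dominates the pool, the threshold tracks group A's distribution.
threshold = sorted(train)[int(0.05 * len(train))]

def detect_rate(group, n=1000):
    # Fraction of fresh examples from `group` that the detector accepts
    test = sample(group, n)
    return sum(x >= threshold for x in test) / n

rate_a = detect_rate("A")
rate_b = detect_rate("B")
print(f"detection rate, group A: {rate_a:.2f}")
print(f"detection rate, group B: {rate_b:.2f}")
```

    Run it and the detector accepts the vast majority of group A while rejecting most of group B, even though nothing in the code mentions either group at decision time: the bias lives entirely in the training data.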

    “The larger the role that humans play in the design and utilisation of technology, the greater the risk of bias becomes. It is unlikely that the engineers at Facebook intended to create a powerful propaganda tool, but in the process of dividing and subdividing and sub-subdividing their customer base … they have created the perfect environment for commercialisation and hyper-targeted marketing and indoctrination.”

    Sarah suggested that ethics needed to be embedded within technology companies and implementations. She asked how companies should ethically program self-driving cars to deal with the trolley problem. [A runaway trolley is racing down a track. Should you do nothing and allow the trolley to kill five people standing on the main track, or should you intervene and pull a lever to change the points to divert the trolley onto a side track where it will kill one person?]

    The ethics of computing is not a new problem waiting to be solved. The IEEE and the ACM (acm.org/about-acm/code-of-ethics) have well-established codes of ethics. The British Computer Society requires undergraduate Computer Science courses to include ethics. But do management training courses in IT firms include an ethics module?

    Sarah’s bottom line is:

    “If you’re a developer, remember that everything you do can have far-reaching and long-lasting consequences. If you work in tech, you have a lot of power. Code is reused; your choices can be magnified and perpetuated long after you are no longer in control!”

    Subscribe to the NI Dev Conf YouTube channel to keep up to date with recordings of the talks being uploaded from the June conference.
