Ethical Technology: Machines + Media Summer Virtual Series

July 16, 2020

By Camber Systems

The way we treat people’s security, privacy, and right to safety in the midst of a national emergency sets the tone for future policy. As we develop new technologies to address COVID-19, we take on the responsibility of ensuring our data helps people rather than making them more vulnerable.

On July 14, we joined a webcast in NYC Media Lab’s “Machines + Media” summer 2020 virtual series, alongside two other panelists considering the same ethical questions. The panel examined how emerging technologies are changing the way media is produced and consumed, as well as their impact on society, through the lens of the episode’s theme: “Tech Ethics: Diversity, Bias, Inequality, & Privacy.”

Machines + Media Panelists

Ian Allen, CEO of Camber Systems

Cathy O’Neil, Author of Weapons of Math Destruction

Riley Jones IV, Co-Founder and CEO of Join the Bloc

The panel was moderated by Erica Matsumoto, Director of Partnerships and New Initiatives at NYC Media Lab.

Thwarting Algorithmic Bias

Technology must be developed to support the people who use it, not just serve as a means to an end. Algorithmic bias arises when a system reproduces the errors and assumptions embedded in its design or training data, systematically skewing toward certain outcomes. The panelists shared a mission to keep their technologies from encoding human opinion as truth, and they articulated a variety of methods for meeting that standard, including:

  • Taking data from a representative sample. Ian pointed to the recent example of The New York Times reporting on data from smart thermometers. While the devices can record good data, the people who own smart thermometers are a small and specific subset of the population, so their readings alone will not illustrate trends in the general population.

Listen to what Ian Allen says about where the weight of responsibility falls when automated systems are used in place of humans.

  • Asking, “For whom will this fail?” Cathy O’Neil suggests asking this question at every level of product development and design, defining who the stakeholders are and what failure means for them. For example, studies have found that facial recognition systems identify black women far less accurately than white men; understanding how the technology fails black women should push developers to improve it. (A minimal sketch of this kind of per-group check follows this list.)

Listen to what Cathy O’Neil says about the ethics of data collected to study recidivism in the criminal justice system.

  • Not using technology to avoid difficult human conversations. Some contexts call for uncomfortable, sincere human interaction that can’t be reduced to an algorithmic answer. Riley Jones highlights the necessity of human feedback in supporting people looking for employment; tailored responses to a changing job landscape and to individual needs can’t be meaningfully automated.

Listen to what Riley Jones says about bias in data collection and how to design tools responsibly.
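
Cathy’s question can be made operational in code: instead of reporting a single aggregate accuracy number, evaluate the system’s failure rate separately for each stakeholder group. The panel didn’t prescribe a method, so this is only a minimal sketch in Python; the group labels, record format, and function name are hypothetical.

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the misidentification rate for each demographic group.

    `records` is an iterable of (group, predicted, actual) tuples;
    the grouping and field layout here are illustrative only.
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical evaluation records: (group, predicted identity, true identity)
results = [
    ("black_women", "id_17", "id_03"),
    ("black_women", "id_22", "id_22"),
    ("white_men", "id_05", "id_05"),
    ("white_men", "id_09", "id_09"),
]
print(error_rates_by_group(results))
# e.g. {'black_women': 0.5, 'white_men': 0.0} -- a disparity that a single
# aggregate accuracy figure (75% here) would hide
```

Disaggregating the metric this way makes “for whom will this fail?” an answerable, testable question rather than an afterthought.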

Empowering Humans, not Machines

Ethical problems in technology get buried under a veneer of respectability and objectivity borrowed from technical disciplines. There is immense danger in not looking critically at the technology we choose to depend on, because it can easily be working against us.


Listen to the full episode.


Camber Systems develops its products knowing that machines don’t give context. Tools like the Social Distancing Reporter can do math, but without situational context they can’t produce answers or policy solutions to the problems COVID-19 creates. Even at the data level, sound processing techniques are essential to properly anonymize data. Compromising people’s privacy means compromising their safety, and that’s never an acceptable shortcut.
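
The post doesn’t detail Camber’s processing pipeline, but one widely used anonymization approach is differential privacy: publishing noisy aggregates rather than individual records. Below is a minimal sketch, assuming a Laplace mechanism; the suppression threshold, epsilon, and region names are illustrative assumptions, not Camber’s actual settings.

```python
import math
import random
from typing import Optional

MIN_CELL = 10  # hypothetical suppression threshold, not a Camber parameter

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def release_count(true_count: int, epsilon: float = 0.5) -> Optional[int]:
    """Release an aggregate count under epsilon-differential privacy.

    A counting query changes by at most 1 when one person is added or
    removed, so Laplace noise with scale 1/epsilon suffices. Very small
    cells are suppressed outright, since noise alone may not protect them.
    """
    if true_count < MIN_CELL:
        return None  # too few individuals to publish safely
    return max(0, round(true_count + laplace_noise(1.0 / epsilon)))

# Publish noisy per-region device counts instead of raw location traces.
raw_counts = {"region_a": 1523, "region_b": 7, "region_c": 240}
print({region: release_count(n) for region, n in raw_counts.items()})
# e.g. {'region_a': 1525, 'region_b': None, 'region_c': 238}
```

Pairing noise with small-cell suppression reflects the point above: aggregation alone isn’t anonymization, and outliers need explicit protection.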

Technology needs the democratic oversight of people to do good, and businesses and governments should be taking responsibility for that oversight. Camber Systems strives to be a model for the kind of accountability and transparency we need to normalize, in times of both crisis and peace.

If you’re interested in learning how privacy-forward data analytics can help you make informed policy decisions that protect people, we’d like to talk to you. Contact us here.
