BASF President Teresa Johnson’s signature Truth & Power Speaker Series project kicked off with “Advocating for the Truth at Theranos,” a conversation featuring whistleblower Tyler Shultz and his attorney, Mary Inman, as they shared their experience navigating a high-profile fraud case.
The second installment of the three-part initiative, “Standing at the Crossroads of Technology and Civil Rights,” featured Airbnb Senior Counsel of Privacy Jennifer Grace leading a powerful discussion with Twitter whistleblower Anika Collier Navaroli and Maya Wiley, President and CEO of the Leadership Conference on Civil and Human Rights. The conversation addressed key issues, including the intersection of technology and contemporary civil rights, evolving trust and safety concerns both online and offline, the link between civil rights reform and enhanced business value, and the impact of social media on the election cycle. The full segment can be watched here (the timestamps noted below correspond to the video).
Navaroli’s involvement in the discourse of technology and civil rights began years ago, before the boom in artificial intelligence. “Technology is not neutral, there are going to be implications to civil rights and these impacts are not going to be distributed evenly,” said Navaroli (6:00). She felt she had played a role in breaking down the myths of technology by pushing early conversations about the harmful and misleading qualities of facial recognition software. “Technology replicates the biases of its creators,” she said. “What I’ve also seen in my career… is the reality of how technology exacerbates all of the underlying social ills that we have.” (7:00)
We’ve seen these issues unfold in real time in the Federal Trade Commission’s (FTC) action against Rite Aid, whose facial recognition system falsely flagged people of color as criminals and led to children being confronted in stores as suspected thieves. “Because of the basic realities of America, which have been segregation, data is going to be collected in a way that is discriminatory. The data that we have is all biased,” explained Navaroli. “If the data is based on what people have the opportunity to utilize, communities of color are going to be underrepresented,” added Wiley. (15:00)
The panelists agreed that there are policies that companies can, should, and must have. Upholding these standards is necessary, but there also needs to be a fundamentally different approach to how business models are conceived. The current business model for technology companies is profit over everything. For change to happen, everyone involved needs to be part of the discussion. “These technology companies have had way too much power, and they have not been responsible with that power,” said Navaroli. “My theory of change involves all of us at the table,” she continued. “It involves everybody who is thinking about these issues from the civil rights community, former tech workers, academia, civil society, and government…” (32:30) Grace added that “it takes stakeholders within companies from all backgrounds, from all subject matter expertise to really have an honest conversation. But what are we collecting, where is it coming from, what rights do our vendors have, are we vetting those contracts, are we making sure we’re not giving away data on our users that actually hurts others in the U.S.?” (35:00)
Wiley also recognized the dual disruption caused by artificial intelligence and robotics, cautioning against the consequences of neglecting civil rights protections. These technologies are bringing not only rapid paradigm shifts but also new pressures on diversity, equity, and inclusion. “If we don’t understand why it is necessary to have civil rights protections in every single way when we’re talking about technologies, it’s not going to be something that is going to produce positive social innovation, but will drive more disparity, more division, and unnecessary competition for limited resources,” said Wiley. (45:00)
The panelists concurred that technology has inevitably influenced the election cycle. “Imagine how these technologies can deepen and advance the appearance of reality of something that is false in a way that can incite violence,” said Wiley. This presents a significant threat to democracy and the electoral process, both of which are being targeted with lies. “Democracy should be based on facts not disinformation,” Wiley urged. Navaroli and Wiley shared their belief that these platforms need to revisit their policies, understand how they can protect the integrity of elections, and enforce those policies appropriately.
The next installment of the initiative, Truth & Power: Artificial Intelligence, will be held on Thursday, September 26. Join us for a discussion of how AI systems are increasingly making decisions formerly reserved for humans, raising critical legal and ethical questions. Register now here.