In September, former Stanford math professor and ex-Google researcher Jack Poulson called on HAI to disinvite keynote speaker Eric Schmidt. In a letter co-signed by more than 40 former Google researchers, professors and tech activists, Poulson accused Schmidt of ethical misconduct during his time as CEO of Google.
HAI defended its decision to invite Schmidt, citing the conference’s commitment to promoting diverse perspectives on the role of AI in society. At the conference, Schmidt spoke alongside former European Parliament member Marietje Schaake, who has pushed for greater regulation of big data and stronger privacy protections.
Schmidt opened the session by describing how AI has enabled important breakthroughs in scientific research. As scientific disciplines become increasingly data-dependent, Schmidt said, powerful algorithms can accelerate the computation and analysis of data and lead to unexpected discoveries.
He added that current developments in health research facilitated by AI have made earlier diagnosis of serious diseases possible and “will save millions of people over the next five or ten years.”
The impact of powerful algorithms extends beyond the medical lab or hospital, he said.
“Google, for example, automatically captions 1 billion YouTube videos in 10 different languages,” Schmidt said. “You can imagine the impact of that to global diffusion of information.”
Schmidt said that both the private and public sectors must make sure these AI innovations are under ethical and legal control.
“We know that data has bias,” Schmidt said. “That is not shocking.”
He said that both the government and corporations should “understand the bias and identify” it to mitigate side effects.
Schmidt argued that international collaboration between scientists and researchers is crucial.
“Even in a situation where everyone hates each other, there are still areas of common agreement,” Schmidt continued, calling for cooperation with the talent pool of researchers in China. He added that the world could benefit from having a “common framework,” through which governments and companies can establish norms in the discussion of regulatory structure and policies.
“I think it is also important that we establish right here right now, that the liberal values of Stanford University and Western values are the ones that should win,” Schmidt said. “We shouldn’t allow other values … we need to be unified and clear on that.”
Schaake spoke after Schmidt and focused her talk on the regulation of big tech. She argued that many contemporary problems “stem from under-, rather than over-regulation of technology.” She expressed concern with the fact that “there’s a lot of power in the hands of very few actors.”
In contrast to Schmidt, Schaake questioned the role of technology in the life of the public, citing concerns about Chinese technological dominance.
“If AI benefits disproportionately from undemocratic and centrally governed models such as the one we see in China, but also other parts of the world, where data can be massively hoovered up without much restriction and where human rights are not protected — and if AI in turn will make that undemocratic government more powerful — why do we have such high expectations of what this technology will bring us?” she asked.
“It’s a little ironic, to put it mildly, that the same companies that are warning against the dominance of Chinese standards are in fact sending data to Beijing themselves,” she said.
Schaake further argued for the early regulation of new technologies, specifically artificial intelligence. “Perhaps the timing is never perfect, but I would prefer to be proactive,” she said.
Contact Won Gi Jung at jwongi ‘at’ stanford.edu, Max Hampel at mhampel ‘at’ stanford.edu, Marianne Lu at mlu23 ‘at’ stanford.edu, Daniel Yang at danieljhyang ‘at’ stanford.edu and Trishiet Ray at trishiet ‘at’ stanford.edu.
A quote from Marietje Schaake has been corrected to indicate her belief that modern problems in big tech are caused by under-, not over-regulation. A previous version of this article swapped these characterizations of regulation. The Daily regrets this error.