
Opinion: Governments and industry must balance ethical concerns in the race for AI dominance


The CEO of OpenAI, the company behind ChatGPT, recently testified before United States senators that AI “could go quite wrong” and his company wanted to “work with the government to prevent that from happening.”

Privacy concerns about AI are widespread. Along with temporary bans of ChatGPT in Italy, some private organizations have started to restrict its use. These concerns are not limited to ChatGPT, either.

Studies have also demonstrated that WeChat — the most-used social app in China — incorporates censorship algorithms.

TikTok has similarly been framed as a propaganda tool for the Chinese government, leading to U.S. congressional hearings about privacy concerns. Along with broader international efforts by other lawmakers, there is clearly concern about the role governments should play in the development and use of artificial intelligence.

Despite these growing concerns, there are few signs that investment in China-made AI has — or will — decelerate, with U.S. venture capitalists continuing to invest heavily in the country’s AI sector.

Some have claimed that concerns over China are unwarranted — that oppression is unlikely and that others will simply step in to develop and distribute the technology if China doesn’t.

But we cannot disregard how the Chinese government — or any government — is deploying AI to achieve their goals.

AI gold rush

A speculative gold rush has followed the realization that AI — especially large language models like ChatGPT — has the potential to revolutionize business.

As businesses seek to capitalize on these opportunities, they must expand their portfolios to international markets. China is poised to provide a high return on investment to these businesses.

The Chinese government has prioritized innovation to counter the American technological dominance. Recent estimates suggest China has the fourth-largest number of AI “unicorns” — private start-ups that are valued at over $1 billion.

But unlike in the West, the boundary between state-owned and private organizations in China is permeable, with many companies hosting Chinese Communist Party committees within their organizations.

Given social media’s potential to help China achieve its goals, TikTok’s relationship with the Chinese government raises concerns about what content is presented on the platform, how user information is collected and how it might be used to influence user beliefs and choices.

Ethical business of AI

Protectionism, nationalism and racism undoubtedly play roles in concerns over technology consumption and adoption. Research has repeatedly demonstrated that a product’s country of origin affects consumers’ perception. Yet, these factors must be carefully weighed against others.

Like many nations, China seeks global influence through soft power. Since the communist revolution, the Chinese state has attempted to guide technology development for the purposes of monitoring and regulating society. Such practices are deeply rooted in Chinese philosophy’s prioritization of harmony.

Harmony for society can be costly for others. Uyghurs, political dissidents and non-compliant people and groups have all been targeted by the Chinese government. The oppressive surveillance of the Uyghurs in Xinjiang province has not only resulted in their internment in detention camps, but has also prompted many Han settlers to leave the province.

Western governments and AI

No technology is value-neutral. Values inform the choices of AI designers, developers and users.

We must be wary of virtue signalling that fixates on China’s problems while ignoring our own: these are often differences of degree rather than kind.

Government mass surveillance of citizens, ill-defined policies about autonomous weapons in the military and the collection of user data by private organizations must all be reckoned with in North America.

As recent revelations over the components of a Russian drone used in an attack on Ukraine have made clear, AI has both domestic and military applications. Three-quarters of the drone’s components were found to be made in the U.S. Investors cannot ignore the moral implications of global supply chains when it comes to AI.

Co-ordinated efforts are key

Although industry is the primary driver of AI development, all stakeholders have a role to play. While the Chinese government’s involvement in AI development might be too great, the hands-off approach of western governments has created its own problems.

These issues include the spread of disinformation and polarization, as well as the increased anxiety and depression associated with social media use.

Regulation is not the only answer, but it is a start. As the U.S. mulls over legislation for systems like ChatGPT, and Canada finalizes its own broad AI framework, the Chinese government seeks to establish its own laws that will undoubtedly help it consolidate control.

Industry leaders and academics are likely best positioned to understand the technology. However, governments can provide insight to users and investors who might be unaware of larger issues within technological ecosystems such as privacy and security.

Illustrating this, Sequoia Capital, one of the largest venture capital firms to invest in China, sought advice from national security agencies. Its recent decision to split its U.S. and China operations has no doubt been influenced by this process.

Strengthening democratic values in the face of AI will require coordinated international efforts between industry, government and non-governmental organizations.

The Conversation

Jordan Richard Schoenherr does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.