A group of artificial intelligence experts and industry executives, including Elon Musk, have called for a six-month pause in the development of AI systems more powerful than OpenAI’s GPT-4. In an open letter issued through the non-profit Future of Life Institute, they cited potential risks to society and humanity, and called for a halt to advanced AI development until shared safety protocols had been developed, implemented and audited by independent experts. The co-signatories included researchers at DeepMind, AI pioneers Yoshua Bengio and Stuart Russell, and Stability AI CEO Emad Mostaque. The Future of Life Institute is primarily funded by the Musk Foundation and other organisations.
The EU police force Europol has warned about the potential misuse of advanced AI like ChatGPT, raising ethical and legal concerns. Meanwhile, the UK government has unveiled proposals for an “adaptable” regulatory framework around AI that would split responsibility for governing AI between its regulators for human rights, health and safety, and competition.
OpenAI unveiled GPT-4, the latest version of the AI model behind ChatGPT, earlier this month. Since ChatGPT’s release late last year, rivals have accelerated the development of similar large language models, and companies have raced to integrate generative AI into their products. OpenAI recently partnered with around a dozen firms to build their services into its chatbot, allowing ChatGPT users to order groceries via Instacart or book flights through Expedia.
Critics have accused the letter’s signatories of promoting “AI hype” and exaggerating claims about the technology’s current capabilities. They argue that AI researchers should face greater transparency requirements rather than pause their research. The signatories counter that powerful AI systems should be developed only once it is confirmed that their effects will be positive and their risks manageable. The letter also called on developers to work with policymakers on governance, including the creation of new regulatory authorities.