WuDao 2.0 becomes the basis for China’s first ‘virtual student’
Chinese academics claim to have built the world’s largest language model, with 1.75 trillion parameters – besting previous record holder, Google Brain.
The WuDao 2.0 model was unveiled at the 2021 Beijing Academy of Artificial Intelligence (BAAI) Conference last week. It is capable of simulating conversational speech, writing poems and understanding pictures.
Google's Switch Transformer, announced in January, featured 1.6 trillion parameters.
"WuDao 2.0 is the first trillion-scale model in China and the largest in the world," Professor Tang Jie, director of Tsinghua University's Computer Science department, told Chinese state media outlet Xinhua.
WuDao 2.0 is said to have achieved “excellent results” in its nine benchmark tasks. Tang went as far as to suggest the model “aims to enable machines to think like humans, move toward universal AI and allow developers to build an AI application ecosystem.”
Researchers and enterprises can apply to use the language model for free.
This wasn’t the first sizeable language model to come out of China in recent months.
In April, Chinese tech giant Huawei unveiled what it called the world’s largest Chinese NLP model, PanGu-Alpha. The model was trained with 207 billion parameters – larger than OpenAI’s GPT-3, but considerably smaller than Google Brain’s 1.6 trillion parameter monster.
AI’m off to school
To show off the model’s potential, WuDao 2.0 became the basis for China's first “virtual student.”
The digital avatar was named Hua Zhibing, and the model was used to develop her appearance, voice, and background music, as well as art.
According to Tang, the virtual student will grow and learn faster than an average human student. If she begins learning at the level of a six-year-old this year, she will be at the level of a twelve-year-old in a year.
"We hope that she will master skills first and then seek breakthroughs in reasoning and emotional interaction," he said.
The virtual student will ‘study’ in Tang’s department at Tsinghua University.