September 24, 2020
First they get their hands on Doom, and now this
OpenAI’s GPT-3 language model will be exclusive to Microsoft Azure.
The third iteration of the Generative Pre-trained Transformer, an autoregressive language model, has been in beta since June, and remains in beta at time of publication.
The AI system has 175 billion parameters, a huge increase over the previous iteration's 1.5 billion. That makes it the world's largest language model, overtaking Microsoft's 17-billion-parameter Turing NLG.
"The scope of commercial and creative potential that can be unlocked through the GPT-3 model is profound, with genuinely novel capabilities – most of which we haven’t even imagined yet," Microsoft CTO Kevin Scott said.
"Directly aiding human creativity and ingenuity in areas like writing and composition, describing and summarizing large blocks of long-form data (including code), converting natural language to another language – the possibilities are limited only by the ideas and scenarios that we bring to the table."
Scott also said that Microsoft would look to integrate GPT-3 into its own products, but did not disclose any specifics.
OpenAI was established as a non-profit back in 2015, with $1bn in funding from Elon Musk (Tesla, SpaceX), Sam Altman (then-president of Y Combinator), Peter Thiel (PayPal, Facebook investor), Reid Hoffman (LinkedIn), AWS, and more. It promised to release all its work as open source.
Since then, Musk has dropped out, and OpenAI has launched a for-profit division inside the wider 'non-profit.' This division employs the majority of the staff, and allows for outside investment (such as Microsoft's $1bn).
The company claims, however, that this does not compromise its non-profit goals, as returns to investors are capped at 100 times what is put in (so Microsoft would receive no more than $100bn).
OpenAI has also shifted from open-sourcing its technology to commercializing it, as exemplified by this GPT-3 licensing deal.