GPT-3


GPT-3 is a language model developed by OpenAI, the third installment in the GPT series and, at its release, the largest non-sparse language model available. It outperformed its predecessor, GPT-2[1], and had ten times the capacity of Microsoft's Turing NLG. It is known for generating fluent text, including news articles, and for aiding coding tasks, though it also poses misuse risks such as the propagation of misinformation or phishing. GPT-3 comes in several versions to accommodate different needs; the largest, davinci, has 175 billion parameters. The subsequent GPT-3.5 series introduced new models and capabilities. GPT-3 plays a significant role in both industry and research, supporting products such as GitHub[2] Copilot and several Microsoft products, while also raising ethical and academic concerns.

Term definitions
1. GPT-2: Generative Pretrained Transformer 2, or GPT-2, is an AI model engineered for natural language processing tasks. Launched by OpenAI in February 2019, it is known for its versatility in generating a wide array of text types, and it can also answer queries and complete code automatically. GPT-2 was trained on WebText, a vast corpus of online text, and operates on 1.5 billion parameters. Despite its resource-intensive nature, GPT-2 has found use in diverse applications such as text-centric adventure games and subreddit simulations. Initial fears of misuse delayed the release of the full GPT-2 model until November 2019, when those concerns did not materialize. To address resource constraints, a smaller distilled model, DistilGPT2, was later developed. The innovations and successes of GPT-2 set the stage for future progress in AI text generation.
2. GitHub: Primarily designed for developers, GitHub is a platform for creating, storing, managing, and sharing code. Built on Git software, it provides features such as version control, access control, and bug tracking. Since becoming a Microsoft subsidiary in 2018, GitHub has established itself as a premier host for open-source software projects; as of January 2023, it has a community of over 100 million developers and hosts more than 420 million repositories. Founded in 2008 by its four founders, GitHub initially operated as a flat organization, fostering autonomy, flexibility, and collaboration among its members. Beyond version control, GitHub also provides services such as task management, continuous integration, and project wikis, making it a comprehensive suite of tools for software development.
GPT-3 (Wikipedia)

Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor GPT-2, it is a decoder-only transformer deep neural network, which replaces recurrence- and convolution-based architectures with a technique known as "attention". This attention mechanism allows the model to focus selectively on the segments of input text it predicts to be most relevant. GPT-3 uses a context window of 2,048 tokens, float16 (16-bit) precision, and a then-unprecedented 175 billion parameters, requiring 350 GB of storage since each parameter occupies 2 bytes, and it has demonstrated strong "zero-shot" and "few-shot" learning abilities on many tasks.
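To make the "attention" idea above concrete, here is a minimal single-head sketch in NumPy. The shapes, weights, and function name are illustrative inventions, not GPT-3's actual dimensions or code; GPT-3 applies many such heads per layer over its 2,048-token context.

```python
# Minimal sketch of causal (decoder-only) scaled dot-product attention.
# Sizes are illustrative only, far smaller than GPT-3's.
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model) token embeddings; w_*: (d_model, d_head) projections."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v           # queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])       # similarity of each query to each key
    # Causal mask: a decoder-only model must not attend to future tokens.
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ v                              # weighted sum of value vectors

# Illustrative sizes: 8 tokens, 64-dim embeddings, one 16-dim head.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 64))
out = causal_self_attention(x, *(rng.normal(size=(64, 16)) for _ in range(3)))
print(out.shape)  # (8, 16)
```

The storage figure quoted above is simple arithmetic: 175 billion parameters × 2 bytes per float16 value = 350 GB.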

Generative Pre-trained Transformer 3 (GPT-3)
Original author(s): OpenAI
Initial release: June 11, 2020 (beta)
Predecessor: GPT-2
Successors: GPT-3.5, GPT-4
Website: openai.com/blog/openai-api

On September 22, 2020, Microsoft announced that it had licensed GPT-3 exclusively. Others can still receive output from its public API, but only Microsoft has access to the underlying model.
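As a sketch of what receiving output from the public API looked like at the time, the following uses the legacy pre-1.0 `openai` Python client; the prompt, parameter values, and environment-variable setup are illustrative assumptions, and current OpenAI clients use different interfaces and model names.

```python
# Sketch: querying GPT-3 through OpenAI's public API with the legacy
# pre-1.0 `openai` Python client (current client versions differ).
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # assumes the key is set in the environment

response = openai.Completion.create(
    engine="davinci",  # the 175-billion-parameter GPT-3 model
    prompt="Summarize what GPT-3 is in one sentence:",
    max_tokens=60,
    temperature=0.7,
)
print(response["choices"][0]["text"].strip())
```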
