
GPT (Generative Pre-trained Transformer)

A family of language models built on the transformer architecture

What is GPT (Generative Pre-trained Transformer)?

GPT is a family of large language models developed by OpenAI that uses the transformer architecture. These models are pre-trained on vast amounts of text data and can generate human-like text, answer questions, write code, and perform various language tasks.

Key Points

1. Based on the transformer decoder architecture
2. Pre-trained on large text corpora, then fine-tuned for specific tasks
3. Generates text autoregressively, one token at a time
4. Powers ChatGPT and similar tools
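The autoregressive generation in point 3 can be sketched with a toy loop: each new token is predicted from everything generated so far, then appended to the context before the next prediction. The bigram lookup table below is a hypothetical stand-in for a real model, used only to make the loop runnable.

```python
def next_token(context):
    """Predict the next token from the context.

    A real GPT scores every vocabulary token with a transformer decoder;
    this fixed bigram table is a placeholder assumption.
    """
    bigrams = {"the": "cat", "cat": "sat", "sat": "down"}
    return bigrams.get(context[-1], "<eos>")

def generate(prompt, max_tokens=10):
    tokens = prompt.split()
    for _ in range(max_tokens):
        tok = next_token(tokens)
        if tok == "<eos>":  # end-of-sequence: stop generating
            break
        tokens.append(tok)  # the new token becomes part of the context
    return " ".join(tokens)

print(generate("the"))  # -> "the cat sat down"
```

The key property illustrated is the feedback loop: output tokens are fed back in as input, which is why GPT-style models are called autoregressive.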

Practical Examples

- ChatGPT (conversational assistant)
- Code generation with Codex
- Content creation
- Question answering