OpenAI, a non-profit organization that researches artificial intelligence, has developed GPT-2, a large-scale AI language model that can generate entire paragraphs of coherent text. It can process clearly structured texts, translate them automatically, answer simple questions, and summarize their content, all without being specifically trained for any of these tasks. The model was trained on a dataset of eight million web pages, which is why its output reads as impressively “real” written text.
Credits: “Language Models are Unsupervised Multitask Learners” by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever
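GPT-2 produces text by repeatedly predicting the next word from everything written so far. As a minimal sketch of how such a generation step can be run in practice, assuming the publicly released small GPT-2 model and the Hugging Face transformers library (neither of which is referenced above), it might look like this:

```python
# Minimal sketch: generating a paragraph with the small, publicly released GPT-2
# model via the Hugging Face "transformers" library (an assumption made for
# illustration; the original description does not reference this library).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Artificial intelligence will change the way we write because"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Sample a continuation token by token; top-k sampling (k=40) follows the
# strategy used for the samples in the GPT-2 release.
output_ids = model.generate(
    input_ids,
    max_length=80,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The zero-shot behaviour mentioned above works the same way: the task is expressed in the prompt itself, for example appending “TL;DR:” to a passage to prompt a summary, as described in the cited paper.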