Ghostwriter AI – This is how Artificial Intelligence writes


Friday, 24 July 2020, 14:30 - 15:00
All times are given in Central European Summer Time (CEST / UTC+2).
Language: English

"GPT-2" is the name of a machine learning model with which the American research group OpenAI has built a remarkably powerful language system. GPT-2 is based on the Transformer, a comparatively new approach to NLP (Natural Language Processing) in which the system learns on its own which words and parts of a text deserve more attention. The model was trained "unsupervised" with the simple objective of predicting the next word given all previous words in a text. For this it was fed text from eight million web pages, has 1.5 billion parameters, and can therefore produce impressively "real"-looking texts.
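To make the "predict the next word" idea concrete, here is a minimal sketch, not taken from the talk, that uses the publicly available Hugging Face "transformers" library and the small released "gpt2" checkpoint (both are assumptions of this example, not something the description mentions): it asks the model for the most likely next token given a prompt, and then repeats that step to generate a short passage.

```python
# Minimal sketch: next-word prediction and text generation with GPT-2.
# Assumes `pip install torch transformers` and the public "gpt2" checkpoint.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "Artificial intelligence writes"
inputs = tokenizer(prompt, return_tensors="pt")

# The model scores every possible next token, conditioned on all previous tokens.
with torch.no_grad():
    logits = model(**inputs).logits          # shape: (1, sequence_length, vocab_size)
next_token_id = int(logits[0, -1].argmax())
print("Most likely next token:", tokenizer.decode(next_token_id))

# Repeating that prediction step autoregressively produces whole passages.
generated = model.generate(
    **inputs,
    max_length=40,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

Note that this uses the small 124-million-parameter public release rather than the full 1.5-billion-parameter model discussed in the talk; the prediction objective is the same, only the scale differs.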

In this Home Delivery edition we tell you more about this in English, and also explain why the developers initially decided to release the trained model to the public only in a very limited form.