Ghostwriter AI – This is how Artificial Intelligence writes


Wed Jul 8, 2020, 2:30 pm - 3:00 pm
All times are given in Central European Summer Time (CEST / UTC +2).

“GPT-2” is a machine learning model with which the US research group OpenAI has built a remarkably powerful language system. GPT-2 is transformer-based, a relatively new approach to NLP (Natural Language Processing) in which the system learns on its own which words and passages of a text deserve more attention. The model was trained “unsupervised” with the simple goal of predicting the next word, taking into account all the words that came before it in a text. To do this, it was fed text from eight million web pages and has 1.5 billion parameters, which allows it to produce impressively “real”-looking texts.
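To make the next-word objective concrete, here is a minimal sketch, not taken from the session itself: it assumes the openly available Hugging Face “transformers” library and the publicly released “gpt2” checkpoint, and simply asks the model which single word it considers most likely to follow a short example prompt.

```python
# Minimal sketch of GPT-2's training objective: predict the next word
# given all previous words. Assumes the Hugging Face "transformers"
# library and the publicly released "gpt2" weights (example prompt is arbitrary).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "Artificial intelligence writes"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, vocabulary_size)

# The model's single best guess for the token that follows the prompt
next_id = int(logits[0, -1].argmax())
print(tokenizer.decode([next_id]))
```

Generating a longer text is just this step repeated: each predicted word is appended to the prompt and the model is asked again.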

In this Home-Delivery edition, we tell you more about this in Arabic, and also explain why the developers initially decided to release the trained model to the public only in a very limited form.