The ability to produce and understand language is often considered a uniquely human trait, but AI research is making progress here as well. GPT-2 is a machine learning model trained to complete English-language texts on its own. Released by OpenAI in 2019, it is part of a larger goal: general artificial intelligence that is not limited to specific tasks. GPT-2 was trained on 40 gigabytes of text from a wide variety of fields and can formulate a few sentences in response to any text input. The installation GPT-2: Sprachfelder (Language Fields) gives you the opportunity to try out the program and discover the limits of its comprehension.

Humans are still in the lead. We can draw logical conclusions from just a few words and relate the larger contexts of a text to one another. Language is an abstract mirror of the human world; to truly understand it, an artificial system would have to grasp all of its intricacies. One question remains, though: how will sectors such as journalism, and our writing in general, change as they are increasingly supported by AI?
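As a rough illustration of what "completing a text" means in practice, the sketch below loads the publicly released GPT-2 weights through the Hugging Face transformers library and asks the model to continue a prompt. This is an assumed setup for demonstration purposes, not the installation's own software; the model name "gpt2" refers to the smallest public checkpoint.

```python
# Minimal sketch of GPT-2 text completion using the publicly released
# weights via the Hugging Face "transformers" library (assumed setup;
# not the code running in the GPT-2: Sprachfelder installation).
from transformers import pipeline

# Load the smallest public GPT-2 checkpoint as a text-generation pipeline.
generator = pipeline("text-generation", model="gpt2")

prompt = "Language is an abstract mirror of the human world"

# Ask the model to continue the prompt with up to 50 new tokens.
completions = generator(prompt, max_new_tokens=50, num_return_sequences=1)
print(completions[0]["generated_text"])
```

Running this repeatedly with the same prompt typically yields different continuations, since the pipeline samples from the model's predicted word probabilities rather than always choosing the single most likely word.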
Project Credits / Acknowledgements:
OpenAI (GPT-2), Ars Electronica Futurelab