AI: Better Language Models and Their Implications

Alec Radford, Jeff Wu, Dario Amodei, Daniela Amodei, Jack Clark, Miles Brundage & Ilya Sutskever, on the OpenAI blog: "Our model, called GPT-2 (a successor to GPT), was trained simply to predict the next word in 40GB of Internet text. Due to our concerns about malicious applications of the technology, we are not releasing the trained …"
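
The training signal the excerpt describes, predicting the next word from the words before it, can be illustrated with a deliberately tiny sketch. This is not OpenAI's implementation (GPT-2 is a large Transformer trained on 40GB of web text); it is just a toy bigram counter showing the same "guess the next word" objective at miniature scale, with a made-up corpus:

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the 40GB of Internet text (assumption:
# any short word sequence works for illustrating the objective).
corpus = "the model predicts the next word in the text".split()

# Count bigram transitions: which word tends to follow which.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequently observed continuation of `word`,
    or None if the word was never seen with a successor."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("next"))  # a word observed to follow "next" in the corpus
```

A real language model replaces the bigram counts with a neural network conditioned on the entire preceding context, but the objective, maximizing the probability of the observed next word, is the same idea.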