Machine Learning and Language: Q&A with Michael Douglas
SigmaCamp is continuing its Q&A with Sigma lecturers series!
Our next lecture will be on Tuesday, June 16:
Machine Learning and Language
by Michael Douglas
Computers don't understand English yet, but there has been remarkable progress in modeling language and generating stories that sound like a person might have written them. In this lecture we will give a demonstration and explain the basic ideas behind GPT-2, a transformer model developed at OpenAI whose stories about unicorns were recently in the news.
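If you would like to try generating text with GPT-2 yourself before the lecture, here is a minimal sketch using the open-source Hugging Face "transformers" library (assuming it and PyTorch are installed); this is just an illustration, not necessarily the demonstration shown in the lecture.

```python
# Minimal GPT-2 sampling sketch (assumes: pip install transformers torch).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# A prompt in the spirit of OpenAI's famous unicorn example.
prompt = "In a shocking finding, scientists discovered a herd of unicorns"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation token by token; at each step the model predicts a
# probability distribution over the next word given everything so far.
output_ids = model.generate(
    **inputs,
    max_length=100,
    do_sample=True,
    top_k=50,
    temperature=0.9,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```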
About the lecturer:
Mike Douglas is a leading researcher in theoretical physics whose contributions include the first solvable models of string theory, many connections between string theory and mathematics, and statistical approaches to making predictions from string theory. He helped to build the Digital Orrery, a computer that was used to show that the orbit of Pluto is chaotic, and he has published many works in computational physics, including the first six-loop calculation in quantum field theory (which was referred to in an episode of "The Big Bang Theory"). He received a B.A. from Harvard in 1983 and a Ph.D. in physics from Caltech in 1988. He is currently a researcher at Renaissance Technologies and an affiliated researcher at the Simons Center for Geometry and Physics at Stony Brook University.
How it Works
- View lecture. The lecture is posted below; you can view it at any time before the Q&A session. Or, if you prefer, watch it with the group by joining our Zoom meeting at 2 pm EST on Tuesday, June 16.
- Join Q&A session. At 3 pm EST on June 16, join our Zoom meeting to ask questions or just chat with the lecturer. One of the SigmaCamp faculty will act as the moderator.
Please note that this lecture is open to everyone; feel free to invite your friends!