Wednesday, May 1 | 4-5 p.m. | BR 264
Today, Luke Sanderson will give a talk in the Math Seminar about neural networks. The seminar is in BR 264 at 4 p.m. The title and abstract are below.
Title: RNNs and Gradient Descent
Abstract: Ever wondered how your phone can talk, recognize songs, and more? Surprisingly or unsurprisingly, it comes down to math and coding. Your phone has been programmed to use neural networks, including recurrent neural networks (RNNs). These systems approximate whatever answer you may be looking for. For these networks to “learn,” we need to dive deeper into gradient descent, the method used to optimize them. The only prerequisites required to make these networks work are coding, linear algebra, and multivariate calculus. As we continue learning how these types of networks work, we will use more advanced techniques, such as ordinary differential equations (ODEs), to solve these problems. This talk summarizes how gradient descent works, how neural networks and RNNs work, and, lastly, RNNs built on ODEs.
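As a taste of the gradient descent idea mentioned in the abstract, here is a minimal sketch: repeatedly step opposite the gradient to shrink a function's value. The function, learning rate, and step count below are illustrative choices, not anything specific to the talk.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Move x against the gradient of a function to approach its minimum."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # one gradient descent step
    return x

# Example: f(x) = (x - 3)^2 has gradient f'(x) = 2*(x - 3) and minimum at x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # converges very close to 3
```

The same update rule, applied to the weights of a network instead of a single number, is what lets neural networks "learn" from data.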