Click here to apply to PI4 for Summer 2017.
Linear Algebra Working Group
May 22-24, in 239 Altgeld Hall
May 26-June 9, in 239 Altgeld Hall; note Memorial Day holiday on Monday May 29
Prepare and Train Group – Machine Learning: Algorithms and Representations
Dates: Monday June 12 through Friday July 21; note Independence Day holiday on Tuesday July 4
Instructors: Maxim Raginsky (Department of Electrical and Computer Engineering) and Matus Jan Telgarsky (Department of Computer Science)
The program is a “Research Experience for Graduate Students”-style endeavor: after a series of introductory lectures, the students form small groups (2-5 people) to work on interconnected, open-ended problems.
The goal is to guide students through the transition from working on “canned” problems to tackling open-ended problems and formulating the problems themselves. We expect the group work to involve a mixture of computational experiments (to generate conjectures) and theory (to prove them).
The topics will focus on probabilistic and approximation-theoretic aspects of machine learning, with an emphasis on neural networks. We will introduce the probabilistic formulation of machine learning and relate the performance of commonly used learning algorithms (such as stochastic gradient descent) to the concentration of measure phenomenon. Problems of varying levels of difficulty will revolve around several open questions pertaining to the stability and convergence of stochastic gradient descent. We will also cover several results characterizing neural network function classes, for instance results showing that neural networks can fit continuous functions, that they gain in power with extra layers, and that they can model polynomials. Open questions will cover more nuanced aspects of adding layers, as well as other neural network architectures, for instance convolutional and recurrent networks.
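To give a flavor of the kind of computational experiment the groups might run, here is a minimal, illustrative sketch of stochastic gradient descent minimizing squared loss for a one-dimensional linear model. This is not course material, just a toy example; all names and parameter choices below are illustrative.

```python
import random

def sgd_linear_regression(data, lr=0.01, epochs=200, seed=0):
    """Fit y ~ w*x + b by stochastic gradient descent on squared loss.

    Each step uses the gradient of a single sample's loss (w*x + b - y)^2,
    the basic setting in which stability and convergence questions arise.
    """
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)                  # fresh random pass over the data
        for x, y in data:
            err = (w * x + b) - y          # residual on this single sample
            w -= lr * 2 * err * x          # d/dw of (w*x + b - y)^2
            b -= lr * 2 * err              # d/db of (w*x + b - y)^2
    return w, b

# Noiseless data generated by y = 2x + 1; SGD should approach w = 2, b = 1.
data = [(x / 10, 2 * (x / 10) + 1) for x in range(-20, 21)]
w, b = sgd_linear_regression(data)
```

Since the data here are noiseless and the model can interpolate them exactly, the per-sample gradients all vanish at the true parameters, so the iterates converge to (2, 1); experiments like this, run on harder objectives, are one way to generate conjectures before attempting proofs.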
Various dates. Hosts to be arranged.