Graduate Students Seminar
Wednesday, April 15, 2020 · 11 AM - 12 PM
Online
Session Chair: Maria Deliyianni
Discussant: Dr. Hoffman
Speaker 1: Saeed Damadi
- Title: Neural Network Pruning
- Abstract: Pruning a neural network has long been of interest as a way to improve generalization, simplify the network, reduce hardware and storage requirements, and speed up further training. Although network pruning dates back to 1990, Frankle recently posed the hypothesis that "dense, randomly-initialized, feed-forward networks contain subnetworks (winning tickets) that – when trained in isolation – reach test accuracy comparable to the original network in a similar number of iterations," which has been termed the Lottery Ticket Hypothesis. Finding those particular subnetworks is an important task, because possessing such a network lightens the burden at test time and delivers the advantages of pruning listed above.
- In my presentation, I will go over neural networks, define the cross-entropy loss functions for binary and multiclass classification, state the corresponding optimization problems, explain network pruning, and show numerical results. (The standard forms of these losses and a small pruning sketch appear below.)
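For reference, here are the loss functions the talk covers, in their standard textbook form (the notation is the usual one, not taken from the speaker's slides):

```latex
% Binary cross-entropy for a label y in {0,1} and a predicted
% probability \hat{y} = \sigma(z) (sigmoid output):
\ell_{\mathrm{BCE}}(y, \hat{y}) = -\, y \log \hat{y} \;-\; (1 - y)\log(1 - \hat{y})

% Multiclass cross-entropy over K classes with softmax probabilities
% \hat{p}_k = e^{z_k} / \sum_{j=1}^{K} e^{z_j} and true class y:
\ell_{\mathrm{CE}}(y, \hat{p}) = -\sum_{k=1}^{K} \mathbf{1}\{y = k\}\, \log \hat{p}_k \;=\; -\log \hat{p}_y
```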
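The sketch below illustrates, under stated assumptions, the lottery-ticket-style loop the abstract describes: train a dense network, prune the smallest-magnitude weights, rewind the surviving weights to their original initialization, and retrain the sparse subnetwork in isolation. It is a minimal numpy toy with hypothetical names, not the speaker's code.

```python
# Minimal lottery-ticket-style pruning sketch (numpy only; all names
# here are illustrative assumptions, not the speaker's code).
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: label is 1 when x1 + x2 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(W1, W2, mask1, mask2, lr=0.5, steps=500):
    """Gradient descent on binary cross-entropy; masks freeze pruned weights at zero."""
    for _ in range(steps):
        W1m, W2m = W1 * mask1, W2 * mask2
        h = np.tanh(X @ W1m)          # hidden activations
        p = sigmoid(h @ W2m)          # predicted probabilities
        dz = (p - y) / len(y)         # dL/dz for sigmoid + cross-entropy
        gW2 = h.T @ dz
        dh = np.outer(dz, W2m) * (1 - h ** 2)
        gW1 = X.T @ dh
        W1 -= lr * gW1 * mask1        # masked entries never move
        W2 -= lr * gW2 * mask2
    return W1, W2

def accuracy(W1, W2, mask1, mask2):
    p = sigmoid(np.tanh(X @ (W1 * mask1)) @ (W2 * mask2))
    return np.mean((p > 0.5) == y)

# 1) Train the dense network from a random initialization.
W1_init = rng.normal(scale=0.5, size=(2, 16))
W2_init = rng.normal(scale=0.5, size=16)
ones1, ones2 = np.ones_like(W1_init), np.ones_like(W2_init)
W1, W2 = train(W1_init.copy(), W2_init.copy(), ones1, ones2)
print("dense accuracy:", accuracy(W1, W2, ones1, ones2))

# 2) Prune: keep only the largest-magnitude 20% of weights per layer.
def magnitude_mask(W, sparsity=0.8):
    return (np.abs(W) > np.quantile(np.abs(W), sparsity)).astype(float)

m1, m2 = magnitude_mask(W1), magnitude_mask(W2)

# 3) Rewind survivors to their ORIGINAL initial values and retrain the
#    sparse subnetwork in isolation (the "winning ticket" experiment).
W1s, W2s = train(W1_init.copy(), W2_init.copy(), m1, m2)
print("sparse accuracy:", accuracy(W1s, W2s, m1, m2))
```

Comparing the two printed accuracies is exactly the comparison the hypothesis is about: a heavily pruned subnetwork, retrained from the original initialization, matching the dense network.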
Speaker 2: Gaurab Hore
- Title: Free Probability and Its Application to Random Matrices
- Abstract: In fields such as random matrix theory, one needs a notion of independence for objects that do not commute. Free independence, the central notion of free probability, provides exactly this, and it allows one to compute the distributions of the eigenvalues of large random matrices. In this talk, we will discuss what free probability is and how it applies to random matrix theory. The entries of random matrices can be replaced by free semicircular random variables (yielding a so-called deterministic equivalent). With this replacement, all the joint moments or cumulants of the random matrices can be calculated, which may lead to the distributions of the eigenvalues of functions of these random matrices. (A small numerical illustration of the semicircular distribution follows.)
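One point in the abstract that lends itself to a quick numerical check is the role of the semicircular distribution: by Wigner's semicircle law, the eigenvalues of a large symmetric random matrix with suitably scaled i.i.d. entries follow the semicircle density, which is exactly the distribution of a free semicircular random variable. The sketch below is an assumed illustration, not material from the talk.

```python
# Empirical check of Wigner's semicircle law with numpy.
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Symmetric matrix with i.i.d. Gaussian entries; scaling the
# off-diagonal variance to 1/n puts the limiting spectrum on [-2, 2].
A = rng.normal(size=(n, n))
H = (A + A.T) / np.sqrt(2 * n)

eigs = np.linalg.eigvalsh(H)

# Compare the eigenvalue histogram with the semicircle density
# rho(x) = sqrt(4 - x^2) / (2 * pi) on [-2, 2].
hist, edges = np.histogram(eigs, bins=40, range=(-2.2, 2.2), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
rho = np.sqrt(np.maximum(4.0 - centers ** 2, 0.0)) / (2 * np.pi)
print("max |empirical - semicircle| =", float(np.abs(hist - rho).max()))
```

For a matrix of this size the gap is already small; the deterministic-equivalent idea in the abstract extends this single-matrix statement toward eigenvalue distributions of functions of several random matrices.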