Dyson Brownian motion [Dy62] is best known for characterizing the eigenvalues of special random matrices [Ta12]. Most interestingly, it is also equal in distribution to \(n\) independent Brownian motions conditioned never to intersect [Gr99]. In a topics course by Bálint Virág, I came across a proof of this result that is *just too clean* for this type of calculation. After picking my jaw up off the ground months later, I finally decided to write up this surprisingly elegant proof.
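The matrix picture is easy to simulate: run a Hermitian-matrix Brownian motion and track its sorted eigenvalues, which (in the \(\beta = 2\) case) evolve as Dyson Brownian motion. The sketch below is my own minimal illustration, not from the post; the function name and the exact normalization of the increments are assumptions, and the Euler discretization is only approximate.

```python
import numpy as np

def dyson_eigenvalues(n=5, steps=1000, T=1.0, seed=0):
    """Simulate the eigenvalue paths of a Hermitian Brownian motion H_t.

    Each step adds an independent Hermitian Gaussian increment to H;
    the sorted eigenvalues of H_t then trace out (a discretization of)
    Dyson Brownian motion.
    """
    rng = np.random.default_rng(seed)
    dt = T / steps
    H = np.zeros((n, n), dtype=complex)
    paths = np.empty((steps + 1, n))
    paths[0] = np.linalg.eigvalsh(H)  # eigvalsh returns sorted eigenvalues
    for k in range(1, steps + 1):
        # Hermitian Gaussian increment (GUE-type, up to normalization)
        G = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        H = H + (G + G.conj().T) / 2 * np.sqrt(dt / 2)
        paths[k] = np.linalg.eigvalsh(H)
    return paths

paths = dyson_eigenvalues()
# Eigenvalue repulsion: with probability one the sorted paths never touch,
# so the gaps at the final time are strictly positive.
assert np.all(np.diff(paths[-1]) > 0)
```

The non-intersection of the sorted eigenvalue paths is exactly the property that connects this process to independent Brownian motions conditioned not to intersect.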

Nobody has time to read an 80-page paper [LE20], so I doubt most readers realized that the manifold Langevin algorithm paper actually contains a novel technique for establishing functional inequalities. I really doubt anyone had time to work out the intuitive consequences of these results for perturbed gradient descent, let alone their extension of the Kannan-Lovász-Simonovits (KLS) conjecture [LV18], which brings me to write this blog post.

Equivalent representation results contribute not only a connection between different concepts, but also a new set of proof techniques. Indeed, stochastic analysis has offered alternative proofs for many problems, and occasionally the proof simplifies drastically. In this post, we will discuss a particularly elegant application by Auffinger and Chen (2015) to an otherwise very difficult problem in spin glass theory.

In a similar sense to line integrals, stochastic calculus extends the classical tools of calculus to stochastic processes. One of the most elegant and useful results is the change of variable formula for stochastic integrals, commonly known as Itô's Lemma (see the end of this post for a discussion of Doeblin's contribution). While the lemma is quite easy to use, its proof usually relies heavily on technical lemmas, which makes it difficult to develop intuition, especially for a first-time reader.
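For reference, the basic form of the statement, for a standard Brownian motion \(B_t\) and a twice continuously differentiable function \(f\):

```latex
f(B_t) = f(B_0) + \int_0^t f'(B_s)\, dB_s + \frac{1}{2} \int_0^t f''(B_s)\, ds .
```

The surprise relative to the classical chain rule is the second-order correction term, which appears because the quadratic variation of Brownian motion accumulates at rate \(d\langle B \rangle_s = ds\).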

While studying two seemingly unrelated subjects, probability theory and partial differential equations (PDEs), I ran into a somewhat surprising overlap: the Poincaré inequality. On one hand, it is not out of the ordinary for analysis-based subjects to share inequalities such as Cauchy-Schwarz and Hölder; on the other hand, the two forms of the Poincaré inequality have quite different applications.
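For concreteness, these are the two standard forms being contrasted (constants and exact hypotheses vary by source):

```latex
% PDE form: for a bounded domain \Omega and u \in H_0^1(\Omega),
\|u\|_{L^2(\Omega)} \le C(\Omega)\, \|\nabla u\|_{L^2(\Omega)} .

% Probabilistic form: for a measure \mu with Poincar\'e constant C_P
% and smooth f,
\operatorname{Var}_\mu(f) \le C_P \int |\nabla f|^2 \, d\mu .
```

The first bounds a function by its gradient and drives existence and regularity estimates for PDEs, while the second bounds the variance under a measure and governs, for example, exponential convergence to equilibrium.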


Undergraduate Thesis (2015) [Document] [Presentation] [Code]

With Philippe Casgrain, Gintare Karolina Dziugaite, Daniel Roy.

Integration of Deep Learning Theories Workshop at NeurIPS 2018.

With Murat A. Erdogdu (2020). [arXiv]

Student Research Presentation Award at the Statistical Society of Canada (SSC) 2021 Annual Meeting.

With Maxime Gazeau (2021). [arXiv]

With Mihai Nica and Daniel M. Roy. [arXiv]

NeurIPS 2021.

With Sinho Chewi, Murat A. Erdogdu, Ruoqi Shen, and Matthew Zhang (2021). [arXiv]

With Raphaël Berthier (2022). [arXiv]
