News
It's spring break! I'm relaxing back at home.
[May 13, 2025:] Geometry and Probability
[May 1, 2025:] Words I Pretend to Know
[April 1, 2025:] Langevin on Manifolds
Notes
Working on notes on quantum mechanics and on derivatives (i.e., tangent spaces vs. algebraic approaches), and uploading my course notes to this blog!
Projects
Finally started a projects page! I've recently made some nice upgrades to my post component, so it looks pretty clean! ;)
🌊
I'm deciding whether to continue this project with WebGL or with Three.js.
I'm also researching methods for generating the 3D scenes I want for this project automatically.
In the meantime, I've decided to proceed with some preliminary prototypes of the other interactive parts of this project.
Orange Juice
I like orange juice. :)
Mlog
Words I Pretend to Know
May 1, 2025
By Aathreya Kadambi
This post got very delayed by like… a year? But here it is now. Here are some words I better understand now:
KL Divergence: A measure of how different two distributions are (not a true distance, since it isn't symmetric). For distributions $P$ and $Q$,
$$D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)},$$
with the sum replaced by an integral in the continuous case.
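As a quick sanity check, here's a minimal numerical sketch of the discrete formula (my own toy example, assuming $p$ and $q$ are probability vectors over the same finite support):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D(p || q); assumes q > 0 wherever p > 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ~0.51
print(kl_divergence(q, p))  # ~0.37, different from above: KL is not symmetric
```

The asymmetry in the last two lines is exactly why it fails to be a metric.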
Homology: Homology groups basically give us an algebraic framework for working with topological structures. We can form chain complexes, and by taking kernels and images of the boundary maps we get "cycles" (chains with zero boundary) and "boundaries" (chains that bound something), respectively. Taking quotients of these objects in the chain complexes then gives us homology:
$$H_n = \ker \partial_n \,/\, \operatorname{im} \partial_{n+1}.$$
As it is written in Fomenko and Fuchs:
Compared with homotopy groups, they have an important flaw—their accurate definition requires substantial algebraic work—and important advantages: Their computation is much easier, we will calculate them more or less immediately for the majority of topological spaces known to us, and also they are geometrically better visualizable [there are no counterintuitive phenomena like $\pi_n(S^m) \neq 0$ for $n > m$].
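To make the claim that "their computation is much easier" concrete, here is a minimal sketch (my own toy example, working over $\mathbb{Q}$ so that ranks of boundary matrices suffice) for the boundary of a triangle, which is a circle up to homotopy:

```python
import numpy as np

# Boundary of a triangle: vertices a, b, c and edges ab, bc, ca.
# Rows index vertices, columns index edges, with the usual signed
# boundary coefficients, e.g. ∂(ab) = b - a.
d1 = np.array([
    [-1,  0,  1],   # a
    [ 1, -1,  0],   # b
    [ 0,  1, -1],   # c
])

rank_d1 = np.linalg.matrix_rank(d1)  # = 2
rank_d2 = 0                          # no 2-simplices, so ∂_2 is the zero map

n_vertices, n_edges = d1.shape
# Betti numbers over Q: b_n = dim(ker ∂_n) - rank(∂_{n+1})
b0 = n_vertices - rank_d1            # ker ∂_0 is all of C_0
b1 = (n_edges - rank_d1) - rank_d2   # dim(ker ∂_1) = n_edges - rank(∂_1)
print(b0, b1)  # 1 1
```

One connected component and one loop, as expected for a circle.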
I will postpone the last two words, Tor and Orientability, to a future post. For now, I need to solidify my understanding of Riemannian geometry and optimal transport; I will come back to algebraic topology after that.
What fueled this post? The other day (and a year ago), a friend of mine asked me what KL divergence was, and I felt like laughing, because KL divergence is one of those things I always pretended to understand: everyone around me said it so much that I just had to follow along. A horrible habit, but it is what it is. It all started in elementary school, when I was too lazy to walk over and get a dictionary every time I stumbled across a word I didn't know, and started trying to use context clues instead. But with some things, context clues only get you so far.
The good thing is, most of the words I don't know now I will eventually figure out one day. Like the word tensor. In my defense, the first time I was taught what a tensor was, I was told the universal property of the tensor product (it was also the first time I was hearing what a universal property was). Pretty traumatizing. But these days (a year ago), I've come to picture tensors as pretty much looking like multidimensional arrays. The universal property is pretty nice, though I wasn't sure it was the most intuitive way to first meet tensors. These days (now), having better understood and appreciated category theory, I do like the universal property definition a lot.
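For anyone in the same boat, here is the statement I mean, written out for vector spaces over a field $k$: the tensor product is the universal recipient of bilinear maps,
$$\text{for every bilinear } h : V \times W \to Z \text{ there is a unique linear } \tilde{h} : V \otimes W \to Z \text{ with } h(v, w) = \tilde{h}(v \otimes w).$$
In other words, bilinear maps out of $V \times W$ are the same thing as linear maps out of $V \otimes W$.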
Anyway, the good thing is that over time, I’ve come to learn the definitions of many of those things which I had previously avoided looking in a dictionary for.
Sometimes I feel guilty about delayed posts like this one, but to be honest, these are the posts where I’ve taken the time and learned the most new words instead of staying in my comfort zone and using all the same old ones.