News

My basil finally developed roots! I'm currently reading about quantum mechanics, ferrofluids, and language models.

Blog

[08/09/2024:] Motivating Ladder Operators II
[08/08/2024:] A Cool Way to Garden
[08/08/2024:] Life and Basil Limeade
[08/05/2024:] Motivating Ladder Operators
[08/04/2024:] Migrated to AstroJS
[07/31/2024:] The Classical in the Quantum

Notes

Working on notes on quantum mechanics and derivatives, and on uploading my previous course notes to this blog!

Projects

Finally started a projects page! I've recently made some nice upgrades to my post component, so it looks pretty clean! ;)

🌊

I'm considering whether to continue this project using WebGL or Three.js.

I'm also researching methods for generating the 3D scenes I want for this project automatically.

In the meantime, I've decided to proceed with some preliminary prototypes of the other interactive parts of this project.

Orange Juice

I like orange juice. :)

Mlog


Grafting and GPUs

May 18, 2024
By Aathreya Kadambi

I'm currently writing to you from SFO Terminal 1 as I wait for my flight to Texas! I can't wait to learn about PDEs this summer.

Grafting

There's no Wi-Fi in tunnels, so on the train ride to the airport, I decided to think about the prospects of becoming a hobby horticulturist in my new apartment. I recently saw a reel on Instagram about how adding lemon juice to smoothies containing spinach can "activate" the spinach and improve its health benefits. Thinking about it, I wonder if doing this also reduces the probability of developing kidney stones. That thought led me down the following rabbit hole of a question: "Would it be feasible to grow limes or lemons at my new apartment?"

Well, I started with the prospects of lemons; after all, this blog is called lemonade. (As a side note, it's called lemonade and not limeade despite the fact that I like limes more than lemons!) Anyway, I looked into it, and it turns out that lemons aren't "true to seed". Supposedly, that means that if I plant a lemon seed, the resulting fruit might not even taste good, because lemons are so genetically diverse; in fact, it will probably taste pretty bad! I would hate to grow a lemon tree for a year (or even five) only to find that the lemons it made weren't what I was expecting! I can't even grow many of them, because I simply don't have the space. Additionally, according to Google's AI, it apparently isn't possible to tell in advance whether a lemon seed will produce good fruit! 😭 I wonder if there's a way to reduce the genetic variation to make lemons easier to grow from seed?

Luckily, key limes, sweet oranges, and basically all of those yummy varieties that we love and enjoy are a different story. I can't wait to try growing a key lime.

That being said, how do we get a consistent supply of yummy lemons if they're so genetically diverse? As a farmer, I would hate to have to grow 100 trees only to sell lemons from a handful of them. I haven't looked into it very thoroughly yet, but one approach a Reddit user mentioned was grafting! Apparently, you can take plants and merge them into sort of "hybrid" plants, and they can even have the best parts of both of the originals! One use of this method is to give a better root system to plants whose natural roots are weak.

This has to be the coolest technique I've ever heard of. Does this mean we could plant something with an immaculate root system, then graft five or six different plants of other types onto it, and have, for example, a banana-orange-lime-apple-pear tree?! There's no way that could be true… could it? I need to do some more research on this.

GPUs

On the other hand, one thing you apparently can't graft together, unfortunately, is RAM! I learned that from an entertaining YouTube video by Linus Tech Tips. I've recently been trying to improve my hardware situation… limited support for building things with my M2 chip has made me consider whether it would be a good idea to set up a better system. Google Colab doesn't seem to be much help, since it has pretty strict limits on GPU usage and I can barely run one CV notebook (if even that) before I hit my limit. Maybe I could "graft" together a bunch of different low-end GPUs?

Interestingly, it turns out that HuggingFace is now working on ZeroGPU, a project dedicated to democratizing GPU access! I can't wait for that. For some other updates, today I read about rotary embeddings (RoPE), a type of position encoding, from this article by EleutherAI. This was super cool; reading about how they arrived at using complex numbers in response to the geometry of their problem was pretty insightful. (I've put a toy sketch of the idea below, after the list.) I also read about a few other things:

  1. ChuXin: a 1.6B-parameter language model (that actually uses RoPE)
  2. SQLCoder-8B: a Llama 3-based model for generating SQL
  3. Stable LM 2: a 1.6B model with lighter hardware requirements
  4. Are GPTs Good Embedding Models: an article highlighting the benefits of fine-tuning with contrastive loss
  5. This article about some research at UC Berkeley!

and I'm looking forward to reading the paper EE-LLM: Large-Scale Training and Inference of Early-Exit Large Language Models with 3D Parallelism by Yanxi Chen, Xuchen Pan, Yaling Li, Bolin Ding, and Jingren Zhou at some point on my flight (assuming I don't fall asleep).
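Since the core trick behind RoPE is small enough to fit in a few lines, here's a toy NumPy sketch of the idea as I understand it: each pair of channels is read as a complex number and rotated by an angle proportional to the token's position. To be clear, this is just my own illustration, not EleutherAI's or any library's actual code, and the interleaved channel pairing and the 10000^(-2i/d) frequencies are assumptions on my part.

```python
import numpy as np

def rope(x, base=10000.0):
    """Toy rotary position embedding for x of shape (seq_len, dim).

    Channel pairs (x[:, 2i], x[:, 2i+1]) are viewed as complex numbers and
    rotated by position * base**(-2i/dim), so dot products between rotated
    queries and keys end up depending on relative position.
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "RoPE pairs up channels, so dim must be even"

    freqs = base ** (-np.arange(0, dim, 2) / dim)           # (dim/2,)
    angles = np.arange(seq_len)[:, None] * freqs[None, :]   # (seq_len, dim/2)

    x_complex = x[:, 0::2] + 1j * x[:, 1::2]                # (seq_len, dim/2)
    rotated = x_complex * np.exp(1j * angles)

    out = np.empty_like(x)
    out[:, 0::2] = rotated.real
    out[:, 1::2] = rotated.imag
    return out

# Sanity check of the relative-position property: shifting a query's and a
# key's positions by the same amount leaves their dot product unchanged.
rng = np.random.default_rng(0)
q, k = rng.standard_normal((2, 6, 8))
score_a = rope(q)[2] @ rope(k)[5]                 # positions 2 and 5
q_pad = np.vstack([np.zeros((1, 8)), q])          # same vectors, positions +1
k_pad = np.vstack([np.zeros((1, 8)), k])
score_b = rope(q_pad)[3] @ rope(k_pad)[6]         # positions 3 and 6
assert np.allclose(score_a, score_b)
```

The part I find neat is exactly that last check: the rotation bakes position into the vectors themselves, yet the attention scores only see the offset between positions.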

Until tomorrow!



As a fun fact, it might seem like this website is flat because you're viewing it on a flat screen, but the curvature of this website actually isn't zero. ;-)

Copyright © 2024, Aathreya Kadambi

Made with AstroJS, React, and Tailwind.