FSDP for Dummies
I’ve always struggled to understand the intuitions behind Fully Sharded Data Parallel beyond the high-level idea of “shard everything.” Without a systems background, fundamental primitives like “all-reduce” and “reduce-scatter” weren’t in my vocabulary. But FSDP is not conceptually complicated, especially once you state what the goals are (the rest is nearly necessitated by the engineering). This post is an attempt to deconstruct the algorithm from first principles as a non-systems person. I bring up the primitives in the context where they’re needed, which I think reinforces the intuition much better. Most ML researchers have a stronger understanding of models, params, and the optimization process than of systems jargon anyway. ...
Remarks on Spatial Localization in VLMs
Prelude This all started when I came across this tweet from Timothee Darcet (co-first author on DINOv2): https://x.com/TimDarcet/status/1726320282028360131?s=20 It was in response to people overreacting to the idea that the final problem in computer vision was for AI to tell the difference between a blueberry muffin and a chihuahua, which, to be fair, is a rather funny joke. It turns out that AI models can do this quite well, and have been able to ever since CLIP came out! So what’s the big deal? ...
My PhD Interview Experience
Over the last few months, I’ve been fully immersed in the CS PhD application process. I’ll make a later blog post detailing the overall process, but I thought I’d write up a quick post about my recent (and hopefully future!) experiences with the interview portion of the process. ...
New Year Resolutions for 2023
Seeing as it’s the new year, I took some time to think about my 2023 resolutions, like most people do. ...