Something I learned
Autoregressive LLMs
This week has been standard work-wise. I didn't learn or pick up anything I didn't know before.
But I listened to this interview with Yann LeCun. In the first half or so, he talks about the limitations of auto-regressive LLMs. Auto-regressive is a fancy way of saying today's LLMs are optimized to generate the next word (or token) based on past inputs and the words generated so far. I've heard this before many times but never really understood how it limits the model.
In the interview, Yann talks about how these LLMs don't understand or pick up concepts about the world based on how they're trained and evaluated. They truly are just text prediction machines that become more and more inaccurate the longer you ask them to predict.
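LeCun's compounding-error argument can be made concrete with a little arithmetic. The sketch below assumes, as a simplification, that each generated token has some small independent chance of taking the output off track; under that assumption, the probability that a whole sequence stays on track shrinks exponentially with its length:

```python
def p_on_track(e: float, n: int) -> float:
    """Probability an n-token continuation contains no divergent token,
    assuming each token independently goes wrong with probability e."""
    return (1 - e) ** n

# With even a 1% per-token error rate, long generations fall apart:
for n in (10, 100, 1000):
    print(n, round(p_on_track(0.01, n), 3))  # 10 -> 0.904, 100 -> 0.366, 1000 -> 0.0
```

The independence assumption is the weakest part of the argument (errors in real models are correlated, and sampling can recover), but it captures why longer predictions get less and less reliable.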
It's amazing that they've gotten this good with such limitations. They've gotten so good at imitating us by only predicting the next word that most people are overestimating their capabilities. I say this because I don't know how many of the companies building planning, decision-making, or higher-level reasoning on top of LLMs are going to make it.
The talk pushed me to think about LLM use cases in a more grounded way. As in, what else do we have in terms of text corpora that we haven't explored yet? One example that comes to mind is patient medical notes. Doctors write extensive summaries of care every time they treat a patient. This sounds like a good use case for helping doctors write better and faster summaries.
Gratefulness
Eid was this week. As part of the usual Eid family calls, I tried calling my grandma. My uncle picked up and told me she was busy watching the game. I never paid attention to the fact that my grandma now watches soccer with my uncles. What's even more adorable is that she's cheering for the same team as my youngest uncle.
We later had a call, and it was a really nice chat. She was in a good mood—not that she isn't usually, but sometimes she seems distracted and is too polite to say it.
She and I have been having regular calls since I last saw her in July, and I'm grateful for that. My uncle told me how much she enjoys our calls, and I wish I could tell her how much that means to me. I honestly probably enjoy them more than she does. Many things led to us not staying in touch over the years, but I'm really glad and grateful that we got past that after seeing her this last summer.