Oct 24th week letter
You subscribed to this. Now read.
I will be changing things up as I get the format, genre and purpose of this newsletter right. Here’s a fresh take.
Derivative Hot takes!
A section where I repeat hot takes that resonate with me.
Derivative Hot take 1 - 2026 will be the year of architectures.
Derivative Hot take 2 - The Chinese AI development ecosystem is powered, in part, by the new (assumed) identity of the pioneers in this field. A strong sense of nationalism is embedded in that identity, driving progress with any (and every) accessible resource.
Controversial and useless things - It’s binge-week.
The De De Pyaar De 2 trailer dropped last week and I am psyched for this movie.
After binge-watching Stranger Things, I am all prepared and well revised for the next season!
Now You See Me, Now You Don’t can’t come to cinemas soon enough.
The Family Man Season 3! WHAT! I have lower expectations though - mainly because the material seems to be toned down to ensure the current Govt doesn't feel any heat in light of this year's Manipur events.
Open sourcing efforts
I wrote a FastAPI backend template using the abundant AI agents at my disposal. [GitHub]
Created a Spring Boot project to solve a toy fintech assessment. [GitHub]
Added an architecture document to Microsoft Amplifier [GitHub]
In other news
DeepSeek-OCR
I read the article on the DeepSeek-OCR paper, whose key breakthrough is compressing textual context using images. They created a novel architecture that pairs a DeepEncoder [ nerd galore: a Segment Anything Model variant (patch size 16), a convolutional compressor (2 layers, 16x compression of vision tokens) and CLIP-large (with the patch embedding layer removed) ] with DeepSeek-3B-MoE for decoding; a rough sketch of the pipeline shape is below. This breakthrough achieved two things.
Efficiency at a 9-10× compression ratio! That translates to roughly 800 vision tokens instead of 6000 (MinerU 2.0).
It excited the entire AI community. Andrej Karpathy wrote about it. More brownie points for hot take 2.
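Here is a minimal, hypothetical sketch (PyTorch, with toy stand-in modules rather than the real SAM/CLIP/MoE weights) of the pipeline shape as I understand it: a patch-16 encoder produces a dense grid of vision tokens, a small convolutional stage compresses the token count 16x, and a transformer stage stands in for CLIP-large. The class names, dimensions and layer sizes are my own placeholders, not the paper's.

```python
# Toy sketch of a DeepEncoder-style pipeline: pixels -> dense vision tokens ->
# 16x fewer compressed tokens -> transformer features. Not the real model.
import torch
import torch.nn as nn

class ToyDeepEncoder(nn.Module):
    def __init__(self, dim: int = 256):
        super().__init__()
        # Stand-in for the SAM-style patch-16 encoder: one token per 16x16 patch.
        self.patchify = nn.Conv2d(3, dim, kernel_size=16, stride=16)
        # Stand-in for the 2-layer convolutional compressor: the stride-4 conv
        # shrinks each spatial axis 4x, i.e. 16x fewer vision tokens overall.
        self.compressor = nn.Sequential(
            nn.Conv2d(dim, dim, kernel_size=4, stride=4),
            nn.Conv2d(dim, dim, kernel_size=1),
        )
        # Stand-in for CLIP-large without its patch embedding: it consumes the
        # already-compressed token grid.
        self.clip_like = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
            num_layers=2,
        )

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        x = self.patchify(images)            # (B, dim, H/16, W/16)
        dense = x.flatten(2).shape[-1]       # dense vision token count
        x = self.compressor(x)               # (B, dim, H/64, W/64)
        x = x.flatten(2).transpose(1, 2)     # (B, compressed_tokens, dim)
        print(f"dense vision tokens: {dense} -> compressed: {x.shape[1]}")
        return self.clip_like(x)

encoder = ToyDeepEncoder()
page = torch.randn(1, 3, 1024, 1024)         # one rendered page image
tokens = encoder(page)                       # these would feed the MoE decoder as context
print(tokens.shape)                          # torch.Size([1, 256, 256]): 256 tokens of dim 256
```

The point of the sketch is only the token arithmetic: 4096 dense patch tokens collapse to 256 compressed ones before the decoder ever sees them, which is where the efficiency win comes from.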
Netflix blog on building a resilient data platform with a Write-Ahead Log
At Netflix scale, it becomes imperative to create a generic WAL system that handles data corruption, audits, etc. and maintains consistency across the platform.
Creating one generic WAL system lets the Netflix Data Platform team use the same mechanism to recover from outages or failures, schedule notifications, and bill customers in a timely manner.
Kafka handles real-time streaming, while SQL (through the WAL) provides reliable, delayed retries (a toy sketch of this idea is below).
The different persona examples detail the WAL as a tool for generic cross-region replication, delayed queues, handling multi-partition mutations, and more.
From the closing thoughts, the emphasis on pluggable architecture and separation of concerns to enable scale resonated with me.
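To make the delayed-retry idea concrete, here is a toy, hypothetical sketch of a SQL-backed WAL: work is recorded durably with a deliver-after timestamp and replayed once it comes due. This is my own approximation, not Netflix's schema or API; the table and function names are made up.

```python
# Toy delayed-queue WAL: append durably first, replay due entries later.
import sqlite3
import time
import json

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE wal (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        payload TEXT NOT NULL,          -- the mutation to (re)apply
        deliver_after REAL NOT NULL,    -- unix timestamp when it becomes due
        processed INTEGER DEFAULT 0
    )
""")

def append(payload: dict, delay_seconds: float = 0.0) -> None:
    """Durably record work before acknowledging it, optionally delayed."""
    db.execute(
        "INSERT INTO wal (payload, deliver_after) VALUES (?, ?)",
        (json.dumps(payload), time.time() + delay_seconds),
    )
    db.commit()

def replay_due(apply) -> None:
    """Replay every unprocessed entry whose delay has elapsed."""
    rows = db.execute(
        "SELECT id, payload FROM wal WHERE processed = 0 AND deliver_after <= ?",
        (time.time(),),
    ).fetchall()
    for entry_id, payload in rows:
        apply(json.loads(payload))  # e.g. retry the downstream write
        db.execute("UPDATE wal SET processed = 1 WHERE id = ?", (entry_id,))
    db.commit()

# Usage: a downstream write failed, so we park it in the WAL and retry later.
append({"table": "billing", "op": "charge", "customer": 42})
replay_due(lambda mutation: print("reapplying:", mutation))
```

The design point I took away is the separation of concerns: producers only append, a separate processor decides when and how entries are replayed, and the same pattern covers retries, delayed notifications, and replication.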
Podcast corner
Dwarkesh Patel and Andrej Karpathy Podcast
Evolution as a lens to predict the future of artificial intelligence, specifically large language models and agentic AI. Andrej believes that evolution is not the best motivator or practical way of thinking about the future. This is where his most striking thesis comes in: he described current LLMs as ethereal ghosts. Why ghosts? These entities are trained to mimic human data, creating pervasive digital entities that speak like humans (on the internet).
For continued progress, he speaks of growth across the spectrum - architectures, loss functions, and optimizers - which has continued since the start.

