Dylan Patel on GPT-4, China, and AI

Spending has grown over the past year to $200-something billion this year. And next year it could nearly double again, or more than double, based on what we see with data center footprints being built out all across the US and the rest of the world. It's going to be really hard for China to keep up, given these rules.

Yes, there will always be smuggling, and DeepSeek-level models, GPT-4-level models, o1-level models will be trainable on what China can get, maybe even the next tier above that. But if we speedrun a couple more jumps to billion-dollar models, 10-billion-dollar models, then it becomes, "Hey, there is a compute disadvantage for China, both for training models and for serving them."

And the serving part is really critical. DeepSeek cannot serve their model today. It's completely out of inventory. It's already started falling in the App Store in actual downloads, because you download it and try to sign up.

They say they're not taking registrations because they have no capacity. You open it up and get fewer than five tokens per second, if your request even gets approved, because they just don't have enough GPUs to serve the model, even though it's incredibly efficient.

Said Dylan Patel.