> It's an extremely powerful position to have every at-home AI geek's setup to be bound to using intel cards
So, incredibly small market share while your competitors already have the first-mover advantage and nailed down the ecosystem? With no data to back this up, I think graphics cards for local LLM use aren't really in demand. Gaming is probably a more attractive market, but then again, that's not even where the real money is.
>So, incredibly small market share while your competitors already have the first-mover advantage and nailed down the ecosystem?
Exactly. This x100. It was easy for Nvidia to succeed in the LLM market by winging it, back when there was no LLM market, so they had the greenfield and first-mover advantages.
But today, when Nvidia dominates a mature LLM market, Intel winging it the same way won't bring anywhere near the same success that Nvidia had.
Ferruccio Lamborghini also built a successful sports car company by building tractors and cars in his garage. Today you won't be able to create a Lamborghini competitor with something you can build in your garage; the market has changed unrecognizably in the meantime.
The market share is incredibly small but also incredibly well aimed.
The people learning how to run local LLMs will be the people directing the build-out of on-prem transformers for small-to-midsize companies. The size of the market is irrelevant here; who is in that market, and the power they will have, is what's extremely relevant.