Yeah, but even then they won't describe it using the same sort of language that everyone else developing these things does. How many parameters? What kind of corpus was it trained on? MoE, a single model, or something else? Will the weights be available?
It doesn't even use the words "LLM", "multimodal" or "transformer" which are clearly the most relevant terms here... "foundation model" isn't wrong but it's also the most abstract way to describe it.
"Foundation model" is not Amazon lingo, though; it's a pretty standard industry term at this point. If you're doing any sort of AI in prod, you know what it means.
> How many parameters? What kind of corpus was it trained on?
It's rare for the leading model providers to answer these questions.
As someone who applies these models daily, I agree with the dead comment from meta_x_ai. Your questions are interesting/relevant to a person developing these models, but less important to the average person utilizing these models through Bedrock.