That might change a little with OpenAI's gpt-oss releases, which may be too new to show up in this data.
It's not great, and it's extremely censored compared to non-US models (yes, the irony), but it's a sparse model that's cheap to run on H100s for how fast it is.
But yeah, it seems like Meta is imploding from tech bro overload. Google's Gemini is good, but Google is purposefully constraining its open source team so it doesn't compete with the Gemini API. Anthropic is a censorship meme and will never release anything, and smaller US startups have neither the attention nor the funding to compete. Their hope seems to be inventing something proprietary and hypey that gets them bought up, not actually building something functional.
And one unspoken observation among the ML crowd is that the Chinese firms are both training on the outputs of American models and perhaps sharing high-quality, questionably sourced (Chinese govt?) datasets with each other under the table, given how many of their models share the same quirks.