With the rise of Large Language Models (LLMs) and the growing shift toward open-source models, the traditional competitive edge in AI is evolving. This shift raises critical questions about the future valuations of companies in the space, especially as reliance on Nvidia for compute is being challenged by innovations in custom chip design.

Leading companies such as Tesla, Google, and Microsoft are now developing their own chips to reduce their dependence on Nvidia. At the same time, startups and independent developers are building clusters on Apple Silicon, which offers a compelling cost-performance ratio for AI computing workloads.