The increasing demand for GPUs for AI applications has led to supply constraints, with even the CEO of chipmaker TSMC suggesting that general supply could remain limited until 2025. The crunch has also drawn scrutiny from the US Federal Trade Commission, which is investigating partnerships between AI startups and cloud giants over potential anti-competitive practices. Inference.ai is a platform that provides infrastructure-as-a-service cloud GPU compute through partnerships with third-party data centers, using algorithms to match companies' workloads with available GPU resources. The company claims to offer cheaper GPU compute with better availability than the major public cloud providers, and it recently closed a $4 million funding round to build out its deployment infrastructure.