Calling it the highest-performing chip among custom cloud accelerators, the company says Maia is optimized for AI inference across multiple models.
SCX.ai's deployment at Equinix SY5 delivers the lowest carbon output per AI token in Asia-Pacific, avoiding water cooling and high ...
The chip is built on Taiwan Semiconductor Manufacturing Co.’s (TSM) advanced 3-nanometer manufacturing process with ...
Designed using TSMC's 3nm process technology, the chip has already been deployed in the company's US Central data center region ...
Training gets the hype, but inferencing is where AI actually works — and the choices you make there can make or break real-world deployments.
Neurophos is taking a crack at solving the AI industry's power efficiency problem with an optical chip that uses a composite material to do the math required in AI inferencing tasks.
LAS VEGAS, January 07, 2026--(BUSINESS WIRE)--Today at Tech World @ CES 2026 at Sphere in Las Vegas, Lenovo (HKSE: 992) (ADR: LNVGY) announced a suite of purpose-built enterprise servers, solutions, ...
Patent-pending, industry-first technology cuts compute costs by up to 60% and ensures a high-quality user experience by dynamically distributing individual AI model inferencing between local devices ...
GigaIO, a leading provider of scalable infrastructure specifically designed for AI inferencing, today announced it has raised $21 million in the first tranche of its Series B financing. The round was ...
In the evolving world of AI, inferencing is the new hotness. Here’s what IT leaders need to know about it (and how it may impact their business).