This virtual panel brings together engineers, architects, and technical leaders to explore how AI is changing the landscape ...
Software developers have spent the past two years watching AI coding tools evolve from advanced autocomplete into something that can, in some cases, build entire applications from a text prompt. Tools ...
The Maia 200 chip is built on an advanced 3-nanometer process from TSMC, the same chip manufacturer that Nvidia uses.
Microsoft unveils its Maia 200 AI chip, offering 30% better performance and improved cloud efficiency with wider customer ...
Yesterday, Microsoft made the software for its Maia 200 chip – its second-generation inference processor – available to ...
Microsoft has announced the launch of its latest chip, the Maia 200, which the company describes as a silicon workhorse ...
As AI coding tools become more sophisticated, engineers at leading AI companies are moving away from writing code altogether ...
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation. Maia 200 is an AI inference powerhouse: an ...
A private school that’s opening campuses from New York to California uses AI bots to teach kids their academic subjects in ...
Calling it the highest-performance chip of any custom cloud accelerator, the company says Maia is optimized for AI inference on multiple models.
Hyper3D, the platform developed by Deemos Tech, offers a suite of AI-powered generation tools that process various input ...