Cisco Systems on Tuesday launched networking chips for AI supercomputers designed to compete with offerings from Broadcom and Marvell Technology.
Chips from its Silicon One series are being tested by five of the six major cloud providers, Cisco said, without naming the firms. Key cloud players include Amazon Web Services, Microsoft Azure and Google Cloud, which together dominate the market for cloud computing, according to BofA Global Research.
The rising popularity of AI applications such as ChatGPT, which run on networks of specialized chips called graphics processing units (GPUs), has made the speed at which those chips communicate with one another critically important.
Cisco is a major supplier of networking equipment, including Ethernet switches, which connect devices such as computers, laptops, routers, servers and printers to a local area network.
It said the latest generation of its Ethernet switches, the G200 and G202, has double the performance of the previous generation and can connect up to 32,000 GPUs.
"G200 & G202 are going to be the most powerful networking chips in the market fueling AI/ML workloads enabling the most power-efficient network," Cisco fellow and formerly principal engineer Rakesh Chopra said.
Cisco said the chips can help carry out AI and machine learning tasks with 40% fewer switches and lower latency, while consuming less power.
In April, Broadcom announced its Jericho3-AI chip, which can connect up to 32,000 GPUs.