Cisco shifts focus to AI with new infrastructure solutions
Cisco recently pivoted toward artificial intelligence by unveiling new infrastructure solutions, signaling a shift in its strategic focus. Traditionally recognized for connecting the various elements of data center and cloud infrastructure, Cisco is seeking to establish itself as a prominent player in the generative AI domain. The transformation was highlighted at its latest Partner Summit, where the company introduced a series of new servers and switches tailored to AI workloads.

Central to the new offerings are servers powered by Nvidia GPUs and AMD CPUs, engineered for demanding AI workloads. Cisco also launched a high-speed network switch designed to interconnect multiple AI-centric servers. These products are complemented by preconfigured PODs, which bundle integrated compute and network infrastructure designed for specific applications.

Among the new servers is the UCS C885A M8, a dense system equipped with up to eight Nvidia H100 or H200 GPUs and AMD Epyc CPUs. Featuring Nvidia Ethernet cards and DPUs, these servers can operate standalone or as part of larger, interconnected clusters. The accompanying Nexus 9364E-SG2 switch, built on Cisco's G200 custom silicon, supports 800G speeds, providing high-speed, low-latency connectivity across multiple server arrays.

A noteworthy introduction is Cisco's AI PODs, Cisco Validated Designs (CVDs) that combine CPU, GPU, storage, and networking with Nvidia's AI Enterprise platform software. The PODs are turnkey solutions that simplify AI deployment by removing much of the complexity of infrastructure selection. Initially they target AI inferencing, but they are expected to expand toward AI model training over time. Cisco also plans to integrate these solutions with its Intersight management and automation platform, improving device management and integration with existing infrastructure.

While these offerings may not win over early adopters that built their AI stacks in the cloud, they should resonate with enterprises now building out their own infrastructure as part of their GenAI journey. As AI applications increasingly process sensitive data, many enterprises are opting for on-premises deployments to maintain control over their most critical data. So even with Cisco's late entry into the AI infrastructure space, the timing aligns with enterprise needs, particularly as the trend toward on-premises data processing grows. A recent study found that 80% of companies are interested in running some GenAI applications on-premises, suggesting a rapid evolution toward hybrid AI models that blend on-premises and cloud resources. Cisco's initiatives thus position it well within this emerging market segment, offering compelling options for enterprises navigating the transition.