
AI & AUTOMATION | CHANNELVISION, JANUARY - FEBRUARY 2025

Goodbye Network, Hello Computing Fabric

The headline might be a little dramatic, but the term "networking" in the data center could eventually be seen as dated because of AI, argued Gilad Shainer, senior vice president of networking at Nvidia. Instead, data center architecture will transform into "an integrated compute fabric" that enables thousands of accelerators to communicate efficiently with one another via scale-up and scale-out communications, spanning miles of cabling and multiple data center facilities, Shainer explained.

This integrated compute fabric will include NVIDIA NVLink for scale-up communications, along with scale-out capabilities enabled by intelligent switches, SuperNICs and DPUs. "This will help securely move data to and from accelerators and perform calculations on the fly that drastically minimize data movement," said Shainer. "Scale-out communication across networks will be crucial to large-scale AI data center deployments — and key to getting them up and running in weeks versus months or years."

As agentic AI workloads grow, requiring communication among multiple interconnected AI models working together rather than monolithic, localized AI models, compute fabrics will be essential to delivering real-time generative AI, he continued. In this world of distributed AI, all data centers will become accelerated as new approaches to Ethernet design emerge that enable hundreds of thousands of GPUs to support a single workload. This will help democratize AI factory rollouts for multi-tenant generative AI clouds and enterprise AI data centers, said Shainer, and it will enable AI to expand quickly into enterprise platforms while simplifying the buildup and management of AI clouds.

"Companies will build data center resources that are more geographically dispersed — located hundreds or even thousands of miles apart — because of power limitations and the need to build closer to renewable energy sources," Shainer concluded. "Scale-out communications will ensure reliable data movement over these long distances."

The Challenge of AI Challenges

The benefits of artificial intelligence are fairly well understood: automating routine tasks and processes to save time and money, and digging deeper into data and thinking in ways that were untenable without high-powered intelligence. Realizing those benefits, however, will require overcoming a diverse set of challenges, CompTIA survey data suggests. Those expected challenges include "identifying use cases," which ranked as high as third on the list of expected challenges with AI.

AI's Diverse Challenges (Source: CompTIA, 2025)

Expected/experienced challenges:
  Cybersecurity/privacy concerns      40%
  Cost of new applications            35%
  Identifying use cases               33%
  Balancing AI and human efforts      33%
  Cost of infrastructure              32%
  Building datasets for training      28%
  Insufficient technical skill        26%
  Lack of understanding on AI output  26%
  Difficulty changing workflows       25%

Expected/experienced benefits:
  Time savings on routine tasks       51%
  Automating IT operations            49%
  Deeper data analysis                45%
  Automating business workflows       44%
  Ability to redeploy employees       33%
  New insights/suggestions            32%
