Edgecore introduces 32-port and 64-port 800G AI switches for AI/ML workloads, giving AI data centers a variety of design choices.

Edgecore Networks Showcases Comprehensive End-to-End AI Data Center Solutions at the OCP Global Summit

Lucille Lu, Head of Marketing
Lucille_lu@edge-core.com

OCP -- Edgecore Networks, a leader in providing innovative network solutions for enterprises, data centers, and service providers, announces a new 32-port addition to its AIS series of 800G AI switches, designed to meet the demanding needs of AI and machine learning (ML) workloads.

These new switches offer low latency, high radix, and dynamic load balancing, which together reduce congestion across a lossless network. Completing the total solution, Edgecore also provides a range of pluggable optical modules and a robust SONiC-based network operating system, enabling seamless integration with modern data centers. Together with Edgecore’s SONiC software ecosystem partners, such as Aviz Networks, BE Networks, Dorado, and Netris, these solutions give users unparalleled visibility, performance assurance, and efficient deployment for enterprise data center networks.

“We are thrilled to showcase our latest AI/ML data center series of 800G platforms at the OCP Summit,” said Nanda Ravindran, Vice President of Product Management at Edgecore Networks. “Through the open-source community and our own innovation, we offer our customers flexibility with AI/ML fabric solutions, integrating technologies like RoCEv2, DCQCN, and DLB. The SONiC ecosystem enables seamless scaling and optimization, crucial for handling the unique traffic patterns of latency-sensitive AI workloads.”

The Edgecore end-to-end AI data center solutions:

  • Rich Ecosystem and Vendor-Agnostic: SONiC's rich ecosystem supports broad compatibility with the industry’s most popular server NIC devices. Its vendor-agnostic approach and deep integration with telemetry tools ensure adaptability, scalability, and optimal performance for AI/ML workloads.
  • 800G x 64-Port Switch: Supports 128-256 nodes and features NCCL PXN to eliminate latency from cross-switch communication, ensuring seamless scalability for AI workloads. Offering both 64-port and 32-port switches provides the most flexibility in AI data center design planning.
  • Lossless Ethernet for AI RDMA Traffic: Guarantees maximum throughput and minimal latency, optimized for high-performance AI applications.
  • DCQCN with PFC/ECN Support: Leverages advanced traffic management techniques to control AI fabric traffic, ensuring smooth operation with minimal congestion.
  • ECMP Eligible Mode: Breaks down large "elephant flows" into smaller flowlets, preventing hash polarization and congestion across ECMP links, improving load distribution and performance.
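To make the flowlet idea above concrete, here is a minimal, illustrative sketch (not Edgecore's implementation; the gap threshold and link count are assumed values): a long-lived elephant flow is split into flowlets whenever the idle gap between its packets exceeds a threshold, so each new flowlet can be steered to the least-loaded ECMP link without reordering packets already in flight.

```python
# Hypothetical sketch of flowlet-based ECMP load balancing.
# FLOWLET_GAP and NUM_LINKS are illustrative assumptions, not product values.
FLOWLET_GAP = 0.0005   # 500 us idle gap starts a new flowlet (assumed)
NUM_LINKS = 4          # number of ECMP next hops (assumed)

class FlowletBalancer:
    def __init__(self, num_links=NUM_LINKS, gap=FLOWLET_GAP):
        self.num_links = num_links
        self.gap = gap
        self.state = {}  # flow 5-tuple -> (last_seen_time, assigned_link)

    def pick_link(self, flow_key, now, link_load):
        last = self.state.get(flow_key)
        if last is not None and now - last[0] < self.gap:
            # Same flowlet: keep the current link to preserve packet order.
            link = last[1]
        else:
            # New flowlet: rebalance onto the least-loaded link.
            link = min(range(self.num_links), key=lambda i: link_load[i])
        self.state[flow_key] = (now, link)
        return link
```

Because a new flowlet only begins after an idle gap longer than the path-delay difference, moving it to a different link cannot overtake packets of the previous flowlet, which is what lets this scheme avoid the hash polarization of static per-flow ECMP.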
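The DCQCN behavior described above can likewise be sketched from the sender's side. In DCQCN, switches ECN-mark packets as queues build, the receiver echoes marks back as Congestion Notification Packets (CNPs), and the sender cuts its rate multiplicatively on each CNP before recovering toward its previous rate. The sketch below uses assumed parameter values and is a simplified model, not a NIC implementation:

```python
# Simplified model of DCQCN sender-side rate reaction (assumed parameters).
class DcqcnSender:
    def __init__(self, line_rate_gbps=800.0, g=1/256):
        self.rate = line_rate_gbps      # current sending rate
        self.target = line_rate_gbps    # rate to recover toward
        self.alpha = 1.0                # running congestion estimate
        self.g = g                      # alpha update weight (assumed)

    def on_cnp(self):
        # ECN feedback arrived: remember the current rate as the
        # recovery target, raise alpha, and back off multiplicatively.
        self.target = self.rate
        self.alpha = (1 - self.g) * self.alpha + self.g
        self.rate = self.rate * (1 - self.alpha / 2)

    def on_recovery_timer(self):
        # No congestion this interval: decay alpha and move
        # halfway back toward the target rate.
        self.alpha = (1 - self.g) * self.alpha
        self.rate = (self.rate + self.target) / 2
```

PFC acts underneath this loop as a last-resort backstop, pausing a traffic class entirely so no packet is dropped while DCQCN brings the rate down.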

The Edgecore/Accton end-to-end AI networking solution (along with advanced demonstrations of the industry’s most energy-efficient open-loop and immersion cooling AI solutions) will be shown at the OCP Global Summit, San Jose Convention Center (Booth # A4), on October 15-17.

About Edgecore Networks

Edgecore Networks Corporation is a wholly owned subsidiary of Accton Technology Corporation, the leading network ODM. Edgecore Networks delivers wired and wireless networking products and solutions through channel partners and system integrators worldwide to AI/ML, cloud data center, service provider, enterprise, and SMB customers. Edgecore Networks is the leader in open networking, providing a full line of open Wi-Fi access points, packet transponders, virtual PON OLTs, cell site gateways, aggregation routers, and 1G, 10G, 25G, 40G, 100G, 400G, and 800G open networking switches that offer a choice of commercial and open-source NOS and SDN software. For more information, visit www.edge-core.com.

