Nexthop AI
Category: AI Infrastructure
Developer of high-performance Ethernet switches and networking infrastructure purpose-built for AI data center traffic patterns, delivering optimized hardware, software, and optical solutions for hyperscale cloud operators. Nexthop AI was founded in 2024 and is led by Anshul Sadana. The company is based in Santa Clara, United States, with a team of 250-300+. Total funding raised: $610.0M; latest round: Series B ($500.0M, Mar 2026). Key investors include Lightspeed Venture Partners, Andreessen Horowitz, Altimeter Capital, and Kleiner Perkins.
- Founded: 2024
- Headquarters: Santa Clara, United States
- Team size: 250-300+
- Total funding: $610.0M
Value proposition
Delivers 15-20% better power efficiency than competing switches, reduces total cost of ownership by roughly 30% through a disaggregated spine architecture, compresses product development cycles by 6-12 months through co-development partnerships, and enables seamless migration without disruptive changes to rack or fiber plants.
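As a back-of-envelope illustration of how the efficiency and TCO percentages above compound at hyperscale, the sketch below applies them to an assumed facility power and cost budget. All input figures are illustrative assumptions, not Nexthop data.

```python
# Illustrative arithmetic only: the 50 MW fabric power draw and
# $1B TCO baseline are assumptions, not published Nexthop figures.

def savings(baseline: float, reduction: float) -> float:
    """Absolute savings for a given fractional reduction."""
    return baseline * reduction

# Power: a 15-20% efficiency gain on an assumed 50 MW switching fabric.
for gain in (0.15, 0.20):
    print(f"{gain:.0%} power gain -> {savings(50.0, gain):.1f} MW saved")

# Cost: ~30% TCO reduction on an assumed $1,000M network budget.
print(f"30% TCO cut -> ${savings(1000.0, 0.30):.0f}M saved")
```

Even at these assumed baselines, the percentages translate to single-digit megawatts and hundreds of millions of dollars, which is the scale the profile's "tens of megawatts" claim implies across multiple campuses.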
Products and solutions
- NH-4010 Switch: 51.2 Tbps, industry's lowest power, based on Broadcom Tomahawk 5
- NH-4220 Switch: 102.4 Tbps, highest-density 1.6T air-cooled system, based on Tomahawk 6
- NH-5010 Switch: 25.6 Tbps, deep-buffer scale-across spine, based on Qumran-3D
- Nexthop NOS: SONiC-based network operating system with cybersecurity optimizations
- Linear LPOs and LROs: optical modules reducing DSP requirements
- Disaggregated Spine Architecture: innovative scale-across network design
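The headline Tbps figures map directly to front-panel port counts at common Ethernet speeds. The sketch below assumes each ASIC's full bandwidth is exposed as same-speed ports (the typical configuration; actual SKUs may split ports differently).

```python
# Port-radix arithmetic for the switch ASICs named above.
# Assumes all ASIC bandwidth is exposed as same-speed Ethernet ports.

def port_count(asic_tbps: float, port_gbps: int) -> int:
    """Number of line-rate ports a given ASIC bandwidth supports."""
    return round(asic_tbps * 1000 / port_gbps)

print(port_count(51.2, 800))    # Tomahawk 5 class: 64 x 800GbE
print(port_count(102.4, 1600))  # Tomahawk 6 class: 64 x 1.6TbE
print(port_count(25.6, 400))    # Qumran-3D class: 64 x 400GbE
```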
Unique value
First networking company to design switches natively for AI traffic patterns rather than adapting enterprise and cloud products. Pioneered a disaggregated spine architecture that decomposes the traditional monolithic chassis into independent functional tiers. The only vendor combining hardware engineered around open-source software from inception with a proven 15-20% power-efficiency advantage and 30% cost reduction versus legacy architectures.
Target customer
Hyperscale cloud operators (AWS, Microsoft Azure, Google Cloud, Meta) and NeoCloud providers running large-scale AI training and inference clusters
Industries served
- AI Infrastructure
- Cloud Computing
- Data Center Networking
- Hyperscale Computing
- GPU Clusters
Technology advantage
Co-designed hardware and software stack optimized specifically for AI workloads, with RDMA over Converged Ethernet (RoCEv2) and DCQCN congestion management. Purpose-built for open-source network operating systems (SONiC, FBOSS), enabling hyperscaler customization. A Joint Development Manufacturer (JDM) model acts as an extension of customer engineering teams, compressing development cycles by 6-12 months. Industry-leading power efficiency delivers tens of megawatts of savings at scale.
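The DCQCN congestion management mentioned above works by having ECN-marked RoCEv2 traffic trigger Congestion Notification Packets (CNPs) back to the sender, which cuts its rate multiplicatively and then recovers. A minimal sketch of the sender-side logic, with simplified parameters (the `g` gain and timer handling are illustrative, not Nexthop's tuning):

```python
# Minimal sketch of DCQCN sender-side rate control.
# Simplified: real DCQCN adds byte counters and hyper-increase phases.

class DcqcnSender:
    def __init__(self, line_rate_gbps: float, g: float = 1 / 16):
        self.rate = line_rate_gbps    # current sending rate (Gbps)
        self.target = line_rate_gbps  # rate held before the last cut
        self.alpha = 1.0              # congestion-level estimate
        self.g = g                    # estimator gain

    def on_cnp(self) -> None:
        """A CNP arrived: raise the congestion estimate, remember
        the current rate, and cut multiplicatively."""
        self.alpha = (1 - self.g) * self.alpha + self.g
        self.target = self.rate
        self.rate *= 1 - self.alpha / 2

    def on_quiet_period(self) -> None:
        """No CNP for a full timer period: decay the congestion
        estimate and recover halfway back toward the target rate."""
        self.alpha *= 1 - self.g
        self.rate = (self.rate + self.target) / 2

s = DcqcnSender(400.0)
s.on_cnp()           # first congestion signal: rate 400 -> 200 Gbps
s.on_quiet_period()  # quiet period: recover halfway, 200 -> 300 Gbps
print(round(s.rate, 1))  # -> 300.0
```

The multiplicative cut plus halfway recovery is what lets RoCEv2 flows back off quickly under incast (common in AI collective traffic) without long convergence times.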
How they differentiate
AI-native networking solutions with Joint Development Manufacturer (JDM) model, purpose-built for AI workloads rather than adapting legacy products. Delivers 15-20% superior power efficiency through disaggregated spine architecture and open-source SONiC integration. Co-development approach acts as extension of hyperscaler engineering teams, compressing product development cycles by 6-12 months.
Main competitors
- Arista Networks
- Cisco Systems
- Nvidia
Key partnerships
- Broadcom: strategic chip-supply partnership for Tomahawk 5/6 and Qumran-3D silicon
- Lightspeed Venture Partners: Series A and B lead investor
- Andreessen Horowitz: Series B major investor
- Altimeter Capital: Series B investor
- Kleiner Perkins: Series A investor
- Large hyperscaler customers: undisclosed, co-developing the disaggregated spine architecture
- Microsoft Azure Networking team: SONiC ecosystem collaboration
Notable customers
- Microsoft Azure (SONiC ecosystem partnership)
- Leading hyperscalers
- NeoCloud providers
Major milestones
- Founded in 2024 by former Arista Networks COO Anshul Sadana
- Launched from stealth with $110M Series A funding in March 2025
- Raised an oversubscribed $500M Series B at a $4.2B valuation in March 2026
- Launched foundational product portfolio: NH-4010, NH-4220, and NH-5010 switches
- Pioneered disaggregated spine architecture for AI data centers
- Built a team of 300+ employees within 18 months
Growth metrics
Grew from founding to 300+ employees in 18 months; estimated $17.2M revenue in 2025; achieved $4.2B valuation within 2 years of founding
Market positioning
Specialized AI networking startup targeting $35B+ AI-optimized networking market, positioned as challenger to established networking vendors by offering custom, highly efficient solutions for hyperscale AI data centers. First networking company to design switches natively for AI traffic patterns.
Geographic focus
North America (Santa Clara, CA headquarters), serving global hyperscalers and neocloud operators with primary focus on US-based cloud providers
Patents and IP
No registered patents disclosed for Nexthop AI as a company. However, key team members including CEO Anshul Sadana and VP Product Management Arthi Ayyangar hold multiple networking technology patents from their previous roles at Arista Networks and other companies (including leaf-spine architecture, Ethernet latency, MPLS, GMPLS, SRv6 innovations).
About Anshul Sadana
Seasoned networking industry executive with nearly 17 years at Arista Networks, most recently as COO and Chief Customer Officer, where he led product roadmap, hardware design, supply chain, global sales, and customer engineering and support. Previously spent 8 years at Cisco as a Senior Engineering Manager leading high-speed switch development. Holds multiple patents, including in leaf-spine architecture. MBA from the Wharton School of Business; MS in Computer Science from the University of Illinois at Urbana-Champaign. Founded Nexthop AI in 2024 to address AI infrastructure networking needs for hyperscalers.
Official website: https://nexthop.ai