Grove Ventures · Deep Tech Ecosystem

Data Center Infrastructure:
Israeli Startup Ecosystem Map

Over 70 Israeli startups building every layer of the AI data center stack. An analysis of funding, growth, and the opportunities ahead.

Overview

Why Infrastructure Is the New Frontier

AI needs a home - and Israeli entrepreneurs are building it. Grove Ventures is launching the Israeli Data Center Infrastructure Startup Map, revealing dozens of companies solving tomorrow's critical bottlenecks and redefining how the world will store, move, and process data in the age of AI.

AI doesn't exist in the cloud - it runs on physical infrastructure: racks of GPUs, kilometers of fiber, tanks of cooling fluid, and megawatts of electric power. The AI boom of the 2020s has exposed a hard constraint: the world is running out of the infrastructure needed to train the next generation of models.

Israel has responded with one of the densest concentrations of AI infrastructure startups globally. This map documents over 70 Israeli companies across six sectors, including approximately 15 in stealth. Together, these privately held companies have raised over $4.2B.

🏗️ 70+ companies mapped, across 6 infrastructure layers
💰 $4.2B funding raised (privately held companies only)
📅 65% founded since 2020, reflecting the surge in AI infrastructure startups
6 sectors: Energy · Cooling · Compute · Storage · Connectivity · Orchestration

Ecosystem Map

Six Sectors. One Mission.

Over 70 Israeli companies spanning every physical and software layer of the AI data center stack - each tackling a distinct infrastructure bottleneck.

Israeli Data Center Infrastructure Ecosystem Map

Map created in collaboration with Startup Nation Central. Check out the interactive map on Finder ↗

Growth Trajectory

New Companies Founded by Period

The acceleration in company formation mirrors the AI infrastructure boom - with 46 companies founded since 2020 alone.

2010-2014: 10 companies
2015-2019: 15 companies
2020-2025: 46 companies

Capital Flows

Funding by Sector

Storage & Memory and Connectivity dominate capital formation, driven by the extraordinary scale-up of AI training workloads demanding terabytes of data per second.
* Energy funding reflects private companies only, excluding exited companies (SolarEdge, Nostromo, GenCell)

Storage & Memory: $2.34B
Connectivity: $1.34B
Compute: $808M
Orchestration: $51M
Cooling: $50M
Energy *: $23M

Sector Breakdown

Companies by Sector

Distribution of over 70 companies across the six infrastructure sectors, including companies in stealth.

Energy: 15
Compute: 15
Connectivity: 13
Storage & Memory: 12
Cooling: 9
Orchestration & Monitoring: 7

Includes companies in stealth mode. Some companies may span multiple sectors and are assigned to their primary activity area.

For Founders

Opportunities for Founders

Each layer of the AI infrastructure stack presents distinct, unsolved challenges - and the chance to build category-defining companies.

Compute
Beyond General Purpose: The Era of Workload-Specific & Memory-Centric Silicon
As the industry hits the "Scaling Wall," the next generation of founders must move beyond one-size-fits-all processing toward architectures where the lines between CPU, GPU, and memory blur. We see a massive opening for founders rethinking the traditional hierarchy—specifically through In-Memory Computing and Processing-in-Memory (PIM), which eliminate the energy-heavy "data tax" of moving information between storage and logic. There is an urgent need for specialized accelerators that move past raw TFLOPS to prioritize Performance-per-Watt, particularly for non-linear transformer architectures and sparse data workloads. Beyond the hardware itself, a critical opportunity lies in the Algorithmic-Software abstraction layer: startups that can automate the optimization of model-to-silicon mapping will be the ones to unlock the true potential of heterogeneous compute environments. The future belongs to those who can orchestrate a fluid shift of roles between processing centers, ensuring that every compute cycle is utilized at peak efficiency.
Energy
From Passive Consumers to Integrated Smart Grid Micro-Hubs
The data center of 2025 is no longer just a building with a plug; it has become a sophisticated microgrid. With power availability now the primary bottleneck delaying AI expansion, a generational opportunity exists for founders to transform how electricity is harvested, converted, and distributed. We see a massive opening for fundamental power architecture shifts within the facility—moving away from legacy systems toward Solid-State Transformers (SST), higher voltage distribution (800V to 1500V), and intelligent DC Bus management. By integrating in-rack energy storage and localized generation—ranging from long-duration industrial batteries to Small Modular Reactor (SMR) integration—startups can effectively "create" power capacity where none existed. The "Why Now" is clear: as rack densities skyrocket, the traditional grid can no longer keep up. Founders who can minimize conversion losses through advanced power electronics and AI-driven grid balancing will find an insatiable global market.
Cooling
The Great Liquid Transition and the Circular Thermal Economy
As rack densities soar from 15kW to over 100kW, air cooling has officially hit its physical limit, creating a vacuum for founders to develop entirely new thermal paradigms. We see a massive opening for two-phase immersion cooling and direct-to-chip microfluidics that can handle the extreme heat flux of next-gen GPUs. However, the true frontier for founders lies in Advanced Thermal Storage and High-Efficiency Heat Pump systems that allow data centers to decouple cooling demand from peak energy costs. Beyond just dissipating heat, there is a groundbreaking opportunity in Direct Heat-to-Power technologies, turning wasted thermal energy back into usable electricity. Combined with "Thermal Digital Twins" (predictive AI systems that orchestrate cooling in real time based on fluctuating workloads), founders can transform heat management from a massive cost center into a strategic uptime advantage and a source of sustainable energy recovery.
Connectivity
Solving the Traffic Explosion through Optical Reinvention and Optimization
In the age of distributed AI training, the bottleneck has shifted from individual servers to the fabric that binds them. As traditional copper and Ethernet reach their limits in latency and signal integrity, a massive opening exists for founders to reinvent the interconnect. We see a generational opportunity in Advanced Silicon Photonics and Co-Packaged Optics (CPO), bringing "Optical-to-the-Chip" connectivity to reality. Beyond the physical layer, there is an urgent need for Advanced Cabling solutions and New Networking Topologies that can scale to hundreds of thousands of GPUs without congestion. On the software side, the frontier lies in Network Workload and Switching Optimization, creating intelligent fabrics that dynamically route traffic to prevent "Hot Spots" during massive collective communication phases. The goal for new startups is to make the entire data center behave like one giant, seamless supercomputer where the network is no longer a constraint, but a catalyst for scale.
Storage & Memory
Breaking the Data Silos with High-Velocity & Multi-Tiered Architectures
AI models are only as fast as the data that feeds them, and current storage architectures were never designed for the "feeding frenzy" of massive LLMs. Founders have a clear path to disrupt this segment by building disaggregated, high-bandwidth memory (HBM) solutions and AI-native storage fabrics that eliminate data stalls and ensure expensive GPUs never spend a single millisecond waiting for a packet. However, as the volume of training and inference data grows exponentially, the frontier for innovation has expanded to Warm and Cold Storage. We see a massive opening for founders developing next-generation archival solutions (leveraging DNA storage, advanced optical discs, or high-density NVM) that can dramatically slash the TCO (Total Cost of Ownership) and physical footprint of long-term data retention. The "Why Now" is simple: the cost and availability of data storage have become limiting factors for AI's long-term sustainability. The future belongs to startups that can move data closer to the compute at lightning speeds while simultaneously solving the crisis of storing the world's rapidly expanding digital brain.
Management & Orchestration (Software)
The Autonomous Data Center Operating System
The complexity of modern AI infrastructure has officially outpaced human management capabilities. This segment is ripe for founders building the "Autonomous Data Center OS": a software layer that uses AI to manage the very infrastructure it runs on. We see a massive opening for platforms that provide Holistic Power-Cooling-Compute Orchestration, breaking down the silos between these three critical pillars. Beyond traditional monitoring, there is an urgent need for Real-Time and Micro-Real-Time Power Orchestration that can dynamically shift loads to prevent grid instability or thermal spikes. Founders should focus on building self-healing ecosystems that include Predictive Maintenance (spotting a failing chip or a cooling leak before it halts a $10M training run) and Carbon-Aware Scheduling that moves workloads based on the availability of green energy. The future belongs to startups that can transform a chaotic collection of hardware into a unified, self-optimizing engine where every watt of energy and every compute cycle is accounted for in real time.
Data & Methodology

About This Report

Methodology

To identify the companies included in the map, relevant tags were defined across the data center infrastructure landscape. Based on these tags, technology companies founded in 2010 or later were extracted from Startup Nation Central's Finder database. Additional companies operating in the space were then added manually based on publicly available sources. The final list was grouped into categories, with each company assigned to the primary activity area most relevant to its business.

The data presented in the analysis, including founding year, total funding, and other company information, is based on Harmonic.ai and publicly available sources and may therefore be partial or incomplete. In addition, funding information for the stealth companies included in the analysis may not be available, meaning actual aggregate funding across the landscape is likely higher than reported.