Electricity is a key raw material for artificial intelligence, but new processors are outpacing data center operators’ ability to manage their relationship with the power grid, in some cases forcing them to throttle performance by up to 30%.
“There’s so much power wasted in these AI factories,” Nvidia CEO Jensen Huang said in a keynote address at the company’s annual GTC conference. “Every unused watt is lost revenue,” the company said in the same presentation.
Now, startup Niv-AI has emerged from stealth with $12 million in seed funding to tackle the problem, developing new sensors and tools to accurately measure and more efficiently manage GPU power consumption.
The Tel Aviv-based startup was founded last year by CEO Tomer Timor and CTO Edward Kizis and is backed by Glilot Capital, Grove Ventures, Arc VC, Encoded VC, Leap Forward, and Aurora Capital Partners. The company declined to disclose its valuation.
In frontier AI labs, thousands of GPUs work together to train and run advanced models, producing frequent millisecond-scale spikes in power demand as each processor switches between computation and communication with other GPUs.
These surges make it difficult for data centers to manage the power they receive from the grid. To avoid running short of power, data centers either pay for temporary energy storage to cover the spikes or throttle GPU usage. In both cases, the return on investment in expensive chips shrinks.
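To see why millisecond-scale spikes are so costly, consider a toy power trace. The numbers below are purely illustrative (not Niv-AI's measurements): because a facility must provision for the peak draw, the gap between peak and average power is capacity paid for but mostly unused.

```python
# Illustrative sketch with hypothetical numbers: why millisecond-scale GPU
# power spikes force data centers to provision for the peak, not the average.

def provisioning_gap(trace_watts):
    """Return (average, peak, stranded fraction) for a power trace."""
    avg = sum(trace_watts) / len(trace_watts)
    peak = max(trace_watts)
    stranded = 1 - avg / peak  # capacity paid for but unused on average
    return avg, peak, stranded

# Synthetic 1 ms samples: compute phases (~700 W) alternating with
# communication phases (~300 W) while the GPU waits on other GPUs.
trace = ([700] * 8 + [300] * 2) * 100

avg, peak, stranded = provisioning_gap(trace)
print(f"average {avg:.0f} W, peak {peak} W, {stranded:.0%} of peak capacity stranded")
```

Even this mild synthetic spike pattern strands over a tenth of provisioned capacity; sharper real-world swings widen the gap further.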
“We can’t keep building data centers the way we are now,” said Lior Handelsman, a partner at Grove Ventures and a member of Niv-AI’s board of directors.
The first step in Niv-AI’s roadmap is understanding what’s going on. The company is currently working with design partners to deploy rack-level sensors that capture power usage at millisecond resolution. The goal is to map the specific power profiles of different deep learning tasks and develop mitigation techniques that allow data centers to better utilize existing capacity.
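A minimal sketch of the kind of profiling described above: bucket millisecond power samples into compute-like and communication-like phases and summarize each phase. The function name, threshold, and trace values are all hypothetical, not Niv-AI's design.

```python
# Hypothetical sketch: classify 1 ms power samples into high-draw "compute"
# and low-draw "comms" phases, then summarize each phase's power profile.
from statistics import mean

def profile_phases(samples_watts, threshold=500):
    """Label each sample by phase and report mean power and time share."""
    phases = {"compute": [], "comms": []}
    for w in samples_watts:
        phases["compute" if w >= threshold else "comms"].append(w)
    total = len(samples_watts)
    return {name: {"mean_w": mean(vals), "share": len(vals) / total}
            for name, vals in phases.items() if vals}

trace = [700, 710, 690, 300, 310, 700, 705, 295, 700, 700]
print(profile_phases(trace))
```

In practice a real profiler would correlate these phases with the running workload (which model, which training step), but even this crude split shows how much of a task's wall-clock time is spent well below peak draw.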
From there, the team plans to train AI models on the collected data to predict and synchronize power load across the data center: in other words, a “co-pilot” for data center engineers.
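One way such synchronization could pay off, sketched here with toy numbers (this is an illustration of the general idea, not Niv-AI's algorithm): if per-rack power phases are predictable, staggering when racks hit their compute peaks flattens the aggregate load the grid sees.

```python
# Illustrative sketch: staggering predictable per-rack power phases lowers
# the aggregate peak without reducing total work done.

def aggregate(traces):
    """Element-wise sum of equal-length per-rack power traces."""
    return [sum(ws) for ws in zip(*traces)]

def stagger(trace, offset):
    """Rotate a periodic trace by `offset` samples."""
    return trace[offset:] + trace[:offset]

# Periodic 10 ms pattern per rack: 8 ms at 700 W, 2 ms at 300 W.
rack = ([700] * 8 + [300] * 2) * 3

synced = aggregate([rack] * 5)                        # all racks peak together
staggered = aggregate([stagger(rack, i * 2) for i in range(5)])

print(max(synced), max(staggered))  # staggering lowers the aggregate peak
```

With five racks offset so their low-power phases interleave, exactly one rack is in its communication phase at any instant, so the aggregate draw is perfectly flat; real workloads are messier, which is where the predictive model would come in.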
Niv-AI expects to have operational systems in a small number of US data centers within the next six to eight months. The pitch is timely: hyperscalers looking to build new data centers face land-use and supply chain constraints. The founders see the final product as the missing “intelligence layer” between the data center and the power grid.
“The power grid is actually afraid of data centers consuming too much power at certain times,” Timor told TechCrunch. “The problems we’re looking at sit on both sides of the rope. One is helping data centers utilize more GPUs, and hopefully get more out of the power they’re already paying for. On the other hand, we can also create a more reliable power profile between the data center and the grid.”
