Microsoft’s Majorana 1 quantum computing chip. Source: Microsoft
After decades confined largely to research labs, quantum computing may be closer to its breakout moment than many on Wall Street expect.
The technology, which uses the principles of quantum mechanics to solve problems beyond the ability of the most powerful classical supercomputers, has long been described as futuristic. But rapid advancements have intensified investment in the sector and sparked discussions about how these powerful computers will integrate with industries like the already booming data center sector.
“By the end of the decade, we are confident that we will have machines in data centers that have commercial value,” Zulfi Alam, Microsoft’s corporate vice president of Quantum, told CNBC.
“I would not be able to say this with this much clarity last year, but this year, I can state to claim that by 2029 you will have machines that will have commercial [value], meaning that they will be doing calculations that classical machines cannot do,” said Alam, who’s leading the development of the company’s scalable quantum machine.
Classical computers use switches, or bits, that either pass or block an electric current at any given moment to perform calculations. The larger the number of bits, the greater the computing power. Quantum computers, on the other hand, exploit the ability of certain materials at extremely low temperatures to exist in a combination of ‘on’ and ‘off’ states at the same time. This allows quantum bits, or qubits, to explore vast numbers of possibilities simultaneously and perform certain calculations at vastly greater speeds.
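The bit-versus-qubit distinction can be made concrete in a few lines of code. The sketch below (a minimal NumPy statevector illustration; the three-qubit size and uniform superposition are illustrative choices, not a description of any vendor’s hardware) shows that a classical 3-bit register holds exactly one of its 2³ = 8 values at a time, while an idealized 3-qubit register is described by amplitudes over all 8 values at once:

```python
import numpy as np

# A classical 3-bit register holds exactly one of 2**3 = 8 values at a time.
classical_state = 0b101  # one concrete value

# An idealized 3-qubit register is described by a statevector of 2**3 = 8
# complex amplitudes. A uniform superposition puts equal weight on every
# basis state simultaneously.
n_qubits = 3
dim = 2 ** n_qubits
statevector = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

# Measurement probabilities: each of the 8 basis states is equally likely.
probabilities = np.abs(statevector) ** 2
print(probabilities)  # eight entries, each ~0.125, summing to 1
```

This is why qubit counts scale differently from bit counts: each added qubit doubles the number of amplitudes the register spans, whereas each added bit only doubles the number of values it could hold one at a time.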
Microsoft, which last year revealed a new quantum computing chip called Majorana 1, is among the hyperscalers — companies that provide computing capacity that can rapidly expand as demand rises — like Google and Amazon that are investing heavily in the technology.
Patrick Moorhead, CEO and chief analyst at Moor Insights & Strategy, said he is also seeing hyperscalers and platform vendors ramp up investment through cloud access, pricing controls and developer platforms, while the defense sector is investing early in both quantum computing and networking.
Governments are stepping up their investments too, with China leading at just under $18 billion in public investment in quantum technology, followed closely by the EU, according to the European Centre for International Political Economy (ECIPE), a think tank.
Most industry roadmaps now place the implementation of these systems in the 2028–2032 timeframe, according to Ellie Brown, quantum computing and cloud economics analyst at S&P Global Market Intelligence.
UBS sees the advantages of quantum computing arriving by the early 2030s, even as companies’ roadmaps target earlier dates, said Madeleine Jenkins, analyst at UBS. “A lot of companies are telling me that 2027 is going to be a big year for quantum in terms of roadmap, in terms of what’s achieved,” she said.
Taken together, these timelines signal a sector moving steadily toward real‑world deployment, while raising important questions about how today’s data infrastructure will need to evolve to support it.
Changing the energy demand
In a 103-page report published in January, UBS analysts led by Jenkins said the industry is close to completing a quantum computer that could cost tens of millions of dollars to build but could solve in 200 seconds a problem that would take a conventional supercomputer 10,000 years.
When it comes to the impact on the data center ecosystem, experts told CNBC that quantum could potentially lower the energy needs of the power-hungry facilities while also reducing the workloads needed for training AI.
In terms of energy, quantum computing would require a “fraction of what a data center would use,” Jenkins said.
“The big thing is time; if you’re taking the same problem that would take thousands and thousands of… hours, and you’re replacing that with a quantum computer that takes seconds or minutes, then obviously you just need a lot less energy,” she said.
Microsoft’s Alam also noted the lower power requirements of quantum computers, highlighting that Majorana 1 is “showcasing more power than the entire computation of the entire planet [in] the palm of your hands and it’s not running super-hot. It’s running cold.”
While quantum technology is advancing rapidly, it is unlikely in the near term to displace the classical computing that data centers currently host.
“Ideally, the entire efficiency of a problem-solving workload will go down, but it’s not going to be a complete substitution,” S&P’s Brown said.
Microsoft’s Alam stressed that quantum systems will not operate in isolation. “A quantum machine is not a standalone entity. It’s a hybrid tool. It’s a quantum accelerator that needs a high-performance computer very close to it,” he said.
Moor Insights & Strategy’s Moorhead also noted that if quantum scales, it will likely play a complementary role, adding a new class of “special infrastructure” within data centers and shifting facility design toward “quantum pods,” which come with their own power and thermal needs.
“It will not displace the dominant energy driver near-term, which is AI data center expansion, but it will add pockets of specialized load and operational complexity,” he told CNBC via emailed comments.
Ultimately, it’s likely to change the shape of demand, but not the scale, with the AI boom remaining a key driver of demand for the facilities.
Roadblocks ahead
Building that kind of system inside real data‑center environments won’t be straightforward and could require entirely new purpose-built facilities.
Only a handful of specialized quantum computers are currently deployed in data centers, with quantum vendors brainstorming a set of industry standards to help streamline broader adoption, according to Brown and S&P analyst Kelly Morgan.
There is still a significant amount of bespoke work that needs to be done in order to integrate quantum systems into data centers, Brown said, adding that, “we’re lacking some quantum talent to make use of that and get that installed effectively.”
But in the long term, she anticipates “a nice interplay between quantum and some of the other data center areas including AI” where the two could work together to solve problems.
Tim Adams, president and CEO of the Institute of International Finance, said that these hurdles reinforce the need for continued investment in data‑center infrastructure over the next decade.
“Data centers are necessary to move technological transformation forward and should be thought of as one of a number of likely investments on the road to very transformational achievements we are sure to see in the timeframe of the next ten years,” Adams told CNBC.
And this phase has already begun, with Brown pointing to a burst of M&A activity aimed at building the capabilities needed for quantum’s commercial phase.
“M&A has been massive over the past three months,” Brown said, noting several acquisition announcements from quantum firm IonQ. “There’s been a lot of positioning within the space not only to help improve quantum talent and technology, but also to help control that supply chain a little bit.”
Alongside the opportunities quantum computing offers, it also carries risks, and data security is arguably the biggest.
According to UBS, a powerful enough quantum computer could break current encryption methods, meaning that security systems would no longer be reliable. The report by the Swiss bank warns that companies will have to implement new quantum-safe encryption techniques, and that investment in these will have to start in the next few years.
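The encryption risk UBS describes rests on a well-known fact: widely used public-key schemes such as RSA are secure only because factoring the public modulus is infeasible for classical machines, and a sufficiently large quantum computer running Shor’s algorithm could factor it efficiently. A toy sketch (textbook-sized primes, nothing like real 2048-bit keys) shows why knowing the factors hands over the private key:

```python
# Toy RSA with tiny primes, illustrating why factoring breaks the scheme.
p, q = 61, 53
n = p * q                 # public modulus (real keys use ~2048-bit n)
e = 17                    # public exponent
phi = (p - 1) * (q - 1)   # computable only if p and q are known
d = pow(e, -1, phi)       # private key falls out immediately from the factors

message = 42
ciphertext = pow(message, e, n)   # anyone can encrypt with (n, e)
recovered = pow(ciphertext, d, n) # only the key holder can decrypt...
print(recovered)  # 42 ...unless an attacker factors n and rederives d
```

Because a quantum attacker who factors n can rederive d exactly as above, the defense is not bigger keys but different mathematics, which is what the “quantum-safe” (post-quantum) encryption schemes the report refers to provide.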
Even with this surge of investment, Microsoft’s Alam warned that the path ahead won’t be easy. It’s going to take a lot of “blood, sweat and tears,” he said, anticipating numerous challenges as quantum machines come online — from meeting performance benchmarks to solving complex technical problems — all of which need to “converge at the right time” for the real magic to happen.

