
June 22, 2018 | By Nick Flaherty
Why cold computing matters for the next generation of computing
Running computers at 77K and even 4K gives huge advantages in power efficiency and access to quantum computers. Nick Flaherty talks to Craig Hampel, Chief Scientist at Rambus, about the cold computing research projects the company is working on that will be commercialised in the next three to five years.

“So 77K ends up being fairly conventional, but there’s a discontinuity at 4K,” he points out. “There are things we know don’t work down there, such as a PLL, where there’s not enough gain in the feedback loop, and copper doesn’t superconduct, so it’s niobium and tantalum that are more attractive.”

He believes this is one of the few viable options for future computation. “Most quantum machines need a conventional error correction processor near them, and that’s another aspect that’s driving this architecture,” he said. “The more likely transition is 77K and 4K machines deployed next to the quantum machines.”

The value for Rambus is in providing the buffer technology for such systems, as chips or as IP, in the next few years.

“Today we have a significant offering in memory buffers – these get more challenging and valuable when they also translate between temperature domains so primarily we would sell buffers that intermediate between superconducting domains and 77K – that’s the most natural approach.”

“We think that 77K DRAM subsystems will be thermally attractive and possible in three years or less,” said Hampel. “We are building prototypes today, and they could be in use in three to five years if things go well. The superconducting processor still needs a lot of engineering, but if things go well it’s on the same timescale.”

