James Cuff on Networks, Cloud, and the Problem that Drew Him Out of Retirement
By Allison Proffitt
April 26, 2022 | Before retiring nearly five years ago, James Cuff had held positions in high performance computing at the Wellcome Trust Sanger Institute, the Broad Institute, Harvard University, Cycle Computing, and more. And in his retirement, he watched the high performance computing space closely, tracking the maturation of technologies like artificial intelligence and the rise of trends like edge computing.
Over the span of roughly 25 years, Cuff told Stan Gloss in the latest episode of Bio-IT World’s Trends from the Trenches podcast, the space has seen some updates, but a lot has fundamentally stayed the same.
“If I look back and think of how computing has evolved, we absolutely have more of it than we ever have. We are fortunate that computing now threads its way through all of our life on a daily basis, and yet we are still in a situation where we are still bounded by computational complexity: be it either access to storage, [or] access to more sophisticated computing resources,” Cuff said. “If I was to summarize the career arc, computing hasn’t actually gotten easier—albeit portions of it are a little more straightforward than they used to be. For instance, we don’t use janky Perl scripts to orchestrate our compute anymore.”
Cuff maintains that computing should follow the basic scientific principle: if you can’t measure it, you can’t fix it. While complex compute environments include a great deal of heterogeneity in the workloads—and will always need to be “good enough” instead of perfect—he challenges technologists to benchmark their applications. “Do the sums,” he urges. Work granularly: figure out whether your algorithms are running optimally and whether your machines are operating at capacity, and factor in energy, operations, and cooling costs. “This is important when you are acquiring compute either by the hour or making capital purchases,” he emphasizes.
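Cuff doesn’t spell out the arithmetic, but a minimal back-of-the-envelope sketch of “doing the sums” might compare the effective cost per core-hour of a capital purchase (amortized hardware plus power and cooling) against an hourly cloud rate. Every figure below is an illustrative assumption, not a real price or a recommendation.

```python
def on_prem_cost_per_core_hour(
    capital_cost,         # purchase price of the machine ($) -- assumed
    lifetime_years,       # amortization period -- assumed
    cores,                # core count
    utilization,          # fraction of time actually busy (0-1)
    power_kw,             # average draw including cooling (kW) -- assumed
    energy_cost_per_kwh,  # electricity price ($/kWh) -- assumed
):
    hours = lifetime_years * 365 * 24
    hardware = capital_cost / hours          # amortized hardware $/hour
    energy = power_kw * energy_cost_per_kwh  # power + cooling $/hour
    # Only utilized core-hours do useful work, so divide by utilization.
    return (hardware + energy) / (cores * utilization)

# Hypothetical 64-core server: $20,000, 5-year life, 60% utilized,
# 0.8 kW including cooling, $0.15/kWh.
on_prem = on_prem_cost_per_core_hour(20_000, 5, 64, 0.60, 0.8, 0.15)

# Assumed hourly rate for a comparable rented 64-core instance.
cloud = 3.20 / 64

print(f"on-prem: ${on_prem:.4f}/core-hour vs. cloud: ${cloud:.4f}/core-hour")
```

The point of the exercise is Cuff’s: the answer flips depending on the inputs—raise utilization and on-premises hardware looks better; leave machines idle and hourly rental wins—which is why the sums have to be done granularly, per workload.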
He sees fluidity in infrastructure with growth in networking, edge, cloud, and storage solutions. Object stores have grown very sophisticated, eliminating the headache of storage, he says, and high-bandwidth networks within the data center have grown by “leaps and bounds.” Now technologists are considering not only the data closet down the hall, but global compute resources. Finally, he rejoiced that research computing scientists are coming into their own as fully fledged members of the scientific community.
Next Chapter: GigaIO
Just one month into his new role, Cuff says it’s the culmination of all the things he’s spent the last five years observing.
“I started to get this bug around composability and disaggregated compute, because every system I’ve ever built, or borrowed, or rented, has always been inherently the wrong shape. I’ve always struggled trying to make sure the right compute was available for the right researcher at the right time to solve the right problem.”
In GigaIO he’s found a team of hardware and software engineers tackling this problem. GigaIO effectively virtualizes hardware resources, Cuff says, matching any accelerator with any chunk of memory and any chunk of storage under any type of compute. This exploits high bandwidth and low latency, creating what the company calls “impossible servers” that can share storage and memory.
The company’s current product is exciting, but the product of tomorrow is what Cuff finds even more compelling. His role, Chief of Scientific Computing and Partnerships, is to advocate and evangelize for composable computing and for thinking about computer architecture in new ways.
“I’m psyched,” he says. “It scratches all my itches because it’s everything about the computing career I’ve had wrapped up with a little bow on it.”
Trends from the Trenches Podcast
Bio-IT World’s Trends from the Trenches podcast delivers your insider’s look at the science, technology, and executive trends driving the life sciences through conversations with industry leaders. BioTeam co-founder Stan Gloss brings years of industry experience in science, data, and technology to conversations exploring what is driving data and discovery, and what’s coming next.