The stringent constraints initially cast doubt on whether the extra memory could be of any use. Yet Buhrman and Cleve discovered, to their surprise, that carefully chosen tweaks to the bits of a full memory could indeed unlock extra computational power. The result caught the research community off guard, recalled Bruno Loff, then a graduate student in Buhrman’s group working with fellow student Florian Speelman. The team extended the findings to a broader range of problems and published their collective research in 2014.
This new paradigm was termed catalytic computing, a name borrowed from chemistry. Raghunath Tewari, a complexity theorist at the Indian Institute of Technology, Kanpur, likened it to a chemical reaction that proceeds only in the presence of a catalyst, a substance that itself emerges unchanged.
Despite continued development of catalytic computing by a small group of researchers, no one attempted to apply the techniques to the tree evaluation problem that had originally sparked Koucký’s investigation. For tree evaluation, the open question was whether a small amount of memory could serve storage and computation purposes at once; catalytic computing, by contrast, depends on having a large surplus of memory. Shrink that surplus, and the techniques break down.
Nevertheless, James Cook could not ignore the possibility of adapting catalytic techniques into a tree evaluation algorithm. As the son of Stephen Cook, the prominent complexity theorist who invented the tree evaluation problem, James had a personal connection to it, having tackled it himself during graduate school. By the time the foundational catalytic computing paper appeared in 2014, James had already left academia for a career in software engineering. Still, the idea of catalytic computing lingered in his mind.
“I had to understand it and see what could be done,” he remarked.
For several years, James Cook explored a catalytic approach to the tree evaluation problem in his free time, culminating in a presentation on his progress in 2019 at a symposium celebrating his father’s influential work in complexity theory. Following the event, Cook was approached by Ian Mertz, a graduate student who had embraced catalytic computing five years earlier as an undergraduate.
“It was like a baby bird imprinting scenario,” Mertz commented.
The collaboration between Cook and Mertz quickly bore fruit. In 2020, they devised an algorithm that solved the tree evaluation problem with less memory than the minimum conjectured by the elder Cook and McKenzie, albeit only slightly less. That was enough to win the $100 wager, half of which stayed in the Cook family.
Yet the task was not complete. Tree evaluation had originally been studied because it seemed poised to demonstrate a problem in P but not in L: a relatively easy problem that cannot be solved with very little memory. Although Cook and Mertz’s algorithm used less memory than any previous tree evaluation algorithm, it still used substantially more than any algorithm for a problem in L.
In 2023, Cook and Mertz released an improved algorithm that needed only slightly more memory than the maximum allowed for problems in L. That result has led many researchers to believe that tree evaluation belongs in L after all, and that a proof is only a matter of time. If so, complexity theorists will need a new strategy for tackling the P versus L problem.
Meanwhile, Cook and Mertz’s findings have intensified interest in catalytic computing, prompting new investigations into connections with randomness and the consequences of permitting minor errors when resetting full memory to its original state.
“We’ve not finished exploring what we can do with these new techniques,” McKenzie stated. “We can expect even more surprises.”