Jack of All Computational Trades
Dr. Jack Poulson received his Ph.D. from the Oden Institute in 2012. Since then he has worked in academia and the private sector, and he currently runs TechInquiry.Org, the nonprofit tech industry watchdog he founded in 2019. While he has proven versatile and adept at many things, it has not been at the expense of mastering his passion for computational science.
“I would never recommend any student follow the path I did.” This was the response from Jack Poulson, former Ph.D. student at the Oden Institute and speaker at the student-run Babuška Forum, when asked about being a role model for current graduate students at UT.
After completing his Ph.D. at UT Austin, Poulson joined the Stanford mathematics department as a postdoc. A stint as an assistant professor of computational science at Georgia Tech came next, followed by a return to Stanford to teach at the Institute for Computational and Mathematical Engineering.
Despite moving to and from some of the most prestigious institutions in the country to study his craft, Poulson was getting a little frustrated.
“I struggled with some of the constraints I found in computational science research in academia. Computational Science has lagged behind Computer Science in openness - both in terms of open access publication and open-source software creation being treated as a first-class career option. Mathematics is critical but the labor, and academic merit, of open-source software development was treated as secondary.”
However, this should be seen within the context of a culture where academia must be cognizant of meeting the demands of industry. If most tech-sector jobs are concerned with maintaining and improving existing hardware and software, and that infrastructure has not historically been designed using computational methods, one can see the quandary CSE finds itself in.
“There are some great Bayesian research groups in industry. But for the most part, tech companies want data scientists and software engineers,” he said.
From Academia to Industry
So Poulson decided it was time to try to effect change from within. In 2016 he took up a post at Google. Once again, he found himself frustrated. “In a big company, you may have big ideas, but unless you come in at a high level, you’re not likely to have much impact on decision making, especially infrastructure.”
Then a more public and controversial disagreement with his new employer - namely his opposition to the company’s policies regarding free speech on the internet in China - led Poulson to publicly resign. As his 2019 New York Times op-ed put it, he left Google as a “conscientious objector.”
He then went on to found TechInquiry.Org, a watchdog of sorts for the tech industry. Within a few short years, Poulson had worked in academia, the private sector, and his own nonprofit. That was not only a lot of change in less than a decade; it was also quite a transition from a well-paid job in industry to the less glamorous world of building a fledgling nonprofit. “It was a struggle at first,” he said. “There was a period of time where I was, I would say, about 50/50 between research and consulting. Actually it was more like 30% research, 50% research consulting and 60% nonprofit, which doesn’t make sense but that’s what it felt like. Now I'm almost 100% nonprofit. Which isn’t to say I don't have the interest in doing more research. But there's only so much time in the day.”
While some of his observations about CSE may seem quite dismal, Poulson is in fact optimistic about the future of the discipline.
“The computational science community needs to build infrastructure to have a seat at the table. This is frequently what drives managerial decision making in tech. The Julia Project is perhaps the closest to a very large-scale infrastructure project we have.”
There are plenty of opportunities for tech companies to treat CSE as world-class expertise. More importantly, says Poulson, there is ample opportunity for computational scientists to “close the loop” in their work.
“For example, if we’re solving inverse problems, can we do so in a table-top way with actual physics – be it acoustics, electromagnetism (E&M) etc. – rather than synthetic (often 2D) experiments?”
The Role of Physics
In a recent Nature Comment article written by Drs. Karen Willcox, Omar Ghattas and Patrick Heimbach of the Oden Institute, a case was made for physics as the foundation from which all machine learning should be developed. Known as ‘scientific machine learning’, the idea is that AI/machine learning should be tethered to the laws of physics so that autonomous decision-making processes remain firmly rooted in laws that are universally applicable. This struck a chord with Poulson.
“The software and hardware stacks at many major tech companies were not designed for solving large-scale physics problems. As a result, there is a tendency to replace high-fidelity physics models with probabilistic techniques. Beyond the challenge of gaining enough measurements to make such approaches meaningful in some domains, machine learning techniques are notoriously unstable and unreliable for inputs outside the distribution of the training set. The notion of certification of a prediction by an engineer based on rigorous mathematical modeling should not be forfeited but rather integrated alongside such techniques.”
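The idea Poulson and the Oden researchers describe can be sketched in a few lines of code: instead of fitting a model to data alone, one adds a penalty measuring how badly a candidate solution violates a governing physical law. The example below is a minimal illustration, not drawn from any specific library or from Poulson's work; the toy ODE, the finite-difference residual, and all function names are illustrative assumptions.

```python
# Minimal sketch of a physics-constrained objective: fit data for u(x),
# but penalize violations of the toy governing equation u''(x) = -u(x).
# Everything here (the ODE, names, weights) is illustrative, not from
# any particular scientific-machine-learning framework.
import numpy as np

def physics_residual(u, dx):
    """Finite-difference residual of u'' + u = 0 at interior grid points."""
    return (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2 + u[1:-1]

def total_loss(u, data, dx, weight=1.0):
    """Data misfit plus a penalty anchoring the candidate to the physics."""
    misfit = np.mean((u - data) ** 2)
    penalty = np.mean(physics_residual(u, dx) ** 2)
    return misfit + weight * penalty

# A grid, the exact solution sin(x) of u'' = -u, and noisy observations.
x = np.linspace(0.0, np.pi, 50)
dx = x[1] - x[0]
truth = np.sin(x)
noisy = truth + 0.05 * np.random.default_rng(0).normal(size=x.size)

# The physics term rewards the smooth, law-abiding solution: even though
# the noisy data matches itself perfectly, its jagged second derivative
# incurs a large penalty, while the true solution nearly zeroes it.
loss_truth = total_loss(truth, noisy, dx)
loss_noisy = total_loss(noisy, noisy, dx)
```

Here `loss_truth` comes out far smaller than `loss_noisy`: the physics penalty is what keeps a purely data-driven fit from chasing noise, which is one way of reading the "certification" point in the quote above.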
Jack Poulson is the guest speaker at the Babuška Forum seminar series on Friday, April 9, 2021.