Webinar Outcomes – “How to Engineer a Microbial Ecosystem – Fundamentals of Environmental Biotech”

SYNOPSIS

The new EBNet Webinar series is designed to replace our annual Research Colloquium with top-quality, specialist webinars on a range of EB topics. Leading the discussion this January, exploring the latest in “How to Engineer a Microbial Ecosystem – Fundamentals of Environmental Biotech”, was Professor Tom Curtis of Newcastle University. As a Co-PI of EBNet, his focus is the “Biosciences to Engineering” theme. His wealth of experience with the challenges of understanding and predicting microbial communities provided a fascinating insight into the most promising ways to proceed.

Thanks to support from numerous funders and a large, diverse team, Prof. Curtis’s research has gone a long way towards examining the fundamental properties of “real life” microbial communities (MC) at scale. These communities encompass physical and biological factors acting on all their members – with bacteria living amongst associated fungi, protozoa, etc. – to produce “emergent properties” (EP) which cannot be appreciated simply by listing the components of that community. Since the 19th century, “trial and error” has been the method used to investigate these. Whilst effective, this approach takes considerable time and money for each study, which is a major limitation to progress. An example is anammox technology, which took 20 years to move from proof of principle to application. Many valid ideas may not have benefited from such sustained effort.

There are certain “big rules” which govern all MCs. One such is immigration rate theory, where an understanding of the underlying maths, applied at scale, has major implications. For example, the odds of a “death” in an established MC being replaced by a “birth” rather than an “immigrant” are of the order of 2 × 10¹⁴ to one. Essentially, at any sort of scale, start-up populations will always matter more than any subsequent migration. Much work has demonstrated this in cold-adapted populations used to start anaerobic treatment (found to be effective even at 4 °C, at scale). Another example is that species diversity scales only weakly with bioreactor volume. Such parameters matter because much scientific study focuses on small synthetic communities (low diversity/low entropy) rather than the wider microbial world (high diversity/high entropy). When those two connect, Maxwell’s demon comes into play, and it is hard to argue that breaking the 2nd law of thermodynamics is likely!
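To make that number concrete, here is a minimal sketch under Hubbell-style neutral-model assumptions, in which each “death” is replaced by an immigrant with probability m and by a local “birth” otherwise. The value of m below is purely illustrative, chosen to reproduce the quoted order of magnitude; it is not a figure from the webinar.

```python
# Minimal sketch: birth-vs-immigration odds under a Hubbell-style
# neutral community model. The immigration probability m is an
# illustrative assumption, NOT a figure from the webinar.

m = 5e-15  # hypothetical probability that a replacement is an immigrant

# Each dead individual is replaced by an immigrant (probability m)
# or by the offspring of a local birth (probability 1 - m).
odds_birth_to_immigrant = (1 - m) / m
print(f"birth : immigrant odds ~ {odds_birth_to_immigrant:.1e} to 1")
# ~2.0e+14 to 1: the inoculum, not later migration, sets the community.
```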

Prof. Curtis sees this MC research as three interrelated problems: EP, scale and time. Whilst there is much published on “little rules” – highly specific pieces of work – it is difficult to bring these together to make engineering decisions. The solution is to make individual-based models (IBM) better. Using modern computing, it is now possible to combine growth models (Monod/free energy) with physical models (e.g. Newton/Maxwell and Kelvin). Even then, there is no question of using IBM alone at scale, given that 10⁷ bacteria represent merely one mm². However, by combining IBM with “emulators” that then build into computational fluid dynamics (CFD), Prof. Curtis has achieved metre-scale modelling at millimetre resolution. The benefit is that 81 “10 day” experiments can be run, in silico, in a week – roughly 100× faster and cheaper.
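To give a flavour of what an individual-based model involves, the sketch below implements Monod-limited growth and division for a small population of agents in a single well-mixed patch. Every parameter value, and the simplified substrate accounting, is an illustrative assumption; the real models couple agents like these to physical solvers and, via emulators, to CFD.

```python
import random

# Toy individual-based model (IBM): Monod-limited growth and division
# in one well-mixed patch. All values are illustrative assumptions.

MU_MAX = 0.3     # 1/h    maximum specific growth rate (assumed)
K_S = 2.0        # g/m^3  half-saturation constant (assumed)
YIELD = 0.5      # g biomass per g substrate consumed (assumed)
VOLUME = 1e-9    # m^3    simulated patch, ~1 mm^3 (assumed)
M_DIV = 2e-13    # g      biomass at which a cell divides (assumed)
DT = 0.1         # h      time step

cells = [1.0e-13] * 100   # biomass (g) of each individual agent
substrate = 10.0          # g/m^3 initial substrate concentration (assumed)

for _ in range(500):                                # 50 h of simulated time
    mu = MU_MAX * substrate / (K_S + substrate)     # Monod kinetics
    next_gen = []
    for mass in cells:
        growth = mu * mass * DT                     # biomass added this step
        substrate = max(substrate - growth / (YIELD * VOLUME), 0.0)
        mass += growth
        if mass >= M_DIV:                           # split into two daughters
            d1 = mass * random.uniform(0.45, 0.55)
            next_gen.extend([d1, mass - d1])
        else:
            next_gen.append(mass)
    cells = next_gen

print(f"{len(cells)} cells, residual substrate {substrate:.2f} g/m^3")
```

Even this toy version hints at the scale problem: agent numbers grow with volume far faster than any computer can track, which is exactly the gap the emulator layer is there to bridge.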

Looking ahead, the next 30 years will be critical in achieving real changes to our major engineering practices. One avenue of interest is a move towards the refined design (rather than “suck it and see”!) of media for AMBR and trickling filters. More sophisticated processes will be essential if synthetic biology is to avoid the entropy question. So-called “big rules” such as Lotka–Volterra dynamics, the drift barrier and game theory can be developed further (see the sketch below). And whilst it is always going to be cheaper to design in silico, that does not mean it is cheap! Traditional experiments will always give an answer – but it is better to get the answer, and that requires sophisticated, validated computer models.
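As an illustration of one such “big rule” in code, here is a minimal sketch of two-species Lotka–Volterra competition integrated with a simple forward-Euler scheme. All coefficients are illustrative assumptions; with the values chosen, species 1 competitively excludes species 2.

```python
# Toy 'big rule': two-species Lotka-Volterra competition, forward Euler.
# All coefficients are illustrative assumptions, not data from the webinar.

r1, r2 = 0.8, 0.6        # intrinsic growth rates (1/day, assumed)
K1, K2 = 100.0, 80.0     # carrying capacities (assumed)
a12, a21 = 0.7, 1.1      # competition coefficients (assumed)

n1, n2 = 10.0, 10.0      # initial abundances
dt = 0.01                # days per step

for _ in range(int(200 / dt)):                # simulate 200 days
    dn1 = r1 * n1 * (1 - (n1 + a12 * n2) / K1)
    dn2 = r2 * n2 * (1 - (n2 + a21 * n1) / K2)
    n1 += dn1 * dt
    n2 += dn2 * dt

print(f"after 200 days: n1 = {n1:.1f}, n2 = {n2:.1f}")
# Because K1 > a12*K2 and K2 < a21*K1, species 1 excludes species 2.
```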