For both scale-up and scale-down I would recommend using a tool such as regime analysis. It forces you to dig into your understanding of the process, and it helps you identify key subprocesses that might need further investigation. Check for example: http://www.sciencedirect.com/science/article/pii/0141022987901335
Bioprocess engineering has shown promising results over the past few years in various industrial sectors. Although it is quite easy to maintain controlled conditions at lab scale, it is quite difficult in large-scale trials for a commercial venture. This is where engineering comes into play: each parameter considered at lab scale has to be scaled up, and many issues arise where results deviate from the lab scale. An engineering approach helps solve the complex unit operations, and modelling and simulation have given a clearer picture.
Understanding the type of bioreactor you are going to design and scale up is key. You must factor in the microorganisms to be employed, whether the process is aerobic or anaerobic, and the mode of operation, whether continuous or fed-batch.
The best technique for performing relevant scale-up studies is to do the opposite: scale down. That is, you can design a lab-scale reactor to reproduce mixing defects and see what happens at the level of the biology.
The main tools for bioprocess scale-up involve optimising the operating conditions to obtain the highest production rate using a pure culture. Lab-scale data and the biochemical reactions can then be used to design the bioprocess line and bioreactor for larger-scale production.
It would be good to initially use tools such as Aspen and MATLAB and then proceed to higher-level simulation tools like SuperPro Designer to monitor all the parameters of the process. SuperPro Designer has been an efficient tool for analysis purposes.
In the case of modern cell culture, bioprocesses have been successfully scaled up to volumes greater than 25,000 L through sound engineering fundamentals and thorough process understanding. A scale-up strategy that combines integrated teamwork with solid engineering effort can go a long way towards minimizing costly rework.
Use these tips to successfully handle challenges that may arise during design and scale-up of various bioprocess operations ("Avoid Pitfalls of Bioprocess Development", CEP, August 2006).
Most bioprocesses are enzyme-controlled reactions. Prior to scale-up, the reaction parameters required for enzyme-catalysed reactions need to be optimized, such as the substrate and enzyme concentrations, the enzyme kinetics, and the process conditions.
In scaling up bioprocesses, the intermediate products and enzyme-substrate complexes formed play an important role in controlling the reaction kinetics. Accumulation of some of these products may unexpectedly hinder the catalytic activity of the system. For smooth continuation of a bioprocess, it is necessary to maintain the proper physiological proportions between substrate, enzyme concentration (substrate-free and substrate-bound), reaction intermediates and end-products, keeping the process equilibrium towards the product side.
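The effect of product accumulation described above can be screened on paper before any scale-up experiment. A minimal sketch, assuming Michaelis-Menten kinetics with competitive product inhibition; all parameter values and names are illustrative, not from this discussion:

```python
# Sketch: Michaelis-Menten rate with competitive product inhibition.
# All parameter values are illustrative assumptions.

def rate(s, p, vmax=10.0, km=0.5, ki=2.0):
    """Reaction rate (mmol/L/h) at substrate s and product p (mmol/L).

    Competitive product inhibition raises the apparent Km
    by a factor (1 + p/Ki).
    """
    return vmax * s / (km * (1.0 + p / ki) + s)

# Same substrate level, with and without accumulated product:
v_start = rate(s=5.0, p=0.0)
v_late = rate(s=5.0, p=10.0)
print(f"rate without product: {v_start:.2f} mmol/L/h")
print(f"rate with product:    {v_late:.2f} mmol/L/h")
```

Running a grid of (s, p) pairs through such a model quickly shows at what product titre the process equilibrium argument above starts to bite.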
The best tool is the brain. Before dashing off to do experiments, make sure you define what you know, what you assume and what you do not know. Each of these should be critically evaluated. Then make intelligent use of canned simulation programs, recognising that the outputs depend on the three categories outlined above. During these studies, be prepared to redistribute items among the three categories as new data comes up.
However, as pointed out, you need to worry about your scale-down model as much as your scale-up.
Development of biotechnological production processes takes about 7 years. It is necessary to find new strategies to develop bioprocesses more efficiently by reducing development times. This will reduce high investment risks and costs.
Engineers rely heavily on certain rules of thumb, also known as heuristics, for putting together the skeleton of a recovery and purification process. They are: 1) Remove the most plentiful impurities first. 2) Remove the easiest-to-remove impurities first. 3) Make the most difficult and expensive separations last. 4) Select processes that make use of the greatest differences in the properties of the product and its impurities. 5) Select and sequence processes that exploit different separation driving forces. (Bioprocess Design and Economics by Demetri Petrides, Ph.D., President, INTELLIGEN, INC., http://www.intelligen.com)
Sorry for adding a question to a question. Is anyone using the basic principles and tools of Quality by Design and Quality Function Deployment in this process?
For scale-up of fermentation processes in batch and fed-batch mode, where hybrid techniques are being developed, powerful Metabolic Control Analysis (MCA) tools that can analyse the complex metabolic pathways in living cells are being employed.
This somewhat artificial topic surprises me, but the question is valid. However, it has hardly received relevant answers. There are many tools for scale-up, both traditional (such as dimensional analysis) and new (such as CFD), but all tools have their disadvantages and in general do not straightforwardly give the scale-up criterion to use. However, all tools help to understand the process at hand. All processes are limited by a subprocess, normally a transport process. The limiting subprocess determines the ruling regime. When the regime does not change with scale-up (which is rare), scale-up is rather easy. In any case, a regime analysis is needed to determine which regimes may occur as a function of scale. Then the limiting subprocess at the industrial scale can be studied in detail, e.g. by scale-down. From that, a scale-up criterion may be found.
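A common way to carry out such a regime analysis is to compare the characteristic times of the candidate rate-limiting subprocesses at each scale: the slowest one (largest characteristic time) rules the regime. A minimal sketch for an aerobic fermenter; all numbers are assumed for illustration, not measured data:

```python
# Sketch of a regime analysis: compare characteristic times (s) of the
# candidate limiting subprocesses. All numbers are illustrative assumptions.

def characteristic_times(kla, c_o2_sat, q_o2, t_mix):
    """kla: O2 transfer coefficient (1/s); c_o2_sat: O2 saturation (mol/m^3);
    q_o2: volumetric O2 uptake rate (mol/m^3/s); t_mix: mixing time (s)."""
    return {
        "oxygen transfer": 1.0 / kla,
        "oxygen consumption": c_o2_sat / q_o2,
        "mixing": t_mix,
    }

scales = {
    "lab (10 L)": dict(kla=0.10, c_o2_sat=0.25, q_o2=0.01, t_mix=5.0),
    "production (50 m3)": dict(kla=0.01, c_o2_sat=0.25, q_o2=0.01, t_mix=60.0),
}
for scale, params in scales.items():
    times = characteristic_times(**params)
    limiting = max(times, key=times.get)  # slowest subprocess rules the regime
    print(f"{scale}: {times} -> ruling regime: {limiting}")
```

With these assumed numbers the lab reactor is limited by the biology (kinetic regime) while the production reactor becomes oxygen-transfer limited, i.e. the regime changes with scale, which is exactly the case where naive scale-up fails.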
CFD is a powerful tool but in general not yet suited for scale-up. For example, bubble-column bioreactors show a (chaotic, dynamic) flow behaviour that cannot yet be described by (time-averaging) CFD. Furthermore, CFD is time-consuming, which limits its use for fast scale-up. But development is fast, and (dynamic) CFD will in the (near?) future be a standard tool.
As Professor Rob said, CFD is time-consuming, but for multiphase reactors that are strongly size-dependent I think CFD simulation is unique. Starting an appropriate CFD simulation requires a good dimensional analysis.
Parviz Moin, the Franklin M. and Caroline P. Johnson Professor in the School of Engineering and director of CTR at Stanford, said: "CFD simulations are incredibly complex. Only recently, with the advent of massive supercomputers boasting hundreds of thousands of computing cores, have engineers been able to model jet engines and the noise they produce with accuracy and speed." Many CFD breakthroughs involve using inventive numerical algorithms to break down fluid dynamics equations into finite numbers of standard differential equations.
In membrane chromatography, we have observed different mechanisms simultaneously impacting band broadening. Binding mechanisms are scale-independent, provided mass transfer phenomena are not lumped into the same parameters. Separate transport models at different scales can be used for transferring binding parameters across scales. This approach works with CFD models, which are computationally expensive but provide physical insight (http://dx.doi.org/10.1016/j.chroma.2013.07.004), as well as with semi-empirical models that are inexpensive to solve but require estimation of additional parameters from measurement data (http://dx.doi.org/10.1002/bit.24771).
CFD is a good method, but for a fermentation process one also needs good knowledge of the biokinetics and of the limiting factors. Establish the critical fluxes and stocks of substrates, and respect them when scaling up or down.
Another point is to establish the critical times (residence time in the loops for membrane reactors, mixing time, for example).
Accurate quantification of biokinetics and stoichiometry at a reasonable fermentation size (30 to 100 litres) permits establishing the constraints, giving higher performance to CFD and better accuracy in scale changes.
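The critical times mentioned above can be checked with back-of-the-envelope estimates long before CFD. A rough sketch, assuming a common flow number for the impeller and the rule of thumb that homogenization takes about four circulations; all tank dimensions and the critical time are made-up illustrations:

```python
# Rough estimate of circulation and mixing time in a stirred fermenter,
# compared against a critical process time (e.g. local substrate depletion
# around the feed point). Flow number and the "mixing ~ 4 circulations"
# rule are textbook rules of thumb, used here as assumptions.

def circulation_time(volume_m3, n_rps, d_imp_m, flow_number=0.75):
    """Mean circulation time (s): liquid volume / impeller pumping rate."""
    pumping = flow_number * n_rps * d_imp_m ** 3  # m^3/s
    return volume_m3 / pumping

V, N, D = 50.0, 1.5, 1.0  # 50 m^3 tank, 90 rpm, 1 m impeller (assumed)
t_c = circulation_time(V, N, D)
t_mix = 4.0 * t_c  # rule of thumb: ~4 circulations to homogenize
print(f"circulation time ~ {t_c:.0f} s, mixing time ~ {t_mix:.0f} s")

t_critical = 30.0  # s, assumed time for local substrate depletion
print("gradients likely" if t_mix > t_critical else "well mixed")
```

When the estimated mixing time exceeds the critical time, cells will experience gradients at the large scale, which is exactly what a scale-down reactor should then be designed to reproduce.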
I agree with your answer. Multivariate analysis has been used extensively by researchers in engineering, biology, chemistry, geology, mining, education and many other fields.
I think there are three important aspects to consider; the heart of any process is the bioreactor: 1. the kind of process to carry out, that is, the type of microorganism, animal cells, etc.; 2. growth and production kinetics, aerobic or anaerobic process, and the rheological behaviour during the process; and 3. engineering parameters such as mass and heat transfer. With these studies we can obtain mathematical models and simulate the process in CFD or Aspen software, or write our own software. What are you scaling up in the bioprocess: the pretreatment, the bioreactor, or the separation process?
My feeling and experience in fermentation is that we need very accurate biokinetics, including the effects of pO2, pCO2, redox, shear rates, mass and heat transfer rates, critical temperatures and pH, concentrations (equivalent to Monod constants) and inhibition constants. Then, with the rules of scaling up/down, determine which factor(s) are most critical to keep stable. This state of knowledge will be acquired with highly instrumented fermenters; 30 litres seems a fair starting point.
These parameters are required to feed commercial software and CFD. One must also be careful with industrial substrates and with changes in the physico-chemical properties of the fermentation medium.
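As a concrete example of the kind of biokinetic model these parameters would feed, here is a minimal sketch of a specific growth rate with Monod kinetics plus substrate (Haldane-type) and product inhibition terms, the sort of model one would fit on a well-instrumented 30-100 L fermenter. All parameter values are illustrative assumptions:

```python
# Sketch: specific growth rate with Monod kinetics, Haldane-type substrate
# inhibition and linear product inhibition. Parameter values are assumed.

def mu(s, p, mu_max=0.5, ks=0.2, ki_s=50.0, ki_p=20.0):
    """Specific growth rate (1/h); s: substrate (g/L), p: product (g/L)."""
    substrate_term = s / (ks + s + s * s / ki_s)  # Haldane kinetics
    product_term = max(0.0, 1.0 - p / ki_p)       # clamped at zero
    return mu_max * substrate_term * product_term

print(f"mu at s=10 g/L, p=0 g/L:  {mu(10.0, 0.0):.3f} 1/h")
print(f"mu at s=10 g/L, p=15 g/L: {mu(10.0, 15.0):.3f} 1/h")
```

Once such a model is parameterized, the sensitivity of mu to each variable tells you directly which factor is the most critical one to keep stable across scales.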
This company in Copenhagen produces a sensor ball that can be thrown into commercial bioreactors to gather analytical data on the liquid at different points within the reactor. This could help greatly with scale-up: www.freesense.dk