Tuesday, May 17, 2011

Another approach to process safety textbooks!

The nature of core chemical engineering topics
Most of the core subjects in the chemical engineering curriculum at universities worldwide teach students how to improve things: how to select a more sustainable route to a given product, how to create a more effective catalyst, how to improve energy utilization in the plant using pinch technology, or how to select the best solvent for a separation process. All of this somehow translates directly into making more profit - also in the view of most managers.

One core topic - or at least it should be a core topic, in my opinion - stands out in this connection: process safety. It is all about prevention: prevention of loss, of injuries, of fatalities, of explosions, of fires, of chemical releases. All of this somehow translates directly into expenses - at least in the view of many managers!

Motivating students
How could this be changed? First, textbooks in process safety - and fortunately there are not many - must be drastically changed. Secondly - and I am guilty of this too - we must stop starting our process safety teaching by attempting to motivate our students with negative headlines from newspapers or dramatic videos of past disasters. People working on action theory, such as Professor Morten Lind at DTU-Elektro, will tell you that avoiding some event is much more difficult than achieving something. To avoid something we need to put barriers - i.e. safety systems - on all the possible routes to that event, whereas to achieve something we just need to find one route to the goal. This is the fundamental difference between process safety and all the other subjects we try to teach our chemical engineering students (maybe with the exception of professional ethics).

I believe that Walt Boyes at SustainablePlant.com, with his idea that process safety is all about uptime, is on the right track. All the negative events which process safety tries to prevent reduce uptime - if we fail to block just one of the routes to them. So let us turn things around and work on improving uptime. Uptime also has the advantage that it can easily be measured and related to profits. And as Peter Drucker said, what gets measured gets managed!

Starting the change
So how should a textbook which aims at driving home the point that process safety is all about uptime be structured? It should take off with a positive message. This could be a quote from an interview with the CEO of the Dow Chemical Company in the mid-1990s. To Harvard Business Review this CEO stated that in the previous ten years the company had not made a single investment in process safety which had not contributed positively to the company's bottom line.

Then it could go on to say that if your company's OSHA incidence rate is half the industry average, then that is improving your company's bottom line through improved uptime and reduced cost of insurance, workers' compensation, and probably other expenses. At all costs our new textbook should avoid comparing our fatal accident rate with that of motorcycling, coal mining, construction or rock climbing. Such comparisons are completely irrelevant to the safety performance of our industry or your company!
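For readers unfamiliar with the metric: the OSHA recordable incidence rate normalizes injury counts to 100 full-time workers (100 workers x 2,000 hours/year = 200,000 hours). A minimal sketch - the plant numbers below are made up for illustration:

```python
def osha_incidence_rate(recordable_cases, hours_worked):
    """OSHA recordable incidence rate: cases per 100 full-time workers
    (100 workers x 2,000 hours/year = 200,000 hours)."""
    return recordable_cases * 200_000 / hours_worked

# Hypothetical plant: 3 recordable cases over 1,200,000 hours worked
rate = osha_incidence_rate(3, 1_200_000)
print(rate)  # 0.5
```

Because the rate is per 200,000 hours, plants of very different sizes can be compared directly against the industry average.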

Unfortunately I don't have a finished manuscript. However, what subjects do you think a modern textbook on process safety with a positive message should contain? And how should this be delivered to the students?

Saturday, May 07, 2011

MPC moving up the control pyramid!

Thursday I had the opportunity to attend a one-day conference on model based control here in Denmark at AutomationDESIGN. The conference was held to honor Jim Rawlings from the Department of Chemical and Biological Engineering at the University of Wisconsin, who on Friday received an honorary doctorate from the Technical University of Denmark (DTU). There has been close collaboration between the Department of Chemical Engineering at DTU and the sister department at the University of Wisconsin in Madison ever since John Villadsen went to Wisconsin to develop the ideas of orthogonal collocation with Professor Warren Stewart in the 1960's. Today there is a formalized exchange of teachers between the two departments, and many Ph.D. students from DTU spend some time in Wisconsin during their doctoral studies.

When I first heard about MPC I was working at Imperial Oil's Sarnia chemical plant with an excellent group of control engineers. We developed supervisory control applications for the many different units at the site - gas cracker, polyethylene plant, higher olefins plant, lube oil plant etc. - to take advantage of online analyzers as much as possible. Most - but not all - of this was done on Honeywell process control computers - mainly PMX - on black and white character-based terminals. Only the operators had color screens, and even these had only character-based graphics. Many of the applications used models. Very simple models - often just first-order-plus-time-delay models developed by fitting step responses manually. Nonetheless it was model based control.
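The manual fitting we did from step responses can be sketched in a few lines. This is my own illustrative reconstruction (using the classic two-point 28.3%/63.2% method, with synthetic data), not code from that era:

```python
import numpy as np

def fit_fopdt(t, y, du, y0=0.0):
    """Two-point fit of a first-order-plus-time-delay (FOPDT) model
    K*exp(-theta*s)/(tau*s + 1) from step-response data, using the
    classic 28.3% / 63.2% response times."""
    dy = y[-1] - y0                          # total change in output
    K = dy / du                              # steady-state gain
    t28 = np.interp(y0 + 0.283 * dy, y, t)   # time to reach 28.3% of change
    t63 = np.interp(y0 + 0.632 * dy, y, t)   # time to reach 63.2% of change
    tau = 1.5 * (t63 - t28)                  # time constant
    theta = t63 - tau                        # apparent dead time
    return K, tau, theta

# Synthetic unit-step response of a true FOPDT process: K=2, tau=10, theta=3
t = np.linspace(0, 100, 2001)
y = np.where(t < 3, 0.0, 2.0 * (1 - np.exp(-(t - 3) / 10.0)))
K, tau, theta = fit_fopdt(t, y, du=1.0)
print(K, tau, theta)   # recovers approximately K=2, tau=10, theta=3
```

Crude as such models are, they were - and still are - good enough for a lot of useful supervisory control.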

MPC is an idea which came from industry. During a strike by operators at Shell plants in the USA, engineers were put in charge of running the plants, and two of these engineers came up with the idea that you could improve control by basing the control action not on a single measurement but on a series of past values of process inputs and outputs. They called the concept Dynamic Matrix Control (DMC), took out a US patent on the idea and created a company to assist others in implementing it.
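The core of the DMC idea can be sketched as follows: stack step-response coefficients into a "dynamic matrix" that maps future input moves to future outputs, then compute the moves by least squares. This is a minimal unconstrained sketch of my own (horizons, step-response data and the regularization weight are all illustrative):

```python
import numpy as np

def dmc_gain(step_coeffs, P, M, lam=0.0):
    """Unconstrained Dynamic Matrix Control sketch: build the dynamic
    matrix A from step-response coefficients and return the
    least-squares controller gain (A^T A + lam*I)^-1 A^T for
    prediction horizon P and control horizon M."""
    A = np.zeros((P, M))
    for i in range(P):
        for j in range(M):
            if i >= j:
                A[i, j] = step_coeffs[i - j]   # shifted step response per move
    return np.linalg.solve(A.T @ A + lam * np.eye(M), A.T)

# Hypothetical first-order step response sampled at unit intervals
s = np.array([1 - np.exp(-k / 5.0) for k in range(1, 31)])
K = dmc_gain(s, P=20, M=5, lam=0.1)

# Predicted error over the horizon (setpoint minus free response);
# in practice only the first computed move is applied each interval
e = np.ones(20)
du = K @ e
print(du[0])
```

The "matrix" in the name is exactly this A: each column is the step response shifted by one sample, which is how past and future input moves enter the prediction.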

Since those early days in 1982 MPC has come a long way. There are many applications running in industry - some small single-loop controllers, others controlling a whole battery of cracking reactors with a single MPC application, or a distillation train for purifying ethylene for use in polymerization. As the number of applications grew, and the engineers who developed them moved on in their careers, the ugly question of maintenance appeared.

The title of Jim Rawlings' keynote was “Optimizing Process Economic Performance with Model Based Control”. He started by describing predictive control in a rather simple one-input one-output open loop problem, and explained that in the state estimation problem there are two sources of disturbances which we cannot know accurately: process noise and measurement noise. Then he continued to explain optimal control and optimal feedback control. Feedback of course is important due to disturbances and uncertainties. Dynamic programming was applied to industrial systems, which are mostly constrained and nonlinear, but its practical usefulness was limited.
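The two noise sources appear explicitly in the standard Kalman filter, where they enter as the covariances Q and R. A minimal one-state sketch (all numbers are illustrative, not from the keynote):

```python
import numpy as np

def kalman_step(x, P, u, y, A, B, C, Q, R):
    """One step of a linear Kalman filter. Q is the process-noise
    covariance and R the measurement-noise covariance - the two
    disturbance sources we cannot know accurately."""
    # Predict using the model
    x = A @ x + B @ u
    P = A @ P @ A.T + Q
    # Correct with the new measurement y
    K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
    x = x + K @ (y - C @ x)
    P = (np.eye(len(x)) - K @ C) @ P
    return x, P

# Hypothetical one-state, one-input, one-output system
A = np.array([[0.9]]); B = np.array([[0.1]]); C = np.array([[1.0]])
Q = np.array([[0.01]]); R = np.array([[0.1]])
x = np.array([0.0]); P = np.array([[1.0]])
x, P = kalman_step(x, P, np.array([1.0]), np.array([0.5]), A, B, C, Q, R)
print(x, P)
```

The ratio of Q to R decides how much the estimator trusts the model versus the measurement - which is exactly why not knowing the two noise sources accurately matters in practice.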

MPC, said Jim, is a large industrial success story with 800 to 1200 applications in ethylene alone, with credits of 500 to 800 M$/year in 2007. (In my first job as a control engineer we also had to calculate on a monthly basis what the applications had saved the company the previous month. However, few people believed in these numbers, especially not the control engineers. Later it was accepted that the plant ran more smoothly with supervisory control, and credit calculations were a thing of the past, at least at that site.) According to Professor Rawlings, Eastman Chemical has 55-60 applications and claims 30-50 M$/year due to increased throughput. Dow states they use MBC for the money. Praxair has 150 applications with increased profit of 16 M$/year.

Jim also discussed questions such as: Has the application base stopped growing? Is the theory complete? Do we have the tools to solve nonconvex optimization problems online? Do we have tools to decompose large-scale systems into manageable problems? Do we have tools to commission and maintain the controllers? Do we have tools to optimize dynamic economic operations?

The current treatment of economics in industrial practice is a two-layer structure: a steady-state layer and a dynamic layer. Drawbacks are inconsistent models and re-identification of the linear model as setpoints change. The time-scale separation may not hold, and the economics may be unavailable in the dynamic layer. Optimizing economics - what is really desirable?

Past practice is to define a steady-state economic problem and a plant profit function, then find the economically optimal steady-state solution and use it as setpoints for the dynamic layer.
Maybe it would be better to give the economic function to the MPC, and consider the questions: What closed-loop behavior is desirable? Fast or slow tracking? Asymmetric tracking?
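The past practice described above can be sketched in a few lines: a steady-state profit optimization produces a setpoint, which the dynamic layer then merely tracks. The profit function and all the numbers below are invented for illustration:

```python
import numpy as np

# Hypothetical steady-state profit as a function of reactor temperature T:
# product value from conversion minus an energy cost (illustrative only)
def profit(T):
    conversion = 1 - np.exp(-0.02 * (T - 300))    # rises with temperature
    return 50.0 * conversion - 0.05 * (T - 300)   # revenue minus utility cost

# Layer 1 (steady-state economics): optimize profit over the allowed range
T_grid = np.linspace(300, 500, 2001)
T_sp = T_grid[np.argmax(profit(T_grid))]

# Layer 2 (dynamics): the tracking layer drives the plant toward T_sp,
# here reduced to a simple first-order response
T = 300.0
for _ in range(100):
    T += 0.1 * (T_sp - T)
print(T_sp, T)
```

Note how the economics live only in layer 1: the dynamic layer just tracks, which is exactly the limitation that giving the economic function directly to the MPC is meant to remove.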
Initial work on this was done by DTU Ph.D. students John Bagterp and Dennis Bonné in 2000 and 2002. The results were published in 2008. Since then a Lyapunov function has been found, and the technique has been demonstrated on a nonlinear chemical reactor example with enforced convergence.

The keynote ended with a status of economic MPC, a statement of opportunities and challenges, and a personal story about writing an MPC research monograph.

During the Q&A session the following topics were discussed: Current technology assumes that you can quickly generate a candidate solution from which the optimum may be found. This is not necessarily true for all non-convex problems.
The economic MPC needs displays which the operator and manager can understand. Currently there is no systematic way of deciding whether or not to implement an economic MPC solution.
Problems around commissioning and maintenance still remain for normal MPC, and modular control structures are preferred due to startup and shutdown issues.

The conference continued with presentations from the three departments active in model based control at DTU: DTU Informatics, DTU Electro and DTU Chemical Engineering. DTU Informatics was involved in, among other things, automatic feedback control of insulin injection.

During the afternoon sessions different Danish companies talked about their involvement in model based control. DONG Energy stated that they used the technology mainly for analysis, but had yet to implement their first MPC loop. Siemens talked about cooling control of a rolling mill using MPC with measurements available only at the start and end. 2-Control talked about the development of an MPC toolbox in C# and .NET. (Note: it should be possible to run the toolbox under Linux using the Mono package from Novell, which is now also part of openSUSE.)

Some of the issues which the presenters raised during the conference I have difficulty seeing as issues. Several company presenters talked about lack of computing power for using MPC in real-time control applications. I find it rather strange that computing power is an issue, when the French used MPC in the tracking control of their Exocet missile in the late eighties.

I also have problems with the issues related to scalability. In an industrial setting the operator needs to be in charge of deciding what the controller should do, i.e. which constraints should be active and which process inputs may be manipulated. Hence a manipulated variable which is suddenly switched to manual simply becomes a constraint to the MPC solver. Similarly, a process output which is suddenly unavailable due to instrument calibration or analyzer maintenance simply becomes an unknown disturbance to the MPC controller. In an industrial setting the MPC controller must cope with such structural changes - even if the theory behind implementing them may not be complete. What do you think?
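To make the point concrete, here is a small sketch of my own of how an MV switched to manual can be handled as a constraint in a least-squares move computation: its move is simply fixed at zero by dropping the corresponding column. The matrix and numbers are hypothetical:

```python
import numpy as np

def mpc_move(A, e, manual=None, lam=0.1):
    """Least-squares MPC move sketch where manipulated variables
    switched to manual are treated as constraints: their moves are
    fixed at zero by dropping the corresponding columns of A."""
    manual = set(manual or [])
    active = [j for j in range(A.shape[1]) if j not in manual]
    Aa = A[:, active]
    du_active = np.linalg.solve(Aa.T @ Aa + lam * np.eye(len(active)),
                                Aa.T @ e)
    du = np.zeros(A.shape[1])       # moves for MVs in manual stay zero
    du[active] = du_active
    return du

# Hypothetical 2-output, 3-input static sensitivity matrix
A = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.0, 0.4]])
e = np.array([1.0, -0.5])           # current control errors
print(mpc_move(A, e))               # all inputs available
print(mpc_move(A, e, manual=[1]))   # input 1 in manual: its move stays 0
```

The remaining inputs automatically pick up the control effort, which is the behavior an operator expects when putting one valve in manual.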

Presentations from the conference will be made available at the AutomationDESIGN web-site within the next couple of weeks.