Global optimization of MINLP problems in process synthesis and design

Bibliographic Details
Published in: Computers & Chemical Engineering, Vol. 21, pp. S445-S450
Main Authors: Adjiman, C.S., Androulakis, I.P., Floudas, C.A.
Format: Journal Article
Language: English
Published: Elsevier Ltd, 1997

Summary: Two new methodologies for the global optimization of MINLP models, the Special structure Mixed Integer Nonlinear αBB (SMIN-αBB) and the General structure Mixed Integer Nonlinear αBB (GMIN-αBB), are presented. Their theoretical foundations guarantee that the global optimum solution of MINLPs involving twice-differentiable nonconvex functions in the continuous variables can be identified. The conditions imposed on the functionality of the binary variables differ for each method: linear and mixed bilinear terms can be treated with the SMIN-αBB; mixed nonlinear terms whose continuous relaxation is twice-differentiable are handled by the GMIN-αBB. While both algorithms use the concept of a branch-and-bound tree, they rely on fundamentally different bounding and branching strategies. In the GMIN-αBB algorithm, lower (upper) bounds at each node result from the solution of convex (nonconvex) MINLPs derived from the original problem. The construction of convex lower bounding MINLPs, using the techniques recently developed for the generation of valid convex underestimators for twice-differentiable functions (Adjiman et al., 1996; Adjiman and Floudas, 1996), is an essential task, as it makes it possible to solve the underestimating problems to global optimality using the GBD algorithm or the OA algorithm, provided that the binary variables participate separably and linearly. Moreover, the inherent structure of the MINLP problem can be fully exploited, as branching is performed on both the binary and the continuous variables. In the case of the SMIN-αBB algorithm, the lower and upper bounds are obtained by solving continuous relaxations of the original MINLP. Using the αBB algorithm, these nonconvex NLPs are solved as global optimization problems, and hence valid lower bounds are generated. Since branching is performed exclusively on the binary variables, the maximum size of the branch-and-bound tree is smaller than that for the GMIN-αBB.
The two proposed approaches are used to generate computational results on various nonconvex MINLP problems that arise in the areas of Process Synthesis and Design.
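The convex underestimation that both algorithms build their lower bounds on can be illustrated with a minimal one-dimensional sketch (an assumed illustration of the α-underestimator idea, not the authors' code): for a twice-differentiable f on [xL, xU], the function f(x) + α(xL − x)(xU − x) underestimates f everywhere on the box, matches it at the bounds, and is convex once α ≥ max(0, −½ min f″).

```python
# Hedged sketch of alpha-based convex underestimation on an interval.
# The function, interval, and alpha formula below are illustrative
# assumptions, not taken from the paper.

def f(x):
    # Nonconvex cubic on [0, 4]: f(x) = x(x - 1)(x - 3), so f''(x) = 6x - 8.
    return x * (x - 1.0) * (x - 3.0)

x_lo, x_hi = 0.0, 4.0

# alpha >= max(0, -0.5 * min f'' over the box); here min f'' = 6*0 - 8 = -8.
alpha = max(0.0, -0.5 * (6.0 * x_lo - 8.0))  # alpha = 4.0

def underestimator(x):
    # L(x) = f(x) + alpha * (x_lo - x) * (x_hi - x); the added quadratic
    # term is nonpositive on the box and vanishes at its endpoints.
    return f(x) + alpha * (x_lo - x) * (x_hi - x)

# Sanity checks: L <= f on a grid, with equality at the interval bounds.
xs = [x_lo + k * (x_hi - x_lo) / 100.0 for k in range(101)]
assert all(underestimator(x) <= f(x) + 1e-12 for x in xs)
assert abs(underestimator(x_lo) - f(x_lo)) < 1e-12
assert abs(underestimator(x_hi) - f(x_hi)) < 1e-12
```

Minimizing such a convex underestimator over each node's box yields the valid lower bounds that drive the branch-and-bound search; as the box shrinks, the quadratic perturbation vanishes and the bound tightens.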
ISSN: 0098-1354, 1873-4375
DOI: 10.1016/S0098-1354(97)87542-4