Please use this identifier to cite or link to this item: http://hdl.handle.net/11375/29955
Title: General convex relaxations of implicit functions and inverse functions
Authors: Cao, Huiyi
Khan, Kamil A.
Department: Chemical Engineering
Keywords: Global optimization; Implicit functions
Publication Date: 2023
Publisher: Springer (Journal of Global Optimization)
Citation: Cao, H. and Khan, K.A., General convex relaxations of implicit functions and inverse functions. J Glob Optim 86, 545–572 (2023). https://doi.org/10.1007/s10898-023-01281-0
Abstract: Convex relaxations of nonconvex functions provide useful bounding information in applications such as deterministic global optimization and reachability analysis. In some situations, the original nonconvex functions may not be known explicitly, but are instead described implicitly by nonlinear equation systems. In these cases, established convex relaxation methods for closed-form functions are not directly applicable. This article presents a new general strategy to construct convex relaxations for such implicit functions. These relaxations are described as convex parametric programs whose constraints are convex relaxations of the original residual function. This relaxation strategy is straightforward to implement, produces tight relaxations in practice, is particularly efficient to carry out when monotonicity properties can be exploited, and does not assume the existence or uniqueness of an implicit function on the entire intended domain. Unlike all previous approaches to the authors’ knowledge, this new approach permits any relaxations of the residual function; it does not require the residual relaxations to be factorable or to be obtained from a McCormick-like traversal of a computational graph. This new convex relaxation strategy is extended to inverse functions, compositions involving implicit functions, feasible-set mappings in constraint satisfaction problems, and solutions of parametric ODEs. Based on a proof-of-concept implementation in Julia, numerical examples are presented to illustrate the convex relaxations produced for various implicit functions and optimal-value functions.
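As a rough illustration of the kind of construction the abstract describes (a minimal sketch in hypothetical notation; the article's actual formulation may differ): suppose an implicit function x(p) satisfies the residual system f(x(p), p) = 0 for parameters p in P, with x(p) known a priori to lie in a box X, and suppose f^cv and f^cc are convex and concave relaxations of the residual f on X × P. Then the parametric programs
\[
x_i^{\mathrm{cv}}(p) := \min_{z \in X} \{\, z_i : f^{\mathrm{cv}}(z,p) \le 0 \le f^{\mathrm{cc}}(z,p) \,\},
\qquad
x_i^{\mathrm{cc}}(p) := \max_{z \in X} \{\, z_i : f^{\mathrm{cv}}(z,p) \le 0 \le f^{\mathrm{cc}}(z,p) \,\}
\]
are respectively convex and concave in p, since each projects a linear objective over a feasible set that is jointly convex in (z, p), and they bound the i-th component of x(p) wherever it exists, because x(p) itself is feasible for both programs.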
URI: http://hdl.handle.net/11375/29955
DOI: 10.1007/s10898-023-01281-0
Appears in Collections: Student Publications (Not Graduate Theses)

Files in This Item:
File: sn-article.pdf | Description: Open Access | Size: 855.61 kB | Format: Adobe PDF


Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.
