


Constrained Optimization: The Method of Lagrange Multipliers

Equality Constraints: Method of Lagrange Multipliers

A company manufactures new phones that are projected to take the market by storm. The two main input components of the new phone are the circuit board and the LCD touchscreen that make the phone faster, smarter, and easier to use. The number of phones to be produced P is estimated to equal P = 250 a^{1/4} b^{1/2}, where a and b are the number of circuit board production hours and the number of LCD touchscreen production hours available, respectively. Such a formula is known to economists as a Cobb-Douglas function. Our laborers are paid by the type of work they do: the circuit board labor cost is $5 an hour and the LCD touchscreen labor cost is $8 an hour. What is the maximum number of phones that can be made if the company has $175,000 allocated for these components in the short run?

Problems such as this can be modeled using constrained optimization. We begin our discussion with optimization with equality constraints, then move to optimization with inequality constraints in the next section. Lagrange multipliers^{[1]} can be used to solve nonlinear optimization problems (NLPs) in which all the constraints are equations. Consider the NLP given by

    maximize (or minimize) f(x_{1}, x_{2}, ..., x_{n})
    subject to g_{i}(x_{1}, x_{2}, ..., x_{n}) = b_{i}, i = 1, 2, ..., m,    (4.2)
where m < n. We can build an equality constrained model for our phone problem: maximize the number of phones made using all available production hours.
Lagrange Multipliers: Introduction and Basic Theory

In order to solve NLPs in the form of (4.2), we associate a Lagrange multiplier λ_{i} with the ith constraint, and form the Lagrangian function

    L(x, λ) = f(x) + Σ_{i=1}^{m} λ_{i} (g_{i}(x) - b_{i}).    (4.3)
The computational procedure for Lagrange multipliers requires that all the partials of the Lagrangian function (4.3) equal zero. The partials all equaling zero at x* form the necessary conditions that x* is a solution to the NLP problem; i.e., these conditions are required for x = (x_{1}, x_{2}, ..., x_{n}) to be a solution to (4.2). We have

Proposition. Lagrange Multiplier Necessary Conditions. For x* to be a solution of the Lagrange function, all partials must satisfy

    ∂L/∂x_{j} = 0, j = 1, ..., n, and ∂L/∂λ_{i} = 0, i = 1, ..., m,
at x*. The points x we will consider as candidates for optimal are

Definition. Regular Point. x is a regular point iff the set {∇g_{i}(x)}, i = 1..m, is linearly independent.

The main theorem for Lagrange's technique is

Theorem. Lagrange Multipliers. Let x be a point satisfying the Lagrange multiplier necessary conditions. Then

a. If f is concave and each g_{i} is linear, then x is an optimal solution of the maximization problem (4.2).

b. If f is convex and each g_{i} is linear, then x is an optimal solution of the minimization problem (4.2).

Previously, we used the Hessian matrix to determine if a function was convex, concave, or neither. Also note that the theorem limits constraints to linear functions. What if the constraints are nonlinear? We can use the bordered Hessian in the sufficient conditions. For a bivariate Lagrangian function with one constraint

    L(x, y, λ) = f(x, y) + λ(g(x, y) - b),
define the bordered Hessian matrix as

    HB = | 0      g_{x}   g_{y}  |
         | g_{x}  L_{xx}  L_{xy} |
         | g_{y}  L_{xy}  L_{yy} |
The determinant of this bordered Hessian matrix is

    det(HB) = -g_{x}^{2} L_{yy} + 2 g_{x} g_{y} L_{xy} - g_{y}^{2} L_{xx}.
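Expanding this determinant symbolically (a SymPy sketch with generic f and g) gives -g_{x}^{2} L_{yy} + 2 g_{x} g_{y} L_{xy} - g_{y}^{2} L_{xx}:

```python
import sympy as sp

x, y, lam = sp.symbols("x y lambda")
f = sp.Function("f")(x, y)
g = sp.Function("g")(x, y)

# Bivariate Lagrangian with one constraint g = b (the constant b drops
# out of all second derivatives)
L = f + lam * (g - sp.Symbol("b"))

gx, gy = sp.diff(g, x), sp.diff(g, y)
BdH = sp.Matrix([
    [0,  gx,               gy],
    [gx, sp.diff(L, x, 2), sp.diff(L, x, y)],
    [gy, sp.diff(L, x, y), sp.diff(L, y, 2)],
])

det = sp.expand(BdH.det())
expected = (-gx**2 * sp.diff(L, y, 2)
            + 2 * gx * gy * sp.diff(L, x, y)
            - gy**2 * sp.diff(L, x, 2))
print(sp.simplify(det - expected) == 0)   # True
```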
The sufficient condition for a maximum in the bivariate case with one constraint is that the determinant of the bordered Hessian is positive when evaluated at the critical point; the sufficient condition for a minimum is that the determinant of the bordered Hessian is negative when evaluated at the critical point. If x is a regular point and g_{i}(x) = b_{i} for all i (all constraints are satisfied at x), then
    T = { y : ∇g_{i}(x) · y = 0, i = 1, ..., m }

defines a plane tangent to the feasible region at x. (The '·' is the usual dot product.)

Lemma. If x is regular and ∇g_{i}(x) · y = 0 for i = 1, ..., m, then ∇f(x) · y = 0.

Note that the Lagrange multiplier conditions are exactly the same for a minimization problem as for a maximization problem. This is the reason that these conditions alone are not sufficient conditions: a given solution can be either a maximum or a minimum. In order to determine whether the point found is a maximum, minimum, or saddle point, we will use the Hessian.

The value of the Lagrange multiplier itself has an important modeling interpretation. The multiplier λ is the "shadow price" for scarce resources. Thus, λ_{i} is the shadow price of the ith constraint. If the right-hand side of that constraint is increased by a small amount, say Δ, then the optimal solution will change by approximately λ_{i} · Δ. We will illustrate shadow prices both graphically and computationally.

Graphical Interpretation of Lagrange Multipliers

We can best understand the method of Lagrange multipliers by studying its geometric interpretation. This geometric interpretation involves the gradients of both the function and the constraints. Initially, consider only one constraint, g(x) = b; then the Lagrangian equation simplifies to

    L(x, λ) = f(x) + λ(g(x) - b).

The solution is the point x where the gradient vector ∇g(x) is perpendicular to contours on the surface. The gradient vector ∇f always points in the direction f increases fastest. At either a maximum or a minimum, this direction must be perpendicular to contour lines on f's surface S. Thus, since both ∇f and ∇g point along the same perpendicular line, ∇f = λ∇g. Further, g's curve must be tangent to f's contours at optimal points. See Figure 4.14. The geometrical arguments are similar for the case of multiple constraints.

FIGURE 4.14: Lagrange Multipliers Geometrically: One Equality Constraint

Let's preview a graphical solution to an example.
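The tangency condition ∇f = λ∇g is easy to verify numerically on a small hypothetical example (not one of the text's): maximize f = xy on the line x + y = 10, where the constrained maximum is at (5, 5).

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)
f = x * y        # hypothetical objective (illustration only)
g = x + y        # constraint g = 10; constrained max of f is at (5, 5)

grad_f = sp.Matrix([sp.diff(f, v) for v in (x, y)]).subs({x: 5, y: 5})
grad_g = sp.Matrix([sp.diff(g, v) for v in (x, y)]).subs({x: 5, y: 5})

# The 2D cross product is zero iff the two gradients are parallel
cross = grad_f[0] * grad_g[1] - grad_f[1] * grad_g[0]
print(cross)                    # 0
print(grad_f[0] / grad_g[0])    # 5  -> the multiplier lambda
```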
Generate a contour plot of z = f(x) with Maple, and overlay the single constraint onto the contour plot. See Figure 4.15. What information can we obtain from this graphical representation? First, we note that the unconstrained optimum does not lie on the constraint. We estimate the unconstrained maximum as (x*, y*) ≈ (2.3, 1.3). The optimal constrained solution lies at the point where the constraint is tangent to a contour of z = f(x). This point is approximately (1.8, 1.0) on the graph. We see clearly that the constraint does not pass through the unconstrained maximum, and thus it can be modified/adjusted (if feasible) until the line passes through the unconstrained solution. At that point, we would no longer add (or subtract) any more constrained resources (see Figure 4.16). Valuable insights about the problem come from plotting the information, when possible.

FIGURE 4.15: Contour Plot of f with Linear Constraint g

Interpreting the value of λ as a "shadow price" leads us to consider what happens when the amount of a resource governed by a constraint is changed. Figure 4.16 shows this concept graphically.

FIGURE 4.16: Resource Constraint g is Increased by Constant a

Let's now turn to the calculations.

Lagrange Multipliers Computations with Maple

The set of equations in (4.2), pg. 163, gives a system of n + m equations in the n + m unknowns {x_{j}, λ_{i}}. Generally speaking, this system presents a very difficult problem to solve without a computer except for simple problems. Also, since the Lagrange multiplier conditions are necessary conditions only, not sufficient, we may find solutions (x_{j}, λ_{i}) that are not optimal for our specific NLP. We need to be able to classify the points found in the solution. Commonly used methods for determining optimality include

a. the definiteness of the Hessian matrix

b. the definiteness of the bordered Hessian via det(BdH)

We'll illustrate these, when feasible, in the following examples with Maple.

Example 4.19. Lagrange Multipliers with Maple.
First, define f and g, and then display a plot. For the plot, we'll embed the contour plot into a 3D plot to make a better visual representation of f. To embed the plot, use the transform command from the plots package.
Then T will take a point (x, y) on a 2D contour plot and embed it in a 3D plot on the plane z = -500.
Now, put the graphs together.
Rotate the 3D figure and inspect it from many different perspectives. Being able to rotate a 3D image is an incredibly useful feature. Back to the computations. The Lagrangian function is L(x, y, λ) = f(x, y) + λ(g(x, y) - b).
Calculate the system of necessary conditions.
Solve the system NecessCond to find potential optimal points.
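The Maple worksheet for Example 4.19 isn't reproduced in this extraction (so its f and g are unknown here), but the same pattern (build the Lagrangian, set all partials to zero, solve) can be sketched in Python with SymPy on a small hypothetical problem: minimize f = x^{2} + y^{2} subject to x + y = 4.

```python
import sympy as sp

x, y, lam = sp.symbols("x y lambda", real=True)

f = x**2 + y**2          # hypothetical objective (illustration only)
g, b = x + y, 4          # single equality constraint g = b

# Lagrangian L = f + lambda*(g - b); note that the sign of the
# multiplier in the output depends on this convention
L = f + lam * (g - b)

# Necessary conditions: all partials of L equal zero
conds = [sp.Eq(sp.diff(L, v), 0) for v in (x, y, lam)]
sol = sp.solve(conds, [x, y, lam], dict=True)
print(sol)   # x = y = 2, lambda = -4
```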
Explain why these two values are equal! It will be no surprise that Maple has several commands for finding solutions to a constrained optimization problem; choose which command to use depending on the form of the output needed. Table 4.5 shows the most commonly used commands. TABLE 4.5: Maple Commands for Constrained Optimization
Remember, in "real world problems," a critical element of the solution using the Lagrange multiplier method is the value and interpretation of the multiplier λ.

Using LagrangeMultipliers
The plot output can be very useful.
Using NLPSolve

Constraints must now be written as equations (or inequalities).
We've lost the value of λ since NLPSolve used a different method. Trying to use the Lagrangian equation for the objective function doesn't capture the correct value.
Using Maximize
Still no λ.
And we did not capture the correct value of λ again. If we need complete information, it's best to use either direct computation of the method or the LagrangeMultipliers function to solve these problems. Combine the information we've found. The solution is
We have a solution, but we need to know whether this solution represents a maximum or a minimum. We can use either the Hessian or the bordered Hessian to justify that we have found the correct solution to our constrained optimization problem.
The Hessian is negative definite for all values of x, so the regular point, also called the stationary point, x is a maximum. The bordered Hessian gives the same result.
Since the determinant is positive, we have found the maximum at the critical point. Either method works in this example to determine that we have found the maximum for our constrained optimization problem.

Now, let's interpret the shadow price λ = 0.76. If the right-hand side of the constraint is increased by a small amount Δ, then the function will increase by approximately λ · Δ = 0.76 · Δ. Since this is a maximization problem, we would add to the restricting resource if possible because it would improve the value of the objective function. From a graph, it can be seen that the incremental change must be small or the objective function could begin to decrease. Look back at Figure 4.16. If we increase the right side of g(x) = b by one unit so the constraint becomes 3x + y = 7, the solution at the new point (x**, y**) should yield a new maximum functional value f(x**, y**) approximately equal to the old maximum plus λ times the change Δ; that is, f(x**, y**) ≈ f(x*, y*) + λ · Δ, here 10.4457 + 0.7609 = 11.2065. In actuality, changing the constraint yields a new solution of 11.04347826. (Verify this!) The actual increase is about 0.5978.

Example 4.20. Lagrange Multipliers with Multiple Constraints. Minimize w = x^{2} + y^{2} + 3z subject to x + y = 3 and x + 3y + 2z = 7. First, we directly solve the problem using Maple.
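The Maple session isn't shown in this extraction; an equivalent direct solution in SymPy eliminates the two linear constraints and minimizes the resulting single-variable function.

```python
import sympy as sp

x, y, z = sp.symbols("x y z", real=True)
w = x**2 + y**2 + 3*z

# Solve the two constraints for y and z in terms of x, then substitute
lin = sp.solve([sp.Eq(x + y, 3), sp.Eq(x + 3*y + 2*z, 7)], [y, z])
w1 = sp.expand(w.subs(lin))        # 2*x**2 - 3*x + 6

# Single-variable minimization: set dw1/dx = 0
xstar = sp.solve(sp.diff(w1, x), x)[0]
print(xstar, lin[y].subs(x, xstar), lin[z].subs(x, xstar), w1.subs(x, xstar))
# 3/4 9/4 -1/4 39/8
```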
Now, define the Lagrangian function L and find its gradient.
Build the system of equations and solve it to find the potential solution.
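Again sketching in SymPy in place of the missing Maple code: with the Lagrangian L = w + λ(x + y - 3) + μ(x + 3y + 2z - 7), solving the stationarity system gives the critical point and both multipliers.

```python
import sympy as sp

x, y, z, lam, mu = sp.symbols("x y z lambda mu", real=True)

w = x**2 + y**2 + 3*z
# One multiplier per constraint, written in the text's g - b form
L = w + lam*(x + y - 3) + mu*(x + 3*y + 2*z - 7)

# All five partials of L must vanish
eqs = [sp.diff(L, v) for v in (x, y, z, lam, mu)]
sol = sp.solve(eqs, [x, y, z, lam, mu], dict=True)[0]

print(sol[x], sol[y], sol[z])   # 3/4 9/4 -1/4
print(sol[lam], sol[mu])        # 0 -3/2
print(w.subs(sol))              # 39/8
```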
Check the solution! (We'll use the “longform,” that is “full name,” of the commands so as to not load the packages.)
This Hessian is always positive semidefinite. The function is convex, and so our critical point is a minimum. Now, we'll solve the problem using Maple's LagrangeMultipliers function.
The same Hessian shows the answer is our desired minimum. Interpret the shadow prices we found, λ = 0 and μ = -3/2. If we only had the funds to increase one of the two constraint resources, which one should we choose? Since the shadow price of the first constraint is 0, we expect no improvement in the objective function's value; we would not spend extra funds to increase the first resource. Since the shadow price of the second constraint is -1.5, we expect the objective function's value to change by -1.5Δ (improving the objective since we are minimizing!) if we increase that resource by Δ; we would spend extra funds to increase the second resource as long as the cost of increasing resource 2 was less than 1.5Δ. Why does the cost need to be less than 1.5Δ?

Applications using Lagrange Multipliers

Many applied constrained optimization problems can be solved with Lagrange multipliers. We'll start by revisiting the opening problem of this section (from page 162).

Example 4.21. The Cobb-Douglas Function. A company manufactures new phones that are projected to take the market by storm. The two main input components of the new phone are the circuit board and the LCD touchscreen that make the phone faster, smarter, and easier to use. The number of phones to be produced P is estimated to equal P = 250 a^{1/4} b^{1/2}, where a and b are the number of circuit board production hours and the number of LCD touchscreen production hours available, respectively. This type of formula is known to economists as a Cobb-Douglas production function.* Laborers are paid by the type of work they do: the circuit board labor cost is $5 an hour and the LCD touchscreen labor cost is $8 an hour. What is the maximum number of phones that can be made if the company has $175,000 allocated for these components in the short run? Let's use Maple. Of course, LagrangeMultipliers gives the same results.
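The Maple output isn't reproduced here, but the Cobb-Douglas numbers can be cross-checked in plain Python using the first-order conditions worked by hand: equating the ratio of marginal products to the ratio of labor costs gives b = 5a/4, and the budget line then determines a.

```python
# Cobb-Douglas phone problem: maximize P = 250 * a**(1/4) * b**(1/2)
# subject to the budget line 5a + 8b = 175,000.
budget, ca, cb = 175_000, 5, 8

# Lagrange condition P_a / P_b = ca / cb reduces to b/(2a) = 5/8, so b = 5a/4
a = budget / (ca + cb * 5 / 4)   # 5a + 8*(5a/4) = 15a = 175000
b = 5 * a / 4
P = 250 * a**0.25 * b**0.5

print(round(a, 2), round(b, 2))  # 11666.67 14583.33
print(P)                         # about 313,765 phones
```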
We find we can make ≈ 313,765 phones using 11,666.67 production hours for circuit boards and 14,583.33 production hours for LCD touchscreens. A plot of the Cobb-Douglas function centered at (11,667, 14,583) showing the constraint in Figure 4.17 finishes our analysis.

*Originated in Cobb & Douglas, "A Theory of Production," Am. Econ. Review, 18, 1928, pp. 139-165.

FIGURE 4.17: Cobb-Douglas Phone Production Function

Example 4.22. Oil Transfer Company Storage Facilities. The management of a small oil transfer company desires a minimum-cost policy taking into account the restrictions on available tank storage space. A formula has been derived from historical data records that describes system costs.
where
The tank space constraint is given by:
where t_{n} is the space required for the nth item (in 1,000s bbl) and T is the available tank space (in 1,000s bbl). The parameter values shown in Table 4.6 are based on the company's data collected over several years. TABLE 4.6: Oil Storage Data Parameter Estimates
The storage tanks have only 22 (1,000s bbl) space available. Find the optimal solution as a minimum-cost policy. We'll first solve the unconstrained problem. This solution will provide an upper bound for the constrained solution and help us gain insight into the dynamics of the problem. Define the parameters, then define the functions. Build the system of equations {∂f/∂x_{i} = 0} and solve it. (Remember to load Student[MultivariateCalculus] for Gradient.) The only useful solution is where x_{1}, x_{2}, x_{3} > 0, so we choose the first row.
This solution x* = (11.07, 12.82, 9.18) provides an unconstrained upper bound since it does not satisfy the constraint g(x) = 22.
Now we solve the constrained optimization problem knowing that the solution will be less than the unconstrained value we just found.
Using solve as we did above gives a plethora of complex and negative values that we don't want. We'll use fsolve with a starting value of the unconstrained solution and λ = 1.
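The cost formula and parameter table are not reproduced in this extraction, so here is the same tactic on a hypothetical stand-in cost, with SymPy's nsolve playing the role of Maple's fsolve (the function and numbers below are illustrative only, not the company's data):

```python
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lambda", positive=True)

# Hypothetical stand-in cost with holding- and ordering-style terms
# (NOT the book's formula, which isn't reproduced here)
f = 100/x1 + 100/x2 + 2*x1 + 2*x2
g, T = x1 + x2, 8                  # stand-in tank-space constraint g = T

L = f + lam * (g - T)
eqs = [sp.diff(L, v) for v in (x1, x2, lam)]

# Start near the unconstrained optimum (x1 = x2 = sqrt(50) ~ 7.07, which
# violates the constraint) with lambda = 1, as the text does with fsolve
sol = sp.nsolve(eqs, (x1, x2, lam), (7, 7, 1))
print(sol.T)   # x1 = x2 = 4, lambda = 4.25
```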
Do we have the minimum? The Hessian matrix H (remember to load Student[VectorCalculus]) is

Since the Hessian is positive definite for all positive x, we have found a minimum for the constrained problem at x* = (4.761, 3.213, 4.045). Should we recommend the company add storage space? We know from the unconstrained solution that, if possible, we would add storage space to decrease the costs. We have found the shadow price λ = 0.74, which suggests that any small increase Δ in the RHS of the constraint causes the objective function to decrease by approximately 0.74 · Δ. The cost of the extra storage tank would have to be less than the savings incurred by adding the tank.

Exercises
If f_{1}(x_{1}) = 50 - (x_{1} - 2)^{2} and f_{2}(x_{2}) = 50 - (x_{2} - 2)^{2}, analyze the manufacturing processes to
x^{2} + y^{2} = 16.

5. Use the Method of Lagrange Multipliers to maximize f(x, y, z) = xyz subject to 2x + 3y + 4z = 36. Determine how much f(x, y, z) would change if one more unit were added to the constraint.

