The Notebook

I’ll motivate Lagrange multipliers with a simple problem. Throughout this post, I’ll use the following problem statement as a running example.

Here I have two functions:

** f(x, y) = xy**

** g(x, y) = x^{2} + y^{2} = 16**

My aim in using Lagrange multipliers is to find the maximum of f(x, y) subject to the constraint g(x, y) = 16.

The mathematical representation of the explanation above takes only one line:

**\textit{max } f(x, y) \textit{ subject to } g(x, y) = 16**

The function g(x, y) = 16 restricts our search to the circle of radius 4 centered at the origin.

First, we parametrize our constraint function. Solving the constraint for x (taking the nonnegative branch):

**x^{2} + y^{2} = 16**

**\rightarrow x = \sqrt{16 - y^{2}}**

Then, let y = t. Here, we define

** x(t) = \sqrt{16 - t^{2}}, \qquad y(t) = t**

Consequently, we define our vector-valued function

** \mathbf{r}(t) = x(t)i + y(t)j = \sqrt{16 - t^{2}}\:i + t\:j**

So, we can say that every point of r(t) satisfies the constraint:

** g(x(t), y(t)) = (16 - t^{2}) + t^{2} = 16**

As you might notice, the vector-valued function r(t) traces exactly the set of points allowed by the constraint; moving along r(t) means moving along the circle.

Are you with me so far? :)

If so, here we'll use this parametrization to reduce our two-variable constrained problem to a single-variable one. Let's define a modified version of f that only sees points on the constraint curve:

** F(t) = f(x(t), y(t)) = \sqrt{16 - t^{2}} \cdot t**

For example:

** F(0) = f(x(0),y(0)) = f(\sqrt{16 - 0^2}, 0) = f(\sqrt{16},0) = \sqrt{16} \cdot 0 = 0**

** F(1) = f(x(1),y(1)) = f(\sqrt{16 - 1^2}, 1) = f(\sqrt{15},1) = \sqrt{15} \cdot 1 = \sqrt{15}**

** F(2) = f(x(2),y(2)) = f(\sqrt{16 - 2^2}, 2) = f(\sqrt{12},2) = \sqrt{12} \cdot 2 = 4\sqrt{3}**

** \dots **

** F(4) = f(x(4),y(4)) = f(\sqrt{16 - 4^2}, 4) = f(0, 4) = 0 \cdot 4 = 0**
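These sample values are easy to check numerically. Here is a minimal sketch; the function names `f` and `F` are mine, chosen to mirror the notation above:

```python
import math

def f(x, y):
    # The objective function f(x, y) = x * y
    return x * y

def F(t):
    # F(t) = f(x(t), y(t)) with x(t) = sqrt(16 - t^2), y(t) = t
    return f(math.sqrt(16 - t**2), t)

for t in [0, 1, 2, 4]:
    print(t, F(t))
```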

So, if we can find the value of t that maximizes F(t), we have found the point on the constraint curve that maximizes f.

Before trying to do that, let's prepare for the next steps by playing around with r(t). As explained before, r(t) is a vector-valued function, and its derivative is defined component-wise:

** r'(t) = \frac{d\mathbf{r}}{dt} = \lim_{\Delta t \to 0} \frac{\mathbf{r}(t + \Delta t) - \mathbf{r}(t)}{\Delta t}**

** \rightarrow r'(t) = \lim_{\Delta t \to 0} \frac{x(t + \Delta t) - x(t)}{\Delta t}i + \lim_{\Delta t \to 0} \frac{y(t + \Delta t) - y(t)}{\Delta t}j**

** \rightarrow r'(t) = \frac{dx}{dt}i + \frac{dy}{dt}j**
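We can sanity-check this component-wise limit with a central-difference approximation; a small sketch, with helper names of my own:

```python
import math

def r(t):
    # r(t) = (x(t), y(t)) = (sqrt(16 - t^2), t), a point on the circle
    return (math.sqrt(16 - t**2), t)

def r_prime_numeric(t, h=1e-6):
    # Approximates the component-wise limit dr/dt with central differences
    (x1, y1), (x0, y0) = r(t + h), r(t - h)
    return ((x1 - x0) / (2 * h), (y1 - y0) / (2 * h))

# Analytically, dx/dt = -t / sqrt(16 - t^2) and dy/dt = 1
t = 2.0
dxdt, dydt = r_prime_numeric(t)
```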

As Δt goes to 0, the difference vector r(t + Δt) - r(t) lines up with the curve, so r'(t) is a tangent vector of the curve traced by r(t).

Now, let's investigate this tangent vector using our example. Recall the constraint:

** g(x,y) = c = 16**

Since we can parametrize g(x, y) in the form g(x(t), y(t)), as we have done above, we can differentiate both sides with respect to t:

** \frac{d}{dt} g(x(t),y(t)) = \frac{d}{dt}c**

** \rightarrow \frac{\partial g}{\partial x} \frac{dx}{dt} + \frac{\partial g}{\partial y} \frac{dy}{dt} = 0**

Now, we can rewrite this equation using the inner product and the gradient vector:

** \rightarrow (\frac{\partial g}{\partial x}i + \frac{\partial g}{\partial y}j) \cdot (\frac{dx}{dt}i + \frac{dy}{dt}j) = 0**

** \rightarrow \nabla g(x,y) \cdot r'(t) = 0**
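A quick numerical check of this perpendicularity, as a sketch (the helper names are mine):

```python
import math

def grad_g(x, y):
    # Gradient of g(x, y) = x^2 + y^2
    return (2 * x, 2 * y)

def tangent(t):
    # r'(t) = (dx/dt, dy/dt) for x(t) = sqrt(16 - t^2), y(t) = t
    return (-t / math.sqrt(16 - t**2), 1.0)

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

t = 2.0
x, y = math.sqrt(16 - t**2), t
perp = dot(grad_g(x, y), tangent(t))  # should vanish
```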

So, here we show that if we have a function, its gradient is perpendicular to the tangent vector r'(t) at every point of the level curve g(x, y) = c.

Now, let's find the equation of this tangent line. In order to do that, here is a small proposition with proof; if you are familiar with it, you might comfortably skip ahead.

Let the normal line of the curve at a point make an angle φ with the x-axis, so that its slope is n = tan(φ). The tangent line is perpendicular to the normal, so it makes the angle θ = φ - 90°.

So, the slope m of the tangent line is:

** m = \tan{\theta} = \tan{(\phi - 90^{\circ})} = \frac{\sin(\phi - 90^{\circ})}{\cos(\phi - 90^{\circ})} =
\frac{-\cos(\phi)}{\sin(\phi)} = \frac{-1}{\tan(\phi)} = \frac{-1}{n}**
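The identity can be spot-checked numerically; a tiny sketch, with an arbitrary angle chosen only for illustration:

```python
import math

# Check the proposition m = tan(phi - 90 deg) = -1 / tan(phi)
phi = math.radians(67)       # arbitrary test angle
m = math.tan(phi - math.pi / 2)
n = math.tan(phi)
```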

By using this information, let's find the tangent line at the point (2√2, 2√2) on the circle.

First, we calculate the gradient of g at that point, which points along the normal direction:

** \nabla g(2\sqrt{2}, 2\sqrt{2}) = 2x_{0}i + 2y_{0}j = 4\sqrt{2}\:i + 4\sqrt{2}\:j**

So, the slope of the normal line at (2√2, 2√2) is n = 4√2 / 4√2 = 1, and by the proposition above the slope of the tangent line is m = -1/n = -1.

So, we can express the tangent line as follows:

** (y - y_{0}) = m(x - x_{0})**

** (y - 2\sqrt{2}) = -1(x - 2\sqrt{2}) \rightarrow x + y = 4\sqrt{2}**
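A short numerical sanity check of this tangent line, sketched in Python:

```python
import math

x0 = y0 = 2 * math.sqrt(2)   # the point of tangency used above

# The point lies on the constraint circle x^2 + y^2 = 16
on_circle = x0**2 + y0**2

# The point satisfies the tangent line equation x + y = 4 * sqrt(2)
on_line = x0 + y0

# The line x + y = c has direction (1, -1); it should be perpendicular
# to the gradient of g at the point, which is (2*x0, 2*y0)
perp = 1 * (2 * x0) + (-1) * (2 * y0)
```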

Now, we are close to the proof of Lagrange multipliers! :)

As stated several times above, we need to find the extreme point of F(t) = f(x(t), y(t)).

So, let me try:

Let's assume that F(t) attains its maximum value at t = t₀. Then:

**F'(t_{0}) = 0**

** \rightarrow \frac{\partial f}{\partial x} \frac{dx}{dt} + \frac{\partial f}{\partial y} \frac{dy}{dt} = 0**

Again, we can rewrite this equation using the inner product and the gradient vector:

** \rightarrow (\frac{\partial f}{\partial x} i + \frac{\partial f}{\partial y} j) \cdot (\frac{dx}{dt}i + \frac{dy}{dt}j) = 0**

Consequently,

** \rightarrow \nabla f(x,y) \cdot r'(t) = 0**

We also know from before that the gradient of the constraint function, ∇g, is perpendicular to r'(t) at every point of the constraint curve.

So, here we observe that at the extreme point of F, both ∇f and ∇g are perpendicular to the same tangent vector r'(t₀). Two vectors in the plane that are perpendicular to the same nonzero vector must be parallel to each other.

With this observation, we can finally write the following equation which is worth all the explanations above:

** \nabla f(x_{0},y_{0}) = \lambda \nabla g(x_{0},y_{0})**

So, let's solve our example, which consists of maximizing f(x, y) = xy subject to x² + y² = 16. Writing ∇f - λ∇g = 0 at the candidate point (x₀, y₀):

** y_{0}i + x_{0}j - \lambda(2x_{0}i + 2y_{0}j) = 0**

** \rightarrow \lambda = \frac{y_{0}}{2x_{0}} **

** \rightarrow \lambda = \frac{x_{0}}{2y_{0}} **

** \rightarrow \frac{y_{0}}{2x_{0}} = \frac{x_{0}}{2y_{0}} \rightarrow x_{0}^{2} = y_{0}^{2}** and since ** x_{0}^2 + y_{0}^2 = 16**, the maximum is attained at ** x_{0} = y_{0} = 2\sqrt{2}** (the other sign choice, ** x_{0} = -y_{0}**, gives the minimum of f), with maximum value ** f(2\sqrt{2}, 2\sqrt{2}) = 8**.
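Finally, a numeric sanity check of this solution, sketched in Python (the brute-force grid search is only for illustration):

```python
import math

x0 = y0 = 2 * math.sqrt(2)
lam = y0 / (2 * x0)                # lambda from the first component equation

grad_f = (y0, x0)                  # gradient of f(x, y) = x * y
grad_g = (2 * x0, 2 * y0)          # gradient of g(x, y) = x^2 + y^2

# grad_f should equal lambda * grad_g component by component
residual = max(abs(grad_f[i] - lam * grad_g[i]) for i in range(2))

# Brute-force check that F(t) = t * sqrt(16 - t^2) peaks near t = 2*sqrt(2)
best_t = max((i * 4 / 10000 for i in range(10001)),
             key=lambda t: t * math.sqrt(16 - t**2))
```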