Toronto Math Forum
APM346-2018S => APM346 Home Assignments => Web Bonus Problems => Topic started by: Victor Ivrii on January 05, 2018, 04:34:56 PM

Solutions should be posted and discussed here (on the forum, in this topic) only. Do not post duplicate solutions; however, significant improvements and corrections are a different matter.
Find the general solution $u=u(x,y)$ of the overdetermined system (more equations than unknown functions; usually it is a good idea to check such a solution by substitution):
\begin{align}
& u_{xx}=0,\label{A}\\
&u_{yy}=0.\label{B}
\end{align}

I've given the problem a try, and I'm still trying to figure out how to reconcile my results.
What I did:
Given $ u_{xx} = 0 $ (1) and $ u _{yy} = 0 $ (2), then let's work on (1) first:
$ u_{xx} = v_{x} = 0 $ which implies that $ v = \phi (y) $, then $ u_{x} = \phi (y) $ and $ u(x,y) = x\phi(y) + \psi(y) $
but then there is (2)
$ u_{yy} = w_{y} = 0 , w = \alpha(x) $ then $ u_{y} = \alpha(x), u(x,y) = y \alpha(x) + \beta(x) $
So comparing our two solutions for u:
(I) : $ u (x,y) = x\phi(y) + \psi(y) $
(II) : $ u(x,y) = y\alpha(x) + \beta(x) $
Could we simply conclude that in this family of solutions $\phi(y) = ay$ and similarly $\alpha(x) = bx$? Furthermore, $\psi(y) = \beta(x)$, and since we know those terms go to $0$ when the first partial is taken (see above), can we just make them an arbitrary constant?
(Note: $a$, $b$ are arbitrary constants.)
Am I on the right track, or am I making too many assumptions?

Tristan,
I don't think you are making too many assumptions if your goal is to find one possible solution, but it is valid to be concerned that there are leaps in logic for a general solution.
Your solutions (I) and (II) are interpreted to be the general solutions to $u_{xx}=0$ and $u_{yy}=0$, respectively. I believe your solution steps for $u_x$ and $u_y$ under the respective constraints are correct. However, I have modified the final steps for the general solutions:
(I) : $u(x,y) = x\phi(y)+\psi(y)+bx+C$
(II) : $u(x,y) = y\alpha(x)+\beta(x)+cy+C$
Note that if you take the derivative of (I) with respect to $x$, $b$ gets absorbed into $C$. Analogous conditions apply to (II).
When constrained to variables $x$ and $y$, $C$ is a constant (and so are $b$ and $c$).
Finally, note that the general solution which obeys both $u_{xx}=0$ and $u_{yy}=0$ is equal to both solution (I) and solution (II).
Thus, solve: $x\phi(y)+\psi(y)+bx+C = y\alpha(x)+\beta(x)+cy+C$.
Let's assume none of the terms are equal to $0$. This will allow us to impose constraints that rule out trivial solutions. If any of $\phi(y)$, $\psi(y)$, $\alpha(x)$, and $\beta(x)$ are equal to $0$, the general solution should still be valid.
Thus, $x\phi(y) = y\alpha(x)$.
Thus, $\phi(y)=ay$ and $\alpha(x)=ax$. (This is pretty much what you said, except I imposed the extra constraint that the functions share the same coefficient.) Check that $x\phi(y) = y\alpha(x) = axy$.
So now we have $axy + \psi(y) + bx + C = axy + \beta(x) + cy + C$.
Thus, $\psi(y) = cy$ and $\beta(x) = bx$.
Thus, the general solution is $axy + bx + cy + C$.
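As the problem statement suggests, such a solution is easily checked by substitution. With $u = axy + bx + cy + C$:
$$u_x = ay + b \implies u_{xx} = 0, \qquad u_y = ax + c \implies u_{yy} = 0,$$
so both equations are indeed satisfied.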

Tristan started well, but then went the wrong way in a place I could not expect. Adam made the same "tactical error" but then managed to solve it (though it was overcomplicated). The most obvious way, after one solves the first equation (\ref{A}),
\begin{equation}
u(x,y)= \phi (y) x +\psi(y),
\label{C}
\end{equation}
is not to solve the second one (\ref{B}) from scratch, but to plug (\ref{C}) into it... Continue, please.
I really want the simplest solution...
NB. Adam, please use $\LaTeX$ rather than html commands.

Here is my attempt at a solution, perhaps someone can review it?

Jaisen, if I did not know the solution, I would not be able to read your post. If you want to post a scan, at least write legibly.
On a Quiz, Test or Final, such handwriting could lead to "Unreadable" and a mark of 0.
And typing is much preferable.

Ok, typed here are my steps:
$ u_{x} = f(y) $
$ u(x,y) = xf(y) + g(y) $
$ u_{y} = x f_{y} + g_{y} $, but
$ u_{yy} = 0 $ which implies
$ f_{y} = c_{1} $ and $ g_{y} = c_{2} $ which in turn implies
$ f(y) = c_{1}y + c_{3} $ and $ g(y) = c_{2}y + c_{4} $, concluding:
$ u(x,y) = c_{1}xy + c_{3}x + c_{2}y + c_{4} $

Ok, typed here are my steps:
$$
u_{x} = f(y) \implies u(x,y) = xf(y) + g(y) \implies u_{y} = x f_{y} + g_{y},$$
but
$$ u_{yy} = 0 \implies f_{y} = c_{1} \ \ \text{and} \ \ g_{y} = c_{2},
$$
which in turn implies
$$ f(y) = c_{1}y + c_{3} \ \ \text{and} \ \ g(y) = c_{2}y + c_{4} ,
$$
concluding:
$$ u(x,y) = c_{1}xy + c_{3}x + c_{2}y + c_{4}
$$
Now this is the solution I was looking for (I just made it more readable).
So, here we have a general solution, depending on 4 arbitrary constants, rather than a number of arbitrary functions of one variable. Why so?

I am trying to follow the quick way as in lecture below.
$$\text{ something wrong }.$$
Hence upon substitution,
$$\text{ some other thing wrong}. $$
Thus I have the unknown functions. I think the constants and explicit forms show up as a result of not presuming these functions beforehand.
I hope what follows is no longer nonsense. We indeed have the not-so-arbitrary functions as desired,
$$c_{1}xy,\quad c_{3}x,\quad c_{2}y,\quad c_{4}.$$
They lost their arbitrariness in form because of constraint $(2)$, since upon substitution
$$x f_{y}(y) + g_{y}(y)=h(x)$$
the particular form of $f,g$ is dictated to be polynomial in $y$ of degree at most one.

Jingxuan,
please read carefully what other students posted; you are on the wrong path from the beginning.

Based on Tristan's work,
\begin{eqnarray}
u(x,y)&=&\phi(y)x+\psi(y)\\
u_{yy}&=&0
\end{eqnarray}
Consider differentiating $u(x,y)$ with respect to $y$:
\begin{eqnarray}
u_y&=&\phi'(y)x+\psi'(y)\\
u_{yy}&=&\phi''(y)x+\psi''(y)
\end{eqnarray}
Then plug (5) into (7):
\begin{eqnarray}
\phi''(y)x=\psi''(y)
\end{eqnarray}
Integrating both sides with respect to y:
\begin{eqnarray}
x(\phi'(y)+g(x))=\psi'(y)+f(x)
\end{eqnarray}
Let me rearrange it into:
\begin{eqnarray}
x\phi'(y)+\psi'(y)=f(x)+xg(x)
\end{eqnarray}
Then plug (10) into (6):
\begin{eqnarray}
u_y=f(x)+xg(x)
\end{eqnarray}
Integrating with respect to $y$, we get:
\begin{eqnarray}
u(x,y)=xyg(x)+yf(x)+p(x)
\end{eqnarray}

Lingyun, think: after you got $\phi''(y)x=\psi''(y)$; this is an identity, it must be satisfied for all $x$ and $y$, which is possible only if $\psi''=\phi''=0$.
Jaisen did everything. I just want a discussion, not another solution (especially a wrong one).
PS Lingyun, two $\LaTeX$ remarks.
1) eqnarray should not be used; it is old, buggy, and was deprecated long ago.
2) MathJax supports autonumbering and the \label / \ref mechanism.

Here is my solution to this question:
Given $u_{xx} = 0 \text{ , } u_{yy} = 0$ and $u=u(x,y)$
$u_{xx}=0$ implies $u_x = \phi(y)$
so we have $u(x,y) = \int u_x dx = \phi(y) x + \psi(y) $ (*)
And from $u_{yy}=0$, we have $u_y = \varphi(x)$
With (*) this implies $u_y = {(\phi(y) x + \psi(y))}_y = \varphi(x)$.
Partial derivative on both sides with respect to y we have:
$\phi_{yy}(y)x + \psi_{yy}(y) = 0 $
which implies that:
$\phi_y(y) = c$ and $\psi_y(y) = d$, where both $c$ and $d$ are constants.
Thus:
$\phi(y) = cy + a$ and $\psi(y) = dy + b$ (**), where both $a$ and $b$ are constants.
Combine (*) and (**): we have $u(x,y) = cxy + ax + dy + b$ as our general solution to this PDE.

Guys, there is already a perfect solution by Jaisen, and Tristan made the first part correctly. There is no point in posting more solutions! But the interesting question: why does this solution include just 4 arbitrary constants rather than arbitrary functions of one variable?
And please find the general solutions to two other overdetermined systems
\begin{equation}
\left\{\begin{aligned}
&u_{xx}=y^2,\\
&u_{yy}=x^2
\end{aligned}\right.
\label{Q}
\end{equation}
and
\begin{equation}
\left\{\begin{aligned}
&u_{xx}=y^2,\\
&u_{yy}=-x^2
\end{aligned}\right.
\label{R}
\end{equation}

As for $(\ref{Q})$,
$$u_{xx}=y^2 \implies u=\frac{x^2y^2}{2} + xf(y)+g(y); u_{yy}=x^2 \implies f_{yy}(y)=g_{yy}(y)=0 \implies u = \frac{x^2y^2}{2} + x(ay+b) + (cy+d).$$
Suppose $u$ solves $(\ref{R})$; we would quickly arrive at what seems to me a contradiction:
$$2x^2=xf(y)+g(y).$$
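A quick check of the solution found for the first system, by substitution: with $u=\frac{x^2y^2}{2} + x(ay+b) + (cy+d)$,
$$u_x = xy^2 + ay + b \implies u_{xx}=y^2, \qquad u_y = x^2 y + ax + c \implies u_{yy}=x^2,$$
as required.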

Jingxuan, I removed excessive quotation.
With (13) you found the general solution. Great!
But what about (14)?

For (14)
$$u_{xx}=y^2 \implies u=\frac{x^2y^2}{2} + xf(y)+g(y) \implies u_{y}=x^2y+xf_{y}+g_{y} \implies u_{yy}= x^2 + xf_{yy} + g_{yy} = -x^2 $$
So we have:
$$2x^2 + xf_{yy} + g_{yy}= 0 $$
Suppose $x = 0$; therefore
$$g_{yy} = 0 \implies g_{y} = c \implies g(y)=cy+d \implies u(x,y) = u(0,y) = cy + d $$
Suppose $x$ is not equal to $0$.
When we solve the quadratic we arrive at imaginary values for $x$. I'm not sure if I ought to continue.

oops.

But the interesting question: why does this solution include just 4 arbitrary constants rather than arbitrary functions of one variable?
Perhaps a hint? Are we supposed to know this from the solution or prior to arriving at the solution? The solution indicates this is a linear(?) PDE, so perhaps it is related to that fact. Alternatively, perhaps I can ask: since this is a second-order PDE, and our only constraints relate to the second partial derivatives, does that mean we need constants to determine the initial values and boundaries?

Perhaps a hint? Are we supposed to know this from the solution or prior to arriving at the solution?
In this particular case we can arrive at the correct conclusion either way: we can get it in the process of solving, or we can get it without solving.

Suppose $x$ is not equal to $0$.
When we solve the quadratic we arrive at imaginary values for $x$. I'm not sure if I ought to continue.
I agree with your first part. Now suppose $x$ is not identically $0$: what function $f_{yy}(y)$, multiplied by $x$, will give you such a quadratic in $x$?
Also, the matter seems not to be with possible complex values, but that in this case the quadratic formula gives a relation $x=F(y)$.
Is it ever possible?
No. Hence there is no common solution if $x$ is not identically zero. But if it is then upon substituting $u=u(y)=g(y)$ and so
$$u_{xx}=0=y^2 \implies y=0.$$
Thus $u$ can only be defined at the origin, and takes any constant value. (Though I doubt whether derivatives are well defined then.)

The answer to the question "why does the solution not include arbitrary functions of one variable" is simple: because it is an overdetermined system. One can arrive at this conclusion by the following reasoning: the solution to the equation $u_{xx}=0$ includes arbitrary functions of $y$ but no arbitrary functions of $x$; the solution to $u_{yy}=0$ includes arbitrary functions of $x$ but none of $y$. Then the common solution to both equations contains neither.
Pending: what is the general solution to (14) from the previous page?

Suppose $x$ is not equal to $0$.
When we solve the quadratic we arrive at imaginary values for $x$. I'm not sure if I ought to continue.
I agree with your first part. Now suppose $x$ is not identically $0$: what function $f_{yy}(y)$, multiplied by $x$, will give you such a quadratic in $x$?
Also, the matter seems not to be with possible complex values, but that in this case the quadratic formula gives a relation $x=F(y)$.
I'm not sure I follow. If x is not equal to zero, the quadratic would be satisfied if $f_{yy}$ and $g_{yy}$ are constant. So we end up with:
$$ u(x,y)=\frac{x^2y^2}{2} + \frac{c_{1}xy^2}{2}+c_{3}xy+c_{5}x+\frac{c_{2}y^2}{2}+{c_{4}y}+c_{6} $$

Jaisen, you arrived at the equality I marked in red. And it is not just an equality: it must be an identity, that is, fulfilled for all $x$ and $y$.
What are $f$ and $g$ here, by your definition? And which of these functions makes that equation an identity?

Jaisen, you arrived at the equality I marked in red. And it is not just an equality: it must be an identity, that is, fulfilled for all $x$ and $y$.
What are $f$ and $g$ here, by your definition? And which of these functions makes that equation an identity?
Ok, picking up from where I erred:
$$ x = f_{yy} = g_{yy} = 0 \implies f_{y} = c_{1} $$ and $$ g_{y} = c_{2} $$
Concluding:
$$ u(x,y)=\frac{x^2y^2}{2} + {c_{1}xy}+c_{3}x+{c_{2}y}+c_{4} $$

Wrong again. What are $f$ and $g$?

Wrong again. What are $f$ and $g$?
Functions dependent on the variable y but constant with respect to x. Other than that, I'm unsure.

So $f=f(y)$, $g=g(y)$ and
$$
2x^2 + xg(y)+ h(y)=0
$$
is an identity. Is it ever possible?

The identity is impossible. In other words, there does not exist any $g(y)$ or $h(y)$ such that the equation $2x^2 + xg(y) + h(y) = 0 $ is true for all $x$ and $y$ on the domain of $u(x,y)$.
Proof:
Denote $v(x,y) = 2x^2 + xg(y) + h(y)$. Say we find some values $x=x_0$ and $y=y_0$ such that $v(x_0, y_0) = 0$. If $g(y_0)$ is positive or $0$, let $x_1 > |x_0|$. Now closely examine the expression $v(x_1, y_0)$ and compare it to $v(x_0, y_0)$: $2x_{1}^2 > 2x_{0}^2$ and $x_{1}g(y_0) \ge x_{0}g(y_0)$. Then $v(x_1, y_0) > v(x_0, y_0) = 0$.
If $g(y_0)$ is negative or $0$, let $x_2 < -|x_0|$. We would similarly find $v(x_2, y_0) \ne v(x_0, y_0) = 0$.
Thus, there is always some set of points $x$ and $y$ where $v(x,y) \ne 0$.
How this applies to the problem (14):
The fact that the identity is never possible means that there does not exist a general solution for the constraints $u_{xx} = y^2$ and $u_{yy} = x^2$.
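An alternative way to see the same impossibility: for any fixed $y=y_0$, the expression
$$v(x,y_0) = 2x^2 + x\,g(y_0) + h(y_0)$$
is a quadratic polynomial in $x$ with leading coefficient $2 \ne 0$, so it has at most two real roots and therefore cannot vanish for all $x$.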

Adam nailed it: a solution does not exist. This is the general situation for overdetermined systems: a solution does not exist unless compatibility conditions are satisfied. If the compatibility conditions are satisfied, then there exist fewer solutions.
You studied in Calculus II and ODE such system
\begin{equation}
\left\{\begin{aligned}
&u_x = f,\\
&u_y=g
\end{aligned}\right.
\end{equation}
In other words, when $f\,dx+g\,dy$ is an exact differential. And the necessary (and, for simply connected domains, sufficient) condition was $f_y=g_x$.
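For instance, with $f = 2xy$ and $g = x^2$ the condition holds, $f_y = 2x = g_x$, and indeed
$$2xy\,dx + x^2\,dy = d(x^2 y),$$
so $u = x^2 y + C$.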
For the system
\begin{equation}
\left\{\begin{aligned}
&u_{xx} = f,\\
&u_{yy}=g
\end{aligned}\right.
\end{equation}
such condition is $f_{yy}=g_{xx}$. Prove the necessity!

Necessity:$$u_{xxyy}=f_{yy}=u_{yyxx}=g_{xx}.$$

Would one approach to prove necessity be to simply plug in the results and show that the remaining terms demand that $f_{yy} = g_{xx}$?
$u_{xx} = f$ (1), and to keep things general, let's then write $u_{x} = \int f\, dx + \phi(y)$ and $u = \iint f\, dx\, dx + x\phi(y) + \psi(y)$ (I). Then $u_{y} = \iint f_y\, dx\, dx + x\phi_{y}(y) + \psi_y(y)$ and $u_{yy} = \iint f_{yy}\, dx\, dx + x\phi_{yy}(y) + \psi_{yy}(y) = g$.
Then from $u_{yy} = g$: $u_{y} = \int g\, dy + \alpha(x)$, $u = \iint g\, dy\, dy + y\alpha(x) + \beta(x)$, $u_x = \iint g_x\, dy\, dy + y\alpha_x(x) + \beta_x(x)$, $u_{xx} = \iint g_{xx}\, dy\, dy + y\alpha_{xx}(x) + \beta_{xx}(x) = f$.
From there: $f_{yy} = u_{xxyy} = \partial_y\partial_y \iint g_{xx}\, dy\, dy + \partial_y\partial_y (y\alpha_{xx}(x)) + \partial_y\partial_y \beta_{xx}(x) = g_{xx} + 0$
and $g_{xx} = u_{yyxx} = \partial_x\partial_x \iint f_{yy}\, dx\, dx + \partial_x\partial_x (x\phi_{yy}(y) + \psi_{yy}(y)) = f_{yy} + 0$.
Or, as Jingxuan said, $u_{xxyy}=f_{yy}=u_{yyxx}=g_{xx}$.
And can we conclude that if $f_{yy} \neq g_{xx}$ then we would not be able to get this result?

Jingxuan
Correct and minimalist proof of necessity. Obviously $u_{xx}=y^2$, $u_{yy}=-x^2$ does not satisfy it, and a solution does not exist.
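Spelled out for the two systems above: with $f=y^2$, $g=x^2$ one has
$$f_{yy} = 2 = g_{xx},$$
so the condition is satisfied (and a solution was indeed found), while with $f=y^2$, $g=-x^2$ one has
$$f_{yy} = 2 \ne -2 = g_{xx},$$
so no solution exists.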
To prove that this compatibility condition $f_{yy}=g_{xx}$ is sufficient (under additional assumption that domain is convex, you can figure out generalization) solve $u_{xx}=f$, $u(0,y)=\phi(y)$, $u_x(0,y)=\psi(y)$ where $\phi$, $\psi$ so far are unknown and will be chosen later. Such $u$ obviously exist for any $\phi$, $\psi$.
What about $v:=u_{yy}-g$? Obviously $v_{xx}=0$ and we need to satisfy $v(0,y)=0$, $v_x(0,y)=0$, which are, in fact, $\phi''-g(0,y)=0$, $\psi''-g_x(0,y)=0$, and we can satisfy these two equations. Four arbitrary constants $C_1=\phi(0)$, $C_2=\phi'(0)$, $C_3=\psi(0)$ and $C_4=\psi'(0)$ appear.
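For instance (a sketch, assuming the two conditions read $\phi''(y)=g(0,y)$ and $\psi''(y)=g_x(0,y)$), one admissible choice is
$$\phi(y)=C_1+C_2\,y+\int_0^y (y-s)\,g(0,s)\,ds,\qquad \psi(y)=C_3+C_4\,y+\int_0^y (y-s)\,g_x(0,s)\,ds,$$
since differentiating twice under the integral gives exactly $\phi''(y)=g(0,y)$ and $\psi''(y)=g_x(0,y)$, with the four constants $C_1,\dots,C_4$ free.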

$u(0,y)=\phi(y)$, $u_x(0,y)=\psi(y)$ where $\phi$, $\psi$ so far are unknown and will be chosen later. Such $u$ obviously exist for any $\phi$, $\psi$.
What about $v:=u_{yy}-g$? Obviously $v_{xx}=0$ and we need to satisfy $v(0,y)=0$, $v_x(0,y)=0$, which are, in fact, $\phi''-g(0,y)=0$, $\psi''-g_x(0,y)=0$, and we can satisfy these two equations. Four arbitrary constants $C_1=\phi(0)$, $C_2=\phi'(0)$, $C_3=\psi(0)$ and $C_4=\psi'(0)$ appear.
May I ask how we get $u(0,y)=\phi(y)$, $u_x(0,y)=\psi(y)$? I cannot follow starting from here...

For the necessity $u_{xxyy}=f_{yy}=u_{yyxx}=g_{xx}$, can we say that this is because mixed partial derivatives of the same type, i.e. with the same number of differentiations with respect to the same variables, are equal? (By Clairaut's theorem)

Explanation: 1) $u_{xxyy}=u_{yyxx}$ due to the property of partial derivatives; in other words operators $\partial_x$ and $\partial_y$ (of partial differentiations) commute.
2) Proving that $f_{yy}=g_{xx}$ (*) is sufficient (at least in convex domains): we denote $v:= u_{yy}-g$, and if $u_{xx}=f$ we conclude that $v_{xx}=0$ due to (*).
Therefore if $v=v_x=0$ for $x=0$, then $v=0$ identically. But $u=\phi(y) +\psi(y)x$ and $v=\phi''(y)+\psi''(y)x-g(x,y)$, which indeed implies $v(0,y)=\phi''(y)-g(0,y)$, $v_x(0,y)=\psi''(y)-g_x(0,y)$.
___
This topic is closed now; I will give points over the coming weekends.