Prove that if $a$, $b$, $c$ is a primitive Pythagorean triple, $a$ and $b$ can't both be even.

We can use an indirect argument here.

Assume that both $a$ and $b$ are even. Then, for some integers $n_1$ and $n_2$, we must have $a=2n_1$ and $b=2n_2$.

Now consider whether $c^2$ can be even or odd: $$ \begin{align*}c^2 &= a^2 + b^2 \\ &= (2n_1)^2 + (2n_2)^2 \\ &= 4n_1^2 + 4n_2^2 \\ &= 2(2n_1^2 + 2n_2^2) \\ &= 2n_3 \textrm{ for some integer $n_3$.} \end{align*}$$

So $c^2$ must be even. Consequently, $c$ itself must be even. (Clearly, $c$ can't be odd, as the square of an odd number is odd.) Notice that $a$, $b$, and $c$ are now all even, and hence all divisible by $2$.

Recall, however, that the terms of a "primitive" Pythagorean triple can't share any common divisor other than $1$. So we have a contradiction! Thus, we reject our original assumption that $a$ and $b$ are both even. The opposite must instead be true: $a$ and $b$ can't both be even.

Prove that if $a$, $b$, $c$ is a primitive Pythagorean triple, $a$ and $b$ can't both be odd.

We can use an indirect argument here.

Assume that both $a$ and $b$ are odd. Then, for some integers $n_1$ and $n_2$, we must have $a=2n_1+1$ and $b=2n_2+1$.

Now consider $c^2$: $$ \begin{align*}c^2 &= a^2 + b^2 \\ &= (2n_1 + 1)^2 + (2n_2 + 1)^2 \\ &= 4n_1^2 + 4n_1 + 1 + 4n_2^2 + 4n_2 + 1 \\ &= 4n_1^2 + 4n_1 + 4n_2^2 + 4n_2 + 2 \\ &= 4(n_1^2 + n_1 + n_2^2 + n_2) + 2 \\ &= 4n_3 + 2 \textrm{ for some integer $n_3$.} \end{align*}$$

Notice, this tells us two things:

- $c^2$ must be even as $4n_3+2 = 2(2n_3 + 1)$
- $4 \nmid c^2$, as the division would result in a remainder of 2.

However, if $c^2$ is even, then $c$ itself must be even. (Clearly, $c$ can't be odd, as the square of an odd number is odd.) Thus, $c=2k$ for some integer $k$. This, in turn, implies that $c^2 = (2k)^2 = 4k^2$. So it must be true that $4 \mid c^2$. This contradicts our earlier conclusion that $4 \nmid c^2$. So we are forced to reject our original assumption that $a$ and $b$ are both odd. The opposite must instead be true: $a$ and $b$ can't both be odd.
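The heart of the argument above is that every perfect square leaves a remainder of $0$ or $1$ when divided by $4$, while the sum of two odd squares always leaves a remainder of $2$. A minimal numerical sanity check of both facts (Python, exhaustive over a small range):

```python
# Every perfect square is congruent to 0 or 1 mod 4 ...
squares_mod_4 = {n * n % 4 for n in range(1, 1000)}
assert squares_mod_4 == {0, 1}

# ... while the sum of two odd squares is always congruent to 2 mod 4,
# so it can never itself be a perfect square.
for a in range(1, 100, 2):
    for b in range(1, 100, 2):
        assert (a * a + b * b) % 4 == 2
```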

Prove that if $m$ and $n$ are relatively prime and $(m+n)(m-n)$ is odd, then $m+n$ and $m-n$ must also be relatively prime.

We can use an indirect argument here.

Assume that $(m+n)$ and $(m-n)$ are not relatively prime. Then they share some common divisor other than $1$, and hence some common prime divisor. Let's call this prime $d$. But if $d \mid m+n$ and $d \mid m-n$, then $d$ must divide both their sum and their difference: $$d \mid (m+n) + (m-n)$$ $$d \mid (m+n) - (m-n)$$ After collecting like terms, we see that $$d \mid 2m\quad \textrm{and} \quad d \mid 2n$$ As $d$ is prime, the first requires that either $d \mid 2$ or $d \mid m$, while the second requires $d \mid 2$ or $d \mid n$.

Notice, however, what happens if the first case holds: $d \mid 2$ (with $d \neq 1$) implies $d=2$. Consequently, since $d \mid m+n$ and $d \mid m-n$, both $m+n$ and $m-n$, and hence their product $(m+n)(m-n)$, must be even -- which would contradict the given statement that $(m+n)(m-n)$ is odd.

So instead, the second case in both possibilities must be true: $d \mid m$ and $d \mid n$. This, however, means that $m$ and $n$ are not relatively prime. Again, we reach a contradiction with the given information.

So we reject our assumption that $(m+n)$ and $(m-n)$ are not relatively prime. Instead, the opposite must be true: $(m+n)$ and $(m-n)$ are relatively prime.
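The result is easy to spot-check by brute force. A minimal sketch (Python, exhaustive over a modest range):

```python
from math import gcd

# For every coprime pair (m, n) whose product (m+n)(m-n) is odd,
# confirm that m+n and m-n are themselves coprime.
for m in range(2, 200):
    for n in range(1, m):
        if gcd(m, n) == 1 and ((m + n) * (m - n)) % 2 == 1:
            assert gcd(m + n, m - n) == 1
```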

Prove that if for integers $s$ and $t$, we know that $st$ is odd, then $s$ and $t$ must both be odd.

We can use an indirect argument here.

Assume $s$ and $t$ are not both odd. Then one of them is even. Without loss of generality, suppose $s$ is the even number. So $s=2k$ for some integer $k$. But then, $st = (2k)t = 2(kt)$, so $st$ must be even as well. This contradicts what we know about $st$ (it must be odd). So we reject our assumption. The opposite, instead, must be true: $s$ and $t$ must both be odd.
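An exhaustive check over a small range confirms the claim:

```python
# Whenever a product s*t is odd, both factors must be odd.
for s in range(1, 100):
    for t in range(1, 100):
        if (s * t) % 2 == 1:
            assert s % 2 == 1 and t % 2 == 1
```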

Prove that for odd integers $s$ and $t$, if $\displaystyle{\frac{s^2+t^2}{2}}$ and $\displaystyle{\frac{s^2-t^2}{2}}$ are relatively prime integers, then $s$ and $t$ must be relatively prime.

Argue indirectly and suppose an integer $d \gt 1$ divides both $s$ and $t$. Note $d$ must be odd as both $s$ and $t$ are odd. Further, we then know that $s = dk_1$ and $t = dk_2$ for some odd integers $k_1$ and $k_2$.

Now note that $$\frac{s^2 + t^2}{2} = \frac{(dk_1)^2 + (dk_2)^2}{2} = d^2 \cdot \left( \frac{k_1^2 + k_2^2}{2} \right)$$ However, as $k_1$ and $k_2$ are odd, $k_1^2+k_2^2$ must be even, and thus the parenthesized factor on the far right is an integer. Consequently, $\displaystyle{d^2 \textrm{ divides } \frac{s^2+t^2}{2}}$.

Note also that $d^2 \gt 1$ as $d \gt 1$.

Similarly, $$\frac{s^2 - t^2}{2} = \frac{(dk_1)^2 - (dk_2)^2}{2} = d^2 \cdot \left( \frac{k_1^2 - k_2^2}{2} \right)$$ As $k_1^2 - k_2^2$ must be even as well, the same argument establishes $\displaystyle{d^2 \textrm{ divides } \frac{s^2-t^2}{2}}$.

We have the desired contradiction, as the integer $d^2 \gt 1$ divides two relatively prime values. Thus no such divisor $d$ can exist, and $s$ and $t$ must be relatively prime.
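In contrapositive-friendly form: whenever the two half-expressions are coprime, so are $s$ and $t$. A quick exhaustive check over odd pairs (Python):

```python
from math import gcd

# For odd s > t, if (s^2 + t^2)/2 and (s^2 - t^2)/2 are coprime,
# then s and t must be coprime.
for s in range(3, 100, 2):
    for t in range(1, s, 2):
        half_sum = (s * s + t * t) // 2
        half_diff = (s * s - t * t) // 2
        if gcd(half_sum, half_diff) == 1:
            assert gcd(s, t) == 1
```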

Find two primitive Pythagorean triples $\{a,b,c\}$ where $a^2+b^2=c^2$ and $a=35$.

Note that $35 = 1 \cdot 35 = 5 \cdot 7$. As $a = st$ for relatively prime, odd $s$ and $t$ with $s \gt t$, it must be that $s = 35$ and $t = 1$, or that $s = 7$ and $t = 5$. These lead to $b = (35^2-1^2)/2 = 612$ and $c = (35^2+1^2)/2 = 613$ or $b = (7^2 - 5^2)/2 = 12$ and $c = (7^2 + 5^2)/2 = 37$, respectively. Thus the two primitive triples we seek are given by

$$\{35,612,613\} \quad \textrm{ and } \quad \{35,12,37\}$$

Find all primitive Pythagorean triples where $c$ is less than 150.
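Both answers above, and the full list requested next, can be checked by a brute-force search. A minimal sketch in Python (the function name is my own). Here primality of the triple follows from $\gcd(a,b)=1$, since any common divisor of $a$ and $b$ would also divide $c^2$:

```python
from math import gcd, isqrt

def primitive_triples(c_max):
    """All primitive Pythagorean triples (a, b, c) with a odd, b even, c < c_max."""
    triples = []
    for a in range(3, c_max, 2):        # the odd leg
        for b in range(4, c_max, 2):    # the even leg
            c = isqrt(a * a + b * b)
            if c * c == a * a + b * b and c < c_max and gcd(a, b) == 1:
                triples.append((a, b, c))
    return triples

# The two primitive triples with a = 35:
print([t for t in primitive_triples(700) if t[0] == 35])
# -> [(35, 12, 37), (35, 612, 613)]
```

Calling `primitive_triples(150)` reproduces the full list requested.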

$$\begin{array}{cccc} (3,4,5) &(5,12,13) &(7,24,25) &(9,40,41)\\ (11,60,61) &(13,84,85) &(15,112,113) &(17,144,145)\\ (15,8,17) &(21,20,29) &(33,56,65) &(39,80,89)\\ (51,140,149) &(35,12,37) &(45,28,53) &(55,48,73)\\ (65,72,97) &(63,16,65) &(77,36,85) &(91,60,109)\\ (105,88,137) &(99,20,101) &(117,44,125) &(143,24,145) \end{array}$$

One can analyze primitive Pythagorean triples in an alternate way as well. Again starting with $a^2 + b^2 = c^2$ for integers $a$, $b$, and $c$ with no common positive divisors other than $1$, we have already argued that either $a$ or $b$ must be even. Let us again assume $b$ is the even one, but this time initially solve for $b$ (instead of $a$) to get $$b^2 = c^2 - a^2$$

Factoring the difference of squares on the right, we have $b^2 = (c+a)(c-a)$. This time, however, the two factors on the right, $(c+a)$ and $(c-a)$, are not relatively prime as they must both be even (as $c$ and $a$ are both odd).

Prove the following:

- $\displaystyle{\frac{c+a}{2} = u^2 \textrm{ and } \frac{c-a}{2} = v^2}$ for relatively prime positive integers $u$ and $v$, with $u \gt v$.

- $a = u^2 - v^2, \quad b = 2uv, \quad \textrm{ and } \quad c = u^2 + v^2$

- One of $u$ or $v$ is even.

Argue indirectly. Suppose $\displaystyle{\frac{c+a}{2} \textrm{ and } \frac{c-a}{2}}$ are not relatively prime. Then there exists some positive integer $d \gt 1$ such that $d$ divides them both. But then $d$ divides their sum, $c$, and their difference, $a$, as well. This would mean $a$ and $c$ are not relatively prime -- which contradicts the primitive nature of $\{a,b,c\}$, as any prime dividing two members of a primitive triple would also divide the third (here, such a prime would divide $b^2 = c^2 - a^2$, and hence $b$).

Thus, $\displaystyle{\frac{c+a}{2} \textrm{ and } \frac{c-a}{2}}$ must be relatively prime.

Note their product is a perfect square, given that $$\left(\frac{c+a}{2}\right) \left(\frac{c-a}{2}\right) = \frac{c^2-a^2}{4} = \left(\frac{b}{2}\right)^2$$ and since $b$ is even, $\displaystyle{\frac{b}{2}}$ is an integer.

As we have previously seen, when two relatively prime integers have a perfect square as a product, they must both be perfect squares. Further, if the squares of two values are relatively prime, then the values themselves must be relatively prime. Hence, there exist relatively prime integers $u$ and $v$ such that

$$\displaystyle{\frac{c+a}{2} = u^2 \quad \textrm{ and } \quad \frac{c-a}{2} = v^2}$$

Note we can always choose $u$ and $v$ to both be positive, and it must then be the case that $u^2 \gt v^2$ (as $c+a \gt c-a$), and consequently $u \gt v$.

Additionally, we then have $\displaystyle{u^2v^2 = \left(\frac{b}{2}\right)^2}$ which implies $b^2 = 4u^2v^2$, and finally $b = 2uv$.

Basic algebra reveals $$u^2 + v^2 = \frac{c+a}{2} + \frac{c-a}{2} = \frac{2c}{2} = c$$ and $$u^2 - v^2 = \frac{c+a}{2} - \frac{c-a}{2} = \frac{2a}{2} = a$$

So we have determined that $a = u^2 - v^2$, $b = 2uv$, and $c = u^2 + v^2$.

Lastly, we notice that if $u$ and $v$ were both odd, then $a= u^2-v^2$, $b=2uv$, and $c=u^2+v^2$ would all be even, and thus not form a primitive Pythagorean triple. (They can't both be even either, as they are relatively prime.) So it must be that exactly one of $u$ or $v$ is even.
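The $(u,v)$ parameterization just derived is easy to exercise numerically; a minimal sketch (the function name is my own):

```python
from math import gcd

def triple_from_uv(u, v):
    """Primitive triple (a, b, c) from coprime u > v > 0 of opposite parity."""
    return u * u - v * v, 2 * u * v, u * u + v * v

# Every valid (u, v) pair yields a primitive Pythagorean triple.
for u in range(2, 20):
    for v in range(1, u):
        if gcd(u, v) == 1 and (u - v) % 2 == 1:   # coprime, opposite parity
            a, b, c = triple_from_uv(u, v)
            assert a * a + b * b == c * c          # a Pythagorean triple
            assert gcd(a, b) == 1                  # and primitive

print(triple_from_uv(2, 1))  # (3, 4, 5)
```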

🔎 Prove that there exist no positive integers $a$, $b$, and $c$ where $a^4 + b^4 = c^4$.

If there were, there would be non-trivial solutions to $x^4 + y^4 = z^2$. Can you prove there are no such solutions to this latter equation?

🔎 ✭ Prove that there exist no positive integers $a$, $b$, and $c$ where $a^3 + b^3 = c^3$.

As a first step, argue that exactly one of $a$, $b$, and $c$ is even (we may assume they share no common factor). If $c$ is even, then $a$ and $b$ are odd and we can write $a+b=2p$ and $a-b=2q$. Thus, $a=p+q$ and $b=p-q$.