Monday, October 20, 2008

Abel's Proof: Field Extensions: Step 1: Lemmas

The content in today's blog is taken from the essay by Michael I. Rosen entitled "Niels Hendrik Abel and Equations of the Fifth Degree."

In today's blog, I present some initial lemmas that I will use to establish Step 1 of the proof using field extensions. For the proof using the ideas that Niels Abel originally presented, see here.

Lemma 1:

Let q be a prime and let F be a field containing a primitive q-th root of unity.

If a ∈ F is not a q-th power, then x^q - a is irreducible over F. (see Definition 4, here).

Proof

This follows from Lemma 2, here.

QED
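
As a quick illustration of Lemma 1 (the example is mine, not from Rosen's essay), take q = 3 and F = Q(ζ_3), which is the same field as Q(√-3). Since 2 is not a cube in F, x^3 - 2 should remain irreducible over F, while x^3 - 8 should split completely because 8 = 2^3. A short SymPy sketch checking both:

# Sketch illustrating Lemma 1 over F = Q(zeta_3) = Q(sqrt(-3)).
# x^3 - 2 stays irreducible (2 is not a cube in F); x^3 - 8 splits (8 = 2^3).
from sympy import symbols, sqrt, factor

x = symbols('x')

print(factor(x**3 - 2, extension=sqrt(-3)))   # stays x**3 - 2
print(factor(x**3 - 8, extension=sqrt(-3)))   # three linear factors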

For Lemma 2 below, I will use the notation F(u) which is defined in Definition 3, here.

Lemma 2:

Let:

α be a root of x^q - a = 0

Then:

every γ ∈ F(α) can be written in the form:

γ = a_0 + a_1α + ... + a_{q-1}α^{q-1}

where each a_i is in F.

Proof:

This follows directly from the definition of F(u) [see Definition 3, here.]

QED
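
Here is a small SymPy illustration of Lemma 2 (my own sketch): take F = Q, q = 3, a = 2, so α is a root of x^3 - 2. Even the inverse 1/(1 + α) can be rewritten as a polynomial in α of degree at most 2, by inverting 1 + x modulo x^3 - 2.

# Sketch illustrating Lemma 2: every element of F(alpha), including inverses,
# is a polynomial in alpha of degree < q.  Here F = Q, q = 3, a = 2.
from sympy import symbols, invert, simplify, cbrt

x = symbols('x')

inv = invert(1 + x, x**3 - 2)   # inverse of (1 + x) modulo x^3 - 2
print(inv)                      # a polynomial of degree <= 2 in x

alpha = cbrt(2)
print(simplify(inv.subs(x, alpha) * (1 + alpha)))   # 1, so inv really is 1/(1 + alpha)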

Lemma 3:

Let:

q be prime

and

m,k be integers such that 1 ≤ m,k ≤ q-1

Then:

there exist integers r, s such that:

rq + sk = m

and

0 ≤ s ≤ q-1

Proof:

(1) Since q is prime and 1 ≤ k ≤ q-1, it follows that gcd(k,q)=1. [That is, k and q are relatively prime]

(2) It follows that 0*k, 1*k, ..., (q-1)*k form a complete residue system modulo q. [See Lemma 3, here]

(3) This means that for any integer m, there exists an integer s such that sk ≡ m (mod q) and 0 ≤ s ≤ q-1. [See Definition 1, here for review of a complete residue system]

(4) So that m - sk ≡ 0 (mod q). [See here for review of modular arithmetic if needed]

(5) Since q divides m - sk, it follows that there exists an integer r such that rq = m - sk

(6) Then, it follows that rq + sk = m.

QED
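
To make Lemma 3 concrete, here is a small Python check (my own sketch): since gcd(k, q) = 1, take s to be the residue of m·k^{-1} modulo q, and then r = (m - sk)/q is an exact integer division.

# Small check of Lemma 3: for prime q and 1 <= m, k <= q-1, there exist
# integers r, s with r*q + s*k = m and 0 <= s <= q-1.
def decompose(m, k, q):
    s = (m * pow(k, -1, q)) % q   # pow(k, -1, q) is the inverse of k mod q (Python 3.8+)
    r = (m - s * k) // q          # exact, since q divides m - s*k
    assert r * q + s * k == m and 0 <= s <= q - 1
    return r, s

print(decompose(5, 3, 7))   # (-1, 4), since (-1)*7 + 4*3 = 5

q = 7
for k in range(1, q):
    for m in range(1, q):
        decompose(m, k, q)  # the internal assert verifies every case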

For Corollary 3.1 below, you will need to know that F[x] refers to the set of polynomials with coefficients in the field F. (See here for review of polynomials and the meaning of F[x]). I will also use the notation F(u) which is defined in Definition 3, here.

Corollary 3.1:

Assume that x^q - a ∈ F[x] is irreducible and that α is a root (see here for review of roots of a polynomial).

Let γ be a nonzero element of F(α) with γ not in F.

Then:

There is an element β ∈ F(α) such that β^q ∈ F

and there exist b_0, b_2, ..., b_{q-1} ∈ F such that:

γ = b_0 + β + b_2β^2 + ... + b_{q-1}β^{q-1}

Proof:

(1) From Lemma 2 above, we know that there exist a_0, a_1, ..., a_{q-1} ∈ F where:

γ = a_0 + a_1α + ... + a_{q-1}α^{q-1}

(2) Since γ is not in F, we know that there exists a smallest integer k where 1 ≤ k ≤ q-1 and a_k ≠ 0.

(3) Let β = a_kα^k

(4) We know that β^q ∈ F since:

(a) a_k ∈ F [from Lemma 2 above]

(b) Since α is a root, we know that α^q - a = 0 which implies that α^q = a

(c) Since a ∈ F, it follows that α^q ∈ F.

(d) Hence β^q = (a_kα^k)^q = a_k^q(α^q)^k = a_k^q·a^k ∈ F.

(5) Now, for each m with k+1 ≤ m ≤ q-1, using Lemma 3 above, we know that:

there exist integers r, s such that:

rq + sk = m

where 0 ≤ s ≤ q-1

(6) Then there exists c_s such that:

α^m = (α^q)^r·(α^k)^s = a^r·(β/a_k)^s = c_s·β^s

where c_s = a^r/a_k^s ∈ F.

(7) Thus, applying step (6) to each term a_mα^m with m > k and collecting the resulting powers of β, we have:

γ = b_0 + β + b_2β^2 + ... + b_{q-1}β^{q-1}

where b_0 = a_0, each b_i ∈ F, and some of the b_i may be zero.

QED
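
To see the machinery of Corollary 3.1 in action, here is a numerical check with values I picked myself (q = 5, a = 2, α = 2^{1/5}, γ = 3 + 4α^2 + α^3): the smallest nonzero index is k = 2, so β = 4α^2; for the remaining term m = 3, Lemma 3 gives r = -1 and s = 4, and step (6) yields α^3 = β^4/512, so γ = 3 + β + β^4/512.

# Numerical check of Corollary 3.1 with q = 5, a = 2, alpha = 2**(1/5) (my own example).
from sympy import Rational, simplify

alpha = Rational(2) ** Rational(1, 5)

gamma = 3 + 4 * alpha**2 + alpha**3   # a_0 = 3, a_2 = 4, a_3 = 1, so k = 2
beta = 4 * alpha**2                   # beta = a_k * alpha^k

print(simplify(beta**5))                              # 4096, so beta^5 lands back in Q (step 4)
print(simplify(gamma - (3 + beta + beta**4 / 512)))   # 0, confirming gamma = 3 + beta + beta^4/512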

Lemma 4:

Let q be prime and ζ be a primitive q-th root of unity.

Then:

For each integer i:

1 + ζ^i + ζ^{2i} + ... + ζ^{(q-1)i} = q if q divides i, and 0 otherwise.

Proof:

(1) Assume that q divides i.

(2) Then there exists an integer x such that qx = i.

(3) So then:

1 + ζ^i + ζ^{2i} + ... + ζ^{(q-1)i} = 1 + (ζ^q)^x + ... + (ζ^q)^{x(q-1)} = 1 + 1^x + ... + 1^{x(q-1)} = q

(4) Assume q does not divide i.

(5) Then gcd(i,q)=1 and 0*i, 1*i, ..., (q-1)*i form a complete residue system modulo q. [See Lemma 3, here]

(6) Now, since ζ is a primitive q-th root of unity, we know that:

1 + ζ + ζ^2 + ... + ζ^{q-1} = 0 [See Lemma 1, here]

(7) Now, since 0*i, 1*i, ... , (q-1)*i is a complete residue system modulo q, we know that each value is uniquely congruent to a value in the complete residue system {0, 1, 2, ..., q-1}

(8) Further, we know that if i ≡ j (mod q), then ζ^i = ζ^j since ζ is a primitive q-th root of unity [See Lemma 2, here]

(9) But then, since the exponents 0, i, 2i, ..., (q-1)i are congruent, in some order, to 0, 1, 2, ..., q-1, the sum in question is just a rearrangement of the sum in step (6):

1 + ζ^i + ζ^{2i} + ... + ζ^{(q-1)i} = 1 + ζ + ζ^2 + ... + ζ^{q-1} = 0

QED
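
A quick numerical check of Lemma 4 (my own sketch, with q = 5): the sum comes out to q whenever q divides i, and to 0 otherwise.

# Numerical check of Lemma 4 with q = 5: 1 + zeta^i + ... + zeta^((q-1)i) is q when q | i, else 0.
import cmath

q = 5
zeta = cmath.exp(2j * cmath.pi / q)   # a primitive q-th root of unity

for i in range(0, 2 * q + 1):
    total = sum(zeta ** (j * i) for j in range(q))
    print(i, round(total.real, 10), round(total.imag, 10))
# rows with q | i show 5.0; all the others show 0.0 (up to rounding)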

Lemma 5:

Let y be a root of an irreducible polynomial f(x) over K.

Let L be a field extension of K such that y ∈ L

Then:

L is a splitting field for f(x)

Proof:

(1) If one root y is an element of L, then all the roots are elements of L. [See Theorem 3, here]

(2) From the Fundamental Theorem of Algebra (see Theorem, here), we know that if y_1, ..., y_n are the roots, then:

f(x) = (x - y_1)*...*(x - y_n)

(3) It therefore follows that L is a splitting field for f(x). [See Definition 3, here]

QED
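
As a simple concrete instance of the splitting-field conclusion (my own example, not from the essay): adjoining a single root of x^2 - 2 to Q already captures both roots, so the polynomial splits into linear factors over Q(√2). SymPy can confirm this:

# Over Q, x^2 - 2 is irreducible; over Q(sqrt(2)) it splits into linear factors.
from sympy import symbols, sqrt, factor

x = symbols('x')
print(factor(x**2 - 2))                      # x**2 - 2
print(factor(x**2 - 2, extension=sqrt(2)))   # (x - sqrt(2))*(x + sqrt(2))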

References

Michael I. Rosen, "Niels Hendrik Abel and Equations of the Fifth Degree", The American Mathematical Monthly, Vol. 102 (1995).
