Let (xk) ⊂ Rn and x ∈ Rn. Show that xk → x in Rn if and only if for every y ∈ Rn the sequence (⟨xk; y⟩) ⊂ R converges to ⟨x; y⟩, that is, ⟨xk; y⟩ → ⟨x; y⟩ in R.

To show that (xk) ⊂ Rn converges to x ∈ Rn if and only if for every y ∈ Rn, the sequence (⟨xk; y⟩) ⊂ R converges to ⟨x; y⟩, we need to prove two implications:

1. If (xk) converges to x, then (⟨xk; y⟩) converges to ⟨x; y⟩.
2. If (⟨xk; y⟩) converges to ⟨x; y⟩ for every y, then (xk) converges to x.

Proof of implication 1:
Let (xk) converge to x, and let y ∈ Rn be fixed. We want to show that (⟨xk; y⟩) converges to ⟨x; y⟩.

First, note that the inner product of two vectors is the sum of the products of their components:

⟨xk; y⟩ = ∑(xi_k * yi), where i ranges from 1 to n, and xi_k denotes the i-th component of xk.

Now, let's consider the difference between ⟨xk; y⟩ and ⟨x; y⟩:

|⟨xk; y⟩ - ⟨x; y⟩| = |∑(xi_k * yi) - ∑(xi * yi)|.

Using the triangle inequality and the fact that |ab| = |a| * |b|, we can bound the difference:

|⟨xk; y⟩ - ⟨x; y⟩| ≤ ∑|xi_k * yi - xi * yi| = ∑(|xi_k - xi| * |yi|).
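This componentwise bound can be checked numerically. The following sketch uses arbitrary illustrative vectors in R³ (not data from the problem statement):

```python
# Check |<xk; y> - <x; y>| <= sum_i |xi_k - xi| * |yi|
# on arbitrary example vectors in R^3.
xk = [1.1, -0.4, 2.03]   # one element of the sequence
x  = [1.0, -0.5, 2.00]   # the limit
y  = [3.0,  0.0, -1.0]   # a fixed test vector

dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

lhs = abs(dot(xk, y) - dot(x, y))                              # left side
rhs = sum(abs(a - b) * abs(c) for a, b, c in zip(xk, x, y))    # right side
assert lhs <= rhs
```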

Let M = max(|y1|, ..., |yn|). If M = 0, then y = 0 and ⟨xk; y⟩ = ⟨x; y⟩ = 0 for all k, so there is nothing to prove. Assume M > 0. Since (xk) converges to x, and |xi_k - xi| ≤ ||xk - x|| for each i, for any ε > 0 there exists N such that for all k ≥ N and every i, |xi_k - xi| < ε / (n * M).

Then for all k ≥ N:

|⟨xk; y⟩ - ⟨x; y⟩| ≤ ∑(|xi_k - xi| * |yi|) < n * (ε / (n * M)) * M = ε.

(Alternatively, the Cauchy–Schwarz inequality gives the same conclusion in one step: |⟨xk; y⟩ - ⟨x; y⟩| = |⟨xk - x; y⟩| ≤ ||xk - x|| * ||y|| → 0.)

Therefore, we have shown that (⟨xk; y⟩) converges to ⟨x; y⟩.
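As an illustration of implication 1 (a sketch with made-up vectors, using the concrete sequence xk = x + (1/k)v so that xk → x), the inner products ⟨xk; y⟩ visibly approach ⟨x; y⟩, with error shrinking like |⟨v; y⟩| / k:

```python
# Sketch: for xk = x + (1/k) * v we have xk -> x, and the
# inner products <xk; y> approach <x; y>. All vectors illustrative.
x = [1.0, -2.0, 0.5]
v = [1.0,  1.0, 1.0]
y = [2.0, -1.0, 4.0]

dot = lambda u, w: sum(a * b for a, b in zip(u, w))

target = dot(x, y)
errors = []
for k in (1, 10, 100, 1000):
    xk = [xi + vi / k for xi, vi in zip(x, v)]
    errors.append(abs(dot(xk, y) - target))

# The error is exactly |<v; y>| / k here, so it is strictly decreasing.
assert all(e1 > e2 for e1, e2 in zip(errors, errors[1:]))
assert errors[-1] < 1e-2
```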

Proof of implication 2:
Suppose that (⟨xk; y⟩) converges to ⟨x; y⟩ for every y ∈ Rn. We must show that xk → x, that is, ||xk - x|| → 0.

The key observation is that the hypothesis lets us choose y freely, and choosing the standard basis vectors recovers componentwise convergence.

For each i ∈ {1, ..., n}, let ei ∈ Rn be the i-th standard basis vector (1 in position i, 0 elsewhere). Then

⟨xk; ei⟩ = xi_k and ⟨x; ei⟩ = xi,

so the hypothesis applied to y = ei says exactly that xi_k → xi in R for each i.

Now fix ε > 0. For each i there exists Ni such that |xi_k - xi| < ε / √n for all k ≥ Ni. Let N = max(N1, ..., Nn). Then for all k ≥ N:

||xk - x|| = (∑(xi_k - xi)²)^(1/2) < (n * (ε² / n))^(1/2) = ε.

Therefore xk → x.

Thus, we have shown that if (⟨xk; y⟩) converges to ⟨x; y⟩ for every y, then (xk) converges to x.
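The basis-vector idea behind implication 2 can also be illustrated numerically (a sketch with example data only): testing against each ei extracts the i-th component, so convergence of all the inner products forces every component, and hence the Euclidean norm of the difference, to converge.

```python
# Sketch of the reverse direction: testing against the standard basis
# vectors e_i recovers componentwise convergence. Example data only.
n = 3
x = [0.5, -1.0, 2.0]

def e(i, n):
    """i-th standard basis vector of R^n."""
    return [1.0 if j == i else 0.0 for j in range(n)]

dot = lambda u, w: sum(a * b for a, b in zip(u, w))

# xk = x + (1/k, -1/k, 1/k): <xk; e_i> -> <x; e_i> = x_i for each i.
k = 10_000
xk = [x[0] + 1 / k, x[1] - 1 / k, x[2] + 1 / k]
for i in range(n):
    assert abs(dot(xk, e(i, n)) - x[i]) < 1e-3

# With every component close, the Euclidean norm is small too.
norm_err = sum((a - b) ** 2 for a, b in zip(xk, x)) ** 0.5
assert norm_err < 1e-3
```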

By proving both implications, we can conclude that (xk) ⊂ Rn converges to x ∈ Rn if and only if for every y ∈ Rn, the sequence (⟨xk; y⟩) ⊂ R converges to ⟨x; y⟩.