Proof: A Continuous Function Preserves Convergence in Probability
Introduction
This tutorial provides a step-by-step guide to understanding the proof that a continuous function preserves convergence in probability. The main result states that if a sequence of random variables \((X_n)\) converges to a random variable \(X\) in probability, then applying a continuous function \(f\) yields that \(f(X_n)\) converges to \(f(X)\) in probability. This concept is essential in probability theory and has practical implications in statistics and data analysis.
Step 1: Understand Convergence in Probability
To grasp the theorem, first understand what convergence in probability means.
- A sequence \((X_n)\) converges to \(X\) in probability if, for every \(\epsilon > 0\):
- \[ \lim_{n \to \infty} P(|X_n - X| \geq \epsilon) = 0 \]
- This definition indicates that as \(n\) becomes large, the probability that \(X_n\) is far from \(X\) (by at least \(\epsilon\)) approaches zero.
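To make the definition concrete, here is a minimal Monte Carlo sketch. The particular sequence \(X_n = X + Z_n\) with \(Z_n \sim N(0, 1/n^2)\) is an illustrative assumption chosen for the demo, not part of the theorem:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1          # the epsilon in the definition
trials = 100_000   # Monte Carlo sample size per n

X = rng.standard_normal(trials)  # samples of the limit variable X
for n in [1, 10, 100, 1000]:
    # Illustrative choice: X_n = X + Z_n with Z_n ~ N(0, 1/n^2),
    # so |X_n - X| = |Z_n| concentrates near 0 as n grows.
    X_n = X + rng.normal(0.0, 1.0 / n, trials)
    prob = np.mean(np.abs(X_n - X) >= eps)  # estimates P(|X_n - X| >= eps)
    print(f"n={n:5d}  P(|X_n - X| >= {eps}) ~ {prob:.4f}")
```

Running this, the estimated probability drops toward zero as \(n\) grows, exactly as the definition requires.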
Step 2: Familiarize Yourself with Continuous Functions
Next, review the properties of continuous functions, which are essential for the proof.
- A function \(f\) is continuous at a point \(c\) if:
- For every \(\epsilon > 0\), there exists a \(\delta > 0\) such that if \(|x - c| < \delta\), then \(|f(x) - f(c)| < \epsilon\).
- Key property: continuous functions preserve limits of sequences, i.e. if \(x_n \to c\), then \(f(x_n) \to f(c)\).
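As an entirely illustrative instance of the \(\epsilon\)-\(\delta\) definition, the following sketch numerically checks a hand-derived \(\delta\) witness for \(f(x) = x^2\) at \(c = 1\); the function and witness are assumptions chosen for the example:

```python
import numpy as np

def f(x):
    # Illustrative function; the witness below is specific to f(x) = x**2 at c = 1.
    return x ** 2

c = 1.0
for eps in [1.0, 0.1, 0.01]:
    # If |x - 1| < 1 then |x + 1| < 3, so |x^2 - 1| = |x - 1||x + 1| < 3|x - 1|.
    # Hence delta = min(1, eps/3) is a valid witness for this eps.
    delta = min(1.0, eps / 3.0)
    xs = c + np.linspace(-delta, delta, 10_001)[1:-1]  # grid with |x - c| < delta
    assert np.all(np.abs(f(xs) - f(c)) < eps)
    print(f"eps={eps:<5}  delta={delta:.4f}  delta-check passed")
```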
Step 3: Set Up the Proof
Begin the proof by stating the assumptions clearly.
- Assume \(X_n \to X\) in probability.
- Define \(Y_n = f(X_n)\) and \(Y = f(X)\).
- The goal is to show that \(Y_n\) converges to \(Y\) in probability.
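For reference, the claim set up above can be written out formally; this is a sketch in LaTeX, assuming amsmath and an amsthm-style theorem environment:

```latex
\begin{theorem}
Let $(X_n)_{n \ge 1}$ and $X$ be random variables on a common probability
space, and let $f : \mathbb{R} \to \mathbb{R}$ be continuous. If
$X_n \xrightarrow{P} X$, then $f(X_n) \xrightarrow{P} f(X)$; that is, for
every $\epsilon > 0$,
\[
  \lim_{n \to \infty} P\bigl( |f(X_n) - f(X)| \ge \epsilon \bigr) = 0.
\]
\end{theorem}
```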
Step 4: Apply the Definition of Convergence
Use the definition of convergence in probability for \(Y_n\).
- For any \(\epsilon > 0\), we need to show:
- \[ \lim_{n \to \infty} P(|Y_n - Y| \geq \epsilon) = 0 \]
Step 5: Use the Continuity of \(f\)
Use the continuity of \(f\) to relate \(Y_n\) and \(Y\).
- By continuity of \(f\), for every \(\epsilon > 0\) there exists a \(\delta > 0\) such that:
- If \(|X_n - X| < \delta\), then \(|f(X_n) - f(X)| < \epsilon\).
- Caveat: for a merely continuous \(f\), this \(\delta\) depends on the point at which continuity is invoked, i.e. on the realized value of \(X\); only if \(f\) is uniformly continuous does a single \(\delta\) work everywhere. Step 7 explains how to handle the general case.
Step 6: Relate the Events
Establish a connection between the events concerning \(X_n\) and \(Y_n\).
- With \(\delta\) as in Step 5, note that:
- \[ P(|Y_n - Y| \geq \epsilon) \leq P(|X_n - X| \geq \delta) + P(|Y_n - Y| \geq \epsilon, |X_n - X| < \delta) \]
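This bound comes from splitting the event \(\{|Y_n - Y| \geq \epsilon\}\) according to whether \(|X_n - X| < \delta\); written out in LaTeX:

```latex
\begin{align*}
\{\, |Y_n - Y| \ge \epsilon \,\}
  &= \{\, |Y_n - Y| \ge \epsilon,\ |X_n - X| \ge \delta \,\}
     \;\cup\; \{\, |Y_n - Y| \ge \epsilon,\ |X_n - X| < \delta \,\} \\
  &\subseteq \{\, |X_n - X| \ge \delta \,\}
     \;\cup\; \{\, |Y_n - Y| \ge \epsilon,\ |X_n - X| < \delta \,\}.
\end{align*}
```

Applying \(P\) and using subadditivity, \(P(A \cup B) \leq P(A) + P(B)\), gives the inequality above.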
Step 7: Conclude the Proof
Complete the proof by showing that both terms on the right-hand side vanish.
- As \(n \to \infty\):
- The first term, \(P(|X_n - X| \geq \delta)\), converges to zero by the assumption that \(X_n \to X\) in probability.
- The second term also converges to zero. If \(f\) is uniformly continuous, the choice of \(\delta\) in Step 5 makes the event \(\{|Y_n - Y| \geq \epsilon, |X_n - X| < \delta\}\) empty, so its probability is exactly zero. For a general continuous \(f\), first choose \(M\) so large that \(P(|X| > M)\) is arbitrarily small; on the compact interval \([-M-1, M+1]\), \(f\) is uniformly continuous, and the same argument applies up to that small exceptional probability.
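To see the whole argument numerically, here is a minimal Monte Carlo sketch reusing the illustrative \(X_n = X + Z_n\) from Step 1; the specific continuous function \(f\) is likewise an assumption chosen for the demo:

```python
import numpy as np

rng = np.random.default_rng(1)
eps = 0.1
trials = 100_000

def f(x):
    # Any continuous function works; this bounded choice is illustrative.
    return np.exp(-x ** 2)

X = rng.standard_normal(trials)
for n in [1, 10, 100, 1000]:
    X_n = X + rng.normal(0.0, 1.0 / n, trials)    # illustrative X_n -> X in probability
    prob = np.mean(np.abs(f(X_n) - f(X)) >= eps)  # estimates P(|Y_n - Y| >= eps)
    print(f"n={n:5d}  P(|f(X_n) - f(X)| >= {eps}) ~ {prob:.4f}")
```

The estimated probability for \(f(X_n)\) shrinks toward zero just as it did for \(X_n\) itself, which is the content of the theorem.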
Step 8: Discuss Bounded Functions
For bounded continuous functions, the expected values also converge.
- If \(f\) is bounded and continuous, then:
- \[ E[f(X_n)] \to E[f(X)] \]
- This follows from the subsequence criterion together with the bounded (dominated) convergence theorem: convergence in probability implies that every subsequence of \((X_n)\) has a further subsequence converging to \(X\) almost surely, along which \(E[f(X_{n_k})] \to E[f(X)]\). Since every subsequence of \(E[f(X_n)]\) thus has a further subsequence converging to \(E[f(X)]\), the full sequence converges.
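A quick Monte Carlo check of this convergence of expectations, with the bounded continuous choice \(f = \arctan\) (an illustrative assumption, as is the sequence \(X_n\) carried over from the earlier sketches):

```python
import numpy as np

rng = np.random.default_rng(2)
trials = 200_000

def f(x):
    # Bounded continuous function: |arctan(x)| <= pi/2 for all x.
    return np.arctan(x)

X = rng.standard_normal(trials)
EfX = np.mean(f(X))  # Monte Carlo estimate of E[f(X)]
for n in [1, 10, 100, 1000]:
    X_n = X + rng.normal(0.0, 1.0 / n, trials)  # illustrative X_n -> X in probability
    print(f"n={n:5d}  E[f(X_n)] ~ {np.mean(f(X_n)):+.4f}   E[f(X)] ~ {EfX:+.4f}")
```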
Conclusion
In summary, we have shown that if a sequence of random variables converges in probability, then its image under a continuous function also converges in probability. Understanding this result is crucial for deeper insights into probability theory and its applications. Next steps may involve proving analogous results for other modes of convergence (almost sure convergence, convergence in distribution) or exploring applications in statistics.