\( \newcommand{\N}{\mathbb{N}} \newcommand{\R}{\mathbb{R}} \newcommand{\C}{\mathbb{C}} \newcommand{\Q}{\mathbb{Q}} \newcommand{\Z}{\mathbb{Z}} \newcommand{\P}{\mathcal P} \newcommand{\B}{\mathcal B} \newcommand{\F}{\mathbb{F}} \newcommand{\E}{\mathcal E} \newcommand{\brac}[1]{\left(#1\right)} \newcommand{\abs}[1]{\left|#1\right|} \newcommand{\matrixx}[1]{\begin{bmatrix}#1\end {bmatrix}} \newcommand{\vmatrixx}[1]{\begin{vmatrix} #1\end{vmatrix}} \newcommand{\lims}{\mathop{\overline{\lim}}} \newcommand{\limi}{\mathop{\underline{\lim}}} \newcommand{\limn}{\lim_{n\to\infty}} \newcommand{\limsn}{\lims_{n\to\infty}} \newcommand{\limin}{\limi_{n\to\infty}} \newcommand{\nul}{\mathop{\mathrm{Nul}}} \newcommand{\col}{\mathop{\mathrm{Col}}} \newcommand{\rank}{\mathop{\mathrm{Rank}}} \newcommand{\dis}{\displaystyle} \newcommand{\spann}{\mathop{\mathrm{span}}} \newcommand{\range}{\mathop{\mathrm{range}}} \newcommand{\inner}[1]{\langle #1 \rangle} \newcommand{\innerr}[1]{\left\langle #1 \right \rangle} \newcommand{\ol}[1]{\overline{#1}} \newcommand{\toto}{\rightrightarrows} \newcommand{\upto}{\nearrow} \newcommand{\downto}{\searrow} \newcommand{\qed}{\quad \blacksquare} \newcommand{\tr}{\mathop{\mathrm{tr}}} \newcommand{\bm}{\boldsymbol} \newcommand{\cupp}{\bigcup} \newcommand{\capp}{\bigcap} \newcommand{\sqcupp}{\bigsqcup} \newcommand{\re}{\mathop{\mathrm{Re}}} \newcommand{\im}{\mathop{\mathrm{Im}}} \newcommand{\comma}{\text{,}} \newcommand{\foot}{\text{。}} \)

Sunday, December 7, 2014

Exam Problem from Probability Class

For the past two days the postgraduate students in our office have been struggling with their take-home final exam, due two days after it was released. They got stuck on the following problem (and so I was, pleasantly, asked for help):
Problem. Let $(X,\mu)$ be a probability measure space and let $f_n \stackrel{\mu}{\to} 0$ with $\text{Var}(f_n)=1$ for all $n$. Prove that $E(f_n)\to 0$.

Here $ \text{Var}(f_n) = \int_X f_n^2\,d\mu -(\int_X f_n\,d\mu)^2$ and $E(f_n)=\int_X f_n\,d\mu$.

In the sequel we denote $\int f=\int_Xf\,d\mu$ and $\int_A f=\int_A f\,d\mu$. Then $E(f_n)$ is also written as $\int f_n$.

In the proof below we need an elementary inequality from probability, namely Chebyshev's inequality: \[
\mu\{x\in X:|f_n(x)-E(f_n)|\ge a\}\leq \frac{\text{Var}(f_n)}{a^2}.
\]
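For completeness, this is derived by applying Markov's inequality to the nonnegative function $(f_n-E(f_n))^2$, using $\text{Var}(f_n)=\int_X (f_n-E(f_n))^2\,d\mu$: \[
\mu\{|f_n-E(f_n)|\ge a\} = \mu\{(f_n-E(f_n))^2\ge a^2\} \leq \frac{1}{a^2}\int_X (f_n-E(f_n))^2\,d\mu = \frac{\text{Var}(f_n)}{a^2}.
\]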
Proof.
Suppose on the contrary that $E(f_n)\not\to 0$. Then there are an $\epsilon_0>0$ and a subsequence $(f_{n_k})$ such that $|\int f_{n_k}|\ge \epsilon_0$ for every $k$. It follows that \begin{align*}
\epsilon_0&\leq \left|\int_{\{|f_{n_k}|\ge 1\}} f_{n_k}\right|+\left|\int_{\{|f_{n_k}|< 1\}} f_{n_k}\right|\\
&\leq\sqrt{\int_{\{|f_{n_k}|\ge 1\}}  f_{n_k}^2} \cdot \sqrt{\mu\{|f_{n_k}|\ge 1\} } + \int|f_{n_k}|\chi_{\{|f_{n_k}|<1\}}\\
&\leq \sqrt{\int f_{n_k}^2}\cdot  \sqrt{\mu\{|f_{n_k}|\ge 1\} } + \int|f_{n_k}|\chi_{\{|f_{n_k}|<1\}}.
\end{align*} where the first term was estimated by the Cauchy-Schwarz inequality. Since $f_{n_k}\to 0$ in measure, there is a subsequence $(f_{n_{k_p}})$ that converges to $0$ pointwise $\mu$-a.e. As $|f_{n_{k_p}}|\chi_{\{|f_{n_{k_p}}|<1\}}\le 1$ and $\mu(X)=1$, the Lebesgue Dominated Convergence Theorem gives $\int|f_{n_{k_p}}|\chi_{\{|f_{n_{k_p}}|<1\}}\to 0$. So the above becomes \[
\epsilon_0\leq \lims_{p\to \infty} \sqrt{\int f_{n_{k_p}}^2}\cdot  \sqrt{\mu\{|f_{n_{k_p}}|\ge 1\} } .
\] We will finish the proof by showing that $\{\int f_n\}$ is in fact bounded in $n$, and hence so is $\{\int f_n^2\}$, since $1=\text{Var}(f_n)=\int f_n^2 - (\int f_n)^2$; as $\mu\{|f_{n_{k_p}}|\ge 1\}\to 0$ by convergence in measure, the above inequality then yields the contradiction $\epsilon_0\leq 0$.

Now we turn to the boundedness of  $\{\int f_n\}$, the only tricky part of this problem. For this, observe that for any measurable $f,g$ on $X$, \[
\mu\{|f|\ge 2\epsilon\}\leq \mu \{|f-g|+|g|\ge 2\epsilon\}\leq \mu\{|f-g|\ge \epsilon\}+\mu\{|g|\ge \epsilon\}.
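The middle step here is just the triangle inequality $|f|\le |f-g|+|g|$ together with the set inclusion (a one-line check): \[
\{|f-g|+|g|\ge 2\epsilon\}\subseteq \{|f-g|\ge \epsilon\}\cup\{|g|\ge \epsilon\},
\] for if $|f-g|<\epsilon$ and $|g|<\epsilon$ both held, then $|f-g|+|g|<2\epsilon$; subadditivity of $\mu$ then gives the sum of measures on the right.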
\] In particular, let's take $f$ to be the constant function $E(f_n)$, and $g=f_n$, then \[
\mu\{x\in X: |E(f_n)|\ge 2\epsilon\} \leq \mu \{|E(f_n)-f_n|\ge \epsilon\} + \mu\{|f_n|\ge \epsilon\} \leq \frac{1}{\epsilon^2} + \mu\{|f_n|\ge \epsilon\},
\] the last inequality following from Chebyshev's inequality $\mu \{|f_n-E(f_n)|\ge \epsilon\} \leq \frac{\text{Var}(f_n)}{\epsilon^2}=\frac{1}{\epsilon^2}$. Taking $\lims$ of both sides and using $f_n\stackrel{\mu}{\to}0$, we get \begin{equation}\label{gen it}
\lims \mu\{x\in X: |E(f_n)|\ge 2\epsilon\} \leq \frac{1}{\epsilon^2}
\end{equation} by convergence in measure. Take $\epsilon=2$: then there is an $N$ such that for every $n>N$, \[\mu\{x\in X: |E(f_n)|\ge 4\}  <1.\] Since $E(f_n)$ is a constant, the set $\{x\in X: |E(f_n)|\ge 4\}$ is either $\emptyset$ or all of $X$, which has measure $1$; as its measure is less than $1$, it must be $\emptyset$, and thus $|E(f_n)|< 4$ for $n>N$.$\qed$
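For completeness, here is how the boundedness closes the argument, assembling the steps already obtained: for $n>N$, \[
\int f_n^2 = \text{Var}(f_n)+\brac{\int f_n}^2 = 1+E(f_n)^2 < 1+16=17,
\] while $\mu\{|f_{n_{k_p}}|\ge 1\}\to 0$ since $f_{n_{k_p}}\stackrel{\mu}{\to} 0$, so \[
\epsilon_0\leq \lims_{p\to \infty} \sqrt{\int f_{n_{k_p}}^2}\cdot \sqrt{\mu\{|f_{n_{k_p}}|\ge 1\}} \leq \sqrt{17}\cdot 0=0,
\] contradicting $\epsilon_0>0$.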

Remark. Examining the proof, the condition $\text{Var}(f_n)=1$ can be relaxed to the boundedness of the sequence $\{\text{Var}(f_n)\}_{n=1}^\infty$, since (\ref{gen it}) generalizes to
\[\boxed{\dis
\limsn \mu\{x\in X: |E(f_n)|\ge 2\epsilon\} \leq \limsn\frac{\text{Var}(f_n)}{\epsilon^2}.}
\]
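A standard example (my own addition, not part of the exam) shows that some control on the variances cannot be dropped entirely: on $X=[0,1]$ with Lebesgue measure, take $f_n=n\chi_{[0,1/n]}$. Then \[
\mu\{|f_n|\ge \epsilon\}=\frac{1}{n}\to 0 \quad (0<\epsilon\le n), \qquad E(f_n)=\int_0^{1/n} n\,dx=1, \qquad \text{Var}(f_n)=\int_0^{1/n} n^2\,dx-1=n-1\to\infty,
\] so $f_n\stackrel{\mu}{\to} 0$ but $E(f_n)\equiv 1\not\to 0$ once the variances are unbounded.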
