Let "(f_n)" be a sequence of functions, where "f_n:[0,1]\\rightarrow\\mathbb{R}" is defined by "f_n(x)=x^n" .
Clearly, if "x=1" then "f_n(1)=1^n=1" for every "n", so the sequence "(f_n(1))" converges to 1 .
If "0\\leq x <1" then the sequence "(f_n(x))" converges to 0 as we known that "lim_{n\\to \\infty} x^n=0" if "0\\leq x<1" .
Let "f(x)=\\begin{cases}\n1 & \\text{if} \\ x=1 \\\\\n0 & \\text{if} \\ 0\\leq x <1\n\\end{cases}"
Thus the sequence of functions "(f_n)" converges pointwise to "f" on "[0,1]" .
If "n_k=k \\ and \\ x_k=(\\frac{1}{2})^{ \\frac{1}{k}}" then
"| f_{n_k}(x_k)-f(x_k)|=|\\frac{1}{2}-0|=\\frac{1}{2}" .
Therefore the sequence "(f_n)" does not converge uniformly to "f" on "[0,1]" .
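Equivalently, the conclusion can be phrased via the sup-norm criterion (a restatement, not made explicit in the original): taking "x_n=\\left(\\frac{1}{2}\\right)^{1/n}" gives

"\\sup_{x\\in[0,1]}|f_n(x)-f(x)|\\geq |f_n(x_n)-f(x_n)|=\\frac{1}{2}" for every "n" ,

so the sup-norm does not tend to 0, which is exactly the failure of uniform convergence.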