Chernoff bound: inequalities like $P(Y \ge 150) \le e^{-50\log(\frac{27}{16})}$.
Let $X_1, X_2, \dots, X_{200}$ be independent Bernoulli random variables; in other words, $X_i \sim \mathrm{Ber}(\frac{1}{2})$ for $i \in \{1, \dots, 200\}$.
Define $Y = \sum_{i=1}^{200} X_i$.
Given: $Y \sim \mathrm{Bin}(200, \frac{1}{2})$.
Show that:
a) $ P(Y \ge 150) \le \frac{2}{3} $
b) for every $t \in \mathbb{R}$: $\mathbb{E}(e^{tY}) = e^{200\log(\frac{1}{2}e^t+\frac{1}{2})}$
c) for every $t \in \mathbb{R}$: $P(Y \ge 150) \le e^{-150t+200\log(\frac{1}{2}e^t+\frac{1}{2})}$
d) Find the optimal $t$ and show that $P(Y \ge 150) \le e^{-50\log(\frac{27}{16})}$.
My attempt:
a) Solved this with Markov's inequality.
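For reference, the Markov step can be written out explicitly (my annotation, using $\mathbb{E}[Y] = 200 \cdot \frac{1}{2} = 100$):

```latex
P(Y \ge 150) \;\le\; \frac{\mathbb{E}[Y]}{150} \;=\; \frac{100}{150} \;=\; \frac{2}{3}.
```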
b) $\mathbb{E}(e^{t Y}) = \mathbb{E}(e^{t \sum_{i=1}^{200} X_i }) = \mathbb{E}(e^{ \sum_{i=1}^{200} t X_i }) = \mathbb{E}\big( \prod_{i=1}^{200}e^{t X_i }\big)$. I know that the $X_i$ are independent, but how do I know that the $e^{tX_i}$ are independent? Assuming independence for now: $\mathbb{E}(e^{tY}) = \prod_{i=1}^{200} \mathbb{E}(e^{t X_i })$. Now I need to compute $\mathbb{E}(e^{t X_i})$.

b) The independence comes from a standard theorem: if $X_1, X_2, \dots, X_n$ are independent, then $f_1(X_1), f_2(X_2), \dots, f_n(X_n)$ are independent for any (measurable) functions $f_i$. For the expectation, use the Law of the Unconscious Statistician:
$$\mathbb E[e^{t X_i}] = e^{t \cdot 1} \cdot \frac 1 2 + e^{t \cdot 0} \cdot \frac 1 2.$$
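As a numerical sanity check (my addition, not part of the original answer), the closed form from (b), $\mathbb{E}[e^{tY}] = (\frac{1}{2}e^t + \frac{1}{2})^{200}$, can be compared against the expectation computed directly from the Binomial pmf; $t = 0.3$ is an arbitrary sample value:

```python
import math

n, p, t = 200, 0.5, 0.3  # t is an arbitrary sample value

# Closed form from part (b): E[e^{tY}] = (p*e^t + (1-p))^n
closed_form = (p * math.exp(t) + (1 - p)) ** n

# Direct computation: sum e^{tk} against the Binomial(n, p) pmf
direct = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
             for k in range(n + 1))

print(closed_form, direct)
```

The two values agree to floating-point precision, which is exactly the statement of part (b).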
c) Show that when $t < 0$, the exponent on $e$ is positive, so the right-hand side exceeds $1$ and the bound holds trivially.
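To make this precise, one convenient route (my sketch) uses convexity of the exponent $g(t) = -150t + 200\log(\frac{1}{2}e^t + \frac{1}{2})$: it is a linear term plus a cumulant generating function, hence convex, with $g(0) = 0$ and $g'(0) = -150 + 200 \cdot \frac{1}{2} = -50$. Convexity then gives

```latex
g(t) \;\ge\; g(0) + g'(0)\,t \;=\; -50\,t \;>\; 0 \qquad \text{for } t < 0,
```

so the bound $e^{g(t)}$ exceeds $1$ and says nothing beyond $P(Y \ge 150) \le 1$.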
d) Hint: Recall from elementary calculus how to find optimal values of a function; the upshot is to take the derivative, set it to $0$, and solve.
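Numerically, the recipe for (d) can be checked end to end (my sketch, not part of the original answer): the Chernoff exponent being minimised is $g(t) = -150t + 200\log(\frac{1}{2}e^t + \frac{1}{2})$, and its critical point works out to $e^t = 3$:

```python
import math

n, a = 200, 150

# Setting g'(t) = -a + n * e^t / (e^t + 1) to zero gives
# e^t / (e^t + 1) = a / n = 3/4, i.e. t* = log 3.
t_star = math.log(3)
bound = math.exp(-a * t_star + n * math.log(0.5 * math.exp(t_star) + 0.5))

# Algebraically the bound is 2^200 / 3^150 = (16/27)^50 = e^{-50 log(27/16)}.
print(math.isclose(bound, (16 / 27) ** 50))

# Sanity check: the Chernoff bound dominates the exact Bin(200, 1/2) tail.
exact_tail = sum(math.comb(n, k) for k in range(a, n + 1)) / 2 ** n
print(exact_tail <= bound)
```

Note how much sharper this is than the Markov bound in (a): $(\frac{16}{27})^{50}$ is on the order of $10^{-12}$, versus $\frac{2}{3}$.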