In some exercises you are asked to plot a function of one or two variables. You may use whatever software you prefer, especially if you already use it. If you write a program, you are welcome to include it in your submission; I will not grade the program, but I will take a look as needed.
Various software that will suffice includes:
Let $\mathcal{F}\subseteq 2^\Omega$ be an event space and let $A_1,A_2,\ldots$ be an infinite sequence of events, i.e., $A_j\in\mathcal{F}$ for $j=1,2,\ldots$. For any given non-negative integer $k$, let $B_k$ be the set of those $\omega\in\Omega$ which belong to exactly $k$ of the events $A_j$. Prove that $B_k \in\mathcal{F}$.
As an application of the above theorem, consider the following random experiment. A fair coin is tossed every day, from now on to infinity. Assume that the outcome on a particular day ("heads" or "tails") is an event. Consider the set of outcomes in which exactly $5$ heads were obtained (from now to infinity). Prove that this set of outcomes is an event.
Is there a discrete random variable $X$ with range $\{1,2,3,\ldots\}$ such that \[ \P(X=n) = \frac{1}{n(n+ 1)} \] for every $n\ge 1$? If so, what is $\E(X)$?
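A quick sanity check with sympy (one possible choice of software): the first sum verifies that the candidate pmf has total mass $1$ (the series telescopes), and the second examines the series that would define $\E(X)$, which reduces term by term to $\sum 1/(n+1)$.

```python
import sympy as sp

n = sp.symbols('n', positive=True, integer=True)
# Total mass of the candidate pmf P(X = n) = 1/(n(n+1)): a telescoping series
total = sp.summation(1 / (n*(n + 1)), (n, 1, sp.oo))
# The series for E(X) reduces term by term to the sum of 1/(n+1)
mean_tail = sp.summation(1 / (n + 1), (n, 1, sp.oo))
print(total, mean_tail)
```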
Define a random variable $X$ such that $\E(X^2)$ is finite, but $\E(X)$ does not exist. What can you say about $\V(X)$?
Use the result of Problem 2.6.1 on page 35 and the identity $x^2 = x(x-1) + x$ to find the variance of a Poisson distributed random variable $X$.
Find the $n$-th factorial moment of a geometrically distributed random variable $X$: \[ \P(X=k) = p q^{k-1}, \quad k=1,2,\ldots\] ($p$ is the probability of success; $q=1-p$ is the probability of failure), i.e., find \[ \E\bigl(X(X-1)\cdots(X-n+1)\bigr). \]
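A numerical sanity check for one special case, $n = 2$: the truncated series below should match the closed form $2q/p^2$ of the second factorial moment. The value $p = 0.3$ and the truncation point are arbitrary choices.

```python
# Arbitrary illustrative parameter (not from the problem statement)
p = 0.3
q = 1 - p

# Truncated series for the n = 2 factorial moment E[X(X-1)]
m2 = sum(k*(k - 1) * p * q**(k - 1) for k in range(1, 2000))
print(m2)            # should be close to 2*q/p**2
```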
Two cards are drawn at random from a deck of 52 cards. If $X$ denotes the number of aces drawn and $Y$ denotes the number of spades, display the joint mass function of $X$ and $Y$ in the tabular form of Table 3.1 from the textbook. Every entry must be correct and exact (a fraction) to earn credit for this problem.
The pair of discrete random variables $(X, Y)$ has joint mass function \[ \P(X= i, Y= j)=\begin{cases} \theta^{i+ 2j+1} & \text{if $i, j= 0, 1, 2$},\\ 0 & \text{otherwise}, \end{cases} \] for some value of $\theta$. Find an equation that $\theta$ must satisfy. Prove that $\theta$ is unique, and find its value either exactly or to 6 significant digits.
HINT: If you need to approximate $\theta$, use suitable software that can solve equations.
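For instance, with sympy as the equation-solving software, assuming (as the problem suggests you should show) that the equation comes from the total probability being $1$:

```python
import sympy as sp

theta = sp.symbols('theta', positive=True)
# The equation theta must satisfy: total probability equals 1
eq = sum(theta**(i + 2*j + 1) for i in range(3) for j in range(3)) - 1
# Numeric root, starting from an arbitrary initial guess of 0.5
root = sp.nsolve(eq, theta, 0.5)
print(root)
```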
Let $X$ and $Y$ be the random variables discussed in Problem C3.1.2. Find the pmf of each of them. That is, find \[ p_X(x) = \P(X=x),\quad x\in\RR\] and \[ p_Y(y) = \P(Y=y), \quad y\in\RR.\]
The pair of discrete random variables $(X, Y)$ has joint mass function \[ \P(X= i, Y= j)=\begin{cases} \theta^{i+ 2j+1} & \text{if $i, j= 0, 1, 2$},\\ 0 & \text{otherwise}, \end{cases} \] for some value of $\theta$. Find the expected value \[ \E(X^2Y) \] as a function of $\theta$. Also approximate to 6 significant digits using software.
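A sketch of the symbolic side of the computation in sympy, summing $i^2 j\,\P(X=i,Y=j)$ over the nine support points; the numeric approximation would then follow by substituting the value of $\theta$ found earlier.

```python
import sympy as sp

theta = sp.symbols('theta', positive=True)
# E(X^2 Y) = sum over i, j = 0, 1, 2 of i^2 * j * P(X = i, Y = j)
EX2Y = sp.expand(sum(i**2 * j * theta**(i + 2*j + 1)
                     for i in range(3) for j in range(3)))
print(EX2Y)
```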
Let $X$ and $Y$ be independent discrete random variables. Prove that \[ \P(X \ge x \text{ and } Y \ge y) = \P(X \ge x)\P(Y \ge y) \] for all $x, y \in \RR$, but without using any summations. Instead, use Theorem 3.20 for suitably chosen functions $g, h:\RR\to\RR$.
Are the random variables $X$ and $Y$ discussed in Problem C3.1.2 independent? Why or why not? (A proof is required.)
Let $X$ and $Y$ be two Bernoulli random variables on the same probability space $(\Omega,\mathcal{F},\P)$. Prove that $X$ and $Y$ are independent if and only if $\E(X\,Y) = \E(X)\E(Y)$.
Let $X$ and $Y$ be independent discrete random variables assuming values in the set $\{0,1,2,\ldots,100\}$, such that \begin{align*} \P(X=k) &= \frac{ck}{100},\\ \P(Y=k) &= \frac{c(100-k)}{100}. \end{align*}
Let $N$ be the number of the events $A_1$, $A_2$, $\ldots$ (an infinite sequence of events) which occur. Show that $\E(N)=\sum_{i=1}^\infty \P(A_i)$. In particular, prove that $N$ is a random variable!
HINT: Use the hint in the footnote on page 47 and our Chapter 1, page 6.
Let $X:\Omega\to\RR$ be the random variable representing the total number of heads in $n$ coin tosses (e.g., as introduced in Example 2.18 in Chapter 2, page 8, or on page 27). Find a number $k$ ($k=\infty$ allowed), pairwise disjoint events $A_j$, $j=1,2,\ldots, k$, and real numbers $c_j$, $j=1,2,\dots,k$, such that \[ X = \sum_{j=1}^k c_j\one_{A_j}. \]
Let $X:\Omega\to\RR$ be the random variable representing the waiting time for the first head in a repeated coin toss experiment (the probability of a head in a single toss is $p$; $\P(X=x)=(1-p)^{x-1}p$, $x=1,2,\ldots$). Find a number $k$ ($k=\infty$ allowed), pairwise disjoint events $A_j$, $j=1,2,\ldots, k$, and real numbers $c_j$, $j=1,2,\dots,k$, such that \[ X = \sum_{j=1}^k c_j\one_{A_j}. \]
Don't forget that it is possible to never get a head, i.e., $X=\infty$ is possible! Propose a solution to this situation. Below we assume the standard model from the book, with \[ \Omega = \{ T^{k-1}H \}_{k=1}^\infty \cup \{T^\infty\}. \] Two possible solutions are:
A random number $N$ of people arrive at a movie theater to see a movie. The theater has $r$ separate rooms which play the movie. The doorman directs people to the rooms at random, with probabilities $p_1,p_2,\ldots, p_r$ ($\sum_{j=1}^r p_j=1$). Random variable $N$ is Poisson distributed with parameter $\lambda$.
Let $N_1, N_2, \ldots, N_r$ be the numbers of people that are let into the $r$ rooms. Show that the $N_j$ are independent, Poisson distributed random variables, and find their parameters $\lambda_1,\lambda_2,\ldots,\lambda_r$ as functions of $\lambda$ and the $p_j$ ($j=1,2,\ldots,r$).
HINT: Read my solution to Problem 3.6.14, page 49, and our Chapter 3, page 31.
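As a sanity check (not a proof), the thinning can be simulated; the values $\lambda = 10$ and $(p_1,p_2,p_3) = (0.5, 0.3, 0.2)$ below are arbitrary choices for illustration.

```python
import numpy as np

# Arbitrary illustrative parameters (not from the problem statement)
lam, probs = 10.0, [0.5, 0.3, 0.2]
trials = 200_000
rng = np.random.default_rng(0)

N = rng.poisson(lam, size=trials)   # total arrivals on each simulated day
# Split each day's arrivals among the rooms with a multinomial draw
counts = np.array([rng.multinomial(n, probs) for n in N])

# Column j should look Poisson(lam * p_j): sample mean and variance both
# near lam * p_j, and the columns empirically uncorrelated
print(counts.mean(axis=0), counts.var(axis=0))
```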
We roll a die independently 10 times and compute $X$, the total of the face values.
NOTE: Follow the class notes on Chapter 4, page 19. Note that you have a choice of 3 methods for the second part. If you use a CAS, you can compute a derivative of $G_X(s)$ at $0$, or you can find the Taylor polynomial of $G_X(s)$ up to sufficiently high order. Or you can express $\P(X=0)$ as a sum and find several binomial coefficients.
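For instance, taking the Taylor-polynomial route with sympy as one possible CAS: expand the pgf of the total of the 10 rolls and read off a coefficient. The choice of $\P(X=30)$ below is only an illustration, not the value the problem asks for.

```python
import sympy as sp

s = sp.symbols('s')
G1 = sum(s**k for k in range(1, 7)) / 6   # pgf of a single fair die roll
G = sp.expand(G1**10)                      # pgf of the total of 10 rolls
p30 = G.coeff(s, 30)                       # P(X = 30): coefficient of s**30
print(p30)
```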
(See the Example 4.39 on page 57.)
Suppose that a random variable $X$ has generating function \[ G_X(s) = \left(\frac{1}{2} + \frac{1}{2}e^{3(s-1)}\right)^{20}. \] Find the exact value of $\P(X = 20)$.
HINT: You must use a CAS for this exercise. The easiest way is to use the command that finds the Taylor series.
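One possible route, with sympy as the CAS: Taylor-expand $G_X(s)$ about $s=0$ to order $21$ and read off the coefficient of $s^{20}$ (the expansion may take a few seconds).

```python
import sympy as sp

s = sp.symbols('s')
G = (sp.Rational(1, 2) + sp.Rational(1, 2) * sp.exp(3*(s - 1)))**20
# P(X = 20) is the coefficient of s**20 in the Taylor series of G at 0
p20 = sp.series(G, s, 0, 21).removeO().coeff(s, 20)
print(p20, float(p20))
```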
Prove carefully that, for every random variable $X:\Omega\to\RR$, \[ \lim_{x\to-\infty} F_X(x) = 0 \] and \[ \lim_{x\to\infty} F_X(x) = 1. \]
Prove carefully that the Devil's Staircase function (Cantor function) defined in the class notes is a valid cdf.
The stock of the company Macrosoft on a normal day goes up or down by $X$ dollars, where $X$ is uniformly distributed on the interval $[-1,1]$, except on those days when the company reports its earnings (once every 90 days), when the stock gains or loses exactly $Y$ dollars, where $\P(Y=\pm 5)=0.5$. Let $Z$ be the price change of Macrosoft stock on a random day.