Data Fundamentals (H) - Week 09 Quiz
1. The entropy of a random variable, \(H(X) = -\sum_i P(x_i)\log P(x_i)\), is:
A measure of surprise
A measure of validity
A measure of expectation
A measure of likelihood
A measure of dimension
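A minimal sketch of how the entropy formula in question 1 could be evaluated numerically; the distributions below are assumed purely for illustration.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_i P(x_i) * log(P(x_i)), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A fair coin is maximally "surprising" for two outcomes (log 2 nats);
# a heavily biased coin is less surprising on average.
print(entropy([0.5, 0.5]))   # ~0.693
print(entropy([0.9, 0.1]))   # ~0.325
```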
2. Bayes' Rule is:
P(A|B) = P(B|A)P(A) / P(B)
P(A|B) = P(B|B)P(A|A)
P(A|B) = P(B|A)P(A)P(B)
P(A|B) = P(A)P(B)
P(A|B) = P(A) - P(A ∩ B)
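A quick numerical check of the form P(A|B) = P(B|A)P(A) / P(B) from question 2, using a made-up diagnostic-test scenario; all the numbers are assumptions for illustration only.

```python
# Hypothetical numbers: disease prevalence, test sensitivity, false-positive rate.
p_disease = 0.01                       # P(A): prior
p_pos_given_disease = 0.95             # P(B|A): likelihood
p_pos_given_healthy = 0.05             # P(B|not A)

# P(B): total probability of a positive test, by the law of total probability.
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# P(A|B): posterior via Bayes' Rule.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)             # ~0.161
```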
3. The names for P(B|A) and P(A) in Bayes' Rule are:
evidence and prior
priory and little red riding hood
posterior and evidence
likelihood and prior
likelihood and posterior
4. The expectation \(\mathbb{E}[X+1]\) for a discrete random variable \(X\) would be computed as:
\(\int_x P(X=x)P(X=1)x\)
\(\sum_x P(X=x)(x+1)\)
\(\sum_x P(X=x+1)\)
\(\sum_x P(X=x+1)(x+1)\)
\(\sum_x P(X+1=x)(x)\)
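A small sketch of the sum \(\sum_x P(X=x)(x+1)\) from question 4 on an assumed toy distribution; it also confirms the identity \(\mathbb{E}[X+1] = \mathbb{E}[X] + 1\).

```python
# Toy discrete distribution over outcomes x (values assumed for illustration).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

e_x = sum(p * x for x, p in pmf.items())                # E[X]
e_x_plus_1 = sum(p * (x + 1) for x, p in pmf.items())   # E[X+1] = sum_x P(X=x)(x+1)

print(e_x, e_x_plus_1)   # 1.1 and 2.1 -- linearity of expectation
```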
5. Which of these is a statistic which is an estimator of a population parameter for a normal distribution?
The entropy
The sample minimum
The probability
The bootstrap
The sample mean
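A sketch of the statistic-as-estimator idea behind question 5, drawing samples from a normal distribution with assumed parameters and computing a statistic from the sample alone.

```python
import random

random.seed(0)
mu, sigma = 5.0, 2.0                           # assumed population parameters
sample = [random.gauss(mu, sigma) for _ in range(10_000)]

sample_mean = sum(sample) / len(sample)        # a statistic computed from the sample
print(sample_mean)                             # close to the population mean mu = 5.0
```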
6. Which of these is a nonparametric statistic?
gradient
standard deviation
expectation
mean
median
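An illustration, on assumed data, of the distinction question 6 is probing: the median makes no distributional assumption and is robust to a single extreme value, whereas the mean is pulled strongly by it.

```python
import statistics

data = [2, 3, 3, 4, 5]               # assumed sample
data_with_outlier = data + [1000]    # one extreme value added

print(statistics.mean(data), statistics.median(data))          # 3.4, 3
print(statistics.mean(data_with_outlier),
      statistics.median(data_with_outlier))                    # 169.5, 3.5
```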