Data Fundamentals (H) - Week 08 Quiz
1. For a multi-objective optimisation, Pareto optimality means that:
Every combination of the sub-objective functions has been searched.
There is no possible improvement to any sub-objective function.
All sub-objective functions are zero.
Gradient descent is invalid.
Any improvement in any sub-objective function makes at least one other worse.
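As a study aid, here is a minimal Python sketch of the Pareto-dominance idea this question tests (the point set and function names are illustrative, not from the quiz): a point is Pareto-optimal within a set if no other point is at least as good on every objective and strictly better on at least one.

```python
# Pareto dominance for a minimisation problem over two objectives.
def dominates(a, b):
    """True if objective vector `a` dominates `b` (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

points = [(1, 5), (2, 2), (5, 1), (4, 4)]  # (objective_1, objective_2)
pareto_front = [p for p in points if not any(dominates(q, p) for q in points)]
# (4, 4) is dominated by (2, 2); the remaining points trade off against each other,
# so improving one objective at a front point must worsen another.
```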
2. What property of a probability distribution always holds true?
The product of all probabilities is 1
The determinant of probabilities is \(\infty\)
The sum of all probabilities is 1
The sum of all probabilities is 0
Probabilities are equally divided among outcomes
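A quick illustrative check of the normalisation property behind this question (the fair-die example is my own): the probabilities of all outcomes of a discrete distribution sum to 1.

```python
# Fair six-sided die: each face has probability 1/6.
die = {face: 1 / 6 for face in range(1, 7)}
total = sum(die.values())  # normalisation: should equal 1
```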
3. Bayesians use probability as:
a calculus of belief
a prayer book
complex angles
a representation of the long-term average of frequencies of outcomes
a calculus of truth
4. The conditional probability P(A|B) is defined to be: (\(\land\) means "and" and \(\lor\) means "or")
\(P(A \lor B) + P(B \lor A)\)
\(P(A \land B) / P(B)\)
\(P(A)P(B)\)
\(P(A \land B) P(B)\)
\(P(A||B) - B(A||P)\)
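A worked numeric sketch of the definition \(P(A|B) = P(A \land B) / P(B)\), using a fair-die example of my own choosing: \(A\) = "roll is even", \(B\) = "roll is greater than 3".

```python
# Enumerate the six equally likely outcomes of a fair die.
outcomes = range(1, 7)
p = 1 / 6
p_b = sum(p for x in outcomes if x > 3)                        # P(B) = 3/6
p_a_and_b = sum(p for x in outcomes if x % 2 == 0 and x > 3)   # {4, 6} -> 2/6
p_a_given_b = p_a_and_b / p_b                                  # (2/6)/(3/6) = 2/3
```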
5. If I have a joint distribution over two random variables \(A\) and \(B\), \(P(A,B)\), how can I compute \(P(A)\)?
Divide \(P(A,B)\) by \(P(B)\)
Sum/integrate over \(P(A,B)\) for every value of \(A\)
\(P(A,B)−P(A|B)\)
Sum/integrate over \(P(A,B)\) for every value of \(B\)
Sum/integrate \(P(A,B)\) for every value of \(A\) and \(B\).
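A small NumPy sketch of marginalisation, the operation this question is about (the joint table values are made up for illustration): \(P(A)\) is recovered from \(P(A,B)\) by summing over every value of \(B\).

```python
import numpy as np

# Joint distribution P(A, B); rows index A, columns index B. Entries sum to 1.
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])
p_a = joint.sum(axis=1)  # marginalise out B -> P(A) = [0.30, 0.70]
```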
6. In an optimisation problem, a penalty function can be used to:
implement genetic algorithms
implement soft constraints
accelerate gradient descent
reduce the need for random search
issue red cards
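A minimal sketch of a penalty function implementing a soft constraint (the objective, constraint, and weight are invented for illustration): minimise \(f(x) = x^2\) subject, softly, to \(x \geq 1\). Violations are discouraged by an added cost rather than forbidden outright.

```python
# Quadratic penalty: the further x falls below 1, the larger the added cost.
def penalised(x, weight=10.0):
    objective = x ** 2
    violation = max(0.0, 1.0 - x)        # constraint violation amount
    return objective + weight * violation ** 2
```

For example, `penalised(2.0)` returns the plain objective `4.0` (constraint satisfied), while `penalised(0.0)` returns `10.0` because the violation term dominates.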