Data Fundamentals (H) - Week 07 Quiz
1. Simulated annealing uses what metaheuristic to help avoid getting trapped in local minima?
Hill climbing.
A population of solutions.
Randomised restart.
A temperature schedule.
Crossover rules.
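For revision, the temperature-schedule idea behind simulated annealing can be sketched in a few lines. This is an illustrative sketch only, not the course implementation; the objective function, step size, and cooling rate below are all invented for the example:

```python
import math
import random

def simulated_annealing(f, x0, steps=5000, t0=1.0, cooling=0.999):
    """Minimise f from x0 using a geometric temperature schedule (a sketch)."""
    random.seed(0)                      # fixed seed so the run is repeatable
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        x_new = x + random.uniform(-0.1, 0.1)   # small random move
        f_new = f(x_new)
        # Always accept improvements; accept a worse move with probability
        # exp(-delta/t), which shrinks as the temperature t cools. This is
        # what lets the search climb out of local minima early on.
        delta = f_new - fx
        if delta < 0 or random.random() < math.exp(-delta / t):
            x, fx = x_new, f_new
        if fx < best_f:
            best_x, best_f = x, fx
        t *= cooling                    # geometric cooling schedule
    return best_x, best_f

# A bumpy 1-D objective with several local minima.
f = lambda x: x * x + 2.0 * math.sin(5.0 * x) + 2.0
x_best, f_best = simulated_annealing(f, x0=3.0)
print(x_best, f_best)
```

With the temperature set to zero throughout, the same loop degenerates to plain hill climbing and gets stuck in whichever local minimum is nearest the start.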
2. A hyperparameter of an optimisation algorithm is:
A measure of how good a solution is.
A value that is used to impose constraints on the solution.
A value that affects how a solution is searched for.
The determinant of the Hessian.
A direction in hyperspace.
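A concrete way to see the distinction: the step size (learning rate) of gradient descent is a hyperparameter, because it changes how the search behaves without being part of the solution itself. A minimal sketch (the function and values are invented for illustration):

```python
# Gradient descent on f(x) = x^2. The learning rate lr is a hyperparameter:
# it controls how the search proceeds, not how good a solution is.
def descend(lr, steps=50, x=5.0):
    for _ in range(steps):
        x -= lr * 2.0 * x   # gradient of x^2 is 2x
    return x

# Same algorithm, same objective, different hyperparameter, very
# different behaviour: lr=0.1 converges, lr=1.1 diverges.
print(descend(lr=0.1), descend(lr=1.1))
```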
3. First-order optimisation requires that objective functions be:
invertible
monotonic
one-dimensional
\(C^1\) continuous
disconcerting
4. The gradient vector \(\nabla L(\theta)\) is a vector which, at any given point \(\theta\), will:
be equal to \(\theta\)
point in the direction of steepest descent
be zero
have \(L_2\) norm 1
point towards the global minimum of \(L(\theta)\)
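The direction of the gradient can be checked numerically on a toy objective. A small sketch (the quadratic \(L\) below is invented for illustration): the gradient points uphill, so it is \(-\nabla L\) that gives the steepest-descent direction.

```python
import numpy as np

# L(theta) = theta_1^2 + 2*theta_2^2: a simple bowl with minimum at the origin.
L = lambda theta: theta[0] ** 2 + 2.0 * theta[1] ** 2
grad_L = lambda theta: np.array([2.0 * theta[0], 4.0 * theta[1]])  # analytic gradient

theta = np.array([3.0, -1.0])
g = grad_L(theta)
step = 0.01

# Stepping *against* the gradient decreases L...
assert L(theta - step * g) < L(theta)
# ...while stepping *along* the gradient increases L: the gradient points uphill.
assert L(theta + step * g) > L(theta)
```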
5. Finite differences are not an effective way to implement first-order optimisation because of:
the effect of measurement noise
none of the above
all of the above
the curse of dimensionality
numerical roundoff issues
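The roundoff problem is easy to demonstrate. A sketch with an invented test function (central differences on \(\sin\), whose true derivative is \(\cos\)): a moderate step size works well, but shrinking the step too far makes floating-point cancellation dominate, and in \(d\) dimensions the cost also grows with \(d\), since each coordinate needs its own pair of evaluations.

```python
import math

def central_diff(f, x, h):
    """Central-difference estimate of f'(x): two evaluations per dimension."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

f = math.sin
x = 1.0
exact = math.cos(x)         # true derivative of sin

# A moderate step size gives an accurate estimate...
good = central_diff(f, x, h=1e-5)
# ...but a tiny step size amplifies floating-point roundoff: the two
# function values are nearly equal, so their difference loses precision.
tiny = central_diff(f, x, h=1e-13)

print(abs(good - exact), abs(tiny - exact))
```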
6. Ant colony optimisation applies which two metaheuristics to improve random local search?
temperature and memory
random restart and hyperdynamics
thants
gradient descent and crossover
memory and population