Data Fundamentals (H) - Week 07 Quiz
1. Simulated annealing uses what metaheuristic to help avoid getting trapped in local minima?
Randomised restart.
Hill climbing.
A population of solutions.
Crossover rules.
A temperature schedule.
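As a reminder of how a temperature schedule works in practice, here is a minimal sketch of simulated annealing on a 1D loss. This is illustrative only (not a reference implementation from the course); the names `loss`, `t0`, and `alpha`, and the geometric cooling schedule, are assumptions for the example.

```python
import math
import random

def simulated_annealing(loss, x0, step=0.1, t0=1.0, alpha=0.99, iters=2000):
    """Minimise `loss` with a geometric temperature schedule t = t0 * alpha**k.
    Early on (high t), worse moves are accepted with probability exp(-delta/t),
    which lets the search escape local minima; as t -> 0 it reduces to pure
    downhill local search."""
    x, fx, t = x0, loss(x0), t0
    for _ in range(iters):
        x_new = x + random.uniform(-step, step)   # random local perturbation
        f_new = loss(x_new)
        # always accept improvements; accept worsening moves with prob exp(-delta/t)
        if f_new < fx or random.random() < math.exp(-(f_new - fx) / t):
            x, fx = x_new, f_new
        t *= alpha  # cool down
    return x, fx

random.seed(0)
x, fx = simulated_annealing(lambda x: (x - 2.0) ** 2, x0=0.0)
```

Note that temperature is the distinguishing metaheuristic here: restart, populations, and crossover belong to other algorithms (random-restart search and genetic algorithms).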
2. A hyperparameter of an optimisation algorithm is:
A value that is used to impose constraints on the solution.
A value that affects how a solution is searched for.
A direction in hyperspace.
A measure of how good a solution is.
The determinant of the Hessian.
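To make the distinction concrete, here is a minimal gradient-descent sketch (an assumed example, not from the quiz itself): the learning rate `lr` and iteration count `iters` are hyperparameters, because they control *how* the search proceeds rather than constraining or scoring the solution.

```python
def gradient_descent(grad, x0, lr=0.1, iters=100):
    # lr and iters are hyperparameters of the search procedure;
    # they are not part of the solution, and changing them changes
    # how (and how fast) a solution is found.
    x = x0
    for _ in range(iters):
        x = x - lr * grad(x)
    return x

# minimising L(x) = (x - 3)^2, whose gradient is 2(x - 3)
x = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```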
3. First-order optimisation requires that objective functions be:
disconcerting
monotonic
one-dimensional
invertible
\(C^1\) continuous
4. The gradient vector \(\nabla L(\theta)\), evaluated at any given point \(\theta\), will:
point in the direction of steepest descent
be equal to \(\theta\)
point towards the global minimum of \(L(\theta)\)
be zero
have \(L_2\) norm 1
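A small worked example (assumed for illustration) helps separate the distractors: the gradient points in the direction of steepest *ascent*, so steepest descent is its negation, and it points at the global minimum only for special losses such as this symmetric bowl.

```python
def grad_L(theta):
    # analytic gradient of L(theta) = theta[0]**2 + theta[1]**2
    return [2 * theta[0], 2 * theta[1]]

theta = [3.0, -4.0]
g = grad_L(theta)                 # direction of steepest ascent at theta
descent = [-gi for gi in g]       # steepest descent is the negative gradient
```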
5. Finite differences is not an effective way to apply first-order optimisation because:
of numerical roundoff issues.
all of the above
the effect of measurement noise
none of the above
the curse of dimensionality
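The reasons above can be seen in a forward-difference sketch (an assumed example, not course code): each gradient estimate costs \(d+1\) function evaluations for a \(d\)-dimensional \(\theta\) (the curse of dimensionality), and the step size \(h\) trades truncation error against floating-point roundoff in the subtraction, which measurement noise makes even worse.

```python
def fd_gradient(f, theta, h=1e-6):
    """Forward-difference estimate of the gradient of f at theta.
    Costs len(theta) + 1 evaluations of f per estimate."""
    f0 = f(theta)
    grad = []
    for i in range(len(theta)):
        bumped = list(theta)
        bumped[i] += h     # perturb one coordinate at a time
        grad.append((f(bumped) - f0) / h)
    return grad

f = lambda t: t[0] ** 2 + 3 * t[1]
g = fd_gradient(f, [1.0, 2.0])   # close to the true gradient [2, 3]
# shrinking h much further (e.g. h=1e-15) makes the subtraction
# f(bumped) - f0 lose almost all significant digits to roundoff
```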
6. Ant colony optimisation applies which two metaheuristics to improve random local search?
temperature and memory
gradient descent and crossover
thants
random restart and hyperdynamics
memory and population