Data Fundamentals (H) - Week 07 Quiz
1. Simulated annealing uses what metaheuristic to help avoid getting trapped in local minima?
A population of solutions.
Hill climbing.
A temperature schedule.
Randomised restart.
Crossover rules.
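For revision, here is a minimal sketch of simulated annealing on a toy 1-D objective. The function names, parameter values, and cooling schedule are illustrative, not from the course materials; the key point is that the temperature schedule lets the search occasionally accept worse solutions and so escape local minima.

```python
import math
import random

def simulated_annealing(f, x0, T0=1.0, cooling=0.95, steps=500, step_size=0.5, seed=0):
    """Minimise f starting from x0, with a geometric temperature schedule."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    T = T0
    for _ in range(steps):
        # Propose a random nearby solution.
        x_new = x + rng.uniform(-step_size, step_size)
        fx_new = f(x_new)
        # Accept worse moves with probability exp(-delta/T): the temperature
        # schedule is what lets the search climb out of local minima.
        if fx_new < fx or rng.random() < math.exp(-(fx_new - fx) / T):
            x, fx = x_new, fx_new
            if fx < best_fx:
                best_x, best_fx = x, fx
        T *= cooling  # cool down: worse moves become ever less likely to be accepted
    return best_x, best_fx

# A bumpy objective with many local minima (illustrative choice).
bumpy = lambda x: x * x + 3.0 * math.sin(5.0 * x)
x_best, f_best = simulated_annealing(bumpy, x0=4.0)
```

Early on (high temperature) the search behaves almost randomly; as the temperature decays it behaves increasingly like pure hill climbing.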
2. A hyperparameter of an optimisation algorithm is:
A direction in hyperspace.
A measure of how good a solution is.
A value that affects how a solution is searched for.
A value that is used to impose constraints on the solution.
The determinant of the Hessian.
3. First-order optimisation requires that objective functions be:
one-dimensional
disconcerting
invertible
\(C^1\) continuous
monotonic
4. The gradient vector \(\nabla L(\theta)\) is a vector which, at any given point \(\theta\), will:
point in the direction of steepest descent
point towards the global minimum of \(L(\theta)\)
be equal to \(\theta\)
have \(L_2\) norm 1
be zero
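As a reminder of how the gradient vector is used in practice, here is a minimal gradient-descent sketch on a toy quadratic objective (the objective, learning rate, and step count are illustrative assumptions). Note that \(\nabla L(\theta)\) points in the direction of steepest *ascent*, so descent steps along its negative.

```python
import numpy as np

def grad_L(theta):
    # Gradient of the toy objective L(theta) = ||theta||^2,
    # i.e. nabla L(theta) = 2 * theta.
    return 2.0 * theta

theta = np.array([3.0, -2.0])
lr = 0.1  # step size: a hyperparameter of the search
for _ in range(100):
    # Step along the *negative* gradient, the direction of steepest descent.
    theta = theta - lr * grad_L(theta)
# theta is now very close to the minimiser at the origin.
```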
5. Finite differences is not an effective way to apply first-order optimisation because of:
none of the above
the effect of measurement noise
all of the above
the curse of dimensionality
numerical roundoff issues
6. Ant colony optimisation applies which two metaheuristics to improve random local search?
random restart and hyperdynamics
memory and population
gradient descent and crossover
temperature and memory
thants
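A toy sketch of the two ideas behind ant colony optimisation: a *population* of ants makes choices in parallel, and shared pheromone acts as a *memory* of good past choices. Everything here (function name, parameters, the evaporation rule) is an illustrative assumption, not the course's implementation.

```python
import random

def aco_choice(costs, n_ants=10, n_iters=50, evaporation=0.1, seed=0):
    """Pick the cheapest of several options via pheromone reinforcement."""
    rng = random.Random(seed)
    pheromone = [1.0] * len(costs)  # memory: shared across the whole colony
    for _ in range(n_iters):
        for _ in range(n_ants):  # population: many ants search per iteration
            # Each ant chooses an option with probability proportional
            # to its pheromone level.
            total = sum(pheromone)
            r, acc, choice = rng.random() * total, 0.0, 0
            for i, p in enumerate(pheromone):
                acc += p
                if r <= acc:
                    choice = i
                    break
            # Reinforce cheap choices: positive feedback concentrates
            # future ants on good options.
            pheromone[choice] += 1.0 / costs[choice]
        # Evaporation: old memory gradually fades.
        pheromone = [(1 - evaporation) * p for p in pheromone]
    return pheromone.index(max(pheromone))

best = aco_choice([100.0, 1.0, 100.0])  # option 1 is by far the cheapest
```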