Data Fundamentals (H) - Week 07 Quiz
1. Simulated annealing uses what metaheuristic to help avoid getting trapped in local minima?
Hill climbing.
Crossover rules.
A population of solutions.
Randomised restart.
A temperature schedule.
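For revision, a minimal sketch of simulated annealing on a one-dimensional objective, assuming a geometric cooling schedule; the function name, constants, and test objective are illustrative, not from the course materials:

```python
import math
import random

def simulated_annealing(loss, x0, step=0.1, t0=1.0, alpha=0.95, iters=500):
    """Minimise `loss` by random local search with a geometric temperature schedule."""
    random.seed(0)                  # fixed seed, for a reproducible illustration
    x, best, t = x0, x0, t0
    for _ in range(iters):
        cand = x + random.uniform(-step, step)   # propose a random neighbour
        delta = loss(cand) - loss(x)
        # Accept worse moves with probability exp(-delta / t): while t is high,
        # this lets the search climb out of local minima.
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = cand
        if loss(x) < loss(best):
            best = x
        t *= alpha                               # cool the temperature
    return best

best = simulated_annealing(lambda x: x * x, x0=5.0)
```

Accepting occasional uphill moves while the temperature is still high is what distinguishes this from plain hill climbing.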
2. A hyperparameter of an optimisation algorithm is:
A measure of how good a solution is.
A value that is used to impose constraints on the solution.
The determinant of the Hessian.
A direction in hyperspace.
A value that affects how a solution is searched for.
3. First-order optimisation requires that objective functions be:
one-dimensional
\(C^1\) continuous
monotonic
disconcerting
invertible
4. The gradient vector \(\nabla L(\theta)\) is a vector which, at any given point \(\theta\), will:
be zero
be equal to \(\theta\)
point towards the global minimum of \(L(\theta)\)
point in the direction of steepest ascent
have \(L_2\) norm 1
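A minimal gradient-descent sketch for reference: because the gradient points uphill, the update steps along the *negative* gradient. The quadratic objective and all constants are illustrative:

```python
def grad_descent(grad, theta, lr=0.1, steps=100):
    """Basic gradient descent: repeatedly step against the gradient."""
    for _ in range(steps):
        theta = theta - lr * grad(theta)   # -grad(theta) is the downhill direction
    return theta

# Illustrative objective L(theta) = (theta - 3)^2, so grad L = 2*(theta - 3);
# the minimiser is theta = 3.
theta_star = grad_descent(lambda th: 2 * (th - 3), theta=0.0)
```

Here the learning rate `lr` is a hyperparameter: it changes how the solution is searched for, not the solution's quality measure.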
5. Finite differencing is not an effective way to apply first-order optimisation because:
none of the above
numerical roundoff issues
all of the above
the curse of dimensionality
the effect of measurement noise
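A central-difference sketch for reference, illustrating why finite differences scale poorly: each gradient estimate costs two objective evaluations per coordinate (the curse of dimensionality), and the step `h` must trade truncation error against floating-point roundoff. All names are illustrative:

```python
def fd_gradient(f, x, h=1e-5):
    """Central-difference estimate of the gradient of f at x (a list of floats).

    Needs 2 * len(x) evaluations of f, so the cost grows linearly with
    dimension, and h must balance truncation error against roundoff."""
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h       # perturb coordinate i upwards
        xm = list(x); xm[i] -= h       # and downwards
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

# Illustrative check against the exact gradient of f(x) = x0^2 + 3*x1,
# which is [2*x0, 3] = [4, 3] at the point [2, 1].
g = fd_gradient(lambda v: v[0] ** 2 + 3 * v[1], [2.0, 1.0])
```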
6. Ant colony optimisation applies which two metaheuristics to improve random local search?
memory and population
thants
temperature and memory
random restart and hyperdynamics
gradient descent and crossover
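A toy sketch of the two metaheuristics in ant colony optimisation: a pheromone table acts as shared *memory*, and a *population* of ants samples options in proportion to it, reinforcing good choices. Everything here (names, constants, the discrete option set) is illustrative and far simpler than real ACO:

```python
import random

def ant_colony_min(loss, options, ants=20, rounds=30, evap=0.5):
    """Toy ant colony optimisation over a discrete set of options."""
    random.seed(0)                        # fixed seed, for a reproducible illustration
    pher = {o: 1.0 for o in options}      # pheromone = the shared memory
    for _ in range(rounds):
        for _ in range(ants):             # the population of ants
            # Sample an option with probability proportional to its pheromone.
            r = random.random() * sum(pher.values())
            for o in options:
                r -= pher[o]
                if r <= 0:
                    choice = o
                    break
            # Better (lower-loss) options receive more pheromone.
            pher[choice] += 1.0 / (1.0 + loss(choice))
        # Evaporation stops early choices from dominating forever.
        for o in options:
            pher[o] *= evap
    # Read out the option the colony's memory favours most.
    return max(pher, key=pher.get)

best = ant_colony_min(lambda x: (x - 4) ** 2, options=[0, 2, 4, 6, 8])
```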