3.7 Minimax Estimation
1 Definition
The idea is to minimize the worst-case risk $\sup_\theta R(\theta, \hat\theta)$.
The minimum achievable sup-risk is called the minimax risk of the estimation problem:
$$R^* = \inf_{\hat\theta} \sup_\theta R(\theta, \hat\theta).$$
An estimator $\hat\theta$ is minimax if $\sup_\theta R(\theta, \hat\theta) = R^*$.
Game theory interpretation:
- Analyst chooses estimator $\hat\theta$ to minimize the risk.
- Nature chooses parameter $\theta$ to maximize the risk.
Nature chooses second, knowing $\hat\theta$, so the analyst must guard against the worst case.
Compare to the Bayes estimator (a "maximin" game):
- Nature chooses prior $\Lambda$ first, to maximize the Bayes risk.
- Analyst chooses estimator to minimize the average risk $B(\Lambda, \hat\theta) = \int R(\theta, \hat\theta)\, d\Lambda(\theta)$.
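The two games above can be played out numerically. Below is a toy sketch with a hypothetical finite risk matrix (three candidate estimators, two parameter values; the numbers are made up for illustration), showing that the maximin value never exceeds the minimax value:

```python
import numpy as np

# Hypothetical finite game: rows = analyst's candidate estimators, columns =
# nature's parameter values; entry [i, j] is the risk R(theta_j, estimator_i).
risk = np.array([
    [1.0, 4.0],
    [3.0, 2.0],
    [2.5, 2.5],
])

# Minimax: analyst moves first, nature responds with the worst theta.
worst_case = risk.max(axis=1)          # sup over theta, for each estimator
minimax_value = worst_case.min()       # inf over estimators
minimax_estimator = worst_case.argmin()

# Maximin: nature moves first (restricted here to point-mass priors).
best_response = risk.min(axis=0)       # inf over estimators, for each theta
maximin_value = best_response.max()    # sup over theta

print(minimax_estimator, minimax_value, maximin_value)
assert maximin_value <= minimax_value  # maximin never exceeds minimax
```

Note the minimax-optimal row is the constant-risk one, foreshadowing the constant-risk criterion below.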
2 Least Favorable Priors
The key observation is: average-case risk $\le$ worst-case risk:
$$B(\Lambda, \hat\theta) = \int R(\theta, \hat\theta)\, d\Lambda(\theta) \le \sup_\theta R(\theta, \hat\theta) \quad \text{for any prior } \Lambda.$$
A prior $\Lambda$ is least favorable if its Bayes risk $B_\Lambda = \inf_{\hat\theta} B(\Lambda, \hat\theta)$ satisfies $B_\Lambda \ge B_{\Lambda'}$ for every prior $\Lambda'$.
Theorem. Suppose $\hat\theta_\Lambda$ is Bayes with respect to $\Lambda$ and
$$B(\Lambda, \hat\theta_\Lambda) = \sup_\theta R(\theta, \hat\theta_\Lambda). \quad (*)$$
Then:
- $\hat\theta_\Lambda$ is minimax.
- If $\hat\theta_\Lambda$ is the unique Bayes estimator (up to a.s. equivalence) for $\Lambda$, it is the unique minimax estimator.
- $\Lambda$ is least favorable.
Proof.
- For any other estimator $\hat\theta$,
$$\sup_\theta R(\theta, \hat\theta) \ge \int R(\theta, \hat\theta)\, d\Lambda(\theta) \ge \int R(\theta, \hat\theta_\Lambda)\, d\Lambda(\theta) = \sup_\theta R(\theta, \hat\theta_\Lambda)$$
(the last equality is by assumption (*)). So $\sup_\theta R(\theta, \hat\theta_\Lambda)$ is the minimax risk, and $\hat\theta_\Lambda$ is a minimax estimator.
- For uniqueness: if $\hat\theta_\Lambda$ is the unique Bayes estimator, replace the second "$\ge$" with "$>$" in the chain above; then no other estimator attains the minimax risk.
- For any other prior $\Lambda'$,
$$B_{\Lambda'} \le B(\Lambda', \hat\theta_\Lambda) \le \sup_\theta R(\theta, \hat\theta_\Lambda) = B(\Lambda, \hat\theta_\Lambda) = B_\Lambda,$$
so $\Lambda$ is least favorable.
So when is average risk = sup risk? Condition (*) holds if the risk $R(\theta, \hat\theta_\Lambda)$ is constant in $\theta$: a Bayes estimator with constant risk is minimax.
Try $X \sim \mathrm{Bin}(n, p)$ under squared error loss, with prior $p \sim \mathrm{Beta}(\sqrt{n}/2, \sqrt{n}/2)$. The Bayes estimator
$$\hat p = \frac{X + \sqrt{n}/2}{n + \sqrt{n}}$$
has constant risk $\dfrac{1}{4(1 + \sqrt{n})^2}$, so it is minimax.
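A quick numerical sanity check of the constant-risk claim, using the bias-variance decomposition of the squared-error risk (the sample size $n = 100$ is arbitrary):

```python
import numpy as np

# Verify that the Bayes estimator under the Beta(sqrt(n)/2, sqrt(n)/2) prior
# has risk that does not depend on p, and equals 1 / (4 (1 + sqrt(n))^2).
n = 100
a = np.sqrt(n) / 2  # prior is Beta(a, a)

def risk(p):
    # Squared-error risk of p_hat = (X + a) / (n + 2a) for X ~ Bin(n, p):
    # risk = Var(p_hat) + bias(p_hat)^2.
    var = n * p * (1 - p) / (n + 2 * a) ** 2
    bias = (n * p + a) / (n + 2 * a) - p
    return var + bias ** 2

ps = np.linspace(0.01, 0.99, 99)
risks = risk(ps)
constant = 1 / (4 * (1 + np.sqrt(n)) ** 2)  # claimed constant risk

print(risks.max() - risks.min())  # ~0: the risk is flat in p
assert np.allclose(risks, constant)
```

The flat risk curve is exactly what makes the average risk equal the sup risk, so (*) holds.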
2.1 Least Favorable Prior Sequence
Sometimes there is no least favorable prior. E.g. estimating the mean of $X \sim N(\theta, \sigma^2)$ with $\theta \in \mathbb{R}$: no proper prior attains the supremum of the Bayes risk.
A sequence of priors $(\Lambda_m)$ with $B_{\Lambda_m} \to r$ can play the same role.
Theorem. If an estimator $\hat\theta$ satisfies $\sup_\theta R(\theta, \hat\theta) = r = \lim_m B_{\Lambda_m}$, then:
- $\hat\theta$ is minimax.
- $(\Lambda_m)$ is least favorable, i.e. $B_\Lambda \le r$ for every prior $\Lambda$.
Proof.
- For another estimator $\hat\theta'$: $\sup_\theta R(\theta, \hat\theta') \ge B(\Lambda_m, \hat\theta') \ge B_{\Lambda_m}$ for every $m$; letting $m \to \infty$ gives $\sup_\theta R(\theta, \hat\theta') \ge r = \sup_\theta R(\theta, \hat\theta)$.
- For any prior $\Lambda$: $B_\Lambda \le B(\Lambda, \hat\theta) \le \sup_\theta R(\theta, \hat\theta) = r$.
Try the priors $\Lambda_m = N(0, m)$ in the normal-mean problem: $B_{\Lambda_m} = \dfrac{\sigma^2 m}{\sigma^2 + m} \to \sigma^2$, while $R(\theta, X) = \sigma^2$ for every $\theta$.
So $X$ is minimax for estimating $\theta$.
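The convergence of the Bayes risks in this example is easy to see numerically (a sketch with $\sigma^2 = 1$ for concreteness):

```python
# Bayes risk for X ~ N(theta, sigma^2) under the conjugate prior
# theta ~ N(0, m): the posterior mean has Bayes risk
# sigma^2 * m / (sigma^2 + m), which increases to sigma^2 as m -> infinity.
sigma2 = 1.0

def bayes_risk(m):
    return sigma2 * m / (sigma2 + m)

ms = [1, 10, 100, 10_000]
print([bayes_risk(m) for m in ms])  # increases toward sigma^2 = 1.0

# The estimator X itself has constant risk sigma^2, matching the limit,
# so X is minimax and (N(0, m)) is a least favorable sequence.
assert all(bayes_risk(a) < bayes_risk(b) for a, b in zip(ms, ms[1:]))
assert abs(bayes_risk(10_000) - sigma2) < 1e-3
```

No single proper normal prior attains the limit $\sigma^2$, which is why a sequence is needed.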
3 Bounding Minimax Risk
Our theorem gives an idea of how to bound
- Upper bound: if
is any estimator then . ("=" if is minimax) - Lower bound: if
is any prior then . ("=" if is least favorable).
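The two bounds sandwich $R^*$. As an illustrative sketch, for $X \sim \mathrm{Bin}(n, p)$ under squared error we can take the MLE for the upper bound and the $\mathrm{Beta}(\sqrt{n}/2, \sqrt{n}/2)$ prior from the earlier example for the lower bound:

```python
import numpy as np

# Sandwich the minimax risk R* for estimating p from X ~ Bin(n, p)
# under squared error loss.
n = 100

# Upper bound: the MLE X/n has risk p(1-p)/n, maximized at p = 1/2.
upper = 0.25 / n

# Lower bound: Bayes risk of the Beta(sqrt(n)/2, sqrt(n)/2) prior; its Bayes
# estimator has constant risk 1/(4 (1 + sqrt(n))^2), equal to the Bayes risk.
lower = 1 / (4 * (1 + np.sqrt(n)) ** 2)

print(lower, upper)  # R* lies between these; both are of order 1/n
assert lower <= upper
```

The two bounds match to leading order in $n$, so even without exhibiting the minimax estimator, the MLE is rate-optimal for this problem.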
Minimax estimators are very hard to find, but minimax bounds are widely used in statistical theory to characterize the hardness of a problem (especially the lower bound).
- Propose a practical estimator $\hat\theta$, then find a prior $\Lambda$ for which $B_\Lambda$ is close to $\sup_\theta R(\theta, \hat\theta)$ (or matches its rate, or converges to it asymptotically). Conclude that $\hat\theta$ can't be improved "much".
- Quantify the hardness of a problem by its minimax rate in some asymptotic regime.
- Caveat: a problem might be easy throughout most of the parameter space but very hard in some bizarre corner you never encounter in practice.