Lehmann, in his lecture notes on estimation, shows that for estimating the unknown mean of a normal distribution, N(θ, 1), the usual estimator is neither minimax nor admissible if it is known that θ belongs to a finite closed interval [a, b] and the loss function is squared error. In particular, the maximum likelihood estimator (MLE) of θ under this restriction has uniformly smaller mean squared error (MSE) than the usual estimator. It is natural to ask whether the MLE of θ in N(θ, 1) is itself admissible when it is known that θ ∊ [a, b]. The answer turns out to be negative, and the purpose of this note is to present this result in a slightly generalized form.
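The dominance claim above can be checked numerically. The sketch below is a minimal Monte Carlo illustration, assuming a single observation X ~ N(θ, 1) and the (hypothetical) interval [a, b] = [-1, 1]; the restricted MLE is the projection of X onto [a, b], i.e. min(max(X, a), b). Since θ ∊ [a, b], clamping X to the interval can only move it closer to θ, so the restricted MLE has pointwise smaller error and hence uniformly smaller MSE than the usual estimator X.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = -1.0, 1.0      # assumed known interval containing theta (illustrative choice)
n_sim = 200_000       # Monte Carlo replications

for theta in [-1.0, -0.5, 0.0, 0.5, 1.0]:
    x = rng.normal(theta, 1.0, n_sim)       # single observation X ~ N(theta, 1)
    # usual (unrestricted) estimator: X itself
    mse_usual = np.mean((x - theta) ** 2)
    # restricted MLE: clamp X to [a, b]
    mse_mle = np.mean((np.clip(x, a, b) - theta) ** 2)
    # the restricted MLE dominates X in MSE for every theta in [a, b]
    assert mse_mle < mse_usual
```

The strict inequality holds at every θ in [a, b] because X falls outside the interval with positive probability, and clipping strictly reduces the error on that event.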