Abstract

During the last decade, considerable work has been carried out on one-shot device analysis and, in particular, on robust methods based on divergences, which improve on classical inference based on the maximum likelihood estimator (MLE) or the likelihood ratio test. The estimators and tests developed by this approach depend on a tuning parameter, and the choice of this tuning parameter is one of the main drawbacks of the approach. In this paper, given a data set, we study different methods for choosing the "optimal" tuning parameter, including the iterative Warwick and Jones (IWJ) algorithm (Basak et al. [8]) and the minimization of certain loss functions of the observed data. While the IWJ algorithm appears to be a good approach for low and moderate contamination, simulations suggest that, under high contamination, minimizing the mean absolute error of the observed probabilities is at least as efficient as the IWJ algorithm while avoiding heavier computations.
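The data-driven selection described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes hypothetical one-shot device data (groups of devices inspected at fixed times, with only failure counts recorded), an exponential lifetime model, a minimum density power divergence estimator indexed by a tuning parameter beta, and selection of beta by minimizing the mean absolute error between observed and fitted failure probabilities. All data values and the coarse grid search are illustrative stand-ins.

```python
import math

# Hypothetical one-shot device data: group i has n[i] devices inspected at
# time t[i], with k[i] observed failures (illustrative numbers only).
t = [2.0, 5.0, 8.0, 12.0]
n = [30, 30, 30, 30]
k = [4, 9, 15, 22]

def p_fail(lam, ti):
    """Failure probability by time ti under an exponential lifetime model."""
    return 1.0 - math.exp(-lam * ti)

def dpd_objective(lam, beta):
    """Density power divergence objective for grouped binary (one-shot) data.
    Reduces to the negative log-likelihood as beta -> 0."""
    total = 0.0
    for ti, ni, ki in zip(t, n, k):
        p = p_fail(lam, ti)
        q = 1.0 - p
        if beta == 0.0:
            total -= ki * math.log(p) + (ni - ki) * math.log(q)
        else:
            total += ni * (p ** (1 + beta) + q ** (1 + beta)) \
                     - (1 + 1 / beta) * (ki * p ** beta + (ni - ki) * q ** beta)
    return total

def mdpde(beta, grid):
    """Minimum DPD estimate of lambda over a coarse grid (a stand-in for a
    proper numerical optimiser)."""
    return min(grid, key=lambda lam: dpd_objective(lam, beta))

lam_grid = [i / 1000 for i in range(1, 500)]
betas = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]

def mae(beta):
    """Mean absolute error between observed and fitted failure probabilities."""
    lam_hat = mdpde(beta, lam_grid)
    return sum(abs(ki / ni - p_fail(lam_hat, ti))
               for ti, ni, ki in zip(t, n, k)) / len(t)

best_beta = min(betas, key=mae)
print("selected tuning parameter:", best_beta)
```

Compared with the IWJ algorithm, which iterates over a pilot estimator, this criterion requires only one fit per candidate value of the tuning parameter, which is the computational saving alluded to in the abstract.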
Publisher

Springer
