Description: The Johns Hopkins University Press • Baltimore and London, 1978. 176 pages, condition good (cover lightly worn, dusty, outdated library stamps). ISBN 0-8018-2031-6

PREFACE
ACKNOWLEDGMENTS

Chapter 1. Historical Background
1.1. Moses as a Statistician
1.2. The Haberdasher and the "Histogram"
1.3. The Pearson Distributions
1.4. Density Estimation by Classical Maximum Likelihood

Chapter 2. Some Approaches to Nonparametric Density Estimation
2.1. The Normal Distribution as Universal Density
2.2. The Johnson Family of Distributions
2.3. The Symmetric Stable Distributions
2.4. Series Estimators
2.5. Kernel Estimators

Chapter 3. Maximum Likelihood Density Estimation
3.1. Maximum Likelihood Estimators
3.2. The Histogram as a Maximum Likelihood Estimator
3.3. The Infinite Dimensional Case

Chapter 4. Maximum Penalized Likelihood Density Estimation
4.1. Maximum Penalized Likelihood Estimators
4.2. The de Montricher-Tapia-Thompson Estimator
4.3. The First Estimator of Good and Gaskins
4.4. The Second Estimator of Good and Gaskins

Chapter 5. Discrete Maximum Penalized Likelihood Estimation
5.1. Discrete Maximum Penalized Likelihood Estimators
5.2. Consistency Properties of the DMPLE
5.3. Numerical Implementation and Monte Carlo Simulation

Appendix I. An Introduction to Mathematical Optimization Theory
I.1. Hilbert Space
I.2. Reproducing Kernel Hilbert Spaces
I.3. Convex Functional and Differential Characterizations
I.4. Existence and Uniqueness of Solutions for Optimization Problems in Hilbert Space
I.5. Lagrange Multiplier Necessity Conditions

Appendix II. Numerical Solution of Constrained Optimization Problems
II.1. The Diagonalized Multiplier Method
II.2. Optimization Problems with Nonnegativity Constraints

INDEX