Statistical Planning and Inference : Concepts and Applications
Author(s): Ghosh, Subir
ISBN No.: 9781119962786
Pages: 240
Year: 2025
Format: Trade Cloth (Hard Cover)
Price: $128.81
Dispatch delay: Dispatched within 7 to 15 days
Status: Available

Table of Contents:

Preface xi
1 Foundation of Experiments 1
1.1 Uncertainties in Evidences 1
1.2 Examples 2
1.2.1 The Louis Pasteur Anthrax Vaccination Experiment 2
1.2.2 The Lanarkshire Milk Experiment: Milk Tests in Lanarkshire Schools 2
1.3 Replication, Randomization, Blocking, and Blinding 4
1.3.1 Replication 4
1.3.2 Randomization 4
1.3.3 Blocking 4
1.3.4 Blinding 4
1.4 Figuring It Out! 4
Questions and Answers 5
Bibliography 6
2 Completely Randomized Design 7
2.1 An Example 7
2.2 Analyses Using R and SAS 9
2.3 Figuring It Out! 12
Bibliography 16
3 Randomized Complete Block Design 17
3.1 Fixed Effects Model 18
3.2 Binomial Model for Signs 20
3.3 Randomization Model 20
3.4 Mixed Effects Model 25
3.5 General Mixed Effects Model 27
3.6 The REML Variance Components Estimates 28
3.7 BLUEs and BLUPs 31
3.7.1 The Conditional Model 32
3.7.2 The Unconditional Model 32
3.7.3 Computation--The Conditional Model 33
3.7.4 Computation--The Unconditional Model 34
3.8 Figuring It Out! 39
Bibliography 40
4 Randomized Incomplete Block Design 41
4.1 Model M1: Fixed-Effects Model 41
4.2 Model M2: Mixed-Effects Model 43
4.3 Research Questions 44
4.4 Figuring It Out! 45
4.5 Definitions 46
Exercises 46
Bibliography 51
5 Error Rates 53
5.1 Definitions of Error Rates 53
5.2 Single-Stage Methods 55
5.3 A Multistage Method 56
5.3.1 Benjamini and Hochberg Method 57
5.4 Figuring It Out 58
Questions 59
Bibliography 62
6 Nutrition Experiment 63
6.1 Figuring It Out! 63
Bibliography 75
7 The Pearson Dependence 77
7.1 Bivariate Normal Distribution 77
7.2 Estimation of Unknown Parameters 79
7.2.1 The Unconditional Model 79
7.2.2 The Conditional Model 81
7.2.3 Test of Significance 83
7.3 A Bayesian Estimation 84
7.4 Exercises 86
Bibliography 87
8 The Multivariate Dependence 89
8.1 The Multivariate Normal Distribution 90
8.2 Inference 91
8.3 Partial Dependence 96
8.4 Exercises 96
Bibliography 98
9 The Conditional Mean Dependence 99
9.1 LS Estimation 100
9.2 Ridge Estimation 101
9.2.1 A Bayesian Estimation 103
9.3 Dependence of Ridge Estimator on the Tuning Parameter 103
9.4 LASSO Estimation 104
9.5 Dependence of LASSO Estimators on the Tuning Parameter 105
Bibliography 116
10 More Parameters Than Observations 119
10.1 Learning by Doing--Exercises 122
Exercises 123
Bibliography 125
11 Eigenvalues, Eigenvectors, and Applications 127
11.1 Eigenvalues and Eigenvectors 127
11.2 Second-Order Response Surface 129
Exercises 132
Bibliography 133
12 Covariance Estimation 135
12.1 Model 1 135
12.1.1 Characterization of the Covariance Matrix and Its Estimators 135
12.1.2 Likelihood Function 136
12.1.3 Properties 137
12.2 Model 2 137
12.2.1 Characterization of the Covariance Matrix and Its Estimators 138
12.3 Model 3 138
12.4 Model 4 139
12.5 Model 5 140
12.6 Exercises 141
Bibliography 142
13 Discriminant Analysis 145
13.1 Learning from the Univariate Data--Two Normal Populations with Equal Variances 145
13.1.1 Discriminant Analysis for the Univariate Data 147
13.1.2 Example--Univariate Discriminant Analysis 148
13.2 Learning from the Univariate Data--Two Normal Populations with Unequal Variances 151
13.2.1 Classification of 25 Versicolor Iris Flowers 153
13.2.2 Classification of 25 Setosa Iris Flowers 154
13.2.3 Test of Homogeneity of Variances 154
13.3 Learning from the Multivariate Data 155
13.3.1 Classification of Versicolor and Setosa 156
13.3.2 Classification of Versicolor and Virginica 158
13.4 Logistic Regression 159
13.5 Exercises 160
Bibliography 162
14 Optimizing the Variance-Bias Trade-Off 163
14.1 Variance-Bias Trade-Off 163
14.1.1 Example 1 164
14.1.2 Example 2 165
14.1.3 Example 3 166
14.2 Information in Data 167
14.3 Information and Design in Presence of a Covariate 169
14.3.1 Information 169
14.3.2 Optimum Design for a Covariate 170
14.4 Information and Design in Presence of Multiple Covariates 171
14.4.1 Information 171
14.4.2 Exponential Model 175
14.4.3 Exponential Regression Model with Multiple Covariates 176
14.4.4 Poisson Log-Linear Model 177
14.4.5 Non-parametric Regression Model 180
14.5 Exercises 183
Bibliography 187
15 Specification, Discrimination, Robustness, and Sensitivity 189
15.1 The Global and Local Optimal Models 189
15.2 The T-Optimal Design 190
15.3 Convex and Concave Functions 192
15.4 The Kullback-Leibler (KL) Divergence 194
15.5 The KL Design Optimality 197
15.6 The Differential Entropy 198
15.7 Lindley Information Measure 200
15.8 Joint Entropy, Conditional Entropy, and Mutual Information 202
15.9 Maximum Entropy Sampling 204
15.10 Search Linear Models and Search Designs 207
15.10.1 Factorial Experiments 209
15.10.2 Search Probability Matrix 210
15.11 Robustness Against Unavailable Data 210
15.12 Influential Sets of Observations 212
15.13 Exercises 213
Bibliography 214
Data Index 217
Subject Index 219

