1. Statistics [2016]
 Donnelly, Robert A., Jr., author.
 Third edition, First American edition.  Indianapolis, Indiana : Alpha, a member of Penguin Random House LLC, 2016.
 Description
 Book — 1 online resource (1 volume) : illustrations.
 Summary

Statistics is a class that is required in many college majors, and it's an increasingly popular Advanced Placement high school course. In addition to math and technical students, many business and liberal arts students are required to take it as a fundamental component of their majors. A knowledge of statistical interpretation is vital for many careers. Idiot's Guides: Statistics explains the fundamental tenets in language anyone can understand. Content includes: calculating descriptive statistics; measures of central tendency (mean, median, and mode); probability; variance analysis; inferential statistics; hypothesis testing; and organizing data into statistical charts and tables.
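The summary above lists the measures of central tendency (mean, median, and mode); as a minimal sketch, not an example from the book, they can be computed with Python's standard library on made-up data:

```python
from statistics import mean, median, mode

# A small illustrative data set of exam scores (made up for this sketch)
scores = [70, 85, 85, 90, 100]

print(mean(scores))    # arithmetic mean: sum / count
print(median(scores))  # middle value of the sorted data
print(mode(scores))    # most frequent value
```

The same `statistics` module also provides `pvariance` and `pstdev` for the descriptive spread measures the summary mentions.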
2. Analyzing baseball data with R [2014]
 Marchi, Max.
 Boca Raton : CRC Press, [2014]
 Description
 Book — xvii, 333 pages : illustrations ; 24 cm.
 Summary

 The Baseball Datasets Introduction The Lahman Database: Season-by-Season Data Retrosheet Game-by-Game Data Retrosheet Play-by-Play Data Pitch-by-Pitch Data
 Introduction to R Introduction Installing R and RStudio Vectors Objects and Containers in R Collection of R Commands Reading and Writing Data in R Data Frames Packages Splitting, Applying, and Combining Data
 Traditional Graphics Introduction Factor Variable Saving Graphs Dot Plots Numeric Variable: Stripchart and Histogram Two Numeric Variables A Numeric Variable and a Factor Variable Comparing Ruth, Aaron, Bonds, and A-Rod The 1998 Home Run Race
 The Relation between Runs and Wins Introduction The Teams Table in Lahman's Database Linear Regression The Pythagorean Formula for Winning Percentage The Exponent in the Pythagorean Formula Good and Bad Predictions by the Pythagorean Formula How Many Runs for a Win?
 Value of Plays Using Run Expectancy The Runs Expectancy Matrix Runs Scored in the Remainder of the Inning Creating the Matrix Measuring Success of a Batting Play Albert Pujols Opportunity and Success for All Hitters Position in the Batting Lineup Run Values of Different Base Hits Value of Base Stealing
 Advanced Graphics Introduction The lattice Package The ggplot2 Package
 Balls and Strikes Effects Introduction Hitter's Counts and Pitcher's Counts Behaviors by Count
 Career Trajectories Introduction Mickey Mantle's Batting Trajectory Comparing Trajectories General Patterns of Peak Ages Trajectories and Fielding Position
 Simulation Introduction Simulating a Half Inning Simulating a Baseball Season
 Exploring Streaky Performances Introduction The Great Streak Streaks in Individual At-Bats Local Patterns of Weighted On-Base Average
 Learning about Park Effects by Database Management Tools Introduction Installing MySQL and Creating a Database Connecting R to MySQL Filling a MySQL Game Log Database from R Querying Data from R Baseball Data as MySQL Dumps Calculating Basic Park Factors
 Exploring Fielding Metrics with Contributed R Packages Introduction A Motivating Example: Comparing Fielding Metrics Comparing Two Shortstops
 Appendix A: Retrosheet Files Reference Appendix B: Accessing and Using MLBAM Gameday and PITCHf/x Data
 Bibliography Index
 Further Reading and Exercises appear at the end of each chapter.
 (source: Nielsen Book Data)
 Online
 Green Library, Stacks: GV877 .M353 2014
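One of the chapters listed above, "The Relation between Runs and Wins", centers on the Pythagorean formula for winning percentage. The book itself works in R; as a language-neutral illustration (not taken from the book), here is a minimal Python sketch of the standard exponent-2 formula, with the exponent left as a parameter since the chapter also examines alternatives:

```python
def pythagorean_wpct(runs_scored, runs_allowed, exponent=2):
    """Bill James's Pythagorean expectation: RS^k / (RS^k + RA^k)."""
    rs, ra = runs_scored ** exponent, runs_allowed ** exponent
    return rs / (rs + ra)

# A team that scores 800 runs and allows 600 (made-up season totals)
print(pythagorean_wpct(800, 600))  # -> 0.64
```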
 Boca Raton : CRC Press, 2014.
 Description
 Book — xxiii, 622 pages ; 24 cm
 Summary

 The History of COPSS: A brief history of the Committee of Presidents of Statistical Societies (COPSS) Ingram Olkin
 Reminiscences and Personal Reflections on Career Paths: Reminiscences of the Columbia University Department of Mathematical Statistics in the late 1940s Ingram Olkin A career in statistics Herman Chernoff "... how wonderful the field of statistics is ..." David R. Brillinger An unorthodox journey to statistics: Equity issues, remarks on multiplicity Juliet Popper Shaffer Statistics before and after my COPSS Prize Peter J. Bickel The accidental biostatistics professor Donna Brogan Developing a passion for statistics Bruce G. Lindsay Reflections on a statistical career and their implications R. Dennis Cook Science mixes it up with statistics Kathryn Roeder Lessons from a twisted career path Jeffrey S. Rosenthal Promoting equity Mary Gray
 Perspectives on the Field and Profession: Statistics in service to the nation Stephen E. Fienberg Where are the majors? Iain M. Johnstone We live in exciting times Peter Hall The bright future of applied statistics Rafael A. Irizarry The road travelled: From a statistician to a statistical scientist Nilanjan Chatterjee Reflections on a journey into statistical genetics and genomics Xihong Lin Reflections on women in statistics in Canada Mary E. Thompson "The whole women thing" Nancy Reid Reflections on diversity Louise Ryan
 Reflections on the Discipline: Why does statistics have two theories? Donald A.S. Fraser Conditioning is the issue James O. Berger Statistical inference from a Dempster-Shafer perspective Arthur P. Dempster Nonparametric Bayes David B. Dunson How do we choose our default methods? Andrew Gelman Serial correlation and Durbin-Watson bounds T.W. Anderson A non-asymptotic walk in probability and statistics Pascal Massart The past's future is now: What will the present's future bring? Lynne Billard Lessons in biostatistics Norman E. Breslow A vignette of discovery Nancy Flournoy Statistics and public health research Ross L. Prentice Statistics in a new era for finance and health care Tze Leung Lai Meta-analyses: Heterogeneity can be a good thing Nan M. Laird Good health: Statistical challenges in personalizing disease prevention Alice S. Whittemore Buried treasures Michael A. Newton Survey sampling: Past controversies, current orthodoxy, future paradigms Roderick J.A. Little Environmental informatics: Uncertainty quantification in the environmental sciences Noel A. Cressie A journey with statistical genetics Elizabeth Thompson Targeted learning: From MLE to TMLE Mark van der Laan Statistical model building, machine learning, and the ah-ha moment Grace Wahba In praise of sparsity and convexity Robert J. Tibshirani Features of Big Data and sparsest solution in high confidence set Jianqing Fan Rise of the machines Larry A. Wasserman A trio of inference problems that could win you a Nobel Prize in statistics (if you help fund it) Xiao-Li Meng
 Advice for the Next Generation: Inspiration, aspiration, ambition C.F. Jeff Wu Personal reflections on the COPSS Presidents' Award Raymond J. Carroll Publishing without perishing and other career advice Marie Davidian Converting rejections into positive stimuli Donald B. Rubin The importance of mentors Donald B. Rubin Never ask for or give advice, make mistakes, accept mediocrity, enthuse Terry Speed Thirteen rules Bradley Efron.
 (source: Nielsen Book Data)
Science Library (Li and Ma), Stacks: QA276 .P276 2014
 Modica, Giuseppe.
 Chichester, West Sussex : John Wiley & Sons Ltd., 2013.
 Description
 Book — 1 online resource (xii, 334 pages)
 Summary

 Preface xi
 1 Combinatorics 1 1.1 Binomial coefficients 1 1.1.1 Pascal triangle 1 1.1.2 Some properties of binomial coefficients 2 1.1.3 Generalized binomial coefficients and binomial series 3 1.1.4 Inversion formulas 4 1.1.5 Exercises 6 1.2 Sets, permutations and functions 8 1.2.1 Sets 8 1.2.2 Permutations 8 1.2.3 Multisets 10 1.2.4 Lists and functions 11 1.2.5 Injective functions 12 1.2.6 Monotone increasing functions 12 1.2.7 Monotone nondecreasing functions 13 1.2.8 Surjective functions 14 1.2.9 Exercises 16 1.3 Drawings 16 1.3.1 Ordered drawings 16 1.3.2 Simple drawings 17 1.3.3 Multiplicative property of drawings 17 1.3.4 Exercises 18 1.4 Grouping 19 1.4.1 Collocations of pairwise different objects 19 1.4.2 Collocations of identical objects 22 1.4.3 Multiplicative property 23 1.4.4 Collocations in statistical physics 24 1.4.5 Exercises 24
 2 Probability measures 27 2.1 Elementary probability 28 2.1.1 Exercises 29 2.2 Basic facts 33 2.2.1 Events 34 2.2.2 Probability measures 36 2.2.3 Continuity of measures 37 2.2.4 Integral with respect to a measure 39 2.2.5 Probabilities on finite and denumerable sets 40 2.2.6 Probabilities on denumerable sets 42 2.2.7 Probabilities on uncountable sets 44 2.2.8 Exercises 46 2.3 Conditional probability 51 2.3.1 Definition 51 2.3.2 Bayes formula 52 2.3.3 Exercises 54 2.4 Inclusion-exclusion principle 60 2.4.1 Exercises 63
 3 Random variables 68 3.1 Random variables 68 3.1.1 Definitions 69 3.1.2 Expected value 75 3.1.3 Functions of random variables 77 3.1.4 Cavalieri formula 80 3.1.5 Variance 82 3.1.6 Markov and Chebyshev inequalities 82 3.1.7 Variational characterization of the median and of the expected value 83 3.1.8 Exercises 84 3.2 A few discrete distributions 91 3.2.1 Bernoulli distribution 91 3.2.2 Binomial distribution 91 3.2.3 Hypergeometric distribution 93 3.2.4 Negative binomial distribution 94 3.2.5 Poisson distribution 95 3.2.6 Geometric distribution 98 3.2.7 Exercises 101 3.3 Some absolutely continuous distributions 102 3.3.1 Uniform distribution 102 3.3.2 Normal distribution 104 3.3.3 Exponential distribution 106 3.3.4 Gamma distributions 108 3.3.5 Failure rate 110 3.3.6 Exercises 111
 4 Vector valued random variables 113 4.1 Joint distribution 113 4.1.1 Joint and marginal distributions 114 4.1.2 Exercises 117 4.2 Covariance 120 4.2.1 Random variables with finite expected value and variance 120 4.2.2 Correlation coefficient 123 4.2.3 Exercises 123 4.3 Independent random variables 124 4.3.1 Independent events 124 4.3.2 Independent random variables 127 4.3.3 Independence of many random variables 128 4.3.4 Sum of independent random variables 130 4.3.5 Exercises 131 4.4 Sequences of independent random variables 140 4.4.1 Weak law of large numbers 140 4.4.2 Borel-Cantelli lemma 142 4.4.3 Convergences of random variables 143 4.4.4 Strong law of large numbers 146 4.4.5 A few applications of the law of large numbers 152 4.4.6 Central limit theorem 159 4.4.7 Exercises 163
 5 Discrete time Markov chains 168 5.1 Stochastic matrices 168 5.1.1 Definitions 169 5.1.2 Oriented graphs 170 5.1.3 Exercises 172 5.2 Markov chains 173 5.2.1 Stochastic processes 173 5.2.2 Transition matrices 174 5.2.3 Homogeneous processes 174 5.2.4 Markov chains 174 5.2.5 Canonical Markov chains 178 5.2.6 Exercises 181 5.3 Some characteristic parameters 187 5.3.1 Steps for a first visit 187 5.3.2 Probability of (at least) r visits 189 5.3.3 Recurrent and transient states 191 5.3.4 Mean first passage time 193 5.3.5 Hitting time and hitting probabilities 195 5.3.6 Exercises 198 5.4 Finite stochastic matrices 201 5.4.1 Canonical representation 201 5.4.2 States classification 203 5.4.3 Exercises 205 5.5 Regular stochastic matrices 206 5.5.1 Iterated maps 206 5.5.2 Existence of fixed points 209 5.5.3 Regular stochastic matrices 210 5.5.4 Characteristic parameters 218 5.5.5 Exercises 220 5.6 Ergodic property 222 5.6.1 Number of steps between consecutive visits 222 5.6.2 Ergodic theorem 224 5.6.3 Powers of irreducible stochastic matrices 226 5.6.4 Markov chain Monte Carlo 228 5.7 Renewal theorem 233 5.7.1 Periodicity 233 5.7.2 Renewal theorem 234 5.7.3 Exercises 239
 6 An introduction to continuous time Markov chains 241 6.1 Poisson process 241 6.2 Continuous time Markov chains 246 6.2.1 Definitions 246 6.2.2 Continuous semigroups of stochastic matrices 248 6.2.3 Examples of right-continuous Markov chains 256 6.2.4 Holding times 259 Appendix A Power series 261 A.1 Basic properties 261 A.2 Product of series 263 A.3 Banach space valued power series 264 A.3.2 Exercises 267 Appendix B Measure and integration 270 B.1 Measures 270 B.1.1 Basic properties 270 B.1.2 Construction of measures 272 B.1.3 Exercises 279 B.2 Measurable functions and integration 279 B.2.1 Measurable functions 280 B.2.2 The integral 283 B.2.3 Properties of the integral 284 B.2.4 Cavalieri formula 286 B.2.5 Markov inequality 287 B.2.6 Null sets and the integral 287 B.2.7 Push forward of a measure 289 B.2.8 Exercises 290 B.3 Product measures and iterated integrals 294 B.3.1 Product measures 294 B.3.2 Reduction formulas 296 B.3.3 Exercises 297 B.4 Convergence theorems 298 B.4.1 Almost everywhere convergence 298 B.4.2 Strong convergence 300 B.4.3 Fatou lemma 301 B.4.4 Dominated convergence theorem 302 B.4.5 Absolute continuity of integrals 305 B.4.6 Differentiation of the integral 305 B.4.7 Weak convergence of measures 308 B.4.8 Exercises 312 Appendix C Systems of linear ordinary differential equations 313 C.1 Cauchy problem 313 C.1.1 Uniqueness 313 C.1.2 Existence 315 C.2 Efficient computation of e^Qt 317 C.2.1 Similarity methods 317 C.2.2 Putzer method 319 C.3 Continuous semigroups 321 References 324 Index 327.
 (source: Nielsen Book Data)
 Online

 dx.doi.org Wiley Online Library
 Google Books (Full view)
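Chapter 5 of the table of contents above treats regular stochastic matrices and their fixed points. As a minimal sketch of that idea (the transition matrix below is made up, not from the book): for a regular matrix, the powers P^n converge to a matrix whose rows all equal the stationary distribution, the fixed point of the chain.

```python
import numpy as np

# A regular row-stochastic transition matrix for a 2-state chain (made-up)
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Iterating the matrix: each row of P^50 approximates the stationary
# distribution pi, which satisfies pi @ P = pi and sums to 1.
pi = np.linalg.matrix_power(P, 50)[0]

print(pi)                       # close to [5/6, 1/6]
print(np.allclose(pi @ P, pi))  # pi is (numerically) a fixed point
```

For this matrix the fixed point can be checked by hand: pi = (5/6, 1/6) solves pi P = pi.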
 First edition  Boca Raton, FL : CRC Press, an imprint of Taylor and Francis, 2001
 Description
 Book — 1 online resource (808 pages)
 Summary

 Markov processes and their applications; semimartingale theory and stochastic calculus; white noise theory; stochastic differential equations and its applications; large deviations and applications; a brief introduction to numerical analysis of (ordinary) stochastic differential equations without tears; stochastic differential games and applications; stability and stabilizing control of stochastic systems; stochastic approximation: theory and applications; stochastic manufacturing systems; optimization by stochastic methods; stochastic control methods in asset pricing.
 (source: Nielsen Book Data)
 Koroliouk, Dmitri.
 London : ISTE, Ltd. ; Hoboken : Wiley, 2020.
 Description
 Book — 1 online resource (229 p.)
 Summary

 Preface ix
 List of Abbreviations xi
 Introduction xiii
 Chapter 1 Statistical Experiments 1
 1.1 Statistical experiments with linear regression 1
 1.1.1 Basic definitions 1
 1.1.2 Difference evolution equations 3
 1.1.3 The equilibrium state 4
 1.1.4 Stochastic difference equations 7
 1.1.5 Convergence to the equilibrium state 9
 1.1.6 Normal approximation of the stochastic component 11
 1.2 Binary SEs with nonlinear regression 13
 1.2.1 Basic assumptions 13
 1.2.2 Equilibrium 15
 1.2.3 Stochastic difference equations 17
 1.2.4 Convergence to the equilibrium state 18
 1.2.5 Normal approximation of the stochastic component 20
 1.3 Multivariate statistical experiments 22
 1.3.1 Regression function of increments 22
 1.3.2 The equilibrium state of multivariate EPs 25
 1.3.3 Stochastic difference equations 26
 1.3.4 Convergence to the equilibrium state 28
 1.3.5 Normal approximation of the stochastic component 29
 1.4 SEs with Wright-Fisher normalization 31
 1.4.1 Binary RFs 31
 1.4.2 Multivariate RFIs 33
 1.5 Exponential statistical experiments 35
 1.5.1 Binary ESEs 36
 1.5.2 Steady regime of ESEs 37
 1.5.3 Approximation of ESEs by geometric Brownian motion 38
 Chapter 2 Diffusion Approximation of Statistical Experiments in Discrete-Continuous Time 43
 2.1 Binary DMPs 44
 2.1.1 DMPs in discrete-continuous time 45
 2.1.2 Justification of diffusion approximation 47
 2.2 Multivariate DMPs in discrete-continuous time 51
 2.2.1 Evolutionary DMPs in discrete-continuous time 52
 2.2.2 SDEs for the DMP in discrete-continuous time 53
 2.2.3 Diffusion approximation of DMPs in discrete-continuous time 55
 2.3 A DMP in an MRE 58
 2.3.1 Discrete and continuous MRE 58
 2.3.2 Proof of limit theorems 2.3.1 and 2.3.2 62
 2.4 The DMPs in a balanced MRE 65
 2.4.1 Basic assumptions 66
 2.4.2 Proof of limit theorem 2.4.1 70
 2.5 Adapted SEs 74
 2.5.1 Bernoulli approximation of the SE stochastic component 75
 2.5.2 Adapted SEs 77
 2.5.3 Adapted SEs in a series scheme 79
 2.6 DMPs in an asymptotical diffusion environment 84
 2.6.1 Asymptotic diffusion perturbation 85
 2.7 A DMP with ASD 91
 2.7.1 Asymptotically small diffusion 91
 2.7.2 EGs of DMP 94
 2.7.3 AF of DMPs 97
 Chapter 3 Statistics of Statistical Experiments 103
 3.1 Parameter estimation of onedimensional stationary SEs 103
 3.1.1 Stationarity 103
 3.1.2 Covariance statistics 108
 3.1.3 A priori statistics 110
 3.1.4 Optimal estimating function 111
 3.1.5 Stationary Gaussian SEs 114
 3.2 Parameter estimators for multivariate stationary SEs 115
 3.2.1 Vector difference SDEs and stationarity conditions 116
 3.2.2 Optimal estimating function 118
 3.2.3 Stationary Gaussian Markov SEs 119
 3.3 Estimates of continuous process parameters 122
 3.3.1 Diffusiontype processes 122
 3.3.2 Estimation of a continuous parameter 123
 3.4 Classification of EPs 124
 3.4.1 Basic assumption 125
 3.4.2 Classification of EPs 126
 3.4.3 Justification of EP models classification 127
 3.4.4 Proof of Theorem 3.4.1 129
 3.4.5 Interpretation of EPs 133
 3.4.6 Interpretation of EPs in models of collective behavior 138
 3.5 Classification of SEs 139
 3.5.1 The SA of SEs 139
 3.5.2 Classifiers 140
 3.5.3 Classification of SEs 142
 3.6 Evolutionary model of ternary SEs 144
 3.6.1 Basic assumptions 144
 3.6.2 The model interpretation and analysis 146
 3.7 Equilibrium states in the dynamics of ternary SEs 149
 3.7.1 Building a model 149
 3.7.2 The equilibrium state and fluctuations 150
 3.7.3 Classification of TSEs 151
 Chapter 4 Modeling and Numerical Analysis of Statistical Experiments 153
 4.1 Numerical verification of generic model 153
 4.1.1 Evolutionary processes with linear and nonlinear RFIs 153
 4.1.2 Generic model of trajectory generation 156
 4.2 Numerical verification of DMD 158
 4.2.1 Simulation of DMD trajectories 158
 4.2.2 Estimation of DMD parameters 163
 4.3 DMD and modeling of the dynamics of macromolecules in biophysics 167
 4.3.1 The model motivation 168
 4.3.2 Statistical model of a stationary DMD 169
 4.3.3 Stokes-Einstein kinetic diffusion model 171
 4.3.4 Verification of the model of stationary DMD by direct numerical simulation 172
 4.3.5 Numerical verification of DMD characteristics using the model of Stokes-Einstein 173
 4.3.6 The ability of the DMD model to detect the proportion of fast and slow particles 176
 4.3.7 Interpretation of the mixes of Brownian motions 182
 References 189
 Index 193.
 (source: Nielsen Book Data)
 Théorie des sondages. English
 Tillé, Yves.
 Hoboken, NJ : Wiley, 2020.
 Description
 Book — 1 online resource (450 p.).
 Summary

 A History of Ideas in Survey Sampling Theory
 Population, Sample, and Estimation
 Simple and Systematic Designs
 Stratification
 Sampling with Unequal Probabilities
 Balanced Sampling
 Cluster and Two-stage Sampling
 Other Topics on Sampling
 Estimation with a Quantitative Auxiliary Variable
 Post-Stratification and Calibration on Marginal Totals
 Multiple Regression Estimation
 Calibration Estimation
 Model-Based Approach
 Estimation of Complex Parameters
 Variance Estimation by Linearization
 Treatment of Nonresponse
 Summary Solutions to the Exercises.
(source: Nielsen Book Data)
This comprehensive text takes a critical look at the modern development of the theory of survey sampling as well as the foundations of survey sampling, and explains how to put this theory into practice. The treatment of nonsampling errors is featured, and a range of other topics, from the problems of coverage to the treatment of nonresponse, is explored. Real examples, applications, and a large set of exercises are also provided.
(source: Nielsen Book Data)
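The chapter list above includes "Sampling with Unequal Probabilities", whose central tool is the Horvitz-Thompson estimator of a population total. A minimal Python sketch with made-up sample values and inclusion probabilities (not an example from the book):

```python
def horvitz_thompson_total(y, pi):
    """Estimate a population total as the sum of y_i / pi_i over the
    sample, where pi_i is unit i's (known) inclusion probability."""
    return sum(yi / pii for yi, pii in zip(y, pi))

# Sampled values and their inclusion probabilities (illustrative numbers)
y = [10.0, 4.0, 6.0]
pi = [0.5, 0.2, 0.25]
print(horvitz_thompson_total(y, pi))  # -> 64.0
```

Weighting each observation by 1/pi_i makes the estimator unbiased whenever every population unit has a strictly positive inclusion probability.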
 Luz, Maksym, author.
 London, UK : ISTE, Ltd. ; Hoboken, NJ : Wiley, 2019.
 Description
 Book — 1 online resource.
 Summary

 Stationary Increments of Discrete Time Stochastic Processes: Spectral Representation
 Extrapolation Problem for Stochastic Sequences with Stationary nth Increments
 Interpolation Problem for Stochastic Sequences with Stationary nth Increments
 Extrapolation Problem for Stochastic Sequences with Stationary nth Increments Based on Observations with Stationary Noise
 Interpolation Problem for Stochastic Sequences with Stationary nth Increments Based on Observations with Stationary Noise
 Filtering Problem of Stochastic Sequences with Stationary nth Increments Based on Observations with Stationary Noise
 Interpolation Problem for Stochastic Sequences with Stationary nth Increments Observed with Nonstationary Noise
 Filtering Problem for Stochastic Sequences with Stationary nth Increments Observed with Nonstationary Noise
 Stationary Increments of Continuous Time Stochastic Processes: Spectral Representation
 Extrapolation Problem for Stochastic Processes with Stationary nth Increments
 Interpolation Problem for Stochastic Processes with Stationary nth Increments
 Filtering Problem for Stochastic Processes with Stationary nth Increments
 Problems to Solve
 Elements of Convex Optimization.
(source: Nielsen Book Data)
 Girko, V. L., author.
 Reprint 2018  Berlin ; Boston : De Gruyter, [2018]
 Description
 Book — 1 online resource (699 p.) Digital: text file; PDF.
 Summary

 Frontmatter
 CONTENTS
 List of basic notations and assumptions
 Preface and some historical remarks
 Chapter 1. Introduction to the theory of sample matrices of fixed dimension
 Chapter 2. Canonical equations
 Chapter 3. The First Law for the eigenvalues and eigenvectors of random symmetric matrices
 Chapter 4. The Second Law for the singular values and eigenvectors of random matrices. Inequalities for the spectral radius of large random matrices
 Chapter 5. The Third Law for the eigenvalues and eigenvectors of empirical covariance matrices
 Chapter 6. The first proof of the Strong Circular Law
 Chapter 7. Strong Law for normalized spectral functions of nonselfadjoint random matrices with independent row vectors and simple rigorous proof of the Strong Circular Law
 Chapter 8. Rigorous proof of the Strong Elliptic Law
 Chapter 9. The Circular and Uniform Laws for eigenvalues of random nonsymmetric complex matrices with independent entries
 Chapter 10. Strong VLaw for eigenvalues of nonsymmetric random matrices
 Chapter 11. Convergence rate of the expected spectral functions of symmetric random matrices is equal to O(n^(-1/2))
 Chapter 12. Convergence rate of expected spectral functions of the sample covariance matrix R̂_(m_n) is equal to O(n^(-1/2)) under the condition m_n n^(-1) ≤ c < 1
 Chapter 13. The First Spacing Law for random symmetric matrices
 Chapter 14. Ten years of General Statistical Analysis (The main G-estimators of General Statistical Analysis)
 References
 Index
(source: Nielsen Book Data)
 Gzyl, Henryk, 1946- author.
 Berlin ; Boston : De Gruyter, [2018]
 Description
 Book — 1 online resource (210 p). Digital: text file; PDF.
 Summary

 Frontmatter
 Preface
 Contents
 1 Introduction
 2 Frequency models
 3 Individual severity models
 4 Some detailed examples
 5 Some traditional approaches to the aggregation problem
 6 Laplace transforms and fractional moment problems
 7 The standard maximum entropy method
 8 Extensions of the method of maximum entropy
 9 Superresolution in maxentropic Laplace transform inversion
 10 Sample data dependence
 11 Disentangling frequencies and decompounding losses
 12 Computations using the maxentropic density
 13 Review of statistical procedures
 Index
 Bibliography
(source: Nielsen Book Data)
 Gupta, A. K.
 Boca Raton : Chapman and Hall/CRC, 2018
 Description
 Book — 1 online resource (385 p).
 Summary

 PRELIMINARIES: Matrix Algebra Jacobians of Transformations Integration Zonal Polynomials Hypergeometric Functions of Matrix Argument Laguerre Polynomials Generalized Hermite Polynomials Notion of Random Matrix Problems
 MATRIX VARIATE NORMAL DISTRIBUTION: Density Function Properties Singular Matrix Variate Normal Distribution Symmetric Matrix Variate Normal Distribution Restricted Matrix Variate Normal Distribution Matrix Variate Q-Generalized Normal Distribution
 WISHART DISTRIBUTION: Introduction Density Function Properties Inverted Wishart Distribution Noncentral Wishart Distribution Matrix Variate Gamma Distribution Approximations
 MATRIX VARIATE t-DISTRIBUTION: Density Function Properties Inverted Matrix Variate t-Distribution Disguised Matrix Variate t-Distribution Restricted Matrix Variate t-Distribution Noncentral Matrix Variate t-Distribution Distribution of Quadratic Forms
 MATRIX VARIATE BETA DISTRIBUTIONS: Density Functions Properties Related Distributions Noncentral Matrix Variate Beta Distribution
 MATRIX VARIATE DIRICHLET DISTRIBUTIONS: Density Functions Properties Related Distributions Noncentral Matrix Variate Dirichlet Distributions
 DISTRIBUTION OF MATRIX QUADRATIC FORMS: Density Function Properties Functions of Quadratic Forms Series Representation of the Density Noncentral Density Function Expected Values Wishartness and Independence of Quadratic Forms of the Type XAX' Wishartness and Independence of Quadratic Forms of the Type XAX' + 1/2(LX' + XL') + C Wishartness and Independence of Quadratic Forms of the Type XAX' + L1X' + XL2' + C
 MISCELLANEOUS DISTRIBUTIONS: Uniform Distribution on Stiefel Manifold Von Mises-Fisher Distribution Bingham Matrix Distribution Generalized Bingham-Von Mises Matrix Distribution Manifold Normal Distribution Matrix Angular Central Gaussian Distribution Bimatrix Wishart Distribution Beta-Wishart Distribution Confluent Hypergeometric Function Kind 1 Distribution Confluent Hypergeometric Function Kind 2 Distribution Hypergeometric Function Distributions Generalized Hypergeometric Function Distributions Complex Matrix Variate Distributions
 GENERAL FAMILIES OF MATRIX VARIATE DISTRIBUTIONS: Matrix Variate Liouville Distributions Matrix Variate Spherical Distributions Matrix Variate Elliptically Contoured Distributions Orthogonally Invariant and Residual Independent Matrix Distributions
 GLOSSARY REFERENCES SUBJECT INDEX
 Each chapter also includes an Introduction and Problems.
 (source: Nielsen Book Data)
12. Optimal learning [electronic resource] [2012]
 Powell, Warren B., 1955-
 Hoboken, New Jersey : Wiley, ©2012.
 Description
 Book — xix, 384 pages : illustrations ; 24 cm.
 Summary

 Preface xv Acknowledgments xix
 1 The challenges of learning 1 1.1 Learning the best path 2 1.2 Areas of application 4 1.3 Major problem classes 12 1.4 The different types of learning 13 1.5 Learning from different communities 16 1.6 Information collection using decision trees 18 1.6.1 A basic decision tree 18 1.6.2 Decision tree for offline learning 20 1.6.3 Decision tree for online learning 21 1.6.4 Discussion 25 1.7 Website and downloadable software 26 1.8 Goals of this book 26 Problems 28
 2 Adaptive learning 31 2.1 The frequentist view 32 2.2 The Bayesian view 33 2.2.1 The updating equations for independent beliefs 34 2.2.2 The expected value of information 36 2.2.3 Updating for correlated normal priors 38 2.2.4 Bayesian updating with an uninformative prior 41 2.3 Updating for non-Gaussian priors 42 2.3.1 The gamma-exponential model 43 2.3.2 The gamma-Poisson model 44 2.3.3 The Pareto-uniform model 45 2.3.4 Models for learning probabilities* 46 2.3.5 Learning an unknown variance* 49 2.4 Monte Carlo simulation 51 2.5 Why does it work?* 54 2.5.1 Derivation of ~ 54 2.5.2 Derivation of Bayesian updating equations for independent beliefs 55 2.6 Bibliographic notes 57 Problems 57
 3 The economics of information 61 3.1 An elementary information problem 61 3.2 The marginal value of information 65 3.3 An information acquisition problem 68 3.4 Bibliographic notes 70 Problems 70
 4 Ranking and selection 71 4.1 The model 72 4.2 Measurement policies 75 4.2.1 Deterministic vs. sequential policies 75 4.2.2 Optimal sequential policies 76 4.2.3 Heuristic policies 77 4.3 Evaluating policies 81 4.4 More advanced topics* 83 4.4.1 An alternative representation of the probability space 83 4.4.2 Equivalence of using true means and sample estimates 84 4.5 Bibliographic notes 85 Problems 85
 5 The knowledge gradient 89 5.1 The knowledge gradient for independent beliefs 90 5.1.1 Computation 91 5.1.2 Some properties of the knowledge gradient 93 5.1.3 The four distributions of learning 94 5.2 The value of information and the S-curve effect 95 5.3 Knowledge gradient for correlated beliefs 98 5.4 The knowledge gradient for some non-Gaussian distributions 103 5.4.1 The gamma-exponential model 104 5.4.2 The gamma-Poisson model 107 5.4.3 The Pareto-uniform model 108 5.4.4 The beta-Bernoulli model 109 5.4.5 Discussion 111 5.5 Relatives of the knowledge gradient 112 5.5.1 Expected improvement 113 5.5.2 Linear loss* 114 5.6 Other issues 116 5.6.1 Anticipatory vs. experiential learning 117 5.6.2 The problem of priors 118 5.6.3 Discussion 120 5.7 Why does it work?* 121 5.7.1 Derivation of the knowledge gradient formula 121 5.8 Bibliographic notes 125 Problems 126
 6 Bandit problems 139 6.1 The theory and practice of Gittins indices 141 6.1.1 Gittins indices in the beta-Bernoulli model 142 6.1.2 Gittins indices in the normal-normal model 145 6.1.3 Approximating Gittins indices 147 6.2 Variations of bandit problems 148 6.3 Upper confidence bounding 149 6.4 The knowledge gradient for bandit problems 151 6.4.1 The basic idea 151 6.4.2 Some experimental comparisons 153 6.4.3 Non-normal models 156 6.5 Bibliographic notes 157 Problems 157
 7 Elements of a learning problem 163 7.1 The states of our system 164 7.2 Types of decisions 166 7.3 Exogenous information 167 7.4 Transition functions 168 7.5 Objective functions 168 7.5.1 Designing versus controlling 168 7.5.2 Measurement costs 170 7.5.3 Objectives 170 7.6 Evaluating policies 175 7.7 Discussion 177 7.8 Bibliographic notes 178 Problems 178
 8 Linear belief models 181 8.1 Applications 182 8.1.1 Maximizing ad clicks 182 8.1.2 Dynamic pricing 184 8.1.3 Housing loans 184 8.1.4 Optimizing dose response 185 8.2 A brief review of linear regression 186 8.2.1 The normal equations 186 8.2.2 Recursive least squares 187 8.2.3 A Bayesian interpretation 188 8.2.4 Generating a prior 189 8.3 The knowledge gradient for a linear model 191 8.4 Application to drug discovery 192 8.5 Application to dynamic pricing 196 8.6 Bibliographic notes 200 Problems 200
 9 Subset selection problems 203 9.1 Applications 205 9.2 Choosing a subset using ranking and selection 206 9.2.1 Setting prior means and variances 207 9.2.2 Two strategies for setting prior covariances 208 9.3 Larger sets 209 9.3.1 Using simulation to reduce the problem size 210 9.3.2 Computational issues 212 9.3.3 Experiments 213 9.4 Very large sets 214 9.5 Bibliographic notes 216 Problems 216
 10 Optimizing a scalar function 219 10.1 Deterministic measurements 219 10.2 Stochastic measurements 223 10.2.1 The model 223 10.2.2 Finding the posterior distribution 224 10.2.3 Choosing the measurement 226 10.2.4 Discussion 229 10.3 Bibliographic notes 229 Problems 229
 11 Optimal bidding 231 11.1 Modeling customer demand 233 11.1.1 Some valuation models 233 11.1.2 The logit model 234 11.2 Bayesian modeling for dynamic pricing 237 11.2.1 A conjugate prior for choosing between two demand curves 237 11.2.2 Moment matching for non-conjugate problems 239 11.2.3 An approximation for the logit model 242 11.3 Bidding strategies 244 11.3.1 An idea from multi-armed bandits 245 11.3.2 Bayes-greedy bidding 245 11.3.3 Numerical illustrations 247 11.4 Why does it work?* 251 11.4.1 Moment matching for Pareto prior 251 11.4.2 Approximating the logistic expectation 252 11.5 Bibliographic notes 253 Problems 254
 12 Stopping problems 255 12.1 Sequential probability ratio test 255 12.2 The secretary problem 260 12.2.1 Setup 261 12.2.2 Solution 263 12.3 Bibliographic notes 266 Problems 266
 13 Active learning in statistics 269 13.1 Deterministic policies 270 13.2 Sequential policies for classification 274 13.2.1 Uncertainty sampling 274 13.2.2 Query by committee 275 13.2.3 Expected error reduction 276 13.3 A variance minimizing policy 277 13.4 Mixtures of Gaussians 279 13.4.1 Estimating parameters 280 13.4.2 Active learning 281 13.5 Bibliographic notes 283
 14 Simulation optimization 285 14.1 Indifference zone selection 287 14.1.1 Batch procedures 288 14.1.2 Sequential procedures 290 14.1.3 The 0-1 procedure: connection to linear loss 291 14.2 Optimal computing budget allocation 292 14.2.1 Indifference-zone version 293 14.2.2 Linear loss version 294 14.2.3 When does it work? 295 14.3 Model-based simulated annealing 296 14.4 Other areas of simulation optimization 298 14.5 Bibliographic notes 299
 15 Learning in mathematical programming 301 15.1 Applications 303 15.1.1 Piloting a hot air balloon 303 15.1.2 Optimizing a portfolio 308 15.1.3 Network problems 309 15.1.4 Discussion 313 15.2 Learning on graphs 313 15.3 Alternative edge selection policies 316 15.4 Learning costs for linear programs* 317 15.5 Bibliographic notes 324
 16 Optimizing over continuous measurements 325 16.1 The belief model 327 16.1.1 Updating equations 328 16.1.2 Parameter estimation 330 16.2 Sequential kriging optimization 332 16.3 The knowledge gradient for continuous parameters* 334 16.3.1 Maximizing the knowledge gradient 334 16.3.2 Approximating the knowledge gradient 335 16.3.3 The gradient of the knowledge gradient 336 16.3.4 Maximizing the knowledge gradient 338 16.3.5 The KGCP policy 339 16.4 Efficient global optimization 340 16.5 Experiments 341 16.6 Extension to higher dimensional problems 342 16.7 Bibliographic notes 343
 17 Learning with a physical state 345 17.1 Introduction to dynamic programming 347 17.1.1 Approximate dynamic programming 348 17.1.2 The exploration vs. exploitation problem 350 17.1.3 Discussion 351 17.2 Some heuristic learning policies 352 17.3 The local bandit approximation 353 17.4 The knowledge gradient in dynamic programming 355 17.4.1 Generalized learning using basis functions 355 17.4.2 The knowledge gradient 358 17.4.3 Experiments 361 17.5 An expected improvement policy 363 17.6 Bibliographic notes 364 Index 379.
 (source: Nielsen Book Data)
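The secretary problem named in Chapter 12 of the contents above lends itself to a short simulation. A minimal sketch, not code from the book, assuming the classical formulation (candidates arrive in random order, only relative ranks are observed, and rejections are final):

```python
import random

def secretary_trial(n, k, rng):
    """One trial of the classical stopping rule: skip the first k
    candidates, then accept the first candidate better than every
    candidate seen so far. Returns True iff the overall best is chosen."""
    ranks = list(range(n))  # 0 = best candidate
    rng.shuffle(ranks)
    best_seen = min(ranks[:k]) if k > 0 else n
    for r in ranks[k:]:
        if r < best_seen:
            return r == 0  # accepted this candidate
    return False  # the best was in the skipped prefix, so we never accept

def success_rate(n, k, trials=20000, seed=1):
    rng = random.Random(seed)
    return sum(secretary_trial(n, k, rng) for _ in range(trials)) / trials

# The classical result: skipping roughly n/e candidates succeeds with
# probability approaching 1/e ~ 0.368 as n grows.
rate = success_rate(n=50, k=18)  # 50/e is about 18.4
```

Running this for n = 50 gives a success rate close to the 1/e bound, which is the kind of result derived analytically in Section 12.2.2.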
 Online

 dx.doi.org Wiley Online Library
 Google Books (Full view)
13. Regression analysis by example [2012]
 Chatterjee, Samprit, 1938-
 Fifth edition.  Hoboken, New Jersey : Wiley, 2012.
 Description
 Book — xv, 393 p.
 Summary

 Preface xiv
 1 Introduction 1 1.1 What Is Regression Analysis? 1 1.2 Publicly Available Data Sets 2 1.3 Selected Applications of Regression Analysis 3 1.4 Steps in Regression Analysis 13 1.5 Scope and Organization of the Book 21 Exercises 23
 2 Simple Linear Regression 25 2.1 Introduction 25 2.2 Covariance and Correlation Coefficient 25 2.3 Example: Computer Repair Data 30 2.4 The Simple Linear Regression Model 32 2.5 Parameter Estimation 33 2.6 Tests of Hypotheses 36 2.7 Confidence Intervals 41 2.8 Predictions 41 2.9 Measuring the Quality of Fit 43 2.10 Regression Line Through the Origin 46 2.11 Trivial Regression Models 48 2.12 Bibliographic Notes 49 Exercises 49
 3 Multiple Linear Regression 57 3.1 Introduction 57 3.2 Description of the Data and Model 57 3.3 Example: Supervisor Performance Data 58 3.4 Parameter Estimation 61 3.5 Interpretations of Regression Coefficients 62 3.6 Centering and Scaling 64 3.7 Properties of the Least Squares Estimators 67 3.8 Multiple Correlation Coefficient 68 3.9 Inference for Individual Regression Coefficients 69 3.10 Tests of Hypotheses in a Linear Model 71 3.11 Predictions 81 3.12 Summary 82 Exercises 82 Appendix: Multiple Regression in Matrix Notation 89
 4 Regression Diagnostics: Detection of Model Violations 93 4.1 Introduction 93 4.2 The Standard Regression Assumptions 94 4.3 Various Types of Residuals 96 4.4 Graphical Methods 98 4.5 Graphs Before Fitting a Model 101 4.6 Graphs After Fitting a Model 105 4.7 Checking Linearity and Normality Assumptions 105 4.8 Leverage, Influence, and Outliers 106 4.9 Measures of Influence 111 4.10 The Potential-Residual Plot 115 4.11 What to Do with the Outliers? 116 4.12 Role of Variables in a Regression Equation 117 4.13 Effects of an Additional Predictor 122 4.14 Robust Regression 123 Exercises 123
 5 Qualitative Variables as Predictors 129 5.1 Introduction 129 5.2 Salary Survey Data 130 5.3 Interaction Variables 133 5.4 Systems of Regression Equations 136 5.5 Other Applications of Indicator Variables 147 5.6 Seasonality 148 5.7 Stability of Regression Parameters Over Time 149 Exercises 151
 6 Transformation of Variables 163 6.1 Introduction 163 6.2 Transformations to Achieve Linearity 165 6.3 Bacteria Deaths Due to X-Ray Radiation 167 6.4 Transformations to Stabilize Variance 171 6.5 Detection of Heteroscedastic Errors 176 6.6 Removal of Heteroscedasticity 178 6.7 Weighted Least Squares 179 6.8 Logarithmic Transformation of Data 180 6.9 Power Transformation 181 6.10 Summary 185 Exercises 186
 7 Weighted Least Squares 191 7.1 Introduction 191 7.2 Heteroscedastic Models 192 7.3 Two-Stage Estimation 195 7.4 Education Expenditure Data 197 7.5 Fitting a Dose-Response Relationship Curve 206 Exercises 208
 8 The Problem of Correlated Errors 209 8.1 Introduction: Autocorrelation 209 8.2 Consumer Expenditure and Money Stock 210 8.3 Durbin-Watson Statistic 212 8.4 Removal of Autocorrelation by Transformation 214 8.5 Iterative Estimation With Autocorrelated Errors 216 8.6 Autocorrelation and Missing Variables 217 8.7 Analysis of Housing Starts 218 8.8 Limitations of Durbin-Watson Statistic 222 8.9 Indicator Variables to Remove Seasonality 223 8.10 Regressing Two Time Series 226 Exercises 228
 9 Analysis of Collinear Data 233 9.1 Introduction 233 9.2 Effects of Collinearity on Inference 234 9.3 Effects of Collinearity on Forecasting 240 9.4 Detection of Collinearity 245 Exercises 254
 10 Working With Collinear Data 259 10.1 Introduction 259 10.2 Principal Components 259 10.3 Computations Using Principal Components 263 10.4 Imposing Constraints 263 10.5 Searching for Linear Functions of the β's 267 10.6 Biased Estimation of Regression Coefficients 272 10.7 Principal Components Regression 272 10.8 Reduction of Collinearity in the Estimation Data 274 10.9 Constraints on the Regression Coefficients 276 10.10 Principal Components Regression: A Caution 277 10.11 Ridge Regression 280 10.12 Estimation by the Ridge Method 281 10.13 Ridge Regression: Some Remarks 285 10.14 Summary 287 10.15 Bibliographic Notes 288 Exercises 288 Appendix 10.A: Principal Components 291 Appendix 10.B: Ridge Regression 294 Appendix 10.C: Surrogate Ridge Regression 297
 11 Variable Selection Procedures 299 11.1 Introduction 299 11.2 Formulation of the Problem 300 11.3 Consequences of Variables Deletion 300 11.4 Uses of Regression Equations 302 11.5 Criteria for Evaluating Equations 303 11.6 Collinearity and Variable Selection 306 11.7 Evaluating All Possible Equations 306 11.8 Variable Selection Procedures 307 11.9 General Remarks on Variable Selection Methods 309 11.10 A Study of Supervisor Performance 310 11.11 Variable Selection With Collinear Data 314 11.12 The Homicide Data 314 11.13 Variable Selection Using Ridge Regression 317 11.14 Selection of Variables in an Air Pollution Study 318 11.15 A Possible Strategy for Fitting Regression Models 326 11.16 Bibliographic Notes 327 Exercises 328 Appendix: Effects of Incorrect Model Specifications 332
 12 Logistic Regression 335 12.1 Introduction 335 12.2 Modeling Qualitative Data 336 12.3 The Logit Model 336 12.4 Example: Estimating Probability of Bankruptcies 338 12.5 Logistic Regression Diagnostics 341 12.6 Determination of Variables to Retain 342 12.7 Judging the Fit of a Logistic Regression 345 12.8 The Multinomial Logit Model 347 12.8.1 Multinomial Logistic Regression 347 12.9 Classification Problem: Another Approach 354 Exercises 355
 13 Further Topics 359 13.1 Introduction 359 13.2 Generalized Linear Model 359 13.3 Poisson Regression Model 360 13.4 Introduction of New Drugs 361 13.5 Robust Regression 363 13.6 Fitting a Quadratic Model 364 13.7 Distribution of PCB in U.S. Bays 366 Exercises 370 Appendix A: Statistical Tables 371 References 381 Index 389.
 (source: Nielsen Book Data)
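The parameter estimates of Chapter 2 of the Chatterjee table of contents follow from two closed-form expressions. A minimal sketch in Python, not code from the book (the variable names and the small data set are mine):

```python
def simple_ols(x, y):
    """Least-squares fit of y = b0 + b1*x:
    b1 = cov(x, y) / var(x), b0 = ybar - b1 * xbar."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sxy / sxx
    b0 = ybar - b1 * xbar
    # R^2, the quality-of-fit measure of Section 2.9
    sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    sst = sum((yi - ybar) ** 2 for yi in y)
    return b0, b1, 1 - sse / sst

b0, b1, r2 = simple_ols([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
```

The same estimates fall out of the matrix form of the normal equations used in the book's Chapter 3 appendix; the scalar version above is just the one-predictor special case.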
 Online
Science Library (Li and Ma)
Stacks  QA278.2 .C5 2012
 Mead, R. (Roger)
 Cambridge : Cambridge University Press, 2012.
 Description
 Book — xiv, 572 pages : illustrations ; 27 cm.
 Summary

 1. Introduction
 2. Elementary ideas of blocking: the randomised complete block design
 3. Elementary ideas of treatment structure
 4. General principles of linear models for the analysis of experimental data
 5. Experimental units
 6. Replication
 7. Blocking and control
 8. Multiple blocking systems and crossover designs
 9. Multiple levels of information
 10. Randomisation
 11. Restricted randomisation
 12. Experimental objectives, treatments and treatment structures
 13. Factorial structure and particular forms of effects
 14. Fractional replication
 15. Incomplete block size for factorial experiments
 16. Quantitative factors and response functions
 17. Multifactorial designs for quantitative factors
 18. Split unit designs
 19. Multiple experiments and new variation
 20. Sequential aspects of experiments and experimental programmes
 21. Designing useful experiments.
 (source: Nielsen Book Data)
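The randomised complete block design of Chapter 2 above has a simple generative description: every block receives every treatment exactly once, with the order randomised independently within each block. A minimal sketch, not code from the book (function and treatment names are illustrative):

```python
import random

def rcbd_layout(treatments, n_blocks, seed=0):
    """Randomised complete block design: each block contains each
    treatment exactly once, in an independently randomised order."""
    rng = random.Random(seed)
    layout = []
    for _ in range(n_blocks):
        block = list(treatments)
        rng.shuffle(block)  # randomisation is restricted to within blocks
        layout.append(block)
    return layout

plan = rcbd_layout(["A", "B", "C", "D"], n_blocks=3)
```

Restricting randomisation to within blocks is exactly the "elementary idea of blocking" of Chapter 2: block-to-block variation is removed from treatment comparisons because every treatment appears once per block.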
Marine Biology Library (Miller)
Stacks  QA279 .M38825 2012
 Korolev, Victor Yu, author.
 Reprint 2011  Berlin ; Boston : De Gruyter, [2012]
 Description
 Book — 1 online resource (410 p). Digital: text file; PDF.
 Summary

 Frontmatter
 Contents
 Preface / Korolev, Victor / Skvortsova, Nina
 Ion-acoustic structural turbulence in low-temperature magnetised plasma / Skvortsova, N. N. / Petrov, A. E. / Sarksyan, K. A. / Kharchev, N. K.
 Structural plasma turbulence and anomalous non-Brownian diffusion / Skvortsova, N. N. / Batanov, G. M. / Petrov, A. E. / Pshenichnikov, A. A. / Sarksyan, K. A. / Kharchev, N. K. / Kholnov, Yu. V. / Bening, V. E. / Korolev, V. Yu. / Saenko, V. V. / Uchaikin, V. V.
 Low-frequency structural plasma turbulence in stellarators / Skvortsova, N. N. / Batanov, G. M. / Kolik, L. V. / Petrov, A. E. / Pshenichnikov, A. A. / Sarksyan, K. A. / Kharchev, N. K. / Kholnov, Yu. V. / Sanchez, J. / Estrada, T. / van Milligen, B. / Ohkubo, K. / Shimozuma, T. / Yoshimura, Y. / Kubo, S.
 New possibilities for the mathematical modelling of turbulent transport processes in plasma / Skvortsova, N. N. / Batanov, G. M. / Petrov, A. E. / Pshenichnikov, A. A. / Sarksyan, K. A. / Kharchev, N. K. / Korolev, V. Yu. / Maravina, T. A. / Sanchez, J. / Kubo, S.
 Multifractal statistics of edge plasma turbulence in fusion devices / Budaev, V. P.
 Analysis of experimental edge turbulence characteristic by simulation with stochastic numerical model / Urazbaev, A. O. / Vershkov, V. A. / Shelukhin, D. A. / Soldatov, S. V.
 Monte Carlo simulations of resonance radiation transport in plasma / Uchaikin, V. V. / Zakharov, A. Yu.
 Fractionally stable distributions / Bening, V. E. / Korolev, V. Yu. / Sukhorukova, T. A. / Gusarov, G. G. / Saenko, V. E. / Uchaikin, V. V. / Kolokoltsov, V. N.
 Statistical analysis of volatility of financial time series and turbulent plasmas by the method of moving separation of mixtures / Korolev, V. Yu. / Rey, M.
 Hidden Markov models of plasma turbulence / Borisov, A. V. / Korolev, V. Yu. / Stefanovich, A. I.
(source: Nielsen Book Data)
16. Selected Topics in Characteristic Functions [2011]
 Ushakov, Nikolai G., author.
 Reprint 2011  Berlin ; Boston : De Gruyter, [2011]
 Description
 Book — 1 online resource (364 p). Digital: text file; PDF.
 Summary

 Frontmatter
 Contents
 Preface
 Notation
 1. Basic properties of the characteristic functions
 2. Inequalities
 3. Empirical characteristic functions
 A. Examples
 B. Characteristic functions of some distributions
 C. Unsolved problems
 Bibliography
 Subject Index
 Author Index
(source: Nielsen Book Data)
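The empirical characteristic functions of Ushakov's Chapter 3 are a one-line estimator: replace the expectation in φ(t) = E[exp(itX)] with a sample average. A minimal sketch, not code from the book:

```python
import cmath

def ecf(sample, t):
    """Empirical characteristic function:
    phi_n(t) = (1/n) * sum_j exp(i * t * X_j)."""
    n = len(sample)
    return sum(cmath.exp(1j * t * x) for x in sample) / n

# Basic properties of characteristic functions (Chapter 1) carry over
# to the empirical version: phi_n(0) = 1, |phi_n(t)| <= 1, and
# phi_n(-t) is the complex conjugate of phi_n(t) for real data.
sample = [0.5, -1.2, 2.0, 0.1, -0.7]
```

The data here are arbitrary illustrative numbers; the point is that the estimator inherits the defining properties of a characteristic function at every sample size.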
 Eye, Alexander von.
 Hoboken, New Jersey : Wiley, [2013]
 Description
 Book — 1 online resource (xv, 450 p.) : ill.
 Summary

 Preface xiii Acknowledgments xvii
 1 Basics of Hierarchical Loglinear Models 1 1.1 Scaling: Which Variables Are Considered Categorical? 2 1.2 Crossing Two or More Variables 4 1.3 Goodman's Three Elementary Views 8 1.4 Assumptions Made for Loglinear Modeling 9
 2 Effects in a Table 13 2.1 The Null Model 13 2.2 The Row Effects-Only Model 15 2.3 The Column Effects-Only Model 15 2.4 The Row- and Column-Effects Model 16 2.5 Log-Linear Models 18
 3 Goodness-of-Fit 23 3.1 Goodness-of-Fit I: Overall Fit Statistics 23 3.2 Goodness-of-Fit II: R2 Equivalents and Information Criteria 30 3.3 Goodness-of-Fit III: Null Hypotheses Concerning Parameters 35 3.4 Goodness-of-Fit IV: Residual Analysis 36 3.5 The Relationship Between Pearson's X2 and Loglinear Modeling 52
 4 Hierarchical Loglinear Models and Odds Ratio Analysis 55 4.1 The Hierarchy of Loglinear Models 55 4.2 Comparing Hierarchically Related Models 57 4.3 Odds Ratios and Loglinear Models 63 4.4 Odds Ratios in Tables Larger than 2 x 2 65 4.5 Testing Null Hypotheses in Odds Ratio Analysis 70 4.6 Characteristics of the Odds Ratio 72 4.7 Application of the Odds Ratio 75 4.8 The Four Steps to Take When Loglinear Modeling 81 4.9 Collapsibility 86
 5 Computations I: Basic Loglinear Modeling 97 5.1 Loglinear Modeling in R 97 5.2 Loglinear Modeling in SYSTAT 102 5.3 Loglinear Modeling in LEM 106
 6 The Design Matrix Approach 111 6.1 The Generalized Linear Model (GLM) 111 6.2 Design Matrices: Coding 115
 7 Parameter Interpretation and Significance Tests 129 7.1 Parameter Interpretation Based on Design Matrices 130 7.2 The Two Sources of Parameter Correlation: Dependency of Vectors and Data Characteristics 139 7.3 Can Main Effects Be Interpreted? 143 7.4 Interpretation of Higher Order Interactions 150
 8 Computations II: Design Matrices and Poisson GLM 157 8.1 GLM-based Loglinear Modeling in R 157 8.2 Design Matrices in SYSTAT 164 8.3 Loglinear Modeling with Design Matrices in LEM 170
 9 Nonhierarchical and Nonstandard Loglinear Models 181 9.1 Defining Nonhierarchical and Nonstandard Loglinear Models 182 9.2 Virtues of Nonhierarchical and Nonstandard Loglinear Models 182 9.3 Scenarios for Nonstandard Loglinear Models 184 9.4 Nonstandard Scenarios: Summary and Discussion 240 9.5 Schuster's Approach to Parameter Interpretation 242
 10 Computations III: Nonstandard Models 251 10.1 Non-Hierarchical and Nonstandard Models in R 251 10.2 Estimating Non-Hierarchical and Nonstandard Models with SYSTAT 256 10.3 Estimating Non-Hierarchical and Nonstandard Models with LEM 265
 11 Sampling Schemes and Chi-Square Decomposition 273 11.1 Sampling Schemes 273 11.2 Chi-Square Decomposition 276
 12 Symmetry Models 289 12.1 Axial Symmetry 289 12.2 Point-Symmetry 294 12.3 Point-Axial Symmetry 295 12.4 Symmetry in Higher-Dimensional Cross-Classifications 296 12.5 Quasi-Symmetry 298 12.6 Extensions and Other Symmetry Models 301 12.7 Marginal Homogeneity: Symmetry in the Marginals 305
 13 Loglinear Models of Rater Agreement 309 13.1 Measures of Rater Agreement in Contingency Tables 309 13.2 The Equal Weight Agreement Model 313 13.3 The Differential Weight Agreement Model 315 13.4 Agreement in Ordinal Variables 316 13.5 Extensions of Rater Agreement Models 319
 14 Homogeneity of Associations 327 14.1 The Mantel-Haenszel and Breslow-Day Tests 327 14.2 Loglinear Models to Test Homogeneity of Associations 330 14.3 Extensions and Generalizations 335
 15 Logistic Regression and Logit Models 339 15.1 Logistic Regression 339 15.2 Loglinear Representation of Logistic Regression Models 344 15.3 Overdispersion in Logistic Regression 347 15.4 Logistic Regression Versus Loglinear Modeling Modules 349 15.5 Logit Models and Discriminant Analysis 351 15.6 Path Models 357
 16 Reduced Designs 363 16.1 Fundamental Principles for Factorial Design 364 16.2 The Resolution Level of a Design 365 16.3 Sample Fractional Factorial Designs 368
 17 Computations IV: Additional Models 379 17.1 Additional Loglinear Models in R 379 17.2 Additional Loglinear Models in SYSTAT 388 17.3 Additional Loglinear Models in LEM 404 References 417.
 (source: Nielsen Book Data)
Over the past ten years, there have been many important advances in loglinear modeling, including the specification of new models, in particular nonstandard models, and their relationships to methods such as Rasch modeling. While most literature on the topic is contained in volumes aimed at advanced statisticians, Applied Log-Linear Modeling presents the topic in an accessible style that is customized for applied researchers who use loglinear modeling in the social sciences. The book begins by providing readers with a foundation in the basics of loglinear modeling, introducing decomposing effects in cross-tabulations and goodness-of-fit tests. Popular hierarchical loglinear models are illustrated using empirical data examples, and odds ratio analysis is discussed as an interesting method of analysis of cross-tabulations. Next, readers are introduced to the design matrix approach to loglinear modeling, presenting various forms of coding (effects coding, dummy coding, Helmert contrasts, etc.) and the characteristics of design matrices. The book goes on to explore nonhierarchical and nonstandard loglinear models, outlining ten nonstandard loglinear models (including nonstandard nested models, models with quantitative factors, logit models, and loglinear Rasch models) as well as special topics and applications. A brief discussion of sampling schemes is also provided, along with a selection of useful methods of chi-square decomposition. Additional topics of coverage include models of marginal homogeneity, rater agreement, methods to test hypotheses about differences in associations across subgroups, the relationship of loglinear modeling to logistic regression, and reduced designs. Throughout the book, Computer Applications chapters feature SYSTAT, LEM, and R illustrations of the previous chapter's material, using empirical data examples to demonstrate the relevance of the topics in modern research.
(source: Nielsen Book Data)
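The odds ratio analysis featured in Chapter 4 of this book's contents reduces, for a 2 x 2 cross-tabulation, to one ratio of cell products plus a large-sample interval on the log scale. A minimal sketch, not code from the book (the cell counts are invented for illustration):

```python
import math

def odds_ratio_2x2(a, b, c, d):
    """Odds ratio for a 2x2 cross-tabulation with cell counts
        a b
        c d
    and its approximate 95% confidence interval, computed on the
    log scale with the standard large-sample standard error."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

est, (lo, hi) = odds_ratio_2x2(30, 10, 15, 45)
```

An interval that excludes 1 corresponds to rejecting independence in the 2 x 2 table, which is the kind of null-hypothesis test on odds ratios discussed in Section 4.5.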
 Online

 dx.doi.org Wiley Online Library
 Google Books (Full view)
18. Statistical Theory : a Concise Introduction [2013]
 Abramovich, Felix, author.
 New York : CRC Press, 2013
 Description
 Book — 1 online resource (241 pages)
 Summary

 Introduction. Point Estimation. Confidence Intervals, Bounds, and Regions. Hypothesis Testing. Asymptotic Analysis. Bayesian Inference. Elements of Statistical Decision Theory. Linear Models. Appendices. Index.
 (source: Nielsen Book Data)
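The Confidence Intervals chapter in the Abramovich outline above is usually introduced through the simplest case: a normal mean with known variance. A minimal sketch of that textbook interval, not code from the book (the data and sigma are invented for illustration):

```python
import math

def z_interval(sample, sigma, z=1.96):
    """Two-sided confidence interval for a normal mean with known
    standard deviation sigma: xbar +/- z * sigma / sqrt(n).
    z = 1.96 gives approximately 95% coverage."""
    n = len(sample)
    xbar = sum(sample) / n
    half = z * sigma / math.sqrt(n)
    return xbar - half, xbar + half

lo, hi = z_interval([4.8, 5.1, 5.0, 4.9, 5.2], sigma=0.2)
```

The interval is centered at the sample mean and shrinks like 1/sqrt(n), the basic behavior that the book's asymptotic-analysis chapter then generalizes to estimators without closed-form distributions.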
 Balakrishnan, N., 1956- editor.
 First edition  Boca Raton, FL : CRC Press, 2004
 Description
 Book — 1 online resource : text file, PDF
 Summary

 Figures Part I: Applied Probability. Part II: Models and Applications. Part III: Estimation and Testing. Part IV: Robust Inference. Part V: Regression and Design. Part VI: Sample Size Methodology. Part VII: Applications to Industry. Part VIII: Applications to Ecology, Biology and Health. Part IX: Applications to Economics and Management.
 (source: Nielsen Book Data)
 Balakrishnan, N., 1956- editor.
 First edition  Boca Raton, FL : CRC Press, 2004
 Description
 Book — 1 online resource : text file, PDF
 Summary

 Processes and Inference. Distributions and Characterizations. Inference. Bayesian Inference. Selection Methods. Regression Methods. Methods in Health Research.
 (source: Nielsen Book Data)