Modern statistics for the social and behavioral sciences : a practical introduction / Rand Wilcox.

Author: Wilcox, Rand R.
Contents (machine-generated note):
ch. 1 Introduction -- 1.1. Samples Versus Populations -- 1.2. Software -- 1.3. R Basics -- 1.3.1. Entering Data -- 1.3.2. R Functions and Packages -- 1.3.3. Data Sets -- 1.3.4. Arithmetic Operations
ch. 2 Numerical and Graphical Summaries of Data -- 2.1. Basic Summation Notation -- 2.2. Measures of Location -- 2.2.1. The Sample Mean -- 2.2.2. R Function mean -- 2.2.3. The Sample Median -- 2.2.4. R Function for the Median -- 2.3. A Criticism of the Median: It Might Trim Too Many Values -- 2.3.1. R Function for the Trimmed Mean -- 2.3.2. A Winsorized Mean -- 2.3.3. R Function winmean -- 2.3.4. What Is a Measure of Location? -- 2.4. Measures of Variation or Scale -- 2.4.1. Sample Variance and Standard Deviation -- 2.4.2. R Functions var and sd -- 2.4.3. The Interquartile Range -- 2.4.4. R Functions idealf and idealfIQR -- 2.4.5. Winsorized Variance -- 2.4.6. R Function winvar -- 2.4.7. Median Absolute Deviation -- 2.4.8. R Function mad -- 2.4.9. Average Absolute Distance from the Median -- 2.4.10. Other Robust Measures of Variation -- 2.4.11. R Functions bivar, pbvar, tauvar, and tbs -- 2.5. Detecting Outliers -- 2.5.1. A Method Based on the Mean and Variance -- 2.5.2. A Better Outlier Detection Rule: The MAD-Median Rule -- 2.5.3. R Function out -- 2.5.4. The Boxplot -- 2.5.5. R Function boxplot -- 2.5.6. Modifications of the Boxplot Rule for Detecting Outliers -- 2.5.7. R Function outbox -- 2.5.8. Other Measures of Location -- 2.5.9. R Functions mom and onestep -- 2.6. Histograms -- 2.6.1. R Functions hist and splot -- 2.7. Kernel Density Estimators -- 2.7.1. R Functions kdplot and akerd -- 2.8. Stem-and-Leaf Displays -- 2.8.1. R Function stem -- 2.9. Skewness -- 2.9.1. Transforming Data -- 2.10. Choosing a Measure of Location -- 2.11. Exercises
ch. 3 Probability and Related Concepts -- 3.1. Basic Probability -- 3.2. Expected Values -- 3.3. Conditional Probability and Independence -- 3.4. Population Variance -- 3.5. The Binomial Probability Function -- 3.5.1. R Functions dbinom and pbinom -- 3.6. Continuous Variables and the Normal Curve -- 3.6.1. Computing Probabilities Associated with Normal Curves -- 3.6.2. R Function pnorm -- 3.6.3. R Function qnorm -- 3.7. Understanding the Effects of Nonnormality -- 3.7.1. Skewness -- 3.8. Pearson's Correlation and the Population Covariance (Optional) -- 3.8.1. Computing the Population Covariance and Pearson's Correlation -- 3.9. Some Rules About Expected Values -- 3.10. Chi-Squared Distributions -- 3.11. Exercises
ch. 4 Sampling Distributions and Confidence Intervals -- 4.1. Random Sampling -- 4.2. Sampling Distributions -- 4.2.1. Sampling Distribution of the Sample Mean -- 4.2.2. Computing Probabilities Associated with the Sample Mean -- 4.3. A Confidence Interval for the Population Mean -- 4.3.1. Known Variance -- 4.3.2. Confidence Intervals When σ Is Not Known -- 4.3.3. R Functions pt and qt -- 4.3.4. Confidence Interval for the Population Mean Using Student's T -- 4.3.5. R Function t.test -- 4.4. Judging Location Estimators Based on Their Sampling Distribution -- 4.4.1. Trimming and Accuracy: Another Perspective -- 4.5. An Approach to Nonnormality: The Central Limit Theorem -- 4.6. Student's T and Nonnormality -- 4.7. Confidence Intervals for the Trimmed Mean -- 4.7.1. Estimating the Standard Error of a Trimmed Mean -- 4.7.2. R Function trimse -- 4.7.3. A Confidence Interval for the Population Trimmed Mean -- 4.7.4. R Function trimci -- 4.8. Transforming Data -- 4.9. Confidence Interval for the Population Median -- 4.9.1. R Function sint -- 4.9.2. Estimating the Standard Error of the Sample Median -- 4.9.3. R Function msmedse -- 4.9.4. More Concerns About Tied Values -- 4.10. A Remark About MOM and M-Estimators -- 4.11. Confidence Intervals for the Probability of Success -- 4.11.1. R Functions binomci, acbinomci, and binomLCO -- 4.12. Bayesian Methods -- 4.13. Exercises
ch. 5 Hypothesis Testing -- 5.1. The Basics of Hypothesis Testing -- 5.1.1. p-Value or Significance Level -- 5.1.2. Criticisms of Two-Sided Hypothesis Testing and p-Values -- 5.1.3. Summary and Generalization -- 5.2. Power and Type II Errors -- 5.2.1. Understanding How n, α, and σ Are Related to Power -- 5.3. Testing Hypotheses About the Mean When σ Is Not Known -- 5.3.1. R Function t.test -- 5.4. Controlling Power and Determining the Sample Size -- 5.4.1. Choosing n Prior to Collecting Data -- 5.4.2. R Function power.t.test -- 5.4.3. Stein's Method: Judging the Sample Size When Data Are Available -- 5.4.4. R Functions stein1 and stein2 -- 5.5. Practical Problems with Student's T Test -- 5.6. Hypothesis Testing Based on a Trimmed Mean -- 5.6.1. R Function trimci -- 5.6.2. R Functions stein1.tr and stein2.tr -- 5.7. Testing Hypotheses About the Population Median -- 5.7.1. R Function sintv2 -- 5.8. Making Decisions About Which Measure of Location to Use -- 5.9. Bootstrap Methods -- 5.10. Bootstrap-t Method -- 5.10.1. Symmetric Confidence Intervals -- 5.10.2. Exact Nonparametric Confidence Intervals for Means Are Impossible -- 5.11. The Percentile Bootstrap Method -- 5.12. Inferences About Robust Measures of Location -- 5.12.1. Using the Percentile Method -- 5.12.2. R Functions onesampb, momci, and trimpb -- 5.12.3. The Bootstrap-t Method Based on Trimmed Means -- 5.12.4. R Function trimcibt -- 5.13. Estimating Power When Testing Hypotheses About a Trimmed Mean -- 5.13.1. R Functions powt1est and powt1an -- 5.14. A Bootstrap Estimate of Standard Errors -- 5.14.1. R Function bootse -- 5.15. Exercises
ch. 6 Regression and Correlation -- 6.1. The Least Squares Principle -- 6.2. Confidence Intervals and Hypothesis Testing -- 6.2.1. Classic Inferential Techniques -- 6.2.2. Multiple Regression -- 6.2.3. R Functions ols and lm -- 6.3. Standardized Regression -- 6.4. Practical Concerns About Least Squares Regression and How They Might Be Addressed -- 6.4.1. The Effect of Outliers on Least Squares Regression -- 6.4.2. Beware of Bad Leverage Points -- 6.4.3. Beware of Discarding Outliers Among the Y Values -- 6.4.4. Do Not Assume Homoscedasticity or That the Regression Line Is Straight -- 6.4.5. Violating Assumptions When Testing Hypotheses -- 6.4.6. Dealing with Heteroscedasticity: The HC4 Method -- 6.4.7. R Functions olshc4 and hc4test -- 6.4.8. Interval Estimation of the Mean Response -- 6.4.9. R Function olshc4band -- 6.5. Pearson's Correlation and the Coefficient of Determination -- 6.5.1. A Closer Look at Interpreting r -- 6.6. Testing H0: ρ = 0 -- 6.6.1. R Function cor.test -- 6.6.2. R Function pwr.r.test -- 6.6.3. Testing H0: ρ = 0 When There Is Heteroscedasticity -- 6.6.4. R Function pcorhc4 -- 6.6.5. When Is It Safe to Conclude That Two Variables Are Independent? -- 6.7. A Regression Method for Estimating the Median of Y and Other Quantiles -- 6.7.1. R Function rqfit -- 6.8. Detecting Heteroscedasticity -- 6.8.1. R Function khomreg -- 6.9. Inferences About Pearson's Correlation: Dealing with Heteroscedasticity -- 6.9.1. R Function pcorb -- 6.10. Bootstrap Methods for Least Squares Regression -- 6.10.1. R Functions hc4wtest, olswbtest, and lsfitci -- 6.11. Detecting Associations Even When There Is Curvature -- 6.11.1. R Functions indt and medind -- 6.12. Quantile Regression -- 6.12.1. R Functions qregci and rqtest -- 6.12.2. A Test for Homoscedasticity Using a Quantile Regression Approach -- 6.12.3. R Function qhomt -- 6.13. Regression: Which Predictors Are Best? -- 6.13.1. The 0.632 Bootstrap Method -- 6.13.2. R Function regpre -- 6.13.3. Least Angle Regression -- 6.13.4. R Function larsR -- 6.14. Comparing Correlations -- 6.14.1. R Functions TWOpov and TWOpNOV -- 6.15. Concluding Remarks -- 6.16. Exercises
ch. 7 Comparing Two Independent Groups -- 7.1. Student's T Test -- 7.1.1. Choosing the Sample Sizes -- 7.1.2. R Function power.t.test -- 7.2. Relative Merits of Student's T -- 7.3. Welch's Heteroscedastic Method for Means -- 7.3.1. R Function t.test -- 7.3.2. Tukey's Three-Decision Rule -- 7.3.3. Nonnormality and Welch's Method -- 7.3.4. Three Modern Insights Regarding Methods for Comparing Means -- 7.4. Methods for Comparing Medians and Trimmed Means -- 7.4.1. Yuen's Method for Trimmed Means -- 7.4.2. R Functions yuen and fac2list -- 7.4.3. Comparing Medians -- 7.4.4. R Function msmed -- 7.5. Percentile Bootstrap Methods for Comparing Measures of Location -- 7.5.1. Using Other Measures of Location -- 7.5.2. Comparing Medians -- 7.5.3. R Function medpb2 -- 7.5.4. Some Guidelines on When to Use the Percentile Bootstrap Method -- 7.5.5. R Functions trimpb2, med2g, and pb2gen -- 7.6. Bootstrap-t Methods for Comparing Measures of Location -- 7.6.1. Comparing Means -- 7.6.2. Bootstrap-t Method When Comparing Trimmed Means -- 7.6.3. R Functions yuenbt and yhbt -- 7.6.4. Estimating Power and Judging the Sample Sizes -- 7.6.5. R Functions powest and pow2an -- 7.7. Permutation Tests -- 7.8. Rank-Based and Nonparametric Methods -- 7.8.1. Wilcoxon--Mann--Whitney Test -- 7.8.2. R Functions wmw and wilcox.test -- 7.8.3. Handling Tied Values and Heteroscedasticity -- 7.8.4. Cliff's Method -- 7.8.5. R Functions cid and cidv2 -- 7.8.6. The Brunner--Munzel Method -- 7.8.7. R Functions bmp and loc2dif.ci -- 7.8.8. The Kolmogorov--Smirnov Test
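The contents above center on robust measures of location and scale: the trimmed mean (Section 2.3) and the median absolute deviation (Sections 2.4.7-2.4.8). The book itself works in R (e.g., mean(x, trim=) and mad(x)); as an illustration only, here is a minimal Python sketch of the same two estimators using SciPy, with a made-up data vector containing one extreme outlier:

```python
# Robust location and scale, sketched with SciPy (illustrative data).
from scipy import stats

x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 100]  # hypothetical sample; 100 is an outlier

# 20% trimmed mean: discard the lowest and highest 20% of values, average the rest.
# Here that trims {1, 2} and {9, 100}, so the result is mean(3..8) = 5.5.
tm = stats.trim_mean(x, proportiontocut=0.2)

# MAD: median of absolute deviations from the median, rescaled (scale="normal")
# so it estimates the standard deviation when sampling from a normal curve.
s = stats.median_abs_deviation(x, scale="normal")

print(tm, s)  # trimmed mean is 5.5; the MAD-based scale stays small despite the outlier
```

The point the contents note repeatedly makes is visible here: the ordinary mean of this sample is 14.5 and the standard deviation about 30.4, both dominated by the single outlier, while the trimmed mean and MAD are barely affected.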
Published: Boca Raton, FL : CRC Press, Taylor & Francis Group, [2017]
Edition: Second edition.
Topics: Social sciences - Statistical methods. | Psychology - Statistical methods. | Statistics | Data analysis | Social sciences

University of Illinois at Urbana-Champaign
