This chapter delves into correlation and linear regression, foundational techniques in statistical analysis for exploring relationships between continuous predictors and outcomes. Beginning with the concept of correlation, the chapter examines how variables relate to each other in data, using scatterplots and descriptive statistics to visualise and interpret the strength and direction of relationships. Pearson’s correlation coefficient r is introduced as a measure ranging from -1 (perfect negative correlation) to +1 (perfect positive correlation), alongside practical guidance on its calculation and interpretation. The chapter also highlights limitations of Pearson’s r, such as its assumption of linearity, and introduces Spearman’s rank correlation for assessing monotonic relationships. Throughout, worked examples underscore practical applications, supported by scatterplots and jamovi analyses.
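Although the chapter works in jamovi, the same quantities can be computed programmatically. The sketch below uses `scipy.stats` on hypothetical data (study hours predicting grades, both invented for illustration) to contrast Pearson's r, which measures linear association, with Spearman's rho, which measures monotonic association:

```python
import numpy as np
from scipy import stats

# Hypothetical data: study hours (predictor) and grades (outcome).
rng = np.random.default_rng(42)
hours = rng.uniform(0, 10, 50)
grade = 50 + 4 * hours + rng.normal(0, 8, 50)

# Pearson's r: strength and direction of the *linear* relationship.
r, p_r = stats.pearsonr(hours, grade)

# Spearman's rho: rank-based, so it captures any *monotonic* relationship
# and is less sensitive to outliers and non-linearity.
rho, p_rho = stats.spearmanr(hours, grade)

print(f"Pearson r = {r:.2f} (p = {p_r:.3g})")
print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3g})")
```

Both coefficients fall between -1 and +1; when the relationship is linear they tend to agree, and a large discrepancy between them is itself a hint that the linearity assumption behind Pearson's r may not hold.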
The latter sections explore linear regression models, a natural extension of correlation analysis. These models estimate the relationship between predictors and outcomes using a regression line, quantified through coefficients such as the slope and intercept. Techniques for evaluating model fit are detailed, including R squared and hypothesis tests for individual coefficients. The chapter extends the discussion to multiple linear regression, examining how several predictors jointly contribute to a model. It concludes by addressing critical topics such as assumptions (linearity, independence, normality, equality of variance), diagnostics (residual analysis, outlier detection, collinearity), and model selection methods like AIC and hierarchical regression. Practical, hands-on use of jamovi is emphasised throughout, making the concepts accessible and applicable.
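The core regression quantities the chapter discusses can be sketched outside jamovi as well. The example below, a minimal illustration on invented data (the true intercept 2.0 and slope 1.5 are assumptions of the simulation, not values from the chapter), fits an ordinary least squares line and reports the slope, intercept, R squared, and the p-value for the slope coefficient:

```python
import numpy as np
from scipy import stats

# Hypothetical data generated from a known line: y = 2.0 + 1.5 x + noise.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)
y = 2.0 + 1.5 * x + rng.normal(0, 2, 100)

# Ordinary least squares fit of the regression line y-hat = a + b x.
fit = stats.linregress(x, y)

print(f"intercept = {fit.intercept:.2f}, slope = {fit.slope:.2f}")
# R squared: proportion of variance in the outcome explained by the model.
print(f"R squared = {fit.rvalue**2:.2f}")
# Hypothesis test for the slope (null: slope = 0).
print(f"p-value for slope = {fit.pvalue:.3g}")
```

With enough data the estimated coefficients land close to the true values used in the simulation, which is a useful sanity check before applying the same workflow, including residual and collinearity diagnostics, to real data.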