What Conclusions Can We Draw About \(\beta_0\) and \(\beta_1\)? | STAT 501
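
Since the page title asks what conclusions we can draw about \(\beta_0\) and \(\beta_1\) (the topic of Lesson 2.1, Inference for the Population Intercept and Slope), a minimal R sketch of that inference is shown below. The simulated data, variable names, and coefficient values are hypothetical placeholders, not taken from the course materials.

    # Minimal sketch: inference for the intercept (beta_0) and slope (beta_1)
    # in simple linear regression, using base R only. The data are simulated
    # placeholders, not a STAT 501 dataset.
    set.seed(1)
    x <- runif(30, min = 0, max = 10)      # hypothetical predictor values
    y <- 2 + 0.5 * x + rnorm(30, sd = 1)   # hypothetical response values

    fit <- lm(y ~ x)                       # fit the simple linear regression

    summary(fit)   # t-tests of H0: beta_0 = 0 and H0: beta_1 = 0, with p-values
    confint(fit)   # 95% confidence intervals for beta_0 and beta_1

The same summary() and confint() calls apply to any fitted lm object, so the sketch carries over to the multiple regression lessons as well.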


Lessons

  • Lesson 1: Simple Linear Regression
    • 1.1 - What is Simple Linear Regression?
    • 1.2 - What is the "Best Fitting Line"?
    • 1.3 - The Simple Linear Regression Model
    • 1.4 - What is the Common Error Variance?
    • 1.5 - The Coefficient of Determination, \(R^2\)
    • 1.6 - (Pearson) Correlation Coefficient, \(r\)
    • 1.7 - Some Examples
    • 1.8 - \(R^2\) Cautions
    • 1.9 - Hypothesis Test for the Population Correlation Coefficient
    • 1.10 - Further Examples
    • Software Help 1
      • Minitab Help 1: Simple Linear Regression
      • R Help 1: Simple Linear Regression
  • Lesson 2: SLR Model Evaluation
    • 2.1 - Inference for the Population Intercept and Slope
    • 2.2 - Another Example of Slope Inference
    • 2.3 - Sums of Squares
    • 2.4 - Sums of Squares (continued)
    • 2.5 - Analysis of Variance: The Basic Idea
    • 2.6 - The Analysis of Variance (ANOVA) table and the F-test
    • 2.7 - Example: Are Men Getting Faster?
    • 2.8 - Equivalent linear relationship tests
    • 2.9 - Notation for the Lack of Fit test
    • 2.10 - Decomposing the Error
    • 2.11 - The Lack of Fit F-test
    • 2.12 - Further Examples
    • Software Help 2
      • Minitab Help 2: SLR Model Evaluation
      • R Help 2: SLR Model Evaluation
  • Lesson 3: SLR Estimation & Prediction
    • 3.1 - The Research Questions
    • 3.2 - Confidence Interval for the Mean Response
    • 3.3 - Prediction Interval for a New Response
    • 3.4 - Further Example
    • Software Help 3
      • Minitab Help 3: SLR Estimation & Prediction
      • R Help 3: SLR Estimation & Prediction
  • Lesson 4: SLR Model Assumptions
    • 4.1 - Background
    • 4.2 - Residuals vs. Fits Plot
    • 4.3 - Residuals vs. Predictor Plot
    • 4.4 - Identifying Specific Problems Using Residual Plots
    • 4.5 - Residuals vs. Order Plot
    • 4.6 - Normal Probability Plot of Residuals
      • 4.6.1 - Normal Probability Plots Versus Histograms
    • 4.7 - Assessing Linearity by Visual Inspection
    • 4.8 - Further Examples
    • Software Help 4
      • Minitab Help 4: SLR Model Assumptions
      • R Help 4: SLR Model Assumptions
  • Lesson 5: Multiple Linear Regression
    • 5.1 - Example on IQ and Physical Characteristics
    • 5.2 - Example on Underground Air Quality
    • 5.3 - The Multiple Linear Regression Model
    • 5.4 - A Matrix Formulation of the Multiple Regression Model
    • 5.5 - Further Examples
    • Software Help 5
      • Minitab Help 5: Multiple Linear Regression
      • R Help 5: Multiple Linear Regression
  • Lesson 6: MLR Model Evaluation
    • 6.1 - Three Types of Hypotheses
    • 6.2 - The General Linear F-Test
    • 6.3 - Sequential (or Extra) Sums of Squares
    • 6.4 - The Hypothesis Tests for the Slopes
    • 6.5 - Partial R-squared
    • 6.6 - Lack of Fit Testing in the Multiple Regression Setting
    • 6.7 - Further Examples
    • Software Help 6
      • Minitab Help 6: MLR Model Evaluation
      • R Help 6: MLR Model Evaluation
  • Lesson 7: MLR Estimation, Prediction & Model Assumptions
    • 7.1 - Confidence Interval for the Mean Response
    • 7.2 - Prediction Interval for a New Response
    • 7.3 - MLR Model Assumptions
    • 7.4 - Assessing the Model Assumptions
    • 7.5 - Tests for Error Normality
    • 7.6 - Tests for Constant Error Variance
    • 7.7 - Data Transformations
    • Software Help 7
      • Minitab Help 7: MLR Estimation, Prediction & Model Assumptions
      • R Help 7: MLR Estimation, Prediction & Model Assumptions
  • Lesson 8: Categorical Predictors
    • 8.1 - Example on Birth Weight and Smoking
    • 8.2 - The Basics
    • 8.3 - Two Separate Advantages
    • 8.4 - Coding Qualitative Variables
    • 8.5 - Additive Effects
    • 8.6 - Interaction Effects
    • 8.7 - Leaving an Important Interaction Out of a Model
    • 8.8 - Piecewise Linear Regression Models
    • 8.9 - Further Examples
    • 8.10 - Summary
    • Software Help 8
      • Minitab Help 8: Categorical Predictors
      • R Help 8: Categorical Predictors
  • Lesson 9: Data Transformations
    • 9.1 - Log-transforming Only the Predictor for SLR
    • 9.2 - Log-transforming Only the Response for SLR
    • 9.3 - Log-transforming Both the Predictor and Response
    • 9.4 - Other Data Transformations
    • 9.5 - More on Transformations
    • 9.6 - Interactions Between Quantitative Predictors
    • 9.7 - Polynomial Regression
    • 9.8 - Polynomial Regression Examples
    • Software Help 9
      • Minitab Help 9: Data Transformations
      • R Help 9: Data Transformations
  • Lesson 10: Model Building
    • 10.1 - What if the Regression Equation Contains "Wrong" Predictors?
    • 10.2 - Stepwise Regression
    • 10.3 - Best Subsets Regression, Adjusted R-Sq, Mallows Cp
    • 10.4 - Some Examples
    • 10.5 - Information Criteria and PRESS
    • 10.6 - Cross-validation
    • 10.7 - One Model Building Strategy
    • 10.8 - Another Model Building Strategy
    • 10.9 - Further Examples
    • Software Help 10
      • Minitab Help 10: Model Building
      • R Help 10: Model Building
  • Lesson 11: Influential Points
    • 11.1 - Distinction Between Outliers & High Leverage Observations
    • 11.2 - Using Leverages to Help Identify Extreme x Values
    • 11.3 - Identifying Outliers (Unusual y Values)
    • 11.4 - Deleted Residuals
    • 11.5 - Identifying Influential Data Points
    • 11.6 - Further Examples
    • 11.7 - A Strategy for Dealing with Problematic Data Points
    • 11.8 - Summary
    • Software Help 11
      • Minitab Help 11: Influential Points
      • R Help 11: Influential Points
  • Lesson 12: Multicollinearity & Other Regression Pitfalls
    • 12.1 - What is Multicollinearity?
    • 12.2 - Uncorrelated Predictors
    • 12.3 - Highly Correlated Predictors
    • 12.4 - Detecting Multicollinearity Using Variance Inflation Factors
    • 12.5 - Reducing Data-based Multicollinearity
    • 12.6 - Reducing Structural Multicollinearity
    • 12.7 - Further Example
    • 12.8 - Extrapolation
    • 12.9 - Other Regression Pitfalls
    • Software Help 12
      • Minitab Help 12: Multicollinearity
      • R Help 12: Multicollinearity
  • Lesson 13: Weighted Least Squares & Logistic Regressions
    • 13.1 - Weighted Least Squares
      • 13.1.1 - Weighted Least Squares Examples
    • 13.2 - Logistic Regression
      • 13.2.1 - Further Logistic Regression Examples
    • Software Help 13
      • Minitab Help 13: Weighted Least Squares & Logistic Regressions
      • R Help 13: Weighted Least Squares & Logistic Regressions

Optional Content

  • Topic 1: Robust Regression
    • T.1.1 - Robust Regression Methods
      • T1.1.1 - Robust Regression Examples
    • T.1.2 - Resistant Regression Methods
    • T.1.3 - Regression Depth
  • Topic 2: Time Series & Autocorrelation
    • T.2.1 - Autoregressive Models
    • T.2.2 - Regression with Autoregressive Errors
    • T.2.3 - Testing and Remedial Measures for Autocorrelation
    • T.2.4 - Examples of Applying Cochrane-Orcutt Procedure
    • T.2.5 - Advanced Methods
      • T.2.5.1 - ARIMA Models
      • T.2.5.2 - Exponential Smoothing
      • T.2.5.3 - Spectral Analysis
      • T.2.5.4 - Generalized Least Squares
    • Software Help: Time Series & Autocorrelation
      • Minitab Help: Time Series & Autocorrelation
      • R Help: Time Series & Autocorrelation
  • Topic 3: Poisson & Nonlinear Regression
    • T.3.1 - Poisson Regression
    • T.3.2 - Polytomous Regression
    • T.3.3 - Generalized Linear Models
    • T.3.4 - Nonlinear Regression
    • T.3.5 - Exponential Regression Example
    • T.3.6 - Population Growth Example
    • Software Help: Poisson & Nonlinear Regression
      • Minitab Help: Poisson & Nonlinear Regression
      • R Help: Poisson & Nonlinear Regression