The Analysis Factor

Statistical Consulting, Resources, and Statistics Workshops for Researchers


F test

Measures of Model Fit for Linear Regression Models

by Karen Grace-Martin

A well-fitting regression model results in predicted values close to the observed data values.

The mean model, which uses the mean for every predicted value, generally would be used if there were no useful predictor variables. The fit of a proposed regression model should therefore be better than the fit of the mean model. [Read more…] about Measures of Model Fit for Linear Regression Models
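As a quick numerical sketch of this idea (the data below are made up for illustration, not from the post), R-squared measures how much a fitted line reduces the squared error relative to the mean model:

```python
import numpy as np

# Hypothetical data, invented for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Mean model: predict the mean of y for every observation.
ss_mean_model = np.sum((y - y.mean()) ** 2)

# Proposed model: a least-squares line.
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)
ss_regression = np.sum(residuals ** 2)

# R-squared: the proportional reduction in error over the mean model.
r_squared = 1 - ss_regression / ss_mean_model
```

If the proposed model fits no better than the mean model, `ss_regression` approaches `ss_mean_model` and R-squared approaches zero.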

Tagged With: F test, Model Fit, R-squared, regression models, RMSE

Related Posts

  • The Difference Between R-squared and Adjusted R-squared
  • Simplifying a Categorical Predictor in Regression Models
  • Eight Ways to Detect Multicollinearity
  • Why ANOVA is Really a Linear Regression, Despite the Difference in Notation

Member Training: The Anatomy of an ANOVA Table

by Jeff Meyer

Our analysis of linear regression focuses on parameter estimates, test statistics, p-values, and confidence intervals. Rarely in regression do we see a discussion of the sums of squares and F statistics given in the ANOVA table above the coefficients and p-values.

And yet, they tell you a lot about your model and your data. Understanding the parts of the table and what they tell you is important for anyone running any regression or ANOVA model.
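As a sketch of what that table summarizes (the data below are hypothetical), the total sum of squares splits into a model piece and a residual piece, and the overall F statistic is the ratio of their mean squares:

```python
import numpy as np

# Hypothetical one-predictor regression; data invented for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 2.9, 3.1, 4.8, 5.2, 6.9])

slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept

# The decomposition an ANOVA table reports: total = model + residual.
ss_model = np.sum((y_hat - y.mean()) ** 2)
ss_resid = np.sum((y - y_hat) ** 2)
ss_total = np.sum((y - y.mean()) ** 2)

# Mean squares and the F statistic for the overall model.
df_model, df_resid = 1, len(x) - 2
f_stat = (ss_model / df_model) / (ss_resid / df_resid)
```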

[Read more…] about Member Training: The Anatomy of an ANOVA Table

Tagged With: ANOVA, estimate, estimation, F test, R-squared, residuals, sum of squares, tables, types

Related Posts

  • Member Training: ANOVA Post-hoc Tests: Practical Considerations
  • Member Training: Statistical Contrasts
  • Same Statistical Models, Different (and Confusing) Output Terms
  • Member Training: Elements of Experimental Design

Interpreting Interactions: When the F test and the Simple Effects disagree.

by Karen Grace-Martin

The way to follow up on a significant two-way interaction between two categorical variables is to test the simple effects. Most of the time the simple effects give a very clear picture of the interaction. Every so often, however, you have a significant interaction but no significant simple effects. This is not a logical impossibility: the two tests address different, though related, hypotheses.

Assume your two independent variables are A and B, each with two levels (1 and 2), giving four cell means. Write A1B1 for the mean of the cell where A = 1 and B = 1. The interaction tests whether the effect of B is the same at both levels of A; the null hypothesis is (A1B1 - A1B2) = (A2B1 - A2B2). The simple effects test whether each of those differences is itself zero: A1B1 - A1B2 = 0 and A2B1 - A2B2 = 0.

With a crossover interaction, A1B1 - A1B2 can be slightly positive while A2B1 - A2B2 is slightly negative. Neither difference is significantly different from 0, yet they are significantly different from each other.

And for answering many research questions, it is highly useful to know whether the difference in means under one condition equals the difference in means under the other. The interaction may not be testing a hypothesis you're interested in, but in many studies, all the interesting effects are in the interactions.
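A minimal numeric sketch of the crossover case, using hypothetical cell means (writing, e.g., "A1B2" for the mean of the cell where A = 1 and B = 2; all numbers are invented):

```python
# Hypothetical cell means for a 2x2 crossover pattern.
means = {("A1", "B1"): 10.0, ("A1", "B2"): 9.4,
         ("A2", "B1"): 9.4, ("A2", "B2"): 10.0}

# Simple effects: the B difference within each level of A.
simple_effect_at_A1 = means[("A1", "B1")] - means[("A1", "B2")]  # small positive
simple_effect_at_A2 = means[("A2", "B1")] - means[("A2", "B2")]  # small negative

# Interaction contrast: the difference between those differences.
interaction_contrast = simple_effect_at_A1 - simple_effect_at_A2
```

Each simple effect is small and may not reach significance on its own, while the interaction contrast is twice as large, so the interaction test can reject its null even when neither simple effect does.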

Tagged With: ANOVA, F test, interaction, Interpreting Interactions, Simple Effect

Related Posts

  • What are Sums of Squares?
  • When Unequal Sample Sizes Are and Are NOT a Problem in ANOVA
  • Same Statistical Models, Different (and Confusing) Output Terms
  • Member Training: The Anatomy of an ANOVA Table

One-tailed and Two-tailed Tests

by Karen Grace-Martin

I was recently asked about when to use one- and two-tailed tests.

The long answer is: use one-tailed tests when you have a specific hypothesis about the direction of your relationship. For example, you hypothesize that one group mean is larger than the other, that the correlation is positive, or that the proportion is below .5.

The short answer is: never use one-tailed tests.

Why?

1. Only a few statistical tests can even be one-tailed: z tests and t tests, whose distributions are symmetric. F tests, chi-square tests, and the like cannot accommodate one-tailed hypotheses because their distributions are not symmetric. And since most statistical methods, such as regression and ANOVA, are based on those asymmetric tests, you will rarely have the chance to use a one-tailed test anyway.

2. Probably because they are rare, reviewers balk at one-tailed tests.  They tend to assume that you are trying to artificially boost the power of your test.  Theoretically, however, there is nothing wrong with them when the hypothesis and the statistical test are right for them.
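The symmetry point in (1) can be illustrated numerically: for a symmetric test statistic, the one-tailed p-value is exactly half the two-tailed one. A minimal sketch with a hypothetical observed z statistic:

```python
import math

# Hypothetical observed z statistic, invented for illustration.
z = 1.75

# Two-tailed p-value: P(|Z| >= z) for a standard normal.
p_two_tailed = math.erfc(abs(z) / math.sqrt(2))

# One-tailed p-value when the effect is in the hypothesized direction:
# exactly half the two-tailed value, because the distribution is symmetric.
p_one_tailed = 0.5 * math.erfc(z / math.sqrt(2))
```

Halving the p-value is what makes a one-tailed test more powerful in the hypothesized direction, and why reviewers suspect it is being used to rescue a borderline two-tailed result.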

 

Tagged With: F test, hypothesis testing, one-tailed test, T test, two-tailed test, Z test

Related Posts

  • Measures of Model Fit for Linear Regression Models
  • Centering a Covariate to Improve Interpretability
  • Member Training: Statistical Contrasts
  • Member Training: The Anatomy of an ANOVA Table

Copyright © 2008–2022 The Analysis Factor, LLC. All rights reserved.