Robustness of t-Methods in Data Analysis: Handling Non-Normality and Outliers

This handout discusses the robustness of t-methods in data analysis when dealing with non-normality and outliers. It explains the importance of checking the nearly normal assumption and describes methods for detecting non-normality, such as box plots, histograms, and stem plots. It also covers the effects of skewness, outliers, and sample size on the use of t-methods and points toward alternatives, such as nonparametric methods, for handling outliers.

Stat 4473 – Data Analysis
Robustness of t-methods

The t-methods for μ and μ_d are exactly correct when the population from which we sample is normally distributed. Practically speaking, there's no way to be certain this is true. (In fact, it's almost certainly not true, since the normal distribution is an idealized population model.) The usefulness of the t-methods therefore depends on how strongly they are affected by lack of normality. Luckily, the t-methods for μ and μ_d are quite robust against non-normality of the population except when outliers or strong skewness are present.

Defn: A confidence interval or hypothesis test is called robust if the confidence level or p-value does not change much when the conditions for use of the procedure are violated.

We might say that before using the t-methods, we should check the "nearly normal assumption": the data should appear close to normal, that is, symmetric, with a single peak and no outliers. We will use box plots, histograms and/or stem plots, and formal tests of normality to check the nearly normal assumption (see the first sketch below). Modified boxplots are often useful in spotting outliers.

Ways to not be normal

- More than one peak: The nearly normal assumption is violated if a histogram or stem plot of the data shows two or more peaks. When you see this, look for the possibility that your data come from two groups. If so, try to separate the data into those groups, then analyze each group separately.

- Skewness: If the box plot, histogram, or stem plot indicates the data are strongly skewed, you can't use the t-methods unless your sample is large.

- Outliers: The nearly normal assumption is also violated if the data have outliers. Outliers should be investigated. Sometimes it's obvious that a data value is wrong, and the justification for removing or correcting it is clear. When there's no clear justification for removing outliers, you might want to run the analysis both with and without the outliers and note any differences in your conclusions (see the second sketch below). Any time data values are set aside, you must report on them individually. An analysis of the non-outlying points, along with a separate discussion of the outliers, is often very informative and can reveal important aspects of the data. See more about handling outliers on the next page.

P.S. As tempting as it is to get rid of annoying values, you can't just throw away outliers and not discuss them. It isn't appropriate to lop off the highest or lowest values just to improve your results.
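The checks described above can be carried out in standard software. The sketch below, in Python, draws a histogram and a modified boxplot and runs one formal test of normality. The example data and the choice of the Shapiro-Wilk test as the formal test are assumptions made for illustration; the handout does not name a specific test or package.

```python
# Sketch: checking the "nearly normal" assumption for a small sample.
# The sample data and the Shapiro-Wilk test are illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=50, scale=10, size=30)   # hypothetical sample

fig, axes = plt.subplots(1, 2, figsize=(8, 3))

# Histogram: look for a single peak and rough symmetry.
axes[0].hist(data, bins=8, edgecolor="black")
axes[0].set_title("Histogram")

# Modified boxplot: points plotted beyond the whiskers are flagged as outliers.
axes[1].boxplot(data, vert=False)
axes[1].set_title("Boxplot")

plt.tight_layout()
plt.show()

# Formal test of normality (Shapiro-Wilk).  A small p-value suggests the
# data are inconsistent with a normally distributed population.
stat, p_value = stats.shapiro(data)
print(f"Shapiro-Wilk W = {stat:.3f}, p-value = {p_value:.3f}")
```

With small samples the plots are usually at least as informative as the test's p-value, so look at both before deciding whether the t-methods are appropriate.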
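For the advice in the outliers bullet, here is a minimal sketch of "run the analysis both ways" using a one-sample t-test. The data values, the suspected outlier, the hypothesized mean of 50, and the use of scipy.stats.ttest_1samp are all illustrative assumptions, not part of the original handout.

```python
# Sketch: one-sample t-test for mu, run with and without a suspected outlier.
# All values below (data, outlier, null mean of 50) are assumptions for illustration.
import numpy as np
from scipy import stats

sample = np.array([48.2, 51.5, 49.9, 50.8, 47.6, 52.1, 49.3, 50.4, 51.0, 78.5])
suspected_outlier = 78.5

def summarize(data, label):
    # Test H0: mu = 50 and report the sample size, mean, t statistic, and p-value.
    t_stat, p_value = stats.ttest_1samp(data, popmean=50)
    print(f"{label}: n = {len(data)}, mean = {np.mean(data):.2f}, "
          f"t = {t_stat:.2f}, p-value = {p_value:.3f}")

summarize(sample, "With outlier   ")
summarize(sample[sample != suspected_outlier], "Without outlier")
```

Whichever way the two analyses come out, the set-aside value still has to be reported and discussed individually, as the handout emphasizes; the comparison is a supplement to that discussion, not a substitute for it.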