# Descriptions of Statistical Terms

Describe the following terms and give an example of each.

a. Analysis of Variance

b. One-way ANOVA

c. Between-group variance

d. Within-group variance

e. Sum of squares between groups

f. Sum of squares within groups

g. Paired difference

https://brainmass.com/statistics/analysis-of-variance/descriptions-238087

#### Solution Preview

a. Analysis of Variance

Fisher developed an elaborate technique for analyzing the variances of two or more series in order to study their characteristics. In statistical analysis we are often concerned with the variability among the values of a series, as it gives a very meaningful picture of the nature and form of the distribution under study. In analysis of variance we test the variances of populations, so the F ratio is the ratio of two independent chi-square variates, each divided by its respective degrees of freedom. The F test tells us whether the variances of the two universes from which the samples have been taken are equal. If the calculated value of F exceeds the critical value for the given degrees of freedom and the chosen significance level, the null hypothesis is rejected and the inference is that the difference is significant: the samples in such a case do not come from the same universe, or, if they come from two universes, the variances of those universes are not the same.
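The decision rule above can be sketched in a short example. The two samples and the 0.05 level here are purely illustrative assumptions, not data from the original problem:

```python
# Hedged sketch: F test for equality of two population variances.
# The sample values below are hypothetical.
import numpy as np
from scipy import stats

a = np.array([12.1, 11.8, 13.0, 12.6, 11.5, 12.9])
b = np.array([10.2, 13.9, 11.0, 14.3, 9.8, 13.5])

# Sample variances with n - 1 degrees of freedom each
s2_a = np.var(a, ddof=1)
s2_b = np.var(b, ddof=1)

# Conventionally, put the larger variance in the numerator
F = max(s2_a, s2_b) / min(s2_a, s2_b)
df1 = len(a) - 1 if s2_a >= s2_b else len(b) - 1
df2 = len(b) - 1 if s2_a >= s2_b else len(a) - 1

# Critical value at the 0.05 significance level
crit = stats.f.ppf(0.95, df1, df2)

# Reject H0 (equal variances) only if the calculated F exceeds crit
print(F > crit)
```

The comparison `F > crit` is exactly the rule described in the text: the null hypothesis of equal variances is rejected only when the calculated F exceeds the critical value.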

The technique of variance analysis originated in agricultural research, where the effect of various types of soil on output, or the effect of different types of fertilizer on production, had to be studied. Later the technique was found to be extremely useful in all types of research where the effect of one or two variables on a problem under study had to be determined on the basis of a number of experiments conducted simultaneously. Thus, variance analysis can tell us whether samples classified on the basis of one or two variables come from the same population. Variance analysis studies the significance of the difference in means by analyzing variance; the variances differ only when the means are significantly different.
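The fertilizer setting mentioned above also makes a convenient illustration of terms (c) through (f). The three groups of yields below are made-up values, assumed only for demonstration:

```python
# Hedged sketch of between-group and within-group variance for
# a one-way layout. All yield figures are hypothetical.
import numpy as np

groups = [
    np.array([20.0, 21.0, 23.0, 22.0]),   # fertilizer A
    np.array([28.0, 30.0, 29.0, 31.0]),   # fertilizer B
    np.array([24.0, 25.0, 23.0, 26.0]),   # fertilizer C
]
n_total = sum(len(g) for g in groups)
k = len(groups)
grand_mean = np.concatenate(groups).mean()

# Sum of squares between groups: squared deviations of each group
# mean from the grand mean, weighted by group size
ssb = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)

# Sum of squares within groups: squared deviations of each value
# from its own group mean, summed over all groups
ssw = sum(((g - g.mean()) ** 2).sum() for g in groups)

msb = ssb / (k - 1)          # between-group variance (mean square)
msw = ssw / (n_total - k)    # within-group variance (mean square)
F = msb / msw                # F ratio for the one-way ANOVA
```

A large F arises when the group means are spread far apart relative to the scatter inside each group, which is exactly the sense in which the analysis of variance tests equality of means.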

b. One-way ANOVA

The null hypothesis tested by one-way ANOVA is that two or more population means are equal. The question is whether (H0) the population means are equal for all groups, with the observed differences in sample means due to random sampling variation, or (Ha) the observed differences between sample means reflect actual differences in the population means.

The logic used in ANOVA to compare means of multiple groups is similar to that used with the t-test to compare means of two independent groups. When one-way ANOVA is applied to the special case of two groups, it gives results identical to those of the t-test.

Not surprisingly, the assumptions needed for the t-test are also needed for ANOVA. We need to assume:

1) Random, independent sampling from the k populations;

2) Normal population distributions;

3) Equal variances within the k populations.

Assumption 1 is crucial for any inferential statistic. As with the t-test, Assumptions 2 and 3 can be relaxed when large samples are used, and Assumption 3 can be relaxed when the sample sizes are roughly the same for each group even for small samples.
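The claimed equivalence between one-way ANOVA on two groups and the two-sample t-test can be checked directly: the F statistic equals the square of the t statistic and the p-values coincide. The two samples below are assumed for illustration:

```python
# Hedged sketch: one-way ANOVA on two groups versus the pooled
# two-sample t-test. Sample values are hypothetical.
from scipy import stats

g1 = [5.1, 4.9, 6.2, 5.5, 5.8]
g2 = [6.4, 7.0, 6.6, 7.3, 6.9]

F, p_anova = stats.f_oneway(g1, g2)      # one-way ANOVA with k = 2
t, p_ttest = stats.ttest_ind(g1, g2)     # pooled-variance t-test

# For two groups, F = t^2 and the two p-values agree
print(abs(F - t ** 2) < 1e-6, abs(p_anova - p_ttest) < 1e-6)
```

This identity is why the assumptions listed for the t-test carry over unchanged to ANOVA.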

Example:

Home Run Distances:

Use a 0.05 significance level to test the claim that the home runs hit by Barry Bonds, Mark McGwire, and Sammy Sosa have mean distances that are not all the same. Do the home run distances explain the fact that, as of this writing, Barry Bonds has the most home runs in ...
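The mechanics of that test look as follows. The original distances are not reproduced in this preview, so the values below are purely hypothetical stand-ins used only to show how the 0.05-level test would be run:

```python
# Hedged sketch: one-way ANOVA on three players' home run distances.
# All distances (in feet) are hypothetical placeholders, not the
# problem's actual data.
from scipy import stats

bonds   = [420, 417, 440, 410, 390, 416]
mcgwire = [370, 420, 430, 390, 425, 400]
sosa    = [410, 430, 428, 418, 385, 400]

F, p = stats.f_oneway(bonds, mcgwire, sosa)

# Reject H0 (all mean distances equal) only if p < 0.05
print("reject H0" if p < 0.05 else "fail to reject H0")
```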

#### Solution Summary

The solution defines the statistical terms analysis of variance, one-way ANOVA, between-group variance, within-group variance, sum of squares between groups, sum of squares within groups, and paired difference.
