List of Articles
These concepts have articles devoted to them. The list that follows shows other concepts that are covered in these articles.
- Alpha, α
- Alpha and Beta Errors
- Alpha, p-value, Critical Value and Test Statistic – How They Work Together
- Alternative Hypothesis
- ANOM
- ANOVA – Part 1: What it Does
- ANOVA – Part 2: How it Does It
- ANOVA – Part 3: One-Way (aka single factor)
- ANOVA – Part 4: Two-Way (aka two-factor)
- ANOVA vs. Regression
- Binomial Distribution
- Charts/ Graphs/ Plots – Which to Use When
- Chi-Square – the Test Statistic and its Distributions
- Chi-Square Test for Goodness of Fit
- Chi-Square Test for Independence
- Chi-Square Test for the Variance
- Confidence Intervals – Part 1: General Concepts
- Confidence Intervals – Part 2: Some Specifics
- Control Charts – Part 1: General Concepts and Principles
- Control Charts – Part 2: Which to Use When
- Correlation – Part 1
- Correlation – Part 2
- Critical Value
- Degrees of Freedom
- Design of Experiments (DOE) – Part 1
- Design of Experiments (DOE) – Part 2
- Design of Experiments (DOE) – Part 3
- Distributions – Part 1: What They Are
- Distributions – Part 2: How They Are Used
- Distributions – Part 3: Which to Use When
- Errors – Types, Uses, and Interrelationships
- Exponential Distribution
- F
- Fail to Reject the Null Hypothesis
- Hypergeometric Distribution
- Hypothesis Testing – Part 1: Overview
- Hypothesis Testing – Part 2: How To
- Inferential Statistics
- Margin of Error
- Nonparametric
- Normal Distribution
- Null Hypothesis
- p, p-value
- p, t, and F: "<" or ">"?
- Poisson Distribution
- Power
- Process Capability Analysis (PCA)
- Proportion
- r, Multiple R, r2, R2, R Square, R2 Adjusted
- Regression – Part 1: Sums of Squares
- Regression – Part 2: Simple Linear
- Regression – Part 3: Analysis Basics
- Regression – Part 4: Multiple Linear
- Regression – Part 5: Simple Nonlinear
- Reject the Null Hypothesis
- Residuals
- Sample, Sampling
- Sample Size – Part 1: Proportions for Count Data
- Sample Size – Part 2: for Continuous/ Measurement Data
- Sampling Distribution
- Sigma, σ
- Skew, Skewness
- Standard Deviation
- Standard Error
- Statistically Significant
- Sum of Squares
- t, The Test Statistic and Its Distributions
- t-tests – Part 1: Overview
- t-tests – Part 2: Calculations and Analysis
- Test Statistic
- Variables
- Variance
- Variation/ Variability/ Dispersion/ Spread
- Which Statistical Tool to Use to Solve Some Common Problems
- z
Other Concepts – and the article(s) in which they are covered
- 1-Sided or 1-Tailed: see the articles Alternative Hypothesis and Alpha, α.
- 1-Sided or 1-Tailed: see the articles Alternative Hypothesis and Alpha, α.
- 1-Way: an analysis that has 1 Independent (x) Variable. E.g. 1-way ANOVA.
- 2-Sided or 2-Tailed: see the articles Alternative Hypothesis and Alpha, α.
- 2-Way: an analysis that has 2 Independent (x) Variables. E.g. 2-way ANOVA.
- 68-95-99.7 Rule: same as the Empirical Rule. See the article Normal Distribution.
- Acceptance Region: see the article Alpha, α.
- Adjusted R2: see the article r, Multiple R, r2, R2, R Square, R2 Adjusted.
- aka: also known as
- Alias: see the article Design of Experiments (DOE) – Part 2.
- Associated, Association: see the article Chi-Square Test for Independence.
- Assumptions: requirements for being able to use a particular test or analysis. For example, ANOM and ANOVA require approximately Normal data.
- Attributes data, Attributes Variable: same as Categorical or Nominal data or Variable. See the articles Variables and Chi-Square Test for Independence.
- Autocorrelation: see the article Residuals.
- Average Absolute Deviation: see the article Variance.
- Average: same as the Mean – the sum of a set of numerical values divided by the Count of values in the set.
- Bernoulli Trial: see the article Binomial Distribution.
- Beta: The probability of a Beta Error. See the article Alpha and Beta Errors.
- Beta Error: featured in the article Alpha and Beta Errors.
- Bias: see the article Sample, Sampling.
- Bin, Binning: see the articles Chi-Square Test for Goodness of Fit, and Charts/ Graphs/ Plots – Which to Use When.
- Block, Blocking: see the article Design of Experiments (DOE) – Part 3.
- Box Plot, Box and Whiskers Plot: see the article Charts/ Graphs/ Plots – Which to Use When.
- Cm, Cp, Cr, or Cpk: see the article Process Capability Analysis (PCA).
- Capability, Capability Index: see the article Process Capability Analysis (PCA).
- Categorical data, Categorical Variable: same as Attribute or Nominal data/Variable. See the articles Variables and Chi-Square Test for Independence.
- CDF: see Cumulative Density Function below.
- Central Limit Theorem: see the article Normal Distribution.
- Central Location: same as Central Tendency. See the article Distributions – Part 1: What They Are.
- Central Tendency: same as Central Location. See the article Distributions – Part 1: What They Are.
- Chebyshev's Theorem: see the article Standard Deviation.
- Confidence Coefficient: same as Confidence Level. See the article Alpha, α.
- Confidence Level: (aka Level of Confidence aka Confidence Coefficient) equals 1–Alpha. See the article Alpha, α.
- Confounding: see the article Design of Experiments (DOE) – Part 3.
- Contingency Table: see the article Chi-Square Test for Independence.
- Continuous data or Variables: see the articles Variables and Distributions – Part 3: Which to Use When.
- Control, "in…" or "out of…": see the article Control Charts – Part 1: General Concepts and Principles.
- Control Limits, Upper and Lower: see the article Control Charts – Part 1: General Concepts and Principles.
- Count data, Count Variables: aka Discrete data or Discrete Variables. See the article Variables.
- Covariance: see the article Correlation – Part 1.
- Criterion Variable: see the article Variables.
- Critical Region: same as Rejection Region. See the article Alpha, α.
- Cumulative Density Function (CDF): also known as the Cumulative Distribution Function; the formula for calculating the Cumulative Probability of a Range of values of a Continuous random Variable, e.g. the Cumulative Probability that x ≤ 0.5.
- Cumulative Probability: see the article Distributions – Part 2: How They Are Used.
- Curve Fitting: see the article Regression – Part 5: Simple Nonlinear.
- Dependent Variable: see the article Variables.
- Descriptive Statistics: See the article Inferential Statistics.
- Dot Plot: see the article Charts/ Graphs/ Plots – Which to Use When.
- Deviation: The difference between a data value and a specified value (usually the Mean). See the article Regression – Part 1: Sums of Squares. See also the article Standard Deviation.
- Discrete data or Variables: see the articles Variables and Distributions – Part 3: Which to Use When.
- Dispersion: see the article Variation/ Variability/ Dispersion/ Spread (they all mean the same thing).
- Effect Size: see the article Power.
- Empirical Rule: same as the 68-95-99.7 Rule. See the article Normal Distribution.
- Expected Frequency: see the articles Chi-Square Test for Goodness of Fit and Chi-Square Test for Independence.
- Expected Value: see the articles Chi-Square Test for Goodness of Fit and Chi-Square Test for Independence.
- Exponential: see the article Exponential Distribution.
- Exponential Curve: see the article Regression – Part 5: Simple Nonlinear.
- Exponential Transformation: see the article Regression – Part 5: Simple Nonlinear.
- Extremes: see the article Variation/ Variability/ Dispersion/ Spread.
- F-test: see the article F.
- Factor: see the articles ANOVA – Parts 3 and 4 and Design of Experiments – Part 1.
- False Positive: an Alpha or Type I Error; featured in the article Alpha and Beta Errors.
- False Negative: a Beta or Type II Error; featured in the article Alpha and Beta Errors.
- Frequency: a Count-like Statistic which can be non-integer. See the articles Chi-Square Test for Goodness of Fit and Chi-Square Test for Independence.
- Friedman Test: see the article Nonparametric.
- Generator: see the article Design of Experiments – Part 3.
- Goodness of Fit: see the articles Regression – Part 1: Sums of Squares and Chi-Square Test for Goodness of Fit.
- Histogram: see the article Charts/ Graphs/ Plots – Which to Use When.
- Independence: see the article Chi-Square Test for Independence.
- Independent Variable: see the article Variables.
- Interaction: see the articles ANOM; ANOVA – Part 4: 2-Way; Design of Experiments, Parts 1, 2, and 3; Regression – Part 4: Multiple Linear.
- Intercept: see the article Regression – Part 2: Simple Linear.
- Interquartile Range (IQR): see the article Variation/ Variability/ Dispersion/ Spread.
- Kruskal-Wallis Test: see the article Nonparametric.
- Kurtosis: a measure of the Shape of a Distribution. See the article Distributions – Part 1: What They Are.
- Least Squares (same as Least Sum of Squares or Ordinary Least Sum of Squares): see the articles Regression – Part 1: Sums of Squares and Regression – Part 2: Simple Linear.
- Least Sum of Squares: same as Least Squares (above).
- Level of Confidence: same as Confidence Level; equal to 1 – α. See the article Alpha, α.
- Level of Significance: same as Significance Level, Alpha (α). See the articles Alpha, α and Statistically Significant.
- Line Chart: see the article Charts/ Graphs/ Plots – Which to Use When.
- Logarithmic Curve, Logarithmic Transformation: see the article Regression – Part 5: Simple Nonlinear.
- Main Effect: the Effect of a single Factor, as opposed to an Interaction. See the articles ANOVA – Part 4: 2-Way and Design of Experiments, Part 2.
- Mann-Whitney Test: see the article Nonparametric.
- Mean: the Average. Along with Median and Mode, it is a measure of Central Tendency.
- Mean Absolute Deviation (MAD): see the article Variation/ Variability/ Dispersion/ Spread.
- Mean Sum of Squares: see the article ANOVA – Part 2: How it Does It.
- Measurement data: same as Continuous data above.
- Median: the middle value when the data are sorted. Along with Mean and Mode, it is a measure of Central Tendency. It is used instead of the Mean in Nonparametric Analysis. See the article Nonparametric.
- Mode: the most common value within a group (e.g. a Sample or Population or Process). There can be more than one Mode. Along with Mean and Median, Mode is a measure of Central Tendency.
- MSB and MSW: see the article ANOVA – Part 2: How it Does It.
- Multiple R: see the article r, Multiple R, r2, R2, R Square, R2 Adjusted.
- Multiplicative Law of Probability: see the article Chi-Square Test for Independence.
- Nominal data, Nominal Variable: Same as Categorical or Attributes data or Variable. See the article Variables.
- One-sided, One-tailed: see the articles Alternative Hypothesis and Alpha, α.
- One-way: same as 1-way; an analysis that has 1 Independent (x) Variable. E.g. 1-way ANOVA.
- Outlier: See the article Variation/ Variability/ Dispersion/ Spread.
- Parameter: a measure of a property of a Population or Process, e.g. the Mean or Standard Deviation. The counterpart for a Sample is called a "Statistic". Parameters are usually denoted by characters in the Greek Alphabet, such as μ or σ.
- Parametric: see the article Nonparametric.
- Pareto Chart: see the article Charts/ Graphs/ Plots – Which to Use When.
- PCA: see the article Process Capability Analysis (PCA).
- PDF: see Probability Density Function below.
- Pearson's Coefficient, Pearson's r: the Correlation Coefficient, r. See the article Correlation – Part 2.
- Performance Index: see the article Process Capability Analysis (PCA).
- PMF: see Probability Mass Function below.
- Polynomial Curve: see the article Regression – Part 5: Simple Nonlinear.
- "Population or Process": where most texts say "Population", this book adds "or Process". Ongoing Processes are handled the same as Populations, because new data values continue to be created. Thus, like Populations, we usually don't have complete data for Processes.
- Power Transformation: see the article Regression – Part 5: Simple Nonlinear.
- Probability Density Function (PDF): the formula for calculating the Probability density at a single value of a Continuous random Variable, e.g. at x = 5. Note that for a Continuous Variable, the Probability of any one exact value is zero; Probabilities are calculated for Ranges of values by integrating the PDF. (For Discrete random Variables, the corresponding term is Probability Mass Function, PMF.) See also Cumulative Density Function.
- Probability Distribution: see the article Distributions – Part 1: What They Are.
- Probability Mass Function (PMF): the formula for calculating the Probability of a single value of a Discrete random Variable, e.g. the Probability that x = 5.
- Qualitative Variable, Qualitative data: same as Categorical Variable, Categorical data above. See the article Variables and Chi-Square Test for Independence.
- Random Sample: see the article Sample, Sampling.
- Random Variable: see the article Variables.
- Range: see the article Variation/ Variability/ Dispersion/ Spread.
- Rejection Region: same as Critical Region. See the article Alpha, α.
- Replacement, Sampling With or Without: see the article Binomial Distribution.
- Resolution: see the article Design of Experiments (DOE) – Part 3.
- Response Variable: see the articles Variables and Design of Experiments (DOE) – Part 2.
- Run Rules: see the article Control Charts – Part 1.
- Scatterplot: see the article Charts/ Graphs/ Plots – Which to Use When.
- Shape: see the article Distributions – Part 1: What They Are.
- Significance Level: see the article Alpha, α.
- Significant: see the article Statistically Significant.
- Slope: see the article Regression – Part 2: Simple Linear.
- Spread: see the article Variation/ Variability/ Dispersion/ Spread.
- Standard Normal Distribution: see the articles Normal Distribution and z.
- Statistic: a measure of a property of a Sample, e.g. the Mean or Standard Deviation. The counterpart for a Population or Process is called a "Parameter". Statistics are usually denoted by characters based on the Roman Alphabet, such as x̅ or s.
- Statistical Inference: same as Inferential Statistics; see the article by that name.
- Statistical Process Control: see the article Control Charts – Part 1: General Concepts and Principles.
- Student's t: see the article t – The Test Statistic and Its Distributions.
- Tail: see the articles Alpha, α and Alternative Hypothesis.
- Three Sigma Rule: same as Empirical Rule and the 68-95-99.7 Rule. See the article Normal Distribution.
- Transformation: see the article Regression – Part 5: Simple Nonlinear.
- Two-sided, Two-tailed: same as 2-sided, 2-tailed. See the articles Alpha, α and Alternative Hypothesis.
- Two-way: same as 2-way; an analysis that has 2 Independent (x) Variables. E.g. 2-way ANOVA.
- Type I and Type II Errors: same as Alpha and Beta Errors, respectively. See the article by that name.
- Variables data: same as Continuous data. See the articles Variables and Distributions – Part 3: Which to Use When.
- Variability: see the article Variation/ Variability/ Dispersion/ Spread.
- Wilcoxon Test: see the article Nonparametric.
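Several of the definitions above (Mean, Median, Mode as measures of Central Tendency; a Statistic such as s computed from a Sample; and the PDF and CDF of the Standard Normal Distribution) can be made concrete with a minimal sketch using only Python's standard library. The small data set here is invented purely for illustration.

```python
import statistics

# A small, made-up Sample of Continuous/ Measurement data
sample = [4.0, 5.0, 5.0, 6.0, 10.0]

# Measures of Central Tendency (see Mean, Median, and Mode above)
print(statistics.mean(sample))    # Mean (the Average): 6.0
print(statistics.median(sample))  # Median (middle of the sorted values): 5.0
print(statistics.mode(sample))    # Mode (most common value): 5.0

# A Statistic describes a Sample and is written in Roman letters (e.g. s);
# the corresponding Parameter describes a Population or Process (e.g. σ).
s = statistics.stdev(sample)      # Sample Standard Deviation, s

# PDF vs. CDF for the Standard Normal Distribution (Mean 0, Standard Deviation 1)
z = statistics.NormalDist()       # defaults: mu=0.0, sigma=1.0
print(z.pdf(0))      # density at x = 0 (a height on the curve, not a Probability)
print(z.cdf(1.96))   # Cumulative Probability that x <= 1.96 (about 0.975)
```

Note how the one large value (10.0) pulls the Mean above the Median, and how `cdf` returns a Cumulative Probability for a Range (everything up to 1.96), while `pdf` returns only the height of the curve at a single point.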