Different Types of Statistical Tests

Inferential statistics are the set of statistical tests we use to draw inferences about data. These tests help us decide whether the pattern we are observing is real or could have arisen just by chance.

Types of statistical tests: There is an extensive range of statistical tests. The research design, the distribution of the data, and the type of variable determine which test to use. In general, if the data are normally distributed we choose a parametric test; if the data are non-normal we choose from the set of non-parametric tests. The main families of tests and their uses are described below; a short code sketch for each family follows the list.

A) Correlational: These tests look for an association between variables.
Pearson correlation: It tests the strength of association between two continuous variables.
Spearman correlation: It tests the strength of association between two ordinal variables.
Chi-square: It tests for an association between two categorical variables.

B) Comparison of means: These tests look for differences between the means of variables.
Paired t-test: It tests the difference between two related measurements, such as the same group measured before and after a treatment.
Independent t-test: It tests the difference between the means of two independent groups.
ANOVA: It tests the difference between group means after any other variance in the outcome variable is accounted for.

C) Regression: These tests evaluate whether change in one variable predicts change in another variable.
Simple regression: It tests how change in the predictor variable predicts change in the outcome variable.
Multiple regression: It tests how change in a combination of two or more predictor variables predicts change in the outcome variable.

D) Non-parametric: These tests are used when the data do not meet the assumptions required for parametric tests.
Wilcoxon rank-sum test: It tests the difference between two independent variables – accounting...
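The sketch below illustrates the correlational family with SciPy. The section does not name a library or dataset, so the use of scipy.stats and the simulated variables (hours_studied, exam_score, the satisfaction/loyalty ranks, and the contingency table) are assumptions made purely for illustration.

```python
# Illustrative sketch of correlational tests (Pearson, Spearman, chi-square).
# All data below are simulated; variable names are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Two continuous variables -> Pearson correlation
hours_studied = rng.normal(5, 1.5, size=50)
exam_score = 60 + 4 * hours_studied + rng.normal(0, 5, size=50)
r, p = stats.pearsonr(hours_studied, exam_score)
print(f"Pearson r={r:.2f}, p={p:.3f}")

# Two ordinal (ranked) variables -> Spearman correlation
satisfaction_rank = rng.integers(1, 6, size=50)   # e.g. 1-5 ratings
loyalty_rank = np.clip(satisfaction_rank + rng.integers(-1, 2, size=50), 1, 5)
rho, p = stats.spearmanr(satisfaction_rank, loyalty_rank)
print(f"Spearman rho={rho:.2f}, p={p:.3f}")

# Two categorical variables -> chi-square test on a contingency table
contingency = np.array([[30, 10],    # rows: group A / group B
                        [20, 25]])   # cols: outcome yes / no
chi2, p, dof, expected = stats.chi2_contingency(contingency)
print(f"Chi-square={chi2:.2f}, dof={dof}, p={p:.3f}")
```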
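For the comparison-of-means family, a minimal sketch using SciPy is shown below; the before/after measurements and the three groups are simulated for illustration only.

```python
# Illustrative sketch of mean-comparison tests (paired t-test,
# independent t-test, one-way ANOVA) on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Paired t-test: the same subjects measured before and after a treatment
before = rng.normal(120, 10, size=30)
after = before - rng.normal(5, 3, size=30)        # related measurements
t, p = stats.ttest_rel(before, after)
print(f"Paired t-test: t={t:.2f}, p={p:.3f}")

# Independent t-test: two separate, unrelated groups
group_a = rng.normal(50, 8, size=40)
group_b = rng.normal(55, 8, size=40)
t, p = stats.ttest_ind(group_a, group_b)
print(f"Independent t-test: t={t:.2f}, p={p:.3f}")

# One-way ANOVA: three or more group means compared at once
group_c = rng.normal(53, 8, size=40)
f, p = stats.f_oneway(group_a, group_b, group_c)
print(f"ANOVA: F={f:.2f}, p={p:.3f}")
```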
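The regression family can be sketched as follows. The choice of statsmodels, and the predictor/outcome names (advertising, price, sales), are assumptions added for the example, not something the section itself specifies.

```python
# Illustrative sketch of simple and multiple regression with statsmodels.
# Data and variable names are simulated/hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 100

advertising = rng.normal(100, 20, size=n)          # predictor 1
price = rng.normal(10, 2, size=n)                  # predictor 2
sales = 5 + 0.4 * advertising - 2.0 * price + rng.normal(0, 5, size=n)

# Simple regression: one predictor, one outcome
X_simple = sm.add_constant(advertising)            # adds the intercept term
simple_fit = sm.OLS(sales, X_simple).fit()
print(simple_fit.params)                           # intercept and slope
print(simple_fit.pvalues)                          # significance of each term

# Multiple regression: a combination of two or more predictors
X_multi = sm.add_constant(np.column_stack([advertising, price]))
multi_fit = sm.OLS(sales, X_multi).fit()
print(multi_fit.summary())
```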
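Finally, a minimal sketch of the Wilcoxon rank-sum test, which compares two independent samples without assuming a normal distribution. The skewed example data are simulated for illustration.

```python
# Illustrative sketch of a non-parametric test: the Wilcoxon rank-sum test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Skewed (non-normal) data from two independent groups
group_a = rng.exponential(scale=2.0, size=35)
group_b = rng.exponential(scale=3.0, size=35)

stat, p = stats.ranksums(group_a, group_b)
print(f"Wilcoxon rank-sum: statistic={stat:.2f}, p={p:.3f}")
```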