# Office of Information Technology (OIT)

Helpdesk: helpdesk@uta.edu · 817-272-2208

Contact: connolly@uta.edu · (817) 272-5021

## Short Courses

Short Courses are taught in a single two-hour session. Below are the syllabi of the Short Courses currently offered.


### Introduction to data analysis using the SAS software package

I. The Data Step

A. Entering data directly

1. Column Style
2. Free Style
3. Formatted

B. Importing from a database

1. Excel
2. dBase
C. Creating new variables from existing variables

D. Creating new data sets from old data sets

1. Set
2. Merge
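The SET and MERGE statements combine data sets in two different ways: SET stacks the observations of one data set after another, while MERGE matches observations side by side on a BY variable. A rough Python analogy, using lists of dictionaries in place of SAS data sets (the data and names here are invented for illustration):

```python
# SET: stack (concatenate) observations from two data sets.
def sas_set(a, b):
    return a + b

# MERGE ... BY key: match observations side by side on a key,
# keeping rows whose key appears in either data set (like a
# one-to-one match-merge on sorted data).
def sas_merge(a, b, key):
    rows = {}
    for obs in a + b:          # later data sets overwrite earlier ones,
        k = obs[key]           # as in SAS when variable names collide
        rows.setdefault(k, {}).update(obs)
    return [rows[k] for k in sorted(rows)]

spring = [{"id": 1, "score": 85}, {"id": 2, "score": 90}]
fall   = [{"id": 1, "grade": "A"}, {"id": 3, "grade": "B"}]

stacked = sas_set(spring, fall)          # 4 observations
merged  = sas_merge(spring, fall, "id")  # 3 observations, ids 1-3
```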

II. The Procedure Step

A. PROC SORT (data sorting)

B. PROC PRINT (printing your data)

C. PROC MEANS (descriptive statistics)

D. PROC UNIVARIATE (descriptive statistics)

E. PROC FREQ (frequencies)

F. PROC CORR (correlations)

G. PROC PLOT (graphics)

H. PROC REG (regression)

I. PROC GLM (ANOVA)

J. PROC TTEST (t-tests)
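By default, PROC MEANS reports N, the mean, the standard deviation, the minimum, and the maximum for each numeric variable. Outside SAS, the same default statistics can be computed directly; a minimal Python sketch (the data are invented):

```python
import math

def proc_means(values):
    """Default PROC MEANS statistics: N, mean, std dev, min, max."""
    n = len(values)
    mean = sum(values) / n
    # SAS reports the sample standard deviation (n - 1 in the denominator).
    std = math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))
    return {"N": n, "Mean": mean, "Std Dev": std,
            "Minimum": min(values), "Maximum": max(values)}

stats = proc_means([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
```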

### Introduction to data analysis using the SPSS software package

I. Entering Data

A. Defining variables and entering data directly

B. Importing from a database

1. Excel

2. dBase

II. Data

A. Sorting

B. Selecting

C. Merging

III. Transform

A. Computing new variables from old

B. Recoding variables

IV. Analyze

A. Descriptive statistics

B. Compare means

C. Correlation

D. Regression

V. Graphs

A. Scatter plots

### Regression analysis using the SAS software package

I. What is linear regression?

A. Model Structure

B. Model Assumptions

1. Linear in the parameters

2. X's are known, non-random constants

3. Errors

a. homoscedasticity

b. uncorrelated

c. mean=0

d. normal
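For simple linear regression the model is Y = β₀ + β₁X + ε, with the error assumptions listed above. The least-squares estimates have the closed form β̂₁ = Sxy/Sxx and β̂₀ = Ȳ − β̂₁X̄, and the fitted residuals always sum to zero by construction. A minimal Python sketch with invented data:

```python
def ols_simple(x, y):
    """Closed-form least squares for y = b0 + b1 * x."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b1 = sxy / sxx                 # slope estimate
    b0 = ybar - b1 * xbar          # intercept estimate
    residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    return b0, b1, residuals

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1, res = ols_simple(x, y)
# The fitted residuals sum to (numerically) zero.
```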

II. Preliminary steps before fitting the model

A. Choosing a pool of potential independent variables for consideration in the model

B. Scatter plots

III. Fitting the model using PROC REG

A. Testing model fit and assumptions using residual plots

1. normal probability plots

2. partial regression plots (PARTIAL)

B. ANOVA table

1. Overall F-test

2. Mean square error (MSE)

3. R-squared

C. Least squares parameter estimates with standard errors

1. Significance tests

a. Individual tests

b. Family-wise tests (Bonferroni)

2. Implications of multicollinearity

a. Variance Inflation Factor (VIF)

b. Polynomial models

c. Ridge regression

i. trade-off between biased parameter estimates and smaller variance
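The variance inflation factor for predictor j is VIF_j = 1/(1 − R²_j), where R²_j comes from regressing X_j on the other predictors; with exactly two predictors this reduces to 1/(1 − r²) for their correlation r. A Python sketch of the two-predictor case (the data are invented):

```python
import math

def vif_two_predictors(x1, x2):
    """VIF for either predictor when there are exactly two:
    VIF = 1 / (1 - r^2), r = correlation between x1 and x2."""
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    sxy = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    sxx = sum((a - m1) ** 2 for a in x1)
    syy = sum((b - m2) ** 2 for b in x2)
    r = sxy / math.sqrt(sxx * syy)
    return 1.0 / (1.0 - r ** 2)

# Nearly collinear predictors give a large VIF (> 10 is a common flag).
x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [1.1, 2.0, 3.1, 3.9, 5.1]
```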

D. Predicted values (P)

1. As conditional mean with confidence limits (CLM)

2. As new value estimate with confidence limits (CLI)

E. Outlier detection

1. Identifying X outliers - Hat Matrix Leverage Values

2. Identifying Y outliers - Studentized Deleted Residuals

3. Identifying influential cases (INFLUENCE)

a. DFFITS

b. DFBETAS

c. Cook's Distance (R)
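For simple linear regression the leverage (hat-matrix diagonal) values and Cook's distances have closed forms: h_i = 1/n + (x_i − x̄)²/Sxx, and D_i = e_i²·h_i / (p·MSE·(1 − h_i)²) with p = 2 parameters. A Python sketch (data invented; leverages always sum to p):

```python
def influence_simple(x, y):
    """Leverage values and Cook's distances for y = b0 + b1*x."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b0 = ybar - b1 * xbar
    e = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]   # residuals
    p = 2                                  # parameters: intercept and slope
    mse = sum(ei ** 2 for ei in e) / (n - p)
    h = [1.0 / n + (xi - xbar) ** 2 / sxx for xi in x]  # leverages
    d = [ei ** 2 * hi / (p * mse * (1 - hi) ** 2)       # Cook's distances
         for ei, hi in zip(e, h)]
    return h, d

h, d = influence_simple([1.0, 2.0, 3.0, 4.0, 5.0],
                        [2.1, 3.9, 6.2, 7.8, 10.1])
# sum(h) == p == 2; points at the ends of the x range have the
# highest leverage.
```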

F. Automated model selection procedures

1. SELECTION=RSQUARE

2. SELECTION=STEPWISE (SLE) (SLS)

3. SELECTION=FORWARD

4. SELECTION=BACKWARD
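SELECTION=RSQUARE ranks candidate models by R², and SELECTION=FORWARD begins by entering the single predictor with the best fit. The first forward step can be sketched in Python as follows (candidate names and data are invented for illustration):

```python
def r_squared(x, y):
    """R^2 of the simple regression of y on x."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    syy = sum((yi - ybar) ** 2 for yi in y)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    return sxy ** 2 / (sxx * syy)

def forward_first_step(candidates, y):
    """Pick the candidate predictor with the largest R^2,
    as FORWARD selection does on its first step."""
    return max(candidates, key=lambda name: r_squared(candidates[name], y))

y = [1.0, 2.0, 3.0, 4.0]
candidates = {
    "x1": [1.0, 2.1, 2.9, 4.2],   # nearly linear in y
    "x2": [4.0, 1.0, 3.0, 2.0],   # weakly related
}
best = forward_first_step(candidates, y)   # "x1"
```

A full stepwise procedure would repeat this on the residuals, entering and removing predictors against the SLE and SLS significance thresholds.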

### Regression analysis using the SPSS software package

I. What is linear regression?

A. Model Structure

B. Model Assumptions

1. Linear in the parameters

2. X's are known, non-random constants

3. Errors

a. homoscedasticity

b. uncorrelated

c. mean=0

d. normal

II. Preliminary steps before fitting the model

A. Choosing a pool of potential independent variables for consideration in the model

B. Scatter plots (Graphs...Scatter)

III. Fitting the model (Analyze...Regression...Linear)

A. Method

1. Enter

2. Stepwise

3. Backward

4. Forward

B. ANOVA table

1. Overall F-test

2. Mean square error (MSE)

3. R-squared

C. Statistics

1. Estimates of parameters

a. Significance tests

i. individual

ii. family wise (Bonferroni)

b. SE

2. Confidence intervals for parameters

3. Descriptives

4. Collinearity

a. tolerance

b. variance inflation factor (VIF)

c. eigenvalues

d. condition index

e. variance proportions
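The condition index is √(λmax/λj) over the eigenvalues λ of the scaled cross-products matrix; indices above roughly 30 suggest serious collinearity. For two standardized predictors with correlation r, the eigenvalues of the correlation matrix are 1 + |r| and 1 − |r|, so the largest index can be sketched as:

```python
import math

def condition_index_two(r):
    """Largest condition index for two standardized predictors with
    correlation r: eigenvalues of [[1, r], [r, 1]] are 1 + |r|, 1 - |r|."""
    lam_max, lam_min = 1.0 + abs(r), 1.0 - abs(r)
    return math.sqrt(lam_max / lam_min)

# r = 0 gives index 1 (no collinearity); the index grows without
# bound as |r| approaches 1.
```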

D. Save

1. Predicted values

a. SE of mean predicted values

b. Prediction intervals

i. mean

ii. individual

2. Residuals

a. unstandardized

b. standardized

c. studentized

d. deleted

e. studentized deleted

3. Distances

a. Cook's D

b. Leverage value

4. Influence statistics

a. DfBetas

b. DfFit

c. Covariance ratio

E. Plots

1. Residual plots

2. Partial regression plots