
SPSS Advanced Statistics 17.0

For more information about SPSS Inc. software products, please visit our Web site at http://www.spss.com or contact

SPSS Inc.
233 South Wacker Drive, 11th Floor
Chicago, IL 60606-6412
Tel: (312) 651-3000
Fax: (312) 651-3668

SPSS is a registered trademark and the other product names are the trademarks of SPSS Inc. for its proprietary computer software. No material describing such software may be produced or distributed without the written permission of the owners of the trademark and license rights in the software and the copyrights in the published materials.

The SOFTWARE and documentation are provided with RESTRICTED RIGHTS. Use, duplication, or disclosure by the Government is subject to restrictions as set forth in subdivision (c) (1) (ii) of The Rights in Technical Data and Computer Software clause at 52.227-7013. Contractor/manufacturer is SPSS Inc., 233 South Wacker Drive, 11th Floor, Chicago, IL 60606-6412.

Patent No. 7,023,453

General notice: Other product names mentioned herein are used for identification purposes only and may be trademarks of their respective companies.

Windows is a registered trademark of Microsoft Corporation.

Apple, Mac, and the Mac logo are trademarks of Apple Computer, Inc., registered in the U.S. and other countries.

This product uses WinWrap Basic, Copyright 1993-2007, Polar Engineering and Consulting, http://www.winwrap.com.

Printed in the United States of America.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher.

Preface

SPSS Statistics 17.0 is a comprehensive system for analyzing data. The Advanced Statistics optional add-on module provides the additional analytic techniques described in this manual. The Advanced Statistics add-on module must be used with the SPSS Statistics 17.0 Base system and is completely integrated into that system.

Installation

To install the Advanced Statistics add-on module, run the License Authorization Wizard using the authorization code that you received from SPSS Inc. For more information, see the installation instructions supplied with the Advanced Statistics add-on module.

Compatibility

SPSS Statistics is designed to run on many computer systems. See the installation instructions that came with your system for specific information on minimum and recommended requirements.

Serial Numbers

Your serial number is your identification number with SPSS Inc. You will need this serial number when you contact SPSS Inc. for information regarding support, payment, or an upgraded system. The serial number was provided with your Base system.

Customer Service

If you have any questions concerning your shipment or account, contact your local office, listed on the Web site at http://www.spss.com/worldwide. Please have your serial number ready for identification.

Training Seminars

SPSS Inc. provides both public and onsite training seminars. All seminars feature hands-on workshops. Seminars will be offered in major cities on a regular basis. For more information on these seminars, contact your local office, listed on the Web site at http://www.spss.com/worldwide.

Technical Support

Technical Support services are available to maintenance customers. Customers may contact Technical Support for assistance in using SPSS Statistics or for installation help for one of the supported hardware environments. To reach Technical Support, see the Web site at http://www.spss.com, or contact your local office, listed on the Web site at http://www.spss.com/worldwide. Be prepared to identify yourself, your organization, and the serial number of your system.

Additional Publications

The SPSS Statistical Procedures Companion, by Marija Norušis, has been published by Prentice Hall. A new version of this book, updated for SPSS Statistics 17.0, is planned. The SPSS Advanced Statistical Procedures Companion, also based on SPSS Statistics 17.0, is forthcoming. The SPSS Guide to Data Analysis for SPSS Statistics 17.0 is also in development. Announcements of publications available exclusively through Prentice Hall will be available on the Web site at http://www.spss.com/estore (select your home country, and then click Books).

Contents

1  Introduction to Advanced Statistics    1

2  GLM Multivariate Analysis    3
   GLM Multivariate Model    6
      Build Terms    7
      Sum of Squares    7
   GLM Multivariate Contrasts    9
      Contrast Types    9
   GLM Multivariate Profile Plots    10
   GLM Multivariate Post Hoc Comparisons    12
   GLM Save    14
   GLM Multivariate Options    16
   GLM Command Additional Features    18

3  GLM Repeated Measures    19
   GLM Repeated Measures Define Factors    23
   GLM Repeated Measures Model    25
      Build Terms    26
      Sum of Squares    26
   GLM Repeated Measures Contrasts    28
      Contrast Types    29
   GLM Repeated Measures Profile Plots    30
   GLM Repeated Measures Post Hoc Comparisons    31
   GLM Repeated Measures Save    34
   GLM Repeated Measures Options    36
   GLM Command Additional Features    38

4  Variance Components Analysis    39
   Variance Components Model    42
      Build Terms    43
   Variance Components Options    43
      Sum of Squares (Variance Components)    44
   Variance Components Save to New File    46
   VARCOMP Command Additional Features    47

5  Linear Mixed Models    48
   Linear Mixed Models Select Subjects/Repeated Variables    51
   Linear Mixed Models Fixed Effects    53
      Build Non-Nested Terms    54
      Build Nested Terms    54
      Sum of Squares    55
   Linear Mixed Models Random Effects    56
   Linear Mixed Models Estimation    58
   Linear Mixed Models Statistics    60
   Linear Mixed Models EM Means    62
   Linear Mixed Models Save    63
   MIXED Command Additional Features    64

6  Generalized Linear Models    65
   Generalized Linear Models Response    71
      Generalized Linear Models Reference Category    72
   Generalized Linear Models Predictors    74
      Generalized Linear Models Options    76
   Generalized Linear Models Model    77
   Generalized Linear Models Estimation    79
      Generalized Linear Models Initial Values    81
   Generalized Linear Models Statistics    83
   Generalized Linear Models EM Means    86
   Generalized Linear Models Save    88
   Generalized Linear Models Export    91
   GENLIN Command Additional Features    93

7  Generalized Estimating Equations    94
   Generalized Estimating Equations Type of Model    98
   Generalized Estimating Equations Response    103
      Generalized Estimating Equations Reference Category    105
   Generalized Estimating Equations Predictors    106
      Generalized Estimating Equations Options    108
   Generalized Estimating Equations Model    109
   Generalized Estimating Equations Estimation    111
      Generalized Estimating Equations Initial Values    113
   Generalized Estimating Equations Statistics    115
   Generalized Estimating Equations EM Means    118
   Generalized Estimating Equations Save    121
   Generalized Estimating Equations Export    123
   GENLIN Command Additional Features    125

8  Model Selection Loglinear Analysis    126
   Loglinear Analysis Define Range    128
   Loglinear Analysis Model    129
      Build Terms    130
   Model Selection Loglinear Analysis Options    130
   HILOGLINEAR Command Additional Features    131

9  General Loglinear Analysis    132
   General Loglinear Analysis Model    135
      Build Terms    135
   General Loglinear Analysis Options    136
   General Loglinear Analysis Save    137
   GENLOG Command Additional Features    138

10  Logit Loglinear Analysis    139
    Logit Loglinear Analysis Model    142
       Build Terms    143
    Logit Loglinear Analysis Options    144
    Logit Loglinear Analysis Save    145
    GENLOG Command Additional Features    146

11  Life Tables    147
    Life Tables Define Events for Status Variables    150
    Life Tables Define Range    150
    Life Tables Options    151
    SURVIVAL Command Additional Features    152

12  Kaplan-Meier Survival Analysis    153
    Kaplan-Meier Define Event for Status Variable    155
    Kaplan-Meier Compare Factor Levels    156
    Kaplan-Meier Save New Variables    157
    Kaplan-Meier Options    158
    KM Command Additional Features    158

13  Cox Regression Analysis    160
    Cox Regression Define Categorical Variables    162
    Cox Regression Plots    164
    Cox Regression Save New Variables    165
    Cox Regression Options    166
    Cox Regression Define Event for Status Variable    167
    COXREG Command Additional Features    167

14  Computing Time-Dependent Covariates    168
    Computing a Time-Dependent Covariate    169
    Cox Regression with Time-Dependent Covariates Additional Features    170

Appendices

A  Categorical Variable Coding Schemes    171
   Deviation    171
   Simple    172
   Helmert    173
   Difference    173
   Polynomial    174
   Repeated    175
   Special    176
   Indicator    177

B  Covariance Structures    178

Index    183

Chapter 1
Introduction to Advanced Statistics

The Advanced Statistics option provides procedures that offer more advanced modeling options than are available through the Base system.

• GLM Multivariate extends the general linear model provided by GLM Univariate to allow multiple dependent variables. A further extension, GLM Repeated Measures, allows repeated measurements of multiple dependent variables.

• Variance Components Analysis is a specific tool for decomposing the variability in a dependent variable into fixed and random components.

• Linear Mixed Models expands the general linear model so that the data are permitted to exhibit correlated and nonconstant variability. The mixed linear model, therefore, provides the flexibility of modeling not only the means of the data but the variances and covariances as well.

• Generalized Linear Models (GZLM) relaxes the assumption of normality for the error term and requires only that the dependent variable be linearly related to the predictors through a transformation, or link function. Generalized Estimating Equations (GEE) extends GZLM to allow repeated measurements.

• General Loglinear Analysis allows you to fit models for cross-classified count data, and Model Selection Loglinear Analysis can help you to choose between models.

• Logit Loglinear Analysis allows you to fit loglinear models for analyzing the relationship between a categorical dependent and one or more categorical predictors.

• Survival analysis is available through Life Tables for examining the distribution of time-to-event variables, possibly by levels of a factor variable; Kaplan-Meier Survival Analysis for examining the distribution of time-to-event variables, possibly by levels of a factor variable or producing separate analyses by levels of a stratification variable; and Cox Regression for modeling the time to a specified event, based upon the values of given covariates.

Chapter 2
GLM Multivariate Analysis

The GLM Multivariate procedure provides regression analysis and analysis of variance for multiple dependent variables by one or more factor variables or covariates. The factor variables divide the population into groups. Using this general linear model procedure, you can test null hypotheses about the effects of factor variables on the means of various groupings of a joint distribution of dependent variables. You can investigate interactions between factors as well as the effects of individual factors. In addition, the effects of covariates and covariate interactions with factors can be included. For regression analysis, the independent (predictor) variables are specified as covariates.

Both balanced and unbalanced models can be tested. A design is balanced if each cell in the model contains the same number of cases. In a multivariate model, the sums of squares due to the effects in the model and error sums of squares are in matrix form rather than the scalar form found in univariate analysis. These matrices are called SSCP (sums-of-squares and cross-products) matrices. If more than one dependent variable is specified, the multivariate analysis of variance using Pillai's trace, Wilks' lambda, Hotelling's trace, and Roy's largest root criterion with approximate F statistic are provided as well as the univariate analysis of variance for each dependent variable. In addition to testing hypotheses, GLM Multivariate produces estimates of parameters.

Commonly used a priori contrasts are available to perform hypothesis testing. Additionally, after an overall F test has shown significance, you can use post hoc tests to evaluate differences among specific means. Estimated marginal means give estimates of predicted mean values for the cells in the model, and profile plots (interaction plots) of these means allow you to visualize some of the relationships easily. The post hoc multiple comparison tests are performed for each dependent variable separately.

Residuals, predicted values, Cook's distance, and leverage values can be saved as new variables in your data file for checking assumptions. Also available are a residual SSCP matrix, which is a square matrix of sums of squares and cross-products of residuals; a residual covariance matrix, which is the residual SSCP matrix divided by the degrees of freedom of the residuals; and the residual correlation matrix, which is the standardized form of the residual covariance matrix.

WLS Weight allows you to specify a variable used to give observations different weights for a weighted least-squares (WLS) analysis, perhaps to compensate for different precision of measurement.

Example. A manufacturer of plastics measures three properties of plastic film: tear resistance, gloss, and opacity. Two rates of extrusion and two different amounts of additive are tried, and the three properties are measured under each combination of extrusion rate and additive amount. The manufacturer finds that the extrusion rate and the amount of additive individually produce significant results but that the interaction of the two factors is not significant.

Methods. Type I, Type II, Type III, and Type IV sums of squares can be used to evaluate different hypotheses. Type III is the default.

Statistics. Post hoc range tests and multiple comparisons: least significant difference, Bonferroni, Sidak, Scheffé, Ryan-Einot-Gabriel-Welsch multiple F, Ryan-Einot-Gabriel-Welsch multiple range, Student-Newman-Keuls, Tukey's honestly significant difference, Tukey's b, Duncan, Hochberg's GT2, Gabriel, Waller-Duncan t test, Dunnett (one-sided and two-sided), Tamhane's T2, Dunnett's T3, Games-Howell, and Dunnett's C. Descriptive statistics: observed means, standard deviations, and counts for all of the dependent variables in all cells; the Levene test for homogeneity of variance; Box's M test of the homogeneity of the covariance matrices of the dependent variables; and Bartlett's test of sphericity.

Plots. Spread-versus-level, residual, and profile (interaction).

Data. The dependent variables should be quantitative. Factors are categorical and can have numeric values or string values. Covariates are quantitative variables that are related to the dependent variable.

Assumptions. For dependent variables, the data are a random sample of vectors from a multivariate normal population; in the population, the variance-covariance matrices for all cells are the same. Analysis of variance is robust to departures from normality, although the data should be symmetric. To check assumptions, you can use homogeneity of variances tests (including Box's M) and spread-versus-level plots. You can also examine residuals and residual plots.

Related procedures. Use the Explore procedure to examine the data before doing an analysis of variance. For a single dependent variable, use GLM Univariate. If you measured the same dependent variables on several occasions for each subject, use GLM Repeated Measures.

Obtaining GLM Multivariate Tables

► From the menus choose:
  Analyze > General Linear Model > Multivariate...

[Figure 2-1: Multivariate dialog box]

► Select at least two dependent variables.

Optionally, you can specify Fixed Factor(s), Covariate(s), and WLS Weight.
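If you prefer command syntax, a comparable analysis can be run with the GLM command (the dialog pastes similar syntax). The sketch below is minimal and uses hypothetical variable names for the plastics example: tear_res, gloss, and opacity as dependent variables and extrusn and additive as fixed factors.

    * Multivariate GLM: three dependent variables by two fixed factors.
    GLM tear_res gloss opacity BY extrusn additive
      /METHOD=SSTYPE(3)
      /INTERCEPT=INCLUDE
      /PRINT=DESCRIPTIVE HOMOGENEITY
      /DESIGN=extrusn additive extrusn*additive.

The DESIGN subcommand here spells out the full factorial model (both main effects and their interaction), which is what the dialog fits by default.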

GLM Multivariate Model

[Figure 2-2: Multivariate Model dialog box]

Specify Model. A full factorial model contains all factor main effects, all covariate main effects, and all factor-by-factor interactions. It does not contain covariate interactions. Select Custom to specify only a subset of interactions or to specify factor-by-covariate interactions. You must indicate all of the terms to be included in the model.

Factors and Covariates. The factors and covariates are listed.

Model. The model depends on the nature of your data. After selecting Custom, you can select the main effects and interactions that are of interest in your analysis.

Sum of squares. The method of calculating the sums of squares. For balanced or unbalanced models with no missing cells, the Type III sum-of-squares method is most commonly used.

Include intercept in model. The intercept is usually included in the model. If you can assume that the data pass through the origin, you can exclude the intercept.
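In command syntax, a custom model corresponds to listing exactly the terms you want on the DESIGN subcommand, and the intercept setting corresponds to the INTERCEPT subcommand. A minimal sketch with hypothetical names (y1 and y2 as dependent variables, groupfac as a factor, xcov as a covariate) that fits main effects plus a factor-by-covariate interaction with the intercept excluded:

    * Custom model with a factor-by-covariate interaction and no intercept term.
    GLM y1 y2 BY groupfac WITH xcov
      /INTERCEPT=EXCLUDE
      /DESIGN=groupfac xcov groupfac*xcov.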

Build Terms

For the selected factors and covariates:

Interaction. Creates the highest-level interaction term of all selected variables. This is the default.

Main effects. Creates a main-effects term for each variable selected.

All 2-way. Creates all possible two-way interactions of the selected variables.

All 3-way. Creates all possible three-way interactions of the selected variables.

All 4-way. Creates all possible four-way interactions of the selected variables.

All 5-way. Creates all possible five-way interactions of the selected variables.

Sum of Squares

For the model, you can choose a type of sums of squares. Type III is the most commonly used and is the default.

Type I. This method is also known as the hierarchical decomposition of the sum-of-squares method. Each term is adjusted for only the term that precedes it in the model. Type I sums of squares are commonly used for:
• A balanced ANOVA model in which any main effects are specified before any first-order interaction effects, any first-order interaction effects are specified before any second-order interaction effects, and so on.
• A polynomial regression model in which any lower-order terms are specified before any higher-order terms.
• A purely nested model in which the first-specified effect is nested within the second-specified effect, the second-specified effect is nested within the third, and so on. (This form of nesting can be specified only by using syntax.)

Type II. This method calculates the sums of squares of an effect in the model adjusted for all other "appropriate" effects. An appropriate effect is one that corresponds to all effects that do not contain the effect being examined. The Type II sum-of-squares method is commonly used for:
• A balanced ANOVA model.
• Any model that has main factor effects only.
• Any regression model.
• A purely nested design. (This form of nesting can be specified by using syntax.)

Type III. The default. This method calculates the sums of squares of an effect in the design as the sums of squares adjusted for any other effects that do not contain it and orthogonal to any effects (if any) that contain it. The Type III sums of squares have one major advantage in that they are invariant with respect to the cell frequencies as long as the general form of estimability remains constant. Hence, this type of sums of squares is often considered useful for an unbalanced model with no missing cells. In a factorial design with no missing cells, this method is equivalent to the Yates' weighted-squares-of-means technique. The Type III sum-of-squares method is commonly used for:
• Any models listed in Type I and Type II.
• Any balanced or unbalanced model with no empty cells.

Type IV. This method is designed for a situation in which there are missing cells. For any effect F in the design, if F is not contained in any other effect, then Type IV = Type III = Type II. When F is contained in other effects, Type IV distributes the contrasts being made among the parameters in F to all higher-level effects equitably. The Type IV sum-of-squares method is commonly used for:
• Any models listed in Type I and Type II.
• Any balanced model or unbalanced model with empty cells.
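In command syntax, the sum-of-squares type is selected with the METHOD subcommand. The order of terms on DESIGN matters for Type I, since each term is adjusted only for the terms listed before it. A minimal sketch with hypothetical variable names:

    * Type I (hierarchical) sums of squares; term order on DESIGN is meaningful.
    GLM y1 y2 BY a b
      /METHOD=SSTYPE(1)
      /DESIGN=a b a*b.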

GLM Multivariate Contrasts

[Figure 2-3: Multivariate Contrasts dialog box]

Contrasts are used to test whether the levels of an effect are significantly different from one another. You can specify a contrast for each factor in the model. Contrasts represent linear combinations of the parameters.

Hypothesis testing is based on the null hypothesis LBM = 0, where L is the contrast coefficients matrix, M is the identity matrix (which has dimension equal to the number of dependent variables), and B is the parameter vector. When a contrast is specified, an L matrix is created such that the columns corresponding to the factor match the contrast. The remaining columns are adjusted so that the L matrix is estimable.

In addition to the univariate test using F statistics and the Bonferroni-type simultaneous confidence intervals based on Student's t distribution for the contrast differences across all dependent variables, the multivariate tests using Pillai's trace, Wilks' lambda, Hotelling's trace, and Roy's largest root criteria are provided.

Available contrasts are deviation, simple, difference, Helmert, repeated, and polynomial. For deviation contrasts and simple contrasts, you can choose whether the reference category is the last or first category.

Contrast Types

Deviation. Compares the mean of each level (except a reference category) to the mean of all of the levels (grand mean). The levels of the factor can be in any order.

Simple. Compares the mean of each level to the mean of a specified level. This type of contrast is useful when there is a control group. You can choose the first or last category as the reference.

Difference. Compares the mean of each level (except the first) to the mean of previous levels. (Sometimes called reverse Helmert contrasts.)

Helmert. Compares the mean of each level of the factor (except the last) to the mean of subsequent levels.

Repeated. Compares the mean of each level (except the last) to the mean of the subsequent level.

Polynomial. Compares the linear effect, quadratic effect, cubic effect, and so on. The first degree of freedom contains the linear effect across all categories; the second degree of freedom, the quadratic effect; and so on. These contrasts are often used to estimate polynomial trends.

GLM Multivariate Profile Plots

[Figure 2-4: Multivariate Profile Plots dialog box]

Profile plots (interaction plots) are useful for comparing marginal means in your model. A profile plot is a line plot in which each point indicates the estimated marginal mean of a dependent variable (adjusted for any covariates) at one level of a factor. The levels of a second factor can be used to make separate lines. Each level in a third factor can be used to create a separate plot. All factors are available for plots. Profile plots are created for each dependent variable.

A profile plot of one factor shows whether the estimated marginal means are increasing or decreasing across levels. For two or more factors, parallel lines indicate that there is no interaction between factors, which means that you can investigate the levels of only one factor. Nonparallel lines indicate an interaction.

[Figure 2-5: Nonparallel plot (left) and parallel plot (right)]

After a plot is specified by selecting factors for the horizontal axis and, optionally, factors for separate lines and separate plots, the plot must be added to the Plots list.
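In command syntax, a priori contrasts and profile plots correspond to the CONTRAST and PLOT subcommands. A minimal sketch with hypothetical variable names, requesting simple contrasts for one factor (the reference category defaults to the last level) and a profile plot with separate lines for the second factor:

    * Simple contrasts for factor a and a profile plot of a by b.
    GLM y1 y2 BY a b
      /CONTRAST(a)=SIMPLE
      /PLOT=PROFILE(a*b)
      /DESIGN=a b a*b.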

GLM Multivariate Post Hoc Comparisons

[Figure 2-6: Multivariate Post Hoc Multiple Comparisons for Observed Means dialog box]

Post hoc multiple comparison tests. Once you have determined that differences exist among the means, post hoc range tests and pairwise multiple comparisons can determine which means differ. Comparisons are made on unadjusted values. The po
