Page 408 - Jolliffe I. Principal Component Analysis

14

Generalizations and Adaptations of Principal Component Analysis

The basic technique of PCA has been generalized or adapted in many ways, and some have already been discussed, in particular in Chapter 13 where adaptations for special types of data were described. This final chapter discusses a number of additional generalizations and modifications; for several of them the discussion is very brief in comparison to the large amount of material that has appeared in the literature.
  Sections 14.1 and 14.2 present some definitions of ‘non-linear PCA’ and ‘generalized PCA,’ respectively. In both cases there are connections with correspondence analysis, which was discussed at somewhat greater length in Section 13.1. Non-linear extensions of PCA (Section 14.1) include the Gifi approach, principal curves, and some types of neural network, while the generalizations of Section 14.2 cover many varieties of weights, metrics, transformations and centerings.
  Section 14.3 describes modifications of PCA that may be useful when secondary or ‘instrumental’ variables are present, and in Section 14.4 some possible alternatives to PCA for data that are non-normal are discussed. These include independent component analysis (ICA).
  Section 14.5 introduces the ideas of three-mode and multiway PCA. These analyses are appropriate when the data matrix, as well as having two dimensions corresponding to individuals and variables, respectively, has one or more extra dimensions corresponding, for example, to time.
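  To fix ideas, a minimal sketch of one simple way of handling such data (not the three-mode models themselves, which are developed in Section 14.5): the three-way array is "unfolded" (matricized) into an ordinary two-dimensional matrix, to which standard PCA is then applied. The array sizes and variable names here are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-mode data: 50 individuals x 4 variables x 6 time points.
X = rng.normal(size=(50, 4, 6))

# Unfold the array so each row is an individual and the columns run over
# all 4 * 6 = 24 variable-time combinations, giving a 50 x 24 matrix.
X_unfolded = X.reshape(50, 4 * 6)

# Centre the columns and take the SVD; the right singular vectors are the
# principal component loadings of the unfolded data.
Xc = X_unfolded - X_unfolded.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U * s                      # PC scores for the 50 individuals
loadings = Vt                       # rows: loadings over variable-time pairs
explained = s**2 / (s**2).sum()     # proportion of variance per component
```

  Unfolding discards the three-mode structure (a loading is estimated separately for every variable-time pair); the three-mode and multiway models of Section 14.5 instead impose structure across the modes.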
  The penultimate miscellaneous section (14.6) collects together some ideas from neural networks and goodness-of-fit, and presents some other modifications of PCA. The chapter ends with a few concluding remarks.