Academic journal article Psychologische Beiträge

Feature Pattern Analysis in the Context of Other Models and Methods



Feature Pattern Analysis (FPA) is a formal research method, not a method of intervention. It describes the structure of contingencies in co-occurrence data. FPA is a model for "objects by attributes" data; the model includes neither distributional assumptions, nor an explicit error theory, nor an implicit use of similarity measures. This paper discusses similarities and differences between FPA and Formal Concept Analysis (Formale Begriffsanalyse), Configural Frequency Analysis (KFA), Latent Class Analysis, Hierarchical Classes Analysis (HICLAS), and Prediction Analysis (Del). It briefly mentions the foundational relationship of FPA to MDS and Unfolding.

Key words: "Objects by attributes" methods, Formal Concept Analysis, MDS, Unfolding

A general perspective

In the behavioral and social sciences one may broadly distinguish between research methods and methods of intervention, the latter being used in clinical and educational psychology and in many other applied fields. Research methods, to continue with global distinctions, may be characterized as either hermeneutic (sometimes misleadingly called 'qualitative') or as formal. While both approaches are concerned with observations, the decisive difference is the conceptualization of observations as data and as variables that is implied in the formal methods.

Feature Pattern Analysis (FPA) is one of these formal methods, which may be further divided into those that put more emphasis on describing the variability in the data in terms of structures and processes, and those aiming at explanations of that variability.

For more than two generations, the explanatory methods have been dominated by the general linear model, of which causal modeling by LISREL and similar approaches is only the youngest and most sophisticated family member. Different effects of experimental conditions are conceived of as differences between means, and the covariation between variables is analyzed by regression equations. Some methods, like Principal Component Analysis or Thurstonian Factor Analysis, apparently come close to those procedures we just called descriptive. But their basic concepts belong to the general linear model: if sets of covariances or correlations satisfy certain conditions, they may be summarized by a few entities called factors or components.

While the general linear model is, no doubt, very elegant and well understood, its validity rests on assumptions which are rarely satisfied by data from research in the social and behavioral sciences. A very different alternative is therefore rapidly gaining ground: inferential statistics based on randomization and permutation, the Jackknife, the Bootstrap, and other resampling plans. Ideas and procedures from this area are likely to be combined in the near future with approaches from the realm of descriptive procedures.
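The resampling idea mentioned above can be illustrated with a minimal percentile bootstrap for the mean of a sample. This is a generic sketch, not a procedure from the article; the data, function name, and parameter choices are illustrative assumptions:

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_resamples=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic.

    Draws resamples of the observed data with replacement, recomputes
    the statistic on each resample, and reads the interval off the
    empirical distribution of those replicates -- no distributional
    assumption about the population is needed.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(sample)
    replicates = sorted(
        stat([rng.choice(sample) for _ in range(n)])
        for _ in range(n_resamples)
    )
    lo = replicates[int(alpha / 2 * n_resamples)]
    hi = replicates[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# Illustrative data only.
data = [4.1, 5.2, 6.0, 4.8, 5.5, 6.3, 4.9, 5.1, 5.7, 6.1]
low, high = bootstrap_ci(data)
```

The same resampling loop works for any statistic, which is precisely why such plans sidestep the assumptions of the general linear model.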

The descriptive procedures have participated in the fast development of the whole field of methods. In this part of the area, one might distinguish between representation procedures and measurement models. The representation procedures, like all more recent developments, profited from the explosion of computing power on our desktops. Perhaps one can find the beginning of this trend in Tukey's "Exploratory Data Analysis" (1977); Schnell's (1994) book is typical in this respect. The emphasis is on visualizing the data. Quite often, the representations are data transformations, with or without loss of information. Formal Concept Analysis (FCA) is to be mentioned here. FCA is not a model, it cannot be falsified, and the representation contains all the information in the data. The original data may be reproduced from their representation completely without error, distortion or loss. The gain of using this or other representational procedures is to obtain a concise structuring of the data. This usually will help to understand their implications or store them efficiently on technical media for intelligent retrieval.
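The lossless character of FCA can be seen in a toy formal context, i.e. a binary objects-by-attributes table. The two derivation operators below are the standard FCA operators; the objects and attributes themselves are an invented example, not data from the article:

```python
# A formal context: objects, attributes, and a binary incidence relation.
objects = ["dove", "hen", "duck"]
attributes = ["flies", "swims", "lays_eggs"]
incidence = {
    ("dove", "flies"), ("dove", "lays_eggs"),
    ("hen", "lays_eggs"),
    ("duck", "flies"), ("duck", "swims"), ("duck", "lays_eggs"),
}

def common_attributes(objs):
    """Derivation operator: attributes shared by all objects in objs."""
    return {m for m in attributes if all((g, m) in incidence for g in objs)}

def common_objects(attrs):
    """Derivation operator: objects possessing all attributes in attrs."""
    return {g for g in objects if all((g, m) in incidence for m in attrs)}

# A formal concept is a pair (extent, intent) where each determines the other.
extent = common_objects({"flies"})    # objects that fly
intent = common_attributes(extent)    # everything those objects share
```

Since the incidence relation is carried along unchanged, the original table can always be read back from the concepts: the representation restructures the data without discarding any of it.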

Measurement in its most general meaning is the assignment of formal or symbolic entities (numbers, vectors, graph theoretical or geometrical concepts) "to objects or events in such a way that specified relations among the numbers represent certain empirical relations among the objects. …
