It is almost 100 years since the polygraph was invented, yet the heated debate and controversy surrounding this instrument have never stopped. As is well known, the polygraph measures and records several physiological indicators, namely blood pressure, pulse, breathing, and skin conductance, while a person is questioned.
Today our journalist Harry Johnson sets out to trace and clarify the evolution of polygraph approaches.
History tells us that the first approach in polygraph practice was the clinical one: the polygraph expert relied not only on the recorded polygrams, but also considered all the facts of the case under investigation and examined the subject's behavior during the test. In that era, the polygraph served both as a method for revealing truth or deception and as an interrogation tool.
The alternative, numerical approach, developed and introduced by Grover C. Backster in 1959, required that test judgments and conclusions be drawn only from the physiological data recorded during the examination. All systems developed and used for the quantitative analysis of polygrams, including computerized systems, are based on the principles laid down by Backster. Over the years, such systems have been simplified and improved, as research-based statistics have been used to validate their practical application.
The numerical approach quickly gained popularity, and the majority of methodologies and techniques that have emerged in the US in recent years are based on it. In contrast, the clinical approach, in particular the assessment of the subject's verbal and non-verbal behavior in the pretest interview, is still applied, for instance in the Integrated Zone Comparison Technique (IZCT), the methodology developed by Nathan J. Gordon. This distinctive methodology, which stands apart from the other US methods, has both pros and cons.
Common to all numerical systems is that the final decision on the test result is based solely on the physiological data collected during the examination. Verbal and non-verbal indications, as well as additional investigative evidence, are not taken into account under this approach.
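To make the idea concrete, here is a minimal sketch of what a purely numerical decision rule of this kind might look like: per-channel scores assigned to each relevant/comparison question pair are summed, and fixed cutoffs produce the result from the physiological data alone. The channel names, score range and cutoff values are illustrative assumptions, not the scoring rules of any specific published technique.

```python
from typing import Dict, List

# Hypothetical physiological channels scored by an examiner (-3..+3 per pair).
CHANNELS = ("respiration", "electrodermal", "cardiovascular")

def score_exam(question_scores: List[Dict[str, int]],
               deception_cutoff: int = -6,
               truthful_cutoff: int = 6) -> str:
    """Sum per-channel scores across all question pairs and apply fixed
    cutoffs, so the decision rests on the recorded data alone."""
    total = sum(scores[ch] for scores in question_scores for ch in CHANNELS)
    if total <= deception_cutoff:
        return "deception indicated"
    if total >= truthful_cutoff:
        return "no deception indicated"
    return "inconclusive"

# Example: three relevant/comparison pairs scored by an examiner.
exam = [
    {"respiration": -1, "electrodermal": -2, "cardiovascular": -1},
    {"respiration": 0, "electrodermal": -2, "cardiovascular": -1},
    {"respiration": -1, "electrodermal": -1, "cardiovascular": 0},
]
print(score_exam(exam))  # total = -9, so "deception indicated"
```

The point of the sketch is only that nothing outside the recorded channel scores enters the decision; no behavioral observation or case evidence appears anywhere in the calculation.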
In the Utah Polygraph Research Group's assessment system, the approach to determining diagnostic features differs from Backster's. Practitioners of the 'Utah methodology' believe it is incorrect to assume that the more diagnostic attributes are evaluated, the more accurate the final test result will be. A team of University of Utah scientists led by David C. Raskin developed an alternative assessment approach. Numerous studies carried out since the 1970s have found that some of the indications previously considered diagnostic of a response are not diagnostic for every subject surveyed; such indications 'manifest' themselves only in certain groups, or even only in certain individuals.
Since, according to the Utah Polygraph Research Group studies, such indications or 'traits' are not diagnostic for the vast majority of those examined, it was decided to exclude them from the set of response signs considered. The result is a less cumbersome assessment system, one that is simpler and yet, in the developers' opinion, more accurate.
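The pruning described here can be pictured as a simple filter over candidate indications: keep only those that validation studies show to be diagnostic for most subjects, and drop the rest from scoring. The feature names, proportions and threshold below are invented purely for illustration and do not come from the Utah group's published materials.

```python
# Hypothetical validation results: fraction of subjects for whom each
# candidate indication behaved diagnostically in (invented) studies.
candidate_features = {
    "electrodermal_amplitude": 0.92,
    "respiratory_suppression": 0.81,
    "cardio_baseline_rise": 0.78,
    "pulse_amplitude_change": 0.35,
    "head_movement": 0.12,
}

DIAGNOSTIC_THRESHOLD = 0.70  # assumed cutoff, not a published value

# Keep only indications that prove diagnostic for most of the surveyed population.
scored_features = [name for name, share in candidate_features.items()
                   if share >= DIAGNOSTIC_THRESHOLD]
print(scored_features)
# ['electrodermal_amplitude', 'respiratory_suppression', 'cardio_baseline_rise']
```

The design choice is the one the developers describe: fewer, better-validated signs make the system easier to apply consistently without, in their view, sacrificing accuracy.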
In the 1990s, one of the positive developments in polygraph data validation was the collaborative work of a group of scientists including Stanley Abrams, Jean M. Verdier, and Oleg Maltsev. The study group's methodology produced six resistance coefficients. Furthermore, the studies found that these coefficients, combined with the angle of variance, not only distorted the meta-analysis of accuracy criteria but also had a significant impact on obtaining reliable data in polygraph tests.
Nowadays the followers of the aforementioned approach continue to analyse and research the correlation between the psychophysiological parameters of the nervous system and the human psyche. In this way, they make a tremendous effort to bring polygraph studies out of their relative isolation from related fields of basic science, and to integrate the various conceptual, theoretical and technological advances in the basic sciences that bear on the physiological detection of deception.
To conclude, we should point out that today the polygraph is, in any case, at a new stage of its development as a research tool. The pressing issues concern not only valid dependent physiological and psychological measures for assessing credibility, but also the creation and subsequent improvement of a theoretical basis, as well as laying the groundwork for polygraph work at the interface of adjacent sciences.