AN EVALUATION OF METHODS TO SELECT RESPONDENTS TO STRUCTURED JOB-ANALYSIS QUESTIONNAIRES

The purpose of this study was to evaluate methods for selecting respondents who would respond accurately to items on a job-analysis questionnaire. One general method involved obtaining measures of employees' background, performance, and organizational information; such measures could be used to identify respondents who were knowledgeable about the job and therefore able to rate it accurately. A second general method involved collecting job-analysis data from all potential respondents and selecting a subsample on the basis of indices computed from these data. Two indices were investigated: (1) the D index, which assessed the similarity between an individual's ratings and the population's mean ratings, and (2) the carelessness index, which measured an individual's tendency to rate tasks known to be unrelated to the focal job as important. Both methods were applied to a sample of 343 mental-health workers. On the basis of the results, four general postulates for job analysts were proposed: (1) Different selection measures yield somewhat different sets of job-analysis respondents. (2) Respondents are not equally accurate and, with the carelessness index, can be screened for the tendency to make errors. (3) In some applications, more than three respondents must be sampled to obtain reliable results. (4) The more ill-defined and unstable the job, the more important, and the riskier, the selection of job-analysis respondents becomes.
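The two screening indices described above can be sketched in code. This is a minimal illustration, not the study's exact procedure: the function names are invented, the D index is assumed to be the Cronbach-Gleser profile distance (the Euclidean distance between an individual's rating profile and the population mean profile), and the carelessness index is assumed to be the mean importance rating given to bogus tasks.

```python
import math

def d_index(ratings, population_mean):
    # Assumed Cronbach-Gleser D: Euclidean distance between an
    # individual's profile of task ratings and the population's mean
    # profile. Smaller values indicate closer agreement with the group.
    return math.sqrt(sum((r - m) ** 2
                         for r, m in zip(ratings, population_mean)))

def carelessness_index(ratings, bogus_indices):
    # Mean importance rating assigned to tasks known to be unrelated
    # to the focal job; higher values suggest careless responding.
    return sum(ratings[i] for i in bogus_indices) / len(bogus_indices)
```

For example, a respondent whose ratings match the population mean on every task would obtain D = 0, while a respondent who rates the planted bogus tasks as important would obtain a high carelessness score and could be screened out.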
