• Record Type:
    OSHA Instruction
  • Current Directive Number:
    STP 2.22A
  • Old Directive Number:
    STP 2.22A
  • Title:
    State Plan Policies and Procedures Manual
  • Information Date:
Archive Notice - OSHA Archive

NOTICE: This is an OSHA Archive Document, and may no longer represent OSHA Policy. It is presented here as historical content, for research and review purposes only.

OSHA Instruction STP 2.22A
MAY 14, 1986
Office of State Programs

SUBJECT: State Plan Policies and Procedures Manual

A. Purpose. This instruction transmits the revised State Plan Policies and Procedures Manual (SPM) and implements revisions to the State plan monitoring and evaluation system.

B. Scope. This instruction applies OSHA-wide.

C. References.

1. OSHA Instruction STP 2-1.117, State Standards, August 31, 1984.
2. OSHA Instruction STP 2-1.18, Changes to State Plans, October 30, 1978.
3. OSHA Instruction STP 2-1.21, Federal Program Changes Requiring Response by State Designees in Approved Plan States, October 30, 1978.
4. OSHA Instruction FIN 3.2, Financial and Administrative Monitoring of the 23(g) Grants, 7(c)(1) Cooperative Agreements and 24(b)(2) Grants and Contracts, August 27, 1984.

D. Cancellation. OSHA Instruction STP 2.22, August 26, 1983; and STP 2.22 CH-1, May 29, 1985, are canceled.

E. Action. OSHA Regional Administrators and Area Directors shall ensure that the following actions are accomplished:

1. Replace the SPM issued on August 26, 1983, and CH-1 issued on May 29, 1985, with the attached revised SPM. File one copy of this instruction in the appropriate OSHA Directives System binder and one copy at the front of a separate binder with the SPM as a record of this change.
2. Ensure that the SPM is promptly forwarded to State designees.

3. Explain the technical content of the attached SPM revisions to the State designees as requested.
4. Integrated Management Information System (IMIS) States. Apply the monitoring procedures set out in the attached SPM upon receipt of this instruction.
5. Non-IMIS States. Continue to apply the monitoring and evaluation procedures contained in Chapter XVI of the Field Operations Manual (FOM) until such time as a State participates in the IMIS and State Plan Activities Measures (SPAM) reports on the State are routinely produced.
6. Non-IMIS States Supplying Activities Measures Data. In States which are not yet participants in the IMIS but which agree to collect and provide data on the various Activities Measures on an interim basis:
a. Develop with the State a mutually acceptable format and schedule for the data submission. The data should be submitted on a quarterly basis and should cover as many of the Activities Measures as possible.
b. Compare the State performance data to corresponding Federal data for the same period in a manner similar to the SPAM report.
c. Apply the SPM procedures as much as possible in explaining or analyzing identified outliers. Whenever possible, use readily available information to determine the cause of an outlier.
d. Apply the on-site monitoring procedures contained in Chapter XVI of the FOM to the extent necessary to:
(1) Collect data relating to Activities Measures which are not available through the State alternative data submissions.

(2) Verify the accuracy of the State-submitted data.

(3) Explain or analyze outliers when review of State-submitted information or other available off-site information is not sufficient to do so. (Greater reliance on on-site monitoring will be necessary for a State that does not participate in the IMIS because inspection level data will not otherwise be available.)
7. Consultation Activities Measures. Until the consultation data system is operational, the following procedures shall be followed in evaluating consultation programs (public and/or private sector) conducted under the State plans (as distinct from those consultation programs operating under separate agreements pursuant to section 7(c)(1) of the Occupational Safety and Health Act of 1970):
a. Use the same measures that are used to evaluate 7(c)(1) consultation programs. These measures are included in Appendix A, section C.
b. Follow the instructions in the memoranda for Regional Administrators dated February 22, 1985 and March 22, 1985, which set out procedures for evaluating consultation programs until the consultation data system is operational. It is understood that data will not be available for all the Consultation Activities Measures. The March 22 memorandum indicates which of these measures have data that are collectible now.
(1) Every effort should be made to obtain the data from the State, through established State recordkeeping systems or agreed upon interim collection methods. Case file reviews by Federal monitors should be used only when no other means of data collection is available and then only in coordination with the State. A sample of case files for these reviews should be selected in accordance with Appendix K of the SPM.
(2) For those measures for which comparison data is not yet available, make a general determination on State performance based on the program objectives.

(3) For those measures where the further review level is an absolute number rather than a National average, evaluate State performance in comparison to the further review level.

F. Background.

1. Development of the SPM. The general monitoring and evaluation procedures set forth in this manual were developed in 1981 and 1982 by task forces composed of Federal and State representatives. The SPM replaced an evaluation system which was primarily dependent on on-site monitoring and which no longer appeared appropriate for States which have operated occupational safety and health programs for many years.
a. Activities Measures. OSHA's State plan monitoring system is based on Activities Measures which set out the criteria upon which State performance is measured. State and Federal data on the Activities Measures will be compared as a first step in the process of determining whether a State's program is at least as effective as the Federal program. The Activities Measures were originally developed in 1981 by five Federal/State task groups and were subsequently revised (See STP 2.22 CH-1, May 29, 1985.), based on experience in using the original measures. In general, the revisions were designed to make the measures more comprehensible and to facilitate comparison of Federal and State data. The revised Activities Measures are contained in Appendix A of the SPM, and their content and function are discussed in Chapter III.
b. IMIS. The data necessary to evaluate State performance in relation to the Activities Measures is obtained primarily through State participation in OSHA's computerized IMIS, which stores and processes the same types of data for both Federal and State programs. The IMIS ensures that Federal and State data are comparable through the use of uniform definitions and methods of calculation. Routine State Plan Activities Measures (SPAM) computer output reports which present both Federal and State data are produced quarterly. Monitoring procedures for States not yet participating in the IMIS are discussed in E.5 and E.6. of this instruction.

2. Current Revision of the SPM. The SPM was revised in 1984 and 1985 by a task force composed of Federal and State representatives.
a. The revision of the Activities Measures in 1984 and the corresponding shift from monthly to quarterly review of SPAM reports necessitated some changes in the monitoring system. The revised Activities Measures which were issued in April 1984 and incorporated into the SPM as STP 2.22 CH-1 on May 29, 1985, have been included in this revision.
b. The Activities Measures on on-site consultation (Appendix A, section C) were not addressed during the revision of the other sections. These measures will be revised in the future based on experience under the consultation data system. Interim monitoring instructions for consultation are discussed in paragraph E.7. of this instruction.
c. Other changes in the manual were made based on experience gained under the monitoring system.
d. Significant Changes. This revision reorganizes, redesignates and streamlines the contents of the former manual for clarity. Specific changes are not indicated throughout the text of the manual because a vast majority of the material has either been reformatted or rewritten. However, substantive changes are highlighted as follows:
(1) INTRODUCTION (Chapter I). Revised to emphasize the cooperative nature of the monitoring program and encourage joint analysis of State performance.
(2) ROLES AND RESPONSIBILITIES OF NATIONAL OFFICE, REGIONS AND STATES (Chapter II). A separate chapter has been added to detail the rules and responsibilities of all organizational elements involved in the monitoring process (Directorate of Federal-State Operations, Directorate of Administrative Programs, Directorate of Field Operations, Regional Offices and State Offices).

(3) CRITERIA AND INFORMATION USED FOR EVALUATING STATE PROGRAMS (Chapter III).
(a) SPAM Report (III-3). SPAM reports now issued quarterly rather than monthly.
(b) Non-IMIS Statistical Information (III-3-7). Section was reorganized and rewritten to more precisely identify what non-IMIS information is required to be submitted by States. Standardizes timeframes for submission of routine non-IMIS information (from States to Regions within 15 days of end of quarter, and from Region to National Office within 30 days of end of quarter). Establishes specific timeframes for other types of case-by-case submissions.
(4) PROCEDURES FOR REVIEW AND ANALYSIS OF STATE PERFORMANCE INFORMATION (Chapter IV).
(a) Quarterly Discussions Between the Region and State (IV-1-6).
1 Requires quarterly discussions between Region and State as means to assure that all information on a State's performance is reviewed and evaluated. Details purpose and procedures for the discussions, and topics to be addressed.
2 Establishes the concept of and procedures for addressing "old" versus "new" outliers in lieu of granting exceptions for acceptable deviations from further review levels.
(b) Special Investigations (IV-7). No longer requires National Office approval of on-site monitoring where Region and State agree, although coordination with the National Office is encouraged. The National Office shall be notified prior to the use of spot-check monitoring visits.

(c) Procedures for Outlier Analysis (IV-10-12). Makes level of documentation for analytical plan dependent on complexity of analysis required; simplifies procedures for analysis of outliers and completion of analytical reports; and, reduces formerly required clearances with the National Office.
(d) Procedures for Review and Analysis of Information Outside the IMIS; Bureau of Labor Statistics (BLS) (IV-14). Specifies nature and extent of BLS analysis of State injury/illness rates, and requires coordination between OSHA and BLS regional staff to determine timing of BLS input for annual evaluation.
(e) Contested Cases, Discrimination Cases, Other Significant Actions, and Denials of Entry (IV-16). Clarifies Regional and National Solicitors' roles in review of contested cases, discrimination cases and denials of entry.
(5) REVIEW AND ANALYSIS OF COMPLAINTS ABOUT STATE PROGRAM ADMINISTRATION (Chapter V). Section on investigating CASPAs is now a separate chapter. Listing and discussion of types of complaints not requiring investigation have been added. Procedures are detailed for maintaining complainant confidentiality and for providing opportunity for State response to allegations and findings.
(6) FINANCIAL AND ADMINISTRATIVE MONITORING OF 23(g) GRANTS (VI). References OSHA Instruction FIN 3.2, Financial and Administrative Monitoring of the 23(g) Grants, 7(c)(1) Cooperative Agreements and 24(b)(2) Grants and Contracts, and defines the relationship of the financial and administrative monitoring of 23(g) grants to State plan monitoring procedures set out in STP 2.22A.

(7) MONITORING PROCEDURES FOR PUBLIC EMPLOYEE ONLY PLAN (Chapter VII). Chapter added identifying which activity measures apply to monitoring public employee only plans, and detailing procedures for their application.
(8) ANNUAL EVALUATION OF STATE PROGRAMS (Chapter VIII). Chapter rewritten to emphasize that the Annual Evaluation Report must cover performance in all program areas, covering achievements as well as addressing outliers; changes timeframes for submission of report to National Office from 45 days to 60 days from receipt of SPAM Report.
(9) MODIFIED ACTIVITIES MEASURES (E-8 and F-24). These measures, concerning abatement, were modified somewhat to correspond to the data available through IMIS. The previous measures were based on the date abatement was indicated, which is not available through the IMIS. The modifications focus on whether abatement was indicated during the time period.
3. Future Revisions of the SPM. It is anticipated that additional chapters will be added to this manual covering other aspects of State plans, such as the review and approval of plan changes. In addition, further changes may be made to the monitoring procedures as a result of experience in implementing the procedures and of developments such as increased computerization of Activities Measures data. Revisions to this manual will be made by means of page changes transmitted by OSHA Instructions.

Patrick R. Tyson
Acting Assistant Secretary

DISTRIBUTION: National, Regional and Area Offices
              State Plan Monitoring Personnel
              State Designees
              OSHA Training Institute


TABLE OF CONTENTS

CHAPTER                                                      PAGE

   I.    INTRODUCTION...................................I-1
         A.  The Act....................................I-1
         B.  The Monitoring System......................I-1
         C.  Overview of Chapters and
             Appendixes.................................I-2
  II.    ROLES AND RESPONSIBILITIES OF NATIONAL
           OFFICE, REGIONS, AND STATES....................II-1
         A.  Introduction.................................II-1
         B.  National Office..............................II-1
             1.  Directorate of Federal-State
                 Operations...............................II-1
             2.  Directorate of Administrative
                 Programs.................................II-1
             3.  Directorate of Field Operations........II-2
         C.  Regions......................................II-2
         D.  States.......................................II-2
 III.    CRITERIA AND INFORMATION USED FOR
           EVALUATING STATE PROGRAMS...................III-1
         A.  Introduction..............................III-1
         B.  Activities Measures.......................III-1
             1.  Definition and Categories.............III-1
             2.  Activities Measures...................III-1
                 a.  Performance Objectives............III-2
                 b.  Performance Measures..............III-2
                 c.  Further Review Levels.............III-2
                 d.  Outliers..........................III-2
                 e.  Measures for Information Only.....III-2
             3.  Sources of Information for the
                 Activities Measures...................III-3
                 a.  IMIS Information..................III-3
                     (1) Submission of Federal
                         and State Data................III-3
                     (2) SPAM Reports..................III-3

                 b.  Non-IMIS Statistical
                     Information.......................III-3
                     (1) Bureau of Labor
                         Statistics Data...............III-4
                     (2) Program Resources Data........III-4
                     (3) Program Administration
                         (a) State-Initiated
                             Plan Changes..............III-4
                         (b) State Response to Federal
                             Program Changes...........III-5
                         (c) Enforcement Response
                             Time......................III-6
                     (4) Standards.....................III-6
                     (5) Variances.....................III-7
                 c.  Other.............................III-7
         C.  Nonstatistical Program Information Not
             Covered by the Activities Measures........III-8
             1.  State Decisions.......................III-8
                 a.  Contested Cases...................III-8
                 b.  Discrimination Cases..............III-8
                 c.  Other Significant Actions.........III-9
             2.  Other State Inputs....................III-10
                 a.  Denials of Entry..................III-10
                 b.  Special Reports...................III-11
             3.  State-Specific Program Activities.....III-11
                 a.  Definition........................III-11
                 b.  Sources of Information............III-11
         D.  Complaints About State Program
             Administration (CASPAs)...................III-11
  IV.    PROCEDURES FOR REVIEW AND ANALYSIS OF STATE
           PERFORMANCE INFORMATION.....................IV-1
         A.  Introduction..............................IV-1
         B.  Quarterly Discussion Between the
             Region and State..........................IV-1

             5.  Topics for Quarterly Discussion.......IV-2
                 a.  Non-SPAM Information..............IV-2
                     (1) Status of Current Non-SPAM
                         Outlier Analysis..............IV-2
                     (2) Review of Completed Analyses
                         of Non-SPAM Specific Data.....IV-3
                     (3) Review of New Non-SPAM Data...IV-3
                     (4) CASPAs........................IV-3
                     (5) Any Other Topics Relevant
                         to State Performance..........IV-3
                 b.  Computerized SPAM Data............IV-3
                     (1) Status of Current SPAM
                         Outlier Analyses..............IV-4
                     (2) Review of Completed
                         Outlier Analyses..............IV-4
                     (3) Review of New Outliers........IV-4
                         (a) New 3-Month Outliers......IV-4
                         (b) New 6-Month Outliers......IV-5
                         (c) Grouping of Outliers
                             for Analysis..............IV-5
                     (4) Review of Old Outliers........IV-6
                     (5) Treatment of Measures for
                         Information Only..............IV-6
         C.  Procedures for Review and Analysis
             of Computerized Data......................IV-6
             1.  Review of Quarterly SPAM Reports......IV-6
             2.  Quarterly Discussion of SPAM Data.....IV-7
             3.  Analysis of 6-Month Outliers..........IV-7
                 a.  Purpose of Outlier Analysis.......IV-7
                 b.  Sources of Information on
                     Causes of Outliers................IV-7
                     (1) Special Investigation.........IV-8
                     (2) Information Readily
                         Available.....................IV-8
                     (3) Information from Special
                         Investigations................IV-9
                         (a) Interviews................IV-9
                         (b) Case File Review (CFR)....IV-9
                         (c) Accompanied Visit (AV)....IV-10
                         (d) Spot-Check Monitoring
                             Visit (SCMV)..............IV-10

                         (e) Other Sources of
                             Information...............IV-11
                 c.  Procedures for Outlier
                     Analysis..........................IV-11
                     (1) Timing........................IV-11
                     (2) Procedures for Analysis.......IV-11
                         (a) Accuracy of
                             Performance Data..........IV-11
                         (b) Possible Cause(s).........IV-12
                         (c) Information to be
                             Obtained and Reviewed.....IV-12
                         (d) Sources of Information....IV-12
                         (e) Data Collection
                             and Analysis..............IV-12
                         (f) Determination of
                             Cause(s)..................IV-12
                         (g) Conclusion(s).............IV-12
                         (h) Recommendation(s).........IV-13
                 d.  Preparation and Review of
                     Analytical Reports................IV-13
                     (1) Analytical Report Format......IV-13
                     (2) Communication Between
                         Region and State..............IV-13
                     (3) Opportunity for Written
                         Response......................IV-14
                     (4) Submission of Reports to
                         National Office...............IV-14
                 e.  File of Analyses and Responses
                     to Analyses.......................IV-14
         D.  Procedures for Review and Analysis of
             Information Outside the IMIS..............IV-14
             1.  Bureau of Labor Statistics (BLS)......IV-14
             2.  Program Resources.....................IV-15
             3.  Program Administration................IV-15
                 a.  State Response to Federal
                     Program Changes and
                     State-Initiated Plan Changes......IV-15
                 b.  Enforcement Response Time.........IV-16
             4.  Standards and Variances...............IV-16
             5.  Contested Cases, Discrimination
                 Cases, Other Significant Actions,
                 and Denials of Entry..................IV-17
             6.  State-Specific Program Activities
                 for Which There Are No
                 Performance Measures..................IV-17

   V.    REVIEW AND ANALYSIS OF COMPLAINTS ABOUT STATE
           PROGRAM ADMINISTRATION (CASPAs).............V-1
         A.  Definition of a CASPA.....................V-1
         B.  Purpose of a CASPA........................V-1
         C.  Determining If a CASPA Warrants
             Investigation.............................V-1
         D.  Confidentiality...........................V-2
         E.  Notification of Concerned Parties and
             Opportunity for State Response............V-3
             1.  If an Investigation Is Not
                 Warranted.............................V-3
             2.  If an Investigation Is Warranted......V-3
                 a.  CASPAs Alleging Situations of
                     Potential Imminent Danger.........V-3
                 b.  Routine CASPAs....................V-3
                 c.  CASPAs Where Initial State
                     Response May Not Be Appropriate...V-4
         F.  Methods for Investigation of a CASPA......V-4
         G.  Review of Completed CASPA
             Investigations............................V-5
             1.  Communication Between Region
                 and State.............................V-5
             2.  Response to Complaint.................V-5
             3.  Letter to the State...................V-6
             4.  State Response........................V-6
             5.  Forwarding of Response to
                 Complainant to National Office........V-6
             6.  Corrective Action.....................V-6
         H.  Documentation of CASPA Investigations.....V-6
             1.  Identification of Allegations
                 to be Investigated....................V-6
             2.  Information Reviewed..................V-7
             3.  Analysis and Conclusions..............V-7
             4.  Recommendations.......................V-7
             5.  Response of State.....................V-7
             6.  Follow-Up by Region...................V-7
  VI.    FINANCIAL AND ADMINISTRATIVE MONITORING
           OF 23(g) GRANTS.............................VI-1
         A.  Introduction..............................VI-1
         B.  OSHA Instruction FIN 3.2 as it Relates
             to OSHA Instruction STP 2.22A.............VI-1

 VII.    MONITORING PROCEDURES FOR PUBLIC EMPLOYEE
           ONLY PLAN...................................VII-1
         A.  Public Employee Only State Plans..........VII-1
         B.  Injury-Illness Rate Activities
             Measures..................................VII-1
VIII.    ANNUAL EVALUATION OF STATE PROGRAMS...........VIII-1
         A.  Purpose and Scope of Evaluation
             Report....................................VIII-1
         B.  Format of Evaluation Report...............VIII-1
             1.  Title Page............................VIII-1
             2.  Table of Contents.....................VIII-2
             3.  Executive Summary.....................VIII-2
             4.  Introduction..........................VIII-2
             5.  Discussion of Program Categories......VIII-3
                 a.  State's Policies and
                     Procedures........................VIII-3
                 b.  Relevant Performance Data.........VIII-3
                 c.  Other Program Features............VIII-4
                     (1) CASPAs........................VIII-4
                     (2) State-Specific Activities.....VIII-4
                     (3) OSHA Grant Monitoring
                         Report........................VIII-4
                     (4) Response to Previous
                         Recommendations...............VIII-4
                     (5) Other Information.............VIII-4
                 d.  Overall Performance...............VIII-4
                 e.  Recommendations...................VIII-5
             6.  Conclusion on Overall Effectiveness
                 of State Program......................VIII-5
         C.  Procedures for Developing the
             Evaluation Report.........................VIII-5
             1.  Timing................................VIII-5
                 a.  Coordination Between Region
                     and State.........................VIII-5
                 b.  State Response to Region's
                     Report............................VIII-6
             2.  National Office Role..................VIII-6
             3.  State Response to Final Report........VIII-6
         D.  Availability of the Final Report and
             Any State Response........................VIII-6


APPENDIXES

A. REVISED STATE PLAN ACTIVITIES MEASURES
B. SAMPLE SPAM REPORT
C. SAMPLE BLS ACTIVITIES MEASURES REPORT
D. STATE STANDARDS DEVELOPMENT AND PROMULGATION LOG
E. SUMMARY OF VARIANCES GRANTED
F. FEDERAL PROGRAM CHANGE LOG
G. STATE INITIATED CHANGE LOG
H. SUMMARY TRANSMITTAL FORM FOR APPELLATE DECISIONS RESULTING FROM CONTESTED CASE APPEALS AND EMPLOYEE DISCRIMINATION CASES
I. COMPLAINT ABOUT STATE PROGRAM ADMINISTRATION LOG
J. SAMPLE ACKNOWLEDGMENT LETTER TO CASPA COMPLAINANT
K. PROCEDURES FOR CASE FILE REVIEWS, ACCOMPANIED VISITS, AND SPOT-CHECK MONITORING VISITS
L. SAMPLING SCHEME FOR ON-SITE MONITORING
M. DEVELOPING AN ANALYTICAL PLAN
N. ANALYTICAL PLAN FORMAT
O. ANALYTICAL REPORT FORMAT


CHAPTER I--INTRODUCTION

A. The Act. Section 18 of the Occupational Safety and Health Act requires that State occupational safety and health programs be "at least as effective" as the Federal program. To ensure that this requirement is met and that State programs are operating in an effective manner, section 18(f) of the Act provides that the U.S. Department of Labor shall make a continuing evaluation of State programs. The basic objectives for evaluating State programs are established in the Act and further defined in relation to the Federal program in 29 CFR Parts 1902 and 1953.

B. The Monitoring System. Based on the objectives established in the Act and regulations, the monitoring system provides the framework for continuing evaluation of State programs. This monitoring system applies to all State programs, including those which have been granted "final approval" under section 18(e) of the Act. (An 18(e) determination of "final approval" is the formal relinquishment of concurrent Federal jurisdiction in a state based on the judgment that the State, in addition to having a structurally complete program, is in actual operation "at least as effective" as the Federal program. A State must also have met 100 per cent of its compliance staffing benchmarks as required by the court decision in AFL-CIO v. Marshall.)

1. The key to successful functioning of this monitoring system is continuous and constructive communication and cooperation between OSHA and the States. The system is designed to achieve the following goals:
a. Gather and analyze data and information on State programs in comparison to the Federal program and other established criteria;
b. Determine whether the States are maintaining programs that are "at least as effective" as the Federal program;
c. Provide technical assistance and information to States that can be used as management tools by the States in running their programs, and to aid in correcting deficiencies if a program or an aspect of a program has deficiencies; and,

d. Provide objective and consistent evaluations of State program performance.
2. The monitoring system utilizes primarily statistical methods of analysis and achieves consistency through the use of uniform or similar definitions of performance characteristics. The data base used for the majority of the statistical analyses is OSHA's Integrated Management Information System (IMIS). The IMIS is a computerized system which accumulates data from both Federal and State OSHA programs. Ultimately, all program data entry will be made by means of microcomputer at the local level. The monitoring system also treats nonstatistical data uniformly wherever possible.
3. A primary component of the monitoring system is the identification and analysis of "outliers," or areas where State performance falls outside of an established level or range of performance, generally in comparison to the Federal program. An outlier is not necessarily a deficiency, but a difference which requires further explanation. The Regions and States work closely together to plan and conduct outlier analyses when State performance falls outside of an established level or range of performance for the preceding 6 months.
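The outlier identification described above amounts to comparing each State value on a measure against its further review level. The following is a minimal illustrative sketch only; the measure names, threshold values, and flagging directions are hypothetical and are not drawn from the SPAM report or the Activities Measures in Appendix A.

```python
# Hypothetical sketch of outlier identification: a State value is flagged
# when it falls outside the further review level for its measure.
# All measure names, directions, and thresholds below are illustrative.

def find_outliers(state_data, review_levels):
    """Return the measures whose State value crosses the further review level.

    review_levels maps measure -> (threshold, direction); direction "above"
    flags values greater than the threshold, "below" flags values less
    than it. Measures absent from review_levels are treated as measures
    for information only and are not flagged.
    """
    outliers = []
    for measure, value in state_data.items():
        if measure not in review_levels:
            continue  # information-only measure; no review level applies
        threshold, direction = review_levels[measure]
        if direction == "above" and value > threshold:
            outliers.append(measure)
        elif direction == "below" and value < threshold:
            outliers.append(measure)
    return outliers

# Illustrative quarterly figures (not actual SPAM data).
state = {"pct_complaints_inspected": 62.0, "avg_penalty_serious": 140.0}
levels = {
    "pct_complaints_inspected": (75.0, "below"),  # flag if under 75 percent
    "avg_penalty_serious": (100.0, "below"),      # flag if under $100
}
print(find_outliers(state, levels))  # → ['pct_complaints_inspected']
```

A flagged measure is only a starting point: as the text notes, an outlier is not necessarily a deficiency, and each flag would still require the explanation or analysis described in Chapter IV.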

C. Overview of Chapters and Appendixes.

1. Chapter II describes the roles and responsibilities of OSHA and the States.
2. Chapter III discusses the criteria for evaluating State program effectiveness: activities measures, State-specific program activities, and Complaints About State Program Administration (CASPAs). These criteria are defined and their sources of information are identified.
3. In Chapter IV the procedures for the review and analysis of computerized and noncomputerized State performance information are presented in detail. The key components of the monitoring system are quarterly discussions between the OSHA Regions and the States and 6-month outlier analysis.
4. Chapter V contains procedures for the review and analysis of Complaints About State Program Administration (CASPAs).

5. Chapter VI discusses the financial and administrative monitoring of 23(g) grants.
6. Chapter VII describes the procedures for monitoring public employee only plans.
7. Chapter VIII discusses the Annual Evaluation Report, how it should be prepared, what format should be used, and what information it should contain.
8. The appendixes contain the criteria used to evaluate State performance (the activities measures), sample copies of documents used in the monitoring system, and specialized instructions on various monitoring activities.


CHAPTER II--ROLES AND RESPONSIBILITIES

OF NATIONAL OFFICE, REGIONS AND STATES

A. Introduction. The successful implementation of the monitoring program is dependent upon the cooperative and continuing effort of and interaction among the National Office, Regions and States. The purpose of this chapter is to define their respective roles and responsibilities.

B. National Office.

1. Directorate of Federal-State Operations. The Directorate of Federal-State Operations through the Office of State Programs is responsible for establishing policies and procedures of the State plan monitoring program, and for coordinating the program to ensure consistent and objective application to the States. The Directorate is also available to assist the Regions and States in the effective implementation of the program and in resolving disputes. For example, consultation with the Directorate is encouraged before the start of special investigations requiring review of data sources other than computerized information, such as case file reviews, interviews and accompanied visits, as a means of determining if the National Office knows of experience in other Regions or States that may be helpful in investigation of the outlier(s).
2. Directorate of Administrative Programs. The Directorate of Administrative Programs, through the Office of Management Data Systems (OMDS), is responsible for the systems analysis, systems design and programming of the State Plan Activities Measures; for processing both Federal and State program performance data as part of the computerized Integrated Management Information System (IMIS); and for producing and distributing both routine reports (e.g., the State Plan Activities Measures (SPAM) Report) and special reports. OMDS also designs and writes software programs to enable Federal field offices and States to query the National data base directly and writes computer programs to respond to requests for more complex, special, ad hoc reports. OMDS provides the States, Regions and other components of the National Office with training and assistance in operating the IMIS.

3. Directorate of Field Operations. The Directorate of Field Operations is responsible for providing the Regions with coordination, direction and guidance. The Directorate of Field Operations represents the interest of the Region in dealing with principal program heads in the National Office in matters concerning the development, interpretation and implementation of major program issues bearing on field operations, including State programs.

C. Regions. The Regions are responsible for monitoring the State plans within their Regions in accordance with Federal monitoring policies and consistent with the effort to support the States in building and maintaining effective safety and health programs. Continuous communication and cooperative relations between the Regions and the States are essential to meeting these responsibilities. For example, the Regions work together with the States on analyses of outliers; the Regions also provide technical assistance to the States and communicate Federal program direction. The mutual maintenance of regular channels of communication is an essential component of the Federal/State partnership.

NOTE: Where the term "Region" is used throughout this manual, responsibility for various aspects of monitoring is likely to be delegated to the Area Offices and/or to offices within the Regional Offices as determined appropriate by the Regional Administrator.

D. States. The States are responsible for maintaining programs that are "at least as effective" as the Federal program, and are actively involved in the monitoring process that assesses their performance. In order to meet this responsibility the States shall maintain continuous and constructive communications with the Regions and provide them with appropriate information about State performance, including information about State-specific activities. A State's involvement in the monitoring system includes the opportunity to participate in the review and analysis of its program. The States are encouraged to work closely with the Regions to jointly plan and conduct appropriate outlier analyses.

II-2

CHAPTER III--CRITERIA AND INFORMATION USED

FOR EVALUATING STATE PROGRAMS

A. Introduction. State program performance is measured primarily by use of activities measures that are based on those criteria established in the Act and the regulations. For those non-statistical State program activities not covered by the activities measures, performance is measured using other criteria consistent with the Act and the regulations. The purpose of this chapter is to describe in detail the criteria and information used for evaluating State programs.

B. Activities Measures.

1. Definition and Categories. Activities measures are the criteria against which State performance is measured. The 19 major categories (See Appendix A.) of activities measures listed below represent the program areas in which comparison of State performance with Federal performance or other established criteria is made:

Major Categories of Activities Measures
o Standards
o Variances
o Voluntary Compliance (18(b) On-site Consultation)
o Voluntary Compliance (Training and Education Outreach)
o Voluntary Compliance (Training and Education Staff Training)
o Public Employee Program
o Enforcement (Targeting)
o Enforcement (Complaints)
o Enforcement (Referrals)
o Enforcement (Right of Entry)
o Enforcement (Inspection Procedures)
o Enforcement (Identifying and Citing Hazards)
o Enforcement (Abatement Periods)
o Enforcement (Penalties)
o Review Procedures
o Discrimination
o Program Administration
o Program Resources - Sufficient Personnel and Effective Utilization of Resources
o Program Results

2. Activities Measures. The activities measures consist of:

III-1
a. Performance Objectives. Performance objectives are set forth in Section 18 of the Act and further defined in 29 CFR Part 1902. They qualitatively describe the aims of the Federal and State programs for each major program area. Their inclusion within the monitoring system defines the program goals for both the Federal and State programs.
b. Performance Measures. Performance measures are quantitative indicators of whether a State program meets each of the performance objectives for each major program area. The measures are stated as comparisons of State performance to Federal performance or other established criteria. They are stated for the most part as percentage levels or averages.
c. Further Review Levels. Further review levels are numerical limits or ranges of State performance that, if met, will generally result in no further review of State performance for a particular performance measure. In most cases, these levels are established by comparison with Federal performance for a particular measure.
d. Outliers. An outlier is State performance on a particular performance measure that falls outside the level or range of performance established by the further review level. Outliers generally will be subject to further investigation if reflected in the data on average State performance for the preceding 6 months. However, outliers are not necessarily deficiencies but may be differences which require further evaluation.
e. Measures for Information Only. Measures designated "For Information Only" are secondary measures that can assist in determining whether a particular performance objective is being met. These measures do not result in outliers but are subject to discretionary evaluations as determined in discussions between the Region and the State. (See Chapter IV.)
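The relationships among performance measures, further review levels, outliers, and "For Information Only" measures described above can be sketched in Python. This is purely an illustrative reading of the definitions; the names and data shapes are assumptions and are not part of the SPAM system itself.

```python
# Illustrative sketch only: all names and structures are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class PerformanceMeasure:
    name: str
    state_value: float            # State performance on this measure
    review_low: Optional[float]   # lower bound of the further review level, if any
    review_high: Optional[float]  # upper bound of the further review level, if any
    info_only: bool = False       # "For Information Only" measures


def is_outlier(m: PerformanceMeasure) -> bool:
    """A measure is an outlier when State performance falls outside the
    further review level; "For Information Only" measures never produce
    outliers (they are only subject to discretionary evaluation)."""
    if m.info_only:
        return False
    if m.review_low is not None and m.state_value < m.review_low:
        return True
    if m.review_high is not None and m.state_value > m.review_high:
        return True
    return False
```

As the manual notes, an outlier flagged this way is not necessarily a deficiency; it only marks a difference from the further review level that calls for further evaluation.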

III-2
3. Sources of Information for the Activities Measures. Much of the data necessary to evaluate State performance in terms of the activities measures are obtained through State participation in the Integrated Management Information System (IMIS). IMIS is OSHA's computerized Management Information System which stores and processes the same types of data for both Federal and State programs.
a. IMIS Information. Uniform definitions and methods of calculation are used in most instances to ensure that Federal and State data are comparable. As new data systems are developed or added to IMIS or IMIS compatible software, more activities measures output reports will become available in standardized computer format.
(1) Submission of Federal and State Data. State and Federal programs submit, by means of the IMIS, inspection-level and other required data on a continual basis, on comparable forms, using uniform definitions as appropriate.
(2) SPAM Reports. State Program Activities Measures Reports (SPAM Reports) are IMIS computer output reports that are produced for each State on a quarterly basis by OSHA's Directorate of Administrative Programs through the Office of Management Data Systems (OMDS), and sent to the appropriate State and Region within 30 calendar days of the end of each quarter. The SPAM Report indicates the State's performance for each activities measure, Federal data where applicable, and, among other things, identifies any statistical outliers. (See Appendix B for sample.) The quarterly SPAM Report shows State and Federal performance for the most recent 3-month, 6-month and 12-month periods. Special SPAM Reports for any time period can be prepared by OMDS upon request from the State or Region.
b. Non-IMIS Statistical Information. The following non-IMIS information is used to calculate State performance for those activities measures for which data are not yet available through the IMIS or are produced by a different statistical system:

III-3
(1) Bureau of Labor Statistics Data. The National Office of the Bureau of Labor Statistics (BLS) shall prepare an annual computerized activities measures report for each State displaying data for those activities measures based on the State injury/illness statistics--public and private sector (i.e., Program Results data). (See Appendix C for sample.) These reports generally should be available in February for the second preceding calendar year.
(2) Program Resources Data. The National Office shall annually calculate and prepare an activities measures report for each State, when the necessary data become available, for the Program Resources activities measures not available through the IMIS. In preparing the report the National Office shall obtain assistance from the Region where necessary, and shall use, among other things, data from annual grant applications and other financial reports submitted by States to the Office of Grants Management. The format of this report shall be consistent with activities measures.
(3) Program Administration Data. Program administration data include data on the timely submission of plan changes and data on enforcement response time.
(a) State-Initiated Plan Changes. If a State-initiated plan change involves a legislative amendment or other action, or affects State plan funding, a State shall immediately notify the Region and within 30 days of such change (29 CFR 1953.41(a)) submit to the Region a formal plan change. With respect to any other type of State-initiated plan change (29 CFR 1954.41(a)), a State is encouraged to keep the Region advised of its intended action and shall submit to the Region a formal plan change within 6 months.

III-4
1 The Region shall maintain on a current basis a record of these changes in the "State-Initiated Change Log" and shall forward one copy of the pages of the log that reflect State-initiated plan change activities during the quarter, or any outstanding from previous quarters, to the National Office and one to the State within 30 calendar days of the end of the quarter. (See Appendix F for example.)
2 The Region shall obtain the data necessary to calculate relevant activities measures for appropriate time periods from this log.
(b) State Response to Federal Program Changes. A State shall submit to its Region, within 30 calendar days of receipt of any Federal program change, an indication of its intent regarding such notice, and within 6 months of notification, a plan change supplement when required. The Region shall maintain on a current basis a record of such changes, State acknowledgments, and State responses in the "Federal Program Change Log." (See Appendix G for example.) The Region shall forward a copy of those pages of the log reflecting State responses to Federal program changes occurring during the quarter or outstanding from previous quarters to the National Office and one to the State within 30 calendar days of the end of the quarter.
1 The National Office shall inform the Region of any Federal program changes on a continuous basis and shall, within 2 weeks of the end of the quarter, forward a list of all Federal program changes occurring during the quarter for use in ensuring currency and consistency in the regionally maintained logs.

III-5
2 The Region shall obtain the data necessary to calculate relevant activities measures for appropriate time periods from this log.
(c) Enforcement Response Time. The National Office shall, on a quarterly basis, calculate and prepare an activities measures report for each State, when the necessary data become available, from special computer queries for the Program Administration activities measures relating to the average time required to issue citations, analyze samples, and respond to complaints. In preparing the report, the National Office shall obtain assistance from the Region, where necessary, and the National Office shall forward a copy of the Program Administration activities measures report for each State to the Region, which shall promptly forward a copy to the State. The format of this report shall be consistent with activities measures.
(4) Standards. A State shall submit to the Region, within 30 calendar days of adoption, any standard adopted. The Region shall maintain on a current basis a record of Federal standards actions and comparable State actions as well as independent State standards actions (including stays of standards or changes in effective dates) in the "State Standards Development and Promulgation Log." (See Appendix D for example.) The Region shall forward one copy of those pages of the log reflecting State standards activity during the quarter and any pending or incomplete actions from previous quarters to the National Office and one to the State within 30 calendar days of the end of the quarter.
(a) Within 2 weeks of the end of the quarter, the National Office shall send to the Region a list of all Federal standards actions occurring during the quarter for use in ensuring currency and consistency in the regionally maintained logs.

III-6
(b) The Region shall obtain the data necessary to calculate relevant activities measures for appropriate time periods from this log.
(5) Variances. A State shall submit to the Region, within 15 calendar days of the close of the quarter, one copy of any permanent or temporary variance granted, as well as identification of any federally granted variances accepted by the State through the variance reciprocity procedures.
(a) The Region shall maintain on a current basis a record of variances granted in the "Summary Transmittal Form for Granted Variances." (See Appendix E for example.) The Region shall forward one copy of the transmittal form and all State variances granted during the quarter to the National Office within 30 calendar days of the end of the quarter.
(b) The Region may choose to grant an exception to this procedure for a State with a large number of variances that prefers the Region to review its variance files at the State office. In such cases, the Region shall forward to the National Office the complete variance log plus copies of any variances granted during the quarter that do not appear to ensure equivalent protection.
c. Other. Where data for activities measures analysis are not available to the Region either through computerized SPAM Reports or the described noncomputerized sources, the Office of State Programs shall work with the Office of Management Data Systems to provide in a timely manner the data needed.

III-7

C. Nonstatistical Program Information Not Covered by the Activities Measures.

1. State Decisions.
a. Contested Cases. A State shall submit to the Region, within 15 calendar days of the end of each quarter, a copy of any decisions on contested cases decided during the quarter that establish a precedent which could affect State plan policies and procedures. In most cases, such decisions shall be limited to appellate decisions; i.e., administrative or judicial reviews above the fact-finding or trial level. Within 30 calendar days of the end of the quarter, the Region shall submit to the Directorate of Federal-State Operations one copy of all such decisions along with a completed Summary Transmittal Form for Appellate Decisions Resulting from Contested Case Appeals and Employee Discrimination Case Decisions. (See Appendix H for example.)
(1) The Region shall reach an agreement with the Regional Solicitor (SOL) as to whether the SOL will routinely review each such decision and provide an opinion on whether such case has the potential for making State performance in a particular area less effective than comparable Federal performance.
(2) If the Regional SOL agrees, the Region shall routinely send copies of such decisions to the Regional Solicitor's Office, and so indicate each quarter on the Transmittal Form.
(3) If the Regional SOL is not available to routinely review such decisions, then the Region shall forward these decisions each quarter to the Directorate of Federal-State Operations for transmittal to the National SOL for such an opinion.
b. Discrimination Cases. A State shall submit to the Region, within 15 calendar days of the end of each quarter, a copy of each discrimination case decision issued during the quarter by a court or administrative body authorized to issue enforcement orders. A

III-8
discrimination case results from an employee claim of discrimination in the workplace as a result of having filed a complaint with the State OSHA, or having otherwise exercised his/her occupational safety and health rights. Within 30 calendar days of the end of the quarter, the Region shall submit to the Directorate of Federal-State Operations a copy of each such decision along with a completed Summary Transmittal Form for Appellate Decisions Resulting from Contested Case Appeals and Employee Discrimination Case Decisions. (See Appendix H for example.)
(1) The Region shall reach an agreement with the Regional SOL as to whether the SOL will routinely review each decision on a discrimination case and provide an opinion on whether such case has the potential for making State performance in a particular area less effective than comparable Federal performance.
(2) If the Regional SOL agrees, the Region shall routinely send copies of the discrimination case decisions to the Regional SOL within 30 calendar days of the end of each quarter, and so indicate each quarter on the Transmittal Form.
(3) If the Regional SOL is not available to routinely review such decisions, then the Region shall forward these decisions each quarter to the Directorate of Federal-State Operations for transmittal to the National SOL for such an opinion.
c. Other Significant Actions. A State shall promptly notify the Region by telephone of any State standard, administrative procedure or legal provision overturned, rendered inoperable or found not "at least as effective" by judicial or other action, or any other judicial, administrative, budgetary, or legislative action that has a significant impact on the State's program. The State shall follow up with a written statement and shall keep the Region informed of the legal or administrative remedy, if any, being sought by

III-9
the State in response. The State shall, in a timely manner, submit to the Region documentation on the action and a written report of any intended State response.
(1) Immediately upon such telephone notification, the Region shall notify the Directorate of Federal-State Operations by telephone and shall submit a copy of the documentation and of the State's written report to the Directorate of Federal-State Operations.
(2) The Directorate of Federal-State Operations shall submit a copy of the relevant action and of the State's written report to the National SOL and request an opinion as to whether such case has the potential for making State performance in a particular program area less effective than comparable Federal performance. The Region shall also, at the same time, request the Regional SOL to render such an opinion.
(3) In addition, the State should notify the Region of any such actions that are pending that may affect the ability of the State to maintain an "at least as effective" program and necessitate supplemental Federal assistance, including the reinstitution of concurrent Federal enforcement, where appropriate.
2. Other State Inputs.
a. Denials of Entry. A State shall submit to its Region, within 5 calendar days of occurrence, a report of any refusal of entry for which a warrant has been denied, including the reasons for denial and any State action being taken in response.
(1) For States which have been granted final approval (18(e) determination), the Region shall immediately submit one copy to the Regional SOL and request an opinion as to whether such case has the potential for making State performance in a particular program area less effective than comparable Federal performance.

III-10
(2) For non-18(e) States, the Region, in accordance with the terms of the State's Operational Status Agreement, shall consider taking appropriate enforcement action after consultation with the Directorates of Federal-State Operations and Field Operations, and the State.
(3) If the Regional SOL is unavailable, the Region shall inform the Directorate of Federal-State Operations, which shall request the National SOL to render such an opinion.
b. Special Reports. A State shall submit in a timely manner any special reports requested by the Region or the National Office pursuant to section 18(c)(8) of the Act.
3. State-Specific Program Activities.
a. Definition. State-specific program activities are those activities undertaken by a State that are not specifically addressed in the current activities measures. They may include entire program efforts as well as State alternatives to Federal procedures. State-specific program activities are encouraged to the extent that a State program continues to perform at least as effectively as the Federal program.
b. Sources of Information. The State shall submit to its Region in a timely manner information necessary to evaluate any State-specific program activity. The information needed to make such an evaluation shall be determined in discussions between the State and Region and set forth in the documenting plan change submitted for approval.

D. Complaints About State Program Administration (CASPAs). A CASPA is a complaint, either oral or written, made by any person or group of persons to the National Office or a Region about some aspect of the operation or administration of a State plan. It may relate to a specific State action; e.g., an inspection, or may reflect a more generic criticism of State program administration.

III-11
1. CASPAs provide an additional source of information on State plan performance which may not be otherwise reflected and require complete investigation both for assessment of State plan effectiveness and to correct individual discrepancies. Upon receipt, the Region shall promptly record all CASPAs in its "CASPA Log." (See Appendix I for example.)
2. The Region shall forward a copy of those pages of the log relating to any CASPAs received or acted on during the quarter to the National Office within 30 calendar days of the close of the quarter.

III-12
           CHAPTER IV--PROCEDURES FOR REVIEW AND ANALYSIS
                  OF STATE PERFORMANCE INFORMATION

A. Introduction. This chapter and its associated appendixes describe the procedures for review and analysis of State performance information.

1. The quarterly analysis of outliers by the Region and State is a key aspect of these procedures. While this quarterly analysis, including the analytical reports, provides immediate feedback on a State program, it also serves as a basic source of information for the annual evaluation report on the overall effectiveness of a State program.
2. The quarterly analysis is a means of determining both those aspects of the State program that effectively achieve the program's goals and those areas of the program's performance that are in need of improvement, including the impact of these latter areas on the overall effectiveness of the State's program. While some outliers may indicate deficiencies, other outliers may be a means of documenting the extent of success of a particular State program activity with procedures different from the Federal or with performance that exceeds the Federal program.

B. Quarterly Discussions Between the Region and State. The success of the monitoring process is dependent upon continuous and constructive communication between the Region and State, including quarterly discussions.

1. To ensure that all information about the performance of a State's program is reviewed and analyzed as appropriate throughout the year, the Region and State shall hold formal discussions either in person or by telephone on at least a quarterly basis on:
a. Computerized SPAM data;
b. Information on State plan performance obtained from other sources such as the logs discussed in Chapter III; and
c. Other State plan and Federal activities.

IV-1
2. If the Region and State agree that it is convenient and appropriate to hold discussions on all items at the same time, they may do so, provided the discussions are held within 20 calendar days after the receipt of each quarterly SPAM report from OMDS.
3. The purpose of these quarterly discussions is to provide for a sharing of information between the Region and the State on both past periods of monitoring and future monitoring. The discussions also provide opportunities for the Region to receive from the State a general update on all ongoing and anticipated State activities, problems and issues, and for the Region to provide to the State an update on Federal activities and initiatives.
4. While the National Office does not directly participate in the quarterly discussions, it serves as an advisor and a source of information. The Region is encouraged to consult the National Office for advice and assistance if any difficulty is encountered or anticipated in performing any analysis of State program performance.
a. For example, the National Office may suggest some options for analysis and share experience gained in other Regions with outliers on similar measures. In addition, if a disagreement between the Region and State arises concerning any aspect of the process that cannot be resolved between them, the National Office shall be consulted.
b. The National Office shall also be informed of any issues raised at the quarterly discussions that may relate to the overall effectiveness of a State program.
5. Topics for Quarterly Discussion. The following is a list of the topics to be included in the quarterly discussions:
a. Non-SPAM Information. Topics related to State performance data that are obtained from sources other than the quarterly computerized SPAM report shall be included in the quarterly discussions.

IV-2
(1) Status of Current Non-SPAM Analysis. The status of any analysis of non-SPAM-specific data currently in progress shall be discussed. If any difficulties in completing the analysis have been encountered, they shall be identified and, whenever possible, remedied.
(2) Review of Completed Analyses of Non-SPAM-Specific Data. A quarterly discussion is an appropriate time for the Region and State to communicate with regard to completed analyses on non-SPAM-specific data, particularly the sufficiency of the analyses, conclusions reached, recommendations (if any), and planned State action in response, either in progress or pending.
(3) Review of New Non-SPAM Data. A quarterly discussion is an appropriate time for the Region and State to discuss new non-SPAM information and appropriate procedures for review and analysis of this information.
(4) CASPAs. A quarterly discussion is an appropriate time for the Region and State to discuss any aspect of the review and analysis of CASPAs.
(5) Any Other Topics Relevant to State performance. The quarterly discussions provide a consistently available forum for communication between the Region and State on any and all aspects of the State program. To this end, the participants should use the discussions as an opportunity to keep each other informed of new State program developments and any new Federal policies and procedures that could impact the State program, to exchange ideas for improvements of any program areas, and to discuss any other topics relevant to State performance.
b. Computerized SPAM Data. The following topics related to computerized SPAM data shall be included in the quarterly discussions:

IV-3
(1) Status of Current SPAM Outlier Analyses. The status of analyses of SPAM data currently in progress shall be discussed. Any difficulties in completing the analyses shall be identified and remedied if possible.
(2) Review of Completed Outlier Analyses. The Region and the State shall discuss any outlier analyses of computerized data completed during the previous quarter, particularly the sufficiency of the analyses, conclusions reached, recommendations made (if any), and any planned State action in response, either in progress or pending. Any pending or completed corrective action on other outliers should also be reviewed, as appropriate.
(3) Review of New Outliers. The Region and State shall review new SPAM outliers. New outliers are outliers that have not appeared in the 6-month data on any of the preceding three quarterly SPAM reports. Regardless of whether non-SPAM information is discussed at the same time, the Region and the State shall, within 20 calendar days after receipt of each quarterly SPAM report from OMDS, hold a discussion of the data, any outliers indicated, and the appropriate methods for their analysis. In preparing for this discussion, the State and Region shall review the SPAM report and perform a preliminary review of readily available information in order to provide insight into the possible causes of each outlier and to determine the appropriate method of outlier analysis.
(a) New 3-Month Outliers. When a new outlier has appeared in the past 3 months' data, the Region and State shall note and discuss the outlier and its potential impact on the effectiveness of State performance in a particular program area. Although only 6-month outliers need to be formally analyzed, if the Region and State agree that the outlier could potentially make State performance in a particular program area less effective than comparable Federal performance, a discussion of strategies to improve State performance may be advisable.

IV-4
(b) New 6-Month Outliers. When a new outlier has appeared in the past 6 months' data, the Region and State shall note the outlier, discuss possible cause(s) of the outlier, and agree upon the plan to be used for its analysis in accordance with this chapter.
1 After agreeing upon the analytical plan to be used, the Region and State shall decide whether the analysis shall be conducted by the Region, the State, or as a joint effort by both parties. Although the analysis of outliers is ultimately the responsibility of the Region, it is recommended that the State and Region conduct a joint analysis whenever possible. When it is decided that a joint analysis shall be done, the Region and State shall also decide which party shall have which responsibilities in conducting the analysis.
2 Finally, the parties shall agree on an appropriate timeframe for completing the analysis. If at all possible, the analyses of all outliers should be completed prior to the issuance of the next quarterly SPAM report.
(c) Grouping of Outliers for Analysis. Frequently, State performance in one program area will result in outliers in several related measures. When it appears that outliers are related and could be more effectively analyzed together, the Region and State should combine the analyses of such outliers in a single analytical plan and report. Analysis of complex outliers may also require the review of activities measures which, while not resulting in outliers themselves, may impact the performance of the area under study.

IV-5
(4) Review of Old Outliers. "Old" outliers are outliers that have appeared in the 6-month data on any of the preceding three SPAM reports. Old outliers may be currently under review, or corrective action may be underway but as yet incomplete; or they may reflect acceptable differences in the State's program; e.g., lack of records review, which have previously been evaluated.
(a) The Region and State shall identify all old 6-month outliers and their status and that of any corrective action being taken; but no further action is necessary for such outliers if analysis or corrective action is underway in accordance with agreed-upon timetables.
(b) Old outliers that have been explained in terms of acceptable differences in State policies or conditions need only to be reviewed for consistency with the explanation given in the previous analysis of the outlier and assessed for any change in circumstances that might impact on the effectiveness of the State's program.
(5) Treatment of Measures For Information Only. The primary purpose of measures designated "For Information Only" is to provide additional data on State performance that may be useful in analysis of outliers. However, if the Region and State believe that State performance on a "For Information Only" measure merits further review and analysis either independently or in relation to existing outliers, then they shall develop an analytical plan and timetable for performing such review and analysis.

C. Procedures for Review and Analysis of Computerized Data. Within 30 days of the close of the quarter, OMDS shall send to the States and Regions copies of the SPAM reports.

1. Review of Quarterly SPAM Reports. The States and Regions shall review the SPAM reports, identify all performance measures for which there are 6-month outliers, and classify such outliers as "new" or "old." In addition, the other measures on the SPAM report shall be reviewed to assist in analysis of related measures and to identify performance data which merit analysis and discussion.
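The "new" versus "old" classification rule above (an outlier is "old" if its measure showed a 6-month outlier on any of the preceding three quarterly SPAM reports, and "new" otherwise) can be sketched as follows. The data shapes here are assumptions chosen for illustration, not an actual IMIS or SPAM interface.

```python
# Illustrative sketch only: data shapes are hypothetical.
def classify_outliers(current, preceding_three):
    """Classify this quarter's 6-month outliers as "new" or "old".

    current: set of measure names with 6-month outliers on this
        quarter's SPAM report.
    preceding_three: list of up to three sets, one per preceding
        quarterly SPAM report, each holding that report's 6-month
        outlier measures.
    """
    seen_before = set().union(*preceding_three) if preceding_three else set()
    return {
        "new": sorted(current - seen_before),   # never appeared in prior three reports
        "old": sorted(current & seen_before),   # appeared before; may already be under review
    }
```

Under this reading, only the "new" entries trigger the agreement on an analytical plan described in section B.5.b(3), while "old" entries are reviewed for the status of ongoing analysis or corrective action.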

IV-6
2. Quarterly Discussion of SPAM Data. A quarterly discussion of SPAM data, including plans for their analysis, is then held between the Region and the State.
3. Analysis of 6-Month Outliers. Each activities measure provides a point of comparison between State and Federal activity or between State performance and another established level of performance. The occurrence of an outlier on a performance measure serves as an indicator that an aspect of State performance may differ significantly from comparable Federal performance.
a. Purpose of Outlier Analysis. The purpose of outlier analysis is to answer the following questions:
(1) Are the data indicating that an outlier exists accurate?
(2) Is the outlier caused by a program factor that does not reduce the effectiveness of State performance in a particular program area below that of comparable Federal performance?
(3) Even if State effectiveness is reduced in a particular program area, does the State performance in other related areas act to offset the impact of the outlier?
(4) Are there ways in which the State could improve its performance in a particular program area?
(5) Does the outlier reflect a program deficiency that requires correction if "at least as effective" status is to be maintained?
b. Sources of Information on Causes of Outliers. There are numerous sources of information to assist in the performance of outlier analysis. Some information is readily available, such as other related measures, other IMIS reports, and computer queries that can be performed directly by the Region and/or State or by OMDS upon request. Special investigations such as case file reviews or accompanied visits may be required to obtain other information.

IV-7
(1) Special Investigations. Whenever possible, the cause of an outlier should be determined based on a review of readily available information. If it is not possible to determine the cause of an outlier using readily available information, and both the State and Region agree that use of special investigations, through case file reviews, formal interviews, review of other primary source State data, and/or accompanied visits, is the preferred method of analysis, they may proceed to conduct such investigations without approval from the National Office.
(a) However, prior to the initiation of such an investigation, consultation with the National Office is encouraged as a means of determining if the National Office knows of experience in other Regions or States that may be helpful in investigation of the outlier.
(b) If the State and Region agree that a study utilizing spot-check monitoring visits is necessary, the National Office shall be consulted before proceeding with the investigation.
(2) Information Readily Available. The readily available sources of information include:

(a) Data available elsewhere on the SPAM report;
(b) Data from standard IMIS reports (e.g., INSPs) and routine State-submitted reports;

(c) Data available through computer queries;
(d) Other Federal data; e.g., CASPAs and OSHA 23(g) grant monitoring report findings;
(e) Information readily available to the State and/or Region from an informal review of a limited sample of case files, logs, policy documents, etc.; and,

(f) Informal discussions with the States.

IV-8
(3) Information from Special Investigations. Obtaining information using the following investigation techniques requires a more extensive data-gathering effort than does obtaining readily available information. These analytical tools are to be used only as part of formal, controlled studies.
(a) Interviews. A formal interview is an in-person planned discussion to obtain information from specific State program staff, employers, employees, or other persons, apart from personal communication that occurs in the conduct of case file reviews, accompanied visits, spot-check monitoring visits, or as part of day-to-day communication with staff. 1 If it is determined that a structured series of formal interviews is necessary to do the outlier analysis, the individuals to be interviewed shall be identified at or as a followup to the quarterly discussion.
2 OSHA may determine that an interviewee's name shall not be released to the State if he/she requests anonymity.
3 The interviewer shall conduct (and record) the interviews in accordance with a scope of inquiry agreed upon by the Region and State.
(b) Case File Review (CFR). A case file review is the examination of the documentation relating to any inspection or consultation.
1 If a CFR is used in conjunction with an accompanied or spot-check monitoring visit, or interview, it may serve to verify the observed State activity.
2 If it is determined that a formal CFR study is necessary to do the outlier analysis, the number and type of CFRs shall be decided in accordance with the procedures set forth in Appendix K.

IV-9
(d) Spot-Check Monitoring Visit (SCMV). A spot-check monitoring visit is a visit by an OSHA monitor to an establishment previously inspected by State enforcement personnel.
1 In order to avoid imposing the added burden on employers and employees of having Federal monitors visit workplaces recently inspected by the State, SCMVs shall be used only as a last resort in doing an outlier analysis, only after notification of the National Office, and in accordance with any restrictive appropriations legislation language or OSHA regulations or directives.
2 If it is determined that a formal SCMV study is necessary to do the outlier analysis, the number and type of SCMV (and related CFRs) shall be decided in accordance with the procedures set forth in Appendix K.

IV-10
(e) Other Sources of Information. Sources of information other than those specified in this chapter may include, but are not limited to, attendance at training sessions or hearings, examination of State documents other than case files, review of equipment or laboratory facilities, and evaluation of sample analyses. The Region and State shall determine other sources of information that need to be accessed. If appropriate, established procedures for investigation (Appendix K) shall be utilized.
c. Procedures for Outlier Analysis. A timeframe for completing each outlier analysis (or analysis of a group of outliers) shall be established by the Region and State at the discussion of the quarterly SPAM Report.
(1) Timing. All analyses shall be completed and reviewed prior to the issuance of the SPAM Report for the next quarter unless a longer time is needed due to the complexity of the required analysis.
(2) Procedures for Analysis. An analytical plan shall be developed for analysis of each outlier or group of related outliers. The extent of the plan will vary depending upon the nature (i.e., complexity, uniqueness) of the outlier and the monitor's familiarity with the related data and issues. (See Appendix M for guidelines on developing an analytical plan and Appendix N for an analytical plan format.) For example, a more extensive analytical plan should be developed for an outlier that does not appear to be easily explainable, or if the analysis is to be performed by an analyst unfamiliar with the information that needs to be reviewed. The analysis should address the following issues:
(a) Accuracy of Performance Data. A confirmation of the accuracy of the performance data showing the 6-month outlier. If it is found that the data are inaccurate, the data shall be corrected promptly, which may involve the State, Region, and/or the National Office.

IV-11
If, after the data are corrected, an outlier does exist, analysis shall be initiated.
(b) Possible Cause(s). A listing of the possible cause(s) of the outlier.
(c) Information to be Obtained and Reviewed. An identification of all information to be obtained and reviewed to prove or disprove the hypothesized cause(s) of the outlier.
(d) Sources of Information. An identification of all sources for the information to be obtained and reviewed, and methods for its collection.
(e) Data Collection and Analysis. The collection and analysis of the data to identify the cause(s) of the outlier.
(f) Determination of Cause(s). A statement of the identified cause(s) of the outlier, supported by the review and analysis of all relevant information.
(g) Conclusion(s). An assessment of the potential impact of an outlier and its cause on the effectiveness of State performance in a particular program area relative to comparable Federal performance. One of the following six conclusions may be reached:
1 The outlier does not exist after review and correction of actual performance data.
2 The outlier exists, but it does not reduce the effectiveness of State performance in a particular program area below that of comparable Federal performance.
3 The outlier exists, but indicates that State performance exceeds Federal performance.

IV-12
4 The outlier exists, but performance on other related measures compensates for the existence of the outlier, so that overall State performance in a program area is at least as effective as comparable Federal performance.
5 The outlier exists, and could cause State performance in a program area to be less effective than comparable Federal performance.
6 After review and analysis of all relevant information, the cause of the outlier and its impact on a State's program cannot be determined.
(h) Recommendation(s). If the outlier suggests that the State performance in a particular program area may be less effective than comparable Federal performance, a remedial strategy and timetable for corrective action shall be developed. Suggestions for improving performance may also be made even though it has been determined that an outlier does not render a program area less effective.
d. Preparation and Review of Analytical Reports. At the completion of each outlier analysis, a report documenting the study, its findings, and any recommended actions shall be prepared. These reports should provide an important source of information for the Annual Evaluation Report.
(1) Analytical Report Format. Analytical reports shall be prepared using the format presented in Appendix O.
(2) Communication Between Region and State. As soon as a preliminary analytical report is completed by the regional and/or State staff who conducted the analysis, the Region and State shall discuss the report with regard to the sufficiency of analysis, conclusions reached, and recommendations made as

IV-13
part of the quarterly discussion of SPAM data or on an ad hoc basis. Based on such discussion, the Region shall prepare a final analytical report and forward a copy to the State.
(3) Opportunity for Written Response. If the State disagrees with the sufficiency of the analysis, conclusions reached, or recommendations (if any) made in a final analytical report, discussions shall be held and modifications to the report made as mutually agreeable. Where there is no resolution of differences of opinion, the State shall have an opportunity to file a written response that shall be attached to the final analytical report when it is sent to the National Office by the Region. The State shall have a reasonable amount of time to prepare its response.
(4) Submission of Reports to National Office. The Region shall submit to the National Office in a timely manner a copy of the final analytical report and any State response.
e. File of Analyses and Responses to Analyses. Copies of all completed analytical reports, as well as all responses to analyses, shall be maintained in the respective Region and the National Office. The National Office shall maintain a compendium of selected outlier analyses to be provided to the Regions as examples of methods used for analysis of similar outliers. On a periodic basis, the National Office shall send to the Regional Administrator a memorandum identifying the outlier analyses that are being maintained in its compendium.

D. Procedures for Review and Analysis of Information Outside the IMIS.

1. Bureau of Labor Statistics (BLS). After the BLS Activities Measures Reports for States are available, the BLS National Office shall send a copy of the appropriate State's reports to each BLS Regional Office, and the OSHA National Office. The BLS Regional Office shall forward a copy to the State agency which participates in the BLS statistical program. The OSHA National Office shall forward a copy of the appropriate State's reports to each OSHA Region, which, in

IV-14
turn, shall promptly forward a copy to the States. After receipt of the reports, the BLS Regional Office shall prepare an analysis of the data after consultation with the OSHA Regional Office with respect to the timing and content of the analysis. The analysis shall include, but not be limited to:
a. A discussion of levels and trends in employment; rate differentials between the State public and private sectors; and rate differentials between the State and non-State plan States.
b. An analysis of all outliers by identifying contributing factors such as differences and changes in industrial composition and industry rates.
c. Upon completion of a. and b. (above), the BLS Regional Office shall send a copy of the analysis to the OSHA Regional Office, which, in turn, shall promptly forward a copy to the State. If the Region or State should disagree with any aspect of the analysis, the Region and State should discuss the issue with the BLS Regional Office. The Region and State, in consultation with the BLS Regional Office, shall then draw conclusions relating the BLS data and analysis to State performance on a particular activities measure or to overall State program effectiveness.
d. The OSHA Regional Office shall provide a copy of the State's Annual Evaluation Report to the BLS Regional Office.
2. Program Resources. Within 2 months of the annual grant awards, or when the necessary data become available, the National Office shall forward a copy of the Program Resources activities measures report for each State to the Regional Administrator, who shall promptly forward a copy to the State. Upon receipt of this report, the Region and State shall analyze and review any outliers identified in accordance with this chapter.
3. Program Administration.
a. State Response to Federal Program Changes and State-Initiated Plan Changes. On a quarterly basis, the Region shall calculate the performance measures for

IV-15
State response to Federal program changes and for State-initiated plan changes, based on timeliness, and identify any outlier(s). Any such outlier(s) shall be analyzed and reviewed in accordance with this chapter. State responses to Federal program changes and State-initiated plan changes shall be reviewed for their impact on a State program other than timeliness according to other established procedures. (See 29 CFR Part 1953 and OSHA Instructions STP 2-1.18 and STP 2-1.21.)
b. Enforcement Response Time. Upon receipt of the program administration activities measures report from the National Office on enforcement response time, the Region and State shall analyze and review any outliers identified in accordance with this chapter.
4. Standards and Variances. On a quarterly basis, the Region shall review any standards adopted or variances granted, calculate the performance measures, and identify any outlier(s). Any such outlier(s) shall be analyzed and reviewed in accordance with this chapter. Any standards adopted shall be reviewed for their impact on a State program other than that covered by the activities measures according to other established procedures. (See OSHA Instruction STP 2-1.117.) Procedures for National Office review of variances are as follows:
a. The Office of State Programs shall provide a copy of any variance granted to the Office of Variance Determination in the National Office, which shall review such variance for its impact on a State program with regard to comparable Federal precedents and procedures.
b. If a State has been granted an exception to the variance procedure due to a large number of variances (See Chapter III, B.3.b.(5)(b), page III-7), the Office of State Programs shall provide to the Office of Variance Determination a copy only of those variances that the Regional Administrator feels do not ensure equivalent protection.
c. The Office of State Programs shall forward any comments received from the Office of Variance Determination to the Region and State in a timely manner.

IV-16
5. Contested Cases, Discrimination Cases, Other Significant Actions, and Denials of Entry. Upon receipt of the opinion from the Regional or National SOL on whether any contested case, discrimination case, other significant action, or denial of entry has the potential for making State performance in a particular program area less effective than comparable Federal performance, the Region shall promptly forward a copy to the State. The Region, after assessing all available information and consulting with the State, shall then make a determination as to the potential impact of the occurrence. If the determination is that such contested case, discrimination case, judicial action, or denial of entry has the potential for making State performance less effective than the Federal, it shall be considered an outlier. In accordance with this chapter, the necessity for corrective action shall be determined and appropriate recommendations developed in coordination with the State.
6. State-Specific Program Activities for Which There Are No Performance Measures. With regard to existing State-specific program activities and alternative State procedures for which there are no performance measures, the Region and State shall agree upon the procedures to be utilized in reviewing and analyzing such program activities.
a. When a State submits a proposed plan change for new State-Specific program activities for which there are no performance measures, it shall include proposed procedures for review and analysis of such activities.
b. If the procedures are acceptable, the Region shall discuss them with the State and implement them.
c. If the procedures are not acceptable, the Region and State shall discuss and modify them before the plan change is forwarded to the National Office for approval.

IV-17
      CHAPTER V--REVIEW AND ANALYSIS OF COMPLAINTS ABOUT STATE
                   PROGRAM ADMINISTRATION (CASPAS)

A. Definition of a CASPA. A Complaint About State Program Administration (CASPA) is a complaint, either oral or written, made by any person or group of persons to the National Office or a Region about some aspect of the operation or administration of a State plan. It may relate to a specific State action; e.g., an inspection, or may reflect a more generic criticism of State program administration.

B. Purpose of a CASPA. The CASPA provides a means to:

1. Give employees, employers and the public an opportunity to bring specific problems or dissatisfactions about State performance to the attention of OSHA;
2. Maintain an awareness of areas of public concern about a State's program;
3. Identify possible problem areas of State performance that may not be discovered through routine monitoring activities; and
4. Form the basis for corrective action to be taken by the State in the case of valid complaints, with regard both to individual concerns and to systemic deficiencies.

C. Determining If a CASPA Warrants Investigation. The Region shall determine if a CASPA warrants investigation within 5 days of receipt.

1. All CASPAs shall be investigated unless:
a. The complainant has not exhausted applicable State administrative remedies specifically provided for within State procedures and regulations. For example, if a CASPA involves a State case under contest, and if the contest could provide the complainant with an administrative remedy to his/her complaint, the Region shall not investigate and shall notify the complainant that if the results of the contest prove unsatisfactory, a CASPA may be filed at that time;
b. The complaint questions the authority of a State program

V-1
to take a particular action which the State is clearly required or allowed to take under its approved plan;
c. The complaint pertains to a matter not within the jurisdiction of the State program; or
d. The Region has already investigated a sufficient number of complaints concerning the same issues such that an additional investigation would not be worthwhile.
2. If the Region lacks sufficient information to make such a determination, the Region shall solicit additional information in a timely manner from the complainant if his/her identity is known or from the State. The Region shall, within 5 days of receipt of such additional information, determine if the CASPA warrants an investigation.
3. Notwithstanding the timetable established in this section, if a CASPA alleges that a situation of imminent danger exists, such a determination shall be made immediately.
4. Anonymous CASPAs shall be investigated unless there is insufficient information to proceed with an investigation, or it is clear from the CASPA that one of the exceptions listed in C.1., above, applies.

D. Confidentiality. Federal regulations (29 CFR 1954.21) require that the identity of any CASPA complainant remain confidential.

1. At any time that the Region contacts the State concerning a CASPA, it shall withhold the name of the complainant. When the Region forwards a copy of the complaint or its response to the complainant to the State, the name of the complainant shall be deleted.
2. If deletion of the name of the complainant is not sufficient to maintain his/her confidentiality, the Region shall prepare a summary of the complaint or response that does not reveal the complainant's identity to forward to the State. The name of the complainant shall not appear in any record published, released, or made available.
3. When it is impossible, for whatever reason, to maintain confidentiality, the Region shall conduct the investigation alone.

V-2
4. As required by Chapter XIV of OSHA Instruction CPL 2.45A, the identity of a CASPA complainant or any information tending to identify a complainant which is contained in a Federal case file shall not be disclosed.
5. For CASPAs concerning the State's handling of a discrimination complaint, the Region shall delete the name of the CASPA complainant from the complaint, but shall also inform the State of the name of the person who filed the discrimination complaint, so that it can investigate.
6. Notwithstanding the above, if in the judgment of the Region, it would facilitate the investigation, the Region may attempt to obtain a written waiver of confidentiality from the complainant.

E. Notification of Concerned Parties and Opportunity for State Response.

1. If An Investigation Is Not Warranted. If the Region determines that a CASPA does not warrant an investigation, the complainant shall be notified immediately, in writing, of the preliminary determination and the reasons for it. The complainant shall be informed that he/she can ask the Regional Administrator for reconsideration of the decision. The Region shall also forward to the State a copy of the letter to the complainant with the confidentiality of the complainant maintained in accordance with D., page V-2.
2. If An Investigation Is Warranted.
a. CASPAs Alleging Situations of Potential Imminent Danger. If a CASPA alleges that a situation of potential imminent danger exists, the Regional Administrator shall, while maintaining

V-3
the confidentiality of the complainant in accordance with the above D., page V-2, immediately contact the State to ensure that, if an imminent danger exists, appropriate action is taken. The Region shall notify the complainant of its action in a timely manner and keep him/her informed of the CASPA investigation as appropriate. (See Appendix J for an example of an acknowledgment letter.)
b. Routine CASPAs. The Region shall, as soon as it determines that a CASPA investigation is warranted, forward to the State a copy of such CASPA, while maintaining the confidentiality of the complainant in accordance with D., page V-2, and provide the State the opportunity to respond to such CASPA within a reasonable timeframe for consideration as part of the Region's investigation. The Region may, if appropriate, identify the issues for which State response would be most useful. The Region shall also notify the complainant of the following items:
(1) A decision has been made to investigate the CASPA;
(2) The first step in the investigation involves a State review of the matter at issue;
(3) The Region may be contacting him/her to obtain additional information; and
(4) The Region will send the complainant a written response detailing the results of the investigation.
c. CASPAs Where Initial State Response May Not Be Appropriate. Unless the CASPA alleges that a situation of potential imminent danger exists, the Region may determine that the opportunity for an initial State response may not be appropriate.
(1) Such a determination would be due to the sensitivity of the issues involved, including situations where it would be impossible to provide an initial State response while at the same time maintaining the confidentiality of the complainant, or because the complainant is an employee of the State program.
(2) In such a case, the Region shall conduct an independent investigation in a timely manner. Before initiating the investigation, the Region shall notify the State of such decision, in writing, maintaining the confidentiality of the complainant in accordance with D., Page V-2. The Region shall keep the complainant informed of the CASPA investigation as appropriate.

F. Methods for Investigation of a CASPA. The Region shall determine the methods to be used to investigate a CASPA. If deemed appropriate, some or all of the procedures for outlier analysis may be used.

V-4
1. In cases where a State response was solicited and received in a timely manner, the Region shall verify the information provided and consider it as an information source in the Region's own analysis of the CASPA. The Region may, however, initiate its own investigation before receiving the State's response.
2. If a CASPA involving a specific establishment or jobsite requires a visit to the establishment or jobsite in order for a State to prepare its response, the State and Region should, if at all possible, coordinate a joint onsite visit. This procedure should preclude the necessity for two onsite visits (one State and one Federal) and the resultant burden on the employer and employees.
3. The Region shall keep the complainant informed of the CASPA investigation as appropriate.

G. Review of Completed CASPA Investigations.

1. Communication Between Region and State. As soon as a CASPA investigation is completed, the Region and State shall discuss the Region's findings with regard to the sufficiency of analysis, conclusions reached, recommendations made (if any), and any planned State action in response, including a timetable for implementation of such action.
2. Response to Complainant. As soon as the Regional Administrator and State Designee have discussed the findings and recommendations from a CASPA investigation, the Regional Administrator shall draft a letter of response to the complainant summarizing the investigative steps taken, analysis conducted, conclusions reached, and any corrective action taken or planned to be taken by the State. Such letter shall also advise the complainant of his/her right to request reconsideration by the OSHA Regional Administrator.
a. Review of Draft Response to Complainant. As soon as a draft response to the complainant has been prepared, the Regional Administrator shall discuss the proposed response with the State. The copy of any draft response provided to the State at its request shall maintain the confidentiality of the complainant in accordance with D., page V-2.

V-5
b. Notification of Complainant. Based on the discussion with the State, the Regional Administrator shall make any changes in the draft response to the complainant that are deemed appropriate, and send a final response to the complainant in a timely manner.
3. Letter to the State. The Region shall send a letter to the State setting out the conclusions of its investigation and recommendations for corrective actions, if any. The Region may also send to the State a copy of its final response to the complainant, maintaining the confidentiality of the complainant in accordance with D., page V-2. The Region shall also propose a timetable for the corrective actions.
4. State Response. If the State disagrees with any aspect of the investigation, it shall have an opportunity to file a written response that shall be attached to the copy of the response to the complainant when it is sent to the National Office. The State shall have a reasonable amount of time to prepare a response.
5. Forwarding of Response to Complainant to National Office. The Region shall forward to the National Office, upon completion of the CASPA investigation, one copy of the response sent to the complainant plus one copy of any correspondence regarding the CASPA. If the State disagrees with any aspect of the investigation, the Region shall attach a copy of the State's response.
6. Corrective Action. The State shall take corrective action on individual cases with valid CASPAs, if at all possible. If no relief is possible, the State shall put in place a procedure to avoid recurrence of the problem.
a. If systemic problems are identified by the CASPA, the Region shall initiate a full analysis of the program at issue and develop methods for correction.
b. In either case, if the State fails or refuses to take corrective action and the Region is unable to negotiate a solution, the case shall be referred to the National Office for coordination of further action.

V-6

H. Documentation of CASPA Investigation. The amount and kind of information that is collected and analyzed will vary for each CASPA. However, each case file should contain written documentation of the following:

1. Identification of Allegations to Be Investigated. A statement of allegations to be investigated.
2. Information Reviewed. A listing of all information reviewed that is relevant to investigating the CASPA.
3. Analysis of Conclusions. An analysis of the allegations in the complaint and conclusions with respect to their validity. Conclusions should be reached with respect to any specific as well as systemic problems.
4. Recommendations. Where deemed appropriate, reasonable corrective action to be taken by the State and a timetable for such action. Recommendations should address any specific as well as systemic problems.
5. Response of State. The actions taken by the State in response to conclusions reached and recommendations made.
6. Followup by Region. The actions of the Region as a followup to the conclusions reached and recommendations made.
NOTE: The Regional Administrator's written response to the complainant may satisfy most of these documentation requirements.

V-7

CHAPTER VI--FINANCIAL AND ADMINISTRATIVE

MONITORING OF 23(g) GRANTS

A. Introduction. Monitoring of financial and administrative aspects of OSHA 23(g) grants shall be done in accordance with OSHA Instruction FIN 3.2, August 27, 1984.

B. OSHA Instruction FIN 3.2 As It Relates to OSHA Instruction STP 2.22A. A report on the administrative and financial monitoring of a State plan prepared in accordance with FIN 3.2 may contain findings indicating that a State's administrative and financial policies, systems, or practices may have an effect on State plan performance. In such a case, the Region shall ascertain the cause thereof and assess its impact on the effectiveness of the State's plan, generally following the analytical procedures set out in Chapter IV of this manual. Where appropriate and relevant to State plan performance, the FIN 3.2 report findings and results of any further analysis conducted by the Region shall be included in the State's Annual Evaluation Report. (See page VIII-4.)

VI-1
        CHAPTER VII--MONITORING PROCEDURES FOR PUBLIC EMPLOYEE
                          ONLY STATE PLANS

A. Public Employee Only State Plans. These plans shall be monitored in accordance with the policies and procedures set forth in this manual except as provided below:

1. Generally, all of the activities measures contained in Appendix A apply to these State plans, with the exception that measures of injury-illness rates in the Public Employee Program category (section E) are used instead of those in the Program Results category (section K).
2. None of the other measures in the Public Employee Program category are applicable. Computerized SPAM reports will reflect this.

B. Injury-Illness Rate Activities Measures. With regard to the injury-illness rate activities measures of the Public Employee Program category, if no State private sector data are available against which the State public sector performance measures can be compared, such measures shall instead be compared to an average of the Federal private sector injury and illness data for those performance measures in the BLS Activities Measures Report. This comparison shall be contained in the Activities Measures report prepared by BLS.

VII-1
CHAPTER VIII--ANNUAL EVALUATION OF STATE PROGRAMS

A. Purpose and Scope of Evaluation Report. The Annual Evaluation Report presents OSHA's assessment of the overall effectiveness of a State program to the State and the interested public.

1. The report shall include a discussion of the State's performance in all program areas, covering both aspects of the program that effectively achieve the program's goals and those areas of the program's performance over the year that are in need of improvement. Activities measures without outliers may be grouped together and discussed in general terms, and performance data from several such activities measures may be covered at the same time.
2. The report shall include a review of any analyses of statistical outliers or any other evidence of potential program deficiencies identified during the year, and evaluation of their impacts on overall program performance, and an indication of any State corrective action already undertaken, as well as any recommendations on how a State might improve its performance.
3. Because the report is an annual evaluation, it will reflect the 12-month data for the period in question. It is likely that any 12-month outlier will have been previously identified and analyzed during the year as a 6-month outlier. The findings and conclusions of that analysis should be just as applicable to the 12-month data. Thus, use of 12-month data should not in most cases require any additional special monitoring, but rather extrapolation of conclusions reached during the course of the evaluation period. Analysis of these 12-month outliers should be incorporated into the text of the Annual Evaluation Report without attaching the original 6-month outlier reports. Evaluation of any new or unusual outliers or performance first reflected in the 12-month data used for an evaluation report may be deferred.

B. Format of Evaluation Report.

1. Title Page. The title page shall include:
a. The name of the State;

VIII-1
b. The designated Agency;
c. The dates of plan approval, certification and final approval;
d. The period covered by the report; and
e. The Region where the report was prepared.
2. Table of Contents. The table of contents shall include a complete outline of the report identifying each section and the page on which it begins.
3. Executive Summary. The executive summary shall include:
a. A brief description and assessment of a State's performance within each major program category;
b. A brief description and assessment of any State-specific program activity not discussed in the previous subsection; and,
c. An assessment of the overall effectiveness of the State program relative to the Federal program.
4. Introduction. The introduction shall include:
a. A brief historical profile of the State program including a brief description of the program's activities and structures, and, if applicable, any significant differences from the Federal program;
b. A summary of any major trends or changes that occurred during the year impacting the State program;
c. A brief statement of the number of CASPAs received and investigated during the year, and the number found valid; and,
d. A brief description of the system used to monitor the State's performance, including definitions of further review level and outlier.

VIII-2
5. Discussion of Program Categories. The report shall address each of the 11 major program categories in an individual section:
o Standards
o Variances
o Consultation
o Voluntary Compliance (Training and Education)
o Public Employee Program
o Enforcement
o Review Procedures
o Discrimination
o Program Administration
o Program Resources
o Program Results
Each section shall include:
a. State's Policies and Procedures. A description of the State's policies and procedures and how they influence State performance in the program category.
b. Relevant Performance Data. A description and analysis of relevant State performance data shall include a discussion of the 12-month performance measures for each activities measure plus a discussion of any earlier 6-month outliers not reflected in the 12-month data. Those measures "For Information Only" may be discussed if they are relevant.
(1) The detail of discussion of the performance measures will vary. For those measures having any 12-month outlier(s), the cause(s) of the outlier(s) and the impact on the State's program shall be discussed, citing analyses conducted in response to parallel 6-month outliers occurring during the evaluation period. This discussion should include:

(a) A summary of the analysis(es) performed;
(b) Corrective actions in progress or taken; and,
(c) The status of the outlier and of any recommendations outstanding at the end of the evaluation period.

VIII-3
(2) Activities measures not having outliers shall be discussed to demonstrate the effectiveness of the State's program in that area. Further review levels and Federal data do not need to be mentioned when there are no outliers except to make a point relevant to the discussion.
c. Other Program Features. A description and analysis of other program features in the program category shall include:
(1) CASPAs. All CASPAs that were found to have a significant impact on a State's program relevant to a category shall be discussed. This discussion shall include a description of the CASPA, conclusions reached, any recommendations made, and the status of any such recommendations at the end of the evaluation period.
(2) State-Specific Activities. Any State-specific activity relevant to a category shall be described and its impact on State performance shall be analyzed and assessed in regard to the agreed-to monitoring scheme discussed in Chapter IV.
(3) OSHA Grant Monitoring Report. Any relevant findings in OSHA's 23(g) grant monitoring report shall be summarized.
(4) Response to Previous Recommendations. Any State responses, including evaluation change supplements, to recommendations in previous Annual Evaluation Reports shall be discussed. Recommendations in previous reports that have not been addressed by the State shall be noted and the impact on the State's program assessed.
(5) Other Information. Any other State program information that impacts State performance in a category shall be described, and its impact on State performance shall be analyzed.
d. Overall Performance. A qualitative assessment of the State's overall performance in the category in relation to the "at least as effective" standard shall be made.

VIII-4
e. Recommendations. Recommendations, if any, on how a State may act to improve its performance in the category shall be included.
6. Conclusion on Overall Effectiveness of State Program. A concluding statement of the overall effectiveness of the State program in comparison to the Federal program, including a judgment as to whether any outliers or pattern of outliers render or could potentially render the overall State program less effective than the Federal program.

C. Procedures for Developing the Evaluation Report.

1. Timing. The Region shall submit a report to the National Office within 60 calendar days of receipt of the last SPAM Report of the evaluation period. The Annual Evaluation Report need not include a discussion of new outliers and other information appearing in the 6-month data on the last SPAM Report of the evaluation period if that analysis would delay preparation of the report. However, such outliers and information should be discussed, where appropriate, in the following year's report.
a. Coordination Between Region and State. The State and Region shall determine the nature and extent of interaction between them, including a time schedule, that is necessary to allow the Region to submit a timely and comprehensive report to the National Office. At such discussion, the State and Region should agree on when the report would be ready for the State to review to ensure that the 60-day timeframe will be met. The State shall have a minimum of 14 calendar days for review.
b. State Response to Region's Report. The State shall have the opportunity to submit to the Region a written response to the report prior to its submission to the National Office. The Region shall attempt to resolve differences with the State and, as appropriate, modify the report to incorporate State oral or written comments. Only if agreement on changes to the report has not been reached shall the State response be appended to the report when the Region forwards it to the National Office.
2. National Office Role. The National Office shall review the Region's report and any accompanying State response, and discuss any recommended changes with the Region. The Region and the National Office shall coordinate any revisions to be

VIII-5
made. If substantive changes are made to the Region's report, the National Office shall promptly notify the State. The National Office shall also prepare a summary assessment of the report and its findings for the Assistant Secretary and a transmittal letter to the State. The National Office shall then forward the final report to the Assistant Secretary for transmittal to the Region and State no later than 30 calendar days after receipt of the report.
3. State Response to Final Report. Within 60 days of receipt of the final report from the Assistant Secretary, the State shall have the opportunity to submit a formal response to such report to the Region. The State's formal response is required if the report contains recommendations. The response may comment on the monitoring findings and respond to the recommendations, as appropriate. The Region shall, upon receipt of such response, promptly forward a copy to the National Office.

D. Availability of the Final Report and Any State Response. In accordance with the Freedom of Information Act, once the final report has been officially transmitted to the State by the National Office, the final report and any State response shall be available to the public upon request.

VIII-6