- Record Type: OSHA Instruction
- Current Directive Number: STP 2.22A CH-5
- Old Directive Number: STP 2.22A CH-5
- Title: Changes to the State Plan Policies and Procedures Manual
- Information Date:
NOTICE: This is an OSHA Archive Document, and may no longer represent OSHA Policy. It is presented here as historical content, for research and review purposes only.
OSHA Instruction STP 2.22A CH-5
March 10, 1995
Office of State Programs
Subject: Changes to the State Plan Policies and Procedures Manual
A. Purpose. This instruction transmits revisions to the State Plan Policies and Procedures Manual (SPM).
B. Scope. This instruction applies OSHA-wide.
C. Action. OSHA Regional Administrators shall ensure that the following actions are accomplished:
1. Remove OSHA Instruction STP 2.22A, Part II, Chapters I through V, Appendix D, page 1, Appendix F, page 1, Appendix I, pages 3 and 4, and Appendix K, pages 3 and 4, and replace them with OSHA Instruction STP 2.22A, CH-5, Part II, Chapters I through V, Appendix D, page 1, Appendix F, page 1, Appendix I, pages 3 and 4, and Appendix K, pages 3 and 4.
2. File one copy of this instruction in the appropriate OSHA Directives System binder and one copy at the front of your current SPM binder, directly behind STP 2.22A, CH-4, as a record of this change.
3. Ensure that State designees and monitoring Area Offices have received copies of this instruction and appropriately updated their SPM.
4. Discuss the changes in procedures with the State designees.
D. Background.
1. Previous Manual. On March 3, 1994, Change 4 to this instruction was issued after a series of meetings by several task groups composed of Federal and State representatives. Significant changes in that revision included revised State Plan Activities Measures and the requirement for States to implement internal evaluation programs. The Federal Program Change requirement for submission of a plan supplement describing the State Internal Evaluation Program by September 1994 (Section E.2), and the requirements for submission of copies of permanent and temporary variances granted and of precedential decisions in contested and employee discrimination cases (E.3), remain in effect.
2. Reinvention Conference. In April 1994, a Reinvention Conference on State Plan Monitoring was held in which front-line monitoring staff from Federal and State plan offices discussed their suggestions to improve the monitoring process. The Report on the Conference, issued on June 2, 1994, discussed proposed changes to the monitoring procedures, and a memorandum on Streamlining Monitoring Guidelines, issued on June 20, 1994, implemented several changes. This document incorporates those changes into the revised State Plan Policies and Procedures Manual.
3. Significant Changes. Changes in the text are highlighted by a black line in the margin. Significant changes include:
a. Comprehensive evaluation reports on State performance shall be issued on a biennial basis rather than annually. Interim reports will be issued in alternate years, focusing on major accomplishments and unresolved issues. (See page V-1.)
b. All evaluation reports will be issued by the Regional Administrator after coordination with the State and two weeks opportunity for review by the Office of State Programs. (See pages V-11 and V-12.)
c. At the suggestion of participants at the Reinvention Conference, the manual clarifies and strengthens some monitoring instructions which were included in Change 4 of this Instruction. These include:
(1) The role of the State as a partner in monitoring.
(2) The requirement to incorporate grant monitoring into the evaluation process and the biennial report.
(3) Attendance of monitoring staff at quarterly meetings.
(4) Attendance of monitors at State training sessions.
4. Future Revisions of the SPM. It is anticipated that further changes will be made to the monitoring procedures as a result of other recommendations from the Reinvention Conference, including changes to the format of the Computerized State Plan Activities Measures (C-SPAM) report. Revisions to the manual will be made by means of page changes transmitted by OSHA Instructions.
Joseph A. Dear
Assistant Secretary
DISTRIBUTION: National, Regional and Area Offices
State Plan Monitoring Personnel
State Designees
OSHA Training Institute
Part Two, State Plan Monitoring and Evaluation
CHAPTER I. -- INTRODUCTION
A. The Act
B. The Monitoring System
C. Scope of the Monitoring System
D. Goals of the Monitoring System
1. The objective and consistent analysis of data
2. Technical assistance and information
3. Consistency in evaluation of State programs combined with the flexibility to address unique State situations
E. Framework of the Monitoring Process
1. An ongoing process
2. Regular evaluation of readily available data
3. The monitoring plan
4. Quarterly Meetings
5. Data Analysis
6. Evaluation Report
F. Roles and Responsibilities of the Parties
1. National Office
2. Regions
3. States
4. Resolution of Issues
CHAPTER II -- THE MONITORING AND EVALUATION PROCESS
A. Ongoing Performance Tracking
1. Introduction
2. Readily Available Data - State Plan Activities Measures
3. Readily Available Data - IMIS Reports
4. Readily Available Data - State Activity Logs
5. Readily Available Data - The Grant
B. The Annual Monitoring Plan
1. Introduction
2. Purpose
3. Coordination with the States
4. Timing
5. Informational Copy to the National Office
6. Identification of Issues to be Evaluated
7. Development of the Annual Monitoring Plan
C. Quarterly Meetings
1. Scheduling
2. Purpose
3. Topics
4. Documentation
D. Evaluation Methodologies
1. Introduction
2. Evaluation Hierarchy
3. Host Reports
4. On Site Monitoring
E. Evaluation and Analysis
1. Identification of Potential Causes
2. Formulation of Analytical Questions
3. Identification of Data Sources
4. Review of Data
5. Conclusions and Recommendations
6. Documentation
F. Review of State Internal Evaluations and Use of Results
1. The Region shall review the effectiveness
2. The State shall forward to the Region a summary
3. After the monitors conclude
G. Coordination Between the Region and State on Findings
1. After analysis of every issue identified for evaluation
2. The purpose of these discussions
H. Monitoring Developmental State Plans
1. Quarterly Meetings
2. Outlier Analysis
3. On-Site Monitoring
I. Monitoring Public Employee Only State Plans
J. The Report
CHAPTER III -- MONITORING SECTION 23(G) GRANTS
A. Introduction
B. Ongoing Monitoring
1. The Financial Status Report (SF 269) and the Federal Cash Transactions Report (PMS 272)
2. The other form of readily available data
3. Discussions of Programmatic Goals at Quarterly Meetings
C. Annual Financial Review
CHAPTER IV -- COMPLAINTS ABOUT STATE PROGRAM ADMINISTRATION
A. Definition of a CASPA
B. Purposes of a CASPA
1. Provides employers, employees, and the public an opportunity to address specific issues
2. Provides OSHA with an additional source of information
3. Provides the basis for State corrective action
C. Determining If a CASPA Warrants Investigation
1. All CASPAs shall be investigated
2. If the Region lacks sufficient information
3. If a CASPA alleges that a situation of imminent danger exists
4. Anonymous CASPAs
5. All CASPAs will be entered on the CASPA Log
D. Confidentiality
1. When contacting the State concerning a CASPA
2. Notwithstanding the above
E. Notification of Concerned Parties and Opportunity for State Response
1. If An Investigation Is Not Warranted
2. If An Investigation Is Warranted
F. Methods for Investigation of a CASPA
1. If the State responds to the Region
2. If the Region or State determines that it is necessary
G. Review of Completed CASPA Investigations
1. Communication Between Region and State
2. Response to Complainant
3. Letter to the State
4. State Response
5. Forwarding of the Response to Complainant to the Office of State Programs
6. Corrective Action
7. Resolution
H. Documentation of CASPA Investigations
1. Identification of Allegations To Be Investigated
2. Information Reviewed
3. Analysis and Conclusions
4. Recommendations
5. Response of State
6. Followup by Region
CHAPTER V -- THE EVALUATION REPORTS
A. Introduction
1. A Comprehensive Evaluation Report
2. An Interim Evaluation Report
3. Monitoring activities
B. Comprehensive Evaluation Report
1. Purpose and Scope
2. Format
C. Interim Evaluation Report
1. Purpose and Scope
2. Format
D. Procedures for Developing the Interim and Comprehensive Evaluation Reports
1. Timing
2. State Response to the Report
3. Role of the Office of State Programs
4. State Response to the Final Report - Action Plans
5. Regional Action on State Responses
E. Availability of the Final Report and Any State Response
F. Evaluation Report for Developmental Plans
CHAPTER I. -- INTRODUCTION
A. The Act
Section 18 of the Occupational Safety and Health Act allows any State to assume responsibility for the administration of an occupational safety and health program meeting each of the criteria established in the Act. The Secretary of Labor is required to make a continuing evaluation of each State's program to ensure that it meets the statutory test of being "at least as effective as" the Federal program. Regulations regarding the Agency's evaluative responsibilities are codified at 29 CFR Parts 1902 and 1953.
B. The Monitoring System
The Agency has established a monitoring system applicable to all State occupational safety and health programs. The continuing evaluation required by the Act is accomplished using this system.
C. Scope of the Monitoring System
The scope of OSHA's monitoring includes any State activity that: (1) receives Federal funding under Section 23(g) of the Act; (2) would be preempted by Section 18 of the Act absent an approved State plan; or (3) is included in the State plan. If the State uses 100% State funding for activities that do not meet the foregoing tests, such activities are generally not assessed by OSHA, but their impact on a State's responsibilities under its plan shall be subject to monitoring.
D. Goals of the Monitoring System
The monitoring system's success depends on constant and constructive communication and cooperation between OSHA and the States. The States are given the opportunity to participate in every phase of the monitoring process and are encouraged to do so. The process is designed to achieve the following goals:
1. The objective and consistent analysis of data from all relevant sources to determine whether a State program is operating at least as effectively as the Federal program.
2. Technical assistance and information to the States to help in program implementation and the correction of any deficiencies.
3. Consistency in the evaluation of State programs, combined with the flexibility to address unique State situations.
E. Framework of the Monitoring Process
1. An ongoing process. The monitoring process is an ongoing one. It entails the regular review of data throughout the evaluation period to track State activities. Although comprehensive evaluation reports are produced biennially, monitoring activities are continuous throughout the reporting period.
2. Regular evaluation of readily available data. Throughout the evaluation period the monitors assigned primary responsibility will review quarterly Computerized State Plan Activities Measures (C-SPAM) reports as they are issued, as well as other readily available IMIS reports to track a State's activities, and to identify areas requiring further analysis.
3. The monitoring plan. An essential step in the monitoring process is the establishment of an annual monitoring plan. In consultation with the State, the Region will create a detailed plan for monitoring throughout the evaluation period. The monitoring plan should be adjusted during the evaluation period in response to changing circumstances.
4. Quarterly Meetings. Federal and State representatives must meet at least every quarter to review the ongoing tracking of State performance and to discuss any issues covered by the monitoring plan. Quarterly meetings foster a necessary ongoing and open dialogue between the monitors and the States. Records of the meetings are retained to document the issues discussed and any commitments made during the meetings.
5. Data Analysis. Utilizing the varied data sources, the monitors must analyze all data necessary to assess a State program's effectiveness. The analytical tools available may be used, in the order outlined in this manual, to secure a thorough analysis of any issues identified for analysis in the monitoring plan.
6. Evaluation Report. At the close of the evaluation period OSHA shall develop an evaluation report that provides a comprehensive review of a State program's activity. The report describes the State's program, evaluates overall performance, reviews the status of any corrective action proposed in prior reports, and makes recommendations for improvements. The evaluation report is the culmination of the monitoring process and provides an overview of monitoring activities, State performance and State-initiated actions throughout the evaluation period. Comprehensive reports will generally cover a two-year period. Interim evaluation reports will be prepared for alternate years and will cover only major issues such as unresolved performance issues, State-initiated changes and Complaint About State Program Administration results.
F. Roles and Responsibilities of the Parties
1. National Office
a. Directorate of Federal-State Operations. Through the Office of State Programs (OSP), the Directorate of Federal-State Operations (FSO) provides programmatic advice to the Assistant Secretary and the Regional Administrators relating to all State plan issues. OSP shall coordinate the monitoring program to ensure that it is applied to the States in a consistent and objective manner. OSP is also available to assist the Regions in implementing the program effectively and in the resolution of disputes.
b. Directorate of Administrative Programs
(1) Office of Management Data Systems. The Directorate of Administrative Programs, through the Office of Management Data Systems (OMDS), is responsible for systems analysis, systems design, the programming of Computerized State Plan Activities Measures (C-SPAM), and the processing of both Federal and State program data as part of OSHA's computerized Integrated Management Information System (IMIS). OMDS is also responsible for producing and distributing routine reports (such as C-SPAM) and special reports. OMDS also designs and writes software programs to respond to requests for more complex, special, ad hoc reports and to enable Federal field offices and States to query the national data base directly. In addition, OMDS provides the States, the Regions, and other components of the National Office with training and assistance in operating the IMIS.
(2) Office of Program Budgeting, Planning and Financial Management - Division of Grants Management. The Division of Grants Management is responsible for:
(a) Day-to-day management of grant funds
(b) Annual grant instructions
(c) Guidelines for financial amendments
(d) The evaluation of Regional financial analysis
2. Regions. The Regional Administrators are responsible for the consistent monitoring of State programs within their Regions. In this context the Regional Administrator may assign responsibilities for monitoring within and between the Regional and Area Offices. The Regional and Area Offices are responsible for supporting the States in the development and implementation of effective occupational safety and health programs. The Region acts as liaison between the National Office and the States, providing a channel of ongoing communication and support in the areas of policy guidance, technical assistance, and training. Regional Administrators are responsible for the timely preparation and issuance of interim and comprehensive evaluation reports.
3. States. The States are responsible for maintaining occupational safety and health programs that are "at least as effective" as the Federal program, and for participating actively in assessing their performance as part of the monitoring process. State participation in the national occupational safety and health program requires continual and open communication with the Region and the sharing of all appropriate information on State performance, including information about specific State activities. States shall be provided an opportunity to participate in all facets of the monitoring process, and State involvement in the monitoring system requires participation in the review and assessment of all aspects of the State's program. The States are also responsible for administering their own internal evaluation programs as part of this self-assessment.
4. Resolution of Issues
a. In the event that a State and Region cannot agree on the resolution of any issue relating to the administration of the State program or the monitoring system, the State has the right to seek resolution by the Assistant Secretary.
b. Prior to the submission of any dispute, the Region and the State are expected to make every reasonable effort to resolve it. The Region shall notify the Office of State Programs of the nature of the dispute, and OSP shall be available to provide any clarification of policy, procedure, or precedent relating to the issue in dispute.
c. If the issue cannot be resolved, the State must inform the Region that it will seek resolution by the Assistant Secretary. Both the State and Region shall have the right to submit a written statement of position to the Assistant Secretary.
d. Both parties shall be notified of the Assistant Secretary's determination.
CHAPTER II -- THE MONITORING AND EVALUATION PROCESS
A. Ongoing Performance Tracking
1. Introduction. The monitoring process is an ongoing one. It entails the regular review of readily available data throughout the evaluation period to track State activities. The process identifies differences between, and trends in, State and Federal performance.
2. Readily Available Data - State Plan Activities Measures
a. State Plan Activities Measures (SPAM) reports are the primary form of readily available data. The Activities Measures are listed in Appendix A of this manual.
b. Activities measures provide the basic means of comparing State and Federal performance in essential program areas. Each activities measure provides a point of comparison between State and Federal activity, or between State performance and another established level of performance. The activities measures are generally expressed as percentages or averages.
c. Further review levels are numerical limits established by OSHA to facilitate the comparison of State performance to Federal performance. Where appropriate, further review levels are based on Federal performance goals. Further review levels define the acceptable range of State performance: if a State's performance on a particular measure falls within the bounds established by the associated further review level, no further review of State performance in that area is generally needed.
d. Outliers occur when State performance does not fall within the limits established by the further review levels. If an outlier appears in the data for the preceding 6 months, it is analyzed to determine whether it reflects a program deficiency or is merely an indication of a difference between the programs. (An illustrative sketch of this test appears at the end of this section.)
e. Quarterly Computerized SPAM (C-SPAM) reports are produced by OMDS for those measures for which data are available in the IMIS. Appendix B, "C-SPAM Data Sources," provides a detailed description of each C-SPAM measure in terms of the data sources used for calculation of the measure. The C-SPAM reports compare Federal and State performance for the most recent 3, 6, and 12 month periods, indicating outliers for each period. A sample C-SPAM format is included in Appendix C of this manual. A special 24-month C-SPAM will be run for a State's biennial evaluation period. Regions should request this special run through the Office of State Programs before the end of the evaluation period.
f. The activities measures include the following 11 major categories:
(1) Standards - measures a State's success in promulgating standards within six months of Federal promulgation. Relevant data is obtained from the State Standards Development and Promulgation Log (see Appendix D).
(2) Variances - measures a State's effectiveness in granting variances from existing standards. Relevant data is obtained from the Summary of Variances Granted Log (see Appendix E).
(3) Enforcement - Private Sector - measures State performance in comparison to Federal performance in 18 enforcement areas. With the exception of performance measure 11 relating to the timely abatement of cited hazards, relevant data is provided on the C-SPAM reports provided by OMDS. Data concerning performance measure 11 is collected in accordance with the Annual Monitoring Plan.
(4) Public Employee Program - measures State public sector enforcement performance against State private sector enforcement. With the exception of performance measure 8 relating to the timely abatement of cited hazards, relevant data is provided on the C-SPAM reports provided by OMDS. Data concerning performance measure 8 is collected in accordance with the Annual Monitoring Plan.
(5) Review Procedures - measures State performance in contested cases. Relevant data is provided on the C-SPAM reports provided by OMDS.
(6) Discrimination - measures State performance in responding to discrimination cases. Relevant data is provided on the C-SPAM reports provided by OMDS. In addition, periodic case file reviews are performed to assess the quality of discrimination investigations.
(7) Consultation - measures State Public Sector Consultation programs funded under Section 23 (g). Relevant data is provided on the C-SPAM reports provided by OMDS. In those States where the Private Sector Consultation program is also funded under Section 23(g), Consultation Activities Measures (CAM) reports are used for monitoring. Consultation programs funded under Section 7(c)(1) are not monitored as part of the State plan.
(8) Training and Education - measures State performance in providing training and education. Both performance measures require data provided by the States.
(9) Program Administration - measures the timeliness of State responses to Federal program changes and the submission of State initiated changes, and the adequacy of the State's Industrial Hygiene Laboratory's Performance Analytical Testing (PAT). Relevant data regarding the first two issues is obtained from the Federal Program Change Log (Appendix F) and the State Initiated Program Change Log (Appendix G), respectively. PAT results are provided by the Salt Lake City Laboratory.
(10) Program Resources - measures the number of authorized State positions against established benchmarks and the percentage of authorized positions filled. Relevant data is supplied by the States.
(11) Program Results - compares trends in State injury and illness data to Federal injury and illness data. Relevant data is provided annually by BLS. The Office of Statistics provides a report showing 5-year trends in injury and illness data. (See Appendix H for a sample report.)
3. Readily Available Data - IMIS Reports. The following reports are additional sources of readily available data.
a. Micro to Host reports provide access to data stored on the host computer. These data are not limited to the requesting office but may encompass the entire Federal/State program or any specified jurisdiction. The following are examples of micro to host reports which are useful in State plan monitoring. For a more complete list, see Appendix I of this manual and ADM 1-1.19A.
(1) Enforcement Statistics Reports (ENFST Reports) allow comparisons between the number of inspections planned by a State and the number of inspections actually conducted.
(2) Inspection Reports (INSP Reports) provide data on State inspection performance. The report does not compare this performance to Federal performance but rather provides data concerning a variety of issues such as industry mix, citation lapse time, and contest rates. The data may be supplied on a Statewide basis or may be supplied office by office within the State and may be run for a time period of more than one year, if long-range performance data are desired. These reports also allow the selection of national (Federal plus State) data, which may be considered in the analysis of State performance.
(3) No Inspection Reports (NINSP Reports) provide data on the number of inspections that could not be conducted and why on an office by office and Statewide basis.
(4) Program Activity Reports (PROG Reports) provide data on time utilization by compliance personnel in the following categories: inspections/discrimination, compliance assistance, program support, and unavailable hours (leave, and central office support). These categories are identical to the categories contained in the OSHA 31. The data is provided office by office and Statewide.
(5) Related Inspection Reports provide data on inspections on a case related basis, i.e., inspections selected for this report will contain all the related data for those inspections whether or not the related action occurred in the selected time period. The selection criteria for this report are the same as for the INSP reports. These reports may be especially useful since the C-SPAM report also uses related data.
b. Micro reports allow users to develop and run tailored reports which retrieve data stored on the micro. The data relate only to the office which stores data on the particular micro. A catalogue of micro reports and directions for running them are found in OSHA Instruction ADM 1-1.21A, "The IMIS Enforcement Micro Reports Documentation Manual."
(1) In accordance with ADM 1-1.21A, States are required to use a set of standard micro reports in managing their program.
(2) The reports cover such areas as complaint processing, citation issuance, and referral response.
(3) These reports are also useful as a source of information on the State's program, especially on the timeliness of State actions such as complaint response and citation issuance.
4. Readily Available Data - State Activity Logs
a. State Standards Development and Promulgation Log (see Appendix D). This log is maintained by Federal monitors, based on data supplied by the State, to track the State's timeliness in adopting Federal standards. Quarterly, the Office of State Programs will provide to the Region a listing of all Federal standards promulgated by OSHA during the quarter. The log is reviewed as part of SPAM measure A-1. See the discussion above in A.2.f(1).
b. Summary of Variances Granted (see Appendix E). This log is maintained by Federal monitors based on data supplied by the State to track variances granted by the State during the evaluation period. It is reviewed as part of SPAM measure B-1 and 2. See discussion above in A.2.f(2).
c. Federal Program Change Log (see Appendix F). This log is maintained by Federal monitors based on data supplied by the State to track the timeliness of State adoption and approval of Federally initiated plan changes. Quarterly, the Office of State Programs will provide to the Region a listing of all Federal Program Changes issued by OSHA during the quarter. It is reviewed as part of SPAM measure I-1. See discussion above in A.2.f(9).
d. State Initiated Change Log (see Appendix G). This log is maintained by Federal monitors based on data supplied by the State to track the timely submission and Federal approval of State initiated plan changes. It is reviewed as part of SPAM measure I-2. See discussion above in A.2.f(9).
e. Summary of Appellate Decisions (see Appendix J). This log is maintained by Federal monitors based upon their review and analysis of appellate decisions supplied by the State. It is not reviewed as part of any SPAM measure. It is used to gauge the program impact of new decisions rendered on appeal in State program cases.
f. Denial of Entry Information. This information is maintained by the States in logs or other documents and reflects denials of entry and their disposition. (Review of this information may entail on-site activity.)
g. CASPA Log (see Appendix L). This log is used to track and maintain information on Complaints About State Program Administration (CASPAs) received by OSHA. This information is not used for review of a SPAM measure, but information obtained through CASPA investigations is included, where appropriate, in the evaluation report.
5. Readily Available Data - The Grant
a. The Financial Status Report and the Federal Cash Transactions Report are submitted to the Office of Grants Management quarterly.
(1) The Financial Status Report (SF 269) is a report generated quarterly by the State which reflects both State and Federal outlays as well as unliquidated obligations.
(2) The Federal Cash Transactions Report -- Status of Federal Cash (PMS 272) is a report generated by the Department of Health and Human Services Payment Management System (HHS-PMS) and completed by the State which reflects cash drawdowns of Federal funds. The State completes the report by filling in the Federal funds expended and returns the signed document to HHS.
b. The Consolidated Audit Reports are usually prepared by State agencies and forwarded to and reviewed by the Department's Office of the Inspector General (OIG). These reports are issued on a schedule independent of the evaluation period and may cover more than one year. Consolidated audit reports require a response to the OIG and are used to review the control and expenditure of Federal funds.
c. Section 23 (g) Grants - Each State's grant application reflects both the financial data and program narrative. The grant narrative delineates goals for each program activity funded under the grant.
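The mechanics of A.2.b through A.2.d - an activities measure expressed as a percentage or average, a further review level defining the acceptable range, and an outlier flagged when the 6-month figure falls outside that range - can be illustrated with a short sketch. The following Python fragment is illustrative only: the measure names and numeric levels are invented examples, not actual SPAM measures or further review levels, which are established by OSHA as described above.

    # Sketch of the outlier test in A.2.b-d. All measure names and numeric
    # further review levels below are hypothetical examples.
    FURTHER_REVIEW_LEVELS = {
        # measure name: (lower bound, upper bound); None = unbounded on that side
        "serious_violations_pct": (80.0, None),
        "contest_rate_pct": (None, 25.0),
    }

    def is_outlier(measure, state_value):
        """True if State performance falls outside the acceptable range
        defined by the further review level (A.2.c-d)."""
        low, high = FURTHER_REVIEW_LEVELS[measure]
        if low is not None and state_value < low:
            return True
        if high is not None and state_value > high:
            return True
        return False

    # C-SPAM-style data for the most recent 3-, 6-, and 12-month periods;
    # per A.2.d, it is the 6-month outlier that triggers further analysis.
    state_data = {"serious_violations_pct": {"3mo": 84.2, "6mo": 76.5, "12mo": 79.1}}
    for measure, periods in state_data.items():
        if is_outlier(measure, periods["6mo"]):
            print(f"{measure}: 6-month outlier ({periods['6mo']}); "
                  f"add to annual monitoring plan for analysis")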
B. The Annual Monitoring Plan
1. Introduction. The Monitoring Plan contains the procedures and the framework for conducting the evaluation of State plan activities.
2. Purpose. The Annual Monitoring Plan is a planning document for the identification of issues to be evaluated, the scheduling of the evaluation, and the assignment of responsibilities for the evaluation. The monitoring plan may be adjusted at any time during the evaluation period to reflect newly identified issues for evaluation or the results of analysis already underway.
3. Coordination with the States. The development of the annual monitoring plan and any amendments thereto is the responsibility of the Regional Administrator in coordination with the State designee. This coordination must extend to all issues covered by the monitoring plan, regardless of funding source.
4. Timing. The plan shall be developed prior to or at the first quarterly meeting of each evaluation year between the monitors and the State. While some aspects of the plan will cover the biennial comprehensive report period, other aspects will address monitoring activities to be performed annually. Data obtained through sources other than the C-SPAM will need to be obtained and reviewed at least annually. In addition, new issues may arise during the course of the biennial period. Therefore, the plan should be prepared annually. The plan for the first year of a biennial evaluation period may include projections for some activities to be performed during the second year of the period.
5. Informational Copy to the National Office. The monitoring plan, along with any amendments, shall be shared with the Office of State Programs.
6. Identification of Issues to be Evaluated. Federal monitors may evaluate any performance issue within the scope of monitoring described in Chapter I of this manual. Monitors may prioritize issues for analysis based on the availability of resources.
a. Mandatory Issues for Analysis.
(1) Outliers. Performance issues which appear as 6-month outliers on C-SPAM reports, and other outliers noted through review of State logs or other sources, except where an exception has been granted.
(2) Required Corrective Action. Issues regarding which requirements for correction have been included in prior evaluation reports must be included as subjects for analysis.
(3) Adequacy and timeliness of abatement. The Region shall annually analyze the adequacy of abatement assurance. To do so, the Region shall utilize case file reviews of a sample of closed cases with serious, willful, and repeated violations, or it may rely on an adequate analysis by the State's internal evaluation program.
(4) Discrimination Program. Annually the Region shall review closed discrimination case files to determine whether a detailed analysis is necessary to determine if the State's discrimination program is operating effectively and in accord with approved procedures. The Region may rely on an adequate analysis by the State's internal evaluation program.
(5) Inspection Personnel. The Region shall compare the allocated number of inspection personnel included under the State's grant and the covered worker population to calculate the ratio of covered workers per State inspector. OSP will provide the same data for the Federal program for purposes of further comparison.
b. Exceptions to Activities Measures
(1) Reasons for Exception. Whenever approved policies and procedures or other unique factors cause recurring outliers which prior analyses have shown not to have negative program impact, a State may submit a request for an exception.
(2) Contents of Requests. For each exception requested the State shall explain why State performance differs significantly from Federal performance, the cause of such difference, and the reasons such recurring differences do not have negative program impact. The request may, if appropriate, propose an alternate further review level or performance objective against which State performance is measured.
(3) Submission of Requests for Exceptions. Requests for exceptions shall be submitted to the Regional Administrator for review and approval. The Region shall maintain an updated record of all exceptions requested and granted. The Region will notify the Office of State Programs of all exceptions granted. OSP will in turn maintain a log of exceptions to ensure consistency.
c. Discretionary Issues for Analysis
(1) Performance trends discerned in ongoing tracking. An example might be a downturn in any issue which would lead either to an outlier or to a conclusion that State performance is less than effective.
(2) Program changes.
(3) Program elements that present special vulnerabilities.
(4) Program elements which have not been analyzed in recent evaluation reports.
7. Development of the Annual Monitoring Plan. The monitoring plan provides an overview of the monitoring process and may include one or more analysis workplans. The analysis workplans provide details of how specific issues are evaluated.
a. Format. The monitoring plan shall include the following:
(1) The evaluation period
(2) Schedule for quarterly meetings
(3) Mandatory and Discretionary issues to be Analyzed and Analysis Workplans
(4) Assignment of Monitoring Responsibilities
(5) Incorporation of the State Internal Evaluation Program (SIEP) - The State should make available at the first quarterly meeting of each year a written plan that identifies the issues to be addressed by the internal performance evaluation program. The monitor and State will negotiate the evaluation responsibility for those issues where duplication is indicated. A copy of the State's internal evaluation plan for the year will be incorporated as part of the monitoring plan. Federal monitors and the State will also develop a schedule for submission of the State internal evaluation report(s) to the Region in sufficient time for inclusion of findings in the evaluation report.
(6) Schedule for Analysis Start and Completion
(7) Schedule for Preparation and Submission of Interim and Comprehensive Evaluation Reports
b. Analysis Workplans. A workplan should be developed for each issue to be analyzed. The workplans should contain, at a minimum, the following (an illustrative sketch follows this list):
(1) Issue to be Analyzed
(2) Identification of hypothetical causes for the issues being analyzed
(3) Formulation of questions that must be answered to determine the validity of each hypothetical cause
(4) Audit methodology - identifies the information which will address the causes of the issues being analyzed. Data sources shall be selected based on the evaluation hierarchy, and the workplan shall specify the method of data selection and the data criteria.
(5) Assignment of Responsibility for Analysis
(6) Timeframes for Analysis
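Taken together, the elements in 7.a and 7.b describe a simple structured record. As an illustration only (the field names and sample values below are this sketch's own, not prescribed by the manual), the monitoring plan and its workplans might be represented as follows:

    from dataclasses import dataclass

    # Illustrative representation of the monitoring plan (B.7.a) and the
    # per-issue analysis workplans (B.7.b). All field names are hypothetical.
    @dataclass
    class AnalysisWorkplan:
        issue: str                          # (1) issue to be analyzed
        hypothetical_causes: list[str]      # (2) candidate causes
        analytical_questions: list[str]     # (3) questions testing each cause
        audit_methodology: str              # (4) data sources and selection criteria
        responsible_monitor: str            # (5) assignment of responsibility
        timeframes: str                     # (6) start and completion dates

    @dataclass
    class AnnualMonitoringPlan:
        evaluation_period: str                   # (1)
        quarterly_meetings: list[str]            # (2) meeting schedule
        workplans: list[AnalysisWorkplan]        # (3) issues to be analyzed
        monitoring_assignments: dict[str, str]   # (4)
        state_internal_evaluation_plan: str      # (5) SIEP, per (5) above
        analysis_schedule: str                   # (6)
        report_schedule: str                     # (7)

    # A minimal hypothetical instance:
    plan = AnnualMonitoringPlan(
        evaluation_period="FY 1995-1996",
        quarterly_meetings=["Q1", "Q2", "Q3", "Q4"],
        workplans=[AnalysisWorkplan(
            issue="6-month outlier on a timeliness measure",
            hypothetical_causes=["data entry backlog", "staffing shortfall"],
            analytical_questions=["Are all inspections entered in the IMIS?"],
            audit_methodology="standard micro reports, then case file review",
            responsible_monitor="Regional monitor (hypothetical)",
            timeframes="start Q1, complete Q3",
        )],
        monitoring_assignments={"grants": "Area Office"},
        state_internal_evaluation_plan="incorporated per item (5) above",
        analysis_schedule="per plan",
        report_schedule="interim and comprehensive per Chapter V",
    )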
C. Quarterly Meetings
1. Scheduling. The Region shall meet with representatives of the State at least on a quarterly basis, in accordance with the schedule established in the monitoring plan. When feasible, monitoring staff should be given an opportunity to attend these meetings.
2. Purpose. Quarterly meetings provide a forum for ongoing communication between the monitors and the State. Any issue of concern to either party is appropriate for discussion at these meetings.
3. Topics. Examples of discussion topics are:
a. SPAM reports and outlier analyses
b. Interim monitoring conclusions
c. Coordination regarding issues to be covered by State internal evaluations and interim findings
d. State internal evaluation findings
e. Effect of State policies and procedures on State program administration
f. Status of State responses to requirements and recommendations in prior evaluation reports
g. State responses to proposed evaluation change supplements
h. State responses to Federal Program Changes
i. Standards developments, both State and Federal
j. Progress on monitoring activities set out in annual plan
k. State progress in meeting goals in grant
l. Ongoing grant monitoring conclusions
m. State consultation activities and Consultation Activities Measures
n. Legislative initiatives, both State and Federal
o. Jurisdictional issues
p. Instances where the State was unable to secure entry
q. Upcoming State training courses
r. Any other issues of concern to either party
4. Documentation. Records must be maintained of each quarterly meeting, indicating the date, location, persons in attendance, and a summary of the significant issues discussed and the conclusions reached. The Regional Administrator shall assign responsibility for creating the required record. A copy should be shared with the State.
D. Evaluation Methodologies
1. Introduction. All of the issues identified for evaluation in the monitoring plan, as updated during the course of routine monitoring, must be analyzed to determine whether they affect the effectiveness of a State's program. This analysis shall be conducted in accordance with the monitoring plan and in consultation with the State. During each evaluation the Region shall conduct on site monitoring in accordance with the hierarchy described below.
2. Evaluation Hierarchy. In analyzing any issue, monitors shall rely initially on readily available data. As indicated in A.3. above, "readily available data" includes C-SPAM reports, other SPAM information, and standardized IMIS micro and host reports. Should these data sources not allow for an adequate analysis, the monitor shall utilize other methods, including requested IMIS reports with limited selection criteria and on site monitoring, to provide a complete analysis of an issue. (An illustrative sketch of this sequence appears at the end of this section.)
3. Host Reports. Host reports provide data from the host computer. Reference should be made to ADM 1-1.19A, "The IMIS Host Reports Documentation," which details all of the reports available.
a. There are four categories of host reports, all of which may be used in monitoring:
(1) Micro-to-host Reports - reports available via the micro computer in the monitor's office.
(2) Scheduled Reports - reports produced and distributed on a regular basis.
(3) On Request Reports - reports produced and made available to the field on request.
(4) Data Audit Reports - reports produced and distributed periodically to validate the accuracy of data on the micro and host data bases, and to ensure that required operational actions are taken.
b. Micro-to-host reports are most useful in outlier analysis when used to compare Federal and State data on a specific aspect of program operation.
(1) For example, if it is suspected that the industry mix in a State is creating an outlier because it differs widely from the national average, monitors can run two INSP reports presenting Federal and State summary data on those SICs prevalent in the State. If differences in performance are no longer evident, the different industry mix may be considered a reason for the outlier. If differences are still apparent, another cause should be explored.
(2) The "Standards Cited" report is also useful in analyzing outliers related to violation classification. The report shows the number of times a standard was cited during the requested time period, the classification, and proposed penalties, and may be run for both Federal OSHA and the State for any SIC range.
c. Securing Host Reports. ADM 1-1.19A provides a menu of standardized reports which can be run on request. Additionally, the monitor may make special queries to secure data not contained in these standardized reports, using any appropriate selection criteria. Appendix I of this manual provides details on using host reports in the analysis of State data. Monitors are cautioned not to use limited computer resources unnecessarily.
4. On Site Monitoring. On site monitoring techniques are listed below. They are evaluative tools which rely on contact with State personnel to secure and analyze data. Such techniques will be coordinated in advance with the States, usually as part of the development of the monitoring plan. They will normally be used in the sequence described below, starting with the least intrusive technique listed and proceeding to each subsequently listed technique until a satisfactory analysis of an issue can be completed, all in coordination with the State designee. If it appears that the initial monitoring tools would not yield the required information or would not be the most efficient method, monitors may deviate from the given sequence with the concurrence of the State designee.
a. Discussions with State Personnel and Review of State Records. Federal monitors may discuss State performance with State personnel, review documents maintained by the State (including State internal evaluation reports), review equipment or laboratory facilities, and attend meetings relating to State plan activities, after coordination with the State.
b. Attendance at Meetings. Federal monitors may attend other program-related meetings, such as training sessions and court hearings in which the State participates, after consultation with the State. Attendance at training sessions will help to ensure that monitoring staff are familiar with State policies and procedures.
c. State Computer Systems. Federal monitors may access State micro data bases, after coordination with the State, to analyze any issue. The State may assign a User Identification number for the Federal monitor. Additionally, the monitor shall review the State's use of standard micro reports as required by ADM 1-1.21A, and review the quality control procedures, including input and correction of data, responsibilities of personnel, and requirements for production and use of micro reports.
(1) If there are questions about the accuracy of C-SPAM data, micro reports should be run to determine that all appropriate data have been entered into the IMIS. Monitors should be aware that, due to the continuous processing of data as well as the specific definitions for SPAM measures, the numbers may not be identical. However, if there is an apparent discrepancy, the Regions should discuss the data with OMDS to determine the cause.
(2) For those measures where outliers appear to be influenced by a small number of cases (e.g., timeliness of response to referrals, or denial of entry data), micro reports can be used to determine and track the specific cases in question. In such areas, micro reports can also be used to track improvements in performance over the course of the evaluation period.
d. Case File Reviews. Federal monitors may review State inspection or consultation case files. Such case file reviews shall be used to the extent that readily available data and the on site monitoring techniques listed in a., b., and c. immediately above cannot provide the basis for an adequate analysis of an issue.
(1) Scope of Review. The scope of review will be specified in the monitoring plan. It will depend upon the issue being analyzed and may encompass the entire case file or be limited to a specific subject.
(2) Selection of Case Files. The selection of case files will be specified in the monitoring plan and will depend upon the issue being analyzed. If the monitor needs to draw statistically valid conclusions from case file reviews, he/she must review a statistically valid sample of randomly selected case files, as outlined in Appendix K.
(3) Documentation of Findings. The monitor shall document findings regarding each case file reviewed. Additionally, the monitor shall document all conclusions regarding program impact based on the analysis of all cases.
e. Accompanied Visits. Federal monitors may join State compliance personnel during the course of State inspections. Such accompanied visits shall be used to the extent that readily available data and the on site monitoring techniques listed in a., b., c., and d. immediately above cannot provide the basis for an adequate analysis of an issue. The scope of the review will be specified in the monitoring plan and will determine the selection criteria used for accompanied visits. Instructions for the selection of cases where random sampling is appropriate are found in Appendix K. During such accompanied visits monitors are to observe the following procedures:
(1) Assessment of Individual Performance. Accompanied visits are not intended to evaluate the performance of individual compliance officers. Such evaluation is the responsibility of State program management. The purpose of accompanied visits is to assess program performance. If, however, an accompanied visit suggests that the practices of a particular compliance officer present a problem, the appropriate State manager shall be informed of such practices.
(2) Observed Violations
(a) If a monitor observes an apparent violation which the State compliance officer has failed to note, the monitor shall privately advise the compliance officer of the violation prior to the closing conference.
(b) If the State compliance officer agrees to document the apparent violation and identifies same to the employer, no further action is required at the worksite.
(c) If a State has an operational status agreement or if a determination pursuant to Section 18(e) has been made, the monitor will not issue citations for any violation observed during the AV. If the State compliance officer refuses to document the alleged hazards and identify them as such to the employer, the monitor shall document the alleged hazards. During the closing conference the monitor shall inform the employer of the alleged violations and that information will be forwarded to the State for further action. Immediately after the accompanied visit, the monitor will report the alleged violations to his/her supervisor who shall forward a memorandum containing this information to the State which shall decide if a citation is appropriate.
(d) If a State does not have an operational status agreement or a determination pursuant to Section 18(e) has not been made, the monitor may issue citations for any violation observed during the AV and not documented and identified as an apparent violation by the State.
(3) Imminent Dangers. If a State has an operational status agreement or if a determination pursuant to Section 18(e) has been made, the monitor will not issue citations for any imminent danger observed. If the monitor observes an apparent imminent danger and the compliance officer disagrees, the monitor shall immediately contact his/her supervisor who in turn shall contact the State to request further investigation. The State shall determine what further action is appropriate and notify the Region of its decision. If a State does not have an operational status agreement or a determination pursuant to Section 18(e) has not been made, the monitor may issue citations for any violation observed during the AV.
(4) Documentation of Findings. At the conclusion of each accompanied visit, the monitor shall document all results. This documentation shall include pertinent information regarding the inspection, such as the date, place, establishment inspected, and inspection ID number. The monitor shall also describe his/her findings regarding the issue or issues under analysis. The monitor shall also document any other issues which have program impact.
f. Spot Check Monitoring Visits. Federal monitors may visit establishments previously inspected by State compliance personnel. Before such visits are scheduled, the Region will contact the State to explain the reason that such visits are necessary. Spot check monitoring visits (SCMVs) shall be used as a last resort, to the extent that readily available data and the on site monitoring techniques listed in a., b., c., d., and e. immediately above cannot provide the basis for an adequate analysis of an issue. While the following procedures shall be observed on every SCMV, they may not be appropriate in the context of some CASPA investigations:
(1) Timing. Spot checks shall be conducted as close in time to the completion of the underlying State inspection as possible. Monitors should keep in mind that conditions at a workplace are not static and changes may have occurred since the State inspection.
(2) Selection of Establishment. The selection of the establishment(s) to be visited is within the discretion of the monitor, but must be relevant to the analysis required by the monitoring plan.
(3) Review of Case File. The monitor may review the State's inspection file if it is available before the spot check visit. If not reviewed before, the case file should be reviewed after the worksite visit to compare Federal and State findings.
(4) State Opportunity to Accompany. State personnel shall be offered an opportunity to accompany the monitor in every instance. The monitor shall afford a minimum of two days notice to the State before any SCMV is conducted.
(5) Denials of Entry
(a) Issue Analysis. If the purpose of the SCMV is to gather data pursuant to the monitoring plan, the monitor shall terminate a visit should an employer deny the monitor entry. Absent extraordinary circumstances another establishment shall be selected.
(b) CASPA Investigations. If an employer denies entry to a monitor conducting an SCMV as part of the investigation of a Complaint About State Program Administration (CASPA), the Region, in consultation with the State, shall decide whether to seek compulsory process to allow the visit to continue.
(6) Opening Conference. The monitor shall hold an opening conference with the employer and the employees' representative to discuss the purpose of the SCMV. The State's representative is encouraged to participate. A separate opening conference will be held with the employee representative if the employer objects to a joint meeting.
(7) Opportunity to Participate in Walkaround. The employer, employee representatives, and the State representative shall be encouraged to participate in the walkaround. The monitor may consult privately during the walkaround with any employee.
(8) Handling Apparent Violations. If a State has an operational status agreement or if a determination pursuant to Section 18(e) has been made, the monitor will not issue citations for any violation observed during the SCMV.
(a) Observed While Accompanied by State Representative. If the monitor observes an apparent violation while accompanied by a State representative, the apparent violations shall be handled in accordance with the directions regarding accompanied visits in e.(2) above.
(b) Observed While Not Accompanied by State Representative. If the monitor is not accompanied by a State representative, the monitor will inform the employer of any observed hazards at the closing conference and will encourage the employer to abate them. The monitor will also provide the employer with an advisory letter outlining all observed violations. The monitor will provide a copy of that advisory letter to the State along with documentation regarding any corrective actions taken by the employer and observed by the monitor during the course of the SCMV.
(9) Imminent Dangers
(a) Observed while Accompanied by State Representative. If the monitor observes an imminent danger during the course of the SCMV and he/she is accompanied by a State representative, the imminent danger shall be handled in accordance with the directions regarding accompanied visits contained in e.(3) above.
(b) Observed while Not Accompanied by State Representative. If the monitor observes an imminent danger during the course of an SCMV and he/she is not accompanied by a State representative, the monitor shall immediately notify the employer and request voluntary abatement.
i) Voluntary Abatement. If the employer voluntarily abates the imminent danger or removes affected employees from exposure, the monitor shall contact his/her supervisor who shall in turn advise the State of the imminent danger and the steps taken by the employer to abate it. The State shall determine appropriate follow up action and notify the Region of it.
ii) Refusal to Abate. If the employer refuses to abate the imminent danger, the monitor shall immediately contact his/her supervisor. The supervisor shall immediately contact the State to advise of the imminent danger and request an immediate inspection. The State shall determine the appropriate action and advise the Region of same.
(10) Closing Conference. The monitor shall hold a closing conference with the employer and representative of employees. The State representative is encouraged to participate. A separate closing conference will be held with the employee representative if the employer objects to a joint meeting. During the closing conference the monitor shall discuss all hazards observed during the SCMV.
(11) Follow up Action. If no representative of the State participated in the SCMV, then the monitor shall send the employer a letter listing all hazards identified during the SCMV as well as all corrective steps observed by the monitor during the visit. A copy of this letter shall be provided to the State and to the representative of employees.
(12) Documentation of Findings. At the conclusion of each spot check monitoring visit, the monitor shall document all results. This documentation shall include pertinent information regarding the inspection such as date, place, establishment inspected, and inspection ID number. The monitor shall also describe his/her findings regarding the issue or issues under analysis. The monitor shall also document any other issues which have program impact.
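The evaluation hierarchy in D.2 and D.4 amounts to an ordered fallback: monitors work from the least intrusive data source to progressively more intrusive techniques, stopping as soon as an adequate analysis can be completed. The following Python sketch is illustrative only; it is not part of any OSHA system, and the function names and the adequacy test are invented for this example.

    # Sketch of the evaluation hierarchy (D.2, D.4): techniques are applied
    # from least to most intrusive, stopping once the analysis is adequate.
    # The technique list paraphrases this manual; everything else is invented.
    TECHNIQUES = [
        "readily available data (C-SPAM, SPAM logs, standard IMIS reports)",
        "requested IMIS reports with limited selection criteria",
        "discussions with State personnel and review of State records",
        "attendance at meetings",
        "State computer systems (micro data bases)",
        "case file reviews (random sample per Appendix K if needed)",
        "accompanied visits",
        "spot check monitoring visits (last resort)",
    ]

    def analyze_issue(issue, analysis_is_adequate):
        """Apply each technique in order until the caller-supplied test
        reports that the analysis is adequate. Deviating from this order
        requires the concurrence of the State designee (D.4)."""
        for technique in TECHNIQUES:
            print(f"{issue}: applying {technique}")
            if analysis_is_adequate(technique):
                return technique
        return None  # escalated through every technique

    # Hypothetical example: the issue is resolved once case files are reviewed.
    analyze_issue("citation lapse time outlier",
                  lambda technique: technique.startswith("case file"))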
E. Evaluation and Analysis. Once issues requiring analysis have been identified during the course of ongoing tracking and included in the annual monitoring plan or its amendments, it becomes the monitor's responsibility to assess the program impact of each such issue. There is no universal formula to be followed for every analysis; rather, the course and nature of the analysis rely heavily on the monitor's understanding of both the Federal and State programs, the monitor's professional judgment, and the nature of the issue being evaluated. The following outlines the steps generally taken in an analysis (an illustrative sketch follows this list); much the same steps were taken earlier in creating the workplans.
1. Identification of Potential Causes. List each hypothetical reason that would explain the outlier or difference.
2. Formulation of Analytical Questions. For each hypothetical reason listed, formulate all the questions that would have to be answered to establish its validity.
3. Identification of Data Sources. For each question identify what data should be collected to confirm or refute the hypothetical reason, rank ordering them in accordance with the evaluation hierarchy described in D.2. above.
4. Review of Data. All data should be reviewed to determine whether it provides an answer to the analytical questions posed. The data collected should allow for conclusions as to the causes of outliers and differences which in turn must be assessed to determine program impact.
5. Conclusions and Recommendations
a. No Potential Program Impact. Should the monitor conclude from the analysis of the data that the differences between State and Federal performance have no potential negative impact on the State's effectiveness, the monitor shall document the reasons for such conclusion.
b. Potential Negative Program Impact. Should the monitor conclude that the differences between State and Federal performance have the potential to diminish State effectiveness, the monitor shall document the reasons for such conclusion. A remedial strategy and timetable for its implementation shall be discussed between the State and the monitors at the first opportunity.
c. Deficiencies in Policies or Procedures. Should the monitor conclude that the differences in State and Federal performance result from deficiencies in the State's procedures, the monitor shall document the reasons for such conclusion. Changes to the State plan in accordance with Part I of this manual shall be discussed between the State and the monitors.
6. Documentation shall be retained for every analysis performed. The documentation shall describe the methods used, findings, and recommendations made. Such documentation should be maintained for a period of three years with the documentation of the quarterly meetings at which the analyses were discussed. (A schematic sketch of this analysis sequence follows.)
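For illustration only, the six analysis steps above can be viewed as a single record that a monitor completes for each outlier or difference. The following Python sketch is not part of the SPM; every class, field, and function name is hypothetical and is shown only to make the sequence of steps concrete.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Hypothesis:
        reason: str                # Step 1: a potential cause of the outlier.
        questions: List[str]       # Step 2: questions that would test the reason.
        data_sources: List[str]    # Step 3: data, ranked per the evaluation hierarchy.
        supported: bool = False    # Step 4: set after the data have been reviewed.

    @dataclass
    class OutlierAnalysis:
        measure: str               # The activities measure under analysis.
        hypotheses: List[Hypothesis] = field(default_factory=list)
        conclusion: str = ""       # Step 5: e.g., "no program impact",
                                   # "potential negative impact", or
                                   # "deficiency in policies or procedures".
        recommendation: str = ""   # Remedial strategy or plan change, if any.

        def document(self) -> str:
            # Step 6: the written record retained for three years.
            lines = [f"Measure: {self.measure}", f"Conclusion: {self.conclusion}"]
            for h in self.hypotheses:
                status = "supported" if h.supported else "not supported"
                lines.append(f"- {h.reason} ({status}); data: {', '.join(h.data_sources)}")
            if self.recommendation:
                lines.append(f"Recommendation: {self.recommendation}")
            return "\n".join(lines)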
F. Review of State Internal Evaluations and Use of Results
1. The Region shall review the effectiveness of the State's internal evaluation program. Recognizing that internal evaluation programs may vary depending on a variety of factors, including but not limited to resources, Federal monitors shall review the following issues relevant to the effectiveness of the internal evaluation program:
a. The qualifications of the personnel assigned to the internal evaluation program.
b. Conformity with standard auditing procedures.
c. Conformity with the State's internal evaluation plan.
d. The adequacy of documentation regarding findings.
e. Whether the monitor may validate State findings short of duplicating the internal evaluation.
f. The adequacy of proposals for corrective action.
2. The State shall forward to the Region a summary of the results of its internal programmatic evaluation. The Region shall also have access to the detailed findings underlying any State conclusions, but will retain only the management summary in the evaluation file. Requests for background materials on a State's internal evaluation that are not in Federal OSHA files will be referred to the individual State for a decision on releasability based on its own laws.
3. After the monitors conclude that the State's internal evaluation program is an effective one, and that the analysis of an issue identified for evaluation is complete and appropriate, Federal monitors need not conduct duplicative monitoring of the subject of the State's analysis. The State's conclusions regarding that issue, including findings of effective performance, shall be considered as the basis for findings in the evaluation report. This will avoid a duplication of monitoring effort where the parties can agree on the adequacy of the evaluation conducted.
G. Coordination Between the Region and State on Findings
1. After analysis of every issue identified for evaluation, the Region and the State shall discuss all tentative findings. Such discussions may occur at any time but will be reviewed at a minimum during quarterly meetings.
2. The purpose of these discussions is to review the findings and determine whether additional analysis is needed by either party. Additionally, the parties will discuss the State's responses to proposed recommendations.
H. Monitoring Developmental State Plans. Developmental State plans shall be monitored in accordance with the procedures outlined in this Chapter with the modifications that follow:
1. Quarterly Meetings. In addition to the topics for discussion outlined in C.3. of this chapter, the parties shall review the State's progress toward meeting developmental steps at each quarterly meeting. If it appears that a State will have significant difficulties meeting developmental commitments, the Office of State Programs shall be notified.
2. Outlier Analysis.
a. C-SPAM reports will be generated quarterly as soon as the State participates in IMIS.
b. In analyzing outliers, monitors should consider whether they may be the result of factors inherent in the developmental nature of the State's program.
c. As developmental commitments are met, the Region shall evaluate their operational effectiveness.
3. On-Site Monitoring.
a. The Region shall conduct on-site monitoring in accordance with the procedures outlined in this Chapter.
b. Additional on-site monitoring may be required during the early developmental stages of a program, especially before comprehensive computerized data is available.
c. On-site monitoring shall be directed toward the evaluation of overall program effectiveness and not the performance of individual compliance officers.
I. Monitoring Public Employee Only State Plans. State programs that have assumed jurisdiction over public employees only shall be monitored in accordance with this Chapter. Activities measures regarding private sector enforcement (Section C) are inapplicable, and C-SPAM Reports are generated by comparing plan performance with public sector data from all State plans.
J. The Report. The ongoing monitoring process culminates in the evaluation report. That report combines the findings from all analyses in the format prescribed in Chapter V of this manual.
CHAPTER III -- MONITORING SECTION 23(G) GRANTS
A. Introduction. This chapter discusses the monitoring of the fiscal aspects of Section 23(g) grants and the program goals established therein.
B. Ongoing Monitoring.
1. The Financial Status Report (SF 269) and the Federal Cash Transactions Report (PMS 272) are readily available data described in Chapter II, A.5.a. They should be used to monitor the State's cash drawdowns and projected expenditures on a quarterly basis. These reports are especially valuable in tracking potential funding shortfalls or overages. Such matters should be discussed during the quarterly meetings and any trends brought to the attention of the National Office.
2. The other form of readily available data is the consolidated audit report, which is described in Chapter II, A.5.b. Timing of the distribution of this report can vary and likely will not coincide with the evaluation period. It should be discussed at the next scheduled quarterly meeting. A response to the findings outlined in the report must be made to the Office of the Inspector General and may be the basis for additional issues to be addressed in OSHA's annual financial review.
3. Discussions of Programmatic Goals at Quarterly Meetings. In addition to discussions of the Financial Status Report and any consolidated audit reports that may be available, the monitor should discuss the State's progress in meeting the goals outlined in the grant narrative and the OSHA Form 146. Progress in meeting grant goals should also be discussed in the evaluation report. Since goals are to be established to reflect optimal levels of resources and performance, the State's failure to meet such goals might not be a basis for a finding of negative program impact.
C. Annual Financial Review. A review of the State's financial program shall be conducted annually in accordance with OSHA Instruction FIN 3.2. The State's conformance with any exemptions and limitations imposed by the Federal OSHA appropriations bill will be examined as part of this review. Monitoring findings resulting from that review will be integrated into the comprehensive evaluation report.
CHAPTER IV -- COMPLAINTS ABOUT STATE PROGRAM ADMINISTRATION (CASPAs)
A. Definition of a CASPA. A Complaint About State Program Administration (CASPA) is a complaint, either oral or written, made to OSHA by any person or group of persons about some aspect of the operation or administration of a State plan. The complaint may relate to a specific State action, e.g., an inspection, or it may reflect a more generic criticism of State program administration.
B. Purposes of a CASPA. The CASPA mechanism:
1. Provides employers, employees, and the public an opportunity to address specific issues or concerns about the State program to OSHA;
2. Provides OSHA with an additional source of information on State performance; and
3. Provides the basis for State corrective action in those cases where the complaint is determined by investigation to be valid.
C. Determining If a CASPA Warrants Investigation. Within 5 calendar days of receiving a CASPA, the Region shall determine whether the CASPA warrants investigation.
1. All CASPAs shall be investigated unless:
a. The complainant has not exhausted the available administrative remedies specifically provided for by State procedures and regulations. For example, if a CASPA involves a State case under contest and the contest could provide the complainant with an administrative remedy, the Region shall not investigate the CASPA but shall instead notify the complainant that, if the results of the contest are unsatisfactory, he or she may file a CASPA at that time;
b. The complaint questions the authority of a State program to take a particular action that the State is clearly required or allowed to take under its approved plan;
c. The complaint pertains to a matter not within the jurisdiction of the State program;
d. The Region has already investigated enough complaints of the same nature to make an additional investigation unnecessary; or
e. The events involved in the complaint occurred so long ago that an investigation would not be meaningful in the context of current conditions.
2. If the Region lacks sufficient information to make such a determination, the Region shall in a timely manner solicit additional information from the complainant or from the State.
3. If a CASPA alleges that a situation of imminent danger exists, the Region shall make a determination immediately.
4. Anonymous CASPAs will be investigated if there is sufficient information to proceed with an investigation.
5. All CASPAs will be entered on the CASPA Log found in Appendix L.
D. Confidentiality. Federal regulations (29 CFR 1954.21) require that the identity of any CASPA complainant be kept confidential.
1. When contacting the State concerning a CASPA, the Region shall withhold the name of the complainant. The name of the complainant shall not appear in any record published, released, or made available.
2. Notwithstanding the above, the Region may attempt to obtain a written waiver of confidentiality from the complainant if, in the judgment of the Regional Administrator, the waiver would facilitate investigation of the CASPA.
E. Notification of Concerned Parties and Opportunity for State Response
1. If An Investigation Is Not Warranted. If the Region determines that a CASPA does not warrant investigation, the complainant shall be notified in writing of this determination and the reasons for it. The complainant shall also be informed that he/she may ask the Regional Administrator to reconsider this decision. The Region shall forward to the State a copy of the letter informing the complainant of the preliminary determination after the deletion of any information that would identify the complainant.
2. If An Investigation Is Warranted.
a. Routine CASPAs. The Region shall, as soon as it determines that a CASPA investigation is warranted, forward to the State a letter describing the nature of the complaint. The Region shall provide the State with the opportunity to respond to the CASPA within a reasonable time, generally not to exceed two weeks, and to have its response considered as part of the Region's investigation. The Region may, if appropriate, identify the issues on which a response from the State would be most useful. The Region shall also notify the complainant that:
(1) A decision has been made to investigate the CASPA;
(2) The Region may be contacting him/her to obtain additional information; and
(3) The Region will send him/her a written response detailing the results of the investigation.
(See Appendix M for an example of a letter to a CASPA complainant acknowledging receipt of the CASPA.)
b. CASPAs Alleging Situations of Imminent Danger. If a CASPA alleges that an imminent danger exists, the Regional Administrator shall immediately contact the State to ensure that appropriate State enforcement action will be taken to address the alleged imminent danger. The Region shall notify the complainant of its action in a timely manner and keep him/her informed of the CASPA investigation as appropriate.
c. CASPAs Where Notifying the State of the Nature of the Allegation Is Not Appropriate. The Region may determine that providing the State with information on the specific nature of the CASPA allegation and with an initial opportunity to respond is not appropriate, for example in cases where providing information on the substance of the complaint would disclose the identity of a complainant requesting anonymity. In such cases, the Region shall notify the State that it is investigating the CASPA and provide whatever general information it can release.
F. Methods for Investigation of a CASPA. The Region shall determine the methods to be used to investigate a CASPA.
1. If the State responds to the Region in a timely manner, the Region shall verify the information provided by the State and treat it as an information source in the Region's own investigation of the CASPA. The Region may, however, initiate its own investigation before receiving the State's response if that response is unreasonably delayed.
2. If the Region or State determines that it is necessary to visit a particular establishment or jobsite to prepare its response, the Region and State should, if at all possible, arrange a joint jobsite visit. This procedure is intended to avoid the need to make two on-site visits (one Federal and one State) and to minimize the burden of such visits on the employer and employees. The procedures established in Chapter II for accompanied visits or spot check monitoring visits shall be followed.
G. Review of Completed CASPA Investigations.
1. Communication Between Region and State. After a CASPA investigation is completed, the Region and State shall discuss the Region's findings, the recommendations made (if any), and any action the State is planning to take in response to the CASPA.
2. Response to Complainant. As soon as the Region and the State have discussed the findings and recommendations resulting from the CASPA investigation, the Region shall draft a letter of response to the complainant summarizing the investigative steps taken, the analysis conducted, the conclusions reached, and any corrective action taken or planned by the State. This letter shall also advise the complainant of his/her right to request reconsideration by the Regional Administrator.
a. Review of Draft Response to Complainant. As soon as a draft response to the complainant has been prepared, the Region shall provide a copy of the draft response to the State and shall discuss it with the State. Any copy of the draft response provided to the State shall maintain the confidentiality of the complainant.
b. Notification of Complainant. Based on the results of the discussion with the State, the Region shall make any changes to the draft response that are deemed appropriate and shall send a final response to the complainant in a timely manner.
3. Letter to the State. The Region shall send a letter to the State setting out the conclusions of its investigation and its recommendations for corrective action, if any. The Region shall also send the State a copy of its final response to the complainant, and this response shall maintain the confidentiality of the complainant. The Region shall also propose a timetable for any corrective action recommended.
4. State Response. If the State disagrees with any aspect of the investigation, it may file a written response. The Region shall include a copy of the State's written response in the materials (CASPA, response to complainant, and State's written response) the Region sends to the Office of State Programs. The State shall have a reasonable amount of time to prepare this response.
5. Forwarding of the Response to Complainant to the Office of State Programs. For CASPAs having national or significant program impact, the Region shall forward to the Office of State Programs, upon completion of the investigation, one copy of the response sent to the complainant and one copy of the letter sent to the State.
6. Corrective Action. When appropriate, the State shall take corrective action in individual cases where CASPAs are determined by investigation to be valid and shall put in place a procedure to avoid a recurrence of the problem identified by such investigations. The Region shall monitor the State's corrective action.
7. Resolution. If the State fails or refuses to take corrective action and the Region is unable to negotiate a solution, the issue shall be handled in accordance with the resolution process outlined in Chapter I of this Part.
H. Documentation of CASPA Investigations. The amount and kind of information that is collected and analyzed will vary for each CASPA. However, each case file should contain written documentation of the following:
1. Identification of Allegations To Be Investigated. A statement of allegations to be investigated.
2. Information Reviewed. A listing of all information reviewed that is relevant to investigating the CASPA.
3. Analysis and Conclusions. An analysis of the allegations in the complaint and conclusions with respect to their validity.
4. Recommendations. Where deemed appropriate, reasonable corrective action to be taken by the State and a timetable for such action.
5. Response of State. The actions taken by the State in response to conclusions reached and recommendations made.
6. Followup by Region. The actions of the Region as a followup to the conclusions reached and recommendations made.
NOTE: The Regional Administrator's written response to the complainant may satisfy most of these documentation requirements.
CHAPTER V -- THE EVALUATION REPORTS
A. Introduction. Two types of evaluation reports are prepared based on the findings of the monitoring process.
1. A Comprehensive Evaluation Report is prepared every two years.
2. An Interim Evaluation Report is prepared in the alternate years and focuses on major State achievements, unresolved performance issues, and major new issues, including State-initiated changes and results of CASPA investigations.
3. Monitoring activities are conducted throughout the reporting periods.
B. Comprehensive Evaluation Report
1. Purpose and Scope. The evaluation report outlines the Agency's assessment of State program effectiveness at the conclusion of the evaluation period. The report is provided to the State and is made available to the public.
a. The report shall address State performance in all program categories. The report shall emphasize performance that meets or exceeds Federal performance as well as State performance considered to be less effective.
b. The report shall address the analyses of all outliers or other issues identified for evaluation during the evaluation period and provide an overview of monitoring activities, State performance, and State-initiated actions throughout the evaluation period. The report shall also address the impact of each such issue on overall State program performance. In this regard the report shall address the following:
(1) State corrective actions underway;
(2) Requirements for corrective actions and recommendations for improvement;
(3) Proposed evaluation change supplements to remedy problems identified in policies and procedures in the State plan documents.
2. Format.
a. Title Page. The title page shall include:
(1) The name of the State;
(2) The designated State agency;
(3) The dates of plan approval, certification, and final approval, as applicable;
(4) The period covered by the evaluation;
(5) The Regional Office responsible for the report's preparation, including full agency identification.
b. Table of Contents. There shall be a table of contents that lists each section of the report with the page on which it begins.
c. Executive Summary. There shall be an executive summary which shall include the following:
(1) A brief description and assessment of State performance in each major program category, including any statistical outliers;
(2) A brief description of any requirement for corrective action plans to address deficiencies;
(3) A brief description of any recommendation to address differences uncovered during the evaluation;
(4) A brief description of any evaluation change supplements necessary to address deficiencies in State plan documents;
(5) A brief description and assessment of any unique State program activities not otherwise discussed.
e. Introduction. The introduction shall include:
(1) A brief historical profile of the State program, including its activities and structure, and describing any significant differences from the Federal program; and
(2) A brief description of the system used to monitor State performance, noting all data sources including the State Internal Evaluation Program.
f. Major New Issues. Any new issues arising during the period which have a significant impact on program performance should be discussed briefly in this section, and in more detail under the appropriate program category.
g. Summary of Recommendations and Evaluation Change Supplements. There shall be a listing of required corrective action plans, recommendations and evaluation change supplements discussed in the report and the pages on which they appear for ease of reference.
h. Discussion of Program Categories
(1) The comprehensive evaluation report shall address each of the 11 major program categories. The discussion of each of the program categories shall be clearly identified with headings as follows:
(a) Standards
(b) Variances
(c) Private Sector Enforcement
(d) Public Sector Enforcement
(e) Review Procedures
(f) Discrimination
(g) Consultation
(h) Training and Education
(i) Program Administration
(j) Program Resources
(k) Program Results (Private Sector)
(2) The following information is to be included under each category listed above or alternatively in a separate appendix:
(a) State Policies and Procedures. There shall be an introductory section for each category describing the State's policies and procedures and their effect on State performance. Any differences should be discussed in detail.
(b) Activities Measures. Activities measures may either be included in the body of the evaluation report or appended with the C-SPAM report. A special 24-month C-SPAM report will be run for a State's biennial report period. Data on each activities measure regarding a program category shall be presented separately along with a description and analysis of relevant State performance. Data for all activities measures shall be presented in a consistent format that includes, for each subcategory, the following Federal and State data: activities measure performance, e.g., percentages and numbers; further review levels; and a statement of criteria (e.g., 20% deviation). (The deviation criterion is illustrated in the sketch following vi) below.)
i) SPAM Data. All 6-month outliers shall be analyzed to discover their causes and assess their program impact, unless an exemption has been granted.
a) It should not be necessary to perform additional analysis for outliers that were analyzed earlier in the evaluation period. The results of prior analyses may be incorporated into the report after their current validity is confirmed.
b) The discussion of each activities measure should include a summary of analyses conducted for any outlier even if the data no longer constitutes an outlier at the time the report is written.
c) Outliers appearing for the first time on the last C-SPAM of the evaluation period need not be analyzed in the report. However, the report should note any new outliers and state that they will be analyzed during the next report period.
ii) Other Quantitative Information. The discussion of each activities measure should include all other relevant data, including readily available data such as planned vs. actual performance information and data secured through on site monitoring. The State's progress in meeting its goals as set out in the grant and related documents such as the OSHA Form 146 should be discussed under the relevant program categories.
iii) State Internal Evaluation. Findings shall be included under the relevant program categories.
iv) CASPAs. Each CASPA having a significant impact on the State program shall be discussed. The discussion shall include a brief description of the CASPA allegations, the conclusions reached and recommendations made, and the status of the State's response to such recommendations.
v) Unique State Program Activities. All those State program activities that have no Federal parallel should be described and their impact on performance analyzed. Unique activities that enhance performance should be emphasized. Analyses of unique programs need not be reiterated in successive reports assuming no significant changes have occurred in the interim.
vi) Special Information for Selected Categories. The following information regarding State performance should be included either in the text or an appendix:
a) Standards. This section should list chronologically all newly adopted Federal standards transmitted to the State by memorandum during the evaluation period. It should also list chronologically earlier adopted Federal standards to which the State has not yet responded. The list should identify the Federal standard, the date of State adoption, whether the adoption was timely, and whether the standard was identical or at least as effective as the Federal standard. Untimely standards and less than effective standards must be addressed in the text of the report.
b) Training and Education. The subject of all State training sessions should be noted along with findings from on site monitoring.
c) Program Administration - Federal Program Changes. Federal program changes adopted during the evaluation period, along with the State's responses thereto, should be discussed. All Federal program changes adopted in earlier periods to which the State has not yet responded shall also be reviewed. An appendix shall include a list of all such program changes which identifies the change, indicates if a plan change is required, the date and timeliness of the initial State response, and the date, timeliness, and adequacy of State adoption of the program change.
d) Program Administration - State Initiated Plan Changes. Significant State initiated plan changes submitted during the evaluation period should be highlighted. An appendix shall include a chronological list of same identifying the subject, a brief explanation, its timeliness, and whether it was approved. The list should also include known State initiated plan changes not yet formally submitted.
e) Program Resources. Monitoring findings as a result of the annual reviews of the State's financial program in accordance with OSHA Instruction FIN 3.2 should be included in this section.
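The further-review criterion described under (b) above can be reduced to a simple comparison. The Python sketch below assumes, for illustration, that a measure is flagged when the State figure deviates from the Federal figure by more than the stated criterion (the 20% deviation given as an example in the text); the function and variable names are hypothetical and do not correspond to actual C-SPAM fields.

    def needs_further_review(state_value: float, federal_value: float,
                             criterion: float = 0.20) -> bool:
        """Flag a State activities measure whose relative deviation from the
        Federal figure exceeds the stated criterion (e.g., 20%)."""
        if federal_value == 0:
            return state_value != 0    # no Federal baseline to compare against
        deviation = abs(state_value - federal_value) / federal_value
        return deviation > criterion

    # Example: a State figure of 55% against a Federal figure of 75% deviates
    # by roughly 27% and would be flagged for outlier analysis.
    print(needs_further_review(55.0, 75.0))    # True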
(c) Status of Responses to Corrective Action Plans and Recommended Corrective Action from Previous Evaluation Reports.
i) Corrective action plans required in prior evaluation reports and recommendations for correction shall be discussed along with the State's response.
ii) Corrective action recommended during the current evaluation period should also be discussed along with the State's response to date.
(d) Evaluation Findings. The results of all analyses conducted during the evaluation period shall be presented. The evaluation reports shall make reference to all findings, both those confirming effectiveness and those questioning it.
i) The evaluation report shall note those aspects of State performance or administration which confirm the effectiveness of the State program.
ii) The evaluation report shall also note any aspect of State performance or administration which detracts from the State's effectiveness.
a) When such deficiencies relate to any of the basic prerequisites enumerated in Section 18 and its implementing regulations for continued plan approval, the State is required to submit a corrective action plan. Corrective action plans are due within 30 working days of the State's receipt of the final evaluation report. While it is the State's obligation to develop a corrective action plan to address the deficiency, it is within the State's discretion to initially identify the means deemed most effective to correct the deficiency, subject to the approval of the Assistant Secretary.
b) Differences which do not render the State program less effective, but correction of which, in the judgment of the Region, would improve the effectiveness or efficiency of the State program, will be noted, and a recommendation will be made that the State address the differences at issue. The State's response to such recommendations need not have the detail included in a formal corrective action plan.
c) All deficiencies requiring the submission of a corrective action plan shall be set off from the text and labelled as well as included in the Summary of Recommendations.
d) When State policies or procedures contained in the State plan document must be changed, the report will indicate that an evaluation plan change supplement is required.
C. Interim Evaluation Report.
1. Purpose and Scope. The interim evaluation report discusses major occurrences during the interval between comprehensive reports. These include State achievements, major performance issues, major program changes, and CASPA investigations. The report is provided to the State and is made available to the public.
2. Format.
a. Title Page. The title page shall include:
(1) The name of the State;
(2) The designated State agency;
(3) The dates of plan approval, certification, and final approval, as applicable;
(4) The period covered by the evaluation;
(5) The Regional Office responsible for the report's preparation, including full agency identification.
b. Table of Contents. There shall be a table of contents that lists each section of the report with the page on which it begins.
c. Introduction. The report should briefly describe the structure and history of the State plan. The system used to monitor State performance should be discussed briefly, noting all data sources including the State Internal Evaluation Program if relevant.
d. State Achievements. Any major accomplishments by the State during the period, such as special initiatives aimed at reducing injuries and illnesses, shall be discussed briefly.
e. Unresolved Performance Issues. Any issues which were unresolved at the end of the prior comprehensive reporting period shall be discussed.
(1) Corrective Action Plans. If a corrective action plan was required for an issue, the State's commitment to take corrective action and any specific actions taken should be discussed.
(2) Evaluation Change Supplements. If an evaluation change supplement was required for an issue, the content and status of the supplement should be discussed.
(3) Current data from the C-SPAM or other sources should be cited, as appropriate. Any additional data regarding the issue, such as the results of on-site monitoring or the State internal evaluation program, should also be included. The source of the data should be noted.
(4) Any additional action to be taken on the issue should be discussed.
f. Major New Issues. Any new issues arising during the period which have a significant impact on program performance should be discussed briefly, with an indication that they will be more fully addressed in the comprehensive report.
g. State-Initiated Changes. Any significant State-initiated changes not discussed above, e.g. legislation, funding, new programs or procedures, should be discussed briefly.
h. CASPAs. Any CASPA having a significant impact on the State program shall be discussed. The discussion shall include a brief description of the CASPA allegations, the conclusions reached and recommendations made, and the status of the State's response to such recommendations.
i. C-SPAM. The C-SPAM for the report period should be attached for reference.
D. Procedures for Developing the Interim and Comprehensive Evaluation Reports.
1. Timing.
a. The Regional Administrator shall transmit the Comprehensive Evaluation Report to the State within 90 calendar days of receipt of the final C-SPAM Report for the evaluation period. For biennial reports, the Region should request a special 24-month C-SPAM through the Office of State Programs before the end of the evaluation period.
b. The Regional Administrator shall transmit the Interim Evaluation Report to the State within 45 calendar days of receipt of the final C-SPAM Report for the evaluation period. If there is a delay in meeting this timeframe, the Regional Administrator shall notify the Office of State Programs.
c. A draft copy of each report shall be provided to the State at least 21 calendar days before its intended issuance date.
2. State Response to the Report.
a. The State shall have an opportunity to submit a written response to the draft interim or comprehensive report.
b. The Region shall attempt to resolve any differences noted in the State's response and, if appropriate, modify the report to incorporate the State's response. The State's position regarding any unresolved issues shall be noted in the report. Discussions of monitoring results throughout the evaluation period should reduce instances of disagreement concerning the report.
3. Role of the Office of State Programs.
a. The Region shall provide a copy of the draft interim or comprehensive report to the Office of State Programs immediately upon completing any modifications in response to the State's comments. Sharing of earlier drafts is encouraged to facilitate the review.
b. The Office of State Programs shall have 14 calendar days to review the draft interim or comprehensive report to ensure consistency and conformity with national policy and precedent. Any changes recommended by the Office of State Programs shall be discussed with the Region and forwarded to the Regional Administrator for consideration. The Region shall promptly notify the State of any substantive changes to the report resulting from such discussions.
c. Where major issues requiring the attention of the Assistant Secretary are identified by the Regional Administrator and/or the Office of State Programs, the Office of State Programs will coordinate the necessary review and consultation. The report shall not be released by the Regional Administrator until such issues are resolved.
d. The Region shall send a copy of each final interim or comprehensive report and transmittal to the Office of State Programs.
e. The Office of State Programs shall provide a copy of the Executive Summary and Summary of Recommendations for each comprehensive evaluation report, and a copy of each full interim evaluation report, to the Assistant Secretary after issuance of the report by the Regional Administrator.
4. State Response to the Final Report - Action Plans.
a. Within 30 calendar days of receipt of the final interim or comprehensive report, the State shall have an opportunity to submit a formal response to the Regional Administrator. A formal response is required if the report outlines any required corrective action.
b. The State's response shall include an action plan which details corrective actions proposed to address each requirement and recommendation contained in the report and the timeframes therefor.
5. Regional Action on State Responses. The Region shall evaluate the State's response to ensure that all requirements for corrective action have been addressed. The Region may request additional information from the State. A copy of the State response shall be forwarded by the Region to the Office of State Programs. The Region shall also ensure that the annual monitoring plan is updated to reflect the report's final requirements for corrective actions and the State's response.
E. Availability of the Final Report and Any State Response. The final interim or comprehensive report transmitted to the State by the Regional Administrator, as well as any subsequent State response, is available to the public on request pursuant to the Freedom of Information Act (FOIA). If FOIA requests for the report are received prior to the receipt of State comments, the report will be released within the required timeframes with a note that a response from the State may be forthcoming. Each requestor shall be provided a copy of the State's response even if it is not available at the time of the original release.
F. Evaluation Report for Developmental Plans. In addition to the requirements of this Chapter, evaluation reports for developmental plans shall include a discussion of the State's progress in meeting developmental commitments.
APPENDIX D
STATE STANDARDS DEVELOPMENT AND PROMULGATION LOG
(For Appendix D, see printed copy)
APPENDIX F
FEDERAL PROGRAM CHANGE LOG, EXCLUDING STANDARDS
(For Appendix F, see printed copy)
APPENDIX I (continued)

D. Technical Notes for SPAM Analysis.
1. "No Inspections" (OSHA-1, item 35, scope = D) and Corporate Settlement Agreements (OSHA-1, item 35, scope = C) are NOT included in the Inspection (INSP) report.E. Available IMIS Reports To Be Used in Routine Monitoring and SPAM Analysis. Following is a list of routine readily available reports to supplement SPAM analysis. These IMIS reports SHOULD NOT be run routinely to verify the SPAM data. Suspected system or program factors should first be checked on the micro database. The running of the report(s) should occur ONLY when the report(s) is appropriate to assist in the analysis of outliers.
2. "No inspections" and Corporate Settlement Agreements ARE included in the Summary SCAN and Detailed SCAN unless you specifically set the parameters to exclude them. To exclude them on micro-to-host, use prompt 30 and enter A and B.
3. Maritime inspections are included in the total counts on the INSP and SCAN reports. The way they are categorized as maritime industry on the INSP report is when item 25a and 25b on the OSHA-1 form is checked MARITIME. For example, if a safety fatality inspection is conducted at Bob's Stevedoring (SIC 4463) and the SIC is on the maritime high hazard list, the Compliance Officer should check box 25a as maritime. If he/she does not, the inspection will show up as "Other" under industry type on the INSP report, no matter what the SIC is. There is a special prompt on micro-to-host for selecting maritime inspections only.
4. On some of the micro-to-host reports, the default value for prompt #28 is F - Federal data only. If you want only State data, you MUST enter S - State data only, even though you used the State's RID in the earlier prompt. If you use a State RID and leave prompt #28 blank, you will get a blank report.
1. Standard Micro Reports must be used to enable States to manage their program as outlined in ADM 1-1.21A.
2. IMIS HOST Reports - as outlined in ADM 1-1.19A.
3. Micro-to-host Detailed SCAN of inspections conducted during the period (opening conference date during the period), sorted first by ownership, second by safety/health, and third by opening conference date. NOTE: Be aware that the public sector sorts State Government facilities first and Local Government facilities second. Inspections will appear on the SCAN in the following order: private sector health inspections, private sector safety inspections, State Government health inspections, State Government safety inspections, Local Government health inspections, and Local Government safety inspections. If any inspections are incorrectly marked Federal Agency, they will appear after the private sector safety inspections and before the State Government health inspections. (This sort order is illustrated in the sketch following this list.)
4. Micro-to-host Detailed Scan of inspections with CITATION ISSUANCE DATE DURING THE PERIOD, sorted first by ownership, secondly by safety/health, and third by opening conference date. See NOTE in #3 regarding ownership sort.
5. Micro-to-host Inspection Report (INSP) for private sector. This can be run for just safety, just health, or both, as needed.
6. Micro-to-host Inspection Report (INSP) for public sector. This can be run for just safety, just health, or both, as needed.
7. Micro-to-host Frequently Violated Standards Report. Limit this to the top 50 or top 100 as needed.
8. Micro-to-host Complaint Query for the private sector. NOTE: You must run separate reports for the private and public sectors, because ownership is not noted on the Complaint Query.
9. Micro-to-host Complaint Query for the public sector.
10. Micro-to-host Referral Query for the private sector. NOTE: You must run separate reports for the private and public sectors because ownership is not noted on the Referral Query.
11. Micro-to-host Referral Query for the public sector.
12. Micro-to-host Program Activity Report (PROG). This can be run for just safety, just health, or both, as needed.
13. HOST Health Sampling Activity Report.
14. HOST report on OSHA-7's Sent for Signature.
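The ownership, safety/health, and date sort order described in the NOTE to item 3 can be expressed as an ordinary multi-key sort. The Python sketch below is illustrative only; the record layout is invented, and the ownership ranking simply mirrors the order in which inspections are said to appear on the Detailed SCAN (with mis-marked Federal Agency inspections falling between the private and State Government groups).

    from datetime import date

    OWNERSHIP_ORDER = {"private": 0, "federal": 1, "state": 2, "local": 3}
    TYPE_ORDER = {"health": 0, "safety": 1}

    inspections = [
        {"ownership": "local", "type": "safety", "opened": date(1994, 8, 1)},
        {"ownership": "private", "type": "safety", "opened": date(1994, 7, 5)},
        {"ownership": "private", "type": "health", "opened": date(1994, 7, 9)},
        {"ownership": "state", "type": "health", "opened": date(1994, 6, 20)},
    ]

    # Sort first by ownership, second by health/safety, third by opening
    # conference date, as described in the NOTE to item 3.
    inspections.sort(key=lambda i: (OWNERSHIP_ORDER[i["ownership"]],
                                    TYPE_ORDER[i["type"]],
                                    i["opened"]))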
APPENDIX K (continued)

...explained by the absence or infrequent occurrence of a certain characteristic in the case files that comprise the sample.]

3. Developing a Sampling Frame. The sampling frame can be obtained from the IMIS. This frame consists of a listing of the case file numbers for the population of interest.
4. Ascertaining Sample Size. In all cases, the size of the sample will depend on the size of the population of interest. Population size can be determined prior to sampling because it represents the number of case files that comprise the population of interest, e.g., all case files having a certain characteristic. Using the size of the population of interest, refer to the Sampling Table (page K-6); this table shows the sample size that must be used with a population of this size if the resulting estimate is to fall within +/-10 percent of the true value 95 percent of the time. (A formula-based approximation is sketched following item 7 below.)
5. Random Inspection List. The Office of Management Data Systems (OMDS) can provide upon request a randomly selected list of inspections that meet the desired selection criteria. Federal monitors must request such a list directly from OMDS on an as needed basis.
6. Calculating Estimates. For each case file in the sample, determine whether the characteristic(s) of interest is present or absent. (If a case file in the sample does not belong to the population of interest or the file is missing, go back to the Random Number Table and select a replacement.) To estimate the proportion (P) of the sample that has the characteristic, determine the fraction of all case files in the sample that possess the characteristic:
EXAMPLE:

P = (number of case files reviewed that have the characteristic of interest) / (sample size)

Multiply the result by 100 to determine the percentage.
7. Interpreting Results. The percentage of the sample of case files that has the characteristic of interest represents an estimate of the percentage of the entire population of case files that has this characteristic; the way in which the sample was drawn allows the analyst to have confidence that, 95 percent of the time, the true measure of the characteristic of interest lies within 10 percent of this estimate. In other words, if the analysis reveals that an estimated 30 percent of the case files in the sample have a certain characteristic, the analyst can be 95 percent confident that between 27 and 33 percent of all case files in the population actually have the characteristic of interest [30 x 0.10 = 3; 30 +/- 3 = 27 to 33].
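The sampling arithmetic in items 4, 6, and 7 can be reproduced directly. The Python sketch below is illustrative only: the Sampling Table on page K-6 governs sample size, and the sample_size() function merely shows one common normal-approximation formula (worst-case proportion of 0.5 with a finite population correction) of the kind typically used to build such tables; treating the +/-10 percent precision as relative to the estimate follows the worked example in item 7.

    import math

    def sample_size(population: int, z: float = 1.96, margin: float = 0.05) -> int:
        """One common approximation for estimating a proportion: worst-case
        p = 0.5, normal approximation, finite population correction."""
        p = 0.5
        n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population size
        return math.ceil(n0 / (1 + (n0 - 1) / population))

    def estimate_percentage(with_characteristic: int, n: int) -> float:
        """Item 6: P expressed as a percentage of the sample."""
        return 100.0 * with_characteristic / n

    def confidence_interval(estimate_pct: float, precision: float = 0.10):
        """Item 7: the +/-10 percent (relative) interval around the estimate."""
        margin = estimate_pct * precision              # e.g., 30 x 0.10 = 3
        return estimate_pct - margin, estimate_pct + margin

    p = estimate_percentage(30, 100)                   # 30.0 percent
    print(confidence_interval(p))                      # (27.0, 33.0), as in item 7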
GLOSSARY

Statistical Terms
Characteristic - A distinguishing trait, quality, or property, e.g., age, size, type of inspection.
Confidence Interval - An estimate of the range within which a sample value of a characteristic will vary from the true population value.
Confidence Level - The probability that the population value lies within the margin of error established for a sample value (e.g., a confidence level of 95%).
Population - The aggregate of all the people, objects, or events that share a common characteristic, e.g., all the case files for a State.
Population Parameter - A numerical characteristic of a population, such as the mean, total or variance. A proportion is the mean of a binary variable.
Probability Sample - A sample in which each member of the population has a measurable and known chance of being included.
Proportion - A fraction of a sample that shares some attribute (usually, the number of elements of a sample that have one of the two possible values of a binary characteristic) divided by the sample size.
Sample - A group drawn from the population of interest with the intention of finding out something about the larger population from which the sample is drawn.
Sampling Frame - All the elements of a population numbered consecutively and listed so that a sample of them may be chosen.
Sample Size - The number of population elements included in the sample.