Appendix
Data Validation and Verification Overview
OPM uses its performance data to promote improved outcomes, and senior leaders regularly review performance information to identify successful or promising practices, areas where the agency is not making sufficient progress, and plans for future improvement. The performance information in this report is reasonably complete and reliable, as defined by the Government Performance and Results Modernization Act of 2010.
The following section describes the steps that OPM has taken to promote the accuracy, completeness, and reliability of the performance information it reports for each measure. Additionally, the following agency-wide efforts promote data quality:
- OPM developed and regularly updates dashboards with the agency’s performance results, facilitating senior management review. Senior agency leaders participate in Results OPM performance review meetings at least quarterly. This process includes substantiating that reported results are correct whenever those results reveal substantial changes in trends or variances from targets.
- The Office of the Chief Financial Officer (OCFO) provides guidance to Objective Teams on data quality and developed a standard form for Objective Teams to document data collection and reporting procedures, definitions, source data, validation and verification, and limitations. The OCFO reviews such documentation for adequacy, providing feedback and recommendations for improvement to Objective Teams. This documentation serves as a job aid to performance measurement and reporting staff, helping to promote the use of consistent definitions and methods.
- To reduce manual processes and the risk of human error, OCFO developed a new application for performance data collection and reporting and began using it in FY 2022.
- OCFO, Goal Owners, and Objective Owners assess the use and effectiveness of the agency’s performance measures and consider alternative measures during the agency’s annual performance budgeting process. Cross-organizational teams of Objective Owners establish consensus on the validity of the measures.
These agency-wide efforts, in addition to the specific actions that Goal and Objective Owners have taken for each measure, as described in the following section, support the completeness, reliability, and quality of OPM’s performance information.
Measure Definitions, Data Sources, Verification, and Validation
Strategic Goal 1: Position the Federal Government as a model employer, improving the Government-wide satisfaction index score by 4 points.
Strategic Objective 1.1: Achieve a Federal workforce that is drawn from the diversity of America, exhibited at all levels of Government, by supporting agencies in fostering diverse, equitable, inclusive, and accessible workplaces. By FY 2026, increase a Government-wide Diversity, Equity, Inclusion, and Accessibility index score by 6 percentage points.
Performance Measure | Government-wide Diversity, Equity, Inclusion, and Accessibility index score |
---|---|
Definition | The average of the scores on a one-hundred-point scale (strongly disagree to strongly agree) for the following OPM FEVS items related to diversity, equity, inclusion, and accessibility: My organization’s management practices promote diversity (e.g., outreach, recruitment, promotion opportunities). My supervisor demonstrates a commitment to workforce diversity (e.g., recruitment, promotion opportunities, development). I have similar access to advancement opportunities (e.g., promotion, career development, training) as others in my work unit. My supervisor provides opportunities fairly to all employees in my work unit (e.g., promotions, work assignments). In my work unit, excellent work is similarly recognized for all employees (e.g., awards, acknowledgements). Employees in my work unit treat me as a valued member of the team. Employees in my work unit make me feel I belong. Employees in my work unit care about me as a person. I am comfortable expressing opinions that are different from other employees in my work unit. In my work unit, people’s differences are respected. I can be successful in my organization being myself. I can easily make a request of my organization to meet my accessibility needs. My organization responds to my accessibility needs in a timely manner. My organization meets my accessibility needs. |
Data Source | OPM Federal Employee Viewpoint Survey (FEVS) |
Frequency | Annual |
Verification and Validation | Between 2010 and 2019, response rates to the OPM FEVS ranged between 41 and 52 percent. Thus, the cleaned OPM FEVS data are weighted so that survey estimates accurately represent the survey population (unweighted data could produce biased estimates of population statistics). The final data set reflects the agency composition and demographic makeup of the Federal workforce within plus or minus one percentage point. Demographic results are not weighted. Additional details of OPM FEVS validation methods are found in the appendix of the Government-wide Management Report for the relevant year at https://www.opm.gov/fevs. OPM’s Survey Analysis Group within Workforce Policy and Innovation leads the survey administration and conducts extensive data analysis to verify the results and identify any systemic data issues. OPM FEVS is a web-based survey, and the instrument has built-in programs to inspect data for various response errors or out of range values; thus, data cleaning is a continuous operation throughout the data collection period. |
Data Limitations | The OPM FEVS is administered annually and reflects employee opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. Not all executive agencies participate in the OPM FEVS. For example, the U.S. Department of Veterans Affairs no longer participates. The OPM FEVS response rate varies but is generally around 45 percent. However, the large sample size (OPM sent the 2020 survey to more than 1.4 million employees, with 624,800 employees completing a survey), combined with the weighting procedures described above, supports the accuracy of the survey data. |
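To make the index arithmetic concrete, the sketch below computes the index as defined above: the unweighted average of the 0 to 100 item scores. The shorthand item labels and the scores are hypothetical placeholders, and the underlying item scores would come from the weighted survey data described under Verification and Validation.

```python
# Minimal sketch of the DEIA index: the average of 0-100 scores for the FEVS
# items listed in the definition. Labels and values are hypothetical, not OPM data.

item_scores = {
    "management_practices_promote_diversity": 72.0,
    "supervisor_commitment_to_diversity": 78.0,
    "similar_access_to_advancement": 65.0,
    "supervisor_provides_opportunities_fairly": 71.0,
    # ...the remaining DEIA items would be added the same way
}

deia_index = sum(item_scores.values()) / len(item_scores)
print(f"DEIA index: {deia_index:.1f}")  # DEIA index: 71.5
```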
Strategic Objective 1.2: Develop a Government-wide vision and strategy and implement policies and initiatives that embrace the future of work and position the Federal Government as a model employer with respect to hiring, talent development, competitive pay, benefits, and workplace flexibilities.
Performance Measure | Percent of CHCOs who report they have the necessary guidance and resources from OPM to inform their future of work planning |
---|---|
Definition | The number of CHCO survey respondents who indicate that they "agree" or "strongly agree" with the statement "My agency has the necessary guidance and resources from OPM to inform our future of work planning" divided by the total number of CHCOs who responded to the survey item. CHCOs are defined as the CHCOs and Deputy CHCOs of the CHCO Act of 2002 agencies. |
Data Source | CHCO Council Survey |
Frequency | Annual |
Verification and Validation | As part of the verification process, responses to the survey item are checked for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). Double-checking the coding of each survey item enhances data quality, supporting accuracy, completeness, and reliability. OPM subject matter experts and the CHCO executive council validated the items. Data are cross-referenced with other feedback channels and coordination efforts through the CHCO Council. |
Data Limitations | This data is collected annually and, therefore, reflects CHCO opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. In addition, turnover in some agency CHCO positions can be high, meaning that the sample is not stable over time. Other key limitations are response rate, which may vary by year, and sample size, which is expected to be low and limits the ability to make precise determinations or comparisons. |
Performance Measure | Percent of CHCOs who report they find the services from OPM to inform their future of work planning helpful |
---|---|
Definition | The number of CHCO survey respondents who indicate that they "agree" or "strongly agree" with the statement "OPM's services to inform future of work planning are helpful" divided by the total number of CHCOs who responded to the survey item. |
Data Source | CHCO Council Survey |
Frequency | Annual |
Verification and Validation | As part of the verification process, responses to the survey item are checked for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). Double-checking the coding of each survey item enhances data quality, supporting accuracy, completeness, and reliability. OPM subject matter experts and the CHCO executive council validated the items. Data are cross-referenced with other feedback channels and coordination efforts through the CHCO Council. |
Data Limitations | This data is collected annually and, therefore, reflects CHCO opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. In addition, turnover in some agency CHCO positions can be high, meaning that the sample is not stable over time. Other key limitations are response rate, which may vary by year, and sample size, which is expected to be low and limits the ability to make precise determinations or comparisons. |
Strategic Objective 1.3: Build the skills of the Federal workforce through hiring and training. By FY 2026, increase the Government-wide percentage of respondents who agree that their work unit has the job-relevant knowledge and skills necessary to accomplish organizational goals by 4 points.
Performance Measure | Percent of respondents who agree that their work units have the job-relevant knowledge and skills necessary to accomplish organizational goals |
---|---|
Definition | The number of Federal employees who responded positively (strongly agree or agree) to the following OPM FEVS item, divided by the total number of Federal employees who responded to the item: My work unit has the job-relevant knowledge and skills necessary to accomplish organizational goals. |
Data Source | OPM Federal Employee Viewpoint Survey (FEVS) |
Frequency | Annual |
Verification and Validation | Between 2010 and 2019, response rates to the OPM FEVS ranged between 41 and 52 percent. Thus, the cleaned OPM FEVS data are weighted so that survey estimates accurately represent the survey population (unweighted data could produce biased estimates of population statistics). The final data set reflects the agency composition and demographic makeup of the Federal workforce within plus or minus one percentage point. Demographic results are not weighted. Additional details of OPM FEVS validation methods are found in the appendix of the Government-wide Management Report for the relevant year at https://www.opm.gov/fevs. OPM’s Survey Analysis Group within Workforce Policy and Innovation leads the survey administration and conducts extensive data analysis to verify the results and identify any systemic data issues. OPM FEVS is a web-based survey, and the instrument has built-in programs to inspect data for various response errors or out of range values; thus, data cleaning is a continuous operation throughout the data collection period. |
Data Limitations | The OPM FEVS is administered annually and reflects employee opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. Not all executive agencies participate in the OPM FEVS. For example, the U.S. Department of Veterans Affairs no longer participates. The OPM FEVS response rate varies but is generally around 45 percent. However, the large sample size (OPM sent the 2020 survey to more than 1.4 million employees, with 624,800 employees completing a survey), combined with the weighting procedures described above, supports the accuracy of the survey data. |
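As a worked illustration of this ratio, and of the response weighting described under Verification and Validation, the sketch below computes a weighted percent positive for a single FEVS item. The respondent records and weights are invented for the example.

```python
# Weighted percent positive for one FEVS item, using hypothetical records.
# Each record is (response, survey_weight); "positive" = agree or strongly agree.

responses = [
    ("strongly agree", 1.2),
    ("agree", 0.9),
    ("neither agree nor disagree", 1.1),
    ("disagree", 1.0),
    ("agree", 1.3),
]

POSITIVE = {"agree", "strongly agree"}
weighted_positive = sum(w for r, w in responses if r in POSITIVE)
total_weight = sum(w for _, w in responses)
print(f"{100 * weighted_positive / total_weight:.1f}% positive")  # 61.8% positive
```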
Performance Measure | Average score for hiring manager satisfaction that applicants to human resources, acquisitions, and cybersecurity positions are referred in a timely manner with the necessary skills to perform the job |
---|---|
Definition | The weighted average of hiring manager ratings on a scale of 1 to 10 (with 1 being strongly disagree and 10 being strongly agree) for the Hiring Manager Satisfaction Survey questions below, converted to a 5-point scale: 17. A sufficient number of qualified applicants were referred for hiring consideration (weighted 30 percent). 18. The applicants who were referred had the skills to perform the job (weighted 40 percent). 23. I received the certificate of eligible applicants from the human resources office in a timely manner (weighted 15 percent). 24. The overall hiring process occurred in a timely manner (weighted 15 percent). |
Data Source | Hiring Manager Satisfaction Survey |
Frequency | Annual |
Verification and Validation | The vendor that administers the Hiring Manager Satisfaction Survey provides quarterly verification of data completeness and accuracy. As part of the verification process, responses to the survey items are checked for appropriate and accurate coding, including confirming that there are no out of range responses and that responses correspond with survey skip patterns. A team of industrial and organizational psychologists assists in the creation, development, and monitoring of the survey process. The survey, including individual questions, has been vetted and approved by subject matter experts and the CHCO Council. |
Data Limitations | Data and results are based upon the responses from those who voluntarily complete the survey and who self-identify as having participated in the hiring process. These responses provide a portrayal of their perceptions and experiences regarding the timeliness of services and quality of applicants received. However, the number of service recipients is currently unknown, as not every hiring manager completes the survey. To promote use of the survey, USA Staffing, the Talent Acquisition System used by 75 percent of Federal agencies, automates the survey process, while OPM works with the Talent Acquisition Systems used by the remaining 25 percent of Federal agencies to further automate the survey process. Because three agencies represent almost 70 percent of responses, they have a disproportionate impact on the overall results of the Hiring Quality and Timeliness Index. |
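To make the weighting and scale conversion concrete, the sketch below applies the item weights from the definition to hypothetical item-level averages. The report does not spell out how the 1 to 10 ratings are converted to a 5-point scale, so a simple linear conversion (dividing the weighted average by two) is assumed here for illustration.

```python
# Illustrative Hiring Quality and Timeliness Index calculation. Item weights
# come from the definition above; the item averages are made up, and the
# divide-by-two conversion to a 5-point scale is an assumption.

WEIGHTS = {
    "q17_sufficient_qualified_applicants": 0.30,
    "q18_applicants_had_skills": 0.40,
    "q23_certificate_received_timely": 0.15,
    "q24_overall_process_timely": 0.15,
}

def hiring_index(item_averages: dict[str, float]) -> float:
    """Weighted average of 1-10 item ratings, converted to a 5-point scale."""
    weighted = sum(WEIGHTS[item] * avg for item, avg in item_averages.items())
    return weighted / 2  # assumed linear 10-point to 5-point conversion

print(round(hiring_index({
    "q17_sufficient_qualified_applicants": 7.8,
    "q18_applicants_had_skills": 8.2,
    "q23_certificate_received_timely": 7.1,
    "q24_overall_process_timely": 6.9,
}), 2))  # 3.86
```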
Performance Measure | Percent of vacancies using alternative assessments to replace or augment the self-report occupational questionnaire |
---|---|
Definition | The number of Government-wide competitive permanent and term jobs open to the public and open to Federal employees posted to USAJOBS and sourced from USA Staffing and Monster hiring systems that use an assessment type other than or in addition to a self-assessment questionnaire (such as a multiple-choice online exam to assess skills like reasoning, judgment, and interaction), divided by the number of Government-wide competitive jobs open to the public and open to Federal employees posted to USAJOBS and sourced from USA Staffing and Monster hiring systems. |
Data Source | USA Staffing and Monster hiring systems |
Frequency | Quarterly |
Verification and Validation | OPM, OMB, and GSA publish a publicly available dashboard, enabling all agencies to verify their data. |
Data Limitations | Not all manual assessments are tracked in the Talent Acquisition Systems, potentially resulting in underreporting for those assessment types. The results reflect jobs posted to USAJOBS and sourced from USA Staffing and Monster hiring systems. The results represent hires into the competitive service (Delegated Examining and Merit Promotion), which represent a slice of overall agency hires. Other hiring authorities, such as direct hire and excepted service positions, are excluded from the data; however, some of those positions are still found in the dataset due to challenges identifying and excluding such positions. |
Strategic Objective 1.4: Champion the Federal workforce by engaging and recognizing Federal employees and elevating their work. By FY 2026, increase the number of social media engagements on recognition-focused content by 15 percent.
Performance Measure | Number of social media engagements on recognition-focused content |
---|---|
Definition | The number of engagements on recognition-focused content shared by OPM on X and LinkedIn. In FY 2022, the number also included content shared on Facebook. Engagements are defined as the number of times users liked, @replied, retweeted, or clicked on posts (not including quote tweets) on X and reacted to, commented on, shared, or clicked on posts on LinkedIn. Recognition-focused content includes content shared on OPM social media that is designed to engage, recognize, or elevate the Federal workforce. |
Data Source | Sprout Social Profile Performance Report |
Frequency | Quarterly |
Verification and Validation | The responses are tracked by the social media companies and reviewed by the Office of Communications (OC) in the Sprout Social Profile Performance Report. |
Data Limitations | The data may not reflect all viewers of the content, such as those who see it on other platforms or who do not engage with it sufficiently to be captured. |
Strategic Goal 2: Transform OPM’s organizational capacity and capability to better serve as the leader in Federal human capital management.
Strategic Objective 2.1: Build the skills of the OPM workforce and attract skilled talent. By FY 2026, increase the percentage of OPM employees who agree that their work unit has the job-relevant knowledge and skills necessary to accomplish organizational goals by 3 percentage points.
Performance Measure | Percent of respondents who agree that their work unit has the job-relevant knowledge and skills necessary to accomplish organizational goals |
---|---|
Definition | The number of OPM employees who responded positively (strongly agree or agree) to the following OPM FEVS item, divided by the number of OPM employees who responded to the FEVS item: My work unit has the job-relevant knowledge and skills necessary to accomplish organizational goals. |
Data Source | OPM Federal Employee Viewpoint Survey (FEVS) |
Frequency | Annual |
Verification and Validation | Between 2010 and 2019, response rates to the OPM FEVS ranged between 41 and 52 percent. Thus, the cleaned OPM FEVS data are weighted so that survey estimates accurately represent the survey population (unweighted data could produce biased estimates of population statistics). The final data set reflects the agency composition and demographic makeup of the Federal workforce within plus or minus one percentage point. Demographic results are not weighted. Additional details of OPM FEVS validation methods are found in the appendix of the Government-wide Management Report for the relevant year at https://www.opm.gov/fevs. OPM’s Survey Analysis Group within Workforce Policy and Innovation leads the survey administration and conducts extensive data analysis to verify the results and identify any systemic data issues. OPM FEVS is a web-based survey, and the instrument has built-in programs to inspect data for various response errors or out of range values; thus, data cleaning is a continuous operation throughout the data collection period. |
Data Limitations | The OPM FEVS is administered annually and reflects employee opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. Not all executive agencies participate in the OPM FEVS. For example, the U.S. Department of Veterans Affairs no longer participates. The OPM FEVS response rate varies but is generally around 45 percent. However, the large sample size (OPM sent the 2020 survey to more than 1.4 million employees, with 624,800 employees completing a survey), combined with the weighting procedures described above, supports the accuracy of the survey data. |
Strategic Objective 2.2: Improve OPM's relationships and standing as the human capital management thought leader. By FY 2026, increase the percent of CHCOs who strongly agree that OPM treats them as a strategic partner by 23 percentage points.
Performance Measure | Percent of CHCOs indicating that OPM treats them as strategic partners |
---|---|
Definition | The number of CHCO survey respondents who indicate that they "agree" or "strongly agree" with the statement "OPM treats CHCOs as strategic partners" divided by the total number of CHCOs who responded to the survey item. CHCOs are defined as the CHCOs and Deputy CHCOs of the CHCO Act of 2002 agencies. |
Data Source | CHCO Council Survey |
Frequency | Annual |
Verification and Validation | As part of the verification process, responses to the survey item are checked for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). Double-checking the coding of each survey item enhances data quality, supporting accuracy, completeness, and reliability. OPM subject matter experts and the CHCO executive council validated the items. Data are cross-referenced with other feedback channels and coordination efforts through the CHCO Council. |
Data Limitations | This data is collected annually and, therefore, reflects CHCO opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. In addition, turnover in some agency CHCO positions can be high, meaning that the sample is not stable over time. Other key limitations are response rate, which may vary by year, and sample size, which is expected to be low and limits the ability to make precise determinations or comparisons. |
Performance Measure | Percent of CHCOs who strongly agree that OPM treats them as strategic partners |
---|---|
Definition | The number of CHCO survey respondents who indicate that they "strongly agree" with the statement "OPM treats CHCOs as strategic partners" divided by the total number of CHCOs who responded to the survey item. CHCOs are defined as the CHCOs and Deputy CHCOs of the CHCO Act of 2002 agencies. |
Data Source | CHCO Council Survey |
Frequency | Annual |
Verification and Validation | As part of the verification process, responses to the survey item are checked for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). Double-checking the coding of each survey item enhances data quality, supporting accuracy, completeness, and reliability. OPM subject matter experts and the CHCO executive council validated the items. Data are cross-referenced with other feedback channels and coordination efforts through the CHCO Council. |
Data Limitations | This data is collected annually and, therefore, reflects CHCO opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. In addition, turnover in some agency CHCO positions can be high, meaning that the sample is not stable over time. Other key limitations are response rate, which may vary by year, and sample size, which is expected to be low and limits the ability to make precise determinations or comparisons. |
Strategic Objective 2.3: Improve OPM's program efficacy through comprehensive risk management and contract monitoring across the agency. By FY 2026, achieve the OMB-set target for the percentage of spending under category management.
Performance Measure | Percent of OPM's spend under management (SUM) (Cumulative) |
---|---|
Definition | The amount of OPM’s spend that is actively managed according to category management principles divided by the total amount of OPM’s spend. Category management refers to the business practice of buying common goods and services as an enterprise to eliminate redundancies, increase efficiency, and deliver more value and savings from the Government’s acquisition programs. |
Data Source | Federal Procurement Data System |
Frequency | Quarterly |
Verification and Validation | OPM compares contract data from GSA SUM reports with contract data reported in OPM’s contract writing system to verify GSA’s SUM calculation. |
Data Limitations | There are no significant data limitations. |
Performance Measure | Percent of contract actions in compliance with Government-wide past performance reporting requirements (Cumulative) |
---|---|
Definition | The number of completed performance evaluations divided by the number of contract actions that are subject to performance evaluation reporting requirements. |
Data Source | Contractor Performance Assessment Reporting System |
Frequency | Quarterly |
Verification and Validation | The U.S. Navy, administrator of the Contractor Performance Assessment Reporting System, validates the methodology and verifies the data. OPM verifies the narratives, reviews contracts in the system contract pool, and works directly with system customer service representatives to remove from the calculation pool all contracts that do not require past performance reporting. |
Data Limitations | There are no significant data limitations. |
Strategic Objective 2.4: Establish a sustainable funding and staffing model for OPM that better allows the agency to meet its mission. By FY 2026, increase the percentage of OPM managers who indicate that they have sufficient resources to get their jobs done by 4 percentage points.
Performance Measure | Percent of OPM managers who indicate that they have sufficient resources to get their job done |
---|---|
Definition | The number of OPM managers who responded agree or strongly agree to the following Federal Employee Viewpoint Survey item: "I have sufficient resources (for example, people, materials, budget) to get my job done" divided by the total number of OPM managers who responded. |
Data Source | OPM Federal Employee Viewpoint Survey (FEVS) |
Frequency | Annual |
Verification and Validation | Between 2010 and 2019, response rates to the OPM FEVS ranged between 41 and 52 percent. Thus, the cleaned OPM FEVS data are weighted so that survey estimates accurately represent the survey population (unweighted data could produce biased estimates of population statistics). The final data set reflects the agency composition and demographic makeup of the Federal workforce within plus or minus one percentage point. Demographic results are not weighted. Additional details of OPM FEVS validation methods are found in the appendix of the Government-wide Management Report for the relevant year at https://www.opm.gov/fevs. OPM’s Survey Analysis Group within Workforce Policy and Innovation leads the survey administration and conducts extensive data analysis to verify the results and identify any systemic data issues. OPM FEVS is a web-based survey, and the instrument has built-in programs to inspect data for various response errors or out of range values; thus, data cleaning is a continuous operation throughout the data collection period. |
Data Limitations | The OPM FEVS is administered annually and reflects employee opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. Not all executive agencies participate in the OPM FEVS. For example, the U.S. Department of Veterans Affairs no longer participates. The OPM FEVS response rate varies but is generally around 45 percent. However, the large sample size (OPM sent the 2020 survey to more than 1.4 million employees, with 624,800 employees completing a survey), combined with the weighting procedures described above, supports the accuracy of the survey data. |
Performance Measure | Percent of OPM staff who indicate that they have sufficient resources to get their job done |
---|---|
Definition | The number of non-supervisory OPM staff who responded agree or strongly agree to the following Federal Employee Viewpoint Survey item: "I have sufficient resources (for example, people, materials, budget) to get my job done" divided by the total number of OPM staff who responded. |
Data Source | OPM Federal Employee Viewpoint Survey (FEVS) |
Frequency | Annual |
Verification and Validation | Between 2010 and 2019, response rates to the OPM FEVS ranged between 41 and 52 percent. Thus, the cleaned OPM FEVS data are weighted so that survey estimates accurately represent the survey population (unweighted data could produce biased estimates of population statistics). The final data set reflects the agency composition and demographic makeup of the Federal workforce within plus or minus one percentage point. Demographic results are not weighted. Additional details of OPM FEVS validation methods are found in the appendix of the Government-wide Management Report for the relevant year at https://www.opm.gov/fevs. OPM’s Survey Analysis Group within Workforce Policy and Innovation leads the survey administration and conducts extensive data analysis to verify the results and identify any systemic data issues. OPM FEVS is a web-based survey, and the instrument has built-in programs to inspect data for various response errors or out of range values; thus, data cleaning is a continuous operation throughout the data collection period. |
Data Limitations | The OPM FEVS is administered annually and reflects employee opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. Not all executive agencies participate in the OPM FEVS. For example, the U.S. Department of Veterans Affairs no longer participates. The OPM FEVS response rate varies but is generally around 45 percent. However, the large sample size (OPM sent the 2020 survey to more than 1.4 million employees, with 624,800 employees completing a survey), combined with the weighting procedures described above, supports the accuracy of the survey data. |
Strategic Objective 2.5: Modernize OPM IT by establishing an enterprise-wide approach, eliminating fragmentation, and aligning IT investments with core mission requirements. By FY 2026, increase the percentage of software projects implementing adequate incremental development to 95 percent.
Performance Measure | Percent of software projects implementing adequate incremental development |
---|---|
Definition | The number of OPM projects that have at least one associated activity that plans to deliver functionality in approximately six months divided by the total number of current OPM IT projects. |
Data Source | Federal Information Technology Acquisition Reform Act Dashboard, Agency Chief Information Officer Authority Enhancements (Incremental Development) |
Frequency | Semiannual |
Verification and Validation | OMB requires agencies’ investments to deliver functionality every six months. Congress, OMB, and GAO’s work support the use of incremental development practices. OPM reports the data to the Committee on Oversight and Reform, and additional subject matter experts then verify it. |
Data Limitations | There are no significant data limitations. |
Performance Measure | Score for utilization of the working capital fund to support IT modernization and security |
---|---|
Definition | OPM’s average monthly score for using working capital funds, on a 5-point scale, based on the Federal Information Technology Acquisition Reform Act scoring methodology. An agency receives a five (or A) if it has a Modernizing Government Technology Act-specific working capital fund with a Chief Information Officer in charge of decision-making, a four (or B) if it plans to set up a Modernizing Government Technology working capital fund in the current or next fiscal year, a three (or C) if it has a department working capital fund or equivalent, a two (or D) if it has some other IT-related funding method, and a one (or F) otherwise. |
Data Source | Federal Information Technology Acquisition Reform Act Dashboard |
Frequency | Semiannual |
Verification and Validation | The OCIO data collection lead develops the information requested for the Committee on Oversight and Reform's semiannual scorecard. The OPM Chief Information Officer reviews the information before submittal to the Committee on Oversight and Reform. |
Data Limitations | There are no significant data limitations. |
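The scoring rubric in the definition above is a straightforward lookup, sketched below for illustration. The shorthand criterion labels are invented for this example and are not official FITARA or scorecard terminology.

```python
# The working capital fund scoring rubric from the definition, encoded as an
# ordered list of (criterion label, score, letter grade). Labels are shorthand
# invented for this sketch.

WCF_RUBRIC = [
    ("MGT Act working capital fund with CIO decision authority", 5, "A"),
    ("plans an MGT Act working capital fund this or next fiscal year", 4, "B"),
    ("department working capital fund or equivalent", 3, "C"),
    ("other IT-related funding method", 2, "D"),
    ("none of the above", 1, "F"),
]

def wcf_score(criterion: str) -> tuple[int, str]:
    """Return the (score, grade) for the stated criterion."""
    for label, score, grade in WCF_RUBRIC:
        if label == criterion:
            return score, grade
    raise ValueError(f"unknown criterion: {criterion!r}")

print(wcf_score("department working capital fund or equivalent"))  # (3, 'C')
```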
Strategic Objective 2.6: Promote a positive organizational culture where leadership drives an enterprise mindset, lives the OPM values, and supports employee engagement and professional growth. By FY 2026, increase OPM's Leaders Lead Score by 3 points.
Performance Measure | OPM Leaders Lead score |
---|---|
Definition | The average of the scores for the following OPM FEVS items, which reflect OPM employees’ perceptions of the integrity of leadership, as well as leadership behaviors such as communication and workforce motivation: |
Data Source | OPM Federal Employee Viewpoint Survey (FEVS) |
Frequency | Annual |
Verification and Validation | Between 2010 and 2019, response rates to the OPM FEVS ranged between 41 and 52 percent. Thus, the cleaned OPM FEVS data are weighted so that survey estimates accurately represent the survey population (unweighted data could produce biased estimates of population statistics). The final data set reflects the agency composition and demographic makeup of the Federal workforce within plus or minus one percentage point. Demographic results are not weighted. Additional details of OPM FEVS validation methods are found in the appendix of the Government-wide Management Report for the relevant year at https://www.opm.gov/fevs. OPM’s Survey Analysis Group within Workforce Policy and Innovation leads the survey administration and conducts extensive data analysis to verify the results and identify any systemic data issues. OPM FEVS is a web-based survey, and the instrument has built-in programs to inspect data for various response errors or out of range values; thus, data cleaning is a continuous operation throughout the data collection period. |
Data Limitations | The OPM FEVS is administered annually and reflects employee opinions at a single point in time. Events around the time of the data collection (historicity effect) could possibly influence results. Not all executive agencies participate in the OPM FEVS. For example, the U.S. Department of Veterans Affairs no longer participates. The OPM FEVS response rate varies but is generally around 45 percent. However, the large sample size (OPM sent the 2020 survey to more than 1.4 million employees, with 624,800 employees completing a survey), combined with the weighting procedures described above, supports the accuracy of the survey data. |
Strategic Goal 3: Create a human-centered customer experience by putting the needs of OPM’s customers at the center of OPM’s workforce services, policy, and oversight, increasing OPM’s customer satisfaction index score for targeted services to 4.3 out of 5.
Strategic Objective 3.1: Enhance the Retirement Services customer experience by providing timely, accurate, and responsive service that addresses the diverse needs of OPM’s customers. By FY 2026, improve the customer satisfaction score to 4.2 out of 5.
Performance Measure | Average number of minutes to answer phone calls (Cumulative) |
---|---|
Definition | The average amount of time contacts spent waiting for an agent to answer after requesting to speak with an agent (from “in queue” state to “active” state). It does not include abandoned calls. |
Data Source | CXone Platform |
Frequency | Monthly |
Verification and Validation | OPM reviews data collection and reporting procedures and tests data to assess its accuracy. These tests include comparing data for a given fiscal year to similar data collected for previous years, researching any anomalies that are observed, and comparing data with similar information collected from other sources. Quality and management control devices are built into these data collection mechanisms to verify accuracy and reliability. |
Data Limitations | There are no significant data limitations. |
Performance Measure | Average number of days to process retirement cases |
---|---|
Definition | The average number of days from when OPM receives a retirement application from the annuitant’s agency (or for disability cases, when OPM approves the medical determination) to when final adjudication and payment is issued. |
Data Source | Annuity Roll Processing System |
Frequency | Monthly |
Verification and Validation | OPM reviews data collection and reporting procedures and tests data to assess its accuracy. These tests include comparing data for a given fiscal year to similar data collected for previous years, researching any anomalies that are observed, and comparing data with similar information collected from other sources. Quality and management control devices are built into these data collection mechanisms to verify accuracy and reliability. |
Data Limitations | The processing times do not include the time period before OPM receives the applications from the annuitants’ agencies, and for disability cases, do not include the time period before OPM approves the medical determination. |
Performance Measure | Average satisfaction score for services received from Retirement Services |
---|---|
Definition | The average survey recipient response, on a five-point scale (very dissatisfied to very satisfied), for the following statement: I am satisfied with the service received from OPM Retirement Services |
Data Source | RS Quarterly Customer Satisfaction Survey |
Frequency | Quarterly |
Verification and Validation | OPM has validated survey items with survey experts for comprehension. OPM reviews the data and compares historical trends where applicable. |
Data Limitations | The survey is administered quarterly and limited to annuitants who have a valid email address on file with OPM and who received their full annuity payment within the previous three-month period. The scope of this survey does not include certain populations such as those with Civil Service Retirement System deferred cases, survivors, or former spouses. Responses may also be impacted by the amount of time between the customer’s transaction and the completion of the survey. |
Strategic Objective 3.2: Create a personalized USAJOBS® experience to help applicants find relevant opportunities. By FY 2026, improve applicant satisfaction to 4.1 out of 5 for the desktop platform and to 4.5 out of 5 for the mobile platform.
Performance Measure | Average overall satisfaction score with USAJOBS (desktop) |
---|---|
Definition | The average survey recipient response, on a five-point scale, for the following question: What is your overall satisfaction with this site? |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, OPM checks responses to the survey for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns), enhancing data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias in responses and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding satisfaction. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average ease score (desktop) |
---|---|
Definition | The average survey recipient response, on a five-point scale (strongly disagree to strongly agree), for the following statement: It was easy to complete what I needed to do. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, OPM checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns), enhancing data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding the ease of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average efficiency score (desktop) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: It took a reasonable amount of time to do what I needed to do. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding the timeliness of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average transparency score (desktop) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: I understand what is being asked of me throughout the Federal application process. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding the transparency of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average website helpfulness score (desktop) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: The website helped me do what I needed to do. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding website helpfulness. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average trust score (desktop) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: This interaction increased my trust in USAJOBS. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding their trust of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average effectiveness score (desktop) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: My need was addressed. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding the effectiveness of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average overall satisfaction score with USAJOBS (mobile) |
---|---|
Definition | The average survey recipient response, on a 5-point scale, for the following question: What is your overall satisfaction with this site? |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding satisfaction. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average trust score (mobile) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: This interaction increased my trust in USAJOBS. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding their trust of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average effectiveness score (mobile) |
---|---|
Definition | The average survey recipient response, on a five-point scale (strongly disagree to strongly agree), for the following statement: My need was addressed. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding the effectiveness of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average ease score (mobile) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: It was easy to complete what I needed to do. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding the ease of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average efficiency score (mobile) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: It took a reasonable amount of time to do what I needed to do. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding the efficiency of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average transparency score (mobile) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: I understand what is being asked of me throughout the Federal application process. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding the transparency of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Performance Measure | Average website helpfulness score (mobile) |
---|---|
Definition | The average survey recipient response, on a 5-point scale (strongly disagree to strongly agree), for the following statement: The website helped me do what I needed to do. |
Data Source | Verint Foresee survey service (USAJOBS survey) |
Frequency | Quarterly |
Verification and Validation | As part of the verification process, the USAJOBS data analyst checks responses to the survey item for appropriate and accurate coding (for example, confirming that there are no out of range responses and that responses correspond with survey skip patterns). By double-checking the coding of each survey item, the office enhances data quality by verifying data file accuracy, completeness, and reliability. To minimize potential bias and promote statistical validity, the USAJOBS program office uses random sampling of customers and a large sample size. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding the helpfulness of services received. However, the sample may not be fully representative of the population of service users as some may not have elected to complete the survey. |
Strategic Objective 3.3: Create a seamless customer and intermediary experience across OPM’s policy, service, and oversight functions. By FY 2026, increase the average score for helpfulness of OPM human capital services in achieving human capital objectives to 4.5 out of 5.
Performance Measure | Average score for helpfulness of OPM human capital services in achieving human capital objectives |
---|---|
Definition | Average response on a five-point scale (strongly disagree to strongly agree) of human capital community respondents to the following survey item: OPM was helpful in achieving your human capital objectives. |
Data Source | Customer Satisfaction Surveys (WPI, MSAC, HRS) |
Frequency | Semiannual |
Verification and Validation | OPM checks the responses to the questions for appropriate and accurate coding, for example, confirming that there are no out-of-range or unaccounted-for responses. Double-checking the responses for each survey item enhances data quality by promoting accuracy, completeness, and reliability. |
Data Limitations | Data and results are based on the responses from those who voluntarily responded to the questions and who self-identified as having received human capital services from OPM. These responses portray respondents’ perceptions and experiences regarding the quality of services received. The voluntary nature of the survey and the reliance on self-identification likely lead to an undercount of the actual number of service recipients. |
Strategic Objective 3.4: Transform the OPM website to a user-centric and user-friendly website. By FY 2026, achieve an average effectiveness score of 4 out of 5.
Performance Measure | Average effectiveness score |
---|---|
Definition | The average survey recipient response, on a five-point scale (strongly disagree to strongly agree), for the following statement: I found what I needed on the site. |
Data Source | Website feedback survey |
Frequency | Quarterly |
Verification and Validation | OPM checks the responses to the questions for appropriate and accurate coding, for example, confirming that there are no out-of-range or unaccounted-for responses. Double-checking the responses for each survey item enhances data quality by promoting accuracy, completeness, and reliability. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding OPM’s website. However, the sample may not be fully representative of the population of website users as some may not have elected to complete the survey. |
Performance Measure | Average ease score |
---|---|
Definition | The average survey recipient response, on a five-point scale (strongly disagree to strongly agree), for the following statement: It was easy to find what I needed. (A sketch of the average-score calculation follows this table.) |
Data Source | Website feedback survey |
Frequency | Quarterly |
Verification and Validation | OPM checks the responses to the questions for appropriate and accurate coding, for example, confirming that there are no out-of-range or unaccounted-for responses. Double-checking the responses for each survey item enhances data quality by promoting accuracy, completeness, and reliability. |
Data Limitations | Data and results are based upon the responses from those who voluntarily completed the survey. These responses portray respondents’ perceptions and experiences regarding OPM’s website. However, the sample may not be fully representative of the population of website users as some may not have elected to complete the survey. |
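For illustration, the average scores reported for these website measures (and for the USAJOBS measures earlier in this section) reduce to a simple mean of the cleaned 1-to-5 responses. The values below are hypothetical.

```python
# Minimal sketch: the average score on the five-point scale is the mean of the
# cleaned responses. Values are hypothetical.
def average_score(responses: list[int]) -> float:
    # Assumes out-of-range and skip-pattern violations were already removed.
    return sum(responses) / len(responses)

ease_responses = [5, 4, 4, 3, 5]
print(round(average_score(ease_responses), 2))  # 4.2
```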
Strategic Goal 4: Provide innovative and data-driven solutions to enable agencies to meet their missions, increasing the percentage of users throughout Government who agree that OPM offered innovative solutions while providing services or guidance by 4 points.
Strategic Objective 4.1: Foster a culture of creativity and innovation within OPM. By FY 2026, increase the percentage of employees who agree that innovation is valued by 4 points.
Performance Measure | OPM Innovation score |
---|---|
Definition | The average of the scores for the OPM FEVS items related to innovation. |
Data Source | OPM Federal Employee Viewpoint Survey (FEVS) |
Frequency | Annual |
Verification and Validation | Between 2010 and 2019, response rates to the OPM FEVS ranged between 41 and 52 percent. The cleaned OPM FEVS data are therefore weighted so that survey estimates accurately represent the survey population, since unweighted data could produce biased estimates of population statistics (a simplified illustration of this weighting follows this table). The final data set reflects the agency composition and demographic makeup of the Federal workforce within plus or minus one percentage point. Demographic results are not weighted. Additional details of OPM FEVS validation methods are found in the appendix of the Government-wide Management Report for the relevant year at https://www.opm.gov/fevs. OPM’s Survey Analysis Group within Workforce Policy and Innovation leads the survey administration and conducts extensive data analysis to verify the results and identify any systemic data issues. The OPM FEVS is a web-based survey, and the instrument has built-in programs to inspect data for various response errors and out-of-range values; thus, data cleaning is a continuous operation throughout the data collection period. |
Data Limitations |
The OPM FEVS is administered annually and reflects employee opinions at a single point in time. Events around the time of data collection (historicity effect) could influence results. Not all executive agencies participate in the OPM FEVS; for example, the U.S. Department of Veterans Affairs no longer participates. The OPM FEVS response rate varies but is generally around 45 percent. However, the large sample size (OPM sent the 2020 survey to more than 1.4 million employees, and 624,800 employees completed it), combined with the weighting procedures described above, supports the accuracy of the survey data. |
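A simplified, hypothetical illustration of the weighting described above follows. The strata, counts, and scores are invented; the actual OPM FEVS methodology is documented in the Government-wide Management Report. The point is only that weighting by population shares corrects for uneven response rates across groups.

```python
# Hypothetical two-stratum example: HQ responds at a higher rate than Field,
# so the unweighted mean over-represents HQ. Weighting restores population shares.
population = {"HQ": 6000, "Field": 4000}    # employees per stratum (invented)
respondents = {"HQ": 3300, "Field": 1500}   # completed surveys per stratum (invented)
mean_score = {"HQ": 72.0, "Field": 65.0}    # stratum means on a 100-point scale (invented)

unweighted = sum(mean_score[s] * respondents[s] for s in respondents) / sum(respondents.values())
# Weighting each response by population/respondents in its stratum is
# equivalent to a population-share-weighted average of the stratum means.
weighted = sum(mean_score[s] * population[s] for s in population) / sum(population.values())

print(f"unweighted={unweighted:.1f}, weighted={weighted:.1f}")  # unweighted=69.8, weighted=69.2
```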
Performance Measure | Percent of OPM leaders trained in innovation techniques (Cumulative) |
---|---|
Definition | The number of OPM employees classified as supervisors and managers, team leaders, leaders, and management officials trained in innovation techniques divided by the total number of supervisors and managers, team leaders, leaders, and management officials. |
Data Source | OPM HR employee supervisor status report and innovation training tracking spreadsheet |
Frequency | Quarterly |
Verification and Validation | The OPM program offices that coordinate the innovation trainings and workshops verify the attendee lists. OPM then crosschecks the attendee lists against the list of supervisors and managers, team leaders, leaders, and management officials (a sketch of this crosscheck follows this table). |
Data Limitations | There are no significant data limitations. |
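The crosscheck described above amounts to matching two lists. A minimal sketch with hypothetical employee IDs:

```python
# Hypothetical sketch of the crosscheck: match training attendees against the
# roster of supervisors, managers, team leaders, leaders, and management officials.
leader_roster = {"E101", "E102", "E103", "E104"}  # from the HR supervisor status report (invented IDs)
attendees = {"E101", "E103", "E999"}              # from the training tracking spreadsheet (invented IDs)

trained_leaders = attendees & leader_roster       # attendees who appear on the roster
pct_trained = 100 * len(trained_leaders) / len(leader_roster)
print(f"{pct_trained:.0f}% of leaders trained")   # 50% of leaders trained
```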
Strategic Objective 4.2: Increase focus on Government-wide policy work by shifting more low-risk delegations of authorities to agencies.
Performance Measure | Percent of low-risk delegations with errors identified through OPM- or agency-led evaluations |
---|---|
Definition | The number of errors found in a representative sample of delegated low-risk transactions during OPM-led evaluations and CHCO agency internal reviews, divided by the number of actions reviewed by OPM and the agencies. |
Data Source | Results of OPM-led Human Capital Management Evaluations, Delegated Examining reviews, or special studies, and results of annual agency internal assessments that are provided to OPM |
Frequency | Quarterly |
Verification and Validation | OPM will verify the number of low-risk transactions processed by agencies through OPM’s Enterprise Human Resource Integration (EHRI) or, for those transactions not captured in EHRI, through a data call to agencies. OPM will verify the number of low-risk transactional errors identified in OPM-issued reports during the review and clearance process. OPM will also verify the number of errors identified in agency-led assessments. |
Data Limitations | There are no significant data limitations. |
Performance Measure | Percent of CHCOs who agree that OPM provides appropriate delegations to agencies |
---|---|
Definition | The number of CHCO survey respondents who indicate that they "agree" or "strongly agree" with the statement "OPM provides appropriate delegations to agencies" divided by the total number of CHCOs who responded to the survey item (a sketch of this calculation follows this table). CHCOs are defined as the CHCOs and Deputy CHCOs of the CHCO Act of 2002 agencies. |
Data Source | CHCO Council Survey |
Frequency | Annual |
Verification and Validation | Items were verified by OPM subject matter experts and the CHCO executive council. Data are cross-referenced with other feedback channels and coordination efforts through the CHCO Council. As part of the verification process, responses to the survey item are checked for appropriate and accurate coding (for example, confirming that there are no out-of-range responses and that responses correspond with survey skip patterns). Double-checking the coding of each survey item enhances data quality, supporting accuracy, completeness, and reliability. |
Data Limitations | These data are collected annually and, therefore, reflect CHCO opinions at a single point in time. Events around the time of data collection (historicity effect) could influence results. In addition, turnover in some agency CHCO positions can be high, meaning that the sample is not stable over time. Other key limitations are the response rate, which may vary by year, and the sample size, which is expected to be low and limits the ability to make precise determinations or comparisons. |
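The same "percent who agree" calculation underlies this measure and the other CHCO Council Survey measures later in this section: responses of "agree" or "strongly agree" are counted and divided by all responses to the item. A minimal sketch with hypothetical responses:

```python
# Minimal sketch of the "agree"/"strongly agree" percentage. Responses are hypothetical.
responses = ["strongly agree", "agree", "neutral", "disagree", "agree"]

agree_count = sum(r in ("agree", "strongly agree") for r in responses)
print(f"{100 * agree_count / len(responses):.0f}% agree")  # 60% agree
```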
Performance Measure | Percent of low-risk delegations granted to agencies |
---|---|
Definition | The number of low-risk delegations granted to agencies from OPM divided by the number of potential transactions determined by OPM to be low risk. |
Data Source | Tracking spreadsheet that includes all transactions identified for potential delegation to agencies |
Frequency | Quarterly |
Verification and Validation | OPM’s Merit System Accountability and Compliance, Workforce Policy and Innovation, and Suitability Executive Agent Programs offices review the list of delegations for accuracy and completeness. |
Data Limitations | There are no significant data limitations. |
Strategic Objective 4.3: Expand the quality and use of OPM’s Federal human capital data. By FY 2026, increase the percentage of CHCO survey respondents who agree that OPM provides agencies with high quality workforce data and information to be used in decision-making by 20 percentage points.
Performance Measure | Percent of CHCOs who agree that OPM provides agencies with high quality workforce data and information for decision-making |
---|---|
Definition | The number of CHCO survey respondents who indicate that they "agree" or "strongly agree" with the statement "OPM provides agencies with high quality workforce data and information for decision-making" divided by the total number of CHCOs who responded to the survey item. CHCOs are defined as the CHCOs and Deputy CHCOs of the CHCO Act of 2002 agencies. |
Data Source | CHCO Council Survey |
Frequency | Annual |
Verification and Validation | Items were verified by OPM subject matter experts and the CHCO executive council. Data are cross-referenced with other feedback channels and coordination efforts through the CHCO Council. As part of the verification process, responses to the survey item are checked for appropriate and accurate coding (for example, confirming that there are no out-of-range responses and that responses correspond with survey skip patterns). Double-checking the coding of each survey item enhances data quality, supporting accuracy, completeness, and reliability. |
Data Limitations | These data are collected annually and, therefore, reflect CHCO opinions at a single point in time. Events around the time of data collection (historicity effect) could influence results. In addition, turnover in some agency CHCO positions can be high, meaning that the sample is not stable over time. Other key limitations are the response rate, which may vary by year, and the sample size, which is expected to be low and limits the ability to make precise determinations or comparisons. |
Performance Measure | Average quarterly number of users of OPM’s publicly available human capital dashboards |
---|---|
Definition | The average quarterly number of unique visitors to OPM’s human capital dashboards featured or housed on the public OPM Data Portal (a sketch of this calculation follows this table). |
Data Source | Universal Analytics |
Frequency | Quarterly |
Verification and Validation | OPM subject matter experts review the data and research any anomalies. |
Data Limitations | The number of dashboards is expected to increase over time, which may affect the comparability of user counts across reporting periods. |
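For illustration, the measure is the mean of the quarterly unique-visitor counts exported from the analytics tool; the counts below are hypothetical.

```python
# Minimal sketch: average the quarterly unique-visitor counts (invented values).
quarterly_unique_visitors = {"Q1": 12_500, "Q2": 14_200, "Q3": 13_100, "Q4": 15_800}

average = sum(quarterly_unique_visitors.values()) / len(quarterly_unique_visitors)
print(f"{average:,.0f} users per quarter")  # 13,900 users per quarter
```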
Performance Measure | Average quarterly number of authenticated users of OPM’s human capital dashboards |
---|---|
Definition | The average quarterly number of unique users who must authenticate before gaining access to OPM's human capital dashboards. |
Data Source | Interactive data visualization software that tracks web traffic and analytics |
Frequency | Quarterly |
Verification and Validation | OPM validates the number of users who request access and view the human capital dashboards. |
Data Limitations | Given that some OPM dashboards contain sensitive/confidential information, not all human capital dashboards are made public. |
Strategic Objective 4.4: Improve OPM’s ability to provide strategic human capital management leadership to agencies through expansion of innovation, pilots, and identification of leading practices across Government. By FY 2026, provide Federal agencies with 25 leading practices.
Performance Measure | Number of leading practices shared with Federal agencies |
---|---|
Definition | The number of leading practices shared with Federal agencies via publications or events. Leading practices demonstrate efficiency and effectiveness in delivering a particular outcome; they may be specific to an organizational context and time period, and they continue to evolve. |
Data Source | Internal database |
Frequency | Quarterly |
Verification and Validation | The OPM program offices that issue the publications or coordinate the events in which leading practices are shared with agencies verify the counts. OPM also verifies the data using records of communications and event resources, including agendas and presentations. |
Data Limitations | The reported results do not capture leading practices shared informally via OPM technical assistance. |
Strategic Objective 4.5: Revamp OPM’s policy-making approach to be proactive, timely, systematic, and inclusive. By FY 2026, increase the percent of CHCOs who agree that OPM’s policy approach is responsive to agency needs by 8 percentage points.
Performance Measure | Percent of priority policy guidance issued by the deadline |
---|---|
Definition | The number of priority policy guidance documents issued by the deadline divided by the total number of priority policy guidance documents issued (a sketch of this deadline check follows this table). Priority guidance is guidance that is statutorily required, related to OPM’s Strategic Plan, or related to the President’s Management Agenda. For this measure, the deadline for a priority policy guidance document is based on an 18-day review period. |
Data Source | Document approval system |
Frequency | Quarterly |
Verification and Validation | OPM uses internal reporting from the document approval system to verify when documents are approved and if the documents were approved by the established deadline. |
Data Limitations | As OPM transitions from its legacy internal document approval system to a new system, there is a risk of inconsistencies in data collection procedures. |
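The deadline check described above can be sketched as simple date arithmetic. Field names and dates are hypothetical, and the sketch assumes the 18-day review period is measured in calendar days, which the report does not specify.

```python
# Hypothetical sketch: a priority guidance document is on time if it is approved
# within 18 days of entering review. All records are invented.
from datetime import date, timedelta

REVIEW_PERIOD = timedelta(days=18)  # assumed calendar days

documents = [
    {"id": "PG-01", "entered_review": date(2022, 1, 3), "approved": date(2022, 1, 18)},  # on time
    {"id": "PG-02", "entered_review": date(2022, 2, 1), "approved": date(2022, 3, 1)},   # late
]

on_time = sum(d["approved"] <= d["entered_review"] + REVIEW_PERIOD for d in documents)
print(f"{100 * on_time / len(documents):.0f}% issued by the deadline")  # 50% issued by the deadline
```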
Performance Measure | Percent of CHCOs who agree that OPM’s policy approach is responsive to agency needs |
---|---|
Definition | The number of CHCO survey respondents who indicate that they "agree" or "strongly agree" with the statement "OPM's policy approach is responsive to agency needs" divided by the total number of CHCOs who responded to the survey item. CHCOs are defined as the CHCOs and Deputy CHCOs of the CHCO Act of 2002 agencies. |
Data Source | CHCO Council Survey |
Frequency | Annual |
Verification and Validation | Items were verified by OPM subject matter experts and the CHCO executive council. Data are cross-referenced with other feedback channels and coordination efforts through the CHCO Council. As part of the verification process, responses to the survey item are checked for appropriate and accurate coding (for example, confirming that there are no out-of-range responses and that responses correspond with survey skip patterns). Double-checking the coding of each survey item enhances data quality, supporting accuracy, completeness, and reliability. |
Data Limitations | These data are collected annually and, therefore, reflect CHCO opinions at a single point in time. Events around the time of data collection (historicity effect) could influence results. In addition, turnover in some agency CHCO positions can be high, meaning that the sample is not stable over time. Other key limitations are the response rate, which may vary by year, and the sample size, which is expected to be low and limits the ability to make precise determinations or comparisons. |
Strategic Objective 4.6: Streamline Federal human capital regulations and guidance to reduce administrative burden and promote innovation while upholding merit system principles. By FY 2026, improve CHCO agreement that human capital policy changes resulted in less administrative burden to agencies by 8 percentage points.
Performance Measure | Percent of CHCOs who agree that the human capital management system changes resulted in less administrative burden to agencies |
---|---|
Definition | The number of CHCO survey respondents who indicate that they "agree" or "strongly agree" with the statement "Human capital management system changes resulted in less administrative burden to agencies" divided by the total number of CHCOs who responded to the survey item. CHCOs are defined as the CHCOs and Deputy CHCOs of the CHCO Act of 2002 agencies. |
Data Source | CHCO Council Survey |
Frequency | Annual |
Verification and Validation | Items were verified by OPM subject matter experts and the CHCO executive council. Data are cross-referenced with other feedback channels and coordination efforts through the CHCO Council. As part of the verification process, responses to the survey item are checked for appropriate and accurate coding (for example, confirming that there are no out-of-range responses and that responses correspond with survey skip patterns). Double-checking the coding of each survey item enhances data quality, supporting accuracy, completeness, and reliability. |
Data Limitations | These data are collected annually and, therefore, reflect CHCO opinions at a single point in time. Events around the time of data collection (historicity effect) could influence results. In addition, turnover in some agency CHCO positions can be high, meaning that the sample is not stable over time. Other key limitations are the response rate, which may vary by year, and the sample size, which is expected to be low and limits the ability to make precise determinations or comparisons. |