Supplement to "Essential Perspectives for Improving Performance Measures: Users and Their Uses of Performance Information"

Additional Considerations for Improving the Usefulness of Performance Information

 

Several Possible Audit Objectives Related to the Usefulness of Performance Information

  • Determine whether data are available in time for decisions
  • Determine whether appropriate comparisons are made to assist analysis, decision making, and accountability
  • Determine whether performance information is understandable
  • Determine whether at least some key measures are responsive to change or “actionable” (i.e., there are policies or practices government can change to improve results)
  • Determine how well performance information meets the broad needs of users. The most basic step toward accomplishing this objective, and essential to any assessment of “usefulness,” is to ask the intended users.



Selecting Users of Public Performance Reporting by a General Purpose Government

If an audit is intended to cover an entire multi-agency general purpose government, rather than just one or two agencies, it is unlikely to be practical to engage all kinds of external interests in the jurisdiction. Instead, an auditor can look for a sampling of “most likely” users that represent a variety of interests, though not all. For example, a small number of civic groups that tend to represent resident interests may be included, as well as a chamber of commerce or other business group. Groups of any type known to have already made use of the jurisdiction’s public data (e.g., university or other local research organizations) would be good candidates to include.

When trying to help improve the usefulness of performance reporting by a general purpose government to a broad-based public, an audit office may need to think long term and focus on just the “most likely” external users during a first audit of this type. The audit may then include a recommendation that management follow the auditor’s lead and reach out to more groups to determine how useful performance information is to them, and how to make it more useful. The recommendation might also suggest that each agency reach out to groups with a significant interest in its performance. Later, when following up on audit recommendations, the audit office can determine whether management has reached out to more groups, or auditors can reach out to additional interest groups as part of a follow-up audit.


Examples of Typical Uses, Barriers, & Ways to Overcome Barriers to Make Performance Information More Useful

It is important to obtain and analyze detailed information from specific intended users in each jurisdiction, rather than rely on general lists like those below. Still, these lists can be a starting point for deciding what to ask about in interviews, focus groups, or surveys.

Typical Uses

  • Performance targeting and accountability, from personal, team, or organizational monitoring of actual performance vs. targets to performance evaluation at all levels
  • Policy development and improvement, including budgeting
  • Resource reallocation within existing budgets
  • Performance improvement, from outcome and service improvements to cost savings and revenue increases

Typical Barriers and Ways to Overcome Them to Make Performance Information More Useful

  • Barrier: Users do not trust the reliability of the data
    • Overcome: Start a program of auditing data reliability, possibly followed by training managers to assess and improve the reliability of their agencies’ data.
  • Barrier: Up-to-date performance data not available in time for key decisions
    • Overcome: Improve timeliness of performance reporting.
  • Barrier: Intended users do not understand performance information, its relevance, or how it can be used
    • Overcome: Provide clear explanations of key performance measures, data, and analyses; where practical, provide training or technical assistance to key users (e.g., legislators) or engage third-party “data intermediaries” (e.g., from a university) to assist residents.
  • Barrier: Data not detailed enough for some users
    • Overcome: Make disaggregated data available (e.g., geographic or demographic disaggregation).
  • Barrier: Users feel that performance measures do not cover their most important issues
    • Overcome: Consult users when there are opportunities to revise performance measures.


Using a Survey of Intended Users to Supplement Interview and Focus Group Results

Results of user interviews and focus groups can help the auditor craft questions and multiple-choice answers for a survey. The choices would include the most common uses and barriers found in the interviews and focus groups. Perhaps more important is to include optional open-ended questions that let respondents describe other uses and barriers, explain more specifically how they use the data, and say what would make performance information more useful to them. The auditor might then follow up with phone or in-person interviews of a few survey respondents who provided interesting answers and said they would be willing to be interviewed.

Obtaining responses from a truly representative sample of potential users or interests may not be possible, but it is not essential: the qualitative information obtained from the survey is likely to be more important than the quantitative data from the multiple-choice questions. When reporting on the survey, the auditor should cite limitations of the survey data (e.g., potential bias due to an unrepresentative sample) and explain why some survey results are nonetheless valuable.
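
As a rough illustration of how such survey results might be handled, the sketch below tallies the multiple-choice barrier selections and flags respondents who gave substantive open-ended answers and agreed to a follow-up interview. The field names and response data are hypothetical, not drawn from any particular survey tool.

```python
from collections import Counter

# Hypothetical survey responses; in practice these would be exported
# from whatever survey tool the audit office uses.
responses = [
    {"barriers": ["not timely", "not detailed enough"],
     "open_ended": "We need data by neighborhood.", "interview_ok": True},
    {"barriers": ["not timely"],
     "open_ended": "", "interview_ok": False},
    {"barriers": ["don't trust reliability", "not timely"],
     "open_ended": "Numbers rarely match what we see in the field.",
     "interview_ok": True},
]

# Tally the multiple-choice barrier selections.
barrier_counts = Counter(b for r in responses for b in r["barriers"])
for barrier, count in barrier_counts.most_common():
    print(f"{barrier}: {count}")

# List follow-up candidates: respondents with substantive open-ended
# answers who said they would be willing to be interviewed.
candidates = [r for r in responses if r["interview_ok"] and r["open_ended"]]
print(f"{len(candidates)} follow-up interview candidate(s)")
```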


Using Information on How People Use Performance Information to Improve Its Usefulness

Once you have learned the details of how various people use (or would like to use) performance information, you can use those details to find ways to improve the usefulness of performance measures and data. By comparing the details across users, you may identify ways intended users can learn from each other. For example, some users may have data analysis tools that others lack, leading you to recommend that these tools be shared with all users who have similar information needs. You may also learn of common issues across intended users that can be addressed to make the information more useful to multiple users. For example, you may find several users who review jurisdiction-wide information a few times a year and take it into account when making budget requests or considering operational changes, but who could make more targeted operational improvements or resource allocations if they had performance data disaggregated geographically (e.g., by neighborhood) or demographically (e.g., by age, gender, ethnicity, income level).
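
To make the disaggregation point concrete, here is a minimal sketch, using made-up data and the pandas library, of how a jurisdiction-wide average can mask the neighborhood-level differences that disaggregated reporting would reveal.

```python
import pandas as pd

# Hypothetical service records: response time (days) for each request,
# tagged with the neighborhood it came from.
records = pd.DataFrame({
    "neighborhood": ["North", "North", "South", "South", "East", "East"],
    "response_days": [2, 3, 9, 11, 4, 5],
})

# The jurisdiction-wide figure a typical summary report would show.
print("Overall average:", round(records["response_days"].mean(), 1))

# The disaggregated view that lets users target specific areas.
by_area = records.groupby("neighborhood")["response_days"].mean().round(1)
print(by_area)
```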


A Sampling of the Types of Insights Possible by Analyzing Intended User Information in Table Form

When you have summarized the information obtained from intended users or user groups in a table (as in this free downloadable audit tool), you can do a broad analysis across users to gain valuable insights that can inform your audit findings and recommendations.

  • Existing uses or potential new uses that can or do take good advantage of the “information value” of performance data:
    • Information value is not “dollar value,” but the significant ways in which management, performance, or accountability can become more effective through the use of performance information, for example:
      • Resources can be better targeted to where they’re most needed
      • Policies can be improved
      • Quality improvement processes can be informed by the data
      • Operations can be made more efficient
      • Outcomes can be improved
      • Cost savings or revenue increases can be achieved.
    • For existing uses, look for the extent to which the potential information value is captured (e.g., data are being used to deploy staff where they’re most needed; data are being used to inform policy or operational decisions) and for opportunities to make even better use of the information (e.g., more frequent data can enable fine-tuning of staff scheduling for more timely, responsive, or efficient services). Also, try to discern reasons the data may not be used as effectively as they could be (e.g., not timely enough, lack of analysis tools).
    • For each potential new use of performance information, try to discern how management, accountability, or performance could be improved by using data in that way.
  • Common issues or problems across user groups, especially if a common solution can help multiple users or groups become more effective users of performance information, for example, if several users would benefit from the same type of data disaggregation (see the sketch below).
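
As a simple illustration of this kind of cross-user analysis, the sketch below represents each user group as a row of the summary table and flags issues reported by more than one group, which are the best candidates for a common solution. The groups and issues are invented for the example.

```python
# Hypothetical rows from a user-analysis table: each user group and
# the issues it reported with current performance information.
table = {
    "Budget office": {"data not timely", "no demographic breakdown"},
    "Field supervisors": {"data not timely", "no neighborhood breakdown"},
    "Neighborhood groups": {"no neighborhood breakdown", "jargon unclear"},
}

# Count how many groups reported each issue.
issue_counts = {}
for issues in table.values():
    for issue in issues:
        issue_counts[issue] = issue_counts.get(issue, 0) + 1

# Issues shared by two or more groups point to common solutions.
shared = [issue for issue, n in issue_counts.items() if n >= 2]
print("Candidates for a common fix:", shared)
```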


Reasons for Reporting Findings of Even Minimal Uses of Performance Information in a Positive Light

Assuming auditors see at least part of their role as helping to improve accountability and performance, they should want to encourage more use of performance measurement, not less. However, in most jurisdictions in North America below the federal level, performance measurement and management systems are not mandated but are voluntary management efforts. Audit reports that only criticize agency attempts at using performance information may therefore discourage managers from trying, leading to less performance reporting and accountability rather than more. And even where performance management or reporting is mandated, wary agency managers are likely to stick to a minimal “compliance approach” to measurement and accountability if all they see is criticism of their attempts. In either case, what little value is realized from performance measurement is unlikely to outweigh the costs.

It is therefore important for an auditor to be encouraging about substantiated uses found, even if most of the report calls attention to the need to unlock much more value from performance information by making it more useful and stimulating more uses. Those additional uses would be presented as “opportunities for improvement” in report findings rather than as negative findings. Even if an audit finds only one substantiated use of performance information for accountability or decision making, and nine missed opportunities for other effective uses, it may be best to produce a report that reads as if the glass is 10% full rather than 90% empty.


Determining Relative Cost-Effectiveness of Possible Changes in Performance Information: Some Things to Consider

  • Develop an idea of the relative costs of different possible changes to improve usefulness: Actual dollar estimates are not needed, only descriptions that convey the relative cost of one change compared with others. For example, a reporting change to provide explanations that help a user understand performance measures or data is less costly than implementing an entirely new data collection technique. Similarly, adding an extra question to a survey that is already done every year is less costly than conducting an additional survey.
  • Look for a single change that can improve the value of the information for several users, though take care to be sure it really is one change that will benefit multiple users. For example, timeliness issues may have been identified by several intended users, but addressing them may require more than one change to the measurement system because “timeliness” can mean different things to different users. Field supervisors may want more frequent data (e.g., weekly, daily) delivered quickly (e.g., next day). Meeting their need is very different from, and probably more costly than, meeting elected officials’ need to get performance updates a week or two earlier once a year before budget hearings. Audit findings and recommendations should acknowledge these differences.
  • Consider changes that provide “high leverage” opportunities to add value to performance. For example:
    • Improving the usefulness of performance data for high-budget operations is likely to provide higher leverage than doing the same for data about low-cost activities.
    • Obtaining better information on factors that drive high-priority community outcomes (e.g., as determined by elected officials or documented in a strategic plan) can be considered “high leverage” even if the budgeted government resources involved are relatively small.
    • Consider, for each program or service covered by an audit, whether changes to performance information to improve policies or strategies will offer higher leverage than changes to better inform operations. Generally, changes to improve strategy or policy development (including budgeting) or strategy implementation are higher leverage than changes at the operational level, as it’s important to be sure an organization is “doing the right things” before investing a lot of effort in operationally “doing things right.” But each case is different and generalities do not always hold true. Service operations with known deficiencies, or obvious opportunities to improve performance if better operational data become available, can offer higher leverage than informational changes that can, at best, enable minor tweaks to policy or strategy.
  • Consider changes that provide potentially important intangible benefits. For example, changes that increase elected officials’ understanding of reported performance information can increase their use of data and increase their confidence in their budget and policy decisions. Changes that make performance information more useful to residents or their interest groups (e.g., neighborhood associations) can help get more people engaged in their communities and build public trust in government. Similarly, changes to provide better information to help any key interest group the government works with (e.g., businesses, universities, nonprofits) can make partnerships more effective at addressing priority community issues that cannot be resolved by government alone.
  • Consider changes that provide or increase “open data” opportunities for “citizen scientists” and civic-minded app developers. This is a growing trend that benefits communities and their residents and has even occurred at the federal level. Instead of only making summary performance data available, as typically found in performance reports or performance reporting websites, some jurisdictions are making detailed electronic data sets available as “open data.” Anyone, or any group, with appropriate skills can access and use the data sets for civic research or create software applications that individuals or groups can use for public benefit, from enhancing public services, to increasing and improving community engagement, to finding innovative solutions to public problems, to increasing productive use of volunteers (see the sketch below).
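
To illustrate the kind of reuse open data enables, here is a minimal sketch of a citizen-scientist script. It assumes a jurisdiction has published a detailed service-request data set, downloaded here as a local CSV file named service_requests.csv with the columns shown; the file name and columns are hypothetical.

```python
import csv
from statistics import median

# Read a hypothetical open-data extract of service requests.
# Assumed columns: request_type, days_to_close.
with open("service_requests.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Group days-to-close by request type.
by_type = {}
for row in rows:
    by_type.setdefault(row["request_type"], []).append(
        float(row["days_to_close"]))

# Report the median time to close each type of request, slowest first.
for rtype, days in sorted(by_type.items(),
                          key=lambda kv: median(kv[1]), reverse=True):
    print(f"{rtype}: median {median(days):.1f} days to close")
```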
