Getting a fix on Oregon’s dead reckoning ability

The “Oregon Transparency” website – lies, damn lies, and statistics

 Dead reckoning – Part 2
 
Last week, OregonPEN published the first in what will be a series on Oregon’s “dead reckoning” ability — how well public agencies perform in terms of having a clear and accurate sense of their own performance at any given time. As any sailor knows, dead reckoning can be a life-saver or a deceiver that guides you onto the rocks and to disaster. The difference arises mainly from two things:

  1) How carefully the navigator trying to steer a course keeps track of the heading and progress actually made, without regard to hopes and wishes; and

  2) Whether the navigator overcomes human nature and honestly incorporates the signs that internal problems and external forces are producing travel in directions other than the desired course.

Whether Oregon public agencies are any good at planning for the future is difficult to know; however, it seems likely that the agencies with an accurate grasp of their past progress are the ones best positioned to plan ahead. So in this issue, OregonPEN visits a website that provides links to some of the Oregon state agency “annual progress reports” submitted to the Legislative Fiscal Office, to get a rough sense of whether agencies take the reporting obligation seriously and try to make it useful for themselves and the public.
 
What stands out when sampling this site and the agency self-evaluation reports?
 
First, the progress report format seems designed to defeat, or at least seriously deter, transparency. Individually prepared reports that do not use a common set of agency metrics prevent comparison across agencies, despite the principle of “transparency” that is supposed to animate the whole effort. Worse, the progress reports themselves are scans of static paper reports containing inconsistent low-resolution black-and-white graphics, with no live links and no connections to the underlying data. In other words, although these reports are prepared with digital tools that could provide inexpensive, real-time, vivid updates of all the key metrics, Oregon has instead chosen to force the data out of the digital realm and onto paper, making each report an information dead-end. Anyone trying to use these reports to assess one agency’s performance against other Oregon agencies faces a hopeless task.
 
Second, even the ordering of the reports works against real transparency, because the reports are presented in the least useful order possible: alphabetically. This trivial point is actually not so trivial, because the way the agency progress reports are presented has a good deal to do with how likely they are to be viewed and considered, and how easy it is for an Oregonian to turn the data into usable knowledge.
 
The Accountancy Board is not first in importance to understanding what state government is doing. Absent a better plan, the agency reports should appear according to budget size, or in budget categories, with the major agencies grouped as one category, minor agencies as another, and professional licensing boards in another. The object should be to group like with like, to promote comparisons, and to present the reports according to agency importance – how much each agency affects the people of Oregon. While importance can be difficult to measure (the DMV might be the agency with the greatest amount of contact with Oregonians), agency budget provides a reasonable index for ranking it.
 
Another problem with the presentation is that the agency listing approach provides no warning about gaps. Few Oregonians looking at the Oregon Transparency website, however long they studied it, would likely notice that the Oregon State Bar, the organization responsible for licensing and regulating attorneys, is nowhere to be found there. Are reports from other important state entities missing?
 
Third, Oregon allows all agencies to create or propose their own performance metrics. That sounds reasonable until you seek to compare two agencies that serve similar functions, such as the Board of Accountancy, the Board of Dentistry, the Board of Licensed Professional Counselors and Therapists, the Board of Licensed Social Workers, the Medical Board, the Board of Medical Imaging, and so on down the professions. While the professional licensing boards are superficially diverse, they are essentially all the same – each one is the body that is supposed to oversee and implement the regulations that the state has deemed necessary to apply to members of that profession for the benefit of the public. A real transparency effort would require all similar agencies to use a “Common Core” of performance measures, supplemented by any agency-specific measures that are appropriate because they add value in terms of helping Oregonians understand what they are getting in return for the resources spent on that particular agency.
 
Fourth, and related to the problem of allowing agencies to define their own performance measures, it is impossible to summarize the data using a common measure. That means Oregonians have no simple way to judge the performance of one agency against another, even when similar metrics are in use that could be made comparable with some effort. Compare that to the way federal auto mileage ratings create a common standard of comparison (mpg, or miles per gallon) for city and highway driving. Even though personal vehicles range from enormous SUVs and big pickup trucks to tiny two-seaters, every gas vehicle completes a standardized testing regime and is assigned a city and a highway mpg rating. Even if these ratings are off in an absolute sense, because the tests don’t reflect real-world driving conditions, they are still valuable: a would-be car buyer can rely on the ratings to rank the different cars under consideration, since every car completes the same testing.
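The point generalizes beyond cars: a shared test, even a biased one, still ranks things correctly. A minimal sketch in Python — all mpg figures invented for illustration — shows that a uniform distortion shifts every score but leaves the buyer’s ranking untouched.

```python
# Hypothetical window-sticker mpg for three vehicles (invented numbers).
rated_mpg = {"big_pickup": 18, "midsize_sedan": 32, "tiny_two_seater": 41}

# Suppose real-world driving knocks roughly 15% off every rated figure.
# Because the bias applies uniformly, the ordering cannot change.
real_mpg = {car: rating * 0.85 for car, rating in rated_mpg.items()}

rank_by_rating = sorted(rated_mpg, key=rated_mpg.get, reverse=True)
rank_by_reality = sorted(real_mpg, key=real_mpg.get, reverse=True)

assert rank_by_rating == rank_by_reality  # same ranking either way
print(rank_by_rating)  # ['tiny_two_seater', 'midsize_sedan', 'big_pickup']
```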
 
This same issue creates another problem, namely that of “standardless standards.” Without a common basis for comparison between agencies, it is impossible to assess which agency or board sets the standard to which the rest should aspire. Without common measures, there can be no cross-agency benchmarking. Without benchmarks, agency performance measures and goals reflect agency culture, morale, and history instead of realistically attainable results.
 
Wherever there are agencies and boards with common functions within Oregon, the performance measures should not just be comparable; they should actually be compared, with all the agencies ranked against each other, measure by measure, and the best-performing agencies on any given measure studied to determine what they are doing that the rest are not.
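Given a “Common Core” of shared measures, the ranking exercise would be trivial to automate. The Python sketch below illustrates the idea: the two measures echo KPMs discussed later in this issue, the TSPC figures (21 and 17 percent) come from its report, and the other boards’ numbers are invented for illustration.

```python
# Hypothetical cross-agency benchmarking on two shared ("Common Core")
# measures. TSPC's figures are from its 2014-2015 report; the other
# boards' numbers are invented.
boards = {
    "Accountancy": {"investigations_180d": 71, "applications_20d": 88},
    "Dentistry":   {"investigations_180d": 54, "applications_20d": 93},
    "TSPC":        {"investigations_180d": 21, "applications_20d": 17},
}

# Rank every board on every common measure and flag the leader, so the
# rest can study what the best performer is doing differently.
for measure in ("investigations_180d", "applications_20d"):
    ranked = sorted(boards, key=lambda b: boards[b][measure], reverse=True)
    print(f"{measure}: {' > '.join(ranked)}  (benchmark: {ranked[0]})")
```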
 
Where there are truly no in-state agencies with a similar function to compare against, each agency should be responsible for identifying out-of-state agencies to benchmark against.
 
Probably the most important problem with the Transparency website is the “streetlight problem,” named for the old joke about the drunk who, kicked out of the bar and finding he has lost his keys, looks for them only under a streetlight, where visibility is good. In this electronic world, it is relatively easy to get good visibility into activities that are easily tracked and measured. What matters, on the other hand, is how well an agency’s chosen performance measures reflect something important about the agency’s product rather than its inputs, its ends rather than its means.
 
What separates dead reckoning from merely robotic recording of course and speed changes is that the navigator continuously assesses how well reality compares to the idealized (estimated) track that would be traveled if the engine speeds ordered and the course headings chosen did not have to operate against hidden factors such as tides, currents, compass errors and wind. It is tempting for any agency to dispense with this assessment and simply measure inputs and activities – investigations opened, investigations closed, delays, etc. – without ever asking whether the problems that caused the agency to be formed in the first place are being addressed.
 
Another way to describe the streetlight problem is the old saying that “When you’re up to your ass in alligators, it can be pretty hard to remember that you were actually sent there to drain the swamp.” Report after report on the Oregon Transparency website shows this perfectly: each report presents a handful of “key performance indicators,” or KPIs, that refer only to easily-measured busyness or internal agency activity, with no reference to any real-world outcome that would be meaningful in terms of the agency’s purpose.
 
In medical research, this is the “intermediate endpoints” problem: researchers forget that the goal is health or longevity, not some intermediate measure such as lower cholesterol. In the last fifty years, American medicine became fanatical in its pursuit of lower cholesterol scores, and drug sales soared, even as heart disease and other chronic illnesses climbed apace.
 
For real transparency, the “key performance indicators” really need to be about performance in draining the swamp that the agency was created to drain, with a focus on results in the real world, outside the boundary of the given agency. And the measures actually need to be “key,” not just easily keyed into a computer. That means selecting important endpoints about things that actually matter to regular Oregonians (health, crime, environmental quality, etc.) instead of the barrage of intermediate, agency-centered midpoints such as how fast the agency responds to inquiries, issues permits, or conducts investigations.
 
A maxim of quality management is that “a crude measure of something important is better than a precise measure of something that’s not.” The Oregon Transparency progress reports are filled with precise measures of nothing important – mostly agency internal process rather than real-world outcomes that affect Oregonians – and all are presented in a way that masks deficiencies instead of providing the actual transparency that would let problems be identified and addressed.
 
Many of the Oregon Transparency reports illustrate the shortcomings noted above, as well as others not mentioned. One agency report in particular, however, exemplifies the problems with the entire project.
 
The progress report from the Oregon Teacher Standards and Practices Commission could be an “anti-Transparency” award winner: it stands head and shoulders above the rest in displaying a commitment to obscuring and excusing performance rather than presenting it fairly. Not only are all four TSPC metrics measures of intermediate process and busyness rather than of outcomes important to the agency’s stated mission, but the text accompanying the report flatly attempts to contradict the data in the report itself.
 
In any public agency, if the key performance indicators are actually important at all, then poor results cannot be ignored or minimized. In its claims that the agency is actually performing well, the TSPC report asks the reader, “Who you gonna believe, me or your own lying eyes?”

TEACHER STANDARDS and PRACTICES COMMISSION
Annual Performance Progress Report (APPR) for Fiscal Year (2014-2015)

Original Submission Date: 2015 Finalize Date: 12/15/2015

2014-2015 Approved Key Performance Measures (KPMs)

1 – PHONE/EMAIL CUSTOMER SERVICE Percent of phone calls and email responded to within 3 days.

 2 – APPLICANT CUSTOMER SERVICE Percent of completed applications processed in 20 days.

3 – INVESTIGATION SPEED Percent of investigated cases resolved in 180 days (unless pending in another forum).

6 – CUSTOMER SERVICE Percent of customers rating their satisfaction with the agency’s customer service as “good” or “excellent”: overall customer service, timeliness, accuracy, helpfulness, expertise and availability of information.

[The report omits any mention of KPMs 4 and 5.  – Ed.]

I. EXECUTIVE SUMMARY

Agency Mission:  To establish, uphold and enforce professional standards of excellence and communicate those standards to the public and educators for the benefit of Oregon’s students.

Contact:         Vickie Chamberlain

Contact Phone:         503-378-6813

1.  SCOPE OF REPORT 
Licensure and discipline functions are the agency services covered by the key performance measures. Program approval functions are not covered by the key performance measures, although reports of program site visits are public documents and available upon request.

2.  THE OREGON CONTEXT
The Oregon Teacher Standards and Practices Commission sets standards for, approves and reviews Oregon educator preparation programs including teaching; administration; school counseling; school psychology; and school social work. These standards are the context for Oregon college and university graduates’ professional educational licensure quality. The Commission issues licenses in all of the above-mentioned categories and also issues charter school registrations for charter school teachers and administrators and school nurse certifications. These Commission-issued licenses, registrations and certifications permit public school educators to work in their licensed field in Oregon public schools supported by public funds. Finally, the Commission serves as the professional practices board for public educator misconduct and has the authority to issue private letters of reproval, issue reprimands, place educators on probation, and suspend or revoke educators’ licenses as a result of professional misconduct.

The Commission partners with: Chief Education Office, Oregon Department of Education; Oregon public higher education educator preparation programs (Western Oregon University; Oregon State University; University of Oregon; Portland State University; Eastern Oregon University; Southern Oregon University); private higher education educator preparation programs (Concordia University; Corban University; George Fox University; Lewis and Clark College; Linfield College; Marylhurst University; Multnomah University; Northwest Christian University; Pacific University; University of Portland; Warner Pacific College); Oregon Education Association, Confederation of Oregon School Administrators; Oregon School Personnel Association and the Oregon School Boards Association. Since 2013 three educator preparation programs have closed: Willamette University; University of Phoenix Oregon; Lesley University.

3.  PERFORMANCE SUMMARY
The agency’s performance has increased on all KPMs.

KPM #1 (speed returning email and phone calls): Our target is 60 percent of email and phone calls returned in 3 days or less. The agency improved from 35 percent in 2014 to 48 percent in 2015.

KPM #2 (speed issuing licenses): The percentage of completed applications processed in 20 days improved from 14 percent in 2014 to 17 percent in 2015.

KPM #3 (speed from complaint to case completion in 180 days): The agency’s performance improved, climbing from 12 percent in 2014 to 21 percent in 2015.

KPM #4: The agency’s ratings of above average to excellent improved slightly, from 28 percent in 2014 to 29 percent in 2015.

4.  CHALLENGES 
The agency’s challenges have been related to staffing levels and consistency. The agency turned over the Director of Licensure position two times since 2008 and lost the position entirely during the 2013 Legislative Session. We have combined these duties with the oversight of the Professional Practices unit in the office into the Director of Licensure and Professional Practices position. This position was hired in April 2014. Staffing in the agency has been reduced from a high of 26 (with two limited duration positions) to the current staffing of 19 FTE (throughout the 2013-2015 biennium). The agency’s electronic data system is dated and breaking down, and is the only system we have been able to use this past biennium. This breakdown slows down the processing of applications due to inability to filter out complete applications from incomplete applications, causing duplication daily when reviewing new documentation and pending applications. We have been working with DAS egoverment [sic] and the NIC-USA vendor to implement an online application system. Phase One of this system is scheduled to go “live” the first month in 2016 with subsequent roll-outs of other pieces of the system throughout 2016.

5.  RESOURCES AND EFFICIENCY
The agency’s actual expenditures in 2011-2013 were $4,945,000. The 2013-2015 Legislatively Adopted Budget approved $4,939,153 in agency expenditures, slightly less than a flat-funded budget. The impact was as follows: 1. Staffing was reduced throughout the 2013-2015 Biennium. 2. Agency backlogs grew in application processing, email responses and complaint investigations. Efficiency: We were able to temporarily hire a former employee, which resulted in having a knowledgeable and experienced customer service representative for nearly a year.

[KPM graph from the TSPC report]

If these are “Key Performance Measures,” then is a 100 percent failure rate not a problem?

The first specific indicator is simply responding to phone and email contacts within 3 days. This provides TSPC with an opportunity to look reality in the eye and deny it, stating:

“We do not have actual data, but in reviewing results with many of our neighboring states (through informal conversations), it appears that even though we are not meeting our own expectations, in the educator licensure arena, Oregon’s office excels at customer service.”

[KPM graph from the TSPC report]

Uh-oh, better start spinning!


1.  OUR STRATEGY

Returning phone calls and email quickly allies [sic] licensee anxiety. It also facilitates the issuance of licenses if we are able to help the applicant make a better application. The slower we are at responding, the more people send duplicate email searching for answers (or they call). We publish statistics daily regarding numbers of calls answered as well as numbers of email pending and responded to by staff.

 2. ABOUT THE TARGETS

 An ideal target would be 100 percent in 48 hours. However, we do not have the staffing to manage this outcome. If any person is ill, we quickly become buried in the volume. A higher percentage represents a better response time to licensees.

3.  HOW WE ARE DOING
We are not doing as well as we believe we could do in this area. [Communications are email and phone calls.]

The move to assigning staff as district liaisons has been a success, communications-wise; however, we are unable to electronically track the number of phone calls that agency staff receive related to customer service on their direct phone lines. Nor are we able to track the “turn-around” time for these calls. We started this program about mid-year 2010, and have continued it into the present. Due to budget reductions, we lost five positions in licensure related to lay offs and natural attrition. Positions were not filled to reduce agency overall expenditures. Finally, due to the small size of the staff, any turn over, illness or other legitimate absence sets us back very quickly. We currently have three full time staff assigned to answer phone calls, respond to email and serve walk-in customers. This is down from a high of six public service representatives in 2007-2009. Additionally, due to cuts in other areas of the office, employees staffing phone calls and email also have to assist with opening the mail (takes two people 1-2 hours daily); data input the mail; bar code and scan the mail as well as prepare the daily bank depositions [sic] from cash (checks and money orders) received each day in the mail. These “side duties” do not allow us to fully staff the phones on any given day. Once the online application system is launched, a significant amount of the paperwork handling as well as all of the money handling will arrive in the office electronically allowing us to redirect staffing resources directly to customer service. Once we are able to reduce the side duties, we can focus on building capacity to capture messages from licensees and return them as well as have more people answer the phones throughout the day.

4.  HOW WE COMPARE
We do not have actual data, but in reviewing results with many of our neighboring states (through informal conversations), it appears that even though we are not meeting our own expectations, in the educator licensure arena, Oregon’s office excels at customer service.

5.  FACTORS AFFECTING RESULTS

Factors affecting results:

1. Reduced staffing due to reduced revenue;

2. Staff turnover (6 different people have occupied the two public service positions since July 1, 2013). Turnover has resulted in delayed response times as we hire and train new employees.

3. Bare-bones staffing results in further delays when there is illness in the office.

4. Increased volume in email due to long turn-around time to issue licenses.

5. The need to devote several hours daily to opening mail, data entering mail, manually recording money received, manually taking money to bank, scanning (imaging) documents received and associating these scanned documents to individual educators’ accounts prevents us from using these same people to answer phones and email.

6.  WHAT NEEDS TO BE DONE
 1.   Hire more staff.

2. Continue to monitor performance both good and bad.

3. Implement online application system as quickly as possible.

7.  ABOUT THE DATA

The reporting cycle is the calendar year: July 1, 2014 to June 30, 2015. The data for email are reliable. We have accurate electronic tracking of all phone calls and email through our electronic filing system. However, as noted above, we have transferred some of the workload to direct-phone access rather than sole access through the agency’s main phone line and general email inbox. That workload is not fully trackable given our current configuration for electronically collecting data and lack of ability to replace our current electronic filing system.

As with the 3-day response data, the data on the rate of licenses issued within 20 days is presented mainly to argue for additional resources; apparently, failing to meet performance targets — even key performance measures — year after year causes no introspection among management as to methods and processes. Instead, TSPC highlights its own failure to find real benchmarks while continuing to insist that all is well: “Our customer service survey respondents tell us we are generally faster than California, Washington and Arizona when it comes to issuing licenses. We do not have data on how other state agencies fair [sic] in this area.”
[KPM graph from the TSPC report]

A solid record of accomplishment not achieved

1.  OUR STRATEGY 
We have increased the number of license evaluators (people who issue the licenses) from three to five (effective May 1, 2011), but reduced staffing in other areas of the agency, due to budgeting, has affected this strategy.
 
2.  ABOUT THE TARGETS 
Originally, we developed the targets using anecdotal information. The data collected since 2006 represents actual numbers. We believed, based on the anecdotal data, that we were ambitious in adopting targets, believing it would drive us more quickly toward achieving them. The real data reveal that we were too cautious. The direction we want to achieve is a higher percentage.
 
3.  HOW WE ARE DOING 
Due to staffing trends in 2008 through 2010, the number of licenses issued dropped slightly, resulting in a gradually building backlog. By reorganizing that area of the agency, and increasing the number of people issuing licenses, we were able to reduce the backlog of unprocessed complete applications. However, due to severe budget reductions during the 2011-2013 biennium, staffing in the licensure area was reduced by one manager and two support positions.

Additionally, the reduced revenue was a result of reduced numbers of applications submitted, resulting in an opportunity for staff to catch up. Processing was averaging 20 days or less from February 15, 2013 through June 15, 2013. By October 2013, it was 16 weeks, and by January 2014, it was over 20 weeks. Persons issuing licenses frequently have to backfill answering the phones, serving walk-in customers, opening the daily mail, assisting with inputting new applications, and assisting with scanning documents received by the agency. Until we can totally reduce the backlog, we cannot gain on this target.

 
4.  HOW WE COMPARE 
Our customer service survey respondents tell us we are generally faster than California, Washington and Arizona when it comes to issuing licenses. We do not have data on how other state agencies fair [sic] in this area.
 
5.  FACTORS AFFECTING RESULTS 
1. Staffing reductions resulting in job rotations into areas outside of the general licensure area (opening mail, mail intake, deposit receipting, answering phones, serving walk-in customers, document scanning and review, etc.)

2. We went fully paperless in October 2012, which reduced the amount of time each month (several days of man-hours) spent handling paper licenses, letters, renewal notices and other correspondence.

3. Lack of ability to provide direct supervisory oversight of the licensure unit. (The Director of Licensure position was eliminated in the 2013-2015 LAB.)
 
6.  WHAT NEEDS TO BE DONE
 
1.  We have increased the number of people issuing licenses from two in 2010 to five in 2011 and 2012. During the 2015-2017 biennium, we will have 5.5 FTE issuing licenses.

2. Continue strong staffing in positions that issue licenses;

3. Continue implementation of a new online application system. This will reduce the amount of paper handling; mail that needs to be opened, money data entry and manual transfer to bank, and allow for greater focus on issuing licenses and customer service.
 
7.  ABOUT THE DATA 
Data cycle: July 1, 2014 through June 30, 2015. Strengths of the data:

1. Collected from electronic data base. Reliability: We compare the figures collected at year end to the ongoing figures collected monthly and reported to the Commission at each meeting.

The next indicator is the TSPC’s rate of closing investigations in 180 days. Again, TSPC ignores that its function is actually quite similar to that of many other regulatory and licensing agencies in Oregon, not to mention in other states. Yet, according to TSPC, “It is difficult to find agencies with similar staffing; similar procedures and similar numbers of investigations.” As if the only fair or useful comparison were with an agency doing exactly the same sorts of investigations with exactly the same number of staff.

1.  OUR STRATEGY
Our strategy to achieve this goal is to tackle the work based on urgency of the facts presented in the complaints. We work closely with the Department of Justice on discipline cases to accomplish this goal.

2. ABOUT THE TARGETS

Discipline cases should be processed as quickly as possible. Investigating a higher number of complaints in 180 days would be a sign of expeditious action. Higher is better.

3.  HOW WE ARE DOING
In 2003, the rate of resolving cases was nearly 60%.

A lower than expected performance in 2007 resulted from staff turnover and a vacancy in the full time investigator position for nearly three months. The results in 2008 reflect the addition of 3.0 FTE investigators (limited duration) to the staff.

The results in 2009 reflect 4 FTE investigators and 2 FTE support staff. Three of these six FTE are currently Limited Duration positions.

Due to the increased staffing, our performance increased sharply from 48% in 2008 to 63% in 2012. Performance this year (2011-2012) dropped to 43%.

The Commission’s workload has been high, as follows (number of cases considered, i.e., investigations reviewed and final order entered):

2009: 271

2010: 285

2011: 265

2012: 260

2013: 248

The number of complaints received annually continues to remain high with 291 cases of alleged misconduct in 2012, 260 cases in 2013, and 259 cases in 2014. This compares to 135 cases reported in 2004.

4.  HOW WE COMPARE
 No data at this time. It is difficult to find agencies with similar staffing; similar procedures and similar numbers of investigations.

5.  FACTORS AFFECTING RESULTS

1.   Staff turnover resulting in needing to train new investigators. (One investigator resigned following a marriage out of state, and another investigator was deployed to Afghanistan soon after he was hired and has not yet returned.)

2. Staffing with temporary employee.

6.  WHAT NEEDS TO BE DONE
Continue to focus on serious cases and delay negotiations for settlement until after the commission considers the evidence.

7.  ABOUT THE DATA

Data reported is from July 1, 2013 through June 30, 2014.
Strengths of data include:
1. Have been collecting this data since 1997.

Weaknesses of the data:

1. Does not reflect the variability of staffing, case complexity, and other measures that would impact results.

Reliability: Data has been compiled and collected by one person over the past 12 years.

The graphic for the final TSPC performance indicator is a candidate for a special “How to Lie with Excel Graphs” award – one must not merely glance, but look carefully at the graph to realize that the right-most bar is not the latest result but the claimed target, presented as though it were just another bar on the chart. And this leaves aside the quality of the data, which seems extraordinarily weak, as the text connected to the graph acknowledges.
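By way of contrast, the honest alternative is simple to produce with the very tools the agency already uses. A minimal sketch in Python with matplotlib — all numbers invented except the 28 and 29 percent figures quoted from the report — draws the results as bars and the target as a labeled reference line, so no reader can mistake an aspiration for an achievement.

```python
import matplotlib.pyplot as plt

# Yearly customer-satisfaction results (invented except 2014-2015,
# which echo the 28% and 29% reported by TSPC) and a hypothetical target.
years = ["2011", "2012", "2013", "2014", "2015"]
results = [24, 26, 25, 28, 29]  # percent rating service above average
target = 50                     # hypothetical target, NOT a result

fig, ax = plt.subplots()
ax.bar(years, results)
# The target is a dashed reference line, not a bar, so aspiration
# stays visually distinct from achievement.
ax.axhline(target, linestyle="--", color="red", label=f"Target ({target}%)")
ax.set_ylabel("Percent rating service above average")
ax.set_title("KPM: customer service (illustrative data)")
ax.legend()
fig.savefig("kpm_customer_service.png")
```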

According to the agency itself, even cherry picking the positive data is not enough to bring the overall satisfaction of customers (teachers) out of the range of abysmal; nonetheless, TSPC determines that the data is good because it “gives general perception of agency’s above average performance.”

1.  OUR STRATEGY 
Our strategy is to improve our customer service, thereby improving the results. In October 2008, we added a comment box to our customer service survey. The results were much more valuable than the Lickert [sic – Likert] scale rating system.

2.  ABOUT THE TARGETS 
Based on the low performance of 2006, the targets were set high to encourage improvement in the evaluation of our performance.
 
3.  HOW WE ARE DOING 
We expect to improve these ratings with additional staffing and a new online application system that will be implemented throughout the 2015-2017 biennium.
 
4.  HOW WE COMPARE 
No data.
 
5.  FACTORS AFFECTING RESULTS 
1. Reduced staffing;
2. Slow licensure processing;
3. Reduced number of people answering phones and email; and
4. We are not able to capture accurate phone data due to lack of capacity to save messages and return messages to the public.

 

6.  WHAT NEEDS TO BE DONE 
1.   Increase staffing on phones and email sufficiently to allow for ability to capture and return messages left by licensees; (approved more staffing in 2015 Legislative session)
2. Reduce application backlog
3. Keep information clear and accessible
4. Implement new online application system (will allow the agency to significantly redirect staff to phone, emails and processing licenses.)

 
7.  ABOUT THE DATA 
Reporting cycle: July 1, 2014 through June 30, 2015

Data: We only “count” ratings “above average” or “excellent” in the Overall performance question from our Customer Service surveys.

Strengths: gives general perception of agency’s above average performance. Weaknesses: Only 22% of all people who were issued a license in 2014-2015 (4,130 out of 18,772) responded to the survey.