Organizational Entrenchment. Another area of importance is organizational entrenchment: metrics should be cascaded up and down the organization so that achievement of lower-level key performance indicators ensures the organization reaches its goals. It is also critical to achieve organizational buy-in, which requires that each organization accepts the metrics and targets.
“This was easy for us because we keep so many statistics,” Chambers said. “We reviewed our statute and mission statement and the statistics that we keep.”
Statistics kept by the state’s fleet department are available on Utah’s fleet Web site, http://fleet.utah.gov. These reports include:
Rate matrix.
Surplus.
Fiscal year.
State vehicle.
Legislative updates.
Web real-time reports for:
Accidents and preventable accidents.
Fuel management and fuel throughput for state-owned vehicles.
Cost-per-mile exceptions.
Utilization by equipment.
Utilization by class/code and/or mileage.
Operators.
Take-home vehicle list.
Step 2. Filter to ‘Vital Few’ Indicators
It was time to filter down to the “vital few” key performance metrics, which can be a challenging step in the process for fleets that track a large amount of data.
“The BSC suggests fewer than 15 key indicators, so this was the difficult part — to take all the data that we track and narrow it down to the indicators that drive our division,” Chambers said.
So, Chambers and her team set out to filter down to their “vital few” metrics by breaking into groups. They first discussed the key activities (drivers) that most affected their success. Then, they reviewed and proposed existing metrics.
Chambers’ team analyzed four key areas:
Brainstorming additional ways to measure performance on key activities.
Screening metrics using “basic guidelines” and identifying the “vital few.”
Discussing whether metrics are linked, controllable, and balanced.
Developing hypotheses on which metrics are “right.”
Step 3. Set Targets and Identify Secondary Drivers and Metrics
During the next phase of the project, Chambers set targets and identified secondary drivers and metrics. At this point in the process, Chambers emphasized the importance of making sure the scorecard’s guiding principles are on track.
“It is important to set challenging targets,” Chambers said. “If you choose targets that you can reach without effort, you are missing the point of the process.”
Step 4. Draft Scorecards and Targets
Chambers made sure the appropriate people reviewed and signed off on all metrics and drivers, including UPP and the department director.
Next, Chambers drafted the scorecards and targets. The scorecard design included several categories, including metric, target, frequency, beginning, previous, current, status, and trend. It is vital that scorecards are easy to read and understand.
“This step is to finalize the key performance indicators and capture the beginning measure,” Chambers said. “For example, under Process Excellence — Effectiveness, we used vehicle compliance percentage for PMs and recalls. Under Process Excellence — Efficiency, we used average full-lease vehicle cost per mile.”
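As a rough sketch of the scorecard row described above — the field names come from the categories the article lists (metric, target, frequency, beginning, previous, current, status, trend), but the status and trend rules, the numbers, and the “higher is better” flag are assumptions for illustration, not the department’s actual formulas:

```python
from dataclasses import dataclass

@dataclass
class ScorecardRow:
    """One line of a balanced scorecard. Status/trend logic is assumed."""
    metric: str
    target: float
    frequency: str          # e.g. "monthly"
    beginning: float        # baseline captured at rollout
    previous: float
    current: float
    higher_is_better: bool = True  # assumed; False for cost-type metrics

    @property
    def status(self) -> str:
        on_target = (self.current >= self.target if self.higher_is_better
                     else self.current <= self.target)
        return "on target" if on_target else "off target"

    @property
    def trend(self) -> str:
        if self.current == self.previous:
            return "flat"
        improving = (self.current > self.previous if self.higher_is_better
                     else self.current < self.previous)
        return "improving" if improving else "declining"

# Hypothetical numbers for the two example metrics Chambers cites.
pm_compliance = ScorecardRow("Vehicle PM/recall compliance %", 95.0,
                             "monthly", 88.0, 91.0, 93.0)
cost_per_mile = ScorecardRow("Avg. full-lease vehicle cost per mile", 0.45,
                             "monthly", 0.52, 0.49, 0.47,
                             higher_is_better=False)
print(pm_compliance.status, pm_compliance.trend)  # off target improving
print(cost_per_mile.status, cost_per_mile.trend)
```

A row like this maps directly onto one line of an Excel scorecard, with status and trend computed rather than entered by hand.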
Step 5. Implement and Execute a Balanced Scorecard
Finally, Chambers implemented and executed the balanced scorecards by:
Communicating. Chambers communicated the plan during several manager meetings by explaining the balanced scorecard process and the department guidelines. “The managers were involved in
developing the key performance indicators,” she said.
Training. Chambers made sure everyone understood how each
metric was calculated and where to get the data.
Detailing roles and responsibilities. “We made assignments for who was responsible for submitting each metric and when it was due,” Chambers said.
Outlining the management process. Chambers recommends using
the scorecard in weekly and monthly meetings. Chambers’ scorecard is
reviewed every month in a one-on-one meeting with the department
director.
Determining the IT platform. According to Chambers, it is critical to determine how the statistics will be gathered and what software will be used for the scorecard. Chambers uses Web and Microsoft Access reports to collect the data, as well as Excel for the scorecard.
“Our scorecard is submitted to the department of administrative services every month,” Chambers said. “Several of our indicators roll up to the department scorecard, which is submitted to the governor.”
Fleet and Surplus Property Lowers Overhead Costs
Since implementing the balanced scorecard 18 months ago, fleet and surplus property’s overhead costs have dropped significantly, according to Chambers. She has also seen resale value increase. “We
noticed that we were getting more for vehicle sales when we sold the vehicle at our location as opposed to sending the vehicles to our outsourced vendor,” she said. “We have changed our process to keep the vehicle on our lot longer to get the higher resale.”
The balanced scorecard initiative has been well received in Chambers’ division. “We have seen results from using it,” she said.
What’s Next for Utah’s Fleet and Surplus Property Division
Because the department’s balanced scorecard was implemented from the top down — from the governor and department of administrative services — fleet and surplus’ next step is to define scorecards for each section of its division, including fleet, fuel, surplus, and
administration.
“Each of the section scorecards will roll up to the division scorecard,” Chambers said. “We are working to get the employees to take ownership of the scorecard. Performance plans will show responsibilities that will roll to the section scorecard.”
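The article names the sections that will roll up to the division scorecard (fleet, fuel, surplus, and administration) but not the roll-up formula; as a minimal sketch, assuming each section produces a single score and the division takes a simple mean:

```python
# Hypothetical section scores; the real scorecards carry many metrics each.
section_scores = {
    "fleet": 82.0,
    "fuel": 90.0,
    "surplus": 76.0,
    "administration": 88.0,
}

# Assumed roll-up rule: unweighted average of the section scores.
division_score = sum(section_scores.values()) / len(section_scores)
print(f"Division scorecard: {division_score:.1f}")  # 84.0
```

A weighted average, or rolling individual metrics up rather than whole-section scores, would be an equally plausible design.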
Chambers is also developing a “visual scoreboard” for each section of the scorecard, which will help track the department’s progress over a 12-month period. The visual scoreboard assigns points using a baseball metaphor: balls (fleet positives), strikes (fleet negatives), and outs (three outs equal zero points).
The scoreboard will also highlight vehicle compliance percentage, vehicle utilization, and cost per mile. Teams can score up to three “runs” per inning (or month).