Comment

Spatial disorientation

Posted: April 29, 2016 at 8:50 am

Is Shire Hall run efficiently? By this I mean, how does the County compare to other municipalities? Does it use its resources well? Are its operations run efficiently? How does it rank among its peers? What about productivity? Are we getting enough out of our human resources? Are we improving? Or getting worse? In what areas, services or programs do we need to do better? In what ways is our municipality outperforming the average? Or behind the curve? And why?

The fact is, we don’t have a clue. And sadly neither does Shire Hall. Any data or metric comparing this municipality’s performance with another’s is, at best, selective: assembled for a specific purpose, providing a glimpse of a narrow sliver of the business at a moment in time.

How has this come to be? How can we know so little about the relative performance of local government? Shire Hall will spend $57 million this year, and we have no objective way to measure whether it will be spent well.

How do we hold our managers accountable when we have no means by which to measure performance? How do we ask our representatives to govern effectively when they lack the metrics to know if the business is being operated leanly or extravagantly?

This is not a comment on professionalism. The folks I encounter at Shire Hall are unquestionably professional and competent.

It is an altogether different matter, however, to measure organizational efficiency and productivity. In this regard, we are utterly and helplessly in the dark.

Without metrics and benchmarks to regularly compare and rate performance, even the most professional managers are flying by the seat of their pants. I use this phrase deliberately. In the early days of flying, before navigational aids, planes were often observed flying out of dense clouds upside down, the pilot completely unaware of the plane’s orientation or proximity to the ground. Even after the advent of life-saving instruments, including the altimeter and attitude indicator, many pilots refused to use them, preferring to fly by the seat of their pants. Sadly, many flew blissfully into the ground at full throttle, unaware of their impending peril.

We, too, lack critical navigational aids.

Look at it this way: if you were investing the equivalent amount of your taxes into a business, you would know a great deal more about that business than you do about the County’s operations. You would know, for example, the average cost to process a cubic metre of sewage. Comparing your cost to the average would provide a rough measure of the relative efficiency of the process and therefore, the business. It would not tell you everything—but it would be a starting point to ask questions. To look at areas for improvements. To streamline processes. To understand the differences.

If the County were your business, you would know the average cost to maintain 100 kilometres of class four road. You would know how much an average bridge costs. A culvert replacement. You would know the average cost of running a municipal arena in Ontario.

And when your costs deviated significantly from the average, you would ask questions. Pointed questions. Specific questions. Not vague expressions of discontent, but rather clear and direct questions about what had changed and why.

But we can’t do that with our local government because we don’t measure performance.

We used to. But we don’t anymore.

The Municipal Performance Measurement Program (MPMP) was initiated in 2000 by the province as a means to make local governments more accountable to ratepayers. Municipalities were required to report a wide variety of costs in comparable units. It wasn’t a particularly good tool. Given the variability in size, kind and scope of duties of municipalities across the province, it was often difficult to compare apples to apples under the MPMP. But it was something. A foundation upon which better tools might have been formed.

In 2013, the last year this data was presented, it cost $3,417.82 to maintain and operate one lane kilometre of hard-top roadway in the County. In Quinte West, it cost $1,496 per lane kilometre. A big difference. Four years earlier, the County’s cost was a third of what it was in 2013.

It raises many questions. Why did the County’s costs rise so rapidly? Why does it cost so much more to maintain roads in Prince Edward County than in Quinte West?

There may be solid reasons explaining this wide deviation. But we don’t know what they are. We may never know.

We don’t keep track of this data any longer. In 2013, the province dropped its requirement that municipalities make this data public. Last year, it killed the Municipal Performance Measurement Program outright.

We are in the dark—flying without instruments.

rick@wellingtontimes.ca

 

 
