
Progress — by Numbers

Bringing A Sense of Accountability to the ‘Plan’

Pioneer Valley Planning Commission Executive Director Timothy Brennan

Since it was first blueprinted in 1994, the region’s Plan for Progress has identified growth strategies for the Pioneer Valley, and helped to keep area business and civic leaders focused on the proverbial big picture. What had been missing from the equation with the ‘plan,’ according to some involved with it, was a method for actually measuring progress with regard to those strategies. A recently implemented accountability system was designed with the specific goal of filling that void.

Timothy Brennan equated it to checking the gauges on a car’s dashboard.

“We need to be able to see if we have enough gas to get where we want to go,” he explained. “We need to see if everything’s working the way we want.”

It was with this well-thought-out analogy that Brennan, executive director of the Pioneer Valley Planning Commission, summed up an elaborate effort to add a strong measure of accountability to the region’s so-called Plan for Progress and the many strategic components embedded within it.

The ‘plan,’ first drafted in 1994 as a road map of sorts for guiding the Valley out of the seemingly endless recession of the early ’90s, has evolved over the years, but its basic mission has remained the same: to give the region focus points for growth and economic development that will enable it to effectively compete against other economic-development regions across the country.

The document, which has been updated and expanded for its 10th anniversary, has no fewer than 14 individual strategies — from improving and enriching K-12 education to “enhancing high-tech and conventional infrastructure” to “revitalizing the Connecticut River.” A small army of area business and civic leaders has been assembled to take on these strategic components and develop action plans for each.

What was missing from the equation, according to some involved with the plan, was a measure of accountability, or a means to measure the progress being made with each of these strategic points — or not being made, as the case may be.

And what has emerged over the past several months is a system that fills that void, said Brennan, who told BusinessWest that it uses numbers, not words, to gauge (there’s that word again) whether the region is moving forward on a specific issue, going backward, or remaining in neutral — another automotive term.

“We didn’t want to use words alone to measure progress,” he said, noting that there was a certain subjectivity to the one-paragraph narratives that had been used to create “progress reports,” for lack of a better term, in the past.

“We wanted to do this in a metric fashion to give it a harder edge.”

To illustrate how the new accountability system works, Brennan and Molly Jackson-Watts, the PVPC’s Regional Information and Policy Center manager, focused on one of four larger groupings of strategic components within the plan — in this case “Strategic Grouping III: Supply the Region with an Educated, Skilled, and Adequately Sized Pool of Workers.” Within this group are four of the plan’s 14 strategy points:

  • Integrate workforce development and business priorities;
  • Advance early-education strategy at state and regional levels;
  • Improve and enrich K-to-12 education; and
  • Support higher education and retain graduates.

There is also a list of six so-called “indicators,” ranging from average MCAS test scores (including breakouts for the region’s urban core and rural school districts) to the median age of the region’s workforce to the number of older workers (55 to 75 years old) who remain engaged in the region’s workforce.

These and others are all telling statistics, said Brennan, who noted that for some, including most MCAS scores, the region is trending down, while for others, like the number of older workers, the region is gaining ground.

What does it all mean? Well, that’s open to interpretation, said Brennan, and also subject to comparisons with other regions similar to this one. Indeed, part of any attempt to quantify progress is to put the numbers in perspective, and that is the next challenge for those involved in this initiative.

“We knew that there was another piece of this coming down the road,” he explained. “It involves not just issuing ourselves a report card, but taking our report card and putting it against peer regions. That’s the next thing we have to do.”

In this issue, BusinessWest takes an in-depth look at the plan’s new accountability system, and focuses on the broad ‘workforce strategy’ grouping to show how it works and why it’s important to the program’s success.

Statistics Course

Brennan told BusinessWest that the search for a system of accountability for initiatives like the Plan for Progress is not exactly a recent phenomenon. It’s been a topic of discussion nationally, at meetings and conferences involving agencies (like the PVPC) that fall under the auspices of the federal Economic Development Administration.

“They’ve been on this for more than six years, since the start of the Bush administration, really,” said Brennan. “They kept saying that these planning efforts across the country needed to have a more rigorous measurement system. So I’d go to these meetings, put my hand up, and say, ‘got any examples that you can show us to help us along?’

“The standard answer, which almost became comical, was, ‘no … you go figure it out,’” he continued. “They’d say, ‘this is what we want, but we haven’t figured it out yet — you go find a way.’”

So he and some of the others involved with the Plan for Progress did, although Brennan admits to being somewhat defensive the first time it was suggested that the plan needed more accountability. Such prompting came from several members of the initiative’s executive board, especially former Stanhome CEO John Gallup.

“He challenged everyone,” Brennan said of Gallup. “He said, ‘we have a plan, and we’re trying to follow it, but it’s time to reach beyond what we had been doing and do it better.’ In general terms, he said, ‘is there a better way for us to be accountable than what we have now?’”

Over the course of several months, a team involved with the plan pieced together that better way, an accountability system that essentially gauges progress and awards a score, or rating. Such ratings are ‘1,’ ‘2,’ or ‘3,’ to connote a negative trend, a neutral trend, or a positive trend, respectively.
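
To make the arithmetic concrete, here is a minimal sketch of how such a three-point trend scale could roll up into a grouping rating. The indicator names, the one-percentage-point tolerance, and the simple averaging below are assumptions for illustration only, not the PVPC’s actual methodology.

```python
# Illustrative sketch only: assumes each indicator is scored 1 (negative trend),
# 2 (neutral), or 3 (positive) based on its year-over-year change, and that a
# grouping's rating is the simple average of its indicator scores.
# Indicator names and values below are hypothetical placeholders.

def score_trend(change_pct: float, tolerance: float = 1.0) -> int:
    """Map a percentage-point change to a 1/2/3 trend score."""
    if change_pct > tolerance:
        return 3   # positive trend
    if change_pct < -tolerance:
        return 1   # negative trend
    return 2       # neutral trend

def grouping_rating(changes: dict) -> float:
    """Average the indicator scores into a single grouping rating."""
    scores = [score_trend(pct) for pct in changes.values()]
    return round(sum(scores) / len(scores), 1)

workforce_examples = {
    "indicator A (change, % pts)": 1.7,
    "indicator B (change, % pts)": -4.2,
    "indicator C (change, % pts)": 0.3,
    "indicator D (change, % pts)": 6.0,
    "indicator E (change, % pts)": -0.5,
}

print(grouping_rating(workforce_examples))  # prints 2.2 for these placeholders
```

Real indicators often run in opposite directions (a rising dropout rate is bad news, for example), so an actual system would also need a per-indicator sense of which direction counts as progress.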

Going into this exercise, Brennan said that those involved with it had several fears or concerns, but two really stood out.

First, there was acknowledgement that individuals and groups can manipulate numbers to show virtually whatever they want to show. Plan for Progress leaders wanted to avoid such appearances, and focused on objective questions, or statistical points, that would reduce or eliminate such doubts. Second, there was the fear that by putting numbers, or scores, out for everyone in the region to see, there would be a focus on the negative, which is something else that organizers wanted to avoid.

Overall, architects of the new system wanted something intrinsically simple, yet effective, and conducted a good amount of research to achieve that end.

“We had done a fair amount of study into systems of measurement in other metropolitan areas, and it helped us identify what we wanted — and didn’t want,” Brennan explained. “We came across one in Cleveland, for example, that had 300 indicators. They spent millions of dollars on it, and it lasted about two years before it melted down. So we knew we didn’t want to go with something that was so laborious that you couldn’t maintain it.”

Developers of the new system also wanted the results to be readily accessible, so they put a new section on the PVPC Web site (www.pvpc.org) called ‘The Plan for Progress/Region Wide Performance Indicators Summary,’ which is updated as new data is available, said Jackson-Watts.

Number of Possibilities

As they pieced together their accountability system, organizers settled on four strategy groupings. In addition to the ‘workforce’ category, which Brennan said is perhaps the most critical set of issues facing the region, there are others titled ‘Strengthen and Expand the Region’s Economic Base,’ ‘Foster Means of Regional Competitiveness,’ and ‘Enhancements Fostering the Region’s Business Climate and Prospects for Sustainable Economic Growth.’

Each grouping has three or four of the strategic components that were set down within the expanded, revamped plan. The ‘expand the economic base’ grouping, for example, has three strategic elements — ‘attract, retain, and grow existing businesses and priority clusters,’ ‘promote small businesses and generate flexible risk capital,’ and ‘market our region.’

The ‘workforce strategy’ grouping, which is focused primarily on the matters of supply and demand with regard to skilled labor, now and for the foreseeable future, produced some fairly mixed results for the region — in this case, Franklin, Hampden, and Hampshire counties — as a visit to the ‘performance indicators summary’ section of the PVPC Web site clearly shows.

Overall, the section earned a 1.9 rating, showing a neutral trend, but there was great fluctuation in the numbers for individual indicators. While the percentage of students scoring proficient or above on the MCAS third-grade English test increased 1.7% between 2006 and 2007, for example, the percentage of students passing the MCAS math test (administered in 10th grade) decreased 4.2% across the Valley between 2005 and 2006, the latest numbers available. Meanwhile, the high school dropout rate for the region was virtually unchanged (up 0.9%), while the percentage of high school graduates in the workforce among those ages 25 or older increased nearly 6%.

What do these dashboard gauges, as Brennan called them, show at this time? In some cases, as with the MCAS math scores, they show where work needs to be done, he explained, adding quickly that it is generally difficult to extract meaningful findings unless or until the numbers are compared to peer regions.

In the meantime, though, the numbers provide a base on which future years can be assessed, giving the region some direction in matters such as workforce development, which will be a challenging realm as Baby Boomers retire and the smaller generations that succeed them are asked to step up.

“There is a major focus on supply, in this region and everywhere else, and with good reason,” said Brennan. “That’s because we know there’s going to be a huge exodus of workers, in probably 2010 or 2011, when people will hit age 65 and reach retirement, or at least soft retirement. One of the big questions is whether we’re going to have worker shortfalls in the New England states, and most of us think there will be.”

Numbers gathered in the ‘workforce’ grouping can help area development leaders gauge how the region will fare against the supply challenge, he continued, by offering indications on everything from the sheer number of bodies in the workforce to the overall quality of that constituency.

“We’re in a new economy, a global economy, and we’re not chasing smokestacks anymore, we’re chasing talent — that’s what makes an economy grow,” he said, adding that the new accountability system should help the region assess just how it’s faring in that chase.

Summing things up, Brennan said the assembled numbers hold up a truly objective mirror to the region and its strategies regarding growth and competitiveness.

“The idea was to try to be as candid as possible — tell the good stuff but also tell the bad stuff to make it as believable as possible,” he explained. “But also, don’t be afraid — tell it like it is and hope, particularly with those things that we’re not doing so well on, that we have something to shoot for.

“At the same time, the numbers can help us clear up some of the misconceptions about this region,” he continued. “There are many out there, and the numbers can help separate fact from perception.”

Off the Charts

When asked to elaborate on what numbers can do that words can’t do, or do as well, when it comes to gauging progress, Brennan said the numbers hit harder and speak in a louder voice, one that’s much more difficult to ignore.

“By going to numbers, you’re forcing yourself to get more rigorous,” he said, referring to those involved with carrying out the plan. “It’s less easy to weasel out of problems, but it’s also easier to celebrate successes.”

Like the plan itself, its accountability system is a work in — and about — progress. It is already showing great promise as a way to show area economic developers when to hit the gas, and what problems or issues they may confront further down the road.

And that’s what dashboard gauges are for.

George O’Brien can be reached at [email protected]