Wednesday, December 31, 2008

Aggregating Indicator Scores

To measure the performance of your organization in a certain area, you will typically use a set of indicators. These indicators may or may not cover the entire area you are trying to measure, may contain indicators of short-term or long-term progress, etc. Anyhow, you'll have a set of indicators that you have chosen to represent an area of management, a process, an activity, etc.

So indicators are a set of metrics. You may have something like this to measure client service:



Indicator | Actual Value | Target
Percentage of pizzas delivered within 30 minutes | 90% | 100%
Percentage of calls answered within 2 minutes of entering the queue | 80% | 100%


Now, to get an aggregate score for client service, you could just take the average of the 2 indicators, which would give you (90+80)/2=85. However, you may decide that the indicators don't all have the same importance, so they shouldn't all have the same weight. Let's say people hate waiting in a telephone queue, but won't notice if their pizza is 2 minutes late. In that case, the indicator for call wait time is more important, so we'll give it a weight of 70%, and we'll give a weight of 30% to the pizza delivery time. That gives us a score of (90*0.3)+(80*0.7)=27+56=83.
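The weighted aggregation above can be sketched in a few lines of Python (a minimal illustration using the example's values):

```python
# Weighted aggregation of the two indicator scores from the example.
indicators = {
    "pizzas delivered within 30 minutes": (90, 0.3),
    "calls answered within 2 minutes": (80, 0.7),
}

def aggregate_score(indicators):
    # Sum of actual value * weight across all indicators.
    return sum(value * weight for value, weight in indicators.values())

score = aggregate_score(indicators)   # ≈ 83
```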

A few notes on this:

Be careful with the units you use. In the example, we used 2 percentages with the same target, so we know they'll be fairly close and fairly comparable. But if you were measuring something like the number of units sold and the average call wait time in minutes, your units would be too different to compare directly. What can you do? Compare each result to its target. That will give you 2 results in "percentage of target achieved", which can then be directly compared to one another. If you use that method, setting meaningful targets becomes essential if you want your aggregate indicator score to be meaningful and useful.
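A sketch of the percentage-of-target approach (the indicator values below are hypothetical, not from the post):

```python
# Normalize dissimilar indicators by expressing each as a percentage
# of its target (hypothetical values for illustration).
def percent_of_target(actual, target, lower_is_better=False):
    # For "lower is better" indicators (like wait times), invert the ratio
    # so that beating the target still scores above 100.
    if lower_is_better:
        return target / actual * 100
    return actual / target * 100

units_sold = percent_of_target(900, 1000)                   # 90.0
wait_time = percent_of_target(3, 2, lower_is_better=True)   # ≈ 66.7
```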

In the example, the weights used add up to 1. They don't necessarily have to, but having a score with an understandable maximum (100 in this case) makes it more intuitive. The resulting aggregate indicator score in the example is not in a particular unit: all we know is that its maximum is 100. There are times when, because of either your indicator or your target, your result may exceed 100. There is nothing wrong with that, but it highlights the importance of explaining how you go about measuring your performance, and how your data should be interpreted.

Finally, defining weights is a tricky exercise, and some managers may abuse this system by assigning low weights to indicators on which they know they will perform poorly. Another aspect to consider is that you may want to assign low weights to indicators for which the results are not very reliable.

Thursday, November 20, 2008

Speech from the Throne

The Speech from the Throne and the Prime Minister's speech provide the foundation for the government's legislative priorities and agenda for the current session of Parliament. These priorities should be reflected, where applicable, in a department's Report on Plans and Priorities (RPP).

Speech from the Throne: http://www.sft.gc.ca/eng/media.asp?id=1383

Address by the Prime Minister in Reply to the Speech from the Throne: http://www.pm.gc.ca/eng/media.asp?id=2318

Thursday, October 30, 2008

Correlation and Causality

Correlation is not causality; they are two different concepts.

Correlation

Correlation is a relationship between variables. When the value of X goes up (or down), the value of Y goes up (or down) in a predictable way. The height and weight of a person are correlated. Their eye color and their weight are not.
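To make this concrete, here is a sketch of the Pearson correlation coefficient computed from first principles; the height/weight figures are invented for illustration:

```python
# Pearson correlation coefficient from first principles
# (hypothetical height/weight data for illustration).
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

heights = [150, 160, 170, 180, 190]   # cm
weights = [55, 62, 70, 79, 88]        # kg
r = pearson(heights, weights)         # close to +1: strong positive correlation
```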

Causality

Causality is a cause-effect relationship between variables. A change in the value of X is the cause of a change in the value of Y. For example, viruses make you sick. Be careful not to confuse the cause and the effect: you sneeze because you have a cold, but you don’t have a cold because you sneeze.

Proving a cause and effect relationship is difficult, as all other variables must be controlled. It is also possible for an observation to have more than a single cause; the change of the price of a stock is an example. Normally, the change in the variable causing a change in the other is observed before the change of the value of the dependent variable.

Wednesday, October 29, 2008

The Management Accountability Framework (MAF)



The Management Accountability Framework (MAF) is a framework used by the Treasury Board Portfolio to assess the quality of management in departments. It is structured around 10 elements: Public Service Values; Governance and Strategic Directions; Policy and Programs; Results and Performance; Learning, Innovation and Change Management; Risk Management; People; Stewardship; Citizen-focused Service; and Accountability. Indicators are defined for each element and are used to measure performance in each area.

More information on the MAF is available on the Treasury Board of Canada Secretariat's website: http://www.tbs-sct.gc.ca/maf-crg/index-eng.asp

Tuesday, September 2, 2008

Dimensions of an Outcome

Outcomes are normally part of a performance measurement framework of one type or another. Most often, they will be used in the public sector or the non-profit sector to explain how an organization’s activities contribute to society. They might also be used at a lower level to measure the implementation of a strategy in the private sector. In that particular case, an organization would be measuring the outcome of a strategy, although in the terminology generally used in public sector performance measurement, this would be closer to an expected result.

Regardless, the purpose of this post is to clarify a perceived ambiguity surrounding outcome levels. In performance measurement literature, different levels of outcomes are often mentioned, such as immediate, intermediate, long-term and final outcomes. The descriptions given usually revolve around time and impact on society.

However, to clearly define outcomes, they need to be perceived through at least 3 dimensions:

  1. reach or societal impact,
  2. time (frame, lag, or delay) and
  3. attributability or responsibility

The reach or societal impact can be generally conceived of as the “societal importance or value” of the outcome. For example, “reducing the number of sick Canadians” may be an outcome, but “healthy Canadians” is a broader, and further reaching one.

The time dimension is a little more complex, because more things can be measured here. For example, an outcome could be defined as a desired end-state. In that context the time dimension would refer to the time required to bridge the gap between the current state and the desired end-state. The time dimension can also be important in a context where an organization’s action will only have an impact on the outcome after a period of time.

The attributability of an outcome to the organization, or the responsibility of the organization for the outcome, must also be considered. Attributability can be defined as the amount of “credit” an organization can take for the change in the outcome. Most often, not all change in an outcome can be attributed to the actions of an organization. The concept of attributability is closely linked to the concept of causality: a change in the outcome is attributable to the organization if the organization’s actions are the cause of the change. Responsibility, however, is a different concept. Where attribution is when an organization appropriates changes in an outcome, responsibility is when an organization is made responsible for an outcome, or if you prefer, is mandated to have an impact on the outcome. However, attributability of the change in the outcome still remains to be proven, even for organizations with clear responsibilities. For example, the Bank of Canada has an agreement with the Government of Canada regarding target inflation rates. To a certain extent, it is responsible for the rate of inflation. The question in that case is: what level of change (or lack thereof) in the inflation rate can the Bank take credit for?

Although it has not been included with the 3 other dimensions, the measurability of an outcome should always be considered. It is hard to measure the performance of a set of actions if the change in the outcome itself is not measurable. An unmeasurable outcome will also lead to questions and debates about methods and approaches, and may lead to questioning of the value the organization brings to society.

Thursday, August 21, 2008

Cognos Help Resources

For those of you interested in Business Intelligence (BI) software, here are a few links relevant for IBM Cognos 8 BI:


Cognos

Most of the support or help information on the Cognos site requires a login and password.

Supportlink is published frequently and includes some interesting tips and techniques
http://support.cognos.com/supportlink/

The main Cognos support site, the Knowledge Base is a useful tool
http://support.cognos.com/en/support/index.html

Customer Resource Center - Report Author Section contains more detailed documents on different subjects and techniques
http://support.cognos.com/en/resources/roles/gcs_3.html


COGNOISe

A Cognos-centered community; here's the link to the forums:

http://www.cognoise.com/community/


ITtoolbox

Some Cognos related forums, take a look at the forum list for other products

http://businessintelligence.ittoolbox.com/groups/technical-functional/cognos8-l
http://businessintelligence.ittoolbox.com/groups/technical-functional/cognos-l


Tek-Tips

Another Cognos related forum

http://www.tek-tips.com/threadminder.cfm?pid=401

Monday, August 18, 2008

DPR Requirements Relating to Government of Canada Outcome Areas

When writing the Departmental Performance Report, a link between the department’s strategic outcomes and the Whole-of-Government Framework must be established, including an explanation of how the department’s strategic outcomes are aligned with Government of Canada outcome areas. This should be done in Section 1: Overview, under the Summary Information heading.

From the Template Instructions for Departmental Performance Reports (http://www.tbs-sct.gc.ca/rma/dpr3/06-07/instructions/instructions_e.asp):

“a summary status on the department’s performance in achieving their strategic outcome(s) and program activity expected results. The Summary Information table is mandatory and must be followed by a narrative section. The narrative section is to provide an overall description of the department’s performance for 2007–08. All key elements provided in the summary table must be explained. This section should provide the department’s overall performance in relation to the previously set priorities; indicate the progress made towards departmental strategic outcomes and how it is supported by the program activities; and outline how the departmental strategic outcomes contribute to broader government-wide objectives.”


“the description of the departmental context must also include a discussion of how departmental strategic outcomes are aligned with Government of Canada outcome areas. For more information on current outcome areas or Canada’s Performance and the RPP Overview for Parliamentarians website, departments can consult the “Whole of Government Framework” instructions online at http://www.tbs-sct.gc.ca/pubs_pol/dcgpubs/mrrsp-psgrr/siglist_e.asp (see the contact list at the end of the Guide to the Preparation of Part III of the 2007–08 Estimates).”


Here is a link to the Template Instructions for Departmental Performance Reports (PDF and RTF version available by clicking on the links at the bottom of the menu on the left side): http://www.tbs-sct.gc.ca/dpr-rmr/2007-2008/instructions/instructions00-eng.asp

Thursday, August 7, 2008

Whole-of-Government Framework

The Whole-of-Government Framework can be found here:

http://www.tbs-sct.gc.ca/ppg-cpr/framework-cadre-eng.aspx?Rt=1037

It looks something like this:




I'm no expert, but it doesn't look like it's accessible for the visually impaired.

The following page explains in more detail how it works/how it is used:

http://www.tbs-sct.gc.ca/reports-rapports/cp-rc/2006-2007/cp-rc02-eng.asp#Introduction

The Government’s Priorities

The Government’s 5 priorities are:

A Proud and Sovereign Canada

There is nothing more fundamental than the protection of our nation’s sovereignty and security. The Government will rigorously defend Canada’s place in the world including through the realization of our strong Arctic vision and a responsible, effective path forward in Afghanistan.

A Strong Federation

Canada is more united today than it has been in 40 years. The Government will continue to strengthen the federation – and modernize its democratic institutions – through measures including formal limits on federal spending power and long-overdue reform of the Senate.

A Prosperous Future

Canada cannot be complacent about the continued growth of its economy. The Government will provide effective economic leadership and a prosperous future by aggressively moving forward with broad tax relief that includes a further promised reduction in the GST.

A Safe and Secure Canada

Canadians want their safe streets and communities back. The Government will continue to tackle crime and strengthen the security of Canadians by reintroducing important crime legislation with the new Tackling Violent Crime Bill, and by putting a strong focus on safe communities and youth and property crime.

A Healthy Environment for Canadians

Canada’s environmental and health standards are already among the highest on Earth. The Government will continue to improve the environment and health of Canadians by delivering realistic and achievable results in areas such as environmental enforcement and product and food safety.

Source: http://www.pm.gc.ca/eng/feature.asp?featureId=5

Thursday, July 10, 2008

Instructions (Guide) for Developing a Management, Resources, and Results Structure

This document is a set of instructions, although I see it more as a guide, to develop (or if you prefer, implement) Program Activity Architectures (PAA), Performance Measurement Frameworks (PMF) and Management, Resources, and Results Structure (MRRS).

I found it useful. Constructive criticism: the pages showing the PMF tables didn't print out well on letter-size paper in portrait layout. If I remember correctly, those pages are towards the end and print well in landscape on legal-size paper. I wish they had made a PDF file; it probably would have avoided this type of problem, and it would have made the document more portable and sharable.

Link: http://www.tbs-sct.gc.ca/pubs_pol/dcgpubs/mrrsp-psgrr/id-cm/id-cm_e.asp

Thursday, May 22, 2008

RCMP Environmental Scan

A very nice document prepared by the RCMP, and publicly available.

It covers demographics, society, economy, politics & government, science & technology, environment and public safety & security at both the global and Canadian levels.

Link: http://www.rcmp.gc.ca/enviro/2007/index_e.htm

Tuesday, May 20, 2008

Kurtosis

Kurtosis is the degree to which the frequency distribution is concentrated around a peak, that is, it describes the sharpness of the central peak of the curve, usually as compared with the normal distribution.

Higher kurtosis means more of the variance is due to infrequent extreme deviations (more weight in the tails), as opposed to frequent modestly-sized deviations (more concentration around the mean).

The normal distribution is mesokurtic; the curve with a higher degree of kurtosis (peakedness) is leptokurtic; and the curve with the flat top (compared to the normal curve) is platykurtic.
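A sketch of sample excess kurtosis (the standardized fourth moment, minus 3 so that the normal distribution scores 0); the data sets below are invented for illustration:

```python
# Sample (excess) kurtosis from first principles: the standardized
# fourth moment, minus 3 so the normal distribution scores 0.
def excess_kurtosis(data):
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n
    m4 = sum((x - mean) ** 4 for x in data) / n
    return m4 / m2 ** 2 - 3

flat = [1, 2, 3, 4, 5, 6]                  # uniform-like: platykurtic (negative)
peaked = [5, 5, 5, 5, 5, 5, 5, 5, 0, 10]   # rare extreme deviations: leptokurtic (positive)
```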

Links:

http://www.riskglossary.com/articles/kurtosis.htm

http://mvpprograms.com/help/mvpstats/distributions/SkewnessKurtosis

http://www.statistics4u.info/fundstat_eng/cc_kurtosis.html

http://www.almprofessional.com/Articles/skew.pdf

http://www.ats.ucla.edu/stat/spss/faq/kurtosis.htm

Monday, May 5, 2008

Mean Absolute Deviation

The mean deviation is the mean of the absolute deviations of a set of data about the data's mean.

Mean deviation is an important descriptive statistic that is not frequently encountered in mathematical statistics. The mean deviation has a natural intuitive definition as the "mean deviation from the mean".

The average absolute deviation from the mean is less than or equal to the standard deviation.

When applied to time series, the mean absolute deviation becomes a measure of volatility.

Standard Deviation

Standard deviation is a measure of dispersion. It is defined as the square root of the variance. It measures how widely spread the values in a data set are. If many data points are close to the mean, then the standard deviation is small; if many data points are far from the mean, then the standard deviation is large. In a loose sense, the standard deviation tells us how far from the mean the data points tend to be. The standard deviation has the same units as the data points themselves.

When applied to time series, standard deviation becomes a measure of volatility.
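Both definitions, and the fact noted above that the mean absolute deviation never exceeds the standard deviation, can be sketched as follows (population forms; the data set is invented for illustration):

```python
# Mean absolute deviation vs. standard deviation (population form),
# illustrating that MAD <= standard deviation for any data set.
import math

def mean_absolute_deviation(data):
    mean = sum(data) / len(data)
    return sum(abs(x - mean) for x in data) / len(data)

def standard_deviation(data):
    mean = sum(data) / len(data)
    return math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))

data = [2, 4, 4, 4, 5, 5, 7, 9]
mad = mean_absolute_deviation(data)   # 1.5
sd = standard_deviation(data)         # 2.0
```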

See also: http://www.childrensmercy.org/stats/definitions/stdev.htm
http://www.quickmba.com/stats/standard-deviation/

Tuesday, April 15, 2008

Skewness

Skewness describes an asymmetrical frequency distribution in which the values are concentrated on one side of the central tendency and trail out on the other side. If the tail is to the right or positive end of the scale, the distribution is said to be positively skewed. If the distribution trails off to the left or negative side of the scale, it is said to be negatively skewed.
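A sketch of sample skewness as the standardized third moment (the data sets are invented for illustration): positive for a right tail, negative for a left tail.

```python
# Sample skewness as the standardized third moment: positive when the
# tail trails to the right, negative when it trails to the left.
def skewness(data):
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n
    m3 = sum((x - mean) ** 3 for x in data) / n
    return m3 / m2 ** 1.5

right_tailed = [1, 1, 2, 2, 3, 10]    # trails off to the right
left_tailed = [-10, 7, 8, 8, 9, 9]    # trails off to the left
```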

Positive Skew


Negative Skew


Sources:

http://www.brendan.com/glossary.html

http://cnx.org/content/m11011/latest/

Friday, April 4, 2008

Measures of Central Tendency

Central Tendency: The center or middle of a distribution. There are many measures of central tendency. The most common are the mean, median, and mode.

http://cnx.org/content/m11011/latest/

Mode

The mode is the most frequently-occurring value (or values).

To calculate the mode:
  • Calculate the frequencies for all of the values in the data.
  • The mode is the value (or values) with the highest frequency.

Median

The median is the midpoint of a distribution: the same number of observations are above the median as below it.

To calculate the median:

  • Sort the values into ascending order.
  • If you have an odd number of values, the median is the middle value.
  • If you have an even number of values, the median is the arithmetic mean (see above) of the two middle values.
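The two procedures above can be sketched directly (a minimal illustration):

```python
# Mode and median, following the steps described above.
from collections import Counter

def mode(data):
    # All values tied for the highest frequency.
    counts = Counter(data)
    top = max(counts.values())
    return sorted(v for v, c in counts.items() if c == top)

def median(data):
    values = sorted(data)
    n = len(values)
    mid = n // 2
    if n % 2 == 1:
        return values[mid]                       # odd count: middle value
    return (values[mid - 1] + values[mid]) / 2   # even count: mean of the two middle values

mode([1, 2, 2, 3, 3, 4])   # [2, 3]
median([3, 1, 2, 5, 4])    # 3
median([1, 2, 3, 4])       # 2.5
```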

Tuesday, March 25, 2008

Harmonic Mean

Formula:

H = n / (1/x₁ + 1/x₂ + … + 1/xₙ)

Description:

The harmonic mean is the number of variables divided by the sum of the reciprocals of the variables. Typically, it is appropriate for situations when the average of rates is desired.


Example:

For instance, if for half the distance of a trip you travel at 40 kilometres per hour and for the other half of the distance you travel at 60 kilometres per hour, then your average speed for the trip is given by the harmonic mean of 40 and 60, which is 48; that is, the total amount of time for the trip is the same as if you travelled the entire trip at 48 kilometres per hour. If you had travelled for half the time at one speed and the other half at another, the arithmetic mean, in this case 50 kilometres per hour, would provide the correct average.

In finance, the harmonic mean is used to calculate the average cost of shares purchased over a period of time. For example, if an investor purchases $1000 worth of stock every month for three months and the prices paid per share each month are $8, $9, and $10, then the average price the investor paid is $8.926 per share. However, if the investor had purchased 1000 shares per month, the arithmetic mean (which turns out to be $9.00) would be used. Note that in this example, the investor buying $1000 worth of the stock each month means buying 125 shares at $8 the first month, 111.11 shares at $9 the second month, and 100 shares at $10 in the third month. Fewer shares are purchased at higher prices while more shares are purchased at lower prices. Thus more weight is given to the lower prices than the higher prices in the calculation of the average cost per share ($8.926). If the investor had instead purchased 1000 shares each month, equal weight would be given to high and low purchase prices, leading to an average cost per share of $9.00. This explains why the harmonic mean is less than the arithmetic mean.
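Both examples can be sketched in a few lines:

```python
# Harmonic mean: the number of values divided by the sum of their reciprocals.
def harmonic_mean(values):
    return len(values) / sum(1 / v for v in values)

speed = harmonic_mean([40, 60])     # ≈ 48 km/h, the trip example
price = harmonic_mean([8, 9, 10])   # ≈ 8.926, the average share price
```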

Sources:

http://en.wikipedia.org/wiki/Harmonic_mean


Tuesday, March 18, 2008

Weighted Geometric Mean

Formula:

Weighted geometric mean = x₁^w₁ × x₂^w₂ × … × xₙ^wₙ (with weights summing to 1)

Description:

I found very little information about this type of mean. So far, I've only seen it used with weights that add up to 1.

Example:

none yet
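In lieu of a real-world example, here is a hedged sketch (my own illustration), assuming the weights sum to 1 as noted above:

```python
# Weighted geometric mean: the product of each value raised to its weight,
# assuming the weights sum to 1 (the only usage I have seen).
import math

def weighted_geometric_mean(values, weights):
    assert abs(sum(weights) - 1) < 1e-9, "weights must sum to 1"
    return math.prod(x ** w for x, w in zip(values, weights))

# With equal weights, it reduces to the ordinary geometric mean:
weighted_geometric_mean([4, 9], [0.5, 0.5])   # 6.0
```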

Monday, March 17, 2008

Geometric Mean

Formula:

Geometric mean = (x₁ × x₂ × … × xₙ)^(1/n)

Description:

The geometric mean only applies to positive numbers. It is also often used for a set of numbers whose values are meant to be multiplied together or are exponential in nature, such as data on the growth of the human population or interest rates of a financial investment.

Any time you have a number of factors contributing to a product, and you want to find the "average" factor, the answer is the geometric mean. The example of interest rates is probably the application most used in everyday life.

Example:

Suppose that I’m 30% richer than last year, but last year I was 20% richer than the year before… what is the average growth? Well, my current wealth is 1.3 * 1.2 * w if w is my wealth two years ago. I can expect that if t is the average growth factor over the last two years, then my current wealth is t * t * w. Setting t = 1.25 is the wrong answer. In such a case, choosing t = sqrt(1.3 * 1.2) solves the problem.
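The growth example can be sketched as:

```python
# Geometric mean: the n-th root of the product of n positive numbers.
import math

def geometric_mean(values):
    return math.prod(values) ** (1 / len(values))

# Average growth factor for +30% growth followed by +20% growth:
t = geometric_mean([1.3, 1.2])   # ≈ 1.249, slightly below the arithmetic 1.25
```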

Sources:

http://www.daniel-lemire.com/blog/archives/2006/04/21/when-use-the-geometric-mean/

http://en.wikipedia.org/wiki/Geometric_mean

http://www.math.toronto.edu/mathnet/questionCorner/geomean.html

Monday, March 10, 2008

Weighted Arithmetic Mean

Formula:

Weighted mean = (w₁x₁ + w₂x₂ + … + wₙxₙ) / (w₁ + w₂ + … + wₙ)

Where w are the weights

Description:

Data elements with a high value for the weight contribute more to the weighted mean than do elements with a low value for the weight. The weights must not be negative.

Example:

I buy 20 red balls and 30 blue balls. Red balls cost one dollar each, blue balls cost two dollars each. What is the average price of the balls I purchased?

(20*1+30*2)/50=80/50=$1.60

Another way using proportions: red balls represent 40%, blue balls 60%.

(0.4*1+0.6*2)/1=$1.60
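The ball example, sketched in code:

```python
# Weighted arithmetic mean: sum(w_i * x_i) / sum(w_i).
def weighted_mean(values, weights):
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

# 20 red balls at $1 and 30 blue balls at $2:
weighted_mean([1, 2], [20, 30])     # 1.6
# Equivalently, using proportions as the weights:
weighted_mean([1, 2], [0.4, 0.6])   # ≈ 1.6
```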

Thursday, March 6, 2008

Arithmetic Mean

Formula:

Mean = (x₁ + x₂ + … + xₙ) / n

Description:

This is usually what people refer to when they talk about a mean or average. It is the sum of the items, divided by the number of items.

Example:

There are 4 students in a class. Their final grades are 50, 60, 75 and 90. What is the average for the class?

Average for the class = (50+60+75+90)/4 = 68.75
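And as code:

```python
# Arithmetic mean: the sum of the items divided by the number of items.
def arithmetic_mean(values):
    return sum(values) / len(values)

arithmetic_mean([50, 60, 75, 90])   # 68.75
```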

Wednesday, March 5, 2008

Statistical and Mathematical Formulas and Equations, and Their Uses

The next few posts may seem very dry, and the link to performance measurement might not be immediately clear.

They will consist of different statistical and mathematical formulas which I may or may not refer to in future posts. Without going into too much detail, let me just say that “how” you measure something doesn’t just refer to what indicators you will use to try to measure an outcome, “how” can also refer to how your indicators or measures are calculated. That’s just one of the challenges of the practical side of performance measurement, as opposed to the more theoretical side of defining outcomes, logic models and frameworks.

Sunday, March 2, 2008

Two Interesting Papers on the Basics

Mark Schacter has written some interesting papers on performance measurement.

They are fairly theoretical, but they give a good overview of some of the basics, in particular logic models, frameworks and some advice on how to choose indicators.

I would recommend reading:

Not a Toolkit. Practitioner's Guide to Measuring the Performance of Public Programs

and

What Will Be, Will Be. The Challenge of Applying Results-based Thinking to Policy


Good reading!

source: http://www.schacterconsulting.com/

Tuesday, February 26, 2008

A Timeline of Government Performance Reporting Initiatives

1981

The government committed itself to provide Parliament with improved and expanded information in the Estimates. In particular Part III of the Estimates was designed to provide information to Parliament on departmental spending intentions and about performance and results produced by expenditures previously authorized.


1983

The government agreed to include summaries of program evaluations in Part III.


1995

The government revised the Expenditure Management System. As part of this initiative, it launched the Improved Reporting to Parliament Project, which split Part III of the Estimates into two documents:

  • Report on Plans and Priorities—tabled in the spring, it sets targets and the general direction;
  • Performance Report—tabled in the fall, it indicates the results achieved against those planned.

Six departments piloted the new approach.

The President of the Treasury Board tabled the first government-wide report describing progress made by implementing results-based management in federal departments and agencies. The report is part of the fall performance package and is tabled in Parliament with the departmental performance reports.


1996

Sixteen departments piloted the Improved Reporting to Parliament Project. The Treasury Board President tabled their performance reports in the House of Commons.


1997

On 24 April 1997, the House of Commons passed a motion dividing what was known as the Part III of the Estimates document for each department or agency into two documents, a Report on Plans & Priorities and a Departmental Performance Report. It also required all departments and agencies to table these reports on a pilot basis.


1998

Most departments and agencies submitted reports on plans and priorities and performance reports.


2000

The Treasury Board Secretariat published Results for Canadians, which emphasized the importance of ensuring timely and accurate reporting to Parliament.


2001

The Treasury Board Secretariat introduced the Results-Based Management Lexicon. This lexicon provided new, standardized terminology for results management and reporting.

The Treasury Board Secretariat published its renewed guidance to departments for the preparation of performance reports and introduced six principles for effective reporting.


2004

The Treasury Board Secretariat replaced the Planning, Reporting, and Accountability Structure Policy with the Management, Resources and Results Structure (MRRS) Policy, effective 1 April 2005.

The policy requires that departments have clearly defined and measurable strategic outcomes, an articulated Program Activity Architecture (an inventory of all the programs and activities undertaken by a department or agency that are depicted in their logical relationship to each other and to the strategic outcomes to which they contribute), and a description of the current governance structure.

The Treasury Board Secretariat published an integrated Guide for the Preparation of the 2005-2006 Part III of the Estimates: Reports on Plans and Priorities and Departmental Performance Reports. The goal of the integrated guidelines was to reinforce the complementary features of the two documents and their parallel reporting requirements.

Source: http://www.oag-bvg.gc.ca/internet/English/oag-bvg_e_14013.html

Friday, February 15, 2008

OAG Audit Criteria for Performance Information

The Office of the Auditor General (OAG) assesses a number of annual reports every year.

Here are the criteria it uses to do so:


Fairness Criterion

Relevant

The performance information reports, in context, tangible and important accomplishments against objectives and costs.
  • Program context includes

    • the mandate, strategic outcomes and objectives, which are linked to government priorities

    • program structures

    • key horizontal initiatives and partners used to deliver the results, and

    • a discussion of the external environment and the key risks faced

  • A description of intended users

  • Reported results are focused on outcomes with related program activity types and outputs identified. (Results Chain)

  • New initiatives for improving outcomes are described

  • The measures used to support the results chain are valid and complete

  • The public performance report should link financial and non-financial information to show how resources and strategies influence results, including relating costs to outputs and outcomes. Where possible, cost information includes both direct costs and allocated indirect costs of programs and services to present 'full cost' data.

Meaningful

The performance information describes expectations and provides benchmarks against which performance is compared.
  • Expectations are set out which are

    • clear, concrete, and measurable, identifying the amount and direction of the change, the target groups, and timeframes,

    • focused on outcomes with relevant activities and outputs identified, and

    • consistent with the strategic outcomes.

  • Comparisons are provided between reported accomplishments (actuals) and the expected performance with a realistic interpretation of the gap between the two

  • Comparisons are provided with relevant benchmarks, such as similar activities, programs or organizations, or trends over time, and their significance explained.

  • Key lessons learned about past performance and any resulting actions are discussed.

Attributable

The performance information demonstrates why the program made a difference.
  • The contribution that has been made by the program to the reported results is demonstrated, including evidence regarding attribution, where available (such as evaluations).

  • The contribution of key partners and other external factors is discussed.

Balanced

A representative and clear picture of performance is presented, which does not mislead the reader.
  • The explanation of variances would include both positive and negative aspects of performance, as well as major challenges identified for programs and services, in order to provide a complete picture of performance

  • Significant unintended impacts are reported

  • There is coverage of all key objectives and only performance information that is significant to the organization is reported.

  • There are no distortions of information through presentation, or through omission of information or context.

  • The emphasis on information presented is proportional to its importance and materiality.

Reliability Criterion

Reliable

The performance information adequately reflects the facts.
  • Valid and consistent/comparable measures of performance are used.

  • Performance information is based on data that can be readily replicated to produce similar results.

  • The data that supports the performance measures are complete

  • The basis for confidence in the reliability of the information (including third party data) being reported is disclosed, including
    • methods of data collection

    • explanations for any limitations

    • sources of information are reported

source: http://www.oag-bvg.gc.ca/internet/English/oag-bvg_e_10217.html

Tuesday, January 29, 2008