Thursday, November 15, 2007

Call Centre Metrics

Here’s a nice little source for ideas on how to measure telephone service:

http://www.the-cma.org/?WCE=C=47%7CK=227275

It explains 10 call centre metrics, among them:

Abandon Rate: the proportion of calls in which the caller hangs up before connecting to an agent. This number does not include calls that receive a busy signal.

Calculation: Abandoned Calls / Total Incoming calls

Service Level: usually defined as the percentage of calls answered within a predetermined number of seconds.

Average Speed of Answer: (commonly referred to as ASA) is the average number of seconds it takes for a call to be answered.

First Call Resolution: measures the percentage of customer calls that are resolved on the first contact.

Calculation: Number of FCR calls / Total Number of Calls
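For what it's worth, the four calculations above can be sketched in a few lines of Python. The counts and the 120-second service level threshold are made-up numbers for illustration; a real call centre would pull these from its phone-system (ACD) logs.

```python
# Made-up counts for illustration; a real call centre would pull
# these from its phone-system (ACD) logs.
total_incoming = 1000        # all calls that entered the system
abandoned = 50               # caller hung up before reaching an agent
answered_within_120s = 760   # answered within an assumed 120-second threshold
answer_delays = [15, 42, 8, 95]  # seconds to answer, one per answered call
fcr_calls = 610              # issues resolved on the first call

abandon_rate = abandoned / total_incoming              # Abandoned / Total Incoming
service_level = answered_within_120s / total_incoming  # % answered within threshold
asa = sum(answer_delays) / len(answer_delays)          # average speed of answer (seconds)
fcr_rate = fcr_calls / total_incoming                  # FCR calls / Total calls
```

With these invented numbers you get a 5% abandon rate, a 76% service level, an ASA of 40 seconds and a 61% first call resolution rate.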

Tuesday, October 30, 2007

Metrics for Telephone Calls

Here's a short overview of what I could find in terms of metrics used to measure telephone calls in federal government organizations. I based my findings on DPRs and Annual Reports. Surprisingly, only a small number of organizations actually talk about the performance of their call centres.

Canada Revenue Agency:
Based on their 2005-06 Annual Report, caller accessibility measures the percentage of callers who succeed in reaching the telephone service, which basically means that the caller gets into the queue. Telephone service level is the percentage of calls answered within 2 minutes of entering the queue.

Service Canada:
Based on Service Canada's Annual Report 2005-2006, Appendix 4 (Performance Scorecard) on page 37, it measures 3 things: the percentage of calls answered within 180 seconds (btw, that's 3 minutes guys… plain language writing…), the percentage of Automated Telephone Information Service usage (no reference to a page that explains how that works), and the percentage of calls blocked (EI High Volume Message), which I take to mean a busy signal.

Canada Business:
Based on their 2004-05 Annual Report, they seem to measure the volume of calls and do distinguish between automated and officer-assisted calls. They do have service standards, but they are very lax in terms of metrics; for example, their telephone service standard is: “Telephone service is available free of charge, generally from 9:00 a.m. to 5:00 p.m., Monday to Friday (except on holidays as they apply in each province). For exact service times, contact the centre in your province or territory. TTY is available for the hearing-impaired.”


Here are the links to the reports I used:

CRA Annual Report to Parliament 2005-06: http://www.cra-arc.gc.ca/agency/annual/2005-2006/performance-e/ar_2005-06_htmlTOC-e.html

Service Canada Annual Report 2005-2006:
http://www.servicecanada.gc.ca/en/about/reports/ar_0506/pdf/ar_0506.pdf

Canada Business Annual Report 2004-05:
http://www.canadabusiness.ca/gol/cbec/site.nsf/vDownload/annual_reports/$file/canada_business_2004-2005_annual_report_en.pdf


Now for added fun, take a look at how dropped calls on automated lines are integrated into the measures. It would seem that most of the time, a call dropped on an automated line is counted as answered, the assumption being that the caller got the information they were looking for while going through the menus, or while listening to the messages played on hold.
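To see how much that assumption matters, here's a toy comparison in Python (all numbers invented): the same 1,000 calls scored under two conventions, counting a call dropped in the automated menus as answered versus as abandoned.

```python
# Invented numbers to show how the counting convention moves the metric.
total_calls = 1000
agent_answered = 700       # reached a live agent
dropped_in_ivr = 200       # hung up inside the automated menus
abandoned_in_queue = 100   # hung up while waiting for an agent

# Convention A: a call dropped on the automated line counts as answered
answered_rate_a = (agent_answered + dropped_in_ivr) / total_calls

# Convention B: a call dropped on the automated line counts as abandoned
answered_rate_b = agent_answered / total_calls
```

Same calls, but convention A reports 90% answered while convention B reports 70%. Worth checking which convention a report uses before comparing organizations.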

Tuesday, October 23, 2007

Cognos Performance 2007

I had the opportunity to go to Cognos Performance 2007. It's basically a one-day event centred on their products: new ones, what they do, how to use them, and how they are used by different organizations. There's also the odd performance measurement/management presentation.

I've been going for a few years now; I started because I was a user of their products (I'm not sure whether it was Impromptu or ReportNet at the time). I keep going both to stay in touch and to know what's coming down the line.

I wouldn't recommend it for someone who has never used Cognos products and never plans to, but if you're a user or customer, you might be interested. It's free, so the only cost is your time and transportation. I'm surprised that we don't see many students at this event.

You can access the day's agenda here: http://www.cognos.com/performance2007/na_agenda.html#on_ott

I went to Best Practices in Framework Manager, Keys to Successful Migrations from Cognos Series 7 to Cognos 8 BI and Complete Reporting and Analysis Coverage with Cognos 8 BI.

Those choices might seem strange to you, but remember that I've been using Cognos products for a few years now. If I had to choose again, I would've taken the IBM presentation over the migration session. I was hoping the migration session would talk more about the new features in Cognos 8 (and how to leverage them), but it was actually about the migration process itself.

Overall, I don't regret going.

Sunday, October 14, 2007

Increase in the Number of Canadian Households with Internet Access

I was wrong: the number of households with Internet access is not only growing, it's growing at an increasing speed.

According to the CRTC's Broadcasting Policy Monitoring Report, 56% of Canadian households had Internet access in 2003, 59% in 2004, 64% in 2005 and 70% in 2006. That's an increase of 3 percentage points between 2003 and 2004, 5 percentage points between 2004 and 2005 and 6 percentage points between 2005 and 2006.
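The arithmetic is trivial, but here it is as a quick Python check of the year-over-year gains (the percentages are the CRTC figures quoted above):

```python
# Household Internet access by year, in percent (CRTC figures above).
access = {2003: 56, 2004: 59, 2005: 64, 2006: 70}

years = sorted(access)
gains = [access[y] - access[y - 1] for y in years[1:]]
# gains are in percentage points, and they keep getting bigger: 3, 5, 6
```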

Tuesday, September 4, 2007

Sources for Internet Related Statistics

Ipsos News Center - “Digital Divide” Remains Wide – Only Six-In-Ten Canadians Aged 55+ Have Access To The Internet
http://www.ipsos-na.com/news/pressrelease.cfm?id=3365

The Daily - Canadian Internet Use Survey (2005):
http://www.statcan.ca/Daily/English/060815/d060815b.htm

The Daily - E-commerce: Shopping on the Internet (2005):
http://www.statcan.ca/Daily/English/061101/d061101a.htm

The Daily - Electronic commerce and technology (2005):
http://www.statcan.ca/Daily/English/060420/d060420b.htm

CRTC Telecommunications Monitoring Report (2007):
http://www.crtc.gc.ca/eng/publications/reports/PolicyMonitoring/2007/tmr2007.pdf
note: see page 69 of pdf or 60 of the document

Broadcasting Policy Monitoring Report 2007:
http://www.crtc.gc.ca/eng/publications/reports/PolicyMonitoring/2007/bpmr2007.pdf
note: see page 131 of pdf or 123 of the document

Statistics Canada – Tables by Subject: Internet
http://www40.statcan.ca/l01/ind01/l3_2256_3817.htm?hili_arts56

BuddeComm – Canada telecommunications market overview
http://www.budde.com.au/reports/Category/Canada-8.html?r=51

Internet World Stats – Internet usage, broadband and telecommunications reports
http://www.internetworldstats.com/am/ca.htm

Internet World Stats – Top 20 Countries with the Highest Number of Internet Users
http://www.internetworldstats.com/top20.htm

Thursday, August 30, 2007

Measuring the Effectiveness of a Government Website / E-Service

Nielsen/NetRatings is now according more value to the time spent on a website than to the number of page views for its ranking of website popularity.

http://www.nielsen-netratings.com/pr/pr_070710.pdf
http://www.beet.tv/2007/07/nielsen-exec-de.html
http://www.marketingpilgrim.com/2007/07/the-problem-with-measuring-time-spent-on-a-web-site.html
http://www.iht.com/articles/ap/2007/07/09/business/NA-TEC-US-Online-Measurements.php

OK, from a governmental point of view we don’t really care about rankings. But that doesn’t mean we can’t learn from what the private sector is doing. What Nielsen/NetRatings is trying to do here is improve the metrics they use to measure the effectiveness of a website at attracting visitors. That goal, attracting visitors, makes sense in the context of the private sector. The goal of a government website is different: to offer information and services. In a government context you just want people to find what they are looking for as fast as possible. Actually, that’s closer to what they want: they are using your site, not enjoying it. You don’t want them spending more time than they need to on your site or, chances are, they’ll get frustrated and leave (and I won’t get into the impact that has on perception and image). They might choose to use another channel, telephone or in-person service, but they’ll most likely be frustrated, and rightfully so, because they’ll have wasted time looking on your website, where the information -should- be.

OK… metrics. How do you measure the effectiveness of a website / e-service in a government context?

Clearly time doesn’t work: if they stay a long time on your site, your site is too hard to navigate, too wordy or too complicated (think plain language writing); if they don’t stay long at all, they probably haven’t found what they were looking for, or got frustrated, because your site is too hard to navigate, too wordy or too complicated.

Page views: have you ever visited a government site? You don’t get anywhere with 2 clicks of the mouse. You have a title page, a main/portal page, sub-pages, and how many menus? So, obviously, the most visited pages will be the top pages. But there is no information on those pages.

Users: the number of people who visit your site is a nice statistic to have, but it doesn’t mean much in the government context. It’s nice to track its change from year to year; you can make pretty graphs with that. But the relative number of visitors depends on your mandate. I doubt the smaller or more specialized entities get much traffic. That doesn’t mean they are not effective, and it doesn’t mean their websites aren’t either. The nature of their mandate means they probably won’t have many visitors, but those who need to find their site just might. And that’s what matters: citizens finding what they are looking for.

The best I can come up with is surveys, either on the pages themselves or just your plain old surveys. You just have to make sure that the people you are surveying have used your site, in its current configuration. I don’t like it, I really don’t, but that’s the best I can come up with to measure the effectiveness of a government website / e-service.
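As a sketch of what that survey data could yield, here's a minimal Python example. The response fields are my own invention; the point is that a survey can get at "did they find it?" directly, which time-on-site and page views can't tell you.

```python
# Hypothetical on-page survey responses; field names are invented.
responses = [
    {"found_it": True,  "minutes_spent": 2},
    {"found_it": True,  "minutes_spent": 8},
    {"found_it": False, "minutes_spent": 12},
    {"found_it": True,  "minutes_spent": 3},
]

found = [r for r in responses if r["found_it"]]
# Share of visitors who found what they came for.
task_completion_rate = len(found) / len(responses)
# Among those who found it, how long it took them (lower is better here).
avg_minutes_when_found = sum(r["minutes_spent"] for r in found) / len(found)
```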

Wednesday, August 22, 2007

Pitfalls of Internet Access Statistics

It’s hard to find good statistics on the percentage of Canadians with access to the Internet. First, Statistics Canada changed the way they measure this metric in the last few years, making prior-year data incomparable (it used to be by household, if memory serves). Second: define “Internet access”; do web-enabled cellphones count? Do Blackberries? What about access in public places like Internet cafés, public libraries and schools?

Truth is, measuring how many Canadians have access to the Internet is more complicated than it used to be. It gets worse if you are trying to use that statistic to support a web-service proposal. For example, you probably won’t use your cellphone to fill out an online form (i.e. an application for a government service or program), but you might use it to check the weather (Environment Canada). This doesn’t mean that cellphones have no value as a channel; it just means they serve a different purpose. A good example would be the ice status updates for the Rideau Canal that the NCC sends out: you receive these on your cellphone. The Internet used to be considered a single channel because most people would use it from a PC at home, but the truth is, it’s not one channel: it’s many channels with different uses and different potentials for service delivery.

Another aspect to take into consideration is where a user is accessing the Internet from. Organizations have policies on what types of sites you can visit, and on when or whether you can use the Internet connection provided by your employer for personal purposes. There are also the issues of security and privacy: I wouldn’t do online banking from an Internet café, and I doubt many people are installing QuickTax in their cubicle and e-filing from work.

So how do you gauge the number or proportion of potential users for an online service, or the potential or predicted growth of such services?

Hard to do. The percentage of the population with access to the Internet is still rising, but the speed of this growth (the % change from year to year) is likely slowing. I’d estimate that most gains now occur in rural areas as coverage expands, and in the older segments of the population, possibly also in the segments with lower incomes or education, as the prices of computers go down and the Internet becomes more of a commodity. Now this is all nice and interesting, but the bottom line is that most people who wanted to get on the Internet, and could, have probably done so by now.

As for the potential market for a new service, you have to define how your online service will be accessed. It might be erroneous to include those who only have access to the Internet at work or via cellphone if the service you are considering is more of an application than a simple webpage with information on it. The opposite side of the coin is also true: you don’t want to double count (or more) the users who might have access to your service, i.e. those who have a web-enabled cellphone plus Internet access at home and at the office, etc.
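The double-counting point is really just set arithmetic. Here's a small Python sketch, with made-up identifiers standing in for survey respondents who reported each type of access:

```python
# Made-up identifiers for people reporting each type of Internet access.
home = {"amy", "ben", "carl", "dana"}
work = {"ben", "carl", "eve"}
cellphone = {"carl", "fred"}

# Naively summing the three groups counts ben twice and carl three times.
naive_total = len(home) + len(work) + len(cellphone)   # 9

# A set union counts each person once.
unique_users = len(home | work | cellphone)            # 6

# If the service is an application rather than a simple info page,
# maybe only home access should count toward the potential market.
potential_market = len(home)                           # 4
```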

Finally, just because some percentage of the population has the right type of access to the Internet (let’s say at home), it doesn’t automatically mean that they are potential users of your service: age and eligibility are two criteria I can think of off the bat that would rule out a number of potential users.

Tuesday, August 21, 2007

The Web/Internet as a Service Delivery Channel

With the advent of government online, the measurement of the effectiveness of the Web as a channel of service delivery becomes more important, as a greater portion of the population gains access to the Internet and as the services offered online become more substantive.

The presumed cost efficiency of the Internet as a means of providing services to citizens is one of the main factors driving the push to web services. The added flexibility (anywhere, anytime, anyplace) gained by the population in accessing services is another.

Sunday, May 20, 2007

Results for Canadians

Results for Canadians: A Management Framework for the Government of Canada was published back in 2000, but it's just one of those documents that I keep going back to, so I decided to put the link here, as much for your benefit as mine.

(link to pdf is at the bottom of the menu on the left)

Thursday, May 17, 2007

E-Reporting

In my last post I talked about an E-Reporting session at the PPX Symposium with Mordecai Lee as the speaker.

Here's a link to a report he wrote in 2004: E-Reporting: Strengthening Democratic Accountability.

It basically reviews a bunch of online reports from the US and focuses on using E-Reporting to inform and engage citizens.

Wednesday, May 16, 2007

PPX Symposium 2007

I had the opportunity to go to a few sessions of this year's PPX Symposium.

I liked the E-Reporting session (Mordecai Lee was the speaker) and the Panel of Members of Parliament. I didn't attend any of May 15's sessions, and missed the concurrent workshops today, so I'm biased.

The E-Reporting session was basically an invitation to summarize reports, slap them online and add a few online features, like asking related questions. The concept of E-Reporting has been around for a while, but from what I see, read and hear, it's not quite clear what its direction is or what final form it will take. I do believe that people use the term E-Reporting very loosely at this point in time. I'll do a post on E-Reporting at some point.

The Members of Parliament basically said what you'd expect them to say: fewer pages, and report both the good and the bad (they were talking about DPRs).