So your organization picked you to run a feedback program. Congratulations.
When it happened to me, I knew I was doomed. I had the project from hell. It had failed every year, and it was now my turn to go before management and explain all the details of the latest failure. Not exactly what you want to hear.
No one had ever been able to get a feedback project to work in our department. But I was lucky enough to gather a great team. To have a manager who was more like a partner. To have built up some positive internal karma, so I would get a bit of latitude. To have a sense of humor. And to know I needed a vision.
I first looked at focus groups. I love the interaction you can generate with a well-facilitated group. After a gentle shove from management, though, the focus group concept warped into one-on-one conversations.
These would actually be easier to organize, maintain, and control, especially since we were running the program ourselves. So we sharpened up our questions. Put in a lead tracking system to manage our contacts. And put out the word that we were looking for subjects. We networked our way into a representative set of subjects, and we asked every feedback subject if they knew someone else we could talk with. This, as it turned out, was surprisingly effective.
So, when it is your turn to run your feedback program, do look at the literature, and check with the experts. But, remember that your internal resources (your team, your management, your style) can provide the edge you need to succeed.
Many organizations don’t have the luxury of using dedicated customer service professionals to collect, organize, and analyze their customer feedback. I was in that situation in the customer feedback project I ran. I did, however, have the luxury of using authors and translators to ask the survey questions: communications professionals trained to gather and publish information. The major difference? We were asking our customers for feedback on the products we produced. We weren’t the service department. We were the front-line production. This was good news and bad news. The good news: we finally had the opportunity to ask our customers all of the things we had wanted to know but had never had a chance to ask. The bad news: they might actually tell us.
We worked on the questions and developed a set that would require real answers. We talked about the kinds of responses we might receive, just to be prepared. And we did a bit of role playing to get a feel for the rhythm of asking questions and getting responses. Then we started.
Everyone who ran a survey discussed their experience with the group. The group was a safe enough space that we could honestly talk about the good parts and the bad parts of every discussion. Because we were performing the work ourselves, it was also very exciting: we completed the project a month earlier than originally planned.
Doing the surveys ourselves turned out to be a good thing. It really helped us establish ourselves as internal experts. And because of the work we did, the members of the group became leaders of an ongoing department-wide feedback program.
Given the breadth and size of the organization where I ran a feedback project, we combined face-to-face and telephone surveys into a single effort. It would have been impossible to get the survey team in front of enough subjects in a manageable timeframe without using both avenues, and we felt that combining them would not adversely affect the outcome.
We were looking for feedback on our products (technical documentation) from our main customers (internal consultants). The point of the survey was to determine how we could provide consultants with better products.
The conventional wisdom was that the consultants would never talk to us – they never had in the past. All feedback our internal team received prior to our project was based on rumor and innuendo.
But even starting from here – that they would not talk to us – gave us a position. It told me I had to make it as easy as possible from the start to increase participation. So this is what I did to set the stage:
As these were consultants who always billed their time to projects, I was able to get a general project in place to which they could bill the time we asked for.
As this was a multi-language company, we offered surveys in two languages to eliminate language issues.
If they were in an office where we could meet, we ran the survey face-to-face; if not, we set up a phone interview. And with survey takers literally around the world, we could accommodate whatever time they had to talk with us.
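The time-zone juggling behind that last point can be sketched with Python's standard zoneinfo module: given a slot a consultant offers, find the interviewers for whom that slot falls in office hours. This is only an illustration; the zone names and office hours are assumptions, not details from the project:

```python
# Illustrative sketch: match a proposed interview slot (in UTC) to
# interviewers whose local office hours contain it. Zones and hours
# are hypothetical examples.
from datetime import datetime
from zoneinfo import ZoneInfo


def local_hour(slot_utc: datetime, tz: str) -> int:
    # Convert the UTC slot to the interviewer's local wall-clock hour.
    return slot_utc.astimezone(ZoneInfo(tz)).hour


def available_interviewers(slot_utc, interviewer_zones, start=9, end=17):
    # Keep interviewers for whom the slot falls within office hours.
    return [tz for tz in interviewer_zones
            if start <= local_hour(slot_utc, tz) < end]


slot = datetime(2024, 3, 5, 14, 0, tzinfo=ZoneInfo("UTC"))  # 14:00 UTC
zones = ["America/New_York", "Europe/Stockholm", "Asia/Tokyo"]
print(available_interviewers(slot, zones))
```

With interviewers spread across enough zones, nearly any offered slot lands in someone's working day, which is what let us take whatever time the consultants could give.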
Part of our success was based on the attention we paid at the start. It wasn’t the strength or probing quality of the survey questions; it was that we established the ground rules so that it was easy to participate.
I found this list of seven ways of collecting customer feedback in the SkillSoft Corporation course The Voice of the Customer:
- Face-to-face surveys: Face-to-face surveys are easy to administer and provide you with an opportunity to spend time with your customer. They require consistent, objective questions, and expert analysis.
- Mail and electronic surveys: Surveys sent by mail or via the Web or email need formal questionnaire development and analysis. Unfortunately, these types of surveys yield the lowest response rate.
- Telephone surveys: To be effective, telephone surveys need consistent questions and require expert analysis.
- Employee input: Soliciting employee input is an excellent informal technique. However, the information is sometimes “second hand” and may be subject to personal interpretations.
- Customer complaints: Customer complaints are easy to track and analyze. They reveal trends and provide a means of checking quality and customer satisfaction. They are, however, somewhat biased because you are hearing primarily from dissatisfied customers.
- Quality call monitoring: Quality call monitoring is a good technique for a manager or supervisor. It is important to note that the results are subject to the manager’s interpretations and opinions.
- Customer advisory panels: Customer advisory panels require a great deal of coordination. They are, however, particularly useful for testing products or services, and for solving problems.
I thought the list was a helpful place to start. I wanted to put this out and then talk about the various methods in later posts.