
Customer Interaction Framework – Big Name, Lots of Work

Recently I was given the task of running a project to create what is known around here as a Customer Interaction Framework. In short, it’s meant to determine what our internal customers need when they use our product to do their jobs, and then find a way to get useful and meaningful feedback from them. Once we figure that out, we build the framework so it can be applied to all the global teams, giving each of them a feedback loop for building in improvements.

Man, that is a lot of work.

The project scope spans the globe, yet the team comprises only 5 members. We decided to keep the team small and staff it with people who not only have experience in this area, but who are also passionate about having a dialogue with their customers and encouraging feedback. The project will focus on deliverables we know we can get good feedback on, and everyone on the team has their own area of expertise.

But one of the aims of this project is to set up a framework where we can get very specific feedback on certain deliverables and then act on it at a global level. If we don’t, we run a real risk of alienating our customers. The tricky part is that some people are nervous about getting feedback because they think it will reflect poorly on them personally. That is something we have to avoid.

However, the prerequisite for success in this project, in my opinion, isn’t the team itself being competent or motivated. That was a given. It’s actually doing our homework at the start of the project to make sure we know who we are speaking to, so we can speak their language. This will entail mapping our deliverables to the project phases and to the roles that come into play at specific points, and determining what the people in those roles need to get their jobs done. Not an easy task. But with a name like this, I never thought it would be easy.
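The homework described above is essentially building a lookup table: which roles appear in which project phases, and which deliverables each role needs. A minimal sketch of how that mapping could be held and queried; all phase, role, and deliverable names here are invented for illustration:

```python
# Hypothetical mapping of project phases to roles and to what each role
# needs from our deliverables. Every name below is made up.

DELIVERABLE_MAP = {
    "implementation": {
        "consultant": ["configuration guide", "known-issues list"],
        "project lead": ["scoping checklist"],
    },
    "go-live": {
        "support engineer": ["troubleshooting guide"],
    },
}

def needs_for(phase: str, role: str) -> list:
    """Look up which deliverables a given role needs in a given phase.

    Returns an empty list for unknown phases or roles, so callers
    don't have to special-case gaps in the map.
    """
    return DELIVERABLE_MAP.get(phase, {}).get(role, [])

print(needs_for("implementation", "consultant"))
```

Once the map exists, every feedback question can be anchored to a concrete (phase, role, deliverable) triple instead of being asked in the abstract.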

I hope to post more updates on this topic as the year goes by.


The Small-Big-Small Approach


One of the things I have learned about setting up a feedback process is that you need a clear methodology before you start a feedback campaign. I have been participating in and leading a work group for a little while now, and have determined that a small-big-small approach works best. I think it is applicable to almost any size of company where you are trying to get feedback.

So let’s suppose you are tasked with leading a feedback campaign and you don’t know where to start. Make sure you can do the following things: 1) get commitment from your superiors on the value this brings so you can promote it, 2) run the campaign multiple times, not just once, and 3) go for a small-big-small approach (assuming you got approval for #2). So what is this approach? Read on.


Assuming you have your target audience for the campaign figured out and have the tools you need (I know, I am assuming a lot), try running a small feedback campaign targeting, say, no more than 50-60 people who you know can give you good feedback. By limiting it to this size, you can keep a better overview of the whole project. You will also probably need fewer people to support you when you start to evaluate the results (if you can, have a small but dedicated team to help you get through this). Granted, you may not get the widest sample, but it serves as a very good baseline.


After you have completed your first feedback campaign, and you have wildly impressed your superiors with your brilliant deductions from the feedback you received, propose expanding the scope of the next feedback campaign to double the size. That way you can get a larger number of results and compare them to the first campaign. Yes, a bigger campaign means more coordination, more time spent evaluating the responses, and just plain more headaches as you have to keep track of all of it. But bigger numbers do count for something. By casting a wider net, you sometimes catch some good fish. Tip: use the first campaign’s questions to drive the questions for the second campaign.


When the big campaign is over and you have evaluated the results, pick out the respondents who gave you the best feedback and ask them to join a blue-ribbon feedback panel. Keep it small, under 30 people if you can. This keeps it very manageable and easier to dissect and evaluate. If they have the time, and are motivated, they can become part of something that really improves your product/service. But the questions you pose in a smaller, more focused feedback campaign will have to be very well thought out and more nitty-gritty.

It’s a ton of work to get feedback, but this approach gives you a long-term view and a good method of getting the feedback you need.
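The three phases above boil down to a sequence of capped participant pools: a small baseline, a doubled follow-up, and a small panel. A minimal sketch of how those phases could be tracked; the sizes come from the rough targets mentioned above (50-60, double that, under 30), and everything else is my own assumption:

```python
# Hypothetical tracker for the small-big-small campaign phases.

from dataclasses import dataclass, field

@dataclass
class Phase:
    name: str
    max_participants: int
    responses: list = field(default_factory=list)

    def collect(self, respondent: str, answer: str) -> bool:
        """Record a response, refusing once the phase cap is reached."""
        if len(self.responses) >= self.max_participants:
            return False
        self.responses.append((respondent, answer))
        return True

def plan_small_big_small(baseline: int = 60) -> list:
    """Build the three phases: small baseline, doubled follow-up,
    then a blue-ribbon panel capped at 30 people."""
    return [
        Phase("small baseline", baseline),
        Phase("big follow-up", baseline * 2),
        Phase("blue-ribbon panel", min(30, baseline // 2)),
    ]

for phase in plan_small_big_small():
    print(phase.name, phase.max_participants)
```

The point of the cap on each phase is the same as in the text: a hard limit keeps the evaluation workload predictable before you commit to the next, larger round.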

Get Feedback Before The Release

I am helping out with a customer engagement project where we are trying to get a feedback loop built into the testing validation process. The idea is to get feedback from the experts on our documentation before it goes out on the market. A novel idea, in many ways.

The whole effort was a result of just asking questions internally at the company as to when we need feedback and what that feedback should include. It isn’t an effort to get a focus group. It’s using our internal experts and speaking to them face-to-face about possible issues.

It turns out the internal feedback loop is lacking. Our area had never even been considered part of the feedback chain. But that wasn’t due to intentional neglect. As with so many companies, people just don’t think about the whole feedback loop and when it begins.

To make a long story short (this is a blog after all), I did some hunting around, contacted everyone I know and finally found the right person. We talked, I explained my needs and situation, and we agreed on how the documentation can be tested to get feedback from the people who will use it the most. And all this before the product goes out to the customer. Not after it’s out on the market. Problems solved by us, not the customer.

Why Take the Time?

I am taking a series of online courses from Social Media Marketing University on using social media in the business world. The courses have been jam-packed and really interesting. The extended session on Facebook was last week. It was given by Jennifer Shaheen, one of the best virtual presenters I have heard. So I sent a mail with this unsolicited, positive feedback to Jennifer and to John Souza from SMMU.

SMMU has asked for my feedback after every session — a nice, well-conceived survey fronted by the note: “Please take a moment and fill out these quick surveys so we can better serve you as a client.”

It got me thinking. How will my completing the survey help them serve me as a client? The survey did not ask what I would like to see in the future, or how, in fact, SMMU can serve me better. I would rather have them offer me a coupon for a free coffee. Or extend my support a month for every survey I fill out. Or offer me a reduced rate on my next order. Or offer a bonus for the presenter for every 5-star rating.

Their request did not make it personal enough for me to respond. While I was fine with sending out my feedback on my own, I didn’t feel compelled to take part in the company’s survey. My mail touched on just about every category they had in the survey — so it wasn’t the content. It just did not have a strong enough pitch to have me take the time to fill it out.

When you ask for feedback, make it personal and important to them (your customer), not to you (the organization). You will get people to respond either way, but the more personal you can make it, the better the response.

Congratulations, It’s a Feedback Project

So your organization picked you to run a feedback program. Congratulations.

When it happened to me I knew I was doomed. I had the project from hell. It had failed every year and it was now my turn to go before management and tell them all of the details surrounding the latest failure. Not exactly what you might want to hear.

No one had ever been able to get a feedback project to work in our department. But I was lucky enough to gather a great team. To have a manager that was more like a partner. To have built up some positive internal karma so I would get a bit of latitude. To have a sense of humor. And to know I needed a vision.

I looked first at focus groups. I love the interaction you can generate with a well-facilitated group. After a gentle shove from management, the focus group concept warped into one-on-one conversations.

These would actually be easier to organize, maintain, and control, especially as we were running the program ourselves. So we sharpened up our questions. Put in a lead tracking system to manage our contacts. And put out the word we were looking for subjects. We networked our way into a representative set of subjects. And we asked every feedback subject if they knew someone else we could talk with. This, as it turned out, was surprisingly effective.
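The combination of a lead tracker and the "who else should we talk with?" question is essentially snowball sampling: referrals join the interview queue as they come in. A bare-bones illustration of that loop; the names, and the shape of the referral data, are entirely hypothetical:

```python
# Hypothetical minimal lead tracker for feedback subjects.
# Each interviewed subject can refer others; new names join the queue.

from collections import deque

def run_interviews(seed_contacts, referrals):
    """Interview every contact once, following referrals as they arrive.

    `referrals` maps a contact to the names that contact suggested.
    Returns the contacts in the order they were interviewed.
    """
    queue = deque(seed_contacts)
    interviewed = []
    seen = set(seed_contacts)
    while queue:
        contact = queue.popleft()
        interviewed.append(contact)      # the survey itself happens here
        for name in referrals.get(contact, []):
            if name not in seen:         # avoid contacting anyone twice
                seen.add(name)
                queue.append(name)
    return interviewed

# Example: two seed contacts, one of whom names two more subjects.
order = run_interviews(
    ["Ana", "Ben"],
    {"Ana": ["Ben", "Caro"], "Caro": ["Dev"]},
)
print(order)  # Ana, Ben, Caro, Dev
```

The `seen` set is what makes the tracker useful in practice: referrals often point back at people you already have, and you want exactly one conversation per subject.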

So, when it is your turn to run your feedback program, do look at the literature, and check with the experts. But, remember that your internal resources (your team, your management, your style) can provide the edge you need to succeed.

Running a Program Yourself Brings Rewards

Many organizations don’t have the luxury of using dedicated customer service professionals to collect, organize, and analyze their customer feedback. I was in that situation in the customer feedback project I ran. I did, however, have the luxury of using authors and translators to ask the survey questions. Communications professionals trained to gather and publish information. The major difference? We were asking our customers to give us feedback on the products we produced. We weren’t the service department. We were the front-line production. This was good news and bad news. The good news, we had the opportunity to finally ask our customers all of the things we wanted to know but had never had a chance to ask. The bad news, they might actually tell us.

We worked on the questions and developed a set that would require real answers. We talked about the kinds of responses we might receive, just to be prepared, and did a bit of role playing to get a feel for the rhythm of asking questions and getting responses. Then we started.

Everyone who ran a survey discussed their experience with the group. The group was a safe enough space that we could honestly talk about the good parts and the bad parts of every discussion. And because we were doing the work ourselves, it was exciting: we completed the project a month earlier than originally planned.

Doing the surveys ourselves turned out to be a good thing. It really helped us to establish ourselves as internal experts. And because of the work we did, the members of the group became leaders for an on-going department-wide feedback program.

Want Results? Make it Easy to Participate

Given the breadth and size of the organization where I ran a feedback project, we combined face-to-face and telephone surveys into one chain. It was going to be impossible to get the survey team in front of enough targets in a manageable timeframe without using both avenues. We felt that this would not adversely affect the outcome.

We were looking for feedback on our products (technical documentation) from our main customers (internal consultants). The point of the survey was to determine how we could provide consultants with better products.

The conventional wisdom was that the consultants would never talk to us – they never had in the past. All feedback our internal team received prior to our project was based on rumor and innuendo.

But even starting from here – that they would not talk to us – gave us a position. It told me I had to make it as easy as possible from the start to increase participation. So this is what I did to set the stage:

  • As these were consultants who always billed their time to projects, I was able to get a general project in place to which they could in fact book the time we asked for.
  • As this was in a multi-language company, we offered surveys in two languages to eliminate language issues.
  • If they were in an office where we could meet, we ran the survey face-to-face; if not, we set up a phone interview. And with survey takers literally around the world, we could accommodate whatever time they had to talk with us.

Part of our success was based on the care we took at the start. It wasn’t the strength or probing quality of the survey questions; it was that we established the ground rules so that it was easy to participate.