Most leaders like to think they know what members need, want, and enjoy, especially if the leadership has been in place for a long time or grew up in the organization. That’s not to advocate against internal growth opportunities or continuity of leadership. It’s simply recognizing a challenge: the longer you stay with an organization, the better you have to become at “active listening” to avoid insularity and to keep “it’s always been done that way” from driving your decisions.

To learn more about how an audience thinks, perceives, or believes, we naturally turn to listening tools like surveys for insight and direction. But tread carefully: it’s easy to get it wrong.

Asking someone how they will like their lunch tomorrow doesn’t make much sense. Will they have lunch or just a snack? Will they eat out or brown bag? How can they know whether they will enjoy it? Will they even have time for lunch?

It’s a question people can’t answer, and unfortunately, it’s a lot like many questions we see in member surveys. For example, questions like these:

  • “What would you like to read about in the magazine?” Wait—isn’t content development YOUR job?
  • “What member benefits would you like us to offer?” Hold on a minute—if they say they would really like you better if you offered to pay their kid’s college tuition, would you be able to deliver?
  • “Will you renew?” This is a big disconnect. Members often say what they think you want to hear, not what they are actually going to do at some point in the future when there is a cost involved.

How do you ask in a way that gets meaningful feedback?

Ideally you would have a face-to-face conversation with all your members and get to the heart of what they want from you (and even then, they may just tell you what you want to hear). But that’s not feasible, so a good substitute is a well-crafted survey. Surveys are great directional tools when done right, and done more than once (so you can look for trends over time).

There are myriad survey techniques: readership surveys, attitude/awareness/understanding surveys, focus groups, post-experience surveys, and a whole lot more.

It’s easy to go a little survey-crazy. But like ice cream and chocolate, too much of a good thing may have the opposite effect. Too many surveys (looking at you, Delta Airlines), and you will numb your audience with survey fatigue. As a frequent flyer, I got a survey after every flight but never completed a single one. Do I have opinions? Sure. Did the constant barrage drive me to give them meaningful feedback? Absolutely not.

It’s also very tempting to ask everything you could possibly want to know in one survey. That was particularly true before online survey tools existed: mailing a survey carried print, postage, and data-collection costs, so organizations wanted to gather as much information as possible for the expense.

Have you ever let it slip that you’re sending out a member survey during a cross-department meeting and then received a lot of “since you’re already sending a survey, could you add …” requests from other departments? Question creep is real, and if you’re not careful, will detract from your goal of meaningfully and effectively listening for clues from your members.

Better practice: evolve your survey by repeating it frequently and iterating the questions to elicit clear feedback. Send to small (representative) segments and then fine-tune your questions before sending to a larger audience. Use skip-logic where you can so you don’t waste a question on someone who has indicated it’s not relevant. Start with a few questions, and add more as you read outcomes. This isn’t a one-and-done exercise.
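Skip-logic of the kind described above can be sketched in a few lines. This is a minimal, hypothetical illustration, with made-up question keys and branching, not tied to any particular survey tool:

```python
# Minimal skip-logic sketch: only show follow-up questions a respondent
# has indicated are relevant. Question keys are hypothetical.

def next_question(answers):
    """Return the key of the next question to show, or None when done."""
    if "attends_events" not in answers:
        return "attends_events"
    if answers["attends_events"] == "yes" and "event_rating" not in answers:
        return "event_rating"  # only ask attendees to rate events
    if "overall_satisfaction" not in answers:
        return "overall_satisfaction"
    return None

# A non-attendee skips the event-rating question entirely.
answers = {"attends_events": "no"}
assert next_question(answers) == "overall_satisfaction"
```

The point of the pattern: a respondent who says an area isn’t relevant never sees the follow-up, so you don’t waste a question (or their patience) on it.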

The importance of Active Listening: be authentic, be relevant, and have a purpose.

Crafting a good survey is much like having a satisfying conversation with a friend. If you spend all your energy during the conversation thinking about what YOU are going to say instead of listening to what the other person is saying, you will learn far less and be a lot less satisfied with the experience.

Think about what you want to know, and most importantly, what you will do with the information. If you ask what your constituents WANT to read about, you may not be able to deliver on their requests (if they even have any qualified ideas). And they may perceive that you ignored their suggestions if they don’t see that topic appear. Better: ask what they thought of each of the articles in the last issue you sent. Compare those answers to surveys from previous issues. Trend their satisfaction with articles about specific areas of interest and you’ll begin to understand what content engages your readers.

→Tip: how readers react to a cover image impacts their overall impression of the magazine. First impressions matter, always.

Start with a reasonable goal for the survey. Do you want insight into future trends? Perception of the value of current benefits or services? Overall satisfaction? You may want all of these things, but prioritize, and write the survey so it delivers direction on the insight you need most urgently. That priority also helps you order the questions. As you decide on questions, ask yourself: if a respondent stopped at this question, would the previous questions give you the clues you are looking for? A portion of survey recipients will abandon your survey at some point, so plan it so the most important questions come first.

Clues and insight, not “answers.”

Give your survey audience the ability to give you a reaction without investing too much cognitive time on the question. “On a scale of 1 to 5, with 5 being the highest, how interesting are the following topics to you?” will give you great insight. Asking a binary (yes/no) question makes the respondent think in black and white, and could trigger them to abandon your survey early.

Consider the assertions put forth by Malcolm Gladwell in his book, Blink: The Power of Thinking Without Thinking. In a pulse survey, you’re looking for that “blink” reaction because you want to keep the friction low and the completion rate high.

Manage your own expectations of the outcomes as well. You are not likely to get empirical answers. What you WILL get are clues to what your constituency thinks of you, your service, your programs, and what concerns them. Note well: all surveys are flawed. A high number of responses reduces the error range and gives you more confidence that the outcomes are representative of a larger audience. However, be very, very careful about how much stock you put in the results. According to polls, the Edsel was going to be everyone’s car of choice, and market research done on New Coke said it was going to be a great success.
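To make the “more responses, smaller error range” point concrete, here is a rough sketch of the standard 95% margin-of-error approximation for a survey proportion. It’s a simplification that assumes random sampling, which member surveys rarely achieve, so treat it as directional too:

```python
# Approximate 95% margin of error for a survey proportion, showing how a
# larger response count narrows the error range.
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for observed proportion p with n responses."""
    return z * math.sqrt(p * (1 - p) / n)

# p = 0.5 is the worst case. Quadrupling responses halves the error.
print(round(margin_of_error(0.5, 100), 3))  # 0.098 (about +/- 10 points)
print(round(margin_of_error(0.5, 400), 3))  # 0.049 (about +/- 5 points)
```

Note that halving the margin of error takes roughly four times the responses, which is one more reason to value a consistent, repeated cadence over any single survey’s numbers.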

That’s why getting the survey intent right in the first place and repeating a survey on a regular cadence is so critically important. Attitudes change. Perceptions morph. Distractions occur. Ask the same questions of a consistent cohort on a regular basis and you can avoid unpleasant surprises in retention, engagement, and relevance.

It’s the act of asking that matters.

It’s an honor to be asked for one’s opinion. If your invitation is sincere and the circumstances plausible, your survey recipient will have a positive reaction to being asked for their honest opinion. They will respond if they have time or are otherwise inclined to share their thoughts, but the act of asking is a positive gesture regardless of their response. You have sent a message to them that their opinion matters.

When you can, share the results appropriately. This is another positive reinforcement that IF the individual responded, they were part of a group of influencers (even if they didn’t vote with the majority). If they didn’t take the survey, perhaps it will encourage them to respond the next time.

There are lots of “rules” floating around. And some are even worth following.

“Never ask more than five questions.” Well, we’ve done surveys with dozens of questions that take 24 minutes to complete and still see a 90% completion rate from a large audience. And we’ve done two-question surveys that didn’t get answered. If your questions are relevant and the survey recipient is engaged, the number of questions shouldn’t be a hard and fast rule.

“Don’t have too many options for a given question.” Now that’s one we like. If you have a wall of options, your abandon rate will spike. If you find yourself skimming over your own survey when you do a test, that’s a clue. Craft your survey in a way that moves the respondent along easily (low friction), and you’ll get more responses.

“Don’t ask leading questions.” Well, that depends on the point of your survey.

“Don’t make a question required.” In the world of online tools, sometimes you just have to because of how you’re going to use skip-logic for a better user experience. If you see a high abandon rate at a required question, look at how you are crafting the question.

Active Listening is a culture, not a burden.

It’s amazing what we learn when we simply sit down with someone and get them started talking. Conduct an Active Listening session: get four or five members in a similar category on a conference or video call and ask a leading question, like “What keeps you up at night in your profession or business?” Then listen. Listen for language, for phrases, for common pain points. Prompt for issues, then (later) consider how you might be able to present plausible solutions.

Walk in their shoes for 90 minutes, and then do it again with another group, and another, and another. Active Listening sessions don’t have to be nationwide, highly structured focus groups. They can be casual, take-their-pulse kinds of conversations where the audience you care the most about (your constituency) becomes a part of your strategic thinking. Do them often enough and they become part of your culture.

What if you don’t like what you hear?

Sometimes the outcome of a survey isn’t what you expect, or what you want to hear. What will you do? File it away and chalk it up to bad timing or a distraction out of your control? Or sit down with your team to read, and openly think about what your audience is telling you?

Keep in mind that surveys are directional, not factual, and you are hearing only from the people who take the time to respond (you’ll never hear from 100% of your audience). However, failing to find ways to actively listen comes at a big price: unanticipated retention loss, lowered engagement, and the worst sin of all, tone-deaf communications.

Active listening is a privilege and an obligation.

It’s what will keep you in touch with audience attitudes, and keep your service levels satisfactory. It can be exhilarating (nothing like a high satisfaction score to get the team high-fiving!), insightful, and rewarding.

At MCA, we consider the process of creating an effective survey a privilege and an obligation of working with our clients. Looking for clues to drive acquisition messaging, retention improvements and a stronger engagement with members is a key strength.
