Thursday, December 20, 2012

Slow Down!


I was out of town for several days while a telephone survey project was going on at my office.  When I got back to work and started editing the completed interviews I noticed some odd things in the data.  I cannot give too many details without violating the client’s confidentiality, but I can say that it was a problem that I anticipated before I left town.  I briefed our interviewers on the project the night before I caught my flight to Phoenix to visit my mother.

The survey was one that we have conducted almost every year for the last several years.  The sample for the survey is retired people and the questions with the weird answers can be confusing to people of any age.  We ask people to calculate percentages of their household income.  When I briefed the project last week I told interviewers to slow down, especially when reading these questions.  I reminded the interviewers that we are interviewing people in their 70s, 80s and 90s, so they might not hear or completely understand the questions if the interviewer reads too fast.  One of the interviewers, a man in his 70s, said in the briefing that he was offended by this.  I replied that I was basing the instruction on my experience with the project.  I knew that we would have problems if the interviewers did not slow down.

A good interviewer will mirror the pace at which a respondent speaks.  A good interviewer will also listen for when a respondent is not listening and re-read questions as appropriate.  Our questionnaires are written with certain words EMPHASIZED so that respondents have a clearer understanding of what we are asking.  When I brief and coach interviewers, I usually tell them to emphasize sibilant words.  Words such as “IF” and “BOTH” do not travel well over the telephone, so respondents sometimes do not hear them.  When this happens, respondents often become confused by the question or think we are asking a different question.

I held a meeting with the interviewers on my first day back at work, before we started the dialing shift.  I told them I was cutting off their coffee until they slowed down.

Wednesday, December 12, 2012

Don't Get Discouraged


“Nothing in this world can take the place of persistence. Talent will not; nothing is more common than unsuccessful people with talent. Genius will not; unrewarded genius is almost a proverb. Education will not; the world is full of educated derelicts. Persistence and determination alone are omnipotent. The slogan ‘press on’ has solved and always will solve the problems of the human race.” (Calvin Coolidge)



We finished a huge telephone survey project yesterday.  By huge, I mean 1,000 interviews; we usually conduct 300 or 400 interviews on a project.  On the last few days of the project I held very brief pre-shift meetings.  Their sole purpose was to tell the interviewers not to get discouraged.

It is easy to get discouraged during the last few days of a survey project involving random digit dialing.  We ask to speak to the person in the household who is having the next birthday.  We almost always have quotas by gender, and we almost always fill our quotas of women before we fill our quotas of men.  When we fill our quotas of women, we call random households and ask to speak to the man in the household who is having the next birthday.  It takes many calls to find a man who is willing to participate in a survey.  This project was even more difficult because we had to interview people between the ages of 18 and 55.

I have conducted interviews on and supervised many such projects.  An interviewer can sit on the phone for an hour and listen to answering machine messages and busy signals.  When someone finally answers the phone, they tell you to take their number off the list.  Another hour goes by.  A woman answers the phone.  She says that her husband is not home and he doesn’t do surveys anyway.  Fifteen minutes later a man answers the phone.  He says to unrandomly select another household for the survey.

It is important for an interviewer not to get discouraged because if he or she gets discouraged, he or she will sound discouraged.  If an interviewer sounds discouraged, a respondent will be less likely to want to participate in the survey.  This makes it even more difficult to get the last few interviews needed to finish the project.

Our recent survey was not as difficult as a project we did last year.  We did a survey in which respondents had to be 25 or older and thinking about going back to school in the next six months.  After we filled our quotas of women on that project, we had shifts in which we completed one or no interviews.  That is how I knew that I needed to give our interviewers some encouragement on our recent project.

Wednesday, November 28, 2012

Engagement


I have had some success at social media marketing since my earlier blog post.  The type of marketing that I do is different from most other types of marketing in that I do not sell a product or service.  I recruit interviewers for a market research and public opinion research firm.  Still, I am in the online marketplace offering something in exchange for something else.

My supervisor told me a few weeks ago that my recruiting efforts through Facebook and Google+ were a good idea, and that they had produced some good results, but that they were not enough.  We have several large projects that need to be completed by the end of the year.  So, I got busy again.  We were slow over the summer.  During that time I built my networks on Facebook and Google+ and worked at practicing engagement.  I Liked and +1ed comments and pictures.  I engaged in discussions about some of these posts.  I remember spending most of a morning arguing with Facebook friends about President Obama’s “You didn't build this” comment.

The engagement paid off.  I got my first Google+ referral as a result of sharing a blog post on Google+.  I sent emails to several Facebook friends.  Here is what I sent them:

Saperstein Associates recently started a very interesting and worthwhile project.  It must be done sooner than I expected so that we can start on several other interesting and worthwhile projects.

I need to recruit a few more people to fill our phones through the holiday season.

Please take a look at the link below and let me know if you can help out and make a little extra money over the next few weeks.  If not, please pass the link along to anyone you think might be interested.

Thanks for your help.

John


Not everyone responded, and not all of those who responded were able to become interviewers, but the email campaign resulted in several good hires.  I got a phone call from a Facebook friend about an hour after I posted my letter to the editor of the Columbus Dispatch.

We could still use a few more interviewers, especially for our Saturday and Sunday shifts.  Please give me a call if you are interested.

John C. Stevens
Saperstein Associates
(614) 261-0065

Friday, October 26, 2012

Establishing Rapport


Telephone survey respondents sometimes tell me that they will do my telephone survey, but they don’t usually do surveys.  Experience has given me some skill at getting people to answer survey questions, but my voice probably deserves most of the credit.  I have a deep and authoritative-sounding voice.

One reason that many people do not participate in telephone surveys is personal or hot-button questions.  We tell people when we read the introduction to a questionnaire that the survey is confidential.  We tell them that their responses will be reported in the aggregate and not individually.  Respondents often do not hear this.  If they do hear it, they perceive it more from the tone of voice of the interviewer.  Some respondents ask direct questions about the confidentiality of a survey.  If an interviewer can answer such questions without getting flustered, the interviewer can usually establish rapport with the respondent.

Establishing rapport with a survey respondent is important not simply to get them to participate at all, but also to give honest answers to hot-button and personal questions.  If a respondent perceives that an interviewer is truly neutral the respondent will honestly give answers that may be considered socially unacceptable.  If we do a survey about household financial management, we want accurate information about gambling habits and credit card debt.  We ask demographic questions about age, income, marital status and education level so that we can make sure that we get a representative sample of a population.  We usually ask these questions at the end of the questionnaire.  An interviewer who can establish rapport at the beginning of the call is less likely to have a respondent refuse to answer these personal questions.

Sometimes it is too easy to establish rapport.  Some respondents love to talk.  Several years ago I worked on a survey in which we asked women about their experiences at the maternity center of a local hospital.  One woman went into great detail about childbirth, but was offended that I asked about her annual household income.  Some respondents want to talk about anything but the questions on the survey.  They view the call as an opportunity to get something off their chest.  The interviewer’s responsibility in such situations is to keep the call on track by continually bringing the respondent back to the questions.  This often requires tact and diplomacy, which is also part of establishing rapport.  

Monday, October 22, 2012

Probing for Illuminating Information


My work as a market research interviewer helps organizations to be more competitive and efficient.  Businesses commission market research studies to learn how to add value to their product or service.  They may also conduct market research to uncover problems.  Market research studies are also done so that public relations workers can learn how to put the best spin on a message.

One of my best memories is of when a client said that the information we gathered was illuminating.  To get illuminating data, an interviewer must be able to probe well.  Probing is asking questions that are not written in the questionnaire to clarify vague or general responses to open-ended questions.  We also probe with questions such as “What else?” or “Why else?” to get as much information as possible from a respondent.

Probing well requires the ability to think quickly and the ability to read a respondent.  An interviewer must not ask leading questions.  For example, a respondent might say that the most important issue to be addressed by local public officials is education.  One person might be thinking about teacher salaries when they talk about education.  Another person might be thinking about outdated textbooks.  Still another person might be thinking about dropout rates.  If the interviewer asks “Do you mean the high dropout rate?” the interviewer is asking a leading question.  The respondent might mean the dropout rate, but they might mean something else.  If it is something else, we could lose the information by asking a leading question.  Many respondents will just say “Uh, yeah” rather than correct the interviewer.

A stock probing question is “What do you mean?”  This question usually gets a respondent to be more specific, but sometimes a respondent will give a definition of a word that they used.  Such a response is not usually useful.  A few years ago I did a telephone interview about an upcoming election.  I asked a respondent how he would vote in a race for a seat in the United States House of Representatives.  He told me how he would vote.  The follow-up question was “Why?”  The respondent told me that he would vote for his candidate because the other candidate was a slacker.  I could tell that this guy would give me a definition of the word “slacker” if I asked what he meant, so I asked “What gives you that impression?”  He talked about missed votes and failure to answer letters and phone calls.  This is the kind of information clients need to address why people make the choices that they do.

Another good probing question is “Why is that important to you?”  This usually gets at information that is not revealed by agree/disagree statements or demographic information.  It must be asked appropriately, however.

John C. Stevens
Saperstein Associates
(614) 261-0065

Friday, October 12, 2012

Neutrality in Political Polls




A Facebook friend who lives in Florida recently posted about a polling call that his wife answered.  The interviewer asked if she planned to vote for Barack Obama in the upcoming election.  When she said no, the interviewer said they hoped she would change her mind and ended the call.

I commented on my friend’s post that this did not sound like a legitimate poll.  It was really a canvassing call.  A real poll would not ask a leading question.  Where I work, we even go so far as to rotate the names of the candidates in a race so that half of the respondents hear the name of the Democratic candidate first, and half of the respondents hear the name of the Republican candidate first.  That is one way that we work to reduce bias in polling.
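
The rotation described above can be sketched as a simple per-respondent randomization.  This is only an illustrative sketch, not our actual CATI software; the function name and candidate labels are hypothetical.

```python
import random

def rotated_candidates(candidates, rng=None):
    """Return the two candidate names in a randomly chosen order,
    so that over many calls each name is read first about half
    the time.  This is one standard way to reduce order bias."""
    rng = rng or random.Random()
    pair = list(candidates)
    if rng.random() < 0.5:
        pair.reverse()
    return pair

# Each respondent hears one of the two orderings, chosen at random.
order = rotated_candidates(["Barack Obama, the Democrat",
                            "Mitt Romney, the Republican"])
```

Over a full sample, roughly half of the respondents hear each ordering, which washes out any advantage a candidate might gain from being named first.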

Respondents often ask our interviewers who is sponsoring a survey.  When we conduct political polls we often do not tell the interviewers the name of the client.  We tell them to tell respondents that interviewers are intentionally not told who is sponsoring a survey, so as not to bias or influence anyone’s answers.  We have done several polls for the Columbus Dispatch.  At the end of the call we ask respondents if a reporter from the Dispatch may call them to discuss their answers.  When we do these polls, we tell interviewers to tell respondents who is sponsoring the survey after they have completed the survey.

When I brief interviewers on a new telephone survey project I like to remind them that they are free to have opinions about candidates and issues, but to keep those opinions to themselves when interviewing respondents.  Whether the interviewer asks:

“If the election were held today, who would you vote for President of the United States, Barack Obama, the Democrat, or Mitt Romney, the Republican?”

OR

“If the election were held today, who would you vote for President of the United States, Mitt Romney, the Republican, or Barack Obama, the Democrat?”

the interviewer should not emphasize either choice.  If the interviewer asks a respondent whether they agree or disagree with a political position, the interviewer needs to read the question in such a way that the respondent cannot tell whether the interviewer agrees or disagrees with the position.  We occasionally have respondents ask interviewers how they feel about issues.  Interviewers are permitted to discuss their opinions only after the respondent has answered all of the survey questions, and then to keep it brief.  We remind interviewers not to say okay or uh-huh after a respondent answers a question.  These utterances can be interpreted as approval or disapproval of answers.

As far as I know, CBS News, Gallup, Quinnipiac University and Rasmussen train their pollsters in maintaining neutrality.  These organizations have an interest in making sure that the data they collect is accurate.  The larger the phone room, the more difficult it may be to make sure that each interviewer is asking questions in a neutral manner.  Errors in sampling may also skew polling results slightly.  That is why a margin of error is always reported with survey results.  For the most part, however, polling organizations have neither a Democratic nor a Republican agenda.
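
The margin of error mentioned above has a standard textbook form for a simple random sample.  The sketch below assumes the common 95 percent confidence level and the worst-case proportion of 50 percent; it is a generic formula, not anything specific to the firms named.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n.
    p = 0.5 gives the worst (largest) margin."""
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-interview survey carries roughly a 3.1-point margin,
# while a 400-interview survey carries roughly a 4.9-point margin.
moe_1000 = margin_of_error(1000)
moe_400 = margin_of_error(400)
```

This is why the 1,000-interview project described in an earlier post is worth the extra effort: quadrupling the sample size cuts the margin of error roughly in half.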

John C. Stevens
Saperstein Associates
(614) 261-0065

Wednesday, October 10, 2012

Reading Questions Verbatim


Art requires creativity and innovation based on a foundation of discipline.  A musician must master scales before he or she can improvise.  A painter must understand the color spectrum and geometric forms to create compelling abstract art.  An interviewer needs to be able to read verbatim in order to gather useful information.

Several years ago I devised a mock survey questionnaire to use for screening applicants for telephone interviewer positions.  Research and my own experience showed that interviewers need continual supervision and coaching to consistently read survey questionnaires verbatim.  I needed a screening tool that would help me hire interviewers who already had the ability to read verbatim.

Reading survey questions exactly as they are written is important because changing the wording of a question can change its meaning.

  •  Could you tell your boss to go jump in the lake?
  •  Would you tell your boss to go jump in the lake?
  •  Should you tell your boss to go jump in the lake?

These three questions are very similar, but they are different questions.  I could tell my boss to go jump in the lake; I am physically able to utter those words.  Would I?  It depends on the answer to the last question.  If an interviewer says “could” instead of “would,” the interviewer has re-written the question.  If some respondents hear “could” instead of “would,” we get answers to different questions when we think we are getting answers to the same question.  Our analysis of opinions would be inaccurate.

The interviewers I supervise generally do a good job of reading verbatim, but they are human beings.  If I could record myself reading survey questions verbatim and then let a voice recognition system call people to let them hear the questions I could reduce interviewer error.  Reducing interviewer error would help us deliver more accurate information to our clients while reducing the amount of money we spend on supervising and coaching interviewers to read verbatim.

An interviewer needs to be a robot when reading questions and a human being when listening to answers.  A voice recognition system would need an artificial intelligence component to probe vague answers to open-ended questions.  We will need disciplined interviewing artists until then.

John C. Stevens
Saperstein Associates
(614) 261-0065
jstevens@sapersteinassociates.com



Monday, September 24, 2012

Social Media in Market Research


I have been reading about social media marketing, and I have some experiments under way.  I recruit part time telephone interviewers for a small market research firm.  My experiments involve using my personal Facebook page and Google+ page.  I have also placed ads on local college job boards.

I still have more experimenting and learning to do.  My goal going in was to avoid spending money on advertising in the local newspaper or online forums, and to avoid advertising on Craigslist.  The volume of responses through the college job boards, Facebook and Google+ has been much lower than what we have seen from Craigslist ads, but the quality of the applicants has been much higher.

I am usually quite skeptical of coincidence.  That is why it surprises me that some of the people who applied for jobs over the summer did not respond to any advertisement.  One person worked for us three years ago, one ten years ago, and another worked at the firm 18 years ago, before my time.  Another lives in the neighborhood and wants to practice his interviewing skills as part of his education in the social sciences.  Perhaps these people heard through the grapevine that we were hiring; perhaps word of my emails and ads reached them through other people.  Or perhaps it is coincidence.

All of this gives me the impression that any kind of effective social media marketing will be an adjunct to word of mouth advertising.  Facebook and Google+ can make the process more convenient, but having something that people want and a good reputation are what gets people to pick up the phone or come in the door.

This leads me to believe that the same principles apply to conducting research via social media: Facebook and Google+ may make the process more convenient, but someone will still have to make sure that a sample is representative of a population, and phone calls will still have to be made to ask people to participate in surveys.  The survey might be on a Facebook page, but interviewers will need to call people to direct them to that page.  The chat function may be used to clarify responses to open-ended questions.  That part may be fun.

Monday, September 17, 2012

The Hearty Hello


One thing about my job that gratifies me is that I occasionally run into people I used to supervise, either in person or on social media.  They tell me that some of the interviewing techniques I taught them have helped them in their careers.  It is especially gratifying when they tell me they are glad I told them about The Hearty Hello, because most interviewers seem to think the idea is pretty corny when I first introduce it.

I call it The Hearty Hello because it is alliterative, which makes it easy to remember.  I could call it The Sincere Hello.  The idea is to make a good first impression.  I learned from telemarketing that a caller has a few seconds after a person answers the phone to make a good impression.  I noticed that all of the survey questionnaires we use start with the word “Hello.”  Respondents are more likely to agree to participate in a survey if I say “Hello” with a positive attitude.  I try to say hello in such a way as to thank the person for answering the telephone.  It does not work if I ham it up.

The few variations to The Hearty Hello that I make are when I conduct business to business interviews.  I like to use a person’s name if they give their name when they answer the phone, like this:

“Good morning, XYZ Company, Linda speaking.  How may I help you?”

“Good morning, Linda.  May I speak to Mr. Jones please?”

I will say “good morning” or “good afternoon” if that is how the person answers the phone.  Otherwise, I stick to “hello.”  Repeating the person’s name back to them signals that I am paying attention.  That means less work for them in handling the call.  Whatever impression I make with the person answering the phone will be conveyed to the person I want to interview.

Say “Hello” in a hearty and sincere manner when you greet people in person or on the telephone.  The result of the call will likely be positive.

Friday, September 7, 2012

Robot Interviewers


A few years ago I conducted telephone interviews for a business-to-business market research project.  I called librarians to ask their opinions of a proposed new software product for libraries.  One woman told me very politely after I read the introduction that it sounded like an interesting topic to discuss and that she would not mind taking the time, but that she had recently agreed to do a telephone survey that turned out to be an obscene phone call.  She said that she was very sorry, but that she would have to decline.  I had been trained to overcome objections, but this was a new one for me.  I had to mark the response for that library as refused to participate and move on.

The very next day, on the same project, another woman told me that the survey I had called her about sounded interesting, but that she had recently agreed to do a survey that turned into an obscene phone call.  This time I was quicker on the uptake.  I asked the woman if we could have a female interviewer call her.  She told me “Well, since you asked that question, go ahead.”  I was able to complete the interview with her.  I guess some obscene callers have a thing for librarians.

This is one of many anecdotes that I need to keep in mind as I investigate the use of speech recognition technology to collect information for market research and public opinion research telephone surveys.  My motivation for this investigation is not to reduce payroll costs, although that is a consideration.  Nor is my motivation to eliminate the headaches involved in supervising human interviewers who either do not want to work or who do not pay attention to instructions.

My motivation for investigating the use of speech recognition technology for telephone interviewing is that I think robot interviewers could get more accurate information for our clients.  Some of the vendors of systems that use speech recognition technology for market research interviewing use this as a selling point.  They say that it is actually an advantage for a respondent to know that they are being interviewed by a robot because the respondent is more likely to give an honest opinion. 

I am thinking specifically about a survey we did earlier this year.  We called registered voters throughout Ohio to ask them their opinions of proposed legislation regarding animals.  The survey had questions about regulating the ownership of exotic animals, the treatment of chickens on factory farms, and whether the penalty for cockfighting should be a felony instead of a misdemeanor.  I wondered at the time whether people would give honest answers to these questions or if they would give what they considered to be socially acceptable answers.  Who is going to say that they are in favor of cockfighting?  A person might say that if they knew that their answers would be kept confidential and if they did not have to say it to a human interviewer.

We will most likely not use speech recognition technology for telephone surveys until the technology advances quite significantly.  The technology can now be used for simple surveys that have yes/no or agree/disagree questions.  It can skip a question based on an answer to a question if appropriate.  It can record answers to open-ended questions, but cannot probe those responses.  A robot interviewer would not know when and when not to ask “Why?”  The speech recognition systems used in customer service applications can understand what a customer is saying well enough to route a call to a human CSR and can even schedule a reservation, but cannot actually help a customer resolve a billing discrepancy.
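
The skip logic described above can be modeled as a simple branching table.  This is a hypothetical sketch, far simpler than a production IVR or CATI system; the question ids and wording are made up for illustration, loosely echoing the animal-legislation survey.

```python
# Each question maps an answer to the id of the next question,
# or to None to end the survey.  "default" covers any other answer.
SURVEY = {
    "q1": {"text": "Do you own any pets?",
           "next": {"no": None, "default": "q2"}},
    "q2": {"text": "Do you own an exotic animal?",
           "next": {"yes": "q3", "default": None}},
    "q3": {"text": "Should exotic-animal ownership be regulated?",
           "next": {"default": None}},
}

def route(question_id, answer):
    """Return the id of the next question to ask, applying skip logic."""
    branches = SURVEY[question_id]["next"]
    return branches.get(answer, branches["default"])
```

A table like this handles yes/no skips mechanically, which is exactly the part current speech recognition systems can do; the part they cannot do is decide, mid-answer, that a vague response needs a probe.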

Another consideration as I investigate this issue is that researchers and their clients may rely less on telephone polling in the future.  Google claims that its Google Surveys can provide data that is statistically representative of a population.  If so, this will remove an obstacle to doing more research via the Internet.

These are my impressions of speech recognition technology so far.  I would like to find out if I am incorrect.  Future posts on this blog will cover what I learn about speech recognition technology and artificial intelligence, as well as successful interviewing techniques as they apply to concepts in market research and public opinion research.  I will also look at how big data can be used for market research and public opinion research, and more generally at how automation replaces human workers.  Please feel free to direct me to sources of information or share your own stories about interviewing.

John C. Stevens
jstevens@sapersteinassociates.com