Issues with Do-It-Yourself Surveys

Having reviewed yet another homegrown survey, I would like to state, for the record, the problems I have with them. You will undoubtedly conclude that I am biased, since GraceWorks has its own proprietary survey product.

So let me say right off the bat that I am biased against homegrown surveys, and for very good reasons.

Given the various difficult custom survey projects I’ve done over the years, there are times when I do wish homegrown surveys would in fact do the job – it’s frankly a lot of work!

In priority order, here are the problems I see with do-it-yourself surveys.

1) Is it a good score?  

Without comparison data, you tell me. (e.g., Please rate the overall customer service you and your family receive from ______________: Excellent – 46.60%, Very Good – 32.04%, Good – 14.56%, Needs Improvement – 6.80%. Help me understand, please: Is that good or bad?) This is the typical SurveyMonkey approach, and most people will conclude that customer service is fine – only 6.8% say it needs improvement.

But what if your competitor had 58% responding Excellent to the same question, and 40% responding Very Good? Now would you say this is a good score? (We do this on our surveys ….)
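
To make the point concrete, here is a minimal sketch in Python of a simple top-two-box comparison, using the percentages from the example above plus the hypothetical competitor figures (everything here is illustrative, not real benchmark data):

```python
# Minimal sketch: the same "customer service" distribution looks very different
# once a benchmark sits next to it. Figures are the illustrative ones from the
# article; the competitor numbers are hypothetical.

your_school = {"Excellent": 46.60, "Very Good": 32.04, "Good": 14.56, "Needs Improvement": 6.80}
competitor  = {"Excellent": 58.00, "Very Good": 40.00}  # only the two categories quoted above

def top_two_box(dist):
    """Share of respondents in the two most favorable categories."""
    return dist.get("Excellent", 0) + dist.get("Very Good", 0)

print(f"Your school top-two-box: {top_two_box(your_school):.1f}%")   # 78.6%
print(f"Competitor top-two-box:  {top_two_box(competitor):.1f}%")    # 98.0%
print(f"Gap vs. competitor:      {top_two_box(your_school) - top_two_box(competitor):+.1f} points")
```

The 78.6% that looked perfectly fine in isolation suddenly trails the competitor by almost 20 points.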

2) How much do parents care about this item?  

In other words, even if the effectiveness of a given program item is not stellar, does it really matter? In terms of concern, this one is almost a tie with (1).

For example, for a Christian school, is “Teachers exhibit care and concern for students” more important than “Use of technology in instruction”? Answer: The former is more important, BY FAR. (How do I know? We’ve asked over 60,000 parents …)
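
As a rough illustration of what pairing each item with an importance weight looks like, here is a small Python sketch; the favorable percentages and importance weights are placeholders for illustration, not figures from our parent data:

```python
# Minimal sketch: review program items in order of how much parents say they
# matter. Both the favorable percentages and the importance weights below are
# made-up placeholders.

items = {
    "Teachers exhibit care and concern for students": {"favorable_pct": 72.0, "importance": 4.8},
    "Use of technology in instruction":               {"favorable_pct": 61.0, "importance": 3.2},
}

# Sort by stated importance: a mediocre score on a low-importance item may not
# be worth the budget to fix.
for name, s in sorted(items.items(), key=lambda kv: kv[1]["importance"], reverse=True):
    print(f'{name}: {s["favorable_pct"]:.0f}% favorable, importance {s["importance"]:.1f}/5')
```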

3) How much does this impact satisfaction?

Whether a program element is done poorly or well, to what degree does it impact overall satisfaction?  The assumption behind our survey is that Christian schools have limited time and money, and therefore we need to single out for improvement the program elements that REALLY MATTER.

Taking this one giant step further, we are working with one of the top educational researchers in the country, Dr. Dick Carpenter, to determine which program elements, done well, are predictive of actual enrollment growth for your future.  I don’t believe anyone else in the country is doing that research, and we are incorporating it into our own survey.
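
One common way to get at "how much does this impact satisfaction" is a simple driver analysis: correlate each item's ratings with overall satisfaction across respondents. Here is a minimal Python sketch with made-up data; it illustrates the general technique, not our actual model or Dr. Carpenter's enrollment research:

```python
# Minimal sketch of derived importance: items whose ratings track overall
# satisfaction most closely are the strongest candidates for limited time and
# money. Respondent data below is invented. Requires Python 3.10+ for
# statistics.correlation.

import statistics

respondents = [
    # (teacher_care, technology, overall_satisfaction), each on a 1-5 scale
    (5, 3, 5), (4, 4, 4), (3, 5, 3), (5, 2, 5),
    (2, 4, 2), (4, 3, 4), (3, 3, 3), (5, 4, 4),
]

def driver_strength(item_index):
    """Pearson correlation between one item's ratings and overall satisfaction."""
    item = [r[item_index] for r in respondents]
    overall = [r[2] for r in respondents]
    return statistics.correlation(item, overall)

print(f"Teacher care vs. satisfaction: r = {driver_strength(0):+.2f}")
print(f"Technology vs. satisfaction:   r = {driver_strength(1):+.2f}")
```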

4) Scale problems.  

There is a whole book (The Ultimate Question, 2nd edition) – not to mention numerous websites – on the correct scale for the willingness-to-refer question:  On a scale of 0 to 10, with 10 being high, how likely is it that you will refer our _______ to a friend or colleague?

This is not a copyrighted question.  How to score it can easily be found as well. For the love of science, please do NOT ask it as Yes/No, and don’t ask it on 4- or 5-point scales.

On five-point scales, there are OCD/perfectionist types who will never, ever give you a 5 – no matter how much you deserve it!  And a 4 misses the cut for the top category of willingness to refer.

It took Fred Reichheld 45 years to figure this all out – why mess with a good thing?
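
For reference, the standard Net Promoter scoring of that 0-to-10 question fits in a few lines of Python; the responses below are made up:

```python
# Standard Net Promoter scoring: promoters (9-10) minus detractors (0-6),
# expressed as a percentage; passives (7-8) are ignored. Responses are
# illustrative only.

responses = [10, 9, 9, 8, 7, 10, 6, 9, 5, 10, 8, 9, 3, 10, 7]

promoters  = sum(1 for r in responses if r >= 9)
detractors = sum(1 for r in responses if r <= 6)

nps = 100 * (promoters - detractors) / len(responses)
print(f"Promoters: {promoters}, Detractors: {detractors}, NPS: {nps:.0f}")
```

Notice that a 7 or an 8 counts as passive and a 6 counts as a detractor, which is exactly the nuance a Yes/No or five-point version throws away.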

5) Testing multiple aspects in one question.  

In general, if your question or answer has an “etc.,” this is NOT GOOD.  Consider this question:  “Has your student ever been a victim of bullying, teasing, harassment, etc. while attending ________?”  Can anyone answer “No” to that question without lying?  And do we think for a minute that bullying = teasing = harassment = etc.? (Headache coming on.)

6) No cross-tabbing.  

It’s not just how many people answered a question a given way; it’s how answering that way correlates with other important things, like overall willingness to refer or overall satisfaction.

Let’s say that 50% of respondents check the box that says that coming up with the cost of tuition is a large sacrifice for them.  Is that good or bad?  Now consider if you knew that those 50% were also moderately MORE SATISFIED with the school than everyone else, and the difference was statistically significant … How might that change your view of people sacrificing to pay your tuition?

(BTW… that is the typical result when we ask the question.)
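
Here is a minimal sketch of that kind of cross-tab in Python: compare mean satisfaction between the "large sacrifice" group and everyone else, and test whether the difference is statistically significant. The respondent rows are invented, and the significance test shown (Welch's t-test via SciPy) is just one reasonable choice:

```python
# Minimal cross-tab sketch: does the "tuition is a large sacrifice" group
# report different overall satisfaction than everyone else, and is the
# difference statistically significant? Data is made up for illustration.

from scipy import stats

# (large_sacrifice, overall_satisfaction on a 1-5 scale) per respondent
rows = [
    (True, 5), (True, 5), (True, 4), (True, 5), (True, 4), (True, 5),
    (False, 4), (False, 3), (False, 4), (False, 4), (False, 3), (False, 4),
]

sacrifice = [sat for flag, sat in rows if flag]
others    = [sat for flag, sat in rows if not flag]

result = stats.ttest_ind(sacrifice, others, equal_var=False)  # Welch's t-test
print(f"Sacrifice group mean satisfaction: {sum(sacrifice) / len(sacrifice):.2f}")
print(f"Everyone else mean satisfaction:   {sum(others) / len(others):.2f}")
print(f"Welch t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```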

7) Opportunities lost.

… to recruit volunteers, ask for leads, keep tabs on your alumni.  Of course, this requires you to vary the questions asked based on the answers to other questions. It can be done, for sure, but it is difficult and painstaking for people who don’t do it all the time.
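
Conceptually, the branching is just "if they answered X, then also ask Y." A tiny Python sketch, with hypothetical question wording and thresholds:

```python
# Minimal sketch of skip/branch logic: which follow-up questions a respondent
# sees depends on earlier answers. Question text and thresholds are hypothetical.

def follow_up_questions(answers):
    """Return extra questions to show, based on earlier responses."""
    extra = []
    if answers.get("likelihood_to_refer", 0) >= 9:
        extra.append("May we contact you about families you might refer?")
    if answers.get("willing_to_volunteer"):
        extra.append("Which volunteer roles interest you?")
    if answers.get("is_alumni_family"):
        extra.append("May we add you to the alumni mailing list?")
    return extra

print(follow_up_questions({"likelihood_to_refer": 10, "willing_to_volunteer": True}))
```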

The Solution to Survey Malpractice

As you can see, conducting a homegrown survey inevitably yields incomplete, misinterpreted, or sloppy data. This can lead to mistakes that cost your school dearly.

GraceWorks’ Parent Satisfaction & Referral Survey was designed and refined for schools like yours. Learn more about it and contact us to get the insights your school needs.
