Why Your Home-Grown Survey is Unlikely to Get You What You Want


Today I am going to explain why your own year-end survey is unlikely to get you what you want – and why, frankly, you should pay for one of ours.

Over 820 PK-12 schools have taken one of our surveys, many for multiple years. That’s 140,000+ respondents to date.

Here are four sound reasons why your own home-grown survey will likely not give you the wisdom you need to improve your school strategically.

Problem #1: What’s a good score? 

(AKA, No normed data.)

One of the most important findings from doing Christian school surveys for over a decade now is how very good these schools are – how very satisfying they are.

PK-12 Christian schools are among the most satisfying organizations in the country, really in the world. 

What that means to you is that survey scores that seem good to you are often not that great in comparison to other Christian PK-12 schools.

It’s the worst of all possible worlds: The world of false positives, the world that believes everything is OK, when really it is not.  

Here’s a concrete example of what I mean:

Ask yourself, with 10 being high, is an overall satisfaction average of 7.75 a good score or a bad score? How about a solid 3.9 average score on Principal leadership, where 5 is high? Is that good or bad?

Answer: Both of these scores are 16th percentile in our normed data. 84% of our schools scored higher than that for both questions. 

An implication of this is that unless your survey company has really thought this through, you are likely to get kudos for great-looking scores when, in fact, your scores aren’t that great. We call those “happy talk surveys.” Worse, demographic, school, and community characteristics all impact your scores to a degree. With great difficulty, we were able to adjust for all of that. To our knowledge, no other survey company in America does this.
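To make the percentile logic above concrete, here is a minimal sketch of how a raw average gets ranked against normed data. Everything in it is invented for illustration – the toy norm list and the 15th-percentile result are not GraceWorks’ actual norms, which are proprietary and adjusted for demographics.

```python
# Hypothetical sketch: why a raw average misleads without normed data.
# The norm list below is invented for illustration only.
from bisect import bisect_left

# Imagined overall-satisfaction averages (1-10 scale) from 20 schools, sorted.
norm_scores = sorted([7.1, 7.4, 7.6, 7.75, 7.9, 8.0, 8.1, 8.2, 8.3, 8.45,
                      8.5, 8.55, 8.6, 8.7, 8.75, 8.8, 8.9, 9.0, 9.1, 9.3])

def percentile_rank(score, norms):
    """Percent of normed schools scoring strictly below `score`."""
    return 100 * bisect_left(norms, score) / len(norms)

print(percentile_rank(7.75, norm_scores))  # -> 15.0: a "good-looking" 7.75 ranks low
```

The point of the sketch: 7.75 out of 10 sounds strong in isolation, but ranked against a population of highly satisfying schools it lands near the bottom of the distribution.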

Problem #2: “I don’t care so much … “

(AKA, Effectiveness scores should match relative importance.)

On that same 1-5 effectiveness scale, with 5 being high, is an average score of 4.23 out of 5 a good score for (1) Teachers are Christian role models, or (2) Use of technology in instruction? 

Here’s the answer: 4.23 on use of technology in instruction is a great score for Christian PK-12 schools – 80th percentile. Only 20% of schools will score better on this item.

However, an average effectiveness of 4.23 out of 5 for teachers as Christian role models is a terrible score, just at the 20th percentile.  About 80% of Christian schools will score better on this program element.

On the home-grown survey, we interpret these scores exactly the same. We assume they are equally important to parents, and they clearly are not.

Gene Frost, in his take on Good to Great for Christian schools, makes a big deal of this, and rightly so. That’s why he recommends our survey in his book: we ask about both the importance and the effectiveness of program elements.

What I just said is that it is virtually impossible, on a home-grown survey, to know if the scores we receive are good or bad. Worse, we typically interpret our scores to be good, when in fact they are just average or worse.  

I call this the Pollyanna Effect – who wants to change anything when we are doing just fine?

The classic instance of the Pollyanna effect was a school in the Northwest, where the accreditation team thought the teachers were outstanding. And said so, in their final report. 

The Administrator did not believe it, and our survey, with its normed data, confirmed her concerns. Imagine how hard change would have been without GraceWorks’ survey! 

That’s why we do surveys for accreditations – it’s hard to argue with the comparison data of 140,000+ Christian school constituents. 

Problem #3: “It Matters to Me – or Not.” 

(AKA Some issues impact satisfaction more than others.)

Let’s pretend we’re on Jeopardy, and I’ll give you the answers first: Much worse, Somewhat worse, About the same, Somewhat better, and Much better.

Ok, I’ll even give you the questions: 

  1. How do you compare the Christian character of students at our school to students in public schools in our area?
  2. How do you compare the academic quality of our school to public schools in our area?

So I’ve given you the questions with the same answers for both.

Now comes the crucial question. Which of the answers are good and which are bad for each question?

We can all agree that the first three answers – Much worse, Somewhat worse, and About the same – will hurt us in overall satisfaction, and by the numbers, they do.

Certainly “Much better” must help us with overall satisfaction, and by the numbers, it does.

So that leaves “Somewhat better.”  Are respondents who feel Christian character and Academic quality are somewhat better than public schools less satisfied with, and thus less willing to refer to, your school?

From over 840 Christian schools, the answer is yes and no.

Yes – parents are much less satisfied if Christian character is somewhat better than public schools. 

No – parents are typically no less satisfied if academic quality is somewhat better than public schools.

If you think that’s a big deal, you are right.

Translation: When it comes to Christian character at PK-12 schools, “Somewhat better” is just not good enough.

Christian character is job #1.

In 2021 we went to a great deal of trouble to determine which program elements predict satisfaction, willingness to refer, and overall enrollment growth. One of the most fascinating findings is that, programmatically, the quality-gap score for the Bible/religion program is in the TOP 4 of 40 program elements for predicting enrollment growth.

And if you don’t believe that for your school, you can find out for as little as $995 and 4 hours of staff time. (If you can pull together a list of names, you can do our survey.)

You can certainly ask about importance and effectiveness on your own surveys, and you should, but you will never be able to determine – outside of factor analysis and regression – how much any particular program aspect impacts overall satisfaction and willingness to refer. On our survey, we went to A LOT of trouble to be able to tell you that – which program elements really matter to your parents – and what you can do about it.
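The regression idea mentioned above can be sketched in a few lines. This is a toy “derived importance” example, not GraceWorks’ actual model: the data is simulated, the two item names are borrowed from the example earlier in this article, and the true weights (0.8 vs. 0.2) are invented so the regression has something to recover.

```python
# Hypothetical sketch of "derived importance": regress overall satisfaction
# on per-item effectiveness ratings; the larger coefficient marks the item
# that moves satisfaction more. All data here is simulated, not real.
import numpy as np

rng = np.random.default_rng(0)
n = 500
role_models = rng.uniform(3, 5, n)  # "Teachers are Christian role models"
technology = rng.uniform(3, 5, n)   # "Use of technology in instruction"

# Simulate satisfaction driven mostly by role models (invented weights).
satisfaction = 0.8 * role_models + 0.2 * technology + rng.normal(0, 0.3, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), role_models, technology])
coef, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
print(coef[1], coef[2])  # recovered weights: role models matter far more
```

Two items with identical average effectiveness scores can thus have very different leverage on overall satisfaction – which is exactly why identical raw scores cannot be interpreted identically.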

(It took me three days to figure out a way to do that automatically, and that was after a year in the most research-intensive Ph.D. program in education in the state of Colorado.)

This brings us to the final problem.

Problem #4: Now what do we do?

(AKA How do we prioritize what to “fix” based on the survey?)

Here’s the real beauty of your own home-grown survey. Because of all the problems above, you can interpret it any way you want! I am sorry for being sarcastic, but it is unfortunately true. I’ve seen it way too many times.

You can dedicate time and money to various pet projects and someone’s gut feeling about what parents want. An ambiguous survey can back you up!

These interpretation dynamics are particularly interesting when we do it as a group, especially with boards. (Just thinking about that process makes my head hurt).

There are only three limits to this do-it-yourself approach: Time, Money, and Reality.

For my money, I’d rather put my time and energy into projects and problem fixes that for sure, hands-down, no question, will result in your overall program getting better. 

GraceWorks Survey – the Parent Satisfaction and Referral Survey – solves all these problems (and many more.)

We norm everything – everything! We adjust everything that needs to be adjusted based on the demographics of your parents, the characteristics of your community, regional issues, and the religiosity of the area. We ask how effective and how important for each of your program elements.

Plus – a two-page summary report for your board. Splits by divisions if you need it. Custom questions.  Satisfaction / Willingness to refer by demographic.

And we help you present the results to teachers, parents, and boards – I do that personally. I estimate I have presented our survey over 2,000 times to various school constituencies.

In addition to that, our survey provides all of the following:

  1. Actual leads of potential families, with a contact.
  2. Volunteers willing to help with marketing and fundraising tasks.
  3. Enrollment status of non-returning or not-fully-enrolled families – where else they are going and why.
  4. A research-based answer to “Will they pay” & ”Can they pay” – by income level – for tuition increases.
  5. Barna-like alumni outcomes data.
  6. Promoters – dozens willing to spread the word about your school (with a month-by-month calendar of how to work with them).
  7. Detailed comments of why your constituents love your school or not so much, broken out by demographics. (Such as, what your 3rd grade parents think, what people making over $150,000 a year think, what your Millennial parents think.)
  8. Parent testimonials – often ready to go with minimal editing. All you need to do is ask permission to use them.

The survey will pay for itself many times over by the students you save – because you know what the real problems are – and new students you gain – through actual leads and later leads working with your newly found Promoters.
