Students Decide What School They Will Attend?!?


Once I was in an earnest conversation with a former ISM consultant who has now started his own Christian ministry.

Very seriously and authoritatively, he told me: “We as consultants have to recognize that students are making the decision of what school they will attend, and we have to work with it.”

I have two questions about that, questions all of us in Christian education must answer. Question #1: Is that what we want? Do we want students to decide what school they attend? Question #2: Can we do something about it, particularly if the answer to #1 is no?

Question #1: Do we really want students to decide?

Most Christian school surveys ask only about the effectiveness of various program elements, like “Engaging Teaching,” that are presumed to be important. In our Parent Satisfaction and Referral Survey, we ask both how important and how effective each program element is to the respondent. This gives us rich comparison data: we can compare effectiveness and importance scores for a given program item, and compare the importance of items between groups.

At an individual school level, you would hope that most program elements are rated about as effective as they are important to your parents and students. Both importance and effectiveness ratings are fluid, but fixing problems with either is much more than simply “changing perceptions.” We go over these strategies and tactics in depth in our survey debriefs.

Some of the important comparisons are in fact between groups. For example, by combining the results of 821 surveys, we learned that board members and volunteer leaders are among the most critical of all raters in the average satisfaction survey. That was new information: the average standalone school didn’t have enough responding board members or volunteer leaders to make a meaningful comparison. Combined, however, we had over 2,000 respondents for both groups (out of 70,000+ total).

This brings us to students. For many years, I’ve known that student ratings were considerably more negative than those of any other group. By combining all the survey data, we were able to quantify the differences with statistical confidence (99% certainty). The differences themselves were measured using effect sizes.

How we determined these effect sizes and their relative degree of impact is really important. However, I know most of you are busy and want to skip right to the results in plain English. If you want the deep dive, click the toggle below. Otherwise, feel free to skip it and read on!

Curious how we came up with the effect sizes? Click here for a deep dive.

**** Trigger Warning: Statistics Ahead ****

Even if you hate statistics, understanding effect sizes can prove helpful. Negative effect sizes, like -0.70, mean “less than.” In the case of students, a score of -0.70 means students rate “academically competent teachers” as less important than every other group does. Yes, -0.70 is the actual score.

Positive effect sizes, like +0.28 or +0.22, mean “more than.” These too are actual student scores, for “quantity and choice of extracurriculars” and “driving time to school,” respectively. Students find these items more important than all other raters do, including parents, teachers, board members, and volunteers.
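The article doesn’t say which effect-size formula was used, but a common choice is Cohen’s d: the difference between two group means expressed in pooled standard-deviation units. A minimal sketch, with made-up illustrative data (the group names and numbers are assumptions, not the survey’s):

```python
from math import sqrt
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: difference of means divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 + (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / sqrt(pooled_var)

# Illustrative only: student vs. adult importance ratings on some item
students = [1, 2, 3, 4, 5]
adults = [3, 4, 5, 6, 7]
d = cohens_d(students, adults)  # negative: students rate the item lower
```

A negative d here means the first group rates the item lower than the second, matching the “less than” reading above.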

The really hard question about effect sizes is the degree of the effect. Is -0.70 a lot more negative than all the other respondents? (It is.) Is +0.28 a lot more than other raters? (Actually, it’s moderately more.)

There are basically three ways, two easy and one hard, to answer the question of “to what degree.” The first easy way is the classic “canned” interpretation of effect sizes, where 0.20 to 0.49 is “small,” 0.50 to 0.79 is “moderate,” and anything over 0.80 is considered “large.” Put a minus sign in front of these for small, moderate, and large on the negative side.

Calculating an effect size is cut and dried; interpreting it is where the schemes diverge. A whole lot of educational researchers noted that they were not getting effect sizes anywhere near that large, which made tracking the effectiveness of educational interventions a nihilistic exercise. So the thresholds were dramatically lowered: 0.05 to 0.099 was “small,” 0.10 to 0.2499 was “moderate,” and anything greater than 0.25 was a “large” effect size. That is the second easy way to interpret an effect size.
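The two canned schemes above can be sketched as simple threshold functions. (The “negligible” label for below-threshold values is my addition for completeness, not the article’s.)

```python
def interpret_classic(d):
    """Classic canned benchmarks: 0.20 / 0.50 / 0.80."""
    m = abs(d)
    if m >= 0.80:
        return "large"
    if m >= 0.50:
        return "moderate"
    if m >= 0.20:
        return "small"
    return "negligible"

def interpret_education(d):
    """Lowered thresholds used in education research: 0.05 / 0.10 / 0.25."""
    m = abs(d)
    if m >= 0.25:
        return "large"
    if m >= 0.10:
        return "moderate"
    if m >= 0.05:
        return "small"
    return "negligible"
```

Note how the same score lands differently under each scheme: the students’ +0.28 on extracurriculars is “small” under the classic benchmarks but “large” under the lowered ones.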

The problem with both of these schemes is that they really didn’t seem to fit our data. This brings us to the hard way to determine the real impact of an effect size, which is in fact the ideal solution to this problem. 

In the hard way, you gather all your effect sizes, rank them from highest to lowest, and cut the distribution at set percentage points. Effect sizes in the top 8% of all effect sizes are considered “very large.” Those between the top 8% and the top 21% are considered “large.” The remaining floors fall at 31%, 42%, 58%, 69%, 79%, and 92%. Anything below the 92% line, that is, the bottom 8% of all effect sizes, is considered “very small.”

The hard part of the hard way is collecting all those effect sizes. In the course of putting all of our survey data together, we ended up with 26,174 effect sizes. Through the wonders of MS Excel, we arranged all that data in various ways and came up with the following effect size interpretations:

Effect Size Interpretation

Effect Size    Degree        Direction
 0.45          Very Large    Positive
-0.15          Small         Negative
-0.45          Very Large    Negative
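The percentile-cut approach described above can be sketched as follows. The percentage floors come from the article; the helper name and the sample data are illustrative assumptions:

```python
def percentile_cuts(effect_sizes, floors=(8, 21, 31, 42, 58, 69, 79, 92)):
    """Rank effect sizes highest-to-lowest and return the threshold value
    at each percentage floor (e.g. the 'top 8%' line marks 'very large')."""
    ranked = sorted(effect_sizes, reverse=True)
    n = len(ranked)
    return {f"top {c}%": ranked[max(n * c // 100 - 1, 0)] for c in floors}

# Illustrative only: 100 made-up effect sizes from -0.50 to 0.49
sample = [i / 100 for i in range(-50, 50)]
thresholds = percentile_cuts(sample)  # thresholds["top 8%"] is the "very large" line
```

With real data (the article’s 26,174 effect sizes), the same cut points would yield the interpretation table above.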

**** End of hard statistics, deep breath ****

How Students Think Differently

The reason we went through all that work (hidden in the toggle above) is to help you understand how differently students think compared to the adults in their lives.

Here are the actual scores of over 4,000 students on the importance of various program elements (as compared to adult respondents):

Program Element: Rating (with Effect Size)
- Consistent discipline, high behavioral standards: very much less important (below -0.90)
- Parents and teachers working together and communicating; principal addresses parent concerns: very much less important (-0.59 to -1.00)
- Challenging/high academics; academically competent teachers: very much less important (-0.58 to -0.73)
- Christian character development: very much less important (-0.60)
- Quantity of extracurriculars: moderately more important (+0.24)

In other words, when we ask students how important discipline, parent/teacher communication, challenging academics, and Christian character are to them, they rate these items as MUCH LESS important than the adults in their lives do.

The effect size difference is not just LARGE; it is VERY LARGE. This is no small matter.

And what mattered more to our students? Quantity of extracurricular activities, one of the hardest items for a small high school to readily improve.

Note that these are largely high school students, the very students whom “enlightened” parents are most apt to let decide.

Keep in mind that these findings are not based on a small sample, but rather on over 4,000 students. We agree with Leonard Sax, MD, that allowing students to decide what school they will attend is a terrible idea. (See his book, The Collapse of Parenting.)

Question #2: Can we do something about it?


It may not feel natural to challenge a parent’s approach to choosing a school (…or in this case, allowing their child to choose a school), but that’s exactly what needs to happen. Sure, we want Junior to be happy. But we are far more concerned with his success and growth in character, skills, and academics.

Survey after survey shows that, given the choice, Junior will choose the former with little concern for the latter. While we don’t neglect him, we can’t let him steer this ship.

How do you empower the parent to make this consequential decision?

At the end of the day, parents need to be parents. Parents need to decide what is best for their child’s K-12 school experience. Empowering parents requires you to first help them see education and the enrollment decision in a different light. It requires a guided paradigm shift.

We’ve taught our coaching clients how to do this as part of their enrollment closing process for well over a decade. The impact on enrollment has been remarkable.

We’ll soon make these enrollment closing lessons available to every Christian school (not just our one-on-one coaching clients). If you’d like to be notified when lessons are available, simply add your email to the launch list below.
