Public Schools are Public Cooperatives – Divided They Fall

We do not think about it much until we pass a ranger station, a forestry tower, or a pickup truck with a department logo on the side, but every day many hardworking individuals go to work for our state forestry, federal forestry, state game warden, and federal park services around our state and country. We rarely give these important individuals much thought as they work hard to provide the services we need and enjoy. Whether protecting the game and fish we all see the need to preserve or guarding the forests reserved for our enjoyment, these individuals are a vital part of maintaining the way of life we have come to enjoy and of ensuring that the country our children inherit has these same natural resources preserved and, hopefully, even improved.
 
Of course, some of us do not use these services directly at all. The person growing up and living in an apartment in downtown Jackson may never meet one of these workers in his entire life. The catfish farmer in the Delta, although he raises, catches, and comes in contact with more fish than even the most avid tournament fisherman, may never contact someone from the Department of Wildlife, Fisheries, and Parks as he maintains his operation and provides for his fish and ponds. The large landowner with literally hundreds, if not thousands, of acres in pine and hardwood may never meet his local forestry commission as his investment in timberland grows awaiting eventual harvest. Yet a portion of all of these individuals’ taxes goes directly to pay for these departments, their buildings, and their employees’ salaries. Yes, regardless of whether or not we take advantage of our beautiful state and federal parks for a vacation or outing, our taxes are used to support and maintain these public resources.
 
There are many other government resources like these that we all pay for in taxes, yet may never benefit from directly, or may benefit from far less than some of our fellow citizens do. Social Security, the Department of Agriculture, our local law enforcement; the list goes on and on. We pay into these, yet we may never receive the direct benefit many other citizens do from their existence. Our taxes flow into them, yet the person who hunts once a year on state land and the person who visits state waterways daily receive vastly different levels of benefit, although both may be paying the exact same share of the taxes that operate the state’s Department of Wildlife, Fisheries, and Parks. We all realize that this is the only way these public services could possibly be run. We justify our tax payments into them, despite the vast differences in the direct benefits we may receive, as a necessity, since they benefit the overall public good.
 
Imagine the ridicule that would greet the Delta catfish farmer who suggested he receive “his share” of the taxes going to the Department of Wildlife, Fisheries, and Parks back in the form of a tax deduction or refund, since he does not use the services himself directly. The same could easily be said for the large landowner who operates an enclosed “fox pen” of hundreds of acres and suggests he receive back his “share” of the tax money that normally goes to game wardens’ management of wildlife beyond his fenced-in hunting land. Why? I mean, why is this suggestion ridiculous? Why would it be beyond silly for the person living in a gated community with its own security guards to demand back his or her portion of the tax money that supports the local police force? What would be the huge downside to allowing such a person to have his or her “share” of the taxes pulled back out of a public service which they do not themselves choose to use?
 
The first answer that comes to mind is that we all know these are government services for the public good. The public obtains the benefits, sometimes very directly and sometimes totally indirectly, but we all benefit in some way. Beyond the fact that we all receive “something,” whether direct or indirect, from these public services, we also all know what would happen to these departments and these public resources should those who have the means not to use them directly, or who have no need for the services themselves, be able to “pull out” their tax money. Would the Natchez Trace, Pickwick Lake, Holly Springs National Forest, or the many other resources we enjoy be able to operate if such funds were suddenly pulled from their budgets? If only those using them directly paid taxes for their operation, we know that these departments and the services they provide would probably cease to exist, or exist on such a small scale that we would no longer recognize them. After all, these are public works, and as taxpayers, we all are needed to pay into this “cooperative” that pools our funds together to provide public services which could not exist without this cooperation. Without all of our membership, such resources would soon be nonexistent for the regular citizen to enjoy and would only exist for the richest among us to enjoy privately on privately operated land. Perhaps we could still visit the overgrown state park, with impassable roads and thorns clogging the trails, that the meager remaining budget allowed, while the few rich and their families visited plush private parks and hunting reserves, maintained with some of the money they were able to “pull out” of the public departments they did not themselves benefit from. But we all know this is not the type of country we have chosen to live in. We made this choice long ago, and we continue to maintain it today by continuing these cooperative public works that we all are able to enjoy equally.
 
The above situation is the reason so-called “vouchers” for use in private schools do not work. It is the same situation, with the same effect, for the government resource of a free public education for all children, whose benefits we all enjoy either directly or indirectly. If there is waste in our public system, address the waste and enforce the laws against such waste already on the books. If money is being spent in unwise ways, then simply pass laws listing those as unallowable expenditures by local boards. But vouchers do nothing to address such issues; they only use the issues as an excuse to effectively doom the entire system for everyone. One could go into much more detail about how the ability to pull out of our public education “cooperative” would doom the system to fade into a slum version of what we currently enjoy, with quality, safe education available only to the richest in our communities, many of whom had always used private schools anyway, regardless of the introduction of vouchers allowing them to siphon off their share of tax dollars from our cooperatively funded public schools. Yes, it could be discussed at much greater length, but I do not think it needs to be. The heart of why such a system of pulling out of cooperative public services fails is outlined above, and it does not really need further explanation for the average person to see the common sense of its inevitable effects.
 
Further discussion of the same concepts can be found in two prior blog posts.
– Clint Stroupe

School Funding in Mississippi House Bill 957, Good or Bad?

Okay, so I have had a couple of requests to summarize the changes proposed in HB 957 and their likely effect on the future of public education in Mississippi.  While there are many others better able to explain the changes and their implications, I will do the best I can with my current understanding.  The bill would effectively replace the current Mississippi Adequate Education Program (MAEP).  It is a significant bill, and it drastically changes the way Mississippi “should” fund public education each year in our state.  While the bill makes many changes to the current funding law for our schools, I will look briefly at the most obvious one: the change to “base student cost.”

The change in base student cost, which is the amount of money given to districts per child for the expense of their education, can and likely will have an impact on local school district funding and local taxes in the future.  MAEP used a formula to determine this cost: it looked at districts that were doing an “average” job of educating their students, according to accountability measures, and used the spending those districts needed to accomplish this goal to determine the base student cost to reimburse each year.  You may be asking several questions at this point.  Why would this amount be different for each district across the state?  Shouldn’t it be the same?  Why would the average spent per student in these average-performing districts change the amount the state gives?

The answer lies in where districts mainly get their money to pay for educating their students.  The majority of it typically comes from the state (which is the topic we are discussing), but individual districts use their share of local property taxes (land, houses, etc.) to supplement the money from the state.  This amount can vary HUGELY from district to district.  The value of property in each district can vary to large degrees (think DeSoto County vs. Kemper County), and the percent used of the available property tax rate for schools may also vary.

So let’s say you live in the very poor “County X,” with little or no industry and little value in property.  Its school district can make a request each year to increase its share of local property taxes, up to the maximum percentage set by state law, to attempt to get the funds needed for students to be well educated.  So very economically poor “County X” has now maxed out its share of property taxes on all of its property (which is mainly of low value), but it still has very little money from this source to add to the money coming from the state, despite its maxed-out rate, because property is simply not highly valued in that area.  Now, “County Y” is on the other end of the state and has huge industry with vibrant economic development.  In “County Y” the businesses are many, the houses are large, and the property values are high.  “County Y” does not request an increase in its percentage of available property tax money each year, because even its smaller percentage yields a large amount of money (a bigger pie means bigger pieces); it does not need to max the rate out to get plenty of local funds.  Thus, “County Y” has much, much more local money to spend per student, despite its lower education property tax rate, than “County X” has, despite its maxed-out rate.  As you can see, if more local education money is needed, “County Y” can raise its rate, because it has not reached the maximum allowed by state law.  But poorer “County X” can only sit back and decide what will go unpaid and unsupplied in the district.
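To make the arithmetic concrete, here is a minimal sketch in Python with entirely invented property values, tax rates, and a hypothetical cap (real county valuations and millage rules differ) showing why a property-rich county can raise far more school money at a lower rate:

```python
# Hypothetical illustration: local school revenue = taxable property value x tax rate.
# Every number below is invented for the example; real valuations and caps differ.

def local_school_revenue(taxable_value, school_tax_rate):
    """Local dollars raised for schools from property taxes."""
    return taxable_value * school_tax_rate

# "County X": little valuable property, rate maxed out at a hypothetical 5% cap.
county_x = local_school_revenue(taxable_value=200_000_000, school_tax_rate=0.05)

# "County Y": lots of valuable property, rate comfortably below the cap at 3%.
county_y = local_school_revenue(taxable_value=2_000_000_000, school_tax_rate=0.03)

print(f"County X (maxed-out rate): ${county_x:,.0f}")  # $10,000,000
print(f"County Y (lower rate):     ${county_y:,.0f}")  # $60,000,000
# County Y raises six times the money at a lower rate, and it still has room
# to raise its rate later, while County X has nowhere left to go.
```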

Now, in the above example, who knows which county is doing better academically at the end of the year?  Maybe it is “County X,” but it is probably going to be “County Y.”  Regardless, MAEP is like a honey badger in the sense that “it don’t care” who is spending more or less at the beginning.  All MAEP is concerned with is looking at student performance and doing the math to determine the average total money spent to get this average performance result.  Once this amount of money is determined (which is done every four years), the base student cost is updated, and the state uses it to “recommend” to the legislature the amount to allocate per pupil for each district.  This is base student cost.  Now, I am leaving out some other MAEP details, such as the fact that the cost goes up a little each year due to inflation.  But the main point is that MAEP looks at what it takes to educate an average-performing student, updating the figure every four years, to determine the base amount to give all districts per student, with the idea that this is the minimal, adequate amount needed for a student to achieve average performance.
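For readers who like to see the mechanics, here is a simplified sketch of the idea behind the calculation, with invented district data; the actual MAEP statute includes adjustments (inflation, add-on programs, and so on) that are left out here:

```python
# Simplified sketch of the MAEP base-student-cost idea: find the districts rated
# "average" on accountability, then divide their combined spending by their
# combined enrollment. All district data below is invented for illustration.

districts = [
    {"name": "District A", "rating": "average", "spending": 51_000_000, "students": 10_000},
    {"name": "District B", "rating": "high",    "spending": 70_000_000, "students": 11_000},
    {"name": "District C", "rating": "average", "spending": 24_500_000, "students": 5_000},
    {"name": "District D", "rating": "low",     "spending": 30_000_000, "students": 7_000},
]

# Keep only the "average"-performing districts.
average_districts = [d for d in districts if d["rating"] == "average"]

# Base student cost = what those districts spend per student, on average.
total_spending = sum(d["spending"] for d in average_districts)
total_students = sum(d["students"] for d in average_districts)
base_student_cost = total_spending / total_students

print(f"Base student cost: ${base_student_cost:,.2f}")  # $5,033.33 with these numbers
```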

Now, the big change with HB 957 is that it gets rid of this formula and says the amount is now $4,800 per student.  This move is being criticized because the whole point of the above formula was to make a logical, somewhat scientific determination of the minimum spending it takes per student to achieve average results; the bill replaces this objectively determined number with a number made up on the spot.  Again, so what?  What is the big deal if the number is made up out of thin air, as long as it works out well for your district to get its funding?

Well, here is the rub.  While the increase or decrease in funding for each district as listed in local newspapers may look like an increase under the new law, that increase is based only upon the amount funded last year for that district, which was less than what MAEP says each district needs to adequately educate its students.  Right now, MAEP is figured using the calculations mentioned above and stands there waiting every year, figuratively staring the legislature and governor in the face.  It says, with real, hard data: this is the number that needs to be funded per student, objectively.  With the formula and hard data gone, replaced by a made-up number, there is nothing holding them accountable to fund at an amount based on data.  If, ten years from now, the legislature comes in and changes that number to $3,800, there is no real data to say this number is not just as valid.  Another issue is that this number does not adjust automatically once fully implemented.  With MAEP figuring base student cost as the average cost for average performance, the amount typically goes up over time as costs increase.  Without this type of recalculation, the new bill’s $4,800 per student will have less and less buying power over time.  Like your father’s salary when you were a child, what was a large number in buying power then becomes a smaller number in buying power every year, until decades later it almost seems a funny joke unless it is raised over time (e.g., Cokes were a nickel back in my day).  There is no mechanism for inflation or other adjustments to raise this base $4,800 amount from year to year, even as it more than likely buys less and less.
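The erosion is easy to put numbers on. Here is a small sketch assuming, purely for illustration, a steady 2% annual inflation rate:

```python
# What a frozen $4,800 per student is worth in real buying power over time,
# assuming a hypothetical 2% annual inflation rate.

base_amount = 4_800
inflation_rate = 0.02

for years in (0, 5, 10, 20):
    real_value = base_amount / (1 + inflation_rate) ** years
    print(f"After {years:2d} years: about ${real_value:,.0f} in today's dollars")

# Roughly: $4,800 today, ~$4,350 after 5 years, ~$3,940 after 10,
# and only ~$3,230 after 20 years -- with no mechanism to adjust it.
```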

Some might say, “Hey, the legislature almost never funds the current MAEP amount anyway, so who cares about the rewrite?  They give schools what they feel like giving, despite the formula they adopted themselves and that various governors have voiced approval for, including current Gov. Bryant.  What’s the difference now?”  In response, there really is not a huge difference, other than how easily schools can be starved of funding without the automatic accountability and political consequences that come from MAEP being an existing law.

Currently, MAEP is like a divorce settlement document that spells out how much child support the mother (the school districts) is due from the father (the state government) each month to cover the father’s share of the children’s needs, based upon that year’s cost for insurance, babysitting, etc.  Now, in this analogy, the father has been defying the divorce settlement for years and years.  Month after month, he says, in effect, “I’m sorry, baby, but times are tough.  I’ll send what I can.”  The mother just keeps making up the difference out of her own income (local taxes).  Then, every year or two, good old dad sends a check for the amount that was legally due, for the first time in ages, and wants mom to praise him and be tickled for him doing one month what he was supposed to do every month for years.  The mother does not make a big deal of this and just keeps the peace, despite being shorted.

However, what would you expect her reaction to be if and when the father calls her up and says, “Hey babe, guess what?  I think it’s time we rewrote our divorce settlement.  I mean, we both know it’s not realistic, and I say we just lower it and set a number without all this yearly-increase mumbo-jumbo.  The kids are almost school age, so just figure how much it costs you right now and let’s set that number.  It’s not like food, insurance, or other stuff goes up every year.  Besides, you know me, I’ll treat ya right if you need more cash in the future.  Daddy is good for it!”  Would this be something advisable, in your opinion, for the mother to agree to?

My opinion would be: if you think the mother should gladly agree to a new settlement, then you will certainly have no concerns with HB 957 in relation to local schools.  But if you think the mother would be unwise to trade the current formula for Dad’s new plan, even though Dad has almost never kept his end of the current agreement anyway, then you probably have issues with making this drastic a change to the current MAEP funding formula for schools.  Personally, while I do think a divorce settlement (state education funding law) that is constantly being violated (underfunding by the legislature almost every year) is a problem that needs addressing, I think the only way a fair settlement will be created is if the mother (the local school districts) and the father (the state government) both sit down at the same table and come up with a real plan that addresses the real-world funding that is and will be needed from year to year, with an understanding of the accountability measures that will be in place to make sure both mother and father actually follow through.  I can say without reservation that this current bill, HB 957, is definitely not that type of realistic agreement in which both sides’ needs are addressed in a way that both sides actually understand.

– Clint Stroupe

 

*If anything in the above article seems factually incorrect, please let me know.  Also, the views expressed, as always, are my own personal views and are in no way affiliated with anyone else or any other entity.

2016-17 Mississippi U.S. History Rankings by School & by District

In previous blog postings, I have bemoaned the fact that the Mississippi Department of Education does not, or at least has not consistently, released data detailing the various components of the Mississippi public school accountability model by grade and school for the majority of the items schools and districts earn points for in the current model.  Specifically, we currently have no data available to view by school for the 2016-2017 school year on 5th Grade Science, 8th Grade Science, Biology I, Reading Growth by grade/subject, and Mathematics Growth by grade/subject.  This is in spite of the fact that we are consistently given results by district and school, with grade-by-grade breakdowns of overall achievement level (1-5) scores, for 3-8 English Language Arts (ELA), 3-8 Mathematics, Algebra I, and English II in a well-publicized fashion.  As we can see below, none of the categories mentioned above has itemized data available as of October 23, 2017.  This information is needed just as much as, if not more than, what is traditionally made available, and hopefully it will be released at some point.  This is discussed in greater detail in a previous blog post:  https://thinkingconservativeblog.wordpress.com/2017/08/25/mississippi-accountability-wheres-the-growth-or-science-or-history/

 

[Screenshot of the assessment data available for download from MDE as of Oct. 23, 2017]

However, with the complete listing of school and district accountability grades released last week, we can easily figure at least one of the categories that has still not been released as an individual assessment result:  U.S. History proficiency.  Since the U.S. History category in the accountability model comes from only one assessment, as opposed to Science (5th Science, 8th Science, and/or Biology I), we can break it down accurately and fairly easily, school by school and district by district, from the data released by the state.  The following two files show the rankings of Mississippi’s individual schools and school districts on the U.S. History SATP2 assessment for the 2016-17 school year.  As always, this information is put out merely for informative purposes, with the goal that greater distribution and understanding of results will help to improve instruction.  If you make use of the data in this format, an acknowledgement of where the data came from would be greatly appreciated.

2016-17 U.S. History Proficiency Rankings by School

2016-17 U.S. History Proficiency Rankings by District
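For anyone curious how such a ranking can be assembled once the component data is in hand, here is a minimal sketch.  It assumes a hypothetical CSV export with `School`, `District`, and `US_History_Proficiency` columns; the state’s actual file layout and column names differ, so treat this purely as an outline:

```python
# Minimal ranking sketch. "accountability_2016_17.csv" and its column names
# are placeholders, not the state's actual file layout.
import pandas as pd

df = pd.read_csv("accountability_2016_17.csv")

# Rank schools from highest to lowest U.S. History proficiency.
by_school = df.sort_values("US_History_Proficiency", ascending=False).reset_index(drop=True)
by_school.index += 1  # 1-based rank

# District view: an enrollment-weighted average would be ideal, but with only
# school-level percents available, a simple mean per district is the fallback.
by_district = (df.groupby("District")["US_History_Proficiency"]
                 .mean()
                 .sort_values(ascending=False))

by_school.to_csv("us_history_rankings_by_school.csv")
by_district.to_csv("us_history_rankings_by_district.csv")
```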

 

-Clint Stroupe

PDF Files of 2016-2017 Mississippi Accountability Results

The following are the district, elementary & middle school (700-point schools), and attendance center & high school (1,000-point schools) breakdowns from the media file released by the Mississippi Department of Education today, after being approved by the Mississippi Board of Education.  I thought it might be beneficial to have them in an easy-to-view PDF, especially for viewing on tablets and other devices.

2016-2017 School District Accountability Results

2016-2017 Elementary & Middle School Accountability Results

2016-2017 Schools with a 12th grade (Attendance Center & High School) Accountability Results

 

Clint Stroupe

2016 to 2017 Average Mississippi ACT Composite Score Growth by District & School

In the absence of assessment results for the same group of students over a period of time, designed to measure growth in academic achievement, the best data to look at regarding the assessment of all Mississippi 11th graders using the ACT is the change in composite scores.  However, we must also understand that comparing the growth (or lack thereof) of one group of 11th graders at a school or school district in one year to that of a totally different class of 11th graders in another year has several limitations.

The most obvious limitation is that the students, as a group, arrived at differing levels at the beginning of their junior years or before.  For example, had the group that made up the 2016 11th grade at one school taken the ACT in the 9th grade, their average composite score might have been 17.  At the same school, had the group that made up the 2017 11th grade taken the ACT in the 9th grade, their average composite score might have been 14.  When these two different cohorts of students finally took the ACT in their 11th grade years, we might find the 2016 group now had an average ACT score of 19, while the 2017 group had an average score of 18.  However, the 2016 group grew by only 2 points on average over two years, while the 2017 group grew by a whopping 4 points on average.  Thus, the argument could be made with a great deal of weight that the school was able to improve the academic achievement of the 2017 group to a larger degree than the 2016 group.  Yet just tracking the 11th grade scores of the two groups and comparing them (as I am about to do) would seem to show a decrease (-1 point) among the 11th graders at this school on the ACT, despite what might have been a tremendous job of growing the second group of students from the level at which they arrived.
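Using the numbers from this example, a few lines of Python make the distinction plain:

```python
# The hypothetical school from the example: two different cohorts, each with a
# 9th-grade baseline and an 11th-grade ACT composite average.
cohorts = {
    "2016 juniors": {"grade9_avg": 17, "grade11_avg": 19},
    "2017 juniors": {"grade9_avg": 14, "grade11_avg": 18},
}

# True within-cohort growth: the same students at two points in time.
for name, c in cohorts.items():
    growth = c["grade11_avg"] - c["grade9_avg"]
    print(f"{name}: grew {growth} points ({c['grade9_avg']} -> {c['grade11_avg']})")
# 2016 juniors: grew 2 points (17 -> 19)
# 2017 juniors: grew 4 points (14 -> 18)

# Naive year-over-year comparison: two *different* groups of 11th graders.
naive_change = cohorts["2017 juniors"]["grade11_avg"] - cohorts["2016 juniors"]["grade11_avg"]
print(f"Naive 11th-grade comparison: {naive_change} point(s)")  # -1, looks like decline
```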

Unfortunately, we do not live in a perfect world where such data is available for our Mississippi students.  What we do have is two separate years of ACT average composite scores for all 11th graders at each particular school and school district.  We can assess whether growth occurred between the two totally different years’ 11th graders, but this is admittedly of limited value compared to growth of the same students over time.  This raises the question:  why is this information of any value?

This growth of non-cohort 11th graders, though imperfect, is of more value than what we would have otherwise.  Without looking at such trends in the scores (even among different cohorts), all we would have is the final achievement results each year.  Such results, without further analysis, would offer no glimpse as to whether the arrow of student achievement is pointing up or down.  However, even as imperfect as these attempts at assessing growth are, they can begin to show us trends about our schools and districts which can be very valuable.  As an example, if we notice that a particular school’s average ACT composite score went down 1 point from the previous year, this might cause us some concern, but it could be nothing to worry about.  If the next year we notice the average ACT composite score of 11th graders at the same school went down another point, this gives our concern a bit more legitimacy.  After still another year, if the score continues to drop, we would be wise to become very concerned, as the trend line seems to be going down instead of in the desired upward direction.
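That sort of multi-year watch is trivial to automate.  Here is a sketch, with made-up score histories, that flags a school once its average has dropped several years in a row:

```python
# Flag schools whose average ACT composite has declined multiple years running.
# Score histories below are invented for illustration (oldest year first).

history = {
    "School A": [18.1, 18.3, 18.2, 18.5],
    "School B": [19.0, 18.0, 17.2, 16.5],  # three straight declines
}

def trailing_declines(scores):
    """How many consecutive year-over-year drops lead up to the latest score."""
    count = 0
    for i in range(len(scores) - 1, 0, -1):  # walk backward from the newest year
        if scores[i] < scores[i - 1]:
            count += 1
        else:
            break
    return count

for school, scores in history.items():
    drops = trailing_declines(scores)
    if drops >= 3:
        print(f"{school}: {drops} straight declines -- cause for real concern")
```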

Another argument for examining growth, even between different cohorts of 11th graders, is that schools with lower overall scores need attainable, measurable goals to gauge their progress.  A school with an average ACT composite score of 15 is definitely not at the top of the state’s overall rankings.  Even if it improves to a 16 the following year, it is still nowhere close to the top or to where it eventually wants to be as a school.  However, it did improve.  The next year, if it moves up to a 16.5, we are noticing a positive trend in its average scores that gives the school feedback and hope regarding its efforts at improvement.  Since the only way to reach the top of the mountain is to take steps up the slope, this type of growth measure is the only way to assess whether that is happening as a trend over multiple years.  Growth should be the goal of every district, whether at the top or the bottom in overall achievement.  Thus, it is also valuable for those toward the top every year.

There are many other things to consider, though, which make this simplistic type of growth measure on ACT composite score averages problematic.  It does not take into account that as average scores enter the upper range, it becomes harder and harder to make point improvements.  I could go into great detail on this, but improving one year’s average score of 24 to a 25 the next year is less likely than improving another school’s group from a 16 to a 17.  Using the data we have, we simply do not have the information to determine how much harder it is to move up a given number of points, or fraction of a point, based upon the score from the previous year.  But it is worth remembering that, generally speaking, you are much less likely to raise a baseball team’s batting average from .385 to .400 from one year to the next than to move a team batting .250 one year to a .265 the next.  The closer to perfect, the harder it is to achieve the same raw score improvement, and even small incremental improvements may take tremendous levels of effort to produce.

All of that being said, the average growth in composite ACT scores among Mississippi Juniors from 2016 to 2017 is available by clicking on the following links:

2016-2017 Mississippi 11th Grade ACT Comp Score Growth by School

2016-2017 Mississippi 11th Grade ACT Comp Score Growth by District

***The original file for growth by district had at least one school district out of order.  The incorrect version was online from the time of publication (about 1:00 am) to 3:15 pm on the same day (Sept. 18th), when I noticed the error.  The amount of growth was correct, but the ranking order failed to number correctly.  The error seemed to only affect #31 and below in the rankings.  It has since been corrected and updated.  But, if you were to spot anything inconsistent, please let me know.***

I hope the data is of some value to you as we all work to improve student achievement!

-Clint Stroupe

2016 & 2017 Mississippi ACT Composite Score Rankings by District and School

As part of our present Mississippi public school accountability model, the ACT is given for free to all Mississippi 11th graders.  Most of us are very familiar with the ACT and know it to be a nationally recognized assessment of college readiness made up of English, Mathematics, Reading, and Science Reasoning sections.  These sections are scored individually and then used to form the overall ACT composite score.  The individual section scores are often used to decide whether students will require remediation classes when entering post-secondary institutions, or whether they might be able to bypass certain normally required courses, based upon their performance.  The overall composite score is typically used for post-secondary acceptance decisions and scholarship awards.  All of that being said, I personally like looking at ACT scores because of ACT’s independence as a nationwide non-profit, in comparison to traditional state-contracted assessment providers.  ACT result averages for our Mississippi 11th graders in a given year can be extremely valuable when compared to those of other states that also assess all of their 11th grade students.

The following link will allow you to view how individual public schools in Mississippi compared to one another, and to the state average, in average ACT composite scores for all Mississippi 11th graders in 2017:

2017 Mississippi 11th Grade ACT Comp Score Rankings by School

The above ACT rankings by school for the 2016-2017 school year can be compared to the results from the previous 2015-2016 class of 11th graders by clicking the following link:

2016 Mississippi 11th Grade ACT Comp Score Rankings by School

The same rankings can also be viewed by district below:

2017 Mississippi 11th Grade ACT Comp Score Rankings by District

2016 Mississippi 11th Grade ACT Comp Score Rankings by District

As always, I try to mention that these results give us only a snapshot of how one cohort of juniors at each school did one year and another cohort did the next.  That is valuable, but not nearly as valuable as growth data from the same cohort tracked over time.  However, the current system of assessing 11th graders using the ACT is not designed to assess such growth.  These important facts should be taken into account as we examine these results.

-Clint Stroupe

 

Mississippi Accountability: Where’s the Growth? Or Science? Or History?

As anyone who has ever had more than a five-second conversation with me about the purpose of assessment in our schools knows, it is my whole-hearted belief that assessment should be growth-oriented and used for formative purposes.  For all of the somewhat scattered nature of our accountability model in Mississippi, one of its strengths is the weight it puts upon growth in student achievement.  Without getting too deep into a different topic, I would also say that one of the primary faults of the model is that the growth it focuses upon is too heavily weighted toward the “bottom quartile” (the bottom 25% of test-takers in the current school year, based upon their scores from the previous year in language arts or mathematics) and leaves science, as well as U.S. History, standing alone without a needed means to determine growth.  But I will save that topic for another day.  Today, I am simply noting that growth in the performance of individual students from year to year, whether for the bottom quartile or the whole, is an extremely large element of the accountability model that determines each school’s and district’s accountability level and letter grade (A-F).  Yes, growth in language arts and mathematics is extremely significant and vitally important.  As mentioned earlier, performance on the science (5th, 8th, and Biology I) and U.S. History assessments is also a key factor in determining how well our districts and individual schools are performing.  This raises the question:  why does the Mississippi Department of Education not make any of this information (English/language arts growth, mathematics growth, Biology I scores, or U.S. History scores) from the previous year available to the public at all?  With all of the fanfare and publicity when MAAP language arts and mathematics achievement scores come out for the state, and with the subsequent very public publishing of those results for each school and district, what happened to the growth and scores in these other subjects, which make up a much larger portion of the grade designation with which each school and district will be labeled?

If my first paragraph was a little too wordy, I will attempt to simplify the point I am trying to make.  Mississippi looks at many factors to determine the points a school or district has earned in order to rise to a higher letter grade (A-F).  The heaviest factor is growth in students’ English/language arts (ELA) and mathematics scores from the previous year.  Another factor is performance levels on the 5th/8th grade science and Biology I assessments.  Yet another factor is U.S. History assessment performance levels.  However, last year the state did not make the growth data by grade and subject for each school or district readily available for public view.  Likewise, no data was ever posted on the performance level results for the end-of-year assessments in Biology I or U.S. History.  The performance level results were posted last year for 5th/8th grade science, albeit in November (while the performance level data for language arts and math was posted in August).

Now, through much work and digging, one was able to determine how much total growth was obtained in each broad category (bottom quartile ELA, bottom quartile math, overall ELA, overall math) by pulling the information from the media file released at the same time as school letter grades.  The same method could be used to determine the overall (combined 5th/8th science & Biology I) Level 3 or 4 percentages for a school or district.  But this could only be obtained by careful, patient digging through the file by someone with odd hobbies (like myself).  Even with such intensive digging, as near as I can tell, there was no data ever released to determine how students did on the Fall end-of-year performance levels on the Biology I or U.S. History assessments.  Why not?

In closing, if we truly desire public involvement in and understanding of our school accountability, the public must have the information available in a detailed, understandable manner.  Mississippi already does this for ELA and mathematics MAAP performance level data.  For last year, and up until this point in the present year, the state has not produced this type of detailed data for ELA growth, math growth, Biology I end-of-year performance levels, or U.S. History end-of-year performance levels for all schools.  Last year, we were eventually given results for Biology I (only from the 1st semester), U.S. History (only from the 1st semester), and 5th/8th science.  The public should be able to view ELA and math growth data in detail by subject and grade level, just as they are able to view ELA and math achievement level results.  The public should be able to view the complete results for the year for Biology I and U.S. History assessment performance levels, just as they can for ELA and math.  Without consistently providing this type of data to the public, how can we really expect them to understand the letter grades we are assigning to our districts and schools?  How, too, can we expect the public to know where performance within our schools deserved praise, or where there might be room for improvement, without this type of complete data?  If the purpose of our accountability model is truly to encourage improvements in instruction and growth in student achievement, how can these goals be reached without it?  I hope this year we will be able to view the information needed to provide this type of complete picture.

-Clint Stroupe

SchoolDigger Ratings of Mississippi Schools: A Brief Critique

Many of us may have seen the ratings website SchoolDigger, with its ranking of schools in Mississippi based upon their Mississippi Academic Assessment Program (MAAP) scores.  You might also notice that these ratings, based upon the same assessment, give different rankings from the ones I may share on this blog.  Well, truthfully, no one with the exception of myself may have paid much attention to the difference.  However, I wanted to mention what it is SchoolDigger is showing and why, in my opinion, its methodology is not the best way to look at assessment results for our Mississippi schools or districts.

SchoolDigger is a great website in many ways.  It gives information on free/reduced lunch rates and student-to-teacher ratios, both of which are valuable to know for our schools.  But we need to understand what it is showing and using to determine its school rankings and ratings.  SchoolDigger primarily uses the “average standard score” to rank schools.  Basically, the “standard score” it refers to is the MAAP assessment score (a scale score) converted to a normal percent-style grade that most of us are familiar with from our days in the classroom.  This conversion of the scale score to a percent (standard score) is not really the problem.  The problem comes into play when SchoolDigger takes all of those scores for every student in the school taking that particular assessment that year and averages them together to give its “average standard score.”  The site then uses this “average standard score” to decide which school has performed more or less successfully.  However, there are two major issues with using this to determine the success of any Mississippi school or school district on any particular MAAP test, and they make it a poor means of measurement.

First, the big problem is that this is an “average.”  Averages are great for some things, but not for others.  Using the SchoolDigger methodology, you could have a hypothetical school with 100 students taking the test.  We will refer to this example test as “Test X.”  Of those 100 students who took “Test X” that year, 50 students made a “standard score” of 100.  The remaining 50 students all made a standard score of 60 (what we would typically consider a failing grade).  Now, in this example, the “average standard score” for the school would be an 80, which sounds pretty good.  So in this hypothetical, half of the kids in the class passed and half failed, which most of us would agree is not the making of a good, much less a great, result for the school.  But the “average standard score” shows an 80, which would be a “B” in traditional terms.  You can see the issue:  average scores do not really show how successful the school is in educating, or attempting to educate, all of its students.

Now, keep in mind that in the above hypothetical, the school’s “average standard score” used for ranking is an 80.  Then let’s imagine that we are comparing it to another school, which also tested 100 students on the same “Test X” during the same year.  At the second school, 99 of the 100 students made an 80 and one single student made a 79.  From a common-sense standpoint, the second school was far more successful than the first, with virtually every student turning in a “B” level of performance on the assessment!  However, the second school’s “average standard score” is only a 79.99, which puts it lower in the ranking than the first example school (where half of the class failed) with its 80.  I think you can see from this example, and many others that could occur, that “average” scores are a very poor way to rank a school if the goal is to educate every child.
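Both hypothetical schools can be pushed through a few lines of Python to show how the two yardsticks disagree (the 70 cut score below is my own placeholder for a passing grade, chosen only for illustration):

```python
# The two hypothetical schools from the example: "standard scores" for 100 testers each.
school_1 = [100] * 50 + [60] * 50   # half ace it, half fail
school_2 = [80] * 99 + [79]         # virtually everyone earns a "B"

PASSING = 70  # hypothetical passing cut score, for illustration only

def average(scores):
    return sum(scores) / len(scores)

def percent_passing(scores, cut=PASSING):
    return 100 * sum(s >= cut for s in scores) / len(scores)

for name, scores in [("School 1", school_1), ("School 2", school_2)]:
    print(f"{name}: average = {average(scores):.2f}, "
          f"percent at/above {PASSING} = {percent_passing(scores):.0f}%")

# School 1: average = 80.00, percent at/above 70 = 50%
# School 2: average = 79.99, percent at/above 70 = 100%
# The average ranks School 1 higher, even though half of its students failed.
```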

Secondly, there is nothing remotely referencing “average standard score,” or anything “average” in relation to student scores, in the Mississippi Accountability Model.  The Mississippi model uses a variety of ways for a school to earn “points,” which are used to determine the school’s rating (A-F).  Setting aside graduation rate and a few other factors which have nothing to do with the MAAP or other standardized tests created by the state, there are only two things the state is concerned with:  growth, and the percent scoring in the top two levels on the test.  The top two levels a student can make (Level 4 and Level 5) are considered “proficient or above” scores.  These are the targets the state wants every student to reach.  Every goal our schools have for testing results is geared toward getting all students to at least a Level 4.  On every single test our state administers for accountability, it only wants to know whether the student grew from the previous year’s level and/or reached the target of Level 4 or 5.  “Average” scores have nothing to do with this and cannot tell you what percent of students reached the minimum of Level 4, which is the target given by the state.  This, without question, makes “average standard score,” or any type of “average” score, of zero value in relation to the Mississippi Accountability Model on which all schools are judged.

Thus, the means SchoolDigger uses to rank our schools is, in my opinion, of little practical value, since it does not tell us the percentage of the class or group actually meeting the target level of performance.  The SchoolDigger “average standard score” is also of no value in relation to our Mississippi system of determining school accountability grades.  While growth of the same student from one year to the next would be the ideal means to judge how much a student actually learned in a year, and teaching effectiveness, this information is not available to the public in the released Mississippi test scores.  Without such growth data, the only accurate and relevant way to rank assessment results for schools and districts is the percent of students who scored a Level 4 or Level 5 on each of the assessments given by the state.  This is why you will never see an accountability report by the Mississippi Department of Education with any mention of “average” scores being used as proof of success on MAAP assessments by our schools or school districts.  The SchoolDigger data using “average standard score” is interesting to look over, but it is of no real value for ranking the success or failure of our Mississippi schools.

-Clint Stroupe

2016-2017 Mississippi Algebra I MAAP Results Ranked by School & by District

Due to a little more teasing apart of the data, it always takes me a bit longer to post the Algebra I rankings.  As usual, I have listed the results for the state in Algebra I by district/school and ranked them by percent scoring in the top two levels.  Using the percent in the top two levels seems to be the preferred method of determining the percent scoring a “Proficient or above” type score, which is the goal score range.

The following links will take you to the Mississippi Academic Assessment Program (MAAP) Algebra I results from the 2016-2017 school year for junior high & middle schools without a 9th grade, high schools and attendance centers with a 9th grade, and for the districts as a whole (if curious about the reasoning behind this splitting of school rankings see the “Caveats” below):

2016-17 MAAP Algebra I Rankings of Middle & Junior High Schools Without a 9th Grade

2016-17 MAAP Algebra I Rankings of Attendance Centers & High Schools With a 9th Grade

2016-17 MAAP Algebra I Rankings of Mississippi School Districts

As discussed in previous years, more caution should be used in examining these Algebra I results than any others listed.  There are several extremely important differences in how the Algebra I assessment is given and reported that make it unique.

Caveats of the 2016-2017 MAAP Algebra I results:

Algebra I is unique in that students may take it during the middle school years (typically the 8th grade).  The middle school students who took Algebra I in 2016-2017 all took the end-of-course MAAP Algebra I assessment, just as their high school counterparts did.  In many school districts across the state, the decision is made to allow students who have demonstrated advanced achievement in 7th grade mathematics to take Algebra I in the 8th grade in order to “get a jump” on the accumulation of high school credits.  This “jump” might pay off by freeing the student up to take more advanced electives, dual-credit/enrollment, or AP courses later in high school.  Why is this important when analyzing results reported by school?

  1. In a situation where a district has a separate elementary, junior high, or middle school which includes a 7th or 8th grade and has Algebra I testers, those results will show up under the elem/jr. high/middle school where the students took the test.  This has a two-fold effect.  First, the school with the junior high test-takers will typically have extremely high test scores, as the more advanced students are typically enrolled in the course (with some exceptional cases at schools where the total opposite might be taking place for strategic reasons, with polar opposite results).  Second, the school where those students move on to the 9th grade (the “high school”) will now typically have much lower Algebra I scores on average, due to the fact that the higher-achieving students already took the course in the 8th grade at the school where they were the year before.  Thus, middle schools will typically have much higher scores in comparison to all other school types.  This is in reference to the reported results only, not to the school whose accountability grade the student’s results count toward.
  2. In some school districts these extremes do not take place at all, and results are not skewed by the “split” between taking Algebra I in the middle school grades.  This occurs for three typical reasons.  First, some districts have a blanket policy that no student, regardless of achievement, may take Algebra I before 9th grade.  Thus, in those districts all students’ scores will fall under the high school in which they enter the 9th grade.  The only exception is the few schools across the state that include the 9th grade in their middle school or have a middle school made up only of 9th graders.  This 9th grade middle school scenario is extremely rare in Mississippi, but it does exist, causing further skewing of results when attempting to compare schools head to head.  Second, there are a fair number of high schools which include grades 7-12.  In these combined 7th-12th high schools, no skewing takes place, as all Algebra I test-takers are reported under the one school name regardless of the grade in which they take the course.  Third, there are a minority of K-12 schools still left across the state.  These schools are in the same situation as the 7th-12th grade high schools:  they will not have the skewing of results that takes place in the “caveat #1” schools listed above.
  3. Thus, in an ideal situation, one might compare three categories of schools’ Algebra I results:  first, elem/jr. high/middle schools with students taking Algebra I in the 7th/8th grade; second, high schools which receive students from those types of schools; and third, K-12 attendance centers and 7th-12th high schools whose scores reflect all of their Algebra I students regardless of grade level.
  4. In the real world, these categories must be taken into consideration when comparing schools (district comparisons are not affected, because all students taking Algebra I, regardless of grade level, end up under the umbrella of the particular district’s results).  However, attempting to show these distinctions when examining statewide results is impossible without the state supplying information about each school’s grade levels (and perhaps even its philosophy or rules regarding students taking Algebra I).  Since my rankings rely on publicly available data, I have to use my own judgement as to which category a school might fall under.

Due to these very important caveats, I have made my best attempt to reflect this distinction by making two categories for ranking schools, as sketched below.  The first category includes elementary, junior high, and middle schools which do not have a 9th grade.  The second category includes K-12 attendance centers and all high schools that have a 9th grade (including both 7th-12th and 9th-12th high schools).  These categories are not perfect, as some schools (such as those very rare 9th-grade-only schools) have to be lumped into one category or the other even though they are unique situations.  Also, some school names may not reflect their actual grade levels (such as a “Nowhereville High School” which, despite its name, is actually a K-12 attendance center), resulting in me accidentally placing them in an inappropriate category.  However, I feel the attempt must be made to show at least these two category distinctions, or else the results would make little sense (with middle schools virtually dominating the top half of the rankings for the reasons listed above).
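A bare-bones sketch of that category split might look like the following; the school names and grade spans are invented, and the real judgement calls (misleading school names, 9th-grade-only schools) are messier than any rule this simple:

```python
# Sketch of splitting schools into the two ranking categories by grade span.
# All school names and grade spans below are hypothetical.

schools = [
    {"name": "Anytown Middle School", "lowest_grade": 6, "highest_grade": 8},
    {"name": "Anytown High School",   "lowest_grade": 9, "highest_grade": 12},
    {"name": "Ruralville School",     "lowest_grade": 0, "highest_grade": 12},  # K-12 center
    {"name": "Big City Junior High",  "lowest_grade": 7, "highest_grade": 9},
]

def has_ninth_grade(school):
    return school["lowest_grade"] <= 9 <= school["highest_grade"]

with_9th    = [s["name"] for s in schools if has_ninth_grade(s)]
without_9th = [s["name"] for s in schools if not has_ninth_grade(s)]

print("Ranked among schools WITH a 9th grade:   ", with_9th)
print("Ranked among schools WITHOUT a 9th grade:", without_9th)
# WITH:    Anytown High School, Ruralville School, Big City Junior High
# WITHOUT: Anytown Middle School
```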

Despite the long-winded dissertation, I hope these results provide information you find beneficial.  I feel pretty confident in the data at this point, but please let me know if you spot any errors.  My goal in making this information available in this particular format is to aid improved instruction for all of our students.  I simply ask that, if you make use of the data in this format, you pass along the word of where you obtained it.  To paraphrase Crash Davis from Bull Durham, I hope when you speak of me, you speak well.

Thanks,

Clint Stroupe

2016-17 Mississippi MAAP English II Rankings by District & School

It is once again time for our Mississippi Academic Assessment Program (MAAP) data release.  As usual, I have listed the results for the state in English II by district/school and ranked them by percent scoring in the top two levels.  Using the percent in the top two levels seems to be the preferred method of determining the percent scoring a “Proficient or above” type score, which is the goal score range.

I feel pretty confident in the data at this point, but please let me know if you spot any errors.  My goal for making this information available in this particular format is to aid in improved instruction for all of our students.  I simply ask, if you make use of the data in this format, please pass along the word of where you obtained it.  To paraphrase Crash Davis from Bull Durham, I hope when you speak of me, you speak well.

Simply click the link below to access the DISTRICT LEVEL English II ranking report:

2016-17 MAAP English II Rankings by District

Click the following link below to access the SCHOOL LEVEL English II ranking report:

2016-17 MAAP English II Rankings by School

Thanks,

Clint Stroupe

*These rankings are for informational purposes only.  Growth is far more valuable information for determining whether learning took place, and to what degree, than end-of-year scores alone, which only tell us where students at a school “ended up” without knowledge of where they “began.”