In the absence of assessment results for the same group of students over time, designed to measure growth in academic achievement, the best data to examine for the assessment of all Mississippi 11th graders using the ACT is the change in composite scores. However, we must also understand that comparing the growth (or lack thereof) of one group of 11th graders at a school or school district in one year against an entirely different class of 11th graders in another year has several limitations.
The most obvious limitation is that each group of students arrived at a different level at the beginning of their junior year or before. For example, had the group that made up the 2016 11th grade at one school taken the ACT in the 9th grade, their average composite score might have been 17. At the same school, had the group that made up the 2017 11th grade taken the ACT in the 9th grade, their average composite score might have been 14. When these two cohorts finally took the ACT in their 11th grade years, we might find the 2016 group averaged 19 and the 2017 group averaged 18. The 2016 group grew by only 2 points on average over two years, while the 2017 group grew by a whopping 4 points. Thus, a strong argument could be made that the school improved the academic achievement of the 2017 group to a larger degree than it did the 2016 group. However, simply tracking the 11th grade scores of the two groups and comparing them (as I am about to do) would seem to show a decline (-1 point) among the 11th graders at this school on the ACT, despite what might have been a tremendous job growing the second group of students from the level at which they arrived.
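For readers who prefer to see the arithmetic laid out, the hypothetical example above can be sketched in a few lines of Python. The scores below are the made-up numbers from the example, not real Mississippi data:

```python
# Hypothetical illustration of the cohort-growth limitation described above.
# All scores are the invented numbers from the example, not real results.

cohorts = {
    "2016": {"grade9_avg": 17, "grade11_avg": 19},
    "2017": {"grade9_avg": 14, "grade11_avg": 18},
}

# Within-cohort growth: how much each group improved from 9th to 11th grade.
for year, scores in cohorts.items():
    growth = scores["grade11_avg"] - scores["grade9_avg"]
    print(f"{year} cohort grew {growth} points from 9th to 11th grade")

# Year-over-year snapshot: comparing two different groups' 11th grade averages.
snapshot_change = cohorts["2017"]["grade11_avg"] - cohorts["2016"]["grade11_avg"]
print(f"11th grade average changed {snapshot_change} point(s) year over year")
```

The snapshot comparison shows a 1-point drop, even though the second cohort actually grew twice as much as the first, which is exactly the limitation described above.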
Unfortunately, we do not live in a perfect world where such data would be available for our Mississippi students. What we do have is two separate years of ACT average composite scores for all 11th graders at each school and school district. We can assess whether growth occurred between the two different years' 11th graders, but such a comparison is admittedly of limited value compared to the growth of the same students over time. This raises the question: Why is this information of any value at all?
This growth of non-cohort 11th graders, though imperfect, is of more value than the alternative. Without looking at such trends in the scores (even among different cohorts), all we would have is the final achievement results each year. Those results, without further analysis, offer no glimpse of whether the arrow of student achievement is pointing up or down. However, even imperfect attempts at assessing growth can begin to show us trends about our schools and districts, and those trends can be very valuable. For example, if a particular school's average ACT composite score dropped 1 point from the previous year, that might cause some concern, but it could be nothing to worry about at all. If, the next year, the average composite of 11th graders at the same school dropped another point, the concern would gain legitimacy. And if the score dropped yet again a year later, we would be wise to become very concerned, as the trend line is heading down instead of in the desired upward direction.
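The "one down year versus several in a row" logic above can be expressed as a small helper. This is only a sketch with hypothetical scores, not part of any official methodology:

```python
# A minimal sketch of the multi-year trend check described above.
# The score lists below are hypothetical examples, not real data.

def consecutive_declines(yearly_averages):
    """Count how many years in a row the average has dropped,
    ending with the most recent year."""
    count = 0
    for i in range(len(yearly_averages) - 1, 0, -1):
        if yearly_averages[i] < yearly_averages[i - 1]:
            count += 1
        else:
            break
    return count

# One drop might mean nothing; three in a row suggests a real downward trend.
print(consecutive_declines([19.0, 18.0]))              # prints 1
print(consecutive_declines([19.0, 18.0, 17.0, 16.2]))  # prints 3
```

A single decline yields a count of 1 (mild concern), while a streak of three matches the scenario above where the trend itself, not any single year, is the warning sign.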
Another argument for examining growth, even across different cohorts of 11th graders, is that schools with lower overall scores need attainable, measurable goals to gauge their progress. A school with an average ACT composite of 15 is definitely not at the top of the state rankings. Even if it improves to a 16 the following year, it is still nowhere near the top or where it eventually wants to be. However, it did improve. If, the next year, it moves up to a 16.5, we are seeing a positive trend in its average scores, which gives the school feedback and hope regarding its improvement efforts. Since the only way to reach the top of the mountain is to take steps up the slope, this type of growth measure is the only way to assess whether that climb is happening as a trend over multiple years. Growth should be the goal of every district, whether at the top or the bottom in overall achievement, so the measure is also valuable for those near the top every year.
There are, though, other considerations that make this simplistic measure of growth in ACT composite averages problematic. It does not account for the fact that as average scores enter the upper range, point improvements become harder and harder to make. I could go into great detail on this, but improving one school's average from a 24 to a 25 is likely less probable than improving another school's from a 16 to a 17. With the data we have, we simply cannot determine how much harder it is to move up a given number of points (or fraction of a point) based on the previous year's score. But it is worth remembering that, statistically speaking, you are much less likely to raise a baseball team's batting average from .385 to .400 from one year to the next than to move a team from .250 to .265. The closer to perfect, the harder it is to achieve the same raw-score improvement, and even small incremental gains may take tremendous effort to produce.
All of that being said, the average growth in composite ACT scores among Mississippi juniors from 2016 to 2017 is available by clicking on the following links:
***The original file for growth by district had at least one school district out of order. The incorrect version was online from the time of publication (about 1:00 am) to 3:15 pm on the same day (Sept. 18th), when I noticed the error. The amount of growth was correct, but the ranking order failed to number correctly. The error seemed to only affect #31 and below in the rankings. It has since been corrected and updated. But, if you were to spot anything inconsistent, please let me know.***
I hope the data is of some value to you as we all work to improve student achievement!