When I was in high school, I loved spreadsheets. I used to design measurement tools to test my friends' abstract psychological constructs like "Movie Trivia Content Mastery" and generate data tables of the results. Then I would develop intervention and enrichment activities in preparation for a reevaluation at a later time. I would analyze my data tables at the item level, design reliability checks to decrease error potential, and scale the results so that I could relate multiple assessments over time to each other.
When it came time to try to figure out what I wanted to do after high school, I sat down with my counselor and told her about my passion for psychometrics. "You're in luck!" she excitedly told me. "Teachers do that!" And the rest is history.
Except, of course, nothing like this has happened in the history of ever. We all know that teaching is an Art and a Science, but I have yet to meet someone who was called to this profession by an overwhelming drive to be a social scientist. It is the Art that brought us here, and that continues to inspire us and drive us in our professional growth, anchored in a faith that all students can learn at high levels.
I believe this is a major source of cognitive dissonance in our work. We are all social scientists. The scientific method is vital to continuous improvement, but teaching should never be a choice between the Art OR the Science. The two are not mutually exclusive. Just as we should strive to improve assessment practices, intervention methodologies, and professional learning planning, so should we never fail to remember the passion and drive that inspired us to become educators.
As we begin a new school year, let us agree to listen to our inner artists when we feel we may be going in the wrong direction with our assessment and data analysis processes. Let us deconstruct our assessment programs and ask ourselves if they match what research tells us works best for learning. With the upcoming changes to Aware, districts may be making some changes to local test types for logistical purposes. While we are at it, why not also make sure those test types match our philosophies regarding assessment best practices? Let us look at what our teachers see when they analyze tests and click on their quick view drop-down menu. Let us ask ourselves how many different data views they truly need to effectively analyze test results and make reteach, intervention, and enrichment decisions.
Back when I was a teacher in the nineties, I attended staff development on a regular basis. When I became an instructional coach, we called it professional development.
I want to talk about the hot topic of Student Growth, but I’m going to take the long way around. So let's begin with a little pop quiz:
The easiest method to determine growth is to take a measurement, take a second measurement at a later time, and subtract the first result from the second.
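For the spreadsheet-minded among us, that "measure twice and subtract" approach is just a simple gain score. Here is a minimal sketch of the calculation; the student names and scores are hypothetical, invented only for illustration:

```python
# Simple gain scores: subtract each student's first measurement
# from the second. Names and scores below are made up.

def gain_scores(pre, post):
    """Return post - pre for each student present in both measurements."""
    return {student: post[student] - pre[student]
            for student in pre if student in post}

fall = {"Ana": 62, "Ben": 75, "Cam": 88}
spring = {"Ana": 74, "Ben": 80, "Cam": 85}

print(gain_scores(fall, spring))
# → {'Ana': 12, 'Ben': 5, 'Cam': -3}
```

Notice that Cam's "growth" comes out negative, which hints at why this easiest method is not always the best one — a point we will come back to.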