Applying What I’m Learning About How Kids Learn to Read

It was pretty cool to see my last post catch 🔥 and connect me with a vibrant and smart community of educators committed to the science of reading.

To review, in that post I laid out what I’d begun learning after realizing I knew absolutely nothing about learning to read:

Summary of critical points on word-level reading

The Simple View of Reading provides us with a clear and research-based model of reading comprehension

  • This doesn’t mean it’s completely definitive–no model is. But it does give us a useful map for aligning and targeting our assessments and instruction

Anyone who hears and speaks can be taught to decode words in print

  • IQ is not the basis for the ability to decode
  • Nor is it ever too late to address decoding issues

Units of sound (phonemes -> phonology) are the basis of written language (graphemes -> orthography)

  • Most word-level reading challenges are related to issues with hearing and speaking the sounds of the letters in words

We acquire new words as we read via a process called orthographic mapping

  • It is the phonological part of our brain that anchors the written word in our memory, not our visual memory
  • We learn the vast majority of words (after we have decoded them) by rapidly and unconsciously recognizing the sequence of the sounds of the letters in a word — even when they are irregular

The root cause of most struggles in word-level reading is a lack of proficiency with advanced phonemic skills

  • Students require fluency with deleting, substituting, and reversing phonemes to acquire a large stock of sight vocabulary

Since Then

Since writing that post, it’s felt like a whirlwind of learning. In the NYCDOE, I learned that many elementary schools have K-2 supports called Universal Literacy coaches, who are trained in the science of reading. I spoke with a few and saw how they are attempting to bridge the various programs and curricula schools use to the science. I read Robert Pondiscio’s superb book on Success Academy, How the Other Half Learns, and struggled to square how SA consistently achieves the highest reading proficiency rates in NY state while applying some reading approaches not fully aligned to the science. (More on that in another post; there’s a lot to dig into from that book, and I’d like to do it justice.)

I then went to a training on Equipped for Reading Success with David Kilpatrick, and got to ask him directly about the distinction between statistical learning and orthographic mapping. He views them as different processes: orthographic mapping refers specifically to the mapping of individual phonemes, and it is far more quickly acquired (1-4 exposures) than statistical learning, which is a more global pattern-recognition process requiring far more exposures. He had a nifty little chart he pulled up to explain the distinctions. Either way, I found Marnie Ginsberg’s explanation in a comment on my last post to be a pretty good way to think of it, with one key addition: while proficient readers can rapidly do all of this on their own, we need to explicitly train and teach the skills required for orthographic mapping (a chart outlining those skills is below).

A graphic from Equipped for Reading Success that should be widely known in every school.

It can be hard to gain clarity on anything in the world of education, but most especially when it comes to reading. So even as I take one step forward, I often take two steps back, further steeped in doubt. Yet I’ve decided to commit to Kilpatrick’s manual as my North Star for the next quarter.

The Knowledge

I’m still moving through the Equipped manual a little each day on my commute, marking it up and imbibing what I’ve taken to calling “the Knowledge” in my annotations, an allusion to the famed test for London cab drivers. The Knowledge, in this case, being terms like digraphs, blends, diphthongs, onset, and rime.

Terms like these, much like grammatical terminology, can seem unnecessarily technical and unessential to good teaching. Yet imagine a world in which teachers were required to learn and be assessed on the knowledge behind the terms of word-level reading! I never understood, nor was even exposed to, what “onset-rime” means until I read Kilpatrick’s manual. Yet once I grasped it, it served as a threshold concept for understanding phonological awareness.

Here’s the passage from Equipped for Reading Success that expanded my mind and made me aware of a key distinction between the syllable level and onset-rime level of phonological awareness:

“The onset-rime level of phonological awareness goes beyond the syllable level because the child has to break apart the syllable. . . . Onsets and rimes can only be understood within the syllable. Not every syllable has an onset, but every syllable has a rime. This is because every syllable has a vowel.”

—David Kilpatrick, Equipped for Reading Success, pp. 20-21

Remember how in my last post I had the big realization that phonemes are an abstraction from our everyday experience of spoken language as a stream of sound? The onset-rime level of sound awareness is one further abstraction from hearing syllable-level sounds. There are gradations of abstraction on the road to distinguishing those individual phonemes, and that progression moves from the syllable level (“baseball” = 2 claps), to the onset-rime level (“baseball” = 4 claps: “b” is the onset, “ase” is the rime, “b” is the next onset, “all” is the final rime), to the phoneme level (“baseball” = 6 claps: /b/, /A/, /s/, /b/, /a/, /l/).
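To make those gradations concrete, here’s a tiny sketch in Python (my own toy illustration, not anything from Kilpatrick’s manual; the segmentations of “baseball” are hard-coded, since real segmentation requires pronunciation data) that simply counts the “claps” at each level:

```python
# A toy illustration of the three levels of phonological awareness discussed
# above, with the segmentations of "baseball" hard-coded rather than computed.
# It only shows how the "clap counts" grow as the units get smaller.

word = "baseball"

segmentations = {
    "syllable": ["base", "ball"],                            # 2 claps
    "onset-rime": ["b", "ase", "b", "all"],                  # 4 claps
    "phoneme": ["/b/", "/A/", "/s/", "/b/", "/a/", "/l/"],   # 6 claps
}

for level, units in segmentations.items():
    print(f"'{word}' at the {level} level: {len(units)} claps -> {units}")
```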

I’ve begun playing some of the “word games” in Kilpatrick’s manual with my two-and-a-half-year-old son to cultivate phonemic awareness, and I’ve noticed he can’t yet isolate the second part of a two-syllable word. He can identify the first part, however. That’s of absolutely no concern to me, given his age, but I found it revealing of an even more fundamental progression in working memory and in the awareness that multisyllabic words can be broken into smaller parts.

When it comes to foundational reading skill knowledge like this, it’s something I’ve always wished I knew, but I never considered it essential, because the expectation was that I focus on grade-level texts and content. And yet I had students reading far below grade level. One would think that this would have compelled me to learn it at that point. And I did try: going through some of the files from my first years of teaching, I found a whole set of phonics-related materials I’d amassed. But the reality is that it was one more thing on top of many other things I needed to know and do, and I put my primary focus on grade-level texts and skills. Not a bad focus, of course, but I look back on my many students who were struggling with decoding words, and I feel like I have failed them. I have failed them.

Teaching is a hard job. But so is nursing, and I’m watching my wife as she goes through a nursing program and struggles to acquire a vast body of knowledge that must be applied on a daily basis in a clinical setting. Nurses have to acquire this knowledge and be able to apply it; their jobs demand it. People’s lives are literally on the line. And yet, when it comes to teachers, our society seems to be perfectly fine with letting them off the hook.

In How the Other Half Learns, Pondiscio has an especially wry zinger (in a book full of them) in Chapter 1 when he states, “Teaching is the easiest job in the world to do badly. . . But it’s the hardest job to do well.”

We are graduating too many students who are functionally illiterate. We all need to step up our game.

My Theory of Action

My working hypothesis, based on Kilpatrick: many of the struggling readers in the schools I support are struggling with a core phonological deficit. Therefore, if I administer the PAST and identify where a student’s phonemic awareness level is (and train teachers to do so), and support targeted daily instruction in phonemic awareness until proficiency is attained, then those students’ reading levels will improve.

I’ve brought the PAST, a short phonemic awareness assessment from Equipped for Reading Success, to a few of the middle schools I work with, and have begun pilots with self-contained classrooms and students. I just administered the PAST to my first student last Wednesday. We selected him because we knew he was struggling with reading. But I was still shocked by just how basic his phonemic awareness was. He was at nearly the lowest level, the syllable level: a pre- to mid-kindergarten level.

Let me frame the wider context of what we’re up against: in that school, roughly 40-50% of students across the 6-8th grades are identified as struggling with decoding, according to an iReady diagnostic. Of that ~50%, how many are struggling with a phonological deficit? I’d like to find out. And help to do something about it.

Finding a way to tackle something that massive, while continuing to ensure that core instruction demands grade-level expectations, is a tough challenge. Because let it be known that I am in no way suggesting that kids struggling with word-level reading should no longer be exposed to grade-level texts and content. What I am suggesting is that it is incumbent on teachers at any level (and schools) to be knowledgeable enough of foundational skills and grade-level content and skills to scale their instruction accordingly. And yes, this is a heavy lift indeed. There’s never enough time in the day.

Yet I’ve found Kilpatrick’s materials promising in this regard, because some of the phonemic awareness activities are “1 minute” practice sessions. Every single minute we have with a student is precious time, all too easily squandered.

I recognize there are many other aspects to this, such as administering a phonics screen or oral fluency task and pairing students with different programs depending on the need. But I’ve got to start somewhere. I’m going to start small to see if my hypothesis holds up and if I can help to enact instruction that will target those needs. This is where the rubber hits the road.

I may fail. This whole thing is, ironically enough, a pet project of mine. It is no official aspect of my duties and role in the schools I support. And I take on too many side projects as it is. I’ve got a book I’m supposed to be writing, by the way, that I can no longer find the time for, let alone time to post on this blog. But I have a hard time thinking of anything more important than getting this right. So I’m saying this publicly so the network I’ve begun connecting to can help support me, and so I can better support the students and teachers I reach each day.

If you are on a similar journey, please connect with me here or on Twitter @mandercorn and let’s work through this together. There’s a wealth of knowledge out there; we each just have to connect the dots.

Thank you in advance, and thank you for reading. In solidarity.

Learning How Kids Learn to Read

You might assume I know something about teaching kids to read. I studied English at UCLA and obtained my master’s in education at The City College of NY. I taught special education grades 5-8 for 7 years, and I’ve supported schools and teachers throughout the Bronx with K-8 ELA instruction over the past 3 years.

Yet you’d be wrong. I’ve come to realize I know next to nothing.

In case you haven’t been aware, there’s been a firestorm of educators on platforms like Twitter gaining newfound awareness of the science of reading, fanned by the ace reporting of Emily Hanford. For great background on this movement, with links, refer to this post by Karen Vaites. And make sure you check out Hanford’s most recent podcast (out just today, and it’s amazing!) outlining how current classroom practice is misaligned to the research.

Impelled by this burgeoning national and international conversation, I’ve sought to educate myself about the science of reading. I began with Mark Seidenberg’s Language at the Speed of Sight, took a linguistics course, and have just completed David Kilpatrick’s Essentials of Assessing, Preventing, and Overcoming Reading Difficulties. Seidenberg is not only pithy but impassioned, while Kilpatrick, as a former school psychologist, is deeply versed in both the research and its application in practice. Both experts provide an incendiary takedown of more than a few sacred cows in the educational establishment.

It’s been fascinating to learn more about the science of reading while simultaneously working with a school where I could see problems elucidated by reading researchers and advocates play out in real time. It has given what I’m learning an even greater sense of urgency. I would read pages critiquing the “three-cueing system” and balanced literacy approaches on the bus in the morning, then walk into classrooms where I saw teachers instructing students, when uncertain about a word, to use guessing strategies such as “look at the picture” and the “first letter of the word,” rather than stressing the need to decode the entire word (for more on the problems with current classroom practice, listen to Hanford’s podcast).

There’s so much to digest and apply from all of this. This post is my attempt to begin synthesizing the information I’ve read; I’ll start general and then focus on the word-level reading aspect of the research. There’s so much more I want to cover, but I’ll be leaving out tons of stuff I would love to explore further. Someday . . .

Reading Can Be Simple

First off, though reading is complicated, it can be outlined by a simple model, known aptly enough as The Simple View of Reading. It can even be put into the form of an equation. The theory was first developed in 1986 by researchers Gough and Tunmer. The original formulation was D (decoding) × LC (linguistic or language comprehension) = Reading Comprehension.

After years of further research, this distinction has mostly held up, though it has become greatly expanded, especially in our understanding of what constitutes language comprehension.

Decoding has been clarified as one umbrella aspect of word-level reading, which is composed of many sub-skills. A more updated formula, courtesy of Kilpatrick, is:

Word Recognition × Language Comprehension = Reading Comprehension.

If you struggle with word recognition (as with dyslexia), or if you struggle with language comprehension (as many English language learners do), then you will have difficulty with reading comprehension.
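One way I’ve come to picture why the formula is multiplicative rather than additive: if either factor sits near zero, comprehension collapses no matter how strong the other factor is. Here’s a toy sketch in Python (my own illustration, not a formula from Gough and Tunmer or Kilpatrick) that treats each component as a 0-1 proficiency score:

```python
def reading_comprehension(word_recognition: float, language_comprehension: float) -> float:
    """Toy version of the Simple View, with each component scored from 0 to 1."""
    # Because the model is multiplicative, a near-zero score on either component
    # drags reading comprehension toward zero, no matter how strong the other is.
    return word_recognition * language_comprehension

# A student who decodes well but has weak language comprehension...
print(round(reading_comprehension(0.9, 0.2), 2))  # 0.18
# ...lands in roughly the same place as a strong comprehender who can't decode.
print(round(reading_comprehension(0.2, 0.9), 2))  # 0.18
```

Obviously no one would actually score kids this way; the point is just that weakness in either domain caps comprehension.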

Protip: if you are an educator in NY, know that this distinction can be framed around the language from Advanced Literacy as code-based (word recognition) and meaning-based (language comprehension) skills. And if you are a NYC educator, you can furthermore align this to the Instructional Leadership Framework. Bonus points for alignment to state and city initiatives! Yay!

Within each of these two domains lie the various sub-skills and knowledge that make reading so very complicated. Here’s a chart I made to visualize the “Expanded” Simple View:

Protip: Most educators are already familiar with the “five pillars” or “Big 5” of reading instruction: phonemic awareness, phonics, fluency, vocabulary, and comprehension, so it can be helpful to build a bridge between that knowledge and the Simple View. At a recent session I facilitated, I asked teachers to consider the Big 5, introduced the notion that they are composed of subskills, then asked them to sort those subskills into code-based or meaning-based groups. Here’s a print-out you could use to create the sorting strips.

Note that word recognition skills are mostly mastery-based. And a key point experts like Seidenberg and Kilpatrick make is that word recognition can be acquired by all children. IQ discrepancy is not a factor.

Here’s Seidenberg:

“For children who are poor readers, IQ is not a strong predictor of intervention responses or longer-term outcomes. Moreover, the behavioral characteristics of poor readers are very similar across a wide IQ range. . . Within this broad range of IQs, poor readers struggle in the same ways, need help in the same areas, and respond similarly to interventions. In short, the skills that pose difficulties for children are not closely related to the skills that IQ tests measure. The primary question is about children’s reading—whether it is below age-expected levels—not their intelligence.”

Here’s Kilpatrick:

“Discrepancies between IQ and achievement do not cause word-reading problems. Rather, deficits in the skills that underlie word-level reading cause those problems. The component skills of word reading can be strong or weak, independently of IQ test performance.”

“A common belief that continues to be recommended is that some students with severe reading disabilities simply cannot learn phonics and they should be shifted to a whole-word type of approach. This recommendation is inconsistent with the accumulated research on the nature of reading development and reading disabilities.”

“The simple view of reading applies to poor readers with IDEA disabilities (SLD, SLI, ID, ED/BD, TBI) and poor readers not considered disabled. Thus, when asked the question, ‘Why is this child struggling in reading?’ we would no longer answer, ‘because the child has an intellectual disability (or SLI or ED/BD or whatever).’ Those disability categories do not cause reading difficulties—specific reading-related skill deficits cause reading difficulties.”

What this means for educators: there is simply no excuse for any student to graduate from any of our schools without the ability to decode words in print. As Kilpatrick stated in a presentation (thanks to Tania James for her wonderful notes), “If a child can speak, they can learn phonics.”

Language comprehension, on the other hand, may be a tougher beast to tackle. Linguistic skills and knowledge are cumulative and ongoing. Most importantly, a core component of language comprehension is background and topical knowledge, in addition to grammatical and syntactical knowledge — both of which are inadequately taught in most schools due to the lack of a strong and coherent core curriculum.

I should note that Seidenberg doesn’t seem to fully subscribe to the Simple View, and that by no means should we begin to think any one model can adequately describe something so complex as reading. In his endnotes he states, “The main weakness in Gough’s theory is that it did not make sufficient room for the ways that the components influence each other. Vocabulary, for example, is jointly determined by spoken language and reading. Vocabulary can also be considered a component of both basic skills and comprehension.”

Kilpatrick offers a somewhat different take when he states, “In the context of the simple view of reading, it appears that vocabulary belongs primarily on the language comprehension side of the simple view equation, not necessarily on the word-reading side.”

Seidenberg proposes his own model, based on computational simulations, which looks something like this (Figure 6.2 from Chapter 6):

I think this model is useful for conveying why reading is complicated and can be hard to learn, but maybe not quite as useful for guiding school-based assessment and instruction.

Why is The Simple View of Reading important?

Having a clear model for reading comprehension means we have a guide for aligned assessment, prevention, and intervention. Unfortunately, many schools base ELA instruction primarily on state assessments, which tell you very little about a student’s reading needs. People seem to forget that the function of a state assessment is school-, district-, and state-level accountability, not directing classroom instruction.

Protip: One “research snapshot” I found useful from Nonie Lesaux and Emily Galloway’s Advanced Literacy framework is the distinction they make between “literacy performances” and “specific skills and competencies.”

A state literacy assessment is a “literacy performance.” Here’s an explanation in their book, Teaching Advanced Literacy Skills:

“There is a tendency to examine the results of outcome assessments at the item level—to figure out the types of items groups of students struggled with and then go back and teach to support this understanding. Perhaps the most universal example is ‘finding the main idea’ in a passage . . . the problem is that finding the main idea—among many other similar performances or exercises—is just that—a reading performance. It is not a specific skill. That is, to perform the task at hand, in this case to find the main idea, the reader draws on many component skills and composite competencies and initiates those in concert with one another. In turn, when a student is not able to find the main idea, we still do not know why.”

In order to know why, we need assessments that can better pinpoint where the breakdown occurs, whether in word recognition or in language comprehension, or both. And then we need to do something about it. This is where it gets hard.

Reading is Hard

Though we can draw on a simple model to explain it, in actuality reading is complicated.

First of all, it’s completely unnatural. While we acquire spoken language organically, reading requires the imposition of an abstract system onto that language, a grafting of a fragmented alphabet onto a river of sound. Writing is something our species invented, an ingenious mechanism to convey information across space and time. While the first writing appeared around 3,200 BC, humans had been speaking for anywhere between 50,000 and 2 million years before that (we don’t know for sure because we couldn’t record anything yet, duh).

There are many irregular words in the English language, which would appear to make the teaching of something like phonics a daunting endeavor. We assume that kids need to be taught the rules, and then memorize the exceptions (these are known as “sight words”). Makes sense, right?

Yet research has made it clear we don’t acquire most sight words through memorization. Instead, we draw upon our letter-sound knowledge and phonological analysis skills to recognize new written words and unconsciously add them to our “orthographic lexicon.”

What’s interesting on this point is that there appears to be some disagreement between the models Seidenberg and Kilpatrick use to explain this process. Seidenberg calls it statistical learning, meaning that we learn to recognize patterns in common words, from which we then can recognize many others, including ones with irregularities. Kilpatrick, on the other hand, terms it orthographic mapping, which is the process of instantaneously pulling apart and putting back together the sounds in words, drawing upon letter-sound knowledge and phonemic awareness. In either model, what is acknowledged is that children learn to recognize a large volume of new words primarily on their own, but that such an ability is founded upon a strong understanding of sounds (phonology) and their correspondences in written form (orthography).

Honestly, I find both concepts—statistical learning and orthographic mapping—hard to wrap my head around.

It’s also possible they describe different things. Seidenberg’s term seems more global, explaining how we acquire vocabulary, while orthographic mapping refers more specifically to the relationship between decoding and acquiring vocabulary. I should note here that Kilpatrick did not come up with the term “orthographic mapping,” but rather draws on the research of Linnea Ehri.

Here’s Seidenberg on statistical learning:

“…learning vocabulary is a Big Data problem solved with a small amount of timely instruction and a lot of statistical learning. The beauty part is that statistical learning incorporates a mechanism for expanding vocabulary without explicit instruction or deliberate practice. The mechanism relies on the fact that words that are similar in meaning tend to occur in similar linguistic environments.”

Here’s Kilpatrick on orthographic mapping:

“Roughly speaking, think of phonic decoding as going from text to brain and orthographic mapping as going from brain to text. This is, however, an oversimplification because orthographic mapping involves an interactive back and forth between the letters and sounds. However, it is important that we do not confuse orthographic mapping with phonic decoding. They use some of the same raw materials (i.e., letter-sound knowledge and phonological long-term memory), but they use different aspects of phonological awareness, and the actual process is different. Phonic decoding uses phonological blending, which goes from “part to whole” (i.e., phonemes to words) while orthographic mapping requires the efficient use of phonological awareness/analysis, which goes from “whole to part” (i.e., oral words to their constituent phonemes).”

Kilpatrick notes that “The vast majority of exception words have only a single irregular letter-sound relationship.” This means that if a reader knows their letter-sound relationships well, they will be able to negotiate the majority of words with exceptions and irregularities.

What this means is that students need to be provided with sufficient practice to master phonological awareness and phonics skills. And we cannot blame the failure of a student to learn to decode on the irregularity of the English language.

Phonology: What We Can Hear and Speak is the Root of Written Language

In Seidenberg’s book, he argues that phonemes are the first abstraction on the road to the written word. A phoneme is an individual unit of sound within a word (e.g., the sound of “p”). While we learn many such sounds as we acquire spoken language, the need to isolate a single component sound as a phoneme only arises when translating speech into written form. As Seidenberg puts it:

“Phonemes are abstractions because they are discrete, whereas the speech signal is continuous. . . The invaluable illusion that speech consists of phonemes is only completed with further exposure to print, often starting with learning to spell and write one’s name.”

No wonder “phonemic awareness” is central to learning to read! The ability to know and discern individual sounds, and then to be able to play with them and put them back together, is the core skill of reading. In other words, if you struggle with blending and manipulating the sounds in words, you struggle with reading.

And indeed, this is why far too many of our kids have problems with reading. As Kilpatrick puts it, “The phonological-core deficit is far and away the most common reason why children struggle in word-level reading.”

Once I grasped this deceptively simple idea—that fluent reading is dependent on the ability to hear and speak the sounds of letters within words—prevention and intervention began to make more sense to me. Before, my understanding of the distinction between phonological awareness and phonics and what this meant for instruction was muddy. Now, I know that before even looking at a letter or a word, a student needs to practice hearing and speaking the sounds. This is how the student develops phonemic awareness. Phonics, on the other hand, is taught when those sounds are then applied to letters.

Protip: “A good way to remember the difference…is that phonemic awareness can be done with your eyes closed, while phonics cannot” (Kilpatrick, 2015a, p. 15).

The Importance of Advanced Phonemic Awareness

That may sound straightforward (no pun intended), but one of the key understandings I gained from Kilpatrick is that we too often stop at basic phonological awareness, both in our assessments and in our interventions. While sound instruction in phonological awareness and phonics in grades K-1 should help to prevent most word-reading difficulties (“Intervention researchers estimate that if the best prevention and intervention approaches were widely used, the percentage of elementary school students reading below a basic level would be about 5% rather than the current 30% to 34%”), some students will present with more severe difficulties. And those difficulties often stem from lacking more advanced phonemic awareness. He also points out that these advanced phonemic skills typically continue to develop in grades 3 and 4, well past the point at which most schools provide systematic phonemic awareness and phonics instruction.

Kilpatrick stresses that intervention and remediation for such students requires explicitly teaching advanced phonological skills.

So What Can We Do?

The great thing about Kilpatrick’s book—and why you should buy it—is that unlike many writers in the field of education, he actually goes through what assessments you can use and what you can do instructionally, both for prevention (K-1) and for intervention (grades 2 and up), to address reading needs. He calls out programs by name and praises or critiques them based on key understandings from the research, and some of it was pretty surprising to me.

But I’m going to stop here before this post gets overlong. When I can find time to post again (it’s seriously hard with a 2-year-old and a 9-month-old, and the school year is about to begin), I’ll share some of the assessments and programs from Kilpatrick that I think are most accessible, as well as dig into some of the sacred cows that Kilpatrick, Seidenberg, and Hanford have slain.

Afterword

You’ll notice I didn’t mention what I learned from my linguistics course, which was just an online series. It was fine, but I only found it useful insofar as it equipped me with some terms, like lexicon, morphology, semantics, and pragmatics. If you have any recommendations for further learning in linguistics, please let me know.

Also, if I’ve demonstrated any misconceptions in this piece or you would like to challenge or add to anything I wrote, please share!

And thank you for reading.

Close Reading: The Context of an Exegesis

The first thing that happened to reading is writing. For most of our history, humans have been able to speak but not read. Writing is a human creation, the first information technology, as much an invention as the telephone or computer.

—Mark Seidenberg, Language at the Speed of Sight

A growing contingent of scholars argue that our “superpower” as a species is not so much our intelligence as our collective intelligence and our capacity for what’s called cumulative culture: that is, our ability to stockpile knowledge and pass it down from generation to generation, tinkering with it and improving it over time.

—Steve Stewart-Williams, “How Culture Makes Us Smarter”

The written word emerged from the fogs of the distant past in places as disparate as the hills of Oaxaca, the banks of the Huan River, and the dry yet fertile expanse between the Tigris and Euphrates. Some of this early transcription was record-keeping, the accounting of ownership, an empirical truth-telling that extended the reach of commerce. Yet there were also the words of the prophets and priests—the divinations, omens, prophecies, and revelations—and the words of the scholars and poets—the stories, laws, and myths. A reckoning with the enduring and the sacred. The Akkadian texts, the Vedas, the Avestas, the Torah and the commentaries that were made to explain them.

In such scripture, contradictory accounts, allegories, and the use of a more complex language not spoken on a daily basis presented challenges beyond the pragmatic literacy of record-keeping. Clearly, the word of the godhead cannot be so easily confined by the shallow tongue of humans, however divinely inspired. The act of understanding sacred texts has thus always been one of interpretation.

And from the start, there have been two broad approaches to interpretation: a literal interpretation, which sticks to what is most plainly evident in the text itself, and an inferential interpretation, which situates a text within a larger framework. These approaches can work together as a progression towards a fuller understanding, though they can also sometimes stand in opposition.

Scriptural Exegesis: the literal and the nonliteral meaning

Scholarly interpretation of scripture, termed exegesis, has a storied tradition, extending to formalized methods termed hermeneutics. Hermeneutics has since developed far beyond scripture into a theory of knowledge and understanding itself.

The early usage of “hermeneutics” places it within the boundaries of the sacred. A divine message must be received with implicit uncertainty regarding its truth. This ambiguity is an irrationality; it is a sort of madness that is inflicted upon the receiver of the message. Only one who possesses a rational method of interpretation (i.e., a hermeneutic) could determine the truth or falsity of the message. (Jean Grondin via Wikipedia)

Hermeneutics spans a wide gamut, from theology and philosophy, from Hillel to Heidegger, and also parallels the development of literary criticism, from Plato and Aristotle, from Russian Formalism to Reader-response Theory, with both threads leading, quite fascinatingly (if you follow edu stuff at all), to E.D. Hirsch, Jr., who argued that an objective interpretation of literary texts is possible (by adhering to the author’s intention). And this lineage extends all the way up to the Common Core Standards and their promotion of a particular form of close reading.

But I’m getting ahead of myself. Let’s keep with the exegesis thing for a minute. I just said there’s literal interpretation and interpretation that goes beyond what is in the text, two approaches that often interweave.

Here’s how both of these approaches work together in Zoroastrian commentaries (zand) on the Avestas:

A consistent exegetical procedure is evident in manuscripts in which the original Avestan and its zand coexist. The priestly scholars first translated the Avestan as literally as possible. In a second step, the priests then translated the Avestan idiomatically. In the final step, the idiomatic translation was complemented with explanations and commentaries, often of significant length, and occasionally with different authorities being cited.

Here’s how both Hebrew and Akkadian methods of exegesis try to resolve contradictions between the approaches:

…in order to clarify the interpretation of a text, it may be necessary to adopt a solution that goes beyond the immediate and literal sense of the text. Indeed, the tension between the literal sense of a text and the sense of the text in its larger context is a perpetual concern of Akkadian and Hebrew commentators alike. An awareness of this tension is reflected in commentaries that attach two interpretations to one phrase from the base text: the literal interpretation, which does not necessarily agree with the context, and a nonliteral interpretation that succeeds in reconciling the phrase with its larger context.

In the rigorous and rich Judaic traditions of textual interpretation, extensive commentaries have been developed, and the midrash of the Torah and the halakhah (Talmud) were formalized into hermeneutic rules. There was no distinction initially drawn between literal meaning, peshat, and inferential interpretation, derash, but over time the two terms became more distinguished from each other. In halakhic, or legal, interpretation, scholars had to not only attempt to reconcile tensions within a text itself, but further reconcile laws in relation to changing economic and cultural circumstances. In the attempt to resolve such problems, “scholars…first and above all sought to find the solutions in Scripture itself, by endeavoring to penetrate to its inner or ‘concealed’ content.” In the non-legal rabbinic midrash of the Torah, there was even more room for creative interpretation. Stories termed aggadah could be interpreted at both a literal and allegorical level. Some believe there are hidden layers of meaning that can only be unveiled to those properly trained to unlock them. In the tradition of Kabbalah, exegesis moves far beyond allegorical into the realm of the mysterious and mystical.

What is interesting is how extremely literal methods could be used to move into the realm of the occult. As an example, a hermeneutical method termed notarikon takes each letter of a word as the initial letter of another word, such that one word can be expanded into an entirely new sentence. Another method, termed gematria, assigns numerical values to words based on their letters, and then uses the numbers to make esoteric inferences.

While such methods may seem bizarre at first glance, remember that scriptural exegesis assumes the premise that scripture is sacred in nature, and thus, without error. If you follow this premise all the way through, that means every single letter of every single word has a divine purpose and meaning, even when it is not immediately evident, and even when some verses or texts stand in seeming contradiction to others.

In Christian Biblical exegesis, scholars also approached interpretation from various angles, some of them in opposition and others within a progression:

… whereas some have argued that the interpretation must always be literal, or as literal as possible (since “God always means what he says”), others have treated it as self-evident that words of divine origin must always have some profounder “spiritual” meaning than that which lies on the surface, and this meaning will yield itself up only to those who apply the appropriate rules of figurative exegesis. (Britannica.com entry)

Christian scholars similarly developed hermeneutics based on literal, allegorical, moral, and anagogical interpretations.

There’s even a Latin rhyme that encapsulates the four methods, or quadriga, of figurative Biblical exegesis:

Litera gesta docet, Quid credas allegoria,

Moralis quid agas, Quo tendas anagogia.

The rhyme roughly translated:

The literal teaches what God and our ancestors did,

The allegory is where our faith and belief is hid,

The moral meaning gives us the rule of daily life,

The anagogy shows us where we end our strife.

A Talmudic scholar, Rashi, provides an instructive example of moving between different levels of interpretation:

Rashi’s Bible commentary illustrates vividly the coexistence and, to some extent, the successful reconciliation of the two basic methods of interpretation: the literal and the nonliteral. Rashi seeks the literal meaning, deftly using rules of grammar and syntax and carefully analyzing both text and context, but does not hesitate to mount Midrashic explanations, utilizing allegory, parable, and symbolism, upon the underlying literal interpretation. (Britannica.com entry)

In Islamic exegesis, or tafsir, the classical Arabic language itself is central to the task of interpretation through intensive study of rhetoric, etymology, morphology, syntax, and metaphor. The verses, or ayat, of the Qur’an can be delineated into “those that are clear and unambiguous (muhkam) and those that are allegorical (mutashabeh).” It is said that the Qur’an was revealed through seven different forms of recitation, or ahruf. Yet there is debate about what ahruf even means. Here is a hadith that elucidates the difficulty in pinning down that meaning:

From ʿAbdallâh Ibn Masʿūd: The Messenger of Allah said: “The Quran was sent down in seven ahruf. Each of these ahruf has an outward aspect (zahr) and an inward aspect (batn); each of the ahruf has a border, and each border has a lookout.”

What is common in all scriptural exegesis is the belief that the text is divinely inspired in origin, and thus, worthy of intense scrutiny to unfurl that revelatory meaning, down to the deconstruction and reconstruction of letters, morphemes, and syntax, as well as righteous attempts to ensure that any contradictions within and between sacred texts are resolved.

The truth is, truth and meaning in the written word can be a slippery thing, subject to abstraction and contradiction. Herein lies its power—the power to reveal or to deceive, both sacred and dangerous. While the word of a prophet or god requires painstaking exegesis to unspool into moral or legal guidance, poets and storytellers can craft and bend language at will to elicit desired reactions from their audience.

Literary Criticism: the significance of a text and its context

Plato feared this deceptive power. He even went so far as to advise that poets should be banned from his ideal republic. In The Republic, written in 360 BCE, Plato argued that poetry is a mere imitation of nature, and thus, inferior. Yet in this shallow deception lay great power, for the poet, through the use of melody, rhythm, and other “ingenious devices,” could take advantage of the irrational “weakness of the human mind. . . having an effect upon us like magic.”

Plato instead argued for the supremacy of the rational “arts of measuring and numbering and weighing.” While he did have an appreciation for poetry, he believed that the primary function of art should be to serve a moral purpose. Anything else was not only frivolous, but dangerous.

Yet decades later, in Poetics, Aristotle offered an alternative vision of the power of poetry. While he acknowledges that poetry is an imitation of reality, he argues that the most potent of the dramatic arts, tragedy, “imitated noble actions, and the actions of good men,” thus providing the virtuous guidance that Plato found so lacking.

Aristotle examined the “ingenious devices” of poetry closely and provided a clear description of effective literary techniques such as character, plot, and diction, while introducing concepts like catharsis and mimesis that are still applied in literary study today. He also argued that poetry serves a different function than the more quantifiable arts of the specific and the particular, and that it serves an even higher purpose:

…it is not the function of the poet to relate what has happened, but what may happen,—what is possible according to the law of probability or necessity. The poet and the historian differ not by writing in verse or in prose. . . The true difference is that one relates what has happened, the other what may happen. Poetry, therefore, is a more philosophical and a higher thing than history: for poetry tends to express the universal, history the particular.

Aristotle also tackled problems of interpretation. He suggests a number of issues and solutions, but the following one especially stood out to me due to later literary debates about whether a text should be studied based solely on what is within the text, or with consideration of an author’s intent and biography:

Things that sound contradictory should be examined by the same rules as in dialectical refutation whether the same thing is meant, in the same relation, and in the same sense. We should therefore solve the question by reference to what the poet says himself, or to what is tacitly assumed by a person of intelligence.

What’s fascinating is that this tension between Plato and Aristotle’s stances on poetry can be seen resounding in centuries of literary criticism since. As the Encyclopaedia Britannica puts it: “Although almost all of the criticism ever written dates from the 20th century, questions first posed by Plato and Aristotle are still of prime concern, and every critic who has attempted to justify the social value of literature has had to come to terms with the opposing argument made by Plato in The Republic.”

In the Medieval period, Plato’s staunch focus on the service of art for moral good is echoed in the largely unimaginative and depressing body of artwork produced by the Western world. The truly exciting action was taking place, instead, in the scholarly exegesis of biblical texts.

But in 1440, hermeneutics moved beyond its role of merely explaining the “true” meaning of the Bible. The Italian humanist and literary curmudgeon Lorenzo Valla used evidence solely within the text itself to prove that the Donation of Constantine, a document the papacy used to claim that the emperor Constantine the Great had transferred authority over Rome to the Pope, was a forgery. How did he do this? As a scholar of Latin grammar and rhetoric, he explained that the crude Latin used by its anonymous author did not match the form of Latin used in the time of Constantine.

During the Renaissance, the recovery of classical literary texts spurred a flowering of literary criticism. At the same time that Aristotle’s Poetics was translated into Latin and regained a new audience, hermeneutics shifted into a fully Aristotelian appreciation for the beauty of well-crafted art. In his Defence of Poesie, Sir Philip Sidney, echoing Aristotle, argued that the poet was superior to the historian:

So then the best of the Historian is subject to the Poet, for whatsoever action or faction, whatsoever counsaile, pollicie, or warre, strategeme, the Historian is bound to recite, that may the Poet if hee list with his imitation make his owne; bewtifying it both for further teaching, and more delighting as it please him: having all from Dante his heven to his hell, under the authority of his pen. (Renascence Edition)

This new concern with the subjective mind of the author and the reader themselves, rather than on rigid methods, became a growing focus throughout the decline of Neoclassicism and into the Romantic era, during which “The poet was credited with the godlike power that Plato had feared in him.”

This isn’t to say that Biblical exegesis faded away. Instead, new methods were developed that focused on reframing the Bible within a broader context.

The rationalist Enlightenment led hermeneutists, especially Protestant exegetists, to view Scriptural texts as secular classical texts. They interpreted Scripture as responses to historical or social forces so that, for example, apparent contradictions and difficult passages in the New Testament might be clarified by comparing their possible meanings with contemporary Christian practices. (Wikipedia entry)

This takes us into the 20th century, where a renewed formalism led to methods that refocused on interpretation of the text itself, without reference to anything else. In France, Gustave Lanson, a literary critic, promoted a pedagogical method in French universities termed l’explication de texte, in which a text’s structure, style, and literary devices are objectively examined. In Russia, a Formalist method “attempted a scientific description of literature (especially poetry) as a special use of language with observable features.” Russian Formalism stood in sharp contrast to Plato’s argument that poetry was a mere imitation of reality; the stance of Formalism is that “words were not simply stand-ins for objects but objects themselves.” Meanwhile, in British and American academia, New Criticism became the dominant form of literary interpretation, in which the author’s intent and a reader’s responses were viewed as largely irrelevant distractions. Similar to Russian Formalism, New Criticism took the stance that “everything that is needed to understand a work is present within it.”

Scholars of New Criticism even coined terminology to make it explicit that an author’s background or intent or a reader’s personal and emotional responses were invalid methods of interpretation. They termed these “The Intentional Fallacy” and “The Affective Fallacy,” respectively.

This discounting of the author’s intent and biography, and of a reader’s responses, generated opposing schools of literary criticism in the latter half of the 20th century. New Historicism focuses on the social, cultural, and philosophical contexts within which an author wrote, to the point that the text and author seem to have been almost inevitably created by their context, rather than vice versa:

In its tendency to see society as consisting of texts relating to other texts, with no ‘fixed’ literary value above and beyond the way specific cultures read them in specific situations, New Historicism is a form of postmodernism applied to interpretive history. (Wikipedia entry)

Reader-response theory, on the other hand, “argues that a text has no meaning before a reader experiences—reads—it.” This approach rejects the grounds for objectivity in the interpretation of texts, suggesting instead that meaning arises out of personal reactions and the particular context that a reader is situated within.

Other forms of criticism that drew heavily upon wider contexts beyond the text itself also became more widespread in the late 20th century, such as sociological, psychoanalytic, Marxist, feminist, or post-Structuralist criticism. Increasingly, criticism was viewed as a subjective and highly specialized academic endeavor.

Into this fray stepped E.D. Hirsch, Jr. with a book titled Validity in Interpretation, in which he argued that an objective interpretation of a text was possible, in contrast to the positions of New Historicism or Reader-response theory. However, he also took issue with the stance of New Criticism that authorial intent was a distraction from the text itself. Instead, Hirsch argued that determining authorial intent was the basis for a valid, more objective interpretation. He drew a distinction between the meaning of a text and its significance. The meaning, an understanding as determined by authorial intent, is something that is stable and does not change, while a text’s significance changes in accordance with new explanations and connections to new contexts. In his own words:

Meaning is that which is represented by a text; it is what the author meant by his use of a particular sign sequence; it is what the signs represent. Significance, on the other hand, names a relationship between that meaning and a person, or a conception, or a situation, or indeed anything imaginable. … Significance always implies a relationship, and one constant, unchanging pole of that relationship is what the text means (Validity in Interpretation).

Meaning . . . may be conceived as a self-identified schema whose boundaries are determined by an originating speech event, while significance may be conceived as a relationship drawn between that self-identified meaning and something, anything, else (“Meaning and Significance Reinterpreted”).

This distinction stood out to me because it seemed analogous to one made by Christine Counsell about two main types of knowledge in school curriculum. She distinguishes between substantive knowledge and disciplinary knowledge. Substantive knowledge is knowledge that is relatively stable and can be taught as established fact, while disciplinary knowledge engages students in the use of tools and pathways of inquiry fundamental to the discipline, and which is always evolving.

I find both of these distinctions, between meaning and significance and between substantive and disciplinary knowledge, to be useful: they allow us to see that some forms of understanding are more static than others, that the interpretation of a text is always situated within a wider context, and that interpretations will shift in accordance with that context.

The crucial point, then, is that any text has an envisioning historical and cultural context and that the context of a text is itself not simply textual—not something that can be played out solely and wholly in the textual domain. This context of the texts that concern us constrains and limits the viable interpretations that these texts are able to bear. The process of deconstruction—of interpretatively dissolving any and every text into a plurality of supposedly merit-equivalent construction—can and should be offset by the process of reconstruction which calls for viewing texts within their larger contexts. After all, texts inevitably have a setting—historical, cultural, authorial—on which their actual meaning is critically dependent (Nicholas Rescher, as quoted by a Stanford Encyclopedia of Philosophy entry on hermeneutics)

Another important aspect of E.D. Hirsch’s analysis is that it represents a convergence between literary criticism and the discipline of hermeneutics, which had been developing along largely separate tracks. Hermeneutics sprang originally out of the study of scripture, then developed into philosophical explorations of epistemology, while literary criticism clung more closely to aesthetics and classical literature.

Before we move from the topic of hermeneutics and of the relationship of a text to wider context, I think it’s important to touch on the concept of the hermeneutic circle, which very much relates to the movement between literal and nonliteral in scriptural exegesis, as well as the interpretation of the meaning and significance of a piece of literature.

The hermeneutic circle refers to the recursive movement between part and whole, whether within a text itself, in connection between texts, in connection between a text and something else, or even more broadly, in the relationship between an individual and the world he or she inhabits. We will see this circle in action in our next section on close reading.

One other fascinating thing to note about E.D. Hirsch, Jr., which can help us transition into our next section on primary and secondary public education: he has become more widely known for his promotion of cultural literacy, the idea that literacy is founded on background knowledge relevant to a culture, and that therefore a shared body of core knowledge and vocabulary should be taught explicitly in each grade. He founded an organization, Core Knowledge, which developed K-8 curricular materials to address this need. This was a contentious idea when first introduced in the late 1980s, and continues to generate debate today.

The Common Core Enters Stage Left

Ever since the exhortation of the Common Core ELA standards for students to “read closely” and “cite evidence,” close reading has been a thing in K-12 education.

CCSS.ELA-Literacy.CCRA.R.1

Read closely to determine what the text says explicitly and to make logical inferences from it; cite specific textual evidence when writing or speaking to support conclusions drawn from the text.

As you will see, debates about close reading closely echo the traditions of textual exegesis, hermeneutics, and literary criticism that preceded them.

Advocates and developers of the Common Core, such as David Coleman and Susan Pimentel, promoted a form of close reading in which analysis is confined to the text itself, similar to the approach of Formalism, l’explication de texte, and New Criticism. This approach could be viewed as an explicit reaction to a trend in K-12 classrooms of providing only easily accessible texts and questions, along with too much background context prior to reading, most especially to those students who already struggled with reading. That trend seemed to leave graduating students ill-prepared for college-level tasks oriented around highly complex academic texts, as well as for the reading of technical texts required for advancement in many careers.

The solution proffered by the standards was to engage students in reading increasingly complex texts throughout the span of their education, and to ensure questions were “text-dependent,” rather than answerable without any evidence.

However, there was a backlash against this form of close reading.

For example, Nancy Boyles wrote in ASCD that asking students only text-dependent questions doesn’t explicitly prepare students for engaging independently in their own close reading practices. She recommends asking four generic questions that students can apply independently to any text.

“The final, most compelling reason I don’t care for the Student Achievement Partners [text-dependent] questions is that although they teach the reading—the content of the text—there’s no attempt to teach the reader strategies by which that reader can pursue meaning independently. . . Teaching is about transfer. The goal is for students to take what they learn from the study of one text and apply it to the next text they read.”

Educators Kylene Beers and Robert Probst further argued in Notice & Note: Strategies for Close Reading that it is essential to support students in making personal connections to texts because meaning and engagement are created via the interaction between a text and its reader, a view similar to Reader-Response Theory:

“Meaning is created not purely and simply from the words on the page, but from the transaction with those words that takes place in the reader’s mind. . . Close reading, then, should not imply that we ignore the reader’s experience and attend closely to the text and nothing else. It should imply that we bring the text and the reader close together. To ignore either element in the transaction, to deny the presence of the reader or neglect the contribution of the text, is to make reading impossible. If we understand close reading this way, when the reader is brought into the text we have the opportunity for relevance, engagement, and rigor.”

Another knock against confining a close reading solely to what is within a text is that a reader may miss the wider social or historical context within which the text is situated. As the Odegaard Writing and Research Center at the University of Washington puts it:

“Remember that every writer is in conversation: with other writers, with history, with the forces of her culture, with the events of his time. It is helpful, for example, to read Karl Marx or Sigmund Freud with some knowledge of their moment in history. Virginia Woolf and Simone de Beauvoir were responding to writers and events in their cultures, too. When you understand the context of a work, you can better see the forces that moved the author to write that work.”

Kate Roberts and Christopher Lehman, in Falling In Love With Close Reading, suggest that you can have your cake and eat it, too.

“Instead of seeing this as a debate between two opposing sides, we believe there is a way to achieve both goals–to teach students to read more analytically, while also valuing their lives and experiences. In fact, in this book we argue that by learning to read more closely, our lives and experience grow richer as well.”

This made me think of a related conversation regarding critical thinking skills. In “What REALLY Works: Optimizing Classroom Discussions to Promote Comprehension and Critical-Analytic Thinking,” P. Karen Murphy et al. lay out two broad approaches to text-based inquiry in the classroom: an expressive approach, which taps into a student’s personal experience and emotion, and an efferent approach, which is a more objective attempt to acquire information. Murphy and her colleagues argue that it is not one or the other, but rather both working in tandem, that can best develop critical-analytic thinking:

“We propose that the solution lies not in either an efferent or an expressive approach to text and other content, but in pedagogical approaches such as small-group, classroom discussions that value knowledge-seeking, in concert with lived-through experience, to promote critical-analytic thinking.”

If we agree with Murphy et al., then we end up with a model something like this:

All of that said, however, it must be acknowledged that state ELA assessments ask students to answer efferent, text-dependent questions about a written passage in complete isolation from any wider context. A student’s personal opinion and experience, as well as the author’s biography, play no role in the analysis students are asked to conduct. So students will need to have some level of practice with this form of close reading, whatever one believes that textual interaction should ideally be, so long as the coin of the realm is test scores.

So what is close reading, then, exactly? Here are a few definitions to complicate your understanding:

  • Close reading is thoughtful, critical analysis of a text that focuses on significant details or patterns in order to develop a deep, precise understanding of the text’s form, craft, meanings, etc.

—“A Close Look at Close Reading” by Beth Burke

  • Close reading is a strategic process a reader uses in dealing with a complex text to acquire the information needed to complete a task. There is no single correct way to read something closely.

—“A Close Look at Close Reading” by LEAF (WestEd)

  • People read differently for different purposes. When you read in order to cram for a quiz, you might scan only the first line of every paragraph of a text. When you read for pleasure, you might permit yourself to linger for a long while over a particular phrase or image that you find appealing. It shouldn’t come as a surprise, then, that when you read in order to write a paper, you must adopt certain strategies if you expect your efforts to be fruitful and efficient.

—“Close Reading” by Odegaard Writing & Research Center

  • Close Reading of text involves an investigation of a short piece of text, with multiple readings done over multiple instructional lessons. Through text-based questions and discussion, students are guided to deeply analyze and appreciate various aspects of the text, such as key vocabulary and how its meaning is shaped by context; attention to form, tone, imagery and/or rhetorical devices; the significance of word choice and syntax; and the discovery of different levels of meaning as passages are read multiple times.

—“Implementing the Common Core State Standards: A Primer on Close Reading of Text” by Sheila Brown and Lee Kappes

  • Love brings us in close, leads us to study the details of a thing, and asks us to return again and again. … we argue that teaching readers to look at texts closely–by showing them how one word, one scene, or one idea matters–is an opportunity to extend a love affair with reading. It is also a chance to carry close reading habits beyond the page, to remind students that their lives are rich with significance, ready to be examined, reflected upon, and appreciated.

—Falling In Love With Close Reading by Christopher Lehman and Kate Roberts

  • Close reading should suggest close attention to the text; close attention to the relevant experience, thought, and memory of the reader; close attention to the responses and interpretations of other readers; and close attention to the interactions among those elements.

—Notice and Note: Strategies for Close Reading by Kylene Beers and Robert Probst

  • Reading nonfiction, in many ways, requires an effort not required in the reading of fiction. We must question the text, question the author, question our own understanding of the topic, and accept the possibility that our views will change as a result of the reading we’re doing. All those demands mean that the reader has great responsibility when reading nonfiction.

—Reading Nonfiction by Kylene Beers and Robert Probst

  • Close reading is important because it is the building block for larger analysis. Your thoughts evolve not from someone else’s truth about the reading, but from your own observations. The more closely you can observe, the more original and exact your ideas will be.

—“Close Reading of a Literary Passage” by Dr. Kip Wheeler

There is a multiplicity of sources providing a methodology and form for close reading. Here’s just a snapshot of those I’ve reviewed in my own research on the topic in preparation for a professional development series:

There are some essential components to a close reading process that become evident from these different approaches:

  • It is performed with a short text or short snippet of a longer text
  • There is a specific focus and purpose to reading that particular text
  • There are multiple reads, through which the meaning discovered in a particular portion becomes extended across or beyond that text
  • A system of annotation is applied
  • Textual understanding and interpretation typically moves from literal to inferential (though in the case of French l’explication de texte, analysis is maintained at the literal, more objective level of summary) based on patterns identified in initial observations
  • The end product is a written response or discussion

OK. So there it is. I’m pretty sure there’s a lot more to say on any and all of these things, but writing this has already taken way too much of my very limited time these days. I mean seriously, I’ve spent over a month writing this.

What have I learned? I don’t know if I can concisely articulate it, but it seems to me that textual interpretation, in any form you can name, whether scriptural exegesis or close reading, is most fruitful when it is viewed in a more flexible, rather than rigid, manner. That is, whatever stance and method one adopts, one recognizes there will be limitations based on that stance and method. Furthermore, it seems to me that methods which are able to accommodate and balance both literal and nonliteral meanings, and bear significance that is both objective and subjective, while acknowledging both the text itself and its relationship to a broader context, will be the most compelling.

But maybe that’s just me. What do you think?

Supporting the Development of Clear and Coherent Literacy Instruction in Schools


I spent some time this summer drafting a policy proposal for the P2Tomorrow competition, mostly as an exercise to sharpen my own thinking around issues I’ve seen with literacy. Thanks to some great feedback from some very smart people (if you are reading this and you are one of them: thank you!), I am proud of the final result. I didn’t win, but I don’t feel so bad about that since the winners are a truly diverse and amazing collection of ideas (see the list of winners and their ideas here).

So I’m sharing my proposal with you. Please share if you find these ideas useful.

Supporting the Development of Clear and Coherent Literacy Instruction in Schools

The Problem with Literacy: It’s Not Just ELA

Is literacy a subject, or a whole school endeavor?

While defining “literacy” is tricky, especially in a rapidly changing society, most would include in their definition the ability to read and think critically and to communicate effectively. Such literacy is not developed haphazardly nor solely within one subject. It requires a school to work cohesively across classrooms to develop shared expectations, content, and practices.

Yet states label Grade 3-8 literacy assessments as “English Language Arts,” and accountability thus falls primarily on the shoulders of one content area: the ELA department. In effect, ELA is reduced to the practice of generic and shallow reading and writing skills as preparation for state assessments. Both national (NAEP) and international (PISA) reading scores have flatlined for two decades. One reason is that most students receive only scattered exposure to the academic language and conceptual understandings gained from a school-wide engagement in a coherent set of literacy practices.

Though the Common Core Standards attempted to address this disconnect through promotion of literacy standards for ELA and History/Social Studies, Science, & Technical Subjects, a misconception remains in the field that the recommendation for a “balance between informational and literary reading” should be solely driven by ELA, rather than across those other content areas. This has led some educators to believe literature should now rarely be taught, a misreading reinforced by state ELA assessments skewed towards nonfiction passages.

This narrowing of the curriculum has been widely recognized since 2001. ESSA sought to rectify this by redefining what is meant by a “well-rounded education,” including more subjects beyond the “core academic subjects” of prior ESEA legislation. ESSA also allows Title II funding to be used to help teachers “integrate comprehensive literacy instruction in a well-rounded education.”

Yet thus far states have been largely unable to clarify what it means to teach literacy coherently and effectively at the ground level. Some school leaders and teachers remain misinformed about the key shifts of their own state standards, and confusion about the meaning of literacy and its relationship to ELA and other subjects has led to a wide variety of pedagogical approaches and curricula of variable quality, complicated by layers of often contradictory state and district policies and initiatives.

A growing recognition of the importance of curriculum and the need for more effective resources is promising, but solutions must go far beyond the evaluation and adoption of higher-quality curriculum. A school may adopt standards-aligned, high-quality curricula for various subjects but remain completely incoherent. What is needed are consistent and ongoing processes for collaborative planning and reflection on curriculum and literacy practices across a school.

What State and District Leaders Can Do

How can state and district leaders support school teams in developing, reflecting on, and sustaining processes that will promote literacy coherently across a school?

There are four moves that policy leaders can make:

  • Redefine literacy
  • Clarify expectations for school-wide processes for collaborative planning and reflection on literacy content and practices
  • Create a process for surveying educators and the wider public on what texts should be selected for literacy assessments, and publish that list in advance of each school year
  • Promote team — rather than individual — accountability for results on literacy assessments

Step 1 We have to begin with a redefinition of what we mean by literacy.
The ESEA, as reauthorized under NCLB and ESSA, requires states to assess “reading or language arts” annually in grades 3-8. Despite ESSA’s expanded vision of a “well-rounded education,” states continue to narrowly label their assessments as subject-specific ELA (46 out of 50, according to my count). Only 6 states mention the word “literacy” in their assessment titles.

It may seem like a small thing, but relabeling state assessments as literacy assessments, rather than ELA, would send a clear signal that literacy is not confined to a single subject. This could initiate a state-wide dialogue about what literacy means as a whole school endeavor.

Step 2 As a part of that dialogue, expectations should be developed for what school-level processes will support the development of shared, high-quality literacy content and practices. As a model, the International Baccalaureate standards for curriculum provide guidance for the collaboration and discussion expected among all teachers within a school. By establishing clear criteria for ongoing school-based reflection and curriculum alignment, state and district leaders can promote the idea that curriculum is dynamic and constantly in development, rather than a static item that is purchased and put in place.

Step 3 To further foster an innovative school-wide focus on literacy improvement, the state could engage multiple stakeholders in the cross-curricular selection of texts that would appear on assessments the following year. By involving educators and the wider public in this process, in partnership with the assessment vendor, greater focus, clarity, and transparency for what is taught and assessed would be cultivated. Furthermore, this could help level the playing field for students who need more exposure to the academic vocabulary and background knowledge required for comprehension of the selected texts and topics.

Step 4 Accountability for literacy assessments could then shift from resting solely on ELA departments to include other subjects, resisting the narrowing of curriculum that is so pervasive. One state, Louisiana, has already taken a bold step towards this by piloting assessments that blend social studies and ELA, and that assess books kids have actually studied rather than random passages.

Such measures signal to schools that teaching literacy is the responsibility of a team, and can do much to counteract the prevailing headwinds of narrow and shallow test prep.

Anticipated Outcomes

What could we expect as a result of these moves?

Let’s consider a school representative of our current situation.

MS 900 is a public middle school in an urban district. The school has an alternating schedule for reading and writing, using two separate and unaligned ELA curricula. The ELA teachers complain about the complexity of the writing program and the lack of professional development. Students complain about boring instruction. Grade-level ELA and math teams meet twice per week, and the social studies and science teams meet once per week. According to the state’s teacher evaluation system and testing data, instructional quality varies widely across the school, with a few effective teachers, two highly effective teachers, and the rest developing.

Step 1 At a district meeting, the MS 900 staff learned about a new state initiative in which the expectation would be that a whole school works together to teach literacy, and that tests would reflect this. The administrators and teachers considered how schedules would need to change to provide opportunities for cross-curricular teams to meet regularly to discuss and plan for this new conception of literacy.

Step 2 Grade-level teams at MS 900 were rescheduled to meet three times a week, and each departmental team once a week. The school’s support organization introduced protocols for teams to share and discuss the content and practices currently used across different classrooms. Grade-level teams also examined student work and discussed common approaches to targeting student literacy needs. Meanwhile, the ELA department determined that reading literature and writing narratives and poetry had been too long neglected, and discussed with their grade-level teams how strategies for reading and writing informational texts could be shared across the grade. The SS and science departments highlighted strategies specific to their subjects, while sharing topics and themes that could be developed across the grade. The teachers with more effective practices began to be recognized by their colleagues for their expertise, and other teachers requested to visit their classrooms to learn.

Step 3 When the new state survey for text selection opened up the next year, both grade-level and departmental teams discussed which texts and topics were critical for meeting state standards, for teaching their students about the world, and for engaging students with relevant material. Each team came to a consensus and submitted their selections. When the state published the texts, teachers were excited to see some of their choices reflected on the list, as well as to be introduced to new literary and nonfiction texts they hadn’t yet read but that were highly rated. Teams began planning how they would incorporate study of the selected texts into their shared curriculum.

Step 4 After two years of this process, when the state introduced new accountability measures for schools based on literacy results, with the weighting shared across ELA, social studies, and science teachers, MS 900 teachers felt prepared for the challenge. They were even eager to view the results and item analysis so they could figure out how to work together to improve their students’ literacy abilities. Imagine that.

References

1 Cambridge Assessment (2013) “What is literacy? An investigation into definitions of English as a subject and the relationship between English, literacy and ‘being literate’: A Research Report Commissioned by Cambridge Assessment.” http://www.cambridgeassessment.org.uk/Images/130433-what-is-literacy-an-investigation-into-definitions-of-english-as-a-subject-and-the-relationship-between-english-literacy-and-being-literate-.pdf

2 Wexler, N. (2018) “Why American Students Haven’t Gotten Better at Reading in 20 Years.” The Atlantic. https://www.theatlantic.com/education/archive/2018/04/-american-students-reading/557915/
Serino, L. (2017) “What international assessment scores reveal about American education.” Brookings Institution, https://www.brookings.edu/blog/brown-center-chalkboard/2017/04/07/what-international-assessment-scores-reveal-about-american-education/

3 Shanahan, T. (2013) “You Want Me to Read What?!” Educational Leadership, ASCD. http://www.ascd.org/publications/educational-leadership/nov13/vol71/num03/You-Want-Me-to-Read-What%C2%A2!.aspx

4 King, K.V. and Zucker, S. (2005) “Curriculum Narrowing – Pearson Assessments.” 18 Aug. 2005, http://images.pearsonassessments.com/images/tmrs/tmrs_rg/CurriculumNarrowing.pdf

5 Workman, E. and Jones, S.D. (2016) “ESSA’s Well-Rounded Education.” Education Commission of the States. https://www.ecs.org/essas-well-rounded-education/

6 Kaufman, J., Lindsay, T., and V. Darleen Opfer. (2016) “Creating a Coherent System to Support Instruction Aligned with State Standards: Promising Practices of the Louisiana Department of Education.” The Rand Corporation, https://www.rand.org/pubs/research_reports/RR1613.html
Kaufman, J. & Tsai, T. (2018). “School Supports for Teachers’ Implementation of State Standards Findings from the American School Leader Panel.” The Rand Corporation, https://www.rand.org/pubs/research_reports/RR2318.html

7 Whitehurst, G.J. (2009) “Don’t forget curriculum.” Brookings Institution, https://www.brookings.edu/research/dont-forget-curriculum/
Chingos, M. M., & Whitehurst, G. J. (2012) “Choosing blindly: Instructional materials, teacher effectiveness, and the Common Core.” Brookings Institution, https://www.brookings.edu/research/choosing-blindly-instructional-materials-teacher-effectiveness-and-the-common-core/
Kane, T. J. (2016) “Never judge a book by its cover – use student achievement instead.” Brookings Institution, https://www.brookings.edu/research/never-judge-a-book-by-its-cover-use-student-achievement-instead/
Steiner, D. (2017) “Curriculum research: What we know and where we need to go.” StandardsWork, https://standardswork.org/wp-content/uploads/2017/03/sw-curriculum-research-report-fnl.pdf
Chiefs for Change (2018) “Statement on the need for high-quality curriculum.” http://chiefsforchange.org/statement-on-the-need-for-high-quality-curricula/

8 International Baccalaureate (2014) “Programme standards and practices.” https://www.ibo.org/globalassets/publications/become-an-ib-school/programme-standards-and-practices-en.pdf

9 Louisiana Department of Education (2018) “Louisiana ESSA Innovative Assessment Pilot First To Receive Federal Approval.” https://www.louisianabelieves.com/newsroom/news-releases/2018/07/27/louisiana-essa-innovative-assessment-pilot-first-to-receive-federal-approval.

Building an Instructional Core for Student Literacy: Part II

In my last post, way back in August (things got busy!), we examined the problem of incoherency in literacy instruction, and I proposed the following hypothesis:

If a school comes to a clear understanding of what they teach, and can articulate why they are teaching it to parents, students, and the wider public, then this will ultimately result in improved academic outcomes for students due to the greater coherency and consistency in what is taught to students throughout the school.

So how can a school come to a clearer understanding of what they teach, how they will teach it, and a rationale and vision for literacy?

Define What You Want Kids to Know and Be Able to Do

A good place to start is for a school to define what knowledge and skills they believe children should walk out of their building equipped with when they graduate. And by define, I mean truly and discretely define, not simply generate a set of feel-good statements like, “I want kids to be lifelong learners and passionate, independent readers who have 21st century learning skills. . .”

A hearty chunk of skills is already defined by state standards (itself a contentious enough process), but due to the decentralized nature of American schools, as well as a strong anti-intellectual current, there’s reluctance to define the content–in the form of topics or texts–that students should study.

And so here we are, like I said in my last post, in a situation wherein parents and the wider public have nary a clue about what is actually taught in most public school classrooms.

I think that much of this is attributable to confusion between knowledge and skills. Some school officials, if challenged, will point to state standards and say, “There’s what we teach.” But standards are relatively abstract goals composed of various strands of skills wrapped together. They require significant work on the educator’s part to “unpack” into more concrete subskills and targets for learning.

But even then, you’ll still be missing a critical component of literacy.

Let me give you an example. Let’s say you’ve broken down the Common Core reading standard for informational text which states, “Determine the main idea of a text and explain how it is supported by key details.” You’ve broken down the ability to meet this standard into a few subskills, such as, “I can distinguish between the important and unimportant details in a text” (look familiar? This is from my post on scaffolding a while back!).

That would be a typical “learning target” in a classroom. But there’s a key element missing: Which text? What details?

Because that’s really where the rubber hits the road. We pretend academic literacy can be developed from an isolated set of skills, but vocabulary and background knowledge are a critical component of literacy. And academic vocabulary and knowledge are built cumulatively from the study of related texts and topics over time.

Knowledge vs. Skills

Knowledge and skills are not the same thing, and it’s important to be able to delineate them. They exist on different planes, but they share a point at which they converge.

Here’s a little graphic I came up with to demonstrate their convergence and differentiation:

The Roadmap to Literacy V2

Where lies the point of convergence between literacy knowledge and skills? It lies in the texts that students read.

Do the texts cumulatively build knowledge of key topics and themes? Or are they happenstance and scattered, dependent on the teacher?

This is why you’ll hear literacy experts talk about the importance of “text sets” that focus on key topics and themes. This is how academic vocabulary and knowledge are built.

Speaking of “experts,” I had an extended dialogue with an author and professional development facilitator who claimed that rather than two strands, like the ones above, there are in fact three strands, or “three dimensions,” of learning: concepts, facts, and skills. She and I went back and forth about whether and how conceptual knowledge is distinguished from factual knowledge. Rather than refer me to research or other verified sources, she would just tell me that I needed to read her book. Color me skeptical.

Maybe she’s right, but I ain’t reading her book (unless she sends me a free copy. Then, maybe). But I figured I should put it out there in case you do find that distinction useful.

I’ll stop there for this already overlong post. There’s a heck of a lot more to dive into. But I’m hoping that even just drawing that line between knowledge and skills might help you to redefine how you talk about literacy in your school. Do your students have text sets available to them that build knowledge around key topics and themes that they can use to practice and apply key literacy skills more independently? Has your school defined key topics and themes?

Does the curriculum your school uses build knowledge around key topics and themes? Does it build it vertically across grades and horizontally across subjects? Or is it so heavily skills-based that it’s entirely unclear what texts are actually to be read?

Start there, and your school can begin to tackle the sticky problems of literacy at a much deeper level, and that approach can pay dividends in student learning over time.