Assessing and Supporting Word-level Reading


In this post, I’ll continue pulling together my notes on what I’m learning about reading. Thank you in advance for reading, sharing your thinking, and helping me to connect with a broader community committed to improving literacy instruction.

I want to first draw everything back to the Simple View of Reading as a friendly reminder that reading is big.

Word-level Recognition × Language Comprehension = Reading Comprehension

I’ve only been focused on the word-level piece, because that’s the part that was so new to my own understanding. But the language piece is HUGE!
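The multiplicative structure of the Simple View is worth pausing on, and a toy sketch can make it concrete. (This is purely illustrative: the 0.0–1.0 scale and the function name are my own assumptions, not an actual assessment scale.)

```python
def reading_comprehension(word_recognition: float, language_comprehension: float) -> float:
    """Simple View of Reading: comprehension is the *product* of the two
    components, not their sum (scores on an illustrative 0.0-1.0 scale)."""
    return word_recognition * language_comprehension

# Because the relationship is multiplicative, a weakness in either component
# caps overall comprehension, no matter how strong the other one is:
print(round(reading_comprehension(0.2, 0.9), 2))  # strong language, weak decoding -> 0.18
print(round(reading_comprehension(0.0, 1.0), 2))  # no decoding at all -> 0.0
```

The point the product form captures: if either factor is zero, reading comprehension is zero. Strong oral language cannot compensate for an absent word-level skill, and vice versa.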

Anyway, in this post, I’ll keep to the word-level recognition side of things and focus on which assessments and programs we might use to tackle it.

It’s one thing to have a clear theory and a model; it’s another thing to act upon it. This is where the real debates begin, because at some point, the rubber needs to hit the road:

  1. What will we use to screen and diagnose code-based and meaning-based literacy skills?
  2. What will we do in our core instruction to prevent reading difficulties?
  3. What will we do to intervene when core instruction is insufficient?

This means a school needs to have an RTI model of some kind, which is a level of sophistication that, unfortunately, many schools struggle to reach. There’s a lot more that Seidenberg, Kilpatrick, and I have to say on this topic, but in this post, I’ll maintain a narrower focus. I’d like to dig further into the RTI piece in a future post (I have some criticisms of the model, though I’m rethinking them in light of some of my new understandings).

Every School Needs a Universal Screener

One of my favorite things about the Advanced Literacy framework that both NY state and NYC have adopted is that it promotes the need to go far beyond the data provided by a state assessment. We need universal literacy screeners: short tests that can immediately identify kids who are behind in either code-based or meaning-based ability, and whose results we can then drill into further with diagnostics.

Because the available tests are not always ideally suited for assessing the various components of reading, the best reading assessment tool is the evaluator’s knowledge of research on reading acquisition and reading difficulties.

David Kilpatrick in Essentials of Assessing, Preventing, and Overcoming Reading Difficulties

The problem is that there are no perfect assessments.

And there are so many levels to reading assessment that it’s almost fractal in nature. You’d need a significant battery of subskill assessments to get a full and accurate picture of any individual child’s reading ability.

Another problem is that time is limited, and there are already a substantial number of tests that students are forced to take. Some are in-house, some are district mandated, some are used to evaluate teachers.

Ultimately, a school must make sense of them as best it can. This is where the Simple View of Reading really comes in handy. Different assessments provide you with different kinds of information.

Thankfully, there are a lot of great resources for determining which screeners you will use. The Gaab Lab at Boston Children’s Hospital has an extensive compilation of screeners here.

Assessments of Phonological Awareness

Kilpatrick recommends using both the PAST and the CTOPP-2 as further diagnostics after a universal screener. But I recommend using only the PAST, because it’s free versus $347 for the CTOPP-2 kit. Why is the PAST free? Because Kilpatrick developed it and publishes it at no cost here: https://www.thepasttest.com/. He provides instructions on the site as well. Pretty darn cool.

I’ve begun piloting the use of the PAST at a few schools I support, and it’s been pretty eye-opening to see just how much need there is with phonological awareness in the students I’ve tested. I’ve administered it to an 8th grade self-contained class, and all of the students had phonological deficits — some at the most basic of levels. One student struggled to say the word “fantastic.” He couldn’t get that last syllable, even when I slowed it down and repeated it. 8th grade.

This has only given me a greater sense of urgency in figuring this out.

The other thing I noticed is that the person administering the PAST really has to know their phonemes. It’s surprisingly hard to do well. In order to get an accurate gauge of student ability, you have to deliver the instructions swiftly and precisely. If you slow down or stumble when saying, “Say guide. Now say guide . . . but instead of . . . /g/ say /r/,” you can easily tax the student’s working memory, and they forget which word they are supposed to use while paying attention to the phonemes you’re saying.

When training teachers to administer the PAST, I first have to ensure they can pronounce the phonemes accurately, and then deliver the tasks with swift pacing. This takes practice!

So my advice is to practice delivering the PAST with someone else, multiple times, before you administer to a student.

Check out my new Resources page for a couple of trackers you can use once you’ve administered the PAST.

Assessments of Phonics Skills

Kilpatrick recommends using the TOWRE-2 Phonemic Decoding Efficiency subtest and the KTEA-3 Nonsense Word Decoding subtest. The problem is that each is part of a larger kit, and those kits are expensive. If you’ve got a school psychologist in your building who uses these and can lend you a hand, that’s great.

Protip: “Nonsense word tasks appear to be the best way to evaluate a student’s phonics skills. In essence, all unfamiliar words a student encounters are functionally ‘nonsense’ words until they are correctly identified. . . . Timed nonsense word reading, such as in the TOWRE-2 and the KTEA-3, is arguably a better assessment of a student’s cipher skills than the traditional, untimed nonsense word reading tasks.  . . . It is recommended that any timed nonsense word reading task be administered after an untimed task, and not before.”

Though Kilpatrick recommends these normed assessments, he does acknowledge that they “do not provide much information about the specifics of what elements of phonics skills are weak or missing. By contrast, there are many criterion-based assessments of very specific elements of phonic knowledge. Some are commercially available assessments and others are free online. These criterion-referenced assessments will index the particular letter-sound combinations that the student knows, such as the various letters, blends, digraphs, and diphthongs, which can aid instructional planning.”

So my (admittedly amateur) advice? Normed assessments are great if you can afford them. But you can use something like the CORE Phonics Survey, the DIBELS Nonsense Word Fluency tasks, or Ruth Miskin Nonsense Word Test (all available for free). I also just got an OG (Orton-Gillingham) phonic screen from a colleague, and it was really short. Please let me know what else you might recommend.

I’ll stop here.

There’s much more to talk about with assessments for word-level reading, but I’ll stop here. Even out of these two, phonemic awareness and phonics, I’ve elected to only focus on one — phonemic awareness. Why? Because if Kilpatrick and Seidenberg are right, this is the core area of deficit that causes word-level reading gaps. And because I’m just trying this out and seeing what kind of practices and systems I can support a school in developing that are sustainable and scalable, and you have to start somewhere.

Even just administering the PAST is a much bigger endeavor than it seems at first glance. You need to train and practice with it. Then you need to test each student individually, in a space where you have enough quiet to be heard.

And then you need to figure out how to provide effective intervention in a consistent and effective way. From the first set of data I just collected last week, I can see this will be more complicated than I thought. Each student is at different levels of phonemic awareness, so how can we group them strategically while still addressing each student’s need?

Help! If you’ve used Kilpatrick’s Equipped for Reading Success program, especially with older students, any advice is much appreciated.

Stochastic Terrorism

https://www.wired.com/story/jargon-watch-rising-danger-stochastic-terrorism/

An interesting concept that has relevance for schools.

Though stochastic bullying or stochastic cheating might be more apropos…

Close Reading: The Context of an Exegesis

The first thing that happened to reading is writing. For most of our history, humans have been able to speak but not read. Writing is a human creation, the first information technology, as much an invention as the telephone or computer.

—Mark Seidenberg, Language at the Speed of Sight

A growing contingent of scholars argue that our “superpower” as a species is not so much our intelligence as our collective intelligence and our capacity for what’s called cumulative culture: that is, our ability to stockpile knowledge and pass it down from generation to generation, tinkering with it and improving it over time.

—Steve Stewart-Williams, “How Culture Makes Us Smarter”

The written word emerged from the fogs of the distant past in places as disparate as the hills of Oaxaca, the banks of the Huan River, and the dry yet fertile expanse between the Tigris and Euphrates. Some of this early transcription was record-keeping, the accounting of ownership, an empirical truth-telling that extended the reach of commerce. Yet there were also the words of the prophets and priests—the divinations, omens, prophecies, and revelations—and the words of the scholars and poets—the stories, laws, and myths. A reckoning with the enduring and the sacred. The Akkadian texts, the Vedas, the Avestas, the Torah and the commentaries that were made to explain them.

In such scripture, contradictory accounts, allegories, and the use of a more complex language not spoken on a daily basis presented challenges beyond the pragmatic literacy of record-keeping. Clearly, the word of the godhead cannot be so easily confined by the shallow tongue of humans, however divinely inspired. The act of understanding sacred texts has thus always been one of interpretation.

And from the start, there have been two broad approaches to interpretation: a literal interpretation, which sticks to what is most plainly evident in the text itself, and an inferential interpretation, which situates a text within a larger framework. These approaches can work together as a progression towards a fuller understanding, though they can also sometimes stand in opposition.

Scriptural Exegesis: the literal and the nonliteral meaning

Scholarly interpretation of scripture, termed exegesis, has a storied tradition, extending to formalized methods termed hermeneutics. Hermeneutics has since developed far beyond scripture into a theory of knowledge and understanding itself.

The early usage of “hermeneutics” places it within the boundaries of the sacred. A divine message must be received with implicit uncertainty regarding its truth. This ambiguity is an irrationality; it is a sort of madness that is inflicted upon the receiver of the message. Only one who possesses a rational method of interpretation (i.e., a hermeneutic) could determine the truth or falsity of the message. (Jean Grondin via Wikipedia)

Hermeneutics spans a wide gamut, from theology and philosophy, from Hillel to Heidegger, and also parallels the development of literary criticism, from Plato and Aristotle, from Russian Formalism to Reader-response Theory, with both threads leading, quite fascinatingly (if you follow edu stuff at all), to E.D. Hirsch, Jr., who argued that an objective interpretation of literary texts is possible (by adhering to the author’s intention). And this lineage extends all the way up to the Common Core Standards and their promotion of a particular form of close reading.

But I’m getting ahead of myself. Let’s stay with the exegesis thing for a minute. I just said there’s literal interpretation and interpretation that goes beyond what is in the text, two approaches that often interweave.

Here’s how both of these approaches work together in Zoroastrian commentaries (zand) on the Avestas:

A consistent exegetical procedure is evident in manuscripts in which the original Avestan and its zand coexist. The priestly scholars first translated the Avestan as literally as possible. In a second step, the priests then translated the Avestan idiomatically. In the final step, the idiomatic translation was complemented with explanations and commentaries, often of significant length, and occasionally with different authorities being cited.

Here’s how both Hebrew and Akkadian methods of exegesis try to resolve contradictions between the approaches:

…in order to clarify the interpretation of a text, it may be necessary to adopt a solution that goes beyond the immediate and literal sense of the text. Indeed, the tension between the literal sense of a text and the sense of the text in its larger context is a perpetual concern of Akkadian and Hebrew commentators alike. An awareness of this tension is reflected in commentaries that attach two interpretations to one phrase from the base text: the literal interpretation, which does not necessarily agree with the context, and a nonliteral interpretation that succeeds in reconciling the phrase with its larger context.

In the rigorous and rich Judaic traditions of textual interpretation, extensive commentaries have been developed, and the midrash of the Torah and the halakhah (Talmud) were formalized into hermeneutic rules. There was no distinction initially drawn between literal meaning, peshat, and inferential interpretation, derash, but over time the two terms became more distinguished from each other. In halakhic, or legal, interpretation, scholars had to not only attempt to reconcile tensions within a text itself, but further reconcile laws in relation to changing economic and cultural circumstances. In the attempt to resolve such problems, “scholars…first and above all sought to find the solutions in Scripture itself, by endeavoring to penetrate to its inner or ‘concealed’ content.” In the non-legal rabbinic midrash of the Torah, there was even more room for creative interpretation. Stories termed aggadah could be interpreted at both a literal and allegorical level. Some believe there are hidden layers of meaning that can only be unveiled to those properly trained to unlock them. In the tradition of Kabbalah, exegesis moves far beyond allegorical into the realm of the mysterious and mystical.

What is interesting is how extremely literal methods could be used to move into the realm of the occult. As an example, a hermeneutical method termed notarikon treats each letter of a word as the initial letter of another word, such that one word can be expanded into an entirely new sentence. Another method, termed gematria, assigns numerical values to words based on their letters, and then uses those numbers to make esoteric inferences.
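For the curious, the mechanical core of gematria is simple enough to sketch in a few lines of code. The letter values below are the standard assignments (mispar hechrachi); the helper name and handling of final-form letters are my own choices for this sketch.

```python
# Standard gematria values for the Hebrew alphabet (mispar hechrachi).
GEMATRIA = {
    "א": 1, "ב": 2, "ג": 3, "ד": 4, "ה": 5, "ו": 6, "ז": 7, "ח": 8, "ט": 9,
    "י": 10, "כ": 20, "ל": 30, "מ": 40, "נ": 50, "ס": 60, "ע": 70, "פ": 80,
    "צ": 90, "ק": 100, "ר": 200, "ש": 300, "ת": 400,
}
# Final-form letters share the value of their regular forms.
FINALS = {"ך": "כ", "ם": "מ", "ן": "נ", "ף": "פ", "ץ": "צ"}

def gematria(word: str) -> int:
    """Sum the numerical values of a word's letters."""
    return sum(GEMATRIA[FINALS.get(ch, ch)] for ch in word if ch.strip())

# The classic example: chai ("life"), chet (8) + yod (10) = 18 -- which is
# why gifts of money in multiples of 18 are traditional.
print(gematria("חי"))  # -> 18
```

Once words reduce to numbers, the interpretive move is to treat words or phrases with equal sums as esoterically linked, which is where the method leaves arithmetic behind and enters mysticism.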

While such methods may seem bizarre at first glance, remember that scriptural exegesis assumes the premise that scripture is sacred in nature, and thus, without error. If you follow this premise all the way through, that means every single letter of every single word has a divine purpose and meaning, even when it is not immediately evident, and even when some verses or texts stand in seeming contradiction to others.

In Christian Biblical exegesis, scholars also approached interpretation from various angles, some of them in opposition and others within a progression:

… whereas some have argued that the interpretation must always be literal, or as literal as possible (since “God always means what he says”), others have treated it as self-evident that words of divine origin must always have some profounder “spiritual” meaning than that which lies on the surface, and this meaning will yield itself up only to those who apply the appropriate rules of figurative exegesis. (Britannica.com entry)

Christian exegetes similarly developed hermeneutics based on literal, allegorical, moral, and anagogical interpretations.

There’s even a Latin rhyme that encapsulates the four methods, or quadriga, of figurative Biblical exegesis:

Litera gesta docet, Quid credas allegoria,

Moralis quid agas, Quo tendas anagogia.

The rhyme roughly translated:

The literal teaches what God and our ancestors did,

The allegory is where our faith and belief is hid,

The moral meaning gives us the rule of daily life,

The anagogy shows us where we end our strife.

A Talmudic scholar, Rashi, provides an instructive example of moving between different levels of interpretation:

Rashi’s Bible commentary illustrates vividly the coexistence and, to some extent, the successful reconciliation of the two basic methods of interpretation: the literal and the nonliteral. Rashi seeks the literal meaning, deftly using rules of grammar and syntax and carefully analyzing both text and context, but does not hesitate to mount Midrashic explanations, utilizing allegory, parable, and symbolism, upon the underlying literal interpretation. (Britannica.com entry)

In Islamic exegesis, or tafsir, the classical Arabic language itself is central to the task of interpretation through intensive study of rhetoric, etymology, morphology, syntax, and metaphor. The verses, or ayah, of the Qur’an can be delineated into “those that are clear and unambiguous (muhkam) and those that are allegorical (mutashabeh).” It is said that the Qur’an is revealed through seven different forms of recitation, or ahruf. Yet there is debate about what the meaning of ahruf even is. Here is a hadith that elucidates the difficulty in pinning down that meaning:

From ʿAbdallâh Ibn Masʿūd: The Messenger of Allah said: “The Quran was sent down in seven ahruf. Each of these ahruf has an outward aspect (zahr) and an inward aspect (batn); each of the ahruf has a border, and each border has a lookout.”

What is common in all scriptural exegesis is the belief that the text is divinely inspired in origin, and thus, worthy of intense scrutiny to unfurl that revelatory meaning, down to the deconstruction and reconstruction of letters, morphemes, and syntax, as well as righteous attempts to ensure that any contradictions within and between sacred texts are resolved.

The truth is, truth and meaning in the written word can be a slippery thing, subject to abstraction and contradiction. Herein lies its power—the power to reveal or to deceive, both sacred and dangerous. While the word of a prophet or god requires painstaking exegesis to unspool into moral or legal guidance, poets and storytellers can craft and bend language at will to elicit desired reactions from their audience.

Literary Criticism: the significance of a text and its context

Plato feared this deceptive power. He even went so far as to advise that poets should be banned from his ideal republic. In The Republic, written in 360 BCE, Plato argued that poetry is a mere imitation of nature, and thus, inferior. Yet in this shallow deception lay great power, for the poet, through the use of melody, rhythm, and other “ingenious devices,” could take advantage of the irrational “weakness of the human mind. . . having an effect upon us like magic.”

Plato instead argued for the supremacy of the rational “arts of measuring and numbering and weighing.” While he did have an appreciation for poetry, he believed that the primary function of art should be to serve a moral purpose. Anything else was not only frivolous, but dangerous.

Yet decades later, in Poetics, Aristotle offered an alternative vision of the power of poetry. While he acknowledges that poetry is an imitation of reality, he argues that the most potent of the dramatic arts, tragedy, “imitated noble actions, and the actions of good men,” thus providing the virtuous guidance that Plato found so lacking.

Aristotle examined the “ingenious devices” of poetry closely and provided a clear description of effective literary techniques such as character, plot, and diction, while introducing concepts like catharsis and mimesis that are still applied in literary study today. He also argued that poetry serves a different function than the more quantifiable arts of the specific and the particular, and that it serves an even higher purpose:

…it is not the function of the poet to relate what has happened, but what may happen,—what is possible according to the law of probability or necessity. The poet and the historian differ not by writing in verse or in prose. . . The true difference is that one relates what has happened, the other what may happen. Poetry, therefore, is a more philosophical and a higher thing than history: for poetry tends to express the universal, history the particular.

Aristotle also tackled problems of interpretation. He suggests a number of issues and solutions, but the following one especially stood out to me due to later literary debates about whether a text should be studied based solely on what is within the text, or with consideration of an author’s intent and biography:

Things that sound contradictory should be examined by the same rules as in dialectical refutation whether the same thing is meant, in the same relation, and in the same sense. We should therefore solve the question by reference to what the poet says himself, or to what is tacitly assumed by a person of intelligence.

What’s fascinating is that this tension between Plato and Aristotle’s stances on poetry can be seen resounding in centuries of literary criticism since. As the Encyclopaedia Britannica puts it: “Although almost all of the criticism ever written dates from the 20th century, questions first posed by Plato and Aristotle are still of prime concern, and every critic who has attempted to justify the social value of literature has had to come to terms with the opposing argument made by Plato in The Republic.”

In the Medieval period, Plato’s staunch focus on the service of art for moral good is echoed in the largely unimaginative and depressing body of artwork produced by the Western world. The truly exciting action was taking place, instead, in the scholarly exegesis of biblical texts.

But in 1440, hermeneutics moved beyond its role of merely explaining the “true” meaning of the Bible. An Italian humanist and literary curmudgeon, Lorenzo Valla, proved that a document used by the papacy to claim that emperor Constantine the Great had transferred authority of Rome to the Pope was a forgery by using evidence solely within the text itself. How did he do this? As a scholar of Latin grammar and rhetoric, he explained that the crude Latin used by its anonymous author did not match the form of Latin used in the time of Constantine.

During the Renaissance, the recovery of classical literary texts spurred a flowering of literary criticism. At the same time that Aristotle’s Poetics was translated into Latin and regained a new audience, hermeneutics shifted into a fully Aristotelian appreciation for the beauty of well-crafted art. In his Defence of Poesie, Sir Philip Sidney, echoing Aristotle, argued that the poet was superior to the historian:

So then the best of the Historian is subject to the Poet, for whatsoever action or faction, whatsoever counsaile, pollicie, or warre, strategeme, the Historian is bound to recite, that may the Poet if hee list with his imitation make his owne; bewtifying it both for further teaching, and more delighting as it please him: having all from Dante his heven to his hell, under the authority of his pen. (Renascence Edition)

This new concern with the subjective mind of the author and the reader themselves, rather than on rigid methods, became a growing focus throughout the decline of Neoclassicism and into the Romantic era, during which “The poet was credited with the godlike power that Plato had feared in him.”

This isn’t to say that Biblical exegesis faded away. Instead, new methods were developed that focused on reframing the Bible within a broader context.

The rationalist Enlightenment led hermeneutists, especially Protestant exegetists, to view Scriptural texts as secular classical texts. They interpreted Scripture as responses to historical or social forces so that, for example, apparent contradictions and difficult passages in the New Testament might be clarified by comparing their possible meanings with contemporary Christian practices. (Wikipedia entry)

This takes us into the 20th century, where a renewed formalism led to methods that refocused on interpretation of the text itself, without reference to anything else. In France, Gustave Lanson, a literary critic, promoted a pedagogical method in French universities termed l’explication de texte, in which a text’s structure, style, and literary devices are objectively examined. In Russia, a Formalist method “attempted a scientific description of literature (especially poetry) as a special use of language with observable features.” Russian Formalism stood in sharp contrast to Plato’s argument that poetry was a mere imitation of reality; the stance of Formalism is that “words were not simply stand-ins for objects but objects themselves.” Meanwhile, in British and American academia, New Criticism became the dominant form of literary interpretation, in which the author’s intent and a reader’s responses were viewed as largely irrelevant distractions. Similar to Russian Formalism, New Criticism took the stance that “everything that is needed to understand a work is present within it.”

Scholars of New Criticism even coined terminology to make it explicit that an author’s background or intent or a reader’s personal and emotional responses were invalid methods of interpretation. They termed these “The Intentional Fallacy” and “The Affective Fallacy,” respectively.

This discounting of the author’s intent and biography, and of a reader’s responses, generated opposing schools of literary criticism in the latter half of the 20th century. New Historicism focuses on the social, cultural, and philosophical contexts within which an author wrote, to the point that the text and author seem to have been almost inevitably created by their context, rather than vice versa:

In its tendency to see society as consisting of texts relating to other texts, with no ‘fixed’ literary value above and beyond the way specific cultures read them in specific situations, New Historicism is a form of postmodernism applied to interpretive history. (Wikipedia entry)

Reader-response theory, on the other hand, “argues that a text has no meaning before a reader experiences—reads—it.” This approach rejects the grounds for objectivity in the interpretation of texts, suggesting instead that meaning arises out of personal reactions and the particular context that a reader is situated within.

Other forms of criticism that drew heavily upon wider contexts beyond the text itself also became more widespread in the late 20th century, such as sociological, psychoanalytic, Marxist, feminist, or post-Structuralist criticism. Increasingly, criticism was viewed as a subjective and highly specialized academic endeavor.

Into this fray stepped E.D. Hirsch, Jr. with a book titled Validity in Interpretation, in which he argued that an objective interpretation of a text was possible, in contrast to the positions of New Historicism or Reader-response theory. However, he also took issue with the stance of New Criticism that authorial intent was a distraction from the text itself. Instead, Hirsch argued that determining authorial intent was the basis for a valid, more objective interpretation. He drew a distinction between the meaning of a text and its significance. The meaning, an understanding as determined by authorial intent, is something that is stable and does not change, while a text’s significance changes in accordance with new explanations and connections to new contexts. In his own words:

Meaning is that which is represented by a text; it is what the author meant by his use of a particular sign sequence; it is what the signs represent. Significance, on the other hand, names a relationship between that meaning and a person, or a conception, or a situation, or indeed anything imaginable. … Significance always implies a relationship, and one constant, unchanging pole of that relationship is what the text means (Validity in Interpretation).

Meaning . . . may be conceived as a self-identified schema whose boundaries are determined by an originating speech event, while significance may be conceived as a relationship drawn between that self-identified meaning and something, anything, else (“Meaning and Significance Reinterpreted”).

This distinction stood out to me because it seemed analogous to one made by Christine Counsell about two main types of knowledge in school curriculum. She distinguishes between substantive knowledge and disciplinary knowledge. Substantive knowledge is knowledge that is relatively stable and can be taught as established fact, while disciplinary knowledge engages students in the use of tools and pathways of inquiry fundamental to the discipline, and which is always evolving.

I find both of these distinctions, between meaning and significance, and substantive and disciplinary knowledge, to be useful, as they allow us to see that there are some forms of understanding that are more static than others, and also that the interpretation of a text is always situated within a wider context, and that interpretations will shift in accordance with that context.

The crucial point, then, is that any text has an environing historical and cultural context and that the context of a text is itself not simply textual—not something that can be played out solely and wholly in the textual domain. This context of the texts that concern us constrains and limits the viable interpretations that these texts are able to bear. The process of deconstruction—of interpretatively dissolving any and every text into a plurality of supposedly merit-equivalent construction—can and should be offset by the process of reconstruction which calls for viewing texts within their larger contexts. After all, texts inevitably have a setting—historical, cultural, authorial—on which their actual meaning is critically dependent (Nicholas Rescher, as quoted by a Stanford Encyclopedia of Philosophy entry on hermeneutics)

Another important aspect of E.D. Hirsch’s analysis is that it represents a convergence between literary criticism and the discipline of hermeneutics, which had been developing along largely separate tracks. Hermeneutics sprung originally out of the study of scripture, then developed into philosophical explorations of epistemology, while literary criticism clung more closely to aesthetics and classical literature.

Before we move from the topic of hermeneutics and of the relationship of a text to wider context, I think it’s important to touch on the concept of the hermeneutic circle, which very much relates to the movement between literal and nonliteral in scriptural exegesis, as well as the interpretation of the meaning and significance of a piece of literature.

The hermeneutic circle refers to the recursive movement between part and whole, whether within a text itself, in connection between texts, in connection between a text and something else, or even more broadly, in the relationship between an individual and the world he or she inhabits. We will see this circle in action in our next section on close reading.

One other fascinating thing to note about E.D. Hirsch, Jr., which can help us transition into our next section on primary and secondary public education: he has become more widely known for his promotion of cultural literacy, the idea that literacy is founded on background knowledge relevant to a culture, and that therefore a shared body of core knowledge and vocabulary should be taught explicitly in each grade. He founded an organization, Core Knowledge, which developed K-8 curricular materials to address this need. This was a contentious idea when first introduced in the late 1980s, and continues to generate debate today.

The Common Core Enters Stage Left

Ever since the Common Core ELA standards exhorted students to “read closely” and “cite evidence,” close reading has been a thing in K-12 education.

CCSS.ELA-Literacy.CCRA.R.1

Read closely to determine what the text says explicitly and to make logical inferences from it; cite specific textual evidence when writing or speaking to support conclusions drawn from the text.

As you will see, debates about close reading closely echo the ancestry of the textual exegesis, hermeneutics, and literary criticism that preceded them.

Advocates and developers of the Common Core, such as David Coleman and Sue Pimentel, promoted a form of close reading in which analysis is confined to the text itself, similar to the approach of Formalism, l’explication de texte, and New Criticism. This approach can be viewed as an explicit reaction to a trend in K-12 classrooms of providing only easily accessible texts and questions, along with too much background context prior to reading, most especially to those students who already struggled with reading. This seemed to leave graduating students ill prepared for college-level tasks oriented around highly complex academic texts, as well as for the reading of technical texts required for advancement in many careers.

The solution proffered by the standards was to engage students in reading increasingly complex texts throughout the span of their education, and to ensure questions were “text-dependent,” rather than answerable without any evidence.

However, there was a backlash against this form of close reading.

For example, Nancy Boyles wrote in ASCD that asking students only text-dependent questions doesn’t explicitly prepare students for engaging independently in their own close reading practices. She recommends asking four generic questions that students can apply independently to any text.

“The final, most compelling reason I don’t care for the Student Achievement Partners [text-dependent] questions is that although they teach the reading—the content of the text—there’s no attempt to teach the reader strategies by which that reader can pursue meaning independently. . . Teaching is about transfer. The goal is for students to take what they learn from the study of one text and apply it to the next text they read.”

Educators Kylene Beers and Robert Probst further argued in Notice & Note: Strategies for Close Reading that it is essential to support students in making personal connections to texts because meaning and engagement are created via the interaction between a text and its reader, a view similar to Reader-Response Theory:

“Meaning is created not purely and simply from the words on the page, but from the transaction with those words that takes place in the reader’s mind. . . Close reading, then, should not imply that we ignore the reader’s experience and attend closely to the text and nothing else. It should imply that we bring the text and the reader close together. To ignore either element in the transaction, to deny the presence of the reader or neglect the contribution of the text, is to make reading impossible. If we understand close reading this way, when the reader is brought into the text we have the opportunity for relevance, engagement, and rigor.”

Another knock against confining a close reading solely to what is within a text is that a reader may miss the wider social or historical context that a text is situated within. As the Odegaard Writing and Research Center at the University of Washington puts it:

“Remember that every writer is in conversation: with other writers, with history, with the forces of her culture, with the events of his time. It is helpful, for example, to read Karl Marx or Sigmund Freud with some knowledge of their moment in history. Virginia Woolf and Simone de Beauvoir were responding to writers and events in their cultures, too. When you understand the context of a work, you can better see the forces that moved the author to write that work.”

Kate Roberts and Christopher Lehman, in Falling In Love With Close Reading, suggest that you can have your cake and eat it, too.

“Instead of seeing this as a debate between two opposing sides, we believe there is a way to achieve both goals–to teach students to read more analytically, while also valuing their lives and experiences. In fact, in this book we argue that by learning to read more closely, our lives and experience grow richer as well.”

This made me think of a related conversation regarding critical thinking skills. In “What REALLY Works: Optimizing Classroom Discussions to Promote Comprehension and Critical-Analytic Thinking,” P. Karen Murphy et al. lay out two broad approaches to text-based inquiry in the classroom: an expressive approach, which taps into a student’s personal experience and emotion, and an efferent approach, a more objective attempt to acquire information. The authors argue that it is not one or the other, but rather both working in tandem, that can best develop critical-analytic thinking:

We propose that the solution lies not in either an efferent or an expressive approach to text and other content, but in pedagogical approaches such as small-group, classroom discussions that value knowledge-seeking, in concert with lived-through experience, to promote critical-analytic thinking.

If we agree with P. Karen Murphy et al., then we end up with a model something like this:

All of that said, however, it must be acknowledged that state ELA assessments ask students to answer efferent text-dependent questions about a written passage in complete isolation from any wider context. A student’s personal opinion and experience, as well as the author’s biography, play no role in the analysis students are asked to conduct. So students will need some level of practice with this form of close reading, whatever one believes that textual interaction should ideally be, so long as the coin of the realm is test scores.

So what is close reading, then, exactly? Here are a few definitions to complicate your understanding:

  • Close reading is thoughtful, critical analysis of a text that focuses on significant details or patterns in order to develop a deep, precise understanding of the text’s form, craft, meanings, etc.

—“A Close Look at Close Reading” by Beth Burke

  • Close reading is a strategic process a reader uses in dealing with a complex text to acquire the information needed to complete a task. There is no single correct way to read something closely.

—“A Close Look at Close Reading” by LEAF (WestEd)

  • People read differently for different purposes. When you read in order to cram for a quiz, you might scan only the first line of every paragraph of a text. When you read for pleasure, you might permit yourself to linger for a long while over a particular phrase or image that you find appealing. It shouldn’t come as a surprise, then, that when you read in order to write a paper, you must adopt certain strategies if you expect your efforts to be fruitful and efficient.

—“Close Reading” by Odegaard Writing & Research Center

  • Close Reading of text involves an investigation of a short piece of text, with multiple readings done over multiple instructional lessons. Through text-based questions and discussion, students are guided to deeply analyze and appreciate various aspects of the text, such as key vocabulary and how its meaning is shaped by context; attention to form, tone, imagery and/or rhetorical devices; the significance of word choice and syntax; and the discovery of different levels of meaning as passages are read multiple times.

—“Implementing the Common Core State Standards: A Primer on Close Reading of Text” by Sheila Brown and Lee Kappes

  • Love brings us in close, leads us to study the details of a thing, and asks us to return again and again. … we argue that teaching readers to look at texts closely–by showing them how one word, one scene, or one idea matters–is an opportunity to extend a love affair with reading. It is also a chance to carry close reading habits beyond the page, to remind students that their lives are rich with significance, ready to be examined, reflected upon, and appreciated.

—Falling In Love With Close Reading by Christopher Lehman and Kate Roberts

  • Close reading should suggest close attention to the text; close attention to the relevant experience, thought, and memory of the reader; close attention to the responses and interpretations of other readers; and close attention to the interactions among those elements.

—Notice and Note: Strategies for Close Reading by Kylene Beers and Robert Probst

  • Reading nonfiction, in many ways, requires an effort not required in the reading of fiction. We must question the text, question the author, question our own understanding of the topic, and accept the possibility that our views will change as a result of the reading we’re doing. All those demands mean that the reader has great responsibility when reading nonfiction.

—Reading Nonfiction by Kylene Beers and Robert Probst

  • Close reading is important because it is the building block for larger analysis. Your thoughts evolve not from someone else’s truth about the reading, but from your own observations. The more closely you can observe, the more original and exact your ideas will be.

—“Close Reading of a Literary Passage” by Dr. Kip Wheeler

There is a multiplicity of sources providing a methodology and form for close reading. Here’s just a snapshot of the different sources I’ve reviewed in my own research on the topic in preparation for a professional development series:

There are some essential components to a close reading process that become evident from these different approaches:

  • It is performed with a short text or short snippet of a longer text
  • There is a specific focus and purpose to reading that particular text
  • There are multiple reads, through which the meaning discovered in a particular portion is extended across or beyond the text
  • A system of annotation is applied
  • Textual understanding and interpretation typically moves from literal to inferential (though in the case of French l’explication de texte, analysis is maintained at the literal, more objective level of summary) based on patterns identified in initial observations
  • The end product is a written response or discussion

OK. So there it is. I’m pretty sure there’s a lot more to say on any and all of these things, but writing this has already taken way too much of my very limited time these days. I mean seriously, I’ve spent over a month writing this.

What have I learned? I don’t know if I can concisely articulate it, but it seems to me that textual interpretation, in any form you can name, whether scriptural exegesis or close reading, is most fruitful when it is viewed in a more flexible, rather than rigid, manner. That is, whatever stance and method one adopts, one recognizes there will be limitations based on that stance and method. Furthermore, it seems to me that methods which are able to accommodate and balance both literal and nonliteral meanings, and bear significance that is both objective and subjective, while acknowledging both the text itself and its relationship to a broader context, will be the most compelling.

But maybe that’s just me. What do you think?

Hysteresis and the Legacy of Industrialization


I recently shared a fascinating study on the impact of the historical legacy of a place, which found that students living in neighborhoods with a legacy of economic and residential segregation had greater odds of dropping out of high school compared to their peers in other neighborhoods.

The existing social capital of a neighborhood, in other words, is associated with the historical legacy of that particular place.

This makes a lot of sense to those of us who work in communities with legacies of poverty and trauma. And it also relates to a concept that Will shared here back in 2012: hysteresis. As explained on Wikipedia, hysteresis refers to “the dependence of the state of a system on its history.” This concept is applicable to a wide range of systems—in our case here, we are considering socio-ecological systems.

Another recent study presents further support for the impact of the legacy of a place on people. Researchers used online surveys of the “big five” personality traits (openness, conscientiousness, extraversion, agreeableness, and neuroticism) and examined them in connection with a region’s historical legacy of industrialization during the 19th and 20th centuries.

Their results suggest “that the massive industrialization of the 19th and 20th centuries had long-term psychosocial effects that continue to shape the well-being, health, and behaviors of millions of people in these regions today.”

“. . . .Our research shows that a region’s historical industries leave a lasting imprint on the local psychology, which remains even when those industries are no longer dominant or have almost completely disappeared.”

The author concludes that “Without a strong orchestrated effort to improve economic circumstances and people’s well-being and health in these regions, this legacy is likely to persist.”

Granted, this study is based on data gathered from online surveys. The “big five” survey, however, has a fairly robust research base behind it and predicts academic achievement and parenting behavior (you can also take the survey yourself; I found my own results enlightening). Of course, further research into the impacts of the historical legacy of a place should continue to be pursued.

In the meantime, those of us who work with children raised in communities that bear the legacies of injury need to be mindful not only of the individual needs of the children before us, but also of the history of the place within which they live.


Research: The Industrial Revolution Left Psychological Scars That Can Still Be Seen Today, Martin Obschonka / Harvard Business Review