[Ohiogift] knowledge is a source of bias

Gifted and Talented in Ohio Discussion List ohiogift at lists.osu.edu
Fri Oct 3 16:17:57 EDT 2014


Let's close the bias gap along with the achievement gap.
As though every bias is a bad thing.
Maybe we can somehow achieve entropy.

Mark

On Fri, Oct 3, 2014 at 2:26 PM, Gifted and Talented in Ohio Discussion List
<ohiogift at lists.osu.edu> wrote:

> Isn't this what E.D. Hirsch said years (decades) ago? That there is a
> fundamental core of knowledge that underlies being able to read and
> comprehend???
> Susan
>
> -----Original Message-----
> From: Gifted and Talented in Ohio Discussion List
> Sent: Oct 2, 2014 11:39 AM
> To: ohiogift at lists.service.ohio-state.edu
> Subject: [Ohiogift] knowledge is a source of bias
>
>
> ...knowledge is a source of bias…(Who knew??)
>
> *The Core Knowledge Blog <http://blog.coreknowledge.org/>*
>
> *Reading Test Developers Call Knowledge a Source of Bias*
> <http://blog.coreknowledge.org/2014/10/01/reading-test-developers-call-knowledge-a-source-of-bias/>
> Lisa Hansel
> *October 1st, 2014*
>
>
> You might expect to see a headline like this in the *Onion*, but you
> won’t. The *Onion* can’t run it because it isn’t just ironic—it’s 100%
> true.
>
> *A few years ago, a researcher at one of the big testing companies told me
> that when developing a reading comprehension test, knowledge is a source of
> bias.* He did not mean the obvious stuff like knowledge of a yacht’s
> anemometer. *He meant typical K–12 subject matter.*
>
> *Since reading comprehension depends chiefly on knowledge of the topic*
> (including the vocabulary) in the passage, the student with that knowledge
> has a large advantage over the student without it. And since there have
> always been great educational inequities in the United States, *students’
> knowledge—acquired both at home and at school—is very strongly correlated
> with socioeconomic status.*
>
> A logical solution would be to test reading comprehension using only those
> topics that students have been taught. Teachers can do this, but testing
> companies can’t—how would they have any idea what topics have been taught
> in each grade? It’s rare for districts, much less states, to indicate what
> or when specific books, people, ideas, and events should be taught.
>
> Without a curriculum on which to base their assessments, testing companies
> have devised their own logic—which is sound given the bind they’re in. They
> distinguish between common and specialized knowledge, and *then they
> select or write test passages that only have common knowledge*. In
> essence, they’ve defined “reading comprehension skill” as including broad
> common knowledge. This is perfectly reasonable. When educators, parents,
> etc. think about reading comprehension ability, they do not think of the
> ability to read about trains or dolphins or lightning. They expect the
> ability to read about pretty much anything one encounters in daily life
> (including the news).
>
> I already had this basic understanding, but still I found the “*ETS
> Guidelines for Fairness Review of Assessments*
> <http://www.parcconline.org/sites/parcc/files/FairnessReviewGuidelines.pdf>”
> eye-opening. Guideline 1 is to *“avoid cognitive sources of
> construct-irrelevant variance…. If construct-irrelevant knowledge or skill
> is required to answer an item and the knowledge or skill is not equally
> distributed across groups, then the fairness of the item is diminished” (p.
> 8).* It continues, growing murkier:
>
> “*Avoid unnecessarily difficult language.* Use the most accessible level
> of language that is consistent with valid measurement…. Difficult words and
> language structures may be used if they are important for validity. For
> example, difficult words may be appropriate if the purpose of the test is
> to measure depth of general vocabulary or specialized terminology within a
> subject-matter area. It may be appropriate to use a difficult word if the
> word is defined in the test or its meaning is made clear by context.
> Complicated language structures may be appropriate if the purpose of the
> test is to measure the ability to read challenging material.
>
> “*Avoid unnecessarily specialized vocabulary* unless such vocabulary is
> important to the construct being assessed. What is considered unnecessarily
> specialized requires judgment. Take into account the maturity and
> educational level of the test takers in deciding which words are too
> specialized.”
>
> On page 10, it offers a handy table that “provides examples of common
> words that are generally acceptable and examples of specialized words that
> should be avoided…. The words are within several content areas known to be
> likely sources of *construct-irrelevant knowledge.*”
>
> _______________________________________________
> Ohiogift mailing list
> Ohiogift at lists.osu.edu
> https://lists.osu.edu/mailman/listinfo/ohiogift
>
>