
NAGB Conference on Increasing the Participation of SD and LEP Students in NAEP

Commissioned Paper Synopsis

 

The attached paper is one of a set of research-oriented papers commissioned by NAGB to serve as background information for the conference attendees.  The authors bear sole responsibility for the factual accuracy of the information and for any opinions or conclusions expressed in the paper.

An Analysis of State Assessment Policies Addressing the Accommodation of English Language Learners

 

Charlene Rivera, Ed.D.

Eric Collum, Ph.D.

The George Washington University

Center for Equity and Excellence in Education

 

January 2004

 

  • This paper reviews 15 research studies that (1) examined effects of particular accommodations or groups of accommodations on performance, and (2) employed experimental and quasi-experimental research designs that allowed examination of the effect of the accommodation(s) on ELLs and non-ELLs. Studies looked at one or more of the following types of accommodations: (1) linguistic simplification, (2) customized English dictionaries and glossaries, (3) use of the native language, (4) reading items and/or directions aloud, and (5) providing extra time in combination with other accommodations.

 

  • Although research on accommodations for ELLs is inconclusive, two kinds of accommodations appear to hold promise: native language versions of assessments and linguistic simplification of English versions. In addition, combining specific direct linguistic support accommodations (e.g., bilingual glossaries) with specific indirect linguistic support accommodations (e.g., extra time) also appears to support ELLs’ performance on assessments.

 

  • This paper also discusses how second language acquisition research informs the use of accommodations. The tendency of ELLs to process the language of a test by focusing on linguistic structures, lexical items, and phonological features leaves fewer cognitive resources available for accessing the content of the test.  Furthermore, in addition to processing the language of the test, ELLs also must negotiate the sociocultural practices and expectations embedded in assessment.  These disadvantages can be mitigated directly through the (1) simplification, (2) repetition, or (3) clarification of the test items or directions, or indirectly, by modifying the conditions under which a test is taken.

 

  • This paper reviews states’ SY 2000–2001 policies related to testing accommodations for ELLs. Policies were often organized explicitly around the needs of two student groups: ELLs and SDs.  Some states’ policies treated these as entirely separate groups, whereas others addressed them as one group.  In most cases, guidance was organized around these student groups (ELLs and SDs) rather than around the individual assessments for which accommodations were offered.  Overall, the most noticeable trend with regard to states’ treatment of content areas is that accommodations providing direct linguistic support to ELLs were more likely to be prohibited for English language arts than for other content areas.

 

  • A majority of states arranged accommodations within a taxonomy developed for students with disabilities (timing/scheduling, setting, presentation, and response). This taxonomy was used in states’ policies to organize many of the 75 accommodations listed in states’ assessment policies as available to ELLs.  Of these 75 accommodations, 31 are clearly relevant only to SDs; just 44 are relevant to ELLs.

 

  • In response to the lack of focus on ELLs evident in state policy, the research team developed an ELL-responsive taxonomy linking the use of accommodations more closely to the needs of ELLs and incorporating research on second language acquisition. Accommodations were divided into those providing direct linguistic support and those providing indirect linguistic support.  Direct linguistic support includes native language accommodations (translation of some or all of the language of the test into the student’s native language) and English language accommodations directed at (1) simplification of some or all aspects of the test language, (2) repetition of the test language, or (3) clarification of parts of the test language.  Indirect linguistic support includes (1) adjustments to test schedules or the time allowed to take an assessment, or (2) adjustments of the test environment.

 

  • States’ policies for determining which ELL students are eligible for accommodations are of four types: (1) language-related: level of English language proficiency or placement in a language-related program of instruction; (2) time-related: length of time a student has been in an academic environment in which English was the primary language of instruction; (3) academic-related: student’s prior schooling and academic achievement as measured by test performance; and (4) opinion-related: judgment of school personnel and/or the student’s family (including the student). Academic-related criteria can be helpful in taking into account important parts of a student’s background, including the student’s language of instruction.
  • The most common approach mentioned in states’ policies for designating decision makers is to include those at the local level who are most familiar with the ELL’s academic work. However, decision makers who can provide insight on how to maintain the validity of the test should also be part of the team.
  • Recommendations for selecting appropriate accommodations for NAEP include the following:
      • Use an ELL-responsive framework as a tool for selecting appropriate accommodations for ELLs.
      • Use accommodations that are responsive to ELLs to provide (a) direct linguistic support or (b) indirect linguistic support.
      • Use student background variables to inform the selection of appropriate accommodations based on (a) a consistent operational definition of English language learner, (b) the student’s level of English language proficiency, and (c) the language of instruction.
      • Use accommodations supported by research.

 

In addition, the paper recommends that a panel of experts be charged with identifying at least two accommodations to be field tested for use on NAEP.

 

An Analysis of State Assessment Policies

Addressing the Accommodation of English Language Learners

 

Issue paper prepared for the

National Assessment Governing Board

Charlene Rivera, Ed.D.

Eric Collum, Ph.D.

 

January 2004

 

The George Washington University

Center for Equity and Excellence in Education

1730 N. Lynn Street, Suite 401

Arlington, VA 22209-2004

(703) 528-3588/1 (800) 925-3223

http://ceee.gwu.edu

 

The data presented in this paper are excerpted from a project being conducted for the US Department of Education,

Office of English Language Acquisition.

 

Copyright © 2004 The George Washington University Center for Equity and Excellence in Education. All rights reserved.

 

 

An Analysis of State Assessment Policies Addressing the Accommodation of English Language Learners

For nearly a decade, federal legislation has prompted states to establish standards-based systems that challenge every student, including English language learners (ELLs),[1] to achieve the same high content and performance standards. The 1994 reauthorization of the Elementary and Secondary Education Act (ESEA), the Improving America’s Schools Act (IASA), required that states adopt standards-based systems that enable all K–12 students, including ELLs, to strive toward the same high standards. As part of the IASA legislation, states were required to establish accountability systems to track the achievement of all students, including ELLs. The 2002 reauthorization of ESEA, the No Child Left Behind Act of 2001 (NCLB), continues to emphasize accountability for every child and has instituted more stringent measures to ensure that ELLs are included fully in state assessment systems. Under NCLB, states are required to test ELLs and to present aggregated and disaggregated results for them on state assessments.

One of the primary tools that may be employed to strengthen states’ response to the federal mandate to include all students in accountability systems centers on the use of accommodations in state assessments. Accommodations are changes to a test or testing situation that facilitate students’ access to test content. Accommodations are used widely for the assessment of students who, because of limited proficiency in English or physical or cognitive disabilities, are deemed unable to participate meaningfully in state assessments under standard testing conditions. To be effective, accommodations must address the unique needs of the students for whom they are provided. In the case of ELLs, this means providing test takers with assistance in overcoming the linguistic and socio-cultural barriers that prevent them from accessing the content of the test. Without this support, ELLs will, to a great extent, be tested on their knowledge of the language of the test rather than on its content.

Although states’ policies have addressed the use of accommodations as a tool for the greater inclusion of ELLs in state assessment, typically, these policies have not done so systematically or in a way responsive to ELLs. Because accommodations represent a powerful tool for including ELLs in state assessment systems, it is imperative that states’ approaches to the use of accommodations be fully understood.

This paper summarizes and synthesizes findings from An Analysis of State Assessment Policies Addressing the Accommodation of English Language Learners during SY 2000–2001. The study examines states’ policies regarding the use of accommodations on state assessments for school year (SY) 2000–2001, the year in which states’ assessment systems were to have been implemented fully under IASA. Data from the policies for SY 2000–2001 provide a baseline for examining the extent to which states’ policies were positioned to address the more stringent requirements of NCLB.

This paper is divided into three parts. Part I presents key issues that inform the study of states’ policies regarding accommodations. This section also provides a review of the literature on accommodations. Part II synthesizes findings from the SY 2000–2001 study, presenting data and analysis regarding states’ policies in the following areas: (1) organization of states’ policies, (2) accommodations made available to ELLs, and (3) determination of students eligible for accommodations and the selection of accommodations for those students. Finally, Part III offers suggestions regarding how assessment policies can provide more effective guidance for including ELLs in state assessment.

Part I: Overview

An Analysis of State Assessment Policies Addressing the Accommodation of English Language Learners during SY 2000–2001 builds on a previous nationwide study of state assessment policies for SY 1998–1999 conducted by The George Washington University Center for Equity and Excellence in Education (Rivera, Stansfield, Scialdone, & Sharkey, 2000). The previous study was the first to examine states’ assessment policies using primary sources rather than relying on self-report (e.g., Rivera & Vincent, 1997; CCSSO, 1997).

Like the previous study, the current study examined policy documents to provide a comprehensive view of states’ assessment policies regarding accommodations. The following research questions guided the collection and analysis of data:

  • To what extent did states’ assessment policies for SY 2000–2001 address the use of accommodations specifically for ELLs?
  • Which accommodations did states’ assessment policies indicate were available to ELLs?
  • What frameworks did states’ policies use to organize accommodations for ELLs?
  • To what extent did states’ assessment policies address the content for which an accommodation could be made available?
  • To what extent did states’ assessment policies address the process by which appropriate accommodations were to be selected for eligible ELLs?

The research team reviewed state policy documents from the 50 states and the District of Columbia.[2] Documents received by the research team included such materials as handbooks for particular assessments, excerpts from legislation, memoranda, letters, and printouts from web sites. The documents submitted by state education agencies (SEAs) varied greatly in size and scope. Some SEAs sent large, carefully produced handbooks, whereas others merely sent pages copied from relevant sources such as legislation or web sites. With regard to the content of these documents, some SEAs provided considerable guidance in selecting accommodations for eligible students, whereas others left much of the decision-making to district staff. Finally, the documents provided by some SEAs were clearly targeted to address ELLs, whereas other states’ policy documents addressed “special needs” students, a category that generally referred to both ELLs and students with disabilities (SD). The policy documents collected from each state were considered representative of that state’s policy.

Key Issues Informing the Accommodation of ELLs

The interest in developing tools, such as accommodations, for including ELLs in state assessment has been stimulated by the passage of federal legislation in 1994 and 2002. Under both IASA and NCLB, states are required to develop assessments accessible to all students including ELLs. The 1994 law indicated that assessments were to be made available with “reasonable adaptations and accommodations for students with diverse learning needs,” in other words, ELLs and students with disabilities. The 1994 law also required assessment results to be disaggregated by gender, major racial and ethnic groups, and English proficiency status. Inherent in this requirement was the need for states to begin to utilize methods of test administration and accessibility options that facilitated the inclusion of ELLs, while also maintaining the reliability, validity, and comparability of scores (U.S. Congress, 1994, Section 1111[b][3]).

The 2002 law (NCLB) extends the assessment requirements of the 1994 law and expands accountability requirements for all students. Under NCLB, states are required to include “limited English proficient students, who shall be assessed in a valid and reliable manner and provided accommodations . . . including, to the extent practicable, assessments in the language and form most likely to yield accurate data on what students know and can do in academic content areas” (U.S. Congress, 2002, Section 1111[b][3][C][ix][III]). The results of these assessments are to be used for “determining the yearly performance of the State and of each local educational agency and school in the State” (1111[b][3][A]).

In response to the federal mandate that states must include ELLs for purposes of accountability, states increasingly have begun to examine and modify assessment policies and practices for ELLs to facilitate their greater participation in state assessment. To date, accommodation is the primary tool used by states to comply with the federally mandated inclusion of ELLs in state assessment.

Accommodations can help ELLs gain access to the content of a test by enabling students to overcome linguistic and socio-cultural barriers. Second language acquisition research has shown that in the early stages of second-language acquisition, language learners require more cognitive resources to process that language than do their more language-proficient peers. This is because when learning a second language, learners attend closely to the forms of the second language: That is, in order to extract meaning from an utterance or text, their attention is directed toward linguistic structures, lexical items, and phonological features. In essence, second language learners tend to process language unit by unit, piece by piece, focusing closely on each discrete element of language. By contrast, native or proficient second language speakers have largely automatized language processing, giving only peripheral attention to language forms (McLaughlin, Rossman, & McLeod, 1983; McLaughlin, 1990).

The implications for the assessment of ELLs are clear. When faced with a standardized test, fully English-proficient students, who have automatized language processing, need fewer cognitive resources for language processing and therefore have more resources to attend to the meaning conveyed in the test. By contrast, ELLs who have not fully automatized language processing must direct more cognitive resources to processing the language of the test and therefore have fewer resources available to attend to content being tested. Accommodations are intended to minimize the cognitive resources ELLs need to process the language of the test and maximize the cognitive resources available for accessing the content of the test.

Accommodation offers promise as a tool for appropriately including ELLs in state assessments. However, the use of this tool is not unproblematic. A key concern is to provide support to ELLs in processing the language of the test without providing help on the test’s content. In other words, a state assessment administered to ELLs with accommodations must maintain its original purpose, assess the original construct, and yield scores that are comparable to those of other students taking the test without accommodation.[3] The concern over validity, and hence over score comparability, is expressed in the mandates of the 1994 and 2002 ESEA, which specify that only accommodations that preserve the validity of the test should be used: Accommodations must yield “reliable information” (1994 ESEA) or “reliable data” (2002 ESEA). Essentially, if states provide accommodations to ELLs that invalidate test scores, it will be impossible to measure the achievement of ELLs relative to their English-proficient peers.

Validity is acknowledged as a central concern in the assessment of ELLs in the Standards for Educational and Psychological Testing (AERA, APA, & NCME, 1999). The Standards acknowledge that a test taker’s knowledge of English and degree of English language development have the potential to affect performance on any test because “any test that employs language is, in part, a measure of . . . language skills.” The measurement of language proficiency by a test purporting to measure, for instance, mathematics or reading comprehension can introduce construct-irrelevant variance: “In such instances, test results may not reflect accurately the qualities and competencies intended to be measured” (AERA et al., 1999, p. 91). Further variance may be introduced by the fact that “language differences are almost always associated with concomitant cultural differences that need to be taken into account when tests are used” (p. 91).
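One rough way to express this concern (a conceptual sketch in classical test theory terms, not a formula from the paper or the Standards) is to decompose observed score variance as

$$
\sigma^{2}_{\text{observed}} = \sigma^{2}_{\text{construct}} + \sigma^{2}_{\text{construct-irrelevant}} + \sigma^{2}_{\text{error}}
$$

For an ELL taking a content-area test in English, unmeasured English proficiency (and unfamiliar cultural references) inflates the construct-irrelevant component, so two students with the same content knowledge may receive systematically different observed scores.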

One option for mitigating construct-irrelevant variance is to ensure ELLs have access to the content of the test by providing appropriate test accommodations in English or in the native language. Appropriate test accommodations for a test given in English should permit ELLs who have been instructed in the content tested to demonstrate academic knowledge despite limited English proficiency and restricted cultural knowledge of the United States. Likewise, an accommodation, or entire test, provided in the native language has the same intent: to remove construct-irrelevant variance in order to allow ELLs to demonstrate content-specific knowledge. To ensure an accommodation appropriately addresses ELLs’ needs, it is necessary to consider individual student characteristics such as level of English language proficiency, age, length of time in the U.S., and educational background both in the U.S. and in the native country (Butler & Stevens, 1997).

In theory, no accommodation allowed for ELLs should alter the validity or score comparability of a test. To ensure test validity and score comparability, accommodations must not give a demonstrable advantage to students who receive them over students who do not (Elliott, Kratochwill, & Schulte, 1998). In short, the challenge of accommodating ELLs is to ensure that accommodations ameliorate linguistic and cultural barriers associated with learning a new language without altering essential features or destroying test validity and other technical qualities of the test. Only by preserving the validity of tests and the comparability of test scores can accommodations constitute a truly meaningful response to the federal mandate to include ELLs in state accountability systems.

One of the difficulties faced by educators is selecting appropriate accommodations, i.e., those that preserve test validity and the comparability of test scores. Focusing on data from the National Assessment of Educational Progress (NAEP) and research conducted on accommodations and second language acquisition, the following section reviews current research on accommodations for ELLs.

Research on Accommodations

Within the extremely limited pool of available research on accommodations, few studies focus on accommodations intended to address the linguistic needs of ELLs or on how accommodations, separately or in combination, affect ELLs’ performance. As Sireci, Li, & Scarpati (2002) observed in their recent research synthesis of accommodation studies: “relative to research on SWD [students with disabilities], little research has been conducted on the effects of test accommodations on the test performance of ELLs” (p. 49). Of the 150 articles reviewed by Sireci et al. in 2002, only 38 were studies that examined the effects of test accommodations on the performance of students with disabilities or ELLs; of these, 13, or just under nine percent of all studies examined, focused on ELLs (Sireci, 2003).

Two research perspectives have dominated the knowledge base on accommodations for ELLs. The first perspective is inspired by the challenge to include a more representative sample of students in the National Assessment of Educational Progress (NAEP).[4] The second perspective is motivated by the empirical need to understand the effects of particular accommodations or groups of accommodations on ELLs.

NAEP is intended to account for the progress of all students. Under NCLB, the results of states’ standards-based assessments are to be compared to NAEP results. As the national assessment, NAEP offers useful benchmarks through its policies and practices.

Prior to 1995, NAEP policy allowed for the exclusion of ELLs judged incapable of participating meaningfully in the assessment. However, in the mid 1990s, NAEP’s inclusion policies underwent significant modification to broaden participation among special needs students, which NAEP defined as ELLs and students with disabilities (NCES, 2003). In an effort to make NAEP more inclusive, NAEP researchers began to experiment with test accommodations for ELLs and students with disabilities.

In 1995, six accommodations for ELLs and students with disabilities (SDs) were field tested in NAEP science and mathematics assessments. The 1995 field test was designed to evaluate not the effect of individual accommodations on student performance, but rather how accommodated field test scores would affect data comparability over time (Lutkus & Mazzeo, 2003, p. vii). Although analysis verified that inclusion of accommodated data would have an impact on NAEP trend data, to meet the goal of increasing ELLs’ participation in NAEP, all accommodations used in the field test, with the exception of the Spanish-only assessment, were permitted in the 1996 operational NAEP science and mathematics assessments (Olson & Goldstein, 1997). Accommodations have been permitted on NAEP since 1996. Currently, NAEP offers a total of 21 accommodations for ELLs and SDs (NCES, 2004).

Primarily, the body of research produced by NAEP provides insight into the impact that the use of accommodations has on the inclusion rates of ELLs in NAEP assessments. However, NAEP research offers no guidance on which specific accommodations are most appropriate for ELLs and which have the potential to raise ELLs’ scores without invalidating the construct of the test. While NAEP likely will continue to address issues of inclusion, future research should center on examining the effect of specific accommodations on ELLs’ performance.

The second body of research, produced by multiple researchers, examines the effects of particular accommodations or groups of accommodations on ELLs.[5] This research involves examining the effect of specific accommodations on ELLs and non-ELLs.

For purposes of this review, two criteria were established to guide the search for research studies. Relevant studies included those that (1) examined effects of specific accommodations or groups of accommodations on performance, and (2) employed experimental and quasi-experimental research designs that allowed examination of the effect of the accommodation(s) on ELLs and non-ELLs.[6] Studies using ex post facto designs, such as the study carried out by Shepard, Taylor, and Betebenner (1998), are not reviewed here because the design did not permit a direct examination of the effect of accommodations on test scores. Studies reviewed were conducted between 1990 and 2003. Thirty documents referencing accommodations were screened; from these, 15 studies were identified that examined accommodations addressing ELLs and met the criteria outlined above.

Each of the 15 studies examined one or more of the following types of accommodations: (1) linguistic simplification, (2) customized English dictionaries and glossaries (e.g., English-to-Spanish glossary, Spanish/English glossaries, simplified English glossaries, computer test with pop-up glossary), (3) use of the native language (e.g., dual language tests), (4) reading items and/or directions aloud, and (5) providing extra time in combination with other accommodations. Eight of the 15 studies reviewed focused on individual accommodations; seven examined more than one accommodation.

Student sample sizes in these studies varied: ELL sample sizes ranged from 105 to 864; non-ELL sample sizes ranged from 69 to 11,306. The student samples of 14 of the studies included ELLs and non-ELLs, whereas the student sample used by one study included only ELLs. In the studies examined, score comparability as a function of the presence/absence of a student characteristic (e.g., English language proficiency status), the use of an accommodation, and the interaction of these two factors could be examined.[7] In some studies, student samples were divided further to take into account factors differentiating ELLs, such as amount of time a student received instruction in English, reading ability, and level of English language proficiency.
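The factorial logic of these designs can be made concrete with a small simulation. The sketch below (a minimal illustration, not an analysis from any study reviewed here; all sample sizes and effect sizes are invented) crosses ELL status with accommodation use and tests the interaction term, which is what indicates whether an accommodation differentially benefits ELLs while leaving non-ELL scores, and hence score comparability, intact. It assumes the Python pandas and statsmodels libraries.

```python
# Minimal sketch of the 2 x 2 factorial design described above:
# ELL status x accommodation, with the interaction term indicating
# whether the accommodation differentially benefits ELLs.
# All cell means and sample sizes are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for ell in (0, 1):
    for accommodated in (0, 1):
        # Hypothetical means: ELLs score lower overall; the accommodation
        # raises ELL scores (+5 interaction) but barely moves non-ELL scores.
        cell_mean = 50 - 8 * ell + 1 * accommodated + 5 * ell * accommodated
        for score in rng.normal(loc=cell_mean, scale=10, size=200):
            rows.append({"ell": ell, "accommodated": accommodated, "score": score})

df = pd.DataFrame(rows)
model = smf.ols("score ~ C(ell) * C(accommodated)", data=df).fit()
# Two-way ANOVA table; the C(ell):C(accommodated) row tests the interaction.
print(sm.stats.anova_lm(model, typ=2))
```

Under this logic, the pattern that supports an accommodation is a significant ELL-by-accommodation interaction with little or no effect of the accommodation for non-ELLs: the gap narrows without the accommodation advantaging students who do not need it.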

Findings from the studies must be viewed cautiously for several reasons. First, 15 studies constitute a very small pool of research from which to draw generalizations regarding the effectiveness of specific accommodations. Second, the limited pool of studies identified reports on only a small subset of accommodations found in state policies. Third, the specific accommodations examined do not necessarily represent those that most directly address the linguistic needs of ELLs. Fourth, criteria for the classification of students as “ELLs” or “LEP students” are not always clearly defined.[8] For example, the California ESL (English as a second language) designation was used in one study to classify the group of LEP students provided with accommodations. ESL categories, in this case, included students designated initially fluent in English (IFE) and fully English proficient (FEP), as well as limited English proficient (LEP) students. In other studies, ELL or LEP student samples were designated using such methods as student self-reports and teacher reports from background questionnaires. In sum, a great deal more research is needed to identify and understand which accommodations respond to the linguistic needs of ELLs and the degree to which test validity and score comparability are affected by the use of such accommodations.

Research relevant to each of the accommodations addressed by the studies under review is summarized below.

Studies Examining the Effect of Accommodations on ELLs

The information provided for each study discussed highlights (1) the content tested, (2) the accommodation examined, (3) key aspects of the sample (ELLs and non-ELLs, total sample tested, grade level), and (4) study findings. Tables accompany the text as organizers for the reader. To better understand the effects of individual accommodations, the 15 studies are presented according to the type(s) of accommodation(s) each examined. In cases where a single study examined more than one accommodation, that study is listed under each type of accommodation considered. Table A-1 in Appendix A provides an overview of the accommodations studies. Following Table A-1, individual accommodations studies are discussed in the context of the accommodation(s) addressed in each study.

Linguistic Simplification

As a test accommodation, linguistic simplification refers to the process of editing test items and/or directions using clear and concise language to convey the same meaning. Simplified text uses vocabulary that avoids ambiguity, colloquialisms, or synonyms and uses uncomplicated linguistic structure(s). The goal of linguistic simplification is to ensure understanding of the test item/directions without compromising the construct being tested. Because the language demands of a test have the greatest potential of introducing construct-irrelevant variance, it follows that simplifying the language of a test may help English language learners access its content and thereby increase score validity and comparability.

As shown in Table 1, of the 15 studies under review, eight examined linguistic simplification. CRESST researchers, using fourth- and eighth-grade released NAEP math and science items, carried out six of the eight studies. Using California designations for student subgroups, the samples were classified into LEP and non-LEP categories. A seventh study used NAEP mathematics items to simulate the Colorado State Assessment Program (CSAP); for this study, students confirmed their ELL designation by responding to a background questionnaire. The eighth study, using the state’s designation to categorize LEP students, examined the effects of linguistic simplification on fourth- and sixth-grade science items in the Delaware Student Testing Program.

 

Table 1. Studies examining the effectiveness of linguistic simplification for ELLs

Study | Content | ELLs | non-ELLs | Total | Grade(s)
Abedi, Lord, & Plummer (1997) | Math | 320 | 711 | 1,031 | 8
Abedi, Lord, & Hofstetter (1998)a | Math | 864 | 530 | 1,394 | 8
Abedi & Lord (2001) | Math | 372 | 802 | 1,174 | 8
Abedi, Hofstetter, Baker, & Lord (2001)a | Math | 501 | 445 | 946 | 8
Abedi (2003) | Science | 317 | 294 | 611 | 4, 8
Hofstetter (2003) | Math | 676 | 173 | 849 | 8
Kiplinger, Haug, & Abedi (2000)a | Math | 152 | 1,046 | 1,198 | 4
Rivera & Stansfield (in press) | Science | 109 | 11,306 | 11,415 | 4, 6

Note. In describing student samples, some researchers used the terms “LEPs” and “non-LEPs” rather than “ELLs” and “non-ELLs.”

a Because ELL and non-ELL sample Ns were not reported, values were calculated from percents of the total sample.

Overall, the eight studies examining linguistic simplification were the least inconclusive of those reviewed. A number of studies found that the use of linguistic simplification had positive results for ELLs. Abedi (2003) indicated that linguistic simplification was among those accommodation strategies studied that were “effective in increasing the performance of ELLs students and reducing the performance gap between ELLs and non-ELL students” (p. xiii). Rivera and Stansfield (in press) found evidence that linguistic simplification did not pose a threat to score comparability for monolingual English students. Unfortunately, the ELL sample (n = 109) was too small to permit a comparison of ELLs’ performance on the simplified and non-simplified versions of the items. The results of other studies were more equivocal. For instance, Abedi and Lord (2001) found that the linguistically modified versions of test items were only slightly easier for students to comprehend than the original items, and the difference in difficulty was not statistically significant. Furthermore, the accommodation was found to be no more beneficial to ELLs than to non-ELLs. Similarly, Abedi, Lord, and Plummer (1997) found that, irrespective of LEP status, students in low- and average-level math classes performed best on linguistically modified versions of the test items. However, it should be noted that Abedi and his colleagues were unable to determine the LEP status of 70% of the students.

It is possible to make three major observations based on these studies. First, and perhaps most important, the use of linguistic simplification as a type of accommodation for mathematics and science appears promising. Second, ELLs’ level of language proficiency must first be considered to gauge whether the use of linguistic simplification is merited. That is, for students at lower levels of English language proficiency, linguistic simplification appears useful; conversely, for students at higher levels of English language proficiency, the effects of linguistic simplification need to be examined more closely (Abedi et al., 1997; Abedi & Lord, 2001). Third, researchers should explain the process used to simplify test items, including the safeguards employed to ensure that the linguistic simplification in no way compromised the content of individual items.

Dictionaries or Glossaries

Bilingual dictionaries and native language glossaries are provided to ELLs to help them understand the meaning of words that may be less familiar due to their English language proficiency status. Glossaries and bilingual dictionaries are designed to help students gain access to the language of test items (e.g., mathematics or science) and not to provide explanations or clues regarding the construct being tested.

Although the function of dictionaries and glossaries is similar, there is an important difference between these two accommodations. Broadly speaking, a dictionary provides a general definition of a word, whereas a glossary provides an explanation of a word customized for a particular context and audience.

The types of dictionaries used in studies of accommodations for ELLs include standard English dictionaries, learners’ dictionaries, and customized dictionaries. A learner’s dictionary is designed specifically for ELLs and defines words in simplified English. Like some standard English dictionaries, learners’ dictionaries also give examples of usage and may provide synonyms. The term customized dictionary is used by researchers to refer to a dictionary that has been altered or specially compiled for a given context. It may refer to a learner’s dictionary where language has been simplified specifically for ELLs. A customized dictionary also may contain a specialized list of standard dictionary definitions compiled for a particular assessment and containing words relevant to that assessment.

In accommodation studies, glossaries appear as specialized lists of key words in English with definitions or explanations customized to fit the perceived needs of the test taker. Glossaries may use simplified language and may also be provided in the student’s native language, but glossaries can take other forms as well. For example, in one study (Abedi, Courtney, & Leon, 2003), instead of a combined list of words, students were provided with a pop-up glossary. In this case, computer testing was utilized, and the explanation (or gloss) of a key term appeared when the student passed a cursor over that term on his or her computer screen. In another study (Abedi, Lord, Boscardin, & Miyoshi, 2001), marginal glosses printed in the test booklet were used. The gloss in one margin included a definition or explanation in English; the gloss in the other margin provided a Spanish translation of the English gloss.

However, the distinctions made here are not applied consistently in accommodations research. Abedi, Courtney, and Leon (2003) described the customized English dictionary used in their study as “a glossary of non-content words in the math test . . . composed of exact excerpts from an ELL dictionary” (p. 5). Abedi (2003) refers to commercial bilingual dictionaries as glossaries on the basis that, unlike English dictionaries, these texts provide translations of terms rather than definitions (p. 18, footnote 2). On the whole, in the research examined, there is little agreement on what constitutes a dictionary as opposed to a glossary and therefore no identifiable standard to govern the use of a dictionary versus a glossary as an appropriate form of accommodation.

Although glossaries also appear promising, only three studies were found that examined their use. When English dictionaries and glossaries were used as accommodations in the same study (Abedi, 2003; Abedi, Lord, et al., 2001; Abedi, Courtney, & Leon, 2003), dictionaries generally were found to be more useful.

Of the 15 available studies, six, listed in Table 2, examined dictionaries and glossaries as accommodations. Four studies examined the use of dictionaries; two of these four studies also examined the use of a glossary. Two separate studies examined the effect of glossaries only.

 

Table 2. Studies examining the effectiveness of dictionaries and/or glossaries for ELLs

Study | Content | ELLs | non-ELLs | Total | Grade(s)
Albus, Bielinski, Thurlow, & Liu (2001)b | Reading | 133 | 69 | 202 | middle school
Abedi (2003)bc | Science | 317 | 294 | 611 | 4, 8
Abedi, Courtney, & Leon (2003)b | Math | 535 | 614 | 1,149 | 4, 8
Abedi, Lord, Boscardin, & Miyoshi (2001)bc | Math | 183 | 236 | 419 | 8
Abedi, Hofstetter, Baker, & Lord (2001)ac | Math | 501 | 445 | 946 | 8
Kiplinger, Haug, & Abedi (2000)ac | Math | 152 | 1,046 | 1,198 | 4

Note. In describing student samples, some researchers used the terms “LEPs” and “non-LEPs” rather than “ELLs” and “non-ELLs.”

a Because ELL and non-ELL sample Ns were not reported, values were calculated from percents of the total sample.

b Study examined use of a dictionary. c Study examined use of a glossary.

 

Available research suggests that, with regard to using dictionaries in a testing situation, the effect of the accommodation on test validity is a key concern. For example, test validity may be compromised by the use of a dictionary that defines key vocabulary or illustrates content tested (Rivera & Stansfield, in press; Laufer & Hadar, 1997). In light of this concern, Abedi, Lord, Boscardin, and Miyoshi (2001) suggest that dictionaries should be customized to control vocabulary and other types of information provided to test takers. Some researchers have noted that a positive aspect of using dictionaries as a test accommodation is that they are widely used as part of instruction and should therefore be familiar to students (e.g., Abedi, 2003; Albus et al., 2001). Researchers at the National Center for Educational Outcomes (Albus et al., 2001) contend that customized dictionaries in particular do not burden administrators and students with the bulk of published dictionaries, nor do they contain words that assist students with test content. In cases where concerns arise that providing ELLs with a traditional dictionary may provide an unfair advantage, customized dictionaries offer a potentially viable alternative (Abedi, 2001).

In the Abedi (2003) study using Spanish language glosses, the researchers noted that it was difficult to understand the effect of the accommodation in the absence of data on students’ level of Spanish language proficiency. Overall, English language glossaries seemed to be more useful than Spanish language glossaries. For those students not literate in Spanish who are being instructed in English, it stands to reason that a Spanish language glossary may not be helpful. However, for students with basic literacy in Spanish or in English or students participating in a dual-language program, it is possible that Spanish language glosses could prove useful. These observations support the need to consider student background variables carefully prior to selecting an accommodation. Overall, however, more research needs to be conducted to examine the effects of English and native language glossaries.

Based on the limited number of studies and the often blurred distinction between dictionary and glossary conditions, it is imperative that future research define these accommodations consistently so that the separate effects of the two approaches can be understood clearly. To discern whether the impacts of these accommodations are significant, effects should be documented separately. This is particularly important in cases where other accommodations (e.g., extra time) are used in tandem with glossaries and dictionaries. It also is important to explore further the separate effects of these two distinct accommodations on test validity.

Native Language

Accommodations in the native language are wide-ranging but may include written translation of test directions and/or items; bilingual or dual language versions of the test; oral repetition of test directions and/or items in the native language via audiotape; or sight translation (i.e., a spontaneous, oral rendition of the test content in the student’s native language).

The effects of native language accommodations were examined in five of the 15 identified studies (see Table 3). Three of the five studies examined the use of written translation of test directions/items or bilingual versions of the test. A fourth study examined the oral delivery of native language accommodations in the context of “oral presentation,” an accommodation that included the option of reading directions in the student’s native language (as well as other options unrelated to native language accommodation). The fifth study allowed the use of audiotape as a native language accommodation but did not examine it as a separate accommodation.

 

Table 3. Studies examining the effectiveness of native language accommodations for the assessment of ELLs

Study | Content | ELLs | non-ELLs | Total | Grade(s)
Garcia (2000) | Math | 320 | 82 | 402 | 8
Abedi, Lord, & Hofstetter (1998)a | Math | 864 | 530 | 1,394 | 8
Anderson, Liu, Swierzbin, Thurlow, & Bielinski (2000) | Reading | 105 | 101 | 206 | 8
Hofstetter (2003) | Math | 676 | 173 | 849 | 8
Hafner (2000) | Math | 82 | 288 | 370 | 4, 7

Note. In describing student samples, some researchers used the terms “LEPs” and “non-LEPs” rather than “ELLs” and “non-ELLs.”

a Because ELL and non-ELL sample Ns were not reported, values were calculated from percents of the total sample.

 

These studies highlight the need to make decisions about the use of native language accommodations based on whether students are being instructed in whole or in part in the native language. In cases where a student is instructed in the native language and/or where a student literate in the native language has recently enrolled in a U.S. school, this limited pool of research suggests that testing in the student’s native language can facilitate access to the content of the test. By contrast, when students are being instructed only in English, a native language test has the potential to affect student performance adversely (Hofstetter, 2003).

Reading Test Items or Directions Aloud

As a test accommodation, reading aloud is used primarily for students who are dyslexic or blind. It requires the student to listen to the text and to comprehend and process it from short-term memory. No written text is provided. By contrast, when reading aloud is used for ELLs, the student typically is allowed to hear and read the text at the same time.

In the two available research studies using this accommodation for ELLs, two approaches were taken. In one study, an exact oral rendition of the items was provided; in a second study, an interpretation of test directions was allowed. Both studies allowed extra time and utilized a quasi-experimental design. Table 4 profiles these studies.

 

Table 4. Studies examining the effectiveness of reading aloud for the assessment of ELLs

Study | Content | ELLs | non-ELLs | Total | Grade(s)
Castellon-Wellington (2000) | Social Studies | 106 | 0 | 106 | 7
Hafner (2000) | Math | 82 | 288 | 370 | 4, 7

Note. In describing student samples, some researchers used the terms “LEPs” and “non-LEPs” rather than “ELLs” and “non-ELLs.”

 

The two studies provide a contrast in offering a read-aloud accommodation to ELLs. Castellon-Wellington allowed test items to be read aloud, whereas Hafner allowed test directions to be provided to students as an extended oral presentation. A perhaps more significant difference, however, is that whereas Castellon-Wellington allowed an exact reading of the test items, Hafner allowed a great deal of latitude on the part of the test administrator to choose what form the oral presentation of directions would take, including simplification, re-reading test directions, providing additional examples, or reading directions in a student’s native language. Furthermore, for the Hafner study, no record was kept of the form of oral presentation provided.

These two studies are excellent examples of why it is essential that the separate needs of ELLs and students with disabilities be examined carefully before an accommodation is targeted for use; that is, is a selected accommodation directly responsive to the linguistic needs of ELLs, or is its use more appropriately directed toward the cognitive and/or physical needs of students with disabilities? Because the two studies examining reading aloud differed widely in approach, it is difficult to assess the rationale, processes, and purposes for providing a read-aloud accommodation to ELLs. One study focused on test items, while the other centered only on test directions; one provided a straight oral rendering of the test items, while the second allowed the tester latitude in administering the accommodation. In sum, given the limited number of studies available, many questions remain unanswered regarding whether a read-aloud accommodation is appropriate for and of benefit to ELLs.

Extra Time

The use of extra time on an assessment is “based on the premise that if language poses a problem for ELLs, students under normal testing conditions may not be able to carefully consider all of the items on the test” (Castellon-Wellington, 2000, p. 3). Although extra time may be provided as a single accommodation, more commonly it is provided in conjunction with other accommodations. For example, students may be permitted both to use a customized dictionary and to receive extra time. For “speeded” tests, or tests that assess students’ rates of item completion as part of the construct being measured, providing students with extra time violates part of the construct under examination. For all other assessments, extra time does not violate the construct being measured.

Six of the 15 studies under review made use of extra time. Of these, four examined the use of extra time as an accommodation, whereas two studies simply permitted all students to use extra time in combination with other accommodations; these studies did not examine the effect of extra time separately (Abedi, 2003; Albus et al., 2001). The four studies directly examining the use of extra time are presented in Table 5.

 

Table 5. Studies examining the effectiveness of extra time for the assessment of ELLs

Study | Content | ELLs | non-ELLs | Total | Grade(s)
Abedi, Hofstetter, Baker, & Lord (2001) | Math | 501 | 445 | 946 | 8
Abedi, Courtney, & Leon (2003)a | Math | 535 | 614 | 1,149 | 4, 8
Castellon-Wellington (2000) | Social Studies | 106 | 0 | 106 | 7
Hafner (2000) | Math | 82 | 288 | 370 | 4, 7

Note. In describing student samples, some researchers used the terms “LEPs” and “non-LEPs” rather than “ELLs” and “non-ELLs.”

a Because ELL and non-ELL sample Ns were not reported, values were calculated from percents of the total sample.

 

With the single exception of “speeded” tests, in which time is integral to the construct being measured, there appears to be no harm, and in some cases a potential advantage, in providing extra time as a form of accommodation. The majority of studies reviewed demonstrate that extra time generally is helpful: ELLs may not have performed at advanced levels, but they often performed better when afforded extra time. However, these studies also appear to indicate that extra time offered in isolation is not as helpful as more targeted forms of accommodation linked to ELLs’ level of English language proficiency. In sum, the limited research suggests that, to better ‘level the playing field’ for ELLs, extra time is best paired with an accommodation that directly targets ELLs’ linguistic needs. For example, some studies have demonstrated positive effects for ELLs in cases where extra time was coupled with glossaries and/or dictionaries (Abedi et al., 2003; Abedi et al., 2001).

Summary of Research on Accommodations

In its review of accommodations, the National Research Council (NRC, 1999a) reached the conclusion that “research on the effects of test accommodations for English-language learners is inconclusive” (p. 62). Five years later, this seems still to be the case. Although some accommodations, such as linguistic simplification, appear to be promising, the body of available research is far too limited to provide conclusive evidence regarding the utility of specific accommodations. Additional studies designed to examine promising types of accommodations with appropriate, sizeable student populations need to be carried out. Native English speakers also must be included in the studies, along with control groups (i.e., students who do not receive accommodations) so as to examine the full effects of the accommodation (Thurlow et al., 2000). In designing studies, researchers also must take into account other factors that may affect outcomes, such as the diversity within samples of English language learners (e.g., differing cultural backgrounds, level of English language proficiency, education in the native language), as well as the methods used to create and implement accommodations (Thurlow et al., 2000).

A complementary perspective from which to study strategies that support ELLs’ use of accommodations is research on second language acquisition. By understanding how ELLs process language, researchers will be better able to judge the effectiveness of individual accommodations in allowing ELLs access to the content of the test. This research is examined in the following section.

How Second Language Acquisition Research Informs the Use of Accommodations

Research on second language acquisition centers on the linguistic and cognitive processes involved in learning a second language. Decisions about accommodations appropriate to the needs of ELLs are informed by understanding how second language learners process language, the difficulties they face, and how language modifications can influence ELLs’ comprehension.

As discussed above, the tendency of ELLs to process the language of a test by focusing on linguistic structures, lexical items, and phonological features leaves fewer cognitive resources available for accessing the content of a test. Compared to peers who have automatized processing the language of the test, ELLs who have not automatized language processing are at a distinct disadvantage. Furthermore, in addition to processing the language of the test, ELLs also must negotiate the socio-cultural practices and expectations embedded in assessment. These aspects of testing often involve students’ academic background, which is affected by socio-economics as much as by linguistic differences. Accommodations can mitigate these disadvantages by helping ELLs access the content of the assessment. This support can be provided directly, through the (1) simplification, (2) repetition, or (3) clarification of the test items or directions, or indirectly, by modifying the conditions under which a test is taken. The manner in which each facet of support is addressed by second language acquisition research is discussed below.

(1) Simplification of test language can facilitate ELLs’ comprehension and reduce the linguistic load placed on them during the assessment. The process of simplification is intended to reduce the semantic and syntactic complexity of the English used in the test while preserving vocabulary and terms pertinent to the content area. A number of factors contribute to linguistic complexity, such as word frequency, word length, morphological complexity, and sentence length. Low frequency, long, or morphologically complex words and long sentences are especially difficult for ELLs to process (Abedi, Lord, & Plummer, 1995).

Second language researchers have identified passive voice constructions (Forster & Olbrei, 1973), long noun phrases (Halliday & Martin, 1993), long question phrases (Adams, 1990), comparative structures (Jones, 1982), prepositional phrases and conditional clauses (Celce-Murcia & Larsen-Freeman, 1983), and relative clauses (Schachter, 1983) as difficult for both ELLs and native English speakers to process. In the process of simplification, such late-acquired or complex structures are therefore minimized or replaced with simpler ones (Abedi, Hofstetter, Baker, & Lord, 2001). Simplification of this type makes the language more accessible to ELLs, thereby allowing them to access the core messages of the test more easily (Chaudron, 1988).
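To illustrate with a hypothetical item stem (not drawn from the studies reviewed): “If a sample of 30 students was selected at random by the researchers, which of the following . . .” contains a passive construction and a conditional clause; a linguistically simplified version might read “The researchers randomly selected 30 students. Which of the following . . .” The mathematics being tested is unchanged; only the linguistic load is reduced.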

Even those ELLs with more than a beginning knowledge of English may encounter difficulties in effectively marshalling their knowledge in a testing situation. Despite their increased proficiency in English, research shows that bilinguals still in the earlier stages of language acquisition carry out encoding and decoding in the weaker language at slower processing speeds (Blair & Harris, 1981; Dornic, 1979; Mack, 1986; Soares & Grosjean, 1984). In addition, during second language processing, both short-term and working memory may be significantly taxed (Ellis, 1996; Ellis & Schmidt, 1997; Ellis & Sinclair, 1996; Hoosain & Salili, 1987; Miyake & Friedman, 1998; Naveh-Benjamin & Ayres, 1986; Robinson, 1995, 2001; Skehan, 1998).

(2) Repetition of test directions and/or items also affords ELLs an additional opportunity to process the language of the test by reducing the impact of processing speed and memory capacity on ELLs’ comprehension. The repetition type most frequently discussed in second language acquisition research is that in which the meaning of an utterance is restated or rephrased in order to keep a conversation flowing. This type of repetition contrasts with exact repetition, which occurs when test directions and/or items are read more than once. Research on exact repetition has found that it has a positive effect on comprehension of a particular utterance (Cervantes, 1983; VanPatten, 1990; Jensen & Vinther, 2003). This research has operated from the premise that second language learners will try to extract the meaning of an utterance on a first listening but, if comprehension fails, will use the repetition to notice linguistic details they missed the first time in order to make a more accurate hypothesis about meaning (Cervantes, 1983; VanPatten, 1990).

(3) Clarification also can be used to help ELLs gain access to the language of a test. Clarification can be provided either as an input strategy (through explanation of the linguistic input provided to the student) or as an output strategy (through clarification requests). Clarification requests occur when negotiating meaning and are used even by native speakers of a language when something in the linguistic input is unclear. When ELLs negotiate meaning through clarification requests, they receive support for just the specific portions of the linguistic input that they do not understand, and the reformulated input is more manageable and within their processing capacities (Long, 1980, 1983, 1996). ELLs at an intermediate stage of acquiring English are thought to be more successful at negotiating meaning and making clarification requests because they have access to a larger number of linguistic resources in English than do ELLs at the beginning stages of learning English (Pica, Lincoln-Porter, Paninos, & Linnell, 1996).

Some second language acquisition research has also suggested that situations that require a second language learner to produce language may serve an even more important function—having to produce language may force learners to “notice the gap” between what they know and what they want to be able to say (Swain, 1985, 1995). Opportunities for language output in a testing situation, such as having the student verify his or her understanding of test directions, may be limited. However, Ellis (1999) argues that such opportunities may allow ELLs to notice specific linguistic features that are particularly problematic and, therefore, compel them to engage in a deeper level of language processing.

Based on the understanding of ELLs’ use of cognitive resources provided by second language acquisition research, it may reasonably be assumed that other strategies providing indirect linguistic support may help maximize the cognitive resources at ELLs’ disposal in a testing situation. Although such strategies as changes in test schedules or test venues are routinely used in state assessment, no research directly supports this practice.

Conclusion

Part I has highlighted the key issues educators and policy makers must face when considering the use of accommodations for ELLs taking state assessment. As the federal legislation makes clear, only accommodations that yield “reliable information” (1994 ESEA) or “reliable data” (2002 ESEA) will enable states to include ELLs meaningfully in state assessment systems.

Although legislation dictates that ELLs be assessed “in the language and form most likely to yield accurate and reliable information on what such students know and can do,” more research clearly is needed to determine the most appropriate test accommodations for ELLs. While the number of studies of specific accommodations is limited and the evidence base quite mixed, several types of direct linguistic support accommodations, including simplified language and customized dictionaries, appear to hold promise. Additionally, the use of extra time, an indirect linguistic support accommodation, particularly when paired with an accommodation that offers direct linguistic support, appears to benefit ELLs.

Because the research base is weak, a great deal more work remains to better define (1) which accommodations directly target the linguistic and socio-cultural needs of ELLs and (2) which accommodations best support ELLs’ access to particular content areas. In the absence of additional empirical data, the extent to which particular test accommodations affect test validity remains to be examined. Accordingly, Stansfield (2002) argues that it is critical for research to determine whether accommodations pose a threat to a test’s validity or to score comparability between ELL and non-ELL test-takers. In theory, only once score comparability has been established can an accommodation reasonably be considered for use.

A promising resource for accommodations research is research on second language acquisition (SLA). As shown, SLA research sheds light on how accommodations that provide direct linguistic support improve ELLs’ access to the content of an assessment. This research identifies the various forms of direct linguistic support, including translation and modified input (i.e., simplification, repetition, or clarification). By reducing the cognitive resources that must be expended on processing the language of the test, such support leaves more resources available for processing the test content, allowing ELLs to better attend to what the test is actually measuring.

 

Part II: Synthesis of State Policy Findings

One of the primary goals of states’ accommodations policies is (or should be) to provide districts with sufficient guidance to make good decisions regarding the inclusion of ELLs in state assessment. To accomplish this goal, states’ policies must help districts determine which students should be accommodated and which accommodations are most likely to benefit each of these students. As the findings of the SY 2000–2001 study suggest, this is a complex task. States’ policies must provide guidance regarding which of the many available accommodations are most likely to provide support that will result in valid and comparable test scores. They must designate criteria and personnel to match eligible ELLs with appropriate accommodations. Finally, to be effective, these policies must be comprehensive and accessible to district and school personnel.

In reviewing state assessment policies for SY 2000–2001, the research team attempted to identify factors that contribute to policy that provides effective guidance to districts using accommodations for ELLs. Findings of the study indicate that the comprehensiveness and clarity of states’ assessment policies varied greatly. Indeed, many of these policies fell short of providing effective guidance to districts regarding the use of accommodations. The following points summarize findings in relation to key elements of assessment policy.

  • In addressing the use of accommodations, states’ assessment policies often did not focus adequately on the unique needs of ELLs.
    • ELLs and SDs were grouped inconsistently, often at the expense of ELLs.
    • A variety of approaches were taken in listing accommodations available to ELLs, including the use of a taxonomy targeted toward SDs rather than ELLs.
  • States’ policies often did not make clear which accommodations were appropriate to particular content areas.
  • States’ policies indicated that a wide variety of accommodations were available to ELLs.
  • States’ policies were inconsistent in indicating how appropriate accommodations were to be matched to appropriate ELLs.
    • Criteria often were not indicated.
    • Decision makers often were not designated.

How States’ Policies Addressed Accommodations for ELLs

A large majority of states’ policies for SY 2000–2001 addressed the accommodation of ELLs. Of the 51 states,[9] only four (AK, GA, ID, and IL) had no policies addressing the accommodation of ELLs. Nearly all the policies of the remaining 47 states named particular accommodations districts might use for the assessment of ELLs. Iowa’s policy was the only exception: it offered inclusion guidelines, with suggestions for selecting appropriate accommodations for ELLs and examples of particular accommodations, but it did not delimit districts’ selection of accommodations as other states’ policies did. Hence, 46 states were identified as listing or naming accommodations.

States’ policies varied greatly in addressing the use of accommodations for ELLs. Wide differences among states’ policies limited the research team’s ability to make generalizations regarding these policies. It is possible, nonetheless, to identify three central concerns that informed the organization of states’ policies: (1) identification of student groups eligible to take accommodated assessments (e.g., ELLs, SDs), (2) identification of accommodations to be made available to eligible students within these groups, and (3) identification of content areas (e.g., mathematics, English language arts) for which particular accommodations should be allowed or prohibited. Each consideration should form part of any accommodations policy. In the judgment of the research team, states’ policies often succeeded or failed according to the degree to which they focused on the needs of ELLs in addressing these three considerations.

How ELLs and SDs Were Grouped in States’ Policies

States’ policies were often organized explicitly around the needs of two student groups: ELLs and SDs. Some states’ policies treated these as entirely separate groups, whereas others addressed these as one group. In some cases, states’ policies adopted special nomenclature to describe the combined group of ELLs and SDs, such as "special populations" or "special needs" students. In other cases, policies simply addressed both student groups in the same document. The strategy of grouping ELLs and SDs has served as an expedient for organizing state policy and addressing the needs of students for whom the state is legislatively accountable.

In most cases, guidelines for individual assessments for which accommodations were offered were subordinated to considerations of student groups (ELLs and SDs). However, a small number of states’ policies addressed student groups within the context of particular state assessments. For example, much of Oregon’s policy regarding accommodations was found in the state’s administration manuals for the Statewide Knowledge and Skills Assessments. The manuals addressed two areas: (1) reading and literature, math, and science, and (2) writing and mathematics problem solving. Each manual contained guidelines for including ELLs and SDs in each content area tested in Oregon’s Statewide Knowledge and Skills Assessments.

Irrespective of how particular states’ policies were organized, whether by addressing ELLs and SDs together in the same document or by providing separate documentation for each group, there was significant variation in the comprehensiveness of the guidance states provided. Whereas some states offered separate and extensive policy documents identifying accommodations appropriate for ELLs, other states provided only cursory guidance restricted to one or two pages.

Although some states’ policies grouped ELLs and SDs and provided guidance for both groups, the research team found that, in many instances, grouping ELLs and SDs occurred to the detriment of ELLs. This may be due to the fact that when federal legislation required the inclusion of ELLs in state assessment, many states initially used existing accommodations policies designed for students with disabilities as the framework for ELL accommodations policy (Rivera et al., 2000). Although it is not entirely representative of states’ assessment policies regarding accommodations for ELLs, Connecticut’s policy provides a compelling example of how some states’ policies seem to be written within an SD-responsive mindset.

In its Assessment Guidelines, the Connecticut State Department of Education addressed the use of accommodations for testing all eligible students. In a discussion providing "General Information About Accommodations," the document cites Section 504 of the Rehabilitation Act of 1973 and the Americans with Disabilities Act of 1990 as entitling "students with disabilities the opportunity to participate in and receive the benefits to be derived from statewide testing efforts." When describing accommodations, the document continues in the same vein, suggesting that accommodations are "provided to allow students with disabilities the opportunity to demonstrate their aptitude and achievement in testing situations" (CT-4, p. 17). No mention is made of ELLs in this introductory material. In fact, ELLs are not named explicitly until the section entitled "Who May Receive Accommodations." Here, ELLs are referenced in the last sentence of the first paragraph, almost as an aside: "Additionally, limited accommodations are available for bilingual and ESL students" (p. 18). Finally, ELLs are treated explicitly in a separate section describing the three accommodations for which they are eligible: reading aloud/clarifying test directions in English or the student’s native language, extending the time allowed for the assessment, and individual administration of the test. Nine accommodations were available for SDs.

Because policy regarding SDs is more developed than that regarding ELLs, policy documents that attempt to address the combined needs of ELLs and SDs have the potential to obscure the needs of ELLs in favor of those of SDs.

How Accommodations for ELLs Were Organized in States’ Policies

The second major strategy used in states’ policies to provide guidance for the accommodation of ELLs was the listing of accommodations available for eligible students (e.g., ELLs and SDs). Forty-five states provided lists of accommodations available to ELLs. The research team found that these states’ policies tended to blur distinctions between ELLs and SDs in two ways: (1) by organizing lists of accommodations according to a taxonomy developed for SDs and (2) by providing lists of accommodations meant to address the combined needs of ELLs and SDs.

Accommodations Taxonomies

States adopted a variety of strategies for organizing accommodations in policies for SY 2000–2001. Some states’ policies simply listed available accommodations; other states’ policies listed accommodations available to ELLs according to particular assessments. A majority of states (28 of 45), however, arranged accommodations within a taxonomy developed for students with disabilities. (See Appendix B, Table B-1.) According to this taxonomy, available accommodations can be sorted into the following categories: (1) timing/scheduling, (2) setting, (3) presentation, and (4) response. This taxonomy was used in states’ policies to organize many of the 75 accommodations listed among states’ assessment policies as available to ELLs during SY 2000–2001. Thirty-one of these accommodations are clearly relevant only to SDs, whereas only 44 are relevant to ELLs. These data are discussed in greater detail below.

Accommodations appropriate for ELLs can be found in every category of this taxonomy, which suggests that the SD-responsive taxonomy is broad enough to encompass nearly all available accommodations for “special needs” students, i.e., ELLs and SDs. However, the inclusiveness of this taxonomy is its primary shortcoming. Policies relying on such an inclusive taxonomy run the risk of obscuring differences in accommodations appropriate for ELLs and those appropriate for SDs.

Listing accommodations for ELLs

Whether adopting the SD-responsive taxonomy or some other means of classification, many states’ policies listed individual accommodations available for ELLs alongside those available for students with disabilities. For instance, among the accommodations listed in Mississippi’s policy, those involving reading test directions and test items aloud also involve the use of sign language or text scanners and voice synthesizers (MS-2, p. 27). Maryland policy listed "accompany oral directions with written directions" alongside "Cue the student to remain on task" (PA-2, p. 7). Wyoming policy listed 56 accommodations available to all eligible students. As was the case with the policies of Maryland and Mississippi, the majority of these accommodations are clearly intended for SDs; only a handful are relevant to ELLs (e.g., extra time, reading directions aloud), and none address ELLs’ needs exclusively. Wyoming policy provided an additional list of six accommodations directed specifically toward ELLs. Some of these accommodations, such as the individual administration of the assessment, the reading aloud of the math assessment in English, and the clarification of words (in English), were also represented on the primary list. Other accommodations address ELLs by offering support in the native language, such as the reading aloud of instructions.

In some cases, the distinctions among accommodations relevant to ELLs, as opposed to those relevant to SDs, are obvious. For instance, it is fairly clear that “use of enlarged print” would be of little help to ELLs. However, by adopting an SD-responsive framework to organize accommodations or by listing ELL-responsive and SD-responsive accommodations in a single list, states’ policies ignore important differences between ELLs and students with disabilities and encourage the perception that the needs of all “special populations” can be met through the same assessment strategies. Table B-2 in Appendix B provides data regarding how states listed accommodations.

Accommodations and Content Area

States’ policies were inconsistent in addressing the relationship between content areas and accommodations. Overall, the most noticeable trend with regard to states’ treatment of content areas is that accommodations providing direct linguistic support to ELLs were more likely to be prohibited for English language arts (ELA) than for other content areas. This is most likely due to the perception that direct linguistic support accommodations are more likely to compromise the validity of an ELA test than a mathematics or science test. Data relating to content areas for which accommodations were explicitly designated can be found in Appendix B, Table B-3.

Most states explicitly addressed the content areas for which accommodations could be made available to ELLs. For instance, Wyoming policy listed six accommodations designated exclusively for ELLs. Of these accommodations, two were allowed only for the mathematics test (reading aloud in English and reading aloud in the student’s native language). Wisconsin’s policy recommended the use of six accommodations for the Wisconsin Knowledge and Concepts Examinations. However, the state’s policy explicitly prohibited the use of accommodations on the Wisconsin Reading Comprehension Test because this test "addresses specific English language-based skills (reading comprehension) and is administered in an untimed format" (WI-3, p. 1). Kansas policy listed nine accommodations that were "allowed on Kansas Assessments," with only one caveat: reading aloud to ELLs was not permitted for reading comprehension tests.

A few states, however, did not explicitly reference content areas in their policies. For example, the policies of the District of Columbia, Kentucky, New Jersey, South Dakota, Virginia, and West Virginia did not indicate which tests were to be administered along with particular accommodations.

Despite the fact that some states’ policies did not explicitly identify content areas for which particular accommodations were appropriate, it is often clear from context which accommodations were allowed for which content areas. Often, knowing the assessment addressed in a state’s policy is enough to determine the content areas for which accommodations were allowed. For instance, South Carolina policy made five accommodations available to ELLs taking the Palmetto Achievement Challenge Test (PACT). Of the five accommodations, one (oral administration) was explicitly allowed only for the PACT mathematics test. Because PACT consists of ELA and mathematics assessments, it might be assumed that the other four accommodations were available for both ELA and mathematics.

In many cases, knowing the relevant state assessments may provide readers of states’ policies with enough information to administer accommodations appropriately. However, by not explicitly identifying content areas for which accommodations can be allowed, states’ policies introduce an unnecessary level of ambiguity regarding which tests are eligible to be accommodated and which specific accommodations can be used.

Accommodations Available to ELLs in SY 2000–2001

Seventy-five accommodations were cited in states’ assessment policies as being available to ELLs. Of these, 31 exclusively addressed the needs of SDs. Only 44 addressed the needs of ELLs. (Table B-4 in Appendix B lists the 75 accommodations, indicating which accommodations are relevant only to SDs.) These data suggest that states’ policies were not focused directly on the needs of ELLs. In response to this lack of focus, the research team developed an ELL-responsive taxonomy linking the use of accommodations more closely to the needs of ELLs.

As an initial step in building a taxonomy for classifying accommodations for ELLs, the research team reviewed the classification of accommodations used for the 1998–1999 study of state assessment policies for ELLs (Rivera et al., 2000). In this study, Rivera et al. identified accommodations appropriate for ELLs and classified these as “linguistic accommodations” in both English and ELLs’ native languages. All other accommodations, including those designed for students with disabilities, were classified as “non-linguistic.” For the 1998–1999 study, 16 of 37 accommodations found in state policies were classified as linguistic.

Next, the research team examined second language acquisition research for insight into additional approaches that might offer ELLs access to the language of the assessment. This research supported the use of "linguistic" accommodations. Moreover, it led the research team to further consider how the cognitive demands placed on second language learners when processing a second language could be ameliorated. This, in turn, prompted consideration of how "non-linguistic" accommodations could support ELLs in maximizing their cognitive resources during a testing situation.

These two steps enabled the development of a new accommodations taxonomy that accounts for both direct linguistic support accommodations and indirect linguistic support accommodations. Accommodations providing direct linguistic support involve changes to the language of the test that directly assist ELLs in processing the language of the test. Such accommodations can be provided in English or in the student’s native language. These accommodations include, for example, providing a version of the test translated into the student’s native language or clarifying the (English) language of the test items and/or directions so that students are not assessed inadvertently on their knowledge of English rather than the construct of the test.

Because students in the process of learning English as a second language may have greater linguistic demands placed on them in the testing situation than do their native English–speaking peers, ELLs may need additional forms of support to allow them to demonstrate their knowledge of the content being assessed. Indirect linguistic support accommodations provide this support to ELLs by adjusting the conditions under which they take an assessment. Such accommodations are designed to help ELLs process language more easily, but they are not direct changes of the language of the test. Hence, indirect linguistic support accommodations include such considerations as increasing the time during which a student is allowed to take an assessment or allowing an ELL to take a test in a familiar room. The use of direct and indirect linguistic support accommodations helps ensure that assessments measure ELLs’ progress in relation to the academic construct being measured, not ELLs’ progress in developing English language proficiency.

As Figure 1 shows, of the 44 ELL-responsive accommodations, the research team identified 27 as direct linguistic support accommodations and 17 as indirect linguistic support accommodations. The analysis of states’ assessment policies from the perspective of the ELL-responsive taxonomy provides insight into the extent to which states’ policies provide guidance in the use of test accommodations. A complete list of direct and indirect linguistic support accommodations specified in states’ policies for SY 2000–2001 is provided in Appendix B, Table B-5.

 

Figure 1. Profile of accommodations for ELLs found in states’ policies
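The arithmetic behind this profile can be restated compactly. The following sketch (in Python, offered purely as an illustration; the study itself contains no code, and the variable names are ours) reproduces the counts reported above and checks them against the category totals given in Tables 1–3 below.

# Illustrative restatement of the accommodation counts reported in this study.
# All figures come from the study itself; only the structure is ours.
taxonomy = {
    "direct linguistic support": {
        "native language": 12,   # Table 1: written translation, scripted oral
                                 # translation, sight translation, student response
        "English language": 15,  # Table 2: simplification, repetition, clarification
    },
    "indirect linguistic support": {
        "test schedule": 5,      # Table 3, items 1-5
        "test environment": 12,  # Table 3, items 6-17
    },
}

direct = sum(taxonomy["direct linguistic support"].values())      # 27
indirect = sum(taxonomy["indirect linguistic support"].values())  # 17
assert direct + indirect == 44   # the 44 ELL-responsive accommodations
assert 44 + 31 == 75             # plus the 31 SD-only accommodations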

Accommodations Providing Direct Linguistic Support

As discussed in Part I, second language acquisition research shows that during an assessment, ELLs are overwhelmed with linguistic input, which they receive faster than they are able to process effectively. Direct linguistic support accommodations are intended to provide ELLs linguistic support that mitigates the language demands placed on them during assessment. At the same time, these accommodations must preserve the validity of the test by ensuring that the construct being tested remains unaltered. In other words, linguistic accommodations are intended to give ELLs access to the language of the test, not help in responding to test items correctly.

Direct linguistic support accommodations support ELLs by providing modified input in the native language or in English. Native language accommodations involve the translation of some or all of the language of the test into the student’s native language. English language accommodations involve (1) the simplification of some or all aspects of the test language, (2) repetition of the test language, or (3) clarification of parts of the test language. In representing data for these accommodations, the research team has, where possible, followed the practice among states of distinguishing between the use of linguistic accommodations for test directions and test items.

In understanding the data presented in this study (in particular, data provided in Appendix B, Tables B-6–B-9), it is important to keep in mind that states provided different levels of detail in addressing accommodations. This variation resulted in challenges in coding data: (1) identifying the purpose of particular accommodations and (2) identifying the content areas for which accommodations were available.

First, because states provided different levels of detail in addressing accommodations, accommodations organized under the ELL-responsive taxonomy may overlap in purpose. This means that some accommodations listed separately may in practice be identical. Three accommodations involving test directions are a case in point: (a) reading aloud of test directions, (b) repeating test directions, and (c) providing "both oral and written directions." In most cases, states’ policies did not make clear whether directions read aloud were also intended to be presented in written form, making it difficult to distinguish among (a), (b), and (c). If directions read aloud were also presented in writing, then (a), (b), and (c) would be identical. Unfortunately, because there is no information on practice, it is not always possible to identify which accommodations listed in states’ policies serve the same purpose. In the interest of accuracy, the wording of the states’ policies in which these accommodations appeared has been preserved where possible.

Second, states’ assessment policies often, but not always, specified which accommodations were allowed or prohibited for use with specific content area tests (e.g., English language arts [ELA] or mathematics). However, because states’ policies were not consistent in specifying the subject matter for which specific accommodations were allowed, the research team coded accommodations as follows:

  • allowed for at least one content area (A),
  • prohibited for some content areas (PS), or
  • prohibited for all content areas (PA).

This coding does not transparently identify the content areas for which accommodations were available. First, it does not allow for a distinction between states that allowed a particular accommodation for all content areas and those that allowed it for only one. Second, some overlap between (A) and (PS) is inevitable: it may be assumed that a state’s policy prohibiting an accommodation for some content areas (PS) allowed that accommodation for at least one content area (A). However, such determinations often required information beyond that provided by the policies themselves. Because the goal of this study was to present only information explicitly represented in states’ policies, the research team chose not to code implied references to the allowance or prohibition of accommodations for particular content areas.

Despite the limitations of this approach, the research team felt that the coding used throughout this study provides the clearest possible picture of the extent to which states’ policies took content into consideration when listing accommodations. To clarify the implications of this picture, it should be noted that accommodations prohibited in some content areas (PS) were generally prohibited for English language arts (ELA) but allowed for mathematics (and often science).
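For readers who find a procedural statement helpful, the sketch below (in Python; a hypothetical illustration only, as the research team’s coding instruments are not reproduced here) shows one way the A/PS/PA scheme could be operationalized. Consistent with the methodology described above, only explicit references in a policy are coded, so a PS code does not automatically generate an A code.

# Hypothetical illustration of the A/PS/PA coding described above.
def code_accommodation(explicitly_allowed, explicitly_prohibited, all_areas):
    """Return the codes for one accommodation in one state's policy.

    Only content areas the policy names outright are considered;
    implied allowances or prohibitions are deliberately left uncoded.
    """
    codes = set()
    if explicitly_allowed:                        # allowed for >= 1 content area (A)
        codes.add("A")
    if set(explicitly_prohibited) == set(all_areas):
        codes.add("PA")                           # prohibited for all content areas
    elif explicitly_prohibited:
        codes.add("PS")                           # prohibited for some content areas
    return codes

# Invented example: a read-aloud accommodation explicitly allowed for
# mathematics but explicitly prohibited for ELA.
areas = {"ELA", "mathematics", "science"}
print(code_accommodation({"mathematics"}, {"ELA"}, areas))  # {'A', 'PS'} (set order may vary)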

Native language accommodations

The fewer English-language resources an ELL has, the more difficulty he or she may have comprehending the language of the test. By providing test directions and/or test items in the student’s native language, ELLs who have limited linguistic resources in English and who have been taught in their native language are given linguistic access to the tasks on which they are being assessed. Native language accommodations are intended to provide direct linguistic support to ELLs through written translation, scripted oral translation, sight translation, and by permitting the student to respond in his or her native language. Table 1 classifies native language accommodations specified in states’ policies for SY 2000–2001.

 

Table 1. Native language accommodations found in states’ assessment policies, SY 2000–2001

Written translation

1.       Word lists (mono- or dual-language dictionaries and glossaries)

2.       Written directions provided

3.       Side-by-side dual language versions of the test provided

4.       Translated test provided

Scripted oral translation

5.       Oral translation of directions in native language

6.       Audio-taped directions provided in native language

7.       Audio-taped test items provided in native language

Sight translation

8.       Directions translated into native language

9.       Directions explained/clarified in native language

10.    Test items read aloud in native language

11.    Interpreter or sight translator provided

Student response

12.    Student responds in native language

Native language accommodations were found in the policies of 42 states. Figure 2 provides an overview of how these states’ policies addressed native language accommodations from the perspective of content area. As Figure 2 shows, the policies of 36 states allowed at least one native language accommodation for at least one content area; 11 states prohibited at least one of these accommodations for some content areas; 10 prohibited at least one native language accommodation for all content areas. In most cases, native language accommodations prohibited for some content areas were prohibited for ELA but allowed for mathematics (and often for science). This highlights states’ concern that native language accommodations are more likely to compromise the validity of ELA assessments than that of mathematics or science assessments. Table B-6 (in Appendix B) provides a detailed breakdown of these data in terms of state and particular accommodation cited.

Figure 2. Overview of accommodations in native language allowed or prohibited by states’ policies

English language accommodations

Many of the accommodations found in states’ policies provided linguistic support to ELLs in English and were classified by the research team as English language accommodations. From the perspective of second language acquisition research, these accommodations can be understood as employing the following strategies: (1) simplification of some or all aspects of the test language, (2) repetition of the test language, or (3) clarification of parts of the test language. Table 2 lists English-language accommodations specified in states’ policies for SY 2000–2001.

Table 2. English language accommodations found in states’ assessment policies, SY 2000–2001
Simplification

1.       Directions simplified

2.       Test items read aloud in simplified/sheltered English

3.       Simplified/sheltered English version of test provided

Repetition

4.       Directions read aloud in English

5.       Directions repeated in English

6.       Audio-taped directions provided in English

7.       Oral and written directions provided in English

8.       Key words or phrases in directions highlighted

9.       Test items read aloud in English

10.    Audio-taped test items provided in English

11.    Key words and phrases in test highlighted

Clarification

12.    Directions explained/clarified in English

13.    Test-taker verifies understanding of directions

14.    Words on test clarified (e.g., words defined, explained)

15.    Spelling assistance, spelling dictionaries, spell/grammar checker

Accommodations providing direct linguistic support in English were found in the policies of 40 states. Figure 3 provides an overview of how these states’ policies addressed the different kinds of English-language accommodations from the perspective of content area. As Figure 3 shows, the policies of 32 states allowed at least one English-language accommodation for at least one content area; 35 states prohibited at least one of these accommodations for some content areas; seven prohibited at least one English-language accommodation for all content areas. As was the case for native language accommodations, in most instances, accommodations prohibited for some content areas were prohibited for ELA but allowed for mathematics (and often for science). Tables B-7–B-9 (in Appendix B) provide a detailed breakdown of these data in terms of state and particular accommodation cited.

Figure 3. Overview of accommodations providing direct linguistic support in English allowed or prohibited by states’ policies

Indirect Linguistic Support Accommodations

The ability of ELLs to process the language of a test can be affected by test conditions as well as by the language of the test. When test conditions hinder ELLs’ ability to process language, test performance becomes a reflection of ELLs’ English language proficiency rather than their academic capability. Indirect linguistic support accommodations are intended to facilitate ELLs’ comfort level, so their full attention can be given to processing the language and content of the test. Indirect linguistic support accommodations include (1) adjustments to test schedules or to the time allowed students to take an assessment or (2) adjustment of the test environment. A complete list of indirect linguistic support accommodations is provided in Table 3.

Adjusting the test schedule (providing extra time or allowing the student to take a test at the time of day at which the student is most likely to perform at his or her best) can ease anxiety about the test and, therefore, allow ELLs the opportunity to more fully attend to accessing the language and content of the test. As shown in Part I of this study, the provision of extra time may also be helpfully combined with direct linguistic support accommodations that require the student to engage in extra tasks. For instance, a student who is given access to a dictionary or glossary will have to spend time reading the glosses or definitions and possibly looking them up in a separate book or section of the test. Accommodations that involve adjustments to schedule or timing include extending test time, providing breaks, and administering the test at the time of day most beneficial to the test-taker.

 

Table 3. Indirect linguistic support accommodations (N = 17 of 75)
Test Schedule

1.       Test time increased

2.       Test schedule extended

3.       Subtests flexibly scheduled

4.       Test administered at time of day most beneficial to test-taker

5.       Breaks during test sessions

Test environment

6.       Test individually administered

7.       Test administered in small group

8.       Teacher faces test-taker

9.       Test administered in location with minimal distraction

10.    Test-taker provided preferential seating

11.    Test-taker tested in separate location (or carrel)

12.    Person familiar to test-taker administers test

13.    ESL/bilingual teacher administers the test

14.    Additional one-to-one support during test administration in general education classroom (e.g. instructional assistant, special test administrator, LEP staff, etc.)

15.    Test administered in familiar room

16.    Test administered in ESL/Bilingual classroom

17.    Special test preparation provided

 

Test environment accommodations involve adjustments to the physical and socio-cultural features of the testing situation. Taking a test in an unfamiliar room or with an unfamiliar test administrator may heighten the stress caused by assessment and increase test anxiety. Such additional stress can inhibit ELLs’ ability to process the language, and therefore content, of the test. Adjustments to test environment, such as allowing small-group or individual testing or testing in a familiar room or with familiar personnel, can help minimize the stress of assessment. Another hurdle that some ELLs face during assessment is unfamiliarity with the nature and form of a standardized test. Providing special test preparation to ELLs may help minimize the potential stress and confusion likely to accompany assessment.

Forty-four states’ policies included indirect linguistic support accommodations. As Figure 4 shows, the policies of all 44 states allowed the use of at least one indirect linguistic support accommodation for at least one content area. Ten states’ policies prohibited at least one of these accommodations for some content areas, and Tennessee’s policy prohibited the use of indirect linguistic support accommodations for all content areas.

 

Figure 4. Overview of accommodations providing indirect linguistic support allowed or prohibited by states’ policies

The Accommodations Decision-Making Process

Most states’ policies acknowledged that the identification of accommodations appropriate to ELLs should be made on an individual basis. States’ policies for SY 2000–2001 provided guidance to help districts determine (a) which ELLs should take accommodated versions of state assessment and (b) which accommodations are appropriate for particular ELLs. This guidance included (1) criteria to be used in the decision-making process and (2) individuals responsible for making the decisions. Not all policies included both of these considerations. Furthermore, the relationship between criteria and decision makers was often unclear, and the coordination of decision makers’ efforts was frequently left unaddressed.

Criteria

Overall, criteria for determining which students are eligible for ELL-responsive accommodations addressed two points: (a) when ELLs might be ready to take the same assessments as their English-language peers and (b) when ELLs might take other-than-standard versions of the assessment (August & Hakuta, 1997). Taken as a whole, states’ policies addressed these concerns by developing criteria of the following types: (1) language-related: level of English language proficiency or placement in a language-related program of instruction, (2) time-related: length of time a student has been in an academic environment in which English was the primary language of instruction, (3) academic-related: student’s prior schooling and academic achievement as measured by test performance, and (4) opinion-related: judgment of school personnel and/or family of student (including student). Table B-11 provides data on criteria designated in states’ policies for SY 2000–2001. As these data indicate, most states listing criteria (27) specified language-related criteria (specifically, level of English language proficiency). Far fewer states cited academic-related, time-related, or opinion-related criteria (four, eight, and four states, respectively). Although language proficiency is probably the most important of the criteria, it addresses only one aspect of ELLs’ needs. Academic-related criteria can be helpful in taking into account important parts of a student’s background, including the student’s language of instruction.

Another consideration found in many states’ assessment policies is the stipulation that only accommodations used during instruction be used during assessment. This consideration is more pertinent to SDs than to ELLs because it presupposes a system for tracking the use of routine classroom accommodations. The Individual Education Plan of an SD provides such an apparatus, and the accommodations an SD uses during assessment are therefore likely to be those used routinely during instruction. In the case of ELLs, however, the most important consideration is the language of instruction, which is not an accommodation. Rather, the language of instruction forms part of a student’s academic background. Therefore, policy stipulating that only accommodations used routinely in the classroom for ELLs can be used during assessment is unrealistic.

Decision Makers

In addition to criteria, states’ assessment policies for SY 2000–2001 often indicated which individuals were to participate in decisions regarding how to accommodate ELLs and which ELLs to accommodate. In SY 2000–2001, states’ policies designated the following individuals to participate in the decision-making process: (1) language acquisition specialists, (2) test officials (those administering the test), (3) general education teachers, (4) school administrators, and (5) student or student’s family.

To their credit, most states’ policies that listed decision makers designated more than one individual, implicitly acknowledging that more than one perspective should be considered in decisions regarding inclusion. However, many states did not address the coordination of these decision makers.

The qualifications of decision makers also merit consideration. As Rivera et al. (2000) observe, "state policies should also encourage professionals with knowledge of language learning processes to participate in the decision-making process" (p. 69). Although most state policy documents on accommodations from SY 2000–2001 did designate decision makers (e.g., principal, ESL/bilingual teacher, parent, local committee), many designations were quite vague. Vague designations, such as "school personnel" or "district officials," leave important personnel decisions up to districts.

The specification of a variety of decision makers is one way for those at the state level to ensure that a sound decision is made at the local level. However, states’ policies were often vague as to whether decision makers were to work as a coordinated unit or individually in sequence. Only 12 states specified that decision makers would work together as a team during the decision-making process.

In designating decision makers, states’ policies for SY 2000–2001 were shaped by a number of factors, including, no doubt, expedience and practicality. However, two particularly important factors emerge from analysis of the decision makers found in states’ policies: (1) awareness of learner characteristics, which informed the designation of individuals familiar with the students (e.g., classroom teacher, parent), and (2) concern over test validity, which informed the designation of staff familiar with the administration of tests for particular content areas (e.g., reading specialist). The most common approach was to designate those most familiar with the ELL’s academic work at the local level. However, decision makers who can provide insight on how to maintain the validity of the test should also be part of the team. Including the test coordinator, for example, is one way for those at the state level to ensure that validity concerns are specifically addressed by the local committee during the decision-making process.

Conclusion

Data presented in this section have shown that, taken as a whole, states’ policies for SY 2000–2001 addressed a number of key elements for the accommodation of ELLs on state assessments. First, a variety of accommodations (75) available to ELLs were listed in states’ policies. Of these accommodations, 44 were responsive to the needs of ELLs. The kinds of accommodations addressed in states’ policies varied and included those that addressed the linguistic needs of ELLs directly (in English or in ELLs’ native languages) and indirectly, through accommodations that adjust the test schedule or environment.

Second, data show that states’ policies addressed the process by which eligible ELLs are identified and given accommodations appropriate for their individual needs. The policies of 28 states identified criteria to be used in the decision-making process, and 22 identified personnel who should participate in this process.

Analysis of SY 2000–2001 data from state assessment policies also provides insight into the complexity of the task facing state policy makers attempting to employ accommodation as an inclusion strategy for ELLs. Although many important considerations for accommodating ELLs are addressed in the assessment policies examined, few states’ policies provided comprehensive guidance.

 

Part III: Recommendations

Findings from An Analysis of State Assessment Policies Addressing the Accommodation of English Language Learners during SY 2000–2001 highlight the complexity of the task facing policy makers who seek to provide guidance for the use of accommodations for ELLs. Yet, given the current federal mandate that states include ELLs in state assessment systems and the continued effort to provide equitable educational opportunities for ELLs, providing effective accommodations policy, and disseminating it widely to key stakeholders, is more important than ever.

NAEP’s efforts to include ELLs through the use of accommodations parallel those represented in the policies addressed in the current study. Like the states, if NAEP is to use accommodations effectively to fully represent "what America’s students know and can do," it must ensure that accommodations used on assessments give students access to the content of the test while preserving validity and score comparability. The best way to ensure the appropriate use of accommodations is to develop and apply a set of procedures for determining (a) which ELLs should take accommodated versions of the assessment and (b) which accommodations are appropriate for particular ELLs. To facilitate these decisions, recommendations are offered below for selecting particular accommodations and for further research.

Selecting Appropriate Accommodations

(1) Use an ELL-responsive framework as a tool for selecting appropriate accommodations for ELLs. Currently, NAEP policy uses the same accommodations taxonomy as that found in states’ policies: a taxonomy that was developed to classify accommodations for students with disabilities. Because the assessment needs of ELLs and SDs differ significantly, it is recommended that NAEP go further in distinguishing these two student groups. A framework premised on what is known about second language acquisition, such as the one used for this study of states’ policies, holds promise as an organizer of accommodations for ELLs. Such a framework would also communicate the unique linguistic needs of ELLs that should be considered by an assessment program. Table C-1 (Appendix C) classifies accommodations currently offered by NAEP into an ELL-responsive framework.

 

(2) Use accommodations that are responsive to ELLs. As described below, accommodations within the ELL-responsive framework meet the needs of ELLs with different backgrounds.

Accommodations providing direct linguistic support

  • Native language accommodations are most appropriate for students in early stages of learning English and students participating in native-language instruction.
  • English language accommodations are most appropriate for students receiving instruction in English in the content being tested.

 

Accommodations providing indirect linguistic support

  • Adjustments to test schedule. Accommodations that adjust the test schedule may help enhance ELLs’ performance on tests. However, there is no research directly supporting the use of these accommodations. It seems likely that, used in conjunction with direct linguistic support accommodations, extra time may help maximize the cognitive resources at ELLs’ disposal in a testing situation. More research is necessary before a firm recommendation can be made.
  • Adjustments to test environment. Like adjustments to test schedule, test environment accommodations may help maximize the cognitive resources at ELLs’ disposal in a testing situation. These accommodations are generally innocuous and are not considered a threat to score comparability; however, more research is necessary before a firm recommendation can be made.

 

(3) Use student background variables to inform selection of appropriate accommodations. Decisions regarding the use of accommodations should be made on an individual basis. The following suggestions can help decision makers determine which accommodations are appropriate for which students.

  • Develop and apply a consistent operational definition of English language learner (or LEP student). The federal definitions provided by IASA and NCLB define ELLs as individuals whose (A) language background is other than English and (B) level of English language proficiency negatively affects their ability to succeed academically.
  • Consider level of English language proficiency as determined by the state English language proficiency assessment.
  • Based on record review and school questionnaires, establish the extent to which the ELL has been instructed in the content being assessed by NAEP. In addition to helping match ELLs to appropriate accommodations, this information might be used to examine the effect of specific accommodations on students with different academic backgrounds.
  • Take into account the language of instruction when determining which accommodations are most appropriate for students.

 

(4) Use accommodations supported by research. Although research on accommodations for ELLs is inconclusive, two kinds of accommodations appear to hold promise: native language accommodations and linguistic simplification. In addition, combining specific direct linguistic support accommodations (i.e., bilingual glossaries) with specific indirect linguistic support accommodations (i.e., extra time) also appears to support ELLs’ performance on assessments.

Of the accommodations offering linguistic support found in states’ policies, English language accommodations far outnumber native language accommodations. Though research on the impact of native language accommodations on the assessment performance of ELLs is inconclusive, this result is most likely due to research designs that did not control for students’ prior schooling, in the native language, in the content area being tested. Clearly, students who have not been schooled in mathematics in their native language, for example, should not be accommodated in the native language on assessments of mathematics. Furthermore, research on the effectiveness of the native language in instructional contexts points to its usefulness as a tool for helping ELLs access content. Thus, for those students who have received schooling in the native language in a particular content area but are not yet proficient in English, the use of native language accommodations for that content area (for example, mathematics or science) may allow these students to demonstrate their understanding of the content knowledge being tested.

Research on the impact of linguistic simplification on ELLs’ assessment performance has resulted in some evidence that it is an effective accommodation for mathematics and possibly science. However, it should be kept in mind that this accommodation appears to be most useful for students at lower levels of English language proficiency. Further research needs to be conducted to examine the effects of this accommodation on ELLs who are at more advanced stages of English language proficiency.

Finally, combinations of direct and indirect linguistic support accommodations can support ELLs on state assessments. For example, there is a strong rationale for combining the use of bilingual glossaries with extra time to complete the assessment: if bilingual glossaries are to be used effectively, it is reasonable to expect the student to need extra time to access and read the glossary and to use it when decoding the test items.

Recommended Research

Given the importance and complexity of fairly and accurately assessing ELLs in NAEP, it is recommended that NAEP convene a panel composed of leading researchers of accommodations, second language acquisition researchers, state education agency (SEA) leaders involved in policy development around the assessment of ELLs, and practitioners. The panel would be charged with identifying at least two accommodations to be field tested for use on NAEP. The panel should also review and critique criteria for administering all NAEP accommodations. A panel of experts should be reconvened periodically to review the criteria for administering accommodations and the extent to which individual accommodations appear promising in supporting ELLs.

A final general recommendation can be made in the interest of practicality. As this discussion makes clear, research on the operational impact of accommodations is a complex matter: its goal is to determine which accommodations have a positive impact on ELLs’ test scores without threatening score comparability or validity. The research is made more complex by the challenge of selecting appropriate accommodations for each student or subject area. Because of the difficulty of conducting such research in the most discerning way, NAEP should consider conducting research on the comparability of scores for regular (non-ELL) students exposed to different direct and indirect linguistic support accommodations. Accommodations that pose no threat to score comparability could be viewed as innocuous and used routinely with ELLs. In time, the experience of teachers and test program administrators would provide evidence concerning which accommodations are most appropriate and effective for ELLs with certain academic and background characteristics. Such a research program, based on score comparability research using an ELL-responsive framework and non-ELLs as subjects, may be more practical than accommodations research using ELLs as subjects. Subsequent research could then be carried out to assess how and for which ELLs such accommodations improve test scores.

 

References

Abedi, J. (2001). Assessment and accommodations for English language learners: Issues and recommendations. (Policy brief 4). Los Angeles, CA: University of California, Center for the Study of Evaluation/National Center for Research on Evaluation, Standards, and Student Testing.

 

Abedi, J., Courtney, M., & Leon, S. (2003). Research-supported accommodation for English language learners in NAEP. (CSE Tech. Rep. No. 586). Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing.

 

Abedi, J., Courtney, M., Mirocha, J., Leon, S., & Goldberg, J. (in press). Language accommodations for English language learners in large-scale assessments: Bilingual dictionaries and linguistic modification. Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing.

 

Abedi, J., Hofstetter, C., Baker, E., & Lord, C. (2001). NAEP math performance and test accommodations: Interactions with student language background. (CSE Tech. Rep. No. 536). Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing.

 

Abedi, J., & Lord, C. (2001). The language factor in mathematics tests. Applied Measurement in Education, 14(3), 219–234.

 

Abedi, J., Lord, C., Boscardin, C. K., & Miyoshi, J. (2000). The effects of accommodations on the assessment of LEP students in NAEP. Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing.

 

Abedi, J., Lord, C., Boscardin, C.K., & Miyoshi, J. (2001). The effects of accommodations on the assessment of limited English proficient students in the National Assessment of Educational Progress. (Publication No. NCES 2001–13). Washington, DC: National Center for Education Statistics.

 

Abedi, J., Lord, C., & Hofstetter, C. (1998). Impact of selected background variables on students’ NAEP math performance. (CSE Tech. Rep. No. 478). Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing.

 

Abedi, J., Lord, C., & Plummer, J. R. (1997). Final report of language background as a variable in NAEP mathematics performance. (CSE Tech. Rep. No. 429). Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing.

 

Abedi, J., Lord, C., & Plummer, J.R. (1995). Language background as a variable in NAEP mathematics performance. Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing.

 

Adams, M.J. (1990). Beginning to read: Thinking and learning about print. Cambridge, MA: MIT Press.

 

Albus, D., Bielinski, J., Thurlow, M., & Liu, K. (2001). The effect of a simplified English language dictionary on a reading test. (LEP Project Rep. No. 1). Minneapolis, MN: University of Minnesota, National Center for Educational Outcomes.

 

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Psychological Association.

 

Anderson, M., Liu, K., Swierzbin, B., Thurlow, M., & Bielinski, J. (2000). Bilingual accommodations for limited English proficient students on statewide reading tests: Phase 2. (Minnesota Rep. No. 31). Minneapolis, MN: National Center on Educational Outcomes.

 

Blair, D., & Harris, R. (1981). A test of interlingual interaction in comprehension by bilinguals. Journal of Psycholinguistic Research, 10(4), 457–467.

 

Butler, F. A., & Stevens, R. (1997). Accommodation strategies for English language learners on large-scale assessments: Student characteristics and other considerations. (CSE Tech. Rep. No. 448). Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing.

 

Castellon-Wellington, M. (2000). The impact of preference for accommodations: The performance of ELLs on large-scale academic achievement tests. (CSE Tech. Rep. No. 524). Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing.

 

CCSSO. (1997). Annual survey of state assessment programs. Washington, DC: Author.

 

Celce-Murcia, M., & Larsen-Freeman, D. (1983). The grammar book: An ESL/EFL teacher’s course. Rowley, MA: Newbury House.

 

Cervantes, R. (1983). Say it again Sam: The effect of repetition on dictation scores. Unpublished term paper, University of Hawaii at Manoa.

 

Chaudron, C. (1988). Second language classrooms: Research on teaching and learning. New York: Cambridge University Press.

 

Dornic, S. (1979). Information processing in bilinguals: Some selected issues. Psychological Research, 40, 329–348.

 

Elliott, S. N., Kratochwill, T. R., & Schulte, A. G. (1998). The assessment accommodations checklist: Who, what, where, when, why, and how? Teaching Exceptional Children, 31(2), 10–14.

 

Ellis, N. (1996). Sequencing in SLA: Phonological memory, chunking, and points of order. Studies in Second Language Acquisition, 18, 91–126.

 

Ellis, N. C. (1999). Cognitive approaches to SLA. Annual Review of Applied Linguistics, 19, 22–42.

 

Ellis, N., & Schmidt, R. (1997). Morphology and longer distance dependencies: Laboratory research illuminating the A in SLA. Studies in Second Language Acquisition, 19, 145–171.

 

Ellis, N., & Sinclair, S. (1996). Working memory in the acquisition of vocabulary and syntax: Putting language in good order. The Quarterly Journal of Experimental Psychology, 49A, 234–250.

 

Forster, K. I., & Olbrei, I. (1973). Semantic heuristics and syntactic analysis. Cognition, 2, 319–347.

 

Garcia, T. (with del Rio Parent, L., Chen, L., Ferrara, S., Garavaglia, D., Johnson, E., Liang, J., Oppler, S., Searcy, C., Shieh, Y., & Ye, Y.). (2000, November). Study of a dual language test booklet in eighth grade mathematics: Final report. Washington, DC: American Institutes for Research.

 

Hafner, A. L. (2000). Evaluating the impact of test accommodations on test scores of LEP students and non-LEP students. Dover, DE: Delaware Department of Education.

 

Halliday, M. A. K., & Martin, J. R. (1993). Writing science: Literacy and discursive power. Pittsburgh, PA: University of Pittsburgh Press.

 

Hofstetter, C. H. (2003). Contextual and mathematics accommodation test effects for English language learners. Applied Measurement in Education, 16(2), 159–188.

 

Hoosain, R., & Salili, F. (1987). Language differences in pronunciation speed for numbers, digit span, and mathematical ability. Psychologia, 30(1), 34–38.

 

Jensen, E. D., & Vinther, T. (2003). Exact repetition as input enhancement in second language acquisition. Language Learning, 53(3), 373–428.

 

Jones, P.L. (1982). Learning mathematics in a second language: A problem with more and less. Educational Studies in Mathematics, 13, 269–287.

 

Kiplinger, V. L., Haug, C. A., & Abedi, J. (2000, June). A math assessment should test math, not reading: One state’s approach to the problem. Paper presented at the 30th annual National Conference on Large-Scale Assessment, Snowbird, Utah.

 

Laufer, B., & Hadar, L. (1997). Assessing the effectiveness of monolingual, bilingual, and “bilingualized” dictionaries in the comprehension and production of new words. The Modern Language Journal, 81, 189–196.

 

Long, M. (1980). Inside the ‘black box’: Methodological issues in classroom research on language learning. Language Learning, 30(1), 1–42.

 

Long, M. (1983). Native speaker/non-native speaker conversation in the second language classroom. In M. Clarke & J. Handscombe (Eds.), On TESOL ’82: Pacific perspectives on language learning and teaching. Washington, DC: TESOL.

 

Long, M. (1996). The role of the linguistic environment in second language acquisition. In W. C. Ritchie & T. K. Bhatia (Eds.), Handbook of second language acquisition (pp. 413–468). San Diego, CA: Academic Press.

 

Lutkus, A. D., & Mazzeo, J. (2003, February). Including special-needs students in the NAEP 1998 Reading Assessment. Part I: Comparison of overall results with and without accommodations. Jessup, MD: U.S. Department of Education.

 

Mack, M. (1986). A study of semantic and syntactic processing in monolinguals and fluent early bilinguals. Journal of Psycholinguistic Research, 15(6), 463–488.

 

McLaughlin, B. (1990). ‘Conscious’ versus ‘unconscious’ learning. TESOL Quarterly, 24, 617–634.

 

McLaughlin, B., Rossman, T., & McLeod, B. (1983). Second language learning: An information processing perspective. Language Learning, 33, 135–158.

 

Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13–103). New York: Macmillan.

 

Messick, S. (1994, December). Standards-based score interpretation: Establishing valid grounds for valid inferences. Princeton, NJ: Educational Testing Service.

 

Miyake, A., & Friedman, N. (1998). Individual differences in second language proficiency: Working memory as language aptitude. In A. F. Healy & L. E. Bourne (Eds.), Foreign language learning: Psycholinguistic studies on training and retention (pp. 339–364). Mahwah, NJ: Erlbaum.

 

National Center for Education Statistics (2003). What is NAEP? Retrieved April 28, 2003, from http://nces.ed.gov/nationsreportcard

 

National Center for Education Statistics (2004). NAEP inclusion policies. Retrieved January 10, 2004, from http://nces.ed.gov/nationsreportcard/about/inclusion.asp#accom_table

 

National Research Council. (1999a). Testing, teaching, and learning. Washington, DC: National Academy Press.

 

National Research Council. (1999b). High stakes: Testing for tracking, promotion, and graduation. Washington, DC: National Academy Press.

 

Naveh-Benjamin, M., & Ayres, T.J. (1986). Digit span, reading rate, and linguistic relativity. Quarterly Journal of Experimental Psychology, 38A, 739–751.

 

Olson, J., & Goldstein, A. (1997). The inclusion of students with disabilities and limited English proficient students in large-scale assessments: A summary of recent progress. Washington, DC: National Center for Education Statistics.

 

Pica, T., Lincoln-Porter, F., Paninos, D., & Linnell, J. (1996). Language learners’ interaction: How does it address the input, output, and feedback needs of language learners? TESOL Quarterly, 30(1), 59–84.

 

Rivera, C., & Stansfield, C. (1998). Leveling the playing field for English language learners: Increasing participation in state and local assessments through accommodations. In R. Brandt (Ed.), Assessing student learning: New rules, new realities (pp. 65–92). Arlington, VA: Educational Research Service.

 

Rivera, C., & Stansfield, C. (in press). The effects of linguistic simplification of science test items on the performance of limited English proficient and monolingual English-speaking students. Educational Assessment.

 

Rivera, C., Stansfield, C. W., Scialdone, L., & Sharkey, M. (2000). An analysis of state policies for the inclusion and accommodation of English language learners in state assessment programs during 1998–1999. Arlington, VA: The George Washington University Center for Equity and Excellence in Education.

 

Rivera, C., & Vincent, C. (1997). High school graduation testing: Policies and practices in the assessment of English language learners. Educational Assessment, 4(4), 335–355.

 

Robinson, P. (1995). Attention, memory, and the “noticing” hypothesis. Language Learning, 45(2), 283–331.

 

Robinson, P. (2001). Attention and memory during SLA. In C. Doughty & M. Long (Eds.), Handbook of research in second language acquisition. Oxford: Blackwell.

 

Schachter, P. (1983). On syntactic categories. Bloomington, IN: Indiana University Linguistics Club.

 

Shepard, L., Taylor, G., & Betebenner, D. (1998). Inclusion of limited-English proficient students in Rhode Island’s grade 4 mathematics performance assessment (CSE Tech. Rep. No. 486). Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing.

 

Sireci, S. G. (2003, December). Test accommodations for English language learners: A review of the literature. Presentation at the annual summit of the Office of English Language Acquisition, U.S. Department of Education, Washington, DC.

 

Sireci, S.G., Li, S., & Scarpati, S. (2002). The effects of test accommodations on test performance: A review of the literature. (CEA Research Rep. No. 485). Amherst, MA: School of Education, University of Massachusetts.

 

Skehan, P. (1998). A cognitive approach to language learning. Oxford: Oxford University Press.

 

Soares, C., & Grosjean, F. (1984). Bilinguals in a monolingual and a bilingual speech mode: The effect on lexical access. Memory and Cognition, 12(4), 380–386.

 

Stansfield, C. W. (2002). Linguistic simplification: A promising test accommodation for LEP students? Practical Assessment, Research & Evaluation, 8(7).

 

Swain, M. (1985). Communicative competence: Some roles of comprehensible input and comprehensible output in its development. In S. Gass & C. Madden (Eds.), Input in second language acquisition. Rowley, MA: Newbury House.

 

Swain, M. (1995). Three functions of output in second language learning. In G. Cook & B. Seidlhofer (Eds.), Principle and practice in applied linguistics: Studies in honour of H.G. Widdowson (pp. 125–144). Oxford: Oxford University Press.

 

Thurlow, M. L., McGrew, K. S., Tindal, G., Thompson, S. J., Ysseldyke, J. E., & Elliott, J. L. (2000). Assessment accommodations research: Considerations for design and analysis. (Tech. Rep. No. 26). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

 

Tindal, G., & Fuchs, L.S. (1999). A summary of research on test accommodations: An empirical basis for defining test accommodations. Lexington, KY: Mid-South Regional Resource Center.

 

U.S. Congress. (1994). Improving America’s Schools Act. Public Law 103-382. Washington, DC: Government Printing Office.

 

U.S. Congress. (2002). No Child Left Behind Act. Public Law 107-110. Washington, DC: Government Printing Office.

 

Valdes, G., & Figueroa, R. (1994). Bilingualism and testing: A special case of bias. Norwood, NJ: Ablex Publishing Company.

 

VanPatten, B. (1990). Attending to form and content in the input: An experiment in consciousness. Studies in Second Language Acquisition, 12(3), 287–301.

 

 

Appendix A

Overview of Accommodations Research

 

 

Table A-1. Studies examining the effectiveness of accommodations for ELLs
Study | Accommodation(s) | Content | ELLs | non-ELLs | Total | Grade(s)
Abedi (2003) | Linguistic simplification; dictionary/glossary | Science | 317 | 294 | 611 | 4, 8
Abedi, Courtney, & Leon (2003) | Dictionary/glossary; extra time | Math | 535 | 614 | 1,149 | 4, 8
Abedi, Hofstetter, Baker, & Lord (2001)a | Linguistic simplification; dictionary/glossary; extra time | Math | 501 | 445 | 946 | 8
Abedi & Lord (2001) | Linguistic simplification | Math | 372 | 802 | 1,174 | 8
Abedi, Lord, Boscardin, & Miyoshi (2001) | Dictionary/glossary | Science | 183 | 236 | 419 | 8
Abedi, Lord, & Hofstetter (1998)a | Linguistic simplification; native language | Math | 864 | 530 | 1,394 | 8
Abedi, Lord, & Plummer (1997) | Linguistic simplification | Math | 320 | 711 | 1,031 | 8
Albus, Bielinski, Thurlow, & Liu (2001) | Dictionary/glossary | Reading | 133 | 69 | 202 | middle school
Anderson, Liu, Swierzbin, Thurlow, & Bielinski (2000) | Native language; extra time | Reading | 105 | 101 | 206 | 8
Castellon-Wellington (2000) | Read aloud; extra time | Social science | 106 | 0 | 106 | 7
Garcia (2000) | Native language | Math | 320 | 82 | 402 | 8
Hafner (2000) | Native language; read aloud; extra time | Math | 82 | 288 | 370 | 4, 7
Hofstetter (2003) | Linguistic simplification; native language | Math | 676 | 173 | 849 | 8
Kiplinger, Haug, & Abedi (2000)a | Linguistic simplification; dictionary/glossary | Math | 152 | 1,046 | 1,198 | 4
Rivera & Stansfield (in press) | Linguistic simplification | Science | 109 | 11,306 | 11,415 | 4, 6
Note. In describing student samples, some researchers used the terms “LEP” and “non-LEP” rather than “ELL” and “non-ELL.”

a Because ELL and non-ELL sample Ns were not reported, values were calculated from percentages of the total sample.
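For instance, the following minimal sketch (the 53 percent ELL share is an illustrative figure inferred from the table row, not a value reported by the study) shows how such a back-calculation reproduces the Abedi, Hofstetter, Baker, & Lord (2001) sample sizes:

```python
# Back-calculating group Ns from a reported total N and an ELL percentage.
def group_ns(total_n: int, ell_pct: float) -> tuple:
    ell_n = round(total_n * ell_pct / 100)  # nearest whole student
    return ell_n, total_n - ell_n

# An assumed 53% ELL share of the reported N = 946 reproduces the
# table's 501 ELLs and 445 non-ELLs.
print(group_ns(946, 53))  # -> (501, 445)
```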

 

Discussion of Individual Studies

Each of the 15 studies examined one or more of the following types of accommodations: (1) linguistic simplification, (2) customized English dictionaries and glossaries (e.g., English-to-Spanish glossary, Spanish/English glossaries, simplified English glossaries, computer test with pop-up glossary), (3) use of the native language (e.g., dual language tests), (4) reading items and/or directions aloud, and (5) providing extra time in combination with other accommodations. These studies are discussed below in the context of the accommodation(s) examined in the study.

Linguistic Simplification

Abedi (2003) used NAEP science items to compare the performance of ELLs and non-ELLs tested with and without three accommodations: linguistically simplified items, a customized English dictionary, and an English-to-Spanish glossary. The student sample consisted of 611 fourth- and eighth-graders, of whom 317 were ELLs. ELLs and non-ELLs were randomly administered test items with no accommodation or with one accommodation. The results revealed that fourth-grade ELLs performed better on accommodated items than on non-accommodated items. (Abedi did not compare the performance of fourth-graders on items administered with different accommodations.) Among eighth-grade ELLs, Abedi reports that those taking the linguistically simplified items scored highest, while those given access to the bilingual dictionary scored lowest. In second and third position, respectively, were ELLs provided with an English dictionary and ELLs administered test items under standard conditions. The researcher concluded that “some of the accommodation strategies employed were effective in increasing the performance of ELL students and reducing the performance gap between ELL and non-ELL students” (p. xiii).

 

Abedi, Hofstetter, Baker, and Lord (2001) used NAEP mathematics items to examine four accommodations: (1) simplified linguistic structures, (2) glossaries, (3) extra time, and (4) extra time plus glossaries. “One of five test booklets [four under accommodated conditions, one under standard conditions] were administered randomly to eighth-grade students in intact math classrooms” (p. 16). The sample included 946 students, about half of whom were categorized as LEP students (primarily Spanish-speaking). For most students, regardless of LEP status, performance on NAEP math items was higher under all accommodated conditions. The authors concluded that the modified, or linguistically simplified, English version of the assessment in particular narrowed the score difference between LEP and non-LEP students, though this narrowing resulted less from improved LEP performance than from the fact that non-LEP students performed poorest on the modified version (Sireci et al., 2002).

 

Abedi and Lord (2001) simplified 20 NAEP eighth-grade mathematics items and randomly administered both the original and simplified items to 1,174 students, 372 of whom were designated ELL. The researchers found that linguistic simplification was beneficial to ELLs in the lowest-level mathematics classes as well as to non-ELLs of lower socio-economic status. In a subsequent analysis of students’ performance by mathematics achievement level, they found that the simplified items positively affected the performance of students in low- and average-level mathematics classes yet had a slightly negative effect on students in more advanced mathematics classes (i.e., algebra and honors mathematics classes). The researchers concluded that for students in the lowest levels of mathematics classes, certain linguistic features, such as unfamiliar words and passive verb constructions, appeared to contribute to the difficulty of text interpretation irrespective of ELL status.

Abedi, Lord, and Hofstetter (1998) compared the performance of LEP students and non-LEP students on mathematics items taken from the eighth-grade portion of the NAEP 1996 Mathematics State Assessment Program. The researchers constructed three test booklets: (1) the original English version, (2) a linguistically simplified version, and (3) a version translated into Spanish. The student sample consisted of 1,394 eighth graders, 864 of whom were designated LEP. Non-LEP students included 530 students classified as initially fluent in English (IFE) or fully English proficient (FEP). Test booklets were administered randomly to each student in the sample. Students designated as non-LEP performed better than LEP students across all three test booklets. LEP students scored highest on the simplified items, followed by the regular English test items, and lowest on the Spanish items. These findings led the researchers to reason “[that] the modified English accommodation enabled the LEP students to achieve scores most comparable to those of non-LEP students” (p. viii).

 

Abedi, Lord, and Plummer (1997) simplified 20 eighth-grade NAEP mathematics items considered linguistically complex and randomly administered original and simplified test items to 1,031 students, 320 of whom were designated as eligible for placement in an English as a Second Language (ESL) program. Findings from the study suggested that linguistic simplification was more effective for ELLs in lower-level mathematics classes than for ELLs in more advanced mathematics classes.

 

Hofstetter (2003) conducted an examination of “contextual factors, particularly at the classroom level, that influence Latino students’ performance on the NAEP mathematics assessment generally and by test accommodation” (p. 164). Her sample consisted of 849 eighth-grade students, 676 of whom were ELLs. Three test booklets were developed from eighth-grade NAEP mathematics items: (1) a non-accommodated test, (2) a linguistically simplified test booklet, and (3) a Spanish language test booklet. Each student was randomly administered one of the booklets. Results showed slightly higher performance for ELLs and non-ELLs on the linguistically simplified test than on the standard test booklet.

 

Kiplinger, Haug, and Abedi (2000) experimented with grade four items from the 1996 NAEP mathematics assessment. The test was designed to meet the specifications of the grade four mathematics Colorado State Assessment Program (CSAP). The researchers administered three test forms: a non-accommodated version, a simplified version, and a version with an English glossary containing definitions of non-technical words. Using matrix sampling, the test forms were administered randomly to a total sample of 1,198 fourth graders, of whom 152 were identified as ELLs and 156 as special education students. The researchers found no statistically significant difference in student performance across the three versions of the test, and no student group performed significantly better on any test form; they attributed this finding to the general difficulty of the test items. Nevertheless, with the exception of students with the lowest English proficiency, all students tended to benefit from the glossary and the simplified test conditions, performing best under the glossary condition. The researchers concluded that glossaries and linguistic simplification might benefit all students, and therefore should be used.

 

Rivera and Stansfield (in press) carried out a study in which either a simplified or a non-simplified science test was administered to eight groups of fourth and sixth graders, with approximately 1,400 students in each group.  Only 109 of the students were ELLs, and these were spread across all groups.  The linguistically simplified test items were randomly assigned through spiraling of test booklets on the Delaware Student Testing Program. Rivera and Stansfield found that linguistic simplification did not pose a threat to score comparability for monolingual English-speaking students. Unfortunately, the sample of LEP students (n = 109) was too small to permit broad generalization from students’ performance on the simplified and non-simplified versions of the test. Although the researchers found that some of the linguistically simplified items were slightly easier for LEP students to comprehend than the original items, the difference in difficulty was not statistically significant, owing to the lack of statistical power inherent in a small sample.
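To illustrate the power limitation Rivera and Stansfield describe, the sketch below runs a conventional two-sample power calculation. The even split of the 109 ELLs across forms and the small effect size (d = 0.2) are illustrative assumptions, not figures from the study:

```python
# Illustrative power analysis: with ~55 ELLs per test form, a small
# standardized effect (d = 0.2) is very unlikely to reach significance.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower().solve_power(
    effect_size=0.2,  # assumed small effect of linguistic simplification
    nobs1=55,         # assumed ELLs taking the simplified form
    ratio=1.0,        # assumed equal n on the non-simplified form
    alpha=0.05,
)
print(f"power = {power:.2f}")  # roughly 0.18, far below the usual 0.80 target
```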

Dictionaries and Glossaries

Abedi (2003) compared the effects of three accommodations, including commercially published English and bilingual dictionaries, on a science test built from NAEP fourth- and eighth-grade items. Although fourth-grade non-ELLs outperformed ELLs by approximately two points, ELL students in both the English and the bilingual dictionary conditions scored significantly higher than ELL students in the standard condition. Although an achievement gap between ELL and non-ELL student performance remained evident, eighth-grade ELLs scored highest under the linguistically simplified condition; ELLs under the English dictionary condition scored the next highest; students under the bilingual dictionary condition scored the lowest. Use of the dictionary did not affect the performance of non-ELLs, providing evidence that validity was not compromised.

Abedi (2003) concludes that these findings show that different accommodations may be effective at different grade levels: the dictionary conditions seemed to help fourth graders more than eighth graders, while linguistic simplification was more effective for eighth graders. Abedi speculates that this pattern is reasonable given the greater linguistic complexity of the science test at the higher grade.

 

Abedi, Courtney, and Leon (2003) compared the use of four accommodations on a mathematics assessment: a customized English dictionary, a computer test with pop-up glosses, extra time, and small-group testing. Accommodations were randomly distributed within intact classrooms to two student samples: 607 fourth-grade students (including 279 ELLs) and 542 eighth-grade students (including 256 ELLs). The fourth-grade students were tested under standard conditions and with all four accommodations. The eighth graders were administered the test under standard conditions and with two accommodations (customized dictionary and computer test with pop-up glosses).

Fourth-grade ELLs who took the computer test with pop-up glosses or who received extra time scored significantly higher than ELLs tested under standard conditions; the gain associated with the customized dictionary accommodation was not statistically significant.

For eighth-grade ELLs, performance on the computerized test with pop-up glosses was significantly higher than performance with the customized dictionary or under standard test conditions. At both grade levels, non-ELLs did not perform significantly better under any of the accommodated conditions than under the standard condition.

In interpreting their findings, Abedi, Courtney, and Leon (2003) noted that students who received the computerized test made extensive use of the pop-up glosses, which were activated simply by using the mouse to slide the on-screen pointer over a word. Despite the fact that “students expressed enjoyment of the computer delivery of the test,” Abedi and his colleagues cautioned that the use of computers for testing may prove logistically challenging. The researchers also noted that few students provided with a customized English dictionary indicated that they had availed themselves of this accommodation.

 

Abedi, Hofstetter, Baker, and Lord (2001) (described above under linguistic simplification) examined, among several other accommodations for eighth graders, the effect of glossaries alone and of glossaries with extra time. The glossary consisted of “brief explanations” of potentially difficult terms that were written specifically for the test and printed in the margin of the test booklet alongside relevant test items (p. 22). Participants in the study were taking eighth-grade math, pre-algebra, or algebra/integrated math. Overall, the findings indicated that for all students, the “most effective form of accommodation was the standard test booklet with Glossary plus Extra Time” (p. 54). While LEP students’ overall performance was lower than that of non-LEP students by five to six points, LEP students appeared to benefit from all accommodations, with glossary plus extra time being of most benefit. Glossary plus extra time was also the most beneficial accommodation for non-LEP students. Glossary alone was least beneficial for ELLs and moderately beneficial for non-ELLs.

 

Abedi, Lord, Boscardin, and Miyoshi (2001) administered 20 original NAEP science items to a sample of 422 eighth-grade students, including 183 ELLs (158 of whom were identified as Hispanic). The researchers developed three test booklets that were randomly distributed to students: (1) a standard test booklet, (2) a test booklet with a customized English dictionary appended to the end of the booklet, and (3) a booklet containing bilingual marginal glosses. The entries in the customized dictionary were excerpted from a published dictionary and included only words found on the test. The English and Spanish marginal glosses provided definitions or explanations of key terms in the test; English glosses appeared in the right margins, and Spanish translations of these glosses appeared in the left margins.

Non-ELLs performed similarly under accommodated and unaccommodated conditions, indicating that the accommodations did not affect the construct tested. ELLs performed better under all accommodated conditions than under the unaccommodated condition, but performed significantly better only under the English dictionary condition. The mean for students under the English and Spanish glossary conditions was nearly the same as the mean for the non-accommodated test, indicating that this accommodation was of minimal benefit.

 

Albus et al. (2001) sampled 202 middle school students, two-thirds of whom (133) were native Hmong speakers. The researchers examined the impact on students’ reading performance of using a “simplified English dictionary.” The dictionary was commercially published and designed for ELLs. Four reading passages designed to match the Minnesota Basic Standards Reading Test were administered to ELLs (n = 133) and non-ELLs (n = 69). On two of the passages, students were allowed to use the dictionary. For students who self-reported dictionary use, no significant differences in reading comprehension were found for either the experimental or control students (LEP or non-LEP). However, LEP students at the self-reported intermediate level of English proficiency showed a moderately significant gain. Overall, the researchers found that students with intermediate levels of English proficiency can make better use of an English dictionary than can students at lower levels of English proficiency.

 

Kiplinger et al. (2000) (described above under linguistic simplification) examined the use of glossaries (as well as linguistic simplification) on NAEP math items. Glossaries were written by the researchers and consisted of short explanations of words considered “unnecessarily difficult” for ELLs (or students with disabilities). Glosses were placed directly on the test booklet near the relevant terms. Kiplinger et al. found that fourth-grade ELLs with intermediate or higher English language proficiency performed better using a glossary on a mathematics test. However, when test difficulty was controlled, the researchers found that “all but the most limited English proficient students, including students with disabilities, performed best on the Glossary form of the test” (p. 12).

Native Language

Abedi, Lord, and Hofstetter (1998) (described above under linguistic simplification) compared performance on the original English version of a mathematics test with performance on a Spanish version of the same test. Test booklets with eighth-grade NAEP mathematics items were randomly assigned to 1,394 students, 864 of whom were LEP students. Overall, students (LEP and non-LEP) performed best under the linguistically modified condition, followed by the standard condition, and least well under the Spanish language condition. In general, the performance of non-LEP students was higher than that of LEP-designated students. LEP students performed somewhat better under standard conditions than under the Spanish language condition.

In considering the study findings, it is important to note that the sample was not explicitly delineated. That is, it is not clear from the study whether only Spanish-speaking students, or also students from Asian and other language backgrounds, were assigned the Spanish language accommodation. In addition, the researchers acknowledge that data on students’ levels of Spanish language proficiency were not available; such data are essential background information for targeting an appropriate native language accommodation. Also, the authors note that most students in the study were receiving mathematics instruction in English, not Spanish. The poor performance of students on the Spanish test led the researchers to conclude that “the language of instruction is an important consideration in identifying suitable test accommodations for LEP students” (pp. 28–29).

 

Anderson, Liu, Swierzbin, Thurlow, and Bielinski (2000) conducted a study to examine the effects of providing a translated version of test questions. The test was intended to approximate an English language reading comprehension test based on the Minnesota Basic Standards Assessment. While not noted or examined as a separate accommodation, the researchers also provided an audiotaped Spanish rendition of the test directions and questions. (This was the same type of accommodation offered on the state mathematics assessment.) Also, while not explicitly allowed as an accommodation, test takers were provided with extra time if they appeared to need it (i.e., the test was scheduled for two hours, but students appearing to need extra time were offered additional time to complete it).

A group of 206 eighth graders participated in the study. The main content of the test, presented in the form of four reading passages, was provided only in English. Students were divided into three groups: an accommodated ELL group (n=53), a non-accommodated ELL group (n=52), and a control group of general education students (n=101). As in other studies, ELL performance levels were below those of the general population of students. Overall, Spanish-speaking students did not benefit from the provision of instructions and test questions in Spanish. However, it is important to note that the level of Spanish language proficiency was not controlled for in the student population tested. Also, students in the study had not received academic instruction in Spanish.

 

Garcia (2000) studied the effects of a bilingual Spanish/English, or dual language, test. Garcia’s sample consisted of 402 eighth graders delineated as follows: (1) non-ELLs, or native English speakers (n=82), (2) native Spanish speakers who had received three or more years of instruction in English (n=193), and (3) native Spanish speakers who had received less than three years of academic instruction in English (n=127). The researchers administered two versions of NAEP mathematics items: a Spanish/English bilingual version and an English-only version. Students in the first group (native English speakers) received the standard, English-only test booklet; students in the second group (native Spanish speakers with three or more years of English instruction) were randomly assigned either the standard or the dual-language version of the test; and students in the third group received only the dual-language version.

As part of a post-assessment cognitive lab, students reported having found the dual language test booklet useful. Yet the extent to which students actually utilized the two languages represented in the test booklet varied. Findings indicated that students using the dual language test booklet were likely to rely on one language: students with less than three years of English instruction tended to use Spanish exclusively, while those with three or more years of instruction used the Spanish items as a way to cross-check their understanding of an item. After controlling both for English proficiency and for the language used to respond to test items, the researchers found no differences in mathematics test performance across the English and dual language test booklets.

Students less proficient in English performed better on the dual language test booklet. Native Spanish speakers who had received instruction in English for three or more years did not perform better on the dual-language test than they did on the standard test; in fact, their performance was slightly worse on the dual-language version. Although all students were allowed extra time, only slightly over four percent (n=17 of 402) utilized the option. The outcome for Spanish speakers instructed in English for three or more years suggests that extended instruction in English and level of English language proficiency were factors affecting performance. The researchers also concluded that the outcome “indicated psychometric equivalence between the dual language and English-only test booklets” (p. 6). In other words, the dual language booklet did not pose a threat to test validity.

 

Hafner (2000) studied “extended oral presentation” in the native language, which, at the test administrator’s discretion, included simplifying test directions, re-reading test directions, providing additional examples, or reading directions in a student’s native language. However, because Hafner did not track the particular aspects of “oral presentation,” it is not possible to determine the effect of reading directions in a student’s native language as a distinct accommodation. (Further discussion of this study may be found in the section “Reading Test Items or Directions Aloud” below.)

 

Hofstetter (2003) (described above under linguistic simplification) also examined the performance of students taking a Spanish translation of NAEP mathematics items. Results showed that, generally, students taking the Spanish version of the test scored slightly lower than students taking the standard booklet. However, Hofstetter notes that students who took the Spanish booklet and also received math instruction in Spanish performed better than students who received math instruction in Spanish but took the standard version of the test. She concluded that this provides “strong evidence that students perform better when the language of the mathematics test matches the students’ language of instruction” (p. 183).

 

Reading Test Items or Directions Aloud

Castellon-Wellington (2000) examined the effect on scores of reading test items aloud on the seventh-grade social studies test of the Iowa Tests of Basic Skills (ITBS). The sample consisted solely of ELLs (n = 106).

Castellon-Wellington provided a read-aloud accommodation that involved oral repetition of the actual text and did not involve any form of simplification. For the study, seventh-grade ELLs were offered a choice between receiving extra time to complete a test or having the items read aloud. First, students took a form of the ITBS under standard conditions. Next, they were asked which accommodation they preferred for a retest (i.e., oral presentation or extra time). The allocation of accommodations was split into thirds: one third of the sample received the accommodation of their preference, another third received the accommodation not preferred, and the final third received an accommodation at random. The data indicated that neither preferred nor non-preferred accommodations benefited the performance of ELLs.
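As a concrete illustration of this three-way allocation, the following sketch (hypothetical code, not from the study) splits a roster of students, each with a stated preference, into the preferred, non-preferred, and random conditions:

```python
import random

# Hypothetical sketch of the three-way allocation described above:
# one third of students receive their preferred accommodation, one
# third the non-preferred one, and one third a random accommodation.
ACCOMMODATIONS = ("read_aloud", "extra_time")

def allocate(students_with_prefs):
    """students_with_prefs: list of (student_id, preferred_accommodation)."""
    shuffled = random.sample(students_with_prefs, len(students_with_prefs))
    third = len(shuffled) // 3
    assignments = {}
    for i, (student, preferred) in enumerate(shuffled):
        non_preferred = next(a for a in ACCOMMODATIONS if a != preferred)
        if i < third:
            assignments[student] = preferred        # preferred condition
        elif i < 2 * third:
            assignments[student] = non_preferred    # non-preferred condition
        else:
            assignments[student] = random.choice(ACCOMMODATIONS)  # random
    return assignments

# e.g., a 106-student sample yields thirds of roughly 35 students each
roster = [(f"s{i}", random.choice(ACCOMMODATIONS)) for i in range(106)]
print(list(allocate(roster).items())[:3])
```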

According to Castellon-Wellington, this accommodation is potentially beneficial because “some students may be more prone to respond to both visual and oral stimuli than … to visual stimuli alone” (2000, p. 3). She also points out that the provision of read-aloud accommodations can be accomplished without the burden of providing additional testing materials or modifying existing materials.

 

Hafner (2000) studied the provision of extended oral presentation and extra time. The study examined reading test directions aloud for the mathematics component of the TerraNova for fourth- and seventh-grade students. Hafner’s sample included 82 ELLs and 288 non-ELLs; approximately a third of the students also were designated as students with disabilities. Oral presentation included, at the administrator’s discretion, simplifying test directions, re-reading test directions, providing additional examples, or reading directions in a student’s native language. The data on TerraNova mathematics items for fourth-grade (N=248) and seventh-grade (N=122) students indicated that those who received extra time had significantly higher scores than those who had been provided with extended oral presentation of directions only, regardless of ELL status. Hafner did not consider the interaction of LEP status and accommodation condition.

Extra Time

Abedi, Courtney, and Leon (2003) (discussed above under dictionaries and glossaries) compared the use of extra time on a mathematics assessment along with three other accommodations: a customized English dictionary, a computer test with pop-up glosses, and test administration in small groups. Accommodations were randomly distributed within intact classrooms to fourth- and eighth-grade students. The fourth graders were the only group to be administered a test with extra time or with extra time and glossary. Fourth-grade ELLs performed better with extra time than they did under most other conditions: standard condition, customized dictionary, or small group administration. However, they performed less well with extra time than when given the test on a computer and provided pop-up glosses.

 

Abedi, Hofstetter, Baker, and Lord (2001) (discussed above under linguistic simplification and dictionaries and glossaries) used NAEP mathematics items to examine the effect of extra time, either as a single accommodation or paired with the use of a glossary. Other accommodations examined included linguistic simplification and the use of glossaries without extra time. The sample included 946 students, about half of whom were categorized as LEP students (primarily Spanish-speaking). Overall, students (LEP and non-LEP) performed best when given extra time along with a glossary, and next best when given extra time only; this trend also held for LEP students. The effect of extra time alone on ELLs’ performance, however, was inconclusive.

 

Anderson, Liu, Swierzbin, Thurlow, and Bielinski (2000) (discussed above under native language) conducted a study to examine the effects of providing a translated version of test questions. Although it was not explicitly allowed as an accommodation, test takers were provided with extra time in addition to the native language accommodation if they appeared to need it (i.e., the test was scheduled for two hours, but students appearing to need extra time were offered additional time to complete it).

 

Castellon-Wellington (2000) (identified above as examining the effect of reading test items or directions aloud) also examined the use of extra time for seventh-grade ELLs, who were offered a choice between receiving extra time to complete a test or having the items read aloud. One third of the sample received the accommodation of their preference, another third received the accommodation not preferred, and the final third received an accommodation at random. The data indicated that neither accommodation benefited the performance of ELLs, even when the accommodation was preferred.

 

Hafner (2000) studied the provision of extra time alone and of extra time along with extended oral presentation. The data indicated that on mathematics items of the TerraNova, fourth-grade (N=248) and seventh-grade (N=122) students who received extra time had significantly higher scores than those who had been provided with extended oral presentation of directions only. As Sireci, Li, and Scarpati (2002) observe, however, Hafner did not consider whether the accommodations were more beneficial to ELLs in particular.

 

Appendix B

Data from States’ Assessment Policies

for SY 2000–2001

 

 

Table B-1. States’ use of SD-responsive taxonomy for classifying accommodations for ELLs
State | Addressed use of accommodations for ELLs | Used SD-responsive taxonomy to classify accommodations
AK
AL ü ü
AR ü
AZ ü
CA ü ü
CO ü ü
CT ü
DC ü ü
DE ü ü
FL ü ü
GA
HI ü
IA ü
ID
IL
IN ü
KS ü ü
KY ü
LA ü ü
MA ü
MD ü ü
ME ü ü
MI ü
MN ü ü
MO ü
MS ü ü
MT ü ü
NC ü
ND ü ü
NE ü ü
NH ü ü
NJ ü
NM ü
NV ü ü
NY ü
OH ü
OK ü ü
OR ü ü
PA ü ü
RI ü ü
SC ü
SD ü ü
TN ü
TX ü
UT ü ü
VA ü ü
VT ü ü
WA ü ü
WI ü
WV ü ü
WY ü ü
Total 47 28

 

 

Table B-2. States’ policies listing accommodations according to student groups, SY 2000–2001
State | Addressed use of accommodations for ELLs | Listed accommodations | Listed ELL and SD accommodations together | Listed ELL accommodations separately
AK
AL ü ü ü
AR ü ü ü
AZ ü ü ü
CA ü ü ü
CO ü ü ü
CT ü ü ü
DC ü ü ü
DE ü ü ü
FL ü ü ü
GA
HI ü ü ü
IA ü
ID
IL
IN ü ü ü
KS ü ü ü
KY ü ü ü
LA ü ü ü
MA ü ü ü
MD ü ü ü
ME ü ü ü
MI ü ü ü
MN ü ü ü
MO ü ü ü
MS ü ü ü
MT ü ü ü
NC ü ü ü
ND ü ü ü
NE ü ü ü
NH ü ü ü
NJ ü ü ü
NM ü ü ü
NV ü ü ü
NY ü ü ü
OH ü ü ü
OK ü ü ü
OR ü ü ü
PA ü ü ü
RI ü ü ü
SC ü ü ü
SD ü ü ü
TN ü ü ü
TX ü ü ü
UT ü ü ü
VA ü ü ü
VT ü ü ü
WA ü ü ü
WI ü ü ü
WV ü ü ü
WY ü ü ü
TOTAL 47 46 18 28

 

 

 

Table B-3. Content areas for which accommodations were designated in states’ policies for SY 2000–2001
State | Policy explicitly addressed content | ELA/Literature: Reading | ELA/Literature: Writing | Math | Science | Social Studies | Other
AK
AL ü ü ü ü ü ü
AR ü ü ü ü
AZ ü ü ü ü
CA ü ü ü ü ü ü
CO ü ü ü ü ü
CT
DC
DE ü ü ü ü ü ü
FL ü ü ü ü
GA
HI ü ü ü ü
IA
ID
IL
IN ü ü ü ü ü
KS ü ü
KY
LA ü ü ü ü ü ü
MA ü ü ü ü ü
MD ü ü ü ü ü ü
ME ü ü ü ü ü ü ü
MI ü ü ü ü ü
MN ü ü ü ü
MO ü ü ü ü
MS ü ü ü ü ü ü
MT ü ü ü ü ü ü
NC ü ü ü ü ü ü ü
ND ü ü ü ü ü ü
NE ü ü
NH ü ü ü ü ü ü
NJ
NM
NV ü ü ü ü ü
NY ü ü ü ü ü ü
OH ü ü ü ü ü ü
OK ü ü ü ü
OR ü ü ü ü
PA ü ü ü ü
RI ü ü ü ü ü
SC ü ü
SD
TN ü ü ü ü ü ü
TX ü ü ü ü ü ü
UT ü ü ü ü ü ü
VA
VT ü ü
WA ü ü ü ü ü
WI ü ü
WV
WY ü ü ü ü
TOTAL 38 35 32 34 22 18 4

 

Table B-4. SY 2000–2001 Accommodations Designated for ELLs in States’ Policies, Listed within SD-Responsive Taxonomy (N = 75 accommodations)
I. Timing/Scheduling (N=5)

1.  Test time increased

2.  Breaks provided during test sessions

3.  Test schedule extended

4.  Subtests flexibly scheduled

5.  Test administered at time of day most beneficial to test-taker

II. Setting (N=17)

1.    Test individually administered

2.    Test administered in small group

3.    Test administered in location with minimal distraction

4.    Test administered in familiar room

5.    Test-taker tested in separate location (or carrel)

6.    Test administered in ESL/Bilingual classroom

7.    Individual administration provided outside school (home, hospital, institution, etc.)*

8.    Test-taker provided preferential seating

9.    Increased or decreased opportunity for movement provided *

10. Teacher faces test-taker

11. Special/appropriate lighting provided*

12. Adaptive or special furniture provided*

13. Adaptive pencils provided*

14. Adapted keyboards provided*

15. Person familiar with test-taker administers test

16. ESL/bilingual teacher administers test

17. Additional one-to-one support provided during test administration in general education classroom (e.g. instructional assistant, special test administrator, LEP staff, etc.)

III. Presentation (N = 35)

1.       Directions repeated in English

2.       Directions read aloud in English

3.       Audio-taped directions provided in English

4.       Key words or phrases in directions highlighted

5.       Directions simplified

6.       Audio-taped directions provided in native language

7.       Directions translated into native language

8.       Cues provided to help test-taker remain on task*

9.       Directions explained/clarified in English

10.    Directions explained/clarified in native language

11.    Both oral and written directions in English provided

12.    Written directions provided in native language

13.    Oral directions provided in native language

14.    Test items read aloud in English

15.    Test items read aloud in simplified/sheltered English

16.    Audio-taped test items provided in English

17.    Audio-taped test items provided in native language


18.    Test items read aloud in native language

19.    Audio-taped test items provided in native language

20.    Assistive listening devices, amplification, noise buffers, appropriate acoustics provided*

21.    Key words and phrases in test highlighted

22.    Words on test clarified (e.g. words defined, explained)

23.    Word lists (mono- or dual-language dictionaries or glossaries) provided

24.    Enlarged print, magnifying equipment, Braille provided*

25.    Memory aids, fact charts, list of formulas and/or research sheets provided*

26.    Templates, masks, or markers provided*

27.    Cues (e.g., arrows and stop signs) provided on answer form*

28.    Acetate shield for page provided*

29.    Colored stickers or highlighters for visual cues provided*

30.    Augmentative communication systems or strategies provided (e.g. letter boards, picture communication systems, voice output systems, electronic devices)*

31.    Simplified/sheltered English version of test provided

32.    Side-by-side bilingual versions of the test provided

33.    Translated version of test provided

34.  Test interpreted for the deaf or hearing impaired/use of sign language provided*

35.  Electric translator provided*

IV. Response (N=16)

1.       Test-taker marks answers in test booklet*

2.       Test administrator transfers test-taker’s answers*

3.       Test-taker’s transferred responses checked for accurate marking*

4.       Copying assistance provided between drafts*

5.       Test-taker types or uses a machine to respond (e.g., typewriter/word processor/computer)*

6.       Test-taker indicates answers by pointing or other method*

7.       Papers secured to work area with tape/magnets*

8.       Mounting systems, slant boards, easels provided to change position of paper, alter test-taker’s position*

9.       Physical assistance provided*

10.    Enlarged answer sheets provided*

11.    Alternative writing systems provided (including portable writing devices, computers and voice activated technology)*

12.    Test-taker verifies understanding of directions

13.    Test-taker dictates or uses a scribe to respond in English*

14.    Test-taker responds on audio tape in English*

15.    Test-taker responds in native language

16.    Spelling assistance, spelling dictionaries, spell/grammar checker provided

V. Other (N=2)

1.  Out-of-level testing provided*

2.  Special test preparation provided

* Accommodations appropriate only to SDs (n = 31)

 

 

Table B-5. ELL-Responsive accommodations (N=44)

Direct Linguistic Support (N=27)

Accommodations Provided in Native Language

Written translation

1.        Word lists (mono- or dual-language dictionaries or glossaries) provided

2.        Written directions provided in native language

3.        Side-by-side bilingual versions of the test provided

4.        Translated version of test directions and/or items provided

Scripted oral translation

5.        Oral directions provided in native language

6.        Audio-taped directions provided in native language

7.        Audio-taped test items provided in native language

Sight translation

8.        Directions explained/clarified in native language

9.        Test items read aloud in native language

10.     Directions translated into native language

Student responds in native language

11.     Written response in native language translated into English

12.     Oral response in native language translated into English

Accommodations provided in English

Simplification

1.        Directions simplified

2.        Test items read aloud in simplified/sheltered English

3.        Simplified/sheltered English version of test provided

Repetition

4.        Directions read aloud in English

5.        Test items read aloud in English

6.        Directions repeated in English

7.        Oral and written directions in English provided

8.        Audio-taped directions provided in English

9.        Audio-taped test items provided in English

10.     Key words or phrases in directions highlighted

11.     Key words and phrases in test highlighted

Clarification

12.     Directions explained/clarified in English

13.     Words on test clarified (e.g. words defined, explained)

14.     Spelling assistance, spelling dictionaries, spell/grammar checker

15.   Test-taker verifies understanding of directions

Indirect Linguistic Support (N=17)

Accommodations Involving Adjustment of Test Schedule

1.        Test time increased

2.        Test schedule extended

3.        Subtests flexibly scheduled

4.        Test administered at time of day most beneficial to test-taker

5.        Breaks during test sessions

Accommodations Involving Adjustment of Test Environment

6.        Test individually administered

7.        Test administered in small group

8.        Teacher faces test-taker

9.        Test administered in location with minimal distraction

10.     Test-taker provided preferential seating

11.     Test-taker tested in separate location (or carrel)

12.     Special test preparation provided

13.     Person familiar to test-taker administers test

14.     ESL/bilingual teacher administers the test

15.     Additional one-to-one support during test administration in general education classroom (e.g. instructional assistant, special test administrator, LEP staff, etc.)

16.     Test administered in familiar room

17.     Test administered in ESL/Bilingual classroom

 

 

Note for Reading Tables B6–B9

States’ assessment policies often, but not always, specified which accommodations were allowed or prohibited for use with specific content area tests (e.g., English language arts [ELA], mathematics, science). However, because states’ policies were not consistent in specifying the subject matter for which specific accommodations were allowed, the research team coded accommodations as follows:

  • allowed for at least one content area (A),
  • prohibited for some content areas (PS), or
  • prohibited for all content areas (PA).

This coding does not transparently identify the content areas for which accommodations were available. First, the present coding does not allow for a distinction between states that allowed a particular accommodation for all content areas and those that allowed it for only one content area. Second, some overlap between (A) and (PS) is inevitable: it may be assumed that a state’s policy that prohibited an accommodation for some content areas (PS) allowed that accommodation for at least one content area (A). However, such determinations often required information beyond that provided by the policies themselves. Because the goal of this study was to present only information explicitly represented in states’ policies, the research team chose not to code implied references to the allowance or prohibition of accommodations for particular content areas.

Despite the limitations of this approach, the research team felt that the coding used throughout this study provides the clearest possible picture of the extent to which states’ policies took content area into consideration when listing accommodations. To clarify the implications of this picture, it should be noted that, in general, accommodations prohibited in some content areas (PS) were prohibited for English language arts (ELA) but allowed for mathematics (and often science).
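A minimal sketch of this coding rule follows (the function and data structures are hypothetical, introduced only to make the A/PS/PA logic explicit):

```python
# Hypothetical sketch of the A / PS / PA coding described above. A policy
# entry records the content areas for which an accommodation is explicitly
# allowed or prohibited; a policy that is silent is left uncoded.
ALL_AREAS = {"ELA", "math", "science", "social studies"}

def code_accommodation(allowed: set, prohibited: set):
    if prohibited >= ALL_AREAS:
        return "PA"  # prohibited for all content areas
    if prohibited:
        return "PS"  # prohibited for some content areas
    if allowed:
        return "A"   # allowed for at least one content area
    return None      # policy silent

# e.g., an accommodation prohibited only for ELA codes as PS
print(code_accommodation(allowed={"math"}, prohibited={"ELA"}))  # -> PS
```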

 

 

Table B-6. Number of states’ policies allowing or prohibiting native language accommodations for SY 2000–2001

State | Policy addressed native language accommodation | Written translation: word lists (mono- or dual-language) provided; written directions provided; side-by-side dual-language versions of the test provided; translated version of the test provided | Scripted oral translation: oral directions provided in native language; audio-taped test directions provided in native language; audio-taped test items provided in native language | Sight translation: directions translated into native language; directions explained/clarified in native language; test items read aloud in native language; interpreter or sight translator provided | Response: student responds in native language
AK
AL ü A A PA A
AR
AZ ü A A A A PS A
CA ü A A A A
CO ü A A PA A
CT ü A A PS
DC ü A
DE ü PS A A A A A A A A
FL ü A A A A
GA
HI
IA
ID
IL
IN ü PA PA PA PA PA PA
KS ü A A A A
KY ü A A A A
LA ü A
MA ü A A A
MD ü A
ME ü A A A A A
MI
MN ü A A A A A A
MO
MS ü A
MT ü A A A PS A
NC ü A A PA PA PA PA
ND ü
NE ü A A A
NH ü A A A A
NJ ü PS PS PA
NM ü PS
NV ü PS
NY ü PS A A A PS A
OH ü A A A A
OK ü A A A PS
OR ü A A A A
PA ü A A A
RI ü PS PS A A
SC ü A A A
SD ü A A A A A A
TN ü PA
TX ü A A PA
UT ü A A A A A A PS A
VA ü A
VT ü A A A
WA ü A A PA
WI ü A A A
WV ü A PA PA
WY ü A A A A
Total A 23 2 4 12 3 8 1 26 10 13 10 8
Total PS 4 0 0 2 0 1 0 1 0 4 2 0
Total PA 1 0 2 2 0 0 0 2 2 6 2 1
Total states 42 29 3 6 16 3 9 1 29 12 23 14 9

 

 

Table B-7. Number of states’ policies allowing or prohibiting accommodations involving simplification of text for SY 2000–2001
State | Policy addressed simplification | Test directions simplified | Test items read aloud in simplified/sheltered English | Simplified/sheltered version of test provided
AK
AL
AR
AZ ü A
CA ü A
CO ü PA
CT
DC ü A
DE ü A A
FL
GA
HI
IA
ID
IL
IN ü PA
KS ü A
KY
LA
MA
MD
ME ü A
MI
MN
MO
MS
MT
NC ü PA
ND
NE ü A A
NH
NJ
NM
NV
NY
OH ü A
OK
OR ü A
PA ü A
RI ü A
SC ü A
SD ü A
TN
TX
UT ü A
VA ü A
VT
WA
WI
WV ü A
WY
Total A 14 2 2
Total PS 0 0 0
Total PA 3 0 0
Total states 19 17 2 2

 

Table B-8. Number of states’ policies allowing or prohibiting accommodations involving repetition of text for SY 2000–2001
State | Policy addressed repetition | Oral in-person: directions read aloud in English; items read aloud in English; directions repeated in English; both oral and written directions provided | Audio-taped: audio-taped directions provided in English; audio-taped test items provided in English | Highlighted: key words or phrases in directions highlighted; key words and phrases in test highlighted
AK
AL ü PS PS
AR
AZ ü A
CA ü A A A A A A A
CO ü A PS A PA PA
CT ü A
DC ü A A
DE ü A A A
FL ü PS
GA
HI ü PS
IA
ID
IL
IN A PS
KS ü PS A
KY ü A A
LA ü PS PS A
MA
MD ü PS A A A PS
ME ü A A A
MI
MN ü A A A A
MO ü PS
MS ü A PS PS A
MT ü A PS A
NC ü PS PS PS A
ND ü A PS PS PS A
NE ü A A A
NH ü A PS A
NJ
NM
NV
NY
OH ü A A
OK ü A PS A A
OR ü A PS A A A
PA ü A PS A A A
RI ü A PS A
SC ü A A A
SD ü A A
TN ü PA PA
TX ü PS
UT ü A PS A A A A
VA ü A A
VT ü A A
WA ü A PS A A
WI
WV ü A PS
WY ü A PS A
Total A 22 9 19 5 9 3 9 2
Total PS 3 22 2 0 2 2 0 0
Total PA 1 1 0 0 0 0 1 1
Total states 37 26 32 21 5 11 5 10 3

 

 

Table B-9. Number of states’ policies allowing or prohibiting accommodations involving clarification of text for SY 2000–2001
State | Policy addressed clarification | Directions explained/clarified in English | Words on test clarified (e.g., words defined, explained) | Spelling assistance, spelling dictionaries, spell/grammar checker provided | Test-taker verifies understanding of directions
AK
AL ü PA
AR
AZ ü A
CA ü A A
CO ü PA PA
CT ü A
DC
DE ü A
FL ü A A PS
GA
HI ü A
IA
ID
IL
IN ü
KS
KY ü A A
LA ü
MA
MD ü PA
ME ü A
MI
MN ü A
MO
MS ü A A
MT ü PS
NC ü PA
ND
NE ü A
NH ü A
NJ
NM
NV ü PS PS
NY
OH
OK ü A A
OR ü A
PA ü PS A
RI ü PS PS
SC ü A A
SD
TN
TX ü PA
UT ü A A A
VA
VT ü A
WA ü A
WI
WV
WY ü PS A PS
Total A 15 3 6 5
Total PS 4 1 4 0
Total PA 2 1 2 0
Total states 30 21 5 12 5

 

 

 

Table B-10. Frequency with which indirect linguistic support accommodations were allowed or prohibited in states’ policies

Key: ✓ = policy addressed indirect linguistic support; A = allowed; PS and PA = categories of prohibition (see totals rows).

Columns: State; Policy addressed indirect linguistic support; then seventeen accommodations grouped by type:
Test schedule: Test time increased; Test schedule extended; Subtests flexibly scheduled; Test administered at time of day most beneficial to test-taker; Breaks provided during test sessions
Test environment: Test individually administered; Test administered in small group; Teacher faces test-taker; Test administered in location with minimal distraction; Test-taker provided preferential seating; Test-taker tested in separate location (or carrel); Person familiar with test-taker administers test; ESL/bilingual teacher administers test; Additional one-to-one support provided; Test administered in familiar room; Test administered in ESL/bilingual classroom; Special test preparation provided

AK
AL ✓ A A A PS A A A A A A
AR ✓ A A A A
AZ ✓ A A A A A
CA ✓ A A A A A A A A
CO ✓ A A A A A A A A
CT ✓ A A A A A
DC ✓ A A A A A A A
DE ✓ A A A A A A
FL ✓ A A A A
GA
HI ✓ A PS A A A A
IA
ID
IL
IN ✓ PS A A A A A A A
KS ✓ A A A A
KY ✓ A
LA ✓ A A A A A A A A
MA
MD ✓ A A A A A A A A A A
ME ✓ A A A A A A A A A A
MI A
MN ✓ A A A A A A
MO ✓ A A A
MS ✓ A A A A A A A A A A A
MT ✓ A A A A A A A
NC ✓ A A PS A A A
ND ✓ A A A A A A A A A
NE ✓ A A A A A A A
NH ✓ A A A A A A A A A A
NJ ✓ PS A
NM
NV ✓ PS A A A A A A
NY ✓ A A A A
OH A
OK ✓ A A A A A A A
OR ✓ A A A A A A A A A A
PA ✓ A A A A A A A A
RI ✓ A A A A A A A A A A
SC ✓ A A A A
SD ✓ A A A A A A A A
TN ✓ PA A A A A A A
TX ✓ A
UT ✓ A A A A A
VA ✓ A A A A PS A A A A A
VT ✓ A A PS A A A
WA ✓ PS PS A A A A A A A A A
WI ✓ A A A A
WV ✓ A A A A A A A A A
WY ✓ A A A PS A A A A A
Total A 34 26 15 20 22 36 37 4 10 15 28 6 10 3 4 4 1
Total PS 4 1 0 0 6 0 0 0 0 0 0 0 0 0 0 0 0
Total PA 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Total states 42 38 28 15 20 28 36 37 4 10 15 28 6 10 3 4 4 1

 

 

Table B-11. Criteria for determining inclusion of ELLs in accommodated assessment for SY 2000–2001

Key: ✓ = criterion specified in the state’s policy.

Columns: State; Policy specified criteria; then ten criteria grouped by type:
Language: English language proficiency; Student’s native language proficiency; Language program placement; Primary language of instruction
Time: Time in US/English-speaking schools; Time in state’s schools
Academic: Academic background in home language; Performance on other test(s)
Opinion: Parent/guardian’s opinion or permission; Teacher observation/recommendation

AK
AL
AR ✓ ✓ ✓ ✓
AZ ✓ ✓ ✓ ✓
CA ✓ ✓
CO
CT ✓ ✓
DC ✓ ✓ ✓
DE ✓ ✓ ✓
FL ✓ ✓ ✓
GA
HI
IA ✓ ✓ ✓
ID
IL
IN ✓ ✓
KS ✓ ✓ ✓
KY ✓ ✓
LA
MA ✓ ✓ ✓ ✓ ✓
MD ✓ ✓ ✓
ME ✓ ✓
MI
MN ✓ ✓ ✓
MO
MS ✓ ✓ ✓
MT ✓ ✓
NC
ND
NE ✓ ✓ ✓ ✓
NH ✓ ✓
NJ
NM
NV ✓ ✓
NY
OH
OK
OR ✓ ✓ ✓ ✓ ✓ ✓
PA
RI
SC ✓ ✓
SD
TN
TX
UT ✓ ✓
VA
VT ✓ ✓
WA ✓ ✓
WI ✓ ✓
WV ✓ ✓ ✓ ✓
WY ✓ ✓ ✓ ✓ ✓ ✓
Total 28 23 7 5 2 5 3 3 2 3 2

 

 

Table B-12. Decision makers designated in states’ policies to determine inclusion of ELLs in accommodated assessment for SY 2000–2001

Key: ✓ = decision maker designated in the state’s policy.

Columns: State; Policy specified decision makers; then decision makers grouped under six categories (Language acquisition specialist; Test official; General education teacher; School administrator; Student/Parent; Other(s)):
Student’s ESL/bilingual teacher(s); Other ESL/bilingual/migrant teacher or ELL administrator; Interpreter; Test administrator(s); Guidance counselor; Reading specialist; Student’s classroom teacher(s)/content teacher(s); Principal; School/district official(s); Student’s parent(s)/guardian(s); Student

AK
AL ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
AR
AZ ✓ ✓
CA
CO ✓ ✓ ✓ ✓ ✓ ✓
CT
DC ✓ ✓ ✓
DE
FL
GA
HI
IA
ID
IL
IN
KS
KY ✓ ✓ ✓
LA ✓ ✓ ✓ ✓
MA
MD ✓ ✓ ✓ ✓
ME ✓ ✓ ✓ ✓
MI
MN ✓ ✓ ✓ ✓
MO
MS ✓ ✓ ✓ ✓ ✓
MT ✓ ✓
NC
ND
NE
NH ✓ ✓ ✓ ✓ ✓
NJ
NM
NV
NY ✓ ✓ ✓
OH
OK ✓ ✓
OR ✓ ✓ ✓ ✓
PA ✓ ✓ ✓
RI
SC
SD ✓ ✓ ✓ ✓ ✓
TN
TX
UT ✓ ✓ ✓
VA ✓ ✓ ✓ ✓ ✓ ✓ ✓
VT ✓ ✓
WA ✓ ✓ ✓ ✓ ✓ ✓
WI
WV ✓ ✓ ✓ ✓ ✓ ✓
WY
Total 22 9 3 1 5 4 3 11 9 5 11 4 4

 

 

Appendix C

NAEP Accommodations

 

 

Accommodations Permitted by NAEP*

Each accommodation is classified as ELL-responsive (providing direct linguistic support in the native language or in English, or indirect linguistic support) or as SD-responsive; the classification follows each accommodation below.

Presentation format
Explanation of directions: direct linguistic support (English language)
Oral reading in English: direct linguistic support (English language)
Person familiar to student administers test: indirect linguistic support
Bilingual (Spanish) version of test: direct linguistic support (native language)
Repeat directions: direct linguistic support (English language)
Large print: SD-responsive
Bilingual dictionary: direct linguistic support (native language)

Setting format
Alone in study carrel: indirect linguistic support
Administer test in separate room: indirect linguistic support
With small groups: indirect linguistic support
Preferential seating: indirect linguistic support
Special lighting: SD-responsive
Special furniture: SD-responsive

Timing/Scheduling
Extended testing time (same day): indirect linguistic support
More breaks: indirect linguistic support

Response format
Braille writers: SD-responsive
Word processors or similar assistive devices: SD-responsive
Write directly in test booklet: SD-responsive
Scribes: SD-responsive
Answer orally, point or sign an answer: SD-responsive
One-on-one administration: indirect linguistic support

Totals: direct linguistic support, native language: 2; direct linguistic support, English language: 3; direct linguistic support overall: 5; indirect linguistic support: 8; ELL-responsive in all: 13; SD-responsive: 8

Note. Source for NAEP accommodations: http://nces.ed.gov/nationsreportcard/about/inclusion.asp#accom_table

[1] Although the term Limited English Proficient (LEP) student is commonly used in federal legislation, many experts prefer the term English language learner (ELL) because it refers in a positive way to a student who is engaged in the process of learning English.

[2] For the purposes of this study, the District of Columbia is referred to as a state, bringing the total number of “states” included in the study to 51.

[3] The 1999 Standards for Educational and Psychological Testing defines validity as “the degree to which evidence and theory support the interpretations of test scores entailed by proposed uses of tests” (AERA, APA, & NCME, 1999, p. 9). For a sense of the full complexity of validity as a concept, see Messick (1989, 1994).

[4] The National Assessment of Educational Progress, also known as “the Nation’s Report Card,” is “the nation’s only ongoing survey of student achievement in core subject areas” (Lutkus & Mazzeo, 2003, p. vii). Since 1969, assessments have been conducted periodically in reading, mathematics, science, writing, history, civics, geography, and the arts. The national NAEP includes students drawn from both public and nonpublic schools and reports results for student achievement at grades 4, 8, and 12. State level results for NAEP have been provided since 1990.

[5] Tindal and Fuchs (1999) first summarized the literature related to test changes for students with disabilities. A discussion of studies of accommodations relevant to students with disabilities may also be found in Sireci et al. (2002), pp. 16–48.

[6] Studies using an experimental design include those that involved (1) manipulation of test administration and (2) random assignment of test conditions. Studies using quasi-experimental designs include those that involved (1) manipulation of test administration but not (2) random assignment of test conditions (Sireci et al., 2002, p. 11).

[7] For additional detail, refer to the discussion of “Group Research Designs” in Thurlow, McGrew, Tindal, Thompson, Ysseldyke, and Elliott (2000), pp. 12–13.

[8] In reading the review of accommodation studies, it is important to recognize that researchers sometimes used the terms “ELL” and “LEP” interchangeably. LEP is used in federal legislation as well as in some states’ legislation, whereas use of the term ELL has grown markedly and is now part of the national lexicon.

[9] For the purposes of this study, the District of Columbia is referred to as a state, bringing the total number of “states” included in the study to 51.
