The Journal of Educational Issues of Language Minority Students, v13 p. 13-36, Spring 1994.
A PORTFOLIO ASSESSMENT MODEL FOR ESL
Sharon S. Moya; J. Michael O'Malley

The conceptual framework guiding the development of curriculum and instruction practices in the English as a second language (ESL) classroom has undergone significant modification during the last fifteen years. This shift in pedagogical theory has resulted in the increasing use of student-centered communicative approaches in the classroom. These approaches include process writing, process reading, communicative competence, and whole language (Goodman, 1989; Heymsfeld, 1989; Shanklin & Rhodes, 1989) and are distinguished from prior practices by their focus on language function and meaning and the processes of learning.

Proponents of process-oriented curricula and instruction concur that traditional assessment techniques are often incongruent with current ESL classroom practices. Standardized testing is seen as particularly antithetical to process learning and has been attacked vigorously not only in ESL, but throughout the field of education (Brandt, 1989; Glaser, 1988; Haney & Madaus, 1989; Neill & Medina, 1989; Rothman, 1990b; Shepard, 1989; Wiggins, 1989a; Wiggins, 1989b). Because of the incompatibility of process learning and product assessment and the discrepancy between the information needed and the information derived through standardized testing, educators have begun to explore alternative forms of student assessment.

Portfolio development is increasingly cited as a viable alternative to standardized testing (Flood & Lapp, 1989; Hiebert & Calfee, 1989; Jongsma, 1989; Katz, 1988; Rothman, 1988; Shepard, 1989; Valencia, 1990a; Valencia, 1990b; Wiggins, 1989a; Wolf, 1989). The feasibility of replacing or supplementing standardized testing with portfolio assessment for purposes of determining student achievement and competencies is being explored by local and state educational reform committees:

  1. Educators in California are examining portfolio assessment of reading and writing across the curriculum as part of the California Assessment Program (Shepard, 1989).
  2. Connecticut's Common Core of Learning will include development of portfolios, simulations, extended projects, and exhibitions (Shepard, 1989).
  3. State educators in Vermont are examining the use of portfolios in combination with standardized testing to determine students' writing and math competencies (Rothman, 1988).
  4. Project PROPEL in Pittsburgh, Pennsylvania's Public Schools uses portfolio assessment in visual arts, music, and writing (Wolf, 1989).
  5. Educators in New York State are recommending the use of portfolios as part of the "results oriented" evaluation system (Rothman, 1990a).
  6. The Coalition of Essential Schools has included the idea of student exhibitions and portfolios to demonstrate student progress (Brandt, 1989).

Despite the escalation in dialogue concerning portfolio assessment, few guidelines for its use in educational settings have yet emerged. The purpose of this document is to initiate such a set of recommendations. Although the Portfolio Assessment Model described here is adaptable for other educational settings, the intended audience is elementary and secondary ESL educators concerned with monitoring the language development of limited English proficient (LEP) students.

In the remainder of this paper, we (a) define portfolio assessment and describe characteristics of an exemplary portfolio procedure, (b) provide a rationale for using portfolio assessment for monitoring the language development of elementary and secondary LEP students, and (c) describe a model that can be used in planning language-related portfolio assessment.

DEFINITION OF PORTFOLIO ASSESSMENT

The concept of portfolio development was adopted from the field of fine arts, where portfolios are used to display illustrative samples of an artist's work. The purpose of the artist's portfolio is to demonstrate the depth and breadth of the work as well as the artist's interests and abilities (Jongsma, 1989). Many educators perceive the intent of educational portfolios to be similar to that of portfolios used in fine arts: to demonstrate the depth and breadth of students' capabilities through biographies of students' work (Wolf, 1989); descriptions of students' reading and writing experiences (Jongsma, 1989); literacy folders (Jongsma, 1989); collections of pieces of writing (Katz, 1988); comparison reports (Flood & Lapp, 1989); and student work exhibitions (Brandt, 1989).

Although portfolios using the model developed in the fine arts may be appropriate for illustrating student work, the model must be expanded to accommodate informational needs and assessment requirements of the classroom. A portfolio used for educational assessment must offer more than a showcase for student products; it must be the product of a complete assessment procedure that has been systematically planned, implemented, and evaluated. The Portfolio Assessment Model described in this document distinguishes clearly between portfolios and portfolio assessment. A portfolio is a collection of a student's work, experiences, exhibitions, self-ratings (i.e., data), whereas portfolio assessment is the procedure used to plan, collect, and analyze the multiple sources of data maintained in the portfolio. A portfolio that is based on a systematic assessment procedure can provide accurate information about the depth and breadth of a student's capabilities in many domains of learning.

CHARACTERISTICS OF A MODEL PORTFOLIO PROCEDURE

Five features typify model portfolios that can be used as a systematic assessment tool in instructional planning and student evaluation. Each of these features has implications for ESL classrooms.

Comprehensiveness. The potential for determining the depth and breadth of a student's capabilities can be realized through comprehensive data collection and analysis. A comprehensive approach (a) uses both formal and informal assessment techniques; (b) focuses on both the processes and products of learning; (c) seeks to understand student language development in the linguistic, cognitive, metacognitive, and affective domains; (d) contains teacher, student, and objective input; and (e) stresses both academic and informal language development.

Although comprehensiveness is a critical component of a good portfolio procedure, a portfolio can too quickly become an aggregation of everything a student produces. A screening procedure needs to be established that will include only selected, high-priority information in the portfolio. The degree of comprehensiveness should be tempered by practical limitations of the evaluation environment such as teacher-student ratios and teacher workload. Setting realistic goals for portfolio assessment increases the probability of sustained teacher interest and use.

Predetermined and Systematic. A sound portfolio procedure is planned prior to implementation. The purpose of using a portfolio, the contents of the portfolio, data collection schedule, and student performance criteria are delineated as part of portfolio planning. Each entry in the portfolio has a purpose, and the purpose is clearly understood by all portfolio stakeholders.

Informative. The information in the portfolio must be meaningful to teachers, students, staff, and parents. It also must be usable for instruction and curriculum adaptation to student needs. A mechanism for timely feedback to the teachers and students and a system for evaluating the utility and adequacy of the documented information are characteristics of a model portfolio procedure. In ESL settings, a portfolio can be particularly useful to communicate specific examples of student work to students, to parents, and to other teachers.

Tailored. An exemplary portfolio procedure is tailored to the purpose for which it will be used, to classroom goals and objectives and to individual student assessment needs. Assessment instruments and procedures are adapted to match information needs, to reflect student characteristics, and to coincide with student linguistic and developmental capabilities. With ESL students, assessment procedures are designed to reveal information about student performance in all curriculum areas relevant to the students.

Authentic. A good portfolio procedure provides student information based on assessment tasks that reflect authentic activities used during classroom instruction. In ESL, authentic language may be assessed across several contexts: formal classroom activities, natural settings (such as the playground), and informal classroom settings (e.g., cooperative learning groups). An effective portfolio procedure will include assessment of authentic classroom-based language tasks, i.e., tasks that the student encounters naturally as part of instruction. Focusing on authentic language proficiency across sociolinguistic contexts and naturally occurring language tasks acknowledges the holistic and integrative nature of language development and focuses on communicative and functional language abilities rather than attainment of discrete, fragmented skills.

RATIONALE FOR USING PORTFOLIOS IN ESL

The rationale for using portfolios in the assessment of LEP student language development derives from three major considerations: the limitations of single measure assessment, the complexity of the construct to be assessed, and the need for adaptable assessment techniques in the ESL classroom.

Limitations of Single-Measure Approaches. Criticisms aimed at standardized testing include the charges that standardized tests (a) result in consistently low scores for minority students in all skill areas so that little can be learned about strengths on which to build instructionally (Haney & Madaus, 1989); (b) may reduce teaching to preparing for the test (Haney & Madaus, 1989); (c) focus attention on lower-level skills (Haney & Madaus, 1989); (d) treat students as if thought processes are all similar and reasons for answers irrelevant (Wiggins, 1989a); and (e) emphasize quantifiable outcomes rather than instructionally relevant formative feedback. Although these criticisms are relevant to ESL, there are "legitimate reasons for taking the intellectual pulse of students, schools, or school systems through standardized tests, particularly when the results are used as an 'anchor' for school-based assessment" (Wiggins, 1989a, p. 712).

Although standardized tests serve a purpose in education, they are neither infallible nor sufficient. Many educators acknowledge that any "single score, whether it is a course grade or a percentile score from a norm referenced test, almost always fails to accurately report student overall progress" (Flood & Lapp, 1989, p. 509). A single measure is incapable of estimating the diversity of skills, knowledge, processes, and strategies that combine to determine student progress.

Professionals in ESL education often use a combination of formal and informal assessment techniques for monitoring student language development, but no system for combining these multiple measures and interpreting them as an integrated unit has been presented. Portfolio assessment responds to this need.

Theoretical Rationale. The multifaceted nature of language proficiency makes it difficult for any single measure to represent this skill area. Three theoretical perspectives illustrate the multifaceted nature of language proficiency: (a) communicative competency, consisting of grammatical, sociolinguistic, and strategic competencies (Canale & Swain, 1980); (b) level of competency, or coherence of knowledge, principled problem solving, usable knowledge, automatic skills, and self-regulatory skills (Glaser, 1988); and (c) academic language proficiency, which is conceptualized in terms of two orthogonal continua, one reflecting the level of contextual support available for expressing and receiving meaning, and a second addressing the cognitive demands of the task or activity (Cummins, 1983).

Evidence of the complexity of the construct is also provided through recent theoretical views of reading comprehension (Anderson & Pearson, 1984; Samuels & Kamil, 1984). Comprehension is no longer viewed as being manifested adequately by a student's ability to select single correct answers to multiple-choice items. Comprehension is considered to be a product of the student's interpretation of the text, which will vary depending on the student's purpose in reading, prior knowledge of the material or of related concepts, and strategies for reading. Students actively process the contents of reading passages, selectively focus on elements of interest, relate new information to knowledge stored in memory or to other information contained in the passage, infer meanings of ambiguous or unfamiliar words or concepts, and reflect on the significance of the information relative to their original purpose in reading. This complex array of processes and knowledge derived from the reading passage is of interest in assessment (Valencia & Pearson, 1987). Thus, teachers concerned with developing a thorough assessment of reading comprehension are interested in determining the student's purposes, interpretation, and strategic approach to textual materials. The same elements are of interest in writing development and in listening comprehension, receptive skills with mental processes that are similar to those found in reading and that are of equal interest to teachers in ESL instruction (O'Malley & Chamot, 1990).

Although it is evident that educational theorists differ in their conceptualization of language proficiency, a common conclusion can be reached from these theoretical frameworks: language proficiency must be viewed as a composite of many levels of knowledge, skills, and capabilities. A varied approach to measurement, including both test and nontest methods, is, therefore, needed to ascertain student strengths and weaknesses in all critical areas. Portfolio assessment encourages the use of multiple measures.

Adaptability of the Procedure. The portfolio procedure's adaptability to classroom assessment needs results in major advantages in the instruction and assessment of LEP students. First, tailoring the portfolio procedure to informational needs and instructional goals and objectives may result in a higher degree of curricular and instructional validity than is found with standardized tests. Second, because portfolio assessment is a classroom-based language assessment procedure, data on student progress are available continually and can be used formatively, as opposed to the more summative role of standardized tests. Third, linguistic, cultural, and educational diversity in the ESL classroom are easily addressed in assessment because portfolios can be individualized. Fourth, because portfolio assessment is not limited to quantifiable, multiple-choice techniques, attention can be directed to assessing a greater variety of higher-level skills, such as a student's ability to integrate information. Finally, a portfolio can provide documentation on a student's language development. Systematic documentation of student language growth can be used as supporting evidence in exit and reclassification decisions, differential diagnosis and prereferral decisions, consultations with parents, and program impact evaluations.

PORTFOLIO ASSESSMENT MODEL

The proposed Portfolio Assessment Model for ESL includes six interrelated levels of assessment activities: (a) identify the purpose and focus of the portfolio procedure (establish a portfolio committee and a focus for the portfolio); (b) plan portfolio contents (select assessment procedures, specify portfolio contents, and determine the frequency of assessment); (c) design portfolio analysis (set standards and criteria for interpretation of portfolio contents, determine the procedure for integrating portfolio information, and schedule staff responsibilities for portfolio analysis); (d) prepare for instructional use (plan instructional use and feedback to students and parents); (e) identify procedures to verify the accuracy of the information (i.e., establish a system to check the reliability of portfolio information and to validate instructional decisions); and (f) implement the model. Figure 1 illustrates the process of portfolio planning, implementation, and evaluation within the context of the Portfolio Assessment Model.


Figure 1. The Portfolio Assessment Model.

Identify purpose and focus of portfolio
  1. Establish a portfolio committee.
  2. Focus the portfolio.

Plan portfolio contents
  3. Select assessment procedures.
  4. Specify portfolio contents.
  5. Determine frequency of assessment.

Design portfolio analysis
  6. Set standards and criteria.
  7. Determine procedure to integrate information.
  8. Schedule staff responsibilities for analysis.

Prepare for instruction
  9. Plan instructional use.
  10. Plan feedback to students and parents.

Plan verification of procedures
  11. Establish a system to check reliability.
  12. Establish a system to validate decisions.

Implement the model

The following recommendations elaborate on each level shown in Figure 1 and offer guidelines for implementing the Portfolio Assessment Model in the ESL classroom.

Identify Purpose And Focus Of Portfolio

1. Establish a Portfolio Development Committee. This recommendation is based on the need for coordinating assessment practices among all personnel working with LEP students. Coordination of assessment would capitalize on the differential expertise and knowledge of each individual, minimize the total time dedicated to student assessment, and maximize the usability of assessment information.

The portfolio committee should work in a coordinated effort and be composed of individuals concerned with the education of LEP students, including all ESL, mainstream, and bilingual teachers who have the student in class for part of the day. Specialists in ESL curriculum and instruction, assessment, and second language acquisition are needed during portfolio planning. If these specialists cannot be found within a school or district wishing to implement portfolio assessment, consultants can be brought in for initial portfolio planning.

2. Focus the Portfolio. Once the portfolio committee has been established, its first task is to determine the purpose for which information in the portfolio will be used. In ESL and bilingual programs, this can include entry/exit assessment, monitoring student progress, grade assignments, and evaluation of instructional interventions, among others. The next step is to identify the instructional goals that are relevant to the purposes. The goals selected at each grade level for portfolio assessment guide the content of the portfolio and the criteria used to assess student language development, so these goals should be carefully selected.

Focusing on educational goals rather than on more specific objectives is recommended for two reasons: (a) educational goals are more likely to reflect the integrative and holistic nature of language proficiency development than are objectives; and (b) integrated data from multiple measures must be interpreted in terms of domains of learning, and educational goals are generally more representative of domains of learning than are objectives. Four types of educational goals should be considered: skills, strategies, concepts, and applications. Writing development can be used to illustrate examples of goals related to each category.

When writing is corrected for grammar, vocabulary, spelling, punctuation, and capitalization, the focus of assessment is on skill development, or the development of the mechanics of language. The existence of basic mechanical skills for writing does not ensure effective writing, however, because students must also possess other competencies to become effective writers. One such competency is the ability to generate ideas and thoughts appropriate to the topic. How the ideas are generated will depend on the strategies a student uses to construct meaning. A strategy that has been shown to be effective in writing development is backtracking, or rereading orally or silently in order to proceed with the thought process. An appropriate goal related to writing strategies is for students to use backtracking in the generation of ideas. Other writing strategies that might be the focus of goal development for writing include (a) ignoring the mechanics of language during the initial generation phase of writing to allow thoughts to develop fluidly, (b) discussing the topic with the teacher or friends, and (c) editing, rewriting, and reformulating to clarify and enrich meaning.

The foundation of learning is the student's ability to comprehend the basic concepts of a discipline, and writing development is replete with concepts that must be understood for effective writing to develop. Among the concepts that must be attained and that are possible educational goals for writing development are (a) that writing is a process of formulating and reformulating ideas, (b) that writing must be done for different audiences, (c) that English writing is linear, and (d) that writing and speaking are interrelated but are governed by distinct rules.

Students' ability to integrate writing skills in functional situations will determine their academic success. For this reason, it is important to monitor students' progress in applying skills across the curriculum in a variety of academic tasks. Are the writing skills applied when students are asked to do free writing? Can students use effective writing in answering social studies questions? Do students remember effective writing strategies when doing a home project for science? These types of questions provide the foundation for developing goals related to the application of writing skills.

Plan Portfolio Contents

3. Select Assessment Procedures. The purpose of this stage of portfolio planning is to determine how information about student progress related to each goal will be gathered. Both formal (test) and informal (nontest) techniques should be considered for their potential in eliciting specific kinds of information about student progress. Formal techniques include chapter and unit tests, teacher-made tests, standardized achievement tests, standardized proficiency tests, diagnostic tests, and other objective measures generally producing a score. Informal techniques are considered subjective in that judgment of student performance and progress could vary depending on the predisposition of each rater. Informal techniques that might be used to elicit information on student progress in the ESL classroom include teacher ratings and checklists, student self-ratings, records of work in progress, work samples, dialogue journals, naturalistic observations, planned observations, dictation, cloze procedures, writing samples, and interviews.

Balancing the use of objective and subjective techniques for gathering information and corroborating distinct sources of information in the portfolio may increase the validity of the conclusions reached from assessment. Failure to find corroboration among independent sources of information could be due to lack of validity for measuring the language construct stated in the goal or the unreliability of one or more sources of information. For example, if a student receives a score corresponding to fluent English proficiency on a standardized oral proficiency test (a formal technique), this rating can be compared to more informal measures used in the classroom, such as a teacher checklist. The checklist might consistently show student strengths in using language interpersonally but not in using language for academic purposes, such as synthesizing or evaluating information. The discrepancy between the test outcome and the informal teacher assessment might lead to the conclusion that the student needs additional work to build reasoning and analytical skills in handling classroom content.

4. Specify Portfolio Contents. Planning the presentation of student assessment information in the portfolio may necessitate looking at old skills from a new perspective. Although spelling represents only a part of an educational goal related to the development of the mechanical skills of language, it provides a straightforward example of how student progress can be documented in the portfolio in meaningful and instructionally useful ways.

Spelling assessment is frequently conducted using a variation of the traditional Friday spelling test, yet these spelling tests provide limited information that can inform the instructional process. They tell the teacher only that the student can or cannot, will or will not, memorize a list of words that must be recalled on Friday mornings. Similarly, placing a child's spelling test in a portfolio would provide no instructional advantage over placing a spelling grade in the grade book. Three instructionally relevant alternatives could be used to document information on student spelling progress in the portfolio. First, a graph showing the percentage of words spelled correctly could be kept in the portfolio as a record of the general trend of individual progress on spelling tests. Second, documentation could be focused on types of spelling errors (e.g., short vowel sounds or specific consonant clusters); and two separate lists could be maintained in each student's portfolio, one list showing types of errors from spelling tests and another list showing types of spelling errors found in content work, such as writing samples. Third, a checklist of required spelling words could be developed and marked each time there is evidence of academic application of a word.
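
The three alternatives above amount to simple record keeping. As a hypothetical sketch only (every function and variable name here is illustrative, not part of the procedure described in this paper), the bookkeeping for each alternative might be expressed as follows:

```python
# Illustrative sketch of the three spelling-documentation alternatives.
# All names are hypothetical and chosen for this example only.

def percent_correct(words_attempted, words_correct):
    """Alternative 1: one data point for the progress graph kept in
    the portfolio (percentage of words spelled correctly)."""
    return round(100.0 * words_correct / words_attempted, 1)

def log_error_type(error_lists, source, error_type):
    """Alternative 2: maintain separate lists of error types by
    source, e.g., 'spelling test' versus 'content work'."""
    error_lists.setdefault(source, set()).add(error_type)

def mark_applied(checklist, word):
    """Alternative 3: mark a required word each time there is
    evidence of its academic application."""
    checklist[word] = checklist.get(word, 0) + 1
```

Each function mirrors one documentation alternative: a trend line of scores, error-type lists kept by source, and a checklist of applied words.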

Some types of information, such as student work samples, can be collected directly from students and placed in the portfolio; other forms of information, such as planned observations, ratings, or standardized tests, will necessitate the development or selection of instruments that will be used both for gathering information about student progress and documenting the information in the portfolio. Informal instruments can be designed locally or can be found in educational journals and publications. One recent example is the Reading Development Checklist, which contains four major sections: (a) concepts about print, e.g., knowing to look at print from left to right; (b) attitudes toward reading, e.g., enjoying and selecting a variety of reading materials; (c) strategies for word identification, e.g., using cues for word identification; and (d) comprehension strategies, e.g., making predictions about meaning and self-correcting (Mathews, 1990). Other examples of informal measures are abundantly available (e.g., Harp, 1988; Navarette, Wilde, Nelson, Martinez, & Hargett, 1990; Pikulski, 1990; Prutting & Kirchner, 1987; Valencia, 1990b).

5. Determine the Frequency of Assessment. In determining how often each type of assessment technique will be used, the committee should seek to balance the need for comprehensive information about student progress with the need for a practical approach to assessment. This balance is feasible if procedures which can be used as part of everyday classroom instruction (such as writing samples, cloze passages, and cognitive maps) are selected as techniques for student assessment. When these procedures are used, however, their purposes in portfolio assessment must be fully outlined and the documentation in the portfolio scheduled to demonstrate longitudinal evidence of student growth.

Design Portfolio Analysis

6. Set Standards and Criteria for Interpretation of Portfolio Contents. Standards and criteria for student performance are necessary for reaching educational decisions such as entry/exit or pass/fail. They are also required for instructional decisions concerning grouping practices, selecting appropriate materials, and so on. The standards and criteria established by the portfolio committee will be used to assist in interpreting each student's portfolio and will, therefore, play a critical role in portfolio planning.

In setting standards for judging student performance, the theoretical rationale behind portfolio development should be kept in mind: that reading, writing, speaking, and comprehending language involve such a complex array of processes, skills, behaviors, and capabilities that it is impossible to characterize effective language development solely through lists of educational objectives. These lists might provide insight into characteristics of products indicative of effective language development, but it is unlikely that they will adequately reflect the distinctiveness of each individual's language attainment. To reflect the multifaceted nature of language proficiency and to accommodate individual differences in language attainment, the portfolio committee should set standards based on expected patterns of behavior indicative of effective language development rather than specific objectives. For example, a reasonable expectation regarding the development of writing strategies would be that students demonstrate increasing use of backtracking during the writing process. In writing skills development, an expected pattern might be for students to make increasing use of grade-appropriate vocabulary in written work.

In addition to outlining expectations regarding student performance, the portfolio committee should agree on the criteria or reference point that will be used in interpreting the portfolio information. Three complementary points of reference are commonly used in interpretation of educational data: individual performance across time, mastery of skills, and relative group standing. Each type of reference point derives from a distinct rationale that should be clearly understood by the portfolio committee.

Interpretation based on individual performance is essentially a time-series research design that involves collecting baseline data on a student's capabilities at entry into the school system and using these data as the first point of comparison for gauging the student's progress. Each analysis of the student's portfolio would constitute a new point of reference used to assess progress. This approach to data interpretation requires that the portfolio committee discuss the degree of progress that could be expected in each domain of learning given different conditions (such as a student's prior education and native language literacy) and set realistic expectations related to individual progress.
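
The time-series logic described above can be made concrete with a small sketch. This is an illustration only: the class name, the dictionary structure, and the numeric rating scale are assumptions for the example, not part of the model.

```python
# Hypothetical sketch of interpretation based on individual performance:
# baseline data at entry serve as the first reference point, and each
# portfolio analysis becomes the new reference point for the next one.

class ProgressRecord:
    def __init__(self, baseline):
        # baseline: dict mapping a domain of learning to a numeric
        # rating at entry into the school system (scale is assumed).
        self.reference = dict(baseline)
        self.history = [dict(baseline)]

    def review(self, current):
        """Compare current ratings to the last reference point, then
        make the current ratings the new reference point."""
        gains = {d: current[d] - self.reference.get(d, 0) for d in current}
        self.reference = dict(current)
        self.history.append(dict(current))
        return gains
```

The committee's role, in these terms, is to decide what size of gain is a realistic expectation in each domain given conditions such as prior education and native language literacy.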

A criterion-referenced interpretation of data is used when students are rated in terms of their mastery of educational objectives and goals. The criteria for mastery are set for each domain of learning, and progress is determined by comparing where the student stands in relation to each criterion. As an example, assume that the ability to write for different audiences is an educational goal related to writing development. Three possible objectives related to this goal are diary and journal writing, business letter writing to obtain information, and argumentative essay writing. Mastery of these three objectives (i.e., the educational goal) might be defined as the successful completion of three adequate samples of each type of writing. When students fulfill these requirements, they have mastered the criteria.
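
The mastery criterion in this example is mechanical enough to sketch. The following is an illustration under the example's own assumptions (three adequate samples of each of three writing types); all names are hypothetical.

```python
# Hypothetical sketch of the criterion-referenced mastery check for the
# goal "writing for different audiences," as defined in the example.

REQUIRED_TYPES = {"diary/journal", "business letter", "argumentative essay"}
SAMPLES_REQUIRED = 3

def has_mastered(samples):
    """samples: list of (writing_type, adequate) pairs drawn from the
    portfolio. Mastery requires SAMPLES_REQUIRED adequate samples of
    each required writing type."""
    counts = {t: 0 for t in REQUIRED_TYPES}
    for writing_type, adequate in samples:
        if adequate and writing_type in counts:
            counts[writing_type] += 1
    return all(n >= SAMPLES_REQUIRED for n in counts.values())
```

A different goal would simply substitute its own objectives and its own definition of an adequate sample.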

A norm-referenced interpretation of data is used when an individual's performance is compared to that of a group and measured by relative group standing. Letter grades, grade equivalents, percentiles, stanines, standard scores, and normal curve equivalents (NCEs) are examples of norm-referenced ratings. Despite the prevalence and acceptance of norm-referenced interpretation of data in educational settings, use of this point of reference for interpretation of LEP student progress may be uninformative, misleading, and inequitable. The scores may not be informative because they tend to be uniformly low, and misleading because they indicate only what the student does not know rather than what the student does know, and may fail to show growth resulting from an instructional program. The scores may be inequitable because students are compared against a standard of individuals who not only are fully proficient in English but are also knowledgeable of test-taking strategies. These problems in the interpretation of norm-referenced data indicate that LEP student progress needs to be evaluated relative to a criterion.

7. Determine the Procedure for Integrating Portfolio Information. Student portfolios will contain an assortment of information that, at first glance, will seem impossible to analyze; there might be test scores, ungraded work samples, observations, teacher checklists, and cloze passages. In order to evaluate student language development holistically, a system for integrating and interpreting these diverse sources of information must be devised. Only through integration of the information can patterns of language development emerge that demonstrate individual student competencies related to educational goals.

A liberal adaptation of an ethnographic technique, domain analysis, introduced by Spradley (1980) in Participant Observation is suggested as a point of departure for integrating portfolio information. Recognizing the need for a practical approach to portfolio interpretation, the authors have significantly simplified the procedure to make it instructionally feasible at the district level. Using the adapted procedure, a domain analysis sheet (or several, if needed) similar to the one in Figure 2 is developed for each educational goal of focus. First, program objectives corresponding to the goal are placed in the left-hand column of the analysis sheet. Next, pages in the portfolio are numbered or lettered for reference, and then all documented information in the portfolio is examined. When evidence of student progress related to an objective is found in the portfolio contents (both formal and informal), the example is noted next to the corresponding objective on the appropriate analysis sheet, and the number or letter of the portfolio page where the example was found is recorded in the third column. This process continues until all examples have been noted. Finally, the completed domain analysis sheets pertaining to a goal are examined as a unit for evidence of the patterns of student language development that the portfolio committee has already determined are indicative of effective progress toward the goal. This procedure is repeated for each educational goal of focus.
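The tabulation step of this procedure can be sketched as a simple data structure: one analysis sheet per goal, with portfolio evidence tallied against each program objective. All function names, objectives, and portfolio entries below are hypothetical, and the sketch assumes the committee has already tagged each piece of evidence with the objective it supports.

```python
def build_analysis_sheet(goal, objectives, evidence):
    """evidence: list of (page_ref, description, objective) tuples
    gathered while paging through the portfolio. Returns one domain
    analysis sheet mapping each objective to its noted examples."""
    sheet = {"goal": goal, "rows": {obj: [] for obj in objectives}}
    for page_ref, description, objective in evidence:
        if objective in sheet["rows"]:
            sheet["rows"][objective].append((description, page_ref))
    return sheet

# Hypothetical evidence found while examining a lettered portfolio.
evidence = [
    ("A", "dialogue journal entry, self-corrected", "journal writing"),
    ("B", "letter requesting museum information", "business letters"),
    ("C", "second journal entry with peer response", "journal writing"),
]
sheet = build_analysis_sheet(
    "Write for different audiences",
    ["journal writing", "business letters", "argumentative essays"],
    evidence,
)
for objective, examples in sheet["rows"].items():
    print(objective, "->", examples)
```

An empty row (here, argumentative essays) is itself informative: it shows the committee at a glance where the portfolio holds no evidence of progress toward an objective.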

Although this qualitative approach to portfolio analysis will present interesting challenges to the portfolio committee, it supports the basic rationale behind portfolio assessment. The search for patterns of behavior that cut across all objectives related to a goal allows for individual differences in the language learning process to be accommodated. The suggested analysis format also increases the probability that all evidence of student language development is considered as a unit and that equal weight is given to different assessment methodologies in determining student progress.

Figure 2. Sample portfolio analysis form.

Student Name: ____________________________ Classroom: _________
Teacher: _________________________ Date Analyzed: _____________
Educational Goal: _____________________________________________
Objective            | Examples Illustrating Student Progress Related to Objective | Reference
_____________________|______________________________________________________________|__________
_____________________|______________________________________________________________|__________
_____________________|______________________________________________________________|__________

8. Schedule Staff Responsibilities for Portfolio Analysis. Portfolio analysis activities should be allocated among portfolio committee members and should be scheduled at predetermined points. The persons responsible for data analysis might include individual committee members, pairs of teachers working together, or the full portfolio committee. The frequency of portfolio review will vary with individual students or classroom goals and may be scheduled so that the persons responsible review the progress of different groups of students at predetermined intervals. In determining the frequency of portfolio analysis, it should be kept in mind that these analyses provide instructional feedback to the teachers and students. If portfolio analysis is conducted at infrequent intervals, the portfolio information cannot be used to strengthen the instructional process, which makes the purpose of the portfolio summative rather than formative in nature. At the same time, if portfolio analysis is planned to cover smaller units of instruction, there are increased time demands on both the teacher and student, and trends in student progress may be more difficult to ascertain. Ideally, portfolio analysis should take place approximately every four to six weeks or after each major unit of study.

Prepare for Instructional Use

9. Plan Instructional Use. Information from portfolios can be used for a variety of purposes in ESL instruction: monitoring student progress, placing students at the appropriate instructional level, assigning grades, designing future instructional interventions, and determining when it is appropriate to phase the student out of special instructional programs. The information can be referred to at any time throughout a school year and provides a cumulative record of student progress. To avoid excessive accumulation of papers, the portfolio could be divided into two sections: required portfolio contents, which maintain basic information related to instructional goals or needed for educational decisions; and supplementary portfolio contents, which broaden the range of information to provide a comprehensive picture of language and content skills, using any additional data needed to support or expand upon the required information (Valencia, 1990a). Required information is of interest to administrators as well as to teachers and might include test scores at program entry and at other important points, performance records from the previous year, checklists, work samples collected systematically every six weeks, and so on. The supplementary information could contain any of the informal indicators suggested above plus occasional work samples that best represent growth in student language proficiency.

10. Plan Feedback to Students and Parents. Portfolios can be used to convey information to students concerning their progress and to illustrate the student's growth to parents. Active student involvement in the portfolio procedure is recommended for several reasons. First, through active involvement in assessment, a student's cognitive and metacognitive awareness of the learning process may be strengthened. Second, a student's sense of control over the learning process may be increased through shared responsibility and participation in assessment (Zimmerman, 1989). Finally, active involvement in the portfolio process may encourage a student's self-determination in learning. Cognitive and metacognitive awareness of the learning process, internalized locus of control, and self-determination in learning have been shown to affect achievement (Pressley & Ghatala, 1990; Schunk, 1990); therefore, any instructional practices which positively influence student gains in these areas should be encouraged.

One possibility for student involvement in portfolio assessment is to encourage students to arrange their writing products from most to least effective and to answer a series of questions about the writing, such as what qualities separate the best pieces of writing from the others, what process was followed in generating the written products, and what difficulty was encountered in writing the least effective pieces.

Portfolios can also be used to provide parents with concrete information regarding their child's progress in language proficiency development. A grade informs parents only about their child's success or failure, but a portfolio provides them with specific details and examples of the child's capabilities and interests. Where grades are mandatory, the portfolio will assist the teacher in helping parents understand the reasons their child received a specific grade.

Identify Procedures to Verify Accuracy of Information

11. Establish a System to Check Reliability of Portfolio Information. The reliability of informal or alternative assessment information needs to be addressed to ensure that judgments about student performance are based on accurate information. Some of the information in a portfolio might include highly objective measures such as standardized tests, while other measures, such as teacher checklists and observations, have the potential for inconsistency in interpretation and scoring. A simple statement on a rating scale such as "Student has made progress in literal reading comprehension" can be interpreted in various ways. One teacher may find evidence to support a positive judgment in informal reading inventories, but another may find that end-of-unit tests support a negative conclusion. Even assuming two different teachers would use exactly the same evidence, there is considerable room for differences of opinion. The resolution of this difficulty is for the portfolio committee to decide on the specific criteria for reaching judgments concerning student progress, to discuss thoroughly in advance areas where there is potential for varied interpretations of the same information, and to perform intermittent checks on the accuracy of teacher ratings.
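One way such an intermittent check on teacher ratings might be operationalized is to have two raters score the same set of portfolios and compute their agreement. The sketch below uses simple percent agreement plus Cohen's kappa, which corrects for chance agreement; the ratings are invented, and this particular statistic is offered as one option, not as part of the original model.

```python
def percent_agreement(a, b):
    """Share of portfolios on which two raters gave the same rating."""
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e),
    where p_o is observed agreement and p_e is the agreement
    expected from each rater's marginal rating frequencies."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n
    categories = set(a) | set(b)
    p_e = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical teachers rating the same four portfolios.
rater_1 = ["progress", "progress", "no progress", "progress"]
rater_2 = ["progress", "no progress", "no progress", "progress"]
print(percent_agreement(rater_1, rater_2))  # 75.0
print(cohens_kappa(rater_1, rater_2))       # 0.5
```

A low kappa would signal that the committee's criteria for a rating such as "progress in literal reading comprehension" need further discussion before the ratings can be trusted.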

12. Establish a System to Validate Decisions. One of the strengths of the portfolio approach is that varied forms of evidence can be collected and reviewed to support instructional decisions. One type of evidence can be analyzed to determine if it is in agreement with other types of evidence and if essentially the same instructional decision would be made from both.

There is no precedent for validating a portfolio procedure, but three methods are suggested: (a) a study of the relationship between conclusions derived using portfolio information and conclusions derived using objective data, e.g., standardized test scores (a concurrent validity study); (b) a study of the relationship between conclusions derived using portfolio information and teacher judgment (a concurrent validity study); and (c) a longitudinal study of the relationship of decisions made using the portfolio information with subsequent student performance (a predictive validity study).

Each of these techniques for validating instructional decisions is subject to disadvantages and constraints. A longitudinal study of the predictive validity of the portfolio procedure is time-consuming, requires continuity within the portfolio committee and stability of the student population, and assumes that the variables underlying LEP student achievement are identifiable. A concurrent validity study of the relationship between portfolio conclusions and teacher judgment sidesteps the question of the reliability of teacher opinion and is confounded by the teacher's participation in portfolio planning and implementation. Finally, a concurrent validity study showing the relationship of conclusions derived using portfolio data and those derived using objective data negates the original impetus behind portfolio development, that objective data have proven insufficient in the assessment of LEP student language development. Of the three options, a predictive validity study is recommended tentatively as a point of departure for validation of language-related portfolios. This choice is based on the rationale that the goal of most elementary and secondary ESL programs is to enable students to achieve academically in classrooms where English is the medium of instruction. Given this goal, predicting student achievement in an English-medium classroom should be a major focus of the validation of any ESL assessment procedure.

Implement the Model

The extensive planning that has entered into all previous phases of the portfolio model design will contribute essential information and procedures to aid implementation. By this time, there exists a portfolio committee and a detailed understanding of the purposes for which the portfolio will be used and the educational goals that will be assessed. Assessment procedures and instruments and portfolio contents have been selected, and a plan for instructional use of the information has been identified. Finally, procedures for ensuring the reliability of portfolio information and the validity of instructional decisions will have been determined. All that remains at this point is to implement the model, collect information from students, and analyze and verify the data used for instructional design. Portfolio designers and users should develop constructive ways in which an active and collaborative process can be maintained that involves other staff, the students themselves, and the parents. This involvement is a key to the success of the portfolio approach.

CONCLUSIONS: PASSING FAD OR PROMISING FUTURE?

Portfolio assessment is not an educational panacea; rather, it is a promising alternative assessment procedure with both strengths and weaknesses that must be fully recognized for proper implementation. Its strengths include its potential for enhancing teacher professionalism through meaningful and active involvement in student assessment (Hiebert & Calfee, 1989; Rothman, 1988; Wiggins, 1989a); for establishing a sense of community among evaluators (Katz, 1988); for encouraging thoughtful activity in the classroom (Shepard, 1989); for promoting serious discussion of criteria and what goes on in the classroom (Katz, 1988); for creating instructional links at different grade levels (Hiebert & Calfee, 1989); for linking assessment more closely to classroom activities (Rothman, 1988); for allowing students to draw on the skills they learn in process-centered classrooms (Katz, 1988); for allowing assessment to become a teaching strategy to improve learning (Cross, 1987); for drawing on students' strengths rather than focusing on their weaknesses (Colvin, 1988); for involving both students and parents in assessment; and for making assessment more equitable (Calfee & Hiebert, 1987; Shulman, 1987; Wiggins, 1989a).

The positive potential of portfolio assessment is tempered by some negative considerations. First, validation of the procedure is a major concern. Because the Portfolio Assessment Model is fundamentally a qualitative approach to student assessment, establishing its validity and reliability may prove substantially more difficult than is the case for quantitative approaches. The need for multiple judges, careful planning, proper training of raters, and triangulation of objective and subjective sources of information cannot be ignored for successful validation of the procedure to occur. Clearly, the degree of involvement necessary for validating the portfolio procedure is costly and time-consuming, and this may present a major disadvantage for some school districts.

A second major problem in portfolio assessment is defining the criteria against which performance is to be judged (Brindley, 1986). The standards used for portfolio assessment must reflect the holistic nature of language development, must be sensitive to individual student differences, and must accurately reflect student progress. Determining standards which reflect these three characteristics involves careful consideration and assumes that consensus can be reached among members of the portfolio planning committee. Finally, portfolio assessment demands an extraordinary commitment from all committee members, making district-level backing of this assessment procedure mandatory. Some school districts may have neither the inclination nor the resources to facilitate the implementation of this assessment approach; others may not have school board, parental, or community support for the project. If the committee is unable to elicit the required district support for portfolio assessment, it is improbable that adequate implementation of the procedure will be maintained.

Portfolio assessment, like other innovations, must be undertaken with caution and thoughtfulness for it to fulfill its promise. To approach portfolio assessment with anything less than a total dedication to developing a quality alternative assessment procedure is to relegate this potentially powerful approach to the realms of other educational fads.

REFERENCES

  • Anderson, R. C., & Pearson, P. D. (1984). A schema-theoretic view of basic processes in reading comprehension. In P. D. Pearson, R. Barr, M. L. Kamil, & P. Mosenthal (Eds.), Handbook of reading research (pp. 255-291). NY: Longman.
  • Brandt, R. (1989). On misuse of testing: A conversation with George Madaus. Educational Leadership, 46(6), 26-29.
  • Brindley, G. (1986). The assessment of second language proficiency: Issues and approaches. South Australia: National Curriculum Resource Center.
  • Calfee, R., & Hiebert, E. (1987). The teacher's role in using assessment to improve learning. In E. E. Freeman (Ed.), Assessment in the service of learning. Proceedings of the 1987 ETS Invitational Conference (pp. 45-61). Princeton, NJ: Educational Testing Service.
  • Canale, M., & Swain, M. (1980). Theoretical bases of communicative approaches to second language teaching and testing. Applied Linguistics, 1, 1-47.
  • Colvin, R. (1988, November 30). California researchers "accelerate" activities to replace remediation. Education Week, p. 6.
  • Cross, K. P. (1987). The adventures of education in wonderland: Implementing education reform. Phi Delta Kappan, 68, 496-502.
  • Cummins, J. (1983). Conceptual and linguistic foundations of language assessment. In S. S. Seidner (Ed.), Issues of language assessment: Language assessment and curriculum planning (Vol. 2). (pp. 7-16). IL: Illinois State Board of Education.
  • Flood, J., & Lapp, D. (1989). Reporting reading progress: A comparison portfolio for parents. The Reading Teacher, 42, 508-514.
  • Glaser, R. (1987). Cognitive and environmental perspectives on assessing achievement. In E. E. Freeman (Ed.), Assessment in the service of learning. Proceedings of the 1987 ETS Invitational Conference (pp. 37-43). Princeton, NJ: Educational Testing Service.
  • Goodman, K. S. (1989). Whole language is whole: A response to Heymsfeld. Educational Leadership, 46(6), 69-70.
  • Goodman, K. S., Goodman, Y. M., & Hood, W. J. (Eds.). (1989). The whole language evaluation book. Portsmouth, NH: Heinemann.
  • Haney, W., & Madaus, G. (1989). Searching for alternatives to standardized tests: Whys, whats, and whithers. Phi Delta Kappan, 70, 683-687.

  • Harp, B. (1988). When the principal asks "When you do whole language instruction, how will you keep track of reading and writing skills?" The Reading Teacher, 42, 160-161.
  • Heymsfeld, C.R. (1989). Filling the hole in whole language. Educational Leadership, 46(6), 65-68.
  • Hiebert, E. H., & Calfee, R. C. (1989). Advancing academic literacy through teachers' assessments. Educational Leadership, 46(7), 50-54.
  • Jongsma, K. S. (1989). Portfolio assessment. The Reading Teacher, 43, 264-265.
  • Katz, A. (1988). The academic context. In P. Lowe, Jr., & C. W. Stansfield (Eds.), Second language proficiency assessment: Current issues (pp. 178-201). Englewood Cliffs, NJ: Prentice Hall Regents.
  • Lowe, P., Jr., & Stansfield, C. W. (Eds.). (1988). Second language proficiency assessment: Current issues. Englewood Cliffs, NJ: Prentice Hall Regents.
  • Mathews, J. K. (1990). From computer management to portfolio assessments. The Reading Teacher, 43, 420-421.
  • Navarette, C., Wilde, J., Nelson, C., Martinez, R., & Hargett, G. (1990). Informal assessment in evaluation of educational programs: Implications for bilingual education programs. Washington, DC: National Clearinghouse for Bilingual Education.
  • Neill, D.M., & Medina, N. J. (1989). Standardized testing: Harmful to educational health. Phi Delta Kappan, 70, 688-697.
  • O'Malley, J. M., & Chamot, A. U. (1990). Learning strategies in second language acquisition. NY: Cambridge University Press.
  • Pikulski, J. J. (1990). Informal reading inventories. The Reading Teacher, 43, 514-516.
  • Pressley, M., & Ghatala, E. S. (1990). Self-regulated learning: Monitoring learning from text. Educational Psychologist, 25, 19-33.
  • Prutting, C., & Kirchner, D. (1987). Pragmatic aspects of language. Journal of Speech and Hearing Disorders, 52, 105-119.
  • Rothman, R. (1988, October 26). Vermont plans to pioneer with 'work portfolios'. Education Week, p. 1.
  • Rothman, R. (1990b, May 30). Ford study urges new test system to 'open the gates of opportunity.' Education Week, p. 1.
  • Samuels, S. J., & Kamil, M. L. (1984). Models of the reading process. In P. D. Pearson, R. Barr, M. L. Kamil, & P. Mosenthal (Eds.), Handbook of reading research (pp. 185-224). NY: Longman.
  • Schunk, D. H. (1990). Goal setting and self-efficacy during self-regulated learning. Educational Psychologist, 25, 71-86.
  • Seidner, S. S. (Ed.). (1983). Issues of language assessment: Language assessment and curriculum planning (Vol. 2). IL: Illinois State Board of Education.
  • Shanklin, N. L., & Rhodes, L. K. (1989). Transforming literacy instruction. Educational Leadership, 46(6), 59-63.
  • Shepard, L. A. (1989). Why we need better assessments. Educational Leadership, 46(7), 4-9.
  • Shulman, L. S. (1987). Assessment for teaching: An initiative for the profession. Phi Delta Kappan, 69, 38-44.
  • Spradley, J. P. (1980). Participant observation. New York: Holt, Rinehart and Winston.
  • Valencia, S. W. (1990a). A portfolio approach to classroom reading assessment: The whys, whats, and hows. The Reading Teacher, 43, 338-340.
  • Valencia, S. W. (1990b). Alternative assessment: Separating the wheat from the chaff. The Reading Teacher, 43, 60-61.
  • Valencia, S., & Pearson, P. D. (1987). Reading assessment: Time for a change. The Reading Teacher, April, 726-732.
  • Wiggins, G. (1989a). A true test: Toward more authentic and equitable assessment. Phi Delta Kappan, 70, 703-713.
  • Wiggins, G. (1989b). Teaching to the (authentic) test. Educational Leadership, 46(7), 41-47.
  • Wolf, D. P. (1989). Portfolio assessment: Sampling student work. Educational Leadership, 46(7), 35-39.
  • Zimmerman, B. J. (1989, March). Individual differences in self-efficacy and use of self-regulated learning strategies: A social cognitive analysis. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.
 
ABOUT THE AUTHORS

Sharon S. Moya, PhD, is a testing consultant at an English teacher training college in Budapest, Hungary.
J. Michael O'Malley, PhD, is Supervisor of Testing and Accreditation at Prince William County Public Schools in Manassas, Virginia.