Friday, July 25, 2014

SAT Writing vs. TOEFL Writing

The big news for educators this year has been that the College Board is redoing its Scholastic Aptitude Test (SAT) by eliminating the writing requirement in 2016. Is this a good thing? Will this de-emphasis on writing have an effect on ETS's Test of English as a Foreign Language (TOEFL)?

More than a decade ago, ETS (Educational Testing Service), which also administers the SAT, decided to create an internet-based TOEFL, the TOEFL iBT. In my view, this change was a vast improvement over the paper-based test because it included compulsory speaking and writing components. Before the iBT, I saw many of my Asian students proudly achieve scores of 450+ and gain admission to local California community colleges. Later, however, I discovered that many of these same students were still taking ESL courses. Why? Although the paper-based TOEFL was supposedly their passport to entrance into and success in an American college, they subsequently found that they had little ability to produce academic-level spoken or written English. The old paper-based TOEFL was not a good predictor of these non-native speakers' success in the American college system.

Not only were these students challenged to understand lectures in English, but they also had to summarize what they had heard in those lectures, both orally and in writing. The skills they needed to succeed at an American college were not just the passive skills (reading, listening, and structure/grammar) tested on the Paper-Based TOEFL (PBT). They needed to be able to produce English, not just recognize meanings or do error correction. They had to be able to rethink what they heard or read and interpret its meaning. With little or no preparation or training for this approach to learning, they remained stuck in remedial ESL classes. With the advent of the iBT, many of these foreign students found a purpose in learning to be active producers of English.

It makes no sense to eliminate writing as a component of the SAT unless there is some other way to verify a college applicant's ability to produce English. Completing a timed written test in English is different from submitting a prepared statement of purpose for admission; the latter document has likely been read and edited by multiple friends, family members, and paid tutors, and may not indicate how a student will fare under college test conditions. Why is ETS planning to eliminate an important measure of the productive and critical-thinking abilities of native English speakers while demanding measurable performances from non-native speakers on the iBT?

Though I admit I would have preferred more creative writing in high school, I was grateful in the end that my 11th-grade English teacher worked my class hard, so that the five-paragraph essay was almost reflexive by the time I was a freshman at UCLA. I passed the Subject A exam of those days and was able to enroll in a required English course in my first quarter. The former Subject A exam still exists, at UCSD for example, in the form of the Entry Level Writing Requirement. Students who do not satisfy the requirement with one of several qualifying entry-level writing scores must take a composition course (for which they earn no credit toward their future degree) and pass an exam. An ESL instructor who teaches this composition course at UCSD through Mesa College told me that if a student fails the end-of-quarter writing exam, he or she must repeat the course until a passing mark is reached.

As much as I am against the teach-to-the-test approach to education, if high school students know that colleges require a writing score from the SAT, they will prepare for it with the guidance of their teachers. This practice alone may send a message to everyone involved (parents, students, teachers, and administrators) that critical thinking clearly expressed in writing matters.

For a supporting view, please check out this Washington Post commentary. For a broader view of the elimination of the writing component of the SAT, read Inside Higher Ed's news brief.

Friday, July 4, 2014

Assessing Grammar through Speaking

Recently I taught a course in high-intermediate grammar. One of the SLOs (student learning objectives) was "Students will be able to ... produce in writing and speaking... [certain structures, such as present perfect with question formation and basic subject-verb agreement]." It is relatively straightforward to assess grammatical structures in a writing assignment, but how does one objectively assess the "natural" production of certain structures in a speaking task? You can have students give prepared presentations, but that is somewhat "unnatural" in my view. What I would want to know, if I were an English student, is whether or not I could control certain structures in a "normal" conversational situation.

To get students to practice the target structures in conversation, I gave pairs of students a game board and a die. They also received a speaking rubric for the task so that they could see what they were being evaluated on. The game board has more than 50 squares, each containing the base form of a regular or irregular verb.

Below you can preview the rubric and directions and decide if you like them before going to the pdf file link above. You can also see the game board, which uses a Bogglesworld board template that I like a lot; I modified it slightly with my own words.


BOARD GAME - SPEAKING ASSESSMENT

Target Features (rate each feature 4, 3, 2, or 1):

Present perfect (questions, statements, short answer)              4   3   2   1
Subject-verb agreement (singular/plural)                           4   3   2   1
Simple past (question formation, statements)                       4   3   2   1
Irregular verbs (present perfect and simple past)                  4   3   2   1
Pronunciation of -ed endings (present perfect and simple past)     4   3   2   1

Name___________________                                            Score______/20
Comments:


PRESENT PERFECT (questions, short answers, positive/negative statements):
4 = Error-free use and production of the structures
3 = Occasional errors in use and production of structures
2 = Frequent errors in use and production of structures    
1 = Lacks control of use and production of structures

SUBJECT-VERB AGREEMENT:
4 = Always follows rules of subject-verb agreement
3 = Occasional errors in subject-verb agreement
2 = Frequent subject-verb agreement errors
1 = Almost no control of subject-verb agreement

SIMPLE PAST (wh-questions, statements):
4 = Error-free use of the structures
3 = Occasional errors in use of structures
2 = Frequent errors in use of structures                
1 = Lacks control of structures

IRREGULAR VERBS (present perfect and simple past):
4 = Error-free use of the structures
3 = Occasional errors in use of structures
2 = Frequent errors in use of structures                
1 = Lacks control of structures

PRONUNCIATION OF -ed ENDINGS (present perfect and simple past):
4 = Error-free pronunciation of verb endings
3 = Occasional errors in pronunciation of verb endings
2 = Frequent errors in pronunciation of verb endings       
1 = Lacks control of pronunciation of verb endings


A Verb Game board** was used to elicit questions and responses in the present perfect and simple past. Students practiced for part of one class period and "played" again the following meeting. They needed to practice yes/no questions in the present perfect, short answers, and follow-up wh-questions in the simple past. Answers to the wh-questions needed to use the simple past form of the verb from the original question. Each paired speaking activity (about 2.5 to 3 minutes) was recorded and rated for accuracy in the use and production of the target structures.

**A partial view of the game board is attached at the bottom. It is not in landscape format. I printed the boards and handed out dice for students to share during this activity.