LTRC 2009

Sponsors

ILTA
CAL
2LT
Elsevier
EIKEN
ETS
Pearson
Michigan ELI Testing
University of Cambridge

Program Schedule


LTRC 2009 Program Now Available For Download (03.14.09)

Download a PDF of the complete LTRC program (981 KB) which includes full details about the conference.

Final Schedule Available

Check out the program schedule below!

 



Survey Results

CAL staff have recently compiled and summarized the many responses to the LTRC 2009 programming survey. Learn more and download a document with selected survey results.


Messick Speaker

LTRC is happy to announce that Lorrie A. Shepard, Ph.D., professor at the University of Colorado, Boulder, will receive this year's Messick Award and deliver the Messick Lecture at LTRC 2009.


Update on Program Planning (01.27.09)


The co-chairs met in DC (at no expense to LTRC!) on January 15–17. During this time, we:

  • Worked out logistics for the banquet with Liz Hamp-Lyons (via conference call) and our Banquet Planning Team

  • Held a conference call with Prime Management

  • Drafted the program and reviewed it with AAAL Chair Jeff Connor-Linton

  • Revised the program and reviewed the revised version with Planning Committee Member Rachel Brooks

  • Consulted with the (then) President-elect of the United States

LTRC 2009 Schedule (Revised Schedule - Posted 03.05.09)

Monday, March 16, 2009
Morning coffee and afternoon cookies served in Atrium

Time | Session Type | Details
9:00 am – 4:00 pm | Workshops | HLM: Aspen Amphitheatre; Listening: Larkspur

Tuesday, March 17, 2009
Morning coffee and afternoon cookies served in Atrium

Time | Session Type | Details
9:00 am – 4:00 pm | Workshops | Listening: Pike's Peak; Standard Setting: Aspen Amphitheatre
4:00 – 4:30 pm | BREAK |
5:00 – 6:00 pm | Newcomers' Panel | Evergreen Ballroom: greet newcomers, discuss schedule
6:00 pm | Reception |

Wednesday, March 18, 2009
All sessions are in the Evergreen Ballrooms DEF unless otherwise noted

Time | Session Type | Details
8:30 – 8:45 am | Welcome to LTRC | Chairs
8:45 – 8:50 am | In Memoriam: Donna Ilyin | Charles Stansfield
8:50 – 9:00 am | Messick Lecture Introduction | Craig Deville, Measurement Incorporated
9:00 – 10:00 am | Messick Lecture | L. Shepard
10:00 – 10:05 am | Messick Award Presentation | Educational Testing Service
10:05 – 10:15 am | BREAK |

10:15 – 11:45 am | Paper Session 1

Yi-Ching Pan, National Pingtung Institute of Commerce, The University of Melbourne
The Social Impact of English Certification Exit Requirements

Kedeesa Abdul Kadir, University of Illinois at Urbana-Champaign
Addressing Transparency and Accountability through a Strong Program of Validity Inquiry: The Malaysian Public Service Experience

Evelina Galaczi, University of Cambridge ESOL Examinations
The role of quantitative and qualitative methodologies in the development of rating scales for speaking

11:45 am – 1:15 pm | Lunch | LAQ Executive Board Meeting, Larkspur Room

1:15 – 2:45 pm | Poster Sessions | Atrium

  • Seongmee Ahn, Michigan State University; Ok-Sook Park, Michigan State University; Daniel Reed, Michigan State University
    Development and Validation of a Web-based Multimedia Korean Oral Proficiency Test

  • Beverly Baker, McGill University
    The Development of an English Proficiency Test for Teacher Certification in Quebec

  • Annie Brown, Ministry of Higher Education and Scientific Research
    The development of an on-line training and marking program for the assessment of writing

  • Martyn Clark, Center for Applied Second Language Studies
    Issues in developing a proficiency-based assessment in multiple languages

  • Jeremy Cross, University of Melbourne
    Listening tests as research tools: tackling transparency in test design and validation

  • Larry Davis, University of Hawai'i at Manoa; Anne Lazaraton, University of Minnesota
    Construction of a proficiency identity within paired oral assessment: Insights from discourse

  • Tomoko Fujita, Tokai University
    Creating valid in-house can-do descriptors for a listening course

  • Gene Halleck, Oklahoma State University
    Examining collaboration in Oral Proficiency Interviews

  • Heejeong Jeong, University of Illinois at Urbana-Champaign
    Language testing courses taught by non-language testers: Is this a concern?

  • Summer Loomis, The University of Texas at Austin
    Developing a Placement Test for Academic Arabic for Upper High School and University Students

  • Lu Lu, The University of Hong Kong
    The Distance between Student Writers' Intentions of Expression and Teacher Raters' Expectations in Argumentative Essay Rating

  • Kaizhou Luo, Binhai College of Nankai University, METS Office; Ting Huang, METS Office; Mariam Jones, American Academic Alliance
    Assessing the language proficiency of Chinese nursing students and professionals: A brief introduction to METS (for Nurses)

  • David MacGregor, Center for Applied Linguistics
    Predicting Item Difficulty: A Rubrics-Based Approach

  • Hye Shin, Teachers College
    Developing a Contextual Approach to Learning L2 Prosody using ToBI Conventions

  • Toshihiko Shiotsu, Kurume University
    An Analysis of the Sentence Span Task for L2 Learners

  • Emily Svendsen, University of Illinois at Urbana-Champaign; Jiin Yap, University of Illinois at Urbana-Champaign
    Effect of guidelines during a consensus discussion in standard-to-standard alignment

  • Alan Urmston, Hong Kong Polytechnic University; Felicia Fang, Hong Kong Polytechnic University
    Selling a language assessment to skeptical stakeholders: The GSLPA

  • Margaret van Naerssen, Immaculata University
    Testifying in legal cases: Test your knowledge, flexibility, and creativity!

  • JoDee Walters, Bilkent University
    A test to reveal incremental acquisition of vocabulary knowledge

  • Hui-chun Yang, University of Texas Austin
    Response Strategies and Performance on an Integrated Writing Test

  • Kiyomi Yoshizawa, Kansai University
    The effect of different anchor tests on equating quality

  • Yujing Zheng, Chongqing Jiaotong University; Xiangdong Gu, Chongqing Jiaotong University
    A retrospection study on the construct validity of TOEFL listening comprehension tests with multiple-choice format

2:45 – 4:15 pm | Paper Session 2

Hongwen Cai, University of California at Los Angeles
Clustering to inform standard setting in an oral test for EFL learners

Ute Knoch, University of Melbourne
Investigating the effectiveness of individualized feedback to rating behavior on a longitudinal study

Jennifer Zhang, Guangdong University of Foreign Studies
Exploring rating process and rater belief: Transparentizing rater variability

4:30 – 6:30 pm | Symposium 1

Investigating the impact of assessment for migration purposes

  • Elana Shohamy and Nick Saville, Organizers

  • Piet Van Avermaet, Centre for Intercultural Education, University of Ghent and Max Spotti, Tilburg University

  • Nick Saville and Szilvia Papp, University of Cambridge, ESOL Examinations

  • Lorenzo Rocca and Giuliana Grego Bolli, Università per Stranieri, Perugia

  • Elana Shohamy, Tzahi Kenza and Nathalie Assias, Tel Aviv University

Discussant: Tim McNamara, University of Melbourne

7:00 – 10:00 pm | ILTA Executive Board Meeting | Larkspur Room

Thursday, March 19, 2009

Time | Session Type | Details
8:30 – 8:45 am | Welcome, etc. | Chairs

8:45 – 10:15 am | Paper Session 3

Pauline Rea-Dickins, University of Bristol; Matt Poehner, Pennsylvania State University; Constant Leung, King's College London; Lynda Taylor, University of Cambridge ESOL Examinations; Elana Shohamy, Tel Aviv University
From the Periphery to the Centre in Applied Linguistics: the case for situated language assessment

Gary Ockey, Utah State University
The extent to which differences in L2 group oral test scores can be attributed to different rater perceptions or different test taker performance

Jiyoon Lee, University of Pennsylvania
The analysis of test takers’ performances under their test-interlocutor influence in a paired speaking assessment

10:15 – 10:30 am | BREAK

10:30 am – 12:00 pm | Paper Session 4

Atta Gebril, The United Arab Emirates University; Lia Plakans, The University of Texas at Austin
Towards a transparent construct of reading-to-write assessment tasks: The interface between discourse features and proficiency

Yao Hill, University of Hawai’i at Manoa; Ou Lydia Liu, Educational Testing Service
DIF investigation of TOEFL iBT Reading Comprehension: Interaction between content knowledge and language proficiency

William Grabe, Northern Arizona University; Xiangying Jiang, Northern Arizona University
Completion as an Assessment Tool of L2 Reading Comprehension: Building a Validity Argument

12:00 – 2:00 pm | Lunch | ILTA Meeting, Larkspur Room

2:00 – 3:30 pm | Works in Progress | Evergreen A, B

  • Rachel Brooks, Federal Bureau of Investigation
    Native versus non-native raters' evaluations of high-level English

  • Youngshin Chi, University of Illinois at Urbana-Champaign
    Construct validation of listening comprehension test: effects of task and the relationship between listeners' cognitive awareness and proficiency in listening comprehension

  • Yeonsuk Cho, Educational Testing Service; Brent Bridgeman, Educational Testing Service
    The relationship of TOEFL scores to success in American universities: How high is high enough?

  • Christian Colby-Kelly, McGill University
    Bridging the gap: Assessment for learning in a Quebec classroom

  • Douglas Altamiro Consolo, UNESP
    Melissa Baffi-Bonvino, UNESP
    Oral proficiency assessment in a pre-service EFL teacher education programme: transparency on the validation of vocabulary descriptors

  • Sumi Han, Seoul National University
    Factors influencing the pragmatic development of Korean study-abroad learners

  • Ching-Ni Hsieh, Michigan State University
    Weiping Wu, The Chinese University of Hong Kong
    Learning outcomes and the focus of the assessment tool

  • Genesis Ingersoll, Center for Applied Linguistics
    Anne Donovan, Center for Applied Linguistics
    Developing an oral assessment protocol and rating rubric for applicants to the English for Heritage Language Speakers program

  • Guodong Jia, Renmin University of China
    Computer-based and internet-delivered College English Test in China: IB-CET in progress

  • Erin Kearney, Yale University
    Adapting the European language portfolio to an integrated language and discipline study program

  • Claudia Kunschak, Shantou University
    Reform from within: A collaborative effort at transparency, reliability and validity in assessment

  • Chih-Kai (Cary) Lin, Georgetown University
    ESL writing assessment: Does the selection of rating scale matter?

  • Margaret E. Malone, Center for Applied Linguistics
    Megan Montee, Center for Applied Linguistics
    The Internet-Based TOEFL and test user beliefs

  • Johanna Motteram, The University of Adelaide
    "Tone" and language test rating scale descriptors

  • John Read, University of Auckland
    Toshihiko Shiotsu, Kurume University
    Developing new lexical measures for diagnostic assessment

  • Sultan Turkan, University of Arizona
    Relationship Among the test, curriculum, and teacher content representations in an EFL setting

  • Viphavee Vongpumivitch, National Tsing Hua University
    Vocabulary knowledge and its use in EFL speaking and writing test performance

  • Elvis Wagner, Temple University
    Tina Hu, Temple University
    Video discourse completion tasks for the testing of L2 pragmatic competence

  • Jing Wei, UMCP
    Cheng-Chiang (Julian) Chen, UMCP
    Using integrative task-based assessment to examine the effectiveness of task-based language teaching

  • Huijie Xu, Zhejiang University of Technology
    Ying Zheng, Queen's University
    Applying protocol analysis in analyzing language test validity: A case study

3:30 – 5:30 pm | Symposium 2

The use of integrated reading/writing tasks: international, institutional and instructional perspectives
  • Guoxing Yu and Yasuyo Sawaki, Organizers

  • Yasuyo Sawaki & Thomas Quinlan, Educational Testing Service, USA and Yong-Won Lee, Seoul National University, Korea

  • Sara Cushing Weigle & WeiWei Yang, Georgia State University, USA

  • Mark Wolfersberger, Brigham Young University - Hawaii, USA, and Guoxing Yu, University of Bristol, UK

Discussant: Alister Cumming, OISE, University of Toronto, Canada

Banquet (Round Up): 6:30 pm
The LTRC Banquet (Round Up, in Wild West terms) will be held on Thursday, March 19 at 6:30 pm. We know that budgets are tight, so this year's Round Up will be a buffet-style dinner. Tickets are $45 for attendees and $35 for students. The buffet includes appetizers, entrées, side dishes, dessert, limitless water and soft drinks, two tickets for beer or wine* (if you choose), and priceless company. We will also celebrate LTRC's 30th anniversary with cake (in addition to the dessert).

*Additional beer and wine tickets will be available for purchase.

Friday, March 20, 2009

Time | Session Type | Details
8:30 – 8:45 am | Announcements |

8:45 – 10:15 am | Paper Session 5

Talia Isaacs, McGill University; Ron Thomson, Brock University
Judgments of L2 Comprehensibility, accentedness and fluency: The listeners’ perspective

Ryan Downey, Pearson; Alistair van Moere, Pearson
In the ear of the beholder: Dependence of comprehensibility on language background of speaker and listener

Nivja DeJong, University of Amsterdam
Temporal aspects of perceived speaking fluency

10:15 – 10:30 am | BREAK

10:30 am – 12:00 pm | Paper Session 6

Anja Römhild, University of Nebraska; Dorry Kenyon, Center for Applied Linguistics; David MacGregor, Center for Applied Linguistics
Assessing domain-general and domain-specific academic English language proficiency

Lorena Llosa, New York University; Sarah W. Beck, New York University; Cecilia G. Zhao, New York University
Defining the construct of academic writing to inform the development of a diagnostic assessment

Rob Schoonen, University of Amsterdam; Nivja De Jong, University of Amsterdam; Margarita Steinel, University of Amsterdam; Arjen Florijn, University of Amsterdam; Jan Hulstijn, University of Amsterdam
Profiles of Linguistic Ability at Different Levels of the European Framework: Can They Provide Transparency?

12:00 – 1:30 pm | Lunch: Meeting | Language Testing Executive Board, Larkspur Room

1:30 – 3:00 pm | Paper Session 7

Don Rubin, University of Georgia; Okim Kang, Northern Arizona University; Lucy Pickering, Georgia State University
Relative impact of rater characteristics versus speaker suprasegmental features on oral proficiency scores
 
Yo In’nami, Toyohashi University of Technology; Rie Koizumi, Tokiwa University
A meta-analysis of multitrait-multimethod studies in language testing research: Focus on language ability and Chapelle's (1998) construct definition and interpretation

Lynda Taylor, University of Cambridge ESOL Examinations
Telling our story: Reflections on the place of learning, transparency, responsibility and collaboration in the language testing narrative

3:00 – 5:00 pm | Symposium 3

The discourse of assessments: Addressing linguistic complexity in content and English language proficiency tests through linguistic analyses

  • Jim Bauman, Organizer

  • Jim Bauman, Center for Applied Linguistics

  • Laura Wright, Center for Applied Linguistics

  • David MacGregor, Center for Applied Linguistics

  • Abbe Spokane, Center for Applied Linguistics

  • Meg Montee, Center for Applied Linguistics

5:00 – 5:20 pm | Wrap-up | Sara Weigle, Meg Malone

AAAL/LTRC Beer tasting: 5:30 pm

©2009 Center for Applied Linguistics