
Tuesday, January 14, 2014

New and Improved Language Assessments: Spanish SOPI, ILA, Self-Assessments and More!

Until recently, the Language Proficiency Exam (LPE) was the only central tool available for language students to evaluate their language skills at the intermediate level, and there were no options for students who had surpassed that level. Today though, the Language Testing Program is working with language program developers to diversify the tools available to students, and to reach students whose language is not taught at the University, and those who have achieved higher levels of proficiency.

The LPE remains the most important tool available. Tests are already in place for Arabic, Chinese, Hmong, Italian, Japanese, and Russian, along with a second version of the Spanish LPE Reading and Listening sections, and other new assessments are in progress. Development is underway this semester for Finnish, Korean, Somali, and Swahili LPEs. A new form of the German Reading LPE will be piloted in November.

Beyond the LPE, the following new assessments have been piloted, will be piloted, or are now actively administered to students: the Spanish Simulated Oral Proficiency Interview (SOPI), the Individual Language Assessment (ILA), and several Self-Assessment instruments.

The Spanish SOPI


A SOPI is a computerized version of an Oral Proficiency Interview (OPI) administered in a Digital Language Lab (DiLL) to a class of students simultaneously. The primary advantage of a SOPI over a traditional OPI is that the time required to administer the test is significantly reduced. It would take at least six hours to administer OPIs to a class of 24 students; with a SOPI, this can be done in less than one class period. Of course, all of those exams still need to be rated!

"In delivering an interview via computer, the SOPI may shift the center of power more toward the students; they may feel more ownership of their half of the (simulated) interaction, and thus feel more confident about speaking."  

-Gabriela Sweet, assessment developer and lead trainer

Students are given real-life contexts in which to speak Spanish, recording their responses in the lab using DiLL. Student feedback was generally positive after the February pre-pilot. Students commented that the SOPI is more efficient and less stressful, and that they enjoyed using the SOPI in lieu of the traditional face-to-face format.

The SOPI's delivery, in capturing extended student speech, reduces the possibility of a difficult-to-rate interview, which can occur in face-to-face interviews when learners may not get the opportunities they need to demonstrate the full range of their language abilities. Developers hope to use the SOPI assessment with College in the Schools (CiS) students in the future, and it is possible to adapt this assessment for other languages as the prompts are in English.

As with the OPI, SOPI tasks are designed to target a wide range of linguistic functions and real-world topics at the student's target level of proficiency. The SOPI is also more standardized compared to the OPI because it systematically facilitates a ratable sample of student speech. Students naturally tend to speak in complete sentences in the SOPI, which may not happen during the live interview when they can appropriately respond in sentence fragments. Since the SOPI response is at sentence-level, it may be easier for raters to analyze.

A six-person team carried out the SOPI pilot. This team included Spanish 1004 coordinator Sara Mack, Spanish 1004 instructors Marilena Mattos and Stephanie Hernández, Spanish Testing Coordinator Joanne Peltonen, Language Center Testing Development Coordinator Gabriela Sweet and Language Center Technical Coordinator Diane Rackowski. Additional input and support was provided by Language Center Director Dan Soneson.

The ILA


Did you know the Testing Program now provides assessment in languages from Amharic to Zulu?

CLA students who have achieved proficiency in a language not offered at the University of Minnesota have a new way to demonstrate their proficiency and complete the second language requirement. The ILA is a writing and speaking test that is adaptable for any modern language. It tests to the intermediate level, which is the level expected after two years of university language study. The test is rated by native or near-native speakers of the particular language who are trained and guided by Language Testing Program staff.

Since January 2013, a total of 32 students have been approved to take ILAs in 16 languages from around the world, including major languages spoken by hundreds of millions of people worldwide, but rare on our campus, and dialects specific to a particular village. The most common languages assessed are Vietnamese and Oromo.

For many languages, it can be challenging to find a qualified rater. Most raters are affiliated with the University as faculty, staff, graduate students, or former students. They are almost always native speakers with some prior linguistic or educational experience. Some of the raters have few other opportunities to use their language skills professionally, although they may use the language in their daily life.

Stephanie Treat, who has assisted with the hiring of most raters, said:

"We often get requests to rate languages that I've never heard of or know little about. We've had fun researching languages and scouring campus and beyond to find potential raters. The University of Minnesota is a global community, and with persistence, we can usually find someone currently connected to campus who is pleased to share his or her expertise."

In order to conform to the CLA mandate for proficiency in a second language, ILA raters are trained using guidelines that align with those used to evaluate student performance in languages that are taught at the University of Minnesota and that have an LPE. Before ILA raters begin the evaluation process, they are made familiar with both the instrument and the target level for production. The rater-training process is very hands-on; criteria are analyzed and then applied. The rater works closely with the rater trainer. Raters are guided through the process using a rater-training module.


The Language Testing Program plans to create a video to facilitate rater training, as well as refresher training for raters who completed evaluations earlier and need a quick refamiliarization to ensure that they apply the criteria consistently to a new student's ILA. "We've been fortunate to work with some very talented people," assessment developer and lead trainer Gabriela Sweet said.

"It's fascinating to work with someone on a language with which you, the trainer, have very little experience, to see them point out clearly how students demonstrate the target levels. It has been a wonderful experience for us, in the Testing Program, to see how students in a variety of languages are able to show their proficiency in alignment with the College of Liberal Arts student learning objectives. I think it's also a privilege to be able to learn from colleagues in diverse languages; in hearing what students say and reading what they write in the ILAs we begin to see that the world is, in some ways, quite small... we're all working toward many of the same goals."  

- Gabriela Sweet

Students interested in completing the CLA second language requirement via ILA exam should begin by contacting their advisor. CLA Student Services approves requests first, and then students contact the Language Testing Program to schedule an exam. There is a $30.00 fee assessed to assist with the cost of rater compensation. Students who pass both sections of the exam complete the CLA second language requirement.

Self-Assessment Instruments


Self-Assessment instruments help students become more aware of their own language development and describe their level of proficiency. This type of instrument requires little time to administer and students can take it from home at their convenience. Students can take the same assessment more than once and track their language development over time.

Two Self-Assessments for Spanish are currently in development: one intended for Spanish 1004 students and another which assesses higher-level proficiency for students pursuing the Certificate of Advanced-Level Proficiency. Developers are also creating new self-assessments for German, Italian, and French.

Students can use these self-assessments to get feedback on their performance in a language and to better prepare before taking the LPE or another exam. In a Fall 2013 pilot study, many students found the self-assessments useful and noted that routine self-assessment would be valuable in the future. The test developers will be pleased to share more information about these instruments once they have been piloted and the results analyzed.

SOPI, ILA, Self-Assessments


Developers in the Language Testing Program and the academic departments have been hard at work developing and improving language assessment instruments to create a more personalized, modern, and enjoyable testing experience for students. Undergraduate students themselves have played an important role by piloting these assessments and providing valuable feedback to developers. These new tests are designed to accurately reflect students' abilities and to provide them with information they can use as they continue to develop their proficiency.




New Language Proficiency Exams (LPEs)
Benefits / Improvements: Korean and Somali were added as new LPE exam options, providing students with more possibilities for assessing language proficiency.
Development Timeframe: Korean: will pilot Reading in Spring; Listening & Writing are ready to go. Somali: will pilot Reading, Listening, and likely Writing this Spring.

New German Reading Test
Benefits / Improvements: Developed a new form (Form B) in addition to Form A. Form B is up to date (the readings in Form A were 20 years old), and better-quality photos were added.
Development Timeframe: The Reading test will be piloted this Fall semester.

Simulated Oral Proficiency Interview (SOPI)
Benefits / Improvements: Spanish 1004: extended time limit for thinking and speaking; bigger, higher-resolution photos.
Development Timeframe: Piloted in Spring 2013.

Individualized Language Assessment (ILA)
Benefits / Improvements: One format for all languages as a test of Speaking and Writing. Allows students flexibility in composing responses about real-world situations.
Development Timeframe: The test has been administered to 16 students, with many more expected in the future. The long-term goal is to develop multiple versions of the test.

Self-Assessment Instruments
Benefits / Improvements: Spanish 1004: students can demonstrate their second language proficiency before taking final proficiency tests. Spanish Advanced Level: helps students determine their proficiency level; intended for students pursuing the Certificate of Advanced-Level Proficiency.
Development Timeframe: Spanish 1004: pilot completed in Fall 2013. Spanish Advanced Level: piloted with Spanish students on December 11, 2013. German: pilot in five sections, Spring 2014. Italian: pilot in two sections, Spring 2014. French: pilot in eight sections, Spring 2014.


Tuesday, January 7, 2014

Advanced-Level Spanish Certificate Option Approved for Students

The Certificate of Advanced-Level Proficiency in Spanish program was recently approved as an option for students to have their language proficiency formally recognized beyond the Language Proficiency Exam (LPE). This is a great option for students of Spanish whose abilities extend beyond the intermediate level and who want to have their advanced-level proficiency formally recognized.

The Advanced-Level Certificate option is open to all undergraduate University of Minnesota students, regardless of their major or college. The certificate program will be administered jointly by the Department of Spanish and Portuguese Studies and the CLA Language Center.

The LPE can assess language proficiency skills up to the intermediate level; the Advanced-Level Spanish Certificate program will be able to assess skills up to the advanced level. Advanced-level proficiency is defined by the American Council on the Teaching of Foreign Languages (ACTFL) guidelines. According to these guidelines, students with advanced-level proficiency do not necessarily perform like native speakers. However, they have reading, writing, listening, and speaking skills sufficient to navigate daily situations, such as routine school and work requirements, and can generally be understood by native speakers.

One of the goals of the Certificate Program is to provide students with an internationally-recognized marker of proficiency once they achieve advanced-level proficiency. Another goal is: "To encourage the integration of language and culture learning across students' academic and professional lives, and empower students to be responsible for their own second language acquisition."1

There are several steps required for completing the Advanced-Level certificate, including: passing the Spanish LPE, passing two approved upper-level courses taught in Spanish, completing an intensive Spanish language immersion experience, taking a self-assessment, completing a critical reflection essay, and passing the ACTFL advanced-level exam. Please see the Advanced-Level Certificate informational website for a full list of requirements.

Students interested in learning more about the certificate are encouraged to attend a Certificate Orientation and Self-Assessment Workshop on January 22, 2014 from 3:00-4:30 p.m. in Jones 35. Information about the certificate is also available at http://z.umn.edu/spancert.


Tuesday, July 9, 2013

Easier Alternatives to Oral Interviews



Most language departments use formal and informal assessment of students' spoken language, commonly as one-on-one interviews with the instructor. This can put a huge strain on instructors, who are responsible not only for proctoring and grading each interview, but also for coordinating interviews with each student (which usually takes up several class dates), finding equipment to record the interviews, and figuring out how to access the recordings. Fortunately, the Language Center computer classrooms offer a variety of alternatives to one-on-one oral skills assessment!

All four of the LC classrooms can be used as a Digital Language Lab (DiLL). The DiLL uses an intuitive software interface that allows you to carry out the activities of a traditional language lab. You may already be familiar with the DiLL for informal speaking activities in class, but a growing number of language departments (Spanish, Chinese, ESL) are using the DiLL for graded speaking tasks.

The DiLL allows you to pair students (one-on-one or even small groups) and then, with the press of a button, they can record their conversations! You will have easy access to student recordings which are saved automatically to the DiLL server. Recordings can be played online or downloaded for easy access anywhere. What normally took days of class time will require less than one class period.

In my own experience, I have found that pairing students, rather than interviewing each student myself, produced more natural language and students felt less nervous, because they were working with their classmates and not their instructor. However, if you prefer to assign assessments that are individual, especially if you are focusing on accuracy, the DiLL is capable of this as well.

The DiLL also allows you to simply record students individually by giving them a prompt in advance or you can speak with the whole class (via the DiLL) and record their responses. You can even use a pre-recorded prompt, which allows students to listen, pause and record. Students can have multiple attempts, if the goal is accuracy. And again, each student's individual recordings will be available for you online immediately after they are submitted.

The DiLL is a great way to easily collect speaking samples from all of your students in the amount of time that you might normally spend interviewing one student. Their recordings are available immediately and can easily be accessed for grading, or even downloaded to share with the student. If you would like to learn more about how the Language Center classrooms can be used for oral assessment or if you would like to schedule a one-on-one training on how to use the DiLL, please contact us.

Have you used the DiLL for oral assessment? What was your experience like?

Thursday, October 25, 2012

The LPE Changes with the Times

The Language Testing Program now offers computerized Language Proficiency Exams (LPEs) in more languages than ever before. Tests currently in development feature culturally-rich authentic source material such as clips from modern Korean film, a look at the Somali-speaking community in the Twin Cities, and much more!

[Image: scene from a Korean film, an example of culturally authentic material that could be used in an LPE]

Computerized LPEs were established in 2001. Hundreds of language students take them each semester to fulfill the College of Liberal Arts (CLA) language requirement and as a gateway to advanced language study. The mission of the Language Testing Program has remained constant: to accurately assess students' ability to listen, read, write, and speak in the target language.

However, the program has changed and grown since 2001. The LPE is constantly being improved, and the pace of modernization and innovation has picked up in the last few years. The Language Testing Program has focused on two new goals since 2010: to serve as many language students as possible, and to improve the students' exam experience by including contemporary and diverse media from the target language culture.

Before 2010, a computerized test was available only for students of French, German and Spanish and a few related languages. Students of Asian and other less commonly taught languages were limited to paper-based tests, or had no exam options at all.

Tests are now in place for Arabic, Chinese, Hmong, Italian, Japanese and Russian, and there is a second version of the Spanish LPE. In addition, development is underway for Finnish, Korean, Somali, and Swahili. All of these tests were made possible through an influx of funding from Title VI and other sources, along with a committed effort on the part of the Language Testing Program and the individual language programs to work together tirelessly and collaboratively. Once all of the newer exams are completed, the LPE will be available for almost all non-Classical languages offered in CLA and will reflect the diversity of languages available at the University of Minnesota.

As new tests are created, the Language Testing Program and the developers aim to bring new depth to the student experience and to conform more closely to current trends in second language pedagogy with increased emphasis on culturally-rich contexts. The new exams retain the original LPE goal of validating the work of students in their four semesters at the university by providing an opportunity to show what they can do with the target language in a communicative context.

However, they are not just tests - they are also learning opportunities for students, highlighting something new about the culture, history, or people of the target language through the use of authentic materials. Students may learn, for example, how traditional holiday celebrations have changed over time as societies become increasingly multicultural. There are also explorations of how gender roles have shifted and how these shifts impact language as well as cultural practice. One exam features an innovative, and perhaps surprising, environmental initiative. Another explores the lyrics of a popular song from a YouTube video.

The piloting process for new LPEs often includes a survey of student opinions about the test. Reactions to the new authentic content have been overwhelmingly positive. Test-takers have said that they were surprised and pleased to see that they had no difficulty reading texts that they might encounter on a daily basis in the target culture.

Here are some sample student reactions:

It made me realize the potential of a real-life usage for the language I've been studying.

I liked that the readings were all things I'd have to figure out in real life. It was a very pleasant experience to read articles from Japan.


The Korean LPE also offers a significant technological innovation: the incorporation of authentic video segments into the listening section. The test includes five diverse clips from modern Korean film showing natural and interesting interactions between native speakers. The use of authentic video is an excellent platform from which to assess listening proficiency, since it ties closely to the construct of listening in a communicative context, where meaning is negotiated based on a variety of input sources. The Korean listening exam has already been piloted once, and the response to the test was enthusiastically positive. Students reported that they especially enjoyed the video segments and felt confident that they could understand content overall, even though there may have been a few words unfamiliar to them.

The Somali listening section will include some authentic video segments as well. This exam stands out because it is set locally and explores the lives of immigrants integrating with the larger community as they share their language and culture - a reflection of the changing face of the Twin Cities.

Since 2010, new LPE creation has been led by Gabriela Sweet, who has worked tirelessly to organize a rotating team of developers, coordinate with multiple departments and stakeholders, and keep all projects on time and moving forward. The Korean, Somali, and Swahili development teams also include Language Center AV Developer Alaina Witt, Item Reviewers Xinyi Wu and Meghan McFadden, and LC Technical Coordinator Diane Rackowski.

The current language-specific developers are:

Finnish: Dan Karvonen, Jaana Viljakainen

Korean: Hangtae Cho, Yunseong Cheong

Somali: Said Ahmed, Abdulkarim Maalin

Swahili: Angaluki Muaka


Much of the funding for Korean development has been provided by a CLA InfoTech Tools for Discovery Grant. Title VI funding managed by the Institute for Global Studies has provided some travel and development grants for Somali and Swahili.

The Language Testing Program and the Department of Asian Languages and Literatures plan to present the new Korean listening section featuring authentic video later this winter. The U of M language community will have an opportunity to see how the classic LPE format can be modernized with technology to provide students with an educational, culturally-rich, and even enjoyable testing experience.


Monday, March 21, 2011

ACTFL proficiency assessment workshop held over Spring Break

The Language Center sponsored a three-day workshop on language proficiency assessment over Spring Break led by Dr. Robert Vicars, an ACTFL certified trainer. The first two days focused on oral proficiency, and third day on writing. This professional development opportunity was offered primarily to support instructors of languages for which a new LPE is in development. A total of 33 instructors attended the workshop, representing the following languages: Arabic, Chinese, French, German, Hmong, Italian, Japanese, Korean, Russian and Spanish. The workshop was very well received, and some attendees expressed interest in discussing implications for the instruction and assessment of language classes. Anyone interested in participating in a follow-up event should contact Stephanie at treat002@umn.edu.