$5.7 Million NSF Grant To MSU To Boost STEM Learning
October 1, 2013 12:50 PM
EAST LANSING (WWJ) – Michigan State University has received two National Science Foundation grants totaling $5.7 million, funds that will be used to explore ways of using computer software to analyze student writing in science and engineering classes.
The goal is to help retain more students who are enrolled in the so-called STEM disciplines – science, technology, engineering and mathematics.
The two grants comprise a five-year, $5 million award to develop a website where student exam answers can be analyzed, and a three-year, $718,000 grant to help instructors use the software.
In typical large-enrollment STEM courses, faculty usually rely on multiple-choice exams because they are easily scored by computers. But multiple-choice exams hide a great deal of information about student thinking.
“Students answering questions in their own words is the most meaningful way for instructors to identify learning obstacles,” said Mark Urban-Lurain, principal investigator for the $5 million grant and co-director of MSU’s Center for Engineering Education Research. “The realities of typical large-enrollment undergraduate classes, however, restrict the options that faculty members have for evaluating students’ writing.”
A team of MSU researchers will use the grants to develop computerized tools that analyze students’ written responses to homework, quiz and test questions and predict how experts would assess them.
“When students express what they know in their own words, it gives a deeper, richer view of what they know and have learned,” said John Merrill, principal investigator of the $718,000 grant and director of MSU’s Biological Science Program. “When students write something that is untrue, now you know the challenge you need to correct in class. It’s a more interactive form of learning and teaching.”
The computer software is based on programs used in the business world to analyze surveys. It picks out words and phrases in students’ writing that provide insight into how they understand the course material, and it can analyze responses several sentences long.
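As a rough illustration of that kind of lexical analysis, the Python sketch below uses the open-source scikit-learn library to pull out the words and two-word phrases that appear in a pair of short student answers. It is a hypothetical stand-in: the article does not describe the project’s actual software, and the example answers and choice of library are assumptions for illustration only.

    # Hypothetical sketch of the kind of lexical analysis described above,
    # using scikit-learn. The MSU project's actual software is not
    # specified in the article; the student answers are invented.
    from sklearn.feature_extraction.text import CountVectorizer

    student_answers = [
        "The plant gains mass from carbon dioxide in the air",
        "The plant gets heavier because it absorbs soil through its roots",
    ]

    # Extract single words and two-word phrases (unigrams and bigrams),
    # dropping common English stop words.
    vectorizer = CountVectorizer(ngram_range=(1, 2), stop_words="english")
    vectorizer.fit_transform(student_answers)

    # Each extracted word or phrase becomes a feature that an instructor,
    # or a downstream model, can inspect.
    for phrase in vectorizer.get_feature_names_out():
        print(phrase)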
A major part of the funding is targeted for completion and public rollout of a fully automated website where instructors around the world can have their students’ open-response answers analyzed automatically.
Other collaborators on the project include faculty members at the University of Colorado at Boulder, the State University of New York at Stony Brook, the University of Maine at Orono, the University of Georgia, the University of South Florida and Western Michigan University.
Merrill said improvements to current constructed-response assessments would reveal more about student misconceptions, allowing faculty members to make small corrections throughout the semester.
“The software can analyze text and determine correctness,” he said. “But we have to train the computer to do that.”
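Merrill’s point about training can be made concrete with a small supervised-learning sketch. The Python example below is again a hypothetical stand-in rather than the MSU team’s actual method: it fits a simple scikit-learn text classifier to a toy set of answers that experts have already scored, then predicts the expert score for a new answer. All of the data in it is invented for illustration.

    # Hypothetical sketch: train a text classifier on expert-scored
    # answers, then predict how an expert would score a new one. The MSU
    # team's real models and data are not described in the article;
    # everything below is a toy stand-in.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Tiny training set: answers hand-scored by experts
    # (1 = sound reasoning, 0 = misconception).
    answers = [
        "Mass comes from carbon dioxide fixed during photosynthesis",
        "The plant eats soil to grow bigger",
        "Carbon from the air is built into sugars",
        "Water absorbed by the roots turns into plant body",
    ]
    expert_scores = [1, 0, 1, 0]

    model = make_pipeline(CountVectorizer(ngram_range=(1, 2)),
                          LogisticRegression())
    model.fit(answers, expert_scores)

    # Predict the expert score for an unseen student response.
    print(model.predict(["The plant builds its body from CO2 in the air"]))

In practice, a system like this would need many expert-scored responses per question before its predictions could reliably stand in for expert grading, which is the training effort Merrill describes.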