Instructors expanding their teaching toolkits with new UVA learning technologies
This article was published in UVA Today on Thursday, December 10, 2020.
With many classes online this semester, instructors have found alternative, innovative ways to manage their classrooms and keep students engaged. To help with these efforts, UVA faculty and graduate students now have easy access to a host of new technologies to enhance teaching and learning.
This summer, the Executive Vice President and Provost and the Chief Information Officer invested in licenses for seven learning technologies: Digication, Gradescope, Hypothesis, MATLAB Grader, Peerceptiv, Poll Everywhere, and VoiceThread. More information on these tools and the rest of UVA’s digital catalog can be found on the Learning Tech website, another new initiative that launched ahead of the fall semester.
Instructors are taking advantage of these new or recently expanded technologies to improve assessment and evaluation processes and increase student engagement and collaboration.
“The University invested in these technologies to help faculty make the best of a necessary situation during the pandemic — a lot of online course instruction,” said UVA Provost Liz Magill. “But we have found that this has led to faculty innovation, too. Using new software is no exception — our faculty have changed their approach to many standard pedagogical practices and will likely retain these innovations long after the pandemic is over.”
Statistics Assistant Professor Rich Ross and Chemistry Lecturer Alicia Frantz are among nearly 550 instructors finding Gradescope to be an essential addition to their teaching toolkits. The tool allows them to streamline the grading process and gain insights into how their students are learning in their large-enrollment courses.
“I have lots of individual work and lots of group work, and trying to figure out how to manage assessing or evaluating that work is a difficult thing,” Ross said, “but having tools like Gradescope has been super helpful in thinking about how we do this at scale.”
Instructors can grade exams, problem sets, and other assignments more efficiently by building intuitive, dynamic rubrics. Ross estimates that the tool saves him about 40 hours of grading over the course of the semester.
“You actually never have to take your hands off the keyboard,” Ross said. “You can get very, very quick and accurate at grading work, especially in comparison to grading paper submissions.”
For Frantz, grading quickly means that she can return feedback to her students sooner, helping them to learn from their mistakes and better prepare for the next exam.
“Before I started using Gradescope, when we would hand back exams, there were always at least a third of the exams that never even got picked up, and those were always the students that were struggling that most needed to work through those problems,” she said. “I think just the ease with which they can look through their exam without having to make a special trip or to feel anxious about it, I think that’s been one of the best things that’s come out of this.”
Students also can ask their professor to look at a question again if the student thinks there was an error.
“I think students appreciate being able to submit a regrade request maybe partly because they realize that as a course staff, we’re admitting that we can make mistakes sometimes, and that we’re happy to have those conversations,” Ross said. “I think that actually builds a lot of trust between the instructor and the students.”
In addition to trust, Ross is able to build stronger relationships with his students.
“Gradescope has implemented several mechanisms that meaningfully … reduce the amount of time you have to spend grading and let you do more of one of my favorite things as a teacher, which is interacting directly with my students, talking to them about content questions, talking to them about potential career options.”
When the pandemic hit, History Professor Jennifer Sessions turned to a tool that she had used before to help maintain an important aspect of her courses—peer feedback.
“I started using Peerceptiv for guided peer review of essay assignments in my modern European history course in Fall 2017, and since then have used it in everything from big introductory surveys with several hundred students to specialized upper-level courses of a few dozen. This fall, we’re even using it in a graduate seminar.”
She’s one of 10 instructors who have Peerceptiv up and running in their courses this semester. The peer assessment tool encourages student development as teachers and learners through a research-validated cycle of anonymous feedback. Students can share recommendations with one another, while evaluating the quality of the reviews they receive.
Sessions says she values the tool because it helps students not only improve their individual papers, but also become better readers and editors.
“It sounds hyperbolic, I know, but I regularly use the term ‘magical’ in describing this power to colleagues. The anonymous online system depersonalizes the peer review process and facilitates more objective, honest feedback, which means writers get better, more useful feedback on their own drafts.”
Her students review drafts of each other’s work and then revise those drafts for final assessment, getting separate grades for each assignment.
“Reading and providing feedback on several of their colleagues’ drafts allows students to see what does and doesn’t work for the assignment, to think through why, and to talk about how to make a given piece of writing more effective.”
The tool has inspired Sessions to reexamine her course design.
“Using Peerceptiv has made me much more deliberate in designing courses to scaffold concepts and skills over the course of a semester. Particularly, it has pushed me to shift the focus of writing assignments from outcomes to process.”
Another popular new technology is Poll Everywhere, an audience response system that enables users to post activities like attendance items, quizzes, or polls and then display results in real time.
Math Professor Paul Bourdon, School of Data Science Assistant Professor Scott Schwartz, and 168 of their colleagues are using the tool this fall.
Schwartz said, “As I was [teaching my Data Mining] course, it just happened that after one of my lectures, where I had all this dense code, I thought, I want to ask some polling questions and it just was a hit with the students.”
He says using Poll Everywhere has been a low-stakes way to get more of his students to participate in discussions: “I just think it really lowers the bar to get engagement going. And that’s really what the students have told me when they talk about it.”
Bourdon, who serves as the Department of Mathematics’ Director of Lower Division Courses, was previously using a different polling platform and decided to adopt Poll Everywhere in all undergraduate calculus courses this fall. He uses the tool to “promote small-group discussion of interesting problems during class, to monitor student understanding, and to help students assess their own understanding.”
He solicited midsemester feedback from his students, many of whom had positive things to say about the tool. A student in his course wrote, “I think Poll Everywhere is useful because it makes me stay engaged in lecture. It also acts like a check on my understanding. I like how the professor made it so that the bulk of the Poll Everywhere grade is for participation not accuracy so I don’t feel pressured to have it all figured out at that instant.”
Bourdon and Schwartz will both keep Poll Everywhere an integral part of their toolkits.
“Poll Everywhere has totally changed the way I go about making my lectures,” Schwartz said. “Instead, I start by saying, ‘What is my Poll Everywhere conversation going to be about?’”