The Keens are arguing, and their bickering is scaring away the “Peachlings” along the Chattahoochee River. But helping the Keens solve their problems — by fishing clues out of the water and answering questions on short reading passages — is a way to bring back the colorful creatures and earn beans.
Those are a few of the “River Clean-Up” challenges facing the players of Keenville, a collection of new game-based assessments for 1st- and 2nd-graders in Georgia – hence the peach and Chattahoochee references. Developed with $2.5 million appropriated by the state legislature, Keenville is also an early example of a game-based assessment that has moved beyond the research-project phase and been made widely available to educators.
To develop the game, the state education agency turned to FableVision Studios, which worked with Georgia Public Broadcasting to create an 8th-grade state history game featuring a girl named Savannah and her “trusty canine companion Peaches.” Keenville, which will eventually have 30 games, will be used in a formative way to assess young students’ progress on key reading and math standards, said Jan Reyes, the director of assessment development for the Georgia Department of Education (GDOE).
For 6- and 7-year-olds, “this seems to make the most sense. We’re not having kids at this age get out their No. 2 pencils,” Reyes said in an interview. “There is a reward system built in. Students are earning beans as they progress and spend those beans for their house.”
‘The benefits of fun and games’
Influenced by the popularity of video games, most games for measuring what students know and can do in an academic setting are integrated into an online learning activity instead of being a standalone assessment at the end of a unit or lesson, explained Eva Baker, a research professor at the University of California, Los Angeles, and the director of the Center for Research on Evaluation, Standards and Student Testing (CRESST).
The students "get the benefits of the fun and games" and the "assessment is something that happens underneath," Baker said in an interview. Baker, who admits she's a devoted player of Candy Crush Saga, has worked on game-based assessment for the U.S. Navy. And CRESST’s Connected Learning Study examines outcomes data that can be gathered when children are playing PBS games.
The field of game-based assessment took a leap forward when the Institute of Play, a nonprofit game design studio, launched the Games, Learning and Assessment Lab — or GlassLab — in 2012. With support from the Bill and Melinda Gates Foundation, the John D. and Catherine T. MacArthur Foundation, Electronic Arts and the Entertainment Software Association, the nonprofit adapted existing games for educational purposes — its SimCityEDU is one example — with the idea that built-in tasks and challenges could serve as assessment data for teachers.
GlassLab, however, announced in December that it is closing, largely because it never developed a large customer base within the education market.
“They proved how hard it is to change assessment in a school system,” said Yoon Jeon Kim, a research scientist with the Teaching Systems Lab at the Massachusetts Institute of Technology.
Describing herself as a member of the "game-based assessment scholar generation," Kim is also part of the lab’s "playful assessment" initiative, an effort to make assessment more fun and engaging, and to spread that work beyond small pilot projects.
“I was really frustrated,” she said in an interview. “Assessment work by definition has to be impactful. If we can’t create assessments that can be used in the classroom, then we're not doing something right.”
Kim is working on a few game-based assessments “with scale in mind,” she said. One is Shadowspect, a series of puzzles meant to assess 9th-grade geometry skills. She’s co-designing the program with teachers and plans to pilot it in the spring.
Another is Maker Moment, a paper-based, bingo-type game that students use to self-assess how and when they are demonstrating particular skills in makerspaces — a growing area of curriculum that doesn’t align with traditional assessment methods. Teachers in San Mateo, California, and Albemarle, Virginia, are currently using the tool and providing feedback.
Kim’s goal with game-based assessment is not to hide the test tasks behind cute characters or Fortnite-type landscapes. She wants the “assessment culture” itself to be playful in a way that makes students care about the skills they are developing.
“I like going head on with assessment,” she said. “I haven’t given up the hope that we can change standalone assessment tools.”
While traditional tests are useful for measuring a student's mastery of specific skills, they are less helpful at capturing the Standards for Mathematical Practice that are part of the Common Core, explains Padraic Kelly, who teaches Advanced Placement statistics at Prospect Hill Academy Charter School in Cambridge, Massachusetts. His students have been play-testing Shadowspect as it develops.
Games, he wrote in an email, can show how a student is making connections between topics, showing perseverance or using resources.
"For example, on a written exam, students will erase incorrect work and leave their best answer," he said. "In [Shadowspect], we can see the keystroke history of attempts that all students take and draw conclusions about student understanding and tenacity based on that."
He said the game especially benefits students who don't think they are strong in math.
"They don’t know that math doesn’t have to be solving written problems," he said. "It gives them a chance to see that math is a broad topic, and they can be good at certain aspects they didn’t know about, which I hope will give them the confidence to be interested in working to get better at the math they traditionally see in the classroom."
Validity, cost among key challenges
The developers of game-based assessments, however, still face a number of hurdles, Baker explained. For it to measure whether a student knows the correct answer or procedure, a game-based assessment has to “minimize the role of chance,” she said. For the student-player, there should also be a clear explanation of how to win — whether that’s by trying to beat a previous score or solve a puzzle.
Another question is whether the data collected during the course of the game is valid and precise enough to provide educators with the detailed information they need on a student’s performance. Some examples, Baker said, were “way too global,” and some models promised more than they could deliver.
“There’s a lot of ‘gee whiz’ going on,” she said. “It’s much easier to make slick things than it used to be.”
The cost of developing a high-quality, interactive game — one that comes even close to what students are playing outside of school and that is also a valid and reliable assessment — may be the biggest obstacle facing the field, experts say. GDOE is saving some money by creating its own program, Reyes said. “We develop it, and it’s ours as opposed to paying licensing fees year after year.”
Randy Bennett, a research chair at the Educational Testing Service, adds that some students are simply more skilled in — and motivated by — a game format than others. He added that while there is significant potential for the use of game-based assessment for formative purposes to grow, it’s unlikely that such games will replace any tests for accountability or high-stakes decisions anytime soon.
For now, Georgia teachers are letting students “play around” with the Keenville games. They include Treat Factory, which measures students’ understanding of tables and graphs, and Peachling Café, which focuses on place value. As they play, students’ responses are saved to a dashboard so teachers can monitor how a whole class or individual students are doing. Reyes, however, also sees the possibility of expanding Keenville to higher grade levels and content areas.
“This is really a developmental year,” she said. “Next year we’ll have a much better sense of how teachers are using that data.”