CU Thinking: Problem-Solving Strategies Revealed

To analyze engineering students' problem-solving strategies, we are collecting work completed on Tablet PCs and analyzing the digital ink, using "tags" to identify events of interest with custom-designed software called MuseInk. The work collected consists of problems completed in a first-year engineering course at Clemson University (CU), selected specifically for their level of complexity, their potential for multiple approaches or representations, and the level of structure and/or definition provided.

A "Tag Universe," a database of procedural events, errors, and other items of interest, has been developed to tag relevant events within student work. It is organized into categories based on a theoretical framework of process activities used during problem solving: knowledge access, knowledge generation, and self-management. Tags include items such as sketching the problem, identifying known and unknown values, manipulating an equation to solve for a desired variable, and checking the reasonableness of a solution. In addition, student errors are categorized as conceptual, procedural, or mechanical, and students' recognition of their errors is being analyzed based on signal detection theory. This analysis identifies "hits" (the student makes an error and self-corrects), "misses" (the student makes an error and does not recognize it), and "false alarms" (the student second-guesses a correct approach). MuseInk also allows the insertion of audio tags to document students' verbal commentary about what they were thinking when specific events occurred.

To date, worked solutions and audio commentary for three problem sets have been collected from a total of 26 students (19 males, 7 females). One of the three problem sets has been tagged by our research team, and an inter-rater reliability analysis was conducted to ensure consistent tagging. Tag data (written and verbal) are being analyzed for relationships between tag categories and students' academic backgrounds and prior knowledge about engineering. We are beginning to define criteria for structuring problems so that students from a broad array of prior educational experiences and levels of academic preparation can develop effective and transferable problem-solving skills.

While our methods, which use MuseInk as a research tool, are evolving, we are also considering how the software can be used as an instructional tool. A user survey was implemented to identify ways to increase its benefits to students, and based on the survey data we developed tutorials and additional MuseInk activities for use both inside and outside the classroom in Fall 2010/Spring 2011, including peer feedback exercises.
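As a concrete illustration, tags like those described above can be represented as records carrying an event name, its framework category, and its position in the ink stream. The sketch below is a hypothetical Python encoding; the field names, timestamps, and the mapping of individual tags to categories are our assumptions, not MuseInk's actual data model.

```python
from dataclasses import dataclass
from enum import Enum

# The three framework categories named in the text.
class ProcessCategory(Enum):
    KNOWLEDGE_ACCESS = "knowledge access"
    KNOWLEDGE_GENERATION = "knowledge generation"
    SELF_MANAGEMENT = "self-management"

@dataclass
class Tag:
    name: str                  # e.g., "sketching the problem"
    category: ProcessCategory  # framework category (mapping is illustrative)
    timestamp_ms: int          # hypothetical position of the event in the ink stream

# Example tags drawn from the list in the text (timestamps invented).
tags = [
    Tag("sketching the problem", ProcessCategory.KNOWLEDGE_GENERATION, 12_000),
    Tag("identifying known and unknown values", ProcessCategory.KNOWLEDGE_ACCESS, 45_500),
    Tag("checking reasonableness of solution", ProcessCategory.SELF_MANAGEMENT, 310_250),
]
```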
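The signal-detection categorization of error events also reduces to a small decision rule. Below is a minimal sketch under the assumption that each tagged event records whether the step was actually an error and whether the student flagged it; the fourth signal-detection cell ("correct rejection," i.e., correct work left alone) is implied by the theory but not listed in the text.

```python
from dataclasses import dataclass

@dataclass
class ErrorEvent:
    is_error: bool         # True if the tagged step was actually wrong
    student_flagged: bool  # True if the student questioned or corrected the step

def classify(event: ErrorEvent) -> str:
    """Classify a tagged event using the signal-detection scheme above."""
    if event.is_error and event.student_flagged:
        return "hit"            # error made and self-corrected
    if event.is_error:
        return "miss"           # error made but never recognized
    if event.student_flagged:
        return "false alarm"    # correct work second-guessed
    return "correct rejection"  # correct work left alone (implied fourth cell)
```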
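The abstract does not name the statistic used for the inter-rater reliability analysis; Cohen's kappa is one common choice for agreement on categorical tags, sketched here for two raters labeling the same events.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is observed
    agreement and p_e is chance agreement from each rater's label rates."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(counts_a) | set(counts_b)
    p_e = sum((counts_a[l] / n) * (counts_b[l] / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Example: two raters tagging six events (labels invented).
print(cohens_kappa(
    ["hit", "miss", "hit", "false alarm", "hit", "miss"],
    ["hit", "miss", "hit", "hit",         "hit", "miss"],
))  # substantial agreement -> kappa ~ 0.7
```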