My name is Thomas W. Price, and I will be starting in Fall 2018 as an Assistant Professor at North Carolina State University, as a member of the Center for Educational Informatics.
My research goal is to reimagine educational programming environments as adaptive, data-driven systems that support students automatically as they pursue learning goals that are meaningful to them. I believe that every student should be able to learn computing with the support they need to be successful, working on projects that suit their values and interests. My research sits at the intersection of Computing Education Research (CER), Educational Data Mining (EDM) and Intelligent Tutoring Systems (ITS). My current research focuses on:
I am actively recruiting students interested in using data, AI and design to improve computing education.
Current NCSU Graduate Students: If you are interested in working with me, please feel free to reach out.
Prospective Ph.D. Students: If you are considering applying to NCSU's CSC Ph.D. program and are interested in working with me, I would be happy to talk to you more about the strengths of our program.
Undergraduate NCSU Students: If you are interested in getting hands-on research experience, feel free to reach out as well, and we can discuss if there are open positions in my lab.
iSnap is a programming environment designed to lower the barriers that novices face when first learning to program. It combines two effective support features: block-based programming and adaptive hints and feedback. In iSnap, students construct programs from drag-and-drop blocks, which reduces the initial challenges of programming syntax, while intelligent support features provide on-demand programming hints and feedback. You can learn more about iSnap, the algorithms that power it, and the open-access iSnap datasets at go.ncsu.edu/isnap.
You can see a full list of my publications here; below are some recent highlights:
By Thomas W. Price, Rui Zhi, Yihuan Dong, Nicholas Lytle and Tiffany Barnes. Presented at the International Conference on Artificial Intelligence in Education, 2018. Read the paper or view the presentation slides.
Abstract: In the domain of programming, intelligent tutoring systems increasingly employ data-driven methods to automate hint generation. Evaluations of these systems have largely focused on whether they can reliably provide hints for most students, and how much data is needed to do so, rather than how useful the resulting hints are to students. We present a method for evaluating the quality of data-driven hints and how their quality is impacted by the data used to generate them. Using two datasets, we investigate how the quantity of data and the source of data (whether it comes from students or experts) impact one hint generation algorithm. We find that with student training data, hint quality stops improving after 15-20 training solutions and can decrease with additional data. We also find that student data outperforms a single expert solution but that a comprehensive set of expert solutions generally performs best.
By Thomas W. Price, Zhongxiu Liu, Veronica Cateté and Tiffany Barnes. Presented at the International Computing Education Research (ICER) Conference, 2017. Read the paper or view the presentation slides.
Abstract: When novice students encounter difficulty while learning to program, some can seek help from instructors or teaching assistants. This one-on-one tutoring is highly effective at fostering learning, but busy instructors and large class sizes can make expert help a scarce resource. Increasingly, programming environments attempt to imitate this human support by providing students with hints and feedback. In order to design effective, computer-based help, it is important to understand how and why students seek and avoid help when programming, and how this process differs when the help is provided by a human or a computer. We explore these questions through a qualitative analysis of 15 students' interviews, in which they reflect on solving two programming problems with human and computer help. We discuss implications for help design and present hypotheses on students' help-seeking behavior.