I completed my Ph.D. in computer science at Oregon State University in spring 2022, advised by Dr. Margaret Burnett and co-advised by Dr. Tom Dietterich. In fall 2022 I joined the faculty at Penn State University. While my research interests have varied, my dissertation centers on eXplainable AI (XAI) problems from both the human side and the system side. Broadly speaking, XAI concerns any tools or techniques used by the wide variety of people who assess AI-infused systems, from expert developers to novice end users. Before returning to OSU as a Ph.D. student, I obtained my M.S. there in 2009 in the graphics group, advised by Dr. Ron Metoyer. My thesis project focused on understanding and supporting the needs of exercise prescribers in creating rich media for their patients.
My research focuses on explainable AI (XAI), a topic that spans AI, HCI, InfoVis, and a wide range of other disciplines. Because XAI covers so much ground, the problems my students and I think about include (but are not limited to): devising ways to understand a decision by probing an opaque-box system; building new systems that are more inherently explainable; studying how humans respond to different kinds of explanations; and designing tasks for humans to perform that allow measurement of an explanation's quality.