Visualization evaluation has been addressed by many researchers and educators. While many techniques and algorithms can be measured objectively, most recognize that significant subjective measures are also needed to fully evaluate the effectiveness of a visualization. Such subjective measures matter not only for the instructor's assessment of students; students themselves also need the ability to critically evaluate a visualization. In our studies, we evaluated a new framework for assessing student visualizations based on a peer-review rubric.

Our work:

2024 Dinh, L., Friedman, A., and Hawley, K. “Examining Peer Review Network Dynamics in Higher Education Visual Communication Courses Using ERGM.” Computers and Education Open.

2021 Beasley, Z., Friedman, A., and Rosen, P. “Through the Looking Glass: Insights into Visualization Pedagogy through Sentiment Analysis of Peer Review Text.” IEEE Computer Graphics and Applications.

CONFERENCE PAPERS

2024 Friedman, A., Hawley, K., Rosen, P., and Rahman, Md D. “Enhancing Student Feedback Using Predictive Models in Visual Literacy Courses.” IEEE EDUCON 2024, IEEE Global Engineering Education Conference, Kos, Greece.

2020 Beasley, Z., Friedman, A., Piegl, L., and Rosen, P. “Leveraging Peer Feedback to Improve Visualization Education.” IEEE Pacific Visualization Symposium (PacificVis).

2018 Friedman, A., and Rosen, P. “MyReviewers Visualization Peer Review.” The 5th International Conference on Writing Analytics, St. Petersburg, FL.

Live demonstration of Visual Peer Review:
www.visualpeerreview.org