Visualization evaluation has been addressed by many researchers, and many of their techniques and algorithms objectively measure visual work. A widely recognized limitation of those techniques is that subjective measurement is also necessary to fully evaluate the effectiveness of a visualization. Such subjective measurement is important not only for the instructor's assessment of students; students themselves need the ability to critically evaluate a visualization (Friedman & Rosen, 2017). We developed the visual peer review and its rubric to provide an alternative approach to visualization assessment in classroom and professional environments. Below are our publications, funding, and live demonstrations:

2021. Z Beasley, A Friedman, P Rosen. “Through the Looking Glass: Insights into Visualization Pedagogy through Sentiment Analysis of Peer Review Text.”

2020. Z Beasley, A Friedman, L Pieg, P Rosen. “Leveraging Peer Feedback to Improve Visualization Education.” IEEE Pacific Visualization Conference.

2019. Friedman, A. “Toward peer-review software and a rubric application in visual analytics classes: A case study.” Education for Information, Special Issue: Visual Learning.

2018. Friedman, A. & Rosen, P. “MyReviewers Visualization Peer Review.” The 5th International Conference on Writing Analytics, St. Petersburg, FL.

2017. Friedman, A. & Rosen, P. “Leveraging Peer Review in Visualization Education: A Proposal for a New Model.” IEEE Pedagogy of Data Visualization Workshop, Phoenix, AZ.

May 26, 2022. NSF Award III-2216227: “Using Behavioral Nudges in Peer Review to Improve Critical Analysis in STEM Courses.” Total awarded: $297,604.

Live Demonstration