Visualization evaluation has been addressed by many researchers, and many of the resulting techniques and algorithms measure visual work objectively. A common thread across this work, however, is the recognition that subjective measurement is also necessary to fully evaluate the effectiveness of a visualization. Such subjective measurement matters not only for the instructor’s assessment of students; students themselves also need the ability to critically evaluate a visualization (Friedman & Rosen, 2017). We developed Visual Peer Review and its rubric to provide an alternative approach to visualization assessment in classroom and professional environments. Here are some of our publications, funding, and our live demonstration:
2023. Friedman, A. & Hawley, K. “Visual Peer Review: Examining Students’ Parts of Speech Comments vs. Grade.” EdMedia + Innovate Learning, Association for the Advancement of Computing in Education (AACE).
2022. Beasley, Z., Friedman, A. & Rosen, P. “Through the Looking Glass: Insights into Visualization Pedagogy through Sentiment Analysis of Peer Review Text.” IEEE Computer Graphics and Applications.
2020. Beasley, Z., Friedman, A., Pier, L. & Rosen, P. “Leveraging Peer Feedback to Improve Visualization Education.” IEEE Pacific Visualization Symposium (PacificVis).
2019. Friedman, A. “Toward peer-review software and a rubric application in visual analytics classes: A case study.” Education for Information, special issue on Visual Learning.
2018. Friedman, A. & Rosen, P. “MyReviewers Visualization Peer Review.” The 5th International Conference on Writing Analytics, St. Petersburg, FL.
2017. Friedman, A. & Rosen, P. “Leveraging Peer Review in Visualization Education: A Proposal for a New Model.” IEEE Pedagogy of Data Visualization Workshop, Phoenix, AZ.

Funding: May 26, 2022. NSF Award III-2216227, “Using Behavioral Nudges in Peer Review to Improve Critical Analysis in STEM Courses.” Total awarded: $297,604.00.

Live Demonstration: www.visualpeerreview.org