Visualization evaluation has been addressed by many researchers, and many of their techniques and algorithms measure visual work objectively. What these techniques also make clear is that subjective measurement is necessary to fully evaluate the effectiveness of visualization work. Such subjective measurement is not only important for the instructor's assessment of students; students themselves also need the ability to critically evaluate a visualization (Friedman & Rosen, 2017). We developed the visual peer review and its rubric to provide an alternative approach to visualization assessment in classroom and professional environments. Below are our publications, funding, and live demonstration:
2021. Z. Beasley, A. Friedman, and P. Rosen. "Through the Looking Glass: Insights into Visualization Pedagogy through Sentiment Analysis of Peer Review Text."
2020. Z. Beasley, A. Friedman, L. Piegl, and P. Rosen. "Leveraging Peer Feedback to Improve Visualization Education." IEEE Pacific Visualization Symposium (PacificVis).
2018. A. Friedman and P. Rosen. "MyReviewers Visualization Peer Review." The 5th International Conference on Writing Analytics, St. Petersburg, FL.
2017. A. Friedman and P. Rosen. "Leveraging Peer Review in Visualization Education: A Proposal for a New Model." IEEE Pedagogy of Data Visualization Workshop, Phoenix, AZ.
Funding
May 26, 2022. NSF Award III-2216227: "Using Behavioral Nudges in Peer Review to Improve Critical Analysis in STEM Courses." Total awarded: $297,604.
Live Demonstration: www.visualpeerreview.org