Introduction
Two meetings were held over two weeks to discuss the interface used by editors of an open publishing platform at UVA. These meetings covered known problems with the interface (identified through previously submitted tickets), focus areas for research, walk-throughs of the paper publication process, and ideation for research protocols and recruitment. A timeline was also discussed and agreed upon in the early stages of the project. Valuable insights from the team contributed to the key interview questions asked of editors in the UX testing sessions.
Why Was This Investigated?
The purpose of these testing sessions was to understand and improve editors’ experiences with the publishing platform and its interface, not to evaluate or assess the editors’ capabilities. These sessions served to inform decisions on extending or reconsidering vendor contracts. The research also focused on determining which interface features editors considered vital, how editors communicated with authors and other editors, and whether any contexts warranted the use of alternative tools. The outcomes of this research aimed to provide a better experience for editors using this software.
Market Research
Multiple meetings were conducted with vendors to evaluate alternative options. Demonstrations from other vendors (Fulcrum, Janeway, PKP, Ubiquity), along with findings from the user testing sessions, contributed to the final decision on which service to partner with.
Methodology
Research sessions were conducted with five participants. Each session included a semi-structured interview and a contextual inquiry with direct observation, and ran for an average of 30 minutes.
Contextual Inquiries
Editors were asked to demonstrate the process of paper submission, review, and publication in the publishing platform's interface, narrating their thoughts and concerns aloud as they worked. While they explored the interface, the researcher quietly observed their interactions, facial expressions, mouse movements, and other low-level behaviors, recording these observations as field notes.
Semi-structured Interviews
Editors were asked about their frequency of use, specific tasks and workflows, communication with others involved in the publication process, features and services used, their needs and expectations, likes and dislikes, and the customer support and assistance available to them.
Findings
The research data was collected through audio/video recordings, responses to interview questions, think-aloud protocols, and field notes.
I Want to Train Myself and Others
A novice user mentioned the training process and was asked to elaborate on the learnability of the interface. They expressed a desire not only to learn the interface themselves but also to be able to train others to use it efficiently.
“I am the kind of person who does not just need a manual. I need hands on training. I need to be doing it. I don’t learn well when people just show me. It needs to be interactive. And I do have a learning curve. And I need to be confident enough to give reviewers instructions.”
“If there is a problem, I need to be able to help people find solutions. Like if authors or reviewers who are new have issues, I should be able to help them. Or it should be intuitive enough and have help features available.”
Can I Trust Auto-generated Emails?
Editors found that auto-generated emails and templates had formatting issues, and that metadata and tokens in those emails did not always work as intended. Editors also believed that emailing reviewers about declined (revise-and-resubmit) papers required revealing authors’ identities; this may not actually be the case, but the workflow was clearly confusing to them.
“When I decline or accept and there are reviews in, I send an email to both reviewers. It says send a copy of this email by BCC to the reviewers. So, I end up emailing them outside of the publishing platform for decisions about the paper.”
Feature (In)compatibility
Editors mentioned that the interface was at times slow to load and that some important features (such as the Activity Log) did not work in certain browsers (such as Chrome).
“I was in there today mystified. It is not straight forward. It is handy to have it, a central place where everything is collected, but one of the things I find hardest is that it generates emails, and I can’t see them.”
The Importance of Training
Across the UX research sessions conducted with editors of the publishing platform, one of the main recurring themes was the importance of user training for new software. Multiple editors (especially novices) mentioned that their current practices were largely shaped by how they were first introduced to the software: some were given a resource (e.g., a manual), while others were verbally instructed by a peer. This led to inconsistent adoption of the software across users, raised questions around transparency and unpredictable outcomes, and often left users relying on alternatives (e.g., other modes of communication) to accomplish their objectives. These sessions led the researchers to believe that formal training should be offered to users, giving them an opportunity to learn the features available to them; hands-on training would also help surface compatibility requirements. This would encourage more consistent adoption practices and build capacity to train new users in the future, altogether leading to a better user experience for most (if not all) users.