Springboard Industry Design Project — Fall 2023

How can reporting inappropriate content be more effective?

Team
Shaghayegh Cyrous
Xiaoyang Zhu
Gloria Rasmuzzen
Nicole Kwik

Role
UX Researcher, UI Designer, Usability Testing

Tools
Figma, Miro

Timeline
4 weeks

INTRODUCTION

As part of the Springboard UI/UX Design Career Track, I teamed up with three fellow trainees on a real-world design project for Project: Human City. We worked closely with the project manager to design a new content moderation system for their two flagship apps: Spotstitch, a social media, events, and gaming platform, and Co-quest, a task-based service app. Our contributions focused on the early stages of the design process, including user research, ideation, and developing low-fidelity wireframes to present initial design solutions.

PROBLEM

The absence of an effective content moderation system

At the time of this project, Project: Human City’s flagship apps, Spotstitch and Co-quest, lacked a content moderation system to effectively manage and regulate user-generated content. This absence posed significant risks, including the potential spread of inappropriate, offensive, or misleading material, which could negatively impact user trust, safety, and overall engagement on the platforms.

COMPETITIVE ANALYSIS

Existing platforms lack an editing or cancellation option

Our first step was to conduct an in-depth competitive analysis of apps offering similar services to Spotstitch (a social media, events, and gaming app) and Co-quest (a task-based service app). We identified a common gap across most platforms: the absence of features allowing users to cancel or edit their submitted reports. This highlighted an opportunity to enhance the user experience in our content moderation system.

USER FLOWS

Mapping out the reporting process

Building on the insights from our competitive analysis, we mapped out the user flows for the content moderation system. This process helped us visualize the different paths users would take when reporting content, ensuring the experience was intuitive and user-friendly. The final user flows were divided into three key components.

User-oriented Moderation

This flow illustrates the journey reported content undergoes: each report is reviewed by 2-4 users before a final decision to approve or deny it is reached.
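To make the decision logic concrete, here is a minimal, hypothetical sketch of how a 2-4 reviewer majority vote could resolve a report. The function name, vote labels, and tie-handling rule are illustrative assumptions on my part; our hand-off was low-fidelity wireframes, not an implementation.

```typescript
// Hypothetical sketch: a report is settled by a simple majority of 2-4
// reviewer votes. The tie-handling rule below is an assumption, not part
// of the delivered design.

type Vote = "approve_removal" | "deny_removal";

interface Review {
  reviewerId: string;
  vote: Vote;
}

type Outcome = "approved" | "denied" | "pending";

function decideReport(reviews: Review[]): Outcome {
  // Wait for at least two reviewers before deciding anything.
  if (reviews.length < 2) return "pending";

  const approvals = reviews.filter(r => r.vote === "approve_removal").length;
  const denials = reviews.length - approvals;

  if (approvals > denials) return "approved";
  if (denials > approvals) return "denied";

  // Tied: ask another randomly selected reviewer, up to the cap of four;
  // at four reviews, a tie defaults to keeping the content (assumption).
  return reviews.length >= 4 ? "denied" : "pending";
}
```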

Main Reporting System

This flow illustrates the general content reporting process which will be used in both the Spotstitch and Co-Quest apps.

Content Review Process

This flow illustrates how reported content is reviewed: users selected through the user-oriented moderation system are prompted to evaluate the report and reach a decision.

INITIAL SKETCHES

Visualizing our desired design direction

Next, we created initial sketches based on our user flows. This allowed us to visualize the desired design direction and present our concepts to the project manager for feedback before crafting the final wireframes. In the sketches, we focused on the critical paths: the reporting process, post-reporting notifications, and the content review process.

THE SOLUTION

Flexible Reporting and User-Oriented Report Reviews

For this project, the project manager preferred the final designs to be presented at the low-fidelity stage. Based on their feedback and approval of our initial sketches, we developed detailed low-fidelity wireframes using Figma, which marked the final phase of our design process. Additionally, to align with the project manager's vision, we created a flexible content moderation system that could be seamlessly implemented across both Spotstitch and Co-quest.

HOW WILL IT WORK?

The Main Reporting Process

This flow shows how the user will be able to submit a report. In addition to reporting the content, the app offers further actions the user can take, as well as an option to cancel the report.

Post-Reporting Notifications

Users will be able to keep track of their reports through the Notifications section of the app, where they can view updates and still have the option to cancel a report.

Appeal Report Process

If a user's report to remove content is denied, they will have the option to appeal the decision. This allows them to provide additional context or details, which will be reviewed by the company to ensure a thorough reassessment of the report.

User Notifications to Review Reports

As part of the user-driven moderation system we designed, users are randomly selected to review content reported by others. They receive notifications or pop-ups while browsing, prompting them to participate in the review process directly from the post they’re viewing.
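Taken together, these screens imply a simple report lifecycle: submit, peer review, approve or deny, optional appeal, with cancellation available while the report is still open. The sketch below models that lifecycle as states and allowed transitions; the state names and transition rules are my own illustrative assumptions, not a delivered specification.

```typescript
// Hypothetical report lifecycle implied by the screens above.

type ReportStatus =
  | "submitted"    // user filed the report (can still cancel)
  | "under_review" // randomly selected users are reviewing it
  | "approved"     // reviewers agreed; the content is removed
  | "denied"       // reviewers disagreed; the reporter may appeal
  | "appealed"     // reporter added context; the company re-reviews
  | "cancelled";   // reporter withdrew the report

// Allowed transitions, mirroring the cancel, notification, and appeal
// options described above. Terminal states have no outgoing transitions.
const transitions: Record<ReportStatus, ReportStatus[]> = {
  submitted: ["under_review", "cancelled"],
  under_review: ["approved", "denied", "cancelled"],
  denied: ["appealed"],
  appealed: ["approved", "denied"],
  approved: [],
  cancelled: [],
};

function canTransition(from: ReportStatus, to: ReportStatus): boolean {
  return transitions[from].includes(to);
}
```

For example, canTransition("denied", "appealed") returns true, while a report that has already been approved or cancelled has no further transitions.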

USABILITY TESTING

More explanation and context needed to review reports

As the final step in our project, we conducted a usability testing session to gather valuable feedback on our design and identify key areas for improvement before passing it back to the design team for further iterations. The testing revealed that the primary issue with our design was a lack of clarity and context during the report review process. The majority of participants expressed frustration with the following points:


  • Unclear Understanding of the User-Driven Moderation System: Participants were unsure how the user-oriented moderation feature functioned and what their role entailed.

  • Lack of Guidance on Community Standards: Users were uncertain whether they should reference community guidelines when reviewing reported content.

  • Missing Context for Reported Content: No information was provided about why the content they were reviewing had been reported, leaving users without the necessary context to make informed decisions.

REFLECTION

Overall, this project provided me with valuable experience collaborating within a UX team and working cross-functionally with the project manager. For our final hand-off, my team and I presented our low-fidelity designs to the project manager, along with key insights from the usability testing we conducted. The client was highly satisfied with our work, highlighting our team’s responsiveness and thorough approach throughout the project.


While we successfully designed the first iteration of Project: Human City’s content moderation system, additional time would have allowed us to further refine the design based on the feedback from usability testing. Key areas for improvement include:


Enhancing User Control Over Reported Content:
Usability feedback revealed that users found the report review process somewhat confusing and time-consuming. Our top priority would be to streamline this process by providing clearer context throughout the content review flow and simplifying the steps involved. This would not only improve user understanding but also reduce the time required to review reported content.


Optimizing the Reporting Process:
Although participants generally found the reporting process straightforward, we identified opportunities for refinement. We would replace the current cancel button with a confirmation pop-up ("Are you sure?") to prevent accidental exits and redesign the “Choose Reason” dropdown menu for better clarity. Additionally, we would enhance the camera feature to make uploading supporting images more intuitive.

