Improving Cross-Collaboration within Product Development
Project Leader

Challenge
For a cognitive design course, I worked with a team of 5 students (3 designers) to explore cross-collaboration within product development teams in the Tech sector, with the goal of improving relationship dynamics and communication among designers, developers, and product managers. The solution was a 4-step guideline for conducting design feedback reviews.
What I did:
Deliverables
Research
Understanding the Problem Space
For field studies, we attended Design and Developer meetups (where we recruited contacts) and observed daily team interactions at Tech companies and the UC San Diego Design Lab.
For contextual research, I looked at Organizational Psychology papers and articles on team collaboration and relationship building. To understand the users’ workflows, I read articles on processes such as Scrum, Agile, and Kanban, and reviewed the tools they frequently used (e.g., InVision and Slack).
Recruiting Stakeholders
Participants were designers, developers, project managers, and marketing managers from a range of companies, from global teams like Facebook and Amazon to smaller 50-100 person teams in EdTech and Telemedicine, as well as academics from the UCSD Design Lab and Design and Computer Science students.
Interviewing Process
I created the interview questions, protocol, and online survey for the team to use. Together, we interviewed around 20-25 individuals, formally and informally, asking them about their experiences, practices, and challenges with cross-collaboration. Example questions included:
- Think about your last experience incorporating team feedback. What was your process, and what was the experience like?
- At what stages of your design process would you usually and/or ideally incorporate team feedback?
- What kind of questions do you usually ask during feedback?
- What artifacts or software do you use to communicate and collaborate across teams? What has been your experience with these tools?
Refocusing Research Question Around Design Reviews
Based on our initial rounds of interviews, we found that the most significant and common problem area was design reviews and hand-offs between Developers and Designers. Specifically, we wanted to understand how to streamline the experience in a way that is valuable and informative for both parties while bringing out their expert knowledge during review sessions.
Insights
Our most salient findings from the interview data fell into three areas:
- Tasks to be accomplished: documenting the user story, use cases, roadmap, and contextual research before sending work out for feedback
- Benefits of team feedback: bringing context and background knowledge into the work
- Challenges of team feedback: differing priorities among designers, and feedback conflicts caused by differences in expert domain knowledge
Affinity Diagramming
Data consolidation activities we engaged in included affinity diagrams, personas, and design models. Our affinity diagram process consisted of gathering over 100 relevant quotes from our interviews and organizing them in a spreadsheet by interviewee and by whether each quote fell under Feelings/Philosophy, Tasks, Goals, or Implied Tasks (see above).
Identity Models
The identity model offers a clear view of what the different facets of a single user’s identity demand for a seamless integration between user and product. It features 3 sections: I am, I like, and I do. I filled in each section for the different types of stakeholders we were designing for, added background information in a narrative style, gave each a related catchphrase, and listed the “give-mes” each would demand. Ultimately, the identity models were useful in our ideation process, giving us a quick way to visualize our users and constrain our solutions to relevant design requirements based on users’ needs and goals.
Design
Design Requirements
Based on the analysis of the research data, I outlined the challenges and pain points of the feedback process and assembled a list of design requirements to introduce to the brainstorming session:
- Process should be quick and focused
- Session should be framed around user research and context
- Information should be centralized, i.e. not require too many tools to minimize loss of feedback and notes
- Expert knowledge from both disciplines should be utilized
- Activities should lead to actionable insights (e.g. prioritization of tasks, reorganization of the product roadmap, etc.)
- All questions and assumptions should be addressed
Ideation and Testing
We created individual storyboards and brought our ideas together through multiple whiteboarding sessions, referencing our design models and interview transcripts to support design decisions.
We tested our ideas with a few users from different organizations, and the 4-step process received the most praise; one participant said it “captures the barebones of the feedback process.” The rating system was weak because it was too “constrained,” its feedback was vague, and it “did not allow for questions.” The post-its were well received by some for being “efficient,” but the trade-off was the lack of an actionable plan as an outcome.
Brainstorming a Workflow
We worked together to finalize a 4-step process that incorporates developers’, designers’, and product managers’ insights and expertise into an actionable plan for the prototype. Information we imagined including was the process timeline, human-centered design goals, company values, and each team member’s priorities and goals for the project.
- Both Designers and Developers outline their goals for the review session, what they want to learn from each other, and their respective responsibilities and expectations for the project.
- To frame the review session, the designer presents a use case, storyboard, personas, etc., and lists key user research insights before introducing the design solution.
- As the designer walks through the deliverable, the developer evaluates it by writing feedback categorized as positive, negative, or neutral. Concerns revolve around technical feasibility, development time, UI specifications, and potential problems.
- Based on the information, the two parties answer any underlying questions and work together to compromise and strategize the next steps or changes.
Solution
Final Design Solution
The final design was a guidebook created by another team member, who translated our wireframes perfectly. As Project Lead, I was occupied running tests and writing milestones for the course, so I was not able to oversee the final prototype design. Overall, we received a high grade and praise for our research process and consolidation of data.
If We Had More Time...
Given the deadline, we delivered the final solution without much iteration after user testing. In terms of the Hierarchy of User Needs, I was dissatisfied with this design’s layers of support, credibility, and desirability. If we had more time to iterate on the solution based on testing data, I would have:
- Implemented a simple framework for providing user context, one that would have both parties fill out a short description of users, needs, uses, company values, features, and product requirements.
- Designed a more modern, professional overall aesthetic
- Included a timeline where teams could rank features and UI components based on priority and feasibility.
- Added more labels and descriptors within the pages to further constrain the process, so designers and developers would not have to make assumptions about what, how, and when things should be discussed.
- Provided an instructional page as a supplement to improve the tool's credibility.