Twinkel Home Dashboard on Tablet and Mobile

Twinkel
Helping after-school program administrators manage program participants, increase engagement, and complete daily tasks.

About the Project
Role: UI/UX Designer
Overview: Create the visual responsive designs for Twinkel, a web-based platform for after-school programs
Deliverables: Style Explorations, Website Design Mockups, Prototype, Design System
Tools: Sketch, InVision, Google Workspace, Miro
Project length: 5 weeks
Recruit, Retain, Engage
When an after-school program is effective, it can be a game changer for young adults. The best ones offer a wide range of benefits, from boosting students' academic performance, to promoting social skills, to helping kids stay physically fit, all within a safe and structured environment.
But how does an after-school program become effective?
Recruitment, retention, and engagement.
Administrators need a 360° view of their programs, with easy access to attendance reports, student profiles, and other daily tasks. How can we ease the administrative burden for them so their programs can shine?
Over the course of 5 weeks, I (along with a team of 4 other designers) set out to answer this question by developing the visuals for the backend of Twinkel, a web-based solution to these challenges.
Based on the wireframes handed off to us by a UX team, we worked to create high-fidelity mockups, prototypes, and a design system for Twinkel, so that admins can get back to the real focus of their programs: helping kids grow.

Original annotated wireframes handed off by the UX team
School's in session
Before we began the design process, we learned from the materials handed off to us that a key stakeholder wanted users to be able to participate and interact on the platform in an easy and frictionless way. The stakeholder specifically stated: “I want it to be as convenient as scrolling on Instagram and liking a bunch of stuff while you’re standing in line at the grocery store.”
We carried this idea with us as we began compiling our initial domain research, conducting our competitive analysis, and forming the design principles that would help guide us forward.

A sampling of the competition
Evaluating the landscape
In conducting our research, we set out to understand what was already on the market. We also homed in on potential gaps and opportunities that could benefit our users.
We asked: of all the software out there, what has already been established and is expected by administrators of after-school programs? Were there any emerging trends?
We learned that the main target audiences for this industry are community schools, educators, youth organizations, daycares, and aftercare programs. We also learned that staff tend to handle all daily operations, like attendance rosters, check-ins/check-outs, and event communication. Because staff members tend to wear many hats, it was imperative that the design be easy for non-technical users and that training new users be a breeze. Administrators also expect a bespoke solution tailored to their programs' specific needs.
We found that while the market is quite saturated as it stands, there is a growing expectation for platforms to be mobile-friendly, a gap for many providers of legacy software. This was evident in a trend we discovered: poor reviews for services that adapt poorly to mobile screens, or don't support a mobile format at all.
After looking at the domain at a high level, we identified the most popular direct and indirect competitors to perform a competitive analysis to better understand the landscape.
Our affinity diagramming exercise as a team uncovered the following strengths, weaknesses, and opportunities for us to take into consideration:
Strengths
Most competitors (particularly ones that had the benefit of being around for a long time) offered very feature-rich applications, with a high degree of customization and solid customer support teams.
Weaknesses
Many of the applications had poor text legibility and cramped screens. A few competitors had a distinct lack of data visualization, an overuse of lists, and dated interfaces. From the reviews we read, we also learned that many systems had a steep learning curve.
Opportunities
We found that there was an opportunity to display data and text clearly, in a clean and uncramped manner. Additionally, a more tailored and mobile-friendly design was an area that could be improved upon.
Design Principles
We used the findings from our research to help us establish the following design principles, which would guide our design decisions throughout the process.
Iterative
The work is never done. We will take our designs and adapt them based on what we find is working and what is not working. We must follow the user and observe what they do -- not what they say they do -- to improve their experiences.
Modular
The system needs to be made of customizable components that can be reused in different contexts. For instance, data visualization should be able to be seamlessly customized and displayed without negatively affecting the rest of the platform and ecosystem.
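As a purely illustrative sketch of this principle (all names here are hypothetical; implementation was outside our project's scope), a chart module might expose its customization points entirely through its own configuration, so that reusing or re-theming one instance can't ripple into the rest of the platform:

```typescript
// Hypothetical sketch of a modular chart component; not Twinkel's
// actual implementation. Each instance is configured entirely through
// its own props, so customizing one chart can't affect the rest of
// the platform.

type ChartKind = "bar" | "line" | "pie";

interface ChartModuleProps {
  kind: ChartKind;                            // which visualization to render
  title: string;                              // card heading shown to admins
  data: { label: string; value: number }[];   // points to plot
  accentColor?: string;                       // optional per-program theming
}

function describeChart({ kind, title, data, accentColor = "#7B2FF2" }: ChartModuleProps): string {
  const total = data.reduce((sum, point) => sum + point.value, 0);
  return `${title}: ${kind} chart, ${data.length} points (total ${total}), accent ${accentColor}`;
}

// The same module reused in two different contexts, with no shared state:
console.log(describeChart({
  kind: "pie",
  title: "Attendance by program",
  data: [{ label: "Program 1", value: 42 }, { label: "Program 2", value: 31 }],
}));
console.log(describeChart({
  kind: "bar",
  title: "Weekly check-ins",
  data: [{ label: "Mon", value: 18 }, { label: "Tue", value: 22 }],
  accentColor: "#FF5DA2",
}));
```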
Universal
We must design with the user and community in mind. Administrators need to be able to access what they need easily, so they can move on with their day. Visuals should be welcoming and pleasing, but also accessible and inclusive.
Diving In
With our initial research complete, each designer got to work designing three different style tiles to test with users and inform our directions. Although we were tasked with designing a system for adult administrators, my approach was to take a tiny bit of inspiration from the students the administrators work with and incorporate the feelings I hoped they would get out of their programs: freedom, comfort, and energy.

Concept A - Freedom
After-school programs are all about enrichment, exploration, and embracing freedom to learn more about the world. This style tile explores the road ahead. The administrators of these programs facilitate important extracurricular activities and this concept strives to capture the spirit of expanding oneself beyond the boundaries of a typical school day. To showcase the important engagement metrics and to help the data stand out from the theme, I used a card-based layout with drop shadows to place the information front and center.

Concept B - Hygge
Hygge is a Danish and Norwegian word describing a mood of coziness and contentment. Our primary target audience of after-school program admins should feel trust and comfort in their ability to responsibly manage a large group of students in their programs. Similarly, the parents and children themselves should feel safety and warmth about the programs in which they are enrolled and participating. The color palette comprises warm colors that evoke a late autumn day, and the UI elements are based around a simple gumdrop shape that is playful without being distracting.

Concept C - Kinetic
This style tile aims to capture the energy and motion of life during an after-school activity. Admins are constantly dealing with lots of moving parts, while at the same time fostering connection. Here, visualizations are made of connections, surfacing the most important pieces of data with pops of various hues of pink and purple. Components such as the pie chart “slices” stand out boldly and have a free flowing quality to them, as if they may jump off the screen. Cards as a framing device are tossed out here, in favor of allowing the visualizations to speak for themselves.
User testing
To test these initial design explorations, we presented and tested each style tile with several users and subject matter experts. We sourced users from within our personal networks, but also via social media and email. A mix of quantitative and qualitative methods was used during these interviews. Each interview was an opportunity to determine users' preferred design direction and to ensure that the direction pursued would align with the goals of the stakeholder.
Key takeaways
Overall, each of these style tiles was met with positive feedback, but 4 out of the 7 initial users we tested cited my “Concept C - Kinetic” style tile as their favorite. This surprised me because it was the one style exploration I designed that felt more experimental and potentially divisive.
While one user mentioned that the gradients and darker colors might be overbearing with extended use, multiple users likened the look to "dark mode" and found the boldness of the color palette and chart elements visually pleasing. One user called the appearance "modern and different, but in a good way." Users also liked the typography choice and found it to be very legible.
As surprising as it was to me that this style was selected, this was a good lesson for me to re-learn: you are not your user!

Dashboard Version 1.0
High Fidelity Version 1.0
For my initial high fidelity version of the platform's key screens, I took the positive feedback and tried to balance it with proper typographic hierarchy, while also playing around more with color. I shifted the starker purple background of my style tile to a soft radial gradient, in an effort to guide the user's eyes properly around the app.
I also got a little more creative with color choices and decided to give the components some breathing room, as if they had broken free of their borders in the wireframe. Specifically, I broke the side and top navigation out of the tabs to better blend in with the modern feel.


Methodology
For our initial high fidelity tests, we used the same mix of quantitative and qualitative methods as we did with our style tiles. The key difference for this round of testing was the introduction of prototypes, so we wanted to gather feedback on the user experience through the completion of a series of tasks.
We asked users to rate each task on a scale of 1 to 5 in terms of ease or difficulty. We gave users different fictional scenarios as part of "a day in the life" of an after-school administrator and then asked them how they might go about performing each action. For example, we might say to the user, "Jane is a new participant in Program 1. You need to check the roster to see if she is on the new attendance list. How would you go about doing that?" We also asked users to rate how engaged they felt using the application overall.
Participants were also instructed to “think aloud” and share their thoughts throughout the test. After every design was presented, participants went through a debrief to ascertain general visual desirability feedback and to gather their overall thoughts on the experience and visual presentation.
Testing results
My design tested well, with users praising the organization of information and visuals. These design aspects also contributed to high engagement scores. One area where my design needed improvement was the notification center: a couple of users noted that the bell icon and exclamation point did not communicate that an action could be taken. Additionally, the notification drop-down itself needed a clearer call to action to make it obvious that it was clickable.
There was also some confusion across all of our team's designs around how to access the attendance roster, but we learned later on that this was primarily a nomenclature issue, with many users having differing perspectives on what terms like "roster" mean.
High Fidelity Version 2.0
Since the overall aesthetic of my design was well-received, for my second iteration of the platform I focused primarily on fixing UX issues we discovered in testing. Additionally, I tweaked my grid settings in Sketch and re-aligned many components to subtly polish the overall design and make it feel tighter.
This second prototype required us to get creative and design some other visualizations in the app. Additionally, as per the brief, we got to work designing some key screens for mobile, with the goal of successfully translating the responsive design from the tablet version to a smaller device.




Version 1 and Version 2 Dashboards:
Notification center improved, spacing and sizing of elements polished.
Methodology Tweaks in Testing
For this round of testing, we used a similar methodology to what we implemented in round one, but with a couple tweaks to our process based on what we learned.
We modified the 1 to 5 ratings (ranking the ease or difficulty of completing a task) to a binary scale of "this is good" or "this needs work." We did this because we realized that if a task is rated as a 3, it's hard for us to then pinpoint exactly what can be improved and iterated on in a future version. Switching to this binary scale, with additional probing questions, ended up being a far more effective and impactful method that allowed for clear iteration on future designs.
We also incorporated ethnography into the tests -- specifically, passive observation -- to see what users would actually do instead of just taking their word for it. Finally, we asked users how well the tablet screens translated to the key mobile screens we designed.
Results
Users viewed my second iteration as an improvement, and they rated the evolved design highly.
The notification center, which previously needed some work, was now rated as much clearer. There was a 100% success rate in the completion of our scenarios/tasks. Spacing, organization, and visuals were considered very strong. All users found the overall evolution of the design easy to use, and it was noted that the design felt consistent from tablet to mobile.
One area that could be improved upon harked back to the initial project brief (which stressed an Instagram-like easiness) -- users preferred data visualization modules on mobile to be scrolled only vertically. Charts that veered off the screen were correctly interpreted as containing a horizontal scroll function, which users did not like. They much preferred for graphs and charts to be resized to fit their screens.
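To make the recommendation concrete, here is a minimal sketch of the vertical-only behavior users asked for, assuming a hypothetical `.chart-module` container (this illustrates the design intent; building it was outside our project's scope):

```typescript
// Illustrative sketch: fit each chart to its container's width so the
// page only ever scrolls vertically on mobile. Class and function
// names are hypothetical.

function fitChartsToViewport(): void {
  document.querySelectorAll<HTMLElement>(".chart-module").forEach((chart) => {
    const available = chart.parentElement?.clientWidth ?? window.innerWidth;
    chart.style.width = `${available}px`;   // shrink to fit, never overflow
    chart.style.overflowX = "hidden";       // no horizontal scroll inside the card
  });
}

// Re-fit whenever the viewport changes (rotation, resize).
window.addEventListener("resize", fitChartsToViewport);
fitChartsToViewport();
```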





Original UX wireframe (with untested calendar) and my design
Lesson Learned
The biggest lesson I learned from this project was just how much UX and UI can overlap. The UX team that handed off their wireframes to us hadn't been able to test many of the features they designed, and we weren't able to confer with them. This meant that our team wasn't simply creating a UI treatment for Twinkel; we had to test some of those untested ideas as well, which was fun!
What that ultimately meant was that we had to adhere closely to the wireframes, but also subtly challenge them when untested modules appeared confusing. Sometimes we would rearrange certain modules on the screen and target those areas in our interviews to see if they were well-received.
For instance (as per the images above), the calendar functionality and some other menu functions on the attendance page were cited by the UX team as untested recommendations, with further testing needed before implementation. In the wireframes, the calendar was very tiny and off to the side, so I decided to make it larger and position it more prominently on the screen.
Future Recommendations
After we tested our design iterations, I created a design system for Twinkel to use for future implementation of the platform, as well as guidelines that can be followed when designing modules outside the scope of the project; a rough sketch of how those foundations might be encoded follows the list below. Our team also came up with a couple of recommendations and next steps for the future (based on user requests) that we would tackle if we had more time. These included:
Designing other key screens, such as student messaging capabilities
Adjusting data visualization modules (charts + graphs) to be more mobile friendly
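To give a flavor of what the design system guidelines cover, here is a hypothetical sketch of how the system's foundations could be handed to developers as tokens (all values below are placeholders for illustration, not the actual Twinkel palette or type scale):

```typescript
// Hypothetical design tokens. Values are placeholders, not the real
// Twinkel palette or typography.

const tokens = {
  color: {
    background: "#1E1433",   // dark, "dark mode"-like base from the Kinetic style
    accentPink: "#FF5DA2",
    accentPurple: "#7B2FF2",
    textPrimary: "#FFFFFF",
  },
  typography: {
    fontFamily: "sans-serif",
    scale: { body: 16, h2: 24, h1: 32 },   // px sizes for a simple type ramp
  },
  spacing: { sm: 8, md: 16, lg: 24 },      // consistent breathing room between modules
} as const;

// A module designed outside the project's scope can stay on-brand
// simply by consuming the tokens:
function cardStyle(): Partial<CSSStyleDeclaration> {
  return {
    backgroundColor: tokens.color.background,
    color: tokens.color.textPrimary,
    padding: `${tokens.spacing.md}px`,
  };
}
```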
Overall, this was a challenging and fun project. It was made all the better with the mission of helping give after-school administrators some precious time back, so they can focus on the great work they do with students.