Spatial Vis is an engaging iPad app that teaches students valuable spatial visualization skills. The app is used in classrooms and available on the App Store.
As the second designer to join the company, I helped create a new version of the app aimed at an older target audience. I designed new content and question types in Sketch 3, pored through app usage analytics in Excel to make data-driven design decisions, surveyed users to identify pain points, and conducted outreach (informal contextual inquiry plus think-alouds) at a local public school to better understand user behaviors.
My Role: UX Designer and Researcher
Tools & Methods: Analytics Analysis, Survey, Contextual Inquiry, Think-Alouds, Prototype Testing, Sketch 3, Excel
Deliverables: Mockups, Presentations & Reports
Employer: E-Grove Learning
I helped create an enjoyable and engaging app that teaches spatial visualization to a variety of age ranges using a carefully crafted curriculum.
Spatial visualization is a critical skill linked to success in STEM, but schools don't teach it. Before I joined the company, the app catered to only one age range and suffered from several serious pain points.
What is Spatial Visualization?
Spatial visualization is the ability to visualize and manipulate 2D and 3D shapes in one's mind. It's a valuable skill, but it's not typically taught in schools. It can be developed by playing with Legos or certain video games (which are usually marketed toward boys), and underdeveloped spatial visualization skills have been highlighted as a factor that may deter girls from pursuing science courses or careers.
IT HAS IMPORTANT BENEFITS
Spatial visualization skills are correlated with higher grades in STEM subjects, and are used in many careers.
Higher Graduation Rates
Spatial visualization skills have been correlated with higher university graduation rates, especially within engineering subjects.
Since our app uses a freehand sketching-based interface, "paper prototypes" actually served as a better means of evaluation than Axure, InVision, or any other high-tech tool could have. Participants were selected from a pool of current users, shown a printed mockup, and asked to complete the task just as they would on an iPad (except that instead of a stylus, they used a good old-fashioned pencil). We evaluated prototypes based on the correctness of participants' submissions and the time they took.
Contextual Inquiry - Think-Alouds. With Legos.
We sometimes did outreach at a local school by bringing our 40 iPads to classrooms. This served as an opportunity to do contextual inquiry. Whenever a student raised their hand for help, we asked them to show us how they were solving the problem… using Legos. The Legos served as a crucial cognitive artifact that concretely showed us how students understood (or misunderstood) the app curriculum. This contextual inquiry revealed that users misunderstood a crucial concept (namely, the rotation of 2D vs. 3D shapes). We deduced that users' misunderstanding stemmed from content that taught them bad habits, and we promptly redesigned that content to build better ones.
Data-Driven Design - Analytics + Survey Results
Our app collected a variety of quantitative measurements from users, and even saved all of their drawing submissions. Combined with our survey results, this rich data set helped answer key questions and identify frustrating pain points.
- Unhelpful defaults. The lack of a sensible default added extra steps to the user flow whenever users needed to erase a drawing.
- Strict grading. We observed that users submitted drawings that were conceptually correct but were marked incorrect because they were messy. We alleviated this by making the grading algorithm more lenient.
- Incorrect solutions. Despite our best efforts, the grading algorithm was sometimes entirely wrong. Analytics allowed us to easily flag and address these issues.
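The "strict grading" fix above amounts to widening a match tolerance. A minimal sketch, assuming drawings are reduced to line segments with endpoint coordinates (the app's actual grading algorithm and data model are not shown here):

```python
from math import dist  # Euclidean distance, Python 3.8+

def segments_match(drawn, solution, tolerance=12.0):
    """True if both endpoints of the drawn segment fall within
    `tolerance` pixels of the solution segment's endpoints,
    in either endpoint order."""
    (d1, d2), (s1, s2) = drawn, solution
    direct = dist(d1, s1) <= tolerance and dist(d2, s2) <= tolerance
    flipped = dist(d1, s2) <= tolerance and dist(d2, s1) <= tolerance
    return direct or flipped

# A messy but conceptually correct stroke: accepted with a lenient
# tolerance, rejected with a strict one.
messy = ((0, 0), (10, 0))
answer = ((2, 1), (9, -1))
```

Raising `tolerance` is exactly the kind of leniency that forgives messy but conceptually correct submissions.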
How might we teach difficult concepts more effectively? Looking through the analytics data, we identified challenging problems and concepts based on variables like completion time, number of attempts, and stars earned. After identifying these outliers, we pored over student sketch submissions to get a better understanding of how our users thought about the problems.
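The outlier hunt described above can be sketched in a few lines. Field names, thresholds, and the sample records are illustrative, not the app's actual analytics schema; a real analysis would also weigh attempts and stars earned:

```python
from statistics import mean, stdev

def flag_outliers(problems, z_cutoff=1.0):
    """Return IDs of problems whose average completion time is
    unusually high relative to the rest of the curriculum."""
    times = [p["avg_completion_time"] for p in problems]
    mu, sigma = mean(times), stdev(times)
    return [
        p["id"]
        for p in problems
        if (p["avg_completion_time"] - mu) / sigma > z_cutoff
    ]

analytics = [
    {"id": "lesson1-q3", "avg_completion_time": 45},
    {"id": "lesson2-q1", "avg_completion_time": 50},
    {"id": "lesson2-q4", "avg_completion_time": 48},
    {"id": "lesson3-q2", "avg_completion_time": 180},  # far slower than peers
]
```

Here `flag_outliers(analytics)` surfaces only `lesson3-q2`, the problem students spent disproportionate time on, which is the kind of candidate whose sketch submissions we would then review by hand.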
Outcome: Redesigned a more effective, streamlined app curriculum with a clearer progression in difficulty.
Why did some students improve, but others did not? We used a pre-test and post-test to measure students' improvement as a result of the app. Most students improved, but some didn't. Reviewing students' data and work revealed that those who didn't improve would often peek at the solution.
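The pre/post comparison above can be sketched as follows. The student records and field names are made up for illustration; the real data set is not reproduced here:

```python
def average_peeks(students, improved):
    """Mean number of solution peeks among students who did
    (improved=True) or did not (improved=False) raise their
    score from pre-test to post-test."""
    group = [s for s in students if (s["post"] > s["pre"]) == improved]
    return sum(s["peeks"] for s in group) / len(group)

cohort = [
    {"pre": 4, "post": 8, "peeks": 1},  # improved, rarely peeked
    {"pre": 5, "post": 9, "peeks": 0},  # improved, never peeked
    {"pre": 6, "post": 6, "peeks": 7},  # no improvement, peeked often
    {"pre": 3, "post": 2, "peeks": 9},  # regressed, peeked often
]
```

In this toy cohort the non-improvers average far more peeks than the improvers, which is the pattern that suggested perseverance as a predictor.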
Insight: Perseverance is a likely predictor of student improvement.
Nathan Delson, My Boss & President of E-Grove Learning
"Michael works extremely well with others, delivers high quality work with minimal supervision, and can always be counted on."
"I have worked closely with Michael for about a year, often with weekly meetings. I am highly impressed with Michael’s abilities and commitment. We were developing a new approach for educational software that relied on sketches of solutions rather than multiple-choice questions. Michael consistently delivered educational content with high attention to detail; these included graphics, user interface controls, and question layout. He also observed students working on the app and later analyzed the data collected. He recognized multiple insights that have allowed us to improve the software. Michael works extremely well with others, delivers high quality work with minimal supervision, and can always be counted on. He is a pleasure to work with."
- Users don’t read instructions. Users learn from feedback, not from instructions, so designers shouldn’t rely on instructions to teach users anything.
- Mistakes break users' trust. A mistake in the app can break a user’s trust in it. Broken trust ultimately cost users time and effort, and caused frustration.