Case Study / Inbox
Content-Driven Design for a Gardening Companion App
Grove makes it easy to grow your own food at home, with connected indoor gardens that guide you along the way. In 2015, Grove shipped 50 Early Adopter Ecosystems in the Boston area, accompanied by a web app that monitored the climate, controlled the environment, and tracked water chemistry.
But people weren’t using the web app, and customer support was spending a lot of time answering simple questions. Superusers made heavy use of the extensive IoT controls the software provided, but the average user wanted sensible defaults and didn’t know how to interpret the data, which was presented as raw, unexplained readings.
Over six months, I spent many hours in customer homes (and offices). Through task analysis and observation, I identified what did and didn’t work in the beta product experience.
Exploratory research at first, then more evaluative research with early prototypes, revealed that for users, the technology and ecology fade into the background as the Grove becomes a lifestyle experience.
The Grove became part of daily life for its owners, and storytelling, social interaction, even storage had to integrate into the habits of the household. Light settings were left at their defaults, but gardens were planted and rearranged like artistic masterpieces. As a living, breathing ecosystem in their home, customers cared for it like a pet. People wanted more hand-holding, celebration, and validation to create moments of pride in what they grew.
To maximize the impact of user research across the team’s many departments, I experimented with ways to share findings continuously and to pass on artifacts that could influence design work on the software team, in the hardware workshop and manufacturing facility, and on the marketing team.
I iterated on the artifacts and tools used to distribute these insights, including Tumblr, posters, collaborative journey maps, weekly ‘#grovegrown’ gatherings at the office, visualized user flows, and more.
Customers want regular validation that they’re succeeding, and guidance as to what’s next.
I first created paper prototypes to visualize ideas enough to discuss them with the team. We then moved to InVision workflows, tested in scripted in-office sessions with customers and user-testing participants.
Through user testing, we homed in on the concept of an ‘inbox’ and moved away from the monitoring and adjusting inherent in a dashboard approach.
We then considered the different inputs and classifications of ‘tasks’ that might make up the user’s primary interaction with the app. Through affinity diagramming, we grouped related activities and designed patterns for each type of behavior. Tasks could be triggered by sensor data, a user input, a timer going off, or a message sent by the Grove team. Topics included plant tasks, aquarium maintenance, water level notifications and more.
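The task model above could be sketched as a small data structure. This is a hypothetical illustration, not Grove's actual schema: the type names, topics, and ordering priorities are assumptions, but the trigger sources and topics mirror those described above.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Trigger(Enum):
    SENSOR = "sensor"        # e.g. a water level reading crosses a threshold
    USER_INPUT = "user"      # e.g. the user records a water test in the app
    TIMER = "timer"          # e.g. a recurring plant-care reminder fires
    TEAM_MESSAGE = "team"    # e.g. a note sent by the Grove team

@dataclass
class Task:
    topic: str               # e.g. "plant", "aquarium", "water_level"
    trigger: Trigger
    title: str
    tutorial_url: Optional[str] = None  # link to a detailed tutorial, if any

def inbox_order(tasks):
    """Order the inbox so time-sensitive, sensor-driven alerts surface first.
    The priority ranking here is an illustrative assumption."""
    priority = {Trigger.SENSOR: 0, Trigger.TIMER: 1,
                Trigger.USER_INPUT: 2, Trigger.TEAM_MESSAGE: 3}
    return sorted(tasks, key=lambda t: priority[t.trigger])
```

Grouping every inbox item under one `Task` shape is what lets each behavior type get its own design pattern while sharing a single delivery mechanism.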
To roll out the inbox, we designed a 90-day engagement plan, then built a flexible internal messaging system.
For the first few months, over a slow roll out, every message a user saw was manually triggered by Emily, our Director of Customer Success. Since she best understood user needs in context, the feedback loop from customer to product was quick. I designed an internal tool that surfaced critical user data (including data collected from sensors, and data that users collected manually and recorded in the app).
We affectionately referred to the internal tool as Roots.
Roots, Grove’s customer support platform, including the tool for managing user messages.
As we iterated on the timing and copy of notifications and perfected the content within them, we automated tasks one at a time. The content card in the inbox expands to give the user a quick review of the task and access to the detailed tutorial. We also tracked behavior in Mixpanel, combining data-driven insight with qualitative feedback to identify opportunities to improve the engagement plan.
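One of the individually automated tasks might look like the rule below. This is a minimal sketch under assumed names, thresholds, and copy, not Grove's actual implementation: it turns a sensor reading into an inbox message, rate-limited so users aren't re-notified too often.

```python
from datetime import datetime, timedelta
from typing import Optional

def water_level_message(level_cm: float,
                        last_sent: Optional[datetime],
                        now: datetime,
                        threshold_cm: float = 5.0,
                        cooldown_hours: int = 24) -> Optional[dict]:
    """Emit a low-water inbox message when a sensor reading drops below a
    threshold. The threshold, cooldown, and message copy are illustrative."""
    if level_cm >= threshold_cm:
        return None  # reservoir is fine; no task needed
    if last_sent is not None and now - last_sent < timedelta(hours=cooldown_hours):
        return None  # already notified recently; stay quiet
    return {
        "topic": "water_level",
        "title": "Your reservoir is running low",
        "body": f"Water level is {level_cm:.1f} cm. Time to top it off.",
    }
```

Automating rules one at a time like this preserved the quality bar set by the manually triggered messages: each rule could be tuned against real sensor data before being switched on.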
A completed task then went into a user’s Completed timeline for quick reference. This made it easy for users to reference activity when diagnosing an issue or planning a harvest.
Users checked the Grove mobile app an average of 6.3 times per week in the first 6 months after launch. They engaged with tasks and tutorials delivered through the inbox about 4 times a week, and otherwise used the app to save water test results or control their hardware.
Defining how to measure success is half the battle. At the time, we leaned heavily on Net Promoter Score (NPS) to gauge our 30-day and 90-day success.
NPS was very high (around 90% at 30 days). Customer service check-ins at 7, 30, and 90 days proved invaluable in providing qualitative context for our quantitative metrics.
Recognizing the limitations of NPS, today I focus more on identifying specific, measurable user behaviors that correlate strongly with retention and customer satisfaction.