---
date: 2021-04-01
type: ship
title: Stella
slug: stella
project: Independent
kicker: A first AI/ML interface for business data, built before the wave.
excerpt: Designed the interaction model for asking questions in natural language and reading the answer visually, with confidence and error states baked in.
cover: /assets/covers/hero-stella.webp
palette:
  variant: paper-2
  accent: "#3F5F8A"
  source:
    brand: "Pelikan 4001"
    name: "Blue-Black"
role: Head of UX & Design
pull: We didn't trust the model yet. So the interface had to do the trusting for us.
tags: [ai, data, product-systems, origin]
---

## Context

Years before AI entered the product mainstream, Stella explored what it would mean for design to collaborate with intelligence. Not a product to launch, a working concept to learn from. How do you design an interface for a system that doesn't yet behave predictably?

![Stella concept on an iPad with smart keyboard, shot top-down on a soft white surface. Status bar reads "9:41 Mon Jun 3" and the title "Find" centres the top bar. Left column lists "Your daily topics · Today": Show our sales in this year (10:44 PM), How was our last quarter performance? (8:52 PM), Which product did we sell the most this year? (11:37 AM), Which country has more revenue? (1:53 PM), and a few repeats. Right column carries the prompt "Hey Miralda, how can I help you today?" with a search field and a large blue voice button labelled "Touch here to speak or say 'Stella'".](/assets/projects/stella/st-daily-topics.webp)

## What I designed

The behaviour, not the pixels. We mapped how someone might ask a system about their data, how it might interpret intent, how it might respond visually when its own confidence varied. Adaptive layouts, contextual feedback, conversation as a primary surface long before that pattern was named.

I led the design direction and worked alongside data scientists to translate abstract model output into tangible interaction. Each cycle pushed the line between static UI and dynamic response.

![Stella answer view on an iPhone held by an out-of-focus figure. The query bar at top reads "Regional sales in 2019" with related-question chips beneath it (Global sales in 2019, Domestic sales in 2019, Total sales in 2019…). The answer is rendered as a stacked area chart, two-tone blue, with axes labelled 50000 to 200000 across Quarter 1 to Quarter 4. Below the chart, a "Sales by region" table begins (Central $386,000, Gulf $386,000…) with a "Type here to ask…" composer pinned to the bottom and the voice button beside it.](/assets/projects/stella/st-answer.webp)

## The decision that shaped it

Treat AI as a collaborator, not a feature. Information surfaced progressively rather than all at once. Motion reflected system thinking instead of decorating a transition. Errors were framed as learning moments rather than failures. The interface had to make uncertainty feel honest; the bet was that confidence theatre would erode trust faster than admitting what the system didn't know.

![Stella source-disclosure panel on an iPhone in hand. A previous answer card peeks from the left edge ("…than the closest"). The active "Options" sheet shows a "Hide data source" header followed by "How do we show this answer? The following sources were used in this answer: SAP, Oracle DW and XXX. Information for GovStats.com was included in this answer." Beneath the explanation sit three secondary actions: Pinned to homepage (active), Share, Filters. The system makes uncertainty and provenance an ordinary part of the answer, not a footnote.](/assets/projects/stella/st-source-disclosure.webp)

## What it left behind

Stella never shipped. It became one of the company's first working AI concepts and informed how later products visualised reasoning, confidence, and transparency. The pattern of treating model output as an artifact to be designed *around*, not a string to render, outlived the project itself.
