Lessons Learned from An Event Apart Denver

Editor’s Note: An Event Apart (AEA) just finished its first event of 2018 in Seattle. For the first time, the 2018 conference series has three Special Edition events, which feature 18 speakers over three days.

Today we look back at the 2017 AEA Special Edition. UX Booth columnist Jess Vice highlights some of the biggest takeaways.

Use the discount code AEAUXBooth to save $100 on any AEA multi-day event.

From Research to Redesign

Jeffrey Zeldman, founder and creative director at studio.zeldman and co-founder of An Event Apart, kicked the conference off with a reminder of why we design in the first place. We can only identify and solve problems when we know who our customers are, who our users are, and which markets we’re reaching. Research is an often overlooked (yet vital) stage of the design process. Research not only tells us who we’re designing for now, but also helps us discover what we should be solving for next.

Research saves time and money and allows us to build the right thing the first time around. It also saves users from a terrible temporary experience. It’s better to spend three weeks researching than to spend one year building the wrong thing.

Research helps you see blind spots and biases. We like to say we build “data-driven” solutions but, as Zeldman points out, “data is only as good as the people analyzing it.” Designers shouldn’t have the answers; they should be asking the questions. And data doesn’t provide answers – it just helps us ask better questions.

Obvious Always Wins

Luke Wroblewski was the first person I heard talk about mobile-first design patterns. He’s stuck in my mind as a visionary, always looking to what’s next and how we get there. And his talk at AEA Denver was no exception.

Wroblewski highlighted the idea of obvious design. In order for design to be effective, it has to be obvious, but obvious design doesn’t always come easily. Core features and purpose should be communicated up front, visually, and with minimal work. For instance, an Apple iPhone = iPod + phone + internet.

It can be easy to copy patterns used by other designers or brands, but copying can be dangerous because we don’t know why they made that choice. If we’re honest, we’re just assuming they tested their way into that design decision.

Instead, we need to take the time to iterate, but in an informed way. Run survival analytics on the designs you’re producing. Don’t look at averages; look at extremes. Talk to users who’ve been around for a while, use the thing all the time, and really love it. Talk to those who just showed up and have no idea what is going on. Don’t rely only on your numbers, because they’re only half the equation. Quantitative data is the “what happened” – it doesn’t give a complete picture until you talk to users about the “why did it happen” qualitative aspects.

The Last 10%

Cassie McDaniel, co-founder of the design studio Jane & Jury, is passionate about preserving the beauty in both the act of designing and the final product. She shared examples of famous artists who, from the outside, may have been indulging themselves in the designs they were creating, but who also were taking care to nourish themselves in their work processes. Charles and Ray Eames made a lasting impression on the furniture industry because they followed their instincts and personal understanding of beauty, but also because they were careful with the last 10%.

McDaniel outlined her process for deciding when and where to apply that final 10% of polish that makes the product feel human but also fulfills the designer’s instincts and needs.

Iterate:

  • Where is it weakest?
  • What part could use another take?
  • Am I satisfied with it?
  • Are there more ideas to draw from?

Balance:

  • This isn’t just a design principle – it’s a work and life principle. Take time away from your work to gain perspective.
  • Use the MAYA Principle from Raymond Loewy: Most Advanced Yet Acceptable. Which version offers the newest thing that people will still accept?

Simplify:

  • Simplicity is hard to achieve.
  • Simplify without losing complexity, but don’t “dumb down.”

Apply:

  • Simplify it down to elemental concepts and forms.
  • Apply them across all mediums.
  • Apply the product and versions to outlying cases to stress-test for viability.
  • Apply things in the opposite way from what you intend. Be prepared for how people will misuse your designs.

Team:

  • Show it to peers who understand and challenge your perspectives.
  • Collaborate with other specialists (don’t try to do it all yourself).
  • Collaborate around a communal or agreed-upon story.
  • When you work with other people, you’re introducing new perspectives and opening up the narrative.

But most importantly, she says, “don’t let perfection be the enemy of done.”

Where Accessibility Lives: A Story of Inclusion

Derek Featherstone’s talk was one of the most time-sensitive and urgent talks at AEA Denver. Accessibility and inclusion are trending topics in our culture and politics, but they are also vital to the web and design industries. And in my experience, accessibility is often pushed aside in the rush to produce, go live, and move on.

His first rule of thumb is don’t aim for perfection. As with everything in life, perfection often keeps us from even starting a project for fear we will fail. When considering accessibility, think “better” instead. Being 60% better today than yesterday is an improvement. (See Cassie’s MAYA example and notes on perfection above.)

Building for accessibility starts with a bit of research. Featherstone recommends asking about three areas:

  • People: Whose job is this? What’s the goal?
  • Process: How will we research it? What does our design and testing process look like? Who do we show iterations to?
  • Tools: How you build your code determines accessibility – how will you educate everyone who touches the project?

Research can be done in-house or hired out. Don’t make assumptions. It’s essential to do live testing with people with disabilities. Observe how they operate and how you can adapt to them. Build accessibility decisions into your design systems, documentation, and regular check-ins. Talk about the accessibility standards issued nationally and globally, but also about the standards your company is choosing to adhere to and why.

Measuring the Customer Experience

If you haven’t had the privilege of hearing Gerry McGovern speak, get to a conference where he’s talking. He’s energetic and excited and doesn’t take excuses. At AEA Denver, he was preaching my team’s methodology and it was a holy experience.

He began with a great explanation of why we combine quantitative and qualitative data in decision making. The what is easy. We all have data piling up on what is happening with our products or websites. Finding the why is a little trickier. McGovern instructs us to first identify the goal. Next, outline what the expected user journey looks like. Finally, talk to users to find the discrepancies between your expectations and actual user actions.

In a customer or user experience, if you can save the user time, you will win their loyalty. Most user time is wasted on page load and having to scroll, scan, or search for what they need. He focuses on identifying core user tasks. There are really only eight to ten top things a person needs to do in any given situation. Identify and surface these core tasks for your site or product.

After improving the site’s core tasks and time-saving features, test with real customers. His rule of thumb is that it only takes 13–18 testers to get stable success and time metrics. Run tests around the core tasks identified earlier, and be sure you’re in an observation mindset – you’re not trying to trick the users; you’re trying to uncover their natural use patterns.

For me, the strongest part of the talk was walking through how to present qualitative metrics with a quantitative slant. Testing a single task for success rate and time to complete provides two strong metrics to present: percent failure rate and average time to task. McGovern’s key point was to then iterate and improve that particular experience and retest for the same metrics. This will provide qualitative numbers in a format more readily understandable to management and C-levels.
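To make the two metrics concrete, here is a minimal sketch (with hypothetical data and a made-up helper name) of how a baseline task test could be reduced to the percent failure rate and average time to task described above:

```python
def task_metrics(sessions):
    """Each session is a (succeeded, seconds) tuple from one tester
    attempting the same core task. Returns (failure rate %, avg seconds)."""
    failures = sum(1 for ok, _ in sessions if not ok)
    failure_rate = 100 * failures / len(sessions)
    avg_time = sum(t for _, t in sessions) / len(sessions)
    return failure_rate, avg_time

# Hypothetical baseline round with 15 testers (within McGovern's
# 13-18 rule of thumb): did they finish the task, and in how long?
baseline = [(True, 42), (False, 95), (True, 51), (True, 38), (False, 110),
            (True, 47), (True, 55), (False, 88), (True, 40), (True, 62),
            (True, 45), (False, 120), (True, 39), (True, 58), (True, 44)]

rate, avg = task_metrics(baseline)
print(f"Failure rate: {rate:.0f}%, average time to task: {avg:.0f}s")
```

After iterating on the design, the same test is run again with the same task, and the two numbers are compared round over round.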