Choosing software your team will love

Date Posted: February 14, 2019
Author: Franklin Morris

In celebration of Valentine’s Day, today we’re focusing on the importance of evaluating usability when selecting a technology partner. In other words, how to best determine whether users of a new tool will fall in love with it.

While this romantic vision may seem out of place in a business environment, listening to the needs of end users can save you time and money that would otherwise be wasted on trial and error.

Does this situation sound familiar?

You spend months shuffling through proposals and watching impressive presentations, only to become overwhelmed by the many options available. When you finally make what seems like a sound decision and the team receives the new toy, things fall apart. It turns out to be an expensive piece of clunkiness at best; it’s difficult to manage and disliked by the users.

Now, you’re probably asking yourself, “How am I supposed to pick software my team will love?” A product demo is a good first step and can show you a lot. However, it’s hard to know how user-friendly a tool really is before you’ve fully implemented it.

In this post, we’ll cover leading indicators of usability and related questions you can ask to objectively assess the likelihood users will adopt and be happy with a new tool.

Finding the perfect match

The ISO 9241 standard, the first part of which dates back to 1997, defines usability as an umbrella term covering effectiveness, efficiency and satisfaction. Simply put, if these three criteria are met, you can feel confident users will adopt a new solution and appreciate that you brought it into their lives.

Building on that definition, here are key characteristics you should look for in your next great business tool.

Effectiveness: tailored features and seamless implementation

There’s often a tradeoff between the number of features and implementation complexity. Platforms with a long list of features may offer flexibility, but it takes a lot of work to customize everything according to each team’s needs. That customization slows down implementation and complicates maintenance, especially if (or, more accurately, when) business requirements evolve.

Conversely, a platform with a more tailored set of features, designed around your industry and use cases, can be just as effective, if not more so, and with much more seamless setup and maintenance. Because it’s purpose-built, you’ll likely find industry best practices incorporated directly into the platform for a better user experience.

For this reason, you should evaluate effectiveness in the context of implementation, and a well-chosen set of features is often the smarter choice. For example, say you want to better harness retail data to improve performance of your consumer electronics business. You could turn to a business intelligence tool, like Tableau or Domo, but that could take months to customize with the right data integrations and the retail metrics you care about. Alternatively, you could look for a purpose-built data, analytics and forecasting tool for consumer goods companies, which could be up and running and providing insights in just a couple of weeks.

Ask:

  • Who are your current customers?
  • Do you focus on specific types of customers or industries?
  • How long does implementation typically take for a company my size?

Efficiency: automation and scale

Another important aspect that will impact user adoption is the tangible efficiency the new tool brings to the table. After all, technology should help reduce the time it takes to complete routine tasks and increase productivity, right?

Automation is an obvious indicator of efficiency, as automating what were manual, time-consuming tasks should make processes more efficient. Even then, one thing to watch out for is whether the new “improved” workflow simply trades many simple actions for fewer but more complex ones that take just as long, leaving the net gain at zero (for example, replacing five two-minute manual exports with a single configuration step that takes ten minutes).

Another way to measure efficiency is to understand a tool’s ability to scale. In other words, the extent to which it can grow along with your team and business, whether that be to more users, more products, more sales channels, etc. If a tool can quickly and easily take on a greater “workload,” that’s a good sign the underlying technology is efficient. If, on the other hand, it’s not built on a robust platform or relies heavily on manual human intervention to work, it will be much harder to scale and inherently less efficient, even before you reach the scaling stage.

Ask:

  • Of the work that our team is doing today, what would be automated?
  • What would the new workflow look like?
  • How quickly could you scale if we doubled our team, sales, etc.?

Satisfaction: intuitive interface and a gentle learning curve

It doesn’t matter if it’s a music streaming service, a social media network or a business analytics platform – if it has a clunky interface, superfluous options or simply “doesn’t click,” users are going to be unsatisfied and switch to something more approachable. End of story.

At the same time, satisfaction is very hard to evaluate before you actually roll out a new tool. How do you judge whether an interface is intuitive, and what does that even mean? Look for an interface that feels more like the phone apps you use every day than a traditional business tool, that you would feel comfortable presenting to your customers, and that follows “established” design conventions consistent with other tools you already use.

It’s also necessary to keep the learning curve in mind. Any interface can seem intuitive to an expert user who’s spent hours training on it, but the learning curve should be gentle enough that anyone can start effectively using it with a one-hour training session. There may be more complex features that require further training, but they shouldn’t be required for a user to start seeing the benefits. One reason we’ve seen demand planning tools get abandoned after implementation is the requirement for a super user trained on the nuances of the tool, a situation that’s best avoided.

Ultimately, though, the people in the best position to speak to satisfaction are existing users.

Ask:

  • What is your customer retention rate? (unsatisfied customers will vote with their feet)
  • What percentage of your current users are active on a weekly basis?
  • Can you provide case studies and/or set up a customer reference call?

We get it. Evaluating software usability can seem like a squishy exercise based on individual preferences and judgment calls, at best. But it doesn’t have to be that way. Take the time to discuss the key aspects of usability with your team, pick their brains on where current solutions fall short in those departments, and incorporate those learnings into your criteria.

And of course, ask any prospective vendors the questions above, try to get quantitative answers where possible and compare them head-to-head. Any solution provider who’s confident in their product’s usability should happily provide these stats.

About the Author:

Franklin Morris

Franklin Morris is Vice President and Head of Global Marketing at Alloy.ai. He's spent his career leading brand, content and demand generation marketing for high-growth startups, ad agencies, and Fortune 50 giants, including IBM, Dell, Oracle, Rackspace, 3M, Facebook, Electronic Arts, Informatica, Sisense, and Argo Group.
