For some time now I've been meaning to start a blog, a virtual space where I can pen down (or type down if you will) my thoughts and share my findings on design, life, the universe and everything.
As anyone who has attempted a similar venture can attest, finding time to write properly gets very hard, really fast. So instead of trying, and failing miserably, to block out time for long pieces, I've opted to write smaller snippets.
So, dear reader, thank you for visiting my thought playground, and I do hope you find my twopence worth helpful in some way.
Recently I was asked to give feedback on an iOS app design project. While going through the prototype with a critical eye, something was bothering me, but I couldn't put my finger on it straight away. I took a moment to figure it out and then it hit me: too many animations, often with no purpose.
There are quite a few articles on animations out there, proclaiming how crucial they are in interaction design. And yet I find most of the literature on how and when to use them lacking.
In an interconnected, digital world, where everything, from phone notifications to flashing digital billboards, screams for attention, adding more attention-grabbing elements can be counterproductive, or even outright harmful.
Animations should be used with consideration in order to have the appropriate impact and desired outcome, and should always have a purpose. If you can't communicate the immediate benefit of animating something, or the problem that the animation solves, then it most probably should not be animated.
Think of a screen with 10 interactive elements on it, where some are more important than others. Now ask yourself which would be more impactful: animating all the elements, or animating only the most important ones, in a way that conveys meaning?
Recently I was reading an insightful article on how the discipline of user experience is perceived today in the industry. And I have to say, I couldn't agree more.
Countless times throughout my career I have been asked to research, formulate solutions and design for users, but with quite a narrow scope. We unavoidably ended up with well-designed solutions to quite specific problems which did not cover the entire user experience. Every single time.
To give a small and simple example, think of a digital money transfer process. Or better yet, think of the best digital money transfer process of your favorite bank or e-commerce site/app. Now try to imagine how different the experience of performing this task feels under different scenarios: standing in a crowded bus, holding bags, groceries or a coffee cup, holding a baby in your arms, navigating a busy sidewalk, walking with the help of a cane. The experience of transferring money in these situations will feel quite different, as you will feel different under each scenario. Stressed, uncomfortable, rushed: any of these emotional states, and more, could apply. In fact, the two things these experiences have in common are that transferring money is not the primary task being performed, and that the emotional state of the user can vary wildly.
So why is it that most UX work does not take into account factors external to the experience? Why do UX designers assume that the products or services they are designing will be used in a quiet environment, safe from distractions? This approach reminds me of science experiments performed in lab conditions versus the wild: an important step, sure enough, but certainly not the final one.
Convincing involved stakeholders of the need to expand the experience scope is not an easy feat.
More often than not, project, product, or service managers and owners do not fully understand what the UX discipline covers and tend to narrow it down to a specific segment of the experience, usually interaction design.
In such cases, it is up to the designer to educate her/his colleagues and communicate the benefits of a holistic experience design approach.
An experience is as good as its worst designed edge case.
The last few days I have been fully immersed in design system architecture. More specifically, I have been looking at the best way to structure our design tokens to allow for maximum design flexibility.
For anyone not familiar with design system tokens, quite a few people have already written extensively about them. "Tokens in Design Systems" by Nathan Curtis is a good article to read.
But before diving straight into how we approach tokens, it is worth mentioning the unique problems my team is trying to solve. The company I currently work at has a B2B product in the banking sector (web, iOS, Android) which has 3 distinct themes. In some cases clients employ their own design teams to style the product. Given there was no design system or style guide in place before I joined the company, UI and style changes more often than not resulted in a fork which had to be maintained separately.
Most approaches to token architecture out in the wild tend to include every single configurable element as a token. In our circumstances this approach compounded our main problem: allowing third-party teams to change core values on so many elements unavoidably creates consistency issues.
In come the composites
To fix this problem, we decided to introduce a second level of a different kind of token, called a composite. Tokens have values attached to them which can be changed by a design team to meet specific design needs; for example, the values of the typographic size scale can be changed to meet the needs of a new product theme. Composites, our second-level tokens, have first-level tokens attached to them instead of values. The font size of the main button is a composite and references a font size token from the typographic scale. So if a designer wanted to change it, she would need to change which font size token it references. This way we make sure only font sizes defined in the typographic scale will appear in a new theme.
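To make the two levels concrete, here is a minimal sketch in TypeScript. The token names and values are hypothetical, not our actual theme files; the point is that a composite can only ever resolve to a value already defined on the first-level scale.

```typescript
// First-level tokens hold raw values, which a theme's design team may change.
// (Names and values here are invented for illustration.)
const fontSizeTokens: Record<string, string> = {
  "font-size-100": "12px",
  "font-size-200": "14px",
  "font-size-300": "16px",
  "font-size-400": "20px",
};

// Composites hold no raw values, only references to first-level tokens.
const composites: Record<string, string> = {
  "button-primary-font-size": "font-size-300",
  "caption-font-size": "font-size-100",
};

// Resolving a composite can only ever yield a value from the scale,
// which is what guarantees consistency across themes.
function resolve(composite: string): string {
  const tokenName = composites[composite];
  if (!(tokenName in fontSizeTokens)) {
    throw new Error(`${composite} references unknown token ${tokenName}`);
  }
  return fontSizeTokens[tokenName];
}

// A themer re-points the composite at another scale token
// rather than typing in an arbitrary size:
composites["button-primary-font-size"] = "font-size-400";
```

After the re-pointing above, `resolve("button-primary-font-size")` yields "20px", a value that is guaranteed to exist on the typographic scale.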
When mirroring the UI from a left-to-right system to a right-to-left one, it is worth noting some icons must be treated differently.
Any icons implying direction, such as next or previous, need to be mirrored to make sense to a user who will automatically default to the appropriate direction in an RTL system.
Icons which do not imply direction, but are designed to mimic real-life use, need to stay in the LTR direction. Think of the magnifying glass used to indicate search: as the majority of the population is right-handed, it makes sense to keep the icon the same in both directional systems, with the handle on the right.
Right-to-left (RTL) languages tend to slightly complicate design systems. The rule of thumb is to mirror most of the UI, so design elements implying direction are reversed to make sense to someone reading everything from right to left.
But alas, as with everything there are always exceptions. For example, a circular progress indicator needs to be the same as in LTR systems (clockwise) as the design is mimicking physical clocks.
Simple enough so far. It all starts to get rather complicated when the UI is set for an RTL language, but some content needs to be in LTR. I know, right?
Imagine someone using an RTL app. More specifically, imagine them typing in an input field. An LTR sentence of 6 words with a syntax of 1 2 3 4 5 6, where each number corresponds to a word, should, in an RTL language, display in the order 6 5 4 3 2 1.
But what should happen when you mix the two systems? What if someone writing in Arabic wants to add a term in English within a sentence? Languages retain their directionality, so the above syntax would display as 6 5 3 4 2 1, where 3 and 4 are in the LTR language and the rest of the words in the RTL one.
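Real text rendering follows the full Unicode Bidirectional Algorithm (UAX #9), which handles far more cases than this. But the reordering in the example above can be sketched with a toy function: under an RTL base direction, reverse the logical order of the words, then restore the original order inside each contiguous LTR run.

```typescript
type Word = { text: string; dir: "ltr" | "rtl" };

// Toy reordering for an RTL base direction; a deliberate simplification
// of the Unicode Bidirectional Algorithm, covering only this example.
function visualOrderRtl(words: Word[]): string[] {
  const reversed = [...words].reverse();
  const out: string[] = [];
  let i = 0;
  while (i < reversed.length) {
    if (reversed[i].dir === "ltr") {
      // Find the end of the contiguous LTR run...
      let j = i;
      while (j < reversed.length && reversed[j].dir === "ltr") j++;
      // ...and undo the reversal inside it, so it still reads left to right.
      out.push(...reversed.slice(i, j).reverse().map((w) => w.text));
      i = j;
    } else {
      out.push(reversed[i].text);
      i++;
    }
  }
  return out;
}

// The 6-word sentence from above: words 3 and 4 are LTR, the rest RTL.
const sentence: Word[] = [
  { text: "1", dir: "rtl" }, { text: "2", dir: "rtl" },
  { text: "3", dir: "ltr" }, { text: "4", dir: "ltr" },
  { text: "5", dir: "rtl" }, { text: "6", dir: "rtl" },
];
const visual = visualOrderRtl(sentence); // ["6", "5", "3", "4", "2", "1"]
```

Running it on the example sentence reproduces the 6 5 3 4 2 1 order described above: the sentence as a whole flows right to left, while the embedded English run keeps its own direction.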
There seems to be no single accepted usage when it comes to the terms component library and pattern library. The industry uses the two terms interchangeably, quite often with no distinction between them, other times with unspecified differences. Our team inherited this obvious communication trap, which often bred confusion when trying to communicate ideas.
When structuring our design system, for our sanity if nothing else, we decided to define the distinction between the two terms. For our purposes, a component is a stand-alone entity, with re-usable code, that can live in multiple patterns. A pattern is a combination of components that solves specific usability problems.
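As a hypothetical illustration of that distinction (the names are invented, not our actual library), the relationship can be modelled in a few lines:

```typescript
// A component is a stand-alone, re-usable entity.
interface Component {
  name: string; // e.g. "TextField", "Button"
}

// A pattern combines components to solve a specific usability problem.
interface Pattern {
  name: string;
  problem: string;         // the usability problem it solves
  components: Component[]; // the same component can live in many patterns
}

const textField: Component = { name: "TextField" };
const button: Component = { name: "Button" };

const loginForm: Pattern = {
  name: "LoginForm",
  problem: "Let a returning user authenticate quickly",
  components: [textField, button],
};
```

The same `button` instance could equally appear in a checkout pattern, which is exactly what makes it a component rather than a pattern.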
There, all done!
Imagine a product that has 3 different environments with slightly different visual languages: web, iOS native and Android native. Imagine this product having 3 slightly different versions, designed by 3 different design teams. Now imagine trying to collect all that and create a single design system with 3 separate themes.
It sounds complicated, doesn't it? Quite so.
This is the situation my team currently finds itself in. And from the myriad problems we encounter continuously, having to deal with rogue design choices and thinking is by far the hardest.
Design systems have been all the rage in the community these last few years, and justifiably so. Other than guaranteeing visual consistency and code sustainability, they become the bedrock of scalability. To solve precisely these kinds of problems, we recently decided at the company I work at to introduce a design system.
As with most things design, how we approach typography, along with space and colour, was crucial.
It was quite important that whatever system we came up with be flexible and able to accommodate different branding without having to change much of the code base. Our product is B2B, and there is a need to change the visual language for different clients.
After some research, and with said restrictions and aims in mind, we came up with a tokenized scale based on the Fibonacci sequence, which can be controlled programmatically.
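As a rough sketch of the idea (the base size, token naming and number of steps here are assumptions, not our production values), such a scale can be generated programmatically:

```typescript
// Generate a size scale whose steps grow along the Fibonacci sequence.
// Changing the base value re-themes every step at once.
function fibonacciScale(baseRem: number, steps: number): Record<string, string> {
  const scale: Record<string, string> = {};
  let a = 1;
  let b = 1; // Fibonacci seeds: 1, 1, 2, 3, 5, 8, ...
  for (let step = 1; step <= steps; step++) {
    scale[`size-${step * 100}`] = `${baseRem * b}rem`;
    [a, b] = [b, a + b];
  }
  return scale;
}

// With a 0.25rem base and 5 steps the multipliers are 1, 2, 3, 5, 8:
const scale = fibonacciScale(0.25, 5);
// { size-100: "0.25rem", size-200: "0.5rem", size-300: "0.75rem",
//   size-400: "1.25rem", size-500: "2rem" }
```

Because the whole scale derives from one base value, a client theme only ever overrides that single number, and every token stays in proportion.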