We had the chance to interview him after UX Week 2015.
This is the first of three segments of our conversation.
Could You Start by Telling Us about Your Background and Expertise in Information Architecture?
I’ve been doing this for 15 to 20 years, depending on whether you count part-time work. And I gravitated very early on, in the late ’90s, to the information architecture community, because at that time a lot of the other mailing lists and conversations were more around HCI (human-computer interaction), which is a more academic sort of domain.
This was interesting to me, but still very small and basically out of context. They weren’t connected to other things. It was, like, is this kind of button good or bad? Well, I don’t know, it depends on what it’s connected to.
I saw that the information architecture people were to some degree talking about those broader questions. So I got involved with that community and I’ve been involved with IA (Information Architecture) ever since.
So I am a User Experience design professional but I do all of this through an architectural lens.
How Can Users Make Sense of Digital Systems and Incorporate Them into Their Lives?
We have this pipedream that when we get all of this digital stuff integrated into the world around us, then we’ll no longer have to worry about user interfaces. And actually it’s the opposite because we have to worry more about user interfaces. And by interfaces, I mean literally the way we interface with a system.
We aren’t taking away UI. We are having to turn our bodies into UI. We’ve turned our thermostats into a UI that’s invisible, that pays attention to when we walk by it and then decides things based on some invisible choice that it makes.
So the way to help people make sense of it is not to hide complexity that they actually need to understand. One problem with this simplicity movement is that you can make things simple but then if there are complex rules in the environment that people need to actually understand, you can’t just whitewash over that. You have to clarify it for them.
And if it’s too complicated, then you need to change the rules. Not pretend like they aren’t there.
Really what we have to do is reframe what we mean by “user interface” because it’s really about language, which is ultimately the interface that humans have with complex systems.
Do Users Have a Goal in Mind When Approaching an Interface?
Human-computer interaction back in the ’70s and ’80s was coming out of engineering and cognitive science. And mainstream cognitive science tends to position human cognition as something that works the way a computer works. But there is an alternative perspective that’s been around for a long time yet has not made it to the mainstream, which argues that cognition is a lot more organic, more embodied.
That’s why in my book I ended up talking about embodied cognition a lot. And there has been other research, like Thinking, Fast and Slow, where science is starting to come to grips with the fact that this assumption, that our brain makes a plan and then tells our body to do stuff, is incorrect. It doesn’t really work that way.
We just do stuff and then we tell ourselves why. We get up and we’re hungry and we stand at the fridge before we even know that we are doing it. And the thing is, the vast majority of what we do in a day is like that, no matter how technological it is or whatever.
So most of what we do is sort of stumbling through and using the environment as a way to help us reflect on and decide what to do next.
We are formulating goals constantly, but they are normally very tacit and close. You can’t create things assuming that all of your users have these three goals in mind, because then you are going to assume that as soon as you show them something that meets that goal, they are going to go there.
But many of them are still figuring out what this environment means to them and what it is that they actually need to do.
How Can Researchers Make Sense of These Tacit Goals?
What you have to do is observe behavior, not just what people say.
And the more that digital technology becomes pervasive in the world around us, the more we can’t just do that on screens anymore.
It never really worked that well to begin with, but we could pretend we got to most of the problem because focusing on screens mainly did the trick. Now, especially with mobile technology, it is more complicated.
I think ethnographic methods, watching what people do, are the thing to use. Humans are really complicated, but at the same time some huge patterns start to emerge.
Then you figure out what kind of environment will match each behavior pattern, depending on whatever angle somebody is coming into it from, because they are all coming at it from different angles.
Andrew Hinton, formerly a Senior Information Architect at The Understanding Group, LLC, is now Senior Digital Experience Architect at State Farm. He’s the author of Understanding Context from O’Reilly Media, and a frequent speaker and workshop leader in the United States, Europe, and beyond. Over the years, he’s worked with clients and employers of all shapes and sizes, including Kimberly-Clark, RF Micro Devices, SRC.org, Vanguard, Sealy and Lowe’s Home Improvement. You can find more information at andrewhinton.com.
For more, you can contact Misael Leon at [email protected].