May 22, 2023

Decoding The Future: The Evolution Of Intelligent Interfaces

A futuristic smart city during the day, with skyscrapers, greenery, and a large rectangular pool of water reflecting the buildings and the sky
kobewan x Midjourney 2023

In the tech world, there's always a 'next big thing' on the horizon. Right now, that horizon is dominated by conversations about artificial intelligence, chiefly ChatGPT, Google’s Bard, and Generative AI tools like Midjourney and DALL-E 2. Some professions are embracing Generative AI, while others fear it.

Included in that conversation is a lot of talk about chatbots. But chatbots are not the future; they are our present. In many ways, chat interfaces are the foundation of more conversational and natural interactions with systems and computers, but really they are the start of something much bigger.

We’ve been anticipating this future for a while, but it hasn’t really been possible until the last couple of years when the right ingredients and technology have converged. It’s an exciting time.

Ubiquitous Computing and Intelligent Interfaces

What happens when we move beyond our screens and unleash computing power, the internet, and AI into the real world? It starts to get interesting. Welcome to the age of ubiquitous computing — the future of intelligent systems.

Ubiquitous computing, also known as pervasive computing or ambient intelligence, is a concept first proposed by Mark Weiser, Chief Technologist at Xerox PARC, in the late 1980s. Weiser later wrote a paper on the topic in 1991 titled "The Computer for the 21st Century". The idea behind ubiquitous computing was to create an environment where computers were embedded seamlessly into the physical world and where human-computer interaction was natural and effortless, where we forget about the underlying technology. Weiser envisioned a world where computing would be "invisible," and users would be surrounded by an "information fabric" that would provide them with relevant information and services.

Ubiquitous computing has been somewhat realized in the Internet of Things (IoT), the network of interconnected devices and sensors embedded in everyday objects, from smart homes to self-driving cars. But the original vision of ubiquitous computing as a truly seamless and integrated experience has yet to be fully realized.

Then there are intelligent interfaces, which represent a paradigm shift in human-computer interaction. Unlike traditional GUIs (graphical user interfaces) that require users to learn the system's language, intelligent interfaces aim to make the system understand the user's language. Instead of humans adapting to the system, the system adapts to them. Intelligent interfaces leverage human instincts and behaviors to create a more intuitive experience. These systems integrate artificial intelligence (AI) and can leverage everything from touch interfaces, voice recognition systems, and gesture-based controls to brain-computer interfaces — they’re multi-modal.

These intelligent systems can make our interactions with technology more intuitive, natural, and efficient. Think Siri, but on steroids. So as we stand on the brink of a new era in human-computer interaction, it's worth exploring how these interfaces have evolved, where they're headed, and how they will revolutionize how we interact with devices and the world around us.

“The future is already here — it's just not evenly distributed.” 
— William Gibson

Current State

The journey has been gradual, with momentum building in the last five years. It started with command-line interfaces, moved to graphical user interfaces, and then to touch interfaces with the advent of smartphones and tablets.

A spectrum that illustrates the evolution from interfaces that are built around computers to interfaces built around humans, from punchcards, keyboards, and mice to touch, voice, gesture, and gaze.
Slide from my Leading Innovation presentation, 2019

Today, we're already seeing the beginnings of intelligent interfaces. Technology is getting smarter, from Alexa managing our smart homes to AI algorithms recommending our next Netflix binge. They're in our phones, our cars, even our refrigerators. And they're changing the way we interact with technology. We’re now using much more natural and multimodal interaction in everyday objects. We're seeing the rise of voice interfaces like Apple's Siri and Google Assistant, gesture-based controls in gaming systems like the Nintendo Switch, and even early versions of brain-computer interfaces (BCIs) from companies like Neuralink.

As consumers, we’ve become accustomed to using biometrics, like facial recognition and fingerprints, to authenticate and unlock our devices. Even BMW is starting to use gestural interaction and gaze for hands-free interaction in their cars. Augmented Reality (AR) and Virtual Reality (VR) interfaces are becoming more popular, providing immersive experiences that blend the virtual and real worlds. This is seen in devices like the Meta Quest and Microsoft HoloLens.

Interaction patterns of the future.

The chat interfaces we’re seeing today with ChatGPT and others are setting the foundation for multimodal systems of the future. The back-and-forth interaction volley will lay the groundwork for how systems interact with us, anticipating our needs, confirming our requests, and acting on our behalf.

We’ve seen glimpses of ubiquitous computing and intelligent interfaces in TV shows and movies, but it has always seemed just out of reach. Now, however, we’re on the cusp of an explosion of product innovation that advances in AI, computing, and hardware will finally enable. It’s exciting that what used to be science fiction and innovation concepts is now becoming reality. Movies and TV shows have played a large part in inspiring us to push further.

Minority Report (2002)

One of the most famous examples is the user interface in Minority Report. The movie has been highly influential in shaping public perceptions of interfaces and has inspired real-world applications of gesture-based interfaces and other emerging technologies. Most remember the gestural interface used by Tom Cruise’s character; what’s less widely known is the computer system behind it. John Underkoffler of Oblong Industries created the g-speak Spatial Operating Environment, which uses natural gestures — no keyboard, mouse, or command line. Underkoffler also worked on the gestural holographic interfaces in Iron Man (2008).

Her (2013)

The gesture-based interfaces in Minority Report and Iron Man gave us a glimpse of what's possible, while the voice interface in the movie Her showed us a future where our devices understand us on a deeply personal level. Her explores the idea of natural UI, the potential for human-like interactions with digital assistants, and the potential implications of a future where technology becomes more integrated into our personal lives.

Maeve’s “Attribute Matrix”, Westworld

Another great example is Westworld, which explores the concept of ephemeral interfaces and artificial intelligence, where android hosts adapt and respond to the guests' actions and preferences in real-time, creating a highly personalized and immersive experience in a technologically advanced amusement park. It gets even more interesting when the androids return to the “real world.”

Blade Runner 2049 (2017)

Blade Runner 2049 offers a glimpse into a future where projection mapping and augmented reality create immersive and dynamic environments. In the movie, projection mapping creates large-scale holographic displays that interact with the physical environment, creating an immersive and surreal atmosphere.

Iron Man’s J.A.R.V.I.S. AI assistant and the virtual assistant FRIDAY in Captain America: Civil War are great examples of natural UI. These interfaces demonstrate the potential for natural language processing and machine learning to create sophisticated and responsive digital assistants to help us navigate our increasingly complex and interconnected world.

Intelligent Interfaces promise to make our interactions with technology more natural and effortless. By leveraging our behaviors and instincts, they can also reduce the learning curve associated with new technologies, making them more accessible to a broader range of users.

Emerging technology shaping the future

The interface of the future is not chat — it’s multimodal and ubiquitous.

The future is not pages and pages of UI flows or detecting whether you’re on desktop or mobile. Our future interfaces will be intelligent, contextual, and ephemeral. Just enough interface compiled in real-time, based on context and relevance.

UI that appears when it’s needed and hidden when it’s not.

The ability to interact by voice, touch, or typing, easily switching modalities based on what’s natural for the user. Where interfaces are fluid, and sound and haptics enhance calm, ambient interactions. A proactive concierge that provides what’s needed based on understanding who you are and gets better the more you interact with it.

Systems that adapt to humans instead of the other way around.
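The idea of "just enough interface" can be sketched in a few lines of code. This is purely a hypothetical illustration of the pattern, not any shipping system's logic; all component names and context fields below are invented:

```python
# Hypothetical sketch: compile "just enough" UI from the current context.
# Component names and context fields are invented for illustration only.

def compile_interface(context: dict) -> list[str]:
    """Return only the UI components relevant right now; everything else stays hidden."""
    components = []
    if context.get("driving"):
        # Hands and eyes are busy: prefer voice and glanceable audio cues.
        components.append("voice_prompt")
    else:
        components.append("touch_panel")
    if context.get("meeting_in_minutes", 999) <= 30 and context.get("traffic") == "heavy":
        # Proactive nudge: surface a suggestion before the user asks for it.
        components.append("leave_early_suggestion")
    return components

print(compile_interface({"driving": True, "meeting_in_minutes": 20, "traffic": "heavy"}))
# ['voice_prompt', 'leave_early_suggestion']
```

The point of the sketch is the inversion: the interface is assembled from context at the moment of need, rather than the user navigating to a fixed screen.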

Imran Chaudhri, Co-Founder, Chairman, and President of Humane at TED 2023

Some companies are already working towards this vision of the future. At this year’s TED Conference, Humane’s Imran Chaudhri provided a preview of their unreleased tech: a system where AI, computer vision, and projection come together to create an assistant that’s with you throughout your day without using a phone — where the device disappears. Early views of this type of system are reminiscent of SixthSense, the wearable gestural interface Pranav Mistry demoed at TEDIndia in 2009 as his MIT Media Lab thesis project. Pattie Maes, who runs the Media Lab's Fluid Interfaces research group, created a huge buzz introducing the project on the TED main stage that year.

Mercury OS

Another team pushing the boundaries: former Apple designer Jason Yuan and Sam Whitmore have just received funding for new.computer, whose mission is to “create a future where computers intuitively adapt to humans, forging relationships as essential as the tools we use today.” Yuan may be best known for creating Mercury OS, a minimal, fluid reimagining of the traditional operating system focused on the user’s intention instead of apps and folders.

Refik Anadol, Machine Memoirs

And some of the most experimental art may push the boundaries and help shape how we interact with future systems. Refik Anadol uses projection mapping and machine learning to create immersive AI data sculptures and interactive art installations. Anadol's work blurs the line between the physical and digital worlds, creating beautiful and thought-provoking environments.

Advancements in AI, machine learning, natural language processing, and human-computer interaction will likely drive the evolution of intelligent interfaces over the next decade. As we think about the types of new interactions and experiences that will evolve, here are some of the trends and developments that will influence that future:

  1. Multimodal Interfaces: Future interfaces will likely combine text, voice, visual, and even tactile inputs and outputs. This will allow users to interact with AI in whatever way is convenient or intuitive for them at any given moment. For example, you might speak a command to your AI assistant, then receive a visual response on your smart glasses.
  2. Context-Aware Interfaces: AI will become better at understanding the context of user interactions. This means that the AI will understand what you're saying, where you are, what you're doing, and what you might need in that specific situation. This could involve integrating data from various sensors and sources to provide more relevant and personalized responses.
  3. Emotionally Intelligent Interfaces: AI will become more adept at recognizing and responding to human emotions. This could involve analyzing voice tones, facial expressions, or even physiological signals to understand the user's emotional state and adjust its responses accordingly.
  4. Proactive Interfaces: Instead of waiting for commands, AI interfaces will become more proactive, anticipating user needs based on patterns, habits, and preferences. For example, your AI assistant might suggest leaving early for a meeting if it knows there's heavy traffic on your usual route.
  5. Immersive Interfaces: With advancements in AR, VR, and projection mapping technologies like those demonstrated by Humane, we can expect more immersive AI interfaces. These technologies could allow for more natural and intuitive interactions with digital content.
  6. Collaborative Interfaces: AI will become more capable of collaborative problem-solving, working alongside humans to tackle complex tasks. This will involve understanding and contributing to human-like conversations, including recognizing when to take initiative and when to ask for clarification.
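Trends 1 and 4 above share a common mechanic: whatever modality an input arrives through, the system normalizes it into a single intent before acting, so downstream logic is modality-agnostic. A minimal hypothetical sketch, with all modality, payload, and intent names invented for illustration:

```python
# Hypothetical sketch of a multimodal interface: raw input from voice, touch,
# or gesture is mapped to one modality-agnostic intent before the system acts.
# All payload and intent names are invented for illustration only.

def normalize(modality: str, payload: str) -> dict:
    """Map raw input from any modality to a shared intent vocabulary."""
    vocabulary = {
        "play music": "media.play",      # spoken command
        "tap_play_button": "media.play", # touch event
        "swipe_up": "ui.dismiss",        # gesture
    }
    return {"intent": vocabulary.get(payload, "unknown"), "source": modality}

# The same intent can arrive by voice or touch; downstream logic doesn't care.
assert normalize("voice", "play music")["intent"] == normalize("touch", "tap_play_button")["intent"]
```

Keeping the intent layer separate from the input layer is what lets a user switch modalities mid-task, which is exactly the fluidity these trends describe.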

In the future, we can expect more personalized and immersive interfaces. AI and machine learning will continue to make interfaces smarter and more adaptive. They'll learn from our habits and preferences, making our interactions with technology more efficient and enjoyable. AR and VR will continue to evolve, creating more immersive and interactive experiences, blurring the lines between the physical and digital worlds, and creating new possibilities for interaction. Though still in their infancy, Brain-Computer Interfaces (BCIs) hold the promise of a future where we can interact with technology using our thoughts alone.

We'll see a move towards more continuous, personalized, ambient interfaces. These interfaces will be seamlessly integrated into our daily lives, allowing us to interact with AI in a more natural and intuitive way, similar to how we interact with other humans. Consider a combination of voice, gesture, gaze, and even thought-based interfaces in the future enabled by technological advancements like BCIs.

The latest research in the field is fascinating. Scientists are exploring everything from AI algorithms to brain-computer interfaces, and their findings could revolutionize how we interact with technology. Researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) are developing systems that can understand and respond to human emotions, potentially making our interactions with technology more empathetic and engaging. Meanwhile, researchers at the University of California, San Francisco, are making strides in BCI technology, recently developing a system that can translate brain signals into complete sentences.

Intelligent Interfaces represent the next frontier in human-computer interaction. They hold the promise of making our interactions with technology more natural, intuitive, and engaging. While significant challenges and ethical considerations exist, the potential benefits are immense. One thing is clear — how we interact with technology is about to change significantly.


This is a denser approach than my usual posts. As always, send me your feedback. If the community is very interested in this topic, I’ll write part two, where I’ll dive into the challenges and ethical considerations of intelligent interfaces and how the role of designers will evolve. I’ve been trying out ChatGPT as my research assistant; how did it do?

March 10, 2022

How Design Thinking and Emerging Technology Will Enhance Travel Experiences

Customer experience has become a significant competitive advantage in the travel industry, magnified by shifts in what travelers value. We’re on the cusp of an evolution in how emerging technology will enhance the travel experience from start to finish, moving from reactive to anticipatory and proactive customer experiences. 

Evolving expectations 

With more ways to spend their time and money than ever before, people expect more from their customer experiences. Today’s experiences are benchmarked against the best across all industries, which means companies compete with experiences completely outside their category for mindshare and wallet. So, when a company disrupts an industry or makes their service incredibly easy or more delightful, consumers wonder why everything can’t be that simple. Turns out, travel and technology are good companions. 

In the past, when a flight was canceled, it was enough for a travel company to supply travelers with the connection points and contact information to fix the issues themselves. We’re no longer in that era. Now, travelers will compare the self-service experience of dealing with a flight cancellation with the ease and simplicity of their favorite app — regardless of industry. The end-to-end service travelers receive is considered part of the product experience itself. 

The shift of customer focus from products to services to experiences has been happening for years, and the pandemic has only amplified the need for meaningful connection. Travelers are placing greater value on the trips they’re taking and the memories they are making. A recent report from Expedia Group found that 50% of travelers plan to spend more on trips than they did prior to the pandemic. 

Higher consumer expectations, coupled with an increased emphasis on the role of travel in our lives, have raised the bar considerably for travel providers who want to deliver great experiences.

Human-Centered Design 

To become a traveler-centric company, we must put our deep understanding of traveler needs, preferences, and behaviors at the core of our work. Human-Centered Design allows our cross-functional teams to activate our expertise and innovate in real time: connecting travelers with inspiring ideas to explore their world, streamlining the planning process, and keeping relevant information at their fingertips throughout their trip. 

While historically, travel providers have viewed the transaction as the end of a traveler’s experience, Human-Centered Design enables an experience-led product and service design process that maps the traveler’s journey end-to-end and informs every touchpoint they have along the way. 

A shift to holistic thinking and personalized experiences 

The travel industry has a history of optimizing experiences for search and transactions. Instead of focusing on the transaction, travel providers need to focus on the relationship travelers have with their brand and use that to build more intuitive, personalized, and proactive experiences. Taking a more holistic view of the experience frees us from thinking in transactional silos and highlights how all the pieces interconnect. From discovery and planning, to booking, in-trip, and post-trip — it helps connect the journey across all channels and time. 

Travelers don’t see parts of the experience or features in isolation; to them, it’s all one experience — and that’s exactly how companies need to see it too.

Making technology human 

Technology is an enabler of great experiences. Leveraging artificial intelligence, natural language processing, and predictive analytics, companies can create hyper-personalized interactions that adapt to a traveler’s context and work across every aspect of their journey. At their core, experiences need to be humanized, starting with a cohesive design and conversational tone, removing jargon, reducing complexity, and streamlining interactions.  

Once that foundation is ready, companies can deliver real-time, personalized experiences that meet travelers where they are and provide the right information, at the right time, in the right context. Personalization unlocks a new level of experience quality. It moves us from a ‘one-to-many’ to a ‘one-to-one’ conversation with customers. Reflecting people’s needs and preferences while providing value at every interaction also builds trust. Companies can use data to anticipate issues and solve them using customer preferences and light touch interactions. 

Natural language processing allows for multimodal interaction, so travelers can interact in the most natural way for them — whether that’s through typing, tapping, or voice. Voice interaction will become increasingly prevalent over time, enabling a new generation of experiences that deliver actionable insights and real-time personalized interfaces. 

What’s next – hyper-personalization and prediction 

What’s considered bleeding-edge now will become table stakes in the future as customers’ expectations evolve. Where we’re heading is hyper-personalized interactions that adapt to context, work across the entire journey, and solve problems before travelers even know they have them — the future is predictive and proactive.  

This shifts us from a place where flight cancellations cause additional time and stress, to a world where issues are solved before travelers even know there’s a problem. A world where flights are rebooked and itineraries updated before travelers even know their flight was canceled, with orchestration happening behind the scenes, reducing the complexity and stress when things change. Systems that get better the more you interact with them, increasing value to travelers by anticipating their needs. 

This is where the power of journey orchestration and proactive experiences really comes into view. Travel providers that take advantage of this trend can create better customer experiences, achieve higher conversion rates, and increase the value of each trip. They can also build long-term relationships with travelers instead of just transactions.

Originally posted at MyCustomer

Contact

info@kobewan.com

Design leadership and operations, building world-class organizations that integrate human-centered design to drive product innovation and customer-centric culture.

© 2023 kobewan

The present is a beta of a better future.