A recent Wall Street Journal article reporting that Orbitz shows Mac users more expensive hotels than it shows other visitors piqued my interest. At first, this seemed like a non-issue, as many sites customize experiences based on past behaviors or your profile. I didn’t understand why anyone was upset, save for PC users who had to take the unspoken, backhanded cheapskate swipe.
Issue or non-issue?
T3’s very own Ben Gaddis saw it as a non-issue and thinks it will happen more often in the future. I completely agree, but the outcry made more sense when I remembered that most people live in a world where cookies are sweet and retargeting is something kids (and T3 planners) do in video games.
Moving past the basic misunderstanding that Orbitz intends to charge more for the same hotel (simply untrue), it’s clear that people were unhappy when they found out the choice architecture changes were for a clearly defined group (Mac users) and regarding something most people consider bad (higher prices).
While it was undoubtedly a play for increased margins, Orbitz also took the logical step of showing people the kinds of hotels they seemed likely to book—hopefully improving their experience by decreasing search time and limiting the likelihood of abandonment before booking. The central issue that came up—and one that behavioral economics continues to wrestle with—is that people don’t like to believe they’re being led down a purchase path. Adding to this, Orbitz is built around having the lowest prices. This is clearly at odds with a choice architecture that looks like it’s leading customers to higher-priced hotels.
The trouble with optimizing.
What’s wrong with optimizing user experiences? Shouldn’t I see products like those I’ve bought or looked at in the past? Or those that people like me bought? We’re seeing all kinds of studies citing tablet and iOS users as big spenders—wouldn’t it make sense to optimize their experience to maximize revenue?
There’s clearly a fine line we need to tread between optimizing the experience and a choice architecture that’s too obvious.
Knowing where that line is comes from an understanding of trust, value exchange and what people expect from brand experiences. One person who commented on the WSJ article is obviously tired of the web and no longer trusts what she’s getting online: “From my Mac..Dear Orbitz, I’m done. I long for the return of the human Travel Agent. How I miss their straight-up support and good picks even if I am willing to pay more. Please spare me the algorithm-driven travel experience.” Clearly, her history includes being shortchanged online by brands.
Moving forward, if Big Data isn’t used wisely to improve experiences, we could find a growing crowd of people who feel the very same way.
Expecting impartial advice.
When using travel sites like Orbitz, there’s an implied promise of lowest prices, but choice architecture always leads to some amount of bias.
So what’s the right level of disclosure? “Just a heads up—we put our highest margin items first so they were easiest to choose”? Or “We’re creating content that supports how great our product is—take it with a grain of salt”? These types of disclaimers wouldn’t make much sense, but it’s definitely important to avoid creating obvious groupings (like all Mac users) and telling major U.S. newspapers about it.
Orbitz now has a tremendous but fleeting opportunity to respond and have real impact on its brand. Will it take this opportunity to explain that it delivers customized recommendations to help customers (if that’s actually true), not just that it charges more for the same hotels? Will it shrink from the issue and ignore it? Or will it simply and quietly change its system?