Thursday, November 29, 2007

Still surfing

I'm always relieved to hear that not everyone knows exactly what they want exactly when they want it, so I was pleased to come across a bunch of interesting media research in the public domain, done for the Corporation for Public Broadcasting, most of it here. Particularly interesting to me was the study of how viewers currently navigate their viewing options with Interactive Program Guides (IPGs), the semi-official name for those grids that float over our TV screens when we flick on our digital cable/DVR remotes.

What the research revealed is that despite the many ways we now have to get exactly what we want when we want it (TiVo, Netflix, OnDemand), the majority of television viewing (over 50%, respondents claim) is still driven by general surfing behavior. Unlike the web, which, most evidence suggests, is increasingly destination oriented, television remains well suited to the pleasurable experience of surfing or just seeing “what’s on.”

Also interesting was the way the research called attention to the IPG as an important media vehicle. The specific language in the IPG grids had a significant impact on viewing choices, both at the primary schedule-matrix level and at the secondary (more-info) level. Intriguing to me because it basically means that networks and anyone else interested in getting viewers to tune in should not take the language in grid listings for granted but should think of these IPGs as communication vehicles at least as effective as--if not more effective than--tune-in advertising.

Sunday, November 25, 2007

Sunday-afternoon viewing: the power of convention

Saw two movies this holiday weekend, for the first time in maybe a decade, and while the two could not be more disparate in theme and tone (the Coen brothers’ No Country for Old Men and Seinfeld’s Bee Movie), the combined experience of the audience reactions reinforced my generally cranky instincts about our deep investment in genre convention.

Like most of the pro critics, I found No Country pretty fantastic. The first half hour takes you right away: the clockwork structure, the breathtaking cinematography, the acting. Even the Coen brothers’ characteristic vices (easy glibness, playing with violence for effect) are replaced by a seriousness about the characters. And yet, most of the audience in the full suburban theater I saw it in weren't happy with the ending. They voiced their disappointment out loud: “I want my money back,” “It was such a good movie at the beginning.” I obviously don’t want to spoil it for anyone, but it’s fair to say that the movie undermines your expectations for a certain kind of conventional Western conclusion. The ending is totally consistent with the structural and narrative terms of the movie, but my audience didn’t care. They wanted Clint Eastwood. It reminded me once again how deeply invested we are in genre conventions and how little tolerance we have for cultural expressions that don't follow the rules.

Speaking of cultural conventions, Bee Movie falls right into the sarcastic center of kid culture these days, with the same knowing, ironic tone as the revolting Shrek franchise, and lots of screen time filled up with half-hearted set pieces satirizing (or pretending to satirize) adult culture (Larry King, Goodfellas, The Graduate). There is even a twenty-minute courtroom-drama parody. What? I realize these things are supposed to keep the adults entertained (aren't we clever that we can recognize scenes from other famous movies?), but it seems more like a failure of the imagination. When did kid movies become nothing more than an excuse to create set pieces for easy jokes? My kids stared at the screen just hoping something would happen beyond the Seinfeld character doing his bee stand-up routine. Do people really like this stuff? A bee version of Larry King?

But has anyone else mentioned that the movie is practically an allegory for Seinfeld’s own professional life? (Spoiler alert: don't read further if the plot of Bee Movie is important to your viewing satisfaction.) A bee rebels against the conventions of the hive because he doesn’t want to do one job for the rest of his life. When the bee’s rebellion is surprisingly successful and the bees win back all the honey they've ever made, the hive stops production because they have more honey than they know what to do with. The bees slip into aimless and unsatisfying lives of leisure. Unfortunately, this has disastrous effects on the natural world (no pollination, no flowers, etc.), and the bees realize they need to get back to work to save the world. Maybe it wasn’t all about the honey after all? Sucks to become irrelevant, doesn’t it?

Wednesday, November 21, 2007

Experimental office fiction #2

The only habit of highly successful people that Meyers could remember was #3: first things first. The other ones--something about being proactive, another that encouraged you to synergize--were too vague to be of much use. But the illuminating distinction between urgency and importance was so simple and so clear that it spoke to Meyers's hunger for a secret key, a filter, or even just a new perspective with which he could change the course of his career.

It was maybe bad luck that much of what had often been urgent in Meyers's life also turned out to be important. His failure to clean the gutters a few Octobers ago had had disastrous consequences for a load-bearing wall in his kitchen. Not to mention a scarring fight with his ex-wife. But this unpleasant memory did not diminish the power of the principle. Even years later, the urgent/important paradigm had a special place in his cognitive toolkit. Other systems (colored parachutes, personal brands, emotional intelligence) had come and gone, but first things first remained an operating principle. Whenever Meyers heard the phone ring, or recalled some annoying errand he’d been putting off for weeks, he would find himself asking: Is this just urgent? Or is it really important?

It’s true that there had been stretches of time when it was difficult for Meyers to identify something important enough to put off all the things he didn’t want to do, but that was no longer the case. Importance had been thrust upon him, and he felt a renewed energy and focus. He went to work the night after their first meeting, surfing the Internet, looking for more clues to the character of B--, the great man he was now responsible for advising.

Invisible forces seemed to be aligned in his favor once again, for Meyers discovered that B-- was speaking at a conference that very weekend at a resort in southern Maine. B-- was on a panel provocatively titled “Breaking the Rules.” Meyers immediately called the number listed on the website. The woman on the phone made sympathetic noises but explained that the conference was unfortunately fully booked. Meyers was not surprised. He had long been an avid conference attendee and knew how fast they filled up, especially with speakers of B--'s caliber. It had probably been booked for months.

The woman was explaining how he could sign up to see the speakers streamed on the Internet, but Meyers was already imagining the conference itself: all the small delightful details, from the excitement of choosing among the array of panels to the animated debates at day's end, the big talk about the future, the sense of possibility. You could find yourself talking to the founders of empires. You never knew where these conversations might lead. That’s why Meyers usually attended several a year. It had been another sticking point in his marriage. But what was more important than career development! An experienced conference attendee like Meyers knew that even first-rate conferences had a high rate of last-minute cancellations. The kind of people who let the urgent get in the way of the important, Meyers thought. But Meyers wasn't one of those people. He was already searching for a hotel room.

Tuesday, November 20, 2007

Intuition wins again

"Mock on, mock on, Voltaire, Rousseau:
Mock on, mock on: ‘tis all in vain!
You throw the sand against the wind,
And the wind blows it back again."
--W. Blake

Everyone's (or at least my) favorite columns in the Saturday NYT’s business pages--What’s Online and What’s Offline--explored (or summarized other explorations of) the perennial question in business culture: What makes a good manager? Method or instinct? A well-honed system or a well-developed gut? Judging by these two summaries, the gut is clearly winning, though maybe only because we’re tired of the alternatives.

The What’s Online column cited an article in The Economist, which in turn cites several other sources, including HBS professor Rakesh Khurana, critiquing the degradation of business schools from a “serious intellectual endeavour to a slapdash set of potted theories.” There are counter-arguments, including a study out of Stanford and LSE claiming that companies adopting business-school methods outperform the competition, but even this evidence seems compromised by the parallel column's celebration of intuition.

The core evidence in What’s Offline comes from the MIT Sloan Management Review (though Woman’s Day and Fast Company are also cited), which explains how you can improve your intuition. While the article seems designed to articulate a more complicated notion of intuition--

“Intuition is a highly developed form of reasoning that is based on years of experience and learning and on facts, patterns, concepts, procedures…stored in one's head.”

--the implications don’t exactly take us somewhere new: Managers should learn to seek out new experiences (to internalize a broader array of patterns), harness their emotional intelligence, and reflect on their intuitions before acting too rashly.

To me that’s starting to sound a lot like what we used to call "using your judgment" or even “paying attention,” but to be fair, I should check out the MIT article first.

What’s most interesting to me is the rising popularity of “intuition” as a quality of value in managers in general and leaders in particular, whether in leadership bios like Welch’s Straight from the Gut, in Gladwell’s Blink, or in celebrations of any kind of intelligence besides the old-fashioned kind: Emotional Intelligence, Social Intelligence, Multiple Intelligences. Perhaps business schools are already teaching courses entitled “Not just for women anymore: the uncanny instincts of successful leaders.”

When did intuition become so valuable a core competency? It's easy to understand why we all want it: it's faster and cheaper than research and analysis and a lot more exciting, almost mystical in its power. But why do we--in a data-rich field like business--suddenly value it so highly? Is it because we now have too much information, and it's too easy to get stuck in analysis-paralysis mode, unable to act? Or is the ascendancy of intuition a biz-culture corrective to the long period of overly restrictive, pseudo-scientific systems that dominated management training for decades, creating Whyte’s Organization Man and valuing systems and bureaucracy at the cost of creativity and innovation? I’m sure the story is more complicated than that, but it’s worth exploring further.

Saturday, November 17, 2007

Do they have to like it?

It’s an old truism in the agency business that creatives care too much about the quality (originality, aesthetic beauty, executional detail, etc.) of their work and not enough about whether it will actually build the brand and the business. This is supposedly changing, along with many other things in adland, but I don’t think it was even true in the old-fashioned world. And not because creatives were so focused on business results, but because marketers evaluate marketing (TV in particular) much more subjectively than they usually admit, whether it works or not.

The reason I’m making this observation now is that I recently heard about two marketing clients at significant brands who killed a campaign because the marketer and his staff simply “didn’t like it.” Even though there was abundant evidence--both hard and soft metrics--that the campaign was working, they just couldn’t get over the fact that the work represented their company in a style and tone they didn’t like. It was unconventional; they weren’t. Even if the target was responding, they just didn't want to be a talking monkey (that’s a placeholder).

It’s their prerogative, of course; they are writing the checks. But it’s more evidence that even though managers often talk a tough game when it comes to keeping costs down, some might be just as--if not more--sensitive to how the marketing looks and feels. Does my boss like it? Does my staff like it? Will I go down in company history as the idiot who approved the talking-monkey campaign?

You can hardly blame them. The work a CMO stewards is more public and exposed than the work of almost anyone else in the company, certainly at the executive level. Employees might not like what the CFO has planned for them, but most can’t complain about it. The same can’t be said for a TV ad or website or DM piece that is going to be seen by all the employees and their wives and most of their friends.

If it doesn’t work out, a creative can always leave it off the reel. A CMO has to live with it. It’s a dilemma, particularly when we have to challenge the client to break out of conventions in order to achieve their often ambitious goals. Of course you can always avoid clients who aren’t ready to walk the talk, but it’s hard to know that beforehand. I’m inclined to think that along with coming up with the great ideas, we also have to get good at making clients comfortable with them. A subject worthy of a post of its own.

Wednesday, November 14, 2007

New metrics: or some things I wish I could measure

Just sat through an engaging presentation from an online market-research company (OTX) that has some new brand-experience measurement tricks up its sleeve (modeling store environments, distracted exposures, etc.). But the presentation also reminded me of the other things I wish someone knew how to measure in a cost-effective way.

1) Brand utility: Of all the newly coined phrases describing the attention to what brands actually do for consumers (vs. what they merely show and tell), “brand utility” is my personal favorite. I first heard it in an interview on PSFK with Benjamin Palmer, though I think he co-credits Johnny Vulcan of Anomaly with the coinage. Wouldn’t it be great if we could develop a metric that would measure a brand’s utility factor (or whatever we call it) in relation to the competition? If what we are all saying is true, then “utility” should be just as important as, if not more important than, “relevance” and “uniqueness” and all the traditional metrics.

Imagine if we could compare the relative utility factors of all the travel sites out there: Travelocity, Kayak, Expedia, Orbitz, etc. This wouldn’t be a strict usability measure, which is easy enough to generate, but rather some combination of ease of use and the brand's ability to anticipate and facilitate your desired behavior, both on and offline. It would be a more comprehensive measure than one you could generate through a simple online user-satisfaction survey.

Amazon might be considered the gold standard of a certain kind of ecommerce transactional utility, a benchmark to which other ecommerce brands could aspire.

Once we established a baseline, we could--theoretically, of course--measure how much a given usability fix, service improvement, or new application impacted the score, and then (praise be the intelligent designer!) we might even determine whether the cost of the improvement was worth the result. Maybe someone can already do this, but I haven’t come across it.
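To make the arithmetic concrete, here's a minimal sketch in Python of how such a composite utility index might work. Everything in it--the component names, the weights, the scores--is invented purely for illustration; this is not an established metric:

```python
# Toy sketch of a composite "brand utility" index.
# Components and weights below are hypothetical placeholders.

def utility_index(scores, weights):
    """Weighted average of component scores (each 0-100)."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total_weight

# Hypothetical components: usability, anticipation of needs,
# and how well the brand follows through offline.
weights = {"ease_of_use": 0.4, "anticipation": 0.35, "offline_followthrough": 0.25}

baseline = {"ease_of_use": 70, "anticipation": 55, "offline_followthrough": 60}
after_improvement = {"ease_of_use": 78, "anticipation": 55, "offline_followthrough": 60}

before = utility_index(baseline, weights)
after = utility_index(after_improvement, weights)
lift = after - before  # how much one usability change moved the overall score
```

The interesting part, of course, wouldn't be the arithmetic but agreeing on the components and collecting honest scores for them.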

2) Sharability/Network/Viral effects: Another thing almost everyone wants these days is for their work to go all viral: to have consumers talk about their brands and products and advertisements for free, for fun. It’s not easy, but not impossible, to measure how well content (or a conversation, as we say today) travels around the web once it’s produced. But wouldn’t it be cool if we could pre-test the power of a piece of content in some kind of enclosed network? You might call it something like a Network Communication Test. The goal would be to see both how sharable a particular chunk of content was and to understand why. Which might even help us develop even more viral-icity.

3) Participation: Everyone not yet involved in social media who wants to be involved in social media looks at the rise of MySpace and Facebook and thinks it’s easy to get consumers to participate in their sites, rating and reviewing and uploading video of themselves doing stupid things with their friends when they're drunk. Those of us working on sites of various kinds know it’s not as easy as it looks--that we often have to prod consumers (with incentives of various kinds) to get them going, and optimize the mechanisms for involvement before the machine really starts humming. I would love to find some way to measure the interest in and barriers to participation on a competitive basis at significant scale, and then compare the effectiveness of various models of inciting involvement.

Any takers?

Tuesday, November 13, 2007

Experimental office fiction #1

The first resume Julia ever received in an official Human Resources capacity that she really remembers in detail was printed on hand-made paper, embedded with tendrils of organic material, leaves and petals of some flower. Lavender, Julia thought, because the smell reminded her of summer afternoons sitting around the lake at her father's house, where there was always lots of lavender. She remembers carrying it into the cubicle of her colleague, Linda S., who was equally astounded. They traded a couple of jokes about it. It was pretty paper, but you couldn’t even read the type printed over the tiny flowers. It looked like the candidate’s name was “Sucks,” though they both guessed it was probably Susan.

For years, Julia used this example at her speaking engagements at colleges and career fairs. She didn’t actually bring in the resume because that would be a violation of privacy, but told the “sucks” joke and spoke about the lavender scent filling her office. I wanted to put it in a vase, she said, dryly, at most of her speaking engagements. She always got a couple good laughs out of it, loosening everyone up, because who didn’t know not to do something this stupid?

Julia is remembering that resume now because she has just received another DVD in the mail. It isn't the first; they’ve been trickling in for the past year, actually. Julia was never a fan of this whole video thing. It seemed much too intimate. Not professional. But this one is different. In the past week, she’s gotten three copies of the same one. She isn't even sure it’s a resume. There isn’t a cover letter. Just one of those CD covers slid into the plastic sheath, with the title: My Destiny by Justin Clover.

This time, however, the background behind the title has pictures of a young man, presumably Justin, playing his guitar, rock climbing, smiling with his arm around a young woman, sitting on the beach with a large dog, even a baby photo! And most weirdly of all, a man in a white robe kneeling before another figure in a long red and white robe. It looks like a confirmation ceremony, at least to Julia, who is Catholic. She holds the DVD in her hand for a long thirty seconds. She isn’t sure she should put it in her computer, fearing it might contain some kind of virus.

Julia doesn’t sit in a cube anymore. Not for a long time. She’s a VP now, running a small HR group devoted to what they call “Transitions” at V--, which usually means firing people. She spends most of her time on exit interviews and employee-satisfaction surveys, but around spring she always helps troll through the unsolicited resumes. She decides to call Cherie, the VP in charge of most of the entry-level evaluations, but Cherie's not there. Julia goes back to the stack of resumes, but pretty soon her curiosity gets the better of her and, before she can think about it, she pushes the DVD into the drive and pulls her hand away quickly.

Right away, a soundtrack comes on. It's very loud choir music of some kind, so loud that Julia has to turn down the volume on her computer speakers. Strike #1, Julia thinks. On the screen there is a table of contents.

My Origins
My Joys
My Influences
The Truth
My Destiny

Julia can feel her heart racing, but she’s not entirely sure why, except maybe because she knows that those boys who shot their classmates made weird videos. A thought that convinces her she has to look. What if she could stop something terrible from happening?

With her hand on the mouse, she scrolls the cursor up and down the list. Finally she shuts her eyes and just clicks. She hears the whirring of the drive and almost instantly regrets it, feeling an odd and totally pointless panic.

For a second or two the screen is just white, which is a kind of relief. She gives it a moment, but nothing happens except the light gets brighter, so bright she has to squint. She didn’t know the computer could even get that bright. Shit, she thinks, it is a virus, and she’s about to hit the escape button when she hears a sound. She thinks it’s a thumping at first but then realizes it's actually more of a fluttering noise, or clicking, like a vast swarm of insects, their metallic wings clattering across the sky or pattering against a giant pane of glass. From the bright white panel, a pattern starts to emerge (a school of fish?): a mass of little dark shapes moving together through some murky surface.

She feels really weird now and she’s sure it’s nothing but trouble and hits eject. The drive dutifully spits the disc out. For a minute, Julia just sits there listening to the office noises and takes a deep breath, hoping she hasn’t infected the server with something. "Jesus Christ," she thinks and says softly to herself. She looks at the DVD again, a little metal tongue sticking out of her drive. But she doesn't touch it. She decides she'd better wait for Cherie and goes to get some coffee.

Monday, November 12, 2007

Sunday afternoon reading: The Maias or another re-discovery

I started a new translation of The Maias, by José Maria Eça de Queirós, on Sunday, which I'd come across in a review in the NYT, and it seems fantastic so far, at least if you are a fan of the late-Victorian novel. All the traditional themes are there (a large country house, a portrait of a multi-generational family in decline, from an aristocratic past to a set of dilettante-ish, university-educated aesthetes). But all from the pen of a Portuguese diplomat.

What I love about coming across books like The Maias is that I can still discover new books that have been around for a century, even though I've been a pretty serious semi-pro reader for much of my life.

Just when you think you can't possibly find another great work in a certain period, one turns up. Before The Maias, it was the great central European writers, like Joseph Roth, whose The Radetzky March I'd somehow missed even though I've heard it's a staple in German schools. The good and hopeful news, if you still bother to read novels, is that our greater access to international cultures will continually produce a stream of not just good or interesting but world-class, newly translated works which haven't yet reached a global audience.

It's highly unlikely--though not impossible--that I might still come across something great written in English that I haven't encountered before. But it's extremely likely that there are entire traditions I haven't yet come across. On a lot of fronts, the emerging global economy is a mixed bag. But when it comes to expanding our access to great art, it seems all good, at least on a Sunday afternoon with The Maias.

Thursday, November 8, 2007

Social network distinctions: the ego and the object

Those of us working with brands eager to get into some kind of social networking action (if only because they are so excited to see that consumers will talk about their brands for free) know how challenging it can be to explain the differences among the various options, let alone make a strong strategic case for the impact or sustainability of one play over another.

Fred Stutzman over at unit structures has a whole bunch of smart and well-researched ideas on the subject worth checking out. But over the past week, in response to the recent announcements about Google's Open Social and FB's Social Ads, he's articulated some particularly useful and provocative claims about the viability of social networks in two posts: here and here.

He starts with a distinction between ego-centric networks, which place the individual at the core of the network experience (FB, Friendster, Orkut), and object-centered networks, which place a social object at the center (Digg: the news item; Flickr: the photo). He then goes on to suggest the source of value in each network model:
Object-centric social networks offer core value, which is multiplied by network value. A great photo-hosting service like Flickr stands alone without the network, making it less susceptible to migration. An ego-centric network, on the other hand, has limited core-value - its value is largely in the network - making it highly susceptible to migration. We see this with Myspace: individuals lose little in terms of affordances when they migrate from Myspace to Facebook, making the main chore of migration network-reestablishment, a chore made ever-simpler as the migration cascade continues.

The key to maintaining value, says Stutzman, is maintaining situational relevance. Once ego-centric networks lose that relevance, their days are numbered. FB still has a lot of growth left in it, but eventually, Stutzman suggests, we'll be moving on to another bar down the street, no matter what they do to spruce the place up:
Try as they might, once ego-centric social networks lose situational relevance, its pretty much impossible for them to retain their status. Myspace users have exhausted the Myspace experience; they've done all they can do, they've found all the people they can find, so now its time to find a new context. We naturally migrate - we don't hang out in the same bar or restaurant forever, so why would we assume behavior would be any different online?

Here's where I get to my point: It's all about networks. The coolest tools, the best exclusive media - these are only "fingers in the dam" to keep users in non-situationally relevant spaces. Networks naturally migrate from place to place - slowly at first, followed by a cascade as switching costs decrease - and no tools or content or affordances can really stop that.
Agree or disagree, Stutzman's points offer a useful conceptual foundation for thinking through the right social network fit.
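To see why that core-value/network-value distinction has teeth, here's a toy model in Python. Every number in it is invented purely for illustration--it measures nothing about any real service--but it shows the shape of Stutzman's argument: when most of your friends migrate, an ego-centric network collapses while an object-centric one barely notices.

```python
# Toy illustration of Stutzman's core-value / network-value model.
# All values are invented; nothing here measures a real service.

def total_value(core_value, network_multiplier, friends_remaining):
    """Core value stands on its own; the network term scales with the
    fraction of your network still active (0.0 to 1.0)."""
    return core_value * (1 + network_multiplier * friends_remaining)

# Object-centric service (think Flickr): high core value, modest network effect.
photo_site_full = total_value(core_value=8, network_multiplier=0.25, friends_remaining=1.0)
photo_site_drained = total_value(core_value=8, network_multiplier=0.25, friends_remaining=0.1)

# Ego-centric service (think MySpace): low core value, big network effect.
social_site_full = total_value(core_value=2, network_multiplier=4.0, friends_remaining=1.0)
social_site_drained = total_value(core_value=2, network_multiplier=4.0, friends_remaining=0.1)

# Both start at the same total value, but when 90% of your friends leave,
# the photo site keeps most of its value while the ego-centric site craters.
```

The exact functional form is my assumption, not Stutzman's; the point is only that value built on the network drains away with the network, while value built on the object does not.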

Tuesday, November 6, 2007

What you measure is what you get: learning from artists

Heard a charming and instructive story on NPR yesterday morning about Auguste Escoffier, the famed 19th-century chef who invented veal stock and identified a secret kind of deliciousness no one knew existed. If you missed it, you can check out the summary here.

Up until and even after Escoffier’s invention, most people agreed with Democritus's (and later Plato's and Aristotle’s) assertion that there were four basic tastes: sweet, salty, sour, and bitter. But as Jonah Lehrer (author of Proust Was a Neuroscientist) writes, Escoffier's sauce seemed to defy this classification. Everyone agreed it was delicious and made everything it touched more delicious, but it wasn’t sweet or salty or sour or bitter. So what was it? Maybe everyone was just imagining it?

It took the work of a Japanese chemist, among others, to identify and isolate the missing ingredient, which turned out to be what we now call glutamate: a chemical compound created when organic matter breaks down (think cured meats and sauteed vegetables), and for which our tongues have special receptors.

Jonah’s point (and the one he explores in his book) is that artists (and writers and chefs) who examine and describe human experience can sometimes discover things that scientists don’t identify until much later. It's a nice, inspiring tale about the power of artistic imagination--and about the need to balance verified and testable facts with the evidence of our own experience.

But I hope it doesn't lead people to dismiss the power of analysis or the scientific method as fatally flawed. The biggest problem with certain scientific research methods isn’t that they are inaccurate, or even that they are incomplete, but that they pretend to be complete--or that we forget (overwhelmed by their authority, statistical or otherwise) that they are incomplete. Just because we haven't found a way to isolate and measure something doesn't mean it doesn't exist.

“What you measure is what you get” is an old cautionary saying in research circles. And it's particularly important to keep in mind for those of us suspended between creative and analytical fields. Practically speaking, it means a few different things:
1) Always keep in mind what your research is measuring and what it's not.
2) Don't rely on one piece of research, say one big survey, but draw data from multiple sources and methods.
3) Keep experimenting with the research itself, trying to measure new elements of consumer experience with new methods and questions.

Sunday, November 4, 2007

Sunday afternoon reading: Hume or a refreshing dose of skepticism

Reading Hume today for no reason other than that I love Hume. I miss Hume. He always clears my head of all the half-baked claims and blather I've been wading through for the past week. Here he is in a famous critique of dubious testimony about miraculous events, from his 1768 Essays and Treatises on Several Subjects:

“When anyone tells me, that he saw a dead man restored to life, I immediately consider with myself, whether it be more probable, that this person should either deceive or be deceived, or that the fact, which he relates, should really have happened. I weigh the one miracle against the other; and according to the superiority, which I discover, I pronounce my decision, and always reject the greater miracle.”
Ah, now doesn’t that feel better?

So, having fueled my skeptical fires, it seems like a good autumnal afternoon to question another set of claims that has been circulating around here a lot. I know I’m bound to come off as a recalcitrant crank on this point, but after reading post after post from respected colleagues in the field (Adrian and Gareth and Mark), I’m wondering if we might be stretching the distance between all our talk about a changed way of doing things and the amount of work that actually gets done that way.

Different advocates use different language--though usually with the same sets of examples (Nike, Axe, etc)--but the point is generally that brands today are built by doing and behaving and servicing rather than just showing and telling and entertaining.

Now I think this is true, or should be true, but channeling Hume, I'm inclined to ask, how often is it empirically true? I certainly haven't gotten to execute most of these new-world branding ideas. I've suggested a lot of them, and even sold through some behaviorist strategies (about the experience or even usability of a service vs. a brand benefit) but most of the brands didn’t really have the managerial will to fully execute on them. Or at least not yet.

In no way do I want to suggest that all of us who are making claims about this bold new era of brand development are lying or lazy or totally full of shit. (Though one comment I recently read claiming that ":30 second ads never really worked" seems to be solidly within bs territory.)

But in most cases, I think the points we’ve all been making do reflect an insightful response to the changed marketplace, but the opportunities to act on them--rather than talk or blog or consult about them--are still the exceptions to the rule, for the obvious reason that we don’t write the checks that pay for them. And most of the people who do write these checks are not willing to risk marketing budgets on experiments.

I recently spoke to a partner in one of the current crop of small agency hotshops trying to do things differently. They've got a differentiated strategic approach, rooted in all kinds of fancy research, backed by planners with very deep connections in the C-suite. Even still, when I asked him to tell me--honestly--how much of the work they do actually gets done his agency's special X-factor proprietary way, he said, "probably about a third." Another third they do "X-factor lite," and the remaining third of the business is totally conventional old-school advertising.

Don't get me wrong. I agree with everyone I've cited above. I left a pleasant job in a big agency for the same reason: because it seemed obvious that communications, let alone advertising, wasn't going to solve the business problems it was being asked to solve.

In my own current shop, we've built a new structure (based on a network model) designed to liberate brand development from communications. We've got lots of smart people working across a variety of disciplines and start every project with a deep strategic engagement, examining operations and product development as well as consumer experience. But how often do we really get to execute what we think is right? In a way that we think is best for the client? Off the top of my head, I'm not sure we beat my friend's odds.

And I don’t think that’s a bad thing. On the contrary, I think it’s the sign of a mature business, though it's a lot less glamorous than declaring we have a new way of doing everything.

Thursday, November 1, 2007

Marketing marketing or our unique proprietary exclusive something or other

Paul Soldera has a timely post on incomprehensible marketing language. He quotes a passage from a so-called Business Intelligence (BI!) company, pulled from the company's website (he generously removes the company's name), to represent an excessive use of marketing jargon. I'll quote it in full:

XXX is the first and only business intelligence (BI) platform to deliver a complete set of market-leading, end-to-end BI capabilities: best-in-class enterprise performance management (EPM), dashboards and visualization, reporting, query and analysis, and enterprise information management (EIM). XXX introduces significant innovations that deliver BI in new ways to a broad set of users.
These things happen, Paul suggests, when we try too hard to claim some unique or proprietary benefit for something that we'd be better off just trying to describe clearly.

It's a legit point, and these absurd, hyperbolic assertions make me cringe precisely because they resemble claims made by organizations I've worked for and have probably made myself (though I've tried to block them out). But as Paul also acknowledges, it's not quite that simple, as any of us in the client-service business know only too well.

The truth is that it's very hard to differentiate one kind of client service from another, which is another way of saying it's very hard to place a value on "thinking" (a point I've raised repeatedly here under the heading "getting paid"). One way, of course, is to sell a process rather than a product; another is to rely on past clients/work/results to do your talking for you; the third is some clever articulation of how what you do is a different and better version of what everyone else does--usually with a lot of "proprietaries" thrown in. It's this third way that usually leads to passages like the one above as we strive to differentiate our offerings.

I guess a fourth way would be not to differentiate in kind but in quality, by building value through execution, though that's equally hard to articulate and assert.