Monday, December 31, 2007

New year with a new car and new questions about internet research

Been car shopping for the past week, and yes, yes, I did my research. Did I research! I got the Consumer Reports pricing guides on a few cars; I checked out the obsessive threads on Edmunds listing all the prices various people paid all around the country. I compared Hondas and Toyotas and Audis and VWs and Saabs. I test-drove a half-dozen cars. I learned about invoice prices and holdbacks and money factors. I got my credit score and researched independent financing. I waited until the end of the month, today in fact, to finally purchase: supposedly the best time to buy a car. And I think I got a good deal on the car I wanted (more on that later), but I have two comments on this whole experience:

1) If you're a car person, I'm sure this is deeply satisfying: getting deep into the dynamics of the business at the same time as you work your angles to bring down the price. For a bunch of people, I'm sure it combines several of their greatest pleasures (technology, cars, money, negotiating, comparing stats). But if you're not, and if you have three kids bouncing off the walls as you surf endless car-buying guides and insanely unusable dealer websites, there is definitely a point of diminishing returns, at which added research doesn't necessarily add more value. Because if I have a point here beyond my own exhaustion with car shopping, it's that:

2) The information didn't always help. In fact, the info I found from multiple sources was frequently inconsistent and misleading, even from super credible sources like Consumer Reports and Equifax, even on figures that are supposed to be fixed, like residuals and money factors. One dealer insisted he was giving me the official residual and money factor from the brand, but I got totally different ones from online sources and other dealers. And the differences were dramatic.

2a) Even my credit score turned up better at the dealer than it did with a credit service one day before. I know this is because there are multiple credit bureaus and different companies update their numbers at different times, and I know that if I wanted to go even deeper into this whole deal-pursuit I could have found this out too, and perhaps used it for additional leverage.... And I'm sure if some car enthusiast happens to read this blog, he or she is going to inform me why my data might not have been accurate and how, if I'd only multiplied the number by the consumer confidence index and divided it by the number of dealers in a hundred-mile radius, I would have gotten the exact number!

2b) But the deal finally came down less to all my research than to my willingness to drive all over metropolitan Boston to a dozen different dealers and talk cash and refuse to pay most of them, until finally, in an act of old-fashioned brinkmanship, I called one dealer about 70 miles from my house and told them that if they could sell me this car at this particular price, I'd buy it today (today being today).

2c) My point isn't that the research was wasted. It helped provide useful context and some important facts; but it certainly wasn't all accurate, and it didn't all help my negotiations. You couldn't, as the saying goes, take a lot of it to the bank. At a certain point I was just rearranging the deck chairs. I knew what the car cost at invoice over a week ago. I just had to decide what I was willing to pay for it and then keep asking until someone said yes.

Everything else, I'm beginning to think, as I look back over the literally hundreds of hours I put into this great American project of car buying, is something you'd better enjoy to make it really worthwhile.

Sunday, December 23, 2007

West coast planners really know how to party

A couple of planning colleagues who work on the West Coast recently related experiences that had me wondering if there is an East/West divide in planning styles and cultures. They all described a little friction between the behaviors and styles we tend to associate with the East (edgy, ambitious, academic) and the attitudes we tend to associate with the West (creative, intuitive, cool).

These tales were reinforced by a British colleague who transferred straight from London to an S.F. shop and found that the natives didn't take instantly to his sharp intelligence and astringent wit. And though I'm a born and bred Midwesterner, there's no question my style of work has been influenced by 1,000 years of post-grad education, and I've occasionally faced similar reactions from West Coasters to my academic style. Don't sweat the technique, indeed.

I've already voiced my objection to the word "overthink" when used as an easy critique of some strategic analysis, usually by someone who doesn't want to put in the effort to wrestle with the problem. Don't get me wrong. You can certainly think badly about a problem. Or bring the wrong kind of thinking to bear. (In fact, a long time ago, Aristotle defined one kind of intelligence as the ability to match the right method of analysis to the subject at hand.) But whenever people use the expression "Don't overthink it," I wonder if they've ever bothered to engage with the current marketing landscape. Is SEM so simple? So intuitive?

In any case, these stories of cultural friction are probably isolated examples, and this whole East/West planning style divide I'm suggesting here probably doesn't exist. But wouldn't it just be more fun for us if it did? I seem to recall that an AAAA/APG conference a few years back set up an Iron Chef-style contest pitting boy planners against girl planners. Maybe we should suggest an East vs. West version for our revels in Miami? Bad Boy vs. Death Row, anyone? The Brits, in the fine expatriate tradition, can pick a coast based on current employment, visa status and personal inclination to embrace or deny their origins.

Friday, December 21, 2007

Anyone enjoying a singing xmas card about now?

Perhaps an audio ecard? Anyone opening a card and, as the song breaks out, smiling and thinking, "How charming!"? Anyone have the demos on people who are having that experience?

Monday, December 17, 2007

Thinking about switching

I've been thinking a lot about switching lately, partially because the Xmas season almost always makes me think I'd rather rotate out my old stuff than get anything new, and partially because I'm working on the health care industry, which--at least in the health-care-reform state of Massachusetts--is creating new options for consumers. Whether they will take advantage of the newly competitive marketplace, however, is another question. That's the question that has me thinking about switching costs.

Economists of the macro and micro variety have done a fair amount of work on the financial costs of switching, particularly when it comes to switching suppliers (from exit fees to breaches of contract). As the solid Wikipedia article points out, however, the emotional and social costs of switching are much harder to measure and consequently often under-estimated.

This is especially true with complex products, like financial services or health insurance. Even the thought of filling out the necessary paperwork is enough to stop most of us from considering the alternative. And when you add to the administrative burdens other secondary effects—the costs of explaining your new information (phone number, policy number) or learning a new system—the costs get pretty high pretty fast. The data suggest that cost savings (or some softer calculation of additional benefits) have to be in the 10% range to impact choice.

It might even be argued that as our lives and products get more complex, the resistance to switching is on the rise. The costs are relatively low when it comes to choosing a white chocolate peppermint spiced latte over your usual Mountain Dew smoothie. But when it comes to services and more complex products, it might require the aggressive strategies adopted by the telecom and cable industries, paying you to switch and then to switch back, to make a difference in the market. Anyone want to try a new doctor? It's on us.

Wednesday, December 12, 2007

Not for Little Children

Just watched Todd Field's Little Children on Netflix. It didn't strike me as a very successful movie on a number of fronts, covering familiar territory (repression in the suburbs) in a way that was neither particularly convincing (I could not figure out who these people were) nor very inventive. The most striking thing about it, to my jaded eye, was that it focused so much attention on masturbation as a symbol of marital discord. Watching the mocking portrayal of one of the husbands as he masturbates to Internet porn (only a real freak would ever do something like that!), it struck me that I've been seeing a lot of this lately, by which I mean: portrayals of masturbation as a marker of failing relationships or worse.

In Apatow's recent Knocked Up, Paul Rudd's uptight and controlling wife derisively describes to her sister catching her husband masturbating. And in HBO's execrable short series Tell Me You Love Me, masturbation repeatedly turns up as a sign of trouble. Now, I'm not going to start a support group, but it strikes me as interesting, and kind of creepy, that so many cultural products are pathologizing what might be described as innocent fun and a desirable alternative to extra-marital adventures.

When did jacking go so wrong? I mean, since the writings of St. Paul. Was it American Beauty's portrayal of the masturbating hero as a symbol of the suburban man trapped in a loveless marriage and a dead-end job?

I'm not quite sure how to interpret it, except maybe as a cultural expression of our increasing need to imagine and represent relationships so perfect that they fulfill every possible desire, including those of our fantasy lives.

Monday, December 10, 2007

Sunday Afternoon Viewing: The Mikado

Partially to clear my head from the seasonal onslaught of kid-oriented shitertainment and partially because we're big G&S fans around my house, I took two-thirds of the progeny to the Harvard-Radcliffe student production of The Mikado, which turned out to be fun on multiple levels. Like almost all student productions, the performances were uneven. Some of the kids were a touch overmatched by the demanding pacing of the G&S songs; others were fantastic. Brian Polk (Ko-Ko, a.k.a. the Lord High Executioner) is probably on his way to a professional career somewhere. But who cares? The enthusiasm and effort--as well as the charmingly intimate Agassiz Theatre--more than made up for the occasional lack of polish, particularly in our contemporary entertainment context of lots of overproduced junk.

Like all trips back in time, this one threw into high relief what my very rich entertainment diet has been lacking lately, which I might call a little old-fashioned spectacle. An orchestra pit, kettle drums!, costumes, make-up. If you think your kids are too jaded by PlayStation to enjoy a theatrical production that doesn't involve people in oversized animal costumes, you might be surprised to watch them react to an old-fashioned lighting design. My four-year-old was fixated on how the lighting transformed the stage from morning to night.

Which reminded me of one of the things I've always found weak about Steven Johnson's Everything Bad Is Good for You. While I agree with a lot of his argument (e.g., television shows have gotten more complex, videogames are an under-valued medium) and the way he challenges a lot of assumptions about "good" and "bad" media experiences, it seems to me that he also lets some of his own key assumptions go unexamined. Most relevant here is the assumption that greater complexity or more intense stimulation leads to a better (more engaging, more enriching, more challenging, more fun) experience.

While it's certainly true that a great video game like Halo or a breakthrough serial television show like The Sopranos forces you to pay attention to a complex array of stimuli, it also distracts you from other fun aesthetic experiences that value emotional sensitivity over sensory stimulation, e.g., watching the characters react to one another's performances onstage. Don't get me wrong, I like them all: Halo, The Sopranos and The Mikado. But a trip to a student production of Gilbert & Sullivan is a nice reminder that there is more than one way to engage a viewer of any age, no matter how many video games they play.

Thursday, December 6, 2007

Experimental office fiction #3

One of the disappointing consequences of great accomplishments, B— sometimes thought, was that they didn’t seem great after a while. A shockingly little while, it turned out. When he reflected over his career, B— could see a clear pattern repeat itself: the flicker of ambition as he eyed uncharted territory, the thrill of pursuit, the early failures which only inspired him to more intense and obsessive efforts, and then, in the midst of some battle, the sudden inspiring discovery (a new approach, a brokered deal, a relenting enemy--like some enormous gear once rusted shut slowly breaking free and turning into place) and then the rush toward inevitable success, about which he remembered a lot less. It was like that with every domain he had pursued: real estate, technology, talent. Each new world—however strange and alien--followed the same laws; they had all knelt before the forces he'd brought to bear.

These fits of self-consciousness always bothered him. Most of the time, he was too busy to think about anything but his next step. But on his jet, sipping a gin gimlet, watching the American landscape flow predictably by like a favorite TV show he'd watched as a child, he'd fall into these sick reveries. At first he thought it was just the lack of distractions. It was the only time in his life he was truly trapped. He had to just sit there. He tried books, movies, games, drugs, sex. None of it really helped. As soon as the plane passed over the clouds, and he caught sight of the curve of the earth, he started to feel maudlin, reflective, pointlessly philosophical.

He had complained about it once to Hannah, and she'd remarked that this was a common problem. "Had he heard the expression, train of thought?" she asked. "Was I claiming originality?" B— shot back, his feelings suddenly hurt. Hannah patted his head, explaining that it was only the curse of the prospect. "It was the fault of the landscape. Don't blame yourself," she said. "A hundred years ago, all they had were mountains."

He'd looked out the window again at the mountains, followed by small middle American cities huddled against nameless rivers, then the endless farms, then more mountains, desert, all rushing by. How did everything get so small? Maybe he should stay on the ground. He never felt this way when he was walking around. Or driving. Hannah just sat back in her seat and laughed.

He couldn't take her for more than a day when they were together, but now he missed her. He remembered how she'd sit on the edge of his desk in her rich-girl lesbian-avenger boots making fun of him. Whenever he went on too long, expounding on some success, she'd start in with a little one-word song: redundant, redundant, redundant, redundant… She didn't give a shit. That, at least, was refreshing.

He tried to focus on something more concrete—the meeting in Denver then the call to New York—but suddenly the plane bucked like a pick-up riding over a pothole. He instinctively grabbed hold of the armrests and looked out the window.

He expected a storm cloud, but instead he saw bright sky, the sun shining off the wings, sharp and clear. Then his eye was drawn to the space around the plane. Something filled the sky with a kind of texture. It seemed to almost take shape and then dissipate again, like static on a pre-digital television or a flock of starlings banking sharply. Could it be birds? No, it was too small. Maybe bugs. But then, as he stared, the thousand little dots definitely took shape, a curve or blade or giant black wing curving alongside the jet. He hunched up in his seat to get a better view, and to his shock, he saw that this thing seemed to be casting a shadow on the ground two miles down, a vast shadow spreading across acres of farmland. He slid the shade down and yelled to CJ, who was up flirting with the pilots. "What the fuck?" he shouted.

Instantly, he saw her pretty smiling face lean into the aisle. She'd been on his plane for 14 months now and knew him pretty well. In an instant, she could tell he was upset. She pursed her lips in a sympathetic way that gave her otherwise very un-maternal face a maternal quality that B— liked in spite of himself. "What's wrong?" she asked again. B— was about to try to explain what he'd seen out the window but instantly realized nothing good could come of it. What was the point? He was either fleeing from something or he wasn't. What he needed was something to chase.

He asked for another drink.

Wednesday, December 5, 2007

Taglines: who needs them?

A recent article in Brandweek by T.L. Stanley on the diminishing importance of taglines has caught the b-sphere's attention because it rightly recognizes something we've all noticed: lots of great big brands aren't bothering with them anymore. Stanley's diagnosis of the tagline's declining relevance is, however, less convincing in my view. Here are T.L.'s reasons:

1) The shorter tenure of CMOs (which I'm not sure I understand except as a general point about risk-aversion, repeated in 3 below)
2) The proliferation of media channels, which means we have ways to reach consumers that don't require taglines
3) Focus groups, or at least an excessively rational relation to the function of the tagline: using the tagline as a "safety net" rather than a rallying cry.

Stanley's second reason is the only one that makes any sense. The first and third are tautological: taglines no longer matter because we are only producing bad taglines that no longer matter. In fact, the article goes on to name a couple (GE's "Imagination at work," for one) that it thinks are great. Come to think of it, the piece never really distinguishes a good tagline from a bad one to begin with.

Responding to the same article, Gareth cites a couple more reasons for the tagline's demise that are more convincing than the original article's: most importantly, the fact that a consumer's relation to a brand is often more influenced by what a brand does (and how it helps them) than by what it says. Or as GK puts it: actions speak louder than words.

I'd add one addendum to the expanding story: taglines might be increasingly unimportant to consumers, but in my experience, they are still very important to marketers. Most of them still ask the marketing services company to provide something clear, concise and inspirational that expresses the brand idea/experience. Call it what you like: tagline, handle, rallying cry. Most marketers still like and need to have a quick way of representing the brand idea to the company as a whole, to align the company around the vision and inspire everyone to take action on it.

Monday, December 3, 2007

Neither this, nor that

In the past week, I’ve heard three senior staff members from totally different but equally successful creative marketing firms tell me that “agency” is a bad word at their company. They might be a studio or a consultancy or a design shop or a next-generation interactive integrated something-or-other but just don’t call them an agency. Because the one thing they are most certainly NOT is an agency.

I realize, of course, that some of this agency-resistance is just a natural consequence of our attempts to differentiate ourselves in a competitive industry full of smart and talented people. And part of it is an attempt to distance our ventures from a particularly limited kind of agency that focuses only on TV ads as the answer to all a client's business or marketing problems.

But I still find it kind of annoying, for a couple of reasons. First, because I like the idea of the advertising agency at its best. I like the original notion that creative thinking and expression can have a big impact on a business. I like the breadth, flexibility and fertility of the model: the sheer number and variety of agencies, big and small, creative and integrated, interactive, independent, direct, etc., etc. I even like the name itself: how "agency" articulates both an organization and an action in a single word.

One of my favorite Freudian concepts is what he called the narcissism of small differences: the fact that we tend to dislike people who differ from us only slightly, and often more intensely than we dislike people who are very different from us. Freud's idea was that we reserve our especially intense antipathies for the "nearly-we" because they threaten our sense of self much more strongly than the "other," people who have nothing to do with us.

Thursday, November 29, 2007

Still surfing

I'm always relieved to hear that not everyone knows exactly what they want exactly when they want it, so I was pleased to come across a bunch of interesting media research in the public domain, conducted for the Corporation for Public Broadcasting, most of it here. Particularly interesting to me was the study of how viewers currently navigate their viewing options with Interactive Program Guides (IPGs), the semi-official name for those grids floating over our TV screens when we flick on our digital cable/DVR remotes.

What the research revealed is that despite the many ways we now have to get exactly what we want when we want it (TiVo, Netflix, On Demand), the majority of television viewing (over 50%, the respondents claim) is still driven by general surfing behavior. Unlike the web, which, most evidence suggests, is increasingly destination-oriented, television remains well suited to the pleasurable experience of surfing or just seeing "what's on."

Also interesting was the way the research called attention to the IPG as an important media vehicle. The specific language in the IPG grids had a significant impact on viewing choices, both at the primary schedule-matrix level and at the secondary (more-info) level. Intriguing to me because it basically means that networks, and anyone else interested in getting viewers to tune in, should not take the language in grid listings for granted but should think of these IPGs as communication vehicles at least as effective as--if not more effective than--tune-in advertising.

Sunday, November 25, 2007

Sunday-afternoon viewing: the power of convention

Saw two movies this holiday weekend, for the first time in maybe a decade, and while the two could not be more disparate in theme and tone (the Coen brothers' No Country for Old Men and Seinfeld's Bee Movie), the combined experience of the audience reactions reinforced my generally cranky instincts about our deep investment in genre convention.

Like most of the pro critics, I found No Country pretty fantastic. The first half hour takes you right away: the clockwork structure, the breathtaking cinematography, the acting. Even the Coen brothers' characteristic vices (easy glibness, playing with violence for effect) are replaced by a seriousness about the characters. And yet, most of the audience in the full suburban theater where I saw it weren't happy with the ending. They voiced their disappointment out loud: "I want my money back," "It was such a good movie at the beginning." I obviously don't want to spoil it for anyone, but it's fair to say that the movie undermines your expectations for a certain kind of conventional Western conclusion. The ending is totally consistent with the structural and narrative terms of the movie, but my audience didn't care. They wanted Clint Eastwood. It reminded me once again how deeply invested we are in genre conventions and how little tolerance we have for cultural expressions that don't follow the rules.

Speaking of cultural conventions, Bee Movie falls right into the sarcastic center of kid culture these days, with the same knowing, ironic tone as the revolting Shrek franchise, and lots of screen time filled up with half-hearted set-pieces satirizing (or pretending to satirize) adult culture (Larry King, Goodfellas, The Graduate). There is even a twenty-minute courtroom-drama parody. What? I realize these things are supposed to keep the adults entertained (aren't we clever that we can recognize scenes from other famous movies), but it seems more like a failure of imagination. When did kid movies become nothing more than an excuse to create set pieces for easy jokes? My kids stared at the screen just hoping something would happen beyond the Seinfeld character doing his bee stand-up routine. Do people really like this stuff? A bee version of Larry King?

But has anyone else mentioned that the movie is practically an allegory of Seinfeld's own professional life? (Spoiler alert: don't read further if the plot of Bee Movie is important to your viewing satisfaction.) A bee rebels against the conventions of the hive because he doesn't want to do one job for the rest of his life. When the bees' rebellion is surprisingly successful and they win back all the honey they've ever made, the hive stops production because they have more honey than they know what to do with. The bees slip into aimless and unsatisfying lives of leisure. Unfortunately, this has disastrous effects on the natural world (no pollination, no flowers, etc.), and the bees realize they need to get back to work to save the world. Maybe it wasn't all about the honey after all? Sucks to become irrelevant, doesn't it?

Wednesday, November 21, 2007

Experimental office fiction #2

The only habit of highly successful people that Meyers could remember was #3: first things first. The others (something about being proactive, another that encouraged you to synergize) were too vague to be of much use. But the illuminating distinction between urgency and importance was so simple and so clear that it spoke to Meyers' hunger for a secret key, a filter or even just a new perspective with which he could change the course of his career.

It was maybe bad luck that much of what had often been urgent in Meyers’ life also turned out to be important. His failure to clean the gutters a few Octobers ago had had disastrous consequences on a load-bearing wall in his kitchen. Not to mention a scarring fight with his ex-wife. But this unpleasant memory did not diminish the power of the principle. Even years later, the urgent/important paradigm had a special place in his cognitive tool-kit. Other systems (colored parachutes, personal brands, emotional intelligence) had come and gone but first things first remained an operating principle. Whenever Meyers heard the phone ring or just recalled some annoying errand he’d been putting off for weeks, he would find himself asking himself: Is this just urgent? Or is it really important?

It’s true that there had been stretches of time when it was difficult for Meyers to identify something important enough to put off all the things he didn’t want to do, but that was no longer the case. Importance had been thrust upon him, and he felt a renewed energy and focus. He went to work the night after their first meeting, surfing the Internet, looking for more clues to the character of B--, the great man he was now responsible for advising.

Invisible forces seemed to be aligned in his favor once again, for Meyers discovered that B-- was speaking at a conference that very weekend at a resort in Southern Maine. B-- was on a panel provocatively titled "Breaking the Rules." Meyers immediately called the number listed on the website. The woman on the phone made sympathetic noises but explained that the conference was unfortunately fully booked. Meyers was not surprised. He had long been an avid conference attendee and knew how fast they filled up, especially with speakers of B—'s caliber. It had probably been booked for months.

The woman was explaining how he could sign up to see the speakers streamed on the Internet, but Meyers was already imagining the conference itself: all the small delightful details, from the excitement of choosing among the array of panels to the animated debates at day's end, the big talk about the future, the sense of possibility. You could find yourself talking to the founders of empires. You never knew where they might lead. That's why Meyers usually attended several a year. It had been another sticking point in his marriage. But what was more important than career development! An experienced conference-attendee like Meyers knew that even first-rate conferences had a high rate of cancellations at the last minute. The kind of people who let the urgent get in the way of the important, Meyers thought. But Meyers wasn't one of those people. He was already searching for a hotel room.

Tuesday, November 20, 2007

Intuition wins again

"Mock on, mock on, Voltaire, Rousseau:
Mock on, mock on: ‘tis all in vain!
You throw the sand against the wind,
And the wind blows it back again."
--W. Blake

Everyone's (or at least my) favorite columns in the Saturday NYT business pages—What's Online and What's Offline—explored (or summarized other explorations of) a perennial question in business culture: What makes a good manager? Method or instinct? A well-honed system or a well-developed gut? Judging by these two summaries, the gut is clearly winning, but maybe only because we're tired of the alternatives.

The What's Online column cited an article in The Economist, which in turn cites several other sources, including HBS professor Rakesh Khurana, critiquing the degradation of business schools from a "serious intellectual endeavour to a slapdash set of potted theories." There are also counter-arguments, including a study out of Stanford and LSE claiming that companies that adopt business school methods outperform the competition, but even this evidence seems compromised by the parallel column's celebration of the power of intuition.

The core evidence in What's Offline is from the MIT Sloan Management Review (though Woman's Day and Fast Company are also cited), which explains how you can improve your intuition. While the article seems designed to articulate a more complicated notion of intuition--

“Intuition is a highly developed form of reasoning that is based on years of experience and learning and on facts, patterns, concepts, procedures…stored in one's head.”

--the implications don’t exactly take us somewhere new: Managers should learn to seek out new experiences (to internalize a broader array of patterns), harness their emotional intelligence, and reflect on their intuitions before acting too rashly.

To me that’s starting to sound a lot like what we used to call "using your judgment" or even “paying attention,” but to be fair, I should check out the MIT article first.

What's most interesting to me is the rising popularity of "intuition" as a quality of value in managers in general and leaders in particular, whether in leadership bios like Welch's Straight from the Gut, or Gladwell's Blink, or celebrations of any kind of intelligence besides the old-fashioned kind: Emotional Intelligence, Social Intelligence, Multiple Intelligences. Perhaps business schools are already teaching courses entitled "Not just for women anymore: the uncanny instincts of successful leaders."

When did intuition suddenly become so valuable a core competency? It's easy to understand why we all want it: it's faster and cheaper than research and analysis and a lot more exciting, almost mystical in its power. But why do we--in a data-rich field like business--suddenly value it so highly? Is it because we now have too much information, and it's too easy to get stuck in an analysis (paralysis) mode and be unable to act? Or is the ascendancy of intuition a biz-culture corrective to a long period of overly restrictive, pseudo-scientific systems that dominated management training for decades, creating Whyte's Organization Man and valuing systems and bureaucracy at the cost of creativity and innovation? I'm sure the story is more complicated than that, but it's worth exploring further.

Saturday, November 17, 2007

Do they have to like it?


It's an old truism in the agency business that creatives care too much about the quality (originality, aesthetic beauty, executional detail, etc.) of their work and not enough about whether it will actually build the brand and the business. Now this is supposedly changing, along with many other things in adland, but I don't think it was even true in the old-fashioned world. And not because creatives were so focused on business results, but rather because marketers evaluate marketing (TV in particular) much more subjectively than they usually admit, whether it works or not.

The reason I'm making this observation now is that I recently heard about two marketing clients at significant brands who killed a campaign because the marketer and his staff simply "didn't like it." Even though there was abundant evidence--both hard and soft metrics--that the campaign was working, they just couldn't get over the fact that the work represented their company in a style and tone that they didn't like. It was unconventional; they weren't. Even if the target was responding, they just didn't want to be a talking monkey (that's a placeholder).

It's their prerogative, of course; they are writing the checks. But it's more evidence that even though managers often talk a tough game when it comes to keeping costs down, some might be just as, if not more, sensitive to how the marketing looks and feels. Does my boss like it? Does my staff like it? Will I go down in company history as the idiot who approved the talking monkey campaign?

You can hardly blame them. The work a CMO stewards is more public and exposed than the work of almost anyone else in the company, certainly at the executive level. Employees might not like what the CFO has planned for them, but most can't complain about it. The same can't be said for a TV ad or website or DM piece that is going to be seen by all the employees and their wives and most of their friends.

If it doesn't work out, a creative can always leave it off the reel. A CMO has to live with it. It's a dilemma, particularly when we have to challenge the client to break out of conventions in order to achieve their often ambitious goals. Of course you can always avoid clients who aren't ready to walk the talk, but it's hard to know that beforehand. I'm inclined to think that along with coming up with the great ideas, we also have to get good at making clients comfortable with them. A subject that is worthy of a post of its own.

Wednesday, November 14, 2007

New metrics: or some things I wish I could measure


Just sat through an engaging presentation from an online market research company (OTX) which has some new brand-experience measurement tricks up its sleeve (modeling store environments, distracted exposures, etc.). But the presentation also reminded me of the other things I wish someone knew how to measure in a cost-effective way.

1) Brand utility: Of all the newly coined phrases describing the attention to what brands actually do for consumers (vs. what they merely show and tell), "brand utility" is my personal favorite. I first heard it in an interview on PSFK with Benjamin Palmer, though I think he co-credits Johnny Vulcan of Anomaly with the coinage. Wouldn't it be great if we could develop a metric that would measure a brand's utility factor (or whatever we call it) in relation to the competition? If what we are all saying is true, then "utility" should be just as important as, if not more important than, "relevance" and "uniqueness" and all the traditional metrics.

Imagine if we could compare the relative utility factors of all the travel sites out there: Travelocity, Kayak, Expedia, Orbitz, etc. This wouldn't be a strict usability measure, which is easy enough to generate, but rather some combination of ease of use and the brand's ability to anticipate and facilitate your desired behavior, both on- and offline. It would be a more comprehensive measure than one you could generate through a simple online user-satisfaction survey.

Amazon might be considered the gold standard of a certain kind of ecommerce transactional utility, to which other ecommerce brands could aspire.

Once we established a baseline, we could--theoretically, of course--measure how much a given usability fix, service improvement or new application impacted the score, and then (praise be the intelligent designer!) we might even determine whether the cost investment in the improvement was worth the result. Maybe someone can already do this, but I haven't come across it.

2) Sharability/Network/Viral effects: Another thing almost everyone wants these days is for their work to go all viral: to have consumers talk about their brands and products and advertisements for free, for fun. It's not easy, but not impossible, to measure how well content (or a conversation, as we say today) travels around the web once it's produced, but wouldn't it be cool if we could pre-test the power of a piece of content in some kind of enclosed network? You might call it something like a Network Communication Test. The goal would be to see both how sharable some particular chunk of content was and why it was so sharable, which might even help us develop even more viral-icity.

3) Participation: Everyone not yet involved in social media but who wants to be looks at the rise of MySpace and Facebook and thinks it's easy to get consumers to participate in their sites, rating and reviewing and uploading video of themselves doing stupid things with their friends when they're drunk. Those of us working on sites of various kinds know it's not as easy as it looks: we often have to prod consumers (with incentives of various kinds) to get them going and optimize the mechanisms for involvement before the machine really starts humming. I would love to find some way to measure the interest in and barriers to participation on a competitive basis at a significant scale, and then compare the effectiveness of various models of inciting involvement.

Any takers?

Tuesday, November 13, 2007

Experimental office fiction #1

The first resume that Julia ever received in an official Human Resources capacity that she really remembers in detail was printed on hand-made paper, embedded with tendrils of organic material: leaves and petals of some flower. Lavender, Julia thought, because the smell reminded her of summer afternoons sitting around the lake at her father's house, where there was always lots of lavender. She remembers carrying it into the cubicle of her colleague, Linda S., who was equally astounded. They traded a couple of jokes about it. It was pretty paper, but you couldn't even read the type that was printed over the tiny flowers. It looked like the candidate's name was "Sucks," though they both guessed it was probably Susan.

For years, Julia used this example at her speaking engagements at colleges and career fairs. She didn’t actually bring in the resume because that would be a violation of privacy, but told the “sucks” joke and spoke about the lavender scent filling her office. I wanted to put it in a vase, she said, dryly, at most of her speaking engagements. She always got a couple good laughs out of it, loosening everyone up, because who didn’t know not to do something this stupid?

Julia is remembering that resume now because she just received another DVD in the mail. It wasn’t the first; they’d been trickling in for the past year actually. Julia wasn’t a fan of this whole video thing. It seemed much too intimate. Not professional. But this is different. In the past week, she’s gotten three copies of the same one. She isn't even sure it’s a resume. There isn’t a cover letter. Just one of those CD covers slid into the plastic sheath with the title: My Destiny by Justin Clover.

This time, however, the background behind the title has pictures of a young man, presumably Justin, playing his guitar, rock climbing, smiling with his arm around a young woman, sitting on the beach with a large dog, even a baby photo! And most weirdly of all, a man in a white robe kneeling before another figure in a long red and white robe. It looks like a confirmation ceremony, at least to Julia, who is Catholic. She holds the DVD in her hand for a long thirty seconds. She isn’t sure she should put it in her computer, fearing it might contain some kind of virus.

Julia doesn't sit in a cube anymore. Not for a long time. She's a VP now, running a small HR group at V-- devoted to what they call "Transitions," which usually means firing people. She spends most of her time on exit interviews and employee-satisfaction surveys, but around spring she always helps troll through the unsolicited resumes. She decides to call Cherie, who is the VP in charge of most of the entry-level evaluations, but Cherie's not there. Julia goes back to the stack of resumes, but pretty soon her curiosity gets the better of her and, before she can think about it, she pushes the DVD into the drive and pulls her hand away quickly.

Right away, a soundtrack comes on. It's very loud choir music of some kind, so loud that Julia has to turn down the volume on her computer speakers. Strike #1, Julia thinks. On the screen there is a table of contents.

My Origins
My Joys
My Influences
The Truth
My Destiny

Julia can feel her heart racing, but she’s not entirely sure why, except maybe because she knows that those boys who shot their classmates made weird videos. A thought that convinces her she has to look. What if she could stop something terrible from happening?

With her hand on the mouse, she scrolls the cursor up and down the list. Finally she shuts her eyes and just clicks. She hears the whirring of the drive and almost instantly regrets it, feeling an odd and totally pointless panic.

For a second or two the screen is just white, which is a kind of relief. She gives it a moment, but nothing happens except that the light gets brighter, so bright she has to squint. She didn't know the computer could even get that bright. Shit, she thinks, it is a virus, and she's about to hit the escape button when she hears a sound. She thinks it's a thumping at first but then realizes it's actually more of a fluttering noise, or clicking, like a vast swarm of insects, their metallic wings clattering over the sky or pattering against a giant pane of glass. From the bright white panel, a pattern starts to emerge (a school of fish?), a mass of little dark shapes moving together through some murky surface.

She feels really weird now and she’s sure it’s nothing but trouble and hits eject. The drive dutifully spits the disc out. For a minute, Julia just sits there listening to the office noises and takes a deep breath, hoping she hasn’t infected the server with something. "Jesus Christ," she thinks and says softly to herself. She looks at the DVD again, a little metal tongue sticking out of her drive. But she doesn't touch it. She decides she'd better wait for Cherie and goes to get some coffee.

Monday, November 12, 2007

Sunday afternoon reading: The Maias or another re-discovery

I started a new translation of The Maias by Jose Maria Eca de Queiros on Sunday, which I'd come across in a review in the NYT, and it seems fantastic so far, at least if you are a fan of the late-Victorian novel. All the traditional themes are there (a large country house, a portrait of a multi-generation family in decline from an aristocratic past to a set of dilettante-ish, university-educated aesthetes). But all from the pen of a Portuguese diplomat.

What I love about coming across books like The Maias is that I can still discover new books that have been around for a century, even though I've been a pretty serious semi-pro reader for much of my life.

Just when you think you can't possibly find another great work in a certain period, one turns up. Before The Maias, it was some of the great Central European writers, like Joseph Roth, whose The Radetzky March I'd somehow missed even though I've heard it's a staple in German schools. The good and hopeful news, if you still bother to read novels, is that our greater access to more international cultures will continually produce a stream of not just good or interesting but newly translated world-class works that haven't yet reached a global audience.

It's highly unlikely--though not impossible--that I might still come across something great written in English that I haven't encountered before. But it's extremely likely that there are entire traditions I haven't yet come across. On a lot of fronts, the emerging global economy is a mixed bag. But when it comes to expanding our access to great art, it seems all good, at least on a Sunday afternoon with The Maias.

Thursday, November 8, 2007

Social network distinctions: the ego and the object

Those of us working with brands eager to get into some kind of social networking action (if only because they are so excited to see that consumers will talk about their brands for free) know how challenging it can be to explain the differences among the various options, let alone make a strong strategic case for the impact or sustainability of one play over another.

Fred Stutzman over at Unit Structures has a whole bunch of smart and well-researched ideas on the subject worth checking out. But over the past week, in response to the recent announcements about Google's OpenSocial and FB's Social Ads, he's articulated some particularly useful and provocative claims about the viability of social networks in two posts: here and here.

He starts with a distinction between ego-centric networks, which place the individual at the core of the network experience (FB, Friendster, Orkut), and object-centered networks, which place a social object at the center (Digg: the news item; Flickr: the photo). He then goes on to suggest the source of value in each network model:
Object-centric social networks offer core value, which is multiplied by network value. A great photo-hosting service like Flickr stands alone without the network, making it less susceptible to migration. An ego-centric network, on the other hand, has limited core-value - it's value is largely in the network - making it highly susceptible to migration. We see this with Myspace: individuals lose little in terms of affordances when they migrate from Myspace to Facebook, making the main chore of migration network-reestablishment, a chore made ever-simpler as the migration cascade continues.

The key to maintaining value, says Stutzman, is maintaining situational relevance. But once ego-centric networks lose that relevance, their days are numbered. FB still has a lot of growth left in it, but eventually, Stutzman suggests, we'll be moving on to another bar down the street, no matter what they do to spruce the place up.
Try as they might, once ego-centric social networks lose situational relevance, its pretty much impossible for them to retain their status. Myspace users have exhausted the Myspace experience; they've done all they can do, they've found all the people they can find, so now its time to find a new context. We naturally migrate - we don't hang out in the same bar or restaurant forever, so why would we assume behavior would be any different online?

Here's where I get to my point: It's all about networks. The coolest tools, the best exclusive media - these are only "fingers in the dam" to keep users in non-situationally relevant spaces. Networks naturally migrate from place to place - slowly at first, followed by a cascade as switching costs decrease - and no tools or content or affordances can really stop that.
Agree or disagree, Stutzman's points offer a useful conceptual foundation for thinking through the right social network fit.

Tuesday, November 6, 2007

What you measure is what you get: learning from artists


Heard a charming and instructive story on NPR yesterday morning about Auguste Escoffier, the famed 19th-century chef who invented veal stock and identified a secret kind of deliciousness that no one knew existed. If you missed it, you can check out the summary here.

Up until and even after Escoffier's invention, most people agreed with Democritus' (and later Plato and Aristotle's) assertion that there were four basic tastes: sweet, salty, sour and bitter. But as Jonah Lehrer (author of Proust Was a Neuroscientist) writes, Escoffier's sauce seemed to defy this classification. Everyone agreed it was delicious and made everything it touched more delicious, but it wasn't sweet or salty or sour or bitter. So what was it? Maybe everyone was just imagining it?

It took the work of a Japanese chemist, among others, to identify and isolate the missing ingredient, which turned out to be what we now call glutamate, a chemical compound created when organic matter breaks down (think cured meats and sauteed vegetables) and for which our tongues have special receptors.

Jonah’s point (and the one he explores in his book) is that artists (and writers and chefs) who examine and describe human experience can sometimes discover things that scientists don’t identify until much later. It's a nice inspiring tale about the power of artistic imagination. And the need to balance verified and testable facts with evidence of our own experience.

But I hope it doesn't lead people to dismiss the power of analysis or the scientific method as fatally flawed. The biggest problem with certain scientific research methods isn't that they are inaccurate, or even that they are incomplete, but that they pretend to be complete, or that we forget (overwhelmed by their authority, statistical or otherwise) that they are incomplete. Just because we haven't found a way to isolate and measure something doesn't mean it doesn't exist.

What you measure is what you get is an old cautionary saying in research circles. And it's particularly important to keep in mind for those of us suspended between creative and analytical fields. Practically speaking, this means a few different things:
1) You should always keep in mind what your research is measuring and what it's not;
2) You shouldn't rely on one piece of research, say one big survey, but draw data from multiple sources and methods; and
3) You should be experimenting with your research as well, trying to measure new elements of consumer experience with new methods and questions.

Sunday, November 4, 2007

Sunday afternoon reading: Hume or a refreshing dose of skepticism

Reading Hume today for no particular reason other than that I love Hume. I miss Hume. He always clears my head of all the half-baked claims and blather I've been wading through for the past week. Here he is in a famous critique of dubious testimony about miraculous events, from his 1768 Essays and Treatises on Several Subjects:

“When anyone tells me, that he saw a dead man restored to life, I immediately consider with myself, whether it be more probable, that this person should either deceive or be deceived, or that the fact, which he relates, should really have happened. I weigh the one miracle against the other; and according to the superiority, which I discover, I pronounce my decision, and always reject the greater miracle.”
Ah, now doesn’t that feel better?

So, having fueled my skeptical fires, it seems like a good autumnal afternoon to question another set of claims that seem to be circulating around here a lot. I know I’m bound to come off as the recalcitrant crank with this point but after reading post after post from respected colleagues in the field (Adrian and Gareth and Mark), I’m wondering if we might be stretching the distance between all our talk about a changed way of doing things and the amount of work that actually gets done that way.

Different advocates use different language--though usually with the same sets of examples (Nike, Axe, etc)--but the point is generally that brands today are built by doing and behaving and servicing rather than just showing and telling and entertaining.

Now I think this is true, or should be true, but channeling Hume, I'm inclined to ask, how often is it empirically true? I certainly haven't gotten to execute most of these new-world branding ideas. I've suggested a lot of them, and even sold through some behaviorist strategies (about the experience or even usability of a service vs. a brand benefit) but most of the brands didn’t really have the managerial will to fully execute on them. Or at least not yet.

In no way do I want to suggest that all of us who are making claims about this bold new era of brand development are lying or lazy or totally full of shit. (Though one comment I recently read claiming that ":30 second ads never really worked" seems to be solidly within bs territory.)

In most cases, I think the points we've all been making do reflect an insightful response to a changed marketplace, but the opportunities to act on them--rather than talk or blog or consult about them--are still the exceptions to the rule, for the obvious reason that we don't write the checks that pay for them. And most of the people who do write those checks are not willing to risk marketing budgets on experiments.

I recently spoke to a partner at one of the current crop of small hotshop agencies trying to do things differently. They've got a differentiated strategic approach, rooted in all kinds of fancy research, backed by planners with very deep connections in the C-suite. Even still, when I asked him to tell me--honestly--how much of the work they do actually gets done his agency's special X-factor proprietary way, he said, "probably about a third." Another third they do "X-factor lite," and the remaining third of the business is totally conventional, old-school advertising.

Don't get me wrong. I agree with everyone I've cited above. I left a pleasant job in a big agency for the same reason: because it seemed obvious that communications, let alone advertising, wasn't going to solve the business problems it was being asked to solve.

In my own current shop, we've built a new structure (based on a network model) designed to liberate brand development from communications. We've got lots of smart people working across a variety of disciplines and start every project with a deep strategic engagement, examining operations and product development as well as consumer experience. But how often do we really get to execute what we think is right? In a way that we think is best for the client? Off the top of my head, I'm not sure we beat my friend's odds.

And I don’t think that’s a bad thing. On the contrary, I think it’s the sign of a mature business though it's a lot less glamorous than declaring we have a new way of doing everything.

Thursday, November 1, 2007

Marketing marketing or our unique proprietary exclusive something or other

Paul Soldera has a timely post on incomprehensible marketing language. He quotes a passage from a so-called Business Intelligence (BI!) company, pulled from the company's website (he generously removes the company's name), to represent an excessive use of marketing jargon, which I'll quote in full:

XXX is the first and only business intelligence (BI) platform to deliver a complete set of market-leading, end-to-end BI capabilities: best-in-class enterprise performance management (EPM), dashboards and visualization, reporting, query and analysis, and enterprise information management (EIM). XXX introduces significant innovations that deliver BI in new ways to a broad set of users.
These things happen, Paul suggests, when we try too hard to claim some unique or proprietary benefit for something that we'd be better off just trying to describe clearly.

It's a legit point, and these absurd, hyperbolic assertions make me cringe precisely because they resemble claims made by organizations I've worked for, and have probably made myself (though I've tried to block them out). But as Paul also acknowledges, it's not quite that simple, as any of us in the client-service business know only too well.

The truth is that it's very hard to differentiate one kind of client service from another, which is another way of saying it's very hard to place a value on "thinking" (a point I've raised repeatedly here under the heading "getting paid"). One way, of course, is to sell a process rather than a product; another is to rely on past clients/work/results to do your talking for you; a third is some clever articulation of how what you do is a different and better version of what everyone else does--usually with a lot of "proprietaries" thrown in. It's this third way that usually leads to passages like the one above, as we strive to differentiate our offerings.

I guess a fourth way would be not to differentiate in kind but in quality, by building value through execution, though that's equally hard to articulate and assert.

Tuesday, October 30, 2007

"Held hostage by culture" or the cost of a meritocracy

“Satan exalted sat, by merit rais'd
To that bad eminence…”
--Paradise Lost, Book II

There’s yet another article in yesterday's NYT about hyper-achieving kids in the suburbs of Boston. (I say another, because it follows a much-discussed article from the spring about Newton girls who apparently feel under enormous pressure to be brilliant and talented on multiple fronts as well as “effortlessly hot.” What’s with the focus on Boston? Don’t they have stressed-out beautiful teenagers in Scarsdale?)

In any case, this time the focus was on Needham High and its renegade principal, Paul Richards, who is trying to combat the stress levels among his over-achieving student body with an official “stress reduction committee” and required yoga classes, which, unfortunately, many of the students are too busy to attend. Other attempts to reduce stress have met with similarly mixed results.

When Richards asked teachers to schedule homework-free weekends and holidays, students told him they appreciated the time because it allowed them to catch up on other schoolwork.

Or take last year, when he stopped publishing the honor roll and was summarily mocked by Rush Limbaugh for coddling students, receiving hate mail from around the country.

Richards explained his efforts this way: “It’s very important to protect the part of the culture that leads to all the achievement,” he said. “It’s more about bringing the culture to a healthier place.”

Huh?

Richards's efforts are admirable, but, as his results thus far suggest, they are probably doomed to a very limited impact, if only because the part of the culture that “leads to all the achievement” is fundamental to the resulting stress.

The students aren’t overachieving themselves into misery because they have lost their minds or caught some disease, nor is it simply the pressure put on them by their parents (always the easy answer to any cultural phenomenon). Many parents are as worried about their over-scheduled kids as Principal Richards is.

The problem with these kids is that they have too thoroughly adopted the values of the dominant culture, which are in turn based on the economic reality of increased competition in a global marketplace.

So, when an English teacher tells them, “When you graduate from college, no one is going to care where you went…”, the students are rightly skeptical. They know, from their own contacts and experience, that going to an elite college does, in fact, expand their range of opportunities. This is not to say, of course, that they won’t have successful careers (let alone happy lives) if they go to a “state” school (a prospect that has some students so mortified they are lying about it).

But this isn’t what the kids are worried about. They don’t just want successful lives. Like all 17-year-olds, they want the big-time: fame, fortune and the love of beautiful people.

What freaks us out about these kids isn’t that they have screwed-up values but rather that they are doing what they sense they need to do in order to maintain their class privilege. In other words, they are what we’ve made them, and we don’t especially like it.

Richards seems to understand all this, as his “hostages of culture” description suggests, but his valiant efforts to carve out non-productive, non-goal-oriented, non-resume-building time in these children’s lives can’t possibly compete with the larger meritocratic machinery at work. Everywhere else they turn, they’ll keep hearing that the only thing keeping them from being rich and famous is more hard work.

Milton understood all this too. In his world, merit was still a term of moral evaluation rather than the marker of talent or socio-economic success it was to become. But Milton could already sense the problems inherent in a meritocratic society, in which ambitious young upstarts would continually strive to better themselves at the cost of social stability. But that's a longer story I've already spent too much of my life pondering.

So what could Richards do? I’m not sure, but this could be a good planning challenge. Perhaps a planning-for-good challenge. My first idea: convince them all to become Freegans. The one thing that really trumps a powerful ideology (in the classic sense of a cultural expression designed to reproduce the ruling class) is a full rejection of it. Most other options feel like failure or retreat, especially when you’re 17 and have your eye on the big American Idol prize.

Friday, October 26, 2007

Bad Company

Pursuing some insight into our mess of a health care system, I came across an interesting article in the June 2007 issue of HBR, “Companies and the Customers Who Hate Them,” by Gail McGovern and Youngme Moon. The article articulates the perhaps obvious point that many companies get into the habit of “profiting from consumer confusion.”

McGovern and Moon briefly describe the practices in three widely despised industries--retail banking, cell phone service and health clubs--to detail how certain companies in these categories have institutionalized a set of practices that “extract” value from consumers by creating confusing plans, hidden fees, and rigid contracts restricting consumer choice. While these practices can be highly profitable, they create the potential for mass defection when a competitor emerges offering a more consumer-friendly and transparent alternative, e.g., Virgin Mobile.

Most interesting of all, McGovern and Moon point out that these companies often start with less egregious motives. Confusing product portfolios are often initially created to serve a range of consumer segments, and penalties are instituted to incentivize the consumer to behave more responsibly--by not bouncing checks, for example. However, when these companies see how much profit is generated by these practices (cell phone overages, say), they often orient their business models around them, adjusting operations to enhance the margins rather than to help consumers make better decisions. It often isn’t long before consumers spot a better option.

Not revolutionary, but a useful cautionary note on how easy it is for brands to go bad when the $ is good.

Tuesday, October 23, 2007

Dumb trend spotting

An article in today's NYT entitled "Redefining Business Casual" reminded me of another frequent annoyance with my chosen profession: the anecdotal observation that passes itself off as a behavioral trend based on some kind of data. In recent weeks, I've had to deal with more than my share of these unsubstantiated claims, expressed with great passion and authority, from "empty-nesters are moving back into the city," to "married people have more sex than people who just live together," to the claim in this article:
“I see a return to more traditional business wear...People dress up more in times of financial uncertainty and intense competition. It helps their sense of stability.”

The source of this information: the highly objective president of the Marcraft Apparel Group, which champions suits and ties.

All the evidence I've looked at lately suggests the opposite: a gradual downward slope in men's formal business wear (suits and ties) that's been going on for over a decade. This isn't to say that some fashion-forward types aren't reclaiming the tie or jacket in certain situations, but we are a long way from returning to 5 suits a week for most men in most industries.

Nevertheless, journalists and market researchers of a sort are drawn to these claims because, while they seem superficially surprising (really!), they are also comforting. They are almost always deployed to confirm some deeper truth about human nature, as the remark from the president of the Marcraft Apparel Group illustrates. Men are wearing more suits. Really? Why? Because we dress up to give ourselves a sense of stability in uncertain times. But of course. That does ring true. Except for the fact that suits are really expensive and most of us don't require them for work. So no matter how comforting a nice three-piece might be, it would be a big waste of money.

So while these facts are interesting and make us feel capable of penetrating the secret motives driving human behavior, they usually mark the exceptions rather than the rules. It's not that they're untrue. They're just not trends, at least by the traditional definition:

A statistically significant change in performance of measured data which is unlikely to be due to a random variation in the process.
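
To make that definition concrete, here's a minimal sketch (mine, not the article's, and using entirely made-up numbers) of what "unlikely to be due to random variation" means in practice: fit a line to a decade of hypothetical figures for the share of men wearing suits to work and check whether the slope is statistically distinguishable from noise.

# Purely illustrative: hypothetical yearly figures for the share of men
# wearing suits to work (not real survey data).
from scipy.stats import linregress

years = list(range(1996, 2008))
suit_share = [38, 37, 35, 34, 33, 31, 30, 29, 27, 26, 25, 24]  # made-up percentages

result = linregress(years, suit_share)
print(f"slope: {result.slope:.2f} points per year, p-value: {result.pvalue:.5f}")
# A clearly negative slope with a very small p-value is a trend in the
# statistical sense; one apparel executive's anecdote about a "return to
# traditional business wear" is not.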

The truth is I'm not that interested in trends to begin with. As I see it, my job isn't to predict the future, but to describe the present, which is hard enough.

Sunday, October 21, 2007

Sunday afternoon reading: getting paid, again

I’ve posted earlier about the challenge of getting paid what we think our work is worth: along with the much-lamented conventions that allow us to charge for color copies but insist we give our ideas away, there is also a long-standing dependence on hourly fees in almost every service industry, which essentially incentivizes many of us (admen, consultants and lawyers alike) to be inefficient, throwing as many people and hours at the client as possible.

This last problem is particularly challenging for people in the so-called creative industries. We all know that great ideas do, on occasion, come easily or quickly, but according to the laws of accounting, such an idea would be worth less than something we labored on for tedious weeks, whatever its worth or potential impact in the marketplace.

I’m not sure I have a convincing solution to this challenge (I hope Adrian and Rob over at zeusjones have it figured out), but I came across another interesting option in an academic paper by a former student. While the analogy doesn’t quite solve the problem of how to value easy work, it’s somewhat comforting to learn the problem has been around for a good long time.

The situation was a famous dispute between the painter James Whistler and the critic John Ruskin. In a particularly nasty review of an 1877 exhibition containing Whistler’s work, Ruskin took particular exception to Whistler’s “Nocturne in Black and Gold: The Falling Rocket”:
"I have seen, and heard, much of Cockney impudence before now; but never expected to hear a coxcomb ask two hundred guineas for flinging a pot of paint in the public's face."
Whistler, in turn, decided to sue Ruskin for libel. The judge at the London trial asked Whistler how he could dare ask such a large sum for a work that took only two days to paint. Whistler answered that the fee was not for the two days but for “the knowledge of a lifetime.”

If only we could all get paid for this acquired knowledge. And in some ways we do. It’s why more experienced or senior staffers bill out at a higher rate than more junior ones. But it still leaves us in the trap of billing for time.

Tuesday, October 16, 2007

Brand cults, brand culture, brand ideology, pt. II

What is a Freegan? As a Freegan website explains:

Freeganism is a total boycott of an economic system where the profit motive has eclipsed ethical considerations and where massively complex systems of productions ensure that all the products we buy will have detrimental impacts most of which we may never even consider. Thus, instead of avoiding the purchase of products from one bad company only to support another, we avoid buying anything to the greatest degree we are able.


And though this choice is not without significant costs in creature comforts (garbage picking, etc.), it’s also a pretty good example of ideological resistance, or the capacity for it, in our consumer economy--vs., say, having to stand in front of a tank in order to vote.

Ideology has a lot of definitions. In its most basic sense it means a “body of ideas” or a “worldview,” but even when people use it casually, they generally mean more than just any idea. They usually mean an idea that expresses some form of social or economic power. In this way, it still carries the Marxist implication of a “dominant ideology,” i.e., a set of ideas designed to make the interests of the ruling class appear as the interests of everyone. Or, to paraphrase Althusser, “a fantastic relation to the real material conditions of existence.” It’s a message or image or set of ideas that makes us believe something about our role in the world contrary to the material facts of our lives. (Wikipedia isn't bad on the subject. Here is a summary of some of the key terms from a popular culture class at Georgetown.)

By any of these definitions, all advertising is ideological: the whole point of advertising is to reproduce the desire that drives the consumer economy. It would almost be impossible for advertising to not be ideological. So, when academics or marketers talk about the ideological impact of advertising, I have to agree, because what else could it possibly be? If it failed to be ideological in some minimal way, it would have to be some really poor advertising.

Sunday, October 14, 2007

Brand cults, brand culture and brand ideology, so-called

Reading about Stalin reminded me of another trend in thinking about marketing: the bold comparison between marketing methods and other more coercive forms of social control.

The question of just how powerful advertising really is seems to depend on your POV. Pros in finance and sales often speak about advertising as a necessary evil at best, and a complete waste of money at worst. Academics and social critics often portray advertising as an incredibly powerful source of nefarious social control, making us fat, covetous and generally unhappy. I’ve heard that Sut Jhally, a professor of communications at the University of Massachusetts, has various speeches and short films with titles like “How Television Exploits Its Audiences” and “Advertising and the End of the World.” It's a bracing set of arguments (which I'll take on in a later post) for any marketer who thinks they are engaged in a positive act of consumer empowerment.

Lately, pro advertisers have started to think that maybe these scathing critics have a point, but being good marketers, they’ve used the critique as a source of raw material, drawing analogies to coercive modes of behavioral modification and social control. Marketing books over the past couple of years have analyzed the role of iconic brands in cultural change (Doug Holt’s How Brands Become Icons), have turned to cults as a source of potential marketing techniques (Douglas Atkin's The Culting of Brands), and have deployed the term ideology to articulate the ultimate power of marketing (beyond reason, beyond emotion, beyond culture, there lies something even more…) to sweep consumers up in a psycho-social wave of behavioral transformation.

I’m all for stealing tricks from parallel disciplines of persuasion and strategy (readers of the screen know that game design, political strategy and behavioral economics have all been fruitful sources of new ideas for me), but it strikes me that many of these branding manifestos overreach with their analogies on at least two fronts.

1) They fall prey to a characteristic vice of brand books, which is deriving principles from rare, if not unique, examples. Coke, Harley-Davidson and the iPod are powerful brands indeed, but they are hardly typical, and they provide few lessons for most Internet start-ups. Of the books cited above, Doug Holt’s is, by far, the most sophisticated. While he deploys a classic post-Marxist mode of cultural analysis (Althusser seems an important theoretical foundation) to understand how certain brands reconcile the cultural contradictions of late capitalism (our fantasies clashing with the reality of our socio-economic conditions), he is the first to admit that iconic brands are a special case, and what he calls “cultural branding” is not for every brand. There is no question that advertising in conjunction with the right cultural forces can produce quite a powerful force of persuasion, but it doesn’t follow that most or even many brands (because of the category, the budget, the competition) can interact with larger cultural forces to change behavior. In fact, some of the world's most powerful marketers these days are moving in the opposite direction, spending money on consumer services rather than on the mass communication of big ideas--e.g., Nike+ in today's NYT.


2) Many of these arguments, however, seem to forget that there remains an enormous gap between ideological power (as exerted by cults or restrictive political regimes) and the persuasive power of advertising. Anyone who has bothered to read even a page about what it’s like to actually live under a truly restrictive regime (the former USSR, China during the Cultural Revolution, Nazi Germany), under the constant threat of betrayal, imprisonment and death, recognizes that there is a big difference between spending 20 years in solitary confinement because you once made a critical comment about Stalin (or not even that; see, for example, Eugenia Ginzburg's Journey Into the Whirlwind if you have a strong stomach and want an inspiring taste of the real thing) and feeling uncomfortable with your body image because you are constantly subjected to unrealistic images of beauty by the fashion industry. The constituent differences between marketing and totalitarian regimes are so numerous that it’s almost embarrassing to list them, but the most obvious one is that ideological regimes (whether political or social) tend to work by systematically crushing all competing viewpoints. Advertising often thrives on and plays off just such competing views of how to live your life. Now, it's certainly true that it can feel difficult to refuse to participate in our consumer society (with all its joys and broken promises), but we all know people who have chosen to go off the grid--to various degrees--and don't end up in a prison camp for it.

There is more to say about ideology—including the trickiness of defining it in the first place--but this is getting long so I’ll save it for tomorrow.