Tuesday, July 31, 2007

Unexpected start-up pleasure #1: Acting Normal

We all know that style matters in business. And though it's hard to describe or define, there are unwritten boundaries about what falls within the realms of acceptable style. What makes this interesting and relevant here is that style changes over time, changes which become particularly striking when you look across generational barriers.

Just as parents have battled with their children over haircuts and clothing styles forever, so do bosses and their usually younger staff get into conflicts over what counts as “professional” style or clothing or behavior. (The word “professional” has become such a bizarrely broad catch-all for policing just about any kind of behavior that it’s worthy of a post of its own.)

In one of my previous jobs, flip-flops became an odd sticking point. For some, they signified a careless disrespect of the workplace. For other (usually, but not always, younger) colleagues, they were a mark of emulation of well-known creative thinkers. The issue even reached the national scene when college athletes wore flip-flops to the White House.

But in many ways, the changes in clothing and appearance are dramatic and obvious. The more subtle and harder to document but much more interesting changes occur around the etiquette of professional relationships.

In my experience, the business “style” of the former generation was and is rooted in sincerity. Important conversations, business-forging conversations are generally held in a tone of sentimental seriousness (care, respect, trust) with a touch of tough-love (bottom-line, end-of-the-day). Don't get me wrong, these are all good things, but they can sound a little tinny to the X’ers of my generation who have a hard time saying how much they “care” about anything. We care too, we just prefer to express it differently, usually with a little more humor.

This is tricky territory to speak about on the fly. It requires a broader sample to see if it holds up. In fact, I hope someone somewhere is doing a full-blown sociology of business etiquette to document how these behaviors and attitudes really shift over time.

Which brings me to my original point: one of the surprisingly fun parts about working for a small company, run by your peers rather than the parents of your peers, is that your style of respect and authority and “professionalism” is, more or less, the style of your company. Which means fewer debates about flip-flops and the proper and professional way to behave.

It also means you can, more or less, use your own instincts and judgment rather than impersonating a style that belongs to someone else. It means you can build relationships on your own terms, in your own voice: one that makes so much sense to you that it's simply yours. You can just act normal rather than "professional."

It's maybe obvious but worth noting again: there are real and surprising pay-offs to surrounding yourself with people who share your professional style.

Thursday, July 26, 2007

The latest end of (traditional) authority

Just walked out of a couple dozen hours of focus groups with college students in the deep middle of middle America, and I am here to remind us all once again that this coming generation does not care--I mean, really, does not give one little crap who provides their information, so long as it's relatively useful.

After a dozen agonizing questions in which I tried to generate some interest around now obviously inconsequential attributes like "credibility" and "trust," questions which were met with silent shrugging and clandestine text-message checking, I had to conclude that I was, well, old, and that my own interest in "valid" or "authoritative" sources of information was about as old-fashioned and quaint as my habit of unfolding a paper NYT every morning.

Like everyone else, I'd already noticed how the Internet had displaced certain kinds of authority figures (accountants, travel agents, real estate agents, car salesmen), but most of these people were basically in our way in the first place--jealously guarding info, telling us we couldn't do it ourselves, keeping us from buying stuff when and how we wanted.

But I realize now that the impact is far more than practical. It's really and truly epistemological. The explosion of sources of info has, it seems, pretty much erased the importance of the source of that info as a key element of its value. At least in the deep middle of middle America.

They just don't care. They assume it's accurate. And if it's not, they'll find out soon enough. Or not. It doesn't really matter. They can always check at another site.

Except for sports. In which case, ESPN still matters. Some things are still sacred.

Wednesday, July 25, 2007

YouTube on TV

Not sure if it transformed the debate, but you could see the early signs of a significant change in the format, maybe less for the candidates, who slipped back into their characteristic postures (thanking The Citadel, reaching across the aisle, asserting their patriotism, etc.), than for the rest of us. To my eye, YouTube’s biggest impact was on the voters' role in the exchange and, of course, on the questions themselves. The voters asked for specifics (which Republican would you take as a running mate? would you work for minimum wage?), they mocked the traditional evasive rhetoric, they put the candidates on the defensive. My favorite moment: when Joe Biden felt compelled to declare his personal worth, which spurred comical posturing from many of the candidates, each describing how little money they had.

Of course, the format raised questions about which of the candidates could speak to this newly empowered voter, but it didn't provide any answers yet. None of them, really, rose to the challenge, though Obama probably came closest, with his confident ability to shift between more official levels of discourse and an easy vernacular style. He shut down the silly debate on the poorest candidate by stating the obvious fact that they all had plenty of money compared to a worker on minimum wage. But you could see them having to abandon their traditional evasions in order to answer the questions directly.

I don’t want to overstate the case. The traditional setting and rules still dominated the experience: the careful parsing of words, the sentimental anecdotes, etc., but when each new video came on screen, representing gay couples, parents of veterans, underpaid workers, you could feel something new, even through the double screens. The citizen-reporters had a confidence, a personality, even, at times, a sharpness that they’d never have on the floor of the arena, under the scrutiny of the lights.

All of us could suddenly see ourselves asking direct, clear, hard questions of the candidates. Suddenly, or so it seemed, we became a nation of voters aspiring to get their questions into a national debate. Talk about empowerment. And what a worthy ambition! A new kind of politically engaged American Idol. Many of us would never be comfortable on television, but many of us would be thrilled to be on YouTube on television, after we had rehearsed and reshot our question until we got it just right.

My colleague Jim Dowd, who insists he’ll add his comments to this post, has further suggested that the whole YouTube structure started to reposition the journalists. Cooper wasn't asking questions as a comfortable insider, so he had to add a new kind of value, probing for specifics, asking for follow-ups. Just as the Internet has marginalized (or at least redefined the role of) travel agents and accountants by giving consumer-citizens access to information and choice, now it’s forcing reporters to rethink their role.

I know it's getting mixed reviews but I like it so far.

Monday, July 23, 2007

Good Viral?

One of the frustrating things about being an analyst in the hype-fueled (hype is our business, after all) world of advertising and marketing is that it's very hard to sort out what's working from what's not. I remember a day not too long ago when I found two articles on a previous Quiznos campaign with completely opposite POVs, one celebrating it as a great success, the other damning it as a complete failure. Like most disagreements, this one existed at the level of assumptions.

You could say--and we usually do--that the market is the ultimate judge, but we all know it isn't that simple. Even when campaigns don't "build business" right away, we often make other claims for them: e.g., they raised the brand profile, or attracted a younger audience, or will lead to long-term impact. And almost anyone who knows their way around a spreadsheet can find some sketchy data to support their claims.

This is especially true in emerging media, where the data is even thinner and less available. Most of us have been making claims about the power of viral videos to generate buzz and engagement and drive business, but most of the analysis out there is just as impressionistic, based on taste. Here's know | future weighing in on one set of choices.

Social media experts like Joseph Carribis go further, listing a set of qualities possessed by good viral. A posting on his blog lists Entertainment, Utility, Reward and Uniqueness. That sounds good too, but it also sounds a lot like the qualities of any good form of marketing. Equally useful and universal principles are posted by designer Ben Terett on his Noisy Decent Graphics blog. He lists Funny, Rude, Useful and Simple. Again, I like it. But for those of us who started working in the dark ages before Web 2.0, getting the consumer/viewer/user's attention was pretty important too.

My favorite post--in terms of matching the medium to the message--has to be this one on Twitter, here.

I certainly don't have the answer either, though I'm starting to think that the previous posts are exactly right: the best of viral is largely (though not entirely) like the best of everything else.

One of my favorites recently is the now semi-famous Will It Blend? site for industrial-blender maker Blendtec. There are any number of great creative and interactive touches to this execution, from the user-selected choice of an iPhone to the ability to bid online for the destroyed device to the fact that it doesn't overplay its hand with broad comedy. Most of all, though, I loved how well it communicated the product benefit in such an unforgettable way. After watching this video, you have no doubt that this mother could chop through just about anything. And who doesn't want one of those?

Saturday, July 21, 2007

CEOs read literature!

Breaking story in today's New York Times, as reported by Harriet Rubin. Steve Jobs collected Blake! Michael Milken is a fan of Galileo. Phil Knight has a library of rare works of Asian poetry and art history. Darwin is apparently hot among business leaders, as are books on climate change! In comparison, Shelley Lazarus's choices (Meg Wolitzer) sound positively middle-brow. But let's not split hairs. After all, any sign that America's CEOs are reading about something other than lost cheese is good news to me. It's true that Rubin includes a relatively short list of examples, raising the possibility that these CEOs might be the exception rather than the rule. (They certainly are in my limited experience.)

But more interesting still, Rubin reports that these CEOs tend to keep their passion for serious literature a secret. She doesn't really explain why, though she suggests that the CEOs don't want to reveal the value of their libraries or raise the price of desired tomes. But this explanation doesn't really ring true. Can't these guys and gals buy almost anything they want? And on the scale of their possessions, these books are relatively inexpensive.

Perhaps they are embarrassed? Or just don't want to reveal the source of their strategic ideas? If only we could get a peek at their Amazon wish lists!

It makes me wonder if all my bosses were hiding copies of Ruskin and Tolstoy inside the dust-jackets of The One Minute Manager and The Seven Habits of Highly Effective People. It's hard to believe, but I hope so.

Friday, July 20, 2007

Another weird thing about business culture that I still haven’t completely adjusted to

is how business people always seem to be defending--nervously or aggressively--the boundaries of their responsibilities. People who are particularly aggressive about this behavior are usually called “territorial,” in the full imperialistic sense of the term.

At first I thought this was just a pathology of the place I was working, but when I started perusing the business literature it started to look pretty pervasive, almost the dominant problem of business culture. Take a look at almost any business mag and you’ll see all kinds of advice on how to help employees understand the value of shared responsibility.

It’s weird, to me, because in my old life, the goal was to give away as much work as possible so you could do what you really wanted: which was to hang out and drink espressos in cafĂ©s with cute waitresses or research obscure topics in musty libraries. Avoid committee work like the plague, get student help grading papers, go on sabbatical and hope it’s all taken care of when you get back.

Of course, turf battles aren't limited to business. They seem pretty endemic to any organization. If you google “turf battles," you’ll see that “turf battles” are identified as the source of any number of failures to get something done in law enforcement, the scientific community, government.

It’s of special concern to me now since I just signed up with a company which has built an entire business model around successful collaboration across strategic and creative services. Like anything worth doing, it's not easy: generating enough conference-call posturing to fill the script of a David Mamet play.

“Is that really a best practice?”
“Depends on what you consider a best practice.”
“Are you suggesting I’m not familiar with best practices?”

But the truth is, it’s an issue for anyone involved in brand or account planning or any kind of strategic function since, by definition, you are in a collaborative role. You are put in the position to give direction to people who generally aren't super-eager to take it, for instance—just for instance—creative directors.

I looked over a couple sets of rules on-line but they weren't very useful or convincing. (“Show how sharing information can lead to job security.” Not sure about that one.) I only have one rule, at least for managers: don’t take credit for anything. Really, anything. Chances are you only half-deserve it. And better still: you tend to get credit anyway. Double the credit. Credit for getting it done and credit for having the good judgment to hire the person who got it done. And your staff tends to appreciate it, which pays all kinds of dividends in loyalty and good work. It’s kind of an artificially simple principle—call it strategic humility—but it's remarkably rare.

Thursday, July 19, 2007

Unsurprising Start-Up Fact #1: Back to the basics

But six, seven rounds of revisions on a screener, really? Really? Is that what it was like? I have renewed compassion for my former staff.

Wednesday, July 18, 2007

Predictions: Extra-artificial simplicity

Whenever forecasters and futurologists claim a certain degree of accuracy (like Faith Popcorn's well-publicized remark about her 95% hit-rate), I always wonder what they got wrong (clam-diggers for men?). But when I ask about their failures and how they adjusted their model in response (and I've asked most of the copy-testing services I've worked with), I always get a much less clear answer, or no answer at all. For me, making predictions about markets or marketplace success is to account planning as real-estate get-rich-quick schemes are to financial planning. They make such a huge over-promise that the real, hard work of data analysis and communication development looks by comparison like either equally shoddy thinking or a weak deliverable. It can be hard to compete with prophets.

And of course Web 2.0 has brought with it a new wave of entrepreneurial activity and a new wave of prognosticators making bold predictions about the year and years ahead.

Of course, the world has long been plagued by prophets who lacked the data to support their claims. The literary critic Frank Kermode--in his foundational The Sense of an Ending--went so far as to claim that being wrong was a constituent feature of most prophets. The failure of apocalyptic predictions to come true never harms a prophet's status. On the contrary, failed predictions tend to re-energize the faithful, fueling another round of fuzzy math that puts the end of the world slightly later. Something new to get excited about!

In the early 18th century, Jonathan Swift got so annoyed with the cobbler-astrologer John Partridge's yearly almanacs (think early Popcorn Report) that Swift published his own prediction that Partridge would die of a "raging fever," and in a later pamphlet announced that his prediction had come to pass, writing a satiric eulogy for the quack. The hoax became hugely popular and plagued Partridge for the rest of his life. When Partridge protested the claim, stating that he was not, in fact, dead, Swift rebutted him in print: "There was sure no man alive ever writ such damned stuff as this."

I can't think of a greater authority than Swift on almost any subject. But if we want another take on why predictions are so hard, and so often miss the mark, here's Proust in the final book of In Search of Lost Time on the long-anticipated but somehow still shocking departure of the narrator’s beloved Albertine:

In order to picture to itself an unknown situation the imagination borrows elements that are already familiar and, for that reason, cannot picture it. But the sensibility, even in its most physical form, receives, like the wake of a thunderbolt, the original and for long indelible imprint of this novel event. And I hardly dared say to myself that if I had foreseen this departure, I would perhaps have been incapable of picturing it to myself in all its horror….


Even when we foresee something, our imaginations are so limited by past experiences that we still can't believe it when it happens, or we imagined such a totally different experience that our emotional preparations are utterly useless.

As I usually say in meetings: I'm not in the business of predicting the future; describing the present is hard enough.

Tuesday, July 17, 2007

Start-up Experience #1: It's kind of emotionally exhausting

I have an ingrained resistance to writing about my personal experiences (not b/c I'm so protective of my privacy but b/c who really cares?), but since both Gareth Kay from Modernista and Jeff Flemings from Digitas have already posted bits about my new gig, I thought I'd add something from my perspective. (Plus I promised that I'd experiment with different ways of using the form.) What follows is drawn from journal notes I've been keeping since I started working at Mechanica, contrasting it with my past jobs, mostly in big agencies or agencies owned by big holding companies. You know the type.


Small business inspires a kind of protectiveness, an impulse to nurture. My tolerance for criticism is low; the opposite of my reflexive skepticism toward the large and powerful institutions to which I have belonged for most of my life. In the past, I started throwing stones with my first handshake, rattling cages and smiling grimly during my HR orientations, complete with instructional films about the fine lines of sexual harassment. Now, I find myself stepping gingerly, eyeing the horizon for possibility and risk. The fragility of the structure sensitizes me to every exchange. Corporate life takes, or trades, time and hope and a piece of identity for money and some degree of stability (though increasingly little), but in truth it only asks for a kind of half-belief. The lies it demands are mainly lies of omission. Or even just pretense. You have to pretend to believe in an institutional identity (a culture, a spirit, whatever) that really isn't very different from that of many other institutions exactly like it, and in which we are basically functionaries. But you can also pretty easily ignore the machinery of corporate culture altogether and do your job reasonably well. Now, there is a pressure to create something, to live up to some abstract idea which only half-exists or has only half-realized its ideal state. This job asks for faith, which I tend to lack.

Monday, July 16, 2007

Penguins can help too

The cheese has been moved and now the glaciers are melting. The NYT today reports on the rewriting of a classic business book on organizational change as a fable, Our Iceberg Is Melting, starring a penguin named Fred who must organize his colony against the titular threat. Angel Jennings reports:

With bright colorful illustrations and large text, “Our Iceberg Is Melting: Changing and Succeeding Under Any Conditions” looks at first glance more like a children’s book than something a chief executive might read. But the book is attracting readers and creating a penguin movement in boardrooms around the world, Mr. Kotter said.


The good news? Kotter is pictured taking notes on the back of his pug Cleo at Squam Lake. Now that sounds more like it.

Full disclosure: I'm not very objective on the subject. My very first experience in business, my very first week on my first job, was a three-day (at like 16 hours/day) team-building meeting called a "Breakthrough" session in a windowless ballroom in a Cincinnati hotel. It was run by a group of black-t-shirted inspirational speakers, who encouraged us to "let the energy flow" and get on the "positive train." I don't recall much else, except the Vangelis music, the passing of the "talking stick" and the black sweatpants I was given for my participation, which my wife and I still refer to as my breakthrough pants. Which is all an elaborate way to say that I'm still a little gun-shy about organizational change books or events of any kind. When it comes to dealing with change, I still prefer to sit back and take notes off the back of my pug.

But if anyone else has off-site horror stories, please share. It feels like it's an under-documented experience.

Sunday, July 15, 2007

Originality on a Sunday Afternoon

Creatives in the ad business put a high premium on originality and for lots of good reasons: more original ads are good for the creatives' careers and at their best, they tend to help get consumers' attention.

One of the problems with overvaluing originality, however, is that the majority of people aren't terribly interested in original experiences. Despite what we say, most of us want the same thing over and over, with slight variations.

The evidence on the subject is pretty overwhelming, but I was reminded of the fact today when I read the NYT story on John Travolta's role in the movie adaptation of Hairspray. The article was a reflection on the many rises and falls of Travolta's career, with special attention to the controversy surrounding his membership in the Church of Scientology. But what struck me was the more basic fact that they were making a movie from a musical that had been made from a movie. What's next? Another musical, an ice show, an animated series, followed by another movie and musical.

Other evidence. My kids eating ice-cream earlier this afternoon. They never seem to get bored with it. And they usually order the same flavor, week after week.

There's a great experiment in the annals of behavioral economics documenting this same phenomenon. It's noted by Richard Thaler in his foundational essay "Mental Accounting Matters." You can find it here. But this is the relevant bit:

Read and Loewenstein (1995)... demonstrated the role of choice bracketing in an ingenious experiment conducted on Halloween night. The 'subjects' in the experiment were young trick-or-treaters who approached two adjacent houses. In one condition the children were offered a choice between two candies (Three Musketeers and Milky Way) at each house. In the other condition they were told at the first house they reached to 'choose whichever two candy bars you like'. Large piles of both candies were displayed to assure that the children would not think it rude to take two of the same. The results showed a strong diversification bias in the simultaneous choice condition: every child selected one of each candy. In contrast, only 48% of the children in the sequential choice condition picked different candies. This result is striking, since in either case the candies are dumped into a bag and consumed later. It is the portfolio in the bag that matters, not the portfolio selected at each house.
When we are offered options we tend to take options, but day after day, well, we tend to pick the usual, our old favorites, though we might add some jimmies for variety.
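
Since I can't resist playing with the numbers, here's a toy Python sketch of the two conditions. The probabilities are my own invention, reverse-engineered to reproduce the reported figures (every simultaneous chooser diversifies; a sequential chooser repeats their first pick about half the time), so treat it as an illustration of choice bracketing, not a model of the actual experiment.

import random

CANDIES = ["Three Musketeers", "Milky Way"]

def simultaneous_choice():
    # Two picks at one house: the reported bias was total --
    # every child took one of each.
    return set(CANDIES)

def sequential_choice(p_repeat=0.52):
    # One pick at each of two houses. Assume the child repeats the
    # first pick with probability p_repeat (0.52 is reverse-engineered
    # so roughly 48% end up with two different candies).
    first = random.choice(CANDIES)
    if random.random() < p_repeat:
        second = first
    else:
        second = [c for c in CANDIES if c != first][0]
    return {first, second}

def diversification_rate(choose, trials=100_000):
    # Share of simulated children whose bag holds two different candies.
    return sum(len(choose()) == 2 for _ in range(trials)) / trials

print(f"simultaneous: {diversification_rate(simultaneous_choice):.0%}")
print(f"sequential:   {diversification_rate(sequential_choice):.0%}")

What the sketch makes concrete: nothing about the candy changes between the two conditions, only the bracketing of the choice, and that alone moves the diversification rate from 100% to about 48%.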

Saturday, July 14, 2007

Business Books: Artificial Simplicity Perfected

“The World’s #1 book on change!”
-- Who Moved My Cheese? Website

I usually get an hour or two to read something interesting on Saturday and so it’s usually about now (Saturday evening) that I’m struck again by how stupid, redundant and generally empty of content most business books are.

When I try to read them—and I do try, drawn on by a mixture of hope and anxiety that I might be missing something—I get a little dizzy. There is so little content, so many bullet points with so much space between them, that it feels like I’m free-falling through the pages, reaching out desperately for any evidence (a provocative thought, a well-crafted sentence, a fresh perspective) of a sharp mind engaged with a worthy subject. And make no mistake, I think business is more than a worthy subject. It’s one of the reasons I find business books so depressing. They are so inadequate to the task of enlightening or instructing.

I had a relatively tense conversation with my old boss on the subject; I disparaged some book (perhaps the one cited above, perhaps a heroic bio of a business leader) as a half-baked collection of familiar slogans that could have been, and probably was, written by a software program that simply reformatted prose from past business books. He made the claim that this book was important because it “captured the spirit of the time,” which meant that other powerful men he admired were reading it.

But he was right, of course. The importance of a book does partially reside in its ability to attract and inspire a contemporary audience. Being the world’s #1 book on change is no joke. People must be having a positive experience of some kind. My best guess is that these books are inspirational, more like going to a rousing speech than immersing oneself in a strong imagination or critical mind. But I’m not sure. Curious what others think.

Even writers of business books sometimes seem a little embarrassed by the company they keep. John Butman, the author or co-author of several well-respected books on consumer behavior, including Trading Up and Treasure Hunt, has recently written a satire of business books called The Book That’s Sweeping America, which sounds like it’s on target.

But there are exceptions: Doug Holt’s book on cultural branding, How Brands Become Icons, is a serious attempt to explain how certain brands have managed to sustain their power over time. I like a lot about it, not least that it recognizes that a brand’s power is dependent on its cultural context, a fact that seems to go unnoticed in most branding books, which focus on Essences and Brand DNA. But even more pleasing, to a methodological snob like me, is that it actually has a theoretical method, detailed in its appendix, which he tests by comparing it to other methods. When he explains why, he sounds a lot like Martin describing how successful leaders think, in the post I started with:

“Academic theory building is based on systematic skepticism. A researcher challenges conclusions with data until the theory proves that it can handle all comers. Rather than selling a favored theory, he or she seeks out strong challengers and subjects these theories to an empirical test with sufficiently detailed data. The best theory wins.”

Who can argue with that?

Friday, July 13, 2007

The Seduction of Intuition

Thinking is a feeling.
--Joshua Clover


The question of intuition came up incidentally in response to a post and seems worthy of additional examination, if only because the power of intuition seems to undermine the need to think at all. It's the ultimate form of simplicity--the totally natural kind: intuition as an almost transcendental ability to apprehend truth without any effort at all. In fact, from the perspective of the strongly "intuitive" person, thinking is part of the problem, getting in the way of our natural ability to "know." While the standing of intuition in business circles has historically been pretty shaky, it's definitely been coming on strong lately.

Malcolm Gladwell's Blink has provided some evidence for the power of intuition (though he actively resists the word himself), describing several now well-publicized examples of experts who can evaluate seemingly complex situations and relationships (marriages, a job candidate) in a few seconds.

And the question of intuition or instinct also came up at a Luxury Marketing Summit I attended recently in West Palm Beach. Many of the speakers (I mentioned Ian Schrager in a post below) spoke about intuition ("sensibility" was his preferred word) in defense of taste as the only real arbiter of a luxury experience. Schrager was particularly eloquent on the subject, disparaging attempts to standardize his efforts as a form of "brand creep." And who can doubt him: the guy's record (from Studio 54 to the Delano) is pretty amazing. It was clear, however, that many of the speakers were on the defensive, actively resisting the intrusion of more scientific marketing techniques into their previously rarefied realm. Many hissed out the word "segmentation" as some unseemly practice best not mentioned at all.

I'd be the first person to defend artistic expression as the clearest form of intuition in action. And something like a pure expression of an artistic sensibility is probably necessary to create the highest level of luxury goods. It's the designer's unique sensibility that gives those goods such high value. They shouldn't be standardized (and for that reason the design houses are often pretty small businesses, considering how high-profile they are in the culture).

And like most other planners I respect, I think a great creative idea can be tested to death (for the obvious reasons: consumers don't really know what they want, and the methods of evaluation suck). And I too get tired of submitting my intuitions to consumer testing.

But if I'm honest, and look back over my career, I'd say my instincts are wrong ALL THE TIME. My instincts were wrong about photographers and wrong about video gamers. I was wrong about grandparents and eight-year-olds. I was wrong about what people think about insurance and dishwashers and George W. Bush. And I love being wrong because it gives me something to do.

Maybe I'm just really bad at this job, but it seems to me that anyone who is remotely involved in researching or analyzing behavior of any kind should be glad that intuition isn't adequate most of the time. Otherwise, we'd better pack it up.

I'm not suggesting that intuition doesn't play a role in our decisions. It's particularly important in evaluating creative, which needs to make you feel something if it's any good. But the broad use of Gladwell to justify instinctual reactions strikes me as meretricious in the extreme. (Gladwell himself insists pretty strongly that these quick judgments are in fact a form of thinking.)

As Barry Schwartz (of The Paradox of Choice fame) expressed it to me over cocktails at the same conference (I'm paraphrasing here): "The reason those experts are so good at predicting an outcome is because they have so much experience to base it on. They are quickly synthesizing vast amounts of data. Intuition isn't any good when you are out of your element."

Something to keep in mind when someone who knows nothing about a field tells you "it just doesn't feel right."

Thursday, July 12, 2007

Simplicity vs. Stupidity

“Every model has its limitations and is not a complete representation of reality.”
--Warren Robinett


Which isn't to suggest that simplicity is bad. Only stupidity is bad, which I might define as thoughtless simplification. Artificial simplicity is necessary for any act of interpretation, or any kind of representation. The world may be everything that is the case, but it is infinitely complex. People who describe and interpret things need to choose the relevant details (what Martin, below, would call salience) and represent them in a way that clarifies our choices. There are a million examples of this fact (Tufte's books are full of them), but my favorite recent one is in an incredible game-design textbook called Rules of Play by Katie Salen and Eric Zimmerman. It’s an innovative and inspirational textbook for a number of reasons, but I’ll save that for later. For now, check out their implicit critique of overly complex simulations, which is itself prescient of the Wii’s victory in the next-gen console battle:

“Why is it that games can’t simulate everything with a high degree of detail? Why can’t a game simulation be both wide and deep?… Limited development resources require that game designers decide where those resources will be spent. But the limitations of time and budget are not the only things affecting the scope of simulations. Meaningful play results from the ability of players to make meaningful choices from a limited set of knowable options.”

How fabulous is that? That last sentence could also serve as a nice metaphor for what planners try to do for creatives: creating a set of formal guidelines that establish the conditions of meaningful play.

Zimmerman and Salen take the point further as they outline an approach for designing wargame maps. Citing another game designer, James Dunnigan, they describe how careful formal choices (thoughtful acts of simplification) create the possibilities of meaningful action. Including details that don’t impact game play is merely distracting.

“As Dunnigan puts it, too much detail in the terrain can get in the way of a player’s understanding; only “gross” terrain features have a real impact on military operations. Abstraction emphasizes the features critical to understanding the terrain, while minimizing the “noise” created by less important elements…. A visible feature that does not contribute to the functioning rules of play is bad design.”


Including details that don’t matter makes for bad briefs and bad marketing. Simplification is a great end, but you can’t start there. You are just as likely to pick a meaningless detail as a meaningful one. You first have to analyze the data to identify what matters—the stuff that impacts the relevant outcome. People who demand or, worse, assert simplicity are usually just being stupid or trying to make you feel that way.

Wednesday, July 11, 2007

Thoughts about thinking about business

“When a colleague admonishes us to ‘quit complicating the issue,' it’s not just an impatient reminder to get on with the damn job—it’s also a plea to keep the complexity at a comfortable level.”
--Roger Martin


One of the first things I found odd about working in business, or at least the odd part of the business I was in (advertising), was how hard it was to define the thinking we were supposed to do.

Everyone was pretty clear that, as a planner, I had the luxury to “think” about my clients’ business, but they were less clear about how I should go about it. Looking around for examples, it seemed that thinking could mean a whole bunch of things, but none of them bore much resemblance to what I thought thinking was in my old life as a perpetual graduate student studying literature and economic history.

So I was pleased to see an intriguingly titled article, “How Successful Leaders Think,” in the June issue of the Harvard Business Review, by Roger Martin, Dean of the Rotman School of Management at the University of Toronto.

http://harvardbusinessonline.hbsp.harvard.edu/hbsp/hbr/articles/article.jsp?ml_action=get-article&articleID=R0706C&ml_page=1&ml_subscriber=true

I liked a lot about it, not least the refreshing critique of the relentless push for forced simplicity quoted above, which nicely supports my general mission here.

Also refreshing was the diplomatic acknowledgment that most business leaders, even successful ones, are not very self-aware about their own decision-making processes; Martin uses the heralded bio of Jack Welch as a paradigmatic example. However much we might think (or remember) that we simply used our guts or searing, prophetic insight to make a decision, the truth is that most of us do try to evaluate the costs and benefits of high-stakes decisions. We think about it first, though not necessarily in an empirical, research-driven way.

Martin describes a cognitive process called “integrative thinking” which he opposes to a more linear cost-benefit analysis of two existing options, which generally leads to unpleasant trade-offs. Successful leaders, he claims, don’t settle for an “either-or” scenario, but look for synthetic or innovative solutions by taking a wider and more holistic view of the challenge. Using Fitzgerald’s old chestnut about high intelligence being defined by the ability to “hold two opposing ideas in the mind” to define the foundational cognitive act, he maps out a three-stage process:

1) Determining Salience: identifying the relevant factors even if they aren’t obvious
2) Analyzing causality: searching for surprising relationships between these factors
3) Envisioning a decision architecture: this is Martin's weakest category and means, roughly, don’t lose track of the best solution by delegating various parts of the analysis to different people who will miss the forest for the trees

This strikes me as fine and good, but less because it’s so ground-breaking than because it’s a clear description of what I’d call old-fashioned conceptual thinking, as I used to do it and attempt to teach it in graduate school when I was studying the humanities and social sciences. Though my steps were a little different and more evaluative than prescriptive (they were designed to help students write papers, after all), they amount to another version of the same thing:

1) Determine the factors/categories/experiences that are really important in the data set
1a) Pay as much attention to what’s surprisingly missing (e.g. a book about a relationship without any mention of desire) or irrelevant as to what is included and emphasized.
2) Determine relative value/influence of these various factors in terms of how they interact and impact the data set as a whole.
2a) What factors are validated and what are under-valued? Are the factors dynamic or static?
3) Develop a hypothesis/story to explain the data set which has captured your attention or seems particularly important
4) Then test the hypothesis/story against contradictory data. Can you come up with a story that explains the contradictory data?
5) If you can’t make it work, evolve the hypothesis or start over with a new data set.

Business doesn’t take conceptual thinking too seriously as a real discipline, probably because it sounds too soft, too theoretical, at least compared to the catch-all category of quant or statistical analysis, which is the most convincing form of evidence for most business people. (Martin also mentions the reliance on regression analysis as the favorite tool of simplicity seekers.) Or maybe it’s because business education relies so heavily on a case-study model, which facilitates the impulse to “search and reapply” I’ve witnessed among so many M.B.A.-trained colleagues.

It’s weird, because business—with its complex systems, susceptibility to social-cultural factors, radically dynamic relationships and need for overarching explanatory stories—seems more suited to conceptual thinking than to quantitative analysis, which is a better tool for analyzing existing data sets or studying phenomena in controlled environments.

But as Martin rightly points out, complexity makes most of us anxious, even if the ability to analyze more factors generally leads to more innovative solutions.

Maybe I’m missing something obvious (the vice of the complexity-embracer), but for the time being, I’m going to continue defending old-fashioned thinking (whatever you call it: integrative, conceptual, critical) as the first tool of choice for solving business problems.