Archive for Creativity

Easy to Imagine, Hard to Imagine

This morning, my brother pointed me toward an excellent rant by Steve Yegge about what he calls the “shit’s easy” problem. The essay is worth reading in full*, but his basic point is that just because you can imagine something, that doesn’t mean you can implement it. Something that seems easy in theory can get very hard very fast once you start considering use cases, interactions with other complex systems, and the like. He’s got two extended examples – implementing credit card “buckets” for categories of spending, and legalizing marijuana – that will show you exactly what he means. Both of these examples are things that you can easily imagine, but that get very hard once you have to actually do them.

I, of course, struggle with the opposite of the “shit’s easy” problem – namely, with the thing we like to call creativity. “Shit’s easy” is what happens when something is harder to implement than it is to imagine. “Creativity,” I’d argue, is what happens when the opposite is true. When we have a hard time imagining how to accomplish something, and yet we have the evidence that it has been done, we call it creative**.

What we’re talking about in both cases is cognitive dissonance – a difference between our expectations and the reality of getting things done. When things are about as hard to do as they are to imagine***, there’s nothing that requires explanation. We aren’t provoked to curiosity.

When things are easier to imagine than implement, we can’t understand why it doesn’t just get done. “What could be wrong?” we groan. “Shit’s easy!” Cue irritated explanations of system complexity.

When things are easier to implement than imagine – which may just be because they are very hard to imagine – we can’t imagine how it did get done. “Whoa!” we say. “That was so creative!”

See the graph below for a visual explanation. As things get harder to imagine, we’re more likely to call them creative. As they get harder to implement, we’re more likely to inaccurately call them easy****.
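
For the curious, here’s a minimal matplotlib sketch of roughly that graph – the axis assignments and region labels are my own shorthand for the description above, not the original figure:

```python
# A rough sketch of the graph: imagination difficulty on one axis,
# implementation difficulty on the other. The diagonal is where the
# two match and nothing needs explaining; the off-diagonal regions
# are where the dissonance lives.
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(5, 5))

# Diagonal: about as hard to do as to imagine -- no surprise either way.
ax.plot([0, 1], [0, 1], "k--", label="as hard to do as to imagine")

# Above the diagonal: harder to implement than to imagine.
ax.fill_between([0, 1], [0, 1], 1, color="tab:red", alpha=0.2)
ax.text(0.15, 0.8, "\"shit's easy!\" zone")

# Below the diagonal: harder to imagine than to implement.
ax.fill_between([0, 1], 0, [0, 1], color="tab:blue", alpha=0.2)
ax.text(0.5, 0.18, "\"so creative!\" zone")

ax.set_xlabel("how hard it is to imagine")
ax.set_ylabel("how hard it is to implement")
ax.set_xticks([])
ax.set_yticks([])
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
ax.legend(loc="lower right")
plt.show()
```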

Of course, that gets at a deeper question: what is easy or hard for us to imagine?

* Okay, you can skip the bits where he’s telling you just how awesome he is. Yes, he’s doing it rhetorically to assure the reader he isn’t speaking defensively. Yes, it will probably also annoy you.

** Obviously this is a testable hypothesis. I’d love to test it at some point.

*** Note that this implies some interesting predictions about the impact of expertise on creativity evaluations – and, for that matter, on “shit’s easy” syndrome.

**** The best defense against the “shit’s easy” problem, by the way, is a dozen or so years of Talmud study. Talmud is allllll about the implementation details.

Rooms and Elephants

In defiance of Internet Time, I’m going to recommend an essay I came across a couple of months ago: Sven Birkerts’ “The Room and the Elephant.”

Birkerts weighs in on a debate I’m quite deeply interested in, both personally and as a scholar. Whether it’s being framed as a conversation about collective intelligence, expertise, authorship or individuality, the thread running through it is always the myth of individual creativity.

I say myth quite deliberately, because unlike Birkerts and Lanier (of whom he seems to approve, though I can’t imagine why), I find the notion of individual creativity deeply problematic. We romanticize a process that actually involves many more people than we like to admit. Colleagues, collaborators, conversation partners, and chance encounters are all part of the creative process – not to mention the professional apparatus involved in the execution, production and distribution of any big idea. Every creator is connected to the world. We use the myth of individual creativity to draw a line between what is the creator’s and what is the world’s, but that line is essentially arbitrary, drawn more from our minds and dreams than from reality.

At the same time, that hardly puts me in a camp with Bustillos and the others Birkerts cites as his opposition. I believe expertise matters, for example; I’m enough of a cognitive psychologist to understand the ways in which experts differ from novices in perception, memory, problem-solving and more. I think it’s disingenuous to assume those differences don’t translate to the digital world. I’m also less than enchanted with the notion that online is necessarily different. You’ll notice I don’t believe that individual creativity has become a myth in our newly networked world. Rather, I believe that it has always been a myth that we finally have the tools to examine.

Finally, I’ll add that I’m increasingly interested in the role of the body in our ever more screen-based age. I think it’s no coincidence that the body is treated more and more like an object, to be exercised or groomed or controlled, as our work lives become more and more about pouring our brains onto screens. Birkerts may call it a longing for selfhood, but I’d argue that a lot of the web is about expressing and constructing selves – even if it’s not the deep subjectivity Birkerts valorizes. What’s missing is a way of constructing the self that integrates the mind, the body, and the often-forgotten heart.

A Very Old Freshman Musical

Back in 1995, I was a college freshman. One night I was sitting around the dorm with some friends (including Steve Huff, Dara Horn and Michelle Chen) complaining about how hard it was to get involved in theater in your first semester – especially if you wanted to write, direct, or play the lead. Somehow, we decided that the best solution was to write, compose, direct, produce and staff an entire musical using only first-year students. That way there’d be at least one production where high-level roles were open to freshmen. And so The Freshman Musical was born.

Over the next six months, we wrote an original musical – with no experience. We composed an original musical – with no experience. We directed and produced and arranged and rehearsed our little hearts out. We stapled all our sets to the floor because someone bought the wrong size staples for the staple gun. We made horrible puns and forgot what it was to sleep. And eventually, the curtain went up on Shakespeare in the Yard, our mashup of Shakespeare and Kafka and Harvard culture and adolescent angst. I can still sing at least half the songs.

Today, Steve sent me the link for this year’s Freshman Musical production. Meaning: sixteen years later, they are still going.

Looking back, what amazes me is that we didn’t wait for anyone’s permission. We didn’t wait until we knew how to do it right.  We didn’t create it thinking we were making an institution. We just made something neat, because it was fun, and we could, and we wanted to. And it turned out that we made something awesome.

Recently, I’ve been so absorbed in finishing my dissertation that I’ve been saying no to starting things. But actually, starting things is awesome in a completely different way.

I think that’s my life lesson for today.

The Top of Your Mind, Part I

Over the last few days, I’ve been thinking a lot about Paul Graham’s essay, The Top Idea in Your Mind.  He argues for “ambient thought” as a valuable problem-solving tool.  The thing you let your mind drift to when you’re in the shower, or standing in line, or on the subway?  That’s the thing you’re going to have insights about.  What seem like snips and scraps of time add up to a lot of attention on a problem, especially since they’re likely reflecting even more activity going on under the surface.

When it comes to creativity, this is what’s called incubation – time you’re not actively spending on a problem, but that nonetheless helps you solve it.  There’s some debate about how incubation works: does it help you come up with new ideas about a problem, or does it just help you let go of ideas that aren’t working?  Either way, though, that time is valuable.  As Graham points out, you can get unproductive things stuck in the top of your mind, such as raising money or arguments you’ve had.  If you do, you lose out on productive incubation for whatever idea you might have engaged with otherwise.

But are money and arguments really always unproductive?  Can we generalize beyond Paul Graham’s experience?  I think the answer is yes.  What’s common to the problematic “top ideas” Graham mentions is an inability to control the outcome.  Raising money depends on other people’s willingness to give it to you.  Resolving a dispute depends on the participation of whoever you’re in conflict with.  No matter how much time you spend “solving” these problems, they’re not within your power to solve.  Spending top-of-the-mind time on them is like salting your umbrella: it may make you feel like you’re cooking, but at the end of the day, it won’t taste very good no matter what you do.

Graham’s approach to forgiveness is a really good example of how to let go of problems you can’t control, and focus on ones you can.  When you find yourself able to spend your top-of-the-mind time on things you can make progress on, you’ll find that progress actually gets made!

Unrequited Romance With Media Figures

David Duchovny, why don’t you love me?

Or, if you’re like me and have never seen the X-Files, you could just get it on with Ray Bradbury.  [NSFW]

As funny as these videos are, they’re also wonderful examples of magical thinking about authorship and creativity.  To love the work is to love the man behind the work.  (And of course the reverse happens too.  See under: Olivia Munn.)

I think this would be an interesting concept to look at historically, as ideas of public and private life have changed.  I just read How to Be Alone*, which has a wonderful essay on the disappearance of the public sphere even as new technologies let us perform as if in public all the time.  I’m sure authors and actors have always gotten plenty of mash notes (what a wonderful phrase!), but to me these performances feel different, and I don’t think it’s just because they’re meant to be funny.  I think the funny lies in the exposure of this conflation of author and work, public and private, and the attempt to touch one by rather literally touching the other.

* If this indicates to you that I am way behind with my book logging, you would be absolutely correct.

Assumptions, Transgression and Plagiarism

In his post on plagiarism, Stanley Fish argues that plagiarism is not a “big moral deal” – a phrase I’m delighted he uses, even if I disagree with parts of his argument.  It deflates the usual hysteria about how terrible it is that kids these days just don’t care about proper attribution of sources because the Internet has ruined their brains.  It gives us a chance to actually think the issue through.

I think Fish is absolutely right that plagiarism only becomes a transgression against the backdrop of certain assumptions.  However, I think those assumptions are far broader than simply the disciplinary standards of journalism or academia.  In writing something down, there is an implied claim of ownership of the words and ideas.  Referencing the original author then lets the reader know that in this case, that assumption does not hold.  Citation holds additional meaning in particular disciplines, of course, such as allowing the reader to trace the ideas in the piece back to their sources.  Ultimately, though, if we didn’t have widespread cultural assumptions about the meaning of writing something down, we wouldn’t need to signal when those assumptions are violated.

Here’s a thought experiment.  Writing – and by extension authorship – could come with a rather different set of assumptions.  Imagine that everything written down was assumed to be the work of someone else.  Propounding your own ideas would need to be clearly signaled with a set of written markers, and failure to use those markers correctly could get you in serious trouble.  “Kids these days are so arrogant,” one can imagine the hysterical articles claiming.  “They keep trying to pass their own work off as the work of other, smarter, more talented people.  They just don’t care about properly marking what’s their own.”  This isn’t even so terribly unrealistic; the documentary hypothesis, for example, suggests precisely this sort of reverse plagiarism.  (I remember hearing a similar argument made about Shakespeare’s positioning some of his dramatic innovations as traditional, but damned if I can find it again.  Anyone know?)

This is why I disagree with Fish.  The disciplinary values that he frames as “game rules” are not value-neutral.  (Just ask Ian Bogost about the way rules can express particular critical positions!)  They are founded on a commonly understood sense of what writing means, which itself ties to the notions of originality and single authorship that Fish tries to take out of the picture.  Saying that plagiarism is wrong because it violates the rules of the academic game is fine, but that only pushes the question back to the assumptions on which the rules themselves are founded.  Will those rules change as our assumptions about written texts change?  Will they stay the same, becoming a hermetically sealed system?  Or will they serve as a brake on how our ideas shift, tying us back to individualism and changing our practices themselves?

It also bothers me that originality and single authorship are assumed to go together, such that if one is absent, the other is violated.  This assumption is by far my biggest problem with the piece.  People can be profoundly creative working in groups.  It’s hard to assign a standard notion of authorship to any individual within the group, but that hardly implies that just anyone can claim to have contributed, or that nothing new has been created.  In fact, one can look at the entire process of academic citation and referencing as a way of collaboratively producing original knowledge.  Each person working in a particular academic tradition has their own ideas, but those ideas are built on relationships and prior work.  The citation process is precisely a system for formalizing and making visible this group relationship, in the interest of allowing one person to claim authorship of a single portion of the conversation.

Of course, these ideas about originality and single authorship are ones Fish reports on, not claims as his own.  I just wish he’d cited his sources more extensively, so I could respond to the people actually making this argument rather than Fish’s interpretation of their work!

Reading List 2010 (7/99)

More books!  (And musings on creativity, below the cut.)

  • Gone Tomorrow, Lee Child
  • Relentless, Dean Koontz
  • Undone, Karin Slaughter
  • The Reapers, John Connolly
  • The Pursuit of Love, Nancy Mitford
  • Love in a Cold Climate, Nancy Mitford
  • The Complete Stories of Evelyn Waugh, Evelyn Waugh

Finding My Line

Over the last two months I’ve been working intensely on my dissertation.  If you’re wondering whether this relates to my adviser’s imminent return from sabbatical, you’d be right!  But I’ve been really surprised by how much this intense focus helps both my productivity and my mood.  I wake up every morning raring to dissertate (yes, I did just say “raring to dissertate”), and I still have several hours to devote to other projects after I hit my daily targets.

Today, I came across Merlin Mann’s article on Making Time to Make and realized what I’ve been doing.  I’ve been drawing a clear and firm line around my time.  While I hardly have the problems of a Neal Stephenson, I do have lots of people who want my time: academic colleagues, former students, potential consulting clients, friends I haven’t seen recently, and more.  All these relationships enrich my life, but there are more of them than I can manage!  Worse, making daily decisions about how much attention I could spare was killing my productivity even when I wasn’t actually available.

I’ve made a few exceptions, but my so-far-successful ruleset looks like this:

– No meetings that end after 10am, unless data collection requires it.
– No leaving the office for any reason until I’ve hit my dissertation goal for the day.
– No new freelance projects or academic commitments.*
– No organizing social events of any kind; let other people be in charge!
– No long emails.  (And a private IM account that only my boy’s got access to.)
– No apologizing for putting my dissertation first.

What’s especially interesting to me is just how much of this was made possible by the dissertation-completion fellowship program I’m in.  The office they gave me is hidden away**, meaning I don’t get interrupted unexpectedly.  The workspace is ergonomic enough that I can work until I’ve hit my daily goal without killing my wrists.  The meeting room is heavily booked during the late morning and afternoon, so I’m not tempted to schedule midday meetings.  It’s amazing how these structural changes help me enforce my own rules!

That’s not to say that line-drawing has no drawbacks.  There are people I really like who aren’t getting the attention I want to give them, and I’m feeling pretty darn broke without any new projects in the pipeline.  Just yesterday I had to tell a former student I couldn’t meet with him, which I hate to do!  And there are less obvious drawbacks, too: I’m not really good at letting other people organize my free time, so instead of hanging out with friends I’m doing more one-on-one activities with the boy.***

I think that some of the specifics of my strategy will have to change during the upcoming year. For example, I’d like to have one “open afternoon” a week, where I go work somewhere I’m casually available for conversation and brainstorming.  I also don’t think I can go a whole year without organizing any social events!  But having rules, even if they’re less strict, seems to work really well for me.  The less time I spend making decisions about how to spend my time, the more time I actually have to spend.****

* Okay, I’m really bad at this one.  Why must so many things be so interesting?

** In a basement, as usual.  Do you think I can write “Must have workspace with window” into a job contract?

*** Though this isn’t all bad, since it’s resulted in dance lessons!

**** Which is why I may have to do a piece about rules as cognitive technology.  But not now!  My rules say I can’t!

Which Test?

As a researcher, I’ve wanted a site like whichtest.info for ages. By answering a few simple questions about your data sample, the site helps you figure out which statistical methods to apply. It’s not a replacement for stats classes, but it’s an incredibly helpful supplement. (Or, if you’re like me, it’s a good way to confirm that you’re actually doing what you meant to!)
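
To give a flavor of how such a tool works, here’s a toy sketch in Python of the kind of question-driven decision tree it might encode. (This is my own guess at the shape of the logic – the questions, branch order, and test choices are illustrative, not whichtest.info’s actual rules.)

```python
# A toy question-driven test chooser. The questions, branch order, and
# test choices below are illustrative guesses, not whichtest.info's rules.

def suggest_test(outcome: str, n_groups: int, paired: bool, normalish: bool) -> str:
    """Suggest a test for comparing groups on a single outcome variable.

    outcome:   "categorical" or "continuous"
    n_groups:  number of groups being compared (2 or more)
    paired:    True if measurements are paired/repeated on the same units
    normalish: True if the continuous outcome looks roughly normal
    """
    if outcome == "categorical":
        return "McNemar's test" if paired else "chi-squared test"
    if n_groups == 2:
        if paired:
            return "paired t-test" if normalish else "Wilcoxon signed-rank test"
        return "independent-samples t-test" if normalish else "Mann-Whitney U test"
    # Three or more groups, continuous outcome.
    if paired:
        return "repeated-measures ANOVA" if normalish else "Friedman test"
    return "one-way ANOVA" if normalish else "Kruskal-Wallis test"

# Example: two independent groups, roughly normal continuous outcome.
print(suggest_test("continuous", n_groups=2, paired=False, normalish=True))
# -> independent-samples t-test
```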

It’s also reminded me that there are lots of easy, relatively low-tech tools for thinking that no one’s built yet, because no one’s seen the money in it – or gotten sufficiently annoyed to roll their own. Just last week I had to pull out my stats textbook to check something and got really irritated by how long it took me. Next time I’m annoyed at something in my daily life, I’ll pay attention!

OH JARON LANIER NO

I just came across the New York Times’ review of Jaron Lanier’s new book. Ordinarily I’d read it before opining, but I’m not sure I’m willing to pay him for a copy. Here’s one example of why:

His new book, “You Are Not a Gadget,” is a manifesto against “hive thinking” and “digital Maoism,” by which he means the glorification of open-source software, free information and collective work at the expense of individual creativity.

As a creativity scholar, I find the notion of “individual creativity” as constructed in our culture to be a myth. Creativity is not something that happens in a vacuum. Even if you take a fairly narrow definition of the term (as, say, novel creations), almost no one works alone. Most fields require collaboration for people in them to function, let alone to advance. We also perpetually underestimate the degree to which personal and chance connections influence people’s creative work.

Worse, the myth of “individual creativity” is a poisonous one. It’s rooted in a Great Man approach to the world, which has been debunked in most fields but somehow not this one. Under this theory, creative advances happen because of the heroic efforts and remarkable capabilities of one person. This is just not realistic, as Herbert Spencer pointed out back in the day. It’s also used to argue that women and minorities are less creative than men (because, you know, where’s our Mozart?). This theory is seriously not even wrong.

Lanier may be right that collective work is being glorified at the expense of individual creativity, but he’s wrong to be upset about it. “Individual creativity” deserves to have some of the air taken out of it. Jaron Lanier may have something to lose if that happens, but the millions of people doing unrecognized collaborative creative work have far more to gain.