Premises, premises

I honestly hadn’t intended to take the last few days off from blogging, but I assure you, I have a dandy excuse. To give you a hint, I invite you to contemplate the riddle of the Sphinx: what animal walks on four legs in the morning, two at noon, and three at night?

On to the day’s business — or rather, the business of last week. Scouring my to-blog-about list for amusing and thought-provoking topics to while away the time before the advent of Queryfest, my annual foray into all things query-related, I came across a terrific question from reader Kelly:

I have a question about plot clichés, if you have the chance to address it. Obviously, the “it was all a dream” won’t fly. What other common plot twists do those of you who see so many manuscripts just groan about? Thanks and feel better soon.

That bit at the end will tell you just how long even very good questions sometimes linger in my hey, that would make a great post pile: kind Kelly was wafting me positive energies immediately after my car crash last year. There’s been some recent progress in that area, by the way: after 14 months, I’m finally walking without a cane.

Can tap-dancing be far behind?

So if I’m honest about it, responding to Kelly’s question is really the business of last year. That seems oddly appropriate, given one of the publishing world’s most common complaints about writers: a fondness for procrastination.

Oh, don’t grimace; everyone procrastinates a little. It’s healthy not to be too rigid. Besides, one of the most important lessons any writer of book-length work has to learn is that a full-length manuscript is not the kind of thing that even the most gifted crafter of prose can polish off in a day, a week, or a month.

Oh, some writers (including yours truly) can indeed draft new text very quickly, but that’s not the issue. Writing a book requires consistent, patient application, not merely short, intense bursts of endeavor. So does revising a manuscript. Yet since most of us do our best work if we can devote some unbroken time to it, it can be very tempting to put off diving in — or diving back in — until we can devote a whole day, week, or month to it, can’t it?

And that temptation, boys and girls, is why most serious writers have woken up on at least one fine spring morning, sat bolt upright in bed, and shouted, “Wait — how much time has passed since I swore that I was going to finish that revision? Or start it?”

Or exclaimed, “Hey, wasn’t my New Year’s resolution to send out ten queries per week? Have I sent out even one this month?”

Or moaned, “Oh, my God — the agent of my dreams requested pages six months ago, and I’m still revising. Should I take another run at Chapter 152, or should I pop the whole shebang in the mail as is? What if she doesn’t want it anymore?”

I’m not bringing this up to depress all of you who swore that Labor Day (or the Fourth of July, or Valentine’s Day, or St. Swithin’s Day) was going to be the moment you sprang into action, honest. Nor am I passing judgment on the many, many aspiring writers whose lives swamped their good intentions. I’m not even changing the subject so that I may put off answering Kelly’s excellent question for a few more minutes.

I’m bringing it up, if you must know, because writers who procrastinate so often create characters that procrastinate. Seriously, it’s one of Millicent the agency screener’s most frequent complaints about how novelists and memoirists plot books: characters irk her by sitting around and thinking too much.

Or, to mix things up a little, by sitting around and talking through the problems with their best friends, coworkers, mothers, fathers, or, depending upon book category, the people they are about to try to murder. Especially, as is often the case in novel submissions, when these little chats over coffee, in bars, over lunch, over a telephone, or in hastily-improvised torture chambers consist largely of the protagonist recapping conflict that the reader has already seen.

How, from an editorial standpoint, could that not seem redundant? “Criminy, move on,” Millicent scolds the text in front of her. “The point of novel narration is not to convey every single thing that happened in the book’s world, but to tell a story in a lively and entertaining manner!”

Because I love you people, I shall spare you what she hisses at memoir submissions in which the narrator agonizes for fifty or sixty pages on end about whether to confront someone who clearly needs some confrontation — only to decide not to do it after all. In fiction and nonfiction alike, her preference nearly always leans toward the active protagonist given to making things happen, rather than a passive one to whom things happen.

Half of you clutched your chests at some point over the last four paragraphs, didn’t you? Relax; I’m not about to repeat the all-too-often-heard advice on this point: telling writers never to show their protagonists thinking is akin to asserting that no character, however devoted to the color pink, may ever be depicted wearing it. Intelligent characters frequently think, and one-size-fits-all writing rules are almost invariably wrong a great deal of the time.

What I am suggesting, heart-clutchers, is merely that Millicent, like most professional readers, has from long experience developed a finely-tuned sense of how much rumination is too much, as well as when it starts to feel repetitious. To eyes trained to spot textual and conceptual redundancy, even a single repeated thought pattern can jump off the page. Small wonder, then, that showing the complexity of a problem by depicting the protagonist revisiting the same set of doubts over and over again is a notorious professional readers’ pet peeve.

Frequently, their impatience is justified: while deeply-felt internal conflict can be quite interesting on the page, most protagonists in first-person and tight third-person narratives don’t think about problems differently each time. Instead, the writer seeks to have the page mirror the way people mull over problems in real life: with redundant logic, facing the same fears and rehashing the same options on Monday as on Friday.

Or the following Friday. Or two years from Friday.

“God, I wish that this writer had never seen a production of Hamlet,” Millicent has been known to murmur over the fourth slow-moving protagonist of the day. “Would it be too much to ask the narrative to get out of this character’s head long enough for her to do something? It wouldn’t even have to advance the plot — I’d settle for her taking up lion-taming or developing a sudden passion for spelunking. Anything, so she gets out of her chair and moves around the world!”

“But Anne!” I hear some of you chest-clutchers point out, and with good reason, “people honestly do fall into thought loops when they’re worried about something, especially if they lean toward the compulsive in general. I’m sorry if it bores Millicent, but I’m trying to represent reality here: the human psyche is not always bent upon producing entertainingly diverse thought patterns.”

Perhaps it isn’t, but you should be. It’s a writer’s job not just to hold, as ’twere, the mirror up to nature, but to create a result that will be a pleasure to read. Redundant thoughts, like redundant action, have a nasty habit of discouraging readers from continuing to turn pages. Obsessive characters can be very interesting, but as the pros like to say, it all depends on the writing: it’s very, very easy for realistic depictions of recurrent thought or even feeling to become positively soporific on the printed page.

Not as easily spotted a cliché as it was a dark and stormy night or you may be wondering why I called you all here, admittedly, but the rumination-obsessed protagonist is actually more common in submissions these days than either of these well-worn tropes. None of these are as ubiquitous as teenagers who roll their eyes, of course, or people under 50 who say whatever and like, but all are equal-opportunity Millicent-annoyers.

Now the rest of you are clutching your chests, but at this late date, most adult readers, even non-professional ones, have seen enough compulsive thought patterns on the page to recognize one within a line or two. At most, it will take them a couple of paragraphs to catch on. How, then, is the writer to maintain interest and tension throughout pages and pages of it?

Honestly, a little obsessive-compulsion goes a long way on the page. Millicent’s seeing less of it these days than when the TV show MONK rendered OCD such a popular character quirk; if a hit TV show or movie contains a noteworthy character trait or plot twist, it’s a safe bet that agencies will be receiving hundreds of iterations of it over the next 2-5 years. The Millies of the early 1980s could have wallpapered both North and South Korea entirely in manuscripts that resembled M*A*S*H, for instance; for the last decade, it’s been rare that a police procedural submission does not include a scene reminiscent of LAW AND ORDER or CSI. And frankly, our time on earth is too precious to waste toting up how many SF and fantasy submissions fairly reeked of the influence of STAR WARS and STAR TREK.

It’s not that some of the borrowed characters and quirks are not inherently entertaining; in a good writer’s hands, they certainly can be. There’s also something to be said for adhering to the conventions of one’s chosen book category: in a Western, readers expect a confrontation between the fellows in the white hats and the black, just as readers of women’s fiction expect their protagonists to grow and change over the course of the story.

By definition, though, what none of these elements can ever be is fresh.

Which goes right to the heart of Kelly’s question, does it not? While the list of premises, plot twists, and character traits that might set Millicent’s teeth on edge changes perpetually — what might have riled her Aunt Mehitabel when she was just starting out as a reader in the mid-1970s is substantially different from what might occur often enough to get on Millie’s nerves today, or her younger sister Margie five years from now — the basic principle remains the same: even if the writing is good, if she’s seen it before, it’s not going to seem fresh or surprising on the page.

Remember, Millicent is not only charged with the task of sifting through submissions to find great writing and original voices; she’s also looking for unique takes on reality and plots that she hasn’t seen before. While imitation may be the sincerest form of flattery (which I sincerely doubt), at submission time, not seeming like a rehash of the most recent bestseller or blockbuster film is a significant asset.

I know, I know: it’s not all that uncommon for agency submission guidelines to sound as though their Millicents are eagerly awaiting a carbon-copy of whatever is hitting the top of the bestseller lists today. Indeed, sometimes they are looking for copycats. Even with monumental bestsellers like the TWILIGHT series or BRIDGET JONES’ DIARY, though, it usually doesn’t take too long before Millie and her boss are saying, “Oh, no, another knock-off? I want the next great bestseller, not what was hot two years ago.”

Don’t believe me? How hard do you think it would be to sell BRIDGET JONES’ DIARY as a fresh manuscript today? It would simply seem derivative.

That’s why, in case you had been wondering, those oft-repeated experiments in which some bright soul submits the first 50 pages of some classic like PRIDE AND PREJUDICE (1813) to an array of present-day agents and/or publishing houses, in an attempt to test whether their Millicents would know great literature if it fell in their laps, invariably fall flat. Of course, PRIDE AND PREJUDICE would get rejected today; as a new manuscript, it would seem completely lifted from Jane Austen. To a reader familiar with English novels of the period, even the title would seem unoriginal: the phrase PRIDE AND PREJUDICE (in all caps, no less) is repeated no fewer than three times in Fanny Burney’s novel of a generation before, CECILIA, OR, MEMOIRS OF AN HEIRESS (1782).

Besides, have you seen how much time Austen’s protagonists spend thinking?

I know that this might come as a shock to the many, many writers raised on 19th-century literature, but what seemed fresh on the page in 1813 is unlikely to strike Millicent as original or even market-appropriate today. Ditto for 1913, 1943, 1983, or 2003. In fact, what would have wowed ‘em at the agency in any of those years is likely to seem positively dated now, even if its cultural references have not.

Remember, too, that Millie lives in the same media-heavy culture you do: while she might not watch enough TV to know what a Snooki is, to catch an Arrested Development reference, or to be able to pick any of the current crop of presidential contenders out of a police line-up, it’s unlikely that she would be lucky enough to have missed any public discussion of these phenomena. If you loved the Saw movies enough to borrow some elements of them for your horror manuscript, chances are that a Millicent working in a horror-representing agency will be harboring some affection for those movies, too.

Which is not to say that a plot similar to the Saw movies might not have done very well, had it hit Millicent’s desk right after the first film in the series came out. Many a writer who has been toiling away quietly for years on a manuscript has suddenly seen it become sought-after as soon as a similar book, movie, or TV show hits the big time. Agents and editors do often clamor for something similar to what’s hot at the moment. Since it takes so long to write a book, however, it’s generally the writers who were already working on such a book, not because it was cool, but because they liked the subject matter, who are in the best position to take advantage of such a trend. Or writers who can produce a manuscript with similar appeal within a year or two. After that, imitation is likely to make the book seem dated.

Not sure what a dated manuscript is, or why it might be hard to sell? Okay, let me ask you: if you picked up a book stuffed to the gills with references to Ross Perot, would you (a) embrace it as a book about contemporary politics, (b) assume that it had been published sometime in the mid-1990s, and turn to another book for insights on the current political scene, or (c) wonder who in the heck Ross Perot was?

If you said (b), you’re beginning to think like Millicent: the 1992 election was a long time ago. If you said (a), I’m guessing you do not follow politics very closely. And if you said (c), well, ask your parents, but don’t be surprised if they remember his ears more than his politics.

Even if a manuscript avoids the specific pop references that tend to age so poorly on the page — nothing seems more tired than yesterday’s catchphrases, right? — borrowing the plot twists and premises of yesteryear can make a book seem dated. One of the surprisingly immortal premises: neighborhoods where none of the mothers work outside the home, or even consider it. While it’s not beyond belief that such communities still exist, it’s far enough from the mainstream American experience these days that it would require fairly extensive textual explanation.

Embracing writing fads of years past also tends to make a manuscript seem dated. When STAR WARS embraced the Jungian heroic journey structure, it generated a lot of buzz — and for the next two decades, the viewing public was inundated with movies with that same structure. Then, in the late 1990s and early 2000s, advocating that structure for novels became extremely popular, resulting in manuscript after manuscript with essentially the same story arc falling on Millicent’s desk with clockwork regularity. Because Millicent’s boss was screening manuscripts back then, Millie’s been trained to regard that structure as old-fashioned.

Not to mention predictable. And speaking of repetitive premises, does it bother anyone but me that the mortality rate for mothers in the STAR WARS movies is close to 100%?

Seriously, it doesn’t pay to underestimate just how predictable adhering to a well-worn plot device can render a manuscript, especially to someone who reads as much as Millicent. People drawn to work in publishing tend to be both plot-retentive and detail-oriented: I was surely not the only future editor who walked out of the original STAR WARS saying to her big brother, “You know what would make more sense than that ending? If Leia was Luke’s sister? I mean, honestly — why begin their names with the same first letter, something screenwriters usually take wincing pains to avoid, unless we’re supposed to guess that there’s a familial relationship?”

Okay, so this was probably not how most elementary schoolers reacted to the film, but I read a great deal. Not only science fiction, but fables — and the heroic journey story arc was supposed to surprise me? Nice try, Mr. Lucas.

An original plot twist or premise should surprise the reader — and that’s legitimately hard to do. It’s also often difficult for an isolated writer to spot just how much his plot, premise, or characters might resemble what Millicent is receiving from other writers. Even if the writer can successfully weed out conceptions of dramatic fitness culled from stories floating around the zeitgeist — from movies, television, books, even major news stories — that might be turning up in other submissions, rooting out or even noticing stereotypes (what? The guy with tape on his glasses is a computer expert? Who saw that coming?), stock plot twists (the murderer isn’t the first person the police arrest? Alert the media!), overused premises (the police partners who made the arrest are experiencing some romantic tension? The school bully targeting the gay teen is himself fighting urges in that direction? The guy bent on revenge is actuated by the trauma of having seen his wife and small child murdered out of the reader’s sight and before the story began?), and hackneyed phrasing (“I’m sorry for your loss,” anyone?) can often require an outside eye.

Why? Often, such well-worn story elements are so familiar to the writer, as well as to her nearest and dearest, that they don’t seem like clichés. They just seem like the constituent parts of a story. Therein lies the essential paradox of trafficking in the already-done: that plot twist that feels dramatically right may well come across that way because you’ve seen it before.

And so has Millicent. Remember, clichés typically don’t irritate agents, editors, and contest judges the first time these fine folks spot them on the manuscript page, or even when the pesky things are repeated over the course of a particular submission or contest entry. What chafes their sensibilities is seeing the same phrases, characters, plot twists, and even premises over and over across hundreds of manuscripts.

Hey, if you’ve seen one completely selfless mother, a lady utterly devoid of any personal preferences unrelated to her children, you might not actually have seen ‘em all. After Millicent has screened the forty-seventh synopsis featuring a selfless mother within a single week, however, it might well start to feel that way.

That’s a pretty good test of whether a manuscript might have strayed into over-nibbled pastures, by the way: if the synopsis — or, sacre bleu, the descriptive paragraph in the query letter — makes reference to a well-established stereotype, it’s well worth looking into how to make the characters less, well, predictable.

And now two-thirds of you chest-clutchers are mopping your weary brows. Honestly, this is beginning to read like a word problem on the math section of the S.A.T.

By definition, stereotypes and clichés are predictable: they are the shorthand a culture uses for archetypes. The mean tenth-grade girl, for instance, or the dumb jock. The absent-minded professor who can’t find the glasses perched on top of his head. The sociopathic lawyer who cares only about winning cases, not justice. The tough drill sergeant/teacher/physical therapist who seems like a bully at first, but turns out to be concealing a heart of gold.

Hey, what happened to all the floozies harboring hearts of gold? When did they fall out of the collective mind? Sometime during the Reagan administration? Or was it a decade earlier, when librarians and schoolteachers lost the right to yank the pencils from their collective hair, remove the eyeglasses that they apparently don’t require in order to see, and have the nearest male exclaim, “Why, Miss Jones — you’re beautiful!”

Now, poor Miss Jones would have to be an expert in particle physics, save the world in the third act of the story, and look as though she had never eaten a cookie in order to engender that reaction. It’s enough to make an educated woman bob her hair.

Naturally, what constitutes a cliché evolves over time, just as what seems dated in a plot does, but as far as characterization goes, one factor remains the same: a stereotype telegraphs to the reader what kind of behavior, motivations, and actions to expect from a character. A pop quiz for long-time readers of this blog: why might that present a problem in a manuscript submission?

For precisely the same reason that a savvy submitter should avoid every other form of predictability, especially in the opening pages of a manuscript or contest entry: because being able to see what’s going to happen in advance tends to bore Millicent. If a professional reader can tell instantly from a story’s first description of a character precisely how he is going to act and how he is likely to speak, where’s the suspense?

The same holds true for too-common premises, by the way. Those two coworkers of opposite sexes squabbling? They’ll be in love within fifty pages. That child the woman who swore she never wanted children acquires by accident, theft, or some inconsiderate relative’s leaving him on her doorstep? He will completely transform her life. The completely irresponsible man who discovers he’s had an unknown child for decades? He’s going to be integral to that kid’s life, and vice versa. That wish the protagonist makes on page 2, even though the text explicitly tells us that she never wishes on passing stars? It’s going to come true.

In spades. It’s written in the stars.

Oh, you thought that Millie wouldn’t catch on that teenage Billy was going to wreck his new motorcycle by the second time his parents are shown to be worried about it? I hate to burst anyone’s plotting bubble, but at this juncture in literary history, most professional readers would have said, “Oh, he’s going to crash it,” halfway through the scene where he bought the bike.

She’s also going to foresee that the character a bystander identifies as having had a hard childhood is going to be the mysterious murderer decimating the summer camp/isolated hotel/submarine’s crew, that the grandmother/grandfather/elderly neighbor giving the youthful protagonist with nowhere else to turn sterling (if predictable) advice is going to have some sort of health scare by three-quarters of the way through the book, and that the otherwise clear-thinking lady who wisely retreated to someplace her violent ex-husband/evil boss/corrupt Congressman isn’t will be startled when he shows up.

Quite possibly standing behind her while she is gazing soulfully into a mirror. A cat will have startled her first, however. That fellow is also not going to be dead the first time she, her knight in shining armor, or the few remaining members of that light-hearted weekend canoeing party think they have dispatched him.

Hey, the monster always returns is a cliché for a reason.

I don’t mean to alarm you, but reading manuscripts for a living often results in a serious decrease in the ability to be astonished by plot twists at all. Avert your eyes if you have never seen The Sixth Sense, but I had twice suggested to my date that the psychologist was a ghost before the end of the first therapy scene. I kept asking, “But if he’s alive, why isn’t he talking to the kid’s mother? And why doesn’t she have any interests or concerns unrelated to her child?”

To anyone who has been reading manuscripts for a living for more than a week or two, there’s another problem with stock characters. Millicent tends to associate them with rather lazy writing — and certainly with lax research. I’m not just talking about the astonishingly common phenomenon of novels saddling their protagonists with professions with which their writers are clearly unfamiliar (if I had a nickel for every tax specialist character who takes an annual month-long holiday on April 16th because the writer who created her isn’t aware of how many people file their taxes late, I would be able to afford a month-long holiday right now) or the equally common fish-out-of-water stories in which the writer seems as out of his depth in the new environment as his protagonist (my personal pet peeve: protagonists who inherit wineries, then proceed to run them with a whole lot of heart — and learning valuable life lessons — while clearly learning virtually nothing about the actual practicalities of making wine).

I’m talking about characters, usually secondary ones, that are different in some fundamental way from the protagonist. You wouldn’t believe how often subtly-drawn primary characters share page space with downright cartoonish villains or minor characters.

When writers just guess at the probable life details and reactions of characters unlike themselves, they tend to end up writing in generalities, not plausible, reality-based specifics. A common result: characters whose beauty and brains are inversely proportional, whose behavior and/or speech can be predicted as soon as the narrative drops a hint about their race/gender/sexual orientation/national origin/job/whatever, and/or who act exactly as though some great celestial casting director called up the nearest muse and said, “Hello, Euterpe? Got anything in a bimbo cheerleader? Great — send me twelve.”

Seen once on the page, one-note characters are merely irritating. When those cheerleaders come cartwheeling across a good 40% of YA set in high schools, though, even a hint of waved pom-pom can get downright annoying.

Even amongst agents, editors, and judges who are not easily affronted, stereotypes tend not to engender positive reactions. What tends to get caught by the broom of a sweeping generalization is not Millicent’s imagination, but the submission. If it seems too stereotypical, it’s often swept all the way into the rejection pile.

Why, you ask? Because by definition, a characterization that we’ve all seen a hundred times before, if not a thousand, is not fresh. Nor do stereotypes tend to be all that subtle. And that’s a problem in Millicent’s eyes, because what she’s looking to see in a new writer is — feel free to chant it with me now — originality of worldview and strength of voice, in addition to serious writing talent.

When a writer speaks in stereotypes, it’s extremely difficult to see where her authorial voice differs markedly from, say, the average episodic TV writer’s. It’s just not all that impressive — or, frankly, all that memorable.

“But Anne,” writers of reality-based fiction and nonfiction alike protest, “sometimes, stereotypes have a kernel of truth to them, just as clichéd truisms are frequently, well, true. Isn’t it possible that Millicent sees certain character types over and over again because they pop up in real life so often, and writers are simply reflecting that? Should she not, in short, get over it?”

Ah, editors hear that one all the time from those writing the real, either in memoir form or in the ever-popular reality-thinly-disguised-as-fiction manuscript. In fact, it’s an argument heard in general conversation with some fair frequency: many, many people, including writers, genuinely believe various stereotypes to be true; therein lies the power of a cliché. The very pervasiveness of certain hackneyed icons in the cultural lexicon — the policeman enraged at the system, the intellectually brilliant woman with no social skills, the father-to-be who faints in the delivery room, that same father helpless if he is left alone with the child in question, to name but four — renders them very tempting to incorporate in a manuscript as shortcuts, especially when trying to tell a story in an expeditious manner.

Oh, you don’t regard stereotypes as shortcuts? Okay, which would require more narrative description and character development, the high school cheerleader without a brain in her head, or the one who burns to become a nuclear physicist? At this point in dramatic history, all a pressed-for-time writer really has to do is use the word cheerleader to evoke the former for a reader, right?

Unless, of course, a submission that uses this shortcut happens to fall upon the desk of a Millicent who not only was a high school cheerleader, but also was the captain of the chess team. At Dartmouth. To her, a manuscript that relies upon the usual stereotype isn’t going to look as though it’s appealing to universal understandings of human interaction; it’s going to come across as a sweeping generalization.

Can you really blame her fingers for itching to reach for the broom?

“But Anne,” some of you point out, and who could blame you? “Isn’t this all going a little far afield from Kelly’s original question? Wasn’t she really asking for a list of overused plot twists and premises a savvy aspiring writer should avoid?”

Possibly, but that’s precisely the conundrum of freshness. What would have struck Millicent as fresh a year ago, when Kelly first brought this up, is not what would seem so to her now. Freshness is an ever-moving target, difficult for an aspiring writer — who, after all, usually takes at least a year or two to fashion a premise into a full manuscript — to hit predictably. Since nobody can legitimately claim to know what will be selling well a couple of years from now, committing to a premise is always going to be something of a risky proposition.

All a writer can do is strive to make her plot and characterization as original as her voice — and, ideally, as surprising. The best means of figuring out what will come as a pleasant surprise to Millicent is to read widely in your chosen book category. What kinds of plot twists are used, and which are overused? What’s been done to death, and what’s new and exciting? What’s considered characteristic and expected in your type of book these days, and what’s considered out of bounds?

Once you have come up with provisional answers to those questions, ask yourself another: how can I make my book’s premise, characterization, and plot even better than what’s already on the literary market?

Speaking of conundrums, have you solved the riddle of the Sphinx yet? It’s the humble human being: as babies, we crawl; in our prime, we walk on two legs; in old age, we use canes.

Actually, people tend to use walkers now, but who are we to question the wisdom of the Sphinx? All I know — and this is so far from a standard premise that I can’t recall a bestselling novel of the last twenty years that has dealt with this subject in any significant depth — is that after one has been hobbling around on three legs, it’s astonishingly tiring to wander around on just two. And that, my friends, is the explanation for my recent blogging silence: I’ve been taking a long change-of-season nap.

All the better to launch into Queryfest next time, my dears. Keep up the good work!

But enough about you — what about me?

Today, I had planned to launch headlong into my annual foray into how to construct a graceful and effective query letter, campers, but frankly, didn’t we devote an awful lot of the summer to discussing how to pitch? After so many weeks on end of dealing with practicalities, I feel that the artist in each of us deserves a little holiday.

So let’s refresh ourselves by talking craft for a while. Queryfest will be every bit as useful next week.

Memoir-writing and writing about reality as fiction have been much on my mind of late, and not merely because my memoir remains in publishing limbo. (Yes, still. Let’s just be grateful that not every memoirist’s extended family has the wherewithal to make credible $2 million lawsuit threats.) While we writers talk endlessly amongst ourselves about craft and structure for fiction, it’s actually quite rare to stumble into a knot of literary conference attendees avidly discussing how to make a personal anecdote spring to life on the page.

Why is that, when it is so very hard to write memoir well? All too often, the prevailing wisdom dictates that all a writer needs to produce a successful memoir is an exciting life, an ability to write clearly, and, if at all possible, celebrity in another field, so the writing will matter even less. The writer’s platform and the inherent interest of the story, we’re told, are all that matter in a memoir. Anything beyond that, presumably, is gravy.

As to structure, that’s held to be self-evident. In the immortal words of Lewis Carroll,

The White Rabbit put on his spectacles. “Where shall I begin, please, your Majesty?” he asked.

“Begin at the beginning,” the King said gravely, “and go on till you come to the end: then stop.”

As a memoirist and an editor who works regularly with same, I must disagree. While a chronological structure can work, not all human events start out scintillating; depending upon the story, another structure might work better.

Then, too, a memoir cannot really be deemed a success unless readers find it entertaining, enlightening, or at the very least, interesting. That’s not merely a matter of story. Any long-form writing, be it fiction or nonfiction, will benefit from a strong narrative structure, a consistent, likable narrative voice, a plausible and engaging story arc, believable, well-drawn characters, a protagonist the reader would be happy to follow for a few hundred pages…

In short, many of the elements one might find in a well-constructed novel. But that’s not all that a good reality-based story requires, is it? After all, few readers will want to read a story, whether it is presented as memoir or as fiction, simply because it really happened. It needs to feel real on the page — and it needs to be enjoyable to read.

What makes me think that this might be news to many writers of memoir and reality-based fiction, you ask? For my sins, I have served quite frequently as a contest judge, assessing both memoir and novel entries, and I’m here to tell you, they look more similar on the page than one might think.

How so? They tend to share a few characteristics: a one-sided approach to scenes, as if the protagonist’s perspective were the only possible one; an apparent assumption that the reader will automatically side with the protagonist, regardless of what is going on; and, bolstering both, a propensity for relating conflictual exchanges as though they were verbal anecdotes, light on detail but strong on emotion. Or, to boil all of these down to a single trait, these narratives tend to be disproportionately weighted toward a single point of view.

And memoirists’ hands fly heavenward all over the world. “But Anne,” they point out, and who could blame them? “My memoir is my story. Why wouldn’t it be biased toward my perspective?”

It should, of course — but in the interests of representing one’s own point of view, memoirists and writers of the real often render the narrative so one-sided that the situation seems neither plausible nor fairly presented. It just reads like a diatribe in scene form, a piece of prose whose primary point is not storytelling, but getting back at someone.

About half of you have started to blush, have you not? I’m not surprised; in both memoir and reality-based fiction, the scene where the reader is evidently expected to take the protagonist’s side, not because the antagonist is shown to be particularly awful, but because the narrative presents the antagonist without any sympathy — or, usually, any redeeming characteristics — is a notorious pet peeve of our old pal, Millicent the agency screener. And not just as a generality, either. When Millicents, their boss agents, and the editors to whom they cater gather to share mutual complaints in that bar that’s never more than 100 yards from any writers’ conference in North America, the annoying coworker stereotype often crops up in conversation.

As in, “You think you’re tired of conceptual repetition? I’ve read fourteen submissions this week alone with omittable annoying coworker scenes.”

It’s perhaps not altogether astonishing that memoirs would be rife with interactions between the protagonist/narrator and the people who happen to rile her, told in a breathlessly outraged tone, but aspiring writers of fact-based fiction are often stunned to discover that they were not the first to think of inserting actual conflicts into fictional stories. They shouldn’t be: there’s a pretty good reason that such scenes are ubiquitous in manuscript submissions and contest entries. Care to guess?

If you immediately cried out, “By gum, Anne, every writer currently crawling the crust of the earth has in fact had to work with someone less than pleasant at one time or another,” give yourself a gold star for the day. Given how often aspiring writers resent their day jobs — and, by extension, the people with whom they must interact there — that such unsavory souls would end up populating the pages of submissions follows as night the day.

If these charming souls appeared in novel and memoir submissions in vividly-drawn, fully fleshed-out glory, that actually might not be a problem. 99% of the time, however, the annoying co-worker is presented in exactly the same way as any other stereotype: without detail, under the apparent writerly assumption that what rankles the author will necessarily irk the reader.

Unfortunately, that’s seldom the case — it can take a lot of page space for a character to start to irritate a reader. So instead of having the character demonstrate annoying traits and allowing the reader to draw his own conclusions, many a narrative will convey that a particular character is grating by telling the reader directly (“Georgette was grating”), providing the conclusion indirectly (through the subtle use of such phrases as, “Georgette had a grating voice that cut through my concentration like nails on a chalkboard”), or through the protagonist’s thoughts (“God, Georgette is grating!”).

Pardon my asking, but as a reader, I need to know: what about Georgette was so darned irritating? For that matter, what about her voice made it grating? It’s the writer’s job to show me, not tell me, right?

I cannot even begin to count the number of memoirs and novels I have edited that contained scenes where the reader is clearly supposed to be incensed at one of the characters, yet it is not at all apparent from the action of the scene why.

Invariably, when I have asked the authors about these scenes, the response is identical: “But it really happened that way!”

No surprise there. These scenes are pretty easy for professionals to spot, because the protagonist is ALWAYS presented as in the right for every instant of the scene, a state of grace quite unusual in real life. It doesn’t ring true.

The author is always quite astonished that his own take on the real-life scene did not translate into instantaneous sympathy in every conceivable reader. Ultimately, this is a point-of-view problem — the author is just too close to the material to be able to tell that the scene doesn’t read the way she anticipated.

Did I just see some antennae springing up out there? “Hey, wait a minute. Mightn’t an author’s maintaining objective distance from the material — in this case, the annoying co-worker — have helped nip this particular problem in the bud long before the manuscript landed on Millicent’s desk?”

Why, yes, now that you mention it, it would. Let’s look at the benefits of some objective distance in action.

Many writers assume, wrongly, that if someone is irritating in real life, and they reproduce the guy down to the last whisker follicle, he will be annoying on the page as well, but that is not necessarily true. Often, the author’s anger spills so thoroughly into the account that the villain starts to appear maligned, from the reader’s perspective. If his presentation is too obviously biased, the reader may start to identify with him, and in the worst cases, actually take the villain’s side against the hero. I have read scenes where the case against the villain is so marked that most readers would decide that the hero is the impossible one, not the villain.

This character assassination has clearly not gone as planned. A little more objective distance might have made it go better. Who was it that said, revenge is a dish best served cold?

Yes, I called it revenge, because revenge it usually is. Most writers are very aware of the retributive powers of their work. As my beloved old mentor, the science fiction writer Philip K. Dick, was fond of saying, “Never screw over a living writer. They can always get back at you on the page.”

Oh, stop blushing. You didn’t honestly think that when you included that horrible co-worker in three scenes of your novel that you were doing her a FAVOR, did you?

My most vivid personal experience of this species of writerly vitriol was not as the author, thank goodness, but as the intended victim. And at the risk of having this story backfire on me, I’m going to tell you about it as nonfiction.

Call it a memoir excerpt. To prevent confusion, I’m going to offset the narrative from the discussion.

A few years before I began blogging, I was in residence at an artists’ colony. Now, retreats vary a great deal; mine have ranged from a fragrant month-long stay in a cedar cabin in far-northern Minnesota, where all of the writers were asked to remain silent until 4 p.m. each day, to a sojourn in a medieval village in southwestern France, to a let’s-revisit-the-early-1970s meat market, complete with hot tub, in the Sierra foothills.

A word to the wise: it pays to do your homework before you apply.

This particular colony had more or less taken over a small, rural New England town, so almost everyone I saw for the month of March was a writer, sculptor, photographer, or painter. While world-class painters and sculptors were imported up ice-covered rural roads every few days to critique and encourage those newer to their respective arts, the National Book Award winner scheduled to give feedback to the writers didn’t bother to show up for the first week of her residency. Amenities like kilns, darkrooms, and ladders to facilitate the construction of 20-foot woven cardboard cocoons seemed to appear whenever the visual artists so much as blinked. The writers, a tiny minority, had been shoved into a dank, dark cellar with cinder block walls; you could see the resentment flash in their eyes when they visited the painters’ massive, light-drenched studios, and compared them to the caves to which they had been assigned.

See what I just did there? I skewed the narrative so you would resent the visual artists.

But was that necessary? Objectively speaking, they were not the villains in this situation; they, like me, were visitors to the retreat. Besides, since the overwhelming majority of the Author! Author! community is made up of writers, couldn’t I simply have assumed that my readers would identify with the cave residents pretty much automatically?

Or, better yet, couldn’t I have included a vivid detail or two that would have nudged the reader in that direction without the narration’s appearing to be presenting a myopic account?

What kind of detail, you ask? Let’s try this one on for size.

Due to the musty dampness of the writers’ cellar, I elected to write in my assigned bedroom, in order to catch the occasional ray of sunlight. Sure, there were certain drawbacks — the desk had been designed for a hulking brute twice my size, while the desk chair had apparently been filched from a nearby kindergarten — but at least the heat worked. Too well, in fact: an hour and a half into my first afternoon of writing, a sleepy hornet emerged from the gaping hole around the charming antique light fixture and aimed straight for my head.

It was not the best moment to learn that the windows had been sealed for the winter. You know writers: we can’t be trusted not to let all of the heat out. Unlike, say, painters, whose windows might safely open onto vast vistas of forested hillside.

As the afternoon sun warmed the room, hornet after hornet emerged from its long winter’s nap. After the eighth had expressed its displeasure at my having had the temerity to turn on either the light or the heat, I shook the hornets off my jacket, wrapped my head and shoulders in several scarves, and plunged into a blizzard. By the time I reached the administration building, I was chilled to the bone.

Perhaps naïvely, I had assumed that the hornet’s nest in my room would come as a surprise to the retreat’s administrators. It wasn’t: the writer who’d had the room the previous November — the local authorities had deemed it inadequately heated for winter residence — had complained about the hornets, too. The painter-in-residence charged with rooting them out had simply not gotten around to it.

And didn’t for three days. He was too busy with a canvas that just couldn’t wait to be handed down to posterity. The administrators encouraged me to regard sleeping on a couch next to the dining hall as my contribution to the world’s supply of art. I had to wait until after dark in order to retrieve my laptop.

That engaged your sympathies more robustly, didn’t it? It’s still my experience and my perspective, told in my voice — but I’ve allowed you to draw the conclusion. That’s simply better storytelling.

Don’t see it? Okay, contrast the fleshed-out account above with the following series of summary statements.

Sharing meals in a dining hall was a bit high school-like, conducive to tensions about who would get to sit at the Living Legend in Residence’s table, squabbles between the writers and the painters about whether one should wait until after lunch to start drinking, or break out the bottles at breakfast (most of the writers were on the first-mentioned side, most of the painters on the latter), and the usual bickerings and flirtations, serious and otherwise, endemic to any group of people forced to spend time together whether or not they have a great deal in common.

An environment ripe, in other words, for people to start to find their co-residents annoying.

Aren’t you already longing for me to show you how specifically they were annoying, rather than merely telling you that they were? Let’s exacerbate the problem in the manner so many writers of the real do, creating the illusion of narrative distance by switching the text almost entirely into the passive voice.

Of course, such problems are endemic to large artists’ colonies. One classic means of dealing with the inevitable annoying co-resident problem is to bring a buddy or three along on a retreat; that way, if the writer in the next cubicle becomes too irritating, one has some back-up when one goes to demand that she stop snapping her gum every 27 seconds, for Pete’s sake. I am of the school of thought that retreating entails leaving the trappings and the personnel of my quotidian life behind, but there’s no denying that at a retreat of any size, there can be real value in having someone to whom to vent about that darned gum-popper. (Who taught her to blow bubbles? A horse?)

Doubtless for this reason, several artists had brought their significant others to the hornet-ridden New England village. Or, to be more accurate, these pairs had applied together: writer and photographer, painter and writer, etc. One of these pairs was a very talented young couple, she a writer brimming with potential, he a sculptor of great promise. Although every fiber of my being longs to use their real names, I shall not.

Let’s call them Hansel and Gretel, to remove all temptation.

And let’s see how this telling, not showing thing I’ve got going works for character development, shall we?

Hansel was an extremely friendly guy, always eager to have a spirited conversation on topics artistic, social, or his personal favorite, explicitly sexual. The dining hall’s Lothario, he could constantly be spotted flirting with…hmm, let’s see how best to represent how he directed his attentions…everything with skin.

Amusing, but wouldn’t some details have brought his predilections more clearly before the reader’s eyes? Let’s try showing some of his work.

His eyes flickered over the female residents so persistently that I wondered if he was looking for a model. On day three, when he invited me to his palatial sculpture studio, I realized that he might have been seeking a lady to encase in plaster of Paris: practically every flat surface held representations of breasts, legs, pudenda, and breasts. He practically backed me into a backside. Murmuring some hasty excuse about needing to get back to my hornets, I slipped away from his grasping hands and dashed out into the pelting snow.

Still don’t see why that was better? Okay, let’s revert to generalities.

Being possessed of skin myself, I naturally came in for my fair share of Hansel’s attentions. (How’s that for a colorless summary of the preceding story?) Generally speaking, though, I tend to reserve serious romantic intentions for…again, how to put this…people capable of talking about something other than themselves. Oh, and perhaps I’m shallow, but I harbor an absurd prejudice in favor of the attractive.

This is precisely the type of paragraph that will absolutely slay ‘em in a verbal anecdote, or even in a blog, but often falls flat on the page. Yes, it’s amusing; yes, people actually do speak this way, so it’s a plausible first-person narrative voice. But it’s vague. It’s character development, in the sense that it purports to tell the reader something about the narrator, but the reader just has to take the narrative’s word for it. Is that really the best way to convince the reader what a protagonist is like?

An artists’ retreat tends to be a small community, however; one usually ends up faking friendliness with an annoying co-resident or two. Since there was no getting away from the guy — believe me, I tried — I listened to him with some amusement whenever we happened to sit at the same table. I was, after all, the only other artist in residence who had read any Henry Miller. We had coffee a couple of times when there was nobody else in the town’s only coffee shop. And then I went back to my room, battled away the wildlife, and wrote for 50 hours a week.

Imagine my surprise, then, when Gretel started fuming at me like a dragon over the salad bar. Apparently, she thought I was after her man.

Now, I don’t know anything about the internal workings of their marriage; perhaps they derived pleasure from manufacturing jealousy scenes. I don’t, but there’s just no polite way of saying, “HIM? Please; I do have standards” to an angry wife, is there? So I simply started sitting at a different table in the dining hall.

A little junior high schoolish? Yes, but better that than Gretel’s being miserable — and frankly, who needed the drama? I was there to write.

Let’s pause here to consider: what do you, the reader, actually know about Gretel at this point? Are your feelings about her based upon what you have actually seen her do, or upon my conclusions about her motivations? And are the facts even clear: was I the only resident of whom Gretel was jealous, or did she fume over the salad bar at anyone possessing two X chromosomes?

Wouldn’t it have worked better had I just shown her slapping peanut butter violently onto some white bread while I tried to make pleasant conversation with her, or depicted her veering away from me with her cracked metal tray? In short, wouldn’t it have made more sense to show this as a scene, rather than telling it as an anecdote?

Often, this fix is expressed rather confusingly: writers are told to insert some narrative distance into such scenes. I’m not a big fan of this language, for the simple reason that most memoirists and writers of the real new to editor-speak tend to interpret it as a call to make the narrative appear objective by, you guessed it, retreating into the passive voice. Let’s take a gander at this strategy in action.

Another phenomenon that often characterizes a mixed residency — i.e., one where different types of artists cohabitate — is a requirement to share one’s work-in-progress. At this particular retreat, painters and sculptors had to fling their studios open to public scrutiny once a week. Each writer had to do at least one public reading in the course of the month.

Feels like you’ve been shoved back from the story, doesn’t it? That’s how verbal anecdotes tend to read on the page: as rather vague summaries. When they are in the passive voice as well, the narrator can come across as the passive puppet of circumstances, rather than as the primary actor of the piece, the person who makes things happen.

Let’s borrow a tool from the novelist’s kit and make the protagonist active, shall we?

Being a “Hey, I’ve got a barn, and you’ve got costumes!” sort of person, I organized other, informal readings as well, so we writers could benefit from feedback and from hearing one another’s work. I invited Gretel to each of these shindigs; she never came. By the end of the second week, my only contact with her was being on the receiving end of homicidal stares in the dining hall, as if I’d poisoned her cat or something.

It was almost enough to make me wish that I had flirted with her mostly unattractive husband.

But I was writing twelve hours a day (yes, Virginia, there IS a good reason to go on a retreat!), so I didn’t think about it much. I had made friends at the retreat, my work was going well, and if Gretel didn’t like me, well, we wouldn’t do our laundry at the same time. (You have to do your own laundry at every artists’ retreat on earth; don’t harbor any fantasies about that.) My friends teased me a little about being such a femme fatale that I didn’t even need to do anything but eat a sandwich near the couple to spark a fit of jealous pique, but that was it.

Aha, so Gretel had singled me out. Was there a good narrative reason not to make that plain earlier? It almost certainly would have been funnier — and made both my reactions and my conclusions as narrator make more sense to the reader.

At the end of the third week of our residency, it was Gretel’s turn to give her formal reading to the entire population of the colony, a few local residents who wandered in because there was nothing else to do in town, and the National Book Award winner, who had finally deigned to drop by (in exchange for a hefty honorarium) to shed the effulgence of her decades of success upon the resident writers. Since it was such a critical audience, most of the writers elected to read highly polished work, short stories they had already published, excerpts from novels long on the shelves. Unlike my more congenial, small reading groups, it wasn’t an atmosphere conducive to experimentation.

Wow, I’ve left you to fill in a lot of details here, have I not? How could you possibly, when the narrative so far has given you only a very sketchy view of time, place, and character?

Four writers were scheduled to read that night. The first two shared beautifully varnished work, safe stuff, clearly written long before they’d arrived at the retreat. Then Gretel stood up and announced that she was going to read two short pieces she had written here at the colony. She glanced over at me venomously, and my guts told me there was going to be trouble.

See how I worked in the false suspense there? Rather than showing precisely what her venomous glance was like — impossible for you to picture, right, since I have yet to tell you what she looks like? — I embraced the ever-popular storytelling shortcut of having the protagonist’s reaction to an event or person take the place of showing what was actually going on. Think that was the best strategy for this story?

Let’s try another tack. How about getting a little closer to what’s happening in that crowded room, so the reader may feel more like she is there? Or at least more like she’s standing in the narrator’s shoes?

Gretel settled a much-abused spiral notebook onto the podium and began to read a lengthy interior monologue in stentorian tones. Her eyes never left the paper, and with good reason: the plotless account depicted Hansel and Gretel — both mentioned by name on page 1, incidentally — having sex in vivid detail. Just sex, without any emotional content to the interaction, in terms neither titillating nor instructive. It was simply a straightforward account of a mechanical act, structured within a literal countdown to the final climax: “Ten…nine…eight…”

It was so like a late-1960’s journalistic account of a rocket launching that I kept expecting her to say, “Houston, we’ve got a problem.”

I cringed for her — honestly, I did. I’d read some of Gretel’s other work: she was a better writer than this. So what point was she trying to make by reading this…how shall I put it?…literarily uninteresting piece whose primary point seemed to be to inform the uncomfortable audience that she and her husband had consummated their marriage?

See how I used my response to develop the narrator’s character? Memoirists and writers of the real too often forget that the narrator is the protagonist of the story they are telling, and thus needs to be fleshed out as a character. If I’d attacked that last paragraph with a bit more descriptive vim, I might have worked in some interesting insights into both Gretel and Hansel’s characters — how did her account jibe with his sculptural depictions of the act, for instance?

Oh, you thought that all of those body parts were languishing around his studio solo? Alas, no; I’ve seen less accurate models in biology classes. Again, wouldn’t it have been more effective storytelling to have shown that — or even made that last comment — while the protagonist was in the studio?

That would also have been the natural time to work in that Hansel’s sculptures did not…again, how to put this tactfully?…appear to have been based upon his wife’s womanly attributes. Artistically, he favored curves; she was so angular that she could have cut vegetables on her hip bones.

Lingering too long in the narrator’s head can be distracting from the action, though. Throughout the next paragraph, I invite you to consider: as a reader, would you have preferred to see the action more directly, or entirely through the narrator’s perspective?

Maybe I just wasn’t the right audience for her piece: the painters in the back row, the ones who had been drinking since breakfast, waved their bottles, hooting and hollering. They seemed not to notice that although the monologue was from a female perspective, there were no references whatsoever to the narrator’s physical sensations, only what Hansel was doing. The part of Gretel might have been quite adequately played by a robot.

Call me judgmental, but I tend to think that when half the participants seem to be counting the seconds until the act is over, it’s not the best romantic coupling imaginable. Still, looking around the auditorium, I didn’t seem to be the only auditor relieved when it ended. “Three…two…one.” No one applauded but Hansel.

In first-person pieces, the narration will often switch abruptly from inside the protagonist’s head to an ostensibly objective set of descriptions. Sometimes it works, sometimes it doesn’t. You be the judge: how well do you think the next paragraph carries the story forward from the last?

Gretel’s second piece took place at a wedding reception. Again, it was written in the first person, again with herself and her husband identified by name, again an interior monologue. However, this had some legitimately comic moments in the course of the first few paragraphs. As I said, Gretel could write.

Somewhere in the middle of page 2, a new character entered the scene, sat down at a table, picked up a sandwich — and suddenly, the interior monologue shifted from a gently amused description of a social event to a jealousy-inflamed tirade that included the immortal lines, “Keep away from my husband, {expletive deleted}!” and “Are those real?”

Need I even mention that her physical description of the object of these jabs would have enabled anyone within the sound of her voice to pick me out of a police line-up?

Wouldn’t it have been both more interesting and better character development to have shown the opening of Gretel’s second piece, rather than leaving it to the reader’s imagination? Ponder how that choice might have affected your perception of whether this scene is funny or tragic, please, as the narrative belatedly tells what it should have shown in the previous section.

She read it extremely well; her voice and entire demeanor altered, like a hissing cat’s, as if she were arching her back in preparation for a fight. Fury looked great on her. From a literary standpoint, though, the piece fell flat: the character that everyone in the room knew perfectly well was me never actually said or did anything seductive at all; her mere presence was enough to spark almost incoherent rage in the narrator. While that might have been interesting as a dramatic device, Gretel hadn’t done enough character development for either “Jan” — cleverly disguised name, eh? — or the narrator for the reader either to sympathize with the latter or find the former threatening in any way.

There was no ending to the story. She just stopped, worn out from passion. And Hansel sat there, purple-faced, avoiding the eyes of his sculptor friends, until she finished.

The first comment from the audience was, “Why did the narrator hate Jan so much? What had she done to the narrator?”

Had I been telling this anecdote verbally — and believe me, I have — this spate of summary statements and analysis of what the reader has not been shown might well work beautifully. Memoirists tend to be fond of paragraphs like this, commenting upon the action as if the reader had also been there. It makes abundant sense, from the writer’s perspective: after all, I was actually there, right?

But talking about events creates a very different impression on the page than writing about them vividly enough that the reader can picture the action and characters for herself. If I had shown you the story Gretel was reading, at least in part, you could have judged this character based on her own words — much more powerful than the narrator’s simply telling you what you should think about her.

A professional reader like Millicent — or, heck, like me — might well raise another objection to that last section: since the narration is so skewed to the protagonist’s side, some readers may feel that this account lacks credibility. Could Gretel actually have been as vitriolic (or unstrategic) as I’ve depicted her here?

Actually, she was, every bit — but does that matter, if the narrative can’t make her seem plausible on the page? The fact that Gretel existed and that she chose to act in this extraordinary manner is not sufficient justification for the reader to finish this story. It also has to work as a story, and that’s going to require some serious character development for not only the narrator, but the other characters as well.

You’d be astonished at how often memoir submissions do not treat either the narrator or the people around her as characters. Frequently, Millicent sees memoirs — and slice-of-life fiction, for that matter — that are simply commentary upon what was going on around the protagonist. Yet a memoir isn’t a transcript of events, interesting to the reader simply because they happened to the narrator; it’s one person’s story, skillfully pruned to leave out the dull parts. If the reader doesn’t get to know that narrator, though, or come to experience the other characters as real, the memoir is likely to fall flat.

Why? Because it will read like a series of anecdotes, rather than like a book.

Fictionalizers of real life tend to have an easier time thinking of their protagonists as protagonists, I notice, but as any Millicent could tell you, they often give away the narrative’s bias by clearly siding with one character over another. Or by depicting one character as all sweetness and light and the other as all evil. A popular secondary strategy: describing other characters’ reactions to the antagonist as universally in line with the protagonist’s, as though any onlooker would have had exactly the same response.

I was very nice to Gretel afterward; what else could I do? I laughed at her in-text jokes whenever it was remotely possible, congratulated her warmly on her vibrant dialogue in front of the National Book Award winner, and made a point of passing along a book of Dorothy Parker short stories to her the next day.

Others were not so kind, either to her or to Hansel. The more considerate ones merely laughed at them behind their backs. (“Three…two…one.”) Others depicted her in cartoon form, or acted out her performance; someone even wrote a parody of her piece and passed it around.

True, I did have to live for the next week with the nickname Mata Hari, but compared to being known as the writer whose act of fictional revenge had so badly belly flopped, I wouldn’t have cared if everyone had called me Lizzie Borden. And, of course, it became quite apparent that every time I went out of my way to be courteous to Gretel after that, every time I smiled at her in a hallway when others wouldn’t, I was only pouring salt on her wounded ego.

Is there anything more stinging than someone you hate feeling sorry for you?

At last, we come full circle, back to my original point in sharing this anecdote in the first place: if your answer was any flavor of yes, you might want to consider waiting until you’ve developed some objective distance from your annoying co-worker before committing her to print. Think at least twice about what you’re putting on the page, particularly for work you are submitting to contests, agencies, or small presses.

Or, heaven forbid, reading to a group of people you want to like you. Or your narrator.

If you’re still angry, maybe it’s not the right time to write about it for publication. Your journal, fine. But until you have gained some perspective — at least enough to perform some legitimate character development for that person you hate — consider giving it a rest. Otherwise, your readers’ sympathies may ricochet, and move in directions that you may not like.

It’s always a good idea to get objective feedback on anything you write before you loose it on the world, but if you incorporate painful real-life scenes into your fiction, sharing your work before you submit it becomes ABSOLUTELY IMPERATIVE. If you work out your aggressions at your computer — and, let’s face it, a lot of us do — please seriously consider joining a writing group. To be blunt about it, finding good first readers you can trust can save you from looking like an irate junior high schooler on a rampage.

And Gretel, honey, in the unlikely event that you ever read this, you might want to remember: revenge is a dish best served cold. Or, as Philip used to say, never screw over a living writer. You never know who might end up writing a blog.

Hey, I’m only human — which renders me a more interesting protagonist in a memoir, right? As a memoirist, I have to assume that my readers are too intelligent to believe that I was 100% perfect in this trying situation (I must admit, I did make an unkind joke or two in private) or that Gretel was 100% nasty (in actuality, she was rather nice to people her husband did not appear to be obsessed with sculpting). I suspect that most readers would also wonder whether Hansel actually stood by passively while his wife seethed with jealousy (he didn’t: he egged her on, in what appeared to me to be characteristic of their relationship). Were I planning to use this dynamic in a memoir, it would be in the story’s best interest to develop those less-neat elements into a more plausibly complete account.

If I hoped to fold this frankly pretty darned annoying incident into a novel, the imperative to flesh these people out into fully-rounded characters would be even stronger. Showing their foibles through action and dialogue, rather than just telling the reader what conclusions to draw, is not only better storytelling — it’s less intrusive narration.

Would I feel as vindicated? Perhaps not. Enough time has passed, however, that I now see this story as fundamentally sad: instead of befriending a more experienced writer who could conceivably have helped her on the long, twisty road to publication, Gretel allowed the troubled dynamic of her marriage to become the central focus of a bunch of not-particularly-sympathetic strangers. She, too, was in that dank basement while her husband, in comparative comfort, created his fantasies of women who did not resemble her. If he hadn’t chosen me as the prod with which to keep poking her insecurities, I’m sure he would have found somebody else.

So who is the actual villain of this piece? You decide; that’s the reader’s job, after all.

Keep up the good work!

Pet Peeves on Parade, part XXVII: plausibility, realism, and the wildly variable potentials of plot

I return to you an injured warrior, campers: for the past few days, my keyboard has lain idle while I have been recovering from a viciously broken fingernail. I’ve been lolling around with my left hand elevated, muttering ruefully.

Were those giant guffaws I just heard rolling about the ether an indication that some of you would not consider this a debilitating injury? I defy anyone to type successfully while a significant part of the nail bed on the pointer finger so dear to those who use the hunt-and-peck method is protected from the elements by nothing but the largest Band-Aid currently available to the medical community. Or to touch-type with any accuracy whilst said Band-Aid extends that finger to clownish lengths. Should any writer out there not care if his intended Fs are 5s and his Ps plus signs, I have yet to meet him.

In the course of all of that enforced lolling, however, I had leisure to contemplate once again the burning issue of plausibility on the page. Now that I’m back, I’m going to fling it into your consciousness, too: honestly, if you encountered the story above on page 57 of a novel, would it seem remotely realistic to you?

To a reader either unfamiliar with the torrid history of my long, accident-prone nails or happily inexperienced in having their own nails violently bent back, I’m guessing it would not. I’m also guessing that would come as a surprise to some of you, because as anyone who reads manuscripts for a living can tell you, the single most common response to an editorial, “Wow, that doesn’t seem particularly plausible,” is an anguished writer’s cry of, “But it really happened!”

I can tell you now that to a pro like Millicent the agency screener, this argument will be completely unconvincing — and not merely because she has, if she’s been at it a while, heard it applied to scenes ranging from cleverly survived grizzly bear maulings to life-threatening hangnail removals to couples who actually split the domestic chores fifty-fifty, rather than just claiming that they do. (Oh, like I was going to do laundry with a bent-back fingernail.) Any guesses why that cri de coeur about the inherently not-very-believable nature of reality will leave her cold?

Long-time readers, chant it with me now: just because something has occurred in real life does not necessarily mean it will be plausible written as fiction. Nor does the fact that a human being might actually have uttered a particular phrase render it automatically effective dialogue. For that reason, it’s the writer’s responsibility not simply to provide snapshots and transcripts of real life on the page, but to write about it in such a way as to make it seem plausible to the reader.

Let’s face it, plenty of real-life shenanigans are completely absurd; plenty of what tumbles out of people’s mouths is at least equally so. The world as we know it does not labor under the novelist’s imperative to render actions dramatically satisfying, or even interesting. None of us is empowered to walk up to someone who does something astonishing and say, “Hey, that’s completely out of character for you. Editing! Cut what this man just did.” (Although, admittedly, it would be an interesting approach to winning friends and influencing people.) And don’t even get me started about how a good editor could improve the dialogue all of us overhear in the movie ticket line, at the grocery store, or at your garden-variety garden party.

Besides, as a novelist, isn’t your job to improve upon reality? Isn’t it, in fact, your art and your pleasure to take the real and dress it up in pretty language, garnishing it with trenchant insights?

So you can’t really blame Millicent and her cronies for preferring fiction writing to have more to recommend it than its resemblance to something that might have happened on this terrestrial sphere. I suspect all of us who love good writing harbor a similar preference.

But I ask you as a reader: would you have felt differently if the tale at the opening of this post had turned up on page 143 of a memoir?

Most readers would; based on a true story is not ubiquitous in book and movie marketing simply because folks in those industries happen to like the sound of the phrase, after all. It’s human nature to like to be in the know.

That does not mean, however, that any truthful memoir is automatically and inherently plausible — and, as the series of scandals that has rocked the publishing world in recent years has made all of us aware, truthful and memoir are not necessarily synonymous terms, either. Yes, the reader picks up a memoir with the expectation that it will provide a fact-based portrayal of reality, but once again, it’s not just the accuracy of the facts that makes them seem true-to-life on the page.

What might the decisive factor be, campers? Could it be how the writer conveys those facts on the page?

As the pros like to say, it all depends on the writing. Just as many a ho-hum real-life event has been punched up by a gifted prose stylist into an unforgettable scene on the page, many an inherently fascinating occurrence has been rendered downright turgid by a dull telling.

Don’t believe me? Okay, try this little experiment: the next time you find yourself at a gathering that contains both interesting and uninteresting people, pick a few of each at random. Ask these people to describe their first really vivid memories — or, if you have ears of iron, their first memories of how their parents responded to a major public event like men walking on the moon, the shooting of President Reagan and James Brady, or a celebrity couple’s breaking up. (Hey, one person’s intriguing public event is another person’s snoozefest.) Listen attentively to each account without interrupting.

Then ask yourself afterward: “Did all of those stories seem equally true?”

If it’s not apparent to you a few sentences into the first poorly-told account why the storyteller’s skill makes all the difference to the audience’s perception of the story, well, I shall be very surprised. What might be less apparent — and thus require more careful listening to detect — is that you’re probably going to care less whether what the speaker is saying is true if she happens to tell the tale well.

And that, my friends, sums up the private reactions of many, many denizens of the publishing world in the wake of the A MILLION LITTLE PIECES scandal. For months afterward, while people in the outside world were asking, “But is this accurate?”, folks who dealt with books for a living — and, I suspect, most habitual readers of memoir — kept saying, “But was it well-written?”

Frankly, for a memoir to work, it needs to be both. Unless the memoirist in question is already a celebrity — in which case he’s probably not going to be the sole writer, anyway — a simple recital of the facts, however titillating they may be in and of themselves, will not necessarily grab Millicent. Nor will a beautifully-told collection of purely imaginary events fly in the memoir market.

You know where gorgeous writing that doesn’t confine itself rigidly to what actually happens in the real world works really well, though? In a novel. Provided, of course, that the writer presents those fictional — or fictionalized — events in such a manner that they are both a pleasure to read and seem plausible within the context of the world of the book.

Do I spot some timidly-raised hands out there? “But Anne,” those of you who specifically do not write about the real point out shyly, “I don’t think this applies to my work. I spin storylines out of whole cloth, creating plots where vampires roam freely, werewolves earn master’s degrees, and denizens of other planets lecture in political science departments. Of course, my stories aren’t plausible; that’s part of their point.”

Actually, to work on the page, any storyline needs to be plausible. That is, the narrative must be sufficiently self-conscious about its own premise that any reader who has accepted its underlying logic will believe that everything in the story could have happened that way.

You would be amazed at how often paranormal, science fiction, and fantasy manuscripts do not adhere to this basic precept of storytelling. Implausible fantasies are perennially among Millicent’s pet peeves.

That got a few goats, did it not? “What part of fantasy don’t you understand, Millie?” I hear some of you mutter under your respective breaths. “It’s not intended to be realistic.”

No, but it does need to be plausible — which is not necessarily synonymous with realism. In fact, in a completely fantastic story, remaining plausible might actually require being anti-realistic.

How so? Well, for the reader to be carried along with a story, its internal logic must make sense, right? A narrative that deliberately eschews the laws of physics of our world can’t just ignore physical properties and motion altogether; the writer must come up with a new set of rules governing the world of the story. And the less like the real world that fantasy world is, the more vital maintaining the reader’s sense of plausibility becomes to her willing suspension of disbelief.

That means, in effect, that while a fantastic plot allows the writer to play with reality, in order to be plausible, the narrative must be respectful of the fictional reality. So when, say, the three-toed sloth protagonist first sets a digit upon the Planet Targ, a place the reader was informed 138 pages ago was exempt from both gravity and dirt, and ol’ Three-Toe leaves a footprint, that’s going to jar a reader who has been paying attention. And the negative effects of even minor inconsistencies can pile up awfully fast: when T-T appears with his designer jeans covered in mud thirty pages after the footprint faux pas, the reader is obviously going to be less accepting than the first time the writer broke the rules.

What is the cumulative effect likely to be? For a lay reader, being knocked out of the story altogether. To a professional reader, however, the results are usually more dire — and are likely to be triggered by the first plausibility lapse, not the third or fourth.

“Oh, no,” Millicent sighs over The Saga of the Sloth. “This writer has set up a really interesting set of rules for this world, and now she’s violated one of them. That’s too bad; I was buying the premise here, and now I have to question it. Next!”

From Millicent’s perspective, the inconsistent detail about the footprint, while not necessarily a rejection-worthy problem in itself, represented a symptom of a plot-level plausibility issue, one that she does not necessarily feel compelled to read on to see confirmed thirty pages later in the muddy jeans. It was the writer’s job to make Three-Toe’s trip to Targ believable within the context of the book’s logic, after all. Since the narrative has already demonstrated a lax approach toward internal plausibility, an experienced Millie would expect to see more lapses later on in the manuscript.

And most of the time, she would be quite right about that. If you really want to set your fantastic world apart from 99% of the others she sees, make its attributes perfectly consistent.

That should be a piece of cake, right?

I’m kidding, of course; editing one’s own work for consistency is one of the most difficult self-editing tasks there is. That’s true, incidentally, no matter where your story might fall on the fantastic-realistic scale. In fact, proofing a hyper-realistic text can be even more challenging than a completely fictional one: even if it’s vitally important to the story that the broom is always kept behind the china cabinet, not the ottoman, the very mundanity of the detail may render it harder to keep in mind.

But you wouldn’t want your heroine to expend her last gasp of breath futilely flailing behind the wrong piece of furniture, would you?

Naturally, from the reader’s perspective, the less predictable a detail is, the more memorable it is likely to be. Case in point: what kind of animal is visiting the Planet Targ? Would you have been able to answer so quickly if the story had just been about some guy named Bart?

Does that gasp of frustration mean that those of you who write reality-based fiction and memoir are already familiar with the problem of how to make the real memorable while still maintaining a sense of realism? Let’s face it: most real-life details are likely to be on the unmemorable side. While a fantasy writer has the option — nay, the responsibility — to transform that perfectly ordinary mailbox on the corner into a flying monkey that happens to deliver mail for a living, a writer painting a picture against a backdrop of this world can’t.

(At least not until I have finished organizing my secret Chimps-on-Wings postal service. Mum’s the word until I put the finishing touches on that promising enterprise.)

But details need not strain the credulity in order to capture the reader’s imagination. Allow me to tell you a little story to illustrate — or, rather, a series of little stories. But first, let me prime the creative pump by showing you a couple of literal illustrations.

[images: the fortune, side one and side two]

These are the two sides of the single fortune I found tucked into an end-of-the-meal cookie last year, right around census time: a tactfully-phrased prediction of my future happiness — by mail, no less! — accompanied by a terse statement about my general standing in the world. Now, had I been a less secure person, I might have taken umbrage at my dessert’s presuming to judge whether I counted or not, but since I had already sent back my census form, I found the symmetry very pleasing: clearly, Somebody Up There (or at any rate, Somebody Working in a Cookie Factory) was planning to reward the civic virtue of my outgoing mail with something fabulous in my incoming mail.

Imagine how dismayed I would have been, though, had I not yet popped my census form into the mail — or, even worse, if I had not yet received my census form. As I rearranged vegetables and yogurt containers in preparation for fitting my leftover asparagus in black bean sauce and Hunan pork into my overstuffed refrigerator, I would have kept wondering: “Is the census form the mail I’m supposed to find so darned pleasant? I mean, I understand the Constitutional obligation to be counted every ten years, but who is this fortune cookie to order me to enjoy filling it out?”

Admittedly, in a real-life fortune cookie-consumption situation, this might have been a bit of an overreaction. (Although what’s next, I wonder? Miranda warnings printed on Mars bars, for easy distribution at crime scenes? The First Amendment immortalized in marzipan, lest bakery patrons temporarily forget about their right to freedom of assembly whilst purchasing fresh macaroons?) Had the protagonist in a novel or memoir stumbled upon this chatty piece of paper, however — and less probable things turn up on the manuscript page all the time — it would have seemed pretty significant, wouldn’t it?

Any thoughts on why that might be the case? Could it be that this bizarre means of communication is one of those vivid details I keep urging all of you to work into the opening pages of your manuscripts, as well as the descriptive paragraph in your queries, synopses, verbal pitches, and contest entries? Could the paragraphs above be crammed with the kind of fresh, unexpected little tidbits intended to make Millicent suddenly sit bolt upright, exclaiming, “My word — I’ve never seen anything like that before,” at the top of her lungs?

Or, to put it in terms the whole English class can understand, in choosing to incorporate that wacky fortune cookie into the narrative, am I showing, rather than telling, something about the situation and character?

How can a savvy self-editing writer tell whether a detail is vivid or unusual enough to be memorable? Here’s a pretty reliable test: if the same anecdote were told without that particular detail, or with it described in (ugh) general terms, would the story be inherently less interesting?

Don’t believe that so simple a change could have such a dramatic subjective effect? Okay, let me tell that story again with the telling details minimized. To make it a fair test, I’m going to keep the subject matter of the fortunes the same. Because I always like to show you examples of correctly-formatted manuscript pages, however, this time, I’m going to present it to you as a screening Millicent might see it. As always, if you’re having trouble reading the individual words, try enlarging the image by holding down the COMMAND key and pressing +.

It’s not as funny, is it, or as interesting? I haven’t made very deep cuts here — mostly, I’ve trimmed the adjectives — and the voice is still essentially the same. But I ask you: is the story as memorable without those telling details? I think not.

Some of you are still not convinced, I can tell. Okay, let’s take a more radical approach to cutting text, something more like what most aspiring writers do to the descriptive paragraphs in their query letters, the story overviews in their verbal pitches, and/or the entirety of their synopses, to make them fit within the required quite short parameters. Take a peek at the same tale, told in the generic terms that writers adopt in the interests of brevity:

Not nearly as much of a grabber as the original version, is it? Or the second, for that matter. No one could dispute that it’s a shorter version of the same story, but notice how in this rendition, the narrator seems to assume that the reader will spontaneously picture the incident so clearly that no details are necessary. Apparently, it’s the reader’s job to fill in the details, not the writer’s.

Except it isn’t. As far as Millicent is concerned, it’s the writer’s responsibility to tell the story in a way that provokes the intended reaction in the reader, not the reader’s to guess what the writer meant. Or to figure out what details might fit plausibly into the scene.

I hate to be the one to break it to you, but professional reading is seldom anywhere near as charitable as the average submitter or contest entrant hopes it will be. Blame it on the intensity of competition created by literally millions of aspiring writers seeking to get published: Millicent knows that if the well-written submission in front of her does not provide her with the reading experience her boss the agent believes will sell right now, chances are good that one of the next thousand submissions will.

According to her, then, it’s your job to draw her into your story so completely that she forgets about all of that. It’s your job to wow her with your storytelling — and without relying upon her sense that you might be writing about something that really happened to supply the plausibility strong, tangible details would provide.

So it honestly is in your best interest to assume that the reader is only going to picture the details you actually provide on the page. Since you cannot be sure that every reader will fill in the specifics you want, make darned sure that what you want the reader to take from the scene is not left to his imagination. If the detail is important, take the page space to include it.

This is particularly good advice if you happen to be writing either memoir or a novel with scenes based upon your personal experience. All too often, reality-based narrators rely upon the fact that something really happened to render it interesting to a reader, regardless of how skillfully that story may be told. All that’s really necessary is a clear telling, right? Or perhaps the kind of terse narrative that works so well in a verbal anecdote will inspire the same reaction if reproduced verbatim on the page?

How well does either of these extremely common theories work out in practice? Well, let me ask you: did you prefer the first version of the fortune cookie story, the second, or the third? More importantly for submission purposes, which do you think would grab Millicent the most as the opening of a manuscript?

Uh-huh. The difference between those three renditions was not the voice (although a case could be made that part of the voice of the first was created through the selection of the details) or even the writing quality (although the last version did get a mite word-repetitive), but the narrative’s willingness to include telling details — and unusual ones at that.

What if the entertainment differential between the three lay not in an authorial failure of imagination in composing the last version, but in a failure to recognize that the point of including this anecdote is presumably to entertain and inform the reader? In telling the story as quickly as possible, can a writer sometimes defeat the purpose of including it at all?

“But Anne!” memoirists and reality-based novelists protest nervously. “When I’m writing about the real, I can’t just make up pithy little details to enliven the narrative, can I? I have to stick to what happened!”

True enough, anxious truth-tellers: if you are writing the real, you cannot control the facts. What you can control, however, and what any writer must control, is how you present them to the reader.

No matter what you write, the success of your narrative is going to depend largely upon your storytelling skills — they’re what separates your account of a particular incident from anybody else’s, right? Frankly, this isn’t an easy task, even if dear self doesn’t happen to be the protagonist; it’s genuinely hard to represent the real world well on the page. Let’s face it, reality is sometimes a lousy storyteller.

Oh, your life has never been trite or obvious or just plain perplexing, even for a minute? Okay, all of you English and Literature majors, tell me, please, how the following 100% true anecdote rates on the symbolism front.

A couple of years ago, I was scheduled to give a eulogy for a dead friend of mine — a writer of great promise, as the pros used to say — at our college reunion. Because several of my classmates had, unfortunately, passed away since our last get-together, eight of us were to give our eulogies at the same event. Because I am, for better or worse, known to my long-time acquaintances as a teller of jokes, I was under substantial pressure to…how shall I put this?…clean up the narrative of my late friend’s life a little. Or at least tell a version that might not offend the folks who didn’t happen to know him.

No, that’s not the symbolic part; that’s all backstory. Here’s the symbolism: my throat was annoyingly, scratchily sore for the entire week that I was editing the eulogy.

Now, if I saw a parallel that obvious in a novel I was editing, I would probably advise cutting it. “No need to hit the reader over the head with it,” I’d scrawl in the margins. “Yes, it’s showing, not telling, but please. Couldn’t you come up with something a bit more original?”

(And yes, now that you mention it, I am known for the length of my marginalia. Brevity may be the soul of wit, but explanation is often the soul of clarity.)

Now, if my life were a short story written for an English class, the voice loss in that anecdote might pass for legitimate symbolism — or even irony, in a pinch. A bit heavy-handed, true, but certainly situationally appropriate: outsiders move to silence protagonist’s voice through censorship = protagonist’s sore throat. Both New Age the-body-is-telling-you-something types and postmodern the-body-is-a-text theorists would undoubtedly be pleased.

But the fact is, in a novel or memoir, this cause-and-effect dynamic would seem forced, or even trite. Certainly, it’s unlikely to make Millicent drop her latte and exclaim, “Wow, I never saw that coming!”

As I believe I may have already mentioned, just because something happens in real life doesn’t necessarily mean that it will make convincing fiction. My sore throat is precisely the type of symbolism that comes across as ham-handed in a novel. It’s too immediate, for one thing, too quid pro quo. Dramatically, the situation should have taken time to build — over the years since my friend’s death, perhaps — so the reader could have felt clever for figuring out why the throat problem happened. Maybe even anticipated it.

How much better would it have been, in storytelling terms, if our protagonist had dealt with all the different input with aplomb, not coming down with strep throat until scant minutes before she was to speak? That way, in fine melodramatic style, she would have to croak her way through her speech, while her doctor stood by anxiously with antibiotics.

The possibilities make the writerly heart swoon, do they not?

Just think how long it would extend a funeral scene if a eulogizer were unable to speak more than a few emotion-charged words before her voice disappeared with a mouse-like squeak. Imagine the deceased’s secret admirer creeping closer and closer, to catch the muttered words.

Heck, just think of the dramatic impact of any high-stakes interpersonal battle where one of the arguers cannot speak above a whisper. Or the comic value of the persecuted protagonist’s being able to infect her tormenters with strep, so they, too, are speechless by the end of the story.

Great stuff, eh? Much, much better than protagonist feels silenced, protagonist IS silenced. That’s just so…literal.

Besides, readers like to see a complex array of factors as causes for an event, and an equally complex array of effects. Perhaps if our protagonist had not spoken about her friend since he passed away (which, in a sense, is quite true: I was unable to make it across the country for his memorial service; that could be transformed into an interesting flashback), then she would be fictionally justified in developing speech-inhibiting throat problems now. Or if he and she had shared deep, dark secrets she had sworn never to reveal (no comment), how telling a slight sore throat might be on the eve of spilling the proverbial beans, eh?

But a single event’s sparking a severe head cold? Dramatically unsatisfying. Not to mention implausible.

Taken too far, it might even make the protagonist seem like a wimp. Readers, like moviegoers, like to see protagonists take a few hits and bounce up again. Even better is when the protagonist is beaten to a bloody pulp, but comes back to win anyway.

One of the great truisms of the American novel is don’t let your protagonist feel sorry for himself for too long — at least, not if his problems rise to the level of requiring action to fix. Simply put, most readers would rather see a protagonist at least make an attempt to solve his problems than spend 50 pages resenting them.

I can feel authors of novels and memoirs where characters sit around and think about their troubles for chapters on end blanching. Frankly, you should, at least if you intend to write for the U.S. market. Domestic agents and editors expect first-time authors’ plots to move along at a pretty good clip — and few characteristics slow a plot down like a protagonist’s tendency to mull. Especially in a first-person narrative, where by definition, the reader must stay within the worldview of the narrator.

Some of you blanching souls have your hands raised, I see. “But Anne,” these pale folks exclaim, “I’ve always heard that the real key to keeping a reader’s interest is to introduce conflict on every page. Well, most of my protagonist’s conflict is internal — she can’t make up her mind where to turn. Surely,” the pallor deepens, “a professional reader like Millicent wouldn’t dismiss this kind of thinking as whining, right?”

That’s a good question, blanchers, and one that fully deserves an answer. The short one is that it all depends on how long the equivocation goes on, how plausible the conflict is, and how repetitive the mulling ends up being. That, and whether the protagonist (or the plot, for that matter) is doing anything else whilst the wheels in her brain churn.

The long answer, of course, is that in order to formulate a really good answer to that particular question, you would need to go out and read a hefty proportion of the tomes released in your book category within the last couple of years. Not EVERY book, mind you: just those by first-time authors, because the already-established have to impress fewer people to get a new book into print.

In recent years, most fiction categories have moved pretty firmly toward the action end of the continuum. As opposed to, say, virtually any novel written in English prior to 1900, most of which hugged the other, pages-of-mulling end of the continuum.

This preference isn’t limited to the literary realm, either — we often see this philosophy in movies, too. Don’t believe me? Okay, think about any domestic film where an accident confines the protagonist to a wheelchair.

No examples springing to mind? Okay, how about if the protagonist is the victim of gratuitous discrimination, or even just simple bad luck? I’m talking about serious drawbacks here, not just everyday annoyances, of course. (For some reason, whining about trivial problems — “But I don’t have the right shoes to wear with a mauve bridesmaid’s dress!” — seems to be tolerated better by most readers and audience members, provided that the whine-producer doesn’t bring the plot to a screeching halt until she finds those shoes.)

Got a film firmly in mind? Now tell me: doesn’t the film include one or more of the following scenes:

(a) some hale and hearty soul urging the mangled/unemployed/otherwise unhappy protagonist to stop feeling sorry for himself,

(b) a vibrantly healthy physical therapist (job counselor/spouse/friend) telling the protagonist that the REAL reason he can’t move as well as he once did is not the casts on his legs/total paralysis/missing chunks of torso/total lack of resources/loss of the love of his life, but his lousy ATTITUDE, and/or

(c) the protagonist’s lecturing someone else on his/her need to stop feeling sorry for him/herself and move on with his/her life?

In fact, don’t filmmakers — yes, and writers of books, too — routinely expect their characters to become better, stronger people as the result of undergoing life-shattering trauma?

Now, we all know that this is seldom true in real life, right? As someone who has spent quite a bit of time in physical therapy clinics over the last year, I’m here to tell you that pain does not automatically make people better human beings; it makes them small and scared and peevish. That sudden, crisis-evoked burst of adrenaline that enables 110-pound mothers to move Volkswagens off their trapped toddlers aside, few of us are valiantly heroic in the face of more than a minute or two of living with a heart attack or third-degree burns.

Or ten months of physical therapy. And had I mentioned that my nail had a boo-boo?

Heck, even the average head cold — with or without a concomitant voice loss — tends to make most of us pretty cranky. Yet dramatically, we as readers accept that the little irritations of life might seem like a big deal at the time, even in fiction, because these seemingly trivial incidents may be Fraught with Significance.

Which often yields the odd result, in books and movies, of protagonists who bear the loss of a limb, spouse, or job with admirable stoicism, but fly into uncontrollable spasms of self-pity at the first missed bus connection or hot dog that comes without onions WHEN I ORDERED ONIONS.

Why oh why does God let things like this happen to good people?

One of my favorite examples of this phenomenon comes in that silly American remake of the charming Japanese film, SHALL WE DANCE? After someone spills a sauce-laden foodstuff on the Jennifer Lopez character’s suede jacket, she not only sulks for two full scenes about it, but is later seen to be crying so hard over the stain that the protagonist feels constrained to offer her his handkerchief.

Meanwhile, the death of her dancing career, the loss of her life partner, and a depression so debilitating that she barely lifts her head for the first half of the movie receive only a few seconds’ worth of exposition. Why? Because dwelling on the ruin of her dreams would be wallowing; dwelling on minor annoyances is Symbolic of Deeper Feelings.

So where does that leave us on the vivid detail front — or the plausibility front, for that matter? Should we all shy away from giving our protagonists big problems, in favor of more easily-presented small ones?

Well, I’m not going to lie to you: there are plenty of writing gurus out there who would advise you to do precisely that. Edith Wharton remarked in her excellent autobiography (which details, among other things, how terribly embarrassed everybody in her social circle was when she and Theodore Roosevelt achieved national recognition for their achievements, rather than for their respective standings in the NYC social register; how trying) that the American public wants tragedies with happy endings. It still seems to be true.

So why, you may be wondering, am I about to advise you to depict your protagonists (fictional and real both) not only with many and varied problems, but also with significant, realistic barriers to achieving their goals? Have I merely gone detail-mad?

Not by a long shot. I have heard many, many agents and editors complain in recent years about too-simple protagonists with too-easily-resolved problems. In conference presentation after conference presentation, they’ve been advising that writers should give their protagonists more quirks.

It’s an excellent way to make your characters memorable, after all — and it enables the inclusion of lots and lots of luscious telling details. Give ‘em backstory. If you want to make them sympathetic, a hard childhood, dead parent, or unsympathetic boss is a great tool for encouraging empathy.

Not to mention being plausibly survivable traumas. Do you have any idea how many Americans have experienced one of those things? Or all three?

Feel free to heap your protagonist (and love interest, and villain) with knotty, real-life problems — provided, of course, that none of these hardships actually prevents the protagonist from achieving his or her ultimate goal. Interesting delay creates dramatic conflict; resignation in the face of an insuperable barrier, however, is hard to make entertaining for very long. Make sure that the protagonist fights the good fight with as much vim and resourcefulness as someone who did not have those problems — or show her coming up with clever ways to make those liabilities work for her.

Again, this is not the way we typically notice people with severe problems acting in real life, but we’re talking writing that people read for pleasure here. We’re talking drama.

We’re talking, to put it bluntly, about moving a protagonist through a story in a compelling way, and as such, as readers and viewers, we have been trained to regard the well-meaning soul who criticizes the recently-bereaved protagonist by saying, “Gee, Monique, I don’t think you’ve gotten over your mother’s death yet,” as a caring, loving friend, rather than as a callous monster incapable of reading a calendar with sufficient accuracy to note that Monique buried her beloved mother only a couple of weeks before.

While a sympathetic soul might reasonably ask, “Um, why should she have gotten over it already, if she’s not completely heartless?”, strategically, even the deepest mourning should not cause the plot to stop moving altogether.

Don’t get me wrong: I don’t think that professional readers who resent characters who linger in their grief are inherently unsympathetic human beings. They just see far, far too much wallowing on the page.

While that’s undoubtedly realistic, it doesn’t really work in a manuscript. Fictional characters who feel sorry for themselves (or who even possess the rational skills to think at length over the practical ramifications of obstacles in their paths) tend to be passive, from the reader’s point of view. They don’t do much, and while they’re not doing much, the plot grinds to a screaming halt. Yawn.

Or to express it in Millicent’s parlance: next!

Yes, people do this in real life. All the time. But I’m relatively positive that someone told you very, very recently that just because something really happened doesn’t mean it will work on the page.

My, we’ve covered a lot of ground today. I’m going to leave all of this to germinate in your fertile minds for the nonce, campers, while I turn our attention back to nit-picky issues for the next few posts. (Oh, you thought I hadn’t noticed that I’d digressed from structural repetition?) Trust me, you’ll want to have your eye well accustomed to focusing on sentence-level details before we leap back up to plot-level planning.

A good self-editor has to be able to bear all levels of the narrative in mind simultaneously, after all. This is complicated stuff, but then, so is reality, right? Keep up the good work!

The Short Road Home, part V, in which we have apparently all died and gone to Concrete Example Heaven

On and off for the last couple of weeks, we’ve been talking about that graveyard of literary tension and promoter of telling rather than showing, the Short Road Home. The SRH haunts novel and memoir submissions in a variety of disguises. Oh, it’s a versatile narrative trick, easily applied to a broad range of manuscript environments; it is as proficient at strangling burgeoning character development as it is at draining the tension out of a scene.

Most often, of course, it manifests as a scene or plot that resolves conflict practically the nanosecond it appears — astonishingly often without any effort whatsoever on the part of the protagonist. Indeed, conflict-avoidance is so popular amongst fictional characters that protagonists tend to avoid resisting the status quo even in their minds.

Oh, you may laugh, but you’d be surprised how often those of us who read for a living will watch, stunned, as a protagonist briefly considers perhaps maybe eventually doing or saying something — only to be interrupted by another character rushing in to prevent even the thought of discord from developing into something that might be interesting for the reader to watch. The swiftness with which these tension-averse white knights dispatch nascent conflict is sometimes downright eerie, begging the question: is this character a participant in this story, or is he reading it?

You’d like a concrete example, wouldn’t you? We aim to please.

If only I had the courage to speak up, Tyrone thought, seething. I’ve put up with my repressive boss’ arbitrary pronouncements for years. Maybe today is the day I should stop being a doormat. Maybe today is the day I shall start speaking up for myself. Maybe today is…

“Oh, and before we end the meeting,” Artie said, smoothing his notes, “I’ve been sensing some disgruntlement in the face of our recent reorganization. Perhaps I’ve been a trifle, well, if not insensitive, then at least myopic. I’d like to hear your concerns, though.” He turned to Tyrone. “I’ve always valued your opinion, Ty. How do you think we could improve our beloved department?”

Beaming, Tyrone wrestled a binder stuffed with suggestions from his backpack. “I thought you’d never ask!”

Ah, but the reader wishes you hadn’t asked, Artie. Characters who read one another’s minds are notorious tension-deflaters.

They are also prone to cutting off plot possibilities before they have a chance to do more than poke their wary heads above ground. Had Artie not magically deduced his employees’ irritation from some clue that the narrative has not elected to share with the reader — if, in other words, the conflict were shown by any means other than Tyrone’s thoughts telling us about it — maybe then Tyrone would have had to take the longer, more arduous road of addressing the problem by — wait for it — addressing the problem. As in out loud, in a manner that might have provoked an interesting, true-to-life scene.

We met another favorite guise of the Short Road Home in my last post: telling a story out of chronological order, drowning any possible suspense about the outcome of a conflict by revealing it at the beginning of the scene, rather than the end. Even if the foreshadowing is vague, it can sap the reader’s impetus to wonder what is going to happen next — a pity, really, as its purpose is ostensibly to raise suspense.

All too soon, our happy mood vanished, ruining the rest of the day. If I’d known what was going to happen next, I would have grabbed the oars and rowed like mad for the shore.

“Where’s the sun gone?” Barbara asked suddenly.

Meg’s hat blew off before she could reply. “The sky looks mighty ominous. I’d always thought that the clouds spelled DOOM was just an expression.”

I pointed a shaking finger over the side. “Are those sharks?”

Even minor chronology-surfing can lead to confusion. Since — chant it with me now, long-term readers — unless the narrative specifically states otherwise, events are presumed to occur in the order they appear on the page, what may appear to the writer as just a little creative sentence restructuring may genuinely muddy the reader’s conception of what’s going on.

How? By inverting cause and effect temporally. Compare, for instance, this inadvertently time-traveling piece of prose:

Horrified, James jumped backward as Fred took a swing at him. He narrowly avoided being grabbed by George’s flailing hands. Wincing at the pain, he managed to spot and catch Bob’s crowbar before it connected with the side of his head.

With this more straightforward narration, in which cause precedes effect and our hero does not react before he perceives a threat:

Abruptly, Fred took a swing at him. Horrified, James jumped backward, practically into George’s flailing hands. As he veered under the large man’s arm, he spotted Bob wielding a crowbar. He managed to catch it just before it connected with the side of his head. His palm exploded with pain.

Much clearer, is it not? It’s also less of a Short Road Home: the reader is not told up front that something that has not yet occurred on the page will cause our hero to wince with pain.

Sometimes, though, a writer’s effort to make a series of actions clear can also send the narrative sliding down the Short Road Home. The pros like to call this over-explaining, for reasons I hope the next example will render obvious.

Darlene took a deep breath, so she could speak at length. This was taking a surprising amount of explanation. “It’s over, honey.”

Morgan’s eyes filled with tears. Confusion suffused his soul as he struggled to plumb her meaning. “But I don’t understand!”

He honestly didn’t. His perplexity continued even after Darlene’s quiet, “How is that possible, after the last hour and a half of conversation?” He just couldn’t wrap his mind around what she was trying to say. Was there a subtext here? Was it a subtle joke? Why was she telling him this now? Had there been a series of clues he had not caught, and if so, would she merely get angrier if he asked her for an itemized list?

He reached for his notebook, so he could consult his notes from the previous hour. “You’ll have to explain this to me again. What is it you’re trying to say?”

A tad redundant, is it not? Again, over-explanation is typically a show, don’t tell problem: by swamping the character-revealing and plot-resolving action with a welter of extraneous explanation, not only is the pacing slowed, but the central point of the scene (in this case, Morgan’s refusal to accept a painful rejection) gets a bit lost. So while all of that repetitive bottom-lining of his emotional state may have seemed to the author like necessary clarification, naming those emotions rather than showing them renders the scene less effective.

Too-heavy explanations are also, as we discussed in this series, rather insulting to Millicent’s intelligence. I’ve wrested a bit of comedy from Morgan’s cluelessness in order to make this scene more fun to read, so the distrust of the reader’s ability to draw quite obvious conclusions about a fairly straightforward situation may not have leapt out at you. It’s not as though this scene deals with unfamiliar concepts, however; for most readers, confusion — and, by extension, denial — don’t really require much introduction.

Don’t believe me? Okay, here’s that same scene again, allowing the characters’ actions and feelings to speak for themselves.

Darlene took a deep breath. “It’s over, honey.”

Morgan’s eyes filled with tears. “But I don’t understand!”

Her grip tightened on the back of the chair so much that her knuckles grew white, but she held her voice quiet. “How is that possible, after the last hour and a half of conversation?”

Every inch of his intestines quivering, he reached for his notebook, so he could consult his notes from the previous hour. There had to be a way to talk her out of this, but how?

“I just want to make sure I get your reasoning.” He measured the time between his words with care: long enough to buy him some time, not long enough to make her want to leap into his mouth with pliers to drag the reluctant syllables out. “You know, so I can explain your departure to our friends.”

Axing all of that extraneous explanation certainly bought some room for character development, didn’t it? That would come as a surprise to most Short Road Home-wielders, I suspect: the urge to summarize tends to be a side effect of the impulse to speed things up. Or, as is often the case in these decadent days of relatively low word counts — can you imagine, say, THE WORLD ACCORDING TO GARP, if it had to be cut down to under 100,000 words before it could be marketed? — of a clawing, terrified need to slash fifty pages from an over-long manuscript.

Besides, as we have just seen, summarizing emotional turmoil, that oh-so-common manifestation of the Short Road Home, just isn’t as effective on the page as demonstrating it through specific feelings, actions, and thoughts. Not merely labeling the emotion in question, mind you — Jack was sad is not, after all, a particularly evocative description — but showing it in detail and trusting the reader to draw the correct conclusion about Jack’s emotional state.

He roved listlessly around the living room, straightening a grinning china dwarf here, making sure a magazine’s edge was exactly parallel to the edge of a table there. Calla might not be alive to notice anymore, but that was no reason to relax her standards. Someday, a guest might stop by, as they did in the old days.

You wouldn’t want to convey the impression that Millicent is intellectually incapable of extrapolating as self-evident a conclusion as Jack was sad from that little gem, would you?

The SRH’s almost magical ability to minimize the emotional impact of a moment is not limited to tragedy, either; as we saw in the Pet Peeves on Parade series, over-explanation’s ability to declare a joke dead on arrival is legendary. Less discussed amongst writers but equally pernicious, skating too quickly past the comic constituent parts of a potentially funny scene can also be fatal to humor.

Just as suspense is more effective if the reader has time to absorb the ambient threat and imagine a negative outcome before something bad happens — Alfred Hitchcock was apparently fond of saying that the best way to render a scene that will end with an explosion was not to show the participants running around in terror of the imminent bang, but to let the audience know a bomb was concealed under a table, then let them squirm while a couple of characters who have no idea they’re dining atop a bomb chat about something else entirely — a funny build-up tends to have more impact if the reader has a chance to appreciate a series of amusing details.

Premises in particular are susceptible to death by Short Road Home. Take a gander:

Surely, nobody would care if she took just one apricot from that beautiful pile. Gerri reached out, grabbed a small one near the bottom — and then the entire pyramid disintegrated, sending fruit flying everywhere.

Now, this might have been funny, had it been fleshed out a bit more. Indeed, it wouldn’t be particularly difficult for a comedy-minded reader to picture what probably happened here in hilarious detail, based upon this scant description. But it’s not the reader’s job to contribute material to a book’s humorous scenes; it’s the writer’s job to write them so that they are funny.

Surprisingly often, simply drawing out the suspense will make an SRH scene funnier. Let’s apply the Hitchcock principle to poor Gerri’s plight.

The largest pile of fruit she had ever seen loomed before her, five feet high if it was an inch and nearly as broad at the base. Each perfectly ripe apricot selflessly offered a flawless furry cheek to the public, an arc of delicious roundness identical to its neighbor. She leaned forward to examine it more closely, convinced that the fruit must be fake.

A slap of unmistakable sweetness assured her nose that her brain was dead wrong. She had to force herself not to plunge her face into the wall of fruit.

She circled the display, running her fingers as close to the base as she dared. The Great Pyramid of Giza could hardly have been arranged with greater care, but Gerri felt this was an even greater human achievement: presumably, the ancient wonder had taken years; judging by the heady aroma, this must have been the work of a single breathless hour. She could not even begin to imagine the bravery it must have taken to place that last crowning apricot, the cherry on the top of the world’s most precariously-constructed sundae.

Her mouth was watering; clearly, it had been a mistake to swoop in for a sniff. A reasonable adult would simply have accepted that the pyramid was what the sign next to it said it was — the Arabella County 4-H Club’s summer project, an attempt to beat a three-decade-old youth timed fruit-piling record — and moved on. A reasonable adult, however, would not have been forcibly deprived of stone fruit for the last two years by a husband who wouldn’t have known a vegetable had it leapt into his mouth of its own accord, screaming, “Eat me, Harold!”

She slipped around back, where theft would be least likely to be noticed, and selected her prey. A smallish one, with a dent in it so minuscule that Sherlock Holmes might have missed it. Millimeter by millimeter, she edged it out of its space in the middle of the tenth row.

She slipped it into her pocket, cool and damp, just before a fresh group of State Fair-goers stampeded into the room. “Isn’t it magnificent, Ma?” a little girl with pigtails gushed.

Palpitating but proud, Gerri wove her way through the crowd. “Oh, excuse me,” she told a gawker. “I didn’t mean to step on your shoe.”

He waved away her concern a little too hard; Gerri had to duck to avoid his elbow. Her elbow knocked into something soft and prickly.

The Zucchini Through the Ages display went crashing to the ground, its hand-lettered signage squashing thirty squash into pulp in an instant. Small zucchini rolled under bystanders’ feet, sending strangers careening into one another. The first-place entry, a monster of eighteen pounds, flew straight into the stomach of a passing Girl Scout, forcing a stream of fairground goodies out of her astonished mouth. In the ensuing stampede, every single entry in the legume category was trampled beyond recognition.

Only the apricot mountain was spared. “I knew they must have glued it together,” Gerri muttered, slipping from the tent.

Oh, hadn’t I mentioned that comedy also benefits from a healthy dose of surprise?

No, but seriously, folks, after a lifetime of reading and a couple of decades of reading professionally, it’s my considered opinion that the overwhelming majority of aspiring writers don’t have a clear idea just how much the average reader enjoys savoring conflict — or how much more trivial an easily-solved problem appears on the page than one with which the protagonist must struggle for pages or chapters on end. Just as an Idiot Plot that is resolved the instant someone thinks to ask Aunt Joyce her ring size is less than dramatically satisfying, a plot resolved by a Short Road Home tends to leave readers feeling a trifle underfed.

They came for a full meal, you know, with many succulent courses. How could they not be disappointed when a narrative merely gives them a glimpse of a nicely-fried brook trout, then whisks it away untasted? Or when the waiter spends the whole meal boasting of the spectacular dessert, then brings out a single dry cookie for the entire table to share?

And that’s non-professional readers’ reaction; the pros are even more ravenous for luscious, richly-depicted narrative tension. Just because Millicent spends her days grazing upon query letters and munching on synopses doesn’t mean she wouldn’t be thrilled to have a feast come submission-reading time.

Please say you’ve grasped the concept, because this metaphor is beginning to whimper under its explanatory load.

An excellent place to start sniffing around for instances of the Short Road Home: when a narrative begins to stray close to stereotype territory. Why? Well, stereotypes thrive upon generalization, so when they rear their ugly heads, they tend to nudge the narrative toward summary statements, conclusions, and the like. Grounding a scene or argument in the specific has the opposite tendency.

Straying toward the general is particularly likely to occur in memoirs and novels where the writer is working overtime to make a character likeable — or always right. A character that is never wrong is, among other things, predictable; when predictability has pulled up a chair and seated itself in a scene, tension tends to take a flying leap out the nearest window.

Too theoretical? Okay, let’s take a peek at the offspring of one of the more common marriages of stereotype and Short Road Home: the troubled child of the protagonist, particularly if it’s a teenager.

At the very mention, Millicent has already started cringing in her cubicle in New York, I assure you. The TCoP crosses her desk so frequently in adult fiction and memoir that she can scarcely see a character in the 13-19 age range without instinctively flinching and crying out, “Don’t tell me — she’s going to be sullen.”

You’re quite right, Millicent — 99% of the time, she will be. And rebellious. Not to mention disrespectful, sighing, and eye-rolling.

Yes, troubled kids and teenagers across the land have been known to do these things from time to time — but remember what I said a few paragraphs back about predictability? When Millicent encounters the rare non-stereotypical teenager in a submission, it’s a red-letter day.

Do I sense some shifting in chairs out there? “Yeah, yeah,” I hear a few seasoned self-editors piping, “I already know to avoid stereotypes, because Millicent sees them so often and because the whole point of writing a book is to show my view of the world, not a bunch of clichés. What does this have to do with the Short Road Home?”

In practice, quite a bit: it’s very, very common for a narrative featuring a TCoP to expend considerable (and usually disproportionate) time explaining the kid’s behavior — and, often, justifying how the protagonist responds to it. Unfortunately, this rush to interpret not infrequently begins as early as the first scene in which the TCoP is introduced.

What might this look like on the first page of a manuscript, you ask? A little something like this — and see if you can catch the subtle narrative bias that often colors this stripe of the Short Road Home:

When hard-working Tom Carver opened his front door, arriving home late from work at the stuffed animal plant yet again, his daughter, Malia, was once again refusing to speak to him. Glaring at him silently with all of the dastardly sneer her fifteen-year-old face could muster, she played with her spiky, three-toned hair until the third time he had considerately asked her how her school’s field trip to the State Fair had gone.

“Like you care!” she exclaimed, rolling her eyes dramatically. She rushed from the room. Small chunks of what appeared to be zucchini flew from her hair onto the beautifully-swept floor.

The now-familiar sound of her slammed bedroom door ringing in his ears, he wandered into the kitchen to kiss his adored wife on her long-suffering cheek. “Criminy, I’m tired of that, Alice. Someday, all of that slamming is going to bring the house tumbling down on our heads. I’ll bet she hasn’t done even one of her very reasonable load of daily chores, either. Why did good people like us end up with such a rotten kid? I try to be a good father.”

Alice shook her head good-humoredly as she dried her wet hands on a dishtowel, slipped an apple pie in the oven, settled the home-made brownies more comfortably on their plate, and adjusted the schedule book in which she juggled her forty-seven different weekly volunteer commitments. “Well, Tom, she’s not a bad kid; she just acts like one. Malia’s felt abandoned since her mother, your ex-wife, stopped taking her bipolar medication and ran off with that bullfighter three months ago, totally ignoring the custody schedule we invested so many lawyers’ bills in setting up. She doesn’t have any safe outlet for her anger, so she is focusing it on you, the parent she barely knew until you gained the full custody you’d been seeking for years because you loved her so much. All you can do is be patient and consistent, earning her trust over time.”

Tom helped himself to a large scoop of the dinner he had known would be waiting for him. “You’re always right, Alice. I’m so lucky to have you.”

Well, I’m glad that’s settled. No need to read the rest of the novel, is there?

That’s a shame, because this story contains elements of a good character-driven novel. There’s a wealth of raw material here: a new custody situation; a teenager dealing with her mother’s madness and affection for matadors; a father suddenly thrust into being the primary caretaker for a child who had been living with his unstable ex; a stepmother torn between her loyalty to her husband and her resentment about abruptly being asked to parent a child in trouble full-time.

But when instant therapy sends us veering down the Short Road Home, all of that juicy conflict just becomes another case study, rather than gas to fuel the rest of the book. The result: a scene that might have shown the conflict (instead of telling the reader about it), provided interesting character development, or moved the plot along — but does none of the above.

In other words, it becomes a scene that the writer should consider cutting.

Effectively, the narrative’s eagerness to demonstrate the protagonist’s (or other wise adult’s) complete understanding of the situation stops the story cold while the analysis is going on. Not for a second is the reader permitted to speculate whether Malia’s father or stepmother had done something to provoke her response; we hardly have time even to consider whether Tom’s apparently habitual lateness is legitimate ground for resentment.

Again, that’s a pity. If only Tom had said, “You know, instead of avoiding conflict, I’m going to maximize it, to make things more interesting for the reader,” and gone to knock on Malia’s door instead of strolling into the kitchen for coffee and soporific analysis, we might have had all the narrative tension we could eat.

Heck, had the narrative just gone ahead and shown Tom and Alice being patient and consistent, earning Malia’s trust over the next 200 pages, the reader MIGHT have figured out, I think, that being patient and consistent is a good way to deal with a troubled teenager. But no: the subtle Short Road Home demands that the reader be told what to conclude early and often.

Whenever you notice one of your characters rationalizing in order to sidestep a conflict, ask yourself: am I cheating my readers of an interesting scene here? And if you find you have a Jiminy Cricket character, for heaven’s sake, write a second version of every important scene, a draft where he doesn’t show up and explain everything in a trice, and see if it isn’t more dynamic. Do this even if your book’s Jiminy Cricket is the protagonist’s therapist.

Especially if it’s the therapist. Millicent sees a lot of those.

If you are writing a book where the protagonist spends a significant amount of time in therapy, make sure that you are balancing two-people-sitting-in-a-room-talking scenes with scenes of realization outside the office. And make sure to do some solid character development for the therapist as well, to keep these scenes tense and vibrant.

If you are in doubt about how to structure this, take a gander at Judith Guest’s excellent ORDINARY PEOPLE, where most of the protagonist-in-therapy’s breakthroughs occur outside of the analyst’s office. The therapist appears from time to time, punctuating young Conrad’s progress toward rebuilding his life after a particularly grisly suicide attempt with pithy questions, not sum-it-all-up answers.

Hey, here’s a radical thought for revising a Short Road Home scene: what if you tinkered with it so your protagonist learns his lessons primarily through direct personal experience — or through learning about someone else’s direct personal experience told in vivid, tension-filled flashbacks?

Sound familiar? It should: it’s a pretty solid prescription for a narrative that shows, rather than tells.

Which, at the risk of wearing out some pretty time-honored writing advice, you should strive to do as often as possible — at least in your first book, where you really need to wow the pros. After you make it big, I give you permission to construct a plot entirely about a couple of characters sitting around talking, motionless.

But for heaven’s sake, leave that pyramid of apricots alone; it’s not as solid as it appears to be. Keep up the good work!

The Short Road Home, part IV: Tommy! Watch out for that bear lurking at the end of this post! Tommy!

I can’t quite decide whether I am profoundly sorry or oddly pleased that I’ve been digressing from our series-within-a-series on the Short Road Home, my pet name for a storyline that introduces a conflict only to resolve it immediately, sometimes before the reader has a chance to register that the problem raised is at all serious. Yes, too-swift fixes make it harder for the reader to root for the protagonist — or, when faced with a truly galloping case of SRH, to perceive any build-up of narrative tension at all — but since authorial distrust of readers’ attention spans often underlies these apparently self-solving problems, perhaps jumping around between topics has been appropriate.

Those of us who read for a living, however, may be trusted to have attention spans longer than that of a third grader hopped up on a quart of cola and half a dozen brownies. Oh, our old pal, Millicent the agency screener, may be conditioned to reject most manuscript submissions on page 1, but once she gets into a story, she, like any other reader, wants to see it played out in a satisfying manner.

That seems to be news to an awful lot of submitters, however. You’d be amazed at how often it is not small, potentially character-revealing conflicts that are resolved practically as soon as they appear on the page, but major ones. In book openings, it’s not even all that uncommon to use one of these near-momentary crises as a clumsy means of introducing necessary backstory, as the following sterling piece of dialogue illustrates.

“It’s gone!” Marvin scrabbled around frantically in the dry grass next to his sleeping bag, careless of the rattlesnake producing marimba rhythms on its tail a scant yard away. “My beloved late great-great-grandfather’s pocket watch!”

Antoinette gasped. “Not the one traditionally passed from dying father to eldest son for a century and a half, and entrusted to you by your father on his deathbed not four weeks ago?”

“The same.” A silver disk flew through the air at his head, glinting in the firelight. “Why, here it is! Where did it come from?”

The sleeping bag on the far side of the fire jackknifed. Jesse’s red face peered out of the opening. “You dropped it three hours ago. I was waiting for you to notice.”

Marvin flung his arms around Antoinette. “My legacy is safe!”

“What kind of idiot brings an heirloom mountain climbing?” Jesse muttered, trying to regain a comfortable position.

Yes, this is Hollywood narration — all three characters are already aware of the significance of the watch, so the only conceivable motivation for Antoinette and Marvin to explain it to each other is so the reader can hear what they say, right? — but you must admit, it is a darned efficient means of shoehorning the watch’s importance to Marvin into the story. It might not even come across as heavy-handed, if the reader had time to absorb the loss, understand its significance through Marvin’s reaction, and gain a sense of what might happen if the watch were never found.

But here, the darned thing reappears practically the instant Antoinette finishes filling the reader in about it, killing any possible suspense before it’s had time to build. Does that strike you as a narrative strategy likely to entrance a professional reader? Or is it likely to seem like the Short Road Home to anyone with an attention span longer than a drunken gnat’s?

Leaving aside for the moment the burning question of whether a gnat could be trained to hold its liquor, let’s consider how much more annoying this narrative strategy would be if (a) it were used frequently throughout the story, (b) it were in fact the primary tactic for introducing conflict into the story, and/or (c) the conflict in question were one that had been hyped throughout the book as central to the protagonist’s personal journey.

Yes, you did read that last bit correctly, campers. You would be stunned at how frequently Millicent sees a manuscript’s central conflict diverted to the Short Road Home. Often in the last chapter — or on the next-to-last page.

“Oh, Marv,” Antoinette moaned, cradling his bloody head, “you are so close to learning the truth about your family. Before you die, let’s look at that watch one more time.”

With effort, he fished it out of his pocket. The last rays of the sun illuminated its broad face. “Wait — I’ve never noticed that notch before. Maybe it has a false back.”

After the third time he dropped the watch, she put her deft fingers to work for him. “Why, you’re right. There’s been a piece of paper hidden back here all the time.”

She spread the paper two inches from his eyes. With difficulty, he made out the words. “Dear descendant: you will have heard all your life about a family curse. There really isn’t one; I just made it up to scare off competition from my gold mine. Please find attached the true map to your inheritance. Love, Marvin Bellamy the First.”

Suddenly, Marvin felt life once again suffusing his limbs. “Why, that’s the answer I’ve been seeking since we began this long, strange trek!”

Antoinette struggled to contain her annoyance. “And to think, if you’d only given that watch more than a passing glance after your father gave it to you, we wouldn’t have had to spend fifteen months hiking these mountains barefoot.”

“Oh, stop your moaning.” He sprang to his feet. “Your shoes didn’t wear out until month three. Let’s go find the gold mine — it’s only a few hundred yards away.”

“Um, excuse me?” Millicent asks politely. “Is there a reason that I had to read the 312 pages prior to this one? The entire plot has just been sewn up in seven paragraphs.”

Ah, but you should be grateful, Millie: at least this protagonist had to do something in order to send us careening down the Short Road Home. Granted, it wasn’t much; he simply had to manhandle his main prop a little to find his long-sought truth. As you know from experience, many a passive protagonist simply has another character hand the key to the plot to him on a silver platter.

The shadowy figure was closer now, bending over him. If this was Death, he certainly wore nice cologne.

Wait — he knew that scent. Hurriedly, Marvin wiped the dust from his eyes, but he still didn’t believe what they told him. “Dad? I thought you were…”

“Dead?” Marvin the Fifth chuckled ruefully. “No, not quite, son. That was merely the necessary push to aim you toward your legacy. Still got that watch?”

Marvin dug it out of his pocket. Snatching it, the old man cracked it in half.

“My inheritance!” Marvin screamed, horrified.

“Oh, it’s just a cheap knock-off.” Dad poked around in the shards. “But it contained this key to a safe-deposit box located twenty-two feet from this very spot. Come on, kid, let’s go claim your real inheritance. On the way, I’ll tell you all about your great-great-grandfather’s plan for making his descendants rich.”

“Do I have to walk?” Marvin whined. “I’m tired from all of that mountain-climbing.”

“Hello?” Antoinette shouted after the pair. “Remember me? The lady who has been carrying your backpack for the last 100 pages?”

Come on, admit it: Marvin, Jr. is not the only one who seems a trifle lazy here. This writer appears to have dropped a deus ex machina into this plot, having a new character waltz into the story at the last minute to explain away all of the remaining mystery, rather than engaging in the hard, meticulous work of setting up sufficient clues throughout the story for the protagonist to be able to solve it himself.

Like other forms of the Short Road Home, the external explainer is a tension-killer. It could have been worse, though: ol’ Dad could have popped up periodically throughout the story, making it clear to all and sundry that he could have filled Marvin in at any time, if he so chose. What a pity that Marvin was just too darned lazy — or dim-witted, or determined that this story would take 324 pages to tell — to ask the obvious question.

Oh, you laugh, but narrators effectively tease the reader in this manner all the time in both novel and memoir submissions, through the use of the historical future tense. The openings of chapters are particularly fertile ground for this sort of suspense-killing narration. Often mistaken for subtle foreshadowing, transitional statements like I was happy — but my illusions were about to be shattered forever actually minimize the tension to come.

How? Well, before the conflict even begins, the reader already knows the outcome: the narrator’s illusions will be shattered. She may not yet know the details, but you can hardly expect her to begin reading the next scene hoping for the best, can you?

Section-opening paragraphs that tell the reader how the scene is going to end before it begins are alarmingly ubiquitous. Sometimes, such foreshadowing is subtle:

Although I didn’t know it at the time, my days of wine and roses were soon to come to an end — and in a way that I could never have anticipated in a thousand years of constant guessing. How was I to know that every child only has so many circuses in him before he snaps?

When my great-uncle Cornelius came down to breakfast waving the circus tickets that Saturday in May, I couldn’t have been happier…

Sometimes, though, foreshadowing is so detailed that it more or less operates as a synopsis of the scene to follow:

My hard-won sense of independence was not to last long, however. All too soon, the police would march back into my life again, using my innocuous string of 127 unpaid parking tickets (hey, everyone is forgetful from time to time, right?) as an excuse to grab me off the street, throw me in the back of a paddy wagon, and drag me off to three nights’ worth of trying to sleep in a cell so crowded that the Black Hole of Calcutta would have seemed positively roomy by contrast.

It all began as I was minding my own business, driving to work on an ordinary Tuesday…

In both cases, the narrative is telling, not showing — and, even more troubling to writing rule-mongers, telling the story out of chronological order. The latter is generally a risky choice, because, let’s face it, unless you’re writing a book that features time travel, most readers will expect events to unfold in chronological order — or if not, for flashbacks to be well-marked enough that the reader never needs to ask, “Wait, when is this happening?”

For the sake of clarity, beginning a scene at the beginning and proceeding to the end without extensive temporal detours is the established norm. That’s why, in case any of you had been wondering, the frequent use of and then tends to annoy your garden-variety Millicent: unless a narrative specifically indicates otherwise, actions are assumed to have occurred in the order they appear on the page. I lost my footing and plunged into the water. And then the bear ate me, therefore, does not convey any more information to the reader than I lost my footing and plunged into the water. The bear ate me.

I hear some of you giggling. “Oh, come on, Anne,” lovers of conversational-style narration and/or run-on sentences protest. “I can see that and then might have been logically unnecessary here, but what’s the big deal about adding a couple of extra words?”

If they appear only once or twice in the course of a manuscript, they might not be a big deal. Given the extreme popularity of chatty-voiced narration, however, and the common conception that first-person narration peppered with conversational conjunctions is a valid reflection of everyday speech, Millicent sees an awful lot of and thens in a work day. Often, more than once on a single page. Or within a single paragraph.

You might want to give it a rest. I’m just saying.

Back to the benefits of telling a story in chronological order, rather than skipping around in time. Showing events in the order they occurred renders maintaining narrative tension easier, particularly in first-person narration: the reader may be safely left in the dark about surprising developments until they’re sprung upon the narrator, right?

Let’s face it, though, if the reader already knows what is going to happen before a scene begins, the temptation to skim or even skip the recap can be considerable. Particularly, say, if the reader in question happens to be a Millicent trying to get through a hundred submissions in an afternoon. Maybe she should run out and grab a latte to perk herself up a little…

All of which is to say: if you were looking for a good place to begin trimming a manuscript, running a quick scan for the historical future tense might be a dandy place to start. Often, such opening paragraphs may be cut wholesale with little loss to the overall story. Ditto with premature analysis.

Oh, wait: I’m foreshadowing — and to render it even more confusing, I’m doing it by jumping backwards in time. The last time I addressed this topic, a reader wrote in to ask:

I’m assuming that it’s still okay to occasionally employ the historical future (foreshadowing) comments, as long as we don’t prematurely spill the beans…or choke on them…in our rush to analyze, yes?

That’s an interesting question. So much so that I strongly suspect that if this reader had asked it at a literary conference, agents and editors would glance at one another sheepishly, not wanting to generalize away the possibility that a writer in the audience could wow ‘em with foreshadowing, and then fall back on that time-worn industry truism, it all depends upon the writing.

Which would be precisely true, yet not really answer the question. But did you notice how gratuitous that and then was?

To address it head-on, let’s take another gander at our last two examples. In a novel or a memoir, a writer could probably get away with using the first, provided that the story that followed was presented in an entertaining and active manner.

Yes, Example #1 does provide analysis of action that has not yet happened, from the reader’s point of view — and doesn’t it make a difference to think of a foreshadowing paragraph that way, campers, instead of as a transition between one scene and another? — but it does not, as our questioner puts it, spill the beans. The reader knows that something traumatic is going to happen, and where, but not enough about either the event or the outcome to spoil the tension of the upcoming scene.

In Example #2, by contrast, not only does the narrative announce to the reader the specifics of what is about to occur — told, not shown, so the reader cannot readily picture the scene, and thus revisiting it will seem dramatically necessary — but it also shoves the reader toward an interpretation of the events to come. After such a preamble, we expect to be outraged.

Which, too, is a dangerous strategy in a submission: such an introduction raises the bar for the scene that follows pretty high, doesn’t it? If a text promises Millicent thrills and doesn’t deliver them, she’s not going to be happy. Or impressed. Frankly, though, if she’s already in a touchy mood — how many times must the woman burn her lip on a latte before she learns to let it cool before she takes a sip? — the mere sight of the historical future might set Millicent’s teeth on edge, causing her to read the scene that follows with a jaundiced eye.

Why, you ask? The insidious long-term result of repetition — because writers, unlike pretty much everybody else currently roaming the planet, just LOVE foreshadowing. The historical future makes most of us giggle like schoolgirls tickled by 5000 feathers.

As with any device that writers as a group overuse, it’s really, really easy to annoy Millicent with the historical future. Especially if she happens to work at an agency that handles a lot of memoir, where it’s unusual to see a submission that doesn’t use the device several times within the first 50 pages alone.

Heck, it’s not all that uncommon to see it used more than once within the first five. By the end of any given week of screening, poor Millie has seen enough variations on but little did I know that my entire world was about to crumble to generate some serious doubt in her mind about whether there’s something about writing memoir that causes an author to become unstuck in the space-time continuum on a habitual basis.

Which, in a way, we do. Since memoirs by definition are the story of one’s past, really getting into the writing process can often feel a bit like time-travel. After all, how else is a memoirist going to recall all of those wonderfully evocative telling details that enlivened the day a bear ate her brother?

Tell me honestly: as a reader, would you rather see that bear jump out of the underbrush and devour bratty little Tommy twice — once before the scene begins, and once at its culmination — or only once?

Or, to put it another way, would you prefer to know that Tommy is going to be a carnivore’s dinner, so you may brace yourself for it? Or would you like it better if the scene appeared to be entirely about the narrator and Tommy bickering until the moment when the bear appears — and then have it devour him?

If you’re like most readers — and virtually all professional ones — nine times out of ten, you would pick the latter. And for good reason: genuine suspense arises organically from conflict between the characters as the story chugs along. A surprise that you’ve known was coming for two pages is obviously going to startle you less than one that appears out of nowhere.

Foreshadowing is the opposite tactic: it tells the reader what to expect, dampening the surprise. It’s hard to do without spoiling future fun. All too often, what the writer considers a subtle hint informs the reader that a shock is to come in such explicit terms that when the shock actually occurs, the reader yawns and says, “So?”

That’s a pretty high price to pay for a transitional sentence or two that sounds cool, isn’t it?

Not all foreshadowing utilizes the historical future tense, of course, but it’s not a bad idea to get into the habit of revisiting any point in the manuscript where the story deviates from chronological order for so much as a sentence. Or even — and revising writers almost universally miss this when scanning their own works — for half a sentence.

Why? Well, from a reader’s perspective, even that brief a Short Road Home can substantially reduce a scene’s tension. Take, for example, this fairly common species of scene-introducing prose:

On the day my brother Jacques shocked us all by running away from home, I woke with a stomachache, as if my intestines had decided to unravel themselves to follow him on his uncertain road, leaving the rest of my body behind.

Assuming that the reader had gleaned no previous inkling that Jacques might be contemplating going AWOL, what does the narrative gain from opening with the scene’s big shocker? Yes, announcing it this way might well evoke a certain curiosity about why Frère Jacques departed, but why not let the reader experience the surprise along with the family?

Taking the latter tack would not even necessarily entail losing the dramatic effect of foreshadowing. Take a look at the same scene opener without the spoiler at the beginning of the first sentence:

I awoke with a stomachache, as if my intestines had decided to unravel themselves to follow an uncertain road behind the Pied Piper, leaving the rest of my body behind. If this was what summer vacation felt like, give me six more weeks of school.

Mom burst into the room with such violence that I cringed instinctively, anticipating the obviously unhinged door’s flying across the room at me. “Have you seen Jacques? He’s not in his room.”

More dramatic, isn’t it? Starting off with a description of a normal day and letting the events unfold naturally is a more sophisticated form of foreshadowing than just blurting out the twist up front.

Not to mention closer to the way people tend to experience surprises in real life — as a manifestation of the unexpected.

That may seem self-evident, but as Millicent would have been the first to tell you had I not beaten her to the punch, few manuscript submissions contain twists that actually surprise professional readers. Partially, as we discussed earlier in this series, this is the fault of the pervasiveness of the Idiot Plot in TV and film, of course, but it also seems that many aspiring writers confuse an eventuality that would come out of the blue from the point of view of the character experiencing it with a twist that would stun a reader.

Again, it all depends upon the writing. (Hmm, where have I heard that before?) At the risk of espousing a radical new form of manuscript critique, I’m a big fan of allowing the reader to draw her own conclusions — and of trusting her to gasp when the story throws her an unanticipated curve ball. After all, it’s not as though she has the attention span of a gnat, drunken or otherwise.

Unfortunately, many aspiring writers apparently don’t trust the reader to catch subtle foreshadowing; they would rather hang up a great big sign that says, HEY, YOU — GET READY TO BE ASTONISHED. That in and of itself renders whatever happens next less astonishing than if it came out of the proverbial clear blue sky.

I’m sensing some disgruntlement out there. “But Anne,” some of you inveterate foreshadowers call out, “what you say about real-life surprises isn’t always true. Plenty of people experience premonitions.”

That’s quite true, disgruntled mutterers: many folks do feel genuine advance foreboding from time to time. Others cultivate chronic worry, and still others apply their reasoning skills to the available data in order to come up with a prediction about what is likely to occur.

Do such people exist in real life? Absolutely. Should one or more of them be tromping around your manuscript, bellowing their premonitions at the tops of their gifted lungs? Perhaps occasionally, as necessary and appropriate, if — and only if — their presence doesn’t relieve the reader of the opportunity to speculate on her own.

In fact, a great way to increase plot tension in a story featuring a psychic character is to show him being wrong occasionally. Mixes things up a bit for the reader. But — correct me if I’m wrong — in real life, most of us don’t hear giant voices from the sky telling anyone who might happen to be following our personal story arcs what is going to happen to us twenty minutes hence.

To those of you who do habitually hear such a voice: you might want to consult a reputable psychiatrist, because the rest of us don’t lead externally-narrated lives. There’s an excellent chance that six-foot rabbit who has been giving you orders is lying to you, honey.

If we were all subject to omniscient third-person narration at the most startling moments of our lives, Tommy wouldn’t have let that bear get the drop on him, would he? Unfortunately for his future prospects, as handy as it would have been had a talking vulture been available to warn him about the nearby hungry beast, that doesn’t happen much in real life.

But if you do find that your life starts being narrated on the spot by a talking vulture, by all means seek some professional help.

Speaking of professional help: from a professional reader’s point of view, heavy-handed foreshadowing on the page is rather like having a tone-deaf deity bellow driving instructions from a low-hanging cloud bank. Yes, that constant nagging might well cause Millicent to avoid driving into that rock five miles down the road — but, time-strapped as she is, I’m betting that the warning is more likely to convince her to stop driving on that road altogether, rather than hanging on for the now-predictable ride.

Okay, so that wasn’t one of my better metaphors; darn that pesky vulture for distracting me. Keep up the good work!

Pet peeves on parade, part XX: but people really talk that way! revisited, or, what’s up, Doc?

All right, I’ll ‘fess up: last time, I broke one of the cardinal rules of blogging. In Thursday’s post, I blithely signed off with I shall continue to wax poetic on this subject tomorrow. But tomorrow came and went, and so did Saturday. In my defense, I might point out that I stayed away from my keyboard in deference to another cardinal rule of blogging, thou shalt not post whilst feverish. But honestly, with the nastiness of this year’s Seattle Spring Cold (contracted, typically, by rushing out into the elements the nanosecond sunshine breaks through threatening deep-gray cloud cover, madly stripping off the outer layers of one’s clothing and shouting, “Sun! I thought you had forsaken us!”), I might have predicted that tomorrow would see a spike in temperature.

Unless, of course, I was feverish when I wrote the tomorrow bit. Rather than send all of us hurtling down that ethical rabbit hole, I’m just going to tender my apologies and move on.

Or, to be precise, move laterally. I’m taking a short detour from the Short Road Home series — which, as those of you keeping track will recall, was itself a digression from our ongoing Pet Peeves on Parade series — to guide you past a cautionary tale or two. Dropping that increasingly tortured set of compound analogies like the proverbial hot potato, let me simply say that the inspiration for today’s post came, as is so often the case, from the muses stepping lightly into my everyday life to provide you fine people with illustrations of writer-friendly truths.

Thank the nice ladies, please. Where are your manners?

Perhaps I am constitutionally over-eager to put a happy-faced spin on things — my first writing group did not nickname me Pollyanna Karenina for nothing — but I have been thinking for months that one of the many advantages stemming from my long-lingering car crash injuries has been the opportunity (nay, the positive necessity) to have extended conversations with a dizzying array of medical practitioners, insurance company bureaucrats, and folks waiting around listlessly for their dreaded appointments with one or the other. Everyone has a story to tell, and I’ve been quite surprised at how minuscule a display of polite interest will trigger a vivid telling.

Oh, I had expected to encounter an eagerness to swap stories in fellow accident victims — those of you scratching your heads over constructing a pitch for an upcoming conference would do well to spend some time in medical waiting rooms, gleaning summarization technique; the average person-on-crutches can deliver a gripping rendition of how she ended up that way in thirty seconds flat — but you’d be astonished at how readily even the seemingly stodgiest paper-pusher will open up if one asks a few friendly questions. After, of course, getting over his surprise that someone would treat a professional conversation as, well, a conversation.

Admittedly, I am notorious for interviewing people trying to interview me; I’ve seldom walked into my first day on a job in ignorance of what my new boss wanted to be when she grew up, the kind of poetry she wrote in high school, and/or the full details of the time that her beloved terrier, Pepper, got his front right paw caught in that barbed wire fence running along mean Mr. Jones’ alfalfa field. (Mr. Jones’ neighbors, the Heaths, were chronically inept at fencing in their pet pygmy goats, you see.) One never knows where good, fresh material may be found, after all. And having grown up helping authors prepare for interviews and Q&A sessions at book readings, I know from long experience that one of the best ways to be a scintillating interviewee is to learn something about the interviewer.

So on Feverish Friday, after extracting from my chiropractor the exciting story of how his grandfather immigrated by himself from Hungary at age 12, just in time to avoid World War I, and egging on his receptionist as she tried to top his tale with her great-grandparents’ 1880s sea journey from Ireland to Brazil, then around the southernmost tip of South America to San Francisco to establish a community newspaper — isn’t it fascinating how practically every American has at least one forebear with a genuinely harrowing immigration story or a deeply disturbing how-the-federal-troops-displaced-us-from-our-land story? — I hobbled into my next appointment, all set to glean some interesting dialogue.

Why dialogue, you ask? Having been seeing, as I mentioned, an impressive array of practitioners over the last ten months, I had begun to notice certain speech patterns. Doctors, for instance, tend to speak largely in simple declarative statements, with heavy reliance upon the verbs to be and to have, but light on adjectives and adverbs. Frequently, they will lapse into Hollywood narration during examinations, telling the patient what ordinary logic would dictate was self-evident to both parties and asking softball questions to which simple observation might have provided an answer.

By contrast, patients often positively pepper their accounts with descriptors. Although most of their sentences are in the first person singular (“I seem to have misplaced my leg, Doctor.”), they frequently back off their points when faced with medical jargon. They also tend to echo what the doctor has just said to them, as a means of eliciting clarification.

Weren’t expecting that sudden swoop into dialogue-writing theory, were you? I’ll pause a moment, to allow you to whip out your Fun with Craft notebooks.

In the right mindset for some textual analysis now? Excellent. Let’s see what the speech patterns I described above might look like on the manuscript page.

“Let me take a look.” Dr. Ferris poked around her kneecap, nodding whenever she screamed. “Does that hurt?”

“Tremendously,” she whimpered.

That may have been a vague answer, but it apparently deserved a note on her chart. “You have a dislocated knee, Georgette. It is bent at a peculiar angle and must be causing a lot of pain. It will have to be put back into place.”

“What do you mean, back into…”

The wrench knocked her unconscious. When she awoke, her entire leg on fire, a piece of paper was resting on her stomach.

The doctor smiled at her reassuringly. “You will be in pain for a while. I have written you a prescription for painkillers. Take it to a pharmacy and have it filled.”

Hard to imagine that most of these statements came as much of a surprise to Georgette, isn’t it? She may not have the medical background necessary to diagnose a dislocated knee (although the doctor’s dialogue might have been substantially the same if she had, with perhaps a bit more medical jargon tossed in), but surely, she was already aware that the bottom and top halves of her leg were not connected in their habitual manner. Nor, one suspects, was she astonished to hear that she was in pain, or that prescriptions are filled at a pharmacy.

Yet this rings true as examination-room dialogue, does it not, despite an almost complete absence of medical terminology? That would come as a shock to most aspiring novelists writing about this kind of professional interaction: in manuscript submissions, doctors tend to spout medical lingo non-stop, regardless of context.

Stop laughing — it’s true. Whether they are in a hospital or in a bar, at the beach or at a funeral, fictional doctors often sound like they’re giving a lecture to medical students. Similarly, fictional lawyers frequently use terminology appropriate to closing arguments in a murder trial while ordering a meal in a restaurant, fictional professors apparently conduct seminars on Plato at cocktail parties, and fictional generals are incapable of speaking to their toddlers in anything but terse, shouted commands.

Okay, so that last one was a bit of an exaggeration, but you’d be surprised at how often Millicent the agency screener is faced with manuscripts in which professional credentials are established purely through a liberal dose of jargon.

Why is that problematic? Since your garden-variety Millie not only went to college with people who went on to become doctors, lawyers, professors, and the like, but may well have parents or siblings who pursue those vocations, it’s likely to give her pause when characters spout professional-speak in non-professional contexts. To her, those characters are likely to seem either unrealistic — a scientist who spoke nothing but shop talk around non-scientists would have a difficult time socially, after all — or monumentally insecure, because, let’s face it, well-adjusted doctors, lawyers, professors, and/or generals don’t really need to keep reminding bystanders of their standings in their respective fields. Or indeed, to keep reminding them what those fields are.

However, to writers not lucky enough to have spent much time around professionals in the fields about which they are writing — the non-medically-trained writer whose protagonist is a doctor, perhaps, or the non-cook whose mystery takes place in a restaurant — jargon may appear to be the primary (or only) means of demonstrating a character’s credibility as a member of that profession. Dropping some jargon into dialogue is certainly the quickest way to suggest expertise to the non-specialist: as most readers will not be intimately familiar with the actual day-to-day practices of, say, a diamond cutter, including a few well-defined diamond-cutting terms into a gem-handling character’s dialogue during scenes in which s/he is discussing jewelry might add quite a bit of verisimilitude.

Oh, you were expecting a concrete (or perhaps rock-based) example? Ah, but I follow the well-known writing precept write what you know — and its lesser-known but equally important corollary, do not write about what you don’t know — and if you must write about something outside your area of expertise, do a little research, already.

Okay, it’s a mouthful, but it’s fine advice, nevertheless. Because I know next to nothing of diamond-cutting and its lingo, it’s a good idea for me not to attempt a scene where a character’s credibility hangs on her expertise in gemology. It also would not necessarily make the scene ring any truer to those who do know something about the field if I invested all of twenty minutes in Googling the field, lifted four or five key terms, and shoved them willy-nilly into that character’s mouth.

Which is, alas, precisely what aspiring dialogue-constructors tend to do to characters practicing medicine for a living. Let’s invade poor Georgette’s appointment with Dr. Ferris again, to see what the latter might sound like if we added a heaping helping of medical jargon and stirred.

“At first glance, I’d say that this is a moderate case of angulation of the patella.” Dr. Ferris poked around her kneecap, nodding whenever she vocalized a negative response. “You’re a little young for it to be chondromalacia. Does that hurt?”

“Tremendously,” she whimpered.

“Lateral subluxation.” That apparently deserved a note on the chart. “You see, Georgette, if the displacement were in the other direction, we might have to resort to surgery to restore a more desirable Q-angle. As it is, we can work on VMO strength, to reduce the probability of this happening again. In the short term, though, we’re going to need to rebalance the patella’s tracking and more evenly distribute forces.”

“What do you mean, rebalance…”

The wrench knocked her unconscious. When she awoke, her entire leg on fire, a piece of paper was resting on her stomach.

Rather than focusing on whether a doctor might actually say any or all of these things — some would get this technical, some wouldn’t — let me ask you: did you actually read every word of the jargon here? Or did you simply skip over most of it, as many readers would have done, assuming that it would be boring, incomprehensible, or both?

While we’re at it, let me ask a follow-up question: if you had not already known that Georgette had dislocated her knee, would this jargon-stuffed second version of the scene have adequately informed you what had happened to her?

For most readers unfamiliar with knee-related medical terminology (and oh, how I wish I were one of them, at this point), it would not. That’s always a danger in a jargon-suffused scene: unless the text takes the time to define the terms, they often just fly right over the reader’s head. Stopping the scene short for clarification, however, can be fatal to pacing.

“At first glance, I’d say that this is a moderate case of angulation of the patella.”

“Angulation?”

“It’s a mistracked kneecap.” Dr. Ferris poked around, nodding whenever she vocalized a negative response. “It must be. You’re a little young for it to be chondromalacia.”

Georgette was afraid to ask what chondromalacia was, just in case she wasn’t too young to get it. She should have asked, because unbeknownst to her, chondromalacia of the patella, the breakdown or softening of the cartilage under the kneecap, is quite common in runners.

A particularly vicious poke returned her attention to the doctor. “Does that hurt?”

“Tremendously,” she whimpered.

Slower, isn’t it? The switch to omniscient exposition (and judgmental omniscient exposition, at that) in the narrative paragraph shifts the focus of the scene from the interaction between the doctor and the patient to the medical information itself. Too bad, really, because the introduction of the jargon raises the interesting possibility of a power struggle between the two: would Georgette demand that Dr. Ferris explain what was going on in terms she could understand, or would she passively accept all of that jargon as unquestionable truth?

Oh, you thought that I was off my conflict-on-every-page kick? Never; passive protagonists are on practically every Millicent’s pet peeve list. Speaking of which, this latest version contained one of her lesser-known triggers. Any guesses?

If you immediately flung your hand into the air and cried, “I know, Anne! Paragraph 4 implied that Georgette had been thinking the entirety of the previous paragraph, rather than just its first sentence,” help yourself to a gold star out of petty cash. Coyly indicating that the protagonist is reading the text along with the reader used to be a more common narrative trick than it is today, probably because it no longer turns up in published YA so much, but that has not reduced the ire the practice tends to engender in professional readers.

“But Anne!” I hear some of you fond of 1970s-style YA narration protest. (You probably also favor the fairy-tale paragraph opening it was then that… , don’t you?) “I didn’t read Paragraph 4 that way at all. I just thought that the narration was cleverly acknowledging the time necessary for Georgette to have felt the fear expressed in the first sentence of Paragraph 3.”

Fair point, old-fashioned narrators, but why bother? Merely showing the thought is sufficient to indicate that it took time for Georgette to think it. Since that would have eaten up only a second or two, showing her so wrapped up in the thought (and, by implication, the sentence that follows, which she did not think) that it requires an external physical stimulus to bring her back to ordinary reality makes her seem a bit scatter-brained, doesn’t it? Combined with the echo of the doctor’s words in her first speech in Paragraph 2, the overall impression is that she is quite confused by a relatively straightforward interaction.

Generally speaking, the harder it seems for a character to follow the plot, the less intelligent s/he will seem to the reader. If the distraction had been depicted here as pain-related, it might make sense that someone else would need to remind her to pay attention to what’s going on, but this isn’t a particularly intense thought. Besides, it’s related to what the doctor is doing to her — why would she need to make an effort to think and feel simultaneously?

Speaking of character I.Q. levels, contrary to popular opinion amongst aspiring writers, the use of jargon will not necessarily make a doctor or character in a similar profession appear smarter. In fact, it may well make him seem less articulate: the clichéd fictional male nerd who has trouble speaking to real, live women (although such people tend to study and work beside real, live women every day, TV and movies have conveniently trained us to ignore that fact) is not, after all, a cultural icon for his communication skills. Intelligent people — at least, those who are not trying to impress others with their jargon-mongering — consider their audiences when choosing what to say; deliberately talking above one’s conversational partner’s head is usually indicative of a power trip of some sort.

Or rampant insecurity. Or both.

Yes, really. As a reader — and, perhaps more to the point, as someone who reads manuscripts for a living — if I encountered the last two versions of Dr. Ferris on the page, I would assume that I was supposed to think, “Wow, this doctor is a poor communicator,” rather than, “Wow, this doctor is knowledgeable.” I would assume, too, that the writer had set this up deliberately.

Why? Well, the heavy use of jargon emphasizes the power differential between these two people at the expense of the reader’s comprehension. Indeed, in the last example, Georgette’s reluctance to admit that she does not understand the terms seems to be there almost exclusively to add more conflict to the scene. As the jargon doesn’t seem to serve any other narrative purpose, what else could I possibly conclude?

Oh, you have other ideas? “Yes, I do, Anne,” those of you still slightly irritated by our wrangle over the proper interpretation of Paragraph 4 point out. “Some of us use jargon because, well, that’s the way people in the fields we’re writing about actually speak. There’s no understanding some of ‘em. By reproducing that confusion on the page, we’re merely being realistic.”

Ah, but we’ve discussed this earlier in the series, have we not? Feel free to pull out your hymnals and sing along, long-term readers: just because a real-life person in a fictional character’s position might say something doesn’t mean it will work on the manuscript page. The purpose of written dialogue is not, after all, to provide a transcript of actual speech, but to illustrate character, advance the plot, promote conflict — and, above all, to be entertaining to read.

By virtually everyone on earth’s admission, jargon from a field other than one’s own is not particularly entertaining to hear, much less read. Jargon is, by definition, exclusive: it’s meaningful only to those who know what it means.

That’s why in most published fiction, it’s kept to a minimum: since it’s safe to assume that the majority of readers will not be specialists in the same field as the character in question, merely sneaking in an appropriately avocation-specific term here or there will usually create a stronger impression of expertise than laying on the lingo with a too-generous hand.

And please, just to humor me, would everyone mind laying off the professor-who-can’t-stop-lecturing character for a while? I used to teach Plato, Aristotle, and Confucius at a major university, and I’ve been known to speak like a regular human being.

Case in point: go, Huskies!

See how annoying insider references can be? While that last bit may have brought a gleam of recognition to the eyes of those of you who live in the Pacific Northwest (or who are devoted to college football, women’s basketball, and/or cutting-edge cancer research), I would imagine that it left the rest of the Author! Author! community completely unmoved.

That’s precisely how readers who don’t get inside jokes in manuscripts feel. No matter how trenchant a reference may seem to those who happen to work within a particular industry, unless you plan for your book to be read only by people within that arena, it may not be worth including. At least not at the submission stage, when you know for a fact that your manuscript will need to gain favor with at least three non-specialist readers: Millicent, her boss the agent, and the editor to whom the agent will sell your book.

Oh, scrape your jaws off the floor. Few agents or editors — and, by extension, their screeners and assistants — can afford to specialize in novels or memoirs about a single subject area. The agent of your dreams may have represented a book or two in which a doctor was a protagonist, but it’s unlikely that she will sell nothing but books about doctors. Even a nonfiction agent seldom specializes to that extent.

It’s in your manuscript’s strategic best interest, then, for you to presume that virtually any professional who will read your book prior to publication will not be an expert in your book’s subject matter — and thus will not be a native speaker of any jargon your characters might happen to favor. Bear in mind that if Millicent says even once, “Wait — I’ve never seen that term used that way before,” she’s substantially more likely to assume that it’s a misused word than to recognize it as professional jargon.

Try thinking of jargon like a condiment: used sparingly, it may add some great flavor, but apply it with a too-lavish hand, and it will swamp the main course.

Interestingly, US-based aspiring writers have historically been many, many times more likely to employ the slay-‘em-with-jargon tactic in the dialogue of upper middle-class professional characters than in that of blue-collar workers. On the page, doctors, professors, and other beneficiaries of specialized higher education may flounder to express themselves in a social context, but plumbers, auto mechanics, coal miners, and longshoremen are apparently perfectly comfortable making the transition between shop talk and conversing with their non-specialist kith and kin. Unless Our Hero happens to be dealing with a particularly power-hungry plumber, the mechanic-who-turns-out-to-be-the-killer, or someone else pathologically intent upon establishing dominance in all situations, the writer is unlikely to resort to piling on employment-based jargon so that the character can impress a casual acquaintance.

To those of us who happen to have had real-world interactions with pathological plumbers, world domination-seeking appliance repair people, and, yes, doctors with poor communication skills who respond to their patients’ input by pulling rank, this seems like an odd literary omission. Professionals wielding expertise as a source of power are hardly rare in any field. Rather than taking the time to listen to an objection, consider whether it is valid, and either take steps to ameliorate the situation or explain in terms comprehensible to the layman why the objection is invalid, some specialists routinely dismiss the non-specialist’s concerns purely on the grounds that a non-specialist could not possibly understand anything.

Best leave it to the professionals, dear. Don’t worry your pretty little head about it.

According to this logic (at least as it runs in my pretty little head), not only must the non-specialist’s diagnosis of the problem be wrong — her observations of the symptoms must be flawed as well. Since there is by definition no argument the non-specialist can make in response, the professional always wins; the only winning move for the non-specialist is not to play.

Which is why, I suspect, the classic send-up of this situation still rings as true today as it did when it originally aired in 1969. Here it is, for those of you who have somehow managed never to see it before.