I return to you an injured warrior, campers: for the past few days, my keyboard has lain idle while I have been recovering from a viciously broken fingernail. I’ve been lolling around with my left hand elevated, muttering ruefully.
Were those giant guffaws I just heard rolling about the ether an indication that some of you would not consider this a debilitating injury? I defy anyone to type successfully while a significant part of the nail bed on the pointer finger so dear to those who use the hunt-and-peck method is protected from the elements by nothing but the largest Band-Aid currently available to the medical community. Or to touch-type with any accuracy whilst said Band-Aid extends that finger to clownish lengths. Should any writer out there not care if his intended Fs are 5s and his Ps plus signs, I have yet to meet him.
In the course of all of that enforced lolling, however, I had leisure to contemplate once again the burning issue of plausibility on the page. Now that I’m back, I’m going to fling it into your consciousness, too: honestly, if you encountered the story above on page 57 of a novel, would it seem remotely realistic to you?
To a reader either unfamiliar with the torrid history of my long, accident-prone nails or happily inexperienced in having their own nails violently bent back, I’m guessing it would not. I’m also guessing that would come as a surprise to some of you, because as anyone who reads manuscripts for a living can tell you, the single most common response to an editorial, “Wow, that doesn’t seem particularly plausible,” is an anguished writer’s cry of, “But it really happened!”
I can tell you now that to a pro like Millicent the agency screener, this argument will be completely unconvincing — and not merely because she has, if she’s been at it a while, heard it applied to scenes ranging from cleverly survived grizzly bear maulings to life-threatening hangnail removals to couples who actually split the domestic chores fifty-fifty, rather than just claiming that they do. (Oh, like I was going to do laundry with a bent-back fingernail.) Any guesses why that cri de coeur about the inherently not-very-believable nature of reality will leave her cold?
Long-time readers, chant it with me now: just because something has occurred in real life does not necessarily mean it will be plausible written as fiction. Nor does the fact that a human being might actually have uttered a particular phrase render it automatically effective dialogue. For that reason, it’s the writer’s responsibility not simply to provide snapshots and transcripts of real life on the page, but to write about it in such a way as to make it seem plausible to the reader.
Let’s face it, plenty of real-life shenanigans are completely absurd; plenty of what tumbles out of people’s mouths is at least equally so. The world as we know it does not labor under the novelist’s imperative to render actions dramatically satisfying, or even interesting. None of us is empowered to walk up to someone who does something astonishing and say, “Hey, that’s completely out of character for you. Editing! Cut what this man just did.” (Although, admittedly, it would be an interesting approach to winning friends and influencing people.) And don’t even get me started about how a good editor could improve the dialogue all of us overhear in the movie ticket line, at the grocery store, or at your garden-variety garden party.
Besides, as a novelist, isn’t your job to improve upon reality? Isn’t it, in fact, your art and your pleasure to take the real and dress it up in pretty language, garnishing it with trenchant insights?
So you can’t really blame Millicent and her cronies for preferring fiction writing to have more to recommend it than its resemblance to something that might have happened on this terrestrial sphere. I suspect all of us who love good writing harbor a similar preference.
But I ask you as a reader: would you have felt differently if the tale at the opening of this post had turned up on page 143 of a memoir?
Most readers would; based on a true story is not ubiquitous in book and movie marketing simply because folks in those industries happen to like the sound of the phrase, after all. It’s human nature to like to be in the know.
That does not mean, however, that any truthful memoir — which, as the series of scandals that has rocked the publishing world in recent years has made all of us aware, are not necessarily synonymous terms — is automatically and inherently plausible. Yes, the reader picks up a memoir with the expectation that it will provide a fact-based portrayal of reality, but once again, it’s not just the accuracy of the facts that makes them seem true-to-life on the page.
What might the decisive factor be, campers? Could it be how the writer conveys those facts on the page?
As the pros like to say, it all depends on the writing. Just as many a ho-hum real-life event has been punched up by a gifted prose stylist into an unforgettable scene on the page, many an inherently fascinating occurrence has been rendered downright turgid by a dull telling.
Don’t believe me? Okay, try this little experiment: the next time you find yourself at a gathering that contains both interesting and uninteresting people, pick a few of each at random. Ask these people to describe their first really vivid memories — or, if you have ears of iron, their first memories of how their parents responded to a major public event like men walking on the moon, the shooting of President Reagan and James Brady, or a celebrity couple’s breaking up. (Hey, one person’s intriguing public event is another person’s snoozefest.) Listen attentively to each account without interrupting.
Then ask yourself afterward: “Did all of those stories seem equally true?”
If it’s not apparent to you a few sentences into the first poorly-told account why the storyteller’s skill makes all the difference to the audience’s perception of the story, well, I shall be very surprised. What might be less apparent — and thus require more careful listening to detect — is that you’re probably going to care less whether what the speaker is saying is true if she happens to tell the tale well.
And that, my friends, sums up the private reactions of many, many denizens of the publishing world in the wake of the A MILLION LITTLE PIECES scandal. For months afterward, while people in the outside world were asking, “But is this accurate?”, folks who dealt with books for a living — and, I suspect, most habitual readers of memoir — kept saying, “But was it well-written?”
Frankly, for a memoir to work, it needs to be both. Unless the memoirist in question is already a celebrity — in which case he’s probably not going to be the sole writer, anyway — a simple recital of the facts, however titillating they may be in and of themselves, will not necessarily grab Millicent. Nor will a beautifully-told collection of purely imaginary events fly in the memoir market.
You know where gorgeous writing that doesn’t confine itself rigidly to what actually happens in the real world works really well, though? In a novel. Provided, of course, that the writer presents those fictional — or fictionalized — events in such a manner that they are both a pleasure to read and seem plausible within the context of the world of the book.
Do I spot some timidly-raised hands out there? “But Anne,” those of you who specifically do not write about the real point out shyly, “I don’t think this applies to my work. I create storylines out of whole cloth, creating plots where vampires roam freely, werewolves earn master’s degrees, and denizens of other planets lecture in political science departments. Of course, my stories aren’t plausible; that’s part of their point.”
Actually, to work on the page, any storyline needs to be plausible. That is, the narrative must be sufficiently self-conscious about its own premise that any reader who has accepted its underlying logic will believe that everything in the story could have happened that way.
You would be amazed at how often paranormal, science fiction, and fantasy manuscripts do not adhere to this basic precept of storytelling. Implausible fantasies are perennially among Millicent’s pet peeves.
That got a few goats, did it not? “What part of fantasy don’t you understand, Millie?” I hear some of you mutter under your respective breaths. “It’s not intended to be realistic.”
No, but it does need to be plausible — which is not necessarily synonymous with realism. In fact, in a completely fantastic story, remaining plausible might actually require being anti-realistic.
How so? Well, for the reader to be carried along with a story, its internal logic must make sense, right? A narrative that deliberately eschews the laws of physics of our world can’t just ignore physical properties and motion altogether; the writer must come up with a new set of rules governing the world of the story. And the less like the real world that fantasy world is, the more vital maintaining a sense of plausibility becomes to the reader’s willing suspension of disbelief.
That means, in effect, that while a fantastic plot allows the writer to play with reality, in order to be plausible, the narrative must be respectful of the fictional reality. So when, say, the three-toed sloth protagonist first sets a digit upon the Planet Targ, a place the reader was informed 138 pages ago was exempt from both gravity and dirt, and ol’ Three-Toe leaves a footprint, that’s going to jar a reader who has been paying attention. And the negative effects of even minor inconsistencies can pile up awfully fast: when T-T appears with his designer jeans covered in mud thirty pages after the footprint faux pas, the reader is obviously going to be less accepting than the first time the writer broke the rules.
What is the cumulative effect likely to be? For a lay reader, being knocked out of the story altogether. To a professional reader, however, the results are usually more dire — and are likely to be triggered by the first plausibility lapse, not the third or fourth.
“Oh, no,” Millicent sighs over The Saga of the Sloth. “This writer has set up a really interesting set of rules for this world, and now she’s violated one of them. That’s too bad; I was buying the premise here, and now I have to question it. Next!”
From Millicent’s perspective, the inconsistent detail about the footprint, while not necessarily a rejection-worthy problem in itself, represented a symptom of a plot-level plausibility issue, one that she does not necessarily feel compelled to read on to see confirmed thirty pages later in the muddy jeans. It was the writer’s job to make Three-Toe’s trip to Targ believable within the context of the book’s logic, after all. Since the narrative has already demonstrated a lax approach toward internal plausibility, an experienced Millie would expect to see more lapses later on in the manuscript.
And most of the time, she would be quite right about that. If you really want to set your fantastic world apart from 99% of the others she sees, make its attributes perfectly consistent.
That should be a piece of cake, right?
I’m kidding, of course; editing one’s own work for consistency is one of the most difficult self-editing tasks there is. That’s true, incidentally, no matter where your story might fall on the fantastic-realistic scale. In fact, proofing a hyper-realistic text can be even more challenging than a completely fictional one: even if it’s vitally important to the story that the broom is always kept behind the china cabinet, not the ottoman, the very mundanity of the detail may render it harder to keep in mind.
But you don’t want your heroine to expend her last gasp of breath futilely flailing behind the wrong piece of furniture, do you?
Naturally, from the reader’s perspective, the less predictable a detail is, the more memorable it is likely to be. Case in point: what kind of animal is visiting the Planet Targ? Would you have been able to answer so quickly if the story had just been about some guy named Bart?
Does that gasp of frustration mean that those of you who write reality-based fiction and memoir are already familiar with the problem of how to make the real memorable while still maintaining a sense of realism? Let’s face it: most real-life details are likely to be on the unmemorable side. While a fantasy writer has the option — nay, the responsibility — to transform that perfectly ordinary mailbox on the corner into a flying monkey that happens to deliver mail for a living, a writer painting a picture against a backdrop of this world can’t.
(At least not until I have finished organizing my secret Chimps-on-Wings postal service. Mum’s the word until I put the finishing touches on that promising enterprise.)
But details need not strain the credulity in order to capture the reader’s imagination. Allow me to tell you a little story to illustrate — or, rather a series of little stories. But first, let me prime the creative pump by showing you a couple of literal illustrations.
These are the two sides of the single fortune I found tucked into an end-of-the-meal cookie last year, right around census time: a tactfully-phrased prediction of my future happiness — by mail, no less! — accompanied by a terse statement about my general standing in the world. Now, had I been a less secure person, I might have taken umbrage at my dessert’s presuming to judge whether I counted or not, but since I had already sent back my census form, I found the symmetry very pleasing: clearly, Somebody Up There (or at any rate, Somebody Working in a Cookie Factory) was planning to reward the civic virtue of my outgoing mail with something fabulous in my incoming mail.
Imagine how dismayed I would have been, though, had I not yet popped my census form into the mail — or, even worse, if I had not yet received my census form. As I rearranged vegetables and yogurt containers in preparation for fitting my leftover asparagus in black bean sauce and Hunan pork into my overstuffed refrigerator, I would have kept wondering: is the census form the mail I’m supposed to find so darned pleasant? I mean, I understand the Constitutional obligation to be counted every ten years, but who is this fortune cookie to order me to enjoy filling it out?
Admittedly, in a real-life fortune cookie-consumption situation, this might have been a bit of an overreaction. (Although what’s next, I wonder? Miranda warnings printed on Mars bars, for easy distribution at crime scenes? The First Amendment immortalized in marzipan, lest bakery patrons temporarily forget about their right to freedom of assembly whilst purchasing fresh macaroons?) Had the protagonist in a novel or memoir stumbled upon this chatty piece of paper, however — and less probable things turn up on the manuscript page all the time — it would have seemed pretty significant, wouldn’t it?
Any thoughts on why that might be the case? Could it be that this bizarre means of communication is one of those vivid details I keep urging all of you to work into the opening pages of your manuscripts, as well as the descriptive paragraph in your queries, synopses, verbal pitches, and contest entries? Could the paragraphs above be crammed with the kind of fresh, unexpected little tidbits intended to make Millicent suddenly sit bolt upright, exclaiming, “My word — I’ve never seen anything like that before,” at the top of her lungs?
Or, to put it in terms the whole English class can understand, in choosing to incorporate that wacky fortune cookie into the narrative, am I showing, rather than telling, something about the situation and character?
How can a savvy self-editing writer tell whether a detail is vivid or unusual enough to be memorable? Here’s a pretty reliable test: if the same anecdote were told without that particular detail, or with it described in (ugh) general terms, would the story be inherently less interesting?
Don’t believe that so simple a change could have such a dramatic subjective effect? Okay, let me tell that story again with the telling details minimized. To make it a fair test, I’m going to keep the subject matter of the fortunes the same. Because I always like to show you examples of correctly-formatted manuscript pages, however, this time, I’m going to present it to you as a screening Millicent might see it. As always, if you’re having trouble reading the individual words, try enlarging the image by holding down the COMMAND key and pressing +.
It’s not as funny, is it, or as interesting? I haven’t made very deep cuts here — mostly, I’ve trimmed the adjectives — and the voice is still essentially the same. But I ask you: is the story as memorable without those telling details? I think not.
Some of you are still not convinced, I can tell. Okay, let’s take a more radical approach to cutting text, something more like what most aspiring writers do to the descriptive paragraphs in their query letters, the story overviews in their verbal pitches, and/or the entirety of their synopses, to make them fit within the required quite short parameters. Take a peek at the same tale, told in the generic terms that writers adopt in the interests of brevity:
Not nearly as much of a grabber as the original version, is it? Or the second, for that matter. No one could dispute that it’s a shorter version of the same story, but notice how in this rendition, the narrator seems to assume that the reader will spontaneously picture the incident so clearly that no details are necessary. Apparently, it’s the reader’s job to fill in the details, not the writer’s.
Except it isn’t. As far as Millicent is concerned, it’s the writer’s responsibility to tell the story in a way that provokes the intended reaction in the reader, not the reader’s to guess what the writer meant. Or to figure out what details might fit plausibly into the scene.
I hate to be the one to break it to you, but professional reading is seldom anywhere near as charitable as the average submitter or contest entrant hopes it will be. Blame it on the intensity of competition created by literally millions of aspiring writers seeking to get published: Millicent knows that if the well-written submission in front of her does not provide her with the reading experience her boss the agent believes will sell right now, chances are good that one of the next thousand submissions will.
According to her, then, it’s your job to draw her into your story so completely that she forgets about all of that. It’s your job to wow her with your storytelling — and without relying upon her sense that you might be writing about something that really happened to supply the plausibility strong, tangible details would provide.
So it honestly is in your best interest to assume that the reader is only going to picture the details you actually provide on the page. Since you cannot be sure that every reader will fill in the specifics you want, make darned sure that what you want the reader to take from the scene is not left to his imagination. If the detail is important, take the page space to include it.
This is particularly good advice if you happen either to be writing memoir or a novel with scenes based upon your personal experience. All too often, reality-based narrators rely upon the fact that something really happened to render it interesting to a reader, regardless of how skillfully that story may be told. All that’s really necessary is a clear telling, right? Or that the kind of terse narrative that works so well in a verbal anecdote will inspire the same reaction if reproduced verbatim on the page?
How well does either of these extremely common theories work out in practice? Well, let me ask you: did you prefer the first version of the fortune cookie story, the second, or the third? More importantly for submission purposes, which do you think would grab Millicent the most as the opening of a manuscript?
Uh-huh. The difference between those three renditions was not the voice (although a case could be made that part of the voice of the first was created through the selection of the details) or even the writing quality (although the last version did get a mite word-repetitive), but the narrative’s willingness to include telling details — and unusual ones at that.
What if the entertainment differential between the three lay not in an authorial failure of imagination in composing the last version, but in a failure to recognize that the point of including this anecdote is presumably to entertain and inform the reader? In telling the story as quickly as possible, can a writer sometimes defeat the purpose of including it at all?
“But Anne!” memoirists and reality-based novelists protest nervously. “When I’m writing about the real, I can’t just make up pithy little details to enliven the narrative, can I? I have to stick to what happened!”
True enough, anxious truth-tellers: if you are writing the real, you cannot control the facts. What you can control, however, and what any writer must control, is how you present them to the reader.
No matter what you write, the success of your narrative is going to depend largely upon your storytelling skills — they’re what separates your account of a particular incident from anybody else’s, right? Frankly, this isn’t an easy task, even if dear self doesn’t happen to be the protagonist; it’s genuinely hard to represent the real world well on the page. Let’s face it, reality is sometimes a lousy storyteller.
Oh, your life has never been trite or obvious or just plain perplexing, even for a minute? Okay, all of you English and Literature majors, tell me, please, how the following 100% true anecdote rates on the symbolism front.
A couple of years ago, I was scheduled to give a eulogy for a dead friend of mine — a writer of great promise, as the pros used to say — at our college reunion. Because several of my classmates had, unfortunately, passed away since our last get-together, eight of us were to give our eulogies at the same event. Because I am, for better or worse, known to my long-time acquaintances as a teller of jokes, I was under substantial pressure to…how shall I put this?…clean up the narrative of my late friend’s life a little. Or at least tell a version that might not offend the folks who didn’t happen to know him.
No, that’s not the symbolic part; that’s all backstory. Here’s the symbolism: my throat was annoyingly, scratchily sore for the entire week that I was editing the eulogy.
Now, if I saw a parallel that obvious in a novel I was editing, I would probably advise cutting it. “No need to hit the reader over the head with it,” I’d scrawl in the margins. “Yes, it’s showing, not telling, but please. Couldn’t you come up with something a bit more original?”
(And yes, now that you mention it, I am known for the length of my marginalia. Brevity may be the soul of wit, but explanation is often the soul of clarity.)
Now, if my life were a short story written for an English class, the voice loss in that anecdote might pass for legitimate symbolism — or even irony, in a pinch. A bit heavy-handed, true, but certainly situationally appropriate: outsiders move to silence protagonist’s voice through censorship = protagonist’s sore throat. Both New Age the-body-is-telling-you-something types and postmodern the-body-is-a-text theorists would undoubtedly be pleased.
But the fact is, in a novel or memoir, this cause-and-effect dynamic would seem forced, or even trite. Certainly, it’s unlikely to make Millicent drop her latte and exclaim, “Wow, I never saw that coming!”
As I believe I may have already mentioned, just because something happens in real life doesn’t necessarily mean that it will make convincing fiction. My sore throat is precisely the type of symbolism that comes across as ham-handed in a novel. It’s too immediate, for one thing, too quid pro quo. Dramatically, the situation should have taken time to build — over the years since my friend’s death, perhaps — so the reader could have felt clever for figuring out why the throat problem happened. Maybe even anticipated it.
How much better would it have been, in storytelling terms, if our protagonist had dealt with all the different input with aplomb, not coming down with strep throat until scant minutes before she was to speak? That way, in fine melodramatic style, she would have to croak her way through her speech, while her doctor stood by anxiously with antibiotics.
The possibilities make the writerly heart swoon, do they not?
Just think how long it would extend a funeral scene if a eulogizer were unable to speak more than a few emotion-charged words before her voice disappeared with a mouse-like squeak. Imagine the deceased’s secret admirer creeping closer and closer, to catch the muttered words.
Heck, just think of the dramatic impact of any high-stakes interpersonal battle where one of the arguers cannot speak above a whisper. Or the comic value of the persecuted protagonist’s being able to infect her tormenters with strep, so they, too, are speechless by the end of the story.
Great stuff, eh? Much, much better than protagonist feels silenced, protagonist IS silenced. That’s just so…literal.
Besides, readers like to see a complex array of factors as causes for an event, and an equally complex array of effects. Perhaps if our protagonist had not spoken about her friend since he passed away (which, in a sense, is quite true: I was unable to make it across the country for his memorial service; that could be transformed into an interesting flashback), then she would be fictionally justified in developing speech-inhibiting throat problems now. Or if he and she had shared deep, dark secrets she had sworn never to reveal (no comment), how telling a slight sore throat might be on the eve of spilling the proverbial beans, eh?
But a single event’s sparking a severe head cold? Dramatically unsatisfying. Not to mention implausible.
Taken too far, it might even make the protagonist seem like a wimp. Readers, like moviegoers, like to see protagonists take a few hits and bounce up again. Even better is when the protagonist is beaten to a bloody pulp, but comes back to win anyway.
One of the great truisms of the American novel is donâ€™t let your protagonist feel sorry for himself for too long — at least, not if his problems rise to the level of requiring action to fix. Simply put, most readers would rather see a protagonist at least make an attempt to solve his problems than spend 50 pages resenting them.
I can feel authors of novels and memoirs where characters sit around and think about their troubles for chapters on end blanching. Frankly, you should, at least if you intend to write for the U.S. market. Domestic agents and editors expect first-time authors’ plots to move along at a pretty good clip — and few characteristics slow a plot down like a protagonist’s tendency to mull. Especially in a first-person narrative, where by definition, the reader must stay within the worldview of the narrator.
Some of you blanching souls have your hands raised, I see. “But Anne,” these pale folks exclaim, “I’ve always heard that the real key to keeping a reader’s interest is to introduce conflict on every page. Well, most of my protagonist’s conflict is internal — she can’t make up her mind where to turn. Surely,” the pallor deepens, “a professional reader like Millicent wouldn’t dismiss this kind of thinking as whining, right?”
That’s a good question, blanchers, and one that fully deserves an answer. The short one is that it all depends on how long the equivocation goes on, how plausible the conflict is, and how repetitive the mulling ends up being. That, and whether the protagonist (or the plot, for that matter) is doing anything else whilst the wheels in her brain churn.
The long answer, of course, is that in order to formulate a really good answer to that particular question, you would need to go out and read a hefty proportion of the tomes released in your book category within the last couple of years. Not EVERY book, mind you: just those by first-time authors, because the already-established have to impress fewer people to get a new book into print.
In recent years, most fiction categories have moved pretty firmly toward the action end of the continuum. As opposed to, say, virtually any novel written in English prior to 1900, most of which hugged the other, pages-of-mulling end of the continuum.
This preference isn’t limited to the literary realm, either — we often see this philosophy in movies, too. Don’t believe me? Okay, think about any domestic film where an accident confines the protagonist to a wheelchair.
No examples springing to mind? Okay, how about if the protagonist is the victim of gratuitous discrimination, or even just simple bad luck? I’m talking about serious drawbacks here, not just everyday annoyances, of course. (For some reason, whining about trivial problems — “But I don’t have the right shoes to wear with a mauve bridesmaid’s dress!” — seems to be tolerated better by most readers and audience members, provided that the whine-producer doesn’t bring the plot to a screeching halt until she finds those shoes.)
Got a film firmly in mind? Now tell me: doesn’t the film include one or more of the following scenes:
(a) some hale and hearty soul urging the mangled/unemployed/otherwise unhappy protagonist to stop feeling sorry for himself,
(b) a vibrantly healthy physical therapist (job counselor/spouse/friend) telling the protagonist that the REAL reason he can’t move as well as he once did is not the casts on his legs/total paralysis/missing chunks of torso/total lack of resources/loss of the love of his life, but his lousy ATTITUDE, and/or
(c) the protagonist’s lecturing someone else on his/her need to stop feeling sorry for himself and move on with his/her life?
In fact, don’t filmmakers — yes, and writers of books, too — routinely expect their characters to become better, stronger people as the result of undergoing life-shattering trauma?
Now, we all know that this is seldom true in real life, right? As someone who has spent quite a bit of time in physical therapy clinics over the last year, I’m here to tell you that pain does not automatically make people better human beings; it makes them small and scared and peevish. That sudden, crisis-evoked burst of adrenaline that enables 110-pound mothers to move Volkswagens off their trapped toddlers aside, few of us are valiantly heroic in the face of more than a minute or two of living with a heart attack or third-degree burns.
Or ten months of physical therapy. And had I mentioned that my nail had a boo-boo?
Heck, even the average head cold — with or without a concomitant voice loss — tends to make most of us pretty cranky. Yet dramatically, we as readers accept that the little irritations of life might seem like a big deal at the time, even in fiction, because these seemingly trivial incidents may be Fraught with Significance.
Which often yields the odd result, in books and movies, of protagonists who bear the loss of a limb, spouse, or job with admirable stoicism, but fly into uncontrollable spasms of self-pity at the first missed bus connection or hot dog that comes without onions WHEN I ORDERED ONIONS.
Why oh why does God let things like this happen to good people?
One of my favorite examples of this phenomenon comes in that silly American remake of the charming Japanese film, SHALL WE DANCE? After someone spills a sauce-laden foodstuff on the Jennifer Lopez character’s suede jacket, she not only sulks for two full scenes about it, but is later seen to be crying so hard over the stain that the protagonist feels constrained to offer her his handkerchief.
Meanwhile, the death of her dancing career, the loss of her life partner, and a depression so debilitating that she barely lifts her head for the first half of the movie receive only a few seconds’ worth of exposition. Why? Because dwelling on the ruin of her dreams would be wallowing; dwelling on minor annoyances is Symbolic of Deeper Feelings.
So where does that leave us on the vivid detail front — or the plausibility front, for that matter? Should we all shy away from giving our protagonists big problems, in favor of more easily presented small ones?
Well, I’m not going to lie to you: there are plenty of writing gurus out there who would advise you to do precisely that. Edith Wharton remarked in her excellent autobiography (which details, among other things, how terribly embarrassed everybody in her social circle was when she and Theodore Roosevelt achieved national recognition for their achievements, rather than for their respective standings in the NYC social register; how trying) that the American public wants tragedies with happy endings. It still seems to be true.
So why, you may be wondering, am I about to advise you to depict your protagonists (fictional and real both) not only with many and varied problems, but also with significant, realistic barriers to achieving their goals? Have I merely gone detail-mad?
Not by a long shot. I have heard many, many agents and editors complain in recent years about too-simple protagonists with too-easily-resolved problems. In conference presentation after conference presentation, they’ve been advising that writers should give their protagonists more quirks.
It's an excellent way to make your characters memorable, after all — and it enables the inclusion of lots and lots of luscious telling details. Give 'em backstory. If you want to make them sympathetic, a hard childhood, dead parent, or unsympathetic boss is a great tool for encouraging empathy.
Not to mention being plausibly survivable traumas. Do you have any idea how many Americans have experienced one of those things? Or all three?
Feel free to heap your protagonist (and love interest, and villain) with knotty, real-life problems — provided, of course, that none of these hardships actually prevent the protagonist from achieving his or her ultimate goal. Interesting delay creates dramatic conflict; resignation in the face of an insuperable barrier, however, is hard to make entertaining for very long. Make sure that the protagonist fights the good fight with as much vim and resourcefulness as someone who did not have those problems — or show her coming up with clever ways to make those liabilities work for her.
Again, this is not the way we typically notice people with severe problems acting in real life, but we're talking writing that people read for pleasure here. We're talking drama.
We're talking, to put it bluntly, about moving a protagonist through a story in a compelling way, and as such, as readers and viewers, we have been trained to regard the well-meaning soul who criticizes the recently bereaved protagonist by saying, "Gee, Monique, I don't think you've gotten over your mother's death yet," as a caring, loving friend, rather than as a callous monster incapable of reading a calendar with sufficient accuracy to note that Monique buried her beloved mother only a couple of weeks before.
While a sympathetic soul might reasonably ask, “Um, why should she have gotten over it already, if she’s not completely heartless?”, strategically, even the deepest mourning should not cause the plot to stop moving altogether.
Don’t get me wrong: I don't think that professional readers who resent characters who linger in their grief are inherently unsympathetic human beings. They just see far, far too much wallowing on the page.
While that's undoubtedly realistic, it doesn't really work in a manuscript. Fictional characters who feel sorry for themselves (or who even possess the rational skills to think at length over the practical ramifications of obstacles in their paths) tend to be passive, from the reader's point of view. They don't do much, and while they're not doing much, the plot grinds to a screaming halt. Yawn.
Or to express it in Millicent’s parlance: next!
Yes, people do this in real life. All the time. But I'm relatively positive that someone told you very, very recently that just because something really happened doesn't mean it will work on the page.
My, we’ve covered a lot of ground today. I’m going to leave all of this to germinate in your fertile minds for the nonce, campers, while I turn our attention back to nit-picky issues for the next few posts. (Oh, you thought I hadn’t noticed that I’d digressed from structural repetition?) Trust me, you’ll want to have your eye well accustomed to focusing on sentence-level details before we leap back up to plot-level planning.
A good self-editor has to be able to bear all levels of the narrative in mind simultaneously, after all. This is complicated stuff, but then, so is reality, right? Keep up the good work!