Well, it’s the new year, so time to roll up our sleeves and get started. Let’s begin with my first 2016 Nebula prediction. Remember, I try to predict what will happen, based on past evidence and patterns in the Nebulas and various lists and data from this year, rather than what should happen. These are my opinions, so they have no particular authority, and I always think Chaos Horizon is best used in conjunction with other opinions and websites on the internet.
Predicting the Nebulas this year was made much easier since the Science Fiction and Fantasy Writers of America (SFWA), the group that administers the Nebulas, made their “Recommended Reading List” public. Remember, the Nebulas are a vote of SFWA members; by making their recommendations public, we get a good idea of the direction these awards are leaning. Last year, the final Recommended List correctly predicted 4/6 of the final nominees (the other two nominees were in spots #7 and #8). Since Chaos Horizon always uses the past year as a guide for the next year’s prediction, I predict something similar will happen this year.
If you look at the SFWA list as of right now (January 1), we can see that the top of the list is too heavily slanted towards Fantasy when compared to Nebula history. 5 of the top 6 are Fantasy novels (Leckie being the only SF), as are 8 of the top 10. I suspect one or two of the SF novels will creep up the list over time. Right now, I’m looking at a gang of four: either a novel by a past Nebula winner (Aurora by Kim Stanley Robinson (tied #11) or The Water Knife by Paolo Bacigalupi (also tied at #11)), Thunderbird by perennial Nebula favorite Jack McDevitt, or Raising Caine by Charles Gannon, #3 in a series that garnered Nebula noms in 2013 and 2014. One or two of these books making the final ballot would create a more balanced Fantasy/SF ratio.
The Nebulas nominate 6 novels in the category.
Here’s my initial prediction, as of January 1, 2016:
1. Uprooted, Naomi Novik: Novik has almost every metric going for her: good sales, good placement on year-end lists, strong fan response. She has no Nebula history (0 nominations), although she did grab a Hugo Best Novel nomination back in 2007 for Temeraire. I’ve got this at #1 because I see it as the “buzziest” book of the year; it’s also #1 on the SFWA recommendations. Why second-guess the data?
2. Ancillary Mercy, Ann Leckie: Leckie is coming off of two straight Nebula nominations for this series, including her win for Ancillary Justice in 2014. I don’t expect anything to change this year; the final volume was well-received as a fitting conclusion to this trilogy. As of January 1, 2016, she’s #6 on the SFWA recommended list.
3. The Grace of Kings, Ken Liu: Liu has been a recent Nebula darling: 7 short fiction nominations since 2012. This is his first novel, and since the Nebula audience is already very familiar with his short fiction from prior nominations, that brings a lot of eyeballs to the text. In Chaos Horizon predictions, eyeballs = possible voters. It’s also #2 on the SFWA Nebula recommendations list, and he scored a Best Novel nomination last year for translating Cixin Liu’s The Three-Body Problem.
4. The Fifth Season, N.K. Jemisin: Jemisin has three prior best novel Nebula noms in 2011, 2012, and 2013, which is every year she’s been eligible for the novel category (she’s published 5 novels, but some years she published more than one novel). She’s at 8th on the recommended list, but with that strong Nebula history, I think she’s a good bet for a nomination this year.
5. Aurora, Kim Stanley Robinson: Robinson has been a perennial Nebula favorite (12 total nominations, 3 wins, including Best Novel wins for 2312 in 2013 and Red Mars back in 1994). Even though he’s tied #14 on the SFWA list, this is a kind of Hard SF novel that appeals to the SF wing of the Nebulas; that group has always had enough votes to put 1-2 books on every Nebula ballot.
6. Karen Memory, Elizabeth Bear: I’m less certain about the Bear. Her high placement on the SFWA list (#3), as well as the generally positive reception of the book, would seem to stand her in good stead. In the negative column, she has 0 total Nebula nominations ever, and Karen Memory doesn’t perform particularly well in popularity metrics. The 19th-century steampunk setting might be a challenge for some voters as well. I think any of the texts from 4-10 on my list has a real chance of making it this year.
7. Thunderbird, Jack McDevitt: The first rule of Nebula prognostication: you never count Jack McDevitt out. 12 Best Novel Nebula nominations, including 9 out of the past 12 years! This book is from one of his less popular series, and it came out very late in the year (December 1, 2015); otherwise, I’d have him higher.
8. The Water Knife, Paolo Bacigalupi: If Aurora doesn’t make it, this book is the other logical choice for an SF novel from a recent winner. Bacigalupi roared to huge Nebula and Hugo success with The Windup Girl back in 2010, and this is his first proper “adult” SF novel since then. 5 years is an eternity in these awards—has his popularity cooled off? Or will he return to the ranks of the nominees?
9. The Traitor Baru Cormorant, Seth Dickinson: This placed #5 on the SFWA recommended list, so why do I have it so low? Genre, genre, genre: I can’t predict a Nebula ballot with 5 or 6 fantasy novels on it, and I think Dickinson has to be slotted behind the other, more obvious fantasy contenders. Keep an eye out to see if this picks up steam in January.
10. Raising Caine, Charles Gannon: I place a lot of stock in Gannon’s two previous nominations in 2014 and 2015 for books from this series. He’s currently only at 4 votes in the SFWA list (versus 23 last year). Is this an indication of poor reception of Raising Caine or am I looking at the list too early? If that number increases, expect him to rise in my prediction.
11. Updraft, Fran Wilde: Currently #4 on the SFWA list, I think this is more likely to get a nomination in the Andre Norton (the Young Adult category, where it sits at #1 in the recommendations). While nothing prevents a novel from getting both a Nebula and a Norton nomination, I don’t see nominators voting for the same book in 2 different categories.
12. The Dark Forest, Cixin Liu: You’d think the sequel to last year’s Hugo winner and Nebula nominee would be higher in the recommended list, but The Dark Forest currently doesn’t make the SFWA recommended list at all. I don’t know how to explain that (maybe Ken Liu, who translated The Three-Body Problem but not this volume, was the name that brought the Nebula voters?), but you’ve got to go by the stats. Last year’s Hugo win and Nebula nom should at least keep it in the mix.
13. Barsk: The Elephant’s Graveyard, Lawrence Schoen: The surprise of this list, it places an impressive 7th on the SFWA list. It just came out December 29th, 2015; I think that’s too late for a Nebula book to pick up steam with the rest of the SFWA voters who don’t have access to early copies.
14. Seveneves, Neal Stephenson: You’d think Stephenson would be neck-and-neck with the Robinson and the Bacigalupi, but the Nebulas have never liked Stephenson much. He only has 1 nomination, back in 1997 for The Diamond Age, and zero wins. If the Nebulas ignored Snow Crash, Cryptonomicon, and Anathem, why would you predict this one? It’s tied for #16 on the current recommendations.
15. Sorcerer to the Crown, Zen Cho: If one of the fantasy novels higher on the list falters, Cho’s book could stand poised to take its place. Somewhat similar in setting to the well-liked Hugo/Nebula-winning Jonathan Strange & Mr Norrell, this seems to hit some marks that previous Nebula voters have liked.
So, there’s my initial Top 15 Nebula list! Remember, this is a starting place, not the finishing place, and these awards can be very dynamic between January and February, with lots of shifts as books pick up steam. So, what do you think? Did I miss any obvious contenders? Think someone should be higher or lower? Argue away in the comments, and happy predicting!
Time for a quick study on Hugo/Nebula convergence. The Nebula nominations came out about a week ago: how much will those nominations impact the Hugos?
In recent years, quite a bit. Ever since the Nebulas shifted their rules around in 2009 (moving from rolling eligibility to calendar year eligibility; see below), the Nebula Best Novel winner usually goes on to win the Hugo Best Novel. Since 2010, this has happened 4 out of 5 times (with Ancillary Justice, Among Others, Blackout/All Clear, and The Windup Girl, although Bacigalupi did tie with Mieville). That’s a whopping 80% convergence rate. Will that continue? Do the Nebulas and Hugos always converge? How much of a problem is such a tight correspondence between the two awards?
The Hugos have always influenced the Nebulas, and vice versa. The two awards have a tendency to duplicate each other, and there’s a variety of reasons for that: the voting pools aren’t mutually exclusive (many SFWA members attend WorldCon, for instance), the two voting pools are influenced by the same set of factors (reviews, critical and popular buzz, etc.), and the two voting pools have similar tastes in SFF. Think of how much attention a shortlist brings to those novels. Once a book shows up on the Nebula or Hugo slates, plenty of readers (and voters) pick it up. In the nearly 50 years when both the Hugo and Nebula have been given, the same novel has won both awards 23 out of 49 times, for a robust 47% convergence. As we’ll see below, this has varied greatly by decade: in some decades (the 1970s, the 2010s) the winners are basically identical. In other decades, such as the 1990s, there’s only a 20% overlap.
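The convergence figure is just a joint-winner count over shared years. Here's a minimal sketch computing the post-rule-change rate from the 2010-2014 winners the post discusses (the 2010 Hugo tie is flattened to the Bacigalupi for simplicity):

```python
# Joint-winner convergence, per the post's 2010-2014 discussion.
# The 2010 Hugo was actually a tie (The Windup Girl / The City & The City);
# it is flattened to one title here.
hugo_winners = {
    2010: "The Windup Girl",
    2011: "Blackout/All Clear",
    2012: "Among Others",
    2013: "Redshirts",
    2014: "Ancillary Justice",
}
nebula_winners = {
    2010: "The Windup Girl",
    2011: "Blackout/All Clear",
    2012: "Among Others",
    2013: "2312",
    2014: "Ancillary Justice",
}

def convergence(a, b):
    """Fraction of shared years in which both awards went to the same novel."""
    years = a.keys() & b.keys()
    joint = sum(1 for y in years if a[y] == b[y])
    return joint / len(years)

print(f"{convergence(hugo_winners, nebula_winners):.0%}")  # 4/5 -> 80%
```

The same function run over the full 1966-2014 winner lists would reproduce the 23/49 ≈ 47% figure above.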
All of this is made more complex by which award goes first. Historically, the Hugo used to go first, often awarding books a Hugo some six months before the Nebula was awarded. Thanks to the Science Fiction Awards Database, we can find out that Paladin of Souls received its Hugo on September 4, 2004; Bujold’s novel received its Nebula on April 30, 2005. Did six months post-Hugo hype seal the Nebula win for Bujold?
Bujold benefitted from the strange and now defunct Nebula rule of rolling eligibility. The Locus Index to SF Awards gave us some insight on how the Nebula used to be out of sync with the Hugo:
The Nebulas’ 12-month eligibility period has the effect of delaying recognition of many works until nearly 2 years after publication, and throws Nebula results out of synch with other awards (Hugo, Locus) voted in a given calendar year. (NOTE – this issue will pass with new voting rules announced in early 2009; see above.)
SFWA has announced significant rules changes for the Nebula Awards process, eliminating rolling eligibility and limiting nominations to work published during a given calendar year (i.e., only works published in 2009 will be eligible for the 2010 awards), as well as eliminating jury additions. The changes are effective as of January 2009 and “except as explicitly stated, will have no impact on works published in 2008 or the Nebula Awards process currently underway.”
Since 2009, eligibility has been straightened out: Hugo and Nebula eligibility basically follow the same rules, and now it is the Nebula that goes first. The Nebula tends to announce a slate in late February, and then gives the award in early May. The Hugos announce a slate in mid-April, and then award in late August/early September, although those dates change every year.
Tl;dr: while it used to be the Hugos that influenced the Nebulas, since 2010 it is the Nebulas that influence the Hugos. We know that Nebula slates tend to come out while Hugo slate voting is still going on. This means that Hugo voters have a chance to wait until the Nebulas announce their nominations, and then adjust/supplement their voting as they wish. This year, there were about 3 weeks between the Nebula announcement and the close of Hugo voting: were WorldCon voters scrambling to read Annihilation and The Three-Body Problem in that gap? Remember, even a slight influence on WorldCon voters can drastically change the final slate.
But how much? Let’s take a look at the data from 2010-2014, or the post-rule change era. That’s not a huge data set, but the results are telling.
This chart shows how many of the Nebula nominations showed up on the Hugo ballot a few weeks later. You can see that it comes out to around 40% on average. Don’t get fooled by the 2014 data: Neil Gaiman’s The Ocean at the End of the Lane made both the Nebula and Hugo slates, but Gaiman declined his Hugo nomination. If we factored him in, we’d be staring at that same 40% across the board.
40% isn’t that jarring, since that only means 2 out of the 5 Hugo nominees. If we consider the overlap between reading audiences, critical and popular acclaim, etc., that doesn’t seem too far out of line.
It’s the last column that catches my eye: 4/5 joint winners, or 80% joint winners in the last 5 years. Only John Scalzi managed to eke out a win over Kim Stanley Robinson; otherwise we’d be batting 100%. We should also keep in mind the tie between The City and the City and The Windup Girl in 2010.
Nonetheless, my research shows that the single biggest indicator of winning a Hugo from 2010-2014 is whether or not you won the Nebula that year. Is this a timeline issue: does the Nebula winner get such a signal boost on the internet in May that everyone reads it in time for the Hugo in August? Or are the Hugo/Nebula voting pools converging to the point that their tastes are almost the same? Were the four joint-winners in the 2010s so clearly the best novels of the year that all of this is moot? Or is this simply a statistical anomaly?
I’m keeping a close eye on this trend. If Annihilation sweeps the Nebula and Hugos this year, the SFF world might need to take a step back and ask if we want the two “biggest” awards in the field to move in lockstep. This has happened in the past. Let’s take a look at the trends of Hugo/Nebula convergence by decade in the field:
That’s an odd chart for you: the 1960s (only 4 years, though) had 25% joint winners, the 1970s jumped to 80%, we declined through the 1980s (50%) and the 1990s (20%), stayed basically flat in the 2000s (30%), and then jumped back up to 80% in the 2010s. Why so much agreement in the 1970s and 2010s, and so much disagreement in the 1990s and 2000s? The single biggest thing that changed from the 2000s to the 2010s was the Nebula rules: is that the sole cause of present-day convergence?
I don’t have a lot of conclusions to draw for you today. I think convergence is a very interesting (and complex) phenomenon, and I’m not sure how I feel about it. Should the Hugos and Nebulas go to different books? Should they only converge for books of unusual and universal acclaim? In terms of my own predictions, I expect the trend of convergence to continue: I think 2-3 of this year’s Nebula nominees will be on the Hugo ballot. If I had to guess, I’d bet that this year’s Nebula winner will also take the Hugo. Given this data, you’d be foolish to do anything else.
Now that we have this year’s Nebula nominations for Best Novel, what are we to make of them?
The Goblin Emperor, Katherine Addison (Tor)
Trial by Fire, Charles E. Gannon (Baen)
Ancillary Sword, Ann Leckie (Orbit US; Orbit UK)
The Three-Body Problem, Cixin Liu, translated by Ken Liu (Tor)
Coming Home, Jack McDevitt (Ace)
Annihilation, Jeff VanderMeer (FSG Originals)
Let’s do what Chaos Horizon does, and look at some stats. What were the most predictive elements for the 2015 Best Novel Nebula?
- 83.3% of the nominees were science fiction.
- 66.7% of the nominated authors had previously been nominated for a Nebula for Best Novel.
- 33.3% of the nominated authors had previously won a Nebula for Best Novel.
- 50.0% of the nominees were either stand-alone novels or the first novel in a series.
- 66.7% of the nominees placed in the top part of my collated SFF Critics Meta-List.
- 16.7% of the nominees were Jack McDevitt.
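For concreteness, those percentages can be recomputed from a small hand-coded table of the six nominees. This is just a sketch; the genre and prior-nomination/win flags follow the post's own counts, not any external database:

```python
# The six 2015 Nebula Best Novel nominees, with flags per the post's counts.
nominees = [
    {"author": "Katherine Addison", "genre": "fantasy", "prior_nom": False, "prior_win": False},
    {"author": "Charles E. Gannon", "genre": "sf",      "prior_nom": True,  "prior_win": False},
    {"author": "Ann Leckie",        "genre": "sf",      "prior_nom": True,  "prior_win": True},
    {"author": "Cixin Liu",         "genre": "sf",      "prior_nom": False, "prior_win": False},
    {"author": "Jack McDevitt",     "genre": "sf",      "prior_nom": True,  "prior_win": True},
    {"author": "Jeff VanderMeer",   "genre": "sf",      "prior_nom": True,  "prior_win": False},
]

def pct(flag):
    """Percentage of nominees for which the flag function is true."""
    return 100 * sum(flag(n) for n in nominees) / len(nominees)

print(f"{pct(lambda n: n['genre'] == 'sf'):.1f}% SF")                    # 83.3% SF
print(f"{pct(lambda n: n['prior_nom']):.1f}% prior Best Novel nominees") # 66.7%
print(f"{pct(lambda n: n['prior_win']):.1f}% prior Best Novel winners")  # 33.3%
print(f"{pct(lambda n: n['author'] == 'Jack McDevitt'):.1f}% McDevitt")  # 16.7%
```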
Overall, the Nebula Best Novel nominees were very traditional in 2015. After several years of being friendlier to fantasy, the Nebula snapped back to SF: we had 5 SF books and only one fantasy novel, although you may want to count Annihilation as cross-genre (weird/SF?). The Nebula had been creeping up to a 50/50 mix of fantasy and science fiction. This year, we saw none of that trend: three of the books (Leckie, Gannon, McDevitt) are far-future SF novels complete with spaceships and all the SF trimmings. The Cixin Liu, despite being a translation of a Chinese novel, may be the most traditional SF novel of the lot: an alien invasion novel along the lines of Arthur C. Clarke’s Childhood’s End. Liu even does away with more modern characterization, instead using the old 1950s technique of “characters as cameras” to drive us through the plot and the science.
The Nebulas went with 4 writers that had previously been nominated for the Best Novel Nebula (VanderMeer, McDevitt, Leckie, Gannon) and only 2 newcomers. 2 of our 6 nominees already have won the Nebula Best Novel award, with Leckie winning in 2014 and McDevitt back in 2007. The Nebula Best Novel category tends to draw heavily from past nominees and winners, and 2015 was no different. Since the SFWA voting membership doesn’t change much year-to-year, this means support from one year tends to carry over into the next year.
Case in point: Jack McDevitt, who now has 12 (!) Best Novel Nebula nominations. The constant McDevitt nominations are the strangest thing currently happening in the Nebulas. That’s not a knock against McDevitt. I’ve read two of McDevitt’s books, The Engines of God and the Nebula-winning Seeker. They were both solid space exploration novels: fast-paced, appealing characterization, and professionally done. They didn’t stand out to me, but there’s never anything wrong with writing books people want to read. Still, I’m not sure why McDevitt deserves 12 nominations while similar authors such as Peter F. Hamilton, Alastair Reynolds, Stephen Baxter, etc., are largely ignored by the SFWA voters. To put this in context: McDevitt has more Nebula Best Novel nominations than Neal Stephenson (1), William Gibson (4), and Philip K. Dick (5) combined.
Since 2004, when the era of McDevitt domination truly began, 73 different books have received Nebula nominations. 9 of those have been McDevitt novels. So, over the last 11 years, McDevitt alone constituted 12% of the total Nebula Best Novel field. I’m going to have to create a “McDevitt anomaly” to start accounting for the Nebula slates. Will Gannon fall into similar territory? There seems to be a bloc of SFWA voters who like a very specific kind of SF novel. This testifies to the inertia of the Nebula award; once they start voting in one direction, they continue to do so. The McDevitt nominations are useful because they remind us how eccentric the Nebula can be: if you’re trusting the SFWA to come up with an unbiased list of the best 6 SFF novels of the year, you’re out of luck. The Nebula gives us the 6 SFF novels that the SFWA voters voted for: no more, no less.
I was pleased with how predictive my SFF Critics list was. Ancillary Sword and Annihilation placed 1-2 on that list and grabbed noms. The Goblin Emperor and The Three-Body Problem tied for third (along with 5 other novels, many of which didn’t stand a Nebula chance because of being last in a series, not being SFF-y enough, or not being published in the US). City of Stairs was a place behind those two, so that list at least predicted The Goblin Emperor over the Bennett. Neither Gannon nor McDevitt made the SFF Critics list. I’ll have to trust this list more in the future.
The demographics of the Best Novel award were also interesting, if predictable. 67% men / 33% women is a little more male-slanted than normal, although the granularity of having only 6 nominations makes that easy to throw off. Along race/ethnic lines, you’re looking at 83% white / 17% Asian; I believe Cixin Liu is the first Asian author nominated for the Best Novel Nebula. Recent trends have been a little higher than that, depending on how you want to categorize race and ethnicity. Nationality of 83% American / 17% Chinese / 0% British is definitely a little unusual; this award has been friendlier to British authors in recent years. I’ll admit that I thought at least one British author would sneak in.
Any other statistical trends stand out to you?
There’s a wealth of information there, including recommendations for categories that I don’t have the time to follow, like YA Novel, Novella, Novelette, and Short Story. In the past, most of the future Hugo and Nebula nominees have shown up on these lists. Part of that is because the lists are so long (20-30 suggestions each), but also because Locus pretty closely mirrors the sentiments of the SFWA and the Nebula.
Here are their SF and Fantasy lists:
Novels – Science Fiction
•Ultima, Stephen Baxter (Gollancz; Roc 2015)
•War Dogs, Greg Bear (Orbit US; Gollancz)
•Shipstar, Gregory Benford & Larry Niven (Tor; Titan 2015)
•Chimpanzee, Darin Bradley (Underland)
•Cibola Burn, James S.A. Corey (Orbit US; Orbit UK)
•The Book of Strange New Things, Michel Faber (Hogarth; Canongate)
•The Peripheral, William Gibson (Putnam; Viking UK)
•Afterparty, Daryl Gregory (Tor; Titan)
•Work Done for Hire, Joe Haldeman (Ace)
•Tigerman, Nick Harkaway (Knopf; Heinemann 2015)
•Europe in Autumn, Dave Hutchinson (Solaris US; Solaris UK)
•Wolves, Simon Ings (Gollancz)
•Ancillary Sword, Ann Leckie (Orbit US; Orbit UK)
•Artemis Awakening, Jane Lindskold (Tor)
•The Three-Body Problem, Cixin Liu (Tor)
•The Causal Angel, Hannu Rajaniemi (Tor; Gollancz)
•The Memory of Sky, Robert Reed (Prime)
•Bête, Adam Roberts (Gollancz)
•Lock In, John Scalzi (Tor; Gollancz)
•The Blood of Angels, Johanna Sinisalo (Peter Owens)
•The Bone Clocks, David Mitchell (Random House; Sceptre)
•Lagoon, Nnedi Okorafor (Hodder; Saga 2015)
•All Those Vanished Engines, Paul Park (Tor)
•Annihilation/Authority/Acceptance, Jeff VanderMeer (FSG Originals; Fourth Estate; HarperCollins Canada)
•Dark Lightning, John Varley (Ace)
•My Real Children, Jo Walton (Tor; Corsair)
•Echopraxia, Peter Watts (Tor; Head of Zeus 2015)
•World of Trouble, Ben H. Winters (Quirk)
Novels – Fantasy
•The Widow’s House, Daniel Abraham (Orbit US; Orbit UK)
•The Goblin Emperor, Katherine Addison (Tor)
•Steles of the Sky, Elizabeth Bear (Tor)
•City of Stairs, Robert Jackson Bennett (Broadway; Jo Fletcher)
•Hawk, Steven Brust (Tor)
•The Boy Who Drew Monsters, Keith Donohue (Picador USA)
•Bathing the Lion, Jonathan Carroll (St. Martin’s)
•Full Fathom Five, Max Gladstone (Tor)
•The Winter Boy, Sally Wiener Grotta (Pixel Hall)
•The Magician’s Land, Lev Grossman (Viking; Arrow 2015)
•Truth and Fear, Peter Higgins (Orbit; Gollancz)
•The Mirror Empire, Kameron Hurley (Angry Robot US)
•Resurrections, Roz Kaveney (Plus One)
•Revival, Stephen King (Scribner; Hodder & Stoughton)
•The Dark Defiles, Richard K. Morgan (Del Rey; Gollancz)
•The Bees, Laline Paull (Ecco; Fourth Estate 2015)
•The Godless, Ben Peek (Thomas Dunne; Tor UK)
•Heirs of Grace, Tim Pratt (47North)
•Beautiful Blood, Lucius Shepard (Subterranean)
•A Man Lies Dreaming, Lavie Tidhar (Hodder & Stoughton)
•The Girls at the Kingfisher Club, Genevieve Valentine (Atria)
•California Bones, Greg van Eekhout (Tor)
Like I said, pretty comprehensive. Most of the major candidates are there, ranging from VanderMeer to Leckie to Addison to Bennett. Here are the snubs I noticed:
The Martian, Andy Weir: That’s a good indication that the “industry” doesn’t consider this a 2014 book.
Station Eleven, Emily St. John Mandel: A surprise. Maybe it caught fire too late in the year to make the list?
The First Fifteen Lives of Harry August, Claire North
Most mainstream fantasy novels: no Words of Radiance, no The Broken Eye, no Fool’s Assassin, no Prince of Fools, no The Emperor’s Blades, no The Slow Regard of Silent Things. It says something when you put together a list of 22 fantasy novels and leave out most of the fantasy best-sellers. Is Locus arguing that excellence can’t be achieved in mainstream epic fantasy? Or are they reflecting their audience’s lack of interest in epic series? Sure, there are a few fantasy series on the list—Richard Morgan, Elizabeth Bear, Lev Grossman, Kameron Hurley—but each of those is set up, on some level, as a challenge to more conventional epic fantasy.
There are several books that haven’t gotten an official US publication yet (or at least they aren’t available on Amazon): Lagoon, A Man Lies Dreaming, Bête, and Wolves. You’d think publication would be truly international in 2014, but that’s not yet the case. Lagoon, in particular, would have had a Nebula and Hugo shot if it had gotten a US publication. Without one, it’s probably not eligible for the Nebula, and thus can’t build momentum towards a Hugo.
Lastly, is The Bone Clocks really science fiction? I guess part of the novel takes place in the future, so that’s probably why they placed it in that category. It felt more like a horror/weird fiction/fantasy hybrid to me, but I guess classification doesn’t matter that much in the end.
I’ve been waiting for this list; now that we have it, I’ll update and finalize the Critics Meta-List.
It’s the end of the month, so time to check in on the popularity of the leading Hugo and Nebula contenders!
To do this, I use Goodreads numbers. We don’t have a lot of reliable ways to measure sales in the SFF field. Publishers tend to keep their numbers secret, so we’re left having to estimate via the number of Goodreads or Amazon ratings. Since Amazon bought Goodreads in 2013, the Amazon and Goodreads data have been converging.
I prefer Goodreads because it has a larger sample size. Goodreads is not even close to 100% accurate. My current guess is that it samples 5%-20% of the American readership, and that this varies substantially from author to author. So one author might be sampled at a 20% rate, but another at only 5%. Since Goodreads is now integrated with Amazon Kindle, it very much favors books that sell large numbers through Amazon. Authors that sell large numbers of copies through physical outlets, such as Stephen King, do much worse in these numbers. Goodreads also slants decidedly young in its demographic, with all the various statistical issues that brings. So take these numbers with a grain of salt; they’re better for comparison purposes than as absolutes. Look for differences in order of magnitude, not fine-grain differences such as whether one book has 500 more readers than another.
What I’m looking for is a general sense of which novels are “hot” and which are selling more slowly. Since this is the first year of Chaos Horizon, we don’t know how predictive sales are for the awards; I don’t think there’s a simple correlation like “more sales = greater chances of winning.” However, there is probably some sort of “sales floor” we can discover. Without moving some copies, you’re not popular enough to get nominated or win. I also think picking up lots of readers in January has got to help your Hugo/Nebula chances; the fresher a book is in voters’ minds, the better its chance of picking up votes.
Here’s the full Excel data; it’s getting too bulky to present here on WordPress: Hugo Metrics. I’ve got data going back to October. Methodology: I record the # of Goodreads ratings on the last day of the month.
Table 1: Goodreads Popularity for Selected Hugo and Nebula Contenders
I’m still blown away by how well The Martian is doing. According to Goodreads, more people read The Martian last month than the bottom 15 books on my list combined. That’s sales power! If The Martian proves to be eligible for the 2015 Hugo, it’ll be a formidable competitor. It would be very interesting to see how such a “mainstream” SF hit does against a more literary SFF novel like Annihilation.
Mandel’s Station Eleven is also on fire, putting up a huge 10,000-reader month. I think that number solidifies Mandel’s Nebula chances, and if she grabs a Nebula nomination, she could make a run at a Hugo nomination. I added Skin Game to the list for this month, to get a look at how a popular mainstream urban fantasy novel does against its science fiction and fantasy brethren. Very well, it turns out.
Everyone else is sort of floating along. A lot of books have finished up their hardcover runs, and are waiting for their paperbacks to come along and revitalize sales. If I had a book out, I’d want my paperback to come out in January: look at those huge reader numbers for this month. People must be spending their Holiday gift cards! Frontline possibilities like Annihilation are selling well enough to still be in the mix. I’d point to books like City of Stairs, The Goblin Emperor, and The Mirror Empire, which are all languishing at the bottom of my list. They’ve been doing well on Year-End lists, but they don’t have the sales to match that critical enthusiasm. Here’s a chart showing momentum over the past 3 months:
Table 2: Goodreads Momentum for Selected Hugo and Nebula Contenders
I’m surprised that January did better than December. People must have been too busy with Christmas to read! It’s interesting to see something like The Martian build momentum over the last 3 months; word of mouth really is paying off for a book like that. In contrast, something like Ancillary Sword is basically flat over those same three months. I’m interested to see if such momentum is predictive or not.
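The momentum table is nothing more than month-over-month differences in the ratings snapshots. Here's a minimal sketch of that calculation; the counts below are made up for illustration, not the actual Goodreads numbers:

```python
# Momentum = new Goodreads ratings gained per month-to-month interval.
# Counts are illustrative only (recorded on the last day of each month).
snapshots = {
    "The Martian":     {"Nov": 40000, "Dec": 55000, "Jan": 75000},
    "Ancillary Sword": {"Nov": 4000,  "Dec": 5000,  "Jan": 6000},
}

def monthly_gains(counts, months=("Nov", "Dec", "Jan")):
    """New ratings picked up in each month-to-month interval."""
    return [counts[b] - counts[a] for a, b in zip(months, months[1:])]

for title, counts in snapshots.items():
    print(title, monthly_gains(counts))
```

With these stand-in numbers, The Martian shows gains of [15000, 20000] (building momentum) while Ancillary Sword shows [1000, 1000] (flat), matching the pattern described above.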
A sub-category of my broader genre study, this post addresses the increasing influence of “literary fiction” on the contemporary Hugo and Nebula Awards for Best Novel, 2001-2014. I think the general perception is that the awards, particularly the Nebula, have begun nominating novels that include minimal speculative elements. Rather than simply trust the general perception, let’s look to see if this assumption lines up with the data.
Methodology: I looked at the Hugo and Nebula nominees from 2001-2014 and ranked the books as either primarily “speculative” or “literary.” Simple enough, right?
Defining “literary” is a substantial and significant problem. While most readers would likely acknowledge that Cloud Atlas is a fundamentally different book than Rendezvous with Rama, articulating that difference in a consistent manner is complicated. The Hugos and Nebulas offer no help themselves. Their by-laws are written in an incredibly vague fashion that does not define what “Science Fiction or Fantasy” actually means. Here’s the Hugo’s definition:
Unless otherwise specified, Hugo Awards are given for work in the field of science fiction or fantasy appearing for the first time during the previous calendar year.
Without a clear definition of “science fiction or fantasy,” it’s left up to WorldCon or SFWA voters to set genre parameters, and they are free to do so in any way they wish.
All well and interesting, but that doesn’t help me categorize texts. I see three types of literary fiction entering into the awards:
1. Books by literary fiction authors (defined as having achieved fame before their Hugo/Nebula nominated book in the literary fiction space) that use speculative elements. Examples: Cloud Atlas, The Yiddish Policeman’s Union.
2. Books by authors in SFF-adjacent fields (primarily horror and weird fiction) that have moved into the Hugo/Nebulas. These books often allow readers to see the “horror” elements as either being real or imagined. Examples: The Drowning Girl, Perfect Circle, The Girl in the Glass.
3. Books by already well-known SFF authors who are utilizing techniques/styles more common in literary fiction. Examples: We Are All Completely Beside Ourselves, Among Others.
That’s a broad set of different texts. To cover all those texts—remember, at any point you may push back against my methodology—I came up with a broad definition:
I will classify a book as “literary” if a reader could pick the book up, read a random 50 page section, and not notice any clear “speculative” (i.e. non-realistic) elements.
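As a thought experiment, the 50-page test can even be written down as a procedure. This sketch assumes each book comes annotated with the page ranges where speculative elements appear; the annotations, the `looks_literary` function, and its thresholds are all my own hypothetical constructions, not a real tool:

```python
import random

def looks_literary(total_pages, speculative_ranges,
                   samples=1000, window=50, threshold=0.5):
    """Classify a book as "literary" if most random 50-page windows
    contain no pages from any annotated speculative range."""
    def window_is_realistic(start):
        end = start + window - 1
        # Realistic if the window misses every speculative page range.
        return all(end < lo or start > hi for lo, hi in speculative_ranges)
    starts = [random.randint(1, total_pages - window + 1) for _ in range(samples)]
    realistic = sum(window_is_realistic(s) for s in starts)
    return realistic / samples >= threshold

# A 500-page book whose only speculative section spans pages 400-450
# (a stand-in for the Cloud Atlas structure): roughly 78% of random
# windows land entirely in realistic sections, so it classifies as literary.
print(looks_literary(500, [(400, 450)]))
```

The threshold is the judgment call: set it at 50% and Cloud Atlas reads as "literary"; a book that is speculative on every page never does.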
That’s not perfect, but there’s no authority we can appeal to for these classifications. Let’s see how it works:
Try applying this to Cloud Atlas. Mitchell’s novel consists of a series of entirely realistic novellas set throughout various ages of history and one speculative novella set in the future. If you just picked the book up and started reading, chances are you’d land in one of the realistic sections, and you wouldn’t know it could be considered a SFF book.
Consider We Are All Completely Beside Ourselves, Karen Joy Fowler’s rich meditation on science, childhood, and memory. Told in realistic fashion, it follows the story of a young woman whose parents raised a chimpanzee alongside her, and how this early childhood relationship shapes her college years. While this isn’t the place to decide if Fowler deserved a Nebula nomination—she won the National Book Award and was nominated for the Booker for this same book, so quality isn’t much of a question—the styles, techniques, and focus of Fowler’s book are intensely realistic. Unless you’re told it could be considered a SF novel, you’d likely consider it plain old realistic fiction.
With this admittedly imperfect definition in place, I went through the nominees. For the Nebula, I counted 13 out of 87 nominees (15%) that met my definition of “literary.” While a different statistician would classify books differently, I imagine most of us would be in the same ballpark. I struggled with The City & The City, which takes place in a fictional dual city and utilizes a noir plot; I eventually saw it as being more Pynchonesque than speculative, so I counted it as “literary.” I placed The Yiddish Policeman’s Union as literary fiction because of Chabon’s earlier fame as a literary author. After he establishes the “Jews in Alaska” premise, large portions of the book are straightforwardly realistic. Other books could be read either as speculative or not, such as The Drowning Girl. Borderline cases all went into the “literary” category for this study.
Given that I like the Chabon and Mieville novels a great deal, I’ll emphasize I don’t think being “literary” is a problem. Since these kinds of books are not forbidden by the Hugo/Nebula by-laws, they are fair game to nominate. These books certainly change the nature of the award, and there are real inconsistencies—no Haruki Murakami nominations, no The Road nomination—in which literary SFF books get nominated.
As for the Hugos, only 4 out of 72 nominees met my “literary” definition. Since the list is small, let me name them here: The Years of Rice and Salt (Robinson’s realistically told alternative history), The Yiddish Policeman’s Union, The City & The City, and Among Others. Each of those pushes the genre definitions of speculative fiction. Two are flat-out alternative histories, a category traditionally considered SFF, although I think the techniques used by Robinson and Chabon are very reminiscent of literary fiction. The Mieville is an experimental book, and the Walton is a book as much “about SFF” as SFF. I’d note that 3 of those 4 (all but the Robinson) received Nebula nominations first, and that Nebula noms have a huge influence on the Hugo noms.
Let’s look at this visually:
Even with my relatively generous definition of “literary,” that’s not a huge encroachment. Roughly 1 in 6 of the Nebula noms have been from the literary borderlands, which is lower than what I’d expected. While 2014 had 3 such novels (the Fowler, Hild, and The Golem and the Jinni), the rest of the 2010s had about 1 borderline novel a year.
The Hugos have been much less receptive to these borderline texts, usually only nominating them once the Nebulas already have. We should note that both Chabon and Walton won, once again reflecting the results of the Nebula.
So what can we make of this? The Nebula nominates “literary” books about 1 time in 6, or roughly once per year. The Hugo does this much more infrequently, and usually when a book catches fire in the Nebula process. While this represents a change in the awards, particularly the Nebula, this is nowhere near as rapid or significant as the changes regarding fantasy (which are around 50% Nebula and 30% Hugo). I know some readers think “literary” stories are creeping into the short story categories; I’m not an expert on those categories, so I can’t meaningfully comment.
I’m going to use the 15% Nebula and 5% Hugo “literary” number to help shape my predictions. I may have been overestimating the receptiveness of the Nebula to literary fiction; this study suggests we’d see either Mitchell or Mandel in 2015, not both. Here’s the full list of categorizations. I placed a 1 by a text if it met the “literary” definition: Lit Fic Study.
Now that I’ve put up the Mainstream Best of 2014 Meta-List, I can move on to the far more interesting (and predictive) SFF Critics Meta-List. I’m starting today with Strange Horizons, because their “Best of 2014” list causes an immediate methodological crisis. Thanks, Niall!
Like many of these posts from bigger publications, Strange Horizons is a meta-list unto itself, including short paragraphs highlighting the “Best of 2014” from 18 different critics. These critics represent a large range of important voices in the field, including Hugo nominated authors and fan writers. Of course, Strange Horizons was itself a Hugo nominee for semiprozine (whatever that means) in 2013 and 2014, and is thus likely to carry a fair amount of weight with Hugo voters.
All good so far, and this is exactly what I’m looking for in a predictive list. I figure we collate this list against other similar lists, and we’ll have another indicator of likely Hugo/Nebula nominees and winners. I then collate the indicators, and bam!, I have my predictive model.
My processing practice so far has been to read through the lists and every time a critic mentions a book as a “Best of 2014” (honorable mentions don’t count), to give it 1 point. Simple, or so I thought. In my previous SFF Critics Meta-List collation, I let each mention count for one vote. Thus, since 3 critics from Tor.com’s list mentioned The Goblin Emperor, it got 3 votes. This helped Goblin Emperor win the first collation.
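That one-vote-per-mention tally is easy to sketch in code. Here’s a minimal version using `collections.Counter` (the list data here is hypothetical except for The Goblin Emperor’s 3 mentions on Tor.com, which comes from the post):

```python
from collections import Counter

# Hypothetical per-list tallies: title -> number of critics mentioning it.
# Only The Goblin Emperor's 3 Tor.com mentions are taken from the post.
tor_com = {"The Goblin Emperor": 3, "Annihilation": 1}
guardian = {"Annihilation": 1, "The Bone Clocks": 1}

meta_list = Counter()
for best_of in (tor_com, guardian):
    meta_list.update(best_of)  # every mention counts as one vote

print(meta_list.most_common())
# The Goblin Emperor leads with 3 votes, as in the first collation
```

This is exactly the method that let Goblin Emperor win the first collation: mentions simply accumulate across lists.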
Those multiple votes per list are becoming a problem. Here’s the Strange Horizons list (absent Adam Roberts’s choices, since I already collated them from The Guardian article he wrote, and I didn’t want his choices to count twice):
5 mentions: Annihilation/Southern Reach, VanderMeer, Jeff
2 mentions: J, Jacobson, Howard
2 mentions: The Race, Allan, Nina
2 mentions: Fire in the Unnameable Country, Islam, Ghalib
Everyone else got 1 mention each:
Europe in Autumn, Hutchinson, David
All Those Vanished Engines, Park, Paul
Boy, Snow, Bird, Oyeyemi, Helen
Steles of the Sky, Bear, Elizabeth
Ancillary Sword, Leckie, Ann
Broken Monsters, Beukes, Lauren
The Bone Clocks, Mitchell, David
The Wake, Kingsnorth, Paul
Of Things Gone Astray, Matthewson, Janina
The Causal Angel, Rajaniemi, Hannu
Wolf in White Van, Darnielle, John
The Strange and Beautiful Sorrows of Ava Lavender, Walton, Leslye
The Angel of Losses, Feldman, Stephanie
The Department of Speculation, Offill, Jenny
Tigerman, Harkaway, Nick
The Girl in the Road, Byrne, Monica
Nigerians in Space, Olukotun, Deji Bryce
A good list, broad and deep, with mentions of plenty of the front-runners for the Nebula and Hugo. But can I really give Annihilation 5 points from one list? Clearly, VanderMeer won the Strange Horizons betting pool, but how much influence can I give one publication? If I collate 5 votes, that means Strange Horizons will dominate my meta-list. Not cool. On the other hand, anyone who reads this Best of 2014 is likely to come away with the feeling they better read Annihilation, so is it fair to give it only 1 point? Does that accurately reflect the intent/effect of the article?
Like I said, methodological crisis. Our only option: panic!
Fortunately, Chaos Horizon is just for fun. I’m putting together a list that may or may not predict the Hugos and Nebulas, and, even when I do, we’re only looking at a few hundred data points, not enough to be statistically sound. For the time being, I’m going to give a list like Strange Horizons (and Tor.com, and SF Signal) a maximum of 2 points. Everyone who appears at least once gets 1 point. That final point will be scaled against the multiple mentions. So the top of the Strange Horizons list will look like this in the collation:
2 points: Annihilation/Southern Reach, VanderMeer, Jeff
1.25 points: J, Jacobson, Howard
1.25 points: The Race, Allan, Nina
1.25 points: Fire in the Unnameable Country, Islam, Ghalib
So, beyond the initial mention, VanderMeer got 4 more mentions: 4/4 = 1 extra point. Jacobson got 1 more mention: 1/4 = .25. What do you think? Fair? Unfair?
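Spelled out as code, the capped scoring might look like this (a sketch: the function name and dict layout are my own, but the arithmetic follows the worked example above):

```python
def capped_points(mentions, cap=2):
    """1 base point per appearance, plus extra mentions scaled so the
    list's most-mentioned book gets exactly `cap` points."""
    max_extra = max(count - 1 for count in mentions.values())
    scale = (cap - 1) / max_extra if max_extra else 0
    return {title: 1 + (count - 1) * scale
            for title, count in mentions.items()}

# The top of the Strange Horizons list
strange_horizons = {
    "Annihilation": 5, "J": 2, "The Race": 2, "Tigerman": 1,
}
print(capped_points(strange_horizons))
# Annihilation: 2.0, the 2-mention books: 1.25, single mentions: 1.0
```

One design note: because the bonus point is scaled against that list’s own maximum, a dominant pick like Annihilation always tops out at 2 points no matter how lopsided a single publication’s consensus is.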
I’ll be back tomorrow with SF Signal’s list and some more comments on the methodology for the SFF Critics Meta-List, and then I’ll recollate the list. Things are heating up this award season, so it’ll be interesting to see who pulls ahead with my evolving SFF Critics list methodology.
I’m doing some organizing work here at Chaos Horizon, so let me put up something I’ve been meaning to for a while: blank data sets for the Hugo and Nebula Awards for Best Novel, from the beginnings to the present. These are Excel formatted lists of the Hugo and Nebula winners + nominees, sorted by year and author. It was a pain to put these together, but now that they’re cleanly formatted I wanted to share them with the community.
So long story short, anyone who wants to do their own statistical study of the Hugo and Nebulas is free to use my worksheets. Excel is a powerful tool, and given the relatively small size of the data sets—311 Nebula nominees, 288 Hugo nominees—it isn’t too hard to use. With only a little work—and data entry—you can be generating your own tables and graphs in no time. I’m also somewhat confident Google Docs can work with these, although I never use Google Docs myself.
The guiding principles of Chaos Horizon have always been neutrality and methodological/data transparency. Statistics are at their most meaningful when multiple statisticians are working on the same data sets. There’s a lot of information to be sorted through, and I look forward to what other statisticians will find. If you do a study, drop me an e-mail at firstname.lastname@example.org or link in the comments.
Here’s the Excel File: Blank Hugo and Nebula Data Set. I’ll also perma-link this post under “Resources.”
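If Excel isn’t your tool of choice, the worksheets should also load cleanly into pandas. Here’s a sketch of the kind of decade-by-decade breakdown used on this site—note the column names, file name, and the stand-in rows below are all hypothetical (the rows are invented purely to show the computation, not real classifications):

```python
import pandas as pd

# Invented stand-in rows in the rough shape of the worksheet.
# To use the real file instead, something like:
# df = pd.read_excel("blank_hugo_nebula_data.xlsx")
df = pd.DataFrame({
    "Year":  [1982, 1988, 1991, 2003, 2005, 2009, 2012, 2013],
    "Genre": ["Fantasy", "SF", "Fantasy", "SF",
              "Fantasy", "SF", "Fantasy", "SF"],
})

# Bucket each nominee into its decade, then take the fantasy percentage
df["Decade"] = (df["Year"] // 10) * 10
fantasy_pct = (df.groupby("Decade")["Genre"]
                 .apply(lambda g: round((g == "Fantasy").mean() * 100, 1)))
print(fantasy_pct)
```

A few lines like these reproduce the percentage-per-decade charts from the genre studies, and swapping the lambda lets you slice by any other column you add to the sheet.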
Yesterday, we looked at the Nebula slate; today, we’ll look at the Nebula winners. I show seven fantasy novels (out of 50 winners total; there was a tie in 1967) as having won the Nebula Award:
1982: Claw of the Conciliator, Gene Wolfe
1988: The Falling Woman, Pat Murphy
1991: Tehanu, Ursula K. Le Guin
2003: American Gods, Neil Gaiman
2005: Paladin of Souls, Lois McMaster Bujold
2009: Powers, Ursula K. Le Guin
2012: Among Others, Jo Walton
Interestingly, the 1980s were better for winning than the 1990s (we’ll see that also reflected in the Hugo in the upcoming days), and things have picked up a great deal in the last 15 years for fantasy. This is a pretty broad slice of fantasy: we have secondary world novels with Bujold and Le Guin, contemporary fantasy with Walton and Gaiman, and Wolfe’s nearly unclassifiable Dying Earth style book. Here’s the data and charts:
The chart is pretty zig-zaggy because we’re dealing with such small numbers (10 per decade), although you do see a gradual increase over time in the direction of fantasy wins. Still, the “win” chart is nowhere near as dramatic as the “nominee” chart, suggesting that it’s easier to get nominated as a fantasy novel than to win as a fantasy novel.
We can conclude that fantasy novels tend to underperform once they reach the slate: since 1980, fantasy novels have made up 32% of the slate but only account for 20% of the wins. That’s a consistent bias against fantasy novels winning, something I need to take into account for my future predictions.
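One way to put a number on that gap is an exact one-sided binomial test: under the null hypothesis that winners are drawn in proportion to the slate, how likely is it we’d see this few fantasy wins? The counts below are my rough approximations from the figures in the post (about 35 winners since 1980, 7 of them fantasy, against a 32% slate share):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# ~35 Nebula winners since 1980, 7 fantasy, vs. a 32% slate share
p_value = binom_cdf(7, 35, 0.32)
print(f"one-sided p-value: {p_value:.3f}")
```

With these rough counts the p-value comes out around 0.09—a real lean against fantasy, though whether it clears formal significance thresholds depends on exactly how you count, and the sample is small either way.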
In an odd way, the more fantasy novels get nominated, the harder it can be for a fantasy novel to win, as the fantasy vote ends up getting split between the slate. 2013 is a perfect example of this: one SF novel faced off against 5 fantasy novels. 2312 ended up winning, because I imagine all the “the Nebula should go to a SF novel” SFWA voters voted for Robinson, and the fantasy votes were spread out across the other 5. If we’re considering genre alone, fantasy books are at a disadvantage when it comes to winning. Of course, genre alone does not determine the winner, as many other factors—familiarity, reception, popularity, demographics, etc.—also come into play.
In a statistical study like this, you have to think about what the “baseline” might be, i.e. what the stats would be like without bias. Is the Nebula an award moving toward a 50/50 split between fantasy and science fiction? Why should/would 50/50 be the baseline? Isn’t fantasy more popular than science fiction, at least in terms of readers in 2014? What about critical prestige? What about the nebulous and nearly impossible to define idea of “tradition”? How about the bias towards well-known authors? How about potential biases regarding gender? What about the bias against books in a series or sequels?
All of these are going to factor into the eventual fantasy/science fiction split the Nebula arrives at, and all these factors change over time. Trying to cross-correlate all those variables before we have a basic understanding is only going to result in mass confusion. As Chaos Horizon slowly builds up its data sets, the best we can do is think about the statistical moment we’re at right now, as predicting even the next 5 years is very difficult. So, to sum up the situation for the Nebula:
1. The Nebula slate breaks into three eras: a 15 year period (1966-1980) where fantasy was largely excluded, a 30 year period (1980-2010) where fantasy was around 25%-30% of the slate, and a more recent era (2010-2014) where fantasy has overtaken SF on the slate. Are the last 5 years a statistical aberration or something that is likely to continue?
2. The Nebula winners have been more consistent since 1980, accounting for around 20% of the wins, with a general increase in % of winners over time. Be aware that these conclusions are shakier because the numbers are smaller. Nonetheless, fantasy novels have underperformed on the slate, winning at a smaller proportion than their SF peers.
Tomorrow, we’ll look at how the Hugo Award nominees have shaped up! Any questions so far?
In this post, we’ll look at how genre impacts the Nebula Award for Best Novel from the start of that award (1966) up to the present day (2014). Let’s jump right into the data:
The streams have crossed! Lame Ghostbusters joke aside, there’s a lot of information to sort through here. Obviously, that cross in the 2010s is going to jump out the most, but let’s make some other observations:
1. The Nebula embraced fantasy nominees fairly early in its history, starting primarily in 1981. This was a surprise to me; I thought the change would have come later. The Nebula has been awarded for almost 50 years, and it’s only for the first 15 years that this was exclusively a SF award.
2. The Nebula was fairly consistent through the 1980s, 1990s, and 2000s, nominating around 25%-30% fantasy novels for that 30 year period, or roughly 2 fantasy novels per year.
3. The Nebula has changed drastically in the last 5 years. While the 2010s aren’t over, we’re more than halfway through, and already more fantasy novels have been nominated this decade than in the 2000s. Even if you believe the numbers are a little skewed, a retreat back to that 30% number is statistically unlikely.
1960s: A straightforward SF decade. The Nebula was still finding its way, and we have a very erratic # of nominees per year: 1966 saw 12 nominees, 1967 only 3. The only “Other” book in this decade was James Blish’s Black Easter, an outstanding horror-themed book about demonic summoning. This is best read with its companion volume The Day After Judgment, usually collected together as Devil’s Day. Blish, of course, was already well-known to the SF audience, and this pattern—genre-borderline books getting nominated if they’re by well-known authors—will continue for the next several decades.
1970s: Plenty of “Other” books from this decade. In 1976, the Nebula nominated an overwhelming 18 novels for the award (check out the sfadb for the full list). With that massive list, some unusual choices creep in: Italo Calvino’s Invisible Cities and E.L. Doctorow’s Ragtime. Throw in 1974’s nomination for Thomas Pynchon’s Gravity’s Rainbow and we have our first inklings of the Nebula’s sympathy for literary fiction. I don’t know if you could classify any of those novels as science fiction or fantasy, although I’ll listen if anyone wants to try.
There are some other hard-to-classify novels from the 1970s. I never know what to do with R.A. Lafferty, and he received a 1972 nomination for The Devil is Dead. Along this horror angle, Robert Silverberg grabbed a 1973 nomination for The Book of Skulls. I know people wouldn’t blink an eye if I classed this as SF, but it, at least in my opinion, is basically a realistic novel with a few horror elements. As a pure aside, this is part of Silverberg’s great “death” trilogy alongside Dying Inside and “Born with the Dead.” In my opinion, these three texts are Silverberg’s greatest achievement as an author, and if you can handle the gloom factor, they’re excellent reading.
Larry Niven and Jerry Pournelle were nominated for Inferno in 1977, a variation on Dante’s Inferno. You can see that in the 1970s, the best way to get a Nebula if you aren’t writing SF is to write something horror themed, particularly if it has “devil” or “death” in the title.
Lastly, we see our first fantasy books pop up in this decade. Poul Anderson received a 1976 nomination for A Midsummer Tempest, a Shakespeare-themed, magic-infused alternative-history book. Anderson, though, was already a SF star, and he was one of that strange 18-nominee year. Richard Lupoff’s Sword of the Demon, nominated in 1978, is a Japanese-themed fantasy about demon-killing, and it fits the pattern of needing a horror theme in the title to make it into the Nebulas.
So, all told, the 1970s show a definite loosening of genre-boundaries in the Nebula, although this seems to be more inflected in the direction of horror or literary fiction than fantasy.
1980s: This is where things get interesting. Beginning in 1981, fantasy arrives in a major way: Robert Stallman’s The Orphan and, more significantly, Gene Wolfe’s Shadow of the Torturer.
Wolfe’s four volume The Book of the New Sun is the critical series for this decade. Each volume received a nomination, with the second volume (Claw of the Conciliator) winning the Nebula. New Sun is a difficult and hard to classify series. Drawing on elements of Jack Vance’s Dying Earth, it hovers on the line between fantasy and science fiction, a fact that I think helped it get nominated. Taking place in the far future, it initially seems to be pure fantasy, only to have some technological elements revealed in the later volumes. The Locus Magazine reviewers were equally confused: volumes 1-3 were voted as fantasy, and volume 4 made it as science fiction. In the 2012 Locus Century poll, it makes the list of both “20th Century Science Fiction Novel” and “20th Century Fantasy Novel.” Maximum confusion for everyone! I ultimately classified the four volumes just as the Locus voters saw them: #1-#3 as fantasy, #4 as science fiction. Make of that what you will.
Wolfe was a driving wedge, though, and after 1981 more and more clearly-fantasy books get nominated: John Crowley’s Little, Big, Jack Vance’s Lyonesse, Orson Scott Card’s Red Prophet, as well as Wolfe’s own Soldier of the Mist. By the time the decade is over, 14 fantasy novels have been nominated, and fantasy wins again in 1988 with Pat Murphy’s Mayan-influenced The Falling Woman.
By this time, the Nebula has loosened its genre policing. While some of these fantasy nominees were already well known for their SF (Wolfe, Vance, Card), others were not, and we see fantasy novels by lesser known authors pop up on the list. We aren’t seeing, though, fantasy novels by writers like Terry Brooks, Stephen Donaldson, David Eddings, Mercedes Lackey, Marion Zimmer Bradley, etc. (i.e. the books that are bestsellers). No nomination for The Mists of Avalon might be the most surprising.
1990s: The 1990s are filled with fantasy nominations. To mention some of the bigger ones: Elizabeth Scarborough’s The Healer’s War, Ursula K. Le Guin’s Tehanu, Patricia McKillip’s Winter Rose, and George R.R. Martin’s A Game of Thrones. There are also a number of nominations by lesser known authors, showing a real openness in the Nebula to different types of fantasy literature. Notice there aren’t a lot of nominations for what we might think of as traditional “epic” fantasy: secondary world, part of a multi-volume series, etc. I’ll be taking a closer look at that in a few posts.
2000s: Two more nominations for George R.R. Martin, as well as multiple nominations for Nalo Hopkinson and Lois McMaster Bujold. Even someone like Terry Pratchett is able to get into the mix, scoring a 2006 nomination for Going Postal. Plenty of lesser-known authors grab nominations. For instance, China Mieville is nominated for Perdido Street Station. While it’s hard to remember now, Mieville was unknown at the time: to grab a Nebula nomination for a fantasy debut marks a major change.
We also have a broad range of fantasy novels nominated this decade, from more contemporary fantasy like American Gods to 19th century fantasy like Jonathan Strange & Mr Norrell to secondary world books like the Bujold, Martin, or Pratchett.
It’ll be a few more years, though, until fantasy takes over the Nebulas. Fantasy is still stuck around the 30% mark . . .
2010s: And that 30% jumps to 60% for this decade. We’ve seen an explosion of fantasy nominations in the last five years: 2010 had 4 fantasy nominations and only 2 SF nominations, 2012 was the same, and 2013 saw one lone SF novel face off against 5 fantasy contenders. Why the rapid acceleration? I, quite frankly, have no idea. The fantasy novels being nominated now come from all versions of fantasy: contemporary (Gaiman’s The Ocean at the End of the Lane, Walton’s Among Others), historical (Griffith’s Hild and Kowal’s Shades of Milk and Honey), experimental (VanderMeer’s Finch), literary (Wecker’s The Golem and the Jinni), and secondary world (the multiple nominations for Jemisin, Ahmed’s Throne of the Crescent Moon).
So, to sum up: we saw a slow start for fantasy from the 1960s through the 1970s, with horror-themed books breaking the genre divide. Beginning in 1981, fantasy leapt into the Nebulas, occupying around 25%-30% of the award. That held steady until 2010, when fantasy leapt into the lead. All genres of fantasy now seem welcome in the Nebulas, and it’s going to be fascinating to see what happens going forward.
If you want to look at the Excel sheet with the genre classifications, here it is: Nebula Genre Study.