
Hugo Award Nomination Ranges, 2006-2015, Part 3

Today, we’ll start getting into the data for the fiction categories in the Hugo: Best Novel, Best Novella, Best Novelette, and Best Short Story. I think these are the categories people care about the most, and it’s interesting how differently the four of them work. Let’s look at Best Novel today and the other categories shortly.

Overall, the Best Novel is the healthiest of the Hugo categories. It gets the most ballots (by far), and is fairly well centralized. While thousands of novels are published a year, the contenders are widely enough read, reviewed, and buzzed about that the Hugo audience converges on a relatively small number of novels every year. Let’s start by taking a broad look at the data:

Table 5: Year-by-Year Nominating Stats Data for the Hugo Best Novel Category, 2006-2015

That table lists the total number of ballots for the Best Novel Category, the number of votes the High Nominee received, and the number of votes the Low Nominee (i.e. the novel in fifth place) received. I also calculated the High % and Low % by dividing those vote counts by the total number of ballots. Remember, if a work does not receive at least 5%, it doesn’t make the final ballot. That rule has not been invoked in the Best Novel category in the previous 10 years.
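As a quick illustration of that arithmetic (the ballot counts below are made-up placeholders, not figures from Table 5), the percentage and 5%-rule check look like this in Python:

```python
# Percentage calculation described above; the counts are placeholders,
# not actual Table 5 figures.
total_ballots = 1000   # total nominating ballots in the Best Novel category
high_nom = 200         # votes received by the top nominee
low_nom = 90           # votes received by the fifth-place nominee

high_pct = high_nom / total_ballots * 100   # 20.0
low_pct = low_nom / total_ballots * 100     # 9.0

# A work must appear on at least 5% of ballots to make the final ballot.
print(f"High: {high_pct:.1f}%, Low: {low_pct:.1f}%, passes 5% rule: {low_pct >= 5.0}")
```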

A couple of notes on the table. The 2007 packet did not include the number of nominating ballots per category, hence the blank spots. The red-flagged 700 indicates that the 2010 Hugo packet also didn’t give the number of nominating ballots. It did give percentages, and I used those to back-calculate the number of ballots. They rounded, though, so that figure may be off by +/- 5 votes or so. The other red flags under “Low Nom” indicate that authors declined nominations in those years, both times Neil Gaiman, once for Anansi Boys and once for The Ocean at the End of the Lane. To preserve the integrity of the stats, I went with the book that was originally in fifth place. I didn’t mark 2015, but I think we all know that this data is a mess, and we don’t even really know the final numbers yet.
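For that reconstructed 2010 figure, the back-calculation from a rounded percentage works roughly like this (the inputs are illustrative, not the actual 2010 packet values):

```python
# Back out the number of nominating ballots from a rounded percentage,
# as was done for 2010. The inputs are illustrative, not real packet values.
reported_pct = 14.3    # percentage printed in the packet, rounded to one decimal
votes_for_work = 100   # raw vote count printed for the same work

estimated_ballots = round(votes_for_work / (reported_pct / 100))

# Because the packet rounded the percentage, any ballot total whose true
# percentage rounds back to the reported value is consistent with the data;
# that spread is the +/- 5 or so votes of uncertainty mentioned above.
consistent = [b for b in range(estimated_ballots - 20, estimated_ballots + 21)
              if round(votes_for_work / b * 100, 1) == reported_pct]
print(estimated_ballots, min(consistent), max(consistent))  # 699 697 701
```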

Enough technicalities. Let’s look at this visually:

Chart 5: Best Novel Data

That’s a soaring number of nominating ballots, while the high and low ranges seem to be languishing a bit. Let’s switch over to percentages:

Chart 6: Best Novel % Data

Much flatter. Keep in mind I had to shorten the year range for the % graph, due to the missing 2007 data.

Even though the number of ballots is soaring, the % ranges are staying somewhat steady, although we do see year-to-year perturbation. The top nominees have been hovering between 15% and 22.5%. Since 2009, every top nominee has managed at least 100 votes. The bottom nominee has been in that 7.5%-10% range, safely above the 5% minimum. Since 2009, those low nominees all managed at least 50 votes, which seems low (to me; you may disagree). Even in our most robust category, 50 readers liking your book can get you onto the Hugo ballot, and they don’t even have to like it the most: it could be their 5th favorite book on their ballot.

With the low range that low, it doesn’t (or wouldn’t) take much to place an individual work onto the Hugo ballot, whether by slating or other types of campaigning. Factors like number of sales (more readers = more chances to vote) and audience familiarity (readers are more likely to read and vote for a book by an author they already like) could easily push a book onto the ballot over a more nebulous factor like “quality.” That’s certainly what we’ve seen in the past, with familiarity being a huge advantage in scoring Hugo nominations.

With our focus this close, we see a lot of year-to-year irregularity. Some years are stronger in the Novel category, others weaker. As an example, James S.A. Corey actually improved his percentage total from 2012 to 2013: Leviathan Wakes grabbed 7.4% (71 votes) for the #5 spot in 2012, and then Caliban’s War took 8.1% (90 votes) for the #8 spot in 2013. That kind of oddity (more Hugo voters, both in sheer numbers and percentage-wise, liked Caliban’s War, but only Leviathan Wakes got a Hugo nomination) has always defined the Hugo.

What does this tell us? This is a snapshot of the “healthiest” Hugo category: rising votes, a high nom average of about 20%, a low nom average of around 10%. Is that the best the Hugo can do? Is it enough? Do those ranges justify the weight fandom places on this award? Think about how this will compare to the other fiction categories, which I’ll be laying out in the days to come.

Now, a few other pieces of information I was able to dig up. The Worldcons are required to give data packets for the Hugos every year, but different Worldcons choose to include different information. I combed through these to find some more vital pieces of data, including Number of Unique Works (i.e. how many different works were listed on all the ballots, a great measure of how centralized a category is) and Total Number of Votes per category (which lets us calculate how many nominees each ballot listed on average). I was able to find parts of this info for 2006, 2009, 2013, 2014, and 2015.

Table 6: Number of Unique Works and Number of Votes per Ballot for Selected Best Novel Hugo Nominations, 2006-2015

I’d draw your attention to the ratio I calculated, which is the Number of Unique Works / Number of Ballots. The higher that number is, the less centralized the award is. Interestingly, the Best Novel category is becoming more centralized as the number of voters grows, not less. I don’t know if that is the impact of the Puppy slates alone, but it’s interesting to note nonetheless. That might indicate that the more voters we have, the more the votes will cluster together. I’m interested to see whether the same trend holds for the other categories.
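Here is a minimal sketch of the two derived figures from Table 6, using placeholder inputs rather than the actual packet numbers:

```python
# Derived figures from the nominating data; the inputs are placeholders.
unique_works = 330    # distinct novels that appeared on at least one ballot
total_ballots = 1000  # nominating ballots cast in the category
total_votes = 3200    # sum of all nominations across those ballots

# Higher ratio = votes spread across more works = less centralized category.
centralization_ratio = unique_works / total_ballots

# Average number of works each nominator listed (a ballot maxes out at 5).
votes_per_ballot = total_votes / total_ballots

print(f"unique works per ballot: {centralization_ratio:.2f}")    # 0.33
print(f"average nominations per ballot: {votes_per_ballot:.2f}")  # 3.20
```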

Lastly, look at the average number of votes per ballot. Your average Best Novel nominator votes for over 3 works. That seems like good participation. I know people have thrown out the idea of restricting the number of nominations per ballot, either to 4 or even 3. I’d encourage people to think about how much of the vote that would suppress, given that some people vote for 5 and some people only vote for 1. Would you lose 5% of the total vote? 10%? I think the Best Novel category could handle that reduction, but I’m not sure other categories can.
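If a Worldcon ever published the distribution of how many works each ballot lists, the suppression from a cap could be estimated directly. The distribution below is invented purely for illustration, since the packets don’t include it:

```python
# Estimate how many nominations a per-ballot cap would discard.
# The distribution of ballots by number of works listed is invented
# for illustration; it is not published in the Hugo packets.
ballots_by_count = {1: 150, 2: 130, 3: 180, 4: 190, 5: 350}  # works listed -> ballots

def suppressed_share(cap):
    total = sum(n * b for n, b in ballots_by_count.items())
    lost = sum((n - cap) * b for n, b in ballots_by_count.items() if n > cap)
    return lost / total

for cap in (4, 3):
    print(f"cap at {cap}: {suppressed_share(cap):.1%} of nominations discarded")
```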

Think of these posts—and my upcoming short fiction posts—as primarily informational. I don’t have a ton of strong conclusions to draw for you, but I think it’s valuable to have this data available. Remember, my Part 1 post contains the Excel file with all this information; feel free to run your own analyses and number-crunching. If you see a trend, don’t hesitate to mention it in the comments.
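If you’d rather poke at the spreadsheet in code than in Excel, something like this works; the filename, sheet name, and column names here are guesses on my part, so adjust them to match the actual Part 1 download:

```python
# Load the nominating-data spreadsheet for your own number-crunching.
# Filename, sheet name, and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_excel("hugo_nominating_data.xlsx", sheet_name="Best Novel")

# Example: recompute the High % and Low % columns from the raw counts.
df["High %"] = df["High Nom"] / df["Ballots"] * 100
df["Low %"] = df["Low Nom"] / df["Ballots"] * 100
print(df[["Year", "High %", "Low %"]])
```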


Best of 2014: NPR

NPR (National Public Radio) has put up their “Best Books of 2014” webpage, including a long list of SFF novels from 2014. While certainly leaning more to the literary than the genre side, NPR is a little more tuned in to what’s happening in SFF than some other mainstream outlets. I’ve listed just the novels below; the full NPR list includes plenty of comics, picture books, collections, reprints, etc.

The Bone Clocks, David Mitchell
On Such a Full Sea, Chang-Rae Lee
Station Eleven, Emily St. John Mandel
Ancillary Sword, Ann Leckie
Horrorstor, Grady Hendrix
Afterparty, Daryl Gregory
The Martian, Andy Weir
The Memory Garden, Mary Rickert
The Bees, Laline Paull
The Man with the Compound Eyes, Wu Ming-Yi
Valour and Vanity, Mary Robinette Kowal
California, Edan Lepucki
Whiskey Tango Foxtrot, David Shafer
The Tropic of Serpents, Marie Brennan
The Peripheral, William Gibson
The Book of Strange New Things, Michel Faber
Southern Reach, Jeff VanderMeer
The Three-Body Problem, Cixin Liu

Note the near total absence of epic fantasy—no Sanderson, Weeks, Hurley, Bear, Hobb, Rothfuss, etc. Other than that, this includes a lot of the Hugo and Nebula favorites. Echopraxia once again fails to make a year-end list, and The Peripheral continues to make these lists. On the basis of that popularity, I’d have to consider Gibson much more likely to make a slate than Watts this year.

A pattern for these lists is emerging: the big “literary” SFF novels of the year (Mandel, Mitchell, Lee, Faber) joined with the big “genre” novels of the year (Gibson, Leckie, Weir, VanderMeer). I wonder if that’s how the Nebula is going to play out, as a battle between “literary” and “genre.” It would be interesting to see what kind of debates a Nebula slate of VanderMeer/Leckie/Gibson on one side versus Mitchell/Mandel/Lee on the other would generate.
