Hugo Contenders and Popularity, October 2014
Awards season moves ever closer! Right now, the Hugo and Nebula Awards for 2015 are still wide-open: what we have is disorganized sentiment, and that’s going to begin to get organized over the next 2-3 months. Some of the first posts about the 2015 Hugos are beginning to appear, including this excellent post from A Dribble of Ink. As more and more people begin talking about the Hugos and Nebulas, new contenders are going to emerge.
To help give those discussions some perspective, I’m going to launch a new monthly feature of Chaos Horizon: checking on the popularity of the major Hugo contenders. Without further ado, here’s the chart, with all numbers taken from Goodreads as of October 31, 2014:
The chart lists the number of times each book has been rated on Goodreads, along with its overall average rating. I think this chart is interesting because it shows some of the differences in readership and reception among these texts. While The Mirror Empire was embraced by critics on SFF websites, it’s been outread almost 75 to 1 by Words of Radiance. Sure, the Sanderson has been out for 8 months and the Hurley only 2, but that’s a staggering difference in number of readers. Also look at the scores: 3.81 for Hurley, 4.76 for Sanderson. Have enough people read and liked The Mirror Empire for it to make the slate?
When you see the numbers like this, it’s clear how popular Words of Radiance and The Martian actually are, and how quickly writers like Scalzi or Mitchell are moving copies of their books. Based on number of ratings alone, Annihilation looks like a strong candidate. I don’t know yet how much raw popularity factors into the Hugo awards, but it has to be a factor, doesn’t it? I think by touching base with the popularity of these books every month, we can get a good idea of how often they’ve been read, and, in turn, how many voters can vote for them. That is, if you believe Hugo voters only vote for novels they’ve read . . .
About the Chart: One of the frustrations I have about the publishing industry is how secretive they are with numbers. The movie, music, and television industries are all relatively transparent about their numbers, and publish them regularly. We know within a few hours how well a blockbuster movie did at the box office, for instance.
For books—it’s all a deep, dark secret. The bestseller lists we have, like the NYTimes list, are calculated using obscure and byzantine formulas, and they don’t even release estimates of numbers sold. Bookscan covers a solid portion of physical book sales, but all those numbers are locked behind an extraordinarily expensive paywall. Publishers Weekly gives us some Bookscan numbers, but only for top-selling books—which often excludes SFF novels. So we’re left having to estimate popularity through word of mouth, blog traffic, Amazon sales ranks, and the like.
I’ve thought long and hard about what the best measure of popularity might be, and I’ve settled on Goodreads as our current, most reliable measure of a book’s popularity. It’s not perfect by any means, and we don’t know the correlation between the # of Goodreads ratings and total sales (I’d estimate ratings run 5%-20% of sales). What we do know is the following:
1. Goodreads has tons of users. For an average SFF book, there might be anywhere from 1,000 to 50,000 ratings, and that has to represent a significant % of overall readers. Goodreads tends to have at least 10x the number of ratings that Amazon.com does, for instance.
2. Goodreads doesn’t distinguish between electronic and print versions of a book. If you’ve read the book, you can rate it, no matter the format. To my mind, that makes Goodreads even more reliable than Bookscan.
3. Even if Goodreads is somewhat skewed (the % of Goodreads readers is not 1:1 with the general reading public), it’s likely to always be skewed the same way. This makes for good “apples to apples” comparisons. Or, in other words, Goodreads is equally unfair to everyone.
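To make the ratings-to-sales guesswork concrete, here’s a minimal sketch of how that 5%-20% estimate translates rating counts into a rough sales range. The 5%-20% figure is my own rough guess from above, and the rating counts in the example are illustrative placeholders, not the actual numbers from the October 2014 chart:

```python
# Sketch: turn a Goodreads rating count into a rough sales range,
# assuming ratings are somewhere between 5% and 20% of total sales.
# These ratios are guesses, not verified industry figures.
RATINGS_AS_SHARE_OF_SALES_LOW = 0.05   # 5% -> gives the HIGH sales bound
RATINGS_AS_SHARE_OF_SALES_HIGH = 0.20  # 20% -> gives the LOW sales bound

def estimated_sales_range(num_ratings):
    """Return a (low, high) estimate of total sales for a book."""
    low = round(num_ratings / RATINGS_AS_SHARE_OF_SALES_HIGH)
    high = round(num_ratings / RATINGS_AS_SHARE_OF_SALES_LOW)
    return (low, high)

# Placeholder rating counts (hypothetical, roughly echoing the
# 75-to-1 gap discussed above):
books = {"Heavily rated book": 75000, "Lightly rated book": 1000}
for title, ratings in books.items():
    low, high = estimated_sales_range(ratings)
    print(f"{title}: {ratings:,} ratings -> est. {low:,} to {high:,} sales")
```

Even this crude arithmetic makes the point: a 75-to-1 gap in ratings implies a sales gap of the same order, regardless of which conversion ratio turns out to be right.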
A couple things to keep in mind:
1. The pool of Hugo voters is not the same as the general reading public. Hardcore SFF fans may like different things about novels than the more general pool of Goodreads readers.
2. We don’t know how popularity correlates to the Hugo awards. I’ll need to collect data for a few years to begin to see patterns.
3. We don’t know if Goodreads is skewed towards certain authors. Most social media sites skew young, and books that appeal to readers in their teens and twenties may do better on such sites. I’ve got some ideas to figure this out, but it’s going to take time.
4. Goodreads only works for comparisons within the same year. More people join Goodreads all the time. Also, I can’t go back in time to measure how popularity worked in previous years; as soon as novels get Hugo or Nebula nominations/wins, that greatly increases their popularity, and we can’t know whether those novels sold well before or after they were nominated.
So, what do you think of this measure of popularity? Can it help us understand the Hugos or Nebulas better?