17 correct out of 24 is 71%, or a “C-”.
Generally, the results indicate that our model performed below average. In the impossible game of predicting the Oscars, though, that score is above average, and by a good bit: we won our prediction competition in a tie with Walt Hickey of FiveThirtyEight. Our model was an aggregation model, meaning we predicted that the films most people thought would win would in fact win. This turned out to be true most of the time, but it was wrong 7 times.
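As a rough sketch, an aggregation model like the one described can be as simple as counting each expert's pick and converting the counts into win probabilities. The experts and picks below are hypothetical, and the real model may weight sources differently; this is just an unweighted illustration.

```python
from collections import Counter

def aggregate(predictions):
    """Turn expert picks into win probabilities for one category.

    predictions: dict mapping expert name -> predicted winner.
    Hypothetical, unweighted sketch; a real model might weight experts.
    """
    counts = Counter(predictions.values())
    total = sum(counts.values())
    return {film: count / total for film, count in counts.items()}

# Hypothetical expert picks for a single category
picks = {
    "Expert A": "La La Land",
    "Expert B": "La La Land",
    "Expert C": "Hacksaw Ridge",
    "Expert D": "La La Land",
}
probs = aggregate(picks)
# La La Land gets 3/4 = 75%, Hacksaw Ridge gets 1/4 = 25%
```

The upside of this approach is that it rides the consensus; the downside, as the misses below show, is that when most experts are wrong together, the model is wrong with them.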
If you want to look at what we got right and why we got it right, check out our predictions. In this piece we will examine the categories we got wrong and the who, what, where, when, and why of the night, in reverse order.
Sound Mixing
Prediction: La La Land with a 70% chance
Actual: Hacksaw Ridge, which we gave a 15% chance
Why: Hacksaw Ridge had a sound mixer on its team, Kevin O’Connell, who had gone winless through 20 prior nominations. In the final week before the Oscars, after the Academy had already begun voting, his story began to get a lot of buzz. It would appear that buzz pushed him and his team to victory and an Academy Award.
When: At the anticipated time.
Where: In heaven, Kevin’s mom smiled down on her son as he lovingly thanked her for encouraging him to pursue this dream.
What: Next year, we will examine both past nominations and pre-voting buzz as indicators of Oscar success.
Who: Vanity Fair, the only predictions we used in our aggregation that beat our own final predictions, got this category right.
Sound Editing
Prediction: Hacksaw Ridge with a 61% chance
Actual: Arrival, which we gave a 15% chance
Why: This one is still a bit of a puzzler to me. Hollywood liked Arrival more than it likes most sci-fi films, so perhaps the Academy wanted to give it something. Or, once voters started rallying behind Kevin, it is possible they didn’t want to give both sound awards to the same movie. Unfortunately, at this point there isn’t a clearer explanation available.
When: Right before Kevin’s victory.
Where: At the Academy Awards, as everyone held their breath waiting to see if Kevin would finally be recognized.
What: Next year, we will examine how the other awards a film is nominated for or expected to win influence its chances for success in a particular category.
Who: Microsoft’s predictions posted to Bing gave Arrival the greatest chance of winning this category. If we hadn’t included them in our aggregation, the film would have had only a 7% chance of winning, and we would have looked even worse.
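That sensitivity to a single source can be sketched in a few lines. The picks and weights below are hypothetical (not the actual model's inputs); the point is simply that when one expert is a film's only backer, dropping that expert collapses the film's aggregated probability.

```python
def win_probability(film, picks, weights):
    """Weighted share of expert picks that went to `film`.

    picks: dict expert -> predicted winner; weights: dict expert -> weight.
    Hypothetical sketch of one way an aggregation model could score a film.
    """
    total = sum(weights[e] for e in picks)
    votes = sum(weights[e] for e, winner in picks.items() if winner == film)
    return votes / total

# Hypothetical picks: "Bing" is Arrival's only backer in this category
picks = {"Bing": "Arrival", "Expert A": "Hacksaw Ridge", "Expert B": "Hacksaw Ridge"}
weights = {"Bing": 1.0, "Expert A": 1.0, "Expert B": 1.0}

with_bing = win_probability("Arrival", picks, weights)  # 1/3

# Dropping Bing removes Arrival's only vote entirely
picks_no_bing = {e: p for e, p in picks.items() if e != "Bing"}
without_bing = win_probability("Arrival", picks_no_bing, weights)  # 0.0
```

A real model would likely smooth these probabilities so no nominee sits at exactly zero, which is why removing one source cuts a chance roughly in half rather than erasing it.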
Short Film Live Action
Prediction: Ennemis Interieurs with a 52% chance
Actual: Sing, which we gave a 16% chance
Why: This was one of our least confident categories. We gave Ennemis Interieurs a 52% chance of winning, which still leaves a lot of room for another film to win, as Sing did.
When: The film is set in 1991, the glorious year my wife was born.
Where: In a Hungarian classroom where music fills the hearts of all who enter.
What: There is not much we can learn from this loss other than how difficult it is to predict some of these lower-profile categories. More research needs to be done so we can get this right next year.
Who: Only Hollywood Reporter and Vulture had Sing as their top choice to win. Vulture had one of the better scorecards, getting two-thirds of their predictions correct.
Makeup and Hairstyling
Prediction: Star Trek Beyond with a 65% chance
Actual: Suicide Squad, which we gave an 18% chance
Why: It seems anything can happen when only three films are nominated in a category. Many experts claimed the makeup and hair in this film weren’t very good, but obviously “good” is subjective. Each year the Academy seems to throw at least one bone to a high-budget, high-profile film; this year they chose Warner Bros.’ critically disliked bad-guys movie.
When: Hopefully never again.
Where: We wish in the DCEU’s made up universe only…
What: High-budget, high-profile films only ever seem to win in technical categories. We will examine which ones they win and what indicators may point to them winning as we prepare our model for next year.
Who: Only Vox correctly predicted this category, and they got only half of their guesses right overall, so we won’t be giving them much more weight in any future aggregate models.
Film Editing
Prediction: La La Land with a 76% chance
Actual: Hacksaw Ridge, which we gave an 18% chance
Why: This was supposed to be part of La La Land’s big night, but the award went to a serious underdog. The Academy likes to reward the editors of war movies that look very realistic. It has done so many times before and will likely do it again.
When: The incredible and true story, with its violent gore, took place during World War II.
Where: In the Pacific “Theater”.
What: Examining which genres of film seem to win what categories more often could be useful in creating a smarter model.
Who: This is another category that Microsoft’s Bing got right when almost no one else did. While their model tends to be a little more conservative, they were confident in Hacksaw Ridge for this category. They may be the model to out-predict next year.
Costume Design
Prediction: Jackie with a 60% chance
Actual: Fantastic Beasts and Where to Find Them, which we gave a 6% chance
Why: Our model had both La La Land and Jackie at least 5 times more likely than the Harry Potter spin-off to win. It appears returning to the magic was too much for the Academy to resist this time around, making our model very wrong.
When: Before that snake guy got dead the first time.
Where: Is the Wizarding World in another dimension? Or are they just really good and thorough with their memory-wiping and YouTube-video-removing spells?
What: Sequels and spinoffs don’t typically win much at all, but we should examine what they do win, especially when linked to a beloved franchise like this one.
Who: CNN got two-thirds of its predictions correct, including this category, which earned points in our model only from them and as a dark horse in Microsoft’s Bing predictions.
Best Picture
Prediction: La La Land with an 87% chance
Actual: Moonlight, which we gave a 9% chance
Why: This was one of our most confident categories. However, we did write this article about why Moonlight deserved to win. It was well written, so please check it out.
When: A few minutes too late.
Where: Ripped from the hands of Jordan Horowitz, a producer of La La Land. The Academy Awards team and PwC have a lot of mud on their faces after the Best Picture announcement blunder. Holding back tears, the La La Land crew was graceful as they left the stage, in what may turn out to be one of 2017’s most heartbreaking Hollywood moments.
What: We need to examine how often the film people say should win, but won’t, actually does win.
Who: Refinery29 predicted this win alone; however, all models that ranked likely winners had Moonlight as a close second behind La La Land.
They say predicting the Oscars is impossible. We say, “We’ll try again next year.”