If you’d like to skip the explanation and see the full list of StatScores and Win Shares, you can go to this Google Sheet.
The biggest off-season story in the NRL was the pair of transfers that sent Cooper Cronk from Melbourne to Sydney and then Mitchell Pearce from Sydney to Newcastle. From the Roosters’ perspective, for two players likely on similar pay packets, how did they decide one was better than the other? That got me wondering whether it was possible to work out a way of judging value for money in player trades. It’s big in baseball, so why not rugby league? This led me to develop StatScore and Win Shares as ways to numerically evaluate rugby league players.
If you’re wired for numbers, like I am, it can be hard to deal with people’s feelings and to understand why they think the things that they do. That’s why I’ve decided to quantify the feelings a team generates into five distinct indices: Power, Hope, Panic, Fortune and Disappointment.
Each index has two components: a main mechanism for ranking the teams and some minor tie-breaking stats. The main mechanism typically uses Elo ratings to estimate what we expect from a team, whether or not they are meeting that expectation and what that means for the season ahead. The tie-breakers are statistics used to award a few points here and there to help rank teams that have similar mechanism results.
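To make the Elo part of the mechanism concrete, here’s a minimal sketch using the standard logistic Elo formulas. The constants and the lack of a home-ground adjustment are simplifying assumptions, not necessarily what the indices actually use:

```python
def elo_expected(rating_a: float, rating_b: float, scale: float = 400.0) -> float:
    """Expected score (win probability) for team A against team B,
    using the standard logistic Elo formula."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / scale))

def elo_update(rating: float, expected: float, actual: float, k: float = 32.0) -> float:
    """Move a rating towards the actual result (1 = win, 0 = loss);
    k controls how quickly ratings react."""
    return rating + k * (actual - expected)

# Evenly matched teams expect 0.5 each; a 100-point edge gives roughly 0.64.
```

The gap between `expected` and `actual` over a stretch of games is exactly the kind of over- or under-performance signal the indices are built on.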
Editor’s note: As much of last season’s material was influenced by The Arc, much of this season’s material owes a debt to SB Nation, including the idea of panic/hope indices.
The Greeks is the collective name given to a series of Elo rating models for tracking performance of rugby league teams and forecasting the outcomes of games. I usually refer to them as if the philosopher himself was making the prediction, even though the Greeks have mostly been dead for a couple thousand years and certainly would never have heard of rugby league or Arpad Elo.
The differences between each Greek are on the subtle side, with the intention of measuring different things. You may want to revisit primers from last year:
This is the last projections update for the year. It’s come around so fast.
While I realise there are still three weeks of finals left, in reality there are only three games left after next weekend. The Stocky is not great at dealing with small sample sizes of games, as evidenced by its relative lack of performance towards the year’s end, so it’s a little pointless doing it any further this year. I might tweet something instead. Let’s also ignore the fact that the Storm are unbackable favourites at the moment.
The projections will be back next year, rebranded and tarted up, and the tips posts will continue until I sign off sometime after grand final day.
Sydney City 24 (17-7) d Brisbane 22 (16-8)
Melbourne 18 (20-4) d Parramatta 16 (16-8)
Penrith 22 (13-11) d Manly 10 (14-10)
North Queensland 15 (13-11) d Cronulla 14 (15-9)
Brisbane 20 (16-8) d North Queensland 10 (13-11)
Parramatta 22 (16-8) d South Sydney 16 (9-15)
Sydney City 20 (17-7) d Gold Coast 16 (7-17)
Manly 28 (14-10) d Penrith 12 (13-11)
Melbourne 32 (20-4) d Canberra 6 (11-13)
Cronulla 26 (15-9) d Newcastle 18 (5-19)
Canterbury 26 (10-14) d St George Illawarra 20 (12-12)
Wests Tigers 28 (7-17) d New Zealand 16 (7-17)
Normally, we’d run through the Collated Ladder, but this week, with the regular season over, the real ladder is far more accurate (being complete and all). So instead I thought I’d compare the different systems to see how their rankings stack up:
Parramatta 52 (15-8) d Brisbane 34 (15-8)
Canberra 46 (11-12) d Newcastle 28 (5-18)
North Queensland 22 (13-10) d Wests Tigers 14 (6-17)
Canterbury 26 (9-14) d Gold Coast 14 (7-16)
Melbourne 64 (19-4) d South Sydney 6 (9-14)
Sydney City 16 (16-7) d Cronulla 14 (14-9)
Manly 22 (13-10) d New Zealand 21 (7-16)
St George Illawarra 16 (12-11) d Penrith 14 (13-10)
Now to look ahead to see what the Collated Ladder has predicted for the final standings of the season.
The Collated Ladder is a bit useless now and I’m mostly just publishing it for completeness. For example, it strangely has both Brisbane and North Queensland losing their final round face-off, which is a relic of the way the ladder comes together. Look at the Stocky column for a better bet:
- 90% chance of a last win – Sydney
- 80% chance of a last win – Melbourne, Parra
- 70% chance of a last win – Penrith, St George
- Approx 50:50 – Brisbane/NQ and Cronulla/Newcastle
The Stocky is the main forecasting tool driving the analysis on this site. It’s a simulator of the season ahead, based on Elo ratings and the Monte Carlo method, that gives insight into the future performance of each club. My main interest has been the number of wins, as wins determine ladder positions, which in turn have a big impact on the finals. The Stocky might not be able to tell you which games a team will win, but it is good at telling you how many wins are ahead.
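The Monte Carlo idea can be sketched in a few lines: roll the dice on each remaining game many times over and count the wins. The win probabilities below are made up for illustration, and the real Stocky simulates the whole competition (and the ladder) rather than one club in isolation:

```python
import random

def simulate_wins(win_probs: list[float], runs: int = 10000, seed: int = 1) -> dict[int, float]:
    """Simulate a club's remaining games `runs` times and return the
    distribution of total wins. win_probs holds the club's win
    probability for each remaining game."""
    rng = random.Random(seed)
    counts: dict[int, int] = {}
    for _ in range(runs):
        wins = sum(rng.random() < p for p in win_probs)
        counts[wins] = counts.get(wins, 0) + 1
    return {w: c / runs for w, c in sorted(counts.items())}

# Hypothetical remaining fixtures for one club:
dist = simulate_wins([0.7, 0.55, 0.4])
expected_wins = sum(w * p for w, p in dist.items())  # close to 0.7 + 0.55 + 0.4
```

This is why the Stocky deals in decimals: the output is a distribution of season outcomes, not a single tipped result per game.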
But how does a computer simulation (in reality, a very large spreadsheet) compare to reality? To test it, I’ve put together a graph of each team’s performance against what the Stocky projected for them. Each graph shows:
- The Stocky’s projection for total wins (blue)
- Converting that projection to a “pace” for that point in the season (red)
- Comparing that to the actual number of wins (yellow)
It will never be exactly right, particularly as you can only ever win whole numbers of games and the Stocky loves a decimal point, but as we’ll see, the Stocky is not too bad at tracking form and projecting that forward.
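The “pace” line in the graphs is, I assume, just the end-of-season projection scaled to the rounds played so far; a guess at the construction (the round count and the linear scaling are assumptions):

```python
def pace(projected_wins: float, rounds_played: int, total_rounds: int = 24) -> float:
    """Scale a full-season win projection to a point partway through
    the season, assuming wins accrue evenly across a 24-game season."""
    return projected_wins * rounds_played / total_rounds

# A 16-win projection implies being "on pace" at 8 wins after 12 rounds.
```

Comparing the actual win count (yellow) against this line shows whether a team is running ahead of or behind its projection.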
This week is Part II, from North Queensland to Wests Tigers. Part I, from Brisbane to Newcastle, was last week. Also see this week’s projections update for some errors in the Stocky.