Tag Archives: elo ratings

A Shallow Dive into the 2020 St George Illawarra Dragons

From earlier in the year:

The squad itself isn’t magic but should be better than last place. New signing Isaac Luke has always been a productive player but he will presumably be second fiddle to Cameron McInnes when he returns from injury, reducing the potential volume of work Luke could be doing. Indeed, St George Illawarra are extremely reliant on their spine to perform. While Hunt, Norman and McInnes have been productive, I don’t think they’ve been especially effective. The Dragons are also still searching for a fullback. Lomax may or may not be it.

If Flanagan really is the de facto, if not de jure, head coach, then he should be able to coax that performance out of the roster. If McGregor is still in charge, then a 5-0 start will turn into a 7-17 season and the cycle will begin anew.

A couple of swings and misses – Flanagan didn’t feature, Luke transferred mid-season to Brisbane, Lomax went back to centre after two games and they finally found their fullback – but generally, the cycle repeated except without the fast start to the season. Yawns ensued.

Summary

Not much was expected of the Dragons this year. They didn’t have the cattle, the coach or the will to do much. While there were a few surprise results, a 7-13 record would not have been far above realistic expectations. As was the case in three of the previous four of these reviews, the head coach, Paul McGregor, was fired.

What happened

The Dragons were a non-entity. They were bad but not in a way that was interesting. St George Illawarra would have been a team that most sides had pencilled in for the two points. They picked up four wins against bottom eight sides plus one against Cronulla (embarrassing), another against Parramatta (inexplicable) and a last round win over the Sunshine Coast-Eastern Suburbs Tigcons.

Good teams’ Elo ratings improve over the course of the season and bad teams’ ratings collapse. Sometimes teams can build some momentum and then fall away. We might be interested in seeing how a coach or player adapts to a new environment and see this reflected in the team’s rating. These are theoretically interesting scenarios.

The Dragons offered none of these narratives.

The Dragons improved from equal worst in the league at round four to below average by round nine and then that was it for the year. They were stuck in a subpar purgatory. There was no rhyme or reason to it: football happened and these were the results.

Looked at another way, the Dragons’ games yielded the least information of any team in the NRL. We use Elo ratings to estimate a pre-game winning probability, expressed as a percentage, based on the ratings of the two teams. The margin at the end of the game is converted to an equivalent percentage and the difference between the two percentages is what is used to adjust the rating up or down.

What this chart shows is the sum of those differences between expectation (pre-game winning probability) and reality (post-game conversion from margin to percentage) for each team. Small changes mean that the pre-game ratings were about right, minimal adjustment was needed and the game hasn’t told us much new. From July to September, you didn’t need to watch Dragons games because you already knew pretty well what was going to happen.
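For the mechanically minded, here’s a rough sketch in Python of how that update works. The expectation formula is the standard Elo one; the margin-to-percentage conversion and the K factor shown here are illustrative placeholders, not the exact values used for these ratings.

```python
import math

def expected_result(rating_team, rating_opponent):
    """Standard Elo expectation: the pre-game winning probability."""
    return 1.0 / (1.0 + 10 ** ((rating_opponent - rating_team) / 400.0))

def margin_to_result(margin, scale=14.0):
    """Convert the final points margin into an equivalent percentage.
    The logistic curve and scale are placeholders, not the site's exact mapping."""
    return 1.0 / (1.0 + math.exp(-margin / scale))

def update_rating(rating_team, rating_opponent, margin, k=40.0):
    """Adjust the rating by the gap between expectation and reality.
    A small adjustment means the game told us little we didn't already know."""
    change = k * (margin_to_result(margin) - expected_result(rating_team, rating_opponent))
    return rating_team + change

# Example: a 1450-rated side loses by 10 to a 1550-rated opponent.
print(update_rating(1450, 1550, -10))
```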

The only genuine surprise was Matt Dufty finally making something of himself. Dufty managed to finish eleventh on the overall WARG leaderboard and fifth among fullbacks, doubling his career WARG in one season.

As with Nofoaluma and Feldt, we may have to temper enthusiasm for these productive performances by acknowledging that WARG does not do a good job of measuring defensive capability, while Net Points Responsible For at least makes the attempt.

There you go, there might be something there. Let’s see if he can do it again.

What’s next

Griffinball. The retread coach that’s been fired from both Penrith and Brisbane is back and this time, he’s racist af. Between his reputation for an uninteresting brand of football and his work history, I don’t think I can in good conscience recommend watching any Dragons games moving forward.

If, for some reason, you don’t find this a compelling argument and insist on watching the Dragons, next year at least offers the prospect of confirming if Matt Dufty is legit now and watching the development of the likes of Adam Clune, Mikaele Ravalawa, Jason Saab and Tristan Sailor. Oh wait, they let the last two go. Presumably they were too ethnic for their new coach.

[It has come to light since first publishing this that Tristan Sailor is allegedly a scumbag. My apologies for suggesting that the Dragons were awful racists in letting him go. Instead, they are scumbags for signing him in the first place.]

The Dragons also need to resolve their Corey Norman problem. It’s not his play, although it is partly that, so much as his huge millstone of a salary. At a lower price, he might have a future but he does not deserve an elite playmaker’s salary. The freed-up cash could be used to replenish a dire forward pack which, other than an ageing Paul Vaughan, lacks big names now that Frizell is departing for Newcastle and Graham is long gone to St Helens. It won’t matter who the halves are or how much they are paid if the forward pack is butter.

It doesn’t look great for St George Illawarra. They don’t have the cattle, the coach or the will. The yawns look to continue.

The coaches that fucked up your club

When a coach arrives at a major league club, fresh and excited to make his own mark in the history books, you’d have to think that, as a minimum threshold for success, he’d want to leave the place in better shape than when he arrived. Sometimes, the vagaries of reality make it difficult to assess a coach’s legacy but we can definitely ignore nuance and simplify things down to a nice looking line on a graph.

For this, we use Class Elo ratings. Over this kind of time frame, you can think of the rating as a glorified win-loss stock ticker. It goes up when the team wins and it goes down when the team loses. The rating goes up more for unexpected wins and goes down more for unexpected losses. Grand finals are weighted the heaviest, then finals and then regular season games. Challenge Cup results are included for Super League teams. You can see each team’s class Elo rating history for NRL and Super League.

This post compares the different coaches at each club to see how they improved the club’s rating from their first game. I’ve included most, but not all, of the coaches for each club over the last two decades. Caretakers have generally been excluded. I used rugbyleagueproject.org (DONATE TO THE PATREON) to determine the extents of careers but it may not be 100% complete for coaching details, so career lengths may be out by a few games. It is very hard to find out which round a coach was sacked in 2003 if it’s not on RLP.


The Challenge Cup in 2020

Despite being an avowed radical when it comes to rugby league culture – best encapsulated as “burn it all down and guillotine those that disagree” – I can see the appeal of the Challenge Cup. It runs a format similar to that of the much more famous FA Cup, and the combination of the destructive chaos wrought by knock-out football with the ability for any club to participate, no matter their background, is a big part of that appeal.


A mix of apathy, ignorance, time slots and broadcast restrictions means that I’ve never taken much interest in the Challenge Cup. That is, up until the Catalan Dragons won it in 2018. I discovered that the Challenge Cup has a long history of international club participation. Alongside Catalans, clubs from outside the United Kingdom – including Toronto Wolfpack, Red Star Belgrade and the Longhorns of Ireland; Pia, Paris St Germain, Lezignan, Carcassonne, Toulouse, Villeneuve and Saint-Gaudens from France; and Dinamo Moscow and Lokomotiv Moscow – have had a crack at the tournament. In this sense, the Cup provides an excellent platform for teams to try their hand at a higher level than they might get at home, without the commitment of joining the RFL pyramid. But 2018 was the first time a team from outside the United Kingdom had won the competition.

The Cup has faced declining interest in recent years and not just because a French team made the final. It remains to be seen how, with the likes of knock-out finals in the Super League and Magic Weekend becoming the sport’s pilgrimage, the Wembley final maintains relevance in the 21st century. As hosting rights become a viable revenue stream for rugby league, the RFL will need to find a way to make the finals, at least, saleable and that will likely involve leaving London and following the cash.

Challenge Cup Final Attendances

The other challenge is finding space for it in an increasingly crammed calendar. With Super League now contested over an unnecessary twenty-nine rounds plus finals and with the slowly growing re-emphasis on international football, the half dozen weekends required to contest the Cup are valuable real estate, especially given the opportunity cost and the wear and tear on players. In my vision for a future hyper-mega-league that the sport should be working towards, even with a shortened club season, I’m not sure I’d make room for the Cup as it currently stands. On the other hand, if Origin doesn’t work in Europe, then maybe this is their equivalent.

Current travails aside, the competition for 2020 is already underway. With the second round completed, the number of non-league teams is being whittled down in anticipation of the League 1 clubs joining the fray in round three. In the fourth round, the Championship clubs join, followed by the bottom four of Super League in the fifth and the balance in the sixth, with the finals to commence thereafter. Here, we take a look ahead at how this year’s Cup might unfold.

Rating the Challengers

You should be familiar enough with how we approach Elo ratings by now but just in case, we use two systems side-by-side. The first is a form rating, which measures short term performance by calculations based on the points margin of each game, and the second is a class rating, which measures long term performance based on small rating changes for wins and losses. You can review the history of the RFL class ratings in the post from earlier this week.

I used the class rating system as a basis for simulating the 2020 Cup 50,000 times. Historically, the class rating has been a comparatively poor predictor of match results but this system has the advantage of rating teams across leagues, saving us the trouble of trying to estimate differences between the leagues and applying corrective measures, as I did for the GRLFC Rankings. Obviously, this is an important feature when the fates can have clubs from different leagues mixing and matching.
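As a sketch of what “simulating the Cup 50,000 times” involves, here’s a heavily simplified Python version. It assumes a plain single-elimination draw between whichever teams you feed it, rather than the Cup’s actual staggered entry by league, and the example ratings are made up.

```python
import random

def win_probability(rating_a, rating_b):
    # Standard Elo expectation for team A over team B.
    return 1 / (1 + 10 ** ((rating_b - rating_a) / 400))

def simulate_knockout(ratings, n_sims=50_000, seed=0):
    """Return each team's share of tournament wins over n_sims simulations.
    Assumes a simple single-elimination bracket with a random draw each round;
    the real Cup's staggered entry by league is ignored for brevity."""
    rng = random.Random(seed)
    wins = {team: 0 for team in ratings}
    for _ in range(n_sims):
        alive = list(ratings)
        while len(alive) > 1:
            rng.shuffle(alive)
            next_round = []
            for a, b in zip(alive[::2], alive[1::2]):
                winner = a if rng.random() < win_probability(ratings[a], ratings[b]) else b
                next_round.append(winner)
            if len(alive) % 2:            # odd team count: last team gets a bye
                next_round.append(alive[-1])
            alive = next_round
        wins[alive[0]] += 1
    return {team: n / n_sims for team, n in wins.items()}

# Illustrative ratings only, not the actual 2020 numbers.
print(simulate_knockout({"St Helens": 1780, "Wigan": 1760, "Warrington": 1740, "Leeds": 1650}))
```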

Here’s how each team stacks up going into 2020:

2020 RFL CLASS

Worth noting that the ratings of new teams, like Toronto and Toulouse, are a little underdone. They haven’t had as much time as the other clubs to beef up their rating, even though they’ve accumulated points about as quickly as humanly possible.

It’s probably going the way you’d expect

In 2020, we’re really only considering three obvious challengers:

2020 Challenge Cup

After a poor couple of seasons, the class rating doesn’t see Leeds as a member of the big four anymore, leaving St Helens, Wigan and defending champions Warrington as the favourites to hoist the trophy at Wembley.

Outside of that, two things surprise me about the results of this exercise. The first is how many teams have exactly zero chance of winning the Cup. Sixteen of the thirty-seven teams registered no simulated wins. Of these, half a dozen made it as far as the final in a handful of cases but failed to win any of them. That’s got to be deflating. Worse still are clubs like Skolars and Coventry, who never made the semis, or West Wales, who didn’t clear the sixth round. The simple fact is that if you need consecutive low probability events to go your way, eventually they pile up into an impossibility. As an exercise, multiply 30% by 30% five or six times over – you end up at a small fraction of one per cent – and you’ll see what the Raiders and Bears are up against.

Conversely – and this is the second surprising thing – many clubs can still realistically challenge. The entire Super League has a shot, largely by virtue of playing fewer games with far better squads, but even the likes of the London Broncos and Leigh Centurions could feature in the latter stages of the tournament, if not fight for the title. The aforementioned big three total a 61% chance between them, with the implied flipside being that there’s a 39% chance of any other team winning. If Wests Tigers can win an NRL premiership, then anything is possible.

One of the myths underpinning the Challenge Cup, and indeed similar tournaments around the world, is that anyone can win it. Reality doesn’t bear that out – the last team from outside the top flight to win the FA Cup did so in 1980 and I’m not sure if or when it has ever happened in the Challenge Cup – but it’s still a part of the competition’s narrative. Based on the simulations, you can see that, by far, the most likely pathway is for each division to be squeezed out one-by-one as the rounds progress. Any team may be able to participate but that doesn’t necessarily mean they have the capability to compete.

League Composition of Challenge Cup

Context from past cups

The chaos of knock-out football is one thing, but all of this is probabilistic. Ince Rose Bridge could very well win the Challenge Cup but it is extremely unlikely. I don’t think that’s controversial. I always say anything can happen and even the word “impossible” does a lot of heavy lifting in this context, but some events are very difficult to see occurring.

Here we have no hard and fast predictions – making them isn’t really my style in any case – but we can see from running the same exercise for the last two Cups how “good” the simulations are at making predictions. After all, the probabilities above are only as good as the mechanism for deriving them.

This time two years ago, the graph looked a lot like this:

2018 Challenge Cup

And the Catalan Dragons (2.5%) won. Last year:

2019 Challenge Cup

And the Warrington Wolves (12.1%) won. This should give fans some hope that we aren’t here to witness St Helens waltz their way to The Double but might seem to undermine the quality of the forecasting.

Looking closer, what we find across the two tournaments is that each team exits at the round estimated to be its most likely about 40% of the time. About the same number again are only off by one round, either exiting one round earlier or surviving one round longer than their most likely outcome.

I think that’s pretty reasonable. The 2019 Wigan Warriors were the only team who had winning the Cup as their most likely outcome. It remains to be seen whether the 2020 St Helens side, who are similarly favoured, will manage to go one better than 2019 or if the likes of Huddersfield or Castleford will spring a surprise.

A Complete History of Super League

This article is about the northern hemisphere rugby league competition. For the short-lived southern hemisphere competition, see the League Digest podcast.

To complement the complete histories of the NRL and the Queensland Cup, I humbly present the complete history of the northern hemisphere Super League competition. I will also, at no charge to you, include abbreviated histories of the Championship (from 2007) and League 1 (from 2009). I would go further back but the official rugby league website does not have results back that far.

This being a website that predominantly deals in statistics, I don’t intend to describe the history in words but rather in graphs. Specifically, I will use Elo ratings to chart the paths of each club.


I do two types of Elo ratings. Form is about the short term performance of clubs, and can represent anywhere from four to eight weeks of results (calculated on the points margin of each game) depending on the draw and league, while class is about long term performance, and can represent the average of years of performance, aggregating wins and losses with small rating changes. Form is a better predictor of match results, class is a better predictor of fan disappointment.

Normally, I treat each league as a self-contained entity, which operates with an average rating of 1500. For the RFL pyramid, I took a different tack and created one class rating system to span the Challenge Cup and the three leagues. The Challenge Cup rounds and finals are weighted the same as their league equivalents and teams carry their rating through promotion and relegation between the leagues. Super League teams start with a 1600 rating, Championship on 1300 and League 1 teams on 1000. Non-league teams are given a 750 rating for Challenge Cup purposes.
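In code terms, that setup is little more than a lookup table of starting ratings. A minimal sketch, using the figures above:

```python
# Starting class ratings for the unified RFL system described above.
STARTING_CLASS_RATING = {
    "Super League": 1600,
    "Championship": 1300,
    "League 1": 1000,
    "Non-league": 750,   # for Challenge Cup purposes only
}

def initial_class_rating(league: str) -> float:
    """New entrants start at their league's base; ratings are then carried
    through promotion and relegation rather than being reset."""
    return STARTING_CLASS_RATING.get(league, 750)
```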

Here’s how each team sits going into the 2020 season:

2020 RFL CLASS

If you want to see how your team’s history looks, you can jump ahead:

Super League: St Helens, Wigan Warriors, Warrington Wolves, Castleford Tigers, Hull FC, Leeds Rhinos, Salford Red Devils, Catalan Dragons, Huddersfield Giants, Wakefield Trinity, Hull Kingston Rovers, Toronto Wolfpack

Championship: London Broncos, Leigh Centurions, Featherstone Rovers, Toulouse Olympique, Bradford Bulls, Halifax RLFC, Widnes Vikings, York City Knights, Sheffield Eagles, Batley Bulldogs, Whitehaven RLFC, Oldham Roughyeds, Dewsbury Rams, Swinton Lions

League 1: Doncaster RLFC, Barrow Raiders, Workington Town, Newcastle Thunder, Hunslet RLFC, Rochdale Hornets, North Wales Crusaders, Keighley Cougars, London Skolars, Coventry Bears, West Wales Raiders

Tonga are still a long shot for the 2021 World Cup

After a dominating win over Great Britain, followed by a tough and exciting win over Australia, some extremely exuberant pundits decided that Tonga were the best nation in the world and worthy of Tier 1 status.

Putting aside the fact that a poor island nation of 100,000 is not capable of generating enough native talent to compete with Australia or England, and so is not at all suited to Tier 1 status, I find it hard to believe that two wins are enough to reach the top of the rugby league pile. Then the IRL updated their rankings and put New Zealand at number one. It all got too much for me.

Fortunately, the draw for the 2021 World Cup restored some sanity, despite the slightly incongruous setting of Buckingham Palace and newsworthy presence of the Duke of Sussex. With 641 days to kickoff, I got a bit excited and looked at next year’s tournament.


Ranking every rugby league team in the world

If you’re not interested in how the rankings work and just want to see the outputs, click here.

It began as a simple exercise to try and rate Super League players, much in the way that I rate NRL and Queensland Cup players. It turns out that the Super League website makes that an impossible task because it is a garbage fire for stats. Moving on from the wasted effort, I thought I might still do team ratings for the RFL system, mostly out of my increased interest with the Toronto Wolfpack’s promotion into Super League.

Then I thought about the Kaiviti Silktails of Fiji entering into the New South Wales system and wondered if I should take a look at the leagues there, despite my doubts about whether anyone in NSW cares about lower grade football when they could follow the Dragons, the Tigers or the Knights in so-called first grade.

From there I spiralled into a mishmash of US college football tradition, websites in Serbian and copying and pasting. When I came to, I had a neatly formatted spreadsheet covering a decade of world club rugby league.


Ranking the world

Invariably, creating any sort of evaluation system requires judgements by the evaluator about who to include or exclude and what the evaluation system considers to be “good”. I’ll explain my position and you can decide whether or not you like it.

Scoring the teams uses an average of four similar rating systems that look at performance over different time intervals.

We’ve long had form and class Elo ratings for the NRL and Queensland Cup. Form is about the short term performance of clubs, and can represent anywhere from four to eight weeks of results depending on the draw and league, while class is about long term performance, and can represent the average of years of performance. Form is a better predictor of match results, class is a better predictor of fan disappointment.

I created similar systems for another ten leagues in NSW, PNG, France (see also my Elite 1 season preview), the UK and the USA. They work along the same lines as the NRL and Queensland Cup editions. The average rating within an Elo system is approximately 1500 and the disparity in ratings can be used to estimate match outcome probabilities.

Both sets of Elo ratings are adjusted by a classification system I borrowed from baseball. To acknowledge the fact that a 1700 team in the BRL is not likely to be as good as a 1300 team in Super League, we adjust the team ratings so we can attempt to compare apples to apples –

  • Majors: NRL (ratings adjusted by +500) & Super League (+380)
  • Triple-A (AAA): QCup, NSW Cup and RFL Championship (all +85)
  • Double-A (AA): Ron Massey, RFL League 1, FFR Elite 1 (all -300)
  • High-A (A+): Brisbane RL, FFR Elite 2 (all -700)
  • Low-A (A-): USARL (-1000)

In Elo terms, a difference of 120 points between teams, like between an average NRL and an average Super League team, makes the NRL team 2:1 favourites. A 415 point gap gives the less favoured team an 8.4% chance of winning (equivalent to the replacement level), 800 points 1%, 1200 points 0.1% and 1600 points 0.01%. Consider the improbability of the Jacksonville Axemen beating the Melbourne Storm and you get an idea of where I’m coming from.
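Those figures come straight from the standard Elo expectation formula, which is easy to verify yourself:

```python
def underdog_win_probability(rating_gap: float) -> float:
    """Standard Elo expectation for the lower-rated team, given the gap against it."""
    return 1 / (1 + 10 ** (rating_gap / 400))

for gap in (120, 415, 800, 1200, 1600):
    print(f"{gap:>4}-point gap: underdog wins {underdog_win_probability(gap):.2%} of the time")
# 120 -> ~33.4% (i.e. the favourite is about a 2:1 shot), 415 -> ~8.4%,
# 800 -> ~1%, 1200 -> ~0.1%, 1600 -> ~0.01%
```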

Between short term form and long term class, we’re missing a medium term component that represents roughly a single year of performance. I originally was going to create Poseidon ratings for the leagues but instead took a simpler approach and used points scored per game and points conceded per game over a regular season in lieu.

I then made my simplification much more complicated by doing a linear regression of winning percentage across all leagues compared to points scored per game and a second regression against points conceded per game. This gives a formula that converts the components of for and against into winning percentage, which is in turn converted to an equivalent Elo rating, which is then adjusted per the above. It also allows me to compare points scored per game – as a measure of competitiveness or quality or both? – across different leagues.
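A rough sketch of that pipeline follows. The single combined fit and the inverse-logistic conversion back to an Elo-equivalent rating are simplifications and assumptions of mine; the post describes two separate regressions, and the league-strength adjustments listed earlier are applied afterwards.

```python
import numpy as np

def fit_win_pct_model(points_for_pg, points_against_pg, win_pct):
    """Least-squares fit of winning percentage against points scored and
    conceded per game (one combined fit here, two separate ones in the post)."""
    X = np.column_stack([points_for_pg, points_against_pg, np.ones(len(win_pct))])
    coeffs, *_ = np.linalg.lstsq(X, win_pct, rcond=None)
    return coeffs  # [for_coefficient, against_coefficient, intercept]

def win_pct_to_elo(win_pct, base=1500.0):
    """One plausible conversion of winning percentage to an Elo-equivalent
    rating around a league average of 1500 (assumed, not confirmed)."""
    p = np.clip(win_pct, 0.01, 0.99)   # keep the logit finite
    return base + 400 * np.log10(p / (1 - p))

# Toy data only: per-game points for, points against and winning percentage.
pf = [22.0, 18.5, 16.0]
pa = [14.0, 18.0, 24.0]
wp = [0.75, 0.50, 0.25]
print(fit_win_pct_model(pf, pa, wp))
print(win_pct_to_elo(0.60))
```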

Competitiveness or quality

This specifically is just trivia but, from an overall analytics perspective, the risk is that if only the top league is analysed and analysts assume the same principles apply to all leagues, incorrect conclusions will be drawn about the sport.

The ranking is decided by which team has the highest average score across the four rating components, which are given equal weighting. I call it the Global Rugby League Football Club Rankings, or GRLFC for short.

While it’s possible for teams to game a single system, it would be nigh on impossible to game all components, so I feel relatively comfortable that the highest ranked team is the “best”.

That said, form ratings and the for-and-against components only work on regular season results. Class ratings are the only component that takes into account playoff (and Challenge Cup, where applicable) performance. You may think finals footy deserves more weighting but I would put it to you that “the grand final winner is always the best team” and “any rugby league team can win on their day” are two mutually exclusive thoughts and I prefer to believe the latter. If you want to further mull it over, consider that Newtown finished seventh on the ladder in the twelve team NSW Cup in 2019 and then went on to win the Cup and then the State Championship.

Each club (as represented by their combination of name, colours and logo) is only represented once in each year’s rankings, by the version of that club in the highest league. For example, Wentworthville have been in the NSW Cup and the lower tier Ron Massey Cup. To date, Wenty have been represented in the rankings by their state cup team. However, as the Magpies will be replaced in the NSW Cup by Parra reserve grade in 2020, and while this doesn’t change much in reality, they will be henceforth represented in the rankings by their Ron Massey team. This is mostly because it makes the rankings a little more interesting, not having been clogged up by a half dozen clones of the NSWRL clubs.

I would like to have included the Auckland Rugby League’s Fox Memorial comp as a double-A league but it seems to be impossible to find scores. I also would have liked to add more low-A comps, like those in Serbia or the Netherlands or maybe even Nigeria or Kenya, but scores for these comps are even more difficult to find, the results are incomplete or they simply don’t play enough games. As a result, we may never know whether the Otahuhu Leopards are better than the Villeneuve Léopards.

I drove myself mad enough trying to get the results that I did. I don’t feel the need to delve further into district comps in Australia but, who knows, I may well change my mind on that. It would be nice to go further back on some comps, particularly in France and PNG, but we have what we have. A big thanks to rugbyleagueproject.org, leagueunlimited.com and treizemondial.fr for hosting what they do have, because we can’t possibly rely on federations to have curated their own records and history.

A full season of results is required for a club to be ranked. This is only a problem for French clubs, with both Elite 1 and 2 running through the winter while the date the ranking is nominally calculated is December 31. A French club’s first part-season is given a provisional place in the rankings, converting to a full ranking the year after, based on the previous twelve months’ worth of results.

The rankings can be seen for 2009 through 2019 here. Your current top seeds in each competition are –

  • NRL (Major): Melbourne Storm (1)
  • Super League (Major): St Helens (5)
  • Championship (AAA): Toronto Wolfpack (29)
  • Queensland Cup (AAA): Sunshine Coast Falcons (30)
  • NSW Cup (AAA): Newtown Jets (40)
  • Ron Massey (AA): St Marys (63)
  • League 1 (AA): Oldham Roughyeds (64)
  • PNG NRLC (AA): Lae Tigers (66)
  • Elite 1 (AA): Albi Tigers (69)
  • Elite 2 (A+): Villegailhenc-Aragon (101)
  • BRL (A+): West Brisbane Panthers (105)
  • USARL (A-): Jacksonville Axemen (109)

Women’s Rankings

In an ideal world, we’d have a women’s ranking to complement the men’s. But the NRLW has only completed 14 games, which is not a sufficient sample although we may see that double in 2020. The QRLW will only commence this year and it remains to be seen what the NSWRL is going to do with their women’s premiership, whether this becomes the equivalent of a Ron Massey Cup to a new NSWRLW/women’s NSW Cup or if, as is usually the case, the Sydney comp will be promoted to be the state comp.

In the more enlightened Europe, the women’s Super League has completed its first season, comprising 14 rounds, and the Elite Feminine has just commenced its second season, the previous being 12 rounds. The bones are there for a women’s club ranking, but it will take time for Australia to catch up a little and make the rankings more balanced. With any luck, I should be able to deliver the first rankings at the end of this year.

The World Club Challenge

International club football is a rare thing, indeed. The ridiculously lopsided 1997 World Club Challenge (Australian clubs scored 2506 points to the Europeans’ 957) largely put paid to the idea that there could be a competition on an equal footing between the two major leagues of football. Other than a short lived World Club Series, which was overly reliant on the charity of big Australian clubs, all that remains of the concept is the World Club Challenge match-up between the winners of the Super League and the NRL.

First held irregularly since 1976 and annually since 2000, the match suffers from the disparity in the quality of the leagues – obviously driven by money – and a lack of interest – largely driven by a lack of promotion and lack of commitment from most Australian clubs. The advantage has ebbed and flowed, generally in favour of the Australian sides but in the late 2000s, the English fought back before being pummelled back into submission more recently.

World Club Challenge For and Against

Incidentally, I arrived at the 120 point discount between the NRL and Super League based on Super League clubs’ for and against in the WCC over the last twenty years, applying Pythagorean expectation and then converting the result (approximately a 33% win percentage for SL) into Elo rating points.
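Under those assumptions, the arithmetic is short. The Pythagorean exponent below is a generic placeholder and the aggregate for-and-against figures aren’t reproduced here, but the final conversion shows where roughly 33% becomes roughly 120 Elo points:

```python
import math

def pythagorean_expectation(points_for, points_against, exponent=2.0):
    """Estimated win percentage from aggregate points scored and conceded.
    The exponent is a placeholder; analyses differ on the right value for rugby league."""
    return points_for ** exponent / (points_for ** exponent + points_against ** exponent)

def win_pct_to_elo_gap(win_pct):
    """Elo rating gap implied by an expected win percentage against an even opponent."""
    return 400 * math.log10(win_pct / (1 - win_pct))

print(win_pct_to_elo_gap(0.33))   # about -123, i.e. a discount of roughly 120 points
```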

Still, I believe that the WCC should be one of the centrepieces of the season, not unlike an abbreviated World Series or Super Bowl. A match day programme could be filled out by play-offs from the champions of the men’s, women’s and secondary men’s comps – perhaps with the winners of the NRL State Championship and the winner of a play-off of the premiers of the RFL Championship and Elite 1 – in the Pacific and the Atlantic. Such an event could be saleable to broadcasters, sponsors and hosts.

Of course, if successful, the WCC would then undermine the respective competitions’ grand final days, so there’s an obvious conflict of interest. The conflict is difficult to resolve when the stakeholders are more interested in maintaining their own position than making money or securing a commercial future. While cash may be a corrupting influence, the game will not survive as a professional sport without it.

Given the absence of international club fixtures, you could fairly wonder what the applications of this ranking system might be, other than to have a rough guess at whether the Gold Coast Titans are better or worse than the Sunshine Coast Falcons (the answer is: slightly better). My feeling is that the final score is a rough proxy for a singular, globalised Elo rating system. It may not be very good, but to check I looked back at the last ten WCCs.

GRLFC vs WCC

It was successful in predicting the higher ranked team winning eight of the ten matches but not particularly predictive in terms of the gap between the teams (the trendline above shows basically zero correlation) nor in the scale of favouritism (favourites won 80% of the time compared to 65.9% predicted probability). Still, it’s only a sample size of ten games where the Super League sides have been beaten pretty comprehensively.

In the meantime, this gives the English something to work towards.

Who has the softest NRL draw in 2019?

The current format of the NRL doesn’t allow for each team to play each other twice. Doing that would mean extending the season by another six weeks and, even if the players were up for that (which they are not), as an armchair analyst, I don’t think I could cope.

This means that not every team’s schedule is the same. Across twenty-four games, each team plays every other team once and plays a second game against nine of them. The NRL has no particular interest in trying to provide the mythical “balanced schedule” that would be fair for all teams and prefers to use the opportunity to double up on rivalry games to generate commercial returns.

This might seem grossly unfair, especially if your team has to play the premiers twice, but it is what it is. What I’m interested in looking at this week is how slanted the schedules are and who will have an easier time of the 2019 NRL season and who will have to do it the hard way.


A complete history of the Queensland Cup

Last year, I did a series of graphs which told each NRL team’s history in the top flight since 1998. Despite my expectations, it was pretty well received – certainly more so than the content I had actually expected to generate a bit of interest.

Now that we have all of the Queensland Cup results, I thought I’d run the same exercise. I don’t expect all that much interest but I think it’s a shame that the Queensland Cup doesn’t garner more attention. The days of 30,000 attending the BRL grand final in the 1980s have been replaced by fewer than 10,000 turning up to the equivalent event in the 2010s, crushed under the homogenisation of rugby league culture in Brisbane and Queensland thanks to the twin-headed Orthrus* of the Maroons and the Broncos.

Part of the solution is to create some meaningful content about it, and we’ll see more of that on this website this year. This post is the starting point, literally charting the competition’s history from its inception in 1996 through the turbulent early 00s period and into the relative stability – dare I say, optimism – of the last ten years.

*It’s like Cerberus but with two heads.


What’s new in 2019

I had meant to do some wrap-up posts after I got back from my honeymoon but it felt like the moment had passed, being two or three weeks after the grand final, and I didn’t have much to say other than the navel gazing stuff that people really don’t care about.

This tweet sums up the 2018 season on the blog pretty well though:

This post is about some of the stuff that’s changed and what’s a work in progress but mostly to let you know that we are indeed in business in 2019.

