I think this is ultimately the smoking gun for what's going on. Without lower-rated fields to keep ratings from drifting ever higher, we've slowly seen ratings inflate by about 10 points over the past 20 years. Most of that inflation has happened over the last 7 or so years, as the PRO/AM fields were more often separated, the FPO was given their own layout, and the top tier MPO players (>1030) started playing the exclusive events.
I'm not saying ratings don't drift, but I would like to see some sort of analysis that shows the magnitude of the possible drift over the years. To really make a strong argument, you would need to show that, on a fixed course over many years, the same score results in a trend of increasing ratings. It is hard to find a truly fixed course because trees grow and fall, and at the touring pro level the courses are constantly being tweaked. Weather conditions might also be radically different in some years, further obscuring any trend.
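As a sketch of what that analysis might look like: suppose we could collect the round rating awarded to the same score on the same course in different years, and fit a trend. The numbers below are invented placeholders, not real PDGA data:

```python
# Hypothetical records: the round rating awarded to the SAME score
# (say, a 54) on one fixed course in different years.
# Values are invented for illustration only.
records = {2005: 1000, 2010: 1004, 2015: 1008, 2020: 1012}

years = sorted(records)
ratings = [records[y] for y in years]

# Least-squares slope: rating points gained per calendar year.
n = len(years)
mean_y = sum(years) / n
mean_r = sum(ratings) / n
slope = sum((y - mean_y) * (r - mean_r) for y, r in zip(years, ratings)) \
        / sum((y - mean_y) ** 2 for y in years)

print(f"drift: {slope:.2f} rating points per year")
```

A persistently positive slope across many such fixed courses would be the kind of evidence that settles the question.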
So I want to describe other mechanics that could also contribute to a rise in the ratings of the top N pros.
The total number of disc golf players competing in the PDGA system has seen a huge rise in the past decade. Check out the demographics from PDGA's website:
PDGA Demographics
Members by year:
- 2002: 7,638
- 2003: 8,304
- 2007: 11,943
- 2011: 16,609
- 2015: 30,454
- 2019: 53,366
That is a huge increase in the size of the field, so it seems reasonable that the top N pros have higher ratings now: a larger sample drawn from even the same talent distribution reaches further into the tail.
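That intuition can be checked with a quick simulation: draw fields of different sizes from the same skill distribution and compare the average of the top 100. The distribution parameters below are made up for illustration, not fit to any PDGA data:

```python
import random
import statistics

random.seed(42)

def top_100_mean(field_size, mu=900.0, sigma=50.0):
    """Average rating of the best 100 players in a simulated field.

    Every field is drawn from the SAME normal skill distribution;
    only the field size changes. mu/sigma are invented numbers.
    """
    field = [random.gauss(mu, sigma) for _ in range(field_size)]
    return statistics.mean(sorted(field, reverse=True)[:100])

# Same underlying talent pool, bigger field -> better top 100.
print(round(top_100_mean(7_638)))   # 2002-sized field
print(round(top_100_mean(53_366)))  # 2019-sized field
```

Even with identical talent, the top-100 average comes out tens of rating points higher for the larger field, purely because there are more draws from the upper tail.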
In this awesome discussion of the PDGA rating system (the Nick & Matt show with Chuck Kennedy), around the 25:30 mark Chuck describes the origin of the ratings: the 1000-rated SSA was set from the average of the 100 best rounds at the 1998 Cincinnati Pro Worlds. Those course ratings were used to generate player ratings, and everything propagates from that event. If a similar event were held with the players and discs of today, would we expect the 100 best rounds to average better or worse than the top 100 rounds from 1998?
Later in the show, around 1:07:25, they discuss ratings creep and Chuck shares his thoughts. He attributes the majority of the rise in ratings to modern courses being made harder through a variety of methods, like increased distance and OB zones.
The PDGA rating is a measure of a player's past performances on the courses they have played. There is an upper limit to your rating, and that limit is set by the difficulty of the courses you have played. You can always find ways to make a course harder for the general population of disc golfers. But if a player's total throws don't change as the course is made harder, that shows his earlier rated rounds at that course were limited by the course's ability to properly rate/evaluate his skill.
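A small simulation can illustrate that ceiling effect: two players of clearly different skill shoot nearly identical scores on a course with a high effective scoring floor, and only separate on a harder course. All of the numbers here (skill units, floors, noise) are invented for illustration:

```python
import random
import statistics

random.seed(7)

def mean_round(skill, floor, rounds=2000):
    """Average throws over many rounds for a player whose unconstrained
    expected score is (60 - skill) throws, on a course whose layout
    effectively caps how low a score can go (the floor)."""
    return statistics.mean(
        max(60 - skill + random.gauss(0, 2), floor) for _ in range(rounds)
    )

# Easy course (floor at 54): both strong players just shoot the floor,
# so their averages are nearly indistinguishable.
print(mean_round(8, floor=54), mean_round(12, floor=54))

# Harder course (floor at 44): the same two players now separate
# by roughly their true skill gap.
print(mean_round(8, floor=44), mean_round(12, floor=44))
```

On the easy course the stronger player's extra skill is invisible to the scoring, which is exactly the sense in which a course can fail to properly evaluate a player above its difficulty ceiling.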