
The Historical | SEC Scheduling, Part II

In Part II of this series, we take a look at a method for rating schedules and the corresponding results

Hmm... could really go for some of that Celebration Bermuda right now...
Kim Klement-USA TODAY Sports

All statistics are courtesy of Football Outsiders, home of the F/+ Combined Ratings for college football.
The S&P+ rating was created by Bill Connelly; check out his college football analytics blog, Football Study Hall.
All schedule information courtesy of Winsipedia.

Part I of this series, covering the history of the SEC and its football scheduling methodology, may be found here.

Now that all that is out of the way, let’s talk method.

In order to address some of these perceived biases in the SEC scheduling process, we first need a methodology for rating past schedules. To do that, I’m going to employ advanced metrics[1], namely the redesigned S&P+ ratings, to provide an objective measure of team strength.

Frequent readers of Processing the Numbers during football season are already familiar with S&P+: Bill C.’s opponent-adjusted rating method, which assesses teams on a play-by-play basis using success rate and equivalent points per play. After gaining a drive efficiency component in 2013, the formula was rebuilt this offseason to more closely track Bill’s Five Factors of football and to better fit a normal distribution. It still depends on success rate and a form of equivalent points per play called IsoPPP, but the presentation has changed; most notably, an "average" team now gets a rating of 0, so positive ratings correspond to above-average performance. I’ll have a piece up closer to football season detailing the new system and how it impacts F/+; for now, check out the links to Bill’s articles in the links box for more information about the S&P+ redesign.

Before this metric can be employed, however, there are a few other conditions that need to be set.

1 | Shocking, I know.


Unfortunately, the advanced metrics only go back to about 2005, when official play-by-play data first became available to the public. Special teams data wasn’t available until 2007, so to work with only the most complete picture of team strength, I’m going to cap this look there. Conveniently, 2007 to 2014 represents eight years of conference play, containing four complete home-and-home rotations among annual opponents and at least one game against every other team in the conference. 2007 was also Nick Saban’s first year on the sidelines at Alabama, realistically the most significant season for Alabama football since 1992.

Two current members of the SEC, Texas A&M and Missouri, did not arrive until 2012, so they do not appear in the rankings below. They were counted as conference opponents from 2012 onward and as out-of-conference opponents for 2007-2011. Because their arrival coincided with a period of strength for the conference, including them with only three years of data against eight for everyone else skewed the numbers too much and muddied some of the results.

Regular Season Games Only

This is probably self-explanatory, but bowl games and conference championship games were not included. The whole idea is to evaluate the games within the control of the universities and the SEC, and to evaluate the teams on the same number of total games. Not everyone gets to play in the postseason[2].

2 | I see you, Vanderbilt.

Home, Road, and Neutral Site Adjustments

Matt Mills over at From the Rumble Seat looked into home field advantage before last season, and determined that home teams from the 2008-2013 seasons were, on average, 3.3 points better than road teams. I converted that to a percentage above and below the average score, and found the home team was about 6% better at home and the road team about 6% worse on the road. The ratings of home and road opponents were adjusted accordingly, while neutral-site opponent ratings were left unadjusted.
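As a back-of-the-envelope check on that conversion (my arithmetic, not Matt’s, and the 27.5-point per-team scoring average is an assumed round number rather than a figure from either article), the 3.3-point edge works out to roughly 6% like so:

```python
# Convert a points-based home field advantage into a percentage
# adjustment. The 27.5-point per-team scoring average is an assumed
# round number for illustration.
avg_points = 27.5   # assumed average points scored per team per game
hfa_points = 3.3    # home field advantage found by From the Rumble Seat

# Split the edge symmetrically: home scores ~1.65 above the average,
# the road team ~1.65 below it.
swing = hfa_points / 2

pct_adjustment = swing / avg_points
print(round(pct_adjustment, 3))  # → 0.06, i.e. about 6% up or down
```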

FCS Opponents

Most of your advanced metrics for college football are only calculated for play at the FBS level[3], so how do we rate these opponents? My first inclination was to not do so, but that poses a slight problem. For the most part, your typical SEC team’s schedule includes one FCS cupcake per season. Alabama’s favorite paycheck recipient has been Western Carolina, for example, while Mississippi State appears to have adopted a three-year rotation with Jackson State, Alcorn State, and UT-Martin. The issue is seasons where a team elects not to schedule an FCS opponent (like LSU in 2009) or a season where a team schedules two FCS opponents (like Ole Miss in 2009). How do you account for the variation without penalizing a team for not scheduling an FCS squad, or vice versa?

3 | Jeff Sagarin’s ratings being a notable exception.

Well, I looked at the new S&P+ ratings for each year since 2007, and noticed the absolute worst team was mighty New Mexico State in 2011, who put up a very solid -27.9 rating for the year. As such, it’s a reasonable assumption that your typical FCS cupcake is probably at or below the level of the 2011 New Mexico State team. To that end, I assigned every FCS opponent a rating of -30[4].

4 | No SEC team in the sample scheduled a giant-killer like North Dakota State; we’re dealing with the Chattanoogas of the world here.

Why bother? Because we’re looking at overall schedule strength as well, and I didn’t want to penalize those teams who didn’t schedule FCS schools in a particular season by removing an FBS opponent from the schedule to compensate. It has no impact on the intraconference scheduling numbers, so we’re good to go there — the overall schedule numbers are more for fun anyways.

Putting It All Together

For each individual season, I summed the adjusted S&P+ ratings of all 12 opponents to serve as a schedule rating for that team’s season. I also calculated a few splits (SEC, OOC, In-Division, Cross-Division, Home, and Away), some of which we’ll focus on shortly. Finally, I averaged over the eight-year sample, which provides the basic ratings used below. The rest of this piece is devoted to sharing this data; Part III will focus on analyzing it to tease out the results that may not be apparent at first glance.
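For the curious, here’s a minimal sketch of that pipeline. Everything in it is illustrative: the opponent data is made up, the function and constant names are my own, and applying the ±6% tweak multiplicatively to the opponent’s rating is just one reading of how the percentage adjustment is carried out.

```python
# Illustrative sketch of the schedule-rating method described above.
# All names and data are hypothetical.

FCS_RATING = -30.0   # flat rating assigned to every FCS opponent
HFA_PCT = 0.06       # home/road adjustment derived from the 3.3-point HFA

def adjusted_rating(sp_plus, venue):
    """Adjust an opponent's S&P+ rating for where the game is played.

    venue is 'home' (we host), 'road' (they host), or 'neutral'.
    Note: for below-average (negative) ratings a multiplicative bump
    cuts the other way; this simplification follows the article's wording.
    """
    if venue == "road":      # opponent at home: they play ~6% better
        return sp_plus * (1 + HFA_PCT)
    if venue == "home":      # opponent on the road: they play ~6% worse
        return sp_plus * (1 - HFA_PCT)
    return sp_plus           # neutral site: no adjustment

def schedule_rating(opponents):
    """Sum adjusted opponent ratings over a regular-season schedule.

    opponents: list of (sp_plus_or_None, venue) tuples, where None
    marks an FCS opponent (no FBS rating available).
    """
    total = 0.0
    for sp_plus, venue in opponents:
        rating = FCS_RATING if sp_plus is None else sp_plus
        total += adjusted_rating(rating, venue)
    return total

# Hypothetical three-game mini-schedule, just to show the mechanics:
mini = [(20.0, "road"), (-5.0, "home"), (None, "home")]
print(round(schedule_rating(mini), 2))  # → -11.7
```

The season average is then just the mean of these sums over the eight-year sample.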

Now that I’m ready for a nice nap, what did you find out?

To give you a feel for how these ratings scale, below is a matrix showing the final rating for each SEC team at the end of each season, ordered by average rating over the sample:

Team 2007 2008 2009 2010 2011 2012 2013 2014 Avg.
Alabama 3.1 19.8 24.0 22.9 27.5 28.5 22.2 28.3 22.04
LSU 22.6 9.9 15.5 15.0 28.7 15.4 15.9 16.5 17.44
Florida 21.8 30.6 25.0 10.4 6.4 22.4 9.7 11.6 17.24
Georgia 14.4 10.9 7.7 9.2 15.2 18.5 16.4 22.6 14.36
S. Carolina 9.6 8.4 12.9 20.0 9.8 15.8 17.5 7.9 12.74
Ala. Poly 9.8 -1.6 9.1 23.9 4.6 -2.6 20.4 23.6 10.90
Arkansas 5.0 4.8 12.7 19.8 12.3 7.4 0.3 23.1 10.68
Creamsicles 16.6 5.8 15.2 1.5 7.2 8.4 6.5 14.2 9.43
Ole Miss -3.3 13.2 6.1 2.3 -2.0 13.1 6.9 23.0 7.41
Miss. State 3.2 -7.1 7.1 10.5 2.4 6.4 13.4 17.8 6.71
Kentucky 10.6 -1.1 3.3 0.9 -6.4 -3.3 -3.4 1.5 0.26
Vanderbilt 3.5 3.2 -9.4 -8.9 11.3 3.3 -0.1 -10.9 -1.00

No great surprises there, right? Over the last eight years, the Crimson Tide has been the class of the conference, with LSU and Florida a bit behind but still fielding a bevy of outstanding teams. Georgia, South Carolina, API, Arkansas, and the Creamsicles form the middle class, with the Mississippi schools a step behind. Kentucky and poor old Vanderbilt, the only SEC team rated below average nationally over the last eight years, bring up the rear.
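As a quick sanity check on the table above, the Avg. column is simply the mean of the eight yearly ratings. Using Alabama’s row:

```python
# Alabama's yearly final S&P+ ratings, taken from the table above.
bama = [3.1, 19.8, 24.0, 22.9, 27.5, 28.5, 22.2, 28.3]

avg = sum(bama) / len(bama)
print(round(avg, 2))  # → 22.04, matching the table's Avg. column
```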

Now that the playing field is clear, let’s take a look at those long-awaited schedule results. First up, the oft-discussed and vilified cross-division schedule, with the SEC East teams in the upper part of the table and the West teams in the bottom. The teams are ordered within division by cross-division schedule rating, with the rank among all SEC teams in bold:

Team Cross-Division Strength
Florida 37.96 (1)
Creamsicles 37.18 (2)
S. Carolina 32.63 (3)
Kentucky 29.63 (5)
Georgia 28.26 (6)
Vanderbilt 24.48 (9)
Arkansas 30.45 (4)
Ala. Poly 27.95 (7)
LSU 25.47 (8)
Alabama 22.59 (10)
Ole Miss 19.81 (11)
Miss. State 13.98 (12)

Interesting! The first thing I notice is that five of the six hardest cross-division schedule averages in the conference belong to SEC East schools, which tells me the SEC West has been the stronger division over the past eight years. The lone exception is Arkansas, and a brief peek at their cross-division opponents over the sample reveals games against the two strongest teams Urbz put together in Gainesville, several games against the consistently-good ‘Dawgs, and the yearly tilt against Spurrier and the Gamecocks; they also drew Vanderbilt in 2011, the one year the Commodores were decent. The other thing I see is that, while there’s a loose relationship between the strength of each team’s permanent cross-division rival and their cross-division schedule strength, it’s not exactly one-to-one; consider that Kentucky’s cross-division schedule has been a bit harder than Georgia’s, despite the fact that API has been a good bit better than Mississippi State over the last eight years.

Next, let’s take a peek at the total SEC schedule strength. Same breakout as before, with the East teams on top and the West on the bottom, with overall SEC ranks in bold:

Team SEC Strength
Kentucky 87.05 (6)
Creamsicles 85.79 (7)
Vanderbilt 83.24 (8)
Florida 78.68 (10)
S. Carolina 77.73 (11)
Georgia 71.62 (12)
Arkansas 101.17 (1)
Ala. Poly 98.87 (2)
Ole Miss 93.47 (3)
LSU 89.27 (4)
Miss. State 88.52 (5)
Alabama 81.63 (9)

Here is your confirmation about which division has been tougher over the last eight years. Most of the games represented in this chart are intra-division, and sure enough, five of the six hardest average schedules in the SEC belong to West teams. Alabama is the sixth team in the West by this metric, and lags well, well behind their division mates. That’s interesting, considering that they were not the lowest team in the West by the cross-division numbers, which indicates there’s an effect in the intra-division scheduling that has depressed their SEC schedule strength accordingly. You probably know what that is already, but I’m going to dive into that in more detail next time.

Finally, how about overall schedule strength, including OOC schedules? Keep in mind this incorporates the adjustment for FCS squads. That adjustment makes the typical OOC schedule component negative, such that overall schedule strength is often considerably lower than SEC schedule strength:

Team Overall Strength
Creamsicles 53.87 (3)
Florida 49.64 (5)
S. Carolina 49.34 (6)
Georgia 46.52 (8)
Kentucky 36.89 (11)
Vanderbilt 32.58 (12)
Ala. Poly 60.35 (1)
Arkansas 55.39 (2)
LSU 50.54 (4)
Ole Miss 48.23 (7)
Alabama 43.98 (9)
Miss. State 43.20 (10)

This largely tracks the SEC schedule rankings, as most everyone in the conference plays (at most) one decent OOC opponent and fills in the rest with patsies. API and the Creamsicles are your two divisional winners here, I’m sad to say, but to their credit they’ve scheduled decent OOC opponents over the last eight years. API tends to schedule two above-average (nationally speaking) opponents a year, whereas the Creamsicles were buoyed by a relatively hellacious 2014 schedule that contained three above-average OOC opponents: a road game at Oklahoma and home tilts against Utah State and Arkansas State[5]. They also had three seasons in a row (2007-2009) with no FCS opponents.

5 | Yes, Arkansas State was considered above-average by the new S&P+ in 2014, finishing with a rating of 1.

The rest is as you’d suspect — schools with regular OOC rivalries against P5 schools (Florida, Georgia) getting a big boost over where they ranked on just SEC schedule strength, and the schools in the West without such rivalries (such as the Mississippi schools) dropping precipitously. Alabama holds steady with the ninth-toughest schedule in the conference over the last eight years, largely the product of scheduling a neutral-site contest against a "marquee"[6] OOC opponent almost every year, as well as a home-and-home with Penn State in 2010 and 2011.

6 | 2009 Virginia Tech? Yes! 2008 Clemson? Noooo...


Toughest SEC Schedule — Arkansas, 2014 [153.04]

With a schedule rating of 153.04, Arkansas’ SEC slate from last season is the toughest of the last eight years. In fact, six of the top seven schedules in the sample belonged to SEC West teams from last season, with 2013 Arkansas being the seventh. Yeah, the SEC West was soooo overrated last year, you guys.

Easiest SEC Schedule — Georgia, 2011 [33.14]

Narrowly edging Alabama’s 2008 slate, Georgia’s toughest opponent that year was Vanderbilt (let me repeat that: their toughest opponent that year was Vanderbilt), and they drew the Mississippi schools and a somewhat-down API for their cross-division slate. 2011 also saw the weakest version of Florida in the sample, typically one of Georgia’s tougher games.

Toughest Overall Schedule — Alabama Poly, 2014 [127.33]

Once again, the top five teams by this metric came from last year’s SEC West, with 2014 Mississippi State a bit farther down the list at #14. Bragging rights go to API, who paired the intra-division gauntlet with a road game against Kansas State, a better-than-you-thought Louisiana Tech at home, and of course the yearly game against the East’s top squad from last year in Georgia.

Easiest Overall Schedule — Alabama, 2008 [-3.49]

-3.49. Negative 3.49, the only negative overall schedule strength in the sample. How on earth does that happen? Well, the OOC schedule turned out crappy, with Clemson the only above-average squad alongside a veritable murderer’s row of Tulane, Western Kentucky, and Arkansas State. In the SEC, the Tide drew three below-average squads in Kentucky, Mississippi State, and API. The tough game that year was Ole Miss, which is not something that can be said very often. We all knew the 2008 Tide was a bit ahead of schedule, but that undefeated run to the SEC title game makes even more sense now, doesn’t it?

All of this is great of course, but simply listing a bunch of numbers and eyeballing ranks within the SEC doesn’t answer the key questions that have driven this debate. Is the cross-division scheduling system "unfair", and does it adversely affect the strength of each team’s overall SEC schedule? Are certain teams being scheduled more fairly than others, or are certain years set up to favor a chosen team? Is Alabama’s SEC schedule really as easy as it seems? I aim to answer all of those questions in Part III of this series, coming to a device near you... soon?