Most major publications have released their “Power Rankings,” which provide us with fodder to throw around the proverbial water cooler. Typically, I don’t put much stock in rankings, unless it’s the BCS rankings, in which case I don’t put any stock in them. However, after a little research, I was surprised to find that NFL power rankings are actually pretty indicative of a team’s success.

On a week-to-week basis, the rankings tend to vary wildly. This is the reactionary, or knee-jerk, nature of the sports world. Look at the weekly variation of the 2011 Raiders.

Of course, the variation can be explained by the team’s inconsistent play. The Raiders’ performance is measured by Football Outsiders (green) using Defense-adjusted Value Over Average (DVOA). Football Outsiders (FO) grades each play of every game and compares the outcome to a league-average baseline. From there, they can determine how well a team is performing relative to how an average team would have performed in the same situations. So, until Week 10 or so, ESPN’s (red) and FOX’s (blue) power rankings reflected the team’s performance according to FO. After that, the method to their respective madnesses is anybody’s guess.
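For the curious: FO’s actual DVOA formula is proprietary and layers in situational and opponent adjustments, but the basic value-over-average idea can be sketched in a few lines. The plays, success values, and baseline numbers below are made up purely for illustration.

```python
# Minimal sketch of the "value over average" idea behind DVOA -- not FO's
# actual (proprietary) formula, which also adjusts for situation and opponent.
# All numbers below are made up for illustration only.

# Each play: (down, yards to go, yards gained)
raider_plays = [
    (1, 10, 6),   # 1st-and-10, gained 6
    (2, 4, 1),    # 2nd-and-4, gained 1
    (3, 3, 5),    # 3rd-and-3, converted
]

# Hypothetical league-average "success value" for each down/distance situation.
league_baseline = {
    (1, 10): 0.45,
    (2, 4): 0.50,
    (3, 3): 0.40,
}

def play_value(down, to_go, gained):
    """Crude success value: fraction of needed yards gained, capped at 1."""
    return min(gained / to_go, 1.0)

# Value over average: how much better (or worse) each play was than the
# league-average play in the same situation, averaged over all plays.
diffs = [
    play_value(down, to_go, gained) - league_baseline[(down, to_go)]
    for down, to_go, gained in raider_plays
]
voa = sum(diffs) / len(diffs)
print(f"Value over average: {voa:+.1%}")
```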

Despite the weekly fluctuation, the system works as a whole. Taken as season-long averages, power rankings are mostly consistent with a team’s actual standing.

According to their win-loss record and Football Reference’s strength of schedule, the Raiders were the 18th-best team in the league last season, which means that FOX Sports’ rankings (aka Brian Billick’s) were the most consistent with reality.
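If you want to run the same comparison yourself, the math is simple: average each outlet’s weekly rank for a team and see how far that average lands from the team’s actual finish. A quick sketch, using placeholder numbers rather than the real 2011 rankings:

```python
# Sketch of the season-average comparison: average each outlet's weekly rank
# for a team and see which lands closest to the team's actual finish.
# The weekly ranks below are placeholders, not the real 2011 numbers.

actual_finish = 18  # Raiders' standing per record + strength of schedule

weekly_ranks = {
    "ESPN": [20, 16, 14, 12, 15, 17, 21, 19, 22, 23, 24, 22],
    "FOX Sports": [19, 17, 15, 14, 16, 15, 19, 18, 20, 21, 20, 19],
}

for outlet, ranks in weekly_ranks.items():
    avg = sum(ranks) / len(ranks)
    print(f"{outlet}: season average {avg:.1f}, "
          f"off by {abs(avg - actual_finish):.1f} spots")
```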

Power rankings are still subjective, but this isn’t necessarily a bad thing. As fans and pundits, we don’t have the same football acumen as Billick, nor do we possess the same tracking tools as Football Outsiders. As such, power rankings can give us keen insight into a team’s progress, which is especially important for the Raiders.

Following the release of DeMarcus Van Dyke (DVD), I was dubious of the Raiders’ thought process. Despite the “DVD sucks” narrative that fans have inexplicably attached themselves to, statistical evidence shows that DVD’s rookie performance was comparable to those of Stanford Routt and Cortland Finnegan. Such a performance, in my thinking, was enough to guarantee DVD more of an opportunity than just four preseason games. Most of the readers here disagreed, noting that DVD was an Al Davis “scholarship” player who possessed only speed, not skill. In short, there was a discrepancy between statistical output and fan observation.

This is where I believe power rankings to be most useful: They provide tangible, weekly evidence of a team’s success. With Football Outsiders, and now Pro Football Focus, we can measure a team’s performance statistically. With FOX Sports’ and ESPN’s rankings, we can measure it based on the observations of professionals, namely Billick.

The following are the pre-Week 1 power rankings. In all, every publication believes the Raiders are worse than they were when last season ended. Let’s see if that holds true.

ProFootball Focus

Football Outsiders

ESPN

FOX Sports

As the weeks go on, I’ll have a chart that shows the weekly variation, as well as the averages.