New Report Claims to Pick Best, Worst Airlines


While there are any number of studies and surveys that attempt to provide clarity on the perennial best/worst-airline question, none have proved definitive.

For many frequent flyers, airline quality is an oxymoron, as travelers’ reactions to United’s recent involuntary bumping incident attest. Still, there are higher- and lower-quality airlines. And for travelers, it’s worth knowing which are which.

One such annual effort is the Airline Quality Rating study, a joint undertaking by professors at Embry-Riddle Aeronautical and Wichita State universities. The latest edition, published this week, reviews and ranks U.S. airlines' performance during the previous year.

The study incorporates 15 elements in four areas of airline performance: mishandled bags, on-time arrivals, denied boardings, and customer complaints. It synthesizes data compiled by the U.S. Department of Transportation (DOT) for its monthly Air Travel Consumer Report and assigns each U.S. airline a higher or lower AQR (Airline Quality Rating) accordingly.
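To make the mechanics concrete, here is a minimal sketch of how a weighted composite score of this kind can be computed from per-airline performance rates. The metric values and weights below are illustrative assumptions, not the report's published figures; the actual AQR weights are set by its panel of airline experts.

```python
# Illustrative sketch of a weighted composite "quality" score.
# The figures and weights below are hypothetical placeholders,
# NOT the AQR study's published values.

# Per-airline inputs, drawn from the kinds of rates reported in DOT data
metrics = {
    "on_time_rate": 0.84,          # share of flights arriving on time (higher is better)
    "mishandled_bag_rate": 2.7,    # per 1,000 passengers (lower is better)
    "denied_boarding_rate": 0.6,   # per 10,000 passengers (lower is better)
    "complaint_rate": 1.52,        # per 100,000 passengers (lower is better)
}

# Hypothetical weights: positive for favorable factors, negative for unfavorable ones
weights = {
    "on_time_rate": 8.0,
    "mishandled_bag_rate": -8.0,
    "denied_boarding_rate": -8.0,
    "complaint_rate": -7.0,
}

def composite_score(metrics: dict, weights: dict) -> float:
    """Weighted sum of the factor values, normalized by total weight magnitude."""
    total = sum(weights[k] * metrics[k] for k in weights)
    return total / sum(abs(w) for w in weights.values())

# A higher composite score means better measured performance under these weights.
print(f"Composite score: {composite_score(metrics, weights):.3f}")
```

The point of the sketch is simply that the final ranking depends entirely on which weights the expert panel chooses, which is where the subjectivity discussed below enters.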

According to this year’s report, the winners and losers, from best to worst, are as follows:

  1. Alaska
  2. Delta
  3. Virgin America
  4. JetBlue
  5. Hawaiian
  6. Southwest
  7. SkyWest
  8. United
  9. American
  10. ExpressJet
  11. Spirit
  12. Frontier

Overall, the study is bullish on the airlines’ performance: “The Airline Quality Rating industry score for 2016 shows an industry that improved in overall performance quality over the previous year.” And the improvement isn’t just year-over-year. “The 2016 score is the best AQR score in the 26 year history of the rating.”

That improvement was across the board, including better on-time performance, fewer mishandled bags, a reduction in denied boardings, and fewer customer complaints (1.52 complaints per 100,000 passengers in 2016, versus 1.9 per 100,000 in 2015). So, at least in those easily measured areas, airline performance improved.

While the study has the look and feel of rigorous quantitative analysis, and the authors go to considerable lengths to tout its supposed objectivity, there’s a fundamentally subjective set of choices at its core. The various factors are weighted according to the opinions of a panel of “airline experts,” whose perceptions may or may not accord with those of the traveling public. For example, the study overweights on-time performance and underweights customer complaints.

In the end, after perusing the numbers and percentages and rankings, many travelers will find themselves wondering what it all means to them. It's a good question. The DOT data on which the study is based are limited to what the Department can readily capture and quantify. And the study's weighting system adds at least as much uncertainty to the endeavor as it adds relevance.

In this case, given the study’s assumptions and methodology, I find the final airline rankings of some interest, but with qualifications. However, the finding that the overall AQR has improved, when so much anecdotal evidence suggests the opposite, is a head-shaker at best.

Reader Reality Check

How do the study results compare to your own assessment of airline quality?

After 20 years working in the travel industry, and almost that long writing about it, Tim Winship knows a thing or two about travel. Follow him on Twitter @twinship.

This article first appeared on SmarterTravel.com, where Tim is Editor-at-Large.