Why might a company planning a business meeting -- or a family planning a vacation -- pass up Manhattan for Boise or Sioux Falls? It could be the hotel rates, according to a report by Runzheimer International, a management consulting firm in Rochester, Wis.

As any firm's accounting office knows all too well, it costs a lot more to put up a busy executive in New York, Chicago, Dallas, or San Francisco, which rank among the most expensive hotel cities in the country. The other ``top 10'' club members are Boston, Honolulu, Los Angeles, San Diego, Atlantic City, N.J., and Washington, D.C.

The average room rate in the 10 most costly United States cities rose almost 10 percent in the past year, soaring from $92.55 to $101.73. In the 10 least expensive cities, by contrast, the average rate rose just 2.5 percent, from $39.45 to $40.45. Those figures would ``make the day'' of any eagle-eyed accountant.

The 10 low-rate cities are Albany, Ga.; Boise, Idaho; Burlington, Vt.; Cheyenne, Wyo.; Great Falls, Mont.; Oklahoma City; Roanoke, Va.; Rockford, Ill.; Sioux Falls, S.D.; and Springfield, Mo.

In the Midwest, for example, the average hotel-lodging rate in Cincinnati is some 43 percent lower than in Chicago -- $58 vs. $101.50. Going south, the rate in Oklahoma City, at $42.50, is half the hotel room rate in Dallas.

Perhaps not surprisingly, the rate spread is getting wider. The gap between the most expensive and the least expensive hotel cities was 135 percent in 1984; in 1985 it has grown to 151 percent. What this says, according to Runzheimer, is that one can significantly cut costs by snubbing the big cities for Roanoke or Great Falls.

Meanwhile, the hotel outlook for the next 10 years is flat, reports Laventhol & Horwath, a business and financial services firm. Room occupancies, now at an average 66.2 percent, are expected to rise to only 68 percent by 1995.