Commentary: Wi-Fi and DAS ain’t cheap — but can your venue afford not having them?

If I had to guess, I would bet that our news that Texas A&M’s new optical-based stadium network cost “north of $20 million” to build will be one of the most talked-about topics in the stadium technology world for the near future. While some may ask “who really has that kind of money to spend” on a stadium network, I think there is an equal question in the opposite direction: Can you afford not to spend that much (or at least as much as you can) to make your network as good and future-proof as it can be?

Our cover story about the new deployment at A&M’s Kyle Field in our latest STADIUM TECH REPORT (which you can download for free) may be somewhat of an outlier since Texas A&M is clearly one of the subset of universities and colleges that doesn’t really have the budgetary concerns that others might. Yet it’s also instructive to look around at Texas A&M’s peers at the big-time college football level to see how few of them have even started down the road toward a top-level stadium network.

Some schools with “big” football programs (which regularly attract large, sellout crowds and have plenty of income on hand) have certainly built great networks of their own, including schools we’ve profiled, like Wisconsin, Nebraska, Baylor and more. But there are still many more schools, even those with successful, money-making operations, who still haven’t put high-speed wireless networks into their venues. The biggest question may be for them, and it is: How much longer will your fans put up with the feared “no signal” problem? Especially as the kids of today become potential ticket-buying alums that you count on for the future?

It’s not about watching the phone at the game

To be sure, we still don’t think that anyone – anyone – goes to a sporting event to stare at their phone. There is still so much to the live game-day experience, the smells, sounds and tribal fun, that it will always outweigh whatever entertainment or pleasure one might derive from their mobile device.

Dallas fan in mobile action at AT&T Stadium. Photo: Phil Harvey, MSR

That being said, it’s also true that our society has already become one that is used to being able to connect everywhere; that’s especially so when we’re in public and social situations, where the ability to stay in touch facilitates not only face-to-face meetings (meet you there!) but also enables us to stay close to others who can’t physically be with us (wish you were here!).

Time and time again, when we profile venues that have installed new wireless networks, we ask about the reasons behind the deployment – and almost always, fans complaining about not being able to connect is one of the top woes. Before the stadium refurbishment at Texas A&M, chancellor John Sharp’s office was “flooded” with emails after every home game, complaining about two things in particular: The lack of women’s restrooms, and the terrible cellular reception. They’re both plumbing problems, but some people still don’t seem to see the urgency to solve the second kind, the one that uses “pipes” to connect phones.

For the near future, it may be easy to ignore the problem and say it’s not a priority, that fans come to watch the games, not their phones. But ignoring the reality of the need for people to stay connected seems a bad way to treat paying customers; and every day your venue doesn’t have a network is another day lost in the possible pursuit of a closer relationship with ticket-buyers, and the potential digital-supported revenue ideas that are just starting to emerge.

While we’re guessing that not every institution can support a $20 million network (even if the wireless carriers are paying half the cost), there are many other ways to skin this cat, as other profiles in our most recent STADIUM TECH REPORT issue point out. By partnering with Boingo, Kansas State was able to get both a DAS and a Wi-Fi network built; and at Ole Miss, a partnership with C Spire got a Wi-Fi network deployed at Vaught-Hemingway Stadium, albeit one where non-C Spire customers have to pay a small fee ($4.99 per game) to use it.

Maybe charging a small fee isn’t the ideal situation, but it’s better than no network at all, especially if you want to attend a game but still want to remain somewhat connected to the outside world. And we haven’t even mentioned the public safety aspects of ensuring you have adequate cellular and/or Wi-Fi coverage in your venue, which might prove indispensable in times of emergency.

And even at stadiums we’ve been to where there is advanced cellular and Wi-Fi inside the venue itself, there is often poor or no connectivity outside. At Texas A&M, we heard tales of some 30,000 people who remained in the tailgating lots during the game, never wanting to come inside. While not all schools may have that kind of be-there fervor, the idea of an “event city” is taking shape at many venues pro and collegiate.

At the University of Phoenix Stadium in Glendale, Ariz., for example, a Crown Castle DAS brings connectivity to the extensive mall/restaurant area surrounding the football stadium and hockey arena; both the Green Bay Packers and the Chicago Cubs are planning outside-the-wall fan areas that will have Wi-Fi and DAS coverage to keep people connected on the way to or from the games. For many venues, outside is now as important as inside when it comes to wireless coverage.

So the question is, should your institution spend the necessary money to put great networks into your most public places, or is connectivity still a luxury your venue can’t afford? We’ll admit we don’t know all the answers to those twin questions, but if you have a story or opinion one way or the other we’re ready to help you tell your tale. Let’s hear from more of you, so that everyone can learn.

University of Phoenix Stadium sees another 2 TB Wi-Fi game with big events on the horizon

University of Phoenix Stadium before Super Bowl XLIX. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

Call it maybe a warm-up before the storm hits? The University of Phoenix Stadium, home of the Arizona Cardinals, racked up another 2 terabyte Wi-Fi traffic event during a recent Thursday night game, but bigger wireless days are no doubt on the near horizon.

With playoff-consideration regular season home games coming up against the Green Bay Packers and the Seattle Seahawks, the beefed-up Wi-Fi and DAS at UoP is sure to get a workout, though there might be even bigger numbers chalked up during the Notre Dame-Ohio State clash at the Fiesta Bowl on Jan. 1, 2016, and the College Football Playoff championship game, scheduled for Jan. 11. According to Mark Feller, vice president of technology for the Arizona Cardinals, the two college events will use the stadium’s expanded seating, which increases capacity from the NFL-game level of 63,500 to 75,000.

Last February during Super Bowl XLIX, the University of Phoenix Stadium (located in Glendale, Ariz.) recorded the highest single-game Wi-Fi traffic mark, a figure of 6.23 TB, while the inaugural College Football Playoff championship game at AT&T Stadium hit 4.93 TB. With the Packers coming to town Dec. 27 followed by the Seahawks on Jan. 3, it might be interesting to see how much Wi-Fi traffic is carried at UoP in the two-week-plus span.

For the Dec. 10 Thursday night game against the Minnesota Vikings (won by the Cardinals, 23-20), Feller said the Wi-Fi network recorded 28,497 unique clients, an almost 45 percent “take rate.” The peak concurrent user number that night was 25,333, Feller said, occurring just before halftime. The total bandwidth used was 2.0 TB, Feller said.
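The take-rate math above checks out; here is a quick back-of-the-envelope sketch, using the NFL-configuration capacity of 63,500 noted earlier (the concurrency ratio is our own derived figure, not one quoted by Feller):

```python
# Back-of-the-envelope check of the Wi-Fi figures from the Dec. 10 game.
CAPACITY = 63_500          # University of Phoenix Stadium, NFL configuration
UNIQUE_CLIENTS = 28_497    # unique Wi-Fi clients that night
PEAK_CONCURRENT = 25_333   # peak concurrent users, just before halftime

take_rate = UNIQUE_CLIENTS / CAPACITY            # share of seats with a device on Wi-Fi
concurrency = PEAK_CONCURRENT / UNIQUE_CLIENTS   # share of those devices on at once

print(f"take rate: {take_rate:.1%}")             # ~44.9%, i.e. "almost 45 percent"
print(f"peak concurrency: {concurrency:.1%}")    # ~88.9% of unique clients
```

In other words, nearly nine of every ten devices that touched the network that night were connected simultaneously at the pre-halftime peak.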

We’ll be interested to see what happens in the “15 days of bandwidth,” a series of events Feller and his crew are facing with excitement, as well as probably some pots of coffee and/or energy drinks.

“We are excited to be hosting all these games, but won’t be sleeping much,” Feller said in an email.

Nebraska’s 2015 season Wi-Fi stats: Two 4+ TB games, 3.4 TB average

Memorial Stadium, University of Nebraska. Credit all photos: University of Nebraska.

The high-density Wi-Fi network at the University of Nebraska’s Memorial Stadium saw a lot of action during the 2015 football season, racking up an average of 3.4 terabytes per game with two games going past the 4 TB mark.

According to figures provided to us by Chad Chisea, IT operations manager for Nebraska athletics, an early season game against South Alabama carded 4.2 TB and a Nov. 7 matchup against Michigan State (which Nebraska won, 39-38) hit 4.1 TB of Wi-Fi usage to set the high-water marks for the seven-game home schedule. Chisea noted that both 4+ TB Wi-Fi events were during night games, an interesting stat to ponder. The low Wi-Fi usage mark came during the final game of the season, a 28-20 Cornhuskers loss on Nov. 27, a day that Chisea said had temperatures that stayed below freezing in Lincoln.

The average number of unique devices connected per game was 31,358, an impressive “take rate” given that the average announced attendance during 2015 was 90,012 per game. The Michigan State game saw the highest single-game unique device total, 35,781, as well as the biggest number of peak concurrent connections, 29,666. For the entire seven-game season, the Nebraska network saw 219,504 unique devices connected, and it carried a total of 24.1 TB of traffic.
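The season figures hang together; a short sketch verifying the averages, using only the numbers quoted in the paragraphs above:

```python
# Sanity check of Nebraska's 2015 season Wi-Fi figures.
GAMES = 7
AVG_UNIQUE_DEVICES = 31_358   # average unique devices per game
AVG_ATTENDANCE = 90_012       # average announced attendance per game
TOTAL_TRAFFIC_TB = 24.1       # season total Wi-Fi traffic

take_rate = AVG_UNIQUE_DEVICES / AVG_ATTENDANCE  # devices as share of attendance
avg_traffic = TOTAL_TRAFFIC_TB / GAMES           # per-game average

print(f"take rate: {take_rate:.1%}")             # ~34.8% of announced attendance
print(f"avg per game: {avg_traffic:.1f} TB")     # ~3.4 TB, matching the headline
```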

(Click on image to see larger version)

Last-minute audible to optical made Texas A&M’s stadium network a winner

Texas A&M student at recent Aggies football game. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

The original game plan for the new wireless networks at Texas A&M’s Kyle Field called for copper, not optical fiber, at the network core. Then came a last-minute audible that changed the game not just for the Aggies but maybe for stadium networks overall.

After initially designing the network with a traditional copper wiring system, a late spring 2014 decision by Texas A&M chancellor John Sharp reversed field, switching instead to an all-optical network for DAS, Wi-Fi and IPTV combined. The new network, now in full operational mode, is already being hailed as a future-proof path for stadium network technology, with other schools and pro teams beating a path to College Station to see what they might learn.

With screaming speeds on both the Wi-Fi and DAS networks and plenty of capacity for now and the future, Sharp’s line-of-scrimmage call to go with an IBM and Corning optical-based network seems to be a huge score, according to a school official who brought the idea to Sharp’s attention.

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE for 2015. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

A sample of the Wi-Fi and DAS speed tests we took at Kyle Field.

Last-minute switch from copper to optical

“We had got pretty far down the road with an older, but tried and true [network] architecture,” said Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System. But after hearing and reading about the potential of an optical fiber-based network system, Ray brought in Corning and IBM representatives over school spring break in 2014 to discuss the possibility of switching to an optical fiber-based network for Kyle Field – even though the network would have to be ready for the 2014 football season.

“We had some face to face meetings with chancellor Sharp and discussed all the pros and cons,” said Ray, who had been charged by Sharp with overseeing the network deployment part of the $485 million Kyle Field renovation. Though Ray said he was under a “lot of pressure” to stick with the older-type design, he quickly got a green light from Sharp to take the optical choice and run with it.

“If we had gone copper, we knew that we would have had a network in the stadium for game 1,” said Ray. “But the pros of optical far outweighed the cons. Chancellor Sharp instead took a big risk, and took a leap of faith for all the right reasons. He said, ‘this is the chance of a lifetime, to really move the ball and shoot for the top!’”

According to Ray, the total cost of the combined Wi-Fi, DAS and IPTV network ended up being “just north of $20 million,” but that cost was softened when the two largest cellular carriers, AT&T and Verizon Wireless, ponied up $10 million, almost half the cost.

“The carriers embraced it, funded it, and want to be with us down the road,” said Ray. “It was a paradigm shift for them, but they wanted to be involved.” While AT&T and Verizon are live on the DAS now, Ray said that Texas A&M already has a commitment from T-Mobile to join the DAS soon, and hopes to also add Sprint before long.

Aside from the leap of faith to go optical was the on-the-ground necessity to build the network quickly, since Sharp didn’t want to start the 2014 season without it. Ray said that Todd Chrisner – a former IBM employee who moved to Corning during the past year – “helped lead a Herculean effort” of gear suppliers, service providers and construction workers who finished Phase 1 of the network in time for the first game of last season. Phase 2 of the network also required quick moving, since it didn’t get started until Texas A&M blew up and then rebuilt the entire west side of the stadium between December 2014 and the 2015 season.

On the road to College Station, Aggie pride is everywhere. Whoop!

Again, the network (and the building) were finished on time.

“We had a lot of Aggies involved [in the construction],” Ray said. “They knew they were going to be sitting in those seats for the next 35 years, so they worked hard.”

Now that it’s finished and working incredibly well, Ray said the Kyle Field network has already been visited by representatives from other colleges, as well as professional football and hockey stadium-networking types.

“We get calls every week, and we have people down to share what we learned – we’re an open book,” said Ray. And they’re able to tell a success story mainly because Ray, Sharp and others trusted themselves to switch from an OK play to one that could score a touchdown.

“If we had gone with copper we’d be so regretting it now,” Ray said. Having an optical-based network, he said, “sets us up for many years, and eventually will save us money. It was a lot of hard work and risk, and if it had fallen on its head, chancellor Sharp would have taken the heat. Instead, it’s one of the best decisions, ever.”

New Report: Is Texas A&M’s $20 million, all-optical DAS and Wi-Fi the fastest stadium network out there?

One of the many maxed-out speed tests we took at Texas A&M's Kyle Field. All photos: Paul Kapustka, MSR (click on any photo for a larger image)

Is there a combined stadium Wi-Fi and DAS deployment that is as fast as the one found at Texas A&M’s Kyle Field? If so, we haven’t seen or heard of it.

In fact, after reviewing loads of live network-performance data of Kyle Field’s new Wi-Fi and DAS in action, and after maxing out the top levels on our speed tests time after time during an informal walk-around on a game day, we’ve come to the conclusion that Kyle Field has itself a Spinal Tap of a wireless deployment. Meaning that if other stadium networks stop at 10, this one goes to 11.

Movie references aside, by the numbers Kyle Field’s wireless network performance is simply unequaled, in raw speed and the ability to deliver bandwidth, by any other large public venue we’ve tested. With DAS and Wi-Fi speed measurements ranging between 40 Mbps and 60+ Mbps pretty much everywhere we roamed inside the 102,512-seat venue, it’s safe to say the school’s stated goal to “build the best network” in a stadium was met about as well as it could be. And since the school spent “north of $20 million” on the network, perhaps it’s no surprise that it’s the fastest anywhere.

Our full profile of our in-depth visit to College Station to see this network in action can be found in our latest STADIUM TECH REPORT, our COLLEGE FOOTBALL ISSUE for 2015. You can download the report for free, right now! In addition to the Texas A&M profile you will find in-depth looks at wireless deployments at Kansas State, Ole Miss, Oklahoma and the venerable Rose Bowl — so download your copy today!

See the white dots? Those are under-seat Wi-Fi APs

Audible to optical made the difference

Inside our latest 40+ page report you will get a full breakdown on how the Texas A&M network came to be — in an exclusive interview with Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System, you hear for the first time how much the Kyle Field network cost — “north of $20 million” — as well as how much the top two wireless carriers paid to be a part of it. Want to know? Then download the report!

And while the Kyle Field story is our lead article, that’s not all you’ll find in our latest in-depth exploration of stadium technology deployments. Reporter Terry Sweeney checks out the new DAS deployment blanketing Pasadena’s Rose Bowl, perhaps one of the toughest old-style stadium construction challenges to try to bring in wireless coverage. We also have profiles of Wi-Fi deployments at Kansas State and at Ole Miss, and a feature about covering RV parking lots with Wi-Fi at the University of Oklahoma. To top it all off we have some Wi-Fi cost/benefit analysis from yours truly, and a bonus photo feature by photographer Phil Harvey, who accompanied MSR for a recent visit to AT&T Stadium.

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, Crown Castle, SOLiD, CommScope, Aruba (a Hewlett Packard Enterprise company), JMA Wireless, Corning, 5 Bars, Extreme Networks, and ExteNet Systems. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to thank you for your interest and continued support. Thanks for reading and enjoy the COLLEGE FOOTBALL ISSUE!

Kyle Field at kickoff.

IBM formally launches sports consulting practice to bring tech to stadiums

Texas A&M student at recent Aggies football game. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

IBM formally cemented its entrance to the sports-stadium tech deployment market with the announcement of a sports and fan experience consulting practice, and a “global consortium” of tech and service suppliers who may help IBM in its future stadium and entertainment venue deployments.

For industry watchers, the Nov. 19 debut of the IBM “Sports, Entertainment and Fan Experience” consulting practice was not a surprise, since its leader, Jim Rushton, had already appeared at tech conferences this past summer, talking about IBM’s plans to deploy a fiber-based Wi-Fi and DAS network at the new Mercedes-Benz Stadium being built for the Atlanta Falcons. IBM was also publicly behind a similar network build over the last two years at Texas A&M’s Kyle Field. For both networks, IBM is using Corning optical gear.

Still, the formal creation of the IBM practice (you can read all about it at the new IBM sports website) means that the 800-pound gorilla is now firmly inside the competitive ring of the stadium-tech marketplace, a landscape that currently has multiple players, many of which have multiple stadium deployments under their belts. However, IBM’s vast experience in big-time sports technology deployments — Big Blue is behind such endeavors as the truly wonderful online experience of The Masters, as well as the technical underpinnings of three of tennis’ Grand Slam events (Wimbledon, the U.S. Open and the Australian Open) — along with its considerable tech and monetary resources probably make it a top contender for the biggest projects, and possibly for smaller ones as well.

Artist's rendering of planned overhead view of new Atlanta NFL stadium

Rushton, who spoke with Mobile Sports Report earlier this year in one of his first public appearances as an IBMer, said in a phone interview this week that IBM’s fiber-to-the-fan network model isn’t just for large-scale deployments like the one at 102,512-seat Kyle Field or the Falcons’ new $1.4 billion nest, which will seat 71,000 for football and up to 83,000 for other events after it opens in 2017.

“That type of system [the optical network] is scalable,” Rushton said, and even in smaller venues he said it could potentially save customers 30 percent or more compared to the cost of a traditional copper-based cabled network. The flip side to that equation is that purchasers have fewer gear suppliers to choose from on the fiber-based side of things, and according to several industry sources it’s still sometimes a problem to find enough technical staffers with optical-equipment expertise.

How much of the market is left?

The other question facing IBM’s new consulting practice is the size of the market left for stadium tech deployments, an answer we try to parse each year in our State of the Stadium survey. While this year’s survey and our subsequent quarterly reports found a high number of U.S. professional stadiums with Wi-Fi and DAS networks already deployed, there are still large numbers of college venues as well as international stadiums and other large public venues like concert halls, race tracks and other areas that are still without basic connectivity.

Full house at Kyle Field. Photo: Paul Kapustka, MSR

With its new “global consortium” of companies that supply different parts and services of the connected-stadium experience, IBM could be an attractive choice to a customer that doesn’t have its own technical expertise, providing a soup-to-nuts package that could conceivably handle tasks like in-stadium IPTV, DAS and Wi-Fi, construction and stadium design, and backbone bandwidth solutions.

However, IBM will be going up against vendors who have led deployments on their own, and league-led “consortium” type arrangements like MLBAM’s project that brought Wi-Fi to almost all the Major League Baseball stadiums, and the NFL’s list of preferred suppliers like Extreme Networks for Wi-Fi and YinzCam for apps. Also in the mix are third-party integrators like CDW, Mobilitie, 5 Bars, Boingo Wireless and others who are already active in the stadium-technology deployment space. And don’t forget HP, which bought Wi-Fi gear supplier Aruba Networks earlier this year.

Certainly, we expect to hear more from IBM soon, and perhaps right now it’s best to close by repeating what we heard from Jared Miller, chief technology officer for Falcons owner Arthur Blank’s namesake AMB Sports and Entertainment (AMBSE) group, when we asked earlier this year why the Falcons picked IBM to build the technology in the new Atlanta stadium:

Remote optical cabinet and Wi-Fi AP at Kyle Field. Photo: Paul Kapustka, MSR

“IBM is unique with its span of technology footprint,” Miller said. He also cited IBM’s ability to not just deploy technology but to also help determine what the technology could be used for, with analytics and application design.

“They’ve looked at the [stadium] opportunity in a different manner, thinking about what we could do with the network once it’s built,” Miller said.

From the IBM press release, here is the list of companies in IBM’s new “global consortium.” IBM said the list is not binding: none of the companies listed is guaranteed any business yet, and companies not on the list may still end up in IBM deployments (Kyle Field, for example, uses Aruba gear for its Wi-Fi):

Founding members of the consortium include:

· Construction and Design: AECOM, HOK, Whiting Turner

· Infrastructure Technology/Carriers: Alcatel-Lucent, Anixter, CommScope, Corning, Juniper Networks, Ruckus Wireless, Schneider Electric, Smarter Risk, Tellabs, Ucopia, Zebra Technologies, YinzCam (IPTV), Zayo, Zhone

· Communications Solutions Providers: Level 3, Verizon Enterprise Solutions, AT&T

· Fan Experience Consulting & Data Management Integration: IBM