Thinking out loud: Stadiums need better game-day online response teams

Avaya Stadium offers an online welcome

Maybe I just haven’t been to enough stadiums, but in the ones I visited over the last year I was struck by the fact that none of them seemed to offer any kind of live, updated game-day information hub where fans could find the kind of answers that might really improve their attendance experience.

In visiting various professional and top-level collegiate venues and interviewing representatives of other stadiums, I continue to be impressed by the depth and breadth of technology deployments and of some apps that deliver advanced services, like Levi’s Stadium’s food delivery or the various live-replay systems in place at schools like Baylor and Nebraska, as well as at numerous pro venues. But I’ve yet to find a stadium, team or school doing what seems like a simple thing — maintaining a constantly updated “daily news” stream about game-day issues, or better yet, fielding a rapid-response team on social media or email to answer simple questions like: Where should I park? Which gate should I go to?

Sometimes it seems like the simplest things are being overlooked when it comes to stadium technology, and I’m wondering why no such services seem to exist. Are they too costly? Or just not thought of as necessary? Or are stadium owners and operators not really paying attention to what happens on game day?

Why can’t all fans get the ‘suite’ treatment?

I don’t think the answer to the last question is yes, since I did have the privilege of attending one Niners game at Levi’s Stadium this past season as the guest of app developer VenueNext, an experience that included a pass to the company’s corporate suite. As you can probably guess, having a suite-level pass is indeed a “suite” way to see a game. Almost all of your concerns and needs are taken care of, from the already paid-for drinks and food to the comfortable seating, and there is no shortage of stadium staff around to answer any questions you might have about where to go or how to find things.

One for the road at the BNY Club, Levi’s Stadium

Fans with “regular” passes, however, simply don’t have many similar options for assistance, especially outside the stadium gates, where perhaps help is most needed. I know teams and stadiums (like Levi’s) do a good job of making maps and guides available online, especially for season ticket holders, but those resources typically aren’t designed for viewing on mobile devices, especially in a low-connectivity or bright-sunlight outdoor situation. Others that are designed for mobile apps, like Avaya Stadium’s “Ava” character, only offer canned information, and not a question-and-answer service.

Wouldn’t it be great to have a rapid-response Twitter handle or a regularly updated Twitter feed to answer questions like “where is the best place to park for seating in Section X,” or “which lots are less full,” or “which lots offer the fastest exit after the game?” Such a service could be incredibly helpful for the huge numbers of fans who only attend a small number of games, who might be making such decisions at the last minute and may have never been to the stadium before.
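
To make the idea concrete, here is a minimal sketch of how such a rapid-response service might triage the most common questions before handing the rest to a live staffer. Every question, answer, lot and gate name below is invented for illustration; a real deployment would wire this lookup to Twitter, email or the team app.

```python
# Hypothetical game-day FAQ triage: keyword-matched canned answers, with
# anything unmatched escalated to a human responder. All entries invented.

GAME_DAY_FAQ = {
    ("park", "parking", "lot"): "Lots E/F are closest to Sections 100-130; Lot K empties fastest post-game.",
    ("gate", "entrance", "door"): "Gates open at 5:30 p.m.; Gate A has the shortest lines for Sections 100-115.",
    ("will-call", "ticket"): "Will-call windows are on the north plaza, just left of Gate A.",
    ("cold", "weather", "warm"): "Doors open at 5:30 p.m.; until then, these walkable restaurants are open: ...",
}

def answer(question: str) -> str:
    """Return the first canned answer whose keywords appear in the question."""
    q = question.lower()
    for keywords, reply in GAME_DAY_FAQ.items():
        if any(k in q for k in keywords):
            return reply
    return "Flagged for a live staffer; a reply is on the way."

if __name__ == "__main__":
    print(answer("Where should I park for Section 112?"))
```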

Feed me, keep me warm and dry

I really could have used such an informative service at the College Football Playoff championship game, which I was able to attend via a last-minute media invite from AT&T. Though my pass included access to the game (and more suite-level pampering) I didn’t have any special treatment getting to the event, so my game-day travel experience was probably very similar to many of the thousands of Ohio State and Oregon fans who had likely never been to AT&T Stadium before. Like many others, I decided to get to the stadium early, both to avoid any kind of parking crush and to bathe in whatever pre-game atmosphere might emerge. Three things I wasn’t prepared for came back to chomp me in the behind: Freezing cold weather, the lack of anywhere outside the stadium to get out of said cold weather, and the lack of any kind of online information to assist in the situation.

Fans freezing outside waiting for the CFP game to start

Though we were smart enough to grab lunch beforehand at a nearby bar and grill, my friend and media buddy Phil Harvey and I were only vaguely aware of the fact that the doors to the stadium weren’t going to open until 5:30 p.m., two hours before the scheduled game start, something we hadn’t really counted on when we drove over to park at 2 p.m. Our thoughts of being able to wander around and check out tailgate parties — or the underpublicized outdoor “festival” being put on by the NCAA and its sponsors — were negated by the chilling, biting wind, which whipped mercilessly throughout the acres of parking lots surrounding the stadium.

Like many others that day, we wound up spending some unplanned shopping time in the nearby Walmart, mainly to get out of the chill. We also ended up being frustrated with thousands of our newest closest friends, when the ticket gates apparently opened at 4:30 — only to find ourselves “in” the event (having gone through security and ticket checking) but still outside the doors, jammed onto the outdoor patios where we had to wait for another hour. The only good part of being crushed cheek to jowl is that being packed together did help keep all of us somewhat warmer.

Bargains available at the AT&T Stadium Walmart.

Sure, we should have been smarter and maybe asked more questions beforehand, but during the hours of unpleasantness all I could think of was why someone from the game or venue wasn’t outside watching what was going on, or doing anything to help rectify the situation. Even a simple official message of “we aren’t opening the doors for two more hours — here is a list of nearby restaurants you can walk to” would have been extremely helpful.

Maybe the CFP game was an outlier situation — lots of people who had never been to the venue before — but I’m guessing the situation isn’t that unique, especially for “big” events like playoffs or championships. And especially when it comes to extreme weather conditions, it just seems to make sense to have some kind of continually updated “at the game” news service that is well advertised and easily found, so that when a crisis situation emerges, fans know where to turn for trusted information.

Do any such services exist? Are there teams out there already doing this in a fashion that works? Let me know here, or we can have a discussion over on Twitter, where you can find me under the @PaulKaps handle.

Stadium Tech Report: AT&T Stadium’s massive antenna deployment delivers solid Wi-Fi, DAS performance

The old saw that says “everything’s bigger in Texas” is not just a stereotype when it comes to wireless networking at AT&T Stadium. Though our visit was brief and we didn’t have the opportunity to do a deep-dive technology tour, the MSR team on hand at the recent College Football Playoff championship game came away convinced that even if it’s not the fastest fan-facing stadium network, the Wi-Fi and DAS deployment at AT&T Stadium is surely the biggest, at least the largest we’ve ever heard of.

Inside AT&T Stadium at the College Football Playoff championship game. (Click on any photo for larger image) Credit all photos: Paul Kapustka, MSR

And in many ways, we found, bigger is better, at least when it comes to staying connected inside one of the world’s truly humongous indoor spaces.

If you’ve not seen the stats, allow us to remind you that during the Jan. 12 championship game between the University of Oregon and THE Ohio State University the AT&T Stadium network carried more than 6 terabytes of wireless data, with almost 5 TB of that over the in-house Wi-Fi network. Another 1.4 TB was recorded being used by AT&T customers on the AT&T-hosted neutral DAS, which almost certainly carried another terabyte or two from the other carriers on the system, which did not report statistics. Any way you add it up, it’s the biggest single-day wireless data figure we’ve ever heard of for a sports arena, professional or college, in any sport at any time.
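
To put those totals in perspective, here is a quick back-of-the-envelope sketch; the per-fan average is our own arithmetic, using the announced attendance of 85,689.

```python
# Back-of-the-envelope check on the reported totals (TB = terabytes).
wifi_tb = 4.93        # traffic on the in-house Wi-Fi network
att_das_tb = 1.41     # AT&T customers on the neutral-host DAS
fans = 85_689         # announced attendance

measured_tb = wifi_tb + att_das_tb       # 6.34 TB measured (other carriers excluded)
per_fan_mb = measured_tb * 1e6 / fans    # 1 TB = 1,000,000 MB

print(f"Measured total: {measured_tb:.2f} TB")       # 6.34 TB
print(f"Average per attendee: {per_fan_mb:.0f} MB")  # ~74 MB
```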

Flooding the zone with more antennas and APs

How do you get such a big data number? One way is to make sure that everyone can connect, and one way to get to that point is to flood the zone with antennas and access points. Already the leader in the number of Wi-Fi APs and DAS antennas, AT&T Stadium had another 280 Wi-Fi access points installed between Thanksgiving and the college championship game, according to John Winborn, CIO for the Dallas Cowboys. Some of those APs, the staff said, were housed in new under-the-seat enclosures that AT&T’s Foundry designed specifically for use in the lower bowl of AT&T Stadium, which, like other stadiums, had previously had trouble getting connectivity to seats close to field level.

According to Winborn, AT&T Stadium now has more than 1,600 Wi-Fi APs in use for football games, and 1,400 antennas in its DAS network. By comparison, Levi’s Stadium in Santa Clara, Calif., perhaps the newest and one of the most technologically savvy venues out there, has 1,200 Wi-Fi APs and 700 DAS antennas in its deployments. Winborn also said that the AP and antenna numbers at AT&T Stadium can scale up as necessary, especially for events that use more of the building’s space, like the Final Four basketball tournament held there last spring.

“We scaled up to 1,825 [Wi-Fi] antennas for the Final Four last year,” said Winborn in a recent email, where he guessed that the venue might deploy up to 2,000 Wi-Fi APs when the Academy of Country Music Awards holds its yearly event at AT&T Stadium on April 19.

Hiding Wi-Fi APs an aesthetic priority

John Winborn, CIO for the Dallas Cowboys, poses next to a picture of two other innovators, Tex Schramm and Gil Brandt

For all the extra numbers, one thing we noticed in walking around the building on Jan. 12 was that seeing an exposed Wi-Fi AP is about as common as seeing an albino deer. When we asked Winborn what the toughest thing was about network deployment in the venue, he responded quickly: “Finding ways to hide the APs so Jerry [Jones] doesn’t see them.”

With the price-is-no-object Jones on one side, and AT&T’s corporate image on the other, it’s clear there aren’t too many budgetary concerns when it comes down to spending more to make the network work, or look, better. Put it this way: You are never likely to have a “no signal” problem in a building that has on its outside an AT&T logo the size of the moon, and where AT&T CEO Randall Stephenson can be found wandering around the suite level during big events.

Though the immense space could probably be covered by fewer antennas, it’s worthwhile to remember that when the building opened in 2009, it wasn’t designed with high-speed networking in mind. That means that almost all of the Wi-Fi and DAS deployments are a retrofit, including the ingenious circle of Wi-Fi antennas halfway up the seating bowl, covered by a tented ring of fiberglass designed and built specifically for the stadium.

According to Winborn, the Wi-Fi network is supported by its own 2 Gbps backbone, with separate backbones in place for media networks and stadium application use. Winborn also noted that the stadium network runs 3,500 TVs via the Cisco StadiumVision system. Other records from this season include a peak concurrent Wi-Fi user mark of 27,523 (set at the Lions playoff game) and 38,534 unique Wi-Fi connections, a mark set at the season opener against the San Francisco 49ers.
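
Assuming that backbone figure means 2 Gbps of throughput, a minimal sketch shows why it can serve more than 27,000 concurrent users: statistical multiplexing, since only a small fraction of connected devices transmit at any given instant. The per-device burst rate below is taken roughly from our speed tests.

```python
# Why a 2 Gbps backbone can serve 27,523 concurrent Wi-Fi users: only a
# fraction of connected devices are actually transmitting at any instant.
backbone_gbps = 2.0           # assumed Wi-Fi backbone throughput
peak_concurrent = 27_523      # peak concurrent user mark (Lions playoff game)
burst_mbps = 20.0             # rough per-device rate from our speed tests

fair_share_kbps = backbone_gbps * 1e6 / peak_concurrent
full_rate_users = backbone_gbps * 1e3 / burst_mbps

print(f"Fair share if all transmitted at once: {fair_share_kbps:.0f} kbps")  # ~73 kbps
print(f"Devices that can burst at {burst_mbps:.0f} Mbps at once: {full_rate_users:.0f}")  # 100
```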

Performance solid, even at rooftop level

The view from the nosebleed section

So how fast are the Wi-Fi and DAS networks? In our limited testing time at the CFP game, we found solid connections almost everywhere we tried, including outside the stadium while we (freezingly) waited for the doors to open. Just outside the main ticket gate, we got a Wi-Fi signal of 23.93 Mbps on the download and 39.67 Mbps on the upload. At the same location a Verizon 4G LTE device got a 5.93 Mbps download speed, and a 2.59 Mbps upload speed, but it’s unclear if that was on the stadium DAS or just on the local macro network.

When the doors finally opened at 5:30 p.m. (no idea why Jerry kept us all out in the cold all afternoon) we went inside and got solid connections inside the foyer of the pro shop — 18.23/21.74 on Wi-Fi, 21.05/14.84 on an AT&T 4G LTE device, and 12.65/4.61 on a Verizon 4G LTE phone. (It’s worthwhile to note that all our Wi-Fi speeds were recorded on the Verizon device, a new iPhone 6 Plus.)

Down in our field-level suite, where we were the guests of AT&T, we got marks of 19.43/25.31 on the Wi-Fi, 7.35/11.04 on AT&T 4G and 5.71/4.05 on Verizon 4G. An interesting note here: When Oregon scored a touchdown on its opening drive, we took another Wi-Fi speedtest right after the play and got readings of 4.38/7.79, suggesting that there were many Ducks fans communicating the good news.

Later during the game we wandered up to the “Star Level” suites (floor 6 on the stadium elevator) and got a Wi-Fi mark of 11.57/30.51, and 19.31/13.46 on AT&T 4G. The only place we didn’t get a good Wi-Fi signal was at the nosebleed-level plaza above the south end zone, where we weren’t surprised by the 1.41/1.98 Wi-Fi mark since we didn’t see any place you could put an AP. We did, however, get an AT&T 4G signal of more than 7 Mbps on the download in the same location, meaning that even fans way up at the top of the stadium were covered by wireless, no small feat in such a huge space.
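
For readers who want all of those readings in one place, here is a small sketch that tabulates the speed tests quoted above (all figures in Mbps, download/upload; the nosebleed AT&T reading is logged as 7 down with the upload unrecorded).

```python
# The speed tests quoted above, consolidated (Mbps, download/upload).
# None marks tests we didn't run at that spot; "?" marks an unrecorded value.
tests = [
    # location,                      Wi-Fi,          AT&T LTE,       Verizon LTE
    ("Outside main ticket gate",     (23.93, 39.67), None,           (5.93, 2.59)),
    ("Pro shop foyer",               (18.23, 21.74), (21.05, 14.84), (12.65, 4.61)),
    ("Field-level suite",            (19.43, 25.31), (7.35, 11.04),  (5.71, 4.05)),
    ("Star Level suites (floor 6)",  (11.57, 30.51), (19.31, 13.46), None),
    ("South end zone upper plaza",   (1.41, 1.98),   (7.0, None),    None),
]

def fmt(reading):
    """Format a (down, up) reading, tolerating missing tests and values."""
    if reading is None:
        return "n/a"
    return "/".join("?" if v is None else f"{v:g}" for v in reading)

for loc, wifi, att, vzw in tests:
    print(f"{loc:30}  Wi-Fi {fmt(wifi):>12}  AT&T {fmt(att):>12}  VZW {fmt(vzw):>11}")
```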

Bottom line: Network in place for whatever’s next

If there is a place where AT&T Stadium falls behind other stadiums, it’s in the synchronization of network and app: since the venue wasn’t built with food delivery in mind, it’s doubtful it will match Levi’s Stadium’s innovative delivery-to-any-seat feature anytime soon. And even though AT&T Stadium is dominated by the massive over-the-field TV set, fans at the CFP championship game were left literally in the dark during questionable-call replays, which weren’t shown on the big screen and aren’t supported in the AT&T Stadium app.

What could be interesting is if the technology demonstrated by AT&T at the big college game – LTE Broadcast, which sends a streaming channel of live video over a dedicated cellular link – becomes part of the AT&T Stadium repertoire. From experience, such a channel could be extremely helpful during pregame events, since many fans at the college championship were wandering around outside the stadium unsure of where to go or where to find will-call windows. A “pre-game info” broadcast over LTE Broadcast could eliminate a lot of pain points of getting to the event, while also introducing fans to the network and app for later interaction.
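
The appeal of LTE Broadcast is easy to quantify: unicast video costs bandwidth per viewer, while a broadcast channel costs the same no matter how many fans tune in. The stream bitrate and viewer count below are our own illustrative assumptions.

```python
# Unicast vs. broadcast load for a hypothetical "pre-game info" video channel.
stream_mbps = 2.0   # assumed bitrate of the channel
viewers = 20_000    # assumed fans watching outside the gates

unicast_gbps = stream_mbps * viewers / 1e3   # each viewer gets a private copy
broadcast_mbps = stream_mbps                 # one copy, received by all

print(f"Unicast load:   {unicast_gbps:.0f} Gbps")    # 40 Gbps
print(f"Broadcast load: {broadcast_mbps:.0f} Mbps")  # 2 Mbps, independent of viewers
```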

At the very least, AT&T Stadium’s network puts it in the top three of most-connected football stadiums, alongside Levi’s Stadium and Sun Life Stadium in Miami. Here’s looking forward to continued competition among the venues, with advancements that will only further improve the already excellent wireless fan experience.

More photos from our visit below. Enjoy!

Fans freezing outside waiting for the CFP game to start

Creative OSU fan

Plug for the app

AT&T Stadium NOC aka “the Fishbowl”

Sony Club. Now we know where Levi’s Stadium got its “club” ideas

Panoramic view (click on this one!)

A glass (cup?) of bubbly to celebrate the 6 TB event

College championship game at AT&T Stadium breaks 6 Terabyte wireless data mark, with almost 5 TB of Wi-Fi traffic

AT&T Stadium before the college football playoff championship game. (Click on any photo for larger image) Credit all photos: Paul Kapustka, MSR

Not only did Monday night’s College Football Playoff championship game crown a new national title team — it also broke the unofficial record for most wireless traffic at a single sporting event, with more than 6 terabytes of data used by the 85,689 fans in attendance at AT&T Stadium in Arlington, Texas.

John Winborn, chief information officer for the Dallas Cowboys, said the AT&T-hosted Wi-Fi network at AT&T Stadium carried 4.93 TB of traffic during Monday’s game between Ohio State and Oregon, a far higher total than we’ve ever heard of before for a single-game, single-venue event. AT&T cellular customers, Winborn said, used an additional 1.41 TB of wireless data on the stadium DAS network, resulting in a measured total of 6.34 TB of traffic. The real total is likely another terabyte or two higher, since these figures don’t include any traffic from other carriers (Verizon, Sprint, T-Mobile) carried on the AT&T-neutral host DAS. (Other carrier reps, please feel free to send us your data totals as well!)

The national championship numbers blew away the data traffic totals from last year’s Super Bowl, and also eclipsed the previous high-water Wi-Fi mark we knew of, the 3.3 TB number set by the San Francisco 49ers during the opening game of the season at their new Levi’s Stadium facility. Since we’ve not heard of any other event even coming close, we’ll crown AT&T Stadium and the college playoff championship as the new top dog in the wireless-data consumption arena, at least for now.

University of Phoenix Stadium, already with Super Bowl prep under way

Coincidentally, MSR on Tuesday was touring the University of Phoenix Stadium and the surrounding Westgate entertainment district, where Crown Castle is putting the final touches on a new complex-wide DAS. The new DAS includes antennas on buildings and railings around the restaurants and shops of the mall-like Westgate complex, as well as inside and outside the UoP Stadium. (We’ll have a full report soon on the new DAS installs, including antennas hidden behind fake air-vent fans on the outside of the football stadium to help handle pre-game crowds.)

The University of Phoenix Stadium also had its entire Wi-Fi network ripped and replaced this season, in order to better serve the wireless appetites coming for the big game on Feb. 1. At AT&T Stadium on Monday we learned that the network there had almost 300 new Wi-Fi access points and a number of new DAS antennas installed since Thanksgiving, in anticipation of a big traffic event Monday night. Our exclusive on-the-scene tests of the Wi-Fi and DAS network found no glitches or holes in coverage, which is probably part of the reason why so many people used so much data.

UPDATE: Here is the official press release from AT&T, which basically says the same thing our post does.

Stadium Tech Report: AT&T Stadium network a winner at CFP Championship game

Inside AT&T Stadium at the College Football Championship game. Credit all photos: Paul Kapustka, MSR

It’s late here in North Texas and you know by now that Ohio State beat Oregon to win the first non-mythical college football championship. Behind the scenes Monday night, the wireless network in AT&T Stadium was also a winner, standing up to the challenge of the 85,000-plus crowd on both the DAS and Wi-Fi fronts.

We’ll have a more thorough stadium report when we get time to digest all the info we gathered at the game (and get the network stats back from the AT&T Stadium tech crew), but one thing we learned before the game was that since November, the Wi-Fi network at AT&T Stadium grew by more than 280 access points, on top of a total somewhere in the 1,200 range. According to AT&T network folks, the stadium here in Arlington, Texas, has been seeing game-day totals of 3.3 terabytes of data carried on the Wi-Fi network — leading some here to believe that Monday’s championship game could well surpass 4 TB of data used at a single game, an unofficial record as far as we know for a single-day, single-facility network.

As guests of AT&T we also got a quick demonstration of LTE Broadcast technology, which basically slices the available cellular spectrum into a channel that can provide live streams of video. We’ll have more on this new technology in a separate report, but it is something to watch for facilities that want video options but don’t want to go whole hog on Wi-Fi.

AT&T LTE Broadcast demo, showing a live streaming broadcast of the game

Even though we were housed in a field-level suite, your intrepid MSR crew wandered all over the massive facility, and basically found great connectivity wherever we were. Two places stick out in my mind: At the very top of the nosebleed section in the south end zone, the Wi-Fi dipped to just 1 Mbps, probably because the roof is so high there is no place for an access point. However, at that same spot the AT&T 4G LTE signal was around 7 Mbps, providing great connectivity in a tough-to-configure spot.

The other notable spot was in a “star level” suite (about the sixth level of the building), where we got a Wi-Fi signal of 28 Mbps download and 59 (no typo!) Mbps on the upload. Yes, suite people have it better, but wherever we went we got consistent Wi-Fi signals in the high teens or low 20s, and LTE cellular signals (including Verizon 4G LTE) just under 10 Mbps. Like the Ohio State offense, the network at AT&T Stadium works really well and may have set a new record Monday night. More soon, and more images soon as well. For now, Elvis has left the building.

Outside in the frozen tundra of North Texas, aka Arlington

This place was humming all night long

AT&T 4G LTE speedtest, from the top of the stadium

The view from the nosebleed section

Some “suite” Wi-Fi speeds

Stadium Tech Report: Corning, IBM bringing fiber-based Wi-Fi and DAS to Texas A&M’s Kyle Field

Kyle Field, Texas A&M University. Credit all photos: Texas A&M

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When Texas A&M’s newly renovated Kyle Field opens for the 2015 football season, its outside appearance will have changed dramatically. But from a networking perspective, what’s really different is hidden on the inside – namely, an optical fiber infrastructure designed to bring a new level of performance, cost savings and future-proofing to stadium network deployments.

While the use of optical fiber instead of copper cable in large networks isn’t exactly “new” in the core telecom or enterprise networking worlds, in the still-nascent field of stadium network deployments fiber has yet to make large inroads. But the promise of fiber’s ability to deliver much higher performance and greater future-proofing at lower installation costs in stadium situations may get a very visible poster child when Texas A&M’s football facility kicks off the 2015 season with a technology infrastructure designed to be among the most comprehensive in any stadium, collegiate or professional.

With a Wi-Fi network designed to support 100,000 concurrent connections, a robust DAS network with more than 1,000 antennas, and an IPTV deployment with more than 1,000 screens, the IBM-designed network based largely on Corning’s fiber-optical systems is incredibly impressive on paper – and it has already produced some eye-popping statistics this past season, when just a part of it came online during the “Phase 1” period of the two-phase $450 million Kyle Field renovation.

The final phase of the renovation, Phase 2, just now getting underway, began with the implosion of the stadium’s west stands; reconstruction is scheduled to finish in time for the 2015 season with a new, enclosed-bowl structure that will seat 102,512 fans. And if the new network delivers as planned, those fans will be among the most-connected anywhere, with plenty of future-proofing to make sure it remains that way — thanks to fiber.

Driving on the left side of the street

What’s going to be new about Kyle Field? According to news reports some of the creature comforts being added include redesigned concession stands, so-called “Cool Zones” with air conditioning to beat the Texas heat, well-appointed luxury suites and new restrooms – including 300 percent more women’s bathrooms.

Scoreboard, Kyle Field

According to representatives from the school, the decision to make the new stadium a standout facility extended to its network infrastructure. “Our leadership decided that [the stadium renovation] would be leading edge,” said Matthew Almand, the IT network architect for the Texas A&M University System, the administrative entity that oversees university operations, including those at the flagship school in College Station, Texas. “There were some leaps of faith and there was a decision to be leading edge with technology as well.”

Though Phase 1 planning had started with a traditional copper-cable design for the network, Almand said a presentation by IBM and its “smarter stadium” team changed the thinking at Texas A&M.

“The IBM team came in and did a really good job of presenting the positive points of an optical network,” Almand said.

Todd Christner, now director of wireless business development at Corning, was previously at IBM as part of the team that brought the optical idea to Texas A&M. While talking fiber to copper-cable veterans can sometimes be “like telling people to drive on the left side of the street,” Christner said the power, scalability and flexibility of a fiber network fit in well with the ambitious Kyle Field plans.

“The primary driving force [at Texas A&M] was that they wanted to build a state of the art facility, that would rival NFL stadiums and set them apart from other college programs,” Christner said. “And they wanted the fan [network] experience to be very robust.”

With what has to be one of the largest student sections anywhere – Christner said Texas A&M has 40,000 seats set aside for students – the school knew it would need extra support for the younger fans’ heavy data use on smartphones. School officials, he said, were also concerned about DAS performance, which in the past had been left to outside operators with less than satisfactory results. So IBM’s presentation of a better, cheaper alternative for all of the above found accepting ears.

“It was the right room for us to walk into,” Christner said.

IBM’s somewhat radical idea was that instead of having separate copper networks for Wi-Fi, DAS and IPTV, there would be a single optical network with the capacity to carry the traffic of all three. Though the pitch of better performance, far more capacity, a smaller footprint and lower costs might sound a bit too good to believe, most of it is simply the physics advantages of using fiber over copper, well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

Deploying now and for the future

Corning ONE DAS headend equipment.

Without going too deeply into the physics or technology, the benefits stem from the fact that optical fiber can carry far more bandwidth than copper, over longer distances, using less power. That advantage is one reason why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the user’s destination through deployments like Verizon’s FiOS.

Why hasn’t fiber won over completely? Mainly because in single-user deployments – like to a single home or office – it is still costly to replace systems already in the ground or in the wall with fiber, and for many users fiber’s capacity can be a bit of overkill. Fiber’s main benefits come when lots of bandwidth is needed, and the scale of a project is large, since one main benefit is the elimination of a lot of internal switching gear, which takes up space and consumes lots of power.
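
For a concrete sense of the distance advantage, compare the standard Ethernet reach limits for copper and single-mode fiber at the same data rate; these are the well-known spec figures, not measurements from any stadium.

```python
# Standard Ethernet reach limits: copper vs. single-mode fiber at 10 Gbps.
links = {
    "Cat6A copper (10GBASE-T)":       {"rate_gbps": 10, "reach_m": 100},
    "Single-mode fiber (10GBASE-LR)": {"rate_gbps": 10, "reach_m": 10_000},
}

for name, spec in links.items():
    print(f"{name:33} {spec['rate_gbps']} Gbps up to {spec['reach_m']:,} m")
```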

Those reasons accurately describe the perfect bandwidth storm happening in networked stadiums these days, where demand seems to keep increasing on a daily basis. Some stadiums that were at the forefront of the wireless-networking deployment trend, like AT&T Park in San Francisco and AT&T Stadium in Arlington, Texas, have been in a near-constant state of infrastructure upgrades due to the ever-increasing needs for more bandwidth. And Isaac Nissan, product manager for Corning ONE, said new equipment like Wi-Fi access points with “smart” or multiple-input antennas are also going to help push the LAN world into more fiber on the back end.

But there’s another drawback to using fiber, which has less to do with technology and more to do with history: Installers, integrators and other hands-on networking folks in general are more comfortable with copper, which they know and have used for decades. Fiber, to many, is still a new thing, since it requires different skills and techniques for connecting and pulling wires, as well as for managing and administering optical equipment.

“There’s definitely a learning curve for some of the RF [industry] people, who have been doing coax for 20 years,” Nissan said. “Fiber is a little different.”

Texas A&M’s Almand admitted that bringing the stadium’s networking group into a new technology – fiber – was a challenge, but one with a worthy payoff.

Copper cable tray hardly filled by optical fiber

“There’s definitely been a gear-up cycle, getting to a new confidence level [with fiber],” Almand said. But he added that “sometimes it’s good to break out of your comfort zone.”

Lowering the IDF count

Christner said the Corning optical gear is at the center of the Kyle Field deployment, providing support for the fan-facing Wi-Fi as well as Wi-Fi for back-of-the-house operations like point of sale; it also supports the stadium DAS, as well as a network of more than 1,000 IPTV screens. Aruba Networks is the Wi-Fi gear supplier, and YinzCam is helping develop a new Kyle Field app that will include support for using smartphones as remote controls for IPTVs in suites.

On the Wi-Fi side, Christner said the finished network will have 600 APs in the bowl seating areas, and another 600 throughout the facility, with a stated goal of supporting 100,000 concurrent 2 Mbps connections. The DAS, Christner said, is slated to have 1,090 antennas in 50 sectors.
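
Taken at face value, that design goal implies a hefty aggregate capacity; the arithmetic below is our own derivation from the stated figures.

```python
# Aggregate capacity implied by the stated Kyle Field Wi-Fi design goal.
connections = 100_000       # concurrent connections the network should support
per_conn_mbps = 2.0         # stated rate per connection
access_points = 1_200       # 600 bowl APs + 600 elsewhere in the facility

aggregate_gbps = connections * per_conn_mbps / 1e3
conns_per_ap = connections / access_points

print(f"Aggregate design capacity: {aggregate_gbps:.0f} Gbps")  # 200 Gbps
print(f"Average load per AP: {conns_per_ap:.0f} connections")   # ~83
```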

With no intermediate switching gear at all, Christner said that for the fiber network in Kyle Field only 12 intermediate distribution frames (the usually wall-mounted racks that support network-edge gear, also called IDFs) would be needed, as opposed to 34 IDFs in a legacy fiber/coax system. In addition to using less power, the cabling needed to support the fiber network is a fraction of what would have been needed for coax.

One of the more striking pictures of the deployment is a 36-inch wide cable tray installed for the original copper-network plan, which is carrying just 10 inches of fiber-optic cable. Christner said the fiber network also provides a cleaner signal for the DAS network, which already had a test run this past season, when 600 DAS antennas were deployed and lit during the 2014 season.

“At the Ole Miss game we had 110,663 fans at the stadium, and according to AT&T on the DAS all their lights were green,” Christner said. “Via our completely integrated fiber optic solution, we are now able to provide the DAS with much higher bandwidth as well,” said Texas A&M’s Almand, who also said that the carriers have responded very positively to the new DAS infrastructure.

Up from the dust – a model for the future?

Antenna and zone gear box near top of stands

Also included in the design – but not being used – are an additional 4,000 spare fibers at 540 zone locations, which Christner said can be immediately tapped for future expansion needs. And all of this functionality and flexibility, he added, was being built for somewhere between one-third and 40 percent less than the cost of a traditional copper-based solution.

The proof of the network’s worth, of course, will have to wait until after the west stands are imploded, the new ones built, and the final pieces of the network installed. Then the really fun part begins for users, who will get to play with everything from 38 channels of high-def TV on the IPTV screens to the multiple-angle replays and other features planned for the mobile app. At Texas A&M, IBM’s support squad will include some team members who work on the company’s traditionally excellent online effort for the Masters golf tournament, as well as the “smarter stadium” team.

For Texas A&M’s Almand, the start of the 2015 season will mark the beginning of the end, and a start to something special.

“If I were a country singer, I’d write something about looking forward to looking back on this,” Almand said. “When it’s done, it’s going to be something great.”

Report excerpt: SEC moving slowly on stadium Wi-Fi deployments

Jordan-Hare Stadium, Auburn University

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When it comes to college football, the Southeastern Conference – usually just known as “the SEC” – is second to none in the product it puts on the field.

But what about the product in the stands, namely the wireless technology deployments in SEC stadiums? With just two of 14 conference schools currently offering fan-facing Wi-Fi in their main venues, the SEC isn’t pushing any technology envelopes as a whole. And according to one SEC athletic director, there probably won’t be a wholesale march by the conference to the technology forefront – simply because the SEC’s in-stadium fans have other priorities on what needs fixing first.

Scott Stricklin, the AD at SEC member Mississippi State, leads a conference-wide group that is taking a close look at the in-stadium fan experience, a concern for the SEC even as the conference enjoys NFL-like popularity for its teams and games.

“We are proud that we have a pretty special product in our stadiums, and we want to take steps to keep it that way,” said Stricklin in an interview with MSR. A recent conference-wide fan survey, he said, did highlight the fact that when it comes to wireless connectivity, “none of us from a performance standpoint scored very well.”

Wi-Fi not as important as parking, good food

But Stricklin also noted that the same fan survey didn’t place stadium connectivity at the top of the list of things to fix: Instead, it fell well down, trailing issues like parking, clean restrooms, stadium sound and good food. That lack of pressing concern, combined with Stricklin’s still-common belief that fans should be cheering instead of texting while at the stadium, means that the SEC will probably take a measured approach to Wi-Fi deployments in stadiums, and continue to rely on carrier-funded DAS networks to carry the game-day wireless load.

Scott Stricklin, Mississippi State AD

“I take more of a Mark Cuban approach – I’d rather people in the stands not be watching video [on their phones],” Stricklin said. “It takes away from the shared experience.”

Stricklin also noted that the two schools that have installed Wi-Fi in their stadiums – Auburn and Ole Miss – haven’t had resounding success with their deployments.

“Some [SEC schools] have done [Wi-Fi], and they’re not completely happy with the results,” said Stricklin, saying the lack of success has reinforced the cautious approach to Wi-Fi, conference-wide. “Those are the issues all of us are facing and grappling with,” he added.

SEC fans setting DAS traffic records

Trailing on Wi-Fi deployments doesn’t mean SEC schools are putting in dial-up phone booths, however. Indeed, Stricklin noted the huge video boards that have been installed in most conference stadiums, and did say that the recent installations of carrier-funded DAS deployments have somewhat eased the no-signal crunch of the near past.

At his own school, Stricklin said his office got a lot of complaints about fans not being able to get a cellular signal before AT&T updated the stadium’s DAS in 2013.

“Last year, we got very few negative comments [about cellular service],” Stricklin said. “AT&T customers were even able to stream video.”

Vaught-Hemingway Stadium, Ole Miss

AT&T’s aggressive plan to install as many DAS networks as it can has helped bring the SEC to a 100 percent DAS coverage mark, and the fans seem to be enjoying the enhanced cellular connectivity. According to AT&T statistics, fans at SEC schools have regularly led the carrier’s weekly DAS traffic totals for most of the football season, especially at the “big games” between SEC schools like Alabama, Auburn, Ole Miss, Mississippi State and Georgia.

During Alabama’s 25-20 home victory over then-No. 1 Mississippi State, AT&T customers at Bryant-Denny Stadium used 849 gigabytes of traffic, the second-highest total that weekend for stadiums where AT&T has a DAS. The next two highest data-usage marks that weekend came at games at Georgia (676 GB) and Arkansas (602 GB), highlighting that SEC games typically have huge crowds, and those crowds like to use their cellphones, no matter how good the game on the field is.

Would Wi-Fi help with some of the traffic crunches? Possibly, but only two schools in the conference – Ole Miss and Auburn – currently have fan-facing Wi-Fi in their stadiums. Texas A&M, which is in the middle of a $450 million renovation of Kyle Field, is leaping far ahead of its conference brethren with a fiber-based Wi-Fi and DAS network and IPTV installation that will be among the most advanced anywhere when it is completed this coming summer.

But most of the SEC schools, Stricklin said, will probably stay on the Wi-Fi sidelines, at least until there is some better way to justify the millions of dollars in costs needed to bring Wi-Fi to a facility that might not see much regular use.

“If you only have 6 home games a year, it’s hard to justify,” said Stricklin of the cost of a Wi-Fi stadium network.

Other sports may move before football

Stricklin, the man who wants fans to keep their phones in their pockets at football games, is no stranger to technology-enhanced experiences in stadiums. He claims to “love” the in-seat food delivery options at MSU baseball and basketball games, and notes that the conference athletic directors will soon have a meeting where game-experience experts will walk the ADs through the facets of wireless technology deployments.

“They’re going to lay out what are the challenges, and what are the costs” of wireless deployments, Stricklin said. What Stricklin doesn’t want to see at MSU or at any SEC school is the return of the “no signal” days.

“When fans from other schools come here, we want them to have a good experience,” Stricklin said.

But he’d still prefer that experience is real, not virtual.

“I still just wonder, is anybody really doing this?” he asked. “Are you going to pay what you pay to come to our place, and then watch your phone? What I hope is that we produce such a great experience, you’re not going to want to reach for your phone.”