Stadium Tech Report: AT&T Stadium’s massive antenna deployment delivers solid Wi-Fi, DAS performance

The old saw that “everything’s bigger in Texas” is not just a stereotype when it comes to wireless networking at AT&T Stadium. Though our visit was brief and we didn’t have the opportunity to do a deep-dive technology tour, the MSR team on hand at the recent College Football Playoff championship game came away convinced that even if AT&T Stadium doesn’t have the fastest fan-facing stadium network, its Wi-Fi and DAS deployments are certainly the biggest, at least the largest we’ve ever heard of.

Inside AT&T Stadium at the College Football Playoff championship game. (Click on any photo for larger image) Credit all photos: Paul Kapustka, MSR

And in many ways, we found, bigger is better, at least when it comes to staying connected inside one of the world’s truly humongous indoor spaces.

If you’ve not seen the stats, allow us to remind you: during the Jan. 12 championship game between the University of Oregon and THE Ohio State University, the AT&T Stadium network carried more than 6 terabytes of wireless data, almost 5 TB of it over the in-house Wi-Fi network. Another 1.4 TB was used by AT&T customers on the AT&T-hosted neutral DAS, which almost certainly carried another terabyte or two from the other carriers on the system, none of whom reported statistics. Any way you add it up, it’s the biggest single-day wireless data figure we’ve ever heard of for a sports venue, professional or college, in any sport at any time.

Flooding the zone with more antennas and APs

How do you get such a big data number? One way is to make sure that everyone can connect, and one way to get there is to flood the zone with antennas and access points. Already the leader in the number of Wi-Fi access points and DAS antennas, AT&T Stadium had another 280 Wi-Fi antennas installed between Thanksgiving and the college championship game, according to John Winborn, CIO for the Dallas Cowboys. Some of those antennas, the staff said, were housed in new under-the-seat enclosures that AT&T’s Foundry designed specifically for the lower bowl of AT&T Stadium, which, like other stadiums, had previously had trouble getting connectivity to seats close to field level.

According to Winborn, AT&T Stadium now has more than 1,600 Wi-Fi APs in use for football games, and 1,400 antennas in its DAS network. By comparison, Levi’s Stadium in Santa Clara, Calif., perhaps the newest and one of the most technologically savvy venues out there, has 1,200 Wi-Fi APs and 700 DAS antennas in its deployments. Winborn also said the AP and antenna counts at AT&T Stadium can scale up as necessary, especially for events that use more of the building’s space, like the Final Four basketball tournament held there last spring.

“We scaled up to 1,825 [Wi-Fi] antennas for the Final Four last year,” said Winborn in a recent email, in which he guessed that the venue might deploy up to 2,000 Wi-Fi APs when the Academy of Country Music Awards holds its yearly event at AT&T Stadium on April 19.

Hiding Wi-Fi APs an aesthetic priority

John Winborn, CIO for the Dallas Cowboys, poses next to a picture of two other innovators, Tex Schramm and Gil Brandt

For all the extra numbers, one thing we noticed in walking around the building on Jan. 12 was that seeing an exposed Wi-Fi AP is about as common as seeing an albino deer. When we asked Winborn what the toughest thing was about network deployment in the venue, he responded quickly: “Finding ways to hide the APs so Jerry [Jones] doesn’t see them.”

With the price-is-no-object Jones on one side, and AT&T’s corporate image on the other, it’s clear there aren’t too many budgetary concerns when it comes down to spending more to make the network work, or look, better. Put it this way: You are never likely to have a “no signal” problem in a building that has on its outside an AT&T logo the size of the moon, and where AT&T CEO Randall Stephenson can be found wandering around the suite level during big events.

Though the immense space could probably be covered by fewer antennas, it’s worthwhile to remember that when the building was built and opened in 2009, it wasn’t designed with high-speed networking in mind. That means that almost all of the Wi-Fi and DAS deployments are a retrofit, including the ingenious circle of Wi-Fi antennas halfway up the seating bowl, which are covered by a tented ring of fiberglass designed and built specifically for the stadium.

According to Winborn, the Wi-Fi network is supported by its own 2 Gbps backbone, with separate backbones in place for media networks and stadium application use. Winborn also noted that the stadium network runs 3,500 TVs via the Cisco StadiumVision system. Other records from this season include a peak concurrent Wi-Fi user mark of 27,523 (set at the Lions playoff game) and 38,534 unique Wi-Fi connections, a mark set at the season opener against the San Francisco 49ers.
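For a rough sense of what those figures mean per fan, here is a quick back-of-the-envelope sketch (our arithmetic, not the stadium’s engineering math), using the numbers Winborn cited:

```python
# Back-of-the-envelope math on AT&T Stadium's reported Wi-Fi figures.
# The figures come from this story; the per-user split is our illustration.

backbone_gbps = 2.0          # dedicated Wi-Fi backbone, per Winborn
peak_concurrent = 27_523     # peak concurrent Wi-Fi users (Lions playoff game)

# If every peak user pulled data at once, the average share would be:
per_user_kbps = backbone_gbps * 1_000_000 / peak_concurrent
print(f"~{per_user_kbps:.0f} kbps per concurrent user at peak")  # ~73 kbps

# Usage is bursty in practice, which is why individual speed tests (like
# the 20+ Mbps marks in this story) can far exceed that worst-case average.
```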

Performance solid, even at rooftop level

The view from the nosebleed section

So how fast are the Wi-Fi and DAS networks? In our limited testing time at the CFP game, we found solid connections almost everywhere we tried, including outside the stadium while we (freezingly) waited for the doors to open. Just outside the main ticket gate, we got a Wi-Fi signal of 23.93 Mbps on the download and 39.67 Mbps on the upload. At the same location a Verizon 4G LTE device got a 5.93 Mbps download speed, and a 2.59 Mbps upload speed, but it’s unclear if that was on the stadium DAS or just on the local macro network.

When the doors finally opened at 5:30 p.m. (no idea why Jerry kept us all out in the cold all afternoon) we went inside and got solid connections inside the foyer of the pro shop — 18.23/21.74 on Wi-Fi, 21.05/14.84 on an AT&T 4G LTE device, and 12.65/4.61 on a Verizon 4G LTE phone. (It’s worthwhile to note that all our Wi-Fi speeds were recorded on the Verizon device, a new iPhone 6 Plus.)

Down in our field-level suite, where we were the guests of AT&T, we got marks of 19.43/25.31 on the Wi-Fi, 7.35/11.04 on AT&T 4G and 5.71/4.05 on Verizon 4G. An interesting note here: When Oregon scored a touchdown on its opening drive, we took another Wi-Fi speedtest right after the play and got readings of 4.38/7.79, suggesting that there were many Ducks fans communicating the good news.

Later during the game we wandered up to the “Star Level” suites (floor 6 on the stadium elevator) and got a Wi-Fi mark of 11.57/30.51, and 19.31/13.46 on AT&T 4G. The only place we didn’t get a good Wi-Fi signal was at the nosebleed-level plaza above the south end zone, where we weren’t surprised by the 1.41/1.98 Wi-Fi mark since we didn’t see any place you could put an AP. We did, however, get an AT&T 4G signal of more than 7 Mbps on the download in the same location, meaning that even fans way up at the top of the stadium were covered by wireless, no small feat in such a huge space.
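For easier comparison, here are all of our walk-around readings in one place, as a small tallying script. The numbers are exactly the readings reported above; the 5 Mbps threshold for flagging weak spots is our own rule of thumb:

```python
# Our CFP-game speed tests, (download, upload) in Mbps, keyed by location.
# Values are the readings reported in this story; the script is just a tally.
# (The post-touchdown dip to 4.38/7.79 in the suite is omitted.)

tests = {
    "outside main gate":    {"wifi": (23.93, 39.67), "vzw_lte": (5.93, 2.59)},
    "pro shop foyer":       {"wifi": (18.23, 21.74), "att_lte": (21.05, 14.84), "vzw_lte": (12.65, 4.61)},
    "field-level suite":    {"wifi": (19.43, 25.31), "att_lte": (7.35, 11.04), "vzw_lte": (5.71, 4.05)},
    "Star Level suites":    {"wifi": (11.57, 30.51), "att_lte": (19.31, 13.46)},
    "south end zone plaza": {"wifi": (1.41, 1.98)},
}

# Flag any spot where Wi-Fi download fell below a 5 Mbps comfort threshold.
for spot, readings in tests.items():
    down, up = readings["wifi"]
    flag = "  <-- weak spot" if down < 5 else ""
    print(f"{spot:22s} Wi-Fi {down:5.2f}/{up:5.2f} Mbps{flag}")
```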

Bottom line: Network in place for whatever’s next

If there is a place where AT&T falls behind other stadiums, it’s in the synchronization of network and app; since it wasn’t built with food delivery in mind, it’s doubtful that AT&T will match Levi’s Stadium’s innovative delivery-to-any-seat feature anytime soon. And even though AT&T Stadium is dominated by the massive over-the-field TV set, fans at the CFP championship game were left literally in the dark during questionable-call replays, since they weren’t shown on the big screen and aren’t supported in the AT&T Stadium app.

What could be interesting is if the technology demonstrated by AT&T at the big college game – LTE Broadcast, which sends a streaming channel of live video over a dedicated cellular link – becomes part of the AT&T Stadium repertoire. From experience, such a channel could be extremely helpful during pregame events, since many fans at the college championship were wandering around outside the stadium unsure of where to go or where to find will-call windows. A “pre-game info” broadcast over LTE Broadcast could eliminate a lot of pain points of getting to the event, while also introducing fans to the network and app for later interaction.

At the very least, AT&T Stadium’s network puts it in the top three of most-connected football stadiums, alongside Levi’s Stadium and Sun Life Stadium in Miami. Here’s looking forward to continued competition among the venues, with advancements that will only further improve the already excellent wireless fan experience.

More photos from our visit below. Enjoy!

Fans freezing outside waiting for the CFP game to start

Creative OSU fan

Plug for the app

AT&T Stadium NOC aka “the Fishbowl”

Sony Club. Now we know where Levi’s Stadium got its “club” ideas

Panoramic view (click on this one!)

A glass (cup?) of bubbly to celebrate the 6 TB event

College championship game at AT&T Stadium breaks 6 Terabyte wireless data mark, with almost 5 TB of Wi-Fi traffic

AT&T Stadium before the college football playoff championship game. (Click on any photo for larger image) Credit all photos: Paul Kapustka, MSR

Not only did Monday night’s College Football Playoff championship game crown a new national title team — it also broke the unofficial record for most wireless traffic at a single sporting event, with more than 6 terabytes of data used by the 85,689 fans in attendance at AT&T Stadium in Arlington, Texas.

John Winborn, chief information officer for the Dallas Cowboys, said the AT&T-hosted Wi-Fi network at AT&T Stadium carried 4.93 TB of traffic during Monday’s game between Ohio State and Oregon, a far higher total than we’ve ever heard of for a single-game, single-venue event. AT&T cellular customers, Winborn said, used an additional 1.41 TB of wireless data on the stadium DAS network, for a measured total of 6.34 TB of traffic. The real total is likely another terabyte or two higher, since these figures don’t include any traffic from the other carriers (Verizon, Sprint, T-Mobile) on the AT&T-hosted neutral DAS. (Other carrier reps, please feel free to send us your data totals as well!)
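The arithmetic behind that estimate is straightforward; here it is as a minimal sketch, with the other-carrier contribution marked as our guess:

```python
# Reported wireless totals from the CFP championship game, in terabytes.
wifi_tb = 4.93      # AT&T-hosted Wi-Fi network, per Winborn
att_das_tb = 1.41   # AT&T cellular customers on the stadium DAS

measured_tb = wifi_tb + att_das_tb
print(f"Measured total: {measured_tb:.2f} TB")  # 6.34 TB

# Other carriers on the neutral-host DAS didn't report traffic; if they
# added another 1-2 TB (our guess, as noted above), the true total is:
print(f"Likely total: {measured_tb + 1.0:.2f} to {measured_tb + 2.0:.2f} TB")
```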

The national championship numbers blew away the data traffic totals from last year’s Super Bowl, and also eclipsed the previous high-water Wi-Fi mark we knew of, the 3.3 TB number set by the San Francisco 49ers during the opening game of the season at their new Levi’s Stadium facility. Since we’ve not heard of any other event even coming close, we’ll crown AT&T Stadium and the college playoff championship as the new top dog in the wireless-data consumption arena, at least for now.

University of Phoenix Stadium, already with Super Bowl prep under way

Coincidentally, MSR on Tuesday was touring the University of Phoenix Stadium and the surrounding Westgate entertainment district, where Crown Castle is putting the final touches on a new complex-wide DAS. The new DAS includes antennas on buildings and railings around the restaurants and shops of the mall-like Westgate complex, as well as inside and outside the UoP Stadium. (We’ll have a full report soon on the new DAS installs, including antennas hidden behind fake air-vent fans on the outside of the football stadium to help handle pre-game crowds.)

The University of Phoenix Stadium also had its entire Wi-Fi network ripped and replaced this season, in order to better serve the wireless appetites coming for the big game on Feb. 1. At AT&T Stadium on Monday we learned that the network there had almost 300 new Wi-Fi access points and a number of new DAS antennas installed since Thanksgiving, in anticipation of a big traffic event Monday night. Our exclusive on-the-scene tests of the Wi-Fi and DAS network found no glitches or holes in coverage, which is probably part of the reason why so many people used so much data.

UPDATE: Here is the official press release from AT&T, which basically says the same thing our post does.

Stadium Tech Report: Corning, IBM bringing fiber-based Wi-Fi and DAS to Texas A&M’s Kyle Field

Kyle Field, Texas A&M University. Credit all photos: Texas A&M

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When Texas A&M’s newly renovated Kyle Field opens for the 2015 football season, its outside appearance will have changed dramatically. But from a networking perspective, what’s really different is hidden on the inside – namely, an optical fiber infrastructure designed to bring a new level of performance, cost savings and future-proofing to stadium network deployments.

While the use of optical fiber instead of copper cable in large networks isn’t exactly “new” in the core telecom or enterprise networking worlds, in the still-nascent field of stadium network deployments fiber has yet to make large inroads. But the promise of fiber’s ability to deliver much higher performance and greater future-proofing at lower installation costs in stadium situations may get a very visible poster child when Texas A&M’s football facility kicks off the 2015 season with a technology infrastructure designed to be among the most comprehensive in any stadium, collegiate or professional.

With a Wi-Fi network designed to support 100,000 concurrent connections, a robust DAS network with more than 1,000 antennas, and an IPTV deployment with more than 1,000 screens, the IBM-designed network based largely on Corning’s fiber-optical systems is incredibly impressive on paper – and it has already produced some eye-popping statistics this past season, when just a part of it came online during the “Phase 1” period of the two-phase $450 million Kyle Field renovation.

Phase 2, the final stage of the renovation and just now getting underway, began with the implosion of the stadium’s west stands; reconstruction is scheduled to finish in time for the 2015 season, producing a new enclosed-bowl structure that will seat 102,512 fans. And if the new network delivers as planned, those fans will be among the most-connected anywhere, with plenty of future-proofing to make sure it remains that way for the foreseeable future – thanks to fiber.

Driving on the left side of the street

What’s going to be new about Kyle Field? According to news reports some of the creature comforts being added include redesigned concession stands, so-called “Cool Zones” with air conditioning to beat the Texas heat, well-appointed luxury suites and new restrooms – including 300 percent more women’s bathrooms.

Scoreboard, Kyle Field

According to representatives from the school, the decision to make the new stadium a standout facility extended to its network infrastructure. “Our leadership decided that [the stadium renovation] would be leading edge,” said Matthew Almand, the IT network architect for the Texas A&M University System, the administrative entity that oversees university operations, including those at the flagship school in College Station, Texas. “There were some leaps of faith and there was a decision to be leading edge with technology as well.”

Though Phase 1 planning had started with a traditional copper-cable design for the network, Almand said a presentation by IBM and its “smarter stadium” team changed the thinking at Texas A&M.

“The IBM team came in and did a really good job of presenting the positive points of an optical network,” Almand said.

Todd Christner, now director of wireless business development at Corning, was previously part of the IBM team that brought the optical idea to Texas A&M. While talking fiber to copper-cable veterans can sometimes be “like telling people to drive on the left side of the street,” Christner said the power, scalability and flexibility of a fiber network fit well with the ambitious Kyle Field plans.

“The primary driving force [at Texas A&M] was that they wanted to build a state of the art facility, that would rival NFL stadiums and set them apart from other college programs,” Christner said. “And they wanted the fan [network] experience to be very robust.”

With what has to be one of the largest student sections anywhere – Christner said Texas A&M has 40,000 seats set aside for students – the school knew it would need extra support for the younger fans’ heavy data use on smartphones. School officials, he said, were also concerned about DAS performance, which in the past had been left to outside operators with less than satisfactory results. So IBM’s presentation of a better, cheaper alternative for all of the above found receptive ears.

“It was the right room for us to walk into,” Christner said.

IBM’s somewhat radical idea was that instead of having separate copper networks for Wi-Fi, DAS and IPTV, there would be a single optical network with the capacity to carry the traffic of all three. Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

Deploying now and for the future

Corning ONE DAS headend equipment.

Without going too deeply into the physics or technology, the benefits stem from a simple fact: optical fiber can carry far more bandwidth than copper, over longer distances, using less power. That advantage is one reason fiber is used extensively in core backbone networks, and has been creeping slowly closer to the end user through deployments like Verizon’s FiOS.

Why hasn’t fiber taken over completely? Mainly because in single-user deployments – like to a single home or office – it is still costly to replace systems already in the ground or in the wall, and for many users fiber’s capacity can be overkill. Fiber’s main benefits come when lots of bandwidth is needed and the scale of a project is large, since one big advantage is the elimination of a lot of internal switching gear, which takes up space and consumes lots of power.

Those reasons accurately describe the perfect bandwidth storm happening in networked stadiums these days, where demand seems to keep increasing on a daily basis. Some stadiums that were at the forefront of the wireless-networking deployment trend, like AT&T Park in San Francisco and AT&T Stadium in Arlington, Texas, have been in a near-constant state of infrastructure upgrades due to the ever-increasing needs for more bandwidth. And Isaac Nissan, product manager for Corning ONE, said new equipment like Wi-Fi access points with “smart” or multiple-input antennas are also going to help push the LAN world into more fiber on the back end.

But there’s another drawback to using fiber, one that has less to do with technology and more to do with history: Installers, integrators and other hands-on networking folks are generally more comfortable with copper, which they know and have used for decades. Fiber, to many, is still a new thing, requiring different skills and techniques for connecting and pulling wires, as well as for managing and administering optical equipment.

“There’s definitely a learning curve for some of the RF [industry] people, who have been doing coax for 20 years,” Nissan said. “Fiber is a little different.”

Texas A&M’s Almand admitted that bringing the stadium’s networking group into a new technology – fiber – was a challenge, but one with a worthy payoff.

Copper cable tray hardly filled by optical fiber

“There’s definitely been a gear-up cycle, getting to a new confidence level [with fiber],” Almand said. But he added that “sometimes it’s good to break out of your comfort zone.”

Lowering the IDF count

Christner said the Corning optical gear is at the center of the Kyle Field deployment, providing support for fan-facing Wi-Fi as well as Wi-Fi for back-of-the-house operations like point of sale; it also supports the stadium DAS and a network of more than 1,000 IPTV screens. Aruba Networks is the Wi-Fi gear supplier, and YinzCam is helping develop a new Kyle Field app that will include support for using smartphones as remote controls for IPTVs in suites.

On the Wi-Fi side, Christner said the finished network will have 600 APs in the bowl seating areas, and another 600 throughout the facility, with a stated goal of supporting 100,000 concurrent 2 Mbps connections. The DAS, Christner said, is slated to have 1,090 antennas in 50 sectors.
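Those stated goals imply some hefty aggregate numbers. Here is a quick sketch of the math (our calculation from the figures Christner gave, not a Corning or IBM design document):

```python
# Implied aggregate numbers from Kyle Field's stated design goals.
# Our arithmetic from the figures Christner gave, not a design document.

concurrent_users = 100_000   # concurrent Wi-Fi connections supported
per_user_mbps = 2            # per-connection throughput goal
das_antennas, das_sectors = 1_090, 50

aggregate_gbps = concurrent_users * per_user_mbps / 1_000
print(f"Wi-Fi design load: {aggregate_gbps:.0f} Gbps aggregate")  # 200 Gbps
print(f"DAS density: ~{das_antennas / das_sectors:.0f} antennas per sector")  # ~22
```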

Because the fiber design eliminates most intermediate switching gear, Christner said the Kyle Field network needs only 12 intermediate distribution frames (IDFs, the usually wall-mounted racks that support network-edge gear), as opposed to the 34 IDFs a legacy fiber/coax system would have required. In addition to using less power, the cabling needed to support the fiber network is a fraction of what would have been needed for coax.

One of the more striking pictures of the deployment is a 36-inch-wide cable tray installed for the original copper-network plan, which now carries just 10 inches of fiber-optic cable. Christner said the fiber network also provides a cleaner signal for the DAS, which already had a test run during the 2014 season, when 600 DAS antennas were deployed and lit.

“At the Ole Miss game we had 110,663 fans at the stadium, and according to AT&T on the DAS all their lights were green,” Christner said. “Via our completely integrated fiber optic solution, we are now able to provide the DAS with much higher bandwidth as well,” said Texas A&M’s Almand, who also said that the carriers have responded very positively to the new DAS infrastructure.

Up from the dust – a model for the future?

Antenna and zone gear box near top of stands

Also included in the design – but not being used – are an additional 4,000 spare fibers at 540 zone locations, which Christner said can be immediately tapped for future expansion needs. And all of this functionality and flexibility, he added, was being built for somewhere between one-third and 40 percent less than the cost of a traditional copper-based solution.

The proof of the network’s worth, of course, will have to wait until after the west stands are imploded, the new ones built, and the final pieces of the network installed. Then the really fun part begins, for the users who will get to play with everything from 38 channels of high-def TV on the IPTV screens to the multiple-angle replays and other features planned for the mobile app. At Texas A&M, IBM’s support squad will include some team members who work on the company’s traditionally excellent online effort for the Masters golf tournament, as well as the “smarter stadium” team.

For Texas A&M’s Almand, the start of the 2015 season will mark the beginning of the end, and a start to something special.

“If I were a country singer, I’d write something about looking forward to looking back on this,” Almand said. “When it’s done, it’s going to be something great.”

Stadium Tech Report: THE COLLEGE FOOTBALL ISSUE looks at university Wi-Fi deployments

If there were a college football playoff for stadium wireless network deployments, which four teams would be in? Electing myself to the committee, I think my top picks would be the same venues we’re profiling in our latest Stadium Tech Report – Baylor, Nebraska, Stanford and Texas A&M. All four are pursuing high-end networks to support a better fan experience, leading the way for what may turn out to be the largest “vertical” market in the stadium networking field – sporting venues at institutions of higher learning.

To be sure, network deployments at major universities in the U.S. are still at the earliest stages — in our reporting for our latest long-form report, we found that at two of the top conferences, the SEC and the Pac-12, only four schools total (two in each conference) had fan-facing Wi-Fi, with only one more planned to come online next year. Why is the collegiate market so far behind the pro market when it comes to network deployment? There are several main reasons, but mostly it comes down to money and mindset, with a lack of either keeping schools on the sidelines.

Leaders look for NFL-type experiences

But at our “playoff” schools, it’s clear that with some ready budget and a clear perspective, college stadiums don’t need to take a back seat to anyone, pro stadiums included. The networks, apps and infrastructure deployed for this season at Baylor’s McLane Stadium and Nebraska’s Memorial Stadium are among the best anywhere in sports, and the all-fiber infrastructure being put in place at Texas A&M should make that school’s Kyle Field among the most-connected if all work gets completed on time for next football season. Read in-depth profiles of these schools’ deployments, along with team-by-team capsule technology descriptions and an exclusive interview with Mississippi State athletic director Scott Stricklin, in our latest report, available for free download from our site.

We’d like to take a second here to thank our sponsors, without whom we wouldn’t be able to offer these comprehensive reports to you free of charge. For our fourth-quarter report our sponsors include Crown Castle, SOLiD, Extreme Networks, Aruba Networks, TE Connectivity, and Corning.

Stadium Tech Report: Arizona Cardinals get stadium ready for Super Bowl with Wi-Fi upgrade

University of Phoenix Stadium. Credit all photos: Arizona Cardinals. (click on photos for larger image)

As they get set to host their second Super Bowl this February, the IT team at the University of Phoenix Stadium in Glendale, Ariz., knows now what they didn’t know then: The big game requires a big wireless network. Bigger than you think.

“It’s funny to look back now on that first Super Bowl,” said Mark Feller, vice president of information technology for the Arizona Cardinals, speaking of Roman numeral game XLII, held in the barely 2-year-old facility on Feb. 3, 2008. With a couple Fiesta Bowls and one BCS championship game (2007) under his belt in a facility that opened with Wi-Fi and DAS, Feller said he and his team “thought we had a good handle” on what kind of network was required for a Super Bowl crowd.

The NFL, he said, begged to differ. Those college games might have been big, but the Super Bowl was bigger.

“We had the Fiesta Bowl that year at night, and when the game was over there were people from the NFL there wanting to know when they could set up,” said Feller in a recent phone interview. “This year, we’re much better prepared. We know what the water temperature is this time.”

Rip and replace, with more and better gear

Wi-Fi railing antennas

For Super Bowl XLIX, scheduled to take place in Glendale on Feb. 1, 2015, Feller and his team have not just tuned up their network — they have done a full rip and replace of the Wi-Fi system, installing new Cisco gear from back end to front, in order to support a wireless game-day demand that is historically second to none. Integrator CDW has led the Wi-Fi effort; Daktronics and ProSound installed the new video screens; and neutral host Crown Castle has overseen a revamp of the DAS, again with more antennas added to bolster coverage. In all, there have been more than $8 million in wireless improvements ahead of this year, Feller said, as well as another $10 million for two new video boards that are each three times larger than what was there before.

“The last three or four years there have been things we knew we needed to improve [before the Super Bowl],” Feller said. After extensive work with the NFL’s technical team — this time well before the Fiesta Bowl — Feller oversaw a “top to bottom” refurbishment that included replacing core Cisco networking gear with newer gear, and new and more Wi-Fi access points that now total somewhere north of the 750 mark, with some more to be added before the big game. The new network, which was in place for the start of the current NFL season, has undergone testing by CDW at each home game, Feller said. CDW also plans to expand the network outside the stadium before the Super Bowl, in part to handle the extra events that take place not just on game day but in the days leading up to the game.

“The plan is to install more [coverage] outside, in the plaza areas,” Feller said.

When it opened in 2006, the $455 million University of Phoenix Stadium was one of the first with full-bowl Wi-Fi, using Cisco gear from the inside out. “Cisco was in here before they called it [their solution] ‘connected stadium’,” Feller said. From core network switches to firewalls to edge switches, this year there is all new Cisco gear in the venue, as well as new 3700 series APs, with panel antennas and antennas in handrails.

“Handrail [antennas] are sometimes a bit of a challenge, because you need to drill through concrete that’s 40 feet up in the air, behind another ceiling,” said Feller, describing one particular design challenge. Another one was mounting antennas on drop rods from the catwalks below the stadium’s retractable roof, to serve the upper-area seating. There are also some new Wi-Fi APs on the front row of the seating bowl, pointing up into the crowd.

“It was a fun project,” Feller said.

Stadium with roof open

All on board for the DAS

The upgrade for the stadium’s DAS, led by Crown Castle, was just finished a few weeks ago, Feller said, and included more coverage outside the stadium as well, with antennas placed on light poles and on the stadium’s shell.

“Crown Castle did a great job of managing the carriers” on what is a 48-sector DAS, Feller said. “It [the upgrade] really required a lot of creative thinking from their engineers.”

Since the stadium was originally designed with wireless in mind, Feller and his team didn’t need to build new head end room for the DAS upgrades. “But I wouldn’t say we have plenty of space left,” he added. “We’ve got a lot of new equipment.”

Though all the major carriers are expected to be on the DAS by the big game, league partner Verizon Wireless should have some special projects up its sleeve, including another demonstration of its LTE Broadcast technology, which sends a streaming channel of live video over a dedicated cellular link.

New Cardinals app a preview of Super Bowl version?

The Cardinals also had a new version of the game-day team app for this season, built by stadium-app leader YinzCam. According to Feller the new app supports three different live video feeds, as well as instant replays.

Wi-Fi antenna on railing

“It’s really cool to have that ability to watch things like a touchdown pass at the end of the game,” Feller said. And while no details have yet been revealed, in an interview with NFL CIO Michelle McKenna-Doyle earlier this year MSR learned that the league and YinzCam are working on a Super Bowl app with its own new bells and whistles. (Stay tuned for more info on the Super Bowl app.)

In addition to two more regular-season home games in December, the University of Phoenix Stadium will have at least a couple more dry runs to help test the network, during the Dec. 31 Fiesta Bowl and the NFL’s Pro Bowl, scheduled for Jan. 25. And though the Cardinals lost to the Atlanta Falcons Sunday, at 9-3 they are still tied with the Green Bay Packers for the best record in the NFC, something that has the Phoenix faithful optimistic about the postseason.

“We’re going to get some more test runs, on New Year’s Eve and during the Pro Bowl,” Feller said. “And maybe some home playoff games as well!”

(more photos below)

Wi-Fi antenna in roof rafters

More antennas in rafters

Wi-Fi antenna under overhang

Stadium Tech Report: With advanced wireless network and app, Baylor brings ‘NFL Experience’ to McLane Stadium

McLane Stadium, Baylor University. (click on any photo for larger image) Credit all photos: Baylor University

Just a few years ago, the Baylor University football program wasn’t a topic of national conversation. But now after a Heisman trophy, a Big 12 championship and perennial top rankings, Baylor is doing its best to stay at the front of the college football pack — and that effort extends to its new stadium, where Baylor has put in place a wireless network and a feature-filled app designed to bring an “NFL experience” to the Waco, Texas campus.

Now in its first season at the brand-new McLane Stadium, Baylor is already delivering an in-stadium fan technology experience that, like the team itself, ranks highly in the nation. Thanks to a Wi-Fi deployment from Extreme Networks, a DAS from AT&T and a new stadium app from sports-app leader YinzCam, Baylor is able to bring high-quality wireless connectivity to all parts of the 45,140-seat facility, along with advanced app features like live and on-demand streaming action video, as well as seating and parking maps for the new facility.

Like the recently opened Levi’s Stadium in Santa Clara, Calif., Baylor had an advantage with McLane Stadium in being able to make technology part of the original design, instead of having to retrofit it later. “It’s an amazing opportunity to have a new stadium and be able to plan for technology from the bottom up,” said Pattie Orr, Baylor’s vice president for information technology and dean of university libraries, in a recent phone interview. “It sure is nice to have technology in mind from the beginning.”

The house that RG3 built

McLane Stadium – Opening Game Day vs SMU

But just like the Baylor team, the plan for the new stadium and its technology underpinnings had to come together quickly. Even late in the 2011 season, when then-Baylor quarterback Robert Griffin III was just starting to turn heads with his on-field heroics, the idea of building a new football facility on campus hadn’t yet been formally approved. In 2011, Baylor still played games in Floyd Casey Stadium, a 50,000-seat facility that opened in 1950, located about four miles from campus.

And then, RG3 happened. As many people associated with Baylor will tell you, when the Bears and Griffin quickly vaulted into the national consciousness — especially after a dramatic RG3-led win over Oklahoma and his subsequent winning of the Heisman trophy — the push for a new stadium quickly gathered steam. (For more background, read this excellent history of the stadium’s origin from the Waco Tribune-Herald.)

“Two years ago we still weren’t sure the stadium was coming,” said Bob Hartland, associate vice president for IT infrastructure, who also participated in the phone interview. “Then there was the Heisman trophy, and everything started becoming a reality.”

After the university gave its formal approval in July of 2012, planning for the $266-million facility could begin — with Orr and Hartland’s tech team having to employ a bit of crystal-ball thinking.

“We knew we needed to deliver for mobile devices,” said Hartland. “The hard thing was trying to predict what was going to happen 2 years out [when the stadium would open].”

Pattie Orr, VP of IT for Baylor

Bringing an ‘NFL experience’ to Waco

And even though Baylor is private and smaller than its Big 12 conference competitors, the IT team made no small plans. “We wanted an NFL experience,” Orr said. To her, that meant an interactive mobile app that delivered live video to each and every seat.

“The best thing we could do was be forward looking,” said Orr. “What we pictured was, ‘could we have it in our hands?’ In the stadiums of the past, fans loved the big screens, and they still do. But there’s nothing like having it right in the palm of your hand.”

Orr said the Baylor IT team visited some existing stadiums with advanced networks, like AT&T Stadium and Gillette Stadium, as part of a technology vetting process. Eventually the Baylor IT department whittled the Wi-Fi selection down to three different approaches — one that included under-the-seat antennas, one that proposed an under-the-concrete solution, and one that relied mainly on overhead APs. That final one, from Wi-Fi provider Extreme Networks, became the winning bid, in part because the Baylor team liked its less-intrusive technology.

If you look closely under the overhangs, you can see Wi-Fi APs

“Overhead [APs] are just less intrusive, operationally,” said Hartland, noting the need to drill holes in concrete and do special cleaning or weather-hardening for under-the-seat APs. If you look at McLane, you can see multiple overhang areas around the entire seating bowl, which facilitates overhead AP placements. According to news reports, the Extreme Wi-Fi deployment has 330 APs.
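For scale, a quick back-of-the-envelope sketch (our arithmetic, using the published seat and AP counts plus the 33 percent take rate Orr mentions below):

```python
# Rough per-AP load at McLane Stadium; our back-of-the-envelope math
# using figures from this story, not Baylor's own engineering numbers.
seats = 45_140      # McLane Stadium capacity
aps = 330           # Extreme Wi-Fi APs, per news reports
take_rate = 0.33    # peak Wi-Fi take rate Orr cites

print(f"~{seats / aps:.0f} seats per AP")                    # ~137
print(f"~{seats * take_rate / aps:.0f} active users per AP") # ~45
```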

Baylor’s Orr also liked the Extreme Purview Wi-Fi analytics software, which provides detailed views of network usage.

“Analytics provide what you need to know,” Orr said. “If you’re in the dark on the fan experience, and don’t know which apps are being used, how can you tune it or make it better?”

On the DAS side, Baylor went with AT&T as the neutral host, and AT&T has already signed up main competitor Verizon Wireless as a client, meaning the two largest providers of cellular service both have enhanced coverage at McLane Stadium through the AT&T DAS, which reportedly has 486 antennas.

“Our goal was a high-density solution, for both cellular and Wi-Fi,” Orr said.

Solving for the standing-on-the-seat problem

Wi-Fi "coach" helps out at McLane Stadium.

Wi-Fi “coach” helps out at McLane Stadium.

While the network has been an early success — Orr said Baylor is already seeing Wi-Fi take rates as high as 33 percent of all attendees at games this season — a few interesting fixes have also been necessary, including re-tuning Wi-Fi APs to get around the interference quirk of students standing on their seats.

Call it technology meeting tradition, with tradition winning: the Baylor tradition of underclass students standing for the whole game turned into students standing on top of seats in their new section at McLane Stadium — a shift that led to unexpected interference with the original Wi-Fi antenna placements. (One of the quirks of Wi-Fi networks is that the water inside human bodies is a very effective blocker of Wi-Fi signals.)

“We had not anticipated the students standing on seats, and that extra 20 inches really made a difference,” Hartland said. According to another story in the local paper, large band instruments also blocked Wi-Fi signals. Hartland said that since the original problems the IT team and Extreme have developed work-arounds and new antenna placements to fix the issue.

“It’s pretty fantastic that our students are so excited,” said Orr of the standing-interference issue. “You don’t see things like that much at the NFL level.”

Live video and app ‘coaches’

On the app side, Baylor went with YinzCam, a company with numerous stadium apps under its belt for all the top U.S. professional leagues. YinzCam, like Extreme, is also a partner with the NFL, giving YinzCam an edge in winning NFL stadium deployments.

Like other stadium apps, the Baylor In-Game app from YinzCam features multiple camera-angle choices for replays and live streaming video, as well as a host of stats and other team information. Important to Baylor and its new stadium are maps that help direct fans to parking areas, as well as to specialty concession stands in a facility that is new to everyone this season.

Using the app at McLane Stadium

“We have some well-known smoked onion rings [at the stadium], and the app can help fans find which stands are selling them and how to get there,” Orr said. The parking feature, she said, can send text directions to fans. Also special to Baylor is a “brick finder,” an app feature that lets fans who participated in a stadium fundraiser locate the brick bearing their name.

One more NFL-like feature with a collegiate twist is Baylor’s embrace of the Extreme “Wi-Fi coaches” program, which has network-knowledgeable staff members walking around stadiums in highly visible gear offering hands-on help with connectivity and stadium app use. While Extreme has used the coaches program at pro venues like New England and Philadelphia, at Baylor Orr took advantage of in-house “talent,” using students in the MIS program as roaming “coaches,” giving them some real-world experience at network troubleshooting and customer service.

“We put them [the student coaches] in bright vests and have them stationed near concession stands, to offer a friendly face,” Orr said. “They’re terrific, and they give us real-time feedback.”

Orr said Baylor also has a journalism department student intern leading the technology team’s social media effort, which encourages fans to tweet out problems or questions they might have.

“With my gray hair I’m not too good on social media, but one thing I learned is that we need to embrace it,” said Orr. Hartland said that YinzCam reps told Baylor they “just need to get out there” on social media to support the app, and he reports pleasant surprises when the IT team tweets back.

“On social media, [fans] don’t expect to be contacted,” Hartland said. “They really appreciate it when we get back to them.”