Texas A&M’s Kyle Field: A network built for speed

Full house at Kyle Field. All Photos: Paul Kapustka, MSR (click on any photo for a larger image)

Is there a combined stadium Wi-Fi and DAS deployment that is as fast as the one found at Texas A&M’s Kyle Field? If so, we haven’t seen or heard of it.

In fact, after reviewing loads of live network-performance data of Kyle Field’s new Wi-Fi and DAS in action, and after maxing out the top levels on our speed tests time after time during an informal game-day walk-around, we’ve come to the conclusion that Kyle Field has itself a Spinal Tap of a wireless deployment. Meaning: if other stadium networks stop at 10, this one goes to 11.

Movie references aside, by the numbers Kyle Field’s wireless network performance is quite simply unequaled by any other large public venue we’ve tested, in terms of raw speed and the ability to deliver bandwidth. With DAS and Wi-Fi speed measurements ranging from 40 Mbps to 60+ Mbps pretty much everywhere we roamed inside the 102,512-seat venue, it’s a safe bet that the school’s stated desire to “build the best network” in a stadium hit its mark.

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

On one hand, the network’s top-line performance is not that much of a surprise, since as part of an overall Kyle Field renovation that has already cost an estimated $485 million, the optical-based Wi-Fi, DAS and IPTV deployment inside the Aggies’ football palace is probably among the most expensive and expansive in-venue networks ever built. According to Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System, the total cost of the optical-based Wi-Fi, DAS and IPTV network was “somewhere north of $20 million.”

Remote optical cabinet and Wi-Fi AP at Kyle Field.

The nation’s biggest cellular carriers, AT&T and Verizon Wireless, paid nearly half the network’s cost – $10 million, according to Ray – and with the dedication and work crews brought to the table by main suppliers IBM and Corning, along with Wi-Fi gear vendor Aruba, you have components, expertise and budgetary freedom that perhaps only a small group of venue owners could hope to match.

But just throwing money and technology at a stadium doesn’t necessarily produce a great network. In a venue the size of the new Kyle Field there needs to be great care and innovative thinking behind antenna placement and tuning, and in that arena Texas A&M also had the guiding hand of AmpThink, a small firm with oversized smarts in Wi-Fi deployment, as evidenced by its impressive track record of helping with wireless deployments at the biggest events, including several recent Super Bowls.

The core decision to go with optical for the network’s guts, and a tactical decision to put a huge chunk of the Wi-Fi APs in under-seat deployments are just part of the strategy that produced a network that – in A&M fan parlance – can “BTHO” (Beat The Hell Out) of most challengers.

Since it’s almost impossible to directly compare stadiums and venue network performances due to all the possible variables, you’ll never hear us at Mobile Sports Report declare a “champion” when it comes to click-bait themes like “the most connected stadium ever.” Given its remote location some three hours south of Dallas in College Station, Texas, Kyle Field will almost certainly never face the ultimate “big game” pressures of a Super Bowl or a College Football Playoff championship, so the network may never know the stress such large, bucket-list gatherings can produce. And so far, there aren’t many ambitious fan-facing applications that use the network, like in-seat food delivery or wayfinding apps found in other stadiums.

But as part of the football-crazy SEC, and as the altar of pigskin worship for some of the most dedicated fans seen anywhere, Kyle Field is sure to see its share of sellout contests against SEC rivals that will push wireless usage to new heights, especially as more fans learn about and use the still-new system. Though total Wi-Fi usage at the Nov. 7 game we attended versus Auburn (a 26-10 Texas A&M loss) was “only” 2.94 terabytes – a total hampered by cold, windy and rainy conditions – an Oct. 17 game earlier in the season against Alabama saw 5.7 TB of Wi-Fi usage on the Kyle Field network, a number surpassed only by last year’s Super Bowl (with 6.2 TB of Wi-Fi use) in terms of total tonnage.

At the very least, the raw numbers of total attendees and the obvious strength of the still-new network is sure to guarantee that Kyle Field’s wireless deployment will be one of the most analyzed stadium networks for the foreseeable future.

Texas A&M student recording the halftime show.

What follows are some on-the-spot observations from our visit, which was aided by the guidance and hospitality of Corning project manager Sean Heffner, who played “tour guide” for part of the day, giving us behind-the-scenes access and views of the deployment that are unavailable to the general fan audience.

An off-campus DAS head end

This story starts not inside Kyle Field, but in a section of town just over three miles away from the stadium, on a muddy road that curves behind a funky nursery growing strange-looking plants. A gray metal box, like a big warehouse, is our destination, and the only clue as to what’s inside is the big antenna located right next to it. This structure is the Kyle Field DAS head end, where cellular carrier equipment connects to the fiber network that will bring signals to and from fans inside the stadium.

Why is the head end so far away? According to Corning’s Heffner, there was no room for this huge space inside the stadium. But thanks to the use of optical fiber, the location is not a problem, since signals traveling at the speed of light make 3.3 miles an insignificant span.
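To put that distance in perspective, here’s a quick back-of-the-envelope calculation (a minimal sketch; the roughly two-thirds-of-vacuum speed of light in glass fiber is a standard figure, and the distance comes from the story):

```python
# One-way propagation delay over the ~3.3 miles of fiber between
# the off-site DAS head end and Kyle Field.
C = 299_792_458          # speed of light in a vacuum, m/s
FIBER_FACTOR = 0.68      # light in glass fiber travels at ~68% of c
MILES_TO_METERS = 1609.344

distance_m = 3.3 * MILES_TO_METERS
delay_us = distance_m / (C * FIBER_FACTOR) * 1e6
print(f"One-way delay: {delay_us:.0f} microseconds")  # ~26 us
# Negligible next to the milliseconds added by the Wi-Fi/LTE radio
# link and the wider internet -- hence "an insignificant span."
```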

It might be helpful to back up a bit if you haven’t heard the full story of the Kyle Field deployment, which we told last year when the job was halfway completed. Though the stadium rebuild originally called for copper-based networks, a last-minute audible championed by Texas A&M chancellor John Sharp sent the school down a decidedly untraditional path: building a stadium network with a single optical-based core for the Wi-Fi, DAS and IPTV networks. The kicker? Not only would this network have huge capacity and be future-proofed against growth, it would actually cost less than a comparable copper-based deployment. If it got built on time, that is.

Though the pitch – better performance, far more capacity, less space, lower costs – might sound a bit too good to believe, most of it is simply the physics advantages of fiber over copper, well known in the core telecom and large-enterprise networking worlds, applied to a stadium setting.

One of the many maxed-out speed tests we took at Texas A&M’s Kyle Field.

Without going too deeply into the physics or technology, the benefits stem from the fact that optical fiber can carry far more bandwidth than copper, over longer distances, using less power. Those advantages are why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the user’s doorstep through deployments like Verizon’s FiOS.

And that’s also why Texas A&M could put its DAS head end out in a field where it’s easier to expand (no space constraints): fiber’s performance over distance makes the location somewhat irrelevant. Corning’s Heffner also said the DAS can be managed remotely, so staff doesn’t need to be physically present to monitor the equipment.

Of course, there was the small matter of digging trenches for the optical fiber to get from the head end to the stadium, but again, for this project it is apparent that getting things done mattered more than strictly worrying about costs. Beyond the cash the carriers put in, vendors and construction partners contributed extra effort and resources – in part, probably, because the positive publicity of being part of such an ambitious undertaking makes any extra costs easy to justify.

Keeping the best fans connected and happy

From the head end, the fiber winds its way past apartment buildings and a golf course to get to Kyle Field, the center of the local universe on football game days. Deep inside the bowels of the venue is where the fiber meets networking gear, in a room chilled to the temperature of firm ice cream. Here is where the human element that helps keep the network running spends its game days, wearing fleece and ski jackets no matter what the temperature is outside.

See the white dots? Those are under-seat Wi-Fi APs

In addition to Corning, IBM and AmpThink employees, the room during our visit also held a representative from YinzCam, a rarity for a company that prides itself on having its stadium and team apps run without local supervision. But with YinzCam recently named a partner in IBM’s nascent stadium technology practice, it’s apparent that the Kyle Field network is more than just a great service for the fans in the seats – it’s also a proof-of-concept network being closely watched by all the entities that helped bring it together, who for many reasons want to catch any issues before they become problems.

How big and how ambitious is the Kyle Field network? From the outset, Corning and IBM said the Wi-Fi portion was designed to support 100,000 connections at a speed of 2 Mbps each, so that if everyone in the stadium decided to log on, they’d all have decent bandwidth. So far, that upper limit hasn’t been tested.

What happened through the first season was a “take rate” averaging in the 35,000-37,000 range, meaning that on a given game day roughly one-third of the fans in attendance used the Wi-Fi at some point. The concurrent-user peaks – the highest number of fans on the network at the same time – generally fell in the mid-20,000 range, according to figures provided by Corning and AmpThink. So instead of 100,000 fans connecting at 2 Mbps, this season saw about a quarter of that number connecting at much higher data rates, if our ad hoc speed tests are any proof.
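The arithmetic behind those numbers is easy to sketch out (a rough back-of-the-envelope; the assumption that the 100,000-user, 2 Mbps design target translates into roughly 200 Gbps of deliverable aggregate capacity is ours, not a vendor-published spec):

```python
# Rough math: Kyle Field's Wi-Fi design target vs. first-season usage.
SEATS = 102_512
DESIGN_USERS = 100_000
DESIGN_RATE_MBPS = 2               # per-user design target

aggregate_gbps = DESIGN_USERS * DESIGN_RATE_MBPS / 1_000
print(f"Design aggregate: {aggregate_gbps:.0f} Gbps")          # 200 Gbps

unique_users = 36_000              # midpoint of the 35,000-37,000 range
print(f"Take rate: {unique_users / SEATS:.0%}")                # ~35%, one-third

peak_concurrent = 25_000           # "mid-20,000 range" concurrent peak
per_user_mbps = aggregate_gbps * 1_000 / peak_concurrent
print(f"Share per concurrent user: {per_user_mbps:.0f} Mbps")  # ~8 Mbps
# A per-user floor well above the 2 Mbps design target; our actual
# speed tests ran far higher still.
```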

Our first test that Saturday [Nov. 7, 2015], just inside a lower-level service entryway, hit 41.35 Mbps on the download and 18.67 Mbps on the upload, on a Verizon iPhone 6 Plus over the stadium’s DAS. And yes, that download speed was the slowest we’d record all day, on either the DAS or the Wi-Fi.

Inside the control room we spent some time with AmpThink CEO Bill Anderson, who could probably fill an entire football game talking about Wi-Fi network deployment strategies if he didn’t have a big network to watch. The top thing we learned on this Saturday is that Anderson and AmpThink are solid believers in under-seat AP placement for performance reasons; according to Anderson, fully 669 of Kyle Field’s 1,300 APs can be found underneath seats. Anderson is also a stickler for “real” Wi-Fi usage measurements – weeding devices that may have autoconnected to the Wi-Fi network but never used it out of the “unique user” totals, and taking bandwidth measurements at the network firewall to see how much “live” bandwidth is truly coming and going.
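Anderson didn’t spell out exactly how AmpThink does that filtering, but the idea can be illustrated with a short sketch: a device has to move some minimum amount of traffic before it counts as a “real” user. (The 1 MB threshold and the record format here are our hypotheticals, not AmpThink’s.)

```python
# Hypothetical illustration of weeding autoconnected-but-idle devices
# out of a "unique user" count; threshold and data shapes are our own.
from typing import NamedTuple

class ClientSession(NamedTuple):
    mac: str
    bytes_up: int
    bytes_down: int

MIN_REAL_BYTES = 1_000_000   # assume <1 MB total means "just autoconnected"

def real_unique_users(sessions: list[ClientSession]) -> int:
    """Count distinct devices that moved meaningful traffic."""
    return len({s.mac for s in sessions
                if s.bytes_up + s.bytes_down >= MIN_REAL_BYTES})

sessions = [
    ClientSession("aa:bb:cc:00:00:01", 42_000_000, 310_000_000),  # real fan
    ClientSession("aa:bb:cc:00:00:02", 12_000, 90_000),           # autoconnect
]
print(real_unique_users(sessions))   # -> 1
```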

On the road to College Station, Aggie pride is everywhere. Whoop!

AmpThink’s attention to detail includes deploying and configuring APs differently depending on which section they are located in – student sections, for example, are more densely packed with people than other sections so the APs need different tuning. Corning’s Heffner also said that the oDAS – the DAS just outside the stadium – got special attention due to the large numbers of tailgating fans, both before and during the games. At the Alabama game, Heffner said there were some 30,000 fans who remained outside the stadium during the contest, never coming inside but still wanting to participate in the scene.

AmpThink, Corning, IBM and others involved at Kyle Field all seem keen on finding out just how much bandwidth stadium fans will use if you give them unlimited access. The guess? According to Corning’s Heffner, the mantra of stadium networks these days seems to be: “If you provide more capacity, it gets consumed.”

The ‘real’ 12th man

After walking through a tunnel with a nearly full cable tray overhead (“It’d be even more loaded if we were using copper,” Heffner said) we went out into the stadium itself, which was just starting to fill. Though the overcast day and intermittent rain squalls might have kept other teams’ fans from showing up for a 5:30 p.m. local start time, that simply wasn’t the case at an A&M home game.

Some of the Wi-Fi and DAS download measurements we took at Kyle Field.

As someone who’s attended countless football games, small and large – including a Super Bowl and last year’s inaugural College Football Playoff championship game – I can honestly say that the level of fan participation at Texas A&M is like nothing I’d seen before. The student section alone spans two decks on the stadium’s east side and takes up 40,000 seats, according to stadium officials – simply dwarfing anything I’d ever witnessed. (Out of an enrollment of 57,000+, having 40,000 students attend games is incredible.) And outside of small high school crowds I’d never seen an entire stadium participate in all the school songs, the “yells” (do NOT call them “cheers” here) and the locked-arms, back-and-forth “sawing” dance without any need for scoreboard instruction.

Part of the stadium renovation that closed the structure into a bowl was, according to school officials, designed to make Kyle Field even more intimidating than it already was by raising the possible sound levels. Unfortunately, the night of our visit some early Auburn scores took some of the steam out of the crowd, and a driving, chilling rain that arrived just before halftime sent a good part of the crowd either home or into the concourses looking for warmth and shelter. (The next day, several columnists in the local paper admonished the fans who left early for their transgressions; how dare they depart a game whose outcome was still in doubt?)

But I’ll never forget the power of the synchronized “yells” of tens of thousands of fans during pregame, and the roar that surfaced when former Aggie QB Johnny Manziel made a surprise appearance on the field before kickoff. Seattle Seahawks fans may stake the pro claim to fan support, but if you want to determine the “real” 12th man experience you need to stop by Kyle Field and give your ears a taste of loud.

Controlling the TV with the app

If the students, alumni and other fans provide the vocal power, the money power that helped get the stadium rebuilt can be found in the new Kyle Field suites and premium seating areas, some of them on the venue’s west side, which was blown up last December and rebuilt in time for this past season.

Conduit reaching to an under-seat AP

Inside the All American Club – a behind-the-walls gathering area with catered food and bars that would not seem out of place in Levi’s Stadium or AT&T Stadium – we tested the Wi-Fi and got speeds of 63 Mbps down, 69 Mbps up; Verizon’s 4G LTE service on the DAS hit 48 Mbps/14.78 Mbps, while AT&T’s 4G LTE DAS checked in at 40 Mbps/22 Mbps.

In an actual suite where we were allowed to check out the IPTV displays, the speed tests got 67/67 for Wi-Fi and 57/12 for Verizon 4G LTE. So the well-heeled backers of A&M football shouldn’t have any problems when it comes to connectivity.

As for the IPTV controls, the new system from YinzCam solves a problem that has plagued stadium suites for as long as there have been suites: What do you do with the TV remote? What YinzCam did for Texas A&M was link the TV controls to a Texas A&M “TV Remote” app. By simply punching in a numerical code that appears at the bottom of the screen in front of you, anyone with access to a suite or club area with TVs can change the channel to a long list of selections, including multiple live game-day views (stadium screen, broadcast view) as well as other channels, like other games on ESPN’s SEC Network.

By pairing a static code number for each TV with a second set of numbers that randomly scrambles over time, the system smartly builds security into channel changing, and prevents someone who had been in a suite previously from changing the channels after they leave. The whole remote-control process took less than a minute to learn, and we had fun wandering through the club-level areas our pass gave us access to, changing screens as we saw fit.
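We weren’t shown the system’s internals, but a rotating code of the sort described can be built the same way TOTP one-time passwords work: derive the on-screen digits from a per-TV secret and the current time window. Here’s a hypothetical sketch (the secret, the five-minute interval and the four-digit length are all our assumptions):

```python
# Hypothetical sketch of a time-rotating TV pairing code, TOTP-style.
# A code jotted down in a suite stops working once the window rolls over.
import hashlib
import hmac
import time
from typing import Optional

ROTATION_SECONDS = 300   # assume the code scrambles every 5 minutes

def pairing_code(tv_secret: bytes, now: Optional[float] = None) -> str:
    """Derive the 4-digit code currently shown on a given TV."""
    window = int((now if now is not None else time.time()) // ROTATION_SECONDS)
    digest = hmac.new(tv_secret, str(window).encode(), hashlib.sha256).digest()
    return f"{int.from_bytes(digest[:4], 'big') % 10_000:04d}"

def can_pair(tv_secret: bytes, entered_code: str) -> bool:
    """The app pairs a phone to the TV only if the code is current."""
    return hmac.compare_digest(pairing_code(tv_secret), entered_code)

secret = b"per-tv-secret-provisioned-at-install"
code = pairing_code(secret)
print(code, can_pair(secret, code))   # e.g. "4821 True"
```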

Our favorite places to watch the game at Kyle Field were the loge-level lounges, where you could first purchase food and beverages, including alcoholic ones, at an inside bar and then sit at an outside seat with a small-screen TV in front of you for information overload. The Wi-Fi in the southwest corner loge lounge checked in at 67.03/62.93, so it was no problem being connected via mobile device, either.

What comes next for the Kyle Field network?

Even though the rain had started coming down harder, we left the comfort and warmth of the club levels to wander around the stadium’s upper decks, including the student section, where we watched numerous fans taking pictures or videos of the band’s halftime performance. Clearly most everyone in Kyle Field had gotten the message, and wasn’t afraid they wouldn’t connect if they used their mobile device at the game, even among 102,000 of their closest friends.

Antennas on flag poles atop seating

The question now for Kyle Field is what to do next with its network. The most obvious place for innovation or new features is a stadium-centric app, one that could provide services like a wayfinding map. Maybe it was all our round-the-stadium wandering, but we had trouble finding our way around, and any building that seats 102,000-plus could use an interactive map. It might also be interesting to tie a map to concessions – the night we visited, there were long lines at the few hot chocolate stands due to the cold weather; in such situations you could conceivably use the network to find out where hot chocolate stands were running low, maybe open new ones and alert fans through the app.

We’re guessing parking and ticketing functions might also be tied to the app in the future, but for now we’ll have to wait and see what happens. One thing in Kyle Field’s favor for the future: thanks to the capacity of the optical network buildout, the stadium already has thousands of spare fiber connections that aren’t currently being used. That means when it’s time to upgrade or add more DAS antennas, Wi-Fi APs or whatever comes next, Kyle Field is already wired to handle it.

For the Nov. 7 game at Kyle Field, the final numbers included 37,121 unique users of the Wi-Fi network, and a peak concurrent user mark of 23,101, recorded near the end of the third quarter. The total traffic on the Wi-Fi network that night was 2.94 TB – perhaps low or average for Kyle Field these days, but it’s helpful to remember that just three years ago that was right around the total Wi-Fi data used at a Super Bowl.

Until the next IBM/Corning network gets built in Atlanta (at the Falcons’ new Mercedes-Benz Stadium, slated to open in 2017), the Kyle Field network will no doubt be the center of much stadium-technology market attention, especially if its operators ever do manage to get 100,000 fans to use the Wi-Fi all at once. While A&M’s on-the-field fortunes in the competitive SEC are a yearly question, the performance of the network in the Aggies’ stadium isn’t; right now it would certainly be one of the top four seeds, if not No. 1, if there were such a thing as a college stadium network playoff.

What we’re looking forward to is more data and more reports from a stadium with a network that can provide “that extra push over the edge” when fans want to turn their connectivity dial past 10. Remember, this one goes to 11. It’s one more.

(More photos below! And don’t forget to download your copy of the STADIUM TECH REPORT for more!)

Panoramic view of Kyle Field before the 102,000 fans fill the seats.

Some things at Kyle Field operate at ‘traditional’ speeds.

Outside the south gate before the game begins.

Overhang antenna in the middle section of the stadium.

Fans at College Football Playoff championship game use 4.9 TB of Wi-Fi data, 3.9 TB of DAS from AT&T and Verizon

Alabama coach Nick Saban hoists the college championship trophy. Photo by Kent Gidley / University of Alabama

The exciting national championship game Monday night between Alabama and Clemson also resulted in a big night for Wi-Fi and cellular usage at the University of Phoenix Stadium in Glendale, Ariz., with 4.9 terabytes of Wi-Fi data consumed, according to stadium network officials.

While the number didn’t set a stadium record — the 6.23 TB of Wi-Fi used at Super Bowl XLIX last February in the same venue is still the highest single-game Wi-Fi mark we’ve seen — the 4.9 TB used Monday nearly matches the total from last year’s inaugural College Football Playoff championship game at AT&T Stadium in Arlington, Texas, where 4.93 TB of Wi-Fi was used. It’s worth noting, however, that Monday night’s game had 75,765 fans in attendance, almost 10,000 fewer than last year’s crowd of 85,689 at the first playoff championship. So at the very least, Monday’s fans used more data per attendee than last year’s crowd did.
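The per-fan comparison is easy to verify with the attendance and Wi-Fi totals above (a quick sketch, using decimal units):

```python
# Per-attendee Wi-Fi use: this year's title game vs. last year's.
games = {
    "2016 CFP title game (UoP Stadium)":  (4.90, 75_765),
    "2015 CFP title game (AT&T Stadium)": (4.93, 85_689),
}
for name, (tb_used, attendance) in games.items():
    mb_per_fan = tb_used * 1_000_000 / attendance   # TB -> MB
    print(f"{name}: {mb_per_fan:.0f} MB per fan")
# ~65 MB per fan this year vs. ~58 MB last year: more data per
# attendee, even with the smaller crowd.
```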

On the cellular side of things, however, AT&T reported that data usage on its DAS network Monday night exceeded the total from last year’s Super Bowl, with 1.9 TB carried Monday to top the 1.7 TB AT&T recorded at Super Bowl XLIX. UPDATE, 1/26/16: Verizon has followed up with a report claiming 2 TB of DAS traffic at the event. So for right now the wireless total from Monday’s game stands at 8.8 TB, a number that still might grow if we ever hear from Sprint or T-Mobile.

Mark Feller, vice president of information technology for the Arizona Cardinals, said that the University of Phoenix Stadium Wi-Fi network saw 23,306 unique devices connect Monday night, with a peak concurrent connected total of 17,297 devices. The stadium network also saw an additional 1.2 TB of wired data used Monday night, primarily from press and photographer Ethernet connections, Feller said.

The 4.9 TB mark unofficially puts Monday’s game in the “top four” of highest-ever single game Wi-Fi data totals we’ve seen, behind only last year’s Super Bowl, an Alabama game at Texas A&M’s Kyle Field this fall that hit 5.7 TB, and (barely) last year’s college championship game. All eyes in the Wi-Fi totals world now turn to Levi’s Stadium, where Super Bowl 50 takes place Feb. 7. Will the 6.2 TB mark survive, maybe showing that fan data use at big games has peaked? Or will a new record be set?

AT&T: NFL fans used 55% more DAS data this year

AT&T customers who visited NFL stadiums used 55 percent more cellular traffic this season than last, according to year-end figures from AT&T.

In the 31 NFL venues with an AT&T DAS, AT&T customers used 132.8 terabytes of cellular data this NFL season, with the Dec. 14 Monday night game between the New York Jets and the Miami Dolphins topping the single-game charts at 1.6 TB of DAS data, according to AT&T. It’s appropriate that Sun Life Stadium had the biggest data game, since Miami’s home also led the NFL in average DAS data used, at 1.4 TB per game. Close behind in second place for average DAS use was AT&T Stadium in Arlington, Texas, where the average hit 1.257 TB this season. Third was San Diego’s Qualcomm Stadium with 1.085 TB, and fourth was Levi’s Stadium in Santa Clara, Calif., with an average of 1.054 TB per game on the AT&T DAS.

Commentary: Wi-Fi and DAS ain’t cheap — but can your venue afford not having them?

If I had to guess, I would bet that our news that Texas A&M’s new optical-based stadium network cost “north of $20 million” to build will be one of the most talked-about things in the stadium technology world for the near future. While some may ask “who really has that kind of money to spend” on a stadium network, I think there is an equal question in the opposite direction: Can you afford not to spend that much (or at least as much as you can) to make your network as good and future-proof as it can be?

Our cover story about the new deployment at A&M’s Kyle Field in our latest STADIUM TECH REPORT (which you can download for free) may be somewhat of an outlier, since Texas A&M is clearly among the subset of colleges and universities that don’t really have the budgetary concerns others might. Yet it’s also instructive to look around at Texas A&M’s peers at the big-time college football level and see how few of them have even started down the road toward a top-level stadium network.

Some schools with “big” football programs (which regularly attract sellout crowds and have plenty of income on hand) have certainly built great networks of their own, including schools we’ve profiled, like Wisconsin, Nebraska, Baylor and more. But there are still many more schools, even those with successful, money-making operations, that haven’t put high-speed wireless networks into their venues. The biggest question may be for them: How much longer will your fans put up with the feared “no signal” problem? Especially as the kids of today become the potential ticket-buying alums you count on for the future?

It’s not about watching the phone at the game

To be sure, we still don’t think that anyone – anyone – goes to a sporting event to stare at their phone. There is still so much to the live game-day experience, the smells, sounds and tribal fun, that it will always outweigh whatever entertainment or pleasure one might derive from their mobile device.

Dallas fan in mobile action at AT&T Stadium. Photo: Phil Harvey, MSR

That being said, it’s also true that our society has already become one that is used to being able to connect everywhere; that’s especially so in public and social situations, where the ability to stay in touch not only facilitates face-to-face meetings (meet you there!) but also lets us stay close to others who can’t physically be with us (wish you were here!).

Time and time again, when we profile venues that have installed new wireless networks, we ask about the reasons behind the deployment – and almost always, fans complaining about not being able to connect is one of the top woes. Before the stadium refurbishment at Texas A&M, chancellor John Sharp’s office was “flooded” with emails after every home game, complaining about two things in particular: The lack of women’s restrooms, and the terrible cellular reception. They’re both plumbing problems, but some people still don’t seem to see the urgency to solve the second kind, the one that uses “pipes” to connect phones.

For the near future, it may be easy to ignore the problem and say it’s not a priority, that fans come to watch the games, not their phones. But ignoring the reality of the need for people to stay connected seems a bad way to treat paying customers; and every day your venue doesn’t have a network is another day lost in the possible pursuit of a closer relationship with ticket-buyers, and the potential digital-supported revenue ideas that are just starting to emerge.

While we’re guessing that not every institution can support a $20 million network (even if the wireless carriers are paying half the cost), there are many other ways to skin this cat, as other profiles in our most recent STADIUM TECH REPORT issue point out. By partnering with Boingo, Kansas State was able to get both a DAS and a Wi-Fi network built; and at Ole Miss, a partnership with C Spire got a Wi-Fi network deployed at Vaught-Hemingway Stadium, albeit one where non-C Spire customers have to pay a small fee ($4.99 per game) to use it.

Maybe charging a small fee isn’t the ideal situation, but it’s better than no network at all, especially if you want to attend a game but still remain somewhat connected to the outside world. And we haven’t even mentioned the public safety aspects of ensuring adequate cellular and/or Wi-Fi coverage in your venue, which might prove indispensable in times of emergency.

And even at stadiums we’ve visited where there is advanced cellular and Wi-Fi inside the venue itself, there is often poor or no connectivity outside. At Texas A&M, we heard tales of some 30,000 people who remained in the tailgating lots during the game, never wanting to come inside. While not all schools may have that kind of be-there fervor, the idea of an “event city” is taking shape at many venues, pro and collegiate.

At the University of Phoenix Stadium in Glendale, Ariz., for example, a Crown Castle DAS brings connectivity to the extensive mall/restaurant area surrounding the football stadium and hockey arena; both the Green Bay Packers and the Chicago Cubs are planning outside-the-wall fan areas that will have Wi-Fi and DAS coverage to keep people connected on the way to or from the games. For many venues, outside is now as important as inside when it comes to wireless coverage.

So the question is, should your institution spend the necessary money to put great networks into your most public places, or is connectivity still a luxury your venue can’t afford? We’ll admit we don’t know all the answers to those twin questions, but if you have a story or opinion one way or the other we’re ready to help you tell your tale. Let’s hear from more of you, so that everyone can learn.

NFL Stadium Tech Reviews — NFC West

Editor’s note: The following team-by-team capsule reports of NFL stadium technology deployments are an excerpt from our most recent Stadium Tech Report, THE PRO FOOTBALL ISSUE. To get all the capsules in one place as well as our featured reports, interviews and analysis, download your free copy of the full report today.

NFC WEST

Reporting by Paul Kapustka

View from the Levi’s 501 Club section seats, 2014 season. Photo: Paul Kapustka, MSR

San Francisco 49ers
Levi’s Stadium
Seating Capacity: 68,500
Wi-Fi – Yes
DAS – Yes

Though the San Francisco 49ers didn’t quite live up to expectations last year, the team’s new stadium delivered on its technological promise, especially on the Wi-Fi front, where service was solid from day 1, supporting innovative stadium-app features like food delivery to every seat and instant replays. And while there were no complaints about the stadium’s DAS, its carrier customers paid deployment firm DAS Group Professionals to completely replace the system this offseason, to better handle the even bigger traffic expected at Super Bowl 50, which takes place at Levi’s in February.

Arizona Cardinals
University of Phoenix Stadium
Seating Capacity: 63,500
Wi-Fi – Yes
DAS – Yes

If you want great Wi-Fi, by all means have your facility host a Super Bowl. The latest recipient of a high-fidelity network (using Cisco gear and deployed by CDW), the University of Phoenix Stadium set Wi-Fi records last February at the big game, with more than 6 terabytes of data used.

Seattle Seahawks
CenturyLink Field
Seating Capacity: 72,000
Wi-Fi – Yes
DAS – Yes

CenturyLink Field, once a joke because it was a stadium named after a phone company that had poor connectivity, is now into its second year of a Wi-Fi deployment from Extreme and Verizon Wireless, where Verizon customers get their own part of the network. Watch for more innovation in Seattle on the app side, with multiple camera angles available for replays.

St. Louis Rams
Edward Jones Dome
Seating Capacity: 66,000
Wi-Fi – No
DAS – Yes

Still no Wi-Fi at the Edward Jones Dome, as the team continues to ponder its future and whether or not it will stay in St. Louis. Fans should still have good cellular connectivity thanks to the Mobilitie neutral-host DAS installed last season.

University of Phoenix Stadium sees another 2 TB Wi-Fi game with big events on the horizon

University of Phoenix Stadium before Super Bowl XLIX. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

Call it a warm-up before the storm hits. The University of Phoenix Stadium, home of the Arizona Cardinals, racked up another 2-terabyte Wi-Fi traffic event during a recent Thursday night game, but bigger wireless days are no doubt on the near horizon.

With regular-season home games carrying playoff implications coming up against the Green Bay Packers and the Seattle Seahawks, the beefed-up Wi-Fi and DAS at UoP is sure to get a workout, though there might be even bigger numbers chalked up during the Notre Dame-Ohio State clash at the Fiesta Bowl on Jan. 1, 2016, and the College Football Playoff championship game, scheduled for Jan. 11. According to Mark Feller, vice president of technology for the Arizona Cardinals, the two college events will use the stadium’s expanded seating, which increases capacity from the NFL level of 63,500 to 75,000.

Last February during Super Bowl XLIX, the University of Phoenix Stadium (located in Glendale, Ariz.) recorded the highest single-game Wi-Fi traffic mark, a figure of 6.23 TB, while the inaugural College Football Playoff championship game at AT&T Stadium hit 4.93 TB. With the Packers coming to town Dec. 27 followed by the Seahawks on Jan. 3, it might be interesting to see how much Wi-Fi traffic is carried at UoP in the two-week-plus span.

For the Dec. 10 Thursday night game against the Minnesota Vikings (won by the Cardinals, 23-20), Feller said the Wi-Fi network recorded 28,497 unique clients, an almost 45 percent “take rate.” The peak concurrent user number that night was 25,333, Feller said, occurring just before halftime. The total bandwidth used was 2.0 TB, Feller said.

We’ll be interested to see what happens in the “15 days of bandwidth,” a series of events Feller and his crew are facing with excitement, as well as probably some pots of coffee and/or energy drinks.

“We are excited to be hosting all these games, but won’t be sleeping much,” Feller said in an email.