First Look: Inside the Atlanta Falcons’ new Mercedes-Benz Stadium

Big Bird greets all visitors to Mercedes-Benz Stadium. Credit all photos: Paul Kapustka, MSR (click on any photo for a larger image)

We’ll have much more to report on what we saw at the press day at the Atlanta Falcons’ new Mercedes-Benz Stadium, but we thought it was important to share these views as soon as we could — so here is an extended photo essay from the newest NFL venue (which will also be used for soccer). Unfortunately the Wi-Fi and DAS networks were live but not yet optimized, so we weren’t able to do any comprehensive speed testing (but hey, that’s what a regular season game is for!).

Overall first impressions, technology-wise: this is another well-thought-out venue, not just from a technology standpoint but also in terms of visual feel. The halo board is as impressive as advertised, though we would want to see it in action during a game (while sitting in a seat) to fully judge whether it fits in with the flow of an event. For advertisers it’s a wonder; watching all the video screens in the house go to a synchronized ad video was a big wow factor.

Since much of the stadium interior is unfinished concrete, there wasn’t much of an effort to hide networking components — but given all the other piping and cabling, the equipment does kind of fade out of sight in plain view.

MSR welcomes you to the big house

It’s our educated guess that the AT&T Porch — a wide open gathering area in the end zone opposite the windows toward downtown — is going to be a popular hangout, since you can see the field and have multiple big screen TV options behind you. We also liked the “technology loge suites,” smaller four-person private areas just off the main concourse with their own small TV screens and wireless device charging.

On the app side of things, it’s fair to say that features will iterate over time — both the wayfinding and the food-ordering options are not wirelessly connected yet, but according to IBM, beacons are a possible future addition to the mix. And while Mercedes-Benz Stadium is going to all-digital ticketing, season ticket holders will most likely use RFID cards on lanyards instead of mobile phone tickets, simply because RFID is a quicker option. The ticket scanners are from SkiData, the fiber backbone from Corning, the Wi-Fi APs from Aruba, and the DAS from Corning with a mix of antenna providers.

Like we said, more soon! But enjoy these photos today, ahead of the first event on Aug. 26.

The view inside the main entry, with halo board visible above

The view from the other side of the field, from the AT&T Porch

Just hard to fit all this in, but you can see here from field to roof

I spy Wi-Fi, APs point down from seat bottoms to main entry concourse

One of the many under-seat APs

A good look at the roof: Eight “petals” that all pull straight out to open, a process that is supposed to take seven minutes, according to the design

Good place for maximum coverage

View from the field

One of “hundreds” of mini-IDFs, termination points that bring fiber almost right to edge devices

The mega-vertical TV screen, just inside the main entry. 101 feet tall!

Something Falcons fans may like the most: Look at the prices!

MORE SOON!

AT&T to provide backbone bandwidth for Mercedes-Benz Stadium Wi-Fi

In a somewhat surprising announcement, AT&T said it will provide backbone bandwidth for the Wi-Fi network at the new Mercedes-Benz Stadium in Atlanta, as part of a partnership deal that makes the carrier the “Official Communications Provider” for the Atlanta Falcons’ new home.

Announced today, the deal calls for AT&T to provide twin redundant 40 Gbps pipes to power the 1,800 Wi-Fi APs that are inside Mercedes-Benz Stadium. As reported earlier by MSR, the Mercedes-Benz Wi-Fi network will primarily use under-seat AP deployments in the seating bowl.
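For readers who like to check the math, here is a rough back-of-the-envelope look at what those figures imply per access point. This is a sketch only: it treats both 40 Gbps pipes as active (if the second is purely a failover, halve the result), and real stadium traffic is never spread evenly across APs.

```python
# Back-of-the-envelope arithmetic on the backbone figures above.
# Assumes both redundant 40 Gbps pipes carry traffic simultaneously.
backbone_gbps = 2 * 40   # twin 40 Gbps backbone pipes
ap_count = 1_800         # Wi-Fi APs inside Mercedes-Benz Stadium

mbps_per_ap = backbone_gbps * 1_000 / ap_count
print(f"~{mbps_per_ap:.1f} Mbps of backbone per AP, if shared evenly")
```

Under that even-split assumption, each AP has roughly 44 Mbps of backbone behind it — comfortably above what a typical stadium AP pushes at any given moment.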

AT&T said it will also provide “monitoring and maintenance” for the stadium’s Wi-Fi network, and will also bring its DirecTV service to the venue’s IPTV system, making that content available to the more than 2,000 digital displays in the stadium. Mercedes-Benz Stadium is scheduled to formally open later this summer, for one of the Falcons’ preseason games.

What makes this announcement interesting to the stadium networking industry is that there is no mention of any participation by AT&T in the venue’s DAS network, which will run on Corning equipment. In recent years, AT&T has been pulling away from stadium Wi-Fi deployments and concentrating on DAS funding in large public venues. Its main competitor, Verizon Wireless, has been much more active recently on the stadium Wi-Fi front, helping fund Wi-Fi deployments in a number of NFL stadiums, including those in Green Bay, Denver, Seattle and Houston. AT&T does continue to participate in network deployments at AT&T Park in San Francisco and AT&T Stadium in Arlington, Texas, among others.

The press release out today does not say whether AT&T customers will have their own SSID or network space reserved, a feature Verizon usually secures for its customers when it helps fund a stadium’s Wi-Fi network. The release did say that as part of the deal AT&T will also sponsor the “AT&T Perch,” which is described as “a permanent interactive gathering spot” located on the concourse above the stadium’s west end zone. According to the release, the Perch will have multiple screens where fans can watch NFL content, including DirecTV’s Sunday Ticket programming and the NFL Network’s RedZone channel.

AT&T beefs up ski resort reception with stealthy DAS

AT&T DAS antenna stand (right) near the American Eagle lift at Copper Mountain. Credit all photos: Paul Kapustka, MSR (click on any photo for a larger image)

In order to improve cellular reception at the Copper Mountain ski area, AT&T this winter installed a stealthy seven-antenna DAS in several base-area locations, including inside ski-lodge buildings and inside a rooftop cupola.

According to Quin Gelfand, a senior real estate and construction manager for AT&T’s Antenna Solutions Group, the mountain had previously been served only by a single macro tower located up near the slopes of the popular Colorado resort, which is located just off the I-70 Interstate between Frisco and Vail.

On heavy skier-visit days, Gelfand said, the macro tower had recently caused some “capacity concerns,” leading AT&T to design a DAS solution for the several base areas at Copper Mountain. In addition to being saturated by demand, Gelfand said, the single macro antenna often didn’t provide strong signals inside buildings at the base areas.

“In a lot of areas around the resort, there were low bars for LTE,” Gelfand said.

AT&T’s Quin Gelfand shows off the main head end DAS gear rack.

But on Feb. 23 this year, that situation changed for AT&T cellular customers, as the DAS went live and immediately started moving lots of cellular traffic. By the time of our visit in early April, Gelfand said the DAS installation (which has the capacity equivalent of a single large macro tower) had already seen more than 7 terabytes of data moved, averaging about 175 GB per day. Like at many Colorado ski areas, March is a busy month at Copper with lots of spring break skiers and locals driving up on weekends from Denver.
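Those usage numbers hold together arithmetically; here is a quick consistency check. The go-live year and the exact “early April” visit date are our assumptions, used purely for illustration.

```python
from datetime import date

# Consistency check on the figures above: 7 TB moved between the
# Feb. 23 go-live and an early-April visit, vs. the ~175 GB/day average.
# The year (2017) and the visit date (April 4) are assumptions.
go_live = date(2017, 2, 23)
visit = date(2017, 4, 4)

days_live = (visit - go_live).days    # days the DAS had been running
avg_gb_per_day = 7_000 / days_live    # 7 TB expressed as ~7,000 GB
print(days_live, round(avg_gb_per_day))
```

About 40 days of operation at 7 TB total works out to the quoted ~175 GB per day almost exactly.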

Hiding antennas in a cupola

Brad Grohusky, senior IT manager for Copper Mountain, said AT&T approached the resort a couple of years ago to discuss the idea of a DAS. “When we had a dense population of guests, it was pretty easy to saturate a signal,” Grohusky said.

On weekends, Grohusky said Copper could often see as many as 10,000 guests, and might even see as many as 14,000 visitors on popular days or holidays. Wireless communications, he said, could get even more stress if the weather turned nasty or cold, driving more people inside buildings.

DAS antenna (upper top left) in Copper Station lodge

Starting from an existing telecom service room located in an underground garage, AT&T ran fiber this past offseason to three different antenna locations. The closest and most obvious is a three-antenna stand near the “Burning Stones” gathering area and the American Eagle chairlift base. As one of the resort’s main first chairs the American Eagle often has crowds at its base, and the Burning Stones area is a small clearing between the slopes and the base area buildings that is used often for concerts and other public gatherings.

“There was lots of digging last summer,” said Grohusky of the fiber-trenching effort, which gained some extra time thanks to a warmer-than-usual fall that kept the snow at bay. “We took advantage of that extra week,” Grohusky said.

If the American Eagle-area antennas are in plain sight, the two antennas at the Union Creek Schoolhouse base area to the west would be impossible to find if you didn’t know where they were: on the roof of one building, AT&T built custom-designed baffling for a rooftop cupola that completely hides the antennas while allowing cellular signals to pass through.

“You would never know the antennas were up there,” Grohusky said. “AT&T really accommodated our architecture there.”

Closer look at DAS tower near American Eagle lift

Back farther to the east, two more antennas were located at the top windows of the Copper Station lodge building, pointed outward to cover the lift base areas and the condos and other buildings in that area. According to Gelfand AT&T used Nokia RAN gear as well as Corning fiber equipment, CommScope cabling components and antennas from JMA Wireless in the deployment. The DAS is powered by a 100 Mbps fiber link from CenturyLink, and supports three cellular bands — 700 MHz, AWS and PCS, according to Gelfand.
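A quick calculation suggests that 100 Mbps link has plenty of headroom for the average load reported earlier (~175 GB/day), though busy-day peaks will obviously run well above the average. The decimal convention (GB = 10^9 bytes) is our assumption.

```python
# Convert the ~175 GB/day average reported earlier into an average
# bit rate, to compare against the 100 Mbps CenturyLink backhaul.
avg_gb_per_day = 175
seconds_per_day = 24 * 3600

avg_mbps = avg_gb_per_day * 8_000 / seconds_per_day  # GB/day -> Mbps
print(f"average load ≈ {avg_mbps:.1f} Mbps of the 100 Mbps link")
```

An average load around 16 Mbps leaves the bulk of the link free to absorb the lunchtime and storm-day surges Grohusky described.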

Even though ski season is all but over, the network will still get use in the non-snowy months, as Copper Mountain, like many Colorado resorts, has an active summer schedule of on-mountain activities. The resort also has a limited free public Wi-Fi network in certain base-area buildings, including in and around the Starbucks location right next to the Burning Stones area. Grohusky said there are no current plans to expand the Wi-Fi, and also said that none of the other major cellular carriers are planning to add their own DAS deployments.

But for AT&T customers, Grohusky said connectivity is vastly improved. “The feedback has been great,” he said. “Connectivity used to be poor inside buildings, but now it’s great.”

Look back toward the Burning Stones gathering area, near American Eagle lift

Union Creek Schoolhouse building — cupola with AT&T antennas is the one closest to ski hill

JMA Wireless antenna mounted high up inside Copper Station lodge

CommScope gear inside the Copper Station node equipment room

Corning optical gear inside the Copper Station node equipment room

Copper Station lodge building (with DAS antennas) on far right, showing proximity to eastern base area

Optical fiber, under-seat Wi-Fi will power wireless connectivity at Atlanta’s Mercedes-Benz Stadium

Aerial photo of Mercedes-Benz Stadium under construction. Credit all photos and artist renderings: Mercedes-Benz Stadium (Click on any photo for a larger image)

Once just a series of drawings on a blueprint, Atlanta’s new Mercedes-Benz Stadium is getting more real by the day, with walls being added to steel beams, and wires for the internal networks being pulled into place.

Though the June 2017 opening day is still many months away, many elements of the stadium’s network have already been tested, thanks to a facility created by stadium network officials to test components under conditions as close to “live” as possible. That lab environment helped the network team make its final decisions on vendors and deployment methods, like going under-seat for deployment of most of the 1,000 Wi-Fi APs that will be in the stadium’s bowl area, part of a planned total of 1,800 APs in the entire venue.

In a recent interview with Jared Miller, chief technology officer at AMB Sports and Entertainment (the entity named for Arthur Blank, the owner of the Atlanta Falcons), Mobile Sports Report got an exclusive update on the construction progress so far for the new $1.5 billion facility, along with new details about the internal network deployment, which will be using more optical fiber than any previous stadium network we know of.

Like the network built at Texas A&M’s Kyle Field, the network inside Mercedes-Benz Stadium will have a single optical core for Wi-Fi, cellular and video, using the Corning ONE platform and deployed by lead network integrator IBM along with Corning.

Wall panels being added to Mercedes-Benz Stadium in Atlanta

Miller also confirmed our earlier report that YinzCam software would be used to host the stadium’s IPTV deployment, but vendor choices for Wi-Fi gear and a stadium app have yet to be named.

As construction teams continue to hustle toward completion of the building, here are more details from our conversation with Miller about how the Falcons’ tech team went through the process of determining the products and methods that would allow them to construct a network able to “push the limits” on fan connectivity.

Under-seat for Wi-Fi, with handrail heat sinks

In our early August conversation with Miller, he was happy to report that the planned 4,000 miles of optical fiber were finally starting to be threaded into the new building. “We’re making great progress with a ton of yellow cable,” Miller said.

While the overall architecture at the network core in Mercedes-Benz Stadium will be similar to the one IBM and Corning deployed at Kyle Field, Miller said that in Atlanta his team is pushing fiber even farther to the edge, “with only the last couple feet at most being copper.”

Interior suite construction with fiber cable visible

Miller said optical fiber, which can carry more data traffic at faster speeds than copper cable, is a necessary infrastructure underpinning for facilities like Mercedes-Benz Stadium that expect to host the biggest events like the Super Bowl and college football championship games. Mercedes-Benz Stadium is already slated to host Super Bowl LIII, the 2018 College Football Playoff Championship, and the 2020 Final Four.

“I really believe [fiber] gives us the foundation to grow and react in the future, to handle technologies we don’t even know about yet,” Miller said.

On the Wi-Fi side of things, Miller said that Mercedes-Benz Stadium will also mimic Kyle Field’s extensive use of under-seat APs in the bowl seating areas. Miller said the stadium will have 1,000 APs serving the seating areas and another 800 for the rest of the venue, for a total Wi-Fi AP count of 1,800.

Since the Mercedes-Benz Stadium network will be using more optical equipment closer to the edge, Miller said that his team used 3D printing experiments to craft custom enclosures for the under-seat APs, both to ensure they didn’t act as debris “traps” and also to add elements like an internal heat sink to diffuse the warmth from the extra electrical components. The heat sink solution involved attaching the AP elements to metal chair railings to dissipate heat, Miller said.

Testing the network before the building is built

After announcing its partnership with IBM in early 2015 as lead technology integrator, the stadium network team spent 6 months reworking the network design, Miller said, a process that confirmed the choice of optical networking at the core. Then to help the network team select gear and components, the Mercedes-Benz Stadium organization built a “full-scale lab facility” that Miller said allowed his team to build multiple live networks to test gear for performance and interaction with other network elements.

Artist rendering of outside of building

“The lab enabled us to see firsthand how gear behaved, not just alone but together [with other products],” said Miller, who added that at one time the network team had three simultaneous running stadium networks inside the lab.

“We were able to bring in different endpoint devices, like POS systems, and know how it’s going to behave [in a network],” Miller said. Plus, the network gave eventual business users of the planned gear time to get hands-on experience and training well before the stadium opens its doors.

On the DAS side of the network buildout, Miller said the stadium has an on-site, raised-floor room for DAS gear with “ample room” for future growth.

“One of those things we learned was that DAS [needs] always double,” Miller said.

YinzCam software for IPTV

Though the stadium hasn’t yet announced a provider for a game-day stadium application, Miller did confirm that Mercedes-Benz Stadium will use YinzCam software to control its IPTV system, which will cover the 2,500 or so TV screens inside the building.

Artist rendering of Falcons game configuration with roof open and 'halo' video board visible

“YinzCam is just the most intuitive and capable content management system,” Miller said.

Video is going to be a big part of the stadium from all angles, beginning with the one-of-a-kind “halo board,” a circular screen that will sit inside the retractable roof lines. For standard TV placements, Miller said Mercedes-Benz Stadium will use mainly 50-inch screens and will work with YinzCam to ensure the screens can be seen.

In the stadium’s suites, TV screens will be controlled by a tablet application; Miller said that Mercedes-Benz Stadium is also “contemplating adding the ability to control TV screens with a mobile app,” like the system YinzCam deployed at Texas A&M.

Friendly food pricing and more to come

Though Miller’s concerns are mostly technological in nature, he said there are still a lot of improvements coming to the stadium “that are not always reliant on brute technology,” like the new lower-priced food menus the Falcons announced earlier this year, which hearken back to another era with $2 Cokes and $2 hot dogs. Miller said the stadium team continues to get feedback from a fans’ council, which has tagged the arrival and departure experience as one of the main pain points that needs fixing.

Artist rendering of window wall with view to city

Mercedes-Benz Stadium will try to alleviate ingress and egress issues by, for example, creating “ticketed spaces,” perhaps on the big outdoor plazas, where many fans can congregate even before entering the stadium doors. By creating such spaces, Miller said, fans might be able to enter the stadium more rapidly without the logjams that sometimes occur.

“We’re going to study arrival patterns and see what it looks like,” Miller said. “We have one more season to test those kind of things.”

Another amenity that may emerge is the use of wireless charging stations at a number of locations, to combat a scenario that Miller said often happens at marquee events, mainly fans’ phones draining their batteries as they compete with other devices to connect to a wireless network.

“We are focusing on providing amazing connectivity and pushing the limits,” Miller said. “We are looking at all kinds of options to allow fans to stay connected and not be separated from their device.”

Texas A&M’s Kyle Field: A network built for speed

Full house at Kyle Field. All Photos: Paul Kapustka, MSR (click on any photo for a larger image)

Is there a combined stadium Wi-Fi and DAS deployment that is as fast as the one found at Texas A&M’s Kyle Field? If so, we haven’t seen or heard of it.

In fact, after reviewing loads of live network-performance data of Kyle Field’s new Wi-Fi and DAS in action, and after maxing out the top levels on our speed tests time after time during an informal walk-around on a game day, we’ve come to the conclusion that Kyle Field has itself a Spinal Tap of a wireless deployment. Meaning, that if other stadium networks stop at 10, this one goes to 11.

Movie references aside, by the numbers Kyle Field’s wireless network performance is quite simply unequaled by any other large public venue we’ve tested, in terms of raw speed and the ability to deliver bandwidth. With DAS and Wi-Fi speed measurements ranging between 40 Mbps and 60+ Mbps pretty much everywhere we roamed inside the 102,512-seat venue, it’s safe to say that the school’s desire to “build the best network” in a stadium hit its goal.

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

On one hand, the network’s top-line performance is not that much of a surprise, since as part of an overall Kyle Field renovation that has already cost an estimated $485 million, the optical-based Wi-Fi, DAS and IPTV deployment inside the Aggies’ football palace is probably among the most expensive and expansive in-venue networks ever built. According to Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System, the total cost of the optical-based Wi-Fi, DAS and IPTV network was “somewhere north of $20 million.”

Remote optical cabinet and Wi-Fi AP at Kyle Field.

And even though the nation’s biggest cellular carriers, AT&T and Verizon Wireless, paid nearly half the network’s cost – $10 million, according to Ray – with the dedication and work crews brought to the table by main suppliers IBM and Corning, and Wi-Fi gear vendor Aruba, you have components, expertise and budgetary freedom that perhaps only a small group of venue owners could hope to match.

But just throwing money and technology at a stadium doesn’t necessarily produce a great network. In a venue the size of the new Kyle Field there needs to be great care and innovative thinking behind antenna placement and tuning, and in that arena Texas A&M also had the guiding hand of AmpThink, a small firm with oversized smarts in Wi-Fi deployment, as evidenced by its impressive track record of helping wireless deployments at the biggest events including several recent Super Bowls.

The core decision to go with optical for the network’s guts, and a tactical decision to put a huge chunk of the Wi-Fi APs in under-seat deployments are just part of the strategy that produced a network that – in A&M fan parlance – can “BTHO” (Beat The Hell Out) of most challengers.

Since it’s almost impossible to directly compare stadiums and venue network performances due to all the possible variables, you’ll never hear us at Mobile Sports Report declare a “champion” when it comes to click-bait themes like “the most connected stadium ever.” Given its remote location some three hours south of Dallas in College Station, Texas, Kyle Field will almost certainly never face the ultimate “big game” pressures of a Super Bowl or a College Football Playoff championship, so the network may never know the stress such large, bucket-list gatherings can produce. And so far, there aren’t many ambitious fan-facing applications that use the network, like in-seat food delivery or wayfinding apps found in other stadiums.

But as part of the football-crazy SEC, and as the altar of pigskin worship for some of the most dedicated fans seen anywhere, Kyle Field is sure to see its share of sellout contests against SEC rivals that will push wireless usage to new heights, especially as more fans learn about and use the still-new system. Though total Wi-Fi usage at the Nov. 7 game we attended versus Auburn (a 26-10 Texas A&M loss) was “only” 2.94 terabytes – a total hampered by cold, windy and rainy conditions – an Oct. 17 game earlier in the season against Alabama saw 5.7 TB of Wi-Fi usage on the Kyle Field network, a number surpassed only by last year’s Super Bowl (with 6.2 TB of Wi-Fi use) in terms of total tonnage.

At the very least, the raw numbers of total attendees and the obvious strength of the still-new network is sure to guarantee that Kyle Field’s wireless deployment will be one of the most analyzed stadium networks for the foreseeable future.

Texas A&M student recording the halftime show.

What follows are some on-the-spot observations from our visit, which was aided by the guidance and hospitality of Corning project manager Sean Heffner, who played “tour guide” for part of the day, giving us behind-the-scenes access and views of the deployment that are unavailable to the general fan audience.

An off-campus DAS head end

This story starts not inside Kyle Field, but in a section of town just over three miles away from the stadium, on a muddy road that curves behind a funky nursery growing strange-looking plants. A gray metal box, like a big warehouse, is our destination, and the only clue as to what’s inside is the big antenna located right next to it. This structure is the Kyle Field DAS head end, where cellular carrier equipment connects to the fiber network that will bring signals to and from fans inside the stadium.

Why is the head end so far away? According to Corning’s Heffner, there was no room for this huge space inside the stadium. But thanks to the use of optical fiber, the location is not a problem, since signals traveling at the speed of light make 3.3 miles an insignificant span.
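To put a number on “insignificant,” here is a rough propagation-delay calculation for 3.3 miles of fiber. The refractive index of ~1.47 is a typical single-mode fiber value we have assumed, not a figure from Corning.

```python
# One-way propagation delay over 3.3 miles of optical fiber.
# Light travels through glass at roughly c / n, where n ~ 1.47
# is a typical refractive index for single-mode fiber (our assumption).
c = 299_792_458             # speed of light in vacuum, m/s
n = 1.47                    # assumed refractive index of the fiber
distance_m = 3.3 * 1609.34  # 3.3 miles in meters

delay_us = distance_m / (c / n) * 1e6
print(f"one-way delay ≈ {delay_us:.0f} microseconds")
```

Roughly 26 microseconds each way — orders of magnitude below anything a phone or laptop user could ever notice.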

It might be helpful to back up a bit if you haven’t heard the full story of the Kyle Field deployment, which we told last year when the job was halfway completed. Though the rebuilding of the stadium was started with copper-based networks as the original plan, a last-minute audible championed by Texas A&M chancellor John Sharp sent the school on a decidedly untraditional path, by building a stadium network with a single optical-based core for Wi-Fi, DAS and IPTV networks. The kicker? Not only would this network have huge capacity and be future-proof against growth, it would actually cost less than a comparable copper-based deployment. If it got built on time, that is.

Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

One of the many maxed-out speed tests we took at Texas A&M's Kyle Field.

Without going too deeply into the physics or technology, the benefits stem from the fact that optical fiber can carry far more bandwidth than copper, over longer distances, using less power. Those advantages are why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the user’s doorstep, through deployments like Verizon’s FiOS.

And that’s also the reason why Texas A&M could put its DAS head end out in a field where it’s easier to expand (no space constraints), because the speed of fiber makes distance somewhat irrelevant. Corning’s Heffner also said that the DAS can be managed remotely, so that staff doesn’t need to be physically present to monitor the equipment.

Of course, there was the small matter of digging trenches for optical fibers to get from the head end to the stadium, but again, for this project it is apparent that getting things done was more important than strictly worrying about costs. Beyond the cash that the carriers all put in, other vendors and construction partners all put in some extra efforts or resources – in part, probably because the value of positive publicity for being part of such an ambitious undertaking makes any extra costs easy to justify.

Keeping the best fans connected and happy

From the head end, the fiber winds its way past apartment buildings and a golf course to get to Kyle Field, the center of the local universe on football game days. Deep inside the bowels of the venue is where the fiber meets networking gear, in a room chilled to the temperature of firm ice cream. Here is where the human element that helps keep the network running spends its game days, wearing fleece and ski jackets no matter what the temperature is outside.

See the white dots? Those are under-seat Wi-Fi APs

In addition to Corning, IBM and AmpThink employees, this room during our visit also had a representative from YinzCam in attendance, a rarity for a company that prides itself on being able to have its stadium and team apps run without local supervision. But with YinzCam recently named as a partner to IBM’s nascent stadium technology practice, it’s apparent that the Kyle Field network is more than just a great service for the fans in the seats – it’s also a proof of concept network that is being closely watched by all the entities that helped bring it together, who for many reasons want to be able to catch any issues before they become problems.

How big and how ambitious is the Kyle Field network? From the outset, Corning and IBM said the Wi-Fi portion was designed to support 100,000 connections at a speed of 2 Mbps each, so that if everyone in the stadium decided to log on, they’d all have decent bandwidth. So far, that upper limit hasn’t been tested.

What happened through the first season was a “take rate” averaging in the 35,000-37,000 range, meaning that during a game day, roughly one-third of the fans in attendance used the Wi-Fi at some point. The concurrent-user peaks – the highest numbers of fans using the network at the same time – generally fell in the mid-20,000 range, according to figures provided by Corning and AmpThink; so instead of 100,000 fans connecting at 2 Mbps, this season saw about a quarter of that number connecting at much higher data rates, if our ad hoc speed tests are any proof.
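The arithmetic behind that comparison is straightforward. Note the 25,000 concurrent-user figure below simply rounds the “mid-20,000” range quoted above, and the per-user rate is derived, not a number from Corning or AmpThink.

```python
# Compare the design target (100,000 users at 2 Mbps each) with the
# observed concurrent peaks (~25,000, rounding the "mid-20,000" range).
design_users, design_mbps = 100_000, 2
aggregate_gbps = design_users * design_mbps / 1_000
print(aggregate_gbps)          # aggregate design target, in Gbps

concurrent_peak = 25_000       # rounded from the article's figures
mbps_each = aggregate_gbps * 1_000 / concurrent_peak
print(mbps_each)               # implied per-user share at peak, in Mbps
```

In other words, at the observed peaks each connected fan could in principle draw four times the 2 Mbps design guarantee — consistent with the 40+ Mbps tests we recorded.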

Our first test that Saturday [Nov. 7, 2015], just inside a lower-level service entryway, hit 41.35 Mbps for download and 18.67 on the upload, on a Verizon iPhone 6 Plus over the stadium’s DAS. And yes, that download speed was the slowest we’d record all day, either on the DAS or the Wi-Fi.

Inside the control room we spent some time with AmpThink CEO Bill Anderson, who could probably use up an entire football game talking about Wi-Fi network deployment strategies if he didn’t have a big network to watch. The top things we learned on this Saturday: Anderson and AmpThink are solid believers in under-seat AP placements for performance reasons; at Kyle Field, according to Anderson, fully 669 of the stadium’s 1,300 APs can be found underneath seats. Anderson is also a stickler for “real” Wi-Fi usage measurements — for example, weeding out of the “unique user” totals any devices that may have autoconnected to the Wi-Fi network but never used it, and taking bandwidth measurements at the network firewall to see how much “live” bandwidth is truly coming and going.

On the road to College Station, Aggie pride is everywhere. Whoop!

AmpThink’s attention to detail includes deploying and configuring APs differently depending on which section they are located in – student sections, for example, are more densely packed with people than other sections so the APs need different tuning. Corning’s Heffner also said that the oDAS – the DAS just outside the stadium – got special attention due to the large numbers of tailgating fans, both before and during the games. At the Alabama game, Heffner said there were some 30,000 fans who remained outside the stadium during the contest, never coming inside but still wanting to participate in the scene.

AmpThink, Corning, IBM and others involved at Kyle Field all seem keen on finding out just how much bandwidth stadium fans will use if you give them unlimited access. The guess? According to Corning’s Heffner, the mantra of stadium networks these days seems to be: “If you provide more capacity, it gets consumed.”

The ‘real’ 12th man

After walking through a tunnel with a nearly full cable tray overhead (“It’d be even more loaded if we were using copper,” Heffner said) we went out into the stadium itself, which was just starting to fill. Though the overcast day and intermittent rain squalls might have kept other teams’ fans from showing up for a 5:30 p.m. local start time, that simply wasn’t the case at an A&M home game.

Some of the Wi-Fi and DAS download measurements we took at Kyle Field.

As someone who’s attended countless football games, small and large – including a Super Bowl and last year’s inaugural College Football Playoff championship game – I can honestly say that the level of fan participation at Texas A&M is like nothing I’d seen before. The student section alone spans two decks on the stadium’s east side and takes up 40,000 seats, according to stadium officials – simply dwarfing anything I’d ever witnessed. (Out of an enrollment of 57,000-plus, having 40,000 students attend games is incredible.) And outside of small high school crowds, I’d never seen an entire stadium join in all the school songs, the “yells” (do NOT call them “cheers” here) and the locked-arms, back-and-forth “sawing” dance without any need for scoreboard instruction.

Part of the stadium renovation that closed the structure into a bowl was, according to school officials, designed to make Kyle Field even more intimidating than it already was by raising the sound levels the crowd could produce. Unfortunately, the night of our visit some early Auburn scores took some of the steam out of the crowd, and a driving, chilling rain that appeared just before halftime sent a good part of the crowd either home or into the concourses looking for warmth and shelter. (The next day, several columnists in the local paper admonished the fans who left early for their transgressions; how dare they depart a game whose outcome was still in doubt?)

But I’ll never forget the power of the synchronized “yells” of tens of thousands of fans during pregame, and the roar that surfaced when former Aggie QB Johnny Manziel made a surprise appearance on the field before kickoff. Seattle Seahawks fans may stake the pro claim to fan support, but if you want to determine the “real” 12th man experience you need to stop by Kyle Field and give your ears a taste of loud.

Controlling the TV with the app

If the students and alumni and other fans outside provide the vocal power, the money power that helped get the stadium rebuilt can be found in the new Kyle Field suites and premium seating areas, some of them on the venue’s west side, which was blown up last December and rebuilt in time for this past season.

Conduit reaching to an under-seat AP

Inside the All American Club – a behind-the-walls gathering area with catered food and bars that would not seem out of place in Levi’s Stadium or AT&T Stadium – we tested the Wi-Fi and got speeds of 63 Mbps down, 69 Mbps up; Verizon’s 4G LTE service on the DAS hit 48 Mbps/14.78 Mbps, while AT&T’s 4G LTE DAS checked in at 40 Mbps/22 Mbps.

In an actual suite where we were allowed to check out the IPTV displays, the speed tests got 67/67 for Wi-Fi and 57/12 for Verizon 4G LTE. So the well-heeled backers of A&M football shouldn’t have any problems when it comes to connectivity.

As for the IPTV controls, the new system from YinzCam solves a problem that has plagued stadium suites for as long as there have been suites: What do you do with the TV remote? What YinzCam did for Texas A&M was link the TV controls to a Texas A&M “TV Remote” app; by simply punching in a numerical code that appears at the bottom of the screen in front of you, anyone with access to a suite or club area with TVs can change the channel among a long list of selections, including multiple live game-day views (stadium screen, broadcast view) as well as other channels, like other games on ESPN’s SEC Network.

By pairing a static code number for each TV with a second code that scrambles randomly over time, the system smartly builds security into channel changing, preventing someone who had been in a suite earlier from changing the channels after they leave. The whole remote-control process took less than a minute to learn, and we had fun wandering through the club-level areas our pass gave us access to, changing screens as we saw fit.
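One way such a two-part scheme could work is sketched below: a static per-TV ID plus a short code that rotates on a fixed interval, so a code written down during one visit stops working after the visitor leaves. The HMAC construction, shared secret and five-minute window are entirely our assumptions; YinzCam’s actual design is not public.

```python
# Hedged sketch of a rotating-code channel-change check (assumptions, not
# YinzCam's real implementation).
import hashlib
import hmac

SECRET = b"stadium-shared-secret"  # hypothetical server-side secret
ROTATE_SECONDS = 300               # assume the on-screen code rotates every 5 minutes

def rotating_code(tv_id: int, now: float) -> str:
    """Derive the 6-digit code shown on a given TV for the current time window."""
    window = int(now // ROTATE_SECONDS)
    mac = hmac.new(SECRET, f"{tv_id}:{window}".encode(), hashlib.sha256).digest()
    return f"{int.from_bytes(mac[:4], 'big') % 1_000_000:06d}"

def can_change_channel(tv_id: int, entered: str, now: float) -> bool:
    """Accept a channel-change request only if the entered code is still current."""
    return hmac.compare_digest(entered, rotating_code(tv_id, now))

now = 1_700_000_000.0  # fixed timestamp so the example is deterministic
code = rotating_code(tv_id=42, now=now)
print(can_change_channel(42, code, now))                       # True: code is current
print(can_change_channel(42, code, now + 2 * ROTATE_SECONDS))  # False: code has rotated
```

The rotation means an old code is useless two windows later, which matches the behavior the stadium system needs: access that expires on its own, with no remote to lose.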

Our favorite places to watch the game at Kyle Field were the loge-level lounges, where you could first purchase food and beverages, including alcoholic ones, at an inside bar and then sit at an outside seat with a small-screen TV in front of you for information overload. The Wi-Fi in the southwest corner loge lounge checked in at 67.03/62.93, so it was no problem being connected via mobile device, either.

What comes next for the Kyle Field network?

Even though the rain had started coming down harder, we left the comfort and warmth of the club levels to wander the stadium’s upper decks, including the student section, where we watched numerous fans taking pictures or videos of the band’s halftime performance. Clearly, most everyone in Kyle Field had gotten the message and wasn’t afraid they wouldn’t be able to connect while using a mobile device at the game, even among 102,000 of their closest friends.

Antennas on flag poles atop seating

The question now for Kyle Field is: what does it do next with its network? The most obvious place for innovation is a stadium-centric app, one that could provide services like a wayfinding map. Maybe it was our round-the-stadium wandering, but we had trouble finding our way around, and any building that seats 102,000-plus could use an interactive map. It might also be interesting to tie a map to concessions – the night we visited, there were long lines at the few hot chocolate stands due to the cold weather; in such situations you could conceivably use the network to spot which hot chocolate stands were running low, open new ones and alert fans through the app.

We’re guessing parking and ticketing functions might also be tied to the app in the future, but for now we’ll have to wait and see what happens. One thing in Kyle Field’s favor for the future: thanks to the capacity of the optical network buildout, the stadium already has thousands of spare fiber connections that aren’t currently being used. That means when it’s time to upgrade or add more DAS antennas, Wi-Fi APs or whatever comes next, Kyle Field is already wired to handle it.

For the Nov. 7 game at Kyle Field, the final numbers included 37,121 unique users of the Wi-Fi network, and a peak concurrent-user count of 23,101, taken near the end of the 3rd quarter. The total traffic on the Wi-Fi network that night was 2.94 TB, perhaps low to average for Kyle Field these days; but it’s helpful to remember that just three years ago that was right around the total Wi-Fi data used at a Super Bowl.
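Divided out, those numbers work out to a modest per-fan average; a quick sketch (assuming decimal units, i.e. 1 TB = 1,000,000 MB):

```python
# Per-user average from the Nov. 7 numbers above (decimal units assumed).
total_wifi_tb = 2.94
unique_users = 37_121

avg_mb = total_wifi_tb * 1_000_000 / unique_users  # TB -> MB
print(f"average per unique user: {avg_mb:.0f} MB")  # average per unique user: 79 MB
```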

Until the next IBM/Corning network gets built in Atlanta (at the Falcons’ new Mercedes-Benz Stadium, slated to open in 2017), the Kyle Field network will no doubt be the center of much stadium-technology market attention, especially if the school ever does manage to get 100,000 fans on the Wi-Fi all at once. While A&M’s on-the-field fortunes in the competitive SEC are a yearly question, the performance of the network in the Aggies’ stadium isn’t; right now it would certainly be one of the top four seeds, if not No. 1, if there were such a thing as a college stadium network playoff.

What we’re looking forward to is more data and more reports from a stadium with a network that can provide “that extra push over the edge” when fans want to turn their connectivity dial past 10. Remember, this one goes to 11. It’s one more.

(More photos below! And don’t forget to download your copy of the STADIUM TECH REPORT for more!)

Panoramic view of Kyle Field before the 102,000 fans fill the seats.

Some things at Kyle Field operate at ‘traditional’ speeds.

Outside the south gate before the game begins.

Overhang antenna in the middle section of the stadium.

Last-minute audible to optical made Texas A&M’s stadium network a winner

Texas A&M student at recent Aggies football game. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

The original game plan for the new wireless networks at Texas A&M’s Kyle Field called for copper, not optical fiber, at the network core. Then came a last-minute audible that changed the game not just for the Aggies but maybe for stadium networks overall.

After the network was initially designed with a traditional copper wiring system, a late spring 2014 decision by Texas A&M chancellor John Sharp reversed field, switching instead to an all-optical network for DAS, Wi-Fi and IPTV combined. The new network, now fully operational, is already being hailed as a future-proof path for stadium network technology, with other schools and pro teams beating a path to College Station to see what they might learn.

With screaming speeds on both the Wi-Fi and DAS networks and plenty of capacity for now and the future, Sharp’s line-of-scrimmage call to go with an IBM and Corning optical-based network seems to be a huge score, according to a school official who brought the idea to Sharp’s attention.

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE for 2015. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

A sample of the Wi-Fi and DAS speed tests we took at Kyle Field.

Last-minute switch from copper to optical

“We had got pretty far down the road with an older, but tried and true [network] architecture,” said Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System. But after hearing and reading about the potential of an optical fiber-based network, Ray brought in Corning and IBM representatives over school spring break in 2014 to discuss switching Kyle Field to optical – even though the network would have to be ready for the 2014 football season.

“We had some face to face meetings with chancellor Sharp and discussed all the pros and cons,” said Ray, who had been charged by Sharp with overseeing the network deployment part of the $485 million Kyle Field renovation. Though Ray said he was under a “lot of pressure” to stick with the older-type design, he quickly got a green light from Sharp to take the optical choice and run with it.

“If we had gone copper, we knew that we would have had a network in the stadium for game 1,” said Ray. “But the pros of optical far outweighed the cons. Chancellor Sharp instead took a big risk, and took a leap of faith for all the right reasons. He said, ‘this is the chance of a lifetime, to really move the ball and shoot for the top!’ “

According to Ray, the total cost of the combined Wi-Fi, DAS and IPTV network ended up being “just north of $20 million,” but that cost was softened when the two largest cellular carriers, AT&T and Verizon Wireless, ponied up $10 million, almost half the cost.

“The carriers embraced it, funded it, and want to be with us down the road,” said Ray. “It was a paradigm shift for them, but they wanted to be involved.” While AT&T and Verizon are live on the DAS now, Ray said that Texas A&M already has a commitment from T-Mobile to join the DAS soon, and hopes to also add Sprint before long.

Beyond the leap of faith to go optical, there was the on-the-ground necessity of building the network quickly, since Sharp didn’t want to start the 2014 season without it. Ray said that Todd Chrisner – a former IBM employee who moved to Corning during the past year – “helped lead a Herculean effort” of gear suppliers, service providers and construction workers who finished Phase 1 of the network in time for the first game of last season. Phase 2 also required quick work, since it couldn’t begin until Texas A&M blew up and then rebuilt the entire west side of the stadium, between December 2014 and the 2015 season.

On the road to College Station, Aggie pride is everywhere. Whoop!

Again, the network (and the building) were finished on time.

“We had a lot of Aggies involved [in the construction],” Ray said. “They knew they were going to be sitting in those seats for the next 35 years, so they worked hard.”

Now that it’s finished and working incredibly well, Ray said the Kyle Field network has already been visited by representatives from other colleges, as well as professional football and hockey stadium-networking types.

“We get calls every week, and we have people down to share what we learned – we’re an open book,” said Ray. And they’re able to tell a success story mainly because Ray, Sharp and others trusted themselves to switch from an OK play to one that could score a touchdown.

“If we had gone with copper we’d be so regretting it now,” Ray said. Having an optical-based network, he said, “sets us up for many years, and eventually will save us money. It was a lot of hard work and risk, and if it had fallen on its head, chancellor Sharp would have taken the heat. Instead, it’s one of the best decisions, ever.”