University of Wisconsin takes on Wi-Fi, Badger Game Day app upgrades

Camp Randall Stadium, University of Wisconsin. Photo: Dave Stluka

Sports fans at the University of Wisconsin have been enjoying a nice technology two-fer for the last 20 months: In addition to new Wi-Fi and beacon technology at its largest sporting venues in Madison, Wisc., the university also released v3.0.2 of its Badger Game Day app, which adds live and archived video, among other features, for fans and their smartphones.

Jim Roberts, director of technical services for the university’s athletic department, described this as a happy coincidence as opposed to a larger strategy to bring sports technology to the Badger faithful. “Knowing that the new Wi-Fi system was coming, the group working on the app upgrade was able to incorporate more features, knowing fans could take advantage of the improved Wi-Fi and not rely solely on cellular data plans,” Roberts said.

Editor’s note: This profile is an excerpt from our latest STADIUM TECHNOLOGY REPORT, which is available for FREE DOWNLOAD from our site. In addition to stadium tech deployment profiles, we also take an in-depth look at the new trend of deploying Wi-Fi and DAS antennas under seats, and provide a wireless recap from Super Bowl 50. GET YOUR COPY today!

The Wisconsin venues are Camp Randall, a bowl-style football stadium with a capacity of 80,321; and nearby Kohl Center, used for hockey, basketball, concerts and other live events with room for 17,230. The LeBahn Arena, built for women’s ice hockey with a capacity of 2,273, is also included. In part because of their proximity, Roberts and his team used the upgrades to replace and enhance the underlying infrastructure for the venues – core switching, Wi-Fi access points, an IPTV system, cabling, electrical power and HVAC improvements – $11 million for the whole package, according to Roberts.

“Due to the expected size of the population connecting to Wi-Fi, we had to upgrade the entire network,” he explained, adding that the previous 10/100 Mbps backbone with Gigabit Ethernet uplinks and its 32,000 MAC address capacity was insufficient for the job.

“We upgraded our core to some pretty big Cisco routers at each venue that could handle 128,000 MAC addresses, with 10-gigabit fiber to all 33 telecom rooms within the Camp Randall complex,” he said; they also added about 1,100 wireless APs. Camp Randall got upgraded during the summer of 2014; the Kohl Center and LeBahn were done a year later.
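
For a rough sense of why the old 32,000-entry MAC table was a bottleneck, here is a back-of-envelope sketch. The seat count is from this story; the devices-per-fan figure is our own assumption, not a university number.

```python
# Back-of-envelope sketch of the MAC-table math. The seat count is from this
# story; DEVICES_PER_FAN is an assumption, not a university figure.

CAMP_RANDALL_SEATS = 80_321
DEVICES_PER_FAN = 1.5  # assumed: a phone, plus the occasional tablet or watch

def mac_entries_needed(attendance, devices_per_fan=DEVICES_PER_FAN):
    """Rough count of MAC-table entries a full house could generate."""
    return int(attendance * devices_per_fan)

needed = mac_entries_needed(CAMP_RANDALL_SEATS)
print(f"potential entries at a Camp Randall sellout: {needed:,}")
print(f"old 32,000-entry core: {'OK' if needed <= 32_000 else 'overflow'}")
print(f"new 128,000-entry core: {'OK' if needed <= 128_000 else 'overflow'}")
```

Even with conservative assumptions, a single Camp Randall sellout could blow well past the old limit, while fitting comfortably inside the new one.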

Kohl Center

Camp Randall proved to be the largest test, both from an engineering and a design perspective. Built in 1917, the stadium is an open bowl that lacks the overhangs from which RF engineers love to hang antennas and other infrastructure.

“The east side of the bowl became our biggest challenge with getting the signal to penetrate deep enough into the sections,” Roberts said, adding that the problem was especially acute for seats closest to the field, where the first few rows are tarped over. Initially, APs were installed below the tarps, but the signal only carried 10 rows back.

“We ended up mounting the APs on the front, 4-6 feet up from ground level,” and above the tarps, he explained. “They don’t affect the sight lines for spectators. But getting the APs to shoulder height from waist height definitely helped us get it back to row 25.”

APs were also mounted just above the entry tunnels, where the hardware and antenna could be attached to railings and concrete. Cisco is the University of Wisconsin’s AP vendor; the deployment uses Cisco model 3700s.

Wi-Fi install over a VOM at Camp Randall (click on photo for a larger image)

Roberts and his team also ran into some structural issues with waterproofing and cabling that kept them from putting more APs in the student section. They had to re-calculate where the APs would go; consequently, coverage can be spotty in the student section, a problem exacerbated by the high density of phones in that part of the stadium. AT&T and Verizon both have DAS infrastructure in Camp Randall that helps coverage, but Roberts and his team are looking at long-term solutions for Wi-Fi coverage in that section and throughout Camp Randall.

The University of Wisconsin worked closely with AmpThink on a facility-wide Wi-Fi analysis, according to Bob Lahey, a network engineer in the athletic department. AmpThink did the design and tuning and worked out some issues in advance. “Our facilities staff and [AmpThink] discussed locations for best coverage and worked through the aesthetics before we started the project,” Lahey said. AmpThink was also onsite during the first year to see how the Wi-Fi performed with people in the bowl. “You can only figure out so much without people there,” Lahey laughed.

Getting Online at Camp Randall

The stadium’s fan-facing wireless network, Badger WiFi, is a captive portal that asks users for their name, email address and zip code. There are also two checkboxes: one that users must check to agree to the terms and conditions of service, and a second, checked by default, that allows the university to send them emails. “Our plan is to send them email surveys and allow them to remain on the system and not have to re-authenticate every time they come to one of our buildings,” Lahey said. “But if they uncheck, they have to re-authenticate.”

The university does no bandwidth limiting or throttling back usage once users are logged in. “We’ve got dual 10-gigabit links and 100-gigabit to the world, so we’re not too concerned about overall bandwidth,” Lahey said. “We limit each radio in the AP to a maximum of 200 clients. It doesn’t happen often, but we see it occasionally.” Camp Randall users normally get at least 1 Mbps bandwidth — plenty for checking scores or posting to social media, Lahey added. Kohl Center users average 40-60 Mbps because the venue is less dense.
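
The figures Lahey cites imply comfortable headroom; a minimal sketch of that arithmetic, using the uplink, floor-rate and per-radio numbers from his quotes (the math itself is ours):

```python
# Sanity-checking the quoted numbers: dual 10-gigabit uplinks against a
# 1 Mbps-per-user floor. Figures are from the story; the arithmetic is ours.

UPLINK_GBPS = 2 * 10            # dual 10-gigabit links
FLOOR_MBPS = 1                  # minimum per-user rate cited for Camp Randall
MAX_CLIENTS_PER_RADIO = 200     # per-radio cap enforced on each AP

def users_at_floor(uplink_gbps=UPLINK_GBPS, floor_mbps=FLOOR_MBPS):
    """Simultaneous users the uplinks could carry at the floor rate."""
    return uplink_gbps * 1000 // floor_mbps

# 20,000 Mbps of uplink supports 20,000 users at 1 Mbps each; the radio-side
# ceiling (roughly 1,100 APs x 2 radios x 200 clients) is nowhere near binding.
print(users_at_floor())
```

In other words, the backbone, not the per-radio client caps, sets the practical limit, and even it covers a substantial share of a Camp Randall crowd at the floor rate.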

Screen shot of Wi-Fi portal login

At present, 65-70 percent of Badger Wi-Fi clients are on 5 GHz spectrum rather than 2.4 GHz. Roberts finds the 5 GHz band easier to manage, and said users get a better experience. “If we have problems with wireless, it is most times an older couple with their iPhone 4,” Roberts said. “APs can only do so much, but sometimes a phone [using 2.4 GHz spectrum] will want to connect with an AP a half mile across the field rather than one that’s 10 feet away.”

He also said the maximum number of unique clients for Camp Randall is about 26,000, or 37 percent of the crowd. “We assume that’s going to keep growing and we’ll have to augment the system,” he said. “At some point we won’t have enough access points.”

Game Day Gets a Badger Refresh

Concurrently, the Badger Game Day smartphone app was getting new features like live video replay and interaction with Bluetooth-based beacon technology. The app’s first iteration covered only football, then expanded to all 10 sports that sell tickets; the latest version embraces all 23 sports at the University of Wisconsin, men’s and women’s. “Not many schools have all their sports represented, so while the traffic may not be high on rowing, it’s a great recruitment tool,” said Ben Fraser, director of external engagement for the athletics department. “So it helps there with the coaches sending out links or for parents and other supporters.”

It also helps with fans. “Collegiate and professional sports venues are looking for how to keep fans entertained and also allow them to participate in the game via social media and other methods,” noted Tam Flarup, director of the athletic department’s website services. When there’s a break in the action, Badger fans are busy posting to Facebook and Instagram and, of course, joining in Wisconsin’s famous Jump Around. “Twitter’s also allowing Periscope live video in its tweets now,” Flarup added. “Our fans will like that – it keeps them in the stands with a great game day atmosphere and experience.”

The university developed the first two iterations of Badger Game Day internally but chose to outsource the upgrade to sports-app developer YinzCam in June 2015 and gave them a tight deadline to meet — Aug. 30, just in time for Badger football season. YinzCam delivered on time, and then met an Oct. 15 deadline for revisions and tweaks, Fraser said.

Badger Game Day now includes live video replay from four different camera angles; YinzCam’s secret sauce makes streaming video across Wi-Fi more efficient. “Video would have been impossible without the Wi-Fi investment we made,” Fraser said.

Unlike previous iterations that only allowed the participation of a single sponsor, the new Badger Game Day app gives the university the ability to sell individual pages and sports, Fraser said.

Game day beacon message to app

Perhaps the leading edge of Badger Game Day is its use of Bluetooth-based beacon technology and messaging with geo-fencing. Gimbal Inc. worked with the university to customize the technology; Fraser and his team did some social media messaging to alert fans to the feature and to remind them to turn it on.

The first remote use of messaging with beacons and geo-fencing came at Wisconsin’s season opener at AT&T Stadium in Dallas; the feature was then used continually at both Camp Randall and the Kohl Center.

“We continued to use this messaging on the road for the Holiday Bowl in San Diego,” Fraser said. “Messages varied from welcome messages that were linked to videos from our players, to informational messages that informed fans about events, to scavenger hunts that engaged our fans at these sites.”

When users first download the app, there’s a proximity allowance message that they must activate to receive beacon messages. So far, the university has sent out 46 unique messages, 21 of which were geo-fenced. At each home game, they geo-fence Camp Randall with a welcome video from players, reaching an average of 1,160 fans per game with these welcome messages and videos.

“We’re still learning how fans are using [beacons and Bluetooth], and we’re trying not to hit them with too many ads,” Fraser said. Building that trust encourages fans to leave their Bluetooth on so the signal can find them. “And we are looking for ways to improve it,” he added. Potential future additions: features that show the length of lines at concession stands and restrooms, and an online lost and found. They’re also looking at adding more robust scheduling information inside the app — such as which broadcast network is carrying the game, along with links to Wisconsin’s video stream and live stats.

App development and a new server cost the university about $100,000, according to Fraser and Flarup. Since August 2015, there have been 123,000 downloads of Badger Game Day and nearly 1 million page views. Average time spent per game on the audio feature of the app is about 14 minutes. There’s more room to grow as fans continue to download and use the app; there’s plenty of revenue upside as well as sponsors discover multiple avenues for their messaging and content.

NCAA hoops sites get wireless upgrades to handle tourney traffic

The two “sliced” balls in the center are AT&T’s new “Ten-Ten-Antenna,” so called because it delivers 10x the cellular coverage of any previous such device. Photo: AT&T

In addition to ticket sales and hotel revenues, you can count on an NCAA basketball tournament crowd to bring wireless demands to host stadiums these days. To prepare for the expected crush, wireless carriers, third-party integrators and venues themselves have bolstered both DAS and Wi-Fi networks, especially at NRG Stadium in Houston, site of the men’s Final Four April 2 and 4.

NRG Stadium, also home to the NFL’s Houston Texans, is slated to host Super Bowl LI next February, and as such will be getting a new Wi-Fi network built by 5 Bars with gear from Extreme Networks ahead of the biggest big game. Unfortunately for data-hungry hoops fans, construction on that network won’t start until after the Final Four, meaning it will be cellular-only for the fans and followers at the championship weekend games.

But connections for customers of major carriers should be fine, since AT&T has already spent $25 million on Houston-area DAS upgrades, including at the stadium itself as well as at the convention center and other areas hosting Final Four activities. There will also be a portable Cell on Wheels, or COW, outside the convention center, where AT&T’s ball-shaped directional antennas will be bringing extra capacity to the scene.

Verizon said that it has already spent $40 million on improving its cellular infrastructure in and around NRG Stadium; inside the venue Verizon said its updated DAS deployment has 783 antennas, able to handle four times the capacity of the previous infrastructure. Outside the stadium Verizon said it has implemented an outdoor DAS to cover parking lots and tailgating areas. Verizon said it is also targeting downtown areas and the Houston airport for improvements ahead of Super Bowl LI.

At some of the regional tourney sites, third-party neutral host ExteNet Systems has been busy as well, adding capacity to some of its stadium DAS deployments as well as to one Wi-Fi network it runs at the Dunkin’ Donuts Center in Providence, R.I. At the Wells Fargo Arena in Des Moines, Iowa, ExteNet recently added U.S. Cellular to its DAS deployment, the first venue in which U.S. Cellular has been an ExteNet customer. Other ExteNet deployments that will see men’s or women’s NCAA games this year include the Barclays Center in Brooklyn, N.Y.; Bankers Life Fieldhouse in Indianapolis; and the Webster Bank Arena in Bridgeport, Conn.

In Denver at the Pepsi Center, a new (but not yet publicly announced) Wi-Fi network using Avaya gear should get a good test if it is live for the regional games there this weekend. With all these and more in place, if any fans or venues want to send in speed tests or post-game stats, we’ll happily print them.

March Madness viewing: More digital options, plus some virtual reality

Remember when college basketball tournament season only had a small slice of games available online? Or when you had to pay extra to watch online? It wasn’t that long ago. Thankfully, the future is here now, and for 2016 the college hoops postseason has even more ways to watch games on mobile or online, including one option to watch games via virtual reality programming.

Like last year, if you have a qualifying cable contract, you are basically covered and should be able to watch all the NCAA Men’s Basketball Tournament games live, on whichever platform you want. The best way to start is to head to the NCAA’s March Madness home page, where you should be able to find any and all information on devices, apps and other avenues to streaming coverage. According to Turner Sports, the NCAA and CBS Sports the games will be available live on 12 different platforms, including Amazon Fire TV, Apple TV, Roku players and Roku TV models. The new March Madness Live app isn’t available until Thursday, so check back soon for the go-to app for everything March Madness.

Also like last year, you should be able to watch a few minutes of the first game you see without having to log in — great if you are just trying to catch a buzzer beater. The games of course will be available on regular TV, and the March Madness home page has what may be a great time saver, a widget that helps you find those obscure cable channels other than CBS or TNT where the games might be on. Since we’ve just moved, MSR’s NCAA viewing team might make good use of the Zip Code-powered channel finder.

Even if you don’t have a cable contract you can still watch a lot of games that are streamed online; games broadcast on CBS will be available for no charge on desktop, mobile and tablet platforms, while games broadcast on the other channels (TNT, TBS, truTV and local channels) should be available on those providers’ websites. Again, if you get stuck or lost, just defaulting back to the March Madness home page should give you a path to whatever game you’re looking for.

Big East tourney available in VR

If you have a NextVR platform you will be able to watch the 2016 Big East tournament (it starts Thursday, March 10) thanks to a partnership between FOX Sports and NextVR. We’re not VR-savvy here at MSR headquarters yet but with seven games and 15 hours of programming scheduled this might be a cool treat for VR fans. NextVR has an instruction page on how to watch the games in VR; if anyone tries this out, send us an email with a report on how it worked (or didn’t) and we’ll let everyone else know.

Also, don’t forget — this year for the first time the NCAA Men’s Championship game, scheduled for Monday, April 4, will be on TBS, NOT on CBS, the first time the championship game has aired only on cable. There will be streaming options as well during Final Four weekend, according to the official announcement:

For the NCAA Final Four National Semifinals on Saturday, April 2, from Houston, NCAA March Madness Live will provide three distinct live video streams of both games to provide unprecedented viewing options for fans – live streaming of the traditional game coverage provided on TBS, along with “Team Stream by Bleacher Report” coverage or team-specific presentations offered via TNT and truTV. This year’s NCAA Tournament will include the National Championship airing on TBS, the first time the championship has ever been televised on cable television.

Texas A&M’s Kyle Field: A network built for speed

Full house at Kyle Field. All Photos: Paul Kapustka, MSR (click on any photo for a larger image)

Is there a combined stadium Wi-Fi and DAS deployment that is as fast as the one found at Texas A&M’s Kyle Field? If so, we haven’t seen or heard of it.

In fact, after reviewing loads of live network-performance data of Kyle Field’s new Wi-Fi and DAS in action, and after maxing out the top levels on our speed tests time after time during an informal walk-around on a game day, we’ve come to the conclusion that Kyle Field has itself a Spinal Tap of a wireless deployment. Meaning, that if other stadium networks stop at 10, this one goes to 11.

Movie references aside, quite simply, by the numbers Kyle Field’s wireless network performance is unequaled by any other large public venue’s we’ve tested in terms of raw speed and the ability to deliver bandwidth. With DAS and Wi-Fi speed measurements ranging between 40 Mbps and 60+ Mbps pretty much everywhere we roamed inside the 102,512-seat venue, it’s safe to say that the school’s desire to “build the best network” in a stadium hit its goal.

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

On one hand, the network’s top-line performance is not that much of a surprise, since as part of an overall Kyle Field renovation that has already cost an estimated $485 million, the optical-based Wi-Fi, DAS and IPTV deployment inside the Aggies’ football palace is probably among the most expensive and expansive in-venue networks ever built. According to Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System, the total cost of the optical-based Wi-Fi, DAS and IPTV network was “somewhere north of $20 million.”

Remote optical cabinet and Wi-Fi AP at Kyle Field.

And with the nation’s biggest cellular carriers, AT&T and Verizon Wireless, paying nearly half the network’s cost – $10 million, according to Ray – plus the dedication and work crews brought to the table by main suppliers IBM and Corning and Wi-Fi gear vendor Aruba, you have components, expertise and budgetary freedom that perhaps only a small group of venue owners could hope to match.

But just throwing money and technology at a stadium doesn’t necessarily produce a great network. In a venue the size of the new Kyle Field there needs to be great care and innovative thinking behind antenna placement and tuning, and in that arena Texas A&M also had the guiding hand of AmpThink, a small firm with oversized smarts in Wi-Fi deployment, as evidenced by its impressive track record of helping wireless deployments at the biggest events including several recent Super Bowls.

The core decision to go with optical for the network’s guts, and a tactical decision to put a huge chunk of the Wi-Fi APs in under-seat deployments, are just part of the strategy that produced a network that – in A&M fan parlance – can “BTHO” (Beat The Hell Out of) most challengers.

Since it’s almost impossible to directly compare stadiums and venue network performances due to all the possible variables, you’ll never hear us at Mobile Sports Report declare a “champion” when it comes to click-bait themes like “the most connected stadium ever.” Given its remote location some three hours south of Dallas in College Station, Texas, Kyle Field will almost certainly never face the ultimate “big game” pressures of a Super Bowl or a College Football Playoff championship, so the network may never know the stress such large, bucket-list gatherings can produce. And so far, there aren’t many ambitious fan-facing applications that use the network, like in-seat food delivery or wayfinding apps found in other stadiums.

But as part of the football-crazy SEC, and as the altar of pigskin worship for some of the most dedicated fans seen anywhere, Kyle Field is sure to see its share of sellout contests against SEC rivals that will push wireless usage to new heights, especially as more fans learn about and use the still-new system. Though total Wi-Fi usage at the Nov. 7 game we attended versus Auburn (a 26-10 Texas A&M loss) was “only” 2.94 terabytes – a total hampered by cold, windy and rainy conditions – an Oct. 17 game earlier in the season against Alabama saw 5.7 TB of Wi-Fi usage on the Kyle Field network, a number surpassed only by last year’s Super Bowl (with 6.2 TB of Wi-Fi use) in terms of total tonnage.

At the very least, the raw numbers of total attendees and the obvious strength of the still-new network are sure to guarantee that Kyle Field’s wireless deployment will be one of the most analyzed stadium networks for the foreseeable future.

Texas A&M student recording the halftime show.

What follows are some on-the-spot observations from our visit, which was aided by the guidance and hospitality of Corning project manager Sean Heffner, who played “tour guide” for part of the day, giving us behind-the-scenes access and views of the deployment that are unavailable to the general fan audience.

An off-campus DAS head end

This story starts not inside Kyle Field, but in a section of town just over three miles away from the stadium, on a muddy road that curves behind a funky nursery growing strange-looking plants. A gray metal box, like a big warehouse, is our destination, and the only clue as to what’s inside is the big antenna located right next to it. This structure is the Kyle Field DAS head end, where cellular carrier equipment connects to the fiber network that will bring signals to and from fans inside the stadium.

Why is the head end so far away? According to Corning’s Heffner, there was no room for this huge space inside the stadium. But thanks to the use of optical fiber, the location is not a problem, since signals traveling at the speed of light make 3.3 miles an insignificant span.
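
The physics backs him up. Assuming a typical velocity factor of about two-thirds the speed of light for signals in glass (an assumption on our part; the story gives no fiber specs), the one-way delay over 3.3 miles works out to tens of microseconds:

```python
# Propagation delay over the head-end run. The 3.3-mile distance is from the
# story; the 0.67 velocity factor is a typical figure for optical fiber.

C_VACUUM = 299_792_458   # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.67   # light in glass travels at roughly two-thirds of c
METERS_PER_MILE = 1609.344

def one_way_delay_us(miles):
    """One-way propagation delay over fiber, in microseconds."""
    meters = miles * METERS_PER_MILE
    return meters / (C_VACUUM * VELOCITY_FACTOR) * 1e6

# Roughly 26 microseconds each way: invisible next to ordinary network latency.
print(f"{one_way_delay_us(3.3):.1f} us one way")
```

Against the milliseconds of latency any internet path already carries, the head end’s distance from the stadium simply doesn’t register.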

It might be helpful to back up a bit if you haven’t heard the full story of the Kyle Field deployment, which we told last year when the job was halfway completed. Though the original plan for the stadium rebuild called for copper-based networks, a last-minute audible championed by Texas A&M chancellor John Sharp sent the school down a decidedly untraditional path: building a stadium network with a single optical-based core for Wi-Fi, DAS and IPTV. The kicker? Not only would this network have huge capacity and be future-proof against growth, it would actually cost less than a comparable copper-based deployment. If it got built on time, that is.

Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

One of the many maxed-out speed tests we took at Texas A&M’s Kyle Field.

Without going too deeply into the physics or technology, the benefits stem from the fact that optical fiber can carry far more bandwidth than copper, at farther distances, using less power. Those advantages are why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the user’s destination, through deployments like Verizon’s FiOS.

And that’s also the reason why Texas A&M could put its DAS head end out in a field where it’s easier to add to (no space constraints), because the speed of fiber makes distance somewhat irrelevant. Corning’s Heffner also said that the DAS can be managed remotely, so that staff doesn’t need to be physically present to monitor the equipment.

Of course, there was the small matter of digging trenches for optical fibers to get from the head end to the stadium, but again, for this project it is apparent that getting things done was more important than strictly worrying about costs. Beyond the cash that the carriers all put in, other vendors and construction partners all put in some extra efforts or resources – in part, probably because the value of positive publicity for being part of such an ambitious undertaking makes any extra costs easy to justify.

Keeping the best fans connected and happy

From the head end, the fiber winds its way past apartment buildings and a golf course to get to Kyle Field, the center of the local universe on football game days. Deep inside the bowels of the venue is where the fiber meets networking gear, in a room chilled to the temperature of firm ice cream. Here is where the human element that helps keep the network running spends its game days, wearing fleece and ski jackets no matter what the temperature is outside.

See the white dots? Those are under-seat Wi-Fi APs

In addition to Corning, IBM and AmpThink employees, this room during our visit also had a representative from YinzCam in attendance, a rarity for a company that prides itself on being able to have its stadium and team apps run without local supervision. But with YinzCam recently named as a partner to IBM’s nascent stadium technology practice, it’s apparent that the Kyle Field network is more than just a great service for the fans in the seats – it’s also a proof of concept network that is being closely watched by all the entities that helped bring it together, who for many reasons want to be able to catch any issues before they become problems.

How big and how ambitious is the Kyle Field network? From the outset, Corning and IBM said the Wi-Fi network part was designed to support 100,000 connections at a speed of 2 Mbps, so that if everyone in the stadium decided to log on, they’d all have decent bandwidth. But so far, that upper level hasn’t been tested yet.

What happened through the first season was a “take rate” averaging in the 35,000-37,000 range, meaning that during a game day, roughly one-third of the fans in attendance used the Wi-Fi at some point. The average concurrent user peaks – the highest numbers of fans using the network at the same time – generally averaged in the mid-20,000 range, according to figures provided by Corning and AmpThink; so instead of 100,000 fans connecting at 2 Mbps, this season there was about a quarter of that number connecting at much higher data rates, if our ad hoc speed tests are any proof.
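
Put those quoted figures side by side and the headroom is clear. A quick sketch, using our own arithmetic on the numbers Corning and AmpThink provided:

```python
# Design capacity vs. observed load at Kyle Field, using figures quoted in
# the story. The even-share number is a floor, not a prediction: real speeds
# run far higher because only a fraction of connected users transfer at once.

DESIGN_USERS = 100_000
DESIGN_MBPS_EACH = 2
aggregate_design_mbps = DESIGN_USERS * DESIGN_MBPS_EACH  # 200 Gbps designed

peak_concurrent = 25_000  # "mid-20,000 range" concurrent-user peaks
even_share_mbps = aggregate_design_mbps / peak_concurrent

print(f"designed aggregate: {aggregate_design_mbps // 1000} Gbps")
print(f"even share at observed peak: {even_share_mbps:.0f} Mbps per user")
```

Even if every concurrent user at the observed peak pulled data simultaneously, each would still get four times the 2 Mbps design floor, which is consistent with the much higher speeds we saw in our walk-around tests.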

Our first test that Saturday [Nov. 7, 2015], just inside a lower-level service entryway, hit 41.35 Mbps for download and 18.67 on the upload, on a Verizon iPhone 6 Plus over the stadium’s DAS. And yes, that download speed was the slowest we’d record all day, either on the DAS or the Wi-Fi.

Inside the control room we spent some time with AmpThink CEO Bill Anderson, who could probably use up an entire football game talking about Wi-Fi network deployment strategies if he didn’t have a big network to watch. On this Saturday the top things we learned are that Anderson and AmpThink are solid believers in under-seat AP placements for performance reasons; according to Anderson, fully 669 of Kyle Field’s 1,300 APs can be found underneath seats. Anderson is also a stickler for “real” Wi-Fi usage measurements, like weeding out devices that may have autoconnected to the Wi-Fi network but not used it from the “unique user” totals – and taking bandwidth measurements at the network firewall, to truly see how much “live” bandwidth is coming and going.
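
To illustrate the kind of filtering Anderson describes, here is a toy sketch. The session records and the 1 MB cutoff are invented for illustration; this is not AmpThink’s actual tooling, data format or methodology.

```python
# Toy sketch of weeding autoconnected devices out of "unique user" counts.
# The session records and the 1 MB threshold are invented for illustration.

sessions = [
    ("aa:bb:cc:00:00:01", 512),          # associated, but barely any traffic
    ("aa:bb:cc:00:00:02", 48_000_000),   # streamed a replay
    ("aa:bb:cc:00:00:03", 3_500_000),    # social media use
    ("aa:bb:cc:00:00:02", 1_200_000),    # same device, a later session
]

REAL_USE_BYTES = 1_000_000  # assumed cutoff separating real use from noise

def real_unique_users(records, threshold=REAL_USE_BYTES):
    """Count distinct devices that moved enough data to count as real users."""
    return len({mac for mac, nbytes in records if nbytes >= threshold})

print(real_unique_users(sessions))  # 2 of the 3 associated devices qualify
```

The point of the exercise is the same one Anderson makes: a device that merely associated with the network inflates the headline “unique user” number without representing any real demand.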

On the road to College Station, Aggie pride is everywhere. Whoop!

AmpThink’s attention to detail includes deploying and configuring APs differently depending on which section they are located in – student sections, for example, are more densely packed with people than other sections so the APs need different tuning. Corning’s Heffner also said that the oDAS – the DAS just outside the stadium – got special attention due to the large numbers of tailgating fans, both before and during the games. At the Alabama game, Heffner said there were some 30,000 fans who remained outside the stadium during the contest, never coming inside but still wanting to participate in the scene.

AmpThink, Corning, IBM and others involved at Kyle Field all seem keen on finding out just how much bandwidth stadium fans will use if you give them unlimited access. The guess? According to Corning’s Heffner, the mantra of stadium networks these days seems to be: “If you provide more capacity, it gets consumed.”

The ‘real’ 12th man

After walking through a tunnel with a nearly full cable tray overhead (“It’d be even more loaded if we were using copper,” Heffner said) we went out into the stadium itself, which was just starting to fill. Though the overcast day and intermittent rain squalls might have kept other teams’ fans from showing up for a 5:30 p.m. local start time, that simply wasn’t the case at an A&M home game.

Some of the Wi-Fi and DAS download measurements we took at Kyle Field.

As someone who’s attended countless football games, small and large – including a Super Bowl and last year’s inaugural College Football Playoff championship game – I can honestly say that the level of fan participation at Texas A&M is like nothing I’d seen before. The student section alone spans two decks on the stadium’s east side and takes up 40,000 seats, according to stadium officials – simply dwarfing anything I’d ever witnessed. (Out of an enrollment of 57,000+, having 40,000 students attend games is incredible.) And outside of small high school crowds I’d never seen an entire full stadium participate in all the school songs, the “yells” (do NOT call them “cheers” here) and the locked-arms back-and-forth “sawing” dance without any need for scoreboard instruction.

Part of the stadium renovation that closed the structure into a bowl was, according to school officials, designed to make Kyle Field even more intimidating than it already was by raising the sound levels it can produce. Unfortunately, on the night of our visit some early Auburn scores took some of the steam out of the crowd, and a driving, chilling rain that appeared just before halftime sent a good part of the crowd either home or into the concourses looking for warmth and shelter. (The next day, several columnists in the local paper admonished the fans who left early for their transgressions; how dare they depart a game whose outcome was still in doubt?)

But I’ll never forget the power of the synchronized “yells” of tens of thousands of fans during pregame, and the roar that surfaced when former Aggie QB Johnny Manziel made a surprise appearance on the field before kickoff. Seattle Seahawks fans may stake the pro claim to fan support, but if you want to determine the “real” 12th man experience you need to stop by Kyle Field and give your ears a taste of loud.

Controlling the TV with the app

If the students and alumni and other fans outside provide the vocal power, the money power that helped get the stadium rebuilt can be found in the new Kyle Field suites and premium seating areas, some of which are found on the venue’s west side, which was blown up last December and rebuilt in time for this past season.

Conduit reaching to an under-seat AP

Inside the All American Club – a behind-the-walls gathering area with catered food and bars that would not seem out of place in Levi’s Stadium or AT&T Stadium – we tested the Wi-Fi and got speeds of 63 Mbps down, 69 Mbps up; Verizon’s 4G LTE service on the DAS hit 48 Mbps/14.78 Mbps, while AT&T’s 4G LTE DAS checked in at 40 Mbps/22 Mbps.

In an actual suite where we were allowed to check out the IPTV displays, the speed tests got 67/67 for Wi-Fi and 57/12 for Verizon 4G LTE. So the well-heeled backers of A&M football shouldn’t have any problems when it comes to connectivity.

As for the IPTV controls, the new system from YinzCam solves a problem that has plagued stadium suites for as long as there have been suites: What do you do with the TV remote? What YinzCam did for Texas A&M was link the TV controls to a Texas A&M “TV Remote” app; by simply punching in a numerical code that appears at the bottom of the screen in front of you, anyone with access to a suite or club area with TVs can choose from a long list of channels, including multiple live game-day views (stadium screen, broadcast view) as well as other channels, like other games on ESPN’s SEC Network.

By pairing a static code number for each TV with a second code that randomly changes over time, the system smartly builds security into channel changing, preventing someone who had been in a suite earlier from changing the channels after they leave. The whole remote-control process took less than a minute to learn, and we had fun wandering through the club-level areas our pass gave us access to, changing screens as we saw fit.
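The rotating-code idea can be sketched in a few lines. This is purely illustrative – the shared secret, the five-minute rotation period and the four-digit format are our assumptions, not YinzCam’s actual design – but it shows why a code memorized during an earlier visit expires on its own: the on-screen code is derived from the TV’s identity plus the current time window.

```python
import hmac, hashlib, time

SECRET = b"stadium-demo-secret"  # hypothetical key shared by TVs and the control server

def rotating_code(tv_id, period_s=300, now=None):
    """Derive the short code shown on a TV; it changes every `period_s` seconds."""
    window = int((now if now is not None else time.time()) // period_s)
    digest = hmac.new(SECRET, f"{tv_id}:{window}".encode(), hashlib.sha256).hexdigest()
    return str(int(digest, 16) % 10_000).zfill(4)  # 4-digit on-screen code

def can_control(tv_id, entered_code, now=None):
    """A channel-change request is honored only while the entered code is current."""
    return hmac.compare_digest(rotating_code(tv_id, now=now), entered_code)
```

Because the code is a keyed hash of the time window rather than a stored value, no per-TV state needs to be synchronized – any server holding the secret can validate a request.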

Our favorite places to watch the game at Kyle Field were the loge-level lounges, where you could first purchase food and beverages, including alcoholic ones, at an inside bar and then sit at an outside seat with a small-screen TV in front of you for information overload. The Wi-Fi in the southwest corner loge lounge checked in at 67.03 Mbps/62.93 Mbps, so it was no problem being connected via mobile device, either.

What comes next for the Kyle Field network?

Even though the rain had started coming down harder, we left the comfort and warmth of the club levels to wander around the stadium’s upper decks, including the student section, where we watched numerous fans taking pictures or videos of the band’s halftime performance. Clearly, most everyone in Kyle Field had gotten the message and wasn’t afraid they wouldn’t be able to connect if they used their mobile devices at the game, even among 102,000 of their closest friends.

Antennas on flag poles atop seating

The question now for Kyle Field is what it does next with its network. The most obvious place for innovation is a stadium-centric app, one that could provide services like a wayfinding map. Maybe it was our round-the-stadium wandering talking, but any building that seats 102,000-plus could use an interactive map. It might also be interesting to tie a map to concessions – the night we visited, there were long lines at the few hot chocolate stands due to the cold weather; in such situations you could conceivably use the network to find out which hot chocolate stands were running low, maybe open new ones, and alert fans through the app.

We’re guessing parking and ticketing functions might also be tied to the app in the future, but for now we’ll have to wait and see what happens. One thing in Kyle Field’s favor for the future: thanks to the capacity of the optical network buildout, the stadium already has thousands of spare fiber connections that aren’t currently being used. That means when it’s time to upgrade or add more DAS antennas, Wi-Fi APs or whatever comes next, Kyle Field is already wired to handle it.

For the Nov. 7 game at Kyle Field, the final numbers included 37,121 unique users on the Wi-Fi network and a peak of 23,101 concurrent users, recorded near the end of the third quarter. The total Wi-Fi traffic that night was 2.94 TB – perhaps low or average for Kyle Field these days, but it’s helpful to remember that just three years ago that was right around the total Wi-Fi data used at a Super Bowl.
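As a back-of-envelope check (assuming decimal units, so 1 TB = 1,000,000 MB), that night’s totals work out to roughly 79 MB for every unique Wi-Fi user:

```python
total_tb = 2.94        # Wi-Fi traffic for the Nov. 7 game
unique_users = 37_121  # unique Wi-Fi users reported

mb_per_user = total_tb * 1_000_000 / unique_users  # TB -> MB, divided per user
print(round(mb_per_user, 1))  # ~79.2 MB per unique user
```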

Until the next IBM/Corning network gets built in Atlanta (at the Falcons’ new Mercedes-Benz Stadium, slated to open in 2017), the Kyle Field network will no doubt be the center of much stadium-technology market attention, especially if they ever do manage to get 100,000 fans to use the Wi-Fi all at once. While A&M’s on-the-field fortunes in the competitive SEC are a yearly question, the performance of the network in the Aggies’ stadium isn’t; right now it would certainly be one of the top four seeds, if not No. 1, if there were such a thing as a college stadium network playoff.

What we’re looking forward to is more data and more reports from a stadium with a network that can provide “that extra push over the edge” when fans want to turn their connectivity dial past 10. Remember, this one goes to 11. It’s one more.

(More photos below! And don’t forget to download your copy of the STADIUM TECH REPORT for more!)

Panoramic view of Kyle Field before the 102,000 fans fill the seats.

Some things at Kyle Field operate at ‘traditional’ speeds.

Outside the south gate before the game begins.

Overhang antenna in the middle section of the stadium.

Fans at College Football Playoff championship game use 4.9 TB of Wi-Fi data, 3.9 TB of DAS from AT&T and Verizon

Alabama coach Nick Saban hoists the college championship trophy. Photo by Kent Gidley / University of Alabama

The exciting national championship game Monday night between Alabama and Clemson also resulted in a big night for Wi-Fi and cellular usage at the University of Phoenix Stadium in Glendale, Ariz., with 4.9 terabytes of Wi-Fi data consumed, according to stadium network officials.

While the number didn’t set a stadium record — the 6.23 TB of Wi-Fi used at Super Bowl XLIX last February in the same venue is still the highest single-game Wi-Fi mark we’ve seen — the 4.9 TB used Monday nearly matches the total from last year’s inaugural College Football Playoff championship game at AT&T Stadium in Arlington, Texas, where 4.93 TB of Wi-Fi was used. It’s worth noting, however, that Monday night’s game had 75,765 fans in attendance, almost 10,000 fewer than last year’s crowd of 85,689 at the first playoff championship. So at the very least, Monday’s fans used more data per fan in attendance than last year’s game.

On the cellular side of things, however, AT&T reported that data usage on its DAS network Monday night exceeded the total from last year’s Super Bowl, with 1.9 TB carried Monday to top the 1.7 TB total AT&T recorded at Super Bowl XLIX. UPDATE, 1/26/16: Verizon has followed up with a report claiming it had 2 TB of DAS traffic at the event. So for right now the wireless total from Monday’s game stands at 8.8 TB, a number that still might grow if we ever hear from Sprint or T-Mobile.
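Both the per-fan claim and the combined wireless total check out with quick arithmetic (decimal units assumed):

```python
wifi_2016_tb, fans_2016 = 4.9, 75_765   # Monday's championship game
wifi_2015_tb, fans_2015 = 4.93, 85_689  # first CFP championship, a year earlier

mb_2016 = wifi_2016_tb * 1_000_000 / fans_2016  # ~64.7 MB per fan
mb_2015 = wifi_2015_tb * 1_000_000 / fans_2015  # ~57.5 MB per fan
assert mb_2016 > mb_2015  # Monday's fans did average more Wi-Fi data apiece

total_tb = 4.9 + 1.9 + 2.0  # Wi-Fi + AT&T DAS + Verizon DAS
print(round(total_tb, 1))   # 8.8 TB combined wireless traffic
```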

Mark Feller, vice president of information technology for the Arizona Cardinals, said that the University of Phoenix Stadium Wi-Fi network saw 23,306 unique devices connect Monday night, with a peak concurrent connected total of 17,297 devices. The stadium network also saw an additional 1.2 TB of wired data used Monday night, primarily from press and photographer Ethernet connections, Feller said.

The 4.9 TB mark unofficially puts Monday’s game in the “top four” of highest-ever single game Wi-Fi data totals we’ve seen, behind only last year’s Super Bowl, an Alabama game at Texas A&M’s Kyle Field this fall that hit 5.7 TB, and (barely) last year’s college championship game. All eyes in the Wi-Fi totals world now turn to Levi’s Stadium, where Super Bowl 50 takes place Feb. 7. Will the 6.2 TB mark survive, maybe showing that fan data use at big games has peaked? Or will a new record be set?

Report excerpt: New Wi-Fi at Ole Miss

Game day at Vaught-Hemingway Stadium. All photos: Joshua McCoy/Ole Miss Athletics (click on any photo for a larger image)

If you know anything about college football in general, and the SEC in particular, you know football in the south often means big crowds and fun game-day traditions. At the University of Mississippi — aka Ole Miss — you have the “Hotty Toddy” cheer and the renowned tailgating atmosphere in “the Grove.”

And now, you can add fan-facing Wi-Fi in Vaught-Hemingway Stadium to the mix.

While some might fret that bringing high-speed wireless communications to football stadiums takes away from the live experience, the reality of life in today’s connected society is that people expect their mobile devices to work wherever they roam, even if it’s to a place where 60,580 of their closest friends also congregate, like they do at Vaught-Hemingway on Saturdays in the fall.

Add in the desire these days for football fans to share their live experiences with friends and others over social network sites, and you can see why the demand for mobile bandwidth is now as much a part of college football as marching bands and tailgating parties.

Through a partnership with wireless service provider C Spire, and using Wi-Fi gear from Xirrus, Ole Miss brought fan-facing Wi-Fi to Vaught-Hemingway Stadium in 2014, and just finished up its second season of service. According to Michael Thompson, senior associate athletic director for communications and marketing at Ole Miss, the need for better stadium connectivity surfaced after the school started conducting fan experience research about five years ago.

“Connectivity was just one component” of the research, said Thompson, alongside questions about many different elements of the game-day experience including parking, ticket-taker friendliness, concession prices and time spent waiting in lines. And then there were questions about using mobile devices for emails or voice calls.

Walk of champions outside the stadium.

“We saw [from the surveys] that we had some issues in meeting fan needs, especially in those two areas [voice calls and email],” Thompson said. And while Vaught-Hemingway did have a neutral-host Crown Castle DAS installed several years ago, Thompson said the carrier investment in the deployment was uneven.

Bringing in ‘state of the art’ Wi-Fi

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

To bolster connectivity in a method free of the constraints of a DAS, Thompson said the school put out an RFP for stadium Wi-Fi, and found “an incredible partner” in C Spire, a leading connectivity provider in the region around the Oxford, Mississippi campus.

Among the challenges in bringing Wi-Fi to Vaught-Hemingway — a stadium whose initial version was built in 1915 — was a lack of overhangs to place Wi-Fi access points, and old construction methods that wouldn’t allow for under-the-seat APs. But using Xirrus gear, C Spire and Ole Miss found a deployment method that worked — putting a lot of APs underneath the stands, shooting upwards through the concrete.

With 820 Wi-Fi APs inside the stadium, Thompson said the “Rebel Wi-Fi” network is “absolutely a state of the art system,” supporting “tens of thousands” of fans concurrently on the network during football games. Using analytics, Thompson said “it’s interesting to watch [online] behaviors, and to see what people are doing when there are big spikes [in traffic].” Not surprisingly, Thompson said that one recurring spike happens right after each opening kickoff, “when a lot of photos get shared.”
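Thompson’s kickoff observation suggests how simple the first pass at that kind of analytics can be. The sketch below is generic – not Ole Miss’s or C Spire’s actual tooling – and just flags any minute whose throughput jumps well above the running average of the minutes before it:

```python
def find_spikes(mbps_per_minute, factor=2.0):
    """Return indices of minutes whose throughput exceeds `factor` times
    the running average of all earlier minutes (a crude spike heuristic)."""
    spikes, seen = [], []
    for minute, mbps in enumerate(mbps_per_minute):
        if seen and mbps > factor * (sum(seen) / len(seen)):
            spikes.append(minute)
        seen.append(mbps)
    return spikes

# A photo-sharing burst right after kickoff shows up as a spike at minute 3:
print(find_spikes([200, 210, 190, 650, 220]))  # [3]
```

Real deployments would smooth the baseline and correlate spikes with application-level data (which apps, which sections), but the shape of the analysis is the same.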

A small fee for non-C Spire customers

Promotion of the Wi-Fi network, Thompson said, starts with C Spire itself, since the carrier is the service provider “for a fairly large percentage of our fans.” C Spire customers can use the Wi-Fi network for free, Thompson said, and can have their devices autoconnect whenever they come to a game.

The panoramic view

Non-C Spire customers, however, must pay a small fee to use the Wi-Fi: either $25 for a full-season pass, which can be added to the cost of a season ticket, or $4.99 per game for a “day pass.” Thompson said the network has no restrictions or blocking, and has seen fans “watching another game live” while at Vaught-Hemingway.
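The pricing tiers reduce to a tiny rule; the function name is ours, but the prices are the ones quoted above:

```python
def wifi_fee(is_cspire_customer, full_season=False):
    """Rebel Wi-Fi access fee in dollars: free for C Spire customers;
    otherwise $25 for a full-season pass or $4.99 per game."""
    if is_cspire_customer:
        return 0.0
    return 25.0 if full_season else 4.99
```

At $4.99 a game, a fan attending more than five home games comes out ahead with the season pass.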

While it might take time to become a hallowed tradition, it’s a good bet that over time the Ole Miss fans will become as accustomed to taking and sharing videos, photos and texts as they are to rooting together and congregating along the “walk of champions” before games. It might not date back to 1915, but it’s an amenity that many mobile-device owners will cherish once they find out it’s there.

“There’s still a lot of people who just accept that it’s going to be hard to connect [at a stadium] because they were trained to think that for so long,” Thompson said. “Connectivity just dropped off their radar.”