Texas A&M’s Kyle Field: A network built for speed

Full house at Kyle Field. All Photos: Paul Kapustka, MSR (click on any photo for a larger image)

Is there a combined stadium Wi-Fi and DAS deployment that is as fast as the one found at Texas A&M’s Kyle Field? If so, we haven’t seen or heard of it.

In fact, after reviewing loads of live network-performance data of Kyle Field’s new Wi-Fi and DAS in action, and after maxing out the top levels on our speed tests time after time during an informal walk-around on a game day, we’ve come to the conclusion that Kyle Field has itself a Spinal Tap of a wireless deployment. Meaning, if other stadium networks stop at 10, this one goes to 11.

Movie references aside, quite simply, by the numbers Kyle Field’s wireless network performance is unequaled, in terms of raw speed and the ability to deliver bandwidth, by any other large public venue we’ve tested. With DAS and Wi-Fi speed measurements ranging between 40 Mbps and 60+ Mbps pretty much everywhere we roamed inside the 102,512-seat venue, it’s a safe bet that the school’s desire to “build the best network” in a stadium was fulfilled about as well as it could be.

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

On one hand, the network’s top-line performance is not that much of a surprise, since as part of an overall Kyle Field renovation that has already cost an estimated $485 million, the optical-based Wi-Fi, DAS and IPTV deployment inside the Aggies’ football palace is probably among the most expensive and expansive in-venue networks ever built. According to Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System, the total cost of the optical-based Wi-Fi, DAS and IPTV network was “somewhere north of $20 million.”

Remote optical cabinet and Wi-Fi AP at Kyle Field.

And with the nation’s biggest cellular carriers, AT&T and Verizon Wireless, paying nearly half the network’s cost – $10 million, according to Ray – and with the dedication and work crews brought to the table by main suppliers IBM and Corning and Wi-Fi gear vendor Aruba, you have components, expertise and budgetary freedom that perhaps only a small group of venue owners could hope to match.

But just throwing money and technology at a stadium doesn’t necessarily produce a great network. In a venue the size of the new Kyle Field there needs to be great care and innovative thinking behind antenna placement and tuning, and in that arena Texas A&M also had the guiding hand of AmpThink, a small firm with oversized smarts in Wi-Fi deployment, as evidenced by its impressive track record of helping wireless deployments at the biggest events including several recent Super Bowls.

The core decision to go with optical for the network’s guts, and a tactical decision to put a huge chunk of the Wi-Fi APs in under-seat deployments, are just part of the strategy that produced a network that – in A&M fan parlance – can “BTHO” (Beat The Hell Out of) most challengers.

Since it’s almost impossible to directly compare stadiums and venue network performances due to all the possible variables, you’ll never hear us at Mobile Sports Report declare a “champion” when it comes to click-bait themes like “the most connected stadium ever.” Given its remote location some three hours south of Dallas in College Station, Texas, Kyle Field will almost certainly never face the ultimate “big game” pressures of a Super Bowl or a College Football Playoff championship, so the network may never know the stress such large, bucket-list gatherings can produce. And so far, there aren’t many ambitious fan-facing applications that use the network, like in-seat food delivery or wayfinding apps found in other stadiums.

But as part of the football-crazy SEC, and as the altar of pigskin worship for some of the most dedicated fans seen anywhere, Kyle Field is sure to see its share of sellout contests against SEC rivals that will push wireless usage to new heights, especially as more fans learn about and use the still-new system. Though total Wi-Fi usage at the Nov. 7 game we attended versus Auburn (a 26-10 Texas A&M loss) was “only” 2.94 terabytes – a total hampered by cold, windy and rainy conditions – an Oct. 17 game earlier in the season against Alabama saw 5.7 TB of Wi-Fi usage on the Kyle Field network, a number surpassed only by last year’s Super Bowl (with 6.2 TB of Wi-Fi use) in terms of total tonnage.

At the very least, the raw numbers of total attendees and the obvious strength of the still-new network are sure to guarantee that Kyle Field’s wireless deployment will be one of the most analyzed stadium networks for the foreseeable future.

Texas A&M student recording the halftime show.

What follows are some on-the-spot observations from our visit, which was aided by the guidance and hospitality of Corning project manager Sean Heffner, who played “tour guide” for part of the day, giving us behind-the-scenes access and views of the deployment that are unavailable to the general fan audience.

An off-campus DAS head end

This story starts not inside Kyle Field, but in a section of town just over three miles away from the stadium, on a muddy road that curves behind a funky nursery growing strange-looking plants. A gray metal box, like a big warehouse, is our destination, and the only clue as to what’s inside is the big antenna located right next to it. This structure is the Kyle Field DAS head end, where cellular carrier equipment connects to the fiber network that will bring signals to and from fans inside the stadium.

Why is the head end so far away? According to Corning’s Heffner there was no room for this huge space inside the stadium. But thanks to the use of optical fiber, the location is not a problem, since signals traveling at the speed of light make 3.3 miles an insignificant span.
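
For a sense of just how insignificant that span really is, here’s a quick back-of-the-envelope sketch (our arithmetic, not a Corning figure), assuming the commonly used approximation that signals move through glass fiber at roughly two-thirds the speed of light in a vacuum:

```python
# Rough one-way propagation delay over ~3.3 miles of optical fiber.
# Assumption: signals in glass travel at roughly 2/3 the vacuum speed of light.

MILES_TO_METERS = 1609.34
C_VACUUM = 3.0e8                  # speed of light in vacuum, m/s
FIBER_SPEED = C_VACUUM * 2 / 3    # approximate signal speed in fiber, m/s

distance_m = 3.3 * MILES_TO_METERS
delay_s = distance_m / FIBER_SPEED

print(f"One-way delay: {delay_s * 1e6:.1f} microseconds")
# Roughly 27 microseconds -- thousands of times smaller than any latency
# a fan would notice on a Wi-Fi or LTE connection.
```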

It might be helpful to back up a bit if you haven’t heard the full story of the Kyle Field deployment, which we told last year when the job was halfway completed. Though the rebuilding of the stadium was started with copper-based networks as the original plan, a last-minute audible championed by Texas A&M chancellor John Sharp sent the school on a decidedly untraditional path, by building a stadium network with a single optical-based core for Wi-Fi, DAS and IPTV networks. The kicker? Not only would this network have huge capacity and be future-proof against growth, it would actually cost less than a comparable copper-based deployment. If it got built on time, that is.

Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

One of the many maxed-out speed tests we took at Texas A&M’s Kyle Field.

Without going too deeply into the physics or technology, a simple explanation is that optical fiber can carry far more bandwidth than copper, over longer distances, using less power. Those advantages are why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the end user through deployments like Verizon’s FiOS.

And that’s also why Texas A&M could put its DAS head end out in a field where it’s easier to add to (no space constraints): the speed of fiber makes distance somewhat irrelevant. Corning’s Heffner also said that the DAS can be managed remotely, so staff doesn’t need to be physically present to monitor the equipment.

Of course, there was the small matter of digging trenches for optical fibers to get from the head end to the stadium, but again, for this project it is apparent that getting things done was more important than strictly worrying about costs. Beyond the cash that the carriers all put in, other vendors and construction partners all put in some extra efforts or resources – in part, probably because the value of positive publicity for being part of such an ambitious undertaking makes any extra costs easy to justify.

Keeping the best fans connected and happy

From the head end, the fiber winds its way past apartment buildings and a golf course to get to Kyle Field, the center of the local universe on football game days. Deep inside the bowels of the venue is where the fiber meets networking gear, in a room chilled to the temperature of firm ice cream. Here is where the human element that helps keep the network running spends its game days, wearing fleece and ski jackets no matter what the temperature is outside.

See the white dots? Those are under-seat Wi-Fi APs

In addition to Corning, IBM and AmpThink employees, this room during our visit also had a representative from YinzCam in attendance, a rarity for a company that prides itself on being able to have its stadium and team apps run without local supervision. But with YinzCam recently named as a partner to IBM’s nascent stadium technology practice, it’s apparent that the Kyle Field network is more than just a great service for the fans in the seats – it’s also a proof of concept network that is being closely watched by all the entities that helped bring it together, who for many reasons want to be able to catch any issues before they become problems.

How big and how ambitious is the Kyle Field network? From the outset, Corning and IBM said the Wi-Fi part of the network was designed to support 100,000 connections at a speed of 2 Mbps each, so that if everyone in the stadium decided to log on, they’d all have decent bandwidth. So far, that upper limit hasn’t been tested.

What happened through the first season was a “take rate” averaging in the 35,000-37,000 range, meaning that during a game day, roughly one-third of the fans in attendance used the Wi-Fi at some point. Peak concurrent users – the highest number of fans using the network at the same time – generally landed in the mid-20,000 range, according to figures provided by Corning and AmpThink; so instead of 100,000 fans connecting at 2 Mbps, this season saw about a quarter of that number connecting at much higher data rates, if our ad hoc speed tests are any proof.
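
To put those figures in perspective, here’s the simple arithmetic behind them (the capacity and per-user rate are the stated design targets, and the usage numbers are the rough ranges quoted above; nothing here is a new measurement):

```python
# Rough arithmetic behind the Kyle Field design target and first-season usage.
# Design figures are the stated targets; usage figures are rough ranges.

seats = 102_512                    # Kyle Field capacity
design_users, design_mbps = 100_000, 2
unique_users = 36_000              # midpoint of the 35,000-37,000 take-rate range
peak_concurrent = 25_000           # "mid-20,000" peak concurrency

print(f"Design aggregate: {design_users * design_mbps / 1000:.0f} Gbps")
print(f"Take rate: {unique_users / seats:.0%} of a sellout crowd")
print(f"Peak concurrency: {peak_concurrent / seats:.0%} of a sellout crowd")
# Roughly 200 Gbps of designed headroom, a ~35 percent take rate and
# ~24 percent peak concurrency.
```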

Our first test that Saturday [Nov. 7, 2015], just inside a lower-level service entryway, hit 41.35 Mbps for download and 18.67 on the upload, on a Verizon iPhone 6 Plus over the stadium’s DAS. And yes, that download speed was the slowest we’d record all day, either on the DAS or the Wi-Fi.

Inside the control room we spent some time with AmpThink CEO Bill Anderson, who could probably use up an entire football game talking about Wi-Fi network deployment strategies if he didn’t have a big network to watch. On this Saturday the top thing we learned about Kyle Field is that Anderson and AmpThink are solid believers in under-seat AP placements for performance reasons; according to Anderson, fully 669 of the stadium’s 1,300 APs can be found underneath seats. Anderson is also a stickler for “real” Wi-Fi usage measurements, like weeding out devices that may have autoconnected to the Wi-Fi network but never used it from the “unique user” totals, and taking bandwidth measurements at the network firewall to truly see how much “live” bandwidth is coming and going.

On the road to College Station, Aggie pride is everywhere. Whoop!

AmpThink’s attention to detail includes deploying and configuring APs differently depending on which section they are located in – student sections, for example, are more densely packed with people than other sections so the APs need different tuning. Corning’s Heffner also said that the oDAS – the DAS just outside the stadium – got special attention due to the large numbers of tailgating fans, both before and during the games. At the Alabama game, Heffner said there were some 30,000 fans who remained outside the stadium during the contest, never coming inside but still wanting to participate in the scene.

AmpThink, Corning, IBM and others involved at Kyle Field all seem keen on finding out just how much bandwidth stadium fans will use if you give them unlimited access. The guess? According to Corning’s Heffner, the mantra of stadium networks these days seems to be: “If you provide more capacity, it gets consumed.”

The ‘real’ 12th man

After walking through a tunnel with a nearly full cable tray overhead (“It’d be even more loaded if we were using copper,” Heffner said) we went out into the stadium itself, which was just starting to fill. Though the overcast day and intermittent rain squalls might have kept other teams’ fans from showing up for a 5:30 p.m. local start time, that simply wasn’t the case at an A&M home game.

Some of the Wi-Fi and DAS download measurements we took at Kyle Field.

As someone who’s attended countless football games, small and large – including a Super Bowl and last year’s inaugural College Football Playoff championship game – I can honestly say that the level of fan participation at Texas A&M is like nothing I’d seen before. The student section alone spans two decks on the stadium’s east side and takes up 40,000 seats, according to stadium officials – simply dwarfing anything I’d ever witnessed. (Out of an enrollment of 57,000+, having 40,000 students attend games is incredible.) And outside of small high school crowds I’d never seen an entire stadium participate in all the school songs, the “yells” (do NOT call them “cheers” here) and the locked-arms back-and-forth “sawing” dance without any need for scoreboard instruction.

Part of the stadium renovation that closed the structure into a bowl was, according to school officials, designed to make Kyle Field even more intimidating than it already was by raising the possible sound levels. Unfortunately the night of our visit some early Auburn scores took some of the steam out of the crowd, and a driving, chilling rain that appeared just before halftime sent a good part of the crowd either home or into the concourses looking for warmth and shelter. (The next day, several columnists in the local paper admonished the fans who left early for their transgressions; how dare they depart a game whose outcome was still in doubt?)

But I’ll never forget the power of the synchronized “yells” of tens of thousands of fans during pregame, and the roar that surfaced when former Aggie QB Johnny Manziel made a surprise appearance on the field before kickoff. Seattle Seahawks fans may stake the pro claim to fan support, but if you want to determine the “real” 12th man experience you need to stop by Kyle Field and give your ears a taste of loud.

Controlling the TV with the app

If the students and alumni and other fans outside provide the vocal power, the money power that helped get the stadium rebuilt can be found in the new Kyle Field suites and premium seating areas, some of which are found on the venue’s west side, which was blown up last December and rebuilt in time for this past season.

Conduit reaching to an under-seat AP

Inside the All American Club – a behind-the-walls gathering area with catered food and bars that would not seem out of place in Levi’s Stadium or AT&T Stadium – we tested the Wi-Fi and got speeds of 63 Mbps down, 69 Mbps up; Verizon’s 4G LTE service on the DAS hit 48 Mbps/14.78 Mbps, while AT&T’s 4G LTE DAS checked in at 40 Mbps/22 Mbps.

In an actual suite where we were allowed to check out the IPTV displays, the speed tests got 67/67 for Wi-Fi and 57/12 for Verizon 4G LTE. So the well-heeled backers of A&M football shouldn’t have any problems when it comes to connectivity.

As for the IPTV controls, the new system from YinzCam solves one of the problems that has plagued stadium suites for as long as there have been suites: What do you do with the TV remote? What YinzCam did for Texas A&M was link the TV controls to a Texas A&M “TV Remote” app; by simply punching in a numerical code that appears on the bottom of the screen in front of you, anyone with access to a suite or club area with TVs can change the channel to a long list of selections, including multiple live game-day views (stadium screen, broadcast view) as well as other channels, like other games on ESPN’s SEC Network.

By pairing a static code number for each TV with another set of numbers that scrambles randomly over time, the system smartly builds security into the channel-changing process, and prevents someone who had been in a suite earlier from changing the channels after they leave. The whole remote-control process took less than a minute to learn, and we had fun wandering through the club-level areas our pass gave us access to, changing screens as we saw fit.
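
For readers curious how that kind of pairing can work in general, here is a minimal, purely hypothetical sketch of a “static TV ID plus rotating code” check. To be clear, this is our illustration of the general idea, not YinzCam’s actual implementation; the rotation window, secret and code length are all assumptions we made up for the example:

```python
# Hypothetical sketch of a "static TV ID + rotating code" check, in the
# spirit of the pairing scheme described above. Purely illustrative --
# not YinzCam's actual implementation; the rotation window, secret and
# code length are assumptions.
import hashlib
import hmac
import time

ROTATION_SECONDS = 300   # assumed rotation window (e.g., five minutes)
SECRET = b"per-venue secret provisioned to each TV endpoint"  # placeholder

def rotating_code(tv_id: str) -> str:
    """Return the four-digit code a given TV would display right now."""
    window = int(time.time() // ROTATION_SECONDS)
    digest = hmac.new(SECRET, f"{tv_id}:{window}".encode(), hashlib.sha256)
    return f"{int.from_bytes(digest.digest()[:4], 'big') % 10_000:04d}"

def can_control(tv_id: str, code_entered: str) -> bool:
    """Allow a channel change only if the entered code matches the one
    currently shown on that TV's screen."""
    return hmac.compare_digest(code_entered, rotating_code(tv_id))

# A suite guest types the code shown on a hypothetical "suite 214, TV 1":
print(can_control("suite-214-tv-1", rotating_code("suite-214-tv-1")))  # True
print(can_control("suite-214-tv-1", "0000"))  # False, except by rare chance
```

The property that matters is that the displayed code is only valid for the current rotation window, so a guest who left the suite an hour ago has nothing useful left to punch in.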

Our favorite places to watch the game at Kyle Field were the loge-level lounges, where you could first purchase food and beverages, including alcoholic ones, at an inside bar and then sit at an outside seat with a small-screen TV in front of you for information overload. The Wi-Fi in the southwest corner loge lounge checked in at 67.03/62.93, so it was no problem being connected via mobile device, either.

What comes next for the Kyle Field network?

Even though the rain had started coming down harder, we left the comfort and warmth of the club levels to wander around the stadium’s upper decks, including the student section, where we watched numerous fans taking pictures or videos of the band’s halftime performance. Clearly most everyone in Kyle Field had gotten the message and wasn’t afraid they wouldn’t be able to connect if they used their mobile devices at the game, even among 102,000 of their closest friends.

Antennas on flag poles atop seating

The question now for Kyle Field is: what does it do next with its network? The most obvious place for innovation or new features is a stadium-centric app, one that could provide services like a wayfinding map. Maybe it was our round-the-stadium wandering that produced confusion finding our way around, but any building that seats 102,000-plus could use an interactive map. It might also be interesting to tie a map to concessions – the night we visited, there were long lines at the few hot chocolate stands due to the cold weather; in such situations you could conceivably use the network to find out where hot chocolate stands were running low, maybe open new ones and alert fans through the app.

We’re guessing parking and ticketing functions might also be tied to the app in the future, but for now we’ll have to wait and see what happens. One thing in Kyle Field’s favor for the future: thanks to the capacity of the optical network buildout, the stadium already has thousands of spare fiber connections that aren’t currently being used. That means when it’s time to upgrade or add more DAS antennas, Wi-Fi APs or whatever comes next, Kyle Field is already wired to handle it.

For the Nov. 7 game at Kyle Field, the final numbers included 37,121 unique users of the Wi-Fi network, and a peak concurrent user number of 23,101 taken near the end of the 3rd quarter. The total traffic used on the Wi-Fi network that night was 2.94 TB, perhaps low or average for Kyle Field these days but it’s helpful to remember that just three years ago that was right around the total Wi-Fi data used at a Super Bowl.

Until the next IBM/Corning network gets built in Atlanta (at the Falcons’ new Mercedes-Benz Stadium, slated to open in 2017), the Kyle Field network will no doubt be the center of much stadium-technology market attention, especially if they ever do manage to get 100,000 fans to use the Wi-Fi all at once. While A&M’s on-the-field fortunes in the competitive SEC are a yearly question, the performance of the network in the Aggies’ stadium isn’t; right now it would certainly be one of the top four seeds, if not No. 1, if there were such a thing as a college stadium network playoff.

What we’re looking forward to is more data and more reports from a stadium with a network that can provide “that extra push over the edge” when fans want to turn their connectivity dial past 10. Remember, this one goes to 11. It’s one more.

(More photos below! And don’t forget to download your copy of the STADIUM TECH REPORT for more!)

Panoramic view of Kyle Field before the 102,000 fans fill the seats.

Some things at Kyle Field operate at ‘traditional’ speeds.

Outside the south gate before the game begins.

Overhang antenna in the middle section of the stadium.

Fans at College Football Playoff championship game use 4.9 TB of Wi-Fi data, 3.9 TB of DAS from AT&T and Verizon

Alabama coach Nick Saban hoists the college championship trophy. Photo by Kent Gidley / University of Alabama

The exciting national championship game Monday night between Alabama and Clemson also resulted in a big night for Wi-Fi and cellular usage at the University of Phoenix Stadium in Glendale, Ariz., with 4.9 terabytes of Wi-Fi data consumed, according to stadium network officials.

While the number didn’t set a stadium record — the 6.23 TB of Wi-Fi used at Super Bowl XLIX last February in the same venue is still the highest single-game Wi-Fi mark we’ve seen — the 4.9 TB used Monday nearly matches the total from last year’s inaugural College Football Playoff championship game at AT&T Stadium in Arlington, Texas, where 4.93 TB of Wi-Fi was used. It’s worth noting, however, that Monday night’s game had 75,765 fans in attendance, almost 10,000 fewer than last year’s crowd of 85,689 at the first playoff championship. So at the very least, Monday’s fans used more data per fan in attendance than last year’s crowd did.
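
Here’s the quick per-fan math behind that observation, using only the attendance and Wi-Fi totals quoted above (and treating a terabyte as 10^12 bytes):

```python
# Per-fan Wi-Fi usage implied by the game totals above (1 TB = 10**12 bytes).

games = {
    "2016 CFP title game, Glendale": (4.90e12, 75_765),
    "2015 CFP title game, Arlington": (4.93e12, 85_689),
}

for name, (total_bytes, attendance) in games.items():
    print(f"{name}: {total_bytes / attendance / 1e6:.0f} MB per fan")
# Roughly 65 MB per fan in Glendale vs. about 58 MB per fan in Arlington.
```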

On the cellular side of things, however, AT&T reported that data usage on its DAS network Monday night exceeded the total from last year’s Super Bowl, with 1.9 TB carried Monday to top the 1.7 TB total AT&T recorded at Super Bowl XLIX. UPDATE, 1/26/16: Verizon has followed up with a report claiming it had 2 TB of DAS traffic at the event. So for right now the wireless total from Monday’s game stands at 8.8 TB, a number that still might grow if we ever hear from Sprint or T-Mobile.

Mark Feller, vice president of information technology for the Arizona Cardinals, said that the University of Phoenix Stadium Wi-Fi network saw 23,306 unique devices connect Monday night, with a peak concurrent connected total of 17,297 devices. The stadium network also saw an additional 1.2 TB of wired data used Monday night, primarily from press and photographer Ethernet connections, Feller said.

The 4.9 TB mark unofficially puts Monday’s game in the “top four” of highest-ever single game Wi-Fi data totals we’ve seen, behind only last year’s Super Bowl, an Alabama game at Texas A&M’s Kyle Field this fall that hit 5.7 TB, and (barely) last year’s college championship game. All eyes in the Wi-Fi totals world now turn to Levi’s Stadium, where Super Bowl 50 takes place Feb. 7. Will the 6.2 TB mark survive, maybe showing that fan data use at big games has peaked? Or will a new record be set?

AT&T: NFL fans used 55% more DAS data this year

AT&T customers who visited NFL stadiums this season used 55 percent more cellular data than they did last season, according to some year-end figures from AT&T.

In the 31 different NFL venues where there is an AT&T DAS, AT&T customers used 132.8 terabytes of cellular data this NFL season, with the Dec. 14 Monday night game between the New York Jets and the Miami Dolphins topping the single-game charts with 1.6 TB of DAS data used, according to AT&T. It’s appropriate that Sun Life Stadium had the biggest data game, since Miami’s home also led the NFL for highest average DAS data used, with 1.4 TB per game. Close behind in second place for average DAS use was AT&T Stadium in Arlington, Texas, where the average hit 1.257 TB this season. Third was San Diego’s Qualcomm Stadium with 1.085 TB, and fourth was Levi’s Stadium in Santa Clara, Calif., with an average of 1.054 TB per game on the AT&T DAS.

Report excerpt: New Wi-Fi at Ole Miss

Game day at Vaught-Hemingway Stadium. All photos: Joshua McCoy/Ole Miss Athletics (click on any photo for a larger image)

If you know anything about college football in general, and the SEC in particular, you know football in the south often means big crowds and fun game-day traditions. At the University of Mississippi — aka Ole Miss — you have the “Hotty Toddy” cheer and the renowned tailgating atmosphere in “the Grove.”

And now, you can add fan-facing Wi-Fi in Vaught-Hemingway Stadium to the mix.

While some might fret that bringing high-speed wireless communications to football stadiums takes away from the live experience, the reality of life in today’s connected society is that people expect their mobile devices to work wherever they roam, even if it’s to a place where 60,580 of their closest friends also congregate, like they do at Vaught-Hemingway on Saturdays in the fall.

Add in the desire these days for football fans to share their live experiences with friends and others over social network sites, and you can see why the demand for mobile bandwidth is now as much a part of college football as marching bands and tailgating parties.

Through a partnership with wireless service provider C Spire, and using Wi-Fi gear from Xirrus, Ole Miss brought fan-facing Wi-Fi to Vaught-Hemingway Stadium in 2014, and just finished up its second season of service. According to Michael Thompson, senior associate athletic director for communications and marketing at Ole Miss, the need for better stadium connectivity surfaced after the school started conducting fan-experience research about five years ago.

“Connectivity was just one component” of the research, said Thompson, alongside questions about many different elements of the game-day experience including parking, ticket-taker friendliness, concession prices and time spent waiting in lines. And then there were questions about using mobile devices for emails or voice calls.

Walk of champions outside the stadium.

“We saw [from the surveys] that we had some issues in meeting fan needs, especially in those two areas [voice calls and email],” Thompson said. And while Vaught-Hemingway did have a neutral-host Crown Castle DAS installed several years ago, Thompson said the carrier investment in the deployment was uneven.

Bringing in ‘state of the art’ Wi-Fi

To bolster connectivity in a method free of the constraints of a DAS, Thompson said the school put out an RFP for stadium Wi-Fi, and found “an incredible partner” in C Spire, a leading connectivity provider in the region around the Oxford, Mississippi campus.

Among the challenges in bringing Wi-Fi to Vaught-Hemingway — a stadium whose initial version was built in 1915 — was a lack of overhangs to place Wi-Fi access points, and old construction methods that wouldn’t allow for under-the-seat APs. But using Xirrus gear, C Spire and Ole Miss found a deployment method that worked — putting a lot of APs underneath the stands, shooting upwards through the concrete.

With 820 Wi-Fi APs inside the stadium, Thompson said the “Rebel Wi-Fi” network is “absolutely a state of the art system,” supporting “tens of thousands” of fans concurrently on the network during football games. Using analytics, Thompson said “it’s interesting to watch [online] behaviors, and to see what people are doing when there are big spikes [in traffic].” Not surprisingly, Thompson said that one recurring spike happens right after each opening kickoff, “when a lot of photos get shared.”

A small fee for non-C Spire customers

Promotion of the Wi-Fi network, Thompson said, starts with C Spire itself, since the carrier is the service provider “for a fairly large percentage of our fans.” C Spire customers can use the Wi-Fi network for free, Thompson said, and can have their devices autoconnect whenever they come to a game.

The panoramic view

Non-C Spire customers, however, must pay a small fee to use the Wi-Fi: they can either add a full-season Wi-Fi pass to the cost of a season ticket for $25, or buy a “day pass” for $4.99 per game. Thompson said the network has no restrictions or blocking, and he has seen fans “watching another game live” while at Vaught-Hemingway.

While it might take time to become a hallowed tradition, it’s a good bet that over time Ole Miss fans will become as used to taking and sharing videos, photos and texts as they are to rooting together and congregating along the “walk of champions” before games. It might not date back to 1915, but it’s an amenity that many mobile-device owners will cherish once they find out it’s there.

“There’s still a lot of people who just accept that it’s going to be hard to connect [at a stadium] because they were trained to think that for so long,” Thompson said. “Connectivity just dropped off their radar.”

Cuban: Fans shouldn’t look at phones ‘while the ball is in the air’

Mark Cuban during CES panel. All photos: Paul Kapustka, MSR

LAS VEGAS, CES 2016 — On the subject of wireless technology inside stadiums, Dallas Mavericks owner Mark Cuban is historically painted as an anti-tech crusader, based on an old story that has become more myth than truth, especially in the stadium-tech marketplace. Thursday at CES, Cuban clarified his thinking on wireless technology use during sports events, with a very clear nuance that shows the deep thinking that makes him a popular analyst on numerous topics.

Cuban, perhaps better known outside the sports world these days for his reality/investment TV show Shark Tank, laid out his position on in-stadium wireless use during a panel discussion that was part of a special sports/tech series here hosted by Turner Sports. Since his team’s arena has a robust Wi-Fi network, Cuban clearly isn’t against good connectivity anymore, and he said Thursday that during breaks in game action, wireless technology should help fans do as much as possible to ease the game-day experience.

But when the “ball is in the air,” Cuban said, he still thinks fans should put phones back in their pockets or purses.

“Anytime I see someone looking at a phone [during play] I feel like we lose a little bit of them,” Cuban said. “Technology can work against you in an arena. You have to be very careful that you don’t do anything that will take the game away.”

Shaq greets fans after panel

Using tech to take away pain points

To be sure, wireless technology is only going to increase in NBA arenas, especially when the Sacramento Kings’ new Golden 1 Center opens this fall with one of the densest Wi-Fi deployments in any arena. Fellow panelist Shaquille O’Neal waxed eloquent about the Kings’ planned use of wireless technology to support wireless ticketing and marketing integration, all for the benefit of the fan experience.

In an earlier panel, NBA commissioner Adam Silver said league teams “have done a great job” making sure the connectivity inside arenas is a similar experience to “what people get at home.” But even with enough bandwidth to watch the game live at courtside on a phone, almost all of the panelists Thursday were in agreement that the live game experience would still remain wildly popular, even as technologies like virtual reality and on-player cameras make the TV experience that much better.

“People still crave the ability to be around other people,” said Silver, who called sports stadiums “the modern town hall” while noting that NBA season ticket sales were currently at all-time highs. Vivek Ranadive, owner of the Sacramento Kings, said during another panel that live streaming video and other over-the-top Internet experiences only serve to make the live game attendance that much more attractive.

NBA commissioner Adam Silver

“Only 18,000 people can come to the stadium,” said Ranadive, noting the capacity of the Golden 1 Center, slated to open for the 2016-17 season. The streaming video and social media outreach by the team, he said, “drives demand for the in-stadium experience.”

And that’s an experience, Cuban said, that simply can’t be duplicated at home, no matter how big a screen or how comfortable a couch.

“When the outcome of a game is hanging on a shot, if you’re there you’re holding your breath while the ball is in the air,” Cuban said. One fan told Cuban that he “did a big tree hug” on a total stranger after a recent last-second win by the Mavericks. “You’re not going to do that with some stranger in your living room,” Cuban said. “The energy you feel [in the stadium] is the most valuable part of the product we own.”

Ruckus, DAS Group Professionals, CommScope and Brocade all part of Sacramento Kings’ new tech-forward stadium

Golden 1 Center in Sacramento taking shape earlier this summer. All photos: Paul Kapustka, MSR (click on any photo for a larger image)

Thursday morning at CES here in Las Vegas, Sacramento Kings owner Vivek Ranadive is scheduled to speak and will no doubt tell the CES attendees about the Kings’ new stadium, the Golden 1 Center, and about how tech-loaded it is by design. But Wednesday night, details emerged about the vendors helping the Kings with their extensive wireless deployment, and the list includes Ruckus Wireless, DAS Group Professionals, Brocade and CommScope, among others.

As previously reported by Mobile Sports Report, Ruckus gear will be used in the Wi-Fi deployment not just at the 17,500-seat Golden 1 Center, but also in the surrounding area, which is supposed to include a new public plaza and other developments, including hotel, office, housing and retail space. In the press announcement of all the tech underpinnings, the Kings do not state exactly how many Wi-Fi APs will be in the stadium proper, but instead say that there will be “more than 1,000” APs across the stadium, surrounding plaza and developments. UPDATE, 1/10/16: The Kings have responded to clarify, saying there isn’t yet an exact AP count but density is expected to be in the area of one AP per 15 seats, which would put the final total well over 1,000 APs, easily the most APs for a Wi-Fi deployment in any basketball/hockey arena we know of, and perhaps the densest of any sporting venue (for now).
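
For what it’s worth, here’s what that stated density would imply for the arena bowl alone; this is just our arithmetic on the numbers above, since the Kings haven’t published a final count:

```python
# Implied in-bowl AP count at the stated density of one AP per 15 seats.
# The Kings have not published a final number; this is just the arithmetic.

seats = 17_500
seats_per_ap = 15
print(f"Implied in-bowl APs: {seats / seats_per_ap:.0f}")   # roughly 1,167
```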

Since we’re nit-picking, we’ll also question the Kings’ claim that Golden 1 Center will be “the first arena in the world to implement wide-band, multimode fiber technology” on the backbone, a curious claim since the fiber-based network at Texas A&M’s Kyle Field is already operational (and working quite well). UPDATE, 1/10/16: The Kings have responded and say that their implementation differs from Texas A&M’s passive optical network; we will provide further details and comparisons in the near future.

The DGP team at Levi’s Stadium for a summer interview included, L to R, Derek Cotton, director of engineering; Steve Dutto, president; and Vince Gamick, VP and COO. These guys are probably smiling again now that DGP will be part of the Golden 1 Center deployment.

Frothy claims aside, we are very interested in hearing more about the venue’s tech underpinnings, especially the combined DAS/small cell deployment being installed by DAS Group Professionals, the builders of the DAS network at Levi’s Stadium in Santa Clara, Calif. According to the Kings’ release wireless powerhouse CommScope will be part of the infrastructure as well (along with bandwidth provider Comcast, a deal that was announced last month), and network backbone gear provider Brocade will also be involved, making Golden 1 Center a mini-me kind of version of Levi’s Stadium, where Comcast, Brocade and DGP are all also involved. (This is also not so surprising since we have heard rumors that the Kings hired some IT folks who previously worked on the Levi’s Stadium deployment.)

If there is an outlier to the deal it’s the Wi-Fi presence of Ruckus, which has had a tough year when it comes to potential stadium deployments. First Ruckus had a deal for Wi-Fi at the new San Jose Earthquakes soccer stadium, but lost that when Avaya booted Ruckus off the pitch by purchasing naming rights to now-Avaya Stadium for $20 million. More recently, Ruckus was part of an initial winning bid with integrator 5 Bars for the Wi-Fi deployment at Houston’s NRG Stadium, but was replaced at the last minute by Extreme Networks due to unspecified and unconfirmed pressure, most likely from the NFL. On the plus side, Ruckus gear was used for the Wi-Fi deployment at Angels Stadium in Anaheim, as well as at Indian Wells Tennis Garden, site of the big spring pro tennis tourney.

We will try to fill in more blanks and details during Ranadive’s appearance Thursday (like who will be designing the team app, which we are guessing might be VenueNext), but the real proof of the Golden 1 pudding won’t come until October, since you never can tell how a stadium network will work until it’s turned on for a full house of device-holding fans. That’s why we don’t put much stock in theoretical claims, like the Kings’ ridiculous promise that the network can handle “over 500,000 Snapchat posts per second” — that’s some fast fingers for a full house of 17,500, no? When it comes to feeds and speeds we are firmly in the show-me house, so we hope the Kings and Golden 1 Center will be as open with their real-world statistics come next fall as they are with press-release superlatives now.
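
And just to show our math on why that Snapchat number invites skepticism, divide the claim across a sellout crowd:

```python
# The "over 500,000 Snapchat posts per second" claim, spread across a sellout.

posts_per_second = 500_000
full_house = 17_500
print(f"{posts_per_second / full_house:.1f} posts per second, per fan")
# About 28.6 posts per second from every single fan in the building.
```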