Hockey crowd melted down Levi’s Stadium network and app, overwhelmed light rail

Levi’s Stadium scoreboard during Stadium Series hockey game. Credit all images: Paul Kapustka, MSR (click on any photo for larger image).

From a financial and publicity standpoint Saturday’s Coors Light Stadium Series hockey game at Levi’s Stadium was a success, with 70,205 fans packing the football facility to watch the San Jose Sharks lose to the Los Angeles Kings, 2-1. But while the sellout crowd contributed to the general electricity that filled the venue, the mass of people also caused problems with the stadium’s vaunted wireless network, knocking out some parts of the Wi-Fi and cellular networks and overwhelming the unique feature of the stadium app designed to allow fans to have food and drinks delivered to their seats.

Hockey fans also swamped the VTA light rail system, causing some fans to wait as long as two hours before they could catch a bus or train to get home from the stadium. Though light rail officials said they will work on correcting the problems, the commuting jam does not bode well for a facility that is scheduled to host Super Bowl 50 in less than a year’s time, especially since many Super Bowl fans are expected to be traveling from San Francisco to the Santa Clara, Calif., neighborhood where Levi’s Stadium sits.

According to Roger Hacker, senior manager for corporate communications for the San Francisco 49ers, the Levi’s Stadium network team identified “isolated interruptions” of the Wi-Fi network, due to “frequency coordination issues” that the network team had not seen at previous events. Hacker also said that one unnamed wireless carrier had “issues” with its base station firmware, but said that the problems were resolved by game’s end. (For the record, I am a Verizon Wireless customer and I had “issues” getting cellular connectivity Saturday, so draw your own conclusions.)

Since the Niners’ full explanation is somewhat light on facts and numbers, we will first offer a “fan’s view” of the events Saturday night, under the caveat that Mobile Sports Report was not attending the game as press, but instead as just a regular hockey fan (one who purchased two full-price tickets) who was looking forward to using the stadium’s technology to enhance the game experience. Unfortunately for this fan, the Levi’s Stadium network, app and transit services all fell down on the job.

Light show a dud

Though the MSR team had no problems getting to the stadium — our light rail train out of Mountain View at about 5:30 p.m. was relatively empty — I noticed some irregularities in network connections during the pregame ceremonies, when I tried to join in the fan-participation light show, a technology feature recently added to the Levi’s Stadium app especially for the Stadium Series game. Like many people in our area, I couldn’t get the app to work, leaving me staring at a spinning graphic while others in the stadium saw their phones contribute flashing lights during pre-game music.

After the light show segment ended, I noticed that the Levi’s app was performing erratically, quitting on its own and kicking my device off the Wi-Fi network. After rebooting the device (a new Apple iPhone 6 Plus) I still couldn’t connect to the Wi-Fi, an experience I’ve never had at Levi’s. Turning off the Wi-Fi didn’t help, as cellular service also seemed poor. Since I wasn’t really there to work — I just wanted to enjoy the game with my older brother, who was in town for the event — I posted a quick tweet and went back to just watching the Sharks play poorly for the first 20 minutes.

One of the benefits of being a close follower of Levi’s Stadium technology is that when you tweet, people listen. By the middle of the first intermission, I was visited personally by Anoop Nagwani, the new head of the Levi’s Stadium network team, along with a technician from Aruba Networks, the Wi-Fi gear supplier at the stadium. Even with laptops and scanners, my visitors couldn’t immediately discern the network problem; they were, however, visited by a number of other nearby fans, who figured out who they were and relayed their own networking problems to them.

To be clear: I didn’t spend the game as I usually do at Levi’s, wandering around to see how the network is performing at as many spots as I can. But even if the outage was only in our area, that’s a significant problem for Levi’s Stadium, which has touted its technology every chance it gets. I also noticed problems with cellular connectivity all night, which leads me to believe that the network issues were more widespread than just at my seating area.

The official statement from Hacker describing the problems doesn’t pin any specific blame, but our guess is that something in the mix of radio systems used by the entertainment performers (there was a small stage to one side of the rink where musicians performed) and by media crews new to the facility caused the Wi-Fi problem. Here is the official statement on the Wi-Fi issues:

The Levi’s Stadium network team identified isolated interruptions of the WiFi system in specific sections on Saturday night due to frequency coordination issues previously unseen at the venue and unique to this event. Saturday’s event featured extra radio systems not typical to previous stadium events, some of which were found to be unauthorized by event frequency coordinators. To avoid similar situations in the future, Levi’s Stadium management will be initiating additional frequency control protocols for all events.

Hacker said the network team did not track exactly how widespread the outages were, so could not provide a number of fans affected. But enough fans apparently did connect: according to Hacker, the Levi’s network saw near-record traffic Saturday night, with a total of 3.0 terabytes of data carried, second only to the 3.3 TB used at the season-opening Niners game back in September. Hacker said 24,792 unique devices connected to the Wi-Fi during Saturday’s event, with a peak of 17,400 concurrent users, also second only to the season-opener’s 19,000. The Stadium Series game did set a new throughput mark of 3.5 Gbps on the network just before the start of the game, a surge that seems to be behind some of the other problems.
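For context, the reported figures reduce to a few rough per-fan metrics. This quick sketch uses only the numbers quoted above (attendance, unique devices, total data); the derived metrics and variable names are ours, and decimal units are assumed for the terabyte figure.

```python
# Back-of-envelope metrics from the reported Stadium Series figures.
# All inputs are the numbers quoted in the story; the math is ours.

attendance = 70205        # announced crowd
unique_devices = 24792    # unique Wi-Fi devices reported by Hacker
total_bytes = 3.0e12      # 3.0 TB carried on Wi-Fi (decimal units assumed)

take_rate = unique_devices / attendance          # share of fans who got on Wi-Fi
avg_per_device_mb = total_bytes / unique_devices / 1e6

print(f"Take rate: {take_rate:.1%}")             # roughly a third of the crowd
print(f"Average per device: {avg_per_device_mb:.0f} MB")
```

By this rough math, about 35 percent of the crowd connected, averaging on the order of 120 MB per device over the night.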

Food ordering overwhelmed

During the intermission, my brother and I went out on the 300-level concourse to get something to eat and drink — and encountered one of the untold stories of Levi’s Stadium: the incredibly long and slow lines for concessions. While I haven’t researched this problem in depth, after 10 minutes of inertia in our line I told my brother I would use the app’s food and drink ordering function to get us some vittles and beverages. Finally able to connect via Wi-Fi while on the concourse, I placed an order for two beers and two hot dogs, and didn’t worry that the delivery time was 20 minutes. That would put it at the very latest near the end of the second period, which was fine by me since it meant I didn’t have to wait in lines. Or so I thought.

Back in my seat, I was troubled by the fact that even halfway through the period, the app still had not switched from “ordered” to “en route.” I also got some error messages I had never seen at Levi’s Stadium before.

When the period ended and there was still no movement from the app (which I only checked sporadically, since the Wi-Fi never fully connected at my seat), I went back on the concourse, where I found a small, angry crowd around the food-runner window at the closest concession stand. Pretty much everyone there had the same problem I had: we’d ordered food, the app had said the order was taken, and nothing had happened since.

Fans trying to figure out why their food orders weren’t delivered

The situation wasn’t good since nobody at the food-runner window had any technology that would allow them to communicate with the app or network team; they couldn’t even cancel orders or make sure credit card refunds would be processed, which only served to increase the frustration for the fans who were just trying to use the services as advertised.

In the end, the staff at the delivery window did the best they could — at one point someone produced slips of paper that the waiting fans used to write down their orders; one staffer then tried to fulfill those orders as best he could, going to the concession stand and bringing them out one by one. After waiting nearly the full intermission (and missing Melissa Etheridge), I was given two cold hot dogs and two draft beers. Since there were no food holders left at the stand, I had to put the hot dogs in my jacket pockets and hold both beers. At least I didn’t starve or go thirsty, but it was a far cry from the delivered-to-the-seat service I had raved about to my brother.

During this process I sent an email to Louise Callagy, vice president of marketing at stadium app developer VenueNext. Her in-game response was:

“Levi’s Stadium app usage exceeded any previous event and set new records, causing delivery and order fulfillment delays. As always, we will do a post mortem after the event, and make the necessary adjustments to operational and staffing support, including systems performance analysis. We apologize to any fans who were inconvenienced.”

According to Hacker, the Levi’s Stadium food-runner staffing was at the same level as at a regular-season Niners game; however, Hacker said the hockey fans broke the previous ordering records before the first period was over. Here is the official statement on the food ordering snafu:

With more than 31,000 new downloads of the Levi’s Stadium App – 20 percent more than had ever been seen at any previous stadium event – the [food ordering] system experienced 50 percent higher order volume in just the first hour of the game than had been seen during any previous event. The dramatic increase led to the extended wait times and cancelled orders experienced by some fans.

In a separate email, Hacker did not provide an exact number for how many fans were represented by the term “some,” but he did confirm that “no customers were charged for unfulfilled orders.”

Still, the system shouldn’t have had any unfulfilled orders, at least not according to the Niners’ consistent hype of the network and the app. Remember, Niners officials had long been confident that their network would be able to stand up to any load. Such was not the case Saturday night.

The long wait home

VTA line following Levi’s Stadium hockey game

After an exciting third period and a game that went down to the final horn, we left the stadium and were immediately greeted by a mass of people packing into the VTA departure area. With too many people and not enough trains and buses, we spent almost an hour moving like slow cattle until we eventually got on a train to Mountain View. We considered ourselves lucky, since it looked like the folks heading south on VTA were in for an even longer wait.

When we got to the Mountain View station, we waited almost another hour to leave, since Caltrain (nicely) kept its last train at the station until two more VTA trains brought the stragglers in from Levi’s. Though VTA has since said it carried more than twice the “normal” number of riders it saw at Niners games this season, there was no explanation of why VTA didn’t or couldn’t provide more capacity after seeing how many fans used the service to get to the game. What was most unpleasant was the disorganized method of boarding the trains: just a massive group line, with one VTA person on a bullhorn telling everyone to make sure they had bought a ticket.

In the end, the time it took to get from the start of the VTA line to my house in San Mateo was three hours — almost as long as the game itself. With other “special” events like WrestleMania and concerts coming up at Levi’s, and Super Bowl 50 next year, it’s clear there is lots of work to be done to make it a good experience for all who purchase a ticket, especially those looking to use public transport and the app features to enhance their game-day experience.

Sharks and Kings on the ice at Levi’s Stadium

Artemis announces DISH spectrum lease, setting up San Francisco pCell service trial; also makes venue-specific hub available for trial

Artemis Networks founder Steve Perlman. Credit all photos: Artemis Networks

Artemis Networks moved one step closer to a real-world offering of its pCell wireless service with the announcement of a spectrum lease deal with satellite provider DISH that will give Artemis the means to offer commercial services in San Francisco perhaps as early as sometime later this year, pending FCC approval.

In the meantime, owners of large public venues (like sports stadiums) can now test out the Artemis technology for themselves, by testing an Artemis I Hub and antenna combination in a trial arrangement with the company. Announced last year, Artemis’ pCell technology claims to solve two of the biggest problems in wireless networking, namely bandwidth congestion and antenna interference, by turning much of the current technology thinking on its head. If the revolutionary networking idea from longtime entrepreneur Steve Perlman pans out, stadium networks in particular could become more robust while also being cheaper and easier to deploy.

In a phone interview with Mobile Sports Report prior to Tuesday’s announcement, Perlman said Artemis expects to get FCC approval for its pCell-based wireless service sometime in the next six months. When that happens, Artemis will announce pricing for its cellular service, which will work with most existing LTE phones via a SIM card provided by Artemis. Phones with dynamic SIMs, like some of the newer devices from Apple, will be able to simply choose the Artemis service without having to add a card, Perlman said.

Though he wouldn’t announce pricing yet, Perlman said Artemis services would be less expensive than current cellular plans. He said that there will likely be an option for local San Francisco service only, and another that includes roaming ability on other providers’ cellular networks for use outside the city.

More proof behind the yet-untasted pudding

When Perlman, the inventor of QuickTime and WebTV, announced Artemis and its pCell technology last year, it was met with both excitement — for its promise of delivering faster, cheaper wireless services — and no shortage of skepticism, about whether it would ever become a viable commercial product. Though pCell’s projected promise of using cellular interference to produce targeted, powerful cellular connectivity could be a boon to builders of large public-venue networks like those found in sports stadiums, owners and operators of those venues are loath to build expensive networks on untested, unproven technology. So it’s perhaps no surprise that Artemis has yet to name a paying customer for its revolutionary networking gear.

Artemis I Hub

But naming names and talking about spectrum deals are steps that bring Artemis closer to something people can try, and perhaps buy. VenueNext, the application development firm behind the San Francisco 49ers’ Levi’s Stadium app, confirmed that it is testing Artemis technology, and the San Francisco network will provide Perlman and Artemis with a “beta”-type platform to test and “shake out the system” in a live production environment.

“We need to be able to move quickly, get feedback and test the network,” said Perlman about Artemis’ decision to run its own network first, instead of waiting for a larger operator to implement it. “We need to be able to move at startup speed.”

For stadium owners and operators, the more interesting part of Tuesday’s news may be the Artemis I Hub, a device that supports up to 32 antennas — indoor for now, with outdoor units due later this year. The trial program will let venue owners and operators kick the tires on pCell deployment and performance on their own, instead of just taking Artemis’ word for it. Artemis has also published a lengthy white paper that fleshes out the explanation of its somewhat radical approach to cellular connectivity, another step toward legitimacy, since publishing such a document publicly suggests Artemis is confident in its claims.

If networking statistics from recent “big” stadium events are any barometer, the field of stadium networking may need significant help soon, since fans are using far more data than ever before — including the 13-plus terabytes of combined traffic at the Super Bowl in Phoenix and the 6-plus TB figure from the college football playoff championship game. To Perlman, the idea of trying to use current Wi-Fi and cellular technology to address a crowded space doesn’t make sense.

“You simply cannot use interfering technology in a situation where you have closely packed transmitters,” said Perlman. “You just can’t do it.”

Artemis explained

pCell antenna from Artemis Networking

If you’re unfamiliar with the Artemis idea, at its simplest level it’s a new idea in connecting wireless devices to antennas that — if it works as advertised — turns conventional cellular and Wi-Fi thinking on its head. What Perlman and Artemis claim is that they have developed a way to build radios that transmit signals “that deliberately interfere with each other” to establish a “personal cell,” or pCell, for each device connecting to the network.

(See this BusinessWeek story from 2011, which explains the Artemis premise in detail. This EE Times article has more details, and this Wired article is also a helpful read.)

Leaving the complicated math and physics to the side for now, if Artemis’ claims hold true their technology could solve two of the biggest problems in wireless networking, namely bandwidth congestion and antenna interference. In current cellular and Wi-Fi designs, devices share signals from antenna radios, meaning bandwidth is reduced as more people connect to a cellular antenna or a Wi-Fi access point. Adding more antennas is one way to solve congestion problems; but especially in stadiums and other large public venues, you can’t place antennas too close to each other, because of signal interference.

The Artemis pCell technology, Perlman said, sidesteps both problems by delivering a centimeter-sized cell of coverage to each device, one that can follow the device as it moves around an antenna’s coverage zone. Again, if the company’s claim of delivering full bandwidth to each device “no matter how many users” are connected to each antenna holds true, stadium networks could theoretically support much higher levels of connectivity at possibly a fraction of the current cost.
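The contrast Perlman draws can be captured in a toy model: on a conventional shared access point, per-user throughput shrinks as more users attach, while the pCell claim is full rated bandwidth for each device regardless of load. This is purely an illustration of the claim as stated, not a simulation of either radio technology, and the capacity figure is a made-up example.

```python
# Toy comparison: shared-medium throughput vs. the pCell claim.
# The capacity number is hypothetical; neither function models real RF behavior.

AP_CAPACITY_MBPS = 100.0  # illustrative rated capacity of one access point

def shared_per_user(users: int) -> float:
    """Conventional AP or cell: capacity is divided among attached users."""
    return AP_CAPACITY_MBPS / users

def pcell_per_user(users: int) -> float:
    """pCell claim: each device keeps full bandwidth, independent of load."""
    return AP_CAPACITY_MBPS

for n in (1, 10, 100):
    print(f"{n:>3} users: shared {shared_per_user(n):6.1f} Mbps each, "
          f"pCell claim {pcell_per_user(n):6.1f} Mbps each")
```

At 100 attached users, the shared model leaves each device 1 Mbps while the claimed pCell model still promises the full 100 Mbps — which is exactly why the claim draws both excitement and skepticism.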

The next step in Artemis’ evolution will be to see if (or how well) its technology works in the wild, where everyday users can subject it to the unplanned stresses that can’t be reproduced in the lab. With any luck, and the FCC willing, we won’t have to wait another year for the next chapter to unfold.

New Atlanta football stadium picks IBM as lead technology integrator

New Atlanta football stadium under construction. Credit all images: New Atlanta Stadium

The yet-to-be-named new football stadium under construction in Atlanta has selected IBM as its lead technology integrator, somewhat officially welcoming the new 800-pound gorilla to the stadium technology marketplace.

While computing giant IBM has dabbled in sports deployments before — mainly contributing technology as part of its corporate sponsorship of events like The Masters in golf and the U.S. Open for tennis — only recently has Big Blue gotten into the large-venue technology integration game. And while IBM’s recent deal as technology integrator for the revamp of Texas A&M’s Kyle Field was probably its true debut, for the new Atlanta stadium IBM will lead the selection, design and deployment of a wide range of technologies, including but not limited to the core Wi-Fi and DAS networks that will provide in-venue wireless connectivity.

Due to open in March 2017, the new $1.4 billion stadium is expected to hold 71,000 fans for football, and up to 83,000 for other events like basketball or concerts. And while soccer, concerts and basketball will certainly be part of its events schedule, the NFL’s Atlanta Falcons and owner Arthur Blank are driving the bus on the new building, picking IBM in part to help satisfy a desire to build a venue that is second to none when it comes to fan experience.

IBM’s size and experience a draw for Atlanta

Interior stadium design rendering

In addition to Wi-Fi and DAS network buildouts, IBM will design systems to control the expected 2,000-plus digital displays in the planned stadium and will also oversee other technology-related parts of the stadium, including video security, physical door controls and a video intercom system, according to an announcement made today. IBM will also partner with the stadium owners to develop as yet-undetermined applications to “leverage the power of mobility to create a highly contextual, more personalized game day experience for fans, all through the integration of analytics, mobile, social, security and cloud technologies.”

In a phone interview Thursday, Jared Miller, chief technology officer for Blank’s namesake AMB Sports and Entertainment (AMBSE) group, said IBM’s depth and breadth in technology, applications and design made it a somewhat easy choice as lead technology partner.

Miller said the stadium developers looked at the number of different technology systems that would exist within the building, and ideally wanted to identify a single partner to help build and control them all, instead of multiple providers who might just have a single “silo” of expertise.

Proposed stadium exterior

“IBM is unique with its span of technology footprint,” Miller said. He also cited IBM’s ability to not just deploy technology but to also help determine what the technology could be used for, with analytics and application design.

“They’ve looked at the [stadium] opportunity in a different manner, thinking about what we could do with the network once it’s built,” Miller said.

IBM, which also has a sizable consulting business, created a group targeting “interactive experiences” about two years ago, according to Shannon Miller, the North America Fan Experience Lead for the IBM Interactive Experience group. Miller (no relation to Jared Miller), also interviewed by phone Thursday, said IBM had been working with Arthur Blank and the Falcons for more than a year to determine how to make the new stadium “the best fan experience in the world.”

And while IBM is somewhat of a newcomer to the stadium-technology integration game, IBM’s Miller said the company not only understands “how to make digital and physical work together,” but also has resources in areas including innovation, technology development and design that smaller firms may not have. And while the Kyle Field project was ambitious, IBM’s Miller said the Atlanta operation will be much bigger.

“The size and scale of what we’re going to do [in Atlanta] will be unique,” he said.

No suppliers picked yet for Wi-Fi or DAS

Of note for industry watchers: IBM and the Falcons team have not yet picked technology suppliers for discrete parts of the coming wireless network, such as Wi-Fi access points and DAS gear. (Daktronics has already been announced as the supplier of the planned Halo Screen video board.) Those vendor decisions will likely come soon, since the stadium is under a hard deadline to open for the first game of the Major League Soccer season in March 2017.

“We’re working fast and furious on design, and we want to identify [the gear suppliers] as early as possible,” said AMBSE’s Miller.

IBM and AMBSE did announce that the stadium’s network will be fiber-based, with Corning the likely fiber and Passive Optical Network (PON) technology provider, though that choice has not been officially confirmed. IBM and Corning partnered to install a fiber network core for Wi-Fi and DAS at Texas A&M’s Kyle Field, believed to be the first large fiber network of its kind in a major stadium.

The Atlanta deal puts IBM solidly into the rapidly expanding field of stadium technology integration, which includes companies like CDW (which led network deployments at the University of Nebraska and the University of Phoenix Stadium) as well as stadium ownership groups, like the San Francisco 49ers, and technology name sponsors like AT&T, which has partnered with owners for technology and network deployments at venues like AT&T Park and AT&T Stadium.

Overhead view

Levis’ Stadium app adds special features for Sharks-Kings outdoor hockey game

Mocked-up screen shot of what the Levi’s Stadium app will look like for Saturday’s outdoor hockey game. Credit: VenueNext

Other than mobile ticketing, all of the regular features of the Levi’s Stadium mobile app will be active for Saturday’s outdoor hockey game between the San Jose Sharks and the Los Angeles Kings, with fans able to use the app over the free Wi-Fi network or the enhanced cellular DAS to do things like watch instant replays, or to order food, drinks and merchandise and have those items delivered to every seat in the 68,500-seat venue.

New to the app, as a special treat for fans at the Coors Light NHL Stadium Series event, is a “live, crowd-generated light show” experience, using technology from Baltimore, Md.-based Wham City Lights that synchronizes smartphones to produce a mass lighting effect. The app feature will, according to the NHL and Levi’s app producer VenueNext, “blanket the stadium with a synchronized, multi-colored visualization of the live musical entertainment on the field,” if, of course, enough fans download the app and activate it at the right time.

Just like Niners fans this past football season, hockey fans at Levi’s Stadium on Saturday will be able to download the free app and use it to watch live streaming video of the event, as well as instant replays from several angles. Fans can also use the app to purchase parking tickets and get directions to the stadium as well as their seating section once inside the venue.

What will be interesting to see is if hockey fans generate more wireless data usage than football fans, a possibility since hockey has two natural built-in mid-game breaks as opposed to football’s single halftime. Since the event is also more of a “bucket list” type game than a regular-season football game, the possibility exists that the Sharks, Kings and general hockey fans in attendance may break the previous Levi’s data record set at the Niners’ home opener. Stay tuned to MSR next week, when with any luck we’ll get wireless usage stats from the Levi’s Stadium network team.

DGP gets deal to extend DAS outside Levi’s Stadium

Franks and DAS: DGP DAS antennas above food station at Levi’s Stadium. Photo credit: Paul Kapustka, MSR

DAS Group Professionals, the company that installed the neutral-host DAS inside Levi’s Stadium, now has a deal to extend the DAS outside the Levi’s walls, covering parts of the city of Santa Clara, Calif., that surround the stadium.

With next year’s Super Bowl set to take place at Levi’s Stadium, it makes sense that city officials would want to make sure the parking lots and other pre-game gathering areas outside the venue had good cellular connectivity. At the most recent Super Bowl in Glendale, Ariz., neutral host provider Crown Castle did an extensive job of building the “oDAS” or outside DAS in the spaces surrounding the University of Phoenix Stadium.

According to DGP, it will design, build and maintain an oDAS for the City of Santa Clara, initially targeting the area around the Great America theme park and the Santa Clara Convention Center, which sit on the other side of the main Levi’s Stadium parking lots. Like the DAS inside the stadium, access to the network outside the stadium will be offered to all major wireless carriers, who must pay DGP and the city for access to the network.

While the network will definitely come in handy for pre- and post-game connectivity around Levi’s Stadium events, it will also improve overall cellular performance in the area, which is also home to several large corporate office buildings as well as the busy convention center.

Super Bowl XLIX sets new stadium Wi-Fi record with 6.2 Terabytes of data consumed

University of Phoenix Stadium. Credit: Arizona Cardinals.

The Super Bowl is once again the stadium Wi-Fi champ, as fans at Sunday’s Super Bowl XLIX in Glendale, Ariz., used 6.23 terabytes of data during the contest, according to the team running the network at the University of Phoenix Stadium.

The 6.23 TB mark blew past the most recent entrant in the “most Wi-Fi used at a single-day single-stadium event” sweepstakes, the 4.93 TB used at the Jan. 12 College Football Playoff championship game at AT&T Stadium. Prior to that, pro football games this past season at Levi’s Stadium in Santa Clara, Calif., and at AT&T Stadium had pushed into the 3-plus TB mark to be among the highest totals ever reported.

The live crowd watching the New England Patriots’ 28-24 victory over the Seattle Seahawks used nearly as much cellular data, with Verizon Wireless, AT&T and Sprint claiming a combined total of 6.56 TB used in and around the stadium on game day. All three carriers were on the in-stadium and outside-the-stadium DAS deployments run by neutral host Crown Castle. If those figures are correct (more on this later), it would put the total wireless data usage for the event at 12.79 TB, far and away the biggest single day of wireless data use we’ve ever heard of.

Apple OS updates still the application king

Handrails with Wi-Fi antenna enclosures from AmpThink. Credit: Arizona Cardinals.

Mark Feller, vice president of information technology for the Arizona Cardinals, and Travis Bugh, senior wireless consultant for CDW, provided Mobile Sports Report with the final Wi-Fi usage numbers, which are pretty stunning for anyone in the stadium networking profession. According to Feller, the new CDW-deployed Wi-Fi network with Cisco gear at the UoP Stadium saw 2.499 TB of data downloaded and 3.714 TB uploaded, for a total of 6.213 TB of Wi-Fi usage. Bugh of CDW said there were 25,936 unique devices connecting to the network on game day, with a peak concurrent usage of 17,322, recorded (not surprisingly) at halftime.

Peak download usage of 1.3 Gbps was recorded before the game’s start, while peak upload usage of 2.5 Gbps was hit at halftime. The top applications by bandwidth use, Feller said, were Apple (mobile update), Facebook, Dropbox and Snapchat.
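The reported upload and download figures are consistent with the 6.213 TB total, and they show uploads dominating, as is typical when fans share photos and video from a big event. A quick check (using the round numbers above; the percentage calculation is ours):

```python
# Cross-checking the reported Super Bowl XLIX Wi-Fi totals.
downloaded_tb = 2.499
uploaded_tb = 3.714

total_tb = downloaded_tb + uploaded_tb   # matches the reported 6.213 TB
upload_share = uploaded_tb / total_tb    # fans pushed more data up than down

print(f"Total: {total_tb:.3f} TB ({upload_share:.0%} uploaded)")
```

Roughly 60 percent of the game-day Wi-Fi traffic was upstream — a useful data point for anyone sizing a stadium network.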

DAS numbers also set new record, but clarification needed

The only reason we aren’t yet trumpeting the 6.564 TB of reported DAS use as a verified record is the difference in clarity from each of the reporting providers. We also haven’t yet heard any usage totals from T-Mobile, so it’s likely that the final wireless data total is somewhere north of 13 TB, if all the numbers can be believed.

Parking lot light poles, Westgate entertainment district. Can you spot the DAS?

As reported before, AT&T said it saw 1.7 TB of cellular wireless activity from its customers on game day, with 696 GB of that happening inside the stadium and the balance coming from the outside areas before and after the game. We’d also like to welcome Sprint to the big-game reporting crew (thanks, Sprint!), with its total of 754 GB of 4G LTE traffic used in and around the stadium on game day. According to Sprint representatives, its Super Bowl coverage efforts included five COWs (cell towers on wheels) as well as expanded DAS and macro placements in various Phoenix-area locations. The Sprint coverage included the 2.5 GHz spectrum that uses TDD LTE technology.

As also previously reported, Verizon Wireless claimed 4.1 TB of customer traffic in and around the stadium on game day, which Verizon says was all cellular traffic and does not reflect any Verizon Wireless customer use of the stadium Wi-Fi network. Verizon also reported some other interesting activity tidbits, including 46,772 Verizon Wireless devices used at the game, of which just 59.7 percent were smartphones. Verizon also said it saw 10 million emails sent on its networks that day and 1.9 million websites visited, while also seeing 122,308 videos sent or received over wireless connections.
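Taking the three carriers' reported numbers at face value, the combined DAS total works out as below. T-Mobile is omitted because it reported no figure, and the rounded carrier inputs make the sum land slightly below the 6.56 TB cited in the story; the arithmetic is ours.

```python
# Summing the carrier-reported cellular (DAS) traffic for game day.
# Sprint's 754 GB is converted to terabytes; rounded inputs mean the sum
# comes in a touch under the 6.56 TB figure the providers reported.

carrier_tb = {
    "Verizon Wireless": 4.1,
    "AT&T": 1.7,
    "Sprint": 0.754,
}

das_total_tb = sum(carrier_tb.values())
grand_total_tb = das_total_tb + 6.213    # adding the reported Wi-Fi total

print(f"DAS: {das_total_tb:.2f} TB, Wi-Fi + DAS: {grand_total_tb:.2f} TB")
```

Even without T-Mobile, the combined wireless figure approaches 12.8 TB, which is why a final number "north of 13 TB" is a reasonable expectation.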

We’re still waiting to see if we can get usage numbers from the Super Bowl stadium app (we’re especially interested to see if the instant replay feature caught on) but the warning for stadium owners and operators everywhere seems to be clear: If you’re hosting the big game (or any BIG game), make sure your network is ready for 6 TB and beyond!