Levi’s Stadium sees 5.1 TB of Wi-Fi data used at college football championship

Fans and media members at Monday night’s College Football Playoff championship game used a total of 5.1 terabytes of data on the Wi-Fi network at Levi’s Stadium, according to figures provided by the San Francisco 49ers, who own and run the venue.

With 74,814 in attendance for Clemson’s 44-16 victory over Alabama, 17,440 of those in the stands found their way onto the stadium’s Wi-Fi network. According to the Niners, the peak concurrent connection number of 11,674 users was seen at 7:05 p.m. local time, which was probably right around the halftime break. The peak bandwidth rate of 3.81 Gbps, the Niners said, was seen at 5:15 p.m. local time, just after kickoff.

In a nice granular breakout, the Niners said about 4.24 TB of the Wi-Fi data was used by fans, while a bit more than 675 GB was used by the more than 925 media members in attendance. The Wi-Fi data totals were recorded during an 8-1/2 hour period on Monday, from 1 p.m. to 9:30 p.m. local time.

Added to the 3.7 TB of DAS traffic AT&T reported inside Levi’s Stadium Monday night, we’re up to 8.8 TB total wireless traffic so far, with reports from Verizon, Sprint and T-Mobile still not in. The top Wi-Fi number at Levi’s Stadium, for now, remains Super Bowl 50, which saw 10.1 TB of Wi-Fi traffic.

Can virtualization help venues meet growing mobile capacity demands?

By Josh Adelson, director, Portfolio Marketing, CommScope

U.S. mobile operators reported a combined 50 terabytes of cellular traffic during the 2018 Super Bowl, nearly double the previous year’s total. In fact, Super Bowl data consumption has doubled every year for at least the past six years, and it shows no sign of slowing down.

Clearly, fans love their LTE connections almost as much as they love their local team. Fans have the option of cellular or Wi-Fi, but cellular is the default connection, whereas Wi-Fi requires a manual connection step that many users may not bother with.[1] The same dynamic is playing out on a smaller scale in every event venue and commercial building.

Whether you are a venue owner, part of the team organization or in the media, this heightened connectivity represents an opportunity to connect more with fans, and to expand your audience to the fans’ own social connections beyond the venue walls.

But keeping up with the demand is also a challenge. High capacity can come at a high cost, and these systems also require significant real estate for head-end equipment. Can you please your fans and leverage their connectedness while keeping equipment and deployment costs from breaking the capex scoreboard?

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.

Virtualization and C-RAN to the rescue?

Enterprise IT departments learned long ago that centralizing and virtualizing their computing infrastructure is a way to grow capacity while reducing equipment cost and space requirements. Can sports and entertainment venues achieve the same by virtualizing their in-building wireless infrastructures? To answer this question, let’s first review the concepts and how they apply to wireless infrastructure.

In the IT domain, virtualization refers to replacing multiple task-specific servers with a centralized resource pool that can be dynamically assigned to a given task on demand. Underlying this concept is the premise that, while each application has its own resource needs, at any given time only a subset will be active, so the total shared resource can be less than the sum of the individual requirements.
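To make that premise concrete, here is a minimal sketch in Python (the demand figures are invented purely for illustration) showing why a shared pool sized for the busiest moment can be much smaller than the sum of each application's individual peak:

```python
# Illustrative sketch of the resource-pooling premise behind virtualization.
# All demand figures are invented; only the arithmetic matters.

# Peak resource need of each task-specific server (arbitrary "compute units")
dedicated_peaks = {"app_a": 40, "app_b": 30, "app_c": 50}

# Hour-by-hour demand: each application peaks at a different time of day
hourly_demand = [
    {"app_a": 40, "app_b": 5,  "app_c": 10},   # morning: app_a busy
    {"app_a": 10, "app_b": 30, "app_c": 15},   # midday:  app_b busy
    {"app_a": 5,  "app_b": 10, "app_c": 50},   # evening: app_c busy
]

dedicated_total = sum(dedicated_peaks.values())                    # provision for every peak at once
pooled_total = max(sum(hour.values()) for hour in hourly_demand)   # provision for the busiest hour

print(f"Dedicated servers: {dedicated_total} units")  # 120
print(f"Shared pool:       {pooled_total} units")     # 65
```

The pooled figure (65 units) is roughly half the dedicated figure (120 units), which is the whole argument for centralizing the resource.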

How does this translate to in-building wireless? Centralizing the base station function is known as C-RAN, which stands for centralized (or cloud) radio access network. C-RAN involves not only the physical pooling of the base stations into a single location — which is already the practice in most venues — but also digital architecture and software intelligence to allocate baseband capacity to different parts of the building in response to shifting demand.

C-RAN brings immediate benefits to a large venue in-building wireless deployment. The ability to allocate capacity across the venue via software rather than hardware adds flexibility and ease of operation. This is especially important in multi-building venues that include not only a stadium or arena but also surrounding administrative buildings, retail spaces, restaurants, hotels and transportation hubs. As capacity needs shift between the spaces by time of day or day of week, you need a system that can “point” the capacity to the necessary hot spots.
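As a rough illustration of what “pointing” capacity via software means, the sketch below models a scheduler that reassigns a fixed pool of baseband sectors to venue zones based on the day’s event. The zone names, sector counts and event plans are hypothetical and not drawn from any particular product; a real deployment would make these changes through the vendor’s management system.

```python
# Hypothetical sketch of software-driven sector allocation in a C-RAN.
# Zone names, sector counts and event plans are invented for illustration.

TOTAL_SECTORS = 12  # fixed baseband pool at the head-end

ALLOCATION_BY_EVENT = {
    "football_game": {"bowl": 8, "concourse": 2, "parking": 1, "offices": 1},
    "office_day":    {"bowl": 1, "concourse": 1, "parking": 1, "offices": 9},
    "concert":       {"bowl": 9, "concourse": 2, "parking": 1, "offices": 0},
}

def allocate(event: str) -> dict:
    """Return the sector-to-zone plan for an event, checked against the pool size."""
    plan = ALLOCATION_BY_EVENT[event]
    assert sum(plan.values()) <= TOTAL_SECTORS, "plan exceeds the baseband pool"
    return plan

# Same hardware, different days: capacity follows the crowd.
print(allocate("football_game"))  # most sectors pointed at the seating bowl
print(allocate("office_day"))     # same pool pointed at the office buildings
```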

C-RAN can even go a step further to remove the head-end from the building campus altogether. Mobile network operators are increasingly deploying base stations in distributed locations known as C-RAN hubs. If there is a C-RAN hub close to the target building, then the in-building system can take a signal directly from the hub, via a fiber connection. Even if the operator needs to add capacity to the hub for this building, this arrangement gives them the flexibility to use that capacity in other parts of their network when it’s not needed at the building. It also simplifies maintenance and support as it keeps the base station equipment within the operator’s facilities.

For the building owner, this architecture can reduce the footprint of the on-campus head-end by as much as 90 percent. Once the baseband resources are centralized, the next logical step is to virtualize them into software running on PC server platforms. As it turns out, this is not so simple. Mobile baseband processing is a real-time, compute-intensive function that today runs on embedded processors in specialized hardware platforms. A lot of progress is being made toward virtualization onto more standard computers, but as of today, most mobile base stations are still purpose-built.

Perhaps more important for stadium owners is the fact that the base station (also called the signal source) is selected and usually owned by the mobile network operator. Therefore the form it takes has at most only an indirect effect on the economics for the venue. And whether the signal source is virtual or physical, the signal still must be distributed by a physical network of cables, radios and antennas.

The interface between the signal source and the distribution network provides another opportunity for savings. The Common Public Radio Interface (CPRI) establishes a digital interface that reduces space and power requirements while allowing the distribution network — the DAS — to take advantage of the intelligence in the base station. To leverage these advantages, the DAS also needs to be digital.

To illustrate this, consider the head-end for a configuration of 12 MIMO sectors with 4 MIMO bands per sector, as shown below. In this configuration a typical analog DAS is compared with a digital C-RAN antenna system, with and without a CPRI baseband interface. In the analog system, points of interface (POIs) are needed to condition the signals from the different sources to an equal level before they are combined and converted to optical signals via an e/o (electric-to-optical) transceiver. In a digital system, signal conditioning and digitization are integrated into a single card, providing significant space savings. The rack-space arithmetic for each option is tallied in the sketch after the list below.

* A typical analog system requires 8 POIs (4 MIMO bands per sector) and 2 e/o transceivers per sector, resulting in 10 cards per sector. A typical subrack (chassis) supports up to 10-12 cards, so one subrack supports 1 MIMO sector. For 12 MIMO sectors, 12 subracks are needed, each typically 4 rack units high, for a total of 48 rack units.

* For a digital system[2] without CPRI, each subrack supports 32 SISO ports. Each MIMO sector with 4 MIMO bands requires 8 ports, so each subrack supports 4 MIMO sectors. For 12 MIMO sectors, 3 subracks of 5 rack units each are needed, for a total of 15 rack units.

* For a digital system with CPRI, each subrack supports 48 MIMO ports. Each MIMO sector with 4 MIMO bands requires 4 ports, so a single subrack supports 12 MIMO sectors. For 12 MIMO sectors, only 1 subrack of 5 rack units is needed, for a total of 5 rack units.
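The arithmetic in the three bullets above can be tallied in a few lines. The sketch below simply restates the dimensioning assumptions from the list (cards per sector, ports per subrack, subrack heights), so its output reproduces the 48, 15 and 5 rack-unit totals:

```python
import math

# Head-end sizing for 12 MIMO sectors with 4 MIMO bands per sector,
# using the per-subrack figures stated in the list above.

SECTORS = 12
BANDS_PER_SECTOR = 4

# Analog DAS: 8 POIs + 2 e/o transceivers = 10 cards per sector,
# one 4 RU subrack per MIMO sector.
analog_subracks = SECTORS
analog_ru = analog_subracks * 4

# Digital system without CPRI: 32 SISO ports per 5 RU subrack,
# 8 SISO ports per MIMO sector (2 streams x 4 bands).
digital_subracks = math.ceil(SECTORS / (32 // (2 * BANDS_PER_SECTOR)))
digital_ru = digital_subracks * 5

# Digital system with CPRI: 48 MIMO ports per 5 RU subrack, 4 ports per sector.
cpri_subracks = math.ceil(SECTORS / (48 // BANDS_PER_SECTOR))
cpri_ru = cpri_subracks * 5

print(f"Analog DAS:        {analog_subracks} subracks, {analog_ru} RU")   # 12 subracks, 48 RU
print(f"Digital, no CPRI:  {digital_subracks} subracks, {digital_ru} RU") # 3 subracks, 15 RU
print(f"Digital with CPRI: {cpri_subracks} subracks, {cpri_ru} RU")       # 1 subrack, 5 RU
```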

One commercial example of this is Nokia’s collaboration with CommScope to offer a CPRI interface between Nokia’s AirScale base station and CommScope’s Era C-RAN antenna system. With this technology, a small interface card replaces an array of remote radio units, reducing space and power consumption in the C-RAN hub by up to 90 percent. This also provides a stepping-stone to future Open RAN interfaces when they become standardized and commercialized.

The Benefits in Action — A Case Study

Even without virtualization, the savings of digitization, C-RAN and CPRI at stadium scale are significant. The table below shows a recent design that CommScope created for a large stadium complex in the U.S. For this, we compared 3 alternatives: traditional analog DAS, a digital C-RAN antenna system with RF base station interfaces, and a C-RAN antenna system with CPRI interfaces. Digital C-RAN and CPRI produce a dramatic reduction in the space requirements, as the table below illustrates.

The amount of equipment is reduced because a digital system does in software what analog systems must do in hardware, and CPRI eliminates even more hardware. Energy savings are roughly proportional to the space savings, since both are a function of the amount of equipment required for the solution.

Fiber savings, shown in the table below, are similarly significant.

The amount of fiber is reduced because digitized signals can be encoded and transmitted more efficiently.

But these savings are only part of the story. This venue, like most, is used for different types of events — football games, concerts, trade shows and even monster truck rallies. Each type of event has its own unique traffic pattern and timing. With analog systems, re-sectoring to accommodate these changes literally requires on-site physical re-wiring of head-end units. But with a digital C-RAN based system it’s possible to re-sectorize from anywhere through a browser-based, drag and drop interface.

The Bottom Line

It’s a safe bet that mobile data demand will continue to grow. But the tools now exist to let you control whether this will be a burden, or an opportunity to realize new potential. C-RAN, virtualization and open RAN interfaces all have a role to play in making venue networks more deployable, flexible and affordable. By understanding what each one offers, you can make the best decisions for your network.

Josh Adelson is director of portfolio marketing at CommScope, where he is responsible for marketing the award-winning OneCell Cloud RAN small cell, Era C-RAN antenna system and ION-E distributed antenna system. He has over 20 years of experience in mobile communications, content delivery and networking. Prior to joining CommScope, Josh held product marketing and management positions with Airvana (acquired by CommScope in 2015), PeerApp, Brooktrout Technology and Motorola. He holds an MBA from the MIT Sloan School of Management and a BA from Brown University.

FOOTNOTES
1: 59 percent of users at the 2018 Super Bowl attached to the stadium Wi-Fi network.
2: Dimensioning for digital systems is based on CommScope Era.

AT&T: Lots of DAS traffic for college football championship

DAS on a cart: DAS Group Professionals deployed mobile DAS stations to help cover the parking lots at Levi’s Stadium for the college football playoff championship. Credit: DGP

This may not be a news flash to any stadium network operations team but the amount of mobile data consumed by fans at college football games continues to hit high levels, according to some new figures released by AT&T.

In a press-release-style blog post in which AT&T said it saw 9 terabytes of cellular data used over the college football playoff championship-game weekend in the Bay Area, AT&T also crowned a cellular “data champion,” reporting that Texas A&M saw 36.6 TB of data used on the AT&T networks in and around Kyle Field in College Station, Texas.

(Actually, AT&T pointedly does NOT declare Texas A&M the champs — most likely because of some contractual issue, AT&T does not identify actual stadiums or teams in its data reports. Instead, it reports the cities where the data use occurred, but we can figure out the rest for our readers.)

For the College Football Playoff championship, AT&T was able to break down some specific numbers for us, reporting 3.7 TB of that overall total was used inside Levi’s Stadium on game day. Cell traffic from the parking lots and tailgating areas (see photo of DAS cart to left) added another 2.97 TB of traffic on AT&T’s networks, resulting in a game-day area total of 6.67 TB. That total is in Super Bowl range of traffic, so we are excited to see what the Wi-Fi traffic total is from the game (waiting now for the college playoff folks to get the statistics finalized, so stay tuned).

DAS antennas visible at Levi’s Stadium during a Niners game this past season. Credit: Paul Kapustka, MSR

For the additional 2+ TB of traffic, a footnote explains it somewhat more: “Data includes the in-venue DAS, COWs, and surrounding macro network for AT&T customers throughout the weekend.”

Any other carriers who want to add their stats to the total, you know where to find us.

Back to Texas A&M for a moment — in its blog post AT&T also noted that the stadium in College Station (which we will identify as Kyle Field) had the most single-game mobile usage in the U.S. this football season, with nearly 7 TB used on Nov. 24. Aggie fans will remember that as the wild seven-overtime 74-72 win over LSU, an incredible game that not surprisingly resulted in lots of stadium cellular traffic.

New Report: Texas A&M scores with new digital fan-engagement strategy

In the short history of in-stadium mobile fan engagement, a team or stadium app has been the go-to strategy for many venue owners and operators. But what if that strategy is wrong?

That question gets an interesting answer with the lead profile in our most recent STADIUM TECH REPORT, the Winter 2018-19 issue! These quarterly long-form reports are designed to give stadium and large public venue owners and operators, and digital sports business executives a way to dig deep into the topic of stadium technology, via exclusive research and profiles of successful stadium technology deployments, as well as news and analysis of topics important to this growing market.

Leading off for this issue is an in-depth report on a new browser-based digital game day program effort launched this football season at Texas A&M, where some longtime assumptions about mobile apps and fan engagement were blown apart by the performance of the Aggies’ new project. A must read for all venue operations professionals! We also have in-person visits to Atlanta’s Mercedes-Benz Stadium and the renovated State Farm Arena, the venue formerly known as Philips Arena. A Q&A with NFL CIO Michelle McKenna-Doyle and a report on a CBRS network test by the PGA round out this informative issue! DOWNLOAD YOUR REPORT today!

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, JMA Wireless, Corning, Huber+Suhner, Boingo, Oberon, MatSing, Neutral Connect Networks, Everest Networks, and ExteNet Systems. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome readers from the Inside Towers community, who may have found their way here via our ongoing partnership with the excellent publication Inside Towers. We’d also like to thank the SEAT community for your continued interest and support.

As always, we are here to hear what you have to say: Send me an email to kaps@mobilesportsreport.com and let us know what you think of our STADIUM TECH REPORT series.

BYU scores with new Wi-Fi, app for LaVell Edwards Stadium

BYU’s LaVell Edwards Stadium. Credit all photos: photo@byu.edu (click on any picture for a larger image)

At Brigham Young University, the wait for Wi-Fi was worth it.

After a selection and deployment process that took almost three years, the first full season of Wi-Fi at BYU’s LaVell Edwards Stadium was a roaring success, with high fan adoption rates and a couple of 6-plus terabyte single-game data totals seen during the 2018 football season. Using 1,241 APs from gear supplier Extreme Networks, the Wi-Fi deployment also saw high usage of the new game-day app, built for BYU by local software supplier Pesci Sports.

Duff Tittle, associate athletic director for communications at Brigham Young University, said the school spent nearly 2 1/2 years “studying the concept” of bringing Wi-Fi to the 63,470-seat stadium in Provo, Utah. After looking at “five different options,” BYU chose to go with Extreme, based mainly on Extreme’s long track record of football stadium deployments.

“We visited their stadiums, and also liked what they offered for analytics,” said Tittle of Extreme. “They had what we were looking for.”

According to Tittle, the deployment was actually mostly finished in 2017, allowing the school to do a test run at the last game of that season. Heading into 2018, Tittle said the school was “really excited” to see what its new network could do — and the fans went even beyond those expectations.

Opener a big success

For BYU’s Sept. 8 home opener against California, Tittle said the Wi-Fi network saw 27,563 unique connections out of 52,602 in attendance — a 52 percent take rate. BYU’s new network also saw a peak of 26,797 concurrent connections (midway through the fourth quarter) en route to a first-day data total of 6.23 TB. The network also saw a peak bandwidth rate of 4.55 Gbps, according to statistics provided by the school.
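For readers who like to normalize these numbers, here is a quick back-of-the-envelope calculation from the figures above; the take rate and per-user average are derived values, not statistics provided by BYU:

```python
# Derived figures from BYU's Sept. 8 home-opener Wi-Fi statistics.
attendance = 52_602
unique_wifi_users = 27_563
total_data_tb = 6.23

take_rate = unique_wifi_users / attendance
avg_per_user_gb = total_data_tb * 1000 / unique_wifi_users  # decimal units

print(f"Take rate:        {take_rate:.0%}")           # ~52%
print(f"Average per user: {avg_per_user_gb:.2f} GB")  # ~0.23 GB
```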

Sideline AP deployment

“It blew us away, the number of connections [at the Cal game],” Tittle said. “It exceeded what we thought we’d get, right out of the gate.”

With almost no overhangs in the stadium — there is only one sideline structure for media and suites — BYU and Extreme went with mostly under-seat AP deployments, Tittle said, with approximately 1,000 of the 1,241 APs located inside the seating bowl. Extreme has used under-seat deployments in many of its NFL stadium networks, including at Super Bowl LI in Houston.

Another success story was the new BYU app, which Tittle said had been in development for almost as long as the Wi-Fi plan. While many stadium and team apps struggle for traction, the BYU app saw good usage right out of the gate, finishing just behind the ESPN app for total number of users (2,306 for the BYU app vs. 2,470 for ESPN) during the same Cal game. The BYU app just barely trailed Instagram (2,327) in number of users seen that day, and outpaced SnapChat (1,603) and Twitter (1,580), according to statistics provided by Tittle. The app also supports instant replay video, as well as a service that lets fans order food to be picked up at a couple express-pickup windows.

What also might have helped fuel app adoption is the presence of a “social media” ribbon board along the top of one side of the stadium, where fan messages get seen in wide-screen glory. Tittle said the tech-savvy locals in the Provo area (which has long been the home to many technology companies, including LAN pioneer Novell) are also probably part of the app crowd, “since our fan base loves that kind of stuff.”

Tittle also said that Verizon Wireless helped pay for part of the Wi-Fi network’s construction, and as at the NFL stadiums where Verizon has done the same, it gets a separate SSID for its users at LaVell Edwards Stadium. Verizon also built the stadium’s DAS (back in 2017), which also supports communications from AT&T and T-Mobile. (More photos below)

Under-seat AP enclosure

A peek inside

The social media ribbon board above the stands

LaVell Edwards Stadium at night, with a view of the press/suites structure

Mercedes-Benz Stadium Wi-Fi saw 12 TB of data used at January’s college championship

The iconic ‘halo board’ video screen below the unique roof opening at Atlanta’s Mercedes-Benz Stadium. Credit all photos: Paul Kapustka, MSR (click on any picture for a larger image)

The Wi-Fi network at Atlanta’s Mercedes-Benz Stadium saw 12 terabytes of data used at the 2018 College Football Playoff championship on Jan. 8, 2018, according to officials from the Atlanta Falcons, owners and operators of the city’s distinctive new venue.

We’d long suspected that Mercedes-Benz Stadium, which opened in August of 2017, had seen big data days inside the 71,000-seat building with its innovative technology, but until Sunday the Falcons had never made any network-performance data publicly available. But a day after the venue saw another 8.06 TB of Wi-Fi used during the SEC Championship game, Danny Branch, chief information officer for AMB Sports & Entertainment, revealed the statistics during a live MSR visit at an Atlanta Falcons home game. The 12 TB mark (which was an estimate — we’ll check back with the Falcons for exact numbers) is the second-highest we’ve ever seen in our unofficial research of single-day Wi-Fi totals, trailing only the 16.31 TB recorded at Super Bowl LII in February at U.S. Bank Stadium.

“We’re confident and ready for the Super Bowl,” said Branch during a pregame stadium tour, details of which we’ll dig into deeper in a full profile for our upcoming Winter Stadium Tech Report. Multiple network speed tests taken by MSR during Sunday’s 26-16 Falcons loss to the visiting Baltimore Ravens showed robust Wi-Fi performance on the network that uses gear from Aruba, a Hewlett Packard Enterprise company, in a design from AmpThink.

DAS renovation complete

An under-seat DAS antenna in the 300 seating section at Mercedes-Benz Stadium

According to Branch, the cellular distributed antenna system (DAS) network inside Mercedes-Benz — a deployment that is at the center of a current lawsuit filed by contractor IBM against gear supplier and designer Corning — is also now at full deployment, with the completion of 700 new under-seat DAS antenna deployments, mostly in the upper seating deck.

MSR speed tests taken during Sunday’s game showed a wide range of DAS results, from single-digit tests in some tough-deployment areas to results near 100 Mbps directly in front of what looked like some new antenna deployments. Again, look for more details in our upcoming profile in the Winter Stadium Tech Report (due out in mid-December).

“We’re in a good place [with the DAS],” said Branch, though he did say there was going to be more DAS work done on the outside of Mercedes-Benz Stadium prior to when Super Bowl LIII comes to the venue on Feb. 3, 2019, mainly to help ensure that the move toward more digital Super Bowl tickets goes smoothly. Mercedes-Benz Stadium also now has a couple of MatSing ball antennas in its rafters, there to bring DAS coverage to the sidelines of the playing field.

Sunday the Mercedes-Benz Stadium staffers were hosting a rare big-game back-to-back event, following Saturday’s packed-house tilt between SEC powers Alabama and Georgia, a championship-game rematch won by Alabama 35-28 after a dramatic comeback.

“That was a massive flip,” said Branch of the two-day stretch, which saw another huge data day Saturday with 8.06 TB of Wi-Fi used. The network, sponsored by backbone provider AT&T, averages about a 50 percent take rate from event attendees, according to Branch, who gave praise to Aruba and AmpThink for their combined deployment efforts.

“The expectation for fans now is that there will be Wi-Fi [in a sports venue],” said Branch. “But I love it when friends come to me after a game and tell me ‘the Wi-Fi is so fast!’ ”

THE MSR TOP 17 FOR WI-FI

1. Super Bowl 52, U.S. Bank Stadium, Minneapolis, Minn., Feb. 4, 2018: Wi-Fi: 16.31 TB
2. 2018 College Football Playoff Championship, Alabama vs. Georgia, Mercedes-Benz Stadium, Atlanta, Ga., Jan. 8, 2018: Wi-Fi: 12.0 TB*
3. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
4. Atlanta Falcons vs. Philadelphia Eagles, Lincoln Financial Field, Philadelphia, Pa., Sept. 6, 2018: Wi-Fi: 10.86 TB
5. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
6. Taylor Swift Reputation Tour, Gillette Stadium, Foxborough, Mass., July 27, 2018: Wi-Fi: 9.76 TB
7. Minnesota Vikings vs. Philadelphia Eagles, NFC Championship Game, Lincoln Financial Field, Philadelphia, Pa., Jan. 21, 2018: Wi-Fi: 8.76 TB
8. Jacksonville Jaguars vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 21, 2018: Wi-Fi: 8.53 TB
9. Taylor Swift Reputation Tour, Broncos Stadium at Mile High, Denver, Colo., May 25, 2018: Wi-Fi: 8.1 TB
10. Kansas City Chiefs vs. New England Patriots, Gillette Stadium, Foxborough, Mass., Sept. 7, 2017: Wi-Fi: 8.08 TB
11. SEC Championship Game, Alabama vs. Georgia, Mercedes-Benz Stadium, Atlanta, Ga., Dec. 1, 2018: Wi-Fi: 8.06 TB*
12. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
13. Stanford vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Sept. 29, 2018: Wi-Fi: 7.19 TB
14. (tie) Southern California vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Oct. 21, 2017: Wi-Fi: 7.0 TB
Arkansas State vs. Nebraska, Memorial Stadium, Lincoln, Neb., Sept 2, 2017: Wi-Fi: 7.0 TB
15. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
16. Wisconsin vs. Nebraska, Memorial Stadium, Lincoln, Neb., Oct. 7, 2017: Wi-Fi: 6.3 TB
17. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB

* = pending official exact data