JMA touts virtualized RAN for DAS with new XRAN platform

The marketplace for in-building distributed antenna system (DAS) deployments got an interesting jolt Monday with JMA Wireless’s announcement of its new XRAN software platform, which promises to bring the flexibility and cost savings of hardware virtualization to the world of Radio Access Network (RAN) equipment.

In a nutshell, the idea behind JMA’s XRAN is to use software running on off-the-shelf Intel-based servers to replace the dedicated racks of equipment traditionally used to carry signals from cellular carrier lines to antenna infrastructure in a DAS. In addition to potentially large savings in equipment, power and cooling costs, and sheer rack space, XRAN also promises to allow cloud-based sharing and administration of systems, which could let multiple buildings or a campus share an integrated system for flexible capacity control.

A stadium with XRAN, in an example provided by JMA, could theoretically share its DAS infrastructure with nearby office buildings, allowing for better use of resources. Though XRAN is not yet deployed commercially anywhere, JMA also announced Monday that the software is being used by Telecom Italia Mobile in a “live dense urban network application.” The announcements were officially made at the Mobile World Congress show in Barcelona.

Looking to cut costs for enterprise wireless deployments

The XRAN announcement may be of most immediate interest in the stadium wireless marketplace to third-party network operators, who typically build a DAS for a stadium and rent space on it back to carriers. That model, employed by companies including Boingo, Mobilitie, ExteNet and 5 Bars, has come under pressure lately as carriers voice displeasure over paying what they sometimes consider exorbitant prices for access. If the cost of deploying and operating a DAS could be cut, third-party operators might be able to offer more attractive rates to ensure carrier participation.

To be sure, virtualized RAN operations (sometimes known as “C-RAN,” for cloud-based RAN) have been a focus of many companies in the telecom services space, for the same cost savings and feature flexibility promised by switching from dedicated hardware to commodity platforms. In the press literature accompanying the announcement, JMA notes that while some “partially virtualized” RAN equipment exists, it claims the XRAN platform is the first fully virtual RAN: software “that can process the full protocol stack” from Layer 1 through Layer 3.

If the cost savings and functional flexibility of RAN virtualization follow the curves seen when applications moved to virtualized servers, XRAN and any similar platforms that follow could also hold interest for commercial real estate owners and operators. Most industry estimates show that many large commercial buildings, like office towers, currently lack a comprehensive indoor wireless coverage solution; by eliminating a big chunk of the cost of a DAS, or by allowing campuses or multiple buildings to share the costs, virtualization could make a DAS a more attractive option.

“Cost, simplicity, footprint, power, and cooling changes dramatically with XRAN,” said Todd Landry, corporate vice president of product and market strategy at JMA Wireless, in a prepared statement. “XRAN is designed from its inception to close the gap between rapidly growing in-building mobile connectivity demands and today’s complex, proprietary hardware solutions unable to evolve and adapt for multi-operator services.”

More to come as we hear more about what is sure to be a talked-about subject in the big-building wireless world!

Minneapolis airport sees 6 TB of Wi-Fi traffic day after Super Bowl

Super Bowl signs hang in the concourse at Minneapolis-St. Paul airport. Credit: MAC (click on any photo for a larger image)

A day after Super Bowl 52 at U.S. Bank Stadium in Minneapolis set new records for wireless data consumption, Minneapolis-St. Paul International Airport had a big wireless day of its own, with 6 terabytes of traffic on the airport’s Wi-Fi network and another 6.5 TB on the Verizon cellular network.

Eduardo Valencia, vice president and chief information officer for the Metropolitan Airports Commission, said the Wi-Fi data used on Feb. 5 was “close to double typical data consumption” on the free-access network provided by Boingo Wireless, even though the airport saw a fairly normal range of users connecting.

“There was no spike in [the number] of users, but the users who did connect consumed twice as much data, with downloads about 3 times normal,” Valencia said. The Monday-departure crowd saw about 31,000 unique users connect to the Wi-Fi network, which Valencia said “is at the top of the normal user range” the airport usually sees. During the week leading up to the big game on Feb. 4, the airport Wi-Fi saw between 23,000 and 31,000 daily connections.

Boingo, which has been powering the Wi-Fi at Minneapolis-St. Paul International Airport (aka MSP) since 2012, updated and expanded coverage a year ago, according to Valencia. Though Boingo would not provide details on how many new access points (APs) were added or how many the network has now, Valencia said coverage was increased in many areas, like the tunnels between terminals, to make sure visitors didn’t lose connectivity.

New neutral host DAS from Verizon

Super Bowl LII signage along a moving walkway at MSP. Credit: MAC

The cellular infrastructure at the airport also got an upgrade before the Super Bowl, with a neutral-host distributed antenna system (DAS) deployed by Verizon Wireless. The DAS, which uses Corning ONE fiber equipment on the back end, provided coverage for all the top wireless carriers, Valencia said. Though the timing was close — the final pieces went live on Jan. 19 — the expanded DAS, which added antennas throughout the terminals as well as outside covering runways, also performed well, according to Valencia.

Though only Verizon stats were available, Valencia said Verizon saw an average of 2.8 TB of data per day over an 11-day span around the Super Bowl, with 6.5 TB of traffic on Monday, Feb. 5. Like the Wi-Fi traffic, Verizon’s day-after total was about double the average daily consumption, Valencia said.

While there is extra pressure to perform ahead of the NFL’s big game — “The NFL told us the Super Bowl experience begins and ends at the airport,” Valencia said — the payoff will stay for years, as all the new network gear added in advance is permanent.

“We swallowed hard for 9 days, but the success was the culmination of a lot of planning,” Valencia said. “Now the good thing is, everything [in the network] is here to stay.”

Connectivity at the core of Little Caesars Arena, District Detroit

Little Caesars Arena, the new home for the Detroit Red Wings and the Detroit Pistons. Credit: Olympia Entertainment (click on any photo for a larger image)

Bringing great wireless connectivity to a new stadium is almost table stakes these days. But building up a nearby commercial district — and keeping connectivity high outside the venue’s walls — is a bet on another level, especially in Detroit, where networks extend beyond the new Little Caesars Arena into the 50-block District Detroit.

Following the arena’s opening in September 2017, the early prognosis is so far, so good, with solid reports of high network performance on both the Wi-Fi and cellular networks in and around the new home of the NHL’s Detroit Red Wings and the NBA’s Detroit Pistons. But for John King, vice president of IT and innovation at venue owner Olympia Entertainment, the responsibilities of his network team extend far beyond the new stadium’s walls.

“We’re focused on the [wireless] signal not just in the bowl, but also in the surrounding elements — the streets, the outdoor arenas, and the Little Caesars Arena garage,” said King in an interview shortly after the arena opened. “The vision is, to be connected wherever you are. And to share that experience.”

An ambitious revival in downtown Detroit

Editor’s note: This profile is from our most recent STADIUM TECH REPORT for Winter 2018, which is available for FREE DOWNLOAD from our site. This issue has an in-depth look at the wireless networks at U.S. Bank Stadium in Minneapolis, as well as profiles of network deployments at the Las Vegas Convention Center and Orlando City Stadium! DOWNLOAD YOUR FREE COPY today!

The inside concourse at Little Caesars Arena. Credit: Olympia Entertainment

Built near the Detroit Lions’ Ford Field and the Tigers’ Comerica Park, the new hoops/hockey venue seats 19,515 for hockey and 20,491 for basketball. Unlike many stadiums of the past, which rise up from the ground, Little Caesars Arena is built into the ground, 40 feet below street level. The innovations in construction and accessibility, including an outdoor arena adjacent to the indoor one, may require another full profile and an in-person visit.

For now, we’ll concentrate on the wireless deployment in and around Little Caesars Arena, which was funded in part by a sponsorship from Comcast Business, which also provides backbone bandwidth to the arena and the district in the form of two 100 Gbps connections. The Wi-Fi network design and deployment, done by AmpThink, uses Cisco Wi-Fi gear; Cisco’s Vision for Sports and Entertainment (formerly known as StadiumVision) is used to synchronize video output to the 1,500 TV screens located in and around the venue.

On the cellular side, Verizon Wireless built a neutral-host DAS, which was set to welcome AT&T as its second carrier shortly after the opening. According to King, the Wi-Fi network has approximately 1,100 total APs inside and outside the arena, many of them from Cisco’s 3802 series, each with two radios. Of the 300 APs in the main seating bowl, many are deployed under seats, with others placed in handrail enclosures, especially in the basketball floor-seating areas.

“AmpThink did a really nice job with the deployment,” said King, who noted that the arena’s open-air suite spaces helped provide “lots of flow” for wireless signals, without the historical overhangs that block signals on different levels. One early visitor to the arena saw many Wi-Fi speed tests in the 50-60 Mbps range for both download and upload, as well as several in the 80-to-100 Mbps range, signs that a strong signal was available right from the start.

“We’ve still got a lot of tuning, but early on we’re getting great results,” said King of the Wi-Fi performance. “Our goal is to make it the best it can be.”

Staying connected outside the walls

Like The Battery area surrounding the Atlanta Braves’ new SunTrust Park, the District Detroit is meant to be a stay-and-play kind of space, with restaurants, clubs, office spaces and residences seeking to lure visitors and residents to do more than just see a game. For King and his team, one of their tasks is to ensure that visitors can stay connected no matter where they are inside the district, including inside restaurants, offices and other indoor spaces.

Connectivity blends well with the architecture inside Little Caesars Arena. Credit: Tod Caflisch, special to MSR

“We want the [network] signal to be robust, to carry into outdoor spaces, restaurants and many other areas” inside the District Detroit, King said. “We want to push the envelope a little bit and create a useful opportunity.”

Back inside Little Caesars Arena, the team and stadium apps are built by Venuetize, which built a similar integrated app for the Buffalo Bills and the Buffalo Sabres, one that also extends outside arenas to support connectivity in city areas. King said that Little Caesars Arena will be testing pre-order and express pickup concession ordering through the app, with a focus on seating areas that don’t have ready access to some of the club facilities.

Like any other new facility, Little Caesars Arena will no doubt go through some growing pains in its debut season, but for King and others who spent time getting the venue ready, it’s fun to have the doors open.

“It’s really great seeing it all come to life,” King said.

Fans use 16.31 TB of Wi-Fi data during Super Bowl 52 at U.S. Bank Stadium

A Wi-Fi handrail enclosure at U.S. Bank Stadium in Minneapolis. Credit: Paul Kapustka, MSR (click on any photo for a larger image)

It is now official — we have a new record for most Wi-Fi data used at a single-day event, as fans at U.S. Bank Stadium in Minneapolis for Super Bowl 52 used 16.31 terabytes of data on the Wi-Fi network.

According to statistics compiled by Extreme Networks during the Philadelphia Eagles’ thrilling 41-33 victory over the New England Patriots Sunday night, the AmpThink-designed network, which uses Cisco Wi-Fi gear, also saw 40,033 unique users — 59 percent of the 67,612 in attendance — the highest take rate for any single-game network we’ve been told about. (The Dallas Cowboys saw approximately 46,700 unique Wi-Fi users during a playoff game last season, about 50 percent of attendance at AT&T Stadium.)

The network also saw peak concurrent connections of 25,670 users and a peak data transfer rate of 7.867 Gbps, according to the numbers released by Extreme. Though Extreme gear was not used in the operation of the network, Extreme has a partnership deal with the NFL under which it provides the “official” network analytics reports from the Super Bowl.

The final total of 16.31 TB easily puts Super Bowl 52 ahead of the previous two Super Bowls in Wi-Fi data use. Last year at NRG Stadium in Houston, 11.8 TB of Wi-Fi use was recorded, and at Super Bowl 50 in 2016, 10.1 TB of Wi-Fi data was used at Levi’s Stadium in Santa Clara, Calif. In reverse chronological order, the last three Super Bowls are now the top three Wi-Fi events, a sign that data demand at the NFL’s biggest game shows no sign of slowing down. Combined with the 50.2 TB of cellular data used in and around the stadium on game day, Super Bowl 52 saw a total of 66.51 TB of wireless traffic Sunday in Minneapolis.
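
For readers who like to check the math, here is a minimal sketch in Python that verifies the totals and growth rates from the figures reported above (the numbers come from this story; the script itself is just illustrative arithmetic):

```python
# Back-of-envelope check of the Super Bowl 52 numbers reported above.
wifi_tb = 16.31        # Wi-Fi total, Super Bowl 52
cellular_tb = 50.2     # combined AT&T/Verizon/Sprint cellular total
unique_users = 40_033  # unique Wi-Fi users
attendance = 67_612    # announced attendance

print(f"Total wireless traffic: {wifi_tb + cellular_tb:.2f} TB")  # 66.51 TB
print(f"Wi-Fi take rate: {unique_users / attendance:.1%}")        # ~59.2%

# Year-over-year Wi-Fi growth across the last three Super Bowls (TB)
history = [(50, 10.1), (51, 11.8), (52, 16.31)]
for (prev_sb, prev_tb), (cur_sb, cur_tb) in zip(history, history[1:]):
    growth = 100 * (cur_tb / prev_tb - 1)
    print(f"SB{prev_sb} -> SB{cur_sb}: {growth:.0f}% growth")
```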

Confetti fills the air inside U.S. Bank Stadium after the Philadelphia Eagles defeated the New England Patriots in Super Bowl LII. Credit: U.S. Bank Stadium

Super Bowl 52 represented something of a leap of faith, in that the handrail-enclosure Wi-Fi design had not yet faced a stress test like the NFL’s biggest event. Now looking ahead to hosting the 2019 Men’s NCAA Basketball Final Four, David Kingsbury, director of IT for U.S. Bank Stadium, can be forgiven for wanting to take a bit of a victory lap before we set our Wi-Fi sights on Atlanta’s Mercedes-Benz Stadium, home of Super Bowl 53.

“AmpThink, CenturyLink and Cisco designed and built a world-class wireless system for U.S. Bank Stadium that handled record-setting traffic for Super Bowl LII,” Kingsbury said. “AmpThink president Bill Anderson and his team of amazing engineers were a pleasure to work with and the experts at Cisco Sports and Entertainment supported us throughout the multi-year planning process required for an event of this magnitude. High-density wireless networking is such a challenging issue to manage, but I am very happy with our results and wish the team in Atlanta the best next year. The bar has been raised.”

THE LATEST TOP 10 FOR WI-FI

1. Super Bowl 52, U.S. Bank Stadium, Minneapolis, Minn., Feb. 4, 2018: Wi-Fi: 16.31 TB
2. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
3. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
4. Minnesota Vikings vs. Philadelphia Eagles, NFC Championship Game, Lincoln Financial Field, Philadelphia, Pa., Jan. 21, 2018: Wi-Fi: 8.76 TB
5. Kansas City Chiefs vs. New England Patriots, Gillette Stadium, Foxborough, Mass., Sept. 7, 2017: Wi-Fi: 8.08 TB
6. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
7. Southern California vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Oct. 21, 2017: Wi-Fi: 7.0 TB
8. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
9. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
10. Georgia vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Sept. 9, 2017: Wi-Fi: 6.2 TB

U.S. Bank Stadium in Minneapolis before the start of Super Bowl LII

Update: AT&T, Verizon and Sprint see a combined 50.2 TB of cellular traffic for Super Bowl 52

Some of the JMA TEKO gear used in the DAS at U.S. Bank Stadium. Credit: Paul Kapustka, MSR

AT&T, Verizon Wireless and Sprint said they saw a combined 50.2 terabytes of cellular traffic in and around U.S. Bank Stadium in Minneapolis before, during and after the Philadelphia Eagles’ thrilling 41-33 victory over the New England Patriots in Super Bowl 52 Sunday.

Though some of the totals cover different-sized areas, they roughly correspond to the metrics used by the same carriers at last year’s Super Bowl 51 in Houston, where a combined total of 25.8 TB of cellular traffic was reported. Like last year, T-Mobile representatives said they will not report data use from the Super Bowl, even though the carrier’s executives tweeted Sunday night about strong network performance and significant data-use growth over last year’s big game, without mentioning totals for either.

Without agreed-upon standards for such reporting, comparing one year’s results to the next is not an exact science, since numerous variables exist, like the density of fixed and portable networks and the location of stadiums (Minneapolis’ U.S. Bank Stadium, for example, is in the middle of a downtown, while NRG Stadium, home of Super Bowl 51 in Houston, is not). Still, since carriers typically use the same reporting metrics year to year, it’s possible to see a continued increase in data use, a sign that demand for mobile connectivity at sporting events continues to grow.

Social media, video and audio rule the day

Curiously, AT&T saw a decrease this year in the amount of traffic it measured directly inside and immediately outside the venue: it reported 7.2 TB of traffic Sunday on the in-stadium DAS as well as on its mobile cell sites and macro sites just outside U.S. Bank Stadium, down from the 9.8 TB it reported in similar locations around NRG Stadium in Houston in 2017.

But in extending its reporting to a 2-mile radius around U.S. Bank Stadium — the same base metric used by Verizon — AT&T said it saw 21.7 TB of traffic Sunday. Verizon, which reported 11 TB of traffic last year in Houston, said it saw 18.8 TB of cellular data used on its networks inside the 2-mile perimeter around U.S. Bank Stadium Sunday. Verizon did not report a figure for its infrastructure inside and adjacent to the stadium. The main cellular infrastructure inside U.S. Bank Stadium, a neutral host DAS, was built and is run by Verizon.

Sprint, which reports traffic each year from networks inside and directly adjacent to the stadiums, said it saw 9.7 TB of traffic on its networks Sunday, up from 5 TB in 2017.

Some quick facts emailed to us by Verizon reps showed that top uses by Verizon customers were led (in order) by web browsing, streaming video, and social media and sports app usage. According to Verizon, the top three social media apps used by its customers were Snapchat, Facebook and Instagram, “with Snapchat moving from third at last year’s Super Bowl to first most used.”

Also according to Verizon, the largest traffic spikes came from “social media video sharing” during the halftime performance, followed by reaction to the Patriots’ fumble late in the game, and then by kickoff, when Verizon customers were streaming video and browsing the web. Verizon also said its network was used by 57 percent of the fans at U.S. Bank Stadium, which may explain why the carrier spent a lot of time and money upgrading the network before Sunday’s event.

We have also heard that Wi-Fi usage broke previous records, but we do not yet have an official number to report.

A final note: Thanks to all the carrier representatives for their figures, and to all our Twitter followers for input and advice on how best to present these important metrics. We’ll keep working to make this process as good as it can be, so let us know what you think!

‘Super’ Wi-Fi and DAS at U.S. Bank Stadium ready for Super Bowl 52

A look at downtown Minneapolis from inside U.S. Bank Stadium. Credit all photos: Paul Kapustka, MSR (click on any photo for a larger image)

After Sunday’s stunning last-second victory, the Minnesota Vikings are one step closer to becoming the first team to play a Super Bowl in its own home stadium. Should the Vikings beat the Eagles in Philadelphia this weekend, Super Bowl 52 visitors should prepare for a true Norse experience inside U.S. Bank Stadium, with repeated blasts from the oversize “Gjallarhorn” and a fire-breathing dragon ship that will launch the home team onto the field. Skol!

But even if the hometown team falls short of making the big game this season, on Feb. 4, 2018 the stadium itself should do Minneapolis proud, especially when it comes to wireless connectivity. With two full regular seasons of football and numerous other events to test the networks’ capacity, both the Wi-Fi and DAS networks inside the 66,655-seat U.S. Bank Stadium appear more than ready to handle what is usually the highest single-day bandwidth stress test, namely the NFL’s yearly championship game. (Though the selfies and uploads following Sunday’s walk-off touchdown toss may have provided an early indicator of massive network use!)

In a mid-November visit to U.S. Bank Stadium for a Vikings home game against the Los Angeles Rams, Mobile Sports Report found robust coverage on both the Wi-Fi and cellular networks all around the inside of the stadium, with solid performance even amidst thick crowds of fans and even in the highest reaches of the seating bowl. Speedtests on the Wi-Fi network, built by AmpThink using Cisco gear, regularly hit marks of 40 to 50-plus Mbps in most areas, with one reading reaching 85 Mbps for download speeds.

And on the DAS side of things, Verizon Wireless, which built the neutral-host network inside U.S. Bank Stadium, said in December that it had already seen more cellular traffic on its network for a Vikings home game this season than it saw at NRG Stadium for Super Bowl LI last February. With 1,200 total antennas — approximately 300 of which were installed this past offseason — Verizon said it is ready to handle even double the traffic it saw at last year’s game, when it reported carrying 11 terabytes of data on stadium and surrounding macro networks.

Good connectivity right inside the doors

Editor’s note: This profile is from our most recent STADIUM TECH REPORT for Winter 2017-18, which is available for FREE DOWNLOAD from our site. This issue has an in-depth look at the wireless networks at U.S. Bank Stadium in Minneapolis, as well as profiles of network deployments at the brand-new Little Caesars Arena, the Las Vegas Convention Center, and Orlando City Stadium! DOWNLOAD YOUR FREE COPY today!

A new Verizon DAS antenna handrail enclosure (right) at U.S. Bank Stadium in Minneapolis. (The enclosure lower left is for Wi-Fi).

James Farstad, chief technology advisor for the Minnesota Sports Facilities Authority (MSFA), the entity that owns U.S. Bank Stadium, said he and his group are “very pleased” with the state of the wireless networks inside the venue heading toward its Super Bowl date.

“You’re never really satisfied, because you want it to be the best it can be,” said Farstad in an interview during our November visit to Minneapolis. “But generally speaking, we’re very pleased with the state of the networks.”

Those networks are tested the very moment the Vikings open the doors for home games, especially in warmer weather when the signature big glass doors — five of them, all 55 feet wide and ranging in height from 75 to 95 feet — swing out to welcome fans. As the entry that points toward downtown, the west gate can account for as much as 70 percent of the fans arriving, according to the Vikings, putting a big crush on the wireless networks in the doorway area.

To help keep people connected in crowded situations, Verizon deployed extra DAS antennas on short poles in front of both the west and east end zone concourse areas, part of a 48 percent increase in overall DAS antenna numbers added during the football offseason. Even with thick crowds streaming into the stadium, we still got a DAS speedtest of 77.35 Mbps download and 32.40 Mbps upload on the concourse just inside the west doors, and just below the Gjallarhorn.

Walking around the main level concourse, connectivity hardware is easy to see if you know what you’re looking for; part of the extensive DAS coverage includes dual antennas hanging off a single pole above wide walkway segments. In one instance we saw a good example of aesthetic integration: a Wi-Fi AP attached just behind two IPTV screens, a beacon attached to the side, and a DAS antenna mounted just above everything else.

First big test of railing-mounted Wi-Fi?

Moving into the seating bowl, visitors may not know that many of the Wi-Fi network’s 1,300 APs are hiding there in plain sight — inside silver handrail enclosures, many of which now sport bright, bold section numbers to help fans find their seats. U.S. Bank Stadium is believed to be the first football-sized stadium to rely mainly on railing-mounted APs, and AmpThink’s proximate network design is proving to be a winner in performance, producing regular-season game data totals of around 3 terabytes per event and, maybe more importantly, keeping an optimal number of fans attached to the AP closest to them for the speediest connection.

Top-down antennas provide coverage for suite seating

Sitting next to AmpThink president Bill Anderson in the stadium’s press box, you get a great view of the field, but it’s doubtful Anderson watches much football action, given that he spends most of a game day glued to a screen showing live, detailed performance data for every Wi-Fi AP in the building. While the analytics program produces a wealth of interesting data, the one metric that keeps Anderson’s attention is the one showing how many fans are connected to each AP, a number that should be no more than 50 and ideally somewhere around 25 if the network is performing as it should.
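
That per-AP load check is easy to express in code. Here is a minimal sketch of the kind of threshold logic described (the AP names and data are hypothetical, and this is not AmpThink’s actual tooling):

```python
# Flag any AP whose client count strays from the ~25 ideal, and alert
# when it passes the 50-connection ceiling described above.
IDEAL_CLIENTS = 25
MAX_CLIENTS = 50

# Hypothetical snapshot of per-AP client counts
ap_clients = {"rail-101-a": 23, "rail-345-c": 31, "concourse-w-12": 214}

for ap, clients in sorted(ap_clients.items(), key=lambda kv: -kv[1]):
    if clients > MAX_CLIENTS:
        print(f"ALERT {ap}: {clients} clients (ceiling {MAX_CLIENTS}) -- investigate")
    elif clients > IDEAL_CLIENTS:
        print(f"watch {ap}: {clients} clients (ideal ~{IDEAL_CLIENTS})")
```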

On the day we visited, Anderson’s screen showed one AP with more than 200 devices trying to connect, an issue Anderson noted for immediate problem-solving. But with only a handful of others showing more than 50 connections, Anderson was confident that AmpThink has figured out how to solve the main dilemma for Wi-Fi in large enclosed structures, namely keeping APs from interfering with each other. The large clear-plastic roof and wall areas at U.S. Bank Stadium don’t help, since they reflect RF signals, adding to the network design’s degree of difficulty.

But the railing-mount network design – which AmpThink duplicated at Notre Dame, whose new network is seeing the highest data totals yet recorded at collegiate events – seems to be fulfilling AmpThink’s goal of producing networks with steady AP loads and consistent, high-density throughput in extremely challenging environments. The railing-mounted APs provide connectivity that couldn’t be delivered by overhead antennas, as in Notre Dame’s open concrete bowl and in U.S. Bank Stadium’s similarly wide-open seating area, where no overhead structure is within 300 feet of a seat.

Two DAS antennas hang from a pole above the main concourse

“I think we have a network strategy that produces good uniform performance” in venues like U.S. Bank Stadium, Anderson said. “It’s pretty darn exciting to have a formula that works.”

More antennas get DAS ready for big game

And even though Verizon knew the Super Bowl was coming to U.S. Bank Stadium when it built the neutral-host DAS for the venue’s 2016 opening, it came back this past offseason and added approximately 300 more antennas (mainly for its own use, not for the shared DAS), all in the name of unstoppable demand for mobile bandwidth from fans attending events.

Diana Scudder, executive director for network assurance at Verizon, said in a phone interview that “the consumer appetite [for wireless data] is insatiable,” especially at the NFL’s biggest game, where DAS use has grown at a fast clip the past few years. Scudder said these days Verizon pretty much plans to see double whatever the last Super Bowl saw for each following big game, and adds network capacity accordingly. Verizon’s numbers from the past three Super Bowls are a good guide, with the carrier reporting 4.1 TB used at Super Bowl 49, 7 TB at Super Bowl 50, and 11 TB at Super Bowl 51.
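
Those reported totals track the doubling heuristic closely. Here is a quick sketch of the implied planning math (illustrative arithmetic only, not Verizon’s actual capacity model):

```python
# Verizon's reported Super Bowl DAS totals (TB) and the "plan for double"
# rule of thumb Scudder described. Illustrative arithmetic only.
reported = {49: 4.1, 50: 7.0, 51: 11.0}

for sb, tb in reported.items():
    print(f"SB{sb}: {tb} TB reported -> plan SB{sb + 1} for ~{2 * tb:.1f} TB")
```

By that rule of thumb, the network heading into Super Bowl 52 would be provisioned for roughly 22 TB of traffic.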

AmpThink’s handrail-mounted AP enclosures appear to have influenced part of Verizon’s DAS upgrade, as some of the new DAS enclosures mimic the smaller silver Wi-Fi ones. Scudder said Verizon used contractors to assist with the new antenna enclosures and mounts, but did not cite AmpThink by name. Verizon also deployed some under-seat antenna enclosures for its upgrade, a tactic the company also used for Super Bowl 50 at Levi’s Stadium in Santa Clara, Calif.

Even up in the most nosebleed of seats — in U.S. Bank Stadium’s case, section 345, where seats almost touch the roof in the southwest corner — we got a DAS speedtest on the Verizon network of 60.87 Mbps down / 44.22 Mbps up, most likely from antennas we could see mounted on ventilation pipes just above the seats, slightly toward the field. And hanging from the middle of U.S. Bank Stadium’s roof are a pair of MatSing ball antennas, which point down to provide cellular service for media and photographers on the sidelines, as well as for floor seating during concerts and other events.

Ready to add more bandwidth on the fly

Mostly unseen, and probably not appreciated until it’s needed, is the stadium’s backbone bandwidth, provided by sponsoring partner CenturyLink.

A Wi-Fi enclosure in section 345, near the stadium’s roof

Though some stadiums are touting 100 Gbps pipes coming in, the U.S. Bank Stadium setup makes the venue its own ISP, according to Farstad.

With six 10-Gbps pipes that are always active — running over two separate network infrastructures for redundancy — the stadium can turn up its bandwidth on the fly, a capability that was tested at the venue’s very first public event.
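
In rough numbers, that design gives the building a 60-Gbps baseline. A small sketch of the capacity arithmetic (assuming, for illustration, an even split of the pipes across the two infrastructures; the actual allocation was not disclosed):

```python
# Rough capacity math for the backbone described above: six always-active
# 10-Gbps pipes across two redundant infrastructures (assumed 3 + 3 split).
pipes = 6
gbps_per_pipe = 10
infrastructures = 2

total_gbps = pipes * gbps_per_pipe
per_path_gbps = total_gbps // infrastructures

print(f"Aggregate baseline: {total_gbps} Gbps")     # 60 Gbps
print(f"Per redundant path: {per_path_gbps} Gbps")  # 30 Gbps
# Even with one infrastructure down, 30 Gbps remains -- well above the
# 7.867-Gbps peak Wi-Fi transfer rate reported at Super Bowl 52.
```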

According to Farstad, when U.S. Bank Stadium opened for the first time with a soccer game on Aug. 3, 2016, the stadium operators expected about 25,000 fans might show up for a clash between Chelsea and AC Milan. But a favorable newspaper article about the stadium led to more than 64,000 fans in the house, a surge that backed up the light-rail trains and saw the concession stands run out of food.

“We were watching the Wi-Fi system during the first break [in the soccer game] and it was coming down fast,” Farstad said. But thanks to the ability to increase capacity quickly — Farstad said the stadium provisioned new bandwidth within 45 seconds, a task that in other situations could take weeks — the Wi-Fi survived the unexpected demand, proof that it should be able to handle whatever happens on Super Bowl Sunday.

“I think we can handle the Super Bowl traffic,” Farstad said.