JMA touts virtualized RAN for DAS with new XRAN platform

The marketplace for in-building distributed antenna system (DAS) deployments got an interesting jolt Monday with JMA Wireless’s announcement of its new XRAN software platform, which promises to bring the flexibility and cost savings of hardware virtualization to the world of Radio Access Network (RAN) equipment.

In a nutshell, the idea behind JMA’s XRAN is to use software running on off-the-shelf Intel-based servers to replace the dedicated racks of equipment traditionally used to carry signals from cellular carrier lines to antenna infrastructure in a DAS. In addition to potentially large savings in equipment, cooling and power costs, and sheer space, XRAN also promises cloud-based sharing and administration of systems, which could allow multiple buildings or a campus to share an integrated system for flexible capacity control.

In an example provided by JMA, a stadium with XRAN could theoretically share its DAS infrastructure with nearby office buildings, allowing for better use of resources. Though XRAN is not yet commercially deployed in any stadium, JMA also announced Monday that the software is being used by Telecom Italia Mobile in a “live dense urban network application.” The announcements were made officially at the Mobile World Congress show in Barcelona.

Looking to cut costs for enterprise wireless deployments

The XRAN announcement may be of most immediate interest in the stadium wireless marketplace to third-party network operators, who typically build a DAS network for a stadium and rent space on it back to carriers. That model, employed by companies including Boingo, Mobilitie, ExteNet and 5 Bars, has come under pressure lately as carriers have voiced displeasure over having to pay what they sometimes consider exorbitant prices for access. If the costs of DAS deployment and operation could be cut, third-party operators might be able to offer more attractive rates to ensure carrier participation.

To be sure, virtualized RAN operations (sometimes known as “C-RAN,” for cloud-based RAN) have been a focus for many companies in the telecom services space, thanks to the same cost-saving and flexibility promises that come with switching from dedicated hardware to commodity platforms. In the press literature accompanying its announcement, JMA notes that while some “partially virtualized” RAN equipment exists, it claims the XRAN platform is the first fully virtual RAN: software “that can process the full protocol stack” from Layer 1 through Layer 3.
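To illustrate what a “full protocol stack” in software means, here is a minimal, purely conceptual Python sketch of a fully virtualized RAN pipeline, where every layer from the physical layer (L1) up through radio resource control (L3) runs as ordinary software on a commodity server. All function names and logic here are hypothetical stand-ins for illustration, not JMA’s actual implementation:

```python
# Conceptual sketch only: models the idea of a "fully virtualized" RAN,
# where every protocol layer runs as software on a commodity server
# instead of on dedicated hardware. Hypothetical names, not JMA's APIs.

def layer1_phy(samples):
    """L1 (physical layer): demodulate raw radio samples into bits."""
    return [s % 2 for s in samples]          # toy stand-in for demodulation

def layer2_mac(bits):
    """L2 (MAC/RLC): frame the bit stream into fixed-size transport blocks."""
    return [bits[i:i + 4] for i in range(0, len(bits), 4)]

def layer3_rrc(blocks):
    """L3 (RRC): attach control-plane context to each transport block."""
    return [{"bearer": 1, "payload": b} for b in blocks]

def full_stack(samples):
    """A fully virtual RAN processes L1 through L3 entirely in software."""
    return layer3_rrc(layer2_mac(layer1_phy(samples)))

packets = full_stack(list(range(8)))
```

A “partially virtualized” architecture, by contrast, would keep one or more of these stages (typically L1) on dedicated hardware and run only the upper layers in software.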

If the cost savings and functional flexibility of RAN virtualization follow the curves seen when virtualization swept the application world, XRAN (or any similar platforms that follow) could also hold interest for commercial real estate owners and operators. Most industry estimates show that many large commercial buildings, like office towers, currently lack a comprehensive indoor wireless coverage solution; by eliminating a big chunk of the cost of a DAS, or by letting campuses or multiple buildings share the costs, virtualization could make a DAS a much more attractive option.

“Cost, simplicity, footprint, power, and cooling changes dramatically with XRAN,” said Todd Landry, corporate vice president of product and market strategy at JMA Wireless, in a prepared statement. “XRAN is designed from its inception to close the gap between rapidly growing in-building mobile connectivity demands and today’s complex, proprietary hardware solutions unable to evolve and adapt for multi-operator services.”

More as we hear more from what is sure to be a talked-about subject in the big-building wireless world!

Tampa Bay Lightning pick Venuetize for new Amalie Arena app

The Tampa Bay Lightning and Amalie Arena have selected developer Venuetize for a new team and stadium app that will bring features including a multi-purpose digital wallet that will help fans manage their ticket options for hockey games and other events at the venue.

Screen shot of the new Amalie Arena app by Venuetize.

Announced in January, the new app is already available for iOS and Android devices. According to the team, the app supports purchasing concessions and merchandise with a mobile device, as well as performing detailed ticketing transactions, including transfers and even transfers of discounts.

The deal with the Lightning represents Venuetize’s second NHL deal this season, following the company’s win to provide a similar stadium and team app for the Detroit Red Wings (and the Detroit Pistons) at Little Caesars Arena. Venuetize also previously built an integrated app for the Buffalo Bills and Buffalo Sabres.

With new entrants like Hopscotch challenging established app players like YinzCam and VenueNext in the stadium and team app arena, Venuetize seems to be claiming its own turf with apps that lean heavily on transaction features, as well as the ability to easily shift between sporting events and other events at stadiums.

YinzCam, which made its name early in the space with content-focused apps, recently unveiled a feature that allows app users to order and pay for food and beverages. Clearly, the ability to support more transaction-based services seems to be part of the increased table stakes in the stadium and team app market going forward.

Minneapolis airport sees 6 TB of Wi-Fi traffic day after Super Bowl

Super Bowl signs hang in the concourse at Minneapolis-St. Paul airport. Credit: MAC (click on any photo for a larger image)

A day after Super Bowl 52 at U.S. Bank Stadium in Minneapolis set new records for wireless data consumption, the Minneapolis-St. Paul International airport had a big wireless day of its own, with 6 terabytes of traffic used on the airport’s Wi-Fi network and another 6.5 TB on the Verizon cellular network.

Eduardo Valencia, vice president and chief information officer for the Metropolitan Airports Commission, said the Wi-Fi data used on Feb. 5 was “close to double typical data consumption” on the free-access network provided by Boingo Wireless, even though the airport saw a fairly normal range of users connecting.

“There was no spike in [the number] of users, but the users who did connect consumed twice as much data, with downloads about 3 times normal,” Valencia said. The Monday-departure crowd, he said, saw about 31,000 unique users connect to the Wi-Fi network, which Valencia said “is at the top of the normal user range” the airport network usually sees. Valencia said that during the week leading up to the big game on Feb. 4, the airport Wi-Fi saw between 23,000 and 31,000 daily connections.

Boingo, which has been powering the Wi-Fi at Minneapolis-St. Paul International Airport (aka MSP) since 2012, updated and expanded coverage a year ago, according to Valencia. Though Boingo would not provide details on how many new APs were added or how many the network has now, Valencia said coverage was increased in many areas, like the tunnels between terminals, to make sure visitors didn’t lose connectivity.

New neutral host DAS from Verizon

Super Bowl LII signage along a moving walkway at MSP. Credit: MAC

The cellular infrastructure at the airport also got an upgrade before the Super Bowl, with a neutral host distributed antenna system (DAS) deployed by Verizon Wireless. The DAS, which uses Corning ONE fiber equipment on the back end, provided coverage for all the top wireless carriers, Valencia said. Though it was cut close — the final pieces went live on Jan. 19, according to Valencia — the expanded DAS, which added antennas all over the terminals as well as outside covering runways, also performed well, according to Valencia.

Though only Verizon stats were available, Valencia said Verizon saw an average of 2.8 TB of data per day in an 11-day span around the Super Bowl, with 6.5 TB of traffic seen on Monday, Feb. 5. Like the Wi-Fi traffic, Valencia said Verizon’s day-after total was about double the average daily consumption.
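A quick check of the airport math (our own calculation, not Verizon’s or the MAC’s) bears out that characterization:

```python
# Comparing the day-after-game Verizon cellular total at MSP
# to the 11-day average reported around the Super Bowl.
avg_daily_tb = 2.8     # average daily Verizon traffic over the 11-day span
monday_tb = 6.5        # Verizon traffic on Monday, Feb. 5
ratio = monday_tb / avg_daily_tb   # ~2.3x the average, in line with "about double"
```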

While there is extra pressure to perform ahead of the NFL’s big game — “The NFL told us the Super Bowl experience begins and ends at the airport,” Valencia said — the payoff will stay for years, as all the new network gear added in advance is permanent.

“We swallowed hard for 9 days, but the success was the culmination of a lot of planning,” Valencia said. “Now the good thing is, everything [in the network] is here to stay.”

Connectivity at the core of Little Caesars Arena, District Detroit

Little Caesars Arena, the new home for the Detroit Red Wings and the Detroit Pistons. Credit: Olympia Entertainment (click on any photo for a larger image)

Bringing great wireless connectivity to a new stadium is almost table stakes these days. But building up a nearby commercial district — and keeping connectivity high outside the venue’s walls — is a bet on another level, especially in Detroit, where networks extend outside the new Little Caesars Arena into the 50-block District Detroit.

Following the arena’s opening in September 2017, the early prognosis is so far, so good, with solid reports of strong performance on both the Wi-Fi and cellular networks in and around the new home of the NHL’s Detroit Red Wings and the NBA’s Detroit Pistons. But for John King, vice president of IT and innovation for venue owner Olympia Entertainment, the responsibilities for him and his network team extend far beyond the new stadium’s walls.

“We’re focused on the [wireless] signal not just in the bowl, but also in the surrounding elements — the streets, the outdoor arenas, and the Little Caesars Arena garage,” said King in an interview shortly after the arena opened. “The vision is, to be connected wherever you are. And to share that experience.”

An ambitious revival in downtown Detroit

Editor’s note: This profile is from our most recent STADIUM TECH REPORT for Winter 2018, which is available for FREE DOWNLOAD from our site. This issue has an in-depth look at the wireless networks at U.S. Bank Stadium in Minneapolis, as well as profiles of network deployments at the Las Vegas Convention Center and Orlando City Stadium! DOWNLOAD YOUR FREE COPY today!

The inside concourse at Little Caesars Arena. Credit: Olympia Entertainment

Built near the Detroit Lions’ Ford Field and the Tigers’ Comerica Park, the new hoops/hockey stadium seats 19,515 for hockey and 20,491 for basketball. Unlike many stadiums of the past, which rise up from the ground, Little Caesars Arena is built into the ground, 40 feet below street level. The innovations in construction and accessibility, including an outdoor arena adjacent to the indoor one, may require another full profile and an in-person visit. For now, we’ll concentrate on the wireless deployment in and around Little Caesars Arena, which was funded in part by a sponsorship from Comcast Business; Comcast provides backbone bandwidth to the arena and the district in the form of two 100-Gbps connections. The Wi-Fi network design and deployment, done by AmpThink, uses Cisco Wi-Fi gear; Cisco Vision for Sports and Entertainment (formerly known as StadiumVision) is used to synchronize video output to the 1,500 TV screens located in and around the venue.

On the cellular side, Verizon Wireless built a neutral-host DAS, which was getting ready to welcome AT&T as the second carrier on board shortly after the opening. According to King, the Wi-Fi network has approximately 1,100 total APs inside and outside the arena, many from Cisco’s 3802 series, each of which has two radios. For many of the 300 APs located in the main seating bowl, Little Caesars Arena went with an under-seat deployment, with some others placed in handrail enclosures, especially in the basketball floor-seating areas.

“AmpThink did a really nice job with the deployment,” said King, who said the arena’s open-air suite spaces helped provide “lots of flow” to wireless gear, without the historical overhangs around to block signals on different levels. One early visitor to the arena saw many Wi-Fi speed tests in the 50-60 Mbps range for both download and upload, as well as several in the 80-to-100 Mbps range, signs that a strong signal was available right at the start.

“We’ve still got a lot of tuning, but early on we’re getting great results,” said King of the Wi-Fi performance. “Our goal is to make it the best it can be.”

Staying connected outside the walls

Like The Battery area surrounding the Atlanta Braves’ new SunTrust Park, the District Detroit is meant to be a stay-and-play kind of space, with restaurants, clubs, office spaces and residences seeking to lure visitors and residents to do more than just see a game. For King and his team, one of their tasks is to ensure that visitors can stay connected no matter where they are inside the district, including inside restaurants, offices and other indoor spaces.

Connectivity blends well with the architecture inside Little Caesars Arena. Credit: Tod Caflisch, special to MSR

“We want the [network] signal to be robust, to carry into outdoor spaces, restaurants and many other areas” inside the District Detroit, King said. “We want to push the envelope a little bit and create a useful opportunity.”

Back inside Little Caesars Arena, the team and stadium apps are built by Venuetize, which built a similar integrated app for the Buffalo Bills and the Buffalo Sabres, one that also extends outside arenas to support connectivity in city areas. King said that Little Caesars Arena will be testing pre-order and express pickup concession ordering through the app, with a focus on seating areas that don’t have ready access to some of the club facilities.

Like any other new facility, Little Caesars Arena will no doubt go through some growing pains in its debut season, but for King and others who spent time getting the venue ready it’s fun to have the doors open.

“It’s really great seeing it all come to life,” King said.

Fans use 16.31 TB of Wi-Fi data during Super Bowl 52 at U.S. Bank Stadium

A Wi-Fi handrail enclosure at U.S. Bank Stadium in Minneapolis. Credit: Paul Kapustka, MSR (click on any photo for a larger image)

It is now official — we have a new record for most Wi-Fi data used at a single-day event, as fans at U.S. Bank Stadium in Minneapolis for Super Bowl 52 used 16.31 terabytes of data on the Wi-Fi network.

According to statistics compiled by Extreme Networks during the Philadelphia Eagles’ thrilling 41-33 victory over the New England Patriots Sunday night, the AmpThink-designed network, which uses Cisco Wi-Fi gear, also saw 40,033 unique users — 59 percent of the 67,612 in attendance — the highest take rate we’ve been told about for any single-game network. (The Dallas Cowboys saw approximately 46,700 unique Wi-Fi users during a playoff game last season, about 50 percent of attendance at AT&T Stadium.)

The network also saw peak concurrent connections of 25,670 users, and a peak data transfer rate of 7.867 Gbps, according to the numbers released by Extreme. Though Extreme gear was not used in the operation of the network, Extreme has a partnership deal with the NFL under which it provides the “official” network analytics reports from the Super Bowl.
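For context, the reported figures can be sanity-checked with a few lines of arithmetic; this is our own back-of-the-envelope math, not part of Extreme’s report:

```python
# Putting the reported Super Bowl 52 Wi-Fi numbers in per-fan terms.
unique_users = 40033          # unique Wi-Fi users reported by Extreme
attendance = 67612            # announced attendance
peak_gbps = 7.867             # peak data transfer rate
peak_concurrent = 25670       # peak concurrent connections

take_rate = unique_users / attendance               # ~0.59, the 59 percent cited above
per_user_mbps = peak_gbps * 1000 / peak_concurrent  # ~0.31 Mbps per connection at the peak
```

Even at the 7.867 Gbps peak, the math works out to only about 0.3 Mbps per concurrent connection, a reminder that stadium Wi-Fi traffic tends to be bursty rather than sustained per user.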

The final total of 16.31 TB easily puts Super Bowl 52 ahead of the last two Super Bowls when it comes to Wi-Fi data use. Last year at NRG Stadium in Houston, there was 11.8 TB of Wi-Fi use recorded, and at Super Bowl 50 in 2016 there was 10.1 TB of Wi-Fi data used at Levi’s Stadium in Santa Clara, Calif. So in reverse chronological order, the last three Super Bowls are the top three Wi-Fi events, indicating that data demand growth at the NFL’s biggest game shows no sign of slowing down. Combined with the 50.2 TB of cellular data used in and around the stadium on game day, Super Bowl 52 saw a total of 66.51 TB of wireless traffic Sunday in Minneapolis.
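The combined-traffic arithmetic above is easy to verify (our own check, using the totals as reported):

```python
# Checking the combined Super Bowl 52 wireless totals cited above.
wifi_tb = 16.31        # Wi-Fi total from Extreme Networks
cellular_tb = 50.2     # combined AT&T, Verizon and Sprint cellular total
combined_tb = round(wifi_tb + cellular_tb, 2)   # 66.51 TB, as reported
growth_vs_sb51 = wifi_tb / 11.8                 # ~1.38x last year's Wi-Fi total in Houston
```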

Confetti fills the air inside U.S. Bank Stadium after the Philadelphia Eagles defeated the New England Patriots in Super Bowl LII. Credit: U.S. Bank Stadium

Super Bowl 52 represented something of a leap of faith, since the handrail-enclosure Wi-Fi design had never seen a stress test like the NFL’s biggest event. Now looking ahead to hosting the 2019 NCAA Men’s Basketball Final Four, David Kingsbury, director of IT for U.S. Bank Stadium, can be forgiven for wanting to take a bit of a victory lap before we set our Wi-Fi sights on Atlanta’s Mercedes-Benz Stadium, home of Super Bowl 53.

“AmpThink, CenturyLink and Cisco designed and built a world-class wireless system for U.S. Bank Stadium that handled record-setting traffic for Super Bowl LII,” Kingsbury said. “AmpThink president Bill Anderson and his team of amazing engineers were a pleasure to work with and the experts at Cisco Sports and Entertainment supported us throughout the multi-year planning process required for an event of this magnitude. High-density wireless networking is such a challenging issue to manage, but I am very happy with our results and wish the team in Atlanta the best next year. The bar has been raised.”

THE LATEST TOP 10 FOR WI-FI

1. Super Bowl 52, U.S. Bank Stadium, Minneapolis, Minn., Feb. 4, 2018: Wi-Fi: 16.31 TB
2. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
3. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
4. Minnesota Vikings vs. Philadelphia Eagles, NFC Championship Game, Lincoln Financial Field, Philadelphia, Pa., Jan. 21, 2018: Wi-Fi: 8.76 TB
5. Kansas City Chiefs vs. New England Patriots, Gillette Stadium, Foxborough, Mass., Sept. 7, 2017: Wi-Fi: 8.08 TB
6. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
7. Southern California vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Oct. 21, 2017: Wi-Fi: 7.0 TB
8. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
9. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
10. Georgia vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Sept. 9, 2017: Wi-Fi: 6.2 TB

U.S. Bank Stadium in Minneapolis before the start of Super Bowl LII

Update: AT&T, Verizon and Sprint see a combined 50.2 TB of cellular traffic for Super Bowl 52

Some of the JMA TEKO gear used in the DAS at U.S. Bank Stadium. Credit: Paul Kapustka, MSR

Before, during and after the Philadelphia Eagles’ thrilling 41-33 victory over the New England Patriots in Super Bowl 52, AT&T, Verizon Wireless and Sprint said they saw a combined 50.2 terabytes of cellular traffic Sunday in and around U.S. Bank Stadium in Minneapolis.

Though some of the totals represent different coverage areas, they roughly correspond to the metrics used by the same carriers at last year’s Super Bowl 51 in Houston, where a combined total of 25.8 TB of cellular traffic was reported. Like last year, T-Mobile representatives said they will not report data use from the Super Bowl, even though the carrier’s executives tweeted Sunday night about strong network performance and significant data-use growth over last year’s big game, without mentioning totals for either.

Without agreed-upon standards for such reporting, comparing one year’s results to the next is not an exact science, since numerous variables exist, like the density of fixed and portable networks and the location of the stadiums (Minneapolis’ U.S. Bank Stadium, for example, sits in the middle of a downtown, while NRG Stadium, home of Super Bowl 51 in Houston, does not). Still, since carriers typically use the same reporting metrics year after year, it’s possible to see a continued increase in data use, a sign that demand for mobile connectivity at sporting events continues to grow.

Social media, video and audio rule the day

Curiously, AT&T saw a decrease this year in the amount of traffic it measured directly inside and immediately outside the venue: according to AT&T, it saw 7.2 TB of traffic Sunday on the in-stadium DAS as well as on its mobile cell sites and macro sites just outside U.S. Bank Stadium. In 2017, AT&T said it saw 9.8 TB of traffic in similar locations around NRG Stadium in Houston.

But in extending its reporting to a 2-mile radius around U.S. Bank Stadium — the same base metric used by Verizon — AT&T said it saw 21.7 TB of traffic Sunday. Verizon, which reported 11 TB of traffic last year in Houston, said it saw 18.8 TB of cellular data used on its networks inside the 2-mile perimeter around U.S. Bank Stadium Sunday. Verizon did not report a figure for its infrastructure inside and adjacent to the stadium. The main cellular infrastructure inside U.S. Bank Stadium, a neutral host DAS, was built and is run by Verizon.

Sprint, which reports traffic each year from networks inside and directly adjacent to the stadiums, said it saw 9.7 TB of traffic on its networks Sunday, up from 5 TB in 2017.
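Taken together, the three carriers’ reported figures add up to the combined total cited above; here is a quick check (bearing in mind that the coverage areas behind each number differ, so this is a sum of the figures as reported):

```python
# Summing the per-carrier Super Bowl 52 totals as reported. Note that the
# coverage areas differ: AT&T and Verizon used a 2-mile radius around the
# stadium, while Sprint counted only networks inside and adjacent to it.
carrier_tb = {"AT&T": 21.7, "Verizon": 18.8, "Sprint": 9.7}
total_tb = round(sum(carrier_tb.values()), 1)   # 50.2 TB combined
```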

According to some quick facts emailed to us by Verizon reps, top uses by Verizon customers were led (in order) by web browsing, streaming video, and social media and sports app usage. According to Verizon, the top three social media apps used by its customers were Snapchat, Facebook and Instagram, “with Snapchat moving from third at last year’s Super Bowl to first most used.”

Also according to Verizon, the largest spikes in traffic came from “social media video sharing” during the halftime performance, followed by the reaction to the Patriots’ fumble late in the game, and then kickoff, when Verizon customers were streaming video and browsing the web. Verizon also said its network was used by 57 percent of the fans at U.S. Bank Stadium, which may explain why the carrier spent a lot of time and money upgrading the network before Sunday’s event.

We have also heard that Wi-Fi usage broke previous records, but we do not yet have an official number to report.

A final note: Thanks to all the carrier representatives for their figures, and to all our Twitter followers for input and advice on how best to present these important metrics. We’ll keep working to make this process as good as it can be, so let us know what you think!