Wrigley Field gets new DAS in time for Cubs’ home opener

The Chicago Cubs’ Wrigley Field will have a new DAS working for opening day. Credit for these 2017 season pictures: Paul Kapustka, MSR (click on any photo for a larger image)

After some construction delays that no Chicago Cubs fans minded, the Friendly Confines of Wrigley Field will have a new high-performance distributed antenna system (DAS) operational for Monday’s scheduled Cubs home opener for the 2018 season.

Designed and deployed by DAS Group Professionals, the new in-stadium cellular network was originally scheduled to be ready last year, but when the Cubs took their historic march to the World Series title in 2016, many of the in-progress construction plans for Wrigley Field were delayed or rearranged, to the objection of nobody at all who cheers for the north siders.

And even though some of the most ambitious parts of the Wrigley renovation took place this winter — including removing most of the seats and concrete in the lower seating bowl to clear the way for some lower-level club spaces — the DGP crew along with the Cubs’ IT organization delivered the new cell network in time for the first pitches scheduled Monday afternoon.

Wi-Fi coming in as season goes on

“We definitely put scheduling and timing to the test, but we got it done,” said Andrew McIntyre, vice president of technology for the Chicago Cubs, in a phone interview. First announced back in 2015, the networking plan for the Wrigley renovations — which covers the plaza outside the stadium, the new team office building and the across-the-street Hotel Zachary that also just opened for business — also includes a new Wi-Fi network using gear from Extreme Networks. Since the Wi-Fi deployment is more entangled with the remaining construction than the DAS, it will be introduced gradually over the next few months, McIntyre said.

“By the All-Star break, we should have both systems online,” McIntyre said.

The DAS deployed by DGP uses JMA equipment, just like DGP’s other big-stadium deployments at the San Francisco 49ers’ Levi’s Stadium and the Sacramento Kings’ Golden 1 Center. Steve Dutto, president of DGP, acknowledged the challenge of the Wrigley buildout, including one instance where DGP technicians needed to set up scaffolding to mount antennas but couldn’t, because where a concrete floor should have been there was a 60-foot hole in the ground.

Hey hey!

“We worked around all that and got it done,” said Dutto. According to Dutto, DGP has signed up all four major U.S. wireless carriers for the DAS, with all except Sprint operational for opening day. The head-end building for the DAS, he said, is located in what he thinks is a former hot-dog stand half a block from the park. (If you’re looking for a snack in the head-end room, just remember: in Chicago there’s no ketchup on hot dogs.)

Dutto said the DAS antennas are all overhead mounts, not a problem in Wrigley since the overhangs offer plenty of mounting spaces. However, given the historic look and feel of the park, Dutto did say that “we definitely had to tuck things away better and make sure we had good paint matches.” Not a Chicago native, Dutto said that the charm of the stadium hit him on first view.

“When we pulled up for the first time,” he said, “it was… wow. There’s nothing like it.”

Under seat for Wi-Fi will take time to deploy

The Cubs’ McIntyre, who admits to guzzling coffee by the quart these days, said the field-level renovations — which included removing all lower seats and the foundational concrete to clear out room for field-level club spaces — made finishing the Wi-Fi deployment something that couldn’t be rushed. With no overhangs covering the premium box seat areas, Wi-Fi APs there will need to be mounted under seats, something that just couldn’t get finished by Monday.

“It’s less of a technical challenge and more of a structural engineering challenge,” said McIntyre of the under-seat deployment method, which usually involves a lot of drilling through concrete and mounting APs in weather-sealed enclosures. McIntyre said the Cubs and Extreme also plan to use under-seat deployments in Wrigley’s famous outfield bleachers, which also lack any overhead infrastructure. In what he termed a “slow roll,” McIntyre said parts of the Wi-Fi network will come online gradually as the season progresses, starting first with the spaces outside the stadium.

Bringing backbone power to the new network is partner Comcast Business, which just announced a sponsorship deal with the Cubs that will see an “XfinityWiFi@Wrigley” label on the Wrigley Wi-Fi SSID. According to McIntyre, Comcast will bring in twin 10-Gbps pipes to power the Wrigley Wi-Fi network.

This panoramic view shows why the lower-level seats will need under-seat APs for Wi-Fi

Average per-fan Wi-Fi use total jumps again at Super Bowl 52

Seen in the main concourse at U.S. Bank Stadium: Two IPTV screens, one Wi-Fi AP and a DAS antenna. Credit: Paul Kapustka, MSR

After a year in which the average amount of Wi-Fi data used per connected fan at the Super Bowl dropped, the trend of more data used per fan reversed itself again at Super Bowl 52, reaching a new peak of 407.4 megabytes per user.

Even though the number of unique connections to the Wi-Fi network at U.S. Bank Stadium for Super Bowl 52 increased to a record 40,033 users (according to the official statistics compiled by Extreme Networks), the jump from 11.8 terabytes of Wi-Fi data used at Super Bowl 51 to 16.31 TB at Super Bowl 52 pushed the per-user average to a new high, surpassing both the 333 MB per user from Super Bowl 51 and the 370 MB per user seen at Super Bowl 50.
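For readers who want to check the arithmetic, the per-user figure falls out directly from the totals above, assuming the decimal units (1 TB = 1,000,000 MB) these reports typically use; a quick Python sketch:

```python
# Per-user Wi-Fi average from the Super Bowl 52 totals, assuming
# decimal units (1 TB = 1,000,000 MB) as typically used in these reports.
def mb_per_user(total_tb: float, users: int) -> float:
    """Convert a terabyte total into megabytes per connected user."""
    return total_tb * 1_000_000 / users

sb52 = mb_per_user(16.31, 40_033)  # Super Bowl 52 at U.S. Bank Stadium
print(round(sb52, 1))              # 407.4
```

The same one-liner applied to any event’s totals gives a comparable per-user number.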

While this statistic has never been called out in the Extreme Networks Super Bowl compilations, we here at MSR think it is a vital mark, since it shows that even as more users join the network, those connected users are still using more data. That means IT departments at venues everywhere should probably still plan for no letup in the continued growth of demand for bandwidth at large-venue events, especially at “bucket list” events like the Super Bowl.

Last year we guessed the drop in per-user totals from Super Bowl 50 to Super Bowl 51 might have been due to a larger number of autoconnected users, but we never got an answer from the Extreme Networks team when we asked that question. At U.S. Bank Stadium there was also an autoconnect feature to the Wi-Fi for Verizon Wireless customers, but it didn’t seem to affect the per-user total mark.

JMA touts virtualized RAN for DAS with new XRAN platform

The marketplace for in-building distributed antenna system (DAS) deployments got an interesting jolt Monday with JMA Wireless’s announcement of its new XRAN software platform, which promises to bring the flexibility and cost savings of hardware virtualization to the world of Radio Access Network (RAN) equipment.

In a nutshell, the idea behind JMA’s XRAN is to use software and off-the-shelf Intel-based servers to replace the dedicated racks of equipment traditionally used to carry signals from cellular carrier lines to antenna infrastructure in a DAS. In addition to potentially large savings in equipment, cooling and power costs, and sheer space, XRAN also promises to allow cloud-based sharing and administration of systems, which could let multiple buildings or a campus share an integrated system for flexible capacity control.

A stadium with XRAN, in an example provided by JMA, could theoretically share its DAS deployment infrastructure with nearby office buildings, allowing for better use of resources. Though XRAN is not yet deployed anywhere commercially, JMA also announced Monday that the software is being used by Telecom Italia Mobile in a “live dense urban network application.” The announcements were officially made at the Mobile World Congress show in Barcelona.

Looking to cut costs for enterprise wireless deployments

The XRAN announcement may be of most immediate interest in the stadium wireless marketplace to third-party network operators, who typically build a DAS network for a stadium and rent space on it back to carriers. That model, employed by companies including Boingo, Mobilitie, ExteNet and 5 Bars, has come under pressure lately as carriers have voiced displeasure over having to pay what they sometimes consider exorbitant prices for access. If costs for DAS deployments and operations could be cut, third-party operators might be able to offer more attractive rates to ensure carrier participation.

To be sure, virtualized RAN operations (also sometimes known as “C-RAN,” for cloud-based RAN) have been the focus of many companies inside the telecom services space, thanks to the same cost-saving and feature-flexibility promises made possible by switching from dedicated hardware to commodity platforms. In the press literature accompanying its announcement, JMA notes that while some “partially virtualized” RAN architecture equipment exists, it claims the XRAN platform is the first fully virtual RAN, software “that can process the full protocol stack” from Layer 1 through Layer 3.

If the cost savings and functional flexibility of RAN virtualization follow the curves seen when virtualization swept the application world, XRAN or any similar platforms that may follow could also hold interest for commercial real estate owners and operators. Most industry estimates show that many large commercial buildings like office towers currently lack a comprehensive indoor wireless coverage solution; by eliminating a big chunk of the cost of a DAS — or by allowing campuses or multiple buildings to share the costs — platforms like XRAN could make a DAS a more attractive option.

“Cost, simplicity, footprint, power, and cooling changes dramatically with XRAN,” said Todd Landry, corporate vice president of product and market strategy at JMA Wireless, in a prepared statement. “XRAN is designed from its inception to close the gap between rapidly growing in-building mobile connectivity demands and today’s complex, proprietary hardware solutions unable to evolve and adapt for multi-operator services.”

More as we hear more from what is sure to be a talked-about subject in the big-building wireless world!

Minneapolis airport sees 6 TB of Wi-Fi traffic day after Super Bowl

Super Bowl signs hang in the concourse at Minneapolis-St. Paul airport. Credit: MAC

A day after Super Bowl 52 at U.S. Bank Stadium in Minneapolis set new records for wireless data consumption, the Minneapolis-St. Paul International airport had a big wireless day of its own, with 6 terabytes of traffic used on the airport’s Wi-Fi network and another 6.5 TB on the Verizon cellular network.

Eduardo Valencia, vice president and chief information officer for the Metropolitan Airports Commission, said the Wi-Fi data used on Feb. 5 was “close to double typical data consumption” on the free-access network provided by Boingo Wireless, even though the airport saw a fairly normal range of users connecting.

“There was no spike in [the number] of users, but the users who did connect consumed twice as much data, with downloads about 3 times normal,” Valencia said. The Monday-departure crowd, he said, saw about 31,000 unique users connect to the Wi-Fi network, which Valencia said “is at the top of the normal user range” the airport network usually sees. Valencia said that during the week leading up to the big game on Feb. 4, the airport Wi-Fi saw between 23,000 and 31,000 daily connections.

Boingo, which has been powering the Wi-Fi at Minneapolis-St. Paul International Airport (aka MSP) since 2012, updated and expanded coverage a year ago, according to Valencia. Though Boingo would not provide details on how many new APs were added or how many the network has now, Valencia said coverage was increased in many areas, like the tunnels between terminals, to make sure visitors didn’t lose connectivity.

New neutral host DAS from Verizon

Super Bowl LII signage along a moving walkway at MSP. Credit: MAC

The cellular infrastructure at the airport also got an upgrade before the Super Bowl, with a neutral-host distributed antenna system (DAS) deployed by Verizon Wireless. The DAS, which uses Corning ONE fiber equipment on the back end, provided coverage for all the top wireless carriers, Valencia said. Though the timing was tight — the final pieces went live on Jan. 19 — the expanded DAS, which added antennas throughout the terminals as well as outside covering runways, also performed well, according to Valencia.

Though only Verizon stats were available, Valencia said Verizon saw an average of 2.8 TB of data per day in an 11-day span around the Super Bowl, with 6.5 TB of traffic seen on Monday, Feb. 5. Like the Wi-Fi traffic, Valencia said Verizon’s day-after total was about double the average daily consumption.
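Taking Valencia’s numbers at face value (and assuming the usual decimal units, 1 TB = 1,000,000 MB), a quick back-of-envelope check supports the “about double” characterization:

```python
# Back-of-envelope check on MSP's day-after-Super Bowl numbers,
# assuming decimal units (1 TB = 1,000,000 MB).
wifi_tb, wifi_users = 6.0, 31_000           # Wi-Fi total and unique users, Feb. 5
verizon_peak_tb, verizon_avg_tb = 6.5, 2.8  # Verizon: Feb. 5 peak vs. 11-day daily average

mb_per_wifi_user = wifi_tb * 1_000_000 / wifi_users
spike_factor = verizon_peak_tb / verizon_avg_tb

print(round(mb_per_wifi_user))  # 194  -- MB per Wi-Fi user
print(round(spike_factor, 2))   # 2.32 -- roughly double the daily average
```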

While there is extra pressure to perform ahead of the NFL’s big game — “The NFL told us the Super Bowl experience begins and ends at the airport,” Valencia said — the payoff will stay for years, as all the new network gear added in advance is permanent.

“We swallowed hard for 9 days, but the success was the culmination of a lot of planning,” Valencia said. “Now the good thing is, everything [in the network] is here to stay.”

Connectivity at the core of Little Caesars Arena, District Detroit

Little Caesars Arena, the new home for the Detroit Red Wings and the Detroit Pistons. Credit: Olympia Entertainment

Bringing great wireless connectivity to a new stadium is almost table stakes these days. But building up a nearby commercial district — and keeping connectivity high outside the venue’s walls — is a bet of another order entirely, especially in Detroit, where networks extend outside the new Little Caesars Arena into the 50-block District Detroit.

Following the arena’s opening in September 2017, the early prognosis is so far, so good, with solid reports of high network performance on both Wi-Fi and cellular networks in and around the new home of the NHL’s Detroit Red Wings and the NBA’s Detroit Pistons. But for John King, vice president of IT and innovation for venue owner Olympia Entertainment, the responsibilities for him and his network team extend far beyond the new stadium’s walls.

“We’re focused on the [wireless] signal not just in the bowl, but also in the surrounding elements — the streets, the outdoor arenas, and the Little Caesars Arena garage,” said King in an interview shortly after the arena opened. “The vision is to be connected wherever you are. And to share that experience.”

An ambitious revival in downtown Detroit

Editor’s note: This profile is from our most recent STADIUM TECH REPORT for Winter 2018, which is available for FREE DOWNLOAD from our site. This issue has an in-depth look at the wireless networks at U.S. Bank Stadium in Minneapolis, as well as profiles of network deployments at the Las Vegas Convention Center and Orlando City Stadium! DOWNLOAD YOUR FREE COPY today!

The inside concourse at Little Caesars Arena. Credit: Olympia Entertainment

Built near the Detroit Lions’ Ford Field and the Tigers’ Comerica Park, the new hoops/hockey stadium seats 19,515 for hockey and 20,491 for basketball. Unlike many stadiums of the past, which rise up from the ground, Little Caesars Arena is built into the ground, 40 feet below street level. The innovations in construction and accessibility, including an outdoor arena adjacent to the indoor one, may require another full profile and an in-person visit.

For now, we’ll concentrate on the wireless deployment in and around Little Caesars Arena, which was funded in part by a sponsorship from Comcast Business, which provides backbone bandwidth to the arena and the district in the form of two 100-Gbps connections. The Wi-Fi network design and deployment, done by AmpThink, uses Cisco Wi-Fi gear; Cisco’s Vision for Sports and Entertainment (formerly known as StadiumVision) is used to synchronize video output to the 1,500 TV screens located in and around the venue.

On the cellular side, Verizon Wireless built a neutral-host DAS, which was getting ready to welcome AT&T as the second carrier on board shortly after the opening. According to King, the Wi-Fi network has approximately 1,100 total APs both inside and outside the arena, many of them from Cisco’s 3802 series, each of which has two radios. For many of the 300 APs located in the main seating bowl, Little Caesars Arena went with an under-seat deployment, with some others placed in handrail enclosures, especially for the basketball floor-seating areas.

“AmpThink did a really nice job with the deployment,” said King, who noted that the arena’s open-air suite spaces helped provide “lots of flow” for wireless signals, without the traditional overhangs around to block coverage on different levels. One early visitor to the arena saw many Wi-Fi speed tests in the 50-60 Mbps range for both download and upload, as well as several in the 80-to-100 Mbps range, signs that a strong signal was available right at the start.

“We’ve still got a lot of tuning, but early on we’re getting great results,” said King of the Wi-Fi performance. “Our goal is to make it the best it can be.”

Staying connected outside the walls

Like The Battery area surrounding the Atlanta Braves’ new SunTrust Park, the District Detroit is meant to be a stay-and-play kind of space, with restaurants, clubs, office spaces and residences seeking to lure visitors and residents to do more than just see a game. For King and his team, one of their tasks is to ensure that visitors can stay connected no matter where they are inside the district, including inside restaurants, offices and other indoor spaces.

Connectivity blends well with the architecture inside Little Caesars Arena. Credit: Tod Caflisch, special to MSR

“We want the [network] signal to be robust, to carry into outdoor spaces, restaurants and many other areas” inside the District Detroit, King said. “We want to push the envelope a little bit and create a useful opportunity.”

Back inside Little Caesars Arena, the team and stadium apps are built by Venuetize, which built a similar integrated app for the Buffalo Bills and the Buffalo Sabres, one that also extends outside arenas to support connectivity in city areas. King said that Little Caesars Arena will be testing pre-order and express pickup concession ordering through the app, with a focus on seating areas that don’t have ready access to some of the club facilities.

Like any other new facility, Little Caesars Arena will no doubt go through some growing pains in its debut season, but for King and others who spent time getting the venue ready it’s fun to have the doors open.

“It’s really great seeing it all come to life,” King said.

Fans use 16.31 TB of Wi-Fi data during Super Bowl 52 at U.S. Bank Stadium

A Wi-Fi handrail enclosure at U.S. Bank Stadium in Minneapolis. Credit: Paul Kapustka, MSR

It is now official — we have a new record for most Wi-Fi data used at a single-day event, as fans at U.S. Bank Stadium in Minneapolis for Super Bowl 52 used 16.31 terabytes of data on the Wi-Fi network.

According to statistics compiled by Extreme Networks during the Philadelphia Eagles’ thrilling 41-33 victory over the New England Patriots Sunday night, the AmpThink-designed network, which uses Cisco Wi-Fi gear, also saw 40,033 unique users — 59 percent of the 67,612 in attendance — the highest take rate for any single-game network we’ve been told about. (The Dallas Cowboys saw approximately 46,700 unique Wi-Fi users during a playoff game last season, about 50 percent of attendance at AT&T Stadium.)

The network also saw peak concurrent connections of 25,670 users, and a peak data-transfer rate of 7.867 Gbps, according to the numbers released by Extreme. Though Extreme gear was not used in the operation of the network, Extreme has a partnership deal with the NFL under which it provides the “official” network analytics reports from the Super Bowl.

The final total of 16.31 TB easily puts Super Bowl 52 ahead of the last two Super Bowls when it comes to Wi-Fi data use. Last year at NRG Stadium in Houston, there was 11.8 TB of Wi-Fi use recorded, and at Super Bowl 50 in 2016 there was 10.1 TB of Wi-Fi data used at Levi’s Stadium in Santa Clara, Calif. So in reverse chronological order, the last three Super Bowls are the top three Wi-Fi events, indicating that data demand growth at the NFL’s biggest game shows no sign of slowing down. Combined with the 50.2 TB of cellular data used in and around the stadium on game day, Super Bowl 52 saw a total of 66.51 TB of wireless traffic Sunday in Minneapolis.
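The combined total and the take rate quoted in this story both check out against the raw numbers; a quick verification sketch:

```python
# Sanity-check the Super Bowl 52 figures reported by Extreme Networks.
wifi_tb, cellular_tb = 16.31, 50.2       # Wi-Fi and cellular totals, game day
unique_users, attendance = 40_033, 67_612

combined_tb = wifi_tb + cellular_tb
take_rate_pct = unique_users / attendance * 100

print(round(combined_tb, 2))  # 66.51 -- total wireless TB
print(round(take_rate_pct))   # 59    -- percent of attendance on Wi-Fi
```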

Confetti fills the air inside U.S. Bank Stadium after the Philadelphia Eagles defeated the New England Patriots in Super Bowl LII. Credit: U.S. Bank Stadium

Super Bowl 52 represented perhaps a leap of faith, in that the handrail-enclosure Wi-Fi design had not yet seen a stress test like that found at the NFL’s biggest event. Now looking ahead to hosting the 2019 Men’s NCAA Basketball Final Four, David Kingsbury, director of IT for U.S. Bank Stadium, can be forgiven for wanting to take a bit of a victory lap before we set our Wi-Fi sights on Atlanta’s Mercedes-Benz Stadium, home of Super Bowl 53.

“AmpThink, CenturyLink and Cisco designed and built a world-class wireless system for U.S. Bank Stadium that handled record-setting traffic for Super Bowl LII,” Kingsbury said. “AmpThink president Bill Anderson and his team of amazing engineers were a pleasure to work with and the experts at Cisco Sports and Entertainment supported us throughout the multi-year planning process required for an event of this magnitude. High-density wireless networking is such a challenging issue to manage, but I am very happy with our results and wish the team in Atlanta the best next year. The bar has been raised.”

THE LATEST TOP 10 FOR WI-FI

1. Super Bowl 52, U.S. Bank Stadium, Minneapolis, Minn., Feb. 4, 2018: Wi-Fi: 16.31 TB
2. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
3. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
4. Minnesota Vikings vs. Philadelphia Eagles, NFC Championship Game, Lincoln Financial Field, Philadelphia, Pa., Jan. 21, 2018: Wi-Fi: 8.76 TB
5. Kansas City Chiefs vs. New England Patriots, Gillette Stadium, Foxborough, Mass., Sept. 7, 2017: Wi-Fi: 8.08 TB
6. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
7. Southern California vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Oct. 21, 2017: Wi-Fi: 7.0 TB
8. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
9. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
10. Georgia vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Sept. 9, 2017: Wi-Fi: 6.2 TB

U.S. Bank Stadium in Minneapolis before the start of Super Bowl LII