Nuggets game visit shows Wi-Fi solid at Denver’s Pepsi Center

Nuggets vs. Oklahoma City Thunder at Denver’s Pepsi Center, April 9, 2017. Credit all photos: Paul Kapustka, MSR (click on any photo for a larger image)

About one year into its existence, the fan-facing Wi-Fi network at Denver’s Pepsi Center seems to be in fine working order, at least as far as we could tell from a visit during the last Denver Nuggets home game of the just-finished NBA regular season.

With speed tests showing download speeds of almost 70 Mbps in one spot on the concourse and solid high-teens numbers in upper-deck seats, the Avaya-built public Wi-Fi network allowed us to stay connected at all times. We even streamed live video of The Masters golf tournament while watching Oklahoma City beat Denver in a heartbreaking end to the Nuggets’ home season, as Thunder star Russell Westbrook capped a 50-point performance with a long 3-pointer that won the game and eliminated Denver from playoff contention.

While we got good speed tests last summer when we toured an empty Pepsi Center, we had no idea how the network would perform under live, full-house conditions. The Nuggets’ home closer gave us some proof points that the Wi-Fi is working fine: one test on the concourse (in full view of some overhead APs) checked in at 69.47 Mbps download and 60.96 Mbps upload, while another concourse test on the upper deck came in at 37.18 / 38.30.

A look from our seats into the rafters, where (we think) we see Wi-Fi APs

In our MSR-budget upper-deck seats (we did not request media access to the game but instead bought tickets like any other fan) we still got solid Wi-Fi numbers, with one test at 15.04 / 21.44 Mbps and another in the same spot at 17.40 / 16.27. We didn’t see any APs under the seats — according to the Pepsi Center IT staff, some of the bowl seats are served by APs shooting up through the concrete (see picture for one possible such location). Looking up, we did see some APs hanging from the roof rafters, so perhaps it’s a bit of both.

What’s unclear going forward is who will supply the network for any upgrades, since Avaya is in the process of selling its networking business to Extreme Networks, which has its own Wi-Fi gear and a big stadium network business. For now, it seems like attendees at Nuggets, Avalanche and other Pepsi Center events are covered when it comes to connectivity. Better defense against Westbrook, however, will have to wait until next season.

Upper level concourse APs at Pepsi Center; are these shooting up through the concrete?

Even at the 300 seating level, you have a good view of the court.

Taking the RTD express bus from Boulder is a convenient if crowded option (there was also a Rockies game that day at nearby Coors Field, making the bus trips SRO in both directions)

Who knew Pepsi was found inside mountains? (this photo taken last summer)

AT&T beefs up ski resort reception with stealthy DAS

AT&T DAS antenna stand (right) near the American Eagle lift at Copper Mountain. Credit all photos: Paul Kapustka, MSR (click on any photo for a larger image)

To improve cellular reception at the Copper Mountain ski area, AT&T this winter installed a stealthy seven-antenna DAS in several base-area locations, including inside ski-lodge buildings and in a rooftop cupola.

According to Quin Gelfand, a senior real estate and construction manager for AT&T’s Antenna Solutions Group, the mountain had previously been served by just a single macro tower up near the slopes of the popular Colorado resort, which sits just off Interstate 70 between Frisco and Vail.

On heavy skier-visit days, Gelfand said, the macro tower had begun to raise “capacity concerns,” leading AT&T to design a DAS solution for the several base areas at Copper Mountain. Beyond simply being saturated by demand, Gelfand said, the single macro tower often didn’t provide strong signals inside buildings at the base areas.

“In a lot of areas around the resort, there were low bars for LTE,” Gelfand said.

AT&T’s Quin Gelfand shows off the main DAS head-end gear rack.

But on Feb. 23 this year, that situation changed for AT&T cellular customers, as the DAS went live and immediately started carrying lots of cellular traffic. By the time of our visit in early April, Gelfand said, the DAS installation (which has the capacity equivalent of a single large macro tower) had already moved more than 7 terabytes of data, averaging about 175 GB per day. As at many Colorado ski areas, March is a busy month at Copper, with lots of spring-break skiers and locals driving up on weekends from Denver.
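
As a quick sanity check on those figures, the daily average works out as reported; here is a minimal sketch of our own back-of-the-envelope math, with the exact visit date assumed:

```python
from datetime import date

# Our back-of-the-envelope check on the reported DAS traffic; the exact
# early-April visit date is an assumption on our part.
launch = date(2017, 2, 23)   # day the DAS went live
visit = date(2017, 4, 4)     # "early April" visit (assumed date)
days_live = (visit - launch).days

total_gb = 7_000             # "more than 7 terabytes", in decimal GB

print(f"Days in service: {days_live}")                     # 40
print(f"Average per day: {total_gb / days_live:.0f} GB")   # 175
```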

Hiding antennas in a cupola

Brad Grohusky, senior IT manager for Copper Mountain, said AT&T approached the resort a couple of years ago to discuss the idea of a DAS. “When we had a dense population of guests, it was pretty easy to saturate a signal,” Grohusky said.

On weekends, Grohusky said, Copper could often see 10,000 guests, with popular days and holidays pushing the count as high as 14,000. Wireless communications, he said, could come under even more stress if the weather turned nasty or cold, driving more people inside buildings.

DAS antenna (top left) in Copper Station lodge

Starting from an existing telecom service room in an underground garage, AT&T ran fiber this past offseason to three different antenna locations. The closest and most obvious is a three-antenna stand near the “Burning Stones” gathering area and the base of the American Eagle chairlift. As one of the resort’s main first chairs, the American Eagle often has crowds at its base, and Burning Stones is a small clearing between the slopes and the base-area buildings that is often used for concerts and other public gatherings.

“There was lots of digging last summer,” said Grohusky of the fiber-trenching effort, which gained some extra time thanks to a warmer-than-usual fall that kept the snow at bay. “We took advantage of that extra week,” Grohusky said.

If the American Eagle-area antennas are in plain sight, the two antennas at the Union Creek Schoolhouse base area to the west would be impossible to find if you didn’t know where they were: AT&T built custom-designed baffling for a rooftop cupola that completely hides the antennas while still letting cellular signals pass through.

“You would never know the antennas were up there,” Grohusky said. “AT&T really accommodated our architecture there.”

Closer look at DAS tower near American Eagle lift

Back farther to the east, two more antennas sit in the top windows of the Copper Station lodge building, pointed outward to cover the lift base areas and the condos and other buildings nearby. According to Gelfand, AT&T used Nokia RAN gear as well as Corning fiber equipment, CommScope cabling components and antennas from JMA Wireless in the deployment. The DAS is fed by a 100 Mbps fiber link from CenturyLink and supports three cellular bands: 700 MHz, AWS and PCS.
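
For spectrum-minded readers, here is a small reference sketch mapping those three band names to their typical U.S. LTE allocations; these are general industry figures, not configuration details provided by AT&T for this deployment.

```python
# Typical U.S. allocations for the three bands Gelfand cited. General
# reference values only, not configuration details from AT&T.
DAS_BANDS = {
    "700 MHz": {"lte_band": "17 (lower 700)",
                "uplink_mhz": (704, 716), "downlink_mhz": (734, 746)},
    "AWS":     {"lte_band": "4",
                "uplink_mhz": (1710, 1755), "downlink_mhz": (2110, 2155)},
    "PCS":     {"lte_band": "2",
                "uplink_mhz": (1850, 1910), "downlink_mhz": (1930, 1990)},
}

for name, details in DAS_BANDS.items():
    print(f"{name}: {details}")
```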

Even though ski season is all but over, the network will still get use in the non-snowy months, as Copper Mountain, like many Colorado resorts, has an active summer schedule of on-mountain activities. The resort also has a limited free public Wi-Fi network in certain base-area buildings, including in and around the Starbucks right next to the Burning Stones area. Grohusky said there are no current plans to expand the Wi-Fi, and that none of the other major cellular carriers are planning DAS deployments of their own.

But for AT&T customers, Grohusky said connectivity is vastly improved. “The feedback has been great,” he said. “Connectivity used to be poor inside buildings, but now it’s great.”

Looking back toward the Burning Stones gathering area, near the American Eagle lift

Union Creek Schoolhouse building — cupola with AT&T antennas is the one closest to ski hill

JMA Wireless antenna mounted high up inside Copper Station lodge

CommScope gear inside the Copper Station node equipment room

Corning optical gear inside the Copper Station node equipment room

Copper Station lodge building (with DAS antennas) on far right, showing proximity to eastern base area

Stadium Tech Report: Sharks bite into digital future with new Wi-Fi, app strategy for SAP Center

Welcome to the Shark Tank. Credit: Paul Kapustka, MSR (click on any photo for a larger image)

Here’s a dirty secret about Silicon Valley sports: Even in this birthplace of the digital era, the hometown hockey arena has historically had some of the worst mobile connectivity around.

Despite the fact that loyal hockey fans faithfully filled the seats to support their San Jose Sharks, the building now known as the SAP Center somehow never had the kind of wireless network you’d think its tech-savvy locals would demand.

But that was then. This is now.

After years of poor connectivity, the “Shark Tank” is now filled with speedy, high-density Wi-Fi that forms the base of a new digital-experience strategy for the Sharks and the SAP Center. The new digital experience includes a new team app as well as multiple LED screens in all parts of the stadium, bringing the old building screaming into the forefront of older venues retrofitted with technology that both enhances the fan experience and provides new business opportunities.

“If sports is behind the world in technology, we were even behind in sports,” said John Tortora, chief operating officer for the San Jose Sharks, about the building’s historical shortcomings. Interviewed between periods during a late-January visit by Mobile Sports Report to a Sharks game at SAP Center, Tortora said a sort of perfect storm of desires and needs arrived this past postseason, resulting in an initiative that brought in the arena’s first true fan-facing Wi-Fi network, an expanded LED-screen deployment throughout the arena and a new stadium app. Together, the elements are all aimed at supporting a data-driven strategy to improve marketing efforts while simultaneously providing a huge boost to the fan experience.

And make no mistake about it, better connectivity was an amenity fans wanted most of all.

High-density Wi-Fi provides digital backbone

Handrail enclosures bring Wi-Fi APs close to the fans. Credit: San Jose Sharks

Editor’s note: this profile is from our most recent STADIUM TECH REPORT, which also has in-depth looks at new networks at the Utah Jazz’s Vivint Smart Home Arena, and a recap of wireless activity from Super Bowl LI! DOWNLOAD your FREE COPY today!

“We had a survey of fans from the first half of last season, and the direct response was that the Wi-Fi needed to be improved,” Tortora said. Though there was some Wi-Fi in the building — according to Tortora, a system deployed in 2013 — it wasn’t anything the team wanted to talk about or promote; in fact, multiple requests by MSR to review the stadium’s networking infrastructure went unanswered for years before the new initiative took shape.

According to Tortora, the building also previously had a number of different standalone apps for the various activities that took place there, including separate ones for the Sharks, the seasonal Disney ice shows, youth hockey programs and other SAP Center events like concerts. The push to bring those apps together into a unified strategy led the Sharks to simultaneously seek a partner to help upgrade the mobile network infrastructure. Tortora said the Sharks found that partner in Cisco, which brought Wi-Fi gear, its StadiumVision digital-display system and some creative financing to the table.

“We had a chance to parallel both a new app and a new infrastructure, and Cisco was a great partner,” Tortora said. Though the terms are undisclosed, Cisco is also participating in the financing and operation of the network and marketing elements as a partner to the Sharks.

Under-seat AP enclosure in the lower bowl. Credit: Paul Kapustka, MSR

The Sharks also brought in Wi-Fi deployment expert firm AmpThink to lead the network design, and used Daktronics technology for the multiple new LED boards, many of which are located in the previously blank-walled concourse and club areas. The result is a high-density Wi-Fi network operating at peak speeds, which forms the base for a high-touch digital experience that will ultimately give the Sharks deeper insight into fan behaviors, and a more personal way to deliver the right experience to each fan walking through the doors.

Going low and high to deliver Wi-Fi

Even before you get inside the building, you can connect to the new SAP Center Wi-Fi network, thanks to a bank of APs mounted on the outside walls. Allison Aiello, director of information technologies for the Sharks, said that many fans typically gather just outside the arena pregame, especially in a park just to the east, and with the push toward more digital ticketing, providing pre-entry connectivity was a must.

Once inside the doors, fans are greeted by the innovative “Kezar” scanners from app developer VenueNext, which can scan either paper or digital tickets, with a green light on the top of the cylindrical system showing that a ticket is valid. Connectivity inside the entryways is also superb, as our tests showed Wi-Fi download speeds in the mid-60 Mbps range, even as crowds of fans came in through the doors.

Wi-Fi APs hanging from the rafters. Credit: Paul Kapustka, MSR

A quick lower-level visit to the main data room showed some of the challenges of retrofitting an older building (the arena, which opened in 1993, had been known as the Compaq Center and the HP Pavilion before becoming the SAP Center in 2013): One of the members of our tech-tour entourage bumped his head on a low-hanging pipe, part of the ice-making infrastructure, on the way to the data center doorway.

With more than 20 years of cabling history inside its walls, the SAP Center wiring closets were an ongoing challenge for the implementation crew from AmpThink, which took pride in its work to streamline and organize the wiring as it installed the new network (with some of the cabling in new, Sharks-specific teal coloring). Moving out into the lower seating bowl, AmpThink president Bill Anderson showed off some of the under-seat and railing-mounted AP enclosures, where attention to detail included drilling the concrete cores around the railings below surface level so that shoes, brooms and other items don’t catch on the finished work.

Anderson said the lower-bowl network operates only on 5 GHz Wi-Fi channels, adding San Jose to an industry trend of leaving 2.4 GHz channels off the network in fan-facing areas. The main reasons for the move are the administrative challenges of 2.4 GHz networks and the fact that almost all consumer devices these days support the roomier 5 GHz band. Anderson also had high praise for Cisco’s new 3800 series Wi-Fi APs used in the deployment, which can support dual 5 GHz radios.
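
The spectrum math behind that trend is straightforward. Here is a rough sketch using approximate U.S. channel counts; the figures are our illustration, not AmpThink’s:

```python
# Why high-density venues increasingly run 5 GHz-only: 2.4 GHz offers just
# three non-overlapping 20 MHz channels (1, 6, 11), while the U.S. 5 GHz
# band offers roughly 25 when DFS channels are included. Counts are
# approximate and vary by regulatory domain.
TOTAL_APS = 462  # the Sharks' AP count for SAP Center

for band, channels in [("2.4 GHz", 3), ("5 GHz", 25)]:
    print(f"{band}: {channels} usable channels, "
          f"~{TOTAL_APS / channels:.0f} APs sharing each one")
```

Fewer radios contending for each channel means less co-channel interference, which matters far more than raw coverage in a densely packed bowl.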

According to the Sharks’ Aiello, there are 49 handrail Wi-Fi enclosures in the lower seating bowl, 47 of which hold two APs each. For concerts, she said, the arena can hang additional APs over the sideline hockey boards, which stay in place while the end-zone boards are removed. The total number of APs in the stadium is 462. Our pregame network tests prior to a Sharks-Blackhawks game on Jan. 31 showed Wi-Fi speeds of 63.39 Mbps download and 57.59 Mbps upload halfway down the stairs between sections 114 and 115.
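
A quick tally of those handrail figures, assuming (our guess, not the team’s) that the two remaining enclosures hold a single AP each:

```python
# Tallying the lower-bowl handrail APs from the figures Aiello provided.
# Assumption (ours): the two enclosures without dual APs hold one each.
enclosures = 49
dual_ap = 47
handrail_aps = dual_ap * 2 + (enclosures - dual_ap) * 1   # 96

total_aps = 462
print(f"{handrail_aps} handrail APs, "
      f"{handrail_aps / total_aps:.0%} of the building total")   # 21%
```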

Overhead Wi-Fi for the upper deck

In stadiums where under-seat or handrail APs are used, it’s usually best not to combine those placements with overhead APs, since client devices will often try to connect first to overhead APs, even if they are farther away. But due to a quirk of the SAP Center’s construction, AmpThink went with overhead APs for the arena’s upper seating deck, mainly because the roof there hangs closer to the seats than in many other indoor venues.

A look at the overhead APs from below. Credit: Paul Kapustka, MSR

Though combining different AP architectures is tricky, AmpThink’s Anderson said it’s better to pick “the one that’s economically right” rather than staying stuck with one method. Overhead placements like SAP Center’s, which are hung from the walkways near the roof, are typically much cheaper per placement than under-seat or handrail deployments, which often require extensive work including core drilling through concrete floors.

“I was a little concerned at first [about the combination of overhead and railing placements] but the roof is close enough to work,” said Aiello of the dual placement methods. According to AmpThink’s Anderson, most of the overhead antennas are about 30 feet away from the seats, with the farthest being 45 feet away — still close enough that the power needed to reach fans doesn’t bleed the signal down into the lower bowl. Aiello also noted that an under-seat or handrail AP design for the upper deck would have required the extra expense and work of drilling through the ceilings of the stadium’s premium suites, which sit between the two main bowl seating levels.
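
A rough free-space path-loss estimate shows why those distances work at 5 GHz. This is our illustration only; real indoor propagation adds reflections, antenna gain and body loss on top of the free-space figure.

```python
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB, for distance in meters and frequency in MHz."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

FEET_TO_METERS = 0.3048
for feet in (30, 45):                                # nearest/farthest seats
    loss = fspl_db(feet * FEET_TO_METERS, 5500)      # mid-band 5 GHz
    print(f"{feet} ft: ~{loss:.0f} dB free-space loss")

# Output: 30 ft -> ~66 dB, 45 ft -> ~70 dB. Only about a 4 dB spread between
# nearest and farthest seats, so the overhead APs can run at modest power and
# still reach everyone without washing signal down into the lower bowl.
```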

In upper-deck section 219, we tested the Wi-Fi at several seating locations and came up with consistently fast speeds, including one test at 48.88 / 44.96 Mbps; at the lounge area along the arena’s top row we saw even faster marks, including 68.00 / 68.52. We also saw many VenueNext railing-mounted beacon enclosures, part of a planned 500-beacon network that Aiello said will come online soon.

Since hockey builds two long intermissions into every game, it’s extremely important for venues to provide good connectivity in the concourse and club areas where fans congregate between periods. And even though the SAP Center is an older building — which sometimes makes aesthetics a challenge — AmpThink and the Sharks were able to hide almost all of the APs placed every 50 feet around the main circular concourse, thanks to a small drywall facade that sticks out from the main wall to support directional and section-number signage.

While some of the Wi-Fi speed tests we took while roaming the concourse during the crowded pregame were in the high-40 Mbps range, a few came in much higher, with one at 67.94 / 58.14 Mbps and another at 63.76 / 55.96, the latter near a crowded concession area. And even with fans streaming in at a good clip, one test near a doorway hit 69.62 / 70.54, showing that walk-around coverage is solid throughout the building.

A VenueNext beacon mount in the upper deck. Credit: Paul Kapustka, MSR

And even though the Sharks were eliminated in the first round of this year’s playoffs, fans used the SAP Center Wi-Fi at higher levels than normal during the postseason. According to Aiello, the stadium saw a peak of 5,013 concurrent users en route to a total of 1.3 terabytes of data used at the first home playoff game; the second home game saw 1.1 TB of data used, with 4,890 peak concurrent users.

New LED boards keep fans connected while out of seats

If the Wi-Fi APs remain hidden from fans strolling the concourses, the new LED boards have the opposite effect — instead of just a few TV screens here and there, the Sharks and Daktronics, along with AmpThink and Cisco, have gone all-in with a strategy of multiple-screen boards and long banks of LED strips that can all be controlled and programmed from a single location, thanks to the Cisco StadiumVision system.

Having networked, controllable screens is a huge plus for administration — according to Aiello, the previous setup required manual walk-arounds to configure and check each display. AmpThink also helped reduce the wiring needed for the new displays by connecting the LED boards to the same IP cabling used for the Wi-Fi system.

This photo shows how close the ceiling is to the upper deck seats. Credit: Paul Kapustka, MSR

“The video, audio and Wi-Fi all used to be discrete systems,” said AmpThink’s Anderson. “Now they all roll up to one converged network.”

With StadiumVision, the Sharks will be able to program the displays on the fly, substituting advertising for live game action as quickly as hockey teams change players on the ice. Aiello noted that the combination of screens and a beacon system will allow the Sharks to sell more targeted advertising with real metrics showing the number of fans in the area of a display. Big displays mounted above doorways can also be changed to assist with foot traffic and transportation info for postgame exit flows.

App already providing more marketing leads

Wrapping it all together in fans’ hands is the new app from VenueNext, a company Tortora said the Sharks had been in contact with since the launch of its first product, the Levi’s Stadium app for the San Francisco 49ers. While the VenueNext app will evolve over time to potentially add more services, the ability to let fans move tickets around digitally has already helped the Sharks start down their desired path of having more personalized information to better reach current and prospective customers.

“During the preseason this year we had 2,500 tickets transferred per game, versus 800 during last year’s preseason games,” Tortora said. Because many of those transfers involved sending tickets to email addresses or phone numbers that weren’t current season ticketholders, Tortora said the Sharks were able to add approximately 7,500 new names to their ticket marketing database, which Tortora simply called “gold.”

Fans’ social media posts are featured on the scoreboard during pregame. Credit: Paul Kapustka, MSR

“To do digital tickets, fans have to download the app, so now we can market directly to that person,” Tortora said. That move will help the Sharks identify things like “who’s not coming to games and why,” which may give the team early warning that fans might not renew season tickets, so it can market to them accordingly.

A Cisco-built fan portal is also part of the overall package, and eventually the team hopes to use that software to construct more personal marketing messages informed by factors like live presence and location within the building. As more data accumulates, Tortora said, the Sharks plan to push even deeper into a strategy already underway that revolves around dynamic ticket pricing.

“We can use data to find out where seats are in demand, and where some sections may not be selling well,” Tortora said, and shift prices accordingly. The team has already broken seating prices into 16 different categories for this season, with plans to expand that to 36 different categories for next season, Tortora said.

“Airlines do this, hotels do this,” Tortora said. “It’s all about being data-driven.”

The Sharks and Blackhawks get ready to rumble. Credit: Paul Kapustka, MSR

LED screens above an entryway. Credit: Paul Kapustka, MSR

More LED screens above the seating entry areas in the main concourse. Credit: San Jose Sharks

LED screens above entryway, where fans use the VenueNext ‘Kezar’ scanners to validate tickets. Credit: Paul Kapustka, MSR

Even more LED screens, on a different concourse. Credit: San Jose Sharks

More Wi-Fi consolidation as Riverbed acquires Xirrus

Network management company Riverbed announced a definitive agreement to acquire Wi-Fi provider Xirrus for an undisclosed sum. The move is just the latest in a string of acquisitions of standalone Wi-Fi companies, following Hewlett-Packard’s purchase of Aruba in 2015 and the journey of Ruckus Wireless, which was first bought by Brocade and is now on its way to becoming part of Arris.

While Riverbed said it would continue to offer Xirrus’ Wi-Fi products as a standalone business, its interest in Xirrus’ technology most likely lies in the enterprise office market. Though nowhere near the size of players like Cisco or Aruba in the stadium wireless networking arena, Xirrus did have some wins in the space, including deals in which its equipment was resold by Avaya. More as we hear more on the deal.

Braves see 8.4 TB of Wi-Fi data used at SunTrust Park’s opening weekend

SunTrust Park, new home of the Atlanta Braves

The Atlanta Braves and Wi-Fi partner Comcast Business claim that 8.4 terabytes of data were used on the Wi-Fi network at SunTrust Park and in the surrounding “Battery” public areas during the opening weekend for the Braves’ new home stadium.

According to the Braves and Comcast, the networks inside and outside the new ballpark saw 5.3 TB of Wi-Fi data used on April 14, the park’s first Major League Baseball regular-season game. On the following two nights, the network saw 1.8 TB and 1.3 TB of activity, respectively. The network, which uses Cisco Wi-Fi gear, has 800 APs inside the stadium proper. Look for a profile of the SunTrust Park network coming soon here on MSR!

Final Four final score: 17.6 TB (at least) of wireless data used at University of Phoenix Stadium

We finally have the Wi-Fi numbers from the NCAA men’s basketball tournament Final Four weekend at the University of Phoenix Stadium, and they are big — a total of 11.2 terabytes of data used during the two days of competition, according to the stadium network crews running the operations for the NCAA. Combined with AT&T’s reported DAS total of 6.4 TB, that means the total wireless usage so far is at least 17.6 TB — and that’s not including DAS numbers from Verizon Wireless, Sprint or T-Mobile, which if we had them would probably push the total far higher.

Just on the Wi-Fi side of things, the Saturday semifinal games this year produced enough single-day traffic (6.3 TB) to sneak into our unofficial top five for Wi-Fi events, barely edging Super Bowl XLIX, which saw 6.23 TB of traffic in the same building two years earlier. Granted, the Final Four has more fans in attendance and more time, with two games compared to one, but it’s still a sign (to us, anyway) that wireless use by fans at big games of all types continues to grow. (It’s cool to see the comparison between a Super Bowl and a Final Four in the same venue, as well. Looks like the network operators there keep improving from big game to big game.)

According to the network stats provided to us, the Final Four crowd on Saturday included 38,520 unique users who connected to the Wi-Fi at some point, with a peak of 20,675 concurrent users. For Monday night’s championship game, those numbers were 31,458 uniques and 19,861 peak concurrent users. Attendance was 77,612 for Saturday’s semifinals and 76,168 for Monday’s championship, each the second-highest ever for its session, according to a cool NCAA infographic that has more stats on TV and internet viewership.
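
For context, here is the take-rate arithmetic those numbers imply, computed by us from the stats above:

```python
# Wi-Fi take rates for the two Final Four sessions, computed from the
# attendance and user counts quoted above.
sessions = {
    "Semifinals (Sat.)":   {"attendance": 77_612, "unique": 38_520, "peak": 20_675},
    "Championship (Mon.)": {"attendance": 76_168, "unique": 31_458, "peak": 19_861},
}

for name, s in sessions.items():
    print(f"{name}: {s['unique'] / s['attendance']:.0%} of attendees on Wi-Fi, "
          f"{s['peak'] / s['attendance']:.0%} connected at the peak")

# Saturday: ~50% take rate, ~27% concurrent at peak.
# Monday:   ~41% take rate, ~26% concurrent at peak.
```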

See you next year in San Antonio, NCAA… to see if the connectivity pace keeps increasing!

THE NEW TOP 8 FOR WI-FI

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
3. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
4. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
5. NCAA Men’s Final Four, University of Phoenix Stadium, Glendale, Ariz., April 1, 2017: Wi-Fi: 6.3 TB
6. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
7. Alabama vs. Texas A&M, Kyle Field, College Station, Texas, Oct. 17, 2015: Wi-Fi: 5.7 TB
8. Pittsburgh Steelers vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 22, 2017: Wi-Fi: 5.11 TB