College Football Playoff championship sees 2.4 TB of Wi-Fi — big decline from 2016

We finally have numbers for Wi-Fi usage at the most recent College Football Playoff championship game, and in somewhat of a first, the total was much lower than the previous year’s: just 2.4 terabytes of data used on Jan. 9 at Raymond James Stadium in Tampa, Fla., compared to 4.9 TB of Wi-Fi used at the 2016 championship game, held at the University of Phoenix Stadium in Glendale, Ariz.

The decline is probably not due to any sudden dropoff in user demand, since usage of in-stadium cellular or DAS networks increased from 2016 to 2017, with AT&T’s observed network usage doubling from 1.9 TB to 3.8 TB in Tampa. More likely, the difference is in the networks themselves: the Wi-Fi network at the University of Phoenix Stadium had been through recent upgrades to prepare for both the college championship game and Super Bowl XLIX, while the network in Raymond James Stadium hasn’t seen a significant upgrade since 2010, according to stadium officials. At last check, the Wi-Fi network at University of Phoenix Stadium had more than 750 APs installed.

Joey Jones, network engineer/information security for the Tampa Bay Buccaneers, said the Wi-Fi network currently in use at Raymond James Stadium has a total of 325 Cisco Wi-Fi APs, with 130 of those in the bowl seating areas. The design is all overhead placements, Jones said in an email discussion, with no under-seat or handrail enclosure placements. The total unique number of Wi-Fi users for the college playoff game was 11,671, with a peak concurrent connection of 7,353 users, Jones said.

Still tops among college playoff championship games in Wi-Fi is the first one held at AT&T Stadium in 2015, where 4.93 TB of Wi-Fi was used. Next year’s championship game is scheduled to be held at the new Mercedes-Benz Stadium in Atlanta, where one of the latest Wi-Fi networks should be in place and operational.

Seahawks see big jump in Wi-Fi usage at CenturyLink Field for 2016-17 season

The Seattle Seahawks saw almost every metric associated with the Wi-Fi network at CenturyLink Field just about double from the 2015-16 to the 2016-17 NFL regular season, according to statistics provided by the team.

Chip Suttles, vice president of technology for the Seahawks, sent us over some excellent season-long charts of Wi-Fi activity, including unique and concurrent-user peaks, top throughput, and a couple of comparison charts mapping this most recent season’s activity compared to that a year before.

With a capacity crowd of 69,000, the always sold-out CenturyLink saw a take rate nearing 50 percent for most of the season, with a top unique-user number of 35,808 for a 31-25 win over the Buffalo Bills on Nov. 7. Interestingly, the biggest day for total data usage wasn’t the Bills game (3.259 terabytes) but a 26-15 win over the Philadelphia Eagles on Nov. 20, when the Wi-Fi network saw 3.526 TB of data used. If you look at the comparative graphs, both peak and total usage numbers pretty much doubled from what was seen the year before.

According to Suttles, there wasn’t much in the way of upgrades to the Extreme Networks-based network before this past season — just some firmware and software updates, and “about a half-dozen” new APs to support additional seating added in the south end zone area. “Overall, I think it [the data increases] is more to do with familiarity,” Suttles said. Thanks to Chip and the Seahawks for sharing the numbers!


From overhead to under seat: A short history of the hows and whys of stadium Wi-Fi network design

Wi-Fi handrail enclosures at U.S. Bank Stadium, Minneapolis, Minn. Credit: Paul Kapustka, MSR


By Bill Anderson, AmpThink

The history of high density (HD) Wi-Fi deployments in stadiums and arenas is short. Yet the amount of change that has occurred is significant, both in terms of how these networks are deployed and why.

Venue operators, manufacturers, and integrators are still grappling with the particulars of HD Wi-Fi in large open environments, even though there are a substantial number of deployed high quality implementations. Below, I’ve shared our perspective on the evolution of HD Wi-Fi design in stadiums and arenas and put forth questions that venue operators should be asking to find a solution that fits their needs and their budget.

AmpThink’s background in this field

Over the past 5 years, our team has been involved in the deployment of more than 50 high-density Wi-Fi networks in stadiums throughout North America. In that same period, the best-practices for stadium HD Wi-Fi design have changed several times, resulting in multiple deployment methodologies.

Each major shift in deployment strategy was intended to increase total system capacity [1]. The largest gains have come from better antenna technology or deployment techniques that better isolated access point output, resulting in gains in channel re-use.

What follows is a summary of what we’ve learned from the deployments we participated in and their significance for the future. Hopefully, this information will be useful to others as they embark on their journeys to purchase, deploy, or enhance their own HD Wi-Fi networks.

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.

In the beginning: All about overhead
Designers of first generation of HD Wi-Fi networks were starting to develop the basic concepts that would come to define HD deployments in large, open environments. Their work was informed by prior deployments in auditoriums and convention centers and focused on using directional antennas. The stated goal of this approach was to reduce co-channel interference [2] by reducing the effective footprint of an individual access point’s [3] RF output.

However, the greatest gains came from improving the quality of the link between clients and the access point. Better antennas allowed client devices to communicate at faster speeds, which decreased the amount of time required to complete their communication, making room for more clients on each channel before a given channel became saturated or unstable.
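The airtime arithmetic behind that gain can be sketched in a few lines. This is a simplified model, and the data rates and per-client demand below are illustrative assumptions, not figures from any deployment described here:

```python
# Rough airtime model: a faster link means each client finishes its
# communication sooner, so more clients fit on a channel before it saturates.
# All numbers are illustrative assumptions, not measurements.

def clients_per_channel(demand_mbps: float, phy_rate_mbps: float,
                        efficiency: float = 0.5) -> int:
    """Estimate how many clients one channel can carry before airtime runs out.

    demand_mbps   -- average throughput each client needs
    phy_rate_mbps -- link speed the antenna system delivers to clients
    efficiency    -- fraction of the raw PHY rate usable after protocol overhead
    """
    goodput = phy_rate_mbps * efficiency
    return int(goodput / demand_mbps)

# A client averaging 0.5 Mbps on a weak 12 Mbps link vs. a strong 72 Mbps link:
print(clients_per_channel(0.5, 12))   # 12 clients saturate the channel
print(clients_per_channel(0.5, 72))   # 72 clients fit on the same channel
```

Under these assumptions, raising the usable link rate raises the client count per channel in direct proportion, which is why better antennas alone produced large gains before any change in deployment geometry.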

Under seat Wi-Fi AP at Bank of America Stadium. Credit: Carolina Panthers


The concept was simple, but limited by the fact that there were few antennas available that could do the job effectively. Creative technicians created hybrid assemblies that combined multiple antennas into arrays that rotated polarization and tightened the antenna beam to paint the smallest usable coverage pattern possible. In time, this gap was addressed and today there are antennas specifically developed for use in overhead HD deployments – Stadium Antennas.

Typically, Stadium Antennas are installed in the ceilings above seating and/or on the walls behind seating because those locations are relatively easy to cable and minimize cost. We categorize these deployments as Overhead Deployments.

From overhead to ‘front and back’

First generation overhead deployments generally suffer from a lack of overhead mounting locations to produce sufficient coverage across the entire venue. In football stadiums, the front rows of the lower bowl are typically not covered by an overhang that can be used for antenna placement.

These rows are often more than 100 feet from the nearest overhead mounting location. The result is that pure overhead deployments leave some of the most expensive seats in the venue with little or no coverage. Further, due to the length of these sections, antennas at the back of the section potentially service thousands of client devices [4].

As fans joined these networks, deployments quickly became over-loaded and generated service complaints for venue owners. The solution was simple — add antennas at the front of long sections to reduce the total client load on the access points at the back. It was an effective band-aid that prioritized serving the venues’ most important and often most demanding guests.

This approach increased the complexity of installation as it was often difficult to cable access points located at the front of a section.

And for the first time, antennas were placed where they were subject to damage by fans, direct exposure to weather, and pressure washing [5]. With increased complexity came increased costs, as measured by the average cost per installed access point across a venue.

Because these systems feature antennas at the front and rear of each seating section, we refer to these deployments as ‘Front-to-Back Deployments.’ While this approach solves specific problems, it is not a complete solution in larger venues.

‘Filling In’ the gaps

Data collected from Front-to-Back Deployments proved to designers that moving the antennas closer to end users:
— covered areas that were previously uncovered;
— increased average data rates throughout the bowl;
— used the available spectrum more effectively; and
— increased total system capacity.

The logical conclusion was that additional antennas installed between the front and rear antennas would further increase system capacity. In long sections these additional antennas would also provide coverage to fans that were seated too far forward of antennas at the rear of the section and too far back from antennas at the front of the section. The result was uniform coverage throughout the venue.

In response, system designers experimented with hand rail mounted access points. Using directional antennas, coverage could be directed across a section and in opposition to the forward-facing antennas at the rear of the section and rear-facing antennas at the front of a section. These placements filled in the gaps in a Front-to-Back Deployment, hence the name ‘In-Fill Deployment.’

While these new In-Fill Deployments did their job, they added expense to what was already an expensive endeavor. Mounting access points on handrails required that a hole be drilled through the concrete at each access point location to cable the installed equipment. With the access point and antenna now firmly embedded in the seating, devices were also exposed to more traffic and abuse. Creative integrators came to the table with hardened systems to protect the equipment – handrail enclosures. New costs included: using ground-penetrating radar to prepare for coring; enclosure fabrication costs; and more complex conduit and pathway considerations. A typical handrail placement could cost four times as much as a typical overhead placement, and a designer might call for 2 or 3 handrail placements for every overhead placement.

Getting closer, better, faster: Proximate Networks

In-Fill strategies substantially solved the coverage problem in large venues. Using a combination of back of section, front of section, and hand-rail mounted access points, wireless designers had a tool box to deliver full coverage.

But with that success came a new problem. As fans discovered these high density networks and found new uses for them, demands on those networks grew rapidly, especially where teams or venue owners pushed mobile-device content strategies that added to the network load. In spite of well-placed access points, fan devices did not attach to the in-fill devices at the same rate that they attached to the overhead placements [6]. In-fill equipment remained lightly used while overhead placements absorbed hundreds of clients. Gains in system capacity stalled.

Close-up look at U.S. Bank Stadium railing enclosure during final construction phase, summer 2016. Credit: Paul Kapustka, MSR


To overcome uneven system loading, designers needed to create a more even distribution of RF energy within the deployment. That required a consistent approach to deployment, rather than a mix of deployment approaches. The result was the elimination of overhead antennas in favor of access points and antennas installed within the crowd, closest to the end user; hence the name ‘Proximate Networks.’

Proximate networks come in two variations: handrail only and under seat only. In the handrail-only model, the designer eliminates overhead and front-of-section placements in favor of a dense deployment of handrail enclosures. In the under-seat model, the designer places the access point and antenna underneath the actual seating (but above the steel or concrete decking). In both models, the crowd becomes an important part of the design: the crowd attenuates the signal as it passes through fans’ bodies, resulting in consistent signal degradation and even distribution of RF energy throughout the seating bowl. The result is even access point loading and increased system capacity.

An additional benefit of embedding the access points in the crowd is that the crowd effectively constrains the output of the access point much as a wall constrains the output of an access point in a typical building. Each radio therefore hears fewer of its neighbors, allowing each channel to be re-used more effectively. And because the crowd provides an effective mechanism for controlling the spread of RF energy, the radios can be operated at higher power levels, which improves the link between the access point and the fan’s device. The result is more uniform system loading, higher average data rates, increased channel re-use, and increases in total system capacity.

While Proximate Networks are still a relatively new concept, the early data (and a rapidly growing number of fast followers) confirms that if you want the densest possible network with the largest possible capacity, then a Proximate Network is what you need.

The Financials: picking what’s right for you

From the foregoing essay, you might conclude that the author’s recommendation is to deploy a Proximate Network. However, that is not necessarily the case. If you want the densest possible network with the largest possible capacity, then a Proximate Network is a good choice. But there are merits to each approach described and a cost benefit analysis should be performed before a deployment approach is selected.

Overhead Deployments remain the most cost-effective way to provide coverage. For smaller venues, and in venues where system utilization is expected to be low, an Overhead Deployment can be ideal.

Front-to-Back deployments work well in venues where system utilization is low and the available overhead mounting assets can’t cover all areas. The goal of these deployments is ensuring usable coverage, not maximizing total system capacity.

In-fill deployments are a good compromise between a coverage-centric high density approach and a capacity-centric approach. This approach is best suited to venues that need more total system capacity, but have budget constraints that prevent selecting a Proximate approach.

Proximate deployments provide the maximum possible wireless density for venues where connectivity is considered to be a critical part of the venue experience.

Conclusion

If your venue is contemplating deploying a high density network, ask your integrator to walk you through the expected system demand, the calculation of system capacity for each approach, and finally the cost of each approach. Make sure you understand their assumptions. Then, select the deployment model that meets your business requirements — there is no “one size fits all” when it comes to stadium Wi-Fi.

Bill Anderson, AmpThink


Bill Anderson has been involved in the design and construction of wireless networks for over 20 years, pre-dating Wi-Fi. His first experience with wireless networking was as a software developer building software for mobile computers communicating over 400 MHz and 900 MHz base stations developed by Aironet (now part of Cisco Systems).

His work with mobile computing and wireless networks in distribution and manufacturing afforded him a front-row seat to the emergence of Wi-Fi and its transformation from a niche technology to a business-critical system. Since joining AmpThink in 2011, Bill has been actively involved in constructing some of the largest single-venue wireless networks in the world.

Footnotes

^ 1. A proxy for the calculation of overall system capacity is developed by multiplying the average speed of communication of all clients on a channel (avg data rate or speed) by the number of channels deployed in the system (available spectrum) by the number of times we can use each channel (channel re-use) or [speed x spectrum x re-use]. While there are many other parameters that come into play when designing a high density network (noise, attenuation, reflection, etc.), this simple equation helps us understand how we approach building networks that can support a large number of connected devices in an open environment, e.g. the bowl of a stadium or arena.
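The proxy above, [speed x spectrum x re-use], can be expressed directly as a function. The numbers in the example below are hypothetical, chosen only to show how rate and re-use gains compound; they are not drawn from any real deployment:

```python
# Footnote 1's capacity proxy: speed x spectrum x re-use.
# Example inputs are hypothetical, not taken from any deployment.

def system_capacity(avg_rate_mbps: float, channels: int, reuse: float) -> float:
    """Proxy for total system capacity in Mbps."""
    return avg_rate_mbps * channels * reuse

# Overhead-only design: modest average rates, little channel re-use.
overhead = system_capacity(avg_rate_mbps=20, channels=20, reuse=2)    # 800 Mbps
# Proximate design: the crowd isolates APs, so rate and re-use both rise.
proximate = system_capacity(avg_rate_mbps=40, channels=20, reuse=6)   # 4800 Mbps
print(overhead, proximate)
```

Because the three factors multiply, a design change that doubles the average data rate and triples re-use yields a sixfold capacity gain over the same spectrum.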

^ 2. Co-channel interference refers to a scenario where multiple access points are attempting to communicate with client devices using the same channel. If a client or access point hears competing communication on the channel it is attempting to use, it must wait until that communication is complete before it can send its message.

^ 3. Access Point is the term used in the Wi-Fi industry to describe the network endpoint that client devices communicate with over the air. Other terms used include radio, AP, or WAP. In most cases, each access point is equipped with 2 or more physical radios that communicate on one of two bands – 2.4 GHz or 5 GHz. HD Wi-Fi deployments are composed of several hundred to over 1,000 access points connected to a robust wired network that funnels guest traffic to and from the internet.

^ 4. While there is no hard and fast rule, most industry experts agree that a single access point can service between 50 and 100 client devices.

^ 5. Venues often use pressure washers to clean a stadium after a big event.

^ 6. Unlike cellular systems which can dictate which mobile device attaches to each network node, at what speed, and when they can communicate, Wi-Fi relies on the mobile device to make the same decisions. When presented with a handrail access point and an overhead access point, mobile devices often hear the overhead placement better and therefore prefer the overhead placement. In In-Fill deployments, this often results in a disproportionate number of client devices selecting overhead placements. The problem can be managed by lowering the power level on the overhead access point at the expense of degrading the experience of the devices that the designer intended to attach to the overhead access point.

Arris to acquire Ruckus from Brocade as part of $800 million deal

The wireless networking business once known as Ruckus Wireless is finding a new home, as Arris announced today that it plans to buy Ruckus from current owner Brocade as part of an $800 million deal that also includes Brocade’s ICX switch business.

Brocade, which purchased Ruckus for $1.2 billion last year, is itself being acquired by chipmaker Broadcom in a $5.5 billion deal announced last November.

According to an Arris release, the deal will be completed when Broadcom’s purchase of Brocade closes, an event expected to happen at the end of July.

We’ve got some calls in to see what, if anything, will happen to the growing stadium networking business at Ruckus, which in the past year has seen Ruckus Wi-Fi gear in arenas like Bankers Life Fieldhouse in Indianapolis and Golden 1 Center in Sacramento. Stay tuned!

AmpThink’s Wi-Fi data reveals interesting attendance trends for collegiate customer

AmpThink infographic about how Wi-Fi data can help teams discover attendance information (click on photo for link to infographic page)


If there’s a single business concern we hear over and over again from stadium owners and operators, it’s the desire to answer a simple but powerful question: Who, exactly, is sitting in our seats?

Before digital technology arrived, that question was exceedingly hard to answer, as teams, schools and other ticket-sellers often knew details about only a small percentage of the actual fans in attendance. Paper tickets could be given to family or friends, or sold to brokers, meaning the people actually at the game might very well not be the people who purchased the tickets.

While digital ticketing has improved the insight somewhat, many fans at many stadiums still use printed tickets for access, which may keep the at-game attendee somewhat anonymous. But with a high-density Wi-Fi network in place, stadium owners and operators can gain deep insights about the fans who do attend, even if those fans never purposely log on to the network.

Wi-Fi deployment firm AmpThink, which has customers in all the U.S. pro leagues as well as in many major university venues, has put together a deeply sourced infographic showing how Wi-Fi analytics from a full season of games at a Power 5 college football stadium can produce some interesting insights — like the fact that 71 percent of all attendees only went to one game, and that only 2 percent of attendees went to all six games.
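As an illustration of the general method (not AmpThink’s actual pipeline), per-game lists of observed device identifiers can be reduced to an attendance-frequency distribution in a few lines. The device identifiers and game count below are made up for the example:

```python
# Sketch: turning per-game Wi-Fi device observations into an
# attendance-frequency distribution. All identifiers are fabricated.
from collections import Counter

# One set of observed device identifiers per home game (6 games).
games = [
    {"aa:01", "aa:02", "aa:03"},
    {"aa:01", "aa:04"},
    {"aa:01", "aa:05"},
    {"aa:01", "aa:06"},
    {"aa:01", "aa:07"},
    {"aa:01", "aa:02"},
]

# Count how many games each device appeared at.
games_attended = Counter(mac for game in games for mac in game)

# Distribution: how many fans attended exactly n games.
distribution = Counter(games_attended.values())
total_fans = len(games_attended)
for n in sorted(distribution):
    pct = 100 * distribution[n] / total_fans
    print(f"{n} game(s): {distribution[n]} fans ({pct:.0f}%)")
```

In this toy data set, 5 of the 7 devices show up at only one game, mirroring the kind of single-game skew the infographic reports; the same counting applies unchanged to a real season’s device logs.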

Using data to replace assumptions

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.

While we here at Mobile Sports Report don’t often recommend company-produced infographics, the data and conclusions surfaced in this one are eye-opening and are likely to be informative to venue owners and operators in a wide range of sports; that’s why we agreed to make this information available to our readers.

We also like the detailed explanations accompanying the infographic, spelling out how the data were collected and how Wi-Fi can be used to identify fans (including those with devices that may not even be purposely connected to the network). The last part of the infographic page asks “How could Wi-Fi data change sports marketing?”, a question we’ve already seen others starting to answer, and one we expect many to test in the near future as teams deploy not just Wi-Fi networks but also Bluetooth beacons, login portal pages and other methods to increase the granularity of fan identification.

For the unidentified client, AmpThink said the results “surprised” the school, which had (like others) believed in “long-held industry assumptions about fan loyalty and audience size.” It’s our guess that digital data will increasingly be used to replace assumptions, and we’re looking forward to sharing your stories of how that happens.

AT&T Stadium sees 7.25 TB of Wi-Fi for Packers vs. Cowboys playoff game

The Dallas Cowboys before taking the field against the Green Bay Packers in a Jan. 15 playoff game. Credit: James D. Smith/Dallas Cowboys

The Dallas Cowboys before taking the field against the Green Bay Packers in a Jan. 15 playoff game. Credit: James D. Smith/Dallas Cowboys

Pro football’s biggest stadium had the biggest non-Super Bowl Wi-Fi traffic day we’ve heard of this season, as the Dallas Cowboys reported seeing 7.25 terabytes of Wi-Fi data on the AT&T Stadium network during the Packers’ thrilling 34-31 victory on Jan. 15.

John Winborn, chief information officer for the Dallas Cowboys, sent us the info on the stadium’s biggest Wi-Fi day ever, surpassing the previous record of 6.77 TB seen on the AT&T Stadium Wi-Fi network for WrestleMania 32 back on April 3, 2016. The new Wi-Fi record was even set by fewer fans, with attendance for the Jan. 15 playoff game at 93,396, compared to the 101,763 at WrestleMania.

Though he didn’t provide an exact number, Winborn also said that the take rate of unique clients on the Wi-Fi network for the Packers game was 50 percent of attendees, roughly 46,700, easily one of the biggest numbers we’ve seen anywhere. During the Cowboys’ excellent regular season, Winborn said the average of Wi-Fi data used per game was 5.28 TB, an increase of 33 percent over the 2015 season.

UPDATE: The AT&T folks have provided the DAS stats for the same game, with an additional 3 TB of data used on the AT&T cellular networks inside the stadium. So we’re up to 10.25 TB for a non-Super Bowl game… doubt we will get any other carriers to add their totals but sounds to me like this is the biggest non-Super Bowl event out there in terms of total data.

Any other NFL teams (or college teams) out there with peak games and/or season averages, send them in! Let’s keep updating this list!

THE NEW TOP 7 FOR WI-FI

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
3. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
4. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
5. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
6. Alabama vs. Texas A&M, Kyle Field, College Station, Texas, Oct. 17, 2015: Wi-Fi: 5.7 TB
7. Pittsburgh Steelers vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 22, 2017: Wi-Fi: 5.11 TB