From overhead to under seat: A short history of the hows and whys of stadium Wi-Fi network design

Wi-Fi handrail enclosures at U.S. Bank Stadium, Minneapolis, Minn. Credit: Paul Kapustka, MSR

By Bill Anderson, AmpThink

The history of high density (HD) Wi-Fi deployments in stadiums and arenas is short. Yet the amount of change that has occurred is significant, both in terms of how these networks are deployed and why.

Venue operators, manufacturers, and integrators are still grappling with the particulars of HD Wi-Fi in large open environments, even though a substantial number of high quality implementations have already been deployed. Below, I’ve shared our perspective on the evolution of HD Wi-Fi design in stadiums and arenas and put forth questions that venue operators should be asking to find a solution that fits their needs and their budget.

AmpThink’s background in this field

Over the past five years, our team has been involved in the deployment of more than 50 high-density Wi-Fi networks in stadiums throughout North America. In that same period, the best practices for stadium HD Wi-Fi design have changed several times, resulting in multiple deployment methodologies.

Each major shift in deployment strategy was intended to increase total system capacity [1]. The largest gains have come from better antenna technology and deployment techniques that better isolate each access point’s output, resulting in gains in channel re-use.

What follows is a summary of what we’ve learned from the deployments we participated in and their significance for the future. Hopefully, this information will be useful to others as they embark on their journeys to purchase, deploy, or enhance their own HD Wi-Fi networks.

In the beginning: All about overhead

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.


Designers of the first generation of HD Wi-Fi networks were developing the basic concepts that would come to define HD deployments in large, open environments. Their work was informed by prior deployments in auditoriums and convention centers and focused on using directional antennas. The stated goal of this approach was to reduce co-channel interference [2] by shrinking the effective footprint of an individual access point’s [3] RF output.

However, the greatest gains came from improving the quality of the link between clients and the access point. Better antennas allowed client devices to communicate at faster speeds, which decreased the time each device needed to complete its communication, making room for more clients on each channel before that channel became saturated or unstable.

Under seat Wi-Fi AP at Bank of America Stadium. Credit: Carolina Panthers

The concept was simple, but limited by the fact that few available antennas could do the job effectively. Creative technicians built hybrid assemblies that combined multiple antennas into arrays that rotated polarization and tightened the beam to paint the smallest usable coverage pattern possible. In time, this gap was addressed, and today there are antennas developed specifically for use in overhead HD deployments – Stadium Antennas.

Typically, Stadium Antennas are installed in the ceilings above seating and/or on the walls behind seating because those locations are relatively easy to cable and minimize cost. We categorize these deployments as Overhead Deployments.

From overhead to ‘front and back’

First-generation overhead deployments generally suffered from a lack of overhead mounting locations, making it difficult to produce sufficient coverage across the entire venue. In football stadiums, the front rows of the lower bowl are typically not covered by an overhang that can be used for antenna placement.

These rows are often more than 100 feet from the nearest overhead mounting location. The result is that pure overhead deployments leave some of the most expensive seats in the venue with little or no coverage. Further, due to the length of these sections, antennas at the back of the section potentially service thousands of client devices [4].

As fans joined these networks, deployments quickly became over-loaded and generated service complaints for venue owners. The solution was simple — add antennas at the front of long sections to reduce the total client load on the access points at the back. It was an effective band-aid that prioritized serving the venues’ most important and often most demanding guests.

This approach increased the complexity of installation as it was often difficult to cable access points located at the front of a section.

And for the first time, antennas were placed where they were subject to damage by fans, direct exposure to weather, and pressure washing [5]. With increased complexity came increased costs, as measured by the average cost per installed access point across a venue.

Because these systems feature antennas at the front and rear of each seating section, we refer to these deployments as ‘Front-to-Back Deployments.’ While this approach solves specific problems, it is not a complete solution in larger venues.

‘Filling In’ the gaps

Data collected from Front-to-Back Deployments proved to designers that moving the antennas closer to end users:
— covered areas that were previously uncovered;
— increased average data rates throughout the bowl;
— used the available spectrum more effectively; and
— increased total system capacity.

The logical conclusion was that additional antennas installed between the front and rear antennas would further increase system capacity. In long sections these additional antennas would also provide coverage to fans that were seated too far forward of antennas at the rear of the section and too far back from antennas at the front of the section. The result was uniform coverage throughout the venue.

In response, system designers experimented with handrail-mounted access points. Using directional antennas, coverage could be directed across a section, in opposition to the forward-facing antennas at the rear of the section and the rear-facing antennas at the front. These placements filled in the gaps in a Front-to-Back Deployment, hence the name ‘In-Fill Deployment.’

While these new In-Fill Deployments did their job, they added expense to what was already an expensive endeavor. Mounting access points on handrails required drilling a hole through the seating deck at each access point location to cable the installed equipment. With the access point and antenna now firmly embedded in the seating, devices were also exposed to more traffic and abuse. Creative integrators came to the table with hardened systems to protect the equipment – handrail enclosures. New costs included using ground-penetrating radar to prepare for coring, enclosure fabrication, and more complex conduit and pathway considerations. A typical handrail placement could cost four times as much as a typical overhead placement, and a designer might call for two or three handrail placements for every overhead placement.

Getting closer, better, faster: Proximate Networks

In-Fill strategies substantially solved the coverage problem in large venues. Using a combination of back-of-section, front-of-section, and handrail-mounted access points, wireless designers had a toolbox to deliver full coverage.

But with that success came a new problem. As fans discovered these high density networks and found new uses for them, demands on those networks grew rapidly, especially where teams or venue owners pushed mobile-device content strategies that added to the network load. In spite of well-placed access points, fan devices did not attach to the in-fill devices at the same rate that they attached to the overhead placements [6]. In-fill equipment remained lightly used while overhead placements absorbed hundreds of clients. Gains in system capacity stalled.

Close-up look at U.S. Bank Stadium railing enclosure during final construction phase, summer 2016. Credit: Paul Kapustka, MSR

To overcome uneven system loading, designers needed to create a more even distribution of RF energy within the deployment. That required a consistent approach, rather than a mix of deployment strategies. The result was the elimination of overhead antennas in favor of access points and antennas installed within the crowd, closest to the end user; hence the name ‘Proximate Networks.’

Proximate Networks come in two variations: handrail only and under seat only. In the handrail-only model, the designer eliminates overhead and front-of-section placements in favor of a dense deployment of handrail enclosures. In the under-seat model, the designer places the access point and antenna underneath the actual seating (but above the steel or concrete decking). In both models, the crowd becomes an important part of the design. The crowd attenuates the signal as it passes through fans’ bodies, resulting in consistent signal degradation and an even distribution of RF energy throughout the seating bowl. The result is even access point loading and increased system capacity.

An additional benefit of embedding the access points in the crowd is that the crowd effectively constrains the output of the access point, much as a wall does in a typical building. Each radio therefore hears fewer of its neighbors, allowing each channel to be re-used more effectively. And because the crowd provides an effective mechanism for controlling the spread of RF energy, the radios can be operated at higher power levels, which improves the link between the access point and the fan’s device. The result is more uniform system loading, higher average data rates, increased channel re-use, and increased total system capacity.

While Proximate Networks are still a relatively new concept, the early data (and a rapidly growing number of fast followers) confirms that if you want the densest possible network with the largest possible capacity, a Proximate Network is what you need.

The Financials: picking what’s right for you

From the foregoing essay, you might conclude that the author’s recommendation is to deploy a Proximate Network. However, that is not necessarily the case. If you want the densest possible network with the largest possible capacity, then a Proximate Network is a good choice. But there are merits to each approach described and a cost benefit analysis should be performed before a deployment approach is selected.

For many venues, Overhead Deployments remain the most cost-effective way to provide coverage. In smaller venues, and in venues where system utilization is expected to be low, an Overhead Deployment can be ideal.

Front-to-Back deployments work well in venues where system utilization is low and the available overhead mounting assets can’t cover all areas. The goal of these deployments is ensuring usable coverage, not maximizing total system capacity.

In-Fill deployments are a good compromise between a coverage-centric high density approach and a capacity-centric approach. This approach is best suited to venues that need more total system capacity, but have budget constraints that prevent selecting a Proximate approach.

Proximate deployments provide the maximum possible wireless density for venues where connectivity is considered to be a critical part of the venue experience.

Conclusion

If your venue is contemplating deploying a high density network, ask your integrator to walk you through the expected system demand, the calculation of system capacity for each approach, and finally the cost of each approach. Make sure you understand their assumptions. Then, select the deployment model that meets your business requirements — there is no “one size fits all” when it comes to stadium Wi-Fi.

Bill Anderson, AmpThink

Bill Anderson has been involved in the design and construction of wireless networks for over 20 years, pre-dating Wi-Fi. His first experience with wireless networking was as a software developer building software for mobile computers communicating over 400 MHz and 900 MHz base stations developed by Aironet (now part of Cisco Systems).

His work with mobile computing and wireless networks in distribution and manufacturing afforded him a front-row seat to the emergence of Wi-Fi and its transformation from a niche technology to a business-critical system. Since 2011, at AmpThink, Bill has been actively involved in constructing some of the largest single-venue wireless networks in the world.

Footnotes

^ 1. A proxy for overall system capacity can be developed by multiplying the average speed of communication of all clients on a channel (avg data rate or speed) by the number of channels deployed in the system (available spectrum) by the number of times we can use each channel (channel re-use), or [speed x spectrum x re-use]. While there are many other parameters that come into play when designing a high density network (noise, attenuation, reflection, etc.), this simple equation helps us understand how we approach building networks that can support a large number of connected devices in an open environment, e.g. the bowl of a stadium or arena. A short worked example follows these footnotes.

^ 2. Co-channel interference refers to a scenario where multiple access points are attempting to communicate with client devices using the same channel. If a client or access point hears competing communication on the channel it is attempting to use, it must wait until that communication is complete before it can send its message.

^ 3. Access Point is the term used in the Wi-Fi industry to describe the network endpoint that client devices communicate with over the air. Other terms used include radio, AP, or WAP. In most cases, each access point is equipped with 2 or more physical radios that communicate on one of two bands – 2.4 GHz or 5 GHz. HD Wi-Fi deployments are composed of several hundred to over 1,000 access points connected to a robust wired network that funnels guest traffic to and from the internet.

^ 4. While there is no hard and fast rule, most industry experts agree that a single access point can service between 50 and 100 client devices.

^ 5. Venues often use pressure washers to clean a stadium after a big event.

^ 6. Unlike cellular systems, which can dictate which mobile device attaches to each network node, at what speed, and when it can communicate, Wi-Fi relies on the mobile device to make those decisions. When presented with a handrail access point and an overhead access point, mobile devices often hear the overhead placement better and therefore prefer it. In In-Fill deployments, this often results in a disproportionate number of client devices selecting overhead placements. The problem can be managed by lowering the power level on the overhead access point, at the expense of degrading the experience of the devices that the designer intended to attach to it.
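
To make the capacity proxy in footnote 1 concrete, here is a minimal Python sketch. The data rates, channel count, and re-use factors below are invented placeholders chosen only to show how the three terms interact; they are not measurements from any particular venue or deployment.

```python
# A minimal sketch of the capacity proxy in footnote 1: speed x spectrum x re-use.
# Every input value is an invented placeholder, not a figure from a real deployment.

def capacity_proxy(avg_client_rate_mbps: float, channels: int, reuse: float) -> float:
    """Rough aggregate-capacity estimate in Mbps: speed x spectrum x re-use."""
    return avg_client_rate_mbps * channels * reuse

# Hypothetical comparison: an overhead design with modest client data rates and
# little channel re-use versus a proximate design with faster clients and tighter re-use.
overhead = capacity_proxy(avg_client_rate_mbps=12, channels=20, reuse=2)
proximate = capacity_proxy(avg_client_rate_mbps=24, channels=20, reuse=5)

print(f"Overhead-style estimate:  {overhead:,.0f} Mbps aggregate")
print(f"Proximate-style estimate: {proximate:,.0f} Mbps aggregate")
```

Under these assumed inputs the proximate-style design comes out roughly five times larger, which is the kind of gain the deployment strategies described above were chasing.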

AT&T Stadium sees 7.25 TB of Wi-Fi for Packers vs. Cowboys playoff game

The Dallas Cowboys before taking the field against the Green Bay Packers in a Jan. 15 playoff game. Credit: James D. Smith/Dallas Cowboys

Pro football’s biggest stadium had the biggest non-Super Bowl Wi-Fi traffic day we’ve heard of this season, as the Dallas Cowboys reported seeing 7.25 terabytes of Wi-Fi data on the AT&T Stadium network during the Packers’ thrilling 34-31 victory on Jan. 15.

John Winborn, chief information officer for the Dallas Cowboys, sent us the info on the stadium’s biggest Wi-Fi day ever, surpassing the previous record of 6.77 TB seen on the AT&T Stadium Wi-Fi network for WrestleMania 32 back on April 3, 2016. The new Wi-Fi record was also set by fewer fans, with attendance for the Jan. 15 playoff game at 93,396, compared to the 101,763 at WrestleMania.

Though he didn’t provide an exact number, Winborn also said that the take rate of unique clients on the Wi-Fi network for the Packers game was 50 percent of attendees, roughly 46,700, easily one of the biggest numbers we’ve seen anywhere. During the Cowboys’ excellent regular season, Winborn said the average Wi-Fi data used per game was 5.28 TB, an increase of 33 percent over the 2015 season.
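
For readers who want to check the math, here is a small Python sketch using the figures reported above. The per-fan average it derives is our own back-of-the-envelope calculation, not a number provided by the Cowboys.

```python
# Back-of-the-envelope math using the reported figures for the Jan. 15 playoff game.
attendance = 93_396        # reported attendance
unique_clients = 46_700    # roughly 50 percent take rate, per the Cowboys
wifi_data_tb = 7.25        # total Wi-Fi data reported for the game

take_rate = unique_clients / attendance
avg_mb_per_client = wifi_data_tb * 1_000_000 / unique_clients  # decimal TB -> MB

print(f"Take rate: {take_rate:.0%}")                                         # ~50%
print(f"Average Wi-Fi data per connected fan: ~{avg_mb_per_client:.0f} MB")  # ~155 MB
```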

UPDATE: The AT&T folks have provided the DAS stats for the same game, with an additional 3 TB of data used on the AT&T cellular networks inside the stadium. So we’re up to 10.25 TB for a non-Super Bowl game… doubt we will get any other carriers to add their totals but sounds to me like this is the biggest non-Super Bowl event out there in terms of total data.

Any other NFL teams (or college teams) out there with peak games and/or season averages, send them in! Let’s keep updating this list!

THE NEW TOP 7 FOR WI-FI

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
3. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
4. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
5. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
6. Alabama vs. Texas A&M, Kyle Field, College Station, Texas, Oct. 17, 2015: Wi-Fi: 5.7 TB
7. Pittsburgh Steelers vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 22, 2017: Wi-Fi: 5.11 TB

Arizona State upgrades DAS, Wi-Fi at Sun Devil Stadium

Sun Devil Stadium at Arizona State. Credit all photos: ASU

When Arizona State University started renovating Sun Devil Stadium three years ago, the project wasn’t so much a simple wireless refresh as it was a total reset of what technology, sports and academia could co-create.

In addition to expanded Wi-Fi and DAS for the stadium (a venue that includes classrooms, meeting rooms and retail outlets), ASU activated a virtual beacon trial. The university also joined up with Intel to explore how Internet of Things devices might yield better environmental information about the bowl, including acoustic data, Jay Steed, assistant vice president of IT operations, told Mobile Sports Report.

The university’s IT department understood that a richer fan experience for football and other events would require a robust network. Steed and his colleagues visited other venues like Levi’s Stadium, AT&T Stadium, Stanford and Texas A&M to get a better handle on different approaches to networking, applications and services.

Regardless, some sort of refresh was overdue. Wedged between two buttes in the southeastern Phoenix suburb of Tempe, the 71,000-seat Sun Devil Stadium was completed in 1958 and needed infrastructure and technology updates. Wi-Fi access was limited to point-of-sale systems and stadium suites; fans generally relied on a DAS network.

Time for an upgrade

Editor’s note: This profile is from our latest STADIUM TECH REPORT, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace. Read about the Sacramento Kings’ new Golden 1 Center and the new Wi-Fi network for the Super Bowl in our report, which is available now for FREE DOWNLOAD from our site!

“The stadium needed a lot of facelifting, not just from a technology perspective but also for the fan experience, like ADA compliance and overall comfort,” Steed said. “We didn’t just want to rebuild a venue for six football games a year, but extend its availability to 365 days and make it a cornerstone and anchor for the whole campus.”

The ‘Inferno’ student section got a priority for better connectivity.

The reset included tearing out the lower bowl to “punch some new holes” — new entry points to the stadium — and to add conduits and cabling for the new 10-gigabit fiber backbone for the stadium. The network can be upgraded as needed to 40- and even 100-gigabit pipes, according to Steed.

“We wanted to make sure it could support fans’ [connectivity needs] and all the facility’s operations with regard to video and StadiumVision, learning and education, and Pac-12 needs as well,” he said.

The overall stadium renovation was budgeted at $268 million; the technology upgrades will total about $8 million.

The university added 250 new DAS antennas. The vendor-neutral network includes AT&T, Verizon, Sprint and T-Mobile, which share 21 DAS sectors to keep cell service humming inside the stadium.

On the Wi-Fi side, ASU opted for Cisco’s access points. The networking vendor was already entrenched across the 642-acre campus; Steed and the IT department prefer the simplicity of a single-vendor network. Cisco helped with the hardware and RF engineering for Sun Devil Stadium. CenturyLink offered guidance on the networking and fiber pieces of the project, while Hunt-Sundt, a joint venture, was the contractor for most of the physical construction.

Wireless service for ‘The Inferno’

When the renovation is officially completed later in 2017 (most of the network is already live), there will be 1,100 APs in and around Sun Devil Stadium. The student sections, also known as The Inferno, get more APs and bandwidth since historical data has shown students to be the biggest bandwidth consumers in the stadium. Consequently, the ratio in the student sections is one AP to every 50 users; the rest of the bowl’s APs each handle about 75 users on average, Steed said.
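
As a rough illustration of how such user-per-AP ratios translate into AP counts, here is a short Python sketch. Only the 1:50 and 1:75 ratios come from Steed’s figures; the seat split and the assumed concurrent take rate are hypothetical placeholders, not ASU’s numbers.

```python
# Illustrative AP sizing from users-per-AP ratios. The 1:50 (student) and 1:75
# (general bowl) ratios come from the article; the seat splits and the assumed
# concurrent take rate are hypothetical placeholders, not ASU figures.
import math

def aps_for_section(seats: int, concurrent_take_rate: float, users_per_ap: int) -> int:
    """Estimate APs for a section from its expected concurrent user count."""
    expected_users = seats * concurrent_take_rate
    return math.ceil(expected_users / users_per_ap)

student_aps = aps_for_section(seats=12_000, concurrent_take_rate=0.40, users_per_ap=50)
general_aps = aps_for_section(seats=59_000, concurrent_take_rate=0.40, users_per_ap=75)

print(f"'Inferno' student sections: {student_aps} APs")
print(f"Rest of the bowl: {general_aps} APs")
print(f"Estimated bowl total: {student_aps + general_aps} APs")
```

With these placeholder inputs the estimate lands in the same general ballpark as the roughly 450 under-seat bowl APs mentioned below, though a real design also has to account for coverage geometry and spectrum re-use, not just client counts.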

Breakaway look at an under-seat AP

ASU’s new Wi-Fi network was engineered to deliver 1.5 Mbps upstream and 3 Mbps downstream, but Steed said so far users are getting better performance – 8 Mbps up and 12 Mbps down. “We’re getting about 25 percent saturation,” he added. “Many users put their phones away during the games, but we see spikes at halftime and during commercial breaks.” Regardless, ASU continually monitors Wi-Fi and DAS usage and adjusts bandwidth as needed.

Another big challenge is the desert climate – temperatures regularly soar into triple digits. With about 450 under-seat APs in the bowl, Steed and his team had to make sure the enclosures could withstand heat and didn’t obstruct the walkways. “We’ll see how well the electronics do, baking at 120 degrees six months out of the year,” he laughed.

ASU is also working with Intel, using the stadium’s APs as part of an Internet of Things trial. As Steed described it, IoT sensors work alongside stadium APs to measure temperature, noise, vibration and other environmental data. “We also look at lighting control and water distribution and flow,” he said.

Concourses also got expanded Wi-Fi and DAS coverage.

Automating the control of environmental functions like heating, cooling, power usage and facilities management will help the university toward its goal of being carbon-neutral by 2025, Steed added. The trials are designed so that the technology can be expanded across the university, possibly for campus transportation kiosks or student concierge services. IoT devices could give students and visitors information about adjacent buildings or landmarks around campus.

Separate but related, the university is also testing cloud-based, Bluetooth low energy (BLE) technology from Mist Systems. These “virtual beacons” use sensors attached to an AP to flag information or a point of interest for students or stadium visitors. “The virtualized beacon technology helps us understand where people are walking around and what they’re looking at in the stadium and elsewhere around campus,” Steed said.

The beacons are currently being tested in some of Sun Devil Stadium’s suites; Steed foresees expanding the trial to the student union to help guide people to meeting rooms, retail facilities or food vendors, for example.

Steed credited excellent communication and collaboration among the university’s athletic and IT departments and other players in the upgrade equation. “Our athletic director, Ray Anderson, brought the CIO and me into his office and really worked together with us,” he explained. “The biggest piece of our success was knowing that the AD supported our recommendations and brought us in as valued advisors.”

New Report: First look at Sacramento’s Golden 1 Center

MOBILE SPORTS REPORT is pleased to announce the Winter 2016-2017 issue of our STADIUM TECH REPORT series, with a first look at the pervasive stadium technology built into the Sacramento Kings’ new home, the Golden 1 Center.

Also in our latest in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace is a profile of a new Wi-Fi deployment at the Indiana Pacers’ Bankers Life Fieldhouse, and a profile of new Wi-Fi and DAS networks deployed at Arizona State’s Sun Devil Stadium. We also provide an update on how the new Wi-Fi network at Houston’s NRG Stadium is getting ready for the upcoming Super Bowl LI.

Renting a Wi-Fi network?

In addition to our historical in-depth profiles of successful stadium technology deployments, our fourth issue for 2016 has additional news and analysis, including a look at whether or not stadiums will soon be able to lease their Wi-Fi networks. Download your FREE copy today!

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, Crown Castle, SOLiD, CommScope, JMA Wireless, Corning, Samsung Business, Xirrus, Huber+Suhner, ExteNet Systems, and Extreme Networks. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to thank you for your interest and support.

As always, we are here to hear what you have to say: Send me an email to kaps@mobilesportsreport.com and let us know what you think of our STADIUM TECH REPORT series.

Vikings hit peak of 4.32 TB for Wi-Fi use at U.S. Bank Stadium, with average 43 percent take rate

Game day at U.S. Bank Stadium. Credit all photos: Vikings.com

While the football season may not have gone exactly to Vikings fans’ wishes, the Wi-Fi network at U.S. Bank Stadium performed well during its inaugural NFL season, with a peak single-game data total of 4.32 terabytes used, part of a season average of 2.89 TB per Vikings game.

According to statistics provided to MSR by Tod Caflisch, vice president and chief technical officer for the Vikings, the biggest data-use day was Sept. 18, 2016, during the regular-season home opener for the Vikings against the rival Green Bay Packers, a 17-14 Vikings victory. That contest also saw season highs for unique Wi-Fi users, with 31,668 fans connecting to the Wi-Fi at some point of the game day, and for most concurrent users, with 17,556 users connected at the same time. The 31,668 figure represented a 49 percent take rate against the game’s reported attendance of 64,786.

Even though Caflisch said the Vikings didn’t heavily promote the AmpThink-designed Wi-Fi network — which uses Cisco Wi-Fi gear in mostly handrail-mounted AP locations to serve the main bowl seating areas — the average take rate during the season was at the high end of numbers we’ve seen, with a 43 percent average over the two preseason and eight regular-season Vikings games.

And even though the total data-used number only crested 3 TB one other time in the season — a 3.16 TB mark during a 30-24 Vikings win over the Arizona Cardinals on Nov. 20, 2016 — the average mark of 2.89 TB per game showed solid, consistent use.

Caflisch said that the Vikings and U.S. Bank Stadium were also able to correct the train-snafu issue that arose at some of the early events at the new venue, which has a light-rail station right outside the stadium doors. While some of the first events had big lines of riders and not enough trains, Caflisch said that during the season extra trains were held in reserve at the transit station that is close to Target Field (a few stops down the line from U.S. Bank) and then filtered in as Vikings games neared their end.

“We were able to clear the [train] platform in 40 minutes after the last game,” Caflisch said. “The fans really loved the trains.” (More U.S. Bank Stadium images below)

Vikings fans gather outside the stadium for pregame activities.

Great nighttime view with city skyline visible through windows.

A look at the handrail Wi-Fi antenna mounts (this photo credit: Paul Kapustka, MSR)

Will stadiums soon be able to rent their Wi-Fi networks from equipment vendors?

Nationwide Arena. Credit: Columbus Blue Jackets

If it costs too much to buy a Wi-Fi network for your stadium, why not rent one instead?

A fairly common option in the world of enterprise networking, the ability to rent, or lease, a fully operational Wi-Fi network may soon be coming to the world of sports stadiums and other large public venues, if it isn’t here already. Three of the largest Wi-Fi gear suppliers, Cisco, Aruba and Ruckus, already publicly offer network leasing arrangements, where venue owners pay a monthly or recurring fee for network setup and operation instead of buying equipment outright. Cisco is also rumored to be offering full-control lease-type arrangements for stadium Wi-Fi networks, possibly beginning with the new Wi-Fi network being built at the SAP Center in San Jose.

Though no large sports stadium has yet publicly announced a deal to lease (or, in networking lingo, to buy a “Network as a service,” or NaaS), the idea is potentially attractive to stadium owners and operators, many of whom have struggled with the return-on-investment question ever since the idea of putting wireless networks in stadiums first emerged. While cellular carriers have so far borne the lion’s share of the costs of deploying enhanced cellular systems like DAS (distributed antenna system) in stadiums, the question of “who will pay for the Wi-Fi” is still a big one for many venues, especially those that are filled only several times a year.

The benefits of moving to opex vs. capex

Bart Giordano, vice president of business development for the Ruckus business unit at Brocade, said the idea of leasing a stadium network could be attractive especially to venue owners who don’t have the upfront capital necessary to pay for Wi-Fi, a cost that could run into the tens of millions of dollars.

Under new parent Brocade, Ruckus Wi-Fi gear can be obtained via something called the Brocade Network Subscription, a NaaS program that Giordano said “converts it all to opex — you subscribe to the network and pay a monthly service fee.” Under the subscription program, Giordano said Brocade/Ruckus will actually own the equipment, allowing the venue owner the flexibility of being able to return it or upgrade it as needed.

With many stadiums that deployed Wi-Fi several years ago already going through significant upgrades, the idea of a leased network that could be more easily refreshed with new technology might soon gain favor. Though no Ruckus stadium subscribers have yet been announced, Giordano said “some are coming.”

Aruba, now a Hewlett Packard Enterprise company, has a similar subscription model plan for enterprise wireless deployments, one which company representatives said could be used by stadiums as well. Both companies said such deals could possibly come via consulting partnerships, where the consultant firm manages the relationship and deployment/operation details.

Cisco also has a leasing option available for wireless networks, but so far has not made any public announcements of such deals in the sports stadium marketplace. However, there are reports of Cisco taking a more active role in the ownership, deployment and operation of stadium networks, like the Cisco-powered Wi-Fi currently being installed at the SAP Center in San Jose, home of the NHL’s Sharks. So far, neither Cisco nor the Sharks will comment on any business specifics of the new Wi-Fi network other than its use of Cisco gear.

Can leasing work for stadiums?

While the leasing idea for stadiums isn’t new, the business model has met some challenges over the short history of wireless networks in large venues. So far, third-party integrators like Mobilitie, ExteNet Systems and Boingo have crafted lease-like deals in which the venue does not pay the full cost of the network but instead allows the operators to run the networks (typically both DAS and Wi-Fi), earning money by leasing space on those networks to wireless carriers or by selling advertising or sponsorships.

Another leasing model, one that crashed and burned, was the one employed by SignalShare, a company now in bankruptcy proceedings with legal claims of fraudulent business practices against it. SignalShare, which also offered venues networks for a monthly cost, may have been hampered by a lack of financial resources, something that shouldn’t be an issue for companies the size of Cisco, HP and Brocade, who will mainly be offering leases on equipment they manufacture themselves. The larger equipment vendors may also not be under as much pressure as SignalShare was to earn revenues on the network operations, which may make them better able to succeed in the NaaS space.

And while the idea sounds good in theory, there are still unanswered questions about how the leases would work, and whether they will make good business sense for both sides. Unlike enterprise operations in traditional offices, stadium networks are far more complex to install and operate, especially those being retrofitted in stadiums built decades ago. Stadium networks also have a much different operational profile, with traffic coming in large spikes rather than daily workday routines.

But stadium networks can also act as public advertisements of sorts, gaining more attention for vendors in PR than perhaps in direct profits. As the market matures and vendors seek out potential customers who have shied away from Wi-Fi in the past due to upfront costs, leasing may be a way forward for both sides — as long as both can find a benefit to the deal.