Changes ahead for DAS industry business models, technology

JMA Wireless shows ‘smart’ trash bins at DAS and Small Cells Congress in Las Vegas. Credit all photos: Paul Kapustka, MSR

LAS VEGAS — New technologies combined with the need for new business models are driving imminent changes to the distributed antenna system (DAS) marketplace, according to industry representatives speaking Tuesday at this year’s DAS and Small Cells Congress here.

And while the end product of the market transformation is still uncertain, executives from DAS gear manufacturers, cellular carriers and other industry experts all agreed on one thing: In the near future, the DAS industry won’t look at all like it does today.

For large public venue owners specifically, the days of carrier-funded DAS deployments may already be at an end, unless your stadium is in line to host a Super Bowl. Tightening budgets at the nation’s biggest cellular carriers mean that the recent years of free spending by AT&T and Verizon Wireless may already be over, putting more pressure on venue owners to find different financial models to bring cellular signals inside their buildings.

Cathedral Consulting’s Seth Buechley

“There was never a problem I couldn’t throw more money at,” said Philip French, executive director for the West and North Central areas for Verizon, during a Tuesday keynote session at the Planet Hollywood hotel. “Those days are gone.”

Also putting pressure on traditional DAS designs is the emergence of small cells, basically smaller versions of carrier macro towers that, like DAS, are used primarily to bring connectivity inside buildings or to urban areas with RF challenges, like crowded city streets. Experiments with newer “5G” cellular technologies and trials of networks at newer slices of spectrum, like the Citizens Broadband Radio Service (CBRS) at 3.5 GHz, may also impact traditional DAS architectures as carriers and building owners look for ways to get more connectivity bang for the buck.

Getting more worth out of the network

Seth Buechley, chairman and CEO of business-advisory firm Cathedral Consulting (and former co-founder of DAS equipment provider SOLiD USA), said that the biggest cellular carriers are under increasing pressure to improve their bottom lines, a situation that could affect the DAS industry by drying up the funds previously used to bring DAS deployments to places like stadiums and arenas. AT&T, for example, has already disbanded the internal group that led an industry charge to bring DAS to many sports venues at no charge to teams or facility owners.

“Internal [carrier] competition for resources is the biggest threat to DAS,” Buechley said.

In his remarks, Verizon’s French noted that the “unlimited” data plans that have resurfaced for major carriers like Verizon are putting “a tremendous amount of pressure” on budgets. Another current popular DAS business model, where a third-party operator builds a stadium network and then signs up carriers on a subscription model, may also be in danger as carriers hold off on participating. At Texas A&M, T-Mobile recently signed a $3.5 million deal to get its signals on the DAS network at 102,512-seat Kyle Field, where AT&T and Verizon both paid in the neighborhood of $5 million for their access to the network.

Todd Landry, JMA Wireless

Unless your facility is that big or it’s getting ready to host a big event like WrestleMania or the Super Bowl, where DAS traffic is likely to be off the charts, the carriers may not be as ready to pay.

“We still love the NFL, but neutral host [participation] can be very expensive for Verizon,” French said.

More network intelligence = more revenue opportunity

Todd Landry, corporate vice president for product and market strategy at DAS supplier JMA Wireless, said the DAS industry needs to look at its own offerings to see how it can help its customers get more out of their networks.

“We’ve got to re-imagine what we’re trying to do,” said Landry. “What do we do with the network to get more out of it?”

Specifically, Landry sees advancements in DAS network intelligence as a prime opportunity to provide more value rather than simply cutting costs. At the conference, JMA was showing a prototype of a “DAS trash can,” a hardened waste bin (with solar power) that could also host a DAS antenna inside. Another attached bin was shown with a connected sensor that could tell operators whether the can was full or not, eliminating the need for multiple truck rolls just to check on whether the bin needed to be emptied.

DAS gear inside the ‘smart’ trash can

For stadiums and other public spaces like shopping malls, Landry said parking spots might have sensors that could indicate whether or not a spot was available — and then relay that information to a self-driving car, which could drop off its passengers at the venue, then proceed on to park itself. Such a service could be offered for a fee to game or mall attendees.

“As we go forward, we need [to] be more clever,” Landry said. “We need to take more knowledge [from] the plumbing, and extract value from it.”

And even while technologies like “5G” and CBRS, which uses LTE technology to provide what proponents see as a sort of “private cellular” environment, may be a few years off from practical deployments, Landry said their presence is already being felt and absorbed by firms building current-day DAS gear. Elements of small cells and DAS, he said, “will come together,” as the equipment vendors “re-imagine what we’re doing for the industry.”

While there may be multiple paths forward for the DAS market, all in attendance seemed to agree with Landry’s final statement: “Things will be very different from what you know today.”

‘Right opportunity’ led Rushton from IBM to LA Chargers job

Jim Rushton

Jim Rushton, who held one of the most high-profile jobs in the sports network business market as leader of the stadium-technology group at IBM, said it was a “once in a lifetime chance” at the “right opportunity” that led him to leave Big Blue to become the new chief revenue officer for the Los Angeles Chargers.

Rushton, who started his new job this week, spoke with Mobile Sports Report on the phone last week after what he described as a “whirlwind” of activity that ended with him in one of the top-level business spots for the former San Diego Chargers, who are in the midst of a move up the coast.

“The chance to help rebuild and evolve an NFL franchise in a market like Los Angeles doesn’t come up very often,” said Rushton. “It really was a once in a lifetime career opportunity.”

Part of that opportunity will be to help figure out how to remake the Chargers franchise as a joint tenant at the yet-to-be-built Los Angeles NFL stadium, a venue being built by LA Rams owner Stan Kroenke. Rushton said that fan data and analytics will be a “massive part” of his new purview, and that as a partner in the stadium operations the Chargers will be part of “joint decisions” on technology matters inside the new venue.

Rushton, who held a similar position with the NFL’s Miami Dolphins before moving to IBM, said his post with the Chargers will have more responsibilities.

On his short but productive IBM tenure — during which IBM came from pretty much nowhere to become one of the leaders in the stadium-networking integration space — Rushton said he felt he was leaving the operation much improved from its inception.

With IBM-led deployments at Texas A&M, Atlanta’s new Mercedes-Benz Stadium and the forthcoming LA Football Club venue leading the way, Rushton said IBM’s tech-integration business now has a “significant [deal] pipeline across the world.”

One of the more interesting features of Rushton’s new job is the fact that the Chargers will play home games the next two seasons at the StubHub Center, a 27,000-seat soccer stadium in Carson, Calif., that will become the NFL’s smallest venue starting this fall. Though it’s not clear whether or not the stadium will improve its technology offerings before the Chargers play, Rushton was excited by the prospect of a scaled-down experience.

“It’s going to be terrific — it’s like having only premium seats, because everything will be lower bowl,” Rushton said.

LA Chargers name IBM’s Jim Rushton as chief revenue officer

Jim Rushton

Jim Rushton, the former Miami Dolphins exec who became the leader of IBM’s stadium technology-integration program, has now joined the Los Angeles Chargers as the team’s chief revenue officer, according to a post on the team’s website.

We’re still trying to reach Rushton for more info, and it’s unclear what effect his departure will have on IBM’s stadium networking business, which directed projects at Texas A&M as well as at the Atlanta Falcons’ new Mercedes-Benz Stadium, due to open later this year. IBM was also tapped to lead technology deployments at the forthcoming Los Angeles Football Club soccer facility.

College Football Playoff championship sees 2.4 TB of Wi-Fi — big decline from 2016

We finally have numbers for the Wi-Fi usage at the most recent College Football Playoff championship game, and in a first of sorts, the total data used during the event was much lower than at the previous year’s game: just 2.4 terabytes of data used on Jan. 9 at Raymond James Stadium in Tampa, Fla., compared to 4.9 TB of Wi-Fi used at the 2016 championship game, held at the University of Phoenix Stadium in Glendale, Ariz.

The reason for the decline is probably not due to any sudden dropoff in user demand, since usage of in-stadium cellular or DAS networks increased from 2016 to 2017, with AT&T’s observed network usage doubling from 1.9 TB to 3.8 TB in Tampa. More likely the dropoff is due to the fact that the Wi-Fi network at the University of Phoenix Stadium had been through recent upgrades to prepare for both the college championship game and Super Bowl XLIX, while the network in Raymond James Stadium hasn’t seen a significant upgrade since 2010, according to stadium officials. At last check, the Wi-Fi network at University of Phoenix Stadium had more than 750 APs installed.

Joey Jones, network engineer/information security for the Tampa Bay Buccaneers, said the Wi-Fi network currently in use at Raymond James Stadium has a total of 325 Cisco Wi-Fi APs, with 130 of those in the bowl seating areas. The design is all overhead placements, Jones said in an email discussion, with no under-seat or handrail enclosure placements. The total unique number of Wi-Fi users for the college playoff game was 11,671, with a peak concurrent connection of 7,353 users, Jones said.

Still tops among college playoff championship games in Wi-Fi is the first one held at AT&T Stadium in 2015, where 4.93 TB of Wi-Fi was used. Next year’s championship game is scheduled to be held at the new Mercedes-Benz Stadium in Atlanta, where one of the latest Wi-Fi networks should be in place and operational.

Seahawks see big jump in Wi-Fi usage at CenturyLink Field for 2016-17 season

The Seattle Seahawks saw almost every metric associated with the Wi-Fi network at CenturyLink Field just about double from the 2015-16 to the 2016-17 NFL regular season, according to statistics provided by the team.

Chip Suttles, vice president of technology for the Seahawks, sent us over some excellent season-long charts of Wi-Fi activity, including unique and concurrent-user peaks, top throughput, and a couple of comparison charts mapping this most recent season’s activity compared to that a year before.

With a capacity crowd attendance total of 69,000, the always sold-out CenturyLink saw a take rate nearing 50 percent for most of the season, with a top unique-user number of 35,808 for a 31-25 win over the Buffalo Bills on Nov. 7. Interestingly, the biggest day for total data usage wasn’t the Bills game (3.259 terabytes) but a 26-15 win over the Philadelphia Eagles on Nov. 20, when the Wi-Fi network saw 3.526 TB of data used. If you look at the comparative graphs, both peak usage and total usage numbers pretty much doubled what was seen the year before.
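For readers who want to reproduce the math, here is a quick back-of-the-envelope check of those peaks. The inputs are the figures reported above; the calculation itself is just illustrative:

```python
# Back-of-the-envelope take-rate check using the Seahawks' reported figures.
capacity = 69_000        # CenturyLink Field capacity (sold out all season)
bills_unique = 35_808    # top unique-user count (Bills game, Nov. 7)
bills_data_tb = 3.259    # total Wi-Fi data for that same game, in TB

take_rate = bills_unique / capacity                # fraction of fans on Wi-Fi
per_user_mb = bills_data_tb * 1e6 / bills_unique   # rough average per user, MB

print(f"Peak take rate: {take_rate:.1%}")          # ~51.9%
print(f"Avg data per unique user: {per_user_mb:.0f} MB")  # ~91 MB
```

The peak game slightly exceeds the "nearing 50 percent" season-long figure, which is consistent with it being the high-water mark.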

According to Suttles, there wasn’t much in the way of upgrades to the Extreme Networks-based network before this past season — just some firmware and software updates, and “about a half-dozen” new APs to support additional seating added in the south end zone area. “Overall, I think it [the data increases] is more to do with familiarity,” Suttles said. Thanks to Chip and the Seahawks for sharing the numbers!


From overhead to under seat: A short history of the hows and whys of stadium Wi-Fi network design

Wi-Fi handrail enclosures at U.S. Bank Stadium, Minneapolis, Minn. Credit: Paul Kapustka, MSR


By Bill Anderson, AmpThink

The history of high density (HD) Wi-Fi deployments in stadiums and arenas is short. Yet the amount of change that has occurred is significant, both in terms of how these networks are deployed and why.

Venue operators, manufacturers, and integrators are still grappling with the particulars of HD Wi-Fi in large open environments, even though there are a substantial number of deployed high-quality implementations. Below, I’ve shared our perspective on the evolution of HD Wi-Fi design in stadiums and arenas and put forth questions that venue operators should be asking to find a solution that fits their needs and their budget.

AmpThink’s background in this field

Over the past 5 years, our team has been involved in the deployment of more than 50 high-density Wi-Fi networks in stadiums throughout North America. In that same period, the best practices for stadium HD Wi-Fi design have changed several times, resulting in multiple deployment methodologies.

Each major shift in deployment strategy was intended to increase total system capacity [1]. The largest gains have come from better antenna technology or deployment techniques that better isolated access point output, resulting in gains in channel re-use.

What follows is a summary of what we’ve learned from the deployments we participated in and their significance for the future. Hopefully, this information will be useful to others as they embark on their journeys to purchase, deploy, or enhance their own HD Wi-Fi networks.

In the beginning: All about overhead

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.


Designers of the first generation of HD Wi-Fi networks were starting to develop the basic concepts that would come to define HD deployments in large, open environments. Their work was informed by prior deployments in auditoriums and convention centers and focused on using directional antennas. The stated goal of this approach was to reduce co-channel interference [2] by reducing the effective footprint of an individual access point’s [3] RF output.

However, the greatest gains came from improving the quality of the link between clients and the access point. Better antennas allowed client devices to communicate at faster speeds, which decreased the amount of time required to complete their communication, making room for more clients on each channel before a given channel became saturated or unstable.

Under seat Wi-Fi AP at Bank of America Stadium. Credit: Carolina Panthers


The concept was simple, but limited by the fact that there were few antennas available that could do the job effectively. Creative technicians built hybrid assemblies that combined multiple antennas into arrays that rotated polarization and tightened the antenna beam to paint the smallest usable coverage pattern possible. In time, this gap was addressed, and today there are antennas specifically developed for use in overhead HD deployments – Stadium Antennas.

Typically, Stadium Antennas are installed in the ceilings above seating and/or on the walls behind seating because those locations are relatively easy to cable and minimize cost. We categorize these deployments as Overhead Deployments.

From overhead to ‘front and back’

First generation overhead deployments generally suffer from a lack of overhead mounting locations to produce sufficient coverage across the entire venue. In football stadiums, the front rows of the lower bowl are typically not covered by an overhang that can be used for antenna placement.

These rows are often more than 100 feet from the nearest overhead mounting location. The result is that pure overhead deployments leave some of the most expensive seats in the venue with little or no coverage. Further, due to the length of these sections, antennas at the back of the section potentially service thousands of client devices [4].
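To see why those rear-of-section placements get swamped, consider a rough sizing sketch. Every input below is a hypothetical assumption for illustration; the 50-100 clients-per-AP rule of thumb comes from footnote 4:

```python
# Rough estimate of client load on rear-of-section APs in a pure overhead
# design. All inputs are illustrative assumptions, not measured values.
rows, seats_per_row = 40, 60   # hypothetical long lower-bowl section
take_rate = 0.5                # assumed fraction of fans joining the Wi-Fi
overhead_aps = 2               # APs at the back of the section

seats = rows * seats_per_row                  # 2,400 seats
active_clients = int(seats * take_rate)       # ~1,200 devices on the network
clients_per_ap = active_clients / overhead_aps

print(clients_per_ap)  # 600 -- far beyond the 50-100 devices an AP handles well
```

Even with generous assumptions, each AP ends up carrying several times the load most industry experts consider serviceable, which is exactly the overload the article describes.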

As fans joined these networks, deployments quickly became over-loaded and generated service complaints for venue owners. The solution was simple — add antennas at the front of long sections to reduce the total client load on the access points at the back. It was an effective band-aid that prioritized serving the venues’ most important and often most demanding guests.

This approach increased the complexity of installation as it was often difficult to cable access points located at the front of a section.

And for the first time, antennas were placed where they were subject to damage by fans, direct exposure to weather, and pressure washing [5]. With increased complexity came increased costs, as measured by the average cost per installed access point across a venue.

Because these systems feature antennas at the front and rear of each seating section, we refer to these deployments as ‘Front-to-Back Deployments.’ While this approach solves specific problems, it is not a complete solution in larger venues.

‘Filling In’ the gaps

Data collected from Front-to-Back Deployments proved to designers that moving the antennas closer to end users:
— covered areas that were previously uncovered;
— increased average data rates throughout the bowl;
— used the available spectrum more effectively; and
— increased total system capacity.

The logical conclusion was that additional antennas installed between the front and rear antennas would further increase system capacity. In long sections these additional antennas would also provide coverage to fans that were seated too far forward of antennas at the rear of the section and too far back from antennas at the front of the section. The result was uniform coverage throughout the venue.

In response, system designers experimented with hand rail mounted access points. Using directional antennas, coverage could be directed across a section and in opposition to the forward-facing antennas at the rear of the section and rear-facing antennas at the front of a section. These placements filled in the gaps in a Front-to-Back Deployment, hence the name ‘In-Fill Deployment.’

While these new In-Fill Deployments did their job, they added expense to what was already an expensive endeavor. Mounting access points on handrails required that a hole be drilled through the concrete seating deck at each access point location to cable the installed equipment. With the access point and antenna now firmly embedded in the seating, devices were also exposed to more traffic and abuse. Creative integrators came to the table with hardened systems to protect the equipment – handrail enclosures. New costs included: using ground-penetrating radar to prepare for coring; enclosure fabrication; and more complex conduit and pathway considerations. A typical handrail placement could cost four times as much as a typical overhead placement, and a designer might call for 2 or 3 handrail placements for every overhead placement.
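The cost impact of in-fill can be sketched from the ratios above. The normalization is hypothetical (costs are expressed in units of one overhead placement), and real per-placement costs vary widely by venue:

```python
# Relative cost of adding handrail in-fill, using the ratios from the text:
# a handrail placement costs ~4x an overhead placement, and designs may call
# for 2-3 handrail placements per overhead placement.
overhead_cost = 1.0                # normalized cost of one overhead placement
handrail_cost = 4 * overhead_cost  # ~4x, per the text
handrails_per_overhead = 2.5       # midpoint of the 2-3 range

overhead_only = overhead_cost
with_infill = overhead_cost + handrails_per_overhead * handrail_cost

print(with_infill / overhead_only)  # 11.0 -- in-fill can ~11x per-zone cost
```

Under these assumptions, each coverage zone costs roughly an order of magnitude more once in-fill is added, which is why the essay calls it an expensive addition to an already expensive endeavor.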

Getting closer, better, faster: Proximate Networks

In-Fill strategies substantially solved the coverage problem in large venues. Using a combination of back of section, front of section, and hand-rail mounted access points, wireless designers had a tool box to deliver full coverage.

But with that success came a new problem. As fans discovered these high density networks and found new uses for them, demands on those networks grew rapidly, especially where teams or venue owners pushed mobile-device content strategies that added to the network load. In spite of well-placed access points, fan devices did not attach to the in-fill devices at the same rate that they attached to the overhead placements [6]. In-fill equipment remained lightly used while overhead placements absorbed hundreds of clients. Gains in system capacity stalled.

Close-up look at U.S. Bank Stadium railing enclosure during final construction phase, summer 2016. Credit: Paul Kapustka, MSR


To overcome uneven system loading, designers needed to create a more even distribution of RF energy within the deployment. That required a consistent approach to deployment, rather than a mix of deployment approaches. The result was the elimination of overhead antennas in favor of access points and antennas installed within the crowd, closest to the end user; hence the name ‘Proximate Networks.’

Proximate networks come in two variations: handrail only and under seat only. In the handrail-only model, the designer eliminates overhead and front-of-section placements in favor of a dense deployment of handrail enclosures. In the under-seat model, the designer places the access point and antenna underneath the actual seating (but above the steel or concrete decking). In both models, the crowd becomes an important part of the design. The crowd attenuates the signal as it passes through their bodies, resulting in consistent signal degradation and even distribution of RF energy throughout the seating bowl. The result is even access point loading and increased system capacity.

An additional benefit of embedding the access points in the crowd is that the crowd effectively constrains the output of the access point much as a wall constrains the output of an access point in a typical building. Each radio therefore hears fewer of its neighbors, allowing each channel to be re-used more effectively. And because the crowd provides an effective mechanism for controlling the spread of RF energy, the radios can be operated at higher power levels, which improves the link between the access point and the fan’s device. The result is more uniform system loading, higher average data rates, increased channel re-use, and increases in total system capacity.
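A simple link-budget sketch shows why crowd attenuation improves channel re-use. The free-space path loss formula is standard, but the transmit power, antenna gain, and per-body loss figures here are assumptions chosen purely for illustration:

```python
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in meters, frequency in MHz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

# Illustrative assumptions -- not measured or vendor-specified values.
TX_POWER_DBM = 14    # AP transmit power
ANT_GAIN_DBI = 3     # antenna gain
BODY_LOSS_DB = 3.5   # assumed attenuation per human body in the path
FREQ_MHZ = 5180      # 5 GHz channel 36

def rssi_dbm(distance_m: float, bodies_in_path: int) -> float:
    return (TX_POWER_DBM + ANT_GAIN_DBI
            - fspl_db(distance_m, FREQ_MHZ)
            - bodies_in_path * BODY_LOSS_DB)

# How loudly does a neighboring AP 30 m away sound, across an empty bowl
# versus through a crowd of fans?
empty = rssi_dbm(30, 0)     # ~-59.3 dBm: a strong co-channel interferer
crowded = rssi_dbm(30, 8)   # ~-87.3 dBm: weak enough to re-use the channel
print(f"empty bowl: {empty:.1f} dBm, crowded: {crowded:.1f} dBm")
```

A handful of bodies in the path drops the neighboring radio by tens of dB, which is the mechanism the essay describes: the crowd acts like interior walls, shrinking each AP's interference footprint and freeing the channel for re-use.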

While Proximate Networks are still a relatively new concept, the early data (and a rapidly growing number of fast followers) confirms that if you want the densest possible network with the largest possible capacity, then a Proximate Network is what you need.

The Financials: picking what’s right for you

From the foregoing essay, you might conclude that the author’s recommendation is to deploy a Proximate Network. However, that is not necessarily the case. If you want the densest possible network with the largest possible capacity, then a Proximate Network is a good choice. But there are merits to each approach described and a cost benefit analysis should be performed before a deployment approach is selected.

For many venues, Overhead Deployments remain the most cost-effective way to provide coverage. In smaller venues, and in venues where system utilization is expected to be low, an Overhead deployment can be ideal.

Front-to-Back deployments work well in venues where system utilization is low and the available overhead mounting assets can’t cover all areas. The goal of these deployments is ensuring usable coverage, not maximizing total system capacity.

In-fill deployments are a good compromise between a coverage-centric high density approach and a capacity-centric approach. This approach is best suited to venues that need more total system capacity, but have budget constraints that prevent selecting a Proximate approach.

Proximate deployments provide the maximum possible wireless density for venues where connectivity is considered to be a critical part of the venue experience.

Conclusion

If your venue is contemplating deploying a high density network, ask your integrator to walk you through the expected system demand, the calculation of system capacity for each approach, and finally the cost of each approach. Make sure you understand their assumptions. Then, select the deployment model that meets your business requirements — there is no “one size fits all” when it comes to stadium Wi-Fi.

Bill Anderson, AmpThink


Bill Anderson has been involved in the design and construction of wireless networks for over 20 years, pre-dating Wi-Fi. His first experience with wireless networking was as a software developer building software for mobile computers communicating over 400 MHz and 900 MHz base stations developed by Aironet (now part of Cisco Systems).

His work with mobile computing and wireless networks in distribution and manufacturing afforded him a front-row seat to the emergence of Wi-Fi and its transformation from a niche technology to a business-critical system. Since 2011, Bill has been actively involved at AmpThink in constructing some of the largest single-venue wireless networks in the world.

Footnotes

^ 1. A proxy for the calculation of overall system capacity is developed by multiplying the average speed of communication of all clients on a channel (avg data rate or speed) by the number of channels deployed in the system (available spectrum) by the number of times we can use each channel (channel re-use) or [speed x spectrum x re-use]. While there are many other parameters that come into play when designing a high density network (noise, attenuation, reflection, etc.), this simple equation helps us understand how we approach building networks that can support a large number of connected devices in an open environment, e.g. the bowl of a stadium or arena.
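The footnote's [speed x spectrum x re-use] proxy is easy to compute directly. The input values below are illustrative assumptions, not measurements from any particular venue:

```python
# Capacity proxy from footnote 1: speed x spectrum x re-use.
# All three inputs are illustrative assumptions.
avg_client_rate_mbps = 50  # average achieved data rate on a channel
channels = 25              # usable channels (mostly 5 GHz, 20 MHz wide)
reuse_factor = 8           # times each channel is re-used across the bowl

capacity_mbps = avg_client_rate_mbps * channels * reuse_factor
print(capacity_mbps)  # 10000 -- i.e., ~10 Gbps of aggregate system capacity
```

The proxy makes the design levers obvious: doubling average data rate (better antennas), usable spectrum (more channels), or re-use (better isolation) each doubles the aggregate figure, which is why the essay's deployment strategies all target one of those three terms.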

^ 2. Co-channel interference refers to a scenario where multiple access points are attempting to communicate with client devices using the same channel. If a client or access point hears competing communication on the channel they are attempting to use, they must wait until that communication is complete before they can send their message.

^ 3. Access Point is the term used in the Wi-Fi industry to describe the network endpoint that client devices communicate with over the air. Other terms used include radio, AP, or WAP. In most cases, each access point is equipped with 2 or more physical radios that communicate on one of two bands – 2.4 GHz or 5 GHz. HD Wi-Fi deployments are composed of several hundred to over 1,000 access points connected to a robust wired network that funnels guest traffic to and from the internet.

^ 4. While there is no hard and fast rule, most industry experts agree that a single access point can service between 50 and 100 client devices.

^ 5. Venues often use pressure washers to clean a stadium after a big event.

^ 6. Unlike cellular systems which can dictate which mobile device attaches to each network node, at what speed, and when they can communicate, Wi-Fi relies on the mobile device to make the same decisions. When presented with a handrail access point and an overhead access point, mobile devices often hear the overhead placement better and therefore prefer the overhead placement. In In-Fill deployments, this often results in a disproportionate number of client devices selecting overhead placements. The problem can be managed by lowering the power level on the overhead access point at the expense of degrading the experience of the devices that the designer intended to attach to the overhead access point.