A building for the future: Tech shines through at Sacramento’s new Golden 1 Center

Golden 1 Center, the new home of the Sacramento Kings. Credit: Sacramento Kings

If you’re building for the future, it’s best to start with a building for the future.

That’s what has happened in downtown Sacramento, where the Sacramento Kings have built a technology-laden future-proof arena, a venue designed not just to host basketball games but to be the centerpiece of a metro revival for years to come.

Now open for business, the Golden 1 Center is a living blueprint for the arena of the future, especially from a technology perspective. And while some technology inside the venue is impossible to ignore — starting with the massive 4K scoreboard that overhangs the court — there’s also a wealth of less-apparent technology woven throughout the building’s core and pervasive in its operating functions.

Led by Kings majority owner and former software company founder Vivek Ranadive, the technology-focused direction of the new arena is a blend of the latest thinking in venue experiences and operations. Among the many must-have staples: high-quality wireless connectivity and multiple mobile device-based services, including food ordering and delivery, map-based parking, wayfinding help, and digital ticketing. While its already-available options easily place Golden 1 Center among the top tier of connected stadiums today, what may be more impressive is the internal planning for future technologies and services, a sign that its owners and operators clearly understand the ever-changing nature of digital systems.

The purple lights are on in the Golden 1 Center data room. Credit all following photos: Paul Kapustka, MSR (click on any photo for a larger image)

While the arena is open today, it’s still somewhat of a diamond in the rough, as the planned surrounding structures, including an adjacent hotel and retail outlets, remain in the concrete-and-cranes phase, with “coming soon” signs on the area’s many construction fences. As they wait for their team to show signs of on-court improvement, Sacramento citizens must also be patient for the full plan of the downtown arena to emerge, along with its promise to revive an area once stuck in the past.

The good news? With Golden 1 Center, Sacramento fans already have a winner, a venue that will provide some of the best digital-based services and amenities found anywhere, for now and for the foreseeable future. What follows are our first impressions from an early December 2016 visit to a Kings home game, hosted by members of the Kings’ technical staff along with representatives from Wi-Fi gear provider Ruckus and cellular DAS deployment firm DAS Group Professionals.

Showing off the data center

Editor’s note: This profile is from our latest STADIUM TECH REPORT, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace. Read about new networks at the Indiana Pacers’ Bankers Life Fieldhouse and the new Wi-Fi network used for the Super Bowl in our report, which is available now for FREE DOWNLOAD from our site!

Data center guards. Small, but well armed.

If you had any doubts about how proud the Kings are of their stadium technology, those are erased the moment you enter the stadium via the VIP doorway; after the metal detectors but before you hit the new-wave ticket scanners, you see a set of floor-to-ceiling glass walls and doors to your left, showing off the impressive racks of the venue’s main data equipment room.

How can gear racks be impressive? How about if they are impeccably encased in their own white metal and glass enclosures, a technique that allows the Kings to refrigerate each rack separately, leaving the rest of the room at a temperature more suitable to human bodies. You don’t have to be a network equipment operator to recognize the over-the-top attention to detail here: even the exposed fiber cabling that stretches up and across the ceiling is color-coded in the Kings’ main team purple. Another level of coolness appears when the main lights in the room are turned off and more LEDs come on to bathe the room in a completely purple hue.

This room is also where you see the first hints of how the team is preparing for the future, with two 100 Gbps incoming bandwidth pipes (from Comcast), as well as two full rows of racks left empty, waiting for whatever innovation needs arise next. While the backbone bandwidth will eventually also support the nearby hotel and retail locations, twin 100-Gbps connections should provide adequate throughput for now and the foreseeable future.

Walk a few steps past the mini-sized Imperial Stormtroopers who guard the facility and you are in a hallway that separates a “mission control” room, with monitors for a huge number of operational services, from the video control room. The innovation here starts simply with the side-by-side proximity of the network, operations and video administration rooms, a rarity especially in older stadiums, where coordination between people working in such rooms often meant walkie-talkies and lots of running around.

Multiple live video inputs in the “control room” at G1C.

While the video control room and its need to supply coordinated content to more than 800 monitors in the building (as well as to the app) is impressive, what’s really interesting is the “mission control” room, where Kings employees, network types and public safety personnel can track multiple inputs on a wall of monitors. In addition to security and public service video monitoring (Kings reps talk about seeing fans spill a drink and hustling to deploy clean-up services before anyone can ask for them), there are also displays for real-time social media mentions and live traffic information, which the Kings can monitor and respond to as needed.

Another “unseen” technology innovation is an operational app that provides real-time access to a huge list of game-day business statistics, like live ticket-scan numbers and real-time updates on concession purchases. This app is also available to Kings execs on their mobile devices, and it’s addictive to watch the numbers update in real time, especially the fast-moving alcoholic beverage purchase totals; according to the Kings, during a Jimmy Buffett concert at the arena, adult-beverage purchases were pushing the $1,000-per-minute mark.

When it comes to the fan experience, such “hidden” technologies may be the services that provide the best examples for how high-quality networks can bring real ROI to stadiums and large public venues. Fans may never know the guts of the system, but when a stand doesn’t run out of hot dogs or a clean-up squad arrives quickly to mop up a spilled beer, it’s a good bet that customer satisfaction will keep increasing. With massively connected systems and attached real-time analytics, such services become easier to deploy and manage; at Golden 1 Center, it’s easy to see how multiple stakeholders in the venue benefit from the decision to make networked technology a primary core of the building’s operations.

The huge scoreboard dominates the view at Golden 1 Center.

A scoreboard that stretches from hoop to hoop

Taking an elevator up to the main concourse floor, we were struck first by Golden 1 Center’s openness — it is built so that the main, or ground-level, entrance is at the top of the bottom bowl of seats, with court level below. Open all the way around, the venue lets you see clear across it, an airy feeling more like a bigger enclosed football stadium than a basketball arena. On the night we toured the venue its unique glass entryway windows were closed, but they can be opened to let in the breeze during milder days — adding another degree of difficulty for wireless network administration, since LTE signals can both enter and leave the building when the windows are open.

The next thing to catch your eye is the main scoreboard, which the Kings bill as the biggest 4K screen for a permanent indoor arena, with 35 million pixels. If it were lowered during a game, the Kings folks claim the screen would touch both baskets, so without any other numbers you get the idea: This thing is huge.

New entry kiosks from SkiData move more fans inside more quickly, the Kings claim.

It’s also incredibly clear, thanks in part to the 4K resolution but also to the fact that it is tilted at just the correct angles, so it’s easy to glance up from live action for a look at either the main screens or the bordering screens on both sides. Citing only clarity or size, I think, misses a critical factor for video boards — what really matters is whether the screen is a positive or a negative for during-game viewing, a subjective measurement that may take time to sink in. First impressions during the live action between the Kings and Knicks on our visit, however, were very positive: the screen didn’t interfere with views of the live action, and it was incredibly clear for replays and live statistics.

The next part of our tour was to see if we could spot any of the 931 Ruckus Wi-Fi APs installed inside the venue. With the clear emphasis on clean aesthetics, it was hard to spot any of the wall- or ceiling-mounted units, but we were able to locate several of the many under-seat AP enclosures, including some on retractable seats. According to the Ruckus folks on hand, the retractable-seat APs took a little extra engineering to allow the devices to be disconnected during seat movements.

The JMA Wireless DAS equipment was a little easier to spot, since, like at Levi’s Stadium, there are a number of antenna placements around the main concourse, pointing down into the lower bowl seating. Representatives from DAS Group Professionals on hand also pointed out more antennas up in the rafters, as well as some specially designed “antenna rocks” that hide cellular equipment outside the stadium in the open-air plaza. According to DGP and the Kings there are 136 DAS remote placements housing 213 antennas; right now only AT&T and Verizon Wireless are active on the DAS, with T-Mobile scheduled to join before the end of the NBA season. Negotiations with Sprint are ongoing.

Blazing Wi-Fi in the basement of the building… and the rafters

When we dropped back down to the court-level to see the locker room entrances and one of the premium-seat club areas, we took our first Wi-Fi speed test at Golden 1 Center, and almost couldn’t believe the result: We got 132 Mbps for the download speed and 98 Mbps for upload. Greeted a few minutes later by owner Ranadive himself, we congratulated him on getting what he wanted in terms of connectivity, a theme he relentlessly promoted during the arena’s construction phases.

That’s good Wi-Fi. Taken in the Lexus Club on court level at Golden 1 Center.

The Wi-Fi connectivity was superb throughout the venue, with readings of 51.35/22.21 on press row (located at the top of the main lower bowl, just in front of the main concourse) and 42.14/38.83 in the crowded Sierra Nevada brewpub club at the top level of the arena. In section 220 in the upper deck we got Wi-Fi readings of 53.39 Mbps for download and 36.27 for upload. Throughout the stadium the Verizon LTE signal was in the low teens to 20 Mbps range on the download side and usually between 20 and 30 Mbps on the upload side.

One of the decisions the Kings made on the Wi-Fi side was to drop 2.4 GHz coverage for fan devices in the main bowl area. According to both Ruckus and the Kings, almost 90 percent of fan devices are now 5 GHz capable, meaning it makes administrative sense to take 2.4 GHz out of the main fan Wi-Fi equation (while still keeping it for back-of-house operations like POS and wireless wristbands and cameras, which all still use 2.4 GHz technology). Other teams in the NBA, including the Indiana Pacers (who also recently installed a Ruckus Wi-Fi network), have also said they are getting rid of 2.4 GHz coverage for fans, since most devices used today have 5 GHz connectivity.
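To put rough numbers on that trade-off, here’s a back-of-envelope sketch — a hypothetical illustration, not the Kings’ or Ruckus’ actual planning math. The channel counts are standard U.S. figures, while the attendance and 90 percent numbers come from the reporting here:

```python
# Hypothetical sketch: the trade-off in dropping 2.4 GHz fan service.
# Inputs marked "assumed" are illustrative, not Kings/Ruckus planning data.

ATTENDANCE = 17_608       # opening-night attendance, per the Kings
FIVE_GHZ_SHARE = 0.90     # ~90% of fan devices are 5 GHz capable, per Ruckus/Kings

CHANNELS_24GHZ = 3        # non-overlapping 20 MHz channels at 2.4 GHz (1, 6, 11)
CHANNELS_5GHZ = 25        # approximate usable 20 MHz US channels at 5 GHz, with DFS

excluded_fans = round(ATTENDANCE * (1 - FIVE_GHZ_SHARE))
print(f"Fans with 2.4 GHz-only devices: ~{excluded_fans:,}")
print(f"Channels for re-use planning: {CHANNELS_5GHZ} at 5 GHz vs {CHANNELS_24GHZ} at 2.4 GHz")
```

The math helps explain the call: under these assumptions, roughly 1,800 fans fall back to cellular in the bowl, in exchange for more than eight times as many clean channels to plan around.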

While we didn’t have time during this visit to explore all the numerous services available through the team’s app — including a game that lets fans bet loyalty points on predictions about which players will score the most points — it was clear that many fans were taking advantage of the connectivity, especially in the brewpub area where handy lean-up railings with small shelves made it easier to operate a mobile device while still being somewhat engaged with the court action below.

Team execs can get live feeds of fan-related stats on their internal app.

According to the Kings, during the first regular-season home game on Oct. 27, 2016, there were 8,307 unique users of the Wi-Fi network, out of 17,608 fans in attendance. The connected fans used a total of 1.4 terabytes of data on the Wi-Fi network that night, with a peak concurrent connection count of 7,761 users. The highest sustained traffic to the Internet that night was 1.01 Gbps, for a 15-minute period between 7:45 and 8:00 p.m., according to the Kings.
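Those figures make for easy back-of-the-envelope math. A quick sketch, with all inputs taken from the numbers above, shows the take rate and the average data used per connected fan:

```python
# Derived stats from the Kings' opening-night Wi-Fi numbers quoted above.

unique_users = 8_307      # unique Wi-Fi users
attendance = 17_608       # fans in attendance
total_bytes = 1.4e12      # 1.4 TB carried on the Wi-Fi network

take_rate = unique_users / attendance
avg_mb_per_user = total_bytes / unique_users / 1e6

print(f"Take rate: {take_rate:.1%}")                                # ~47.2%
print(f"Average data per connected fan: ~{avg_mb_per_user:.0f} MB")  # ~169 MB
```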

Another technology twist we saw in the brewpub was the use of Appetize’s flip-screen POS terminals, which allow for faster order-taking simply by letting fans sign on screens with their fingers. Back at the front gates, the new ticket-scanning kiosks from SkiData may take some time for fans to get used to, but even obvious first-timers seemed to quickly understand the kiosks’ operation without much help needed, thanks to the helpful instructions on the wide screen that greets fans as they encounter the device. According to the Kings, tests of the new kiosks at other venues have shown they can be as much as three times faster than previous technologies, good news for anyone who’s ever had to wait in line just to have a ticket checked.
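To see why a 3x faster scan matters, consider a rough, hypothetical ingress calculation; the lane count and per-scan times below are assumptions for illustration, not SkiData specifications:

```python
# Hypothetical gate-ingress math. Lane count and scan times are assumed.

FANS = 17_608                     # a sellout crowd, per the figure cited earlier
LANES = 40                        # assumed number of scanning lanes
OLD_SCAN_SEC = 6.0                # assumed seconds per fan on older scanners
NEW_SCAN_SEC = OLD_SCAN_SEC / 3   # "as much as three times faster"

for label, sec_per_fan in (("old scanners", OLD_SCAN_SEC), ("new kiosks", NEW_SCAN_SEC)):
    minutes = FANS * sec_per_fan / LANES / 60
    print(f"Time to admit {FANS:,} fans with {label}: ~{minutes:.0f} minutes")
```

Under those assumptions the full house gets inside about half an hour sooner, the kind of difference fans actually notice.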

A building for the future, whenever it comes

While we here at MSR clearly focus on venue technology, it was evident even during our brief stay at Golden 1 Center that Sacramento fans, though they may be immediately enjoying the amenities, are still first and foremost concerned about the product on the court. In the upper deck, two men spent several minutes questioning why Kings star DeMarcus “Boogie” Cousins (who has since been traded to the New Orleans Pelicans) didn’t seem to get the kind of refereeing treatment allotted to other NBA leaders; on an escalator, another fan interrupted one of my speed tests by loudly requesting a fan-to-fan fist bump, saying simply, “Kings basketball, right baby?”

A view outside the stadium’s main entrance, with one of the two large vertical video boards visible.

Even in the face of multiple years without playoff teams, Sacramento fans still turn out for the Kings. The point here, in regard to technology, is that it may take time for fans to notice and embrace the finer points of all the technological attributes of their new arena, which should become more than just an NBA venue as more concerts and civic events are held in and around its environs.

Our quick take is that fans may turn faster to services like the traffic, parking and seat-wayfinding features in the app, simply due to the newness of the building to everyone, as well as its tightly sandwiched downtown location. As in other new arenas, the jury is still out on other app-based services like the loyalty-points voting game and in-seat concessions ordering and delivery; the Kings declined to provide any statistics for in-seat ordering and delivery, a service that became available to the entire stadium on the night of our visit. The Kings, like many other teams, also offer instant replays via the app, but with the numerous high-quality big-screen displays (including two arena-sized screens outside the main entryway) it will be interesting to see whether fans ever feel an overwhelming need to check their devices for live action while attending a game.

The good news for the Kings is that they based their stadium and team app on a new flexible platform from a company called Built.io, which the Kings say allows for easier addition (or deletion) of services through an API layer. Like the future-proof parts of the building itself, the app also shows the Kings’ dedication to building something now that will almost certainly change going forward. As we look to the future it will be interesting to see which parts of the technology base contribute most to the fan experience and business operations at Golden 1 Center — and to see how many other existing or new arenas follow the lead.

More photos from our visit below!

Under seat Wi-Fi AP on a moveable section of stands.

The view from upper-deck seats.

A Wi-Fi speed test from those same seats.

One of the “rocks” hiding DAS antennas on the outside walkway.

Extreme bids $100 million for Avaya’s networking business

In a move that could expand its reach into the stadium networking business, Extreme Networks said Tuesday that it had entered into an “asset purchase agreement” to acquire the networking business of Avaya, which recently filed for bankruptcy. The $100 million price is still subject to the possibility of other companies submitting higher bids for Avaya’s networking business assets.

We have calls in to get more information, such as what might happen to Avaya stadium networks in venues like Avaya Stadium, Montreal’s Bell Centre and Denver’s Pepsi Center. More as we hear more.

College Football Playoff championship sees 2.4 TB of Wi-Fi — big decline from 2016

We finally have numbers for the Wi-Fi usage at the most recent College Football Playoff championship game, and in somewhat of a first, the total data used during the event was much lower than at the previous year’s game, with just 2.4 terabytes of data used on Jan. 9 at Raymond James Stadium in Tampa, Fla. — compared to 4.9 TB of Wi-Fi used at the championship game in 2016, held at the University of Phoenix Stadium in Glendale, Ariz.

The decline is probably not due to any sudden dropoff in user demand, since usage of in-stadium cellular or DAS networks increased from 2016 to 2017, with AT&T’s observed network usage doubling from 1.9 TB to 3.8 TB in Tampa. More likely the dropoff reflects the fact that the Wi-Fi network at the University of Phoenix Stadium had been through recent upgrades to prepare for both the college championship game and Super Bowl XLIX, while the network in Raymond James Stadium hasn’t seen a significant upgrade since 2010, according to stadium officials. At last check, the Wi-Fi network at University of Phoenix Stadium had more than 750 APs installed.

Joey Jones, network engineer/information security for the Tampa Bay Buccaneers, said the Wi-Fi network currently in use at Raymond James Stadium has a total of 325 Cisco Wi-Fi APs, with 130 of those in the bowl seating areas. The design is all overhead placements, Jones said in an email discussion, with no under-seat or handrail enclosure placements. The total unique number of Wi-Fi users for the college playoff game was 11,671, with a peak concurrent connection of 7,353 users, Jones said.

Still tops among college playoff championship games in Wi-Fi is the first one held at AT&T Stadium in 2015, where 4.93 TB of Wi-Fi was used. Next year’s championship game is scheduled to be held at the new Mercedes-Benz Stadium in Atlanta, where one of the latest Wi-Fi networks should be in place and operational.

Seahawks see big jump in Wi-Fi usage at CenturyLink Field for 2016-17 season

The Seattle Seahawks saw almost every metric associated with the Wi-Fi network at CenturyLink Field just about double from the 2015-16 to the 2016-17 NFL regular season, according to statistics provided by the team.

Chip Suttles, vice president of technology for the Seahawks, sent us over some excellent season-long charts of Wi-Fi activity, including unique and concurrent-user peaks, top throughput, and a couple of comparison charts mapping this most recent season’s activity compared to that a year before.

With a capacity crowd attendance total of 69,000, the always sold-out CenturyLink saw a take rate nearing 50 percent for most of the season, with a top unique-user number of 35,808 for the Nov. 7, 31-25 win over the Buffalo Bills. Interestingly, the biggest day for total data usage wasn’t the Bills game (3.259 terabytes) but the 26-15 win over the Philadelphia Eagles on Nov. 20, when the Wi-Fi network saw 3.526 TB of data used. If you look at the comparative graphs, both peak usage and total usage numbers pretty much doubled from what was seen the year before.

According to Suttles, there wasn’t much in the way of upgrades to the Extreme Networks-based network before this past season — just some firmware and software updates, and “about a half-dozen” new APs to support additional seating added in the south end zone area. “Overall, I think it [the data increases] is more to do with familiarity,” Suttles said. Thanks to Chip and the Seahawks for sharing the numbers!


From overhead to under seat: A short history of the hows and whys of stadium Wi-Fi network design

Wi-Fi handrail enclosures at U.S. Bank Stadium, Minneapolis, Minn. Credit: Paul Kapustka, MSR

By Bill Anderson, AmpThink

The history of high density (HD) Wi-Fi deployments in stadiums and arenas is short. Yet the amount of change that has occurred is significant, both in terms of how these networks are deployed and why.

Venue operators, manufacturers, and integrators are still grappling with the particulars of HD Wi-Fi in large open environments, even though there is now a substantial number of high-quality implementations in the field. Below, I’ve shared our perspective on the evolution of HD Wi-Fi design in stadiums and arenas and put forth questions that venue operators should be asking to find a solution that fits their needs and their budget.

AmpThink’s background in this field

Over the past five years, our team has been involved in the deployment of more than 50 high-density Wi-Fi networks in stadiums throughout North America. In that same period, best practices for stadium HD Wi-Fi design have changed several times, resulting in multiple deployment methodologies.

Each major shift in deployment strategy was intended to increase total system capacity [1]. The largest gains have come from better antenna technology or deployment techniques that better isolated access point output, resulting in gains in channel re-use.

What follows is a summary of what we’ve learned from the deployments we participated in and their significance for the future. Hopefully, this information will be useful to others as they embark on their journeys to purchase, deploy, or enhance their own HD Wi-Fi networks.

In the beginning: All about overhead

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.


Designers of the first generation of HD Wi-Fi networks were just starting to develop the basic concepts that would come to define HD deployments in large, open environments. Their work was informed by prior deployments in auditoriums and convention centers and focused on using directional antennas. The stated goal of this approach was to reduce co-channel interference [2] by reducing the effective footprint of an individual access point’s [3] RF output.

However, the greatest gains came from improving the quality of the link between clients and the access point. Better antennas allowed client devices to communicate at faster speeds, which decreased the amount of time required to complete their communication, making room for more clients on each channel before a given channel became saturated or unstable.

Under seat Wi-Fi AP at Bank of America Stadium. Credit: Carolina Panthers

The concept was simple, but limited by the fact that there were few antennas available that could do the job effectively. Creative technicians built hybrid assemblies that combined multiple antennas into arrays that rotated polarization and tightened the antenna beam to paint the smallest usable coverage pattern possible. In time, this gap was addressed, and today there are antennas specifically developed for use in overhead HD deployments – Stadium Antennas.

Typically, Stadium Antennas are installed in the ceilings above seating and/or on the walls behind seating because those locations are relatively easy to cable and minimize cost. We categorize these deployments as Overhead Deployments.

From overhead to ‘front and back’

First generation overhead deployments generally suffer from a lack of overhead mounting locations to produce sufficient coverage across the entire venue. In football stadiums, the front rows of the lower bowl are typically not covered by an overhang that can be used for antenna placement.

These rows are often more than 100 feet from the nearest overhead mounting location. The result is that pure overhead deployments leave some of the most expensive seats in the venue with little or no coverage. Further, due to the length of these sections, antennas at the back of the section potentially service thousands of client devices [4].

As fans joined these networks, deployments quickly became over-loaded and generated service complaints for venue owners. The solution was simple — add antennas at the front of long sections to reduce the total client load on the access points at the back. It was an effective band-aid that prioritized serving the venues’ most important and often most demanding guests.

This approach increased the complexity of installation as it was often difficult to cable access points located at the front of a section.

And for the first time, antennas were placed where they were subject to damage by fans, direct exposure to weather, and pressure washing [5]. With increased complexity came increased costs, as measured by the average cost per installed access point across a venue.

Because these systems feature antennas at the front and rear of each seating section, we refer to these deployments as ‘Front-to-Back Deployments.’ While this approach solves specific problems, it is not a complete solution in larger venues.

‘Filling In’ the gaps

Data collected from Front-to-Back Deployments proved to designers that moving the antennas closer to end users:
— covered areas that were previously uncovered;
— increased average data rates throughout the bowl;
— used the available spectrum more effectively; and
— increased total system capacity.

The logical conclusion was that additional antennas installed between the front and rear antennas would further increase system capacity. In long sections these additional antennas would also provide coverage to fans that were seated too far forward of antennas at the rear of the section and too far back from antennas at the front of the section. The result was uniform coverage throughout the venue.

In response, system designers experimented with handrail-mounted access points. Using directional antennas, coverage could be directed across a section, in opposition to the forward-facing antennas at the rear of the section and the rear-facing antennas at the front of a section. These placements filled in the gaps in a Front-to-Back Deployment, hence the name ‘In-Fill Deployment.’

While these new In-Fill Deployments did their job, they added expense to what was already an expensive endeavor. Mounting access points on handrails required that a hole be drilled through the seating deck at each access point location to cable the installed equipment. With the access point and antenna now firmly embedded in the seating, devices were also exposed to more traffic and abuse. Creative integrators came to the table with hardened systems to protect the equipment – handrail enclosures. New costs included using ground-penetrating radar to prepare for coring, enclosure fabrication, and more complex conduit and pathway considerations. A typical handrail placement could cost four times as much as a typical overhead placement, and a designer might call for 2 or 3 handrail placements for every overhead placement.
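The cost implications compound quickly. A normalized sketch using the figures above (a handrail placement at roughly four times the cost of an overhead placement, and 2 or 3 handrail placements per overhead placement) shows how an In-Fill section compares with an overhead-only one:

```python
# Normalized cost sketch for one seating section, using the ratios cited above.

OVERHEAD_COST = 1.0     # one overhead placement, normalized to 1
HANDRAIL_COST = 4.0     # "could cost four times" an overhead placement

for handrails in (2, 3):
    section_cost = OVERHEAD_COST + handrails * HANDRAIL_COST
    print(f"{handrails} handrail placements per overhead placement: "
          f"~{section_cost:.0f}x the cost of an overhead-only section")
```

So a section that once cost one unit to cover can cost nine to thirteen units under an In-Fill design, which is why the budget discussion below matters.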

Getting closer, better, faster: Proximate Networks

In-Fill strategies substantially solved the coverage problem in large venues. Using a combination of back-of-section, front-of-section, and handrail-mounted access points, wireless designers had a toolbox to deliver full coverage.

But with that success came a new problem. As fans discovered these high density networks and found new uses for them, demands on those networks grew rapidly, especially where teams or venue owners pushed mobile-device content strategies that added to the network load. In spite of well-placed access points, fan devices did not attach to the in-fill devices at the same rate that they attached to the overhead placements [6]. In-fill equipment remained lightly used while overhead placements absorbed hundreds of clients. Gains in system capacity stalled.

Close-up look at U.S. Bank Stadium railing enclosure during final construction phase, summer 2016. Credit: Paul Kapustka, MSR

To overcome uneven system loading, designers needed to create a more even distribution of RF energy within the deployment. That required a consistent approach to deployment, rather than a mix of deployment approaches. The result was the elimination of overhead antennas in favor of access points and antennas installed within the crowd, closest to the end user; hence the name ‘Proximate Networks.’

Proximate networks come in two variations: handrail-only and under-seat-only. In the handrail-only model, the designer eliminates overhead and front-of-section placements in favor of a dense deployment of handrail enclosures. In the under-seat model, the designer places the access point and antenna underneath the actual seating (but above the steel or concrete decking). In both models, the crowd becomes an important part of the design. The crowd attenuates the signal as it passes through their bodies, resulting in consistent signal degradation and even distribution of RF energy throughout the seating bowl. The result is even access point loading and increased system capacity.

An additional benefit of embedding the access points in the crowd is that the crowd effectively constrains the output of the access point, much as a wall constrains the output of an access point in a typical building. Each radio therefore hears fewer of its neighbors, allowing each channel to be re-used more effectively. And because the crowd provides an effective mechanism for controlling the spread of RF energy, the radios can be operated at higher power levels, which improves the link between the access point and the fan’s device. The result is more uniform system loading, higher average data rates, increased channel re-use, and increases in total system capacity.
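A simplified link-budget sketch illustrates the effect. The per-body loss and the geometry here are assumed round numbers for illustration, not measurements from any AmpThink deployment:

```python
import math

# Illustrative sketch: crowd attenuation between two same-channel APs.
# BODY_LOSS_DB and the distances are assumptions, not field measurements.

def fspl_db(distance_m: float, freq_mhz: float = 5200.0) -> float:
    """Free-space path loss in dB for a given distance and frequency."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

BODY_LOSS_DB = 4.0      # assumed attenuation per intervening body at 5 GHz
BODIES_BETWEEN = 10     # assumed fans seated between the two APs

isolation_empty = fspl_db(20)                                     # empty bowl, APs 20 m apart
isolation_full = isolation_empty + BODIES_BETWEEN * BODY_LOSS_DB  # full bowl

print(f"AP-to-AP isolation, empty bowl: ~{isolation_empty:.0f} dB")
print(f"AP-to-AP isolation, full bowl:  ~{isolation_full:.0f} dB")
```

Under these assumptions the crowd adds some 40 dB of isolation between same-channel radios, which is why designers can raise transmit power and still re-use channels aggressively.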

While Proximate Networks are still a relatively new concept, the early data (and a growing number of fast followers) confirms that if you want the densest possible network with the largest possible capacity, then a Proximate Network is what you need.

The Financials: picking what’s right for you

From the foregoing essay, you might conclude that the author’s recommendation is to deploy a Proximate Network. However, that is not necessarily the case. If you want the densest possible network with the largest possible capacity, then a Proximate Network is a good choice. But there are merits to each approach described and a cost benefit analysis should be performed before a deployment approach is selected.

For many venues, Overhead Deployments remain the most cost-effective way to provide coverage. In smaller venues, and in venues where system utilization is expected to be low, an Overhead Deployment can be ideal.

Front-to-Back deployments work well in venues where system utilization is low and the available overhead mounting assets can’t cover all areas. The goal of these deployments is ensuring usable coverage, not maximizing total system capacity.

In-fill deployments are a good compromise between a coverage-centric high density approach and a capacity-centric approach. This approach is best suited to venues that need more total system capacity but have budget constraints that prevent selecting a Proximate approach.

Proximate deployments provide the maximum possible wireless density for venues where connectivity is considered to be a critical part of the venue experience.

Conclusion

If your venue is contemplating deploying a high density network, ask your integrator to walk you through the expected system demand, the calculation of system capacity for each approach, and finally the cost of each approach. Make sure you understand their assumptions. Then, select the deployment model that meets your business requirements — there is no “one size fits all” when it comes to stadium Wi-Fi.
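As a concrete instance of the capacity proxy described in footnote 1 (speed x spectrum x re-use), here is a worked example; every input below is an illustrative assumption, and your integrator’s numbers will differ:

```python
# Worked example of the capacity proxy from footnote 1: speed x spectrum x re-use.
# All three inputs are illustrative assumptions.

avg_channel_rate_mbps = 20   # assumed average client data rate per channel
channels_deployed = 20       # assumed usable channels in the design
reuse_factor = 4             # assumed times each channel is re-used in the bowl

capacity_mbps = avg_channel_rate_mbps * channels_deployed * reuse_factor
print(f"Proxy system capacity: ~{capacity_mbps:,} Mbps ({capacity_mbps / 1000:.1f} Gbps)")
```

Running the same arithmetic with each vendor’s assumptions for an Overhead, Front-to-Back, In-Fill, and Proximate design makes the capacity differences, and the cost per unit of capacity, easy to compare.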

Bill Anderson, AmpThink

Bill Anderson has been involved in the design and construction of wireless networks for over 20 years, pre-dating Wi-Fi. His first experience with wireless networking was as a software developer building software for mobile computers communicating over 400 MHz and 900 MHz base stations developed by Aironet (now part of Cisco Systems).

His work with mobile computing and wireless networks in distribution and manufacturing afforded him a front-row seat to the emergence of Wi-Fi and its transformation from a niche technology to a business-critical system. Since 2011 at AmpThink, Bill has been actively involved in constructing some of the largest single-venue wireless networks in the world.

Footnotes

^ 1. A proxy for the calculation of overall system capacity is developed by multiplying the average speed of communication of all clients on a channel (avg data rate or speed) by the number of channels deployed in the system (available spectrum) by the number of times we can use each channel (channel re-use) or [speed x spectrum x re-use]. While there are many other parameters that come into play when designing a high density network (noise, attenuation, reflection, etc.), this simple equation helps us understand how we approach building networks that can support a large number of connected devices in an open environment, e.g. the bowl of a stadium or arena.

^ 2. Co-channel interference refers to a scenario where multiple access points are attempting to communicate with client devices using the same channel. If a client or access point hears competing communication on the channel they are attempting to use, they must wait until that communication is complete before they can send their message.

^ 3. Access Point is the term used in the Wi-Fi industry to describe the network endpoint that client devices communicate with over the air. Other terms used include radio, AP, or WAP. In most cases, each access point is equipped with 2 or more physical radios that communicate on one of two bands – 2.4 GHz or 5 GHz. HD Wi-Fi deployments are composed of several hundred to over 1,000 access points connected to a robust wired network that funnels guest traffic to and from the internet.

^ 4. While there is no hard and fast rule, most industry experts agree that a single access point can service between 50 and 100 client devices.

^ 5. Venues often use pressure washers to clean a stadium after a big event.

^ 6. Unlike cellular systems which can dictate which mobile device attaches to each network node, at what speed, and when they can communicate, Wi-Fi relies on the mobile device to make the same decisions. When presented with a handrail access point and an overhead access point, mobile devices often hear the overhead placement better and therefore prefer the overhead placement. In In-Fill deployments, this often results in a disproportionate number of client devices selecting overhead placements. The problem can be managed by lowering the power level on the overhead access point at the expense of degrading the experience of the devices that the designer intended to attach to the overhead access point.

Arris to acquire Ruckus from Brocade as part of $800 million deal

The wireless networking business once known as Ruckus Wireless is finding a new home, as Arris announced today that it plans to buy Ruckus from current owner Brocade as part of an $800 million deal that also includes Brocade’s ICX switch business.

Brocade, which purchased Ruckus for $1.2 billion last year, is itself being acquired by chipmaker Broadcom in a $5.5 billion deal announced last November.

According to an Arris release, the deal will be completed when Broadcom’s purchase of Brocade closes, an event expected to happen at the end of July.

We’ve got some calls in to see what, if anything, will happen to the growing stadium networking business at Ruckus, which in the past year has seen its Wi-Fi gear installed in arenas like Bankers Life Fieldhouse in Indianapolis and Golden 1 Center in Sacramento. Stay tuned!