From overhead to under seat: A short history of the hows and whys of stadium Wi-Fi network design

Wi-Fi handrail enclosures at U.S. Bank Stadium, Minneapolis, Minn. Credit: Paul Kapustka, MSR

By Bill Anderson, AmpThink

The history of high-density (HD) Wi-Fi deployments in stadiums and arenas is short. Yet the amount of change that has occurred is significant, both in how these networks are deployed and why.

Venue operators, manufacturers, and integrators are still grappling with the particulars of HD Wi-Fi in large open environments, even though a substantial number of high-quality implementations have already been deployed. Below, I’ve shared our perspective on the evolution of HD Wi-Fi design in stadiums and arenas, and I pose questions that venue operators should ask to find a solution that fits their needs and their budget.

AmpThink’s background in this field

Over the past five years, our team has been involved in the deployment of more than 50 high-density Wi-Fi networks in stadiums throughout North America. In that same period, best practices for stadium HD Wi-Fi design have changed several times, resulting in multiple deployment methodologies.

Each major shift in deployment strategy was intended to increase total system capacity [1]. The largest gains have come from better antenna technology and deployment techniques that better isolate access point output, resulting in gains in channel re-use.

What follows is a summary of what we’ve learned from the deployments we participated in and their significance for the future. Hopefully, this information will be useful to others as they embark on their journeys to purchase, deploy, or enhance their own HD Wi-Fi networks.

In the beginning: All about overhead

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.


Designers of the first generation of HD Wi-Fi networks were developing the basic concepts that would come to define HD deployments in large, open environments. Their work was informed by prior deployments in auditoriums and convention centers and focused on using directional antennas. The stated goal of this approach was to reduce co-channel interference [2] by shrinking the effective footprint of an individual access point’s [3] RF output.

However, the greatest gains came from improving the quality of the link between clients and the access point. Better antennas allowed client devices to communicate at faster speeds, which decreased the amount of airtime required to complete their communication, making room for more clients on each channel before that channel became saturated or unstable.
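To make the airtime argument concrete, here is a minimal sketch in Python. All of the numbers are hypothetical (the per-client payload, the link rates, and the utilization cap are assumptions, not measurements from any deployment discussed here); the point is only the proportionality between link rate and how many clients a channel can absorb:

# Hypothetical airtime math: faster links leave more channel time for other clients.
payload_bits = 2 * 8 * 1_000_000  # assume each client moves ~2 MB during a burst
def clients_per_minute(link_rate_mbps, max_busy_fraction=0.8):
    # Airtime one client consumes, then how many clients fit into a minute
    # if the channel is only allowed to run at max_busy_fraction utilization.
    airtime_s = payload_bits / (link_rate_mbps * 1_000_000)
    return int(max_busy_fraction * 60 / airtime_s)
for rate in (6, 24, 54, 150):
    print(f"{rate:>3} Mbps average link rate -> ~{clients_per_minute(rate)} clients per channel per minute")

Roughly speaking, doubling the average data rate doubles the number of clients a channel can serve before it saturates, which is why link quality mattered more than raw coverage in these early designs.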

Under seat Wi-Fi AP at Bank of America Stadium. Credit: Carolina Panthers

The concept was simple, but it was limited by the fact that few antennas available at the time could do the job effectively. Creative technicians built hybrid assemblies that combined multiple antennas into arrays, rotating polarization and tightening the beam to paint the smallest usable coverage pattern possible. In time, this gap was addressed, and today there are antennas specifically developed for use in overhead HD deployments – Stadium Antennas.

Typically, Stadium Antennas are installed in the ceilings above seating and/or on the walls behind seating because those locations are relatively easy to cable and minimize cost. We categorize these deployments as Overhead Deployments.

From overhead to ‘front and back’

First-generation overhead deployments generally suffer from a lack of overhead mounting locations, making it difficult to produce sufficient coverage across the entire venue. In football stadiums, the front rows of the lower bowl are typically not covered by an overhang that can be used for antenna placement.

These rows are often more than 100 feet from the nearest overhead mounting location. The result is that pure overhead deployments leave some of the most expensive seats in the venue with little or no coverage. Further, due to the length of these sections, antennas at the back of the section potentially service thousands of client devices [4].

As fans joined these networks, deployments quickly became over-loaded and generated service complaints for venue owners. The solution was simple — add antennas at the front of long sections to reduce the total client load on the access points at the back. It was an effective band-aid that prioritized serving the venues’ most important and often most demanding guests.

This approach increased the complexity of installation as it was often difficult to cable access points located at the front of a section.

And for the first time, antennas were placed where they were subject to damage by fans, direct exposure to weather, and pressure washing [5]. With increased complexity came increased costs, as measured by the average cost per installed access point across a venue.

Because these systems feature antennas at the front and rear of each seating section, we refer to these deployments as ‘Front-to-Back Deployments.’ While this approach solves specific problems, it is not a complete solution in larger venues.

‘Filling In’ the gaps

Data collected from Front-to-Back Deployments proved to designers that moving the antennas closer to end users:
— covered areas that were previously uncovered;
— increased average data rates throughout the bowl;
— used the available spectrum more effectively; and
— increased total system capacity.

The logical conclusion was that additional antennas installed between the front and rear antennas would further increase system capacity. In long sections these additional antennas would also provide coverage to fans that were seated too far forward of antennas at the rear of the section and too far back from antennas at the front of the section. The result was uniform coverage throughout the venue.

In response, system designers experimented with handrail-mounted access points. Using directional antennas, coverage could be directed across a section, in opposition to the forward-facing antennas at the rear of the section and the rear-facing antennas at the front of a section. These placements filled in the gaps in a Front-to-Back Deployment, hence the name ‘In-Fill Deployment.’

While these new In-Fill Deployments did their job, they added expense to what was already an expensive endeavor. Mounting access points on handrails required that a hole be drilled through the seating structure at each access point location to cable the installed equipment. With the access point and antenna now firmly embedded in the seating, devices were also exposed to more traffic and abuse. Creative integrators came to the table with hardened systems to protect the equipment – handrail enclosures. New costs included using ground-penetrating radar to prepare for coring, enclosure fabrication, and more complex conduit and pathway considerations. A typical handrail placement could cost four times as much as a typical overhead placement, and a designer might call for two or three handrail placements for every overhead placement.
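To see why this math strained budgets, here is a back-of-the-envelope sketch. The only figures taken from the text are the roughly four-to-one cost ratio and the two-to-three handrail placements per overhead placement; the venue size is invented purely for illustration:

# Hypothetical In-Fill cost model using only the ratios cited above.
overhead_placements = 100        # invented venue size
handrail_per_overhead = 2.5      # midpoint of the "2 or 3" range
handrail_cost_multiple = 4.0     # a handrail placement costs ~4x an overhead placement
handrail_placements = int(overhead_placements * handrail_per_overhead)
total_cost_units = overhead_placements * 1.0 + handrail_placements * handrail_cost_multiple
print(f"{overhead_placements} overhead + {handrail_placements} handrail placements")
print(f"~{total_cost_units:.0f} cost units, vs. {overhead_placements} for an overhead-only design")

In this toy model the In-Fill build costs roughly an order of magnitude more than an overhead-only design of the same footprint, which is why the cost-benefit discussion later in this piece matters.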

Getting closer, better, faster: Proximate Networks

In-Fill strategies substantially solved the coverage problem in large venues. Using a combination of back of section, front of section, and hand-rail mounted access points, wireless designers had a tool box to deliver full coverage.

But with that success came a new problem. As fans discovered these high-density networks and found new uses for them, demands on those networks grew rapidly, especially where teams or venue owners pushed mobile-device content strategies that added to the network load. In spite of well-placed access points, fan devices did not attach to the in-fill placements at the same rate that they attached to the overhead placements [6]. In-fill equipment remained lightly used while overhead placements absorbed hundreds of clients. Gains in system capacity stalled.

Close-up look at U.S. Bank Stadium railing enclosure during final construction phase, summer 2016. Credit: Paul Kapustka, MSR

To overcome uneven system loading, designers needed to create a more even distribution of RF energy within the deployment. That required a consistent approach to deployment, rather than a mix of approaches. The result was the elimination of overhead antennas in favor of access points and antennas installed within the crowd, closest to the end user; hence the name ‘Proximate Networks.’

Proximate Networks come in two variations: handrail-only and under-seat-only. In the handrail-only model, the designer eliminates overhead and front-of-section placements in favor of a dense deployment of handrail enclosures. In the under-seat model, the designer places the access point and antenna underneath the actual seating (but above the steel or concrete decking). In both models, the crowd becomes an important part of the design. The crowd attenuates the signal as it passes through fans’ bodies, resulting in consistent signal degradation and an even distribution of RF energy throughout the seating bowl. The result is even access point loading and increased system capacity.

An additional benefit of embedding the access points in the crowd is that the crowd effectively constrains the output of the access point, much as a wall constrains the output of an access point in a typical building. Each radio therefore hears fewer of its neighbors, allowing each channel to be re-used more effectively. And because the crowd provides an effective mechanism for controlling the spread of RF energy, the radios can be operated at higher power levels, which improves the link between the access point and the fan’s device. The result is more uniform system loading, higher average data rates, increased channel re-use, and increases in total system capacity.
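A toy link budget illustrates the effect. The free-space path-loss formula is standard; the transmit power, the per-body loss, and the body counts below are assumptions chosen only to show the shape of the argument, not field measurements:

import math
def fspl_db(distance_m, freq_mhz=5200.0):
    # Free-space path loss in dB for distance in meters, frequency in MHz.
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55
def rx_dbm(tx_dbm, distance_m, bodies_in_path, body_loss_db=3.0):
    # Received power after free-space loss plus a per-body attenuation term.
    return tx_dbm - fspl_db(distance_m) - bodies_in_path * body_loss_db
print(f"fan 3 m away, 1 body in path:       {rx_dbm(14, 3, 1):.0f} dBm")
print(f"co-channel AP 30 m away, 15 bodies: {rx_dbm(14, 30, 15):.0f} dBm")

The nearby fan still sees a strong signal while a co-channel access point across the bowl hears almost nothing, which is exactly the isolation that allows more aggressive channel re-use.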

While Proximate Networks are still a relatively new concept, the early data (and a rapidly growing number of fast followers) confirms that if you want the densest possible network with the largest possible capacity, a Proximate Network is what you need.

The Financials: picking what’s right for you

From the foregoing essay, you might conclude that the author’s recommendation is to deploy a Proximate Network. However, that is not necessarily the case. If you want the densest possible network with the largest possible capacity, then a Proximate Network is a good choice. But there are merits to each approach described and a cost benefit analysis should be performed before a deployment approach is selected.

For many venues, Overhead Deployments remain the most cost-effective way to provide coverage. In smaller venues, and in venues where system utilization is expected to be low, an Overhead Deployment can be ideal.

Front-to-Back deployments work well in venues where system utilization is low and the available overhead mounting assets can’t cover all areas. The goal of these deployments is ensuring usable coverage, not maximizing total system capacity.

In-Fill deployments are a good compromise between a coverage-centric high-density approach and a capacity-centric approach. This approach is best suited to venues that need more total system capacity but have budget constraints that prevent selecting a Proximate approach.

Proximate deployments provide the maximum possible wireless density for venues where connectivity is considered to be a critical part of the venue experience.

Conclusion

If your venue is contemplating deploying a high density network, ask your integrator to walk you through the expected system demand, the calculation of system capacity for each approach, and finally the cost of each approach. Make sure you understand their assumptions. Then, select the deployment model that meets your business requirements — there is no “one size fits all” when it comes to stadium Wi-Fi.
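As one way to structure that conversation, the capacity proxy from footnote 1 ([speed x spectrum x re-use]) can be turned into a quick side-by-side comparison. The inputs below are placeholders that an integrator would replace with venue-specific estimates:

# Capacity proxy from footnote 1: avg data rate x channels x channel re-use.
# All inputs are placeholder estimates, not measurements.
def capacity_proxy_mbps(avg_client_rate_mbps, channels, reuse):
    return avg_client_rate_mbps * channels * reuse
designs = {
    "Overhead":  capacity_proxy_mbps(30, 20, 4),
    "In-Fill":   capacity_proxy_mbps(45, 20, 8),
    "Proximate": capacity_proxy_mbps(60, 20, 12),
}
for name, mbps in designs.items():
    print(f"{name:<9} ~{mbps:,} Mbps aggregate")

Paired with per-approach cost estimates, numbers like these make it easier to judge which design actually meets a venue's business requirements.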

Bill Anderson, AmpThink

Bill Anderson has been involved in the design and construction of wireless networks for over 20 years, pre-dating Wi-Fi. His first experience with wireless networking was as a software developer building software for mobile computers communicating over 400 MHz and 900 MHz base stations developed by Aironet (now part of Cisco Systems).

His work with mobile computing and wireless networks in distribution and manufacturing afforded him a front-row seat to the emergence of Wi-Fi and its transformation from a niche technology to a business-critical system. Since 2011, at AmpThink, Bill has been actively involved in constructing some of the largest single-venue wireless networks in the world.

Footnotes

^ 1. A proxy for the calculation of overall system capacity is developed by multiplying the average speed of communication of all clients on a channel (avg data rate or speed) by the number of channels deployed in the system (available spectrum) by the number of times we can use each channel (channel re-use) or [speed x spectrum x re-use]. While there are many other parameters that come into play when designing a high density network (noise, attenuation, reflection, etc.), this simple equation helps us understand how we approach building networks that can support a large number of connected devices in an open environment, e.g. the bowl of a stadium or arena.

^ 2. Co-channel interference refers to a scenario where multiple access points are attempting to communicate with client devices using the same channel. If a client or access point hears competing communication on the channel it is attempting to use, it must wait until that communication is complete before it can send its message.

^ 3. Access Point is the term used in the Wi-Fi industry to describe the network endpoint that client devices communicate with over the air. Other terms used include radio, AP, or WAP. In most cases, each access point is equipped with 2 or more physical radios that communicate on one of two bands – 2.4 GHz or 5 GHz. HD Wi-Fi deployments are composed of several hundred to over 1,000 access points connected to a robust wired network that funnels guest traffic to and from the internet.

^ 4. While there is no hard and fast rule, most industry experts agree that a single access point can service between 50 and 100 client devices.

^ 5. Venues often use pressure washers to clean a stadium after a big event.

^ 6. Unlike cellular systems which can dictate which mobile device attaches to each network node, at what speed, and when they can communicate, Wi-Fi relies on the mobile device to make the same decisions. When presented with a handrail access point and an overhead access point, mobile devices often hear the overhead placement better and therefore prefer the overhead placement. In In-Fill deployments, this often results in a disproportionate number of client devices selecting overhead placements. The problem can be managed by lowering the power level on the overhead access point at the expense of degrading the experience of the devices that the designer intended to attach to the overhead access point.
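A small sketch of the selection behavior described in footnote 6, with invented signal levels: the client simply associates with whichever access point it hears best, so lowering the overhead radio's power is the main, and imperfect, lever a designer has:

# Toy model of Wi-Fi client AP selection: strongest received signal wins.
def chosen_ap(rssi_dbm_by_ap):
    return max(rssi_dbm_by_ap, key=rssi_dbm_by_ap.get)
print(chosen_ap({"overhead": -58, "handrail": -62}))  # -> overhead, despite a nearby handrail AP
print(chosen_ap({"overhead": -68, "handrail": -62}))  # -> handrail, after cutting overhead power 10 dB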

AmpThink’s Wi-Fi data reveals interesting attendance trends for collegiate customer

AmpThink infographic about how Wi-Fi data can help teams discover attendance information (click on photo for link to infographic page)

If there’s a single business concern we hear over and over again from stadium owners and operators, it’s the desire to answer a simple but powerful question: Who, exactly, is sitting in our seats?

Before digital technology arrived, that question was exceedingly hard to answer, as teams, schools and other ticket-sellers often only knew details of a small percentage of the actual fans in attendance. Paper tickets could be given to family, friends or sold to brokers, meaning the people actually at the game might very well not be the person who purchased the tickets.

While digital ticketing has improved the insight somewhat, many fans at many stadiums still use printed tickets for access, which may keep the at-game attendee somewhat anonymous. But with a high-density Wi-Fi network in place, stadium owners and operators can gain deep insights from the fans who do attend, even if those fans never actually log on to the network.

Wi-Fi deployment firm AmpThink, which has customers in all the U.S. pro leagues as well as in many major university venues, has put together a deeply sourced infographic showing how Wi-Fi analytics from a full season of games at a Power 5 college football stadium can produce some interesting insights — like the fact that 71 percent of all attendees only went to one game, and that only 2 percent of attendees went to all six games.

Using data to replace assumptions

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.

While we here at Mobile Sports Report don’t often recommend company-produced infographics, the data and conclusions surfaced in this one are eye-opening and are likely to be informative to venue owners and operators in a wide range of sports; that’s why we agreed to make this information available to our readers.

We also like the detailed explanations accompanying the infographic, spelling out how the data were collected and how Wi-Fi can be used to identify fans (including those with devices that may not even be purposely connected to the network). The last part of the infographic page, which asks “How could Wi-Fi data change sports marketing?” is a question we’ve already seen others starting to answer — and one we expect many to test in the near future as teams deploy not just Wi-Fi networks but also Bluetooth beacons, login portal pages and other methods to increase the granularity of fan identification.

For the unidentified client, AmpThink said the results “surprised” the school, which had (like others) believed in “long-held industry assumptions about fan loyalty and audience size.” It’s our guess that digital data will increasingly be used to replace assumptions, and we’re looking forward to sharing your stories of how that happens.

Vikings hit peak of 4.32 TB for Wi-Fi use at U.S. Bank Stadium, with average 43 percent take rate

Game day at U.S. Bank Stadium. Credit all photos: Vikings.com (click on any photo for a larger image)

While the football season may not have gone exactly to Vikings fans’ wishes, the Wi-Fi network at U.S. Bank Stadium performed well during its inaugural NFL season, with a peak single-game total of 4.32 terabytes of data used, part of a season average of 2.89 TB per Vikings game.

According to statistics provided to MSR by Tod Caflisch, vice president and chief technical officer for the Vikings, the biggest data-use day was Sept. 18, 2016, during the regular-season home opener against the rival Green Bay Packers, a 17-14 Vikings victory. That contest also saw season highs for unique Wi-Fi users, with 31,668 fans connecting to the Wi-Fi at some point during the game day, and for concurrent users, with 17,556 devices connected at the same time. The 31,668 figure represented a 49 percent take rate against the game’s reported attendance of 64,786.
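For readers who want to check the arithmetic, the take rate is simply unique Wi-Fi users divided by reported attendance (the figures are from the article; the concurrency share is our own derived comparison):

# Take-rate arithmetic using the figures reported above.
unique_wifi_users = 31_668
reported_attendance = 64_786
peak_concurrent = 17_556
print(f"take rate: {unique_wifi_users / reported_attendance:.0%}")         # ~49%
print(f"peak concurrent share: {peak_concurrent / unique_wifi_users:.0%}") # ~55% of unique users online at once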

Even though Caflisch said the Vikings didn’t heavily promote the AmpThink-designed Wi-Fi network — which uses Cisco Wi-Fi gear in mostly handrail-mounted AP locations to serve the main bowl seating areas — the average take rate during the season was at the high end of numbers we’ve seen, with a 43 percent average over the two preseason and eight regular-season Vikings games.

And even though the total data-used number only crested 3 TB one other time in the season — a 3.16 TB mark during a 30-24 Vikings win over the Arizona Cardinals on Nov. 20, 2016 — the average mark of 2.89 TB per game showed solid, consistent use.

Caflisch said that the Vikings and U.S. Bank Stadium were also able to correct the train-snafu issue that arose at some of the early events at the new venue, which has a light-rail station right outside the stadium doors. While some of the first events had big lines of riders and not enough trains, Caflisch said that during the season extra trains were held in reserve at the transit station that is close to Target Field (a few stops down the line from U.S. Bank) and then filtered in as Vikings games neared their end.

“We were able to clear the [train] platform in 40 minutes after the last game,” Caflisch said. “The fans really loved the trains.” (More U.S. Bank Stadium images below)


Vikings fans gather outside the stadium for pregame activities.


Great nighttime view with city skyline visible through windows.


A look at the handrail Wi-Fi antenna mounts (this photo credit: Paul Kapustka, MSR)

Cisco deploys Wi-Fi network at San Jose Sharks’ SAP Center

SAP Center, home of the San Jose Sharks. Credit: SanJoseSharks.com.

The San Jose Sharks have announced a new Wi-Fi network for their home arena, SAP Center — one that will use Cisco Wi-Fi gear as well as Cisco’s StadiumVision system for digital-display content management.

San Jose Sharks chief operating officer John Tortora said that the new Wi-Fi network — believed to be the first full public Wi-Fi deployment in the building — joins a new team app developed by VenueNext as part of a big revamp for the technology-related fan experience at the so-called “Shark Tank.”

According to the Sharks, the Wi-Fi network will have 500 access points, with 50 of those mounted in handrail enclosures in the lower seating bowl; another 17 APs will be located under seats in the retractable seating sections of the arena. Wi-Fi design and deployment firm AmpThink helped install the new network, which is slated to go live by Dec. 1, the Sharks said.

“To complement our new Sharks app and the use of it at SAP Center, we are in the process of deploying Cisco Connected Stadium Wi-Fi, a best-in-class Wi-Fi platform used in sports venues around the world,” Tortora said in an email communication. “We want our patrons to be able to easily and reliably connect while at SAP Center to allow for the best fan experience when attending Sharks games and other events.”

Sharks fans at Wednesday night’s home opener may have noticed some of the other technical enhancements to the arena, which include 13 new LED panels and 625 new digital displays. The Cisco StadiumVision system allows for remote control and synchronization of digital display content, including the ability to split screens to show things like live video alongside static advertising.

Until the Wi-Fi network goes live, SAP Center attendees should still be able to connect via an in-stadium distributed antenna system (DAS) run by AT&T, which also carries Verizon Wireless signals.

New Report: Carolina Panthers build new Wi-Fi and DAS; Mercedes-Benz Stadium update, and more!

Mobile Sports Report is pleased to announce the Q3 issue of our STADIUM TECH REPORT series, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace.

In addition to our historical in-depth profiles of successful stadium technology deployments, our Q3 issue for 2016 has additional news and analysis, including a look at Wi-Fi analytics at the Mall of America, and a story about how the Cleveland Browns found $1 million in ROI using new analytics software from YinzCam. Download your FREE copy today!

Inside the report our editorial coverage also includes:

— Bank of America Stadium profile: An in-depth look at the Carolina Panthers’ decision to bring new Wi-Fi and DAS networks in-house;
— Mercedes-Benz Stadium profile: An early look at the technology being built into the new home of the Atlanta Falcons, with an emphasis on fiber;
— T-Mobile Arena photo essay: A first look at the newest venue on the famed Las Vegas Strip;
— Avaya Stadium profile: How the stadium’s Wi-Fi network became the star of the MLS All-Star game.

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, Crown Castle, SOLiD, CommScope, JMA Wireless, Corning, Samsung Business, Xirrus, Huber+Suhner, ExteNet Systems, DAS Group Professionals and Boingo Wireless. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to thank you for your interest and support.

Wi-Fi network powers rich data collection at Mall of America

Fans greet One Direction at Mall of America. Credit: Tony Nelson (click on any photo for a larger image)

If you’re shopping for mobile customer data, why not just go to the mall?

That’s what Minnesota’s Mall of America did, not by finding a service that sells such information, but by investing in a massive and complex Wi-Fi network, designed and deployed by AmpThink using Cisco gear. The service is free for Mall visitors; while it is an attractive guest amenity, it simultaneously provides the Mall with enough data to fill digital warehouses with information about what people do, both online and in the real world, while on the property.

According to Janette Smrcka, IT director for Mall of America, though the Mall is only at the start of its data analysis program, it is already seeing interesting results that will likely help the Mall better connect with its visitors and, in all likelihood, improve business results for Mall tenants. A recent month-long sampling of customer behavior data gave the Mall a tremendous amount of insight on how activities such as promotions and events affect visitor behavior, information the Mall wouldn’t have had without its Wi-Fi network.

“We’re just at the beginning of being able to use all of this valuable data and translate it into actionable information,” said Smrcka in an interview at Mall of America, located in Bloomington, nine miles south of Minneapolis.

Almost a Super Bowl of data every week

How much data are we talking about? In the world of stadium networks, the most recent Super Bowl at Levi’s Stadium in Santa Clara, Calif. set a single-day record with 26 terabytes of wireless data used – 15.9 TB on cellular networks and 10.1 TB on the stadium Wi-Fi. At Mall of America, from the launch of their Wi-Fi network during Thanksgiving weekend last year until May 2016, nearly 320,000 unique Mall visitors connected to the network, using a total of 486 TB of traffic – almost a Super Bowl of data per week.

Wi-Fi AP visible below Mall of America sign. Credit: Paul Kapustka / MSR

While the Mall might not match the single-day crush of a Super Bowl, the steady stream of visitors (Mall officials estimate that the six million square-foot facility sees an average of 109,000 visitors on weekdays and 160,000 on weekends) produces some staggering numbers. According to a recent public presentation about the network, Mall of America claimed that one month of Wi-Fi usage on its network equaled a full year of Wi-Fi activity on an NBA-sized stadium network.

According to Smrcka, the Mall knew it needed Wi-Fi connectivity as a table-stakes amenity, but it was mindful of fitting performance to price while still justifying the return on investment.

“We knew we needed something but the challenge was the cost,” Smrcka said. “We knew we couldn’t charge for the service.” Smrcka also said the Mall has seen other malls try and fail with initial Wi-Fi deployments due to subpar service, prompting guest disdain.

“We’ve seen some of our peers put Wi-Fi in, and not do it very well, and get lots of complaints,” Smrcka said. “It’s like airport Wi-Fi. Sure it’s there, but just try using it.”

With AmpThink, Mall of America found a partner that understood the need for high-quality deployments. The design and installation company has been behind the Wi-Fi networks at several Super Bowls, as well as recent networks built in stadiums like Kyle Field at Texas A&M and U.S. Bank Stadium in Minneapolis.

Wi-Fi ‘ball’ visible in middle of theme park area. Credit: Paul Kapustka / MSR

However, this massive Wi-Fi network, along with the unique challenges of implementing it at the Mall, still needed to demonstrate the elusive return on investment.

Not easy to build inside a mall

On a recent early morning tour of the Mall, AmpThink president Bill Anderson showed Mobile Sports Report some of the challenges inherent to one of the world’s biggest shopping venues. For starters, there was the need for custom enclosures that fit the facility’s overall aesthetics, as well as the need to cut through double firewalls (the real kind, not the software kind) while keeping safety codes intact.

To cover one of the Mall’s more unique spaces, the 7-acre theme park in the center of the facility, AmpThink had to design and build enclosures that look like big lollipops. This “Wi-Fi ball on a stick” design fit the park’s design aesthetics while providing coverage in and around the various rides and amusement spaces. AmpThink also figured out how to fit a Wi-Fi AP inside digital sign kiosks, so the kiosks could connect to the network and therefore the Cisco StadiumVision system for digital display management. In many places, the APs included beacons inside, setting up the network for device proximity capabilities.

From an RF perspective, Anderson said one of the toughest challenges was keeping interference to a minimum between APs on different floors of the multi-level mall. Another large challenge was simply the logistics of construction, with separate scheduling needed for the many hundreds of Mall tenants.

The mall’s many levels make it a tough place to tune RF. Credit: Paul Kapustka / MSR

“We own the space above their ceilings, and would need to get in the stores to run cable through,” said Smrcka. Coordinating construction was a challenge at times, like when teenage clerks didn’t relay scheduling messages to their store managers, further complicated by the need to have security officers present to keep an eye on inventory.

Anderson said that AmpThink deployment teams also needed to make sure they cleaned up after putting in APs, as any drywall dust found on store facades could result in complaints from store owners. Despite the extra hurdles, deployment of the network, composed of more than 600 APs, started in July of last year and launched just before Thanksgiving. Then the data started pouring in. Now, what to do with all that information?

Putting students to work

To help figure out how to best use the stream of information coming its way, the Mall conducted a study of its data in a partnership with graduate students at the Carlson Analytics Lab at the University of Minnesota’s Carlson School of Management. By mapping visitor behavior using Wi-Fi and beacon activity – tracking where shoppers arrived, where they walked and how long they spent in different areas of the Mall – Mall of America and the student researchers were able to uncover interesting stats on things like in-mall promotions, events and store appeal.

Mall kiosk with Wi-Fi inside to drive the Cisco StadiumVision software. Credit: Paul Kapustka / MSR

According to the Mall’s presentation at the recent SEAT Conference in Las Vegas, one analysis showed that if visitors were offered free admission to the amusement park, they actually spent 40 percent of their time at the Mall somewhere other than the amusement park — a sign that free amusement park entry could spur more shopping. The analysis also showed that during events at the Mall – according to the Mall, it hosts more than 400 special events a year – visitors stayed at the Mall on average 1.4 times longer than visitors who did not attend an event. The Mall also found out that 39 percent of event attendees visited the Mall’s food courts, compared to 25 percent of non-event visitors.

During its presentation, the Mall also showed screenshots of interactive “heat maps” showing exactly where visitors entered, where they walked, and how long they stayed. This information was gathered by the Wi-Fi AP beacons, which allowed for accurate device location tracking. With such information at their fingertips, the Mall sees a future where the network helps initiate new features for assisted shopping and custom experiences for visitors without resorting to historic feedback systems like surveys or focus groups.

Data driving the future

“This data is golden when it comes to describing shopper behavior,” said Smrcka, who also talked about the deployment at the SEAT Conference in Las Vegas. Shopper surveys, she said, have proven to sometimes not be reliable, and “who has the time to sit in a focus group for hours?”

How many malls do you know that have a One Direction tribute photo? Credit: Paul Kapustka / MSR

This information enhances other Mall guest behavior and experience initiatives. “We already have a social media command team watching geo-located social media posts,” Smrcka said. The Mall also employs a text messaging system, which visitors use to send messages to customized numbers to report that a bathroom needs servicing, set a reminder for their parking spot, or find out where the closest gelato stand is.

And while Mall of America, like other brick-and-mortar retailers, competes every day against online shopping, Smrcka said there are plenty of people who still want to see and feel the goods they are purchasing. Mall of America plans to use its Wi-Fi network’s data to make the guest experience even better and to study the feasibility of possible future services like personal shopping, valet parking, curbside pickup and home delivery.

Moreover, Smrcka and her team can better segment and target the Mall’s visitors and their entertainment, dining and shopping needs.

“What we’re able to do [from analytics] is still changing from month to month,” Smrcka said. “But the data really empowers a team like ours.”