Small company delivers big Wi-Fi for Minnesota United at Allianz Field

The standing section at Allianz Field for the opening game this spring. Credit: Minnesota United

Fans at the new Allianz Field in St. Paul are the beneficiaries of a big project done by a small company to bring solid fan-facing Wi-Fi to the new 19,400-seat home arena for the Minnesota United FC MLS team.

The striking new $250 million facility, opened in April just off the highway that connects Minneapolis to St. Paul, is a looker from first sight, especially at night when the multi-colored lights in its curving outside shell are lit. Inside, the clean sight lines and close-to-the-pitch seating that seem a hallmark of every new soccer-specific facility are accompanied by something that’s not as easy to detect: a solid fan-facing Wi-Fi network with approximately 480 Cisco access points, in a professional deployment that wouldn’t seem out of place at a larger facility, like an NFL stadium.

Actually, the Wi-Fi network inside Allianz Field is somewhat more conspicuous than many other deployments, mainly because instead of hiding or camouflaging the APs, most have very visible branding, letting visitors know that the Wi-Fi is “powered by” Atomic Data.

Who is Atomic Data? Though perhaps better known for its data-center and enterprise managed-services prowess, the 215-person Minneapolis-based firm also has a developing track record in stadium technology deployments, including a role on the IT support team for the launch of U.S. Bank Stadium two years ago. In a unique arrangement, Atomic Data paid for and owns the network infrastructure at Allianz Field, providing fan-facing Wi-Fi as well as back-of-house connectivity as a managed service to the team and to internal venue vendors like concessionaires.

LOCAL PARTNER EARNS TEAM’S TRUST

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi network at Chesapeake Energy Arena in Oklahoma City, and an in-depth research report on the new Wi-Fi 6 standard! DOWNLOAD YOUR FREE COPY now!

While new stadium builds often look for network and technology firms with a bigger name or longer history, Atomic Data was well known to the Minnesota team, having been a sponsor even before the club moved up to MLS.

One of the Cisco Wi-Fi APs installed by Atomic Data inside the new Allianz Field in St. Paul. Credit: Paul Kapustka, MSR

Chris Wright, CEO of MNUFC, credited a longtime relationship with Atomic Data CEO Jim Wolford, whom Wright had known from his days with the NBA’s Timberwolves and the WNBA’s Lynx.

“They [Atomic Data] are a very strong local company and we knew of their work, including at U.S. Bank Stadium,” Wright said. “Jim has also been a huge advocate of the [soccer] club, even before they moved to MLS. Their history is solid, and they [Atomic Data] have an incredible reputation.”

As the team prepared to move into its under-construction home, Wright said, a robust wireless network originally wasn’t in the cards.

“The original plan was not to have a robust Wi-Fi network,” Wright said, citing overall budget concerns as part of the issue. But when he was brought in as CEO, he was looking for a way to change direction and deliver a more digital-focused fan experience – and he said that by expanding the Atomic Data partnership, the company and the team found a way to make it happen.

As described by both Wright and Atomic Data, the deal includes having Atomic Data pay for and own the Wi-Fi network components, and also to act as the complete IT outsourcer for the team, providing wired and wireless connectivity as a managed service.

“When you look at the demographic of our fans, they’re mostly millennials, and we wanted to have robust connectivity to connect with them,” Wright said. “Over time we were able to negotiate a deal [with Atomic Data] to build what I think is the most capable Wi-Fi network ever for a soccer-specific venue. I think we’ve turned some heads.”

UNDER SEAT AND OUTSIDE THE DOORS

Just before the stadium hosted its first league game, Mobile Sports Report got a tour of the facility from Yagya Mahadevan, enterprise project manager for Atomic Data and sort of the live-in maestro for the network at Allianz Field. Mahadevan, who worked on the U.S. Bank Stadium network deployment before joining Atomic Data full-time, was clearly proud of the company’s deployment work, which fit in well with the sleek designs of the new facility.

An under-seat AP deployment at Allianz Field. Credit: Paul Kapustka, MSR

For the 250 APs in the main seating bowl, Atomic Data used a good amount of under-seat AP deployments, since many of the seats have no overhang. A mix of overhead APs covers the seating areas that do have structures overhead, and more APs – clearly noticeable, including some painted white to pop out against black walls and vice versa – are mounted along concourse walkways as well as on the outside of the main entry gates. Since Allianz Field is a paperless-ticketing facility, Mahadevan said Atomic Data paid special attention to the entry gates to make sure fans could connect to Wi-Fi to access their digital tickets.

Wright, who called Atomic Data’s devotion to service “second to none,” noted that before the first three games at the new stadium, Atomic Data had staff positioned in a ring around the outside of the field, making sure fans knew how to access their tickets via the team app and the Wi-Fi network.

“The lines to get in were really minimized, and that level of desire to deliver a high-end experience is just the way they think,” Wright said of Atomic Data.

According to Atomic Data, the network is backed by two redundant 10-Gbps backbone pipes (from CenturyLink and Consolidated Communications) and is also set up to provide secure Wi-Fi connectivity to the wide range of independent retail and concession partners. Mahadevan also said the network has a number of redundant cable drops already built in, in case more APs need to be added in the future. The stadium also has a cellular distributed antenna system (DAS) built by Mobilitie, but as of early this spring none of the carriers had yet deployed gear.

Even the chilly temperatures at the team’s April 13 home opener didn’t keep fans from trying out the new network, as Atomic Data said it saw 85 gigabytes of Wi-Fi data used that day, with 6,968 unique Wi-Fi device connections, a 35 percent take rate from the sellout 19,796 fans on hand. According to the Atomic Data figures, the stadium’s Wi-Fi network saw peak Wi-Fi bandwidth usage of 1.9 Gbps on that opening day; of the 85 GB Wi-Fi data total, download traffic was 38.7 GB and upload traffic was 46.3 GB.
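For readers who like to check the math, the take rate and per-device averages fall straight out of those reported figures. This quick back-of-envelope sketch (purely illustrative; all inputs are the numbers reported above) reproduces them:

```python
# Back-of-envelope check of the opening-day Wi-Fi figures reported above.
# All inputs come from the article; the arithmetic is just illustrative.
unique_devices = 6_968   # unique Wi-Fi device connections
attendance = 19_796      # sellout crowd on April 13
download_gb = 38.7       # reported download traffic, GB
upload_gb = 46.3         # reported upload traffic, GB

take_rate = unique_devices / attendance
total_gb = download_gb + upload_gb
avg_mb_per_device = total_gb * 1024 / unique_devices

print(f"Take rate: {take_rate:.1%}")           # ~35%, matching the article
print(f"Total Wi-Fi data: {total_gb:.1f} GB")  # 85.0 GB
print(f"Average per device: {avg_mb_per_device:.0f} MB")
```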

According to Wright, the stadium has already had several visits from representatives of other clubs, all interested in the networking technology. His advice to clubs thinking about or building new stadiums: get on the horn with Atomic Data.

“I tell them if you’re from Austin or New England, you should be talking to Atomic,” Wright said. “They should try to replicate the relationship we have with them.”

Venue Display Report: Small directories do big job at Mall of America

The Mall of America turned to small digital directories to solve a big wayfinding problem. Credit all photos: Paul Kapustka, VDR

With hundreds of stores, shops and restaurants – and an embedded amusement park – being big only starts to describe the breadth of the Mall of America. Yet to help guests find what they are looking for and to find their way around the 5.6 million square-foot property, the Mall went small with its digital-display directories, a winning move that has produced more than 10 million interactions since going fully live just more than a year ago.

Once upon a time, the Bloomington, Minn.-based Mall of America was no different from any other shopping mall when it came to directories. In what was something of a shopping-center tradition, the Mall of America had four-sided standalone structures with four-foot-wide printed displays, crammed with maps and lists of locations. But while big printed directories may have been the way malls always did things, they didn’t fit in with the Mall of America’s recent moves to embrace digital technology to improve the guest experience.

“Those old directories were monstrosities, and they were obsolete the moment you printed them,” said Janette Smrcka, information technology director for the Mall of America. During a visit this spring by Venue Display Report to the Mall, Smrcka said attempts to put granular information on the printed maps – like stacked graphics showing the multiple floor levels close together – produced a mostly frustrating experience for guests.

“You had to walk around these things, and it was really difficult to find anything, because there are so many stores,” Smrcka said. “And the maps were kind of information overload. We found guests didn’t react well to them.”

Going digital for directories

Editor’s note: This profile is from our most recent issue of our VENUE DISPLAY REPORT series, where we focus on telling the stories of successful venue display technology deployments and the business opportunities these deployments enable. This issue also contains profiles of the new big video board at Oracle Park in San Francisco, and an in-depth look at display technology at U.S. Bank Stadium during the Final Four! START READING the issue today!

Thin lighted poles show where directories are located, without blocking the view

For the technology-forward Mall of America – which installed a high-definition Wi-Fi network throughout the property a few years ago – digital touch-screen directories seemed a logical next step. According to Smrcka, the generational shift toward touchscreen devices like phones and tablets, and the emergence of similar devices in many public places like airports and restaurants, have produced a public that is far more comfortable with touching a display.

“Five or 10 years ago a touchscreen directory might have been too soon, but now very few people are hesitant [to use touchscreens],” Smrcka said. “Everything is a touchscreen, and people expect it. Everyone feels comfortable [using them].”

An important caveat for Mall of America, Smrcka said, was finding a way to make the mall directory experience more personal and private, like using an ATM.

“We always knew we wanted the screens to be smaller,” Smrcka said. Some other shopping centers that the Mall of America team had scouted had larger interactive displays, which Smrcka said could produce a “creepy” feeling since personal searches could potentially be viewed by people walking by.

“We felt like we wanted the screens to be a size where your body could be a shield,” Smrcka said. “Nobody needs to know what I’m looking for.”

As part of its deployment strategy, the Mall of America followed its agile development ethos and rolled out a small number of test units live in the Mall in late 2016. The displays, about the size of an iMac desktop screen tilted vertically, are from Aopen, and use a Chromebox commercial base for the operating system. A local Minneapolis-area wayfinding solutions firm, Express Image, provided the programming, and without much fanfare, the Mall flipped the switch and let its guests interact with the devices to see what happened.

Reducing search times to under a minute

“We didn’t exactly stalk people, but we did watch them [using the directories],” Smrcka said. Though some of the features enabled by the devices – like a search field – were obvious adds, exposing other services like maps and wayfinding wasn’t as straightforward.

“There is a real problem of how do you logistically show 5 million square feet,” Smrcka said.

The start screen exposes some of the most-used directory services.

After watching users interact, the Mall of America has settled on a 2D mapping feature that can, if users choose, show an animated path from where they are to their desired destination.

“The focus is how quickly can we help guests find what they are looking for,” Smrcka said.

After pulling the trigger to roll out 100 of the directories in mid-2017, the Mall’s IT team was rewarded with extensive usage analytics, which they put into an immediate feedback loop to improve the directories’ feature list and what was shown to users first.

“By far, the number one search was for restrooms,” Smrcka said. The Mall took that information and now has a prominent button on the main screen that will quickly show users the closest restrooms to that spot.

“There’s nothing like that immediate need,” Smrcka said. “That was a quick win.”

Analyzing more of the data helps the Mall’s IT team do a better job of predicting what users are looking for when they misspell store names, or if users are having difficulty with directions. According to Smrcka some data analysis showed that one physical location of a set of directories was causing confusion since “people told to take a left turn ended up inside a Cinnabon.” Moving the directories around the nearby corner helped improve the directions feature, she said.
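The misspelled-search problem Smrcka describes is commonly handled with fuzzy string matching. As a purely illustrative sketch – the store list, function name and threshold here are invented, not the Mall’s actual system – Python’s standard difflib gets surprisingly far:

```python
import difflib

# Hypothetical slice of a store directory; the real system's store
# list and matching logic are not public.
STORES = ["Nordstrom", "Nickelodeon Universe", "Caribou Coffee",
          "Sea Life Minnesota Aquarium", "Cinnabon"]

def suggest(query: str, n: int = 3) -> list[str]:
    """Return the closest store names for a possibly misspelled query."""
    return difflib.get_close_matches(query, STORES, n=n, cutoff=0.5)

print(suggest("Nordstorm"))  # a common transposition still finds Nordstrom
```

A production system would likely weight results by popularity and proximity as well, but a similarity cutoff like this is the core idea.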

The directories got a good workout during Final Four weekend.

The problem of how to show directions to places on different levels of the mall was solved by having a different screen for each level; the animated directions will even advance the pathways up and down escalators or stairways. Smrcka is also proud of the Mall’s desire to make information as real-time as possible; that effort includes a kind of “mall hack” in which cheap power meters feed information into the directory system to let guests know if, say, an escalator is temporarily out of service.
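The power-meter trick is a nice example of inferring equipment status from a cheap side channel. As a minimal sketch – the threshold, wattage figures and function are assumptions for illustration, since the article doesn’t describe the real integration – the directory system only needs a simple rule:

```python
# Illustrative "mall hack": infer whether an escalator is running from
# its power draw. Threshold and readings are invented for this sketch.
RUNNING_WATTS = 500.0  # assume an idle or stopped unit draws far less

def escalator_status(watts: float) -> str:
    """Map a raw power-meter reading to a guest-facing status string."""
    if watts >= RUNNING_WATTS:
        return "in service"
    return "temporarily out of service"

print(escalator_status(3200.0))  # prints "in service"
print(escalator_status(12.0))    # prints "temporarily out of service"
```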

“Little things like that make a big difference,” said Smrcka.

In a casual mall walkaround, VDR observed many guests taking turns at the numerous directory locations, seeming to find what they need quickly without any obvious confusion. According to Smrcka, the directories have now logged more than 10 million interactions, with the average interaction time at 38.98 seconds.

Some features in the directories, like the ability for users to enter their phone number to get information via text message, may take longer to take off, Smrcka said. “There are always going to be some people who don’t have the comfort level to put their number into a public device,” Smrcka said. But overall, the small digital directories have added up to a huge success.

Top-down approach brings Wi-Fi to OKC Thunder’s Chesapeake Energy Arena

Chesapeake Energy Arena, home of the NBA’s Thunder. Credit all photos: Oklahoma City Thunder

If there’s one sure thing about stadium Wi-Fi deployments, it’s that pretty much no two networks are ever exactly the same. So even as there is a growing large-venue trend for putting Wi-Fi access points under seats or in handrails, sometimes the traditional top-down method is still the one that works best.

Such was the case for the first full fan-facing Wi-Fi network at Chesapeake Energy Arena in Oklahoma City, home of the NBA’s Thunder. With a large amount of retractable seating in the 18,000-seat venue, an under-seat approach to Wi-Fi would prove too costly and disruptive, leading the team to look for connectivity from above.

While a solid in-building cellular distributed antenna system (DAS) had done a good job of keeping fans connected over the last few years, the team’s desire for more mobile insight into fan activity, as well as a switch to a Wi-Fi-centric point-of-sale system, led Oklahoma City to finally install fan-facing Wi-Fi throughout the venue.

Chris Nelson, manager of information technology for venue manager SMG, and Tyler Lane, director of technology for the Thunder, spoke with Mobile Sports Report about the recent Wi-Fi deployment at Chesapeake Energy Arena, which went live during the most recent NBA season.

An AP placement in the rafters

Though the venue looked at all options, Nelson said that going under-seat with APs would have been “very costly” to do, given the large number of retractable seats in the arena.

“We wanted to hang them [APs] from the top if we could,” Nelson said.

After testing the top equipment brands available, the Thunder settled on Ruckus gear for what they said was a simple reason, one involving the 96 feet of air space from the catwalk to the arena floor.

“Ruckus was the only one whose gear could reach down all the way,” Nelson said.

Adding to the fan experience

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi network at Allianz Field in St. Paul, Minn., and an in-depth research report on the new Wi-Fi 6 standard! DOWNLOAD YOUR FREE COPY now!

According to the team, the deployment used 410 total APs, with 350 in the arena proper and another 60 deployed across the street at the Cox Convention Center. According to the Thunder’s Lane, the team rolled out the service slowly at first, with some targeted testing and feedback from season-ticket holders.

Close-up of an AP placement

“We got some good feedback and then when we went to a full rollout we had signage in the concourses, communications via ticketing services and announcements over the PA and on the scoreboard,” to tell fans about the system, said Lane.

According to statistics provided by the team, the Wi-Fi was getting good traction as the season went on, with a March 16 game vs. the Golden State Warriors seeing 589.3 gigabytes of traffic, from 2,738 clients that connected to the network. Lane said the team employed Jeremy Roach and his Rectitude 369 firm to assist with the network design; Roach in the past helped design networks at Levi’s Stadium and Sacramento’s Golden 1 Center.

Now that the Wi-Fi network is in place, Lane said the Thunder is starting to increase the ways it can add to the fan experience via digital means, including app-based features like showing press conferences live and by having an artificial intelligence chatbot to help provide fans with arena information.

“It’s really all about enhancing the fan experience,” Lane said, with an emphasis on driving digital ticketing use in the YinzCam-developed team app. Lane said that the system also drives a lot of mobile concessions traffic, and added that “Ruckus did a fantastic job of asking all the right questions for our food and beverage partners.”

Temporary courtside network helps set Final Four Wi-Fi records

A temporary under-seat Wi-Fi network helped bring connectivity to courtside seats at this year’s Final Four. Credit all photos: Paul Kapustka, MSR

One of the traditional characteristics of the Final Four is the yearly travel scramble of the fortunate fans and teams who have advanced to the championship weekend. Somehow, with only a week’s notice, plane flights, road trips and hotel rooms get scheduled and booked, leading to packed houses at college basketball’s biggest event.

On the stadium technology side, a similar last-minute fire drill happens just about every year as well, as hosting venues reconfigure themselves to host basketball games inside cavernous buildings built mainly to hold football crowds. At this year’s NCAA Men’s Final Four at U.S. Bank Stadium in Minneapolis, the stadium tech team and partner AmpThink were able to quickly construct a temporary Wi-Fi network to cover the additional lower-bowl seating. The new capacity was part of a record-setting Wi-Fi network performance at the venue, with single-day numbers surpassing those from Super Bowl 52, held in the same building the year before.

The Wi-Fi numbers, both staggering and sobering, especially to venues next in line for such bucket-list events, totaled 31.2 terabytes for the two days of game action, according to figures provided by the NCAA. For the semifinal games on Saturday, April 6, U.S. Bank Stadium’s Wi-Fi network saw 17.8 TB of traffic, topping the 16.31 TB used during Super Bowl 52 on Feb. 4, 2018. The Saturday semifinals also set an attendance record for the venue, with 72,711 on hand, topping the 67,612 in attendance for Super Bowl 52.

During the championship game on April 8, U.S. Bank Stadium saw an additional 13.4 TB of data used on the Wi-Fi network, giving the venue three of the top four single-day Wi-Fi numbers we’ve reported, with this year’s mark of 24.05 TB at Super Bowl 53 in Atlanta the only bigger number. Saturday’s take rate at U.S. Bank Stadium, however, surpassed even the most-recent Super Bowl, with 51,227 unique users on the network, a 70 percent take rate.

‘Like building an arena network inside a football stadium’

Switches for the temporary network were deployed under the seat scaffolding.

There’s no doubt that the temporary network installed by AmpThink and the U.S. Bank Stadium IT team contributed a great deal to the final Wi-Fi totals, with 250 access points installed in the additional seats. Like at other football venues that are transformed into basketball arenas, U.S. Bank Stadium had temporary seating installed on all four sides of the stadium, with temporary risers stretching down over football seating as well as with risers built behind both baskets. More seats were installed on the “floor” of the football field, right up to the elevated court set in the middle. The temporary APs, like the existing ones in the stadium, are from Cisco.

“There are a lot more moving parts to a Final Four than to a Super Bowl,” said David Kingsbury, director of IT for U.S. Bank Stadium, describing the difference in providing the networking and technical underpinnings for each event. While planning for the networks was obviously done far in advance, the actual buildout of the temporary Wi-Fi couldn’t even begin until the additional seating was in place, a task that finished just five days before the first game was played.

That’s when AmpThink deployed a staff of 12 workers to start connecting cables to APs and to switches, while also adding in another 700 wired network connections to the courtside areas for media internet and TV monitor connections. Like it does for every venue network it designs and deploys, AmpThink came to the stadium equipped with a wide assortment of lengths of pre-terminated cables, preparation that made the fast deployment possible.

“If we had to spin raw cable and terminate it on site, we never would have been able to finish in five days,” said AmpThink president Bill Anderson.

AmpThink’s previous experience in deploying such temporary networks under temporary seating — including at the previous year’s Final Four in San Antonio — taught the company that it would also need protection for under-seat switch deployments, to fend off the inevitable liquid spills from the seats above. That requirement was potentially even more necessary at U.S. Bank Stadium, since this year’s Final Four was the first to allow in-venue sales of alcoholic beverages.

Some temporary seats were deployed on top of existing lower bowl seats.

With some of the temporary seating installed over existing seating, there were 95 APs in the existing handrail-enclosure design that had to be turned off for the Final Four, according to Kingsbury. The 250 new APs added were all installed under the folding chairs, in enclosures that simply sat on the floor.

According to AmpThink’s Anderson, the company did learn a lesson at U.S. Bank Stadium — that it will, at future events, need to secure the actual enclosures since during the weekend curious fans opened a few of the boxes, with one AP disappearing, perhaps as an interesting IT souvenir.

In San Antonio, AmpThink had zip-tied the enclosures to chairs, which led to increased labor to detach the devices during the post-event breakdown. While having no such measures at U.S. Bank led to a fast removal — AmpThink said it had removed all the temporary network elements just seven hours after the championship-game confetti had settled — for next year’s Final Four AmpThink plans to at least zip-tie the enclosures shut so that fans can’t attempt any ad hoc network administration.

More APs for back of house operations

Another difference between the Final Four and the Super Bowl is that four teams, not two, are in attendance for a full weekend, necessitating temporary “work rooms” adjacent to each school’s locker-room area. The media work center for a Final Four is also typically larger than that of a Super Bowl, again because four cities and their attendant media outlets are on site instead of two.

A concourse speed test taken just after halftime of the final game.

“We had to cover a lot of places in the stadium that we don’t normally cover” with wireless and wired network access, Kingsbury said; an additional 30 APs were needed for team rooms and the main media workspace, which were located on the field level of the stadium in the back hallways. An interesting note at U.S. Bank Stadium: the yards and yards of fabric used as curtains to cover the clear-plastic roofing and wall areas were actually beneficial to Wi-Fi operations, since they cut off some of the reflective interference caused by ETFE surfaces.

According to Kingsbury the final count of active APs for the Final Four was 1,414, a number reached by adding in the temporary APs while deducting the ones taken offline. Not included in the official NCAA traffic numbers was an additional 3 TB of traffic seen during the free-admission Friday practice sessions, when 36,000 fans visited the stadium, with 9,000 joining the Wi-Fi network.

From the official stats, the peak concurrent-user number from Final Four Saturday of 31,141 was also an overall record, beating Super Bowl 53’s mark of 30,605. (Super Bowl 53 had 70,081 fans in attendance for the Feb. 3 game between the New England Patriots and the Los Angeles Rams.) Monday’s championship game (won by Virginia, 85-77 over Texas Tech in overtime) produced big numbers itself, with 13.4 TB of total data used, 48,449 unique connections and 29,487 peak concurrent users (out of 72,062 in attendance). Monday’s game also produced a peak throughput of 11.2 Gbps just after the game ended.
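The record figures above are easy to sanity-check; this quick sketch (purely illustrative, with all inputs taken from the reported numbers) reproduces the weekend total and Monday’s take rate:

```python
# Sanity check of the Final Four Wi-Fi records reported above.
semifinals_tb = 17.8      # Saturday, April 6
championship_tb = 13.4    # Monday, April 8
super_bowl_52_tb = 16.31  # Feb. 4, 2018, same building

weekend_total_tb = semifinals_tb + championship_tb
monday_take_rate = 48_449 / 72_062  # unique users / attendance

print(f"Weekend total: {weekend_total_tb:.1f} TB")  # 31.2 TB
print(f"Monday take rate: {monday_take_rate:.0%}")  # ~67%
print(semifinals_tb > super_bowl_52_tb)             # True: Saturday alone topped SB52
```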

None of those totals could have been reached without the temporary network, which AmpThink’s Anderson compared to “building a 10,000-seat arena network inside a football stadium.” Next stop for a temporary Wi-Fi network is Mercedes-Benz Stadium in Atlanta, where the 2020 Final Four awaits.

This is what your football stadium looks like with a championship basketball game inside of it.

The temporary center-hung scoreboard was able to play video programming onto the court surface.

The NBA on TBS crew was courtside for the Final Four.

The secret to keeping your network operations room running? All kinds of energy inputs.

Final Four displays, new Giants scoreboard, all in the new VENUE DISPLAY REPORT!

Mobile Sports Report is pleased to announce the second issue of our new VENUE DISPLAY REPORT, with in-depth profiles of display technology at the Final Four, a huge new video board for the San Francisco Giants at Oracle Park, and the innovative directory displays at the Mall of America. No need to sign up or register — just click on the image below and start reading the issue today!

A new vertical-specific offering of MSR’s existing STADIUM TECH REPORT series, the VENUE DISPLAY REPORT series will focus on telling the stories of successful venue display technology deployments and the business opportunities these deployments enable. Like its sibling Stadium Tech Report series, the Venue Display Report series will offer valuable information about cutting-edge deployments that venue owners and operators can use to inform their own plans for advanced digital-display strategies.

Our reporting and analysis will be similar to that found in our popular STR series, with stadium and venue visits to see the display technology in action, and interviews and analysis with thought leaders to help readers better inform their upcoming technology purchasing decisions. And in case you are new to the MSR world, rest assured that all our VDR reports will be editorially objective, done in the old-school way of real reporting. We do not accept paid content and do not pick profiles based on any sponsorship or advertising arrangements.

This second issue is packed with real-world information, including how U.S. Bank Stadium uses the Cisco Vision IPTV display management system to help run the 2,000-plus digital displays inside and around the venue. We also take a good look at the huge new video board installed for this season at Oracle Park in San Francisco, and also bring you an in-person profile of the innovative directory display system at the Mall of America.

Start reading the second issue now! No download or registration necessary. You can also go back and view our inaugural VDR issue for more great information!

As venues seek to improve fan engagement and increase sponsor activation, display technology offers powerful new ways to improve the in-stadium fan experience. While these topics are of prime interest to many of our long-term audience of stadium tech professionals, we suggest that you share the link with colleagues on the marketing and advertising sales side of the house, as they will likely find great interest in the ROI enabled by strategic display system deployments.

Sponsorship spots are currently available for future VDR series reports; please contact Paul at kaps at mobilesportsreport.com for media kit information.

New Report: Wi-Fi 6 research report, record Wi-Fi at the Final Four, and more!

MOBILE SPORTS REPORT is pleased to announce the Summer 2019 issue of our STADIUM TECH REPORT series, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace.

Our latest issue contains a research report on the new Wi-Fi 6 standard and what it means to stadium networks, as well as three separate profiles of Wi-Fi network deployments, including a look at how a temporary network helped fans use record data totals at the Final Four! Download your FREE copy today!

Inside the report our editorial coverage includes:

— A Wi-Fi 6 research report that looks into the new standard’s technology improvements that make it a great bet for in-venue networks;
— An in-person report from the NCAA Men’s 2019 Final Four at U.S. Bank Stadium, where the weekend saw a record 31+ terabytes of Wi-Fi data used;
— How Minnesota United’s new home, Allianz Field, got a big Wi-Fi network from a small company, Atomic Data;
— A look at the new Wi-Fi network at Chesapeake Energy Arena, home of the NBA’s Oklahoma City Thunder.

Download your free copy today!

We’d like to take a moment to thank our sponsors, which for this issue include Mobilitie, JMA Wireless, Corning, Boingo, MatSing, Cox Business/Hospitality Network, ExteNet, Neutral Connect Networks, Atomic Data, Oberon, and American Tower. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome readers from the Inside Towers community, who may have found their way here via our ongoing partnership with that excellent publication. And as always, we thank the SEAT community for your continued interest and support.