Minnesota United MLS home opener at Allianz Field sees 85 GB of Wi-Fi

One of the Cisco Wi-Fi APs installed by Atomic Data inside the new Allianz Field in St. Paul. Credit: Paul Kapustka, MSR (click on any picture for a larger image)

A chilly home opener for the Minnesota United soccer team at their brand-new Allianz Field saw 85 gigabytes of data used on the stadium’s Wi-Fi network, according to statistics provided by Atomic Data, the stadium’s technology provider.

With 19,796 fans on hand on April 13 to pack the $250 million venue, Atomic Data said it saw 6,968 unique Wi-Fi device connections, a 35 percent take rate. The Allianz Field Wi-Fi network uses Cisco gear, with 480 Wi-Fi APs installed throughout the venue; approximately 250 of those are located in the seating bowl, many installed under seats. The stadium also has a neutral-host DAS built by Mobilitie, though none of the wireless carriers are online yet. (Look for an in-depth profile of the Allianz Field network in our upcoming Summer STADIUM TECH REPORT issue!)

According to the Atomic Data figures, the stadium’s Wi-Fi network saw peak bandwidth usage of 1.9 Gbps; of the 85 GB total, download traffic was 38.7 GB and upload traffic was 46.3 GB. Enjoy some photos from the opening game (courtesy of MNUFC) and a couple from our pre-opening stadium tour!
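As a quick sanity check, the take rate and traffic totals above can be reproduced with a few lines of arithmetic; the minimal Python sketch below uses only the figures reported by Atomic Data (the variable names are ours):

```python
# Sanity-checking the Allianz Field opening-night Wi-Fi figures reported above.
# Variable names are illustrative; the numbers come from Atomic Data's stats.

attendance = 19_796        # fans on hand April 13
unique_devices = 6_968     # unique Wi-Fi device connections
download_gb = 38.7         # reported download traffic, in GB
upload_gb = 46.3           # reported upload traffic, in GB

take_rate = unique_devices / attendance
total_gb = download_gb + upload_gb

print(f"Take rate: {take_rate:.1%}")           # ~35.2%, matching the reported 35 percent
print(f"Total Wi-Fi data: {total_gb:.1f} GB")  # 85.0 GB, matching the reported total
```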

The game was opened with a helicopter fly-by


A look at the standing-area supporter end zone topped by the big Daktronics display

The traditional soccer scarves were handy for the 40-degree temperatures

A view toward the field through the brew house window

The main pitch gets its opening salute

Entryways were well covered with Wi-Fi to power the all-digital ticketing

The Loons have a roost!

The view as you approach the stadium crossing I-94

One of the under-seat Wi-Fi AP deployments

Message boards let fans know how to connect

Little Caesars Arena revs the engine on wireless

Little Caesars Arena in Detroit is revving its engine with wireless deployments of Wi-Fi and DAS. Credit all photos: Terry Sweeney, MSR (click on any picture for a larger image)

Detroit has made an ambitious bet on the sports entertainment model with its 50-block District Detroit development – which embraces Ford Field (where the NFL’s Lions play), Comerica Park (MLB’s Tigers) and most recently, Little Caesars Arena (NBA’s Pistons and NHL’s Red Wings).

In fact, Motor City might just as easily be renamed Stadium City as Detroit looks to professional sports as one cornerstone of economic re-development.

The city has all four major pro sports teams competing within a few blocks of each other, noted John King, vice president of IT and innovation for Olympia Entertainment and the Detroit Red Wings. District Detroit plays host to more than 200 events, welcoming some 3 million visitors annually – not bad for an area that’s barely 18 months old.

Detroit’s hardly alone in riding this development wave. Sports entertainment districts are a proven engine to boost local economies and are popping up all over the country:
–Los Angeles’s LA Live complex uses the Staples Center as its hub but includes restaurants, hotels and plenty of retail;
–Houston’s Avenida district groups together Minute Maid Park, BBVA Compass Stadium and NRG Stadium, along with a convention center and hotels;
–The Battery Atlanta houses the Atlanta Braves’ SunTrust Park and a Coca-Cola entertainment facility, along with retail, residences and hotels;
–Westgate Entertainment District in the greater Phoenix area houses State Farm Stadium (NFL’s Cardinals) and Gila River Arena (NHL’s Coyotes), plus the obligatory retail, restaurants and hotels.

San Francisco, Kansas City, Cincinnati, Sacramento and other cities are building out similar sports entertainment developments in their downtown areas that encourage sports fans to make a night of it, or even a weekend. Even venerable venues like Green Bay’s Lambeau Field and Chicago’s Wrigley Field are getting in on the act, building out areas outside the parks to keep fans engaged (and spending) before and after events, or even when no games are being played.

Robust DAS, Wi-Fi in LCA

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi and DAS networks being planned for the University of Colorado, as well as a profile of Wi-Fi at Vivint Smart Home Arena in Salt Lake City! DOWNLOAD YOUR FREE COPY now!

John King oversees the IT operations at Little Caesars Arena

King is pleased with the performance of the IT infrastructure at Little Caesars Arena since the $863 million venue opened in the fall of 2017. With a backbone of two 100-Gbps fiber connections, the arena counts more than 700 Cisco Wi-Fi access points. There are 364 APs in the bowl itself; the bulk of those – 300 APs – have been installed under seats to get the signals closer to where the users are.

Mobile Sports Report put LCA’s Wi-Fi network and DAS system to the test this season during a Red Wings home game against the New York Rangers. Due to personal technical constraints, we were only able to test Verizon’s portion of the DAS deployment; the Wi-Fi network tested was the District Detroit Xfinity SSID.

The good news is that both network types performed admirably. No surprise that bandwidth was most plentiful and speeds were fastest on concourses near concessions, as well as in the private clubs scattered around LCA. Fastest measured speeds: 139.68 Mbps download / 33.24 Mbps upload on the DAS network outside the MotorCity Casino Club. The Wi-Fi was also well engineered there – 51.89 Mbps download and 72.34 Mbps upload were plenty fast for hockey’s power users.

We measured comparable speeds by the Rehmann Club with 134.4 Mbps/36.25 Mbps on the DAS, and 21.56 Mbps/120.8 Mbps on Wi-Fi. Similarly, connectivity was not an issue while standing in front of the impossible-to-miss Gordie Howe statue in LCA’s main concourse, where we clocked DAS at 102.95 Mbps/22 Mbps, and Wi-Fi at 43.34 Mbps/43.72 Mbps.

Speeds around the arena were generally in double-digit megabits, both for Wi-Fi and DAS. The Wi-Fi signal got a little sluggish in Section M7 (0.79 Mbps/3.03 Mbps) and Section M33 (1.68 Mbps/29 Mbps). Lowest measured throughput on the DAS network was in Suite 17 with 16.18 Mbps/17.41 Mbps, still plenty fast to handle most fan requirements.

Lighting Things Up in District Detroit

In tandem with LCA, there are approximately 1,000 additional APs attached to the network that either handle District Detroit’s public Wi-Fi or connect 34 parking lots and garages.

Wireless gear painted to blend in

“Our goal is to bring life and excitement throughout the District and not just focus on Little Caesars Arena,” King said. Video and digital signage are essential to that effort, both inside and outside LCA. The network enables more than 1,500 IPTV connections distributed across the arena, but also externally to LED boards and electronic parking signs. “We want to take the excitement from the event and run it out to the city – ‘5 minutes to puck drop’, on all those signs as one example,” King explained. “We can leverage [signage] for more than just the price of parking.”

The network uses the Cisco Vision IPTV digital display management system to control display programming, including advertising that appears on video screens in LCA’s many hospitality suites. With five TV screens per suite, LCA deploys an L-shaped “wrapper” around the main video image used for advertising. “We rotate that content in the suites and run loops in concourse before and after events,” King said. “It allows us to put scripting in different zones or post menus and dynamically update prices and items for sale.” LCA’s concessionaires can change the price or location of food and beverage items, all through the networked point-of-sale system.

Tune-able Wi-Fi

The District Detroit app is divided into three “buckets,” according to King: Detroit Red Wings, Detroit Pistons and 313 Presents — all the events and entertainment outside of sporting events (313 is Detroit’s area code). When configured for hockey, LCA can accommodate up to 19,515 Red Wings fans; as a basketball arena for the Pistons, LCA holds 20,491. But some events may draw fewer people and King and his team adjust accordingly.

“We’re an arena for 20,000 fans and as we looked at that density, we found that 10,000 fans behave differently and we’ve had to tune the arena differently based on traffic flows,” he said. When the arena is completely full, Wi-Fi signals must pass through many “bags of water,” as RF engineers sometimes describe human spectators. With half as many fans, the signals propagate differently; consequently, a fan may connect to an AP that’s less than ideal, which can affect both user experience and system performance.

An under-seat Wi-Fi enclosure

“We’ve looked at some power tweaks and tuning; we also have the ability to tune [the arena] on the fly,” King said, but emphasized that the venue’s Wi-Fi doesn’t get re-tuned for every event. “We try to find the sweet spot and not do that too much. On an event day, we try not to touch anything that isn’t broken,” he said.

Previews of coming attractions

Like any sports and entertainment IT exec, King is looking at ways to improve the fan experience and derive more performance and revenue from Olympia’s IT investment. Buoyed by the success of mobile ticketing at LCA, King said he’d like to find some way to use biometrics to help speed up transactions at counters and pedestals throughout the arena. And he’s excited about 5G cellular deployment, which he believes could compete with Wi-Fi if 5G delivers on all that’s been promised by carriers.

LCA’s app uses Bluetooth for navigation, letting fans input their seat information for directions. “Right now, we have pre-order pickup, but in-seat service is something we’re looking at. What other line-busting technologies can we do?” King said.

And while fans can pre-order food and beverages at LCA, King also wonders if pre-ordering of team merchandise (“merch”) is something that would appeal to fans and be easy to execute. “We’re looking at a Cincinnati venue where they have compartments for food, hot or cold, that’s been pre-ordered,” he said, wondering if a similar compartmentalized pickup system could be used for merch.

King sees plenty of room for improvement in overall management reporting across IT systems at LCA and the 12,000 active ports that keep systems humming.

“Everything is connected and our electricians can use their iPads to dim or turn on lights anywhere in the building,” he said, adding that everything’s monitored — every switch, every port. “It would be nice to see more information around traffic flow and performance patterns. We’re seeing a little bit of that. But I’d like to see network information on people tracking and doors, and correlate visual information with management data.”

Another set of metrics King can’t get at the moment: Performance data from AT&T, T-Mobile and Verizon about LCA’s 8-zone DAS system. King said he’s talking with Verizon, the lead DAS operator at the venue, about getting automated reports in the future, but for the time being King and his team don’t have much visibility there. The DAS uses the Corning ONE system.

Amalie Arena’s MatSing-powered DAS ready for Women’s Final Four

MatSing ball antennas seen behind championship banners at Amalie Arena. Credit all photos: MatSing (click on any photo for a larger image)

The new DAS at Amalie Arena in Tampa, which uses 52 MatSing ball antennas, is fully operational and ready for this weekend’s NCAA Women’s Final Four, which starts on Friday.

According to AT&T, which operates the new DAS, the new network “is officially on-air,” after going through some test runs during Tampa Bay Lightning NHL games. According to one source, AT&T CEO John Donovan (an old friend of MSR) attended a recent hockey game at Amalie and gave a big thumbs-up to the new DAS, which is the biggest known installation of the unique MatSing antennas – basically huge spheres with lots of directional cellular antennas inside.

A press release from AT&T about the new DAS claims it has boosted cellular capacity inside Amalie Arena by 400 percent over last year. The new DAS also uses Corning ONE gear on the back end.

MSR will be in Minneapolis this weekend at the other Final Four, so if you are in Tampa for the women’s tourney take a speedtest or two on cellular and let us know what you see. We are watching the DAS deployment at Amalie Arena carefully since it is our guess that it won’t be the last you hear of MatSing deployments this year. Some more photos from the Amalie Arena MatSing deployment below.

Super Bowl recap: 24 TB for Wi-Fi, 12 TB for DAS

Pats fans celebrate with a selfie at the end of Super Bowl 53. Credit all photos: Mercedes-Benz Stadium (click on any picture for a larger image)

Super Bowl 53 at Atlanta’s Mercedes-Benz Stadium rewrote the record book when it comes to single-day stadium Wi-Fi, with 24.05 terabytes of traffic seen on the stadium’s network. That is a huge leap from the official 16.31 TB seen at last year’s Super Bowl 52 in Minneapolis at U.S. Bank Stadium.

According to official statistics provided by Extreme Networks, new high-water marks were set last Sunday in every category of network measurement, including an amazing 48,845 unique users on the network, a take rate of 69 percent of the 70,081 in attendance to watch the New England Patriots beat the Los Angeles Rams 13-3. The average Wi-Fi data use per connected fan also set a new record, with 492.3 megabytes per user eclipsing last year’s mark of 407.4 MB.
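Those take-rate and per-fan figures follow directly from the totals; here is a minimal sketch of the arithmetic, assuming the decimal convention of 1 TB = 1,000,000 MB (variable names are ours, not Extreme Networks’):

```python
# Reproducing the Super Bowl 53 Wi-Fi averages quoted above.
# Assumes decimal units (1 TB = 1,000,000 MB); names are illustrative only.

total_tb = 24.05        # total Wi-Fi traffic reported by Extreme Networks
unique_users = 48_845   # unique devices on the network
attendance = 70_081     # announced attendance

take_rate = unique_users / attendance              # ~69.7 percent
mb_per_user = total_tb * 1_000_000 / unique_users  # ~492.4 MB per connected fan

print(f"Take rate: {take_rate:.1%}")
print(f"Average use per connected fan: {mb_per_user:.1f} MB")
```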

While fans might have preferred some more scoring excitement during the game, the lack of any tense moments in network operations was a perfect outcome for Danny Branch, chief information officer for AMB Sports & Entertainment.

“I was ecstatic on how [the network] executed, but honestly it was sort of uneventful, since everything went so well,” said Branch in a phone interview the week after the game. Though network performance and fan usage during some of the big events leading up to the Super Bowl had Branch thinking the Wi-Fi total number might creep near the 20-terabyte range, the early network use on game day gave Branch a clue that the final number might be even higher.

“When I saw the initial numbers that said we did 10 [terabytes] before kickoff we didn’t know where it would end,” Branch said. “When we were watching the numbers near the end of the game, we were just laughing.”

Aruba APs and AmpThink design shine

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi and DAS networks being planned for the University of Colorado, as well as a profile of Wi-Fi at Little Caesars Arena in Detroit! DOWNLOAD YOUR FREE COPY now!

Digital device use once again set records at the NFL’s championship game.

The Wi-Fi gear from Aruba, a Hewlett Packard Enterprise company, in a design from AmpThink – some 1,800 APs installed inside Mercedes-Benz Stadium, with most of the bowl-seating APs located underneath the seats – also saw a peak throughput rate of 13.06 Gbps at halftime. The peak number of concurrent network users, 30,605, also took place during the halftime show, which featured the band Maroon 5 (whose show played to mixed reviews).

Extreme Networks, which provides Wi-Fi analysis in a sponsorship deal with the NFL, had a great list of specific details from the event. Here are some of the top-line stats:

Need proof that people still watch the game? Out of the 24.05 TB total, Extreme said 9.99 TB of the traffic took place before the kickoff, followed by 11.11 TB during the game and halftime, and another 2.95 TB after the game concluded.

On the most-used apps side, Extreme said the most-used social apps were, in order of usage, Facebook, Instagram, Twitter, Snapchat and Bitmoji; on the streaming side, the most-used apps were iTunes, YouTube, AirPlay, Spotify and Netflix. The most-used sporting apps by fans at the game were, in order, ESPN, NFL, the Super Bowl LIII Fan Mobile Pass (the official app for the game), CBS Sports (which broadcast the game live) and Bleacher Report.

Did Verizon’s offload spike the total?

While Super Bowl Wi-Fi traffic has grown significantly each year since we started reporting the statistics, one reason for the bigger leap this year may be that Verizon Wireless used its sponsorship relationship with the NFL to acquire its own SSID on the Mercedes-Benz Stadium Wi-Fi network.

Hard copy signage in the stadium helped direct fans to the Wi-Fi.

According to Andrea Caldini, Verizon vice president for network engineering in the Eastern U.S., Verizon had “autoconnect in play,” which meant that any Verizon customer with Wi-Fi active on their device would be switched over to Wi-Fi when inside the stadium.

“It’s going to be a good offload for us,” said Caldini in a phone interview ahead of the Super Bowl. While Verizon also claimed to have seen “record cellular traffic” during Super Bowl Sunday, a spokesperson said Verizon will no longer release such statistics from the game.

According to Branch, the NFL helped fans find the Wi-Fi network with additional physical signage that was put up just for the Super Bowl, in addition to rotating messages on the digital display screens around the stadium.

“The venue was well signed, we really liked what they [the NFL] did,” Branch said. Branch said the league also promoted the Wi-Fi link throughout the week, with a common ID at all the related Super Bowl activity venues, something that may have helped fans get connected on game day.

No issues with the DAS

One part of the wireless mix at Mercedes-Benz Stadium, the cellular distributed antenna system, was under scrutiny after a lawsuit emerged last fall in which technology supplier IBM sued Corning over what IBM said was faulty installation. While Corning has disputed the claims, over the past year IBM, the Falcons and the NFL all said they got the DAS in working order, and according to Branch “all the carriers were pleased” with its operation during the Super Bowl.

There was only one, but it helped increase the wireless traffic.

According to Branch, the Falcons saw 12.1 TB of traffic on the in-stadium DAS on Super Bowl Sunday, including some traffic that went through the MatSing ball antennas. Branch said the two MatSing balls, which hang from the rafters around the Halo Board video screen, were turned back on to assist with wireless traffic on the field during the postgame awards ceremony.

Overall, the record day of Wi-Fi traffic left Branch and his team confident their infrastructure is ready to support the wireless demands of more big events into the future, including next year’s NCAA men’s Final Four.

“Until you’ve taken the car around the track that fast, you don’t really know how it will perform,” Branch said. “But so much work was done beforehand, it’s great to see that it all paid off.”

New Report: Record Wi-Fi at Super Bowl 53, and Wi-Fi and DAS for Colorado’s Folsom Field

MOBILE SPORTS REPORT is pleased to announce the Spring 2019 issue of our STADIUM TECH REPORT series, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace.

Our string of historical in-depth profiles of successful stadium technology deployments continues with reports from the record-setting Wi-Fi day at Super Bowl 53, a look at the network performance at Little Caesars Arena, plans for Wi-Fi and DAS at the University of Colorado and more! Download your FREE copy today!

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, JMA Wireless, Corning, Boingo, MatSing, and Cox Business/Hospitality Network. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome readers from the Inside Towers community, who may have found their way here via our ongoing partnership with the excellent publication Inside Towers. We’d also like to thank the SEAT community for your continued interest and support.

PGA Tour gives CBRS a test

Volunteers track shots with lasers on the fairways of PGA Tour tournaments. Credit: Chris Condon/PGA TOUR (click on any photo for a larger image)

CBRS technology doesn’t need spiked shoes to gain traction on the fairways, if early results from technology tests undertaken by the PGA Tour at courses around the country are any indication.

A recent 14-state test run by the top professional U.S. golf tour tapped the newly designated Citizens Broadband Radio Service (CBRS), which comprises 150 MHz of spectrum in the 3.5 GHz band. Golf courses, which typically lack the dense wireless coverage of more populated urban areas, are easily maxed out when thousands of fans show up on a sunny weekend to trail top-ranked players like Brooks Koepka, Rory McIlroy or perennial favorite Tiger Woods.

To cover the bandwidth needs of tournaments, the PGA Tour has over time used a mix of technologies, many portable in nature given the short stay of a tournament at any given course. As with the Wi-Fi and temporary cellular infrastructures used in the past, the hope is that CBRS will help support the public safety, scoring and broadcast applications required to keep its events operating smoothly and safely, according to the PGA Tour.

“We’re looking at replacing our 5 GHz Wi-Fi solution with CBRS so we can have more control over service levels,” said Steve Evans, senior vice president of information systems for the PGA Tour. Unlike 5 GHz Wi-Fi, CBRS is licensed spectrum and less prone to the interference the Tour occasionally experienced.

CBRS will also make a big difference with ShotLink, the Tour’s wireless data collection system, which gathers data on every shot made during competition play – distance, speed and other scoring data.

“CBRS would help us get the data off the golf course faster” than Wi-Fi can, Evans explained. “And after more than 15 months of testing we’ve done so far, CBRS has better coverage per access point than Wi-Fi.”

The preliminary results are so encouraging that the Tour is also looking to CBRS to carry some of its own voice traffic and has already done some testing there. “We need to have voice outside the field of play, and we think CBRS can help solve that problem,” Evans added.

But CBRS is an emerging technology, and it’s important to acknowledge its limitations. Compatible handsets aren’t widely available; the PGA Tour has been testing CBRS prototypes from Essential. Those units operate only in LTE bands 42 and 43, which overlap the CBRS range; band 48, the designated CBRS band, is expected to be added by device makers sometime in the first half of 2019.

“We’re waiting for the phones to include band 48 and then we’ll test several,” Evans told Mobile Sports Report. “I expect Android would move first and be very aggressive with it.”

CBRS gear mounted on temporary poles at a PGA Tour event. Credit: PGA Tour

The PGA Tour isn’t the only sports entity looking at CBRS’s potential. The National Football League is testing coach-to-coach and coach-to-player communications over CBRS at all the league’s stadiums; the NBA’s Sacramento Kings are testing it at Golden 1 Center with Ruckus; NASCAR has been testing video transmission from inside cars using CBRS along with Nokia and Google; and ISM Raceway in Phoenix, Ariz., recently launched a live CBRS network that it is currently using for backhaul to remote parking lot Wi-Fi hotspots.

Outside of sports and entertainment, FedEx, the Port of Los Angeles and General Electric are jointly testing CBRS in Southern California. Love Field Airport in Dallas is working with Boingo and Ruckus in a CBRS trial; service provider Pavlov Media is testing CBRS near the University of Illinois at Urbana-Champaign with Ruckus gear. Multiple service providers from telecom, cable and wireless are also testing the emerging technology’s potential all around the country.

Where CBRS came from, where it’s going

Editor’s note: This profile is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new game-day digital fan engagement strategy at Texas A&M, as well as a profile of Wi-Fi at Mercedes-Benz Stadium, home of Super Bowl LIII in Atlanta! DOWNLOAD YOUR FREE COPY now!

CBRS has undergone a six-year gestation period; its 150 MHz of bandwidth was carved out of the 3.5 GHz band and must be shared with – and not interfere with – U.S. government radar operations already using that same spectrum.

From a regulatory perspective, CBRS’s experimental status is expected to give way to full commercial availability in the near future. Consequently, wireless equipment vendors have been busy building – and marketing – CBRS access points and antennas for test and commercial usage. But entities like the PGA Tour have already identified the benefits and aren’t waiting for the FCC to confer full commercial status on the emerging wireless technology.

CBRS equipment vendors and would-be service providers were hard to miss at last fall’s Mobile World Congress Americas meeting in Los Angeles. More than 20 organizations – all part of the CBRS Alliance – exhibited their trademarked OnGo services, equipment and software in a day-long showcase event. (Editor’s note: “OnGo” is the alliance’s attempt to “brand” the service as something more marketable than the geeky CBRS acronym.)

The CBRS Alliance envisions five potential use cases of the technology, according to Dave Wright, alliance president and director of regulatory affairs and network standards at Ruckus:
• Mobile operators that want to augment the capacity of their existing spectrum;
• Cable operators looking to expand into wireless services without paying to operate as a mobile virtual network operator (MVNO);
• Other third-party providers looking to offer fixed broadband services;
• Enterprise and industrial applications: extending or amplifying wireless in business parks and remote locations, plus Internet of Things data acquisition;
• Neutral host capabilities, which some have likened to LTE roaming, an important development as 5G cellular services ramp up.

Previously, if customers wanted to extend cell coverage inside a building or a stadium, their best option was often distributed antenna systems (DAS). But DAS is complicated, expensive and relies on carrier participation, according to Wright. “Carriers also want to make sure your use of their spectrum doesn’t interfere with their macro spectrum nearby,” he added.

CBRS uses discrete spectrum not owned by a mobile operator, allowing an NFL franchise, for example, to buy CBRS radios and deploy them around the stadium, exclusively or shared, depending on their requirements and budgets.

More CBRS antenna deployment. Credit: PGA Tour

On a neutral host network, a mobile device would query the LTE network to see which operators are supported. The device would then exchange credentials with the mobile carriers – CBRS and cellular – after which permissions are granted, the user is authenticated, and their usage info gets passed back to the carrier, Wright explained.
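Wright’s description of that attach flow can be summarized in pseudocode form; the sketch below is purely illustrative (it models the sequence of steps he describes, not any real LTE or 3GPP API), and every name in it is ours:

```python
# Illustrative-only model of the neutral-host attach sequence described above;
# none of these functions correspond to real LTE/3GPP or vendor APIs.
from dataclasses import dataclass

@dataclass
class Session:
    operator: str
    authenticated: bool
    usage_mb: float = 0.0  # accounting info reported back to the home carrier

def attach_to_neutral_host(device_id: str, home_operator: str,
                           supported_operators: list[str]) -> Session | None:
    # 1. The device queries the broadcast network information to learn which
    #    mobile operators the neutral-host CBRS network supports.
    if home_operator not in supported_operators:
        return None

    # 2. Credentials are exchanged with the home carrier and the user is
    #    authenticated (simulated here by a stand-in helper).
    if not exchange_credentials(device_id, home_operator):
        return None

    # 3. A session is established; usage is tallied and passed back to the
    #    home carrier when the session ends.
    return Session(operator=home_operator, authenticated=True)

def exchange_credentials(device_id: str, operator: str) -> bool:
    # Stand-in for the real authentication handshake; always succeeds here.
    return True
```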

With the PGA Tour tests, the Essential CBRS devices get provisioned on the network, then connect to the CBRS network just like a cell phone connects to public LTE, Evans explained. The Tour’s custom apps send collected data back to the Tour’s network via the CBRS access point, which is connected to temporary fiber the Tour installs. And while some of Ruckus’s CBRS access points also support Wi-Fi, the Tour uses only the CBRS. “When we’re testing, we’re not turning Wi-Fi on if it’s there,” Evans clarified.

While the idea of “private LTE” networks supported by CBRS is gaining lots of headline time, current deployments would require a new SIM card for any devices wanting to use the private CBRS network, something that may slow down deployments until programmable SIM cards move from good idea to reality. But CBRS networks could also be used for local backhaul, using Wi-Fi to connect to client devices, a tactic currently being used at ISM Raceway in Phoenix.

“It’s an exciting time… CBRS really opens up a lot of new opportunities,” Wright added. “The PGA Tour and NFL applications really address some unmet needs.”

CBRS on the Fairways

Prior to deploying CBRS access points at a location, the PGA Tour surveys the tournament course to create a digital image of every hole, along with other data to calculate exact locations and distances between any two coordinates, like the tee box and the player’s first shot or the shot location and the location of the hole. The survey also helps the Tour decide how and where to place APs on the course.
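The Tour doesn’t publish the math behind those distance calculations, but as one hedged illustration of the kind of computation involved, a standard great-circle (haversine) distance between two surveyed latitude/longitude points, converted to yards, might look like this (the coordinates in the example are made up):

```python
# One way a tee-to-ball distance could be computed from surveyed lat/lon points.
# Purely illustrative; the PGA Tour does not publish its actual survey math.
from math import radians, sin, cos, asin, sqrt

def distance_yards(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle (haversine) distance between two points, in yards."""
    earth_radius_m = 6_371_000
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    meters = 2 * earth_radius_m * asin(sqrt(a))
    return meters * 1.09361  # meters -> yards

# Example: a tee box and a drive landing roughly 310 yards away (made-up coordinates)
print(round(distance_yards(33.7400, -84.3100, 33.7425, -84.3105)))
```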

Courses tend to be designed in two different ways, according to the PGA Tour’s Evans. On some courses, the majority of the holes are adjacent to one another, creating a more compact layout; other courses are routed through neighborhoods and may snake around, end to end.

“In the adjacent model, which is 70 percent of the courses we play, we can usually cover the property with about 10 access points,” Evans explained.

Adjacent-style courses where the PGA Tour has tested CBRS include Ridgewood Country Club in Paramus, N.J.; Aronimink Golf Club in Newtown Square, Penn.; and East Lake Golf Club in Atlanta.

In the second model, where the holes are strung back to back, the PGA Tour may have to deploy as many as 18 or 20 APs to get the coverage and throughput it needs. That’s the configuration used during a recent tournament at the TPC Summerlin course in Las Vegas, Nev., Evans told Mobile Sports Report.

On the course, CBRS APs get attached to some kind of structure where possible, Evans added. “Where that doesn’t make sense, we have portable masts we use – a tripod with a pole that goes up 20 feet,” he said. The only reason he’d relocate an AP once a tournament began is if it caused a problem with the competition or fan egress. “We’re pretty skilled at avoiding those issues,” he said.

A handful of PGA Tour employees operates the ShotLink system, which also relies on an army of volunteers – as many as 350 at each tournament – who help with data collection and score updates (that leaderboard doesn’t refresh itself!). “There’s a walker with each group, recording data about each shot. There’s technology for us on each fairway and green, and even in the ball itself, as the ball hits the green and as [a] player hits putts,” said Evans.

The walker-volunteers relay their data back to a central repository; from there, ShotLink data gets sent to PGA Tour management and is picked up by a variety of consumers, including onsite TV broadcast partners; the pgatour.com website; players, coaches and caddies; print media; and mobile devices.

In addition to pushing PGA Tour voice traffic over to CBRS, the organization is also looking to the technology to handle broadcast video. “We think broadcast video capture could become a [CBRS] feature,” Evans said. The current transport method, UHF video, is a low-latency way to get video back to a truck where it can be uploaded for broadcast audiences.

A broadcast program produced by the organization, PGA Tour Live, follows two groups on the course; each group has four cameras and producers cut between each group and each camera. That video needs to be low latency and highly reliable, but UHF transmission makes it expensive.

Once 5G standards are created for video capture, the PGA Tour could use public LTE to bond a number of cell signals together. Unfortunately, that method has higher latency. “It’s fine for replay but not for live production,” Evans said, though the approach is expected to improve over time. “The idea is eventually to move to outside cameras with CBRS and then use [CBRS] for data collection too,” he added. “If we could take out the UHF cost, it would be significant for us.”

In the meantime, the Tour will continue to rely largely on Cisco-Meraki Wi-Fi and use Wi-Fi as an alternate route if something happens to CBRS, Evans said. “But we expect CBRS to be primary and used 99 percent of the time.”