Ookla shares Speedtest data from CenturyLink Field, other stadiums

Ookla ad banner being flown over CenturyLink Field in Seattle. Credit: Ookla

Anyone who follows Mobile Sports Report knows that I use the Speedtest app from Ookla to measure stadium network performance whenever I visit a sporting venue. While my one-man tests do give some measure of network muscle, I always dreamed of harnessing results from many fans at the same game to get a better picture of network performance.

Well, Speedtest’s creators think along the same lines, and conducted an experiment during an Aug. 25 Seattle Seahawks preseason game at CenturyLink Field in Seattle. You can read their very thorough post and neat results here, with some interesting twists — for instance, the cellular networks are way faster than the CenturyLink Wi-Fi, according to the Ookla results.

UPDATE: Ookla responded to our email and let us know that on Aug. 25, there were 252 Speedtests at CenturyLink Field, a great sampling to draw results from. Ookla also talked about tests from 12 different events at CenturyLink Field, and said in the email that across those events it saw 1,143 tests conducted.

Ookla also published test result totals from other stadiums, including Levi’s Stadium, AT&T Stadium and Bank of America Stadium, but didn’t say when those tests were recorded or how many tests were taken.

What we really like, however, is that Ookla’s tests show what our stadium tech report surveys have been showing — that overall, in-stadium network performance is steadily improving. Over time, more data like this can help dispel the still-lingering rumor that stadium networks don’t deliver good connectivity. Now if we could only get Ookla to partner with us to do league-wide or college-comparison speedtests… anyone ready for that idea?

Will cellular carrier aggregation matter in stadium networks?

Kauffman Stadium during 2015 World Series

Over the past few days, both Sprint and Verizon Wireless have made announcements about a technique called “carrier aggregation” (CA for short) for LTE cell networks that basically bonds together different frequency channels to bring more bandwidth to a mobile device. Though the premise sounds great, what we here at MSR HQ haven’t been able to ascertain yet is whether the technique will help solve the biggest problem in stadium networks, namely providing enough capacity for all the users on them.

Sprint has made the most noise this week, with claims of CA demonstrations at Soldier Field in Chicago and Kansas City’s Kauffman Stadium that (it said) showed Sprint devices bonding three different frequency channels to hit download speeds of 230 Mbps, a score way off the charts for any existing stadium network. (The fastest Wi-Fi and cellular speeds we’ve seen in our short history of stadium tests, by comparison, are in the 60 Mbps range.) Verizon made a similar announcement about CA being rolled out across its network, without specifying whether the service would be available in stadiums. Other carriers, including AT&T and T-Mobile, are also exploring use of the CA technique. At the very least, some lucky users with newer devices may see leaps in performance thanks to CA deployments, a good thing on any level.

But our bigger question — which hasn’t been answered in the press releases and hasn’t (yet) been answered in email questions to Sprint or Verizon — is whether CA will help with overall network capacity, which to us seems a more pressing problem at most stadiums than peak download speeds. I mean, demos are great and it’s cool to see what the upper limit is for one device; but it would be more impressive if Sprint could guarantee that 230 Mbps mark to every device in the park, should everyone there have a Sprint phone capable of performing the CA trick (not all devices on the market today can).

Finally using the Clearwire spectrum

What’s also not completely revealed in the press releases is what kind of gear is necessary on the back end of the network to make CA work, and whether it makes economic sense to put that gear inside stadiums to enable the technique for as many fans as possible. While we understand the basic premise probably better than most (in a former life yours truly spent several years following and analyzing the Clearwire spectrum holdings at 2.5 GHz), it’s not clear that CA solves any congestion problems, especially for carriers other than Sprint, which have only a limited amount of licensed spectrum in each market they serve.

(Without getting too deep into spectrum geekiness, Sprint on paper probably has more room to grow in the CA space since its 2.5 GHz holdings dwarf other carriers’ licensed bands; but to make use of that spectrum, you need customers with devices that can use that spectrum, and enough cash for a wide network buildout, both of which Sprint may be challenged to find.)

As we understand CA, bonding channels makes one device faster because it has more aggregate bandwidth to work with. But it’s not clear that using CA in a stadium environment would make the overall situation any faster than, say, three phones each using a single channel. Also, since you can’t create new bandwidth, if one phone starts tapping three different channels, doesn’t that actually leave less room for other devices that may also want to use those channels? Perhaps with CA the connections would be faster and wouldn’t last as long, thereby freeing up spectrum for other devices; again, there’s not a lot of information yet on the capacity side of the equation, especially in crowded stadiums or at big events where bandwidth needs escalate. If there are any cellular wizards in the audience with more knowledge of the situation, feel free to chime in.
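
To make the capacity question concrete, here is a rough back-of-the-envelope sketch in Python. It is purely our own illustration with invented numbers (not anything from Sprint, Verizon or the LTE spec), but it shows why bonding channels can raise one phone's peak speed without creating any new capacity for the crowd to share.

```python
# Back-of-the-envelope model of the carrier aggregation (CA) capacity question.
# All numbers below are invented for illustration; they are not carrier data.

CHANNEL_CAPACITY_MBPS = 75  # hypothetical throughput of one LTE channel
CHANNELS_AVAILABLE = 3      # channels a 3xCA-capable phone could bond


def peak_speed_mbps(bonded_channels: int) -> float:
    """Peak download speed for a single device bonding this many channels."""
    return bonded_channels * CHANNEL_CAPACITY_MBPS


def average_share_mbps(active_users: int) -> float:
    """Average per-user share when the same channels are split among a crowd."""
    total_capacity = CHANNELS_AVAILABLE * CHANNEL_CAPACITY_MBPS
    return total_capacity / active_users


if __name__ == "__main__":
    # One phone in a demo can bond all three channels for a headline number...
    print(f"Demo peak, one 3xCA phone: {peak_speed_mbps(3):.0f} Mbps")

    # ...but CA doesn't add spectrum, so a full bowl still divides the same pie.
    for fans in (100, 1_000, 10_000):
        print(f"{fans:>6} active users -> ~{average_share_mbps(fans):.2f} Mbps each")
```

Note that the point Saw makes below, that faster CA sessions finish sooner and free up airtime for everyone else, isn't captured by a static split like this; that dynamic effect is exactly the part we'd like to see measured in a packed stadium.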

We did get an email response from our old friend John Saw, formerly of Clearwire and now chief technical officer at Sprint. Here’s his explanation of why CA is a good thing for stadiums:

Essentially, sites with bonded channels will drive higher capacities. This will be especially timely and helpful in crowded spaces like Soldier Field where there are surges in capacity demand during live sporting events. Sprint customers with CA enabled phones will enjoy 2X (in the case of 2CA) or 3X (in the case of 3CA) their download speeds, which means that they will get a better data experience with a bigger pipe. But wait – CA will lift all boats and it will also benefit those Sprint customers who have not upgraded to CA enabled phones yet. While they may not enjoy the higher peak speeds enabled by CA phones, their phones will have access to more network resources which means they will also have a better data experience, with no stalling or without that dreaded “windmill effect” in a crowded stadium.

I kind of understand what Saw is talking about here, but I am still having a problem with the math that says all boats will be lifted through the use of CA. Plus, experience and interviews have taught us that across the country, Sprint is behind Verizon and AT&T when it comes to DAS deployments inside stadiums; and it’s not clear (and hasn’t been answered) whether CA can work over a neutral-host DAS deployment where carriers share antennas and other infrastructure.

From an industry-wide standpoint, CA seems like a great thing for all cell phone users, since as it progresses devices should be able to use whatever bandwidth is around to improve performance. It’s also good to see more technology advancements made on the network side of things, since infrastructure needs all the help it can get to keep up with devices. But right now, we’re not sure CA is the answer to any of the capacity problems stadium network operators face. Anyone with views that can expand the explanation, feel free to hit the comments section below or send me an email at kaps at mobilesportsreport.com.

Wi-Fi a winner at Avaya Stadium’s MLS All-Star game

Just before game time at Avaya Stadium for the 2016 MLS All Star game. Credit all photos: Paul Kapustka, MSR

On the pitch it was the Arsenal lads emerging victorious by a 2-1 score over the Major League Soccer All-Stars, but in the stands it was Avaya Stadium’s Wi-Fi network that won the day at the 2016 MLS All-Star game Thursday in San Jose, Calif.

Unlike a year ago at the Avaya Stadium opening, when we found the fan-facing Wi-Fi a bit lacking, the Wi-Fi network performed at top speeds for almost all of our tests during an MSR “walkaround” before and during the MLS All-Star game. Unfortunately, we can’t say the same about the cellular network performance in and around Avaya Stadium, with many signals so weak on both the AT&T and Verizon Wireless networks that in most places we couldn’t conduct a speed test.

Leaving aside the question of why a DAS hasn’t been installed yet at Avaya Stadium, it was great to see the Wi-Fi network consistently hit download and upload speeds in the mid-to-high 20 Mbps range in just about every spot of the U-shaped, soccer-centric arena. In the main seating areas, in the concourses below the stands and around the huge open-air bar, there was solid connectivity, fueled by what looked like a lot more APs than we saw during visits to the venue last year.

While some of the AP installs looked like last-minute fixes (we saw several instances where paper binder clips were used, Phil Mickelson-style, to secure wiring to metal beams), there was certainly a noticeable amount of extra equipment, especially on the stanchions that loom out over the seating area. There, it seemed like every beam, or at least every other beam, had three sets of paired APs, which no doubt helped produce the 28.93 Mbps / 27.44 Mbps speed test we took in the middle of the seating area (in section 120, at the closed end zone). Last year, it was a challenge to get a good reading in the middle of the seats.

The top speed test we got Thursday night was outside a sausage stand at one corner of the open end zone, where the meter hit 44.00 Mbps down and 33.49 Mbps up just before game time. We were also impressed by the consistent coverage at the huge outdoor bar in the open end zone, helped no doubt by APs on top of the bar roof at each end, plus three APs per side above the bar server areas.

Somewhat ironically, the only place we couldn’t get a Wi-Fi signal was in our assigned “press box” seat, actually the upper back corner in the southwest part of the stadium. While we are guessing the problem may have been due to press-laptop overload (or some APs missing from what looked like normal install points), we noticed that by walking one section away from the press section we were able to reconnect to the regular stadium Wi-Fi network at a reading of 18.27/20.38 Mbps.

One of the many ‘doubled’ AP installs we saw

Most of the 18,000+ fans in the sellout crowd seemed to have no issues connecting to the Internet, as we saw fans heads-down on their devices in all parts of the venue. We did talk to one fan at halftime who said the lack of cellular connectivity outside the main gate kept him from being able to call up his digital tickets on his phone.

“But once I got inside I connected to Wi-Fi and everything was fine,” he said.

On the upper deck walkways we did finally get a fairly strong AT&T 4G LTE signal — 8.23/8.93 Mbps — which may have been due to a clear line of sight to the “eyeball” antenna we saw deployed in the VIP parking lot. And while we could make calls and send texts over our Verizon phone’s 4G LTE connection, trying to load a web page took so long we gave up. Moral of the story for Avaya Stadium fans: Make sure you hit the GOQUAKES SSID as soon as you’re near the stadium!

Enjoy the rest of our photos from our quick trip to San Jose, and know that while beers may cost $12.50 and shots of Jameson’s may set you back $12 each, at this year’s All-Star game all the Arsenal cheers and songs you ever wanted to hear were free of charge.

APs like this ringed the lower concourse wall areas

When the players walk through the concourse to take the pitch, it’s snapshot city

At the huge end zone bar, it was SRO all day long

AP visible in middle of roof section of bar

Pictures and selfies were the order of the day

The famed sausage stand AP with the 44 down reading. The bratwurst was good, too

The Avaya Stadium social media wall

Staying connected in the stands

Nice view from the upper deck

I think if I stuck around this group for one more beer I could have learned at least three Arsenal songs

AT&T eyeball antenna sighting in the VIP parking lot

Fan-facing Wi-Fi on hold as Coliseum gets ready for Rams’ return to Los Angeles

DAS antennas visible on the LA Coliseum’s facade

Normally when a new professional sports franchise comes to town or opens a new venue, preparations move into overdrive pretty quickly — especially for infrastructure like luxury suites, Wi-Fi and DAS.

But this is the Los Angeles Rams, and nothing about the team’s trajectory is normal, including the technology. After making good on a longstanding threat to move the team from St. Louis, owner Stan Kroenke in January broke ground on a new, $2.6 billion “NFL Disneyland” venue in the LA suburb of Inglewood, expected to be ready in time for the 2019 season. Until then, the returning Rams will play at the Los Angeles Coliseum.

While it has an established and robust DAS, the Coliseum has no dedicated, fan-facing Wi-Fi network, just as the Edward Jones Dome in St. Louis didn’t. But it’s LA, baby. Kanye. Jack. Celebrity sightings mean more bandwidth is needed, not less. Still, the first appearance of Wi-Fi at the venerable Coliseum won’t be for fans, but for operations.

Relying on DAS for fan wireless

Editor’s note: This profile is from our most recent STADIUM TECH REPORT, the Q2 issue which contains a feature story on Wi-Fi analytics, and a sneak peek of the Minnesota Vikings’ new US Bank Stadium. DOWNLOAD YOUR FREE COPY today!

Artist rendering of the proposed new LA football stadium

So when the Rams play their first preseason game at the Coliseum in August against the Dallas Cowboys, fans will have to rely on the DAS network for connectivity. AT&T, Verizon Wireless and, more recently, T-Mobile are the DAS carriers; Sprint trucks in a COW (cell on wheels) for USC games.

The Rams will bring in their own Wi-Fi for communications to and from the sidelines, along with their Microsoft Surface tablets, according to Derek Thatcher, IT manager at the Coliseum and an employee of USC, which oversees and administers the venue for Los Angeles County. There will also be private Wi-Fi in the locker rooms and the officiating rooms. Thatcher’s working closely with the Rams and the NFL, including one of the NFL’s frequency coordinators, to ensure everybody has the bandwidth they need.

Separate from the Rams and the NFL, USC is undertaking a major renovation of the Coliseum, home to the university’s storied football team. The $270 million project cost will be funded entirely by USC Athletics from capital gifts, sponsorship revenue, non-USC athletic events at the Coliseum, and donor naming opportunities. “The project will not require any student fees or general university, local, state or federal funds,” the university said on ColiseumRenovation.com.

In addition to a significant expansion of the luxury suites and the press box (which houses most of the IT and networking gear for the Coliseum), USC will also be adding public Wi-Fi and is talking with different vendors about requirements.

DAS antennas inside the concourse

It is worth noting that Aruba Networks provides wireless networking for most of the adjacent USC campus (more than 5,000 APs), including the Galen Center and USC’s healthcare facilities. Thatcher emphasized the bidding was open to all vendors.

Waiting for the new stadium to be built

The Coliseum’s current capacity is 93,600 and the NFL will use 80,000 seats; post-renovation, capacity will be 77,500, due to replacement of all seats and the addition of handrails to the aisles. The handrails are good news for Wi-Fi engineers, since the Coliseum’s bowl design has no overhangs to speak of for mounting APs; DAS antennas are mounted on poles that ring the stadium, and also above the bowl’s entry and exit tunnels. “Underseat AP design is expensive… we could end up with underseat and handrail,” Thatcher told Mobile Sports Report. “We’re looking at all possible solutions.”

The renovation is scheduled to begin right after USC’s 2017 football season ends and is expected to be done in time for the 2019 opener. The university said it will plan the construction schedule so that the 2018 season can still be played at the Coliseum.

The other wild card in the Coliseum’s future is Los Angeles’ bid to host the 2024 Summer Olympics. The city has done the honors before (1932, 1984), with the Coliseum serving as the Olympic Stadium both times; a third turn would be unprecedented. Because of LA’s experience, coupled with plenty of already-built sporting venues to handle a plethora of events and requirements, it was natural for the US Olympic Committee to turn to LA once Boston bailed.

The city’s bid includes $300 million for additional renovations to the Coliseum.

Rome, Paris and Budapest are also competing to host the games. The winner will be announced by the International Olympic Committee in September 2017.

Wi-Fi Analytics: Taking the first steps

Wi-Fi antennas at Joe Louis Arena. Credit: Detroit Red Wings (click on any photo for a larger image)

Even though the physical construction and deployment of a fan-facing Wi-Fi network seems like the biggest challenge facing a stadium’s information technology team, in reality everyone involved knows it’s just step one.

While turning on a live network is certainly a great accomplishment, once the data starts flowing the inevitable questions follow: Now that we have Wi-Fi, what do we do with it? How do we find out who’s using it and why? And how can we use that information to find better ways to improve the fan experience while also improving our business?

Those “step two” questions can only be answered by analytics, the gathering of information about Wi-Fi network performance and user activity. And while almost every live network operator puts performance numbers to work tuning the system almost immediately, plans to harvest and digest more personalized information like end-user identification, application use and fan engagement are just getting started, even at the most technically advanced stadiums with Wi-Fi networks in place.

What follows here are conversations with stadium tech professionals who are already running fan-facing Wi-Fi networks, exploring how they use Wi-Fi metrics and analytics both to enhance the game-day experience for fans and to build a base of information that can be used by technical staffs and marketing groups inside teams, schools and venue organizations.

Even this small sample seems to suggest that while Wi-Fi networks may be somewhat pervasive in the larger stadiums across the country, the harvesting and processing of data generated by digital fan engagement is just getting started, with plenty of unanswered questions and experiments that have yet to bear significant fruit. Yet everyone we spoke with also had an unshakable confidence that getting metrics and analytics right was the key to wireless success over the long haul, and all are fully engaged in pursuing that goal. It may take longer than physical deployment, but the “step two” of learning from the networks is well underway.

Detroit Red Wings: Pushing past the initial learning curve

Editor’s note: This profile is from our most recent STADIUM TECH REPORT, the Q2 issue which contains a feature story on Wi-Fi analytics, and a sneak peek of the Minnesota Vikings’ new US Bank Stadium. DOWNLOAD YOUR FREE COPY today!

Now that the Wi-Fi network at Joe Louis Arena in Detroit is coming up on its second birthday, Tod Caflisch said network administrators can relax a bit on game nights. Early on, however, he remembers “babysitting” the network during games, watching live performance stats to make sure everything was working correctly.

Watching the live network performance statistics, Caflisch said, “I could tell if there were issues. If throughput looked a little flat, we might have to reboot a switch. It was important, because there was so much at stake.”

As former director of information technology for the NHL’s Detroit Red Wings (he recently left Detroit and is now with the Minnesota Vikings), Caflisch helped drive the deployment of an Extreme Networks Wi-Fi network at the “Joe.” Though Joe Louis Arena is only going to host games a little while longer — a new downtown arena is just around the corner — Caflisch said the team in Detroit is already heading down the learning curve of interpreting analytics, with big goals on the horizon.

Right now, some of the most interesting network statistics have to do with fan Wi-Fi usage, including total tonnage, which Caflisch said hit 14 terabytes of data for the Red Wings’ home games this past season. That number is one and a half times bigger per game than the first year the network was in place, he said.

Big spikes for a score

One of the more interesting results came when Caflisch mapped network data to game action, an exercise that showed that hockey games may have bigger data spikes and troughs than other sports.

“We saw that traffic spikes corresponded with scores, and we also had huge spikes during intermissions,” Caflisch said. “And there were huge craters during the periods of regular action.”
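
For readers curious what mapping network data to game action might look like in practice, here is a minimal sketch with entirely invented timestamps and throughput figures (not the Red Wings' actual data or tooling) that lines up throughput samples with goals and intermissions and flags the spikes.

```python
# Toy example of mapping Wi-Fi throughput to game action.
# Every timestamp and throughput figure here is invented for illustration.

throughput_mbps = {   # minute into the event -> aggregate Wi-Fi throughput sample
    0: 310, 10: 150, 14: 650, 20: 160, 22: 640, 35: 140, 44: 700, 55: 150,
}
game_events = {       # minute -> what happened on the ice
    14: "home goal",
    22: "first intermission begins",
    44: "second intermission begins",
}

samples = sorted(throughput_mbps.values())
baseline = samples[len(samples) // 2]  # median throughput as a rough baseline

for minute in sorted(throughput_mbps):
    mbps = throughput_mbps[minute]
    event = f" ({game_events[minute]})" if minute in game_events else ""
    flag = "  <-- spike" if mbps > 2 * baseline else ""
    print(f"min {minute:>2}: {mbps:>4} Mbps{event}{flag}")
```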

While Caflisch said “it was kind of cool” to watch the network activity mapped to the game action, in the future he sees the ability for the Red Wings to use such actionable moments to better engage fans.

“There’s got to be some kind of marketing potential” to connect with fans during a network-activity spike, Caflisch said. What that is, is still unknown. But using networks to more closely engage fans is a big part of the Red Wings’ road map, especially as Detroit builds out a “venue environment” around the new arena.

According to Caflisch, the team in Detroit is planning to build out a network surrounding the arena, in parking lots and public spaces, including lots of beacons for proximity engagement. Though DAS and Wi-Fi numbers can show where foot traffic goes in and around stadiums, the next level of analytics Caflisch sees as important is fan spending behavior: parking, concessions, and the restaurants and bars near the arena. Future projects in Detroit, he said, might include beacon-generated discounts, like a free coffee at a nearby Tim Horton’s or a free beer at a nearby bar.

“The kinds of things you want to find out are what kind of money are fans spending, and how often do they buy,” Caflisch said. “Do they stick around after the game? Do they rush in at the start? That’s the kind of stuff you’re looking for.”

Of course, to get some of that data, Caflisch knows the team needs to convince fans to engage digitally, by downloading a team app and providing some identifying information. So far, efforts in that direction have been helpful in identifying fans not in the team’s ticketing database, especially fans coming across the border from Canada.

In Detroit, Caflisch said, the Wings are “now marketing to those people, trying to get them to more games for the same or less money.”

Baylor University: Enlisting fans to help pinpoint problems

When Baylor University built its new football mecca, McLane Stadium, the stadium technology department was often as nervous as a football team before a big game. Would the new fan-facing Wi-Fi work as planned? Would they be able to solve problems before they became big problems?

Baylor’s McLane Stadium. Credit: Paul Kapustka, MSR

“At the beginning, the questions we asked were along the lines of, ‘can we get through the day,’ ” said Pattie Orr, vice president for information technology and Dean of University Libraries and the public face of the McLane Stadium network. Now that the network team is a couple years into running stadium Wi-Fi, Orr can laugh a bit about the initial fears. But from the beginning, she said, analytics “were a big factor” in making sure the network was running right.

Running an Extreme Networks deployment, Baylor uses Extreme’s Purview analytics system, which Orr lauds for being “easy to use” and a “great console for real-time information during a game.”

Solving for 2.4 GHz and using fan input

Mostly that means watching the dashboards to see if any APs are throwing errors, something the network stats package can usually show clearly. One of the things the network crew learned quickly during the first season with Wi-Fi was that Baylor fans were using a lot more 2.4 GHz Wi-Fi devices than anyone had thought, meaning more older phones in use that didn’t have the newer 5 GHz Wi-Fi radios.

“The first season we were about 50-50 between 2.4 GHz and 5 GHz, and that surprised us,” said Bob Hartland, director of IT servers and networking services at Baylor. “We had to prioritize for more 2.4 GHz.” This past season, Hartland said, the fan devices skewed closer to 60 percent using 5 GHz bands.
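
That kind of band split is straightforward to pull out of client-association records; the minimal sketch below uses fabricated records (it is not Baylor's Extreme/Purview tooling) just to show the basic calculation.

```python
# Minimal sketch: 2.4 GHz vs. 5 GHz client split from association records.
# The records below are fabricated for illustration only.
from collections import Counter

associations = [
    {"client": "aa:bb:cc:00:00:01", "band": "2.4GHz"},
    {"client": "aa:bb:cc:00:00:02", "band": "5GHz"},
    {"client": "aa:bb:cc:00:00:03", "band": "5GHz"},
    {"client": "aa:bb:cc:00:00:04", "band": "2.4GHz"},
    {"client": "aa:bb:cc:00:00:05", "band": "5GHz"},
]

counts = Counter(record["band"] for record in associations)
total_clients = sum(counts.values())

for band, count in sorted(counts.items()):
    print(f"{band}: {count} clients ({100 * count / total_clients:.0f}%)")
```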

A Baylor “Wi-Fi Coach” helps a fan negotiate the network. Credit: Baylor University

Baylor also added Wi-Fi to its basketball arena this past season, presenting a whole new set of problems, like devices trying to connect to APs across the smaller venue. Though network analytics were a start, Baylor’s team found that fan input could also help isolate where problems were coming from physically. Having a team of “network coaches” on hand helped pinpoint problems in a way that might be impossible working from the network side alone.

This past year, Orr said Baylor added a feature to its stadium app to let fans “send a message to the Wi-Fi coach” with their row number and seat number if they were having a network problem. The coaches (part of most Extreme Wi-Fi deployments) also followed social media like Twitter to see if fans were reporting network problems.

“It’s fantastic to have the live [performance] data from your fans,” Orr said. With fan reports, network data and area knowledge in hand, the coaches and the network team could more quickly determine whether an issue was a network or device problem, and respond faster. So more data = better solutions, faster.

“If you don’t have good access to analytics you can’t deal with fan [problems] in real time,” Orr said.

VenueNext and the Niners: Finding out who’s in the building

As one of the newer and more technologically advanced venues, Levi’s Stadium often gets noticed for its wireless networks, which set a single-day record of 26 terabytes of combined DAS and Wi-Fi data at Super Bowl 50.

A VenueNext beacon enclosure at Levi’s Stadium. Credit: VenueNext

Though wireless performance is important to teams and fans, the information being gathered by the Levi’s Stadium app — built by VenueNext, the company created by the Niners specifically to construct stadium apps — may end up being among the most valuable digital assets, since it helps teams discover exactly who is coming in the building and how they are spending time, attention and dollars.

“We generate data for analytics,” said VenueNext CEO John Paul, talking about the role VenueNext plays as a stadium app partner. One of the more stunning facts revealed after the Niners’ first year at Levi’s Stadium was that via the stadium app, the team was able to increase its marketing database of fan names from 17,000 to 315,000, with even more impressive success in the details.

“We were able to find out things like how many games fans attended, and who they got the tickets from,” said Paul. Such data, he said, helps teams solve the classic problem of “having no idea who’s in the building on any given day.”

Knowing how many hot dogs can be delivered

While VenueNext’s value proposition may be centered on its ability to help teams gather such valuable marketing data, VenueNext itself relies on internal analytics to ensure the services its apps support — like express food ordering and in-seat food delivery — keep working smoothly during games.

After the first season at Levi’s Stadium, Paul said, VenueNext learned that it needed to expose some of its data to fans in real time, “to improve service during the event.” One example: now, if there are too many orders in a certain section, the app can send a message to fans that wait times might be longer than normal. Conversely, if a certain area of the stadium has idle kitchen capacity and runners, a team might send an in-app notification asking if fans want to order something, to create demand.
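
Conceptually, that kind of real-time adjustment can be a simple threshold rule. The sketch below is our own guess at the shape of the logic, with invented section numbers, order counts, thresholds and messages; it is not VenueNext's code.

```python
# Hedged sketch of a real-time in-seat delivery notice, inspired by Paul's
# description. Thresholds, sections and messages are invented for illustration.
from typing import Optional


def section_notice(pending_orders: int, idle_runners: int,
                   busy_threshold: int = 25) -> Optional[str]:
    """Return an in-app message for a seating section, or None to send nothing."""
    if pending_orders > busy_threshold:
        return "Heads up: in-seat delivery is running longer than usual."
    if pending_orders == 0 and idle_runners >= 3:
        return "Hungry? Runners are standing by -- order now and skip the line."
    return None


if __name__ == "__main__":
    sections = {"120": (40, 1), "233": (0, 4), "305": (8, 2)}  # orders, runners
    for section, (orders, runners) in sections.items():
        message = section_notice(orders, runners)
        print(f"Section {section}: {message or 'no message sent'}")
```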

Over time, Paul said the VenueNext analytics might help teams find out where walk-up concession stands get overloaded by foot traffic, and maybe reconfigure stadium kitchen placements to assist with food delivery options. In the end, he said, it should be seamless to the fans, so that in-seat delivery becomes a regular part of a game-day experience.

“The fans should have no idea where the food comes from,” Paul said.

Minnesota Twins’ Target Field: Photo Essay and Wi-Fi tests

Great sight to see when you get off the plane in Minnesota.

During Mobile Sports Report’s visit to Minneapolis earlier this summer, we had a free afternoon, so we took the public tour of the Minnesota Twins’ Target Field, host of the 2014 MLB All-Star Game. Though it was a kind of drizzly day, we still got plenty of looks at (and tests of) the thing we came to see: the park’s new Wi-Fi and DAS networks, which were operational and, since it was an empty house, probably running at full capacity for all our tests.

After a short (~30 min.) light rail trip from the airport to downtown, we dumped our bags at the hotel and hoofed it over to Target Field, staying dry by cleverly using the city’s skywalk pathways. Once at the stadium it was just a short wait for the 3 p.m. tour to start, so we cruised the Twins’ gift store where the full-body Twins jammies made us think of cold September nights.

Tech you can and can’t see

Target Field from a nearby walkway. Notice the freeway running underneath.

I’d never been on one of these public tours before, but our group of seven dudes learned a lot of lore from our excellent guide Rick, who had his stats down cold. The big glove outside the stadium, he let us know, sits 522 feet from home plate, marking the longest home run hit by Twins legend Harmon Killebrew. That home run came in 1967 at the old Metropolitan Stadium, where the Mall of America now stands.

Rick started our tour by informing us that the $600 million Target Field, which opened in 2010, has a whole lot of technology under the field: pipes that heat the field and carry water away from it; there’s no dirt on the playing surface, just sand underneath a very thin covering of grass. Baseball capacity now is 38,868, Rick said, though on opening day the park had 40,000-plus there. That’s great stuff, man, but what about the Wi-Fi? Though I couldn’t get a Wi-Fi signal outside the gates, once inside the network was clearly humming: As Rick took us through the press box, where there were Ethernet cords in front of each seat, I wondered how necessary those were with a reading of 59.26 Mbps down and 62.67 Mbps up from a front-row seat.

Twins jammies for those cold Minnesota nights.

As one of the MLBAM-led technology deployments (done in part to get ready for the All-Star Game demands), the Wi-Fi inside Target Field is mainly Cisco gear, at least the gear you can see. The familiar white boxes (now with MLBAM ID stickers) are fairly ubiquitous. Since we weren’t able to get ahold of the Twins’ IT crew before our visit, I’m not sure what the final AP or DAS antenna count is these days. But if you know where to look, and we do, you can see a lot of antennas around.

Dealing with outside-the-park interference

One of the interesting things we learned in our profile of the park prior to the All-Star Game was that since the stadium is right downtown, the Twins and the major carriers had to figure out how to keep macro antennas on buildings outside the ballpark from bleeding into the stadium’s DAS. According to another source we spoke with in Minnesota, this year was the first in which Target Field’s DAS didn’t need any more alterations; as you can see in one of the pictures here of the Ford Center, which is across the street from the back side of Target Field, there’s a lot of RF on rooftops in the near vicinity.

Inside the press box. Grandpa, what’s that cord for?

Down near field level, the Wi-Fi was still cranking in the mid-40 Mbps range, an excellent score for a place that’s normally hard to cover. Looking around, I didn’t initially see any APs; none were mounted on the wall facing back at the seats, as some stadiums do it. Then after some more inspection I found the source of the bandwidth: well-covered APs mounted on the railing behind the 10 or so rows of near-the-field seats. On our way out I also saw some of the distinctive AmpThink-designed sideways railing enclosures, used for the open-bowl seating not covered by overhangs.

Though ideally we’d love to come back on a game day, from the looks of the physical placements we were able to see and the tests we took, it seems like both the cellular and Wi-Fi networks at Target Field are high performers, good news for Twins fans who need connectivity. And if you need to drown your sorrows or celebrate, there is also an in-stadium beer network, which supplies suds from main keg rooms through conduits that are definitely more tasty than copper or fiber. Prosit!

Credit all photos: Paul Kapustka, MSR

Target Field in panoramic view.

A silhouette of a Wi-Fi antenna. MSR geek art.

A Wi-Fi AP and some kind of gun antenna. Anyone know what that is?

You bought it, you put your name on it.

Another panoramic view, showing how close downtown is.

The Ford Center is across the street from the back of the park. We’re guessing those macro antennas on top had to be tuned to keep their signals from interfering.

Not Wi-Fi, but a network worth building for thirsty fans.

Anyone want to test download speeds of these pipes?

Great quote overheard in Minnesota: “It takes a lot of wire to make a park wireless.”

Tour guide Rick getting set to take his “team” out on the field. BUT NOT ON THE GRASS!

The railing APs that cover the field-level seats.

An AmpThink railing enclosure. Rick didn’t know what those were, but he does now.

Nice hardware in the Twins’ high-rollers club area.

Our tour didn’t get to see inside, but we can guess what’s behind that door.

If you can hit one here, the Twins want to talk to you.

That’s about as close as MSR will ever get to being in “The Show.” Until next time!