Nokia deal part of new wholesale/white-label strategy for Artemis Networks

Artemis Networks founder Steve Perlman. Credit all photos: Artemis Networks

A deal by startup Artemis Networks to provide test deployments of its pCell wireless networking technology to select Tier 1 phone-network customers of telecom equipment giant Nokia Networks is both a “coming-out party” and a significant shift in the Artemis business strategy, from a consumer and end-user focus to a wholesale, business-to-business plan.

Though no actual customers, users or live pCell networks have yet been announced, Artemis founder and CEO Steve Perlman said he can see the end to the “long and winding road” toward real-world deployments that officially started when Artemis went public with its ideas back in February of 2014. “We look at this [the Nokia announcement] as our coming-out party,” said Perlman in a phone interview with Mobile Sports Report. “You’ll be seeing [customer] announcements soon.”

In addition to the Nokia “memorandum of understanding,” which says that Nokia and Artemis will “jointly test Artemis pCell wireless technology in 2016 with wireless operators, initially in large indoor venues and other high density areas,” Artemis also announced a shift in its plans for its expected commercial network in its home town of San Francisco, which was originally supposed to launch this past summer. (For a detailed explanation of Artemis technology, scroll to the end of this post and follow its links.)

From consumer network to wholesale provider

Instead of operating its own network as originally planned and selling access to consumers, Perlman said Artemis will sell LTE capacity wholesale to any interested network provider as soon as the now-approved network is completed. Artemis, which obtained a lease of spectrum from satellite provider DISH, is now setting up antennas on 58 rooftops in San Francisco, Perlman said, after finally getting FCC approval for its plans a little later than expected.

pCell antenna from Artemis Networks

And instead of having to outsource or build its own customer-facing signup, billing and other back-end systems, the 12-person Artemis will simply sell capacity on its San Francisco network to any interested provider. According to Perlman, there are customers ready to buy, even though none are yet named. Potential customers could include MVNOs (mobile virtual network operators) like TracFone, which don’t own their own networks, or larger providers looking for roaming capacity or cheap LTE in the crowded city by the Bay.

While it’s less cool than having its own branded devices and network, being a wholesale provider makes more sense for the small Artemis than trying to compete with wireless giants like Verizon Wireless and AT&T. “Wholesale [capacity] was a market we really didn’t know existed,” said Perlman. “And when they [potential customers] told us what they would pay, it was easy to see B2B as being the way for us.”

Big customers more comfortable with big suppliers

On the networking gear sales side, Perlman said that teaming up with a big equipment provider like Nokia was a necessity to get any traction in the world of LTE cellular networks. As we said before, though pCell’s projected promise of using cellular interference to produce targeted, powerful cellular connectivity could be a boon to builders of large public-venue networks like those found in sports stadiums, owners and operators of those venues are loath to build expensive networks on untested, unproven technology. And big metro wireless providers are even more so.

“We had a lot of Tier 1 operators tell us ‘we love this [pCell technology], we really need this, but we’re not buying from a 12-person startup,’ ” said Perlman. So even while Artemis’ radio technology — which promises huge leaps in performance compared to current gear — was attractive, the company’s lack of any kind of integration with the boring but necessary part of telecom infrastructure, including billing and authentication systems, held it back, Perlman said.

“We were told we could get things done more instantly if we partnered with a large infrastructure company,” Perlman said.

And while real customers from the Nokia deal will probably surface first in a stadium or other large public venue — since such a deployment would be easier to test and install than a new metro network — one team that won’t be using pCell technology any time soon is VenueNext, the app provider for the San Francisco 49ers’ Levi’s Stadium. Though VenueNext was publicly listed as a pCell testing partner last spring, the company has not commented on any test results, and according to multiple sources there was no testing of Artemis equipment at Levi’s Stadium this summer. Though it develops only the application and back-end systems, VenueNext does need to work closely with equipment providers, like Aruba Networks at Levi’s Stadium, to integrate its app functionality with the network.

Perlman, who also confirmed there was nothing brewing anymore with VenueNext (“but we’re still friends with VenueNext”), said the app developer also preferred to work with a larger partner than the short-benched Artemis. VenueNext, which recently announced the NBA’s Orlando Magic as its second stadium-app customer, has said publicly it would announce 29 additional customers before the end of the calendar year.

“We [Artemis] could probably go and do one stadium,” said Perlman about his company’s deployment abilities.

Wi-Fi thrown in for free

And while the main business for Artemis out of the gate will probably be adding capacity to LTE networks that are running out of spectrum, Perlman said that having Wi-Fi support built into the pCell equipment could make the technology attractive to venues that need or want to bring Wi-Fi services to fans. The Wi-Fi version of pCell was an after-the-fact idea, surfacing only after the original pCell announcements.

“The pWave radio heads have [support for] all LTE bands and both Wi-Fi bands,” Perlman said. “So everything that Nokia does [with pCell deployments] can also do Wi-Fi. That’s pretty exciting.”

What’s yet unknown is how the ongoing acquisition of Alcatel-Lucent by Nokia may affect any potential pCell deployments. In the best possible scenario for Artemis, the acquisition could provide more entry points if the pCell technology gets integrated with Alcatel-Lucent telecom gear.

Wi-Fi stats left on the bench in RootMetrics’ baseball stadium network performance scores

The folks at RootMetrics have another network research project out, one that claims to determine the best wireless connectivity in all the U.S. Major League Baseball stadiums. However, the report doesn’t include Wi-Fi network performance in any of its scoring processes, and it doesn’t publicly reveal the limits of its network tests, which are based on just one day’s results from a handful of devices in each venue and do not include any results from Apple iOS devices.

According to the RootMetrics survey, Fenway Park in Boston ended up atop the results, with strong scores for all four major U.S. wireless carriers: AT&T, Verizon Wireless, Sprint and T-Mobile. But the caveat about those “scores” is that they are composite results devised by RootMetrics itself and not a direct reflection of numerical network performance.

At Fenway, for instance, RootMetrics’ own results show that T-Mobile’s median upload and download speeds are 3.0 Mbps and 3.5 Mbps, respectively, while Verizon’s are 20.7 Mbps and 13.0 Mbps. Yet RootMetrics gives T-Mobile third place at Fenway with an 89.5 “Rootscore,” compared to Verizon’s winning mark of 97.9, meaning that in RootMetrics’ scoring system a network roughly six times as fast is scored only about 10 percent better.
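To see how far the composite scores diverge from the raw speed numbers, here’s a quick back-of-the-envelope check using the Fenway figures above (a hypothetical helper script, not part of RootMetrics’ methodology):

```python
# Compare raw Fenway speeds to the composite Rootscores quoted above.
tmobile = {"up_mbps": 3.0, "down_mbps": 3.5, "rootscore": 89.5}
verizon = {"up_mbps": 20.7, "down_mbps": 13.0, "rootscore": 97.9}

up_ratio = verizon["up_mbps"] / tmobile["up_mbps"]        # ~6.9x faster uploads
down_ratio = verizon["down_mbps"] / tmobile["down_mbps"]  # ~3.7x faster downloads
score_gap = (verizon["rootscore"] / tmobile["rootscore"] - 1) * 100  # ~9.4% higher score

print(f"Upload ratio: {up_ratio:.1f}x, download ratio: {down_ratio:.1f}x")
print(f"Rootscore advantage: only {score_gap:.1f}%")
```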

While it’s not included in the scoring or ranking, the Wi-Fi network at Fenway, as measured by RootMetrics, delivered speeds of 23.1 Mbps down and 22.0 Mbps up, besting all the cellular networks in the stadium. In its blog post RootMetrics does not explain why it leaves Wi-Fi networks out of its measurements and scoring, even though its testing does show Wi-Fi performance at individual stadiums. Over the past year, Major League Baseball led a $300 million effort to install Wi-Fi networks in all MLB parks.

Unlike its metro-area tests, where RootMetrics uses “millions of data points,” the baseball stadium tests were calculated using just one device from each carrier — and all are Android-based, since RootMetrics’ internal testing system doesn’t run on iOS devices. And while RootMetrics said that for its results each park was visited “at least once,” in going through all 29 stadium reports there was only a single visit date mentioned for each one. RootMetrics also did not visit Rogers Centre in Toronto, home of the American League’s Blue Jays.

New Report: Green Bay’s Lambeau Field leads new NFL Wi-Fi deployments

Wave the flag, Wi-Fi has come to Lambeau Field! Photo: Green Bay Packers

When most NFL fans think of the Green Bay Packers and Lambeau Field, they think of frozen tundra — of Vince Lombardi roaming the sideline in his thick glasses and peaked hat, with visible breath coming through the face masks of behemoth linemen on the field. In the stands, they see the venerable fans braving the cold of northern Wisconsin in their snowmobile suits, with mittens wrapped around a bratwurst and a beer.

But do they think of those same Packers fans pulling out their iPhones and Samsungs to take selfies, and posting them to Instagram or Facebook? Maybe not so much.

The reality of 2015, however, finds fans in Green Bay being just like fans anywhere else — meaning, they want to be able to use their mobile devices while at the game. In the cover story of our most recent Stadium Tech Report issue, we explore the details of bringing Wi-Fi to historic Lambeau Field, where late-season texting might carry the threat of frostbitten fingers.

Our PRO FOOTBALL ISSUE has 50-plus pages of insight and how-to explanations that in addition to Green Bay’s work also cover some interesting Wi-Fi access point hiding tricks practiced by the IT folks at AT&T Stadium, and a recap of Levi’s Stadium plans as it gets ready to host Super Bowl 50. Plus team-by-team capsule descriptions of stadium tech deployments for all 32 NFL franchises. It’s all free to you, so download your copy today!

The NFL haves and have-nots when it comes to Wi-Fi

Was it really three long years ago that NFL commissioner Roger Goodell issued an edict calling for Wi-Fi in all 31 NFL stadiums? While we’re almost there, it’s not quite everywhere yet, and during the course of preparing this year’s PRO FOOTBALL ISSUE we found ourselves wondering how many of the current NFL stadium Wi-Fi networks are really up to snuff. Sure, there are leaders in the networking space, as teams with lots of money or recent Super Bowl hostings seem to be in a bit of an arms race when it comes to installing robust wireless networks. Teams like the Dallas Cowboys, the San Francisco 49ers, the Miami Dolphins, the New England Patriots and a few others come to mind when you are making a list of top networks, and you can probably add Green Bay’s 1,000-plus AP deployment to that tally.

But what about the balance of the league, which now has some kind of fan-facing Wi-Fi in 25 of its 31 venues? While those that don’t have any Wi-Fi at all are somewhat understandable (mainly due to questions about imminent franchise relocation), what about the stadiums that put in Wi-Fi a few years ago, or only put in a limited amount of technology? With no end in sight to the increasing demands for wireless bandwidth, how soon will the older networks need revamping? Including the DAS deployments? Those are questions we’ll keep asking and looking to answer, as we’ve already seen some public reports about Wi-Fi networks falling down on the job. The best place to start, of course, is with the report, so DOWNLOAD YOUR COPY right now!

Thank the sponsors, who let you read for free

Reporting, writing, editing and producing all this content has a cost, but thanks to our generous (and growing!) list of sponsors, our editorially objective content remains free for you, the reader. We’d like to take a quick moment to thank the sponsors of the Q3 issue of Stadium Tech Report, which include Mobilitie, Crown Castle, SOLiD, CommScope, TE Connectivity, Aruba Networks, JMA Wireless, Corning, 5 Bars, Extreme Networks, ExteNet Systems, and partners Edgewater Wireless and Zinwave. We’d also like to thank you, our readers, for your interest and continued support.

As always, we are here to hear what you have to say: Send an email to kaps at mobilesportsreport.com and let us know what you think of our STADIUM TECH REPORT series, and whether or not the Wi-Fi at your local NFL stadium is a division winner.

Levi’s Stadium Monday Night Football debut sees 2.87 TB of Wi-Fi traffic, 874 GB on AT&T DAS

Levi’s Stadium during its inaugural Monday Night Football game. Photo: Levi’s Stadium

For its first-ever Monday Night Football game, Levi’s Stadium saw 2.87 terabytes of data cross its Wi-Fi network, with an additional 874 GB traversing the AT&T cellular DAS network during the Niners’ somewhat surprising 20-3 victory over the Minnesota Vikings.

With the confirmed numbers bumping up against the 4 TB mark — and if you add in the probable (but unreported) 1 TB or more that was used by Verizon Wireless, Sprint and T-Mobile customers on the Levi’s Stadium DAS — it’s readily apparent that usage of wireless data inside stadiums is only continuing to grow, with no top end yet in sight.
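The arithmetic behind that near-4 TB figure is straightforward; here is a minimal tally, with the roughly 1 TB for the other carriers being the article’s own estimate rather than a reported number:

```python
# Tally of Monday night's wireless data at Levi's Stadium (in terabytes).
wifi = 2.87               # reported Wi-Fi total
att_das = 0.874           # reported AT&T DAS total
other_das_estimate = 1.0  # unreported Verizon/Sprint/T-Mobile DAS traffic (estimate)

confirmed = wifi + att_das
print(f"Confirmed: {confirmed:.2f} TB")                           # 3.74 TB
print(f"With estimate: {confirmed + other_das_estimate:.2f} TB")  # ~4.74 TB
```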

Though the Wi-Fi mark didn’t hit the same heights as the 3.3 TB recorded at Levi’s Stadium’s first regular-season opener last fall, it’s impressive nonetheless given the game’s somewhat lower profile. Expectations were modest for a Niners team coming off an exceptionally strange offseason, one that saw high-profile coach Jim Harbaugh leave for the University of Michigan and a number of top players retire, like star linebacker Patrick Willis, or depart, like running back Frank Gore, who went to Indianapolis.

And with the new-car buzz somewhat gone from Levi’s Stadium, if almost 3 TB of Wi-Fi is a “regular” mark, you have to start wondering what the totals are going to be like when Super Bowl 50 comes to the venue in February.

On the DAS side of things, the cellular traffic generated by AT&T customers at Levi’s Stadium Monday night was the second-highest among the NFL venues measured by AT&T, trailing only the traffic at namesake AT&T Stadium, where AT&T saw 1.107 TB of DAS traffic during the Cowboys’ opening-game victory over the New York Giants. According to AT&T, DAS traffic at NFL stadiums during the first week of games was up 46 percent compared to the first week of games in 2014. We’ll have a separate post tomorrow on college DAS traffic, which is also up. Thanks to the Niners for sharing the data.

Stadium Tech Report: Los Angeles Dodgers hit it out of the park with Cisco, Aruba Wi-Fi

Dodger Stadium, the SoCal baseball shrine. All photos: Terry Sweeney, MSR

Growing up in the Los Angeles suburb of Norwalk, Ralph Esquibel recalled playing outdoors while, inside the house, the Dodger game was on the radio. “I knew from the kinds of noises coming out of the house how the game was going,” he laughed. Esquibel, now vice president of IT for the Los Angeles Dodgers, may have wished for some similar indicators or guideposts as he began the wireless retrofitting of Major League Baseball’s third-oldest stadium (after Boston’s Fenway Park and Chicago’s Wrigley Field) in early 2011.

Esquibel faced multiple challenges with Dodger Stadium. First, there was all that concrete to push signals through or around. There was the size of the Chavez Ravine venue and its far-flung parking lots, spanning more than 350 acres. The stadium also has few overhangs, a favorite place to attach Wi-Fi access points or distributed antenna system (DAS) gear. Then there’s Dodger Stadium’s capacity: 56,000 seats, the largest in the league and about 30 percent larger than the average MLB stadium (42,790).

Esquibel’s biggest hurdle? “Trying to achieve the network that we wanted but also maintain an appropriate budget for the solution,” he said. While Esquibel would not specify what the Dodgers spent, he did allow that it was “an 8-figure project.”

Coverage challenges in the best seats

Initially, the best seats in the house presented a coverage challenge; field and club level seats along the third- and first-base lines and the dugout lack any overhangs. So while phones in those sections could receive a short, directional beam sent from across the outfield, the upstream signal couldn’t get back to the AP across the field, said Esquibel.

Ralph Esquibel, VP of IT for the Dodgers, with the new Wi-Fi relief pitcher mobile.

“We wanted to guarantee a premium experience, regardless of the seat,” said Esquibel, who joined the Dodgers 6 years ago after working in IT at Toyota and Honda. So by using what he calls “a hybrid approach,” Wi-Fi APs and antennas are installed overhead where possible, but also under seats and in staircase handrails that divide the stadium’s steep aisles.

All told, nearly 1,000 APs from Cisco and Aruba Networks blanket Dodger Stadium, its concession areas and parking lots. Horizon Communications helped the Dodgers with design and installation of the Wi-Fi and DAS.

The under-seat APs/Wi-Fi antennas on the club level are housed in NEMA enclosures about every 15 seats, set eight rows apart. Esquibel was concerned about losing real estate under those seats; he also didn’t want to create any potential trip hazard for fans. In addition, the Dodgers use Cat 6A cabling, whose thickness and rigidity meant it couldn’t be run up a stepped incline. Consequently, they drilled through concrete to snake the cabling through from the clubhouse underneath. “There’s no visible conduit leading into the enclosure,” Esquibel explained. The profile and footprint of the enclosure still leave space for fans to place belongings.

Handrail Wi-Fi enclosure

It’s the same modus operandi for the enclosures housed in the stair rails, except there are two APs in larger enclosures at the top of each staircase on the reserve level and upper deck, then a single AP per enclosure as the stairways descend. Some 290 APs offer coverage on the reserve level, which by itself has a greater capacity than nearby Staples Center (18,118 seats), Esquibel told Mobile Sports Report. After 2 years of use, there have been no issues with the AP enclosures. “We power-wash the seats and stands after games and [the enclosures] are very resilient against the sun, water and wind,” Esquibel said.

He also acknowledged some early challenges with Wi-Fi. Part of the issue was working with Cisco’s CleanAir technology, which is supposed to minimize RF interference, if not eliminate it altogether. If an AP starts broadcasting over a frequency in use by another AP, for example, CleanAir helps it find another frequency. It took a few months to fully tune the network; some directional antennas needed a 10-degree adjustment, Esquibel said. Another challenge was having APs from more than one vendor. “If your network is 100 percent Cisco and all leveraging the same controllers, [CleanAir] will work perfectly,” Esquibel said. “If you have a mixed environment that pushes Wi-Fi in certain locations, it becomes a problem — there’s competition for frequencies.”

Coordinating the APs

A third party leveraging a non-public frequency would switch channels, for example, causing the APs for public use to also switch channels. “What we had was a lot of bouncing back and forth,” Esquibel said, which affected performance. “So we assigned channels and frequencies for each AP, which still requires a lot of coordination.”
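For readers curious what that manual coordination looks like in practice, here is a minimal, hypothetical sketch of a static channel plan: model the APs as a graph whose edges mean “these radios can hear each other,” then give each AP a fixed channel none of its neighbors uses. The AP names, neighbor map and channel list below are illustrative assumptions, not the Dodgers’ actual configuration.

```python
# Minimal sketch of a static channel plan for neighboring APs.
CHANNELS = [36, 40, 44, 48, 149, 153, 157, 161]  # non-overlapping 5 GHz channels

# Which APs can hear each other (in a real venue this comes from a site survey).
neighbors = {
    "club-101": ["club-102", "rail-201"],
    "club-102": ["club-101", "rail-201"],
    "rail-201": ["club-101", "club-102"],
}

def assign_channels(neighbors, channels):
    """Greedy graph coloring: give each AP a channel none of its neighbors uses."""
    plan = {}
    for ap in neighbors:
        used = {plan[n] for n in neighbors[ap] if n in plan}
        plan[ap] = next(ch for ch in channels if ch not in used)
    return plan

print(assign_channels(neighbors, CHANNELS))
# e.g. {'club-101': 36, 'club-102': 40, 'rail-201': 44}
```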

Under-seat Wi-Fi enclosure

Since 2013, the stadium has been carved into 24 DAS sectors. AT&T, T-Mobile USA and Verizon Wireless are the carriers presently using the DAS; Ericsson makes the DAS antennas. Stubborn Sprint relies on a tower adjacent to the stadium.

Dodger fans average anywhere from 500 to 655 megabytes of data use per game, according to Esquibel. During a busy game, the wireless networking accommodates 16,000 concurrent users; a slower event clocks in at 4,000 to 8,000. To test upload speed, Esquibel will push a 50 MB video to Facebook. When there’s lots of available bandwidth, he gets 60 Mbps performance; on the low end, it’s closer to 4 Mbps. Esquibel said users are mostly streaming and posting videos and photos to social media; Dodger Stadium is the second most Instagrammed site in southern California, after Disneyland, Esquibel added.
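To put those test numbers in context, a quick conversion shows how long Esquibel’s 50 MB Facebook upload would take at the two extremes he cites (a rough estimate that ignores protocol overhead):

```python
# Time to upload a 50 MB video at the best- and worst-case speeds Esquibel quotes.
video_mb = 50
video_megabits = video_mb * 8  # 400 megabits

for label, mbps in [("plenty of bandwidth", 60), ("congested", 4)]:
    print(f"{label}: ~{video_megabits / mbps:.0f} seconds")  # ~7 s vs ~100 s
```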

The Dodgers have their own version of Ballpark, the in-stadium MLB app, which offers video replay and highlights; in-seat ordering of food and drink in certain areas; and stadium mapping. Check-ins on Ballpark are handled through a network of 44 iBeacons, which take advantage of Bluetooth Low Energy (BTLE) technology. Between Ballpark and social media activity, Dodger fans have run up as much as 700 MB of data usage during games — and the network is ready if demand keeps growing.

“We don’t do any rate limiting, so if we consume all our bandwidth we get a free upgrade, thanks to a clause in our agreement with our ISP, AT&T,” Esquibel explained.

To ensure a family-friendly and wholesome environment, the Dodgers use Palo Alto Networks 5020 firewalls for content filtering. “As we developed our SLAs, it was one of the first issues to pop up — no sexual content, no malware/phishing, and no illegal drug sites,” he said.

What’s on his wish list for the future? “I’d like geo-fencing within the Wi-Fi network so if I see someone enter a club, I can say hi or welcome them, notify them of specials, or flag points of interest around the stadium,” Esquibel said, citing things like the World Series trophy case or giveaway locations for promotional items. Alongside all its other applications, the wireless network can serve as a guidepost for fans and visitors to Dodger Stadium.

All-Star Wi-Fi: Cincinnati crowds used 4.3 TB over All-Star Game activities

Fans at All-Star Game taking pictures of Pete Rose. Photo: Screenshot courtesy Fox Sports/Cincinnati Reds

Like a player added to the roster just before game time, the new Wi-Fi network at the Great American Ball Park in Cincinnati handled some all-star traffic levels, carrying a total of 4.3 terabytes of data over the three separate events that made up Major League Baseball’s All-Star Game festivities earlier this week, according to IT execs at the ballpark.

Though it only came online a couple of weeks before the big event, the GABP Wi-Fi network held up admirably for the big game, carrying 2.36 TB during Tuesday night’s main event, according to Brian Keys, vice president of technology for the Cincinnati Reds. Almost another 2 TB was recorded during the ancillary events, the Futures Game and the Home Run Derby, proving once again that “big event” crowds like their Wi-Fi and are adept at finding and using in-stadium wireless networks. We don’t have DAS stats yet, but it’s an easy guess that all four DAS deployments inside the stadium also carried significant traffic loads during the All-Star activities.

In a phone interview Friday, Keys said that the peak concurrent Wi-Fi user number hit 9,700 at one point during the actual All-Star Game, with a total of 12,000 unique Wi-Fi connections over all of Tuesday night. And even though the game attracts a national audience, the hometown fans provided the biggest traffic surges during Cincinnati Reds-specific moments — like at the end of Monday’s Home Run Derby when local hero Todd Frazier won in dramatic fashion, and when former Reds star Pete Rose had a brief pre-game introduction.
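Dividing the main event’s Wi-Fi total by the unique-connection count gives a rough per-device average for the night (a derived figure, not one reported by the Reds):

```python
# Back-of-the-envelope per-device Wi-Fi usage for the All-Star Game itself.
total_tb = 2.36          # Wi-Fi data carried during Tuesday's game
unique_devices = 12_000  # unique Wi-Fi connections that night

avg_mb = total_tb * 1_000_000 / unique_devices  # TB -> MB (decimal units)
print(f"~{avg_mb:.0f} MB per connected device")  # ~197 MB
```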

“Especially when Todd [Frazier] got up to bat, that really tested the limits of our [bandwidth] pipe,” Keys said. The Rose introduction, he said, put similar stress on the 576 Wi-Fi access points, but with Keys’ staff as well as a special group from Wi-Fi gear provider Cisco on hand to help out, the new network performed in big-league fashion, Keys said.

During construction, the IT team had to overcome one structural hurdle, namely the lack of any railings in the lower bowl to mount Wi-Fi APs on. Keys said some of that was solved by putting APs at the bottom of seating rows pointing up, and using overhang space for other antenna mounts. The Great American Ball Park did not use any under-seat APs, Keys said.

Pete Rose. Photo: Screen shot of Fox Sports broadcast courtesy of Cincinnati Reds.

Though the ballpark had explored putting Wi-Fi in last season, the initial deployment was stalled last summer due to what Keys called contract issues. But with the All-Star game coming this season, the park re-started its Wi-Fi deployment, which was part of the Major League Baseball Advanced Media (MLBAM) plan to bring Wi-Fi to all parks for this season. Keys said the new network deployment began in March and finished up on June 26, giving his team a few home dates to kick the tires and tune it up quickly for its big event.

Going forward, Keys said the four-DAS deployment — with four sets of antennas and four different headends — will be consolidated into a single, neutral host DAS operation. Keys is also looking forward to adding features enabled by the Wi-Fi network, like expanded food ordering and greater use of beacon technology. “It’ll be great to add more things to improve the fan experience,” he said.