Nokia deal part of new wholesale/white-label strategy for Artemis Networks

Artemis Networks founder Steve Perlman. Credit all photos: Artemis Networks

A deal by startup Artemis Networks to provide test deployments of its pCell wireless networking technology to select Tier 1 phone-network customers of telecom equipment giant Nokia Networks is both a “coming out party” and a significant shift in the Artemis business strategy, from a consumer and end-user focus to a wholesale, business-to-business plan.

Though no actual customers, users or live pCell networks have yet been announced, Artemis founder and CEO Steve Perlman said he can see the end of the “long and winding road” toward real-world deployments that officially started when Artemis went public with its ideas back in February of 2014. “We look at this [the Nokia announcement] as our coming-out party,” said Perlman in a phone interview with Mobile Sports Report. “You’ll be seeing [customer] announcements soon.”

In addition to the Nokia “memorandum of understanding,” which says that Nokia and Artemis will “jointly test Artemis pCell wireless technology in 2016 with wireless operators, initially in large indoor venues and other high density areas,” Artemis also announced a shift in its plans for its expected commercial network in its hometown of San Francisco, which was originally supposed to launch this past summer. (For a detailed explanation of Artemis technology, scroll to the links at the end of this post.)

From consumer network to wholesale provider

Instead of operating its own network as originally planned and selling access to consumers, Perlman said Artemis will sell LTE capacity wholesale to any interested network provider as soon as the now-approved network is completed. Artemis, which obtained a lease of spectrum from satellite provider DISH, is now setting up antennas on 58 rooftops in San Francisco, Perlman said, after finally getting FCC approval for its plans a little later than expected.

pCell antenna from Artemis Networks

And rather than outsourcing or building its own customer-facing signup, billing and other back-end systems, the 12-person Artemis will instead sell capacity on its San Francisco network to any interested provider. According to Perlman, there are customers ready to buy, even though none are yet named. Potential customers could include MVNOs (mobile virtual network operators) like TracFone, which don’t own their own networks, or larger providers looking for roaming capacity or cheap LTE in the crowded city by the Bay.

While it’s less cool than having its own branded devices and network, being a wholesale provider makes more sense for the small Artemis than trying to compete with wireless giants like Verizon Wireless and AT&T. “Wholesale [capacity] was a market we really didn’t know existed,” said Perlman. “And when they [potential customers] told us what they would pay, it was easy to see B2B as being the way for us.”

Big customers more comfortable with big suppliers

On the networking gear sales side, Perlman said that teaming up with a big equipment provider like Nokia was a necessity to get any traction in the world of LTE cellular networks. As we’ve said before, pCell’s promise of using cellular interference to produce targeted, powerful cellular connectivity could be a boon to builders of large public-venue networks like those found in sports stadiums, but owners and operators of those venues are loath to build expensive networks on unproven technology. Big metro wireless providers are even more reluctant.

“We had a lot of Tier 1 operators tell us ‘we love this [pCell technology], we really need this, but we’re not buying from a 12-person startup,’ ” said Perlman. So even while Artemis’ radio technology — which promises huge leaps in performance compared to current gear — was attractive, the company’s lack of any kind of integration with the boring but necessary part of telecom infrastructure, including billing and authentication systems, held it back, Perlman said.

“We were told we could get things done more instantly if we partnered with a large infrastructure company,” Perlman said.

And while real customers from the Nokia deal will probably surface first in a stadium or other large public venue — since such a deployment would be easier to test and install than a new metro network — one team that won’t be using pCell technology any time soon is VenueNext, the app provider for the San Francisco 49ers’ Levi’s Stadium. Though VenueNext was publicly listed as a pCell testing partner last spring, the company has not commented on any test results, and according to multiple sources there was no testing of Artemis equipment at Levi’s Stadium this summer. Though it develops only the application and back-end systems, VenueNext does need to work closely with equipment providers, like Aruba Networks at Levi’s Stadium, to integrate its app functionality with the network.

Perlman, who also confirmed there was nothing brewing anymore with VenueNext (“but we’re still friends with VenueNext”), said the app developer also preferred to work with a larger partner than the short-bench Artemis. VenueNext, which recently announced the NBA’s Orlando Magic as its second stadium-app customer, has said publicly it would announce an additional 29 new customers before the end of the calendar year.

“We [Artemis] could probably go and do one stadium,” said Perlman about his company’s deployment abilities.

Wi-Fi thrown in for free

And while the main business for Artemis out of the gate will probably be adding capacity to LTE networks that are running out of spectrum, Perlman said that having Wi-Fi support built into the pCell equipment could make the technology attractive to venues that need or want to bring Wi-Fi services to fans. The Wi-Fi version of pCell was an idea that surfaced after the original pCell announcements.

“The pWave radio heads have [support for] all LTE bands and both Wi-Fi bands,” Perlman said. “So everything that Nokia does [with pCell deployments] can also do Wi-Fi. That’s pretty exciting.”

What’s yet unknown is how the ongoing acquisition of Alcatel-Lucent by Nokia may affect any potential pCell deployments. In the best possible scenario for Artemis, the acquisition could provide more entry points if the pCell technology gets integrated with Alcatel-Lucent telecom gear.

Texas A&M’s fiber-backed Wi-Fi at Kyle Field records 5.7 TB of data during Alabama game

Scoreboard, Kyle Field. Photos: Texas A&M

We’ve been hearing rumors about how much data was flowing at the new fiber-based Wi-Fi network at Texas A&M’s Kyle Field this fall, and now we finally have some verified numbers that are sure to pop some eyeballs: According to the networking crew at Corning, fans at Kyle Field used 5.7 terabytes of Wi-Fi data during the Oct. 17 game against Alabama, which the Aggies lost 41-23.

In case you are keeping score, the 5.7 TB mark is the second-largest single-game Wi-Fi usage number we’ve seen, trailing only the 6.2 TB recorded at Super Bowl XLIX in Glendale, Ariz., earlier this year. Before you pin it all on the network, however, be aware that the newly refurbished Kyle Field can hold a whole lotta fans — the announced attendance for the ‘Bama game was 105,733, which is 35,000+ more fans than the 70,288 who attended the Super Bowl at the University of Phoenix Stadium on Feb. 1. Still, building a network to support basically another baseball stadium’s worth of fans is pretty cool, too.

Other related numbers from the Wi-Fi network are in Super Bowl territory as well, including the 37,823 unique clients recorded during pre-game and game time, as well as the 26,318 peak concurrent user count. We’re not sure why only 10 people tweeted about the Wi-Fi (8 good, 2 bad), but the 3.2 Gbps throughput should also turn some heads.
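For the curious, here’s a quick back-of-the-envelope script that turns those totals into per-fan and per-user figures. To be clear, this is our own rough math, not anything from Corning or Texas A&M; the binary TB-to-MB conversion and the use of announced attendance as the divisor are our assumptions.

```python
# Rough per-fan and per-user math for the Kyle Field and Super Bowl numbers above.
# Our own illustration, not official Corning, Texas A&M or NFL metrics.

MB_PER_TB = 1024 * 1024  # assuming binary units; some vendors report decimal

games = {
    "Kyle Field, Oct. 17 vs. Alabama": {"wifi_tb": 5.7, "attendance": 105_733},
    "Super Bowl XLIX, Feb. 1":         {"wifi_tb": 6.2, "attendance": 70_288},
}

for name, g in games.items():
    per_fan_mb = g["wifi_tb"] * MB_PER_TB / g["attendance"]
    print(f"{name}: roughly {per_fan_mb:.0f} MB of Wi-Fi data per fan in attendance")

# Peak-load sanity check: 3.2 Gbps shared across 26,318 peak concurrent users
peak_gbps, concurrent_users = 3.2, 26_318
print(f"Average share at peak: about {peak_gbps * 1e6 / concurrent_users:.0f} kbps per concurrent user")
```

By that rough measure the Super Bowl crowd actually used more data per head, which underlines the point above: Kyle Field’s total owes a lot to the sheer size of the crowd.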

Corning ONE DAS headend equipment at Texas A&M’s Kyle Field deployment

The question this all raises for us is, has the availability of a fiber backbone simply allowed fans to use more data? And is the demand for mobile data at big events perhaps even higher than we thought? With a regular-season game at Nebraska hitting 4.2 TB earlier this season, it’s pretty clear that data demands are showing no signs of hitting a plateau. Or maybe we can deduce that the better the network, the more traffic it will carry?

It’s also worth noting that stats this season from AT&T have shown several 1+ TB data totals for games at Kyle Field on the AT&T DAS network, which uses the same fiber backbone as the Wi-Fi. This “fiber to the fan” infrastructure, built by IBM and Corning, will also be at the core of the network being built at the new home of the NFL’s Falcons, Mercedes-Benz Stadium in Atlanta, scheduled to open in 2017.

We’ll have more soon from Kyle Field, as Mobile Sports Report is scheduled to make a visit there for the Nov. 7 game against Auburn. If you plan to be in College Station that weekend, give us a holler. Or a yell, right? We are looking forward to seeing the stadium and the network firsthand, and to running some speed tests to see how well all areas are covered. With 5.7 TB of Wi-Fi, it’s a good guess the coverage is pretty good.

(Statistics provided by Corning for the Oct. 17 game are below.)

Wi-Fi usage statistics for the Oct. 17 game, provided by Corning

NFL Stadium Tech Reviews — AFC North

Editor’s note: The following team-by-team capsule reports of NFL stadium technology deployments are an excerpt from our most recent Stadium Tech Report, THE PRO FOOTBALL ISSUE. To get all the capsules in one place as well as our featured reports, interviews and analysis, download your free copy of the full report today.

AFC NORTH

Reporting by Paul Kapustka

M&T Bank Stadium. All photos: Baltimore Ravens

Baltimore Ravens
M&T Bank Stadium
Seating Capacity: 71,008
Wi-Fi – Yes
DAS – Yes

Extreme Networks picked up another NFL win this offseason, being selected to provide the Wi-Fi network gear for the Baltimore Ravens M&T Bank Stadium.

According to press releases from the team and the vendor, Extreme will install approximately 800 Wi-Fi APs to provide wireless service to the seating and concourse areas of the stadium. The $6.5 million network will be designed and deployed by integrator PCM Inc. of El Segundo, Calif., and the team app will be developed by YinzCam. According to the Ravens, M&T Bank Stadium has a seating capacity of 71,000 for football.

The Ravens are also unveiling a new 3-D video system called freeD that the team said shows replays from every possible angle, like the replays seen on newscasts that can circle around the field of view.

Cincinnati Bengals
Paul Brown Stadium
Seating Capacity: 65,515
Wi-Fi – Yes
DAS – Yes

After putting Wi-Fi from Extreme Networks into Paul Brown Stadium for last season, the Bengals announced an additional $20 million in improvements for 2015, including newer, larger video boards. A TE Connectivity DAS is already in place, and the Wi-Fi network went through some upgrades, especially in the stadium’s canopy level.

Cleveland Browns
FirstEnergy Stadium
Seating Capacity: 73,200
Wi-Fi – No, planned for 2015
DAS – Yes

While the Cleveland Browns continue to add improvements to FirstEnergy Stadium, Wi-Fi is not yet installed; according to news reports it should be available by the end of the 2015 season.

Pittsburgh Steelers
Heinz Field
Seating Capacity: 65,500
Wi-Fi – Yes/limited (club and suite areas only)
DAS – Yes

With room for 3,000 more fans in Heinz Field this season thanks to some offseason construction work, more Steelers fans than ever will be able to cheer on the Black and Gold.

Wi-Fi access, however, remains limited, not available in the full bowl but only in the FedEx Great Hall and the West Main Concourse. Clubs and suites also have free Wi-Fi, and the team said customer service reps will be available to deal with issues.

Wi-Fi stats left on the bench in RootMetrics’ baseball stadium network performance scores

The folks at RootMetrics have another network research project out, one that claims to determine the best wireless connectivity in all the U.S. Major League Baseball stadiums. However, the report doesn’t include Wi-Fi network performance in any of its scoring processes, and it doesn’t publicly reveal the limits of its network tests, which are based on just one day’s results from a handful of devices in each venue and do not include any results from Apple iOS devices.

According to the RootMetrics survey, Fenway Park in Boston ended up atop the results, with strong scores for all four major U.S. wireless carriers, a list that includes AT&T, Verizon Wireless, Sprint and T-Mobile. But the caveat about those “scores” is that they are composite results devised by RootMetrics itself and not a direct reflection of numerical network performance.

At Fenway, for instance, RootMetrics’ own results show that T-Mobile’s median upload and download speeds are 3.0 Mbps and 3.5 Mbps, respectively, while Verizon’s are 20.7 Mbps and 13.0 Mbps. Yet RootMetrics gives T-Mobile third place at Fenway with an 89.5 “Rootscore,” compared to Verizon’s winning mark of 97.9, meaning that in RootMetrics’ scoring system a network roughly six times as fast on uploads rates only about 10 percent better.
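To make that gap concrete, here’s a small snippet working through the arithmetic with the RootMetrics figures quoted above; the ratio math is ours, not part of any RootMetrics methodology.

```python
# Raw median speeds vs. composite Rootscores at Fenway Park, using the
# RootMetrics figures cited above. The ratio calculations are our own.

fenway = {
    "Verizon":  {"up_mbps": 20.7, "down_mbps": 13.0, "rootscore": 97.9},
    "T-Mobile": {"up_mbps": 3.0,  "down_mbps": 3.5,  "rootscore": 89.5},
}

up_ratio = fenway["Verizon"]["up_mbps"] / fenway["T-Mobile"]["up_mbps"]        # ~6.9x
down_ratio = fenway["Verizon"]["down_mbps"] / fenway["T-Mobile"]["down_mbps"]  # ~3.7x
score_gap = (fenway["Verizon"]["rootscore"] / fenway["T-Mobile"]["rootscore"] - 1) * 100

print(f"Verizon vs. T-Mobile at Fenway: {up_ratio:.1f}x the upload speed,")
print(f"{down_ratio:.1f}x the download speed, but only a {score_gap:.1f} percent higher Rootscore.")
```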

While it’s not included in the scoring or ranking, the Wi-Fi network at Fenway as measured by RootMetrics delivered speeds of 23.1 Mbps down and 22.0 Mbps up, besting all the cellular networks in the stadium. In its blog post RootMetrics does not explain why it doesn’t include Wi-Fi networks in its measurements or scoring, even though its testing does show Wi-Fi performance at individual stadiums. Over the past year, Major League Baseball led a $300 million effort to install Wi-Fi networks in all MLB parks.

Unlike its metro-area tests, where RootMetrics uses “millions of data points,” the baseball stadium tests were calculated using just one device from each carrier — and all are Android-based, since RootMetrics’ internal testing system doesn’t run on iOS devices. And while RootMetrics said that for its results each park was visited “at least once,” in going through all 29 stadium reports there was only a single visit date mentioned for each one. RootMetrics also did not visit Rogers Centre in Toronto, home of the American League’s Blue Jays.

Keeping the Wi-Fi hidden: AT&T Stadium perfects the art of Wi-Fi AP concealment

Wi-Fi antennas visible under the ‘shroud’ covering the outside of the overhang at AT&T Stadium. Photo: Dallas Cowboys

Since they like to do everything big in Texas, it’s no surprise that the IT team at AT&T Stadium has taken the art of Wi-Fi access point concealment to new heights.

To just above the first and second seating levels of the stadium, that is.

Even though the venue has more Wi-Fi APs than any sports stadium we’ve ever heard of, trying to find any of the 1,900 permanently installed APs is a tough task, thanks to measures like the fiberglass shrouds that circle the stadium just above the first and second seating levels. Underneath those custom-built coverings are numerous Wi-Fi APs, DAS antennas and even cameras, all contributing to the high level of connectivity inside AT&T Stadium while remaining invisible to the visiting fan’s eyes.

“The philosophy throughout the stadium is for a clean, stark look,” said John Winborn, chief information officer for the Dallas Cowboys Football Club, which is the primary tenant of the venue. “That’s a high standard, and that is a real challenge for us when it comes to Wi-Fi and DAS.”

In just about every stadium network deployment we write about, concealment and aesthetics are among the top concerns, especially when it comes to Wi-Fi access points and DAS antennas. For some reason, the physical appearance of an obvious piece of technology evokes strong reactions, even as other necessary structural items are ignored.

(Editor’s note: This story is an excerpt from our most recent Stadium Tech Report, the PRO FOOTBALL ISSUE, which is available for FREE DOWNLOAD right now from our site. In the report our editorial coverage includes a profile of the new Wi-Fi network at Green Bay’s Lambeau Field and team-by-team profiles of Wi-Fi and DAS deployments at all 31 NFL stadiums. Get your copy today!)

Under-seat Wi-Fi AP at AT&T Stadium. Photo: Dallas Cowboys

A clean, sleek look at the house that Jerry built

As one anonymous commentator at this summer’s SEAT conference noted, “stadium supervisors don’t ever care about seeing a 4-inch pipe, but leave one antenna out and they go crazy.” And whoever that stadium person is, he or she probably has a kindred soul in Dallas Cowboys owner Jerry Jones.

An unseen antenna is Jerry Jones’ favorite kind. Winborn said that AT&T Stadium embraces design in all things visible, noting that the “clean look” idea extends to advertising inside the seating bowl, where the only permanent signs are located in the end zone areas.

“We’re very conscious of the aesthetics here,” Winborn said. “Everyone here sees the benefit of what a great looking building can be. And it all starts with [Jerry] Jones.” Jones’ ideas, Winborn said, “are a major influence on everything we do.”

What that means when it comes to Wi-Fi is that while the stadium always aims to be the best-connected venue around – for this football season, AT&T Stadium will have 1,900+ permanent Wi-Fi APs and another 100 or so available for temporary placements – it also aims to hide the physical gear as much as possible. In suites and hallways there is the natural solution of putting antennas behind ceiling panels, but in the seating bowl, Winborn said, “we don’t have a lot of areas to hide them. We’ve had to become pretty clever about ways to hide APs.”

Two years ago, when the stadium’s AP count was going up from 750 to 1,250, the idea came about to design a custom fiberglass “shroud” that would circle the arena on the front of the overhangs above the first and second seating levels.

A row shot of the under-seat APs. Photo: Dallas Cowboys

Winborn said one of the IT staff members had contacts in the manufacturing world, which helped the Cowboys build a slightly convex design that wouldn’t be readily apparent to the untrained eye, yet is big enough to house Wi-Fi and DAS gear all the way around the bowl.

Winborn said the shroud and its underlying gear were installed during one of the recent off-seasons, taking a couple of months — and the final result was so good that Winborn says he needs to use a laser pointer to show interested parties exactly where the equipment shroud sits. Since it’s fiberglass, the shroud is relatively easy to move to allow administration and maintenance of the equipment, and the seamless flow of the structure around the bowl may just be the most elegant AP-hiding strategy in the short history of stadium Wi-Fi.

But even with the shrouds there was still a need for more new placements, especially in the middle of the open seating areas. So last year the AT&T Stadium team started deploying under-seat AP enclosures, working with design teams at the AT&T Foundry program to build a custom unit that is much smaller and less obtrusive than other under-seat AP enclosures currently in use.

“We worked with the AT&T Foundry and went through [testing] about a half-dozen models,” Winborn said, before finally arriving at a design that worked well and stayed small. “It’s about the size of a small cigar box,” said Winborn of the under-seat APs, 300 of which were installed in the 100-level seating last year. Another 250 are being installed for this year up in the 300-level seating, he said.

Winborn credited the early use of under-seat APs by the IT team at AT&T Park in San Francisco as a welcome guide.

Here’s the big bowl that needs to be filled with Wi-Fi. Photo: Paul Kapustka / MSR

“I talked to the Giants and Bill [Schlough, the Giants’ CIO] and had my concerns” about under-seat APs, Winborn said. “But after they did it and had only one complaint in 2 years, that raised my comfort level.”

Like the Giants’ under-seat APs, the ones in AT&T Stadium are designed to be as maintenance-free as possible, so that they can be steam washed and not harmed by spills or any other physical interactions. Winborn said the Cowboys have even started putting sealant and paint over the top of the under-seat APs, “so they look just like a bump.”

With 1,900 to 2,000 APs available, it might seem like the AT&T Stadium IT crew has enough APs for now and can relax a bit when it comes to finding new ways to hide Wi-Fi gear. But Winborn knows the next surge is probably right around the corner, with early results from this season already showing 4+ terabytes of Wi-Fi use.

“Everything we are giving the fans [in Wi-Fi bandwidth] they are gobbling it up, pretty quickly,” Winborn said.

Rangers fans lead postseason baseball DAS usage on AT&T networks

Fans at the Texas Rangers’ Globe Life Park in Arlington have so far topped the charts for cellular traffic totals on AT&T networks during baseball’s postseason, with an average of 992 gigabytes of data used in two games played.

Across all the series, DAS totals for postseason play showed big leaps in data use compared to regular-season totals, in one case almost six times as much. And while it’s not exactly an apples-to-apples comparison, it looks like DAS traffic for games this year might eclipse last year’s record wireless traffic totals at places like AT&T Park.

According to statistics provided by AT&T, game 3 of the divisional series between the Rangers and the Toronto Blue Jays saw 1,109 GB of data move across the AT&T DAS network at Globe Life Park, the highest single-game DAS total across all baseball venues this fall. Remember, stats mentioned here are ONLY AT&T customer traffic on AT&T networks in the stadiums mentioned. According to AT&T, the 992 GB average of the two games so far in Arlington is 51 percent higher than the average DAS use from the season’s opening series back in the spring.

Over in the National League, AT&T customers at Citi Field in New York used 617 GB of data during game 3, which AT&T said was an increase of 600 percent compared to average use during the Mets’ season-opening series. At games 1 and 2 in Dodger Stadium, AT&T saw an average of 532 GB of data used per game, a 34 percent jump from the season-opening average in Chavez Ravine.

Game 3 of the Chicago Cubs vs. St. Louis Cardinals series at Wrigley Field saw 500 GB of data used, a 120 percent jump compared to the season-opening series (which may be skewed, since Wrigley was still undergoing construction at that point). For games 1 and 2 at Busch Stadium in St. Louis, the AT&T networks saw an average of 586 GB per game, with 617 GB used during game 2.
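If you take AT&T’s percentage increases at face value, you can back out rough per-game baselines for the season-opening series. Here’s a small sketch of that math; it’s our own back-calculation, and it assumes “an increase of N percent” means N percent above the opening-series per-game average.

```python
# Back-calculating the implied season-opening per-game DAS averages from the
# AT&T postseason figures above. Our own rough math, not AT&T's numbers.

postseason_gb = {
    "Globe Life Park (2-game avg)": {"gb": 992, "pct_increase": 51},
    "Citi Field (game 3)":          {"gb": 617, "pct_increase": 600},
    "Dodger Stadium (2-game avg)":  {"gb": 532, "pct_increase": 34},
    "Wrigley Field (game 3)":       {"gb": 500, "pct_increase": 120},
}

for venue, d in postseason_gb.items():
    implied_opening_avg = d["gb"] / (1 + d["pct_increase"] / 100)
    print(f"{venue}: implies roughly {implied_opening_avg:.0f} GB per game in the opening series")
```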

For the Royals-Astros series, AT&T did not have any stats for games in Kansas City (perhaps because the Kauffman Stadium DAS is still being deployed), but for games 3 and 4 at Minute Maid Park in Houston AT&T saw an average of 237 GB per game.