Microsoft outduels Amazon for JEDI

Microsoft beats out Amazon after contentious competition for DOD’s JEDI award

Late on the afternoon of Friday, Oct. 25, the Department of Defense (DOD) announced it had selected Microsoft (Nasdaq: MSFT) for its lucrative Joint Enterprise Defense Infrastructure (JEDI) cloud contract, the Pentagon’s plan to adopt a general-purpose cloud infrastructure first announced in November 2017. The notification of JEDI’s winner came at an odd time — we saw the first notification of Microsoft’s win at 6:30 p.m. EDT. Releasing news or documents late on a Friday afternoon is sometimes referred to as a “Friday news dump” by members of the media, a technique that can thwart in-depth media analysis of bad news or unfavorable developments affecting the story’s source.

Regardless of why the DOD chose to announce the winner of the biggest single cloud contract to date in federal IT (and one of the biggest IT contracts in federal IT history) when it did, Microsoft is now poised to capture potentially billions in revenue as the DOD’s leading cloud vendor on JEDI, an award with a $10 billion ceiling and a potential 10-year life span if all options are exercised. Vendor selection for JEDI has been ongoing for over a year, plagued by multiple protests, internal investigations, and conflict-of-interest allegations by and between the initial four contestants, Amazon (Nasdaq: AMZN), IBM (NYSE: IBM), Microsoft and Oracle (NYSE: ORCL). The acrimony kept the DOD from awarding JEDI by its original target date of April 2019, though the agency eliminated IBM and Oracle in April in the first “down-select” of the vendor review process.

Amazon was once the ostensible front-runner, but Microsoft’s approach to hybrid cloud may have won out in the end

Amazon won the $600 million cloud award with the CIA in 2013, beating out AT&T (NYSE: T), IBM and Microsoft, an engagement many industry observers expected would act as a springboard for Amazon to future cloud work in the federal IT sector. After JEDI was announced in late 2017, industry analysts believed Amazon, the market share leader in the cloud space, and its ongoing cloud work in the U.S. Intelligence Community (IC) would help clear the way to victory on JEDI. Amazon’s alliance with VMware (NYSE: VMW) was key to winning the CIA cloud work, as VMware was estimated to be hosting between two-thirds and three-quarters of government workloads running on the cloud at the time. Amazon had also enhanced the security of its cloud offerings to accommodate defense- and intelligence-grade data assurance needs by steadily obtaining new authorizations to host government data at increasingly higher security levels. As the vendor selection process for JEDI moved along, however, concerns arose that JEDI’s single-source structure would diminish the DOD’s flexibility in choosing cloud vendors and technologies. There were also indications during 2019 that the DOD’s cloud migration strategy was increasingly favoring a more piecemeal and unhurried transition to the cloud. The DOD’s evolving cloud preferences seemed to shift the JEDI competition in favor of Microsoft’s hybrid cloud approach that blends existing IT infrastructures with new cloud systems while leveraging partners to a greater degree in the migration process.

Peering over the edge with 5G

Edge and 5G computing working together: It just makes sense

5G and the edge exemplify the ‘co-influenced’ trend

The afternoon opened with an exercise whereby small groups were asked to define the relationship between the edge and 5G. The somewhat laborious attempt to define the edge in the context of 5G (or vice versa) exemplified the fact that technology is developing so rapidly that the terminology to define it does not yet exist. While the communal search for a word could have dragged on for the better part of the afternoon, the group, pressed for time, settled on the term “co-influence.”

These small group breakouts led to lively discussion and provided insight into how the edge and 5G are not co-dependent, and how there are no hard-and-fast rules as to which precedes the other or which is the greater influencer. Edge technology, while not a spotlight stealer, was cast in a leading role at a conference that in previous years was solely dedicated to 5G. After a morning of 5G transport immersion, the afternoon sessions highlighted that the industry recognizes the two technologies already have a symbiotic relationship, no matter what definition is used.

5G and connecting the world via the edge

Despite the excitement about what the future holds with 5G, there is widespread acknowledgement that the technology is still extremely new and that there are very few use cases highlighting extensive deployment success, and those only in a few highly controlled markets. Indeed, many of the event’s examples spoke to the potential of 5G and showed fascinating benefits, but within controlled testing environments.

Outside the lab, potential exists for the edge to help 5G scale massively, as billions of connected devices will be widely distributed worldwide by the end of 2019. While a fraction of these devices will be ubiquitous smartphones, the rest will consist of connected wearables, implantables, drones and sensors on just about everything imaginable. The proliferation of edge computing across this increasingly diverse network of devices enables a means to process, filter and protect data locally, which, in turn, plays a key role in enhancing the value of 5G networks.

On Oct. 10, over 200 5G transport industry professionals gathered in New York for a day packed with presentations and discussions around the deployment of advanced 5G transport networks as well as the integration of edge cloud infrastructures and services. The event, hosted by Light Reading, brought together industry thought leaders from Verizon, Juniper Networks, Corero, ZenFi, MetTel, Ericsson and Fujitsu, among others, who led discussions on a range of topics covering edge concepts, mobile connectivity solutions, and technology structure, architecture and design. While 5G transport was the overarching topic, discussion around the edge factored heavily into the event. For the first time in the history of the 5G transport conference, one of the two afternoon tracks was dedicated to the relevance and importance of edge technology as it relates to 5G.

Ecosystems and trust: What KPMG brings to blockchain

‘It’s not about the enterprise anymore; it’s about the ecosystem’

Opening the event with KPMG’s view of innovation and technology, including specifics around blockchain, National Managing Partner for Innovation and Enterprise Solutions Fiona Grandi and Global Blockchain Leader Arun Ghosh emphasized that achieving meaningful blockchain adoption requires moving beyond the enterprise to the entire ecosystem. In these remarks, particularly when KPMG stressed its role as a network provider, a “trusted layer” across a platform and an ecosystem, TBR heard echoes of the “Business of One” framework and the gradual shift within the IT services, consulting and technology space toward more robust partnering — and clients that expect more from their vendors’ ecosystems. Trust, as repeatedly invoked by KPMG, echoes the firm’s DNA as one of the Big Four, a firm trusted with clients’ financials, systems and regulatory obligations. Neatly pulling these two ideas together — the increasing need to play across an ecosystem, and KPMG’s core value around trust — Ghosh said one key question the firm helps clients answer, when considering blockchain, is quite simply, “Can I create a trusted ecosystem?” If clients can answer that question, they are prepared to move beyond what Grandi described as a nonstarter position around blockchain. “When [clients] say, ‘We want it on blockchain,’ they haven’t thought it through,” Grandi said. On a more concrete level, KPMG’s leaders stressed the firm’s role in helping clients move toward smart contracts, a core use case for blockchain’s distributed ledger technology. Smart contracts, as KPMG’s U.S. blockchain program lead, Tegan Keele, summed up nicely, do not automate processes; they remove manual tasks. To remove those manual tasks, businesses comprising the ecosystem have to reach a consensus on process diagrams to establish the governance flows for the blockchain.

One specific example from the day stood out to TBR. KPMG professionals described a large-scale operations consulting engagement, including “pain and trust point mapping,” that led to a blockchain-enabled solution providing farm-to-table provenance, starting with the government agency responsible for licensing the farms. We will explore below how a government-mandated blockchain could enhance societal goals around welfare and certification, but the key characterization of KPMG’s role came from Ghosh, who said the use case highlighted the firm’s overall goal for blockchain, which is to “create [a] common, real-time, trusted source of the truth to help solve industry’s most critical issues … create an ecosystem around something that already exists, then add a layer of trust, enabled by blockchain.”

‘Those who get it want to create their own ecosystem and control it’

Understanding how KPMG defines the core values of blockchain requires also understanding how clients and technology partners see the firm itself, including what KPMG brings to innovations, engagements and solutions. Throughout the event, KPMG ceded the stage to clients and technology partners, such as IBM (NYSE: IBM) and Microsoft (Nasdaq: MSFT), that repeated a few key themes on what KPMG brings to blockchain. Most frequently, these speakers noted KPMG’s industry expertise, especially as related to specific business processes and industry-centric regulatory challenges. On this second point, one client stated that KPMG’s trusted brand and regulatory expertise were essential in the blockchain space “to drive institutional adoption.” Another client said KPMG brought a “holy trinity of expertise” around business processes, applicable technology and change management. (Note: In TBR’s view, change management remains a critical, if sometimes neglected, element of all emerging technology adoption and digital transformation. As multiple clients and consultancies have said, “The people, not the technology, are the problem.”) A technology partner said blockchain is a “team sport” and that “KPMG has deep process expertise in life sciences and supply chain,” two elements that had been critical to the partner’s joint engagement with a U.S. pharmaceutical giant. TBR also noted that multiple KPMG clients described the firm as a systems integrator (SI), fitting with KPMG’s approach to let the solution drive decisions around the technology stack, products and software.      

McDermott will support ServiceNow’s ambitions to fill enterprise application gaps left by vendors like SAP

On Oct. 22, 2019, ServiceNow announced Bill McDermott, who resigned from SAP less than two weeks prior, would be taking over John Donahoe’s position as CEO at the end of 2019. McDermott’s experience in the enterprise software space will inform ServiceNow’s innovations in and around business applications from SAP and its closest competitors.

McDermott’s knowledge of the enterprise applications space is key

At its core, ServiceNow has a very different software portfolio than SAP, but considering the strategic objectives ServiceNow recently laid out, McDermott is well equipped to steer the company toward continued leading financial performance. As ServiceNow aims to engage more deeply with Global Elite partners such as Deloitte and Accenture, and to develop solutions that fill enterprise software gaps as it has with mobile onboarding and financial close automation, McDermott’s enterprise applications experience is a good fit. In addition to his global partner and enterprise customer relationships, McDermott brings a deep and unique understanding of the “gaps,” or workflow disconnects, around enterprise applications that ServiceNow has identified as its key growth areas.

That said, it is a toss-up whether McDermott’s guidance will help ServiceNow avoid innovating into competitive overlap or steer the company directly into applications competition. With his knowledge of the functionality and gaps in broad enterprise applications suites like SAP Business Suite for HANA (S/4HANA), McDermott can direct ServiceNow’s innovation to either fill gaps or directly compete on specific functions. With that uncertainty, this appointment should put SAP, Oracle and Workday on high alert through 2020, as McDermott’s influence becomes clearer.

Notably, ServiceNow has been undergoing other executive changes, with former CFO Mike Scarpelli leaving to follow Frank Slootman (ServiceNow’s CEO before Donahoe) to Snowflake. McDermott has indicated that ServiceNow and Donahoe have already given him the leeway to influence the CFO replacement process, and there is opportunity for McDermott to further shape ServiceNow through other executive appointments.

Core to ServiceNow’s capabilities, the Now Platform has long been overshadowed by the applications built on top of it. ServiceNow’s dilemma with the Now Platform is not how to enhance the capabilities, but how to brand the portfolio in such a way that the platform becomes as synonymous with the ServiceNow brand as the early IT workflow products have, while still capitalizing on the company’s ability to innovate into ― and capitalize on ― niche solution areas.

Centricity of technology and talent: Atos keeps growing in North America

New Artificial Intelligence Lab in Texas will facilitate Atos’ collaborative innovation work with clients

On Oct. 3 Atos held a ceremony for the launch of its new Artificial Intelligence (AI) Lab in partnership with Google Cloud (Nasdaq: GOOGL). The Atos AI Lab, so far the largest of Atos’ labs, is located on the first floor of Atos’ office building in Irving, Texas, and joins a global network of labs, the rest of which are located in London, Paris, Frankfurt and Munich. The lab will work with clients in North America to build clients’ understanding around AI and define use cases for data analytics and machine learning with the goal of improving their business performance. Collaborating with Atos and Google, clients will be able to jointly develop solutions that are specific to their business and industry needs. TBR noted that the lab is a very bright, flexible and laid-back environment that enables ideation and creative thinking by combining digital experience and design thinking methodology. Another thing that stood out during our discussions with Atos’ North America leadership team is that North America is becoming an innovation hub for the company. Atos is instilling a culture of innovation, entrepreneurship and creative thinking in North America and creating intellectual property that is brought back to Europe to be utilized with clients in Atos’ main geography. Because these kinds of collaborative centers play an important role in the ability of vendors, including Atos, to groom talent and leadership, proper messaging about specific technology capabilities, such as those around Google Cloud, will help Atos stay abreast of the next wave of partnership models. According to TBR’s Digital Transformation Insights Report: Cross Vendor, published in September 2019, partnerships will also evolve in the long term, as in their current form, the technology diminishes differentiation among all parties. This evolution could create siloed, federated-like model enterprises, which bring a different set of challenges.
However, with the expectations set by the advent of open data standards, amplified through blockchain, TBR anticipates such hurdles will be easy to overcome.

Establishing a talent base will improve Atos’ ability to generate IP in North America

Early in the event, Walsh spoke of the “centricity of technology and talent,” summarizing Atos’ view of itself, its clients and its ecosystem — a sentiment echoed repeatedly by Atos executives throughout the day. In TBR’s view, Atos recognizes that its strength lies not in strategy consulting but instead in staying centered on technology while trying to help clients solve business problems. Atos also recognizes that talent, not technology, will differentiate consultancies and IT services vendors from each other as everyone pursues digital transformation. On multiple occasions during the event, both in the formal presentations and in sidebar discussions, Atos executives stressed the company’s commitment to training, reskilling, developing and retaining top talent across all of its lines of business (and geographies, although most of the event centered on North America). Wagner’s description of purpose-built, SAP-centric teams around consulting, Google Cloud Platform and the CTO was as much about the talent Atos needed to recruit and develop as it was about the organizational changes Atos needed to make and the SAP capabilities it needed to acquire.

In all, the Texas event confirmed to TBR that Atos’ strategy and execution in North America have shifted into a substantially higher gear, as the company accelerates its push to bring U.S.-led initiatives to the front of this very European company. One small indication: According to Walsh, Atos North America now generates more IP to share with Atos Europe than Europe develops to send to the U.S. TBR does not expect Atos’ U.S. headquarters in Irving to replace Atos’ global headquarters near Paris any time soon, but the gravitational pull of success will make the next few years interesting for Atos. 

Supporting its strategy to expand in North America, France-based digital transformation company Atos held its second annual North America industry analyst event at its North America headquarters in Irving, Texas. The event took place in Atos’ Business Technology Innovation Center (BTIC), which is part of a network of nine BTICs that Atos uses as a platform for hands-on innovation with customers and partners. Using a balanced mix of presentations, innovation showcases set up at the BTIC, one-on-one sessions, and North America client examples — one of which, a public sector organization, was presented live on stage — Atos showed the industry analyst community that its expansion in the region is accelerating. Atos is using acquisitions, such as that of Syntel, to expand its portfolio and resources, improve its service delivery model, and drive profitable growth in North America.

5G will drive CSPs to adopt a new network architecture, with NFV and SDN as critical aspects

According to TBR’s 3Q19 NFV/SDN Telecom Market Landscape, 5G will push CSPs to accelerate and broaden their NFV/SDN initiatives. As such, TBR expects NFV/SDN-related spend growth will correlate with 5G deployments. Since CSPs will need to upgrade their networks from an end-to-end perspective to realize the full potential of 5G, this will naturally drive CSPs toward the virtualization and cloudification of their networks. This trend will impact most, if not all, of the major network domains from an NFV/SDN perspective over the next five years. TBR notes that 5G core is inherently virtualized, and that this will also naturally push CSPs deeper into the NFV/SDN space over the next five years as they transition to stand-alone 5G networks.

CSPs are more deeply collaborating with other operators, telecom vendors and other technology providers to reap the full benefits offered by NFV and SDN. In particular, the technology industry is becoming more willing to work with open-source groups such as Airship and the O-RAN Alliance to develop open infrastructure that is interoperable across multiple vendors. CSPs such as AT&T and Colt Technology Services are also collaborating to advance the development of intercarrier virtualized network solutions, which will particularly benefit multinational businesses requiring connectivity from multiple service providers within their global footprint.

TBR’s NFV/SDN Telecom Market Landscape includes key findings, market size, customer adoption, operator positioning and strategies, geographic adoption, vendor positioning and strategies, and acquisition and alliance strategies and opportunities. TBR’s NFV and SDN research encompasses the internal (i.e., network operations) and external (e.g., SD-WAN and other virtual network functions [VNFs] sold to end users such as enterprises) NFV- and SDN-related initiatives of communication service providers (CSPs).

For additional information about this research or to arrange a one-on-one analyst briefing, please contact Dan Demers at +1 603.929.1166 or [email protected].

Transform at the intersect: NIIT Technologies and the near future of digital and post-digital transformation

NIIT Technologies gets closer to buyers to provide deeper support in an evolving digital and post-digital market  

Opening the event, NIIT Technologies CEO Sudhir Singh described his efforts to recraft the company as a “post-digital firm,” including making cognitive a part of every NIIT Technologies engagement. He further declared NIIT Technologies had transitioned from being a vendor that works for clients to being a services vendor that works with clients and technology partners. Across the three-day event, multiple NIIT Technologies leaders and professionals echoed this sentiment around the shift from “work for” to “work with.” Clients also repeated versions of this message, indicative of the traction it has gained within NIIT Technologies’ ecosystem. At a strategic level, Sudhir Singh said his company intended to “move the center of gravity to the markets,” putting NIIT Technologies’ people where the company’s clients are. In that effort, NIIT Technologies over the last 18 months has opened new centers in Atlanta and Augusta, Ga.; Las Vegas; Princeton, N.J.; and Boise, Idaho. At the same time, Sudhir Singh reiterated an NIIT Technologies characteristic which TBR highlighted last year: staying focused on doing a few things exceptionally well. Now and going into next year, this approach includes limiting industries served, remaining selective in partnerships and identifying a limited number of emerging technology areas where NIIT Technologies can excel. Sudhir Singh said (and multiple discussions with NIIT Technologies professionals and clients confirmed) that banking, insurance, travel and retail/media make up the vast majority of NIIT Technologies’ clients, and one of NIIT Technologies’ strengths is a “hyper-specialization” within these industries. Matrixed across those four industries, NIIT Technologies delivers services in four technology areas: cognitive, data analytics, automation/integration and cloud. 
Staying within its core “swim lane” is the right approach as, according to TBR’s Digital Transformation Insights research, buyers often expect vendors to bring forward pointed, purpose-driven solutions rather than “blue sky” transformational ideas during workshop discussions.

Turning to the overall IT services market, Sudhir Singh focused on three main trends. First, “technology spend is increasingly nondiscretionary,” resulting in less worry on NIIT Technologies’ part about clients’ year-to-year IT budgets and a greater emphasis on long-term relationships and fully leveraging emerging technologies. Second, at a third or more of NIIT Technologies’ clients, the COO also acts as the CIO, furthering a trend toward digital readiness and adoption across all leadership levels within an enterprise. (Note: TBR’s Digital Transformation Insights Report: Voice of the Customer from earlier this year confirms these two trends.) Lastly, Sudhir Singh said that with cognitive being the “X factor” in IT services going forward, fully connecting the front, middle and back office while continuing to get customer experience right will drive most enterprises’ digital and post-digital transformation in the near term.

NIIT Technologies gathered roughly 170 clients, technology partners, industry experts, analysts and NIIT Technologies professionals for two days of discussions and informal meetings in Miami. The setting fostered casual conversations among clients and other attendees, with clients surprisingly receptive to TBR’s questions. NIIT Technologies had only one main stage presentation, an opening address by the CEO, with clients and industry experts driving the rest of the panel discussions and main stage presentations.

Hybrid, cloud-native and open source define Cloudera’s 3-pronged approach, post-merger

Cloud-native and open source are top of mind in Cloudera’s post-merger product portfolio

One of the key highlights of the event was the launch of Cloudera Data Platform (CDP), an open-source, hybrid cloud platform that includes Cloudera Data Warehouse, Cloudera Machine Learning and Cloudera Data Hub services. CDP is currently available on Amazon Web Services (AWS; Nasdaq: AMZN); however, Cloudera hopes to provide customers with a broader range of IaaS providers, as the company announced plans to bring CDP to Microsoft Azure and Google Cloud Platform (GCP) in the coming months. While Cloudera is taking a calculated risk by pushing customers toward competing services, TBR believes the benefits will outweigh the costs due to the vendor’s increased exposure to a large customer base. The launch of CDP highlights the company’s cloud-native play but also aligns with Cloudera’s intent to offer customers more deployment options. TBR notes that many vendors still perceive the data center as a legacy standard; however, Cloudera is attempting to view it as a gateway to creating a hybrid instance, exemplified by its forthcoming launch of an on-premises version of CDP, dubbed CDP Data Center. This offering will be especially appealing to “lift and shift” customers who have large data sets on-premises and wish to migrate to the cloud.

Relying on security and governance for differentiation

Leveraging open-source technology to deliver solutions to customers regardless of deployment method is rapidly gaining acceptance in the market and therefore has forced Cloudera to explore new avenues for differentiation. TBR believes the vendor is attempting to achieve this through its enterprise-grade security and data governance solution, Cloudera SDX (Shared Data Experience). As a single management plane, SDX separates the data layer from the compute layer to provide automated security and compliance across platforms to help reduce costs and mitigate risk. Cloudera works its SDX offering into the rest of its portfolio, including its recently launched CDP offering, to secure data lakes and centrally manage large amounts of data. VP of Product Management Fred Koopmans and VP of Engineering Ram Venkatesh highlighted the negative effects shadow IT vendors are having on customers’ data privacy as a lack of interconnectivity between platforms hinders fraud detection and data repurposing.

Additionally, shadow IT causes dispersed data, which will inevitably require more labor resources and thus only increase the burden on customers that are likely operating with a shortage of IT skills. Findings from TBR’s 1H19 Cloud Applications Customer Research indicate that shadow IT is being eliminated while increasingly consolidated purchasing is leading lines of business to report greater autonomy when it comes to making IT decisions. As a result of these trends, we believe Cloudera is taking the right approach by strengthening SDX integrations to provide customers with greater autonomy and centralized data, making app developers, data engineers, business intelligence (BI) analysts and data scientists far more likely to adopt CDP or similar platforms.

In September Cloudera hosted its annual Cloudera Analyst Day, where analysts gained insights during breakout sessions, product demonstrations, keynotes and detailed one-on-one talks with company executives, customers and partners. Key talks included product demonstrations from Cloudera’s recently appointed CEO Marty Cole, Chief Marketing Officer Mick Hollison and Chief Product Officer Arun Murthy, along with a presentation from IBM’s General Manager of Data and AI Rob Thomas. Founded in 2008, Cloudera operates in 85 countries and has approximately 3,000 employees and over 2,000 customers.

Arriving at the edge of cloud computing

The cloud reimagined by edge computing and influenced by IoT

Cloud computing can best be described as centralized data centers running thousands of physical servers that are accessed remotely. All devices that need to access this data or use applications associated with it must first connect to the cloud. Since everything is centralized, the cloud is generally easy to secure and control while still allowing for reliable remote access.

As IoT devices become more common and require more processing power, an increasing amount of data is being generated on what is referred to as the edge of distributed computing networks. By sending only the most important and least time-sensitive information to the cloud, as opposed to raw streams of it, edge computing eases the burden on the cloud and reduces costs. Put simply, edge computing delivers the decentralized complement to today’s core centralized and hyperscale cloud and legacy data centers.
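The local filtering described above can be sketched in a few lines of Python; this is an illustration of the pattern, not any vendor's actual pipeline, and the function name, anomaly threshold and payload shape are all assumptions made for the example.

```python
# Illustrative sketch of edge-side filtering: aggregate a window of raw
# sensor readings locally and forward only summary statistics plus
# out-of-range values to the cloud. All names and thresholds are
# hypothetical, chosen only to demonstrate the pattern.

def summarize_window(readings, anomaly_threshold=90.0):
    """Reduce a window of raw readings to a compact cloud-bound payload."""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
        "anomalies": anomalies,  # only the out-of-range values travel upstream
    }

# A 10-reading window collapses to one small payload instead of 10 uploads.
payload = summarize_window(
    [70.1, 71.3, 69.8, 95.2, 70.0, 70.4, 71.1, 69.9, 70.2, 70.5]
)
print(payload)
```

In this sketch, ten raw readings become a single payload carrying a count, a mean, a maximum and one anomalous value, which is the burden-easing effect on cloud bandwidth and compute that the paragraph above describes.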

The edge and the cloud do not compete with one another, and emphasizing edge or cloud computing is not an “either/or” choice; rather, the adoption model can be viewed as a “1+1=3” opportunity. The relatively distributed nature of cloud and access to scalable compute resources is augmented by the real-time data gathering potential of the edge, easing efficiency and latency concerns. These latency requirements vary by device and are highly situational, depending on the need for real-time analytics and response versus transactional or business intelligence analytics.

More than a decade after the initial transition to the cloud forever expanded the limits of physical and on-premises storage and compute options, we’ve reached, quite literally, the edge of a new era of cloud. Organizations in industries such as telco and manufacturing, among others, will increasingly rely on edge computing to provide a suitable infrastructure and to complement the ongoing adoption of related technologies such as machine intelligence and IoT. The edge should not be viewed as a threat to cloud computing, but rather as the next phase in the evolution, driving increased adoption of the cloud into the next decade.

Timely clearance of mid-band spectrum is essential for U.S. to remain at forefront of global 5G race

TBR perspective

Significant progress has been made on 5G ecosystem development since the 2018 5G Americas Analyst Forum held last October, as commercial mobile 5G services have been launched by the four U.S. Tier 1 operators, as well as in Uruguay by state-run operator ANTEL, over the past year. However, the infancy of the 5G era in the Americas has been somewhat underwhelming due to tepid smartphone adoption, the limited range of service on millimeter wave spectrum, and lack of coverage outside major metro areas.

The U.S. is at risk of falling behind other countries, especially South Korea and China, in the global 5G race. 5G adoption is growing at a more accelerated rate in South Korea, as the country gained 2 million 5G subscribers within the first four months of commercial services being offered and reached 3 million 5G subscribers as of September. South Korea’s rapid growth is being driven by its widespread 5G coverage, which is expected to reach 80% of the population by the end of 2019, as well as operators heavily subsidizing 5G devices to offset high smartphone prices. Meanwhile, China will make a strong entrance into the 5G market by launching commercial services in 50 major cities at the beginning of October, with plans to deploy 100,000 5G sites by the end of 2019.

The greatest barrier to the U.S. competing at the forefront of the global 5G race is its current lack of mid-band spectrum as global operators across all major regions have already been allocated a significant amount of mid-band licenses to support initial deployments. Offering 5G services across a mix of low-band, mid-band and high-band spectrum is critical to provide optimal coverage. Though deploying services on millimeter wave spectrum is necessary for U.S. operators to realize the fastest 5G speeds, the licenses are limited by the short range of coverage they provide.

Conversely, low-band spectrum will provide the coverage range necessary for operators including AT&T (NYSE: T) and T-Mobile (Nasdaq: TMUS) to deploy nationwide 5G services in 2020, but the spectrum will not yield significantly faster speeds compared to LTE. Mid-band spectrum provides the best of both worlds, speed and range of coverage, and the acquisition of mid-band licenses will play a pivotal role in the Americas’ position in the global 5G market as well as how individual operators compete for 5G market share in their respective countries.

Nearly 200 industry analysts and representatives from well-known telecom operators and vendors convened at the 2019 5G Americas Analyst Forum to discuss the state of the developing 5G market in North America and Latin America. The event featured an opening presentation from T-Mobile CTO Neville Ray regarding 5G leadership in the Americas, a fireside chat with Federal Communications Commission (FCC) Commissioner Michael O’Rielly, and a choice of 26 roundtable discussions focused on key 5G topics including IoT, edge computing, 5G network infrastructure and technologies, regulatory considerations, and private cellular networks.