Posts

Atos’ pragmatism cuts through the AI hype

Atos Technology Days 2018, held in Paris on July 4 and 5, displayed myriad practical applications of emerging technologies. It was not a science fair focused on the future, as many technology vendor events have been of late. Rather, Atos showcased artificial intelligence (AI)-infused applications powering prescriptive maintenance, digital twins, Internet of Things (IoT)-enabled retail and payment applications, electric grid optimizations, and process optimization use cases for insurance and banking. The presentations walked through the underlying technological components powering these business outcomes and explained how Atos, in conjunction with its major technology alliance partners such as Google and Siemens, can stitch together existing technologies to deliver this business lift to its customers.

Intelligent use of artificial intelligence: A tech provider’s guide to automation success

The enterprise automation market opportunity is nearing a tipping point where proof-of-concept tests using adaptive, emerging technologies are hardening and scaling. The gap between leaders and laggards continues to widen, with leaders now moving from obvious cost-takeout initiatives to creative-destruction pilots aimed at new revenue sources in adjacent markets. Amazon (Nasdaq: AMZN), for example, has been throwing knockout punches against traditional brick-and-mortar firms and is now taking aim at legacy grocers through its Whole Foods acquisition. Meanwhile, as more and more compute provisioning becomes abstracted and automated into building blocks on the way to utility computing, a concept first raised before the turn of the century, as little as one-tenth of data center personnel are required to monitor and maintain enterprise compute instances. Like it or not, we are on the way to Skynet.

By Executive Analysts Geoff Woollacott and Stephen Davidson

 

Portfolio and network enhancements showcase Telefonica’s evolution to digital service provider

Telefonica’s (NYSE: TEF) progress in evolving from a traditional communications service provider to a digital service provider was highlighted at the company’s 2018 Industry Analyst & Customer Day. Telefonica’s digital transformation initiatives will yield internal cost savings and increased operational efficiency for the company while enhancing connectivity, scalability and customer experience for consumers and businesses. Telefonica is also expanding its portfolio in emerging technologies, including artificial intelligence (AI), machine learning (ML), big data and advanced Internet of Things (IoT) use cases, which will increasingly become a part of businesses’ and consumers’ daily lives over the next decade. However, Telefonica faces difficulties on this journey: balancing network and technology investments while managing its high debt load, navigating a challenging regulatory environment for telecom companies, and fending off competitive threats from webscales and rival carriers that are also aggressively expanding their digital portfolios.

Telecom vendors anticipate revenue from 5G in select countries as early as 2H18, but lower China capex drove market decline in 1Q18

HAMPTON, N.H. (June 29, 2018) According to Technology Business Research, Inc.’s (TBR) 1Q18 Telecom Vendor Benchmark, the conclusion of LTE coverage projects in China hampered revenue for the largest vendors. A reduction in demand from telecom operators for routing and switching products also caused revenue to decline for Cisco, Juniper and Nokia. In this market downturn, vendors are employing various strategies to maintain margins and mitigate revenue declines while eyeing initial commercial 5G rollouts, which are set to begin in the U.S. in 2H18.

“Suppliers are trying to sell into the IT environments of operators, engaging with the webscale customer segment, and expanding software portfolios to partially offset falling telecom operator capex,” said TBR Telecom Senior Analyst Michael Soper. “The shift in the revenue mix toward software is also helping vendors maintain operating margins in spite of lower hardware volume. Vendors are also growing their use of automation and artificial intelligence in service delivery to improve profitability by reducing their reliance on human resources.”

Western-based vendors are preparing their portfolios to build out 5G for U.S.-based operators in 2H18. Several operators have aggressive 5G rollout timetables and intend to leverage the technology for fixed wireless broadband and/or to support their mobile broadband densification initiatives. Vendors that have high exposure to the U.S. and are well aligned with market trends such as 5G, media convergence and digital transformation will likely increase market share over the next two years as operators in the region are expected to aggressively invest in these areas starting in 2H18.

The Telecom Vendor Benchmark details and compares the initiatives of the largest telecom vendors and tracks their revenue and performance across segments, including infrastructure, services and applications, and across geographies, including the Americas, EMEA and APAC. The report includes information on market leaders, vendor positioning, vendor market share, key deals, acquisitions, alliances, go-to-market strategies and personnel developments.

For additional information about this research or to arrange a one-on-one analyst briefing, please contact Dan Demers at +1 603.929.1166 or [email protected].

 

ABOUT TBR

Technology Business Research, Inc. is a leading independent technology market research and consulting firm specializing in the business and financial analyses of hardware, software, professional services, and telecom vendors and operators. Serving a global clientele, TBR provides timely and actionable market research and business intelligence in a format that is uniquely tailored to clients’ needs. Our analysts are available to further address client-specific issues or information needs on an inquiry or proprietary consulting basis.

TBR has been empowering corporate decision makers since 1996. For more information, please visit www.tbri.com.

Democratization now: It’s good for business

Data democratization is a hot topic now. Spokespeople from Google, SAP, Capgemini and other tech companies have spoken and written about how making data available to as many people as possible will both unleash the power of technology and prevent abuse of closely held data. Microsoft CEO Satya Nadella sprinkles his talks and interviews with references to democratization. TBR agrees that data access is critical to achieving artificial intelligence’s (AI) considerable potential, but access is not enough. For AI to fulfill that potential, business people with domain knowledge, the technology’s regular users, must be able to work directly with data, without intervening layers of developers and data scientists.

Data access is a conundrum. Ensuring appropriate privacy and security while making data available to as many people as possible is a challenge, one that is inhibiting the growth of widespread AI-driven data analysis. This post will not address that challenge, however. It focuses on another growth inhibitor: the current need for data experts (data scientists, data engineers and data “janitors”), as well as developers, to extract the value from data.

Business users might see the hierarchy of data experts as a priesthood or a bureaucracy, standing between them and the data, but that is not really what is happening. Currently, there are no tools with which business users can conduct their own analyses, at least not without a lot of preparation by the data experts. Better tools are coming; there are R&D efforts worldwide to make data more directly accessible, which is part of what Nadella and other spokespeople are talking about.

Even before these democratic tools are made available, there is strong growth in AI and the use of data analytics, because the value is there. But the need for experts greatly increases the cost of analysis, so only the analyses with the highest potential value are performed. As more democratic tools become available, many more analytic projects will be worthwhile, and the use of analytics will grow much faster.

The impact of democratized analytics tools will be huge because the costs associated with the data expert hierarchy are great. Those costs go beyond just personnel. Communication between business users and data experts is time-consuming and expensive, and it lowers the quality and value of the analyses. Business users and data experts live in different worlds and have different vocabularies; misunderstandings are common. Misunderstandings are expensive, but what is worse, working through intermediaries slows the iterative process by orders of magnitude. The potential value of data lies in insights, and finding insight is an iterative process.

The history of business technology is one of progress propelled by the increasing democratization of tools. The PC itself is a prime example, making computing directly available to business users. The internet rode a wave of disintermediation and self-service to its current global saturation. In terms of the democratization of AI analytics, the best parallel is the PC spreadsheet, which made it possible for business people to create and tune their own quantitative models of business activities. Before the spreadsheet, creating those models required coding.

“Spreadsheets for AI,” one of which may well be a version of Microsoft’s Excel, will accelerate growth in analytics, data access, storage, cloud services and the Internet of Things. AI spreadsheets will not retard growth in the use of data experts; they serve a different market. Even with very good first versions, broad adoption will take years, so the acceleration of growth will not be sudden. Over the years, however, the ability of business users to directly analyze their data will contribute substantially to the revenue of IT vendors and to that of their customers.

 

CX opportunities in the era of AI

Artificial intelligence needs human design

Artificial intelligence (AI) technologies continue to progress, with vendors increasingly embedding machine learning capabilities into enterprise applications and consumers coming to expect a level of personalized, yet automated, interaction that only AI can deliver at scale. Discussions around the potential hazards of AI to brand reputations, personal data protection, constitutional freedoms and society at large have become commonplace, but this has not slowed the pace of technological advancement. While AI technology vendors continue to lead and engage in these discussions (especially when their own reputations and research investments are at risk), ultimately, organizations that incorporate AI tools into business decisions and automated processes will be responsible for the impacts of those technologies.

If the 2018 O’Reilly Artificial Intelligence Conference made anything clear, it was that as AI adoption grows, so does the technology’s complexity, particularly at the intersection points between humans and machines and between regulatory policy and technological innovation. This should sustain professional services opportunities for vendors that can stay on top of AI technology developments while maintaining a broader perspective on the impact of AI on clients’ business processes and HR strategies. Still, many questions remain unanswered, including how to manage security and governance over the massive autonomous systems that will be coming online in the next several years; whether the approach taken by the European Union with its General Data Protection Regulation (GDPR) will become the global standard; and what the long-term impact on human intelligence and skills will be as machines take over more tasks. It is unlikely these issues will be resolved by the 2019, or even 2020, O’Reilly Artificial Intelligence Conference, but vendors can start to address some of these questions with clients through consulting and solution design engagements tied to broader digital transformation initiatives.

Event overview

TBR attended business and technology learning content company O’Reilly Media’s third O’Reilly Artificial Intelligence Conference, a two-day event in New York centered on a variety of AI topics, including enterprise use cases, implementation, business and societal impacts, product design, and machine learning methodologies. The conference’s theme, “Putting AI to Work,” mirrored that of last year, but sessions and discussions reflected growing maturity in how enterprises and researchers approach, develop and apply AI technologies. Keynote speakers represented AI technology vendors such as Intel AI (the conference’s co-presenter, as announced last year), Google, IBM Watson, Microsoft (Nasdaq: MSFT), Amazon Web Services, SAS, Digitate and Uber, as well as research institutions such as MIT, Princeton and Carnegie Mellon. In addition to the tactical sessions on specific AI use cases for data scientists and software engineers that were abundant last year, 2018 introduced the AI Business Summit track tailored for executives, business leaders and strategists (and for TBR’s lead analyst covering professional services related to AI, analytics and digital transformation). TBR also interacted one-on-one with founders, product leads and marketing executives from AI-related startups such as Alegion, Kinetica, Clusterone and Dataiku throughout the conference.

ServiceNow Virtual Agent looks to bolster AI strategy

I see [ServiceNow] pivoting out of the IT department a bit, which has been an ongoing theme for them. They are moving toward business users, trying to tie them in closer to the broader base of enterprise users. Even [for] the requests that make it through to IT, the system points users back to the self-service resources. — Meaghan McGrath, Senior Analyst

Full Article

Azure IoT Edge tool set stirs AI into Microsoft’s cloud

You can have IoT without AI, and you can have AI without IoT, but there is complementarity of the two together — particularly if you’re talking about edge and AI. — Ezra Gottheil, Principal Analyst

Full Article

Laundering money and funding terrorism cannot withstand analytics and AI

Despite banks’ substantial investments in technology, people and processes to meet regulations, they currently lack effective and efficient systems for tackling financial crimes such as money laundering and terrorist financing. Regulators cannot keep pace with change, and the time and investment required to overhaul banks’ legacy systems are too great given the complexity of global organizations and the inevitable disruption to operations. But those three elements — technology, people and process — match EY’s strengths in technology consulting, especially when paired with deep financial services industry and risk and compliance expertise. EY continues to invest in and evolve its financial crime (FinCrime) practice as it listens to financial institutions’ demands for services that embed regulatory compliance expertise and technology innovation, offered at scale on an outcomes-based pricing model. EY’s FinCrime practice collaborates across the firm to combine legacy capabilities and emerging technologies, differentiating from competitors’ portfolios in the market and providing, in TBR’s current analysis, industry-leading offerings.

EY’s connected approach to disrupting financial crime: Technology disruption, industry collaboration and process innovation

Over the course of EY’s two-day Financial Crime Analyst Summit, the firm’s leaders and banking sector clients spoke with TBR about the challenges financial institutions face, including high operating costs, stifled revenue growth, and demands to undergo business transformation while maintaining compliance with evolving regulations. Many industries contend with the first two challenges, but the last one — transforming while complying — fits well with EY’s strengths: industry expertise, emerging tech capabilities, and a deep understanding of regulators in the U.S. and globally. In applying those strengths, EY’s financial crime practice relies on three pillars — technology disruption, industry collaboration and process innovation — in other words, meeting demand for services and solutions that are backed by regulatory credibility, infused with technology innovation and offered with tiered pricing, to successfully disrupt FinCrime.

Before examining the specific ways EY addresses FinCrime, it is worth giving extra attention to one key aspect of the financial services market as a whole: trust. In the consulting and technology spaces, trust has come to mean delivering on promises and securing data. In the banking space, with the additional weight of money and regulators, trust becomes the single most important factor in determining the extent of a provider-client relationship. With a heritage as a trusted auditor, a reputation for delivering consulting services, and a position between clients and regulators, EY has built up enough trust capital to take on industrywide challenges.