Trusted facilitator: Atos discusses its place in the blockchain ecosystem

Atos’ global lead for blockchain, Klaus Ottradovetz, confirmed TBR’s assessment that as more companies move beyond evaluation and proof of concept to implementation of blockchain solutions, the broader IT ecosystem will see the benefits and acceleration of a network effect. If IoT marries analytics to connectivity, blockchain binds transactional partners in an ever-expanding series of interwoven networks. Within those blockchain ecosystems, collaborative models in which multiple stakeholders share costs and benefits provide the backbone for new value-generating services built on the advantages blockchain delivers. To illustrate his point, Ottradovetz described a crop insurance solution (which recently won the Technology Excellence in Blockchain award at NASSCOM) as an ideal use case for blockchain, one in which all of the stakeholders, including insurers, reinsurers, farmers and content providers, benefit. The solution uses smart contract functionality and mobile payment to create a cost-effective and fully automated claim settlement process, which also strengthens trust between insurers and farmers. Insurance companies use the application to collate satellite weather data and measure conditions such as rainfall and drought, which then determine the compensation paid to farmers for crop losses.
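Atos has not published the solution’s internals, but the parametric settlement logic a smart contract of this kind encodes can be sketched in a few lines. All names, thresholds and amounts below are hypothetical, and the sketch is illustrative only, not the Atos implementation.

```python
from dataclasses import dataclass

@dataclass
class Policy:
    farmer: str
    payout: float               # fixed sum paid if the trigger is met
    rainfall_trigger_mm: float  # seasonal rainfall below this triggers payout

def settle_claims(policies, observed_rainfall_mm):
    """Settle every policy from a single oracle-supplied rainfall reading.

    No claim is filed and no adjuster is involved: the weather data alone
    decides who is paid, which is what makes the process fully automated
    and cost-effective.
    """
    return {
        p.farmer: (p.payout if observed_rainfall_mm < p.rainfall_trigger_mm else 0.0)
        for p in policies
    }

policies = [
    Policy("farm_a", payout=1000.0, rainfall_trigger_mm=200.0),
    Policy("farm_b", payout=1000.0, rainfall_trigger_mm=150.0),
]
print(settle_claims(policies, observed_rainfall_mm=180.0))  # farm_a is paid; farm_b is not
```

On an actual blockchain, this function would run as a smart contract, with the satellite weather feed delivered by an oracle and the payout executed as a mobile payment.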

Atos adds value to blockchains by working with all stakeholders   

Ottradovetz noted, and TBR agrees, that our report did not include many details on Atos’ blockchain practice, which resides within the Business and Platform Solutions division and leads industry-specialized go-to-market activities and service delivery around consulting, integrating and operating blockchains. Atos taps into the Infrastructure and Data Management division for hosting and running nodes in blockchain networks and also integrates offerings from its Big Data and Cybersecurity division, such as cybersecurity services and IP-based products like the Evidian Identity and Access Management software suite and the Trustway Hardware Security Module. Ottradovetz said Atos has focused on working with its existing clients, including in the government sector, and that Atos acts as a “services provider” and is “not in a position of taking value from others within the blockchain ecosystem, not taking value out of the product chain.” In a characterization that TBR believes accurately describes Atos’ market position, Ottradovetz said the company serves as a “trusted” functionary, “the independent advisor to each stakeholder…we help [clients] build a cooperative blockchain.”

Quantum.Tech: Brilliant minds collaborate on current tech challenges while also separating hype from reality

The variety of organizations involved in quantum computing demonstrates the vast reach this technology will have once commercialized at scale

At Quantum.Tech, attendees heard from a diverse group of speakers from established companies directly involved in the quantum computing space, such as IBM (NYSE: IBM) and Microsoft (Nasdaq: MSFT), as well as startups, such as Zapata Computing and IQM Quantum Computers. In addition to encouraging great minds to work together, Quantum.Tech did an excellent job of demonstrating the sheer magnitude of reach that quantum technologies will have once unleashed. There were speakers from multiple government agencies, both within and outside the U.S., as well as from healthcare, banking, venture capital, telecom and security companies; even so, these organizations represent only a small sample of the industries quantum technologies will impact. Time and again, speakers noted that commercialization of a fault-tolerant quantum computer is expected within the next 10 years, indicating that quantum economic advantage, and with it large-scale adoption, is fast approaching.

Quantum sensing will change human activity and has changed centuries-old standards definitions

Atomic clocks, quantum sensors and lidar (light detection and ranging) were among the small-scale quantum technologies discussed, alongside the general-purpose quantum computing, networking and security aspects of mainstream computing use cases. Quantum-enabled lidar appears reminiscent of early radar technology, first brought to market during World War II and the great-great-grandparent of current GPS technology. Quantum-enabled lidar will likely follow a traditional adoption path: military applications first, then commercial aircraft, and ultimately automobiles, drones and other modes of human transportation.

During a discussion at Quantum.Tech, RSK, a civil engineering firm in the U.K., indicated it already uses quantum sensing technology for site assessments. These mid-six-figure devices pay for themselves in six months to a year, and in at least one case, a $10,000 to $15,000 service fee gave the client information that previously could have been uncovered only by putting $5 million at risk on the project. Later in the talk, the device was shown being towed behind an automobile, mapping the ground below the road; in this fashion, municipalities can pinpoint areas where water or sewer lines are leaking, for example.

Quantum.Tech spanned two days, with three discrete presentation tracks for over 400 attendees. There was a smattering of references to existing commercial products amid discussions about persistent challenges being researched around the globe. Academics, government agencies, technology companies, venture capitalists (VCs) and enterprise businesses all took turns discussing the challenges and anticipated benefits of quantum computing. The general tenor of the event was one of tempered optimism: advancements continue, challenges persist, and the pace of scientific breakthroughs can be uneven. Speakers voiced concern about overhyping the technology while real scientific obstacles, many requiring months if not years to solve, remain. Hyping the expected pace of discovery could trigger a “quantum winter,” with VCs pulling back funding needed for long-term development as hype leads to unrealistic expectations of rapidly tangible results.

Canonical doubles down on multicloud in defense of its strategic position against Red Hat and VMware

TBR perspective

At Canonical’s 2019 Analyst Day, the company displayed a compelling business model and a clear road map toward achieving its desired business outcomes. However, TBR believes the long strategic strides Canonical has taken over the past year have only propelled the company so far, given the increasingly competitive field that Red Hat and VMware (NYSE: VMW) are creating. It has been just over two months since IBM (NYSE: IBM) completed its purchase of Red Hat, so it was no surprise that Canonical emphasized how IBM’s $34 billion purchase has shifted the competitive landscape. Even less surprising was Canonical’s dive into specific areas, including public cloud and data center, where it expects to outmaneuver Red Hat and VMware, its two biggest competitors and the vendors to which Canonical is most often compared in the market.

Navigating a competitive landscape

While Canonical boasts that its multicloud strategy is unique, the vendor’s approach aligns with that of the major public cloud providers, as Canonical aims to run Kubernetes on its Linux-based operating system (OS), Ubuntu, to solidify its place at the interoperability layer. In his opening remarks, Canonical CEO Mark Shuttleworth asserted that open infrastructure is just beginning and that the pending explosion of open source will occur at the applications layer; Canonical further claims PaaS will not account for more than 10% of applications. TBR believes Shuttleworth’s comments take direct aim at VMware Cloud Foundry and the fact that VMware does not own any applications. However, while Canonical boasts that its approach goes beyond infrastructure with its Linux app store, VMware is close behind given its recent acquisition of Bitnami, which specializes in application packaging and supports VMware’s application ecosystem strategy.

As one of the few remaining OpenStack providers, Canonical has positioned its managed OpenStack offering, BootStack, to remain very much part of the company’s value proposition even as vendors such as IBM, Rackspace and Mirantis de-emphasize the technology. As part of its private cloud strategy, Canonical maintains support for OpenStack private clouds on Ubuntu, while in public cloud Canonical places Ubuntu on the platforms of partners such as Amazon Web Services (AWS; Nasdaq: AMZN), Microsoft (Nasdaq: MSFT), Google (Nasdaq: GOOGL), Oracle (NYSE: ORCL), Rackspace and IBM. TBR expects IBM will shift its current Ubuntu-based workloads to Red Hat Enterprise Linux (RHEL) now that its purchase of Red Hat is finalized, costing Canonical that business. Presentation materials also highlighted opportunity around fully managed infrastructure services. TBR notes the managed services opportunity around OpenStack was far more prominent 10 years ago; the opportunity now lies in creating a managed services portfolio that emphasizes higher-complexity workloads as well as IoT. Canonical noted it is not attempting to build a managed services empire, yet further development in this area presents an uphill battle for the company, especially as IaaS leaders AWS, Microsoft and Google make it easier for enterprises to navigate the challenges of a hybrid environment, which in many cases OpenStack cannot serve.

At Canonical’s 2019 Analyst Day, CEO Mark Shuttleworth and other company executives met with industry analysts to highlight the company’s revamped business strategy, one that emphasizes four competitive battlegrounds: public cloud, data center, edge cluster and IoT. The event featured presentations from Shuttleworth, Finance Director Seb Butter and VP of Public Cloud Christian Reis, among others, and incorporated presentations from key partners, including Atos VP of Cloud Engineering Bob Seddigh and BT Group Chief Architect Neil McRae.

EY’s Managed Services: A co-sourcing partner for value creation

EY’s approach to managed services is a boardroom rather than operational discussion  

While the nuances of the definition of managed services vary across vendors and buyers, the common theme of supporting organizational functions resonates with all. As a result, there is a fair amount of confusion and sometimes little to no differentiation among suppliers that are simply trying to expand client mindshare. While the advent of AI, cognitive and similar technologies, along with the firm’s desire to participate in the “as a Service” space, has fueled EY’s efforts to build its information systems management capabilities, the firm’s position in the managed services space is largely determined by its role as a trusted tax partner. Buyers have engaged EY for years around its tax expertise; outsourcing and, in some cases, co-sourcing tax technology and tax operations are somewhat newer areas of opportunity for the firm. Delivered through EY’s Tax and Finance Operate framework, the firm’s relationship with the CFO buyer allows it to capture strategic tax activities typically managed by in-house tax professionals, including financial crime, tax policy administration and cyber.

EY has seen its fair share of success in the space, most recently signing a long-term managed services agreement with Nokia (NYSE: NOK), which followed a similar 2018 deal with a global insurance provider for managing tax and compliance. As part of the Nokia deal, EY will provide tax, finance, data and technology managed services supporting the mobile provider across 127 countries, leveraging the EY Global Tax Platform and global delivery centers.

EY’s Managed Services practice is trying to bridge the gap between tax services and IT in part by adding technology to a business framework, as it did with the inclusion of Microsoft Azure in the EY Global Tax Platform, and by platformizing micro-processes to support corporate applications, including corporate income tax, provisioning and value-added tax; success here can then drive other opportunities, including in advisory services. As digital permeates all services and the regulatory environment across the globe continues to evolve, EY’s investments position it to carve out a niche aligned with its strengths rather than build one inorganically, as rivals have done by investing heavily in areas such as marketing operations to better appeal to the CMO buyer.

HPE’s CMS unit reemerges as a software-centric contender in the new network architecture

TBR perspective  

TBR believes HPE’s CMS unit has the potential to become a significant disruptor in the telecom space. CMS had been marginalized in prior years as Hewlett Packard Co. split into HP Inc. and HPE and as HPE executed divestitures and restructurings and developed a new strategy, but the unit has received new life after obtaining corporate sponsorship from HPE’s relatively new CEO, Antonio Neri, and CFO, Tarek Robbiati, formerly the CFO at Sprint (NYSE: S). CMS leadership reports directly to Robbiati. With the C-Suite and board of directors providing corporate support, the telecom vertical will become a key growth pillar for HPE going forward, given the technology and business model transformations being prompted by 5G, edge computing, AI and automation.

The CMS unit represents only a small percentage of HPE’s total revenue, but the unit is a key gateway into emerging opportunities that are impacting the telecom vertical. CMS is reestablishing itself in the market as a growth engine for HPE corporate and is receiving the funding and support required to drive its portfolio, particularly in the management and orchestration (MANO), 5G core, and digital identity spaces. TBR believes CMS is positioned to be a key vendor in the new network architecture, which will be microservices-based, cloud-native and distributed.

CMS faces some notable hurdles, including the negative perception of its capabilities that followed the bad press it received as a supplier and the prime systems integrator for Telefonica’s (NYSE: TEF) software-defined transformation initiative back in 2015; HPE was eventually replaced on the project by several other suppliers. TBR believes the lingering effects of this situation have hindered CMS’ growth over the past few years but notes that CMS has put the incident in its rearview mirror and is making significant headway.

CMS’ mindshare and credibility are moving in a positive direction, and the unit is gaining significant traction in CSP accounts, particularly for its Service Orchestrator and NFV Director MANO offerings. CMS has an impressive roster of CSP customers and has played a behind-the-scenes role in several significant network transformation projects, including those of SK Telecom (NYSE: SKM) and Vodafone (Nasdaq: VOD). These reference wins will be critical to positioning HPE as a contender in new RFPs, particularly in disruptive areas such as MANO and 5G core.

CMS is challenged by OSS domain incumbents such as Amdocs (Nasdaq: DOX) and Ericsson (Nasdaq: ERIC), which CSPs will be reluctant to move on from due to possible migration and integration issues. This hesitancy could also prevent most CSPs from altering their procurement models to adopt more modular solutions, as webscales have done, a trend CMS’ portfolio is increasingly aligned to. The most difficult challenge may be helping CSPs become more than connectivity providers, or “dumb pipes,” in a 5G world. Vendors will jockey to deliver this dream, but HPE may be better served focusing on solutions that enable CSPs to run the most efficient, cost-effective networks possible.

HPE (NYSE: HPE) hosted its first-ever North America Communications and Media Solutions (CMS) Analyst Summit in Boston, bringing top leadership from the company’s CMS business, who delved into CMS’ strategy and portfolio as well as key customer wins and success stories. Following the executive presentations, which were interactive, with industry analysts able to pose questions to presenters, analysts received one-on-one time with CMS VP and General Manager Phil Mottram, CMS Chief Technology Officer Jeff Edlund, CMS VP of R&D and Delivery Mark Colaluca, and CMS VP of Product Strategy and Lifecycle Management Domenico Convertino.

With CMS recently emerging from the shadows of HPE’s Pointnext business and retooling its portfolio to align with demand from communications service providers (CSPs), executives were upbeat about CMS’ ability to take market share and compete with highly entrenched incumbent vendors and startups alike.

Comcast Business targets strategic acquisitions and international expansion to fuel long-term growth

TBR perspective  

Comcast Business remains in an enviable position and is an outlier in the telecom industry. The cable operator’s business unit continues to post double-digit revenue growth and is taking market share from a range of competitors in the U.S. The company’s core value proposition is its powerful DOCSIS 3.1-based, hybrid fiber-coax (HFC) fixed access platform, which provides significantly better connectivity performance for the price paid than legacy technologies such as MPLS and DSL. Comcast Business sees at least another $60 billion in market share available to take from less competitively positioned incumbent telecom operators in its domestic market, a sizable addressable opportunity that will continue to feed the Comcast Business machine for at least the next few years.

TBR believes Comcast Business also has significant opportunity to sustain long-term revenue growth through international expansion and its evolving sales and marketing strategies. The company is pivoting from selling its products horizontally across verticals to a solution-centric, verticalized approach, evident in the acquisition of Deep Blue Communications, which specializes in the hospitality, retail and entertainment industries, as well as in the company’s new product offerings and design principles. Comcast Business is also expanding outside the U.S., building presence via acquisitions: in Canada via iTel, in the Caribbean and Mexico via Deep Blue Communications, and in Europe via Sky. Over time, TBR believes this growing international footprint will help Comcast Business win more of the Fortune 1000 companies that have sites in multiple countries and require global services. In addition, Comcast Business’ acquisition of BluVector, an AI-powered cybersecurity technology company, points to another trend at the company: building out robust security capabilities.

The 2019 Comcast Business Analyst Conference highlighted the company’s business progress and financial performance and detailed initiatives that will spur long-term growth including new acquisitions and portfolio developments as well as Comcast Business’ evolving go-to-market strategies. The event included a State of the Business update by Comcast Business President Bill Stemper as well as presentations focused on areas including the company’s recent Deep Blue Communications acquisition, the ActiveCore SDN platform, network security and Comcast Business’ fast-growing enterprise division.

IBM z15 brings enterprise-grade, automated and compliant data management to systems of engagement

IBM expects prominent mention in Chapter 2 of the cloud

IBM asserts, with some legitimacy, that incumbents have an advantage when it comes to disrupting markets. They possess account knowledge, working capital and, more importantly, data in their core systems and databases. To build an ecosystem, these enterprises have to unlock the power of that data and distribute it beyond the core platform and into a hybrid multicloud environment, which IBM envisions as Chapter 2 of the cloud.

IBM asserts Chapter 1 of the cloud consisted of experimentation and then scaling of new systems of engagement applications while mission-critical systems of record and their multiple dependencies within enterprises remained firmly ensconced behind the firewall for increased data protection. Chapter 2, in IBM’s thinking, will enable these existing investments to participate in hybrid cloud to leverage business data in inside-out fashion alongside emerging mission-critical systems of engagement.

In historical context, if Chapter 1 of the cloud correlates to IBM’s ill-fated 9370 hardware launch, then Chapter 2 should be on par with IBM’s successful launch of the AS/400 minicomputer, which lives on today in midsize and large enterprises and is ripe for modernization and migration to the cloud.

At the very least, that is the bet IBM placed with its $34 billion Red Hat acquisition, which is still being thoughtfully rationalized against the venerable Z platform as midsize and large enterprise consumption patterns merge in a hybrid multicloud world, enabled by the growing popularity of multi-enterprise business networks underpinned by IoT, blockchain and AI.

Success in the data economy rests on ecosystems with low-risk data velocity

New with the z15 is what IBM calls Data Privacy Passports, which enable policy-based data management. Passports act as a data proxy. Queries go to the Passports controller rather than to the database itself. The permissions stay with any copies that are made, and the permissioned access can be modified or revoked at any time. IBM highlights the success of Passports in complying with General Data Protection Regulation (GDPR) requirements, allowing enterprises to control the data even without physical custody of the data.
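IBM has not published Passports’ internals, but the proxy pattern described above (queries routed through a policy controller rather than straight to the database, with revocable, field-level grants) can be sketched as follows. The class and method names are illustrative only, not IBM’s API.

```python
import uuid

class PassportController:
    """Illustrative data-access proxy: queries go through the controller,
    never straight to the database, and any grant can be revoked at any
    time. A sketch of the pattern only, not IBM's implementation."""

    def __init__(self, database):
        self._db = database   # the controller holds the only database handle
        self._grants = {}     # grant_id -> (user, allowed_fields)

    def grant(self, user, allowed_fields):
        grant_id = str(uuid.uuid4())
        self._grants[grant_id] = (user, set(allowed_fields))
        return grant_id

    def revoke(self, grant_id):
        # Copies of data obtained via this grant effectively go dark
        self._grants.pop(grant_id, None)

    def query(self, grant_id, record_key):
        if grant_id not in self._grants:
            raise PermissionError("grant revoked or unknown")
        _, allowed = self._grants[grant_id]
        record = self._db[record_key]
        # Field-level policy: only permitted fields leave the controller
        return {k: v for k, v in record.items() if k in allowed}

db = {"cust1": {"name": "Ada", "ssn": "000-11-2222", "balance": 42}}
controller = PassportController(db)
grant_id = controller.grant("analyst", allowed_fields=["name", "balance"])
print(controller.query(grant_id, "cust1"))  # the ssn field never leaves the controller
controller.revoke(grant_id)                 # subsequent queries on this grant fail
```

The design point this illustrates is that policy enforcement lives with the data owner, not the data consumer, which is what lets an enterprise control data even without physical custody of every copy.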

The net benefit of adopting z15 flows from future-proofed risk mitigation. Enterprises and policymakers continue to evolve data privacy regulations. A software shell for policy-based data management that can quickly convert domain knowledge into automated procedures can ensure continued compliance in a rapidly changing regulatory environment around data utilization. The lower risk likewise accelerates data velocity beyond the core enterprise platforms in a prudent and trusted manner.

IBM’s (NYSE: IBM) development efforts position the latest refresh of its flagship Z offering, the z15, to become the premier enterprise data controller in what the company considers Chapter 2 of the cloud. Namely, the z15 launch continues the engineering, development and customer engagement IBM has sustained around the platform for 50 years, now applied to the rapidly developing hybrid multicloud environment. The product is purpose-built for enterprise-grade applications and has evolved through new iterations to meet changing needs and use cases. If z14 brought data protection to mission-critical systems of record by delivering pervasive encryption, then z15 delivers enterprise-grade data privacy provisioning via software control planes to protect data when it leaves the Z platform for use among emerging cloud-native, mission-critical systems of engagement.

Voice, business and device design: For business, smart speakers need a button

The expanding scope of conversational user interfaces

There are many applications for conversational interfaces in business, for both customer and employee users. Input devices include not only smart speakers but also phones, PCs and vehicles. The conversational interface also works without voice; for someone at a keyboard, typing the command is easier, faster and less disruptive than speaking it.

Conversational interfaces offer benefits far beyond hands-free operation. Users need not remember specific applications and commands; instead, the interface can suggest a possible interpretation of what the user has asked, or say it does not understand so the user can rephrase and try again. This reduces the burden on the user to learn how to issue a command and thereby expands the number of commands available. For some tasks, such as making an appointment, speaking or typing a command in natural language is far easier than any point-and-click or touch-and-swipe interface.

Additionally, when there is a wide range of possibilities from which to select, the conversational interface excels. The user need not remember the exact name of an item; naming it as the user remembers it, then correcting as necessary, is far easier than any of the conventional user interface tools for choosing among many possibilities, including hierarchical menus, scrolling lists and incremental search. Items include names of people as well as names of songs, television shows or podcasts for consumer applications, along with products, regions or variables for business users. A well-implemented conversational user interface can make it very easy for businesspeople to query a complex database, and they can do it from a smartphone. Widespread adoption of conversational interfaces in business will drive up the value and utilization of data, leading to more applications that create potentially valuable data.
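As a minimal illustration of that last point, matching a half-remembered report name against a large catalog can be handled with fuzzy string matching; the catalog and query below are invented for the example.

```python
import difflib

CATALOG = [
    "Quarterly Revenue by Region",
    "Quarterly Revenue by Product",
    "Annual Churn by Segment",
]

def resolve(utterance, choices, n=3, cutoff=0.3):
    """Return the catalog items closest to what the user said or typed.

    The user need not recall the exact name; near matches are offered so
    the user can confirm or correct, as a conversational interface would.
    """
    lowered = {c.lower(): c for c in choices}  # case-insensitive comparison
    hits = difflib.get_close_matches(utterance.lower(), list(lowered),
                                     n=n, cutoff=cutoff)
    return [lowered[h] for h in hits]

print(resolve("quarterly revenue region", CATALOG))
```

A production system would layer intent parsing and a confirmation turn on top, but this rank-then-correct loop is the core of the interaction the paragraph describes.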

As conversational user interfaces evolve, they are becoming more powerful. Currently, the popular consumer voice interfaces process user input on servers, so each conversational interchange requires a round trip between the user’s device and the cloud. As devices become more powerful and conversational software improves, there will be a migration of voice processing from the cloud to the edge. Google (Nasdaq: GOOGL) has announced that the next generation of Google Assistant will process requests on the Android smartphone, starting with the company’s Pixel phone. Google claims this change will improve responsiveness by a factor of 10, and will allow a richer vocabulary of commands, more complex back-and-forth dialogues between the user and the device, and greater adaptation of the interface to the needs of each user. TBR believes this local processing will leverage AI capabilities built into the devices and, in the case of the Google Pixel, will use Google’s Edge TPU, the mobile version of the company’s server-oriented Tensor Processing Unit (TPU). With increased power and better security and privacy, this migration of processing to the edge will drive growth of business applications.

Streamline and extend: IBM’s play for what it calls ‘hybrid multicloud’ or ‘Chapter 2 of the Cloud’

IBM will use OpenShift to bring a consistent cloud value proposition, remaining agnostic toward delivery method, location or cloud provider

At a 2015 analyst day, Red Hat’s CEO stated that Red Hat aimed to do to the PaaS layer with OpenShift what it had done to the enterprise operating system layer with RHEL. That strategy was thoroughly validated by IBM’s acquisition of Red Hat. At the core of the IBM Cloud Summit were discussions of how OpenShift is the only platform layer capable of running on multiple clouds, in what IBM describes as hybrid multicloud. In IBM’s definition, hybrid denotes the ability to run applications on premises, in private clouds, in public clouds and at the edge; multicloud denotes the ability to mix and match different cloud providers across that hybrid continuum. Many vendors can deliver a single-brand, or monocloud, version of hybrid that amounts to a new form of vendor lock-in. IBM asserts OpenShift is the only cloud operating system and development layer that enables customers to code once and deploy on any form factor from any technology vendor.

Stefanie Chiras, VP and general manager of the RHEL Business Unit, articulated the central Red Hat value proposition: an enterprise software company with an open-source development model. Red Hat monitors myriad open-source projects, filters the most powerful innovations from these upstream contributions into the RHEL operating system, and hardens them into enterprise-grade products with the security and DevOps deployment features large enterprises need.

This hardened curation of open-source community projects has seen Kubernetes container management rapidly emerge as a de facto deployment standard, which has Linux as the underpinning operating system. OpenShift stitches RHEL and Kubernetes together with additional development services, and IBM will pivot all its software into cloud-native deployment models resting on top of this foundation to enable the software to run anywhere its customers require.

IBM bets on Cloud Paks to simplify application modernization migration to the cloud in what it calls the ‘second chapter’

IBM’s point of view is that the cloud is entering its second chapter. In the first chapter, which spanned the last 15 years, enterprises moved some development and some customer-facing applications to the cloud, but the deep back-office systems of record remained on premises for a variety of reasons, including the technical debt of custom applications as well as concerns around security and compliance.

Cloud Paks are the manifestation of three to five years of ongoing work to refactor IBM’s monolithic middleware applications into containerized, cloud-ready services. From a packaging standpoint, the Paks simplify the sprawling IBM middleware portfolio into pre-integrated solutions that address the most pressing challenges of cloud migration and operations. All Cloud Paks sit atop RHEL and OpenShift, with IBM promising “single button push deployment” when running applications on the IBM Cloud. The RHEL and OpenShift underpinning maintains infrastructure independence and lets enterprise customers choose any vendor cloud or underlying on-premises infrastructure to run these applications anywhere.

The IBM Cloud Summit 2019 combined several main IBM initiatives into a full day of executive presentations for the analyst community. IBM’s presentations on strategy and innovation centered on the opportunity to migrate the 80% of workloads still run via traditional IT delivery methods to the cloud. IBM’s legacy strengths, combined with more recent investments, make the company well suited to help customers address this remaining 80% of IT workloads, most of which are mission critical. IBM’s recent landmark investment was the purchase of Red Hat, and the company laid out in even greater detail how that will benefit customers. A series of presentations followed, addressing the untapped market opportunity as well as the middleware and professional services IBM can bring to bear on customers’ application migration and modernization efforts, resting on the foundational elements of Red Hat Enterprise Linux (RHEL), Red Hat OpenShift and Kubernetes container management clusters.

India-centric professional services rivals compete to prove themselves as front-runner in European market

Vendors look to Europe for new revenue pipelines, but some are falling short

India-centric IT services vendors are looking to do more in Europe, utilizing a few strategies to get there, including pursuing acquisitions, developing new offerings and building on-the-ground workforces, all of which help them diversify revenue streams geographically. TBR believes this push into Europe is a crucial move for India-centric vendors, not only to remain competitive with one another and advance their bottom lines but also to reap the benefits the European market has to offer, such as new investment opportunities and a larger talent pool for recruiting.


The five leading India-centric professional services vendors — Cognizant (Nasdaq: CTSH), HCL Technologies (HCLT), Infosys (NYSE: INFY), Tata Consultancy Services (TCS) and Wipro (NYSE: WIT) — explored new pipelines for expanding their presence in the mature and growing European market. Infosys worked to close the gap on HCLT in 1Q19, while TCS, Wipro IT Services (Wipro ITS) and Cognizant are feeling the pressure to adapt to stay competitive.