Smart city solutions have to think outside the trash bin

The “Connecting Your Business to the Smart Cities We All Live In” panel during PTC’s LiveWorx event included ideas consistent with TBR’s previous views on smart cities. One of the most interesting speakers was Nigel Jacob, the co-founder of the Mayor’s Office of New Urban Mechanics, an R&D organization within Boston’s City Hall. Jacob gave a presentation on the “Boston Smart City Playbook,” compiled by his organization, which lists the following rules for vendor engagement:

  1. Stop sending sales people
  2. Solve real problems for real people
  3. Don’t worship efficiency
  4. Better decisions, not (just) better data
  5. Platforms make us go ¯\_(ツ)_/¯
  6. Toward a “public” privacy policy

All of these points align well with TBR’s view of how vendors need to improve their go-to-market strategy, but a few stood out. “Stop sending sales people” translates well both inside and outside smart city applications. The Internet of Things (IoT) is a complex technology, and it is difficult for end users to really understand what IoT can do for them. Public sector officials, just like the CEO, CIO or CTO of any private organization, do not want to listen to a sales pitch about why a technology is great. Instead, in the example of Boston, decision makers want vendor engineers or consultants on-site to explain why IoT is suited to their city’s particular challenges, how it can be implemented and how it has worked for others, as well as to provide concrete evidence of what Boston can expect to gain in the long run. Only then will a vendor’s solution be taken seriously.

“Better decisions, not (just) better data” is a point TBR believes vendors should take to heart. Data is a building block to insight, but piles of data with no feasible way to turn them into actionable insight are little more useful than no data at all. Customers seek insight through data, but if there is no easy path to achieving that insight, the data’s value is significantly reduced. Customers believe that to get value out of IoT, they need to bolster their IT, operational technology (OT) and data scientist staff. TBR believes incorporating artificial intelligence and improving user interfaces to simplify IoT products is a path to unlocking value for business decision makers, enabling them to make better decisions without incurring huge selling, general and administrative expenses.

“Platforms make us go ¯\_(ツ)_/¯” also parallels customer concerns recorded by TBR. Platforms are exciting to techies, but they do not mean much to customers. Instead, they generally raise fears of platform lock-in, in which customers are unable to access outside technologies or risk being stranded on a dying standard. Also, the platform level is often too abstract for customers to understand how IoT will benefit them. Vendors must continue to tout interoperability and focus on use cases or small deployments. Small deployments that solve immediate problems, not technical and platform-based discussions, will be vendors’ gateways to customers. After a few successful small projects, vendors can introduce customers to the grander view centered on a wide platform.

Leila Dillon, Bigbelly’s vice president of North American Distribution and Global Marketing and another presenter during the panel, explained how Bigbelly solved multiple problems for individual cities by thinking outside the box. The company sells solar-powered waste systems, mostly bins, that automatically compact trash and alert waste management when they need to be picked up. This has given cities substantially greater efficiency, not only because automatic compacting reduces waste buildup but also because the alert system eliminates the wasted time of trucks running routes to check every bin rather than only the full ones. Additionally, Bigbelly observed that by thinking creatively, it could further cities’ smart city goals. It started working with cities to equip waste bins with small-cell technology to enable ubiquitous citizen connectivity. In other cases, the company added cameras or sensors to track foot or street traffic to help cities understand congestion. Bigbelly is a great example of a company helping to solve a pointed problem, in this case making waste collection more efficient, and then working with cities to build additional IoT use cases one success at a time.
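
To make the mechanism concrete, here is a minimal sketch of the kind of threshold-based logic a connected bin fleet might use to decide which bins go on a pickup route. The field names, threshold and route filter are hypothetical illustrations, not Bigbelly’s actual implementation.

```python
# A minimal sketch of threshold-based alerting for a connected waste bin fleet:
# only bins reporting a fill level above a cutoff (or an unhealthy battery) are
# added to the day's pickup route. All names and values are hypothetical.
from dataclasses import dataclass
from typing import List

FULL_THRESHOLD = 0.8  # assumed fraction of capacity that triggers a pickup


@dataclass
class BinReading:
    bin_id: str
    fill_level: float   # 0.0 (empty) to 1.0 (full), reported by the bin's sensor
    battery_ok: bool    # health flag for the solar-charged battery


def bins_needing_pickup(readings: List[BinReading]) -> List[str]:
    """Return IDs of bins that should be added to today's collection route."""
    return [r.bin_id for r in readings
            if r.fill_level >= FULL_THRESHOLD or not r.battery_ok]


# Example: only the full (or unhealthy) bins make it onto the route,
# so trucks skip the rest instead of checking every bin.
readings = [
    BinReading("park-01", 0.35, True),
    BinReading("main-st-04", 0.92, True),
    BinReading("wharf-02", 0.65, False),
]
print(bins_needing_pickup(readings))  # ['main-st-04', 'wharf-02']
```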

‘Popcorn market’ and ‘shrink-wrapped’ IoT: TBR gets creative with industry terms

Observers of emerging tech trends often seek the “hockey stick” moment, or that period when the market takes off following an explosion of activity. However, as TBR Principal Analyst Ezra Gottheil explains in his special report, ‘Shrink-wrapped’ IoT will drive accelerating growth, such an explosion of activity, or huge moment of growth, will likely never occur in the overall commercial IoT market. Gottheil writes:

Each IoT [Internet of Things] solution comes to market at a different time, meaning that as more packaged solutions become available and as some experience rapid growth, the total growth accelerates. The IoT market has been described as a “popcorn” market, in which each submarket “pops” at its own pace — some smaller markets grow explosively, but the total market (the “pot of popcorn”) expands more uniformly.

A popcorn market leads to slowly accelerating overall growth, generating frustration for companies that had anticipated rapid adoption. This is especially true in the IoT market for horizontal IT companies such as Hewlett Packard Enterprise (HPE) and Dell EMC, which find themselves selling into new constituencies, including product development, operational technology (OT) and data science organizations, instead of traditional IT departments. Gottheil notes that for vendors seeking to capitalize on IoT, the key to accelerating growth is developing packaged “off the shelf,” or “shrink-wrapped,” IoT solutions. The increased availability of IoT solutions targeting specific use cases and business processes in industry subverticals will be key to generating IoT-driven vendor revenue for the foreseeable future.
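
The metaphor is easy to make concrete: model each submarket as its own adoption curve that “pops” at a different time, and the sum of those curves accelerates gradually instead of producing a single hockey-stick moment. The sketch below uses entirely hypothetical submarkets and logistic curves to illustrate the shape of that aggregate.

```python
# Hypothetical illustration of a "popcorn" market: several submarkets, each
# following its own logistic adoption curve, "pop" at different times, yet the
# aggregate grows comparatively smoothly. All figures are invented.
import math


def logistic(year, start, peak_revenue, ramp=1.5):
    """Annual revenue of one submarket that begins to 'pop' around `start`."""
    return peak_revenue / (1 + math.exp(-ramp * (year - start)))


# (start year, eventual annual revenue in $B) for a few made-up submarkets
submarkets = [(2016, 2.0), (2018, 3.5), (2020, 1.5), (2022, 4.0)]

for year in range(2015, 2026):
    total = sum(logistic(year, start, peak) for start, peak in submarkets)
    print(f"{year}: {total:5.1f}  {'#' * int(total)}")
```

Each individual submarket roughly doubles in the two years around its start, but the printed totals climb steadily rather than spiking, which is the point of the metaphor.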

Your digital transformation center needs leadership

Start with a new space, furnish it with funky chairs, nontraditional work spaces and all the latest technologies. Recruit creative talent, mixed with some data scientists and wonder-tech folks, plus seasoned strategists. Bring in current clients and consult on digital transformation.

As companies implement this playbook, a couple of common themes and challenges are emerging, mostly around client selection, talent management and technology partner cooperation. (Look for future blog posts on all three of these.) We had the pleasure of meeting many of the leaders at these new digital transformation centers (in Miami, Dublin, Frankfurt, Dallas, New York City and more), and I noticed common traits among the people charged with running these new places: passionate, invested, visionary. Some centers took a kind of “buddy cop” approach, pairing a creative with an executioner (in a good way, for both). Some bolted long-standing capabilities onto an acquisition. The real kicker: these centers need nonstandard leaders, even as the larger firm, the same board that just invested $20 million in a new space and new talent, wants to ensure the investment pays off and almost always puts a trusted, longtime company professional in charge. And that makes leadership more critical than ever.

The best we’ve met (a highly subjective and personal assessment) echoed lessons I learned during my brief time in the U.S. Army and my long exposure to U.S. military culture: train everyone, especially the leaders, and train them for their next job; promote them when they’re ready and support them with more training as their responsibilities evolve. One center leader described to me how her company invested in her management skills, ensuring she could handle the diverse set of backgrounds, skills, expectations, and corporate cultural mindsets she would be leading at the new center. Longtime professionals who grew up within a firm might be able to manage teams mixed with experienced and new hires. But leading such a team requires skills not typically gained from serving only in one organization or growing professionally mostly through similar roles.

As much as I’ve enjoyed digging deeper into the substance behind the hype of these centers — the funky chairs and bleeding-edge tech and clients taking journeys to digital transformations — we still want to understand the business case, the strategies and the metrics that determine whether these substantial investments of money and brand are beginning to pay off. From what we can see to date, success still relies on what it always has: leadership and teamwork. Companies recognizing this lead the pack right now, especially as that pack becomes crowded with cloud, network and legacy IT vendors all looking to play in the digital transformation space.

 

Will digital transformation be the catalyst for adoption of new outcome-based pricing models?

Every day I find myself reading about new developments in business-to-consumer (B2C) pricing.

These developments highlight the growing momentum behind dynamic, value-based and outcome-based pricing models, a movement driven by companies’ desire to provide personalized customer experiences at scale.

While this push has been most publicized and noteworthy in the B2C world, driven by the likes of Uber, Netflix and MoviePass, it also consistently permeates the complex business-to-business (B2B) IT products and services world that we focus on. “How do we shift from a cost-plus to value-based pricing model? Are companies really doing outcome-based pricing? Who is doing it well, and for what types of customers? How?” These are common questions vendors are trying to sort through as they change their businesses.

Often, we’ve heard that IT vendors are serious about making outcome-based pricing models work but that customers are putting the brakes on these types of arrangements. Customers ultimately balk at the variability and risk of an outcome-based arrangement at some stage of a deal negotiation and push vendors to offer predictable fixed-price engagements. Customers like the idea of not paying when an outcome is missed more than they like sharing the upside when an outcome is met, and somewhere in that trade-off the negotiation falls back to a traditional contractual arrangement.

What’s interesting is that based on recent research, this customer hesitance seems to be abating. In our 2H17 Digital Transformation Customer Research, we asked 165 global enterprises that are undertaking digital transformation initiatives to identify the pricing structures they’ve experienced, and outcome-based pricing emerged as the most common model globally.

As my colleague Jen Hamel points out in the report, “This indicates vendors have become more flexible and creative with pricing to convince clients to take the DT [digital transformation] leap but may see delayed ROI from DT skill investments as revenue depends on project success.”

As digital transformation continues to take root, the question of how vendors can shift to outcome-based pricing will only be asked more frequently, particularly as changes in the timing of revenue recognition from these engagements impact vendors’ flexibility around resource investments. We are eager to watch (and to report) as best practices develop and new models emerge, and we would love to hear what others think on this topic.

Drop a comment here or email me at [email protected].

 

Pricing research is not always about price

I recently read an article summarizing an onstage interview with Amazon CEO Jeff Bezos at the George W. Bush Presidential Center. During the interview, Bezos described Amazon’s data mindset:

“We have tons of metrics. When you are shipping billions of packages a year, you need good data and metrics: Are you delivering on time? Delivering on time to every city? To apartment complexes? … [Data around] whether the packages have too much air in them, wasteful packaging … The thing I have noticed is when the anecdotes and the data disagree, the anecdotes are usually right. There’s something wrong with the way you are measuring it.”

This is critical insight for market research practitioners, including those (like myself) focused on pricing. As analysts, we tend to dive deep into the facts and seek hard evidence. We rely on the data to tell the story and articulate the outcomes. Bezos isn’t saying we should discount data entirely. What he’s saying is that data has value when it is contextualized and re-examined against the actual customer experience.

Pricing is an inherently data-driven exercise. IT and telecom vendors lean on transactional systems, price lists, research firm pricing databases, and other data-centric tools to make pricing decisions and determine appropriate price thresholds. Most of the pricing projects that we do on behalf of our clients start with the question, “Are we competitively priced versus our peers?” That is usually the most basic component of the results that we deliver.

What we’ve found over the years doing this work is that pricing in the ICT space is often more art than science, and that customer anecdotes about pricing are often as valuable and instructive to pricing strategy as the market pricing datasets we produce. Our approach to pricing research is rooted in interviews with representatives of the vendor and enterprise customer communities. In conducting these interviews, we often uncover that the root issues with pricing, thought to lie with the price itself, are actually broader: something related to value articulation, market segmentation, packaging or delivery efficiency. These aspects influence the customer experience, create pain points, and ultimately dictate willingness to pay and value capture.

When we deliver these results to our pricing research clients, the outcome is often not just a list- or street-price change but a rethinking of a broader pricing, go-to-market or customer engagement strategy. Clients will use customer anecdotes to rethink how they message a product in their marketing campaigns and content, devise a new approach to customer segmentation, or take a hard look at the delivery cost structure and resource pyramid levels that drive their price position. In designing pricing research initiatives, we encourage our clients to think more broadly about pricing and incorporate multiple organizational stakeholders into the process, as this can uncover unforeseen drivers of price position.

How does this compare to your organization’s approach to pricing? How important are customer anecdotes to your pricing processes? Drop a comment here or email me at [email protected].

 

Is the IT hardware market ready for Hardware as a Service?

Hardware as a Service (or maybe you call it PCaaS, DaaS or XaaS) basically refers to bundling some type of hardware (e.g., phones, PCs, servers) with life cycle services and charging a recurring fee over a multiyear contract. The customer never really owns the hardware, and the vendor takes it back at the end of the agreement.

Sure, it’s not a new concept. But the solution hasn’t exactly taken off like a rocket ship, either. So, is it going to? Maybe. Its initial speed may be more like a Vespa than a SpaceX Falcon, but there are a few things working in its favor.

Why do buyers want it?

  • Retiring hardware is a huge pain. I have talked to IT leaders who have literally acquired warehouse space solely to store old hardware they have no idea what to do with.
  • Making it easier to stay up to date with tech. Management can no longer deny the negative impact on morale brought by an unattractive, slow and/or unreliable device.
  • Automation & Internet of Things (IoT) usher in new capabilities. Who doesn’t want to make managing hardware easier? Hardware as a Service is basically IoT for your IT department. Device management features like tracking device location and health are key functions of many IoT deployments and a core selling point of Hardware as a Service offerings.

Why do vendors want to sell it?

  • Business models are changing. That darn cloud computing had to come along and change expense models, not to mention make it easier to switch between vendors. From Spotify and Netflix to Amazon Web Services and Salesforce, “as a Service” is second nature to IT buyers in both their personal and professional lives.
  • Creating stickiness. Hardware is more often perceived as “dumb” with the software providing the real value. If you’re a hardware maker (or a VAR), you need to make the buyer see your relationship as one that’s valuable and service-oriented versus transactional.
  • Vendors desire simplicity. Most vendors will tell you they have been building similar enterprise service agreements on a one-off basis for years. These new programs will hopefully create swim lanes to make it faster and easier for partners to build solutions.

Buyers are used to monthly SaaS pricing models, but that’s not really what creates the main appeal for Hardware as a Service. Buyers really want the value-added services and fewer managerial headaches.

So, how’s it going?

As someone who manages several research streams, I get to peek at results from a lot of different studies. Here are a few snippets of things I’ve heard and seen in the last month or so.

  • Personal devices: It certainly seems like there’s the most buzz around PCs, with Dell, HP Inc. and Lenovo all promoting DaaS offerings. I have also heard from enterprises doing initial DaaS pilots with as many as 5,000 PCs, but we seem to still be in very early stages of adoption. Both PC vendors and their channel partners are beginning to report “legit” pipeline opportunities tied to DaaS.
  • Servers: Outright purchase or leasing is still the overwhelming choice of acquisition method for about 90% of IT buyers recently surveyed by TBR. The perception that an “as a Service” model will be more expensive in the long run is the main customer concern to date, one vendors will need to address by emphasizing the value-added life cycle services (see the back-of-the-envelope sketch after this list).
  • Hyperconverged infrastructure (HCI): A bundle of hardware and a services bundle? This is the bundle of bundles! Not too many HCI vendors are openly promoting an “as a Service” pricing model at this point, but 80% of current HCI buyers in TBR’s most recent Hyperconverged Platforms Customer Research indicated they are interested in a consumption-based purchasing model, particularly to enhance their scalability. About 84% of those surveyed are using HCI for a private or hybrid cloud buildout, so maybe a more cloud-like pricing model makes sense. Make no mistake, interest is not synonymous with intent, but it’s safe to say these buyers are at least paying attention to their purchasing options.
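
To see where the long-run cost perception comes from, here is a back-of-the-envelope comparison of an outright purchase against an “as a Service” subscription over a three-year term. Every figure is hypothetical and chosen purely for illustration; none is drawn from any vendor’s price list.

```python
# A back-of-the-envelope comparison of outright purchase vs. an "as a Service"
# subscription over a three-year term. Every number is hypothetical and chosen
# only to illustrate why the subscription can look pricier on hardware alone,
# and why the bundled life cycle services change the comparison.
TERM_MONTHS = 36

# Hypothetical outright-purchase scenario
purchase_price = 1_000       # device cost, paid up front
support_and_disposal = 300   # separately purchased support, refresh and retirement services

# Hypothetical as-a-Service scenario
monthly_fee = 40             # bundles device, management, support and take-back

purchase_tco = purchase_price + support_and_disposal
service_tco = monthly_fee * TERM_MONTHS

print(f"Purchase + separate services: ${purchase_tco:,}")
print(f"As a Service over {TERM_MONTHS} months: ${service_tco:,}")
# With these made-up figures the subscription costs more in absolute terms
# ($1,440 vs. $1,300), so the vendor's pitch rests on the value of the bundled
# services and the avoided management overhead, not on a lower sticker price.
```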

My general verdict is that things are still moving at Vespa speed. PCs have a head start over data center hardware based on the concerted go-to-market efforts of the big three OEMs and a consumption model that more closely aligns with the consumer services we’re used to. The second half of this year will be an interesting proving ground to see if the reported pipeline growth is converted to actual customers. Depending on how that goes, maybe we’ll see the data center guys making more serious moves in this space.

What do you think? Add a comment or drop me an email at [email protected].

 

Key findings from TBR’s upcoming HCI customer research

Hyperconverged infrastructure (HCI) is a growing market ripe with opportunity for vendors. TBR forecasts the market will reach $11.7 billion by 2022. Although TBR research indicates that incumbent vendors with a strong presence in the converged infrastructure (CI) market, such as Dell EMC and Cisco, have an advantage in the space, findings also indicate that a growing number of smaller vendors are rising in popularity. Add to that the approximately one-quarter of existing customers who indicated that brand is not a key factor in their decision making, and it becomes clear that the opportunity to take share from existing vendors is high. Further, with nearly three-quarters of respondents indicating they have not yet taken the plunge into the HCI space, there is massive opportunity, through strategic marketing and support, for vendors to encourage new adopters to become their customers.

HCI has a significant place in the cloud market

Eighty-four percent of respondents indicated they are leveraging HCI for either hybrid or private cloud installations. TBR believes this suggests that cloud is not necessarily an inhibitor to HCI adoption, as some vendors may perceive. Further, we believe this signals that consumption-based pricing options, which 81% of respondents indicated they would be interested in considering in the future, will encourage more HCI adoption. Consumption-based pricing lets customers acquire HCI the way they consume public cloud rather than as a capex purchase, so they can compare performance and other features between the two when making purchasing decisions. Vendors can capitalize on this flexibility with strategic marketing.

IT leaders play a crucial role in the HCI decision-making process

HCI remains a strategic purchase, as evidenced by the fact that 74% of respondents indicated IT directors and managers were among the decision makers. TBR believes that as customers become more familiar with HCI and their HCI vendor, they will be more likely to make repeat purchases and less likely to demand direct-from-vendor sales.

To learn more about TBR’s Hyperconverged Platforms Customer Research, contact Stanley Stevens ([email protected]) or your account executive.

 

Democratization now: It’s good for business

Data democratization is a hot topic now. Spokespeople from Google, SAP, Capgemini and other tech companies have spoken and written about how making data available to as many people as possible will both unleash the power of technology and prevent abuse of closely held data. Microsoft CEO Satya Nadella sprinkles his talks and interviews with references to democratization. TBR agrees that data access is critical to achieving artificial intelligence’s (AI) considerable potential, but access is not enough. For AI to do what it can do, business people with domain knowledge, who are regular users, must be able to work directly with data, without intervening layers of developers and data scientists.

Data access is a conundrum. Ensuring appropriate privacy and security while making data available to as many people as possible is a challenge, one that is inhibiting the growth of widespread AI-driven data analysis. This post will not address that challenge, however. It focuses on one of the other growth inhibitors: the current need for data experts, scientists, engineers and janitors, as well as developers, to extract the value from data.

Business users might see the hierarchy of data experts as a priesthood or a bureaucracy, standing between them and the data, but that is not really what is happening. Currently, there are no tools with which business users can conduct their own analyses, at least not without a lot of preparation by the data experts. Better tools are coming; there are R&D efforts worldwide to make data more directly accessible, which is part of what Nadella and other spokespeople are talking about.

Even before these democratic tools arrive, there is strong growth in AI and the use of data analytics, because the value is there. But the need for experts greatly increases the cost of analysis, so only the analyses with the highest potential value are performed. As more democratic tools become available, many more analytic projects will be worthwhile and the use of analytics will grow much faster.

The impact of democratized analytics tools will be huge because the costs associated with the data expert hierarchy are great. Those costs go beyond just personnel. Communication between business users and data experts is time-consuming and expensive, and it lowers the quality and value of the analyses. Business users and data experts live in different worlds and have different vocabularies; misunderstandings are common. Misunderstandings are expensive, but what is worse, working through intermediaries slows the iterative process by orders of magnitude. The potential value of data lies in insights, and finding insight is an iterative process.

The history of business technology is a story of progress propelled by the increasing democratization of tools. The PC itself is a prime example, making computing directly available to business users. The internet rode a wave of disintermediation and self-service to its current global saturation. In terms of the democratization of AI analytics, the best parallel is the PC spreadsheet, which made it possible for business people to create and tune their own quantitative models of business activities. Before the spreadsheet, creating those models required coding.

“Spreadsheets for AI,” one of which may well be a version of Microsoft’s Excel, will accelerate growth in analytics, data access, storage, cloud services and the Internet of Things. AI spreadsheets will not retard growth in the use of data experts; they serve a different market. Even with very good first versions, broad adoption will take years, so the acceleration of growth will not be sudden. Over the years, however, the ability of business users to directly analyze their data will contribute substantially to the revenue of IT vendors and to that of their customers.

 

Critical success factors for successful pricing research

In my day-to-day life at TBR, I regularly interact with clients seeking to undertake pricing research. Their needs are varied. Some want to understand pricing for a new product or service or sustain their competitive position for an existing offering, while others seek to design an overarching commercial strategy or to increase the effectiveness of their sales teams by arming them with tactical data and insights — and nearly all are focused on influencing revenue and margin.

Capturing pricing data that can be utilized defensibly for decision making is challenging. All pricing is situational and can be influenced by any number of factors. Pricing decisions influence, and are influenced by, nearly all organizational departments, from sales and finance to product management and business strategy, and thus are often highly politicized within ICT enterprises.

Based on our regular experience in serving the pricing research and consulting needs of our client base across ICT industry segments, we have identified five critical success factors that can help clients navigate these challenges:

  • Start with outcomes: We often find that our customers come to us with a research concept in mind, but not a defined goal or set of operational plans for how the research will be deployed in their organization. Sometimes the request is: “We need to know what vendors like us are charging.” But the real goal of the team may be to answer the question: How can we be more efficient in resourcing deals? By starting with the end goal and use case in mind, we find that we often explore areas adjacent to pricing, and that insights on those topics, in concert with pricing data, unlock business value for our client base.
  • Focus on business impact: For all the research we do at TBR, including in pricing, we advise our clients to frame all research needs around the underlying business impact. We design projects, including the questions we propose to cover in primary research and the data that we seek to capture, to ensure that the recommendations we deliver around pricing aim directly at influencing business strategy, revenue and profitability.
  • Focus on context: Our pricing research methodology relies on interviews with vendors and customers. This approach allows us to capture not only pricing data but also contextual data and insights on topics such as discounting, commercial incentives, pricing structures and portfolios. When paired with core quantitative pricing data, these types of interview-driven insights provide predictive value focused on vendor and customer pricing and consumption behavior.
  • Build market constructs: To normalize against deal-specific influencers that can impact a true view of market pricing, we design our research to focus on deal constructs. These constructs are used in all interviews to ensure apples-to-apples comparisons and that we are characterizing a full spectrum of potential price points; a simple illustration follows this list. Context on topics such as discounting is addressed through qualitative conversations.
  • Consider adjacent markets: Many times, particularly with clients seeking to stand up pricing models and price levels for new offerings, we find that the direct peer landscape may not be the best basis of comparison. By looking at how similar-yet-adjacent offerings are packaged and delivered, clients gain a broader view of their full peer landscape and often identify areas to elevate their value proposition and raise prices accordingly.
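
As a simple illustration of the deal-construct idea described above, the sketch below defines one hypothetical construct and normalizes a set of invented vendor quotes to a per-seat, per-month figure so they can be compared apples to apples. None of the names or numbers reflect actual research data.

```python
# A toy illustration of normalizing vendor quotes against a fixed "deal
# construct" so prices can be compared apples to apples. The construct fields,
# vendor names and figures are all invented for illustration.
from dataclasses import dataclass


@dataclass(frozen=True)
class DealConstruct:
    """A fixed deal specification used in every interview or quote request."""
    seats: int
    term_months: int
    support_tier: str


CONSTRUCT = DealConstruct(seats=500, term_months=36, support_tier="premium")

# Hypothetical quoted totals for the same construct
quotes = {
    "Vendor A": 540_000,
    "Vendor B": 612_000,
    "Vendor C": 498_000,
}

# Normalize to price per seat per month so the quotes are directly comparable
for vendor, total in sorted(quotes.items(), key=lambda kv: kv[1]):
    per_seat_month = total / (CONSTRUCT.seats * CONSTRUCT.term_months)
    print(f"{vendor}: ${per_seat_month:.2f} per seat per month")
```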

 

The next 5 years: A successful strategy for Infosys

Infosys wants to change. Founder N.R. Narayana Murthy’s return to leadership last spring signaled Infosys’ willingness to change directions; the question is, will it work? Can the firm find a strategy that moves it closer to IBM and Accenture and widens any positive gaps between Infosys and its India-based peers? 


In July, after resuming his role as executive chairman, N.R. Narayana Murthy said the following about Infosys’ future: “The strategy is to focus on opportunities from consulting-led end-to-end solutions, leveraging technology for higher margins, developing intellectual property-based solutions to delink revenues from effort.” Murthy set three broad goals: improving sales efficiency, increasing automation and boosting employee productivity, and rationalizing cost. Early indications show Infosys making the third goal its priority, as the company has been pursuing a substantial onshore/offshore ratio shift. The ratio currently stands at 35-to-65; Infosys’ goal is 10-to-90, weighted toward low-cost resources, ultimately to preserve margins as the company refocuses on large, low-cost outsourcing engagements.


After setting policy, Murthy oversaw executive-level personnel changes, letting go of the leaders who had been in charge while Infosys’ India-centric peers outpaced the company (see following chart). Murthy kept some key initiatives, including Infosys 3.0, which could determine how close Infosys moves to market leaders Accenture and IBM. Launched in 2011, Infosys 3.0 is a long-term growth accelerator, especially as it melds transformational consulting with emerging technologies. However, Murthy and his new team must provide time and support, both in building the 3.0 engine and in maintaining the focus on low-end commoditized engagements.


TBR believes Infosys must take three strategic steps to achieve its long-term goals in current market conditions:

  • Build up nearshore Americas capacity and capabilities
    Infosys needs to serve the U.S. market better, either with resources on the ground in the U.S. or nearshore in Latin America. The potential growth for any IT company in the U.S., Canada, Mexico and Brazil demands a substantial investment, and India-centric firms cannot rely on the vagaries of the U.S. visa system. Infosys tried to hire mid- to senior-level, U.S.-based consultants in the 2000s, but the effort stalled when the company could not find the right people or receive permission from clients to engage in higher-level consulting work. Infosys opened a 200-person BPO center in Atlanta and a 100-person delivery center in Costa Rica in 2013; however, the company still lags in this critical market relative to peers.

  • Acquire consulting capacity in Europe
    Infosys can continue to serve European clients with outsourcing services based nearshore (in lower-cost European countries such as the Czech Republic and Poland) and offshore, but if Infosys wants to shift its consulting/outsourcing revenue ratio from 35-to-65 to closer to 50-to-50, the company must gain more wallet share from EU clients. Infosys recognized the need to move upstream in Europe when it purchased Lodestone in 3Q12 and committed to reorganization on the continent in 2013, but the company must continue acquiring talent that buys permission to play in the management and technology consulting space.

  • Invest in IP
    Global leaders Accenture and IBM separate themselves from Infosys and its peers with IT innovation, bringing together strategy and emerging technologies to create tech-enabled solutions specific to customers’ needs. Infosys started down this path with its Edge Platform suite, including BigData Edge, a comprehensive cloud-based offering for aggregating, processing and analyzing data from internal and external sources; but absent a sustained strategic approach to customer-centric, acquisition-enabled or research-driven IP, the company risks seeing commoditization strip away any differentiation between Infosys and its India-centric peers.