3 Reasons to Move Your On-Premise Data Architecture to the Cloud

Most companies only use 5 to 10 percent of the data they collect. So estimates Beatriz Sanz Sai, a 20-year veteran in advanced analytics and the head of Ernst & Young's global data and analytics practice. While it's impossible to validate such a claim, the fact is that many organizations gather lots of data but analyze little of it.

Legacy database management systems shoulder much of the blame for this. They consume time and resources simply storing, managing, and preparing data, and thus impede analytics.


iPaaS: Key Considerations and Warnings

Digital transformation requires appropriate cloud adoption, modernization of legacy IT systems, and Agile-based methodologies for faster innovation. SaaS applications are the preferred choice for organizations adopting a digital strategy, and those organizations soon find themselves awash in cloud products. Because most cloud products specialize in a subset of the functionality offered by their traditional monolithic on-premise counterparts, enterprise application portfolios see a surge of SaaS applications, each addressing a specific business need. With such a plethora of SaaS applications, it becomes paramount to establish seamless connectivity among them so they behave as a well-connected system that meets the overall business objective. iPaaS (Integration Platform-as-a-Service) is a next-generation integration platform that addresses the ever-evolving integration challenges these SaaS applications bring. The latest addition to the enterprise middleware portfolio, iPaaS offers integration-as-a-service capability to the organization.

With growing interest in iPaaS, many organizations are curious and ready to adopt it to reap its benefits. This article presents the key considerations that an enterprise architect, business team, or IT team must weigh before jumping on the iPaaS bandwagon. Is it right to think that iPaaS is the panacea for all integration challenges?


Surmounting Cloud Adoption Challenges With an iPaaS Model

Cloud is not the next big thing anymore. It is now the big thing. Immersive cloud-based technologies have totally altered the IT landscape. Organizations now understand that centralized computing and archaic architectures cannot deliver a competitive edge. The drive for low latency and hyper-interactivity is pushing organizations to adopt cloud-based technologies. Integration is being seen in the light of new approaches, yet organizations face the same old challenges that prevailed in the old frameworks. We will cover those challenges and ways to circumvent them with an Integration Platform as a Service (iPaaS) framework.

Barriers to Cloud Adoption

A fair share of cloud migration challenges are spawned by weak integration between on-premise and cloud-based applications. These problems keep resurfacing when point-to-point networks and hairball coding are used to integrate many applications. Developers write code and throw it over the wall to the testing team for validation, and whenever there is a shakeup, they need to repeat the entire process. This strategy is not viable for companies with thousands of applications.

Once upon a time, applications were developed with little thought for integration with other applications. They were primarily stove-piped applications that offered one or two endpoints, and those endpoints don't scale to accommodate an SOA infrastructure. Lengthy code must be redeveloped to accommodate even small changes. The following are some of the drawbacks of the hand-coded approach:

  • Growth in complexity leading to the formation of an integration hairball.
  • Lack of scalability for Service Oriented Architecture.
  • Increased Total Cost of Ownership (TCO).
  • Long delays in onboarding partner data.
  • Lack of dedicated features for firewall mediation, data management, governance, etc.
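The "hairball" growth above is easy to quantify: with hand-coded point-to-point integration, every pair of applications needs its own connection, while a hub-style platform needs only one connection per application. A minimal sketch (the function names are my own):

```python
def point_to_point_connections(n_apps: int) -> int:
    """Hand-coded pairwise integrations: n * (n - 1) / 2 connections."""
    return n_apps * (n_apps - 1) // 2

def hub_and_spoke_connections(n_apps: int) -> int:
    """A central integration platform needs one connection per application."""
    return n_apps

# 10 apps already need 45 point-to-point links but only 10 hub links;
# at 1,000 apps the gap is 499,500 versus 1,000.
for n in (10, 100, 1000):
    print(n, point_to_point_connections(n), hub_and_spoke_connections(n))
```

The quadratic growth on the left is why the hand-coded approach stops scaling long before "thousands of applications."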

Historically, data was brought to the Central Processing Unit (CPU) for processing. This approach had to change as massive amounts of data soon overwhelmed a single processor. The response was to bring multiple processors to the data: each server processed individual elements of the data sets, an arrangement called parallel processing, in which many CPUs work on many data sets at once. Even in such an ecosystem, organizations struggle to scale processing to handle the variety, volume, and velocity of their data. As a result, teams encountered treacherously difficult challenges while:

  • Deriving true value from their data.
  • Overcoming data silos that restrain capabilities.
  • Getting the skills required for marshaling and orchestrating data.
  • Linking the data with Big Data and other digital initiatives.
  • Safely exchanging data with business partners.
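The parallel-processing shift described above, partitioning data across many workers instead of funneling it all through one CPU, can be sketched in a few lines of Python (a toy illustration, not a Big Data framework):

```python
from multiprocessing import Pool

def process_partition(partition):
    """Stand-in for real per-partition work (here: just summing values)."""
    return sum(partition)

def parallel_total(data, n_workers=4):
    # Split the data into one partition per worker ("bring the processor
    # to the data"), process the partitions in parallel, then combine.
    chunks = [data[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        partials = pool.map(process_partition, chunks)
    return sum(partials)

if __name__ == "__main__":
    print(parallel_total(list(range(1_000_000))))
```

Each worker sees only its slice, which is exactly the property that makes scaling possible and coordination (the challenges listed above) hard.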

To surmount these challenges, experts recommend that organizations embrace a strategic integration approach based on industry best practices. Hybrid cloud adoption will continue to increase, and organizations that underestimate integration will fail to reap the benefits of cloud migration.

Conventional Approaches to Hybrid Integration

Initially, application leaders used ESB architecture to manage integration between applications and services. However, they realized that an ESB doesn't support scenarios where new and old applications run in parallel. It lacked the scalability to accommodate new SaaS platforms like Salesforce, Workday, and QuickBooks, and frequent IT intervention was needed at every layer to develop, test, and generate code.

Extract, Transform, and Load (ETL) was also used, but it suffered from drawbacks of its own. It was designed to pull data from a structured data repository, an assumption that the advent of Hadoop has made outdated: traditional row-and-column ETL doesn't let users store both structured and unstructured data.

Modern applications commonly use the Extensible Markup Language (XML) format for storing data, while physical machines often emit comma-separated values (CSV). The ESB/ETL approaches are overwhelmed when large volumes of data need to be mapped from XML to CSV or from CSV to XML, so it is unwise to use them for high-speed, high-volume projects.
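To make the XML-to-CSV mapping burden concrete, here is a minimal hand-coded converter for a hypothetical order feed; every element and field name below is invented for illustration:

```python
import csv
import io
import xml.etree.ElementTree as ET

# A tiny, made-up XML order document of the kind a modern app might emit.
XML_DOC = """
<orders>
  <order id="100"><customer>ACME</customer><value>250.00</value></order>
  <order id="101"><customer>Globex</customer><value>99.50</value></order>
</orders>
"""

def xml_to_csv(xml_text: str) -> str:
    """Flatten each <order> element into one CSV row."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["id", "customer", "value"])
    for order in root.findall("order"):
        writer.writerow([order.get("id"),
                         order.findtext("customer"),
                         order.findtext("value")])
    return out.getvalue()

print(xml_to_csv(XML_DOC))
```

Every new document shape needs another mapping like this one, which is exactly the per-format coding burden that drowns ESB/ETL pipelines at high volume.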

Connecting applications through APIs is good, but it is not a silver bullet for broader B2B integration needs. Specific APIs must be developed for specific integration scenarios, and the same API takes on too much strain when it is reused for multiple integrations. Organizations also need to buy separate licenses for specific application integration scenarios.

Data is no longer a secondary asset; it has become a critical corporate one. Jurassic integration approaches don't help teams leverage data and engage customers. A greater degree of support from data integration tools is required to bring in data from mobile, Internet of Things (IoT), and social channels.

Data security has become an even bigger concern as new and stringent regulations like the General Data Protection Regulation (GDPR) loom. Conventional approaches don't provide safe passage through dense API cloud networks; a reliable pathway is needed that secures data at every endpoint.

Manual Coding and Thorny Challenges

Thorny challenges await application leaders when on-premise systems need to be integrated with cloud-based systems, and disruptions keep resurfacing in manual workflows. Let's take a real-world scenario to understand this problem, using manual steps to connect SAP with Salesforce, a powerful combination of two trusted platforms that can help organizations become more productive.

Salesforce offers the Data Loader to integrate with other applications. To download it, go to Salesforce Setup –> Data Management –> Data Loader.

Lightning Connect is another approach to connecting an SAP ERP Central Component (ECC) system with Salesforce, and it is the one discussed in this article. Here are the steps to integrate SAP with Salesforce using this tool.

Step 1: Configure the Lightning Connector to perform queries and connect to the SAP ERP system.

Step 2: Log in to the SAP Community Network

  • Select Join us and get a free login at http://scn.sap.com/
  • A pop-up appears.
  • Register so that you become part of the SAP Community Network.

Step 3: Get access to the Public SAP System

Step 4: Configure the Lightning Connector for SAP Access

  • Click Setup → Develop → External Data Sources.
  • Click New External Data Source.
  • Fill out the following information:

Label: SAP Data
Name: SAP_Data
Type: Lightning Connect OData 2.0
URL: https://sapes1.sapdevcenter.com/sap/opu/odata/sap/SALESORDERXX/
Connection Timeout: 120
High Data Volume: Unchecked
Compress Requests: Unchecked
Include in Salesforce Searches: Checked
Custom Query Option: Blank
Format: AtomPub
Certificate: Blank
Identity Type: Named Principal
Authentication Protocol: Password Authentication
Username: <The SAP-supplied username from Step 2>
Password: <The SAP-supplied password from Step 2>

  • Click Save. A page appears: Connect to a Third-Party System or Content System.
  • Click Validate and Sync. The Validate External Data Source page appears.

Step 5: Synchronize tables from SAP to Salesforce (creating corresponding external objects inside Salesforce)

  • Click Sync to allow Salesforce to Read SAP Tables
  • Click SOHeaders to see the custom object and the custom fields

Objects that end in __x are the external objects created by the sync; their fields end in __c.

Step 6: Create an Apex Class to Retrieve SAP Data

  • Go to Setup –> Develop –> Apex Classes and click New.
  • Cut and paste the code below into the code editor and then click Save:
public class SAPsalesordersExtension {
    //
    // Read the external object SOHeaders__x that was created by the OData sync.
    // Use this to display sales order data for a specific customer number via
    // a Visualforce page.
    //
    private final Account acct;
    List<SOHeaders__x> orderList;

    public SAPsalesordersExtension(ApexPages.StandardController stdController) {
        Account a = (Account) stdController.getRecord();
        List<Account> res = [
            SELECT Id, AccountNumber FROM Account WHERE Id = :a.Id LIMIT 1
        ];
        this.acct = res.get(0);
    }

    public String getSAPCustomerNbr() {
        return acct.AccountNumber;
    }

    public List<SOHeaders__x> getOrderList() {
        if (null == this.orderList) {
            orderList = [
                SELECT ExternalId, CustomerId__c, SalesOrg__c, DistChannel__c,
                       Division__c, DocumentDate__c, DocumentType__c,
                       OrderId__c, OrderValue__c, Currency__c
                FROM SOHeaders__x
                WHERE CustomerId__c = :this.acct.AccountNumber
                LIMIT 300
            ];
        }
        return orderList;
    }
} // end of OData Apex class

The SAP Sales Order Execution Page Appears

Step 7: Create a Visualforce page to display results

  • Go to Setup –> Develop –> Pages, click New
  • The Visualforce page appears

Include the following information:

Label: SAP_oData_Example
Name: SAP_oData_Example
Description: A simple example of getting SAP data without any middleware!

  • Paste this code in the Editor
<apex:page standardController="Account" extensions="SAPsalesordersExtension">
  <style>
    td { border-bottom-color: rgb(224, 227, 229); border-bottom-style: solid;
         border-bottom-width: 1px; background-color: #FFFFFF;
         border-collapse: separate; padding: 5px 2px 4px 5px; font-size: 12px; }
    th { border-color: rgb(224, 227, 229); border-style: solid; border-width: 1px;
         background-color: #F7F7F7; border-collapse: separate; font-weight: bold;
         padding: 5px 2px 4px 5px; font-size: 12px; }
    table { border-color: rgb(224, 227, 229); border-style: solid; border-width: 1px; }
  </style>
  <!-- The value/var bindings and facet names assume the getOrderList property
       of the SAPsalesordersExtension controller from Step 6. -->
  <apex:dataTable value="{!orderList}" var="order">
    <apex:column>
      <apex:facet name="header">Id</apex:facet>
      <apex:outputText>{!order.ExternalId}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Sales Org</apex:facet>
      <apex:outputText>{!order.SalesOrg__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Dist Channel</apex:facet>
      <apex:outputText>{!order.DistChannel__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Division</apex:facet>
      <apex:outputText>{!order.Division__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Customer Id</apex:facet>
      <apex:outputText>{!order.CustomerId__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Document Type</apex:facet>
      <apex:outputText>{!order.DocumentType__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Order Id</apex:facet>
      <apex:outputText>{!order.OrderId__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Order Value</apex:facet>
      <apex:outputText>{!order.OrderValue__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Currency</apex:facet>
      <apex:outputText>{!order.Currency__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Date</apex:facet>
      <apex:outputText>{!order.DocumentDate__c}</apex:outputText>
    </apex:column>
  </apex:dataTable>
</apex:page>
  • Click Save.
  • The SAP OData page appears.

Step 8: Assign the Visualforce page

  • Go to Setup –> Customize –> Accounts –> Page Layouts.
  • Click Edit to modify the page layout. The Account Layout page populates.
  • Drag the Section option to the spot where the new section should appear.
  • A popup appears to name the new section; click OK.
  • Locate the Visualforce Pages list and drag the newly created Visualforce page onto the Accounts page layout.
  • Click Save to save the updated Accounts page layout.

Step 9: Test Drive Data Movement

  • Click the Accounts tab.
  • Click New to create a new account.
  • Populate the following test data. For example:

Account Name: Belmont cafe Inc
Account Number: 100001

  • Click the Save button on the Salesforce Account page.
  • Click the newly created account under the Recent Accounts section.

All real-time SAP data will be returned.

This method is cumbersome, and it cannot be reused for integrating Salesforce data with other applications. It only provides access to SAP data, and moving large chunks of data can be an uphill task.

iPaaS Approach for Becoming a Cloud-First Enterprise

Previously, it was only employees who generated and entered organizational data into computer systems. Now users and machines also generate data across social channels, forums, online commerce, and more. As a result, organizations today must deal with ever-larger volumes of data generated by their customer-facing platforms, monitoring systems, smart meters, and so on. The next big challenge is unlocking these colossal amounts of data, which hold massive hidden opportunities. Processing and refining data at such scale is a next-level challenge during cloud migration, one that can be addressed through iPaaS.

IT experts consider iPaaS the best solution for defying the impact of disruption on security, data and analytics, communications, and endpoint technology. Smarter organizations are using this model to overcome their integration weaknesses and set up a future-ready IT architecture. A next-generation iPaaS model delivers compelling business benefits:

  • 3,000x faster lead times.
  • 300x more deployments.
  • 30x faster recovery.
  • 10x lower failure rate.
  • 60x more reusability.

By simplifying application and data integration, iPaaS helps modernize IT architecture and set up a cloud-first enterprise. The framework enables even business users to integrate with a gamut of external and internal business applications and processes safely and cost-effectively. Organizations can access new workloads from new channels (social, analytics, cloud, and the Internet of Things) with a few clicks. This allows users to connect faster with partner networks, bringing in data faster, reducing total cost of ownership, and accelerating time to revenue.

Leveraging iPaaS: No Code Approach for Integrating a Cloud Application (Salesforce) with an ERP (SAP)

An advanced iPaaS framework automates integration between Salesforce and SAP. It provides a secure bridge to connect Salesforce with ERP and other applications. Normal business users can handle exceptions and replicate Salesforce data faster with other applications. Here are some steps for Salesforce API integration with SAP.

Adeptia iPaaS Interface to Connect Salesforce with SAP

Choose from a shared list of Salesforce connections

Step 1: Log into the Adeptia Integration Suite

Step 2: Choose from a variety of Salesforce to SAP Connectors

Create Connections

Step 3: Use Triggers and Actions.

  • Triggers: push data from Salesforce to the target system.

  • Actions: sync data from other business applications into Salesforce.

Visually Map Data Fields

Step 4: Map Data between Source and Target fields with drag-and-drop ease

Step 5: Click Save

The data between Target and Source Systems is mapped!

In this way, an iPaaS framework allows ordinary business users to connect with any business application. Users can update leads, contacts, and campaigns in Salesforce in simple, non-technical steps.

Guidelines for Adopting an iPaaS Framework

Adopting an iPaaS framework is a long-term decision, and organizations should evaluate their requirements before investing in the right platform.

It is about time organizations became data-oriented instead of system-oriented. View iPaaS as a process that streamlines IT integration in steps. Here are some guidelines for succeeding with iPaaS adoption.

Preparing a Proof of Concept (PoC): Data governance and management is as important as any other change-management initiative. Organizations need to demonstrate that their iPaaS approach delivers significant value and return on investment, e.g., improved forecasting, a greater degree of personalization, optimized resources, and better-targeted marketing. It is important to consider the continuum of data challenges arriving from a wide variety of data sources. Establishing a strong data governance model will require continuous support from all departments at every layer, so it is important to bring all departments together to gather ideas and prepare a first-rate data governance framework.

Testing the Hypothesis: It is better to fail fast during testing than at later stages. The next step is putting the frameworks to the test and shortlisting the model that works best. The frameworks should be tested on the basis of data readiness, feasibility analysis, usability, and so on.

Validating the Roadmap: At this stage, the data governance model should be tested in the actual working environment. Application leaders must demonstrate at every stage that the model allows an organization to harness more value with less friction. Some of the factors to consider are capital expenditure, operational expenditure, total cost of ownership, and ROI.

Implementing the Data Governance Model: Finally, the model has to be handed over to the business teams and embedded into the organization. It is important to verify that the model delivers real benefits in a continuous operating environment.

Emerging business needs are driving more innovation in the iPaaS market, which grows denser with every newly added functionality. To achieve continued success from the investment, an organization should follow these guidelines and select an iPaaS model that promises a stable trajectory.


5 Ways to Learn More About Data Integration and iPaaS Offerings

Integration platform as a service, or iPaaS, might be a less familiar term than ETL to data analysts, integration architects, and anyone else who needs to move and shape data from source systems to target systems. However, the purpose of this tool is the same as ETL's: extract, transform, and load data. It's just that iPaaS doesn't stop there.

In the following post, you’ll learn about the specifics of iPaaS, how it surpasses ETL in its functionalities, and what the areas of its implementation are. Additionally, you will find a few sources that will guide you through the great number of various iPaaS offerings and help you learn about some modern trends in data integration.

Last but not least, you will get to know a few best practices that, hopefully, will help you make an informed decision when selecting the right iPaaS.

It Is All About Data, Isn’t It?

Modern data platforms such as cloud-based data solutions make things much simpler by enabling the transformation of data as it is queried within these platforms.

However, masses of raw data are difficult to make sense of for people who aren’t experts. There is still a need to transform data into a clean and readable format with a strongly defined data model before it is queried. One way to do that is to use iPaaS.

iPaaS is a cloud-based, multi-tenant integration tool for extracting data from different sources, transforming it by applying a set of rules to prepare it for querying and analysis, and finally loading the transformed data into the target system(s), which can be practically anything from a database to a data analysis platform to a system of record.
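That extract/transform/load cycle can be sketched in miniature. Everything below, the source records, the transformation rules, and the in-memory "warehouse," is a hypothetical stand-in for what an iPaaS does at scale:

```python
def extract(source):
    """Pull raw records from a source system (here, any iterable)."""
    return list(source)

def transform(records, rules):
    """Apply each declarative rule (a plain function) to every record."""
    for rule in rules:
        records = [rule(r) for r in records]
    return records

def load(records, target):
    """Write the cleaned records into the target system (here, a list)."""
    target.extend(records)
    return target

# Two illustrative rules: normalize an email field, derive a display name.
rules = [
    lambda r: {**r, "email": r["email"].strip().lower()},
    lambda r: {**r, "full_name": f"{r['first']} {r['last']}"},
]

source = [{"first": "Ada", "last": "Lovelace", "email": " ADA@Example.com "}]
warehouse = []
load(transform(extract(source), rules), warehouse)
print(warehouse[0]["email"], warehouse[0]["full_name"])
```

The point of the rule list is that transformations are declared as data, not hand-coded per pipeline, which is what makes the result queryable by non-experts.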

It is not difficult to see, therefore, how iPaaS is simply a means of data integration and a modern successor of the classic ETL tools to consolidate data for driving better business decisions.

But How Is iPaaS Different From a Classic ETL Tool Exactly?

An ETL tool does what its name suggests: it extracts data from a source system, transforms it so that a target system can read it, and loads it into the target system. Its main characteristic, though, is that it moves data in batches, usually on an hourly, daily, or weekly basis. Thus, ETL is fine when the data is not really time-sensitive, but it will very likely fail when you need real-time or near-real-time data. Additionally, it is not well suited to connecting more than two endpoints at a time.

The basic purpose of iPaaS is no different from that of ETL. As mentioned earlier, it also extracts, transforms, and loads data. The main difference is that you can connect far more than two endpoints at a time with iPaaS, and you can set up real-time or near-real-time data sync. Additionally, it usually comes with various pre-built, pre-configured integration components, such as connectors for different business applications, which considerably reduces the overall development and implementation effort.
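The multi-endpoint difference is worth sketching: where a classic ETL job ties one source to one target, an iPaaS-style flow can fan a single change event out to many connectors at once. The connector names below are invented for illustration:

```python
class Connector:
    """Hypothetical stand-in for a pre-built application connector."""
    def __init__(self, name):
        self.name = name
        self.received = []

    def push(self, record):
        self.received.append(record)

def sync(record, connectors):
    """Fan one source event out to every registered target connector."""
    for connector in connectors:
        connector.push(record)

# One account change propagates to three downstream systems at once.
targets = [Connector("ERP"), Connector("BI"), Connector("CRM")]
sync({"account": "ACME", "status": "active"}, targets)
print([c.name for c in targets if c.received])
```

A batch ETL job would need a separate pipeline (and schedule) per target; the fan-out model is what makes near-real-time, many-endpoint sync practical.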

In general, iPaaS tools are considered to be an optimal solution for people who use data for BI purposes or other business-critical applications. First and foremost, they make data understandable for the layman analyst by conforming it to business terminology. In addition to that, more often than not, iPaaS is built with a regular user in mind, offering an interface that is equally easy to use by both technical folks and business users. This is another major difference between iPaaS and ETL, with the latter being hardly considered as user-friendly for a layman.

5 Ways to Learn More about Data Integration and iPaaS

iPaaS was originally used to sync data between cloud-based business applications. However, thanks to its specific characteristics, some of which are mentioned above, iPaaS is evolving to support data integration across mobile platforms, IoT platforms, BI platforms, on-premises applications, and, of course, the cloud.

In the end, though, iPaaS is just one of many approaches and tools used for data integration. At this point, it’s important to note that data integration will always be relevant because it drives insights from the data that every business collects. Data-driven insights lead to better business decisions.

It's imperative, therefore, to keep up to speed on recent developments and trends in data integration, including iPaaS and how it is evolving.

Below are five ways to learn more about data integration and iPaaS in modern organizations:

1. Peruse the Wiki for Data Integration Solutions

That’s correct, there is this relatively new project started by Alooma, ETL Wiki, which offers a broad overview of various data integration solutions and covers many data integration topics. It is helpful in providing the broad overview necessary to understand everything you need to know about data integration, starting from the very basics like ETL and going deeper down the road. Browsing the wiki tree allows you to understand the structure of the data integration space, and how different topics or trends relate to each other.

2. Understand the Mechanics of Data Integration

While the previous resource goes through a lot of different concepts, it’s important to get an extensive overview of the mechanics behind data integration too, particularly for modern enterprises with increasing amounts of data.

A good vendor-independent guide to data integration, such as Enterprise Integration Patterns, provides the bedrock required to understand different integration patterns, how to apply them and what pattern might fit which integration scenario best.

For those who still prefer real books to online sources, there is a book accompanying the website.

3. Become Familiar With Various iPaaS Offerings

There are so many different data integration solutions that it is hard to decide which one is best for your own organization. I bet sometimes it even seems easier to just scramble something together on your own instead of spending hours doing the research, reading about the specifics of this and that platform and talking to sales reps and consultants.

Luckily, not least because of the increased usage of iPaaS solutions, there are quite a few sources out there that have already done at least some of the work for you. One of the main sources of information is certainly Gartner, with its numerous Magic Quadrants for virtually any type of data integration solution. The pro is that Gartner really tries to cover every possible tool in each category. The cons are that only a handful of solutions get extensive coverage, and these papers are not free.

If your company doesn't have regular access to Gartner's information portals, you can nonetheless find good overviews of current iPaaS offerings on various expert blogs, like this one on Soft Examiner in English (don't let the title confuse you; it's not only about enterprise service buses) or this one on Digital Besser in German.

4. Follow Best Practices for iPaaS Selection

It is hardly open to debate that choosing software must be a well-informed decision. This is even more true for data integration solutions, because the wrong tool can mess up or even corrupt your data, which is nowadays tantamount to a disaster.

So, how to make an informed decision on the right iPaaS tool for you?

First and foremost, make sure you get a trial period for the iPaaS solutions you are evaluating. iPaaS providers usually offer only one trial month, but try to negotiate a better deal in a personal call. Take your time to test the software and make careful notes on what you like and what could be improved; they might come in handy later in the selection process, both to choose the right iPaaS for your organization and to justify your choice to the budget holders.

After the trial period, no matter how much you liked the platform, hold off on total commitment. Take some more time to build a proof of concept. For this, take the smallest pricing plan and the shortest billing period offered, preferably a monthly billing option. (Some very large vendors do not offer such a choice, but then again, bigger doesn't necessarily mean better.)

After the proof of concept, if everything goes well and depending on your integration needs, it might make sense to define a pilot project. A pilot project is still not a commitment on your part, so the iPaaS provider's contracts must be flexible enough to let you spend only a fixed amount on the platform until the pilot project is finished and the results can be evaluated, which brings us back to the benefits of the monthly billing option.

If everything is satisfactory, then and only then should you actually commit to one iPaaS provider.

5. Take a Course in Data Integration

The beauty of the Internet is that it contains a vast library of resources useful for almost any field you can think of. There is a host of free or low-cost online and offline courses designed by data and software professionals that go into great detail on data integration and anything around it.

This Udemy course, for example, covers data warehouses, business intelligence, and ETL testing, including the different testing scenarios required to thoroughly test any ETL software. Similarly, on the European Commission’s website, you can find a range of various courses on data integration, but also on big data, data collection, data analysis, and other exciting topics.

Closing Thoughts

  • The ever-growing number of cloud-based platforms and software solutions, the increasing need for extensive data analysis and the rising complexity of business use cases that involve modern technologies – all these are the reasons why iPaaS is slowly but confidently replacing some classic data integration solutions.
  • While concepts such as data integration and data integration solutions are commonplace in all enterprises that collect and analyze data, it’s vital to understand these concepts fully before deciding how to approach integrating your data.
  • Testing is everything. When you deal with such sensitive matter as data, you have to be sure that the tool that you will implement organization-wide will do what it promises, and this is equally true for iPaaS or any other data integration solutions. So, make sure that you dedicate enough time to test a tool in several different ways before committing to it.


Data Analytics and Data Integration Are at the Core of an Omnichannel Business

A few weeks ago, we published the first part of our article on omnichannel, which resulted from an interview with Duncan Avis, Customer Enterprise Lead at Global KPMG Connected. We said that, according to one of the latest KPMG studies, companies successful at omnichannel are also successful at eight core capabilities that connect the front, middle, and back offices (see the image below).

In this second part of the article, we would like to touch upon the technical side of their success. In other words, what are the technical prerequisites for enabling these eight capabilities?

To Be Omnichannel, Companies Require an Enterprise-Wide Grip on Their Data

It's not that companies tend to lack certain IT systems; usually, they don't. Quite the contrary: one of the questions in the KPMG study was specifically about the technology in place. All sorts of systems were named: marketing analytics, marketing automation, in-store and sales solutions, contract and order management, billing solutions, even workforce management and product innovation management solutions.

But what Duncan and his colleagues noticed was that the companies that said they do get their return on investment, in other words, the ones successful in their omnichannel efforts, have enterprise-wide data management and enterprise-wide analytics capabilities in their technology stack.


Data-centricity leading to consumer-centricity: these are the two key factors in being an omnichannel business, or, as Duncan calls it, an "omni-business" as opposed to just omnichannel. That said, one needs to keep in mind that the data-centric approach presupposes agile data integration and smart data analysis.

Yet the Current State of Data Is Alarming

The answers of the KPMG interviewees show how important data analytics tools are going to be for businesses. While data and analytics were named only the third most important area of investment currently and within the next 12 months, “it was the one area that had the largest increase going forward,” says Duncan, explaining these insights:

“I think organizations recognize that data and analytics are important today, but they also recognize that it is going to be super important going forward, and it’s no longer about just analytics.”

Data integration tools are often mentioned almost in passing in this regard, yet they play an important role in collecting data from various sources in the front, middle, and back offices and feeding it to the data analytics systems. Data integration solves the rather classic problem of having disconnected, channel-specific systems in place, which do a good job in their own closed environments but fail completely when used together across channels.

Let me give you an example to prove my point. Duncan shared it with me when talking about integrating the front, middle, and back offices, but I think it fits the data integration topic just as well. Because they were delivered channel by channel, the product information management tools used for the online channel often have schemas that differ from those of the core ERP systems used in stores.

Now, imagine you buy something online and want to return it to a store. Because of this difference in schemas, there is a high chance that the store’s technology won’t recognize the item, even if it looks like one of their products, and therefore won’t let you return it this way.
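To make the schema mismatch concrete, here is a minimal sketch of a mapping layer between an online product catalog and an in-store ERP. All field names, SKU formats, and the function itself are hypothetical, invented for illustration; real systems would need far richer mappings.

```python
# The online PIM and the in-store ERP describe the same product with
# different field names and identifier formats (all names hypothetical).

def pim_to_erp(pim_record: dict) -> dict:
    """Translate an online product record into the ERP's schema."""
    return {
        "ItemNo": pim_record["sku"].replace("-", ""),   # ERP stores SKUs without dashes
        "Desc": pim_record["title"],
        "UnitPrice": round(pim_record["price_cents"] / 100, 2),  # cents -> currency units
    }

online_item = {"sku": "TS-1042-BLK", "title": "Black T-Shirt", "price_cents": 1999}
store_item = pim_to_erp(online_item)
# Without such a mapping layer, the store system cannot match the returned
# item ("TS-1042-BLK") against its own record ("TS1042BLK").
```

This is exactly the kind of translation that a data integration platform performs centrally, instead of each channel re-implementing it ad hoc.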

Another recent study, Data 2020: State of Big Data, for which a market research firm commissioned by SAP surveyed over 500 IT decision makers from enterprise-level companies in the US, Brazil, the UK, Canada, Germany, France, Japan, China, and Australia, reinforces the point. 64% of the interviewees admitted that their data has low or no accessibility for a wide variety of business stakeholders; 85% revealed that they struggle with data spread across a variety of locations; and 72% called their data landscape complex, citing the variety and number of data sources.

It is safe to say that, given how important data-centricity is, these numbers look quite alarming.

Enable Data-Centricity With Proper Data Integration Tools

In order to ensure proper access to various data sources and locations, as well as a clear data landscape, it is certainly wise to bet from the start on smart IT systems that either provide easy connectivity options or simply cover “it all.”

However, let’s be honest: many established companies that want to go omnichannel now have accumulated their systems over the years, building a complex architecture upon and around them and adding new “layers,” such as for new channels, only when needed. There is very little chance that they will be willing to simply scrap all of this and replace it with “shiny” new, interconnected systems. That would be exactly what many anti-omnichannel experts warn about: the insanity of throwing corporate money out of the window.

Is there a way around it? According to a recent Gartner paper, Survey Analysis: Integration Platform as a Service Turns Strategic, iPaaS is increasingly used not only for application and data integration but also for data warehousing, mobile app integration, B2B integration, analytics, and other similar scenarios.

So, modern iPaaS tools might indeed be a way to connect disparate data sources and locations without having to break down the existing architecture. They are scalable and flexible when it comes to spikes in data traffic; they are often hybrid, connecting public and private clouds to on-premises systems; and, as a rule, they are easy enough to use to be a viable alternative to “quick and dirty” point-to-point integration. The latter point is particularly important unless you want your whole architecture to eventually become so stiff that it is practically impossible to introduce new systems or remove old ones without breaking the whole organization.
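The stiffness of point-to-point integration has a simple combinatorial explanation: with n systems, the number of pairwise links grows quadratically, while a hub-style platform such as an iPaaS needs only one connector per system. A quick sketch of the arithmetic:

```python
# Connector counts: point-to-point wiring between every pair of systems
# versus one connector per system into a central integration hub.

def point_to_point_links(n: int) -> int:
    """Every system linked directly to every other: n choose 2."""
    return n * (n - 1) // 2

def hub_links(n: int) -> int:
    """One connector per system into the integration platform."""
    return n

# For 5 systems:  10 point-to-point links vs 5 hub connectors.
# For 10 systems: 45 vs 10.
# For 20 systems: 190 vs 20.
```

Each point-to-point link is also a place where a system swap can break something, which is why the quadratic wiring makes introducing or retiring systems so painful.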

Without proper data integration, which gives you a grip on literally all the data in your organization, accurate data analytics is virtually impossible, which in turn impedes data-centricity. That is why data integration tools should not be regarded as an add-on to connect an application here and there; rather, one needs to approach this question from a strategic perspective.

Original Link