Enterprise software and IT infrastructure have evolved significantly over the past several years. Businesses are rapidly migrating their mission-critical applications (HR, finance, customer relationship management, customer service, operations management, etc.) to cloud-based architectures, and are increasingly inclined to use best-of-breed “point solutions” rather than licensing all their software from one provider.

For instance, in the HR domain, an enterprise might use a SaaS (Software as a Service) offering such as Workday as its core HR software, but may opt for a recruiting platform like Greenhouse, a time-tracking package like Kronos, a background check system from Sterling, and an equity compensation system from Solium, along with dozens of other solutions.

Benefits Of An Enterprise Cloud Ecosystem Approach

  • Reduced fixed costs by eliminating on-premises servers and the staff who manage them.
  • Easy upgrades to avoid software obsolescence.
  • Rapid deployment of systems.
  • Scalable infrastructure.
  • Redundancy, failover, and security managed by cloud service providers.
  • Unique software solutions to meet the exact needs of the enterprise.

The Data Integration Challenge

With these benefits comes a significant challenge. How do organizations effectively integrate these cloud-based applications? How does data from one application get shared with other systems that make up an entire workflow?

In this day and age, it is not unusual for large companies to have dozens or hundreds of applications running in the cloud (or across multiple clouds), each producing, consuming, and transforming data from other systems thousands of times a day.

These applications aren’t owned by the enterprise, nor installed on-premises. Only the data belongs to the company, and it flows between the applications over the open internet to create a virtual “nervous system”. This nervous system must be seamlessly and reliably integrated for business operations to function.

It’s not enough for each of these disparate systems to work flawlessly and securely on its own – they must also exchange data flawlessly and securely, potentially in various formats and often in real time.

In one of our client environments that uses Workday HR management software, there are 190 integration points with third-party applications and services, and these integrations exchange data over 40,000 times each day. And that is just HR; even more data flows through finance, operations, sales, and other functions.

This points to an emerging need for a new IT competence: managing not just dozens (or hundreds) of applications, but also the flow of data between them.

It is the flow of data between applications that is often overlooked. SaaS vendors are responsible for their own applications, and will typically provide ways to get data to the “edge” of their systems through application programming interfaces (APIs). They don’t take responsibility for the data once it leaves their application, and certainly don’t guarantee that it arrives at its destination, or that it arrives in the right format.

So, what could possibly go wrong?

This challenge is akin to checking baggage at the airport. You hope your bag gets on your plane, is handled correctly, isn’t picked up by someone else, has nothing stolen or added, and arrives undamaged at the other end. What happens when your bags don’t arrive at your destination? Your tag says they’re supposed to be there, but they aren’t. Now what? Why did the system fail? Is it the departure airport’s fault? The airline’s? The arrival airport’s? Where did the bag get lost? How will it be retrieved? How long before you get it back, if ever?

The acceptance of cloud-based operating environments has happened so quickly that many enterprises have not yet caught up to the implications of having terabytes of their information – often sensitive in nature – flowing over the open internet between complex applications.

In fact, there are a number of things that can go wrong, any of which can cause significant business disruption.

  • Poorly Designed (or Missing) APIs
    Most software vendors offer Application Programming Interfaces (APIs), but not all APIs are created equal. Some integrate with other applications better than others; some are more feature-rich, better (or worse) documented, unexpectedly deprecated, or incomplete. In short, API quality is inconsistent.
  • Loss of Data
    What happens when data does not make it from one application to another? How do you know? Is there adequate monitoring to confirm the right data arrived? If there is a problem, how do you determine the root cause? Is it an ongoing or intermittent problem?
  • The Wrong Data
    In many systems (often where humans are involved in the integration process), it is easy to send or receive incorrect data without it being flagged before it’s too late. The wrong payroll file, accidentally uploaded to an integration system and sent to a payroll processor, can have unforeseen and adverse consequences.
  • Compromised Security
    Businesses send their most confidential data through integrations. If security is not well designed, it creates significant risk. Personally Identifiable Information (PII) and Protected Health Information (PHI) carry additional regulatory security requirements (such as GDPR in Europe, PIPEDA in Canada, and HIPAA in the U.S.A.). Sending PII or PHI through unsecured or unencrypted integration methods can result in breaches of trust and compromised goodwill, along with enormous fines and penalties.
  • Data Corruption
    Data may reach its destination, but a poorly designed integration could corrupt part of a data file, or the entire file. This can cause the integration to fail, or pollute the downstream application with corrupt or incomplete data. Sometimes this happens silently: the integration appears successful, and the problem isn’t detected until the business has already been damaged.
  • Data Conversion
    Data from one application must often be converted in type and format to integrate with other applications. Integrations typically don’t simply move data from one application to another – they translate, enrich, combine, or break it into other data structures. Understanding the data models of each application and creating these translations requires specialized design and testing skills.
  • Transient Problems
    The internet, and the applications running on it, do not have 100% uptime. Transient failures happen constantly; they can take down one or more applications and temporarily disrupt communications. What happens if mission-critical data is being transmitted or received during such an event? This is why integrations need to be fault-tolerant, with guaranteed delivery and data integrity.
  • Lack of Monitoring to Flag Problems
    There is a dearth of tools available to monitor these data exchanges in real time. Monitoring is required so that problems are flagged immediately, minimizing errors and disruption to crucial business operations.
  • No Error Checking Mechanisms
    APIs might do basic error checking, but typically don’t include business-grade data validation to ensure the data being received is valid and meets predefined criteria.
  • Damaged Integrations Post Software Upgrades
    If a company has hundreds of cloud-based applications, you can bet that at any point in time one or more of them is being upgraded, reconfigured, or replaced. People often don’t consider the implications for downstream applications if the data type or format sent through an integration changes. Inadequate system-level regression testing and data quality assurance can cause unforeseen problems.
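
To make the “Data Corruption” risk above concrete: silent corruption can be caught by having the sender attach a digest of the payload and the receiver recompute it before accepting the data. The sketch below is a minimal, hypothetical illustration (the function names and envelope format are ours, not from any particular product), and it complements, rather than replaces, transport security such as TLS.

```python
import hashlib
import json

def package(payload: dict) -> dict:
    """Sender side: serialize the payload and attach a SHA-256 digest."""
    body = json.dumps(payload, sort_keys=True)
    return {"body": body, "sha256": hashlib.sha256(body.encode()).hexdigest()}

def accept(envelope: dict) -> dict:
    """Receiver side: recompute the digest and reject corrupted payloads."""
    body = envelope["body"]
    if hashlib.sha256(body.encode()).hexdigest() != envelope["sha256"]:
        raise ValueError("payload corrupted in transit")
    return json.loads(body)
```

With a check like this in place, a corrupted file fails loudly at the destination instead of silently polluting the downstream application.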
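
The fault tolerance called for under “Transient Problems” is commonly built from retries with exponential backoff. Here is a minimal sketch, assuming a `send` callable that raises on transient failure; the names and parameters are illustrative, not drawn from a specific iPaaS product.

```python
import time

def deliver_with_retry(send, payload, max_attempts=5, base_delay=1.0):
    """Attempt delivery, backing off exponentially between transient failures."""
    for attempt in range(1, max_attempts + 1):
        try:
            return send(payload)           # success: return the receipt
        except ConnectionError:            # treated here as transient
            if attempt == max_attempts:
                raise                      # give up and surface for alerting
            # wait 1s, 2s, 4s, ... before trying again
            time.sleep(base_delay * 2 ** (attempt - 1))
```

Real implementations add jitter, idempotency keys, and dead-letter queues on top of this pattern so that a brief outage never results in lost or duplicated records.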
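
Finally, the business-level validation that most APIs lack (see “No Error Checking Mechanisms” and “The Wrong Data” above) amounts to checking each record against predefined rules before a file is released. A minimal sketch, with hypothetical field names and rules; a real integration would derive these from the destination system’s data model:

```python
def validate_payroll_record(record: dict) -> list:
    """Return a list of validation errors (empty means the record is valid)."""
    errors = []
    for field in ("employee_id", "pay_period", "gross_pay"):
        if field not in record:
            errors.append("missing required field: " + field)
    if "gross_pay" in record:
        try:
            if float(record["gross_pay"]) < 0:
                errors.append("gross_pay must be non-negative")
        except (TypeError, ValueError):
            errors.append("gross_pay must be numeric")
    return errors

def validate_batch(records: list) -> dict:
    """Validate a whole file; flag the batch if any record fails."""
    failures = {}
    for i, rec in enumerate(records):
        errs = validate_payroll_record(rec)
        if errs:
            failures[i] = errs
    return {"ok": not failures, "failures": failures}
```

Rejecting a bad batch at this gate, with the failing records identified, is far cheaper than unwinding a wrong payroll run after the fact.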

Key Considerations

Today’s app ecosystems require high performance and high-security integrations. However, this is not always considered – businesses that want to deploy SaaS applications quickly to gain a competitive edge often underestimate the cost and time associated with integrating these systems.

Don’t expect the software vendors to solve this for you – they generally see this as your responsibility.

A new set of tools designed to address integration has emerged, generally referred to as integration Platform as a Service (iPaaS). But these platforms vary in performance, features, and the competence they require. And iPaaS tools won’t automatically monitor integration health and performance – that requires yet another toolkit.

In this operating environment, your data flows over the open internet. Security is paramount, especially for confidential data, PII, and PHI.

Cloud-to-cloud integration is still a relatively new discipline, and most software engineers are not taught the fundamentals needed to manage these integrations effectively. As a result, expertise is highly variable.

It Must Be Taken Seriously

There is no doubt that running mission-critical applications in the cloud is the way of the future. The benefits make it the right infrastructure for most enterprises.

However, this approach creates a significant data integration challenge. And as more applications are put into the cloud, creating the need for more integrations, the challenge only gets bigger and more complex.

This is something organizations must take seriously to maximize speed to market and reap the full value of business-critical systems. It must be addressed on an ongoing basis because, in a world of rapidly changing customer expectations and competitive pressures, new applications and integrations will be a constant.

In future articles, we will explain how to best deal with these challenges.

To find out more about how we can help secure your integrations, contact us today.


About Dispatch Integration:

Dispatch Integration exists to help organizations with complex and ever-changing needs integrate data and optimize end-to-end workflows between cloud-based, mission-critical applications. We make your cloud investments shine by adopting a “best-of-breed” approach while designing an app ecosystem best suited to your needs.

Read more from Dispatch Integration:

Cameron Hay

Cameron Hay is the CEO and co-founder of Dispatch Integration, a data integration and workflow automation company with offices across Canada. He is a seasoned technology company CEO, having worked at Unitron, a global leader in hearing healthcare technologies with sales in over 70 countries. He has an engineering degree from the University of Manitoba and an MBA from the Richard Ivey School of Business. Cameron began his career as an engineer at IBM and was also a Management Consultant at Cap Gemini.