Factors To Consider Before Buying A Data Integration Solution

It is often tempting to think of "integration" solely in terms of the latest hype, whether it's electronic procurement, Web services, or technology such as XML. Yielding to this temptation, however, overlooks broad categories of business processes and problems that require sharing of data among the underlying applications.

Even before the industry began focusing on application integration, business process integration, and extending business processes to external constituencies, many organizations had already begun to take steps in those directions. Often, the "integration" was very much at arm's length: a company outsourcing its payroll to an independent payroll processor might send magnetic tapes containing new payroll data for each pay period. The company's HR systems were "integrated" with the payroll processor's check preparation systems by transferring files of data. However crude the mechanism appears by today's standards, this was still an early form of data integration.

As computer networks became ubiquitous and network connections between organizations became more cost-effective, more sophisticated forms of data integration became necessary. Automated sharing of data was rapidly recognized as the way to improve both productivity and cost efficiency for integrating applications and processes. The Internet further drove down the costs of automated file transfer by eliminating the need for leased lines or dedicated networks between organizations.

While much of the literature on application and B2B integration focuses on specific processes such as procurement, supply chain management (SCM), customer relationship management (CRM) or Web commerce, there are many other classes of business activity that require some level of application integration. Many of these activities are particular to specific verticals, such as the transfer of ACH or ATM data to back-end processing systems in financial services, or the exchange of claims processing data between medical providers and insurers in the healthcare industry.

Now that organizations are realizing the benefits of integrating data, certain requirements must be met to make integration a success. Whether internal or external, data integration requires a robust infrastructure for data delivery.

Requirements for Data Integration

Infrastructure for application integration comprises three main layers: data transport interface, data exchange services, and user/application interface.

Figure 1 provides a conceptual view of the elements of such an infrastructure.

The lowest layer is the physical network transport and the data flow protocols at this level, such as HTTP, OFTP, and FTP.

The Data Transport Interface layer links the integration infrastructure software to the physical transport layer. It establishes the communication links between the two participants in a data transmission, and manages all elements of data flow.
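To make the role of this layer concrete, the sketch below shows how integration software might dispatch an outbound payload to the appropriate protocol handler. This is an illustrative sketch only, not any product's actual API; the handler functions are hypothetical stand-ins for real HTTP and FTP sessions.

```python
def send_via_http(payload: bytes) -> str:
    # A real handler would POST the payload to the partner's URL
    return "sent over HTTP"

def send_via_ftp(payload: bytes) -> str:
    # A real handler would open an FTP session and issue a STOR command
    return "sent over FTP"

# The transport interface binds the integration software to a concrete
# protocol from the physical transport layer.
TRANSPORTS = {"http": send_via_http, "ftp": send_via_ftp}

def transmit(protocol: str, payload: bytes) -> str:
    """Route a payload to the handler for the chosen transport protocol."""
    try:
        return TRANSPORTS[protocol](payload)
    except KeyError:
        raise ValueError(f"unsupported transport: {protocol}")
```

In practice, this separation lets new protocols be added to the infrastructure without changing the applications that hand data to it.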

Data Exchange Services address requirements associated with data management.

These services include:

  • File system interfaces, such as file format conversion
  • Physical media interfaces
  • Database extraction and insertion
  • Data transformation, including code page management and character set conversion
  • Storage into and retrieval from a collection repository
  • Data compression
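Two of these services, code page conversion and data compression, can be sketched in a few lines. The example below is a minimal illustration using Python's standard library, assuming a sender on an ASCII/Unicode platform and a receiver expecting an EBCDIC code page (cp500); the function names are hypothetical.

```python
import zlib

def prepare_payload(text: str, target_encoding: str = "cp500") -> bytes:
    """Convert the character set (here, to an EBCDIC code page) and compress."""
    converted = text.encode(target_encoding)  # code page conversion
    return zlib.compress(converted)           # data compression

def restore_payload(data: bytes, encoding: str = "cp500") -> str:
    """Reverse the pipeline on the receiving side."""
    return zlib.decompress(data).decode(encoding)
```

A real data exchange service would wrap many such steps (format conversion, database insertion, repository storage) into a configurable pipeline rather than hard-coding them.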

The User/Application Interface enables administrators, end-users, or applications to access the services of the infrastructure. Administrators access the system for configuration, management, security and monitoring purposes.

At all levels, the infrastructure requires:

  • Security features that can be tailored to align with organizational security policies
  • Management services for configuration and management of the infrastructure and for defining and managing the internal and external constituencies using the system for data exchange
  • Automation for unattended, 24/7/365 operations
  • Automatic failover and recovery in high-availability environments for business-critical and time-sensitive data
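The last two requirements above, unattended operation and automatic recovery, are often built on a retry-with-backoff pattern: a failed transfer is retried automatically, with growing delays, before an operator is ever involved. A minimal sketch of that building block (illustrative only, not a specific product's mechanism):

```python
import time

def transfer_with_retry(send, payload, attempts: int = 3, backoff: float = 1.0):
    """Retry a failed transfer with exponential backoff.

    `send` is any callable that raises ConnectionError on failure.
    The final failure is re-raised so monitoring can alert an operator.
    """
    for attempt in range(attempts):
        try:
            return send(payload)
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * (2 ** attempt))  # 1x, 2x, 4x, ...
```

High-availability deployments layer failover on top of this, redirecting the retry to a standby endpoint rather than the failed one.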

Understanding infrastructure requirements will help an organization move to the deployment stage. For a successful deployment, however, IT organizations must link those requirements to the organization's business processes and needs.

Deploying a Data Integration Solution

A successful integration software infrastructure deployment depends on a variety of factors:

  • Internal integration requirements, including:
    - Relative centralization or distribution of internal business processes
    - Location of data stores within the organization
    - Current application deployment on centralized or distributed systems
    - Changes in business conditions, such as the acquisition of a company or business unit
  • External integration requirements, including:
    - Nature of the business relationship with the external constituency
    - Types of shared business processes
    - Data exchange standards imposed by the external organization
    - Technology available at the external site
  • Organizational security policies
  • Cost considerations, both for initial deployment and for long-term operation

Internal Integration

Deployment for internal integration typically involves the use of local area networks (LANs), wide area networks (WANs), and channel-connected platforms: any form of connectivity that is fully within the virtual four walls of the organization. In organizations with a significant number of remote locations (such as retail or franchise operations), connectivity between remote and central sites may also be achieved through less direct means, such as dial-up or Internet-based connections. Integration may require data to move over any of the internal networks, between mainframe or midrange servers, from desktops to servers, or any combination of these.

External Integration in a Controlled Environment

Integration with external constituencies may occur in a highly controlled environment. This may be the case when the parties have strong, well-established partnerships, when one or more of the parties impose standards for data exchange, or when data is exchanged in high volumes and at high frequency. It may also occur when the business processes or the data are sufficiently time-sensitive or business-critical to require dedicated connections. In some cases, it may be an artifact of connectivity and implementation established before the Internet had proven itself a reliable, cost-effective alternative to dedicated lines.

Accommodating All Integration Requirements

As business processes are extended to wider external constituencies, the degree of control that can be exercised drops significantly. This is particularly true when integration extends to smaller partners, consumer-like customers, or constituencies with whom data exchange is infrequent. Furthermore, as integration expands within an organization, more attention is focused on the most cost-effective solutions, adding the Internet to the options for connectivity.


Organizations seek to integrate applications and information to gain a global competitive advantage: realizing new sources of revenue while increasing profitability. Streamlining internal and external business processes enables organizations to be more adaptive in penetrating new markets while improving productivity and cost efficiency. Yet application integration and business process integration are complex problems requiring multifaceted solutions that depend on the industry and the business objective. The solution chosen should be scalable and adaptive, enabling an organization to implement as simple or as complex an infrastructure as necessary to meet business demands.

About the Author

Terry Noreault is Senior Vice President, Integration Software for Sterling Commerce. Noreault oversees the company's e-business integration software, including Gentran Integration Suite. His responsibilities include engineering, product management and product marketing.

Noreault has 18 years of experience in developing and delivering leading-edge software systems. Prior to joining Sterling Commerce in 2000, Noreault served as Vice President of Research for the Online Computer Library Center (OCLC), where he worked for 15 years in a variety of engineering positions. During his tenure, he managed the development of text search engines for platforms ranging from PCs to mainframes, as well as online systems that supported thousands of users from libraries in 68 countries.

Noreault holds a bachelor's degree in computer science from the State University of New York at Oswego and a doctorate in information transfer from Syracuse University.
