How do you get systems like SAP or Pronto to exchange information in real-time with the other disparate systems in your organisation and use it to create value?

What are some possible strategies to solve this problem? Our Obzervr CTO, Marc Walter, takes a look at how information exchange methods have evolved and how integration has become open and real-time.

But first, what is integration and why is it important?

Integration is the exchange of information between two or more systems to fulfil a broader business process or transaction, as one.

So how do systems achieve this unified state of working as one?

Marc believes there are two main questions that integrating systems need to answer:

  1. How can you tell me what you’ve done?
  2. How can I tell you what I’ve done?

Let’s dive in to see what techniques can be utilised to achieve this exchange and how they might answer these fundamental questions.

Files

Traditionally, systems exchanged information via file export and import. A file format and data format are agreed up front and, on an agreed cadence, the file is produced and dropped into a file system.

These files could be exchanged using various transfer protocols such as sFTP or SMTP (Secure File Transfer Protocol or Simple Mail Transfer Protocol, in normal-person terminology) or shared manually on storage media such as 5.25" floppy disks, CDs, DVDs, thumb drives, etc.

Do you remember the days of floppy disks and when an 8GB USB was expensive? 

Nonetheless, once the file was in the required location, the system waiting for the information would be written to check that location for the file. If the file was present, the system would process it, reporting any problems with the file format, the data format or the validity of each data item in the file.

If all goes well, the file is processed, and the information in the file has taken the desired effect in the system.

This method is slow and error-prone, and it does not support the integration of a web of systems executing a broader business process. It remains a very popular integration method, however, and Obzervr supports integration via file exchange using either sFTP or SMTP.

How Secure File Transfer Protocol works

Importantly, the system with the least ability to solve the problem gets stuck with the problem.

Staging Databases

An intermediary database can also be used to facilitate the exchange of information between systems. This allows both systems to use SQL (short for 'Structured Query Language' and pronounced SEE-kwel) and ODBC (Open Database Connectivity; don't worry, I had to google it too) to perform the integration, letting the systems leverage the power of enterprise relational database systems and their rich features.

In this model, a source system will include a process that connects to the staging database and inserts data into the tables in the staging database (or calls stored procedures). 

An independent target system includes a process that connects to the same staging database and will periodically check the staging database for any changes made. When a change is found, the target system will read the data in the staging tables and take the necessary action.

How a Staging Database is like a Restaurant

The staging database is very similar to a restaurant’s kitchen. The staging area is where raw, source data is transformed into a target model of meaningful, presentable information. The staging area must be laid out and architected long before any data is extracted from the source. 

This method is slow but less error-prone (thanks to the validation and rules that relational databases can apply to the data). Like files, though, it requires the target system to constantly check for changes and leaves the target system responsible for handling errors that only the source system can rectify.

Moreover, the two systems cannot be fully informed without creating additional integration. After System A has provided data to the stage, it is not aware of how well System B has managed to process the data it has provided.

Message Queues

What is message queuing?

Message queues provide a slightly improved exchange technology over files and staging databases. Queues include infrastructure to keep two endpoints between two machines connected and available for message exchange.

Message queues have the added flexibility of being able to broadcast or publish a single message to multiple ‘subscribers’. This allows systems to publish one message but have that message received by multiple systems in parallel.

This is, in fact, very similar to subscribing to a newsletter. When you subscribe to a newsletter, every time a new issue is released, you will be sent an email with a copy. With message queues, when a new message is published, you will be notified with a copy of the message. 
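The publish/subscribe fan-out can be sketched with a toy in-memory broker. A real deployment would use a proper messaging product (RabbitMQ, an MQTT broker, or similar), but the shape of the exchange is the same: one publish, every subscriber gets a copy.

```python
from collections import defaultdict

class Broker:
    """A toy in-memory publish/subscribe broker."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a handler to receive every message published on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver one published message to every subscriber on the topic."""
        for handler in self._subscribers[topic]:
            handler(message)

# One message published, two independent systems notified:
broker = Broker()
received = []
broker.subscribe("work.completed", lambda m: received.append(("email-alerts", m)))
broker.subscribe("work.completed", lambda m: received.append(("erp-sync", m)))
broker.publish("work.completed", {"work_order": "WO-123"})
```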

In Obzervr, we use webhooks to publish notifications based on ‘commands’ or events in our system. This is handy when you want multiple people to receive an email alert when work is completed or if a plant reading is out of bounds. 
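A webhook receiver is just an HTTP endpoint that the publishing system calls when an event fires. Here is a minimal sketch using only the Python standard library; the payload fields and alert bounds are invented for illustration, not Obzervr's actual webhook schema.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

LOW, HIGH = 10.0, 90.0   # assumed alert bounds for a plant reading

def out_of_bounds(payload: dict) -> bool:
    """Decide whether a (hypothetical) reading event should trigger an alert."""
    reading = payload.get("reading")
    return reading is not None and not (LOW <= reading <= HIGH)

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        payload = json.loads(body)
        if out_of_bounds(payload):
            print(f"ALERT: {payload.get('asset_id')} reading {payload['reading']}")
        self.send_response(204)   # acknowledge quickly; do slow work elsewhere
        self.end_headers()

# To run the receiver:
# HTTPServer(("", 8080), WebhookHandler).serve_forever()
```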

Message queuing is not a new technology but has suffered from a lack of standards for messaging technologies, which has meant that systems could only integrate with other systems that have adopted the same messaging technology.

More recently, the Advanced Message Queuing Protocol (AMQP) and Message Queuing Telemetry Transport (MQTT) standards have gained widespread adoption, allowing systems to support message queue integration that is more broadly applicable.

Application Programming Interfaces (APIs)

Most recently, the use of real-time, standards-based APIs between internet-exposed systems over HTTPS (the protocol of the internet) has allowed the utopia of simplified integration between systems to finally materialise.

These APIs follow standards for documentation, protocols, formats and versioning that allow non-engineer (yet still tech-savvy) users to 'glue' APIs together from various systems using an array of rich, user-friendly internet services. Once 'glued' together, these data flows create a data value chain, allowing useful business processes to execute across system boundaries.

But what is an API and how does it work? It seems like a lot of integration can be explained with food or kitchens, so let's take a look using the restaurant analogy. It's a nice and simple way to understand some of the key concepts underpinning an API.

This tweet explains how an API call works a lot like a restaurant. You, as a customer, are not allowed into the kitchen of the restaurant (or the backend of the system i.e. a database).

When you come to the restaurant, you look at the menu (the documentation which explains what you can ask for) and make a request to the waiter. The waiter is the API. The waiter will go to the kitchen, request your food (i.e. smashed avo toast) and bring out your food when it’s ready (the response). However, if there is something wrong with your order, then the waiter will tell you it’s not available (the system will produce an error).
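The analogy even maps neatly onto code. Here is the 'waiter' as a tiny Python function, with HTTP-style status codes standing in for its answers. The menu is, of course, made up.

```python
# The menu is the documentation: it tells you what you're allowed to ask for.
MENU = {
    "smashed avo toast": "one smashed avo toast, coming up",
    "flat white": "one flat white, coming up",
}

def waiter(order: str) -> tuple[int, str]:
    """The API: take a request, fetch from the kitchen, or return an error."""
    if order in MENU:
        return 200, MENU[order]                       # success: the response
    return 404, f"sorry, no '{order}' on the menu"    # error: not available
```

You never see the kitchen (the database); you only ever talk to the waiter (the API), and you only get meaningful answers for things on the menu (the documentation).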

So why should both technical and non-technical business people care about APIs?

APIs are important for creating an integrated software ecosystem. Navigating a disparate set of tools is time-consuming and difficult. Technology companies need to provide the interfaces that allow communication between their systems so that their customers can obtain the required information in an instant to get their work done quickly and efficiently.

Tools like Slack and Asana have nailed this. When I create a task in Asana and assign it to a teammate, they get a notification in Slack. Why shouldn’t a similar seamless communication process happen between centralised enterprise software and other solutions?

Enter The API Economy

To more clearly illustrate the possibilities with APIs, I’ll describe a couple of scenarios and show the use of APIs to implement the solution.

Suppose an organisation has recently implemented Obzervr as their digital fieldwork solution. The platform is configured to allow mechanical and electrical tradespeople to inspect and complete service sheets for their fixed and mobile plant assets.

Suppose also that this organisation utilises an Enterprise Resource Planning (ERP) tool like Pronto, SAP, Microsoft Dynamics 365 or Oracle’s Enterprise Asset Maintenance solutions to manage the maintenance of fixed and mobile plant operations. These solutions track maintenance, repair, and overhaul services which deliver critical asset information across departments and improve equipment reliability.

Suppose further that this organisation would like data to be ‘pushed into’ and ‘pulled out of’ the ERP. They would like instant text message and email alerts for anomalous plant readings. When they complete work in Obzervr, they also want to mark the work as Done in the ERP without actually entering the system.

Using traditional integration technology, this organisation would need to employ the services of a systems integrator (SI) to develop bespoke integrations that use either files, message queues or staging databases to share data. They would also build other middleware to coordinate, transform and manage the exchange of information between these systems. This process can take months to years and cost tens of thousands of dollars to define, implement, test and commence operation.

Today, technology works a bit differently. We use public APIs to exchange information because contemporary technology providers realised the value was not in the one-off consulting to allow System A only to talk to System B, but in the value achieved if System A could talk to System B, C or D without bespoke integration work. We call this the data value chain.

Modern systems which have developed public APIs allow the user to sign up for an account at one of many SaaS (Software as a Service) API coordinators such as IFTTT[1], Zapier[2], Microsoft Flow[3], Microsoft Logic Apps[4], Amazon SWF[5] or Dell Boomi[6]. Using these services, the user follows a few simple steps to connect the APIs of these systems and start integrating. The data value chain linkage is ready to provide value.
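To make the scenario above concrete: marking a completed work order as Done in the ERP could be a single HTTPS call. The endpoint URL, payload shape and auth header below are purely illustrative; a real ERP documents its own API in its developer reference.

```python
import json
import urllib.request

ERP_BASE = "https://erp.example.com/api/v1"   # hypothetical ERP endpoint

def mark_work_order_done(work_order_id: str, token: str) -> urllib.request.Request:
    """Build the HTTPS request that flags a work order as Done in the ERP.

    Everything here (path, verb, payload, bearer auth) is an assumption
    standing in for whatever the real ERP's API specifies.
    """
    body = json.dumps({"status": "Done"}).encode()
    return urllib.request.Request(
        f"{ERP_BASE}/work-orders/{work_order_id}",
        data=body,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Sent when the field worker completes the job:
# req = mark_work_order_done("WO-123", token)
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```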

Obzervr Puts Integration First

It is for these reasons Obzervr has designed and built numerous API technologies to tell you what it has done, and so you can tell us what you have done. Obzervr is API-first, meaning it was created with integration in mind: it expects to share its data and events with the outside world and is ready to participate in complex integration scenarios.

Obzervr integrates with both SAP and Pronto Enterprise Asset Maintenance solutions. Our customers needed it, so we delivered. Their Work Orders are pushed directly into Obzervr when they are triggered as In Progress. When the work is completed in Obzervr Capture (our mobile app), it is pushed back. Part of this work also includes the communication of work requests, materials and timesheets, all vital components of a Work Order.

As your business undergoes its digital transformation journey, integration will inevitably pop up on your radar. It will become increasingly important to gain a business-wide view of your operations. So, now that you understand information exchange, how is your business integrating its software solutions? Do you use APIs? Let us know in the comments below.

What’s next?

Imagine a world in sync, a world that works as one connected through smarter data, empowering not just you, but everyone, wherever you are. This is what the team at Obzervr work toward every day. 

Join our journey by subscribing to our Newsletter at www.obzervr.com or request a demo at info@obzervr.com.


[1] IFTTT: https://ifttt.com/
[2] Zapier: https://zapier.com/
[3] Microsoft Flow: https://flow.microsoft.com/
[4] Microsoft Logic Apps: https://azure.microsoft.com/services/logic-apps/
[5] Amazon SWF: https://aws.amazon.com/swf/
[6] Dell Boomi: https://www.boomi.com/
