Modernizing an integration solution
This section uses an example to illustrate how an existing integration solution that has
grown over time can be modernized using SOA methods and the scenarios described earlier.
The example is a simplified version of a specific customer project in which an existing
solution was modernized with the help of SOA.
The task of the integration solution is to forward orders entered in the central ERP system
to the external target applications.
The current solution is primarily based on a file transfer mechanism that sends the new
and modified orders at intervals to the relevant applications, in the form of files in two
possible formats (XML and CSV). The applications are responsible for processing the files.
At a later date, another application (IT app in the following diagram) was added to the
system using a queuing mechanism, because this mechanism allows for the guaranteed
exchange of messages with the application: new orders are read, and the corresponding
messages are sent through the queue within a single transaction.
The following diagram shows the initial situation before the modernization process took
place:

[Image: 1049EN_04_21.png]
The extraction and file creation logic is written in PL/SQL. A Unix shell script is used to
send the files through the File Transfer Protocol (FTP), as no direct FTP call was
possible in PL/SQL. Both a shell script and the PL/SQL logic are responsible for
orchestrating the integration process.
Oracle Advanced Queuing (AQ) is used as the queuing infrastructure. As PL/SQL
supports sending of AQ messages through an API (package), it was possible to
implement this special variant of the business case entirely in PL/SQL, without a call to a
shell script being needed. In this case, the integration is bi-directional. This means that
when the order has been processed by the external system, the application must send a
feedback message to the ERP system. A second queue, which is implemented in the
integration layer using PL/SQL, is used for this purpose.
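The bi-directional exchange described above can be sketched in Python, using in-process `queue.Queue` objects as stand-ins for the two Oracle AQ queues; the function names and message shapes are illustrative assumptions, not the actual AQ API:

```python
import queue

# Stand-ins for the two AQ queues described above; queue.Queue is only an
# in-process illustration, not a real Oracle AQ client.
send_queue = queue.Queue()     # integration layer -> IT app (send)
receive_queue = queue.Queue()  # IT app -> ERP system (receive / feedback)

def send_order(order_id, payload):
    """Integration layer: enqueue a new order for the external application."""
    send_queue.put({"order_id": order_id, "payload": payload})

def it_app_worker():
    """External IT app: consume one order and confirm it on the feedback queue."""
    msg = send_queue.get()
    # ... process the order ...
    receive_queue.put({"order_id": msg["order_id"], "status": "PROCESSED"})

def receive_confirmation():
    """Integration layer: pick up the feedback message for the ERP system."""
    return receive_queue.get()

send_order(4711, "<order>...</order>")
it_app_worker()
confirmation = receive_confirmation()
```

In the real solution, the enqueue and the read of the new order run in one database transaction, which is what gives the guaranteed delivery the file-based variant lacks.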
Sending new orders
The job scheduler triggers an event every 30 minutes for each external system
that has to be integrated.
- The event triggered by the job scheduler starts a shell script, which is
responsible for part of the orchestration.
- The shell script first starts a PL/SQL procedure that creates the files, or
writes the information to the queue.
- The PL/SQL procedure reads all the new orders from the ERP system’s
database, and enriches them with additional information about the product
ordered and the customer.
- Depending on the external target system, a decision is made as to whether the
information about the new order should be sent in the form of files, or
messages in queues.
- The target system can determine in which format (XML or CSV) the file
should be supplied. A different PL/SQL procedure is called depending on the
format.
- The PL/SQL procedure writes the file in the appropriate format using a
PL/SQL tool (in other words, the built-in package UTL_FILE) to the database
server. The database server is used only for interim storage of the files, as
these are uploaded to the target systems in the next step.
- The main shell script starts the process of uploading the files to the external
system, and another shell script completes the task.
- The files are made available on the external system and are processed in
different ways depending on the application in question.
- A PL/SQL procedure is called to send the order information through the
queue. The procedure is responsible for formatting and sending the message.
- The document is now in the output queue (send) ready to be consumed.
- The application (IT app) consumes the messages from the queue immediately
and starts processing the order.
- When the order has been processed, the external application sends a message
to the feedback queue (receive).
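The dispatch logic of the sending steps above can be sketched as a minimal Python illustration of the PL/SQL orchestration; the target table, field names, and formats are assumptions, not the customer's actual configuration:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical per-target configuration: transport and preferred file format.
TARGETS = {
    "GE app": {"transport": "file", "format": "csv"},
    "CH app": {"transport": "file", "format": "xml"},
    "IT app": {"transport": "queue"},
}

def enrich(order):
    # Stand-in for the PL/SQL enrichment query (product and customer lookup).
    return dict(order, product_name="...", customer_name="...")

def to_xml(orders):
    # One procedure per file format, as in the original solution.
    root = ET.Element("orders")
    for o in orders:
        item = ET.SubElement(root, "order", id=str(o["id"]))
        ET.SubElement(item, "product").text = o["product_name"]
    return ET.tostring(root, encoding="unicode")

def to_csv(orders):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "product_name", "customer_name"])
    writer.writeheader()
    for o in orders:
        writer.writerow({k: o[k] for k in ("id", "product_name", "customer_name")})
    return buf.getvalue()

def dispatch(target, new_orders):
    """Decide, per target system, between the file and the queue variant."""
    cfg = TARGETS[target]
    orders = [enrich(o) for o in new_orders]
    if cfg["transport"] == "queue":
        return "queue", orders          # handed to the AQ send procedure
    formatter = to_xml if cfg["format"] == "xml" else to_csv
    return "file", formatter(orders)    # written via UTL_FILE, then uploaded by FTP

kind, payload = dispatch("CH app", [{"id": 1}])
```

The actual solution spreads this decision over a shell script and several PL/SQL procedures; collapsing it into one function makes the branching easier to see.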
Receiving the confirmation
The job scheduler triggers an event every 15 minutes.
- The job scheduler event starts a PL/SQL procedure, which processes the
confirmation messages.
- The message is consumed from the feedback queue (receive).
- A SQL UPDATE command updates the status of the order in the ERP database.
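A minimal sketch of this scheduled confirmation job, with a dict standing in for the ERP orders table and `queue.Queue` for the feedback queue (both purely illustrative):

```python
import queue

# In-memory stand-ins for the feedback queue and the ERP orders table.
feedback_queue = queue.Queue()
erp_orders = {4711: {"status": "SENT"}}

def process_confirmations():
    """Job body: drain the feedback queue and update the order status,
    mirroring the PL/SQL procedure and its SQL UPDATE."""
    while not feedback_queue.empty():
        msg = feedback_queue.get()
        # Equivalent of: UPDATE orders SET status = :status WHERE order_id = :id
        erp_orders[msg["order_id"]]["status"] = msg["status"]

feedback_queue.put({"order_id": 4711, "status": "PROCESSED"})
process_confirmations()
```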
Evaluation of the existing solution
By evaluating the existing solution, we came to the following conclusions:
- This is an integration solution that has grown up over time using a wide
variety of different technologies.
- It is a batch solution, which makes real-time integration impossible, or at
least very difficult.
- Exchanging information in files is not really a state-of-the-art solution.
- Data cannot be exchanged reliably, as FTP does not support transactions.
- Error handling and monitoring are difficult and time-consuming.
(It's not easy to determine whether the IT app has failed to send a
feedback message.)
- Files must be read and processed by the external applications, all
of which use different methods.
- Integrating new distribution channels (such as web services) is difficult, as
neither PL/SQL nor shell scripts are the ideal solution in this case.
- Many different technologies are used. The integration logic is distributed,
which makes maintenance difficult:
- Job scheduler (for orchestration)
- PL/SQL (for orchestration and mediation)
- Shell script (for orchestration and mediation)
- Different solutions are used for files and queues.
Many of these disadvantages are purely technical. From a business perspective, only the
first disadvantage represents a real problem. The period of a maximum of 30 minutes
between the data being entered in the ERP system, and the external systems being
updated, is clearly too long. From a technical point of view, it is not possible to reduce
this amount of time, as the batch solution overhead is significant and, in the case of
shorter cycles, the total overhead would be too large.
Therefore, the decision was made to modernize the existing integration solution and to
transform it into an event-driven, service-oriented integration solution based on the
processing of individual orders.
Modernizing—integration with SOA
The main objective of the modernization process, from a business perspective, is the
real-time integration of orders.
From a technical standpoint, there are other objectives, including the continued use of the
batch mode through file connections. This means that the new solution must completely
replace the old one, and the two solutions should not be left running in parallel. A further
technical objective is that of improved support as a result of the introduction of a
suitable monitoring solution.
On the basis of these considerations, a new SOA-based integration architecture was
proposed and implemented, as shown in the following diagram:
[Image: 1049EN_04_22.png]
Each new order is published to a queue in the ERP database, using the change
data capture functionality of the ERP system.
- The business event is consumed from the queue by an event-driven consumer
building block in the ESB. The corresponding AQ adapter is used for this
purpose.
- A new BPEL process instance is started for the integration process. This
instance is responsible for orchestrating all the integration tasks for each
order.
- First, the important order information concerning the products and the
customer must be gathered, as the ERP system only sends the primary key
for the new order in the business event. A service is called on the ESB that
uses a database adapter to read the data directly from the ERP database, and
compiles it into a message in canonical format.
- A decision is made about the system to which the order should be sent, and
about whether feedback on the order is expected.
- In the right-hand branch, the message is placed in the existing output queue
(send). A message translator building block converts the order from the
canonical format, to the message format used so far, before it is sent. The AQ
adapter supports the process of sending the message. The BPEL process
instance will be paused until the callback from the external applications is
received.
- The message is processed by the external application in the same way as
before. The message is retrieved, the order is processed and, at a specified
time, a feedback message is sent to the feedback queue (receive).
- The paused BPEL process instance is reactivated and consumes the message
from the feedback queue.
- An invoke command is used to call another service on the ESB, which
updates the order status in the ERP system in a similar way to the current
solution. This involves a database adapter making direct modifications to a
table or record in the ERP database.
- In the other case, which is shown in the branch on the left, only a message is
sent to the external systems. Another service is called on the ESB for this
purpose, which determines the target system and the target format based on
some information passed in the header of the message.
- The ESB uses a header-based router to support the content-based forwarding
of the message.
- Depending on the target system, the information is converted from the
canonical format to the correct target format.
- The UK app already has a web service, which can be used to pass the order
to the system. For this reason, this system is connected via a SOAP adapter.
- The two other systems continue to use the file-based interface. Therefore, an
FTP adapter creates and sends the files through FTP in XML or CSV format.
- In order to ensure that the external application (labeled GE app in the
diagram) still receives the information in batch mode, with several orders
combined in one file, an aggregator building block is used. This collects the
individual messages over a specific period of time, and then sends them
together in the form of one large message to the target system via the FTP
adapter.
- An aggregation process is not needed for the interface to the other external
application (labeled CH app in the image), as this system can also process a
large number of small files.
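The mediation building blocks named above (header-based router, message translators, and the aggregator for the GE app) can be sketched like this; all names, headers, and formats are hypothetical, and the real aggregator collects over a time window rather than by message count:

```python
def translate_uk(order):
    # UK app: order passed to its existing web service (illustrative payload).
    return {"channel": "soap", "body": order}

def translate_ch(order):
    # CH app: one small XML file per order, sent through the FTP adapter.
    return {"channel": "ftp", "body": f'<order id="{order["id"]}"/>'}

class Aggregator:
    """Collects individual messages and releases them as one batch.
    Here the release condition is a message count, for testability;
    the real building block releases after a period of time."""
    def __init__(self, size):
        self.size, self.pending = size, []
    def add(self, msg):
        self.pending.append(msg)
        if len(self.pending) == self.size:
            batch, self.pending = self.pending, []
            return {"channel": "ftp", "body": batch}  # one large file
        return None  # still collecting

ge_aggregator = Aggregator(size=3)

def route(message):
    """Header-based router: the 'target' header selects the outbound path."""
    target = message["header"]["target"]
    if target == "UK app":
        return translate_uk(message["order"])
    if target == "CH app":
        return translate_ch(message["order"])
    if target == "GE app":
        return ge_aggregator.add(message["order"])
    raise ValueError(f"unknown target: {target}")

out = route({"header": {"target": "CH app"}, "order": {"id": 7}})
```

Because the router and translators sit in the mediation layer, a new target system only needs a new branch and translator here; the orchestration above it is untouched.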
Evaluation of the new solution
An evaluation of the new solution shows the following benefits:
- The orchestration is standardized and uses only one technology.
- One BPEL instance is responsible for one order throughout the entire
integration process.
- This simplifies the monitoring process, because the instance
continues running until the order is completed; in other words, in
one of the two cases until the feedback message from the
external system has been processed.
- The orchestration is based only on the canonical format. The target system
formats are generated at the last possible moment in the mediation layer.
- Additional distribution channels can easily be added on the ESB,
without having to modify the orchestration process.
- The solution can easily support other protocols or formats that
are not yet known, simply by adding an extra translator building block.