Thursday, 29 June 2017

MongoDB Jolie Connector Part 3 : ObjectID creation and cross collections handling

Over the last year I have been working on the development of a MongoDB connector for Jolie (the first language for microservices). In my previous two posts I looked at the basic CRUD operations and how to program them with Jolie.
In this post I am going to look at the handling of ObjectIDs and cross-collection relationships. The two topics are closely related, as MongoDB uses ObjectIDs to cross-reference documents between collections.

Assigning an ObjectID to a document and adding an ObjectID reference to a document.

ObjectID is a MongoDB-specific data type that is not native to Jolie; therefore, as in previous cases where there was no direct correspondence between a MongoDB data type and a Jolie data type, the following semantic solution was adopted.


The node "@type" signal to the connector that it needs to handle the value of the node "_id" as a ObjectID. The node "_id" it is a specific node name that will trigger the the assignment of a unique ObjectID to the document as shown in the following example 


The response from the insert operation will return a structure identical to that used to insert the ObjectID.
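Hypothetically, continuing the sketch above, the generated value could be read back like this (node names are assumptions):

newId = response._id                // the ObjectID value assigned by MongoDB
idType = response._id.("@type")     // "ObjectID"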


Of course one should carefully consider the opportunity of setting an ObjectID to a custom value, given all the drawbacks connected with guaranteeing the uniqueness of that value.
If the "_id" node is not defined, the insert operation will return the auto-generated value, which can then be used to cross-reference documents.
In a similar manner we can set one of the document fields as an ObjectID to reference another document, as shown in the code below.
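A sketch of what such an insert could look like (collection and field names are illustrative assumptions):

q.collection = "CartData";
q.document.customerId = customerId;            // ObjectID returned by a previous insert
q.document.customerId.("@type") = "ObjectID";
q.document.cartStatus = "OPEN";
insert@MongoDB(q)(response)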


The resulting document looks like this:
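In a hypothetical rendering (field names as assumed in the sketch above):

{
  "_id" : ObjectId("..."),
  "customerId" : ObjectId("..."),
  "cartStatus" : "OPEN"
}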


We can see that the resulting document contains the auto-generated "_id" and the reference to another object, which is the desired result. It is also clear that at any time we can update the document, adding further document references, as shown in the following code.
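A sketch of such an update, appending a further reference with MongoDB's $push (the field names and the "@type" tag are assumptions, modelled on the update pattern of Part 1):

q.collection = "CartData";
q.filter = "{_id: '$cartId'}";
q.filter.cartId = cartId;                      // ObjectID of the document to update
q.filter.cartId.("@type") = "ObjectID";
q.documentUpdate = "{$push:{productRef:'$productId'}}";
q.documentUpdate.productId = productId;        // ObjectID of the referenced document
q.documentUpdate.productId.("@type") = "ObjectID";
update@MongoDB(q)(response)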



The document now looks like this:
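Again hypothetically:

{
  "_id" : ObjectId("..."),
  "customerId" : ObjectId("..."),
  "cartStatus" : "OPEN",
  "productRef" : [ ObjectId("...") ]
}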


Cardinality of the relationship

When approaching MongoDB from a SQL perspective, the first question that came to my mind was "what about the schema?". In the excellent post "6 rules of thumb for MongoDB schema design" the author goes back to basics, identifying three types of possible relationships:

  1. One to Few
  2. One to Many
  3. One to Squillions
The first is not strictly pertinent to the topic of this post, but it goes without saying that One to Few can easily be modelled with an embedded document.
The remaining two are more pertinent, and I will try to demonstrate how to code these relationships with Jolie.
My example contains three collections that model a small part of an e-commerce solution:
  • CustomerData
  • CartData
  • SearchData
The CustomerData document ( see below ) will contain an array field cartHistory holding the references to CartData documents.
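A hypothetical CustomerData document (all field names other than cartHistory are illustrative assumptions):

{
  "_id" : ObjectId("..."),
  "name" : "Lars",
  "surname" : "Larsesen",
  "cartHistory" : [ ObjectId("..."), ObjectId("...") ]
}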
   


The document for CartData ( see below ) also contains a reference to a fourth collection, ProductData, which will not be covered in this post.
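A hypothetical CartData document (customerId and productRef are illustrative names for the references):

{
  "_id" : ObjectId("..."),
  "customerId" : ObjectId("..."),
  "productRef" : [ ObjectId("..."), ObjectId("...") ]
}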


In both cases the documents are modeling a One to Many relationship. The SearchData document ( see below ) acts as a log of all the search activity, recording the objects visited by the customer.
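A hypothetical SearchData document, one per logged visit (field names are illustrative assumptions):

{
  "_id" : ObjectId("..."),
  "customerId" : ObjectId("..."),
  "visitedObject" : ObjectId("..."),
  "date" : ISODate("...")
}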


The SearchData document is designed to model One to Squillions, or better, Squillions to One.

One to Many


We can imagine a situation whereby, at the end of the purchase, the CustomerData document is updated by adding the ID of the concluded cart. The code below shows how this could be done in Jolie.
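A minimal sketch of that update, following the update pattern shown in Part 1 (the filter, the field names and the ObjectID "@type" tag are assumptions):

q.collection = "CustomerData";
q.filter = "{_id: '$customerId'}";
q.filter.customerId = customerId;              // ObjectID of the customer document
q.filter.customerId.("@type") = "ObjectID";
q.documentUpdate = "{$push:{cartHistory:'$cartId'}}";
q.documentUpdate.cartId = cartId;              // ObjectID of the concluded cart
q.documentUpdate.cartId.("@type") = "ObjectID";
update@MongoDB(q)(response)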


Of course this code is simplified and does not consider any exception compensation or business logic, but it aims to show how it is possible to manipulate MongoDB ObjectIDs across collections using Jolie.


One to Squillions

Although the data model used here is probably not the most correct way to keep track of the navigation history, it is a simple example that gives an idea of how to handle this problem in Jolie.
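A sketch of logging one visit into SearchData (again, the field names are illustrative assumptions):

q.collection = "SearchData";
q.document.customerId = customerId;            // ObjectID of the customer
q.document.customerId.("@type") = "ObjectID";
q.document.visitedObject = productId;          // ObjectID of the visited object
q.document.visitedObject.("@type") = "ObjectID";
q.document.date = currentTime;
q.document.date.("@type") = "Date";
insert@MongoDB(q)(response)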


Of course one should now consider how to search the data of this potentially enormous collection, and how efficient such a search may be, but that is not the topic of this post.

Wednesday, 3 May 2017

Implementing an IT solution for a customer loyalty program using a microservices approach

General view

We are all familiar with the concept of a loyalty program, and probably we are all members of at least one or two of these programs. Designing and implementing a reliable and flexible IT solution that handles such a program may turn out to be more complex than expected.
In recent months I have had to coordinate a small team of developers in the design and development of such a solution for the company I work for.
The company operates in the media industry, with a focus on newspapers, both print and online.
From the start it was evident that the implementation of this project presented the following constraints:
  • It needed to operate in a well-established IT ecosystem
  • It needed to be capable of adapting itself to the new marketing demands
  • It needed to be able to communicate and exchange data with third party software
It is my opinion that the best way to satisfy all three constraints is a microservices approach; the resulting microservices-based architecture is described by the following diagram:




External API Service

This service provides access to data regarding the loyalty program, marketing promotions and other customer management operations. The API has been published as a REST API, REST being a simpler methodology of interaction with third-party software.

WEB Server

The solution provides the user with a website where to check his/her loyalty program status and interact with the marketing department. The instance of the WEB server was generated from the Jolie Leonardo WEB Server template and evolved to fit the project requirements.

Integration with a third-party Web CMS

One of the project requirements was the integration with the Applix Polopoly WEB CMS: the customer web portal had to be embedded in a standard web article of our online newspaper.
The solution adopted was to split the technology stack in two: all the presentational files (HTML and CSS) were deployed on Polopoly, while the JavaScript was provided by the Jolie WEB Server.
It was also necessary to modify the HTTP response message headers to handle access control issues, adding the HTTP port configuration parameters shown in the following code.


Protocol: http {
  .osc.OperationName.response.headers.("Access-Control-Allow-Origin") -> SiteOrigin;
  .osc.OperationName.response.headers.("Access-Control-Allow-Headers") = "Content-Type, Access-Control-Allow-Headers, Authorization, X-Requested-With";
  .osc.OperationName.response.headers.("Access-Control-Allow-Credentials") = "true"
}

SiteOrigin is a variable that should be set to the location of the caller.

Integration with an Authentication Service Provider 

In order to provide a secure and reliable authentication process, Auth0 was identified as our authentication provider. Auth0 provides social and Gmail integration and several other interesting features.
Auth0 requires the client WEB Server to publish a REST callback operation to complete the login or registration process.
As in the case of the WEB CMS integration, it was necessary to define specific HTTP response header values to handle access control issues and cookie policies, adding the HTTP port configuration parameters shown in the following code.

Protocol: http{
   .osc.callback.response.headers.("Set-Cookie")->sid;
   .osc.callback.response.headers.("Access-Control-Allow-Origin") -> Auth0Site;
}


The variable sid is passed via the response header and allows setting a value for the cookie that will be used by the WEB Server to provide user-dependent data.
The cookie is recovered and piped into the operation request value using the following code:

Protocol: http{
 .osc.OperationName.cookies.AuthCode = "sid"
}

This frees the frontend developer from having to pass the AuthCode as a value in the calls, and it also allows the backend developer to manage the use of cookies at port level and at operation level.

Technology stack 

The technology stack used in the web solution is the following

  • HTML5 /CSS
  • jQuery 
  • Jolie (server side)
The interaction between the HTML page and the backend operations is achieved via AJAX calls. The implementation of all the jQuery functions and of the operations' request and response types is generated automatically by a tool.
The asynchronicity of AJAX is handled using an event-driven model: the tool includes in the generated code a specific event for each operation.
   
function operationName(value) {
    $.ajax({
        url: '/operationName',
        dataType: 'json',
        type: 'POST',
        contentType: 'application/json;charset=UTF-8',

        success: function(response) {
            if (typeof(response.error) == 'undefined') {
                // parse the JSON payload into the generated response class
                var responseObj = new OperationNameResponse();
                responseObj.parseJson(response);
                // notify listeners that the operation has completed
                var event = new CustomEvent('EventOperationNameResponse');
                event.data = responseObj;
                document.dispatchEvent(event);
            } else {
                // application-level error returned by the server
                var eventError = new CustomEvent('EventGeneralError');
                eventError.data = response;
                document.dispatchEvent(eventError);
            }
        },

        error: function() {
            // transport-level (HTTP) error
            var eventErrorHttp = new CustomEvent('EventGeneralHttpError');
            document.dispatchEvent(eventErrorHttp);
        },
        data: JSON.stringify(value.toJson())
    });
}


In this way the frontend programmer will be able to work with the operations published by the WEB Server by calling JavaScript functions or class methods:


function handler_EventOperationNameResponse(e) {
    var operationNameResponse = e.originalEvent.data;
    /* insert your code here */
}

$(document).ready(function() {
    $(document).on('EventGeneralError', handler_EventGeneralError);
    $(document).on('EventOperationNameResponse', handler_EventOperationNameResponse);
});

He or she will also be able to handle the data returned by the call in an event handler function, as shown above.
The use of this tool has allowed the team to increase the efficiency of the web application development, achieving a good separation between the different roles in the development team.

Business Logic Service  

This service is the fulcrum of the solution; it handles the following aspects:
  •  Logic for the loyalty program
  •  Logic for the creation/update and recovery of customer data
  •  Logic for the creation/update and recovery of distribution data
Each of these aspects resulted in a separate interface definition, to make it simple to separate the implementations when required.
The business logic service acts as a coordinator of internal and external data to achieve the desired business objective. The internal data are masked behind specific CRUD services to increase the decoupling between data and logic.

CRUD Services  

The CRUD Services mask data from existing databases. The choice of developing specific services to publish internal data has been driven by the desire to make such data available to future applications.

Third Party Connectors 

Some of the data required by the application are provided by the following external services: the Google Geocoding API, Auth0 and Prestashop.
All three integrations are centered on the use of a Jolie HTTP output port; such a port acts as an HTTP client.
The process of implementing an integration with a third-party REST service requires the following steps:

  1. Definition of the interface and its operations ( possibly a subset of the operations provided )
  2. Definition of the request and response types
  3. Definition of the port parameters, such as method aliases, etc.
Using this simple methodology it has been possible to develop connectors that can now be reused by any other application that may need them; a sketch of such a port is shown below.
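A minimal sketch of an HTTP output port for a third-party REST integration in Jolie (the location, interface, operation name, method and alias are illustrative assumptions, not the actual connectors used in the project):

outputPort ThirdPartyService {
  Location: "socket://api.example.com:80"
  Protocol: http {
    .osc.getResource.method = "get";        // HTTP method used for the operation
    .osc.getResource.alias = "v1/resource"  // REST path the operation is mapped to
  }
  Interfaces: ThirdPartyInterface
}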

Google Geocoding API

It was necessary to implement a geolocalization process to populate the distributor and shop data with their latitude and longitude coordinates. In this way the marketing team is able to add a new shop or distributor without worrying about providing the localization data.
These data are then available via the web application to customers searching for a shop or distributor in their area.

Auth0

In the WEB Server section Auth0 was mentioned as the authentication provider. It was also necessary to implement a server-side integration with some of the APIs provided by Auth0. This made it possible to recover user and token information.

Prestashop 

It was also necessary to implement the integration with our instance of Prestashop. The business requirement was to give our customers feedback on the use of their loyalty points via our e-commerce, allowing each customer to see at run time the status of his/her discount vouchers and the history of his/her purchases.

Friday, 5 August 2016

MongoDB Jolie Connector Part 2: Application Example

In the previous post I looked at the basic use of Jolie's MongoDB connector, with an overview of its characteristics and its basic uses. In this post we look at a real use of the connector. The obvious choice would have been the implementation of a Sales and Distribution application, yet in the company where I work such a need has not arisen. The opportunity to use MongoDB and Jolie arose with the approval of the development of a Plant Management software. Production plants are often very complex structures with hundreds of components. The best way to represent such complex structures is a tree representation, where the hierarchical nature of a plant and its sub-components can best be expressed. The tree data structure also fits perfectly with Jolie data structures and with MongoDB's document approach.
Therefore the first step in the development process was to define a "document", or better a tree data structure, that represents my company's production plants.
     
type ProductionSiteType:void{
  .productionSite:void{
    .productionSiteName:string
    .productionSiteErpCode:string
    .mainLine*:void{
      .mainLineName:string
      .mainLineErpCode:string
      .subLine*:void{
        .subLineName:string
        .subLineErpCode:string
        .equip*:void{
          .equipName:string
          .equipErpCode:string
        }
      }
    }
  }
}

The second issue was to evaluate how flexible this data structure is: can it adapt to changes required by the business, with the insertion of new information? Let us suppose that the business requires us to specify a mainLineManager (under the mainLine node) with his/her contact details, and also a supplierMaterialCode. Whether the information is compulsory or optional can be handled using Jolie's node cardinality.

       .mainLineManager:void{
          .managerName:string
          .email:string
          .phone:string
       }

The insertion of this node will require the data already within the collection to be processed again,

        .equip*:void{
           .supplierMaterialCode?:string
           .equipName:string
           .equipErpCode:string
        }

whereas in this case there will be no need for the collection to be processed again.
So far the reader may think that the writer has only shown how to represent a possible plant structure using Jolie.
The advantage of the document approach becomes evident when looking at the interface and its operations:

interface MaintanaceServiceInterface{
   RequestResponse:
    addProductionSite( ProductionSiteType )(AddProductionSiteResponse)
 }

It can be seen that any change in the document is propagated to the addProductionSite operation, as is also evident when looking at the implementation:

[addProductionSite(request)(response){
  scope (addProductionSiteScope){
    install (default =>
      valueToPrettyString@StringUtils(addProductionSiteScope)(s);
      println@Console(s)()
    );
    q.collection = "SiteStructure";
    q.document << request;
    insert@MongoDB(q)(responseq)
  }
}]{nullProcess}

The same can be said for the reading of plant structures:


type GetProductionSiteRequest: void{
     .productionSiteErpCode?:string
}
type GetProductionSiteResponse:void{
  .document*:ProductionSiteType
}
  
interface MaintanaceServiceInterface{
   RequestResponse:
    addProductionSite( ProductionSiteType )(AddProductionSiteResponse),
    getProductionSite(GetProductionSiteRequest)(GetProductionSiteResponse)
 }


With the following implementation:


[getProductionSite(request)(response){
  scope (getProductionSiteScope){
    install (default =>
      valueToPrettyString@StringUtils(getProductionSiteScope)(s);
      println@Console(s)()
    );
    q.collection = "SiteStructure";
    if ( is_defined (request.productionSiteErpCode) ){
      q.filter << request;
      q.filter = "{'productionSite.productionSiteErpCode': '$productionSiteErpCode'}"
    };
    if (DEBUG_MODE){
      valueToPrettyString@StringUtils(q)(s);
      println@Console(s)()
    };
    query@MongoDB(q)(responseq);
    if (DEBUG_MODE){
      valueToPrettyString@StringUtils(responseq)(s);
      println@Console(s)()
    };
    response << responseq
  }
}]{nullProcess}


Once again it can be noticed that the filter is expressed in MongoDB's standard query language, with the injection of the data directly from the Jolie type.
It has also been necessary to refactor some of the subnodes, such as equip, by defining a non-inline type:



type equipType:void{
  .equipName:string
  .equipErpCode:string
}

type AddEquipmentRequest:void{
  .productionSiteErpCode:string
  .mainLineErpCode:string
  .subLineErpCode:string
  .equip: equipType
}

interface MaintanaceServiceInterface{
  RequestResponse:
    addProductionSite( ProductionSiteType )(AddProductionSiteResponse),
    getProductionSite(GetProductionSiteRequest)(GetProductionSiteResponse),
    addEquipment( AddEquipmentRequest )(AddProductionSiteResponse)
}


By defining a new Jolie complex type it has been possible to centralize the definition of the equipment, which can be used both as a type for further operations and directly as a new document for a new collection.
Another example of the usefulness of the type/document driven design is the access to subnode content as individual nodes:


type GetMainLineRequest: void{
  .productionSiteErpCode?:string
}
type GetMainLineResponse:void{
  .document*:ProductionSiteType
}

interface MaintanaceServiceInterface{
  RequestResponse:
    addProductionSite( ProductionSiteType )(AddProductionSiteResponse),
    getProductionSite(GetProductionSiteRequest)(GetProductionSiteResponse),
    addEquipment( AddEquipmentRequest )(AddProductionSiteResponse),
    getMainLine( GetMainLineRequest )(GetMainLineResponse)
}


[getMainLine(request)(response){
  scope (getMainLineScope){
    install (default =>
      valueToPrettyString@StringUtils(getMainLineScope)(s);
      println@Console(s)()
    );
    q.collection = "SiteStructure";
    q.filter[0] = "{ $match : { 'productionSite.productionSiteErpCode' : '$productionSiteErpCode' } }";
    q.filter[0].productionSiteErpCode = request.productionSiteErpCode;
    q.filter[1] = "{$unwind : '$productionSite'}";
    q.filter[2] = "{$unwind : '$productionSite.mainLine'}";
    if (DEBUG_MODE){
      valueToPrettyString@StringUtils(q)(s);
      println@Console(s)()
    };
    aggregate@MongoDB(q)(responseq);
    if (DEBUG_MODE){
      valueToPrettyString@StringUtils(responseq)(s);
      println@Console(s)()
    };
    response << responseq
  }
}]{nullProcess}


By using the aggregation operation it has been possible to extract the content of the subnode mainLine for a specific plant, creating one document for each mainLine.

Within the limits of a blog post, I hope the reader will appreciate how Jolie and Jolie's MongoDB connector are a valid technology for approaching microservices-based, data-intensive applications.

Monday, 4 July 2016

MongoDB Jolie Connector Part 1

One of the buzzwords of the IT community is definitely "Big Data", yet, as often happens, the actual definition is somewhat vague: is "Big Data" a question of volume of data, or also a question of performance? The answer can be found, in my opinion, in the definition provided by Gartner: "Big Data is high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation".
Does this mean that to do "Big Data" we need a new DB technology? Well, the short answer is yes, we do: traditional DB technologies fit poorly with some of the "Big Data" requirements. This new challenge has created a proliferation of new DB technologies and vendors with different approaches and solutions. A good review of the several options is provided by the following site: Database technology comparison-matrix.
MongoDB is one of the possible technology choices; it is my opinion that its document approach makes data modeling much more linear, also easing the not always easy communication between IT and non-IT members of a company.
The next question is: "Can microservices play a role in a Big Data scenario?". It is my opinion that microservices can play an important role in developing data handling in a "Big Data" landscape. Expanding on the microservices orchestration concept, one can imagine a scenario where a pool of microservices is used for data orchestration, where new "aggregated data" are created by using the individual functions exposed by the microservices. This would certainly cover two of the characteristics of Gartner's definition:
  1. high-variety information: the orchestration of original data into new aggregated data avoids the creation of new database-bound tables or collections
  2. cost-effective: microservices are effective insofar as we are able to correctly decompose the problem at hand
With the purpose of bringing Jolie into the Big Data discourse, a MongoDB connector has been developed. The choice of MongoDB as the first Big Data technology approached by the Jolie team was not accidental, but driven by the tree-like data representation used by MongoDB, which is extremely similar, if not identical, to the way Jolie represents complex data in its complex type definitions.
The connector has been developed outside the normal Jolie language development, but it is developed by Jolie team members and it responds to specific demands of a growing Jolie programming community.
The development team adopted a top-down approach when developing the connector, starting from an analysis of a set of requirements, passing through the interface definition, to arrive at some pilot projects ( whose results will be presented further on ).
The identified requirements were the following:
  1. Preserve the non-SQL nature of MongoDB
  2. Give the connector user a simple and intuitive interface to the MongoDB world
  3. Preserve some of the native aggregation and correlation characteristics of MongoDB
From these three requirements the following considerations were derived:
  1. The connector will have to provide all the CRUD operations
  2. The connector will use the native query representation (JSON)
  3. The connector will provide access to MongoDB's aggregation capabilities
  4. The naming convention of the service operation types will replicate the terms used by MongoDB,
and resulted in the following interface:


interface MongoDBInterface {
  RequestResponse:
  connect (ConnectRequest)(ConnectResponse) throws MongoException ,
  query   (QueryRequest)(QueryResponse)   throws MongoException JsonParseException ,
  insert  (InsertRequest)(InsertResponse)   throws MongoException JsonParseException ,
  update  (UpdateRequest)(UpdateResponse)   throws MongoException JsonParseException ,
  delete  (DeleteRequest)(DeleteResponse)   throws MongoException JsonParseException ,
  aggregate (AggregateRequest)(AggregateResponse)   throws MongoException JsonParseException
}



As mentioned before, the connector aims to propose a simple-to-use interface.
Starting from the insert operation:

q.collection = "CustomerSales";
    with (q.document){
          .name    = "Lars";
          .surname = "Larsesen";
          .code = "LALA01";
          .age = 28;
           with (.purchase){
             .ammount = 30.12;
             .date.("@type")="Date";
             .date= currentTime;
             .location.street= "Mongo road";
             .location.number= 2
          }
     };
As can be seen, the document can be inserted into the "CustomerSales" collection by simply defining the document to be inserted as a Jolie structured variable.

The query operation, or any other operation that requires subset filtering ( update, delete, aggregate ), uses a filter:

q.filter = "{'purchase.date':{$lt:'$date'}}";
q.filter.date =long("1463572271651");
q.filter.date.("@type")="Date";

Notice the use of '$nameOfVariable' to dynamically inject the value into the search filter.
Using the same representation, values can be injected for an update operation:

q.collection = "CustomerSales";
q.filter = "{surname: '$surname'}";
q.filter.surname = "Larsesen";
q.documentUpdate = "{$set:{age:'$age'}}";
q.documentUpdate.age= 22; 

and so it goes for the aggregation:

q.collection = "CustomerSales";
q.filter = "{$group:{ _id : '$surname', total:{$sum : 1}}}";

Use and limitations of the connector

The connector can be downloaded here ( Jolie Custom services Installation ); the current release has to be considered a Beta release, and is therefore prone to imperfections and bugs.
Some of them are already known and are on their way to being corrected:
  1. Logoff operation missing
  2. Handling of MongoDB security
  3. DateTime storing error (UTC DateTime)
The user must also consider the following native data type mapping:

Mongo Type   Jolie Type   Detail
Double       double
int32        int
int64        int
String       string
DateTime     long         with child node ("@type")="Date"
ObjectId     long

Tuesday, 28 April 2015

Using define macros vs a personalized Jolie service

When designing software solutions using microservices, one finds oneself wondering how to handle repetitive code. This is particularly true when one uses microservices for data handling and data mapping, where it is often required to transform data from one format to another, for example:
  1. Date Format
  2. Double Value Format 
  3. Other Alphanumerical manipulation  
The Jolie language already provides the programmer with some rather powerful primitives to manipulate such data, but it is not uncommon to have to personalize some behavior. And this is where the dilemma arises. Do I go for:
  1. A series of  macros defined by the primitive define 
  2. Or do I create a service with a number of exposed operations
Now, this post does not aim to set a standard or to propose a methodology for microservices coding in Jolie. Its only aim is to share a personal checklist for taking a coding decision on repetitive code.
Let's look at an example of date transformation:
   
define transformDate{
  // expects __date in format yyyyMMdd, e.g. "20150428"
  if (is_defined(__date)){
    requestSplit = __date;
    requestSplit.length = 4;
    splitByLength@StringUtils (requestSplit)(resultSplitYear);
    year = resultSplitYear.result[0];
    requestSplit = resultSplitYear.result[1];
    requestSplit.length = 2;
    splitByLength@StringUtils (requestSplit)(resultSplitMonth);
    month = resultSplitMonth.result[0];
    day = resultSplitMonth.result[1];
    // result in format dd-mm-yyyy
    __transformedDate = day + "-" + month + "-" + year
  }
}


Now, this section of code could be defined within an .iol file, included by several services, and used in this way:

__date = value_original_date;
transformDate;
value_new_date = __transformedDate

Now let us see the same functionality implemented with a service:

type DateTransformRequest:string

type DateTransformResponse:string

interface TransformInterface {
RequestResponse:
 transformDate (DateTransformRequest)(DateTransformResponse)
}

and with an implementation that looks like this:

inputPort TransformerInputPort{
  Location: "socket://localhost:8099"
  Interfaces: TransformInterface
  Protocol: sodep
}

main{
  [ transformDate(request)(response){
    // the code that does it here
  }]{nullProcess}
}
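For comparison with the define-based usage above, a client would invoke the service through an output port. A minimal sketch (the output port mirrors the input port declared above; the date value is illustrative):

outputPort Transformer{
  Location: "socket://localhost:8099"
  Interfaces: TransformInterface
  Protocol: sodep
}

main{
  transformDate@Transformer("20150428")(value_new_date)
}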


Now let us get to the topic of this post: how can we compare the two implementations?

Define implementation

  Need to restart all the services if a change is implemented   Yes
  No network resources                                          Yes
  Shorter time of implementation                                Yes
  Clear error traceability                                      No
  Network deploy                                                No

and for the Jolie service implementation:

Jolie Service implementation

  Need to restart all the services if a change is implemented   No
  No network resources                                          No
  Shorter time of implementation                                No
  Clear error traceability                                      Yes
  Network deploy                                                Yes


Now, the reader may be tempted to object to such a limited, arbitrary checklist, and he or she would be right to do so; but my aim was not to give a definitive answer, rather to indicate a methodology of thinking.
Now let us step back to the drawing board and consider the following questions.

  1. Do I need to reuse the code several times and across several implementations?
  2. Will my hardware infrastructure be subject to changes?
  3. Do I need to trace possible errors, specializing error raising and handling?
  4. Do I need to use complex data types?

If the answer to all these questions is yes, it is my opinion that it would be better to opt for a Jolie Service Implementation approach, for the following reasons:
  1. In both implementations cross usability is possible, but the Jolie service implementation is preferable because:
    1. In case of code changes I need to change the code in a single place
    2. If I need to use it in a new project I do not have to copy a specific implementation, but can refer to an interface and leave the implementation stable
  2. This point is often underestimated during the design phase of a project, but in a microservices architecture it has a huge impact; in this case too the Jolie Service Implementation is preferable because:
    1. It is naturally more adaptable to infrastructure changes ( the service can be deployed wherever )
    2. If there is a need to split the microservices implementation over several machines, in the case of the define-based implementation we will find ourselves duplicating the implementation on several machines, with all the problems concerning code duplication
  3. Error handling is essential in any software paradigm, but it is even more important in an SOA architecture, where the high level of concurrent execution would make it almost impossible to trace errors. In the case of the Jolie Service Implementation the possible errors are definable at the interface level, making, in my opinion, the use of the functionalities simpler:
    /* define implementation */
    define myDefine{
       /* some code */
       throw (someError)
       /* some other code */
       throw (someOtherError)
    }
    /* use of an interface */
    interface myInterface{
    RequestResponse:
     myOperation (requestType)(responseType) throws someError , someOtherError
    }
    As the code above tries to exemplify, the user of the functionalities will find it easier to identify the possible errors to catch via the interface, instead of having to scan the implementation
  4. Similarly to the issue of fault handling, the use of complex types is aided massively by a Jolie Service Implementation approach, where we can define complex types in a clear way, instead of having to reverse engineer the code of the define
I hope you will find my post somehow useful when deciding how to handle recurrent code in Jolie.

Sunday, 25 January 2015

Using Jolie to implement BTE transactions

For those of you familiar with SAP and its complex behaviours, the term BTE ( Business Transaction Event ) should be familiar. For those less familiar with SAP, a BTE can be defined as a specific moment ( status ) in the life cycle of an enterprise document; such a concept transcends SAP and its implementation of an ERP. For example, a Sales Invoice can be represented by a cohesive object that contains all the required information.
In service-oriented programming objects as such do not exist; instead they are substituted by operations and their relative input and output messages. In standard WSDL notation, operations and their relative messages are represented in this way:

<message name="createSalesInvoiceRequest">
  <part name="clientCode"    type="xs:string"/>
  <part name="invoiceDate"   type="xs:string"/>
  <part name="invoiceAmount" type="xs:double"/>
</message>

<message name="createSalesInvoiceResponse">
  <part name="invoiceId" type="xs:string"/>
</message>

<portType name="salesInvoiceInterface">
  <operation name="createSalesInvoice">
    <input message="createSalesInvoiceRequest"/>
    <output message="createSalesInvoiceResponse"/>
  </operation>
</portType>

In Jolie the same operation can be represented using the following definition:

type createSalesInvoiceRequest:void{
   .clientCode:    string
   .invoiceDate:   string
   .orderNumber:   string
   .invoiceAmount: double
}
type createSalesInvoiceResponse:void{
   .invoiceId:int
}

interface SalesInvoiceInterface{
 RequestResponse:
    createSalesInvoice (createSalesInvoiceRequest)(createSalesInvoiceResponse)
}

It is not by chance that createSalesInvoice was selected as an example: the communication medium of such a document often varies from client to client. Let's look at some examples.

Your customer is :

1.     A big corporation that wants communication via an XML file
2.     A medium-sized company that requires a PDF
3.     A small company that may just want an e-mail
Now let us define a DocumentInterface:

type createDocumentRequest:void{
   .invoiceId:    int
   .invoiceDate:   string
   .orderNumber : string
   .invoiceAmount: double
   .clientInfo:clientInfoType
}
type createDocumentResponse:void{
      .docId:string
}

interface DocumentInterface{
 RequestResponse:
    createDocument (createDocumentRequest)(createDocumentResponse)
}

Now let us assume that three separate services are created, each of them implementing the DocumentInterface.

Now, in the service that implements createSalesInvoice, we need to define an output port in this way:

outputPort DocumentServicePort {
  Location:  "local"
  Protocol:sodep
  Interfaces: DocumentInterface
}

and an implementation of the operation as follows:

[ createSalesInvoice(request)(response){
  /* code to save on the DB */
  response.invoiceId = responseDB.invoice
}]{
  requestClientInfo.clientCode = request.clientCode;
  getClientInfo@ClientServicePort(requestClientInfo)(responseClientInfo);
  requestDocumentCreator.invoiceId = response.invoiceId;
  requestDocumentCreator.invoiceDate = request.invoiceDate;
  requestDocumentCreator.orderNumber = request.orderNumber;
  requestDocumentCreator.invoiceAmount = request.invoiceAmount;
  requestDocumentCreator.clientInfo << responseClientInfo.clientInfo;
  // the port location is set dynamically, based on the client's master data
  DocumentServicePort.location = responseClientInfo.serviceInfo.location;
  createDocument@DocumentServicePort(requestDocumentCreator)(responseDocumentCreator)
}

It can be seen, on the commented line, how the location of the DocumentService is set dynamically; this allows selecting the location, and therefore the implementation of createDocument, depending on the client code.
Let's go back to our three clients with their relative locations:

1.     A big enterprise => "socket://localhost:3001"
2.     A medium enterprise => "socket://localhost:3002"
3.     A small enterprise => "socket://localhost:3003"

Let's assume now that, during the life cycle of the implementation, it becomes necessary to specialize the document service even further.
The IT development team will be able to design and develop the new service without having to touch the existing code. Once tested and approved by the business, it will be possible to activate it by simply adding the new location to the master data of all the clients that need the new service.

I hope this post has shown some of the excellent flexibility Jolie possesses in implementing BTEs, and how a change in a part of the business workflow does not mean long hours of redesign and redeployment. The author does realize that this example certainly does not represent the complexity of a real business workflow, but this post aims to propose Jolie as a serious contender for business workflow design and implementation.