MuleSoft Services – small PoC, API building, File Transfers, DB inserts using S3 & AWS RDS

Purpose:  Test Mule services (integration, file transfer, DB inserts) and assess whether Mule adds value on projects with an on-premise-to-AWS or cloud migration component.

HLD – High Level Diagram

 

Observations:

Mule Positive features:

-For multi-cloud deployments, or for cloud apps integrated with legacy systems, Mule can be a good choice for constructing and managing APIs

-Java engineers should be able to pick up Mule within 1-2 weeks through a hands-on PoC

-Forces you to think about interface design and management (along with workflows)

-Lightweight Java (akin to RMI) with connectors to legacy systems that are difficult to interface with, e.g. SAP

 

Mule Disadvantages:

-Cumbersome UI

-Documentation is fairly disappointing

-If you are ‘all in’ on AWS you don’t need Mule; it is just another layer, another cloud, another HA/DR setup to worry about

-FTP and other services can be handled within AWS

-Hand-written code is better suited to complex tasks, APIs, and systems

1.       Created AWS Free Tier account

Access:

https://aws.amazon.com/console/

Username: can be your own

Note: Take care to ensure the account stays within the AWS Free Tier

2.       Created MySQL on AWS
a)      Created MySQL micro instance on AWS

Tutorial: https://aws.amazon.com/getting-started/tutorials/create-mysql-db/

Instance: rds-mysql-poc-01

Username: masterUsername

Address: rds-mysql-poc-01.clsfdo3jxsj1.ca-central-1.rds.amazonaws.com

Note: IP of client must be added to AWS security group to allow the connection.
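This whitelisting step can also be scripted. A minimal sketch using boto3 (the security group ID `sg-d16c42b8` appears later in this write-up; the client CIDR below is a placeholder):

```python
def allow_mysql_from(ec2, group_id, cidr):
    """Open MySQL's port 3306 in the given security group to one client CIDR."""
    ec2.authorize_security_group_ingress(
        GroupId=group_id,
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": 3306,
            "ToPort": 3306,
            "IpRanges": [{"CidrIp": cidr}],
        }],
    )

# Usage (requires boto3 and AWS credentials):
#   import boto3
#   allow_mysql_from(boto3.client("ec2"), "sg-d16c42b8", "203.0.113.10/32")
```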

b)      Created two DBs in the instance (db01 and db02)

Two databases created:

DB Name: db01

DB Name: db02

c)       Created one table in each DB

Script used to create sample table:

-- Table structure for table `orderdetails`

DROP TABLE IF EXISTS `orderdetails`;

CREATE TABLE `orderdetails` (

`orderid` int(11) NOT NULL,

`itemNumber` varchar(12) NOT NULL,

`shipToNumber` varchar(6) NOT NULL,

`customerPONumber` varchar(15) NOT NULL,

`trackingPONumber` varchar(15) NOT NULL,

`reqDeliveryDate` varchar(15) NOT NULL,

`deliveryInstruction` varchar(60) NOT NULL,

`customerPartNumber` varchar(22) NOT NULL,

`quantity` int unsigned NOT NULL,

`price` decimal(9,4),

PRIMARY KEY (`orderid`,`itemNumber`,`shipToNumber`) USING BTREE

) ENGINE=InnoDB DEFAULT CHARSET=latin1 ROW_FORMAT=FIXED;

 

CREATE DATABASE db02;

d)      Inserted test data into db01

Script used to populate sample data in db01:

 

DELETE FROM `orderdetails`;

INSERT INTO `orderdetails` VALUES (1,'10800004','123456','123456789012345','234567890123456','2017-07-31','Instructions Order 1','123456',5,42.0600);

INSERT INTO `orderdetails` VALUES (2,'10855554','222222','222222222222222','222222222222222','2017-07-31','Instructions Order 2','222222',2,22.0000);

INSERT INTO `orderdetails` VALUES (3,'10833333','333333','333333333333333','333333333333333','2017-07-31','Instructions Order 3','333333',3,33.0000);

INSERT INTO `orderdetails` VALUES (4,'10844444','444444','444444444444444','444444444444444','2017-07-31','Instructions Order 4','444444',4,44.0000);

INSERT INTO `orderdetails` VALUES (5,'10855555','555555','555555555555555','555555555555555','2017-07-31','Instructions Order 5','555555',5,55.0000);

INSERT INTO `orderdetails` VALUES (6,'10866666','666666','666666666666666','666666666666666','2017-07-31','Instructions Order 6','666666',6,66.0000);

INSERT INTO `orderdetails` VALUES (7,'10877777','777777','777777777777777','777777777777777','2017-07-31','Instructions Order 7','777777',7,77.0000);

INSERT INTO `orderdetails` VALUES (8,'10888888','888888','888888888888888','888888888888888','2017-07-31','Instructions Order 8','888888',8,88.0000);
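Later flows serialize these rows to a db01.json file. The shape of that file can be approximated in plain Python (a sketch; the real file is produced by Mule's JSON transformer, so the exact key casing is an assumption based on the column names above):

```python
import json

# Column order matches the orderdetails table definition above.
COLUMNS = ["orderid", "itemNumber", "shipToNumber", "customerPONumber",
           "trackingPONumber", "reqDeliveryDate", "deliveryInstruction",
           "customerPartNumber", "quantity", "price"]

def rows_to_json(rows):
    """Turn DB result rows (tuples in column order) into a JSON array string."""
    return json.dumps([dict(zip(COLUMNS, row)) for row in rows])

sample = [(1, "10800004", "123456", "123456789012345", "234567890123456",
           "2017-07-31", "Instructions Order 1", "123456", 5, 42.06)]
print(rows_to_json(sample))
```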

 

3.       Created Mule CloudHub account

Accounts can be created at: https://anypoint.mulesoft.com/login/#/signin

 

4.       Tried to establish MySQL connection from Mule CloudHub

References:

  • https://docs.mulesoft.com/mule-user-guide/v/3.6/database-connector
  • https://docs.mulesoft.com/mule-user-guide/v/3.6/database-connector-examples

Note: The connection could not be established, since the MySQL instance on AWS allows connections only from certain IPs (currently just mine). It appears that the trial version of Mule does not allow a static IP.

 

5.       Got Mule Anypoint Studio – worked through Getting Started and some template projects.

6.       Created Mule project

The project is composed of the following flows:

a)      Entry point which listens on “/” and displays links to other Mule flows and URLs

 

Fig: Entry point output

 

The entry point listens on the root “/” path; all other flows listen on their own paths:

Paths:

  • Reset (delete S3 and DB 02 content): “/reset”
  • Display DB 01 content in JSON format: “/db01”
  • Display DB 02 content in JSON format: “/db02”
  • Count AWS S3 files: “/S3”
  • Copy all records from DB 01 to db01.json file and upload file to S3 (overwrite db01.json if exists): “/db01_to_S3”
  • Download db01.json file from S3 and insert records from the file into DB 02 (overwrite DB 02 records if DB 02 not empty): “/S3_to_DB02”

 

b)      Reset (delete S3 and DB 02 content)

The flow deletes db01.json file from AWS S3 and deletes all records from DB 02.

 

Fig – Anypoint Studio flow – Reset S3 and DB 02

 

c)       Display DB 01 content in JSON format

Retrieves data from DB01 and displays data in JSON format

 

Fig – Anypoint Studio flow – AWS MySQL to JSON

Notes:

  • The HTTP listener is configured on 0.0.0.0, which means “all IPv4 addresses on the local machine”
  • The HTTP listener uses the port specified in the ${http.port} Mule variable. Mule variables are specified in the mule-app.properties file
  • The Mule Database component connects to the DB using the ${jdbc.url.db02} variable, which is also specified in mule-app.properties
  • Amazon security settings had to be adjusted to allow connections from any remote IP address (details in a later section)
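A mule-app.properties file along these lines backs the ${http.port} and ${jdbc.url.*} placeholders (the values below are illustrative; the port is an assumption and the password is of course not shown here):

```properties
http.port=8081
jdbc.url.db01=jdbc:mysql://rds-mysql-poc-01.clsfdo3jxsj1.ca-central-1.rds.amazonaws.com:3306/db01?user=masterUsername&password=<password>
jdbc.url.db02=jdbc:mysql://rds-mysql-poc-01.clsfdo3jxsj1.ca-central-1.rds.amazonaws.com:3306/db02?user=masterUsername&password=<password>
```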

 

d)      Display DB 02 content in JSON format

Retrieves data from DB02 and displays data in JSON format

Its configuration is similar to the previous step; it just uses a different DB name in the connection URL

 

e)      Count AWS S3 files

Connects to Amazon S3 and displays the number of files residing in the S3 bucket

S3 connection configuration is specified under Mule’s flow Global Elements (Global Elements tab).


Fig – Anypoint Studio flow – get count of files in S3 bucket

f)        Copy all records from DB 01 to db01.json file and upload file to S3 (overwrite db01.json if exists)

Fig – Anypoint Studio flow – copy records from DB 01 to db01.json

 

g)      Download db01.json file from S3 and insert records from the file into DB 02 (overwrite DB 02 records if DB 02 not empty)

Fig – Anypoint Studio flow – copy records from db01.json to DB 02

 

h)      REST – Reset (delete DB01.product content)

Fig – Anypoint Studio flow – reset REST DB

REST functions work with product table created in DB01 database (AWS RDS).

The flow deletes data from DB01.product table.

 

i)        REST – Get all records from DB01.product table

Fig – Anypoint Studio flow – get all data

The http://poc-aws.cloudhub.io/rest/products REST endpoint returns DB01.product data in JSON format.

The endpoint could be used by any REST client.

j)        REST POST – Use Postman Chrome extension to post JSON (details in documentation)

Fig – Anypoint Studio flow – insert record

The flow accepts POST requests only. It inserts a product record into DB01.product; the record is included as JSON in the POST body.

Note: An HTTP/REST client must be used to test the flow. Postman, added to Google Chrome, can be used.

Steps:

  • Using Google Chrome, go to https://chrome.google.com/webstore/detail/postman/fhbjgbiflinjbdggehcddcbncdddomop?hl=en and install Postman
  • Once installed, go to Chrome Apps and launch Postman
  • Fill out the screen as follows:
    • Select “POST” as request type
    • Enter “http://poc-aws.cloudhub.io/rest/insert” as the address
    • Select “Body” tab and enter

{
  "id": "3",
  "name": "Product # 3"
}

 

  • Click “Send”

Fig: Postman initial entries

 


 

POST requests are received by the Mule backend, and the JSON record is inserted into the AWS DB01.product table.
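The same POST can be reproduced without Postman. A minimal sketch using Python's standard library against the same endpoint (the urlopen call is left commented out since it needs the deployed CloudHub app to be reachable):

```python
import json
import urllib.request

payload = {"id": "3", "name": "Product # 3"}
req = urllib.request.Request(
    "http://poc-aws.cloudhub.io/rest/insert",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually hit the CloudHub endpoint
```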

 

k)      FTP – Move db01.json from S3 to FTP server

 

Fig – Anypoint Studio flow – transfer file from AWS S3 to external FTP

The flow transfers db01.json file from AWS S3 to external FTP.

Note:

  • FTP access details to be added
  • The flow throws a “Failed to invoke getObjectContent.” exception if the db01.json file does not exist on AWS S3
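A generic guard for this missing-file case looks like the following (a sketch of the pattern only, not Mule's API; with boto3 the exception to catch would be botocore's ClientError carrying a NoSuchKey/404 code):

```python
def fetch_or_none(fetch, missing_errors=(FileNotFoundError,), on_missing=None):
    """Run fetch(); if it raises one of missing_errors (the object is absent),
    return on_missing instead of propagating the failure."""
    try:
        return fetch()
    except missing_errors:
        return on_missing

# e.g. fetch_or_none(lambda: s3_get("db01.json")) would return None when the
# key is missing, where s3_get is a hypothetical S3 download helper.
```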

 

l)        FTP / Mule schedule – Mule monitors FTP for new files

The process runs in the background on Mule CloudHub.

Actions:

  • Check for new files on FTP every 100 minutes.
  • If files exist
    • Move file to another FTP location
    • Make backup copy of processed file on the FTP server
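The per-file decision in this schedule can be separated from the FTP I/O as a pure planning function (a sketch; the directory names are hypothetical, and the real flow uses Mule's FTP connector):

```python
def plan_moves(new_files, dest_dir="/processed", backup_dir="/backup"):
    """For every newly detected file, plan a move to dest_dir plus a backup copy."""
    plan = []
    for name in new_files:
        plan.append(("move", name, f"{dest_dir}/{name}"))   # move to another FTP location
        plan.append(("copy", name, f"{backup_dir}/{name}"))  # back up the processed file
    return plan
```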
7.       How to run Mule flows
  1. Click “Reset (delete S3 and DB 02 content)” link to refresh AWS state.
  2. Click “Display DB 01 content in JSON format” to check DB 01 content (test bed data available in JSON format)
  3. Click “Display DB 02 content in JSON format” to check DB 02 content (it is empty at this stage)
  4. Click “Count AWS S3 files” – it is ZERO at this point
  5. Click “Copy all records from DB 01 to db01.json file and upload file to S3 (overwrite db01.json if exists)”.

This flow retrieves all data from DB01, generates db01.json file and uploads it over to Amazon S3 bucket.

  6. Click “Download db01.json file from S3 and insert records from the file into DB 02 (overwrite DB 02 records if DB 02 not empty)”.

This flow:

  • Retrieves the db01.json file from the AWS S3 bucket in the form of a stream
  • Converts the stream to an Object in the form of a java.util.List
  • Uses a Collection Splitter to split the List object
  • Populates DB02 with the List records
  7. Click “Display DB 02 content in JSON format” to check DB 02 content (same data as in DB01 is available now)
  8. Click “Count AWS S3 files” – it is ONE at this point
  9. REST functions – click “Reset (delete DB01.product content)”
  10. Click “Get all records from DB01.product table” – it is empty at this point
  11. Insert a record into DB01.product using the Postman HTTP client
  12. FTP functions – click “Move db01.json from S3 to FTP server”
  13. Check FTP server content (access details to be provided)
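The stream-to-DB-02 conversion described in this run-through (stream → List → split → insert) can be sketched in plain Python, with parameterized INSERT statements standing in for Mule's Database component (a sketch under the orderdetails schema above, not the actual flow code):

```python
import io
import json

def records_from_stream(stream):
    """Parse the db01.json byte stream into a list of record dicts (the java.util.List step)."""
    return json.loads(stream.read().decode("utf-8"))

def to_insert(record):
    """Build one parameterized INSERT per record, mirroring the Collection Splitter step."""
    cols = ", ".join(record)
    marks = ", ".join(["%s"] * len(record))
    return f"INSERT INTO orderdetails ({cols}) VALUES ({marks})", tuple(record.values())

stream = io.BytesIO(b'[{"orderid": 1, "quantity": 5}]')
for rec in records_from_stream(stream):
    sql, params = to_insert(rec)   # a MySQL client would run cursor.execute(sql, params)
```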

 

8.       Deployed project on Mule CloudHub

Note: A CloudHub account needs to be created first (see step 3 above).

The trial version of Mule CloudHub has a number of limitations:

  • Cannot assign static IP address to deployment
  • Can deploy max one project at a time

 

Reference: https://docs.mulesoft.com/runtime-manager/deploying-to-cloudhub

9.       Created Amazon AWS components
a)      MySQL RDS Instance

Console Access: Services – RDS (under Database group) – Instances (left hand side)

 

Fig – Amazon MySQL instance

 

b)      Security group for MySQL Instance

Console Access: Expand “Instance Actions” on RDS Dashboard and select “See Details”

 

Fig – Amazon MySQL instance details

 

Click on “rds-launch-wizard (sg-d16c42b8)” to view / edit security settings for MySQL

Fig – Amazon MySQL instance – security group

 

c)       Amazon S3

Console Access: Services – S3 (under Storage)

 

d)      IAM User for S3 access

Reference: http://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_create.html

Console Access: Services – IAM (under Security, Identity & Compliance)

 
