Using WSO2 Micro Integrator 4.3.0 for Basic Integration Scenarios via VS Code
This article guides you through using the latest VS Code extension released by WSO2 to create integrations that run on WSO2 Micro Integrator (MI). It covers creating connectors and registry items, adding third-party dependencies, building the carbon application with GitHub Copilot assistance for deployment on MI 4.3.0, and running the integration inside VS Code for debugging purposes.

This article focuses on creating integrations rather than setting up VS Code, since the setup is clearly covered in the documentation.
According to the documentation, developers can integrate the MI VS Code extension with GitHub Copilot; that functionality is used in this article as well.
Next, follow the Quick Start Guide in the documentation to create a simple API and configure an MI instance for testing purposes. The basic steps are shown in the images below.
Note: The sample backend service is implemented with Ballerina and is available at https://github.com/WSO2SADemo/ballerina-backed-for-mi-apim. Refer to its quick start guide to install and run it in a few minutes.
Initial Steps




Once the API is created, you will see the following view, where you can click on the resource and create your integration.

Once the API is available, it is possible to edit the resource.

Then start adding the needed integrations by clicking the [+] icon in the diagram.
After those steps, let's move on to creating the following integration scenarios.
- Simple transformations using the Payload Factory mediator: XML to JSON and JSON to XML
- Using connectors to read files and perform transformations: CSV to JSON
- Using Filter and Foreach mediators for error handling with the assistance of GitHub Copilot
- Validating events and publishing to Kafka using the Kafka connector
- Service chaining
- Exposing the Integrations API via WSO2 API Manager
1. Simple Transformations using the Payload Factory mediator: XML to JSON and JSON to XML
XML to JSON
To do the transformation, the Payload Factory mediator will be used. It can be added using the [+] icon in the diagram and selecting the relevant mediator.

Afterwards, provide the relevant properties and save and submit.

Afterwards, the response needs to be sent back to the client, which can be done using the Respond mediator.
Note: The Payload Factory mediator that was added in the GitHub Copilot intro does not have the correct payload; it can be corrected with the values shown here.

You will end up with the whole flow as follows.

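For reference, the core of the resource built above corresponds roughly to the following configuration. This is only a minimal sketch with two fields shown; the JSON field names and XPath expressions are assumptions, and the actual values come from the properties entered in the form above.
<!-- Sketch: map a couple of XML elements into a JSON payload and respond -->
<payloadFactory media-type="json" template-type="default">
    <format>{"PartyNumber": "$1", "SourceSystem": "$2"}</format>
    <args>
        <arg evaluator="xml" expression="//acc:PartyNumber" xmlns:acc="http://TargetNamespace.com/AccountCreate"/>
        <arg evaluator="xml" expression="//acc:SourceSystem" xmlns:acc="http://TargetNamespace.com/AccountCreate"/>
    </args>
</payloadFactory>
<property name="messageType" scope="axis2" type="STRING" value="application/json"/>
<respond/>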
Since the MI server instance is configured with VS Code, it is possible to run the integration, after which you will end up with the following view.

The following curl command can be used to test the functionality:
curl --location 'http://localhost:8290/integrations/xml_to_json' \
--header 'X-REQUEST-ID: test-id' \
--header 'Content-Type: application/xml' \
--data '<?xml version="1.0"?>
<acc:Request xmlns:acc="http://TargetNamespace.com/AccountCreate">
<acc:PartyNumber>string</acc:PartyNumber>
<acc:SourceSystem>string</acc:SourceSystem>
<acc:SourceSystemReferenceValue>string</acc:SourceSystemReferenceValue>
<acc:OrganizationName>string</acc:OrganizationName>
<acc:Type>string</acc:Type>
<acc:SalesProfileStatus>string</acc:SalesProfileStatus>
<acc:OrganizationSize>string</acc:OrganizationSize>
<acc:OrganizationDEO_OpeningDate_c>string</acc:OrganizationDEO_OpeningDate_c>
<acc:OrganizationDEO_ClosingDate_c>anyType</acc:OrganizationDEO_ClosingDate_c>
<acc:OrganizationDEO_CompanyName_c>string</acc:OrganizationDEO_CompanyName_c>
<acc:OrganizationDEO_Location_c>string</acc:OrganizationDEO_Location_c>
<acc:OrganizationDEO_StoreType_c>string</acc:OrganizationDEO_StoreType_c>
<acc:OrganizationDEO_Region_c>string</acc:OrganizationDEO_Region_c>
<acc:OrganizationDEO_OrganizationNumber_c>string</acc:OrganizationDEO_OrganizationNumber_c>
<acc:OrganizationDEO_HKNr_c>anyType</acc:OrganizationDEO_HKNr_c>
<acc:OrganizationDEO_ICABusinessPartner_c>anyType</acc:OrganizationDEO_ICABusinessPartner_c>
<!--1 or more repetitions:-->
<acc:Address>
<acc:AddressNumber>string</acc:AddressNumber>
<acc:Address1>string</acc:Address1>
<acc:City>string</acc:City>
<acc:PostalCode>100</acc:PostalCode>
<acc:Country>string</acc:Country>
<acc:AddressType>string</acc:AddressType>
</acc:Address>
<acc:Address>
<acc:AddressNumber>string</acc:AddressNumber>
<acc:Address1>string</acc:Address1>
<acc:City>string</acc:City>
<acc:PostalCode>100</acc:PostalCode>
<acc:Country>string</acc:Country>
<acc:AddressType>string</acc:AddressType>
</acc:Address>
</acc:Request>'
The JSON to XML configuration can be found via the link.

2. Using connectors to read files and perform transformations: CSV file to Json file

In this scenario, a CSV file will be read using the File connector, converted into JSON format, and the result sent back to the client as the response.
Create a new resource called csvFile-to-json and select the GET method.
Once the resource is created, let's add a File connector to read the file as shown below. This requires an internet connection since the connector will be downloaded.

Once the connector is downloaded, a set of information is required, such as the file connection and other parameters.

Add a new connection with the following information, where the file path includes the folder path of the source files and the target files (the target files will be required for other integrations).

Provide the File/Directory path as /csvreadGET/source.csv and submit.
Once the file is read, the content needs to be converted from CSV to JSON. To do that, the CSV connector has to be used; it can be added in the same manner as the File connector.

Once it is selected, it will be downloaded and has to be configured as follows.

Then the output needs to be written to a file.

This has to be followed by the configurations shown below.

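Put together, the in-sequence of the resource looks roughly like the following. This is only a sketch: the connection name FILE_CONNECTION_1 and the exact operation and parameter names depend on the File and CSV connector versions and are assumptions here, so rely on the configuration generated from the forms above.
<!-- Sketch only: read the CSV, convert it to JSON, then write the result.
     Connection name and parameter names are assumptions for illustration. -->
<file.read configKey="FILE_CONNECTION_1">
    <path>/csvreadGET/source.csv</path>
</file.read>
<CSV.csvToJson>
    <headerPresent>Present</headerPresent>
</CSV.csvToJson>
<file.write configKey="FILE_CONNECTION_1">
    <filePath>/csvreadGET/target.json</filePath>
</file.write>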
Whenever the GET call happens, the records read from the source file are appended to the target.json file. The following is a sample curl command.
curl --location 'http://localhost:8290/integrations/csvFile_to_json'
3. Using Filters for Error Handling and getting assistance from Github Copilot

In this scenario, the use of GitHub Copilot will be explained as well. First, select Copilot Edits from the top section and add the files that you need to edit.

After the file is added, it is just a matter of entering the requirement in the chat, for example: “Get the sellerId from the payload { "sellerId": "clothing" } received at the /error-handling resource and save it in a property uri.var.sellerId”. Copilot will then add the integration to the code automatically (once it is given the necessary permissions to do so).

Once it is executed, the following will be added to the code, which is pretty impressive.

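The generated code is roughly equivalent to the following Property mediator (a sketch; Copilot's exact output may vary):
<!-- Sketch: extract sellerId from the incoming JSON payload into a property -->
<property name="uri.var.sellerId" expression="json-eval($.sellerId)" scope="default" type="STRING"/>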
Now let's add an endpoint to call the backend service as follows.


Now let's add that endpoint to our integration.


Let's again get assistance on creating a filter, with the prompt: “Create a filter which checks the response code from the call and check whether it's 500 or not”.

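The resulting mediation is roughly equivalent to the following Filter mediator (a sketch; the then/else branches are filled in by the next steps):
<!-- Sketch: branch on the HTTP status code returned by the backend call -->
<filter source="get-property('axis2', 'HTTP_SC')" regex="500">
    <then>
        <!-- handle the 500 error response (see the payload factory added below) -->
    </then>
    <else>
        <!-- continue with the normal flow -->
    </else>
</filter>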
Then let's convert the error message of the 500 response to XML, with a prompt such as: “For a 500 response I am getting { "message": "Exception occurred when reading sellerId" }; convert it to XML in the format <m:CustomerDetailsError xmlns:m="http://services.samples/xsd"><m:ErrorMessage>{message}</m:ErrorMessage></m:CustomerDetailsError>, where {message} is the content of the message key in the JSON response.”
As a result, the following Payload Factory mediator and property get added.

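A hand-written equivalent of that generated snippet looks roughly like this (a sketch based on the prompt above):
<!-- Sketch: convert the JSON error message into the requested XML format -->
<payloadFactory media-type="xml" template-type="default">
    <format>
        <m:CustomerDetailsError xmlns:m="http://services.samples/xsd">
            <m:ErrorMessage>$1</m:ErrorMessage>
        </m:CustomerDetailsError>
    </format>
    <args>
        <arg evaluator="json" expression="$.message"/>
    </args>
</payloadFactory>
<property name="messageType" scope="axis2" type="STRING" value="application/xml"/>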
In this way, you can interact with GitHub Copilot in order to get the integrations done.
Note: There are cases where GitHub Copilot does not give exactly what is needed and sometimes ends up with errors too.

Instead, this can simply be done with sequences: a foreach mediator together with a Payload Factory mediator and a Property mediator in a sequence. Let's add the following to a sequence called ItemsForEachSequence.



And add the following content to the text area and save the file.
<?xml version="1.0" encoding="UTF-8"?>
<sequence name="ItemsForEachSequence" trace="disable" xmlns="http://ws.apache.org/ns/synapse">
    <payloadFactory media-type="json" template-type="default">
        <format>{
            "id": "$1",
            "title": "$2",
            "price": "$3",
            "sellerId": "$4"
        }</format>
        <args>
            <arg expression="$.ID" evaluator="json"/>
            <arg expression="$.Title" evaluator="json"/>
            <arg expression="$.Price" evaluator="json"/>
            <arg expression="$.sellerId" evaluator="json"/>
        </args>
    </payloadFactory>
    <property name="messageType" scope="axis2" type="STRING" value="text/xml"/>
</sequence>
Go back to the resource. You can navigate to it by clicking the /error-handling resource.

Add a property mediator and a foreach mediator as follows

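A sketch of what the resource then contains is shown below. The JSONPath $.items is an assumption about the shape of the backend response, so adjust it to the actual payload; the full code linked below is authoritative.
<!-- Sketch: iterate over the returned items and transform each one with ItemsForEachSequence -->
<property name="messageType" scope="axis2" type="STRING" value="application/json"/>
<foreach expression="json-eval($.items)" sequence="ItemsForEachSequence"/>
<respond/>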
You could find the full code for this mediation here.
The following cURL command can be used to test the scenario, replacing -1 with other numbers.
curl --location 'http://localhost:8290/integrations/error-handling' \
--header 'usernameid: abcd' \
--header 'Content-Type: application/json' \
--data '{
"sellerId": -1
}'
4. Validating events and publishing to Kafka using the Kafka connector

This part covers adding registry components, connectors, and their third-party dependencies.
Add a registry resource according to the instructions provided in the documentation, or expand the registry resources section at the bottom and add it there.

And add the following content
{
    "$schema": "http://json-schema.org/draft-04/schema#",
    "type": "object",
    "properties": {
        "orders": {
            "type": "array",
            "items": [
                {
                    "type": "object",
                    "properties": {
                        "order": {
                            "type": "object",
                            "properties": {
                                "id": {
                                    "type": "string"
                                },
                                "title": {
                                    "type": "string"
                                },
                                "price": {
                                    "type": "number"
                                },
                                "department": {
                                    "type": "object",
                                    "properties": {
                                        "name": {
                                            "type": "string"
                                        },
                                        "location": {
                                            "type": "string"
                                        }
                                    },
                                    "required": [
                                        "name",
                                        "location"
                                    ]
                                }
                            },
                            "required": [
                                "id",
                                "title",
                                "price",
                                "department"
                            ]
                        }
                    },
                    "required": [
                        "order"
                    ]
                }
            ]
        }
    },
    "required": [
        "orders"
    ]
}
Once the registry section is refreshed, the artefact will be available in the list in the given path.

Create a resource called validate-and-send-to-kafka.

Let's create a sequence called ValidateAndSendToKafkaPublisherSequence to validate the events received from the resource, separate the events, and invoke another sequence which converts each event and sends it to a Kafka broker. So, create a new sequence called ValidateAndSendToKafkaPublisherSequence.

Add the following to enable schema validation within the sequence. If the payload does not match the schema, a new payload will be created with the error message and returned as the response.
<validate cache-schema="true">
    <schema key="conf:/config/schema/jsonvalidator.json"/>
    <on-fail>
        <payloadFactory media-type="json" template-type="default">
            <format>{"Error":"$1",
            "Error Details" : "$2" }</format>
            <args>
                <arg expression="$ctx:ERROR_MESSAGE" evaluator="xml"/>
                <arg expression="$ctx:ERROR_DETAIL" evaluator="xml"/>
            </args>
        </payloadFactory>
        <property name="HTTP_SC" scope="axis2" type="STRING" value="500"/>
        <respond/>
    </on-fail>
</validate>
If the payload is valid according to the schema, the event array needs to be separated and the events sent one by one to the Kafka broker.
So, create a new sequence called KafkaPublishingSequence.

Then let's add a Payload Factory mediator with the following content to filter out some values from the nested JSON and create a flat structure.
<payloadFactory media-type="json" template-type="default">
    <format>{"id": "$1","title": "$2","price": "$3","department name": "$4","department location": "$5"}</format>
    <args>
        <arg expression="$.order.id" evaluator="json"/>
        <arg expression="$.order.title" evaluator="json"/>
        <arg expression="$.order.price" evaluator="json"/>
        <arg expression="$.order.department.name" evaluator="json"/>
        <arg expression="$.order.department.location" evaluator="json"/>
    </args>
</payloadFactory>
Now, to publish messages, a Kafka connector needs to be added. Follow the same steps as before when adding the File connector. Once it is added, a new Kafka connection has to be created as well.

And provide the necessary values as below
localhost:9092
org.apache.kafka.common.serialization.StringSerializer

Add orders as the topic and 0 as the partition, then save the connector and submit.

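Inside the KafkaPublishingSequence, the publish step then looks roughly like the following (a sketch; the connection name KAFKA_CONNECTION_1 and the exact operation and parameter names depend on the Kafka connector version and are assumptions here):
<!-- Sketch: publish the flattened payload to the 'orders' topic, partition 0 -->
<kafkaTransport.publishMessages configKey="KAFKA_CONNECTION_1">
    <topic>orders</topic>
    <partitionNo>0</partitionNo>
</kafkaTransport.publishMessages>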
Afterwards, in the ValidateAndSendToKafkaPublisherSequence sequence, add a foreach mediator and provide json-eval($.orders) to iterate through the array and call the created sequence to publish via Kafka, as sketched below.

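The relevant part of the sequence then reads roughly as follows (a minimal sketch based on the steps above):
<!-- Sketch: for each order in the array, invoke the Kafka publishing sequence, then respond -->
<foreach expression="json-eval($.orders)" sequence="KafkaPublishingSequence"/>
<respond/>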
Add a respond mediator at the end; otherwise, the client will keep waiting until the timeout, which degrades performance.
Since the Kafka connector requires multiple third-party Kafka libraries, add avro-1.8.1.jar, kafka-schema-registry-client-7.7.1.jar, metrics-core-4.1.12.1.jar, scala-library-2.13.6.jar, kafka-clients-3.0.0.jar, kafka_2.13-3.0.0.jar, org.apache.synapse.kafka.poll-1.2.3.jar, and zookeeper-3.6.3.jar from the kafka_2.13-3.0.0 distribution to the libs folder in the project's directory structure (not in the MI project view). Then, when the project starts, all of these libraries will be copied to the distribution and converted to OSGi bundles as well.

Then navigate to the folder where the Kafka distribution resides, start the ZooKeeper and Kafka servers, create the topic, listen to the topic, and start the project, using the following commands.
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties
bin/kafka-topics.sh --create --partitions 1 --replication-factor 1 --topic orders --bootstrap-server localhost:9092
bin/kafka-console-consumer.sh --topic orders --from-beginning --bootstrap-server localhost:9092
Then send the array of events using the following cURL command.
curl --location 'http://localhost:8290/integrations/validate-and-send-to-kafka' \
--header 'Content-Type: application/json' \
--data '{
"orders": [
{
"order": {
"id": "7",
"title": "Dog Collar",
"price": 100.00,
"department": {
"name": "pets",
"location": "colombo"
}
}
},
{
"order": {
"id": "8",
"title": "Cat Food",
"price": 50.00,
"department": {
"name": "pets",
"location": "colombo"
}
}
}
]
}'
The separated events can then be seen in the console consumer.

5. Service Chaining

Create a new resource called check-all-purchased-items and create a new sequence called ServiceChainingSequence.
Create two endpoints (GetItemByItemId and GetDepartmentByName): one receives the item information, the other gets the department information the item belongs to, and the item is enriched with the department details after the two calls finish. A sketch of the endpoint definitions is given below.


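The two endpoints can be defined roughly as follows. This is only a sketch: the backend host, port, and URI templates are assumptions based on the sample Ballerina backend and should be adjusted to your setup.
<!-- Hypothetical endpoint definitions; host, port, and paths are assumptions -->
<endpoint name="GetItemByItemId" xmlns="http://ws.apache.org/ns/synapse">
    <http method="get" uri-template="http://localhost:9090/items/{uri.var.itemId}"/>
</endpoint>
<endpoint name="GetDepartmentByName" xmlns="http://ws.apache.org/ns/synapse">
    <http method="get" uri-template="http://localhost:9090/departments/{uri.var.departmentName}"/>
</endpoint>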
Then add the following code in order to do the mediation in the ServiceChainingSequence.
<property name="REST_URL_POSTFIX" scope="axis2" action="remove"/>
<property name="uri.var.itemId" scope="default" type="STRING" expression="$url:itemId"/>
<call>
    <endpoint key="GetItemByItemId"/>
</call>
<property expression="json-eval($.departmentName)" name="uri.var.departmentName" scope="default" type="STRING"/>
<property expression="json-eval($.sellerId)" name="sellerId" scope="default" type="STRING"/>
<property expression="json-eval($.ID)" name="idValue" scope="default" type="STRING"/>
<property expression="json-eval($.Title)" name="titleValue" scope="default" type="STRING"/>
<property expression="json-eval($.Price)" name="priceValue" scope="default" type="STRING"/>
<call blocking="true">
    <endpoint key="GetDepartmentByName"/>
</call>
<payloadFactory media-type="json">
    <format>
        {
            "id": "$1",
            "title": "$2",
            "price": "$3",
            "department": {
                "name": "$4",
                "location": "$5"
            }
        }
    </format>
    <args>
        <arg evaluator="xml" expression="$ctx:idValue"/>
        <arg evaluator="xml" expression="$ctx:titleValue"/>
        <arg evaluator="xml" expression="$ctx:priceValue"/>
        <arg evaluator="json" expression="$.name"/>
        <arg evaluator="json" expression="$.location"/>
    </args>
</payloadFactory>
<log level="full"/>
<respond/>
Finally, add the sequence to the resource and test the service chaining with the following cURL command, changing the itemId value from 1 to 3.
curl --location 'http://localhost:8290/integrations/check-all-purchased-items?itemId=2'
A response will be received as follows:
{
    "id": "2",
    "title": "Title2",
    "price": "2.0",
    "department": {
        "name": "kitchen",
        "location": "usa"
    }
}
6. Exposing the Integrations API via WSO2 API Manager
Once the deployment.toml of the MI instance that VS Code points to is configured to enable the service catalog, assuming the API Manager is running on port 9444, with the following configuration,
[[service_catalog]]
apim_host = "https://localhost:9444"
enable = true
username = "admin"
password = "admin"
the service will be shown in the API Manager Publisher, in the Service Catalog section, as below.

This article gave a detailed explanation of creating basic integration examples, integrating connectors and their third-party libraries with the assistance of GitHub Copilot, and seamlessly exposing the integrations as APIs in WSO2 API Manager.