A Low-Code Way for Serverless File Extraction and Transfer Using Kumologica

Kumologica · 5 min read · Aug 28, 2021

Extracting a file from a remote FTP server and delivering its content to a destination system is a common use case in small, medium, and large enterprises. There are multiple ways enterprises deal with this use case: most depend on a file transfer product or end up building a custom solution on VMs. In both cases, the total cost of ownership (TCO) for a simple file extraction and processing solution is high compared to a serverless solution.

In this article we are going to build a simple serverless file extraction and content transformation service using Kumologica. For those who are new to Kumologica, I recommend going through our articles and YouTube videos to get an overview.

Training & Certification
For a limited time, Kumologica offers free online certification for developers at https://training.kumologica.com/

Use case

The lab team in ABC enterprise shares lab results with the processing team every night at midnight via an FTP server. The result file is a zip archive containing multiple CSV files. Each CSV file has to be placed into an S3 folder individually so that the processing team can access it.

High-level diagram

Prerequisites

  1. Kumologica Designer: download the designer for building the flow.
  2. FTP server access.
  3. AWS S3: the destination for the CSV files sent by the Kumologica flow.
  4. AWS Lambda: for running the Kumologica flow as a Lambda function.

Install the ZIP node and FTP node from the "Add more nodes" panel in the Kumologica designer.

Implementation

Let’s start building the extraction service flow, which will be triggered once every night.

Extraction Service flow

Steps

  1. Open Kumologica Designer, click the Home button, and choose Create New Kumologica Project.
  2. Enter a name (for example, ExtractService) and select a directory for the project.
  3. Press the Create button.

4. Drag and drop the EventListener node onto the canvas and provide the following configuration.

Display Name : Trigger 11:30 PM
Provider : AWS
Event Source : Amazon CloudWatch Events

5. Add a Logger node and provide the following configuration, then wire the Logger node to the EventListener node. The Message field is a JSONata expression: & concatenates strings and $now() returns the current timestamp.

Message : 'Triggered at : ' & $now()
Log format : string

6. Add the FTP node from the palette and provide the following configuration. Wire the node to the Logger node.

Display Name : Read From FTP
Hostname : your FTP hostname
Port : your FTP port (default 21)
Username : your FTP username
Password : your FTP password
Operation : Get
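
Conceptually, the Get operation downloads the remote file into msg.payload as raw bytes. For readers who prefer to see the logic in code, here is a rough Python equivalent using only the standard library; the hostname, credentials, and remote path are placeholders, not part of the flow:

import io
from ftplib import FTP

def fetch_file(host: str, user: str, password: str, remote_path: str) -> bytes:
    """Download a single file from an FTP server into memory."""
    buffer = io.BytesIO()
    with FTP(host) as ftp:  # connects on construction
        ftp.login(user=user, passwd=password)
        ftp.retrbinary(f"RETR {remote_path}", buffer.write)
    return buffer.getvalue()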

7. Drag and drop the ZIP node onto the canvas and provide the following configuration. Wire the node to the FTP node.

Operation : Extract
Content : msg.payload
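
The Extract operation unpacks the archive in memory into a list of file entries. A minimal Python sketch of the same idea, assuming the zip arrives as raw bytes and each entry is represented as a filename/content pair (the field names are illustrative, not the node's internal format):

import io
import zipfile

def extract_entries(zip_bytes: bytes) -> list:
    """Unpack an in-memory zip archive into filename/content pairs."""
    entries = []
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for name in archive.namelist():
            entries.append({"filename": name, "content": archive.read(name)})
    return entries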

8. Add a Set-Property node and configure the following. This stores the extracted payload in a separate object so that it is not lost during subsequent processing in the flow. Wire the Set-Property node to the ZIP node.

Operation :  Set
Target : msg.files
Source : msg.payload

9. Add a ForEach node to the canvas to iterate over the files retrieved by the ZIP node's extraction. Provide the following configuration for the ForEach node. Wire the ForEach node to the Set-Property node.

Input Array : msg.files
Await? : Checked

10. Drag and drop the S3 node from the palette and provide the following configuration. Ensure that the S3 bucket exists in your AWS account. Wire the S3 node to the ForEach node.

Display Name : SendToS3
Operation : PutObject
Bucket : results-csv (S3 bucket names must be lowercase)
Key : msg.payload.filename
Content : msg.payload.content
Content Type : text/csv

Each object iterated by the ForEach node is exposed as msg.payload, which is why the Key and Content fields above reference msg.payload.filename and msg.payload.content.
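
For comparison, the same PutObject step in plain Python with boto3 looks like the sketch below (the bucket name and entry field names are illustrative):

import boto3

s3 = boto3.client("s3")

def upload_csv(entry: dict, bucket: str = "results-csv") -> None:
    """Upload one extracted CSV to S3, mirroring the PutObject configuration above."""
    s3.put_object(
        Bucket=bucket,
        Key=entry["filename"],
        Body=entry["content"],
        ContentType="text/csv",
    )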

11. Now we will end the iteration by adding the ForEachEnd node. Wire the ForEachEnd node to the S3 node.

12. Finally, we will add the EventListener End node to complete the flow. Provide the following configuration and wire the EventListener End node to the ForEachEnd node.

Display Name :  Success
Status Code : 200
Content-Type : text/plain
Payload : 'Completed'
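
Putting the sketches together, the whole flow reduces to a few lines of plain Python. This reuses the hypothetical fetch_file, extract_entries, and upload_csv helpers sketched above; the hostname, credentials, and remote path are placeholders:

def run_extraction() -> str:
    """End-to-end equivalent of the flow: fetch the zip, extract it, upload each CSV."""
    zip_bytes = fetch_file("ftp.example.com", "labuser", "secret", "/lab/results.zip")
    files = extract_entries(zip_bytes)   # stored as msg.files in the flow
    for entry in files:                  # the ForEach loop
        upload_csv(entry)                # the S3 PutObject step
    return "Completed"                   # the EventListener End payload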

Deploying to AWS

  1. Select the AWS icon on the right-hand vertical tab of the Kumologica designer and select your AWS profile.

Note: If you haven’t mapped your local AWS profile to the designer, you can follow the video below to configure it.

Mapping the AWS profile to Kumologica designer

2. Click Connect. If the connection succeeds, the rest of the configuration options will be shown.

3. Set the Memory to 512 MB and the Timeout to 20 seconds.

4. Go to the “Trigger” section under the cloud tab, select Amazon CloudWatch Event, and click `+Event`. The default trigger is the API Gateway trigger.

Click the ‘-’ button to remove the API Gateway trigger.

5. Now provide the Rule ARN.

To get the rule ARN, create a CloudWatch Events rule with a cron schedule for the nightly trigger of the flow.
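
As an illustration, such a rule can be created with a few lines of boto3 (the rule name is arbitrary; the cron expression below fires every day at 11:30 PM UTC):

import boto3

events = boto3.client("events")

# cron(minutes hours day-of-month month day-of-week year), evaluated in UTC
response = events.put_rule(
    Name="nightly-lab-results-extract",
    ScheduleExpression="cron(30 23 * * ? *)",
)
print(response["RuleArn"])  # paste this ARN into the designer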

6. Now click Deploy.

Check the AWS Lambda service in your AWS account to confirm that the extraction service function has deployed successfully.

Conclusion

This article has shown how easy it is to build a file extraction and processing service in Kumologica using the FTP, ZIP, and S3 nodes.

Remember, Kumologica is completely free to download and use. Go ahead and give it a try; we would love to hear your feedback.

Kumologica is the first low-code development solution that makes your integration services run on serverless compute regardless of the cloud provider.