Sales Data Synchronization via FTP File Integration¶
This guide demonstrates how to build a file-based ETL (Extract, Transform, Load) workflow using WSO2 Integrator: BI. You will create an integration that monitors an FTP server for incoming JSON sales report files, parses the sales data, and loads each item into a MySQL database.
Scenario¶
Retail stores generate daily sales reports as JSON files and upload them to a central FTP server. Your integration must automatically detect new files, parse the sales data, and insert each item as a separate row in a database.
Flow¶
- FTP listener monitors the `/sales/new` directory for `.json` files
- Sales report is parsed and validated
- Each sale item is inserted as a separate row in the MySQL `Sales` table
- Processed files are moved to `/sales/processed`
- If any errors are encountered during processing, files are moved to `/sales/error`
Prerequisites¶
- WSO2 Integrator: BI - Install from Visual Studio Marketplace
- Docker - For running MySQL and FTP server containers
- MySQL Database - Local or containerized instance
- FTP Server - With read/write access to sales directories
Set up the environment¶
Set up MySQL database¶
Run the MySQL container:
docker run -d --name mysql-sales \
-e MYSQL_ROOT_PASSWORD=root@123 \
-e MYSQL_DATABASE=sales_db \
-p 3307:3306 \
mysql:8.0
Verify MySQL is running:
docker ps
docker logs mysql-sales
Connect to MySQL and create the Sales table:
docker exec -it mysql-sales mysql -uroot -proot@123 sales_db
Execute the following SQL:
CREATE TABLE Sales (
id INT AUTO_INCREMENT PRIMARY KEY,
store_id VARCHAR(50) NOT NULL,
store_location VARCHAR(100) NOT NULL,
sale_date DATE NOT NULL,
item_id VARCHAR(50) NOT NULL,
quantity INT NOT NULL,
total_amount DECIMAL(10, 2) NOT NULL
);
Type exit to close the MySQL connection.
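To see how the one-row-per-item layout plays out against this schema, here is a rough stand-in sketch using SQLite (Python's built-in `sqlite3`, with SQLite types approximating the MySQL ones) rather than the actual MySQL container; it is an illustration of the table shape only, not part of the integration:

```python
import sqlite3

# Mirror the Sales schema in an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE Sales (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        store_id TEXT NOT NULL,
        store_location TEXT NOT NULL,
        sale_date TEXT NOT NULL,
        item_id TEXT NOT NULL,
        quantity INTEGER NOT NULL,
        total_amount REAL NOT NULL
    )"""
)

# One row per sale item, as the integration will insert them.
rows = [
    ("STORE-42", "Colombo", "2024-01-15", "ITEM-001", 10, 250.00),
    ("STORE-42", "Colombo", "2024-01-15", "ITEM-002", 5, 175.50),
    ("STORE-42", "Colombo", "2024-01-15", "ITEM-003", 20, 890.00),
]
conn.executemany(
    "INSERT INTO Sales (store_id, store_location, sale_date, item_id, quantity, total_amount) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    rows,
)
count = conn.execute("SELECT COUNT(*) FROM Sales").fetchone()[0]
```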
Set up FTP server¶
Run the FTP server container:
docker run -d --name ftp-sales \
-p 21:21 \
-e "PUBLICHOST=localhost" \
-e "FTP_USER_NAME=ftpuser" \
-e "FTP_USER_PASS=ftppass" \
-e "FTP_USER_HOME=/home/ftpuser" \
stilliard/pure-ftpd
Create the required directories:
docker exec ftp-sales mkdir -p /home/ftpuser/sales/new /home/ftpuser/sales/processed /home/ftpuser/sales/error
# Fix permissions for write access
docker exec ftp-sales chmod -R 755 /home/ftpuser/sales
Verify the FTP server is accessible:
docker logs ftp-sales
Add sample data to the FTP server¶
Create a test file store42-2024-01-15.json:
{
"storeId": "STORE-42",
"storeLocation": "Colombo",
"saleDate": "2024-01-15",
"items": [
{"itemId": "ITEM-001", "quantity": 10, "totalAmount": 250.00},
{"itemId": "ITEM-002", "quantity": 5, "totalAmount": 175.50},
{"itemId": "ITEM-003", "quantity": 20, "totalAmount": 890.00}
]
}
Upload the file to the FTP server:
docker exec ftp-sales sh -c "cat > /home/ftpuser/sales/new/store42-2024-01-15.json << 'EOF'
{
\"storeId\": \"STORE-42\",
\"storeLocation\": \"Colombo\",
\"saleDate\": \"2024-01-15\",
\"items\": [
{\"itemId\": \"ITEM-001\", \"quantity\": 10, \"totalAmount\": 250.00},
{\"itemId\": \"ITEM-002\", \"quantity\": 5, \"totalAmount\": 175.50},
{\"itemId\": \"ITEM-003\", \"quantity\": 20, \"totalAmount\": 890.00}
]
}
EOF"
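A report is only processed cleanly if it carries the fields the integration maps to database columns. As a rough pre-upload check (an illustration written for this guide, not part of the integration itself), a few lines of Python can validate a report file before it goes to the FTP server:

```python
import json

# Fields the integration maps to Sales columns.
REQUIRED_REPORT_KEYS = {"storeId", "storeLocation", "saleDate", "items"}
REQUIRED_ITEM_KEYS = {"itemId", "quantity", "totalAmount"}

def validate_report(text: str) -> dict:
    """Parse a sales report and check the fields the integration expects."""
    report = json.loads(text)
    missing = REQUIRED_REPORT_KEYS - report.keys()
    if missing:
        raise ValueError(f"report missing keys: {missing}")
    for item in report["items"]:
        if REQUIRED_ITEM_KEYS - item.keys():
            raise ValueError(f"item missing keys: {item}")
    return report

sample = """{
  "storeId": "STORE-42",
  "storeLocation": "Colombo",
  "saleDate": "2024-01-15",
  "items": [
    {"itemId": "ITEM-001", "quantity": 10, "totalAmount": 250.00}
  ]
}"""
report = validate_report(sample)
```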
Develop the integration¶
Step 1: Create a new integration project¶
- Click on the BI icon on the sidebar.
- Click on the Create New Integration button.
- Enter the project name as `sales-data-sync`.
- Select Project Directory and click on Select Location.
- Click Create New Integration to create the project.
Step 2: Create an FTP Integration¶
- In the design view, click on the Add Artifact button.
- Select FTP / SFTP under the File Integration category.
- Select FTP as the protocol.
- Fill in the connection properties:

  | Property | Value |
  |---|---|
  | Host | `localhost` |
  | Port Number | `21` |
  | Folder Path | `/sales/new` |
  | Authentication | Basic Authentication |
  | Username | `ftpuser` |
  | Password | `ftppass` |

- Click Create to create the FTP service.
Step 3: Add file handler¶
- Click + Add File Handler and select the onCreate handler.
- Select JSON as the File Format.
- Click + Define Content Schema.
- Select Import Header.
- Paste the sample sales report JSON into the Sample Data text box.
- Rename the type to `SalesReport` and click Import Type.
- Click Save. The implementation designer opens by default.
- Add a Log Info action. In the Msg field, type `Processing file from store` followed by a space, and use the Helper Panel to select Inputs -> content -> storeId.

Note: The import binds the JSON sample to generated types, which you can view in the Types section in the left panel.
Step 4: Add a MySQL connection¶
- Click + and add a Connection.
- Select MySQL from the connectors list.
- Fill in the connection properties under Advanced Configurations:

  | Property | Value |
  |---|---|
  | Host | `localhost` |
  | User | `root` |
  | Password | `root@123` |
  | Database | `sales_db` |
  | Port | `3307` |

- Click Save Connection.
Step 5: Implement business logic¶
- Click + and add a Foreach loop.
- For Collection, use the Helper Panel to select Inputs -> content -> items.
- Enter `item` as the Variable Name and `ItemsItem` as the Variable Type.
- Click Save.
- Click + inside the foreach loop.
- Select the `mysqlClient` connection and choose the Execute operation.
- Add the following partial SQL query:

  INSERT INTO Sales (store_id, store_location, sale_date, item_id, quantity, total_amount) VALUES ()

- For the parameters, use the Helper Panel to map values from Inputs -> content and the loop variable `item` into the VALUES clause. The final statement should be:

  INSERT INTO Sales (store_id, store_location, sale_date, item_id, quantity, total_amount) VALUES (${content.storeId}, ${content.storeLocation}, ${content.saleDate}, ${item.itemId}, ${item.quantity}, ${item.totalAmount})

- Click Save.
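The foreach-plus-execute logic above amounts to flattening the report into parameter tuples, one per item, in the column order of the INSERT statement. A hypothetical Python sketch of that mapping (not the generated integration code):

```python
def to_sales_rows(content: dict) -> list[tuple]:
    """Produce one (store_id, store_location, sale_date, item_id,
    quantity, total_amount) tuple per sale item, mirroring the
    mapped VALUES clause of the Execute operation."""
    return [
        (
            content["storeId"],
            content["storeLocation"],
            content["saleDate"],
            item["itemId"],
            item["quantity"],
            item["totalAmount"],
        )
        for item in content["items"]
    ]

content = {
    "storeId": "STORE-42",
    "storeLocation": "Colombo",
    "saleDate": "2024-01-15",
    "items": [
        {"itemId": "ITEM-001", "quantity": 10, "totalAmount": 250.00},
        {"itemId": "ITEM-002", "quantity": 5, "totalAmount": 175.50},
    ],
}
rows = to_sales_rows(content)
```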
Step 6: Add post-processing logic - success¶
- Click + and select the caller connection.
- Select the Move operation.
- For Source Path, use the Helper Panel to select Inputs -> fileInfo -> pathDecoded.
- For Destination Path, type `/sales/processed/` and use the Helper Panel to select Inputs -> fileInfo -> name.
- Click + and add a Log Info action. In the Msg field, type `File moved to processed:` followed by a space, and use the Helper Panel to select Inputs -> fileInfo -> name.
Step 7: Add post-processing logic - failure¶
- Click Error Handler.
- Delete the Return error node.
- Click + and select the caller connection.
- Select the Move operation.
- For Source Path, use the Helper Panel to select Inputs -> fileInfo -> pathDecoded.
- For Destination Path, type `/sales/error/` and use the Helper Panel to select Inputs -> fileInfo -> name.
- Click + and add a Log Info action. In the Msg field, type `File moved to error:` followed by a space, and use the Helper Panel to select Inputs -> fileInfo -> name.
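Taken together, the success and failure branches implement one simple routing rule: a file keeps its name and goes to `/sales/processed/` on success or `/sales/error/` on failure. In pure-function form (an illustration written for this guide, not the generated integration code):

```python
def destination_path(file_name: str, succeeded: bool) -> str:
    """Compute where a processed file is moved, keeping its original name."""
    folder = "/sales/processed/" if succeeded else "/sales/error/"
    return folder + file_name

ok_path = destination_path("store42-2024-01-15.json", succeeded=True)
err_path = destination_path("store42-2024-01-15.json", succeeded=False)
```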
Run and test¶
Run the integration¶
- Click on the Run button in the top-right corner.
- Wait for the integration to start (check the output panel for logs).
Verify results¶
- Check the BI logs for processing messages:

  Processing file from store STORE-42
  File moved to processed: store42-2024-01-15.json
- Verify data in MySQL:

  docker exec -it mysql-sales mysql -uroot -proot@123 sales_db -e "SELECT * FROM Sales;"

  You should see 3 rows inserted, one for each item in the sales report.
- Verify the file was moved to the processed folder:

  docker exec ftp-sales ls /home/ftpuser/sales/new        # Should be empty
  docker exec ftp-sales ls /home/ftpuser/sales/processed  # Should contain store42-2024-01-15.json
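The expected table contents follow directly from the sample file: one row per item. A quick cross-check of the count and grand total you should see (computed here in Python from the sample data for this guide):

```python
# Items from store42-2024-01-15.json.
items = [
    {"itemId": "ITEM-001", "quantity": 10, "totalAmount": 250.00},
    {"itemId": "ITEM-002", "quantity": 5, "totalAmount": 175.50},
    {"itemId": "ITEM-003", "quantity": 20, "totalAmount": 890.00},
]
row_count = len(items)                              # rows expected in Sales
grand_total = sum(i["totalAmount"] for i in items)  # 1315.50
```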
Deploy on Devant¶
- Deploy this integration on Devant as a File Integration.
- Configure the FTP and MySQL connection parameters with your production values.










