Sync File Data
This guide summarizes how to create bi-directional, real-time syncs in a many-to-one data ecosystem. In this case, from multiple customer FTP Servers to Amazon S3.

In this guide, we're using the Event SDK to create our FTP Sources. Keep in mind, you can also use the Developer Dashboard to create FTP Sources.
To start using the Event SDK, install the package:
Node.js
npm install @event-inc/connections
We'll start off by initializing the client, which requires your Event API key. You can specify the environment as either "sandbox" or "production".
Node.js
import { createClient, createSource } from '@event-inc/connections';

const client = createClient(process.env.EVENT_INC_API_KEY, {
  environment: "sandbox"
});
Next, we'll create an FTP source using the createSource function. Here's an example:
TypeScript
const serverA = await createSource<"ftp">(client, {
  type: "ftp",
  group: "customer-abc",
  label: "Supply Chain Production Server",
  config: {
    FTP_HOST: process.env.CUSTOMER_ABC_FTP_HOST,
    FTP_USER: process.env.CUSTOMER_ABC_FTP_USER,
    FTP_PASSWORD: process.env.CUSTOMER_ABC_FTP_PASSWORD,
    FTP_PATH: process.env.CUSTOMER_ABC_FTP_PATH,
    FTP_PORT: process.env.CUSTOMER_ABC_FTP_PORT,
    FTP_SCAN_INTERVAL: "*/15 * * * *", // cron expression: scan the server every 15 minutes
    FTP_FILE_ARCHIVING_ENABLED: "Y",
    FTP_FILE_EXTRACTION_ENABLED: "Y",
    FTP_EXTRACTOR_FILE_TYPE: "csv",
    FTP_EXTRACTOR_FILE_SIZE_LIMIT: "2GB",
    FTP_EXTRACTOR_RECORD_COUNT_LIMIT: "25"
  }
});
Then, we'll create a second FTP source for our second customer using the same createSource function. In this code example, notice how we've tied this server to the unique customer using the group field.
TypeScript
const serverB = await createSource<"ftp">(client, {
  type: "ftp",
  group: "customer-xyz",
  label: "Supply Chain Production Server",
  config: {
    FTP_HOST: process.env.CUSTOMER_XYZ_FTP_HOST,
    FTP_USER: process.env.CUSTOMER_XYZ_FTP_USER,
    FTP_PASSWORD: process.env.CUSTOMER_XYZ_FTP_PASSWORD,
    FTP_PATH: process.env.CUSTOMER_XYZ_FTP_PATH,
    FTP_PORT: process.env.CUSTOMER_XYZ_FTP_PORT,
    FTP_SCAN_INTERVAL: "*/15 * * * *",
    FTP_FILE_ARCHIVING_ENABLED: "Y",
    FTP_FILE_EXTRACTION_ENABLED: "Y",
    FTP_EXTRACTOR_FILE_TYPE: "csv",
    FTP_EXTRACTOR_FILE_SIZE_LIMIT: "2GB",
    FTP_EXTRACTOR_RECORD_COUNT_LIMIT: "25"
  }
});
You can use this code template to connect to any number of customer FTP Servers.
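For example, here's a minimal sketch of how you might register one FTP source per customer from a list. The customers array and the environment-variable prefix convention are assumptions made for this example; the createSource call itself mirrors the ones above.
TypeScript
// Illustrative sketch: register one FTP source per customer.
// The `customers` array and env-var naming convention are assumptions for this example.
const customers = [
  { group: "customer-abc", envPrefix: "CUSTOMER_ABC" },
  { group: "customer-xyz", envPrefix: "CUSTOMER_XYZ" },
];

const sources = await Promise.all(
  customers.map(({ group, envPrefix }) =>
    createSource<"ftp">(client, {
      type: "ftp",
      group,
      label: "Supply Chain Production Server",
      config: {
        FTP_HOST: process.env[`${envPrefix}_FTP_HOST`],
        FTP_USER: process.env[`${envPrefix}_FTP_USER`],
        FTP_PASSWORD: process.env[`${envPrefix}_FTP_PASSWORD`],
        FTP_PATH: process.env[`${envPrefix}_FTP_PATH`],
        FTP_PORT: process.env[`${envPrefix}_FTP_PORT`],
        FTP_SCAN_INTERVAL: "*/15 * * * *",
        FTP_FILE_ARCHIVING_ENABLED: "Y",
        FTP_FILE_EXTRACTION_ENABLED: "Y",
        FTP_EXTRACTOR_FILE_TYPE: "csv",
        FTP_EXTRACTOR_FILE_SIZE_LIMIT: "2GB",
        FTP_EXTRACTOR_RECORD_COUNT_LIMIT: "25"
      }
    })
  )
);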
In this guide, we're using the Developer Dashboard to create our Destination. Keep in mind, you can also use the Event SDK to create Amazon S3 Destinations.
Now that you're streaming your customer FTP Server data into Event, it's time to connect the internal Amazon S3 bucket that will receive that data.
Step 1: Click the Destination tab of the Developer Dashboard

Step 2: Click the + New button

Step 3: Select the Amazon S3 Destination from the catalog

Step 4: Add your Connection credentials

Step 5: Test your Connection
You can test your connection credentials to make sure they connect successfully before connecting permanently.
Step 6: Click Connect
That's it! Your data Destination is now connected.
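Prefer to stay in code? As noted above, the Event SDK can also create Amazon S3 Destinations. The sketch below is a hypothetical illustration only: it assumes a createDestination function shaped like createSource, and the config keys (AWS credentials, region, bucket) are placeholders rather than the confirmed API, so check the SDK reference for the exact names.
TypeScript
// Hypothetical sketch: registering an Amazon S3 Destination via the SDK.
// `createDestination` and the config keys below are assumptions modeled on `createSource`;
// consult the Event SDK reference for the actual function name and fields.
const s3Destination = await createDestination<"amazon-s3">(client, {
  type: "amazon-s3",
  label: "Internal Supply Chain Bucket",
  config: {
    AWS_ACCESS_KEY_ID: process.env.AWS_ACCESS_KEY_ID,
    AWS_SECRET_ACCESS_KEY: process.env.AWS_SECRET_ACCESS_KEY,
    AWS_REGION: process.env.AWS_REGION,
    S3_BUCKET: "customer-csv-data"
  }
});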
To start using the Event SDK, install the package:
Node.js
npm install @event-inc/pipelines
We'll start off by initializing the client, which requires your Event API key. You can specify the environment as either "sandbox" or "production".
Node.js
import { createClient } from '@event-inc/pipelines';

const client = createClient(process.env.EVENT_INC_API_KEY, {
  environment: "sandbox"
});
Create a Pipeline using the createPipeline function and use it to orchestrate data between the Source and Destination. Here's an example:
TypeScript
import { createPipeline } from '@event-inc/pipelines';
import { Event } from "@event-inc/types";

const pipeline = await createPipeline<"ftp", "amazon-s3">(client, {
  label: "Upload CSV Data to S3",
  source: {
    key: serverA.key,
    events: ["record.parsed"],
  },
  destination: {
    key: "destination::amazon-s3::my-bucket", // copied from Developer Dashboard
    action: "upload",
  },
  extractors: [],
  transformation: {
    type: "javascript",
    func: function transform(event: Event, context: Record<string, unknown>) {
      const body = JSON.parse(event.body);
      const { file, records } = body;
      const fileName = `Sync-${file.name}-${Date.now()}.json`;
      const fileContent = records.map((record) => {
        return {
          name: record.name,
          sku: record.sku,
          quantity: record.quantity,
          manufacturer: record.manufacturer,
          orderDate: record.orderedOn,
          tracking: {
            trackingNumber: record.trackingNumber,
            transportationMethod: record.transportationMethod,
            transportationCompany: record.transportationCompany,
          },
          receiving: {
            receivedBy: record.receivedBy,
            receivingDate: record.receivedOn,
            storageLocation: record.storageLocation,
          },
          status: record.status,
        };
      });
      return {
        bucket: "customer-csv-data",
        key: fileName,
        body: JSON.stringify(fileContent)
      };
    },
  },
  settings: {
    startToCloseTimeout: "30 minutes",
    retry: {
      maximumAttempts: 10,
    },
  },
});
The above pipeline code snippet is for the first customer's pipeline. To set up the second customer's pipeline, replicate this code and swap in serverB.key as the source key.
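For instance, here's a minimal sketch that wraps the pipeline creation in a small helper and runs it once per customer server. The createUploadPipeline helper and its parameters are illustrative assumptions; the createPipeline call mirrors the example above and assumes the transform function has been extracted to module scope.
TypeScript
// Illustrative sketch: create one pipeline per customer FTP source.
// `createUploadPipeline` is a hypothetical helper for this example; it wraps the
// createPipeline call shown above with a different source key and label, and assumes
// the `transform` function from the previous snippet now lives at module scope.
async function createUploadPipeline(sourceKey: string, customer: string) {
  return createPipeline<"ftp", "amazon-s3">(client, {
    label: `Upload CSV Data to S3 (${customer})`,
    source: {
      key: sourceKey,
      events: ["record.parsed"],
    },
    destination: {
      key: "destination::amazon-s3::my-bucket", // copied from Developer Dashboard
      action: "upload",
    },
    extractors: [],
    transformation: {
      type: "javascript",
      func: transform, // the transform function shown in the example above
    },
    settings: {
      startToCloseTimeout: "30 minutes",
      retry: { maximumAttempts: 10 },
    },
  });
}

const pipelines = await Promise.all([
  createUploadPipeline(serverA.key, "customer-abc"),
  createUploadPipeline(serverB.key, "customer-xyz"),
]);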
You can observe and monitor all activity and job statuses while data moves through your pipelines using Event's monitoring layer.