How to Automate Data Transfer from Amazon S3 to Zendesk using Amazon AppFlow


Hi!

In this blog, let's learn how to transfer data stored in Amazon S3 to Zendesk using Amazon AppFlow.

Amazon AppFlow lets you create bidirectional data flows between AWS services and SaaS applications without writing any code.

What are we going to do?

Let's take a CSV file with the following columns and entries. We'll create new end users in Zendesk from these entries via S3 and AppFlow.
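The exact columns in my file aren't reproduced here, but a minimal CSV for creating end users could look something like this (name and email are assumed columns; adjust to whatever fields your Zendesk users need):

```
name,email
Taro Yamada,taro.yamada@example.com
Hanako Sato,hanako.sato@example.com
```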

Let's get started!

We need an S3 bucket with two folders. I named one 'demo' to store the CSV file and the other 'error' to store any errors we might get.
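If you prefer doing this from code rather than the console, a minimal boto3 sketch for putting the CSV into the 'demo' folder might look like this (the bucket name 'appflow-zendesk-demo' and the file name 'users.csv' are placeholders):

```python
# Upload the CSV file into the 'demo' folder of the bucket with boto3.
# Bucket and file names are placeholders; replace them with your own values.
import boto3

s3 = boto3.client("s3")
s3.upload_file("users.csv", "appflow-zendesk-demo", "demo/users.csv")
```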

Let's go to Amazon AppFlow and create a new flow.

The first page of the flow is about metadata. Give the flow a name of your choice. We'll leave the optional info as it is and move on; click Next.

Now it gets interesting. Choose Amazon S3 as the source and select your bucket. Choose Zendesk as the destination and create a new connection.

Go to your Zendesk account and, under Zendesk API, click OAuth Clients.

The client name can be anything you want. We'll leave the optional fields blank. The unique identifier will become your client ID.

Copy your AWS console URL up to '/appflow', paste it into the 'Redirect URLs' field in the Zendesk window, and append 'oauth' at the end.

When you save, Zendesk will show you a client secret. Copy it right away, because you only get one chance to see it.

Paste it into the client secret field in the AWS AppFlow window.

In the Account section, enter your Zendesk subdomain: the part of your Zendesk URL between 'https://' and '.zendesk.com' (for example, 'mycompany' for https://mycompany.zendesk.com). The connection name can be anything of your choice (without spaces).

Next, it will ask whether you want AppFlow to access your Zendesk account. Click Allow.

Under 'Choose Zendesk object', select Users. Next, under 'Write data that couldn't be transferred', select your S3 bucket and the error folder. This way, if some records fail to transfer, you can read the error details in the error folder.
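Later, if a run partially fails, you can quickly list what AppFlow wrote into that folder. A small boto3 sketch (same placeholder bucket name as before):

```python
# List any error files AppFlow wrote to the 'error' folder after a failed run.
import boto3

s3 = boto3.client("s3")
result = s3.list_objects_v2(Bucket="appflow-zendesk-demo", Prefix="error/")
for obj in result.get("Contents", []):
    print(obj["Key"])
```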

Under 'Flow trigger', the first option, 'Run on demand', means the flow pulls data from S3 and writes it to Zendesk only when you start it manually (we'll do that with the 'Run flow' button at the end).

The second option is 'Run flow on schedule', which lets you set a recurring schedule for when your data should be transferred.

Let's carry on with 'Run on demand' for now.
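We'll trigger the flow from the console in this post, but for reference, an on-demand flow can also be started programmatically. A minimal boto3 sketch (the flow name 's3-to-zendesk-users' is a placeholder for whatever you named your flow):

```python
# Start an on-demand AppFlow flow from code instead of the console.
# 's3-to-zendesk-users' is a placeholder; use the flow name you created.
import boto3

appflow = boto3.client("appflow")
response = appflow.start_flow(flowName="s3-to-zendesk-users")
print(response["executionId"])  # ID of this particular run
```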

Next, we have to do the mapping. We have three options here. The first, 'Insert new records', takes all the data from the CSV file and transfers it as new records. The second, 'Update existing records', doesn't add new records but updates any record it finds already existing. The third, 'Upsert records', is the one we'll choose because it combines the other two: if a user from the CSV file doesn't exist yet, it is added; if it already exists, it is updated with the new entries.

We need a unique identifier to match records between S3 and Zendesk. Let's choose email as our unique identifier.

Now let's do the rest of the mapping. Click 'Map all fields' under Amazon S3. The left side is S3 and the right side is Zendesk. Double-check that the mapping is correct.

We don't need to map email again since we already used it as the unique identifier. You can remove any rows you don't want to map.

Next is adding validations. For example, if a row doesn't have an email, we can tell AppFlow to ignore that row and move on to the next one.
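AppFlow's validations handle this during the transfer, but you can also pre-check the file locally before uploading it. A small sketch, assuming the name/email columns from the sample file earlier, that drops rows without an email:

```python
# Drop rows that have no email before uploading the CSV to S3.
# Assumes the 'name,email' columns from the sample file above.
import csv

with open("users.csv", newline="") as src, open("users_clean.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if row.get("email", "").strip():  # keep only rows with an email
            writer.writerow(row)
```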

Let's click Next. We'll skip filters since we don't need them. On the review page, check that all your settings are correct and click Next. After that, click 'Run flow' in the top right corner.

Wait a little while and it should display a success message. If not, let me know in the comments and I'll be happy to help.
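If you'd rather check the result from code, the AppFlow API also exposes the run history. A hedged sketch with boto3 (same placeholder flow name as before):

```python
# Look up the most recent run of the flow and print its status.
import boto3

appflow = boto3.client("appflow")
records = appflow.describe_flow_execution_records(flowName="s3-to-zendesk-users", maxResults=1)
for execution in records["flowExecutions"]:
    print(execution["executionId"], execution["executionStatus"])
```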

Then you can check the People page in Zendesk to confirm that your end users have been added.
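You can also confirm this through the Zendesk API instead of the UI. A small sketch using the requests library (the subdomain, email, and API token below are placeholders, and this assumes API token authentication is enabled in your Zendesk account):

```python
# List Zendesk end users via the Users API to confirm the import worked.
# Subdomain, email, and API token below are placeholders.
import requests

subdomain = "mycompany"
auth = ("admin@example.com/token", "YOUR_API_TOKEN")  # Zendesk API token auth

resp = requests.get(
    f"https://{subdomain}.zendesk.com/api/v2/users.json",
    params={"role": "end-user"},
    auth=auth,
)
resp.raise_for_status()
for user in resp.json()["users"]:
    print(user["name"], user["email"])
```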

I have hidden the other members for security reasons. I'll also be deleting my flow so that the client secret I generated for the Zendesk-AppFlow connection can't be misused.

Thank you so much for your time.

Happy learning:)