Process CSV Records Using Salesforce Bulk API 2.0

 

In this tutorial, we will demonstrate how to process CSV records using Salesforce Bulk API 2.0.

Bulk API 2.0:

  • Bulk API 2.0 provides a simple interface to quickly load large amounts of data into your Salesforce org and to perform bulk queries on your org data.
  • Bulk API 2.0 simplifies uploading large amounts of data by breaking the data into batches automatically. All you have to do is upload a CSV file with your record data.
  • We can process up to 100 million records per 24-hour period.
  • Salesforce internally creates a separate batch for every 10,000 records in your job data.

Ref: https://developer.salesforce.com/docs/atlas.en-us.api_bulk_v2.meta/api_bulk_v2/introduction_bulk_api_2.htm

The Salesforce Connector's Create job bulk api v2 operation is used to insert, update, delete, or upsert data.

Prerequisites:

  • Salesforce Connector installed (you can install it by logging into Anypoint Exchange from Anypoint Studio)
  • Salesforce developer account
  • Security token
  • Consumer key and secret (if you are using the Salesforce connector to access an OAuth API, you also need a consumer key and secret)
  • The CSV file used to load data into Salesforce should be in UTF-8 format and cannot be larger than 100 MB

 

Ref: https://developer.salesforce.com/docs/atlas.en-us.api_bulk_v2.meta/api_bulk_v2/datafiles_prepare_csv.htm

Create a project in Anypoint Studio and configure the listener to trigger the flow.
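As a rough sketch, the flow skeleton in the Mule configuration XML could look like the following (the listener path and port here are assumptions for this example, not values from the post):

<http:listener-config name="HTTP_Listener_config">
    <http:listener-connection host="0.0.0.0" port="8081"/>
</http:listener-config>

<flow name="bulk-api-v2-flow">
    <!-- Trigger the flow over HTTP; the path is an assumption -->
    <http:listener config-ref="HTTP_Listener_config" path="/bulkinsert"/>
    <!-- The File Read and Create job bulk api v2 operations are added in the next steps -->
</flow>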

Drag and drop the Create job bulk api v2 operation from the Mule Palette.
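The operation needs a Salesforce connection configuration. A minimal basic-authentication sketch (the property placeholders are assumptions) looks like this:

<salesforce:sfdc-config name="Salesforce_Config">
    <!-- Username/password authentication using the security token from the prerequisites -->
    <salesforce:basic-connection
        username="${salesforce.username}"
        password="${salesforce.password}"
        securityToken="${salesforce.token}"/>
</salesforce:sfdc-config>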

 


 

Configure the File Read operation to read the CSV file.
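A sketch of the Read operation (the working directory and file name are assumptions):

<file:config name="File_Config">
    <file:connection workingDir="/data/bulk"/>
</file:config>

<!-- Reads the CSV file relative to the working directory -->
<file:read config-ref="File_Config" path="accounts.csv"/>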

 


 

Change the MIME type (to application/csv for this use case) and the streaming value on the Read operation.
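In the XML, the same Read operation with the MIME type and a streaming strategy set might look like this (repeatable-file-store-stream is one of several Mule streaming strategies; choose whichever fits your memory profile):

<file:read config-ref="File_Config" path="accounts.csv" outputMimeType="application/csv">
    <!-- Stream the file instead of loading it into memory all at once -->
    <repeatable-file-store-stream/>
</file:read>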

 


 

Configure the below parameters for the Create job bulk api v2 operation (a configuration sketch follows this list):

ObjectType: the type of the Salesforce object

sObjects: an array of one or more objects (the payload)

Operation: the operation to be invoked: insert/update/delete/upsert

Line ending: the line ending of the CSV file, LF or CRLF

Column delimiter: backquote (`), caret (^), comma, pipe (|), semicolon, or tab

External Id field name: required only for the upsert operation
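Put together, the operation might look like the sketch below. The element and attribute names follow the connector's usual conventions but may vary slightly by connector version, and the Account object, comma delimiter, and LF line ending are assumptions for this example:

<salesforce:create-job-bulk-api-v-2 config-ref="Salesforce_Config"
        objectType="Account"
        operation="insert"
        lineEnding="LF"
        columnDelimiter="COMMA">
    <!-- The CSV content read from the file is passed as the job data -->
    <salesforce:objects><![CDATA[#[payload]]]></salesforce:objects>
</salesforce:create-job-bulk-api-v-2>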

 


 

Transform the payload to get the response from the Salesforce batch job:

%dw 2.0
output application/json
---
payload
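The response carries the Salesforce job information. For reference, a typical response body resembles the following (the values are illustrative, not from the original run):

{
  "id": "7507Q000009DuzXQAS",
  "operation": "insert",
  "object": "Account",
  "createdDate": "2021-09-01T10:15:30.000+0000",
  "state": "UploadComplete",
  "contentType": "CSV",
  "apiVersion": 52.0
}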

 

Deploy the application and trigger the flow. Create the test data as a CSV file.
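For example, for the Account object the test CSV could contain rows like these (illustrative data):

Name,BillingCity,Phone
Test Account 1,Hyderabad,9999900001
Test Account 2,Bangalore,9999900002
Test Account 3,Chennai,9999900003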

 

As shown below, all the records (more than 900) present in the CSV are processed within a few seconds.

We can check the status of the batch job after logging into Salesforce.

Go to Setup and search for Bulk Data Load Jobs in the Quick Find box, as shown below.

 

 

Sample application: InsertBatchRecords

SoapUI project: bulkapiv2-soapui-project

  
Thank you for taking the time to read the above post. We hope you found it useful. In case of any questions, feel free to comment below. Also, if you are keen on knowing about a specific topic, we are happy to explore your recommendations as well.

For any latest updates or posts on our website, you can follow us on LinkedIn. We look forward to connecting with you there.


Comments:
KP-ke (3 years ago):

How will this scenario be achieved when there is a parent-child relationship,
say parent: Object A
child: Object B?

Ideally, Object A has to be upserted first, but the job will take time to complete, even more than 10 minutes.

In that case, will Bulk API v2 work?

Messi (1 year ago), in reply to KP-ke:

I am facing the same problem while trying to upsert two or more objects connected via a lookup.