Using Data Loader to import data into a big object

For ease, and because the CSV file used in this section is so large (nearly half a million records at the time of writing), we will use the Salesforce Data Loader tool. The process to import the historical lap time data from the Ergast website is as follows:

  1. Search for Data Loader in the Setup menu, select the applicable link to download the tool to your computer, and follow the installation instructions.
  2. Navigate to the Database images section of the site (https://ergast.com/mrd/db/) and download the f1db_csv.zip file and unzip it.
  3. Edit the lap_times.csv file to insert the following column headers as the first row: RaceId__c, DriverId__c, Lap__c, Position__c, Time__c, and Milliseconds__c (see the command-line sketch after this list for one way to do this).
  4. Run the Data Loader tool from your installed location and click the Insert button. 
  5. When prompted, log in to your scratch org by selecting OAuth and Sandbox from the Environment drop-down and clicking Login. Refer to the SFDX commands used earlier in this chapter to obtain the username and password if needed (these are also recapped in the sketch after this list).
  6. Select the Race Lap History object as shown in the following screenshot and click Next:

  7. Click OK to confirm the number of records.
  8. In Step 3: Mapping, click Create or Edit Map, then Auto-Match Fields to Columns, click OK to accept the defaults, and then click Next. If you do not see the field names, ensure you have completed step 3 of this list (adding the column headers) and then retry this step.
  9. In Step 4: Finish, confirm the output directory where the success and error files will be written and then click Finish.
  10. Click Yes when prompted to start the import and monitor the progress:

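If you need a reminder of the commands behind steps 3 and 5, the following sketch shows one way to prepare the CSV file and recover the scratch org credentials from the command line. It assumes a scratch org alias of MyScratchOrg and GNU sed; adjust both to match your setup.

    # Show the scratch org's username for the Data Loader login prompt
    sfdx force:org:display --targetusername MyScratchOrg

    # Generate and display a password for the scratch org user
    sfdx force:user:password:generate --targetusername MyScratchOrg

    # Prepend the big object field API names as the CSV header row
    sed -i '1i RaceId__c,DriverId__c,Lap__c,Position__c,Time__c,Milliseconds__c' lap_times.csv
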
Since inserting data into a big object is equivalent to an upsert operation, it's possible to retry failed uploads over and over without risking duplicate data for those records that have been inserted. This behavior is by design since big objects do not support transactions; this is the recovery path when errors occur.
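
The same semantics apply when writing big object records from Apex with the Database.insertImmediate method. The following minimal sketch assumes the big object's API name is RaceLapHistory__b with an index spanning RaceId__c, DriverId__c, and Lap__c (check your object's definition), and uses sample values shaped like the CSV data; rerunning it overwrites the matching record rather than creating a duplicate.

    // Build a single lap time record (sample values for illustration only)
    RaceLapHistory__b lap = new RaceLapHistory__b();
    lap.RaceId__c = 841;
    lap.DriverId__c = 20;
    lap.Lap__c = 1;
    lap.Position__c = 1;
    lap.Time__c = '1:38.109';
    lap.Milliseconds__c = 98109;

    // insertImmediate writes outside the current transaction and behaves like
    // an upsert keyed on the index fields, so retries are safe
    Database.SaveResult result = Database.insertImmediate(lap);
    System.debug('Success? ' + result.isSuccess());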

After the process completes, the Storage Usage page (under Setup) will update in the next few hours to show the number of records present in big objects. Salesforce sells Big Object storage by the number of records, in 50 million chunks, regardless of the size of the records. Each org gets 1 million records for free, including the scratch orgs used in this chapter!
