Salesforce Data Architect Salesforce Certified Data Architect Exam Practice Test

Page: 1 / 14
Total 257 questions
Question 1

Which API should a data architect use when exporting 1 million records from Salesforce?



Answer : A

Using Bulk API to export 1 million records from Salesforce is the best option. Bulk API is a RESTful API that allows you to perform asynchronous operations on large sets of data. You can use Bulk API to create, update, delete, or query millions of records in batches. Bulk API is optimized for performance and scalability, and it can handle complex data loading scenarios.
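To make the mechanics concrete, here is a minimal sketch of how a Bulk API 2.0 query job request is assembled. The instance URL, API version, and SOQL query are illustrative placeholders; a real export would also poll the job status and download the CSV result pages.

```python
import json

def build_bulk_query_job(instance_url, api_version, soql):
    """Build the endpoint URL and JSON body for a Bulk API 2.0 query job.

    Bulk API 2.0 runs the query asynchronously on the server and returns
    results in CSV pages, which is what makes it suitable for exporting
    millions of records.
    """
    url = f"{instance_url}/services/data/v{api_version}/jobs/query"
    body = {"operation": "query", "query": soql}
    return url, json.dumps(body)

# Hypothetical org values, for illustration only.
url, body = build_bulk_query_job(
    "https://example.my.salesforce.com", "58.0",
    "SELECT Id, Name FROM Account")
```

The returned URL and body would be sent as an authenticated POST request; Salesforce then processes the query in the background regardless of how many records match.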


Question 2

Universal Containers (UC) has implemented a master data management strategy, which uses a central system of truth, to ensure the entire company has the same customer information in all systems. UC customer data changes need to be accurate at all times in all of the systems. Salesforce is the identified system of record for this information.

What is the correct solution for ensuring all systems using customer data are kept up to date?



Answer : D

Having each system pull the record changes from Salesforce using change data capture (option D) is the correct solution for ensuring all systems using customer data are kept up to date, as it allows the systems to subscribe to real-time events from Salesforce and receive notifications when customer records are created, updated, deleted, or undeleted. Sending customer data nightly to the system of truth in a scheduled batch job (option A) or sending customer record changes from Salesforce to each system in a nightly batch job (option B) are not good solutions, as they may cause data latency and inconsistency, and they do not provide real-time updates. Sending customer record changes from Salesforce to the system of truth in real time (option C) is also not a good solution, as it does not address how the other systems will receive the updates from the system of truth.
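As a sketch of what a subscribing system does with a change event: each Change Data Capture message carries a ChangeEventHeader (with a changeType such as CREATE, UPDATE, DELETE, or UNDELETE and the affected record IDs) plus the changed field values. The event shape below is simplified from the real CDC payload, and the local store is a plain dictionary standing in for the downstream system's database.

```python
def apply_change_event(store, event):
    """Apply a simplified Change Data Capture event to a local record store.

    `event` mimics the shape of a CDC message: a ChangeEventHeader with
    changeType and recordIds, plus the changed field values as top-level keys.
    """
    header = event["ChangeEventHeader"]
    change_type = header["changeType"]
    for record_id in header["recordIds"]:
        if change_type in ("CREATE", "UPDATE", "UNDELETE"):
            record = store.setdefault(record_id, {})
            # Merge only the changed fields into the local copy.
            record.update({k: v for k, v in event.items()
                           if k != "ChangeEventHeader"})
        elif change_type == "DELETE":
            store.pop(record_id, None)
    return store

store = {}
apply_change_event(store, {
    "ChangeEventHeader": {"changeType": "CREATE",
                          "recordIds": ["001xx000000001"]},
    "Name": "Acme Corp",
})
```

Because each system applies events as they arrive, every subscriber converges on the same customer data without nightly batch windows.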


Question 3

Universal Containers has been a customer of Salesforce for 10 years. Currently they have 2 million accounts in the system. Due to an erroneous integration built 3 years ago, it is estimated there are 500,000 duplicates in the system.

Which solution should a data architect recommend to remediate the duplication issue?



Answer : D

Implementing duplicate rules (option D) is the best solution to remediate the duplication issue, as it allows the data architect to identify and merge duplicate accounts in Salesforce using native features and tools. Developing an ETL process that utilizes the merge API to merge the duplicate records (option A) is not a good solution, as it may require more coding and testing effort, and it does not prevent duplicates from being created in Salesforce. Utilizing a data warehouse as the system of truth (option B) is also not a good solution, as it may introduce additional complexity and cost, and it does not address the duplication issue in Salesforce. Extracting the data using Data Loader and using Excel to merge the duplicate records (option C) is also not a good solution, as it may be time-consuming and error-prone, and it does not prevent duplicates from being created in Salesforce.
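The core of any remediation, whether done with native duplicate rules or a merge job, is grouping records that match on normalized values. The sketch below illustrates that grouping step; the normalization logic is a simplified stand-in for a real fuzzy matching rule.

```python
from collections import defaultdict

def normalize(name):
    """Normalize an account name roughly the way a matching rule might:
    lowercase, strip punctuation, and collapse whitespace."""
    return " ".join(name.lower().replace(",", " ").replace(".", " ").split())

def group_duplicates(accounts):
    """Group accounts whose normalized names collide.

    Returns only groups with more than one member -- the candidate sets a
    merge process would collapse into a single surviving record.
    """
    groups = defaultdict(list)
    for account in accounts:
        groups[normalize(account["Name"])].append(account)
    return {key: rows for key, rows in groups.items() if len(rows) > 1}
```

On a 2-million-account org this grouping would run in batches, but the matching principle is the same one Salesforce's standard and custom matching rules apply.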


Question 4

A large automobile company has implemented Salesforce for its sales associates. Leads flow from its website to Salesforce using a batch integration in Salesforce. The batch job converts the leads to Accounts in Salesforce. Customers visiting their retail stores are also created in Salesforce as Accounts.

The company has noticed a large number of duplicate Accounts in Salesforce. On analysis, it was found that certain customers could interact with its website and also visit the store. The sales associates use Global Search to search for customers in Salesforce before they create the customers.

Which option should a data architect choose to implement to avoid duplicates?



Answer : A

Leveraging duplicate rules in Salesforce to validate duplicates during the account creation process (option A) is the best option to implement to avoid duplicates, as it allows the sales associates to identify and merge duplicate accounts before they are saved. Developing an Apex class that searches for duplicates and removes them nightly (option B) is not a good option, as it may cause data loss or conflicts, and it does not prevent duplicates from being created in the first place. Implementing an MDM solution to validate the customer information before creating records in Salesforce (option C) is also not a good option, as it may introduce additional complexity and cost, and it does not address the issue of customers interacting with both the website and the store. Building a custom search functionality that allows sales associates to search for customers in real time upon visiting their retail stores (option D) is also not a good option, as it may not be reliable or user-friendly, and it does not leverage the existing Global Search feature.
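Unlike the batch cleanup in the previous question, a duplicate rule fires at save time. The check below sketches that pre-save detection; the combination of normalized name plus billing postal code is a hypothetical matching-rule criterion chosen for illustration.

```python
def find_duplicates(new_account, existing_accounts):
    """Return existing accounts that look like duplicates of the record
    being created -- the kind of check a duplicate rule runs before save."""
    def match_key(acct):
        # Hypothetical matching criterion: normalized name plus billing
        # postal code. Real matching rules support fuzzier comparisons.
        return (acct.get("Name", "").strip().lower(),
                acct.get("BillingPostalCode", ""))
    return [a for a in existing_accounts
            if match_key(a) == match_key(new_account)]
```

If the list is non-empty, the rule can block the save or warn the sales associate, so the duplicate never enters the org from either the website integration or the retail-store flow.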


Question 5

Universal Containers (UC) is implementing Salesforce and will be using Salesforce to track customer complaints, provide white papers on products, and provide subscription-based support.

Which license type will UC users need to fulfill UC's requirements?



Answer : C

Service Cloud License (option C) is the license type that UC users need to fulfill UC's requirements, as it allows them to track customer complaints, provide white papers on products, and provide subscription-based support. Sales Cloud License (option A) is mainly for managing sales processes and leads, Lightning Platform Starter License (option B) is for building custom apps and workflows, and Salesforce License (option D) is a generic term that does not specify a particular license type.


Question 6

UC is preparing to implement Sales Cloud and would like its users to have read-only access to an account record if they have access to its child opportunity record. How should a data architect implement this sharing requirement between objects?



Question 7

North Trail Outfitters (NTO) is in the process of evaluating big objects to store large amounts of asset data from an external system. NTO will need to report on this asset data weekly.

Which two native tools should a data architect recommend to achieve this reporting requirement?

