Snowflake DSA-C02 SnowPro Advanced: Data Scientist Certification Exam Practice Test

Total 65 questions
Question 1

All Snowpark ML modeling and preprocessing classes are in the ________ namespace?



Answer : D

All Snowpark ML modeling and preprocessing classes are in the snowflake.ml.modeling namespace. Each Snowpark ML module has the same name as the corresponding module in the sklearn namespace. For example, the Snowpark ML module corresponding to sklearn.calibration is snowflake.ml.modeling.calibration.

The xgboost and lightgbm modules correspond to snowflake.ml.modeling.xgboost and snowflake.ml.modeling.lightgbm, respectively.

Not all of the classes from scikit-learn are supported in Snowpark ML.
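As a brief illustration, here is a minimal sketch of the namespace mapping, assuming the snowflake-ml-python package is installed; the column names and the Snowpark DataFrame df referenced in the comments are hypothetical:

# Snowpark ML mirrors the sklearn module layout under snowflake.ml.modeling.
from snowflake.ml.modeling.preprocessing import MinMaxScaler   # ~ sklearn.preprocessing
from snowflake.ml.modeling.xgboost import XGBClassifier        # ~ xgboost

# Snowpark ML estimators operate on Snowpark DataFrames and are configured
# with input_cols / label_cols / output_cols instead of X / y arrays.
scaler = MinMaxScaler(input_cols=["FEATURE_1"], output_cols=["FEATURE_1_SCALED"])
clf = XGBClassifier(
    input_cols=["FEATURE_1_SCALED", "FEATURE_2"],
    label_cols=["LABEL"],
    output_cols=["PREDICTION"],
)
# scaled_df = scaler.fit(df).transform(df)     # df: a Snowpark DataFrame (assumed to exist)
# predictions = clf.fit(scaled_df).predict(scaled_df)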


Question 2

Which of the following are correct rules when using a data science model created via an external function in Snowflake?



Answer : A, B, C, D

From the perspective of a user running a SQL statement, an external function behaves like any other UDF. External functions follow these rules:

External functions return a value.

External functions can accept parameters.

An external function can appear in any clause of a SQL statement in which other types of UDF can appear. For example:

select my_external_function_2(column_1, column_2)
from table_1;

select col1
from table_1
where my_external_function_3(col2) < 0;

create view view1 (col1) as
    select my_external_function_5(col1)
    from table9;

An external function can be part of a more complex expression:

select upper(zipcode_to_city_external_function(zipcode))
from address_table;

The returned value can be a compound value, such as a VARIANT that contains JSON.

External functions can be overloaded; two different functions can have the same name but different signatures (different numbers or data types of input parameters).
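For example, the following sketch creates two external functions that share a name but differ in signature. The function name, API integration, and endpoint URL are hypothetical, and the statements are run through the Snowpark Python session.sql API under assumed credentials:

from snowflake.snowpark import Session

# Hypothetical connection details.
connection_parameters = {"account": "<account>", "user": "<user>", "password": "<password>"}
session = Session.builder.configs(connection_parameters).create()

# Two external functions named GEOCODE coexist because their signatures differ
# (one VARCHAR parameter vs. two VARCHAR parameters).
session.sql("""
    CREATE OR REPLACE EXTERNAL FUNCTION geocode(address VARCHAR)
      RETURNS VARIANT
      API_INTEGRATION = my_api_integration
      AS 'https://example.com/geocode'
""").collect()

session.sql("""
    CREATE OR REPLACE EXTERNAL FUNCTION geocode(address VARCHAR, country VARCHAR)
      RETURNS VARIANT
      API_INTEGRATION = my_api_integration
      AS 'https://example.com/geocode'
""").collect()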


Question 3

Which of the following is a useful tool for gaining insights into the relationship between features and predictions?



Answer : C

A partial dependence plot (PDP) is a useful tool for gaining insight into the relationship between features and predictions. It helps us understand how different values of a particular feature impact the model's predictions.
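As background only (not part of the exam item), a minimal scikit-learn sketch using synthetic data shows how a PDP is produced:

from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import PartialDependenceDisplay
import matplotlib.pyplot as plt

# Synthetic regression data and a fitted model.
X, y = make_regression(n_samples=500, n_features=4, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X, y)

# Plot how the model's average prediction changes as features 0 and 2 vary,
# marginalizing over the other features.
PartialDependenceDisplay.from_estimator(model, X, features=[0, 2])
plt.show()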


Question 4

How do you handle missing or corrupted data in a dataset?



Answer : D
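The answer choices are not reproduced here. As general background on the topic only, here is a minimal pandas/scikit-learn sketch of two common strategies (dropping rows versus imputing values) on a hypothetical DataFrame:

import pandas as pd
from sklearn.impute import SimpleImputer

# Hypothetical data containing missing values.
df = pd.DataFrame({"age": [25, None, 40, 31], "income": [50000, 62000, None, 58000]})

# Strategy 1: drop any row that contains a missing value.
dropped = df.dropna()

# Strategy 2: impute missing values, e.g. with each column's median.
imputer = SimpleImputer(strategy="median")
imputed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)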


Question 5

Which of the following is not a type of window function in Snowflake?



Answer : C, D

Window Functions

A window function operates on a group ("window") of related rows.

Each time a window function is called, it is passed a row (the current row in the window) and the window of rows that contain the current row. The window function returns one output row for each input row. The output depends on the individual row passed to the function and the values of the other rows in the window passed to the function.

Some window functions are order-sensitive. There are two main types of order-sensitive window functions:

Rank-related functions.

Window frame functions.

Rank-related functions list information based on the "rank" of a row. For example, if you rank stores in descending order by profit per year, the store with the most profit will be ranked 1; the second-most profitable store will be ranked 2, etc.

Window frame functions allow you to perform rolling operations, such as calculating a running total or a moving average, on a subset of the rows in the window.
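To make the two types concrete, here is a minimal Snowpark Python sketch (hypothetical credentials and in-memory sample data) that computes a rank and a running average over the same window ordering:

from snowflake.snowpark import Session, Window
from snowflake.snowpark.functions import col, rank, avg

# Hypothetical connection details.
connection_parameters = {"account": "<account>", "user": "<user>", "password": "<password>"}
session = Session.builder.configs(connection_parameters).create()

df = session.create_dataframe(
    [("store_a", 100), ("store_b", 250), ("store_c", 175)],
    schema=["store", "profit"],
)

# Rank-related: rank stores by profit in descending order.
rank_window = Window.order_by(col("profit").desc())

# Window frame: a running (cumulative) average over the same ordering.
frame_window = rank_window.rows_between(Window.UNBOUNDED_PRECEDING, Window.CURRENT_ROW)

df.select(
    "store",
    "profit",
    rank().over(rank_window).alias("profit_rank"),
    avg("profit").over(frame_window).alias("running_avg_profit"),
).show()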


Question 6

Which object records data manipulation language (DML) changes made to tables, including inserts, updates, and deletes, as well as metadata about each change, so that actions can be taken using the changed data in data science pipelines?



Answer : C

A stream object records data manipulation language (DML) changes made to tables, including inserts, updates, and deletes, as well as metadata about each change, so that actions can be taken using the changed data. This process is referred to as change data capture (CDC). An individual table stream tracks the changes made to rows in a source table. A table stream (also referred to as simply a "stream") makes a "change table" available of what changed, at the row level, between two transactional points of time in a table. This allows querying and consuming a sequence of change records in a transactional fashion.

Streams can be created to query change data on the following objects:

* Standard tables, including shared tables
* Views, including secure views
* Directory tables
* Event tables
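A minimal Snowpark Python sketch of this CDC pattern follows; the credentials, table, and stream names are hypothetical:

from snowflake.snowpark import Session

# Hypothetical connection details.
connection_parameters = {"account": "<account>", "user": "<user>", "password": "<password>"}
session = Session.builder.configs(connection_parameters).create()

# A source table and a stream that tracks its DML changes.
session.sql("CREATE OR REPLACE TABLE orders (order_id INT, amount NUMBER(10, 2))").collect()
session.sql("CREATE OR REPLACE STREAM orders_stream ON TABLE orders").collect()

# New DML against the source table...
session.sql("INSERT INTO orders (order_id, amount) VALUES (1, 99.50)").collect()

# ...shows up in the stream along with CDC metadata columns such as
# METADATA$ACTION, METADATA$ISUPDATE, and METADATA$ROW_ID.
session.sql("SELECT * FROM orders_stream").show()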


Question 7

A data scientist, acting as a data provider, needs to allow consumers to access all databases and database objects in a share by granting a single privilege on the shared database. Which one of the SnowSQL commands she used for this task is incorrect?

Assuming:

A database named product_db exists with a schema named product_agg and a table named Item_agg.

The database, schema, and table will be shared with two accounts named xy12345 and yz23456.

1. USE ROLE accountadmin;

2. CREATE DIRECT SHARE product_s;

3. GRANT USAGE ON DATABASE product_db TO SHARE product_s;

4. GRANT USAGE ON SCHEMA product_db.product_agg TO SHARE product_s;

5. GRANT SELECT ON TABLE sales_db.product_agg.Item_agg TO SHARE product_s;

6. SHOW GRANTS TO SHARE product_s;

7. ALTER SHARE product_s ADD ACCOUNTS=xy12345, yz23456;

8. SHOW GRANTS OF SHARE product_s;



Answer : C

CREATE SHARE product_s; is the correct SnowSQL command for creating a share object, so the CREATE DIRECT SHARE statement in step 2 is incorrect.

The remaining commands are correct.

https://docs.snowflake.com/en/user-guide/data-sharing-provider#creating-a-share-using-sql

