Suggested Answer: B
Scenario: An Azure Data Factory pipeline must be used to move data from Cosmos DB to SQL Database for Race Central. If the data load takes longer than 20 minutes, configuration changes must be made to Data Factory. The telemetry data is sent to a MongoDB database. A custom application then moves the data to databases in SQL Server 2017. The telemetry data in MongoDB has more than 500 attributes. The application changes the attribute names when the data is moved to SQL Server 2017.

You can copy data to or from Azure Cosmos DB (SQL API) by using an Azure Data Factory pipeline. Column mapping applies when copying data from source to sink. By default, the copy activity maps source data to the sink by column names; you can specify an explicit mapping to customize the column mapping to your needs. More specifically, the copy activity:
1. Reads the data from the source and determines the source schema.
2. Uses default column mapping to map columns by name, or applies explicit column mapping if specified.
3. Writes the data to the sink.

Reference: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping
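Because the custom application renames more than 500 attributes on the way into SQL Server, the default name-based mapping would not line up, so an explicit mapping is the relevant feature here. Below is a minimal sketch of a copy activity with an explicit TabularTranslator mapping, following the schema-mapping documentation referenced above; the dataset names (CosmosDbTelemetry, SqlTelemetry) and the attribute/column pairs (carId to CarID, speed to SpeedKph) are hypothetical stand-ins, not names from the scenario.

```json
{
    "name": "CopyTelemetryToSql",
    "type": "Copy",
    "inputs": [ { "referenceName": "CosmosDbTelemetry", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SqlTelemetry", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "CosmosDbSqlApiSource" },
        "sink": { "type": "SqlSink" },
        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "path": "$.carId" }, "sink": { "name": "CarID" } },
                { "source": { "path": "$.speed" }, "sink": { "name": "SpeedKph" } }
            ]
        }
    }
}
```

Each entry in mappings pairs a source JSON path with a sink column name, which is how the attribute renaming described in the scenario would be expressed. With 500+ attributes, the mapping list would in practice likely be generated programmatically or passed in as a pipeline parameter rather than written by hand.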
Community vote distribution: A (35%), C (25%), B (20%), Other