Exam DP-203 topic 1 question 54 discussion

Actual exam question from Microsoft's DP-203
Question #: 54
Topic #: 1

HOTSPOT -
You have an Azure Synapse Analytics dedicated SQL pool named Pool1 and an Azure Data Lake Storage Gen2 account named Account1.
You plan to access the files in Account1 by using an external table.
You need to create a data source in Pool1 that you can reference when you create the external table.
How should you complete the Transact-SQL statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Suggested Answer:
Box 1: blob -
The following example creates an external data source for Azure Data Lake Storage Gen2:
CREATE EXTERNAL DATA SOURCE YellowTaxi
WITH (
    LOCATION = 'https://azureopendatastorage.blob.core.windows.net/nyctlc/yellow/',
    TYPE = HADOOP
)

Box 2: HADOOP -
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables
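Applied to the scenario in the question (Pool1 reading Account1), the completed statement would look roughly like the sketch below. This is a sketch only: the data source name is invented, and the dfs endpoint follows the ADLS Gen2 row of the documentation's location-prefix table, whereas the suggested answer above picks blob.

```sql
-- Sketch only: source1 is a hypothetical name; the dfs host is what the
-- documentation's location-prefix table gives for ADLS Gen2 over http[s].
CREATE EXTERNAL DATA SOURCE source1
WITH (
    LOCATION = 'https://account1.dfs.core.windows.net',
    TYPE = HADOOP
);
```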

Comments

galacaw
Highly Voted 3 years, 2 months ago
1. dfs (for Azure Data Lake Storage Gen2)
upvoted 69 times
Rob77
2 years, 1 month ago
Correct, https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest&preserve-view=true&tabs=dedicated#location--prefixpath
upvoted 5 times
...
panda_azzurro
2 years, 5 months ago
dfs is not valid
upvoted 1 times
suvec
2 years, 2 months ago
dfs is valid. Location prefixes for Data Lake Storage Gen2:
abfs[s]: <container>@<storage_account>.dfs.core.windows.net
http[s]: <storage_account>.dfs.core.windows.net/<container>/subfolders
wasb[s]: <container>@<storage_account>.blob.core.windows.net
upvoted 7 times
...
...
jds0
2 years, 3 months ago
This table corroborates that "dfs" should be used for ADLS Gen 2: https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables?tabs=hadoop#location
upvoted 1 times
...
Vedjha
2 years, 4 months ago
CREATE EXTERNAL DATA SOURCE mydatasource
WITH (
    LOCATION = 'abfss://[email protected]',
    CREDENTIAL = AzureStorageCredential,
    TYPE = HADOOP
)
upvoted 9 times
...
...
Kure87
Highly Voted 2 years, 7 months ago
1. blob. According to this article https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables?tabs=hadoop we only use dfs (the abfss endpoint) when your account has secure transfer enabled. In the question, the location starts with "https://account1.", not "abfss://"
upvoted 39 times
tlb_20
1 year, 2 months ago
As written in this Microsoft example on how to create an external data source based on an Azure storage account: "TYPE = HADOOP, -- For dedicated SQL pool -- TYPE = BLOB_STORAGE, -- For serverless SQL pool" https://learn.microsoft.com/en-us/training/modules/use-azure-synapse-serverless-sql-pools-for-transforming-data-lake/2-transform-data-using-create-external-table-select-statement
upvoted 5 times
...
vadiminski_a
2 years, 3 months ago
That's not correct; you use abfss:// if you have secure transfer enabled. There is nothing wrong with using https:// when you don't have secure transfer enabled. However, for ADLS Gen2 you need to specify .dfs. ... The correct answer is: dfs, HADOOP
upvoted 7 times
...
suvec
2 years, 2 months ago
@kure87 dfs is valid. Location prefixes for Data Lake Storage Gen2:
abfs[s]: <container>@<storage_account>.dfs.core.windows.net
http[s]: <storage_account>.dfs.core.windows.net/<container>/subfolders
wasb[s]: <container>@<storage_account>.blob.core.windows.net
upvoted 6 times
JustImperius
5 months, 1 week ago
I agree with dfs, but remember this is possible on blob storage: http[s] <storage_account>.blob.core.windows.net/<container>/subfolders. I created a linked service from my Synapse to a storage account explicitly using blob.
upvoted 1 times
...
...
Sebastian1677
2 years, 6 months ago
please upvote this
upvoted 3 times
...
...
samianae
Most Recent 5 months ago
dfs and hadoop
upvoted 1 times
...
ff5037f
8 months ago
The answer is correct, as the location prefix is https. If the container location prefix is abfs then it is dfs; if it is wasb then it is blob. https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest&preserve-view=true&tabs=dedicated#location--prefixpath For Data Lake Storage Gen2, http[s]: <storage_account>.dfs.core.windows.net/<container>/subfolders
upvoted 2 times
...
ypan
12 months ago
Dedicated SQL Pool: Use CREATE EXTERNAL DATA SOURCE with TYPE = HADOOP for accessing Azure Data Lake Storage Gen2. Serverless SQL Pool: Use OPENROWSET for direct querying of the external data.
upvoted 1 times
...
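ypan's dedicated-vs-serverless split can be sketched as below. The names are illustrative assumptions, not from the question: the data source name, container, path, and Parquet format are all invented.

```sql
-- Dedicated SQL pool: define an external data source with TYPE = HADOOP.
CREATE EXTERNAL DATA SOURCE src_adls
WITH (
    LOCATION = 'https://account1.dfs.core.windows.net',
    TYPE = HADOOP
);

-- Serverless SQL pool: query the files directly with OPENROWSET
-- (hypothetical container/path; Parquet format assumed).
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://account1.dfs.core.windows.net/container1/data/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;
```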
Charley92
1 year, 2 months ago
CREATE EXTERNAL DATA SOURCE MyDataSource
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/',
    CREDENTIAL = <your-credential-name>
);
upvoted 2 times
...
dgerok
1 year, 2 months ago
answer is correct see the MS example https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables?tabs=hadoop#example-for-create-external-data-source
upvoted 1 times
...
ankeshpatel2112
1 year, 2 months ago
Correct Answer: dfs. Explanation: if your source is ADLS Gen2 then it would be "dfs", and if your source is Azure Blob Storage then "blob". Please refer to the table below from the Microsoft documentation.
External Data Source | Location prefix | Location path
Data Lake Storage Gen1 | adl | <storage_account>.azuredatalake.net
Data Lake Storage Gen2 | abfs[s] | <container>@<storage_account>.dfs.core.windows.net
Azure Blob Storage | wasbs | <container>@<storage_account>.blob.core.windows.net
Azure Blob Storage | https | <storage_account>.blob.core.windows.net/<container>/subfolders
Data Lake Storage Gen1 | http[s] | <storage_account>.azuredatalakestore.net/webhdfs/v1
Data Lake Storage Gen2 | http[s] | <storage_account>.dfs.core.windows.net/<container>/subfolders
Data Lake Storage Gen2 | wasb[s] | <container>@<storage_account>.blob.core.windows.net
upvoted 2 times
...
Alongi
1 year, 3 months ago
Should be DFS for Datalake Gen2
upvoted 1 times
...
Momoanwar
1 year, 6 months ago
Both `blob` and `dfs` endpoints work when connecting to Azure Data Lake Storage Gen2, but they serve different purposes. The `blob` endpoint is typically used for standard storage operations, while the `dfs` endpoint is optimized for hierarchical file system operations and is preferred for analytics workloads with Azure Synapse Analytics.
upvoted 1 times
Momoanwar
1 year, 6 months ago
To simply access files in Azure Data Lake Storage Gen2 for reading and analysis, without the need for Data Lake specific features like directory management or fine-grained ACLs, using the `blob` endpoint is sufficient. If your operations are primarily related to accessing files for reading, the `blob` endpoint can be used in the external data source definition within Azure Synapse Analytics.
upvoted 2 times
...
...
fahfouhi94
1 year, 9 months ago
Ans : dfs & hadoop https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest&preserve-view=true&tabs=dedicated
upvoted 1 times
...
kkk5566
1 year, 9 months ago
CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
WITH (
    -- Please note the abfss endpoint when your account has secure transfer enabled
    LOCATION = 'abfss://[email protected]',
    CREDENTIAL = ADLS_credential,
    TYPE = HADOOP
);
CREATE EXTERNAL DATA SOURCE YellowTaxi
WITH (
    LOCATION = 'https://azureopendatastorage.blob.core.windows.net/nyctlc/yellow/',
    TYPE = HADOOP
);
So: HADOOP, blob
upvoted 1 times
...
kdp203
1 year, 10 months ago
dfs should be the correct answer (ADLS Gen2)
upvoted 1 times
...
auwia
2 years ago
Confirmed HADOOP and dfs:
External Data Source | Connector | Location path
Data Lake Storage Gen1 | adl | <storage_account>.azuredatalake.net
Data Lake Storage Gen2 | abfs[s] | <container>@<storage_account>.dfs.core.windows.net
Azure Blob Storage | wasbs | <container>@<storage_account>.blob.core.windows.net
Azure Blob Storage | https | <storage_account>.blob.core.windows.net/<container>/subfolders
Data Lake Storage Gen1 | http[s] | <storage_account>.azuredatalakestore.net/webhdfs/v1
Data Lake Storage Gen2 | http[s] | <storage_account>.dfs.core.windows.net/<container>/subfolders
Data Lake Storage Gen2 | wasb[s] | <container>@<storage_account>.blob.core.windows.net
upvoted 3 times
...
auwia
2 years ago
CREATE EXTERNAL DATA SOURCE source1
WITH (
    LOCATION = 'https://account1.dfs.core.windows.net',
    TYPE = HADOOP
)
upvoted 1 times
...
aga444
2 years ago
CREATE EXTERNAL DATA SOURCE DataSourceName
WITH (
    TYPE = HADOOP,
    LOCATION = 'adl://Account1.dfs.core.windows.net/',
    CREDENTIAL = SqlPoolCredential
);
upvoted 1 times
...
janaki
2 years, 1 month ago
CREATE EXTERNAL DATA SOURCE <datasource_name>
WITH (
    TYPE = HADOOP,
    LOCATION = 'adl://<account_name>.dfs.core.windows.net',
    CREDENTIAL = <credential_name>
);
So the answer is dfs and TYPE = HADOOP
upvoted 1 times
...