You plan to implement an Azure Data Lake Storage Gen2 container that will contain CSV files. The size of the files will vary based on the number of events that occur per hour.
File sizes range from 4 KB to 5 GB.
You need to ensure that the files stored in the container are optimized for batch processing.
What should you do?
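The general guidance for batch workloads on Azure Data Lake Storage Gen2 is to avoid many small files and instead store data in fewer, larger files (roughly in the hundreds of megabytes to a few gigabytes each), so the small hourly files in this scenario would typically be compacted into larger ones before processing. Below is a minimal sketch, not the official answer, of one way to do that with PySpark; the storage account, container, paths, and target partition count are hypothetical and would need to be adjusted for a real workload.

```python
# Sketch: compact many small CSV files in an ADLS Gen2 container into a
# smaller number of larger files so batch engines can read them efficiently.
# The account/container names and the partition count below are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("compact-small-csv").getOrCreate()

# Hypothetical ABFS paths; replace with your own container and account.
source_path = "abfss://events@examplestorage.dfs.core.windows.net/raw/*.csv"
target_path = "abfss://events@examplestorage.dfs.core.windows.net/compacted/"

# Read all of the small CSV files into one DataFrame.
df = spark.read.option("header", "true").csv(source_path)

# Repartition so each output file lands in a size range that batch engines
# handle well; the count of 8 is illustrative and would be tuned from the
# total input volume.
(df.repartition(8)
   .write.mode("overwrite")
   .option("header", "true")
   .csv(target_path))
```

The same idea applies regardless of the tool: the optimization for batch processing is to merge the small files rather than leave thousands of kilobyte-sized objects in the container.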