You plan to implement an Azure Data Lake Storage Gen2 container that will contain CSV files. The size of the files will vary based on the number of events that occur per hour.
File sizes range from 4 KB to 5 GB.
You need to ensure that the files stored in the container are optimized for batch processing.
What should you do?
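For batch processing, the usual Azure guidance is to avoid very large numbers of small files: batch engines pay a per-file overhead, so many 4 KB files are far more costly to scan than a few large ones. The common remedy is to compact the small files into fewer, larger files (Azure documentation generally recommends targeting files of roughly 256 MB or more). Below is a minimal local sketch of that compaction idea, assuming plain local CSV files stand in for blobs in the container; the directory and file names are hypothetical, and in practice you would do this with Spark or Azure Data Factory rather than plain Python.

```python
import csv
import glob
import os

def compact_csv_files(source_dir: str, merged_path: str) -> int:
    """Merge many small CSV files (sharing one header) into a single
    larger file, so batch jobs read a few large files instead of
    thousands of tiny ones. Returns the number of data rows written.

    source_dir and merged_path are hypothetical local stand-ins for
    paths inside the ADLS Gen2 container."""
    header_written = False
    rows_written = 0
    with open(merged_path, "w", newline="") as out:
        writer = csv.writer(out)
        for path in sorted(glob.glob(os.path.join(source_dir, "*.csv"))):
            # Skip the output file if it lives in the same directory.
            if os.path.abspath(path) == os.path.abspath(merged_path):
                continue
            with open(path, newline="") as src:
                reader = csv.reader(src)
                header = next(reader, None)
                if header is None:
                    continue  # empty file, nothing to copy
                if not header_written:
                    writer.writerow(header)  # write the header once
                    header_written = True
                for row in reader:
                    writer.writerow(row)
                    rows_written += 1
    return rows_written
```

In a real pipeline the same effect is typically achieved by reading the small files into a Spark DataFrame and writing them back out with fewer, larger partitions.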