A Machine Learning Specialist is developing a custom video recommendation model for an application. The dataset used to train this model is very large, with millions of data points, and is hosted in an Amazon S3 bucket. The Specialist wants to avoid loading all of this data onto an Amazon SageMaker notebook instance because it would take hours to move and would exceed the attached 5 GB Amazon EBS volume on the notebook instance.
Which approach allows the Specialist to use all the data to train the model?
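A commonly cited approach for this scenario is to launch a SageMaker training job with Pipe input mode, which streams training data directly from S3 to the training container instead of downloading the full dataset to the instance's EBS volume. Below is a minimal sketch using the SageMaker Python SDK; the IAM role ARN, S3 paths, and algorithm choice are hypothetical placeholders, not values from the question.

```python
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()

# Hypothetical execution role for illustration only.
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"

estimator = Estimator(
    image_uri=sagemaker.image_uris.retrieve(
        "factorization-machines", session.boto_region_name
    ),
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    input_mode="Pipe",  # stream records from S3; nothing is staged on EBS first
    sagemaker_session=session,
)

train_input = TrainingInput(
    s3_data="s3://example-bucket/train/",  # hypothetical dataset location
    content_type="application/x-recordio-protobuf",
)

# The job reads the data as a stream, so training can start immediately
# and the dataset size is not bounded by the attached volume.
estimator.fit({"train": train_input})
```

Because the data never has to fit on the notebook instance, the Specialist can keep a small notebook for development and let the training job consume the full multi-million-point dataset from S3.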