Your company is running its first dynamic campaign, serving different offers by analyzing real-time data during the holiday season. The data scientists are collecting terabytes of data that grow rapidly every hour during the 30-day campaign. They are using Google Cloud Dataflow to preprocess the data and to collect the feature (signal) data needed for the machine learning model, which they store in Google Cloud Bigtable. The team is observing suboptimal performance with reads and writes of their initial load of 10 TB of data. They want to improve this performance while minimizing cost. What should they do?
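For context on the setup the question describes, here is a minimal sketch of a Dataflow (Apache Beam, Python SDK) streaming pipeline that preprocesses incoming events and writes feature data to Bigtable. The project, instance, table, topic, and field names are placeholder assumptions, not part of the question; the row-key comment flags the usual Bigtable performance lever (key distribution across the row space) that scenarios like this one tend to probe.

```python
# Illustrative sketch only; IDs and field names are assumptions.
import json

import apache_beam as beam
from apache_beam.io.gcp.bigtableio import WriteToBigTable
from apache_beam.options.pipeline_options import PipelineOptions
from google.cloud.bigtable.row import DirectRow


def to_bigtable_row(record):
    """Convert a preprocessed record into a Bigtable DirectRow.

    Row-key design is the main lever for Bigtable read/write throughput:
    keys should spread load across the row space, since sequential keys
    (e.g. raw timestamps) hotspot a single node.
    """
    # 'user_id' and 'signal' are hypothetical field names.
    row = DirectRow(row_key=record['user_id'].encode('utf-8'))
    row.set_cell('features',                        # column family
                 b'signal',                         # column qualifier
                 str(record['signal']).encode('utf-8'))
    return row


def run():
    # Streaming mode, since the scenario ingests real-time campaign data.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadEvents' >> beam.io.ReadFromPubSub(
             topic='projects/my-project/topics/campaign-events')  # placeholder
         | 'Parse' >> beam.Map(json.loads)
         | 'ToRows' >> beam.Map(to_bigtable_row)
         | 'WriteFeatures' >> WriteToBigTable(
             project_id='my-project',               # placeholder IDs
             instance_id='campaign-features',
             table_id='signals'))


if __name__ == '__main__':
    run()
```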