How should an Administrator BEST architect a large multi-layer Long Short-Term Memory (LSTM) recurrent neural network (RNN) running with MXNet on Amazon EC2? (Choose two.)
A.
Use data parallelism to partition the workload over multiple devices and balance the workload within the GPUs.
B.
Use compute-optimized EC2 instances with an attached elastic GPU.
C.
Use general purpose GPU computing instances such as G3 and P3.
D.
Use processing parallelism to partition the workload over multiple storage devices and balance the workload within the GPUs.
The given answer is correct. https://aws.amazon.com/blogs/machine-learning/parallelizing-across-multiple-cpu-gpus-to-speed-up-deep-learning-inference-at-the-edge/
Data Parallelism vs Model Parallelism
By default, MXNet uses data parallelism to partition the workload over multiple devices. Assume there are n devices: each one receives a copy of the complete model and trains it on 1/n of the data. Results such as gradients and updated model parameters are then communicated across the devices.
MXNet also supports model parallelism. In this approach, each device holds onto only part of the model. This proves useful when the model is too large to fit onto a single device.
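The data-parallel scheme described above can be sketched in plain Python (a hypothetical stand-in for what MXNet does across GPUs, using a toy 1-parameter linear model; function names like `data_parallel_step` are illustrative, not MXNet API):

```python
# Conceptual sketch of data parallelism: every "device" holds a full copy
# of the model, computes gradients on 1/n of the batch, and the gradients
# are averaged (an all-reduce on real GPUs) before the weight update.

def gradient(w, shard):
    # Gradient of mean squared error for a 1-parameter linear model y = w * x.
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def data_parallel_step(w, batch, n_devices, lr=0.01):
    shard_size = len(batch) // n_devices
    shards = [batch[i * shard_size:(i + 1) * shard_size]
              for i in range(n_devices)]
    # Each "device" computes a gradient on its own shard of the data...
    grads = [gradient(w, shard) for shard in shards]
    # ...then the gradients are communicated and averaged across devices.
    avg_grad = sum(grads) / n_devices
    return w - lr * avg_grad

batch = [(x, 3.0 * x) for x in range(1, 9)]  # data following y = 3x
w = 0.0
for _ in range(200):
    w = data_parallel_step(w, batch, n_devices=2)
# w converges toward 3.0
```

In MXNet itself this was largely automatic: if I recall the old Module API correctly, you passed multiple contexts (e.g. `context=[mx.gpu(0), mx.gpu(1)]`) and MXNet split each batch across them.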
The answer choices don't actually list model parallelism, which would be the correct approach for a model too large to fit on one device; maybe this is a typo?
https://aws.amazon.com/blogs/machine-learning/reducing-deep-learning-inference-cost-with-mxnet-and-amazon-elastic-inference/ This post mentions increased performance from attaching elastic GPU acceleration to compute-optimized EC2 instances. However, the answer choices don't refer to Amazon Elastic Inference.