
Answer-first summary for fast verification
Answer: Azure Kubernetes Service
The correct answer is Azure Kubernetes Service (AKS) because it is the recommended target for production-grade, real-time, GPU-based inferencing in Azure Machine Learning. AKS lets you provision GPU-enabled node pools suited to parallel workloads such as image-recognition inference. Machine Learning Compute (D) is designed for GPU-based training and batch workloads, not for hosting real-time inferencing endpoints. Azure Container Instance (A) is intended for lightweight dev/test deployments and is not the supported production option for GPU inferencing. Field Programmable Gate Arrays (C) are a separate hardware-acceleration option limited to a small set of supported model architectures, not a general-purpose GPU inferencing target. The community discussion supports B with 100% consensus and cites Microsoft documentation confirming AKS for GPU-based real-time inferencing.
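To make the answer concrete, here is a minimal sketch of attaching a GPU-enabled AKS cluster and deploying a model to it with the Azure ML Python SDK (v1). The workspace object `ws`, the registered `model`, and the `inference_config` are assumed to exist already; the compute name `gpu-aks` and VM size `Standard_NC6` are illustrative choices.

```python
from azureml.core.compute import AksCompute, ComputeTarget
from azureml.core.model import Model
from azureml.core.webservice import AksWebservice

# Provision an AKS cluster whose nodes carry NVIDIA GPUs
# (Standard_NC6 is one example of a GPU VM size).
prov_config = AksCompute.provisioning_configuration(vm_size="Standard_NC6")
aks_target = ComputeTarget.create(
    workspace=ws,
    name="gpu-aks",
    provisioning_configuration=prov_config,
)
aks_target.wait_for_completion(show_output=True)

# Deploy the model as a real-time endpoint, requesting one GPU per replica.
deployment_config = AksWebservice.deploy_configuration(gpu_cores=1)
service = Model.deploy(
    ws,
    "image-recognition-svc",
    [model],
    inference_config,
    deployment_config,
    aks_target,
)
service.wait_for_deployment(show_output=True)
```

This requires an Azure subscription and workspace, so it runs only against live cloud resources; the key point is that the GPU capability is selected at the node-pool level (`vm_size`) and requested per deployment (`gpu_cores`).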
Author: LeetQuiz Editorial Team
You are creating a deep learning model for image recognition on Azure Machine Learning service using GPU-based training. You must deploy the model to an endpoint that supports real-time, GPU-based inferencing.
Which compute type should you use to configure the compute resources for model inferencing?
A. Azure Container Instance
B. Azure Kubernetes Service
C. Field Programmable Gate Array
D. Machine Learning Compute