In modern cloud computing systems, jobs consist of a large number of parallel tasks that can be assigned to parallel servers in the cloud. A job is completed only when all of its tasks are completed, so the delay of a job is the maximum of the delays of its tasks. In this work, we study a model with a simple random policy for task assignment. Using association properties, we show that the tail probabilities of job delay are upper bounded by the tail probabilities of the maximum of independent task delays, which establishes a stochastic upper bound on job delay. We then prove that job delay converges to this upper bound in an asymptotic regime where the number of servers in the system grows large and the number of tasks per job is also allowed to grow.
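A minimal sketch of the bound, with notation introduced here for illustration (the symbols $k$, $T_i$, $D$, and $\tilde{T}_i$ are not taken from the text above): suppose a job consists of $k$ tasks with delays $T_1, \dots, T_k$, so the job delay is $D = \max_{1 \le i \le k} T_i$. If the task delays are associated, the standard association inequality for associated random variables gives
\[
\Pr(D > t) \;=\; 1 - \Pr\Big(\max_{1 \le i \le k} T_i \le t\Big) \;\le\; 1 - \prod_{i=1}^{k} \Pr(T_i \le t) \;=\; \Pr\Big(\max_{1 \le i \le k} \tilde{T}_i > t\Big),
\]
where $\tilde{T}_1, \dots, \tilde{T}_k$ are independent random variables with the same marginal distributions as $T_1, \dots, T_k$. The right-hand side is the tail probability of the maximum of independent task delays, i.e., the stochastic upper bound described above.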