We study the convergence and convergence rates of a multi-block proximal alternating direction method of multipliers (PADMM) for solving linearly constrained separable nonconvex nonsmooth optimization problems. This algorithm is an important variant of the alternating direction method of multipliers (ADMM) that includes a proximal term in each subproblem in order to cancel out complicated terms in applications where the subproblems are not easy to solve or do not admit a simple closed-form solution. We consider an over-relaxation step size in the dual update and provide a detailed proof of convergence for any step size $\beta \in (0,2)$. We prove the convergence of the sequence generated by PADMM by showing that it has finite length and is a Cauchy sequence. Under the powerful Kurdyka-Łojasiewicz (KL) property, we establish convergence rates for both the function values and the iterates, and we show that different values of the KL exponent associated with the objective function give rise to three different convergence rates for PADMM. More precisely, we show that if the KL exponent $\theta = 0$, the sequence generated by PADMM converges in a finite number of iterations. If $\theta \in (0,1/2]$, then the sequence converges at the rate $cQ^k$, where $c > 0$, $Q \in (0,1)$, and $k \in \mathbb{N}$ is the iteration number. If $\theta \in (1/2,1)$, then a rate of $O(1/k^r)$ with $r = (1-\theta)/(2\theta-1)$ is achieved.
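To fix ideas, the following is a minimal sketch of a generic multi-block PADMM iteration of the kind described above; the problem formulation, the proximal weight matrices $S_i$, and the penalty parameter $\sigma$ are illustrative assumptions and need not match the exact setting analyzed in the paper. For the separable problem
\[
\min_{x_1,\dots,x_m} \ \sum_{i=1}^{m} f_i(x_i) \quad \text{subject to} \quad \sum_{i=1}^{m} A_i x_i = b,
\]
with augmented Lagrangian
\[
\mathcal{L}_\sigma(x_1,\dots,x_m,\lambda) \;=\; \sum_{i=1}^{m} f_i(x_i) \;-\; \Big\langle \lambda,\; \sum_{i=1}^{m} A_i x_i - b \Big\rangle \;+\; \frac{\sigma}{2}\Big\|\sum_{i=1}^{m} A_i x_i - b\Big\|^2,
\]
one iteration of such a proximal ADMM may be written as
\begin{align*}
x_i^{k+1} &\in \operatorname*{arg\,min}_{x_i}\ \mathcal{L}_\sigma\big(x_1^{k+1},\dots,x_{i-1}^{k+1},\,x_i,\,x_{i+1}^{k},\dots,x_m^{k},\,\lambda^k\big) + \tfrac{1}{2}\|x_i - x_i^k\|_{S_i}^2, \qquad i = 1,\dots,m,\\
\lambda^{k+1} &= \lambda^k - \beta\,\sigma\Big(\sum_{i=1}^{m} A_i x_i^{k+1} - b\Big), \qquad \beta \in (0,2),
\end{align*}
where each $S_i \succeq 0$ is a proximal weight chosen so that the $x_i$-subproblem simplifies (for instance, cancelling the cross terms coming from the quadratic penalty), and $\beta$ is the over-relaxation step size in the dual update referred to in the abstract.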