We consider a distributed multi-agent network system where the goal is to minimize an objective function that can be written as the sum of component functions, each of which is known partially (with stochastic errors) to a specific network agent. We propose an asynchronous algorithm motivated by random gossip schemes, in which each agent has a local Poisson clock. At each tick of its local clock, the agent averages its estimate with that of a randomly chosen neighbor and then adjusts the average using the gradient of its local function, which is computed with stochastic errors. We investigate the convergence properties of the algorithm for two classes of functions. First, we consider functions that are differentiable but not necessarily convex, and we prove that the gradients converge to zero with probability 1. Then, we consider functions that are convex but not necessarily differentiable, and we show that the iterates converge to an optimal solution almost surely.
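To fix ideas, the local update described above can be written schematically as follows; the notation is our own shorthand rather than taken from the text ($x_i^k$ is agent $i$'s estimate after $k$ global clock ticks, $f_i$ is its local component function, $\gamma_k$ is a stepsize, and $\varepsilon_i^k$ is the stochastic error in the gradient evaluation). If agent $i$'s clock ticks and it selects neighbor $j$, then
\[
v_k = \tfrac{1}{2}\bigl(x_i^k + x_j^k\bigr),
\qquad
x_i^{k+1} = v_k - \gamma_k\bigl(\nabla f_i(v_k) + \varepsilon_i^k\bigr),
\]
so the averaging step pulls the agents' estimates toward consensus, while the noisy gradient step drives descent on the local component $f_i$.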