Too often, capacity planning activities that are crucial to software performance are deferred to late development phases, where straightforward measurement-based assessment techniques can be applied to enterprise applications that are nearing completion. This practice is inefficient, time-consuming, and may incur disproportionately high correction costs to meet existing service level agreements. However, enterprise applications nowadays make extensive use of standard software that is shipped by large software vendors to a wide range of customers. Consequently, an application similar to the one whose capacity is being planned may already be running in production and continuously producing log data as part of application performance monitoring. In this paper, we demonstrate how potential capacity planning service providers can leverage the dissemination effects of standard software by applying machine learning techniques to measurement data from a variety of running enterprise applications. Prediction models trained on large volumes of monitoring data enable cost-efficient, measurement-based prediction techniques to be applied in early design phases. To this end, we integrate knowledge discovery activities into well-known capacity planning steps, which we adapt to the specific characteristics of enterprise applications. We evaluate the feasibility of the modeled process using measurement data from more than 1,800 enterprise applications running in production to predict the response time of a widely used standard business transaction. Based on the trained model, we demonstrate how to simulate and analyze future workload scenarios. Using a Pareto approach, we identify cost-effective design alternatives for a planned enterprise application.
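To illustrate the core idea outlined above, the following is a minimal sketch, not the implementation used in the paper: it assumes hypothetical monitoring records (hardware and workload features with measured response times), trains a regression model on them, predicts the response time of candidate designs for a planned application, and keeps the Pareto-optimal alternatives with respect to cost and predicted response time. All feature names, candidate designs, and numeric values are illustrative assumptions.

```python
# Illustrative sketch only: synthetic monitoring data, a generic regression
# model, and a simple Pareto filter over (cost, predicted response time).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: rows = monitored applications,
# columns = [CPU cores, memory (GB), concurrent users]; target = response time (ms).
X_train = rng.uniform([2, 8, 50], [64, 512, 5000], size=(1800, 3))
y_train = 20 + 400 * X_train[:, 2] / (X_train[:, 0] * X_train[:, 1]) + rng.normal(0, 5, 1800)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Hypothetical design alternatives for the planned application:
# (cores, memory GB, expected concurrent users, cost).
candidates = [
    (8, 64, 2000, 10_000),
    (16, 128, 2000, 18_000),
    (32, 256, 2000, 30_000),
    (64, 512, 2000, 55_000),
]
features = np.array([c[:3] for c in candidates])
predicted_rt = model.predict(features)

def pareto_front(points):
    """Return indices of points not dominated in both cost and response time."""
    front = []
    for i, (cost_i, rt_i) in enumerate(points):
        dominated = any(
            cost_j <= cost_i and rt_j <= rt_i and (cost_j < cost_i or rt_j < rt_i)
            for j, (cost_j, rt_j) in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

points = [(c[3], rt) for c, rt in zip(candidates, predicted_rt)]
for i in pareto_front(points):
    print(f"design {i}: cost={candidates[i][3]}, predicted response time={predicted_rt[i]:.1f} ms")
```

In this sketch, the regression model stands in for whatever prediction model is trained on the pooled monitoring data, and the Pareto filter simply discards any design alternative that is both more expensive and slower than another; different workload scenarios could be explored by varying the expected-user column of the candidate designs.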