Capacity utilization is an economics concept that refers to the extent to which an enterprise or a nation actually uses its installed productive capacity. It thus describes the relationship between actual output and the potential output that could be produced with installed equipment if capacity were fully used.
Potential output may be defined as the maximum output that can be produced in the short term with the existing stock of capital. A standard definition of capacity utilization is therefore the weighted average of the ratios of firms' actual output to the maximum output they could produce per unit of time with existing resources. Output may be measured either in physical units or in market values.
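The weighted-average definition above can be sketched numerically. The firms and their output figures below are hypothetical; each firm's actual-to-potential ratio is weighted by its share of total potential output.

```python
# Weighted-average capacity utilization across firms (hypothetical figures).
# Each entry is (actual output, potential output) in units per period.
firms = [
    (80.0, 100.0),
    (45.0, 60.0),
    (150.0, 200.0),
]

total_potential = sum(potential for _, potential in firms)

# Weighted average of actual/potential, with weights equal to each firm's
# share of total potential output; this reduces to total actual output
# divided by total potential output.
utilization = sum(
    (actual / potential) * (potential / total_potential)
    for actual, potential in firms
)

print(f"Capacity utilization: {utilization:.1%}")
```

With these figures the economy-wide rate is 275/360, roughly 76.4 percent; note that weighting by potential-output shares makes the aggregate identical to the simple ratio of total actual to total potential output.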
Many organizations experience rising average costs of production as output increases, well before the absolute physical limit of capacity is reached, even when no additional resources are employed. An alternative approach, sometimes called the "economic" utilization rate, therefore measures the ratio of actual output to the level of output beyond which the average cost of production begins to rise.
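The economic utilization rate can be illustrated with a minimal sketch. The U-shaped average cost curve and the output figures below are hypothetical; the code locates the output level where average cost stops falling and divides actual output by it.

```python
# "Economic" utilization rate: actual output divided by the output level
# beyond which average cost begins to rise (hypothetical cost curve).

def average_cost(q):
    # Hypothetical U-shaped average cost curve, minimized at q = 120.
    return 0.01 * (q - 120.0) ** 2 + 5.0

# Scan candidate output levels and find the cost-minimizing one.
levels = list(range(1, 201))
costs = [average_cost(q) for q in levels]
optimal_q = levels[costs.index(min(costs))]

actual_output = 90.0
economic_rate = actual_output / optimal_q

print(f"Economic utilization rate: {economic_rate:.1%}")
```

Under these assumptions the firm produces 90 units against a cost-minimizing level of 120, giving an economic utilization rate of 75 percent even though it may be well below its physical capacity limit.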
In the 1970s, American businesses carried a great deal of excess capacity; during the 1980s, businesses improved their capacity utilization by as much as twenty percent. In today's business climate, organizations are far more cost-conscious and accountable to business objectives. They must be able to identify strategically the project mix that will deliver the highest return from their workforce, and new project requests must be modeled objectively against current projects, available capacity, and projected costs and returns. Organizations therefore engage in capacity planning to ensure the most effective and efficient use of their resources, maintaining capacity levels sufficient to meet business objectives and project management needs.
Resource Management & Scheduling