Overhead (computing)
In computing, overhead is the consumption of computing resources for aspects that are not directly related to achieving a desired goal. It is required for more general processing and detracts from achieving a more focused goal. Overhead manifests as slower processing, less memory, less storage capacity, less network bandwidth, and longer latency.[1] Overhead can influence software design with regard to structure, error correction, and feature inclusion. Overhead in computing is a special case of engineering overhead and has essentially the same meaning as organizational overhead in business.

Software design

Choice of implementation
A programmer or software engineer may have a choice of several algorithms, encodings, data types or data structures, each of which has known characteristics. When choosing among them, their respective overhead should also be considered.

Tradeoffs
In software engineering, overhead can influence the decision whether or not to include features in new products, or indeed whether to fix bugs. A feature with high overhead may not be included, or may require a large financial incentive to justify it. Often, even though software providers are well aware of bugs in their products, fixing them is not worth the payoff because of the overhead. For example, an implicit data structure or succinct data structure may provide low space overhead, but at the cost of slower performance (a space/time tradeoff).

Run-time complexity of software
Algorithmic complexity is generally specified using big O notation. This says nothing about how long something actually takes to run or how much memory it uses, but rather about how its resource use grows with the size of the input. Overhead is deliberately not part of this calculation, since it varies from one machine to another, whereas the fundamental running time of an algorithm does not. This should be contrasted with algorithmic efficiency, which takes into account all kinds of resources – a combination (though not a trivial one) of complexity and overhead.

Examples

File system metadata
In addition to file content, a file system uses storage space for overhead information, including metadata (such as the file name and modification timestamps), the hierarchical directory organization, and more. In general, many small files require more overhead than a smaller number of large files.

CPU cache metadata
In a CPU cache, the capacity is the maximum amount of data the cache stores, including overhead data, not how much user content it holds. For instance, a cache with a 4 KB capacity stores less than 4 KB of user data, since some of the space is required for overhead bits such as frame, address, and tag information.[2] (A rough estimate of these overhead bits is sketched below, after the Communication protocol section.)

Communication protocol
Reliably sending a payload of data over a communications network requires sending more than just the payload itself. Various control and signaling data (as in TCP) must also be sent in order to reach the destination. This creates so-called protocol overhead, since the additional data does not contribute to the intrinsic meaning of the message.[3][4] In telephony, number dialing and call set-up time are overheads. In two-way (but half-duplex) radios, the use of "over" and other signaling needed to avoid collisions is an overhead. Protocol overhead can be expressed as a percentage: the number of non-application bytes (protocol headers and frame synchronization) divided by the total number of bytes in the message.
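As an illustration, the following sketch computes that percentage for a payload carried over Ethernet, IPv4 and TCP. The header sizes used (18 bytes of Ethernet framing, a 20-byte IPv4 header and a 20-byte TCP header) are typical minimum values chosen as assumptions for this example, not figures taken from the sources cited above.

# Sketch: protocol overhead as a percentage of the bytes on the wire.
# Header sizes are typical minimums (no options, no VLAN tag) and are
# assumptions made for this illustration.
ETHERNET_FRAMING = 18   # 14-byte header + 4-byte frame check sequence
IPV4_HEADER = 20
TCP_HEADER = 20

def protocol_overhead_percent(payload_bytes: int) -> float:
    """Return non-payload bytes as a percentage of the total frame size."""
    overhead = ETHERNET_FRAMING + IPV4_HEADER + TCP_HEADER
    total = overhead + payload_bytes
    return 100.0 * overhead / total

for payload in (100, 1460):
    print(f"{payload:>5}-byte payload: {protocol_overhead_percent(payload):.1f}% overhead")

Under these assumptions a 100-byte payload incurs roughly 37% overhead, while a full 1460-byte payload brings it down to about 4%, which is why many small packets are disproportionately expensive.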
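Similarly, the overhead bits in the CPU cache example above can be estimated from the cache geometry. The sketch below assumes a direct-mapped 4 KB cache with 64-byte lines, 32-bit physical addresses and two state bits per line; all of these parameters are assumptions for illustration rather than values from the cited source.

import math

# Sketch: metadata overhead of a direct-mapped cache (assumed geometry).
CAPACITY_BYTES = 4 * 1024   # nominal 4 KB capacity
LINE_BYTES = 64             # cache line (block) size
ADDRESS_BITS = 32           # physical address width

lines = CAPACITY_BYTES // LINE_BYTES                 # 64 lines
offset_bits = int(math.log2(LINE_BYTES))             # 6 bits select a byte within a line
index_bits = int(math.log2(lines))                   # 6 bits select a line
tag_bits = ADDRESS_BITS - index_bits - offset_bits   # 20 bits identify the cached block
state_bits = 2                                       # e.g. valid and dirty flags

overhead_bits = tag_bits + state_bits                # 22 bits of metadata per line
data_bits = LINE_BYTES * 8                           # 512 bits of user data per line
print(f"{overhead_bits} overhead bits per line "
      f"({100 * overhead_bits / data_bits:.1f}% of the data bits)")

With these assumed parameters each 512-bit line carries about 22 extra bits of tag and state, so roughly 4% of the storage in the cache is overhead rather than user data.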
Data encoding
The encoding of information and data also introduces overhead. For example, the date and time "2011-07-12 07:18:47" can be expressed as Unix time, a 32-bit signed integer occupying only 4 bytes, or formatted as the following UTF-8 encoded XML document:
<?xml version="1.0" encoding="UTF-8"?>
<datetime qualifier="changedate" index="1">
<year>2011</year>
<month>07</month>
<day>12</day>
<hour>07</hour>
<minute>18</minute>
<second>47</second>
</datetime>
The resulting 349 bytes of UTF-8 encoded XML correspond to a size overhead of 8625% over the original 4-byte integer representation.
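The figure follows directly from the two sizes; as a quick check (a minimal sketch, assuming the 349-byte total stated above):

xml_bytes = 349      # UTF-8 encoded XML representation of the timestamp
integer_bytes = 4    # 32-bit signed Unix time value
overhead_percent = 100 * (xml_bytes - integer_bytes) / integer_bytes
print(f"{overhead_percent:.0f}%")   # prints 8625%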
Function call
Calling a function requires a relatively small amount of run-time overhead for operations such as stack maintenance and parameter passing.[5] The overhead is small, but it can become problematic when there are many calls (for example, inside a loop) or when timing requirements are tight. Sometimes a compiler can minimize this overhead by inlining a function, eliminating the function call altogether.[6]
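The cost can be made visible by timing a loop that calls a trivial function against one that does the same work inline. The sketch below uses Python's timeit module purely for illustration; in a compiled language an optimizing compiler would typically inline such a call and remove most of the difference.

import timeit

def add(a, b):
    # Trivial function: the useful work is dwarfed by the call overhead.
    return a + b

def with_call(n=1_000_000):
    total = 0
    for i in range(n):
        total = add(total, i)   # one function call per iteration
    return total

def inlined(n=1_000_000):
    total = 0
    for i in range(n):
        total = total + i       # same arithmetic with no call
    return total

print("with call:", timeit.timeit(with_call, number=10))
print("inlined:  ", timeit.timeit(inlined, number=10))

Both loops compute the same sum; on a typical CPython interpreter the version with the explicit call is measurably slower, and the difference is purely call overhead.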