Information theory studies the transmission, processing, extraction, and utilization of information. Abstractly, information can be thought of as the resolution of uncertainty. In the case of communication of information over a noisy channel, this abstract concept was formalized in 1948 by Claude Shannon in a paper entitled A Mathematical Theory of Communication, in which information is thought of as a set of possible messages, and the goal is to send these messages over a noisy channel, and to have the receiver reconstruct the message with low probability of error, in spite of the channel noise. Shannon's main result, the noisy-channel coding theorem, showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable is equal to the channel capacity, a quantity dependent merely on the statistics of the channel over which the messages are sent.[5]
Coding theory is concerned with finding explicit methods, called codes, for increasing the efficiency and reducing the error rate of data communication over noisy channels to near the channel capacity. These codes can be roughly subdivided into data compression (source coding) and error-correction (channel coding) techniques. In the latter case, it took many years to find the methods Shannon's work proved were possible.[citation needed]
The landmark event establishing the discipline of information theory and bringing it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948. Historian James Gleick rated the paper as the most important development of 1948, above the transistor, noting that the paper was "even more profound and more fundamental" than the transistor.[20] Shannon came to be known as the "father of information theory".[21][22][23] He had outlined some of his initial ideas of information theory as early as 1939 in a letter to Vannevar Bush.[23]
Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation W = K log m (recalling the Boltzmann constant), where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S was the number of possible symbols, and n the number of symbols in a transmission. The unit of information was therefore the decimal digit, which has since sometimes been called the hartley in his honor as a unit or scale or measure of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers.[citation needed]
In Shannon's revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion:
"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."
With it came the ideas of the mutual information and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem.
Information theory is based on probability theory and statistics, where quantified information is usually described in terms of bits. Information theory often concerns itself with measures of information of the distributions associated with random variables. One of the most important measures is called entropy, which forms the building block of many other measures. Entropy allows quantification of measure of information in a single random variable.[24] Another useful concept is mutual information defined on two random variables, which describes the measure of information in common between those variables, which can be used to describe their correlation. The former quantity is a property of the probability distribution of a random variable and gives a limit on the rate at which data generated by independent samples with the given distribution can be reliably compressed. The latter is a property of the joint distribution of two random variables, and is the maximum rate of reliable communication across a noisy channel in the limit of long block lengths, when the channel statistics are determined by the joint distribution.
The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. A common unit of information is the bit or shannon, based on the binary logarithm. Other units include the nat, which is based on the natural logarithm, and the decimal digit, which is based on the common logarithm.
In what follows, an expression of the form p log p is considered by convention to be equal to zero whenever p = 0. This is justified because lim_{p → 0+} p log p = 0 for any logarithmic base.
Entropy of an information source
Based on the probability mass function of each source symbol to be communicated, the Shannon entropy H, in units of bits (per symbol), is given by

H = − Σ_i p_i log_2(p_i),
where p_i is the probability of occurrence of the i-th possible value of the source symbol. This equation gives the entropy in the units of "bits" (per symbol) because it uses a logarithm of base 2, and this base-2 measure of entropy has sometimes been called the shannon in his honor. Entropy is also commonly computed using the natural logarithm (base e, where e is Euler's number), which produces a measurement of entropy in nats per symbol and sometimes simplifies the analysis by avoiding the need to include extra constants in the formulas. Other bases are also possible, but less commonly used. For example, a logarithm of base 2^8 = 256 will produce a measurement in bytes per symbol, and a logarithm of base 10 will produce a measurement in decimal digits (or hartleys) per symbol.
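The choice of base only rescales the result. A minimal Python sketch (not part of the original article; the source distribution is a made-up example) computes the same entropy in bits, nats, and hartleys:

```python
# A minimal sketch: the entropy of one (made-up) distribution expressed in
# bits (base 2), nats (base e), and hartleys (base 10).
from math import log, e

def entropy(probs, base=2.0):
    # 0 log 0 is treated as 0 by skipping zero-probability outcomes.
    return -sum(p * log(p, base) for p in probs if p > 0)

p = [0.5, 0.25, 0.125, 0.125]        # hypothetical source distribution
print(entropy(p, 2))                 # 1.75 bits per symbol
print(entropy(p, e))                 # ≈ 1.213 nats per symbol (1.75 · ln 2)
print(entropy(p, 10))                # ≈ 0.527 hartleys per symbol (1.75 · log10 2)
```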
Intuitively, the entropy H(X) of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X when only its distribution is known.
The entropy of a source that emits a sequence of N symbols that are independent and identically distributed (iid) is N ⋅ H bits (per message of N symbols). If the source data symbols are identically distributed but not independent, the entropy of a message of length N will be less than N ⋅ H.
If one transmits 1000 bits (0s and 1s), and the value of each of these bits is known to the receiver (has a specific value with certainty) ahead of transmission, it is clear that no information is transmitted. If, however, each bit is independently equally likely to be 0 or 1, 1000 shannons of information (more often called bits) have been transmitted. Between these two extremes, information can be quantified as follows. If 𝕏 is the set of all messages {x_1, ..., x_n} that X could be, and p(x) is the probability of some x ∈ 𝕏, then the entropy, H, of X is defined:[25]

H(X) = E_X[I(x)] = − Σ_{x ∈ 𝕏} p(x) log p(x).
(Here, I(x) is the self-information, which is the entropy contribution of an individual message, and E_X is the expected value.) A property of entropy is that it is maximized when all the messages in the message space are equiprobable, p(x) = 1/n; i.e., most unpredictable, in which case H(X) = log n.
The special case of information entropy for a random variable with two outcomes is the binary entropy function, usually taken to the logarithmic base 2, thus having the shannon (Sh) as unit:

H_b(p) = − p log_2 p − (1 − p) log_2(1 − p).
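As a quick numerical illustration (a sketch, not part of the article), the binary entropy function can be evaluated directly; it vanishes at p = 0 and p = 1 and peaks at 1 Sh for p = 0.5, the equiprobable case mentioned above:

```python
# A minimal sketch: the binary entropy function H_b(p) in shannons.
from math import log2

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0                   # by the convention 0 log 0 = 0
    return -p * log2(p) - (1 - p) * log2(1 - p)

for p in (0.0, 0.1, 0.5, 0.9):
    print(p, binary_entropy(p))      # 0, ≈0.469, 1.0, ≈0.469
```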
Joint entropy
The joint entropy of two discrete random variables X and Y is merely the entropy of their pairing: (X, Y). This implies that if X and Y are independent, then their joint entropy is the sum of their individual entropies.
For example, if (X, Y) represents the position of a chess piece—X the row and Y the column, then the joint entropy of the row of the piece and the column of the piece will be the entropy of the position of the piece.
Despite similar notation, joint entropy should not be confused with cross-entropy.
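A small numerical sketch (not from the article; both joint distributions are made-up examples) illustrates that H(X, Y) equals H(X) + H(Y) exactly when X and Y are independent, and falls below that sum otherwise:

```python
# A minimal sketch: joint entropy for an independent and a dependent pair.
from math import log2

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Independent case: p(x, y) = p(x) p(y)
px, py = [0.5, 0.5], [0.25, 0.75]
joint_indep = [a * b for a in px for b in py]
print(H(joint_indep), H(px) + H(py))   # equal, ≈ 1.811 bits

# Dependent case: both marginals are [0.5, 0.5], but the variables are correlated
joint_dep = [0.4, 0.1, 0.1, 0.4]
print(H(joint_dep), 2 * H([0.5, 0.5])) # ≈ 1.722 bits < 2.0 bits
```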
Conditional entropy (equivocation)
The conditional entropy or conditional uncertainty of X given random variable Y (also called the equivocation of X about Y) is the average conditional entropy over Y:[26]

H(X|Y) = E_Y[ H(X | Y = y) ] = − Σ_{x, y} p(x, y) log p(x|y).
Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use. A basic property of this form of conditional entropy is that:

H(X|Y) = H(X, Y) − H(Y).
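The identity can be checked numerically; the following sketch (not from the article; the joint distribution is a made-up example) computes H(X|Y) from its definition and from H(X, Y) − H(Y):

```python
# A minimal sketch: conditional entropy and the identity H(X|Y) = H(X,Y) - H(Y).
from math import log2

p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}   # hypothetical p(x, y)

def H(values):
    return -sum(v * log2(v) for v in values if v > 0)

py = {y: sum(v for (_, y2), v in p.items() if y2 == y) for y in (0, 1)}

# Direct definition: H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y)
H_X_given_Y = -sum(v * log2(v / py[y]) for (x, y), v in p.items() if v > 0)

print(H_X_given_Y, H(p.values()) - H(py.values()))         # both ≈ 0.722 bits
```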
Mutual information (transinformation)
Mutual information measures the amount of information that can be obtained about one random variable by observing another. It is important in communication where it can be used to maximize the amount of information shared between sent and received signals. The mutual information of X relative to Y is given by:

I(X; Y) = Σ_{x, y} p(x, y) log [ p(x, y) / (p(x) p(y)) ].
In other words, this is a measure of how much, on the average, the probability distribution on X will change if we are given the value of Y. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:

I(X; Y) = D_KL( p(X, Y) ‖ p(X) p(Y) ).
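The two forms agree, as the following sketch shows (not from the article; the joint distribution is the same made-up example used for conditional entropy above):

```python
# A minimal sketch: mutual information computed as the divergence of the joint
# from the product of its marginals, and as H(X) - H(X|Y).
from math import log2

p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}   # hypothetical p(x, y)
px = {x: p[(x, 0)] + p[(x, 1)] for x in (0, 1)}
py = {y: p[(0, y)] + p[(1, y)] for y in (0, 1)}

# Divergence form: sum of p(x,y) log [ p(x,y) / (p(x) p(y)) ]
I = sum(v * log2(v / (px[x] * py[y])) for (x, y), v in p.items() if v > 0)

# Equivalent form: I(X;Y) = H(X) - H(X|Y)
H_X = -sum(v * log2(v) for v in px.values())
H_X_given_Y = -sum(v * log2(v / py[y]) for (x, y), v in p.items() if v > 0)

print(I, H_X - H_X_given_Y)                                # both ≈ 0.278 bits
```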
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution and to Pearson's χ2 test: mutual information can be considered a statistic for assessing independence between a pair of variables, and has a well-specified asymptotic distribution.
Kullback–Leibler divergence (information gain)
The Kullback–Leibler divergence (or information divergence, information gain, or relative entropy) is a way of comparing two distributions: a "true" probability distribution p(X), and an arbitrary probability distribution q(X). If we compress data in a manner that assumes q(X) is the distribution underlying some data, when, in reality, p(X) is the correct distribution, the Kullback–Leibler divergence is the number of average additional bits per datum necessary for compression. It is thus defined

D_KL( p(X) ‖ q(X) ) = Σ_x p(x) log [ p(x) / q(x) ].
Although it is sometimes used as a 'distance metric', KL divergence is not a true metric since it is not symmetric and does not satisfy the triangle inequality (making it a semi-quasimetric).
Another interpretation of the KL divergence is the "unnecessary surprise" introduced by a prior from the truth: suppose a number X is about to be drawn randomly from a discrete set with probability distribution p(x). If Alice knows the true distribution p(x), while Bob believes (has a prior) that the distribution is q(x), then Bob will be more surprised than Alice, on average, upon seeing the value of X. The KL divergence is the (objective) expected value of Bob's (subjective) surprisal minus Alice's surprisal, measured in bits if the log is in base 2. In this way, the extent to which Bob's prior is "wrong" can be quantified in terms of how "unnecessarily surprised" it is expected to make him.
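A small sketch (not from the article; both distributions are made-up examples) shows the "extra bits" reading of the divergence: it equals the cross-entropy of coding with the wrong distribution minus the true entropy:

```python
# A minimal sketch: D_KL(p || q) as the average extra bits per symbol paid for
# assuming q when the data actually follow p.
from math import log2

p = [0.5, 0.25, 0.125, 0.125]     # true (hypothetical) source distribution
q = [0.25, 0.25, 0.25, 0.25]      # assumed (wrong) distribution

kl = sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

cross = -sum(pi * log2(qi) for pi, qi in zip(p, q) if pi > 0)   # cost of coding with q
H_p = -sum(pi * log2(pi) for pi in p if pi > 0)                 # optimal cost
print(kl, cross - H_p)            # both 0.25 extra bits per symbol
```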
Directed information
Directed information, I(X^n → Y^n), is an information theory measure that quantifies the information flow from the random process X^n = {X_1, X_2, ..., X_n} to the random process Y^n = {Y_1, Y_2, ..., Y_n}. The term directed information was coined by James Massey and is defined as

I(X^n → Y^n) = Σ_{i=1}^{n} I(X^i; Y_i | Y^{i−1}),

where I(X^i; Y_i | Y^{i−1}) is the conditional mutual information I(X_1, X_2, ..., X_i; Y_i | Y_1, Y_2, ..., Y_{i−1}).
In contrast to mutual information, directed information is not symmetric. It measures the information bits that are transmitted causally from X^n to Y^n. Directed information has many applications in problems where causality plays an important role, such as the capacity of channels with feedback,[27][28] the capacity of discrete memoryless networks with feedback,[29] gambling with causal side information,[30] compression with causal side information,[31] real-time control communication settings,[32][33] and statistical physics.[34]
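The definition above can be evaluated directly for short sequences. The following sketch (not from the article) applies it to a made-up memoryless binary channel used twice without feedback, in which case the directed information coincides with the mutual information, here 2·(1 − H_b(0.1)):

```python
# A minimal sketch: directed information I(X^n -> Y^n) = sum_i I(X^i; Y_i | Y^{i-1})
# for a toy joint distribution over length-2 binary sequences.
from collections import defaultdict
from math import log2

def marginal(p, kx, ky):
    """Marginalize p[(x_seq, y_seq)] down to the first kx X symbols and ky Y symbols."""
    m = defaultdict(float)
    for (xs, ys), pr in p.items():
        m[(xs[:kx], ys[:ky])] += pr
    return m

def directed_information(p, n):
    total = 0.0
    for i in range(1, n + 1):
        p_xi_yi   = marginal(p, i, i)        # p(x^i, y^i)
        p_xi_yim1 = marginal(p, i, i - 1)    # p(x^i, y^{i-1})
        p_yi      = marginal(p, 0, i)        # p(y^i)
        p_yim1    = marginal(p, 0, i - 1)    # p(y^{i-1})
        for (xs, ys), pr in p_xi_yi.items():
            if pr == 0:
                continue
            cond_xy = pr / p_xi_yim1[(xs, ys[:-1])]            # p(y_i | x^i, y^{i-1})
            cond_y  = p_yi[((), ys)] / p_yim1[((), ys[:-1])]   # p(y_i | y^{i-1})
            total += pr * log2(cond_xy / cond_y)
    return total

# Toy example: X_1, X_2 are fair coin flips; Y_i is X_i flipped with probability 0.1.
p_joint = {}
for x1 in (0, 1):
    for x2 in (0, 1):
        for y1 in (0, 1):
            for y2 in (0, 1):
                pr = 0.25
                pr *= 0.9 if y1 == x1 else 0.1
                pr *= 0.9 if y2 == x2 else 0.1
                p_joint[((x1, x2), (y1, y2))] = pr

print(directed_information(p_joint, 2))   # ≈ 1.062 bits = 2 * (1 - H_b(0.1))
```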
Other quantities
Other important information theoretic quantities include the Rényi entropy and the Tsallis entropy (generalizations of the concept of entropy), differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information. Also, pragmatic information has been proposed as a measure of how much information has been used in making a decision.
Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.
Data compression (source coding): There are two formulations for the compression problem (see the sketch after this list):
lossless data compression: the data must be reconstructed exactly;
lossy data compression: allocates bits needed to reconstruct the data, within a specified fidelity level measured by a distortion function. This subset of information theory is called rate–distortion theory.
Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error-correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data efficiently and faithfully across a noisy channel.
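As a concrete illustration of the lossless formulation (the sketch promised in the list above, not part of the original article), a Huffman code built for a made-up source attains an expected length L satisfying H ≤ L < H + 1 bits per symbol; for the dyadic distribution below it meets the entropy exactly:

```python
# A minimal sketch: expected length of a Huffman (optimal prefix) code versus
# the source entropy H.
import heapq
from math import log2

def huffman_lengths(probs):
    """Return the code-word length assigned to each symbol by Huffman coding."""
    # Heap items: (probability, tie-breaker, symbols contained in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:            # each merge adds one bit to these symbols' code words
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]    # hypothetical source distribution
H = -sum(p * log2(p) for p in probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, L)                          # both 1.75 bits/symbol for this dyadic source
```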
This division of coding theory into compression and transmission is justified by the information transmission theorems, or source–channel separation theorems that justify the use of bits as the universal currency for information in many contexts. However, these theorems only hold in the situation where one transmitting user wishes to communicate to one receiving user. In scenarios with more than one transmitter (the multiple-access channel), more than one receiver (the broadcast channel) or intermediary "helpers" (the relay channel), or more general networks, compression followed by transmission may no longer be optimal.
Source theory
Any process that generates successive messages can be considered a source of information. A memoryless source is one in which each message is an independent identically distributed random variable, whereas the properties of ergodicity and stationarity impose less restrictive constraints. All such sources are stochastic. These terms are well studied in their own right outside information theory.
Rate
Information rate is the average entropy per symbol. For memoryless sources, this is merely the entropy of each symbol, while, in the case of a stationary stochastic process, it is:

r = lim_{n → ∞} H(X_n | X_{n−1}, X_{n−2}, ..., X_1);
that is, the conditional entropy of a symbol given all the previous symbols generated. For the more general case of a process that is not necessarily stationary, the average rate is:

r = lim_{n → ∞} (1/n) H(X_1, X_2, ..., X_n);
that is, the limit of the joint entropy per symbol. For stationary sources, these two expressions give the same result.[35]
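For a concrete stationary source with memory, the rate can be computed in closed form. The sketch below (not from the article; the transition matrix is a made-up example) evaluates the entropy rate of a two-state Markov chain as the stationary-weighted average of the per-state transition entropies:

```python
# A minimal sketch: entropy rate of a stationary two-state Markov source.
from math import log2

# Hypothetical transition matrix: P[i][j] = Pr(next symbol = j | current symbol = i).
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary distribution of a two-state chain, mu P = mu, in closed form.
a, b = P[0][1], P[1][0]
mu = [b / (a + b), a / (a + b)]          # [0.8, 0.2] here

def H(row):
    return -sum(p * log2(p) for p in row if p > 0)

rate = sum(mu[i] * H(P[i]) for i in range(2))
print(rate)                              # ≈ 0.8·H_b(0.1) + 0.2·H_b(0.4) ≈ 0.569 bits/symbol
```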
It is common in information theory to speak of the "rate" or "entropy" of a language. This is appropriate, for example, when the source of information is English prose. The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.
Communications over a channel is the primary motivation of information theory. However, channels often fail to produce exact reconstruction of a signal; noise, periods of silence, and other forms of signal corruption often degrade quality.
Consider the communications process over a discrete channel. A simple model of the process is the following.
Here X represents the space of messages transmitted, and Y the space of messages received during a unit time over our channel. Let p(y|x) be the conditional probability distribution function of Y given X. We will consider p(y|x) to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of X and Y is completely determined by our channel and by our choice of f(x), the marginal distribution of messages we choose to send over the channel. Under these constraints, we would like to maximize the rate of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information, and this maximum mutual information is called the channel capacity and is given by:

C = max_f I(X; Y).
This capacity has the following property related to communicating at information rate R (where R is usually bits per symbol). For any information rate R < C and coding error ε > 0, for large enough N, there exists a code of length N and rate ≥ R and a decoding algorithm, such that the maximal probability of block error is ≤ ε; that is, it is always possible to transmit with arbitrarily small block error. In addition, for any rate R > C, it is impossible to transmit with arbitrarily small block error.
Channel coding is concerned with finding such nearly optimal codes that can be used to transmit data over a noisy channel with a small coding error at a rate near the channel capacity.
A binary symmetric channel (BSC) with crossover probability p is a binary input, binary output channel that flips the input bit with probability p. The BSC has a capacity of 1 − H_b(p) bits per channel use, where H_b is the binary entropy function to the base-2 logarithm:

H_b(p) = − p log_2 p − (1 − p) log_2(1 − p).
A binary erasure channel (BEC) with erasure probability p is a binary input, ternary output channel. The possible channel outputs are 0, 1, and a third symbol 'e' called an erasure. The erasure represents complete loss of information about an input bit. The capacity of the BEC is 1 − p bits per channel use.
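Both closed-form capacities can be recovered numerically by maximizing the mutual information over input distributions. The following sketch (not from the article) uses the Blahut–Arimoto iteration, a standard algorithm for this maximization; the channel matrices are the BSC and BEC with crossover/erasure probability 0.1:

```python
# A minimal sketch: estimating channel capacity C = max_{p(x)} I(X;Y) with the
# Blahut-Arimoto iteration, checked against the closed-form BSC and BEC capacities.
from math import log2

def blahut_arimoto(W, iterations=200):
    """W[x][y] = p(y|x). Returns a numerical estimate of the capacity in bits."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                              # input distribution, start uniform
    for _ in range(iterations):
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]   # output marginal
        # d[x] = D( W(.|x) || q ): how informative input x is under the current output law
        d = [sum(W[x][y] * log2(W[x][y] / q[y]) for y in range(ny) if W[x][y] > 0)
             for x in range(nx)]
        w = [p[x] * 2 ** d[x] for x in range(nx)]
        total = sum(w)
        p = [wx / total for wx in w]                 # re-weight inputs toward informative ones
    # Report I(X;Y) for the final input distribution (approaches C as iterations grow).
    q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    return sum(p[x] * W[x][y] * log2(W[x][y] / q[y])
               for x in range(nx) for y in range(ny) if W[x][y] > 0)

eps = 0.1
bsc = [[1 - eps, eps], [eps, 1 - eps]]               # binary symmetric channel
bec = [[1 - eps, 0.0, eps], [0.0, 1 - eps, eps]]     # binary erasure channel (outputs 0, 1, 'e')
Hb = -(eps * log2(eps) + (1 - eps) * log2(1 - eps))
print(blahut_arimoto(bsc), 1 - Hb)                   # both ≈ 0.531 bits per channel use
print(blahut_arimoto(bec), 1 - eps)                  # both ≈ 0.9 bits per channel use
```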
Channels with memory and directed information
In practice many channels have memory. Namely, at time i the channel is given by the conditional probability P(y_i | x_i, x_{i−1}, ..., x_1, y_{i−1}, ..., y_1). It is often more convenient to use the notation x^i = (x_i, x_{i−1}, ..., x_1), and the channel becomes P(y_i | x^i, y^{i−1}).
In such a case, the capacity is given by the mutual information rate when there is no feedback available, and by the directed information rate whether or not feedback is available[27][36] (if there is no feedback, the directed information equals the mutual information).
Fungible information
Fungible information is the information for which the means of encoding is not important.[37] Classical information theorists and computer scientists are mainly concerned with information of this sort. It is sometimes referred to as speakable information.[38]
Information theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the ban, was used in the Ultra project, breaking the German Enigma machine code and hastening the end of World War II in Europe. Shannon himself defined an important concept now called the unicity distance. Based on the redundancy of the plaintext, it attempts to give a minimum amount of ciphertext necessary to ensure unique decipherability.
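A rough worked example (a sketch, not from the article): taking the key entropy of a simple substitution cipher over a 26-letter alphabet and an assumed per-letter entropy of English of about 1.5 bits (published estimates vary), the unicity distance comes out to roughly 28 characters of ciphertext:

```python
# A minimal sketch: unicity distance U ≈ H(K) / D, where D is the per-letter
# redundancy of the plaintext.  The English-entropy figure is an assumption.
from math import log2, factorial

alphabet_bits = log2(26)              # ≈ 4.70 bits/letter if letters were uniform
english_bits = 1.5                    # assumed entropy of English text per letter
D = alphabet_bits - english_bits      # redundancy ≈ 3.2 bits/letter

H_K = log2(factorial(26))             # key entropy of a simple substitution cipher, ≈ 88.4 bits
print(H_K / D)                        # unicity distance ≈ 28 letters of ciphertext
```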
Information theory leads us to believe it is much more difficult to keep secrets than it might first appear. A brute force attack can break systems based on asymmetric key algorithms or on most commonly used methods of symmetric key algorithms (sometimes called secret key algorithms), such as block ciphers. The security of all such methods comes from the assumption that no known attack can break them in a practical amount of time.
Information theoretic security refers to methods such as the one-time pad that are not vulnerable to such brute force attacks. In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the key) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications. In other words, an eavesdropper would not be able to improve his or her guess of the plaintext by gaining knowledge of the ciphertext but not of the key. However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the Venona project was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material.
Pseudorandom number generators are widely available in computer language libraries and application programs. They are, almost universally, unsuited to cryptographic use as they do not evade the deterministic nature of modern computer equipment and software. A class of improved random number generators is termed cryptographically secure pseudorandom number generators, but even they require random seeds external to the software to work as intended. These can be obtained via extractors, if done carefully. The measure of sufficient randomness in extractors is min-entropy, a value related to Shannon entropy through Rényi entropy; Rényi entropy is also used in evaluating randomness in cryptographic systems. Although related, the distinctions among these measures mean that a random variable with high Shannon entropy is not necessarily satisfactory for use in an extractor and so for cryptography uses.
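The gap between the two measures is easy to exhibit. In the sketch below (not from the article; the skewed distribution is a made-up example), a source with about 6 bits of Shannon entropy offers only 1 bit of min-entropy, which is the figure that bounds what an extractor can safely produce:

```python
# A minimal sketch: Shannon entropy versus min-entropy for a skewed distribution.
from math import log2

probs = [0.5] + [0.5 / 1023] * 1023   # one outcome carries half the probability mass

shannon = -sum(p * log2(p) for p in probs)
min_entropy = -log2(max(probs))       # min-entropy depends only on the most likely outcome

print(shannon, min_entropy)           # ≈ 6.0 bits versus exactly 1.0 bit
```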
Seismic exploration
One early commercial application of information theory was in the field of seismic oil exploration. Work in this field made it possible to strip off and separate the unwanted noise from the desired seismic signal. Information theory and digital signal processing offer a major improvement of resolution and image clarity over previous analog methods.[39]
Semiotics
Semioticians Doede Nauta and Winfried Nöth both considered Charles Sanders Peirce as having created a theory of information in his works on semiotics.[40]: 171 [41]: 137  Nauta defined semiotic information theory as the study of "the internal processes of coding, filtering, and information processing."[40]: 91
Concepts from information theory such as redundancy and code control have been used by semioticians such as Umberto Eco and Ferruccio Rossi-Landi to explain ideology as a form of message transmission whereby a dominant social class emits its message by using signs that exhibit a high degree of redundancy such that only one message is decoded among a selection of competing ones.[42]
Integrated process organization of neural information
Quantitative information theoretic methods have been applied in cognitive science to analyze the integrated process organization of neural information in the context of the binding problem in cognitive neuroscience.[43] In this context, an information-theoretical measure is defined on the basis of a reentrant process organization, that is, the synchronization of neurophysiological activity between groups of neuronal populations. Examples include functional clusters (Gerald Edelman and Giulio Tononi's functional clustering model and dynamic core hypothesis (DCH)[44]) and effective information (Tononi's integrated information theory (IIT) of consciousness[45][46][47]). An alternative approach measures the minimization of free energy on the basis of statistical methods (Karl J. Friston's free energy principle (FEP), an information-theoretical measure which states that every adaptive change in a self-organized system leads to a minimization of free energy, and the Bayesian brain hypothesis[48][49][50][51][52]).
^ Burnham, K. P.; Anderson, D. R. (2002). Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach (Second ed.). New York: Springer Science. ISBN 978-0-387-95364-9.
^ a b F. Rieke; D. Warland; R. Ruyter van Steveninck; W. Bialek (1997). Spikes: Exploring the Neural Code. The MIT Press. ISBN 978-0262681087.
^Pinkard, Henry; Kabuli, Leyla; Markley, Eric; Chien, Tiffany; Jiao, Jiantao; Waller, Laura (2024). "Universal evaluation and design of imaging systems using information estimation". arXiv:2405.20559 [physics.optics].
^ a b Massey, James (1990), "Causality, Feedback And Directed Information", Proc. 1990 Intl. Symp. on Info. Th. and its Applications, CiteSeerX 10.1.1.36.5688
^ Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. (February 2009). "Finite State Channels With Time-Invariant Deterministic Feedback". IEEE Transactions on Information Theory. 55 (2): 644–662. arXiv:cs/0608070. doi:10.1109/TIT.2008.2009849. S2CID 13178.
^Kramer, G. (January 2003). "Capacity results for the discrete memoryless network". IEEE Transactions on Information Theory. 49 (1): 4–21. doi:10.1109/TIT.2002.806135.
^ Permuter, Haim H.; Kim, Young-Han; Weissman, Tsachy (June 2011). "Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing". IEEE Transactions on Information Theory. 57 (6): 3248–3259. arXiv:0912.4872. doi:10.1109/TIT.2011.2136270. S2CID 11722596.
^ Simeone, Osvaldo; Permuter, Haim Henri (June 2013). "Source Coding When the Side Information May Be Delayed". IEEE Transactions on Information Theory. 59 (6): 3607–3618. arXiv:1109.1293. doi:10.1109/TIT.2013.2248192. S2CID 3211485.
^ Charalambous, Charalambos D.; Stavrou, Photios A. (August 2016). "Directed Information on Abstract Spaces: Properties and Variational Equalities". IEEE Transactions on Information Theory. 62 (11): 6019–6052. arXiv:1302.3971. doi:10.1109/TIT.2016.2604846. S2CID 8107565.
^Maurer, H. (2021). Cognitive Science: Integrative Synchronization Mechanisms in Cognitive Neuroarchitectures of the Modern Connectionism. CRC Press, Boca Raton/FL, chap. 10, ISBN 978-1-351-04352-6. https://doi.org/10.1201/9781351043526
^Edelman, G.M. and G. Tononi (2000). A Universe of Consciousness: How Matter Becomes Imagination. Basic Books, New York.
^Tononi, G. and O. Sporns (2003). Measuring information integration. BMC Neuroscience 4: 1-20.
^Tononi, G. (2004a). An information integration theory of consciousness. BMC Neuroscience 5: 1-22.
^Tononi, G. (2004b). Consciousness and the brain: theoretical aspects. In: G. Adelman and B. Smith [eds.]: Encyclopedia of Neuroscience. 3rd Ed. Elsevier, Amsterdam, Oxford.
^Friston, K. and K.E. Stephan (2007). Free-energy and the brain. Synthese 159: 417-458.
^Friston, K. (2010). The free-energy principle: a unified brain theory. Nature Reviews Neuroscience 11: 127-138.
^Friston, K., M. Breakstear and G. Deco (2012). Perception and self-organized instability. Frontiers in Computational Neuroscience 6: 1-19.
^Friston, K. (2013). Life as we know it. Journal of the Royal Society Interface 10: 20130475.
^Kirchhoff, M., T. Parr, E. Palacios, K. Friston and J. Kiverstein. (2018). The Markov blankets of life: autonomy, active inference and the free energy principle. Journal of the Royal Society Interface 15: 20170792.