In probability theory, a purely stochastic system is one whose state is non-deterministic (i.e., "random") so that the subsequent state of the system is determined probabilistically. Any system or process that must be analyzed using probability theory is stochastic at least in part.[1][2] Stochastic systems and processes play a fundamental role in mathematical models of phenomena in many fields of science, engineering, and economics.
Stochastic comes from the Greek word στόχος (stokhos, "aim"). The word also denoted a target stick; the scatter of arrows around a target stick planted in a hillside is a fitting image of what is stochastic.
The use of the term stochastic to mean based on the theory of probability goes back to a 1917 publication by Ladislaus Bortkiewicz (1868–1931).[3] Bortkiewicz used it in the sense of making conjectures that the Greek term has borne since the days of the ancient philosophers, and after the title of Ars Conjectandi that Jakob Bernoulli gave to his work (published in 1713) on probability theory.[4]
In mathematics, specifically in probability theory, the field of stochastic processes has been[when?] a major area of research.[citation needed]
A stochastic matrix is a square matrix of non-negative real entries in which each row sums to one (a right stochastic matrix), each column sums to one (a left stochastic matrix), or both (a doubly stochastic matrix).
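As a minimal sketch (with hypothetical transition values), the following Python fragment shows a right stochastic matrix driving one step of a Markov chain:

```python
import numpy as np

# A right stochastic matrix: each row is a probability distribution
# over next states, so every row sums to one.
P = np.array([
    [0.9, 0.1],   # from state 0: stay with 0.9, move with 0.1
    [0.5, 0.5],   # from state 1: stay or move with equal probability
])
assert np.allclose(P.sum(axis=1), 1.0)

# Evolving a distribution over states: one step of a Markov chain.
pi0 = np.array([1.0, 0.0])   # start deterministically in state 0
pi1 = pi0 @ P                # distribution after one step
print(pi1)                   # [0.9, 0.1]
```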
In artificial intelligence, stochastic programs work by using probabilistic methods to solve problems, as in simulated annealing, stochastic neural networks, stochastic optimization, genetic algorithms, and genetic programming. A problem itself may be stochastic as well, as in planning under uncertainty.
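The following sketch of simulated annealing illustrates the role of randomness in such methods; the objective function and parameter values are illustrative, not drawn from any particular application:

```python
import math
import random

def simulated_annealing(f, x0, step=0.1, t0=1.0, cooling=0.995, iters=5000):
    """Minimize f by accepting uphill moves with a temperature-dependent
    probability; the acceptance rule is what makes the search stochastic."""
    x, fx = x0, f(x0)
    t = t0
    for _ in range(iters):
        cand = x + random.uniform(-step, step)
        fc = f(cand)
        # Always accept improvements; accept worse moves with prob e^(-delta/T).
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
        t *= cooling  # gradually reduce the temperature
    return x, fx

# Example: search for the minimum of a bumpy one-dimensional function.
x_best, f_best = simulated_annealing(lambda x: x * x + math.sin(5 * x), x0=3.0)
```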
One of the simplest continuous-time stochastic processes is Brownian motion. This was first observed by botanist Robert Brown while looking through a microscope at pollen grains in water.
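A standard way to approximate Brownian motion numerically is to sum independent Gaussian increments whose variance equals the time step; a minimal Python sketch (the step count is arbitrary):

```python
import numpy as np

# Approximate standard Brownian motion on [0, 1]: independent normal
# increments with variance dt, accumulated from W(0) = 0.
n_steps = 1000
dt = 1.0 / n_steps
increments = np.random.normal(loc=0.0, scale=np.sqrt(dt), size=n_steps)
path = np.concatenate([[0.0], np.cumsum(increments)])
```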
The name "Monte Carlo" for the stochastic Monte Carlo method was popularized by physics researchers Stanislaw Ulam, Enrico Fermi, John von Neumann, and Nicholas Metropolis, among others. The name is a reference to the Monte Carlo Casino in Monaco where Ulam's uncle would borrow money to gamble.[5] The use of randomness and the repetitive nature of the process are analogous to the activities conducted at a casino.
Random methods of computation and experimentation (generally considered forms of stochastic simulation) can arguably be traced back to the earliest pioneers of probability theory (see, e.g., Buffon's needle, and the work on small samples by William Sealy Gosset), but are more specifically traced to the pre-electronic computing era. The difference usually attributed to the Monte Carlo form of simulation is that it systematically "inverts" the typical mode of simulation, treating deterministic problems by first finding a probabilistic analog (see simulated annealing). Previous methods of simulation and statistical sampling generally did the opposite: using simulation to test a previously understood deterministic problem. Though examples of an "inverted" approach do exist historically, they were not considered a general method until the popularity of the Monte Carlo method spread.
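The classic toy illustration of this inversion is estimating the deterministic constant π by random sampling; a minimal sketch:

```python
import random

# "Inverting" simulation: pi is a deterministic quantity, but it can be
# estimated via a probabilistic analog: the fraction of uniform random
# points in the unit square that land inside the quarter circle.
n = 1_000_000
hits = sum(1 for _ in range(n)
           if random.random() ** 2 + random.random() ** 2 <= 1.0)
pi_estimate = 4 * hits / n
```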
Perhaps the most famous early use was by Enrico Fermi in the 1930s, when he used a random method to calculate the properties of the newly discovered neutron. Monte Carlo methods were central to the simulations required for the Manhattan Project, though they were severely limited by the computational tools of the time. It was therefore only after electronic computers were first built (from 1945 on) that Monte Carlo methods began to be studied in depth. In the 1950s they were used at Los Alamos for early work relating to the development of the hydrogen bomb, and became popularized in the fields of physics, physical chemistry, and operations research. The RAND Corporation and the U.S. Air Force were two of the major organizations responsible for funding and disseminating information on Monte Carlo methods during this time, and the methods began to find wide application in many different fields.
Monte Carlo methods require large quantities of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers that had previously been used for statistical sampling.
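A minimal sketch of such a generator is the linear congruential method shown below; the constants are the well-known Numerical Recipes choices, used here purely for illustration:

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """A linear congruential generator, one of the oldest and simplest
    pseudorandom number generators (constants from Numerical Recipes)."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m  # scale to a float in [0, 1)

gen = lcg(seed=42)
samples = [next(gen) for _ in range(5)]
```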
In biological systems, introducing stochastic "noise" has been found[by whom?] to help improve the signal strength of the internal feedback loops for balance and other vestibular communication. It has been found to help diabetic and stroke patients with balance control.[6] Many biochemical events also lend themselves to stochastic analysis. Gene expression, for example, has a stochastic component arising from molecular collisions (as in the binding and unbinding of RNA polymerase to a gene promoter) driven by the Brownian motion of the solution.
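Stochastic chemical kinetics of this kind are commonly simulated with Gillespie's algorithm; the following is a minimal sketch for an mRNA birth-and-death model, with purely illustrative rate constants:

```python
import random

# Gillespie's stochastic simulation algorithm for a minimal gene-expression
# model: mRNA is produced at rate k and degraded at rate g per molecule.
# (k and g are illustrative parameters, not measured values.)
def gillespie_mrna(k=10.0, g=1.0, t_end=10.0):
    t, m = 0.0, 0
    history = [(t, m)]
    while t < t_end:
        rates = [k, g * m]          # production, degradation
        total = sum(rates)
        if total == 0:
            break
        t += random.expovariate(total)          # time to next reaction
        if random.random() < rates[0] / total:  # pick which reaction fires
            m += 1
        else:
            m -= 1
        history.append((t, m))
    return history
```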
Stochastic effect, or "chance effect", is one classification of radiation effects that refers to the random, statistical nature of the damage. In contrast to the deterministic effect, the severity of a stochastic effect is independent of dose; only the probability of an effect increases with dose.
Simonton (2003, Psychological Bulletin) argues that creativity in science (of scientists) is a constrained stochastic behaviour, such that new theories in all sciences are, at least in part, the product of a stochastic process.
Stochastic ray tracing is the application of Monte Carlo simulation to the computer graphics ray tracing algorithm. "Distributed ray tracing samples the integrand at many randomly chosen points and averages the results to obtain a better approximation. It is essentially an application of the Monte Carlo method to 3D computer graphics, and for this reason is also called Stochastic ray tracing."[citation needed]
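The essence of the method is averaging the integrand at random sample points; in the sketch below, the `radiance` function is a hypothetical stand-in for tracing a ray through a scene:

```python
import random

# The core of stochastic (distributed) ray tracing: estimate the integral
# over a pixel's area by averaging the integrand at random sample points.
def radiance(u, v):
    return 0.5 + 0.5 * u * v    # hypothetical smooth shading function

def pixel_value(samples=64):
    total = 0.0
    for _ in range(samples):
        u, v = random.random(), random.random()  # random point in the pixel
        total += radiance(u, v)
    return total / samples   # the Monte Carlo average approximates the integral
```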
Although most computers are deterministic machines, their complexity makes exhaustive deterministic analysis infeasible. Consequently, stochastic forensics analyzes computer crime by viewing computers as stochastic processes.
In music, mathematical processes based on probability can generate stochastic elements.
Stochastic processes may be used in music to compose a fixed piece or may be produced in performance. Stochastic music was pioneered by Iannis Xenakis, who coined the term stochastic music. Specific examples of mathematics, statistics, and physics applied to music composition are the use of the statistical mechanics of gases in Pithoprakta, statistical distribution of points on a plane in Diamorphoses, minimal constraints in Achorripsis, the normal distribution in ST/10 and Atrées, Markov chains in Analogiques, game theory in Duel and Stratégie, group theory in Nomos Alpha (for Siegfried Palm), set theory in Herma and Eonta,[7] and Brownian motion in N'Shima.[citation needed] Xenakis frequently used computers to produce his scores, such as the ST series including Morsima-Amorsima and Atrées, and founded CEMAMu. Earlier, John Cage and others had composed aleatoric or indeterminate music, which is created by chance processes but does not have the strict mathematical basis (Cage's Music of Changes, for example, uses a system of charts based on the I Ching).
When color reproductions are made, the image is separated into its component colors by taking multiple photographs filtered for each color. One resultant film or plate represents each of the cyan, magenta, yellow, and black data. Color printing is a binary system, where ink is either present or not present, so all color separations to be printed must be translated into dots at some stage of the workflow. Traditional line screens, which are amplitude modulated, had problems with moiré but were used until stochastic screening became available. A stochastic (or frequency-modulated) dot pattern creates a sharper image.
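In its very simplest (white-noise) form, frequency-modulated screening can be sketched as random thresholding, though production systems use carefully designed blue-noise masks or error diffusion:

```python
import random

# Frequency-modulated ("stochastic") screening in miniature: each grayscale
# value in [0, 1] becomes an irregular scatter of binary dots whose local
# density matches the tone, rather than a regular amplitude-modulated grid.
def stochastic_screen(gray_rows):
    return [[1 if random.random() < g else 0 for g in row]
            for row in gray_rows]

halftone = stochastic_screen([[0.2, 0.5], [0.8, 1.0]])  # 1 = ink dot present
```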
Non-deterministic approaches in language studies are largely inspired by the work of Ferdinand de Saussure, for example, in functionalist linguistic theory, which argues that competence is based on performance.[8][9] This distinction in functional theories of grammar should be carefully distinguished from the langue and parole distinction. To the extent that linguistic knowledge is constituted by experience with language, grammar is argued to be probabilistic and variable rather than fixed and absolute. This conception of grammar as probabilistic and variable follows from the idea that one's competence changes in accordance with one's experience with language. Though this conception has been contested,[10] it has also provided the foundation for modern statistical natural language processing[11] and for theories of language learning and change.[12]
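As a minimal illustration of the probabilistic view of grammar that underlies statistical natural language processing, a toy bigram model estimates word-transition probabilities from observed usage (the corpus here is invented):

```python
from collections import Counter

# "Knowledge" of the language is here nothing but relative frequencies
# gathered from experience, rather than fixed rules.
corpus = "the cat sat on the mat the cat ran".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])

def p(next_word, given):
    return bigrams[(given, next_word)] / unigrams[given]

print(p("cat", given="the"))  # relative frequency of "cat" after "the"
```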
Stochastic social science theory is similar to systems theory in that events are interactions of systems, although with a marked emphasis on unconscious processes. The event creates its own conditions of possibility, rendering it unpredictable if only for the number of variables involved. Stochastic social science theory can be seen as an elaboration of a kind of 'third axis' in which to situate human behavior alongside the traditional 'nature vs. nurture' opposition. See Julia Kristeva on her usage of the 'semiotic', Luce Irigaray on reverse Heideggerian epistemology, and Pierre Bourdieu on polythetic space for examples of stochastic social science theory.[citation needed]
Stochastic methods are widely used for uncertainty assessment in future performance predictions of oil and gas reservoirs.[13]
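A minimal sketch of such an assessment propagates uncertain inputs through the standard volumetric formula for original oil in place; all parameter ranges below are purely illustrative:

```python
import random

# Monte Carlo uncertainty assessment: sample uncertain reservoir parameters
# and propagate them through the volumetric formula
#   OOIP = 7758 * A * h * phi * (1 - Sw) / Bo
# (barrels; A in acres, h in feet, 7758 barrels per acre-foot).
def sample_ooip():
    A = random.uniform(800, 1200)       # drainage area, acres
    h = random.uniform(40, 60)          # net pay thickness, ft
    phi = random.uniform(0.15, 0.25)    # porosity, fraction
    sw = random.uniform(0.2, 0.4)       # water saturation, fraction
    bo = random.uniform(1.1, 1.3)       # formation volume factor, rb/stb
    return 7758 * A * h * phi * (1 - sw) / bo

estimates = sorted(sample_ooip() for _ in range(10_000))
p10, p50, p90 = (estimates[int(q * len(estimates))] for q in (0.1, 0.5, 0.9))
```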
Manufacturing processes are assumed to be stochastic processes. This assumption is largely valid for both continuous and batch manufacturing processes. Testing and monitoring of the process are recorded using a process control chart, which plots a given process control parameter over time. Typically a dozen or many more parameters are tracked simultaneously. Statistical models are used to set limit lines, which indicate when corrective action must be taken to bring the process back to its intended operational window.
This same approach is used in the service industry where parameters are replaced by processes related to service level agreements.
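A minimal sketch of the limit-line calculation, assuming the conventional three-sigma rule and illustrative measurement data:

```python
import statistics

# Control-chart limit lines: with an in-control process assumed roughly
# normal, limits are conventionally set at the mean plus or minus three
# standard deviations of the monitored parameter.
measurements = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3, 10.1, 9.9]
mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # upper/lower control limits

out_of_control = [x for x in measurements if not (lcl <= x <= ucl)]
```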
The financial markets use stochastic models to represent the seemingly random behaviour of assets such as stocks, commodities, relative currency prices (i.e., the price of one currency compared to that of another, such as the price of the US dollar compared to that of the euro), and interest rates. These models are then used by quantitative analysts to value options on stock prices, bond prices, and interest rates (see Markov models). Stochastic modelling is also at the heart of the insurance industry.
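A minimal sketch of this kind of model: simulating terminal stock prices under geometric Brownian motion and averaging discounted payoffs to value a European call option (all parameter values are illustrative):

```python
import math
import random

# Monte Carlo valuation of a European call under geometric Brownian motion,
# the stochastic model underlying Black-Scholes. All parameters illustrative.
def mc_call_price(s0=100.0, k=105.0, r=0.05, sigma=0.2, t=1.0, n=100_000):
    payoff_sum = 0.0
    for _ in range(n):
        z = random.gauss(0.0, 1.0)   # standard normal shock
        # Risk-neutral terminal price after time t.
        st = s0 * math.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
        payoff_sum += max(st - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n  # discounted average payoff
```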
Marketing, the changing movement of audience tastes and preferences, and the appeal of certain film and television debuts (i.e., their opening weekends, word of mouth, top-of-mind awareness among surveyed groups, star name recognition, and other elements of social media outreach and advertising) are determined in part by stochastic modeling. A recent attempt at repeat-business analysis was done by Japanese scholars[citation needed] and is part of the Cinematic Contagion Systems patented by Geneva Media Holdings; such modeling has been used in data collection from the time of the original Nielsen ratings to modern studio and television test audiences.