Physicists have abandoned determinism as a fundamental description of reality. The most precise physical laws we have are quantum mechanical, and the principle of quantum uncertainty limits our ability to predict, with arbitrary precision, the future state of even the simplest imaginable system. However, scientists began developing probabilistic, that is, stochastic, models of natural phenomena long before quantum mechanics was discovered in the 1920s. Classical uncertainty preceded quantum uncertainty because, unlike the latter, the former is rooted in easily recognized human conditions. We are too small and the universe too large and too interrelated for thoroughly deterministic thinking.
For whatever reason—fundamental physical indeterminism, human finitude, or both—there is much we don't know. And what we do know is tinged with uncertainty. Baseballs and hydrogen atoms behave, to a greater or lesser degree, unpredictably. Uncertainties attend their initial conditions and their dynamical evolution. This also is true of every artificial device, natural system, and physics experiment.
Nevertheless, physics and engineering curriculums routinely invoke precise initial conditions and the existence of deterministic physical laws that turn these conditions into equally precise predictions. Students spend many hours in introductory courses solving Newton's laws of motion for the time evolution of projectiles, oscillators, circuits, and charged particles before they encounter probabilistic concepts in their study of quantum phenomena. Of course, deterministic models are useful, and, possibly, the double presumption of physical determinism and superhuman knowledge simplifies the learning process. But uncertainties are always there. Too often these uncertainties are ignored and their study delayed or omitted altogether.
An Introduction to Stochastic Processes in Physics revisits elementary and foundational problems in classical physics and reformulates them in the language of random variables. Well-characterized random variables quantify uncertainty and tell us what can be known of the unknown. A random variable is defined by the variety of numbers it can assume and the probability with which each number is assumed. The number of dots showing face up on a rolled die is a random variable. This random variable can assume the integer values 1 through 6, and, if the die is unbiased and honestly rolled, it is reasonable to suppose that any particular side will come up one time out of six in the long run, that is, with a probability of 1/6.
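The long-run interpretation of probability in the die example can be checked numerically. The following minimal sketch (the function name, seed, and sample size are illustrative choices, not anything prescribed by the text) simulates many honest rolls of a fair die and reports the relative frequency of each face, which should settle near 1/6:

```python
import random

def die_frequencies(n_rolls, seed=0):
    """Roll a fair six-sided die n_rolls times and return a dict
    mapping each face (1-6) to its observed relative frequency."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    counts = {face: 0 for face in range(1, 7)}
    for _ in range(n_rolls):
        counts[rng.randint(1, 6)] += 1
    return {face: c / n_rolls for face, c in counts.items()}

# With many rolls, each frequency approaches the probability 1/6.
freqs = die_frequencies(100_000)
for face in range(1, 7):
    print(face, round(freqs[face], 3))
```

For 100,000 rolls the expected statistical scatter in each frequency is roughly sqrt((1/6)(5/6)/100000), about 0.001, so every observed frequency lands close to 1/6 while never matching it exactly, which is exactly the sense in which a random variable is "well characterized" by its probabilities rather than by a single predictable value.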