ISSN: 2754-4753 | Open Access

Journal of Physics & Optics Sciences

Is Quantization a Co-Operative Phenomenon? On the Ensemble Interpretation of Quantum Physics

Author(s): Peter Marquardt

Abstract

A fresh look at Planck's quantum, h, re-interprets action as the companion of power in a dynamic interplay of energies in a multi-particle system of the micro world. An exchange of energy, δE, requires a characteristic duration, δt, with δE and δt tied together in a natural way. Action, the product δEδt, must have a non-zero lower limit, h, as a classical necessity everywhere in physics. Zero h would constitute a singularity. The veritable quantum is action, not energy. Black body radiation formulas (Wien, Planck, and Rayleigh) all require this non-zero h. Planck's thermodynamic approach was not consistently developed further. Einstein, in his 1905 description of the photo effect, chose the single "quantum of light" as a statistically independent actor in radiation. The resulting discussions about the wave-particle dualism, the meaning of uncertainty, the collapse of wave packets when observed, self-interference, and related problems can be avoided by the consistent view of the ensemble interpretation. In agreement with action defined as δEδt, only co-operative phenomena produce the effects we observe. In particular, this concerns the photon, directly associated with h, as a naturally interactive quantum. Careful experiments on low intensity interference suggest taking the photon out of its isolation, thus changing the general view of quantum physics with surprising consequences. Do photons transport energy or action? To understand where 20th century physics has gone, a glance at the historical background is in order.

On Potentials and Action

Our everyday experience in the macro world tells us that dynamic (ex)changes of energy, δE, require their corresponding duration, δt (not to be confused with abstract time which, in spite of common usage, has to be defined as independent of any physical process): An explosion quickly releases large amounts of energy - the controlled release of smaller amounts "takes more time". We may safely assume this principle to hold in the micro world as well. The assumption of action being the product "energy times time" merely defines its dimension and misses its physical value. The integral "energy over time" is the mathematical rendering of energy changing during the considered process but does not explicitly express the close relationship between δE and δt. The theorem of kinetic energy (δE = vδp with v = δr/δt) puts the pairs (δp, δr) and (δE, δt) on an equal footing. This works fine for action to be expressed either way, δpδr or δEδt.
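To spell out the step implied here: substituting v = δr/δt into the kinetic energy theorem gives

δEδt = (vδp)δt = (δr/δt)δpδt = δpδr,

so the two renderings of action are indeed the same quantity.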

Action, defined as the product of an inseparable pair δEδt or δpδr, is a physical principle and can be traced back to interatomic potentials, φ(r), due to interacting sources (charges or masses). The gradients of the potentials φ(r) tie together the two ingredients, δE and δt, of action. The amount of energy δE to be dispensed corresponds to the steepness of φ(r), with φ(r) typically flattening with distance r. As a rule, large amounts of δE occur in short intervals of δt and vice versa. We recognize a simple natural relation, "if δE is large, then δt is small and vice versa", that complies with causality. Potentials provide a helpful mechanism whenever energy comes into play, dynamic processes or binding included. They guarantee interatomic coupling and rule out isolated ("spontaneous") processes. We see potentials as the justification for the present topic, the ensemble interpretation. The force on a particle interacting with its surroundings is proportional to the gradient of an effective potential provided by the surroundings. The simple 1/r potential of the Newton or the Cavendish-Coulomb type is just the static two-body approximation. The physics of potentials gets quite involved for more than two interacting partners, still more so if dynamics (motion) comes into play, when velocity-squared terms contribute to the effective potential.

Remark in passing: This kind of potential is quite natural and not surprising. We observe dynamic velocity dependent potentials every day, just take oscillations. The v²/2 term of kinetic energy is the first v² ≪ c² approximation of such a dynamic potential in neo mechanics (referring to the Universe as the only legitimate inertial system and therefore not to be confused with special relativity). The neo mechanical meaning of c² relates to the Universe as a ubiquitous background potential ("Mach's Principle") and may be related to the ubiquitous existence of photons [1]. The ongoing interplay between static φ(r) and, more so, dynamic φ(v²) potentials is the striking evidence of energy conservation. What we call "temperature" is dominated by φ(v²) potentials. The conversion of φ(v²) potentials into φ(r) potentials and vice versa is everyday experience. The topic of dynamic potentials including the later famous c², pioneered by W. E. Weber in 1846, is fascinating because of its far-reaching consequences for electrodynamics and mechanics, including gravity, alike [2, 3].

Dynamic (or generalized) potentials tell us not to isolate a process out of its context, including the restless interactions underlying radiation. This rules out the misleading concept of "spontaneous emission". Energy is to be defined in the context of interactions, which justifies the ensemble interpretation (not only) of quantum physics which, officially, focuses on Planck's individual quantum, h. The above presentation of action calls for the all-important context and illuminates the necessity that h, and therefore both δE and δt, each have a finite value.

The concept of energy and of how energy is processed stands or falls with the physical justification of how it is applied. Particularly in the micro world ("quantum physics") the dynamic context is of importance. Yet, focusing on single particles, the defenders of the prevailing doctrine, ironically named "Copenhageners" by their critics hinting at the origin of the Bohr-Heisenberg school in Denmark, produced some of their questionable triumphs [4].

Planck's Threshold of Action in Mainstream Theory
Waves and Particles

In the Copenhagen view Planck's quantum of action had a mathematical solo career of its own. Critics do not accept this. There is no causality for isolated particles because causality requires interactions. Consequently, there is no reason for an isolated particle to exhibit wave properties, neither wavelength λ, nor frequency ν, nor phase velocity c = νλ. Yet the basic wave parameters, defined to express the phase of a wave, were associated with Planck's quantum h and applied to a single particle, including the photon. Experimentally, Planck's frequency condition E = hν and de Broglie's momentum condition p = h/λ work fine - for ensembles, that is, both for charged massive particles (electron microscope) and photons (wave optics). That does not prove wave properties of single, let alone non-interacting, particles.

Another aspect of the solo career of h is a measuring prescription placing the observer's influence in the foreground. As an "uncertainty relation" demanding the simultaneous determination of a single particle's position, r, and momentum, p, it was made one of Nature's intrinsic principles.

There is no way we can perform that measurement on an isolated particle as demanded by uncertainty, nor do we have to accept the handcuffs of a typical gedanken experiment. That does not deprive us of an analysis based on our observations of interacting ensembles, from which we can still arrive at a valid statement about the parameters of an undisturbed ensemble. The relation p = h/λ of a coherent wave, expressed in terms of the intrinsic product δpδr ("Debye-Sommerfeld range", see below), may be well below Planck's threshold of action. What then would h have to do with waves? Its value represents the condition for the ability of two coherent ensembles to interfere (δpδr < h) or for interference to fail (δpδr > h) [5]. These two criteria concern the phase relation between photons in a coherent wave. Uncertainty is not a natural principle - the lower threshold of action is.

Prescribed as a double measurement on a single particle ("Heisenberg's microscope"), uncertainty has a built-in trivial impossibility: The simultaneity ploy makes tacit use of a singularity, similar to δt = 0. Mathematically it is easy to enter and handle singularities, either of the zero or of the infinity type. In physics we have to be careful about zero and infinity. The awareness of singularities provides our access to the ensemble interpretation of quantum physics. The key is the non-zero finiteness of both δE and δt. Uncertainty is founded on the non-commutativity of position and momentum operators with the tool box of matrix mechanics behind it. As a purely mathematical claim it misses the essence of action. Operators are at best descriptive. They cannot "give" a particle its position or momentum, let alone a non-interacting particle. Without taking into account dynamic interactions in an ensemble, these parameters are meaningless. Causality requires non-zero gradients of the surrounding potentials. Copenhagen's academic case of isolated particles does not explicitly care for the importance of causal interactions in the surrounding potential. There seems to be disagreement among the Copenhageners on how to define a "time operator" - is that the reason why uncertainty discussions prefer δpδr in their operator presentation?

Nature's principles, causality in particular, must be placed above and distinguished from human mathematics and from measurement prescriptions of the "gedanken experiment" type. A numerically correct result, readily achieved by benevolent mathematics, is necessary but not sufficient to prove a theory right, let alone flawless.

Naturally any measurement requiring non-zero δE affects the state of a micro system. This should not be taken as an invitation to a positivistic attitude ("it exists only if observed").

The quantum of action in terms of δEδt implies changes in interacting ensembles. Direct observation, on the other hand, is not required for an analysis of the (academic) undisturbed state of particles. Were δE strictly zero, a fatal mathematical singularity, nothing would happen and δt would not matter. Non-zero δE requires the fitting δt for finite action and power.

Action, Power, and a Classical Look at Quantization
The Veritable Quantum

Why is energy so often accepted as the quantum beside, or instead of, action? Action is not energy. We may argue that the confusion happened because action is just the silent companion of the other, more conspicuous side of energy processing, power.

Action and power together are the key to access the finite lower limit of action. Both δE and δt have to be finite somehow, δE = 0 being the academic static case "nothing happens". Consider the realistic case of finite δE, which excludes δt = 0 as a singularity for power. This forbidden infinity, not as prominent as the "ultraviolet catastrophe", necessarily implies a lower finite limit of action, the value h = 6.626×10⁻³⁴ J s established by uncounted experiments since Planck's analysis of thermal radiation. Planck's approach, however, was different from the above singularity consideration. He started with entropy and the fatal assumption of independent oscillators. Dynamic disorder ("entropy"), responsible for temperature, does not exclude local ordering and co-operative oscillations in a thermodynamic ensemble with radiative coupling between neighbouring atoms.
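A minimal way to see this argument in formulas, using only the two definitions already introduced: with power P = δE/δt and action A = δEδt, eliminating δt gives

A·P = (δE)², i.e. P = (δE)²/A.

For fixed non-zero δE, letting the action A go to zero drives the power P to infinity; a finite lower limit A ≥ h is what keeps power finite.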

Black body radiation became the place to bridge the gap between micro and macro world in what concerns the two complementary mechanisms of how energy is processed, action and power. The spectra reveal the importance of action together with power, as their intensity comes in W/m². Integrated over dν they deliver intensity [power/area], the quantity which is actually recorded. The power delivered to our planet by our biggest thermal radiation source is some 1,300 W/m², ideally without atmospheric absorption ("solar constant").

Thermal radiation spectra offer us insight into the roles of both action and power. This suggests considering action and power as companions. Their common ingredients are δE and δt, albeit in a complementary manner. In a δE vs δt diagram, constant action shows as hyperbolas and constant power as straight lines. Strangely, these two natural ways of energy processing are usually not considered together. Is this because power δE/δt is more conspicuous than action δEδt? We are not used to measurements of action, while the experimental access to power is obvious.
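As an illustration of the geometry just described, the following minimal sketch (Python, with numpy and matplotlib assumed available; units arbitrary) draws both families of curves in a δE vs δt diagram:

```python
# Constant action dE*dt = A traces hyperbolas; constant power dE/dt = P
# traces straight lines through the origin. Units are arbitrary.
import numpy as np
import matplotlib.pyplot as plt

dt = np.linspace(0.05, 5.0, 400)

for A in (0.5, 1.0, 2.0):            # constant action: dE = A / dt
    plt.plot(dt, A / dt, label=f"action = {A}")
for P in (0.5, 1.0, 2.0):            # constant power:  dE = P * dt
    plt.plot(dt, P * dt, "--", label=f"power = {P}")

plt.xlabel("duration dt (arb. units)")
plt.ylabel("energy exchanged dE (arb. units)")
plt.legend()
plt.show()
```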

In spite of its long history in physics, action proved to be a puzzle when it comes to its quantization. Planck was aware of action, but still in his later years [1943] he conceded [6]:
"Nun aber erhob sich das theoretisch allerschwierigste Problem, dieser sonderbaren Konstanten einen physikalischen Sinn beizulegen". ["But now the most difficult theoretical problem arose, to attribute a physical sense to this strange constant"].

The type zero singularity for both δE and δt, easily imported within the continuum model underlying oscillations, makes us recognize the not-so-mysterious finite lower limit of action as a physical necessity. Planck's idea of oscillating charges as the origin of radiation, however, suggests asking how the minimum of action h might be built up before emission.

Debye and Sommerfeld [1913] speculated on what might happen before one unit of h becomes effective [7]. Assuming a kind of iterative process, they proposed an accumulation of oscillations in the shady range before emission takes place, i.e. when the value of h is reached. The authors stay in the single photon picture of the photoelectric effect, but their assumption of a piling up process before radiation is released is quite attractive. A revival of this intriguing idea may be helpful in handling the continuum vs discreteness controversy. Applied to the ensemble interpretation it hints at the piling up of quanta to form a coherent wavelet ("bunch", see below) before emission.

In any case, zero action for nonzero δE is impossible. To repeat: it would correspond to infinite power. Avoiding this singularity (besides the "ultraviolet catastrophe" connected with Rayleigh's radiation formula ν²kT dν/c²) constitutes the value of Planck's and also of Wien's findings.

The "Cradle of Quantum Physics"

The history of thermal radiation and the non-zero minimum product h = δEδt unearthed by Planck with his formula marked the official advent of quantum physics. But was it Planck alone? We find traces of h hidden in Wien's and even in Rayleigh's "classical" formula (see below).

In 1896 Wien formulated fundamental properties of radiation formulas: two "displacement laws" demanded by thermodynamics, and the general structure in terms of the basic independent parameters wavelength λ and temperature T, in agreement with the T⁴ intensity formulated by Stefan and Boltzmann for the total spectra integrated over λ. Wien's radiation formula is the special case valid for small λ. Originally formulated in terms of λ as preferred in spectroscopy, these findings were later expressed by the equivalent representation in terms of frequency ν by virtue of the relation c = νλ between the wave parameters ν and λ and the phase speed, c. In retrospect we see that the general structure ν³f(ν/T), backed up by thermodynamics, makes Wien one of the fathers of quantization (see below).

The argument of a function must be a pure number, here a ratio of two energies. This enters the factor action/entropy to arrive at x = hν/kT, the ratio h/k being indirectly present in Wien's 1896 original. Unlike Boltzmann, who expressed entropy in terms of the universal gas constant R (a "mole of entropy"), Planck divided R by Avogadro's number, N, and introduced k = R/N (the later "Boltzmann constant"). In a way this contradicted the thermodynamic (i.e. co-operative) understanding of radiation and the discovery of h. Einstein, too, used R/N in his famous 1905 paper. Ever since, we are not used to a "mole of action". While the division of action and entropy by N does not affect the numerical results, it severely influenced the physical interpretation.

It so happened that the evolution of quantum physics was pushed forward by a school that placed the single particle, including the photon, in the spotlight and did not care much about energy and its changes being physical many-particle principles. In spite of Planck's thermodynamic approach physics was not saved from the not-so-obvious "single particle singularity". There is no evidence that thermal radiation consists of statistically independent single photons. Yet this became one of the official doctrines.

It is still common belief that the discovery of this quantum h entitles us to claim we are able to observe or handle it as an isolated individual directly by experiment. The experiments presented below tell a different story.

So do radiation formulas if analyzed in terms of Wien's pioneering work, namely the general form ν³f(ν/T): u(ν,T) = C₁ν³ exp(−C₂ν/T) with C₁ = h/c² and C₂ = h/k.

The above formulas describe an approach as close to the discovery of quantized action as was possible before Planck. They tell us about the hidden relationship between "pre and post quantum physics".

It is instructive to analyze the statistical and hence thermodynamic nature of radiation as a co-operative effect. Strictly speaking, this requires the thermodynamic limit, ideally an infinite number of interacting particles in an infinite volume at finite density. This not-so-popular condition is essential for the consistent description of statistical ensembles. On the other hand, even thermal radiation, erroneously believed to be completely stochastic, exhibits distinct features of local ordering (coherence), which casts a different light - the ensemble interpretation - on the Copenhagen view of the photo effect and on quantum physics in general.

As we learn on the basis of Wien's C₁ν³exp(−C₂ν/T), the three formulas by Rayleigh, Wien, and Planck have a 3-factor structure in common:

A multiple of π, omitted here together with Jeans' 1905 correction of Rayleigh's factor.
The intensity factor hν³dν/c², in [J/m²s] by dimension. The "infinitesimal" dν is necessary because the measurements cover a finite frequency range, symbolized by the integral over dν, in accordance with the ubiquitous Doppler effect and local temperature fluctuations that prohibit ideally sharp spectral lines in an entropy driven ensemble that emits thermal radiation. In retrospect, we may confidently say that this hν³dν/c² must be the same for all three formulas because it provided the same experimental access to intensity in the respective spectral ranges.

This leaves us with the decisive difference between Rayleigh, Wien, and Planck: the statistical factor f(x), with x = hν/kT > 0 necessarily a pure number, which, in accordance with Wien's general expression C₁ν³f(C₂ν/T), turns out to be

fR(x) = 1/x (Rayleigh),
fW(x) = exp(−x) (Wien),
fP(x) = 1/[exp(x) − 1] (Planck)

A log f(x) vs log x representation (Figure 1) shows where the innocent-looking difference between fW(x) and fP(x) comes into play and how fP(x) bridges 1/x and exp(−x). Thus, Planck reconciled the radiation formulas by Rayleigh and Wien to cover the whole radiation spectrum. In so doing, he saved physics from the "ultraviolet catastrophe". This is how h was found. One more, possibly more fatal, singularity that would have threatened physics is avoided by the mandatory non-zero lower limit of h. This makes h a global player in physics. The detection of the quantum of action, h, had to wait till Planck arrived at fP(x), a kind of symbolic albeit mathematical bridge between two complementary ranges of the spectra in the cross-over region. In retrospect, the occurrence of h in all three formulas is obvious, although it cancels in Rayleigh's formula, thus hiding from official attention. This shows how closely Wien missed being recognized as the discoverer of quantization. The f(x) factors, indifferent to scaling up or down, do not a priori talk about single independent photons. The ensemble interpretation applies the statistics to coherent photon bunches.
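The bridging can be made explicit by the two limits of fP(x): for x ≪ 1, exp(x) − 1 ≈ x, so fP(x) ≈ 1/x = fR(x); for x ≫ 1, exp(x) − 1 ≈ exp(x), so fP(x) ≈ exp(−x) = fW(x).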

Figure 1: Based on Wien's general structure of radiation formulas we can rearrange the formulas by Rayleigh, Wien, and Planck and identify the statistical factors 1/x, exp(−x), and 1/[exp(x) − 1]. Plotted as log f(x) vs log x, the comparison demonstrates where fR(x) and fW(x) fail.
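For readers who wish to reproduce the comparison of Figure 1 themselves, a minimal sketch (Python, matplotlib assumed) using only the three statistical factors defined above:

```python
# f_R(x) = 1/x, f_W(x) = exp(-x), f_P(x) = 1/(exp(x) - 1),
# plotted as log f(x) vs log x, with x = h*nu/(k*T).
import numpy as np
import matplotlib.pyplot as plt

x = np.logspace(-2, 1.2, 400)

plt.loglog(x, 1.0 / x, label="Rayleigh 1/x")
plt.loglog(x, np.exp(-x), label="Wien exp(-x)")
plt.loglog(x, 1.0 / (np.exp(x) - 1.0), label="Planck 1/(exp(x) - 1)")

plt.xlabel("x = h nu / (k T)")
plt.ylabel("f(x)")
plt.legend()
plt.show()
```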

The quantum of action, h, quite a puzzle not only in the old days, is still mistaken as a quantum of energy - and as the trademark exclusively of "quantum physics". The quantum also shows up quite classically in the early planetary model of the hydrogen atom, where the lowest value of the angular momentum, h/2π, matches the experimental ionization energy. This vector aspect of h, possibly a stability criterion, applies to bound particles with non-zero mass. The non-zero lower limit of h either way, as scalar or vector, is equally important to physics as is energy conservation. We here focus further on thermal radiation and its interaction with matter.

Quantization and the Photo Effect

Einstein's celebrated view of the photoelectric effect [1905] as a "one quantum of light in - one electron out" process marked the early interpretation of quantization in terms of a single particle ("Lichtquant"), the later "photon" [8]. In his words: "Die einfachste Vorstellung ist die, dass ein Lichtquant seine ganze Energie an ein einziges Elektron abgibt; wir wollen annehmen, dass dies vorkomme" ["The simplest assumption is that a quantum of light delivers its total energy to a single electron; let us assume this to happen."]. He left a backdoor open and continues: "Es soll jedoch nicht ausgeschlossen sein, dass Elektronen die Energie von Lichtquanten nur teilweise aufnehmen." ["Yet it shall not be excluded that electrons take up the energy of light quanta only partially."]. Einstein was not specific about his "quantum of light" nor about the meaning of action. In the same paragraph he talks about "quanta of energy" and avoids the specific mention of Planck's h as action. He also leaves open the option that many quanta of energy may be emitted by one electron and ends this paragraph with "Es ist also anzunehmen, dass die kinetische Energie eines Elektrons zur Erzeugung vieler Lichtenergiequanten verwendet wird" ["Hence it has to be assumed that the kinetic energy of one electron is used for the production of many quanta of light energy"].
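For reference (Einstein does not write it in the passages quoted above), the energy balance that later became the textbook form of his light-quantum picture of the photo effect reads

hν = W + E_kin,max,

with W the work function of the emitting surface and E_kin,max the maximum kinetic energy of the released electron.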

The photo effect had a foggy start. Nevertheless, the single photon, albeit still enigmatic to this day, was made the main actor of quantization and was over-equipped with wave parameters (intrinsic wavelength and frequency, phase velocity, ability of self-interference, etc), the ongoing discussions about the wave-particle dualism being its price. Here we take the less beaten path that respects waves as ordered patterns, with photon bunches propagating as coherent patterns characterized by a specific spacing. This ensemble interpretation of quantum physics, supported by low intensity experiments, may serve as a consistent guide not only through the phenomena of light. These experiments call for a critical review of the photo effect and the efficiency of photon detectors.

Photons and Interference

The early discoveries of light behaving like waves were made with thermal radiation. The ingredients of interference, length and duration of coherence, were elaborated with sources believed to be statistical emitters. Wave properties are the better defined, the longer a coherent wave train is and the higher the intensity it delivers. No ideally infinite wave train is needed to see this. Thermal radiation, too, is wave physics exhibiting wavelengths, frequencies, coherence and interference. The ensemble interpretation considers these as genuine collective features. Whenever they show up in an experiment, they do not originate from isolated photons. A laser does not create but enhances coherence. Its radiation is just closer to the ideal limit of a well-defined wave. Any light source can be chosen as suited for the "single photon test". With laser radiation the experimental answer will be more pronounced.

Carefully obtained experimental results remain untouched but are put in a different context. Interference is the key property that has to be provided by experiment to decide between single particle and wave. Dirac's statement "a photon interferes only with itself" lacks a physical basis, let alone for a photon assumed as pointlike. It may have its roots in the postulate of a photon's intrinsic frequency. "Self-coherence" as a construct is not supported by the solid evidence of the close relationship between coherence and interference as collective phenomena. Likewise, "self-energy" disregards the very essence of energy, ubiquitous interactions, including waves. A critical experimental approach should check interference at low intensities to find out what happens when the photon flux is continually reduced. This has been done.

Following Wesley [2006] we exemplify experimental and theoretical aspects of bunches [1]. He identifies and analyzes experiments and experimental conditions that qualify to provide a conclusive answer to the above question: There is a definite lower threshold intensity below which interference does not occur. He contrasts them with the numerous claims of experiments on the single photon and their shortcomings. Here we focus on Wesley's presentation of two experiments off the beaten path, providing evidence for the ensemble interpretation - the experiments by Dontsov and Baz [1967] and by Panarella [1981, 1982, 1987], followed by a model of bunches proposed in terms of quantum potentials [9].

Low Intensity Interference and Diffraction - the Experimental Side

The Copenhagen claim that the single photon is accessible by experiment is based on the assumption that light sources continually emit statistically independent photons which exhibit wave properties even when the number of photons is continually reduced. Furthermore, the interpretation of radiation-induced electron emission as "one photon in, one electron out" fuelled the belief in 100% detector efficiency to be linearly extrapolated down to the single photon limit. Eventually this led to the particle-wave duality with single particle waves extending throughout space. Critics see this in conflict with all classical evidence of wave physics, including light, where the length and duration of coherence and the path difference decide on interference. They doubt whether wave properties apply to single photons: Interference is the trademark of interaction ("superposition") between coherent waves. Moreover, they doubt whether the single photon has ever been observed.

Were Dirac's dictum true, single photon interference would signal the presence of all photons down to the lowest accessible intensities. Instead, we always start looking at large photon ensembles, leading us to the key question: What happens if the number of interfering photons is continually reduced? Decisive experiments investigating whether or not interference may be extrapolated down to the single photon limit have been carried out. They question the official claims of performing experiments on single photons. Interference is probed to decide on its origin, the single photon or the conventional wave train. By its classical definition, interference requires at least two different coherent waves with a fixed phase difference between them while they meet. Interference, then, as a multi-particle phenomenon, is a definite criterion to distinguish mutually coherent wave trains or bunches from single photons.

The experimental approach raises the question how to identify the existence of the bunches and how to make sure the observed patterns originate from interference between these bunches. The main idea is to reduce the intensity contributed by the bunches so as to increase their distance (path difference) beyond their coherence length, i.e. to gradually produce a non-coherent flux of bunches. Clearly, the total number of photons present in the experiment must be distinguished from the number of bunches that actually show interference. The duration of the exposure of the detector (photographic film or photo multiplier) is the appropriate parameter to compensate low intensities of photon fluxes. Thus, the total number of photons can be controlled independently of the intensity.

This kind of experiment is demanding and faces technical difficulties that easily may lead to premature claims if performed in order to "confirm" the official single-photon-is-a-wave doctrine.

Careful identification and analysis of typical experimental problems are mandatory:
The method of attenuation, quantitative control of intensity, detector efficiency and possible non-linearity in the regime of low intensities, photo multiplier counts, calibration and manufacturers' specifications of the detector, dark background noise, etc.

In order to be conclusive, these experiments must clearly distinguish between the intensity and the total flux of photons (the rate of arrival, estimated from the total power delivered to the detector times the duration of exposure). The latter is an important parameter to compensate for the reduced intensity. It helps to distinguish between correlated and uncorrelated photons. The observation of interference, or its disappearance below a certain low intensity threshold, then would be a strong indicator for nonlinear behaviour and for the conditions of wave formation.

Wesley presents the Dontsov-Baz [1967] experiment as the first truly serious low intensity interference experiment [9]. Monochromatic light from a Hg source is passed through a system of lenses, a Fabry-Perot interferometer and a gray filter for 100-fold attenuation. The experimental setup provides three independent pairs of parameters for different degrees of intensity: the density of atoms in the Hg source; using or not using a filter for attenuation by a factor of 100; and a wide or small iris diaphragm. The resulting 2³ = 8 parameter combinations in such a complex experiment require careful attention as to their mutual influence. The process of detection in this experiment demands control of its components and their sensitivity. The interference pattern is observed with an image converter, the photo electrons are accelerated and projected onto a fluorescent screen, and the pattern on the screen is photographed.

Wesley [2006] points at some shortcomings concerning the correct estimate of the actual photon flux [1]. Dontsov and Baz claim their photon detector was able to detect single photons, so both statistically independent and correlated photons are to be observed in their experiments. They attribute correlated photons, however, to the "bunching effect", referring to the western literature. They neglect the discussion of linearity for low intensities and they misjudged the efficiency of their photo multiplier as 100%. Hence, they underestimate the total photon flux involved; yet they may have succeeded in getting smaller fluxes than their predecessors. This is the proper value of the experiment: It asks an essential question and delivers an unusual result to be taken seriously.

The remarkable and clear-cut result reported by Dontsov and Baz is the disappearance of interference for the same total flux received by the detector but with intensities in the Fabry-Perot differing by a factor of 100 under otherwise identical conditions. This is achieved by placing the same filter either before or after the Fabry-Perot. The reduction of the intensity in the Fabry-Perot (filter before the interferometer) affected the flux of the bunches, which arrived so infrequently that no interference between them was observed below a threshold, although the same photon flux passed through the setup. Dontsov and Baz do not draw the proper conclusions from their remarkable experiment. At the end of their paper, they concede: "The presence of correlated photon "bunches" is not especially surprising. However, the actual nature of the effect (the disappearance of the interference) which is the main result of our work is still not exactly understood."

In spite of their remarkable detection of a lower threshold of interference, the authors do not make a definite statement against single photons.

The fundamental question of how light behaves as we observe it clearly demands follow-up experiments. Panarella's careful experiments [1987] confirm those of Dontsov and Baz, but he explicitly addresses the problem of linearity and detector efficiency.

Panarella probed wave properties at low intensities observing the Airy diffraction produced by passing laser radiation through a series of pinholes. His experiments used two methods: photographic recording and, replacing the camera, a high-gain photo multiplier operated either in the scanning mode or in the counting mode. Highly coherent HeNe laser radiation is attenuated by a factor of 455 by passing the light through two pinholes and a filter in between. In order to compensate for the low intensity, the duration of exposure was adjusted accordingly to yield the same or even higher controlled total flux of photons for comparison of the results for different intensities. Unlike Dontsov-Baz, who failed to discuss the efficiency of the detector (photographic film or photomultiplier), Panarella [2008] rightfully emphasizes the important check of nonlinearity at low intensities which, in fact, is an often-overlooked problem [10]. The premature linear extrapolation of the photoelectric effect down to the single quantum particle has caused a lot of troubles that are still with the standard Copenhagen view of quantum physics. It led many an experimenter to the erroneous belief in 100% detector efficiency.

As one of his major results, Panarella reports a drastic example comparing the diffraction patterns for two different total photon numbers and intensities. A photon flux of 1.95×10⁷ per second with about 39,000 photons reaching the detector produces a diffraction pattern peaked at 350 counts and displaying two distinct satellites; whereas at a lower flux of 2.53×10⁴ per second with more photons (50,600) reaching the detector, the amplitude of the central diffraction signal drops to 130 counts with no satellites. Obviously, the detected diffraction patterns are not proportional to the total flux of bunches. The reduction of intensity affects the conditions for bunches to interfere (their chance to meet within the coherence length or duration, respectively).
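A back-of-envelope check of the two runs quoted above (Python; the exposure durations are inferred from the quoted numbers, not stated explicitly in the text):

```python
# Numbers taken from the text: flux in photons/s, total photons reaching
# the detector, and the peak of the recorded diffraction signal in counts.
high_flux, high_total, high_peak = 1.95e7, 39_000, 350
low_flux, low_total, low_peak = 2.53e4, 50_600, 130

print(f"implied exposure, high flux: {high_total / high_flux:.3f} s")  # ~0.002 s
print(f"implied exposure, low flux: {low_total / low_flux:.1f} s")     # ~2.0 s
print(f"total-photon ratio (low/high): {low_total / high_total:.2f}")  # ~1.30
print(f"peak-count ratio (low/high): {low_peak / high_peak:.2f}")      # ~0.37
# More photons in total, yet a weaker, satellite-free pattern: the
# recorded signal is not proportional to the total photon number.
```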

Summarizing the general result and main statement of the experiments, either by camera or by photo detector: Diffraction is only observed under intensity conditions that allow the bunches to interfere. Having the built-in possibility to interfere, they are the only ones to show wave behaviour. Below a certain threshold, the patterns disappear even in the presence of a larger total number of photons. Were single photons able to exhibit interference, this would have been observable below the threshold and definitely more so for the larger total photon fluxes where interference is still not observed because it is the bunches that fail to interfere.

Photon Bunches on the Ensemble Interpretation of Radiation and Some Possible Consequences

The voice of physicists is not as unanimous as the representation of mainstream ideas in the media may make the public believe. The history of the ensemble interpretation is quite rich and dates back to the Thirties. The community of its proponents comprises famous critical thinkers like Karl Raimund Popper. Post [2005] gives an account of their various activities and of the typical "Copenhagen problems" [4]. The ensemble interpretation lends itself to several routes "off the beaten path" and to addressing a prominent choice of problems, some from way back before quantum physics, that are officially not considered as nasty questions. They should be disturbing not just to the critics.

As a viable, maybe provocative, alternative to statistically independent single photons, the ensemble interpretation proposes the emission and absorption of bunches (or clusters, clumps, bursts, conglomerates) of photons bound in crystalline-like arrays. The ordered array manifests itself as coherence. The bunching model replaces the assumption of continuous statistical emission of independent photons by the statistical emission of coherent bursts.

Of course, the bunching model offers a rich collection of questions. This is the way science works.

A classical view of the quantum of action and its co-operative performance in bunches to account for conventional wave properties? This may look like a revolution in the original sense of "turning back". But it is not as strange as, or at least not stranger than, the prominent claims associated with the single independent photon and the wave-particle conflict: empty "pilot waves", collapse of wave packets upon observation, self-interference, etc. What about the prescription of a simultaneous measuring process as a natural principle? Or take the propagation of electromagnetic waves oscillating through free space, detached from their sources.

The ensemble interpretation of action provides a fresh look at the following choice of some issues favoured by critics.

Copenhagen and its Single Photon Doctrine

The officially accepted intrinsic frequency is attributed to the single photon to label its energy hν. Copenhagen takes for granted that the wavelength associated with c/ν vastly outdistances the particle size. How are these parameters to be seen in the bunching model? Photons propagating in bunches offer another (different?) frequency pertaining to the bunches. Can it be interpreted as an additional "arrival rate" of the photons within a bunch, or does it replace the official intrinsic ν? There is an intriguing possibility: Instead of being labelled by an intrinsic frequency, all photons can be separated from that mysterious intrinsic frequency and be treated as alike or identical. The arrival rate, now an external parameter, depends on the spacing of photons within their coherent group. This casts a different light on the number of photons actually emitted. The statistics now applies to the bunches instead of the single photons. And, taking h as the veritable quantum, the focus is on the propagation of action at the phase velocity c. The propagation of action instead of energy complies with the conventional behaviour and phenomenology of waves.

Too abstract? No more abstract than the idea of a single photon "interfering only with itself" or being "a spread-out wave packet collapsing upon observation" or delivering "its" frequency ν or energy hν all at once, in one go. The officially accepted idea of quantum jumps distracts from the finite span, δt, in our quantum of action h = δEδt. We fare better sticking to action as the main actor of quantization than to energy. In the framework of interatomic potentials action offers two degrees of freedom, δE and δt. The subsequent attacks by photons coherently bound in bunches offer a different insight into what may happen during absorption, such as the photoelectric effect. The interatomic potentials are no longer to be treated as rigid. They respond dynamically to the repeated attacks and gradually weaken until eventually a photo electron is released. The potentials φ(r) cannot recover quickly enough from too frequent attacks. Every dynamic process has its characteristic duration. Some perturbations change the interatomic binding beyond repair. In materials science, the principle of this effect is known as "fatigue": Here the successive weakening of interatomic potentials, say in a metal wire, is made permanent by introducing dislocations in the crystal lattice. You can disrupt a wire too strong to tear apart simply by periodically bending it back and forth. Eventually the wire gives in. Does that remind us of tunnelling? It is always helpful to keep our everyday physics in mind. That gives us confidence to recognize the principles that work also in the micro world.

Potentials, the reliable basis whenever energy is concerned, dynamic processes and interatomic binding included, may be applied as well to refine the idea about bunches. For a rigid model array, as may be assumed approximately for photon bunches in a kind of solid-state-like crystalline array, we can content ourselves with a quantum potential depending on the spatial parameter, r. As a first approach Wesley [2006] has proposed a qualitative model quantum potential φ(r) with a minimum and a maximum to keep the photons in an equilibrium position in an ordered array of identical particles [1]. A simple choice to start with is a (4, 3, 2) inverse power polynomial φ(r) as a function of distance, r, to keep the photons in a meta-stable state protected by a weak barrier (Figure 2).

Figure 2: The minimum is proposed here to be at r = λ for the positive effective energy hc/λ associated with an individual photon confined within the characteristic λ spacing. The condition on A, B, and C defines a unique value of λ at the minimum (after Wesley [2006]) [1]. The photon should not be imagined as a "wave packet" spread out over λ that collapses upon observation. Other effective quantum potentials housing more photons per λ may be modelled so as to allow higher arrival rates than c/λ, or rates modulated like the traditional sinusoidal electric field.
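A qualitative sketch of such a (4, 3, 2) inverse-power potential (Python; the coefficients below are illustrative assumptions chosen only to produce a shallow positive minimum behind a weak barrier, not Wesley's values):

```python
# phi(r) = A/r**4 - B/r**3 + C/r**2, with r measured in units of the
# spacing lambda. With A = C = 1 and B = 1.95 this gives a meta-stable
# minimum near r ~ 1.1 (phi ~ 0.044) protected by a weak barrier near
# r ~ 1.8 (phi ~ 0.070); beyond the barrier phi decays to zero.
import numpy as np
import matplotlib.pyplot as plt

A, B, C = 1.0, 1.95, 1.0                  # assumed, illustrative coefficients

r = np.linspace(0.6, 6.0, 600)
phi = A / r**4 - B / r**3 + C / r**2

plt.plot(r, phi)
plt.axhline(0.0, linewidth=0.5)
plt.xlabel("r (units of the spacing lambda)")
plt.ylabel("phi(r) (arb. units)")
plt.title("Meta-stable minimum behind a weak outer barrier")
plt.show()
```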

Other quantum potentials may be found appropriate to explore the regions of higher frequencies ("gamma rays"). The quantum potentials provide the meta-stable arrangement of photons in rigid arrays when undisturbed, yet able to rearrange when they interact with other bunches (interference) or with matter (refraction, diffraction, scattering).

Photon bunches are a counterpoint to electromagnetic waves, with a wave performance (wavelength, apparent frequency, phase and phase velocity) different from the Maxwell picture.

Maxwell, Photons, and the Ether Problem

What can we tell about the propagation of action between sender and receiver from what we experience? There are charges on both sides, bound by potentials that, if disturbed, respond by oscillations. What does that say about the communication between sender and receiver? Maxwell's famous equations, with electric and magnetic fields coupled by Maxwell's displacement current and Faraday's induction, have no built-in mechanism for radiation. Mathematical operations are supposed to provide the transition to electromagnetic waves which, dispatched from their charges, travel over astronomic distances. Charges and currents are mathematically dismissed, one more example of a tacitly accepted fatal "type zero singularity". To the critics of Maxwell, this is in conflict with the original definition of charges and currents to stay with their fields. The construct of wave equations (another case of an operator-based theory) does not rid the theory of the problem of self-sustaining oscillations in free space. This was one of the puzzles that caused the hassle about different ideas on the possible existence or non-existence and the nature of a light carrying medium ("ether"). Weber's c², interpreted as a ubiquitous space filling background potential, does belong to the rich zoo of ethers. But in reference to Ockham's razor, it may be the one with the least pre-assumed specifications.

Quantum potentials, the continuum part of the clumping model, are fields in their own right and provide coherence of photons in rigid arrays. They do not require an artificial wave equation. A propagating rigid array may be envisaged as a periodic pattern of sine halves, like a chain of coat hangers passing by. This replaces the traditional undulatory sine wave. Nothing needs to "oscillate". The wave picture for a rigid clump pattern looks different. A spatial modulation ("coat hanger") of the photons may be imagined as a frozen wave-like pattern. Maxwell's electromagnetic fields are "wavy" due to their explicit dependence on time, symbolized by the first partial derivative, ∂/∂t. This restriction considers only part of a total time derivative, ∂/∂t + (v·∇). Usually, the "vee dot del" (v·∇) is not considered. None other than Heinrich Hertz replaced the partial ∂/∂t by the total ∂/∂t + (v·∇), making Maxwell's equations Galilean invariant (Phipps [1986]) [11]. In the case of a rigid pattern the apparent oscillation is produced by the spatial gradient, symbolized by the "nabla" operator ∇ = ∂/∂r, of the frozen pattern traveling by at velocity v. This, of course, is still not the definite answer regarding the nature of the mechanism that makes photons go and of the potential that keeps them together, but it helps to search in a more promising direction, such as the ubiquitous c² potential with fewer specifications needed than various other ether models. The idea of photons bound in quantum potentials puts physics in the foreground. The veritable quantum h in its coherent array tells us about the propagation of action and power, not just of energy, and is as practicable on an astronomic scale as in a lab.
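In formulas, Hertz's step and the rigid-pattern picture read: the total derivative is d/dt = ∂/∂t + (v·∇), and for a frozen pattern f(r − vt) translating at velocity v one has ∂f/∂t = −(v·∇)f, so the apparent local oscillation is generated entirely by the spatial gradient of the pattern sweeping past.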

One More for Photons

There is general agreement that Maxwell's electromagnetic waves cannot account for the photoelectric effect as a frequency (not intensity) sensitive threshold effect. This was the starting signal for the official revival of the photon - the single photon, that is, equipped with an intrinsic frequency ν that labels its spectral identity. Somehow that hν, no matter how large ν, was to be delivered to the receiver in one go, "at once". This fuelled the popular idea of "quantum jumps", in contrast to the finite duration making up action. Sudden processes became all the rage. The question about the jumps in one go is officially not considered a nasty problem. It should be. A single photon is supposed to do it all alone. The bunching model questions the official one-photon view and literally shines a different light on threshold effects. Moreover, it helps to extend the critique of discrepancies in the traditional Maxwell theory.

Why Ensembles?

A straightforward argument is "we cannot help experimenting with them". We should speak of "quantization as we observe it", i.e. of its co-operative effect. It is too easy to cross the border, extrapolate macroscopic parameters in a deductive way, and transport them into the micro world. Keeping in mind that we observe lots of microscopic particles, we can, however, analyze their individual contribution to the ensembles, as is the case with the photon in its wave.

Planck saved classical physics from a singularity that threatened when the low-frequency radiation formula was extrapolated to cover the range of high frequencies. Planck [1921] set out from a thermodynamic approach and emphasized the necessity of what he called absolute entropy (completely determined without an unknown additive constant) for the discovery of quantization [12]:

"Im Gegensatz dazu [he addresses Boltzmann 's entropy containing an undetermined additional constant] schreiben wir der Entropie eine ganz bestimmte absolute Gröβe zu. Das ist ein Schritt von prinzipieller Tragweite, dessen Berechtigung sich nur durch die Prüfung seiner Konsequenzen erweisen lässt. Er führt, wie wir später sehen werden, mit Notwendigkeit zur "Quantenhypothese". [Contrasting Boltzmann's entropy. "We attribute a definite absolute value to entropy. This is a step of principle import the necessity of which will be evident exclusively by checking its consequences. With necessity, as we shall see, it leads to the "hypothesis of quanta"].

This is a clear statement about the thermodynamic origin of quantization. In physics, dynamic disorder is necessary to define temperature as a consequence of the ongoing interactions between particles. Thermodynamic parameters lose their meaning when the system under consideration becomes too small, particularly in the academic limit of single non-interacting particles. Macroscopic thermodynamic parameters (temperature, entropy, pressure, density, melting point, aggregate state, etc) do not apply any more. An H₂O molecule is not "water". There is an intermediate ("mesoscopic") range where bulk properties evolve from those of the single particle. Here order and coherence are a consequence of the finite size, far from the thermodynamic limit. Coherent bunches consisting of, say, several hundred photons belong to this range [13]. Parameters describing ordered ensembles like a wave (wavelength, frequency, phase and phase velocity, polarization) do not make sense for single isolated particles, particularly if taken as point-like. Planck's h is not "light". The ensemble view holds for energy, action, and power as interactive quantities. This should be kept in mind when discussing radiation as the "cradle of quantum physics".

While there is consensus on the importance of the photon as an essential ingredient of the world of quanta, it still has to coexist with the model of electromagnetic waves, and as an individual, too. The price for this forced coexistence is the still disputed wave-particle problem that, in the eyes of the critics, has not been solved by sophisticated formalisms. Quantum electrodynamics rose as a compromise between Maxwell and quantization in a new world of symbolized non-traditional mathematics.

The ensemble interpretation of quantum physics offers a fairly consistent view where the mainstream doctrine of the single particle meets singularity problems, such as the infinities of which quantum electrodynamics had to rid itself ("renormalization") [1, 4]. What makes the ensemble interpretation worthwhile to be considered as a serious contribution to quantum physics? It invites us to take the less beaten path, open for different questions not asked in the standard treatment of physics. And it therefore tells us that uncertainty, the wave-particle dualism, the existence of wave packets and their collapse upon observation, and more are still to be disputed.

Outlook - The Role of the Photon

The ubiquitous photon, arguably the most common particle in the Universe and our messenger from outer space, plays a particular role in physics. Since Newton [1717, "Are not the rays of light very small bodies emitted from shining substances?"] the "particle of light" has had a unique career between rejection by the wave school (Christiaan Huygens and followers), its re-discovery (Einstein 1905), and the Copenhagen efforts to construct a compromise between waves and particles [8, 14]. Newton was cautious enough to disguise his proposition as a question ("hypotheses non fingo"). Action as well as power, two complementary ways of how energy is processed, occupy a central part in the mainly mathematical struggle between continuum and discreteness. The singularities zero and infinity should be considered with skepticism whenever we do physics. Zero is the trickier singularity. Zero power is beyond our measurement capacities (unless there will be hints that power, too, has a lower non-zero limit). Infinite duration is not very spectacular and not very practical, either. Infinite action likewise. Action and power, time honored concepts of physics, tell us that the struggle between continuum and discreteness is not decided on either side if we leave it to mathematics. The continuum is a useful approximation in the thermodynamic limit. Infinitesimals are helpful, too, if we keep them in mind as a purely mathematical tool. It depends on what we are looking at. If energy is not quantized per se, because duration and frequency are not, or at least not the way action is, we should accept action as the veritable quantum in terms of Planck's constant h. We may take h as the photon - and as Nature's abstract way to answer our experimental questions, not only in the micro world. It should give us faith that we can determine its numerical value, 6.626×10⁻³⁴ J s, which has stood all tests so far, from analyzing a macroscopic ensemble. This minimum value of δEδt is both comfortably small and distinctly far from a singularity, illustrating our practical limits. Experimental records on short durations δt achieved (e.g. in the current attosecond regime, see the 2023 Nobel Prize) correspond to values of action and power compatible with laboratory physics. The technical challenge is a serious warning against mathematical singularities.

In 1926 the physical chemist G. N. Lewis coined the Greek name phot-on from φῶς = light and the particle suffix -on. This handy name does not entitle us to claim we know what it is, nor does it mean we can perform experiments with single photons. The experiments discussed here have provided strong arguments against the official dictum. In his biography of Isaac Newton, J. W. N. Sullivan states: "Newton never did reach a simple, comprehensive, and consistent view of the phenomena of light. Nor can it be said that such a theory has been reached even today". The book came out in 1938 and we may agree that this statement has not changed till today, no matter what the media tell us. In his later years Einstein complained that all these 50 years of pondering had not brought him any closer to answering the question what light quanta are. This exemplifies the uneasy feeling everybody should have when they claim to understand Nature. Our failing to understand is not as bad as it sounds. We can successfully work with the enigmatic, elusive photon without having to know what it is. We can, however, adopt a different look at radiation as a genuine wave phenomenon, including coherence and interference, if we accept the idea of photon bunches. This offers a fairly consistent picture of waves and particles coexisting in a way that leaves room enough for fruitful speculations that may end up as useful insights. Unlike Einstein's single photon, Newton's photons had co-operative properties ("fits", now known as phase). Keeping this origin in mind, there was no need to choose the mathematically easier single particle road [15-17].

Physics requires abstract thinking in its own way. In the critics' opinion the Copenhageners and their followers overdid it, importing math at any cost. Of course, we did not completely go back to "classical" physics here, keeping in mind how tediously the finite value of h was detected. In retrospect its discovery did fit time honored principles (causality and non-singularity) and reminds us that evolution does not necessarily mean revolution in the sense of overthrowing old insights. It also obliges us to improve our didactic efforts.

Open questions and falsification in the spirit of Karl Raimund Popper are the trademark of science. Premature conclusions (or should we call them conjectures?) are open to critique and ask for corrections or refinements. So, of course, is the ensemble interpretation presented here.

The problem of how to assign energy, and hence a corresponding frequency, to the individual photon remains unresolved. If that frequency were a cooperative arrival effect of closely spaced identical photons, the photons could be identified as the veritable quantum of action, h. The finite duration, δt, associated with action makes photons also the carriers of power, where the inverse of δt comes into play. Is that cooperative frequency an average over a characteristic length λ, so that the spacing of photons within λ modulates the energy delivered during absorption? We have to face different and difficult choices to be clarified.

Even so, the ensemble interpretation contributes to the falsification of inconsistencies of the standard interpretation, such as the wave vs single-particle conflict that rules in mainstream quantum physics. The discovery of "the quantum" tempted many 20th century scientists to believe that we can directly observe this little piece of h, thus forgetting that we always have to do with interactions. It is these interactions that led to the discovery of h. Singling out the quantum from what is observed by deduction caused the disputed problems of establishment quantum theory. Looking at ensembles, on the other hand, sharpens our view of the all-important context property of energy, action, and power. That helps to recognize the elusive singularity of zero h, erroneously considered the trademark of classical ("pre-quantum") physics. Now quantum physics looks more classical than is officially believed. Unwilling to give up its territory, the establishment, of course, will not applaud our case, particularly not with so much fame on its side. Yet science demands that critique be defended, no matter how naive "turning the wheel back" may look to the defenders of the Copenhagen single photon view. The voice of critics should be given the same public attention that is paid to the defenders of the mainstream doctrine. There is a good perspective to omit the question mark in our introductory headline, because quantization is observed as part of an interactive world. Action is with necessity a dynamic interplay in multi-particle systems, including photons. And with necessity action has a lower non-zero limit. This makes it improbable that its smallest unit, h, is ever observed as an isolated entity.

Does h represent the mysterious photon?
The outcome of the above experiments on low intensity interference supports the view that radiation consists of stochastic bursts, not of a stochastic shower of continuously emitted, single, independent photons. Lasers are no exception (see the Panarella experiments), however great the coherence length. That leaves well-established experimental results on radiation untouched, but their interpretation changes drastically and may be expected to lead to new insights. The ensemble interpretation suggests itself as the place to cut the Gordian knot: a practicable way out of the wave-particle dilemma that shows where both sides are right and where they are wrong. The bunching model offers a conclusive way out of the wave-particle dualism, in combination with attractive speculations.
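
The difference between the two pictures already shows up in plain counting statistics. The toy simulation below is our illustration, not a reconstruction of the Panarella setup; the flux, bunch size, and window count are hypothetical. It compares statistically independent photon arrivals with burst arrivals at the same mean flux: bursts produce markedly super-Poissonian count fluctuations.

```python
# Illustrative sketch (assumed parameters, not data from the cited
# experiments): counting statistics of independent single-photon
# arrivals versus bunched "burst" arrivals at the same mean flux.
import numpy as np

rng = np.random.default_rng(seed=1)

mean_flux = 50.0      # mean photons per detection window (assumed)
bunch_size = 10       # photons per bunch in the bunching picture (assumed)
n_windows = 100_000   # number of detection windows simulated

# Picture 1: statistically independent photons -> Poissonian counts.
single_counts = rng.poisson(mean_flux, size=n_windows)

# Picture 2: bunches arrive independently, each delivering bunch_size
# photons at once -> compound Poissonian counts with the same mean flux.
bunch_counts = bunch_size * rng.poisson(mean_flux / bunch_size, size=n_windows)

for label, counts in [("independent photons", single_counts),
                      ("photon bunches", bunch_counts)]:
    fano = counts.var() / counts.mean()  # variance-to-mean ratio of counts
    print(f"{label:20s} mean={counts.mean():6.2f}  Fano factor={fano:5.2f}")

# Expected outcome: Fano ~ 1 for independent photons, ~ bunch_size for
# bunches, i.e. bursts produce much larger count fluctuations.
```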

We summarize the ensemble interpretation in a little catalog of "unorthodox" perspectives: maybe "revolutionary" conclusions, propositions, attractive speculations, and consequences that trigger open questions.

• Only coherent bunches of photons are emitted and detected.
• Wave properties (coherence, diffraction) are imprinted in these bunches.
• Propagating at the phase speed c, they exhibit wave characteristics according to c = νλ.
• The bunching model offers a coherent frequency ("internal rate of arrival") to replace the officially assumed intrinsic ("color") frequency attributed to the single photon.
• As a consequence of this different degree of freedom, typical of a bunch, we may separate the color frequency from the single photon, venture beyond the conventional hν picture, and ask in the spirit of Newton's question mark: "Do not all photons have the identity h?"

That takes the photon out of the category hν and makes it a fundamental elementary particle with a unique identity of its own, like the others.

• In the bunching model these identical particles outnumber the single photons officially labelled by an intrinsic frequency taken from ν = c/λ.
• The statistical arrival of bunches gives rise to an external "intensity frequency".
• Both frequencies depend on the dynamic specifications of the emitter, i.e. the atomic potentials in the vicinity of the charges set in motion.
• Emitting atoms do not oscillate individually, due to their interacting potentials.
• Nothing needs to oscillate when a rigid bunch propagates through free space.
• On absorption the bunches cause oscillations (interpreted as originating from "waves").
• The limitations and shortcomings of some established theories (Maxwell and Copenhagen) are brought to attention.
• The Doppler effect (e.g. blue shifting toward red) applies to the sequence of photons in a bunch (see the sketch after this list).
• Photons in a bunch are bound in a crystalline array by quantum potentials.
• "Quantum jumps" are re-interpreted in compliance with the duration imprinted in the concept of action. The sequence of photons arrayed in a bunch determines the internal "color frequency".
• Quantized action may still be assumed to be a trademark of the existence of the single photon, but it does not show up as a single entity in experiment, because photons are bound in bunches by quantum potentials acting between them.
• In contrast to official doctrine, photons are found to be very interactive.
• Iterative precursor processes in the spirit of Debye-Sommerfeld suggest an intriguing question: how does a photon bunch build up?
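
To make the two-frequency picture and the Doppler item above concrete, here is a minimal sketch under the bunching assumptions; the spacing value, the velocity, and the function names are our illustrative choices, not quantities taken from the experiments cited.

```python
# Toy model of the bunching picture (our sketch, not from the paper):
# the internal "color frequency" is set by the spacing of photons in a
# bunch, and the Doppler effect acts on that spacing. All numbers and
# names here are illustrative assumptions.

C = 299_792_458.0  # speed of light in m/s


def internal_frequency(spacing_m: float) -> float:
    """Internal arrival rate ("color frequency") from the photon spacing."""
    return C / spacing_m


def receding_spacing(spacing_m: float, v_recede: float) -> float:
    """Classical Doppler stretch of the spacing for an emitter receding at v."""
    return spacing_m * (1.0 + v_recede / C)


blue_spacing = 450e-9  # assumed spacing, of the order of a blue wavelength
print(f"emitter at rest:  {internal_frequency(blue_spacing):.3e} Hz")

stretched = receding_spacing(blue_spacing, v_recede=0.1 * C)
print(f"emitter receding: {internal_frequency(stretched):.3e} Hz (shifted toward red)")
```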

Photons are powerful actors essential for our lives. In order to give the ensemble interpretation the further support and attention it deserves, it is desirable to join forces with biology (photosynthesis) and medical research (our seeing process). In particular, the two frequency effects above, intensity and color sensitivity, should be of interdisciplinary interest for their influence on biological matter. How does matter respond to the intrinsic coherence of the bunches and to their statistical emission as bursts? The elusive yet omnipresent photon is here to stay with us as one of Nature's great surprises.

The foregoing essay is based on material I presented at several meetings of the Natural Philosophy Alliance (NPA, USA) in the years 2003-2011.

For those interested, the German titles of the original publications are given in this list under the respective authors.

References

  1. Wesley James Paul (2006) Light - a Photon Flux and Other Topics. Chapter 8: Quantization and Waves, Ensemble Effects; Chapter 9: Low Intensity Interference. Benjamin Wesley, Blumberg. ISBN 3-942-10-29800.
  2. Wesley James Paul (1993) Weber Potential from Finite Velocity of Action? Foundations of Physics Letters 5: 597.
  3. Assis Andre Koch Torres (1994) Weber Electrodynamics. Kluwer Academic Publishers Group, Dordrecht. https://www.hlevkin.com/hlevkin/90MathPhysBioBooks/Physics/WeberElectrodynamics.pdf
  4. Post Evert Jan (2005) Physics Owes Max Planck an Apology. Proceedings of the NPA 2: 154-157.
  5. Wesley James Paul (1996) Classical Quantum Theory, Chapter 6. Benjamin Wesley, Blumberg. ISBN 3-9800942-5-1.
  6. Planck Max (1943) Naturwissenschaften 31: 153.
  7. Debye Peter, Sommerfeld Arnold (1913) Theory of the photoelectric effect under the viewpoint of the quantum of action. Annals of Physics 41: 873.
  8. Einstein Albert (1905) On a heuristic viewpoint concerning the production and conversion of light. Annals of Physics 17: 132. Einstein's quotes in our text are found in section 8 therein, "On the production of cathode rays by illumination of solid bodies".
  9. Dontsov Yu P, Baz A L (1967) Interference Experiments with Statistically Independent Photons. Journal of Experimental and Theoretical Physics 25: 1. http://www.jetp.ras.ru/cgi-bin/dn/e_025_01_0001.pdf
  10. Panarella Emilio (2008) "Single Photons" Have Not Been Detected: The Alternative "Photon Clump" Model. In: Roychoudhuri Chandrasekhar, Kracklauer A F, Creath Kathy (eds) The Nature of Light: What is a Photon? CRC Press, Taylor & Francis Group: 1-16. https://www.taylorfrancis.com/books/edit/10.1201/9781420044256/nature-lightkracklauer-chandra-roychoudhuri-kathy-creath
  11. Phipps Thomas E Jr (1986) Heretical Verities: Mathematical Themes in Physical Description, Chapter 4. Classic Non-Fiction Library, Urbana. ISBN 0960654007.
  12. Planck Max (1921) Wärmestrahlung (Heat Radiation). 4th Edition, Johann Ambrosius Barth, Leipzig: 119.
  13. Galeczki Georg, Marquardt Peter (1997) Order and Coherence: A Question of Size for Matter and Radiation. Physics Essays 10: 506.
  14. Newton Sir Isaac (1717) Optics, 2nd Edition, Book III, Part 1, Query 29. Reprinted in: Great Books of the Western World 43: 507. Encyclopedia Britannica, Inc., Chicago, London, Toronto 1952.
  15. Wesley James Paul (1993) Weber Potential from Finite Velocity of Action? Foundations of Physics Letters 5: 597.
  16. Planck Max (1900) On a correction of Wien's spectral equation. https://materias.df.uba.ar/f4aa2015c1/files/2015/05/papersplanck.pdf
  17. Wien Willy (1896) On the distribution of energy in the emission spectrum of a black body. Annals of Physics and Chemistry 58: 662-669.