Sir Harold Jeffreys Research Paper


Sir Harold Jeffreys, F.R.S., was born on April 22, 1891 in Fatfield, County Durham, England and died on March 18, 1989 in Cambridge, England. He was educated at Armstrong College, Newcastle-upon-Tyne, and at St. John’s College, Cambridge University. After a brilliant scholastic performance, he became a Fellow of St. John’s College in 1914. From 1946 to 1958, he was Plumian Professor of Astronomy and Experimental Philosophy at Cambridge University. In 1925, he was elected a Fellow of the Royal Society, later became its Bakerian Lecturer and Royal Medalist, and received the Copley Medal in 1960. He was knighted in 1953, served as President of the Royal Astronomical Society from 1955 to 1957, and as President of the International Association of Seismology and Physics of the Earth’s Interior from 1957 to 1960. In 1962, the Harold Jeffreys Lectureship was created by the Royal Astronomical Society, and Jeffreys presented the inaugural lecture in 1963.



The obituary in the Times of London, March 23, 1989, noted that Jeffreys’s ‘famous treatise, The Earth: Its Origin, History and Physical Constitution, which, with re-editing and improvements, went through seven editions between 1924, when it first appeared, and 1976, did perhaps more than any other book to make the geophysics of the solid earth a coherent study. His researches in seismology, in particular, established his reputation as one of the leading authorities on that subject.’

His early research was centered on astronomical and meteorological dynamics. An accomplished mathematician, he wrote two monographs, entitled Cartesian Tensors and Operational Methods in Mathematical Physics. He and his wife, Dr. Bertha Jeffreys, a very able mathematician, wrote Methods of Mathematical Physics, which was published in several editions by Cambridge University Press and became a highly respected and widely cited classic. A review of the work in Nature stated that ‘The authors, distinguished for their knowledge of mathematical physics and cosmogony, and by their researches in these fields, possess also a remarkably comprehensive and deep understanding of pure mathematics. Their purpose has been to synthesize the mathematical knowledge for the service of physics … The work is a benefaction to the cause of progress in natural philosophy.’




The Collected Papers of Sir Harold Jeffreys on Geophysics and Other Sciences was published by Gordon and Breach, London, in six volumes. Volume 6 is entitled Mathematics, Probability and Miscellaneous Other Sciences. Work reported in this volume and in his books Scientific Inference (1931 and later editions) and Theory of Probability (1939 and later editions) reflected Jeffreys’s concerns with the foundations of science and, in particular, with scientific inductive procedures. Theory of Probability presents an axiom system for inductive inference applicable in all the sciences, together with operational procedures for description, measurement, modeling, data analysis, estimation, testing, and prediction. Many applications of these procedures to important statistical problems arising across the sciences are provided in his books and papers.

Beginning with research on probability theory after World War I, in a series of papers written jointly with Dr. Dorothy Wrinch, Jeffreys made contributions of fundamental importance to the philosophy of science, scientific method, and theoretical and applied statistics. The general nature of this research is stated in the preface to the second edition of Scientific Inference as follows: ‘The general standpoint, that scientific method can be understood if and only if a theory of epistemological probability is provided, remains unaltered. Consequently, I maintain that much that passes for theory of scientific method is either obscure, useless or actually misleading.’

In his book Theory of Probability, Jeffreys presents a theory of epistemological probability, discusses its relevance for work in all areas of science, and applies it to solve many practical and important estimation, testing, prediction, and other statistical problems, including those considered by R. A. Fisher, Jerzy Neyman, Karl Pearson, and others. Irving J. Good, an eminent statistician who reviewed the 1961 edition of Theory of Probability, wrote that Jeffreys’s book ‘… is of greater importance for the philosophy of science, and obviously of greater immediate practical importance, than nearly all the books on probability written by professional philosophers lumped together.’

As regards Jeffreys’s work on scientific method, he, along with Pearson (1938), stressed the ‘unity of science principle,’ namely, that any area of study, for example, economics, physics, psychology, etc., can be scientific if scientific methods are employed in analyzing data and reaching conclusions. Or, as Pearson (1938, p. 16), cited by Jeffreys, states, ‘The unity of all science consists alone in its method, not in its material.’ With respect to scientific method, Jeffreys, in Theory of Probability, provides an axiom system for probability theory that helps scientists learn from their data, explain past experience, and make predictions regarding as yet unobserved data, fundamental objectives of science that apply in the natural, biological, and social sciences. He states that scientific induction involves observation and measurement and the production of generalizations or models to explain past data and predict new data, a view of induction that is fundamentally different from those of Mach and Popper. Further, deduction plays a role in induction but is by itself inadequate for scientific work, mainly because deduction provides just three attitudes towards a proposition, namely, proof, disproof, or complete ignorance. As Hume and others have stressed, there is always the possibility of an exception to a scientific proposition. Thus, Jeffreys saw a need for statements less extreme than those of deductive logic, and his inductive logic provides such statements in terms of reasonable degrees of belief in scientific propositions.

Jeffreys considers a probability to be a measure of the reasonable degree of confidence that an individual has in a proposition given the individual’s available background information and data. This individualistic, subjective definition is compared with other definitions, including the axiomatic, long-run frequency, and hypothetical infinite population definitions, in Chap. 7 of Theory of Probability. He points out that the first definition is widely employed, for example, in the work of J. Neyman; the second is the Venn limit, strongly advocated by R. von Mises; and the third is usually associated with R. A. Fisher, although it had appeared earlier in statistical mechanics in the work of Gibbs and others. His criticisms of these concepts of probability are ingenious and devastating and have not been refuted in the literature, perhaps because they are irrefutable. In addition to specific, detailed examples illustrating the inadequacies of these three definitions, Jeffreys comments, ‘No probability has ever been assessed in practice, or ever will be, by counting an infinite number of trials or finding the limit of a ratio in an infinite series … A definite value is got on them only by making a hypothesis about what the result would be. The proof even of the existence is impossible. On the limit definition, … there might be no limit at all … the necessary existence of the limit denies the possibility of complete randomness, which would permit the ratio in an infinite series to tend to no limit.’

After further detailed criticisms of the above three definitions of probability, he writes, ‘The most serious drawback of these definitions, however, is the deliberate omission to give any meaning to the probability of a hypothesis’ (1998, p. 377). Also, he remarks, ‘… in practice no statistician ever uses a frequency definition, but that all use the notion of degree of reasonable belief, usually without even noticing that they are using it and that by using it they are contradicting the principles they have laid down at the outset’ (1998, p. 369). See also Jeffreys’s (1998, pp. 30–3) discussion of the ‘personalistic’ or ‘moral expectation’ views of probability.

Having provided a definition of probability, Jeffreys develops an axiom system that yields procedures for using such probabilities in a coherent way. That is, he provides basic analysis that yields the addition and product rules and other results, including Bayes’s Theorem, or the principle of inverse probability, which Jeffreys states (1998, p. 28) was ‘… first given by Bayes (1763). It is the chief rule involved in the process of learning from experience.’ It is invaluable in using data to revise our initial beliefs in propositions regarding the values of parameters, alternative hypotheses and models, future values of variables, etc. With this analysis, Jeffreys presents a logically consistent framework for learning from data to explain the past and to make predictions about future outcomes, primary objectives of science, as mentioned above. He demonstrates, through analyses of many applied problems using data, that his estimation, prediction, and testing procedures work well in practice, and he makes many comparisons with results provided by the maximum likelihood approach of Fisher and others, Neyman–Pearson testing procedures, Fisher’s p-value approach to testing, etc.
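
In modern notation (a standard statement of the rule, not Jeffreys’s own symbols), the principle of inverse probability for a hypothesis H, data D, and background information I may be written as

\[
P(H \mid D, I) \;=\; \frac{P(H \mid I)\, P(D \mid H, I)}{P(D \mid I)} \;\propto\; P(H \mid I)\, P(D \mid H, I),
\]

so that the posterior probability of H is proportional to its prior probability times the probability of the data given H.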

Central in Jeffreys’s work on statistical estimation, prediction, and testing problems is Bayes’s Theorem. Whatever the estimation problem, use of Bayes’s Theorem enables an investigator to combine his prior information regarding possible values of the parameters, represented by a ‘prior density,’ with the likelihood function that incorporates the information in the data to obtain a ‘posterior density’ for the parameters that incorporates all the prior and sample information. That is, using Bayes’s Theorem, the posterior density is proportional to the prior density times the likelihood function, with the factor of proportionality a numerical constant such that the integral of the posterior density over its range equals one. Thus, for whatever problem, given the inputs, a prior density and a likelihood function, a posterior density for the parameters is available. This posterior density can be employed to compute the probability that a parameter’s value lies between any two given values, say between 0.4 and 0.6, a solution to the famous inverse problem posed by Bayes. Optimal posterior point estimates for parameters are obtained easily from a given posterior density; the posterior mean, median, and modal values are optimal in terms of minimizing posterior expected loss under squared error, absolute error, and zero-one loss functions, respectively. Further, in large samples, Jeffreys (1998, p. 193) shows that under general conditions the posterior density assumes a normal shape centered at the maximum likelihood estimate with a covariance matrix equal to the inverse of the estimated Fisher information matrix, and he remarks that R. A. Fisher’s maximum likelihood method yields ‘… results which are indistinguishable from those given by the principle of inverse probability (in large samples), which supplies a justification for it’ (1998, p. 194).
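
As a small numerical sketch of these calculations (the binomial model, the uniform prior, and the counts below are illustrative assumptions, not an example taken from Jeffreys), consider a binomial proportion theta: with a Beta prior the posterior is again a Beta density, and the probability that theta lies between 0.4 and 0.6, together with the point estimates just mentioned, can be computed directly.

from scipy import stats

# Illustrative data: 12 successes in 30 Bernoulli trials (assumed values).
successes, trials = 12, 30

# Uniform Beta(1, 1) prior on theta; the posterior is Beta(a + successes, b + failures).
a_prior, b_prior = 1.0, 1.0
a_post = a_prior + successes
b_post = b_prior + (trials - successes)
posterior = stats.beta(a_post, b_post)

# Probability that theta lies between 0.4 and 0.6, as in the inverse problem posed by Bayes.
prob_interval = posterior.cdf(0.6) - posterior.cdf(0.4)

# Optimal point estimates under squared error, absolute error, and zero-one loss.
post_mean = posterior.mean()                      # minimizes posterior expected squared error
post_median = posterior.ppf(0.5)                  # minimizes posterior expected absolute error
post_mode = (a_post - 1) / (a_post + b_post - 2)  # posterior mode, for zero-one loss

print(f"P(0.4 < theta < 0.6 | data) = {prob_interval:.3f}")
print(f"mean = {post_mean:.3f}, median = {post_median:.3f}, mode = {post_mode:.3f}")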

Since a suitable prior density is needed to utilize Bayes’s Theorem and has been a bone of contention for many years, Jeffreys devoted considerable attention to the problem of formulating appropriate prior densities. In particular, he put forward a reproducible way of formulating invariant prior densities appropriate for use in situations in which an investigator is ignorant, or wishes to proceed as if he were ignorant, about the possible values of parameters. His famous recommendation in such circumstances is to take the prior density for the parameter(s) proportional to the square root of the determinant of the Fisher information matrix (see Jeffreys 1998, p. 179ff for material on invariant priors). Use of this ‘ignorance’ or ‘diffuse’ or ‘non-informative’ prior density provides invariant posterior probability statements for all continuous one-to-one transformations of the parameters, a remarkable and highly original solution to the choice-of-prior problem. Jeffreys derived many invariant priors and used them in analyses of applied problems. He is usually given credit for being a pioneer in considering and solving the problem of formulating invariant priors.
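
In modern notation, this recommendation, now commonly called the Jeffreys prior, is

\[
\pi(\theta) \;\propto\; \sqrt{\det I(\theta)}, \qquad
I(\theta)_{ij} \;=\; -\,E\!\left[\frac{\partial^{2} \log p(y \mid \theta)}{\partial \theta_i\, \partial \theta_j}\right].
\]

For example, for a single Bernoulli probability \(\theta\) the Fisher information is \(I(\theta) = 1/[\theta(1-\theta)]\), so the invariant prior is \(\pi(\theta) \propto \theta^{-1/2}(1-\theta)^{-1/2}\), a Beta(1/2, 1/2) density, and posterior statements made with it are unchanged under smooth one-to-one reparameterizations of \(\theta\).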

With respect to prediction, in general a probability density function for future, as yet unobserved, observations is formulated conditional on parameters with unknown values. If we have, for example, a proper prior or posterior density for the parameters, the product of it and the conditional predictive density can be formed and the parameters integrated out to yield an operational predictive density. It can be used to make predictive probability statements, for example, to compute the probability that a future, as yet unobserved, observation will lie between, say, 3.4 and 4.7. Also, optimal point predictions can be derived for given predictive loss functions, for example, a mean for quadratic loss functions, etc. For more on uses of predictive densities for forecasting and control, see the general references cited above.
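
In modern notation, the predictive density for a future observation \(\tilde{y}\), given observed data \(y\) and a posterior density \(p(\theta \mid y)\) for the parameters, is

\[
p(\tilde{y} \mid y) \;=\; \int p(\tilde{y} \mid \theta)\, p(\theta \mid y)\, d\theta,
\]

and a predictive probability statement such as \(P(3.4 < \tilde{y} < 4.7 \mid y)\) follows by integrating this density over the stated interval.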

As regards the evaluation of alternative models or hypotheses put forward to explain empirical phenomena, Jeffreys recommended that workers start with simple models and complicate them only if necessary. He and Dorothy Wrinch put forward the Simplicity Postulate, namely a statement that simpler models have higher probabilities of performing well. Jeffreys considered all possible models ordered with respect to degree of complexity and assigned probabilities to each, with simpler models receiving higher probabilities, in an infinite sequence that is assumed to converge.
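
One simple way to make such an assignment concrete (an illustrative choice, not Jeffreys’s own specific prescription) is to index models \(M_1, M_2, M_3, \ldots\) by increasing complexity and set

\[
P(M_k) \;=\; 2^{-k}, \qquad \sum_{k=1}^{\infty} P(M_k) \;=\; 1,
\]

so that prior probability declines with complexity while the probabilities over the infinite sequence of models still sum to one.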

Within this broad modeling framework, on considering two alternative hypotheses, Jeffreys used Bayes’s Theorem to derive an expression for the posterior odds on the two hypotheses, equal to the prior odds on the two hypotheses times the Bayes factor, the ratio of the predictive densities for the observations under the two alternative hypotheses. Jeffreys used posterior odds in his pathbreaking work on significance tests and compared his results for fundamental testing problems with those provided by use of Fisher’s p-values and Neyman–Pearson testing procedures. Since Jeffreys’s pioneering work on significance testing, many have employed and extended his methods to deal with a wide range of testing and model selection problems; see, for example, the review paper by Kass and Raftery (1995). Given Jeffreys’s brilliant formulation, interpretation, and uses of significance tests based on posterior odds, much improved methods for evaluating and/or combining hypotheses and models have evolved and are now in widespread use.
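
In modern notation, for two hypotheses \(H_1\) and \(H_2\) and data \(y\), this relation is

\[
\frac{P(H_1 \mid y)}{P(H_2 \mid y)}
\;=\;
\frac{P(H_1)}{P(H_2)}
\times
\frac{p(y \mid H_1)}{p(y \mid H_2)},
\]

where the first factor on the right is the prior odds, the second is the Bayes factor, and each \(p(y \mid H_i)\) is the predictive density of the observations under \(H_i\) with any unknown parameters integrated out against their prior densities.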

Since Jeffreys’s initial work in the 1920s, many have come to appreciate his outstanding contributions to the philosophy and practice of science. It is noteworthy that he analyzed much nonexperimental astronomical and geophysical data, yet his analysis of randomization in experimental design in Theory of Probability is extremely penetrating and valuable. In addition, he made important contributions in the areas of distribution theory, asymptotic theory and expansions, robust estimation, invariance theory via differential geometry, information measures for distributions, mixture distributions, contingency tables, a key nonparametric problem, and the modification of the Pierre Simon Laplace Rule of Succession. It is also noteworthy that Jeffreys’s methods work not only in solving physical science problems but also in solving problems in biology, the social sciences, business, law, and other areas. Finally, while many orthodox statisticians were critical of Jeffreys’s work years ago, his approach and results are currently widely employed and valued.

Bibliography:

  1. Bayes T 1763 An essay toward solving a problem in the doctrine of chances. Philosophical Transactions of the Royal Society of London 53: 370–418
  2. Berger J O 1985 Statistical Decision Theory, 2nd edn. Springer, New York
  3. Bernardo J M, Smith A F M 1994 Bayesian Theory. Wiley, New York
  4. Berry D A, Chaloner K M, Geweke J K 1996 Bayesian Analysis in Statistics and Econometrics: Essays in Honor of Arnold Zellner. Wiley, New York
  5. Box G E P, Tiao G C 1993 Bayesian Inference in Statistical Analysis. Wiley Classics Library, New York
  6. Cook A H 1990 Sir Harold Jeffreys. Biographical Memoirs of Fellows of the Royal Society 36: 303–33
  7. Jaynes E T 1983 E. T. Jaynes: Papers on Probability, Statistics and Statistical Physics. Reidel, Dordrecht, The Netherlands
  8. Jeffreys H 1931 Scientific Inference. Cambridge University Press, Cambridge, UK (Reissued with addenda, 1937, 2nd edn. 1957, 3rd edn. 1973)
  9. Jeffreys H 1939 Theory of Probability. Oxford University Press, Oxford, UK (2nd edn. 1948, 3rd edn. 1961, 3rd rev. edn. 1967, Reprinted 1988 and 1998)
  10. Jeffreys H 1971 Collected Papers of Sir Harold Jeffreys on Geophysical and other Sciences, Vols. 1–6. Gordon and Breach, London
  11. Jeffreys H 1998 Theory of Probability, 3rd rev. edn. Oxford University Press, Oxford, UK
  12. Kass R E (ed.) 1991 Tribute to Sir Harold Jeffreys. Chance 4(2)
  13. Kass R E, Raftery A E 1995 Bayes factors. Journal of the American Statistical Association 90(430): 773–95
  14. Kass R E, Wasserman L 1996 The selection of prior distributions by formal rules. Journal of the American Statistical Association 91: 1343–70
  15. Pearson K 1938 The Grammar of Science. Everyman, London
  16. Zellner A (ed.) 1980 Bayesian Analysis in Econometrics and Statistics: Essays in Honor of Harold Jeffreys. North-Holland, Amsterdam
  17. Zellner A 1996 An Introduction to Bayesian Inference in Econometrics. Wiley Classics Library, New York
  18. Zellner A 1997 Past and recent results on maximal data information prior distributions. In: Bayesian Analysis in Econometrics and Statistics: The Zellner View and Papers. Edward Elgar, Cheltenham, UK
