Main Differences Between Bayesian Statistics and Classical Statistics

Updated on 2024-03-30
9 answers
  1. Anonymous users2024-02-07

    1. Whether prior information is used.

    Because the design and production of products have a degree of continuity, a great deal of information about related products is available as prior information. Bayesian statistics holds that using this prior information can not only reduce the required sample size but also, in many cases, improve statistical accuracy. The classical school of statistics ignores this information.

    2. Whether the parameter θ is treated as a random variable.

    The most basic idea of Bayesian statistics is that any unknown quantity θ can be regarded as a random variable and described by a probability distribution, namely the prior distribution. Because any unknown quantity involves uncertainty, probability and probability distributions are the best language for expressing that uncertainty. The classical school of statistics, by contrast, simply treats the unknown quantity θ as a fixed but unknown parameter and makes statistical inferences about it.
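    As an added illustration of both points (a minimal sketch, not part of the original answer; the product scenario and numbers are invented), the snippet below treats an unknown defect rate θ as a random variable, encodes prior information from related products as a Beta prior, and updates it with a small inspection sample.

```python
# Hypothetical sketch: a conjugate Beta-Binomial update.
# The unknown defect rate theta is treated as a random variable with a
# Beta prior built from information about related, earlier products.

def beta_binomial_update(alpha_prior, beta_prior, defects, n):
    """Return the Beta posterior parameters after observing
    `defects` defective units among `n` inspected units."""
    return alpha_prior + defects, beta_prior + (n - defects)

# Prior information from similar products: roughly a 5% defect rate,
# worth about 40 pseudo-observations (2 defects in 40 units).
alpha0, beta0 = 2.0, 38.0

# A small new sample: 1 defect in 20 inspected units.
a, b = beta_binomial_update(alpha0, beta0, defects=1, n=20)

print(f"prior mean          : {alpha0 / (alpha0 + beta0):.3f}")
print(f"posterior mean      : {a / (a + b):.3f}")
print(f"classical MLE (1/20): {1 / 20:.3f}")
```

    Even with only 20 observations, the posterior combines the prior and the sample, which is exactly the sense in which prior information can substitute for sample size.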

  2. Anonymous users2024-02-06

    The Bayesian school of statistics and the classical school of statistics differ on many issues, but their most fundamental differences are the following. First, whether prior information is used. Because the design and production of products have a degree of continuity, a great deal of information about related products is available as prior information. The Bayesian school holds that using this prior information can not only reduce the required sample size but also, in many cases, improve statistical accuracy; the classical school of statistics ignores this information.

    Second, whether the parameter θ is considered a random variable. The most basic idea of the Bayesian school is that any unknown quantity θ can be regarded as a random variable and described by a probability distribution, namely the prior distribution.

    Because any unknown quantity involves uncertainty, probability and probability distributions are the best language for expressing that uncertainty. The classical school of statistics, by contrast, treats the unknown quantity θ as a fixed but unknown parameter and makes statistical inferences about it.

    Although Bayesian and classical statistics differ considerably, each has its own advantages, disadvantages, and scope of application. Moreover, in many cases the conclusions they reach are formally identical.

  3. Anonymous users2024-02-05

    Applicability of Bayesian statistics: given the observed data D and knowledge of the prior probabilities of the candidate hypotheses h, it seeks the most probable hypothesis. Bayesian theory provides a way to compute the probability of a hypothesis from the prior probability of that hypothesis, the probability of observing different data under the hypothesis, and the data actually observed.
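    In symbols (a standard statement added here for clarity, with D the observed data, h a hypothesis, and H the hypothesis space):

    \[
    P(h \mid D) = \frac{P(D \mid h)\,P(h)}{P(D)},
    \qquad
    h_{\mathrm{MAP}} = \arg\max_{h \in H} P(D \mid h)\,P(h),
    \]

    where \(P(h)\) is the prior probability of the hypothesis, \(P(D \mid h)\) is the probability of observing the data under that hypothesis, and \(h_{\mathrm{MAP}}\) is the most probable hypothesis given the data.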

    Because the design and production of products have a degree of continuity, a great deal of information about related products is available as prior information. Bayesian statistics holds that using this prior information can not only reduce the required sample size but also, in many cases, improve statistical accuracy; the classical school of statistics ignores this information.

    Research significance: when making decisions under uncertainty, people need to estimate the probabilities of various conclusions, and this kind of reasoning is called probabilistic reasoning. Probabilistic reasoning is an object of study in probability theory and logic, and it is also studied in psychology, but from a different angle: probability theory and logic study the formulas and rules for estimating objective probabilities, whereas psychology studies the cognitive processes behind people's subjective probability estimates.

  4. Anonymous users2024-02-04

    It's not hard.

    Bayesian methods use prior information: when the amount of data is small, accurate statistical results can still be obtained from an accurate prior, which to some extent reduces the dependence of the results on the amount of observed data.

    Because Bayesian methods introduce prior information, an imprecise prior also affects the final statistical results. In general, increasing the amount of observed data dilutes this prior bias, but when the data volume is small the accuracy of Bayesian estimates depends heavily on the prior, and this is precisely why Bayesian estimation has been controversial for decades.
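    A small sketch of this trade-off (an added illustration; the prior and the true rate are invented numbers): a deliberately imprecise Beta prior pulls the estimate away from the truth when the sample is small, and its influence fades as the number of observations grows.

```python
# Hypothetical sketch: the influence of an imprecise prior shrinks
# as the amount of observed data grows.
import random

random.seed(0)
true_rate = 0.30              # the true (unknown) success probability
alpha0, beta0 = 9.0, 1.0      # imprecise prior, centred near 0.9

for n in (10, 100, 1000, 10000):
    successes = sum(random.random() < true_rate for _ in range(n))
    post_mean = (alpha0 + successes) / (alpha0 + beta0 + n)
    print(f"n = {n:5d}  posterior mean = {post_mean:.3f}  "
          f"sample frequency = {successes / n:.3f}")
```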

    Posterior distribution: given the sampling distribution p(x | θ) of the sample x and the prior distribution π(θ), the conditional distribution π(θ | x) of θ given the observed value x can be calculated using the method for finding conditional probability distributions in probability theory. Because this distribution is obtained after sampling, it is called the posterior distribution.

    According to the Bayesian school, this distribution combines the information provided by the sample x with the information provided by the prior distribution π(θ). The whole purpose of sampling is to complete the transformation from the prior distribution to the posterior distribution.

    Reference: encyclopedia entry on Bayesian statistics.
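    In standard notation (added here for clarity), the posterior computation described in this answer is

    \[
    \pi(\theta \mid x) \;=\; \frac{p(x \mid \theta)\,\pi(\theta)}{\int p(x \mid \theta)\,\pi(\theta)\,d\theta}
    \;\propto\; p(x \mid \theta)\,\pi(\theta).
    \]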

  5. Anonymous users2024-02-03

    The information contained in the sample about the population can be divided into two parts:

    One is information about the structure of the population, that is, information reflecting the form of the population distribution;

    The second is information about the unknown parameters of the population, which arises because the distribution of the sample depends on the unknown parameters of the population distribution.

    Processing a sample into a statistic can only reduce the information, never increase it; in other words, statistics compress the data, but a good statistic should highlight the information we need.

    A good statistic should therefore extract all of the information about the unknown parameters contained in the sample, that is, processing the sample should lose no information about the unknown parameters. This property is called sufficiency. How can this idea be expressed mathematically? In 1922, Fisher introduced an important concept for exactly this purpose: the sufficient statistic.

    Roughly speaking, a sufficient statistic summarizes the sample without losing information about the parameter. It is a very important concept for simplifying statistical problems, and it is also one of the few points on which classical statistics and Bayesian statistics agree.

    The factorization theorem gives a necessary and sufficient condition for determining whether a statistic is sufficient.
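    For completeness (a standard statement added here, not from the original answer), the factorization theorem says that a statistic \(T(X)\) is sufficient for \(\theta\) if and only if the joint density factors as

    \[
    p(x_1, \dots, x_n \mid \theta) \;=\; g\bigl(T(x_1, \dots, x_n), \theta\bigr)\, h(x_1, \dots, x_n).
    \]

    For example, for Bernoulli observations with parameter \(\theta\),
    \[
    p(x_1, \dots, x_n \mid \theta) = \theta^{\sum_i x_i} (1 - \theta)^{\,n - \sum_i x_i},
    \]
    so \(T(x) = \sum_{i=1}^{n} x_i\) is sufficient (take \(h \equiv 1\)).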

  6. Anonymous users2024-02-02

    Summary. Hello, the fusion of classical statistics and Bayesian statistics has produced a new statistical approach called "Bayesian statistics". This method combines the statistical inference of classical statistics with the probabilistic inference of Bayesian statistics to obtain more accurate results.

    The basic idea of Bayesian statistics is that, when inferring how likely an event is to occur, both known and unknown information are taken into account. Bayesian statistics helps researchers better understand data, estimate parameters more accurately, and collect information more efficiently, leading to better results.

    Is classical statistics the basis of Bayesian statistics? Is ChatGPT a fusion of the two statistics?

    No. Classical statistics is statistics based on probability models, while Bayesian statistics is statistics based on Bayesian models. ChatGPT is not a fusion of the two kinds of statistics; it is a language-modelling technology based on neural networks, used to predict the next word or sentence of a text.

    ChatGPT uses Bayesian statistics, right?

    The core principle behind ChatGPT is the Bayes formula, one of the simplest technical principles of probability and statistics.

    The basic properties of Bayesian statistics and classical statistics are different: (1) they understand probability differently; (2) their basic nature also differs. In this sense the two are opposites.

    In classical statistics, the sample is regarded as coming from a population with a certain probability distribution, and the parameters of that population are ordinary unknown constants, not random quantities. In contrast, Bayesian statistics regards any unknown parameter as a random variable with its own uncertainty, so the two views can be said to be opposites.

    Ok thanks!

    Have more questions?

  7. Anonymous users2024-02-01

    Classical statistics and Bayesian statistics are two different statistical approaches whose basic assumptions and modes of reasoning differ. Classical statistics is based on the frequency school: a sample is regarded as a random draw from a population, and the values of the population parameters are inferred from the sample data. Bayesian statistics is based on the Bayesian school: parameters are regarded as random variables, and the posterior distribution is computed from the prior distribution and the sample data to obtain a probability distribution for the parameters.

    In practical applications, classical and Bayesian statistics can be used together to exploit their respective strengths. For example, Bayesian statistics can be used when the sample is small or prior information is important, while classical methods can be used when the sample is large or prior information matters little. Bayesian statistics can also be used to correct the deviations of classical statistics: in parameter estimation, a classical method may show a large deviation, and Bayesian statistics can correct this by introducing prior information.

    Therefore, the combined use of classical statistics and Bayesian statistics can make statistical analysis more accurate and comprehensive.
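    As an added sketch of this combined use (all numbers hypothetical), the snippet below compares the classical estimate of a proportion with a Bayesian estimate that shrinks a small, noisy sample toward a mildly informative prior; with a large sample the two estimates essentially agree.

```python
# Hypothetical sketch: classical vs. Bayesian estimation of a proportion.

def classical_estimate(successes, n):
    """Maximum-likelihood estimate: the raw sample proportion."""
    return successes / n

def bayesian_estimate(successes, n, alpha0=4.0, beta0=16.0):
    """Posterior mean under a Beta(alpha0, beta0) prior
    (prior mean 0.20, worth 20 pseudo-observations)."""
    return (alpha0 + successes) / (alpha0 + beta0 + n)

# Small sample: 3 successes out of 5 trials.
print(classical_estimate(3, 5))        # 0.60 -- very noisy
print(bayesian_estimate(3, 5))         # 0.28 -- pulled toward the prior

# Large sample: 210 successes out of 1000 trials.
print(classical_estimate(210, 1000))   # 0.210
print(bayesian_estimate(210, 1000))    # ~0.210 -- nearly identical now
```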

  8. Anonymous users2024-01-31

    Bayesian statistics are applied as follows:

    Bayesian networks were developed in the 1980s by Judea Pearl (born 1936), an American computer scientist, and were soon applied in the medical field. In medicine, a disease is generally identified as far as possible from the patient's symptoms and test values. This is a process of inferring causes from effects, which means Bayesian statistics can certainly come in handy.

    With a Bayesian network, more complex causal relationships can be captured.

    To capture such complex causal relationships mathematically, the University of Pittsburgh developed a Bayesian network for diagnosing liver disease. Liver disease has various causes, such as alcohol consumption, hepatitis virus infection, and gallstones; it also produces many symptoms, such as abdominal pain, hair loss, and an enlarged spleen.

    Doctors must see through these complex causal relationships and choose the appropriate treatment for each patient.

    This is where a Bayesian network helps doctors make a diagnosis. The doctor enters the patient's medical history, drinking history, test values, and symptoms into the Bayesian network as a query, so that the initially uncertain prior probability of liver disease is updated to a more reliable posterior probability. This yields a highly accurate assessment of whether the condition is liver disease or another disease, and helps doctors choose the best plan more easily.
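    A toy version of such a diagnostic update (an added sketch with invented probabilities, far simpler than the network described above): a single disease node and two binary findings, combined by Bayes' rule under a naive conditional-independence assumption.

```python
# Hypothetical two-finding diagnostic sketch (invented numbers).
# Assumes the findings are conditionally independent given the disease,
# i.e. the simplest possible (naive Bayes) network structure.

p_disease = 0.01                       # prior probability of liver disease
p_find_given_disease = {"abdominal_pain": 0.70, "enlarged_spleen": 0.40}
p_find_given_healthy = {"abdominal_pain": 0.10, "enlarged_spleen": 0.02}

def posterior(findings):
    """Return P(disease | observed findings) via Bayes' rule."""
    weight_d = p_disease
    weight_h = 1.0 - p_disease
    for f in findings:
        weight_d *= p_find_given_disease[f]
        weight_h *= p_find_given_healthy[f]
    return weight_d / (weight_d + weight_h)

print(posterior(["abdominal_pain"]))                     # prior updated once
print(posterior(["abdominal_pain", "enlarged_spleen"]))  # stronger evidence
```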

  9. Anonymous users2024-01-30

    Applications of Bayesian statistics: machine learning, pattern recognition.

    Bayesian statistical methods are widely used in statistics, computer science, pattern recognition, computer vision, signal processing, machine learning and other fields; they constitute an interpretable analysis method grounded in probability theory.

    The British scholar Thomas Bayes proposed a theory of inductive reasoning in "An Essay Towards Solving a Problem in the Doctrine of Chances", which later statisticians developed into a systematic method of statistical inference known as the Bayesian method. All the results obtained by statistical inference with this method constitute the content of Bayesian statistics.

    Statisticians who believe that Bayesian methods are the only reasonable methods of statistical inference form the Bayesian school of mathematical statistics, whose formation dates back to the 1930s. By the 1950s and 1960s it had developed into an influential school, and its influence continues to grow.

    The main point of contention between the Bayesian school and the frequency school is the prior distribution. The frequency school is the school of statistics that adheres to the frequency interpretation of probability. The Bayesian school holds that a prior distribution may be subjective and neither has nor needs a frequency interpretation.

    The frequency school, on the other hand, argues that using a prior distribution in statistical inference is acceptable only when the prior has an objective meaning and can be determined from appropriate theory or past experience; otherwise objectivity is lost.

    Another criticism is that Bayesian methods give a stylized solution to any statistical problem, which leads to the mechanical application of formulas instead of in-depth analysis of the problem.

    The Bayesian school holds that, in theory, it can be shown under certain conditions that any reasonable criterion of optimality must be a Bayesian criterion corresponding to some prior distribution, so every statistician is, consciously or not, a "Bayesian". They argue that although frequentists do not explicitly use prior distributions, the solutions they obtain are still Bayesian solutions under some implicit prior distribution.

    This implicit prior distribution may be less reasonable than a carefully chosen subjective prior. Second, Bayesians hold that giving stylized solutions to statistical inference and decision problems is an advantage of Bayesian methods rather than a drawback, because it removes the difficult mathematical problem of deriving sampling distributions.

    Moreover, this stylized solution is not a mechanical formula: it requires considerable work on the choice of the prior distribution, the loss function, and so on. The Bayesian school also argues that solutions obtained by Bayesian methods do not require a frequency interpretation and are therefore meaningful even when used only once.

    Conversely, a solution based on the frequency interpretation of probability is only meaningful if it is used a large number of times, which is often not in line with the reality of the application. The controversy between these two schools of thought was a feature of the development of mathematical statistics in the post-war period. This controversy is far from being resolved, and it will have an impact on the development of mathematical statistics in the future.
