Regarding Bayesian reasoning: what do the reasoning methods of the Bayesian model mainly include?

Technology, updated 2024-03-04
4 answers
  1. Anonymous user, 2024-02-06

    Bayesian statistics has a basic tool called the Bayesian formula, also known as Bayes' rule. Although it is a mathematical formula, its principle can be understood without any numbers.

    If you see a person who consistently does good deeds, that person is most likely a good person. In other words, when you do not know the exact nature of a thing, you can rely on the number of observed events related to a specific property to estimate the probability that the thing has that property. In mathematical language:

    The more events that support a property, the more likely it is that the property will be true.
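
    In symbols, this is Bayes' rule, stated here for reference (H stands for the hypothesis, for example "this person is good", and E for the observed evidence; the notation is added for illustration and is not part of the original answer):

    ```latex
    P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)}
                \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
    ```

    Each new event that is more likely under H than under not-H raises the posterior probability of H, which is why repeatedly observed good deeds keep strengthening the judgment that the person is good.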

    But behavioral economists have found that people often do not follow Bayes' rule in the decision-making process: they give too much weight to recent events and recent experiences when forming judgments and making decisions. When faced with complex, general problems, people tend to take shortcuts and base decisions on impressions of plausibility rather than on probabilities. This systematic deviation from the classical model is called "bias".

    Because of these psychological biases, investors are not perfectly rational in their decisions and judgments; their biased behavior in turn affects movements in the capital market. Yet for a long time, in the absence of a more powerful alternative tool, economists had to adhere to Bayes' rule in their analysis.

    Bayesian reasoning has two essential requirements: the first is to be clear about your existing judgment, and the second is to be honest about new evidence; both are indispensable. The former is the starting point of the judgment, and the latter is the basis for updating it.

    The prior judgment and the new evidence are not always independent of each other, however. If you already hold an absolute belief in the existence of God, then no matter what new information and evidence emerge, you will always find an interpretation that keeps you comfortable.

    True Bayesians respect their preconceptions, because these are the starting point of all new knowledge, but they remain ready to discard them at any time in order to avoid falling into this trap.
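
    A minimal sketch of this updating discipline in Python (the numbers and the helper function are illustrative assumptions, not from the answer): a prior strictly between 0 and 1 is moved by the evidence, while a dogmatic prior of exactly 1.0 can never be revised, which is the trap described above.

    ```python
    def update(prior, likelihood_h, likelihood_not_h):
        """One application of Bayes' rule for a single hypothesis H."""
        numerator = likelihood_h * prior
        denominator = numerator + likelihood_not_h * (1.0 - prior)
        return numerator / denominator

    # Evidence that is four times more likely if H is false than if H is true.
    likelihood_h, likelihood_not_h = 0.2, 0.8

    open_prior, dogmatic_prior = 0.5, 1.0
    for _ in range(3):  # three pieces of disconfirming evidence
        open_prior = update(open_prior, likelihood_h, likelihood_not_h)
        dogmatic_prior = update(dogmatic_prior, likelihood_h, likelihood_not_h)

    print(round(open_prior, 4))      # ~0.0154: the open mind has shifted
    print(round(dogmatic_prior, 4))  # 1.0: the absolute prior never moves
    ```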

  2. Anonymous user, 2024-02-05

    The reasoning methods of the Bayesian model mainly include: the heuristic strategy theory, the natural sampling space hypothesis, the frequency effect theory, and the sampling processing theory.

    Bayesian reasoning is an inductive reasoning method originated by the British clergyman Thomas Bayes, and many later researchers have continuously improved Bayesian methods in their views, techniques, and theory, finally forming an influential school of statistics that broke the dominance of classical statistics. Bayesian reasoning is a newer method of inference developed on the basis of classical statistical inductive reasoning, namely estimation and hypothesis testing.

    Compared with classical statistical inductive reasoning, Bayesian reasoning draws conclusions not only from the currently observed sample information but also from the relevant past experience and knowledge of the person making the inference. As a method of reasoning, Bayesian reasoning is an extension of Bayes' theorem in probability theory.

    Study Overview:

    Kahneman and Tversky opened up an important field of study in probabilistic reasoning. Their research in the early 1970s first found that people's intuitive probabilistic reasoning does not follow the Bayesian principle: the base-rate information given in a problem is often ignored, and judgments are based mainly on the hit-rate (diagnostic) information.

    In one of their classic studies, participants were told that, of 100 people, 70 were lawyers and 30 were engineers, and that one person had been selected from them at random. When that person's personality traits were described in a way typical of an engineer, participants judged the probability that the person was an engineer to be very high. Clearly, the participants ignored the base rate, under which the probability of an engineer is only 30%.

    Later, they used a variety of problems to verify this base-rate neglect phenomenon. For example, participants were asked to solve the following taxi problem: 85% of the taxis in a city belong to the Green Cab company and 15% to the Blue Cab company. A taxi has been involved in a hit-and-run incident. According to an eyewitness, the car involved belonged to the Blue Cab company, and the reliability of the witness is 80%. Question: what is the probability that the car that caused the accident was a blue cab?

    Most participants judged the probability to be 80%, but when the base rate is taken into account it should be about 41%.
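
    The 41% figure follows directly from Bayes' rule; here is a quick check of the arithmetic (a sketch added for illustration, not part of the original answer):

    ```python
    p_blue, p_green = 0.15, 0.85          # base rates of the two companies
    p_say_blue_if_blue = 0.80             # witness is right 80% of the time
    p_say_blue_if_green = 0.20            # witness is wrong 20% of the time

    posterior = (p_say_blue_if_blue * p_blue) / (
        p_say_blue_if_blue * p_blue + p_say_blue_if_green * p_green
    )
    print(round(posterior, 3))  # 0.414, i.e. about 41%
    ```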

  3. Anonymous user, 2024-02-04

    Bayesian inference is a statistical method applied to decision-making under conditions of uncertainty. The distinguishing feature of Bayesian inference is that prior information and sample information can be used to arrive at a statistical conclusion.

  4. Anonymous user, 2024-02-03

    Inferences are conclusions or decisions made based on phenomena. Statistical inference is a conclusion about the unobservable properties of the world based on observed characteristics in the real world, often referred to as hypothesis testing. In statistics, unobservable features are often referred to as parameters, while observed features are referred to as data or sample information.

    Bayesian statistical inference is a method that allows investigators to use both sample information and prior information in a logically consistent manner when evaluating statistical hypotheses. In economics, Bayesian inference is used to help evaluate different economic hypotheses and models, to estimate the numerical values of economic parameters, and to make decisions about economic variables yet to be observed. The conclusions of Bayesian inference are probability statements: about the values of the parameters under study, about the relative credibility of some hypothesis, or about the likely intervals of future observations.

    Compared with non-Bayesian inference, Bayesian inference is characterized by its use of prior information. Prior information may come from earlier research, from theory, or from subjective belief. The term "Bayesian" refers to Bayes' theorem, named after Thomas Bayes (1702-1761), a Presbyterian minister and mathematician in England, which describes how prior information can be combined with sample information in a probabilistically consistent way.

    Bayes' theorem, sometimes called the theorem of inverse probability, is the basis of the Bayesian learning model. It allows prior information and earlier sample information to be combined with current sample information to produce a posterior distribution. The probability density function (pdf) that characterizes the prior information is called the prior distribution.

    The function that characterizes the sample information is called the likelihood function. Bayes' theorem states that the posterior density is proportional to the product of the prior density and the likelihood function. Through this multiplication, Bayes' theorem combines the sample information with the prior information, in effect averaging the two.
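
    As a concrete instance of this "posterior proportional to prior times likelihood" rule, the following sketch uses the standard Beta-Binomial conjugate pair; the specific numbers are illustrative assumptions, not from the answer. A Beta(a, b) prior for a success probability, combined with k successes in n trials, yields a Beta(a + k, b + n - k) posterior, whose mean lies between the prior mean and the sample frequency:

    ```python
    # Conjugate Beta-Binomial update: posterior density is proportional to
    # prior density times likelihood.
    prior_a, prior_b = 2.0, 2.0      # Beta(2, 2) prior: mean 0.5, mildly informative
    successes, trials = 7, 10        # observed sample: 7 successes in 10 trials

    post_a = prior_a + successes
    post_b = prior_b + (trials - successes)

    prior_mean = prior_a / (prior_a + prior_b)        # 0.5
    sample_freq = successes / trials                  # 0.7
    posterior_mean = post_a / (post_a + post_b)       # 9/14, about 0.643

    print(prior_mean, sample_freq, round(posterior_mean, 3))
    # The posterior mean lies between the prior mean and the observed frequency,
    # which is the "averaging" of prior and sample information described above.
    ```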

    As long as prior information is available, this special averaging mechanism of Bayes' theorem is of great value for economic estimation and prediction.

    Bayesian inference can also be viewed as a dynamic process: it begins with prior information, collects evidence in the form of sample information, and ends with a posterior distribution. That posterior distribution can then serve as a new prior distribution to be combined with new sample information; this repeated transition from prior to posterior is the Bayesian learning model.
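
    A short sketch of this prior-to-posterior cycle, continuing the illustrative Beta-Binomial setup above: updating in two batches gives exactly the same posterior as updating on all of the data at once, because yesterday's posterior becomes today's prior.

    ```python
    def beta_update(a, b, successes, trials):
        """Fold one batch of Binomial data into a Beta(a, b) prior."""
        return a + successes, b + (trials - successes)

    a, b = 2.0, 2.0                       # initial Beta(2, 2) prior
    a, b = beta_update(a, b, 7, 10)       # first batch -> posterior Beta(9, 5)
    a, b = beta_update(a, b, 3, 5)        # posterior reused as prior for batch two

    print(a, b)                           # Beta(12.0, 7.0)
    print(beta_update(2.0, 2.0, 10, 15))  # same result from all data at once
    ```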
