-
Here is a more detailed explanation.
-
Regularization arises because, in linear algebra, an ill-posed problem is usually defined by a system of linear algebraic equations whose coefficient matrix has a very large condition number. A large condition number means that rounding errors or other small perturbations can severely distort the solution of the problem.
The common way to handle an ill-posed problem is to approximate the solution of the original problem by the solutions of a family of well-posed problems that are "close" to it; this is called the regularization method. How to construct effective regularization methods is an important research topic in the study of ill-posed problems in the field of inverse problems.
Common regularization methods include Tikhonov regularization, which is based on the variational principle, as well as various iterative methods and other improved methods. All of these are effective for solving ill-posed problems and have been widely studied and applied across many types of inverse problem research.
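Tikhonov regularization can be sketched in a few lines. Below is a minimal illustration; the 8×8 Hilbert matrix and the noise level are my own illustrative assumptions, not an example from the original text. The regularized normal equations recover the solution far better than a naive solve.

```python
import numpy as np

# Minimal Tikhonov sketch. The 8x8 Hilbert matrix is a classic
# ill-conditioned operator (condition number around 1e10); it is an
# illustrative assumption, not an example from the text.
n = 8
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)
b = A @ x_true + 1e-6 * np.random.default_rng(0).standard_normal(n)

# Naive solve: the huge condition number lets the tiny noise dominate.
x_naive = np.linalg.solve(A, b)

# Tikhonov: minimize ||Ax - b||^2 + lam * ||x||^2, i.e. solve the
# regularized normal equations (A^T A + lam I) x = A^T b.
lam = 1e-8
x_tik = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

err_naive = np.linalg.norm(x_naive - x_true)  # large
err_tik = np.linalg.norm(x_tik - x_true)      # much smaller
print(err_naive, err_tik)
```

The penalty trades a small bias for stability: directions with tiny singular values, which would otherwise amplify the noise, are damped by the factor lam.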
Regularization (normalization) is also a concept in algebraic geometry.
In layman's terms, it gives a plane irreducible algebraic curve a holomorphic parametric representation.
That is, for an irreducible algebraic curve C in PC², one looks for a compact Riemann surface C* and a holomorphic map σ: C* → PC² such that σ(C*) = C.
The strict definition is as follows.
Let C be an irreducible plane algebraic curve and S the set of singular points of C. Suppose there exist a compact Riemann surface C* and a holomorphic map σ: C* → PC² such that:
(1) σ(C*) = C; (2) σ⁻¹(S) is a finite set of points; (3) σ: C* \ σ⁻¹(S) → C \ S is a one-to-one mapping.
Then (C*, σ) is called the regularization of C. When no confusion arises, C* itself may also be called the regularization of C.
In practice, regularization separates the branches of an irreducible plane algebraic curve that pass through a singular point with different tangents, thereby eliminating the singularity.
The main problems it solves:
1. Regularization imposes constraints on the minimization of the empirical error; these constraints can be interpreted as prior knowledge (regularizing the parameters is equivalent to placing a prior distribution on them). The constraints have a guiding effect: when optimizing the error function, directions of gradient descent that satisfy the constraints are preferred, so the final solution tends to conform to the prior knowledge. For example, the common L-norm priors express the belief that the underlying problem is more likely to be simple; optimizing with them tends to produce solutions with small parameter values, which generally correspond to sparse or smooth solutions.
2. At the same time, regularization resolves the instability of the ill-posed problem: the resulting solution exists, is unique, and depends only weakly on the data, so it does not overfit. If the prior (the regularization) is appropriate, the solution tends toward the true solution, even when the number of independent samples in the training set is small.
-
Let's take function fitting as an example! Given a set of data, fitting a function to it really means estimating the unknown parameters of the function. If your data are not clean, for example contaminated with some noise, the fit will not be very good: the estimated parameters are not smooth and fluctuate noticeably. This is the so-called overfitting problem, because the noise itself gets fitted during the fitting process!
This is where regularization is needed. A popular way to understand regularization is that it specifies a value range for the parameters of the function, preventing large fluctuations in the parameters. The benefit is that it effectively suppresses the influence of noise on the parameter estimates.
Regularization is usually expressed by adding a regularization term to the end of the loss function.
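The "value range for the parameters" idea above can be sketched as a soft penalty. The example below (the data, polynomial degree, and lam are illustrative assumptions) fits a degree-9 polynomial to noisy samples with and without an L2 penalty; the penalized weights have a far smaller norm, which corresponds to a smoother fit.

```python
import numpy as np

# Fitting sketch for the paragraph above: a degree-9 polynomial fit to
# noisy samples of sin(2*pi*x), with and without an L2 penalty.
# The data, degree, and lam are illustrative assumptions.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 12)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

X = np.vander(x, 10)  # design matrix with columns x^9, ..., x^0

# Plain least squares: the parameters swing wildly to chase the noise.
w_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Ridge: minimize ||Xw - y||^2 + lam * ||w||^2, the soft-penalty form
# of "specifying a value range" for the parameters.
lam = 1e-3
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# The penalized weight vector has a much smaller norm: a smoother fit.
print(np.linalg.norm(w_ols), np.linalg.norm(w_ridge))
```

For full-column-rank X, the ridge solution's norm is strictly smaller than the least-squares solution's norm for any lam > 0, which is exactly the "prevent large fluctuations" effect described above.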
I suggest you go to CSDN and check out the article "Talk about your own understanding of regularization".
-
L1 regularization assumes that the prior distribution of the parameters is a Laplace distribution; it ensures the sparsity of the model, that is, some parameters are exactly 0.
L2 regularization assumes that the prior distribution of the parameters is a Gaussian distribution; it ensures the stability of the model, that is, the parameter values are neither too large nor too small.
L1 regularization and L2 regularization can be seen as penalty terms of the loss function. The so-called penalty refers to the imposition of some restrictions on certain parameters in the loss function. For linear regression models, the model that uses L1 regularization is called LASSO regression, and the model that uses L2 regularization is called Ridge regression.
The loss function of LASSO regression is the squared error plus the L1 term: minimize ||Xw − y||² + α||w||₁, where the term after the plus sign, ||w||₁, is the L1 regularization term.
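A hedged sketch of the sparsity-vs-stability contrast, using a hand-rolled ISTA loop for the LASSO and the ridge closed form (the data and α are invented for illustration; this is not code from the original article):

```python
import numpy as np

# Contrast L1 and L2 penalties on the same synthetic data.
# The data, alpha, and iteration count are illustrative assumptions.
rng = np.random.default_rng(2)
X = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[[0, 3]] = [3.0, -2.0]  # only two features actually matter
y = X @ w_true + 0.01 * rng.standard_normal(50)

alpha = 1.0

# Ridge (L2): closed form; shrinks weights but leaves none exactly zero.
w_l2 = np.linalg.solve(X.T @ X + alpha * np.eye(10), X.T @ y)

# LASSO (L1) via ISTA: a gradient step on the squared error followed by
# soft-thresholding, which drives small weights exactly to zero.
w_l1 = np.zeros(10)
step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant
for _ in range(2000):
    g = X.T @ (X @ w_l1 - y)
    z = w_l1 - step * g
    w_l1 = np.sign(z) * np.maximum(np.abs(z) - step * alpha, 0.0)

print("exact zeros, L1:", int(np.sum(np.abs(w_l1) < 1e-8)))
print("exact zeros, L2:", int(np.sum(np.abs(w_l2) < 1e-8)))
```

The soft-thresholding step is what makes the L1 penalty produce exact zeros (sparsity); the L2 closed form only shrinks every coefficient toward zero (stability).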
-
Related terms: regularization; regularizing operator; regularization operator.