Is it true that Yihui Zhongmeng's offline big data amounts to 800 million records?

Technology, updated 2024-05-16
16 answers
  1. Anonymous user, 2024-02-10

    It's true. We have been burning money to do it while they have been making money doing it; we can't keep going, and they are still at it. The volume of offline data is larger than you can imagine, so 800 million+ is entirely possible. More and more enterprises are seeing the value of offline data, and it will be hard to catch up with that three-year accumulation any time soon.

  2. Anonymous user, 2024-02-09

    Beijing Yihui Zhongmeng (Shanghai Chongmeng Technology) is really not good. It began recruiting investors in 2015, and by now there are more than 1,000 ** businesses across the country, yet only a handful of new ** businesses are opening (the new ones don't know the platform's situation, so they work the market with confidence). It uses the concept of big data to fool customers, but the advertisements all run on leftover traffic and can't actually be delivered, and the CPM is several times higher than that of DSP platforms.

    There is no effect after the ads run, and the promoted ** platform is restricted by its industry or even cannot be launched at all. In 2020 they launched a "super push" product, claiming to combine private domain + public domain for all-domain marketing, but it merely bundled mini programs, online **, online business cards, and online live streaming into one platform; in practice the functions simply could not be delivered. Don't ask me how I know; I just can't bear to watch the ** merchants' money go to waste.

    At present, many ** businesses in various markets feel deceived by Zhongmeng and are asking for refunds, so be careful before signing up. If you are really sold on the concept, ask Zhongmeng to let you run a trial placement first, or find a local Zhongmeng ** business and learn about the actual situation.

  3. Anonymous user, 2024-02-08

    Big data itself is real, but putting it to concrete use is still very difficult, especially for precise ad targeting, which is hard to achieve; the technology across the industry is still largely at the stage of theoretical concepts.

  4. Anonymous user, 2024-02-07

    This one is okay, but Bihe Technology's Zhaocai Bao does it better: it collects mobile phone numbers within 200 meters and reaches your target users through a series of marketing methods, with a very low customer-acquisition cost.

  5. Anonymous user, 2024-02-06

    Understand the characteristics of big data in a minute.

  6. Anonymous user, 2024-02-05

    "Big data" is a massive, high-growth and diversified information asset that requires new processing models to have stronger decision-making, insight and process optimization capabilities.

    Structural characteristics of big data:

    First, big data is a manifestation or feature of the Internet's development at its current stage. There is no need to mythologize it or hold it in awe: against the backdrop of technological innovation represented by cloud computing, data that was once hard to collect and use has become easy to use, and through continuous innovation across all walks of life, big data will gradually create more value for humanity.

    Second, to understand big data systematically, you have to decompose it comprehensively and carefully, and I approach it from three levels:

    The first level is theory, which is the necessary path to understanding and the baseline that is most widely recognized and disseminated. Here we understand the industry's overall depiction and characterization of big data from its defining characteristics, analyze what makes it precious from the perspective of its value, gain insight into its development trends, and, from the perspective of big-data privacy, examine the especially important long-term game between people and data.

    The second level is technology, the means by which the value of big data is realized and the cornerstone of its progress. Here, the whole process of big data, from collection, processing, and storage through to the formation of results, is illustrated by the development of cloud computing, distributed processing technology, storage technology, and perception technology.

    The third level is practice, the ultimate embodiment of big data's value. Here, from the four aspects of Internet big data, government big data, enterprise big data, and personal big data, we depict the beautiful picture big data has already shown and the blueprint that is about to be realized.

    Features: Compared with traditional data-warehouse applications, big-data analysis is characterized by large data volumes and complex queries and analysis. The article "Architecture Big Data: Challenges, Status Quo and Prospects", published in the Journal of Computer Science, lists several important features that a big-data analysis platform needs to have, analyzes and summarizes the current mainstream implementation platforms, such as parallel databases, MapReduce, and hybrid architectures based on both, points out their respective advantages and disadvantages, introduces the state of research in each direction and the authors' own work on big-data analysis, and looks ahead to future research.
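    The MapReduce model mentioned above is easiest to grasp from a tiny sketch. Below is a minimal, single-machine Python word count; it is a sketch only, assumes nothing about Hadoop or any particular platform, and the function names (map_phase, shuffle, reduce_phase) are illustrative rather than part of any real framework.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map step: emit a (word, 1) pair for every word in one document.
    return [(word, 1) for word in document.split()]

def shuffle(mapped_pairs):
    # Shuffle step: group intermediate values by key, as a framework would
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Reduce step: aggregate all counts for a single word.
    return key, sum(values)

documents = ["big data needs new processing models", "big data creates value"]
mapped = chain.from_iterable(map_phase(doc) for doc in documents)
counts = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
print(counts)  # e.g. {'big': 2, 'data': 2, 'needs': 1, ...}
```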

    The four "Vs", or characteristics, of big data have four levels:

    First, the volume of data is huge, jumping from the TB level to the PB level.

    Second, the data types are numerous and diverse.

    Third, the processing speed is fast, following the "one-second rule": the most valuable information can be extracted quickly from many types of data, which is also what fundamentally distinguishes it from traditional data-mining technology.

    Fourth, as long as data is used wisely and analyzed correctly and accurately, it will bring high value returns. The industry summarizes this into four "Vs": volume (large data volume), variety (multiple data types), velocity (fast processing speed), and value (low value density).

    The core value of big data lies in storing and analyzing massive amounts of data. Compared with other existing technologies, big data's combination of "cheap, fast, and optimized" gives it the best overall cost.

  7. Anonymous user, 2024-02-04

    Big data is a collection of data that is so large that it greatly exceeds the capabilities of traditional database software tools in terms of acquisition, storage, management, and analysis, and has four characteristics: massive data scale, fast data flow, diverse data types, and low value density.

  8. Anonymous user, 2024-02-03

    Big data, an IT industry term, refers to a collection of data that cannot be captured, managed, and processed by conventional software tools within a certain time frame, and is a massive, high-growth and diversified information asset that requires new processing models to have stronger decision-making, insight and process optimization capabilities.

    In The Age of Big Data [1], Viktor Mayer-Schönberger and Kenneth Cukier write that big data means analyzing and processing all of the data rather than taking shortcuts such as random sampling (sample surveys). The 5V characteristics of big data (proposed by IBM) are: volume (large volume), velocity (high speed), variety (diversity), value (low value density), and veracity (authenticity).

  9. Anonymous user, 2024-02-02

    Big data is undoubtedly an important concept in the field of science and technology in recent years, and as more and more enterprises begin to gradually participate in the big data industry chain, the definition of big data itself is constantly enriched and developed.

    Big data can be defined from the following three angles:

    First, big data has redefined the value of data. Big data represents not only a technology but also an industry, as well as a development trend.

    Big data technology refers to the family of technologies built around realizing the value of data, including data collection, storage, security, analysis, and presentation; the big data industry refers to the industrial ecology built on big data technology, which is not yet mature and still has large room for development; and the development trend means that big data will become an important field of innovation.

    Second, big data lays the foundation for an intelligent society. The development of artificial intelligence requires three foundations, namely data, computing power and algorithms, so big data is of great significance to the development of artificial intelligence.

    At present, an important reason for the marked improvement in application results in the field of artificial intelligence is the support of large amounts of data, which comprehensively advances the training and validation of algorithms and thereby improves how well they work in practice.

    Third, big data promotes the digitization of social resources. The development of big data makes data produce greater value, and this process will, to a large extent, advance the digitization of social resources; as more social resources are digitized, the functional boundaries of big data will keep expanding, driving a series of innovations based on big data.

    Finally, an important reason big data matters is that it has opened up a new field of value: big data will gradually become an important means of production, and it could even be said that big data will be an emerging energy source for an intelligent society.

  10. Anonymous user, 2024-02-01

    McKinsey & Company was the first to propose the concept of big data. The definition it gave at the time was:

    Data that has permeated every industry and business function area; through the mining and application of this massive data, a new wave of productivity growth and consumer surplus will arrive.

    The definition later given by the McKinsey Global Institute was:

    A data collection that is so large that it greatly exceeds the capabilities of traditional database software tools in terms of acquisition, storage, management, and analysis, and has four characteristics: massive data scale, fast data flow, diverse data types, and low value density.

    Gartner, a research organization, gives this definition:

    "Big data" is a new processing paradigm that requires greater decision-making, insight and process optimization capabilities to adapt to massive, high-growth and diverse information assets.

    It refers to a collection of data that cannot be captured, managed, and processed with conventional software tools within a certain time frame; the amount of data involved is so large that it cannot, within a reasonable time, be captured, managed, processed, and organized by the human brain or even mainstream software tools into information that serves business decision-making in a more positive way.

    Simply understood as:

    Big data"It is a dataset with a particularly large volume and a large data category, and such a dataset cannot be crawled, managed, and processed by traditional database tools. To put it simply, it is super storage, after the massive data is uploaded to the cloud platform, the big data will conduct in-depth analysis and mining of the data.

  11. Anonymous user, 2024-01-31

    Big data refers to collections of data that cannot be handled with conventional software tools within a certain time frame.

  12. Anonymous user, 2024-01-30

    1. Big data is defined relative to traditional "small data". The official definition of big data is a dataset with a particularly large data volume and complex data categories that cannot be stored, managed, and processed with traditional databases. The main characteristics of big data are large data volume (volume), complex data categories (variety), fast data processing speed (velocity), and high data authenticity (veracity), together called the 4Vs.

    2. Detailed analysis:

  13. Anonymous user, 2024-01-29

    Definition of big data: Big data refers to an amount of data so large that it cannot, within a reasonable time, be captured, managed, processed, and organized by the human brain or even mainstream software tools into information that serves business decision-making in a more positive way. Netboat Technology starts from mobile Internet data collection and analyzes user behavior, using data-mining methods to deliver end-to-end data-analysis solutions.

    The analysis tool used is Adobe Insight, currently the most advanced in the industry.

    The characteristics of big data: large data volume, many data types, strong real-time requirements, and high data value. Big data exists in every industry, but much of the information and material is complex, and we need to search, process, analyze, and summarize it to uncover its deeper patterns.

    Big data collection: The development of science and technology and of the Internet has ushered in the era of big data. Every day, all walks of life generate huge numbers of data fragments, and data measurement units have grown from bytes, KB, MB, GB, and TB to PB, EB, ZB, YB, and even BB, NB, and DB. In the era of big data, collecting the data is no longer the technical problem; the problem is how, facing so much data, we can find its internal patterns.
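    For a sense of scale, here is a tiny illustrative calculation of that unit ladder, assuming binary (1024-based) steps between adjacent units.

```python
# Illustrative only: assumes each unit is 1024 times the previous one.
units = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]
for power, unit in enumerate(units):
    print(f"1 {unit} = 2^{power * 10} bytes = {1024 ** power:,} bytes")
```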

    Mining and processing of big data: Big data cannot be computed or estimated by the human brain, nor processed by a single computer. It must adopt a distributed computing architecture and rely on cloud computing's distributed processing, distributed databases, cloud storage, and virtualization technologies; therefore, mining and processing big data must make use of cloud technology.
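    As a rough sketch of what that "distributed processing" looks like, the snippet below uses Python's standard multiprocessing module to stand in for a cluster of nodes: the data is split into shards, each worker computes a partial result, and the partial results are combined. The dataset and the statistic (a mean) are assumptions for illustration, not any particular vendor's implementation.

```python
from multiprocessing import Pool

def partial_stats(chunk):
    # Each "node" computes partial results over its own shard of the data.
    return sum(chunk), len(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))            # stand-in for a massive dataset
    shards = [data[i::4] for i in range(4)]  # split the data across 4 workers

    with Pool(processes=4) as pool:
        partials = pool.map(partial_stats, shards)

    total, count = map(sum, zip(*partials))  # combine the partial results
    print("mean =", total / count)
```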

  14. Anonymous user, 2024-01-28

    In the digital era, Internet operations are inseparable from big data. So what is big data, and how is it being used?

  15. Anonymous user, 2024-01-27

    Big data is a kind of data collection so large that it greatly exceeds the capabilities of traditional database software tools in terms of acquisition, storage, management, and analysis, and it has four characteristics: massive data scale, fast data flow, diverse data types, and low value density.

    From a technical point of view, big data and cloud computing are as inseparable as the two sides of the same coin. Big data inevitably cannot be processed by a single computer and must adopt a distributed architecture; its hallmark is distributed mining of massive amounts of data.

    For that, it must rely on cloud computing's distributed processing, distributed databases, cloud storage, and virtualization technologies.

  16. Anonymous user, 2024-01-26

    In layman's terms, big data means integrating all the data together into a database larger than ever before. Academically speaking, big data is a collection of data that greatly exceeds the capabilities of traditional database software tools in terms of acquisition, storage, management, and analysis, and it has the four characteristics of massive data scale, fast data flow, diverse data types, and low value density.
