-
Data acquisition refers to the process of automatically collecting information from analog and digital units under test, such as sensors and other devices. A data acquisition system combines computer-based measurement hardware and software to form a flexible, user-defined measurement system.
The purpose of data acquisition is to measure physical phenomena such as voltage, current, temperature, pressure, or sound. PC-based data acquisition is carried out through a combination of modular hardware, application software, and a PC. Although data acquisition systems are defined differently depending on the needs of the application, every system collects, analyzes, and displays information for the same purpose.
A data acquisition system integrates signals, sensors, actuators, signal conditioning, data acquisition hardware, and application software.
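As a concrete illustration of PC-based data acquisition, the minimal sketch below reads a single voltage sample from an analog input channel. It assumes National Instruments hardware and the nidaqmx Python package; the device and channel name "Dev1/ai0" is a placeholder, not something taken from this article.

```python
# Minimal PC-based data acquisition sketch (assumes NI hardware + the nidaqmx package).
# "Dev1/ai0" is a placeholder device/channel name; adjust it to your hardware.
import nidaqmx

with nidaqmx.Task() as task:
    # Configure one analog-input voltage channel on the DAQ device.
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    # Read a single voltage sample from that channel.
    voltage = task.read()
    print(f"Measured voltage: {voltage:.4f} V")
```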
-
The purpose of data collection is to obtain specific data for analysis, research, decision-making, and so on. Through data collection, a large amount of data can be gathered, from which patterns, trends, and correlations can be discovered, providing strong support for enterprises, academic research, market research, and more. Data collection can help users obtain many types of data, including text and other formats, to meet the needs of different fields and industries.
As a powerful data collection tool, Octopus Collector can help users collect data quickly and efficiently. It provides intelligent identification and flexible, customizable collection rules, so users can easily obtain the data they need. Octopus collectors are widely used in scientific research, market research, public opinion monitoring, bidding, and other fields, providing users with strong data support. Octopus has prepared a series of concise, easy-to-understand tutorials to help you quickly master collection skills and handle all kinds of web data collection; see the tutorials and help section on the official website for details.
-
Common data collection methods include questionnaire surveys, consulting reference materials, field investigation, and experiments.
1. Questionnaire survey: The questionnaire is the most commonly used method of data collection because its cost is relatively low and the information obtained is comparatively comprehensive.
2. Consulting reference materials: This is the oldest way of collecting data; you can obtain the data you want by consulting books, records, and other materials.
3. Field investigation: Field investigation means going to a designated place to do research, conducting an intuitive, on-the-spot, and detailed examination in order to understand the truth of a matter and how a situation has developed.
4. Experiment: The advantage of experimental data collection is that the accuracy of the data is very high; the disadvantage is high uncertainty, both in how long the experiment takes and in what results it produces.
-
Data collection methods fall into two categories, online collection and offline collection. Below is a brief introduction to each collection method and the related technologies.
1. Online collection.
1) Open data.
Open data refers to the data that is open to everyone on the Internet, including data that is open to specific industries, data that is publicly available at all levels, and data related to content on web pages.
To obtain open data, we can use crawler technology, and here is a brief introduction to crawler technology.
Crawler technology lets developers collect relevant data on the Internet automatically and systematically; crawlers are not the producers of content, only the porters of content. Learning materials on crawler technology are piled up all over the Internet, so they will not be repeated here. What is worth discussing is the safety and compliance side of crawling: we must abide by the relevant laws and remember not to cross the red lines below (a minimal polite-crawler sketch follows this list).
a. Personal information, commercial secrets, and state secrets are the red lines of data crawling.
b. Abide by professional ethics, control the frequency of crawler visits, and do not interfere with the normal business activities of the crawled website.
c. Abide by the robots protocol: crawl only what it allows.
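As a minimal sketch of a "polite" crawler that respects these rules, the Python snippet below checks robots.txt before fetching a page and throttles its request rate. The target URLs, user agent, and delay are placeholders, not values taken from this article.

```python
# A minimal "polite" crawler sketch: checks robots.txt and throttles requests.
# example.com and the 2-second delay are placeholders; adjust for your target site.
import time
import urllib.robotparser

import requests

USER_AGENT = "my-research-bot"

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # download and parse the site's robots.txt

urls = ["https://example.com/page1", "https://example.com/page2"]
for url in urls:
    if not rp.can_fetch(USER_AGENT, url):
        print(f"robots.txt disallows {url}, skipping")
        continue
    resp = requests.get(url, headers={"User-Agent": USER_AGENT}, timeout=10)
    print(url, resp.status_code, len(resp.text))
    time.sleep(2)  # keep the visit frequency low so the site's normal operation is not disturbed
```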
2) Third-party platform data.
For example, if a developer wants to obtain relevant financial data, in addition to using crawler technology, we can retrieve relevant data through the API interface provided by a third-party platform.
I once received a task to find all road sections in a city where motor vehicles are prohibited from turning left, turning right, or making a U-turn. When accurate data cannot be obtained directly, we can set start and end points at an intersection through the API of a map open platform such as AutoNavi, and infer whether left turns, right turns, or U-turns are prohibited at that intersection by comparing the planned route distances for driving and for walking. Each function has a corresponding service document explaining how to use it; you can try it yourself if you are interested.
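The sketch below illustrates the idea under stated assumptions: it calls the AutoNavi (Amap) Web Service driving and walking route-planning endpoints and compares the returned distances; if the driving route is much longer than the walking route, a turn restriction is likely. The endpoint URLs and response fields follow the publicly documented Amap Web Service API but should be verified against the official service documents, and the API key and coordinates are placeholders.

```python
# Compare driving vs. walking route distance between two points around an intersection.
# Endpoints/fields follow the publicly documented Amap (AutoNavi) Web Service API;
# verify against the official service docs. KEY and the coordinates are placeholders.
import requests

KEY = "your-amap-key"
ORIGIN = "116.481028,39.989643"       # lon,lat just before the intersection
DESTINATION = "116.465302,39.990464"  # lon,lat on the branch after the turn

def route_distance(mode: str) -> float:
    """Return the planned route distance in meters for 'driving' or 'walking'."""
    url = f"https://restapi.amap.com/v3/direction/{mode}"
    params = {"origin": ORIGIN, "destination": DESTINATION, "key": KEY}
    data = requests.get(url, params=params, timeout=10).json()
    return float(data["route"]["paths"][0]["distance"])

driving = route_distance("driving")
walking = route_distance("walking")
print(f"driving: {driving:.0f} m, walking: {walking:.0f} m")
# Heuristic: if the driving route detours far beyond the walking route,
# the direct turn at this intersection is probably prohibited.
if driving > 3 * walking:
    print("Turn likely prohibited at this intersection.")
```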
3) Physical data.
Physical data refers to data generated by users in the physical world. For example, when a user uses a mobile phone, its various sensors produce data: the fingerprint sensor records the user's fingerprint for unlocking the phone or making payments, and the gyroscope records angular velocity, based on the conservation of angular momentum, for navigation and similar functions.
Compared with everyday applications, physical data exists in large quantities in traditional manufacturing, where data is generally collected in the following ways:
Sensors:
As mentioned above, mobile phones contain many types of sensors, and traditional manufacturing uses even more, covering categories of industrial sensors such as light-sensitive, gas-sensitive, force-sensitive, magnetic-sensitive, and sound-sensitive devices.
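As a rough sketch of how such sensor readings are typically collected, the snippet below polls a sensor at a fixed interval and appends timestamped readings to a CSV file. The read_sensor() function is a hypothetical placeholder standing in for whatever driver or fieldbus call the actual hardware provides.

```python
# Poll a sensor periodically and log timestamped readings to a CSV file.
# read_sensor() is a hypothetical placeholder for the real driver/fieldbus call.
import csv
import random
import time
from datetime import datetime

def read_sensor() -> float:
    # Placeholder: replace with the actual call to your sensor or its driver.
    return 20.0 + random.random()  # e.g. a simulated temperature in degrees Celsius

with open("sensor_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "value"])
    for _ in range(10):                # collect 10 samples here; loop indefinitely in practice
        value = read_sensor()
        writer.writerow([datetime.now().isoformat(), value])
        f.flush()                      # make each sample durable as soon as it is taken
        time.sleep(1.0)                # sampling interval of one second
```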
-
1. Questionnaire survey
The structure of the questionnaire refers to the order and distribution of questions between groups of questions used for different purposes and between different questionnaires used for the same study.
The steps for designing the overall structure of the questionnaire are as follows: first, classify the variables according to the results of operationalization, identify the independent, dependent, and control variables, and make a list; second, for each variable, design the question or question group according to the form of questioning; third, plan the relationships and overall structure of the questions; finally, design the auxiliary content of the questionnaire.
2. Interviews and surveys
Interview surveys collect data through question-and-answer interaction between interviewers and interviewees and are used in almost all survey activities. The interview method has its own code of conduct: from thorough preparation, to entering the interview smoothly, to controlling it effectively, to bringing it to a close, each link involves certain skills.
3. Observation and investigation
Observational surveys are another method of collecting data, using the observer's sensory organs, such as the eyes, together with other instruments and equipment to gather research data. Preparing before the observation, entering the observation site smoothly, managing the observation process, keeping observation records, and exiting the observation smoothly are all links that demand considerable skill.
4. Literature Survey
First, obtain the literature through searching; second, read the literature obtained; third, annotate, summarize, and excerpt the literature according to the operational indicators of the research question; finally, establish a database of the literature survey.
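As a small illustration of that last step, the sketch below builds a literature-survey database with Python's built-in sqlite3 module. The table layout and sample record are hypothetical, chosen only to show the idea.

```python
# Build a small literature-survey database (hypothetical schema) with sqlite3.
import sqlite3

conn = sqlite3.connect("literature.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS literature (
        id        INTEGER PRIMARY KEY,
        title     TEXT,
        authors   TEXT,
        year      INTEGER,
        indicator TEXT,   -- operational indicator of the research question
        excerpt   TEXT    -- annotated summary or excerpt
    )
""")
conn.execute(
    "INSERT INTO literature (title, authors, year, indicator, excerpt) VALUES (?, ?, ?, ?, ?)",
    ("Example study", "A. Author", 2020, "response rate", "Key excerpt goes here."),
)
conn.commit()

# Query by operational indicator when analyzing the survey.
for row in conn.execute("SELECT title, year FROM literature WHERE indicator = ?", ("response rate",)):
    print(row)
conn.close()
```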
5. Trace investigation
Big data research is also about grasping the relational patterns between things. In social surveys and research, investigating big data is mostly a matter of selecting data from the big data, and the research hypotheses and variables still need to be operationalized before the survey.
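To make the "selecting data from big data" step concrete, the sketch below uses pandas to filter a trace-data log by an operationalized variable and aggregate it. The file name and column names are hypothetical examples, not taken from this article.

```python
# Select and aggregate trace data by an operationalized variable (hypothetical columns).
import pandas as pd

# Load a trace log; "trace_log.csv", "user_id", "event", and "timestamp" are placeholder names.
df = pd.read_csv("trace_log.csv", parse_dates=["timestamp"])

# Operationalize "engagement" as the number of 'purchase' events per user per day.
purchases = df[df["event"] == "purchase"]
engagement = (
    purchases.groupby(["user_id", purchases["timestamp"].dt.date])
    .size()
    .rename("daily_purchases")
    .reset_index()
)
print(engagement.head())
```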
-
Data acquisition technology is widely used in many fields; cameras and microphones, for example, are both data collection tools.
Generally speaking, data collection should gather as much data as possible from data sources such as target objects, devices, and services, and then transmit and aggregate the collected data, in the required form, to a designated area for storage, laying the foundation for later data mining and analysis.
To configure collection rules in Octopus, you can use the intelligent recognition function to let the tool automatically identify the data structure of an e-commerce page, or set the collection rules manually.
For most manufacturing enterprises, automatic data acquisition from measuring instruments has always been troublesome. Even when an instrument has an RS232, RS485, or similar interface, measurements are often still read by hand, recorded on paper, and finally typed into a PC for processing. This is not only heavy work but also cannot guarantee the accuracy of the data, and the figures that reach managers often lag a day or two behind reality. How to collect on-site defective-product information and related output data efficiently, concisely, and in real time is therefore a major problem.
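As a minimal sketch of automating that step, the snippet below reads lines of measurement data from an instrument's RS232 serial port using the pyserial package. The port name, baud rate, and the assumption that the instrument emits one ASCII reading per line are all placeholders to adapt to the actual device.

```python
# Read measurement lines from an instrument over RS232 using pyserial.
# Port name, baud rate, and the one-reading-per-line format are placeholder assumptions.
import serial  # pip install pyserial

with serial.Serial(port="COM3", baudrate=9600, timeout=2) as ser:
    for _ in range(5):                      # read five measurements as a demo
        raw = ser.readline()                # one reading per line, newline-terminated (assumed)
        if not raw:
            continue                        # timeout: no data received
        try:
            value = float(raw.decode("ascii").strip())
        except ValueError:
            continue                        # skip malformed lines
        print("measurement:", value)
```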
There are many web collection tools, but they are generally difficult to use; if you cannot write programs, you probably cannot use them at all. The recently released Octopus Collector is fairly simple and mostly takes just a few mouse clicks.
Sending data. The send function only sends data; it requires the socket descriptor, the data to be sent, and its size.
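As a small sketch of transmitting collected data to a designated storage server, the snippet below uses Python's socket module with sendall, the Python counterpart of the send call described above. The server address and the CSV-line payload are hypothetical.

```python
# Send a collected reading to a central server over TCP (hypothetical host/port/payload).
import socket

reading = "2024-01-01T12:00:00,23.4\n"          # e.g. timestamp,value as one CSV line
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.connect(("192.0.2.10", 9000))          # placeholder collector address
    sock.sendall(reading.encode("utf-8"))       # sendall keeps sending until every byte is out
```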