A Novel Web Search Strategy Based on a Web Page Block Granularity Analysis Algorithm and a Correlation Calculation Model

In this paper, the running parameters of crawler nodes are adjusted dynamically to make the nodes more manageable and configurable. These parameters include the crawling speed, the crawling depth, and the number of crawler threads. The crawler continuously traverses the Internet and collects web pages; the indexer then parses, extracts, organizes, and processes the collected information to build the index database. On top of this database, the search engine provides a web information query service for users. The paper presents a novel web search strategy based on a web page block granularity analysis algorithm and a correlation calculation model.
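The dynamic adjustment of crawler-node parameters described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class and parameter names (`CrawlerConfig`, `delay_seconds`, `max_depth`, `num_threads`) are assumptions chosen to mirror the three running parameters the abstract lists (crawling speed, crawling depth, number of threads).

```python
from dataclasses import dataclass


@dataclass
class CrawlerConfig:
    """Hypothetical running parameters of one crawler node (names assumed)."""
    delay_seconds: float = 1.0   # crawling speed: pause between requests
    max_depth: int = 3           # crawling depth limit
    num_threads: int = 4         # number of crawler threads


class CrawlerNode:
    """Sketch of a crawler node whose parameters can change at run time."""

    def __init__(self, config: CrawlerConfig):
        self.config = config

    def update_config(self, **changes) -> CrawlerConfig:
        # Dynamically adjust running parameters without restarting the node,
        # rejecting names that are not known configuration fields.
        for key, value in changes.items():
            if not hasattr(self.config, key):
                raise KeyError(f"unknown parameter: {key}")
            setattr(self.config, key, value)
        return self.config


node = CrawlerNode(CrawlerConfig())
node.update_config(delay_seconds=0.5, num_threads=8)
print(node.config.delay_seconds, node.config.max_depth, node.config.num_threads)
```

In a deployment with many crawler nodes, a coordinator could push such `update_config` calls to individual nodes, which is one plausible way to realize the manageability and configurability the paper claims.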