In real-world applications we often run into special cases: a site needs content such as news or weather forecasts, but a personal or small site rarely has the manpower, materials, or money to produce all of it itself. What can we do?
Fortunately, the Internet is built on resource sharing. We can use a program to automatically fetch pages from other sites and process them for our own use.
What should we use? PHP actually has this capability built in: the cURL library. Take a look at the code below.
<?php
// Initialize a cURL session for the page we want to capture
$ch = curl_init("http://dailynews.sina.com.cn");

// Open a local file that will receive the downloaded page
$fp = fopen("php_homepage.txt", "w");

// Write the response body to that file and leave out the HTTP headers
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);

// Execute the request, then clean up
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
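If you want to process the page before using it, rather than just saving it to disk, a common variation is to have cURL return the response as a string via CURLOPT_RETURNTRANSFER. The sketch below assumes this approach; the <title> regular expression is only a placeholder for whatever parsing your own target pages actually need.

<?php
// Fetch the page into a string instead of writing it to a file
$ch = curl_init("http://dailynews.sina.com.cn");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
$html = curl_exec($ch);
curl_close($ch);

if ($html !== false) {
    // Placeholder processing step: pull out the page title as a quick check.
    // Replace this with the extraction logic your site really needs.
    if (preg_match('/<title>(.*?)<\/title>/is', $html, $m)) {
        echo "Fetched page title: " . trim($m[1]) . "\n";
    }
}
?>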
Details: http://php.662p.com/thread-504-1-1.html
Assign each website its own permissions, so that even if site A is compromised, the attacker cannot move across to site B. If you are not sure how to set this up, it is recommended to install a host management panel, such as the free N-point host management system; space created through it is automatically given its own set of permissions.