How Can I Speed This Up?
Solution 1:
It's probably the getimagesize function - it goes out and fetches every image on the page so it can determine the size. You might be able to write something with curl that requests only the headers and reads Content-Length (though getimagesize may already be doing something similar under the hood).
At any rate, I wrote a few spiders back in the day, and this kind of thing is just slow: even with internet speeds better than ever, it's still a separate fetch for each element. And I wasn't even fetching images.
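As a sketch of the header-only idea: a curl HEAD request skips the image body entirely and reads Content-Length from the response headers. Note the trade-off, which is an assumption about what the page needs - a HEAD request yields the byte size only, whereas getimagesize returns pixel dimensions, so this only helps if file size is all you're after. The function names here are hypothetical, not from the original code.

```php
<?php
// Pull the Content-Length value out of a raw HTTP header block.
function parse_content_length($raw_headers) {
    if (preg_match('/^Content-Length:\s*(\d+)/mi', $raw_headers, $m)) {
        return (int)$m[1];
    }
    return null; // server did not send a length
}

// Hypothetical alternative to getimagesize(): a HEAD request that
// transfers only the response headers, never the image body.
function remote_content_length($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD: no body transfer
    curl_setopt($ch, CURLOPT_HEADER, true);         // include headers in output
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return instead of echo
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // chase redirects
    $raw = curl_exec($ch);
    curl_close($ch);
    return $raw === false ? null : parse_content_length($raw);
}
```

One request per image is still one request per image, but each one now moves a few hundred bytes of headers instead of the full file.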
Solution 2:
I'm not a PHP guy, but it looks to me like you're going out to the web to get the file twice...
First using this:
$html = file_get_html($url);
Then again using this:
$file_string = file_get_contents($url);
So if each hit takes a couple of seconds, you might be able to roughly halve your runtime by finding a way to cut this down to a single web hit.
Either that, or I'm blind. Which is a real possibility!
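If the code is using the Simple HTML DOM library (a guess, based on the file_get_html call), that library also exposes str_get_html, which parses an HTML string you already have. That lets you download the page once and feed the same string to both consumers - a sketch, assuming that library and a placeholder URL:

```php
<?php
// Assumes the Simple HTML DOM Parser library, which provides
// file_get_html() and str_get_html().
include_once 'simple_html_dom.php';

$url = 'http://example.com/';            // placeholder URL

// One HTTP request instead of two:
$file_string = file_get_contents($url);  // raw HTML as a string
$html = str_get_html($file_string);      // DOM parsed from that same string

// $file_string and $html now both come from a single fetch,
// replacing the separate file_get_html($url) / file_get_contents($url) calls.
```

The design point is just deduplication: fetch the bytes once, then derive every representation you need (raw string, parsed DOM) from that one copy.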