I have this very simple PHP script:
<?php
require 'functions.php';
$token = "some-number";
$id = "other-number";
$albums = get_url_contents("https://graph.facebook.com/".$id."/albums?access_token=".$token);
$aObject = json_decode($albums, true);
foreach ($aObject['data'] as $i => $a) {
    $photos = get_url_contents("https://graph.facebook.com/".$a['id']."/photos?access_token=".$token);
    $bObject = json_decode($photos, true);
    foreach ($bObject['data'] as $y => $b) {
        if (strpos($b['name'], "#test1") !== false) {
            echo($b['name']."<br>".$b['source']."<br>".$b['created_time']."<br>");
        }
    }
}
?>
The execution time is always more than 10 seconds. Is there any way to notify the user with a "please wait" message or something?
OK, I learned something new.
It is possible. Have a look here: http://bytes.com/topic/php/answers/5153-status-note-scripts-run-long-time
You could display a message at the beginning of the script's execution and hide it via JavaScript at the end of the execution.
I hope you understand what I mean.
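For instance, a rough sketch of that approach wrapped around the loop above (whether the notice actually shows up early depends on output buffering and any compression between PHP and the browser):
<?php
// Show a "please wait" notice immediately and push it to the browser.
echo '<div id="wait-msg">Fetching photos, please wait...</div>';
if (ob_get_level() > 0) {
    ob_flush();            // flush PHP's output buffer if one is active
}
flush();                   // ask the web server to send what we have so far

// ... the slow Graph API loop from the question runs here ...

// Hide the notice once the work is finished.
echo '<script>document.getElementById("wait-msg").style.display = "none";</script>';
?>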
How about an asynchronous load with JS or jQuery?
My script works most of the time, but every 8th try or so I get an error. I'll try to explain. This is the error I get (or similar):
{"gameName":"F1 2011","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/81163/movie_max.webm?t=1447354814","gameId":"44360","finalPrice":1499,"genres":"Racing"}
{"gameName":"Starscape","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/900679/movie_max.webm?t=1447351523","gameId":"20700","finalPrice":999,"genres":"Action"}
Warning: file_get_contents(http://store.steampowered.com/api/appdetails?appids=400160): failed to open stream: HTTP request failed! in C:\xampp\htdocs\GoStrap\game.php on line 19
{"gameName":"DRAGON: A Game About a Dragon","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/2038811/movie_max.webm?t=1447373449","gameId":"351150","finalPrice":599,"genres":"Adventure"}
{"gameName":"Monster Mash","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/900919/movie_max.webm?t=1447352342","gameId":"36210","finalPrice":449,"genres":"Casual"}
I'm making an application that fetches information on a random Steam game from the Steam store. It's quite simple.
The script takes a (somewhat) random ID from a text file (this part works for sure).
The ID is appended to the API URL, and file_get_contents fetches the response, which is then JSON-decoded (this might somehow be the problem).
It searches for my specified data. The final price and movie webm are not always there, hence the if(!isset()) check.
It decides the final price and ships it back to the AJAX call on index.php.
The output above suggests that I get the data I need in four cases and an error once. I only want to receive ONE JSON string and return it, and only in case $game['gameTrailer'] and $game['final_price'] are set.
This is the PHP (it's not great, be kind):
<?php
// Run the script on AJAX call
if (isset($_POST)) {
    fetchGame();
}

function fetchGame() {
    $gameFound = false;
    while (!$gameFound) {
        ////////// ID-picker //////////
        $f_contents = file("steam.txt");
        $url = $f_contents[mt_rand(0, count($f_contents) - 1)];
        $answer = explode('/', $url);
        $gameID = $answer[4];
        $trimmed = trim($gameID);

        ////////// Fetch game //////////
        $json = file_get_contents('http://store.steampowered.com/api/appdetails?appids='.$trimmed);
        $game_json = json_decode($json, true);
        if (!isset($game_json[$trimmed]['data']['movies'][0]['webm']['max']) || !isset($game_json[$trimmed]['data']['price_overview']['final'])) {
            continue;
        }
        $gameFound = true;

        ////////// Store variables //////////
        $game['gameName'] = $game_json[$trimmed]['data']['name'];
        $game['gameTrailer'] = $game_json[$trimmed]['data']['movies'][0]['webm']['max'];
        $game['gameId'] = $trimmed;
        $game['free'] = $game_json[$trimmed]['data']['is_free'];
        $game['price'] = $game_json[$trimmed]['data']['price_overview']['final'];
        $game['genres'] = $game_json[$trimmed]['data']['genres'][0]['description'];
        if ($game['free'] == TRUE) {
            $game['final_price'] = "Free";
        } elseif ($game['free'] == FALSE || $game['final_price'] != NULL) {
            $game['final_price'] = $game['price'];
        } else {
            $game['final_price'] = "-";
        }
    }

    ////////// Return to AJAX (index.php) //////////
    echo json_encode(array(
        'gameName' => $game['gameName'],
        'gameTrailer' => $game['gameTrailer'],
        'gameId' => $game['gameId'],
        'finalPrice' => $game['final_price'],
        'genres' => $game['genres'],
    ));
}
?>
Any help will be appreciated. Like, are there obvious reasons as to why this is happening? Is there a significantly better way? Why is it re-iterating at least 4 times when it seems to have fetched the data I need? Sorry if this post is long, I'm just trying to be detailed with a lacking PHP/JSON vocabulary.
Kind regards, John
EDIT:
Sometimes it returns no error, just multiple objects:
{"gameName":"Prime World: Defenders","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/2028642/movie_max.webm?t=1447357836","gameId":"235360","finalPrice":899,"genres":"Casual"}
{"gameName":"Grand Ages: Rome","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/5190/movie_max.webm?t=1447351683","gameId":"23450","finalPrice":999,"genres":"Simulation"}
I don't know how to do this.
There is an XML API server and I'm getting its contents with cURL; that works fine. Now I have to poll the creditCardPreprocessors state. It has an 'in progress' state too, and PHP should wait until the progress is finished. I've already tried sleep and other approaches, but I can't make it work. This is a simplified example variation of what I tried:
function process_state($xml) {
    if ($result = request($xml)) {
        // It'll return NULL on a bad state, for example
        return $result;
    }
    sleep(3);
    return process_state($xml);
}
I know this can be an infinite loop, but I've tried adding a counter to exit when it reaches five; it won't exit, the server hangs, I get 500 errors for minutes, and Apache becomes unreachable for that vhost.
EDIT:
Another example
$i = 0;
$card_state = false;
// We're going to assume request() returns NULL if the card state is processing, TRUE if it's done
while (!$card_state && $i < 10) {
    $i++;
    if ($result = request('XML STUFF')) {
        $card_state = $result;
        break;
    }
    sleep(2);
}
The recursive method you've defined could cause problems depending on the response timing you get back from the server. I think you'd want to use a while loop here. It keeps the requests serialized.
$returnable_responses = array('code1', 'code2', 'code3'); // the array of responses that you want the function to stop after receiving
$max_number_of_calls = 5; // or some number
$iterator = 0;
$result = NULL;
while (!in_array($result, $returnable_responses) && ($iterator < $max_number_of_calls)) {
    $result = request($xml);
    $iterator++;
}
Hi everyone once again!
We need some help developing and implementing multi-curl functionality in our crawler. We have a huge array of "links to be scanned" and we loop through them with a foreach.
Let's use some pseudo code to understand the logic:
1) While ($links_to_be_scanned > 0).
2) Foreach ($links_to_be_scanned as $link_to_be_scanned).
3) Scan_the_link() and run some other functions.
4) Extract the new links from the xdom.
5) Push the new links into $links_to_be_scanned.
6) Push the current link into $links_already_scanned.
7) Remove the current link from $links_to_be_scanned.
Now, we need to define a maximum number of parallel connections and be able to run this process for each link in parallel.
I understand that we're gonna have to create a $links_being_scanned or some kind of queue.
I'm really not sure how to approach this problem to be honest, if anyone could provide some snippet or idea to solve it, it would be greatly appreciated.
Thanks in advance!
Chris;
Extended:
I just realized that the multi-curl itself is not the tricky part, but the amount of operations done with each link after the request.
Even with multi-curl, I would eventually have to find a way to run all these operations in parallel. The whole algorithm described below would have to run in parallel.
So, rethinking it now, we would have to do something like this:
While (there are links to be scanned)
    Foreach ($links_to_be_scanned as $link)
        If (there are fewer than 10 scanners running)
            Launch_a_new_scanner($link)
            Remove the link from the $links_to_be_scanned array
            Push the link into the $links_on_queue array
        Endif;
And each scanner does (This should be run in parallel):
Create an object with the given link
Send a curl request to the given link
Create a dom and an Xdom with the response body
Perform other operations over the response body
Remove the link from the $links_on_queue array
Push the link into the $links_already_scanned array
I assume we could approach this by creating a new PHP file with the scanner algorithm and using pcntl_fork() for each parallel process?
Since even using multi-curl, I would eventually have to wait in a regular foreach loop for the other processes.
I assume I would have to approach this using fsockopen or pcntl_fork.
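For what it's worth, a bare-bones sketch of the pcntl_fork() idea (this assumes the CLI SAPI with the pcntl extension, and scan_the_link() is a placeholder for whatever the scanner actually does for one link):
<?php
// Sketch: fork one worker per link, never more than $max_children at a time.
$max_children = 10;
$running      = 0;

foreach ($links_to_be_scanned as $link) {
    // If we're at the limit, block until one child exits.
    while ($running >= $max_children) {
        pcntl_wait($status);
        $running--;
    }

    $pid = pcntl_fork();
    if ($pid === -1) {
        die('could not fork');
    } elseif ($pid === 0) {
        // Child process: run the scanner for this link, then exit.
        scan_the_link($link);   // placeholder for the real scanner routine
        exit(0);
    }
    // Parent process: count the new child and move on to the next link.
    $running++;
}

// Reap any children that are still running.
while ($running-- > 0) {
    pcntl_wait($status);
}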
Suggestions, comments, partial solutions, and even a "good luck" will be more than appreciated!
Thanks a lot!
DISCLAIMER: This answer links an open-source project with which I'm involved. There. You've been warned.
The Artax HTTP client is a socket-based HTTP library that (among other things) offers custom control over the number of concurrent open socket connections to individual hosts while making multiple asynchronous HTTP requests.
Limiting the number of concurrent connections is easily accomplished. Consider:
<?php
use Artax\Client, Artax\Response;

require dirname(__DIR__) . '/autoload.php';

$client = new Client;

// Defaults to max of 8 concurrent connections per host
$client->setOption('maxConnectionsPerHost', 2);

$requests = array(
    'so-home'   => 'http://stackoverflow.com',
    'so-php'    => 'http://stackoverflow.com/questions/tagged/php',
    'so-python' => 'http://stackoverflow.com/questions/tagged/python',
    'so-http'   => 'http://stackoverflow.com/questions/tagged/http',
    'so-html'   => 'http://stackoverflow.com/questions/tagged/html',
    'so-css'    => 'http://stackoverflow.com/questions/tagged/css',
    'so-js'     => 'http://stackoverflow.com/questions/tagged/javascript'
);

$onResponse = function($requestKey, Response $r) {
    echo $requestKey, ' :: ', $r->getStatus();
};

$onError = function($requestKey, Exception $e) {
    echo $requestKey, ' :: ', $e->getMessage();
};

$client->requestMulti($requests, $onResponse, $onError);
IMPORTANT: In the above example the Client::requestMulti method makes all the specified requests asynchronously. Because the per-host concurrency limit is set to 2, the client will open new connections for the first two requests and subsequently reuse those same sockets for the other requests, queuing requests until one of the two sockets becomes available.
You could try something like this. I haven't checked it, but you should get the idea:
$request_pool = array();

function CreateHandle($url) {
    $handle = curl_init($url);
    // set curl options here
    return $handle;
}

function Process($data) {
    global $request_pool;
    // do something with data
    array_push($request_pool, CreateHandle($some_new_url));
}

function RunMulti() {
    global $request_pool;
    $multi_handle = curl_multi_init();
    $active_request_pool = array();
    $running = 0;
    $active_request_count = 0;
    $active_request_max = 10; // adjust as necessary
    do {
        $waiting_request_count = count($request_pool);
        while (($active_request_count < $active_request_max) && ($waiting_request_count > 0)) {
            $request = array_shift($request_pool);
            curl_multi_add_handle($multi_handle, $request);
            $active_request_pool[(int)$request] = $request;
            $waiting_request_count--;
            $active_request_count++;
        }
        curl_multi_exec($multi_handle, $running);
        curl_multi_select($multi_handle);
        while ($info = curl_multi_info_read($multi_handle)) {
            $curl_handle = $info['handle'];
            call_user_func('Process', curl_multi_getcontent($curl_handle));
            curl_multi_remove_handle($multi_handle, $curl_handle);
            curl_close($curl_handle);
            $active_request_count--;
        }
    } while ($active_request_count > 0 || $waiting_request_count > 0);
    curl_multi_close($multi_handle);
}
You should look for a more robust solution to your problem. RabbitMQ is a very good solution that I have used. There is also Gearman, but I think that's your choice to make.
I prefer RabbitMQ.
I will share with you the code I have used to collect email addresses from a certain website.
You can modify it to fit your needs.
There were some problems with relative URLs there.
And I do not use cURL here.
<?php
error_reporting(E_ALL);

$home = 'http://kharkov-reklama.com.ua/jborudovanie/';
$writer = new RWriter('C:\parser_13-09-2012_05.txt');

set_time_limit(0);
ini_set('memory_limit', '512M');

function scan_page($home, $full_url, &$writer) {
    static $done = array();
    $done[] = $full_url;

    // Scan only internal links. Do not scan all the internet!))
    if (strpos($full_url, $home) === false) {
        return false;
    }
    $html = @file_get_contents($full_url);
    if (empty($html) || (strpos($html, '<body') === false && strpos($html, '<BODY') === false)) {
        return false;
    }
    echo $full_url . '<br />';
    preg_match_all('/([A-Za-z0-9_\-]+\.)*[A-Za-z0-9_\-]+@([A-Za-z0-9][A-Za-z0-9\-]*[A-Za-z0-9]\.)+[A-Za-z]{2,4}/', $html, $emails);
    if (!empty($emails) && is_array($emails)) {
        foreach ($emails as $email_group) {
            if (is_array($email_group)) {
                foreach ($email_group as $email) {
                    if (filter_var($email, FILTER_VALIDATE_EMAIL)) {
                        $writer->write($email);
                    }
                }
            }
        }
    }
    $regexp = "<a\s[^>]*href=(\"??)([^\" >]*?)\\1[^>]*>(.*)<\/a>";
    preg_match_all("/$regexp/siU", $html, $matches, PREG_SET_ORDER);
    if (is_array($matches)) {
        foreach ($matches as $match) {
            if (!empty($match[2]) && is_scalar($match[2])) {
                $url = $match[2];
                if (!filter_var($url, FILTER_VALIDATE_URL)) {
                    $url = $home . $url;
                }
                if (!in_array($url, $done)) {
                    scan_page($home, $url, $writer);
                }
            }
        }
    }
}

class RWriter {
    private $_fh = null;
    private $_written = array();

    public function __construct($fname) {
        $this->_fh = fopen($fname, 'w+');
    }

    public function write($line) {
        if (in_array($line, $this->_written)) {
            return;
        }
        $this->_written[] = $line;
        echo $line . '<br />';
        fwrite($this->_fh, "{$line}\r\n");
    }

    public function __destruct() {
        fclose($this->_fh);
    }
}

scan_page($home, 'http://kharkov-reklama.com.ua/jborudovanie/', $writer);
Morning. I want to extract all the segments of PHP code from a file located on my local server. The problem is I don't seem to be getting anywhere; no PHP errors, just browser errors.
$file_contents = "<xmp>".file_get_contents("../www.cms.actwebdesigns.co.uk2/pageIncludes/instalation/selectMainPages.php")."</xmp>";
if(preg_match_all("#<\?php((?!\?>).)*#is", $file_contents, $matches))
{
foreach($matches[0] as $phpCode)
{
$code = "<xmp>".$phpCode."\n?></xmp>";
}
}
echo "dsds";
?>
Could someone please point me in the right direction?
I'm now working with this:
$file_contents = token_get_all(file_get_contents("../www.cms.actwebdesigns.co.uk2/logged.php"));
$start = 0;
$end = 0;
$segmentArray = array();
foreach ($file_contents as $key => $token) {
    // Tokens are either arrays (id, text, line) or plain one-character strings.
    $tokenName = is_array($token) ? token_name($token[0]) : $token;
    if ($start == 0 && $end == 0 && $tokenName == "T_OPEN_TAG") {
        $start = 1;
    }
    if ($start == 1 && $end == 0 && $tokenName != "T_CLOSE_TAG") {
        $entryNo = count($segmentArray);
        $segmentArray[$entryNo][] = $token;
    }
    if ($tokenName == "T_CLOSE_TAG") {
        $start = 0;
    }
}
You might want to tokenize the PHP script using the Tokenizer extension:
http://php.net/manual/en/book.tokenizer.php
The extension has been built into PHP since v4.3.0.
$tokens = token_get_all(file_get_contents($file));
http://www.php.net/manual/en/function.token-get-all.php
Not sure how to use this. It puts all the code into an array. For me to use it, wouldn't I have to implode it or something? Then I'm back to square one.
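For reference, a minimal sketch of one way to rebuild each PHP segment as a string straight from that token array (the file name is just a placeholder): token_get_all() returns a mix of arrays (token id, text, line number) and bare one-character strings, so each entry needs a check before its text is concatenated.
<?php
// Sketch: collect every PHP segment (open tag through close tag) from a file as a string.
$tokens = token_get_all(file_get_contents('logged.php')); // placeholder file name

$segments = array();
$current  = '';
$inPhp    = false;

foreach ($tokens as $token) {
    $id   = is_array($token) ? $token[0] : null;
    $text = is_array($token) ? $token[1] : $token;

    if ($id === T_OPEN_TAG) {
        $inPhp   = true;
        $current = $text;              // start a new segment with the open tag
    } elseif ($id === T_CLOSE_TAG) {
        $current   .= $text;           // include the close tag and finish the segment
        $segments[] = $current;
        $current    = '';
        $inPhp      = false;
    } elseif ($inPhp) {
        $current .= $text;             // accumulate everything between the tags
    }
}

if ($inPhp) {
    $segments[] = $current;            // the file ended without a closing tag
}

echo '<xmp>' . implode("\n\n", $segments) . '</xmp>';
Because the tokens preserve the original text verbatim, concatenating them like this reproduces the source exactly, without falling back to regular expressions.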
I was following this tutorial.
I need to use a PHP file's output in my HTML file to dynamically load images into a gallery. I call
function setOutput()
{
    if (httpObject.readyState == 4)
        document.getElementById('main').src = httpObject.responseText;
    alert("set output: " + httpObject.responseText);
}
from
function doWork()
{
    httpObject = getHTTPObject();
    if (httpObject != null) {
        httpObject.open("GET", "gallery.php?no=0", true);
        httpObject.send(null);
        httpObject.onreadystatechange = setOutput;
    }
}
However, the alert shows the PHP file itself, word for word. It's probably a really stupid error, but I can't seem to find it.
The PHP file:
<?php
if (isset($_GET['no'])) {
    $no = $_GET['no'];
    if ($no <= 10 && $no > 1) {
        $xml = simplexml_load_file('gallery.xml');
        echo "images/" . $xml->image[$no]->src;
    } else {
        die("Number isn't between 1 and 10");
    }
} else {
    die("No number set.");
}
?>
If the alert is returning the contents of the PHP file instead of the results of executing it, then the server is not executing it.
Test by accessing the URI directly (instead of going via JavaScript).
You probably need to configure PHP support on the server.
Your server doesn't serve/parse PHP files! You could test your JavaScript code by setting the content of gallery.php to the HTML code you want to receive.