I have this code:
<?php
foreach ($items as $item) {
    $site = $item['link'];
    $id = $item['id'];
    $newdata = $item['data_a'];
    $newdata2 = $item['data_b'];
    // urlencode() the data values so spaces and special characters survive the query string
    $ch = curl_init($site.'updateme.php?id='.$id.'&data1='.urlencode($newdata).'&data2='.urlencode($newdata2));
    curl_exec($ch);
    // do some checking here
    curl_close($ch);
}
?>
Sample input:
$site = 'http://www.mysite.com/folder1/folder2/';
$id = 512522;
$newdata = 'Short string here';
$newdata2 = 'Another short string here with numbers';
Here is the main process of updateme.php:
if (!$id = intval(Tools::getValue('id')))
    $this->_errors[] = Tools::displayError('Invalid ID!');
else
{
    $history = new History();
    $history->id = $id;
    $history->changeState($newdata1, intval($id));
    $history->id_employee = intval($employee->id_employee);
    $carrier = new Carrier(intval($info->id_carrier), intval($info->id_lang));
    $templateVars = array('{delivery}' => ($history->id_data_state == _READY_TO_SEND AND $info->shipping_number) ? str_replace('#', $info->shipping_number, $carrier->url) : '');
    if (!$history->addWithemail(true, $templateVars))
        $this->_errors[] = Tools::displayError('An error occurred while changing status or was unable to send e-mail to the employee');
}
The site will always be changing, and each $items array will have at least 20 entries, so the foreach loop will run at least 20 times, or more depending on the amount of data.
The target site will update its database with the passed variables; the data will probably pass through at least 5 functions before it is saved to the DB, so it could take some time too.
My question is: will there be a problem with this approach? Will the script encounter a timeout error while going through the cURL process? What happens when $items holds around 50 entries, or hundreds?
Or is there a better way to do this?
UPDATES:
* Added the updateme.php main process code. Additional info: updateme.php will also send an email depending on the variables passed.
Right now all of the other sites are hosted on the same server.
You may run into a PHP execution time problem.
As for the cURL timeout, you can "fix" it using the CURLOPT_TIMEOUT option.
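For example, a minimal sketch (the 120-second value here is just an illustrative choice, not something from the original code):

// give each request up to 120 seconds before cURL aborts it
curl_setopt($ch, CURLOPT_TIMEOUT, 120);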
Since the cURL script that calls updateme.php doesn't expect a response, you should make updateme.php return early.
http://gr.php.net/register_shutdown_function
function shutdown() {
    // NOTE: $newdata1, $employee, $info and $this are not in scope inside a plain
    // function; in real code they would have to be passed in or re-read here.
    if (!$id = intval(Tools::getValue('id')))
        $this->_errors[] = Tools::displayError('Invalid ID!');
    else
    {
        $history = new History();
        $history->id = $id;
        $history->changeState($newdata1, intval($id));
        $history->id_employee = intval($employee->id_employee);
        $carrier = new Carrier(intval($info->id_carrier), intval($info->id_lang));
        $templateVars = array('{delivery}' => ($history->id_data_state == _READY_TO_SEND AND $info->shipping_number) ? str_replace('#', $info->shipping_number, $carrier->url) : '');
        if (!$history->addWithemail(true, $templateVars))
            $this->_errors[] = Tools::displayError('An error occurred while changing status or was unable to send e-mail to the employee');
    }
}
register_shutdown_function('shutdown');
exit();
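Note that register_shutdown_function() on its own doesn't always release the HTTP connection early; depending on the SAPI, the caller may still wait for the whole script. A commonly used companion trick is to flush an empty response before the heavy work starts (a sketch; whether it works depends on your server setup):

ignore_user_abort(true);     // keep running even if the caller disconnects
header('Connection: close');
header('Content-Length: 0');
flush();                     // push the (empty) response out to the caller
// ...the heavy work then runs in the registered shutdown function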
You can use set_time_limit(0) (0 means no time limit) to change the timeout of the PHP script execution. CURLOPT_TIMEOUT is the cURL option for setting the timeout, but I believe it's unlimited by default, so you shouldn't need to set that option on your handle.
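For instance, a minimal sketch of where that call would go in the loop script from the question:

set_time_limit(0); // 0 = no execution time limit for this script
foreach ($items as $item) {
    // ...the existing cURL calls...
}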
Related
The code I am using is:
$url = "example.com";
$code = file_get_contents("http://www.".$url);
if (!$code) {
    $code = file_get_contents("https://www.".$url);
}
I don't know whether each URL starts with http or https. I have 1,000 URLs saved in my database. Any suggestions?
This answer is somewhat relevant. Given that your database is rather incomplete in that the URLs aren't fully qualified, I would be inclined to write a short routine to update the database for future use.
<?php
$url = 'example.com';
$variations = array();
$variations[] = 'http://'.$url;
$variations[] = 'https://'.$url;
$variations[] = 'http://www.'.$url;
$variations[] = 'https://www.'.$url; // note the dot after "www"; the original was missing it
foreach ($variations as $v) {
    // Check variation with function from answer linked above
    if (urlOK($v)) {
        // code to update database row
        // or if you don't want to do that
        $code = file_get_contents($v);
        break; // stop at the first variation that works
    }
}
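For completeness, here is a minimal sketch of what a urlOK() check could look like (hypothetical; the linked answer's version may differ), using get_headers() to read the HTTP status line:

function urlOK($url) {
    // @ suppresses the warning for unreachable hosts; get_headers() returns false on failure
    $headers = @get_headers($url);
    // treat any 2xx or 3xx status line as "reachable"
    return $headers !== false && preg_match('#^HTTP/\S+\s+[23]\d\d#', $headers[0]) === 1;
}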
My script works most of the time, but every 8th try or so I get an error. I'll try and explain. This is the error I get (or similar):
{"gameName":"F1 2011","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/81163/movie_max.webm?t=1447354814","gameId":"44360","finalPrice":1499,"genres":"Racing"}
{"gameName":"Starscape","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/900679/movie_max.webm?t=1447351523","gameId":"20700","finalPrice":999,"genres":"Action"}
Warning: file_get_contents(http://store.steampowered.com/api/appdetails?appids=400160): failed to open stream: HTTP request failed! in C:\xampp\htdocs\GoStrap\game.php on line 19
{"gameName":"DRAGON: A Game About a Dragon","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/2038811/movie_max.webm?t=1447373449","gameId":"351150","finalPrice":599,"genres":"Adventure"}
{"gameName":"Monster Mash","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/900919/movie_max.webm?t=1447352342","gameId":"36210","finalPrice":449,"genres":"Casual"}
I'm making an application that fetches information on a random Steam game from the Steam store. It's quite simple.
1. The script takes a (somewhat) random ID from a text file (working for sure).
2. The ID is appended to the API URL, and file_get_contents fetches the file. It then decodes the JSON. (might be the problem somehow)
3. Search for my specified data. Final price & movie webm are not always there, hence the if(!isset()).
4. Decide the final price and ship it back to the AJAX call on index.php.
The output above shows that I get the data I need in 4 cases, and an error once. I only want to receive ONE JSON string and return it, and only in case $game['gameTrailer'] and $game['final_price'] are set.
This is the PHP (it's not great, be kind):
<?php
// Run the script on ajax call
if (isset($_POST)) {
    fetchGame();
}

function fetchGame() {
    $gameFound = false;
    while (!$gameFound) {
        ////////// ID-picker //////////
        $f_contents = file("steam.txt");
        $url = $f_contents[mt_rand(0, count($f_contents) - 1)];
        $answer = explode('/', $url);
        $gameID = $answer[4];
        $trimmed = trim($gameID);

        ////////// Fetch game //////////
        $json = file_get_contents('http://store.steampowered.com/api/appdetails?appids='.$trimmed);
        $game_json = json_decode($json, true);
        if (!isset($game_json[$trimmed]['data']['movies'][0]['webm']['max']) || !isset($game_json[$trimmed]['data']['price_overview']['final'])) {
            continue;
        }
        $gameFound = true;

        ////////// Store variables //////////
        $game['gameName'] = $game_json[$trimmed]['data']['name'];
        $game['gameTrailer'] = $game_json[$trimmed]['data']['movies'][0]['webm']['max'];
        $game['gameId'] = $trimmed;
        $game['free'] = $game_json[$trimmed]['data']['is_free'];
        $game['price'] = $game_json[$trimmed]['data']['price_overview']['final'];
        $game['genres'] = $game_json[$trimmed]['data']['genres'][0]['description'];
        if ($game['free'] == TRUE) {
            $game['final_price'] = "Free";
        } elseif ($game['free'] == FALSE || $game['final_price'] != NULL) {
            $game['final_price'] = $game['price'];
        } else {
            $game['final_price'] = "-";
        }
    }

    ////////// Return to AJAX (index.php) //////////
    echo json_encode(array(
        'gameName' => $game['gameName'],
        'gameTrailer' => $game['gameTrailer'],
        'gameId' => $game['gameId'],
        'finalPrice' => $game['final_price'],
        'genres' => $game['genres'],
    ));
}
?>
Any help will be appreciated. Like, are there obvious reasons why this is happening? Is there a significantly better way? Why does it re-iterate at least 4 times when it seems to have fetched the data I need? Sorry if this post is long, I'm just trying to be detailed with a lacking PHP/JSON vocabulary.
Kind regards, John
EDIT:
Sometimes it returns no error, just multiple objects:
{"gameName":"Prime World: Defenders","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/2028642/movie_max.webm?t=1447357836","gameId":"235360","finalPrice":899,"genres":"Casual"}
{"gameName":"Grand Ages: Rome","gameTrailer":"http://cdn.akamai.steamstatic.com/steam/apps/5190/movie_max.webm?t=1447351683","gameId":"23450","finalPrice":999,"genres":"Simulation"}
I have a website that pulls prices from an API. The problem is that if you send more than ~10 requests to this API in a short amount of time, your IP gets blocked temporarily (I'm not sure if this is just a problem on localhost or if it would also be a problem from the webserver; I assume the latter).
The request to the API returns a JSON object, which I then parse and store certain parts of into my database. There are about 300 entries in the database, so I need to make ~300 requests to this API.
I will end up having a cron job that updates all of the prices from the API every x hours. The job calls a PHP script I have that does all of the requests and DB handling.
Is there a way to have the script send the requests over a longer period of time, rather than all at once? The problem I'm running into is that after ~20 or so requests the IP gets blocked, and the next 50 or so requests after that get no data returned.
I looked into sleep(), but read that it will just buffer the results and wait, rather than waiting after each request.
Here is the script that the cron job will be calling:
define('HTTP_NOT_FOUND', false);
define('HTTP_TIMEOUT', null);

function http_query($url, $timeout = 5000) {
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_TIMEOUT_MS, $timeout);
    $text = curl_exec($curl);
    if ($text) {
        $code = curl_getinfo($curl, CURLINFO_HTTP_CODE);
        switch ($code) {
            case 200:
                return $text;
            case 404:
                return -1;
            default:
                return -1;
        }
    }
    return HTTP_TIMEOUT;
}
function getItemPrices($ID) { // renamed from getPrices so it matches the call below
    $t = time();
    $url = url_to_API;
    $result = http_query($url, 5000);
    if ($result == -1) {
        return -1;
    } else {
        return json_decode($result)->price;
    }
}
connectToDB();
$result = mysql_query("SELECT * FROM prices") or die(mysql_error());
while ($row = mysql_fetch_array($result)) {
    $id = $row['id'];
    $updatedPrice = getItemPrices($id);
    .
    .
    echo $updatedPrice;
    . // here I am just trying to make sure I can get all ~300 prices without getting any php errors or the request failing (since the ip might be blocked)
    .
}
sleep() should not affect/buffer queries to the database. You can use ob_flush() if you need to print something immediately. Also make sure to set the max execution time with set_time_limit() so your script doesn't time out.
set_time_limit(600);
while ($row = mysql_fetch_array($result)) {
    $id = $row['id'];
    $updatedPrice = getItemPrices($id);
    .
    .
    echo $updatedPrice;
    // Sleep 1 second; use ob_flush() if necessary
    sleep(1);
    // You can also use usleep(..) to delay the script in microseconds
}
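If the API blocks after roughly 10 rapid requests, another option is to pause per batch instead of per row. A sketch, assuming a hypothetical budget of 10 requests per minute (tune the numbers to the real limit):

$requestCount = 0;
while ($row = mysql_fetch_array($result)) {
    echo getItemPrices($row['id']);
    $requestCount++;
    // after every 10th request, wait a minute before starting the next batch
    if ($requestCount % 10 == 0) {
        sleep(60);
    }
}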
I'm here again, learning more and more about PHP, but I still have some problems with my scenario. Most of it has been programmed and solved without problems, but I found an issue; to understand it, I need to explain it first:
I have a PHP script which can be invoked by any client. Its job is to receive a request, ping a proxy from a list which I define manually to know whether the proxy is available, and, if it is available, retrieve a response using cURL with a POST method. The logic is like this:
$proxyList = array('192.168.3.41:8013' => 0, '192.168.3.41:8023' => 0, '192.168.3.41:8033' => 0);
$errorCounter = 0;
foreach ($proxyList as $key => $value) {
    if (!isUrlAvailable($key)) { // it means it is NOT available, so I count errors
        $errorCounter++;
    } else { // it means it is AVAILABLE
        $result = callThisProxy($key);
    }
}
The function "isUrlAvailable" uses a $fsockopen to know if the proxy is available. If not, I make a POST with CURL as mentioned before, the function has callThisProxy() something like:
$ch = curl_init($proxyUrl);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'xmlQuery='.$rawXml);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$info = curl_exec($ch);
if ($isDebug) { echo 'Info in the moment: '.$info.'<br/>'; }
curl_close($ch);
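A minimal sketch of what such an fsockopen()-based availability check might look like (hypothetical; the real implementation isn't shown here):

function isUrlAvailable($hostAndPort) {
    list($host, $port) = explode(':', $hostAndPort);
    // try to open a TCP connection with a 2-second timeout
    $fp = @fsockopen($host, (int)$port, $errno, $errstr, 2);
    if ($fp) {
        fclose($fp);
        return true;
    }
    return false;
}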
But we're testing some scenarios: what happens if I turn off the proxy between the availability check and the call? I mean:
foreach ($proxyList as $key => $value) {
    if (!isUrlAvailable($key)) { // it means it is NOT available, so I count errors
        $errorCounter++;
    } else { // it means it is AVAILABLE
        $result = callThisProxy($key); // what happens if I kill the proxy while the result is being processed?
    }
}
I tested it, and when I do that, $result comes back as an empty string ''. The problem is that I lose that request, and my goal is to retry it with the next $key, which is another proxy. I've been thinking of wrapping the call in a do/while, but I'm not sure whether that's OK or whether there's a better way, so I'm asking for help with this issue. Thanks in advance for your time; any answer is welcome.
Maybe something like:
$result = "";
while ($result == "")
{
foreach ($proxyList as $key => $value)
{
if (!isUrlAvailable($key))
{
$errorCounter++;
}
else
{
$result = callThisProxy($key);
}
}
}
// Now check $result, which should contain the first successful callThisProxy()
// result, or nothing if none of the keys worked.
You could just keep a list of proxies that you still need to try. When you hit an error or get a valid response, remove the proxy from the list; if you do not get a good response, keep it in the list and try it again later.
// $proxyList is keyed by proxy address, so grab the addresses with numeric keys
$proxiesToTry = array_keys($proxyList);
$i = 0;
while (count($proxiesToTry) != 0) {
    // reset to beginning of array
    if ($i >= count($proxiesToTry))
        $i = 0;
    $proxy = $proxiesToTry[$i];
    if (!isUrlAvailable($proxy)) { // It means it is NOT available so I count errors
        $errorCounter++;
        unset($proxiesToTry[$i]);
        $proxiesToTry = array_values($proxiesToTry); // re-index so $proxiesToTry[$i] stays valid
    } else { // It means it is AVAILABLE
        $result = callThisProxy($proxy);
        if ($result != "") { // If we got a response, remove it from the array of proxies to try.
            unset($proxiesToTry[$i]);
            $proxiesToTry = array_values($proxiesToTry);
        } else {
            $i++; // no response; leave it in the list and move on to the next proxy
        }
    }
}
NOTE: You will never break out of this loop if you never get a valid response from some proxy.
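One way to guard against that is to cap the total number of attempts (a sketch; the value of $maxAttempts is an arbitrary illustrative choice):

$attempts = 0;
$maxAttempts = 30; // give up after 30 tries across all proxies
while (count($proxiesToTry) != 0 && $attempts < $maxAttempts) {
    $attempts++;
    // ...same body as above...
}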
I don't know how to handle this.
There is an XML API server, and I'm getting its contents with cURL; that works fine. Now I have to call the creditCardPreprocessors state. It has an 'in progress' state too, and PHP should wait until the process is finished. I've already tried sleep() and other approaches, but I can't make it work. This is a simplified example variation of what I tried:
function process_state($xml) {
    if ($result = request($xml)) {
        // it'll return NULL on a bad state, for example
        return $result;
    }
    sleep(3);
    return process_state($xml); // without this return, the caller always gets NULL
}
I know this can be an infinite loop, and I've tried adding a counter to exit after five attempts, but it won't exit: the server hangs, I get 500 errors for minutes, and Apache becomes unreachable for that vhost.
EDIT:
Another example:
$i = 0;
$card_state = false;
// We're going to assume request() returns NULL while the card state is processing and TRUE once it's done
while (!$card_state && $i < 10) {
    $i++;
    if ($result = request('XML STUFF')) {
        $card_state = $result;
        break;
    }
    sleep(2);
}
The recursive method you've defined could cause problems depending on the response timing you get back from the server. I think you'd want to use a while loop here. It keeps the requests serialized.
$returnable_responses = array('code1', 'code2', 'code3'); // the responses that you want the loop to stop after receiving
$max_number_of_calls = 5; // or some number
$iterator = 0;
$result = NULL;
while (!in_array($result, $returnable_responses) && ($iterator < $max_number_of_calls)) {
    $result = request($xml);
    $iterator++;
    sleep(2); // brief pause between polls so the server isn't hammered (mirrors the asker's sleep())
}