I'm developing a small application based on API requests with PHP cURL.
I ran into a problem with the API's private requests. One of the request parameters is "nonce" (a Unix timestamp), but the response is "invalid nonce".
When I contacted support, they answered:
"Invalid Nonce is sent when nonce you sent is smaller or equal to the nonce that was previously sent."
And,
"if you make 2 requests at same second you need to increase nonce for 2nd request (you can use micro uniquestamp so that in one second you can create 1000000 unique nonces in 1 second)."
My question is: what function can I use to solve this problem? I tried the microtime() function, but I get the same error.
Thank you, and sorry for my bad English.
My code:
$unix_time = time();
$microtime = number_format(microtime(true), 5, '', '');
$message = $unix_time.$customer_id.$API_KEY; //nonce + customer id + api key
$signature = hash_hmac('sha256', $message, $API_SECRET);
$ticker_url = "https://www.bitstamp.net/api/v2/ticker/btceur";
$balance_url = "https://www.bitstamp.net/api/v2/balance/btceur/";
$param_array = array(
"key" => $API_KEY,
"signature" => strtoupper($signature),
"nonce" => $microtime
);
switch($_POST['action']){
case 'ticker_btceur':
ticker_btceur($param_array, $ticker_url);
break;
case 'balance_btceur':
balance_btceur($param_array, $balance_url);
break;
}
function ticker_btceur($da, $b_url){ // cURL GET
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $b_url."?key=".$da['key']."&signature=".$da['signature']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_CAINFO, getcwd() . "/CAcerts/cacert.pem");
$result = curl_exec($ch); // execute once and reuse the result
if($result === false){
echo "Error: ".curl_error($ch)." - Error code: ".curl_errno($ch);
}
else{
echo $result;
}
curl_close($ch);
}
function balance_btceur($pa, $b_url){ // cURL POST
$ch = curl_init();
curl_setopt($ch,CURLOPT_URL, $b_url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($pa));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response instead of printing it directly
$result = curl_exec($ch); // execute once and reuse the result
if($result === false){
echo "Error: ".curl_error($ch)." - Error code: ".curl_errno($ch);
}
else{
echo $result;
}
curl_close($ch);
}
microtime() returns the current Unix timestamp with microseconds, which is different from a plain microseconds count (1 second = 1,000,000 microseconds), so they are not the same thing.
If the service provider is asking you to send the time as a Unix timestamp with microseconds, then you have to use:
$time = microtime(true);
You can also add a random component with rand(), like this:
// Increase the time by a random value between 10 and 100
$time = microtime(true) + rand(10, 100);
If they are asking you to send it as a plain microseconds value, then use rand() like this:
$time = rand(1000,10000000);
It seems that the API requires microseconds; here is a function to get the current time in microseconds:
function microseconds()
{
list($usec, $sec) = explode(" ", microtime());
// pad to 6 digits so small fractions don't produce a shorter (non-increasing) string
return $sec . sprintf('%06d', $usec * 1000000);
}
echo microseconds();
echo "\n";
My best guess is that they mean:
$stamp=(string)(int)(microtime(true)*1000000);
This stamp changes a million times per second; depending on when you generate it, it looks something like
string(16) "1555177383042022"
Just note that this code won't work properly on a 32-bit system. If your code needs 32-bit PHP compatibility, do this instead:
$stamp2=bcmul(number_format(microtime(true),18,".",""),"1000000",0);
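One more thing worth checking against the question's code: the signed message there is built from $unix_time while the request sends $microtime as the nonce, so the signature and the nonce disagree. Below is a minimal sketch that uses one microsecond nonce for both, reusing the question's variables and its nonce + customer id + API key message layout; treat it as a sketch under those assumptions, not Bitstamp's reference implementation.
<?php
// Minimal sketch: build ONE microsecond nonce and use the same value
// both in the signed message and in the request parameters.
// $customer_id, $API_KEY and $API_SECRET are assumed to be defined as in the question.
$nonce = (string)(int)(microtime(true) * 1000000); // e.g. "1555177383042022"

$message   = $nonce . $customer_id . $API_KEY;     // nonce + customer id + api key
$signature = strtoupper(hash_hmac('sha256', $message, $API_SECRET));

$param_array = array(
    "key"       => $API_KEY,
    "signature" => $signature,
    "nonce"     => $nonce,
);
// $param_array can then be POSTed exactly as in the question's balance_btceur().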
UPDATE: Setup tested - it works - but my web host cannot handle 600 emails in about 6 seconds. I had each connection wait 20 seconds and then send one mail, and those all went through.
I have a mailing list with 600+ emails.
I have a function to send out the 600+ emails.
Unfortunately, there is a limit on execution time (90 seconds), so the script is shut down before it completes. I cannot change the limit with set_time_limit(0), as it is set by my web host (and not in an ini file that I can change either).
My solution is to make POST requests from a main file to a sub file that sends out chunks of 100 mails at a time. But will these be sent without delay, or will each one wait for an answer before the next request is sent?
The code:
for($i=0;$i<$mails;$i+=100) {
$url = 'http://www.bedsteforaeldreforasyl.dk/siteadmin/php/sender.php';
$myvars = 'start=' . $i . '&emne=' . $emne . '&besked=' . $besked;
$ch = curl_init( $url );
curl_setopt( $ch, CURLOPT_POST, 1);
curl_setopt( $ch, CURLOPT_POSTFIELDS, $myvars);
curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt( $ch, CURLOPT_HEADER, 0);
curl_setopt( $ch, CURLOPT_SAFE_UPLOAD, 0);
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, 0);
curl_setopt( $ch, CURLOPT_TIMEOUT, 1);
$response = curl_exec( $ch );
curl_close($ch);
}
$mails is the total number of recipients
$start is the start row number in the SQL statement
Will this (as I hope) start 6 parallel connections - or will it (as I fear) start 6 processes one after the other?
In the receiving script I have:
ignore_user_abort(true);
$q1 = "SELECT * FROM maillist ORDER BY navn LIMIT $start,100";
Create six php scripts, one for each 100 emails (or pass a value (e.g. 0-5) to a single script).
Create a main script to call these six sub-scripts.
Use stream_socket_client() to call the sub-scripts.
The six scripts will run simultaneously.
You can catch anything echoed back by the sub-scripts (e.g. status).
$timeout = 120;
$buffer_size = 8192;
$result = array();
$sockets = array();
$id = 0;
$err = '';
header('Content-Type: text/plain; charset=utf-8');
$urls[] = array('host' => 'www.example.com','path' => "http://www.example.com/mail1.php");
$urls[] = array('host' => 'www.example.com','path' => "http://www.example.com/mail2.php");
$urls[] = array('host' => 'www.example.com','path' => "http://www.example.com/mail3.php");
$urls[] = array('host' => 'www.example.com','path' => "http://www.example.com/mail4.php");
$urls[] = array('host' => 'www.example.com','path' => "http://www.example.com/mail5.php");
$urls[] = array('host' => 'www.example.com','path' => "http://www.example.com/mail6.php");
foreach($urls as $path){
$host = $path['host'];
$path = $path['path'];
$http = "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n";
$stream = stream_socket_client("$host:80", $errno,$errstr, 120,STREAM_CLIENT_ASYNC_CONNECT|STREAM_CLIENT_CONNECT);
if ($stream) {
$sockets[] = $stream; // supports multiple sockets
fwrite($stream, $http);
}
else {
$err .= "$id Failed<br>\n";
}
}
echo $err;
while (count($sockets)) {
$read = $sockets;
$write = NULL; $except = NULL;
stream_select($read, $write, $except, $timeout);
if (count($read)) {
foreach ($read as $r) {
$id = array_search($r, $sockets);
$data = fread($r, $buffer_size);
if (strlen($data) == 0) {
// echo "$id Closed: " . date('h:i:s') . "\n\n\n";
$closed[$id] = microtime(true);
fclose($r);
unset($sockets[$id]);
}
else {
$result[$id] .= $data;
}
}
}
else {
// echo 'Timeout: ' . date('h:i:s') . "\n\n\n";
break;
}
}
var_export($result);
I'll provide some ideas on how the objective can be achieved.
1. Use the curl_multi_* suite of functions. It provides non-blocking cURL requests (a minimal sketch follows after this list).
2. Use an asynchronous library like amphp or ReactPHP. Though it would essentially provide the same benefit as curl_multi_*, IIRC.
3. Use pcntl_fork() to create separate processes and distribute the job across worker nodes.
4. Use the pthreads extension, which essentially provides a userland PHP implementation of true multi-threading.
I'll warn you though: the last two options should be a last resort, since the parallel-processing world comes with some spooky situations that can prove to be really pesky ;-).
I'd also suggest that if you are planning to scale this sort of application, the best course of action is to use some external service.
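For the first option, here is a minimal curl_multi_* sketch that fires several requests in parallel and collects whatever the sub-scripts echo back; the mail1.php-mail6.php URLs are borrowed from the other answer purely as placeholders.
<?php
// Minimal curl_multi sketch: run several requests in parallel and collect the responses.
$urls = array();
for ($i = 1; $i <= 6; $i++) {
    $urls[] = "http://www.example.com/mail{$i}.php"; // placeholder sub-script URLs
}

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 120);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Drive all transfers until they are finished.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running && $status == CURLM_OK);

foreach ($handles as $i => $ch) {
    echo "Response {$i}: " . curl_multi_getcontent($ch) . "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);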
I'm trying to figure out why this PHP code takes so long to output its results.
For example, this is my apitest.php, and here is my PHP code:
<?php
function getRankedMatchHistory($summonerId,$serverName,$apiKey){
// PHP variables don't need to be declared, but initialize them to keep the intent clear
$k = $d = $a = 0;
$timeElapsed = $gameType = $championName = $result = null;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://".$serverName.".api.pvp.net/api/lol/".$serverName."/v2.2/matchhistory/".$summonerId."?api_key=".$apiKey);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
$response = curl_exec($ch);
curl_close($ch);
$matchHistory = json_decode($response,true); // Is the whole JSON response now saved locally in $matchHistory, or is it requested every time $matchHistory is used?
for ($i = 9; $i >= 0; $i--){
$farm1 = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["minionsKilled"];
$farm2 = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["neutralMinionsKilled"];
$farm3 = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["neutralminionsKilledTeamJungle"];
$farm4 = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["neutralminionsKilledEnemyJungle"];
$elapsedTime = $matchHistory["matches"][$i]["matchDuration"];
settype($elapsedTime, "integer");
$elapsedTime = floor($elapsedTime / 60);
$k = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["kills"];
$d = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["deaths"];
$a = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["assists"];
$championIdTmp = $matchHistory["matches"][$i]["participants"]["0"]["championId"];
$championName = call_user_func('getChampionName', $championIdTmp); // calls another function to resolve championId into championName
$gameType = preg_replace('/[^A-Za-z0-9\-]/', ' ', $matchHistory["matches"][$i]["queueType"]);
$result = (($matchHistory["matches"][$i]["participants"]["0"]["stats"]["winner"]) == "true") ? "Victory" : "Defeat";
echo "<tr>"."<td>".$gameType."</td>"."<td>".$result."</td>"."<td>".$championName."</td>"."<td>".$k."/".$d."/".$a."</td>"."<td>".($farm1+$farm2+$farm3+$farm4)." in ". $elapsedTime. " minutes". "</td>"."</tr>";
}
}
?>
What I'd like to know is how to make the page output faster, as it takes around 10-15 seconds to output the results, which makes the browser think the website is dead, like a 500 Internal Server Error or something similar.
Here is a simple demonstration of how long it can take: Here
As you might have noticed, I'm using the Riot API, which sends the response as JSON.
Here is an example of the response that this function handles: Here
What I thought of was creating a temporary file called temp.php at the start of the cURL function, saving the whole response there, and then reading the variables from it to speed up the process; after reading the variables it would delete the temp.php that was created, freeing up disk space.
But I have no idea how to do that in PHP only.
By the way, I just started using PHP today, so I'd prefer some explanation with the answers if possible.
Thanks for your precious time.
Try benchmarking like this:
// start the timer
$start_curl = microtime(true);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://".$serverName.".api.pvp.net/api/lol/".$serverName."/v2.2/matchhistory/".$summonerId."?api_key=".$apiKey);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
// debugging
curl_setopt($ch, CURLOPT_VERBOSE, true);
// start another timer
$start = microtime(true);
$response = curl_exec($ch);
echo 'curl_exec() in: '.(microtime(true) - $start).' seconds<br><br>';
// start another timer
$start = microtime(true);
curl_close($ch);
echo 'curl_close() in: '.(microtime(true) - $start).' seconds<br><br>';
// how long did the entire CURL take?
echo 'CURLed in: '.(microtime(true) - $start_curl).' seconds<br><br>';
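Separately, the caching idea from the question can be done with plain file functions. Here is a minimal sketch, assuming an arbitrary local cache file and a 10-minute lifetime (neither is anything the Riot API requires); $serverName, $summonerId and $apiKey are assumed to be defined as in the question.
<?php
// Minimal file-cache sketch: reuse the last API response if it is recent enough.
$cacheFile = __DIR__ . '/matchhistory_cache.json'; // hypothetical cache path
$cacheTtl  = 600;                                  // seconds the cache is considered fresh
$apiUrl    = "https://".$serverName.".api.pvp.net/api/lol/".$serverName."/v2.2/matchhistory/".$summonerId."?api_key=".$apiKey;

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $cacheTtl) {
    // Serve from cache: no network round trip at all.
    $response = file_get_contents($cacheFile);
} else {
    // Cache is missing or stale: fetch from the API and store the raw JSON.
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $apiUrl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    if ($response !== false) {
        file_put_contents($cacheFile, $response);
    }
}

$matchHistory = json_decode($response, true);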
I'm using cURL to upload a file via a given URL (the user gives a URL, and my server downloads the file).
For a progressbar, I use the CURLOPT_PROGRESSFUNCTION option.
I want the function of the progress to also calculate the speed of download, and how much time left.
$fp = fopen($temp_file, "w");
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOPROGRESS, false );
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, "curl_progress_callback");
curl_setopt($ch, CURLOPT_FILE, $fp);
$success = curl_exec($ch);
$curl_info = curl_getinfo($ch);
curl_close($ch);
fclose($fp);
function curl_progress_callback ($download_size, $downloaded_size, $upload_size, $uploaded_size) {
global $fileinfo;
if (!$downloaded_size) {
if (!isset($fileinfo->size)) {
$fileinfo->size = $download_size;
event_callback(array("send" => $fileinfo));
}
}
event_callback(array("progress" => array("loaded" => $downloaded_size, "total" => $download_size)));
}
Thank you, and sorry for my English!
Add this before curl_exec:
$startTime = $prevTime = microtime(true);
$prevSize = 0;
You can calculate the average speed, the current speed, and the remaining time by adding this to the callback function (these variables have to be shared with the callback, e.g. via global, just like $fileinfo already is):
$averageSpeed = $downloaded_size / (microtime(true) - $startTime);
$currentSpeed = ($downloaded_size - $prevSize) / (microtime(true) - $prevTime);
$prevTime = microtime(true);
$prevSize = $downloaded_size;
$timeRemaining = ($download_size - $downloaded_size) / $averageSpeed;
Speed is measured in bytes/s and remaining time in seconds.
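Putting it together, here is a minimal sketch of a progress callback (hypothetical name progress_with_eta) that reports average speed, current speed, and ETA, using static variables instead of globals and guarding against division by zero. It assumes PHP 5.5+, where the callback also receives the cURL handle as its first argument; the event_callback()/$fileinfo parts from the question are left out.
<?php
// Minimal sketch: progress callback that reports average speed, current speed and ETA.
function progress_with_eta($ch, $download_size, $downloaded_size, $upload_size, $uploaded_size) {
    static $startTime = null, $prevTime = null, $prevSize = 0;
    $now = microtime(true);
    if ($startTime === null) {
        $startTime = $prevTime = $now;
        return 0; // nothing to report yet
    }

    $elapsed = $now - $startTime;
    $tick    = $now - $prevTime;
    if ($elapsed > 0 && $downloaded_size > 0) {
        $averageSpeed  = $downloaded_size / $elapsed;                      // bytes/s since start
        $currentSpeed  = $tick > 0 ? ($downloaded_size - $prevSize) / $tick : 0;
        $timeRemaining = $download_size > 0
            ? ($download_size - $downloaded_size) / $averageSpeed          // seconds left
            : 0;
        printf("avg %.0f B/s, now %.0f B/s, ETA %.1f s\n", $averageSpeed, $currentSpeed, $timeRemaining);
    }

    $prevTime = $now;
    $prevSize = $downloaded_size;
    return 0; // returning non-zero would abort the transfer
}
// Register it as in the question:
// curl_setopt($ch, CURLOPT_NOPROGRESS, false);
// curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, "progress_with_eta");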
I've built a simple Twitter feed for a webpage. I save a cached file with the JSON response of the API, and then read it using jQuery.
It works well; the problem is that it randomly hits the REST API limit of 150 requests, even though I'm making just 6 per hour (1 every 10 minutes), and as far as I remember I don't have any other feed on my hosting (MediaTemple gs) that could be making many requests per hour.
I know that I can authenticate with an account and get a 350-request limit. I haven't tested that yet, but I don't think it will solve the problem at all.
Here is the cron I execute each 10 minutes:
<?php
//Set local timezone
putenv("TZ=America/Caracas");
//Function to get contents using cURL
function url_get_contents ($Url) {
if (!function_exists('curl_init')){
die('CURL is not installed!');
}
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $Url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($ch);
curl_close($ch);
return $output;
}
//The path of the file that contains cached tweets
$cache = '/home/xxxxx/domains/example.com/html/demos/webname/twitter-json.txt';
//Call the api and get JSON response
$data = url_get_contents("https://api.twitter.com/1/statuses/user_timeline.json?include_entities=true&include_rts=true&screen_name=webname&count=100");
$parsed = json_decode($data, true);
//This is a workaround I made to exit the script if there's an error(it works)
if ($parsed['error']) exit;
//I change the twitter date format to d/m/y H:i
foreach ($parsed as $key => $value) {
$parsed[$key]['created_at'] = date('d/m/y H:i',strtotime($value['created_at']));
}
//I encode to JSON
$data = json_encode($parsed);
//Save the file
$cachefile = fopen($cache, 'wb');
fwrite($cachefile,$data);
fclose($cachefile);
?>
My current code (see below) uses 147MB of virtual memory!
My provider allocates 100MB by default, and the process is killed when it runs, causing an internal error.
The code is utilising curl multi and must be able to loop with more than 150 iterations whilst still minimizing the virtual memory. The code below is only set at 150 iterations and still causes the internal server error. At 90 iterations the issue does not occur.
How can I adjust my code to lower the resource use / virtual memory?
Thanks!
<?php
function udate($format, $utimestamp = null) {
if ($utimestamp === null)
$utimestamp = microtime(true);
$timestamp = floor($utimestamp);
$milliseconds = round(($utimestamp - $timestamp) * 1000);
return date(preg_replace('`(?<!\\\\)u`', $milliseconds, $format), $timestamp);
}
$url = 'https://www.testdomain.com/';
$curl_arr = array();
$master = curl_multi_init();
for($i=0; $i<150; $i++)
{
$curl_arr[$i] = curl_init();
curl_setopt($curl_arr[$i], CURLOPT_URL, $url);
curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl_arr[$i], CURLOPT_SSL_VERIFYHOST, FALSE);
curl_setopt($curl_arr[$i], CURLOPT_SSL_VERIFYPEER, FALSE);
curl_multi_add_handle($master, $curl_arr[$i]);
}
do {
curl_multi_exec($master,$running);
if($running) curl_multi_select($master); // avoid busy-waiting while the transfers run
} while($running > 0);
for($i=0; $i<150; $i++)
{
$results = curl_multi_getcontent ($curl_arr[$i]);
$results = explode("<br>", $results);
echo $results[0];
echo "<br>";
echo $results[1];
echo "<br>";
echo udate('H:i:s:u');
echo "<br><br>";
usleep(100000);
}
?>
As per your last comment:
Download RollingCurl.php.
Hopefully this will sufficiently spam the living daylights out of your API.
<?php
$url = '________';
$fetch_count = 150;
$window_size = 5;
require("RollingCurl.php");
function request_callback($response, $info, $request) {
list($result0, $result1) = explode("<br>", $response);
echo "{$result0}<br>{$result1}<br>";
//print_r($info);
//print_r($request);
echo "<hr>";
}
$urls = array_fill(0, $fetch_count, $url);
$rc = new RollingCurl("request_callback");
$rc->window_size = $window_size;
foreach ($urls as $url) {
$request = new RollingCurlRequest($url);
$rc->add($request);
}
$rc->execute();
?>
Looking through your questions, I saw this comment:
"If the intention is domain snatching, then using one of the established services is a better option. Your script implementation is hardly as important as the actual connection and latency."
I agree with that comment.
Also, you seem to have posted the "same question" approximately seven hundred times:
https://stackoverflow.com/users/558865/icer
https://stackoverflow.com/users/516277/icer
How can I adjust the server to run my PHP script quicker?
How can I re-code my php script to run as quickly as possible?
How to run cURL once, checking domain availability in a loop? Help fixing code please
Help fixing php/api/curl code please
How to reduce virtual memory by optimising my PHP code?
Overlapping HTTPS requests?
Multiple https requests.. how to?
Doesn't the fact that you have to keep asking the same question over and over tell you that you're doing it wrong?
This comment of yours:
"#mario: Cheers. I'm competing against 2 other companies for specific ccTLD's. They are new to the game and they are snapping up those domains in slow time (up to 10 seconds after purge time). I'm just a little slower at the moment."
I'm fairly sure that PHP on a shared hosting account is the wrong tool to use if you are seriously trying to beat two companies at snapping up expired domain names.
The result of each of the 150 queries is being stored in PHP memory, and by your evidence this is insufficient. The only conclusion is that you cannot keep 150 queries in memory. You must either stream to files instead of memory buffers, or simply reduce the number of queries and process the list of URLs in batches.
To use streams you must set CURLOPT_RETURNTRANSFER to 0 and implement a callback for CURLOPT_WRITEFUNCTION; there is an example in the PHP manual:
http://www.php.net/manual/en/function.curl-setopt.php#98491
function on_curl_write($ch, $data)
{
global $fh;
$bytes = fwrite ($fh, $data, strlen($data));
return $bytes;
}
curl_setopt ($curl_arr[$i], CURLOPT_WRITEFUNCTION, 'on_curl_write');
Getting the correct file handle in the callback is left as a problem for the reader to solve.
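One way to solve the file-handle problem is to bind each cURL handle to its own file pointer with a closure, so no globals are needed. Here is a minimal sketch of the 150-request loop rewritten that way; the per-request output file names are arbitrary.
<?php
// Minimal sketch: stream each response straight to its own file instead of buffering it in memory.
$url     = 'https://www.testdomain.com/';
$master  = curl_multi_init();
$handles = array();
$files   = array();

for ($i = 0; $i < 150; $i++) {
    $fp = fopen(__DIR__ . "/response_{$i}.txt", 'w'); // arbitrary per-request output file
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);       // required when using a write callback
    // The closure captures $fp, so each handle writes to its own file.
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) use ($fp) {
        return fwrite($fp, $data);                     // must return the number of bytes handled
    });
    curl_multi_add_handle($master, $ch);
    $handles[$i] = $ch;
    $files[$i]   = $fp;
}

do {
    curl_multi_exec($master, $running);
    if ($running) {
        curl_multi_select($master);
    }
} while ($running > 0);

foreach ($handles as $i => $ch) {
    curl_multi_remove_handle($master, $ch);
    curl_close($ch);
    fclose($files[$i]);
}
curl_multi_close($master);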
<?php
echo str_repeat(' ', 1024); //to make flush work
$url = 'http://__________/';
$fetch_count = 15;
$delay = 100000; //0.1 second
//$delay = 1000000; //1 second
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
for ($i=0; $i<$fetch_count; $i++) {
$start = microtime(true);
$result = curl_exec($ch);
list($result0, $result1) = explode("<br>", $result);
echo "{$result0}<br>{$result1}<br>";
flush();
$end = microtime(true);
$sleeping = max(0, $delay - (int)(($end - $start) * 1000000)); // convert elapsed seconds to microseconds
echo 'sleeping: ' . ($sleeping / 1000000) . ' seconds<hr />';
usleep($sleeping);
}
curl_close($ch);
?>