HTTP request alternatives - PHP

I have a requirement to call 50+ URLs and fetch data from each. I tried many approaches and finally stuck with HttpRequestPool, because it was faster than anything else. These are the methods I compared:
1. file_get_contents($url) - took about 2 minutes to complete all 50 requests
2. curl - took about 45 to 50 seconds to complete all 50 requests
3. HttpRequest - took about 20 to 30 seconds to complete all 50 requests
4. HttpRequestPool - took about 10 to 15 seconds to complete all 50 requests
But even 10 to 15 seconds is slow compared to my competitors. I want to complete all 50 requests in at most 3 to 6 seconds. How can I achieve that target? Are there any alternatives in PHP apart from the above? If so, can anybody please tell me about them?
Please help me resolve this issue.
Thanks,
Sudhakar

Below is my code.
// Build one request per URL.
for ($i = 0; $i < $url_cnt; $i++) {
    $req[$i] = new HttpRequest($URLs[$i], HttpRequest::METH_GET,
        array('redirect' => 4, 'timeout' => 5, 'connecttimeout' => 7));
}
$pool = new HttpRequestPool();
$pool->enablePipelining(true);
// Attaching the requests
try {
    foreach ($req as $r) {
        $pool->attach($r);
    }
} catch (HttpRequestPoolException $ex) { }
// Send all requests at once.
try {
    $pool->send();
} catch (HttpRequestPoolException $x) { }
// Collect the response bodies.
foreach ($pool as $request) {
    $url_contents[] = $request->getResponseBody();
}
foreach ($req as $r) {
    $pool->detach($r);
}
$pool->__destruct();
unset($pool);
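
A common alternative is PHP's built-in curl_multi API, which drives all transfers concurrently in a single process, so total wall-clock time approaches that of the slowest URL rather than the sum. Below is a minimal sketch, assuming the same $URLs array and the same timeout settings as above; whether it reaches the 3-6 second target still depends on how fast the remote servers respond.

<?php
// Minimal curl_multi sketch: all 50 transfers run concurrently.
$multi = curl_multi_init();
$handles = array();
foreach ($URLs as $url) {
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_MAXREDIRS      => 4,
        CURLOPT_CONNECTTIMEOUT => 7,
        CURLOPT_TIMEOUT        => 5,
    ));
    curl_multi_add_handle($multi, $ch);
    $handles[] = $ch;
}
// Drive the transfers until none are still running.
do {
    $status = curl_multi_exec($multi, $running);
    if ($running) {
        curl_multi_select($multi, 0.1); // wait for activity instead of spinning
    }
} while ($running && $status == CURLM_OK);
// Collect the bodies and clean up.
$url_contents = array();
foreach ($handles as $ch) {
    $url_contents[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);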

Related

PHP pthreads async with threads number gets stuck

I'm trying to write a script that starts threads asynchronously with a limit (5) and waits if all threads are busy. If the threads are busy, the script should wait until one is free and then start another one, but for some reason it gets stuck. I discovered pthreads yesterday, and after a few hours today this is all I came up with:
EDIT: There are examples with 500 threads, for example, but in my case I need to parse large files (a few MBs) of targets, so I need to make it work asynchronously.
<?php
class My extends Thread {
    public function run() {
        echo $this->getThreadId() . "\n";
        sleep(rand(1, 3)); // Simulate some work...
        global $active_threads;
        $active_threads = $active_threads - 1; // It should decrease the number of active threads. Could the problem be here?
    }
}

$active_threads = 0;
$maximum_threads = 5;

while (true) {
    if ($active_threads == $maximum_threads) {
        echo 'Too many threads, just wait for a moment until some threads are free.' . "\n";
        sleep(3);
        continue;
    }
    $threads = array();
    for ($i = $active_threads; $i < $maximum_threads; $i++) {
        $active_threads = $active_threads + 1;
        $threads[] = new My();
    }
    foreach ($threads as $thread) {
        $thread->start();
    }
}
?>
Output:
[root@localhost tests]# zts-php test
140517320455936
140517237061376
140517228668672
140517220275968
140517207701248
Too many threads, just wait for a moment until some threads are free.
Too many threads, just wait for a moment until some threads are free.
Too many threads, just wait for a moment until some threads are free.
Too many threads, just wait for a moment until some threads are free.
I know it's pretty primitive, but I'm totally new to pthreads, and all the examples I've found have no limit on the number of async threads. There is surely something wrong in my code, but I have no idea where.
The problem, as stated in another post, is that global memory is not shared between threads. So your $active_threads will not decrement even if you use global.
See post: How to share global variable across thread in php?
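
A minimal sketch of one way around this, assuming pthreads v2 or later on a ZTS build: keep the counter in a Threaded object, whose properties are visible to every thread holding a reference to it, and guard updates with synchronized().

<?php
// Share the counter through a Threaded object instead of a global:
// globals are copied into each new thread, Threaded properties are not.
class Counter extends Threaded {
    public $active = 0;
    public function change($delta) {
        // synchronized() serialises concurrent updates to the counter.
        $this->synchronized(function () use ($delta) {
            $this->active = $this->active + $delta;
        });
    }
}

class My extends Thread {
    private $counter;
    public function __construct(Counter $counter) {
        $this->counter = $counter;
    }
    public function run() {
        sleep(rand(1, 3));          // Simulate some work...
        $this->counter->change(-1); // This decrement IS visible to the main thread.
    }
}

$counter = new Counter();
$maximum_threads = 5;
$threads = array();
while (true) {
    if ($counter->active >= $maximum_threads) {
        sleep(1); // All slots busy; wait for a thread to finish.
        continue;
    }
    $counter->change(+1);
    $thread = new My($counter);
    $thread->start();
    $threads[] = $thread; // Keep references so the threads aren't destroyed early.
}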

Keep data from reaching below 0

I'm working on a browser game where I need to handle energy. The user spends and gains energy, but I have come across an issue that I have tried to fix: when a user has, for example, 20 energy and does something that costs 30 energy, it goes through. Of course I understand why, because I am only checking that the value is not already below 0, like this:
if ( 0 < $resultenergy[0]['energy'])
This means that as long as the user's energy is above 0, he can spend whatever he wants. What would be a better way to check the data so it catches an attempt to go under 0? I appreciate all help, and if you have an idea for a different approach, I'm all ears.
Do the calculation once.
$remaining_energy = $current_energy - $energy_cost;
Then check the result, and act accordingly.
if ($remaining_energy >= 0) {
    $current_energy = $remaining_energy;
    // do whatever required the energy
    // send the client a success indicator
} else {
    // don't change current energy
    // send the client an error indicator
}
You could subtract and then evaluate, e.g.:
if ( 0 <= $resultenergy[0]['energy'] - $energyCost) {
    $resultenergy[0]['energy'] = $resultenergy[0]['energy'] - $energyCost;
    return true;
} else {
    return false;
}
$energyCost is the cost of whatever action they're trying to do.
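
Since this is a browser game, two overlapping requests could both pass an in-PHP check before either deducts the energy. One variation on the checks above is to push the test into the database with a conditional UPDATE. A sketch, assuming a PDO connection and hypothetical table/column names (users, energy, id):

<?php
// $pdo is an existing PDO connection; table and column names are
// placeholders for whatever the game actually uses.
$stmt = $pdo->prepare(
    'UPDATE users SET energy = energy - :cost
     WHERE id = :id AND energy >= :min_cost'
);
$stmt->execute(array(
    'cost'     => $energyCost,
    'id'       => $userId,
    'min_cost' => $energyCost,
));
if ($stmt->rowCount() === 1) {
    // The row matched and was updated: energy was deducted atomically.
    // Perform the action and report success.
} else {
    // Not enough energy (or no such user): report an error.
}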

Multi-upload using pthreads in PHP

I have been trying to implement multi-threading in PHP with pthreads to achieve multiple simultaneous uploads.
From my understanding of multi-threading, this is how I envisioned it working:
I would upload a file, and it would start uploading in the background; even before that upload completed, another instance (thread) would be created to upload another file. I would make multiple upload requests using AJAX, multiple files would start uploading, I would get the response of each request individually, and I could update the upload status on my site accordingly.
But this is not how it is working. This is code that I got from one of the pthreads questions on SO, but I do not have the link (sorry!!).
I tested this code to see if it really worked like I envisioned. This is the code I tested; I changed it a little.
<?php
error_reporting(E_ALL);

class AsyncWebRequest extends Thread {
    public $url;
    public $data;

    public function __construct($url) {
        $this->url = $url;
    }

    public function run() {
        if (($url = $this->url)) {
            /*
             * If a large amount of data is being requested, you might want to
             * fsockopen and read using usleep in between reads
             */
            $this->data = file_get_contents($url);
            echo $this->getThreadId();
        } else {
            printf("Thread #%lu was not provided a URL\n", $this->getThreadId());
        }
    }
}

$t = microtime(true);
foreach (["http://www.google.com/?q=" . rand() * 10, 'http://localhost', 'https://facebook.com'] as $url) {
    $g = new AsyncWebRequest($url);
    /* starting synchronized */
    if ($g->start()) {
        printf($url . " took %f seconds to start ", microtime(true) - $t);
        while ($g->isRunning()) {
            echo ".";
            usleep(100);
        }
        if ($g->join()) {
            printf(" and %f seconds to finish receiving %d bytes\n", microtime(true) - $t, strlen($g->data));
        } else {
            printf(" and %f seconds to finish, request failed\n", microtime(true) - $t);
        }
    }
    echo "<hr/>";
}
What I expected from this code was that it would hit google.com, localhost and facebook.com simultaneously and run their threads independently. But every request waits for the previous one to complete: each request is clearly sent only after the response to the previous request has been fully received.
So this is clearly not the way to achieve what I am trying to achieve. How do I do this?
You might want to look at multi curl for multiple external requests like this; pthreads is more suited to internal processing.
Just for further reference: you are starting the threads one by one and waiting for each to finish.
This line: while ($g->isRunning()) does not stop until the thread is finished. It's like having a while (true) inside a for: the for only executes one step at a time.
You need to start the threads, add them to an array, and then in a separate loop check each thread to see if it has stopped and remove it from the array.
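
A minimal sketch of that fix, reusing the AsyncWebRequest class from the question: start every thread before waiting on any of them, so the requests actually overlap.

<?php
$t = microtime(true);
$requests = array();
foreach (["http://www.google.com/", "http://localhost", "https://facebook.com"] as $url) {
    $g = new AsyncWebRequest($url);
    $g->start();       // start it and move straight on to the next URL
    $requests[] = $g;
}
// Only now wait for completion; total time is roughly that of the
// slowest request instead of the sum of all of them.
foreach ($requests as $g) {
    $g->join();
    printf("%s finished after %f seconds receiving %d bytes\n",
        $g->url, microtime(true) - $t, strlen($g->data));
}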

pthreads stopping already running thread once a condition has been met

I'm still relatively new to PHP and trying to use pthreads to solve an issue. I have 20 threads running processes that end at varying times; most finish in around 10 seconds or less. I don't need all 20 results, just the first 10. Once I have 10, I would like to kill the remaining threads, or at least continue on to the next step.
I have tried using set_time_limit of about 20 seconds in each of the threads, but they ignore it and keep running. I am looping through the jobs calling join because I didn't want the rest of the program to run yet, but that leaves me stuck until the slowest thread has finished. While pthreads has reduced the runtime from around a minute to about 30 seconds, I could shave off even more time, since the first 10 threads finish in about 3 seconds.
Thanks for any help; here is my code:
$count = 0;
foreach ($array as $i) {
    $imgName = $this->smsId . "_$count.jpg";
    $name = "LocalCDN/" . $imgName;
    $stack[] = new AsyncImageModify($i['largePic'], $name);
    $count++;
}

// Run the threads
foreach ($stack as $t) {
    $t->start();
}

// Check if the threads have finished; push the coordinates into an array
foreach ($stack as $t) {
    if ($t->join()) {
        array_push($this->imgArray, $t->data);
    }
}

class AsyncImageModify extends \Thread {
    public $data;

    public function __construct($arg, $name) {
        $this->arg = $arg;
        $this->name = $name;
    }

    public function run() {
        // tried putting the set_time_limit() here, didn't work
        if ($this->arg) {
            // Get the image
            $didWeGetTheImage = Image::getImage($this->arg, $this->name);
            if ($didWeGetTheImage) {
                $timestamp1 = microtime(true);
                print_r("Starting face detection $this->arg" . "\n");
                print_r(" ");
                $j = Image::process1($this->name);
                if ($j) {
                    // let's go ahead and do our image manipulation at this point
                    $userPic = Image::process2($this->name, $this->name, 200, 200, false, $this->name, $j);
                    if ($userPic) {
                        $this->data = $userPic;
                        print_r("Back from process2; the image returned is $userPic");
                    }
                }
                $endTime = microtime(true);
                $td = $endTime - $timestamp1;
                print_r("Finished face detection $this->arg in $td seconds" . "\n");
                print_r($j);
            }
        }
    }
}
It is difficult to guess the functionality of the Image::* methods, so I can't really answer in any detail.
What I can say is that there are very few machines I can think of that are suitable for running 20 concurrent threads in any case. A more suitable setup would be the worker/stackable model: a Worker thread is a reusable context that can execute task after task, each implemented as a Stackable. Execution in a multi-threaded environment should always use the fewest threads that get the most work done.
Please see the pooling example and the other examples distributed with pthreads, available on GitHub. Additionally, much information regarding usage is contained in past bug reports, if you are still struggling after that.
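
As a rough sketch of that model, assuming the pthreads Pool class (available since pthreads v2; in older versions tasks extend Stackable rather than Threaded) and reusing the Image::* calls and names from the question ($smsId stands in for $this->smsId):

<?php
// A fixed pool of workers services the whole queue instead of one
// thread per image.
class ImageTask extends Threaded {
    public $data;
    private $arg;
    private $name;

    public function __construct($arg, $name) {
        $this->arg = $arg;
        $this->name = $name;
    }

    public function run() {
        // Same pipeline as the question: fetch, detect, then manipulate.
        if (Image::getImage($this->arg, $this->name)) {
            $j = Image::process1($this->name);
            if ($j) {
                $this->data = Image::process2($this->name, $this->name, 200, 200, false, $this->name, $j);
            }
        }
    }
}

$pool = new Pool(4, 'Worker');   // 4 reusable worker threads
$tasks = array();
foreach ($array as $count => $i) {
    $task = new ImageTask($i['largePic'], "LocalCDN/{$smsId}_{$count}.jpg");
    $pool->submit($task);
    $tasks[] = $task;
}
$pool->shutdown();               // waits for all submitted tasks to finish
foreach ($tasks as $task) {
    if ($task->data) {
        $imgArray[] = $task->data;
    }
}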

CodeIgniter MySQL loop doesn't complete

I'm having a problem and I don't know how to solve it.
I have 500 records in another database, and the query to fetch them returns them all, but when I loop over them to insert/update them in my database, the foreach loop doesn't reach the end, and it doesn't show any error.
Here is the loop:
$countInsert = 0;
$countUpdate = 0;
foreach ($this->getMetaEmpregado()->getAll()->result_array() as $modelData) {
    $oEmpregado = $this->getEmpregadoObject($modelData);
    $arrayEmpWhere = array(
        'idempregado' => $oEmpregado->getIdEmpregado(),
        'idsociedade' => $oEmpregado->getIdSociedade(),
        'nif'         => $oEmpregado->getNif()
    );
    if ($this->getWayUtilizador()->get($arrayEmpWhere)->num_rows() == 0) {
        $countInsert++;
        $this->insertNewEmp($oEmpregado);
    } else {
        $countUpdate++;
        $this->UpdateEmp($oEmpregado);
    }
}
echo "Total Updates: $countUpdate Total Inserts: $countInsert<br>";
It doesn't show the echo at the end because it stops around record 260; sometimes it reaches 300, other times not even 100.
Regards, Elkas
It seems that the script is taking too long; you need to increase max_execution_time in your php.ini.
You can also use the ini_set function to change it, like this:
ini_set('max_execution_time', 600); //600 seconds = 10 minutes
Hope this helps.
