Send multiple SMS requests in one second with PHP

I'm trying to send SMS messages using an API. It sends roughly one SMS per second, but I want to send multiple SMS per second using multithreading/pthreads in PHP. How is this possible, or how can I send multiple SMS requests asynchronously to the API server from my end in the least time?
// Threads Class
class MThread extends Thread {
    public $data;
    public $result;

    public function __construct($data) {
        $this->data = $data;
    }

    public function run() {
        foreach ($this->data as $dt_res) {
            // Send the POST request with cURL
            $ch = curl_init("http://www.example.com");
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_POSTFIELDS, $dt_res['to']);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            $res = curl_exec($ch);
            $http_code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
            curl_close($ch);
            $this->result = $http_code;
        }
    }
}

// $_POST['data'] has multi arrays
$request = new MThread($_POST['data']);
if ($request->start()) {
    $request->join();
    print_r($request->result);
}
Any idea will be appreciated.

You don't necessarily need threads to send multiple HTTP requests asynchronously. You can use non-blocking I/O; multi-cURL is suitable in this case, and there are HTTP clients with multi-cURL support.
Example (using Guzzle 6):
$client = new \GuzzleHttp\Client();

$requestGenerator = function() use ($client) {
    $uriList = ['https://www.google.com', 'http://amazon.com', 'http://github.com', 'http://stackoverflow.com'];
    foreach ($uriList as $uri) {
        $request = new \GuzzleHttp\Psr7\Request('GET', $uri);
        $promise = $client->sendAsync($request);
        yield $promise;
    }
};

$concurrency = 4;

\GuzzleHttp\Promise\each_limit($requestGenerator(), $concurrency, function(\GuzzleHttp\Psr7\Response $response) {
    var_dump($response->getBody()->getContents());
}, function(\Exception $e) {
    var_dump($e->getMessage());
})->wait();
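As a side note, each_limit() also passes the iterator key to the fulfilled callback, which lets you map each response back to the URI that produced it. A small variation, assuming $uriList is also defined outside the generator:

\GuzzleHttp\Promise\each_limit($requestGenerator(), $concurrency, function($response, $idx) use ($uriList) {
    // $idx matches the order in which the generator yielded promises
    echo $uriList[$idx], ': ', $response->getStatusCode(), "\n";
})->wait();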

Why do you put a foreach inside run()? When you do that, it behaves exactly like a plain function call: there is no multithreading.
So, how do you use multithreading with pthreads?
Here is a solution to your problem:
$thread = array();
foreach ($_POST['data'] as $index => $data) {
    $thread[$index] = new MThread($data);
    $thread[$index]->start();
}
You should be able to see your mistake by comparing this code with yours. Just delete the foreach inside your run() function (so each thread handles a single item), use this code, and it will work.
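One detail the snippet above leaves out: if you want the HTTP status codes back in the parent, you still have to join the threads after starting all of them. A minimal sketch, assuming the MThread class from the question with run() reduced to a single request:

$thread = array();
foreach ($_POST['data'] as $index => $data) {
    $thread[$index] = new MThread($data);
    $thread[$index]->start(); // all requests now run concurrently
}
foreach ($thread as $index => $t) {
    $t->join();              // wait for this thread to finish
    print_r($t->result);     // HTTP status code set inside run()
}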

It's better to use something like Beanstalkd (a work queue) with multiple worker processes.
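For illustration, a rough sketch of that approach using the Pheanstalk Beanstalkd client (v4-style API; the tube name, JSON payload, and send logic are assumptions, not part of the original answer):

use Pheanstalk\Pheanstalk;

// Producer: enqueue one job per SMS instead of sending inline.
$queue = Pheanstalk::create('127.0.0.1');
$queue->useTube('sms');
foreach ($_POST['data'] as $data) {
    $queue->put(json_encode($data));
}

// Worker (start several of these processes for parallelism):
$queue = Pheanstalk::create('127.0.0.1');
$queue->watch('sms');
while (true) {
    $job  = $queue->reserve();                // blocks until a job is available
    $data = json_decode($job->getData(), true);
    // ... send the SMS via cURL, as in the question ...
    $queue->delete($job);                     // acknowledge completion
}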


PHP CURL multi-threaded and single-threaded function help. How do I do this?

I found a function here: http://archevery.blogspot.com/2013/07/php-curl-multi-threading.html
I am using it to send an array of URLs that it runs and processes as quickly as possible via multi-threaded cURL requests. This works great.
Some of the URLs I want to send it require that they be processed in order, in a sequence, not at the same time.
How can I achieve this?
Example:
URL-A URL-B URL-C --> All fire off at the same time
URL-D URL-E --> Must wait for URL-D to finish before URL-E is triggered.
My purpose is for a task management system that allows me to add PHP applications as "Tasks" in the database. I have a header/detail relationship with the tasks so a task with one header and one detail can be sent off multi-threaded, but a task with one header and multiple details must be sent off in the order of the detail tasks.
I can do this by calling cURL requests in a loop, but I want them to also fire off the base request (the first task of a sequence) as part of the multi-threaded function. I don't want to have to wait for all sequential tasks to pile up and process in order. That is, the first task of each sequence should be multi-threaded, but tasks within a sequence need to wait for the previous task to complete before moving to the next.
I tried this function, to which I send the multiple tasks, but it waits for each task to finish before moving on to the next. I need to somehow combine the multi-threaded function from the URL above with this one.
Here is my multithreaded curl function:
function runRequests($url_array, $thread_width = 10) {
    $threads = 0;
    $master = curl_multi_init();
    $curl_opts = array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_MAXREDIRS      => 5,
        CURLOPT_CONNECTTIMEOUT => 15,
        CURLOPT_TIMEOUT        => 15,
    );
    $results = array();
    $count = 0;
    foreach ($url_array as $url) {
        $ch = curl_init();
        $curl_opts[CURLOPT_URL] = $url; // add the URL without discarding the shared options
        curl_setopt_array($ch, $curl_opts);
        curl_multi_add_handle($master, $ch); // push URL for single rec send into curl stack
        $results[$count] = array("url" => $url, "handle" => $ch);
        $threads++;
        $count++;
        if ($threads >= $thread_width) { // start running when stack is full to width
            while ($threads >= $thread_width) {
                //usleep(100);
                while (($execrun = curl_multi_exec($master, $running)) === -1) {}
                curl_multi_select($master);
                // a request was just completed - find out which one and remove it from stack
                while ($done = curl_multi_info_read($master)) {
                    foreach ($results as &$res) {
                        if ($res['handle'] == $done['handle']) {
                            $res['result'] = curl_multi_getcontent($done['handle']);
                        }
                    }
                    curl_multi_remove_handle($master, $done['handle']);
                    curl_close($done['handle']);
                    $threads--;
                }
            }
        }
    }
    do { // finish sending remaining queue items when all have been added to curl
        //usleep(100);
        while (($execrun = curl_multi_exec($master, $running)) === -1) {}
        curl_multi_select($master);
        while ($done = curl_multi_info_read($master)) {
            foreach ($results as &$res) {
                if ($res['handle'] == $done['handle']) {
                    $res['result'] = curl_multi_getcontent($done['handle']);
                }
            }
            curl_multi_remove_handle($master, $done['handle']);
            curl_close($done['handle']);
            $threads--;
        }
    } while ($running > 0);
    curl_multi_close($master);
    return $results;
}
And here is the single-threaded cURL function:
function runSingleRequests($url_array) {
    foreach ($url_array as $url) {
        // Initialize a cURL session.
        $ch = curl_init();
        // Page contents not needed.
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
        // Grab URL and pass it to the variable.
        curl_setopt($ch, CURLOPT_URL, $url);
        // Process the request.
        $result = curl_exec($ch);
        curl_close($ch);
    }
}
Both take an array of URLs as their input.
I currently have an array of all single tasks and another array of all multiple tasks with a "header id" that lets me know what header task each detail task is part of.
Any help on theory or code would be most appreciated.
Thanks!
Why don't you use a rudimentary task scheduler to schedule your requests and follow-ups, instead of running everything at once?
See it in action: https://ideone.com/suTUBS
<?php
class Task
{
    protected $follow_up = [];
    protected $task_callback;

    public function __construct($task_callback)
    {
        $this->task_callback = $task_callback;
    }

    public function addFollowUp(Task $follow_up)
    {
        $this->follow_up[] = $follow_up;
    }

    public function complete()
    {
        foreach ($this->follow_up as $runnable) {
            $runnable->run();
        }
    }

    public function run()
    {
        $callback = $this->task_callback;
        $callback($this);
    }
}

$provided_task_scheduler_from_somewhere = function()
{
    $tasks = [];
    $global_message_thing = 'failed';
    $second_global_message_thing = 'failed';

    $task1 = new Task(function (Task $runner)
    {
        $something_in_closure = function() use ($runner) {
            echo "running task one\n";
            $runner->complete();
        };
        $something_in_closure();
    });

    /**
     * Use $global_message_thing by reference so we can manipulate it.
     * This makes sure the follow-up on this one knows the status of what happened here.
     */
    $second_follow_up = new Task(function(Task $runner) use (&$global_message_thing)
    {
        echo "second follow up on task one.\n";
        $global_message_thing = "success";
        $runner->complete();
    });

    /**
     * Just doing things in random order to show that order doesn't really matter
     * with a task scheduler, just the follow-ups.
     */
    $tasks[] = $task1;
    $tasks[] = new Task(function(Task $runner)
    {
        echo "running task 2\n";
        $runner->complete();
    });

    $task1->addFollowUp(new Task(function(Task $runner)
    {
        echo "follow up on task one.\n";
        $runner->complete();
    }));
    $task1->addFollowUp($second_follow_up);

    /**
     * Adding the references to our "status" trackers here to know what to print.
     * One will still be on "failed" because we did nothing with it; that way we
     * know it works properly, as a control.
     */
    $second_follow_up->addFollowUp(new Task(function(Task $runner) use (&$global_message_thing, &$second_global_message_thing) {
        if ($global_message_thing === "success") {
            echo "follow up on the second follow up, three layers now, w00007!\n";
        }
        if ($second_global_message_thing === "success") {
            echo "you don't see this\n";
        }
        $runner->complete();
    }));

    return $tasks;
};

/**
 * Normally you'd use some aggregating function to build up your task list
 * or a collection of classes. I simulated that here with this callback function.
 */
$tasks = $provided_task_scheduler_from_somewhere();
foreach ($tasks as $task) {
    $task->run();
}
This way you can have nested tasks that need to follow after each other; with some clever use of closures you can pass parameters to the executing functions and to the encompassing objects outside them.
In my example the Task object itself is passed to the executing function, so the executing function can call complete() when it's done with its job.
When complete() is called, the Task determines whether it has follow-up tasks scheduled; if so, those are automatically called, and execution works its way down the chain like that.
It's a rudimentary task scheduler, but it should help you on your way to getting steps executed in the order you want.
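To tie this back to cURL: the same idea can be expressed directly on top of curl_multi. Below is a minimal sketch (a hypothetical helper, not part of the answer above) that fires the first URL of every sequence concurrently and only queues each follow-up URL once its predecessor completes:

// $sequences: array of sequences, each an ordered array of URLs.
function runSequences(array $sequences)
{
    $mh = curl_multi_init();
    $queues  = array_map('array_values', $sequences);
    $pending = [];  // (int) handle id => sequence key
    $results = [];

    $addNext = function ($key) use (&$queues, &$pending, $mh) {
        if (empty($queues[$key])) {
            return;
        }
        $ch = curl_init(array_shift($queues[$key]));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $pending[(int) $ch] = $key;
    };

    // Fire off the first URL of every sequence at the same time.
    foreach (array_keys($queues) as $key) {
        $addNext($key);
    }

    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
        while ($done = curl_multi_info_read($mh)) {
            $ch  = $done['handle'];
            $key = $pending[(int) $ch];
            unset($pending[(int) $ch]);
            $results[$key][] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
            $addNext($key); // queue the next URL in this sequence, if any
        }
    } while ($running > 0 || $pending);

    curl_multi_close($mh);
    return $results;
}

$results[$key] ends up holding the responses for sequence $key in order, while independent sequences still run in parallel.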
Here's an easier-to-follow example, from http://arguments.callee.info/2010/02/21/multiple-curl-requests-with-php/, using curl_multi_init. This family of functions allows you to combine cURL handles and execute them simultaneously.
EXAMPLE
// build the individual requests, but do not execute them
$ch_1 = curl_init('http://webservice.one.com/');
$ch_2 = curl_init('http://webservice.two.com/');
curl_setopt($ch_1, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch_2, CURLOPT_RETURNTRANSFER, true);

// build the multi-curl handle, adding both $ch
$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch_1);
curl_multi_add_handle($mh, $ch_2);

// execute all queries simultaneously, and continue when all are complete
$running = null;
do {
    curl_multi_exec($mh, $running);
} while ($running);

// close the handles
curl_multi_remove_handle($mh, $ch_1);
curl_multi_remove_handle($mh, $ch_2);
curl_multi_close($mh);

// all of our requests are done, we can now access the results
$response_1 = curl_multi_getcontent($ch_1);
$response_2 = curl_multi_getcontent($ch_2);
echo "$response_1 $response_2"; // output results
If both websites take one second to return, we literally cut our page load time in half by using the second example instead of the first!
References: https://www.php.net/manual/en/function.curl-multi-init.php

Alternative to the file_get_contents function

I am trying to get JSON data from https://nepse-data-api.herokuapp.com/data/todaysprice.
I used the file_get_contents() function but I got the error message below:
Message: require(): https:// wrapper is disabled in the server
configuration by allow_url_fopen=0
Now my problem is that I am on shared hosting, so setting allow_url_fopen = 1 is not possible.
How can I get the data from the above URL?
On localhost this code works properly. Here is my code:
$url = 'https://nepse-data-api.herokuapp.com/data/todaysprice';
$raw = file_get_contents($url);
$data = json_decode($raw);
In case you’re using PHP to retrieve data from a certain server, you have probably come across the problem that it works for you but a client complains about lots of errors. It’s pretty likely that you relied on allow_url_fopen being set to true. That setting lets you put pretty much anything – a local path or a URL – into function calls like include or simplexml_load_file.
If you’d like to get around this problem, you can advise your client to make the necessary changes in their php.ini file. Most of the time this isn’t an option, because the hosting company disabled the feature for security reasons. Since almost everybody has cURL installed, we can use it to retrieve data from another web server.
Implementation
I’ll present a wrapper that helps you load an XML file. It uses simplexml_load_file if allow_url_fopen is enabled; if that feature is disabled, it falls back to cURL plus simplexml_load_string. If neither works, we throw an exception because we weren’t able to load the data.
class XMLWrapper {
    public function loadXML($url) {
        if (ini_get('allow_url_fopen') == true) {
            return $this->load_fopen($url);
        } else if (function_exists('curl_init')) {
            return $this->load_curl($url);
        } else {
            // Enable 'allow_url_fopen' or install cURL.
            throw new Exception("Can't load data.");
        }
    }

    private function load_fopen($url) {
        return simplexml_load_file($url);
    }

    private function load_curl($url) {
        $curl = curl_init($url);
        curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
        $result = curl_exec($curl);
        curl_close($curl);
        return simplexml_load_string($result);
    }
}

// For JSON
class JsonWrapper {
    public function loadJSON($url) {
        if (ini_get('allow_url_fopen') == true) {
            return $this->load_fopen($url);
        } else if (function_exists('curl_init')) {
            return $this->load_curl($url);
        } else {
            // Enable 'allow_url_fopen' or install cURL.
            throw new Exception("Can't load data.");
        }
    }

    private function load_fopen($url) {
        $raw = file_get_contents($url);
        return json_decode($raw);
    }

    private function load_curl($url) {
        $curl = curl_init($url);
        curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
        $result = curl_exec($curl);
        curl_close($curl);
        return json_decode($result);
    }
}
The code is pretty simple: create an instance of the given class and call the loadXML (or loadJSON) method. It will call the right private method, which finally loads the data. Loading XML is just an example; you can use this technique with e.g. include or require too.
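For the JSON case from the question, usage would look like this:

$wrapper = new JsonWrapper();
$data = $wrapper->loadJSON('https://nepse-data-api.herokuapp.com/data/todaysprice');
var_dump($data);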

How do I use StreamedResponse to render template view in Symfony2

I am trying to use StreamedResponse to output progress to my index page in Symfony2.
The code below does show my progress on the API calls as they occur, but I am having trouble rendering the streamed information in an actual view. Right now it just outputs plain text at the top of the page, then renders the view when it's all complete.
I don't want to return the final array and close the function until everything is loaded, but I can't seem to get a regular Twig template to show while I output the progress.
I have tried using render, but nothing seems to truly output that view file to the screen unless I return.
public function indexAction($countryCode)
{
    // anywhere from five to fifteen api calls are going to take place
    foreach ($Widgets as $Widget) {
        $response = new StreamedResponse();
        $curlerUrl = $Widget->getApiUrl()
            . '?action=returnWidgets'
            . '&data=' . urlencode(serialize(array(
                'countryCode' => $countryCode
            )));
        $requestStartTime = microtime(true);
        $curler = $this->get('curler')->curlAUrl($curlerUrl);
        $curlResult = json_decode($curler['body'], true);
        if (isset($curlResult['data'])) {
            // do some processing on the data
        }
        $executionTime = microtime(true) - $requestStartTime; // measure how long the call took
        $response->setCallback(function() use ($Widget, $executionTime) {
            flush();
            sleep(1);
            var_dump($Widget->getName());
            var_dump($executionTime);
            flush();
        });
        $response->send();
    }

    // rest of indexAction with a return statement
    return array(
        // all the vars my template will need
    );
}
Also, another important detail is that I am trying to render it all through Twig, and there seem to be some interesting issues with that.
As I understand it, you only get one chance to output something to the browser from the server (PHP/Twig), then it's up to JavaScript to make any further changes (like update a progress bar).
I'd recommend using multi-cURL to perform all 15 requests asynchronously. This effectively makes the total request time equal to the slowest request so you can serve your page much faster and maybe eliminate the need for the progress bar.
// Create the multiple cURL handle
$mh = curl_multi_init();
$handles = array();
$responses = array();

// Create and add the cURL handles to the $mh
foreach ($widgets as $widget) {
    $ch = $curler->getHandle($widget->getURL()); // Code that returns a cURL handle
    $handles[] = $ch;
    curl_multi_add_handle($mh, $ch);
}

// Execute the requests
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

// Get the request content
foreach ($handles as $handle) {
    $responses[] = curl_multi_getcontent($handle);
    // Remove and close the handles
    curl_multi_remove_handle($mh, $handle);
    curl_close($handle);
}
curl_multi_close($mh);

// Do something with the responses
// ...
Ideally, this would be a method of your Curler service.
public function processHandles(array $widgets)
{
    // most of the above
    return $responses;
}
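Then the controller reduces to a single call (hypothetical wiring, reusing the service name from the question):

$responses = $this->get('curler')->processHandles($Widgets);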
You may implement all of the logic in the setCallback method, so consider this code:
public function indexAction($countryCode)
{
    $Widgets = [];
    $response = new StreamedResponse();
    $curlerService = $this->get('curler');

    $response->setCallback(function() use ($Widgets, $curlerService, $countryCode) {
        foreach ($Widgets as $Widget) {
            $curlerUrl = $Widget->getApiUrl()
                . '?action=returnWidgets'
                . '&data=' . urlencode(serialize(array(
                    'countryCode' => $countryCode
                )));
            $requestStartTime = microtime(true);
            $curler = $curlerService->curlAUrl($curlerUrl);
            $curlResult = json_decode($curler['body'], true);
            if (isset($curlResult['data'])) {
                // do some processing on the data
            }
            flush();
            sleep(1);
            var_dump($Widget->getName());
            var_dump(microtime(true) - $requestStartTime);
            flush();
        }
    });

    // Directly return the streamed response object
    return $response;
}
For further reading, see the Symfony documentation on streaming responses.
Hope this helps.

PHP cURL - thread safe?

I wrote a PHP script which retrieved data via libcurl and processed it. It worked fine but for performance reasons I changed it to use dozens of workers (threads). The performance improved by more than 50 times, however now php.exe is crashing every few minutes and the faulting module listed is php_curl.dll. I do have prior experience with multi-threading in C, but haven't used it at all before in php.
I googled around and supposedly cURL is thread safe (as of 2001):
http://curl.haxx.se/mail/lib-2001-01/0001.html
But I can't find any mention of whether or not php_curl is thread safe.
In case it matters, I am running php from the command line. My setup is Win7 x64, PHP 5.5.11 Thread Safe VC11 x86, PHP pthreads 2.0.4 for PHP 5.5 Thread Safe VC11 x86.
Here is some pseudocode to show what I am doing:
class MyWorker extends Worker
{
    // ...
    public function run()
    {
        // ...
        while (1) {
            // ...
            runCURL();
            // ...
            sleep(1);
        }
    }
}

function runCURL()
{
    static $curlHandle = null;
    // ...
    if (is_null($curlHandle)) {
        $curlHandle = curl_init();
        curl_setopt($curlHandle, CURLOPT_RETURNTRANSFER, TRUE);
        curl_setopt($curlHandle, CURLOPT_USERAGENT, "My User Agent String");
    }
    curl_setopt($curlHandle, CURLOPT_URL, "The URL");
    curl_setopt($curlHandle, CURLOPT_POSTFIELDS, $data);
    curl_setopt($curlHandle, CURLOPT_HTTPHEADER, $header);
    curl_setopt($curlHandle, CURLOPT_SSL_VERIFYPEER, false);
    $result = curl_exec($curlHandle);
    // ...
}
Firstly, resource types are officially unsupported by pthreads. A cURL handle is a resource, so you should not store cURL handles in the object scope of pthreads objects, since they might become corrupted.
Making it easy
pthreads provides an easy way to use workers...
The easiest way to execute among many threads is to use the built in Pool class provided by pthreads:
http://php.net/pool
The following code demonstrates how to pool a bunch of requests in a few background threads:
<?php
define("LOG", Mutex::create());

function slog($message, $args = []) {
    $args = func_get_args();
    if (($message = array_shift($args))) {
        Mutex::lock(LOG);
        echo vsprintf("{$message}\n", $args);
        Mutex::unlock(LOG);
    }
}

class Request extends Threaded {
    public function __construct($url, $post = []) {
        $this->url = $url;
        $this->post = $post;
    }

    public function run() {
        $curl = curl_init();
        curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curl, CURLOPT_URL, $this->url);
        if ($this->post) {
            curl_setopt($curl, CURLOPT_POSTFIELDS, $this->post);
        }
        $response = curl_exec($curl);
        slog("%s returned %d bytes", $this->url, strlen($response));
    }

    public function getURL()  { return $this->url; }
    public function getPost() { return $this->post; }

    protected $url;
    protected $post;
}

$max = 100;
$urls = [];
while (count($urls) < $max) {
    $urls[] = sprintf(
        "http://www.google.co.uk/?q=%s",
        md5(mt_rand() * count($urls)));
}

$pool = new Pool(4);
foreach ($urls as $url) {
    $pool->submit(new Request($url));
}
$pool->shutdown();

Mutex::destroy(LOG);
?>
Your specific task requires that you now process the data, you can either write this functionality into a design like the above ... or
Making it fancy
promises are a super fancy form of concurrency ...
Promises suit the nature of the task here:
First: Make a request
Then: Process response
The following code shows how to use pthreads/promises to make the same request and process responses:
<?php
namespace {
    require_once("vendor/autoload.php");

    use pthreads\PromiseManager;
    use pthreads\Promise;
    use pthreads\Promisable;
    use pthreads\Thenable;

    define("LOG", Mutex::create());

    function slog($message, $args = []) {
        $args = func_get_args();
        if (($message = array_shift($args))) {
            Mutex::lock(LOG);
            echo vsprintf("{$message}\n", $args);
            Mutex::unlock(LOG);
        }
    }

    /* will be used by everything to report errors when they occur */
    trait ErrorManager {
        public function onError(Promisable $promised) {
            slog("Oh noes: %s\n", (string) $promised->getError());
        }
    }

    class Request extends Promisable {
        use ErrorManager;

        public function __construct($url, $post = []) {
            $this->url = $url;
            $this->post = $post;
            $this->done = false;
        }

        public function onFulfill() {
            $curl = curl_init();
            curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($curl, CURLOPT_URL, $this->url);
            if ($this->post) {
                curl_setopt($curl, CURLOPT_POSTFIELDS, $this->post);
            }
            $this->response = curl_exec($curl);
        }

        public function getURL()      { return $this->url; }
        public function getPost()     { return $this->post; }
        public function getResponse() { return $this->response; }
        public function setGarbage()  { $this->garbage = true; }
        public function isGarbage()   { return $this->garbage; }

        protected $url;
        protected $post;
        protected $response;
        protected $garbage;
    }

    class Process extends Thenable {
        use ErrorManager;

        public function onFulfilled(Promisable $request) {
            slog("%s returned %d bytes\n",
                $request->getURL(), strlen($request->getResponse()));
        }
    }

    /* some dummy urls */
    $max = 100;
    $urls = [];
    while (count($urls) < $max) {
        $urls[] = sprintf(
            "http://www.google.co.uk/?q=%s",
            md5(mt_rand() * count($urls)));
    }

    /* initialize manager for promises */
    $manager = new PromiseManager(4);

    /* create promises to make and process requests */
    while (@++$id < $max) {
        $promise = new Promise($manager, new Request($urls[$id], []));
        $promise->then(
            new Process($promise));
    }

    /* force the manager to shutdown (fulfilling all promises first) */
    $manager->shutdown();

    /* destroy mutex */
    Mutex::destroy(LOG);
}
?>
Composer:
{
    "require": {
        "krakjoe/promises": ">=1.0.2"
    }
}
Note that Request has hardly changed; all that has been added is somewhere to hold the response and a means of detecting whether the objects are garbage.
For details on garbage collection from pools, which applies to both examples:
http://php.net/pool.collect
The slog function exists only to make logged output readable
Making it clear
pthreads is not a new PDO driver ...
Many people approach pthreads as they would a new PDO driver - they assume it works like the rest of PHP and that everything will be fine.
Everything might not be fine, and that requires research: we are pushing the envelope, and in doing so some "restrictions" must be placed upon the architecture of pthreads to maintain stability; this can have some strange side effects.
While pthreads comes with exhaustive documentation in the PHP manual, most of it including examples, I'm not able to attach the following document to the manual yet.
The following document provides you with an understanding of the internals of pthreads. Everyone should read it; it's written for you.
https://gist.github.com/krakjoe/6437782

Using PHPUnit to test cookies and sessions, how?

With PHPUnit it's quite easy to test raw PHP code, but what about code that heavily relies on cookies? Sessions could be a good example.
Is there a method that doesn't require me to setup $_COOKIE with data during my test? It feels like a hacky way of doing things.
This is a common problem with code, especially legacy PHP code. The common technique is to further abstract the COOKIE/SESSION variables in related objects and use inversion of control techniques to pull those dependencies into scope.
http://martinfowler.com/articles/injection.html
Now, before you execute a test, you would instantiate a mock version of a Cookie/Session object and provide default data.
I imagine the same effect can be achieved with legacy code by simply overriding the superglobal value before executing the test.
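For illustration, a minimal sketch of that abstraction (all class and method names here are hypothetical):

// Hypothetical abstraction over $_SESSION so tests can inject a fake.
interface SessionStorage {
    public function get($key);
    public function set($key, $value);
}

class NativeSession implements SessionStorage {
    public function get($key) { return isset($_SESSION[$key]) ? $_SESSION[$key] : null; }
    public function set($key, $value) { $_SESSION[$key] = $value; }
}

class ArraySession implements SessionStorage {
    private $data = [];
    public function get($key) { return isset($this->data[$key]) ? $this->data[$key] : null; }
    public function set($key, $value) { $this->data[$key] = $value; }
}

// Production code depends on the interface, never on $_SESSION directly.
class Auth {
    private $session;
    public function __construct(SessionStorage $session) { $this->session = $session; }
    public function isLoggedIn() { return $this->session->get('user_id') !== null; }
}

// In a test, inject the in-memory implementation instead of touching superglobals:
$session = new ArraySession();
$session->set('user_id', 42);
$auth = new Auth($session);
// $this->assertTrue($auth->isLoggedIn());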
Cheers,
Alex
I understand this is quite old, but I believe this needs to be updated, as technology has improved since the original post. I was able to get sessions working with this solution using PHP 5.4 and PHPUnit 3.7:
class UserTest extends \PHPUnit_Framework_TestCase {
    //....

    public function __construct() {
        // Buffer output so session headers can still be sent during tests.
        ob_start();
    }

    protected function setUp() {
        $this->object = new \User();
    }

    public function testUserLogin() {
        $this->object->setUsername('test');
        $this->object->setPassword('testpw');
        // sets the session within:
        $this->assertEquals(true, $this->object->login());
    }
}
I found that I could use PHPUnit to test the behavior of the part of my website that relies heavily on sessions, through a combination of Curl and a cookie that passes the session id.
The following Curl class uses the CURLOPT_COOKIE option to pass a session parameter. The static variable $sessionid saves the session between different Curl calls. Further, sessions can be changed using the static function changeSession.
class Curl {
    private $ch;
    private $header;
    private $response;
    private static $sessionid;

    public function __construct($url, $options) {
        $this->ch = curl_init($url);
        if (!self::$sessionid)
            self::$sessionid = .. generateRandomString() ..;
        $options = $options + array(
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_COOKIE => 'PHPSESSID=' . self::$sessionid);
        foreach ($options as $key => $val) {
            curl_setopt($this->ch, $key, $val);
        }
    }

    public function getResponse() {
        if ($this->response) {
            return $this->response;
        }
        $response = curl_exec($this->ch);
        $error = curl_error($this->ch);
        $errno = curl_errno($this->ch);
        $header_size = curl_getinfo($this->ch, CURLINFO_HEADER_SIZE);
        $this->header = substr($response, 0, $header_size);
        $response = substr($response, $header_size);
        if (is_resource($this->ch)) {
            curl_close($this->ch);
        }
        if (0 !== $errno) {
            throw new \RuntimeException($error, $errno);
        }
        return $this->response = $response;
    }

    public function __toString() {
        return $this->getResponse();
    }

    public static function changeSession() {
        self::$sessionid = Practicalia::generateRandomString();
    }
}
An example call
$data = array(
    'action' => 'someaction',
    'info' => 'someinfo'
);
$curl = new Curl(
    'http://localhost/somephp.php',
    array(
        CURLOPT_POSTFIELDS => http_build_query($data)));
$response = $curl->getResponse();
Any subsequent Curl calls will automatically use the same session as the previous one, unless Curl::changeSession() is specifically called.
