How to reduce bandwidth usage on PHP cURL requests

I have made a PHP app that works with an API, but to send requests to the API I use cURL with a proxy, and that proxy has limited bandwidth. Can someone please tell me about some practices that may help me reduce bandwidth usage?
P.S. I already know about CURLOPT_NOBODY, which helps reduce the bandwidth usage a bit, but I need to save more bandwidth.
My current code:
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'My User Agent');
curl_setopt($ch, CURLOPT_PROXY, '209.XXX.XXX.XX:8081');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postfields);
curl_exec($ch);
curl_close($ch);
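Not from the original post, but as an illustration of other options that can cut transfer size: asking the server for a compressed response, and sending the POST body as a urlencoded string rather than an array (an array makes curl use the larger multipart/form-data format). A minimal sketch, assuming the API supports gzip and that $postfields is an array:
// Hypothetical sketch of a smaller request/response cycle.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Advertise gzip/deflate support; curl decompresses the response transparently.
curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate');
curl_setopt($ch, CURLOPT_USERAGENT, 'My User Agent');
curl_setopt($ch, CURLOPT_PROXY, '209.XXX.XXX.XX:8081');
curl_setopt($ch, CURLOPT_POST, true);
// A urlencoded string keeps the request body compact.
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($postfields));
$response = curl_exec($ch);
curl_close($ch);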

Related

Rapidgator API Direct Download Link Error

I am currently working on a file hosting premium link generator. Basically, it will be a website from which you can get a premium link for uptobox, rapidgator, uploaded.net and other file host sites without purchasing a premium account: we purchase the accounts on behalf of the users and offer the service at a low price. When I was setting up the Rapidgator direct download link API I was able to get the link, but I kept getting "session is over". I was trying the API via a debugging tool, not via manual coding, and I am facing this problem.
I have been using the Rapidgator API reference from this site: https://gist.github.com/Chak10/f097b77c32a9ce83d05ef3574a30367d
I am doing the following with my debugging software and I get a success response, but when I open that URL in my browser it says the session id failed.
Here are the steps I am doing:
Sending a POST request to https://rapidgator.net/api/user/login with username and password, and I am getting this output:
{"response":{"session_id":"g8a13f32hr4cbbbo54qdigrcb3","expire_date":1542688501,"traffic_left":"13178268723435"},"response_status":200,"response_details":null}
Now I am sending a GET request (I tried a POST request too, but the same thing happened) to this URL, with the session id and file URL passed as query parameters: https://rapidgator.net/api/file/download?sid=&url=
and I am getting this output:
{"response":{"url":"http:\/\/pr56.rapidgator.net\/\/?r=download\/index&session_id=uB9st0rVfhX2bNgPrFUri01a9i5xmxan"},"response_status":200,"response_details":null}
When I try to download the file from that URL through my browser, it says Invalid Session and sometimes gives a too many open connections error.
Link of the error: https://i.imgur.com/wcZ2Rh7.png
Success response: https://i.imgur.com/MqTsB8Q.png
Rapidgator needs its API to be hit three times with different URLs.
$cookie = $working_dir.rand();
$headers = array("Referer: https://rapidgator.net");
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://rapidgator.net/api/user/login");
curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate');
curl_setopt($ch, CURLOPT_POSTFIELDS, "username=email@domain.ext&password=myplaintextpassword");
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_VERBOSE, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
$result = curl_exec($ch);
curl_close($ch);
$rapidgator_json = json_decode($result, true);
// Return the session id (needed for the download call) and the cookie file path.
return array($rapidgator_json['response']['session_id'], $cookie);
http://rapidgator.net/api/user/login (this is the initial login)
The above link gives you the session id that you need. The response is in JSON.
Now we need to request a download link that will let us download without having to go through a human input form. So we use the API to request a download link, using the initial session id we got from the first URL.
$headers = array("header"=>"Referer: http://rapidgator.net/api/user/login");
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://rapidgator.net/api/file/download?sid=$rapidgator_session&url=$rapidgator_file");
curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'GET');
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_VERBOSE, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_COOKIEJAR, $working_dir.$rapidgator_cookie);
curl_setopt($ch, CURLOPT_COOKIEFILE, $working_dir.$rapidgator_cookie);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
$result = curl_exec($ch);
curl_close ($ch);
$rapidgator_json = json_decode($result,true);
return array($rapidgator_json['response']['url']);
Basically, we pass the session id Rapidgator gave us, assuming you have logged in with a valid account, and include the source URL of the file you want: http://rapidgator.net/api/file/download?sid=$rapidgator_session&url=$rapidgator_file
After that, Rapidgator returns a JSON response with a URL you can use to obtain the file in question. This lets you use whatever download method you want, but note that the link is a session URL and is only valid for a short period of time.
$rapidgator_json['response']['url']
All the code above is somewhat incomplete. Some extra checks on the JSON responses for possible errors/limits are recommended. I used functions on my end, but this is enough for you to see what you should be doing. The Rapidgator API also returns other data that can be useful for determining whether you have gone over your daily quota, how long the session URL will last, and so on.
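Not part of the original answer, but a minimal sketch of the kind of checks mentioned above, assuming the response shape shown in the question's JSON samples (the helper name is made up):
// Hypothetical helper: validate a Rapidgator-style JSON response before using it.
function check_rapidgator_response($result)
{
    $json = json_decode($result, true);
    if (!is_array($json)) {
        return null; // not JSON at all (network error, HTML error page, ...)
    }
    if (!isset($json['response_status']) || $json['response_status'] != 200) {
        // response_details appears to carry the error/limit message
        $details = isset($json['response_details']) ? $json['response_details'] : 'unknown';
        error_log('Rapidgator error: ' . print_r($details, true));
        return null;
    }
    return $json['response'];
}

// Usage after either curl call:
$response = check_rapidgator_response($result);
if ($response === null) {
    // abort or retry instead of blindly using session_id / url
}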

How to send multiple requests in one curl open connection

I have an array with approximately 45k usernames in it. I want to hit a URL using cURL that returns a response for each of those usernames. The issue is that I want to achieve this in less time.
$username = ['123','456','789'....]; // up to 45k entries
for ($i = 0; $i < sizeof($username); $i++)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, 'http://abc.com.pk/hxc/get_user_details.php?uname=' . $username[$i]);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_USERAGENT, $ua);
    curl_setopt($ch, CURLOPT_AUTOREFERER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 20);
    curl_setopt($ch, CURLOPT_HTTPGET, true);
    $result = curl_exec($ch);
    curl_close($ch);
}
The above code depicts what I am doing right now, but as there are a large number of usernames it takes a lot of time to return all the responses. Is there any way I can achieve it in less time?
Have a look at https://github.com/php-curl-class/php-curl-class ; it speeds up our curl requests a lot.
It has multi-curl support and it's very easy to use.
As for your question about time, you can set the timeout using
Curl::setTimeout($seconds)
or, in the case of MultiCurl,
MultiCurl::setTimeout($seconds)
You can extend the timeout as much as needed.
You can use curl_multi_init and curl_multi_exec so that your requests are processed asynchronously.
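Neither answer shows the actual multi-handle calls, so here is a minimal sketch of curl_multi_init/curl_multi_exec with batching, reusing the endpoint and $username array from the question (the batch size of 50 is an arbitrary assumption):
// Process the usernames in parallel batches instead of one request at a time.
$results = array();
foreach (array_chunk($username, 50) as $batch) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($batch as $name) {
        $ch = curl_init('http://abc.com.pk/hxc/get_user_details.php?uname=' . urlencode($name));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$name] = $ch;
    }
    // Drive all handles in this batch until every transfer has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status == CURLM_OK);
    foreach ($handles as $name => $ch) {
        $results[$name] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}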

PHP cURL causing massive server load on error?

I'm currently making a PHP site that gets its data from an API. At first cURL seemed to handle that perfectly, but if the API returns an empty response (and we make the request again, because we can't build a correct response without it), it seems to spawn a child process. This didn't use much CPU while developing, but in production it can get as high as 150% CPU load.
Code used to get data from the API:
while (empty($output)) {
    $ch = curl_init();
    set_curl($ch);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
    curl_setopt($ch, CURLOPT_TIMEOUT, 8);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 4);
    curl_setopt($ch, CURLOPT_URL, "http://domain.com");
    $output = curl_exec($ch);
}
Is there any way to fix this?
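No answer is included in this excerpt, but for illustration: the usual fix for a loop like this is to cap the number of retries, back off between attempts, and close each handle (note the original loop never calls curl_close(), so failed handles also pile up). A minimal sketch reusing the set_curl() helper from the question:
$output = false;
$attempts = 0;
while (($output === false || $output === '') && ++$attempts <= 5) {
    $ch = curl_init();
    set_curl($ch);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 8);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 4);
    curl_setopt($ch, CURLOPT_URL, "http://domain.com");
    $output = curl_exec($ch);
    curl_close($ch); // free the handle even when the request failed
    if ($output === false || $output === '') {
        usleep(250000 * $attempts); // simple backoff: 0.25s, 0.5s, 0.75s, ...
    }
}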

SharePoint web service persistent authentication using PHP

I want to build a PHP application that uses a RESTful SharePoint web service (HTTPS with simple HTTP authentication).
I made a request with cURL and everything works, but it takes a very long time (3 s). After a little debugging I found out that on each request the authentication takes most of that time, because it involves several redirects and so on.
This does not happen in the browser: after the first request the authentication is persistent.
What I tried to do is mimic this with two different calls: one to get the WSS_KeepSessionAuthenticated cookie and another to actually call the web service. I store the cookie in a session variable and use it for all subsequent calls to the web service.
The problem is that the second request (the one that uses the cookie) does not authenticate (HTTP 401).
Here is the first request that successfully gets the cookie:
$ch = curl_init('https://server/default.aspx');
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_NTLM);
curl_setopt($ch, CURLOPT_USERPWD, $username . ':' . $password);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch); // the WSS_KeepSessionAuthenticated cookie arrives in the response headers
curl_close($ch);
This is the second request that tries to use the cookie to achieve persistent authentication:
$ch = curl_init('https://server/somewebservice/');
curl_setopt($ch, CURLOPT_VERBOSE, 1);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLINFO_HEADER_OUT, 1);
curl_setopt($ch, CURLOPT_COOKIE, implode(';', $session_key));
$result = curl_exec($ch);
curl_close($ch);
This gives a 401.
Is there something I am missing?
Thank you.
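Not an answer from the original thread, but one hedged observation plus a sketch: NTLM authenticates the underlying connection, so the cookie by itself may not satisfy a new connection that still gets challenged. Letting curl manage the cookie jar and keeping the credentials available on the follow-up request is one thing worth trying (the file path and variable names are placeholders):
$cookieJar = '/tmp/sp_cookies.txt'; // example path, must be writable

// First request: authenticate via NTLM and let curl store the
// WSS_KeepSessionAuthenticated cookie in the jar.
$ch = curl_init('https://server/default.aspx');
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_NTLM);
curl_setopt($ch, CURLOPT_USERPWD, $username . ':' . $password);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookieJar);
// add the CURLOPT_SSL_VERIFY* options from the question if the certificate is self-signed
curl_exec($ch);
curl_close($ch);

// Second request: reuse the stored cookie and keep NTLM available as a fallback.
$ch = curl_init('https://server/somewebservice/');
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_NTLM);
curl_setopt($ch, CURLOPT_USERPWD, $username . ':' . $password);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookieJar);
$result = curl_exec($ch);
curl_close($ch);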

API hit using CURL

I am trying to access an API using cURL.
I can access the API from my browser, but I cannot get the data from the same API (using the same API key) via cURL.
I am getting this error:
403 Developer Over Qps
Please let me know what the reason for this could be.
It was working earlier; I have been facing this issue for the past two days.
Please check the code below:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://api.perfb.com/api/api.php?requestmethod=json&responsemethod=xml');
curl_setopt($ch, CURLOPT_TIMEOUT, 900);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
curl_setopt($ch, CURLOPT_FAILONERROR, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $vJson);
$response = curl_exec($ch);
$info = curl_getinfo($ch);
echo '<pre>';
print_r($info);exit;
Qps means Queries Per Second.
Are you hitting the server repeatedly with cURL, in a loop for example? Try adding a pause after each call and see if that works.
That error usually means you're hitting the server too often (i.e. the developer is over the allowed queries per second). Slow down your code and put some delays in. In the browser you're making the requests manually, so you're likely quite a bit slower than your code.
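As an illustration of the delay both answers suggest, a minimal sketch that spaces the calls out (the allowed rate is an assumption, and $requests stands in for whatever produces each $vJson payload):
// Keep the request rate under the API's queries-per-second limit.
$maxPerSecond = 1; // assumption: adjust to the limit your API key actually has
foreach ($requests as $vJson) {
    $ch = curl_init('http://api.perfb.com/api/api.php?requestmethod=json&responsemethod=xml');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $vJson);
    $response = curl_exec($ch);
    curl_close($ch);
    usleep((int) (1000000 / $maxPerSecond)); // pause so consecutive calls stay under the limit
}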
