I have a server A and multiple servers behind it on a local network (only server A has a connection to the outside).
Server A has a webpage that makes multiple AJAX requests to a script like this:
//Get myserverB_IP from database
...
$url = "https://myserverB_IP/someurl.php";
$ch = curl_init();
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_FRESH_CONNECT, false);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/sess.txt');
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/sess.txt');
curl_exec($ch);
...
With HTTP everything worked fine, but after switching to HTTPS there is a big delay, because every single AJAX call triggers a new key exchange with server B.
Is there any way to reuse the connection so the key exchange is not repeated on every request?
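For illustration, a minimal sketch of what I mean by reusing the connection, assuming several requests can be made from one PHP process (the second endpoint name is made up):
// Sketch only: create the handle once and reuse it for every request made in
// one PHP process, so cURL can keep the TCP/TLS connection to server B open
// instead of negotiating a new one each time.
$ch = curl_init();
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/sess.txt');
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/sess.txt');

$urls = array(
    "https://myserverB_IP/someurl.php",
    "https://myserverB_IP/otherurl.php", // hypothetical second endpoint
);

foreach ($urls as $url) {
    curl_setopt($ch, CURLOPT_URL, $url);
    $response = curl_exec($ch); // after the first call the connection is reused
    // ... handle $response ...
}

curl_close($ch);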
I am currently working on a file hosting premium link generator. Basically it will be a website where you can get a premium link for uptobox, rapidgator, uploaded.net and other file hosts without purchasing a premium account: we purchase the accounts on behalf of the users and offer the service at a low price. While setting up the API for Rapidgator direct download links I was able to get the link, but I kept getting a "session is over" error. I am calling the API through a debugging tool rather than writing the code manually, and that is where I am facing this problem.
I have been following the Rapidgator API reference from this gist: https://gist.github.com/Chak10/f097b77c32a9ce83d05ef3574a30367d
With my debugging software I get a success response, but when I open the resulting URL in my browser it shows "Session Id Failed".
Here are the steps I am taking:
I send a POST request to https://rapidgator.net/api/user/login with my username and password, and I get this output:
{"response":{"session_id":"g8a13f32hr4cbbbo54qdigrcb3","expire_date":1542688501,"traffic_left":"13178268723435"},"response_status":200,"response_details":null}
Then I send a GET request (I tried a POST request too, but the same thing happened) to this URL, with the session ID and the file URL embedded: https://rapidgator.net/api/file/download?sid=&url=
and I get this output:
{"response":{"url":"http:\/\/pr56.rapidgator.net\/\/?r=download\/index&session_id=uB9st0rVfhX2bNgPrFUri01a9i5xmxan"},"response_status":200,"response_details":null}
When I try to download the file from that URL in my browser, it says "Invalid Session" and sometimes gives a "too many open connections" error.
Link to the error: https://i.imgur.com/wcZ2Rh7.png
Success response: https://i.imgur.com/MqTsB8Q.png
Rapidgator's API needs to be hit three times with different URLs: once to log in, once to request a download link, and once to fetch the file itself.
$cookie = $working_dir.rand();
$headers = array("header"=>"Referer: https://rapidgator.net");
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://rapidgator.net/api/user/login");
curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate');
curl_setopt($ch, CURLOPT_POSTFIELDS, "username=email@domain.ext&password=myplaintextpassword");
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_VERBOSE, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
$result = curl_exec($ch);
curl_close($ch);
$rapidgator_json = json_decode($result, true);
return array($rapidgator_json['response']['session_id'], $cookie);
http://rapidgator.net/api/user/login (this is the initial login)
The above link gives you the session ID you need; the response is in JSON.
Now we need to request a download link that lets us download without going through a human login form, so we use the API again, passing the initial session ID we got from the first URL.
$headers = array("header"=>"Referer: http://rapidgator.net/api/user/login");
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://rapidgator.net/api/file/download?sid=$rapidgator_session&url=$rapidgator_file");
curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'GET');
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_VERBOSE, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_COOKIEJAR, $working_dir.$rapidgator_cookie);
curl_setopt($ch, CURLOPT_COOKIEFILE, $working_dir.$rapidgator_cookie);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
$result = curl_exec($ch);
curl_close($ch);
$rapidgator_json = json_decode($result, true);
return array($rapidgator_json['response']['url']);
Basically, we pass the session ID Rapidgator gave us (assuming you logged in with a valid account), together with the source URL of the file you want: http://rapidgator.net/api/file/download?sid=$rapidgator_session&url=$rapidgator_file
After that, Rapidgator returns a JSON response with a URL you can use to obtain the file in question. You can then use whatever download method you want, but note that the link is a session URL and is only valid for a short period of time.
$rapidgator_json['response']['url']
All the code above is somewhat incomplete; some extra checks on the JSON responses for possible errors/limits are recommended. I used functions on my end, but this is enough to show what you should be doing. The Rapidgator API returns other data that can be useful, such as whether you have gone over your daily quota, how long the session URL will last, and so on.
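For example, a minimal check could look something like this (just a sketch: the field names come from the JSON responses shown above, and throwing an exception is simply my choice of error handling):
// Sketch: validate the JSON response before using the returned URL.
// $result is the raw body returned by curl_exec() in the snippets above.
function rapidgator_extract_url($result) {
    $json = json_decode($result, true);
    if (!is_array($json) || $json['response_status'] !== 200) {
        // response_details usually explains the problem (bad login, limits, ...)
        $details = isset($json['response_details']) ? $json['response_details'] : 'unknown error';
        throw new RuntimeException('Rapidgator API error: ' . print_r($details, true));
    }
    return $json['response']['url'];
}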
I want to build a PHP application that uses a RESTful SharePoint web service (HTTPS with simple HTTP authentication).
I made a request with cURL and everything works, but it takes a very long time (about 3 seconds). After a little debugging I found out that on each request the authentication takes most of that time, because it goes through several redirects and so on.
This does not happen in the browser: after the first request the authentication is persistent.
What I tried to do is mimic this with two different calls: one to get the WSS_KeepSessionAuthenticated cookie, and another to actually call the web service. I store the cookie in a session variable and use it for all subsequent calls to the web service.
The problem is that the second request (the one that uses the cookie) does not authenticate (HTTP 401).
Here is the first request that successfully gets the cookie:
$ch = curl_init('https://server/default.aspx');
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_NTLM);
curl_setopt($ch, CURLOPT_USERPWD, $username . ':' . $password);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
This is the second request that tries to use the cookie to achieve persistent authentication:
$ch = curl_init('https://server/somewebservice/');
curl_setopt($ch, CURLOPT_VERBOSE, 1);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLINFO_HEADER_OUT, 1);
curl_setopt($ch, CURLOPT_COOKIE, implode(';', $session_key));
This gives a 401.
Is there something I am missing?
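One alternative I am considering, in case the cookie alone is not enough: since NTLM authenticates the underlying connection rather than a cookie, perform both calls on the same handle so the authenticated connection is reused. This is only a sketch; the server names and credentials are placeholders as above.
// Sketch only: reuse the same handle for both calls, because NTLM ties the
// authentication to the connection itself.
$ch = curl_init('https://server/default.aspx');
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_NTLM);
curl_setopt($ch, CURLOPT_USERPWD, $username . ':' . $password);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch); // first request performs the NTLM handshake

// Second request on the SAME handle: the already authenticated connection is reused.
curl_setopt($ch, CURLOPT_URL, 'https://server/somewebservice/');
$body = curl_exec($ch);
curl_close($ch);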
Thank you
I'm making an app that deals with an API.
I'm calling the API using cURL, and the issue is that the API sees the cookies of the server that makes the cURL call, not the cookies of the client using the app.
For example: the client is already logged in to the API,
but when I check whether the client is logged in via a cURL call, the API responds that I am not authorized.
So how can I make the API see the client's cookies, or pass the client's cookie along to the API?
You'll want to use a cookiefile, like so:
# Set the cookiefile name, which will allow us to store the cookie and present it later for all requests that require it...
$cookiefile = tempnam("/tmp", "cookies");
$agent = "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)";
# Set the user name and password values
$user = $argv[1];
$password = $argv[2];
# The API url, to do the login
$url = "https://some_site.com/login.php?WID=$user&PW=$password";
# Initialise CURL
$ch = curl_init();
# Set all the various options
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_USERAGENT, $agent);
curl_setopt($ch, CURLOPT_POST, 0); // use GET for the login request
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookiefile);
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookiefile);
curl_setopt($ch, CURLOPT_USERPWD, $user.":".$password);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
# Execute the first CURL request to perform the login...
$results = curl_exec($ch);
# Setup our next request...
$job_id_number = $argv[3];
$url = "https://some_site.com/request.php?task=add&taskID=$job_id_number";
# We do not have to initialise CURL again, however we do need to adjust our URL option for the second request...
curl_setopt($ch, CURLOPT_URL, $url);
#Remember to use the same cookiefile as above
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookiefile);
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookiefile);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
# Execute the second CURL call to perform the action (using the cookie we retrieved from earlier)
$results = curl_exec($ch);
echo "$results";
You should check the documentation of the API; maybe it describes a way to authorize the client.
The way you are doing it won't work, because the cookies are set client-side (in the browser) and not on your server, so when you make a cURL call the cookie data isn't included in the request.
An API call via JavaScript could work, because that request is sent by the client itself (with the needed cookies).
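That said, if the app can get hold of the client's API session cookie by some other means (for example, the client sends it to your server explicitly), you can attach it to the cURL request yourself. This is only a sketch; the parameter name, cookie name and URL are made up:
// Sketch: forward a session cookie the client has explicitly handed to this
// script. 'api_session', 'session' and the URL are hypothetical names.
$clientSession = isset($_POST['api_session']) ? $_POST['api_session'] : '';

$ch = curl_init('https://api.example.com/check-login');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIE, 'session=' . $clientSession);
$response = curl_exec($ch);
curl_close($ch);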
I am trying to access an API using cURL.
I can access the API from my browser, but I cannot get the data from the same API (using the same API key) via cURL.
I am getting this error:
403 Developer Over Qps
Please let me know what the reason for this could be. It was working earlier; I have been facing this issue for the past two days.
Please check the code below:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://api.perfb.com/api/api.php?requestmethod=json&responsemethod=xml');
curl_setopt($ch, CURLOPT_TIMEOUT, 900);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
curl_setopt($ch, CURLOPT_FAILONERROR, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $vJson);
$response = curl_exec($ch);
$info = curl_getinfo($ch);
echo '<pre>';
print_r($info);
exit;
Qps means Queries Per Second.
Are you hitting the server repeatedly with cURL, in a loop for example? Try adding a pause after each call and see if that works.
That error usually means you are hitting the server too often (i.e. the developer is over the allowed queries per second). Slow down your code and put some delays in; in the browser you are doing it manually, so it is likely quite a bit slower than your code.
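For instance, a delay between calls could look something like this (just a sketch: $ch is the configured handle from the question, $jsonPayloads is a hypothetical array of request bodies, and the 250 ms value is an arbitrary starting point):
// Sketch: throttle repeated API calls so the allowed queries per second are
// not exceeded. The 250 ms delay is an arbitrary example value.
foreach ($jsonPayloads as $vJson) {
    curl_setopt($ch, CURLOPT_POSTFIELDS, $vJson);
    $response = curl_exec($ch);
    // ... process $response ...
    usleep(250000); // pause 250 ms before the next request
}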
For instance, take the following cURL snippet:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url); //set target URL
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);// allow redirects
curl_setopt($ch, CURLOPT_POST, $usePost); // set POST method
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers); //set headers
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_HEADER, $returnHeaders);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE); //prevent unverified SSL error
Before I run curl_exec on it, how can I see the full request headers and body before they are sent (to check whether the request correctly follows certain REST API guidelines)?
You could send a request to the local server:
$test_url = 'http://localhost/nonexistent-page';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $test_url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLINFO_HEADER_OUT, true);
// Other options.
curl_exec($ch);
echo nl2br(curl_getinfo($ch, CURLINFO_HEADER_OUT));
This will give you the request headers, with only the request line path and the Host: line being different from your actual request.
If you have access to a graphical environment on your server, you could use Wireshark to examine the network packets being sent and received. Wireshark lets you apply filters to isolate specific IP addresses and protocols.
For instance, I use this filter to see all the traffic from my cURL requests/responses to the server with IP w.x.y.z (substitute with the ip of the server you are connecting to):
ip.addr == w.x.y.z && http
I can then examine all my requests and responses.
This has given me great insight into what's happening 'under the hood'.
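Another option that doesn't need extra tooling is cURL's own verbose mode, which writes the outgoing request headers and connection/TLS details to a stream of your choice. A small sketch (the URL is a placeholder):
// Sketch: capture cURL's verbose log (sent request headers, connection and
// TLS details) in memory instead of using an external sniffer.
$ch = curl_init('https://example.com/api'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$log = fopen('php://temp', 'w+');
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_STDERR, $log);

curl_exec($ch);

rewind($log);
echo stream_get_contents($log); // lines beginning with "> " are the sent headers
fclose($log);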