How to get a response from a server using curl in PHP

I am using cURL to get data from a server. The way it works is like the following:
A device sends data to a routing application that runs on the server.
To get the data from the routing application, clients must ask with a GET request specifying the server address, port and a parameter.
Once a client is connected, the application starts sending data to the connected clients on every new packet that arrives from the device.
Now let's see the code I run to get the response:
<?php
$curl = curl_init('http://192.168.1.4/online?user=dneb');
curl_setopt($curl, CURLOPT_PORT, 1818);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, TRUE);
$result = curl_exec($curl);
curl_close($curl);
echo $result;
With this cURL request I can get the response data from the routing application. But the routing application never stops sending data to connected clients, so I only get the result once I close the routing application, and then it echoes all the data at once. Now my question is: how can I echo each piece of data without closing the connection, and without the connection being closed by the routing application? That is, when data is received, display it immediately, without any conditions. You can also suggest other options for forwarding this data to another server over TCP. Thanks!

An HTTP connection that never closes? I don't think PHP's curl bindings are suitable for that. But you could use the socket API:
$sock = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
socket_set_block($sock);
socket_connect($sock, "192.168.1.4", 1818);
// Hand-craft an HTTP/1.0 request so the server keeps streaming on one connection.
$data = implode("\r\n", array(
    'GET /online?user=dneb HTTP/1.0',
    'Host: 192.168.1.4',
    'User-Agent: PHP/' . PHP_VERSION,
    'Accept: */*'
)) . "\r\n\r\n";
socket_write($sock, $data);
// socket_read() returns false on error and an empty string once the
// peer closes the connection, so stop on either.
while (false !== ($read_last = socket_read($sock, 1)) && '' !== $read_last) {
    // do whatever
    echo $read_last;
}
var_dump("socket_read stopped, probably means the connection was closed.",
    "socket_last_error: ",
    socket_last_error($sock),
    "socket_strerror: ",
    socket_strerror(socket_last_error($sock))
);
socket_close($sock);
Or maybe even HTTP fopen:
$fp = fopen("http://192.168.1.4:1818/online?user=dneb", "rb");
stream_set_blocking($fp, 1);
// fread() returns false on error and an empty string at EOF, so stop on either.
while (!feof($fp) && false !== ($read_last = fread($fp, 1))) {
    // do whatever
    echo $read_last;
}
var_dump("fread stopped, probably means the connection was closed, last error: ", error_get_last());
fclose($fp);
(The http:// wrapper honours a port given in the URL, as above, so fopen is not limited to port 80. But this won't work if you have allow_url_fopen disabled in php.ini.)
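The asker also mentions forwarding the data to another server over TCP. As a rough sketch under assumptions (the destination host/port 203.0.113.5:2020 is a placeholder, not part of the question), the same read loop can relay each chunk to a second outbound socket using the stream API:
<?php
// Hedged sketch: relay the streaming HTTP response to another TCP server.
// 203.0.113.5:2020 is a placeholder destination - change it to your server.
$src = stream_socket_client("tcp://192.168.1.4:1818", $errno, $errstr, 10);
$dst = stream_socket_client("tcp://203.0.113.5:2020", $errno, $errstr, 10);
if (!$src || !$dst) {
    die("connect failed: $errstr");
}
// Send the request that starts the stream.
fwrite($src, "GET /online?user=dneb HTTP/1.0\r\nHost: 192.168.1.4\r\nAccept: */*\r\n\r\n");
// Forward each chunk as soon as it arrives.
while (!feof($src)) {
    $chunk = fread($src, 4096);
    if ($chunk === false || $chunk === '') {
        break;
    }
    fwrite($dst, $chunk);
}
fclose($src);
fclose($dst);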

Related

file_get_contents failed to open stream

Hello all, I am trying to call a web service from a file on Windows Server 2008.
I have connected to the server, installed XAMPP there, and placed all the required files.
This is my code to call the web service:
$result = file_get_contents("http://*******:8055/API.ashx?Method=Departure");
$json = json_decode($result, true);
$departure_count = count($json['Response']);
It gives me the correct response on localhost but not on the server. I have googled and they tell me that I should use cURL instead of file_get_contents.
Then I used this code:
$url = 'http://*******:8055/API.ashx?Method=Departure';
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_HEADER, false);
$data = curl_exec($curl);
curl_close($curl);
$json = json_decode($data, true);
$departure_count = count($json['Response']);
and it also gives me the response on localhost but not on the server.
The address to access the URL is: http://221.120.222.68:8080/wordpress/fare/
When I try to open $url in the browser, it gives me the response.
As the error message says, the stream (URL) requested cannot be opened.
There are many possible reasons for this:
1. The base URL is bad.
2. The username and/or password are bad.
3. The username/password do not have permission on the server.
4. Your system cannot reach the server (firewall, PHP permissions).
I would use the following strategy to debug:
1. Dump $url and write it down.
2. Use a browser with debug tools (eg Firefox/Firebug) and try to access that URL.
3. Look at the headers returned to see what error the server reports (if any).
4. Think about why that error is returned...
I have googled and they tell me that I should use cURL instead of file_get_contents.
Who are they? Certainly using curl should make it easier to diagnose the problem - but it won't reveal all the potential problems.
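For instance, a minimal diagnostic sketch (the URL below is the asker's masked endpoint, so treat it as a placeholder) that surfaces curl's own error message:
<?php
// Hedged sketch: let curl report exactly why the request failed.
$curl = curl_init('http://example.com:8055/API.ashx?Method=Departure'); // placeholder URL
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10);
$data = curl_exec($curl);
if ($data === false) {
    // curl_errno()/curl_error() distinguish DNS failures,
    // refused connections, timeouts, etc.
    echo 'curl error (' . curl_errno($curl) . '): ' . curl_error($curl);
}
curl_close($curl);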
While the answer from Rax has some good hints, you say that the code works on a different machine - so the issue is about how the server connects to the service. There are many reasons this could be a problem:
all outgoing connections may be blocked by design
there may be no route to the outside network
there may be no DNS service available
The first person you should be speaking to is whoever supports/provisions the server. Meanwhile, you could try deploying a simple script to attempt to resolve the hostname, and to attempt to retrieve content from a well-established site using HTTP (e.g. this one).
e.g.
<?php
$ip = gethostbyname('*******'); // the hostname you are trying to connect to
print "IP = " . var_export($ip, true) . "<br />";
$content = file_get_contents("http://stackoverflow.com");
if (false === $content) {
    // var_export with a true flag returns the dump instead of printing it
    print "failed to retrieve content - "
        . var_export($http_response_header, true);
} else {
    print "Successfully retrieved " . strlen($content)
        . " bytes from http://stackoverflow.com";
}

Why does a cURL connection fail (without error) if no timeout is set?

I have a PHP script that connects to a URL through cURL and then does something, depending on the returned HTTP status code:
$ch = curl_init();
$options = array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_URL => $url,
    CURLOPT_USERAGENT => "What?!?"
);
curl_setopt_array($ch, $options);
$out = curl_exec($ch);
$code = curl_getinfo($ch)["http_code"];
curl_close($ch);
if ($code == "200") {
    echo "200";
} else {
    echo "not 200";
}
Some web servers are slow to reply, and although the page loads in my browser after a few seconds, when my script tries to connect to such a server it tells me that it did not receive a positive ("200") reply. So, apparently, the connection initiated by cURL timed out.
But why? I don't set a timeout in my script, and according to other answers on this site the default timeout for cURL is definitely longer than the three or four seconds it takes for the page to load in my browser.
So why does the connection time out, and how can I get it to last longer, if, apparently, it is already set to infinite?
Notes:
The same URL doesn't always time out. So sometimes cURL can connect.
It is not one specific URL that sometimes times out, but different URLs at different times.
I'm on a shared server, so I don't have root access to any files.
I tried to look at curl_getinfo($ch) and curl_error($ch) – as per #drew010's suggestion in the comments – but both were empty whenever the problem happened.
The whole script runs for a little more than one minute. In this time it connects to 300+ URLs successfully. Even when one of the URLs fails, the other connections are successfully made. So the script does not time out.
cURL does not time out either, because when I try to connect to a URL whose script sleeps for 59 seconds before replying, cURL connects successfully. So apparently the slowness of the failing URL is not in itself a problem for cURL.
Update
Following #Karlos' suggestion in his answer, I used:
CURLOPT_VERBOSE => 1,
CURLOPT_STDERR => $curl_log
(using code from this answer) and found the following in $curl_log when a URL failed (URL and IP changed):
* About to connect() to www.somesite.com port 80 (#0)
* Trying 104.16.37.249... * connected
* Connected to www.somesite.com (104.16.37.249) port 80 (#0)
GET /wp_german/?feed=rss2 HTTP/1.1
User-Agent: myURL
Host: www.somesite.com
Accept: */*
* Recv failure: Connection reset by peer
* Closing connection #0
So, I have found the why – thank you #Karlos! – and apparently #Axalix was right and it is a network problem. I'll now follow suggestions given on this site for that kind of failure. Thanks to everyone for their help!
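For readers wanting to reproduce this, the verbose log above can be captured into a variable with something like the following sketch (the URL is a placeholder):
<?php
// Hedged sketch: write curl's verbose output to an in-memory stream.
$ch = curl_init('http://www.example.com/'); // placeholder URL
$log = fopen('php://temp', 'w+');           // temporary stream for the log
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_VERBOSE        => true,
    CURLOPT_STDERR         => $log,
));
curl_exec($ch);
rewind($log);
$curl_log = stream_get_contents($log);      // the kind of text shown above
fclose($log);
curl_close($ch);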
My experience working with curl has shown me that sometimes, when using the option:
CURLOPT_RETURNTRANSFER => true
the server might not give a successful reply, or at least not a successful reply within the timeframe curl has to receive the response and cache it, so the result returned by curl into the variable you assign may be empty. In your code:
$out = curl_exec($ch);
In this Stack Overflow question, CURLOPT_RETURNTRANSFER set to true doesnt work on hosting server, you can see that the option CURLOPT_RETURNTRANSFER is directly affected by the requested host's web server implementation.
As you are not explicitly using the response body, and your code relies only on the response headers, a good way to solve this might be to set:
CURLOPT_RETURNTRANSFER => false
and execute the curl code to work on the response headers.
Once you have the header with the code you are interested in, you could run a PHP script that echoes the curl response and parse it yourself:
<?php
$url = isset($_GET['url']) ? $_GET['url'] : 'http://www.example.com';
$ch = curl_init();
$options = array(
    CURLOPT_RETURNTRANSFER => false,
    CURLOPT_URL => $url,
    CURLOPT_USERAGENT => "myURL"
);
curl_setopt_array($ch, $options);
curl_exec($ch);
curl_close($ch);
?>
In any case, as to why your request does not produce an error, I guess that the CURLOPT_NOSIGNAL option and the different timeout options explained in the curl_setopt PHP manual might get you closer to an answer.
To dig further, the CURLOPT_VERBOSE option can give you extra information about the request's behaviour through STDERR.
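To illustrate those options, a hedged sketch with explicit (arbitrary) timeout values:
<?php
// Hedged sketch: set explicit timeouts rather than relying on defaults.
$ch = curl_init('http://www.example.com/'); // placeholder URL
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_NOSIGNAL       => 1,  // avoid signal-based timeouts in threaded SAPIs
    CURLOPT_CONNECTTIMEOUT => 10, // seconds allowed for the TCP connect
    CURLOPT_TIMEOUT        => 30, // seconds allowed for the whole transfer
));
$out = curl_exec($ch);
if ($out === false) {
    echo 'curl error: ' . curl_error($ch);
}
curl_close($ch);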
The reason may be that your hosting provider is imposing limits on outgoing connections.
Here is what can be done to make your script more robust:
Create a queue in the DB with all the URLs that need to be fetched.
Run a cron job every minute or every 5 minutes; take a few URLs from the DB and mark them as in progress.
Try to fetch those URLs. Mark every successfully fetched URL in the DB.
Increment a failure count for the unsuccessful ones.
Keep going through the queue until it is empty.
If you implement such a solution, you will be able to process every single URL even under unfavourable conditions.
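A minimal sketch of such a worker, assuming a hypothetical url_queue table with url, status and failures columns (the schema and credentials are made up for illustration):
<?php
// Hedged sketch: process a DB-backed URL queue from cron.
// Hypothetical schema: url_queue(id, url, status, failures)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query("SELECT id, url FROM url_queue WHERE status = 'pending' LIMIT 10")
            ->fetchAll(PDO::FETCH_ASSOC);
foreach ($rows as $row) {
    $pdo->prepare("UPDATE url_queue SET status = 'in_progress' WHERE id = ?")
        ->execute(array($row['id']));
    $ch = curl_init($row['url']);
    curl_setopt_array($ch, array(CURLOPT_RETURNTRANSFER => true, CURLOPT_TIMEOUT => 20));
    $ok = (curl_exec($ch) !== false);
    curl_close($ch);
    if ($ok) {
        $pdo->prepare("UPDATE url_queue SET status = 'done' WHERE id = ?")
            ->execute(array($row['id']));
    } else {
        // Put it back in the queue and count the failure.
        $pdo->prepare("UPDATE url_queue SET status = 'pending', failures = failures + 1 WHERE id = ?")
            ->execute(array($row['id']));
    }
}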

Calling function or script on another server and receiving response

What I essentially want to do is to be able to call a function or script on another server using PHP and receive a response. Let me set up an example:
On my web application, I have a page, and I need to send some data to the server, have the server do some work and then return a response.
Web application:
<?php
// call server script, sending it for example a string, "testString", then wait for a response
?>
Server script:
<?php
// get the string "testString", do some work on it and return it, displaying it on the web app
?>
I'm not looking for someone to complete this for me, just to point me in the right direction. I've read that cURL can be useful for this, but I have yet to get any of the examples to work for me, such as these:
http://www.jonasjohn.de/snippets/php/curl-example.htm
How to get response using cURL in PHP
So what's an ideal route to head in to solve my problem?
cURL operates just like a browser basically. It makes an HTTP request to a server and gets back the response. So one server is the 'client' and one server is the 'server'.
On the server server (lol), set up a page called index.php that outputs some text
<?php
echo 'hello from the server server';
Then from the client server, create a page called index.php to make the cURL request to the server server.
<?php
// init curl object
$ch = curl_init();
// define options
$optArray = array(
    CURLOPT_URL => 'http://www.serverserver.com', // <--- edit that URL
    CURLOPT_RETURNTRANSFER => true
);
// apply those options
curl_setopt_array($ch, $optArray);
// execute request and get response
$result = curl_exec($ch);
var_dump($result);
Then, when you go to the client server's URL, the client server will hit the server server, just as if it were a person using a browser over HTTP, and print the result to the screen.
Hope that helps. Didn't actually test it.
If you do not have control over both machines, or the host server does not provide you with an API for this, it is not doable. Otherwise it is as simple as setting up some code on the host server which receives commands from your client, then processes them and responds accordingly. Once that is set up, you can easily call your server code with either cURL or even file_get_contents.
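As a hedged sketch of what such receiving code might look like (the file name, parameter names and commands are all made up for illustration):
<?php
// api.php on the host server - a minimal, hypothetical command receiver.
header('Content-Type: application/json');
$cmd = isset($_GET['cmd']) ? $_GET['cmd'] : '';
switch ($cmd) {
    case 'uppercase':
        // Do some work on the input and send the result back.
        $input = isset($_GET['value']) ? $_GET['value'] : '';
        echo json_encode(array('result' => strtoupper($input)));
        break;
    default:
        http_response_code(400);
        echo json_encode(array('error' => 'unknown command'));
}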

PHP: Remote Function Call and returning the result?

I'm not very experienced with PHP. I want to know how to communicate between two web servers. To clarify: from the first server, run a function (a query) on a remote server, and return the result to the first server.
Actually the theme will be:
Web Server (1) ----------------> Web Server (2) ---------------> Database Server
Web Server (1) <---------------- Web Server (2) <--------------- Database Server
The query function() will be located only on Web Server (2). I need to run that query function() remotely from Web Server (1).
What is this called? And is it possible?
Yes.
A nice way I can think of doing this would be to send a request to the 2nd server via a URL. In the GET (or POST) parameters, specify which method you'd like to call, and (for security) some sort of hash that changes with time. The hash is there to ensure no third party can run the function arbitrarily on the 2nd server.
To send the request, you could use cURL:
function get_url($request_url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $request_url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
This sends a GET request. You can then use:
$request_url = 'http://second-server-address/listening_page.php?function=somefunction&securityhash=HASH';
$response = get_url($request_url);
On your second server, set up listening_page.php (with whatever filename you like, of course) so that it checks incoming GET requests and verifies the integrity of the request (i.e. the hash and correct, valid params).
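A rough sketch of what listening_page.php might look like, assuming a shared secret and an hour-granularity time window (both choices are illustrative, not part of the original answer):
<?php
// listening_page.php on server 2 - hypothetical hash-verified listener.
$secret = 'shared-secret';            // known to both servers
$function = isset($_GET['function']) ? $_GET['function'] : '';
$hash = isset($_GET['securityhash']) ? $_GET['securityhash'] : '';
// Time-based hash: valid only during the current hour (UTC).
$expected = hash_hmac('sha256', $function . gmdate('YmdH'), $secret);
if (!hash_equals($expected, $hash)) {
    http_response_code(403);
    exit('invalid hash');
}
if ($function === 'somefunction') {
    // Run the query and return its result.
    echo json_encode(array('result' => 'query output here'));
}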
You can do this by using an API. Create a page on the second server that takes variables and communicates with the server using those vars (depending on what you need), and have that page reply in a standard format, either JSON or XML. Then read that from server 1 by requesting the file and parsing the 2nd server's reply.
Note: if it's a private file, make sure you use an authentication method to prevent users from accessing it directly.
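For example, the reading side on server 1 could look like this (the URL and field names are illustrative):
<?php
// Server 1: request the API page on server 2 and decode its JSON reply.
$response = file_get_contents('http://second-server/api.php?cmd=status'); // placeholder URL
$data = json_decode($response, true);
if ($data !== null) {
    echo $data['result'];
}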
What you are aiming to do is definitely possible. You will need to set up some sort of API in order for server one to make a request to server two.
I suggest you read up on SOAP and REST APIs:
http://www.netmagazine.com/tutorials/make-your-own-soap-api
Generally you will use something like cURL to contact server 2 from server 1.
Google curl and you should quickly get the idea.
It's not going to be easy to give you a complete solution, so I hope this nudge in the right direction is helpful.

file_get_contents() GET request not showing up on my webserver log

I've got a simple PHP script to ping some of my domains using file_get_contents(); however, I have checked my logs and they are not recording any GET requests.
I have
$result = file_get_contents($url);
echo $url . " pinged ok\n";
where $url for each of the domains is just a simple string of the form http://mydomain.com/, as echo verifies. Manual requests made by myself do show up in the logs.
Why would the get requests not be showing in my logs?
Actually, I've got it to register the hit when I send $result to the browser. I guess this means the web server only records browser requests? Is there any way to mimic that in PHP?
OK, tried PHP curl:
// create curl resource
$ch = curl_init();
// set url
curl_setopt($ch, CURLOPT_URL, "getcorporate.co.nr");
//return the transfer as a string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// $output contains the output string
$output = curl_exec($ch);
// close curl resource to free up system resources
curl_close($ch);
Same effect though - no hit registered in the logs. So far it only registers when I feed the HTTP response from my script back to the browser. Obviously this will only work for a single request, and not a bunch, which is the purpose of my script.
If something else is going wrong, what debugging output can I look at?
Edit: D'oh! See comments below accepted answer for explanation of my erroneous thinking.
If the request is actually being made, it would be in the logs.
Your example code could be failing silently.
What happens if you do:
<?php
if ($result = file_get_contents($url)) {
    echo "Success";
} else {
    echo "Epic Fail!";
}
If that's failing, you'll want to turn on some error reporting or logging and try to figure out why.
Note: if you're in safe mode, or otherwise have fopen URL wrappers disabled, file_get_contents() will not fetch a remote page. This is the most likely reason things are failing (assuming there's no typo in the contents of $url).
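A quick sketch of how to turn that error reporting on while testing (the URL is taken from the question):
<?php
// Surface the warning that file_get_contents() raises on failure.
error_reporting(E_ALL);
ini_set('display_errors', '1');
$result = file_get_contents('http://getcorporate.co.nr/');
if ($result === false) {
    // error_get_last() holds the details of the most recent warning.
    var_dump(error_get_last());
}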
Use curl instead?
That's odd. Maybe there is some caching afoot? Have you tried changing the URL dynamically ($url = $url."?timestamp=".time() for example)?
