I want to send a request (POST or GET) and not wait for the answer.
I have this code:
$fp = stream_socket_client("tcp://requestb.in:80", $errno, $errstr, 30, STREAM_CLIENT_ASYNC_CONNECT|STREAM_CLIENT_CONNECT);
$out = "GET /15b3x0v1 HTTP/1.1\r\n";
$out .= "Host: requestb.in\r\n";
$out .= "Connection: Close\r\n\r\n";
fputs($fp, $out);
//usleep(500);
fflush($fp);
fclose($fp);
Without the usleep, it doesn't work (the server doesn't receive the request). Why do I have to wait 500 µs (the usleep) before closing the socket?
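For context, a likely explanation is that STREAM_CLIENT_ASYNC_CONNECT makes stream_socket_client() return before the TCP handshake has finished, so the write and the immediate fclose() can happen before the connection even exists; the usleep just happens to give the handshake time to complete. A minimal fire-and-forget sketch, assuming a blocking connect is acceptable (the connect blocks, but no response is read):

<?php
// Sketch: blocking connect, write, then close without reading the response.
// Once fwrite() succeeds on a connected socket, the data sits in the kernel
// send buffer and is normally delivered even though we close right away.
$fp = stream_socket_client("tcp://requestb.in:80", $errno, $errstr, 30,
    STREAM_CLIENT_CONNECT); // no ASYNC flag: returns only once connected
if ($fp) {
    $out  = "GET /15b3x0v1 HTTP/1.1\r\n";
    $out .= "Host: requestb.in\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}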
I have been working on a scraper tool that pulls Google search results and then crawls the result websites looking for specific items.
I'm having an issue with cURL, though. I have come across a site that is causing cURL to go into an infinite loop.
The website in question:
http://www.darellyelectrical.com/
When I open up my packet sniffer and look through the TCP/HTTP packets, I can see the same request being sent over and over again.
I cannot pinpoint the reason why; I have no trouble with any other websites.
I have tried setting the following cURL options:
curl_setopt($this->sessions[$key], CURLOPT_TIMEOUT, $timeout);
curl_setopt($this->sessions[$key], CURLOPT_MAXREDIRS, 2);
curl_setopt($this->sessions[$key], CURLOPT_CONNECTTIMEOUT, 1);
It would be great if someone could test that URL with cURL and let me know if the issue persists.
Thanks.
EDIT:
function sck_send()
{
    $host = "www.darellyelectrical.com";
    $path = "";
    $fp = fsockopen($host, 80, $errno, $errstr, 30);
    if (!$fp) {
        echo "$errstr ($errno)<br />\n";
    } else {
        $out  = "GET /".$path." HTTP/1.1\r\n";
        $out .= "Host: ".$host."\r\n";
        $out .= "Connection: Close\r\n\r\n";
        $data = "";
        fwrite($fp, $out);
        while (!feof($fp)) {
            $data .= fgets($fp, 128);
        }
        fclose($fp);
        echo $data;
    }
}
sck_send();
This will produce the same loop as cURL.
That server needs a User-Agent header to be included or it does not respond. PHP's cURL doesn't set one by default, and it won't be included in a raw socket request unless you add it yourself. The code below works for me:
<?php
function sck_send() {
    $host = "www.darellyelectrical.com";
    $path = "";
    $fp = fsockopen($host, 80, $errno, $errstr, 30);
    if (!$fp) {
        echo "$errstr ($errno)<br />\n";
    } else {
        $out  = "GET /".$path." HTTP/1.1\r\n";
        $out .= "Host: ".$host."\r\n";
        $out .= "User-Agent: Mozilla/5.0\r\n";
        $out .= "Connection: Close\r\n\r\n";
        $data = "";
        fwrite($fp, $out);
        while (!feof($fp)) {
            $data .= fgets($fp, 128);
        }
        fclose($fp);
        echo $data;
    }
}
sck_send();
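If the scraper keeps using cURL rather than raw sockets, the equivalent fix would presumably be to set a User-Agent on the session explicitly, using the same handle name as in the question:

// Set a User-Agent on the existing cURL session so the server responds
// instead of re-requesting forever.
curl_setopt($this->sessions[$key], CURLOPT_USERAGENT, 'Mozilla/5.0');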
I'm trying to set up a simple PHP script that emulates an audio file using fpassthru.
However, the source I'm pulling the content from naturally adds HTTP headers to the data stream my PHP script fetches, which I think is why my setup doesn't work.
What I'd like to do is skip/ignore the first 13 lines / 289 bytes of the fetched data before passing it through, so the stream is "clean" and not contaminated with headers from the original server. Is this possible?
Here's my current code:
// Make socket connection
$errno = "errno";
$errstr = "errstr";
$fp = fsockopen($ip, $port, $errno, $errstr, 30);
if(!$fp) exit;
// Create send headers
fputs($fp, "GET /$path HTTP/1.0\r\n");
fputs($fp, "Host: $ip\r\n");
fputs($fp, "User-Agent: Script\r\n");
fputs($fp, "Accept: */*\r\n");
fputs($fp, "Connection: close\r\n\r\n");
// Write the returned data back to the resource
fpassthru($fp);
// close the socket when we're done
fclose($fp);
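One way this could work, sketched below: read the response line by line until the blank line that ends the HTTP headers, then hand the rest of the socket to fpassthru(). This assumes $ip, $port and $path are defined as in the snippet above, and it avoids hard-coding 13 lines / 289 bytes:

<?php
// Sketch: consume the HTTP response headers, then pass the raw body through.
$fp = fsockopen($ip, $port, $errno, $errstr, 30);
if (!$fp) exit;

fputs($fp, "GET /$path HTTP/1.0\r\n");
fputs($fp, "Host: $ip\r\n");
fputs($fp, "User-Agent: Script\r\n");
fputs($fp, "Accept: */*\r\n");
fputs($fp, "Connection: close\r\n\r\n");

// Read headers until the blank line that separates them from the body.
while (($line = fgets($fp)) !== false && rtrim($line) !== '') {
    // optionally inspect or forward selected headers here
}

fpassthru($fp); // everything left on the socket is the raw body
fclose($fp);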
I'm trying to open a connection to a web server hosted on an Amazon EC2 instance. I can reach the port with a GET request from my browser, which gives me the following:
Started HTTP server...
- <IP redacted> [19/Jul/2013 13:49:24] code 501, message Unsupported method ('GET')
- <IP redacted> [19/Jul/2013 13:49:24] "GET / HTTP/1.1" 501 -
This is expected, since I haven't implemented GET on the web server, but I get no response at all when using the following snippet to POST to the same port.
<?php
$req = 'verify=true';
$header = "POST / HTTP/1.1\r\n";
$header .= "Host: ec2-55-555-555-555.compute-1.amazonaws.com\r\n";
$header .= "Content-Type: application/x-www-form-urlencoded\r\n";
$header .= "Content-Length: " . strlen($req) . "\r\n\r\n";
$fp = fsockopen ("ec2-55-555-555-555.compute-1.amazonaws.com", 8080, $errno, $errstr, 30);
fputs ($fp, $header . $req);
fclose ($fp);
?>
The script definitely runs and reaches the end, but the fsockopen call is timing out. What's missing, or what is there that shouldn't be?
The easiest way to interact with an HTTP server from PHP is to use cURL.
It's a good library with some great features, and it works well.
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, "Your_url_here");
$resp = curl_exec($curl);
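Since the goal here is a POST, a minimal cURL sketch might look like the following (using the redacted hostname and the verify=true payload from the question):

<?php
// Sketch of the POST from the question done with cURL.
$curl = curl_init("http://ec2-55-555-555-555.compute-1.amazonaws.com:8080/");
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_POSTFIELDS, "verify=true");
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // capture the response instead of echoing it
curl_setopt($curl, CURLOPT_TIMEOUT, 30);
$resp = curl_exec($curl);
if ($resp === false) {
    echo curl_error($curl);
}
curl_close($curl);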
<?php
$host = 'www.yourtargeturl.com';
$service_uri = '/detect_referal.php';
$vars = 'additional_option1=yes&additional_option2=un';

$header  = "Host: $host\r\n";
$header .= "User-Agent: PHP Script\r\n";
$header .= "Content-Type: application/x-www-form-urlencoded\r\n";
$header .= "Referer: http://www.google.com/search?hl=en&q=jigh&btnG=Google+Search\r\n";
$header .= "Content-Length: ".strlen($vars)."\r\n";
$header .= "Connection: close\r\n\r\n";

$fp = fsockopen($host, 80, $errno, $errstr);
if (!$fp) {
    echo "$errstr ($errno)<br/>\n";
} else {
    fputs($fp, "POST $service_uri HTTP/1.1\r\n");
    fputs($fp, $header.$vars);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
?>
This changes $_SERVER['HTTP_REFERER']. How can I change $_SERVER['REMOTE_ADDR']? What code should I append to $header?
You can't do that. The IP address is determined at the beginning of the TCP connection, not in the HTTP headers. (It is possible to spoof [though not remotely like that], but then you won't get the response back.)
You can't. The IP address you connect out from isn't some header... it comes from the underlying TCP connection made from your server to the other server.
Hi all.
I need to get the content of multiple pages from a single domain.
Right now I open an fsockopen connection for each page, and I get the content of the page this way:
<?php
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out  = "GET /page1.html HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        fgets($fp, 128);
    }
    fclose($fp);
}
?>
My script wastes time reconnecting to the domain to get the second page.
I was wondering if it is possible to use a single connection to get multiple pages, like this:
<?php
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out  = "GET /page1.html HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        fgets($fp, 128);
    }

    $out  = "GET /page2.html HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        fgets($fp, 128);
    }
    fclose($fp);
}
?>
But this method returns page1.html twice, and I don't know why.
I tried using Connection: keep-alive, and HTTP/1.0, but in those cases I didn't get anything back from the server (my script ran indefinitely).
Any suggestions to solve this?
Thank you!
Try only sending the Connection: Close header on the last request.
EDIT: Clarification
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out  = "GET /page1.html HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    // DON'T SEND Connection: Close HERE, but still terminate the headers with a blank line
    $out .= "\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        fgets($fp, 128);
    }

    $out  = "GET /page2.html HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    // THIS IS THE LAST PAGE REQUIRED, SO SEND THE Connection: Close HEADER
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        fgets($fp, 128);
    }
    fclose($fp);
}
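A caveat with the snippet above: feof() only returns true once the server closes the connection, so on a kept-alive connection the first read loop will block until the read times out. A sketch that instead reads each response by its Content-Length header (assuming the server does not use chunked transfer encoding) could look like this:

<?php
// Sketch: fetch several pages over one kept-alive connection by reading each
// response body according to its Content-Length header. Assumes the server
// does not use chunked transfer encoding; a real client would handle both.
function read_response($fp) {
    $length = null;
    // Read the status line and headers up to the blank line.
    while (($line = fgets($fp)) !== false && rtrim($line) !== '') {
        if (stripos($line, 'Content-Length:') === 0) {
            $length = (int) trim(substr($line, 15));
        }
    }
    // Read exactly the advertised number of body bytes.
    $body = '';
    while ($length !== null && strlen($body) < $length) {
        $chunk = fread($fp, $length - strlen($body));
        if ($chunk === false || $chunk === '') {
            break;
        }
        $body .= $chunk;
    }
    return $body;
}

$pages = [];
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $paths = ["/page1.html", "/page2.html"];
    foreach ($paths as $i => $path) {
        // Only the last request asks the server to close the connection.
        $close = ($i === count($paths) - 1) ? "Connection: Close\r\n" : "";
        fwrite($fp, "GET $path HTTP/1.1\r\nHost: www.example.com\r\n$close\r\n");
        $pages[$path] = read_response($fp);
    }
    fclose($fp);
}
?>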