Is it possible to log requests with PHP?
I also want to log the image, JS, and CSS file requests, e.g.:
<img src="foo.png">
<link rel="stylesheet" href="foo.css">
I currently use the code below, but it only gives me the current request URI, not the other files.
I also rewrote all requests to my index.php, which contains this line of code:
file_put_contents('foo.txt', $_SERVER['REQUEST_URI'] . PHP_EOL, FILE_APPEND | LOCK_EX);
Here is one option to dump all HTTP requests to a file. This applies to every request that travels over the HTTP protocol:
$myFile = "requestslog.txt";
$fh = fopen($myFile, 'a') or die("can't open file");
fwrite($fh, "\n\n---------------------------------------------------------------\n");
foreach ($_SERVER as $h => $v) {
    // ereg() was removed in PHP 7; use preg_match() instead
    if (preg_match('/HTTP_(.+)/', $h, $hp)) {
        fwrite($fh, "$h = $v\n");
    }
}
fwrite($fh, "\r\n");
fwrite($fh, file_get_contents('php://input'));
fclose($fh);

echo "<html><head /><body><iframe src=\"$myFile\"
style=\"height:100%; width:100%;\"></iframe></body></html>";
Yes, it is. You can make all the requests pass through a PHP script, so that you can log the action. For example, a simple image request like http://url.com/img.jpg would become http://url.com/index.php?action=download&file=img.jpg, and the script would handle the logging, the file download, and the correct headers.
Also take into account that your HTTP server might be logging the requests already; take a look at Apache's access_log if that is what you are using.
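A minimal sketch of what such a front controller could look like. The file name, log path, and MIME map below are illustrative, not from the original answer; it assumes the web server rewrites e.g. /img.jpg to index.php?file=img.jpg.

```php
<?php
// Hypothetical front controller: log the request, then work out the
// Content-Type for the requested static file.
$file = basename($_GET['file'] ?? 'img.jpg'); // basename() blocks path traversal
file_put_contents('requests.log',
    date('c') . ' ' . $file . PHP_EOL, FILE_APPEND | LOCK_EX);

$types = ['jpg' => 'image/jpeg', 'png' => 'image/png',
          'css' => 'text/css',  'js'  => 'application/javascript'];
$ext  = strtolower(pathinfo($file, PATHINFO_EXTENSION));
$type = $types[$ext] ?? 'application/octet-stream';

// When serving for real, send the header and stream the file:
// header('Content-Type: ' . $type);
// readfile($file);
echo $type . "\n";
```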
I prefer the approach listed on this gist.
<?php
// https://gist.github.com/magnetikonline/650e30e485c0f91f2f40
class DumpHTTPRequestToFile {

    public function execute($targetFile) {
        $data = sprintf(
            "%s %s %s\n\nHTTP headers:\n",
            $_SERVER['REQUEST_METHOD'],
            $_SERVER['REQUEST_URI'],
            $_SERVER['SERVER_PROTOCOL']
        );

        foreach ($this->getHeaderList() as $name => $value) {
            $data .= $name . ': ' . $value . "\n";
        }

        $data .= "\nRequest body:\n";

        file_put_contents(
            $targetFile,
            $data . file_get_contents('php://input') . "\n"
        );

        echo("Done!\n\n");
    }

    private function getHeaderList() {
        $headerList = [];

        foreach ($_SERVER as $name => $value) {
            if (preg_match('/^HTTP_/', $name)) {
                // convert HTTP_HEADER_NAME to Header-Name
                $name = strtr(substr($name, 5), '_', ' ');
                $name = ucwords(strtolower($name));
                $name = strtr($name, ' ', '-');

                // add to list
                $headerList[$name] = $value;
            }
        }

        return $headerList;
    }
}

(new DumpHTTPRequestToFile)->execute('./dumprequest.txt');
You can quickly fetch the above file by running curl https://gist.githubusercontent.com/magnetikonline/650e30e485c0f91f2f40/raw/cbc114d0af29eaad80f75b69732d757971c71fd0/dumprequest.php > dumprequest.php.
The output will be something like:

GET /dumprequest.php HTTP/1.1

HTTP headers:
Authorization:
User-Agent: PostmanRuntime/7.29.0
Accept: */*
Cache-Control: no-cache
Host: somehost.com
Accept-Encoding: gzip, deflate, br
Connection: keep-alive

Request body:
hi=ricardo
// Reports all errors
error_reporting(E_ALL);
// Do not display errors for the end-users (security issue)
ini_set('display_errors','Off');
// Set a logging file
ini_set('error_log','request_log_file.log');
Note: for request_log_file.log you can set a full file path if needed.
Hope this will help you!
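With the error log configured as above, a request line can be routed through it with error_log(). A small sketch (the '/cli' fallback is only there so it also runs outside a web server):

```php
<?php
// Append one line per request to the configured error log.
ini_set('error_log', 'request_log_file.log');
$line = ($_SERVER['REQUEST_METHOD'] ?? 'GET') . ' '
      . ($_SERVER['REQUEST_URI'] ?? '/cli');
error_log($line);  // written with a timestamp prefix
echo $line . "\n";
```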
For some automated tests that I did, I had to record requests from Chrome and then replay them as curl commands.
The way I did it was:
Access the website with the developer tools open.
Issue the requests, making sure they are logged in the console.
Right-click on the requests, select 'Save as HAR with content', and save to a file.
Then run the following PHP script to parse the HAR file and output the corresponding curl commands:
<?php
$contents = file_get_contents('/home/elyashivl/har.har');
$json = json_decode($contents);

$entries = $json->log->entries;
foreach ($entries as $entry) {
    $req = $entry->request;
    $curl = 'curl -X ' . $req->method;
    foreach ($req->headers as $header) {
        $curl .= " -H '$header->name: $header->value'";
    }
    if (property_exists($req, 'postData')) {
        # JSON encode to convert newlines to literal '\n'
        $data = json_encode((string)$req->postData->text);
        $curl .= " -d '$data'";
    }
    $curl .= " '$req->url'";
    echo $curl . "\n";
}
Don't know in which version they added this feature, but Chrome now offers a "Save as cURL" option:
You can access it under the Network tab of the Developer Tools by right-clicking on an XHR request.
Building upon the code by ElyashivLavi, I added a file name argument, error checking when reading the file, verbose mode for curl, and automatic execution of the generated commands. I also disabled the Accept-Encoding request header, since it usually results in compressed output that is hard to debug:
<?php
function bail($msg)
{
    fprintf(STDERR, "Fatal error: $msg\n");
    exit(1);
}

global $argv;
if (count($argv) < 2)
    bail("Missing HAR file name");
$fname = $argv[1];

$contents = file_get_contents($fname);
if ($contents === false)
    bail("Could not read file $fname");

$json = json_decode($contents);
$entries = $json->log->entries;
foreach ($entries as $entry)
{
    $req = $entry->request;
    $curl = 'curl --verbose -X ' . $req->method;
    foreach ($req->headers as $header)
    {
        if (strtolower($header->name) === "accept-encoding")
            continue; // avoid gzip response
        $curl .= " -H '$header->name: $header->value'";
    }
    if (property_exists($req, 'postData'))
    {
        # JSON encode to convert newlines to literal '\n'
        $data = json_encode((string)$req->postData->text);
        $curl .= " -d '$data'";
    }
    $curl .= " '$req->url'";
    echo $curl . "\n";
    system($curl);
}
I have a file (an image) from a device that is sent via PUT:
PUT /r.php HTTP/1.1
User-Agent: PHS/2.0.6
Host: localhost
Accept: */*
Content-Name: cam20141020084031.jpg,10001019
Content-Length: 35183
Expect: 100-continue
This is how I save the picture that was sent:
$res = file_get_contents("php://input");
$file = fopen('1.jpg', "w");
fputs($file, $res);
fclose($file);
I need to get the Content-Name header separately too, but I can't find anywhere how to do that. Can anyone help?
UPDATE
$res = file_get_contents("php://input");
$headers = getallheaders();
$contentName = $headers['Content-Name'];  // the stray ')' here caused a parse error

$file = fopen('1.jpg', "w");
$fileVar = fopen('1.txt', "w");
fputs($file, $res);
fputs($fileVar, $contentName);  // write the header value, not the image data
fclose($fileVar);
fclose($file);
Strange, but this code seems to be loading forever.
UPDATE 1
When I print_r it, I see these are not the headers of the PUT request but the headers of the page. Not what I need.
You can try the getallheaders() function, which exists for the sole purpose of retrieving request headers:
$headers = getallheaders();
var_dump($headers['Content-Name']);
Note that it might be best to preprocess the keys to handle variants such as Content-name (note the change of letter case near the -).
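One way to do that preprocessing is to lower-case all keys with array_change_key_case(), so lookups become case-insensitive. A small sketch (the sample header value is illustrative):

```php
<?php
// Normalize header names so "Content-name" and "Content-Name" both match.
function normalizeHeaders(array $headers) {
    return array_change_key_case($headers, CASE_LOWER);
}

// In a real request this would be normalizeHeaders(getallheaders()).
$headers = normalizeHeaders(['Content-name' => 'cam20141020084031.jpg,10001019']);
echo $headers['content-name'] . "\n";
```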
I am building a simple PHP proxy that caches response headers and objects.
My problem is that if I log in to youtube.com through the proxy, YouTube keeps telling me to sign in as if I were not signed in, but if I stop my script and open youtube.com directly, I am signed in. I think it is a cookies issue. Is it?
My script grabs the response headers and sends them back to the browser. When I use fopen() to download the object, some websites such as Google "Play Store" and "Apple Store" keep returning HTTP 403 (Forbidden), even though I capture the client's user agent from $_SERVER['HTTP_USER_AGENT'] and attach it to the context via stream_context_create(). Still no luck.
I have seen a Set-Cookie header in a response, so I thought that resending it to the browser with header() would solve it. Still no luck.
This is how I grab headers and send them back to the browser:
This is how I get cookies that are requested from the client
$requested_cookie = $_COOKIE;
$ua = $_SERVER['HTTP_USER_AGENT'];
ini_set('user_agent', $ua);

$md5_fname = md5($fname);
$filehead = $file_path . "/HTTP_HEADER/" . $md5_fname . ".txt";

if (file_exists($filehead)) {
    $handle = fopen($filehead, "r");
    $contents = fread($handle, filesize($filehead));
    $getCachedH = unserialize($contents);
    fclose($handle);

    foreach ($getCachedH as $cHead) {
        $sendHead = $cHead;
        $getHead = strtoupper($cHead);
        //file_put_contents("$file_path/ghassan.txt", "\n $sendHead \n", FILE_APPEND);
        if (preg_match('/CONTENT-LENGTH: (\d+)/i', $getHead, $clm)) $content_length = $clm[1];
        if (preg_match('/CONTENT-TYPE: ([\/\w\-\d]+)/i', $getHead, $ctm)) $content_type = $ctm[1];
        if (preg_match('/LOCATION: (.*)/i', $cHead, $lm)) {
            $header_location = $lm[1];
            header("Location: " . $header_location);
            exit;
        }
        if (preg_match('/^HTTP\/1\.[01] (\d\d\d)/i', $getHead, $hcm)) {
            $http_code = $hcm[1];
        }
        header($sendHead);
    }
} else {
    $opts = array(
        'http' => array(
            'ignore_errors' => "true",
            'method' => "GET",
            'header' => "Cookie: foo=bar\r\n"
        )
    );
    $context = stream_context_create($opts);
    $urlptr = fopen($_GET['url'], 'rb', false, $context);

    $headers = $http_response_header;
    // $headers[0] is the full status line, e.g. "HTTP/1.1 200 OK",
    // so extract the numeric code before comparing
    preg_match('/\s(\d{3})\s/', $headers[0], $scm);
    $http_code = isset($scm[1]) ? $scm[1] : 0;
    if ($http_code == "200") {
        // We grab the response headers and save them
        file_put_contents($filehead, serialize($headers));
    }

    foreach ($headers as $response_header) {
        $sendHead = $response_header;
        $getHead = strtoupper($response_header);
        header($sendHead);
        if (preg_match('/CONTENT-LENGTH: (\d+)/i', $getHead, $clm)) $content_length = $clm[1];
        if (preg_match('/CONTENT-TYPE: ([\/\w\-\d]+)/i', $getHead, $ctm)) $content_type = $ctm[1];
        if (preg_match('#Set-Cookie: (([^=]+)=[^;]+)#i', $sendHead, $cookm)) {
            $sCookies = $cookm[1];
            //file_put_contents("$file_path/cookies.txt", "\n $cookm[0], $url \n", FILE_APPEND);
        }
        if (preg_match('/LOCATION: (.*)/i', $sendHead, $lm)) {
            $header_location = $lm[1];
            header("Location: " . $header_location);
            exit;
        }
        if (preg_match('/ACCEPT-RANGES: ([\w\d\-]+)/i', $getHead, $arm)) $accept_ranges = $arm[1];
        if (preg_match('/^HTTP\/1\.[01] (\d\d\d)/i', $getHead, $hcm)) {
            $http_code = $hcm[1];
        }
    }
}
Is there a function that grabs the response headers and the object in one call and stores them together in one file? I don't want to call fopen() at the top of the script, because PHP connects to the remote URL as soon as that line runs, even if I never read from the handle, which adds latency.
Is there a way to access the client's cookies from my PHP script, and how do I solve the 403 Forbidden response when I am already sending the Android device's user agent while downloading a file from the Play Store through my script?
Thank you
I have already solved my question.
While grabbing the remote headers of each website, the headers already contain a Set-Cookie header, so I resend them to the browser, capture the cookies back from $_COOKIE, and attach them to the fopen() request, which seems to solve all my problems above.
Thank you
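The cookie forwarding described above could be sketched like this: rebuild a Cookie header from $_COOKIE and attach it to the outgoing request context. The cookie names and values here are made up for illustration.

```php
<?php
// Turn the client's cookies into a single Cookie request header.
function buildCookieHeader(array $cookies) {
    $pairs = [];
    foreach ($cookies as $name => $value) {
        $pairs[] = $name . '=' . urlencode($value);
    }
    return 'Cookie: ' . implode('; ', $pairs) . "\r\n";
}

// In the proxy this would be buildCookieHeader($_COOKIE), then:
// $context = stream_context_create(['http' => ['header' => $header]]);
// $urlptr  = fopen($_GET['url'], 'rb', false, $context);
$header = buildCookieHeader(['SID' => 'abc123', 'pref' => 'dark']);
echo $header;
```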
Here's my code:
$language = $_GET['soundtype'];
$word = $_GET['sound'];
$word = urlencode($word);

if ($language == 'english') {
    $url = "<the first url>";
} else if ($language == 'chinese') {
    $url = "<the second url>";
}

$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "User-Agent: <my user agent>"
    )
);
$context = stream_context_create($opts);
$page = file_get_contents($url, false, $context);

header('Content-Type: audio/mpeg');
echo $page;
But I've found that this runs terribly slow.
Are there any possible methods of optimization?
Note: $url is a remote url.
It's slow because file_get_contents() reads the entire file into $page; PHP waits until the whole file has been received before outputting anything. So what you're doing is downloading the entire file on the server side, then outputting it as a single huge string.
file_get_contents() does not support streaming or reading offsets of a remote file. One option is to create a raw socket with fsockopen(), perform the HTTP request, and read the response in a loop, outputting each chunk to the browser as you read it. This is faster because the file is streamed.
Example from the Manual:
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    header('Content-Type: audio/mpeg');
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
The loop above keeps reading while content is available, outputting 128 bytes to the browser on each iteration. The same principle works for your case, but make sure you do not output the response HTTP headers, which are the first few lines: since you are making a raw request, you get the raw response with the headers included, and echoing them would corrupt the file.
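Skipping the headers means reading lines until the first blank line, then forwarding everything after it. A standalone sketch of that logic, using an in-memory stream so it can run without a network connection (with fsockopen() the loop is identical):

```php
<?php
// Consume the raw HTTP response headers, then return only the body.
function readBody($fp) {
    while (($line = fgets($fp)) !== false) {
        if (rtrim($line, "\r\n") === '') {  // blank line ends the headers
            break;
        }
    }
    $body = '';
    while (!feof($fp)) {                    // everything after is body
        $body .= fread($fp, 8192);
    }
    return $body;
}

// Simulated raw response for demonstration.
$fp = fopen('php://memory', 'r+');
fwrite($fp, "HTTP/1.1 200 OK\r\nContent-Type: audio/mpeg\r\n\r\nBODYDATA");
rewind($fp);
$body = readBody($fp);
fclose($fp);
echo $body . "\n";
```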
Instead of downloading the whole file before outputting it, consider streaming it out like this:
$in = fopen($url, 'rb', false, $context);
$out = fopen('php://output', 'wb');
header('Content-Type: video/mpeg');
stream_copy_to_stream($in, $out);
If you're daring, you could even try (but that's definitely experimental):
header('Content-Type: video/mpeg');
copy($url, 'php://output');
Another option is using internal redirects and making your web server proxy the request for you. That would free up PHP to do something else. See also my post regarding X-Sendfile and friends.
As explained by #MrCode, first downloading the file to your server, then passing it on to the client will of course incur a doubled download time. If you want to pass the file on to the client directly, use readfile.
Alternatively, think about if you can't simply redirect the client to the file URL using a header("Location: $url") so the client can get the file directly from the source.
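The readfile() variant mentioned above could look like the sketch below. To keep it runnable standalone, it streams a local temp file instead of the question's remote $url; the logic is the same.

```php
<?php
// Sketch: send the type header, then let PHP stream the file straight
// to the output buffer. In the real script this would be:
//   header('Content-Type: audio/mpeg');
//   readfile($url);
$tmp = tempnam(sys_get_temp_dir(), 'snd');
file_put_contents($tmp, 'AUDIO');
$bytes = readfile($tmp);  // echoes the file, returns the byte count
unlink($tmp);
```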
I am fairly new to PHP, although I used JSP a lot before and everything was easier with Java classes.
Now I want to perform a POST request to an HTTPS page (not HTTP), get the returned cookies, and pass them to another GET request that returns the final result. The aim is to make a heavy page more usable in a mobile browser by bypassing the login page and going directly to the pages, which are also served through an Ajax user interface.
I am stuck; my code does not work and the server responds with Bad Request:
Bad Request

Your browser sent a request that this server could not understand. Reason: You're speaking plain HTTP to an SSL-enabled server port. Instead use the HTTPS scheme to access this URL, please.
<?php
$content = '';
$headers = '';  // initialize so the .= below does not warn
$flag = false;
$post_query = 'SOME QUERY'; // name-value pairs
$post_query = urlencode($post_query) . "\r\n";

$host = 'HOST';
$path = 'PATH';
$fp = fsockopen($host, '443'); // plain TCP to the SSL port: this triggers the error above
if ($fp) {
    fputs($fp, "POST $path HTTP/1.0\r\n");
    fputs($fp, "Host: $host\r\n");
    fputs($fp, "Content-length: " . strlen($post_query) . "\r\n\r\n");
    fputs($fp, $post_query);
    while (!feof($fp)) {
        $line = fgets($fp, 10240);
        if ($flag) {
            $content .= $line;
        } else {
            $headers .= $line;
            if (strlen(trim($line)) == 0) {
                $flag = true;
            }
        }
    }
    fclose($fp);
}
echo $headers;
echo $content;
?>
From past experience, I've never used low-level functions like fsockopen() for posting data to external sites. The best way to do this is with cURL, which is much easier to use and massively more powerful.
For example, look at curl_setopt():
http://php.net/curl_setopt
In particular, look at CURLOPT_URL, CURLOPT_POST, and CURLOPT_POSTFIELDS, plus CURLOPT_COOKIEJAR / CURLOPT_COOKIEFILE, which store the returned cookies in a jar file that you can reuse on the follow-up GET request.
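A hedged sketch of that flow, with placeholder URLs and form field names (they are not from the original question): POST the login over HTTPS, save the returned cookies to a jar file, then reuse the jar for the follow-up GET.

```php
<?php
// Shared cookie jar; cURL writes Set-Cookie values here after the POST.
$jar = tempnam(sys_get_temp_dir(), 'cookies');

$ch = curl_init('https://example.com/login');       // hypothetical URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query([
    'user' => 'name',                                // hypothetical fields
    'pass' => 'secret',
]));
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);           // write cookies here
curl_exec($ch);
curl_close($ch);

$ch = curl_init('https://example.com/members');      // hypothetical URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);          // send the saved cookies
$page = curl_exec($ch);
curl_close($ch);
```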