Can you please suggest a simple PHP script that I can insert into a web page to track and record the HTTP_REFERER of every user who visits the page?
Thank you so much in advance for your help.
Using $_SERVER['HTTP_REFERER'] is not reliable: the referer is supplied by the client, so it can be missing, empty, or spoofed.
However, if you still want to go that route, you can use the following.
What it does is use a ternary operator to check whether the referer is set.
If a referer is found, it is recorded to a file, appending to it via the a mode switch. Otherwise, if one is not found or isn't recordable, the script simply echoes a message and writes nothing to the file.
If you don't want to keep adding to the file, use the w switch instead.
Caution: the w switch overwrites any previously written content.
<?php
$refer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : null;

if (!empty($refer)) {
    $fp = fopen('file.txt', 'a');
    fwrite($fp, $refer . "\n"); // \n starts a new line; use \r\n on Windows
    fclose($fp);
    echo "Referer found and written to file.";
}
else {
    echo "No referer, nothing written to file.";
}

// Or use this else block instead to write "No referer" in the file as a replacement:
/*
else {
    $fp = fopen('file.txt', 'a');
    fwrite($fp, "No referer" . "\n");
    fclose($fp);
    echo "No referer.";
}
*/
Here is a very simple script that will achieve it (note that the w mode overwrites the file on every request; use a to keep a running log):
$fp = fopen('myfile.txt', 'w');
fwrite($fp, $_SERVER['HTTP_REFERER']);
fclose($fp);
It might be a better idea to log this in Apache (if that's the platform).
About log files:
http://httpd.apache.org/docs/1.3/logs.html
Then use some dedicated software to analyze the logs.
The reason is that building your own tracking script is a lot more work than it might seem, even at the simplest level, if it is to be usable.
Another idea is to install third-party logging software. StatCounter, for instance, uses log files and can give you what you want.
Related
I'm reviewing code that was written by another developer. The code tries to open a file, and sets $canOpen based on success/failure.
$fh = @fopen('files.php', 'a+');
if (!$fh) {
    fclose($fh);
    $canOpen = false;
} else {
    $canOpen = true;
}
What I find strange is that it also tries to close the file, but only when the open fails (if (!$fh)). Does this make sense? Shouldn't the close be in the else branch, where the file was successfully opened?
No, it makes no sense.
If !$fh is met as a condition, it means $fh contains boolean FALSE (or possibly, in the event of some weird internal PHP error, NULL). So what you are effectively doing is:
fclose(FALSE);
...which is pointless and will result in an error.
And, if all you're trying to do is populate the $canOpen variable and not do anything useful with the file handle, wouldn't a combination of is_file(), is_readable() and is_writable() suffice?
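To sketch what that check could look like (a hypothetical canOpenAppend() helper; the a+ mode needs read and write access on an existing file, or a writable directory for a new one):

```php
<?php
// Sketch: would fopen($path, 'a+') succeed, without actually opening it?
// 'a+' requires read+write on an existing file, or a writable
// directory if the file does not yet exist.
function canOpenAppend(string $path): bool
{
    return is_file($path)
        ? (is_readable($path) && is_writable($path))
        : is_writable(dirname($path));
}
```

Note that this is only advisory: the file's state can still change between the check and a later fopen().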
The only reason one would close an unopened file is for resiliency purposes, for example always closing the file in the finally block of a try...catch.
In your case, it looks like a coding error.
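For illustration, that resiliency pattern might look roughly like this (a sketch; the file path is a placeholder, and since fopen() does not throw, the handle still has to be guarded before use):

```php
<?php
// Sketch: close the handle in finally so it is released even if the
// work between open and close throws an exception.
$path = sys_get_temp_dir() . '/demo-log.txt'; // placeholder file
$fh = @fopen($path, 'a+');
try {
    if ($fh !== false) {
        fwrite($fh, "some data\n");
    }
} finally {
    if (is_resource($fh)) { // only close a real handle
        fclose($fh);
    }
}
```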
If you put a var_dump($fh) in the true block of your if, you'll find that it's not a resource handle.
The PHP manual states that fclose() takes a resource; therefore fclose() can't be called for a file that couldn't be opened.
<?php
$fh = @fopen('files.php', 'a+');
if (!$fh) {
    var_dump($fh);
    fclose($fh); // this will cause an error
    $canOpen = false;
} else {
    $canOpen = true;
}
No, it doesn't make sense: if you cannot open the file, you won't need to close it either. Besides, at this point the file pointer will almost always be valid, because with the a+ mode fopen() creates the file if it doesn't exist.
In PHP, I write (create) the file using file_put_contents($filename, $data);
That works, but I want to detect when the process has finished.
Currently the page just shows a loading status, and that is the only way I can tell whether it is finished or not.
I want to detect it in code.
This is a blocking I/O call, so it has finished by the time the function returns. The return value is the number of bytes written (on success).
It puts everything on hold until it's done.
So you could use something like this to determine when it has finished writing the file:
echo "The file's contents are now being written, please wait.";
file_put_contents($filename, $data);
echo "The file's contents have been written.";
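Since the call blocks, its return value also tells you whether the write succeeded: false on failure, otherwise the number of bytes written. A small sketch (a temporary file stands in for $filename):

```php
<?php
// Sketch: file_put_contents() returns the byte count on success,
// or false on failure, so completion and success can be checked together.
$filename = tempnam(sys_get_temp_dir(), 'out');
$data     = "hello world\n";

$bytes = file_put_contents($filename, $data);
if ($bytes === false) {
    echo "Write failed.\n";
} else {
    echo "Finished: $bytes bytes written.\n";
}
```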
Retrieve the Content-Length header from the remote location first. You can use get_headers to do that. Like so:
$headers = get_headers($url, 1);
echo "To download: " . $headers['Content-Length'] . " bytes.<br />";
I have a PHP script that uploads files to an FTP server from a location with a very low-bandwidth, low-reliability connection. I am currently using the FTP functions of PHP.
Sometimes the connection drops in the middle of a transfer. Is there a way to resume the upload later? How can this be done?
Edit:
People misunderstood this as happening from a browser. That is not the case; it is a PHP CLI script. So it is about a local file being uploaded to FTP, with no user interaction.
Try getting the remote file's size and then issuing the APPE FTP command with the difference; this appends to the remote file. See http://webmasterworld.com/forum88/4703.htm for an example. If you want real control over this, I recommend using PHP's cURL functions.
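A rough sketch of that size-then-resume idea with PHP's built-in FTP functions (connection details and file names are placeholders, wrapped in a hypothetical resumeUpload() helper): read the remote size with ftp_size(), seek the local stream past what has already transferred, and pass the same offset to ftp_fput():

```php
<?php
// Sketch: resume an FTP upload by sending only the missing tail.
// Host, credentials and file names are placeholders.
function resumeUpload($host, $user, $pass, $local_file, $remote_file)
{
    $conn = ftp_connect($host);
    ftp_login($conn, $user, $pass);

    // How much of the file already reached the server?
    $remote_size = ftp_size($conn, $remote_file); // -1 if it does not exist yet
    $offset = max(0, $remote_size);

    // Upload the remainder, starting $offset bytes in on both sides.
    $fp = fopen($local_file, 'rb');
    fseek($fp, $offset);
    $ok = ftp_fput($conn, $remote_file, $fp, FTP_BINARY, $offset);

    fclose($fp);
    ftp_close($conn);
    return $ok;
}
```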
I think all the previous answers are wrong. PHP manages the resume itself; for that, FTP_AUTOSEEK must be activated, and you have to pass FTP_AUTORESUME during the upload. The code should look something like this:
$ftp_connect = ftp_connect($host, $port);
ftp_set_option($ftp_connect, FTP_AUTOSEEK, true);
$stream = fopen($local_file, 'r');
$upload = ftp_fput($ftp_connect, $file_on_ftp, $stream, FTP_BINARY, FTP_AUTORESUME);
fclose($stream);
ftp_put() and ftp_fput() have a $startpos parameter. You should be able to do what you need using this parameter (it starts the transfer from the given offset, e.g. the size the file already has on your server).
However, I have never used it, so I don't know about its reliability. You should try it.
EDIT:
Another Example: by cballou
Look here. A very simple and good example.
You could do something like this:
<?php
$host = 'your.host.name';
$user = 'user';
$passwd = 'yourpasswd';

$ftp_stream = ftp_connect($host);
//EDIT
ftp_set_option($ftp_stream, FTP_AUTOSEEK, 1);

if (ftp_login($ftp_stream, $user, $passwd)) {
    //activate passive mode
    //ftp_pasv($ftp_stream, TRUE);
    $ret = ftp_nb_put($ftp_stream, $remotefile, $localfile, FTP_BINARY, FTP_AUTORESUME);
    while (FTP_MOREDATA == $ret) {
        // continue transfer
        $ret = ftp_nb_continue($ftp_stream);
    }
    if (FTP_FINISHED !== $ret) {
        echo 'Failure occurred on transfer ...';
    }
}
else {
    print "FAILURE while login ...";
}

//free
ftp_close($ftp_stream);
?>
Do you mean that you are going to allow the user to resume the upload from your web interface?
Does this suit you?
http://www.webmasterworld.com/php/3931706.htm
Otherwise, FileZilla can handle the uploading process for you: it keeps the transfer state, and after a disconnect you will be able to continue the queue.
For automation and tracking, we simply use Git and Capistrano.
There is also an option for PHP; try googling it.
The code works fine, except for the fact that there is a problem here:
// Log events
function logEvent($newinput) {
    if ($newinput !== NULL) {
        // Add a timestamp to the start of the $message
        $newinput = date("Y/m/d H:i:s").': '.$newinput;
        $fp = fopen('log.txt', 'w');
        fwrite($fp, $newinput."\n");
        fclose($fp);
    }
}

// Problem: the two lines below never end up in log.txt
logEvent('Selection'.$selections[$selection]);
logEvent('Change' . $change . 'cents.');

// This line is written to the text file (log.txt); that part works.
logEvent('Input' . $newinput);
I think you're not appending to the file, you're rewriting it. Try fopen with 'a' instead of 'w'.
You need to use the append mode when opening the file. You've used
fopen('log.txt', 'w');
which means that every time you call that function, the log file gets blown away and recreated. If you instead use
fopen('log.txt', 'a');
then your new log entries will be appended to the file.
You could also look into keeping the file open for later writes, but there may be issues with concurrent updates from other requests.
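Putting the fix together, a corrected version of the logEvent() function from the question (same behavior, just 'w' swapped for 'a'):

```php
<?php
// Sketch: the function from the question with 'w' changed to 'a',
// so each call appends a timestamped line instead of truncating the log.
function logEvent($newinput) {
    if ($newinput !== NULL) {
        // Add a timestamp to the start of the message
        $newinput = date("Y/m/d H:i:s") . ': ' . $newinput;
        $fp = fopen('log.txt', 'a'); // 'a' appends; 'w' truncated the file
        fwrite($fp, $newinput . "\n");
        fclose($fp);
    }
}
```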
So yeah, I'm working on a Windows system, and while this works locally, I know it will break on other people's servers. What's a cross-platform way to do the same as this?
function fetch($get, $put) {
    file_put_contents($put, file_get_contents($get));
}
I don't see why that would fail, unless the other computer is on PHP 4. What you would need to do to make it backwards compatible is provide replacements for file_get_contents() and file_put_contents():
if (version_compare(phpversion(), '5', '<')) {
    function file_get_contents($file) {
        // mimic functionality here
    }
    function file_put_contents($file, $data) {
        // mimic functionality here
    }
}
Here is a solution using simple file operations:
<?php
$file = "http://www.domain.com/thisisthefileiwant.zip";
$hostfile = fopen($file, 'r');
$fh = fopen("thisisthenameofthefileiwantafterdownloading.zip", 'w');

while (!feof($hostfile)) {
    $output = fread($hostfile, 8192);
    fwrite($fh, $output);
}

fclose($hostfile);
fclose($fh);
?>
Ensure your directory has write permissions enabled (chmod).
Therefore, a replacement for your fetch($get, $put) would be:
function fetch($get, $put) {
    $hostfile = fopen($get, 'r');
    $fh = fopen($put, 'w');

    while (!feof($hostfile)) {
        $output = fread($hostfile, 8192);
        fwrite($fh, $output);
    }

    fclose($hostfile);
    fclose($fh);
}
Hope it helped! =)
Cheers,
KrX
Shawn's answer is absolutely correct; the only thing is that you need to make sure your $put variable is a valid path on either the Windows server or the Unix server.
Well, when I read your question, I understood that you wanted to bring a file from a remote server to your server locally. This can be done with the FTP extension of PHP:
http://www.php.net/manual/en/function.ftp-fget.php
If this is not what you intended, I believe what Shawn says is correct.
Otherwise, tell me in the comments and I'll help you more.
If the fopen wrappers are not enabled, the curl extension could be an option: http://php.net/curl
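To make that concrete, here is a minimal curl-based version of the same fetch (a sketch; it assumes the curl extension is installed and streams the response straight to disk):

```php
<?php
// Sketch: fetch($get, $put) using the curl extension instead of the
// fopen wrappers, for hosts where allow_url_fopen is turned off.
function fetch($get, $put) {
    $ch = curl_init($get);
    $fh = fopen($put, 'w');
    curl_setopt($ch, CURLOPT_FILE, $fh);            // stream body to the file
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fh);
    return $ok !== false;
}
```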