I need to include one PHP file and execute a function from it.
After execution, at the end of the PHP script, I want to append something to that file.
But I'm unable to open the file. Is it possible to close an included file (or anything similar) so that I can append info to the PHP file?
include 'something.php';
echo $somethingFromIncludedFile;
//Few hundred lines later
$fh = fopen('something.php', 'a') or die('Unable to open file');
$log = "\n".'$usr[\''.$key.'\'] = \''.$val.'\';';
fwrite($fh, $log);
fclose($fh);
How can I achieve that?
In general you should never modify your PHP code using PHP itself. It's bad practice, first of all from a security standpoint. I am sure you can achieve what you need in another way.
As Alex says, self-modifying code is very, VERY dangerous. And NOT separating data from code is just dumb. On top of both these warnings is the fact that PHP arrays are relatively slow and do not scale well (so you could file_put_contents('data.ser', serialize($usr)) / $usr = unserialize(file_get_contents('data.ser')), but it's only going to work for small numbers of users).
Then you've got the problem of using conventional files to store data in a multi-user context - this is possible, but you need to build sophisticated locking and queue management. This usually entails using a daemon to manage the queue / mutex and is invariably more effort than it's worth.
Use a database to store data.
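For the key/value data in the question, even a SQLite file accessed through PDO would do. A minimal sketch, with a table and column names of my own choosing:
<?php
// Hypothetical sketch: store the $usr entries in SQLite instead of
// appending them to something.php. Table/column names are illustrative.
$db = new PDO('sqlite:' . __DIR__ . '/users.db');
$db->exec('CREATE TABLE IF NOT EXISTS usr (name TEXT PRIMARY KEY, value TEXT)');

// Replaces appending "$usr['...'] = '...';" to the PHP source:
$stmt = $db->prepare('INSERT OR REPLACE INTO usr (name, value) VALUES (?, ?)');
$stmt->execute([$key, $val]);

// Replaces the include when the data is needed again:
$usr = [];
foreach ($db->query('SELECT name, value FROM usr') as $row) {
    $usr[$row['name']] = $row['value'];
}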
As you already know, this approach is not a good one. If you REALLY want to include your file and then append something to it, you can do it the following way.
Be aware that using eval() is risky unless you can be 100% sure that the content of the file does not contain harmful code.
// This part is a replacement for your include
$fileContent = file_get_contents("something.php");
// eval() starts in PHP mode, so drop back out of it ("?>") first,
// letting the file's own "<?php" opening tag parse correctly
eval("?>".$fileContent);
// your echo goes here
// billion lines of code ;)
// file append mechanics
$fp = fopen("something.php", "a") or die ("Unexpected file open error!");
fputs($fp, "\n".'$usr[\''.$key.'\'] = \''.$val.'\';');
fclose($fp);
I have a file on a website. A PHP script modifies it like this:
$contents = file_get_contents("MyFile");
// ** Modify $contents **
// Now rewrite:
$file = fopen("MyFile","w+");
fwrite($file, $contents);
fclose($file);
The modification is pretty simple. It grabs the file's contents and adds a few lines. Then it overwrites the file.
I am aware that PHP has a function for appending contents to a file rather than overwriting it all over again. However, I want to keep using this method since I'll probably change the modification algorithm in the future (so appending may not be enough).
Anyway, I was testing this out, making like 100 requests. Each time I call the script, I add a new line to the file:
First call:
First!
Second call:
First!
Second!
Third call:
First!
Second!
Third!
Pretty cool. But then:
Fourth call:
Fourth!
Fifth call:
Fourth!
Fifth!
As you can see, the first, second and third lines simply disappeared.
I've determined that the problem isn't the contents string modification algorithm (I've tested it separately). Something is messed up either when reading or writing the file.
I think it is very likely that the issue is when the file's contents are read: if $contents, for some odd reason, is empty, then the behavior shown above makes sense.
I'm no expert with PHP, but perhaps the fact that I performed 100 calls almost simultaneously caused this issue. What if there are two processes, and one is writing the file while the other is reading it?
What is the recommended approach for this issue? How should I manage file modifications when several processes could be writing/reading the same file?
What you need to do is use flock() (file lock)
What I think is happening is that your script grabs the file while the previous script is still writing to it. Since the file is still being written to, it doesn't exist at the moment PHP grabs it, so PHP gets an empty string, and once the later process is done it overwrites the previous file.
The solution is to have the script usleep() for a few milliseconds when the file is locked and then try again. Just be sure to put a limit on how many times your script can try.
NOTICE:
If another PHP script or application accesses the file, it may not necessarily use/check for file locks. This is because file locks are often seen as an optional extra, since in most cases they aren't needed.
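A minimal sketch of that retry loop, reusing the file from the question (the attempt limit and sleep interval are arbitrary choices):
<?php
$fh = fopen('MyFile', 'c+'); // 'c+' opens read/write without truncating
$attempts = 0;

// Try a non-blocking exclusive lock; sleep briefly and retry on failure
while (!flock($fh, LOCK_EX | LOCK_NB)) {
    if (++$attempts >= 10) {
        die('Could not lock MyFile after 10 attempts');
    }
    usleep(5000); // wait 5 ms before trying again
}

$contents = stream_get_contents($fh);
// ** Modify $contents **
rewind($fh);
ftruncate($fh, 0);
fwrite($fh, $contents);
flock($fh, LOCK_UN);
fclose($fh);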
So the issue is parallel access to the same file: while one instance is writing to the file, another is reading it before the file has been updated.
PHP luckily has a mechanism for locking a file so no one can read from it until the lock is released and the file has been updated.
flock() can be used; see the documentation in the PHP manual.
You need to create a lock, so that any concurrent requests will have to wait their turn. This can be done using the flock() function. You will have to use fopen(), as opposed to file_get_contents(), but it should not be a problem:
$file = 'file.txt';
$fh = fopen($file, 'r+');
if (flock($fh, LOCK_EX)) { // Get an exclusive lock
    $data = fread($fh, filesize($file)); // Get the contents of the file
    // Do something with data here...
    ftruncate($fh, 0); // Empty the file
    rewind($fh); // Move the pointer back to the start before writing
    fwrite($fh, $newData); // Write new data to the file
    fclose($fh); // Close handle and release lock
} else {
    die('Unable to get a lock on file: '.$file);
}
I'm using this code below that simply takes the name of an artist submitted through a form and saves it as a html file on the server.
<?php
if (isset($_GET['artist'])) {
    $fileName = $_GET['artist'].".html";
    $fileHandler = fopen($fileName, 'w') or die("can't create file");
    fclose($fileHandler);
}
?>
What I'm trying to work out is how I could possibly add any code within the file before it is saved. That way every time a user adds an artist I can include my template code within the file. Any help would be great :)
Use fwrite.
Two things:
file_put_contents as a whole will be faster.
Your design is a very bad idea. They can inject a file anywhere on your filesystem, e.g. artist=../../../../etc/passwd%00 would try to write to /etc/passwd (%00 is a NUL byte, which used to make fopen truncate the string at the C level; PHP has rejected paths containing NUL bytes since 5.3.4, but the directory traversal itself still works).
fwrite() allows you to write text to the file.
if (isset($_GET['artist'])) {
    $fileName = $_GET['artist'].".html";
    $fileHandler = fopen($fileName, 'w') or die("can't create file");
    fwrite($fileHandler, 'your content goes here');
    fclose($fileHandler);
}
Warning - be very careful about what you write to the filesystem. In your example there is nothing stopping someone from writing to parts of the filesystem that you would never have expected (eg. artist='../index'!).
I sincerely recommend that you think twice about this, and either save the content to a database (using appropriate best practices, i.e. http://php.net/manual/en/security.database.sql-injection.php) or at least limit the characters used in the filename strings (e.g. only allow characters A-Z, a-z and '_', as in the sketch below). It's dangerous and exploitable otherwise, and at the very least you run the risk of your site being defaced or abused.
As others have said - you should be able to write php code to the filesystem with fwrite.
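To illustrate the whitelist recommendation above, a hedged sketch (the artists/ subdirectory is my own assumption):
<?php
if (isset($_GET['artist'])) {
    // Keep only characters known to be safe in a filename
    $artist = preg_replace('/[^A-Za-z0-9_]/', '', $_GET['artist']);
    if ($artist === '') {
        die('invalid artist name');
    }
    // Anchor the path to a fixed directory so it cannot escape it
    $fileName = __DIR__ . '/artists/' . $artist . '.html';
    file_put_contents($fileName, 'your template code goes here');
}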
I'm writing a system for a browser application that will store some particular php scripts in a database and then pull them out and execute them when needed. At first I tried using exec() and piping to php the output of a script that got the scripts out of the database and printed them. This worked in one use case, but not all, and feels brittle anyway, so I'm looking for a better way.
I'm now attempting to accomplish this through use of a PHP file stream in memory. For instance:
$thing = <<<'TEST'
<?php
$thing = array();
print "Testing code in here.";
var_dump($thing);
?>
TEST;
$filename = "php://memory";
$fp = fopen($filename, "w+b");
fwrite($fp, $thing);
//rewind($fp);
fclose($fp);
include "php://memory";
However, nothing is printed when the script is executed. Is this even possible by this means, and if not, is there another way to do this? I'm trying to avoid having to write temporary files and read from them, as I'm sure accessing the filesystem would slow things down. Is there a URL I can provide to "include" so that it will read the memory stream as if it were a file?
I don't think eval() would do this, as, if I remember correctly, it's limited to a single line.
Also, please no "eval = include = hell" answers. Non-admin users do not have access to write the scripts stored in the database, I know that this needs special treatment over the life-cycle of my application.
You need to use stream_get_contents to read from the php://memory stream. You cannot include it directly.
eval() and include are actually pretty much the same, so eval() works with multiple lines - just FYI. However, I would prefer include here; I always think it's faster, but maybe I'm wrong, no idea.
However, I think you should debug your code; I don't see a reason per se why it should not work. You might need to rewind the pointer (you have that line commented out), but what you should check first is that your PHP configuration allows including URLs (allow_url_include). I know that setting, when disabled, prevents use of data:// URIs, so you might need to have it enabled.
Also, you can always test whether PHP can open the memory stream by using file_get_contents and dumping the result. This should give you the code; if not, you have already made some mistake (e.g. no rewind or something similar).
Edit: I've not come that far (demo):
<?php
/**
* Include from “php://memory” stream
* @link https://stackoverflow.com/q/9944867/367456
*/
$thing = <<<TEST
<?php
\$thing = array();
print "Testing code in here.";
var_dump(\$thing);
TEST;
$filename = "php://memory";
$fp = fopen($filename, "w+b");
fwrite($fp, $thing);
rewind($fp);
var_dump(stream_get_contents($fp));
This is what I found out:
You should not close the "file": php://memory is a stream; once closed, it will disappear.
You would instead need to access $fp as a stream, which is not possible for include out of the box AFAIK.
You would then need to create a stream wrapper that maps a stream resource to a file name (see the sketch after this list).
When you've done that, you can include a memory stream.
You need to check the relevant PHP settings anyway; there is more than one, so consult the PHP manual.
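Here is a rough, untested sketch of that stream-wrapper idea; the wrapper name and class are my own invention:
<?php
// Hypothetical minimal wrapper: include reads whatever is in self::$code.
class InMemoryCode
{
    public static $code = '';
    private $position = 0;

    public function stream_open($path, $mode, $options, &$opened_path)
    {
        $this->position = 0;
        return true;
    }

    public function stream_read($count)
    {
        $chunk = substr(self::$code, $this->position, $count);
        $this->position += strlen($chunk);
        return $chunk;
    }

    public function stream_eof()
    {
        return $this->position >= strlen(self::$code);
    }

    public function stream_stat()
    {
        return []; // enough for include's purposes
    }
}

stream_wrapper_register('inmemory', 'InMemoryCode');

InMemoryCode::$code = '<?php print "Testing code in here.";';
include 'inmemory://code'; // prints: Testing code in here.
If I remember correctly, a wrapper registered without the STREAM_IS_URL flag is treated as local, so allow_url_include should not get in the way.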
It might be easier to use the data URI (demo):
<?php
/**
* Include from “php://memory” stream
* @link https://stackoverflow.com/q/9944867/367456
*/
$thing = <<<TEST
<?php
\$thing = array();
print "Testing code in here.";
var_dump(\$thing);
TEST;
// base64 avoids any issues with special characters in the code
include 'data://text/plain;base64,'. base64_encode($thing);
See as well: Include code from a PHP stream
If there were a way to include from php://memory, it would be a serious vulnerability. While it has many uses, eval is used quite often with code obfuscation techniques to hide malicious code.
(Every tool is a weapon if you hold it right.)
With that said, there (thankfully) doesn't appear to be any obvious way to include from php://memory
What's the cleanest way in php to open a file, read the contents, and subsequently overwrite the file's contents with some output based on the original contents? Specifically, I'm trying to open a file populated with a list of items (separated by newlines), process/add items to the list, remove the oldest N entries from the list, and finally write the list back into the file.
$fh = fopen($path, 'a+');
flock($fh, LOCK_EX);
$contents = fread($fh, filesize($path));
// process contents and remove old entries
fwrite($fh, $contents);
flock($fh, LOCK_UN);
fclose($fh);
Note that I need to lock the file with flock() in order to protect it across multiple page requests. Will the 'w+' flag when fopen()ing do the trick? The php manual states that it will truncate the file to zero length, so it seems that may prevent me from reading the file's current contents.
If the file isn't overly large (that is, you can be confident loading it won't blow PHP's memory limit), then the easiest way to go is to just read the entire file into a string (file_get_contents()), process the string, and write the result back to the file (file_put_contents()). This approach has two problems:
If the file is too large (say, tens or hundreds of megabytes), or the processing is memory-hungry, you're going to run out of memory (even more so when you have multiple instances of the thing running).
The operation is destructive; when the saving fails halfway through, you lose all your original data.
If any of these is a concern, plan B is to process the file and at the same time write to a temporary file; after successful completion, close both files, rename (or delete) the original file and then rename the temporary file to the original filename.
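A rough sketch of that plan B, with placeholder file names; the point is that rename() on the same filesystem swaps the result in atomically:
<?php
$src = fopen('data.txt', 'rb');
$tmpName = 'data.txt.tmp';
$tmp = fopen($tmpName, 'wb');

while (($line = fgets($src)) !== false) {
    // process $line here, writing only what should survive
    fwrite($tmp, $line);
}

fclose($src);
fclose($tmp);

// Readers see either the complete old file or the complete new one
rename($tmpName, 'data.txt');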
Read
$data = file_get_contents($filename);
Write
file_put_contents($filename, $data);
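Combined for the list-trimming task in the question (the file name and the keep-count of 100 are assumptions):
<?php
$lines = file('list.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$lines[] = 'new item';              // add items to the list
$lines = array_slice($lines, -100); // drop the oldest entries
file_put_contents('list.txt', implode("\n", $lines) . "\n");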
One solution is to use a separate lock file to control access.
This solution assumes that only your script, or scripts you have access to, will want to write to the file. This is because the scripts will need to know to check a separate file for access.
$file_lock = obtain_file_lock();
if ($file_lock) {
    $old_information = file_get_contents('/path/to/main/file');
    $new_information = update_information_somehow($old_information);
    file_put_contents('/path/to/main/file', $new_information);
    release_file_lock($file_lock);
}
function obtain_file_lock() {
    $attempts = 10;
    // There are probably better ways of dealing with waiting for a file
    // lock, but this shows the principle of dealing with the original
    // question.
    for ($ii = 0; $ii < $attempts; $ii++) {
        // Only need read access; note the lock file must already exist
        $lock_file = fopen('/path/to/lock/file', 'r');
        if (flock($lock_file, LOCK_EX)) {
            return $lock_file;
        } else {
            // Give the other process time to release the lock
            usleep(100000); // 0.1 seconds
        }
    }
    // This is only reached if all attempts fail.
    // Error code here for dealing with that eventuality.
}
function release_file_lock($lock_file) {
    flock($lock_file, LOCK_UN);
    fclose($lock_file);
}
This should prevent a concurrently running script from reading old information and updating it, which would cause you to lose information that another script recorded after your read. It allows only one instance of the script at a time to read the file and then overwrite it with updated information.
While this hopefully answers the original question, it doesn't give a good solution to making sure all concurrent scripts have the ability to record their information eventually.
I'm experimenting with the Twitter streaming API.
I use Phirehose to connect to Twitter and fetch the data, but I'm having problems storing it in files for further processing.
Basically what I want to do is to create a file named
date("YmdH").".txt"
for every hour of connection.
Here is how my code looks right now (not handling the hourly change of files):
public function enqueueStatus($status)
{
    $data = json_decode($status, true);
    if (isset($data['text']) /*more conditions here*/) {
        $fp = fopen("/tmp/$time.txt", "w");
        fwrite($fp, $status);
        fclose($fp);
    }
}
Help is as always much appreciated :)
You want the 'append' mode in fopen - this will either append to a file or create it.
if (isset($data['text']) /*more conditions here*/) {
    $fp = fopen("/tmp/" . date("YmdH") . ".txt", "a");
    fwrite($fp, $status);
    fclose($fp);
}
From the Phirehose googlecode wiki:
As of Phirehose version 0.2.2 there is an example of a simple "ghetto queue" included in the tarball (see files: ghetto-queue-collect.php and ghetto-queue-consume.php) that shows how statuses could be easily collected onto the filesystem for processing and then picked up by a separate process (consume).
This is a complete working sample of doing what you want to do, and the rotation time interval is configurable. Additionally, there's another script to consume and process the written files.
Now if only I could find a way to stop the whole script; my log keeps filling up (the script continues executing) even if I close the browser tab :P