Include -- using a string rather than a filename - php

I'd like to run include on a string rather than a file, but am unsure how to achieve this.
// This is the desired functionality:
include($filename);
// But I want to do something like this instead:
$file_contents = getFileFromCacheOrSomewhereElse($filename);
include($file_contents); // Doesn't work...
eval($file_contents);    // Also incorrect.
Please note: "eval" is not the same as include -- "include" echoes out the contents of the file (and executes any PHP tags), while "eval" executes the string as PHP code.
An example use case is loading a template file from Memcache (as a string), then running include on that string, rather than including the file from disk and relying on PHP's file cache.

If you can turn on the allow_url_fopen and allow_url_include php.ini settings, then an alternative is the data:// stream wrapper (see the PHP manual).
include 'data:text/plain,' . urlencode($file_contents);
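A variant of the same idea, as a sketch: base64-encode the payload so nothing needs URL escaping. It still requires both ini settings above.

// Assumes allow_url_fopen and allow_url_include are enabled.
// The data wrapper also accepts a base64-encoded payload.
include 'data:text/plain;base64,' . base64_encode($file_contents);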

eval("?>" . $file_contents . "<?php ");
does it. The leading ?> drops eval() out of PHP mode, so the string is treated exactly like the contents of an included file: literal text is echoed, and any embedded <?php ... ?> blocks are executed.
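Wrapped up as a helper, this might look like the sketch below (include_string is a made-up name, not a built-in; note that, unlike a real include, variables end up in the helper's scope):

// Hypothetical helper: runs a string as if it were an included file.
function include_string($code) {
    // The leading close-tag puts eval() in literal-output mode,
    // mirroring how include treats file contents.
    eval('?>' . $code);
}

$file_contents = getFileFromCacheOrSomewhereElse($filename);
include_string($file_contents);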

Storing PHP code in memcache is not the best idea, and eval'ing it afterwards is even worse.
Any opcode cache (APC, eAccelerator, and the like) will cache your PHP files on the fly, with no strange contortions like this, and will even keep them pre-parsed for faster execution.
EDIT. Given the voting results after all these years, I assume that this question attracts only noobs who share the same strange whim. So I have to repeat: although it defeats your brilliant idea,
just leave your includes as is
They will be cached much better and executed much faster by PHP's internal opcode cache.

Related

How to avoid php parsing the whole php file and includes and make it only parse what will be used?

I read somewhere that PHP parses the whole .php file every time it is executed. A solution was proposed there (it was not opcache), but I lost the website and couldn't find it again.
Now I have an enormous PHP website with many long functions that are often used alone, and execution needs to be fast.
To avoid having PHP parse all the other functions that won't be used, I was thinking of a modular design in which the functions, stored in independent PHP files, are only included if they are actually used. But I haven't been able to confirm that PHP will not parse an include inside a function or inside a conditional statement unless it is required. Does PHP parse those includes?
Example:
<?php
$func_to_execute = $_GET['func'];
$parameter = $_GET['parameter'];

switch ($func_to_execute) {
    case 'a':
        include 'func_a.php';
        $output = func_a($parameter);
        break;
    case 'b':
        include 'func_b.php';
        $output = func_b($parameter);
        break;
    case 'c':
        include 'func_c.php';
        $output = func_c($parameter);
        break;
}

echo $output;
?>
In this example, I would like PHP to parse only func_a if I am requesting a, only func_b if I am requesting b, et cetera. There are in practice more than just 3 functions, and each is a very long algorithm that also carries very long strings and arrays.
As an alternative to includes, I was thinking of making independent PHP files and executing them with shell_exec, retrieving their output only when required. But that would introduce other complexities, like formatting the parameters (I have no idea how I would pass a very long string with special characters, or a JSON value, as a shell argument) and invoking the right function from the shell. Would those complexities make it slower than just letting PHP parse the whole file?
I know about opcache. Would it be enough, even if the opcodes of all the functions are checked each time?
Are there other ways to make a PHP website modular, without PHP parsing whole PHP files every time?
Thank you.
Since PHP applies many optimizations and caches (APCu, for instance), you don't need to worry about this.
An include isn't parsed at load time; it works more like file_get_contents plus execution in the same context, and the result is optimized by PHP's internal opcode cache.
http://php.net/manual/en/intro.apc.php
I ran a benchmarking experiment, and it seems that PHP truly does not parse conditional includes. I performed the test using the example script above, defining each function file as follows:
func_a: it only assigns the sentence 'war and peace' to the variable $x.
$x = 'war and peace';
func_b: it assigns to $x the whole text of the novel War and Peace, approximately 3.2 MB long (the entire text was pasted into the PHP file). This would be a very long file to parse.
$x = 'War and Peace, by Leo Tolstoy...(the whole novel...)...';
func_c: it contained invalid syntax, which should immediately raise a parse error. This was done to verify that PHP was not parsing what was not included.
I measured the execution time from another PHP script using shell_exec(); a sketch of that harness follows the results. The results were (in seconds):
func_a ≈ 0.122
func_b ≈ 0.152
func_c ≈ 0.119
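For illustration, the harness looked roughly like the sketch below. The dispatcher file name dispatch.php is an assumption, and when invoked via the CLI the dispatcher would read $argv rather than $_GET.

<?php
// Hypothetical timing harness around the switch-based dispatcher above.
foreach (['a', 'b', 'c'] as $func) {
    $start = microtime(true);
    shell_exec('php dispatch.php ' . escapeshellarg($func));
    printf("func_%s ≈ %.3f s\n", $func, microtime(true) - $start);
}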
Therefore I conclude that:
- Includes inside a switch statement are not parsed unless they are actually reached.
- A syntax error in an include (inside a switch statement) will not raise any error if that branch is not taken, because the file is never parsed.
- In any case, the difference in processing time is small: about 0.03 extra seconds for roughly 3 MB of extra text, or crudely put, about 0.01 extra seconds per MB of text to parse. That might still matter when many users request the website at the same time, so dividing a genuinely big script into modules (includes) can be useful. The fact that a broken include is not parsed when it is not required also avoids raising errors that aren't relevant.
It seems to me, then, a good way to design a modular PHP application whose modules are extremely big.

Is it possible to change the behavior of PHP's print_r function [duplicate]

This question already has answers here: making print_r use PHP_EOL (5 answers). Closed 6 years ago.
I've been coding in PHP for a long time (15+ years now), and I usually do so on a Windows OS, though most of the time it's for execution on Linux servers. Over the years I've run up against an annoyance that, while not important, has proved to be a bit irritating, and I've gotten to the point where I want to see if I can address it somehow. Here's the problem:
When coding, I often find it useful to output the contents of an array to a text file so that I can view its contents. For example:
$fileArray = file('path/to/file');
$faString = print_r($fileArray, true);
$save = file_put_contents('fileArray.txt', $faString);
Now when I open the file fileArray.txt in Notepad, the contents are all displayed on a single line, rather than the nice, pretty structure seen if the file is opened in Wordpad. This is because, regardless of OS, PHP's print_r function uses \n for newlines rather than \r\n. I could certainly perform the replacement myself with a single extra line of code, and therein lies the problem: that one extra line translates, back through my years of coding, into literally hundreds of extra steps that should not be necessary. I'm a lazy coder, and this has become unacceptable.
Currently, on my dev machine, I've got a different sort of work-around in place (shown below), but it has its own set of problems, so I'd like to find a way to "coerce" PHP into emitting the "proper" newline characters without all that extra code. I doubt that this is possible, but I'll never find out if I never ask, so...
Anyway, my current work-around goes like this. I have, in my PHP include path, a file (print_w.php) which includes the following code:
<?php
function print_w($in, $saveToString = false) {
    $out = print_r($in, true);
    $out = str_replace("\n", "\r\n", $out);
    switch ($saveToString) {
        case true: return $out;
        default:   echo $out;
    }
}
?>
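Usage is then the same as with print_r; for example:

$fileArray = file('path/to/file');
// print_w($in, true) returns the CRLF-normalized dump as a string.
file_put_contents('fileArray.txt', print_w($fileArray, true));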
I also have auto_prepend_file set to this same file in php.ini, so that it automatically includes it every time PHP executes a script on my dev machine. I then use the function print_w instead of print_r while testing my scripts. This works well, so long as when I upload a script to a remote server I make sure that all references to the function print_w are removed or commented out. If I miss one, I (of course) get a fatal error, which can prove more frustrating than the original problem, but I make it a point to carefully proofread my code prior to uploading, so it's not often an issue.
So after all that rambling, my question is, Is there a way to change the behavior of print_r (or similar PHP functions) to use Windows newlines, rather than Linux newlines on a Windows machine?
Thanks for your time.
OK, after further research, I've found a better work-around that suits my needs and eliminates the need to call a custom function instead of print_r. The new work-around goes like this:
I still need an included file (I've kept the same name so as not to have to mess with php.ini), and php.ini still has the auto_prepend_file setting in place, but the code in print_w.php changes a bit:
<?php
rename_function('print_r', 'print_rw');
function print_r($in, $saveToString = false) {
    $out = print_rw($in, true);
    $out = str_replace("\n", "\r\n", $out);
    switch ($saveToString) {
        case true: return $out;
        default:   echo $out;
    }
}
?>
This effectively alters the behavior of the print_r function on my local machine, without my having to call custom functions and make sure all references to them are neutralized. Note that rename_function() is not part of core PHP: it is provided by the APD PECL extension, which has to be installed for this to work. With it, I was able to effectively rewrite how print_r behaves, making it possible to address my problem.

Using ftp_get() when there are "spaces" in the file path and filename

I need to download a file via PHP ftp_get(), but the foolish provider is using directories and file names containing whitespace. The file path I'm dealing with is similar to /product info/more stuff/inventory and stuff.csv
The spaces in the path and in the filename itself are making it difficult to retrieve anything. I already tried the following without success:
$path = "/product\ info/more\ stuff/inventory\ and\ stuff.csv";
$path = "/product%20info/more%20stuff/inventory%20and%20stuff.csv";
$path = '"/product info/more stuff/inventory and stuff.csv"';
Thanks again for taking the time to help me out.
Your third attempt, quoting the complete path, was already the recommended approach, though it very much depends on the actual server implementation.
FTP per RFC 959 consists of a terminal session and a data transfer channel. Basically, the FTP server provides a mini-shell on the command port. As such, typical shell string-escaping rules apply. URL encoding can be ruled out here.
I'd advise using single quotes first, however. Preferably use escapeshellarg() to apply them. And try ftp_nb_get() while you're at it.
$path = "/foo foo/bar bar/baz baz.csv";
ftp_nb_get($con, "save.csv", escapeshellarg($path), FTP_BINARY);
If that doesn't work, further debugging is necessary. While all ftp_* function arguments are passed on unprocessed, you could also try sending an ftp_raw request. This won't actually activate the data channel for reading, but it might return a more concrete error response.
print_r(ftp_raw($con, "RETR '/path to/some file.csv'\r\n"));
And I'm just gonna say it: if you're still getting a file-not-found error then, it's entirely possible that the file really doesn't exist at the presumed location. In that case, manually traverse the directory structure with ftp_nlist and ftp_rawlist and var_dump the results (watch for extra trailing spaces in subdirectory names).
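A minimal sketch of that manual traversal (the directory names are placeholders taken from the example path):

// Walk down the tree level by level, dumping raw listings
// to spot stray trailing spaces or otherwise odd names.
foreach (['/', '/product info', '/product info/more stuff'] as $dir) {
    ftp_chdir($con, $dir);
    var_dump(ftp_nlist($con, '.'));
    var_dump(ftp_rawlist($con, '-a .'));
}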
Alternatively just use PHP's ftp:// stream wrapper (which also supports PASV mode). Its implementation is distinct from that of the ext/ftp functions. Funnily enough, URL encoding is again the correct approach here, but quoting is still necessary (ftp_fopen_wrapper.c does not quote by itself):
$data = file_get_contents("ftp://user:pw@example.org/'path%20to/file%20and.csv'");
// Inline quotes may likely trip up some FTP server implementations...
A much better alternative, though, is just to use cURL.
// Pseudo-code shorthand; in reality you'll have to use the long-winded curl_* functions.
print curl("ftp://.../file with spaces.csv")->exec();
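Spelled out with the stock curl extension, that would look roughly like the following sketch (the URL, credentials, and path are placeholders based on the example above):

// Fetch a file with spaces in its path over FTP using ext/curl.
$ch = curl_init('ftp://user:pw@example.org/product%20info/more%20stuff/inventory%20and%20stuff.csv');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
$csv = curl_exec($ch);
if ($csv === false) {
    echo curl_error($ch);
}
curl_close($ch);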
The last option is just resorting to calling a Unixland client (if not wget, then a plain ftp client):
$url = escapeshellarg("ftp://user:pw@ftp.example.org/$path");
$file = `wget $url`;
If you still can't retrieve any files, you'll have to look for an alternative FTP client in PHP for further debugging. Guess who wrote one.
To get a list of files or folders with spaces in the path:
ftp_chdir($conn, $path);
$children = ftp_rawlist($conn,'-a .');

PHP: How to solve ob_start() in combination with imagepng() issue?

I use the following code to create an image and encode it to base64. There is no direct output of the image.
ob_start(); // catching the output buffer
imagepng($imgSignature);
$base64Signature=base64_encode(ob_get_contents());
ob_end_clean();
ob_start recently started throwing a 500 error and I'm having trouble figuring out the issue. The server uses PHP 5.4.11. I don't know whether it was running the same version when I installed the script, or whether the memory is running full. I know that ob_start's behavior has changed across PHP versions, and I'm having a hard time wrapping my head around this. Is the script correct for PHP 5.4.11?
I really appreciate any help.
I'm not sure how to solve your issue with ob_start(), but I have an alternative for what you are doing that doesn't involve output buffers.
imagepng($imgSignature, 'php://memory/file.png');
$base64Signature = base64_encode(file_get_contents('php://memory/file.png'));
This basically saves the PNG image to a virtual temporary file that exists only in memory; you then read it back and get the same result.
My theory about your error:
At some point in your code, you will have this image stored multiple times in memory: in $imgSignature, in the internal buffer you created with ob_start(), in the buffer you read with ob_get_contents(), and in the resulting value of base64_encode(). Pretty much all in one line. God only knows how much memory it's using, not to mention the resources you probably allocated earlier while building this image.
It is important not to have too much allocated at the same time, especially when dealing with memory-hungry resources like images. If you unset() or overwrite variables you no longer need, you allow the garbage collector to dispose of those unreferenced resources and free the memory.
For instance, you can change the way this piece of code was written to this:
ob_start();
imagepng($imgSignature);
imagedestroy($imgSignature);
$data = ob_get_contents();
ob_end_clean();
$data = base64_encode($data);
I dropped $imgSignature as soon as I didn't need it anymore, ended and cleaned my buffer as soon as I was done getting what I wanted from it, and then disposed of the intermediate $data by overwriting it with the base64-encoded result I actually wanted.
Now this will use significantly less memory. If you extend this approach to the rest of your code, or at least to the parts that use a lot of memory, like the images you loaded or created with the GD2 lib, it should optimize the memory usage of your script, giving you the extra space you need.

PHP - *fast* serialize/unserialize?

I have a PHP script that builds a binary search tree over a rather large CSV file (5MB+). This is nice and all, but it takes about 3 seconds to read/parse/index the file.
Now I thought I could use serialize() and unserialize() to quicken the process. When the CSV file has not changed in the meantime, there is no point in parsing it again.
To my horror, I found that calling serialize() on my index object takes 5 seconds and produces a huge (19 MB) text file, whereas unserialize() takes an unbearable 27 seconds to read it back. Improvements look a bit different. ;-)
So - is there a faster mechanism to store/restore large object graphs to/from disk in PHP?
(To clarify: I'm looking for something that takes significantly less than the aforementioned 3 seconds to do the de-serialization job.)
var_export should be lots faster as PHP won't have to process the string at all:
// export the processed CSV to export.php
$php_array = read_parse_and_index_csv($csv); // takes 3 seconds
$export = var_export($php_array, true);
file_put_contents('export.php', '<?php $php_array = ' . $export . '; ?>');
Then include export.php when you need it:
include 'export.php';
Depending on your web server setup, you may need to adjust the file permissions so the server can read export.php.
Try igbinary...did wonders for me:
http://pecl.php.net/package/igbinary
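Usage is a sketch along these lines, assuming the PECL extension is installed ($index and index.bin are illustrative names, standing in for your search-tree object and cache file):

// igbinary_* are drop-in, binary-format counterparts of serialize()/unserialize().
$blob = igbinary_serialize($index);
file_put_contents('index.bin', $blob);

// Later, restore the index without re-parsing the CSV.
$index = igbinary_unserialize(file_get_contents('index.bin'));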
First you have to change the way your program works: divide the CSV file into smaller chunks. This is an IP datastore, I assume.
Convert all IP addresses to integers or longs, so when a query comes in you know which part to look in.
PHP's ip2long() and long2ip() functions do that conversion.
So, over the 0 to 2^32 address range, split all IP addresses across roughly 100 smaller files (5000K/50K).
This approach brings you quicker serialization.
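A rough sketch of that bucketing idea (the chunk count and file naming are illustrative assumptions):

// Pick the bucket file for one lookup by the IP's position in the 2^32 space.
$buckets = 100;                        // illustrative chunk count
$ip = ip2long('203.0.113.42');         // dotted quad -> integer
// (on 32-bit builds, wrap this in sprintf('%u', ...) to keep it unsigned)
$bucket = (int) floor($ip / 4294967296 * $buckets);
$file = "ipstore_{$bucket}.ser";       // hypothetical per-chunk store
// Only this one small file has to be unserialized per query.
$chunk = unserialize(file_get_contents($file));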
Think smart, code tidy ;)
It seems that the answer to your question is no.
Even if you discover a "binary serialization format" option, most likely even that would be too slow for what you envisage.
So, what you may have to look into using (as others have mentioned) is a database, memcached, or on online web service.
I'd like to add the following ideas as well:
caching of requests/responses
your PHP script does not shut down but becomes a network server that answers queries
or, dare I say it, change the data structure and method of query you are currently using
I see two options here:
string serialization, in the simplest form something like
write => implode("\x01", (array) $node);
read  => explode() + $node->payload = $a[0]; $node->value = $a[1]; etc.
binary serialization with pack()
write => pack("fnna*", $node->value, $node->le, $node->ri, $node->payload);
read  => $node = (object) unpack("fvalue/nle/nri/a*payload", $data);
It would be interesting to benchmark both options and compare the results.
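A self-contained sketch of the pack() variant (the node layout is an assumption: a float value, two 16-bit child links, and a string payload):

// 'f' = 4-byte float, 'n' = 16-bit unsigned big-endian, 'a*' = rest as string.
function node_to_binary($node) {
    return pack('fnna*', $node->value, $node->le, $node->ri, $node->payload);
}

function node_from_binary($data) {
    return (object) unpack('fvalue/nle/nri/a*payload', $data);
}

// Round trip (note: 'f' is single precision, so expect slight float drift):
$node = (object) ['value' => 3.14, 'le' => 1, 'ri' => 2, 'payload' => 'leaf'];
$copy = node_from_binary(node_to_binary($node));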
If you want speed, writing to or reading from the file system is less than optimal.
In most cases, a database server will be able to store and retrieve data much more efficiently than a PHP script that is reading/writing files.
Another possibility would be something like Memcached.
Object serialization is not known for its performance but for its ease of use, and it's definitely not suited to handling large amounts of data.
SQLite ships with PHP, so you could use that as your database. Otherwise you could try using sessions; then you don't have to serialize anything, you just save the raw PHP object.
What about using something like JSON as the format for storing/loading the data? I have no idea how fast the JSON parser is in PHP, but it's usually a fast operation in most languages, and it's a lightweight format.
http://php.net/manual/en/book.json.php
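As a sketch (the cache file name index.json is an assumption):

// Cache the parsed index as JSON beside the CSV.
file_put_contents('index.json', json_encode($php_array));

// Later: reload without re-parsing the CSV.
$php_array = json_decode(file_get_contents('index.json'), true); // true => assoc arrays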
