Gzcompressed or plain string - PHP

I have a file plain.cache which is a little over 10MB, and I made a gzcompressed file gz.cache out of the original plain.cache file. Then I made two separate scripts which load each of the mentioned cache files, and I was somewhat surprised that the page load speed of both was almost the same. So my question is: am I right in concluding that the gzcompressed file does not benefit the page load speed in any way? I would conclude that the gzuncompress I use in the gz.php file produces exactly the same string as when I read it from the plain file. Given all these statements, the general question is: how can I (if it is done this way at all) increase the load speed by compressing the file with gzcompress?
The code of the files is as follows:
_makeCache.php, in which I make the gzcompressed version of the plain.cache file:
// Read the original cache and write a gzcompressed copy to gz.cache
$str = file_get_contents("plain.cache");
$strCompressed = gzcompress($str, 9); // 9 = maximum compression level
$file = "gz.cache";
$fp = fopen($file, "w");
fwrite($fp, $strCompressed);
fclose($fp);
plain.php:
echo file_get_contents("plain.cache");
gz.php:
echo gzuncompress(file_get_contents("gz.cache"));

Your HTTP server is compressing plain.cache automatically on the fly using gzip as well, and the client decompresses it, so you should see almost no difference.
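One way to actually get a benefit from the pre-compressed cache is to send the compressed bytes as-is and let the browser decompress them, instead of compressing 10MB on every request. This is a sketch, assuming the cache is rebuilt with gzencode() (gzip format, which browsers understand, unlike the zlib format produced by gzcompress()) and that the webserver is not configured to compress the response a second time:
<?php
// _makeCache.php variant: store the cache in gzip format
file_put_contents("gz.cache", gzencode(file_get_contents("plain.cache"), 9));

// gz.php variant: send the pre-compressed bytes without touching them
header("Content-Encoding: gzip");
header("Content-Length: " . filesize("gz.cache"));
readfile("gz.cache"); // make sure the server does not gzip this response again
Whether this helps in practice depends on how the webserver's own gzip module is configured.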

Related

Difference between readfile() and fopen()

These two snippets both read a file, so what's the main difference?
1. First code:
$handle = fopen($file, 'r');
$data = fread($handle, filesize($file));
2. Second code:
readfile($file);
There's a significant difference between fread() and readfile().
First, readfile() does a number of things that fread() does not. readfile() opens the file for reading, reads it, and then prints it to the output buffer all in one go. fread() only does one of those things: it reads bytes from a given file handle.
Additionally, readfile() has some benefits that fread() does not. For example, it can take advantage of memory-mapped I/O where available rather than slower disk reads. This significantly increases the performance of reading the file, since it delegates the work away from PHP itself and toward operating system calls.
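For comparison, a rough fread()-based equivalent of what readfile() does internally might look like this (a sketch; readfile() handles this more efficiently in a single call):
$handle = fopen($file, 'rb');
while (!feof($handle)) {
echo fread($handle, 8192); // read and output in 8 KB chunks
}
fclose($handle);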
Errata
I previously noted that readfile() could run without PHP (this is corrected below).
For truly large files (think several gigs, like media files or large archive backups), you may want to consider delegating the reading of the file away from PHP entirely with an X-Sendfile header to your webserver instead, so that you don't keep your PHP worker tied up for the length of a transfer that could potentially take hours.
So you could do something like this instead of readfile():
<?php
/* process some things in php here */
header("X-Sendfile: /path/to/file");
exit; // don't need to keep PHP busy for this
Reading the docs: readfile() reads the whole content and writes it to STDOUT, while fread() puts the content into the variable $data:
$data = fread($handle, filesize($file));

Download the first 5kb of a file with PHP as plain text?

Let's say I'd like to download some information from a file on the internet within PHP, but I do not need the entire file. Therefore, loading the full file through
$my_file = file_get_contents("https://www.webpage.com/".$filename);
would use up more memory and resources than necessary.
Is there a way to download only e.g. the first 5kb of a file as plain text with PHP?
EDIT:
In the comments it was suggested to use e.g. the maxlen argument of file_get_contents or similar. But what I noticed is that the execution time of the call does not vary appreciably for different maxlen values, which suggests that the function loads the full file and then just returns a substring to the variable.
Is there a way to make PHP download just the required amount of bytes and no more, to speed things up?
<?php
$fp = fopen("https://www.webpage.com/".$filename, "r");
$content = fread($fp, 5*1024); // read at most 5 KB
fclose($fp);
?>
Note: Make sure allow_url_fopen is enabled.
PHP Doc: fopen, fread
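Alternatively, stream_get_contents() also accepts a maximum length (a sketch using the same URL; the point is that closing the stream early keeps the rest of the file from being downloaded):
<?php
$fp = fopen("https://www.webpage.com/".$filename, "r");
$content = stream_get_contents($fp, 5*1024); // read at most 5 KB, then stop
fclose($fp); // closing early means the remainder is never fetched
?>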

PHP: GET # of characters from a URL and then stop/exit?

For parsing large files on the internet, or just wanting to get the opengraph tags of a website, is there a way to GET a webpage's first 1000 characters and then to stop downloading anything else from the page?
When a file is several megabytes, it can take the server a while to parse, especially when operating on many of these files. Even more troublesome than bandwidth are CPU/RAM constraints, since files that are too large are difficult to work with in PHP and the server can run out of memory.
Here are some PHP commands that can open a webpage:
fopen
file_get_contents
include
fread
url_get_contents
curl_init
curl_setopt
parse_url
Can any of these be set to download a specific number of characters and then exit?
Something like this?
<?php
if ($handle = fopen("http://www.example.com/", "rb")) {
echo fread($handle, 8192); // output only the first 8 KB
fclose($handle);
}
Taken from the examples in the official php.net function documentation...
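A curl-based sketch of the same idea, using a write callback that aborts the transfer once enough bytes have arrived (the URL and the 1000-byte limit are placeholders; curl_exec() will report a write error when we abort, which is expected here):
<?php
$limit = 1000; // number of bytes we actually want
$data = '';
$ch = curl_init("http://www.example.com/");
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$data, $limit) {
    $data .= $chunk;
    // returning a length different from strlen($chunk) makes curl abort the transfer
    return strlen($data) >= $limit ? 0 : strlen($chunk);
});
curl_exec($ch);
curl_close($ch);
echo substr($data, 0, $limit);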

(PHP) Unzip and rename CSV files?

I want to download different feeds from some publishers. But the trouble is that they are, first of all, zipped as .gz and, second, not in the right format. You can download one of the feeds and check it out. They do not have any filespec, so I'm forced to add the .csv myself.
My question now is: how can I unzip those files from the different URLs?
I know how to rename them, but how do I unzip them?
I already searched for it and found this one:
//This input should be from somewhere else, hard-coded in this example
$file_name = '2013-07-16.dump.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while (!gzeof($file)) {
// Read buffer-size bytes
// Both fwrite and gzread are binary-safe
fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
But with those feeds it doesn't work...
Here are two example files: file one | file two
Do you have an idea? I would be very grateful!
Greetings!
Windows 10 + PHP 7.1.4: it works.
The following code has the same effect.
ob_start();
readgzfile($file_name); // decompress and write to the output buffer
file_put_contents($output_filename, ob_get_contents()); // save the buffered, decompressed data
ob_end_clean(); // discard the buffer and turn buffering off
Or you can try using the gzip command to decompress the file and then work with it (see the sketch below).
Program execution Functions
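A minimal sketch of that approach, assuming the gzip binary is available on the server and program-execution functions are not disabled (the .csv renaming the question mentions is tacked on at the end):
$file_name = '2013-07-16.dump.gz';
exec('gzip -d ' . escapeshellarg($file_name)); // replaces the .gz with the decompressed file
rename(str_replace('.gz', '', $file_name), str_replace('.gz', '.csv', $file_name)); // add the missing .csv extension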

Secure delete with PHP 5.3.x

Does someone know a good PHP solution to delete, or better, wipe a file from a Linux system?
Scenario:
The file is encrypted and saved; when a download is requested, the file is copied to a temporary folder and decrypted. This is already working.
But how do I remove the file from the temporary location after sending it to the user?
In my mind I have the following options:
Open the file via fopen() and write 0/1 into it (probably very slow)
Save the file to Memcache instead of the hard disk (could be a problem with my hoster)
Use some 3rd party tool on the command line or as a cronjob (could be a problem to install)
Goal: Delete the file from the hard disk, without the possibility of recovering it (wipe/overwrite)
Call "shred" via exec/system/passthru
Arguably the best is to never save the file in its decrypted state in the first place.
Rather, use stream filters to decrypt it on-the-fly and send it directly to the end-user.
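A rough sketch of the stream-filter approach; DecryptFilter and demo.decrypt are placeholder names, and the actual decryption step depends on your cipher:
class DecryptFilter extends php_user_filter {
    public function filter($in, $out, &$consumed, $closing) {
        while ($bucket = stream_bucket_make_writeable($in)) {
            // $bucket->data = my_decrypt($bucket->data); // plug the real cipher in here
            $consumed += $bucket->datalen;
            stream_bucket_append($out, $bucket);
        }
        return PSFS_PASS_ON;
    }
}
stream_filter_register('demo.decrypt', 'DecryptFilter');
$fp = fopen('/path/to/encrypted.file', 'rb');
stream_filter_append($fp, 'demo.decrypt', STREAM_FILTER_READ);
fpassthru($fp); // decrypted bytes go straight to the client, never to disk
fclose($fp);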
Update
Your option 1 is actually not too bad if you consider this code:
$filename = 'path/to/file';
$size = filesize($filename);
$src = fopen('/dev/zero', 'rb');
$dest = fopen($filename, 'wb');
stream_copy_to_stream($src, $dest, $size);
fclose($src);
fclose($dest);
You could choose /dev/urandom as well, but that will be slow.
