place file into zip and stream to browser - zero physical files - php

Basically, we currently generate a CSV file and push it to the client's browser to download, so no physical files are involved or stored, as these are reports that always change.
These have become quite large in some places and I need to reduce the size of the downloads and speed them up for clients.
I decided to go the zip route and got the test script below working fine. However, I am still having to generate the zip file physically and then delete it afterwards; obviously there is a speed cost here, as I am writing and reading the file when I only need to push the zip file's contents to the browser.
Is there no way to create a stream with the zip library?
<?php
//report
$reportFilename = 'report';
$data = '005,"756607 ","WED","L","TEST UITENHAGE CBD F NFD"
005,"756608 ","MON","L","TEST SUMMERSTRAND NNB "
005,"756634 ","WED","L","TEST UITENHAGE PENFORD F"
005,"756776 ","MON","L","TEST WALKER DRIVE FOOD "
005,"756858 ","MON","C","TEST R ADAMI&SONS C C "
005,"801002 ","MON","L","TESTMOFFET NNB "
005,"CP00270 ","WED","L","TEST WALMER P FNF "
'; //dummy data
// populate fake data...
$reportData = $data.$data.$data.$data.$data.$data.$data.$data.$data.$data.$data;
//make the temp file name unique so no clashing occurs
$zipFilename = '_temp_'.microtime(true).'.zip';
//zip data
$zip = new ZipArchive();
$zip->open($zipFilename, ZipArchive::CREATE);
$zip->addFromString($reportFilename.'.csv', $reportData); //add report
$zip->close();
//read the finished archive back in, then remove it from disk
$zipData = file_get_contents($zipFilename);
$zipSize = filesize($zipFilename);
unlink($zipFilename);
header("Content-Description: File Transfer");
header("Content-Disposition: attachment; filename=\"".$reportFilename.".zip\"");
header('Content-Transfer-Encoding: binary');
header("Content-Type: application/zip");
header("Content-Length: ".$zipSize);
echo $zipData;
?>
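
For reference: if a compressed .csv.gz download is acceptable instead of a .zip, the compression can be done entirely in memory with gzencode(), so no temporary file is ever written. A rough sketch based on the script above:
<?php
// sketch: compress the report in memory and stream it as report.csv.gz,
// reusing $reportFilename and $reportData from the script above
$gzData = gzencode($reportData, 9); // 9 = maximum compression level
header("Content-Description: File Transfer");
header("Content-Disposition: attachment; filename=\"".$reportFilename.".csv.gz\"");
header("Content-Type: application/gzip");
header("Content-Length: ".strlen($gzData));
echo $gzData;
?>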

Related

export mysql tables and zipArchive

I've searched all the posts but still couldn't get it to work. With a button press I want to export a "page preset", to be precise a restaurant menu preset; it includes CSS files, MySQL tables, etc. I want to be able to import it into another 'menu'. First I'm trying to export the MySQL database. Should I use mysqldump or SELECT * INTO OUTFILE?
I'm using this line:
exec("mysqldump --user=$dbusername --password=$dbpassword restaurantsdb meal --where=restaurant_id=$restId > tables/meal.sql");
restaurantsdb is the database name and meal is the table name. I also only want the rows WHERE restaurant_id = {id}.
I'm trying to understand how mysqldump works; should this line work for me?
Next I'm creating the ZipArchive file and adding some directories. I tried adding a .txt file, which works, but it doesn't seem to find the meal.sql file.
$zip = new ZipArchive();
if ($zip->open($zip_file, ZipArchive::CREATE) !== TRUE) {
    exit("error");
}
$zip->addEmptyDir('TestFiles');
$zip->addEmptyDir('tables');
// the plain text file is added fine...
$fileToZip = __DIR__.'/hello.txt';
$zip->addFile($fileToZip, "TestFiles/text.txt");
// ...but this one does not end up in the archive
$fileToZip = __DIR__.'/tables/meal.sql';
$zip->addFile($fileToZip, "tables/meal.sql");
// note: $file_url is not defined in the snippet shown here
$download_file = file_get_contents($file_url);
$zip->addFromString(basename($file_url), $download_file);
$zip->close();
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="'.basename($zip_file).'"');
header("Content-Length: " . filesize($zip_file));
header("Pragma: no-cache");
header("Expires: 0");
ob_clean();
flush();
readfile($zip_file);
unlink($zip_file);
exit;
Also, is it better to export as .sql or .csv ? Later when importing, I need to be able to change id's of all rows to specified, just before importing to database. It's basically cloning same data, but different id's.
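
One way to sidestep the missing meal.sql problem is to capture the mysqldump output in PHP and, where the script currently calls addFile() for meal.sql, add the captured dump with addFromString() instead, so nothing has to be written to tables/ first. A sketch, assuming $dbusername, $dbpassword and $restId are defined as in the exec() line above:
// sketch: capture the dump in memory instead of redirecting it to tables/meal.sql
$cmd = sprintf(
    'mysqldump --user=%s --password=%s restaurantsdb meal --where=%s',
    escapeshellarg($dbusername),
    escapeshellarg($dbpassword),
    escapeshellarg('restaurant_id='.(int)$restId)
);
exec($cmd, $dumpLines, $exitCode); // $dumpLines receives one array entry per output line
if ($exitCode !== 0) {
    exit('mysqldump failed');
}
// $zip is the already-opened ZipArchive from the snippet above
$zip->addFromString('tables/meal.sql', implode("\n", $dumpLines));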

PHP on the fly flush xml to zipfile and push download

I create an XML file based on information from my database (XMLTV format). These XML files can be quite big; 25-70 MB is normal. Currently I create the XML file on the fly like this:
$xmlWriter = new XMLWriter();
$xmlWriter->openURI('php://output');
and flush throughout the loop to prevent memory overflow. I also set headers to push the content as a download:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $config->filename . '.xml"');
I would like to be able to zip/gzip the XML because of the size. Is this possible on the fly? I have used PHPZip before, which works well with files, but I don't know if I can write the XML output directly to the zip.
If I have understood correctly, the goal is to create gzip compressed data dynamically, without creating a file on the server. This is possible with deflate_init and deflate_add, but requires PHP 7.
$gzip = deflate_init(ZLIB_ENCODING_GZIP, array('level' => 9));
$data = deflate_add($gzip, "my", ZLIB_NO_FLUSH);
$data .= deflate_add($gzip, "data", ZLIB_FINISH);
With deflate_add we can add more data any number of times (the mode should be ZLIB_FINISH for the last chunk).
We can adapt this method using XMLWriter::openMemory (which stores the data in memory) and XMLWriter::flush, to compress the XML elements as they are produced and create the contents of a .gz file on the fly. For example, this code:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="file.gz"');
$xmlWriter = new XMLWriter();
$xmlWriter->openMemory();
$xmlWriter->startDocument('1.0', 'UTF-8');
$gzip = deflate_init(ZLIB_ENCODING_GZIP, array('level' => 9));
for ($i = 0; $i < 10; $i++) {
    $xmlWriter->writeElement("element", $i);
    $data = $xmlWriter->outputMemory();
    echo deflate_add($gzip, $data, ZLIB_NO_FLUSH);
}
echo deflate_add($gzip, "", ZLIB_FINISH);
creates the XML elements, compresses and outputs them one by one, without using much memory or any files.

pass mysql search results url variables into 2nd php processor to download preferred files in a zip file

SITREP
In the middle of building a cart-like document builder/downloader in PHP 5.6.11 and MySQL 5.6.30 on Ubuntu 15.04. A lot of the parts work individually but not together.
A database table contains the specific names of products along with the corresponding documentation URLs (where the PDFs are stored on the server).
Here's the sequence of events, dictated by the end-user's workflow:
Set session cookie. (TBD, probably a hidden iframe). PHP Page #1.
User searches via a form, submits, and the page echoes dozens of results + matching URLs from the database with checkboxes. PHP Page #1.
User checks a few boxes for items they want to download. PHP Page #1.
User searches more, checks some more items. PHP Page #1.
User hits the download button. PHP Page #1.
PHP Page #2 uses the $results['url'] variables from PHP Page #1 in an array, creates a zip, adds the files that were checked and prompts the download.
I need a solution to pass the returned URL variables $results['url'] from the database search results on PHP Page #1 into the PHP Page #2 download processor array.
PHP page #1, query is:
$raw_results = mysql_query("SELECT * FROM mobilesearchspec
WHERE (`url` LIKE '%".$query."%') OR (`product` LIKE '%".$query."%') ORDER BY product ") or die(mysql_error());
if(mysql_num_rows($raw_results) > 0){ // if one or more rows are returned do following
while($results = mysql_fetch_array($raw_results)){
echo "<p><input type=checkbox name=item[] value=$results[url]>" ; echo "".$results['product']."<br>";
echo "<a href='" .$results['url'] . "'>". $results['url'] . "</a>";
Here's the full code of the PHP Page #2 processor (from RajdeepPaul's solution below, plus wiping the zip directory):
$files = $_POST['item'];
$timestamp = date("M-d-Y_H:i:s"); //$timestamp takes the current time
$zipname = "zip_".$timestamp.".zip"; // add timestamp to the file name
$zip = new ZipArchive;
$zip->open($zipname, ZipArchive::CREATE);
foreach ($files as $file){
    $downloaded_file = file_get_contents($file);
    $zip->addFromString(basename($file), $downloaded_file);
}
$zip->close();
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename='.$zipname);
header('Content-Length: ' . filesize($zipname));
readfile($zipname);
// removes the zip file from the directory after creation
// (note: these globs delete every .pdf and .zip in the working directory)
array_map('unlink', glob("*.pdf"));
array_map('unlink', glob("*.zip"));
I am open to and extremely grateful for any suggestions!
Change your foreach loop in the following way,
foreach ($files as $file){
    $downloaded_file = file_get_contents($file);
    $zip->addFromString(basename($file), $downloaded_file);
}
So your code on Page #2 should be like this:
$files = $_POST['item'];
$timestamp = date("M-d-Y_H:i:s"); //$timestamp takes the current time
$zipname = "zip_".$timestamp.".zip"; // add timestamp to the file name
$zip = new ZipArchive;
$zip->open($zipname, ZipArchive::CREATE);
foreach ($files as $file){
    $downloaded_file = file_get_contents($file);
    $zip->addFromString(basename($file), $downloaded_file);
}
$zip->close();
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename='.$zipname);
header('Content-Length: ' . filesize($zipname));
readfile("$zipname");

PHP CSV diff script

I am currently writing a simple PHP script/site to save the differences between 2 CSV files in a 3rd one.
Currently my only problem is that I want to ignore a specific column (the customer id), which changes from time to time.
I am writing a newsletter system and I have 2 CSV files with customer data, and I am checking whether the email address has changed since the last CSV file.
How can I ignore the customer id? I can't just delete the column, because I need it later.
Here is my structure:
CSV file 1 (old database):
customerid,sex,name,firstname,zip,email
1,male,smith,will,1234,will.smith#gmail.com
2,male,doe,john,7367,john#doe.com
3,female,doe,anna,7367,anne#doe.com
CSV file 2 (new database):
customerid,sex,name,firstname,zip,email
1,male,smith,will,2224,will.smith#gmail.com
7,male,doe,john,7367,john#gmail.com
20,female,doe,anna,7367,anne#doe.com
As you can see, in the newer file the customer ids for John and Anna Doe changed; this should be ignored.
What shouldn't be ignored is that the email for John Doe and the zip for Will Smith changed. The second "step" works perfectly.
Here is my full code: http://pastebin.com/bt7Pj3MP (about 30 lines). Here are the essential parts:
$file1 = file('2015-07-01.csv', FILE_IGNORE_NEW_LINES);
$file2 = file('2015-07-09.csv', FILE_IGNORE_NEW_LINES);
sort($file1);
sort($file2);
$diff = array_diff($file2, $file1);
array_unshift($diff, $_POST['csv_cols']);
$output = substr(md5(rand()), 0, 5). "-output.csv";
file_put_contents($output, implode(PHP_EOL, $diff));
// $file1_name and $file2_name are set in the full script (not shown in this excerpt)
unlink($file1_name);
unlink($file2_name);
header('Content-Description: Download ' . $output);
header('Content-Type: application/force-download');
header("Content-Type: application/download");
header("Content-Length: " . filesize($output));
header("Content-disposition: attachment; filename=\"" . basename($output) . "\"");
readfile($output);
unlink($output);
exit;
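
One way to ignore the customerid column during the comparison while still keeping it in the output is to index each file by everything except the first column and keep the full line as the value. A sketch, using the same two files as above:
// sketch: key each row by its columns minus customerid, keep the original line as the value
function indexWithoutId(array $lines) {
    $indexed = array();
    foreach ($lines as $line) {
        $cols = str_getcsv($line);
        array_shift($cols);                      // drop customerid for comparison purposes
        $indexed[implode(',', $cols)] = $line;   // full row (with id) kept for the output
    }
    return $indexed;
}
$old = indexWithoutId(file('2015-07-01.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
$new = indexWithoutId(file('2015-07-09.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
// rows whose non-id columns are new or changed since the old file
$diff = array_values(array_diff_key($new, $old));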

Mozilla Firefox not correctly downloading certain file types from MySQL database

I have a MySQL database where I store various file types. If the file extension is the standard three character (.doc, .xls, .pdf), then the content type is stored as application/msword, application/ms-excel, application/pdf, etc. If it's .docx or .xlsx, then the content type is application/vnd.openxmlformats-officedocument.
Until recently this was never an issue, but within the past few weeks it has become a problem in Firefox. Firefox will not download files of type application/vnd.openxmlformats-officedocument with their correct formats. Instead, it downloads the file without an extension and the user has to add it manually. Furthermore, if there are spaces in the filename, Firefox only picks up the first word and that is how the file is saved.
Here is the code I use to upload files:
if($_FILES['Budget']['size'] > 0)
{
$fileName = $_FILES['Budget']['name'];
$tmpName = $_FILES['Budget']['tmp_name'];
$fileSize = $_FILES['Budget']['size'];
$fileType = $_FILES['Budget']['type'];
$fp = fopen($tmpName, 'r');
$content = fread($fp, filesize($tmpName));
fclose($fp);
$fileUp = $con->prepare("INSERT INTO ptfs.upload (ProposalNo, name, size, type, content) VALUES(:proposalno,:name,:size,:type,:content)");
$fileData=array('proposalno'=>$proposalNo,'name'=>$fileName,'size'=>$fileSize,'type'=>$fileType,'content'=>$content);
$fileUp->execute($fileData);
}
And here is the code for presenting the file link to the user:
if(isset($_GET['ProposalNo']) && isset($_GET['UID']))
{
$fileget = $con->prepare("SELECT name, type, size, content FROM upload WHERE ProposalNo = :proposalno AND UID = :uid");
$data = array('proposalno'=>$_GET['ProposalNo'],'uid'=>$_GET['UID']);
$fileget->execute($data);
list($name, $type, $size, $content) = $fileget->fetch(PDO::FETCH_BOTH);
header("Content-Disposition: attachment; filename=$name");
header("Content-type: $type");
header("Content-length: $size");
echo $content;
exit;
}
This works fine in every browser except Firefox, and as I said, it's a recent problem. My users started reporting it within the last couple of weeks. Can I modify either my code or my database to make sure that FF downloads these file types correctly again?
"Furthermore, if there are spaces in the filename, then Firefox only picks up the first word in it and that's how the file is saved."
It's always best to catch the problem right away (before the file is uploaded and entered into DB) and replace spaces with underscores, then let PHP do its thing afterwards.
Consider the following, which is the logic I use for my uploaded files; it will transform:
This is a line
into:
This_is_a_line
<?php
$string = "This is a line";
$arr = explode(" ",$string);
$string = implode("_",$arr);
echo $string;
?>
This is taken from my own experience with the same issue, and it resolved it.
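Applied to the upload code above, that would look something like the line below (str_replace does the same thing as the explode/implode pair in one call):
// sanitize the name before it goes into the database
$fileName = str_replace(" ", "_", $_FILES['Budget']['name']);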
