I have a small problem. I have a script that allows people to upload files through a multiple-select input. Those input fields are submitted via XMLHttpRequest. I tried to select 98 pictures totalling about 900 MB. They were uploaded and the ZIP script finished without any error, but when I want to download the file it is only 200 MB and only 20 pictures are in the ZIP. I increased the maximum execution time on the server, but the script seems to run for only 24 seconds. I increased the PHP memory limit to 2 GB, and the server has enough RAM as well. The maximum file size is about 2 GB, and so is the maximum upload size.
Here is the script:
$zip = new ZipArchive();
$res = $zip->open(__DIR__ . "/../files/" . $filename, ZipArchive::CREATE);
// ZipArchive::open() returns true on success or an int error code on
// failure, so compare strictly against true
if ($res === true) {
    for ($i = 0; $i < count($_FILES['datei']['name']); $i++) {
        move_uploaded_file($_FILES['datei']['tmp_name'][$i], __DIR__ . '/../temp/' . $_FILES['datei']['name'][$i]);
        if (file_exists(__DIR__ . '/../temp/' . $_FILES['datei']['name'][$i]) && is_readable(__DIR__ . '/../temp/' . $_FILES['datei']['name'][$i])) {
            $zip->addFile(__DIR__ . '/../temp/' . $_FILES['datei']['name'][$i], $_FILES['datei']['name'][$i]);
        } else {
            $status['uploaded_file'] = 500;
        }
    }
    $res_close = $zip->close();
    if ($res_close) {
        $status['uploaded_file'] = 200;
    }
    for ($i = 0; $i < count($_FILES['datei']['name']); $i++) {
        unlink(__DIR__ . '/../temp/' . $_FILES['datei']['name'][$i]);
    }
} else {
    // Set the status before die(), otherwise the assignment is unreachable
    $status['uploaded_file'] = 500;
    die($res);
}
The script basically moves all the uploaded temp files to another temp folder. From that temp folder they are zipped into a file in the files folder. Afterwards, the files in the temp folder are deleted.
Is there anything stupid I am doing wrong? Or is there another limitation I didn't see?
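For anyone checking: besides memory and execution time, there are several php.ini directives that can silently cap a multi-file upload. Notably, max_file_uploads limits the number of files accepted in a single request and defaults to 20, which would match exactly 20 pictures making it into the ZIP. A quick diagnostic sketch to dump the relevant limits:
// Dump every php.ini directive that can silently cap a multi-file upload
foreach ([
    'max_file_uploads',    // max number of files per request (default 20)
    'upload_max_filesize', // per-file size limit
    'post_max_size',       // total POST body limit
    'memory_limit',
    'max_execution_time',
    'max_input_time',      // time allowed for parsing the request input
] as $directive) {
    echo $directive . ' = ' . ini_get($directive) . PHP_EOL;
}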
Thanks for any help.
I've written a script to get all image records from a database, use each image's tempname to find the image on disk, copy it to a new folder, and create a tar file out of these files. To do so, I'm using PHP's PharData. The problem is that the images are TIFF files, and pretty large ones at that (the entire folder of 2000-ish images is 95 GB in size).
Initially I created one archive, looped through all database records to find the specific file, and used PharData::addFile() to add each file to the archive individually, but this eventually led to adding a single file taking 15+ seconds.
I've now switched to using PharData::buildFromDirectory in batches, which is significantly faster, but the time to create batches increases with each batch. The first batch got done in 28 seconds, the second in 110, and the third one didn't even finish. Code:
$imageLocation = '/path/to/imagefolder';
$copyLocation = '/path/to/backupfolder';
$images = [images];

$archive = new PharData($copyLocation . '/images.tar');

// Set timeout to an hour (adding 2000 files to a tar archive takes time apparently), should be enough
set_time_limit(3600);

$time = microtime(true);
$inCurrentBatch = 0;
$perBatch = 100; // Amount of files to be included in one archive file
$archiveNumber = 1;

foreach ($images as $image) {
    $path = $imageLocation . '/' . $image->getTempname();

    // If the file exists, copy to folder with proper file name
    if (file_exists($path)) {
        $copyName = $image->getFilename() . '.tiff';
        $copyPath = $copyLocation . '/' . $copyName;
        copy($path, $copyPath);
        $inCurrentBatch++;

        // If the current batch reached the limit, add all files to the archive and remove .tiff files
        if ($inCurrentBatch === $perBatch) {
            $archive = new PharData($copyLocation . "/images_{$archiveNumber}.tar");
            $archive->buildFromDirectory($copyLocation);
            array_map('unlink', glob("{$copyLocation}/*.tiff"));
            $inCurrentBatch = 0;
            $archiveNumber++;
        }
    }
}

// Archive any leftover files in a last archive
if (glob("{$copyLocation}/*.tiff")) {
    $archive = new PharData($copyLocation . "/images_{$archiveNumber}.tar");
    $archive->buildFromDirectory($copyLocation);
    array_map('unlink', glob("{$copyLocation}/*.tiff"));
}

$taken = microtime(true) - $time;
echo "Done in {$taken} seconds\n";
exit(0);
The copied images get removed between batches to save on disk space.
We're fine with the entire script taking a while, but I don't understand why the time to create an archive increases so much between batches.
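A possible cause worth ruling out (a guess, not a confirmed diagnosis): the batch archives are written into $copyLocation itself, and buildFromDirectory() without a filter packs everything it finds in the directory, including the tar files from earlier batches, so each batch would re-pack all previous archives and grow accordingly. A minimal sketch using the optional regex parameter of buildFromDirectory() to restrict the build to the .tiff copies:
// Only pack the .tiff copies; earlier images_*.tar files sitting in the
// same folder are skipped, so each batch stays the same size
$archive = new PharData($copyLocation . "/images_{$archiveNumber}.tar");
$archive->buildFromDirectory($copyLocation, '/\.tiff$/');
array_map('unlink', glob("{$copyLocation}/*.tiff"));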
I am developing a text collection engine that uses fwrite() to write text, but I want to put a file size cap of 1.5 MB on the writing process: if the file grows larger than 1.5 MB, it should start writing a new file from where it left off, and so on, until the contents of the source file have been written into multiple files. I have searched Google, but many of the tutorials and examples are too complex for me, as I am a novice programmer. The code below is inside a for loop which fetches the text ($RemoveTwo). It does not work as I need. Any help would be appreciated.
switch ($FileSizeCounter) {
    case ($FileSizeCounter > 1500000):
        $myFile2 = 'C:\TextCollector/' . 'FilenameA' . '.txt';
        $fh2 = fopen($myFile2, 'a') or die("can't open file");
        fwrite($fh2, $RemoveTwo);
        fclose($fh2);
        break;
    case ($FileSizeCounter > 3000000):
        $myFile3 = 'C:\TextCollector/' . 'FilenameB' . '.txt';
        $fh3 = fopen($myFile3, 'a') or die("can't open file");
        fwrite($fh3, $RemoveTwo);
        fclose($fh3);
        break;
    default:
        echo "continue and continue until it stops by the user";
}
Try doing something like this. You need to read from the source, then write piece by piece, all the while checking for end of file on the source. When the buffer reaches the max size, close the current file and open a new one with an auto-incremented number in the name:
/*
** @param $filename [string] This is the source file
** @param $toFile   [string] This is the base name for the destination file & path
** @param $chunk    [num]    This is the max file size in MB, so 1.5 is 1.5 MB
*/
function breakDownFile($filename, $toFile, $chunk = 1)
{
    // Convert the MB value into bytes (1 MB = 1024 * 1024 bytes)
    $max = (int) ($chunk * 1024 * 1024);
    // Start value for naming the files incrementally
    $i = 1;
    // Open the source file for reading
    $r = fopen($filename, 'r');
    // Create a new file for writing, using the increment value
    $w = fopen($toFile . $i . '.txt', 'w');
    // Loop until the end of the source file is reached
    while (!feof($r)) {
        // Read from the source, but only up to the max file size
        $buffer = fread($r, $max);
        // Write the chunk to the current destination file
        fwrite($w, $buffer);
        // Check the byte size of the buffer to see if it's
        // the same as or larger than the limit
        if (strlen($buffer) >= $max) {
            // Close the current file
            fclose($w);
            // Add 1 to our $i counter
            $i++;
            // Start a new file with the new name
            $w = fopen($toFile . $i . '.txt', 'w');
        }
    }
    // When done the loop, close the writable file
    fclose($w);
    // And close the readable one
    fclose($r);
}
To use:
breakDownFile(__DIR__.'/test.txt',__DIR__.'/tofile',1.5);
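Assuming a test.txt larger than 1.5 MB sits next to the script, a quick way to verify the cap afterwards (just a sanity check, nothing more):
// List the generated parts and confirm none exceeds the 1.5 MB cap
foreach (glob(__DIR__ . '/tofile*.txt') as $part) {
    echo basename($part) . ': ' . filesize($part) . " bytes\n";
}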
I am new here and need a bit of help. I have a PHP script which pulls data out of a database and creates .csv files. I need to add some logic to the script which can compare two files and then rename the file if the file size is equal to or greater than a specific (TBD) size.
Basically, this script runs twice an hour, and I would only like the .csv files rewritten if the file size is large enough. This is all in hopes that it will prevent .csv files being created which are incomplete or too small.
This is the bit of the code which creates the .csv documents. Any help would be much appreciated.
$course_csv = fopen('/Course.csv', 'w');
$courses_u = array_unique($courses, SORT_REGULAR);
foreach ($courses_u as $course) {
    fputcsv($course_csv, $course, '|');
}
fclose($course_csv);

$data = file('/Course.csv');
$handle = fopen("/Course.csv", "w");
foreach ($data as $line) {
    $line = str_replace(array("\r\n", ',', '"'), "", $line);
    fwrite($handle, "{$line}");
    $maxfilesize = 2048;
    $myfilesize = filesize('/Course.csv');
    if ($myfilesize > $maxfilesize) {
        rename('/Course.csv', '/CourseToBig.csv');
    }
}
fclose($handle);
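One way to get the "only replace when big enough" behaviour (a minimal sketch; the threshold and the temp path are placeholders to adapt): write the new CSV to a temporary file first, and only rename it over the live Course.csv when it reaches the minimum size, so an incomplete export never clobbers the previous good one:
$minFileSize = 2048;          // TBD threshold in bytes
$tmpFile = '/Course.csv.tmp'; // hypothetical temp path next to the real file

// Write the fresh export to the temp file
$course_csv = fopen($tmpFile, 'w');
foreach (array_unique($courses, SORT_REGULAR) as $course) {
    fputcsv($course_csv, $course, '|');
}
fclose($course_csv);

clearstatcache(); // filesize() results are cached, so refresh first
if (filesize($tmpFile) >= $minFileSize) {
    rename($tmpFile, '/Course.csv'); // big enough: replace the old file
} else {
    unlink($tmpFile); // too small: keep the previous Course.csv
}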
Hi, I am using the code below to check the size of a remote image. It works, but it takes a lot of time to check the size of the image. Is there a better way to do this?
<?php
$url = 'http://testfile.com/test/sddkssk.jpg';
$head = array_change_key_case(get_headers($url, TRUE));
$filesize = $head['content-length'];
if ($filesize >= 131000) {
    echo 'good image';
}
But it takes 2-3 minutes each time. Is there a better way that can do the same work much faster?
$size = getimagesize("http://www.example.com/gifs/logo.gif");
// if the file name has space in it, encode it properly
$size = getimagesize("http://www.example.com/gifs/lo%20go.gif");
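Note that getimagesize() on a URL downloads the image itself, so for a pure size check a HEAD request that only fetches the response headers is usually faster. A minimal sketch using cURL (the URL is the example one from the question):
// Fetch only the response headers with a HEAD request instead of
// downloading the whole image
$ch = curl_init('http://testfile.com/test/sddkssk.jpg');
curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: no body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // fail fast instead of hanging
curl_exec($ch);

// Content-Length as reported by the server (-1.0 if not reported)
$filesize = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
curl_close($ch);

if ($filesize >= 131000) {
    echo 'good image';
}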
Since a few days ago, the captcha image doesn't show up anymore.
When I try to reach captcha.php, the file gives me an error:
Fatal error: Class 't3lib_div' not found in /typo3conf/localconf.php on line 10
When I look at the localconf.php file, the first 20 lines look like this:
<?php
$TYPO3_CONF_VARS['SYS']['sitename'] = 'New TYPO3 site';
// Default password is "joh316" :
$TYPO3_CONF_VARS['BE']['installToolPassword'] = 'bacb98acf97e0b6112b1d1b650b84971';
$TYPO3_CONF_VARS['EXT']['extList'] = 'tsconfig_help,context_help,extra_page_cm_options,impexp,sys_note,tstemplate,tstemplate_ceditor,tstemplate_info,tstemplate_objbrowser,tstemplate_analyzer,func_wizards,wizard_crpages,wizard_sortpages,lowlevel,install,belog,beuser,aboutmodules,setup,taskcenter,info_pagetsconfig,viewpage,rtehtmlarea,css_styled_content,t3skin';
$typo_db_extTableDef_script = 'extTables.php';
// MAX FILE SIZE
$TYPO3_CONF_VARS['BE']['maxFileSize'] = '100000';
t3lib_div::loadTCA('tt_content');
// This changes the upload limit for image elements
$TCA['tt_content']['columns']['image']['config']['max_size'] = 100000;
// This changes the upload limit for media elements
$TCA['tt_content']['columns']['media']['config']['max_size'] = 100000;
// This changes the upload limit for multimedia elements
$TCA['tt_content']['columns']['multimedia']['config']['max_size'] = 100000;
Does anybody have an idea why I get this error?
Ideally, the settings below would live in the ext_tables.php file of some extension.
But if you don't have a dedicated extension for that, you can put them into the typo3conf/extTables.php file. localconf.php is read very early in the bootstrap, before the TYPO3 core classes are loaded, while extTables.php is included later, when t3lib_div is available. After moving the lines there, the error about t3lib_div not being found should be gone.
// MAX FILE SIZE
$TYPO3_CONF_VARS['BE']['maxFileSize'] = '100000';
t3lib_div::loadTCA('tt_content');
// This changes the upload limit for image elements
$TCA['tt_content']['columns']['image']['config']['max_size'] = 100000;
// This changes the upload limit for media elements
$TCA['tt_content']['columns']['media']['config']['max_size'] = 100000;
// This changes the upload limit for multimedia elements
$TCA['tt_content']['columns']['multimedia']['config']['max_size'] = 100000;