Laravel does not get the temp path when uploading bigger files - PHP

I need to handle an uploaded file in a Laravel controller.
TL;DR
How do I handle big file uploads through Laravel requests?
When I make the request with a small file, its temp path is normal: /tmp/php3Hijl5
But when I make the request with a large file, it comes back as: /home/p.../public
This can happen because PHP defines a maximum upload size in php.ini, called upload_max_filesize.
Trying to solve this problem, I'm setting a larger upload_max_filesize for the project:
ini_set('upload_max_filesize', '50M');
But this is not reflected in the project, because when I call ini_get('upload_max_filesize');
it still returns '2M'.
When I try to handle the file with fgetcsv(), it says that this temp path is a directory.
I'm using dd() to check the file's temp path.
Controller.php
public function assignToClients(Request $request, $id)
{
    $file = $request->file('csv_joining_clients');
    if (!$file) {
        return response([
            'message' => 'You have to upload a CSV file'
        ], 400);
    }
    $array = $this->parseCsvToString($file);
    dd($array);
}

private function parseCsvToString($file)
{
    $i = 0;
    $importData_arr = [];
    $tempPath = $file->getRealPath();
    dd($tempPath);
    $file = fopen($tempPath, 'r');
    while (($filedata = fgetcsv($file, 10000, ",")) !== false) {
        $num = count($filedata);
        for ($c = 0; $c < $num; $c++) {
            $importData_arr[$i][] = $filedata[$c];
        }
        $i++;
    }
    fclose($file);
    return $importData_arr;
}
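As an aside: upload_max_filesize is a PHP_INI_PERDIR setting, so ini_set() cannot change it at runtime; it has to be raised in php.ini, .htaccess, or the server configuration, which is why ini_get() still reports '2M'. A minimal sketch for surfacing the real upload error before touching the temp path (isValid() and getErrorMessage() come from Symfony's UploadedFile, which $request->file() returns):
$file = $request->file('csv_joining_clients');
// isValid() is false when PHP itself rejected the upload, e.g.
// UPLOAD_ERR_INI_SIZE when the file exceeds upload_max_filesize.
if ($file && !$file->isValid()) {
    return response([
        'message' => $file->getErrorMessage()
    ], 400);
}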

Related

I have 10 files of 500MB to upload, what is the best solution for this issue?

I am using Slim Framework to upload files. I need to upload 10 files of 500MB each in a single request. How can I accomplish this? I am using the following code:
$uploadedFiles = $request->getUploadedFiles();
foreach ($uploadedFiles['aws_file'] as $uploadedFile) {
    if ($uploadedFile->getError() === UPLOAD_ERR_OK) {
        $uploadFileName = $uploadedFile->getClientFilename();
        $fileDetails = pathinfo($uploadedFile->getClientFilename());
        $fileName = explode('_', $fileDetails['filename']);
        if (count($fileName) == 3) {
            $orgIdArray[] = $fileName[1];
        }
    } else {
        $responseObj->status = 'error';
        $responseObj->message = 'Error in file or file is empty';
        $responseObj->errorFileList[] = $uploadedFile->getError();
    }
}
I am getting a memory issue.
Increase your memory_limit PHP setting.
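Several php.ini limits interact for a request like this; an illustrative configuration (values are assumptions, tune them to your server) might be:
; ten 500MB files in one request: post_max_size must cover the
; whole POST body, not just a single file
memory_limit = 1024M
upload_max_filesize = 500M
post_max_size = 5100M
max_execution_time = 300
max_file_uploads = 20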

Big CSV upload - filename could not be empty

Using PHP 5.4.34 and Laravel 4 with Apache 2.2.22 on Ubuntu.
Using the library https://github.com/goodby/csv to parse an uploaded CSV.
Here is my code:
$file = Input::file('file');
//echo $file->getClientOriginalName();
$config = new LexerConfig();
$config
    ->setDelimiter(";")
    ->setToCharset('UTF-8')
;
$lexer = new Lexer($config);
$interpreter = new Interpreter();
$salarie_csv = [];
$errors = [];
$lineNb = 0;
$interpreter->addObserver(function (array $rows) use (&$salarie_csv, &$lineNb, &$errors) {
    //some code
});
$lexer->parse($file, $interpreter);
return Response::json($errors, 200);
When I upload a 1.5MB CSV with 20,000 rows it works.
When I upload a 2.5MB CSV with 38,500 rows it gives me the error:
SplFileObject::__construct(): Filename cannot be empty in Lexer.php line 50.
I tried with the same file (just removed or added some rows for the test).
Is there a way to fix this?
Check your post_max_size and upload_max_filesize in your php.ini config file.
PHP probably does not allow files that big to be uploaded, so it cuts them off from the POST.
var_dump(ini_get('post_max_size'));
Note that post_max_size overrides upload_max_filesize (as explained in the answer here), so you should make sure that both of those settings allow the sizes you'll be uploading.
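As a quick sanity check, here is a sketch (iniToBytes is a hypothetical helper) that converts PHP's shorthand notation to bytes so the two settings can actually be compared:
// Convert shorthand like "2M" or "1G" to bytes.
function iniToBytes($value) {
    $value = trim($value);
    $unit = strtolower(substr($value, -1));
    $bytes = (int) $value;
    switch ($unit) {
        case 'g': $bytes *= 1024; // fall through
        case 'm': $bytes *= 1024; // fall through
        case 'k': $bytes *= 1024;
    }
    return $bytes;
}
$upload = iniToBytes(ini_get('upload_max_filesize'));
$post = iniToBytes(ini_get('post_max_size'));
if ($post < $upload) {
    echo "post_max_size is the effective limit: $post bytes";
}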
Try this:
$type = $_FILES['file']['type'];
$filename = $_FILES["file"]["tmp_name"];
$filename_csv = explode(".", $_FILES["file"]["name"]);
$extension = end($filename_csv);
if ($extension == "csv")
{
    if ($_FILES["file"]["size"] > 0)
    {
        $file = fopen($filename, "r");
        while (($emapData = fgetcsv($file, 10000, ",")) !== FALSE)
        {
            // Run the insert once; mysql_query() returns a result, not SQL,
            // so it must not be passed back into mysql_query() again.
            mysql_query("insert into upload_data(spent,click,filename,date) values('$emapData[0]','$emapData[1]','$emapData[2]','$emapData[3]')") or die(mysql_error());
        }
        fclose($file);
        echo $error1 = ucwords('<div style="margin-left:60px;position: absolute;width: 400px; color: #006400;">CSV File has been successfully Imported</div>');
    }
}
else
{
    echo $error1 = ucwords('<div style="margin-left:60px;position: absolute;width: 400px; color: #CC0000;">Invalid File:Please Upload CSV File</div>');
    // echo 'Invalid File:Please Upload CSV File';
}
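Note that the mysql_* extension is deprecated and was removed in PHP 7. A rough equivalent of the insert loop using PDO prepared statements might look like this (the DSN and credentials are placeholders):
// Same import loop with PDO; a prepared statement also avoids the
// SQL injection risk of interpolating CSV values into the query.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO upload_data (spent, click, filename, date) VALUES (?, ?, ?, ?)');
$file = fopen($_FILES['file']['tmp_name'], 'r');
while (($emapData = fgetcsv($file, 10000, ',')) !== false) {
    $stmt->execute([$emapData[0], $emapData[1], $emapData[2], $emapData[3]]);
}
fclose($file);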

Replace CSV row in PHP, memory error

I have a CSV file that is 2MB in size and pipe-delimited. I would like to take the first row and replace its data, then resave the file. Here is what I did:
//Creating a new first row with the modified data.
$file = fopen($path, "r"); // $path is where the file is located: outputs/my_file.csv
$size = filesize($path);
$firstLine = fgetcsv(fopen($path, "r")); // $firstLine has all the data of the first row as an array
fclose($file);
$firstLine = explode("|", $firstLine[0]); // To get each column of the row
$newHeader = array();
for ($i = 0; $i < sizeof($firstLine); $i++) {
    if ($i == 4) {
        array_push($newHeader, "modified column in row 1"); // Only column 4 in row 1 is modified
    } else {
        array_push($newHeader, $firstLine[$i]);
    }
}
$Header = implode("|", $newHeader);
//Creating the new csv file
$row = 0;
while (($data = fgetcsv(fopen($path, "r"), "|")) !== false) {
    if ($row == 0) {
        $data[0] = $Header;
    }
    $newCsvData[] = $data;
}
return $newCsvData; // I wanted to display the new content of the csv before saving it
This code should print the new content of the CSV file that I will store, but I get an error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 332 bytes). How can I do this in a very fast way? The file is about 19,122 rows.
Thanks
If it's only 2MB, maybe read the entire file into memory and then write out a new file (overwriting the previous one). Here are some helper functions to help you read and write the file; I'm certain you're proficient in editing the resulting array:
/**
 * Reads a file into an array
 *
 * @param $FILE string the file to open
 *
 * @return $lines The lines of the file as an array
 */
public static function readFile($FILE) {
    $lines = array(); // the array to store each line of the file in
    $handle = fopen($FILE, "r");
    if ($handle) {
        // $FILE successfully opened for reading
        while (($line = fgets($handle)) !== false) {
            $lines[] = $line; // add each line of the file to $lines
        }
    } else {
        throw new Exception("error opening the file...");
    }
    fclose($handle); // close the file
    return $lines; // return the lines of the file as an array
}

/**
 * Writes the $lines of a file into $FILE
 *
 * @param $FILE string The file to write
 * @param $lines array An array containing the lines of the file
 *
 * @return $result int|false The number of bytes written, or false on failure. See: php.net/fwrite#refsect1-function.fwrite-returnvalues
 */
public static function writeFile($FILE, $lines) {
    // Add a newline at the end of each line of the array;
    // output is now a single string which we will write in one pass
    // (instead of line-by-line)
    $output = implode("\n", $lines);
    $handle = fopen($FILE, "w+");
    if ($handle) {
        // $FILE successfully opened for writing, write to the file
        $result = fwrite($handle, $output);
    } else {
        throw new Exception("error opening the file...");
    }
    fclose($handle); // close the file
    return $result; // The number of bytes written to the file, or false on failure
}
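Using those helpers, the header replacement could look like this sketch (assuming the two methods sit on a FileUtil class, which is a name made up here):
// fgets() keeps the trailing newline on each line, and writeFile()
// re-joins with "\n", so strip the originals first.
$lines = array_map(function ($line) {
    return rtrim($line, "\r\n");
}, FileUtil::readFile($path));
$cols = explode('|', $lines[0]);
$cols[4] = 'modified column in row 1'; // the same edit as in the question
$lines[0] = implode('|', $cols);
FileUtil::writeFile($path, $lines);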
<?php
$source = fopen('filename','r');
$destination = fopen('newfilename','w');
//write new header to new file
fwrite($destination,"your|replacement|header\n");
//set the pointer in the old file to the second row
fgets($source);
//copy the entire stream
stream_copy_to_stream($source,$destination);
//rename the newfilename to the old filename
rename('newfilename','filename');
//check what memory we used:
var_dump(memory_get_peak_usage());
That resulted in 142260 bytes of memory used at its peak for a 2MB file. BTW: the memory usage for a 2GB file is exactly the same when I test it here.

PHP Stop Reading Remote Files when it is Fully Downloaded

I'm getting a 30-second timeout error because the code keeps checking whether the file is over 5MB when it's under. The code is designed to reject files over 5MB, but I need it to also stop executing when the file is under 5MB. Is there a way to check the file transfer chunk to see if it's empty? I'm currently using this example by DaveRandom:
PHP Stop Remote File Download if it Exceeds 5mb
Code by DaveRandom:
$url = 'http://www.spacetelescope.org/static/archives/images/large/heic0601a.jpg';
$file = '../temp/test.jpg';
$limit = 5 * 1024 * 1024; // 5MB
if (!$rfp = fopen($url, 'r')) {
    // error, could not open remote file
}
if (!$lfp = fopen($file, 'w')) {
    // error, could not open local file
}
// Check the content-length for exceeding the limit
foreach ($http_response_header as $header) {
    if (preg_match('/^\s*content-length\s*:\s*(\d+)\s*$/', $header, $matches)) {
        if ($matches[1] > $limit) {
            // error, file too large
        }
    }
}
$downloaded = 0;
while ($downloaded < $limit) {
    $chunk = fread($rfp, 8192);
    fwrite($lfp, $chunk);
    $downloaded += strlen($chunk);
}
if ($downloaded > $limit) {
    // error, file too large
    unlink($file); // delete local data
} else {
    // success
}
You should check if you have reached the end of the file:
while (!feof($rfp) && $downloaded < $limit) {
    $chunk = fread($rfp, 8192);
    fwrite($lfp, $chunk);
    $downloaded += strlen($chunk);
}
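A slightly more defensive variant of the same loop (a sketch): also stop if fread() returns false or an empty string, for example on a dropped connection:
while (!feof($rfp) && $downloaded < $limit) {
    $chunk = fread($rfp, 8192);
    if ($chunk === false || $chunk === '') {
        break; // nothing more to read
    }
    fwrite($lfp, $chunk);
    $downloaded += strlen($chunk);
}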

Split a large zip file into small chunks using a PHP script

I am using the script below for splitting a large zip file into small chunks.
$filename = "pro.zip";
$targetfolder = '/tmp';
// File size in Mb per piece/split.
// For a 200Mb file if piecesize=10 it will create twenty 10Mb files
$piecesize = 10; // splitted file size in MB
$buffer = 1024;
$piece = 1048576*$piecesize;
$current = 0;
$splitnum = 1;
if(!file_exists($targetfolder)) {
if(mkdir($targetfolder)) {
echo "Created target folder $targetfolder".br();
}
}
if(!$handle = fopen($filename, "rb")) {
die("Unable to open $filename for read! Make sure you edited filesplit.php correctly!".br());
}
$base_filename = basename($filename);
$piece_name = $targetfolder.'/'.$base_filename.'.'.str_pad($splitnum, 3, "0", STR_PAD_LEFT);
if(!$fw = fopen($piece_name,"w")) {
die("Unable to open $piece_name for write. Make sure target folder is writeable.".br());
}
echo "Splitting $base_filename into $piecesize Mb files ".br()."(last piece may be smaller in size)".br();
echo "Writing $piece_name...".br();
while (!feof($handle) and $splitnum < 999) {
if($current < $piece) {
if($content = fread($handle, $buffer)) {
if(fwrite($fw, $content)) {
$current += $buffer;
} else {
die("filesplit.php is unable to write to target folder");
}
}
} else {
fclose($fw);
$current = 0;
$splitnum++;
$piece_name = $targetfolder.'/'.$base_filename.'.'.str_pad($splitnum, 3, "0", STR_PAD_LEFT);
echo "Writing $piece_name...".br();
$fw = fopen($piece_name,"w");
}
}
fclose($fw);
fclose($handle);
echo "Done! ".br();
exit;
function br() {
return (!empty($_SERVER['SERVER_SOFTWARE']))?'<br>':"\n";
}
?>
But this script is not creating the small files in the target temp folder after the split. The script runs successfully without any error.
Please help me find out what the issue is here. Or, if you have any other working script with similar functionality, please share it.
As indicated in the comments above, you can use split to split a file into smaller pieces, and can then use cat to join them back together.
split -b50m filename x
and to put them back
cat xaa xab xac > filename
If you are looking to split the zipfile into a spanning-type archive, so that you do not need to rejoin the pieces yourself, take a look at zipsplit:
zipsplit -n (size) filename
so you can just call zipsplit from your exec script and then most standard unzip utils should be able to put it back together. See man zipsplit for more options, including setting the output path, etc.
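Called from PHP, that might look like the following sketch (the paths and piece size are placeholders; -n is the maximum piece size in bytes and -b sets the output directory):
// Split pro.zip into pieces of at most 10MB each with zipsplit.
$size = 10 * 1024 * 1024;
exec(sprintf('zipsplit -n %d -b %s %s 2>&1',
    $size,
    escapeshellarg('/tmp'),
    escapeshellarg('pro.zip')
), $output, $status);
if ($status !== 0) {
    die('zipsplit failed: ' . implode("\n", $output));
}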
