I am building an application using Laravel 5 where I want to insert several image URLs, as an array, into a single column in the database. The files upload to my AWS S3 bucket successfully, but the problem is now getting the URLs into the database. I tried using Laravel's array_add helper but I get an error stating I am missing argument 2, so I am wondering how to achieve this. My current fallback is to store the images against the post id and use a relationship to connect them.
For reference: the column I intend to place the images in is picgallery, and the insert is done using the $newproperty['picgallery'] variable.
public function store(Request $request)
{
    //establish random generated string for gallery_id
    $rando = str_random(8);

    //input all data to database
    $data = $request->all();
    $newproperty['title'] = $data['title'];
    $newproperty['address'] = $data['address'];
    $newproperty['city'] = $data['city'];
    $newproperty['province'] = $data['province'];
    $newproperty['contact_person'] = Auth::user()->id;
    $newproperty['gallery_id'] = $rando;
    $newproperty['property_description'] = $data['description'];

    if ($request->hasFile('images')) {
        $files = $request->file('images');

        //storage into AWS
        foreach ($files as $file) {
            $uploadedFile = $file;
            $upFileName = time() . '.' . $uploadedFile->getClientOriginalName();
            $filename = strtolower($upFileName);
            $s3 = \Storage::disk('s3');
            $filePath = 'properties/' . $rando . '/' . $filename;
            $s3->put($filePath, file_get_contents($uploadedFile), 'public');
            $propicurl = 'https://s3-ap-southeast-1.amazonaws.com/cebuproperties/' . $filePath;
            $array = array_add(['img' => '$propicurl']);
            $newproperty['picgallery'] = $array;
        }
    }

    Properties::create($newproperty);

    return redirect('/properties');
}
array_add requires 3 parameters
$array = array_add($array, 'key', 'value');
(https://laravel.com/docs/5.1/helpers#method-array-add)
For example
$testArray = array('key1' => 'value1');
$testArray = array_add($testArray, 'key2', 'value2');
and you will get
[
"key1" => "value1",
"key2" => "value2",
]
You might not be able to use array_add in this case.
To fix your issue, change your foreach loop to something like this:
//storage into AWS
// init the new array here
$array = [];
foreach ($files as $file) {
    $uploadedFile = $file;
    $upFileName = time() . '.' . $uploadedFile->getClientOriginalName();
    $filename = strtolower($upFileName);
    $s3 = \Storage::disk('s3');
    $filePath = 'properties/' . $rando . '/' . $filename;
    $s3->put($filePath, file_get_contents($uploadedFile), 'public');
    $propicurl = 'https://s3-ap-southeast-1.amazonaws.com/cebuproperties/' . $filePath;
    // change how to use array_add
    array_push($array, array('img' => $propicurl));
    // remove the below line
    // $array = array_add($array, 'img', $propicurl);
}
// move this assignment out of foreach loop
$newproperty['picgallery'] = $array;
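One more thing to keep in mind: picgallery is a single column, so the PHP array has to be serialized before it can be stored. A minimal sketch under that assumption, either encoding it yourself or letting Eloquent handle it with an array cast (attribute casting is available from Laravel 5.0 onwards):
// Option 1 (sketch): encode the array to JSON yourself before saving
$newproperty['picgallery'] = json_encode($array);
Properties::create($newproperty);

// Option 2 (sketch): declare an array cast on the model so Eloquent
// serializes and deserializes picgallery automatically
class Properties extends \Illuminate\Database\Eloquent\Model
{
    protected $casts = [
        'picgallery' => 'array',
    ];
}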
I was trying to update my user profile with the following controller, but the problem is that if I update only the profile picture it shows the above error. If I update every value it updates successfully. How do I update the user profile without updating every value?
public function updateUser(Request $request)
{
    $this->validate($request, [
        'profile_picture' => 'dimensions:width=400,height=400',
        'cover_picture' => 'dimensions:width=800,height=400',
        'avatar' => 'dimensions:width=80,height=80',
    ]);

    if (\Auth::check())
    {
        $user = User::find(\Auth::id());
    }

    $files = [];
    if ($request->file('profile_picture')) $files[] = $request->file('profile_picture');
    if ($request->file('cover_picture')) $files[] = $request->file('cover_picture');
    if ($request->file('avatar')) $files[] = $request->file('avatar');

    foreach ($files as $file)
    {
        if (!empty($file))
        {
            $filename = time() . str_random(20) . '.' . $file->getClientOriginalExtension();
            $file->move('users/', $filename);
            $filenames[] = $filename;
        }
    }

    $user->profile_picture = $filenames[0];
    $user->cover_picture = $filenames[1];
    $user->avatar = $filenames[2];
    $user->save();

    return Redirect::back()->with('Warning', "Profile Updated Successfully");
}
I don't think it's wise to use a positional array like this. As you've discovered, what happens if someone only wants to update their avatar? I feel your assignment into $files[] is redundant and you could go straight into your processing code.
Basically, with your current implementation $files can be of variable length, so how do you know which entry is 0, 1 or 2?
With my approach, the code loops over each type of picture and assigns it into the user directly with $user->$type, using the matching property name.
foreach( array( 'profile_picture', 'cover_picture', 'avatar' ) as $type)
{
    if( $request->file( $type ) )
    {
        $filename = time() . str_random(20) . '.' . $request->file( $type )->getClientOriginalExtension();
        $request->file( $type )->move( 'users/', $filename );
        $user->$type = $filename;
    }
}
If you find you need to map a different $source to the $type property, you could do that with an associative array of source => type pairs...
foreach( array(
    'profile_picture' => 'profile_picture',
    'cover_picture' => 'cover_picture',
    'avatar' => 'avatar'
) as $source => $type)
{
    if( $request->file( $source ) )
    {
        $filename = time() . str_random(20) . '.' . $request->file( $source )->getClientOriginalExtension();
        $request->file( $source )->move( 'users/', $filename );
        $user->$type = $filename;
    }
}
I finally came up with a solution, mate.
You can try including a var_dump of $filenames. I suspect that $filenames[1] doesn't exist at all.
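A quick diagnostic sketch for that suggestion, placed just before the three assignments in the original controller:
// Debugging only: confirm which indexes actually exist in $filenames
var_dump($filenames);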
I'm trying to save an array of multiple images and values. The values save fine, but when I add images only one image is saved and uploaded.
Here's my controller function
public function store(Request $request) {
    $parentproduct = new Product();
    $parentproduct->id = Input::get('id');
    $parentproduct->save();
    $insertedId = $parentproduct->id;

    $uploadcount = 0;
    $files = Input::file('main_image');
    $file_count = count($files);

    foreach ($files as $i => $file) {
        $multiupload = new ProductsTranslation();
        if ($request->hasFile('main_image')) {
            $destinationPath = 'website/images';
            $filename = $file->getClientOriginalName();
            $upload_success = $file->move($destinationPath, $filename);
            $uploadcount++;
            $multiupload->main_image = $filename;
            $multiupload->id = $request->input('id')[$i];
            $multiupload->title = $request->input('title')[$i];
            $multiupload->language = $request->input('language')[$i];
            $multiupload->product_id = $parentproduct->id;
            $multiupload->save();
        }
    }
}
It's working fine after the final update ...
Try this:
if (Input::hasFile('main_image')) {
    // You have to initialize your array outside your loop
    $insertprod = [];
    foreach (Input::file('main_image') as $file) {
        $destinationPath = 'website/images';
        $filename = $file->getClientOriginalName();
        $upload_success = $file->move($destinationPath, $filename);
        $uploadcount++;
        foreach ($request->input('language') as $i => $language) {
            $insertprod[] = array(
                'id' => $request->input('id')[$i],
                'product_id' => $parentproduct->id,
                'title' => $request->input('title')[$i],
                'language' => $request->input('language')[$i],
                // save the image name/path here
                'main_image' => $filename
            );
        }
    }
    DB::table('products_translations')->insert($insertprod);
}
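If each translation row should keep its own image rather than all rows sharing one filename, a sketch that pairs uploads and inputs by index might look like the following. This assumes the files arrive in the same order as the id/title/language fields, which is not guaranteed by the original markup:
// Sketch only: pair each uploaded file with the form row at the same index
$insertprod = [];
$files = $request->file('main_image');

foreach ($request->input('language') as $i => $language) {
    $filename = null;
    if (isset($files[$i])) {
        $filename = $files[$i]->getClientOriginalName();
        $files[$i]->move('website/images', $filename);
    }

    $insertprod[] = array(
        'id'         => $request->input('id')[$i],
        'product_id' => $parentproduct->id,
        'title'      => $request->input('title')[$i],
        'language'   => $language,
        'main_image' => $filename,
    );
}

DB::table('products_translations')->insert($insertprod);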
I'm using Phalcon 3.0.4. I loop over each file inside my folder; currently I have just 4,000 files. For each file I use findFirst to check whether the filename already exists in MySQL (I have 100,000 rows in my table). But when I use findFirst the response is extremely slow (I have to wait 20 minutes to get a response). Here is my code:
$dir = new FilesystemIterator("files/path/to/my/files/");
foreach ($dir as $file) {
    if ($file->getExtension() == 'json') {
        $filename = $file->getFilename();
        $explode_filename = explode("_", $filename);
        $date = $explode_filename[0];
        $unformatted_date = DateTime::createFromFormat("Ymd-His", $date);
        $date_server = $unformatted_date->format("Y-m-d H:i:s");
        $timestamp_app = $explode_filename[2];
        $date_app = date("Y-m-d H:i:s", $timestamp_app / 1000);
        echo $date_server;

        $json_data = json_decode(file_get_contents($file), true);

        $scan = Scans::findFirst(array(
            "name = :name:",
            "bind" => array("name" => $filename)
        ));

        if (!$scan) {
            ...
        }
    }
}
I tried building the query with the PHQL query builder, but I get the same result:
$scan = $this->modelsManager->createBuilder()
    ->from("Scans")
    ->where("name = :name:", ["name" => $filename])
    ->limit(1)
    ->getQuery()
    ->execute();
If I remove the findFirst/query-builder call the script runs in ~30 ms, but with findFirst it takes ~20 minutes... How can I improve the performance of the lookup in my table?
By changing your code to a better-performing one:
$dir = new FilesystemIterator("files/path/to/my/files/");
$fileNames = [];
foreach ($dir as $file) {
    if ($file->getExtension() == 'json') {
        $filename = $file->getFilename();
        $explode_filename = explode("_", $filename);
        $date = $explode_filename[0];
        $unformatted_date = DateTime::createFromFormat("Ymd-His", $date);
        $date_server = $unformatted_date->format("Y-m-d H:i:s");
        $timestamp_app = $explode_filename[2];
        $date_app = date("Y-m-d H:i:s", $timestamp_app / 1000);
        echo $date_server;
        $json_data = json_decode(file_get_contents($file), true);

        // save the above data to some arrays
        $fileNames[] = $filename;
    }
}

$scans = Scans::find([
    // fetch only the columns you need, otherwise you get fully hydrated models
    'columns' => 'name',
    'conditions' => 'name IN ({fileNames:array})',
    'group' => 'name',
    'bind' => [
        'fileNames' => $fileNames
    ]
]);

foreach ($fileNames as $fileName) {
    $filteredScans = $scans->filter(function ($scan) use ($fileName) {
        return $scan->name == $fileName;
    });

    if (!$filteredScans) {
        // do here whatever
    }
}
This solution can be memory-heavy, though; in that case you could add some pagination, e.g. a loop that processes 100-10,000 rows at a time depending on how much RAM you have.
Create an index on Scans.name.
Use GROUP BY Scans.name (if the names are not unique).
Select only the columns you actually need.
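To combine the batching idea with the IN query above, a rough sketch (the chunk size of 1000 is arbitrary; adjust it to your RAM):
// Sketch: check existing names in batches instead of one huge IN clause
foreach (array_chunk($fileNames, 1000) as $chunk) {
    $scans = Scans::find([
        'columns'    => 'name',
        'conditions' => 'name IN ({names:array})',
        'bind'       => ['names' => $chunk],
    ]);

    $existing = [];
    foreach ($scans as $scan) {
        $existing[$scan->name] = true;
    }

    foreach ($chunk as $fileName) {
        if (!isset($existing[$fileName])) {
            // this file has no row yet, insert it here
        }
    }
}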
I'm using AJAX and Laravel 4, and I upload files that come in as arrays (multiple files). But I'm not sure how to update the session array with the new array. I've tried array_merge but the value gets overwritten.
My method looks like that:
public function storeFiles() {
    $name = Input::get('name');
    $input = Input::all();
    $current_time = time();
    $trim_name = trim($name, '[]');
    $corporate_docs = array();

    foreach ($input[$trim_name] as $corporate) {
        $file_name = $current_time . '_' . $corporate->getClientOriginalName();
        $corporate->move(APPLICATIONS_DIR, $file_name);
        $corporate_docs[] = $file_name;
    }

    $session = Session::get($name);
    if ($session) {
        $docs = array_push($session, $corporate_docs);
        Session::put($name, $docs);
    } else {
        Session::put($name, $corporate_docs);
    }
}
How should I merge the existing session array with the newly created one?
Update: my code now is:
$trim_name = trim($name, '[]');
$corporate_docs = array();
foreach ($input[$trim_name] as $corporate) {
    if ($corporate) {
        $file_name = $current_time . '_' . $corporate->getClientOriginalName();
        $corporate->move(APPLICATIONS_DIR, $file_name);
        $corporate_docs[] = $file_name;
    }
}

$session = Session::get($name);
if (Session::has($name)) {
    $docs = array_merge($session, $corporate_docs); //dd($docs);
    Session::forget($name);
    Session::put($name, $docs); dd(Session::get($name));
} else {
    Session::put($name, $corporate_docs);
}
But it is overwritten again. When I add the first image it is stored in the session, and when I add another it is merged with the first array. But when I add a third image, the newly created session array contains only the first image and the last image; the second image is overwritten.
Try this:
if (session()->has($name)) {
    session()->forget($name);
}
session()->put($name, $corporate_docs);
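If the intention is to keep the previously stored names and only append the new ones, a minimal sketch (same $name and $corporate_docs as above, using the Session facade from the question) would be:
// Append the new file names to whatever is already in the session
$existing = Session::get($name, []);
Session::put($name, array_merge($existing, $corporate_docs));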
I am trying to represent the whole array returned from an Amazon S3 bucket as a tree structure one can browse.
An example of the array is the following:
$files[0] = 'container/798/';
$files[1] = 'container/798/logo.png';
$files[2] = 'container/798/test folder/';
$files[3] = 'container/798/test folder/another folder/';
$files[4] = 'container/798/test folder/another folder/again test/';
$files[5] = 'container/798/test folder/another folder/test me/';
$files[6] = 'container/798/test two/';
$files[7] = 'container/798/test two/logo2.png';
And this is what I am trying to achieve:
http://i.stack.imgur.com/HBjvE.png
So far I have only managed to tell files and folders apart, but not to nest them at different levels with a parent-child relation. The above-mentioned array resides in $keys['files']. The code is as follows:
$keys = json_decode($result, true);
$folders = array();
$files = array();
$i = 0;
foreach ($keys['files'] as $key) {
    if (endsWith($key, "/")) {
        $exploded = explode('container/' . $_SESSION['id_user'] . '/', $key);
        if (!empty($exploded[1]))
            $folders[$i]['name'] = substr($exploded[1], 0, -1);
    }
    else {
        $exploded = explode('container/' . $_SESSION['id_user'] . '/', $key);
        $files[$i]['name'] = $exploded[1];
        $files[$i]['size'] = "";
        $files[$i]['date'] = "";
        $files[$i]['preview_icon'] = "";
        $files[$i]['dimensions'] = "";
        $files[$i]['url'] = "";
    }
    $i++;
}
This code is just to show that I am trying, but it is not complete or accurate. I don't know how to approach logic that can give me the hierarchy I am showing in the picture. Any help would be greatly appreciated.
I don't know if this is the 'correct' way to do this, but if you want to make a recursive structure, then the easy way is to use a recursive function:
$root = array('name' => '/', 'children' => array(), 'href' => '');

function store_file($filename, &$parent) {
    if (empty($filename)) return;
    $matches = array();
    if (preg_match('|^([^/]+)/(.*)$|', $filename, $matches)) {
        $nextdir = $matches[1];
        if (!isset($parent['children'][$nextdir])) {
            $parent['children'][$nextdir] = array('name' => $nextdir,
                                                  'children' => array(),
                                                  'href' => $parent['href'] . '/' . $nextdir);
        }
        store_file($matches[2], $parent['children'][$nextdir]);
    } else {
        $parent['children'][$filename] = array('name' => $filename,
                                               'size' => '...',
                                               'href' => $parent['href'] . '/' . $filename);
    }
}

foreach ($files as $file) {
    store_file($file, $root);
}
Now, every element of $root['children'] is an associative array that has either information about a file or its own children array.
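To check the result, you can walk the structure with another recursive function, for example printing an indented listing (a rendering sketch only; adapt it to emit nested HTML lists if you need the browsable view):
// Sketch: print the tree as an indented list
function print_tree($node, $depth = 0) {
    echo str_repeat('    ', $depth) . $node['name'] . "\n";
    if (!empty($node['children'])) {
        foreach ($node['children'] as $child) {
            print_tree($child, $depth + 1);
        }
    }
}

print_tree($root);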