I can't download a file from FTP. I want to download a JavaScript file (3.js) from the root folder of the FTP server:
$this->load->library('ftp');
$config['hostname'] = 'xxx.xx.x.x';
$config['port'] = 'xxx';
$config['username'] = 'ftpuser';
$config['password'] = '123456';
$config['debug'] = TRUE;
$this->ftp->connect($config);
$this->ftp->download("/3.js", "http://test.net/public_html/js/3.js");
But I get this error:
FTP download Unable to download the specified file. Please check your path
So I tried this method instead:
$files = $this->ftp->list_files('');
foreach($files as $file)
{
$this->ftp->download($file, "http://test.net/public_html/js/".basename($file));
}
But the same error happens. Please help me!
$this->ftp->download("/3.js", "http://test.net/public_html/js/3.js");
should probably be
$this->ftp->download("/home/remoteacct/3.js", "/home/localacct/public_html/js/3.js", "ascii");
You're asking the FTP server to send you the file '/3.js', which isn't a specific enough path, and you're also asking CodeIgniter to store the file via HTTP instead of via the filesystem. Add the full server path to the file you are downloading, and change the web path to a server path.
Change 'remoteacct' and 'localacct' to match your specific directory information.
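The core mistake is passing an HTTP URL where the FTP library expects a local filesystem path. As a rough sanity check before calling download(), you can reject URL-style destinations; this is a sketch, and is_local_path() is a made-up helper, not part of CodeIgniter:

```php
// Hypothetical helper (not part of CodeIgniter): the destination for
// $this->ftp->download() must be a local filesystem path, not a URL.
function is_local_path($path)
{
    // URLs like "http://..." carry a scheme; plain Unix paths do not.
    return parse_url($path, PHP_URL_SCHEME) === null;
}
```

With this, '/home/localacct/public_html/js/3.js' passes, while 'http://test.net/public_html/js/3.js' is flagged as invalid.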
I need to implement a download in CodeIgniter. I tried both the force_download function and the FTP download function.
The FTP download function works, but it doesn't let me choose the folder for the download. With force_download, the browser downloads an empty file.
$this->load->helper('download');
$path = file_get_contents(base_url()."modulos/".$filename); // read the file contents
$name = "sample_file.pdf"; // new name for the downloaded file
//
force_download($name, $path); // start download
// or
$this->ftp->download($path, '/local/path/to/'.$name);
$this->ftp->download downloads a file from an FTP server to a web server.
force_download downloads a file from the web server to the client.
To get a file from the FTP server all the way to the client, you have to chain both functions:
$temp_path = tempnam(sys_get_temp_dir(), $name);
$this->ftp->download($path, $temp_path);
$data = file_get_contents($temp_path);
force_download($name, $data);
unlink($temp_path);
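The temp-file round trip above can be exercised without a live FTP server by stubbing out the download step. A minimal sketch, where writing 'fake pdf bytes' stands in for the real FTP download:

```php
$name = 'sample_file.pdf';
$temp_path = tempnam(sys_get_temp_dir(), $name); // unique temp file
// $this->ftp->download($path, $temp_path);      // would write the real file here
file_put_contents($temp_path, 'fake pdf bytes'); // stand-in for the FTP step
$data = file_get_contents($temp_path);           // what force_download() would send
unlink($temp_path);                              // always remove the temp file
```

The same sequence applies unchanged once the stub line is swapped back for the real $this->ftp->download() call.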
This test is on localhost (Windows/PHP/Apache/MySQL).
For example:
In the root of the public folder:
$path = '/upload/images/accounts/12005/thumbnail_profile_eBCeBawXWP.jpg';
I want to use it:
$img->save($path);
I get this error:
"Can't write image data to path
(/upload/images/accounts/12005/thumbnail_profile_eBCeBawXWP.jpg)"
I tried to fix it like this:
$path = public_path($path);
$img->save($path);
So I get this error:
"Can't write image data to path
(D:\Server\data\htdocs\laravel\jordankala.com\public_html\ /upload/images/accounts/12005/thumbnail_profile_pH657T62fl.jpg)
On a real (Linux) server I probably won't have this problem and the code works.
Now, how can I handle this error? (I want it to work both on a Windows localhost and on a real Linux server.)
You can use this to save the file at your path:
$file = $request->file('image_field_name');
$destinationPath = 'upload/images/accounts/12005/';
$uploadedFile = $file->move($destinationPath,$file->getClientOriginalName()); //move the file to given destination path
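The doubled separator in the error message ("public_html\ /upload/...") comes from concatenating public_path() with a path that already starts with a slash. A small joining helper normalizes this on both Windows and Linux; this is a sketch, and join_public() is a made-up name:

```php
// Sketch: join a base directory and a relative path without doubling
// separators, regardless of platform. join_public() is a made-up helper.
function join_public($base, $relative)
{
    return rtrim($base, '/\\') . DIRECTORY_SEPARATOR . ltrim($relative, '/\\');
}

// Usage in the question's code would then be something like:
// $img->save(join_public(public_path(), $path));
```

Because the leading slash is stripped from the relative part, the same call works whether $path is written as '/upload/...' or 'upload/...'.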
I've finally gotten around to testing external file storage in my project, and I'm encountering a strange error when I try to analyze some of the files.
What I'm trying to achieve: grab a list of all the files in a certain S3 directory (done) and analyze their ID3 tags using a PHP package:
https://packagist.org/packages/james-heinrich/getid3
$files = Storage::disk('s3')->files('going/down/to/the/bargin/basement/because/the/bargin/basement/is/cool'); //Get Files
$file = Storage::disk('s3')->url($files[0]); // First things first... let's grab the first one.
$getid3 = new getID3; // NEW OBJECT!
return $getid3->analyze($file); // analyze the file!
However when I throw that into tinker it squawks back at me with:
"GETID3_VERSION" => "1.9.14-201703261440",
"error" => [
"Could not open "https://a.us-east-2.amazonaws.com/library/pending/admin/01%20-%20Cathedrals.mp3" (!is_readable; !is_file; !file_exists)",
],
Which seems to indicate that the file is not readable? This is my first time using AWS S3, so there may be something I haven't configured correctly.
getID3 doesn't support remote files.
You will need to pull the file from S3 to local storage, then pass the local path to getID3's analyze method.
# $files[0] is the path to the file in the bucket.
$firstFilePath = $files[0];
# copy from the s3 disk to the local disk (put() takes a path
# relative to the disk root, not an absolute filesystem path)
Storage::disk('local')->put(
    $firstFilePath,
    Storage::disk('s3')->get($firstFilePath)
);
# the local disk root is storage/app, so build the absolute path from there
$getid3->analyze(storage_path('app/' . $firstFilePath));
The problem is that you are passing a URL to the analyze method. This is mentioned in the getID3 documentation:
To analyze remote files over HTTP or FTP you need to copy the file locally first before running getID3()
Ideally, you will save the file from your URL locally and then pass the local path to getID3->analyze():
// save your file from URL ($file)
// I assume $filePath is the local path to the file
$getID3 = new getID3;
return $getID3->analyze($filePath); // $filePath should be local file path and not a remote URL
To save an s3 file locally
$contents = Storage::disk('s3')->get('file.jpg');
$tmpfname = tempnam("/tmp", "FOO");
file_put_contents($tmpfname, $contents);
$getID3 = new getID3;
// now use $tmpfname for getID3
$getID3->analyze($tmpfname);
// delete the temporary file when done
unlink($tmpfname);
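The same pattern can be wrapped so the temporary file is removed even if the analysis throws. A sketch, where with_temp_file() is a made-up helper and the callback stands in for $getID3->analyze():

```php
// Sketch: run a callback against a temporary local copy of some contents,
// deleting the temp file afterwards even if the callback throws.
function with_temp_file($contents, callable $fn)
{
    $tmp = tempnam(sys_get_temp_dir(), 'id3');
    file_put_contents($tmp, $contents);
    try {
        return $fn($tmp);
    } finally {
        unlink($tmp);
    }
}

// Usage: with_temp_file(Storage::disk('s3')->get('file.mp3'),
//                       function ($path) use ($getID3) { return $getID3->analyze($path); });
```

The try/finally guarantees cleanup, so failed analyses don't leave orphaned files in the temp directory.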
I am trying to run a script which creates pages and saves them to the server, but I am getting a permission error on one of the files, which is in the public_html directory.
Two pages are created in the "pages" directory, which is chmodded to 0777, and they are created fine.
The third page is created in the "public_html" directory, and that fails with "you do not have permission".
The only way I have found to fix this is to chmod the "public_html" directory to 0770, after which everything works, but the hosting company has strongly advised me not to do this because of the security risk.
So my question is: is there any other way to achieve this goal?
Looking into it a bit, it seems giving the script "user" privileges might work, but this is beyond my knowledge at the moment.
I'm not even sure what the script is running as; I would guess "group", since chmodding public_html to 0770 grants "group" access.
My setup is: a VPS running CentOS 6.7 x86_64,
PHP 5 as DSO, Apache suEXEC on.
The simplified code I am using is:
$page_path = "/home/username/public_html/";
$loop['Html_Name'] = "test.html";
$new_page_file = "test.html";
$fp = fopen($page_path . $loop['Html_Name'], "w");
fwrite($fp, $new_page_file);
fclose($fp);
chmod($page_path . $loop['Html_Name'], 0666);
Many thanks in advance.
Typically, FTP is used in these situations. The public_html permissions can stay at 0750; then run this code:
$server = 'localhost';
$ftp_user_name = 'username';
$ftp_user_pass = 'passw';
$dest = 'public_html/new.file';
$source = '/home/username/public_html/path/to/existing.file';
$connection = ftp_connect($server);
if (!$connection) { die('FTP not connected.'); }
$login = ftp_login($connection, $ftp_user_name, $ftp_user_pass);
if (!$login) { die('FTP login failed.'); }
$copied = ftp_put($connection, $dest, $source, FTP_BINARY);
if ($copied) {
echo 'File copied';
} else {
echo 'Copy failed!';
}
ftp_close($connection);
The page whose final destination is public_html can be created in the other directory, and this script will then copy it into public_html. The old file will remain, and if a file with the same destination name already exists, it will be overwritten.
$dest is a path relative to the user's home directory; $source is an absolute path.
The connection will fail if the FTP account is concurrently in use by FileZilla or similar; a workaround is to create a second FTP user account in cPanel.
I've been trying to build a system that can upload large files. Originally I used HTTP, but that had a number of problems with settings that needed to be changed, so I thought I'd give FTP a go.
I now have an FTP connection in PHP and it works fine: I can view folders and files as well as make directories. What I can't seem to figure out, though, is how to get hold of a local file and upload it.
I have been reading lots of information and tutorials, such as the PHP manual and a tutorial I found on Nettuts, but I'm struggling. The tutorial says you can upload a local file, but I must be missing something.
Here is the upload method I'm using:
public function uploadFile($fileFrom, $fileTo)
{
    // *** Set the transfer mode
    $asciiArray = array('txt', 'csv');
    // pathinfo() avoids the "only variables should be passed by
    // reference" notice that end(explode(...)) raises
    $extension = pathinfo($fileFrom, PATHINFO_EXTENSION);
    if (in_array($extension, $asciiArray))
        $mode = FTP_ASCII;
    else
        $mode = FTP_BINARY;

    // *** Upload the file
    $upload = ftp_put($this->connectionId, $fileTo, $fileFrom, $mode);

    // *** Check upload status
    if (!$upload) {
        $this->logMessage('FTP upload has failed!');
        return false;
    } else {
        $this->logMessage('Uploaded "' . $fileFrom . '" as "' . $fileTo . '"');
        return true;
    }
}
When trying to upload a file I use this:
$fileFrom = 'c:\test_pic.jpg';
$fileTo = $dir . '/test_pic.jpg';
$ftpObj -> uploadFile($fileFrom, $fileTo);
I thought this would take the file stored on my machine in C:\ and upload it to the destination, but it fails (I don't know why). So I changed it a little: I set $fileFrom to 'test_pic.jpg' and put the picture in the same folder on the remote server. When I ran the code, the script copied the file from one location on the server to the other.
So how would I go about getting the file from my local machine sent up to the server?
Thanks in advance.
With this code you would upload a file from your PHP server to your FTP server, which doesn't actually seem to be your goal.
Create an upload form which submits to this PHP file, store the uploaded file temporarily on your server, and then upload it to your FTP server from there.
If your attempt had actually worked, it would be a major security issue, because a PHP script would then have access to arbitrary files on a visitor's local machine.
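A sketch of that flow: the browser POSTs the file, PHP stores it in a server-side temp file, and that temp path is what gets relayed to FTP. The 'userfile' field name and the validate_upload() helper are assumptions, not part of the original code:

```php
// Sketch: pick the server-side temp path out of a $_FILES-shaped array.
// Returns null when nothing usable was uploaded.
function validate_upload(array $files, $field = 'userfile')
{
    if (!isset($files[$field]) || $files[$field]['error'] !== UPLOAD_ERR_OK) {
        return null;
    }
    return $files[$field]['tmp_name']; // PHP's temp copy of the upload
}

// Usage in the upload handler, reusing uploadFile() from the question:
// if (($local = validate_upload($_FILES)) !== null) {
//     $ftpObj->uploadFile($local, $dir . '/' . basename($_FILES['userfile']['name']));
// }
```

Since PHP deletes its upload temp file when the request ends, the FTP relay (or a move_uploaded_file() call to a holding directory) has to happen within the same request.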