ftp_get() does not download, instead moves file to website root folder - php

I have a website developed under CakePHP. I'm trying to change the download system from Apache to FTP.
I currently am doing the FTP by hand with PHP (not using any plugins or libraries).
My code successfully connects to the FTP server and logs in. The problem I'm having is that when I call ftp_get($conn_id, $localFile, $remoteFile, FTP_BINARY), it executes successfully, but the file does not download to the client; instead, it's moved to the /webroot/ folder, which serves, as the name suggests, as the root folder of the website (via .htaccess rules).
I mention the .htaccess because I suspect it may be what's causing the FTP download to be "moved" to the root, but I'm not sure.
The following is my FTP download code:
$conn_id = ftp_connect($host);
$login_result = ftp_login($conn_id, $ftp_user, $ftp_pass);
if ($login_result) {
    try {
        if (ftp_get($conn_id, $localFile, $remoteFile, FTP_BINARY)) {
            $this->log("Downloaded...", "debug");
        } else {
            $this->log("Couldn't download...", "debug");
        }
    } catch (Exception $e) {
        $this->log("Error: " . $e->getMessage(), "debug");
    }
} else {
    $this->log("Couldn't connect to FTP server...", "debug");
}
ftp_close($conn_id);
Yes, I checked (printed out) the $conn_id and the $login_result and they are working.
What is inside the paths?
$remoteFile = "/downloads/rdticoiroe1432584529/YourMusic.zip";
$localFile = "Music.zip";
The code does not throw any errors. I also tried the fotografde/cakephp-ftp plugin for CakePHP and FTP, and it exhibits the same behaviour...
EDIT 1:
It's a music download site. Right now we serve file downloads with Apache (which is very slow); the idea is to move to FTP downloads. I'm trying to use the FTP protocol to download the file when the client requests it.
EDIT 2:
The whole idea of this question is me trying to move to a more optimized method of serving files to clients. I own a server with 100 Mbit transfer and downloads are pretty slow. I'm using Apache at the moment to serve files to clients who request them.
I completely misunderstood the idea of using PHP's FTP functions to serve files to the client's hard drive.
So I'm looking for some guidance on what methods/protocols people use when they serve files to clients who request a download. (This is a music download site.)

You have a fundamental misunderstanding here: the "remote file" in an FTP connection is something remote from you - a file on another server somewhere, for instance. In your case, I think the file you're trying to serve is "local" in this sense, but you want a way of getting it to a client quicker.
There is no way of pushing a file out to a client's hard drive (think of the security implications), you can only make it available. Right now, the client can request it using HTTP, and Apache will give them the content. If you installed an FTP server, they could connect with an FTP client and request it that way. Similarly, you could install a different web server, like nginx or lighttpd, which might be faster.
Nothing you do in PHP, however, will be faster than Apache alone: you still have Apache involved, but now your PHP code runs as well, which at best adds the time needed to execute your code before doing what Apache was going to do anyway.
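If the reason for routing downloads through PHP is access control (checking who may download what), one common compromise is mod_xsendfile for Apache: PHP does the permission check and sets a header, and Apache does the actual byte transfer. The sketch below assumes mod_xsendfile is installed and enabled; the path is illustrative.

```php
<?php
// Minimal sketch of an X-Sendfile download endpoint. Assumes Apache
// has mod_xsendfile installed and XSendFile enabled for this vhost;
// the file path is illustrative.
function serve_download($file)
{
    if (!is_file($file)) {
        http_response_code(404);
        return false;
    }
    // Hand the actual transfer off to Apache instead of streaming
    // the bytes through PHP.
    header('X-Sendfile: ' . $file);
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    return true;
}
```

This keeps PHP in the request only long enough to authorize it; the transfer speed is then whatever Apache can deliver.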

Related

How can I write a file from one server to another server in php?

Is it possible to write a file from one server to another server?
I have 2 domains, both on different web servers.
http://example1.com is on GoDaddy and http://example2.com is on a free webhost, eu5.org.
I want to write the .htaccess file of http://example1.com from http://example2.com.
My code on example2.com:
$code = $_POST["cd"];
if (empty($code)) {
    echo "Could not insert data!";
} else {
    $file = fopen("http://server2.com/.htaccess", "w");
    echo fwrite($file, $code);
    fclose($file);
}
It doesn't rewrite the .htaccess file and there is no PHP error. The file permission of example1.com/.htaccess is 777.
Any idea?
The short answer is:
"Yes. It possible to write a file from on server to another server"
To write any file to any server across the network, you need to:
1) decide on a protocol (for example, ftp), then
2) instantiate a "listener" on the host to accept client requests for that protocol (for example, installing WinSCP on your Windows host)
Since you're writing in PHP - and since PHP is already using a protocol (HTTP) and already has a "listener" (your web server) - why not just write a PHP app to accept files for upload and send files for download?
Here's a simple example:
http://justinpaulin.com/2012/09/03/a-simple-server-to-server-file-transfer-script-php/
Whatever you choose to do, make sure it's secure. In particular, you should not be able to "re-write .htaccess", and you should not be able to read or write to any arbitrary directory.
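To make the HTTP approach concrete: the sending server can POST the content with cURL, and a receiving script on the other server validates it before writing anything. The endpoint URL and shared-secret scheme below are illustrative assumptions, not code from the question.

```php
<?php
// Hypothetical sender script on example2.com. Assumes example1.com
// exposes an upload endpoint that verifies a shared secret before
// writing anything to disk; URL and field names are illustrative.
function push_file($url, $content, $secret)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, array(
        'secret'  => $secret,
        'content' => $content,
    ));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response; // whatever the receiving script echoes back
}
```

The receiving script should check the secret, restrict the target path to a whitelist, and never accept an arbitrary filename from the request, per the security warning above.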

Hosting on Android: Backing up by cloning to external SD card?

I'm hosting a small website on a pretty budget Android tablet. If it were to fail, I would essentially lose my files. To get around this, I'm thinking of using PHP's copy() to clone the main directory to the tablet's SD card at my request.
Is this a good idea, or not? What are the risks? While this is happening, will PHP still continue to run other scripts?
The directory that will be cloned only contains the webpages, all of the images have been moved into another folder.
Here is how I am planning to accomplish it:
include("adminverify.php");
if (isset($_GET["websiteBackup"]) && $admin === true) {
    $sdcard = "/mnt/extsdcard";
    $sourcefile = "/internal/www";
    // Write to a file saying the backup has started
    copy($sourcefile, $sdcard) or die("Could not be done");
    // Write to a file saying the backup has finished
}
Alternatives are welcome. I would simply go on the tablet myself and copy the files over, but the tablet is honestly too laggy.
If it turns out PHP is not able to function while doing the backup, I will simply have it change the directory name and modify the 404 page to say that the website is temporarily unavailable.
If this helps, I'm using lighttpd and PHP 5.5.15
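One thing to note about the plan above: PHP's copy() only copies a single file, so calling it on a directory like /internal/www will fail. A directory tree has to be walked recursively. Here is a minimal sketch (the paths in the question are kept only as illustration):

```php
<?php
// Minimal recursive directory copy. copy() alone handles single files
// only, not directories, so the tree must be walked explicitly.
function copy_dir($src, $dst)
{
    if (!is_dir($dst) && !mkdir($dst, 0755, true)) {
        return false;
    }
    foreach (scandir($src) as $entry) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $from = $src . '/' . $entry;
        $to   = $dst . '/' . $entry;
        if (is_dir($from)) {
            if (!copy_dir($from, $to)) {   // recurse into subdirectories
                return false;
            }
        } elseif (!copy($from, $to)) {     // plain file: copy() works here
            return false;
        }
    }
    return true;
}
```

As for blocking: each PHP request runs independently, so a long-running copy in one request does not stop the server from handling others, though on a low-powered tablet the I/O load itself may slow everything down.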

Upload file with sftp and php

I'm using the phpseclib 0.3.7 library to connect via SFTP to my web server. The users' directories have the following structure:
/dirhome/FirstUser/FirstUser/
/dirhome/SecondUser/SecondUser/
/dirhome/....../.....
Where:
dirhome:
own -> root:root
permission -> 700
FirstUser (the outer directory, named after the user):
own -> root:root
permission -> 755
FirstUser (the inner directory):
own -> FirstUser:mygroup
permission -> 700
The same structure applies to the second user and so on. With this setup, users can view/edit/create content only within their own directory.
I can easily connect in PHP using the library and view folders/files, but I don't know how to upload files from the user's local PC to the remote server, into their personal folder.
The library provides the method:
->put($remotepath, $localpath, NET_SFTP_LOCAL_FILE);
The problem is:
How can I open a dialog box to allow the user to select a file on their PC and upload it to their personal directory?
If I put: $localpath = "C:\\Users\\****\\Desktop\\test.txt"
I get:
C:\\Users\\****\\Desktop\\test.txt is not a valid file
But if I use a file that resides locally on the server, it works fine - which does not help, because users cannot upload their own files that way. I have been trying for several days without success.
The download does not work either, essentially for the same reason.
PHP is a server side scripting language. When your browser requests a PHP generated site, PHP runs on the server and your browser only gets to see the result. For that reason, PHP obviously has no way to access client side files: it already ran before anything even reaches the client computer.
I'm assuming the server with the PHP files on it and the SFTP server are different servers, otherwise your whole question doesn't make too much sense.
So what you need to do here is a two-step approach: first, upload the files to the server that runs the PHP scripts the regular way, using an HTTP POST request. You can send the request to a PHP script that then uses SFTP to move the files to the other server.
For downloads (as you asked this in your comments) it works similar: the browser requests a PHP script that fetches the file from the SFTP server, and then sends it to the browser as HTTP response. The browser should then display a regular file download dialog and the user can select where to store it.
For uploading you should consider using some kind of cron job or a background job started via PHP's exec() instead, as you will most likely either run into max execution timeouts or have to set them far higher than you should if you upload the files in the request itself, especially for large files. The alternative is to use a separate PHP configuration (depending on your PHP version you can use .htaccess files, .user.ini files or different PHP-FPM pools for this) to increase the execution time for only the upload script.
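The two-step approach can be sketched with phpseclib 0.3.x as follows. The host, credentials, and remote path are illustrative assumptions; only the put() call with NET_SFTP_LOCAL_FILE comes from the library's documented API.

```php
<?php
// Sketch of the two-step approach with phpseclib 0.3.x: the browser
// POSTs the file to this script, which then pushes it over SFTP.
// Host, credentials, and paths below are illustrative.
function forward_upload_to_sftp(array $upload)
{
    include_once 'Net/SFTP.php'; // phpseclib 0.3.x on the include path

    $sftp = new Net_SFTP('sftp.example.com');
    if (!$sftp->login('FirstUser', 'password')) {
        return false;
    }
    $remote = '/dirhome/FirstUser/FirstUser/' . basename($upload['name']);
    // NET_SFTP_LOCAL_FILE makes put() read from a local path instead of
    // treating the second argument as raw string data.
    return $sftp->put($remote, $upload['tmp_name'], NET_SFTP_LOCAL_FILE);
}

// Typical usage from the HTML-form upload handler:
// if (is_uploaded_file($_FILES['userfile']['tmp_name'])) {
//     forward_upload_to_sftp($_FILES['userfile']);
// }
```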

Help understanding the setup of a CDN?

Here's a few questions I cannot find in search:
When adding CDN services to your website, do you still maintain/create local dynamic files on your origin server and point the CDN to that location, set a http rule, and have them upload it automatically if they aren't hosting it yet?
Let's say I have an avatar upload form on my origin server and after a cropping function, do I set the save image to the local directory or to the CDN?
The other question I have is: if you save files locally first and wait for the CDN to pull them, how do you code the page to know the difference? Do you use something like
// $filename = 'images/image.jpg';
function static_file($filename) {
    $cdnfilepath = 'http://cdndomain.com/';
    if (fopen($cdnfilepath . $filename, "r")) {
        return $cdnfilepath . $filename;
    } else {
        return $filename;
    }
}
Or, do you just PUT every dynamically created file that you would like the CDN to host directly to the CDN?
If anyone knows a good tutorial on this, that would be helpful. Sorry if any of this has been covered, but I have been searching with no clear answers...
Sometimes there's no straightforward way of uploading directly to your CDN.
For example, with AWS you have to PUT the file, which means it still has to be uploaded to your server temporarily. What I do is upload the files to a temp directory, then have a cron script run that PUTs the files onto AWS, so as not to make the upload process take any longer for the end user.
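The cron worker in that scheme is essentially a sweep over the temp directory. In this sketch, $cdn_put is a placeholder for whatever call your CDN's SDK provides (for example an S3 PutObject); the directory is illustrative.

```php
<?php
// Sketch of the cron worker: sweep a temp directory and push each file
// to the CDN. $cdn_put stands in for the CDN SDK's upload call; the
// temp directory path is illustrative.
function sweep_uploads($dir, callable $cdn_put)
{
    $pushed = array();
    foreach (glob($dir . '/*') as $file) {
        if (is_file($file) && $cdn_put($file)) {
            unlink($file);            // remove only after a successful push
            $pushed[] = basename($file);
        }
    }
    return $pushed;
}
```

Leaving failed files in place means the next cron run retries them automatically, which is the main reason to delete only on success.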

Web Development: How can I allow a user to upload files directly to my CDN (Cachefly)?

I have a PHP web application that allows users to upload images to my web site. I'm doing this using a simple HTML <form enctype="multipart/form-data">.
However, instead of having those images uploaded to my web server, I would like to have those images uploaded directly to my CDN (Cachefly - which is another server).
Is this possible ... to have a web-application allow a user to upload images directly to another server?
In case it helps, here's my PHP code:
$target_path = "/home/www/example.com/uploads/";
$target_path = $target_path . basename($_FILES['uploadedfile']['name']);
if (move_uploaded_file($_FILES['uploadedfile']['tmp_name'], $target_path)) {
    // file has been uploaded **LOCALLY**
    // HOWEVER, instead of it being uploaded locally, I would like the file
    // to be directly uploaded to the CDN ('other' server)
    ...
} else {
    // error: file did not get uploaded correctly
    ....
}
I think in the case of a CDN you will first have to receive the files on your server and then upload them to the CDN's 'bucket' using its API. I don't think you can upload directly to a CDN unless there is a way to map it as a directory on your server.
Moving/uploading a file to a service, or to a server you cannot access directly, is usually done using the provider's API.
Moving/uploading a file to a server 'owned' by yourself can be done using PHP's FTP extension (for more information: pear.php.net or pecl.php.net).
Moving/uploading a file to a server 'owned' by yourself that is one of many in a cluster is usually done by uploading the file temporarily to one server, after which a .sh, .bash or similar script is called that triggers further transfer processes to another server.
I don't think it's possible to directly upload to another server, but I could be wrong. I had a similar problem, and I used PHP's FTP capabilities (http://us3.php.net/manual/en/book.ftp.php). I still used my server as a middle-man, meaning I uploaded the files to my server, then FTP transferred them to the target server, and then deleted the file from my server.
You could receive it on your web server and then transfer it to the CDN via a file share or FTP.
If the CDN is web-facing, you could redirect the request to that server and send control back to your web server form once the file is uploaded. It's probably better to do the file transfer in the back end, though, and keep the user connected to the web server.
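The receive-then-forward approach can be sketched with PHP's FTP extension. The CDN host, credentials, and remote path are illustrative; only the ftp_connect/ftp_login/ftp_put calls are the real API.

```php
<?php
// Sketch of forwarding a received upload to the CDN over FTP.
// Host, credentials, and paths are illustrative placeholders.
function forward_to_cdn($localFile, $remoteFile)
{
    $conn_id = ftp_connect('ftp.cdn.example.com');
    if ($conn_id === false || !ftp_login($conn_id, 'cdn_user', 'cdn_pass')) {
        return false;
    }
    // ftp_put uploads a local file to the remote server (the inverse
    // of ftp_get); FTP_BINARY avoids corrupting image data.
    $ok = ftp_put($conn_id, $remoteFile, $localFile, FTP_BINARY);
    ftp_close($conn_id);
    return $ok;
}

// In the upload handler, after move_uploaded_file():
// forward_to_cdn($target_path, '/uploads/' . basename($target_path));
```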
Sure.
Somewhere in your code there is a "$target_directory" variable that needs to be set. It won't be called exactly that, but depending on how your function is set up it needs to be there somewhere. Just use an absolute path for the directory you want the files to land in. Also, make sure that directory is CHMOD'd to 777 so it can be written into.
Post your code and I can help more.
Yes, Amazon Web Services already allows you to upload to Amazon S3 directly from the user's browser:
Documentation: http://doc.s3.amazonaws.com/proposals/post.html
Additionally, that S3 bucket can be exposed via the Amazon CDN (or any other CDN that can point to a customer's origin server).
