Which upload method is preferable when hosting files on the server - PHP

Which upload method is preferable? Individual file upload through FTP, or zip file upload through the file manager?
Are any files lost when using zip file upload?

Uploading a zip file can be faster, as the total size you'll transfer will be smaller; just remember that you'll have to unzip your file after the transfer.
And if you zipped all your files correctly, nothing will be lost.

It's a lot quicker to upload a zipped file and extract it on the server. If only FTP could support remote unzipping.
If it's a large file, I tend to upload the .zip via FTP and then extract it via cPanel.

If you can, do a zip file upload via FTP.
First of all, text files compress well, so you save on the size that needs to be uploaded to the server.
When moving separate files via FTP, there is usually a separate connection to the server for each file, so the transfer will be very slow.
Also, if you can, don't use the File Manager that almost all hosting providers offer, because moving files via the browser has a 30 s timeout (unless it's increased, but it's still not recommended). Only use it if there is no other way to extract zip files on the server. But still, it will take some time to upload big files.
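If your host lets you run PHP but its File Manager can't unzip, a tiny one-off script can do the extraction server-side. A minimal sketch using PHP's ZipArchive; the archive name and target directory are assumptions. Drop it next to the uploaded archive, request it once in the browser, then delete it:

    <?php
    // Extract an uploaded archive in place; file names are placeholders.
    $archive = __DIR__ . '/site.zip'; // the zip you uploaded via FTP
    $target  = __DIR__;               // extract into the current directory

    $zip = new ZipArchive();
    if ($zip->open($archive) === true) {
        $count = $zip->numFiles;      // read before close() invalidates it
        $zip->extractTo($target);
        $zip->close();
        echo "Extracted $count files.";
    } else {
        echo 'Could not open archive.';
    }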

Related

Transfer files from s3 to SFTP server

I have logic to upload images to S3 in PHP. I need to upload the same set of images to an SFTP server as well. As I see it, there are 2 options. The first is to add logic that uploads each image from my local server to the SFTP server at the same time as I am uploading it to S3, and the other is to write a script that transfers the images from S3 to the SFTP server. I need the same set of images to be on both the server and S3.
Out of the 2 approaches, which one is optimal? Is there any other way to approach my requirement? Is any sample PHP script available for local-to-SFTP file transfer? If yes, please provide the code.
I cannot say for sure which one is optimal, but I can definitely see a potential issue with option #1. If you perform the second upload (i.e. from your "local" server to the SFTP server) during the first upload, you make PHP wait on that operation before returning the response to the client. This could cause some unnecessary hanging for the user agent connecting to the local server.
I would explore option #2 first. If possible, look into SSHFS. This is a way to mount a remote filesystem over SSH; it uses SFTP to transfer files. That way, all you have to do is write the file once to the local server's filesystem and then again to the mounted, remote filesystem, and SSHFS takes care of the transfer for you.
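Since the question asks for a sample: here is a minimal sketch of a local-to-SFTP transfer using the phpseclib library (composer require phpseclib/phpseclib). The host, credentials and paths are placeholders, and this shows one possible building block rather than a definitive answer to which approach is optimal:

    <?php
    require 'vendor/autoload.php';

    use phpseclib3\Net\SFTP;

    $sftp = new SFTP('sftp.example.com');
    if (!$sftp->login('user', 'password')) {
        exit('SFTP login failed');
    }

    // SOURCE_LOCAL_FILE streams the file from disk instead of treating
    // the second argument as raw string data.
    $sftp->put('/remote/images/photo.jpg', '/local/images/photo.jpg', SFTP::SOURCE_LOCAL_FILE);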

Uploading a commonly required file via FTP without affecting the website?

Let's say I have a file common.php used by many pages on my website. Now I want to update the file via FTP, so there will be around 1-2 seconds where the file is not available / still only partially uploaded.
During that time, require('common.php') reports an error, so the website does not load properly.
How can I solve cases like this?
Thanks!
You can upload the file with a different name and rename it only after the upload completes. That minimizes the downtime.
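If you upload with a script rather than a GUI client, the same trick looks roughly like this with PHP's FTP extension; the host, credentials and file names are placeholders:

    <?php
    $conn = ftp_connect('ftp.example.com');
    ftp_login($conn, 'user', 'password');
    ftp_pasv($conn, true);

    // Upload under a temporary name; readers still see the old common.php.
    ftp_put($conn, 'common.php.tmp', 'common.php', FTP_BINARY);

    // The rename is near-instant, so the window where require() fails
    // effectively disappears. Note that some servers refuse to rename
    // over an existing file; there you'd have to delete first, which is
    // still a far shorter gap than a full upload.
    ftp_rename($conn, 'common.php.tmp', 'common.php');

    ftp_close($conn);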
Some clients even support this automatically, which further minimizes the downtime.
For example, the WinSCP SFTP/FTP client supports this, though with the SFTP protocol only, if that's an option for you.
In WinSCP preferences, enable Transfer to temporary filename for All files.
WinSCP will then upload all files with a temporary .filepart extension, overwriting the target file only after the upload finishes.
(I'm the author of WinSCP)

Usage of bandwidth when uploading from a form

I need to know how much bandwidth is used when uploading a file through a form.
Let me explain a bit more clearly. I have a PHP file containing an upload form that is hosted on a web host. When the user uploads a file, it is sent through this form and on to another server through FTP, so basically I'm creating an FTP connection inside the PHP file that is stored on the web host.
How much bandwidth is used if I upload a 100 MB file? And is it the receiving server (the server we upload to through FTP in the PHP file), the web host (where we are hosting the PHP file that opens the FTP connection), or both that use the bandwidth needed to upload a 100 MB file?
When you use 100 MB of bandwidth (transfer to the first server) and another 100 MB of bandwidth (transfer to the other server), that's 200 MB of bandwidth. From the web host's point of view, that is 100 MB of download from the client, then 100 MB of upload to the FTP server, and sometimes your provider will bill those separately.
100 + 100 = 200. It really is that simple.
(Note that there is overhead in all cases, but not a ton.)
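To make the flow concrete, here is a rough sketch of the relay the question describes; the form field name, host and paths are placeholders. The file crosses the web host's network twice, which is where the 200 MB comes from:

    <?php
    // 100 MB inbound: the browser's upload lands in PHP's temp directory.
    $tmp = $_FILES['upload']['tmp_name'];

    $conn = ftp_connect('ftp.example.com');
    ftp_login($conn, 'user', 'password');
    ftp_pasv($conn, true);

    // 100 MB outbound: the web host relays the file to the receiving server.
    ftp_put($conn, 'uploads/' . basename($_FILES['upload']['name']), $tmp, FTP_BINARY);
    ftp_close($conn);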

php temp file upload directory - off local server

When uploading an image, PHP stores the temp image in a local directory on the server.
Is it possible to change this temp location so it's off the local server?
Reason: I'm using load balancing without sticky sessions, and I don't want files to be uploaded to one server and then not be available on another server. Note: I don't necessarily complete the file upload and work on the file in one go.
The preferred temp location would be AWS S3 - I'm also just interested to know if this is possible.
If it's not possible, I could make the file upload a complete process that also puts the finished file in the final location.
So: can the PHP temp image/file location be off the local server?
Thank you!
You can mount an S3 bucket with s3fs on your instances behind the ELB, so that all your uploads are shared between application servers. As for /tmp, don't touch it: since the destination is S3 and it is shared, you don't have to worry.
If you have a lot of uploads, S3 might be a bottleneck. In that case, I suggest setting up a NAS. Personally, I use GlusterFS because it scales well and is very easy to set up. It has replication issues, but if you don't use replicated volumes at all, you'll be fine.
Other alternatives are Ceph, Sector/Sphere, XtreemFS, Tahoe-LAFS, POHMELFS and many others...
You can directly upload a file from a client to S3 with some newer technologies, as detailed in this post:
http://www.ioncannon.net/programming/1539/direct-browser-uploading-amazon-s3-cors-fileapi-xhr2-and-signed-puts/
Otherwise, I would personally suggest using each server's tmp folder for exactly that: temporary storage. After the file is on your server, you can always upload it to S3, which would then be accessible across all of your load-balanced servers.
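As a concrete version of that last suggestion, a minimal sketch using the AWS SDK for PHP (composer require aws/aws-sdk-php); the bucket, region and form field name are assumptions, and credentials are assumed to come from the environment or an instance profile:

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'region'  => 'us-east-1',
        'version' => 'latest',
    ]);

    // PHP has already written the upload to its local temp directory;
    // push that file on to the shared bucket.
    $s3->putObject([
        'Bucket'     => 'my-upload-bucket',
        'Key'        => 'uploads/' . basename($_FILES['image']['name']),
        'SourceFile' => $_FILES['image']['tmp_name'],
    ]);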

Big XML files over FTP

On my FTP server there is a 4 GB XML file, and I want to put data from that file into the database using PHP.
I know how to connect to FTP and do basic operations using PHP, but my question is: is there a way to do it without having to download the file first?
Unfortunately no, you cannot "stream" a file over FTP as you could on, say, a network drive. It's not possible to open that file without first downloading it locally.
That is assuming you can only access the file via FTP.
If your FTP server and PHP server are one and the same, you just need to change the path to reference the FTP location rather than wherever you are downloading to.
If they are on the same local network, you may be able to use a network path to reach the file.
Otherwise, you will indeed need to transfer the entire file first by downloading it.
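If you do have to download it, you can at least avoid loading all 4 GB into memory by parsing it as a stream once it's local. A minimal sketch using XMLReader; the host, credentials, paths and the 'record' element name are assumptions:

    <?php
    $ftp = ftp_connect('ftp.example.com');
    ftp_login($ftp, 'user', 'password');
    ftp_pasv($ftp, true);

    $local = '/tmp/data.xml';
    ftp_get($ftp, $local, '/remote/data.xml', FTP_BINARY);
    ftp_close($ftp);

    $reader = new XMLReader();
    $reader->open($local);

    while ($reader->read()) {
        if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'record') {
            // Inflate just this one record, never the whole document.
            $record = simplexml_load_string($reader->readOuterXml());
            // ... insert the record's fields into the database here ...
        }
    }
    $reader->close();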
