I have a backup of my website in a WinRAR file, and I want to upload it to my new website... So I uploaded that WinRAR file through my new hosting panel, but it didn't work...
Actually, I'm moving my website to another host, so I downloaded the backup as a WinRAR file and uploaded it to the new hosting. It uploaded successfully, but it's not working; it shows the hosting provider's default page... check this: http://kownleg.comyr.com. I don't know what this error is all about, I'm a newbie... please help.
Any suggestions on why this is happening, or how to upload a WinRAR backup file?
Upload the individual files to the server using the same method you used to upload the compressed file. The key is NOT to send the server a compressed file if you are unable to decompress it on the server.
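For example, if the new host runs PHP, you could re-save the backup as a .zip locally and extract it on the server itself. A rough sketch (the paths are placeholders, and note that PHP's bundled ZipArchive cannot read .rar files; those need the PECL rar extension):

<?php
// Minimal sketch: extract an uploaded backup archive on the server.
$zip = new ZipArchive();
if ($zip->open('/home/user/backup.zip') === true) {   // assumed path
    $zip->extractTo('/home/user/public_html/');       // assumed document root
    $zip->close();
    echo "Backup extracted.\n";
} else {
    echo "Could not open the archive.\n";
}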
This is probably related to file permissions...
Most hosting providers restrict default permissions for security reasons.
Check your file permissions after decompressing the archive.
There should be no difference between uploading a single compressed file (and then decompressing it) and uploading every single file.
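If permissions do turn out to be the problem, a sketch like this can reset them after extraction (the path is an assumption; 0755 for directories and 0644 for files are the usual shared-hosting defaults):

<?php
// Recursively reset permissions under the extracted document root.
$root = '/home/user/public_html';   // assumed path
chmod($root, 0755);
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST   // visit directories as well as files
);
foreach ($it as $item) {
    chmod($item->getPathname(), $item->isDir() ? 0755 : 0644);
}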
I know for a fact that there are trojans and other malware written in PHP that present themselves as images. I also know that to filter uploads we check file extensions such as .txt or .png.
Is there a way to scan files manually while they are being uploaded to the server using the server's built-in antivirus, or does the server do this kind of task automatically for us? (I mean particularly in cPanel.)
Thanks
If you are worried about code being uploaded to your server in the form of an image, simply re-encode the image upon upload. A file containing code with an image extension will throw an error when the encoder tries to process it.
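A rough sketch of that idea, assuming the GD extension is available and the form field is named "upload" (both assumptions):

<?php
// Re-encode an uploaded "image"; anything that is not a real image fails here.
$raw = file_get_contents($_FILES['upload']['tmp_name']);
$img = @imagecreatefromstring($raw);   // returns false for non-image data
if ($img === false) {
    die('Not a valid image.');
}
imagepng($img, '/path/to/uploads/safe.png');   // assumed target path
imagedestroy($img);

Because the encoder rebuilds the pixel data from scratch, any payload appended to an otherwise valid image file is dropped along the way.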
I am frequently getting "unexpected end of file" errors; I think they only happen on included files and not the base files themselves. In any case, they happen whenever I upload a file at the same time a user happens to load a page that uses the file being uploaded.
How would I go about detecting whether a file is being uploaded? Or is there some way to configure my FTP client (I use FileZilla) to read-lock files while uploading?
You should upload to a temporary directory, then swap the completely uploaded file with the old file.
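A minimal sketch of the swap, assuming both paths sit on the same filesystem so that rename() is atomic and a reader never sees a half-written file (paths are placeholders):

<?php
$tmp  = '/var/www/tmp/page.php.new';   // the upload lands here first
$live = '/var/www/html/page.php';      // the file the running site includes
rename($tmp, $live);                   // atomic replace on the same filesystem

With FileZilla you can approximate this by hand: upload the file under a temporary name, then rename it on the server once the transfer has finished.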
I have an administration page on a website, from which the admin AND THE ADMIN ONLY can manage users and upload files from a local hard drive for those users to download. The admin uses this page to upload files for his customers, or to store files he needs when he has no memory device available. The file sizes may vary from a few KB to many hundreds of megabytes.
The ideal solution:
An HTML form through which the admin can choose a file and upload it, to download it back later. This can be done in PHP.
The problems:
I cannot raise the upload_max_filesize limit in php.ini because the host doesn't let me
I tried an FTP upload (the PHP function ftp_put()), but the file still has to reach the server through a POST anyway
Even though it's completely wrong, I used an input type="text" instead of an input type="file" to write out the whole file path and upload it, but I get the following error:
Warning: ftp_put(insert_local_file_path_here) [function.ftp-put]:
failed to open stream: No such file or directory in path_to_php_script.php on line 70
The insane thing is... on Monday this was working, and now it's not. No changes were made, and the file is the same.
My only conclusion:
With my little knowledge, all I could think of is a Java applet, opened from the administration page, that does the required tasks. But if someone has Java disabled or not installed, it will not work, so it's not 100% bulletproof.
Have you got any ideas how to overcome such an issue?
If you need to upload big files, you have to use the FTP protocol. You can't upload big files over HTTP if you don't have access to php.ini. Sad but true.
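For what it's worth, the warning above is expected: the local path passed to ftp_put() is resolved on the web server, not on the admin's PC, so a path typed into a text field will almost never exist there. A sketch of ftp_put() used with a file that really lives on the server (host, credentials, and paths are placeholders):

<?php
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);   // passive mode is often needed behind NAT/firewalls
$ok = ftp_put($conn, 'backups/file.bin', '/tmp/uploaded_file', FTP_BINARY);
ftp_close($conn);
echo $ok ? "Uploaded\n" : "Upload failed\n";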
I have a PHP script that automatically backs up cPanel and concurrently uploads the .tar.gz to an FTP server.
The script works fine.
The script requests the file, then starts the FTP transfer, and when the upload is complete it also sends a confirmation email.
The only problem is that even if the backup is, for example, 1 GB, the uploaded file is only 170 MB.
It seems it's not able to upload large files. In fact, with a small backup (for example 16 MB or 20 MB) it all works fine.
You can see the complete file here http://pastie.org/949680
This is probably due to php.ini settings.
For instance, max_execution_time needs to be high enough for such a large file to finish uploading. I'm not sure if any other ini directives are pertinent.
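If the host allows runtime overrides, a sketch of the settings that commonly cut a long transfer short (the values are illustrative, and a host may lock any of them down):

<?php
set_time_limit(0);                        // lift the max_execution_time cap
ini_set('memory_limit', '512M');          // in case the file is read into memory
ini_set('default_socket_timeout', '600'); // slow FTP connections time out otherwise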
I've seen ways to upload files directly to S3, and I've seen ways to upload a file that is already on the server. I need to modify the file's name before sending it on to S3, but I don't want to store it locally and then upload it if I can help it. Is there a way to put it in a buffer or something? Thanks.
The file will always end up in the temporary directory first while the upload completes, even before you're able to work with it. All of the file's chunks arrive, and then the file is rebuilt in the /tmp directory by default. So no, there's no "pass-through". But you could upload to S3 directly from the temporary directory afterwards, instead of moving the file to another working directory first.
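If the target is S3, one way to skip the extra copy is to read straight from that temporary file and put the new name only in the S3 object key. A sketch assuming the AWS SDK for PHP (aws/aws-sdk-php, v3), a form field named "upload", and credentials supplied through the SDK's default provider chain; bucket and region are placeholders:

<?php
require 'vendor/autoload.php';

$s3 = new Aws\S3\S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',   // assumed region
]);

// Apply the new name to the object key only; the bytes come straight from /tmp.
$newName = 'renamed-' . basename($_FILES['upload']['name']);   // your renaming logic here
$s3->putObject([
    'Bucket'     => 'my-bucket',                   // assumed bucket
    'Key'        => $newName,
    'SourceFile' => $_FILES['upload']['tmp_name'], // no local copy is kept
]);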