I'd like to use .NET to upload a picture file from the local disk to a web server where it will be received by a PHP script and saved to the server. I'm not sure of the preferred way to transfer the data, now that I've realised it isn't as simple as I'd hoped.
The problem is that PHP's file upload mechanism only accepts data encoded as multipart/form-data, and I can't see a way to get WebClient to upload the file this way without doing the encoding myself into a byte array and uploading that.
Which would be the neater solution: should I go through the hassle of doing this encoding on the client? Or, if I just use WebClient.UploadFile, should I be able to receive it from php://input and, if so, will it need decoding?
I have found several examples of doing the encoding, on this and other sites, so I don't need help with that. I'd just like an opinion on whether such client-side encoding is sensible or necessary, or whether I can do the work on the server instead by not using $_FILES and receiving the data in a more 'manual' way.
In the longer term I will be seeking to better understand the HTTP protocol.
I've worked out my own answer, thanks to Janoszen's comment. Just the kind of simplicity I wanted.
Here's an extract from the .NET client (example in Visual Basic):
' Upload the raw file bytes in the body of an HTTP PUT request
Dim wc As New WebClient()
wc.UploadFile(url, "PUT", filename)
And from the PHP on the server:
file_put_contents($filename, file_get_contents('php://input'));
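For anyone copying this, here's a slightly fuller sketch of the receiving script with a basic sanity check added; the destination path and the size cap are illustrative assumptions, not part of the original answer:
<?php
// Minimal sketch of the receiving script. The destination path and the
// 10 MB cap are illustrative assumptions.
$target = '/var/uploads/picture.jpg'; // hypothetical destination

// The PUT body arrives raw on php://input; no multipart decoding needed.
$data = file_get_contents('php://input');

if ($data === false || strlen($data) === 0 || strlen($data) > 10 * 1024 * 1024) {
    header('HTTP/1.1 400 Bad Request');
    exit('Bad upload');
}

file_put_contents($target, $data);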
Thanks Janoszen!
Hey guys,
I need to transfer a large number of files from one server to another as a sort of "update/re-install process" for the application I'm building.
So far the files have been pushed by a main server via FTP. This works well, but I want to get rid of storing client's FTP information and want to turn the push-method into a pull-method. So the client clicks "Update" and the client server receives the files.
I've been looking into Phar, Zip and other ways of packing files, but they require extensions, and I want my application to be as little extension-dependent as possible.
So I've resorted to transferring the files with JSON. The main/source server packs all the files into a JSON array and sends it to the client server upon request, and the client server loops over the files and saves them. It works perfectly well for PHP, JavaScript, etc., but some images are corrupted in the process.
I suspect this is because the data is transferred as ASCII rather than binary: I encountered the same problem when I built the installation with FTP, and when I switched from ASCII to binary transfer the images were no longer corrupted.
Does anybody here have a solution for getting the images transferred uncorrupted?
I use file_get_contents, and have used it in other projects to open and save image data, so I know the function can handle it. I suspect the JSON step needs some additional encoding or something like that to correctly carry the image content?
Thanks in advance
Try with base64. That is the simplest way to transfer binary data with PHP.
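A minimal sketch of that approach (the file list and endpoint URL are made up for illustration):
// Sender (source server): pack files into JSON, base64-encoding the
// contents so binary data survives the text-based transport.
$files = array('img/logo.png', 'js/app.js'); // illustrative file list
$payload = array();
foreach ($files as $path) {
    $payload[$path] = base64_encode(file_get_contents($path));
}
echo json_encode($payload);

// Receiver (client server): decode and save each file.
$url = 'http://source.example.com/export.php'; // hypothetical endpoint
$payload = json_decode(file_get_contents($url), true);
foreach ($payload as $path => $data) {
    file_put_contents($path, base64_decode($data));
}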
I want to allow uploads of very large files into our PHP application (hundreds of megs to 8 gigs). There are a couple of problems with this, however.
Browser:
HTML uploads have crappy feedback; we either need to poll for progress (which is a bit silly) or show no feedback at all
The Flash uploader puts the entire file into memory before starting the upload
Server:
PHP forces us to set post_max_size, which could result in an easily exploitable DoS attack. I'd rather not set this setting globally.
The server also requires some other variables to be present in the POST vars, such as a secret key. We'd like to be able to refuse the request right away, instead of after the entire file has been uploaded.
Requirements:
HTTP is a must.
I'm flexible with client-side technology, as long as it works in a browser.
PHP is not a requirement; if there's some other technology that will work well in a Linux environment, that's perfectly cool.
upload_max_filesize can be set on a per-directory basis; the same goes for post_max_size
e.g.:
<Directory /uploadpath/>
    php_value upload_max_filesize 10G
    php_value post_max_size 10G
</Directory>
Python Handler?
Use a Python POST handler instead of PHP. Generate a unique identifier from your PHP app that the client can put in the HTTP headers, and use mod_python to reject or accept the large upload before the entire POST body is transmitted.
I think
http://www.modpython.org/live/current/doc-html/dir-handlers-hph.html
allows you to check headers and decline the rest of the POST input. I haven't tried it, but it might be the right path?
Looking at the source of mod_python, the buffering of the input via read() seems to allow bit-at-a-time evaluation of the HTTP input. Headers are first.
https://svn.apache.org/repos/asf/quetzalcoatl/mod_python/trunk/src/filterobject.c
It's old, I know, but maybe someone has this problem nowadays, too.
Now you can do this with only JavaScript and, say, PHP. No Flash or Java required on the client side.
demo: http://dnduploader.filkor.org/
The idea is to slice the files with JavaScript's Blob slice() method...
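To give an idea of the server side, here's a minimal sketch of a chunk receiver; the parameter names and temp path are assumptions for illustration, not the API of any particular uploader:
// Receives one slice per request and appends it to a temp file.
// $_POST['name'], $_POST['index'] and $_POST['md5'] are illustrative
// assumptions.
$name  = basename($_POST['name']);   // basename() avoids path traversal
$index = (int) $_POST['index'];
$chunk = file_get_contents($_FILES['chunk']['tmp_name']);

// Optional integrity check: the client sends an MD5 of each slice.
if (md5($chunk) !== $_POST['md5']) {
    header('HTTP/1.1 400 Bad Request');
    exit('Checksum mismatch, resend this chunk');
}

// First slice truncates, later slices append; a real implementation
// would also track which indices have arrived.
file_put_contents("/tmp/upload_$name", $chunk, $index === 0 ? 0 : FILE_APPEND);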
How about a Java applet? That's how we had to do it at a company I previously worked for. I know applets suck, especially in this day and age with all our options available, but they really are the most versatile solution to desktop-like problems encountered in web development. Just something to consider.
You can set post_max_size for just the scripts in one directory. Place your upload script there, and allow only that script to handle large sizes. It's still possible for that script to be attacked with large/useless files, but it avoids setting the limit globally.
Use that with APC and you might be able to work out something good:
IBM developerWorks article on APC
Tried all of this... this is by far the best I have used yet...
http://www.uploadify.com/
Take a look at jumploader.com
A good Java applet for uploading.
I've used it for uploading images and it works fine. I haven't tried files bigger than 10 MB, but it should work for really big files too.
Have you looked into using APC to check the progress and total file size? Here is a good blog post about it. It might help.
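For reference, a minimal sketch of the polling script under that approach; it assumes apc.rfc1867 is enabled and the form contains the usual hidden APC_UPLOAD_PROGRESS field before the file input:
// progress.php - polled by the browser during the upload.
// Requires the APC extension with apc.rfc1867 = 1 in php.ini.
$key    = 'upload_' . $_GET['id'];   // id = value of the hidden field
$status = apc_fetch($key);           // false until the upload has started

if ($status !== false && $status['total'] > 0) {
    // 'current' and 'total' are byte counts reported by APC.
    echo round($status['current'] / $status['total'] * 100);
} else {
    echo 0;
}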
Maybe you could use WebDAV and JavaScript in the browser
AJAX Big file upload, with progress, to WebDAV
http://www.webdavsystem.com/ajax/programming/upload_progress
A simple library
http://debris.demon.nl/projects/davclient.js/doc/README.html
You can then get the JS to redirect the user to a success page. Secret keys and what-not can be handled in a PHP prelude before handing off to the JS client -> WebDAV transfer.
I would look into FTP, SSH or SCP; these allow you to upload a large file and still have access control over it. This might take a little longer to implement, but it's probably the most secure way I can think of.
I know it sucks to add another dependency, but in my experience most websites that are doing something like this are using Flash on the client side and uploading the large file in chunks.
Adobe has a how-to on Flash file uploads.
I also found this tutorial on codeproject:
Multiple File Upload With Progress Bar Using Flash and ASP.NET
PS - I know you're using PHP and not .NET; I figured the important part was the Flash ;)
I've had success with Uploadify, and I would recommend it. It's a jQuery/Flash script that handles large uploads, and you can pass extra parameters to it (like the secret key). To solve the server-side issues, use something like the following code. The runtime changes take effect just for the script they're called in; the size directives themselves have to be set per-directory (see the comments below):
//Check to see if the key is there and valid
if(!isset($_POST['secret_key']) || !isValid($_POST['secret_key']))
{
    exit("Invalid request");
}
function isValid($key)
{
    //Put your validation code here; return true or false.
}
//This line changes the timeout.
//Give it a value in seconds (3600 = 1 hour)
set_time_limit(3600);
//Note: post_max_size and upload_max_filesize are PHP_INI_PERDIR settings,
//so ini_set() cannot change them at runtime. Set them in php.ini or in a
//per-directory .htaccess instead, e.g.:
//  php_value post_max_size 8192M
//  php_value upload_max_filesize 8192M
//Generally speaking, the memory_limit should be higher
//than your post size; memory_limit can be raised at runtime.
ini_set("memory_limit","8200M");
EDIT In response to your comment:
Given what you've said, I'm afraid you may not be able to meet your requirements over HTTP. All of the solutions out there are code that adds features to HTTP that it was never designed for.
Like you said yourself, it's a simple protocol. Short of writing your own client software that runs outside of the browser, using a Java applet, or using a different protocol (like FTP, which was designed for this), you might not get what you want.
I've done the best I could within the given constraints. Sorry I couldn't do better.
Try this: http://www.simple2ftp.com uses a Java-based FTP applet from within a clever PHP application wrapper.
I'm currently trying to transfer large data from one server to another using PHP cURL (POSTing the data). In some cases the remote server receives incomplete (corrupted) data.
Is there any other way to achieve this reliably?
EDIT - 1
Using FTP seems like a good idea; would anybody like to say whether it is bad, or whether I should avoid it for any reason? (Suggestions: @Ed Heal, @Neo)
I would guess your PHP session is timing out. See How to increase the execution timeout in php?
Or you could get cURL to run in its own thread. Call it from a bash script, maybe.
Posting large files is not what HTTP is for. FTP is for transferring files; hence the name.
But if you are stuck with HTTP, you can look at the WebDAV extensions to HTTP. There is a PHP library called SabreDAV that you should take a look at:
http://code.google.com/p/sabredav/
You can even use SCP, so that the data transfer is secure as well. You should be able to find libraries for this, and PHP's built-in SSH2 functions can be useful: http://php.net/manual/en/function.ssh2-sftp.php
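A minimal sketch using the ssh2 PECL extension (host, credentials and paths are placeholders):
// Requires the ssh2 PECL extension; all names below are placeholders.
$conn = ssh2_connect('dest.example.com', 22);
if (!$conn || !ssh2_auth_password($conn, 'user', 'password')) {
    exit('SSH connection/authentication failed');
}

// Copy the file over SCP; 0644 sets the remote file permissions.
if (!ssh2_scp_send($conn, '/local/data.tar.gz', '/remote/data.tar.gz', 0644)) {
    exit('Transfer failed');
}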
As you say it is truncated, I would imagine that the server has a file size limit - i.e. to prevent abuse and denial-of-service attacks.
I would stick to FTP and perhaps compress the files.
I am making an online tool for identifying certain file types. I need to access some byte values from the file header to do this.
The user selects the file on the client machine. Somehow, I need to get the key byte values from the file, and then these are looked up in a server side database to categorize the file.
How can I read bytes from a client-side file?
I know I could have the user upload the file to the server, but these files are very large, and I only need a few bytes, so it would be slow and wasteful to upload the whole file.
Could I somehow upload part of the file? It seems it is difficult to cancel an HTML form upload, and the partial file is not available after cancelling. Is this correct?
Is it possible to read a file in JavaScript? I have googled this, but the answer is unclear. I have read that it is possible with a Java applet, but only if the applet is signed.
Is there some other way?
You can use HTML5, but you will need to fall back on Flash or some other non-JavaScript method for older browsers.
http://www.html5rocks.com/en/tutorials/file/dndfiles/
So, as said above, you must use non-JavaScript methods, but each of these methods has some downside.
Flash: works badly with proxies. Really badly. Of course, you can use Flash only to get the base64 code of the file and hand it to JS; in that case it will work great.
Java applet: works great, but not many users have a JVM, or JVM versions may differ (though if you target JDK 1.4 or 1.5 this is no problem).
ActiveX: works only in IE and on Windows.
HTML5 File API: not a cross-browser solution; it only works in recent browsers, and not all of them.
Of course, it is much better to do this server-side - in PHP, for example, with mime_content_type() and other functions.
But I can manually change the headers of my file. For example, I can add JPEG or PNG headers to a PHP file, and your script will think it is an image.
So relying on headers alone is a flawed solution. To check the file type, you could simply use the file's MIME type, or trust the user and generate an icon from the file extension.
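To illustrate the server-side route, a minimal sketch that checks both the detected MIME type and the file's leading bytes; the magic numbers shown are a small illustrative sample:
// Check the detected MIME type and the file's first bytes.
$path = $_FILES['upload']['tmp_name'];

// Fileinfo inspects the content, not the client-supplied name.
$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime  = finfo_file($finfo, $path);
finfo_close($finfo);

// Compare the leading bytes against known magic numbers
// (PNG: 89 50 4E 47 0D 0A 1A 0A, JPEG: FF D8 FF).
$magic  = file_get_contents($path, false, null, 0, 8);
$isPng  = substr($magic, 0, 8) === "\x89PNG\r\n\x1a\n";
$isJpeg = substr($magic, 0, 3) === "\xFF\xD8\xFF";

if (!in_array($mime, array('image/png', 'image/jpeg')) || !($isPng || $isJpeg)) {
    exit('Not a supported image');
}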
I code primarily in PHP and Perl. I have a client who is insisting on seeking video submissions (any encoding) from the public via one of their pages rather than letting YouTube do its job.
Server in question is a virtual machine and I can adjust ini settings for max post, max upload size etc as needed.
My initial thought is to use a Flash based uploader with PHP on the back end but I wondered if someone might have useful advice and experience on the subject?
Doing large file transfers over HTTP is not usually fun -- but sometimes it's necessary.
For large files, you'll definitely want to provide some kind of progress gauge for end-users.
There are Flash-based tools that do this (SWFUpload comes to mind).
If you want to avoid Flash and do it with pretty HTML/JavaScript/CSS, you can leverage PHP's APC extension, which, for some reason, provides support for getting upload status from the server, as explained here
You can adjust the post size and use a normal HTML form. The big problem is not Apache, it's HTTP. If anything goes wrong in the transmission you will have no way to detect the error. Furthermore, there is no way to resume the transfer. This is exactly why BitTorrent is so popular.
I don't know how against YouTube your client is, but you can use their API to do the uploads from a page on your site.
http://code.google.com/apis/youtube/2.0/developers_guide_protocol.html#Uploading_Videos
See: browser-based uploading.
For web-based uploads, there are not many options. Regardless of web platform, web server, etc., you're still transferring over HTTP. The transfer is all or nothing.
Your best option might be to find a Flash, Java, or other client-side component that can chunk files and upload them piecemeal, then do a checksum to verify. That would allow for resuming uploads. Unfortunately, I don't know of any such open-source component.
Try to convince your client to change their point of view.
Using HTTP (and the browser, hell, the browser!) for this kind of task is rarely a good deal; will his users wait 40 minutes with the computer and the browser running until the upload is complete?
I don't think so.
Maybe you could set up a public FTP account where users can upload but not download or see other users' files. Then those who want to use FTP software can, and those who prefer to do it via the browser can too.
The big problem with using a browser is that, if something goes wrong, you can't resume; you have to restart from zero.
Last year I had the same issue; I had a look at ZUpload, but I didn't use it, so I can't vouch for it. (We wrote a small Python script that we send to our customer; the script creates a torrent of the folder our customer needs to send us, and we download it via uTorrent ;)
P.S.: again, sorry for my bad English ;)
I used JUpload. Yes, it looks horrible, but it just works.
With that said, it's still a better idea to convince the client that doing so is stupid.
I would agree with others stating that using HTML is a poor option. I believe there is a size limitation using Flash as well. I know of a script that uses a Java applet to perform an actual FTP transfer. It is called Simple2FTP and can be found at http://www.simple2ftp.com
Not sure, but perhaps worth a try?