transfer file selected in form with scp/sftp - php

I'd like to upload a file from an HTML form through scp/sftp (I know their syntax) in a PHP script, not the traditional way (upload to the server's /tmp folder, then call move_uploaded_file). Upon submitting the form, I'd just like to fetch the selected file's full path from <input type="file" ...> and pass it to ssh2_scp_send. If there is a way, please let me know; if not, please explain why. Btw, if there is also a way to let the user choose in the web page where the file should go on the server (some GUI/dialog), let me know too. Thanks.

You need to separate the server side from the client side. The web page (client side) is generally not able to upload anything anywhere directly (without the use of some plugin or applet; a Java applet would work fine). PHP itself, being a server-side language (in general, I am ignoring border cases now), can't be used to transfer a file directly from the user's computer to some SFTP server.
So the best option is to let the user upload the file to the HTTP server as part of the form, then upload the received file to the SFTP server. If you can't do this (e.g. due to size limitations), you can create a Java applet which lets the user choose the file and transfer it to the server.
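For the usual flow (receive the upload, then push it on over SSH), a minimal sketch using the ssh2 PECL extension might look like the following; the host, credentials, field name and remote path are placeholders:

<?php
// Minimal sketch: accept the form upload, then copy it to a remote host over SCP.
// Assumes the ssh2 PECL extension; host, credentials and paths are placeholders.
if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $localTmp = $_FILES['userfile']['tmp_name'];   // PHP's temporary copy of the upload
    $remote   = '/remote/path/' . basename($_FILES['userfile']['name']);

    $connection = ssh2_connect('sftp.example.com', 22);
    if ($connection && ssh2_auth_password($connection, 'username', 'password')) {
        if (!ssh2_scp_send($connection, $localTmp, $remote, 0644)) {
            echo 'Transfer to the remote host failed';
        }
    }
}

Note that $_FILES['userfile']['tmp_name'] is the temporary file PHP already created for the upload; there is no way to skip that step, because modern browsers never expose the client-side path to the server.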

Related

Process Uploaded file on web server without storing locally first?

I am trying to process a user-uploaded file in real time on the web server, but it seems Apache only invokes PHP once the complete file has been uploaded.
When I uploaded the file using cURL and set
Transfer-Encoding: chunked
I had some success, but I can't do the same thing via the browser.
I used Dropzone.js, but when I tried to set the same header it said Transfer-Encoding is an unsafe header and refused to set it.
This answer explains what the issue is there:
Can't set Transfer-Encoding: "chunked" from the browser
In a nutshell, the problem is: when a user uploads a file to the web server, I want the web server to start processing it as soon as the first byte is available.
By processing I mean piping it to a named pipe.
I don't want 500 MB to be uploaded to the server first and only then start processing it.
But with the current web server stack (Apache + PHP), I can't seem to accomplish this.
Could someone please explain what technology stack or workarounds I should use so that I can upload a large file via the browser and start processing it as soon as the first byte is available?
It is possible to do that with Node.js and Multiparty. Here they have an example of a direct upload to Amazon S3. This is the form, which sets the content type to multipart/form-data, and here is the function for processing the form parts. The part parameter is a ReadableStream, which allows per-chunk processing of the input using its data event.
More on readable streams in Node.js is here.
If you really want that (sorry, I don't think it's a good idea) you should look for a FUSE filesystem which does the job.
Maybe there is already one: https://github.com/libfuse/libfuse/wiki/Filesystems
Or you could write your own.
But remember: as soon as the upload is completed and the POST script finishes its job, the temp file will be deleted.
You can upload the file with an HTML5 resumable-upload tool (like Resumable.js) and process the uploaded parts as soon as they are received; see the sketch after this answer.
Or, as a workaround, you may find the path of the uploaded file (usually in /tmp) and write a background job to stream it to the third-party app. That may be harder.
There may be other solutions...
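A minimal sketch of the receiving side for a Resumable.js-style chunked upload, assuming the library's default file field name ('file') and a hypothetical named pipe at /tmp/processing.pipe. Each chunk arrives as an ordinary small upload, so processing can start after the first chunk instead of after the whole 500 MB; configuring Resumable.js with simultaneousUploads: 1 keeps the chunks in order.

<?php
// upload.php - sketch of a Resumable.js chunk receiver.
// The field name is the library's default; the pipe path is hypothetical.
if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $pipe = fopen('/tmp/processing.pipe', 'wb');    // named pipe created beforehand with posix_mkfifo()
    $in   = fopen($_FILES['file']['tmp_name'], 'rb');
    stream_copy_to_stream($in, $pipe);              // hand this chunk to the consumer immediately
    fclose($in);
    fclose($pipe);
    http_response_code(200);                        // tells Resumable.js the chunk was accepted
}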

Processing file in the users directory

I am writing a script that processes a .csv file. Currently the user has to upload the csv file to the server in order to process it, and then has to download the processed file, which is a lot of work for the user.
My question is: is there a way to process a file from the user's directory path without the user having to upload it first? The user would just browse to the file to be processed, and the file would be saved and processed in that path.
Thanks,
Sbo
Then the only option you have is to do it client-side. To do it client-side you have to use a client-side technology like Flash or JavaScript; the latter is probably the better choice. The following URL explains how to do a client-side file upload: http://igstan.ro/posts/2009-01-11-ajax-file-upload-with-pure-javascript.html
You want to get access to the user's computer? Forget it.
The only way to achieve that is to use Java applets with special permissions. In PHP you need to upload the file; it can go to the temp directory, but you still need to upload it.
Java applets need to be signed and have a certificate the user accepts. There is no other way I know of to get access to the user's files.
Check this link as well

PHP upload field destination of other server

I have a website which I'll call website.com that is located on server1. website.com has a field to upload a file. When someone uploads a file on website.com, I don't want the file uploaded to server1, I want it to upload to another server, server2. What is the best way to do this? Can I do this using php, a shell script?
After the file is uploaded to server2, I have a shell script to execute on the file which I will also eventually have to figure out how to run from server1.
I hope this makes sense, thanks in advance.
Another possible way to do this is to upload the file to your website.com site (server1) and use cURL to send the image to the other server. Once this completes you can remove the image again.
See CURL PHP send image for more information.
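A minimal sketch of that relay on server1, assuming PHP 5.5+ for CURLFile; the endpoint URL and the field names are placeholders:

<?php
// Sketch: forward a just-uploaded file from server1 to server2 with cURL.
// The endpoint URL and field names are placeholders.
$tmp  = $_FILES['image']['tmp_name'];
$name = basename($_FILES['image']['name']);

$ch = curl_init('https://server2.example.com/receive.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, [
    'image' => new CURLFile($tmp, $_FILES['image']['type'], $name),   // CURLFile requires PHP 5.5+
]);
$response = curl_exec($ch);
curl_close($ch);
// PHP deletes the temporary upload at the end of the request, so no manual cleanup is required.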
-- UPDATE --
For an SSH connection you need to install additional libraries to allow PHP to make SSH connections; an excellent tutorial can be found here.
-- UPDATE 2 --
The question intrigued me, so I expanded my research. There seems to be another PHP library, phpseclib, on SourceForge. In the documentation on page 5 there is some information on how it works.
The only good way to make this work is to read the image as binary, send it over to the other server as text, and write that into a file, thereby recreating the image from the original source.
Also, place the image in a public folder that only accepts calls from your website1 domain; this way you also prevent hotlinking of your images and save considerable bandwidth.
I also came across this for help with phpseclib.
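For completeness, a minimal SFTP upload sketch with the current phpseclib 3 API (the older 1.0 API documented on SourceForge uses include('Net/SFTP.php') and the Net_SFTP class instead); host, credentials and paths are placeholders:

<?php
// Sketch: push the received upload to server2 over SFTP with phpseclib 3.
// Install with: composer require phpseclib/phpseclib:~3.0
require __DIR__ . '/vendor/autoload.php';

use phpseclib3\Net\SFTP;

$sftp = new SFTP('server2.example.com');
if (!$sftp->login('username', 'password')) {
    exit('SFTP login failed');
}

// SOURCE_LOCAL_FILE tells put() to read from a file path instead of treating
// the second argument as the literal data to write.
$sftp->put(
    '/remote/uploads/' . basename($_FILES['image']['name']),
    $_FILES['image']['tmp_name'],
    SFTP::SOURCE_LOCAL_FILE
);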
In the end I wouldn't choose a solution like this. I would move your website from server1 to server2, just to keep everything in one place.
Can't you put the script that handles the upload on server2?
You can have your HTML page with the form served from server1, but call the PHP for the upload on server2.
Update
For example...
Server 1 has a file index.php which has a form:
<form action='http://server2.com/some_directory/uploader.php' method='POST' enctype='multipart/form-data'>
    .... some form code, e.g. <input type='file' name='userfile'>
</form>
The form on index.php points to a PHP script on server 2, via a URL. That PHP script can now handle the input.
Of course this will only work if server2 is accessible from the internet. If it is not, you will have to use some sort of shell script on server1 to move the files over the internal network once they have been uploaded to server1.
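A minimal sketch of what uploader.php on server2 could look like (the field name and destination directory are placeholders):

<?php
// uploader.php on server2 - sketch of the receiving script.
// The field name 'userfile' and the destination directory are placeholders.
if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $dest = '/var/www/uploads/' . basename($_FILES['userfile']['name']);
    if (move_uploaded_file($_FILES['userfile']['tmp_name'], $dest)) {
        echo 'Upload received';
        // The shell script that processes the file could be kicked off here, e.g. via exec().
    } else {
        http_response_code(500);
        echo 'Could not store the upload';
    }
}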

how to upload files to PHP server with use of Python?

I was wondering, is there any tutorial out there that teaches you how to push multiple files from the desktop to a PHP-based web server using a Python application?
Edited
I am going to write this myself, so I am wondering, in general, what the best method would be to push files from my desktop to the web server. Judging from some responses, FTP is an option, so I will look into that (sadly there is no SFTP support), so just plain old FTP. My other option is to push the data and have PHP read what is sent to it, pretty much like the ActionScript/Flash file uploader I made, which pushes the files to the server where they are fetched by PHP, and it goes on from that point.
I'm assuming you own the PHP server.
1. Use FTP. See here and here.
2. Make a file upload form with PHP, and use Python to fill out the form. See this and this.
3. (Usually a bad idea) Use PHP to write a small server that listens for data and then writes it to a file.
I think you're referring to an application made in PHP running on some website, in which case that's just normal HTTP stuff.
So just look at what name the file field has on the HTML form generated by that PHP script, and then do a normal POST (with urllib2 or whatever you use).

CURL + $GLOBALS["HTTP_RAW_POST_DATA"] in PHP

I'm using cURL to upload files to a service.
Currently I'm getting the file content with $GLOBALS["HTTP_RAW_POST_DATA"] and then saving it on my server.
After that, I'm using CURLOPT_POSTFIELDS with the file's full path.
Is there a way to send the file content directly, without saving it on my server, as if I had saved it?
Or is there a way to upload a photo from a Flash app to a Facebook album without saving it on the server?
Thanks
If you are uploading data you might consider using the file upload mechanism in PHP: http://php.net/manual/en/features.file-upload.php It handles file uploads automatically.
If you want to redirect the upload to another (third-party) service without being in the chain of command (i.e. user -> 3rd-party server), you might want to look into AJAX. AFAIK, when you upload a file using PHP/forms, the file will be uploaded to your PHP temp directory, and there is no way to prevent this because:
1. To access the file it needs to be on the server (PHP executes on the server, meaning it cannot execute on the user's side).
2. I do not believe any user would want you to access files on their computer, nor would you be able to (firewall, AV); if that were possible it would be a major security issue.
As I said above, what you want to look into is AJAX (I used jQuery, and its AJAX methods are very simple). Because AJAX is JavaScript executed on the user's machine, it can run there and initiate a connection to any URL. This way you can access the service directly without submitting the file to your server.
Here is an example AJAX upload (you can google for more):
http://valums.com/ajax-upload/
Hope this helps
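As for sending the content on without saving it first: on recent PHP (8.1+), one option might be CURLStringFile, which lets cURL post in-memory data as if it were a file; the endpoint URL, field name and file name below are placeholders:

<?php
// Sketch: forward the raw request body to another service without an intermediate file.
// Requires PHP 8.1+ for CURLStringFile; URL, field name and file name are placeholders.
$data = file_get_contents('php://input');   // modern replacement for $HTTP_RAW_POST_DATA

$ch = curl_init('https://api.example.com/upload');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, [
    'file' => new CURLStringFile($data, 'photo.jpg', 'image/jpeg'),
]);
$response = curl_exec($ch);
curl_close($ch);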
