.bin file upload to web server using cURL is not working - php

I need to upload a file to a web server using cURL. The web app looks like this:
Click here to view a screen capture
This is part of the HTML code (the complete code is about 300 lines; it's a very simple web app on an embedded 3D-printer controller board called Smoothieboard):
<h2> Upload File </h2>
<input type="file" id="files" name="files[]" onchange="upload();">
<h3>Uploading file(s)</h3>
<output id="list"></output>
<div id="progress"></div>
<div id="uploadresult"></div>
<script>
document.getElementById('files').addEventListener('change', handleFileSelect, false);
</script>
When the user clicks the "browse" button, a window pops up to browse the file system and pick a file. My file is called "firmware.bin". Upon selecting the file, the upload begins immediately (there is no "upload" button; the transfer starts right after picking the file). I need to automate this task using cURL. I'm currently doing the following:
curl -i -F files[]=@/home/pi/P18/firmware.bin http://192.168.3.222/upload
The output is:
HTTP/1.0 200 OK
Server: uIP/1.0
Connection: close
Access-Control-Allow-Origin: *
Content-Type: text/plain
OK
However, it doesn't seem to be working. I can access the server through other means, and I can assure you that the usual human-friendly upload procedure works, but what I'm doing with cURL doesn't. Something DOES seem to be going on, since the OK message takes a few seconds to appear, which also happens in the web app. The file seems to be transferred, but I feel that I need to do something more to complete the process.
Something that caught my attention is that, whether I type files[] or potatoes[], the same thing happens.

Without the upload() function it's hard to reproduce the problem, but what you can do is extract the exact request the web app makes with the Chrome developer console:
In the Network tab, check "Preserve log", reload the page, upload the file, then right-click the request / Copy / Copy as cURL.
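For an upload form like this, the copied command generally boils down to a multipart POST with -F. Here is a runnable sketch: the board's real target would be http://192.168.3.222/upload, but the dummy file and the throwaway local receiver below are made up so the command can be tried anywhere.

```shell
# A stand-in receiver that behaves like the board: accept the POST, reply "OK".
cat > /tmp/recv.py <<'EOF'
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # drain the multipart body, then answer like the board does
        self.rfile.read(int(self.headers.get('Content-Length', 0)))
        self.send_response(200)
        self.send_header('Content-Type', 'text/plain')
        self.end_headers()
        self.wfile.write(b'OK')
    def log_message(self, *args):
        pass

# serve exactly one request, then exit
HTTPServer(('127.0.0.1', 8091), Handler).handle_request()
EOF
python3 /tmp/recv.py &
sleep 1

printf 'dummy firmware' > /tmp/firmware.bin
# '@' makes curl read and send the file's CONTENTS as the form field
curl -s -F 'files[]=@/tmp/firmware.bin' http://127.0.0.1:8091/upload
# → OK
```

The @ prefix matters: without it, the field carries the literal path string instead of the file's bytes, and the server would still happily answer OK.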

Related

How can PHP continue to run after the connection is cancelled?

I'm having a problem with a file upload script.
Users upload large files through the site (like WeTransfer).
The upload percentage is shown with Ajax.
When it completes, a notice is shown.
But the problem starts here.
Because the files are huge, it takes time to move them to the appropriate folder and zip them.
If the user closes the browser during this step, the process cannot complete.
How do I ensure that the operation continues even if the user closes the browser?
I tried ignore_user_abort, but I was not successful.
Send a response to the browser saying that you are moving the file, and either queue the work and execute it as a background job or just finish it in your script after the response is sent. This should help: https://stackoverflow.com/a/5997140/4099089
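At the shell level, the queue-plus-background-job idea looks roughly like this (all paths are illustrative; in PHP the worker would typically be started with exec() or by a cron-driven queue runner):

```shell
# The request handler's only job: record the work and return at once.
queue=$(mktemp); log=$(mktemp)
echo '/tmp/uploads/big.bin' >> "$queue"

# A detached worker does the slow part. nohup + & decouple it from the
# request, so a closed browser connection no longer interrupts it.
QUEUE="$queue" LOG="$log" nohup sh -c '
  while IFS= read -r f; do
    # placeholder for the real work: move the file, then zip it
    echo "processed $f" >> "$LOG"
  done < "$QUEUE"
' >/dev/null 2>&1 &

wait   # only so this demo can show the result; a real handler returns here
cat "$log"
# → processed /tmp/uploads/big.bin
```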

Chromecast PHP Buffer MP4

I have a Chromecast and a URL of an mp4 file online. I also have a 2Mbps download connection, which is pathetic and renders direct buffering to the Chromecast too slow. That's what I tried so far:
Through the developer console, I simply set location.href to the online URL of the mp4. The Chromecast would buffer for 20 seconds, play 10 seconds' worth of video, and then buffer again. So, through the console, I paused the video and let it buffer for 5 minutes. When I let it play again, it played for about 15 seconds, and then lost all progress and had to be returned to the home screen.
As I don't want to wait for the whole download of the mp4 to complete, I am currently attempting this: I buffer the mp4 to a local file in my htdocs directory, and I then direct the Chromecast to that file's location. However, when opening the mp4 file through Chrome (the browser), instead of playing, it shows a download prompt, and the Chromecast returns to the home screen.
I have implemented the buffering in PHP, and it looks like this:
$bufferSource = 'http://example.com/path/to/file.mp4';
$bufferedReader = fopen($bufferSource, 'r');
while(!feof($bufferedReader)){
// read a fixed-size chunk (fgets is line-oriented; fread is binary-safe)
$buffer = fread($bufferedReader, 8192);
file_put_contents('buffer.mp4', $buffer, FILE_APPEND);
}
fclose($bufferedReader);
I know that PHP does its work, as I can watch the file size grow on my computer, and I can open the file with VLC. Is there another PHP script I could write to serve the locally buffered mp4 file and simulate 'bufferability', so that Chrome does not show the download dialog but buffers the file, as the Chromecast should?
EDIT: One more thing. I am not directing the Chromecast to the PHP script. I am actually directing it directly to the buffer.mp4 file.
You're missing a Content-Type header in your PHP script.
Figure out what the original content type header is from your server (probably video/mp4) and send it with your proxying script like this:
header('Content-Type: video/mp4');
This will allow the browser to detect the content type and play it directly (if supported), without downloading.
Also, I would consider using a real proxy server, such as Nginx, rather than reinventing the wheel. This will be much easier and more reliable.
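A quick way to confirm what Content-Type a server actually sends is a HEAD request with curl. The sketch below uses a throwaway local Python server and a dummy .mp4 as stand-ins for the real htdocs setup; the same curl -sI check works against any URL.

```shell
# Serve a dummy .mp4 from a throwaway local server (stands in for htdocs).
mkdir -p /tmp/mp4demo
printf 'not a real video' > /tmp/mp4demo/buffer.mp4
python3 -m http.server 8093 --bind 127.0.0.1 --directory /tmp/mp4demo >/dev/null 2>&1 &
srv=$!
sleep 1

# -I sends a HEAD request; the server guesses the type from the .mp4 extension
curl -sI http://127.0.0.1:8093/buffer.mp4 | grep -i '^content-type'
# prints the guessed type, which should include video/mp4
kill "$srv"
```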
You can use a receiver with a Media Element tag and then point its source to your mp4 file on your server. If you don't want to write your own receiver, you can use either the default or the Styled Media Receiver. You would need a very simple sender to send the URL; check out the GitHub repo for examples.

Can PHP (with Apache or Nginx) check HTTP header before POST request finished?

Here is a simple file upload form HTML.
<form enctype="multipart/form-data" action="upload.php" method="POST">
Send this file: <input name="userfile" type="file" />
<input type="submit" value="Send File" />
</form>
And the php file is pretty simple too.
<?php
die();
As you can see, the PHP script does nothing on the server side. But when we upload a big file, the request still takes a long time.
I know my PHP code is executed only after the POST body has been fully received: PHP must prepare the $_POST and $_FILES arrays before the first line of code is parsed.
So my question is: Can PHP (with Apache or Nginx) check HTTP header before POST request finished?
For example, some PHP extensions or Apache modules.
I was told that Python or Node.js can solve this problem; I just want to know whether PHP can.
Thanks.
================ UPDATE 1 ================
My goal is to block unexpected file-upload requests. For example, we generate a unique token in the POST target URL (like http://some.com/upload.php?token=268dfd235ca2d64abd4cee42d61bde48&t=1366552126). On the server side, my code looks like:
<?php
define('MY_SALT', 'mysalt');
if (!isset($_GET['t']) || !isset($_GET['token']) || abs(time()-$_GET['t'])>3600 || md5(MY_SALT.$_GET['t'])!=$_GET['token']) {//token check
die('token incorrect or timeout');
}
//process the file uploaded
/* ... */
The code looks like it makes sense :-P but it cannot save bandwidth as I expected. The reason is that the PHP code runs too late: we cannot check the token before the file upload has finished. If someone uploads a file without a correct token in the URL, my server's network and CPU are still wasted.
Any suggestion is welcome. Thanks a lot.
The answer is always yes because this is Open Source. But first, some background: (I'm only going to talk about nginx, but Apache is almost the same.)
The upload request isn't sent to your PHP backend right away -- nginx buffers the upload body so your PHP app isn't tying up 100MB of RAM waiting for some guy to upload via a 300 baud modem. The downside is that your app doesn't even find out about the upload until it's done or mostly done uploading (depending on client_body_buffer_size).
But you can write a module that hooks into the different "phases" internal to nginx. One of the hooks is called when the headers are done. You can write modules in Lua, but it's still fairly complex. There may be a module that will send the "pre-upload" hook out to your script via HTTP, but that's not great for performance.
It's very likely you won't even need a module. The nginx.conf file can do what you need (e.g. route the request to different scripts based on headers, or return different error codes based on headers). See this page for examples of header checking (especially "WordPress w/ W3 Total Cache using Disk (Enhanced)"): http://kbeezie.com/nginx-configuration-examples/
Read the docs, because some common header-checking needs already have directives of their own (e.g. client_max_body_size will reject a request if the Content-Length header is too big).
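As an untested sketch of the directive-only route (paths, the size limit, and the fastcgi socket are placeholders; a real equivalent of the MD5 token check above would need ngx_http_secure_link_module or Lua, not plain directives):

```nginx
server {
    # Rejects from the headers alone: if Content-Length exceeds this,
    # nginx returns 413 without reading the body.
    client_max_body_size 8m;

    location /upload.php {
        # Cheap pre-check: refuse requests that don't even carry the
        # token/timestamp query arguments.
        if ($arg_token = "") { return 403; }
        if ($arg_t = "")     { return 403; }

        fastcgi_pass unix:/var/run/php-fpm.sock;
        include fastcgi_params;
    }
}
```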
There is no solution at the HTTP level, but it is possible at the TCP level. See the answer I chose in another question:
Break HTTP file uploading from server side by PHP or Apache

can you use curl to post to a local file?

I tried using curl to post to a local file and it fails. Can it be done? My two management systems are on the same server, and it seems unnecessary to traverse the entire internet just to reach a file on the same hard drive.
Using localhost didn't do the trick.
I also tried posting the data to $_SERVER[DOCUMENT_ROOT].'/dir/to/file.php'. It's for an API that is encrypted, so I'm not sure exactly how it works. It's for a billing system I have, and I just realized that it sends data back (an API).
It's simply post data and an XML response. I could write an html form tag and input fields and get the same result, but there isn't really anything else to know.
The main question is: Does curl have the ability to post to a local file or not?
it is post data. it's for an API that is encrypted so i'm not sure exactly how it works
Without further details nobody can tell you what you should do.
But if it's indeed a POST receival script on the local server, then you can send a POST request to it using the URL:
$url = "https://$_SERVER[SERVER_NAME]/path/to/api.php";
And then receive its output from the cURL call.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array("billing" => 1234345));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
curl_close($ch);
So you get a XML or JSON or whatever response.
If they're on the same drive, then use file operations instead:
file_put_contents('/path/to/the/file', $contents);
Using CURL should only be done if you absolutely NEED the http layer to get involved for some reason, or if you're dealing with a remote server. Using HTTP would also mean you need to have the 'target' script be able to handle a file upload plus whatever other data you need to send, and then that script would end up having to do file operations ANYWAYS, so in effect you've gone on a round-the-world flight just so you can move from your living room to the kitchen.
file://localfilespec.ext worked for me. I had two files in the same folder on a Linux box, in a folder that is not served by my webserver, and I used the file:// wrapper to post to file://test.php and it worked great. It's not pretty, but it'll work for dev until I move it to its final resting place.
Does curl have the ability to post to a local file or not?
To curl a local file, you need to set up an HTTP server, as file:// won't work, so:
npm install http-server -g
Then run the HTTP server in the folder where is the file:
$ http-server
See: Using node.js as a simple web server.
Then test the curl request from the command-line to localhost like:
curl http://127.0.0.1:8081/file.html
Then you can do the same in PHP.

How do I use PHP to download a doc file?

I know that there are PHP functions that let a user download a file, BUT I have not seen a single one that lets your PHP script navigate to a page, download a file, and store it in a specific directory.
So here is what I want to do. I have a web host which runs php applications. Then I have a website with a calendar. The calendar has options on the side...
Tools--->export as doc
I want to write a PHP script that EVERY DAY automatically goes to the calendar's Tools options, then downloads the calendar called Team Calendar to the web host, where the script can use it.
For experimental purposes, let's assume the calendar URL is http://webdesign.about.com/od/php/ht
How do I go about this?
Thanks a bunch
EDIT: I tried wget and this is what I got. How can I make it download the file as DOC from Tools?
[/cygdrive/c/documents and settings/omar.khawaja]$ wget http://confluence.com/display/prodsupport/Team+Calendar
--2011-06-02 16:33:43-- http://confluence.com/display/prodsupport/Team+Calendar
Resolving confluence.com... 204.225.248.160
Connecting to confluence.com|204.225.248.160|:80... connected.
HTTP request sent, awaiting response... 302 Moved Temporarily
Location: http://confluence.com/login.action;jsessionid=2F13926CF763FE4F3862FAFC24AB81D7?os_destinati
on=%2Fdisplay%2Fprodsupport%2FTeam%2BCalendar [following]
--2011-06-02 16:33:43-- http://confluence.com/login.action;jsessionid=2F13926CF763FE4F3862FAFC24AB81
D7?os_destination=%2Fdisplay%2Fprodsupport%2FTeam%2BCalendar
Connecting to confluence.com|204.225.248.160|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 7865 (7.7K) [text/html]
Saving to: `login.action;jsessionid=2F13926CF763FE4F3862FAFC24AB81D7#os_destination=%2Fdisplay%2Fprodsupport%2FTeam+Cale
ndar'
100%[==============================================================================>] 7,865 --.-K/s in 0.04s
2011-06-02 16:33:43 (207 KB/s) - `login.action;jsessionid=2F13926CF763FE4F3862FAFC24AB81D7#os_destination=%2Fdisplay%2Fp
rodsupport%2FTeam+Calendar' saved [7865/7865]
You need to use a cron job on the server to do this. Have that cron job call a PHP script that simply saves the doc to a directory on the web server.
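Concretely, the pieces might look like this. Everything below is a guess that must be checked against the actual site: the wget log shows a redirect to login.action, so the script has to authenticate first (on many Confluence installs the login form posts os_username/os_password to dologin.action, and an export URL returns the Word file). A command-level sketch, deliberately left as comments since the host, credentials, and page ID are placeholders:

```shell
# crontab entry: fetch the calendar every day at 06:00
# 0 6 * * * /usr/bin/php /var/www/fetch_calendar.php >> /var/log/calendar.log 2>&1

# What fetch_calendar.php needs to do, expressed as curl commands:
# 1) log in and keep the session cookie
#    curl -c /tmp/cookies.txt \
#         -d 'os_username=USER&os_password=PASS' \
#         http://confluence.com/dologin.action
# 2) download the Word export while authenticated
#    curl -b /tmp/cookies.txt -o /var/www/data/team_calendar.doc \
#         'http://confluence.com/exportword?pageId=PAGE_ID'
```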
