I have an application with a single file input (input type="file").
I need to upload the file to my server and then move it to another server. How can I avoid a timeout?
Also, any good suggestions for an AJAX uploader? Thanks.
Flash uploader: undoubtedly SWFUpload or Uploadify (which is based on the former).
File transfer: use PHP cURL to do an HTTP POST form transfer (see the second example at http://www.php.net/manual/en/function.curl-setopt.php).
Before doing the transfer do the following:
set_time_limit(0);       // disable PHP's execution time limit
ignore_user_abort(true); // keep running even if the user aborts
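For reference, a minimal sketch of the transfer step itself once the user's upload is on disk (the endpoint URL, the form field name, and the local path are placeholders; CURLFile requires PHP 5.5 or later):
<?php
$ch = curl_init('http://example.com/receive.php'); // hypothetical remote endpoint
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('file' => new CURLFile('/path/to/uploaded/file')));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);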
Edit: I don't see a valid reason why you would need a cron job unless the file in question changes over time (which is what syncing really means). If all you want is to copy the file to a remote server, there's no reason you can't do it with plain PHP.
Also, be aware of file sizes. If the file is anything less than 20 MB you should be safe with typical PHP configurations; for larger files, check the upload_max_filesize and post_max_size settings.
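As a quick sanity check, these are the standard php.ini directives that govern uploads:
<?php
// the effective upload ceiling is the smaller of the first two values
echo 'upload_max_filesize: '.ini_get('upload_max_filesize')."\n";
echo 'post_max_size:       '.ini_get('post_max_size')."\n";
echo 'max_execution_time:  '.ini_get('max_execution_time')."\n";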
Edit 2: By the way, under the right conditions (output buffering off and implicit output on), you can show the user the current remote transfer progress. I've done it; it isn't hard. You just need a hidden iframe that receives progress updates and pushes them to the parent window.
It works much like AJAX, but with an iframe in place of XHR (since XHR returns its response in one piece, whereas an iframe renders content as it arrives).
If interested, I can help you out with this, just ask.
Edit3: Dynamic remote upload example/explanation:
To keep things short, I'll assume the file has already been uploaded to your server by the user, but not yet to the target remote server. I'll also assume the user landed on handle.php after uploading the file.
handle.php would look like:
<?php
// This current script is only cosmetic - though you might want to
// handle the user upload here (as I did)
$name = 'userfile'; // name of the file input field - YOU MUST CHANGE THIS
$new_name = time().'.'.pathinfo($_FILES[$name]['name'], PATHINFO_EXTENSION); // the (temporary) filename
move_uploaded_file($_FILES[$name]['tmp_name'], 'uploads/'.$new_name);
$url = 'remote.php?file='.urlencode($new_name); ?>
<iframe src="<?php echo $url; ?>" width="1" height="1" frameborder="0" scrolling="no"></iframe>
<div id="progress">0%</div>
<script type="text/javascript">
function progress(percent){
document.getElementById('progress').innerHTML = percent+'%';
}
</script>
Doesn't look difficult so far, no?
The next part is a little more complex. The file remote.php would look like:
<?php
set_time_limit(0);       // PHP won't time out
// if you want the user to be able to cancel the upload, simply comment out the following line
ignore_user_abort(true); // PHP won't quit if the user aborts
// to make this system work, we need to tweak output buffering
while(ob_get_level()) ob_end_clean(); // remove all output buffers
ob_implicit_flush(true); // ensure everything we output is sent to the browser immediately
function progress($percent){
// since we're in an iframe, we need "parent" to be able to call the js
// function "progress" which we defined in the other file.
echo '<script type="text/javascript">parent.progress('.$percent.');</script>';
}
function curlPostFile($url, $file = null, $onprogress = null){
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    if(substr($url, 0, 8) == 'https://'){
        curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_ANY);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); // skip certificate checks (only for hosts you trust)
    }
    if($onprogress){
        curl_setopt($ch, CURLOPT_NOPROGRESS, false);
        curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, $onprogress);
    }
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; PHP cURL)');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 50);
    if($file){
        // send the file as a multipart/form-data POST field named "file" (CURLFile requires PHP 5.5+)
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, array('file' => new CURLFile($file)));
    }
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
$file = 'uploads/'.basename($_REQUEST['file']);
// note: as of PHP 5.5 the progress callback also receives the cURL handle as its first argument
function onprogress($ch, $download_size, $downloaded, $upload_size, $uploaded){
    if($upload_size > 0){ // avoid division by zero before the total size is known
        progress(round($uploaded / $upload_size * 100)); // call our progress function
    }
}
curlPostFile('http://someremoteserver.com/handle-uploads.php', $file, 'onprogress');
progress(100); // finished!
Use e.g. scp or rsync to transfer the file to another server. Do that with a cron job every couple of minutes, not from your PHP script - that will prevent any timeouts occurring if the server-to-server transfer takes too long.
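For illustration, a minimal PHP script that such a cron job could run (the paths and remote host are placeholders, and key-based SSH authentication is assumed to be set up):
<?php
// sync.php - run from cron, e.g.: */5 * * * * php /var/www/sync.php
$local  = '/var/www/uploads/';                       // directory the web app uploads into
$remote = 'user@remote.example.com:/data/uploads/';  // hypothetical target
// rsync only copies what has changed, so re-running it is cheap
$cmd = 'rsync -az '.escapeshellarg($local).' '.escapeshellarg($remote);
exec($cmd, $output, $status);
if($status !== 0){
    error_log('rsync failed with exit code '.$status);
}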
I have this code and it works, but it saves the page "too fast": in the middle of the saved HTML file I just see a loading image. How can I delay the script, or make it wait on the page a little longer, so that it only saves the file once everything on the page has loaded?
<?php
$file = fopen("brawl2.html", "w");
$c = curl_init();
curl_setopt($c, CURLOPT_URL, "https://brawlstats.com/club/8LG08L");
curl_setopt($c, CURLOPT_FILE, $file); // write the response body straight to the file handle
curl_exec($c);
curl_close($c);
fclose($file);
?>
Thanks for the help!
Curl is not emulating a browser, it is just downloading a single file from the server, so it will never load these images.
In HTTP, a user agent (normally a browser, but in this case the curl library) sends a request for a particular resource (URL); then the server does whatever it needs to do, and then returns a response; and then you're done.
In your case, the server is responding with an HTML page that contains some JavaScript. When loaded by a browser, this JavaScript will run, and load the images; but curl is not a browser, so will not run this JavaScript.
There are libraries that do emulate a browser, which would be able to run this; they are referred to as "headless browsers", and a quick search turned up this attempt at a comprehensive list.
It's also worth remembering that even once the JavaScript is run, the images are probably not part of the HTML, but references to other files. If you don't save those, your saved HTML won't show any images if you unplug your internet, so you may also need to think about how to archive all the resources needed to display the page, not just the page itself.
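As a rough illustration of the headless-browser route, one option is to drive headless Chrome from PHP; this assumes the Chrome binary is installed on the server and callable as google-chrome (the flags are Chrome's own headless options, not part of PHP):
<?php
// let headless Chrome load the page, run its JavaScript, and hand back the resulting DOM
$url = 'https://brawlstats.com/club/8LG08L';
$cmd = 'google-chrome --headless --disable-gpu --dump-dom '.escapeshellarg($url);
$html = shell_exec($cmd);
if($html){
    file_put_contents('brawl2.html', $html);
}
Even then, the images are still separate resources referenced by the page, as noted above.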
I have a networked camera that generates a video snapshot upon hitting http://192.0.0.8/image/jpeg.cgi. The problem is that by accessing the root (i.e. 192.0.0.8) directly, users can access a live video feed, so I hope to hide the address altogether.
My proposed solution is to use PHP to retrieve the image and display it at http://intranet/monitor/view.php. Although users could create motion by hitting this new address repeatedly, I see that as unlikely.
I have tried using include() and readfile() in various ways, but do not really use PHP often enough to understand if I'm going in the right direction. My best attempt to date resulted in outputting the jpeg contents, but I did not save the code long enough to share with you.
Any advice would be appreciated.
If you want to limit requests per user, use something like this:
session_start();
$timelimit = 30; // limit in seconds
if(!isset($_SESSION['last_request_time'])) {
    $_SESSION['last_request_time'] = 0;
}
if(time() > $_SESSION['last_request_time'] + $timelimit) {
    $_SESSION['last_request_time'] = time(); // remember when we last served a fresh image
    //prepare and serve a new image
} else {
    //serve an old image
}
If you want to limit how often the image itself is refreshed (rather than per user), use the same logic but store last_request_time somewhere shared by all users (a database, a file, or a cache).
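A minimal sketch of that shared variant, assuming a writable cache file and the camera URL from the question:
<?php
$timelimit  = 30;                         // refresh the snapshot at most every 30 seconds
$cache_file = '/tmp/camera_snapshot.jpg'; // hypothetical shared cache location
$source_url = 'http://192.0.0.8/image/jpeg.cgi';

// refresh the cached snapshot only when it is older than the limit
if(!file_exists($cache_file) || time() - filemtime($cache_file) > $timelimit){
    $image = file_get_contents($source_url);
    if($image !== false){
        file_put_contents($cache_file, $image);
    }
}

header('Content-Type: image/jpeg');
readfile($cache_file);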
A succinct way to do this is as follows:
header('Content-Type: image/jpeg');
readfile('http://192.0.0.8/image/jpeg.cgi');
The content of the JPEG is then streamed back to the browser as a file, directly from http://intranet/monitor/view.php (note that reading a remote URL this way requires allow_url_fopen to be enabled).
I've got a large form where the user is allowed to input many different fields, and when they're done I need to send the contents of the form to the server, process it, and then spit out a .txt file containing the results of the processing for them to download. Now, I'm all set except for the download part. Setting the headers on the response to the jQuery .post() doesn't seem to work. Is there any other way than doing some sort of iframe trick to make this work (a la JavaScript/jQuery to download file via POST with JSON data)?
Again, I'm sending data to the server, processing it, and then would like to just echo out the result with headers to prompt a download dialog. I don't want to write the result to disk, offer that for download, and then delete the file from the server.
Don't use AJAX. There is no cross-browser way to force the browser to show a save-as dialog from JavaScript for an arbitrary blob of data received from the server via AJAX. If you want the browser to interpret the result of an HTTP POST request (in this case, by offering a download dialog), don't issue the request via AJAX.
If you need to perform some kind of validation via AJAX, you'll have to do a two-step process: the validation occurs via AJAX, and then the download is started by redirecting the browser to the URL where the .txt file can be found.
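For illustration, building on the "don't use AJAX" advice: a processing script that the form posts to directly can emit the result with download headers and never write anything to disk (the form field name data and the processing step here are placeholders):
<?php
// process.php - the <form> posts here directly; the response itself is the download
$input  = isset($_POST['data']) ? $_POST['data'] : '';
$result = strtoupper($input); // stand-in for the real processing step

header('Content-Type: text/plain');
header('Content-Disposition: attachment; filename="results.txt"');
header('Content-Length: '.strlen($result));
echo $result;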
Found this thread while struggling with a similar issue. Here's the workaround I ended up using:
$.post('genFile.php', {data : data}, function(url) {
$("body").append("<iframe src='download.php?url="+url+"' style='display: none;'></iframe>");
});
genFile.php creates the file in a staging location, using a randomly generated string for the filename.
download.php reads the generated file, sets the MIME type and content disposition (so the prompt shows a predefined name instead of the random string used for the actual filename), returns the file content, and cleans up by deleting the source file.
[edit] might as well share the PHP code...
download.php:
<?php
$fname = "/tmp/".$_GET['url'];
header('Content-Type: text/xml');
header('Content-Disposition: attachment; filename="plan.xml"');
echo file_get_contents($fname);
unlink ($fname);
?>
genFile.php:
<?php
$length = 12;
$chars = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
$str = substr( str_shuffle( $chars ), 0, $length ).'.xml';
$fh = fopen('/tmp/'.$str, 'w') or die("can't open file"); // same staging directory download.php reads from
fwrite($fh,$_POST["data"]);
fclose($fh);
echo $str;
?>
Rather than using jQuery's .post(), you should just do a normal POST by submitting the form, and have the server respond with appropriate Content-Disposition and MIME-type headers. You can't trigger a download through .post() because jQuery encapsulates the returned data.
One thing I see in use rather frequently, though, is this:
$.post('generateFile.php', function(data) {
// generateFile builds data and stores it in a
// temporary location on the server, and returns
// the URL to the requester.
// For example, http://mysite.com/getFile.php?id=12345
// Open a new window to the returned URL which
// should prompt a download, assuming the server
// is sending the correct headers:
window.open(data);
});
Basic Idea: I have a flash file that takes screenshots with a click of a button, sending the data to a PHP file, and then the user gets to save a PNG image. The images that are merged together (via PHP) require that they reside on the same server as the PHP, otherwise they do not merge and the final PNG shows up blank.
My solution so far: I have two PHP files, and I just need to find a way to merge them: the screenshot one, and one that copies a file from one server to another. This is my cheat workaround to bring the image onto the same server, and THEN run the screenshot PHP.
The Server-to-Server PHP Code:
<?php
$inputfile = fopen("https://www.google.com/intl/en_com/images/srpr/logo3w.png", "r");
$outputfile = fopen("transferedfile.gif", "w");
echo "File opened...";
$data = '';
while (!feof($inputfile)) {
    $data .= fread($inputfile, 8192);
}
echo "Data read...";
fwrite($outputfile, $data);
echo "Transferred data...";
fclose($inputfile);
fclose($outputfile);
echo "Done.";
?>
So as you can see, it pulls Google's logo and saves it as "transferedfile.gif" in the directory the PHP file resides in. I can get this PHP code to work by saving it as whateverIWant.php on my web server and visiting it directly, but in place of Google's logo (in this example) I need a value that changes dynamically via Flash.
So basically, in the Flash file I'll have a dynamic variable holding the URL, which will change. Let's say I define that variable in Flash as var imageToGet; somehow I need to pass that variable into this PHP. That's one step... here's the AS 2.0 code:
My Actionscript (2.0) Code:
button.onRelease = function ():Void {
sendImageToServer();
ScreenShot.save(_root, "screenshot.png", 0, 0, 100, 140);
};
The sendImageToServer() function isn't written yet. This is where I'm stuck. I need sendImageToServer() to send imageToGet so the server knows which image to fetch, and THEN run ScreenShot.save() once the transfer is done (i.e. once fclose($outputfile) has completed).
In summary: a movie clip on the stage will have a dynamic image loaded into it; when a button is pressed, that dynamic image needs to be copied to the local server, and then the screenshot function runs. I believe once I have this figured out I'll be able to do everything else, such as saving under a unique name, saving multiple files, etc. I just need a push in the right direction :)
Thanks so much everyone @ StackOverflow. You've been nothing but awesome to me thus far!
EDIT -- I've found a good starting point!!
I found a good starting point, and am answering my own question in case someone else stumbles upon this. I used these two codes as a starting point, and I think I'm on the right track…
In Flash: I simply made a dynamic textbox with the instance name of traceText
In Actionscript (2.0):
var send:LoadVars = new LoadVars;
var receive:LoadVars = new LoadVars;
send.toPHP = "asd123";
receive.onLoad = function(){
encrypted = this.toFlash;
traceText.text = encrypted;
}
send.sendAndLoad("test.php",receive,"POST");
In "test.php" file:
<?php
$fromFlash = $_POST['toPHP'];
$encrypted = $fromFlash;
$toFlash  = "&toFlash=";
$toFlash .= $encrypted;
echo $toFlash;
?>
What this does is send the variable to PHP and back again, which is perfect for what I needed. For now, I should be good! Hope this helps anyone who needs it.
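To connect this back to the original goal, here is a hedged sketch of a hypothetical copyImage.php that could receive imageToGet via the same LoadVars POST, copy the remote image onto the local server, and report the saved filename back to Flash (all names here are illustrative, not from the original post):
<?php
// copyImage.php - receives &imageToGet=... from LoadVars.sendAndLoad()
$imageToGet = isset($_POST['imageToGet']) ? $_POST['imageToGet'] : '';
if($imageToGet === ''){ echo '&status=error'; exit; }
$localName = time().'_'.basename(parse_url($imageToGet, PHP_URL_PATH)); // save next to this script

$data = @file_get_contents($imageToGet); // fetch the remote image (requires allow_url_fopen)
if($data !== false && file_put_contents($localName, $data) !== false){
    // Flash reads these as variables in receive.onLoad; ScreenShot.save() can then run
    echo '&savedAs='.rawurlencode($localName).'&status=ok';
} else {
    echo '&status=error';
}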
I'm currently looking into a way of showing the file download status on a page.
I know this isn't strictly needed, since the browser usually shows its own download status, but I would like to keep the user on the page they are downloading from for as long as the download lasts. To do that, the displayed status should match the file's actual progress (not a fake progress bar). Ideally it would also display the user's current download speed and estimate the remaining time based on that rate.
Can this be done using PHP and JavaScript, or does it really require Flash or Java?
Shouldn't the server have information somewhere about who is downloading what, at what speed, and how much has been transferred?
Thank you for your help in advance.
Not really possible cross-browser, but have a look into http://markmail.org/message/kmrpk7w3h56tidxs#query:jquery%20ajax%20download%20progress+page:1+mid:kmrpk7w3h56tidxs+state:results for a pretty close effort. IE (as usual) is the main culprit for not playing ball.
You can do it with two separate PHP files; the first one handles the download process.
For example:
<?php
$strtTime = time();
$download_rate = 120; // download rate, in KB per second
$totalDw = 0;         // bytes sent so far
$fp = fopen($real, "r"); // $real holds the path of the file being served
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.basename($real).'"');
header('Content-Length: '.filesize($real));
flush(); // flush headers
while (!feof($fp)) {
    $chunk = fread($fp, round($download_rate * 1024)); // up to $download_rate KB per iteration
    echo $chunk;
    ob_flush();
    flush();
    if (connection_aborted()) {
        // unlink("yourtempFile.txt");
        exit;
    }
    $totalDw += strlen($chunk); // count what was actually sent
    // file_put_contents("yourtempFile.txt", "downloaded: $totalDw ; StartTime: $strtTime");
    sleep(1);
}
fclose($fp);
// unlink("yourtempFile.txt");
A second file would then be polled continuously via Ajax to read yourtempFile.txt. Sessions and cookies can't be used for this, because output has already started in the download script.
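For illustration, that second file might look roughly like this (the temp-file name matches the commented-out lines above; the JSON shape is just an assumption for the Ajax side):
<?php
// progress.php - polled by the page via Ajax while the download runs
$line = @file_get_contents('yourtempFile.txt'); // e.g. "downloaded: 245760 ; StartTime: 1690000000"
$downloaded = 0;
$started    = time();
if($line !== false && preg_match('/downloaded:\s*(\d+)\s*;\s*StartTime:\s*(\d+)/', $line, $m)){
    $downloaded = (int)$m[1];
    $started    = (int)$m[2];
}
$elapsed = max(1, time() - $started);
header('Content-Type: application/json');
echo json_encode(array(
    'downloaded'    => $downloaded,                  // bytes sent so far
    'bytes_per_sec' => round($downloaded / $elapsed) // average speed since the download started
));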