I have written code to receive images from an iPhone on a PHP server; I need to resize these images and move them into 4 folders.
Only then is the JSON response sent back to the iPhone, and this takes too much time.
Requirement:
I want to move the file to the folder "folder1" and then return the JSON response.
The resizing should then be done from "folder1", after the JSON response has been sent.
How can I run this resizing process in the background?
Here is my code:
http://pastebin.com/qAcT1yi9
You could always run your PHP script in the background with a Linux command.
Example:
// using backticks to execute the Linux command, but there are other
// alternatives such as exec() or shell_exec(); redirect output and background
// the process with &, otherwise the backticks wait for the script to finish
$cmd = `php runScriptInBackground.php > /dev/null 2>&1 &`;
echo 'resize job started';
First send/upload the images and send a response back, without doing the resize operation.
Then, if the upload was successful, let the browser issue another request and do the resizing. When this succeeds, send the message ‘resizing successful’ back.
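A minimal sketch of that two-request flow; the script names upload.php and resize.php, the form field name, and the folder layout are assumptions, not the poster's actual code:

<?php
// upload.php (name assumed): store the file in folder1 and answer right away,
// leaving the resize work for a second request
header('Content-Type: application/json');

if (isset($_FILES['image'])) {
    $target = __DIR__ . '/folder1/' . basename($_FILES['image']['name']);
    if (move_uploaded_file($_FILES['image']['tmp_name'], $target)) {
        echo json_encode(array('status' => 'uploaded', 'file' => basename($target)));
        exit;
    }
}
echo json_encode(array('status' => 'error'));

// A separate resize.php (also assumed) would then be requested by the client,
// read the file from folder1, write the resized copies into the other folders,
// and finally reply with {"status": "resizing successful"}.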
A common solution to this problem is to show a loading/processing message when a specific event occurs. While that message is displayed, the action continues to run in the background, and the result page is finally shown when it is done.
Although the user must wait, I prefer this over displaying a result message when the actual result is not yet known. Unfortunately I'm not sure how this is done in iPhone development.
If you're building in Objective-C, you could just make a copy of the image, resize it on the device, and send the resized image to your PHP script. You could then display a spinner until the JSON result comes back, and if there is an error, the user will still have the resized image to try again with... Another thought I had was to use push notifications. I don't know what that code would look like, but it's something to consider.
You need some asynchronous JavaScript, or an iframe in your page, posting the image to your server and providing feedback to the user.
This means that the 'main' page would not change, but some visual information can be provided to the user.
You can display an animated GIF loader or use JS setInterval to give the user the feeling that things are moving forward while waiting for the server to respond.
If the processing is split into more than one part, after each step the server could respond with an HTML page and a redirect: this would even work in an iframe without JS.
Each 'page' would perform one more step. But if the user closes the browser before everything is done, you would end up with an unfinished task.
A DB, real background processing, and client side JS polling are a more robust alternative.
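A very rough sketch of that DB-plus-polling idea; the jobs table, its column names and the variable names below are invented for illustration:

<?php
// In the upload request: queue the work and answer immediately.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO jobs (file, status) VALUES (?, 'pending')");
$stmt->execute(array($storedFile));   // $storedFile: path saved earlier (assumed)
echo json_encode(array('status' => 'queued', 'job' => $pdo->lastInsertId()));

// A worker script, started by cron or kept running in the background, then
// SELECTs rows with status = 'pending', does the image processing, and
// UPDATEs them to status = 'done'.

// A small status.php endpoint reads the status for a given job id and returns
// it as JSON, so client-side JS can poll it until the job is finished.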
A full answer would be quite long and would require many more details about your setup (Apache with CGI PHP or mod_php? Are you using an MVC model or framework, or are you writing a page-oriented website?).
If I had to write a full answer I would forget PHP and use Python and Celery http://celeryproject.org/ ;-)
PS.
I just found out that a few related questions already existed:
PHP Background Processes
Asynchronous shell exec in PHP
You can really do it in two steps: first send the files and save them on the server, then generate the necessary parts later, when the user requests them.
That way you shift the cost from the upload to the first request that needs the generated parts.
Just as a log file is written by a PHP script via fwrite($fp, ---HTML---),
I need to save an HTML div as a PNG file on the server.
The client browser only starts the PHP script,
but the PNG file should be saved on the server without any client interaction.
Is there a way to do this?
All the posts (over a thousand) I have been reading are about html2canvas,
which (as I understand it) operates client-side.
I know that rendering HTML (the div) is normally done by the browser, i.e. client-side.
But is there a way to do it in PHP on the server side?
Reason:
Until now the procedure is:
print the div via the browser on paper, twice:
one copy for the customer,
one to scan in again, save it on the server as a picture, and throw it in the wastepaper basket.
This happens more than 500 times a day...
For security reasons it needs to be saved as a picture on the server.
My project is an image processing script, using php, JavaScript and imagick (or imagemagick).
Currently, a user can change the properties of an image in the browser; JavaScript then sends an Ajax call to my PHP script, which processes the changes, resaves the image, and sends the file path and response back to the browser so JavaScript can refresh the img tag.
I'm looking to make this process faster if possible.
Ideally, the processing PHP script would output the raw image data with the appropriate MIME header straight after it has processed the changes, but this can't be done because the same file needs to send a JSON response.
Any views and suggestions are welcome.
EDIT: I should have mentioned what I have tried so far:
Because of the wide variety of operations available to alter the image, telling my PHP script what to alter via a URL query string like <img src='image.php?id=132&layer1=flip' /> would often exceed the recommended maximum URL length. Otherwise this would have been ideal.
I have also tried sending the base64 raw data back and processing it, and although I haven't completely ruled it out, it has its drawbacks: adding base64 data to the src of an <img> is not natively supported in all browsers.
I don't know if this is the best way, but think about this:
You have to display your image with an <img src="">. Currently you do the following:
User clicks a button -> AJAX request to the server -> AJAX response with the URL to the browser -> change the src="" of the image and display it.
Replace it with the following:
User clicks a button -> change the src="" of the image to the PHP file that performs the manipulation, which displays it when ready.
Here is some code to explain:
<img src="image.php?picid=123123" id="#image"><button id="#rotate90">rotate</button>
<script>
$("#rotate90").click(function(){
$("#image").attr("src","image.php?picid='123123'&do=rotate&what=90");
}
</script>
So via picid you tell your PHP file which picture you mean, do says which function you want to call, and what is, in this example, the number of degrees to rotate. Your PHP file has to return a picture with the correct headers (e.g. header('Content-Type: image/jpeg');), and the browser will keep loading the image until the function finishes.
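A minimal image.php along those lines could look like this; the storage path, the parameter handling and the use of Imagick are assumptions:

<?php
// image.php (sketch): load the picture, apply the requested operation,
// resave it, and output it with an image header so it can be used as an <img> src
$id   = preg_replace('/\D/', '', $_GET['picid']);   // crude sanitising
$path = __DIR__ . '/images/' . $id . '.jpg';        // storage layout is assumed

$img = new Imagick($path);

if (isset($_GET['do']) && $_GET['do'] === 'rotate') {
    $img->rotateImage(new ImagickPixel('white'), (int) $_GET['what']);
    $img->writeImage($path);                        // resave the changed image
}

header('Content-Type: image/jpeg');
echo $img->getImageBlob();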
You can include the raw image data as part of your JSON response, and then interpret that raw data accordingly.
I am quite sure this will not lead to a speedup: you would need to encode the image data, attach it to the JSON, decode it on the client, and then draw it. In addition, encoding the image data into JSON would most likely result in a much bigger volume of data going over the wire, negating any speedup even if there were one.
There is a funny little trick though, that can shave a bit more than a round trip off your latency:
Start your AJAX call to generate the image
Immediately (without waiting for the result) start your image refresh to a PHP script
In this PHP script, wait for the image generation to finish, and then immediately send it (Sort of long poll for an image)
This way you save everything that would otherwise happen between the moment the image is calculated and the new image request arriving on the server:
the result JSON being assembled
return phase of HTTP processing
Network latency downstream
Processing time on client
Network latency upstream for new image request
HTTP processing time for new image request
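A sketch of that "long poll for an image" script; how the generator signals completion (here a ready-flag file next to the image) is purely an assumption:

<?php
// waitimage.php (sketch): block until the freshly generated image is ready,
// then stream it, so the <img> can already point here while generation runs
$id    = preg_replace('/\D/', '', $_GET['id']);
$image = __DIR__ . '/cache/' . $id . '.jpg';    // path layout is assumed
$flag  = $image . '.done';                      // written by the generator when finished

$waited = 0;
while (!file_exists($flag) && $waited < 30) {   // give up after roughly 30 seconds
    usleep(250000);                             // check every 0.25 s
    clearstatcache();
    $waited += 0.25;
}

if (file_exists($image)) {
    header('Content-Type: image/jpeg');
    readfile($image);
} else {
    header('HTTP/1.1 504 Gateway Timeout');
}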
Hi,
I download a large number of files for data mining. I used to use PHP for this purpose but I am finding it too slow. Also, I only want a small part of each web page. I want to achieve two things:
Curl should be able to utilize all my download bandwidth
Is there any way to download only the part of the web page where my data resides?
I am not confined to PHP. If curl works better in terminal I would use that.
Yes, you can download only a part of the page by using the CURLOPT_RANGE option, and you can also provide a write callback function that simply returns an error when you've received "enough" data and you want to stop and move on.
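For example, something along these lines; the URL and the 8 KB cut-off are arbitrary, and CURLOPT_RANGE only helps if the server honours Range requests:

<?php
// Fetch only the beginning of a page with cURL.
$ch = curl_init('http://example.com/page.html');

// Ask the server for the first 8 KB only (needs Range support on the server).
curl_setopt($ch, CURLOPT_RANGE, '0-8191');

// Safety net: abort the transfer ourselves once we have "enough" data.
$buffer = '';
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) use (&$buffer) {
    $buffer .= $data;
    if (strlen($buffer) >= 8192) {
        return -1;               // returning a wrong length makes cURL abort
    }
    return strlen($data);        // tell cURL this chunk was handled
});

curl_exec($ch);                  // reports an error if we aborted on purpose
curl_close($ch);

echo substr($buffer, 0, 8192);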
Are you downloading HTML? Your comment leads me to believe that you are. If that's the case, simply load up the HTML with Simple HTML DOM and get only the part that you want. Although, I find it hard to believe that grabbing just the HTML is slowing you down. Are you downloading any files or media as well?
Link : http://simplehtmldom.sourceforge.net/
There is no way to download only part of a page. When you request a URL, the server response is what it is.
Utilize more of your bandwidth by using cURL's ability to make multiple connections at once.
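A minimal curl_multi sketch that fetches several pages in parallel; the URL list is just an example:

<?php
// Download several pages at once with the curl_multi API.
$urls = array('http://example.com/a', 'http://example.com/b', 'http://example.com/c');

$mh      = curl_multi_init();
$handles = array();

foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Run all transfers until every handle has finished.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);      // wait a little instead of busy-looping
} while ($running > 0);

$pages = array();
foreach ($handles as $i => $ch) {
    $pages[$i] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);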
I have to make an image of a dynamic page, i.e. a page that keeps changing every 5 minutes.
I want to make images of that page as it changes, so that I can keep its records saved in the form of images.
How can I do that using PHP?
I have no idea how to approach this, so a little elaboration in answers will be highly appreciated!
Two steps:
1: Create a script that captures the current data in image form.
If you provide more information about what you mean when you say "create an image of dynamic data", I can probably point you to some resources you can use. For now, just have a look at the GD library.
2: Set up a job that runs the script every 5 minutes
This can be done via Cron. I would suggest investigating if you can run the script when the data changes, instead of at specific intervals.
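For example, a crontab entry along these lines would run an assumed capture script every five minutes:

*/5 * * * * /usr/bin/php /path/to/capture_page.php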
http://www.devarticles.com/c/a/PHP/Generating-Images-on-the-Fly-With-PHP/
http://www.thesitewizard.com/php/create-image.shtml
Getting a screenshot of a web page isn't an easy task.
You can choose one of the online services that do that for you and you can download the images from there.
Otherwise, I have found a solution using WebKit and Python, but you will need full access to your Linux server in order to install the necessary packages; then you will be able to call that script from PHP and get your screenshots.
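For instance, if a WebKit-based command line renderer such as wkhtmltoimage is installed (that specific tool is only one option, and not necessarily what the Python solution above uses), PHP can simply shell out to it:

<?php
// Sketch: render a URL to a PNG with an external WebKit-based tool.
// wkhtmltoimage is assumed to be installed and on the PATH.
$url = 'http://example.com/page-to-capture';
$out = '/var/www/screenshots/capture-' . date('YmdHis') . '.png';

shell_exec('wkhtmltoimage ' . escapeshellarg($url) . ' ' . escapeshellarg($out));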
I'm sure this has been asked before, but as I can't seem to find a good answer, here I am, asking... again. :)
Is there any way, using only a mixture of HTML, JavaScript/AJAX, and PHP, to report the actual progress of a file upload?
In reply to anyone suggesting SWFUpload or similar:
I know all about it. Been down that road. I'm looking for a 100% pure solution (and yes, I know I probably won't get it).
Monitoring your file uploads with PHP/Javascript requires the PECL extension:
uploadprogress
A good example of the code needed to display the progress to your users is:
Uber Uploader
If I'm not mistaken it uses jQuery to communicate with PHP.
You could also write it yourself; it's not that complex.
Add a hidden element as the first element of upload form, named UPLOAD_IDENTIFIER.
Poll a PHP script that calls uploadprogress_get_info( UPLOAD_IDENTIFIER )
It returns an array containing the following:
time_start - The time that the upload began (unix timestamp),
time_last - The time that the progress info was last updated,
speed_average - Average speed in bytes per second,
speed_last - Last measured speed in bytes per second,
bytes_uploaded - Number of bytes uploaded so far,
bytes_total - The value of the Content-Length header sent by the browser,
files_uploaded - Number of files uploaded so far,
est_sec - Estimated number of seconds remaining.
Let PHP return the info to Javascript and you should have plenty of information.
Depending on the audience, you will likely not use all the info available.
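A rough sketch of the two pieces described above; the identifier value, the script names and the JS polling loop are left as assumptions:

<form enctype="multipart/form-data" method="post" action="upload.php">
    <!-- the identifier must be the first field in the form -->
    <input type="hidden" name="UPLOAD_IDENTIFIER" value="someUniqueId123">
    <input type="file" name="userfile">
    <input type="submit" value="Upload">
</form>

<?php
// progress.php (sketch): polled by JavaScript while the upload is running
header('Content-Type: application/json');
$info = uploadprogress_get_info($_GET['id']);   // same value as UPLOAD_IDENTIFIER

if ($info) {
    echo json_encode(array(
        'percent' => round($info['bytes_uploaded'] / $info['bytes_total'] * 100),
        'speed'   => $info['speed_average'],
        'eta'     => $info['est_sec'],
    ));
} else {
    echo json_encode(array('percent' => 0));    // not started yet, or already done
}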
If you have APC installed (and by this point, you really should; it'll be standard in PHP6), it has an option to enable upload tracking.
There's some documentation, and Rasmus has written a code sample that uses YUI.
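As far as I understand the APC approach, it looks roughly like this (the key value is made up, and apc.rfc1867 has to be enabled in php.ini):

<!-- the hidden APC_UPLOAD_PROGRESS field must come before the file field -->
<form enctype="multipart/form-data" method="post" action="upload.php">
    <input type="hidden" name="APC_UPLOAD_PROGRESS" value="myUploadKey">
    <input type="file" name="userfile">
</form>

<?php
// progress script polled via AJAX (sketch)
header('Content-Type: application/json');
$status = apc_fetch('upload_myUploadKey');      // APC prefixes the key with "upload_"
echo json_encode($status ? $status : array());  // contains e.g. 'current' and 'total'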
If you're able to add PECL packages into your PHP, there is the uploadprogress package.
The simplest way would be to just use swfupload, though.
Is there any way, using only a mixture of HTML, JavaScript/AJAX, and PHP, to report the actual progress of a file upload?
I don't know of any way to monitor plain HTML (multipart/form-data) file uploads in webserver-loaded PHP.
You need to have access to the progress of the multipart/form-data parser as the data comes in, but this looks impossible because the ways of accessing the HTTP request body from PHP ($HTTP_RAW_POST_DATA and php://input) are documented as being “not available with enctype="multipart/form-data"”.
You could do a script-assisted file upload in Firefox using an upload field's FileList to grab the contents of a file to submit in a segmented or non-multipart way. Still a bunch of work to parse though.
(You could even run a PHP script as a standalone server on another port just for receiving file uploads, using your own HTTP-handling code. But that's a huge amount of work for relatively little gain.)
I'd recommend you give FancyUpload a try; it's a really cool solution for a progress bar and it's not necessarily tied to PHP. Also check out the other tools at digitarald.de
cheers
IMHO, this is a problem that web browsers should solve. We have progress meters for downloads, so why not for uploads as well?
Take a look at this for example:
http://www.fireuploader.com/