Performance: loop with AJAX calls vs. a single request with a PHP loop

I wrote a PHP script that currently loops through data, validating each record and importing it into a database. This can take some time, depending on the amount of data to run through.
My next step is to make this "user-friendly" so others can use it, not just me. One of the components I want to implement is a progress bar, so the user knows the script isn't stuck but is actually doing something.
Now the question comes up: should I loop through the data in JavaScript, validating and importing the records one by one via AJAX calls to my PHP script? That would also make it easy to update the progress bar after each iteration.
Or is it better, performance-wise, to make a single request to the PHP script, which then loops through the data? In that case, how could I implement an updating progress bar?
That's a bunch of questions, I know. The overall purpose is to see what my options are and what arguments speak for each approach. Thanks in advance!

Well, if you want a progress bar, then you will need to do it one by one: count all the records first, and every time one call reports back as completed, increment the progress bar by a certain percentage.
100 / number of records = percentage to increase the progress bar by per record.
However, the issue with that approach is that if it fails somewhere along the line, you will need to know where, or devise a way to make sure duplicates are not added to the database (if that's even an issue).
It would be better to add the data all at once: faster and cleaner. But it would remove the possibility of having a progress bar that actually gives a correct progress report.
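For the one-by-one route, a minimal sketch of what the per-record endpoint might look like; validate_record() and import_record() are placeholders standing in for the asker's existing logic, not code from the question:

<?php
// Hedged sketch of a one-record-per-request import endpoint.
// validate_record() and import_record() are placeholders for the
// asker's existing validation and import logic.
header('Content-Type: application/json');

function validate_record(array $r): bool {
    return isset($r['name']); // placeholder validation rule
}

function import_record(array $r): void {
    // INSERT INTO ... (the real import logic goes here)
}

$record = json_decode(file_get_contents('php://input'), true);

if (is_array($record) && validate_record($record)) {
    import_record($record);
    echo json_encode(['ok' => true]);  // client bumps the bar by 100/N
} else {
    echo json_encode(['ok' => false]); // client can note which row failed
}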

Related

Creating pagination with ajax results

In all the time I've spent developing, one thing I've never really understood is the proper way to create "pagination" with AJAX search results.
So, I'm returning 40 results and I want to be able to paginate them by 10 at a time ... is it a matter of spitting them out on the page, adding some CSS classes, and hiding/showing each group of 10 at a time?
Can someone point me in the direction of some "from scratch" pagination?
Far and away the easiest way to do it is to use a canned script. I recommend DataTables, which will do all the paging for you, or, if you choose, can do it via individual AJAX calls to PHP. At its simplest, you'd spit out a valid table of results with a <thead>, call $(element).dataTable(), and you'd be done.
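A rough sketch of that simplest route; it assumes a products table plus jQuery and the DataTables script already loaded on the page:

<?php
// PHP renders a plain table; one line of jQuery then pages it.
// Table and column names are assumptions for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query('SELECT id, name FROM products')->fetchAll(PDO::FETCH_ASSOC);
?>
<table id="results">
  <thead><tr><th>ID</th><th>Name</th></tr></thead>
  <tbody>
  <?php foreach ($rows as $r): ?>
    <tr>
      <td><?= htmlspecialchars($r['id']) ?></td>
      <td><?= htmlspecialchars($r['name']) ?></td>
    </tr>
  <?php endforeach; ?>
  </tbody>
</table>
<script>$('#results').dataTable(); // DataTables handles the paging</script>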
If you HAVE to do it from scratch, it's a matter of returning a set number of results from your database, making a call with a start and end number for the records (or a start number and the number of rows to return). Then you would build buttons or links that change that view by passing different parameters to the server and receiving data back. AJAX would be preferable so you don't need to reload the page each time.
CEIL is your friend for determining pages for the navigation.
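For illustration, a from-scratch endpoint along those lines; the products table, its columns, the connection details, and the page size are all assumptions:

<?php
// From-scratch pagination endpoint (sketch, not the answerer's code).
$perPage = 10;
$page = max(1, (int)($_GET['page'] ?? 1));
$offset = ($page - 1) * $perPage;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// The total row count drives the navigation; ceil() gives the page count.
$total = (int)$pdo->query('SELECT COUNT(*) FROM products')->fetchColumn();
$pages = (int)ceil($total / $perPage);

// Fetch just the rows for the requested page.
$stmt = $pdo->prepare('SELECT id, name FROM products ORDER BY id LIMIT ? OFFSET ?');
$stmt->bindValue(1, $perPage, PDO::PARAM_INT);
$stmt->bindValue(2, $offset, PDO::PARAM_INT);
$stmt->execute();

header('Content-Type: application/json');
echo json_encode(['pages' => $pages, 'rows' => $stmt->fetchAll(PDO::FETCH_ASSOC)]);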
If you choose to do it without hitting the database each time (a notion DataTables improves on with a process called "pipelining"), then you'd spit out divs of information based on the amount to display, then show and hide them based on the page you want to be on. It wouldn't work for huge datasets (too much in the DOM and too long to load), plus it won't be pretty to write, either.

Read results of a PHP array, in real time with Javascript

I certainly can't solve this problem by myself; I've already spent many days trying. This is the problem:
We need to display information on the screen (HTML) that is being generated in real time inside a PHP file.
The PHP is performing very active crawling, returning huge arrays of URLs. Each URL needs to be displayed in real time in the HTML as soon as the PHP captures it, which is why we are using the ob_flush() and flush() methods to echo and print the arrays as soon as we get them.
Meanwhile, we need to display this information somehow so the users can see it while it works (since it could take more than an hour to finish).
As far as I understand, this can't be done with AJAX, since we need to make only one request and read the information inside the array. I'm not totally sure whether Comet can do something like this either, since it would interrupt the connection as soon as it gets new information, and the array is increasing in size very rapidly.
Additionally, just to make things more complex, there's no real need to print or echo the information (URLs) inside the array, since the HTML file being used as the user interface is included by the same file that is processing and generating the array we need to display.
Long story short, we need to place here:
<ul>
<li></li>
<li></li>
<li></li>
<li></li>
<li></li>
...
</ul>
a never-ending, real-time-updated list of URLs that is being generated and pushed into an array, 1,000 lines below, in a PHP loop.
Any help would be really more than appreciated.
Thanks in advance!
Try WebSockets.
They offer real-time communication between client and server, and using socket.io provides cross-browser compatibility. It basically gives you the same results as long-polling/Comet, but there is less overhead between requests, so it's faster.
In this case you would use web sockets to send updates to the client about the current status of the processing (or whatever it was doing).
See this: Using PHP with Socket.io
Suppose you used a scheme where PHP was writing to a Memcached server:
each key you write is named rec1, rec2, rec3, and so on.
You also store a current_min and a current_max.
You have the client constantly polling with AJAX. With each request it includes the last key it saw; call this k. The server then returns all the records from k to max.
If no records are immediately available, the server goes into a wait loop for a maximum of, say, 3 seconds, checking for new records every 100 ms.
If records become available, they are immediately sent.
Whenever the client receives updates or the connection times out, it immediately starts a new request.
Writing a new record is just a matter of inserting at max+1 and incrementing min and max, where max-min is the number of records you want to keep available.
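A hedged sketch of that polling endpoint; the key names (recN, current_max) follow the scheme above, while the connection details and the 3-second window are assumptions:

<?php
// Long-poll endpoint for the Memcached scheme described above.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$lastSeen = (int)($_GET['k'] ?? 0); // last key index the client saw
$deadline = microtime(true) + 3.0;  // wait at most ~3 seconds

header('Content-Type: application/json');

do {
    $max = (int)$mc->get('current_max');
    if ($max > $lastSeen) {
        // Return every record from k+1 up to current_max.
        $records = [];
        for ($i = $lastSeen + 1; $i <= $max; $i++) {
            $records[] = $mc->get("rec$i");
        }
        echo json_encode(['max' => $max, 'records' => $records]);
        exit;
    }
    usleep(100000); // re-check every 100 ms
} while (microtime(true) < $deadline);

echo json_encode(['max' => $lastSeen, 'records' => []]); // nothing new yet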
An alternative to WebSockets is Comet.
I wrote an article about this, along with a followup describing my experiences.
Comet, in my experience, is fast. WebSockets are definitely the future, but if you're in a situation where you just need to get it done, you can have Comet up and running in under an hour.
Definitely some sort of shared memory structure is needed here: perhaps an in-memory temp table in your database, or Memcached as Stephen already suggested.
I think the best way to do this would be to have the first PHP script save each record to a database (MySQL or SQLite, perhaps), and then have a second PHP script that reads from the database and outputs the newest records. Then use AJAX to call this script every so often and append the records it sends to your table. You will have to find a way of triggering the first script.
The JavaScript should record the id of the last URL it already has and send it in the AJAX request; PHP can then select all rows with ids greater than that.
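A sketch of that second script; the urls table, its columns, and the connection details are assumptions for illustration:

<?php
// Return only the rows newer than the id the client already has.
$lastId = (int)($_GET['last_id'] ?? 0);
$pdo = new PDO('mysql:host=localhost;dbname=crawler', 'user', 'pass');
$stmt = $pdo->prepare('SELECT id, url FROM urls WHERE id > ? ORDER BY id');
$stmt->execute([$lastId]);
header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));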
If the number of URLs is so huge that you can't store a database that large on your server (one might ask how a browser is going to cope with a table that large!), then you could always have the PHP script that outputs the most recent records delete them from the database as well.
Edit: when doing a lot of MySQL inserts, there are several things you can do to speed them up. There is an excellent answer here detailing them. In short: use MyISAM, and insert as many rows as you can in a single query (keep a buffer array in PHP that you add URLs to, and when it is full, insert the whole buffer in one query).
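A sketch of that buffering idea; the urls table, its url column, the buffer size, and the sample input are all assumptions:

<?php
// Buffered multi-row INSERT, as described in the edit above.
$pdo = new PDO('mysql:host=localhost;dbname=crawler', 'user', 'pass');

$buffer = [];
$bufferSize = 500;

function flush_buffer(PDO $pdo, array &$buffer): void {
    if (!$buffer) {
        return;
    }
    // One (?) placeholder group per buffered URL, one INSERT for all of them.
    $placeholders = implode(',', array_fill(0, count($buffer), '(?)'));
    $stmt = $pdo->prepare("INSERT INTO urls (url) VALUES $placeholders");
    $stmt->execute($buffer);
    $buffer = [];
}

// Stand-in for whatever the crawler actually yields.
$crawledUrls = ['http://example.com/a', 'http://example.com/b'];

foreach ($crawledUrls as $url) {
    $buffer[] = $url;
    if (count($buffer) >= $bufferSize) {
        flush_buffer($pdo, $buffer);
    }
}
flush_buffer($pdo, $buffer); // write any remainder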
If I were you, I would try to solve this in two ways.
First, I would encode the output array as JSON, and with JavaScript's setTimeout function I would decode it and append it to <ul id="appendHere"></ul>, so
when the list is updated, it updates itself automatically, like a cron job in JS.
Second, if, as you say, you can't produce output while processing, then
inserting the data into MySQL is pointless in my opinion; use MongoDB or similar to increase speed.
By the way, with a key for each record you'll get what you need and never duplicate an inserted value.

Progress bar showing database progress

I was wondering: since I'm making a GPT script, I need an installation system that runs database queries and creates tables.
I know how to do that, but the problem is that I want to show the progress of creating the files and registering the product with my server using a progress bar, so the user knows how much longer they'll be waiting.
How can I do this?
PHP runs server-side; jQuery runs client-side. You would have to call a server-side script repeatedly, either to check the number of rows that have been inserted or tables that have been created, etc., or to tell it to perform another step, updating your progress bar on each call.
Either way is a performance hit, so probably not a good idea.
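If you do go the row-counting route anyway, the check endpoint might look like this; the install_log table and the expected total are purely illustrative assumptions:

<?php
// Progress-check endpoint: report how far the installer has gotten.
$expectedTotal = 250; // rows the installer will create in total (assumed)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$done = (int)$pdo->query('SELECT COUNT(*) FROM install_log')->fetchColumn();
header('Content-Type: application/json');
echo json_encode(['percent' => (int)round(100 * $done / $expectedTotal)]);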

DB operation: is it very expensive?

I have a table called playlist, and I display its details using a display_playlist.php file.
[screenshot of display_playlist.php]
Every time the user clicks the 'up' or 'down' button to arrange the song order, I just update the table. But I feel that updating the DB very often is not recommended, so is there any efficient way to accomplish this task?
I am still a newbie to AJAX, so if AJAX is the only way to do it, can you please explain it in detail? Thank you in advance.
In relative terms, yes, hitting the database is an expensive operation. However, if the playlist state is meant to be persistent then you have to hit the database at some point, it's just a question of when/how often.
One simple optimization you might try is instead of sending each change the user makes to the server right away, allow them to make however many changes they want (using some client-side javascript to keep the UI in the correct state) and provide a "Save Playlist" button that they can press to submit all of their changes to the server at once. That will reduce database hits, and also the number of round-trips made to the server (in terms of what a user experiences, a round-trip to the server is far more expensive than a database hit).
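A sketch of such a "Save Playlist" handler; the playlist table, its position column, and the JSON payload shape (an array of song ids in their new order) are assumptions:

<?php
// Write the whole playlist order back in one round-trip.
$order = json_decode(file_get_contents('php://input'), true);
if (!is_array($order)) {
    http_response_code(400);
    exit;
}

$pdo = new PDO('mysql:host=localhost;dbname=music', 'user', 'pass');
$pdo->beginTransaction();
$stmt = $pdo->prepare('UPDATE playlist SET position = ? WHERE id = ?');
foreach ($order as $position => $songId) {
    $stmt->execute([$position, (int)$songId]);
}
$pdo->commit(); // one transaction instead of one request per move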
More broadly though, you shouldn't get hung up over hypothetical performance concerns. Is your application too slow to handle its current load (and if so, have you done any profiling to verify that it is indeed this database query that is causing the issue)? If not, then you don't need to worry too much about changing it just yet.
You can have a save button, so instead of updating on each move there is only one update, where you update every row at once. This also lets you have a cancel button that restores the order to the way it was.
You can let users change things locally as much as they wish, and defer writing the final result to the database until they choose to move on from the page.
If you really want to avoid updating the database, you can try one of the JavaScript-based MP3 players, which let you pass in the paths to the *.mp3 files.
Then I suggest you use jQuery UI - Sortable
and use it to update the song list for the player.

Is it possible to preload an entire MyISAM dataset using PHP & JSON?

For example:
Let's say we have 1,000 products in a single category,
and we would like to filter through these products.
As we filter through the products using JSON,
we currently have to run a separate query against the DB each time.
We were wondering if anyone knows whether it's possible to preload the products table.
For example, with a preload bar: initializing search (0 - 100%).
The whole system would then only initialize once on load, and we would hope the search results could be instant after that.
Unfortunately, tweaking customers' servers isn't really an option for us, so hopefully someone here may have a better suggestion.
Thanks in advance!
On the PHP end, you could preload the records into something like a memcache bucket.
I strongly doubt, though, that the problem is the database query. It's much more likely that the JSON request is what takes so long.
It's certainly possible to preload all the values (i.e. fetch them in one big request). You are not giving many details about your system, but it would probably simply be one big JSON request.
I don't see any easy way to implement a progress bar for the Ajax request. Maybe a simple "loading" animation will do as well.
You would probably have to use some sort of local storage to persist the data across pages for it to make sense. For that, see this question.
1000 data sets isn't that much to query through, but it is a significant amount to store client-side. However, to answer your question:
JavaScript can create an array of objects. Let's say:
<script type="text/javascript">
var products = [];
products[0] = {name:'First', price:29.99};
products[1] = {name:'Second', price:29.99};
products[2] = {name:'Third', price:29.99};
</script>
... create these via PHP. Also, instead of {...} you can create a function product(name, price) and then call products[0] = new product("first", 29.99);
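One hedged way to do the "create these via PHP" step, assuming a products table with name and price columns; json_encode produces the same array-of-objects shape shown above:

<?php
// Dump the product rows into the page as a JavaScript array.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$rows = $pdo->query('SELECT name, price FROM products')
            ->fetchAll(PDO::FETCH_ASSOC);
?>
<script type="text/javascript">
var products = <?php echo json_encode($rows); ?>;
</script>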
When you have this set up, all the information is stored in the client's browser, so you can do the search/filter in JavaScript alone.
A loading bar can be done neatly through jQuery UI. Searching is just a loop over the array.
If you have multiple categories that you can separate beforehand, you can just create different arrays instead of storing everything in the products array.
Good luck!
