I have a problem with my PHP AJAX chat script - PHP

Hello, I have some problems with my PHP AJAX script.
I'm using PHP/MySQL.
I have a field in my accounts table that stores the time of the last request from a user; I will use that to kick idle users out of the chat. I plan to write a PHP function that deletes all the rows whose time field is older than the time limit, but where should I call this function? Is it okay to fire it every time a new request is sent to my index.php? I think that would put a huge load on the server, wouldn't it? Do you have a better solution?
Thanks

There are two viable solutions:
either create a small PHP script that performs this deletion in an infinite loop (and of course sleeps for a specified amount of time before doing it again), and start it via the PHP CLI,
or create one that performs the deletion only once, then exits, and call it from cron (if you're using a UNIX-ish server) or Task Scheduler (on Windows).
The second one is simpler, but its drawback is that you can't make the interval between deletions shorter than 60 seconds.
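A minimal sketch of the first variant, assuming the accounts table stores the last-request time in a last_request column and uses a 120-second idle limit (the column name, credentials and limits are placeholders):

<?php
// cleanup_loop.php - start from the command line: php cleanup_loop.php
$pdo = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');

while (true) {
    // Remove every account row whose last request is older than the idle limit.
    $pdo->exec(
        "DELETE FROM accounts WHERE last_request < NOW() - INTERVAL 120 SECOND"
    );

    sleep(10); // pause before sweeping again
}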

A solution could be to fire the deletion function just once every few requests.
Using rand() you could give it a 1 in 100 (for example) chance of running the function, so that about one page request in 100 will clean up the expired data.
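For example, near the top of index.php (delete_idle_users() stands in for whatever your cleanup function is called):

<?php
// Roughly one request in a hundred triggers the cleanup.
if (rand(1, 100) === 1) {
    delete_idle_users();
}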

Related

Cron Job - php script - multiple users?

What is the best way to perform cron-job automation for multiple users?
Example:
A cron job needs to run every 10 minutes and call a PHP script that connects to an external API (via curl) and collects data (site visitors and other data) a user has received on an external web property. We need to check periodically, every 10 minutes, via the API whether there is any new data available for that user and fetch it -- and that for EACH user in the web app.
Such a cron-job PHP call to the API usually takes 1-3 seconds per user, but could occasionally take 30 seconds or more to complete for one user (in exceptional situations).
Question...
What is the best way to perform this procedure and collect external data like that for MULTIPLE users? Hundreds, even thousands of users?
For each user, we need to check for data every 10 minutes.
Originally I was thinking of calling 10 users in a row in a loop with one cron-job call, but since each user's collection can take 30 seconds... for 10 users the loop could take several minutes and... the script could time out? Correct?
Do you have tips and suggestions on how to perform this procedure for many users most efficiently? Should separate cron jobs be used for each user, instead of a loop?
Thank you!
=== EDIT ===
Let's say one PHP script can call the API for 10 users within 1 minute... Could I create 10 cron-jobs that essentially call the same PHP script simultaneously, but each one collecting a different batch of 10 users? This way I could potentially get data for 100 users within one minute? No?
It could look like this:
/usr/local/bin/php -q get_data.php?users_group=1
/usr/local/bin/php -q get_data.php?users_group=2
/usr/local/bin/php -q get_data.php?users_group=3
and so on...
Is this going to work?
=== NOTE ===
Each user has a unique Access Key with the external API service, so one API call can only be for one user at a time. But the API could receive multiple simultaneous calls for different users at once.
If it takes 30 seconds per user and you have more than 20 users, you won't finish before you need to start again. I would consider using Gearman or another job server to handle each of these requests asynchronously. Gearman can also wait for jobs to complete, so you should be able to loop over all the requests you need to make and then wait for them to finish. You could probably accomplish the same thing with PHP's pthreads extension; however, that's going to be significantly more difficult.
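A rough sketch of that Gearman approach, assuming the pecl gearman extension and a job server on localhost; fetch_user_data and the way user ids are loaded are illustrative:

<?php
// submit_jobs.php - run from cron every 10 minutes.
$userIds = [1, 2, 3]; // in practice, loaded from your users table

$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);

foreach ($userIds as $userId) {
    // Queue one task per user; the workers run them in parallel.
    $client->addTask('fetch_user_data', (string)$userId);
}
$client->runTasks(); // blocks until all tasks have finished

<?php
// worker.php - start several copies, e.g. under supervisord.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);

$worker->addFunction('fetch_user_data', function (GearmanJob $job) {
    $userId = $job->workload();
    // ... call the external API via curl with this user's access key,
    // then write the results back to the database ...
});

while ($worker->work()); // process jobs forever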

Get a list of dynamic names from a DB and have a cron job that traverses this array (php)

Here's what I'm trying to accomplish in high-level pseudocode:
query db for a list of names (~100)
for each name (using php) {
query a 3rd party site for xml based on the name
parse/trim the data received
update my db with this data
Wait 15 seconds (the 3rd party site has restrictions and I can only make 4 queries / minute)
}
So this was running fine. The whole script took ~25 minutes (99% of the time was spent waiting 15 seconds after every iteration). My web host then made a change so that scripts time out after 70 seconds (understandable). This completely breaks my script.
I assume I need to use cron jobs or the command line to accomplish this. I only understand the basic use of cron jobs. Any high-level advice on how to split up this work in a cron job? I am not sure how a cron job could parse through a dynamic list.
cron itself has no idea of your list and what has been done already, but you can use two kinds of cron jobs.
The first cron job - which runs, for example, once a day - could add your 100 items to a job queue.
The second cron job - which runs, for example, once every minute during a certain period - can check if there are items in the queue, execute one (or a few) and remove them from the queue.
Note that both cron jobs are just triggers to start a PHP script in this case, and you have two different scripts, one to fill the queue and one to process part of the queue, so almost everything is still done in PHP.
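A minimal sketch of those two scripts, assuming a job_queue table with name and status columns (all names, credentials and batch sizes are illustrative):

<?php
// fill_queue.php - cron: once a day.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Add all ~100 names to the queue as pending jobs.
foreach ($pdo->query('SELECT name FROM names') as $row) {
    $pdo->prepare("INSERT INTO job_queue (name, status) VALUES (?, 'pending')")
        ->execute([$row['name']]);
}

<?php
// process_queue.php - cron: once a minute; handles a few items per run.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$jobs = $pdo->query(
    "SELECT id, name FROM job_queue WHERE status = 'pending' LIMIT 3"
)->fetchAll();

foreach ($jobs as $job) {
    // ... query the 3rd-party site for $job['name'], parse, update your DB ...
    $pdo->prepare('DELETE FROM job_queue WHERE id = ?')->execute([$job['id']]);
    sleep(15); // respect the 4-queries-per-minute limit
}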
In short, there is not much that is different. Instead of executing the script via mod_php or FastCGI, you are going to execute it via the command line: php /path/to/script.php.
Because this is a different environment than HTTP, some things obviously don't work: sessions, cookies, GET and POST variables. Output gets sent to stdout instead of the browser.
You can pass arguments to your script by using $argv.
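For example (note that CLI arguments are passed on the command line, not as a query string):

<?php
// script.php - invoked as: php /path/to/script.php 42 verbose
$firstArg  = $argv[1] ?? null; // "42"
$secondArg = $argv[2] ?? null; // "verbose"
echo "Running with $firstArg, $secondArg\n"; // written to stdout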

How to prevent the server from blocking execution of a long-running script

I have a big script written in PHP which imports a lot of information into a Prestashop installation using web services. This script is written in "sections": there is a function that imports the categories, another that imports products, then manufacturers, and so on; about 7-10 functions are called in the main script. Basically I expect this script to run for about an hour, passing from one function to the next until it arrives at the last function, then return some values and stop until the next night.
I would like to understand which would be better:
1) impose a time limit of 30 minutes every time I enter a new function (this will prevent the timeout)
2) make a chain of pages, each one with a single function call (and of course the time limit)
or any other idea... I would like to:
know if a function has been called (maybe using a global variable?)
be sure that the server will execute the functions in order (hence the page chain)...
I hope to have been clear; otherwise I'll update the question.
Edit:
The script is executed by another server that calls a page. That other server is "unknown" to me, so I simply know that this page gets called (they could also trigger the functions by visiting the page), but anyway I have no control over it.
For any long-running script, I would run it through the command line, probably with a cron job to kick it off. If it's triggered from the outside, I would create a job queue (for example in the database) where you insert a new row to signify that it should run, along with any variable input params. Then the background job would run - say - every 5 minutes and check if there's a new job in the queue. If there's not, just exit. If there is, mark that it has begun work and start processing. When done, mark that it's done.
1 hour of work is a looooooooong time though. Nothing you can do to optimise that?
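A sketch of that claim-and-mark pattern, assuming a jobs table with a status column holding 'new', 'running' or 'done' (all names are illustrative); the conditional UPDATE is what keeps two overlapping runs from claiming the same job:

<?php
// run_jobs.php - cron: every 5 minutes.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

$id = $pdo->query("SELECT id FROM jobs WHERE status = 'new' ORDER BY id LIMIT 1")
          ->fetchColumn();
if ($id === false) {
    exit; // nothing queued, just exit
}

// Claim the job atomically; zero affected rows means another run beat us to it.
$claim = $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ? AND status = 'new'");
$claim->execute([$id]);
if ($claim->rowCount() === 0) {
    exit;
}

// ... do the hour of import work: categories, products, manufacturers ...

$pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$id]);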
You can increase the time limit for the execution of a script as much as you want using:
set_time_limit($seconds); // or set_time_limit(0) for no limit at all
Also, long-running scripts may need more memory. You can increase the memory limit using:
ini_set('memory_limit', '20M');
The other thing you have to make sure of is that you are running your script on a dedicated server, because on a shared server the host will automatically kill long-running scripts.

How to launch a PHP script 12 hours after each user-triggered event

Any of the users can trigger an event, as many times as they wish. Each time this event is triggered, a row in a MySQL table is created with the timestamp, their userid and other useful information. What I need to do is launch a script which runs exactly 12 hours after the event is triggered, i.e. for each row in the table.
How can I achieve this in an automated and efficient fashion?
You could use a cron job which launches the script every minute.
In the script you should first fetch the row(s) and check whether it's OK to run (whether 12 hours have passed); if so, continue, else stop.
I don't know if this will be very efficient; it depends on the number of entries in your database. But technically it's not expensive to just check whether the current date matches a date fetched from the database + 12 hrs. I can't say more because you didn't give many details about your data.
You'd probably be better off with a cronjob
http://en.wikipedia.org/wiki/Cron
Potentially, you could look into the MySQL Event Scheduler, although it might not fit your needs; it's hard to really tell from the details given.
http://dev.mysql.com/doc/refman/5.1/en/events.html
something like
CREATE EVENT myTimedEvent ON SCHEDULE EVERY 5 MINUTE DO CALL updateRows();
updateRows() checks your criteria (12 hours ago) and, if met, performs whatever action you want to do. This requires your MySQL to be at version 5.1+, however.
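For completeness, a sketch of what the updateRows() procedure behind that event might contain (the table and column names are placeholders):

DELIMITER //
CREATE PROCEDURE updateRows()
BEGIN
  -- Act on rows that were created more than 12 hours ago.
  UPDATE events
     SET processed = 1
   WHERE processed = 0
     AND created_at <= NOW() - INTERVAL 12 HOUR;
END //
DELIMITER ;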
You would probably do best to have a cron job which runs every minute and checks for any rows in the database more than 12 hours old that have not yet been processed by the script, and processes them. This won't give you EXACTLY a 12-hour difference, but it would be within a minute of that.
You would probably also want to make sure the script can run within a few seconds, so that you don't have two overlapping copies of the script running at the same time.
This could be done using cron jobs. If you have root access to your server, or a server administration toolkit that offers cron job management, you would do it on your server; otherwise use an online cron job service (google for "cronjob online").
The cron job then triggers a PHP script on your server at your defined interval, like every minute or every 5 minutes.
This script then selects all rows from your MySQL table which are older than 12 hours (WHERE timestamp <= NOW() - INTERVAL 12 HOUR), performs your desired actions on each result, and deletes the result from the table (or marks it as done).
Just make sure that the fetching and the actions themselves are faster than your cron job interval; otherwise you would have two or more scripts working on the same rows.
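Put together, the cron script could look roughly like this (the events table and done flag are placeholders for your schema):

<?php
// process_due.php - cron: every minute (or every 5 minutes).
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$rows = $pdo->query(
    "SELECT id FROM events
     WHERE `timestamp` <= NOW() - INTERVAL 12 HOUR AND done = 0"
)->fetchAll();

foreach ($rows as $row) {
    // ... perform the desired action for this row ...

    // Mark it done so the next cron run skips it.
    $pdo->prepare('UPDATE events SET done = 1 WHERE id = ?')
        ->execute([$row['id']]);
}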
The easy way for me is to make the page head contain:
<head>
<meta http-equiv="refresh" content="43200"> <!-- 43200 = 12 hours × 60 minutes × 60 seconds -->
</head>
This method is very helpful to avoid a server timeout, which you can't avoid if you are using only PHP code.
Important: if you start the script using a submit button, it's recommended to point the form's action="other_page" at a different page, not the same one, and you should save your input values in a cookie using the setcookie function and grab them as variables on the action page using $cookieVar = $_COOKIE['the_cookie_name'];
You may need to increase or decrease the $cookieVar value and update it again using setcookie each time your http-equiv head code performs its automatic refresh after the specified number of seconds (43200 in our example), depending on what you want to do.
Note that if you make your action start automatically without pressing a submit button, you can do the action on the same page.
I used that idea before: I had 180,000 images in one folder and the server wouldn't let me download them all because it was showing me only the first 7,998 images, so I created a script to zip each batch of 1,000 images into a file outside the images folder, and to avoid a timeout I made the code refresh every 60 seconds. In the end I had just 180 zip files :)
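A sketch of that refresh-and-resume pattern, processing one batch per page load and keeping the offset in a cookie (the file name, batch size and cookie name are made up for the example):

<?php
// batch_zip.php - each page load processes one chunk, then reloads itself.
$offset = isset($_COOKIE['offset']) ? (int)$_COOKIE['offset'] : 0;

// ... process items $offset .. $offset + 999 (e.g. zip 1000 images) ...

setcookie('offset', (string)($offset + 1000)); // resume point for the next load
?>
<head>
  <!-- Reload the page after 60 seconds to process the next chunk. -->
  <meta http-equiv="refresh" content="60">
</head>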

Ajax - Communicating with running PHP script in background?

I suspect this question will seem too... silly, but I'm trying to get my head around a nice solution and I'm kinda stuck.
So, here's my situation :
I'm using Ajax to perform a series of tasks. Actually, I'm queuing (or running in parallel at times; it doesn't really matter to me) 1 or more requests.
Show the progress as a percentage (1 out of X tasks performed)
When finished, show the final result.
What I'm trying to do :
Instead of having 3-4 different tasks running (= 3-4 different PHP scripts called asynchronously via Ajax), I would like to have just 1 (= 1 script) - in other words, combine the X scripts into one. (That's easy).
Issues I'm facing :
How could I still report the percentage complete (1 out of X tasks)?
Any ideas?
I would update a key in Memcached each time a task completes, and then let your Ajax call simply get the value from that Memcached key.
http://php.net/manual/en/book.memcached.php
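A minimal sketch, assuming the pecl memcached extension and a job id the browser already knows (progress.php, the key format and the task list are made up for the example):

<?php
// Worker side: bump the counter after finishing each task.
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);

$jobId = 'job42';                     // shared with the client beforehand
$tasks = ['step1', 'step2', 'step3']; // placeholder task list

foreach ($tasks as $i => $task) {
    // ... perform the task ...
    $m->set("job:$jobId:progress", ($i + 1) . '/' . count($tasks));
}

<?php
// progress.php - polled by the browser via Ajax.
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);
echo $m->get('job:' . $_GET['id'] . ':progress'); // e.g. "2/3"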
If you have the request dropped into a database by your original Ajax call, then kick off the script, you will still have time (assuming the request takes some time to complete) for subsequent tasks to be dropped into the same database and picked up by the script that is still running.
As for reporting it, perhaps run a quick jQuery poll to see how many items are in the queue? Alternatively, have the task.php file update the database (possibly even another table) to say how many jobs it has completed and how many are still in the queue.
If you don't need to send much data to the PHP script, you can use a "long poll" approach. With this, you don't use AJAX but insert a script tag like this:
<script src="my_php_script?task1=x&param_t1_1=42&task2=y"></script>
The PHP file can then send back a JavaScript command like
updatePercent(12);
after each task is done. The commands will be executed by the browser as they come in. Be sure to call flush() after every task.
Looking into Comet may give you other ideas on how to handle the Client-Server connection.
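On the PHP side, that could look like this sketch (updatePercent() is assumed to be defined on the page that contains the script tag):

<?php
// my_php_script - streamed back as the body of the <script> tag.
header('Content-Type: application/javascript');

$tasks = ['task1', 'task2', 'task3']; // placeholders for the combined tasks
foreach ($tasks as $i => $task) {
    // ... perform the task ...
    $percent = (int) round(($i + 1) / count($tasks) * 100);
    echo "updatePercent($percent);\n";
    flush(); // plus ob_flush() if output buffering is enabled
}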
You can manage a queue: before sending the AJAX request, put the task in a queue (which could be an object or an array). Run the AJAX call asynchronously with a complete handler that removes the job from the queue when it's done.
You can update the progress together with removing the job from the queue, or handle it separately using setTimeout(), checking how many tasks are in the queue versus how many were put in it in total: % = (submitted_tasks - items_in_queue) / submitted_tasks * 100
