I've run into a problem while developing a WordPress plugin. The API I'm building the plugin for limits me to 6 requests per minute, but when the plugin activates I need to make more than 6 requests to download the API data the plugin needs.
The API is the LimelightCRM API (http://help.limelightcrm.com/entries/317874-Membership-API-Documentation). I'm using the campaign_view method of the API, and what I'm looking to do is potentially make the requests in batches, but I'm not quite sure how to approach the problem.
Idea 1:
Just off the top of my head, I'm thinking I'll need to count the number of requests to make on plugin activation by using campaign_find_active, divide that count by the request limit (6), and then make 6 campaign_view requests per minute until I have all of the data I need, storing it in WordPress transients. However, if I need to make 30 requests, the user can't just sit around waiting 5 minutes for the data to download. Even if I come up with a solution for that, it might require setting the transient expiry times so that the plugin never needs to make more than 6 requests. So my next thought is: can I use a WordPress hook to make the requests every so often, while checking when the last batch of requests was made? It's already getting very tricky, so I'm hoping you can point me in the right direction. Do you have any ideas on how I might work within this rate limit?
Idea 2:
Cron jobs that store the values in a database?
// Fetch campaign IDs (cached in a transient)
$t_campaign_find_active = get_transient('campaign_find_active');
if (!$t_campaign_find_active) {
    // Cache miss: make the API request, then re-read the transient it stores
    limelight_cart_campaign_find_active();
    $t_campaign_find_active = get_transient('campaign_find_active');
}
$llc_cids = $t_campaign_find_active;

// Fetch campaign information for each campaign ID
$llc_cnames = array();
foreach ($llc_cids as $count => $id) {
    if (!get_transient('campaign_view_' . $id)) {
        // Cache miss: one campaign_view API request per uncached ID
        limelight_cart_campaign_view($id);
    }
    // Always record the value so both arrays below stay the same length
    $llc_cnames[$id] = get_transient('campaign_view_' . $id);
}

// Merge campaign IDs and campaign info into a key => value array
$limelight_campaigns = array_combine($llc_cids, $llc_cnames);
Note: The functions limelight_cart_campaign_find_active() and limelight_cart_campaign_view() are not included because they each just make a single API request, return the response, and store it in a WordPress transient. I can post that code if needed, but that part of the plugin is working, so I've left it out.
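For what it's worth, here's a rough sketch of what Ideas 1 and 2 (queueing the downloads and draining the queue in batches via WP-Cron) might look like. The hook name, option name, and batch size are all made up for illustration, and it assumes limelight_cart_campaign_view() stores its own transient as described above:

// Hypothetical sketch: drain a queue of campaign IDs, 6 API requests per run.
add_action('llc_process_campaign_batch', 'llc_process_campaign_batch');

function llc_process_campaign_batch() {
    // IDs still waiting to be downloaded (stored by the activation hook)
    $queue = get_option('llc_campaign_queue', array());
    if (empty($queue)) {
        return;
    }

    // Stay under the 6-requests-per-minute limit
    $batch = array_splice($queue, 0, 6);
    foreach ($batch as $id) {
        limelight_cart_campaign_view($id); // stores the campaign_view_{$id} transient
    }
    update_option('llc_campaign_queue', $queue);

    // Anything left? Schedule the next batch for a minute from now.
    if (!empty($queue)) {
        wp_schedule_single_event(time() + 60, 'llc_process_campaign_batch');
    }
}

Keep in mind that WP-Cron only fires when the site receives traffic, so the one-minute spacing is a minimum, not a guarantee.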
I've come up with a solution for this, and I should have thought of it before. I've arrived at the conclusion that downloading all of the API data on activation is simply impossible with the current rate limit. Most people who might use the plugin will have far too many campaigns to download all of their data at once, and the rate limit will inevitably be hit most of the time if I keep the code the way it is. So rather than having all of that API data ready right after activation, I'm going to let the user make the API calls on demand as needed using AJAX. Let me explain how it will work.
Firstly, on plugin activation, no data will be downloaded initially. The user will need to enter their API credentials, and the plugin will validate them and show a check mark if the credentials are valid and the API login was successful. That uses one API request.
Rather than having a pre-populated list of campaigns on the "Add Product" admin page, the user will simply click a button on that page to make the AJAX campaign_find_active request, which will fetch the campaign IDs and return a drop-down menu of campaign IDs and names. That uses just one request.
After that drop-down data is fetched, they will choose the campaign they want to use, and upon choosing the campaign ID the plugin will display another button to make a campaign_view request that fetches the campaign data for that ID. This will return another drop-down menu which lets them choose the product. It will also take a little CSS and jQuery to show/hide the AJAX buttons depending on the drop-down values. This only uses one API request, and because the request isn't made automatically and requires a button click, the user won't make several API requests while browsing campaign IDs in the first drop-down menu.
The user will then click publish and have a WordPress-indexed product with all of the necessary Limelight data attached and cached. All API responses will be stored in transients with a one-hour expiry; the reason for the hour is so they don't have to wait 24 hours if they make updates. I will also include a button on the settings page to clear the transients so they can re-download on demand if necessary. That could also get a bit tricky, but for the purposes of this question it's not a problem.
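To make that concrete, here's a minimal sketch of what the campaign_find_active AJAX handler could look like. The action and function names are made up, and it relies on the same helper and transient described in the question:

// Hypothetical sketch of the AJAX handler behind the "fetch campaigns" button.
add_action('wp_ajax_llc_find_active', 'llc_ajax_find_active');

function llc_ajax_find_active() {
    $campaigns = get_transient('campaign_find_active');
    if (!$campaigns) {
        // One API request; the helper stores the transient itself (1 hour expiry)
        $campaigns = limelight_cart_campaign_find_active();
    }
    wp_send_json_success($campaigns); // the jQuery success callback builds the drop-down
}

The campaign_view button works the same way against the campaign_view_{$id} transients.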
In total, I'm only using 3-4 API requests. I might also build a counter into it so I can display an error message to the user if they make too many requests at once, something along the lines of "The API's limit of 6 requests per minute has been reached, please wait 60 seconds and try again."
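A simple way to do that counter is a transient that acts as a rolling 60-second tally (a sketch; the transient name and helper function are made up):

// Returns true if another request can be made this minute, false otherwise.
function llc_can_make_request($limit = 6) {
    $count = (int) get_transient('llc_request_count');
    if ($count >= $limit) {
        return false; // show the "please wait 60 seconds" notice instead
    }
    // Re-setting the transient restarts its 60-second expiry, which errs on the safe side.
    set_transient('llc_request_count', $count + 1, 60);
    return true;
}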
I welcome any comments, suggestions, or critiques. I hope this helps someone struggling with API request limits; AJAX is a great way to work around them if you don't mind giving the user a little more control.
I just made 40 API accounts and randomly choose one for each request. Works well.
// Only three accounts shown here as placeholders; the real array has 40.
$api_accounts = array(
    "account1" => "asdfasdfdsaf",
    "account2" => "asaasdfasdf",
    "account3" => "asdfasdf",
);
$rand     = rand(1, count($api_accounts));
$username = "account" . $rand;
$password = $api_accounts['account' . $rand];
Related
I am developing a WordPress plugin that fetches a user's Instagram profile info and stores it in the database via the WordPress wp_remote_get() function. A cron job runs every 24 hours to update the users' Instagram info daily. The problem is that I have about 5000+ users, the script runs too long, and the task never completes. Everything was working great when the user count was under 1,000.
Which PHP settings in php.ini should I change to solve this issue? I've already set max_execution_time to 0. Are there any other settings? Any suggestions?
I advise you to do the following:
create more than one cron job which calls the same file
after updating a user, mark them as updated
do not update any user who is already marked as updated
make the updating function a transaction (finish all or cancel all)
finally, increasing the timeout is also a good idea
A rough sketch of the mark-as-updated batching approach is below.
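The sketch assumes a hypothetical wp_instagram_users table with an updated_today flag; the column names, batch size, and URL are placeholders rather than the asker's actual schema:

// Hypothetical sketch: each cron run processes one batch of not-yet-updated users.
global $wpdb;

$batch = $wpdb->get_results(
    "SELECT id, instagram_username FROM {$wpdb->prefix}instagram_users
     WHERE updated_today = 0 LIMIT 200"
);

foreach ($batch as $user) {
    $response = wp_remote_get('https://www.instagram.com/' . $user->instagram_username . '/');
    if (is_wp_error($response)) {
        continue; // leave the user unmarked so a later run retries them
    }

    $wpdb->update(
        $wpdb->prefix . 'instagram_users',
        array(
            'profile_data'  => wp_remote_retrieve_body($response),
            'updated_today' => 1,
        ),
        array('id' => $user->id)
    );
}

Several overlapping cron entries can call this same script; the updated_today flag keeps them from repeating work, and a separate daily job resets the flag.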
It's hard to make a valid recommendation without knowing your specific scenario, but I would change your code so that it visits the Instagram profile only when needed, as opposed to for everyone via a cron job. First, the info will be 'fresher'; second, you'll avoid the problem you're describing.
For instance, when a user visits their profile, a call is made to Instagram and the data is pulled. You store the data in your database the same way as before, only with a timestamp. Also make sure the code doesn't pull data unless it's been 24 hours since the last refresh. Hope this helps.
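A minimal sketch of that on-demand refresh, assuming the profile data and a last-fetched timestamp live in user meta (the meta keys and URL are placeholders):

// Hypothetical sketch: refresh one user's Instagram data at most once per 24 hours.
function maybe_refresh_instagram($user_id) {
    $last_fetched = (int) get_user_meta($user_id, 'instagram_last_fetched', true);

    // Serve the cached copy if it is younger than 24 hours.
    if (time() - $last_fetched < DAY_IN_SECONDS) {
        return get_user_meta($user_id, 'instagram_profile_data', true);
    }

    $handle   = get_user_meta($user_id, 'instagram_username', true);
    $response = wp_remote_get('https://www.instagram.com/' . $handle . '/');
    if (is_wp_error($response)) {
        // Network problem: fall back to the stale cached copy.
        return get_user_meta($user_id, 'instagram_profile_data', true);
    }

    $body = wp_remote_retrieve_body($response);
    update_user_meta($user_id, 'instagram_profile_data', $body);
    update_user_meta($user_id, 'instagram_last_fetched', time());
    return $body;
}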
My page is visited by multiple users at the same time.
User 1: visits the page and changes the title.
User 2: was already on that page and still sees the old title; the title has to update automatically to the new one.
I know I can simply use AJAX to poll every 5 minutes, but I'm trying to see if there is any other way that fires an event to all instances of the page opened by different users, so that if one of them updates, all other pages automatically get the latest data without waiting for the 5-minute AJAX call. AJAX seems inefficient since it will make many calls, and what happens if user 1 updates the title while user 2 also updates it before user 2's page has been refreshed by the 5-minute call?
I'm not asking for code, just advice on whether I should keep using AJAX calls every 5 minutes and be happy with it, or whether there is a better solution.
Try investigating WebSockets for real-time, two-way communication between server and browser.
http://socketo.me/
I'm in the early stages of working with it myself, but it seems like a solution that would fit your requirements.
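For a feel of what the WebSocket route involves, here's roughly the broadcast pattern from Ratchet (the library behind socketo.me), trimmed down as a sketch rather than production code; it assumes Ratchet is installed via Composer:

// Hypothetical sketch: every message a client sends (e.g. the new title) is pushed to all other clients.
use Ratchet\MessageComponentInterface;
use Ratchet\ConnectionInterface;
use Ratchet\Server\IoServer;
use Ratchet\Http\HttpServer;
use Ratchet\WebSocket\WsServer;

require __DIR__ . '/vendor/autoload.php';

class TitleBroadcaster implements MessageComponentInterface {
    protected $clients;

    public function __construct() {
        $this->clients = new \SplObjectStorage();
    }

    public function onOpen(ConnectionInterface $conn) {
        $this->clients->attach($conn);
    }

    public function onMessage(ConnectionInterface $from, $msg) {
        // e.g. $msg = '{"title":"New title"}', relayed to everyone else
        foreach ($this->clients as $client) {
            if ($client !== $from) {
                $client->send($msg);
            }
        }
    }

    public function onClose(ConnectionInterface $conn) {
        $this->clients->detach($conn);
    }

    public function onError(ConnectionInterface $conn, \Exception $e) {
        $conn->close();
    }
}

$server = IoServer::factory(new HttpServer(new WsServer(new TitleBroadcaster())), 8080);
$server->run();

Each open page keeps a WebSocket connection to this script, so an update by user 1 reaches user 2 immediately instead of on the next poll.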
Also, maybe look at push notifications
e.g. http://www.pubnub.com/blog/php-push-api-walkthrough
I know the title is complicated, but I was looking for some advice on this and found nothing.
I just want to ask if I'm thinking about this the right way.
I need to make a top-Facebook-shared list of about 10 items or so for my website's items (images, articles, etc.).
And this part is simple: I will just get the share count from the Facebook Graph API and update it in the database. I don't want to base it on an AJAX call triggered by the FB share, because that could be misused.
Every item has a last-update datetime, a create date, and a likes field in the database.
I will also need the top shared URLs for the last 24 hours, 7 days, and month, so the idea is simple:
When a user views an item, at most every 10 minutes the share count for that URL is obtained from the FB Graph API and updated in the database; the database also stores the last update time.
Every time a user views the item, the site checks the last update datetime; if it is more than 10 minutes old, it makes the FB API call and updates. The 10-minute interval keeps the number of FB API calls down.
This basically works, but there is a problem: concurrency.
When the item is selected, PHP checks whether the last update was 10 minutes ago or more, and only then makes a call to the FB API and updates the share count (if it's bigger than the current one) and the rest of the data, because a remote call is costly and I want to keep FB API usage low.
So as long as users view items they get updated, but the update depends on the select, and I can't do it all in one SQL statement because of the time check and the remote call. One user can come in, and then another, both after the 10 minutes have passed, and then there's a chance the FB API will be called and the row updated several times; the more users, the more calls and updates, and THIS IS NOT GOOD.
Any advice on how to fix this? Am I doing it right? Maybe there is a better way?
You can either decouple the API check from user interaction completely and have a separate scheduled process collect the Facebook data every 10 minutes, regardless of users.
Or, if you'd rather pursue this event-driven model, then you need to look at using a 'mutex': set a flag somewhere (in a file, a database, etc.) which indicates that a checking process is currently running, so a second one doesn't start. A sketch of the database variant is below.
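One way to implement the database-flag variant without a separate flag table is an atomic conditional UPDATE, so only the request that manages to bump the timestamp performs the remote call. A sketch, assuming PDO and hypothetical items / share_count / last_fb_check columns:

// Hypothetical sketch: only one request per 10 minutes 'wins' the right to call the FB API.
$claim = $pdo->prepare(
    "UPDATE items
     SET last_fb_check = NOW()
     WHERE id = :id AND last_fb_check < NOW() - INTERVAL 10 MINUTE"
);
$claim->execute(array('id' => $itemId));

if ($claim->rowCount() > 0) {
    // We won the claim: make the single Graph API call and store the count.
    $shares = fetch_facebook_share_count($itemUrl); // placeholder for the asker's existing API call
    $update = $pdo->prepare(
        "UPDATE items SET share_count = GREATEST(share_count, :shares) WHERE id = :id"
    );
    $update->execute(array('shares' => $shares, 'id' => $itemId));
}
// All other requests just read the cached share_count from the database.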
I want to write a little program that will give me an update whenever a webpage changes. Like I want to see if there is a new ebay listing under a certain category and then send an email to myself. Is there any clean way to do this? I could set up a program and run it on a server somewhere and have it just poll ebay.com every couple of minutes or seconds indefinitely but I feel like there should be a better way. This method could get dicey too if I wanted to monitor a variety of pages for updates.
There is no 'clean' way to do this.
You must rely on cURL or file_get_contents() with context options to fetch the data from the URL, and, in order to be notified when the content changes, you must store snapshots of the page you are watching in a database. Later you compare the newly crawled content with the earlier snapshot, and if a change in a significant part of the DOM is detected, that triggers your mail notifier function.
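A bare-bones sketch of that snapshot-and-compare idea (here the whole page is hashed for brevity; a real version would extract and compare only the listing portion of the DOM, since ads and timestamps change on every load):

// Hypothetical sketch: hash the page, compare with the stored snapshot, send mail on change.
$url  = 'https://www.ebay.com/sch/some-category';   // placeholder URL for the watched category
$html = file_get_contents($url, false, stream_context_create(array(
    'http' => array('timeout' => 15, 'user_agent' => 'Mozilla/5.0 (page watcher)'),
)));

if ($html !== false) {
    $newHash  = md5($html);
    $lastHash = @file_get_contents('snapshot.hash'); // hash saved by the previous run

    if ($newHash !== $lastHash) {
        mail('me@example.com', 'Page changed', 'Something changed at ' . $url);
        file_put_contents('snapshot.hash', $newHash);
    }
}

Run from cron every few minutes, this covers the single-page case; monitoring many pages just means looping over a list of URLs and storing one snapshot per URL.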
I'm not awesome enough to write a chat application myself, so I'm trying to get an existing one to work. I've recently downloaded one from here, and it's pretty good so far, as I've tested it out on XAMPP, but I have a slight problem. I'm trying to generate a list of online users to give it a more practical, application-like feel, but the problem is I have no clue how to do that easily.
When users log in to my site, a session named g_username is created (the chat uses 'username', but I'll fix that). From what I've seen so far, the easiest method would be to store their username in a database table called OnlineUsers and fetch that data via AJAX. The other problem is that it's session based, and sometimes users just leave without logging out, whereas I intended to run a logout script that removes the user from the OnlineUsers table and deletes the session.
If they leave without logging out, they'd be online forever! I could potentially add a bit of code to every page that fires an AJAX event on page close, running a script that deletes their OnlineUsers record, but then again, as far as I'm aware, that would load the server with useless queries as users jump between pages.
Creating my entire site in AJAX isn't really an option, as it's a load of different sites combined into one 'place' with a social 'layer' (if you will) from a social service.
Does anyone see a way to do this that would make sense, be easy to integrate, and work with Apache, without command-line access?
You could do something like storing a timestamp of the user's last action in a database, then comparing that timestamp when outputting online users and only counting users whose last action was at most a minute ago.
Run on all/vital pages:
(When deciding whether the last action is outdated, you could also check that the stored timestamp is at least a minute old, to reduce the database load.)
// Only touch the database if the stored timestamp is at least a minute old
if ($user['lastAction'] < time() - 60) {
    // update the user's lastAction in the database; it is outdated
}
When calculating the number of users online, inside the loop over each timestamp:
// If the user's last action was within the last minute, they are most likely online
if ((time() - $row['lastAction']) < 60) {
    // count this user as online
}
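Alternatively, the filtering can be pushed into the query itself (a sketch, assuming a PDO connection and a users table with a UNIX-timestamp lastAction column):

// Count everyone whose last action happened within the last 60 seconds.
$stmt = $pdo->prepare("SELECT COUNT(*) FROM users WHERE lastAction > :cutoff");
$stmt->execute(array('cutoff' => time() - 60));
$onlineCount = (int) $stmt->fetchColumn();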
You could have a cron job (if you have cPanel) running on the server once every 60 seconds or so that checks when each user last sent anything via the chat; if they haven't in, let's say, the last 5 minutes, remove their entry from the online users list.