I'm building a report using PHP and SQL Server 2000.
In this report, the PHP page accesses a live database on which a lot of processes are running.
The problem I'm facing is that when I give this report to users, they access it so frequently that it can wreak havoc on the daily processes.
I want to limit access to some number of times per day (let's say 10), or to once per two-hour period per person (once a person has accessed the report, they can only view it again after 2 hours).
Is this achievable via PHP, or do I need to do it through SQL Server 2000 configuration?
Please provide a detailed answer, with references if possible.
Quite simply, enable/create caching, so the report is generated only a few times a day (either by a cron job on a schedule, or by user activation), and then allow that report to 'live' for a few hours (or whatever your parameters are).
Basically you need to create some caching on the PHP side. I would create a copy of the report, send that to users, and update it throughout the day without their interaction.
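For instance, here is a minimal file-based cache sketch; the cache path and generateReport() are placeholders for your own code:

<?php
// Serve the cached report if it is younger than 2 hours,
// otherwise regenerate it; path and generateReport() are placeholders.
$cacheFile = '/tmp/report_cache.html';
$ttl = 2 * 60 * 60; // let the report 'live' for 2 hours

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    readfile($cacheFile);                 // cheap: no database touched
} else {
    $html = generateReport();             // the expensive SQL Server queries
    file_put_contents($cacheFile, $html);
    echo $html;
}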
UPDATE
A quick example I found: http://www.devshed.com/c/a/PHP/Output-Caching-with-PHP/
Related
I have the following challenge... I created an appointment booking system with Laravel 8, which allows people to book appointments that are then available to employees in the admin area.
The web app itself has been running for a couple of months, but I am now getting more and more users, which has already led a couple of times to the web server / MySQL being so overloaded that the website stopped loading.
Analysis has shown that too many queries are executed: the calculation of open available time slots queries the bookings table in the DB, which has grown quite big over time (a 1.7 GB table).
Surely I can optimize the DB queries etc., but I am still afraid that too many concurrent web requests may block the DB/web server.
My question now is: what options do you see to prevent this situation?
Apparently the main cause is the available-time-slots check, which is executed every time the user selects a date in the dropdown (this triggers an AJAX call, which fetches the available time slots).
I was thinking about two possible solutions:
Queueing: would it make sense to execute this AJAX call as a queued job, so that too many requests cannot block the DB?
Replacing the AJAX call for getting the time slots with an internal API request: instead of asking the DB directly, I would ask my backend API for the time slots, and it could throttle the requests if there are too many (see the sketch after this list).
Or are there better options?
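To illustrate what I mean by option 2, roughly (route, controller and method names are just placeholders, not my actual code):

<?php
// routes/api.php - Laravel's built-in rate limiter:
// at most 30 requests per minute per client for this endpoint.
use App\Http\Controllers\TimeslotController;
use Illuminate\Support\Facades\Route;

Route::middleware('throttle:30,1')
    ->get('/timeslots/{date}', [TimeslotController::class, 'show']);

// app/Http/Controllers/TimeslotController.php
namespace App\Http\Controllers;

use Illuminate\Support\Facades\Cache;

class TimeslotController extends Controller
{
    public function show(string $date)
    {
        // Cache the heavy slot calculation for 60 seconds, so concurrent
        // requests for the same date hit the cache, not the 1.7 GB table.
        $slots = Cache::remember("timeslots:{$date}", 60,
            fn () => $this->calculateOpenSlots($date));

        return response()->json($slots);
    }

    private function calculateOpenSlots(string $date): array
    {
        return []; // placeholder for the existing heavy query
    }
}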
I hope I have explained the issue and my desired outcome clearly.
Thank you very much in advance!
Well, this is a question I have wanted to ask for a while, and it is becoming more and more the right time to ask it.
I am building a system in PHP that processes bookings for holiday facilities.
Every facility (hotel, motel, hostel, bed & breakfast, etc.) has its own login and is able to manage its own bookings.
It currently runs on a system with a single database that separates the user data by hostname, user and login.
The system is also equipped with the option to import bookings provided by partners/resellers.
I have created a specific page for doing this.
Let's say :
cron.examplesystem.com
This URL is called by a cron task every 5 minutes to check whether there are any new bookings or any pending changes, cancellations or confirmations.
Now the issue
This has been going just fine until now.
However, we used to have one instance for every client, and generally every call had to process somewhere between 3 and 95 bookings.
But I have now updated my system from one instance for every client to one instance for all clients.
So in the past :
myclient.myholidaybookings.example.com
I am now going to :
myholidaybookings.example.com
With one server handling all clients instead of a separate server for each client.
This will put a lot of stress on the server. That in general is not what worries me, since I have worked hard to make it manageable and scalable.
But I have no clue how to approach the processing itself.
Because let's say we have about 100 clients, each with 3 - 95 bookings (average 49): that means around 4,900 bookings or updates to process per run.
For sure we'll be hitting a timeout sooner or later.
And this is what I want to prevent.
Now all kinds of creative solutions can be found, but what is best practice? I want to create a solution that is solid and doesn't have to be reworked halfway to going live.
Summary :
problem : I have a system that processes many API feeds in one call, and I am sure we'll hit a timeout during processing once the system gets populated with users
desired solution : a best-practice approach in PHP for handling and processing many API feeds without worrying about a timeout as the user database grows
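To make the situation concrete, this is roughly what the cron endpoint does today (simplified; the function names are placeholders):

<?php
// cron.examplesystem.com - simplified sketch of the current processing;
// getActiveClients(), fetchPartnerFeed() and processBooking() are placeholders.
$clients = getActiveClients();              // ~100 clients

foreach ($clients as $client) {
    $feed = fetchPartnerFeed($client);      // external API call per client
    foreach ($feed->bookings as $booking) {
        processBooking($client, $booking);  // 3 - 95 bookings per client
    }
}

// Everything runs inside a single HTTP request, so with ~4,900 bookings
// per run the max_execution_time limit will be hit sooner or later.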
I just want an approach for building a database with live records, so please don't just downvote. I don't expect any code.
At the moment I have a MySQL database with about 2,000 users, and they are getting more. Each player/user has several points, which increase or decrease through certain actions.
My goal is that this database gets refreshed about every second, so that users with more points move up and others move down... and so on.
My question is: what is the best approach for this "live database" where records have to be updated every second? In MySQL I can run time-based actions that execute a SQL command, but I don't think this is the greatest way. Can someone suggest a good way to handle this? E.g. other database providers like MongoDB, or anything else?
EDIT
This doesn't work client-side, so I can't simply push/post it into the database from some time-based events. For explanation: a user is training his character in the application. This training (to go 1 level up) takes 12 hours. After the time has elapsed, the record should be updated in the database AUTOMATICALLY, even if the user doesn't send a POST request himself (i.e. if the user is not logged in); other users should still see the updated data in his profile.
You need to accept the fact that rankings will be stale to some extent. Your predicament is no different from that of any other gaming platform (or SO rankings, for that matter). Business decisions were put in place and constantly get reviewed for the acceptable level of staleness. Take the leaderboards on tags here, for instance. Or the recent change that has profile pages updated a lot more frequently, versus around 4 AM GMT.
Consider the use of MySQL Events. It is built-in functionality that replaces the need for cron tasks. I have 3 event-related links off my profile page if interested. You could calculate ranks on a timed schedule (matching your tolerance for staleness), and the users' requests for them would be fast (faster than the approach below from Gordon). On the con side, they are stale.
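For instance, a minimal Event sketch, assuming an illustrative users table with points and ranking columns (run once; the event scheduler must be enabled):

<?php
// Creates a MySQL Event that recomputes ranks every 5 minutes.
// Table/column names are illustrative; requires: SET GLOBAL event_scheduler = ON;
$pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

$pdo->exec("
    CREATE EVENT IF NOT EXISTS refresh_ranks
    ON SCHEDULE EVERY 5 MINUTE
    DO
      UPDATE users u
      JOIN (SELECT t.id, (@r := @r + 1) AS new_rank
            FROM (SELECT id FROM users ORDER BY points DESC) t,
                 (SELECT @r := 0) init) ranked ON ranked.id = u.id
      SET u.ranking = ranked.new_rank
");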
Consider not saving (writing) rank info at all, but rather focusing just on maintaining your other data, and getting your rankings on the fly. As an example, see this rankings answer from Gordon. It is dynamic, runs upon request, is non-stale at least at that moment, and would not require Events.
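In its simplest form (not Gordon's exact query), computing the rank on the fly per request, with the same illustrative schema, would look something like:

<?php
// On-the-fly ranking: computed per request, never stored, never stale.
// Schema names are illustrative. Ties share the same rank.
$pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

$leaderboard = $pdo->query("
    SELECT u.id, u.points,
           (SELECT COUNT(*) + 1 FROM users x WHERE x.points > u.points) AS ranking
    FROM users u
    ORDER BY u.points DESC
")->fetchAll(PDO::FETCH_ASSOC);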
Know that only you should decide what is tolerable for the UX.
I'm attempting to create a procedure that will run on the server every minute (more or less). I know I could achieve this with a cron job, but I'm concerned: let's say I have about 1,000 tasks (1,000 users that the system would need to check every minute). Wouldn't that kill the server?
This system is supposed to sync data from the Google AdWords API and do something with it. For example, it should read a campaign from Google, and every 1,000 impressions or clicks it should do something. So obviously I need to keep a connection to the AdWords API running to see the stats in real time. Imagine this procedure needing to run for 1,000 registered users.
What technology should I use in my case, when I need to run a heavy loop every minute?
Thanks a lot,
Oron
Ideally, if you're using a distributed workflow, you are better off servicing multiple users that way.
While there are many technologies at your disposal, it's difficult to pinpoint any specific one that will be useful when you haven't given sufficient information.
There are 2 fundamental issues with what you are trying to do, the first one more severe than the other.
The AdWords API doesn't give you real-time stats; the reports are usually delayed by 3 hours. See http://adwordsapi.blogspot.com/2011/06/statistics-in-reports.html for additional background information.
What you can do is pick a timeframe (e.g. once every 2 hours) for which you want to run reports, and then stick to that schedule. To give you a rough estimate: if you run 10 reports in parallel, and assume it takes 10 seconds to download and parse a report, that gives you a throughput of 1 report per second, so you could refresh 1,000 accounts in only 20-30 minutes. (The actual timing strictly depends on your internet connection speed, the load on the AdWords API servers, how big your clients are, which columns, segmentation and date ranges you request the data for, etc. For big clients, a single report could easily take several minutes.)
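As a rough sketch of the parallel-download part only (the report URLs and the parsing step are placeholders, not AdWords-specific client code):

<?php
// Download 10 reports in parallel with curl_multi;
// $reportUrls and parseReport() are placeholders.
$reportUrls = [/* ... 10 report download URLs ... */];

$mh = curl_multi_init();
$handles = [];
foreach ($reportUrls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until every download has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

foreach ($handles as $ch) {
    $body = curl_multi_getcontent($ch);
    // parseReport($body);  // placeholder for the parsing step
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);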
Even if you wanted to go much faster, the AdWords API won't allow you to. It has a rate-limiting mechanism that will limit your API call rate if you try to make too many calls in a very small period. See http://adwordsapi.blogspot.com/2010/06/better-know-error-rateexceedederror.html for details.
My advice is to ask this question on the official forum for the AdWords API - https://groups.google.com/forum/?fromgroups#!forum/adwords-api. You will definitely find users there who have tackled similar issues before.
You can save a variable containing a timestamp of the last run of your loop.
Then, when a user visits your page, check whether that timestamp is older than 1 minute.
If it is older, run your loop and check all users.
That way your loop only runs when there are users on your site, which saves a lot of server resources.
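A minimal sketch of that idea, using a file as the timestamp store (the path and checkAllUsers() are placeholders):

<?php
// Run the heavy loop at most once a minute, triggered by normal page visits.
// Note: without locking, two simultaneous visitors could both trigger the loop.
$stampFile = '/tmp/last_loop_run';

$lastRun = is_file($stampFile) ? (int) file_get_contents($stampFile) : 0;

if (time() - $lastRun >= 60) {              // older than 1 minute?
    file_put_contents($stampFile, time());  // claim this run first
    checkAllUsers();                        // placeholder for the heavy loop
}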
I'm writing a realtime web application, something similar to an auction site. The problem is that I need a daemon script, preferably PHP, that runs in the background, constantly launches queries against the MySQL DB and, based on some criteria (time and conditions from the result sets), updates other tables. Performance of the daemon is crucial. Sample use case: we have a deal that is going to expire in 2:37 minutes. Even if nobody is watching/bidding on it, we need to expire it exactly 2:37 after the time it started.
Can anybody advise on a programming technology/software that performs this kind of task best?
Thanks in advance
UPDATED: I need to perform a query when a deal expires, no matter whether it has ever been accessed by a user or not.
Why do you need to fire queries at time intervals? Can't you just change how your frontend works?
For example, in the "Deals" page, just show only deals that haven't expired - simplified example:
SELECT * FROM Deal WHERE NOW() <= DateTimeToExpire
Accordingly, for the "Orders" page, a deal can become a placed order only if its time hasn't expired yet.
Does your daemon need to trigger actions instantaneously? If you need a table containing the expired state as a column, you could just compute the expired value on the fly, or define a view. You could then use a daemon/cron job querying the view every 10 minutes or so if you have to send out emails or do some cleanup work, etc.
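For example, a minimal sketch of that view approach, assuming the Deal table from above (the DB credentials are placeholders):

<?php
// Run once: a view that exposes the expired state as a computed column.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

$pdo->exec("
    CREATE OR REPLACE VIEW DealWithState AS
    SELECT d.*, (NOW() > d.DateTimeToExpire) AS Expired
    FROM Deal d
");

// Then, from a cron job every 10 minutes or so: emails, cleanup, etc.
$expired = $pdo->query("SELECT * FROM DealWithState WHERE Expired = 1")
               ->fetchAll(PDO::FETCH_ASSOC);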