I have a Raspberry Pi Model B+, which means it has 512MB memory. I'm trying to update an SQL database on my Pi with PHP and I'm copying over transactions from a third party via their API. It's also a LEMP server setup if that makes a difference.
The issue is that when the page runs it updates the transactions (1,900 of them), but the page never finishes loading, i.e. it doesn't show the summary information; I simply get a blank page.
I admit this is the initial stage, and the code shouldn't have to update more than 100 transactions at a time after the initial copying of old transactions to my Pi, but I'm curious as to what the issue is exactly and how to manage it.
The code works fine with a couple of accounts where there are only a few hundred transactions per account. The code finishes and shows a few lines of info about how many accounts and transactions were updated.
The code doesn't finish when it gets to an account with nearly 2,000 transactions.
Things noticed:
Initially, running the code on just the one account, it updated 500 transactions and then stopped/crashed.
After increasing the www.conf max_children setting from 5 to 15 (I even tried 50), the code now updates all 2,000 transactions, but it still doesn't finish: the page shows no summary information and no header (which contains buttons etc.), just a blank screen.
I've tried adding set_time_limit() to the PHP code, but it doesn't seem to do much.
I had a look in the PHP, MySQL, etc. error logs under /var/log (including /var/log/messages) and there's nothing in them apart from standard events.
I'm happy to accept the limitations of my Pi but would love to learn why it's failing and how to manage it.
I haven't posted the code as there's a lot of it, but the logic is as below (with a sketch after the list); let me know what you would like to see:
Retrieve all transactions from 3rd party API and store in array.
Process array into a new array that only contains the needed data, formatting and editing certain fields.
Cycle through the array and insert each row into the SQL database via a prepared statement. (I briefly saw that SQL transactions exist but haven't looked into them.)
Count the number of accounts updated and number of transactions.
Print summary information.
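A minimal sketch of roughly that logic (table, column, and function names invented for illustration; the beginTransaction()/commit() wrapper is the SQL-transactions idea I mentioned but haven't actually tried yet):

    <?php
    // Sketch only — fetchTransactions() stands in for the third-party API call.
    $raw = fetchTransactions($accountId);          // 1. retrieve all transactions

    $rows = [];
    foreach ($raw as $t) {                         // 2. keep and format only needed fields
        $rows[] = [$accountId, date('Y-m-d', strtotime($t['date'])), (float) $t['amount']];
    }

    $stmt = $pdo->prepare(
        'INSERT INTO transactions (account_id, tx_date, amount) VALUES (?, ?, ?)'
    );

    $pdo->beginTransaction();                      // 3. one commit instead of ~2,000
    foreach ($rows as $r) {
        $stmt->execute($r);
    }
    $pdo->commit();

    echo count($rows) . ' transactions updated';   // 4./5. summary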
I have the following challenge... I created an appointment booking system with Laravel 8, which allows people to book appointments that are then available in the admin area for the employees.
The web app itself has been running for a couple of months, but now I am getting more and more users, which has already led a couple of times to the situation that the web server / MySQL was so overloaded that the website would not load any more.
Analysis has shown that too many queries are executed, as the calculation of open available time slots queries the bookings table in the DB, which has grown quite big over time (a 1.7 GB table).
Surely I can optimize the DB queries etc., but I am still afraid that too many concurrent web requests may lead to DB/web server blockage.
My question now is: what options do you see to prevent this situation?
Apparently the main cause is the available-time-slots check, which is executed every time the user selects a date in the dropdown (it then fires an AJAX call, which fetches the available time slots).
I was thinking about two possible solutions:
Queueing: would it make sense to execute this AJAX call as a queued job, so that too many requests can't block the DB?
Throttling: replace the AJAX call for getting the time slots with an internal API request? So instead of querying the DB directly, I would ask my backend API for the time slots, which could then throttle the requests if there are too many (see the sketch below).
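For illustration, a minimal sketch of what I mean by option 2, using Laravel's built-in rate limiter (the route and controller names are invented):

    // routes/api.php — sketch only
    use Illuminate\Support\Facades\Route;
    use App\Http\Controllers\TimeSlotController;

    // At most 30 time-slot requests per minute per client; excess
    // requests get a 429 instead of hitting the bookings table.
    Route::middleware('throttle:30,1')
        ->get('/timeslots/{date}', [TimeSlotController::class, 'index']);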
Or are there better options?
I hope I could explain the issue and my desired goal.
Thank you very much in advance!
Well, a question I've wanted to ask for a while, and it's becoming more and more the right time to ask it.
I am building a system in PHP that processes bookings for holiday facilities.
Every facility (hotel, motel, hostel, bed & breakfast, etc.) has its own login and is able to manage its own bookings.
Currently it runs on a system with a single database that separates the user data by hostname, user, and login.
The system is also equipped with the option to import bookings provided by partners/resellers.
I have created a specific page for doing this.
Let's say:
cron.examplesystem.com
This URL is called by the Cron task every 5 minutes to check if there are any new bookings or any pending changes / cancellations or confirmations.
Now the issue
This has been going just fine until now.
However, we had one instance for every client, and generally every call had to process somewhere between 3 and 95 bookings.
But I have now updated my system from one instance for every client to one instance for all clients.
So in the past:
myclient.myholidaybookings.example.com
I am now going to:
myholidaybookings.example.com
With one server to handle all clients instead of a separate server for each client.
This will put a lot of stress on the server, but in general that is not one of my worries, since I have worked hard to make it manageable and scalable.
But I have no clue how to approach this.
Because let's say we have about 100 clients with 3-95 bookings each (average 49), we'll have around 4,900 bookings or updates to process at a time.
For sure we'll be hitting a timeout sooner or later.
And this is what I want to prevent.
Now, all kinds of creative solutions can be found, but what's best practice? I want to create a solution that is solid and doesn't have to be reworked halfway to going live.
Summary:
Problem: I have a system that processes many API feeds in one call, and I am sure we'll hit a timeout during processing once the system gets populated with users.
Desired solution: a best-practice approach in PHP for handling and processing many API feeds without worrying about timeouts as the user database grows.
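To make the problem concrete, a simplified sketch of what the cron-called script does now (function and table names invented for illustration):

    <?php
    // Sketch only — one web request currently does all the work.
    $clients = getAllClients();                    // ~100 clients
    foreach ($clients as $client) {
        $bookings = fetchPartnerFeed($client);     // 3-95 bookings per client
        foreach ($bookings as $booking) {
            processBooking($client, $booking);     // insert / update / cancel / confirm
        }
    }
    // With ~4,900 bookings per run, a single HTTP request like this will
    // sooner or later exceed the web server's or PHP's time limit.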
I'm developing a tool for my company, that in very broad strokes is intended to message users some information (a link to a picture) if they decide they want to be notified when it comes online.
If it were just alerting them when the picture comes online, that would be easy: you don't have to schedule anything, just check the list to see if anyone wants to be messaged about the picture when it comes online, and do it.
But we also have a "We've got your request, the picture is not here yet but we'll message you when it is" kind of message, and a few "not here yet" messages that I have to send days later if the picture isn't online yet. And all these scheduled jobs need to be cancelled if at any moment the picture comes online, and we then send the message with its link.
I'll try to explain it in as much detail as I can:
1. The user asks to be notified.
2. Our web app takes note of the request (adds it to the DB).
3. Check if the file is already online. At this point in time the file might not have been uploaded yet, but may be mere seconds away from it. So if the picture is not online yet when the request is made, we want to wait 1-2 minutes before sending the "picture not yet here" message, just so we don't send a "not yet here" message and then a "here's your picture" one 5 seconds later.
4. We want to wait a few hours (1-3) and then send a new message asking them to be patient.
5. After another set amount of time (approx. 7 days) we want to send a last message, letting them know that the picture might never reach them because it's not being uploaded.
At any given time, even after point 5, if the picture comes online we want to cancel all these schedules and send the message with the picture.
I've been trying to learn how to do this, and through my search I've learned of three possible ways to achieve this functionality:
Option A: A single Cronjob executing every minute, that sweeps the database table searching if it's time to send one of those messages.
This option is easy to understand, although I'm afraid it might tax the database too much. I can use the shifty control panel that 1and1 has to set up that single Cronjob and call it a day.
Option B: Programmatically write cron jobs for every message that gets scheduled.
This sounds like it would be more "efficient", or at least less taxing on the DB, but I'm not sure cron is supposed to work like that. It's normally used to schedule tasks that repeat themselves, isn't it? This would need a whole lot of functions to work (read the crontab, add a line, search for a line, edit a line, delete a line). The problem here is that I don't know how to edit the crontab that's on 1and1's servers via PHP. I have tried to contact them, but their support has not been helpful at all.
Option C: The "at" function in linux.
This I just learned about. It looks like it would do what I want: schedule a task that happens only once, and its structure seems pretty easy to handle. The problem here is threefold: 1- I don't know if PHP can execute command lines, 2- I don't know if the server at 1and1 has the "at" program installed, 3- I don't know if I can get a command line to execute a PHP file with the arguments to make it work.
And if any of these can be done, I don't know how.
As you see there are plenty of things I don't know about, but I've been trying to inform myself and learn. I just ask here because I'm at the end of my rope.
These options I listed are not an exhaustive list, they are just the methods I've found.
Which method would serve my purpose better? And how to do it?
Relevant facts:
Our host and database are located within 1and1, in a virtual server (meaning, we don't have a complete server for us, but share one with other clients)
Although we have "unlimited" database space and queries, there is still a hard limit on how many queries you can make in a certain amount of time.
I'm new-ish to using linux, and I have not worked with PHP for years (until I've got this job!), so it would be better if your explanation doesn't assume deep knowledge on my part.
Programmatically write cron jobs for every message that gets scheduled.
God no, the biggest issue you'll have with that is that you will have to anticipate in advance what kinds of messages you'll have to send when, and that clashes with your ability to easily cancel messages. You will generally have to worry about a lot of state to manage (more on that later), which is a lot of hassle. You also need to ensure your scheduled jobs are cleaned up again afterwards, since cron can only set repeating tasks.
The "at" function in linux.
This is basically cron but for non-repeating tasks. That's better, but is still stateful. Especially with shared hosts it's also somewhat unpredictable whether your code will always execute on the same machine or when a machine might reboot. In those circumstances you may lose your scheduled jobs, so this is a no-go.
A single Cronjob executing every minute, that sweeps the database table searching if it's time to send one of those messages.
Yes, this is the way to do it. The biggest advantage here is that it's stateless, in the sense that it always picks up the exact current contents of your database, so it lets you easily manage what the job should do on its next run without having to anticipate that at the time you schedule an event.
I'm afraid it might tax the database too much.
It's one query per minute (if you write it well). Presumably every single page load of your website will incur one or multiple queries, and a properly built site should be able to handle hundreds to thousands of loads per second. One more query per minute isn't going to tank it. If it does, you have bigger issues.
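A minimal sketch of such a sweep (the table and column names are hypothetical):

    <?php
    // Runs once a minute from cron. Assumed schema: a `notifications` table
    // with due_at (DATETIME) and sent_at (nullable DATETIME).
    $due = $pdo->query(
        "SELECT id, user_id, type
           FROM notifications
          WHERE sent_at IS NULL
            AND due_at <= NOW()"
    );
    foreach ($due as $row) {
        sendMessage($row['user_id'], $row['type']);   // your sending logic
        $pdo->prepare('UPDATE notifications SET sent_at = NOW() WHERE id = ?')
            ->execute([$row['id']]);
    }
    // When a picture comes online, you simply DELETE its pending rows —
    // there is no other state to cancel.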
I would personally choose option A; I'm already using it on a project I worked on.
In your case, having the data on a shared hosting, I would create a cronjob that runs every minute (using an online service) and hits a php file somewhere in your folders, checking in a database table if anything must be done.
You should write some code that handles all the notifications you want to send and when, creating, for each of them, a row in the db table with the time of execution and all the details ready to be used to create the notification and to send it out.
The entire thing would work more or less as follows:
- Something happens that requires the creation of a notification to be sent out in 5 minutes: the row is created in the db table with the unix time or date of 5 minutes from now.
- A notification needs to be sent out 3 days from now, you use the same procedure as above.
The cron job runs every minute and checks for expired orders (anything with date <= now); if there are any, a script takes care of these rows and executes the orders (sending out only the notifications required).
The database wouldn't be bothered too much, having to perform only 1 query per minute (only checking for expired orders).
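A sketch of the scheduling side, creating the "order" row when a notification is needed in 5 minutes (column names invented):

    <?php
    // Sketch only — one row per scheduled notification.
    $pdo->prepare(
        'INSERT INTO orders (type, payload, due_at)
         VALUES (?, ?, DATE_ADD(NOW(), INTERVAL 5 MINUTE))'
    )->execute(['not_here_yet', json_encode(['user_id' => $userId])]);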
I have a web app at mmatakedown.net. It's basically a social game, fantasy sports-esque, where you predict fight outcomes and score points and perks for getting them correct, with a leaderboard, etc.
Right now, I have a page I wrote to administer "results". What I'd like to do is use a cron job (or a better option?) to check Twitter periodically (on the day of a fight card) and look for tweets that note the results of the fights (from more trusted accounts to start, perhaps). For example, it might know that a card locked today and that the deadline has passed, so it's time to start searching for results. If two fighters were named Brooks and Jones, for example, it might look for something like "Jones def." or "Brooks def." to find out who the victor was, then search for something like "sub" or "ko" or "dec" to find out the method of victory.
Once it tallies a certain # of tweets to confirm the result, it would update the DB with the result, then set off a series of updates and notifications based on the result, who picked what, what scores were updated, any milestones reached, etc.
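For illustration, the matching could be as simple as this rough sketch (fighter names hard-coded, recordResult() invented):

    <?php
    // Rough sketch of the tweet-matching idea.
    if (preg_match('/\b(Jones|Brooks)\s+def\./i', $tweet, $winner) &&
        preg_match('/\b(sub|ko|tko|dec)\b/i', $tweet, $method)) {
        // Tally this tweet toward confirming the result.
        recordResult($winner[1], strtolower($method[1]));
    }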
What would be the best approach for this? I currently write the site with PHP/MySQL/Ajax/jQuery.
I'm not sure about the best approach, but this functionality ought to be independent of your web application (i.e. the PHP/ajax/jquery code base).
It's a program that can run via cron, check the database for things that need to be updated, and then update the database appropriately.
You could run a cron job every hour or whatever you think is necessary in order to keep the database relatively up to date. Or you could get fancier I suppose.
I think cron is entirely appropriate though.
I have written Python programs that were invoked via cron, did some Twitter API calls, and updated a database. The application was quite different, but Python worked well for the task. It has built-in libraries for everything you'll need (e.g. JSON, http) except the MySQL access, but that's a quick package install.
One complication: you may start searching for tweets and find a few indicating the fight was over, but not enough to change your database. You will probably want a way to ensure that a subsequent invocation of the program has access to this information.
I think generally that when you write these offline batch updating scripts that you should have a table in your database for them to log to. For example, when did they start, how many tweets did they inspect, when did they exit, did they exit successfully, etc.
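A minimal sketch of that kind of run log (the table and columns are hypothetical):

    <?php
    // Assumed table: job_runs(id, started_at, finished_at, tweets_inspected, ok).
    $pdo->prepare('INSERT INTO job_runs (started_at) VALUES (NOW())')->execute();
    $runId = $pdo->lastInsertId();

    $inspected = updateResultsFromTwitter($pdo);   // the actual batch work

    $pdo->prepare(
        'UPDATE job_runs
            SET finished_at = NOW(), tweets_inspected = ?, ok = 1
          WHERE id = ?'
    )->execute([$inspected, $runId]);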
Mostly I find the answers to my questions on Google, but now I'm stuck.
I'm working on a scraper script which first scrapes some usernames off a website, then gets every single detail of each user. There are two scrapers involved: the first goes through the main page and gets the first name, then gets the details from its profile page, then it moves on to the next name...
The first site I'm scraping has a total of 64 names displayed on one main page, while the second one has 4 pages with over 365 names.
The first one works great; however, the second one keeps giving me a 500 internal error. I've tried limiting the script to scrape only a few names, which works like a charm, so I'm more than sure that the script itself is OK!
The max_execution_time in my php.ini is set to 1500, so I guess that's not the problem either; however, something is causing the error...
I'm not sure if adding a sleep command after every 10 names, for example, will solve my situation, but well, I'm trying that now!
so if any of you have any idea what would help solve this situation, i would appreciate your help!
thanks in advance,
z
Support said I can raise the memory up to 4 gigabytes.
Typical money-gouging support answer. Save your cash and write better code, because what you are doing could easily be run from the shared server of a free web hosting provider, even with their draconian resource limits.
Get/update the list of users first as one job, then extract the details in smaller batches as another. Use a multi-row (bulk) INSERT to reduce round trips to the database; it also runs much faster than looping through individual INSERTs.
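A sketch of a multi-row INSERT with PDO — one statement instead of hundreds (table and column names invented):

    <?php
    // Sketch only: build the placeholders and bind all values at once.
    $placeholders = [];
    $values = [];
    foreach ($users as $u) {
        $placeholders[] = '(?, ?)';
        array_push($values, $u['name'], $u['details']);
    }
    $pdo->prepare(
        'INSERT INTO users (name, details) VALUES ' . implode(',', $placeholders)
    )->execute($values);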
Usernames and details are essentially a static list, so there is no rush to get all the data in real time. Just nibble away with a cron job fetching the details and eventually the script will catch up with new usernames being added to the incoming list, and you end up with a faster, leaner, more efficient system.
This is definitely a memory issue. One of your variables is growing past the memory limit you have defined in php.ini. If you do need to store a huge amount of data, I'd recommend writing your results to a file and/or DB at regular intervals (and then free up your vars) instead of storing them all in memory at run time.
get user details
dump to file
clear vars
repeat..
If you set your execution time to infinity and regularly dump the vars to file/DB, your PHP script should run fine for hours.
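A minimal sketch of that loop (scrapeUserDetails() and the file name are invented):

    <?php
    // Sketch only — keep memory flat by writing out and freeing as you go.
    set_time_limit(0);                           // let the script run as long as needed
    foreach ($usernames as $name) {
        $details = scrapeUserDetails($name);     // get user details
        file_put_contents('results.csv',
            implode(',', $details) . "\n", FILE_APPEND);  // dump to file
        unset($details);                         // clear vars
    }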