We're currently working with a free warehouse management system. It exports our store data to the external price search engine every 6 hours. We need to do this hourly due to new legal requirements.
Here comes the riddle: you need to click a button on the web interface of our management system to start the export. This calls "host/admin/exportformate.php?action=export&kExportformat=30&token=1800b8e38ded2ead0f060c18fdbcbfdb". We solved the problem of losing the session with a Firefox add-on that reloads index.php every 30 seconds.
Now when I open a new window or tab with the URL above, everything runs fine. When I try to open the URL from a batch file to set up a cron job for the hourly export, the opened window/tab only calls "/admin/exportformate.php?action=export", drops the rest of the URL, and doesn't trigger the action at all.
Is there a way to call this exact PHP action hourly, ideally with credentials? (No, .htaccess doesn't work with our management system...)
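For what it's worth, truncation at the first "&" is typical of an unquoted URL in a Windows batch file, where "&" separates commands, so wrapping the URL in double quotes may already fix the batch approach. Alternatively, here is a hedged sketch of an hourly caller as a PHP CLI script, run via cron or the Windows Task Scheduler; the PHPSESSID cookie name is PHP's default and an assumption here:

<?php
// Hedged sketch: call the export action from the PHP CLI, scheduled
// hourly. The URL is the one from above; the cookie name PHPSESSID is
// PHP's default and an assumption -- your system may differ.
$url = 'http://host/admin/exportformate.php?action=export'
     . '&kExportformat=30&token=1800b8e38ded2ead0f060c18fdbcbfdb';

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_COOKIE         => 'PHPSESSID=your-session-id', // assumption
    CURLOPT_TIMEOUT        => 300,
]);
if (curl_exec($ch) === false) {
    fwrite(STDERR, 'Export call failed: ' . curl_error($ch) . PHP_EOL);
}
curl_close($ch);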
Thanks in advance!
Related
I want to run a PHP script every 5 seconds without using cron jobs. How is that possible in PHP?
I want to update user data every 5 seconds. The script runs when I refresh the page, but I want it to run whether or not the page is open in a browser.
How can I achieve this?
One way to do it would be to have a text file or a database entry that holds the time of the last run in UNIX time.
Then on all (or selected) pages you add something like:
if ($lastrun + 5 < time()) {
    // Run the user update
}
This means that when a user or visitor on your page goes to one of the "selected" pages with the code above, that visitor will "run the update".
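A minimal sketch of the whole check, assuming a text file as the timestamp store (the file name lastrun.txt is illustrative; a database column works the same way):

<?php
// Hedged sketch of the last-run check described above.
$file    = __DIR__ . '/lastrun.txt';           // assumed location
$lastrun = is_file($file) ? (int) file_get_contents($file) : 0;

if ($lastrun + 5 < time()) {
    file_put_contents($file, (string) time()); // record this run first
    // ... run the user update here ...
}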
You need some PHP basics here: PHP only runs when a page is requested, so it's impossible to run PHP without something requesting the page. Somehow the page must be reloaded every 5 seconds; only then can the script run every 5 seconds.
So you must use cron or something like it. Alternatively, you can use an old computer (NOT RECOMMENDED) that keeps reloading the page; to reload the page you can use browser plugins like Auto Reload (for Chrome).
But almost every hosting company provides free cron jobs. Please search your cPanel for that, or mail your hosting provider for help. Cron is the best way to do this job for free.
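Note that cron cannot schedule below one-minute granularity, so a common workaround is to start the script once per minute and loop inside it. A hedged sketch, where runUserUpdate() is an assumed placeholder for your update logic:

<?php
// Hedged sketch: started once per minute by cron (e.g. a crontab entry
// like "* * * * * php /path/to/update.php", paths assumed), this runs
// the update roughly every 5 seconds for that minute.
for ($i = 0; $i < 12; $i++) {
    runUserUpdate();   // assumed placeholder for your update logic
    sleep(5);
}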
I've heard that the WordPress cron job only works when a trigger like "a user visits the website" occurs. What if I want a cron job that runs a .php file automatically every night? The complete scenario: data coming from a third-party API is saved by one of my custom PHP files, say savedata.php, which writes it to a text file (for now), and another PHP file, say executecron.php, should run automatically at 11 pm at night. How can I do this in WordPress without any users visiting my website? Note that I do not have permission to log into the dashboard, but I can work on all the PHP files.
Also, please have patience with me, as I'm totally new to WordPress development.
Thanks
Yes, WordPress cron jobs only run once a user visits the site.
So, since you want the file to run automatically at a certain time, you need to configure a cron job on the server and give it the path of the file you want run at a fixed interval.
The setting is usually found in the cPanel menu, but it might vary by hosting provider, or you can ask the hosting provider's support team to set it up.
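For illustration, a crontab entry along these lines would run the file nightly at 11 pm (the PHP binary path and the WordPress path are assumptions; adjust them to your host):

# Hedged example: run executecron.php every night at 23:00.
0 23 * * * /usr/bin/php /path/to/wordpress/executecron.php

Some hosts also let you point the cron job at a URL (e.g. via wget) instead of a file, which works the same way for this purpose.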
Here are the useful links:
SiteGround Real Cron Job
Cron Job for WordPress
If you still find issues, please contact me to help solve them.
I've built a scraper to get some data from another website. The scraper currently runs at the command line in a screen, so the process never stops. Between requests I've set an interval to keep things calm. A single scrape can turn up around 100 files that need to be downloaded, and this process also has an interval after every download.
Now I want to add functionality to the back-end to scrape on the fly. Everything works fine; I get the first data set, which takes only 2 requests. Within the returned data I have an array of files that need to be downloaded (can be 10, can be 100+). I would like to create something so the user can see in real time how far along the download process is.
The thing I'm facing: when the scraper has 2 jobs to do in a browser window, with up to 20+ downloads including the intervals to keep things calm, it takes too much time. I'm thinking about saving the files that need to be downloaded into a database table and handling that part of the data processing with another shell script (screen) or cron job.
I'm wondering whether my thoughts are on the right track, overkill, or whether there are better patterns to handle these kinds of processes.
Thanks for any advice.
P.S. I am developing in PHP.
If you think that is overkill, you can run the script and wait until the task is finished before running it again.
Basically you need to implement a message queue, where the HTTP request handler (front controller?) emits a message to fetch a page, and one or more workers do the job, optionally emitting more messages to the queue to download files.
There are plenty of MQ brokers, but you can implement your own with a database as the queue storage.
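A minimal sketch of the database-as-queue idea, assuming a PDO connection and a downloads table (id, url, status); all names are illustrative:

<?php
// Hedged sketch of a database-backed queue. The "downloads" table
// (id, url, status) and the connection details are assumptions.
$db = new PDO('mysql:host=localhost;dbname=scraper', 'user', 'pass');

// Front controller: enqueue every file URL found by the scrape.
function enqueue(PDO $db, array $urls): void {
    $stmt = $db->prepare("INSERT INTO downloads (url, status) VALUES (?, 'pending')");
    foreach ($urls as $url) {
        $stmt->execute([$url]);
    }
}

// Worker (run in a screen or via cron): process one pending job.
function workOne(PDO $db): bool {
    $row = $db->query("SELECT id, url FROM downloads WHERE status = 'pending' LIMIT 1")
              ->fetch(PDO::FETCH_ASSOC);
    if (!$row) {
        return false; // queue is empty
    }
    // ... download $row['url'] here, then sleep() for your interval ...
    $db->prepare("UPDATE downloads SET status = 'done' WHERE id = ?")
       ->execute([$row['id']]);
    return true;
}

The front-end can then poll the count of 'done' rows versus the total to show the download progress in real time.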
I want to perform a task with a PHP file (updating a feed), which I want done automatically once a day. I DON'T want to have to load the file in a browser by hand. The file itself could be anything (very small and fast), but it needs to run every day without using cron jobs.
If you're on a Windows machine, you can use a scheduled task (here are the instructions for Windows 7, but Google "run scheduled task" to find similar pages for other versions). It has much the same options as cron, with a simple interface.
Two other possible hacks:
Have the URL for the feed itself be a PHP script that updates the feed and outputs it directly. Then you could put a cache in front of that URL so it only refreshes once a day (for instance the free tier of Cloudflare).
Have the PHP script create a webpage that refreshes itself once a day using the meta refresh tag, then open a browser window and never close it; see the sketch below.
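A minimal sketch of that second hack, where updateFeed() is an assumed placeholder for your feed update logic:

<?php
// Hedged sketch: the page reloads itself once a day (86400 seconds)
// and updates the feed on every load.
echo '<meta http-equiv="refresh" content="86400">';
updateFeed(); // assumed placeholder for your feed update logic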
If the page you have created is exposed to the network, you could theoretically run a cron job from another machine on said network and call curl against the page:
curl http://server/yourphp
I created a script that gets data from some web services and our database, formats a report, then zips it and makes it available for download. When I first started, I made it a command-line script so I could watch the output as it came and get around the script timeout limit you hit when viewing in a browser. But because I don't want my users to have to use the command line or run PHP on their computers, I want to make this run from our web server instead.
Because this script could take minutes to run, I need a way to let it process in the background and then start the download once the file has been created successfully. What's the best way to let this script run without triggering the timeout? I've attempted this before (using backticks to run the script separately, and such) but gave up, so I'm asking here. Ideally, the user would click the submit button on the form to start the request and then be returned to the page, instead of staring at a blank browser window. When the zip file exists (meaning the process has finished), the page should notify them (via AJAX? a reloaded page? I don't know yet).
This is on Windows Server 2007.
You should run it in a different process. Make a daemon that runs continuously, hits the database, and looks for a flag, like ShouldProcessData. Then, when the website is hit, switch the flag to true. Your daemon process will see the flag on its next iteration and begin the processing. Stick the results into the database. Use the database as the communication mechanism between the website and the long-running process.
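A minimal sketch of that daemon loop, assuming a PDO connection and a one-row flags table; the DSN, credentials, and column names are assumptions:

<?php
// Hedged sketch of the daemon loop described above.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // assumed

while (true) {
    $flag = $db->query('SELECT should_process FROM flags WHERE id = 1')
               ->fetchColumn();
    if ($flag) {
        $db->exec('UPDATE flags SET should_process = 0 WHERE id = 1');
        // ... generate the report, zip it, and store the result path
        //     in the database for the website to pick up ...
    }
    sleep(10); // poll interval
}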
In PHP you have to say what timeout you want for your process.
See the PHP manual: set_time_limit().
You may have another problem: the timeout of the browser itself (could be around 1~2 minutes). While that timeout should be changeable within the browser (for each browser), you can usually prevent the client-side timeout from triggering by sending some data to the browser every 20 seconds or so (like the header for the download; you can then send other headers, like encoding, etc.).
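A sketch of both ideas together, where doChunkOfWork() is an assumed placeholder that returns false once the job is done:

<?php
// Hedged sketch: lift PHP's own execution limit, then trickle output
// so the browser connection stays alive during the long job.
set_time_limit(0);

while (doChunkOfWork()) {   // assumed placeholder for one unit of work
    echo ' ';               // send a byte to keep the connection alive
    if (ob_get_level() > 0) {
        ob_flush();         // flush PHP's output buffer if one is active
    }
    flush();                // push the byte out to the browser
}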
Gearman is very handy for this (create a background task, let JavaScript poll for progress). It does of course require having Gearman installed and workers created. See: http://www.php.net/gearman
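A hedged sketch of that split, assuming the pecl gearman extension and a gearmand server on localhost; the task name build_report and its payload are illustrative:

<?php
// Web side: fire the background task and return to the user right away.
$client = new GearmanClient();
$client->addServer(); // localhost:4730 by default
$client->doBackground('build_report', json_encode(['user' => 42]));

// Worker side (a separate, long-running CLI process):
$worker = new GearmanWorker();
$worker->addServer();
$worker->addFunction('build_report', function (GearmanJob $job) {
    $params = json_decode($job->workload(), true);
    // ... build and zip the report; write progress somewhere the
    //     JavaScript poller can read it ...
});
while ($worker->work());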
Why don't you make an AJAX call from the page where you want to offer the download, then just wait for the AJAX call to return, and also call set_time_limit(0) on the other page?