I want to run a Laravel scheduled command on Windows 10 using Task Scheduler. I created a basic task in the scheduler and it shows as running, but no data is added to the database. When I run "php artisan schedule:run" manually, it works perfectly. I am using Laravel and Homestead.
I added these two lines while creating the task in the scheduler:
C:\xampp\php\php.exe (why do we have to add this when I don't even use XAMPP anymore? I think this is the part that is causing the issue)
C:\projects\project-name\artisan schedule:run
I would really appreciate it if someone could guide me through this. Thanks.
Update your task scheduler command to this:
C:\xampp\php\php.exe C:\projects\project-name\artisan schedule:run
C:\xampp\php\php.exe does not mean you are using XAMPP; it is just the PHP executable, which happens to be found inside your XAMPP folder. We need that executable to run the artisan file (located in C:\projects\project-name\) with the schedule:run argument.
You can add the PHP executable to your PATH environment variable so you can write the command simply as php C:\path\to\artisan schedule:run.
Also, check the Task Scheduler history/logs so you can see what the task actually tried to do.
As for your issue: yes, C:\xampp\php\php.exe on its own causes a problem. Try typing just that command in cmd. What happens? It just sits there, waiting for input. That is also what is happening in your scheduled task when artisan schedule:run is not passed as an argument.
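For reference, the task's Action can be filled in like this (the paths are the ones from your example):
Program/script:  C:\xampp\php\php.exe
Add arguments:   C:\projects\project-name\artisan schedule:run
Start in:        C:\projects\project-name
With the artisan path in "Add arguments", php.exe is no longer launched without a script, so the task doesn't just sit there.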
Is there any way to set up git so that it listens for updates from a remote repo and pulls whenever something changes? The use case is that I want to deploy a web app using git (so I get version control of the deployed application) but want to put the "central" git repo on GitHub rather than on the web server (GitHub's interface is just soooo nice).
Has anyone gotten this working? How does Heroku do it? My Google-fu is failing to give me any relevant results.
Git has "hooks", actions that can be executed after other actions. What you seem to be looking for is "post-receive hook". In the github admin, you can set up a post-receive url that will be hit (with a payload containing data about what was just pushed) everytime somebody pushes to your repo.
For what it's worth, I don't think auto-pull is a good idea -- what if something wrong was pushed to your branch ? I'd use a tool like capistrano (or an equivalent) for such things.
On unix-likes you can create cron job that calls "git pull" (every day or every week or whatever) on your machine. On windows you could use task scheduler or "AT" command to do the same thing.
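For example, a crontab entry along these lines pulls every five minutes (the repository path and log file are placeholders for your own setup):
*/5 * * * * cd /var/www/myapp && git pull >> /var/log/git-autopull.log 2>&1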
There are continuous integration programs like Jenkins or Bamboo, which can detect commits and trigger operations like build, test, package, and deploy. They do what you want, but they are heavyweight, hard to configure, and in the end they may just poll the git repository periodically, which has the same effect as calling git pull from cron every minute.
I know this question is a bit old, but you can auto-pull your project using git, a webhook, and PHP (assuming your project involves a web server).
See my gist here:
https://gist.github.com/upggr/a6d92e2808e9628ebe0d01fd93569f4a
As some have noticed after trying this, if you use PHP's exec(), it turns out that solving the permissions is not that simple.
The user that executes the command might not be your own, but www-data or apache.
If you have root/sudo access, I recommend you read this blog post by Jonathan.
When you aren't allowed to, or can't, solve the permissions:
My solution was a bit creative. I noticed I could create a script under my username with a loop, and git pull would work fine. But that, as pointed out by others, brings up the problem of running a lot of useless git pulls every, say, 60 seconds.
So here are the steps to a more delicate solution using webhooks:
Deploy key: go to your server and type:
ssh-keygen -t rsa -b 4096 -C "deploy" to generate a new deploy key; no write permissions are needed (read-only is safer). Copy the public key to your GitHub repository settings, under "Deploy keys".
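If you accepted the default file location when generating the key, you can print the public key to copy into GitHub with:
cat ~/.ssh/id_rsa.pub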
Webhook: go to your repository settings and create a webhook. Let's assume the payload address is http://example.com/gitpull.php
Payload: create a PHP file with the code example below in it. The purpose of the payload handler is not to run git pull itself, but to warn the script that follows that a pull is necessary. Here is the simple code:
gitpull.php:
<?php
/* Deploy (C) by DrBeco 2021-06-08 */
echo("<br />\n");
/* switch to the repository and drop a flag file telling gitpull.sh that a pull is needed */
chdir('/home/user/www/example.com/repository');
touch('GITPULLMASTER');
?>
Script: create a script in your preferred folder, say, /home/user/gitpull.sh with the following code:
gitpull.sh
#!/bin/bash
# watch for the GITPULLMASTER flag file created by gitpull.php and pull when it appears
cd /home/user/www/example.com/repository
while true ; do
    if [[ -f GITPULLMASTER ]] ; then
        git pull > gitpull.log 2>&1
        # keep the flag, stamped with the pull date, for later debugging
        mv GITPULLMASTER GITPULLMASTER.`date +"%Y%m%d%H%M%S"`
    fi
    sleep 10
done
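Remember to make the script executable before running it:
chmod +x /home/user/gitpull.sh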
Detach: the last step is to run the script in detached mode, so you can log out and keep the script running in the background.
There are two ways of doing that; the first is simpler and doesn't need the screen software installed:
disown:
run ./gitpull.sh & to put it in the background
then type disown -h %1 to detach it, and you can log out
screen:
run screen
run ./gitpull.sh
press Ctrl+A then D to detach, and you can log out
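If you later want to check on the script, you can reattach the session with:
screen -r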
Conclusion
This solution is simple: you avoid messing with keys, passwords, permissions, sudo, root, etc., and you also prevent the script from flooding your server with useless git pulls.
The way it works is that the script checks whether the file GITPULLMASTER exists; if not, it goes back to sleep. Only if the file exists does it run a git pull.
You can change the line:
mv GITPULLMASTER GITPULLMASTER.`date +"%Y%m%d%H%M%S"`
to
rm GITPULLMASTER
if you prefer a cleaner directory. But I find it useful for debugging to keep the pull date recorded (and untracked).
For our on-premises Windows test servers, we use Windows Task Scheduler tasks, set to run every 3 minutes, pulling from Bitbucket Cloud to repositories on those servers. While not instantaneous, it meets our needs, and has proven to be reliable.
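As a rough sketch (the git path and repository folder below are illustrative assumptions, not our exact setup), each task's action is along the lines of:
Program/script:  "C:\Program Files\Git\cmd\git.exe"
Add arguments:   -C C:\inetpub\sites\myapp pull
with the task's trigger set to repeat every 3 minutes.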
I have a command that looks like the following:
php bin/console rabbitmq:multiple-consumer -w run_task
The command above has an endless while loop; it's meant to be that way because it's a listener that consumes from the queue. Is there a way to run this command in the background so that I don't have to keep 10 terminal tabs open all the time? If not, what is the solution?
For me this question is more OS-specific than PHP-specific. I would solve it by using system tools that make it possible to run tasks in the background, e.g. screen (Linux).
If you want a command that does this, you can write a wrapper command using the Symfony Process component that runs the real task inside a screen instance.
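For example, the listener could be started in a detached screen session straight from the shell (the session name here is just an illustration):
screen -dmS rabbitmq_consumer php bin/console rabbitmq:multiple-consumer -w run_task
screen -r rabbitmq_consumer
The first command starts the consumer detached in the background; the second reattaches the session when you want to look at its output.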
Can someone advise how I can keep my custom artisan command running forever, as a daemon?
I saw the many tutorials about queues, but that doesn't exactly fit. I am trying to implement "subscribe" with PubNub's PHP library, and this seems like the best way, unless I missed something?
Thanks in advance!
If you run the artisan command from the command line, it can already run indefinitely/forever. You don't need to do anything.
I have an application that has been running one single artisan command for 97 days straight at the moment.
You then just need to make sure it has not crashed for some reason, using something like Supervisor or a web monitoring service like Eyewitness.io.
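For Supervisor, a minimal program definition could look roughly like this (the program name, paths, and user below are assumptions for illustration, not a drop-in config):
[program:my-artisan-listener]
command=php /var/www/myapp/artisan mycommand:listen
autostart=true
autorestart=true
user=www-data
stdout_logfile=/var/log/my-artisan-listener.log
After saving it, something like supervisorctl reread followed by supervisorctl update should pick it up.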
This will help you run an artisan command forever:
nohup php artisan yourcommand:abc > mylog.log 2>&1 & echo $! >> save_pid.txt
When you want to kill this process, get the PID from the save_pid.txt file and run:
kill <pid>
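Since the PID was written to save_pid.txt by the command above, you can also do it in one step:
kill $(cat save_pid.txt)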
I have a PHP file that calls exec() on a C++ .exe. When the .exe finishes running, I need to run the PHP file again, and repeat.
I am wondering what the best way to do this is. This .php file has no user interaction and needs to be completely automated.
EDIT: Found another solution that kills the process if it is already running, which covers this issue well. See the posts on this page:
http://php.net/manual/en/function.getmypid.php
Here's a simple Linux line that starts up the script in the background and uses the watch command to restart the script whenever it finishes running:
watch -n 0 php /path/to/script.php &
Once you've started it like this, you can use the ps command to list running processes and find the process ID for watch, and then use kill process_id to stop watch.
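For example (assuming the watch command line shown above):
ps aux | grep "watch -n 0"     # the PID is in the second column
kill <watch_pid>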
It's probably not best practice to run commands like this, but it's a quick and easy solution that doesn't require any special access privileges on the system (whereas using cron might), and doesn't involve editing your code (whereas a CodeIgniter solution will).
I haven't used CodeIgniter before, but there seems to be a solution described in the wiki.
Depending on how you can access the system (whether you are an admin or not) and on how you plan to update the automated commands, IMHO you could use either solution (a Linux crontab or a CodeIgniter cron script).