Open URL via cmd without opening a browser - php

I wanted to use my local server, which is running Windows 7, to take advantage of the Task Scheduler to set up some cron jobs for some of my PHP files.
I can do this currently by:
start http://theurl
Which opens in my default browser. However, I was hoping to accomplish this without actually opening a browser, so that when I come back to my computer after a few days I don't have millions of Chrome windows open.
How can I load a URL from Task Scheduler via cmd without opening a browser?

I was able to accomplish the cron job by using a program called wget. I set up Task Scheduler to run wget.exe at my specified time with these arguments:
wget -q -O - http://theurl.com > tmp.txt
This would load the website and store the output to a temporary text file, which is overwritten the next time it is run.

If you just want to run some PHP files, you don't need a browser. You can just run them from the command line:
php -f /path/to/php/file.php
However, if you really need to access a page, you can do several things, such as calling file_get_contents() or making a cURL request from PHP.
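For example, a minimal cURL sketch (the URL is a placeholder):

<?php
// Fetch the page quietly; nothing is displayed and no browser is involved.
$ch = curl_init('http://theurl.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of echoing it
curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // give up if the site doesn't answer
$response = curl_exec($ch);
if ($response === false) {
    error_log('Request failed: ' . curl_error($ch));
}
curl_close($ch);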

You don't need cmd or shell access. If your host has the HTTP wrapper enabled, a call to file_get_contents() is all you need:
file_get_contents('http://theurl');
You can also use fopen() if you're not interested in the response from the server.
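For instance, just opening and closing the stream is enough to trigger the request (a sketch):
$handle = fopen('http://theurl', 'r'); // the request is sent when the stream opens
if ($handle !== false) {
    fclose($handle);
}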

Related

PHP WebSocket server

I use Phirehose to get a live and continuous stream of the Twitter UserStream API. So far I have been able to execute php -S localhost:8000 index.php and it fires up and works fine.
Now I want to use the data from the CLI script in Laravel.
1) How can I stream the Phirehose data to Laravel?
2) How can I get this script to stay active in the background on a non-GUI droplet at DigitalOcean?
In your Phirehose script, write each tweet to a database. In your Laravel application (which I am assuming is being accessed by users, from their browsers?), query that database. The database need not be as heavy as MySQL; it could instead be memcache, Redis or one of the NoSQL options.
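For instance, a rough sketch of the database-writing side, assuming Phirehose's enqueueStatus() callback (the method a Phirehose subclass implements to receive each raw status as a JSON string) and a hypothetical tweets table with id and payload columns:

<?php
// Sketch only: credentials, table name and columns are placeholders.
class DbConsumer extends OauthPhirehose
{
    private $pdo;

    public function enqueueStatus($status)
    {
        $data = json_decode($status, true);
        if (!is_array($data) || empty($data['id_str'])) {
            return; // skip keep-alives and malformed payloads
        }
        if ($this->pdo === null) {
            $this->pdo = new PDO('mysql:host=localhost;dbname=stream', 'user', 'pass');
        }
        $stmt = $this->pdo->prepare('INSERT IGNORE INTO tweets (id, payload) VALUES (?, ?)');
        $stmt->execute(array($data['id_str'], $status));
    }
}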
For getting a Phirehose script to run in the background I would login over ssh and do this:
nohup php myscript.php 2>&1 &
(This assumes you have installed the php-cli package for your distro.)
The nohup part means you can logout and it will keep running. The 2>&1 means both stdout and stderr messages will be written to nohup.out. The & at the end is what puts it into the background.
(In fact I do something a bit more complicated: I have my Phirehose script write to a keep-alive file every 10 seconds. I then have another PHP script that is started on 1-minute cron, that will check that keep-alive file is being updated and, if not, it will start the phirehose script running.)
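A hypothetical version of that watchdog, run from a 1-minute cron (the file path, age threshold and command are placeholders; the streaming script would touch() the keep-alive file every ~10 seconds):

<?php
// Restart the stream consumer if the keep-alive file has gone stale.
$keepAlive = '/tmp/phirehose.keepalive';
$maxAge    = 60; // seconds without an update before we assume the stream died

if (!file_exists($keepAlive) || (time() - filemtime($keepAlive)) > $maxAge) {
    // Relaunch in the background, detached from this cron run.
    exec('nohup php /path/to/myscript.php > /dev/null 2>&1 &');
}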

Schedule Windows Task to open a page with username and password

How would you set up a Windows scheduled task to open a webpage, post login information and then run the URL?
Background:
The CRM has crons that were set up for Linux only. It has a manager where I can also run the jobs manually. I want to run the web URL that does these jobs manually through the Windows server, but it requires that each time it connects it logs in with a specific user.
How would I set up a scheduled task on Windows Server that:
1. Opens and logs into the page, then runs the URL for the manual job.
2. Runs every minute.
So essentially it needs to look like this:
http://thewebsitename.com/?username=someuser&password=apass
http://thewebsitename.com/theurltorunjobmanually.php
Can scheduled tasks run a PHP command as well? For example, if I set up a wget script, could the scheduler run that PHP script? I have not been able to figure out how to do this; Linux seems to make this scenario pretty easy.
This could be as simple as:
Download wget for Windows
Create a batch file with the following contents:
wget --post-data "username=someuser&password=apass" http://thewebsitename.com/
wget http://thewebsitename.com/theurltorunjobmanually.php
You also asked about running a PHP script via the scheduled task; you can add this line to the batch script:
C:\path\to\PHP.exe script.php
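For example, a hypothetical script.php could do the login POST and then request the job URL itself, so the batch file only needs the PHP line above (a sketch; the cookie-jar path is a placeholder):

<?php
// Log in first, keeping the session cookie, then call the job URL with that cookie.
$ch = curl_init('http://thewebsitename.com/');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'username=someuser&password=apass');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, 'C:\jobs\cookies.txt'); // store the login session
curl_exec($ch);
curl_close($ch);

$ch = curl_init('http://thewebsitename.com/theurltorunjobmanually.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, 'C:\jobs\cookies.txt'); // reuse the login session
curl_exec($ch);
curl_close($ch);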
Not sure if you're looking for methodology or an actual solution, but we have a process somewhat like this where we need to log in to our CRM and run an upload/task-creation process at regular intervals. It used to be manual, but now we use an automation software product, Foxtrot. You can find it here for whatever it is worth: http://www.enablesoft.com/foxtrot-professional/
You can put the cURL or wget commands in a batch file or PowerShell script and have the Windows Task Scheduler call it.

Using Lynx to run a PHP script in CentOS

I need to use an Apache handler to run a PHP script, rather than running it through CLI. I'm using APC user cache, which stores variables using the Apache process. If I run my PHP script through CLI, then it won't have access to the APC variables.
A possible solution is creating a directory restricted to localhost and putting my scripts in there. Then, I can use a browser to run the PHP scripts. However, I'm not too experienced with Linux and I don't know how to implement this. Here's how I need it to work:
One of the cron jobs fires.
The cron job opens the PHP script using a web browser.
After the PHP script is finished processing, the web browser closes.
I don't know how to close the browser once the task is finished. Also, multiple PHP scripts will be running simultaneously (called by different cron jobs), and I'm not sure how this will work. I'm using the Lynx browser on CentOS.
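(The closest I can come up with is a guard like this at the top of each script, checking for loopback addresses, though I'm not sure it's the right approach:)

<?php
// Sketch: refuse any request that isn't coming from the local machine.
$remote = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '';
if (!in_array($remote, array('127.0.0.1', '::1'), true)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}
// ... APC-dependent work goes here ...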
In Debian/Ubuntu I can run a script using lynx, say
/usr/bin/lynx -source 'url'
For example:
/usr/bin/lynx -source http://google.com
Once execution is completed, the browser quits by default.
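To tie this to the cron jobs, an entry along these lines would do it; lynx exits on its own once the page has been fetched, so there is no browser left to close (the schedule, path and URL are placeholders):
*/5 * * * * /usr/bin/lynx -source http://localhost/cron/myscript.php > /dev/null 2>&1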

Running a looping PHP Script on my server

I have a Windows PC with Apache running, and I need a PHP script to run continuously, listening for input coming from a UDP port, taking the required action and sending a response back.
The only way I know how to do this is to install curl for cmd and run the PHP script with a WHILE loop. What I am afraid of is that this is the wrong way to do it, and that it may be unreliable and take up a large amount of system resources.
Can people comment on the above method? I have heard of cron, but that's for Unix only? What can I do?
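To be concrete, this is roughly the shape of the looping script I have in mind (a sketch; the port number is made up and the sockets extension is required):

<?php
// Listen for UDP datagrams and send a reply back to whoever sent them.
$socket = socket_create(AF_INET, SOCK_DGRAM, SOL_UDP);
socket_bind($socket, '0.0.0.0', 9999);

while (true) {
    $from = '';
    $port = 0;
    // Block until a datagram arrives.
    socket_recvfrom($socket, $buffer, 1024, 0, $from, $port);
    $reply = 'ACK: ' . trim($buffer);
    socket_sendto($socket, $reply, strlen($reply), 0, $from, $port);
}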
Try the solution below.
Use a bat file and schedule to execute that bat file.
For example, in the bat file executephp.bat, write this:
c:\xampp\php\php.exe -f c:\xampp\htdocs\do_something.php
Save the bat file containing that line.
Go to the Windows Task Scheduler and create a new task. In the Actions tab, browse to and select executephp.bat, and under "Start in" point to the directory that contains executephp.bat.
For example, if you save the file under C:\xampp\htdocs, put C:\xampp\htdocs in "Start in".
Remember to set the task to run even when the user is not logged on.
Everything is set, and it will execute without problems.
A PHP script behind Apache will always have a maximum execution time, so the while loop would be killed once that timeout is reached.
You would be better off using cron or a batch script, as Venkat recommended. There are also some cron services out there that will make a GET request to your server and run the script. Have a look at this related thread: Scheduled Request to my website from an external source
Doesn't that fit your needs?

Schedule and execute a PHP script automatically

I have written a PHP script which generates an SQL file containing all tables in my database.
What I want to do is execute this script daily or every n days. I have read about cron jobs but I am using Windows. How can I automate the script execution on the server?
You'll need to add a scheduled task to call the URL.
First of all, read up here:
MS KB - this is for Windows XP.
Second, you'll need some way to call the URL. I'd recommend using something like wget; that way you can call the URL and save the output to a file, so you can see what the debug output is. You can get hold of wget on this page.
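For example (the URL and output path here are only placeholders):
wget -q -O C:\logs\backup_output.txt http://yourserver.example/backup.php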
The final step is, as Gabriel says, to write a batch file to tie all this up, then away you go.
Edit: wget is pretty simple to use, but if you have any issues, leave a comment and I'll help out.
Edit 2: thinking about it, you don't even really need a batch file; you could just call wget directly.
Add a scheduled task to request the URL, either using a batch file or a script file (WSH).
http://blog.netnerds.net/2007/01/vbscript-download-and-save-a-binary-file/
This script will allow you to download binary data from a web source. Modify it to work for your particular case. The vbs file can either be run directly or executed from within a script. Alternatively, you do not have to save the file using the script; you can just output the contents (WScript.Echo objXMLHTTP.ResponseBody) and use cmd's output redirection to write it to a file:
cscript download.vbs > logfile.log
Save that bad boy in a .bat file somewhere useful and call it in the scheduler: http://lifehacker.com/153089/hack-attack-using-windows-scheduled-tasks
Cron is not always available on many hosting accounts.
But try this:
http://www.phpjobscheduler.co.uk/
It's free, has a useful interface so you can see all the scheduled tasks, and will run on any host that provides PHP and MySQL.
You can use the ATrigger scheduling service. A PHP library is also available to create scheduled tasks without overhead. Reporting, analytics, error handling and more are included.
Disclaimer: I was on the ATrigger team. It's freeware and I have no commercial purpose here.
Windows doesn't have cron, but it does come with the 'at' command. It's not as flexible as cron, but it will allow you to schedule arbitrary tasks for execution from the command line.
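For example, something along these lines would schedule a nightly run (the paths are placeholders; check at /? for the exact syntax on your version of Windows):
at 23:00 /every:M,T,W,Th,F,S,Su "C:\php\php.exe -f C:\scripts\backup.php"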
Yes, you can schedule your PHP script on Windows to run automatically. On Linux-like OSes you have cron, but on Windows you schedule tasks using Task Scheduler.
If your code is on a remotely hosted server, create a cron job for it.
Otherwise, if it is local, use a scheduled task in Windows. It's easy to implement; I have servers with many scheduled tasks running.
