What is the best way to generate an Excel file and inform the user once it is ready?
I am using PHPExcel to generate an Excel file from an MSSQL Server database on a web server and let a user download it via a link. The problem is that each time we try to execute the PHP script it throws a FastCGI timeout error. The script needs to read up to 2,000-5,000 rows of data.
We tried executing it from the command prompt using exec() and the shell. It successfully generates the file on the server, but we don't have a way to inform the user once the script has completed.
exec() should return the output of the external program - can't you use that? You can move the generated file to a directory that is reachable by the user and just give them the URL to the file.
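A minimal sketch of that idea, assuming a hypothetical generate_excel.php CLI script and an exports/ folder inside the web root (both names are placeholders, not from the question):

<?php
// Sketch only: generate_excel.php and the exports/ path are hypothetical.
$target = 'exports/report_' . date('Ymd_His') . '.xlsx';

// exec() returns the last line of output and fills $status with the exit code.
exec('php generate_excel.php ' . escapeshellarg($target), $output, $status);

if ($status === 0 && file_exists($target)) {
    // The file now sits in a web-reachable directory, so a plain link works.
    echo '<a href="' . htmlspecialchars($target) . '">Download the Excel file</a>';
} else {
    echo 'Generation failed: ' . htmlspecialchars(implode("\n", $output));
}

If the generation still exceeds the FastCGI timeout when triggered from the page, the same generator can be started in the background (or from Task Scheduler) and the page can simply check for the file's existence before showing the link.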
I have a PHP web app and have written a class that invokes an API which requests data from our third-party membership system database. The API returns the data and saves the XML into a file (as it's a very large data set). This XML file is then parsed and I upload some of that data into my web application's database.
It takes approximately 4 hours for the API to return all of the data. Currently I invoke this process by clicking a button on my web page. However, I want to automate the process and schedule it to run every so often, but I have no idea where to begin or what to even Google. The web app runs on a Windows server which is running XAMPP. Any help is much appreciated.
I have managed to create a .bat file to execute my PHP script, but I now get several errors in the CMD window stating that it failed to open any of the files that I include in my script:
My script file is:
<?php
include '../config.php';
include (ROOT.'/includes/my_session.php');
include (ROOT.'/model/dao.php');
include (ROOT.'/model/miller_api.php');
include (ROOT.'/model/branch_db.php');
$a = new miller_api;
$k = $a->apiRequest('key');
$a->apiRequest('branch', $k);
And here is a screenshot of the errors I am getting:
[Screenshot: CMD errors - failed-to-open warnings for the included files]
It is quite simple. Create a run.bat file with the following command:
C:\xampp\php\php.exe C:\xampp\htdocs\SCRIPTFOLDER\SCRIPT.php
And schedule the run.bat file using the Windows Task Scheduler:
https://www.techsupportalert.com/content/how-schedule-programs-run-automatically-windows-7.htm
I have a script that creates a file using fwrite() and, right after creating it, sends an email with this file attached.
When I run this script I get a "File does no exists" error. When I run the script again, it all works.
So I guess it tries to send the file too soon after creating it, and the server maybe needs a few more milliseconds before it can send it.
Has anyone come across this problem? Any solutions?
Delays should not be necessary; it is possible your script does not close the file, but PHP and other scripting languages automatically close open file handles at the end of execution, which is why the file exists by the second run.
Try sleep(1) before attaching the file; this will make PHP wait one second before trying to access the file, which should be plenty of time for it to be created. This will tell you whether file creation speed is the problem or not.
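A small sketch combining both suggestions - close the handle explicitly, clear the stat cache, and only fall back to sleep() as a diagnostic. The file name and the send_mail_with_attachment() helper are placeholders, not real API calls:

<?php
// Sketch only: the file name and send_mail_with_attachment() are hypothetical.
$path = 'export.txt';
$data = 'contents to send';

$fh = fopen($path, 'w');
fwrite($fh, $data);
fclose($fh);          // make sure the data is flushed and the handle released
clearstatcache();     // so a following file_exists() check sees the new file

if (!file_exists($path)) {
    sleep(1);         // diagnostic only: confirms whether timing is the issue
}

send_mail_with_attachment('me@example.com', $path);  // hypothetical mail helper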
I have a command line PHP script that runs constantly (infinite loop) on my server in a 'screen' session. The PHP script outputs various lines of data using echo.
What I would like to do is create a PHP web script to interface the command line script so that I can view the echo output without having to SSH into the server.
I had considered writing/piping all of the echo statements to a text file, and then having the web script read the text file. The problem here is that the text file will grow to several megabytes in the space of only a few minutes.
Does anyone know of a more elegant solution?
I think expect_popen will work for you, if you have it available.
Another option is to use named pipes - no disk usage, and the reading end gets output as soon as it is written.
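A rough sketch of that idea on a Linux host, assuming the POSIX extension is available; the pipe path is arbitrary, and keep in mind that FIFO opens block until both a reader and a writer are attached:

<?php
// Rough sketch only: the pipe path is arbitrary; FIFO opens block until both
// ends are connected, so the CLI loop must keep the write end open.
$fifo = '/tmp/cli-output.pipe';

if (!file_exists($fifo)) {
    posix_mkfifo($fifo, 0644);   // create the named pipe once (POSIX extension)
}

// CLI side: replace the echo calls with writes into the pipe, e.g.
//   $out = fopen($fifo, 'w');             // blocks until a reader connects
//   fwrite($out, $line . PHP_EOL);

// Web side: read lines as they arrive and stream them to the browser.
$in = fopen($fifo, 'r');                    // blocks until the writer connects
while (($line = fgets($in)) !== false) {
    echo htmlspecialchars($line) . "<br>\n";
    flush();
}
fclose($in);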
The CLI script can write to a file like so:
file_put_contents( '/var/log/cli-log-'.date('YmdHi').'.log', $data );
That way a new log file is created every minute, which keeps the file size down. You can then clean up the directory at that point, deleting or moving previous log files or whatever you want to do.
Then the web script can read from the current log file like so:
$log = file_get_contents( '/var/log/cli-log-'.date('YmdHi').'.log' );
As Elias Van Ootegem suggested, I would definitely recommend a cron job instead of a constantly running script.
If you want to view the data from a web script, you can do a few things. One is to write the data to a log file or a database so you can pull it out later. I would also consider limiting what you output, since there is so much data (if that is a possibility).
I have a lot of cron jobs that email me data; not sure if that would work for you, but I figured I would mention it.
The most elegant suggestion I can think of is to run the command using exec() in a web script, which can output directly to the browser if you use flush(): http://php.net/manual/en/function.flush.php
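One way to realise that suggestion - a sketch that re-runs the CLI script from the web request and streams its output line by line, using popen() rather than exec() so each line can be flushed as it arrives (the script path is a placeholder):

<?php
// Sketch only: the command/path is a placeholder for the real CLI script.
header('Content-Type: text/plain');
@ob_end_flush();                     // drop PHP's output buffer if one is active

$proc = popen('php /path/to/cli-script.php 2>&1', 'r');
while (!feof($proc)) {
    echo fgets($proc);
    flush();                         // push each line to the browser immediately
}
pclose($proc);

Note this starts a fresh run of the command rather than attaching to the output of the already running screen session.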
I've written a daily batch file which uses FTP to log in to my web server and download a CSV of new members using mget members.csv; the CSV is created on my site by PHP.
I also have a PHP page on the server which emails me these new members. This PHP executes when I load the page in a browser, but is it possible to execute it from the batch file?
I could also keep these members in a database if that's easier/more secure, but ideally I wouldn't want to hold sensitive database login details in a batch file...
Many thanks
Using a scheduled batch file under Windows, you could use the start command to launch a browser instance that requests the URL of your PHP script (which generates the email).
Put this in your scheduled batch file:
start www.stackoverflow.com
This one would use the systems default browser. To start a specific browser instead, you can use:
start /d "C:\Program Files\Mozilla Firefox" firefox.exe www.stackoverflow.com
start /d "C:\Program Files\Internet Explorer" iexplore.exe www.stackoverflow.com
You need to replace www.stackoverflow.com with the URL of your PHP script, of course^^
Why not just set up a cron job to check every x minutes/hours for a populated CSV file and send you the file if it is populated?
how to setup cron job
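A minimal sketch of what that cron job's PHP script might look like; the path and recipient are placeholders, and it assumes mail() is configured on the host:

<?php
// Sketch only: path and recipient are placeholders; assumes mail() is set up.
$csv = '/path/to/members.csv';

if (is_file($csv) && filesize($csv) > 0) {
    $body = "New members:\n\n" . file_get_contents($csv);
    mail('you@example.com', 'New members CSV', $body);

    // Archive the file so the same members are not emailed twice.
    rename($csv, $csv . '.' . date('Ymd-His') . '.sent');
}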
A client has a Windows-based in-house server, on which they edit the contents of a CMS. The data are synchronized nightly with the live web server. This is a workaround for a slow Internet connection.
There are two things to be synchronized: new files (already sorted) and a MySQL database. To do this, I am writing a script that exports the database into a dump file using mysqldump and uploads the dump.
The upload is done using ScriptFTP, a third-party FTP automation tool.
I then need to run a PHP-based import script on the target server. Depending on this script's return value, the ScriptFTP operation goes on, and some directories are renamed.
I need an external tool for this, as ScriptFTP only supports FTP calls. I was thinking about the Windows version of wget.
Within ScriptFTP, I can execute any batch or exe file, but I can only parse the errorlevel resulting from the call, not the stdout output. This means I need to return errorlevel 1 if the PHP import operation goes wrong and errorlevel 0 if it goes well. Additionally, I obviously need to return a positive errorlevel if the connection to the import script could not be made at all.
I have total control over the importing PHP script, and can decide what it does on error: Output an error message, return a header, whatever.
How would you go about running wget (or any other tool to kick off the server side import) and returning a certain error level depending on what the PHP script returns?
My best bet right now is building a batch file that executes the wget command, stores the result in a file, and then returns errorlevel 0 or 1 depending on the file's contents. But I don't really know how to match a file's contents using batch programming.
You can do the following in PowerShell:
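# Note: "wget" below refers to the GNU wget binary for Windows (as mentioned in
# the question), not PowerShell's built-in alias for Invoke-WebRequest.
# CompareTo() returns 0 when the downloaded body equals the marker string, so
# the script's exit code doubles as the errorlevel that ScriptFTP checks.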
$a = wget --quiet -O - www.google.com
$rc = $a.CompareTo("Your magic string")
exit $rc
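On the PHP side, the import script just needs to print a stable marker for that check to compare against. A hypothetical sketch (run_import() and the marker text are placeholders):

<?php
// Hypothetical sketch: run_import() and the marker text are placeholders.
try {
    run_import();                 // whatever the real import routine is
    echo 'IMPORT_OK';             // the "magic string" the caller compares against
} catch (Exception $e) {
    http_response_code(500);      // lets an HTTP-level check catch failures too
    echo 'IMPORT_FAILED: ' . $e->getMessage();
}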