PHP execute all files in a folder

I'm working on a PHP script that executes chosen website projects (only on my local computer).
What I would like it to do is execute ALL the files it can find in a folder. It takes time to always change the code whenever I add, for example, a .php or .css file.
I'm using Windows 7.
Is this possible somehow?
Code so far:
exec("C:\\xampp\\htdocs\\project\\",$output, $return);

foreach (glob("path/ToFile/*.php") as $filename) {
    exec($filename, $output); // previously $Filename
}
Perhaps this might perform the task you require
foreach (glob("path/ToFile/*.php") as $filename) {
    echo $filename."<br>";
}
See if you are getting any files.
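If glob() does return the files, a possible next step (just a sketch, not from the original answer) is to run each script through the PHP CLI instead of passing the bare filename to exec(), since a .php file is not directly executable on Windows. The path to php.exe below is an assumption for a default XAMPP install.
<?php
// Sketch: run every PHP file in the project folder through the CLI.
// The php.exe path is assumed (typical XAMPP layout); adjust as needed.
$php = 'C:\\xampp\\php\\php.exe';

foreach (glob("C:/xampp/htdocs/project/*.php") as $filename) {
    // escapeshellarg() guards against spaces in the path.
    exec($php . ' ' . escapeshellarg($filename), $output, $return);
    echo "Ran $filename (exit code $return)<br>";
}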

Related

ftp listing and download file in current date

I have a case:
I have a remote server that contains a lot of generated transaction files (.txt) from 2015 until now, and I must download them every day, close to real time. For now I use PHP to download them all, but I don't think this method is effective. First I list all the files, and then I read each file's attributes such as the date modified, but this approach is painful: it makes my program run slowly and takes a lot of time.
This is my code (I've used PHP Yii2); the listContents() call is the slow part:
public function actionDownloadfile(){
    // This listing call takes most of the time
    $contents = Yii::$app->ftpFs->listContents('/backup', ['timestamp','path','basename']);
    var_dump($contents);
    foreach ($contents as $value) {
        if (date('Y-m-d', $value['timestamp']) == date('Y-m-d')){
            echo "[".date('Y-m-d H:i:s')."] : Downloading file ".$value['basename']."\n";
            $isi = Yii::$app->ftpFs->read($value['path']);
            $dirOut = Yii::$app->params['out'];
            $fileoutgoing = $dirOut."/".$value['basename'];
            $file = fopen($fileoutgoing, "w");
            fwrite($file, $isi);
            fclose($file); // close the handle after writing
        }
    }
}
I have a question:
Is it possible to list and download only the files on the FTP server that were modified on the current date, without listing them all first?
Any solution, either in PHP or a shell script, is OK.
Thank you so much.
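No answer is recorded here, but one possible direction, sketched with PHP's built-in FTP functions rather than the Yii2 ftpFs component (the host, credentials and output path below are placeholders), is to fetch the name list once and ask the server for each file's modification time with ftp_mdtm(), downloading only today's files:
<?php
// Sketch only: plain PHP FTP functions; host, login and output path are assumptions.
// ftp_nlist() returns names only; ftp_mdtm() queries each file's modification time,
// so only today's files are actually transferred.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);

$today = date('Y-m-d');
foreach (ftp_nlist($conn, '/backup') as $remote) {
    $mtime = ftp_mdtm($conn, $remote); // -1 if the server cannot report it
    if ($mtime !== -1 && date('Y-m-d', $mtime) === $today) {
        $local = '/path/to/out/' . basename($remote);
        ftp_get($conn, $local, $remote, FTP_BINARY);
        echo "[" . date('Y-m-d H:i:s') . "] downloaded " . basename($remote) . "\n";
    }
}
ftp_close($conn);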

"No such file or directory" on localhost copy

EDIT: I'm pretty sure the issue has to do with the firewall, which I can't access. Marking Canis' answer as correct and I will figure something else out, possibly wget or just manually scraping the files and hoping no major updates are needed.
EDIT: Here's the latest version of the builder and here's the output. The build directory has the proper structure and most of the files, but only their name and extension - no data inside them.
I am coding a PHP script that searches the local directory for files, then scrapes my localhost (XAMPP) for the same files to copy into a build folder (the goal is to build the pages with PHP on localhost and then put them on a server as HTML).
Unfortunately I am getting the error: Warning: copy(https:\\localhost\intranet\builder.php): failed to open stream: No such file or directory in C:\xampp\htdocs\intranet\builder.php on line 73.
That's one example: every file in the local directory spits the same error back. The source addresses are correct (I can get to the file on localhost from the address in the error log) and the local directory is properly constructed; just moving the files into it doesn't work. The full code is here; the most relevant section is:
// output build files
foreach ($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        copy($source, $dest);
        echo "Copy $source to $dest. <br>";
    }
}
You are trying to use URLs to traverse local filesystem directories. URLs are only for the webserver to understand web requests.
You will have more luck if you change this:
copy(https:\\localhost\intranet\builder.php)
to this:
copy(C:\xampp\htdocs\intranet\builder.php)
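For completeness, a minimal sketch of that change (paths are assumptions based on the error message; note that backslashes in PHP strings need escaping, or you can use forward slashes, which also work on Windows):
<?php
// Sketch: copy directly on the local filesystem instead of through a URL.
// Paths assumed from the error message.
$source = 'C:\\xampp\\htdocs\\intranet\\builder.php';
$dest   = 'C:\\xampp\\htdocs\\intranet\\build\\builder.php';

if (!copy($source, $dest)) {
    echo "Failed to copy $source to $dest";
}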
EDIT
Based on your additional info in the comments I understand that you need to generate static HTML files for hosting on a static-only webserver. This is not really an issue of copying files. It's about accessing the HTML that the script generates when run through a webserver.
You can do this in a few different ways actually. I'm not sure exactly how the generator script works, but it seems like that script is trying to copy the supposed output from loads of PHP-files.
To get the generated content from a PHP file you can either use the command-line php command to execute the script, like so: c:\some\path>php some_php_file.php > my_html_file.html, or use the power of the webserver to do it for you:
<?php
$hosted = "https://localhost/intranet/"; // <-- UPDATED
foreach ($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $path = str_replace("\\", "/", $path); // <-- ADDED
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        $content = file_get_contents($source);
        file_put_contents(str_replace(".php", ".html", $dest), $content);
        echo "Copy $source to $dest. <br>";
    }
}
In the code above I use file_get_contents() to read the HTML from the URL you are using (https://...), which in this case, unlike with copy(), will call up the webserver and trigger the PHP engine to produce the output.
Then I write the pure HTML to a file in the $dest folder, replacing the .php with .html in the filename.
EDIT
Added and revised the code a bit above.
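The command-line route mentioned above could look roughly like this (a sketch only; the php.exe path and the folder names are assumptions for a typical XAMPP setup), letting the PHP engine render each file and saving the output as .html:
<?php
// Sketch: render each PHP file through the CLI and save the output as HTML.
// php.exe path and directories are assumptions; adjust to your install.
$php        = 'C:\\xampp\\php\\php.exe';
$localroot  = 'C:\\xampp\\htdocs\\intranet\\';
$localbuild = 'C:\\xampp\\htdocs\\intranet\\build\\';

foreach (glob($localroot . '*.php') as $file) {
    $dest = $localbuild . str_replace('.php', '.html', basename($file));
    // The shell redirection captures the rendered output into the build file.
    exec($php . ' ' . escapeshellarg($file) . ' > ' . escapeshellarg($dest));
    echo "Built $dest<br>";
}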

PHP rename() cannot always find source file (code 2) in Windows environment

My environment is: Windows, MsSQL and PHP 5.4.
My scenario:
I'm writing a small script that creates a full backup of the database I want into a temp folder and then moves it to a new location.
The backup goes fine and the file is created in my temp folder. Then I rename() it to the second folder; sometimes it works, sometimes it cannot find the source file.
Of course, at this point I know that I could skip the temporary location altogether, but the actual problem of not finding the file bothers me. Why is it so random, and might it also affect other file functions I've written before this one? Also, I need to be able to control how and when the files move to the destination.
The base code is as simple as it should be (although this is a simplified version of my actual code, since I doubt anyone would be interested in my error handling/logging conditions):
$query = "use test; backup database test to disk '//server01/temp/backups/file.bak', COMPRESSION;";
if($SQLClass->query($query)) {
$source="////server01//temp//backups//file.bak";
$destination="////server02//storage//backups//file.bak";
if(!rename($source , $destination)) {
//handleError is just a class function of mine that logs and outputs errors.
$this->handleError("Moving {$source} to {$destination} failed.");
}
}
else {
die('backup failed');
}
What I have tried:
I added a file_exists() check before it, and it can't find the source file either whenever rename() can't.
As the file can't be found, copy() and unlink() will not work either.
Tried clearstatcache().
Tried sleep(10) after the SQL backup completes.
None of these helped at all. Google and I seem to be out of ideas on what to do or try next. Of course I could do some shell_exec-ing, but that wouldn't remove my worries about my earlier products.
I only noticed this problem when I tried to run the command multiple times in a row. Is there some sort of cache for filenames that clearstatcache() won't touch? It seems to be related to some sort of ghost-file phenomenon, where PHP is late to refresh the file system contents or such.
I would appreciate any ideas on what to try next, and if you read this far, thank you :).
You may try calling the system's copy command.
I once had a problem like yours (on a Linux box) when I had to copy files between two NFS shares. It just failed from time to time with no visible reason. After I switched to cp (the analog of Windows' copy), the problem was gone.
Surely it is not perfect, but it worked for me.
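A sketch of what that could look like on Windows (the UNC paths are taken from the question; the exact quoting is an assumption):
<?php
// Sketch: shell out to Windows' copy instead of PHP's rename()/copy().
// UNC paths taken from the question; adjust quoting to your setup.
$source      = '\\\\server01\\temp\\backups\\file.bak';
$destination = '\\\\server02\\storage\\backups\\file.bak';

exec('copy /Y "' . $source . '" "' . $destination . '"', $output, $return);
if ($return !== 0) {
    echo "copy failed with exit code $return\n";
}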
It might be cache-related, or the database server process has not yet released the file.
The server will typically dump the backup into another temporary file first and only then move it to your temp folder.
While the file is being moved, it might be inaccessible to other processes.
First, I would try to glob() all the files inside the temp dir when the error appears. Maybe you will notice it is still not finished.
Also, have you tried implementing something like 10 retry iterations with some delay?
$notMoved    = 0;
$source      = "////server01//temp//backups//file.bak";
$destination = "////server02//storage//backups//file.bak";
while ($notMoved < 10) {
    if (rename($source, $destination)) {
        break;
    }
    // handleError is just a class function of mine that logs and outputs errors.
    if (++$notMoved < 10) {
        sleep(20);
    } else {
        $this->handleError("Moving {$source} to {$destination} failed.");
        break;
    }
}
To bypass the issue:
Don't dump and then move.
Move first, then dump :-)
(Of course, your backup store would then be one run behind.)
$source = "////server01//temp//backups//file.bak";
$destination = "////server02//storage//backups//file.bak";
if (!rename($source, $destination)) {
    // handleError is just a class function of mine that logs and outputs errors.
    $this->handleError("Moving {$source} to {$destination} failed.");
}
$query = "use test; backup database test to disk = '//server01/temp/backups/file.bak' with compression;";
if ($SQLClass->query($query)) {
    // done :-)
}
else {
    die('backup failed');
}
Try
$source = "\\server01\temp\backups\file.bak";
$destination = "\\server02\storage\backups\file.bak";
$content = file_get_content($source);
file_put_contents($destination, $content);

Go through files in folder at another server PHP

I've got two servers. One for the files and one for the website.
I figured out how to upload the files to that server but now I need to show the thumbnails on the website.
Is there a way to go through the folder /files on the file server and display a list of those files on the website using PHP?
I have searched for a while now but can't find the answer.
I tried using scandir([URL]) but that didn't work.
I'm embarrassed to say this but I found my answer at another post:
PHP directory list from remote server
function get_text($filename) {
    $content = "";
    $fp_load = fopen("$filename", "rb");
    if ($fp_load) {
        while (!feof($fp_load)) {
            $content .= fgets($fp_load, 8192);
        }
        fclose($fp_load);
        return $content;
    }
}

$matches = array();
preg_match_all("/(a href\=\")([^\?\"]*)(\")/i", get_text('http://www.xxxxx.com/my/cool/remote/dir'), $matches);

foreach ($matches[2] as $match) {
    echo $match . '<br>';
}
scandir() will not work on any server but your own. If you want to do such a thing while keeping the two servers separate, your best bet would be to have a PHP file on the website and a PHP file on the file server. The PHP file on your website could get file data from the other server by having the file server script print the data, and the webserver script read it in. Example:
Webserver:
<?php
$filedata = file_get_contents("url to file handler php");
?>
Fileserver:
<?php
echo "info you want webserver to read";
?>
This can also be customized to your needs with POST and GET requests.
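A small sketch of that idea (the folder path, filename and URL below are assumptions): the file server exposes its listing as JSON, and the website reads and decodes it.
<?php
// list.php on the FILE server (sketch; the /files path is an assumption):
// print the folder contents as JSON for the web server to consume.
header('Content-Type: application/json');
echo json_encode(array_values(array_diff(scandir('/files'), array('.', '..'))));
<?php
// On the WEB server (sketch; the URL is an assumption):
$files = json_decode(file_get_contents('http://fileserver.example.com/list.php'), true);
foreach ($files as $file) {
    echo htmlspecialchars($file) . '<br>';
}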
I used the following method:
I created a script which goes through all the files at the file server.
$fileList = glob($dir."*.*");
This is only possible if the script is actually on the fileserver. It would be rather strange to go through files at another server without having access to it.
There is a way to do this without having access (see my other answer), but it is very slow and not very handy.
I know I said that I didn't have access, but I did. I just wanted to know all the possibilities.

PHP extractTo does not extract files properly on one computer, but works on the other.

I'm facing a very weird problem! I'm using the method below to extract a .zip file's contents into a new folder. It works perfectly fine on my computer but does not work on another one! I have Windows XP on both computers and have installed the same WampServer on both. Everything between the two computers is the same except their CPU and RAM! My computer is a powerful one, and the one where the extract process fails is a very slow computer. Is that why? How can I make sure the PHP code runs properly even in a slow environment?
One thing to add: the zip archive to be extracted contains one directory and some files in that directory. If I test the process with a zip file that has no directories in it, it works fine on both computers. Any ideas?!
public function extract($pluginName, $pasteLocation) {
    $zip = new ZipArchive();
    $plugin = $pasteLocation.$pluginName.".zip";
    if ($zip->open($plugin) === TRUE) {
        $zip->extractTo($pasteLocation);
        $zip->close();
        unlink($pasteLocation.$pluginName.'.zip');
        $status = "true";
        $msg = "success";
    } else {
        $status = "false";
        $msg = "error";
    }
    $result["status"] = $status;
    $result["msg"] = $msg;
    return $result;
}
You said it does not work on one system. Can you tell what exactly is not working: are the files extracted partially, or are the files getting corrupted?
Did you try using different directories? Does the target directory contain a file with the same name as the directory in the zip? Then I guess directory creation will not work.
Also, what version of PHP are you using?
EDIT: Did you use the ZipArchive::getStatusString() function to get any generated errors? Are you using the same source archive on both machines?
You can also try the procedure explained in comment by 'hardcorevenom' here.
You can also try this class as shown here if nothing works.
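A small sketch of how that diagnostic could be wired into the extract() method above (only the error reporting is new; $pluginName and $pasteLocation are assumed to be the method's parameters from the question):
<?php
// Sketch: surface ZipArchive errors instead of a bare "error" string.
$zip = new ZipArchive();
$open = $zip->open($pasteLocation . $pluginName . ".zip");

if ($open === TRUE) {
    if (!$zip->extractTo($pasteLocation)) {
        // getStatusString() reports the last status message of the archive.
        echo "extractTo failed: " . $zip->getStatusString();
    }
    $zip->close();
} else {
    // open() returns an error code (e.g. ZipArchive::ER_NOENT) instead of TRUE.
    echo "open failed with code " . $open;
}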
