I've written the following code to create a zip file. I'm pulling a list of files from the database based on $job_number (which I'm getting from the global $_GET array) and then trying to add them to the zip file.
That part is working fine: it's pulling the list of files from the database, as I can see by echoing the results or dumping them with print_r.
The only problem is that the .zip file isn't being created at all. I can't see where I've gone wrong.
$sth = $conn->prepare('SELECT `dp_filename` FROM damage_pics WHERE dp_job_number = :job_number');
$sth->bindParam(':job_number', $job_number);
$sth->setFetchMode(PDO::FETCH_ASSOC);
$sth->execute();
$zip = new ZipArchive();
$zip->open('example5.zip', ZipArchive::CREATE);
$result = $sth->fetchAll();
echo '<pre>';
print_r($result);
foreach ($result as $file)
{
    // just echoing for testing purposes to see if file name is created correctly.
    $image = $file['dp_filename'];
    echo $image . "<br />";
    $zip->addFile("uploads/{$image}");
}
$zip->close();
Sounds like your PHP script is being run with insufficient permissions to write to the destination directory. I did some experimenting, and in such a situation the $zip->open() call would silently fail with no indication of what went wrong.
You could put the following code at the top to determine if this is indeed the problem:
$fp = fopen('example5.zip', 'w');
if ($fp === FALSE) { die("Cannot open example5.zip for writing"); }
fclose($fp);
To fix the issue, I would suggest using an absolute path and filename (such as /tmp/example5.zip) and then make sure that the destination directory is writable by the user that's executing the script (likely whatever user is running the HTTP server software, such as www-data or httpd, depending on the OS and distribution).
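For example, a minimal sketch of that approach (the /tmp path is just an assumption; any directory writable by that user will do). ZipArchive::open() returns TRUE on success or an error code otherwise, so you can check it instead of letting it fail silently:
$zipfile = '/tmp/example5.zip'; // absolute path in a directory writable by the web server user
$zip = new ZipArchive();
$status = $zip->open($zipfile, ZipArchive::CREATE | ZipArchive::OVERWRITE);
if ($status !== TRUE) {
    die("Could not create {$zipfile}, ZipArchive error code: {$status}");
}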
I am new to PHP and I am trying to create a file upload system that will automatically parse the XML file using simplexml. I have created a PHP script that will open the directory and try to parse the files. For some reason, it will only parse one of the files. I am not sure if this is the best way to approach this task.
<?php
$dir = "path/to/xmlfiles";
chdir($dir);
// Open a directory, and read its contents
if (is_dir($dir)) {
    if ($dh = opendir($dir)) {
        while (($file = readdir($dh)) !== false) {
            $xml = simplexml_load_file($file);
            $nombre = $xml->xpath("//NOMBRE");
            $rpu = $xml->xpath("//RPU");
            echo (string) $nombre[0];
            echo (string) $rpu[0];
            echo $file;
        }
        closedir($dh);
    }
}
?>
For this script, I am able to echo the results just fine; the only problem is that it will only echo the results of one of the XML files.
Hopefully someone with more experience could give me a tip on how to achieve this.
For extra points, I am also trying to insert an entry to a Mysql database for each parsed file.
;) Thank you in advance for all your help.
readdir() reads directory entries as they're stored on disk (i.e., it doesn't sort them), so it's very likely that . (the current directory) will be the first one. That will make simplexml_load_file() fail, $xml will become false, and $xml->xpath() will crash the script with a fatal error.
PHP should be reporting all this. If you cannot see it, it's very likely that you haven't configured PHP to display errors.
You need to filter out entries (the bare minimum would be to check they are actual files and not directories) and add some error checking here and there.
An alternative approach:
foreach (glob("$dir/*.xml") as $file) {
}
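For example, a rough sketch of that loop with the error checks mentioned above (element names taken from your snippet, the rest is an assumption):
$dir = "path/to/xmlfiles";
foreach (glob("$dir/*.xml") as $file) {
    // glob() only returns entries matching *.xml, so "." and ".." never show up
    $xml = simplexml_load_file($file);
    if ($xml === false) {
        echo "Could not parse $file<br />";
        continue;
    }
    $nombre = $xml->xpath("//NOMBRE");
    $rpu = $xml->xpath("//RPU");
    if (!empty($nombre)) { echo (string) $nombre[0]; }
    if (!empty($rpu)) { echo (string) $rpu[0]; }
    echo $file;
}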
EDIT: I'm pretty sure the issue has to do with the firewall, which I can't access. Marking Canis' answer as correct and I will figure something else out, possibly wget or just manually scraping the files and hoping no major updates are needed.
EDIT: Here's the latest version of the builder and here's the output. The build directory has the proper structure and most of the files, but only their name and extension - no data inside them.
I am coding a php script that searches the local directory for files, then scrapes my localhost (xampp) for the same files to copy into a build folder (the goal is to build php on the localhost and then put it on a server as html).
Unfortunately I am getting the error: Warning: copy(https:\\localhost\intranet\builder.php): failed to open stream: No such file or directory in C:\xampp\htdocs\intranet\builder.php on line 73.
That's one example - every file in the local directory is spitting the same error back. The source addresses are correct (I can get to the file on localhost from the address in the error log) and the local directory is properly constructed - just moving the files into it doesn't work. The full code is here, the most relevant section is:
// output build files
foreach($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        copy($source, $dest);
        echo "Copy $source to $dest. <br>";
    }
}
You are trying to use URLs to traverse local filesystem directories, but URLs only mean something to the web server when it handles web requests.
You will have more luck if you change this:
copy(https:\\localhost\intranet\builder.php)
to this:
copy(C:\xampp\htdocs\intranet\builder.php)
EDIT
Based on your additional info in the comments, I understand that you need to generate static HTML files for hosting on a static-only webserver. This is not really an issue of copying files; it's about capturing the HTML that the script generates when run through a webserver.
You can do this in a few different ways, actually. I'm not sure exactly how the generator script works, but it seems like that script is trying to copy the supposed output from loads of PHP files.
To get the generated content from a PHP file you can either run the script with the command-line php binary, like so: c:\some\path>php some_php_file.php > my_html_file.html, or use the power of the webserver to do it for you:
<?php
$hosted = "https://localhost/intranet/"; // <-- UPDATED
foreach($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $path = str_replace("\\", "/", $path); // <-- ADDED
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        $content = file_get_contents($source);
        file_put_contents(str_replace(".php", ".html", $dest), $content);
        echo "Copy $source to $dest. <br>";
    }
}
In the code above I use file_get_contents() to read the HTML from the URL you are using (https://...), which in this case, unlike with copy(), will call up the webserver and trigger the PHP engine to produce the output.
Then I write the pure HTML to a file in the $dest folder, replacing the .php with .html in the filename.
EDIT
Added and revised the code a bit above.
I want to create a new file along with the creation of a record in my db. The db part works fine, but the file part isn't working. I want to create a file in another folder in the root directory, and I used the same code with changes to the file name. Despite setting ini_set('display_errors',3), the code doesn't return any errors. I've also tried fopen($file_name,"w") or die("unable to open file"), and the code still ran. I've given IIS_IUSRS full control over the directory. Here is my code:
$data = mysqli_fetch_array(mysqli_query($conn,"Select * from content where link='$url'")); //fetch data from db
$id = $data['id'];
$file_name="/files/$id.html"; //sets file name
$file = fopen($file_name,"w"); //creates the file
fwrite($file,$_POST['desc']); //writes to the file
fclose($file); //closes the file
The problem was that PHP wasn't resolving the path from the web root. I added $root = $_SERVER['DOCUMENT_ROOT']; and replaced $file_name="/files/$id.html"; with $file_name="$root/files/$id.html"; and it worked like a charm.
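In other words, a minimal sketch of the fix (same variable names as above):
$root = $_SERVER['DOCUMENT_ROOT'];        // absolute path to the web root
$file_name = "$root/files/$id.html";      // file path is now anchored to the web root
$file = fopen($file_name, "w") or die("unable to open file");
fwrite($file, $_POST['desc']);
fclose($file);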
I have my users uploading a text file which then gets processed by my application. Once the processing is done, I would like to save a copy of this text file somewhere on my server for future reference. Currently, the uploaded text file stays in the PHP temp folder until it is closed by my app.
What's a simple way to accomplish this?
BTW, I'll need to know how to do this on my web server along with localhost (for testing).
You can use the fwrite function (though this is probably not a very good idea in this particular example).
$fp = fopen('data.txt', 'w');
fwrite($fp, $yourContents);
fclose($fp);
But if you already have the file, simply copy it using the copy function (that is, if you want to keep the original in the temp folder; if not, move it with the rename function instead).
To copy, do something like this:
$tempfile = 'tempfile.txt';
$newfile = 'newfile.txt';
if (copy($tempfile, $newfile)) {
echo "success!";
} else {
echo "misery :(";
}
To move with rename
// Rename returns a bool, just as in the copy example
rename("/tmp/tempfile.txt", "/home/user/files/newfile.txt");
Added this: To move with move_uploaded_file
Please note, I didn't test this in a development environment. This may not execute perfectly.
$uploads_dir = 'C:\\movefiles\\here\\';
foreach ($_FILES["upload-tracking-file"]["error"] as $key => $error) {
    if ($error == UPLOAD_ERR_OK) {
        $tmp_name = $_FILES["upload-tracking-file"]["tmp_name"][$key];
        $name = $_FILES["upload-tracking-file"]["name"][$key];
        // $uploads_dir already ends with a backslash, so just concatenate
        move_uploaded_file($tmp_name, $uploads_dir . $name);
    }
}
References
Copy, http://php.net/manual/en/function.copy.php
Move (rename), http://php.net/manual/en/function.rename.php
move_uploaded_file, http://php.net/manual/en/function.move-uploaded-file.php
fwrite, http://php.net/manual/en/function.fwrite.php
Use PHP's ob_start() and file_put_contents() functions; they should help you. These links will help; if not, post your reply:
php.net/manual/en/function.ob-start.php, php.net/manual/en/function.file-put-contents.php
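For example, a rough sketch of that idea (the file name is just a placeholder):
ob_start();                                  // start buffering everything that gets echoed
// ... code that produces the output you want to keep ...
$contents = ob_get_clean();                  // grab the buffer and stop buffering
file_put_contents('saved-output.txt', $contents);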
I have a directory with a number of subdirectories that users add files to via FTP. I'm trying to develop a php script (which I will run as a cron job) that will check the directory and its subdirectories for any changes in the files, file sizes or dates modified. I've searched long and hard and have so far only found one script that works, which I've tried to modify - original located here - however it only seems to send the first email notification showing me what is listed in the directories. It also creates a text file of the directory and subdirectory contents, but when the script runs a second time it seems to fall over, and I get an email with no contents.
Anyone out there know a simple way of doing this in php? The script I found is pretty complex and I've tried for hours to debug it with no success.
Thanks in advance!
Here you go:
$log = '/path/to/your/log.js';
$path = '/path/to/your/dir/with/files/';
$files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path), RecursiveIteratorIterator::SELF_FIRST);
$result = array();
foreach ($files as $file)
{
    if (is_file($file = strval($file)) === true)
    {
        $result[$file] = sprintf('%u|%u', filesize($file), filemtime($file));
    }
}
if (is_file($log) !== true)
{
    file_put_contents($log, json_encode($result), LOCK_EX);
}
// are there any differences?
if (count($diff = array_diff($result, json_decode(file_get_contents($log), true))) > 0)
{
    // send email with mail(), SwiftMailer, PHPMailer, ...
    $email = 'The following files have changed:' . "\n" . implode("\n", array_keys($diff));
    // update the log file with the new file info
    file_put_contents($log, json_encode($result), LOCK_EX);
}
I am assuming you know how to send an e-mail. Also, please keep in mind that the $log file should be kept outside the $path you want to monitor, for obvious reasons of course.
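If you just want the quickest option, a bare-bones sketch with PHP's built-in mail() could go inside the "differences" block above (the recipient address is a placeholder and this assumes a working mail transport on the server):
mail('you@example.com', 'Files changed on the FTP directory', $email);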
After reading your question a second time, I noticed that you want to check whether the files change; I'm only checking the size and date of modification. If you really want to check whether the file contents are different, I suggest you use a hash of the file, so this:
$result[$file] = sprintf('%u|%u', filesize($file), filemtime($file));
Becomes this:
$result[$file] = sprintf('%u|%u|%s', filesize($file), filemtime($file), md5_file($file));
// or
$result[$file] = sprintf('%u|%u|%s', filesize($file), filemtime($file), sha1_file($file));
But bear in mind that this will be much more expensive, since the hash functions have to open and read all the contents of your 1-5 MB CSV files.
I like sfFinder so much that I wrote my own adaptation:
http://www.symfony-project.org/cookbook/1_0/en/finder
https://github.com/homer6/altumo/blob/master/source/php/Utils/Finder.php
Simple to use, works well.
However, for your use, depending on the size of the files, I'd put everything in a git repository. It's easy to track then.
HTH