I am creating PDF files with TCPDF. Every user gets a different folder.
This is the path: $_SERVER['DOCUMENT_ROOT'].'/bandymas/pdfDocuments/'.$_SESSION["userSession"]. There are no problems with file creation.
Now I need to list the created files and make them available to open.
The problem is that my page crashes and I can't see the list.
$dir='/'.$_SERVER['DOCUMENT_ROOT'].'/bandymas/pdfDocuments/'.$_SESSION['userSession'].'/';
if(is_dir($dir)){
if($dh=opendir($dir)){
echo "My documents list:";
while(($fileName=readdir($dir)) !==false){
echo " view","\n";
}
close($dh);
}
}
The issue here is this line: while(($fileName=readdir($dir)) !==false)
A quick look at the PHP documentation of that function points out the reason why things fail:
string readdir ([ resource $dir_handle ] ) requires a directory handle as argument, not a file system path. So the line should be: while(($fileName=readdir($dh)) !==false). $dh is the variable holding the directory handle you got back a few lines above when opening the folder.
This is a very common and typical issue when implementing scripts. We all make such mistakes; nothing to worry about. But what you should learn from this is: monitor your HTTP server's error log file. Such issues are pointed out in there; you can actually read what issue you are dealing with and typically also in which precise line of which file it occurs. You cannot seriously develop PHP without monitoring that error log file.
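If you cannot easily watch that log file while developing, you can also tell PHP to display errors on the page; a minimal sketch for a development setup (do not leave this on in production):
error_reporting(E_ALL);            // report every kind of error
ini_set('display_errors', '1');    // print them to the page
ini_set('log_errors', '1');        // and still write them to the error log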
The code below works just fine:
$dir='/'.$_SERVER['DOCUMENT_ROOT'].'/bandymas/pdfDocuments/'.$_SESSION['userSession'].'/';
if (is_dir($dir)){
if ($dh = opendir($dir)) {
while (($file = readdir($dh)) !== false) {
echo "filename: $file : filetype: " . filetype($dir . $file) . "\n";
}
closedir($dh);
}
}
You used close($dh) instead of closedir($dh)
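If you also want each listed PDF to be a clickable link, here is a minimal sketch; it assumes the per-user folder is reachable from the browser under /bandymas/pdfDocuments/<session>/, which may not match your setup:
$dir = $_SERVER['DOCUMENT_ROOT'].'/bandymas/pdfDocuments/'.$_SESSION['userSession'].'/';
$webPath = '/bandymas/pdfDocuments/'.$_SESSION['userSession'].'/'; // assumed public URL path
if (is_dir($dir) && ($dh = opendir($dir))) {
    echo "My documents list:<br>";
    while (($fileName = readdir($dh)) !== false) {
        if ($fileName === '.' || $fileName === '..') {
            continue; // skip the dot entries readdir() always returns
        }
        echo '<a href="' . htmlspecialchars($webPath . rawurlencode($fileName)) . '">view ' . htmlspecialchars($fileName) . "</a><br>\n";
    }
    closedir($dh);
}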
Related
I'm currently in the process of creating an application to generate and manage project names based on predefined themes. This application features very basic cloud saving functionality. It's super simple and designed to work without a database by saving the generated save data in files on a server.
In order for the program to download all the saved files, I need to list all the saved files in a folder on the server. However, I can't seem to get the expected response from my server. I've tried three different ways to list the files, and NONE of them return any files, which seems very odd to me.
$dir = "WordPress_SecureMode_01/Bubba/";
echo pathinfo($dir, PATHINFO_DIRNAME);
// Open a known directory, and proceed to read its contents
if (is_dir($dir)) {
if ($dh = opendir($dir)) {
while (($file = readdir($dh)) !== false) {
echo "filename: $file : filetype: " . filetype($dir . $file) . "\n";
}
closedir($dh);
}
}
$files = scandir('WordPress_SecureMode_01/Bubba/');
foreach($files as $file){
echo $file;
echo pathinfo($file, PATHINFO_FILENAME);
}
$entries = glob('WordPress_SecureMode_01/Bubba/*.txt');
foreach($entries as $entry){
echo $entry;
}
As you can see, I'm now using three different methods of retrieving the files: opendir, scandir and glob. All their findings are echoed and thus retrieved by my application. However, the only data my application receives is the output of the pathinfo call at the top of the script. So the communication between client and server is working fine, but none of the options for scanning directory files are.
Does anyone have an idea why this behaviour is occurring?
If the files are in the same directory as the script, you either want to use an absolute path:
$dir = __DIR__;
or a relative path:
$dir = "./";
Here in your code $dir is not a directory, and that's the reason the code isn't moving forward.
Please check whether the path is a directory or not by simply executing:
$dir = "WordPress_SecureMode_01/Bubba/";
if (is_dir($dir)) {
echo 'Yes';
}
else{
echo 'No';
}
If it prints Yes, the path is not the problem; if it prints No, then please correct your path to that folder.
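If you do get No, a quick way to see where PHP is actually looking is a small debugging snippet like this (not part of the fix):
echo 'cwd: ' . getcwd() . "\n";                                                           // relative paths are resolved from here
echo 'resolved: ' . var_export(realpath('WordPress_SecureMode_01/Bubba/'), true) . "\n";  // false if the path does not exist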
I am new to PHP and I am trying to create a file upload system that will automatically parse the XML files using SimpleXML. I have created a PHP script that will open the directory and try to parse the files. For some reason, it will only parse one of the files. I am not sure if this is the best way to approach this task.
<?php
$dir = "path/to/xmlfiles"
chdir($dir);
// Open a directory, and read its contents
if (is_dir($dir)){
if ($dh = opendir($dir)){
while (($file = readdir($dh)) !== false){
$xml = simplexml_load_file($file);
$nombre = $xml ->xpath("//NOMBRE");
$rpu = $xml ->xpath("//RPU");
echo (string) $nombre[0];
echo (string) $rpu[0];
echo $file;
}
closedir($dh);
}
}
?>
For this script, I am able to echo the results just fine; the only problem is that it will only echo the results of one of the XML files.
Hopefully someone with more experience could give me a tip on how to achieve this.
For extra points, I am also trying to insert an entry to a Mysql database for each parsed file.
;) Thank you in advance for all your help.
readdir() reads directory entries as they're stored on disk (i.e., it doesn't sort entries) so it's very likely that . (current directory) will be the first one. That will make simplexml_load_file() fail and $xml will become false so $xml->xpath() will crash the script with a fatal error.
PHP should be reporting all this. If you cannot see it, it's very likely that you haven't configured PHP to display errors.
You need to filter out entries (the bare minimum would be to check they are actual files and not directories) and add some error checking here and there.
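A minimal sketch of that, keeping the structure of your script (the folder path and the NOMBRE/RPU element names come from your question):
<?php
$dir = "path/to/xmlfiles/";
if (is_dir($dir) && ($dh = opendir($dir))) {
    while (($file = readdir($dh)) !== false) {
        // skip ".", ".." and anything that is not a regular .xml file
        if (!is_file($dir . $file) || pathinfo($file, PATHINFO_EXTENSION) !== 'xml') {
            continue;
        }
        $xml = simplexml_load_file($dir . $file);
        if ($xml === false) {
            echo "Could not parse $file\n";
            continue;
        }
        $nombre = $xml->xpath("//NOMBRE");
        $rpu = $xml->xpath("//RPU");
        if (!empty($nombre) && !empty($rpu)) {
            echo (string) $nombre[0];
            echo (string) $rpu[0];
        }
        echo $file;
    }
    closedir($dh);
}
?>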
An alternative approach:
foreach (glob("$dir/*.xml") as $file) {
}
EDIT: I'm pretty sure the issue has to do with the firewall, which I can't access. Marking Canis' answer as correct and I will figure something else out, possibly wget or just manually scraping the files and hoping no major updates are needed.
EDIT: Here's the latest version of the builder and here's the output. The build directory has the proper structure and most of the files, but only their name and extension - no data inside them.
I am coding a PHP script that searches the local directory for files, then scrapes my localhost (XAMPP) for the same files to copy into a build folder (the goal is to build the PHP pages on localhost and then put the result on a server as HTML).
Unfortunately I am getting the error: Warning: copy(https:\\localhost\intranet\builder.php): failed to open stream: No such file or directory in C:\xampp\htdocs\intranet\builder.php on line 73.
That's one example - every file in the local directory is spitting the same error back. The source addresses are correct (I can get to the file on localhost from the address in the error log) and the local directory is properly constructed - just moving the files into it doesn't work. The full code is here, the most relevant section is:
// output build files
foreach($paths as $path)
{
echo "<br>";
$path = str_replace($localroot, "", $path);
$source = $hosted . $path;
$dest = $localbuild . $path;
if (is_dir_path($dest))
{
mkdir($dest, 0755, true);
echo "Make folder $source at $dest. <br>";
}
else
{
copy($source, $dest);
echo "Copy $source to $dest. <br>";
}
}
You are trying to use URLs to traverse local filesystem directories. URLs are only for the webserver to understand web requests.
You will have more luck if you change this:
copy(https:\\localhost\intranet\builder.php)
to this:
copy(C:\xampp\htdocs\intranet\builder.php)
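As a small sketch of what that looks like as actual PHP (the destination below is just a hypothetical build location; adjust it to your $localbuild path):
// both source and destination are plain filesystem paths, not URLs
$source = 'C:\xampp\htdocs\intranet\builder.php';
$dest   = 'C:\xampp\htdocs\intranet\build\builder.php'; // hypothetical destination
if (!copy($source, $dest)) {
    echo "Copy failed for $source<br>";
}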
EDIT
Based on your additional info in the comments, I understand that you need to generate static HTML files for hosting on a static-only webserver. This is not really an issue of copying files. It's about accessing the HTML that the script generates when run through a webserver.
You can do this in a few different ways. I'm not sure exactly how the generator script works, but it seems like that script is trying to copy the supposed output from loads of PHP files.
To get the generated content from a PHP file you can either use the command line php command to execute the script, like so: c:\some\path>php some_php_file.php > my_html_file.html, or use the power of the webserver to do it for you:
<?php
$hosted = "https://localhost/intranet/"; // <-- UPDATED
foreach($paths as $path)
{
echo "<br>";
$path = str_replace($localroot, "", $path);
$path = str_replace("\\", "/", $path); // <-- ADDED
$source = $hosted . $path;
$dest = $localbuild . $path;
if (is_dir_path($dest))
{
mkdir($dest, 0755, true);
echo "Make folder $source at $dest. <br>";
}
else
{
$content = file_get_contents(urlencode($source));
file_put_contents(str_replace(".php", ".html", $dest), $content);
echo "Copy $source to $dest. <br>";
}
}
In the code above I use file_get_contents() to read the HTML from the URL you are using (https://...), which in this case, unlike copy(), will call up the webserver, triggering the PHP engine to produce the output.
Then I write the pure HTML to a file in the $dest folder, replacing the .php with .html in the filename.
EDIT
Added and revised the code a bit above.
I need to run 600 XML files through a script I've made that extracts specific pieces of information and saves each one in JSON format. All 600 XML files are inside a folder, ready to be run through the PHP file; I'm now looking for a fast way to do it.
Essentially this is the process the PHP file goes through:
PHP reads single XML file via URL -> locally saves important info in variables -> saves important info into JSON file
Is there a way I can somehow run all 600 XML files through my PHP file?
Thanks
Open the directory containing the XML files and then process them; here are some of the most common ways to do that.
opendir()
<?php
$dir = "/etc/php5/";
// Open a known directory, and proceed to read its contents
if (is_dir($dir)) {
if ($dh = opendir($dir)) {
while (($file = readdir($dh)) !== false) {
echo "filename: $file : filetype: " . filetype($dir . $file) . "\n";
}
closedir($dh);
}
}
?>
You can also use glob()
<?php
foreach (glob("*.txt") as $filename) {
echo "$filename size " . filesize($filename) . "\n";
}
?>
Inside the foreach loop of whichever you choose, you can use file_get_contents() or fread(), and then do your conversion to JSON.
<?php
// get contents of a file into a string
$filename = "/usr/local/something.txt";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
?>
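Putting the pieces together, a rough sketch of the whole loop; the folder name, the fields extracted, and the use of simplexml_load_file() instead of fread() are assumptions you would replace with your own parsing code:
<?php
// convert every XML file in ./xml into a .json file next to it
foreach (glob("xml/*.xml") as $filename) {
    $xml = simplexml_load_file($filename);
    if ($xml === false) {
        continue; // skip files that fail to parse
    }
    // pull out whatever pieces of information you need (placeholder example)
    $data = array('file' => basename($filename), 'root' => $xml->getName());
    file_put_contents(preg_replace('/\.xml$/', '.json', $filename), json_encode($data));
}
?>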
Hope it helps
Just go ahead and try! You'll probably run into a timeout error. If you do, try configuring the max timeout settings. http://php.net/manual/en/function.set-time-limit.php
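For example, near the top of the script (only for a one-off batch run; the memory value is just an example):
<?php
set_time_limit(0);               // remove the execution time limit for this run
ini_set('memory_limit', '512M'); // optional; only raise it if you actually hit the default
?>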
Joel,
Sounds to me like what you need to do is use readdir:
http://php.net/manual/en/function.readdir.php
This will allow you to get a list of files in a directory to iterate over.
$dir = opendir('/path/to/files');
while (($file = readdir($dir)) !== false) {
    if ($file !== '.' && $file !== '..' && !is_dir($file)) {
        $pathParts = pathinfo($file);
        if ($pathParts['extension'] === 'xml') {
            runscripton($file);
        }
    }
}
closedir($dir);
First, write a function that takes an XML file name and, after processing, returns the results as a PHP array or as JSON (depending on how you need your code to work).
To write this function, you need to parse XML (http://php.net/manual/en/book.xml.php).
To work with JSON in PHP: http://php.net/manual/en/book.json.php
Then write your main code. It should enumerate all XML files in the folder, call your function for each file, and gather/generate the JSON from the information returned by the function.
You might need readdir to gather all of the XML files in the folder (http://php.net/manual/en/function.readdir.php).
Don't forget to increase the time limit, since there are lots of XML files and the process might take long enough to hit a timeout error (http://php.net/manual/en/function.set-time-limit.php).
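A rough skeleton of that structure, assuming SimpleXML for the parsing; extractInfo() and the paths are placeholders for your own code:
<?php
set_time_limit(0); // the batch may run longer than the default limit

function extractInfo($xmlFile) {
    $xml = simplexml_load_file($xmlFile);
    if ($xml === false) {
        return null; // unreadable or malformed file
    }
    // pull out the specific pieces of information you need here
    return array('file' => basename($xmlFile));
}

$dir = '/path/to/xmlfiles/';
$results = array();
if ($dh = opendir($dir)) {
    while (($file = readdir($dh)) !== false) {
        if (pathinfo($file, PATHINFO_EXTENSION) !== 'xml') {
            continue; // skip ".", ".." and non-XML entries
        }
        $info = extractInfo($dir . $file);
        if ($info !== null) {
            $results[] = $info;
        }
    }
    closedir($dh);
}
file_put_contents('output.json', json_encode($results));
?>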
This question should be pretty simple. I have a PHP file in a directory that contains function calls to read files in that directory. I need to be able to access those functions and call them from outside the directory. Is there a way to make PHP execute those functions relative to the file they are physically in rather than the file they were included into? If not, how would I make sure that I could read those files from different parts of the directory structure?
Thanks
__FILE__ is the name of "this" file, even if it is included somewhere else
http://us3.php.net/manual/en/language.constants.predefined.php
so fopen(dirname(__FILE__) . '/blah') will open the file from the same directory
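A tiny sketch of that idea; the file and function names are made up for illustration:
<?php
// lib/reader.php -- gets included from scripts in other directories
function readMyData() {
    // dirname(__FILE__) (or __DIR__ on PHP 5.3+) is always lib/,
    // no matter which script included this file
    return file_get_contents(dirname(__FILE__) . '/data.txt');
}
?>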
So if I understand well, you have something like this:
function processDir() {
if ($dh = opendir('.')) {
while (($file = readdir($dh)) !== false) {
// some processing
}
closedir($dh);
}
}
So why don't you pass the directory to treat as a parameter of your processing function;
in my example, processDir() takes that directory as an argument.
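A minimal sketch of that parameterised version (the path in the example call is just a placeholder):
function processDir($aDir) {
    if ($dh = opendir($aDir)) {
        while (($file = readdir($dh)) !== false) {
            // some processing
        }
        closedir($dh);
    }
}

// each caller can now decide which directory to treat
processDir('/path/to/files');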