Using PHP, I want to get the content of the first file in a folder and, once the content is loaded, delete the file. Here is what I have:
$a = opendir('./');
while (false !== ($entry = readdir($a))) {
    if ($entry != '.' && $entry != '..') {
        echo $entry = file_get_contents('./'.$entry, FILE_USE_INCLUDE_PATH);
        break; // only need the first file
    }
}
The code above loads the first file in the folder and I can delete it successfully using something like
unlink("temp.txt");
So there are no permission-denied errors. BUT what I need to do is delete the file by its variable name (because every filename is different). Surprisingly for me, unlink("$entry"); or anything similar does not delete it; instead it shows a warning along with the first few lines of that file's content. If I echo $entry, it shows temp.txt correctly. Can someone enlighten me? What am I missing here?
(Optional, (un)related question: if I have numeric files like 1.txt, 2.txt, 3.txt, 10.txt..., is there a way to modify the code above so that it loads them in the order 1, 2, 3, 10... instead of 1, 10, 2, 3...?)
UPDATE:
The updated code that works (for future reference):
$a = opendir('./');
while (false !== ($entry = readdir($a))) {
    if ($entry != '.' && $entry != '..') {
        echo $b = file_get_contents('./'.$entry, FILE_USE_INCLUDE_PATH);
        break; // only need the first file
    }
}
unlink("./$entry");
The first problem is that you're overwriting $entry with the file contents, as such the filename is no longer valid when trying to delete it (explaining the error with the file contents).
Secondly, because you're using FILE_USE_INCLUDE_PATH you don't know exactly where the file was read from, and unlink resolves relative to the current working directory, which is not necessarily the directory you opened.
Prefix the filename with the directory path, as in unlink('./'.$entry), and you'll be fine.
As for the unrelated question: use scandir to get all the files in the folder, then apply natsort to the resulting array to sort it with a 'natural sorting algorithm'. Keep in mind that a directory listing always also includes the entries . and .., which you'll have to detect and skip or remove manually.
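A minimal sketch of that combination, using the current directory as in the question:
<?php
// natsort orders 1.txt, 2.txt, 3.txt, 10.txt instead of 1, 10, 2, 3.
$files = scandir('./');
natsort($files);

foreach ($files as $file) {
    if ($file == '.' || $file == '..') {
        continue; // scandir always includes these two entries
    }
    echo $file, "\n";
}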
I have a php script that will echo a list of files from a folder and display them randomly on my page.
At the moment it displays the url of the file for example: what-can-cause-tooth-decay.php
I would like it to display the page title instead: <title>What can cause tooth decay</title>.
Current code:
<?php
$thelist = ''; // initialize so the .= below doesn't raise a notice
if ($handle = opendir('health')) {
    $fileTab = array();
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            $fileTab[] = $file;
        }
    }
    closedir($handle);
    shuffle($fileTab);
    foreach ($fileTab as $file) {
        $thelist .= '<p>'.$file.'</p>';
    }
}
?>
<?=$thelist?>
Many thanks
I see a couple of possible approaches:
You could parse the files: run a regex over the result of file_get_contents to pull out the title element (see the sketch after this list).
Place some text file somewhere, which maps file names to titles. Load it into memory and use it to populate an array which you can use to map the file names. Let it have file_name.html Title on each line, or something like that.
Use a naming convention, so the titles can be inferred from the filenames. Something like "capitalize the first letter, turn '_' into a space".
Downsides of the 2nd and 3rd approaches: you'd have to keep them consistent with the actual title elements in the files. Problem with the first approach: you have to read every file into memory, which could be a performance problem with many/big files. To solve this, I could imagine writing a script which looks through all the files and generates the lookup text file. Whenever you change a title in a file, or add/remove files, you'd just rerun that script.
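Here is a sketch of the first approach, falling back to the naming convention of the third when a file has no title element. It assumes the files sit in the health folder from the question; the pageTitle helper is a name I made up:
<?php
// Pull the <title> element out of a page, or infer one from the filename.
function pageTitle($path) {
    $contents = file_get_contents($path);
    if (preg_match('/<title>(.*?)<\/title>/is', $contents, $m)) {
        return trim($m[1]);
    }
    // Fallback: "what-can-cause-tooth-decay.php" -> "What can cause tooth decay"
    $name = pathinfo($path, PATHINFO_FILENAME);
    return ucfirst(str_replace(array('-', '_'), ' ', $name));
}

$thelist = '';
foreach (glob('health/*.php') as $path) {
    $thelist .= '<p><a href="' . $path . '">' . htmlspecialchars(pageTitle($path)) . '</a></p>';
}
echo $thelist;
Reading every file on each request is exactly the performance concern mentioned above; caching the generated list in a text file would avoid it.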
I have been using a snippet I found online to display the files and folders in a directory with my PHP script. This part works fine, but I would like to be able to click the folders and get a similar HTML page that displays their contents, and to be able to open the text files in the browser (this works fine if the text file is in the start directory).
At the moment, when I click a folder, an old-school sort of page opens up with no HTML file in it. Is the only way to do this to create a new PHP script in each folder and link to it? I tried using the DirectoryIterator class, but it gave me an error. I don't have the snippet for DirectoryIterator anymore, but the error was something like "Can't find class DirectoryIterator".
Here's the code I'm using now (working):
$arrayImports = array();
if ($handle = opendir($importLogPath))
{
    while (false !== ($entry = readdir($handle)))
    {
        $entry = chop($entry); // chop() returns the trimmed string, it doesn't modify in place
        if ($entry != "." && $entry != "..")
        {
            $arrayImports[] = "<p><a href=\"$importLogPath$entry\" target=\"_blank\">$entry</a></p>";
        }
    }
    closedir($handle);
}
arsort($arrayImports);
foreach ($arrayImports as $value)
{
    print "<li>$value</li>";
}
Thanks!
Are you looking for something like this?
Encode Explorer
I used it for an old project... it's very cool. You can also add AJAX to browse folders without refreshing the page.
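For the folder-clicking part of the question, one script can also do the job on its own by taking the subfolder as a query parameter, instead of needing a copy of the script in every folder. A rough sketch, one level deep; the dir parameter name and the base folder are my own assumptions:
<?php
$base = 'importlogs'; // hypothetical base folder
// basename() strips any "../" so visitors cannot climb out of $base.
$sub  = isset($_GET['dir']) ? basename($_GET['dir']) : '';
$path = ($sub === '') ? $base : $base . '/' . $sub;

foreach (scandir($path) as $entry) {
    if ($entry == '.' || $entry == '..') {
        continue;
    }
    if (is_dir($path . '/' . $entry)) {
        // Folders link back to this same script with the subfolder as a parameter.
        echo "<li><a href=\"?dir=" . rawurlencode($entry) . "\">$entry/</a></li>";
    } else {
        // Files open directly in the browser, as in the original snippet.
        echo "<li><a href=\"$path/$entry\" target=\"_blank\">$entry</a></li>";
    }
}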
Bye
When I use simplexml_load_file on an individual file, it works fine. However, I have so many of them that I ran my script over a batch of files, and that is where the error appears.
In case this is relevant, I have two kinds of log files to load into my database, each starting with a different root element. (But looking at the error, the error occurs even with the same structure.)
<?php
$dir_path = ".";
if ($dir_handler = opendir($dir_path)) {
    while (($sub_dir = readdir($dir_handler)) !== false) { // read all sub dirs
        if (is_dir($sub_dir)) {
            if (substr($sub_dir, 0, 6) == "201209") { // filter only desired sub dirs
                $sub_dir_handler = opendir($sub_dir);
                while ($file = readdir($sub_dir_handler)) { // read files in each qualified sub dir
                    if (($file != ".") && ($file != "..")) { // except . and ..
                        $xml = simplexml_load_file($file); // got error on the second file
                        if ($xml->getName() != "hash") { // tried to distinguish structure type but error
I guess simplexml_load_file returns false (an error).
You can see the details using libxml_get_errors, cf. http://php.net/manual/en/simplexml.examples-errors.php
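A minimal sketch of that pattern (the path is a placeholder):
<?php
// Collect XML errors quietly instead of emitting warnings.
libxml_use_internal_errors(true);

$xml = simplexml_load_file('20120901/log.xml'); // placeholder path
if ($xml === false) {
    foreach (libxml_get_errors() as $error) {
        echo trim($error->message), "\n"; // e.g. "failed to load external entity"
    }
    libxml_clear_errors();
}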
Edit:
Considering your 2nd comment, it looks like your file is not accessible to SimpleXML...
Found the silly cause.
Because I keep all the files in subdirectories, I "need" to concatenate the subdirectory onto the filename to build the path. The error was therefore caused by the file failing to open at all. It also explains why I can open a file individually when I hard-code the filename.
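In other words, the load call inside the inner loop needs the subdirectory prefix:
$xml = simplexml_load_file($sub_dir . '/' . $file);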
Since there is now no error, libxml_get_errors no longer applies to my script.
:)
Since I want to skip the second type of XML schema, which caused the error, I use
if (!libxml_get_errors())
to continue my work. Many, many thanks to you.
I'm not a developer, but I'm the default developer at work now. : ) Over the last few weeks I've found a lot of my answers here and at other sites, but this latest problem has me confused beyond belief. I KNOW it's a simple answer, but I'm not asking Google the right questions.
First... I have to use text files, as I don't have access to a database (things are locked down TIGHT where I work).
Anyway, I need to look into a directory for text files stored there, open each file and display a small amount of text, while making sure the text I display is sorted by the file name.
I'm CLOSE, I know it... I finally managed to figure out sorting, and I know how to read into a directory and display the contents of the files, but I'm having a heck of a time merging those two concepts together.
Can anyone provide a bit of help? With the script as it is now, I echo the sorted file names with no problem. My line of code that I thought would read the contents of a file and then display it is only echoing the line breaks, but not the contents of the files. This is the code I've got so far - it's just test code so I can get the functionality working.
<?php
$dirFiles = array();
if ($handle = opendir('./event-titles')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            $dirFiles[] = $file;
        }
    }
    closedir($handle);
}
sort($dirFiles);
foreach ($dirFiles as $file)
{
    $fileContents = file_get_contents($file); //////// This is what's not working
    echo $file."<br>".$fileContents."<br/><br/>";
}
?>
Help? : )
Dave
$files = scandir('./event-titles') will return an array of filenames in filename-sorted order. You can then do
foreach ($files as $file)
{
    if ($file == '.' || $file == '..') {
        continue; // scandir also returns these two entries
    }
    $fileContents = file_get_contents('./event-titles/'.$file);
    echo $file."<br/>".$fileContents."<br/><br/>";
}
Note that I use the directory name in the file_get_contents call, as the filename by itself would cause file_get_contents to look in the current directory, not the directory you specified in scandir. Note also the check that skips the . and .. entries scandir always returns.
Say I have a directory with 100 files in it: some PHP, the others HTML. None of the files link to one another, and there is no index file. It's a shared hosting cPanel environment. My question: is there a way, via PHP or otherwise, to automatically detect these files and generate a sitemap in HTML, XML or another format? Thanks very much for your help on this one.
Untested, but here are a couple of scripts which I think may solve your issue:
http://apptools.com/phptools/dynamicsitemap.php
http://yoast.com/xml-sitemap-php-script/
If you want a proper sitemap (how the files link to one another) then there are some libraries available for that mentioned by others. If you just want to list them, then just use the opendir and readdir functions:
$directory = 'your directory';
$array_items = array();
if ($handle = opendir($directory)) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            if (is_dir($directory.'/'.$file)) {
                continue;
            }
            $array_items[] = $file;
        }
    }
    closedir($handle);
}
You can then loop through $array_items and output XML or HTML. You can also make this recursive by turning the listing into a function and changing the
if (is_dir($directory.'/'.$file)) {
    continue;
}
section to recurse into the subdirectory instead of skipping it; a sketch follows below.
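A sketch of that recursive variant; the function name is my own:
<?php
// Recursively collect file paths under $directory.
function listFiles($directory) {
    $items = array();
    if ($handle = opendir($directory)) {
        while (false !== ($file = readdir($handle))) {
            if ($file == '.' || $file == '..') {
                continue;
            }
            $path = $directory . '/' . $file;
            if (is_dir($path)) {
                // Recurse instead of skipping, merging the nested results.
                $items = array_merge($items, listFiles($path));
            } else {
                $items[] = $path;
            }
        }
        closedir($handle);
    }
    return $items;
}

// Emit a simple HTML sitemap from the collected paths.
echo "<ul>\n";
foreach (listFiles('your directory') as $path) {
    echo '<li><a href="' . $path . '">' . $path . "</a></li>\n";
}
echo "</ul>\n";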