I've got two servers. One for the files and one for the website.
I figured out how to upload the files to that server but now I need to show the thumbnails on the website.
Is there a way to go through the folder /files on the file server and display a list of those files on the website using PHP?
I've been searching for a while now but can't find the answer.
I tried using scandir([URL]) but that didn't work.
I'm embarrassed to say this but I found my answer at another post:
PHP directory list from remote server
function get_text($filename) {
    $content = '';
    $fp_load = fopen($filename, "rb");
    if ($fp_load) {
        while (!feof($fp_load)) {
            $content .= fgets($fp_load, 8192);
        }
        fclose($fp_load);
        return $content;
    }
}
$matches = array();
preg_match_all('/(a href=")([^?"]*)(")/i', get_text('http://www.xxxxx.com/my/cool/remote/dir'), $matches);
foreach ($matches[2] as $match) {
    echo $match . '<br>';
}
scandir will not work on any server but your own. If you want to keep the files on a separate server, your best bet is to have one PHP file on the website and one on the file server. The PHP file on your website can get file data from the other server by having the file-server script print the data and the web-server script read it in. Example:
Webserver:
<?php
$filedata = file_get_contents("url to file handler php");
?>
Fileserver:
<?php
echo "info you want webserver to read";
?>
This can also be adapted to your needs with POST and GET requests.
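For instance, the file-server script could take the folder as a GET parameter and print one filename per line, which the web-server script then splits and renders. This is a minimal sketch; the hostname fileserver.example, the script name list.php, and the /files path are all assumptions:

```php
<?php
// Fileserver: list.php -- print one filename per line for the requested subfolder
$base = '/files/';                     // root folder on the file server (assumed)
$sub  = basename($_GET['dir'] ?? '');  // basename() blocks ../ traversal
foreach (glob($base . $sub . '/*.*') as $file) {
    echo basename($file), "\n";
}
```

```php
<?php
// Webserver: fetch the list and render each entry as a thumbnail
$raw   = file_get_contents('http://fileserver.example/list.php?dir=thumbs');
$names = array_filter(explode("\n", $raw));
foreach ($names as $name) {
    echo '<img src="http://fileserver.example/files/thumbs/' . rawurlencode($name) . '"><br>';
}
```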
I used the following method:
I created a script which goes through all the files at the file server.
$fileList = glob($dir."*.*");
This is only possible if the script actually runs on the file server. It would be rather strange to go through files on another server without having access to it.
There is a way to do it without access (read my other answer), but it is very slow and not very practical.
I know I said that I didn't have access, but I did. I just wanted to know all the possibilities.
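A minimal sketch of such a script, run directly on the file server, that lists only images and emits a thumbnail tag for each (the /files path and the extension filter are assumptions):

```php
<?php
// List files in the folder and emit an image tag for each picture
$dir = '/files/';               // assumed location on the file server
$fileList = glob($dir . '*.*');
foreach ($fileList as $path) {
    // only render common image formats
    if (preg_match('/\.(jpe?g|png|gif)$/i', $path)) {
        echo '<img src="/files/' . rawurlencode(basename($path)) . '"><br>';
    }
}
```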
EDIT: I'm pretty sure the issue has to do with the firewall, which I can't access. Marking Canis' answer as correct and I will figure something else out, possibly wget or just manually scraping the files and hoping no major updates are needed.
EDIT: Here's the latest version of the builder and here's the output. The build directory has the proper structure and most of the files, but only their name and extension - no data inside them.
I am coding a PHP script that searches the local directory for files, then scrapes my localhost (XAMPP) for the same files to copy into a build folder (the goal is to build the site with PHP on localhost and then put it on a server as plain HTML).
Unfortunately I am getting the error: Warning: copy(https:\\localhost\intranet\builder.php): failed to open stream: No such file or directory in C:\xampp\htdocs\intranet\builder.php on line 73.
That's one example - every file in the local directory is spitting the same error back. The source addresses are correct (I can get to the file on localhost from the address in the error log) and the local directory is properly constructed - just moving the files into it doesn't work. The full code is here, the most relevant section is:
// output build files
foreach ($paths as $path) {
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest)) {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    } else {
        copy($source, $dest);
        echo "Copy $source to $dest. <br>";
    }
}
You are trying to use URLs to traverse local filesystem directories. URLs are only meaningful to a webserver handling web requests.
You will have more luck if you change this:
copy("https:\\localhost\intranet\builder.php")
to this:
copy("C:\xampp\htdocs\intranet\builder.php")
EDIT
Based on your additional info in the comments, I understand that you need to generate static HTML files for hosting on a static-only webserver. This is not really an issue of copying files. It's about accessing the HTML that the script generates when run through a webserver.
You can do this in a few different ways actually. I'm not sure exactly how the generator script works, but it seems like that script is trying to copy the supposed output from loads of PHP-files.
To get the generated content from a PHP file you can either use the command-line php binary to execute the script, like so: c:\some\path>php some_php_file.php > my_html_file.html, or use the power of the webserver to do it for you:
<?php
$hosted = "https://localhost/intranet/"; // UPDATED
foreach ($paths as $path) {
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $path = str_replace("\\", "/", $path); // ADDED
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest)) {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    } else {
        $content = file_get_contents($source);
        file_put_contents(str_replace(".php", ".html", $dest), $content);
        echo "Copy $source to $dest. <br>";
    }
}
In the code above I use file_get_contents() to read the HTML from the URL you are using (https://...), which in this case, unlike copy(), calls up the webserver and triggers the PHP engine to produce the output.
Then I write the pure HTML to a file in the $dest folder, replacing .php with .html in the filename.
EDIT
Added and revised the code a bit above.
I have a script and I don't know why or how it works; one reason for that is that I found contradictory information about file_get_contents.
I have three (internal) webservers - all set up the same way, running the same software.
I need to count the number of files in one specific folder on each server (in order to get the number of users logged into a certain application).
For the local server my file counting PHP script is called by a simple include and for the two remote servers I use file_get_contents.
In both cases I refer to the same PHP file. That works - I get the correct number of files for the folder on each server.
Sometimes you read that file_get_contents returns just the file content but does not execute the file. In my case the file is executed and I get the correct number of files, so I'm a bit confused about why my scripts actually work.
My scripts were saved on one server. I wanted to be more flexible and be able to call the scripts from each server, so I created a new virtual directory on a network folder and moved the script files there; the virtual folder has the same setup on each server. I had to change my script slightly to get the same result again: instead of return $num I now have echo $num. If I use return I get no result; if I use echo the correct number of files is given. I would prefer to receive the result via return, but I don't know why this no longer works in the new context.
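The difference between the two call styles can be sketched like this: include captures a script's return value in-process, while an HTTP request only ever transmits what the script echoes. The file name counter.php and the fixed count are illustrative assumptions:

```php
<?php
// counter.php -- works for both kinds of caller
$number = 42;    // stand-in for the real file count
echo $number;    // an HTTP caller (file_get_contents on a URL) only sees this output
return $number;  // only a local include/require receives this value
```

Locally, $n = (include 'counter.php'); yields the integer 42, while $n = file_get_contents('http://server/counter.php'); yields the string '42' because the return value never crosses HTTP.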
script which shows the number of files:
function getUserNum($basis_url_server, $url_vaw_scripte, $script_number_users)
{
    $serverName = strtoupper($_SERVER['SERVER_NAME']);
    // local server
    if (strpos(strtoupper($basis_url_server), $serverName) !== false) {
        $numUsers = (include($script_number_users));
    }
    // remote server
    else {
        $path = $basis_url_server.$url_vaw_scripte.$script_number_users;
        $numUsers = file_get_contents($path);
        //include($path);
    }
    return $numUsers;
}
echo getUserNum($basis_url_server1, $url_vaw_scripte, $script_number_users)."($label_server1)";
echo getUserNum($basis_url_server2, $url_vaw_scripte, $script_number_users)."($label_server2)";
echo getUserNum($basis_url_server3, $url_vaw_scripte, $script_number_users)."($label_server3)";
The script for counting the files (referred to as $script_number_users above):
<?php
// 'include' only contains $pfadSessionRepository = "E:\Repository\Session"
include dirname(__DIR__).'/vaw_settings.php';
$fi = new FilesystemIterator($pfadSessionRepository, FilesystemIterator::SKIP_DOTS);
$number = (iterator_count($fi) - 1) / 2;
//return $number;
echo $number;
?>
file_get_contents() will execute a GET request if given a URL, and will read a file if given a filesystem path. It behaves like two different functions behind the same call.
You are actually building a primitive REST webservice instead of loading the files as you thought: the remote files are executed, and you get the output you would see if you loaded them manually in a browser.
file_get_contents() will return the raw content of a local file. For remote files it will return what the webserver delivers. If the webserver executes the script in the file it will get the result of that script. If the webserver doesn't execute the script in the file (due to a misconfiguration for example) you will still get the raw content of the remote script.
In your case I'd just remove the include path and fetch all three scripts over HTTP. It reduces the complexity, and the overhead of calling a script via HTTP instead of loading it directly is negligible.
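The two modes can be made visible with a quick check (a sketch; the paths and hostname are placeholders):

```php
<?php
// Local filesystem path: returns the raw PHP source, nothing is executed
$raw = file_get_contents('/var/www/count.php');

// URL: an HTTP GET -- returns whatever the webserver sends back
$out = file_get_contents('http://server.example/count.php');

// If the "result" still contains an opening PHP tag, the remote server
// delivered the source instead of executing it (a misconfiguration)
if (strpos($out, '<?php') !== false) {
    trigger_error('Remote server returned raw PHP source', E_USER_WARNING);
}
```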
I am writing a script that will go through all my .js files and minify them into one .php file to be included on the site. I just run this script after I have edited some js and want to upload it to the live site.
The issue: I cannot load the content of jquery-2.1.4.min.js using file_get_contents. I have tried changing the name of the file to jquery.js and that did not help. The other files contain no complex JavaScript (just random strings) and they open fine.
With the code:
if (!file_get_contents($filename)) {
    die("dammit");
}
I get the response of "dammit". All other files are fine though, so I know the file name and path are correct. One of the weird things is that there are no errors coming up (I have used error_reporting (-1); to make sure they will).
Is anyone else able to get the file contents of jquery? Any ideas what would cause this and if it will be a problem with other javascript or css?
As requested, here is the full code:
$buffer = $jsStartBuffer;
// get a list of files in the folder (only .js files)
$fileArray = array();
if (is_dir($jsMakeFile["SourcePath"])) {
    if ($dh = opendir($jsMakeFile["SourcePath"])) {
        while (($file = readdir($dh)) !== false) {
            $file_parts = pathinfo($jsMakeFile["SourcePath"].$file);
            if (isset($file_parts['extension']) && $file_parts['extension'] == "js") {
                $fileArray[] = $file;
            }
        }
    }
}
print_r($fileArray);
foreach ($fileArray as $nextRawFile) {
    $contents = file_get_contents($jsMakeFile["SourcePath"].$nextRawFile);
    if ($contents === false) {
        die("dammit");
    }
    $buffer .= $contents;
    echo $jsMakeFile["SourcePath"].$nextRawFile;
}
$buffer .= $jsEndBuffer;
echo $buffer;
$buffer = \JShrink\Minifier::minify($buffer);
file_put_contents($jsMakeFile["finalFile"]["path"].$jsMakeFile["finalFile"]["name"], $buffer);
When I put other .js files in there it is fine (I even tried lightbox.min.js and it worked fine!) I have tried a few different versions of jquery.min and they all seem to fail.
OK, solution found. It is to do with the actual file created by jQuery.
The way I solved it was:
- Go to the jQuery site and, instead of downloading the required file, open it in a new tab/window
- Copy all the content in this window
- Create a new file where required and name it as required
- Paste the content into this file and save it
This new file can now be read by file_get_contents. I would imagine this solution would help if you are having issues working with jQuery (and other) files in PHP in any way.
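One plausible culprit for this kind of failure is a byte-order mark or unusual encoding in the downloaded file, which re-saving the content as plain text removes. That explanation is an assumption, not something confirmed above, but it can be guarded against in code:

```php
<?php
// Read a JS file defensively: distinguish real failures from empty files
// and strip a UTF-8 byte-order mark if one is present (assumed cause).
function read_js($path) {
    $content = file_get_contents($path);
    if ($content === false) {
        return false;                      // actual read failure
    }
    if (substr($content, 0, 3) === "\xEF\xBB\xBF") {
        $content = substr($content, 3);    // drop the UTF-8 BOM
    }
    return $content;
}
```

Using $contents === false instead of !$contents also avoids treating a legitimately empty file as an error.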
I want to store some data retrieved using an API on my server. Specifically, these are .mp3 files of (free) learning tracks. I'm running into a problem though. The mp3 link returned from the request isn't to a straight .mp3 file, but rather makes an ADDITIONAL API call which normally would prompt you to download the mp3 file.
file_put_contents doesn't seem to like that. The mp3 file is empty.
Here's the code:
$id = $_POST['cid'];
$title = $_POST['title'];
if (!file_exists("tags/".$id."_".$title)) {
    mkdir("tags/".$id."_".$title);
} else {
    echo "Dir already exists";
}
file_put_contents("tags/{$id}_{$title}/all.mp3", fopen($_POST['all'], 'r'));
And here is an example of the second API I mentioned earlier:
http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts
Is there some way to bypass this intermediate step? If there's no way to access the direct URL of the mp3, is there a way to redirect the file download prompt to my server?
Thank you in advance for your help!
EDIT
Here is the current snippet. I should be echoing something, correct?
$handle = fopen("http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts", 'rb');
$contents = stream_get_contents($handle);
echo $contents;
Because this echoes nothing.
SOLUTION
Ok, I guess file_get_contents is supposed to handle redirects just fine, but this wasn't happening. So I found this function: https://stackoverflow.com/a/4102293/2723783 to return the final redirect of the API. I plugged that URL into file_get_contents and voilà!
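For reference, redirect handling can also be made explicit with a stream context instead of resolving the final URL by hand. A sketch, using the URL from the question; follow_location is already on by default for the http wrapper, and the user-agent string is an assumption (some hosts reject requests without one):

```php
<?php
// Make redirect-following explicit and identify ourselves to the host
$context = stream_context_create([
    'http' => [
        'follow_location' => 1,           // follow HTTP redirects (the default)
        'max_redirects'   => 10,
        'user_agent'      => 'Mozilla/5.0',
    ],
]);
$url  = 'http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts';
$data = file_get_contents($url, false, $context);
if ($data !== false) {
    file_put_contents('all.mp3', $data);
}
```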
You seem to be just opening the file handle and not reading the contents with stream_get_contents(), fread(), or a similar function:
http://www.php.net/manual/en/function.fread.php
$handle = fopen($_POST['all'], 'rb');
file_put_contents("tags/{$id}_{$title}/all.mp3", stream_get_contents($handle));
I want to upload an external image to my remote host via PHP. That means I want code which copies http://www.exaple.com/123.jpg to http://www.mysite.com/image.jpg.
I can't use the exec function, as it has been disabled by the web host.
Best regards
Google is your friend:
<?php
exec("wget http://www.exaple.com/123.jpg");
exec("mv 123.jpg image.jpg");
?>
(this code is to be executed on the server www.mysite.com)
You can run a php script to get the file and save it in your site.
<?php
$data = file_get_contents('http://www.exaple.com/123.jpg');
if ($data !== false) {
    file_put_contents('image.jpg', $data);
}
else {
    // error in fetching the file
}
file_get_contents is binary safe. You can find more about it on the PHP website: http://php.net/manual/en/function.file-get-contents.php
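Since the same URL wrappers also power copy(), the whole transfer can alternatively be written as a single call. A sketch; like the version above, it requires allow_url_fopen to be enabled on the host:

```php
<?php
// copy() accepts a URL as its source when allow_url_fopen is on
if (!copy('http://www.exaple.com/123.jpg', 'image.jpg')) {
    // error in fetching the file
}
```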