PHP file_get_contents to get jquery min code - php

I am writing a script that will go through all my .js files and minify them into one .php file to be included on the site. I just run this script after I have edited some JS and want to upload it to the live site.
The issue: I cannot load the content of jquery-2.1.4.min.js using file_get_contents. I have tried changing the name of the file to jquery.js and that did not help. I do not have any complex JavaScript in the other files (just random strings) but they open fine.
With the code:
if (!file_get_contents($filename)) {
    die("dammit");
}
I get the response "dammit". All other files are fine, so I know the file name and path are correct. One of the strange things is that no errors come up (I have used error_reporting(-1); to make sure they would).
Is anyone else able to get the file contents of jQuery? Any ideas what would cause this, and whether it will be a problem with other JavaScript or CSS files?
As requested, here is the full code:
$buffer = $jsStartBuffer;
//get a list of files in the folder (only .js files)
$fileArray = array();
if (is_dir($jsMakeFile["SourcePath"])) {
    if ($dh = opendir($jsMakeFile["SourcePath"])) {
        while (($file = readdir($dh)) !== false) {
            $file_parts = pathinfo($jsMakeFile["SourcePath"].$file);
            if ($file_parts['extension'] == "js") {
                $fileArray[] = $file;
            }
        }
        closedir($dh);
    }
}
print_r($fileArray);
foreach ($fileArray as $nextRawFile) {
    $buffer .= file_get_contents($jsMakeFile["SourcePath"].$nextRawFile);
    if (!file_get_contents($jsMakeFile["SourcePath"].$nextRawFile)) {
        die("dammit");
    }
    echo $jsMakeFile["SourcePath"].$nextRawFile;
}
$buffer .= $jsEndBuffer;
echo $buffer;
$buffer = \JShrink\Minifier::minify($buffer);
file_put_contents($jsMakeFile["finalFile"]["path"].$jsMakeFile["finalFile"]["name"], $buffer);
When I put other .js files in there it is fine (I even tried lightbox.min.js and it worked!). I have tried a few different versions of jquery.min and they all seem to fail.

OK, solution found. It has to do with the actual file as downloaded from the jQuery site.
The way I solved it was:
- Go to the jQuery site, and instead of downloading the required file, open it in a new tab/window
- Copy all the content in this window
- Create a new file where required and name as required
- Paste the content into this file and save it
This new file can now be read by file_get_contents. I would imagine this solution would help if you are working with jQuery (and other) files in PHP in any way and having issues.
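If you want to see what is actually going wrong before re-saving the file, a stricter check helps: file_get_contents() returns false on failure (note that an empty or "falsy" string would also trip the ! test used above), and error_get_last() will usually say why. A minimal diagnostic sketch, with a placeholder path:
<?php
$filename = "js/jquery-2.1.4.min.js"; // placeholder path

$contents = file_get_contents($filename);
if ($contents === false) {
    // the read itself failed; dump the last PHP error for a clue
    var_dump(error_get_last());
    die("could not read $filename");
}
echo "read " . strlen($contents) . " bytes";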

Related

How can I parse directory of xml files with php?

I am new to PHP and I am trying to create a file upload system that will automatically parse the XML files using SimpleXML. I have created a PHP script that will open the directory and try to parse the files. For some reason, it will only parse one of the files. I am not sure if this is the best way to approach this task.
<?php
$dir = "path/to/xmlfiles";
chdir($dir);
// Open a directory, and read its contents
if (is_dir($dir)) {
    if ($dh = opendir($dir)) {
        while (($file = readdir($dh)) !== false) {
            $xml = simplexml_load_file($file);
            $nombre = $xml->xpath("//NOMBRE");
            $rpu = $xml->xpath("//RPU");
            echo (string) $nombre[0];
            echo (string) $rpu[0];
            echo $file;
        }
        closedir($dh);
    }
}
?>
With this script I am able to echo the results just fine; the only problem is that it will only echo the results of one of the XML files.
Hopefully someone with more experience could give me a tip on how to achieve this.
For extra points, I am also trying to insert an entry to a Mysql database for each parsed file.
;) Thank you in advance for all your help.
readdir() reads directory entries as they're stored on disk (i.e., it doesn't sort entries), so it's very likely that . (the current directory) will be the first one. That will make simplexml_load_file() fail, $xml will become false, and $xml->xpath() will then crash the script with a fatal error.
PHP should be reporting all this. If you cannot see it, it's very likely that you haven't configured PHP to display errors.
You need to filter out entries (the bare minimum would be to check they are actual files and not directories) and add some error checking here and there.
An alternative approach:
foreach (glob("$dir/*.xml") as $file) {
}
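A fuller sketch along those lines, with the error checks mentioned above, might look like this (the //NOMBRE and //RPU XPath expressions come from the question; the directory path is a placeholder):
<?php
$dir = "path/to/xmlfiles"; // placeholder

foreach (glob("$dir/*.xml") as $file) {
    $xml = simplexml_load_file($file);
    if ($xml === false) {
        echo "Could not parse $file<br>";
        continue;
    }
    $nombre = $xml->xpath("//NOMBRE");
    $rpu = $xml->xpath("//RPU");
    if ($nombre && $rpu) {
        echo (string) $nombre[0], " ", (string) $rpu[0], " ($file)<br>";
    }
}
?>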

"No such file or directory" on localhost copy

EDIT: I'm pretty sure the issue has to do with the firewall, which I can't access. Marking Canis' answer as correct and I will figure something else out, possibly wget or just manually scraping the files and hoping no major updates are needed.
EDIT: Here's the latest version of the builder and here's the output. The build directory has the proper structure and most of the files, but only their name and extension - no data inside them.
I am writing a PHP script that searches the local directory for files, then scrapes my localhost (XAMPP) for the same files to copy into a build folder (the goal is to build the PHP site on localhost and then put it on a server as HTML).
Unfortunately I am getting the error: Warning: copy(https:\\localhost\intranet\builder.php): failed to open stream: No such file or directory in C:\xampp\htdocs\intranet\builder.php on line 73.
That's one example - every file in the local directory is spitting the same error back. The source addresses are correct (I can get to the file on localhost from the address in the error log) and the local directory is properly constructed - just moving the files into it doesn't work. The full code is here, the most relevant section is:
// output build files
foreach ($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        copy($source, $dest);
        echo "Copy $source to $dest. <br>";
    }
}
You are trying to use URLs to traverse local filesystem directories. URLs are only meaningful to the webserver, which uses them to resolve web requests.
You will have more luck if you change this:
copy(https:\\localhost\intranet\builder.php)
to this:
copy(C:\xampp\htdocs\intranet\builder.php)
EDIT
Based on your additional info in the comments, I understand that you need to generate static HTML files for hosting on a static-only webserver. This is not really an issue of copying files; it's about capturing the HTML that the script generates when run through a webserver.
You can do this in a few different ways. I'm not sure exactly how the generator script works, but it seems like it is trying to copy the supposed output of loads of PHP files.
To get the generated content from a PHP file you can either use the command-line php binary to execute the script, like so: c:\some\path>php some_php_file.php > my_html_file.html, or use the power of the webserver to do it for you:
<?php
$hosted = "https://localhost/intranet/"; // <-- UPDATED
foreach ($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $path = str_replace("\\", "/", $path); // <-- ADDED
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        $content = file_get_contents($source);
        file_put_contents(str_replace(".php", ".html", $dest), $content);
        echo "Copy $source to $dest. <br>";
    }
}
In the code above I use file_get_contents() to read the HTML from the URL you were already using (https://...), which in this case, unlike copy(), will call up the webserver and trigger the PHP engine to produce the output.
Then I write the pure HTML to a file in the $dest folder, replacing the .php with .html in the filename.
EDIT
Added and revised the code a bit above.
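As an aside, if you prefer the command-line route mentioned earlier, a rough equivalent might be the sketch below; it reuses $paths, $localroot and $localbuild from above and assumes the php binary and shell_exec() are available:
<?php
foreach ($paths as $path) {
    if (!is_file($path)) {
        continue; // skip directories for brevity
    }
    $relative = str_replace($localroot, "", $path);
    $dest = str_replace(".php", ".html", $localbuild . $relative);
    // run the script through the PHP CLI and capture its output as HTML
    $html = shell_exec("php " . escapeshellarg($path));
    file_put_contents($dest, $html);
}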

PHP handles a zip file as if it's empty

Here's a very stripped down version of the code I'm using.
$url = "http://server.com/getDaFile";
//Get the file from the server
$zip_file_contents = file_get_contents($url);
//Write file to disk
file_put_contents("file.zip", $zip_file_contents);
//open zip file
$zip = zip_open("file.zip");
if (is_resource($zip))
{
    while ($zip_entry = zip_read($zip))
    {
        if (zip_entry_open($zip, $zip_entry, 'r'))
        {
            //Read the whole file
            $buf = zip_entry_read($zip_entry, zip_entry_filesize($zip_entry));
            /*
            Do stuff with $buf!!!
            */
            zip_entry_close($zip_entry);
        }
    }
    zip_close($zip);
}
else
{
    echo "Not a resource. Oh noes!\n";
}
So: get the file, save it to disk, open the ZIP to extract the files it contains, do stuff with those files. The problem is that, for some reason I cannot figure out, zip_read() returns FALSE, as if it couldn't read the files inside the ZIP archive. $zip does contain a resource; I've checked with var_dump.
What makes this even stranger is that I downloaded the ZIP file on my PC using the URL on top, manually uploaded it to the PHP server, and commented out the calls to file_get_contents and file_put_contents so PHP uses the local version. When I do this, zip_read() correctly finds the right number of files inside the ZIP and processing proceeds as it should.
I also tried doing this : $zip = zip_open($url) but $zip fails the is_resource($zip) check.
Something is obviously wrong with my code since the URL works and returns a valid ZIP archive. What is it?
So I finally found out the problem. Following #diolemo's suggestion, I opened my server's ZIP archive in a hex editor. Here's what I found at the top, followed by the usual ZIP binary data: http://pastebin.com/vQEXJtTN
It turns out there was a PHP error mixed in with the actual content of the ZIP file. Unsure of how to fix this (but knowing it certainly had to do with HTTP headers), I tried this guy's code and, what do you know, my code works perfectly now!
Lessons learned? Never trust your data, even if it seems all right (both 7-Zip and WinRAR managed to open the file without problem).
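In the same spirit, a cheap sanity check before writing the download to disk is to verify that the data actually begins with the ZIP local file header signature (PK\x03\x04); anything in front of it, such as a stray PHP error, is junk. A minimal sketch, reusing the URL from the question:
<?php
$url = "http://server.com/getDaFile";
$zip_file_contents = file_get_contents($url);

// a well-formed ZIP download should begin with the local file header signature
$signature = "PK\x03\x04";
$pos = strpos($zip_file_contents, $signature);

if ($pos === false) {
    die("Downloaded data contains no ZIP signature at all.\n");
}
if ($pos > 0) {
    // something (an error message, stray output, ...) was prepended to the archive
    echo "Stripping $pos bytes of junk before the ZIP data.\n";
    $zip_file_contents = substr($zip_file_contents, $pos);
}
file_put_contents("file.zip", $zip_file_contents);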

Go through files in folder at another server PHP

I've got two servers. One for the files and one for the website.
I figured out how to upload the files to that server but now I need to show the thumbnails on the website.
Is there a way to go through the folder /files on the file server and display a list of those files on the website using PHP?
I searched for a while now but can't find the answer.
I tried using scandir([URL]) but that didn't work.
I'm embarrassed to say this but I found my answer at another post:
PHP directory list from remote server
function get_text($filename) {
    $fp_load = fopen("$filename", "rb");
    if ($fp_load) {
        $content = "";
        while (!feof($fp_load)) {
            $content .= fgets($fp_load, 8192);
        }
        fclose($fp_load);
        return $content;
    }
}
$matches = array();
preg_match_all("/(a href\=\")([^\?\"]*)(\")/i", get_text('http://www.xxxxx.com/my/cool/remote/dir'), $matches);
foreach ($matches[2] as $match) {
    echo $match . '<br>';
}
scandir() will not work on any server but your own. If you want to do such a thing while keeping the files on a separate server, your best bet would be to have a PHP file on the website and a PHP file on the file server. The PHP file on your website can get file data from the other server by having the file server script print the data and the webserver script read it in. Example:
Webserver:
<?php
$filedata = file_get_contents("url to file handler php");
?>
Fileserver:
<?php
echo "info you want webserver to read";
?>
This can also be customized for your needs with POST and GET requests.
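A slightly fleshed-out version of that idea might look like this; the /files path, the handler URL and the JSON format are only illustrative assumptions:
Fileserver:
<?php
// list the files in /files and print them as JSON for the webserver to read
$entries = array_filter(scandir("/files"), function ($f) {
    return is_file("/files/" . $f);
});
header("Content-Type: application/json");
echo json_encode(array_values($entries));
?>
Webserver:
<?php
// fetch and decode the list printed by the file server script
$filedata = file_get_contents("https://fileserver.example.com/list-files.php");
$files = json_decode($filedata, true);
foreach ($files as $file) {
    echo $file . "<br>";
}
?>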
I used the following method:
I created a script which goes through all the files at the file server.
$fileList = glob($dir."*.*");
This is only possible if the script is actually on the fileserver. It would be rather strange to go through files at another server without having access to it.
There is a way to do this without having access (read my other answer), but it is very slow and not very practical.
I know I said that I didn't have access, but I had. I just wanted to know all the possibilities.
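For completeness, a minimal version of such a script running on the file server might look like this (the /files/ path, and the idea of emitting <img> thumbnails, are assumptions for illustration):
<?php
// runs on the file server: list every file in /files and emit a thumbnail tag
$dir = "/files/";
$fileList = glob($dir . "*.*");
foreach ($fileList as $file) {
    $name = basename($file);
    echo '<img src="/files/' . htmlspecialchars($name) . '" width="100"><br>';
}
?>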

Trying to echo contents of multiple text files while sorting the output by the file name - PHP

I'm not a developer, but I'm the default developer at work now. : ) Over the last few weeks I've found a lot of my answers here and at other sites, but this latest problem has me confused beyond belief. I KNOW it's a simple answer, but I'm not asking Google the right questions.
First... I have to use text files, as I don't have access to a database (things are locked down TIGHT where I work).
Anyway, I need to look into a directory for text files stored there, open each file and display a small amount of text, while making sure the text I display is sorted by the file name.
I'm CLOSE, I know it... I finally managed to figure out sorting, and I know how to read into a directory and display the contents of the files, but I'm having a heck of a time merging those two concepts together.
Can anyone provide a bit of help? With the script as it is now, I echo the sorted file names with no problem. My line of code that I thought would read the contents of a file and then display it is only echoing the line breaks, but not the contents of the files. This is the code I've got so far - it's just test code so I can get the functionality working.
<?php
$dirFiles = array();
if ($handle = opendir('./event-titles')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            $dirFiles[] = $file;
        }
    }
    closedir($handle);
}
sort($dirFiles);
foreach ($dirFiles as $file)
{
    $fileContents = file_get_contents($file); //////// This is what's not working
    echo $file."<br>".$fileContents."<br/><br/>";
}
?>
Help? : )
Dave
$files = scandir('./event-titles') will return an array of filenames in filename-sorted order. You can then do
foreach ($files as $file)
{
    $fileContents = file_get_contents('./event-titles/'.$file);
    echo $file."<br/>".$fileContents."<br/><br/>";
}
Note that I use the directory name in the file_get_contents call, as the filename by itself will cause file_get_contents to look in the current directory, not the directory you were specifying in scandir.
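One caveat: scandir() also returns the . and .. directory entries, so it is worth dropping them before the loop, for example:
$files = array_diff(scandir('./event-titles'), array('.', '..'));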
