I have a subdomain sub.domain.com. The subdomain points to the directory root/sub of my root directory on my web server.
Now I have PDFs in another directory on the server, root/pdf.
How can I check whether a specific PDF exists, and if it does, copy the file to a temp directory of the subdomain?
If I call a PHP script sub/check.php and try to check a PDF that exists:
$filename = "http://www.domain.com/pdf/1.pdf";
if (file_exists($filename))
{
    echo "exists";
}
else
{
    echo "not exists";
}
It always shows: not exists.
If I take the URL and put it in a browser, the PDF is shown.
How can my PHP script in the /sub folder access files in the root or in root/pdf?
bye jogi
The file_exists() function does not work that way; it does not take remote URLs.
It checks whether a file exists on the local file system.
Check the manual here.
Use cURL to accomplish this:
<?php
$ch = curl_init("https://www.google.co.in/images/srpr/logo4w.png"); // pass your PDF URL here
curl_setopt($ch, CURLOPT_NOBODY, true);           // HEAD request: we only need the status code
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_exec($ch);
$retcode = curl_getinfo($ch, CURLINFO_HTTP_CODE); // HTTP status code of the response
curl_close($ch);

if ($retcode == 200)
{
    echo "exists";
}
else
{
    echo "not exists";
}
?>
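If allow_url_fopen is enabled on your server, a lighter-weight alternative sketch of the same status-code check uses get_headers() instead of cURL (the URL is the one from the question):
<?php
// Sketch, assuming allow_url_fopen is enabled: get_headers() fetches the response
// headers; index 0 is the HTTP status line. Note this issues a GET, so the file
// body is downloaded; for a header-only check keep the cURL HEAD version above.
$url = "http://www.domain.com/pdf/1.pdf"; // URL from the question
$headers = @get_headers($url);

if ($headers !== false && strpos($headers[0], '200') !== false) {
    echo "exists";
} else {
    echo "not exists";
}
?>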
file_exists() checks locally on the machine whether a file exists, but what you are passing it is a URL.
Since your script is in the root/sub folder and the PDFs are in root/pdf, you need to change
$filename = "http://www.domain.com/pdf/1.pdf";
into
$filename = dirname(__DIR__) . "/pdf/1.pdf"; // dirname(__DIR__) is the parent of /sub, i.e. your web root
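If all you need is the local check plus the copy into the subdomain's temp directory, here is a minimal sketch, assuming the layout described in the question (check.php in root/sub, PDFs in root/pdf) and a temp folder at root/sub/temp, which is an assumption:
<?php
// Sketch only: paths assume check.php lives in root/sub, PDFs live in root/pdf,
// and the subdomain's temp dir is root/sub/temp (an assumption).
$pdfDir  = dirname(__DIR__) . '/pdf';   // root/pdf
$tempDir = __DIR__ . '/temp';           // root/sub/temp
$file    = '1.pdf';

$source = $pdfDir . '/' . $file;

if (file_exists($source)) {
    if (!is_dir($tempDir)) {
        mkdir($tempDir, 0755, true);    // create the temp dir on first use
    }
    copy($source, $tempDir . '/' . $file);
    echo "exists";
} else {
    echo "not exists";
}
?>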
My link (URL) is different and does not work with the usual methods,
I think because the site loads via JS or ASPX.
You can test my link (URL) in your browser and see the download start,
but it does not work in PHP.
I have tested all the methods (fetch, curl, file, get, put), but it does not work.
I have a similar URL here: 'http://www.tsetmc.com/tsev2/data/ClientTypeAll.aspx?h=0&r=0'
I can open it in the browser and it downloads a CSV file. I need to do this in PHP and save the CSV file on the server.
Here is what I have tried so far:
<?php
// Attempt 1: fopen() also needs a mode argument, which was missing
$file = fopen('http://www.tsetmc.com/tsev2/data/ClientTypeAll.aspx?h=0&r=0', 'r');
// Attempt 2: read the whole response and write it to a local CSV file
$file = file_get_contents('http://www.tsetmc.com/tsev2/data/ClientTypeAll.aspx?h=0&r=0');
file_put_contents('ClientTypeAll.csv', $file);
?>
I do not want just the contents; I want the CSV file from my link.
If you test my link in your browser, the download starts on your PC.
I run this code with a remote PDF file:
<?php
$url = 'https://example.com/file.pdf';
$dir_name = 'storage-x'; // directory to save the file into

if (!file_exists($dir_name)) {
    mkdir($dir_name, 0777, true);
}

$path = $dir_name . '/' . rand() . '.pdf'; // random name to avoid collisions
$file = file_get_contents($url);
file_put_contents($path, $file);
?>
Please follow the steps below:
Get the file from the URL.
Set the directory, check whether it already exists, and update the directory access permissions if needed.
Set the new file path, name, and directory.
Save the file.
Please check the example below, which works for me.
<?php
$file = file_get_contents('https://example.com/file.pdf');

$dirName = 'storage-pdf';
if (!file_exists($dirName)) {
    mkdir($dirName, 0777, true);
}

$newFilePath = $dirName . '/' . rand() . '.pdf';
file_put_contents($newFilePath, $file);
?>
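If file_get_contents() keeps failing for that aspx URL, one common cause of a download that works in a browser but not in PHP is missing request headers. Here is a hedged cURL sketch with a browser-like User-Agent; whether this particular site actually requires it is an assumption:
<?php
// Sketch only: the User-Agent requirement is an assumption, not confirmed for this site.
$url = 'http://www.tsetmc.com/tsev2/data/ClientTypeAll.aspx?h=0&r=0';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);     // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);     // follow redirects, if any
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0'); // pretend to be a browser
$data = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($data !== false && $code == 200) {
    file_put_contents('ClientTypeAll.csv', $data);  // save the CSV next to the script
    echo "saved";
} else {
    echo "download failed (HTTP $code)";
}
?>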
I am creating a directory in PHP with mkdir. It returns true, but when I SSH into my server I cannot find the directory at the specified path.
I have checked different locations on the server.
if (!file_exists('/tmp/tmpfileeee')) {
    mkdir('/tmp/tmpfileeee', 0755);
    echo 'created';
}
Does /tmp exist where this is executing? Is tmpfileeee a file or a directory you are trying to create? If /tmp does not exist and neither does tmpfileeee, I believe you are trying to make two directories without the recursive parameter in the call.
My PHP is definitely rusty, so maybe someone else can answer better, but those were just my initial thoughts looking at it.
Just try it like this:
if (!file_exists('tmp/tmpfileeee') AND !is_dir('tmp/tmpfileeee')) {
    mkdir('tmp/tmpfileeee', 0755, true);
    echo 'created';
}
mkdir creates a folder, not a file.
If you want to create a file:
if (!file_exists('tmp/tmpfileeee') AND !is_file('tmp/tmpfileeee')) {
    $fp = fopen('tmp/tmpfileeee', 'w');
    fclose($fp);
    echo 'created';
}
Or, the best way:
// 1. Check the folder and create it if it does not exist
if (!file_exists('tmp') AND !is_dir('tmp')) {
    mkdir('tmp', 0755, true);
    echo 'folder created';
}
// 2. Check the file and create it if it does not exist
if (!file_exists('tmp/tmpfileeee') AND !is_file('tmp/tmpfileeee')) {
    $fp = fopen('tmp/tmpfileeee', 'w');
    fclose($fp);
    echo 'file created';
}
UPDATE
On some servers, the tmp and temp folders are restricted.
Check for open_basedir.
PHP manual states:
If the directory specified here is not writable, PHP falls back to the system default temporary directory. If open_basedir is on, then the system default directory must be allowed for an upload to succeed.
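To see where PHP is actually allowed to write on such a server, here is a small diagnostic sketch; which directories your host restricts is something you have to verify yourself:
<?php
// Print the system temp dir PHP falls back to, and any open_basedir restriction.
echo 'sys_get_temp_dir(): ' . sys_get_temp_dir() . PHP_EOL;
echo 'open_basedir: ' . (ini_get('open_basedir') ?: '(not set)') . PHP_EOL;

// Try creating the directory inside the system temp dir instead of a hard-coded /tmp path.
$dir = sys_get_temp_dir() . '/tmpfileeee';
if (!is_dir($dir) && mkdir($dir, 0755, true)) {
    echo 'created ' . $dir . PHP_EOL;
} else {
    echo 'could not create (or it already exists): ' . $dir . PHP_EOL;
}
?>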
EDIT: I'm pretty sure the issue has to do with the firewall, which I can't access. Marking Canis' answer as correct and I will figure something else out, possibly wget or just manually scraping the files and hoping no major updates are needed.
EDIT: Here's the latest version of the builder and here's the output. The build directory has the proper structure and most of the files, but only their name and extension - no data inside them.
I am coding a PHP script that searches the local directory for files, then scrapes my localhost (XAMPP) for the same files to copy into a build folder (the goal is to build the PHP site on localhost and then put it on a server as HTML).
Unfortunately I am getting the error: Warning: copy(https:\\localhost\intranet\builder.php): failed to open stream: No such file or directory in C:\xampp\htdocs\intranet\builder.php on line 73.
That's one example; every file in the local directory spits back the same error. The source addresses are correct (I can reach the file on localhost from the address in the error log) and the local directory is properly constructed; it's just that moving the files into it doesn't work. The full code is here; the most relevant section is:
// output build files
foreach ($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        copy($source, $dest);
        echo "Copy $source to $dest. <br>";
    }
}
You are trying to use URLs to traverse local filesystem directories. URLs are only for the web server to understand web requests.
You will have more luck if you change this:
copy(https:\\localhost\intranet\builder.php)
to this:
copy(C:\xampp\htdocs\intranet\builder.php)
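Applied to the builder loop, that means mapping the localhost URL prefix back to the XAMPP document root before calling copy(). A sketch follows; the C:\xampp\htdocs path comes from the error message, while the destination path is a hypothetical example:
<?php
// Sketch: translate the web URL into the matching local filesystem path under
// the XAMPP docroot (docroot taken from the error message; adjust as needed).
$source = 'https://localhost/intranet/builder.php';
$dest   = 'C:/xampp/htdocs/intranet/build/builder.php'; // hypothetical destination in the build folder

$localSource = str_replace('https://localhost/', 'C:/xampp/htdocs/', $source); // forward slashes work on Windows too

if (!is_dir(dirname($dest))) {
    mkdir(dirname($dest), 0755, true);
}
copy($localSource, $dest);
?>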
EDIT
Based on your additional info in the comments, I understand that you need to generate static HTML files for hosting on a static-only web server. This is not really an issue of copying files; it is about accessing the HTML that the script generates when run through a web server.
You can do this in a few different ways. I'm not sure exactly how the generator script works, but it seems like it is trying to copy the supposed output of loads of PHP files.
To get the generated content from a PHP file you can either use the command-line php command to execute the script, like so: c:\some\path>php some_php_file.php > my_html_file.html, or use the power of the web server to do it for you:
<?php
$hosted = "https://localhost/intranet/"; // <--- UPDATED

foreach ($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $path = str_replace("\\", "/", $path); // <--- ADDED
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        $content = file_get_contents($source); // fetch the rendered output through the web server
        file_put_contents(str_replace(".php", ".html", $dest), $content);
        echo "Copy $source to $dest. <br>";
    }
}
In the code above I use file_get_contents() to read the HTML from the https://... URL, which in this case, unlike copy(), calls up the web server and triggers the PHP engine to produce the output.
Then I write the resulting HTML to a file in the $dest folder, replacing .php with .html in the filename.
EDIT
Added and revised the code a bit above.
I am able to save images from a website using curl like so:
//$fullpath = "/images/".basename($img);
$fullpath = basename($img);
$ch = curl_init($img);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
$rawData = curl_exec($ch);
curl_close($ch);
if (file_exists($fullpath)) {
    unlink($fullpath);
}
$fp = fopen($fullpath, 'w+');
fwrite($fp, $rawData);
fclose($fp);
However, this only saves the image in the same folder as the PHP file that executes the save function. I'd like to save the images to a specific folder. I've tried using $fullpath = "/images/".basename($img); (the commented-out first line of my function), but this results in an error:
failed to open stream: No such file or directory
So my question is: how can I save the file to a specific folder in my project?
Another question I have is: how can I change the filename of the image I save to my folder? For example, I'd like to add the prefix siteimg_ to the image's filename. How do I implement this?
Update: I have managed to solve the first problem with the path after playing around with the code a bit more. Instead of using $fullpath = "/images/".basename($img), I added a variable right before fopen and prepended it in the fopen call like so:
$path = "./images/";
$fp = fopen($path.$fullpath, 'w+');
Strangely, that worked. So now I'm down to one problem, which is renaming the file. Any suggestions?
File paths in PHP are server paths. I doubt you have a /images folder on your server.
Try constructing the path from the current PHP file's directory, e.g. assuming there is an images folder in the same directory as your PHP script...
$path = __DIR__ . '/images/' . basename($img);
Also, why don't you try this much simpler script:
$dest = __DIR__ . '/images/' . basename($img);
copy($img, $dest);
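For the siteimg_ prefix asked about in the question, just prepend it when building the destination path. A sketch; the source URL below is a placeholder:
<?php
// Sketch: save the remote image into ./images/ with a "siteimg_" prefix on the filename.
$img  = 'https://example.com/photo.jpg'; // placeholder source URL
$dest = __DIR__ . '/images/siteimg_' . basename($img);

if (!is_dir(__DIR__ . '/images')) {
    mkdir(__DIR__ . '/images', 0755, true);
}
copy($img, $dest); // copy() accepts URLs when allow_url_fopen is enabled
?>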
What is the best way, using PHP and cURL, to POST an entire folder to another server?
You can:
post all the files in the directory one after another, or
zip the directory and post the archive (see the sketch after the code below).
$srcdir = '/source/directory/';
$dh = opendir($srcdir);

$c = curl_init();
curl_setopt($c, ....); // set the necessary curl options to specify the target URL, etc.

while ($file = readdir($dh)) {
    if (!is_file($srcdir . $file)) {
        continue; // skip non-files, like directories
    }
    // CURLFile is the supported way to attach a file upload
    // (the old "file=@/path" string syntax no longer works)
    curl_setopt($c, CURLOPT_POSTFIELDS, array('file' => new CURLFile($srcdir . $file)));
    curl_exec($c);
}
closedir($dh);
That'd be the basics. You'd want some error handling in there to make sure the source file is readable, the upload succeeds, etc. The full set of CURLOPT constants is documented here.
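For the second option listed above (zip the directory and post the archive), here is a sketch using ZipArchive and CURLFile; the target URL and the 'archive' field name are placeholders:
<?php
// Sketch: zip a directory and POST the archive in one request.
$srcdir  = '/source/directory/';
$zipPath = sys_get_temp_dir() . '/upload.zip';
$target  = 'https://example.com/upload.php'; // placeholder target URL

$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach (scandir($srcdir) as $file) {
    if (is_file($srcdir . $file)) {
        $zip->addFile($srcdir . $file, $file); // store under its bare name inside the archive
    }
}
$zip->close();

$c = curl_init($target);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
curl_setopt($c, CURLOPT_POSTFIELDS, array('archive' => new CURLFile($zipPath))); // 'archive' is a placeholder field name
curl_exec($c);
curl_close($c);
?>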