I have a folder directory set up like this in my htdocs:
claas2/TractorPics\5484474\Received\ => and then a bunch of images inside
I'm using PHP to display the pictures; the MySQL database stores the file path rather than the pictures themselves, so they can be changed easily.
PHP:
if ( $_REQUEST['rec_pic'] ) {
    $order_id = $_POST['rec_pic'];
    $sql = "SELECT * FROM `orders` WHERE `active` = '1' AND `order_id` = '$order_id'";
    $result = mysql_query($sql);
    while ($row = mysql_fetch_array($result)) {
        $dir = $row['rec_pic'];
        if ($handle = opendir($dir)) {
            while (false !== ($file = readdir($handle))) {
                echo "<div>";
                echo "<a href='#'><img src='".$file."' /></a>";
                echo "</div>";
            }
            closedir($handle);
        }
    }
}
When I echo this out I get an error saying that the directory does not exist, but if I use the same path as an image src it works just fine. What am I doing wrong here?
The PHP file that receives the request is in the directory:
claas2\db\ajax
Pretty messy path you have:
claas2/TractorPics\5484474\Received\
Try:
claas2\TractorPics\5484474\Received\
or
claas2/TractorPics/5484474/Recieved/
but do not mix separators.
EDIT
You are mixing a filesystem path with a URL. It works as an image src because you are using \ as the path-segment separator and the browser converts it to / when sending the request to the web server; but your server is apparently not Windows-based, I guess, so \ is not a valid path separator for the file system on the server. Do
$path = str_replace('\\', '/', $pathFromDb);
and use the result instead of the raw value from the database.
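A minimal sketch of that fix applied to the question's loop; the $_SERVER['DOCUMENT_ROOT'] prefix and the exact rec_pic value are assumptions, so adjust them to your layout:

$dir = str_replace('\\', '/', $row['rec_pic']);      // normalise separators, e.g. "claas2/TractorPics/5484474/Received/"
$dir = rtrim($dir, '/') . '/';                       // make sure it ends in exactly one slash
$fsDir = $_SERVER['DOCUMENT_ROOT'] . '/' . $dir;     // filesystem path for opendir() (assumes htdocs is the document root)
if ($handle = opendir($fsDir)) {
    while (false !== ($file = readdir($handle))) {
        if ($file === '.' || $file === '..') {
            continue;
        }
        // use the web path (directory included) for the <img>, not just the bare filename
        echo "<div><a href='#'><img src='/" . $dir . $file . "' /></a></div>";
    }
    closedir($handle);
}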
Related
I want to open a server-stored HTML report file on a client machine.
I want to bring back a list of all the saved reports in that folder (scandir).
This way the user can click on any of the created reports to open them.
So if you click on a report to open it, you need the location the report can be opened from.
This is my dilemma: I'm not sure how to get a decent IP, port and folder location that the client can understand.
Below is what I've been experimenting with.
Using this won't work, obviously:
$path = $_SERVER['DOCUMENT_ROOT']."/reports/saved_reports/";
So I thought I might try this instead.
$host= gethostname();
$ip = gethostbyname($host);
$ip = $ip.':'.$_SERVER['SERVER_PORT'];
$path = $ip."/reports/saved_reports/";
$files = scandir($path);
After the above code I loop through each file and generate an array with the name, date created and path. This is sent back to generate a list of reports in a table that the user can interact with (open, delete, edit).
But this fails as well.
So I'm officially clueless about how to approach this.
PS: I'm adding react.js as a tag, because that is my front-end and might be useful to know.
Your question may be partially answered here: https://stackoverflow.com/a/11970479/2781096
Get the file names from the specified path and then hit cURL or the get_text() function again to save the files.
function get_text($filename) {
    $content = '';                          // initialise so the .= below has something to append to
    $fp_load = fopen("$filename", "rb");
    if ( $fp_load ) {
        while ( !feof($fp_load) ) {
            $content .= fgets($fp_load, 8192);
        }
        fclose($fp_load);
        return $content;
    }
}
$matches = array();
// This will give you names of all the files available on the specified path.
preg_match_all("/(a href\=\")([^\?\"]*)(\")/i", get_text($ip."/reports/saved_reports/"), $matches);
foreach($matches[2] as $match) {
    echo $match . '<br>';
    // Again hit a cURL to download each of the reports.
}
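Following the comment in that loop, a hedged sketch of the download step with cURL; the local downloaded_reports/ directory is an assumption, and the hrefs are assumed to be relative to /reports/saved_reports/:

foreach ($matches[2] as $match) {
    // fetch each report over HTTP
    $ch = curl_init('http://' . $ip . '/reports/saved_reports/' . rawurlencode(basename($match)));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $report = curl_exec($ch);
    curl_close($ch);
    if ($report !== false) {
        // save a local copy next to this script
        file_put_contents(__DIR__ . '/downloaded_reports/' . basename($match), $report);
    }
}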
Get list of reports:
<?php
$path = $_SERVER['DOCUMENT_ROOT']."/reports/saved_reports/";
$files = scandir($path);
foreach($files as $file){
    if($file !== '.' && $file != '..'){
        echo "<a href='show-report.php?name=".$file. "'>$file</a><br/>";
    }
}
?>
and write a second PHP file for showing the HTML reports, which receives the file name as a GET parameter and echoes the content of the given report.
show-report.php
<?php
$path = $_SERVER['DOCUMENT_ROOT']."/reports/saved_reports/";
if(isset($_GET['name'])){
    // basename() stops a crafted name like "../../config.php" from escaping the reports directory
    $name = basename($_GET['name']);
    echo file_get_contents($path.$name);
}
Hi, I have created a script to download images from a URL and rename them. The links and names are stored in a database. The database looks like this:
id name url
1 abcd http://www.abcd.com/a.jpeg
I am running the following script to download and rename the images:
$sql = "SELECT * FROM data";
$results = $gtl->query($sql);
while($row = $results->fetch_assoc()) {
$n = $row['name'];
$url = $row['url'];
$name = $n.'.jpeg';
$path = "images/";
if ($url != NULL) {
$get_image = file_get_contents($url);
if ($http_response_header != NULL) {
$get_file = $path . $name;
file_put_contents($get_file, $get_image);
}
}
The above code works well but takes a lot of time; I have tried using cURL but it has similar speed. Since there are over 300 images that need to be downloaded, it would be great if anyone could suggest a way to speed up the process. Any help is highly appreciated.
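For reference, one common way to cut the wall-clock time is to fetch several images in parallel with curl_multi instead of one at a time. A minimal sketch, reusing the table and column names and the images/ directory from the question; with 300 rows you may want to process them in smaller batches rather than opening all the handles at once:

$mh = curl_multi_init();
$handles = array();
$results = $gtl->query("SELECT name, url FROM data WHERE url IS NOT NULL");
while ($row = $results->fetch_assoc()) {
    $ch = curl_init($row['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$row['name'] . '.jpeg'] = $ch;
}
// run all transfers concurrently
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running && $status === CURLM_OK);
// write out whatever came back successfully
foreach ($handles as $name => $ch) {
    if (curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200) {
        file_put_contents('images/' . $name, curl_multi_getcontent($ch));
    }
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);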
I'm trying to copy multiple files from one domain on a web server to another using copy() and looping through a list of files, but it's only copying the last file on the list.
Here is the contents of files-list.txt:
/templates/template.php
/admin/admin.css
/admin/codeSnippets.php
/admin/editPage.php
/admin/index.php
/admin/functions.php
/admin/style.php
/admin/editPost.php
/admin/createPage.php
/admin/createPost.php
/admin/configuration.php
This script runs on the website that I'm trying to copy the files to. Here's the script:
$filesList = file_get_contents("http://copyfromhere.com/copythesefiles/files-list.txt");
$filesArray = explode("\n", $filesList);
foreach($filesArray as $file) {
    $filename = trim('http://copyfromhere.com/copythesefiles' . $file);
    $dest = "destFolder" . $file;
    if(!@copy($filename, $dest))
    {
        $errors = error_get_last();
        echo "COPY ERROR: ".$errors['type'];
        echo "<br />\n".$errors['message'];
    } else {
        echo "$filename copied to $dest from remote!<br/>";
    }
}
I get the affirmative message for each and every file individually just as I should, but when I check the directory, only the last file from files-list.txt is there. I've tried changing the order, so I know the problem lies with the script, not any individual file.
The output from the echo statements looks something like this:
http://copyfromhere.com/copythesefiles/admin/admin.css copied to updates/admin/editPage.php from remote!
http://copyfromhere.com/copythesefiles/admin/admin.css copied to updates/admin/editPost.php from remote!
http://copyfromhere.com/copythesefiles/admin/admin.css copied to updates/admin/index.php from remote!
Etc
I've modified your code slightly, and tested it on my local dev server. The following seems to work:
$fileURL = 'http://copyfromhere.com/copythesefiles';
$filesArray = file("$fileURL/files-list.txt", FILE_IGNORE_NEW_LINES);
foreach ($filesArray as $file) {
    $fileName = "$fileURL/$file";
    $dest = str_replace($fileURL, 'destFolder', $fileName);
    if (!copy($fileName, $dest)) {
        $errors = error_get_last();
        echo "COPY ERROR: ".$errors['type'];
        echo "<br />\n".$errors['message'];
    }
    else {
        echo "$fileName copied to $dest from remote!<br/>";
    }
}
This uses the same fix that Mark B pointed out, but also consolidated the code a little.
Unless the data you're fetching from that remote site has a leading / in the path/filename, you're not generating proper paths:
$file = 'foo.txt'; // example only
$dest = "destFolder" . $file;
produces destFolderfoo.txt, and you end up littering your script's working directory with a bunch of wonky filenames. Perhaps you wanted
$dest = 'destFolder/' . $file;
^----note this
instead.
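As a side note, copy() will not create missing destination directories, so a nested target such as destFolder/admin/admin.css fails unless destFolder/admin already exists. A minimal sketch of guarding against that before the copy() call:

$dest = 'destFolder' . $file;        // $file starts with "/", e.g. "/admin/admin.css"
$destDir = dirname($dest);
if (!is_dir($destDir)) {
    mkdir($destDir, 0755, true);     // the third argument creates the directories recursively
}
copy($filename, $dest);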
I want to upload 1000 images in just one click via URL. I have 1000 image URLs stored in a MySQL database.
So please, can anyone give me PHP code to upload those 1000 images via URL from the MySQL database?
Currently I am using the below code:
It uploads one image per click by posting the URL of the image...
But I want to upload 1000 images in one click by getting the URLs from the database.
$result = mysql_query("SELECT * FROM thumb") or die(mysql_error());
// keeps getting the next row until there are no more to get
while($row = mysql_fetch_array( $result )) {
    echo "<div>";
    $oid = $row['tid'];
    $th = $row['q'];
    echo "</div>";
    $thi = $th;
    $get_url = $post["url"];
    $url = trim('$get_url');
    if($url){
        $file = fopen($url,"rb");
        $directory = "thumbnail/";
        $valid_exts = array("php","jpeg","gif","png","doc","docx","jpg","html","asp","xml","JPEG","bmp");
        $ext = end(explode(".",strtolower(basename($url))));
        if(in_array($ext,$valid_exts)){
            $filename = "$oid.$ext";
            $newfile = fopen($directory . $filename, "wb");
            if($newfile){
                while(!feof($file)){
                    fwrite($newfile,fread($file,1024 * 8),1024 * 8);
                }
                echo 'File uploaded successfully';
                echo '**$$**'.$filename;
            }
            else{
                echo 'File does not exists';
            }
        }
        else{
            echo 'Invalid URL';
        }
    }
    else{
        echo 'Please enter the URL';
    }
}
Thanks a lot.
The code you have is outdated and a lot more complex than needed. This is not a site where you get code just because you ask; this is a learning environment.
I'll give you an example on which you can continue:
// Select the images (those we haven't done yet):
$sItems = mysql_query("SELECT id,url FROM thumb WHERE imported=0") or die(mysql_error());
// Loop through them:
while( $fItems = mysql_fetch_assoc($sItems) ){
    $imgSource = file_get_contents($fItems['url']); // get the image
    // Check if it didn't go wrong:
    if( $imgSource!==false ){
        // Which directory to put the file in:
        $newLocation = $_SERVER['DOCUMENT_ROOT']."/Location/to/dir/";
        // The name of the file:
        $newFilename = basename($fItems['url']);
        // Save on your server:
        file_put_contents($newLocation.$newFilename, $imgSource);
    }
    // Update the row in the DB. If something goes wrong, you don't have to do all of them again:
    mysql_query("UPDATE thumb SET imported=1 WHERE id=".$fItems['id']." LIMIT 1") or die(mysql_error());
}
Relevant functions:
file_get_contents() - Get the content of the image
file_put_contents() - Place the content given in this function in a file specified
basename() - given a URL, it gives you the filename only
Important:
You are using mysql_query. This is deprecated (it should no longer be used); use PDO or mysqli instead (see the sketch after this list)
I suggest you make this work from the command line and add an echo after the update so you can monitor progress
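A hedged mysqli version of the loop above, to show the shape of the change; the connection credentials are placeholders and the table/column names come from the example:

$db = new mysqli('localhost', 'user', 'password', 'database');   // placeholder credentials

$sItems = $db->query("SELECT id, url FROM thumb WHERE imported = 0");
$stmt = $db->prepare("UPDATE thumb SET imported = 1 WHERE id = ? LIMIT 1");
while ($fItems = $sItems->fetch_assoc()) {
    $imgSource = file_get_contents($fItems['url']);
    if ($imgSource !== false) {
        $newLocation = $_SERVER['DOCUMENT_ROOT'] . "/Location/to/dir/";
        file_put_contents($newLocation . basename($fItems['url']), $imgSource);
    }
    $stmt->bind_param('i', $fItems['id']);
    $stmt->execute();
    echo "Imported " . $fItems['url'] . PHP_EOL;   // progress output when run from the command line
}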
I want to download an .xls file and then upload it to an Oracle database, but I am getting the error "the filename .xls is not readable".
Below is my script:
<?php require_once (dirname(__FILE__)."/upload/upBc.php");
if(!empty($_FILES["file"])){
    echo $file = $_FILES["file"];
}else{
    /*$file = "/opt/lampp/htdocs/avar/pp/ccr/test.xls";*/
    $file = '/www.upload.com/pm_ms/test.xls';
    /* $file = "website_data_". date('ymd').".xls";*/
}
$upload = new upload();
$upload->setFile($file);
$upload->getFileType();
$upload->getFileName();
$upload->getFileDir();
$upload->readDataTbc();
$dataD = array();
$dataD = $upload->getDataTbc();
echo "<table border = 1>";
for($i=1;$i < count($dataD);$i++){
    echo "<tr>";
    for($j=0;$j < $upload->getHSize();$j++){
        echo " <td>".$dataD[$i][$j]."</td>";
    }
    echo "</tr>";
}
echo "</table>";
echo "data length=". count($dataD);
$upload->commitDataTbc();
?>
If I use $file = "/opt/lampp/htdocs/avar/pp/ccr/test.xls"; in my program, it works,
but if I use $file = '/www.upload.com/pm_ms/test.xls'; I get the error "THE FILENAME test.xls is not readable".
That's probably because /www.upload.com/ doesn't exist on your server.
Do you know the actual path to the file? Prefixing the URI with / makes it absolute, which means it is trying to read literally from /www.upload.com on your server. If you're looking to make this relative you could try ./upload.com/ or simply upload.com/ as the path prefix.
If, however, you are trying to load it from the website upload.com (instead of a folder on your server), you have two options:
(If supported) Set the filename as a URL, i.e. $file = 'http://www.upload.com/pm_ms/test.xls';
Download the file and then open it locally.
For #2:
$file = 'test.xls';
$ctnts = file_get_contents('http://www.upload.com/pm_ms/test.xls');
file_put_contents($file, $ctnts);
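A small hedged addition: file_get_contents() returns false on failure, so it is worth checking the download before writing the local copy and handing $file to the upload class from the question:

$ctnts = file_get_contents('http://www.upload.com/pm_ms/test.xls');
if ($ctnts === false) {
    die('Could not download test.xls');   // bail out instead of feeding an empty file to the upload class
}
file_put_contents($file, $ctnts);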