Create a folder on the local computer from an online system in PHP

I want to save a file from the online system into a specific folder, e.g. C:/myFolder/. If there is no myFolder in C:/, the system should automatically detect that and create the folder on the C drive, then save the file into it. When I run the system locally, it can create the folder and the file on the C drive.
But when I uploaded my code to the server, the folder was created on the server, not on the local computer. Can anyone help me solve this? How do I create a folder on the local computer and save a file into it from an online system?
Below is the code that works when the system runs locally:
$path_name = 'C:/sales/';
$directory = $path_name . $filename . '.txt';

// Create the target folder if it does not exist yet
if (!is_dir($path_name)) {
    mkdir($path_name);
}

if (mysql_num_rows($query)) {
    $fp = fopen($directory, 'w');
    if ($fp) {
        for ($i = 0; $i < mysql_num_rows($query); $i++) {
            $f = mysql_fetch_array($query);
            $orderFee_q = mysql_query("select * from sales_order where status in ('waiting', 'waiting1', 'waiting2', 'approved') and outstanding = 'N' order by so_no desc");
            $get_orderFee = mysql_fetch_array($orderFee_q);
            $line = $f["id"] . "\t" . $f["so_no"];
            if (trim($line) != '') {
                fputs($fp, $line . "\r\n");
            }
        }
        fclose($fp); // close only if the file was opened successfully
    }
}

// Check and serve the file that was just written (the full path, not the bare name)
$download_name = basename($directory);
if (file_exists($directory)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/force-download');
    header('Content-Transfer-Encoding: Binary');
    header("Content-Disposition: attachment; filename=" . $download_name);
    header('X-SendFile: ' . $directory);
    header('Pragma: no-cache');
    header('Expires: 0');
    readfile($directory);
}

Simply put, you can't do that in PHP or any other server-side language.
Edit:
The reason is simple: server-side applications and scripts have access only to the local resources of the machine they run on. So when you run your application on your local computer, everything works as you wish. But because of how HTTP works, and for safety reasons, you cannot access the user's local files.

You could create a desktop application that interacts with your server, which would allow you to do something like this.

You cannot tell the user where to save the file you are sending, not even with JavaScript. Server-side languages can manipulate files on the server, not on the client's machine. It is not possible to access the client's files without the client's permission. Possibly this could be done by an ActiveX component, but the client would have to agree to and accept the action; otherwise, by now your computer would have more viruses than files. The browser deliberately provides a protected environment for safe browsing. You have already set the name of the file you are sending ... so that's the maximum you can do server-side.
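To make that last point concrete, here is a minimal sketch of the most the server can do: suggest a download filename via the Content-Disposition header and let the browser (and the user) decide where the file actually ends up. The $path value is a hypothetical server-side file, not taken from the question:

<?php
// Sketch: $path is a hypothetical server-side file. The server can only
// *suggest* a filename; where the browser saves the download is entirely
// up to the client.
$path = '/var/data/sales.txt';
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;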

Related

Bypass server: PHP code to download a file from another server

My situation is as follows (the original question included a figure):
I have a file X on the main server A, which I want to download to my local computer B; the file X is downloadable over HTTP. But for some reason I am not allowed to download files from the main server A directly. However, I have access to another server C which has PHP installed.
I now want to download the file X via server C, by calling some PHP script on server C from my local computer B.
Is it possible to write one PHP script to do the above?
Any help in writing it will be highly appreciated.
I'm not completely sure what you need, but you can use the following script to act as a proxy between the two servers.
PUT THIS FILE ON SERVER C
phpProxy.php
<?php
$myPass = "Secr3t";
if ($myPass == $_GET['pass']) {
    $remoteFile = $_GET['rf'];
    $filename = basename($remoteFile);
    // Fetch the remote file and relay it to the client as a download
    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"$filename\"");
    echo file_get_contents($remoteFile);
}
USE AS:
phpProxy.php?rf=http://phs.googlecode.com/files/Download%20File%20Test.zip&pass=Secr3t
NOTES:
1 - I've added a password, otherwise the script is very unsafe!
2 - If possible, use HTTPS to avoid MITM attacks.
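3 - Note that file_get_contents() buffers the entire remote file in memory before echoing it. For large files, a streaming variant is safer; here is a minimal sketch under the same assumed pass/rf parameters:

<?php
// Streaming sketch: relays the remote file in chunks instead of buffering
// it all in memory. Uses the same hypothetical 'pass'/'rf' parameters.
$myPass = "Secr3t";
if ($myPass == $_GET['pass']) {
    $remoteFile = $_GET['rf'];
    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"" . basename($remoteFile) . "\"");
    $in  = fopen($remoteFile, 'rb');    // requires allow_url_fopen
    $out = fopen('php://output', 'wb'); // the response body
    if ($in && $out) {
        stream_copy_to_stream($in, $out);
        fclose($in);
        fclose($out);
    }
}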

PHP How do I host a large file properly?

I'm currently using the following PHP function to allow a user to select a file and then download it. This happens over FTP. However, if the user chooses a large file then, while the download is occurring, the server is locked up for any other requests. Is there any way I can host the file while having PHP continue to respond to requests?
I need PHP to verify that the user is permitted to download the file with their credentials, so I can't just host it as a static asset. The file is located on an FTP server.
function download($file) {
    if ($this->get($file)) {
        // Send headers so the browser receives the response as a file
        header("Content-disposition: attachment; filename=\"$file\"");
        header("Content-type: application/octet-stream");
        header("Pragma: ");
        header("Cache-Control: no-cache");
        header("Expires: 0");
        // readfile() streams the file straight to the output buffer and
        // returns the number of bytes read, so no manual copy loop is needed
        readfile($this->downloadDir . $file);
        unlink($this->downloadDir . $file);
        exit;
    } else {
        return false;
    }
}
PHP is not the best tool for this kind of work, but it can delegate the job to the web server you are using. And since the file is in the same place as your application, this can work.
All major web servers that usually run PHP applications (Apache, lighttpd and nginx) have support for X-Sendfile.
To use it, you have to first enable the functionality in your web server (check the links above for each of the web servers), then in your script add a new header:
Apache:
header("X-Sendfile: $location_of_file_to_download");
Lighttpd:
header("X-LIGHTTPD-send-file: $location_of_file_to_download");
nginx:
header("X-Accel-Redirect: $location_of_file_to_download");
The web server will catch this header from your application and will replace the body of your PHP response with the file. And while it serves this file to the user, PHP is unblocked and ready to serve a new request.
(The other headers will be kept, so you can retain the Content-Type and Content-Disposition headers.)
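Put together, a permission-checked download could look roughly like this (a sketch assuming Apache with mod_xsendfile enabled; userMayDownload() is a hypothetical stand-in for your credential check):

<?php
// Sketch: assumes Apache + mod_xsendfile; userMayDownload() is hypothetical.
session_start();
if (!userMayDownload($_SESSION['user'], $_GET['file'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$path = '/srv/downloads/' . basename($_GET['file']); // basename() blocks ../ tricks
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
// Hand the transfer off to the web server; the PHP worker is freed immediately
header('X-Sendfile: ' . $path);
exit;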
Since a PHP script is single-threaded, you would have to build a structure for each request. Then, instead of processing one request at a time, you would loop through the structures and process all of them concurrently in small steps (send a few hundred KB to one, then move on to the next, and so on).
Honestly, PHP doesn't sound like the right tool for this job. Why not use a purpose-built FTP server like vsftpd or something of that nature?

PHP on IIS 7.5 - Large file download blocks connection until download is complete

I want to allow users to download large video files. These files are outside of the public folder for security reasons.
I'm using a combination of fopen(), feof(), and fread() to send the file in chunks.
The download works fine. The video is downloaded and also works just fine. The problem is during the download. Any user who's downloading the file can't continue browsing the site until the file is downloaded. The browser is trying to establish a connection, but it hangs while the file is downloading. When the download is done, the connection is immediately established. Other users can browse the site just fine during the download, so it's not like the whole server hangs or whatever.
I'm working with PHP (CakePHP) installed on an IIS server.
A snippet of code:
$name = "filename.mp4";
$folder = "private/folder/";
$handle = fopen($folder.$name, "rb");
if(!$handle)
{
echo "File not found";
}
else
{
header("Content-length:".filesize($folder.$name));
header("Content-Type: video/mp4");
header("Content-Disposition: attachment; filename='filename.mp4'");
header("Content-Transfer-Encoding: binary");
session_write_close(); // this is the solution
while(!feof($handle))
{
$buffer = fread($handle, 1*(1024*1024));
echo $buffer;
ob_flush();
flush();
}
}
I finally solved the problem. As suggested above, it was indeed related to sessions. Even though session.auto_start was off, CakePHP itself was handling sessions at that point. So, by inserting session_write_close() right before the while loop, the problem was solved.
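The underlying issue is that PHP's default file-based session handler locks the session file for the whole request, so a second request from the same browser blocks until the first one releases the lock. Stripped of the CakePHP specifics, the general pattern is sketched below (the logged_in session key is hypothetical):

<?php
session_start();                           // acquires the session lock
$allowed = !empty($_SESSION['logged_in']); // hypothetical auth check
session_write_close();                     // release the lock before the long transfer
if ($allowed) {
    header("Content-Type: video/mp4");
    header('Content-Disposition: attachment; filename="filename.mp4"');
    readfile("private/folder/filename.mp4");
}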

Asset grabber with Zend, big files download as only 229 or 228 bytes

I have the following problem with a server; there are many variables at play. This is what happened: everything was working perfectly when I developed a small webapp with Zend on my Fedora desktop. Then I transferred the app to a server on DreamHost and everything still worked fine.
The problem came with a client that needed a server in China, because they are behind the Great Firewall and wanted to transfer their files much faster. They had huge files, around 3.4 GB. So they gave me a Windows 2003 virtual machine, they couldn't change it to Linux, and that is when everything went into a downward spiral.
Basically my app had a folder outside the document root, where all the files were uploaded via FTP. My app read the files and only allowed logged-in users to download them.
This is my plugin controller:
<?php
class Mapache_Controller_Plugin_AssetGrabber
    extends Zend_Controller_Plugin_Abstract
{
    public function dispatchLoopStartup (Zend_Controller_Request_Abstract $request)
    {
        if ($request->getControllerName() != 'assets')
            return;

        $auth = Zend_Auth::getInstance();
        if (!$auth->hasIdentity())
            throw new Exception('Not authenticated!');

        //$file = APPLICATION_PATH . '/../assets/' . $request->getActionName();
        $file = APPLICATION_PATH . '/..' . $_SERVER['REQUEST_URI'];
        $file = str_replace("_", " ", $file);
        // echo $file;
        if (file_exists($file)) {
            header("Content-type: " . $this->new_mime_content_type($file));
            header('Content-disposition: attachment;');
            //header('Content-type: '.$this->new_mime_content_type($file));
            //readfile('$file');
            echo file_get_contents($file);
        }
    }

    function new_mime_content_type($filename)
    {
        // finfo() returns an object, not a resource, so check accordingly
        $result = new finfo();
        if ($result instanceof finfo) {
            return $result->file($filename, FILEINFO_MIME_TYPE);
        }
        return false;
    }
}
The only thing I changed to make it work on Windows was adding
$file = "d:/" . $_SERVER['REQUEST_URI'];, so it knows to look on the D: drive of the server.
Basically, I have another PHP file that does a scandir(), lists all the files and directories, and creates a link to URL/assets/folder/file. It works fine on my test server, even with big files. But when I try to download a 200 MB zip file from the Windows server, I get a corrupted zip file of 228 or 229 bytes, like just the header.
The server has XAMPP with Zend installed; I am going crazy.
Migrating back to my DreamHost server will take days. I installed an rsync client on the server and started copying the files, but in the last 4 hours it has only copied 400 MB, so my 30 GB of data will take days.
When I log on to the server with rdesktop and try to download the files, I still get 228-byte files instead of 200 MB or even 3.4 GB.
It runs Windows 2003.
Do I have to configure something on the Apache server, something in httpd.conf or php.ini?
I found the answer: I needed to send the file properly:
header('Content-Description: File Transfer');
//header('Content-Type: application/octet-stream');
header('Content-type: ' . $this->new_mime_content_type($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($file));
set_time_limit(0);            // no execution time limit for the long transfer
$filex = @fopen($file, "rb");
while (!feof($filex)) {
    print(@fread($filex, 1024*8)); // send the file in 8 KB chunks
    ob_flush();
    flush();
}
//ob_clean();
//flush();
//readfile($file);
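Reading and flushing in 8 KB chunks keeps PHP's memory use flat no matter how large the file is, which is presumably what let the 3.4 GB downloads through, and set_time_limit(0) stops PHP from aborting a transfer that outlives max_execution_time.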

PHP readfile() and large files

When using readfile() -- using PHP on Apache -- is the file immediately read into Apache's output buffer and the PHP script execution completed, or does the PHP script execution wait until the client finishes downloading the file (or the server times out, whichever happens first)?
The longer back-story:
I have a website with lots of large mp3 files (sermons for a local church). Not all files in the audio archive are allowed to be downloaded, so the /sermon/{filename}.mp3 path is rewritten to actually execute /sermon.php?filename={filename}, and if the file is allowed to be downloaded then the content type is set to "audio/mpeg" and the file is streamed out using readfile(). I've been getting complaints (almost exclusively from iPhone users who are streaming the downloads over 3G) that the files don't fully download, or that they cut off after about 10 or 15 minutes. When I switched from streaming the file out with readfile() to simply redirecting to the file -- header("Location: $file_url"); -- all of the complaints went away (I even checked with a few users who could reliably reproduce the problem on demand previously).
This leads me to suspect that when using readfile() the PHP script engine is in use until the file is fully downloaded, but I cannot find any references which confirm or deny this theory. I'll admit I'm more at home in the ASP.NET world, and the .NET equivalent of readfile() pushes the whole file to the IIS output buffer immediately, so the ASP.NET execution pipeline can complete independently of the delivery of the file to the end client... is there an equivalent to this behavior with PHP+Apache?
You may still have PHP output buffering active while performing the readfile(). Check that with:
if (ob_get_level()) ob_end_clean();
or
while (ob_get_level()) ob_end_clean();
This way the only remaining output buffer should be Apache's output buffer; see SendBufferSize for Apache tweaks.
EDIT
You can also have a look at mod_xsendfile (see the SO post on such usage, PHP + apache + x-sendfile), so that you simply tell the web server you have done the security check and that it can now deliver the file.
A few things you can do (I am not repeating all the headers that you need to send; they are probably the same ones you currently have in your script):
set_time_limit(0); //as already mentioned
readfile($filename);
exit(0);
or
passthru('/bin/cat '.$filename);
exit(0);
or
//When you enable mod_xsendfile in Apache
header("X-Sendfile: $filename");
or
//mainly to use for remote files
$handle = fopen($filename, "rb");
echo stream_get_contents($handle);
fclose($handle);
or
$handle = fopen($filename, "rb");
while (!feof($handle)) {
    //I would suggest doing some checking
    //to see if the user is still downloading or if they closed the connection
    echo fread($handle, 8192);
}
fclose($handle);
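As a sketch of the check that the comment above suggests, PHP's connection_aborted() can be polled after each flushed chunk (paired with ignore_user_abort(true) so the script keeps control when the client disconnects):

$handle = fopen($filename, "rb");
ignore_user_abort(true);        // keep running so we can clean up ourselves
while (!feof($handle)) {
    echo fread($handle, 8192);
    flush();
    if (connection_aborted()) { // client closed the connection
        break;
    }
}
fclose($handle);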
The script will keep running until the user finishes downloading the file. The simplest, most efficient, and most reliable solution is to redirect the user:
header("Location: /real/path/to/file");
exit;
But this may reveal the location of the files. It's a good idea anyway to password-protect with an .htaccess file any files that may not be downloaded by everyone, but perhaps you use a database to determine access and this is not an option.
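For reference, a minimal .htaccess protection for a download folder could look like the sketch below; the AuthUserFile path is a placeholder:

# Sketch: HTTP Basic auth for a download folder.
# The AuthUserFile path is a placeholder for a real .htpasswd file.
AuthType Basic
AuthName "Protected downloads"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user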
Another possible solution is setting the maximum execution time of PHP to 0, which disables the limit:
set_time_limit(0);
Your host may disallow this, though. Also, the file content passes through PHP first, then through Apache's output buffer, and finally makes it to the network. Letting users download the file directly is much more efficient and does not suffer from PHP's limitations, like the maximum execution time.
Edit: The reason you get this complaint a lot from iPhone users is probably that they have a slower connection (e.g. 3G).
Downloading files through PHP isn't very efficient; using a redirect is the way to go. If you don't want to expose the location of the file, or the file isn't in a public location, then look into internal redirects. Here is a post that talks about it a bit: Can I tell Apache to do an internal redirect from PHP?
Try using stream_copy_to_stream() instead. I find it has fewer problems than readfile().
set_time_limit(0);
$stdout = fopen('php://output', 'wb');
$bfname = basename($fname);
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$bfname\"");
$filein = fopen($fname, 'rb'); // binary-safe read
stream_copy_to_stream($filein, $stdout);
fclose($filein);
fclose($stdout);
Under Apache, there is a nice, elegant solution not involving PHP at all:
Just place an .htaccess config file into the folder containing the files to be offered for download with the following contents:
<Files *.*>
ForceType application/octet-stream
</Files>
This tells Apache to offer all files in this folder (and all its subfolders) for download, instead of displaying them directly in the browser.
See the example from the PHP manual's readfile() page:
http://php.net/manual/en/function.readfile.php
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
