I've got some code that generates a word document, as follows:
$word->Documents[1]->SaveAs($localDir . $filename);
Now, I was kinda hoping that I could now open the file once it's saved by doing the following:
$word->Documents->Open($remoteDir . $filename)
// remotedir = 'word/', so for example the above would be '/word/document1.doc'
But it seems to open it on the host machine, and not the user's! Is there any way to open it on the user's machine and not on the server?
Edit: Just for clarity, it will be used exclusively on an intranet by a single user who will be on a Windows machine at all times, with Word etc. installed... just want to try and make her life a little easier!
Thanks
I think you are fundamentally mistaken about what runs where. PHP is a purely server-side language. You cannot use it to open a file on the client's PC so that the user has an opened instance of Word in front of them.
You can maybe achieve that through client-side scripting, namely VBScript or some other Microsoft scripting flavour. Be prepared for massive obstacles and incompatibilities, though, because such things are blocked by default for security reasons in all browsers, and sometimes those blocks cannot be circumvented even with special settings ("Trusted sites") in the client browser.
You may be able to display the document in the user's browser as an embedded HTML object.
The most simple thing really may be generating the file, and offering it to the user as a download. The user can then save it, and open it. Job done.
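For example, a minimal sketch of the download approach, assuming the generated document still sits in $localDir on the server (names are illustrative):
// Stream the generated Word document to the browser as a download.
$fullPath = $localDir . $filename; // e.g. the file saved with SaveAs() above
if (is_file($fullPath)) {
    header('Content-Type: application/msword');
    header('Content-Disposition: attachment; filename="' . basename($fullPath) . '"');
    header('Content-Length: ' . filesize($fullPath));
    readfile($fullPath);
    exit;
}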
/word/document1.doc is the path of a file on the server, not on the client. On Windows, supposing that file sharing is enabled on the client PC, you can use a path such as \\IP\word\document1.doc, where IP is the IP address of the client PC.
You can get the IP of the PC connecting to the server with $_SERVER['REMOTE_ADDR']; $_SERVER['REMOTE_HOST'] is the result of a DNS reverse lookup, which in your case could return the same value as $_SERVER['REMOTE_ADDR'].
PHP probably will not open remote files if it has not been configured to do so (there is a directive for that).
If directly accessing the shared file from the COM object doesn't work, you can copy the file from the client PC to the server into a temporary file, and then give that file to the COM object. In this way, if there are any errors while accessing the networked file, you should be able to see them.
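A rough sketch of that idea, assuming the client machine shares a folder named word and that PHP is allowed to read UNC paths (the share name, file name, and $word COM instance are assumptions):
// Build a UNC path to the client's shared folder from the requesting IP (assumed share name "word").
$clientIp = $_SERVER['REMOTE_ADDR'];
$uncPath  = '\\\\' . $clientIp . '\\word\\' . $filename;
// Copy it to a temporary file on the server before handing it to the COM object.
$tempFile = tempnam(sys_get_temp_dir(), 'doc');
if (@copy($uncPath, $tempFile)) {
    // $word is assumed to be an existing COM('Word.Application') instance.
    $word->Documents->Open($tempFile);
} else {
    echo 'Could not read ' . $uncPath; // surfaces network/permission errors
}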
I find it strange, anyway, that passing a network file path you end up with a local file. Are you sure the COM object is not copying the file it finds at the remote path onto the server? Did you try with a different file? If that happens with different files too, then we are missing something; I would find it strange that for every network file you try to open there is already a local file with the same name. Try renaming the network files as well.
Related
Is it possible to send a file from a local URL?
I need to upload files in PHP from a local URL; e.g. I open the web page with:
www ... upload.php?url=c://...file.jpg,
and from the URL in GET I would get the file on the PC and upload it, without using HTML or anything else where I'd have to choose the file; just with the file URL.
Important: it can only work this way. It will not be possible to choose the file during the upload; it will only be a POST or GET with the local URL.
I researched and found nothing related. If anyone can help...
I think it is impossible for the server to get a client file by using a file path.
But maybe you can use JS and FileSystemObject to read the file, turn it into a stream, and post it to the server.
And you must know that FSO needs high security permissions and may be disabled in the user's browser.
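If the client-side script does manage to POST the file content, the receiving end in PHP could be as simple as this sketch (the endpoint name and storage path are assumptions):
// receive.php -- hypothetical endpoint that stores raw file content posted by a client-side script
$content = file_get_contents('php://input'); // raw request body
if ($content !== false && $content !== '') {
    file_put_contents('uploads/' . uniqid('client_', true) . '.dat', $content);
    echo 'stored';
} else {
    http_response_code(400);
    echo 'no content received';
}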
How can I delete a file using a URL path?
I have
$file_with_path = "http://www.myweb.com/uploads/audio.mp3";
if (file_exists($file_with_path)) {
    unlink($file_with_path);
}
I don't use "/uploads/audio.mp3" or similar directory paths for various reasons.
Thanks in advance!
unlink tells the operating system to delete a given file. The OS identifies files by file system path - it does not interact with URLs in any way. URLs are translated to file system paths by the web server, which is an entirely different piece of software. While theoretically there is a way to tell a web server to delete a file (by sending an HTTP DELETE request), no web server is going to honor that - it would be way too insecure. It is relatively easy to control who can access the file system; it is very hard to control who can send requests to the web server.
In short, you will have to figure out what the file system path for the file is, and use unlink (and file_exists) with that path.
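For example, a minimal sketch, assuming the URL maps directly onto a folder under the document root:
// Translate the URL into a file system path before calling unlink().
$url = "http://www.myweb.com/uploads/audio.mp3";
$path = $_SERVER['DOCUMENT_ROOT'] . parse_url($url, PHP_URL_PATH); // e.g. /var/www/uploads/audio.mp3
if (file_exists($path)) {
    unlink($path);
}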
I'm using two computers (both connected to one network) and one of them has XAMPP. I'm trying to upload files to the one with XAMPP in it (the files are from the other computer). But I always end up having the 'No such file or directory' error even though I have the correct path. But when I use the path from the computer with XAMPP, even when I'm using the other computer, the system works just fine. Can anyone help me?
P.S. I'm using PHP copy() function because the file path is coming from an excel file.
Here's the part of my PHP code:
$original_file_name = $objWorksheet->getCellByColumnAndRow(5,$i)->getValue();
// Example of the cell value: C:\Users\ComputerWithoutXAMPP\Desktop\scanned documents\SO 2010\#1.jpeg
$ext = pathinfo($original_file_name, PATHINFO_EXTENSION);
$file = time().substr(md5(microtime()),rand(0,26),5);
// UPLOAD THE FILE DECLARED IN EXCEL
copy($original_file_name, 'uploads/docs/'.$file.'.'.$ext);
You can use copy() to upload a file to another machine, or from another machine, but to access the remote machine, the appropriate argument to copy() (source or dest) has to be a URL. The code you've posted is trying to copy the file to a local "uploads/docs/" directory; it doesn't even appear to be aware of another machine.
While what you're looking to do may be technically possible, I haven't the foggiest idea how you'd go about it: it seems rather Rube Goldberg to me. The ftp:// wrapper would probably work, if FTP is set up properly on the XAMPP server.
How big is the file you're trying to send? If it's small enough, you may have better luck either encoding and sending the content itself, or uploading the file with curl to an upload-catching script on the XAMPP side.
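A hedged sketch of both suggestions (the hosts, credentials, paths, and the upload_catcher.php script are all placeholders):
// Option 1: let copy() read from the other machine via the ftp:// wrapper
// (requires an FTP server running on the machine that holds the files).
$source = 'ftp://user:password@192.168.1.10/docs/file.jpeg';
copy($source, 'uploads/docs/' . time() . '.jpeg');

// Option 2: push the file from the machine that has it to the XAMPP server with curl
// (requires PHP >= 5.5 for CURLFile).
$ch = curl_init('http://192.168.1.20/upload_catcher.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('file' => new CURLFile('C:\\path\\to\\file.jpeg')));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
echo curl_exec($ch);
curl_close($ch);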
I'm using the library phpseclib 0.3.7 to connect via SFTP to my web server. The users that are present have the following structure:
/dirhome/FirstUser/FirstUser/
/dirhome/SecondUser/SecondUser/
/dirhome/....../.....
Where:
dirhome:
  owner -> root:root
  permissions -> 700
FirstUser (the outer directory, named after the user):
  owner -> root:root
  permissions -> 755
FirstUser (the inner directory):
  owner -> FirstUser:mygroup
  permissions -> 700
The same structure applies for the second user and so on. With this setup, users can view/edit/create content only within their own directory.
I can easily connect in PHP using the library and view folders/files, but I do not know how to upload files from the user's local PC to the remote server, into their personal folder.
The library provides the method:
->put($remotepath, $localpath, NET_SFTP_LOCAL_FILE);
The problem is:
How can I open a dialog box to let the user select a file on their PC and upload it to their personal directory?
If I put: $localpath = "C:\\Users\\****\\Desktop\\test.txt"
I get:
C:\\Users\\****\\Desktop\\test.txt is not a valid file
But if I use a file that resides locally on the server, it works fine. That does not help, though, because users cannot upload their own files that way. I have been trying for several days without success.
The download does not work either, for essentially the same reason.
PHP is a server side scripting language. When your browser requests a PHP generated site, PHP runs on the server and your browser only gets to see the result. For that reason, PHP obviously has no way to access client side files: it already ran before anything even reaches the client computer.
I'm assuming the server with the PHP files on it and the SFTP server are different servers, otherwise your whole question doesn't make too much sense.
So, what you need to do here is a two-step approach: First, you need to upload the files to the server that runs the PHP scripts the regular way, using an HTTP POST request. You can send the request to a PHP script that then uses SFTP to move the files to the other server.
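A rough sketch of that first step with phpseclib 0.3.x (the host, credentials, form field name, and target path are assumptions):
// upload.php -- receives a regular HTTP file upload and forwards it over SFTP
include 'Net/SFTP.php';
if (isset($_FILES['userfile']) && is_uploaded_file($_FILES['userfile']['tmp_name'])) {
    $sftp = new Net_SFTP('sftp.example.com');     // assumed SFTP host
    if (!$sftp->login('FirstUser', 'password')) { // assumed credentials
        exit('SFTP login failed');
    }
    $remote = '/dirhome/FirstUser/FirstUser/' . basename($_FILES['userfile']['name']);
    $sftp->put($remote, $_FILES['userfile']['tmp_name'], NET_SFTP_LOCAL_FILE);
}
The file-selection dialog the question asks about is then just a regular HTML form with an input of type "file" posting to this script.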
For downloads (as you asked about this in your comments) it works similarly: the browser requests a PHP script that fetches the file from the SFTP server and then sends it to the browser as the HTTP response. The browser should then display a regular file download dialog, and the user can select where to store it.
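And a matching sketch for the download direction (again, the host, credentials, and file name are assumptions):
// download.php -- fetches a file over SFTP and streams it to the browser
include 'Net/SFTP.php';
$sftp = new Net_SFTP('sftp.example.com');
if (!$sftp->login('FirstUser', 'password')) {
    exit('SFTP login failed');
}
$remote = '/dirhome/FirstUser/FirstUser/test.txt';
$data = $sftp->get($remote); // returns the file contents as a string
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($remote) . '"');
header('Content-Length: ' . strlen($data));
echo $data;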
For uploading you should consider using some kind of cron job or a background job started via PHP's exec() instead, as you will most likely either run into max execution timeouts or have to set them way higher than you should if you upload the files using PHP, especially for large files. The alternative is to use a separate PHP configuration (depending on what version of PHP you are running you can use .htaccess files, .user.ini files or different PHP-FPM pools for that) to increase the execution time for only the upload script.
On the current website I'm working on, I've got a directory of files for users to download, for which it would be really nice to have some security method other than obscurity ;)
I was wondering if there's any way to supply login information via PHP to htaccess as though a user were entering it.
Alternately, if anyone knows a better way to secure user downloads using PHP, that's also acceptable. All of my googling turns up "just use htaccess" which isn't really helpful, as from the non-savvy user's point of view, they have to log in twice every time they use the website.
My best guess at doing it exclusively with PHP is to store files above the web root, then copy them to a web accessible folder temporarily, but this seems highly inefficient and I couldn't think up any way to remove them after the download has finished.
Note: I don't own the server this is running on and don't have ssh access to it.
If the files are not too big (gigabytes), you can always use readfile for the download. This way you can check the user's authentication first and, if it's OK, output the file contents to the user; otherwise send them to the login page.
With this method you can put your files in a directory protected with .htaccess, so you can be sure that nobody who isn't authenticated can access them directly.
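A minimal sketch of that pattern, assuming your login system exposes something like a session flag (file path and flag name are placeholders):
// Hypothetical auth check -- replace with however your login system tracks users.
session_start();
if (empty($_SESSION['logged_in'])) {
    header('Location: login.php');
    exit;
}
$file = '/path/outside/webroot/report.pdf'; // assumed protected file
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));
readfile($file);
exit;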
I think I would either store them in a folder outside of the web root, or in a folder protected by .htaccess, and then have a PHP script that checks whether the user is logged in and allowed to download the file asked for. If so, just pass the file through to the user.
Example from the fpassthru() manual page at php.net:
Example #1 Using fpassthru() with binary files
<?php
// open the file in a binary mode
$name = './img/ok.png';
$fp = fopen($name, 'rb');
// send the right headers
header("Content-Type: image/png");
header("Content-Length: " . filesize($name));
// dump the picture and stop the script
fpassthru($fp);
exit;
?>
Someone else made a comment about having to report the correct content-type, which is true. Often, in my own experience, I already know it, or can use the file extension pretty easily. Otherwise you can always try to have a look at finfo_file. On that page there are also some comments about what you could do especially for images as well.
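For reference, a quick sketch of detecting the type with the fileinfo extension:
// Detect the MIME type of a file with the fileinfo extension.
$name = './img/ok.png';
$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime = finfo_file($finfo, $name); // e.g. "image/png"
finfo_close($finfo);
header("Content-Type: " . $mime);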
You should use a PHP script to control the access.
Create a directory outside the webroot, or inside the webroot with a .htaccess file, where you locate the download files.
Outside the webroot is better.
If they are located inside, you have to make sure that no one can access those files directly.
Then use the class HTTP_Download from the PEAR class library.
Using this class has many advantages:
Ranges (partial downloads and resuming)
Basic caching capabilities
Basic throttling mechanism
On-the-fly gzip-compression
Delivery of on-the-fly generated archives through Archive_Tar and Archive_Zip
Sending of PgSQL LOBs without the need to read all data in prior to sending
You should not use readfile or any forwarding file pointer, because you have to set the headers yourself and they don't support HTTP "Range" requests.
For the access restrictions you can use your session manager, passwords, framework, forum, etc.
PEAR HTTP_Download: http://pear.php.net/package/HTTP_Download
You need to copy the URL, because SO encodes it to a URL-encoded string (which is correct), but the PEAR homepage doesn't like that.
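A sketch of how HTTP_Download is typically used, based on the PEAR documentation (check the method names against the version you install; the file path is a placeholder):
require_once 'HTTP/Download.php';
// Your own session/auth check goes here before sending anything.
$dl = new HTTP_Download();
$dl->setFile('/path/outside/webroot/file.zip');
$dl->setContentType('application/zip');
$dl->setContentDisposition(HTTP_DOWNLOAD_ATTACHMENT, 'file.zip');
$dl->send();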
Why reinvent the wheel? Take a look at File Thingy, which is pretty easy to install and customise. If nothing else, you can study the source to learn how to perform the authentication step.
You could use MySQL to store the uploaded files, rather than storing them on the filesystem; better and more secure, in my opinion. Just Google "MySQL upload php" for example.
You could also create the .htaccess file with a PHP script from your users table each time a user accesses that folder, but that is very troublesome.
I think the first option is better.
Use X-SendFile! There are extensions for Apache, Lighty, and Nginx, so there's a good chance there's one for your webserver.
Once you have the extension installed, you can authenticate the user using your PHP script, and then add the header:
header('X-Sendfile: /path/to/file');
When your PHP script is done, it will trigger the webserver to stream the file for you. This is especially efficient if you use PHP along with for example FastCGI, because it frees up the PHP process for other work.
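A minimal sketch of that flow with mod_xsendfile on Apache (the auth check and file path are placeholders):
// Authenticate the user first, in whatever way your site already does.
session_start();
if (empty($_SESSION['logged_in'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
// Hand the actual file transfer off to the web server.
$file = '/path/outside/webroot/file.zip'; // assumed protected file
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('X-Sendfile: ' . $file);
exit;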
Evert