I have WAMP running (on a server called Emerald) and MAMP running on my Mac. People register on MAMP; Emerald is basically file hosting.
Emerald connects to MAMP's MySQL database to log users in. However, I want to create directories for new registrations on Emerald using PHP.
How can I do this? I have tried using this code:
$thisdir = "192.168.1.71";
$name = "Ryan-Hart";
if(mkdir($thisdir ."/documents/$name" , 0777))
{
echo "Directory has been created successfully...";
}
But I had no luck. It basically needs to connect to the other server and create a directory named after the user.
I hope this is clear.
You can't create directories over HTTP alone. You need a filesystem connection to the remote location (a local hard disk, or a network share, for example).
The easiest way that doesn't require setting up FTP, SSH or a network share would be to put a PHP script on Emerald:
<?php
// Skipping sanitization because this is only going to be called
// from a friendly script. If "dir" is user input, you need to sanitize it.
$dirname = $_GET["dir"];
$secret_token = "10210343943202393403";
if ($_GET["token"] != $secret_token) die ("Access denied");
// Alternatively, you could restrict access to one IP
error_reporting(0); // Remove this line to see mkdir's error messages
$success = mkdir("/home/www/htdocs/docs/".$dirname);
if ($success) echo "OK"; else echo "FAIL";
and call it from the other server:
$success = file_get_contents("http://192.168.1.71/create_script.php?token=10210343943202393403&dir=HelloWorld");
echo $success; // "OK" or "FAIL"
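If the "dir" parameter ever does come from user input, a minimal sanitization step could look like this (a sketch only; it keeps just the base name and a whitelist of characters):

// Hypothetical hardening for the script above: strip path components and
// keep only safe characters so values like "../../etc" cannot escape the docs folder.
$dirname = basename($_GET["dir"]);
$dirname = preg_replace('/[^A-Za-z0-9_\-]/', '', $dirname);
if ($dirname === '') die("FAIL");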
Create a script on the other server that creates the dir and call it remotely.
Make sure you have a security check (at least a simple password).
There is no generic method to access remote server filesystems. You have to use a file transfer protocol and server software to do so. One option would be SSH, which however requires some setup.
$thisdir = "ssh2.sftp://user:pass#192.168.1.71/directory/";
On Windows you might get FTP working more easily, so using an ftp:// URL as the directory might work.
As a last alternative you could enable WebDAV (the PUT method alone works for file transfers, not for creating directories) on your WAMP webserver. (But then you probably can't use the raw PHP file functions; you would probably need a wrapper class or curl to utilize it.)
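For completeness, if you did enable WebDAV, directory creation uses the MKCOL method rather than PUT; a rough sketch with PHP cURL (the /webdav/ path and the credentials are assumptions for illustration):

<?php
// Hypothetical example: ask the WebDAV server on Emerald to create a collection (directory).
$ch = curl_init("http://192.168.1.71/webdav/documents/Ryan-Hart/");
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "MKCOL");
curl_setopt($ch, CURLOPT_USERPWD, "user:pass");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
echo $status == 201 ? "Directory created" : "Failed (HTTP $status)"; // 201 Created on success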
I know this is old, but I think this might be useful. In my experience:
if(mkdir($thisdir ."/documents/name" , 0777))
doesn't work; I need to do it like this:
mkdir($thisdir, 0777);
mkdir($thisdir ."/documents" , 0777);
mkdir($thisdir ."/documents/name" , 0777);
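Alternatively, PHP's mkdir() takes a third $recursive parameter that creates any missing parent directories in a single call:

// Creates /documents/name and any missing parents in one go.
mkdir($thisdir . "/documents/name", 0777, true);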
hope it helps :)
I need to upload the same file to 2 different places on the same FTP server. Is there a way to copy the file on the FTP server to the other place instead of uploading it again? Thanks.
There's no standard way to duplicate a remote file over the FTP protocol. Some FTP servers support proprietary or non-standard extensions for this though.
Some FTP clients do support remote file duplication, either using such extensions or via a temporary local copy of the remote file.
For example, the WinSCP FTP client supports duplication using both drag & drop and a menu/keyboard command:
It supports the SITE CPFR/CPTO FTP extension (supported for example by the ProFTPD mod_copy module)
It falls back to an automatic duplication via a local temporary copy, if the above extension is not available.
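If your server does support that extension, the raw commands can also be sent from plain PHP; a minimal sketch, assuming an FTP server with mod_copy enabled and placeholder credentials:

<?php
// Hypothetical server-side copy via the SITE CPFR/CPTO extension (ProFTPD mod_copy).
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_raw($conn, 'SITE CPFR /path/to/source.txt'); // "copy from"
ftp_raw($conn, 'SITE CPTO /path/to/target.txt'); // "copy to"
ftp_close($conn);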
(I'm the author of WinSCP)
Another workaround is to open a second connection to the FTP server and make the server upload the file to itself by piping a passive-mode data connection to an active-mode data connection. This solution is shown in the answer by @SaadAchemlal. This is basically a use of the FXP protocol, but against a single server. Though many FTP servers will reject this, as they won't allow a data connection to/from an address different from the client's.
Side note: people often confuse move with copy. In case you actually want to move, then that's a completely different question. Moving a file on FTP is widely supported.
I don't think there's a way to copy files without downloading and re-uploading; at least I found nothing like this in the List of FTP commands, and no client I have seen so far supports anything like this.
Yes, the FTP protocol itself can support this in theory. The FTP RFC 959 discusses this in section 5.2 (see the paragraph starting with "When data is to be transferred between two servers, A and B..."). However, I don't know of any client that offers this sort of dual server control operation.
Note that this method could transfer the file from the FTP server to itself using its own network, which won't be as fast as a local file copy but would almost certainly be faster than downloading and then reuploading the file.
I can copy files between remote folders on Linux-based systems.
In my particular case, I'm using the very common file manager PCManFM:
Menu "Go" --> "Connect to server"
FTP Login info, etc
Open new tab in PCManFM
Connect to same server
Copy from tab to tab...
It's a bit slow, so I guess it could be downloading and re-uploading the files, but it's done automatically and is very user-friendly.
The code below makes the FTP server upload the file to itself (using a loopback connection). It needs the FTP server to allow both passive and active connection modes.
If you want to understand the FTP commands, here is a list of them: List of FTP commands
function copyFile($filePath, $newFilePath)
{
    // Open two control connections to the same server
    $ftp1 = ftp_connect('192.168.1.1');
    $ftp2 = ftp_connect('192.168.1.1');
    ftp_raw($ftp1, "USER ftpUsername");
    ftp_raw($ftp1, "PASS mypassword");
    ftp_raw($ftp2, "USER ftpUsername");
    ftp_raw($ftp2, "PASS mypassword");
    // Put the second connection into passive mode and extract the
    // "host,port" tuple from the PASV reply
    $res = ftp_raw($ftp2, "PASV");
    $addressAndPort = substr($res[0], strpos($res[0], '(') + 1);
    $addressAndPort = substr($addressAndPort, 0, strpos($addressAndPort, ')'));
    ftp_raw($ftp1, "CWD ." . dirname($newFilePath));
    ftp_raw($ftp2, "CWD ." . dirname($filePath));
    // Point the first connection's active-mode data channel at the passive
    // port opened above, so the server streams the file to itself
    ftp_raw($ftp1, "PORT " . $addressAndPort);
    ftp_raw($ftp1, "STOR " . basename($newFilePath)); // receive as the new file
    ftp_raw($ftp2, "RETR " . basename($filePath));    // send the original file
    ftp_raw($ftp1, "QUIT");
    ftp_raw($ftp2, "QUIT");
}
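A hypothetical call, assuming both directories already exist on the server and the hard-coded credentials above are correct:

copyFile('/folder1/file.txt', '/folder2/file.txt');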
I managed to do this by using WebDrive to mount the FTP site as a local folder, then "downloading" the files with FileZilla directly into that folder. It was a bit slower than a normal download, but you don't need the space on your HDD.
Here's another workaround using PHP cURL to execute a copy request on the server by feeding parameters from the local machine and reporting the outcome:
Local code:
In this simple test routine, I want to copy the leaning tower photo to the correct folder, Pisa:
$ch = curl_init();
$data = array ('pic' => 'leaningtower', 'folder' => 'Pisa');
curl_setopt($ch, CURLOPT_URL,"http://travelphotos.com/copypic.php");
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
Server code (copypic.php):
On the remote server, I have simple error checking. On this server I had to mess with the path designation, i.e., I had to use "./" for an acceptable path reference, so you may have to tinker with it a bit.
$pic = $_POST["pic"];
$folder = $_POST["folder"];
if (!$pic || !$folder) exit();
$sourcePath = "./unsortedpics/".$pic.".jpg";
$destPath = "./sortedpics/".$folder."/".$pic.".jpg";
if (!file_exists($sourcePath)) exit("Source file not found");
if (!is_dir("./sortedpics/".$folder)) exit("Invalid destination folder");
if (!copy($sourcePath, $destPath)) exit("Copy not successful");
echo "File copied";
You can do this from cPanel.
Log into your cPanel.
Go into the file manager.
Find the file or folder you want to duplicate.
Right-click and choose Copy.
Type in the new directory you want to copy to.
Done!
You can rename the file to be copied, giving it the full path of your desired result.
For example:
If you want to move the file "file.txt" into the folder "NewFolder", you can write it as:
ftp> rename file.txt NewFolder/file.txt
This worked for me.
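The same operation can be done from PHP with ftp_rename(); a minimal sketch, assuming an already authenticated connection $conn_id and that NewFolder exists on the server (note this moves the file rather than copying it):

// Move file.txt into NewFolder on the server.
ftp_rename($conn_id, 'file.txt', 'NewFolder/file.txt');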
I am working on a PHP web app.
I need to upload a file with customer info - customers.csv - to the web server.
But this process needs to be automated.
The file will be generated in a Point of Sale app, and the app can open a browser window with the URL...
First I thought I would do something like this: www.a.com/upload/&file=customers.csv
but I read on here that that is not possible.
Then I thought I would set a value for the file upload field and submit the form automatically after x seconds. I discovered that's not possible either.
Any solution will be appreciated.
EDIT
I have tried this and it works; the file is uploaded to the remote server.
Is it working only because the PHP script is running on the same PC where the CSV is sitting?
$file = 'c:\downloads\customers.csv';
$remote_file = 'customers.csv';
// set up basic connection
$conn_id = ftp_connect('host.com');
// login with username and password
$login_result = ftp_login($conn_id,'user','password');
// upload a file
if (ftp_put($conn_id, $remote_file, $file, FTP_ASCII)) {
echo "successfully uploaded $file\n";
} else {
echo "There was a problem while uploading $file\n";
}
// close the connection
ftp_close($conn_id);
This is of course not possible; imagine how it could be abused to upload, for example, /etc/passwd on Linux. The only way it might be possible is to use a Java applet, but that is for sure not the best way.
You could try to let your PoS application make a web request with the customers.csv file and let a web API handle the upload. This may be possible, but I have no experience with Point of Sale applications.
If the solution above cannot be considered, it might be best to just prompt the user to provide the file and check its name and content to make sure it is the correct one.
This is a bit tricky, but if your CSV is not too long, you could encode it in base64, send it to the web server as a GET parameter, and then, on the server side, decode it and store it as a CSV file.
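A rough sketch of that idea (the receive.php endpoint name is an assumption; note that GET URLs have length limits, so this only suits small files):

<?php
// Hypothetical client side: read the locally generated CSV,
// base64-encode it and pass it as a GET parameter.
$csv = file_get_contents('c:\downloads\customers.csv');
$response = file_get_contents('http://www.a.com/receive.php?data=' . urlencode(base64_encode($csv)));
echo $response;

and on the server side (receive.php):

<?php
// Hypothetical server side: decode the parameter and store it as a CSV file.
if (!isset($_GET['data'])) die('No data');
$ok = file_put_contents('customers.csv', base64_decode($_GET['data']));
echo $ok !== false ? 'OK' : 'FAIL';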
If the file is too big for that, you have to use another method, like the Java applet pointed out by @D.Schalla, or even install and configure an FTP server and make the Point of Sale app upload the file there.
Another alternative, especially good if you cannot modify the sale app, is to install a web server on the client side and write a small PHP script to handle the upload process. This way, the sale app could call a local URL (something like http://localhost/upload.php) and this script would be in charge of uploading the file, which can be achieved with a classic HTTP POST, an FTP connection, or any other way you can think of.
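As a sketch of that last alternative (host name, credentials and paths are placeholders borrowed from the question), upload.php on the client-side web server could simply pick up the locally generated CSV and push it on over FTP:

<?php
// Hypothetical upload.php running on the web server installed on the client PC.
$local = 'c:\downloads\customers.csv';
if (!file_exists($local)) die('customers.csv not found');
$conn_id = ftp_connect('host.com'); // placeholder host
if (!$conn_id || !ftp_login($conn_id, 'user', 'password')) die('FTP connection failed');
$ok = ftp_put($conn_id, 'public_html/customers.csv', $local, FTP_ASCII);
ftp_close($conn_id);
echo $ok ? 'Uploaded' : 'Upload failed';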
My solution, which will work without setting up a web server on the client side.
This is for Windows but can be adapted to Linux.
On the client side:
The local application opens cmd and runs this command: ftp -n -s:C:\test.scr
which runs test.scr, a file with FTP commands, e.g. (note that with -n auto-login is disabled, so the credentials are passed with the user command):
open host.com
user user1 passwOrd
put C:\downloads\customers.csv public_html/customers.csv
quit
More info here:
http://support.microsoft.com/kb/96269
More commands:
http://www.nsftools.com/tips/MSFTP.htm#put
I have been told this cannot be done but I want to get some other opinions here. I am a bit of a newbie when it comes to things like this.
My Site: ExampleSiteA.com
File to download from: ExampleSiteB.com
Basically, I am downloading a csv file from ExampleSiteB.com to make updates to my site, ExampleSiteA.com. To do this, I am downloading the csv file manually through CoreFTP and then uploading it manually to ExampleSiteA.com. The file changes daily and I would like to skip this step so I can automate the process.
Keep in mind that I need to download the csv file from ExampleSiteB.com through SFTP...
I am not sure if it is possible to directly download/upload a file from one server to another when one of them uses SFTP. The file is also quite large; it averages about 25,000 KB / 25 MB.
Another option that I haven't explored yet is requiring or including a file from another server... is that an option or a possibility? The file is located in a folder exclusively for my site and a login is required for SFTP download.
Any insight will be appreciated. Thanks in advance!
Go here and download what you need: http://phpseclib.sourceforge.net/
UPDATE
FOR SFTP
Then in your script:
<?php
include('Net/SFTP.php');
$url = 'http://www.downloadsite.com';
$fileToDownload = "yourCSV.csv";
$cmd = "wget -q \"$url\" -O $fileToDownload";
exec($cmd);
$sftp = new Net_SFTP('www.uploadsite.com');
if (!$sftp->login('username', 'password')) {
exit('Login Failed');
}
echo $sftp->pwd() . "\r\n";
$sftp->put('remote.file.csv', 'yourCSV.csv', NET_SFTP_LOCAL_FILE);
print_r($sftp->nlist());
?>
If you need to connect to a second server for download:
$sftp2 = new Net_SFTP('www.serverFromWhichToDownload.com');
if (!$sftp2->login('username', 'password')) {
exit('Login Failed');
}
echo $sftp2->pwd() . "\r\n";
$sftp2->get('remoteFileName.csv', 'localFileName.csv'); // get() takes the remote file first, then the local target
print_r($sftp2->nlist());
Read the docs for further help and examples: http://phpseclib.sourceforge.net/documentation/net.html#net_sftp_get
To Log what your connection is doing if it fails, etc. use this:
include('Net/SSH2.php');
define('NET_SSH2_LOGGING', true);
$ssh = new Net_SSH2('www.domain.tld');
$ssh->login('username','password');
echo $ssh->getLog();
FOR FTP upload:
$file = 'somefile.txt';
$remote_file = 'readme.txt';
$conn_id = ftp_connect($ftp_server);
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);
if (ftp_put($conn_id, $remote_file, $file, FTP_ASCII)) {
echo "successfully uploaded $file\n";
} else {
echo "There was a problem while uploading $file\n";
}
ftp_close($conn_id);
Yes, that's possible using ssh2_sftp.
http://php.net/manual/en/function.ssh2-sftp.php
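A minimal sketch with the ssh2 PECL extension (host, credentials and paths are placeholders; on newer PHP versions the SFTP resource has to be cast with intval() inside the stream URL):

<?php
// Hypothetical SFTP download using the ssh2 extension and its stream wrapper.
$conn = ssh2_connect('ExampleSiteB.com', 22);
ssh2_auth_password($conn, 'username', 'password');
$sftp = ssh2_sftp($conn);
// Stream the remote CSV straight into a local file.
copy('ssh2.sftp://' . intval($sftp) . '/path/to/file.csv', '/local/path/file.csv');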
I have had good luck with cURL in the past. If you are on a Linux box, it would be trivial to set up a cron job to do this update process for you. A good reference for CLI HTTP scripting in cURL can be found here, however you may need the -T flag (for file transfer) to accomplish the upload portion. Speaking of uploading, if you can run the script/process/crontab from the server you would like to update, I would recommend downloading from the web server to obviate one trip and a third party. Or, if you need to update on demand, you could write a PHP script that uses the built-in PHP cURL functions. If you take the Linux+CLI route, you could also use sftp.
Update: In testing curl with SFTP (curl -u uname:pword sftp://domain.tld) I get the following error: curl: (1) Protocol sftp not supported or disabled in libcurl on Kubuntu 12.04. So cURL may not be a good idea. I also tested CLI sftp (sftp uname@domain.tld:/dir/file.ext) but could not find a way (short of using SSH keys) to send authentication. Thus, this would necessarily be a manual process unless you did set up SSH keys between the servers. As it does not sound like you have that kind of access to ExampleSiteB.com, this probably isn't acceptable.
Update 2: Since my initial answer turned out to be of little use, I figured I would expand upon one of the above answers. I was trying to find a solution that did not involve a PECL extension, but I did not have much luck with ftp_ssl_connect(). I recommend trying it; you may have better luck and could forgo the PECL extension route.
Sigh, on further reading, it appears ftp_ssl_connect() is, understandably, incompatible with the SFTP protocol. However, I found a nice blog post about utilizing ssh2_connect() and ssh2_sftp() (as mentioned in a previous answer) and figured I would post that to give you some additional assistance. It is not as simple as calling the functions for most PHP distributions. Here is the blog post. Some of those steps may not be necessary, or you may need to do some additional things listed in another blog post I ran across, here.
On my system, all I had to do was run apt-get install libssh2-1-dev libssh2-php and I was able to find ssh2 in my php -m output.
Using an include, as long as you have read/write permissions on the website you're getting the file from, should work. However, this is just guesswork at the moment, as I don't have any means of checking it. Good luck though!
Yes, you should be able to do this.
Whoever told you that you can't do this might be getting confused with JavaScript and cross-site scripting browser restrictions, which prevent JavaScript downloaded from one domain from accessing content in a different domain.
That being said, if you are using PHP, which to me implies that you are talking about PHP running on a web server, you should be able to use PHP or any other scripting or programming language to download the file from SiteB.com, then update the file, and finally FTP the file to a different web server (SiteA.com).
First of all, I don't know if this is the right place to ask this question, so I'll count on the moderators to move it if need be.
I have a Linux PHP web hosting account on GoDaddy.
When I have to upload a file, I normally use FTP, either a client or the host's file manager.
However, if the file is one which I have to download from another website, I would prefer if I could "download" it directly to my hosting account; the reason being that I'm in Mauritius and our connection is among the slowest in the world. So I would prefer using the high (I'm just assuming it's higher) bandwidth of the host so that transfers go more quickly.
So, my question is: does anyone of you have a solution (PHP script, Java applet, or anything) that I could use to achieve that?
Thanks in advance,
Yusuf
First of all, this might be a security risk on your server.
Secondly, here's a little untested code:
<?php
echo 'get file...';
$data=file_get_contents('http://...target-url...');
if($data===false)die('Failed getting file.');
echo 'saving file...';
$succ=file_put_contents('...target-file...',$data);
echo $succ ? 'Success' : 'Failed saving file';
?>
Usable script (put into file "down.php" in your web root):
<?php
echo 'get file...';
if(!isset($_REQUEST['from']))die('Fail: Parameter "from" not set.');
if(!isset($_REQUEST['to']))die('Fail: Parameter "to" not set.');
$data=file_get_contents($_REQUEST['from']);
if($data===false)die('Failed getting file.');
echo 'saving file...';
$succ=file_put_contents($_REQUEST['to'],$data);
echo $succ ? 'Success' : 'Failed saving file';
?>
Usage (run it in from web browser):
http://yoursite.com/down.php?from=http://yourothersite.com/file-content.txt&to=/var/www/public_html/target.txt
WARNING: Make sure you remove script after use, it is a grave security issue.
Wget. I use it for downloading WordPress straight to a server:
# Download the title page of example.com to a file
# named "index.html".
wget http://www.example.com/
# Download Wget's source code from the GNU ftp site.
wget ftp://ftp.gnu.org/pub/gnu/wget/wget-latest.tar.gz
The examples are from the link above.
Christian's trick, made better with this code.
You can create a folder like d on your host and protect it with a password! Then create a new index.php and put the below code in it:
<?php
echo 'Get file...';
$url = $_REQUEST['from'];
$filename= preg_replace('/\\?.*/', '', basename($url));
$to ='dl/'.$filename;
$data=file_get_contents($_REQUEST['from']);
if($data===false)die('Failed getting file.');
echo "<br/>".'Saving file...';
$succ=file_put_contents($to,$data);
echo $succ ? "<br/>".'Success' : "<br/>".'Failed saving file';
?>
Finally, create a folder named dl (inside d) to store the downloaded files.
Usage (run it in from web browser):
http://yoursite.com/d/?from=http://yourothersite.com/file.txt
How do I test if a file exists on the current computer using the application?
I tried to use the full path to my file like this, but it doesn't work:
if(file_exists("C:/wamp/www/project/photo/".$nom_photo))
{
echo "file exist";
$extension=pathinfo("C:/wamp/www/project/photo/".$nom_photo,PATHINFO_EXTENSION);
echo "<br>";
$nom=md5($nom_photo.time().rand(0, 99999)).".".$extension;
echo $nom;
rename("C:/wamp/www/project/photo/".$nom_photo,"C:/wamp/www/project/photo/".$nom);
echo "<br>";
}
How can I fix it?
PHP operates server side and has NO ACCESS to the files on the machine running the web browser, unless they are indeed the same machine.
If you are meaning to find a way to test if a file exists on the web server, the file_exists() function you mentioned should find it. There are many reasons this might fail, including safe_mode, file permissions, and using the wrong path.
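A quick way to narrow down which of those it is (a debugging sketch only, reusing the path from the question):

<?php
$path = "C:/wamp/www/project/photo/" . $nom_photo;
clearstatcache();
var_dump($path);              // confirm the path is exactly what you expect
var_dump(file_exists($path)); // false usually means a wrong path
var_dump(is_readable($path)); // false here while file_exists() is true points at permissions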
The server doesn't have any access to the client's filesystem; that would be a major security flaw.
Also, JavaScript is sandboxed, so you couldn't do it on the client side either.
The only way I can think of doing this is to get the user to download a separate application that looks for the file and reports back to the server.