I want to send data to my server via SSH. The data is image files that need to be saved into directories on the server.
If I run the script via SSH how can I get a PHP script to read the image files?
For example, if I used bash I could do
./uploadimage.bash -image.jpg
and that would be that. But my knowledge of PHP is limited. Usually PHP uses the $_FILES array when dealing with files, as far as I know. Do I need to use this if I send the files over SSH?
Run the script via php command line executable:
php myscript.php image.jpg image2.jpg
You can then get the file names via the $argv array and deal with them any way you please.
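For example, a minimal sketch of such a script (the destination directory /var/www/images is just an assumption; adjust it to your setup):

<?php
// myscript.php - copy each image named on the command line into a target directory.
// The destination path below is an assumption.
$destination = '/var/www/images';
for ($i = 1; $i < $argc; $i++) {
    $file = $argv[$i];
    if (is_file($file)) {
        copy($file, $destination . '/' . basename($file));
    } else {
        fwrite(STDERR, "Not a file: $file\n");
    }
}

Then run it over SSH as: php myscript.php image.jpg image2.jpg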
It depends on what you need to do with the images.
Do you just need to echo them out to the user?
echo file_get_contents('image.jpg');
Are you meaning to retrieve the command-line variables passed to the script? Use $argv.
myScript.php image.jpg # image.jpg becomes $argv[1]
Do you need to do processing on the images? Use the GD functions.
$image = imagecreatefromjpeg('image.jpg');
// Do processing
imagejpeg($image); // Pass a filename if you want to save it instead of output it.
If you want to simply upload a bunch of files to a remote server, your best bet is the scp command, rather than PHP:
scp /your/file.png username@host:/remote/path.png
If you execute a PHP script through an SSH session, there's no way to send a file along with it. You have to copy the file to the remote server first and then run the PHP script, which reads it with file_get_contents() or something like that.
If you are sure that PHP must be the recipient of the file (maybe you want some logic to determine the file name or path), you have to expose it through a web server, and you can use curl to upload the file, like this:
curl -k -F myfile=@image.png http://yourhost/foo.php
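On the server side, foo.php would then read the upload from $_FILES. A minimal sketch (the field name myfile matches the curl call above; the target directory is an assumption):

<?php
// foo.php - receive the file posted by curl -F myfile=@...
// The target directory is an assumption; adjust as needed.
$target = '/var/www/uploads/' . basename($_FILES['myfile']['name']);
if (move_uploaded_file($_FILES['myfile']['tmp_name'], $target)) {
    echo "Saved to $target\n";
} else {
    http_response_code(500);
    echo "Upload failed\n";
}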
Related
Is it possible to download a file using wget? I want to download that file into my browser's default download directory.
It's possible, but it wouldn't achieve the effect you desire.
Running wget would cause the file to be downloaded to the server (which is something you'd be better off using the cURL library for instead of shelling out to an external binary).
If you want the browser to download it, then you need to output the file from PHP, not save it to a file on the server.
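For instance, a sketch of serving a remote file as a browser download (the URL and filename are placeholders; readfile() on a URL assumes allow_url_fopen is enabled, otherwise fetch it with cURL first):

<?php
// download.php - fetch a remote file and stream it to the browser as a download.
// The URL and filename below are placeholders.
$url = 'http://example.com/file.zip';
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="file.zip"');
readfile($url); // requires allow_url_fopen; use the cURL extension otherwise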
Try something like this:
shell_exec('wget -P path_to_default_download_directory google.com');
I am trying to write a batch script to download and save an XLS file from a URL. I am able to download the file by using
#ECHO OFF
start /d "C:\Program Files\Internet Explorer" IEXPLORE.EXE url link
exit
I would now like to save these files to a folder or directory.
Any help anyone could provide here would be greatly appreciated.
There are at least two ways to do it.
Like Buddy suggested, use a command-line downloader, for example wget.
You can also run PHP directly from the command line (even without a webserver); since you're on Windows, see "Command Line PHP on Microsoft Windows" in the PHP manual.
Don't run a browser, let alone IE, just to download a file.
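If you go the PHP route, a minimal sketch that downloads the XLS and saves it to a folder (the URL and destination path are placeholders):

<?php
// getxls.php - download an XLS file and save it locally.
// URL and destination are placeholders; replace with your own.
$url  = 'http://example.com/report.xls';
$dest = 'C:\\Downloads\\report.xls';
$data = file_get_contents($url);
if ($data === false) {
    exit("Download failed\n");
}
file_put_contents($dest, $data);

You can then call it from your batch file with: php getxls.php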
I'm pretty new to the cURL library so I hope the solution to this isn't too trivial.
Basically I have a remote directory, let's say http://www.something.com/dir. The following files are present in this directory:
file1.xml
file2.xml
file3_active.xml
Is there a way I can get the files where the filename matches the phrase 'active' into a string? Would the solution work both for http and ftp?
EDIT: How can I get the list/array of filenames in a remote dir? If I could do that, I could simply use strpos, get the full filename and use cURL the simple way.
Many Regards,
Andreas
How can I get the list/array of filenames in a remote dir.
Off the top of my head:
Via an FTP dir command, if you have FTP access (see the sketch after this list).
Via a custom PHP (or whatever) script on the remote server which generates a machine-parsable list for you.
Via a shell_exec()/popen()/ssh2_exec() call to a shell command like ls or find, run through SSH.
By parsing HTML from a web-server generated directory listing (i.e., as generated by Apache mod_autoindex) on the remote server.
Each of these options is going to require some action on the part of the person hosting the remote server -- so if it's completely out of your control, I think you're SOL.
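For the FTP option, a minimal sketch using PHP's FTP functions, filtering for names containing 'active' with strpos as you suggested (host, credentials, and path are placeholders):

<?php
// List a remote FTP directory and keep only filenames containing 'active'.
// Host, credentials, and path are placeholders.
$conn = ftp_connect('ftp.something.com');
ftp_login($conn, 'user', 'password');
$files = ftp_nlist($conn, '/dir');
$matches = array();
if ($files !== false) {
    foreach ($files as $file) {
        if (strpos(basename($file), 'active') !== false) {
            $matches[] = $file;
        }
    }
}
ftp_close($conn);
print_r($matches); // e.g. Array ( [0] => /dir/file3_active.xml )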
exec ("C:/Lame/sox \"C:/1/2.wav\" -t wav \"C:/1/2.rev\" reverse");
Using that code to run an audio post-processing tool that reverses a sound file. There is an output, but the file is about 1/5th the size it should be and I am unable to play it. Basically it makes a file, but it's not the one I would get if I ran this in the command prompt:
C:/Lame/sox "C:/1/2.wav" -t wav "C:/1/2.rev" reverse
With that, I get the result I want and I am able to play the rev file.
Anyone have any idea why this is happening?
Found the problem. It was a permissions problem.
All the other post-processing commands work because they write into that folder. reverse creates a temporary file in another folder where the current user didn't have write access, which is why it produced a small file: it later tried to read from a temporary file that was never written.
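When exec() behaves differently from the command prompt, capturing stderr and the exit code usually surfaces this kind of issue. A small sketch using the same command:

<?php
// Redirect stderr into the output array and check the exit code,
// so permission errors from sox become visible instead of silent.
exec('C:/Lame/sox "C:/1/2.wav" -t wav "C:/1/2.rev" reverse 2>&1', $output, $status);
if ($status !== 0) {
    echo "sox failed ($status):\n" . implode("\n", $output);
}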
I have LaTeX code inside PHP (not as .tex file); for example received by $_POST. How can I save the rendered LaTeX as a PNG or PDF file on my server?
EDIT: I know that PHP normally does not do this. I will run a shell command within PHP, so I need something that works from a Linux terminal.
You could exec() pdfTeX to generate a PDF.
URL: http://www.tug.org/applications/pdftex/
Run the command
pdftex file.tex
after you have saved your TeX code from $_POST to a file using file_put_contents(). Make sure you have write permission in the specified folder.
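Putting it together, a minimal sketch (the working directory and the $_POST field name are assumptions; the PNG step additionally assumes ImageMagick's convert is installed):

<?php
// Save the posted LaTeX source, compile it with pdftex, and
// optionally convert the PDF to a PNG (assumes ImageMagick's convert).
$dir = '/tmp/latex'; // working directory - an assumption
file_put_contents("$dir/file.tex", $_POST['latex']); // field name is an assumption
exec("cd $dir && pdftex -interaction=nonstopmode file.tex 2>&1", $out, $status);
if ($status === 0) {
    exec("convert -density 150 $dir/file.pdf $dir/file.png");
}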
hope that helps!