sftp/scp files with bash - php

I need to upload a set of files to an SFTP account.
My first thought was to use PHP's ssh2_connect() function, and I was able to get this working locally with no problem. However, once I moved to the dev environment I quickly realized that this wasn't a good solution: there are too many dependencies that won't exist there, and installing them would require too many approvals from too many people.
So my next thought was to use bash, and this is where I need help. I will be running this bash script every hour through cron, so it needs to be unattended. However, when I run sftp/scp, it prompts for a password.
The SFTP account does not allow ssh connections, so I cannot create authorized keys, and I don't want to rely on the .ssh folder on the remote machine either.
Any suggestions would be appreciated.
Keep in mind I cannot install anything, so no keychain, sshpass, or expect. The other answers found in How to run the sftp command with a password from Bash script? are not feasible either, as I cannot install anything on the server.

Initially I tried PHP's ssh2_connect() because PHP creates the file that I need to upload, so it makes sense to keep the SFTP transaction in my PHP script. Since that wasn't workable, I moved on to bash.
My solution actually uses PHP and curl:
$ch = curl_init();
$fp = fopen($file, "r");
// Credentials go in the URL; the file is uploaded under its own basename.
curl_setopt($ch, CURLOPT_URL, "sftp://USER:PASS@HOST/" . basename($file));
curl_setopt($ch, CURLOPT_UPLOAD, 1);
curl_setopt($ch, CURLOPT_PROTOCOLS, CURLPROTO_SFTP);
curl_setopt($ch, CURLOPT_INFILE, $fp);
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($file));
curl_exec($ch);
curl_close($ch);
fclose($fp);
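Since this runs unattended from cron, it may be worth logging failures instead of assuming success. A minimal sketch of the same upload with error checking added (USER, PASS, and HOST are placeholders as above):
$ch = curl_init();
$fp = fopen($file, "r");
curl_setopt($ch, CURLOPT_URL, "sftp://USER:PASS@HOST/" . basename($file));
curl_setopt($ch, CURLOPT_UPLOAD, 1);
curl_setopt($ch, CURLOPT_PROTOCOLS, CURLPROTO_SFTP);
curl_setopt($ch, CURLOPT_INFILE, $fp);
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($file));
if (curl_exec($ch) === false) {
    // curl_error() reports the failure: bad credentials, unreachable host, etc.
    error_log("SFTP upload of $file failed: " . curl_error($ch));
}
curl_close($ch);
fclose($fp);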

Check out lftp; it's a very powerful file-transfer tool. For example:
lftp -u "$username","$password" -e "cd /dest/path; put $FILE; bye" sftp://remote.example.com
Also check out mput and mirror in the lftp manual.
BTW, I don't think you need to install anything on the remote server if you use expect; anyway, I use lftp instead of expect in most similar situations at work.
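Since the file is created by PHP anyway, lftp can also be driven from the PHP script itself. A minimal sketch, assuming lftp is installed locally; the credentials, paths, and host below are placeholders:
$username = 'myuser';                 // placeholder credentials
$password = 'mypassword';
$file     = '/var/data/export.csv';   // placeholder for the file PHP creates
$script   = "cd /dest/path; put $file; bye";
$cmd = 'lftp -u ' . escapeshellarg("$username,$password")
     . ' -e ' . escapeshellarg($script)
     . ' sftp://remote.example.com';
exec($cmd, $output, $status);         // nothing is needed on the remote server
if ($status !== 0) {
    error_log("lftp upload failed with exit code $status");
}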

Related

PHP - Sending request causes test server to freeze [duplicate]

I have a relatively simple script like the following:
<?php
$url = "localhost:2222/test.html";
echo "*** URL ***\n";
echo $url . "\n";
echo "***********\n";
echo "** whoami *\n";
echo exec('whoami');
echo "* Output **\n";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
echo $output;
When I execute it on the command line, it works - I get the meager results from within test.html.
When I run this script by loading up the built-in PHP server and browsing to the script, it hangs. No output to the screen, nothing written to the logs.
I read that sometimes user permissions can get in the way, so I tried doing whoami to ensure that the user that ran the built-in PHP server is the same as the one who executed the script on the command line, which it is.
safe_mode is off, disable_functions is set to nothing. I can exec other commands successfully (like the whoami).
What else should I check for? Does the built-in PHP server count as some other user when it fulfills a request, perhaps?
The PHP built-in development web server is a very simple single threaded test server. It cannot handle two requests at once. You're trying to retrieve a file from itself in a separate request, so you're running into a deadlock. The first request is waiting for the second to complete, but the second request cannot be handled while the first is still running.
Since PHP 7.4 the environment variable PHP_CLI_SERVER_WORKERS allows concurrent requests by spawning multiple PHP workers on the same port on the built-in web server. It is considered experimental; see the docs.
Using it, a PHP script can send requests to the same built-in server that is serving it without deadlocking.
PHP_CLI_SERVER_WORKERS=10 php -S ...
Works with Laravel as well:
PHP_CLI_SERVER_WORKERS=10 php artisan serve
I think the problem is in your $url: it is missing the scheme. It should look like $url = "http://localhost:2222/test.html"; or $url = "http://localhost/test.html";. I think that will solve your problem. Best of luck.

Is it possible to convert a CURL command to PHP?

I have a web application from which I need to send a curl command over HTTP to an application running on Ubuntu.
The curl command is this:
curl -X POST --data-binary @/home/User/Pastec_FYP/Currency_Test_Images/Test_TenEuro.jpg http://127.0.0.1:4212/index/searcher
The command takes an image from the following path:
@/home/User/Pastec_FYP/Currency_Test_Images/Test_TenEuro.jpg
And it is searching through the index at
http://127.0.0.1:4212/index/searcher
I need to be able to translate that to PHP.
EDIT
This is what I have so far, but it's still saying image_not_decoded:
$ch = curl_init();
$post = array(
    "file" => "@" . realpath("/home/User/Pastec_FYP/Currency_Test_Images/Test_TenEuro.jpg")
);
curl_setopt_array(
    $ch, array(
        CURLOPT_URL => 'http://127.0.0.1:4212/index/searcher',
        curl_setopt($ch, CURLOPT_POSTFIELDS, $post),
        CURLOPT_RETURNTRANSFER => true
    ));
$output = curl_exec($ch);
echo $output;
curl_close($ch);
In the past, the real curl command on Ubuntu returned that error when the path to the image wasn't right, but I know the path is right because it works on the command line.
So is there anything I should change?
Additional Edit (To get it working)
I got it working the way I wanted, though probably in a more long-winded way than necessary. I wrote a CurlCommand.sh containing the curl command I wanted to execute, then called the .sh file from a batch script (CallCurlCommand.bat) that opens Ubuntu and feeds CurlCommand.sh into it, and finally used PHP to call the batch file (CallCurlCommand.bat).
CurlCommand.sh
curl -X POST --data-binary '@/home/User/Pastec_FYP/Currency_Test_Images/Test_FiveEuro.jpg' 'http://localhost:4212/index/searcher'
CallCurlCommand.bat
C:\Users\User\AppData\Local\Microsoft\WindowsApps\ubuntu.exe< C:\Users\User\AppData\Local\Packages\CanonicalGroupLimited.UbuntuonWindows_79rhkp1fndgsc\LocalState\rootfs\home\User\Pastec_FYP\CurlCommand.sh
PHP
exec('CallCurlCommand.bat');
I do still wish there was a straight conversion to PHP but this works.
It seems you have a bit of a special system: you are running your server on Windows, with Ubuntu as a subsystem, and both curl and the file you post live inside it.
If you want to run it directly from your PHP server, you could install curl on Windows. One way of doing that is to download a Win32 binary of curl from https://curl.haxx.se/download.html. After that you should be able to do something like:
$curlpath = 'C:\path\to\curl.exe';
$filepath = '/home/User/FYP_Pastec/Currency_Test/Test_FiveEuro01.jpg';
$url = 'http://localhost:4212/index/searcher';
exec("$curlpath -X POST --data-binary \"@$filepath\" \"$url\"");
which would then send it.
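For reference, a direct translation to PHP is possible without shelling out: curl's --data-binary @file option sends the raw file bytes as the POST body (not a multipart form field), which maps to passing the file contents to CURLOPT_POSTFIELDS. A minimal sketch, assuming the Pastec endpoint accepts the raw image body just as the command-line call does:
// PHP equivalent of: curl -X POST --data-binary @<image> http://127.0.0.1:4212/index/searcher
$file = '/home/User/Pastec_FYP/Currency_Test_Images/Test_TenEuro.jpg';
$ch = curl_init('http://127.0.0.1:4212/index/searcher');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents($file)); // raw bytes as the body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($ch);
curl_close($ch);
echo $output;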

Creating a cron job in openshift

I have created an application in OpenShift. I have a cron job that should run every minute, since it is placed in the minutely folder inside cron, but it never runs. It's a PHP script that hits a URL using curl:
<?php
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, "http://www.example.com");
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab URL and pass it to the browser
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
?>
I created this script and placed it inside the minutely folder in the .openshift/crons folder. Then I restarted my application, but it doesn't work. Any ideas?
You will need two files.
1.: THE CRON FILE
It is a script that will execute your PHP script. You need to place it in the minutely folder. Let's name it "crontest.sh", so the full path will be the following, where 000000000000000000000000 is your own OPENSHIFT_APP_UUID:
/var/lib/openshift/000000000000000000000000/app-root/runtime/repo/.openshift/cron/minutely/crontest.sh
The file contains only this line:
php $OPENSHIFT_REPO_DIR/php/crontest.php
2.: THE PHP FILE
It is your PHP script, which will be executed every minute by your cron script. You need to place it in the folder that you specified in your cron file. Let's name it "crontest.php", so the full path will be the following, where 000000000000000000000000 is your own OPENSHIFT_APP_UUID:
/var/lib/openshift/000000000000000000000000/app-root/runtime/repo/php/crontest.php
The file contains your PHP script; e.g. this one will make a file named "crontest.txt" show up next to your PHP script, containing one "1" for each minute that has passed:
<?php
file_put_contents(getenv('OPENSHIFT_REPO_DIR').'php/crontest.txt', '1', FILE_APPEND);
?>
To answer SanksR's specific question: the PHP file, "app-root/runtime/repo/php/crontest.php", will contain the code below, while "app-root/runtime/repo/.openshift/cron/minutely/crontest.sh" will contain this line: "php $OPENSHIFT_REPO_DIR/php/crontest.php".
<?php
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, "http://www.example.com");
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab URL and pass it to the browser
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
?>
You have to write a shell/bash script and place it in the minutely folder. This script has to run your PHP file. It could look like this:
myscript.sh:
#!/bin/bash
export PHP=/usr/local/zend/bin/php
$PHP my-curl-cron.php
(Don't forget to make it executable: chmod +x myscript.sh.)
I also recommend reading this article along with this tutorial.

How can I figure out why cURL is hanging and unresponsive?

I am trying to track down an issue with a cURL call in PHP. It works fine in our test environment, but not in our production environment. When I try to execute the cURL function, it just hangs and never ever responds. I have tried making a cURL connection from the command line and the same thing happens.
I'm wondering if cURL logs what is happening somewhere, because I can't figure out what is happening during the time the command is churning and churning. Does anyone know if there is a log that tracks what is happening there?
I think it is a connectivity issue, but our IT guy insists I should be able to access it without a problem. Any ideas? I'm running CentOS and PHP 5.1.
Update 1: Using verbose mode, I've gotten error 28, "Connect() timed out". I tried extending the timeout to 100 seconds and limiting the max-redirs to 5; no change. I tried pinging the box and also got a timeout. So I'm going to present this back to IT and see if they will look at it again. Thanks for all the help; hopefully I'll be back in half an hour with news that it was their problem.
Update 2: Turns out my box was resolving the server name to the external IP address. When IT gave me the internal IP address and I replaced it in the cURL call, everything worked great. Thanks for all the help, everybody.
In your PHP code, you can set the CURLOPT_VERBOSE option:
curl_setopt($curl, CURLOPT_VERBOSE, TRUE);
This then logs to STDERR, or to the file specified using CURLOPT_STDERR (which takes a file pointer):
curl_setopt($curl, CURLOPT_STDERR, $fp);
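Putting the two together, here is a minimal sketch that captures the verbose log to a file and fails fast instead of hanging (the URL and log path are placeholders):
$log = fopen('/tmp/curl_debug.log', 'w');        // placeholder log path
$curl = curl_init('http://example.com/');        // placeholder URL
curl_setopt($curl, CURLOPT_VERBOSE, TRUE);       // log connection/protocol details...
curl_setopt($curl, CURLOPT_STDERR, $log);        // ...to the file instead of STDERR
curl_setopt($curl, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10);  // give up on connect after 10 seconds
if (curl_exec($curl) === false) {
    echo 'cURL error: ' . curl_error($curl) . "\n";
}
curl_close($curl);
fclose($log);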
From the command line, you can use the following switches:
--verbose to report more info to the command line
--trace <file> or --trace-ascii <file> to trace to a file
You can use --trace-time to prepend time stamps to verbose/file outputs
You can also use curl_getinfo() to get information about your specific transfer.
http://in.php.net/manual/en/function.curl-getinfo.php
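For example, the timing fields can show where a transfer stalls. A small sketch (the URL is a placeholder):
$ch = curl_init('http://example.com/');  // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_exec($ch);
$info = curl_getinfo($ch);               // per-transfer statistics
printf("HTTP %d: name lookup %.3fs, connect %.3fs, total %.3fs\n",
    $info['http_code'], $info['namelookup_time'],
    $info['connect_time'], $info['total_time']);
curl_close($ch);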
Have you tried setting CURLOPT_MAXREDIRS? I've found that sometimes there will be an 'infinite' redirect loop for some websites that a normal browser user doesn't see.
If at all possible, try sudo-ing as the user PHP runs under (possibly the one Apache runs under).
The curl problem could have various causes that require user input, for example an untrusted certificate that is stored in the root user's trusted-certificate cache but not in the PHP user's. In that case, the command would be waiting for input that never comes.
Update: this applies only if you run curl externally using exec, so it may not apply here.
