I want to test my site's load (the site is written in PHP) with Apache Bench.
I have a local server (XAMPP); OS: Windows.
In the directory apache/bench there is a file ab.exe; does this mean ApacheBench is installed on my local server?
I have a local site at localhost/my_test, and I want to simulate 1000 concurrent requests to this site. In CMD I run this command:
ab -c 1000 localhost/my_test
The answer from CMD is: 'ab' is not recognized as an internal or external command, operable program or batch file.
Please tell me, where am I going wrong?
AB needs a complete URL:
Usage: ab [options] [http://]hostname[:port]/path
So, in your case the URL should look like:
localhost/my_test/
It needs the path, which in this case is simply the trailing /.
Hope this helps
Paul.
ab -c 1000 localhost/my_test
The answer from CMD is: 'ab' is not recognized as an internal or external command, operable program or batch file.
It means that ab.exe is not in your PATH.
When you start CMD, first navigate into the apache/bench directory, then run the command from that folder.
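For example (assuming XAMPP is installed at C:\xampp; adjust the path to wherever your ab.exe actually lives):
cd C:\xampp\apache\bench
ab -n 1000 -c 1000 http://localhost/my_test/
Note that ab refuses a concurrency level (-c) greater than the total number of requests (-n), which defaults to 1, so you need to pass -n as well.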
I think you should also include the port your server is running on;
For example, I use http://127.0.0.1:8080 (port is 8080 as I set in my XAMPP config)
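With the port included, the full command might look like this (assuming the same 8080 port; use whatever port your XAMPP Apache actually listens on):
ab -n 1000 -c 1000 http://127.0.0.1:8080/my_test/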
When I type cd project in my terminal, I get this line:
MacBook-Pro:project work$
I run my local server like this
MacBook-Pro:project work$ php -S 127.0.0.1:8000 -t public
After that I see this:
PHP 7.2.6 Development Server started at Tue May 29 10:45:40 2018
Listening on http://127.0.0.1:8000
Document root is /Users/work/project/public
Press Ctrl-C to quit.
But then I want to get back to my project folder (MacBook-Pro:project work$), but when I type cd or cd project nothing happens. Only when I press Ctrl-C does the line MacBook-Pro:project work$ appear again. Do I really have to quit my server to go into my project folder?
You should be able to run
php -S 127.0.0.1:8000 -t public &
in your terminal. The & runs the task in the background. Any output the command produces will still be written to your terminal, though.
To stop the command (Ctrl-C in this case) you first need to fg in your terminal to get it to the foreground, then Ctrl-C to quit.
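For example (redirecting the server's output to a log file is my own addition, so it doesn't clutter the prompt):
php -S 127.0.0.1:8000 -t public > server.log 2>&1 &
jobs        # lists background jobs; the server should show up as [1]
fg %1       # bring it back to the foreground, then Ctrl-C to stop it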
If you're running the server in the foreground, then it is responsible for handling all input. You don't have direct access to the shell.
If you don't want that, then open another terminal, or run the server in the background, or use a multiplexer like screen or tmux.
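A minimal tmux session would look like this (the session name phpserver is just an illustration):
tmux new -s phpserver
php -S 127.0.0.1:8000 -t public
Detach with Ctrl-B then D; the server keeps running, and you can reattach later with tmux attach -t phpserver.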
I'm trying to execute a command (rsync) from PHP, to copy folders and files from a remote server to a local folder.
This is the code I wrote in PHP. The command WORKS over SSH (in a local terminal, and remotely with putty.exe), copying the folders and files correctly.
But it doesn't work from PHP. What can I do? Do you know a better (more secure/optimal) way to do this?
exec("echo superuserpassword | sudo -S sshpass -p 'sshremoteserverpassword' rsync -rvogp --chmod=ugo=rwX --chown=ftpuser:ftpuser -e ssh remoteserveruser#remoteserver.com:/path/files/folder /opt/lampp/htdocs/dowloadedfiles/", $output, $exit_code);
EDIT:
I read this guide to create a link between my server and my local machine.
Now I can log in to my remote machine over ssh without a password.
I changed my command:
rsync -crahvP --chmod=ugo=rwX --chown=ftpuser:ftpuser remote.com:/path/to/remote/files /path/to/local/files/
This command also works in the terminal, but when I run it through PHP's exec, it fails again, this time with a different error: 127.
As MarcoS suggested in his answer, I checked the error log.
The messages are these:
ssh: relocation error: ssh: symbol EVP_des_cbc, version OPENSSL_1.0.0 not defined in file libcrypto.so.1.0.0 with link time reference
rsync: connection unexpectedly closed (0 bytes received so far) [Receiver]
rsync error: remote command not found (code 127) at io.c(226) [Receiver=3.1.1]
Well, after a lot of trial and error, I ended up cutting the problem off at the root:
I read this guide (like the last one, but better explained), moved the PHP file that executes the rsync command to the remote server (where the files are located), and ran the rsync.php file there; it worked perfectly.
Run the following on the machine that holds the files (both the files to copy and rsync.php):
1.- ssh-keygen generates keys
ssh-keygen
Enter an empty passphrase, and repeat the empty passphrase when prompted.
2.- ssh-copy-id copies public key to remote host
ssh-copy-id -i ~/.ssh/id_rsa.pub remoteserveraddressip(xxx.xxx.xxx.xxx)
The rsync.php file:
exec("rsync -crahvP /path/in/local/files/foldertocopy remoteuser#remoteserveraddress:/path/in/remote/destinationfolder/", $output, $exit_code);
After all of that, browse to the rsync.php file and everything should work. At least it worked for me...
I suppose you are experiencing identity problems... :-)
On the CLI, you are running the command as the logged-in user.
From PHP, you are running the command as the user your web server runs as (for example, Apache often runs as the www-data or apache user...).
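A quick way to check this is a one-liner you can drop into any page served by your web server (a sketch; the variable name is arbitrary):
<?php exec('whoami', $out); echo $out[0]; ?>
If it prints www-data (or similar) instead of your login user, that is the identity mismatch.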
One possible solution I see (if the above is the real cause of the problem) is to add your user to the web server's group...
I'd also suggest you check the web server's error logs, to be sure about the real cause of the problem... :-)
I successfully installed gammu on Ubuntu 11, and I can send a text message from the command line.
echo "TEXTMESSAGE" | gammu sendsms TEXT mobilenumber
My problem is that when I use the exec function in my PHP script, I always get the following errors:
Warning: No configuration file found!
Warning: No configuration read, using builtin defaults!
Error opening device, it doesn't exist.
Thanks for the help
You are missing the .gammurc and the defaults fail to detect your device.
Try running gammu-detect. It should say something along the lines of
[gammu]
device = /dev/ttyUSB0
name = Phone on USB serial port HUAWEI_Technology HUAWEI_Mobile
connection = at
If that does not work, run gammu-config and manually set up port and connection.
I just resolved a similar problem. In my case gammu was executed under the nagios user, so it was not able to find the configuration file until I placed it in /etc/gammurc.
According to the gammu documentation, on Linux, macOS, BSD and other Unix-like systems the config file is searched for in the following order:
$XDG_CONFIG_HOME/gammu/config
~/.config/gammu/config
~/.gammurc
/etc/gammurc
My file was in /home/user/.gammurc, but when gammu was executed under the nagios user, "~" pointed to a different directory, so gammu was not able to find it.
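Alternatively, you can point gammu at the config file explicitly with its -c option, which sidesteps the home-directory lookup entirely (same sendsms command as in the question):
echo "TEXTMESSAGE" | gammu -c /etc/gammurc sendsms TEXT mobilenumber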
Now permissions:
To give your user access to /dev/ttyUSB0 (use your own path), add the nagios user (in your case www-data, or whatever it is) to the dialout group this way:
sudo usermod -a -G dialout nagios
Then set the SUID bit on gammu to allow nagios (www-data in your case) to execute it on behalf of root:
sudo chmod 4755 /usr/bin/gammu
Try executing gammu as root (you could use the su command).
Hope this is useful.
You can change the path of .gammurc by doing this:
Copy the .gammurc file from your home directory and paste it into /etc:
cp .gammurc /etc/gammurc
Don't forget to remove the dot.
I use it on a Raspberry Pi; the gammu directory may differ in your environment.
I have a php script. I am using nginx and spawn-fcgi.
spawn-fcgi -n -s /tmp/nginx9010.socket -u www-data -g www-data -f /usr/bin/php5-cgi -C 6
How can I test from the command line that spawn-fcgi is working with the script?
e.g. I have a script in /home/ubuntu/test.php
I am having issues with nginx executing a PHP script: it prompts for a download.
I have #!/usr/bin/php in the file and did a chmod a+x as well.
Thanks
For testing a FastCGI backend you could try to create a CGI environment and use cgi-fcgi to connect to the backend.
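Something like this should get a response straight from the socket in the question (cgi-fcgi ships with the libfcgi package; the variable values are assumptions based on the paths above):
SCRIPT_FILENAME=/home/ubuntu/test.php \
REQUEST_METHOD=GET \
cgi-fcgi -bind -connect /tmp/nginx9010.socket
If the backend is healthy, you should get the CGI headers and the script's output back.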
You could attach with strace to see what the backend does (for example whether it even receives a request from the web server); attach with -ff to the master process to see syscalls on all workers
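For example (<master_pid> is the spawn-fcgi master process; -o writes one trace file per worker):
strace -ff -o /tmp/fcgi-trace -p <master_pid>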
php5-cgi in FastCGI mode doesn't need a shebang line or +x on the files - it doesn't use the kernel to execute them, it just loads them as plain files
Firefox (and probably other browsers too) often cache the mime type, so you will see a download prompt even after you fixed the problem. Use curl for testing!
nginx won't serve the file itself; it passes it to PHP, and nginx only serves static files directly. So if it is downloading the PHP file, you might need to check that you are sending PHP files to the correct place. Are you using an IP and port in the php location block in the config file?
Only a guess off the top of my head whilst on the train home.
FWIW, problems where nginx offers the file for download are due to nginx serving the file itself without sending it to the FastCGI backend, often because of try_files or a wrong location {} block matching the URI.
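For reference, a minimal location block that hands .php requests to the spawn-fcgi socket from the question might look like this (a sketch, not a drop-in config):
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/tmp/nginx9010.socket;
}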
Can anybody help me? I'm using the Sphinx search engine, but I have many databases. I control them dynamically (by changing the Sphinx config file), but after I add an index to the config file, Sphinx needs to be restarted. I have created a bash script for doing this (stop Sphinx, start it, indexer --rotate --all), and when I run the script from the terminal it works, but when I run it from the Apache server it does not. How can I do this without changing the owner of the Apache server to root (which would decrease the security of my server)?
How can I do this without changing the owner of the Apache server to root
How about sudo? Put something like this in your sudoers...
apache ALL= NOPASSWD: /path/to/script command
The catch is to make sure that the script cannot be exploited, as it is running as root: make it read-only + executable for apache, and ensure that any commands/switches you send to the script are sanity-checked.
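On the PHP side the call then stays simple (the path and command are placeholders matching the sudoers line above):
exec('sudo /path/to/script command', $output, $ret);
Because of the NOPASSWD entry, sudo won't prompt for a password, and the script runs as root.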
I hope that helps,
Kind Regards,
Nick