PHP REST API used for Dialogflow's detectIntent() function gives an internal server error.
We are connecting to the Google Dialogflow service from an AWS EC2 instance, but we are receiving an internal server error (500).
I have tried many solutions. I initially thought it was an AWS EC2 instance problem, because the Dialogflow code works fine on my local Windows server but not on the AWS server.
After some research, I found that the problem was the bcmath module, which was not installed in my EC2 instance's PHP setup.
Following are the steps to install the bcmath module for PHP:
1. First check whether you already have bcmath, using the phpinfo(); function.
2. Use the following command to install bcmath for PHP on your AWS instance:
$ sudo yum install php55-bcmath
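For reference, here is a minimal sketch of the detectIntent() call that fails when bcmath is missing. It assumes the official google/cloud-dialogflow Composer package; the project ID, key file path, and input text are placeholders.

<?php
// Minimal Dialogflow detectIntent() sketch (V2 client).
require 'vendor/autoload.php';

use Google\Cloud\Dialogflow\V2\SessionsClient;
use Google\Cloud\Dialogflow\V2\TextInput;
use Google\Cloud\Dialogflow\V2\QueryInput;

$sessions = new SessionsClient(['credentials' => '/path/to/service-account.json']);
$session  = $sessions->sessionName('my-project-id', uniqid());

// Wrap the user's utterance in a QueryInput.
$text = new TextInput();
$text->setText('Hello');
$text->setLanguageCode('en-US');
$query = new QueryInput();
$query->setText($text);

$response = $sessions->detectIntent($session, $query);
echo $response->getQueryResult()->getFulfillmentText();
$sessions->close();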
I have two APIs, A and B. A is my core app, and they communicate at the DB level.
Both run on Ubuntu 22.04, and the firewall is disabled.
When I run php artisan serve for API A, I can access it from the web browser and it successfully displays the Laravel welcome page at the URL and port 127.0.0.1:5000, and the same works with Postman.
But when I run the same command for API B after stopping the first server, I get the following:
{"error":"The specified URL cannot be found","code":404}
And in Postman the request always hangs when I try to log into the API.
I really don't know what is going on; I was expecting a 500 error, not a 404.
Both APIs work on the live server; the issue only happens on my local machine.
Note that API A is compatible with both PHP 7.4 and PHP 8.1, while API B uses PHP 8 syntax, so I am running both with PHP 8.1 and setting the PHP version using Laravel Valet with nginx.
Also note that when I try to log in with Postman after running
php artisan serve --port=5000 // API A
Then
php artisan server --port=5001 // API B
I can log in with API A, but the server hangs with API B.
When I stop it and try to run the API B command again, it says the port is already in use.
I tried switching ports and am still getting the same 404 error on API B; I expect it to display the Laravel welcome page as API A does.
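As an aside on the "port already in use" symptom: when php artisan serve is stopped uncleanly, the old process can keep holding the port. A shell sketch to find and free it, assuming standard Ubuntu tools and the port number from the example above:

# Show the process holding port 5001, then stop it.
sudo lsof -i :5001
kill $(sudo lsof -t -i :5001)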
While developing a PHP App Engine standard app, I wanted a local development datastore, so I ran: gcloud components install cloud-datastore-emulator
I also installed the Google Cloud Datastore library using
composer require google/cloud-datastore
After that I tried to start the emulator with
gcloud beta emulators datastore start --data-dir="C:\Users\Hellen\Desktop\New folder\myDstore"
But the command failed with the following output:
WARNING: Reusing existing data in [C:\Users\Hellen\Desktop\New folder\myDstore].
Executing: cmd /c C:\Users\Hellen\AppData\Local\Google\Cloud SDK\google-cloud-sdk\platform\cloud-datastore-emulator\cloud_datastore_emulator.cmd start --host=localhost --port=8081 --store_on_disk=True --consistency=0.9 --allow_remote_shutdown C:\Users\Hellen\Desktop\New folder\myDstore
[datastore] 'C:\Users\Hellen\AppData\Local\Google\Cloud' is not recognized as an internal or external command,
[datastore] operable program or batch file.
Can someone tell me what the problem is? I really don't know what to try next.
The problem was that I had both the standalone App Engine SDK and the Google Cloud SDK installed, and they have similar commands. So I uninstalled both and installed only the Cloud SDK. Thanks so much for the hint.
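Once the emulator is running, the PHP client has to be pointed at it. A minimal sketch, assuming the google/cloud-datastore package installed above; the project ID and entity are placeholders, and localhost:8081 matches the emulator's default host and port:

<?php
// Point the Datastore client at the local emulator instead of the
// live service; the client honors DATASTORE_EMULATOR_HOST.
require 'vendor/autoload.php';

use Google\Cloud\Datastore\DatastoreClient;

putenv('DATASTORE_EMULATOR_HOST=localhost:8081');

$datastore = new DatastoreClient(['projectId' => 'my-local-project']);

// Write one sample entity to verify the emulator connection.
$key  = $datastore->key('Task', 'sample-task');
$task = $datastore->entity($key, ['description' => 'Try the emulator']);
$datastore->upsert($task);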
I've created an application in PHP that sends texts out to people using the Twilio API. It works perfectly in XAMPP, but the PHP code doesn't run in Azure. Every time I call it, I get an error message saying "Failed to load resource: the server responded with a status of 500 (Internal Server Error)".
Is there a way I can solve this problem without having to create my own virtual server?
Usually, a 500 response means there are errors in the server-side scripts. You can set display_errors=On in the PHP runtime on Azure Web Apps for easier troubleshooting. Refer to https://azure.microsoft.com/en-in/documentation/articles/web-sites-php-configure/#how-to-change-the-built-in-php-configurations for details.
You should also check whether your application on Azure Web Apps has successfully installed the Twilio lib. You can leverage Composer to declare the SDK in composer.json; then, when you deploy your application to Azure via Git, the Azure service will install the dependencies listed in composer.json automatically during the deployment task.
You can then leverage require 'vendor/autoload.php' to load all the dependencies.
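As a minimal sketch of that setup, assuming the current twilio/sdk Composer package (rather than the older Services_Twilio library this thread's errors mention); the account SID, auth token, and phone numbers are placeholders:

<?php
// Composer autoload plus one SMS send via twilio/sdk (5.x or later).
require 'vendor/autoload.php';

use Twilio\Rest\Client;

$client = new Client('ACCOUNT_SID', 'AUTH_TOKEN');
$client->messages->create(
    '+15005550006',  // destination number (placeholder)
    ['from' => '+15005550001', 'body' => 'Hello from Azure Web Apps']
);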
The first time I tested it, I got the following error:
Fatal error: Uncaught exception Services_Twilio_TinyHttpException with message SSL certificate problem...
So it may be the issue on your side too; you can add the certificate to PHP on Azure Web Apps. Please refer to https://blogs.msdn.microsoft.com/azureossds/2015/06/12/verify-peer-certificate-from-php-curl-for-azure-apps/ for detailed steps.
Otherwise, you can simply edit TinyHttp.php in the Twilio lib:
add CURLOPT_SSL_VERIFYPEER => FALSE to the $opts array.
Refer to Twilio PHP - SSL certificate: self signed certificate in certificate chain for the same issue.
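For orientation, the TinyHttp.php change described above would look roughly like this; the surrounding options are elided, and note that disabling peer verification weakens TLS security, so it is a workaround rather than a fix:

// In TinyHttp.php, inside the $opts array handed to cURL:
$opts = array(
    // ... existing cURL options unchanged ...
    CURLOPT_SSL_VERIFYPEER => FALSE,  // workaround: skips certificate validation
);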
There are several possibilities:
Azure runs PHP on an IIS server, so you need to convert your .htaccess file to web.config. This is simple: go to the website in IIS and import your .htaccess file; IIS will convert your .htaccess to web.config.
Permissions on the folder you are running from.
PHP versions: check your PHP version in XAMPP and compare it to the PHP version on Azure. Your code can be compatible with your XAMPP PHP but not with Azure's PHP.
PHP extensions: it is possible that your XAMPP has PHP extensions enabled that are not enabled on Azure, like fileinfo, etc. (a quick way to compare is sketched below).
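A minimal sketch for the last two points, using only built-in PHP functions: run it on both XAMPP and Azure and diff the output.

<?php
// Print the PHP version and every loaded extension so the two
// environments can be compared side by side.
echo 'PHP ' . PHP_VERSION . PHP_EOL;
foreach (get_loaded_extensions() as $ext) {
    echo $ext . PHP_EOL;
}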
I'm a newbie with AWS and have created an EC2 instance on an Ubuntu server with PHP, MySQL, and Apache installed, where I have hosted a couple of HTML files along with other PHP scripts. The website is working fine. I have created a webservice which has to be consumed by different clients. The purpose of this webservice is to insert some values into a remote DB table.
a) www.abc.com/client/add.php returns success, and I could see the values getting updated in the DB table.
The above webservice works fine on a test server, but when I uploaded the same set of files to AWS EC2, the webservice gives me an error:
b) www.abc.com/client/add.php returns failure. I have checked the DB configuration file; the connection strings for the remote host are correct.
I am also facing a similar issue in integrating an SMS API to push SMS to consumers' phones. On the test server my code works fine and SMS are pushed, but the same piece of code is not working on AWS EC2.
I suspect it is related to the rules in the EC2 security group, because of which it cannot connect to the remote MySQL host and to the SMS gateway.
Can any of you help me in assigning the proper rules?
Currently I have assigned the below Inbound rules:
Type HTTP, port 80, for all
Type MySQL, port 3306, for all
All traffic, for all
Type SSH, port 22, for all
Do I have to assign any outbound rules as well?
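One quick way to test the security-group suspicion from the instance itself, assuming the standard Ubuntu netcat; the hostname is a placeholder:

# From the EC2 instance: can we reach the remote MySQL host at all?
nc -zv db.example.com 3306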
Firstly, thanks to @bluto for pointing me to the log files, where I was able to see the errors and fix them.
My AWS inbound and outbound rules were appropriate.
From the log files, I identified: mysql_connect(): Access denied for user.
To fix this, I had to explicitly add the host's IP address in the allowed-hosts section of the DB server.
For the second issue, wherein my SMS API was not working, I identified this error in the log file:
PHP Fatal error: Call to undefined method mysqli_stmt::get_result().
Looking into this, I learned that get_result() requires mysqlnd, which was missing on my AWS Ubuntu server; without mysqlnd, you have to use bind_result() and fetch() instead.
http://php.net/manual/en/mysqli-stmt.get-result.php
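To illustrate the difference, a minimal sketch; the connection details, table, and column are hypothetical:

<?php
// get_result() requires mysqlnd; bind_result()/fetch() work without it.
$mysqli = new mysqli('db-host', 'webuser', 'secret', 'appdb');

$id   = 42;
$stmt = $mysqli->prepare('SELECT name FROM clients WHERE id = ?');
$stmt->bind_param('i', $id);
$stmt->execute();

// With mysqlnd installed:
$row = $stmt->get_result()->fetch_assoc();

// Portable alternative without mysqlnd:
// $stmt->bind_result($name);
// $stmt->fetch();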
I had to install mysqlnd with the following command:
apt-get install php5-mysqlnd
Please remember that installing mysqlnd can break your app code. For it to work, I added the following line at the end of the php.ini file. The path of the php.ini file can be found in the output of phpinfo():
extension=mysqlnd.so
After the successful installation, I saw that cURL also needed to be installed on the server, as I was getting an error. I installed it with the command below. Ensure you restart the Apache server after the installation:
sudo apt-get install php5-curl
References:
Call to undefined method mysqli_stmt::get_result
https://askubuntu.com/questions/386887/install-curl-ubuntu-12-04
I am using the PHP OAuth extension to connect to the Netflix API and pull data through a GET request.
Running the request on my local machine succeeds and returns the correct data.
However, upon pushing the request up to our live server, I immediately ran into issues with the PHP OAuth extension failing.
I am using version 1.2.2 of the extension with curl as my request method (libcurl is also installed). My PHP code to call the OAuth fetch is:
$oauth = new OAuth($this->key, $this->secret);  // consumer key and secret
$resource = $oauth->fetch($url, array('v' => '2.0', 'include_tms' => 'true'));  // signed GET request
$this->_writeFile($oauth->getLastResponse());  // persist the response body
Again, this exact code works 100% on my local dev machine (Mac running OS X Lion, php-oauth 1.2.2 installed through MacPorts, PHP 5.3.8). However, when running on our server (Debian Linux, PHP 5.3.3, php-oauth installed through apt-get), I receive the following error:
PHP Fatal error: Uncaught exception 'OAuthException' with message 'making the request failed (Failure when receiving data from the peer)' in /var/www/familymedia/application/Services/Netflix.php:23
I'm really not sure what to make of this error.
Some further info:
We currently make successful cURL requests from our server to other API services on an hourly basis and get good data back without issue. However, none of those services authenticate through OAuth, so none of them required the OAuth extension; I made those requests straight with PHP cURL.
I have a feeling that there is some sort of setup issue with the php-oauth extension on our live server that MacPorts "magically" solved for me during the install on my local machine, but I cannot find any install or configuration advice/instructions that have helped.
Hoping someone has an answer, or a place to point me for further discovery toward one.
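For anyone debugging a similar "Failure when receiving data from the peer" error, pecl/oauth can expose the raw exchange. A debugging sketch mirroring the call above; $consumerKey, $consumerSecret, and $url stand in for the values used earlier:

<?php
// Force the cURL engine and dump the raw request/response on failure,
// to separate network blocks from authentication problems.
$oauth = new OAuth($consumerKey, $consumerSecret);
$oauth->setRequestEngine(OAUTH_REQENGINE_CURL);
$oauth->enableDebug();  // populates $oauth->debugInfo

try {
    $oauth->fetch($url, array('v' => '2.0', 'include_tms' => 'true'));
} catch (OAuthException $e) {
    error_log($e->getMessage());
    error_log(print_r($oauth->debugInfo, true));  // raw headers and body
}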
Thought I'd update this question with the answer found: the company was specifically blocking inbound communication from Netflix. IT solved the issue. Not a very exciting answer, but that's what it was.