I am using nginx with PHP and Vue. I'm struggling with a CORS error like the one in the title. I can make normal fetch requests; however, when I want to return an application error from the backend (for example, when someone wants to create an account and the login is already taken), I get the error from the title:
Fetch failed, and later in the Network tab in Dev Tools:
CORS: Preflight Missing Allow Origin Header
Here is my nginx.conf:
server {
    listen 80 default_server;
    listen [::]:80 default_server;
    server_name _;
    server_tokens off;
    root /app/;
    index index.php;
    add_header "Access-Control-Allow-Origin" *;
    add_header "Access-Control-Allow-Headers" *;
    location / {
        try_files $uri $uri/ /index.php$is_args$args;
    }
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_pass php:9000;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
This is how I try to send the error:
header("HTTP/1.0 460 Aplication Error");
header("Content-Type: application/json");
json_encode($error);
Am I going about this the proper way at all?
PS Update: I can't change this header when calling it via fetch from the Vue CLI dev server; when I call it directly, for example in the browser, it works correctly.
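Not part of the original post, but a minimal sketch of one direction to check against the config above: nginx's add_header only applies to certain success and redirect status codes unless the always parameter is given, so an error status coming back from PHP loses the CORS headers, and the browser's OPTIONS preflight also needs an explicit answer. Treat this as an illustration, not a verified fix.
    server {
        # ... listen / root / index as in the config above ...
        # "always" keeps the CORS headers on 4xx/5xx responses as well
        add_header "Access-Control-Allow-Origin" "*" always;
        add_header "Access-Control-Allow-Headers" "*" always;
        location / {
            # answer the preflight directly instead of handing it to PHP
            if ($request_method = OPTIONS) {
                return 204;
            }
            try_files $uri $uri/ /index.php$is_args$args;
        }
    }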
Related
When developing locally on this project, I'm having an issue where, when my PHP Laravel application throws a 500 error, I see a 502 Bad Gateway instead of an error page rendered by PHP. I do have the following env vars set:
APP_ENV=local
APP_DEBUG=true
APP_LOG_LEVEL=debug
In prod, I see Laravel resolve the 500.blade.php error page as expected, but locally nothing is shown.
For example, a bad method call can trigger this:
2022/09/04 22:19:45 [error] 867#867: *103 FastCGI sent in stderr: "PHP message: [2022-09-04 22:19:45] local.ERROR: Call to undefined method....
I haven't been able to identify any configuration setting that I can tweak within nginx that'll enable it to show errors rather than a Bad Gateway.
Any suggestions on what configuration might need to be changed here?
Nginx configuration:
server {
    listen 80; ## listen for ipv4; this line is default and implied
    #listen [::]:80 default ipv6only=on; ## listen for ipv6
    server_name app;
    access_log off;
    error_log /dev/stdout;
    root /var/www/html/public;
    index index.php;
    charset utf-8;
    # this causes issues with Docker
    sendfile off;
    location = /favicon.ico { log_not_found off; access_log off; }
    location = /robots.txt { access_log off; log_not_found off; }
    # look for local files on the container before sending the request to fpm
    location / {
        try_files $uri /index.php?$query_string;
    }
    # nothing local, let fpm handle it
    location ~ [^/]\.php(/|$) {
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_pass localhost:9000;
        fastcgi_index index.php;
        include fastcgi_params;
        fastcgi_param REQUEST_METHOD $request_method;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_param QUERY_STRING $query_string;
        fastcgi_param CONTENT_TYPE $content_type;
        fastcgi_param CONTENT_LENGTH $content_length;
        # Httpoxy exploit (https://httpoxy.org/) fix
        fastcgi_param HTTP_PROXY "";
        # allow larger POSTS for handling oauth tokens
        fastcgi_buffers 16 16k;
        fastcgi_buffer_size 32k;
    }
    # Deny .htaccess file access
    location ~ /\.ht {
        deny all;
    }
}
I created my nginx virtual host this way and it's working for me. I'm using port 8000, but you can use port 80. However, it's preferable not to map port 80 to any single project in local development, because we usually work on multiple projects, so you should assign a different port to each project.
server {
    listen 8000;
    root /var/www/html/<project-path>/public;
    index index.html index.htm index.php;
    location / {
        try_files $uri $uri/ /index.php$is_args$args;
    }
    # pass the PHP scripts to FastCGI server listening on /var/run/php/php7.4-fpm.sock
    location ~ \.php$ {
        try_files $uri /index.php =404;
        fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        include fastcgi_params;
    }
}
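After saving the server block you still need to enable it and reload nginx; the paths below assume a Debian/Ubuntu-style sites-available layout and are only an example:
    sudo ln -s /etc/nginx/sites-available/myproject.conf /etc/nginx/sites-enabled/
    sudo nginx -t
    sudo systemctl reload nginx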
I hope that will help you.
I believe that the behavior is caused by the php configuration, not by nginx.
Try setting
display_errors = on;
Unless otherwise instructed, nginx passes along the exact error code it receives.
There is one other possibility I can think of: perhaps the script is timing out on error for some reason, causing the 502.
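As a quick way to test that suggestion (a sketch only; the proper place for this is php.ini or the FPM pool config, and none of it belongs in production), the same switches can be forced from the top of public/index.php:
    <?php
    // Local-only: surface PHP errors in the response instead of hiding them.
    ini_set('display_errors', '1');
    ini_set('display_startup_errors', '1');
    error_reporting(E_ALL);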
You simply need to route the error codes to your index.php file so Laravel can deal with them.
This has been covered in multiple other questions on Stack Overflow. Here's one:
Allow Laravel to respond to 403 instead of nginx
Just use 500 (502) instead of 403.
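A minimal sketch of that idea against the config above (the status codes are chosen to match the question, not taken from the linked answer):
    # Let nginx intercept FastCGI error statuses and hand them back to the
    # front controller so Laravel can render its own error page.
    location ~ [^/]\.php(/|$) {
        fastcgi_intercept_errors on;
        error_page 500 502 = /index.php;
        # ... existing fastcgi_pass / fastcgi_param lines from the question ...
    }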
What is the actual header response you are getting from your request?
I'd suggest you do some tests and try to isolate whether this is a problem with the nginx config, the PHP config, or your Laravel environment and error handling.
You can create a test.php file in your public folder, e.g.:
    <?php
    http_response_code(500);
    TEST; // intentionally undefined constant, to force a PHP error
Now, if you open site/test.php, do you get a 500 error or a 502 error? And does the error display something like "undefined constant TEST"?
Or edit public/index.php and just break the code, e.g. by adding a stray . somewhere; do you also get a 502 response in your Laravel app?
502 errors usually happen when nginx acts as a proxy and does not get a valid response from the upstream; you can enable debug mode to see what happens with your request.
Also post your nginx.conf.
And maybe try adding fastcgi_intercept_errors off; in your PHP or main location block.
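For placement, a sketch assuming the PHP location block from the question (note this directive is already the default, so it only helps if something else switched it on):
    location ~ [^/]\.php(/|$) {
        fastcgi_intercept_errors off;  # hand PHP's own error pages straight back to the client
        fastcgi_pass localhost:9000;
        include fastcgi_params;
        # ... remaining fastcgi_param lines as in the question ...
    }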
EDIT
Another possible cause is that the upstream response is too big, larger than your nginx buffer size.
You can try increasing the buffer sizes.
Add this inside the http block in whatever-environment-you-have/nginx/nginx.conf:
proxy_buffer_size 128k;
proxy_buffers 4 256k;
proxy_busy_buffers_size 256k;
Then add the following in your ~ \.php$ location block:
fastcgi_buffer_size 128k;
fastcgi_buffers 4 256k;
fastcgi_busy_buffers_size 256k;
If it still doesn't work, try increasing them all to 4096k.
I am building a docker collection that will eventually have the following containers (and more)
web (nginx)
proxy (reverse proxy nginx)
php-fpm
The web container will allow several frameworks to be added via folders and subfolders:
./folder1/folder2/codeigniter
./folder3/folder4/laravel
the folders for codeigniter and laravel are symlinks to a public folder
the index.php page works and shows the default routes without issue.
but when I try to get to a different page such as
/folder3/folder4/laravel/index.php/path/somewhere
I get a 404 error message.
I want to be able to do this without mapping an nginx location directive for EVERY FOLDER...
This is what my conf file looks like:
server {
    listen 8100 default_server;
    listen [::]:8100 default_server ipv6only=on;
    server_name localhost;
    root /var/www/public;
    index index.php index.html index.htm;
    location / {
        add_header 'Access-Control-Allow-Origin' '*';
        add_header "Access-Control-Allow-Headers" "Authorization, Origin, X-Requested-With, Content-Type, Accept";
        add_header "Access-Control-Allow-Methods" "GET, POST, OPTIONS, HEAD";
        if ($request_method = 'OPTIONS') {
            add_header 'Access-Control-Allow-Origin' '*';
            #
            # Tell client that this pre-flight info is valid for 20 days
            #
            add_header 'Access-Control-Max-Age' 1728000;
            return 204;
        }
        try_files $uri $uri/ =404;
    }
    location ~ \.php$ {
        try_files $uri $uri/ index.php?$args;
        fastcgi_pass php-fpm-56:9000;
        fastcgi_index index.php;
        fastcgi_buffers 16 16k;
        fastcgi_buffer_size 32k;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        #fixes timeouts
        fastcgi_read_timeout 600;
        include fastcgi_params;
    }
    location ~ /\.ht {
        deny all;
    }
}
What am I doing wrong here?
Keep in mind, I want to use "/folder/index.php/path/path".
I DO NOT WANT "/folder/path/path", nor do I want to create a location entry for every new folder I create.
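One thing worth checking, sketched below against the question's config (an illustration, not a confirmed fix): a location ~ \.php$ regex only matches URIs that end in .php, so /folder3/folder4/laravel/index.php/path/somewhere never reaches the FastCGI block. A PATH_INFO-aware pattern is the usual way to serve index.php/... URLs without a location per folder.
    # Sketch: match ".php" followed by "/" as well as at the end of the URI,
    # and pass the trailing part to the framework router as PATH_INFO.
    location ~ [^/]\.php(/|$) {
        fastcgi_split_path_info ^(.+\.php)(/.*)$;
        fastcgi_param PATH_INFO $fastcgi_path_info;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass php-fpm-56:9000;
        fastcgi_index index.php;
        include fastcgi_params;
    }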
I tried to use CodeIgniter.
As an example, I will access
http://192.168.100.100/CI/CI_1/CI_1/index.php/main/index
After I tried it this way, the results are as in what I attached:
location /CI/CI_1/CI_1 {
    autoindex on;
    try_files $uri $uri/ /CI/CI_1/CI_1/index.php?/$request_uri;
}
Thanks
server {
    listen 80;
    server_name xx.cn;
    index index.php index.html index.htm;
    root /data/www_deploy/xx/backend/web;
    location ~* /\. {
        deny all;
    }
    location / {
        try_files $uri /index.php?$args;
    }
    location ~ .*\.(php|php5)?$
    {
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_index index.php;
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        include fastcgi_params;
    }
    client_max_body_size 512m;
}
The nginx error log shows client closed connection while waiting for request, client: x.x.x.x, server: 0.0.0.0:80 when I visit the domain via a browser.
The error shown is:
Connecting to xx.cn (xx.cn)|x.x.x.x|:80... connected.
HTTP request sent, awaiting response... 500 Internal Server Error
2019-09-13 19:48:18 ERROR 500: Internal Server Error.
when running wget xx.cn on the server.
I wonder how to deal with it?
This often happens if the index.php (or any other script) you are calling does not exit correctly, for example because it throws an uncaught exception.
Have a look at the error.log.
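To make that concrete (a sketch only; the wrapper file name and paths are made up for illustration), you could temporarily wrap the front controller and log whatever it throws:
    <?php
    // debug.php -- temporary wrapper next to index.php, for local debugging only.
    try {
        require __DIR__ . '/index.php';
    } catch (Throwable $e) {
        // Write the real reason for the 500 to PHP's error log.
        error_log($e->getMessage() . "\n" . $e->getTraceAsString());
        http_response_code(500);
        echo 'Internal error, see the error log';
    }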
I have an SPA project that runs the frontend and backend on different ports.
It runs successfully when I use php artisan serve.
But now I am trying to run the Laravel backend on nginx, without php artisan serve.
# backend
server {
    listen 3000 default_server;
    listen [::]:3000 default_server;
    root /usr/nextJs/nextTestBackend/public;
    index index.php index.html index.htm index.nginx-debian.html;
    server_name _;
    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }
    location ~ \.php$ {
        try_files $uri /index.php =404;
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_index index.php;
        fastcgi_buffers 16 16k;
        fastcgi_buffer_size 32k;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        #fixes timeouts
        fastcgi_read_timeout 600;
        include fastcgi_params;
    }
}
After nginx starts, 127.0.0.1:3000/api/GET/users can't access the Laravel backend.
It shows this error:
Failed to load http://127.0.0.1:3000/api/GET/test: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://127.0.0.1' is therefore not allowed access. The response had HTTP status code 500.
How to fix it?
Even though I put this inside the location {} block, I still get the same error.
add_header 'Access-Control-Allow-Origin' '*';
add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range';
add_header 'Access-Control-Expose-Headers' 'Content-Length,Content-Range';
By the way, the API in Laravel also has CORS middleware:
Route::middleware('cors')->get('/GET/users', 'UserController@read');
Situation:
I deployed my PHP project as a web server on machine A, using nginx and FastCGI, and the config file is as follows:
server {
    listen 80;
    server_name alpha.kimi.com;
    index index.html index.htm index.php;
    root /alidata/www/;
    location ~ .*\.(php|php5)?$
    {
        #fastcgi_pass unix:/tmp/php-cgi.sock;
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_index index.php;
        include fastcgi.conf;
    }
    location / {
        root /www/admin/;
        index index.php;
        if (!-f $request_filename){
            rewrite ^/(.+)$ /index.php?$1& last;
        }
    }
    location ~ .*\.(gif|jpg|jpeg|png|bmp|swf)$
    {
        expires 30d;
    }
    location ~ .*\.(js|css)?$
    {
        expires 1h;
    }
    access_log /data/log/nginx/access/output.log;
    error_log /data/log/nginx/access/error.log;
}
So when I make a GET request from my local machine, like:
curl http://alpha.kimi.com/app/redirect/taskpush?build=10&gcdata=1
there is JSON returned:
{"res":200,"msg":"success","extra":[]}
However, when I made the same request on machine A, it just hung there and returned nothing. I also tried:
curl http://localhost/app/redirect/taskpush?build=10&gcdata=1
and
curl http://localhost:9000/app/redirect/taskpush?build=10&gcdata=1
Neither worked. I don't know what the problem is.
You need to configure nginx to listen via localhost or 127.0.0.1 for it to work.
See http://nginx.org/en/docs/http/ngx_http_core_module.html#listen for full instructions.
You can add multiple listen statements, e.g.:
listen localhost;
listen 127.0.0.1;
Also see this for more detail: https://serverfault.com/questions/655067/is-it-possible-to-make-nginx-listen-to-different-ports