How to run Open Journal Systems (OJS) on Nginx - php

I am having trouble finding the right Nginx configuration for my server.
The PHP app deployed on it is OJS, a journal management and publishing system originally developed to run on Apache.
Although OJS can run on Nginx without further server-specific configuration, a minor change to the main OJS config (disable_path_info = On) is required because PATH_INFO does not seem to be supported by Nginx out of the box. However, that produces non-pretty URLs, which in turn cause some OJS features/plugins to work outside their specification, or not to work at all.
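For reference, the switch lives in the OJS config.inc.php (under the [general] section, if I recall correctly), and it roughly changes the URL style like this ("myjournal" is just a placeholder journal path):
; config.inc.php
disable_path_info = On   ; query-string URLs, e.g. /index.php?journal=myjournal&page=about
disable_path_info = Off  ; PATH_INFO "pretty" URLs, e.g. /index.php/myjournal/about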
I found some posts where people share successful experiences with this:
https://coolpandaca.wordpress.com/2012/12/07/migrate-ojs-to-nginx-from-apache
https://forum.pkp.sfu.ca/t/ojs3-on-nginx-php7-0-fpm/28590
And another site:
https://www.snip2code.com/Snippet/305514/nginx-configuration-for-OJS-on-an-aegir-
I am running Ubuntu 18.04.1 LTS (GNU/Linux 4.15.0-42-generic x86_64)
on a DigitalOcean server configured by Laravel Forge.
I couldn't find a way to combine those blocks of code (the examples at the links above) with my default Nginx settings:
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/evidenciaonojs.tk/before/*;
server {
listen 80;
listen [::]:80;
server_name evidenciaonojs.tk;
root /home/forge/evidenciaonojs.tk/;
# FORGE SSL (DO NOT REMOVE!)
# ssl_certificate;
# ssl_certificate_key;
ssl_protocols TLSv1.2;
ssl_ciphers ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384;
ssl_prefer_server_ciphers on;
ssl_dhparam /etc/nginx/dhparams.pem;
add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.html index.htm index.php;
charset utf-8;
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/evidenciaonojs.tk/server/*;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
access_log off;
error_log /var/log/nginx/evidenciaonojs.tk-error.log error;
error_page 404 /index.php;
location ~ \.php$ {
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
fastcgi_index index.php;
include fastcgi_params;
}
location ~ /\.(?!well-known).* {
deny all;
}
}
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/evidenciaonojs.tk/after/*;
I expect to change the OJS config file back to disable_path_info = Off and be able to use pretty URLs while running on Nginx.
Any help on this would be truly appreciated!

I just now saw your message on the OJS 3 forum.
For Nginx, try this configuration:
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/evidenciaonojs.tk/before/*;
server {
listen 80;
listen [::]:80;
server_name evidenciaonojs.tk;
root /home/forge/evidenciaonojs.tk/;
# FORGE SSL (DO NOT REMOVE!)
# ssl_certificate;
# ssl_certificate_key;
ssl_protocols TLSv1.2;
ssl_ciphers ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384;
ssl_prefer_server_ciphers on;
ssl_dhparam /etc/nginx/dhparams.pem;
add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.html index.htm index.php;
charset utf-8;
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/evidenciaonojs.tk/server/*;
location / {
try_files $uri $uri/ /index.php?$args;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
access_log off;
error_log /var/log/nginx/evidenciaonojs.tk-error.log error;
error_page 404 /index.php;
location ~ ^(.+\.php)(.*)$ {
set $path_info $fastcgi_path_info;
fastcgi_split_path_info ^(.+\.php)(.*)$;
fastcgi_param PATH_INFO $path_info;
fastcgi_param PATH_TRANSLATED $document_root$path_info;
if (!-f $document_root$fastcgi_script_name) {
return 404;
}
include fastcgi_params;
fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
location ~ /\.(?!well-known).* {
deny all;
}
}
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/evidenciaonojs.tk/after/*;
Be sure to set:
1. cgi.fix_pathinfo = 1 in PHP-FPM (probably in /etc/php/7.2/fpm/php.ini).
2. security.limit_extensions = .php in your FPM pool config file (in /etc/php/7.2/fpm/pool.d/your_site.conf).
3. disable_path_info = Off in the OJS config.inc.php.
Restart the PHP-FPM and Nginx services. Then, if it works, read up on the evils of Nginx if blocks and cgi.fix_pathinfo.
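For reference, those three edits boil down to lines like these (file paths as given above; adjust for your PHP version), followed by a restart:
# /etc/php/7.2/fpm/php.ini
cgi.fix_pathinfo = 1
# /etc/php/7.2/fpm/pool.d/your_site.conf
security.limit_extensions = .php
; OJS config.inc.php
disable_path_info = Off
sudo service php7.2-fpm restart && sudo service nginx restart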

Just to confirm, the things that were needed in my case to run OJS successfully on Nginx (Ubuntu 18.04.1 LTS on a DigitalOcean server configured by Laravel Forge) were:
1) Set cgi.fix_pathinfo=1 in PHP-FPM (in /etc/php/7.2/fpm/php.ini).
2) Uncomment (enable) security.limit_extensions = .php (in /etc/php/7.2/fpm/pool.d/www.conf).
3) Change disable_path_info = Off (in OJS config.inc.php).
4) Replace the Nginx config with:
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/evidenciaonojs.tk/before/*;
server {
listen 80;
listen [::]:80;
server_name evidenciaonojs.tk;
root /home/forge/evidenciaonojs.tk/;
# FORGE SSL (DO NOT REMOVE!)
# ssl_certificate;
# ssl_certificate_key;
ssl_protocols TLSv1.2;
ssl_ciphers ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384;
ssl_prefer_server_ciphers on;
ssl_dhparam /etc/nginx/dhparams.pem;
add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.html index.htm index.php;
charset utf-8;
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/evidenciaonojs.tk/server/*;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
access_log off;
error_log /var/log/nginx/evidenciaonojs.tk-error.log error;
error_page 404 /index.php;
location ~ ^(.+\.php)(.*)$ {
set $path_info $fastcgi_path_info;
fastcgi_split_path_info ^(.+\.php)(.*)$;
fastcgi_param PATH_INFO $path_info;
fastcgi_param PATH_TRANSLATED $document_root$path_info;
fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
fastcgi_index index.php;
include fastcgi_params;
}
location ~ /\.(?!well-known).* {
deny all;
}
}
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/evidenciaonojs.tk/after/*;
5) And finally, restart the services (sudo service php7.2-fpm restart and sudo service nginx restart).
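As a quick sanity check after the restart (a sketch; "myjournal" is a placeholder for your own journal path), PATH_INFO-style URLs can be verified with:
sudo nginx -t
curl -I http://evidenciaonojs.tk/index.php/myjournal/about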

Related

How to do advanced debugging of 404 Error on Forge Laravel Website

I recently migrated a previously working PHP Laravel website to DigitalOcean and am managing deployments with Forge. For DNS hosting, I am using AWS Route 53 (purchased domain there).
I believe there is something going on with the DNS, so I decided to make sure I could still see the website via the public IP. Unfortunately, I only seem to get a 404 error. Like I said, this site worked previously on AWS with the same Nginx, PHP, and Laravel versions.
All of the correct files are in the /public directory, Nginx does not show any compile or status errors, and its config file appears to be correct.
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/dev.agsflagfootballleague.com/before/*;
server {
listen 80;
listen [::]:80;
server_name dev.agsflagfootballleague.com;
server_tokens off;
root /home/forge/dev.agsflagfootballleague.com/public;
# FORGE SSL (DO NOT REMOVE!)
# ssl_certificate;
# ssl_certificate_key;
ssl_protocols TLSv1.2 TLSv1.3;
ssl_ciphers TLS13-AES-256-GCM-SHA384:TLS13-CHACHA20-POLY1305-SHA256:TLS_AES_256_GCM_SHA384:TLS-AES-256-GCM-SHA384:TLS_CHACHA20_POLY1305_SHA256:TLS-CHACHA20-POLY1305-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA;
ssl_prefer_server_ciphers on;
ssl_dhparam /etc/nginx/dhparams.pem;
add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.html index.htm index.php;
charset utf-8;
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/dev.agsflagfootballleague.com/server/*;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
access_log off;
error_log /var/log/nginx/dev.agsflagfootballleague.com-error.log error;
error_page 404 /index.php;
location ~ \.php$ {
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php/php7.3-fpm.sock;
fastcgi_index index.php;
include fastcgi_params;
}
location ~ /\.(?!well-known).* {
deny all;
}
}
# FORGE CONFIG (DO NOT REMOVE!)
include forge-conf/dev.agsflagfootballleague.com/after/*;
Any thoughts on what to debug/research next? I only want to be able to view my web app via IP address at the moment.
Can you try changing:
listen 80;
listen [::]:80;
To:
listen 80 default_server;
listen [::]:80 default_server;
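After editing, something along these lines (assuming the usual Ubuntu/Forge service setup) validates and applies the change:
sudo nginx -t
sudo service nginx reload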

Nginx keeps redirecting on a base install of laravel forge

Although I do not have Laravel set up on the server, I am just trying to serve some PHP files.
In my root directory I have /public/dashboard, which is reached via domain.com/dashboard.
It shows up fine, but as soon as I try to append something to "dashboard" it redirects back to /dashboard.
For example, if I type domain.com/dashboard/test, it goes back to domain.com/dashboard.
I have nothing in my .htaccess file. Here is my nginx config:
# FORGE CONFIG (DOT NOT REMOVE!)
include forge-conf/domain.com/before/*;
server {
listen 443 ssl http2;
listen [::]:443 ssl http2;
server_name jackco.biz;
root /home/forge/domain.com/public;
# FORGE SSL (DO NOT REMOVE!)
ssl_certificate /etc/nginx/ssl/domain.com/292197/server.crt;
ssl_certificate_key /etc/nginx/ssl/domain.com/292197/server.key;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers '';
ssl_prefer_server_ciphers on;
ssl_dhparam /etc/nginx/dhparams.pem;
#add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.html index.htm index.php;
charset utf-8;
# FORGE CONFIG (DOT NOT REMOVE!)
include forge-conf/domain.com/server/*;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
access_log off;
error_log /var/log/nginx/domain.com-error.log error;
error_page 404 /index.php;
location ~ \.php$ {
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php/php5.6-fpm.sock;
fastcgi_index index.php;
include fastcgi_params;
fastcgi_read_timeout 600;
}
location ~ /\.(?!well-known).* {
deny all;
}
}
# FORGE CONFIG (DOT NOT REMOVE!)
include forge-conf/domain.com/after/*;
Any thoughts on how to disable the constant redirect/rewrite?

nginx two sites one directory

I have a project where I want to have one domain per language. So my sites point at the same directory, but for some reason one of my domains is being redirected to the other.
To clarify:
example.com
example.org
example.org points to the same directory as example.com.
When I visit example.org, I get redirected to example.com.
I don't know if the problem is that I use Let's Encrypt SSL on both domains.
Any clues?
Here are my nginx files.
example.com
# FORGE CONFIG (DOT NOT REMOVE!)
include forge-conf/example.com/before/*;
server {
listen 443 ssl http2;
listen [::]:443 ssl http2;
server_name example.com;
root /home/forge/example.com;
# FORGE SSL (DO NOT REMOVE!)
ssl_certificate /etc/nginx/ssl/example.com/xxx/server.crt;
ssl_certificate_key /etc/nginx/ssl/example.com/xxx/server.key;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers 'XXXXXX-$'
ssl_prefer_server_ciphers on;
ssl_dhparam /etc/nginx/dhparams.pem;
add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.html index.htm index.php;
charset utf-8;
# FORGE CONFIG (DOT NOT REMOVE!)
include forge-conf/example.com/server/*;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location ~ /(public/mail-signature/.*)$ {
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
access_log off;
error_log /var/log/nginx/example.com-error.log error;
error_page 404 /index.php;
location ~ \.php$ {
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;
fastcgi_index index.php;
include fastcgi_params;
}
location ~ /\.ht {
deny all;
}
}
# FORGE CONFIG (DOT NOT REMOVE!)
include forge-conf/example.com/after/*;
example.org
# FORGE CONFIG (DOT NOT REMOVE!)
include forge-conf/example.org/before/*;
server {
listen 80;
listen [::]:80;
server_name example.org;
root /home/forge/example.com;
# FORGE SSL (DO NOT REMOVE!)
# ssl_certificate;
# ssl_certificate_key;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers 'xxxxxxx-$'
ssl_prefer_server_ciphers on;
ssl_dhparam /etc/nginx/dhparams.pem;
add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.html index.htm index.php;
charset utf-8;
# FORGE CONFIG (DOT NOT REMOVE!)
include forge-conf/example.org/server/*;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
access_log off;
error_log /var/log/nginx/example.org-error.log error;
error_page 404 /index.php;
location ~ \.php$ {
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;
fastcgi_index index.php;
include fastcgi_params;
}
location ~ /\.(?!well-known).* {
deny all;
}
}
# FORGE CONFIG (DOT NOT REMOVE!)
include forge-conf/example.org/after/*;
I would suggest using a more agnostic name for your document root; otherwise it is very easy to get confused. You could use something like:
/home/forge/site-example
Then try using curl to diagnose your redirect flow, for example:
curl -I -L https://google.com
The -I option fetches only the headers,
and the -L option follows redirects.
For example, for the site http://immortal.run:
$ curl -I -L http://immortal.run
HTTP/1.1 301 Moved Permanently
Date: Fri, 15 Sep 2017 14:58:43 GMT
...
HTTP/1.1 200 OK
Date: Fri, 15 Sep 2017 14:58:43 GMT
Content-Type: text/html; charset=utf-8
...
Notice the:
HTTP/1.1 301 Moved Permanently
That is probably what is happening with your setup.
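A related trick (a sketch, not specific to Forge; YOUR_SERVER_IP is a placeholder) is to bypass DNS with curl's --resolve option, so you can see which server block actually answers for each hostname:
curl --resolve example.com:443:YOUR_SERVER_IP -I https://example.com/
curl --resolve example.org:80:YOUR_SERVER_IP -I http://example.org/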

Nginx downloads PHP instead of executing it

I want to make certain PHP files accessible via HTTP only.
So I added location = /example.php {} as shown in the code below.
server {
listen 80;
ssl off;
server_name example.com www.example.com;
root /var/www/example;
location ~* \.(php)$ {
# dostufdd
}
location = /example.php {
#do stuff
}
location / {
return 301 https://$host$request_uri;
}
}
server {
listen 443 ssl http2;
server_name example.com www.example.com;
root /var/www/example;
index index.php;
ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;
ssl_trusted_certificate /etc/letsencrypt/live/example.com/chain.pem;
ssl_dhparam /etc/letsencrypt/live/example.com/example.com.dhparam;
ssl_stapling on;
ssl_stapling_verify on;
resolver 8.8.8.8 8.8.4.4;
# Set caches, protocols, and accepted ciphers. This config will
# merit an A+ SSL Labs score.
ssl_session_cache shared:SSL:20m;
ssl_session_timeout 10m;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_prefer_server_ciphers on;
ssl_ciphers 'ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-DSS-AES128-GCM-SHA256:kED$
error_log /var/log/nginx/example.error.log warn;
location / {
try_files $uri $uri/ /index.php?$args;
}
# Allow Lets Encrypt Domain Validation Program
location ^~ /.well-known/acme-challenge/ {
allow all;
}
# Block dot file (.htaccess .htpasswd .svn .git .env and so on.)
location ~ /\. {
deny all;
}
# Block (log file, binary, certificate, shell script, sql dump file) access.
location ~* \.(log|binary|pem|enc|crt|conf|cnf|sql|sh|key)$ {
deny all;
}
location = /robots.txt {
log_not_found off;
access_log off;
}
location = /favicon.ico {
log_not_found off;
access_log off;
}
location ~* \.(css|js|ico|gif|jpe?g|png|svg|eot|otf|woff|woff2|ttf|ogg)$ {
expires max;
}
location ~ /.well-known {
allow all;
}
location ~ \.php$ {
try_files $uri =404;
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/run/php/php7.0-fpm.sock;
fastcgi_index index.php;
fastcgi_read_timeout 180;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
}
However, if I try to access http://example.com/example.php, the PHP file is downloaded instead of being executed,
but if I access https://example.com/example.php, it works normally.
I have no idea what to do.
Please help me.
Thank you.

Laravel routes not found after nginx install

After I changed from ICG to nginx, all routes except the index page stopped working.
Laravel Config:
#/etc/nginx/sites-enabled/laravel
server {
listen 80;
root /var/www/home;
index index.php;
server_name 192.168.178.71;
access_log /var/www/home/storage/app/logs/laravel-nginx-access.log;
error_log /var/www/home/storage/app/logs/laravel-nginx-error.log error;
location /home {
root /home/public;
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { log_not_found off; access_log off; }
location = /robots.txt { log_not_found off; access_log off; }
# ERROR
error_page 404 /index.php;
# DENY HTACCESS
location ~ /\.ht {
deny all;
}
}
Default config:
# /etc/nginx/sites-enabled/default
server {
listen 80 default_server;
listen [::]:80 default_server;
root /var/www;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm;
server_name 192.168.178.71 localhost;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ index.php?$query_string;
autoindex on;
# Remove trailing slash to please routing system.
if (!-d $request_filename) {
rewrite ^/(.+)/$ /$1 permanent;
}
}
location ~ \.php$ {
#try_files $uri /index.php =404;
try_files $uri =404;
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php5-fpm.sock;
fastcgi_index index.php;
include fastcgi_params;
fastcgi_param SCRIPT_FILENAME document_root$fastcgi_script_name;
}
location ~ /\.ht {
deny all;
}
}
My nginx config:
#/etc/nginx/nginx.conf
user www-data;
worker_processes 4;
pid /run/nginx.pid;
events {
worker_connections 768;
# multi_accept on;
}
http {
disable_symlinks off;
##
# Basic Settings
##
sendfile on;
tcp_nopush on;
tcp_nodelay on;
keepalive_timeout 65;
types_hash_max_size 2048;
# server_tokens off;
# server_names_hash_bucket_size 64;
# server_name_in_redirect off;
include /etc/nginx/mime.types;
default_type application/octet-stream;
##
# SSL Settings
##
ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
ssl_prefer_server_ciphers on;
##
# Logging Settings
##
access_log /var/log/nginx/access.log;
error_log /var/log/nginx/error.log;
##
# Gzip Settings
##
gzip on;
gzip_disable "msie6";
##
# Virtual Host Configs
##
include /etc/nginx/conf.d/*.conf;
include /etc/nginx/sites-enabled/*;
}
What I tried:
/var/www/home# (the home folder is the Laravel folder)
sudo chown -R www-data:www-data *
/var/www/home#
sudo chown -R root:root *
I also tried changing the try_files directive to each of:
try_files $uri $uri/ /index.php?$query_string;
try_files $uri $uri/ /index.php$is_args$args;
try_files $uri $uri/ /index.php;
and ran php artisan cache:clear.
I have read most of the related questions on Google, but nothing helps.
My phpinfo - link
This is the correct basic config for Laravel and Nginx:
server {
listen 80 default_server;
root /var/www/laravel/public/;
index index.php index.html index.htm;
location / {
try_files $uri $uri/ /index.php$is_args$args;
}
# pass the PHP scripts to FastCGI server listening on /var/run/php5-fpm.sock
location ~ \.php$ {
try_files $uri /index.php =404;
fastcgi_pass unix:/var/run/php5-fpm.sock;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
}
EDIT:
Instead of:
fastcgi_pass unix:/var/run/php5-fpm.sock;
As of November 2018, now that PHP 7.2 is out, it would be:
fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
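If you are unsure which FPM socket actually exists on your machine, a quick check (paths assume a stock Ubuntu PHP-FPM package) is:
ls /var/run/php/          # e.g. php7.2-fpm.sock
systemctl status php7.2-fpm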
When I sent parameters via GET they were not recognized; I just had to enable the following:
try_files $uri $uri/ /index.php$is_args$args;
location / {
try_files $uri $uri/ /index.php$is_args$args;
}
I had the same problem; after updating a few lines, nginx worked fine.
This is for Windows (change root according to your file system):
1. root html/laravel; #Update Here - add the project folder name after html
2. try_files $uri $uri/ /index.php$is_args$args; #Update Here - add this to fix the 404 not found error
server {
listen 80; # IPv4
server_name localhost;
## Parametrization using hostname of access and log filenames.
access_log logs/localhost_access.log;
error_log logs/localhost_error.log;
## Root and index files.
root html/laravel; #Update Here - add project folder name after html
index index.php index.html index.htm;
## If no favicon exists return a 204 (no content error).
location = /favicon.ico {
try_files $uri =204;
log_not_found off;
access_log off;
}
## Don't log robots.txt requests.
location = /robots.txt {
allow all;
log_not_found off;
access_log off;
}
## Try the requested URI as files before handling it to PHP.
location / {
try_files $uri $uri/ /index.php$is_args$args; #Update Here - Add this for 404 not found error
## Regular PHP processing.
location ~ \.php$ {
try_files $uri =404;
fastcgi_pass php_processes;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
## Static files
location ~* \.(?:css|gif|htc|ico|js|jpe?g|png|swf)$ {
expires max;
log_not_found off;
## No need to bleed constant updates. Send the all shebang in one
## fell swoop.
tcp_nodelay off;
## Set the OS file cache.
open_file_cache max=1000 inactive=120s;
open_file_cache_valid 45s;
open_file_cache_min_uses 2;
open_file_cache_errors off;
}
## Keep a tab on the 'big' static files.
location ~* ^.+\.(?:ogg|pdf|pptx?)$ {
expires 30d;
## No need to bleed constant updates. Send the all shebang in one
## fell swoop.
tcp_nodelay off;
}
} # / location
}
I had the same issue, but updating the default configuration made it work.
location @rewrite {
rewrite ^/(.*)$ /index.php?_url=/$1;
}
location / {
try_files $uri $uri/ @rewrite;
}
Let me know whether this worked for you or not.
Run sudo service nginx restart after changing the configuration.
Try this, it worked for me:
sudo nano /etc/nginx/sites-enabled/default
and then sudo systemctl reload nginx
server {
listen 80;
server_name _ midominioexample.com www.midominioexample.com;
root /var/www/html/public;
add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.php;
charset utf-8;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
error_page 404 /index.php;
location ~ \.php$ {
fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
include fastcgi_params;
}
location ~ /\.(?!well-known).* {
deny all;
}
}
I was also getting the same error of routes not working on Nginx, on Ubuntu 16.04.
To solve the routes problem, I tried the following config and it is working fine for me.
Open the project conf file using the following command:
sudo nano /etc/nginx/sites-available/projectname
Then make the following changes in this file:
server {
listen 80;
listen [::]:80;
root /var/www/project_name/public;
server_name server_name;
location / {
try_files $uri $uri/ /index.php$is_args$args;
}
}
The important thing is to change the try_files directive in the location block:
location / {
try_files $uri $uri/ /index.php$is_args$args;
}
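If the config lives in sites-available, remember to enable it and reload (a standard Debian/Ubuntu pattern; "projectname" is the same placeholder as above):
sudo ln -s /etc/nginx/sites-available/projectname /etc/nginx/sites-enabled/projectname
sudo nginx -t && sudo systemctl reload nginx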
I found that this solved my problem with Laravel routing.
I nested the location ~ \.php$ block inside location /.
Example:
server{
listen 9000;
server_name _;
root /var/www/myapp/public;
index index.php index.html;
location / {
try_files $uri $uri/ /index.php$is_args$args;
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
}
}
}
As of 03/2022 and the current version of Laragon (5.0.0), I spent a lot of time figuring out why I couldn't open any link except index.php among the routes I had configured in web.php (per route:list).
All I had wanted to do was turn SSL off and back on; it seems this resets all your config files.
It seems that Laragon by default adds these lines:
# Access Restrictions
allow 127.0.0.1;
deny all;
I put a # in front of deny to comment it out, and it worked again, like this:
# Access Restrictions
allow 127.0.0.1;
#deny all;
