Transferring live Yii2 website to localhost - php

I have a live website built on the Yii2 framework and I want to test it locally using MAMP. I created a new database named demoDB in phpMyAdmin, imported the SQL from the live one, then updated the database configuration inside the common/config/main.php file to my local database credentials:
'db' => [
    'class' => 'yii\db\Connection',
    'dsn' => 'mysql:host=localhost;dbname=demoDB',
    'username' => 'demo',
    'password' => 'demopass111',
    'charset' => 'utf8',
],
but when I run MAMP the frontend only shows a directory listing, like this:
(screenshot: frontend path)
I also tried to run the command
php yii serve
from the project directory, but I got this error:
Document root "/Applications/MAMP/htdocs/demoweb/console/web" does not exist.
Can anyone help with what I'm missing?

You need to point your server's document root to the right location.
In the case of Yii2 (the advanced template) there are two separate folders called backend and frontend, each with its own web/ directory.
So if you want to serve your site using PHP's built-in server, you can use the following commands from your project root to serve the site locally:
php -S localhost:3000 -t frontend/web/ for the frontend part
php -S localhost:8000 -t backend/web/ for the backend part
Your frontend and backend will then be available at http://localhost:3000/ and http://localhost:8000/ respectively.
You can choose the ports as you need; for example, here I use 3000 and 8000.
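Alternatively, the yii serve command accepts a --docroot option. The error in the question appears because the default document root resolves relative to the console application, whose web/ directory does not exist in the advanced template, so something like the following (run from the project root) should work:
php yii serve --docroot="frontend/web"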

Related

Laravel Nova uploaded image file not showing on staging environment

I have deployed a Laravel application, which uses the Nova admin for backend management, on a Bluehost VPS account. I have created a symlink between storage and public as recommended in the Laravel documentation.
I have set up a resource for uploading images like so:
Avatar::make('Image')->onlyOnIndex(),
Image::make('Image')->disk('public')
->path('item-category')->required()->hideFromIndex()
I am using the public disk for storage with the filesystem disk configuration like so
'public' => [
    'driver' => 'local',
    'root' => storage_path('app/public'),
    'url' => env('APP_URL').'/storage',
    'visibility' => 'public',
],
On my local system I can see the actual image displayed, but on staging the image uploads successfully yet does not display because the file is not found.
The record is stored in the db like so
Why does this same configuration work on my local dev system but not on the staging server? I've spent the best part of my morning trying to figure this out.
Try running the Artisan command php artisan storage:link; that will create a symlink at public/storage pointing to storage/app/public.
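For context, in recent Laravel versions the mapping that storage:link creates is declared in config/filesystems.php; the stock entry looks roughly like this (a sketch, so check your own version):
'links' => [
    public_path('storage') => storage_path('app/public'),
],
Once that link exists on the staging server, a file saved on the public disk at, say, a hypothetical item-category/example.jpg becomes reachable at APP_URL/storage/item-category/example.jpg, which matches the 'url' setting in the question.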

Using storage_path from another Laravel app

I have an application that uses two different Laravel apps talking to the same database. App 1 is called BUILDER and App 2 is called VIEWER. In production I use S3 for storing files submitted within the application. For local development I use the storage/app/public folder in BUILDER.
The local dev setup is that BUILDER runs on localhost:8000 and VIEWER on localhost:8001.
Now here comes my problem. In production both apps use the same S3 bucket for storage. So somehow I need to set this up similarly for local development.
The BUILDER is working fine, uploading and reading its files from the storage/app/public folder with FILESYSTEM_DRIVER=public in .env
The VIEWER is also reading these files fine, creating correct URLs, after I added a new disk in the config (BUILDER_URL is set in .env to localhost:8000, which is the URL for the BUILDER):
'builder_public' => [
    'driver' => 'local',
    'root' => storage_path('app/public'),
    'url' => env('BUILDER_URL') . '/storage',
    'visibility' => 'public',
],
BUT... I need to somehow be able to also upload files from the VIEWER app that should end up in the same storage folder as the BUILDER.
So in my VIEWER app I would like my builder_public disk to be something like this:
'builder_public' => [
    'driver' => 'local',
    'root' => builder_storage_path('app/public'), // here
    'url' => env('BUILDER_URL') . '/storage',
    'visibility' => 'public',
],
Is there some way I can share the storage/app folder between two separate Laravel apps?
Yes, you can. If you are using a Linux server, try the command below:
ln -s SOURCE_FOLDER DESTINATION_FOLDER
It will create a symbolic link (a folder shortcut), so both applications will end up using the same underlying location.
If the apps are on different servers, try the mount command instead.
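Alternatively, since a local disk's root is just a filesystem path, the VIEWER's disk can point straight at the BUILDER's storage directory. A sketch, assuming a hypothetical BUILDER_STORAGE_PATH entry in the VIEWER's .env that holds the absolute path to the BUILDER's storage/app/public:
'builder_public' => [
    'driver' => 'local',
    // e.g. BUILDER_STORAGE_PATH=/path/to/builder/storage/app/public
    'root' => env('BUILDER_STORAGE_PATH'),
    'url' => env('BUILDER_URL') . '/storage',
    'visibility' => 'public',
],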

Laravel error: symlink() has been disabled for security reasons

I'm trying to set up a Laravel CMS system that already exists on a shared host. But when I set up my Laravel project in this shared hosting directory I get the error
symlink() has been disabled for security reasons
I can't seem to find a proper explanation of how I can fix this in my situation. I only get this error the first time; when I refresh, the error disappears.
In your project's public directory, manually create a storage folder.
This worked for me.
You can create a symlink for your desired directory by logging in to your server via SSH: download PuTTY and then log in to your server with your server credentials. You can then use the Linux command to create a symbolic link:
ln -s TARGET LINK_NAME
where the -s makes it symbolic.
There are 2 options to solve this:
Use any of the supported services for file storage, i.e. Amazon S3, Dropbox, FTP, Rackspace; or
Follow the steps below:
Create a folder named storage inside the public folder.
Move all folders from storage/app/public/ to the folder you created, public/storage/.
Open config/filesystems.php and change the values as shown:
'local' => [
    'driver' => 'local',
    'root' => public_path('storage'),
],
'public' => [
    'driver' => 'local',
    'root' => public_path('storage'),
    'url' => env('APP_URL') . '/storage',
    'visibility' => 'public',
],
Now open the .env file and change or create the values as shown below:
AVATAR_DIR=avatars
SIGNATURE_DIR=signatures
LOGOS_DIR=logos
MEDIA_DIR=media
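With both disk roots pointing at public_path('storage') as above, uploaded files land directly inside the public directory, so no symlink is needed. A minimal usage sketch (the file name here is hypothetical):
use Illuminate\Support\Facades\Storage;
// Writes public/storage/avatars/example.txt
Storage::disk('public')->put('avatars/example.txt', 'example contents');
// Returns APP_URL . '/storage/avatars/example.txt' (from the 'url' key above)
$url = Storage::disk('public')->url('avatars/example.txt');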
After a lot of struggle, this trick worked for me on a shared hosting server for a Laravel project.
Connect to a terminal on the server via SSH or directly from cPanel.
Run this command in your terminal:
ln -s /folder-path-with-name-to-be-linked /folder-path-where-you-want-to-link-it
e.g. in my case:
ln -s /home/user-name/public_html/domain-folder-name/storage/app/public /home/user-name/public_html/domain-folder/public/
Note: you don't need to change anything in the php.ini or .htaccess files.
Reference Link:
https://unix.stackexchange.com/questions/141436/too-many-levels-of-symbolic-links/141442#141442?newreg=14b52f3d6fcb454399a1a1a66e2170df
The problem could be that your hosting provider will not let you create the symlink. Have you checked whether this is the case?
If so, one suggestion is to write the files that you would have put in the symlinked folder directly into the public area of the site, so that a symlink is not required.

Laravel and Github - Workflow and Problems

I have a Laravel project right now and I'm using Github for my project.
I have two branches, master and develop.
The problem right now is this:
I have all the files in one folder, /dev. I'm using Sublime Text 2 and the official GitHub client. When I switch branches I can see that in the ST2 status bar, which is fine.
I put sftp-config.json in my .gitignore, BUT I have different FTP data for master and develop. I always have to edit the data in ST2 to upload correctly to my FTP to test my changes. Sometimes I even forget that and accidentally upload develop to my master/live page.
It's the same problem with routes.php: I need to disable SSL in routes.php for dev, because I do not have a wildcard certificate; my dev branch/FTP runs on dev.domain.tld and my main site on www.domain.tld.
I created an environment for my Laravel configs: one main config and a "development" config.
Is it possible to use Config::get('app.ssl') in the filter in my routes.php, like this?
Route::group(['before' => ['csrf', Config::get('app.ssl')]], function () {
    Route::get('page', array('as' => 'page', 'uses' => 'PageController@getIndex'));
});
My workflow right now is very annoying and sometimes confusing. I always have to check that I am not uploading things to my live server or changing the master files.
Any suggestions are highly appreciated!
Of course! Laravel supports environment configuration out of the box. You can find pretty much everything you need in the official docs. However, here's an example:
app/config/app.php - "main config"
return array(
    // other config entries
    'ssl' => true,
);
app/config/local/app.php - config for the environment local (you can call it whatever you want)
return array(
    'ssl' => false, // here we override the value from our main config
);
Now the last thing we have to do is make sure our environments get detected correctly.
Laravel uses the host name to detect the environment, so you need to figure out what your machine(s) are called.
An easy way to do that is with the PHP function gethostname().
In bootstrap/start.php you can define your environments:
$env = $app->detectEnvironment(array(
    'local' => array('your-machine-name'),
    'other-environment' => array('other-machine-name'),
));
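To tie this back to the routes.php question: rather than passing the config value directly into the 'before' array, you could read it inside a filter. A sketch for Laravel 4, where 'force.ssl' is just a hypothetical filter name:
// app/filters.php
Route::filter('force.ssl', function () {
    // Only force HTTPS when the current environment's config asks for it
    if (Config::get('app.ssl') && !Request::secure()) {
        return Redirect::secure(Request::path());
    }
});
// app/routes.php
Route::group(array('before' => array('csrf', 'force.ssl')), function () {
    Route::get('page', array('as' => 'page', 'uses' => 'PageController@getIndex'));
});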

How to make ZF2 db credentials environment specific

I am starting a new project.
I am using ZF2. I have just installed it and have the Skeleton application up and running.
This is my deployment process:
I develop on my local machine
I then push to my public github repository
I then use deployhq.com to deploy those to my production server, which is where the user sees the changes.
I have tried looking around Stack Overflow, the Zend site, and Google (blogs etc.), but I still don't have any real understanding of, or solution to, my problem.
I want the application to use different database credentials based on its environment, e.g. if on 'dev' (my local machine) use credentials A, but if on the live server use credentials B.
I have read a lot about global and local autoload config files etc., but bearing in mind my GitHub repo is public, anywhere I commit config files with my DB details they would be visible.
I was wondering if there is a way to have, along the same lines, global and local files containing the DB connections, where I upload the production details manually (not via Git, for security reasons) and tell Git to ignore the local config file somehow. I would also need to know how to tell the application which config files to use, based on the environment and their location.
In ZF2 there are:
Global configuration files &
Module-level configuration files
If you want to know their use, you can refer to the link below:
How does configuration works in ZF2
When I had the same scenario I used the above link to understand and exploit the Zend Config module, which is really good for handling a situation like this.
Create two files:
production.php
local.php
In both of these files, return this array (with values appropriate to the environment):
return array(
    "dbname" => "yourdbname",
    "dbhostname" => "dbhostname",
    "dbusername" => "yourdbusername",
    "dbpassword" => "yourdbpassword"
);
Place both files in the config/autoload/ directory of your Zend Framework application.
Then edit your config/application.config.php file as per the instructions below:
// Get the application env from the Apache vhost file (there you can set it to production or local)
$applicationEnv = getenv('APPLICATION_ENV');
$environmentSpecificConfigPath = "config/autoload/{,*.}{".$applicationEnv.",local}.php";
// Next, within the config array, pass the environment-specific configuration path
'config_glob_paths' => array($environmentSpecificConfigPath)
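For orientation, config_glob_paths sits inside the module_listener_options array of config/application.config.php. Roughly like this, sketched from the standard ZF2 skeleton application:
return array(
    'modules' => array(
        'Application',
    ),
    'module_listener_options' => array(
        'module_paths' => array(
            './module',
            './vendor',
        ),
        // Environment-specific autoload configs are globbed from here
        'config_glob_paths' => array(
            'config/autoload/{,*.}{' . getenv('APPLICATION_ENV') . ',local}.php',
        ),
    ),
);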
In any controller or action you can just use the code below:
$configArray = $this->getGlobalConfig();
Now $configArray has all your DB credentials needed to create a connection:
$adapter = new Zend\Db\Adapter\Adapter(array(
    'driver'   => 'Mysqli',
    'hostname' => $configArray['dbhostname'],
    'database' => $configArray['dbname'],
    'username' => $configArray['dbusername'],
    'password' => $configArray['dbpassword']
));
If you use this config array to connect to the DB throughout your application,
you don't need to worry about environment changes; just make sure you have an APPLICATION_ENV entry in your Apache vhost file.
You can do that by adding the lines below to your Apache vhost files:
# on your production server
SetEnv APPLICATION_ENV "production"
# on your local machine
SetEnv APPLICATION_ENV "local"
Also, last but not least, you can use the ZendExperts module ZeDb
https://github.com/ZendExperts/ZeDb
to manage your CRUD applications.
Hope the above steps help you in setting up the environment.
