I am very new to the server side of development, and I am trying to set up the Koel music streaming service. I have done everything up to the .env modification. I modified it following the Koel install guide, replacing the admin name, email, and password with my own credentials. I am lost on what to do next: do I need to launch a localhost server, and then run the command php artisan koel:init?
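A sketch of the typical next steps, assuming a standard Koel checkout (artisan commands don't need a web server running, only the database configured in .env to be reachable):

# from the Koel project root, after saving the .env changes:
php artisan koel:init    # runs the migrations and seeds the admin account from .env
php artisan serve        # serves the app locally at http://localhost:8000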
I have my staging site for a Laravel app hosted on a Google App Engine instance. The production site is hosted on Compute Engine and isn't a managed server, so I am not as familiar with the setup on GAE. I wanted to try it out in order to eventually move our production site onto a managed server.
I'm having an issue where I can't figure out how to run php artisan commands on staging! I managed to use the Google Cloud SDK and the Cloud SQL proxy to access the staging database, and I assumed I could use some kind of gcloud command to run the artisan command, something like gcloud --compute="php artisan migrate", but I can't figure out the best way to do it.
I have also tried using GCP's in-browser terminal to SSH into the instance, but it seems like I have no access to the actual project files within that SSH session, so I can't run the artisan commands.
Does anyone know best practices for running a migration on this type of server?
I'm also using an app.yaml file to build the instance, so I was thinking maybe I should figure out how to put the command there, but I'm not sure if that's the right move, as the only information I have in that file is the env info and server resource info.
Please help! thank you :)
One way of running migrations for your Laravel project hosted on Google App Engine is to connect your local environment to the project's Google Cloud SQL instance. From there you can just run the migrations from your local environment.
Like @TMK said, connect through the SQL proxy. It's simple.
Create an SQL instance using the instructions found here
Follow the instructions here
With that done, you can run your migration command right from your bash terminal:
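For example, a minimal sketch (the instance connection name and project values are placeholders):

# start the Cloud SQL proxy on your local machine:
./cloud_sql_proxy -instances=my-project:us-central1:my-instance=tcp:3306

# point your local .env at the proxy (DB_HOST=127.0.0.1, DB_PORT=3306),
# then run the migration from your local project directory:
php artisan migrate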
I want to start deploying my Laravel app to production. To avoid changing the AWS config in the near future, I decided to add continuous integration. For this I'm using Elastic Beanstalk and pushing the code there using another AWS tool. All of this is working perfectly. I put a test message in /public/index.php to output "hello world" and it works as expected when going to the URL. When I remove this text and run my Laravel app as normal, I get a 500 internal server error. I'm not sure what exactly Elastic Beanstalk does on deployment for Laravel apps, so I can't tell whether this error comes from a bad DB connection or from the Laravel app not being fully set up yet.
I created an RDS DB instance outside of Elastic Beanstalk. I am able to access it from Sequel Pro, and I added the database I will use to store all my data. I added all the necessary DB connection values (host, port, database, username, password) to /config/database.php and to the Elastic Beanstalk server configuration environment variables.
I've searched online for days but haven't found anything specific to this. Part of my issue is a lack of understanding of how Laravel is set up during a deployment on Beanstalk. Is there something in /.ebextensions that will help me accomplish these goals?
When I deploy my code I need to make sure all vendor files are installed through composer, and I also need to make sure all database migrations/changes happen. I want to automate as many steps as possible so I can just push up code changes and the server will update itself and keep working.
I also want to make sure there isn't anything I'm missing in my DB connection setup. Are there any other files in Laravel I need to set up, or something in Elastic Beanstalk I need to have configured? I'm keeping my DB open to all connections for now and will tighten restrictions later.
EDIT: My database may not be configured correctly yet, but that appears not to be the source of the issue. I think my issue is knowing which scripts to run during deployment, and how. I want to make sure composer and php artisan migrate are run to keep everything up to date. How can I do this with Elastic Beanstalk?
This is a little off topic, but from my experience I do not recommend using Beanstalk to achieve a good push-and-deploy workflow. I recommend using Laravel Forge (forge.laravel.com) to deploy your GitHub repository. In Forge you will be able to create a server on AWS, and it will automatically create database connections and so on; then with just one click you can deploy to the server and keep a good workflow.
Given that we're speaking of a Laravel application, this 500 Server Error often occurs when one is creating new infrastructure. Why? Because it's easy to forget to add the environment variables to the EB environment.
Simply go to the EB environment's configuration and, under Software, modify it to include at least the following environment properties:
APP_DEBUG (can be, for example, false)
APP_KEY (this is the key generated with php artisan key:generate)
APP_NAME (tiagoPeres)
Then the issue will be gone.
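Two further sketches, since the original question also asked about automating composer and the migrations. First, the APP_KEY value can be printed locally without touching your local .env:

php artisan key:generate --show

Second, on the PHP platform Elastic Beanstalk runs composer install by itself when a composer.json is present, so the remaining piece is the migration. A sketch of an .ebextensions config file (the file name and command labels are illustrative):

# .ebextensions/01-laravel.config
container_commands:
  01_migrate:
    command: "php artisan migrate --force"
    leader_only: true   # run the migration on a single instance only
  02_clear_cache:
    command: "php artisan cache:clear"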
I have a website coded in PHP running on my own localhost using WAMP.
I recently joined Google Cloud Platform and have deployed the LAMP stack on it.
I also set up the MySQL database in it successfully.
But now I am confused about how to upload my files into it.
The OS is Debian 8
I have been using Bitbucket for some time; is there a way I can clone the data from there directly to Google Cloud?
Can anyone guide me on how to upload the PHP files there so that I can test my website?
Is there any GUI for that rather than the command line? I am not that good with the command line.
P.S. Ready to give any more relevant info, as I don't know what all data from my side is required to answer this.
When deploying the Google LAMP stack (Click to Deploy), an instance of Google Compute Engine is created automatically; check the Compute Engine / VM instances menu.
Method 1:
Click on the SSH button next to the instance name and a new terminal window will open. Make sure you have git installed; if not, install it yourself:
sudo apt-get install git
Locate the Apache document root. Usually it's:
cd /var/www/html
Then download your repository with
git clone https://www.path.to.repository.git
Make sure you use the HTTPS repository URL, not the SSH one. For SSH you would need to have the same SSH key on your instance as on your Bitbucket account. With HTTPS you will be able to log in with your normal credentials.
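After cloning you may also need to hand the files over to the web server user (a sketch; on Debian the Apache user is typically www-data):

sudo chown -R www-data:www-data /var/www/html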
Method 2:
You can upload files with SFTP.
First you need to generate a key pair with PuTTYgen if you don't have one already.
Next go to the GCP menu, click on the Compute Engine menu, then on the VM instances submenu.
Check the LAMP instance, then click Edit to go to the edit page. Scroll down till you find the SSH keys textbox. Paste in the contents of your public key.
Next use any SFTP client. You can do that from within PhpStorm, FileZilla or PuTTY by selecting your private key and connecting to yourusername@instance_external_ip
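For example, with the OpenSSH command-line client (the key path and external IP are placeholders):

sftp -i ~/.ssh/my_gcp_key yourusername@203.0.113.10
# then, at the sftp prompt, upload the site recursively:
put -r mysite /var/www/html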
Good luck php wizz
I work for a company that for some reason uses a Windows 2003 server with WAMP for its live product (please don't ask me why). Currently we have to push updates to GitHub, manually connect to this production server with Remote Desktop Connection, and pull.
We want to automate this process.
I have tried GitHub webhooks with no success: I couldn't find a way to create SSH keys for the system account (NT AUTHORITY\SYSTEM), which is the one used by Apache on that server. The project is currently cloned via HTTPS, so we could also try saving credentials globally so that git won't prompt the local system account for a password when trying to pull via PHP, but that seems like the worst possible solution security-wise.
Any ideas?
-- UPDATE --
We've decided not to worry about security implications right now. I then followed all the steps to save credentials for a new read-only user, but no success. I can see a .git-credentials file with the correct user/pass/url in:
C:\WINDOWS\system32\config\systemprofile
All commands work (via browser) except for pull, fetch, etc.
Any more ideas?
-- UPDATE 2 --
I've now changed the wampapache service to run as an administrator account instead, which also has credentials stored in its "root" folder (~). It's the same account I can push/pull with via cmd without user/pass prompts. But when I try via browser... no luck.
I'm now officially out of ideas.
I have finally found a solution!
Simply editing the git config file to add user:password to the url parameter under [remote "origin"] has done the job.
url = https://user:pass@github.com/organization/project.git
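For reference, the relevant section of the project's .git/config ends up looking like this (user, pass, and the repository path are placeholders):

[remote "origin"]
    url = https://user:pass@github.com/organization/project.git
    fetch = +refs/heads/*:refs/remotes/origin/*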
I have a working Laravel application on my desktop. I am trying to deploy it to a GoDaddy server. The application has been copied and verified on the server in the same structure as on my desktop. I have changed the database information in the config to the proper entries. When I run it from the server I get the error:
FatalErrorException
Class 'name of the class' not found.
The error is generated in the Routes.php file.
Again, this app works locally. Any idea why it errors when deployed?
Assuming you have a VPS running Apache (or whatever else) with a properly configured vhost and terminal (SSH) access, to deploy a Laravel app there you need to:
Copy all project files, except for the vendor directory, from localhost to your server
Verify file ownership and access rights (chown/chmod)
Change all necessary config
Run composer update
Run php artisan migrate to create the database schema
Clear the cache: php artisan cache:clear
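As a consolidated sketch of the command-line steps above (run from the project root on the server; composer dump-autoload is an extra step, not in the list, that often resolves "Class not found" errors):

composer update             # reinstall dependencies and rebuild the autoloader
php artisan migrate         # create the database schema
php artisan cache:clear     # clear stale caches
composer dump-autoload      # often fixes "Class ... not found" in routes files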
The next thing is populating the database with your app's data from localhost. You can either do it manually (dump the SQL locally, import it on the server) or, if you want Laravel to handle it, use seeding (see the Laravel docs).
There is even a nice package that lets you automatically generate seeders from your current database content, so it's very useful for the purpose of moving an app to another server.
https://github.com/orangehill/iseed
Inverse seed generator (iSeed) is a Laravel 4 package that provides a method to generate a new seed file based on data from the existing database table.
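Usage is a single artisan command per table, for example (the table name is a placeholder):

php artisan iseed users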
EDIT
If you are using a shared hosting account, check out this guide
http://driesvints.com/blog/laravel-4-on-a-shared-host
This post from Laravel forum might also be of use
http://forums.laravel.io/viewtopic.php?id=9639