As developers we felt that OpenShift v2 was a great platform for developing and deploying apps. Now v2 is reaching end of life and v3.x has arrived to take its place.
Being new to the v3 architecture, I find it harder to get started than v2 was, so I have some questions to ask first:
(1) In v2 we could create an application and get a link to clone the repo locally. How can we create a PHP application on v3 without a GitHub repo and clone it to a local repo, so that the source can stay private?
(2) Adding databases on v2 was much easier, but on v3 it feels like a nightmare for developers like me. How can we add a MySQL database to our PHP application on v3?
(3) In v2 we would make a change to the source code, commit, and push, and the app would be live shortly after. How do we push new changes in v3?
These are the basic questions that need answering; any resource would be a life saver.
(1) To avoid using GitHub, or any other Git repository hosting service, you need to use a binary build. Although the post is about Django and Python, you can see the steps for using a binary build in:
https://blog.openshift.com/migrating-django-applications-openshift-3/
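To give a rough idea, the CLI steps from that post translated to PHP look something like the sketch below; the build name and the PHP image stream tag are placeholders, not values taken from the post:

```
# Create a PHP binary build that takes its source from your machine
# instead of a Git repository (name and image version are placeholders).
oc new-build --binary --strategy=source --image-stream=php:7.0 --name=myapp

# Upload the current local directory as the build input and watch the build run.
oc start-build myapp --from-dir=. --follow

# Deploy the resulting image and expose it with a route.
oc new-app myapp
oc expose svc/myapp
```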
(2) To add a database, go to Add to Project, find the database you want to use, and create it. Then set environment variables on the deployment configuration of the front-end application so it knows where the database is and what the login credentials are. An example of that can be found in:
https://blog.openshift.com/adding-database-openshift-online-3/
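If you prefer the CLI over the Add to Project page, the equivalent is roughly the following; the template parameters and the dc/service names are assumptions for the sake of the example:

```
# Create a MySQL service from the ephemeral template (values are placeholders).
oc new-app mysql-ephemeral \
  -p MYSQL_USER=phpuser -p MYSQL_PASSWORD=secret -p MYSQL_DATABASE=phpapp

# Tell the front-end deployment configuration where the database is
# and what the login credentials are.
oc set env dc/myapp \
  MYSQL_HOST=mysql MYSQL_USER=phpuser MYSQL_PASSWORD=secret MYSQL_DATABASE=phpapp
```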
(3) If using a binary build, as it seems you will want to given (1), you start a new build and tell it to use the code from your local directory. This is explained in the same post given for (1).
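In practice that means your v2 commit/push step becomes a single command; continuing the hypothetical names from the sketch above:

```
# Commit locally as usual, then upload the updated directory as a new binary build.
oc start-build myapp --from-dir=. --follow
```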
I'd also suggest you work through the example application in:
https://www.openshift.com/promotions/for-developers.html
This will give you further background on using OpenShift version 3.
If you want to keep the same workflow you had in OpenShift v2 (commit/push/live), sign up for a free account on GitLab.com or Bitbucket.org, both of which include free private repos (or bite the bullet and pay for an account on GitHub.com).
Then, check out Graham's post on best practices for using private git repos with OpenShift v3, which links to several guides on the subject: https://blog.openshift.com/private-git-repositories-part-1-best-practices/
As for the DB, you can add the database after the fact as Graham described (add a database to your project, tell your PHP application which variables to look for, then set those environment variables on your PHP app's deployment config), or you can write a reusable template that deploys your application, i.e. the PHP app and the database along with their configuration, to any OpenShift cluster (see the CakePHP template examples). I prefer creating a template for my apps with v3, but maybe I'm crazy :)
Related
I have been developing a website that uses Laravel (v6) on the backend and Nuxt.js (v2) on the frontend. The idea was for Laravel to act as an API and OAuth2 server that also server-side renders the Nuxt.js app. From my research, this seemed to be not only a common route, but also not too much hassle to implement.
While developing, I have kept the backend and frontend as completely separate projects with their own git repos and all that jazz. This is my first time deploying/developing a project like this, where there are two completely separate applications for the backend and frontend, so all of this is very new and a little challenging at times. When it came time to deploy them, I had always imagined that I would somehow merge the projects and set up Laravel to server-side render the Nuxt.js app. However, I am now at that stage and trying to merge them with great difficulty.
Currently I am using the "laravel-nuxt" composer package and "laravel-nuxt" npm package in an attempt to connect the projects in one repo. However, I am having difficulty doing this. I've searched far and wide for a good resource on this process and have yet to find one that explains it thoroughly. I even purchased a course on Udemy on the topic, only to find out they didn't merge the projects! They deployed Nuxt to Firebase and didn't even cover the deployment of Laravel.
Anyway, this is my question(s): should or could I keep the projects separate and have two completely separate deployments? Or rather, if I keep them separate, how do I deploy Nuxt in a way that still gets server-side rendered? To me it doesn't matter whether they are separate or together, but the most important part is that the Nuxt app utilizes SSR (server-side rendering) for SEO purposes. So am I on the right track? Should I keep these projects separate, or should I continue trying to merge them?
Sorry if this is unclear; I am rather frustrated and kind of losing my mind. I would really appreciate any feedback or a pointer in the right direction. Thank you for your time in reading this, and I hope you have a good day :)
I recently developed something with a similar structure: a Nuxt.js frontend and Directus CMS as the backend.
I kept the backend and frontend in separate repositories and also deployed them separately. The reason I decided to do it that way is that they need different packages on the server side and use different ecosystems.
The frontend only needs Node.js; the backend needs a web server, a database, and PHP. I don't think these should be mixed.
For the backend I used my existing server, where I already have things running such as Nextcloud and a blog behind an nginx web server.
For the frontend I used Dokku, which I can only recommend for deploying Node.js apps. Nuxt.js has instructions on how to deploy to it.
Most important for you: SSR is done by Nuxt.js itself, so you don't need a separate web server for that. Just build the app and use npm start. Depending on your installation/deployment, you may have to use nginx as a proxy to avoid calling the app with a port number; that is another thing Dokku handles for me automatically, as long as the app respects the PORT environment variable.
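In other words, the SSR deployment boils down to something like the following, assuming the default Nuxt.js package.json scripts:

```
# Build the Nuxt.js app for server-side rendering, then start its Node server.
npm run build
PORT=3000 npm start   # Nuxt reads the PORT variable, which Dokku sets for you
```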
I am new to Nextcloud app development and would like to create a simple app to play around with. I saw that some apps are made with Vue.js, so I am asking whether there is a guide out there.
I generated an app skeleton and played around with the PHP templates, but unfortunately I don't know PHP and would like to create a Vue.js project within this existing demo app.
I found some premade Vue components for Nextcloud
https://github.com/nextcloud/nextcloud-vue
but no step-by-step guide on how to set up the Vue project after creating the skeleton app. I just saw that the Nextcloud app Tasks is also using Vue.js within the PHP code:
https://github.com/nextcloud/tasks
Thanks a lot for any help.
I've looked at the repository you mentioned and it's fairly easy to set up; the question is what you would get out of it after setup. If you clone the repository, you have something like a 'working copy' of the plugin. But this plugin needs to be built, and the resulting package needs to be copied to the right location on the Nextcloud server (or maybe you have to install the package through the frontend). As I understand it, this is a kind of plugin and not a full web application that could run standalone.
It's not like you said:
Nextcloud app Tasks is also using Vue.js within the PHP code
It's more like PHP is used for the backend and Vue for the frontend; these two 'projects' are completely independent of each other (there are no direct dependencies). Just create your Vue application (after the build you will have an index.html and several JS and CSS files); these files have to be reachable from the browser, and your application's entry point is then index.html. On the PHP side, you just define interfaces and routes, which give you the ability to interact with the server side. Later, to deploy your plugin, you have to package it in the format Nextcloud needs; you can read in the Makefile of the Tasks repository what happens when the package is created.
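As a rough sketch, the build-and-package cycle described above usually looks something like this; the exact targets vary per app, and the Tasks repository's Makefile is the reference here:

```
# Build the Vue frontend (produces index.html plus the JS/CSS bundles).
npm install
npm run build

# Package the app in the format Nextcloud expects. The target name is taken
# from typical Nextcloud app Makefiles; check the Tasks Makefile for the real one.
make appstore
```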
I think a good starting point for you would be: https://docs.nextcloud.com/server/15/developer_manual/app/tutorial.html
My website, written in PHP, is currently deployed on an AWS server.
The client wants to move it to Heroku in order to integrate the database with Salesforce.
Is it really necessary?
Salesforce has an API with which I can sync data from my project's MySQL database to the Salesforce database.
Will moving the project to Heroku provide any extra advantages that AWS doesn't?
Thanks in advance for your answers.
Actually, there are no special advantages in the approach your client proposes. In both cases you will need to implement the logic for integrating and interacting with the Salesforce side, and I don't see any benefit in migrating to Heroku, only the additional work of migrating your current infrastructure.
Although Heroku provides some features for Salesforce integration out of the box, it seems to me it would be cheaper and easier to add the Salesforce integration to your current project. But it depends on many factors (for example, how well the Heroku platform fits your solution at all), so possibly the best way is to implement a PoC for both approaches (if that's feasible) and compare them.
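For context, "adding SF integration to your current project" can be as simple as calling the Salesforce REST API from the PHP/MySQL side; here is a hedged sketch where the instance URL, API version, object, and token are all placeholders:

```
# Push a record from your MySQL-backed app to Salesforce via its REST API.
curl https://yourInstance.salesforce.com/services/data/v45.0/sobjects/Contact/ \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"LastName": "Example", "Email": "user@example.com"}'
```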
I am trying to create a SCORM package and generate statements for it.
I want to set up a local LRS and Tin Can API, generate statements from my SCORM content, and display the results in my PHP page.
I have created an LRS using the following link (http://onetarek.com/tin-can-api/guidephp-simple-lrs-with-tin-can-api/).
I have downloaded the Tin Can PHP sample and installed it locally. Unfortunately it is not working, as I need to set my endpoint and auth credentials, and I have no idea how to do that for my local LRS setup.
How do I do this?
I also want to host my SCORM package on an LMS and test it with this setup.
I have looked through many forums and posts, but nothing has worked.
I am lost. Need some resolution.
Kindly help.
-Vignesh Selvarajan
That post is quite old, and even the original author probably wouldn't suggest that approach today. He would probably suggest, and if your LRS must be local and PHP-based I would agree, that you check out the Learning Locker LRS.
http://learninglocker.net/
Alternatively, you could use Rustici's SCORM Cloud, which supports import/launch for both SCORM and Tin Can packages and will create statements in the LRS for launched SCORM content. It has an API for integrating with an LMS. You may also want to check out the Dispatch feature for hosting packages in other LMSs.
http://scorm.com/scorm-solved/scorm-cloud-features/
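Whichever LRS you end up with, setting the endpoint and auth credentials you asked about looks roughly like this with the TinCanPHP library; the endpoint URL and credentials are placeholders you would replace with the values your LRS gives you:

```
<?php
// Hedged sketch using the TinCanPHP library (installed via Composer);
// endpoint and credentials are placeholders from your LRS.
require 'vendor/autoload.php';

$lrs = new TinCan\RemoteLRS(
    'https://your-lrs.example.com/data/xAPI/', // LRS endpoint
    '1.0.1',                                   // xAPI version
    'lrs-key',                                 // basic auth username/key
    'lrs-secret'                               // basic auth password/secret
);

$statement = new TinCan\Statement([
    'actor'  => ['mbox' => 'mailto:learner@example.com'],
    'verb'   => ['id' => 'http://adlnet.gov/expapi/verbs/completed'],
    'object' => ['objectType' => 'Activity', 'id' => 'http://example.com/activities/my-scorm-course'],
]);

$response = $lrs->saveStatement($statement);
var_dump($response->success); // true if the LRS accepted the statement
```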
HTH.
Good day to you all,
I am currently developing a project in Laravel. So far I have always developed online, editing my files directly on the web server through FTP (using PSPad or similar simple editing tools).
What I want to do now (and what I believe most people actually do) is set up a (W)LAMP stack on my local machine and program locally. However, it is a little unclear to me how to keep my local code (including databases) in sync with the live website. How do you folks do that? I know there are probably lots of ways and tools to do that, but what would be your advice for a best practice? Any advice would be very welcome :)
What many companies do is build offline, then push their edits up to a server using git.
I'm no expert on the software, so I'll describe what you do in basic form:
My advice would be to create an online repo (repository) to store your project while you edit/update.
There are several Git hosting services, such as GitHub or Bitbucket. I personally use Bitbucket.
What Git does is this: when you have built or added what you need offline on your local (W)LAMP, you git push the changes up to your repo or server. The changed files are then merged with what already exists in the repo or on the server. If you want the most recent version of your project, you simply git pull it back down.
Read the full Git documentation to see the wide range of options available; a minimal version of the push/pull workflow is sketched below.
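A minimal sketch, assuming a Bitbucket remote (the URL and branch name are placeholders):

```
# One-time setup: initialise the repo and point it at your hosted remote.
git init
git remote add origin git@bitbucket.org:you/yourproject.git

# Each time you've changed something locally:
git add .
git commit -m "Describe the change"
git push origin master    # publish your local edits to the repo

# To get the most recent version back down later:
git pull origin master
```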
We have a settings array within our platform available as $res::Config.
At runtime, a variable is switched from 'dev' to 'live' after checking the HTTP host, which obviously depends on the IP address.
Within our framework's bootstrapping, the settings for the database connection are chosen depending on the value of $res::Config->$env, i.e. the environment set previously as either dev or live. You store these settings in the Config array as db_live and db_dev.
However you do it, use an environment variable to figure out whether you want live or dev, and set up an array of settings accordingly.
We also have sandbox and staging environments for intermediate development stages.
As for version control, use Git or Subversion.
Edit: another option is to set an environment variable to either live or dev within the vhost file and have the application read from it accordingly. I'd suggest this approach, roughly as in the sketch below :)
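A minimal PHP sketch of that pattern, assuming an APP_ENV variable set in the vhost (e.g. SetEnv APP_ENV dev in Apache); the names here are placeholders, not the actual $res::Config implementation:

```
<?php
// Hypothetical sketch: choose DB settings from an environment variable
// set in the vhost; names are placeholders, not the poster's $res::Config.
$env = getenv('APP_ENV') ?: 'live';   // fall back to live if nothing is set

$config = [
    'db_dev'  => ['host' => '127.0.0.1',      'name' => 'myapp_dev', 'user' => 'dev',  'pass' => 'devpass'],
    'db_live' => ['host' => 'db.example.com', 'name' => 'myapp',     'user' => 'live', 'pass' => 'livepass'],
];

$db  = $config['db_' . $env];
$pdo = new PDO(
    "mysql:host={$db['host']};dbname={$db['name']};charset=utf8mb4",
    $db['user'],
    $db['pass']
);
```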
There are a number of ways of doing this. But this is a deceptively HUGE question you've asked.
Here is some good practice advice - go and research these items, then have a look at my approach.
Typically you use a process called version control, which allows you to create "versions" or snapshots of your system.
The commonly used "SVN" software is good, but the new (not really any more) kid on the block is Git, and I personally recommend it.
You can use this system to push the codebase live in a controlled fashion. While the file upload itself is essentially similar to FTP, it allows you to push a specific version of your site live.
In environments where there are multiple developers this is ideal - you can compare/test and work alongside each other, and version control tends to prevent errors between devs.
So - advice part 1: Look up and understand version control, then use it to release CODE to the live environment.
Part 2: I use database dumps and farm them back to my machine to work with.
If the live database needs updating, I can work locally and simply export, then re-import on the live system.
For example: on a recent Moodle project I worked on, refreshing the whole database took seconds... I could push a patch and a database update in a few minutes.
However: you should think about maintenance and scheduling... if the site is live and has ongoing data changes then you need to be careful with this. Consider adding a maintenance page.
Advice 2: go research SQL dump/export and importing.
I personally use phpMyAdmin to dump and re-import, as it's very convenient.
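If you prefer the command line over phpMyAdmin, the same dump/re-import cycle looks roughly like this (hostnames, database name, and credentials are placeholders):

```
# Pull the live data down and load it locally.
mysqldump -h live-db.example.com -u appuser -p myapp_db > myapp_db.sql
mysql -h 127.0.0.1 -u root -p myapp_db < myapp_db.sql

# Work locally, then export your local copy and re-import it on the live system.
mysqldump -h 127.0.0.1 -u root -p myapp_db > myapp_db_updated.sql
mysql -h live-db.example.com -u appuser -p myapp_db < myapp_db_updated.sql
```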
Advice 3: Working locally then pushing live is MUCH BETTER PRACTICE. You're starting down a much safer and better road than you're on!
Hope that helps... but bear in mind - this is a big subject, so you'll need to research a fair bit.