I developed a web system (in CodeIgniter) for a client and it runs on a local machine inside his office. Whenever I need to update the system, I perform the following steps:
I access the server machine via TeamViewer;
I open Cmder;
I log in to my Bitbucket account and run a git pull;
I run the necessary migrations.
Thinking of desktop applications that automatically check for a new version and update themselves, is it possible to build a routine into this web system that does the same thing: check Bitbucket (or another server) for a new version and then perform the update?
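To make the idea concrete, something like the rough sketch below is what I have in mind (the controller name, branch and token are just placeholders, and it assumes git is installed on that machine and CodeIgniter's migration library is enabled):

    <?php
    // application/controllers/Update.php - rough sketch of a self-update endpoint.
    // The controller name, branch and shared token are placeholders.
    class Update extends CI_Controller
    {
        public function run($token = '')
        {
            if ($token !== 'some-long-secret') {   // minimal protection for the endpoint
                show_error('Not allowed', 403);
                return;
            }

            // Pull the latest code from Bitbucket (git must be available on the
            // machine and the repository already cloned with stored credentials).
            chdir(FCPATH);
            exec('git pull origin master 2>&1', $output);

            // Run any pending database migrations.
            $this->load->library('migration');
            if ($this->migration->latest() === FALSE) {
                show_error($this->migration->error_string());
                return;
            }

            echo implode("\n", $output) . "\nMigrations are up to date.";
        }
    }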
Thanks for the suggestions.
Related
I developed a Progressive Web App (PHP and JavaScript based) and I upload it to a remote LEMP server (Debian/Nginx based) through FileZilla.
How can I deploy future updates without using FTP?
What tools exist for managing the next versions of my web app?
If it helps, I use VSCode for coding.
Thanks guys!
You can't update your web app without FTP. For the browser to pick up a new service worker, the file needs to differ from the previous version by at least one byte.
And the only way to update the service worker file on your server is via FTP.
If you can't use FileZilla for whatever reason, you can try alternatives like editing the files through the cPanel file editor.
Or you can use free hosting like GitHub Pages, Netlify or Firebase, and automate the deployment of your front end each time you commit something to a Git repository.
This solution only works for your static files, not for your PHP backend.
Thanks guys for your replies.
I found the right solution for me following this article: https://amifactory.team/blog/how-to-deploy-a-website/#simplewayscprsyncftp
In practice I use the lftp utility to sync my local website with the copy on the remote server.
The good news is that lftp copies changed files only, which reduces upload time, and it copies/overwrites the new files only at the end of the lftp run.
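In case it is useful, here is a minimal sketch of the kind of lftp invocation the article describes, wrapped in a small PHP helper so it can be run as a one-command deploy script (the host, credentials and paths are placeholders, and lftp must be installed locally):

    <?php
    // deploy.php - sketch only: mirrors the local site to the remote server with lftp.
    // The host, credentials and paths below are placeholders.
    $cmd = 'lftp -u deployuser,deploypass '
         . '-e "mirror -R --only-newer ./public /var/www/mysite; bye" '
         . 'sftp://example.com';
    // -R uploads (reverse mirror); --only-newer skips files that have not changed.
    passthru($cmd, $exitCode);
    exit($exitCode);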
How would I go about starting a XAMPP server that uses files from GitHub instead of htdocs?
The idea is that I keep my PC on and connected to my router, and while I am away I can push or merge updates to GitHub and have them automatically go live, since XAMPP would be pointed at my repo.
Can this be done?
I know you can use GitHub Pages, but PHP and server-side code are critical to what we are doing.
You could use webhooks in combination with your XAMPP server in order to:
listen to push events on GitHub, and,
when such a push is detected, run git pull in that repository to update your files (a sketch of such a receiver follows below).
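For instance, a minimal deploy.php served by XAMPP could look something like this (a sketch only; the secret, repository path and branch are placeholders, and the webhook on GitHub must point at this file with the same secret configured):

    <?php
    // deploy.php - minimal GitHub webhook receiver (sketch only).
    $secret  = 'change-me';                     // must match the secret set on GitHub
    $payload = file_get_contents('php://input');
    $header  = $_SERVER['HTTP_X_HUB_SIGNATURE_256'] ?? '';

    // Verify that the request really came from GitHub.
    $expected = 'sha256=' . hash_hmac('sha256', $payload, $secret);
    if (!hash_equals($expected, $header)) {
        http_response_code(403);
        exit('invalid signature');
    }

    // Update the working copy that XAMPP serves (placeholder path and branch).
    chdir('C:/xampp/htdocs/myrepo');
    exec('git pull origin main 2>&1', $output, $status);

    http_response_code($status === 0 ? 200 : 500);
    echo implode("\n", $output);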
I have a web app running with PHP and MySQL.
I need to develop a desktop application which will sync data from the cloud DB whenever the client's computer connects to the internet. If the client's computer is not connected to the internet, the desktop application will continue to work offline, using the local DB. The local DB is obviously a replica of the cloud DB.
I don't want to use Microsoft C# to create the desktop application. The desktop application needs to be cross-platform and should run on Windows, Mac and Linux.
I have used XAMPP to create a local MySQL DB and have achieved the local app to sync with the cloud app. However, there are multiple problems to that approach.
-- Whenever my clients need to install the local app, they have to call me, and I have to install XAMPP on their computer, set up the server, set up the local database and prepare it to sync with the cloud database under their account. They obviously aren't tech-savvy, so they don't know how to do it themselves.
-- If a client formats their computer, they call me again and again, and I have to set it all up for them every time, which isn't scalable in the long run.
-- XAMPP doesn't work when other processes are running on common ports. For example, Skype, Quick Heal and other antivirus software will prevent the MySQL server from starting. Sometimes, even after I have installed the local app, the client installs antivirus software or some other tool and my local app stops working on their computer.
Hence, I need to do away with XAMPP and switch to something else.
SQLite is out of the question since it is serverless. I don't want to use .NET either. What I am looking for is this:
I want to develop the database-driven local application and package it somehow. I want to provide an installer file which will automatically install the database server, set up the database and everything else. The client will only log in to the system in the local app; he doesn't have to set up any server. All the work that he does will be synced with the cloud server whenever the internet connection resumes.
Please note that there is a master/slave setup involved. The client will have multiple terminal computers using the master system, and all these terminal computers will use the local database installed on the master computer.
I have tried to illustrate this with a diagram below
What's the best way to go about it?
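To make the requirement concrete, here is a simplified sketch of the kind of sync routine I mean (the table name, the "synced" flag column and the endpoint URL are just placeholders):

    <?php
    // sync.php - illustrative only: push rows changed offline to the cloud API
    // once connectivity returns. "invoices", "synced" and the endpoint URL are
    // placeholders, not part of the actual schema.
    $local   = new PDO('mysql:host=127.0.0.1;dbname=localdb', 'root', '');
    $pending = $local->query("SELECT * FROM invoices WHERE synced = 0")->fetchAll(PDO::FETCH_ASSOC);

    foreach ($pending as $row) {
        $ch = curl_init('https://cloud.example.com/api/sync/invoices');
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => json_encode($row),
            CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 10,
        ]);
        $ok = curl_exec($ch) !== false && curl_getinfo($ch, CURLINFO_HTTP_CODE) < 300;
        curl_close($ch);

        if ($ok) {  // mark the row as synced only after the cloud accepted it
            $local->prepare("UPDATE invoices SET synced = 1 WHERE id = ?")->execute([$row['id']]);
        }
    }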
I wanted to ask for some help (a tutorial link or something): what I want to do is deploy a PHP CodeIgniter application on Azure.
Also, my application has an upload feature, so I need to make that work too (I don't know if the deployment is more complex because of that).
I was using WAMP during development; now I need to push the application to a server.
Generally, deploying a CI application to Azure Web Apps is simple. Create an Azure Web App in the Azure portal, then go to the management page of your Web App and click All settings => Deployment source => Choose source => Local Git Repository to configure Git deployment for your Web App.
Then you can find the Git clone URL under the Essentials tab.
Additionally, Azure App Service can run php composer.phar install when you run git push, but this is not enabled by default. To enable it, you need to install the Composer extension for your web app.
Here is a CI template in the Azure samples you can refer to for hints: https://github.com/Azure-Samples/app-service-web-php-get-started
And here is a video you can refer to: https://youtu.be/bBb_Hi2Odqc. It walks through the classic portal, but the steps still work today.
Deploying CodeIgniter is very easy; all you need to make sure of is placing the project directory correctly.
Just upload the project folder to the host, either over SSH or FTP.
Modify the .htaccess file accordingly.
Once all of the above is set up, including uploading all the files to the host, don't forget to update config.php, database.php and routes.php with the new hosting parameters (typical entries are shown after these steps).
To make your upload feature work correctly, you just need to give the application's upload folder the required write permissions.
And that's all you need to do when deploying a CodeIgniter application.
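For reference, the entries that usually need updating look roughly like this (the domain, host and credentials are placeholders; these are the keys that typically change, edited in place in the existing config files):

    <?php
    // application/config/config.php - point the base URL at the new host (placeholder domain).
    $config['base_url'] = 'https://www.example.com/';

    // application/config/database.php - credentials supplied by the hosting (placeholders).
    $db['default']['hostname'] = 'localhost';
    $db['default']['username'] = 'db_user';
    $db['default']['password'] = 'db_password';
    $db['default']['database'] = 'db_name';
    $db['default']['dbdriver'] = 'mysqli';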
I'm going to start working on a web application. The application uses PHP and the files are going to be hosted on a server I own. Some of my friends will also be working on this project with me.
How exactly should I set up Git (using GitHub) so that when my friends and I push our changes to GitHub, our server gets updated automatically with the PHP files?
git push has a mirror mode that may be just what you want. All you need to do to activate it is add your remote with --mirror=push and all should be good. If not, leave a comment and I'll help further.
Another way is to install Git on your server, add the GitHub remote, and set up a cron job that runs every x period (e.g. one minute) and does a git fetch followed by a reset to the development branch. That way, when you push something to that branch, your server fetches it and resets the working tree to it (a sketch of this approach is at the end of this answer).
A third way: check out the article on deploying code automatically with GitHub webhooks at http://jonathanstark.com/blog/deploying-code-automatically-with-github-webhooks
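A rough sketch of the cron-driven second approach (the paths, branch name and schedule are placeholders):

    <?php
    // deploy-poll.php - sketch of the cron-driven approach described above.
    // Example crontab entry (placeholder): * * * * * php /var/www/deploy-poll.php
    chdir('/var/www/mysite');                                 // placeholder path to the working copy
    exec('git fetch origin 2>&1', $output);
    exec('git reset --hard origin/develop 2>&1', $output);    // discard local changes, match the branch
    echo implode("\n", $output), "\n";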