I'm using Laravel and I'm looking for a way to back up my database with PHP code. I want these backup files to be saved to paths such as other drives or USB drives, chosen by the user: the user enters the Windows path in the software, such as "D:\".
I also use XAMPP to run the server on Windows.
Please suggest a standard Laravel package for this if you know of one.
I tried https://github.com/spatie/laravel-backup. It's very good and has many options for managing your backups.
If you want to show the user a better report of the backups that have been taken, I suggest creating a table, storing additional information about each backup in it, and using that table to display the information to the user.
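To point the package at a user-chosen Windows path, you can define an extra filesystem disk whose root is that path. The sketch below is only an illustration: the usb_backup disk name is made up, and the hard-coded root would in practice come from wherever your software stores the path the user typed in (e.g. "D:\").

```php
<?php
// config/filesystems.php — a minimal sketch. The 'usb_backup' name and the
// hard-coded root are assumptions; in a real app the root would be read from
// the user's saved setting rather than written here.
return [
    'default' => env('FILESYSTEM_DISK', 'local'),

    'disks' => [
        'local' => [
            'driver' => 'local',
            'root'   => storage_path('app'),
        ],

        // Extra disk pointing at the user-chosen drive or USB stick.
        'usb_backup' => [
            'driver' => 'local',
            'root'   => 'D:\\',
        ],
    ],
];
```

Then list 'usb_backup' under backup.destination.disks in the package's config/backup.php and php artisan backup:run will write the archive there. The package also fires events on success and failure that you could listen for to fill the reporting table suggested above.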
I'm currently developing a system using PHP and MySQL. The system should allow the user to access the database through a standard GUI online (in a web browser) and also to access the database offline. Can anyone help me with a method of making this possible?
The idea I have right now (thinking out of the box) is to use a localhost database manager like XAMPP for the offline side and an online database where the system will be hosted, but I can't figure out how both databases would be synchronized so that they stay the same.
If you want to have two databases you will probably need a mirroring system, which will keep the local (otherwise inaccessible) database synchronized with the database on the server. You can also have a look at this link for more information: Right way to mirror a PHP/MySQL setup
This topic has been raised a few times but even after searching and going through suggestions I haven't found a solution.
I have a PHP-based website and an Android app that have to use the same database. The database is for an objective-type test. The questions are created on the web using an admin panel.
The Android app needs to work offline so there is a need for one time sync when internet is available.
I have created a table in phpMyAdmin called "database version". The Android app checks the database version using a web service. If the version number is different, the app should be able to download the latest version of the website's database and store it locally for offline access.
One of the suggestions I've come across is to convert the .sql database into .sqlite and give the Android app a download link to the .sqlite file. However, converting a .sql database to .sqlite using PHP doesn't seem to be easy. I've come across a shell-script method, but with no knowledge of shell scripting that probably won't work for me.
So, I'm looking for a way to make sure the Android app can use the same database as the website but only requires an internet connection for syncing; otherwise the app works offline.
I would appreciate some direction and advice on how to accomplish this.
Thanks!
Actually, I'm not sure if I would choose a solution that "replaces" the database when your app gets back online. I think I would build a local database on Android (SQLite) and keep it synced with the web database (MySQL) row by row, by comparing timestamps.
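As a sketch of the server half of that row-by-row approach, the website could expose a small endpoint that returns only the rows changed since the app's last sync. The table name, columns and credentials below are assumptions; the Android side would upsert the returned rows into its SQLite copy and remember server_time for the next call.

```php
<?php
// sync.php — hedged sketch of a "give me everything changed since X" endpoint.
// Assumes a `questions` table with an `updated_at` TIMESTAMP column.
$pdo = new PDO('mysql:host=localhost;dbname=quiz;charset=utf8mb4', 'dbuser', 'secret', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// The app sends the timestamp of its last successful sync.
$since = isset($_GET['since']) ? $_GET['since'] : '1970-01-01 00:00:00';

$stmt = $pdo->prepare(
    'SELECT id, question, option_a, option_b, option_c, option_d, answer, updated_at
     FROM questions
     WHERE updated_at > :since'
);
$stmt->execute(['since' => $since]);

header('Content-Type: application/json');
echo json_encode([
    'server_time' => date('Y-m-d H:i:s'),              // client stores this for the next sync
    'rows'        => $stmt->fetchAll(PDO::FETCH_ASSOC),
]);
```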
We have a system in which a large part of the functionality is the ability to upload and download files if you are logged in and have the correct permissions.
What we are looking at doing to help with organization from the user's point of view is having a virtual-file-system style layout.
Even if (or even preferably) all the users' files are actually just stored in one directory, and the virtual file system is just a view generated from the database.
What we are wondering, before we invest in creating this, is whether it already exists somewhere: open source (but able to be used in commercial software), free, or paid (the first two preferably!).
A simple file system on top of PHP can be provided by WebDAV, using a WebDAV server built in PHP:
http://sabre.io/
This would be a good example, but there are others as well. WebDAV is essentially a web-based file system (http://en.wikipedia.org/wiki/WebDAV).
This would not only provide a file system, but would also let you edit files directly with Word/Excel (2007+). Showing a treeview of folders and files would then be quite trivial, using a few database tables and some jQuery components such as jsTree and jqGrid.
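For reference, a minimal sabre/dav entry point looks roughly like the quick-start below. The files/ directory and base URI are assumptions, and method names can shift between sabre/dav releases, so check the documentation for the version you install.

```php
<?php
// server.php — minimal WebDAV endpoint sketch following the sabre/dav
// quick-start. Directory path and base URI are placeholders.
require 'vendor/autoload.php';

use Sabre\DAV;

// Serve one real directory; a "virtual" layout could instead be a custom
// collection class that builds the folder tree from your database.
$rootDirectory = new DAV\FS\Directory(__DIR__ . '/files');

$server = new DAV\Server($rootDirectory);
$server->setBaseUri('/server.php');

// Optional plugin that renders a browsable HTML view of the tree.
$server->addPlugin(new DAV\Browser\Plugin());

$server->exec();
```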
Although if you are searching for a full document management system, http://www.opendocman.com/ or http://code.google.com/p/simpledoc/ would be more than enough.
It really depends on how many features you are going to incorporate into this system. Will there be OCR? Would you like to store the files in a cloud service? How much user input will there be (is a simple upload enough, or do cameras, scanners and other devices need to be supported as well)?
As for commercial products, you could check out Microsoft SharePoint (http://en.wikipedia.org/wiki/Microsoft_SharePoint) or IBM Lotus Notes (http://en.wikipedia.org/wiki/IBM_Lotus_Notes).
I'm currently working on developing a PHP/MySQL property classifieds website where people can register and manually add property adverts. This is all working fine, but I now need to add the functionality to bulk-upload property adverts.
There are two ways I need to do this. The first is via XML: a member who is registered on our site can add the URL of an XML file on their server on their account page on our website. Our automated script will read through the XML file each evening and populate our MySQL database using the details and images from their feed.
The second part is where I am struggling. Some estate agents want to be able to upload their properties to our website by FTPing a ZIP file, containing a CSV file and images, to our server each evening, so we can then read through the CSV file and populate the MySQL database from it.
How would we go about giving each estate agent a place on our server to FTP their files to? Could I automatically create a directory on our server named after their username where only they had FTP access? Would I be able to automatically create FTP accounts on my server?
Please note I am running a Linux server with cPanel installed. My website is developed in PHP with a MySQL database.
Any advice on the best methods to implement this functionality would be appreciated.
You'll be better off offering an HTTPS file upload rather than FTP, because you can secure that with your existing PHP/MySQL authentication system and it doesn't require any technical knowledge from your clients. Then you can use PHP to parse the ZIP file and check it contains what you need in real time, and provide instant feedback if images are missing or the CSV is corrupt, saving you some customer support effort.
See http://www.php.net/manual/en/class.ziparchive.php
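As a sketch of what that real-time check could look like, the snippet below assumes an upload field named 'bundle' and a ZIP that should contain a properties.csv plus the images it references; the column order and file names are placeholders.

```php
<?php
// process_upload.php — hedged sketch of validating an agent's uploaded ZIP.
$tmpZip  = $_FILES['bundle']['tmp_name'];
$workDir = sys_get_temp_dir() . '/import_' . uniqid();

$zip = new ZipArchive();
if ($zip->open($tmpZip) !== true) {
    exit('Could not read the ZIP file.');
}
mkdir($workDir);
$zip->extractTo($workDir);
$zip->close();

$csvPath = $workDir . '/properties.csv';
if (!is_file($csvPath)) {
    exit('properties.csv is missing from the ZIP.');
}

// Check each referenced image exists before touching the database.
$handle = fopen($csvPath, 'r');
while (($row = fgetcsv($handle)) !== false) {
    list($reference, $title, $price, $imageFile) = $row;   // assumed column order
    if (!is_file($workDir . '/' . $imageFile)) {
        echo "Row $reference: image $imageFile is missing.\n";
        continue;
    }
    // ...insert or update the property in MySQL here...
}
fclose($handle);
```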
Automating the creation of FTP accounts would require pam_mysql, and may cause a conflict with cPanel. If you really want to do that, ask over on serverfault.com
I built a small application using PHP/MySQL. After one year of development, I feel that I need to update the MySQL table structures.
I do that using SELECT * INTO OUTFILE. It works on my local server but not for users who host their applications with a hosting company that uses something like cPanel (or another control panel provided by the host so the user can manage their web pages, emails, FTP accounts, databases, etc.).
That type of hosting does not give full root access to any user because the same hardware is shared with many other clients.
edit:
I forgot to tell you that I need to do this with a PHP script I made, so that people can run it on their servers and the update happens automatically; there is nothing for the user to do. The script will do the following (a sketch is given after the list):
1) Get a copy of all data in each table.
2) Delete/drop all tables.
3) Create new tables with the new structure.
4) Import the data again into the new tables.
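One way to do those four steps without SELECT ... INTO OUTFILE (which needs the FILE privilege that shared hosts usually withhold) is to keep the copy inside MySQL itself: create the new structure under a temporary name, copy the rows across with INSERT ... SELECT, then swap and drop. The sketch below reorders the steps slightly for that reason, and the table, columns and credentials are placeholders.

```php
<?php
// migrate.php — hedged sketch of the steps above for a single table, done
// entirely through the normal database connection (no FILE privilege needed).
$pdo = new PDO('mysql:host=localhost;dbname=myapp;charset=utf8mb4', 'dbuser', 'secret', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// 3) create the new structure under a temporary name
$pdo->exec('CREATE TABLE customers_new (
    id INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    email VARCHAR(190) NOT NULL,
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
)');

// 1) + 4) copy the existing data straight in, mapping old columns to new ones
$pdo->exec('INSERT INTO customers_new (id, name, email)
            SELECT id, name, email FROM customers');

// 2) swap the tables, then drop the old copy
$pdo->exec('RENAME TABLE customers TO customers_old, customers_new TO customers');
$pdo->exec('DROP TABLE customers_old');
```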
Use phpMyAdmin, available from cPanel.
If it's working on your local machine, then it is highly likely that the user you are using to connect to the DB does not have all the privileges required; SELECT ... INTO OUTFILE in particular needs the FILE privilege, which shared hosts rarely grant.
You can use cron jobs to regularly export (dump) your databases. Also use phpMyAdmin, as JohnD mentioned.
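For the cron route, a small wrapper script like the one below can be pointed to by a cPanel cron entry. It assumes the host allows exec() and has mysqldump available; the credentials, database name and backup directory are placeholders.

```php
<?php
// nightly_dump.php — hedged sketch of a cron-driven export.
$backupDir = __DIR__ . '/backups';
if (!is_dir($backupDir)) {
    mkdir($backupDir, 0700, true);
}

$file = $backupDir . '/myapp_' . date('Y-m-d') . '.sql';

$cmd = sprintf(
    'mysqldump --user=%s --password=%s %s > %s',
    escapeshellarg('dbuser'),
    escapeshellarg('secret'),
    escapeshellarg('myapp'),
    escapeshellarg($file)
);

exec($cmd, $output, $exitCode);

if ($exitCode !== 0) {
    // most hosts mail cron output to the account owner, so this surfaces failures
    echo "Backup failed with exit code $exitCode\n";
}
```

A cPanel cron entry would then just run php /home/youruser/nightly_dump.php on whatever schedule you choose.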