I'm testing a website I made in a local environment, using SQL Server LocalDB as I said above. The site is done; now I've written a PHP script that connects to the database and sends out some emails based on its contents. I need this script to run automatically, so I've already set up a batch file in the Windows Task Scheduler to run every hour. No problems there. The issue is that sometimes (pretty often, actually) the PHP script can't connect to the database. This seems to happen mostly right after I've been browsing the website myself, so my question is:
Does LocalDB allow only one connection at a time?
If you have other ideas, please tell me. I need to finish testing ASAP.
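A minimal sketch of the connection step in such a script, assuming the sqlsrv extension (instance name, database, and retry values are placeholders; the retry loop is one possible workaround if the failures turn out to be LocalDB's auto-shutdown and startup delay rather than a connection limit):

```php
<?php
// Sketch only: connect to LocalDB from a scheduled PHP script, retrying
// a few times in case the instance is still spinning up.
$server = '(localdb)\\MSSQLLocalDB';          // default LocalDB instance
$info   = array(
    'Database'     => 'MySiteDb',             // placeholder DB name
    'LoginTimeout' => 30,
);

$conn = false;
for ($attempt = 1; $attempt <= 3 && $conn === false; $attempt++) {
    $conn = sqlsrv_connect($server, $info);
    if ($conn === false) {
        sleep(5); // give the LocalDB instance time to start
    }
}

if ($conn === false) {
    die(print_r(sqlsrv_errors(), true));
}
```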
In our office we have a biometric scanner that inserts records into an MS Access database running on one of our local servers. That's just how the device is built, and we can't get into it to modify how it works.
We created a web-based attendance system that needs the biometric information, since the online system allows users to time in via either the online form or the biometric scanner.
Our current setup is that every minute the local server runs a scheduled task (a PHP script) that pushes the data to the remote server where the online app is hosted.
That delay isn't very nice, and we'd like the data from the local server to sync with our online app right away, especially since sometimes the local server just dies and we don't know why.
TL;DR:
Is there a way to monitor the local database (MS Access) for changes and push those changes to our remote server, using NodeJS or PHP? If there are other solutions available, those are welcome as well.
The local server runs a driver called ZK Attendance Manager with an MS Access database. The remote server uses MySQL.
Thank you!
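For reference, a tighter polling loop that could replace the one-minute task might look roughly like this (a sketch only, assuming PHP with the Access ODBC driver on the local Windows server; the DSN, table, and column names are placeholders for whatever ZK Attendance Manager actually writes):

```php
<?php
// Sketch: push new Access rows to the remote MySQL server, remembering
// the last row already synced so each run is cheap enough to schedule
// every few seconds instead of every minute.
$access = new PDO('odbc:Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=C:\\ZK\\attendance.mdb');
$mysql  = new PDO('mysql:host=remote.example.com;dbname=attendance', 'user', 'secret');

$lastId = (int) file_get_contents(__DIR__ . '/last_synced_id.txt');

// Pull only rows we have not pushed yet.
$rows = $access->query("SELECT id, user_id, checktime FROM CHECKINOUT WHERE id > $lastId ORDER BY id");
$push = $mysql->prepare('INSERT INTO time_logs (user_id, checktime) VALUES (?, ?)');

foreach ($rows as $row) {
    $push->execute(array($row['user_id'], $row['checktime']));
    $lastId = (int) $row['id'];
}

file_put_contents(__DIR__ . '/last_synced_id.txt', $lastId);
```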
Even in SQL Server I try to avoid triggers if at all possible. Access does not have them per se... but here is an article I found when looking this up, because sometimes you just need the functionality. Granted, I've never used a Data Macro, but from what the article says it may be able to help in your situation.
Read:
MS Access trigger?
I have just started using CodeIgniter for my PHP application, and I need some help regarding database creation. Instead of creating the database manually, is it possible for the CI application, once installed on a server, to execute a DB script that creates the DB structure if the database is empty, and to skip creation otherwise? In other words, every time the Apache server is started or restarted, my CI application should run this script, check whether the database structure is up to date, and create it if it doesn't exist or is out of date.
This way I want to make sure that after the first install, or any time the application runs, the DB is created and ready to use.
Hope I have explained my requirement clearly. Please let me know if any kind of configuration would help here.
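One way to get close to this without hooking Apache restarts is CodeIgniter's own migration library, which can check and update the schema on every request. A sketch for CI3 (table and field names are placeholders):

```php
<?php
// application/migrations/001_create_users.php
defined('BASEPATH') OR exit('No direct script access allowed');

class Migration_Create_users extends CI_Migration {

    public function up()
    {
        $this->dbforge->add_field(array(
            'id'    => array('type' => 'INT', 'auto_increment' => TRUE),
            'email' => array('type' => 'VARCHAR', 'constraint' => 255),
        ));
        $this->dbforge->add_key('id', TRUE);          // primary key
        $this->dbforge->create_table('users', TRUE);  // TRUE = IF NOT EXISTS
    }

    public function down()
    {
        $this->dbforge->drop_table('users', TRUE);    // TRUE = IF EXISTS
    }
}
```

With `$config['migration_enabled'] = TRUE;` set in application/config/migration.php, calling `$this->load->library('migration');` and `$this->migration->current();` in a base controller's constructor brings the schema up to date whenever the application runs, which matches the "check on startup" behaviour you describe.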
Our website currently backs up every night to a separate server that we have, which is fine, but when we go to download the files the next day it takes a long time (usually around 36,000+ images). Downloading everything the following day takes quite some time and affects the speed of everyone else using our network, so ideally we would do this in the middle of the night, except there's no-one here to do it.
The server that the backup is on runs cPanel, which appears to make it fairly simple to run a PHP file as a cron job.
I'm assuming the following; feel free to tell me I'm wrong.
1) Since the backup server runs cPanel, it shouldn't be too difficult to set up a PHP script as a cron job that runs in the middle of the night.
2) That cron job could run a PHP script that uses PHP's FTP functions to connect to another server and start the backup of these files.
3) We are running XAMPP on a Windows platform. It includes FileZilla Server, so I'm assuming it should be able to accept incoming FTP connections.
4) Overall, we could deploy a script on the backup server that runs every night and sends the files back to my local computer running XAMPP.
So that's what I'm guessing, but I'm getting stuck at the first hurdle. I've tried to create a script that runs on our local computer and sends a specified folder to the backup server when it executes, but all I can find are scripts that deal with single files. Although I have some experience with PHP, I haven't touched the FTP functions before, and they're giving me some problems. I've tried the other examples here on Stack Overflow with no success :(
I'm just looking for the simplest form of a script that can upload a folder to a remote IP. Any help would be appreciated.
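For what it's worth, a minimal sketch of a recursive folder upload using PHP's FTP functions (host, credentials, and paths are placeholders):

```php
<?php
// Sketch: walk a local directory and mirror it to an FTP server.
function ftp_put_dir($conn, string $localDir, string $remoteDir): void
{
    @ftp_mkdir($conn, $remoteDir); // ignore "already exists" errors

    foreach (scandir($localDir) as $entry) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $local  = $localDir . '/' . $entry;
        $remote = $remoteDir . '/' . $entry;

        if (is_dir($local)) {
            ftp_put_dir($conn, $local, $remote); // recurse into subfolders
        } else {
            ftp_put($conn, $remote, $local, FTP_BINARY);
        }
    }
}

$conn = ftp_connect('192.0.2.10');      // backup server (placeholder IP)
ftp_login($conn, 'backupuser', 'secret');
ftp_pasv($conn, true);                  // passive mode is friendlier to firewalls
ftp_put_dir($conn, 'C:/backups/images', '/incoming/images');
ftp_close($conn);
```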
There is a fair amount of overhead involved in transferring a bunch of small files over FTP; I've seen such jobs take 5x as long, even over a local network. It is by far easier to pack the files into something like a zip and send them as one large file.
You can use exec() to run zip from the command line (or whatever compression tool you prefer). After that, you can send the archive over FTP pretty quickly (you said you found methods for transferring a single file). For backup purposes, having the files zipped will probably make things easier to handle, but if you need them unzipped you can set up a job on the other machine to unpack the archive.
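A sketch of that zip-then-transfer approach (paths, host, and credentials are placeholders; it assumes a zip binary is on the server's PATH):

```php
<?php
// Sketch: pack the image folder into one archive, then send it over FTP.
$archive = '/tmp/images-' . date('Y-m-d') . '.zip';
exec('zip -r ' . escapeshellarg($archive) . ' ' . escapeshellarg('/home/site/images'));

$conn = ftp_connect('192.0.2.10');
ftp_login($conn, 'backupuser', 'secret');
ftp_pasv($conn, true);
// One large file instead of 36,000 small ones.
ftp_put($conn, '/incoming/' . basename($archive), $archive, FTP_BINARY);
ftp_close($conn);
```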
I wanted to connect to my server via SSH and run a PHP script to enter some data into the MySQL database.
I couldn't do this because I don't have SSH access.
So instead I'm just going to put a PHP script into one of my web folders, put the data file in the same folder, and run the script by loading it in the browser.
This seems like a really weird way to enter data into a database, but is it OK?
Using a PHP script to execute an SQL script should not be a problem (but be sure to delete both afterwards, so you don't leave an unvalidated, unregulated entry point into your database out there).
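For illustration, such a one-off script can be as small as this (a sketch; the file name and credentials are placeholders):

```php
<?php
// Sketch: run every statement in an uploaded SQL file, then report.
// Delete this script and the .sql file once you're done.
$db = new mysqli('localhost', 'db_user', 'secret', 'my_db');

$sql = file_get_contents(__DIR__ . '/data.sql');
if ($db->multi_query($sql)) {
    // Drain every result set so errors in later statements surface.
    do {
        if ($result = $db->store_result()) {
            $result->free();
        }
    } while ($db->more_results() && $db->next_result());
}
echo $db->error ?: 'Import finished.';
```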
If your webhost provides a MySQL Admin interface (often phpMyAdmin), you should be able to access that through their Control Panel (often called "cPanel" or "Plesk"). You should be able to upload and execute an SQL file through that interface without installing anything else.
Failing that, you should be able to install Adminer, a lightweight single-file alternative to phpMyAdmin, which you can upload to your server and access through a web browser to, again, upload or copy-and-paste your SQL script into.
So you are basically rebuilding phpMyAdmin's behaviour. I would just install phpMyAdmin; but as long as your PHP script is protected (.htaccess or similar), your approach should be no problem. Look out for timeouts.
A good tool for working with a MySQL database is MySQL Workbench, but you must have remote access to your DB...
I have already read a few threads here, and I also went through the MySQL Replication documentation (including Cluster Replication), but I think it's not what I really want, and it's probably too complicated, too.
I have a local and a remote DB that might both be accessed/manipulated by two different people at the same time. What I'd like to do is sync them as soon as possible (meaning instantly, or as soon as the local machine goes online). Both DBs are only manipulated by my own PHP scripts.
My approach is the following (see the sketch after this list):

If the local machine is online:
1. Let my PHP script on the local machine always send each SQL query to the remote DB too.
2. Let my PHP script on the remote machine always store its queries, and...
3. ...let the local machine ask the remote DB every x minutes for new queries and apply them locally.

If the local machine is offline:
Do step 2 on both machines, and send the locally stored queries to the remote DB as soon as the local machine goes online again. Also pull the stored queries from the remote machine, of course.
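A rough sketch of what that store-and-replay idea could look like (the table, host, and credential names are placeholder assumptions; note that real replication handles conflicts and ordering far more carefully than this):

```php
<?php
// Sketch: replay queued queries, then dual-write new ones, assuming a
// pending_queries (id, query_text) table exists on each side.
$local  = new mysqli('localhost', 'user', 'secret', 'app_db');
$remote = new mysqli('remote.example.com', 'user', 'secret', 'app_db');

// 1. Replay everything the remote side queued while we were offline.
$pending = $remote->query('SELECT id, query_text FROM pending_queries ORDER BY id');
while ($row = $pending->fetch_assoc()) {
    $local->query($row['query_text']);
    $remote->query('DELETE FROM pending_queries WHERE id = ' . (int) $row['id']);
}

// 2. From now on, every write goes to both databases; if the remote is
//    unreachable, queue the query locally for the next sync run.
function dual_write(mysqli $local, ?mysqli $remote, string $sql): void
{
    $local->query($sql);
    if ($remote !== null) {
        $remote->query($sql);
    } else {
        $stmt = $local->prepare('INSERT INTO pending_queries (query_text) VALUES (?)');
        $stmt->bind_param('s', $sql);
        $stmt->execute();
    }
}
```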
My questions are:
Did I just misunderstand replication, or am I right that my way would be easier in my case? Or is there another good solution for what I'm trying to accomplish?
Any idea how I could check whether my local machine is online/offline? I guess I'd have to use JavaScript, but I don't know how. The browser/my script would always be running on the local machine.
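On the second question: since the script runs on the local machine anyway, plain PHP can do the online check without JavaScript, for example by probing the remote MySQL port (host and timeout are placeholders; 3306 is MySQL's default port):

```php
<?php
// Sketch: treat the local machine as "online" if the remote MySQL
// port answers within two seconds.
$fp = @fsockopen('remote.example.com', 3306, $errno, $errstr, 2);
$online = ($fp !== false);
if ($fp) {
    fclose($fp);
}
```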
What you're describing is master-master or multi-master replication. There are plenty of tutorials on how to set this up across the web. Definitely do your research before putting a solution like this into production as replication in MySQL isn't exactly elegant -- you need to know how to recover if (when?) something goes wrong.
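For orientation, the usual master-master starting point looks like this in each server's my.cnf (a sketch with illustrative values; the increment/offset pair is the standard trick to keep AUTO_INCREMENT values from colliding):

```ini
# --- Server A (my.cnf) ---
server-id                = 1
log_bin                  = mysql-bin
auto_increment_increment = 2    # both servers step AUTO_INCREMENT by 2
auto_increment_offset    = 1    # A hands out odd IDs

# --- Server B (its own my.cnf) ---
# server-id                = 2
# log_bin                  = mysql-bin
# auto_increment_increment = 2
# auto_increment_offset    = 2  # B hands out even IDs, so IDs never collide
```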