I need to create a webapp to display, and allow editing of, a set of data.
This data is contained in an Access database file used by another application (a desktop application).
I'm evaluating the best way to carry out this job.
Unfortunately, my proposal to migrate to another database solution (an RDBMS such as MySQL or Postgres) was rejected by the customer.
The issue here is how to maintain data integrity and keep the data synchronized between the server and the desktop machine that runs the application which also uses this data.
All I need to do is read data, store edited or new data, give authorized users an interface to review this newly inserted data (thus validating it), and import it into the original Access database.
I've found the following possible solutions (to update the desktop mdb copy), but each of them has pros and cons:

1. Remote access to the Windows machine
   - exposes the machine to unauthorized access
2. Use rsync to keep the files synchronized (once a day)
   - if the mdb on the client has been edited with the desktop application, there will be data loss
   - the copy can only be updated once all data has been validated
   - the data won't really be synchronized (until rsync runs)
3. A client-server application
   - can use secure layers to protect data against attackers
   - a third application (on the desktop) is required
   - synchronization requires authorized users to run this third application to import data (it will query the remote DB and update the local mdb)
Do you know of some other way that could help me get this done?
I'm leaning toward the client-server model, even if it would be more expensive, because it's the only way I can see to make this work.
Do you see any other pros/cons of the proposed solutions?
I haven't chosen the programming language yet, but I was thinking of using PHP and/or Python.
The remote environment (for the server) can be either Windows or *nix (preferred).
Thanks.
The first idea:
"exposes the machine to unauthorized access"
This is not really a valid argument. Everything you put on the Internet is exposed, and it is not as though it cannot be further protected via SSL/TLS. Even RDP can be secured via an SSH tunnel, for example.
To my mind, the easiest and most elegant way to do this is with web services (SOAP). Write the server code that does the inserts/updates on the Access database in something like Python or Java. Generate a WSDL from the working code. From the WSDL you can generate a client for PHP/Python. Now all you have to do is write the web interface that uses the PHP/Python client.
For security, SSL plus Basic authentication should be enough (supported by SOAPpy in the case of Python, for example).
You can use pyodbc to connect to the Access database.
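For instance, here is a minimal sketch of that server-side piece, assuming the Microsoft Access ODBC driver is installed on the Windows machine. The file path, table, and column names are hypothetical:

```python
# Minimal sketch: reading from and writing to an .mdb with pyodbc.
# Assumes the Microsoft Access ODBC driver is installed; the path,
# table, and columns below are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\data\app.mdb;"
)
cursor = conn.cursor()

# Read existing rows for the web interface.
cursor.execute("SELECT id, name FROM products")
for row in cursor.fetchall():
    print(row.id, row.name)

# Insert a record that an authorized user has validated.
cursor.execute(
    "INSERT INTO products (name, price) VALUES (?, ?)",
    ("widget", 9.99),
)
conn.commit()
conn.close()
```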
Well, you could use two databases and synchronize changes between them with a sort of web service, separating the web server's DB (which could be a modern MySQL or whatever) from the current Access DB.
You would build a sort of REST API that returns new or changed records for GET requests, deletes for DELETE requests, etc., using a timestamp passed in the HTTP request.
Then each side could query the other side for new records with a scheduled job (transferring JSON), keeping the records relatively in sync.
You could take care of security by exposing the application DB only on a certain port and only to HTTP queries coming from the webapp server's IP address, and also by using HTTP auth, hashes, etc.
If this isn't a heavy-load, high-concurrency app (which I guess it isn't, since you use Access as the DB), this should work.
You could build this kind of mini-API with any Python web framework, such as TurboGears 2.1 or Django, or with a micro-framework like Bottle or Flask.
P.S. If you prefer Python (and why wouldn't you?), don't use pyodbc directly; work with Python's beautiful ORM - SQLAlchemy is much better.
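As a rough illustration of the mini-API idea, here is a sketch in Flask with SQLAlchemy; the connection URL, table, and `updated_at` column are hypothetical, and in practice you would add the HTTP auth and IP filtering mentioned above:

```python
# Sketch of the timestamp-based mini-API described above.
# The DB URL, table, and column names are hypothetical.
from flask import Flask, request, jsonify
from sqlalchemy import create_engine, text

app = Flask(__name__)
engine = create_engine("mysql+pymysql://user:pass@localhost/webdb")

@app.route("/records", methods=["GET"])
def changed_records():
    # The other side passes the timestamp of its last successful sync.
    since = request.args.get("since", "1970-01-01 00:00:00")
    with engine.connect() as conn:
        rows = conn.execute(
            text("SELECT id, name, updated_at FROM records "
                 "WHERE updated_at > :since"),
            {"since": since},
        )
        return jsonify([dict(r._mapping) for r in rows])

if __name__ == "__main__":
    # Bind locally and restrict access via firewall/IP rules,
    # as suggested above.
    app.run(host="127.0.0.1", port=8080)
```

Each side would then run a scheduled job that calls the other side's `/records` endpoint with its last sync timestamp and applies the returned JSON rows locally.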
I think how this works really depends on the authentication issue and the number of users who need to review the data.
The reason I ask?
You can consider using Access 2010 and Office 365. This allows you to have linked tables to the cloud, but the tables are also cached locally on your Access desktop. This means real-time sync/replication of the data is used, and this is automatic in Access 2010 (so you don't have to write any code).
What this means is that while running the Access desktop application, you can pull the plug on the network and it will continue to run. The instant you have wifi or a connection again, local changes are synced up to Office 365. Even better, you can now build web forms in Access.
Data touched or edited (or new records on either side) will come down the pipe to your local computer. So if you add records in the Access client, the web users will ALSO see these new records.
So Access 2010 now has web publishing, and this works with the new Office 365. The price starts at $6 per month. And if it's just for a few users, have them all log on using the same account! This means you can have this all up and running in less time than it took to make this post, and for less than $10 per month!
For those not aware, Access 2010 has web publishing. When you publish the Access forms, they are converted to .NET (XAML) forms, and the code is converted to JavaScript, so form code actually runs browser-side.
Since the system runs on Office 365, you're running on some heavy-duty iron, and you can in theory scale out to millions of users with this setup. When you publish the Access application to Office 365, on the server side you are no longer using mdb or Access files, but what is called Access Web Services. The tables in fact become the equivalent of SharePoint lists. And new for SharePoint 2010, those lists now have relational features like cascade delete.
The real beauty of this system is that you can write, create, and do everything inside of Access without having to learn or touch ANY KIND of server-side technology. Here is a short video of mine; at the halfway point I run the Access application with nothing more than a web browser.
http://www.youtube.com/watch?v=AU4mH0jPntI
There is no ActiveX or even Silverlight required. In fact, my Access applications run fine on an iPad using the Safari web browser.
So you could consider continuing to use Access, and just publish your application to the web with the new Access 2010 features.
Related
I'm developing a web app using Laravel (a PHP framework). The app is going to be used by about 30 of my co-workers on their Windows laptops.
My co-workers interview people on a regular basis. They will use the web app to add a new profile to a database once they interview somebody for the first time and they will append notes to these profiles on subsequent visits. Profiles and notes are stored using MySQL, but since I'm using Laravel, I could easily switch to another database.
Sometimes, my co-workers have to interview people when they're offline. They might visit a group of interviewees, add a few profiles and add some notes to existing ones during a session without any internet access.
How should I approach this?
1. With a local web server on every laptop. I've seen applications ship with some kind of installer including a LAMP stack, but I can't find any documentation on this.
2. I could install the app and something like XAMPP on every laptop myself. That would be possible, but in the future more people might use the app, and not all of them might be located nearby.
3. I could use Service Workers, maybe in connection with a library such as UpUp. This seems to be the most elegant approach.
I'd like to give option (3) a try, but my app is database driven and I'm not sure whether I could realize this approach:
Would it be possible to write all the (relevant) data from the DB to - let's say - a JSON file which could be accessed instead of the DB when in offline mode? We don't have to handle much data (less than 100 small data records should be available during an interview session).
When my co-workers add profiles or notes in offline mode, is there any "web service"-style way to insert the data they entered into the DB?
Thanks
Pida
I would think of it as building the app in "two parts".
First, the front end uses ajax calls to the back end (which is nothing but a REST API). If there isn't any network connection, store the data in the browser using local storage.
When the user later has a network connection, you send the data that exists in local storage to the back end and clear the local storage.
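The browser half of this (queueing into local storage and retrying over ajax) is plain JavaScript; the back-end half could be as small as the hypothetical endpoint below, sketched in Python/Flask for brevity even though the asker's back end is Laravel - the shape of the endpoint is the same:

```python
# Sketch of a batch-sync endpoint for records queued offline.
# Field names are hypothetical; a client-generated UUID per record
# keeps a retried upload from creating duplicate rows.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/api/sync", methods=["POST"])
def sync_offline_records():
    queued = request.get_json()  # list of profiles/notes from local storage
    saved = []
    for item in queued:
        # Insert into the real database here, keyed on item["uuid"].
        saved.append(item.get("uuid"))
    # The client clears local storage only after receiving this 200.
    return jsonify({"saved": saved}), 200
```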
If you add web servers on the laptops, the databases and info will only be stored on their local laptops and would not be synced.
You can build what you describe using service workers to cache your site's static content to make it available offline, and a specific fetch handler in the service worker to detect a failed PUT or POST and queue the data in IndexedDB. You'd then periodically check IndexedDB for any queued data when your web app is loaded, and attempt to resend it.
I've described this approach in more detail at https://developers.google.com/web/showcase/case-study/service-workers-iowa#updates-to-users-schedules
That article assumes the use of the sw-precache library for caching your site's static assets, and the sw-toolbox library to provide runtime fetch handlers that check for failed business-logic requests. It also uses a promise-based IndexedDB wrapper called simpleDB, although I'd probably go with the more recent idb library nowadays.
At work we have an enterprise store, meaning we can kind of bypass most of the main Apple App Store regulations. We have a special data-management system written in CodeIgniter, with MySQL as the database engine, served by Apache.
We are now getting more and more requests to run the system offline on the iPad. I've tried to use LocalStorage and the like, yet it's just not enough and not stable enough (WebStorage/WebSQL are glitchy), and the allowed storage size is too small to fit all the offline buffered data.
I know this is very ugly, but as we mostly know, customers always find the weirdest ways of requesting features, and our sales team always manages to push them through without consulting us :P.
I did browse Google/DuckDuckGo and CocoaPods for a while, but I can't really find anything that embeds a PHP server within Swift (Objective-C would be OK too), serving it via Apache/Nginx/FastCGI with MySQL (which I could substitute with SQLite3).
I was wondering if anyone has experience with running an internal server in Swift/Objective-C in this fashion.
If you wish to keep your current stack of technologies, you could use something like Realm. It is a replacement for Core Data, and it allows you to easily create objects from a JSON REST API and store them in the local database. But you still have to write some application-specific code to keep the data on the mobile device in sync with the server, and you have to have RESTful services that produce JSON on the server.
If you're ready to switch your persistence stack, you could use Couchbase Mobile that allows you to transparently sync your data on the device with data in your backend database, back and forth. But then you have to use Couchbase on the server.
If you want server-side Objective-C, look at https://github.com/depinette/backtoweb
I have not updated this framework in a while but it worked for me.
It's based on FastCGI, and it can be used with the Apache server integrated into OS X.
I suppose you could use Swift instead of Objective-C.
Basically, I want to provide a web application (built with PHP, MySQL, and Apache) to users, with source code, in case they don't have an Internet connection. But with that, I have to take care that the web application package (with Apache, PHP, MySQL, and the actual application with data) cannot be copied and run on another machine (maybe we can bind authentication to the hard disk serial ID).
The first solution that came to mind was to build a standalone application, but we don't have that option because we are limited to a web application only.
One solution I thought of is to create a web-browser-like container (which may use one of the system's browsers inside), in Java or any other standalone programming language, where we have additional authentication for the current machine and internally it uses the system's browser for HTTP requests/responses.
Please share your ideas about the feasibility/implementation of the above solution, or any other better solution.
One thing to keep in mind: we are providing all the source code with the servers, so authentication in the database or PHP won't be of much use.
"But with that, I have to take care that the web application package (with Apache, PHP, MySQL, and the actual application with data) cannot be copied and run on another machine (maybe we can bind authentication to the hard disk serial ID)."
This is, strictly speaking, impossible.
"The first solution that came to mind was to build a standalone application, but we don't have that option because we are limited to a web application only."
Have you ever heard of IDA Pro? JD-GUI? ILSpy?
Stand-alone applications can trivially be reverse-engineered. This will protect nothing.
Your best options are:
Provide a cloud service, which is totally agnostic towards HTTP clients, so you can own the back-end machines that contain your source code, then give your customers a dumb open source front-end that speaks to the back-end.
Enforce your software policies (e.g., customers may only run one copy of the software) with the appropriate tool for the job: lawyers and contracts.
My client has an offline product database for a high-street shop, which they update fairly frequently for their own purposes. They are now creating an online store in which they want to use product information from this database.
Migrating the database to a hosted server and abandoning the offline database is not an option due to their current legacy software set up.
So my question is: how can I get the information from their offline database into an online database? Their local server is always connected to the internet, so is it possible to create a script on the website that somehow grabs the data from their server and imports it into the online server? If this ran every 24 hours, it would be perfect. But is it even possible? And if so, how would I do it?
The only other option I can think of is to manually upload the database after every update, but this isn't really a viable idea.
I did something like this with QuickBooks using an ODBC connection; with that I synced data to MySQL. This synchronization, however, was one-way only. Unless you have keys in the data indicating when something was changed (an updated date), you will end up syncing a lot of extra data.
Using SQLyog, I set up a scheduled job that connected to the ODBC data source and pushed the changes since the last sync to the MySQL database I was using to generate reports. If you can get the data replicated into MySQL, it should be easy at that point to make use of it in your online store.
The downside is that it won't be real-time. Inventory could become a problem.
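A hand-rolled Python version of that scheduled one-way sync might look roughly like this; the DSN, table, and column names are hypothetical, and it assumes the source table carries an updated-timestamp column, as noted above:

```python
# One-way sync sketch: pull rows changed since the last run from the
# ODBC source and push them to MySQL. DSN/table/column names are
# hypothetical; requires an "updated" timestamp column on the source.
import pyodbc
import pymysql

LAST_SYNC_FILE = "last_sync.txt"

def last_sync():
    try:
        with open(LAST_SYNC_FILE) as f:
            return f.read().strip()
    except FileNotFoundError:
        return "1970-01-01 00:00:00"

src = pyodbc.connect("DSN=quickbooks_source")
dst = pymysql.connect(host="localhost", user="report",
                      password="secret", database="reports")

rows = src.cursor().execute(
    "SELECT id, name, updated FROM items WHERE updated > ?",
    last_sync(),
).fetchall()

with dst.cursor() as cur:
    # REPLACE INTO is MySQL-specific: insert new rows, overwrite changed ones.
    cur.executemany(
        "REPLACE INTO items (id, name, updated) VALUES (%s, %s, %s)",
        [(r.id, r.name, r.updated) for r in rows],
    )
dst.commit()

# Record the new high-water mark for the next scheduled run.
if rows:
    with open(LAST_SYNC_FILE, "w") as f:
        f.write(str(max(r.updated for r in rows)))
```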
In an ideal world I would look at creating a RESTful API that runs on the same server, or at least on the same network, as your offline database. This RESTful API would run as a web server over HTTP and return JSON (or even XML) structures of data from the offline database. Clients on the internet would be able to connect and fetch any data they need, at any time. A RESTful API like this has a number of advantages.
Firstly, it's secure. You don't have to open up an attack vector to the public by making connections to your offline database public. The only thing you have to do is enable public access to your RESTful API. In your API's logic you might not even include functionality to write to the database, so even if your API's security is compromised, at the very worst all attackers can do is read your data, not corrupt it.
Having a RESTful API in this situation also represents a good separation of concerns. Your client code should not know anything about the database, nor about any internal systems the offline database uses. What happens when your client wants to update their offline system, or even change it? In that situation all you would have to do is update the RESTful API. The client connecting to the data no longer cares about anything but the API, so changing databases would be easy.
Another reason to consider an API is concurrency. I hinted at this before, but having an API would be great if you ever need more than one client accessing the offline database's data. In a web server setup where the API sits waiting for requests, there is no reason why you could not have more than one client connecting to the API at the same time. HTTP is really good at this!
You talked about having to place the old data in a new database. Something like this could be done easily with a RESTful API, as you would just have to map the endpoints of your API to tables in the new database and run that when needed. You could even forgo the new database and use the API as your backend. This solution would require some caching, but it would cut down on the duplication of a database if you don't feel it's needed.
The drawback to all of this is that writing an API, as opposed to a script, is more complex. So in this situation I believe in horses for courses. If this database is the backbone of a long-term project that will be expanding in the future, an API is the way to go. If it's a small part of your project, then maybe you can swing it with a script that runs every 24 hours; however, I have done this before, and the second I have to change/edit the solution, things start getting a little "hairy". Hope this helps, and good luck with it.
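To make the read-only point concrete, a sketch of such an API in Python/Flask might look like the following; the ODBC DSN and table names are hypothetical, and because only GET endpoints exist there is simply no write path to compromise:

```python
# Read-only API sketch: GET-only endpoints mapped onto a whitelist of
# tables in the offline database. DSN and table names are hypothetical.
from flask import Flask, jsonify, abort
import pyodbc

app = Flask(__name__)
ALLOWED_TABLES = {"products", "prices", "stock"}  # explicit whitelist

def fetch_all(table):
    conn = pyodbc.connect("DSN=offline_products")
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT * FROM {table}")  # safe: table is whitelisted
        cols = [c[0] for c in cur.description]
        return [dict(zip(cols, row)) for row in cur.fetchall()]
    finally:
        conn.close()

@app.route("/<table>", methods=["GET"])  # GET only: the API cannot write
def dump_table(table):
    if table not in ALLOWED_TABLES:
        abort(404)
    return jsonify(fetch_all(table))
```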
I have a database that is written in Access. The Access mdb file connects via ODBC to a local MySQL database. I have a bunch of SQL and VBA code in the Access file. I don't expect the database to surpass 100 MB; currently it is around 10 MB. I will need to have multiple-user access (no more than 10 users at a time).
I need to convert this database from a local one to one running on a web server, and I need to make a web interface for it.
How do I get the current local instance of the MySQL database to run off a web server? I am currently running it off WampServer 2.0. I don't have experience putting a database on a web server.
I have an OK VB.NET background, but I have never done any web applications. Here's a picture of the Access form that I may need to replicate to work off a website:
(Screenshot: http://img42.imageshack.us/img42/1025/83882488.jpg)
Which platform should I use as the front end to this thing?
Would it be possible to just run this Access file off a web server instead of programming a new front end for it, or is that not a smart idea?
Thank you for your help!
If your webserver has TCP connectivity to your existing database server, and it's hosted in a suitable place (e.g., don't have your webserver in a datacenter connecting to a database server over your office DSL connection), then no move is required.
If you do need to move it, it's as easy as creating a backup/dump, and restoring it elsewhere.
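For MySQL specifically, that backup/restore round trip is just two commands; the database name, credentials, and host below are placeholders:

```
# Dump schema + data to a plain SQL file on the old server...
mysqldump -u root -p shopdb > shopdb.sql

# ...then load it on the new server with the mysql client.
mysql -h new-host -u root -p shopdb < shopdb.sql
```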
As far as the frontend, there are MANY technologies that will do what you need (ASP.NET, PHP, Python, Ruby, Perl, Java being the most popular ones, not necessarily in that order).
Use something you are comfortable with, or that you are interested in learning (provided you have the time to do so)
Use something that runs properly on your target webserver. Really, ASP.NET is the only one that has any major issue here, as it's limited to Windows.
Access itself has no direct web-accessible version. A Google search finds some apps that claim to convert Access forms to web-based ones, but I will not link to any because I don't know how well they work. I'm certainly leery of anything like that, because web apps are a different breed from Windows apps. If you are going to go that route, be sure they actually generate HTML output, produce sane, clean source, and offer a free trial so you can verify they actually work.
Really though, a form like that is reasonably easy to reproduce with some basic knowledge of server-side programming and some HTML.
I don't have any experience migrating Access to a web-based interface, although I have heard of people going straight from Access to a web page. MySQL is exceptionally easy to migrate. The standard install of MySQL comes with a program called mysqldump that allows you to export your database straight to a text file, which can then be imported on another server using the mysql command-line client. I don't believe WampServer comes with the command-line tools, although they can be downloaded from mysql.com. However, if it has phpMyAdmin, there is also an export feature there that will generate a .sql file that can be imported on the webserver using phpMyAdmin. One thing to keep in mind, though, is that I have had very little success mixing and matching these methods: i.e., I've never been able to get a mysqldump-created file to work with phpMyAdmin, and vice versa.
Good luck!
The link will help you to export and import a MySQL database.
Maybe on a Windows web server there is a way to run Access files - you can check - but anyway, if you have some programming skills, I would say that it is not difficult to create a PHP script that will query your database and edit its data.
Migrating an Access application to the web is quite difficult, because you can't translate an Access form 1:1 into a web page. Web apps are stateless, whereas Access is built around the concept of bound controls and bound datasets.
Secondly, it is impossible to easily replicate an Access subform.
Third, you lose tons of events that Access forms and controls are built around.
In general, a web page that performs the same task as an Access form will bear little or no resemblance to the Access form, simply because the methods for accomplishing the same tasks and the UI widgets available to you are so completely different.
One thing to consider is whether your users need a web application or if they just need to use your existing Access application over the Internet. If the latter is the case, Windows Terminal Server/Citrix can do the job for a lot less money, since there's no conversion needed. You do need to provision a Windows Terminal Server, set up a VPN and purchase CALs for the users, but the costs of those are going to be much less than the cost of rebuilding the app for web deployment.
It may not be an appropriate solution, but it's one that you should consider, I think.