Access + MySQL converting to web platform = (PHP + ASP.NET + MySQL)?

I have a database that is written in Access. The Access .mdb file connects via ODBC to a local MySQL database. I have a bunch of SQL and VBA code in the Access file. I don't expect the database to surpass 100 MB; currently it is around 10 MB. I will need to support multiple users (no more than 10 at a time).
I need to convert this database from a local one to one running on a web server, and I need to make a web interface for it.
How do I get the current local instance of the MySQL database to run off a web server? I am currently running it off WampServer 2.0, and I don't have experience putting a database on a web server.
I have an OK VB.NET background but have never done any web applications. Here's a picture of the Access form that I may need to replicate to work off a website:
(screenshot of the Access form: http://img42.imageshack.us/img42/1025/83882488.jpg)
Which platform should I use as the front end to this thing?
Would it be possible to just run this Access file off a web server instead of programming a new front end for it? Or is that not a smart idea?
Thank you for your help!

If your web server has TCP connectivity to your existing database server, and it's hosted in a suitable place (e.g., don't have your web server in a datacenter connecting to a database server over your office DSL connection), then no move is required.
If you do need to move it, it's as easy as creating a backup/dump and restoring it elsewhere.
As far as the frontend, there are MANY technologies that will do what you need (ASP.NET, PHP, Python, Ruby, Perl, Java being the most popular ones, not necessarily in that order).
Use something you are comfortable with, or that you are interested in learning (provided you have the time to do so).
Use something that runs properly on your target webserver. Really, ASP.NET is the only one that has any major issue here, as it's limited to Windows.
Access itself has no direct web-accessible version. A Google search finds some apps that claim to convert Access forms to web-based ones, but I won't link to any because I don't know how well they work. I'm certainly leery of anything like that, because web apps are a different breed from Windows apps. If you are going to go that route, be sure they actually generate HTML output, produce sane, clean source, and offer a free trial so you can verify they actually work.
Really though, a form like that is reasonably easy to reproduce with some basic knowledge of server-side programming and some HTML.
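As a rough illustration of how little is involved, here is a minimal sketch of a PHP page that displays one field of a record in an HTML form and saves edits on POST. The table and column names (work_orders, status) are hypothetical placeholders, not taken from the question:

    <?php
    // Minimal sketch of a web version of one form field; the table and
    // column names (work_orders, id, status) are hypothetical placeholders.
    $db = new mysqli('localhost', 'user', 'password', 'mydb');

    $id = isset($_GET['id']) ? (int) $_GET['id'] : 1;

    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        // Save the edited value with a prepared statement (avoids SQL injection).
        $stmt = $db->prepare('UPDATE work_orders SET status = ? WHERE id = ?');
        $stmt->bind_param('si', $_POST['status'], $id);
        $stmt->execute();
    }

    // Reload the record and render it into an HTML form.
    $row = $db->query("SELECT id, status FROM work_orders WHERE id = $id")->fetch_assoc();
    ?>
    <form method="post">
      <input type="text" name="status" value="<?php echo htmlspecialchars($row['status']); ?>">
      <input type="submit" value="Save">
    </form>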

I don't have any experience migrating Access to a web-based interface, although I have heard of people going straight from Access to a web page. MySQL is exceptionally easy to migrate. The standard install of MySQL comes with a program called mysqldump that lets you export your database straight to a text file, which can then be imported on another server with the mysql command-line client. I don't believe WampServer comes with the command-line tools, although they can be downloaded from mysql.com. However, if it has phpMyAdmin, that also has an export feature that generates a .sql file, which can be imported on the web server using phpMyAdmin. One thing to keep in mind, though, is that I have had very little success mixing and matching these methods: i.e., I've never been able to get a mysqldump-created file to work with phpMyAdmin, and vice versa.
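For a one-time migration the round trip is just mysqldump -u root -p mydb > mydb.sql on the local machine, then mysql -u root -p mydb < mydb.sql on the web server. If you later want to script the dump from PHP (say, for scheduled backups), here is a minimal sketch; the credentials and output path are placeholder assumptions:

    <?php
    // Minimal sketch: scripted backup via the mysqldump CLI from PHP.
    // Credentials and the output path are placeholder assumptions.
    exec('mysqldump -u root -pSECRET mydb > /tmp/mydb.sql', $output, $rc);
    if ($rc !== 0) {
        die('mysqldump failed with exit code ' . $rc);
    }
    echo 'Dump written to /tmp/mydb.sql';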
Good luck!

The link will help you export and import a MySQL database.
Maybe on a Windows web server there is a way to run Access files; you can check. But anyway, if you have some programming skills, I would say it is not difficult to create a PHP script that will query your database and edit the data.

Migrating an Access application to the web is quite difficult, because you can't translate an Access form 1:1 into a web page. Web apps are stateless, whereas Access is built around the concept of bound controls and bound datasets.
Second, there is no easy way to replicate an Access subform.
Third, you lose tons of events that Access forms and controls are built around.
In general, a web page that performs the same task as an Access form will bear little or no resemblance to the Access form, simply because the methods for accomplishing the same tasks and the UI widgets available to you are so completely different.
One thing to consider is whether your users need a web application or if they just need to use your existing Access application over the Internet. If the latter is the case, Windows Terminal Server/Citrix can do the job for a lot less money, since there's no conversion needed. You do need to provision a Windows Terminal Server, set up a VPN and purchase CALs for the users, but the costs of those are going to be much less than the cost of rebuilding the app for web deployment.
It may not be an appropriate solution, but it's one that you should consider, I think.

Related

Using PHP in Node-Webkit

I've built an AngularJS application over the last several months that uses a MySQL database for its data. The data is fetched by Angular making calls to PHP, and PHP returns JSON strings, etc.
The issue is that once this application is running inside node-webkit, none of the PHP works, so all of the content areas are empty. I assume (though the documentation on this issue is nil, so I have no confirmation) this happens because node-webkit is a client-side application framework and therefore won't run server-side languages like PHP. Is there a way to extend node-webkit to run PHP and other server-side languages?
I have done my best to find an answer to this question before posting, but documentation for this is nonexistent, and all of the information I have found about node-webkit talks about installing Node on your server, installing npm packages for MySQL, and having Angular make calls to Node. This defeats the purpose of the application entirely, as it is designed so that the exe/deb/rpm/dmg can run and you can set up a database with any cloud database provider and be ready to go. Not ideal if you have to buy a VPS just to run this one thing.
I have to assume this is possible in some way. I refuse to believe that everyone with an NW.js application hard-codes all their data.
Thanks in advance
I know of four methods to accomplish this. Some of them you have said you would prefer not to use, but I am going to offer them in the hope that they help you or someone else.
1. Look for an npm package that can do this for you. You should be able to implement this functionality within Node.js: https://www.npmjs.com/search?q=mysql
2. Host your PHP remotely. Using node-remote, you can give this server the appropriate access to your NW.js project.
3. Code a RESTful PHP application that your JavaScript can pass information to (see the sketch after this list).
4. Use my boilerplate code to run PHP within an NW.js project. It fires up an express.js web server internally to accomplish this, but the server is restricted to the machine and does not accept outside connections: https://github.com/baconface/php-webkit
1 and 4 both carry a risk in your case: your project can be reverse-engineered to reveal the source code, and the connection information can be retrieved rather easily. So those should only be used in applications on trusted machines; 2 and 3 are the ideal solutions.
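To make option 3 concrete, here is a minimal sketch of what such a RESTful PHP endpoint could look like, with the AngularJS code inside NW.js calling it over HTTP. The host, credentials, and the items table are hypothetical placeholders:

    <?php
    // api.php -- minimal sketch of option 3: a REST-style endpoint that
    // the Angular frontend calls over HTTP. Credentials and the items
    // table are hypothetical placeholders.
    header('Content-Type: application/json');

    $db = new mysqli('localhost', 'user', 'password', 'appdb');

    if ($_SERVER['REQUEST_METHOD'] === 'GET') {
        // Return all rows as JSON for the frontend.
        $rows = array();
        $result = $db->query('SELECT id, name FROM items');
        while ($row = $result->fetch_assoc()) {
            $rows[] = $row;
        }
        echo json_encode($rows);
    } elseif ($_SERVER['REQUEST_METHOD'] === 'POST') {
        // Insert a row posted as a JSON body and echo back the new id.
        $data = json_decode(file_get_contents('php://input'), true);
        $stmt = $db->prepare('INSERT INTO items (name) VALUES (?)');
        $stmt->bind_param('s', $data['name']);
        $stmt->execute();
        echo json_encode(array('id' => $db->insert_id));
    }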

Web server for a simple PHP/DB interface

I need to design a work order management system that will be accessed via a web browser, using PHP, and will interface with an arbitrary embedded database. For the time being this application will be standalone and will need a very easy setup. I want to stay away from a full-blown Apache setup for now. Obviously I will need some sort of web server to serve up the PHP pages, and anything that can't be built into the database or PHP will probably be written in Python. I'm thinking it might be easiest to have everything built into a single Python instance, but I'm open to suggestions.
Really, what I'm trying to stay away from is having multiple services running at any given time that all need updating and maintenance. I'll be running this at work on a single machine, and it will need to keep a low profile. Any suggestions?

How to update a remote MS Access database?

I need to create a web app to show a set of data and allow editing of it.
This data is contained in an Access database file used by another application (a desktop application).
I'm evaluating the best way to carry out this job.
Unfortunately, my proposal to migrate to another database solution (an RDBMS such as MySQL or Postgres) was rejected by the customer.
The issue here is how to keep the data intact and synchronized between the server and the desktop machine that runs the application that also uses this data.
All I need to do is read data, store edited or new data, give authorized users an interface to review this newly inserted data (thus validating it), and import it into the original Access database.
I've found the following possible solutions (to update the desktop .mdb copy), but each of them has pros and cons:
- Remote access to the Windows machine
  - exposes the machine to unauthorized access
- Use rsync to keep the files synchronized (once a day)
  - if the .mdb on the client has been edited with the desktop application, there will be data loss
  - the copy can be updated only when all data has been validated
  - the data won't really be in sync (until rsync runs)
- A client-server application
  - can use secure layers to protect the data against attackers
  - a third application (on the desktop) is required
  - synchronization requires authorized users to use this third application to import data (it will query the remote DB and update the local .mdb)
Do you know some other way that could help me get this done?
I'm leaning toward the client-server model, even if it would be more expensive, because it's the only way I see to make this work.
Do you see other pros/cons of the proposed solutions?
I haven't chosen the language to develop this in, but I was thinking of using PHP and/or Python.
The remote environment (for the server) can either be Windows or *nix (preferred).
Thanks.
The first idea:
  "exposes the machine to unauthorized access"
This is not really a valid argument. Everything you put on the Internet is exposed, and it is not as if it cannot be further protected via SSL/TLS. Even RDP can be secured via an SSH tunnel, for example.
To my mind, the easiest and most elegant way to do this is with web services (SOAP). Write the server code that does inserts/updates on the Access database in something like Python or Java. Generate a WSDL from the working code. From the WSDL you can generate a client for PHP/Python. Now all you have to do is write the web interface that uses the PHP/Python client.
For security, using SSL and Basic authentication should be enough (supported by SOAPpy in the case of Python, for example).
You can use pyodbc to connect to the Access database.
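If PHP ends up on the client side of that service, consuming the generated WSDL is straightforward with PHP's built-in SoapClient. A minimal sketch; the WSDL URL, the updateRecord operation, and the credentials are hypothetical placeholders:

    <?php
    // Minimal sketch of the PHP client side of the SOAP service.
    // The WSDL URL, operation name, and credentials are placeholders.
    $client = new SoapClient('https://example.com/accessService?wsdl', array(
        'login'    => 'webuser',   // HTTP Basic authentication
        'password' => 'secret',
    ));

    // Call an operation defined in the WSDL.
    $result = $client->updateRecord(array('id' => 42, 'status' => 'validated'));
    var_dump($result);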
Well, you could use two databases and synchronize changes with a sort of web service between them,
separating the web server DB (for which you could use a modern MySQL or whatever) from the current Access DB.
You would build a sort of REST API that returns new or changed records for GET requests, deletes records for DELETE requests, etc., keyed on a timestamp passed with the HTTP request.
Each side could then poll the other with a scheduled job for new records (transferring them as JSON), keeping the two relatively in sync; a minimal sketch of the web-app side follows.
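This is a rough illustration of the idea, assuming a hypothetical records table with a modified_at column (both placeholders, as are the credentials):

    <?php
    // sync.php -- minimal sketch of the web-app side of the REST sync idea.
    // The records table and its modified_at column are placeholders.
    header('Content-Type: application/json');

    $db = new mysqli('localhost', 'user', 'password', 'webappdb');

    // e.g. GET /sync.php?since=2010-06-01 12:00:00
    $since = isset($_GET['since']) ? $_GET['since'] : '1970-01-01 00:00:00';

    $stmt = $db->prepare('SELECT * FROM records WHERE modified_at > ?');
    $stmt->bind_param('s', $since);
    $stmt->execute();

    $rows = array();
    $result = $stmt->get_result();   // requires the mysqlnd driver
    while ($row = $result->fetch_assoc()) {
        $rows[] = $row;
    }
    echo json_encode($rows);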
You could take care of security by exposing the application DB only on a certain port and only to HTTP queries coming from the web-app server's IP address, and by using HTTP auth, hashes, etc.
If this isn't a heavy-load, high-concurrency app (which I guess it isn't, since you're using Access as a DB), this should work.
You could build this kind of mini-API with any Python web framework, like TurboGears 2.1 or Django, or with the micro-frameworks Bottle or Flask.
P.S. If you prefer Python (and why wouldn't you?), don't use pyodbc directly; work with Python's beautiful ORM, SQLAlchemy, which is much better.
I think how this works really depends on the authentication issue and number of users that need to review the data.
The reason I ask?
You could consider using Access 2010 and Office 365. This gives you tables linked to the cloud, but the tables are also cached locally in your Access desktop application. Real-time sync of the data is used, and this is automatic in Access 2010 (so you don't have to write any code).
What this means is that while running the Access desktop application, you can pull the plug on the network and it will continue to run. The instant you have Wi-Fi or a connection again, local changes are synced up to Office 365. Even better, you can now build web forms in Access.
Data touched or edited (or new records on either side) will come down the pipe to your local computer. So if you add records in the Access client, the web users will ALSO see these new records.
So Access 2010 now has web publishing, and this works with the new Office 365. The price starts at $6 per month, and if it's just for a few users, have them all log on using the same account! This means you can have this all up and running in less time than it took to make this post, and for less than $10 per month!
For those not aware, Access 2010 has web publishing. When you publish the Access forms, they are converted to .NET (XAML) forms, and the code is converted to JavaScript, so form code actually runs browser-side.
Since the system runs on Office 365, you are using some heavy-duty iron, and you can in theory scale out to millions of users with this setup. When you publish the Access application to Office 365, the server side no longer uses .mdb or Access files, but what is called Access Web Services. The tables in fact become the equivalent of SharePoint lists, and new for SharePoint 2010, those lists have relational features like cascading deletes.
The real beauty of this system is that you can write, create, and do everything inside of Access without having to learn or touch ANY KIND of server-side technology. Here is a short video of mine; at the halfway point I run the Access application with nothing more than a web browser.
http://www.youtube.com/watch?v=AU4mH0jPntI
There is no ActiveX or even Silverlight required. In fact, my Access applications run fine on an iPad using the Safari web browser.
So you could consider continuing to use Access and just publish your application to the web with the new Access 2010 features.

PHP script to read XML data on a remote Unix server

I have a situation where I have lots of system configurations/logs from which I have to generate a quick overview of the system, useful for troubleshooting.
As a first step I'd like to build a web interface (most probably a PHP site) that gives a rough snapshot of the system configuration using the available information from support logs. The support logs reside on mirrored servers (call them log servers), and the server on which I'll be hosting the site (call it the web server) will have to use SSH/SFTP to access them.
My rough sketch:
1. The PHP script on the web server makes some kind of connection to the log server and goes to the support logs location.
2. It then triggers a Perl script on the log server, which collects relevant material from all the config/log files into some useful XML (there would be multiple such files).
3. Somehow these XML files are transferred to the web server, and PHP uses them to create the HTML.
I'm very new to PHP and would like to know if this is feasible, or if there's any alternative/better way of doing this.
It would be great if someone could provide more details.
Thanks in advance.
EDIT:
Sorry, I forgot to mention that the logs aren't generated on a live machine. I'm dealing with sustaining activities for a NAS storage device, and there will be plenty of support logs coming from different end customers that folks on my team would like to look at.
Security is not a big concern here (I'm OK with plain-text authentication to the log servers), as these servers can be accessed only through the company's VPN.
Yes, PHP can process XML. A simple way is to use SimpleXML: http://php.net/manual/en/book.simplexml.php
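For instance, here is a minimal SimpleXML sketch; the file name and the disk element/attribute layout are hypothetical placeholders:

    <?php
    // Minimal SimpleXML sketch; the file name and the element and
    // attribute names are hypothetical placeholders.
    $xml = simplexml_load_file('support_snapshot.xml');

    foreach ($xml->disk as $disk) {
        echo $disk['name'] . ': ' . $disk->status . "\n";
    }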
While you could do this using something like Expect (I think there is something for PHP too...), I would recommend doing it in two separate steps:
1. A script, run via cron, retrieves data from the servers and stores it locally.
2. The PHP script reads from the locally stored data only, in order to generate the reports (see the sketch after the list of benefits below).
This way, you get these benefits:
- You don't have to worry about how to make your PHP script connect via SSH to the servers.
- You avoid the security risks of allowing your web server user to log in to other servers (a high risk in case your script gets hacked).
- In case of slow or absent connectivity to the servers, a long log-retrieval time, etc., your PHP script will still be able to show the data quickly, perhaps along with an error message explaining what went wrong during the latest update.
- In any case, your PHP script will terminate much more quickly, since it only has to retrieve data from local storage.
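A minimal sketch of the second step, assuming the cron job drops its output into a local cache directory (the path and the staleness threshold are placeholder assumptions):

    <?php
    // Minimal sketch of step 2: read only locally cached data.
    // The cache path and staleness threshold are placeholder assumptions.
    $cacheFile = '/var/cache/logreport/latest.xml';

    if (!file_exists($cacheFile)) {
        die('No cached data yet -- has the cron job run?');
    }

    // Warn if the cron-fetched data looks stale (older than 2 hours).
    if (time() - filemtime($cacheFile) > 7200) {
        echo '<p>Warning: data is stale; the last update may have failed.</p>';
    }

    $xml = simplexml_load_file($cacheFile);
    // ... render the report from $xml ...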
Update: SSH client via PHP
Ok, from your latest comment I understand that what you need is more a "front-end browser" to display the files, than a report generation tool or similar; in this case you can use Expect (as I stated before) in order to connect to remote machines.
There is a PECL extension for PHP providing expect functionality. Have a look at the PHP Expect manual and in particular at the usage examples, showing how to use it to make SSH connections.
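Roughly, based on the extension's expect:// stream wrapper, it can look like the sketch below; the host, user, password, and remote file path are placeholder assumptions, and the extension must be installed via PECL:

    <?php
    // Minimal sketch using the PECL expect extension; host, user,
    // password, and the remote file path are placeholder assumptions.
    ini_set('expect.timeout', 10);

    $stream = fopen('expect://ssh user@logserver cat /var/log/support.xml', 'r');

    $cases = array(
        array('password:', 1),           // pattern => user-defined value
    );
    if (expect_expectl($stream, $cases) === 1) {
        fwrite($stream, "secret\n");     // send the password
    }

    $xml = stream_get_contents($stream); // rest of the output is the file
    fclose($stream);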
Alternative way: taking files from an NFS/Samba share
Another way, avoiding SSH, is to browse the files on the remote machines via a locally mounted share.
This is especially useful in case the interesting files are already shared by a NAS, while I wouldn't recommend it if that would mean sharing the whole root filesystem or huge parts of it.

Communication between PHP and application

I'm playing with an embedded Linux device and looking for a way to get my application code to communicate with a web interface. I need to show some status information from the application on the device's web interface, and I would also like a way to inform the application of user actions like uploaded files, etc. PHP seems to be a good way to build the interface, but the communication part is harder. I have found the following options, but I'm not sure which would be the easiest and most convenient to use.
- Sockets. I'd have to enable sockets for PHP first to try this; I don't know if enabling them will take much more space.
- Database. Seems like an overkill solution.
- Shared file. Seems like a lot of work.
- Named pipes. Tried this with some success, but I'm not sure if there will be problems with, for example, simultaneous page loads. Maybe sockets are easier?
What would be the best way to go? Is there something I'm totally missing? How is this done in those numerous commercial Linux-based network switches?
I recently did something very similar using sockets, and it worked really well. I had a Java application that communicated with the device and listened on a server socket, and the PHP application was the client.
So in your case, the PHP client would initialize the connection, and then the server can reply with the status of the device.
There's plenty of tutorials on how to do client/server socket communication with most languages, so it shouldn't take too long to figure out.
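For the PHP side, a minimal sketch of the client, assuming the application listens on localhost port 9000 and speaks a simple line-based protocol (all of which are placeholder assumptions):

    <?php
    // Minimal sketch: PHP as the socket client, the application as the
    // server. Host, port, and the line protocol are placeholder assumptions.
    $sock = stream_socket_client('tcp://127.0.0.1:9000', $errno, $errstr, 5);
    if (!$sock) {
        die("Connection failed: $errstr ($errno)");
    }

    fwrite($sock, "STATUS\n");          // ask the application for its status
    $status = fgets($sock);             // read one line of reply
    fclose($sock);

    echo 'Device status: ' . htmlspecialchars($status);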
What kind of device is it?
If you work with something like a shared file, how will the device be updated?
How will named pipes run into concurrency problems that sockets will avoid?
In terms of communication from the device to PHP, a file seems perfect. PHP can use something basic like file_get_contents(), and the device can just write to the file. If you're worried about reading the file at the exact moment it is being updated, add a quick length check.
In terms of PHP informing the device of what to do, I'm also leaning towards files. Have the device watch a directory, and have the script create a file there with something like file_put_contents($path . uniqid(), $command); that way, should two scripts run at the exact same time, you simply have two files for the device to work with.
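Putting both halves together, a minimal sketch, with all paths as hypothetical placeholders:

    <?php
    // Minimal sketch of file-based communication in both directions;
    // all paths are hypothetical placeholders.

    // Device -> PHP: the device overwrites a status file, PHP just reads it.
    $status = file_get_contents('/var/run/device/status.txt');
    echo 'Status: ' . htmlspecialchars($status);

    // PHP -> device: drop each command into a spool directory the device
    // watches; uniqid() keeps simultaneous page loads from colliding.
    $command = 'REBOOT';
    file_put_contents('/var/spool/device/' . uniqid(), $command);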
Embedded Linux boxes for routing with a web interface typically don't use PHP. They use CGI and have shell scripts deliver the web page.
For getting information from the application to the web interface, the Shared file option seems most reasonable to me. The application can just write information into the file which is read by PHP.
The other way round looks not so good at first. PHP supports locking of files, but it most probably doesn't work at a system level. Perhaps one solution is for every PHP script that has information for the application to create its own file (with a unique filename, e.g. based on a timestamp plus a random value). The application could watch a designated directory for these files to pop up and, after processing them, just delete them. For that, the application only needs write permission on the directory (so file ownership is not an issue).
If possible, use shell scripts.
I did something similar: I wrote a video surveillance application. The video part is handled by motion (a great FOSS package). The application is a turn-key solution on standardized hardware, used to monitor slot-machine casinos. It serves as a kiosk system locally and is accessible via the Internet. I wrote all the UI code in PHP; the local display is a tightly locked-down KDE desktop with a full-screen browser defaulting to localhost. I used shell scripts to interact with motion and the OS.
On second thought:
If you can use self-compiled applications on the device, write a simple program that returns the value you want and call it with PHP's exec(), passthru(), or system().
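For instance (the helper binary path is a hypothetical placeholder):

    <?php
    // Minimal sketch: call a small self-compiled helper and show its output.
    // The binary path is a hypothetical placeholder.
    $output = array();
    exec('/usr/local/bin/device-status', $output);
    echo 'Device reports: ' . htmlspecialchars(implode("\n", $output));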
