I maintain a PHP-driven web application with an Oracle backend. The app interacts with a number of third-party apps, so information is managed with a combination of XML files, Microsoft Access databases and HTML forms. There are currently 80 tables with many BLOBs and a fair number of foreign key relationships. All procedures are carefully explained in a document that (of course) nobody ever reads. The customer was feeling uneasy about his data, so he was given an estimate for some improvements that could be made (stuff like adding previews and confirmations to some operations).
Sadly, the customer misinterpreted one of the specs (a partial export to be written in 12 man-hours) and he's expecting a full backup and restore feature that would allow him to save and restore the complete database through a web browser, without DBA intervention.
Before having yet another argument with the client, I'd like to know whether I have any option to actually implement this feature in a timely manner, considering that it doesn't need any refinements (e.g., there is no need to select what to restore).
Production server is a Windows Server 2003 box running PHP/5.2.9. The Oracle server is a remote box running "Oracle9i Release 9.2.0.1.0 - 64bit Production".
(Please note I'm not a DBA so there may be well-known solutions I'm not aware of.)
Oracle is a monster. Once you've read up on its backup options you'll realise that how you back up the system depends totally on how it has been configured. The short answer is to automate whatever manual process you already have: invoke it as a long-running process (since this is MS Windows, prefix the rman command with 'start'), then use polling to detect when it finishes (e.g. wrap rman in a DOS batch file which logs start and end times).
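For instance, a minimal PHP sketch of the launch-and-poll idea (the paths and backup.bat are hypothetical; the batch file is assumed to write a START line to the status log as soon as it begins and append DONE when rman finishes):

    <?php
    // Hypothetical polling endpoint for the long-running rman wrapper.
    $statusFile = 'C:\\backup\\rman_status.log';

    if (!file_exists($statusFile)) {
        // "start /B" detaches the batch file, so this request returns immediately;
        // backup.bat is assumed to create the status log right away.
        pclose(popen('start /B C:\\backup\\backup.bat', 'r'));
        echo 'backup started';
    } elseif (strpos(file_get_contents($statusFile), 'DONE') !== false) {
        echo 'backup finished';
    } else {
        echo 'backup still running';
    }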
I'd be hard pushed to think of a more difficult problem to provide a generic solution for than Oracle running on top of MS Windows. The latter may be nice for users to click buttons on, but automating anything on it is a PITA.
Have fun :)
I finally had the chance to implement a full Oracle backup from PHP in a later project. I used the Oracle Data Pump command-line utilities, available since 10g. In short:
You define an Oracle directory to map a keyword to a physical directory and grant write permission to the app's Oracle user.
You run expdp with the appropriate arguments and get a complete dump in a single file.
To restore a backup, you run impdp.
It's also advisable to run the commands with proc_open() rather than system(), since you can use the bypass_shell option on Windows and have fine-grained control over the process.
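A minimal sketch of that approach (credentials, object names and paths are placeholders; the directory setup is shown only as context in the comments):

    <?php
    // One-time setup done on the Oracle side by a privileged user:
    //   CREATE DIRECTORY dump_dir AS 'C:\oracle\dumps';
    //   GRANT READ, WRITE ON DIRECTORY dump_dir TO app_user;
    $cmd = 'expdp app_user/secret@ORCL SCHEMAS=app_user '
         . 'DIRECTORY=dump_dir DUMPFILE=backup.dmp LOGFILE=backup.log';

    $descriptors = array(
        0 => array('pipe', 'r'),   // stdin
        1 => array('pipe', 'w'),   // stdout
        2 => array('pipe', 'w'),   // stderr
    );

    // bypass_shell skips cmd.exe on Windows and gives direct control of expdp.
    $proc = proc_open($cmd, $descriptors, $pipes, null, null,
                      array('bypass_shell' => true));

    if (is_resource($proc)) {
        fclose($pipes[0]);
        $stdout = stream_get_contents($pipes[1]);
        $stderr = stream_get_contents($pipes[2]);
        fclose($pipes[1]);
        fclose($pipes[2]);
        $exit = proc_close($proc);
        echo $exit === 0 ? "Export completed\n" : "Export failed:\n" . $stderr;
    }
    // Restoring is symmetric: run impdp with the same DIRECTORY and DUMPFILE.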
As for this question, the pre-10g alternative is the "exp" / "imp" combo.
I have implemented an expert advisor using the MQL4 language to be executed in MetaTrader.
Now, if I need to execute it, I always need to run MetaTrader and attach my EA program to a live currency pair graph in it.
I want to know whether there is a method to execute MQL4 scripts in servers so that I do not need to keep my computer always on. I googled this question, but I could not find an appropriate answer to it.
I found there is a way to transfer data from MetaTrader to a web server (MQL to PHP), but I have no idea whether it is useful for solving my question (http://mql4-php.iinuu.eu/).
Thanks in advance.
Yes, there are a few DLL-based methods to transfer "just" DATA:
ZeroMQ DLL for socket-based messaging approaches (see the sketch after this list).
Windows raw sockets for low-level socket programming.
A few other DLL-based tools for passing data to/from remote or parallel processes.
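For illustration, a hedged PHP-side sketch of the ZeroMQ approach (this assumes the php-zmq extension on the receiving server; the endpoint and message format are made up):

    <?php
    // Hypothetical receiver for data pushed from MT4 through a ZeroMQ DLL.
    $context = new ZMQContext();
    $socket  = $context->getSocket(ZMQ::SOCKET_PULL);
    $socket->bind('tcp://*:5555');       // the MT4 side would connect and push here

    while (true) {
        $message = $socket->recv();      // e.g. "EURUSD;1.30512;1.30530"
        list($symbol, $bid, $ask) = explode(';', $message);
        // ... store or act on the tick data ...
    }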
No, there are no known methods to run MQL4-CODE on a server
Each MQL4 source code is first compiled into an .EX4 file. Such "executable" files are loaded and executed in a similarly proprietary piece of software -- the MetaTrader4 Terminal. So far there are no known server-process implementations of this functionality, and MetaQuotes, Inc. neither sells one nor shows any visible effort to release such software. For legal reasons there are hardly any open-source programmes working in this direction either: similar efforts have triggered legal action, in the name of protecting intellectual property, in any case where the non-published nature of the data transfers and/or operations distributed between the MetaTrader4 Terminal [localhost-side] and MetaTrader4 Server [broker-side] programmes was touched or otherwise analysed and/or re-engineered.
But there is a way to achieve what you want
There is a common practice of operating the localhost-side piece of software -- the MetaTrader4 Terminal -- hosted on a remote machine that is kept running 24/7/365 in a professional DataCentre.
Using this kind of approach, your MQL4 code is still run in native mode inside a MetaTrader4 Terminal software process; however, the machine (the Windows O/S based machine) is virtualised into a VM and hosted in a DataCentre infrastructure.
There are nevertheless some steps & measures needed to protect your privacy and your intellectual property rights once you start thinking about the VM/hosted mode of operations for your EA/script.
Applying this mode of operations allows you to connect from your localhost to the DataCentre just at the times when you want to visually check and/or manually correct or modify your code, which otherwise keeps running non-stop in the MetaTrader4 Terminal.
Noting the following requirement:
"I want to know whether there is a method to execute MQL4 scripts in
servers so that I do not need to keep my computer always on."
You can subscribe to VPS (Virtual Private Server) services to which you can attach your EA (.ex4) files. Basically, it acts as server hosting (but a really small one, just enough to run your MT4 Terminal).
There are many VPS offerings. Just google MetaTrader4 VPS.
In fact, MetaQuotes itself also offers this service, straight from your MT4. Once you subscribe to that service and attach your .EX4, you can switch off your PC and the EA will still be running on the VPS.
You can find details here Link.
Most brokers nowadays offer Virtual Private Server (VPS) solutions, which aim to reduce latency and slippage on your trades. This means that your system will be "virtually" closer to the broker's servers, reducing the time it takes for pricing and execution orders to travel between your VPS and the broker's servers.
Basically, I want to provide a web application (built with PHP, MySQL, Apache) to users, with source code, in case they don't have an Internet connection. But with that, I have to ensure that the web application package (with Apache, PHP, MySQL and the actual application with data) cannot be copied and run on another machine (maybe we can bind authentication to the hard disk serial ID).
The first solution that sprang to mind was to build a stand-alone application, but we don't have that option because we are limited to a web application only.
One solution I thought of is to create a web-browser-like container (which may use one of the system's browsers inside) in Java or another stand-alone programming language, where we have additional authentication for the current machine and it internally uses the system's browser for HTTP requests/responses.
Please share your ideas about the feasibility/implementation of the above solution, or any other better solution.
One thing to keep in mind: since we are providing all the source code along with the servers, authentication in the database or PHP won't be much use.
But with that, I have to ensure that the web application package (with Apache, PHP, MySQL and the actual application with data) cannot be copied and run on another machine (maybe we can bind authentication to the hard disk serial ID).
This is, strictly speaking, impossible.
The first solution that sprang to mind was to build a stand-alone application, but we don't have that option because we are limited to a web application only.
Have you ever heard of IDA Pro? JD-GUI? ILSpy?
Stand-alone applications can trivially be reverse-engineered. This will protect nothing.
Your best options are:
Provide a cloud service that is totally agnostic towards HTTP clients, so you own the back-end machines that contain your source code, then give your customers a dumb open-source front-end that speaks to the back-end (see the sketch after this list).
Enforce your software policies (i.e. only allowed to run one copy of the software) with the appropriate tool for the job: Lawyers and contracts.
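As an illustration of the first option, a minimal sketch of a "dumb" PHP front-end that only forwards requests to a back-end you control (the API URL, key and parameters are all hypothetical):

    <?php
    // The customer can read this file freely: it contains no business logic,
    // only a call to the hosted back-end that holds the real source code.
    $payload = array('action' => 'report', 'month' => '2013-05');

    $ch = curl_init('https://api.example.com/v1/run');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($payload));
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('X-Api-Key: CUSTOMER_KEY'));

    echo curl_exec($ch);   // render whatever the back-end returns
    curl_close($ch);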
I need to create a web app to show and allow editing of a set of data.
This data is contained in an Access database file used by another application (a desktop application).
I'm evaluating the best way to carry out this job.
Unfortunately, my proposal to migrate to another database solution (an RDBMS such as MySQL or Postgres) was rejected by the customer.
The issue here is how to keep the data integral and synchronized between the server and the desktop machine that runs the application which also uses this data.
All I need to do is read data, store edited or new data, give authorized users an interface to review this newly inserted data (thus validating it), and import it into the original Access database.
I've found the following possible solutions (to update the desktop mdb copy), but each of them has pros and cons:
remote access to the windows machine
  - exposes the machine to unauthorized access
use rsync to keep files synchronized (once a day)
  - if the mdb on the client has been edited with the desktop application, there will be data loss
  - can be updated only when all data has been validated
  - data won't be truly synchronized (until rsync runs)
client-server application
  - can use secure layers to protect data against attackers
  - a 3rd application (on the desktop) is required
  - synchronization requires authorized users to use this 3rd application to import data (it will query the remote db and update the local mdb)
Do you know some other way that could help me to get this done?
I'm leaning towards the client-server model, even if it would be more expensive, but it's the only way I see to make this work.
Do you see any other pros/cons of the proposed solutions?
I haven't chosen the language to develop this in yet, but I was thinking of using PHP and/or Python.
The remote environment (for the server) can either be Windows or *nix (preferred).
Thanks.
The first idea:
exposes the machine to unauthorized access
This is not really a valid argument. Everything you put on the Internet is exposed. And it is not as if it cannot be further protected via SSL/TLS. Even RDP can be secured via an SSH tunnel, for example.
To my mind, the easiest and most elegant way to do this is by using web services (SOAP). Write the server code that does inserts/updates on the Access database in something like Python or Java. Generate a WSDL from the working code. From the WSDL you can generate a client for PHP/Python. Now all you have to do is write the web interface that uses the PHP/Python client.
For security, SSL and Basic authentication should be enough (supported by SOAPpy in the case of Python, for example).
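For the PHP side, a minimal sketch using PHP's built-in SoapClient against the generated WSDL (the URL, credentials and operation name are hypothetical):

    <?php
    // SSL plus Basic auth, as suggested above; SoapClient handles both.
    $client = new SoapClient('https://sync.example.com/access?wsdl', array(
        'login'    => 'webapp',
        'password' => 'secret',
    ));

    // Call a remote operation that inserts a validated record into the mdb.
    $result = $client->insertRecord(array(
        'table'  => 'Orders',
        'fields' => array('CustomerId' => 42, 'Amount' => 19.95),
    ));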
You can use pyodbc to connect to the Access database.
Well, you can use two databases and synchronize changes with a sort of web service between them, separating the web server DB (which could be a modern MySQL or whatever) from the current Access DB.
You would build a sort of REST API: returning new or changed records for the GET method, deleting for the DELETE method, etc., using a timestamp in the HTTP request.
Then each side could query the other with a scheduled job for new records (transferring them as JSON), keeping the records relatively in sync; see the sketch below.
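A rough PHP sketch of one side's scheduled job (the endpoint, field names and storage are all hypothetical):

    <?php
    // Runs from cron / Task Scheduler, not from the web app itself.
    $since = trim(@file_get_contents('last_sync.txt')); // e.g. "2011-06-01T12:00:00"

    // GET only the records created/changed on the other side since the last run.
    $json    = file_get_contents('http://desktop-box:8080/records?since=' . urlencode($since));
    $records = json_decode($json, true);

    foreach ((array) $records as $row) {
        // ... insert/update the row in this side's database ...
    }

    file_put_contents('last_sync.txt', date('c')); // remember this run's timestamp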
You could take care of security by exposing the application DB only on a certain port and only to HTTP queries coming from the web app server's IP address, and also by using HTTP auth, hashes, etc.
If this isn't a heavy-load, high-concurrency app (which I guess it isn't, since you use Access as a DB), this should work.
You could build this kind of mini-API with any Python web framework like TurboGears 2.1 or Django, or with the micro-frameworks like Bottle or Flask.
P.S. If you prefer Python (and why wouldn't you?), don't use pyodbc directly; work with Python's beautiful ORM instead - SQLAlchemy is much better.
I think how this works really depends on the authentication issue and number of users that need to review the data.
The reason I ask?
You can consider using Access 2010 and Office 365. This allows you to have linked tables to the cloud, but the tables are also cached locally on your Access desktop. This means real-time sync/replication of data is used, and this is automatic for Access 2010 (so you don't have to write any code).
What this means is that while running the Access desktop application, you can pull the plug on the network and it will continue to run. The instant you have wifi or a connection, local changes are synced up to Office 365. Even better, you can now build web forms in Access.
Data touched or edited (or new records on either side) will come down the pipe to your local computer. So if you add records in the Access client, the web users will ALSO see these new records.
So Access 2010 now has web publishing, and this works with the new Office 365. The price starts at $6 per month, and if it's just for a few users, have them all log on using the same account! This means you can have this all up and running in less time than it took to make this post, and for less than $10 per month!
For those not aware, Access 2010 has web publishing. When you publish the Access forms, they are converted to .NET (XAML) forms, and the code is converted to JavaScript. So form code actually runs browser-side.
Since the system runs on Office 365, you are using some heavy-duty iron, and you can in theory scale out to millions of users with this setup. When you publish the Access application to Office 365, on the server side you are not using mdb or Access files anymore, but what is called Access Web Services. The tables in fact become the equivalent of SharePoint lists, and new for SharePoint 2010, those lists now have relational features like cascading deletes.
The real beauty of this system is that you can write, create and do everything inside of Access without having to learn or touch ANY KIND of server-side technology. Here is a short video of mine; at the halfway point I run the Access application with nothing more than a web browser.
http://www.youtube.com/watch?v=AU4mH0jPntI
There is no ActiveX or even Silverlight required. In fact, my Access applications run fine on an iPad using the Safari web browser.
So you could consider continuing with Access, and just publish your application to the web with the new Access 2010 features.
I have a situation where I have lots of system configurations/logs from which I have to generate a quick review of the system, useful for troubleshooting.
At first I'd like to build a kind of web interface (most probably a PHP site) that gives me a rough snapshot of the system configuration using the available information from the support logs. The support logs reside on mirrored servers (call them the log server), and the server on which I'll be hosting the site (call it the web server) will have to use ssh/sftp to access them.
My rough sketch:
The PHP script on the web server will make some kind of connection to the log server and go to the support logs location.
It'll then trigger a Perl script on the log server, which will collect relevant stuff from all the config/log files into some useful XML (there'd be multiple of those).
Somehow these XML files are transferred to the web server, and PHP will use them to create the HTML.
I'm very new to PHP and would like to know if this is feasible, or if there's any alternative/better way of doing this.
It would be great if someone could provide more details for the same.
Thanks in advance.
EDIT:
Sorry, I forgot to mention that the logs aren't ones generated on a live machine; I'm dealing with sustaining activities for a NAS storage device, and there'll be plenty of support logs coming from different end customers that folks on my team would like to have a look at.
Security is not a big concern here (I'm OK with using plain-text authentication to the log servers), as these servers can be accessed only through the company's VPN.
Yes, PHP can process XML. A simple way is to use SimpleXML: http://php.net/manual/en/book.simplexml.php
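For example (the element names here are invented; adapt them to whatever the Perl script emits):

    <?php
    // Assuming a support-log summary like:
    //   <system><hostname>nas01</hostname><disk free="120GB"/></system>
    $xml = simplexml_load_file('system_summary.xml');

    echo 'Host: '      . $xml->hostname     . "\n";
    echo 'Free disk: ' . $xml->disk['free'] . "\n";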
While you could do this using something like expect (I think there is something for PHP too), I would recommend doing it in two separate steps:
A script, running via cron, retrieves data from the servers and stores it locally
The PHP script reads only from the locally stored data in order to generate reports (see the sketch after the benefits list below)
This way, you have these benefits:
You don't have to worry about how to make your PHP script connect to the servers via ssh
You avoid the security risks related to allowing your webserver user to log in to other servers (a high risk in case your script gets hacked)
In case of slow or absent connectivity to the servers, a long time to retrieve logs, etc., your PHP script will still be able to quickly show the data -- maybe along with some error message explaining what went wrong during the latest update
In any case, your PHP script will terminate much more quickly, since it only has to retrieve data from local storage.
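A sketch of step 2 under those assumptions (the cache path and staleness threshold are hypothetical): the page renders only from the local copy and warns when the cron job hasn't refreshed it recently:

    <?php
    $cache   = '/var/cache/logreports/summary.xml';
    $updated = filemtime($cache);   // when the cron job last stored data

    if (time() - $updated > 3600) {
        echo '<p>Warning: data is stale (last update ' . date('r', $updated) . ')</p>';
    }

    $xml = simplexml_load_file($cache);
    // ... render the report from $xml ...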
Update: SSH client via PHP
OK, from your latest comment I understand that what you need is more of a "front-end browser" to display the files than a report generation tool or similar; in this case you can use Expect (as I stated before) to connect to the remote machines.
There is a PECL extension for PHP providing expect functionality. Have a look at the PHP Expect manual and in particular at the usage examples, showing how to use it to make SSH connections.
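A sketch along the lines of the manual's SSH example (the host, remote command and password are placeholders; plain-text auth, which you said is acceptable):

    <?php
    // Requires the PECL expect extension.
    ini_set('expect.timeout', 30);
    define('PASSWORD', 1);

    $stream = expect_popen('ssh support@logserver cat /nas/logs/summary.xml');

    $cases = array(array('password:', PASSWORD));

    switch (expect_expectl($stream, $cases)) {
        case PASSWORD:
            fwrite($stream, "secret\n");       // answer the password prompt
            while ($line = fgets($stream)) {
                echo $line;                    // or write to a local file
            }
            break;
        default:
            die('Could not connect to the log server');
    }

    fclose($stream);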
Alternate way: taking files from NFS/SAMBA share
Another way, which avoids using SSH, is to browse files on the remote machines via a locally mounted share.
This is especially useful when the interesting files are already shared by a NAS, while I wouldn't recommend it if it would mean sharing the whole root filesystem or huge parts of it.
I have a database written in Access. The Access mdb file connects via ODBC to a local MySQL database. I have a bunch of SQL and VBA code in the Access file. I don't expect the database to surpass 100 MB; currently it is around 10 MB. I will need multi-user access (no more than 10 users at a time).
I need to convert this database from being a local one to running on a web server, and I need to make a web interface for it.
How do I get the current local instance of the MySQL database to run off a web server? I am currently running it off WampServer 2.0. I don't have experience putting a database on a web server.
I have an OK VB.NET background. I have never done any web applications. Here's a picture of the Access form that I may need to replicate to work off a website:
(screenshot: http://img42.imageshack.us/img42/1025/83882488.jpg)
Which platform should I use as the front end to this thing?
Would it be possible to just run this Access file off a web server instead of programming a new front end for it? Is that not a smart idea?
Thank you for your help!
If your web server has TCP connectivity to your existing database server, and it's hosted in a suitable place (e.g., don't have your web server in a datacenter connecting to a database server over your office DSL connection), then no move is required.
If you do need to move it, it's as easy as creating a backup/dump, and restoring it elsewhere.
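For instance, a sketch of the dump/restore step (database names, credentials and paths are placeholders; the same commands can just as well be run by hand):

    <?php
    // Export on the local WAMP box; import on the web server afterwards with:
    //   mysql -u webuser -p mydb < mydb.sql
    exec('mysqldump -u root -psecret mydb > C:\\dumps\\mydb.sql', $output, $rc);
    echo $rc === 0 ? 'dump created' : 'dump failed';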
As far as the frontend, there are MANY technologies that will do what you need (ASP.NET, PHP, Python, Ruby, Perl, Java being the most popular ones, not necessarily in that order).
Use something you are comfortable with, or that you are interested in learning (provided you have the time to do so)
Use something that runs properly on your target webserver. Really, ASP.NET is the only one that has any major issue here, as it's limited to Windows.
Access itself has no direct web-accessible version. A Google search finds some apps that claim to convert Access forms to web-based ones, but I will not link to any because I don't know how well they work. I'm certainly leery of anything like that, because web apps are a different breed from Windows apps. If you are going to go that route, be sure they actually generate HTML output, produce sane, clean source, and offer a free trial so you can verify they actually work.
Really though, a form like that is reasonably easy to reproduce with some basic knowledge of server-side programming and some HTML.
I don't have any experience migrating Access to a web-based interface, although I have heard of people going straight from Access to a web page. MySQL is exceptionally easy to migrate. The standard install of MySQL comes with a program called mysqldump that allows you to export your database straight to a text file, which can then be imported on another server with the mysql command-line client. I don't believe the WAMP server comes with the command-line tools, although they can be downloaded from mysql.com. However, if it has phpMyAdmin, then that also has an export feature that will generate a .sql file, which can be imported on the web server using phpMyAdmin. One thing to keep in mind, though, is that I have had very little success mixing and matching these methods: i.e., I've never been able to get a mysqldump-created file to work with phpMyAdmin and vice versa.
Good luck!
The link will help you export and import the MySQL database.
Maybe on a Windows web server there is a way to run Access files - you can check - but anyway, if you have some programming skills, I would say it is not difficult to create a PHP script which will query your database info and edit it.
Migrating an Access application to the web is quite difficult, because you can't translate an Access form 1:1 into a web page. Web apps are stateless, whereas Access is built around the concept of bound controls and bound datasets.
Secondly, it is impossible to easily replicate an Access subform.
Third, you lose tons of events that Access forms and controls are built around.
In general, a web page that performs the same task as an Access form will bear little or no resemblance to the Access form, simply because the methods for accomplishing the same tasks and the UI widgets available to you are so completely different.
One thing to consider is whether your users need a web application or if they just need to use your existing Access application over the Internet. If the latter is the case, Windows Terminal Server/Citrix can do the job for a lot less money, since there's no conversion needed. You do need to provision a Windows Terminal Server, set up a VPN and purchase CALs for the users, but the costs of those are going to be much less than the cost of rebuilding the app for web deployment.
It may not be an appropriate solution, but it's one that you should consider, I think.