I must develop a network monitor to monitor several components using SNMP. I save all received data in a round-robin database.
I have started to create a web-based configuration center that allows users to add devices to be monitored and to access the graphs (generated with rrdtool) for all devices.
I must run daily, weekly, monthly and yearly updates of the database.
My question is: how can I launch a script that executes an SNMP command to fetch the data from a device, stores it in the database, and runs in the background? By background, I mean a process that does not depend on whether a user is logged in to the web configuration page or not.
I have never done anything in PHP, which is why I am asking.
I hope you can help me out. Thank you in advance.
Best regards.
I developed such a system a few years ago. We used Cacti, in combination with Nagios and Smokeping. Of course, if your needs are simpler, you could use cron scripts to fetch your data. But Cacti is definitely worth a look (as is Nagios, although unlike Cacti it is not specifically targeted at RRD files).
Note that none of these systems require PHP. They run standalone, as daemons. It's then pretty straightforward to write a web interface on top of that.
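If you go the cron-script route, here is a minimal sketch of what the poller could look like, assuming the php-snmp and php-rrd (PECL) extensions; the hostnames, community string, OID and RRD paths are placeholders for illustration only:

```php
<?php
// poll.php - run from cron (e.g. "*/5 * * * * php /path/to/poll.php"),
// so it keeps collecting whether or not anyone is logged in to the web UI.
// Assumes the php-snmp and php-rrd (PECL) extensions are installed;
// hostnames, community string, OID and RRD paths are placeholders.

$devices = [
    'switch1' => '/var/rrd/switch1.rrd',
    'router1' => '/var/rrd/router1.rrd',
];
$community = 'public';
$oid = '.1.3.6.1.2.1.2.2.1.10.1'; // ifInOctets on interface 1, as an example

foreach ($devices as $host => $rrdFile) {
    $raw = @snmpget($host, $community, $oid, 1000000, 3);
    if ($raw === false) {
        error_log("SNMP query failed for $host");
        continue;
    }
    // snmpget() returns something like "Counter32: 123456"; keep the number.
    $value = preg_replace('/^.*:\s*/', '', $raw);

    // Feed the sample into the round-robin database ("N" means "now").
    if (!rrd_update($rrdFile, ["N:$value"])) {
        error_log("rrd_update failed for $host: " . rrd_error());
    }
}
```

Note that the daily/weekly/monthly/yearly consolidation is handled by the RRAs you define when the RRD file is created, so the cron job only has to push raw samples.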
Related
I want each form submission of my PHP application to run through a queue, as each request takes considerable time and resources. I found out about Laravel, which is a fairly complete system for such tasks, but unfortunately it is Linux-specific.
Additionally, email support is also needed so that jobs can be retrieved by their ID.
What existing tools are there for Windows-based PHP applications?
If there are none, how can this be achieved manually using MySQL and PHP?
I found a few hits on this (e.g. here), but I still can't work out where to start.
If I understand you correctly, then RabbitMQ should be what you're after.
It supports Windows, but you'll need to write some code to get it to do what you want.
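If you do decide to roll it yourself with MySQL and PHP, a minimal sketch of a polling worker might look like the following; the table, its columns and the process() function are all hypothetical, and on Windows you could start the worker via Task Scheduler instead of cron:

```php
<?php
// worker.php - minimal MySQL-backed queue sketch (not production code).
// The web form simply INSERTs a row with status 'pending'; because every
// job has an id, it can be looked up later (e.g. from an emailed link).
//
// Hypothetical table:
//   CREATE TABLE jobs (
//     id INT AUTO_INCREMENT PRIMARY KEY,
//     payload TEXT NOT NULL,
//     status ENUM('pending','running','done','failed') DEFAULT 'pending',
//     created_at DATETIME DEFAULT CURRENT_TIMESTAMP
//   );

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

while (true) {
    // Claim one pending job inside a transaction so parallel workers
    // don't grab the same row.
    $pdo->beginTransaction();
    $job = $pdo->query(
        "SELECT id, payload FROM jobs
         WHERE status = 'pending'
         ORDER BY id LIMIT 1 FOR UPDATE"
    )->fetch(PDO::FETCH_ASSOC);

    if (!$job) {
        $pdo->commit();
        sleep(5);          // nothing to do; poll again in a few seconds
        continue;
    }

    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")
        ->execute([$job['id']]);
    $pdo->commit();

    try {
        process(json_decode($job['payload'], true)); // your long-running work
        $newStatus = 'done';
    } catch (Exception $e) {
        $newStatus = 'failed';
    }

    $pdo->prepare("UPDATE jobs SET status = ? WHERE id = ?")
        ->execute([$newStatus, $job['id']]);
}
```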
To give a brief idea of my current setup:
An HTML form collects the data for the Python script.
PHP inserts the form data (POST) into an input MySQL table.
PHP exec()s python3.4 script.py row_id.
Python collects the row ID from sys.argv[1], runs the code and inserts the results into an output MySQL table. When it completes, PHP displays a report (PHP waits for Python to finish, then I use header() to go to the report URL).
The report pages are HTML with the blanks filled in by PHP from the output MySQL table.
This worked fine for me while testing and developing. The problem came when I released it into the wild for more general testing within the company - it is hosted on a VPS, and when the server ran more than one instance of Python it ran like a dog (as we say in Scotland, because to say slow would be too easy).
Of course, the hosting company suggested upgrading, which we did, and that doubled the capacity (from 1, sometimes 2, simultaneous runs to 2, sometimes 3). I have now reached the conclusion that this method is not scalable; we are always going to hit some kind of limit.
So I have been looking at Amazon Web Services, but doing this requires a change in philosophy for the Python design - it must run all the time and have data fed to it somehow (as far as I can see).
Amazon's suggestion is Flask or a similar framework, but this would mean I either have two web servers and have to do some kind of cross-domain transfer of data, or scrap what I have done - or at least heavily modify the PHP/HTML/JS that runs on the web server so that it can be served by Flask and all be hosted on AWS.
Unless someone can suggest another way of communicating that doesn't require a web framework, I have been considering either just polling the MySQL database at intervals and processing any new data, or else migrating the database to PostgreSQL, which, I believe, allows Python code to run within the database itself and would let me trigger on insert (I believe user-defined triggers in MySQL only support C/C++).
Thanks for any suggestions or pointing out of things I am missing.
Blair
Further to infinigrove's suggestion, Pyro4 works well for my use case, and there is an excellent tutorial that provides an easy learning curve.
Selcuk's suggestion would probably also have worked, but I would still have been left with the problem of running the code on other servers in a distributed environment. Also, if there is a choice between writing Python code and writing PHP, I'm afraid I take Python every time!
I've got a registration list, and I need to send out a PDF to each person on it. Each email needs to contain a PDF which has a base version on the server, but each person's copy is personalized with name/company etc. over the top. This needs to be emailed to each person, which at the moment adds up to 2,500 people, but can easily be much higher in the future.
I've only just started working on this project, but the problem I've encountered continuously since last week is that the server doesn't seem to be able to handle it. Currently the script uses Zend, which allows it to use Zend_Pdf and Zend_Mail to create and email the PDFs. Zend_Mail connects to an SMTP server from smtp.com to do the actual emailing.
Since we have quite a few sites running on the server, we can't afford for it to go down, and when I run the job in batches the server can start to go down. The best solution I have so far is running curl from my local machine against the script, which then processes one person; the curl script then calls it again, over and over, in batches. Even this runs into problems at times, and somehow seems to hog memory even after it should be complete (I'm really not sure how).
So what I'm looking for is information on doing this - libraries, code, server setups, anything that can make this much less painful and much quicker to run. I've run out of ideas, and this is something I've not really had to do before (especially at this kind of volume).
Thank you.
Edit:
I also forgot to mention that it uses Zend_Barcode::factory to create a barcode on the PDF.
The first step I suggest is to work out where the problem lies, if you can. Is it the PDF generation? Is it the emailing? "The server doesn't seem to be able to handle this" doesn't say what is actually failing, and neither does "the server goes down" - you need to determine whether you are running out of memory, disk space, time or something else. That will help you decide whether you need a tweak or a whole new approach to your generation. Since you said that even single manual invocations can fail, you should be able to narrow the problem down to the exact cause of the failure.
If you are running near some resource limit (which might be the case with several sites running), you probably need to offload this capability onto another machine. Your options include:
run the same setup on a new host and adjust your applications to use the new system
run a new setup on a new host
use an external system (such as the mentioned PDFCrowd or Docmosis)
Start with the specifics of the problem. I hope that helps. Please note I work for the company that created Docmosis.
Here are some ideas:
Is there a particular reason this has to run on a web server? Why not run the framework from a different machine, but with the same settings? You might have to create a different controller to handle the command-line version of the request, but there's no fundamental reason it can't work.
If creating PDFs programmatically is giving you a headache, you can use a service instead. In the past, I've used PDFCrowd with good results, and they provide a useful PHP library. You can give them a blob of HTML, using full URLs for any stylesheets and images, and they'll create a PDF for you. The cost varies from 0.5 to 4.5 cents per document, depending on your rate plan. There are other services which do the same thing.
If this kind of batch job is a big deal for your company, you might consider an asynchronous job queue like beanstalk. You could queue up thousands of these jobs, and a worker script could handle the requests at whatever pace you deem reasonable.
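For illustration, a rough sketch using the pheanstalk client library against a beanstalkd daemon might look like this (the exact API varies between pheanstalk versions, and the tube name, payload and generatePdfAndEmail() helper are made up):

```php
<?php
// Producer (e.g. in the web request): queue one job per recipient.
require 'vendor/autoload.php';

$queue = new Pheanstalk\Pheanstalk('127.0.0.1');
$queue->useTube('pdf-mail')->put(json_encode([
    'user_id' => 42,                      // hypothetical payload
    'email'   => 'someone@example.com',
]));

// Worker (separate long-running CLI process): drain jobs at your own pace.
$worker = new Pheanstalk\Pheanstalk('127.0.0.1');
$worker->watch('pdf-mail')->ignore('default');

while ($job = $worker->reserve()) {
    $data = json_decode($job->getData(), true);
    generatePdfAndEmail($data);   // placeholder for the Zend_Pdf/Zend_Mail code
    $worker->delete($job);
}
```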
From my experience - two options:
Dynamically generate PDFs using one or more PDF libraries (which can be awfully slow).
OR
Use something like wkhtmltopdf, which is a simple shell utility that converts HTML to PDF using the WebKit rendering engine and Qt.
Basically, you can loop over n HTML pages and generate PDFs without the overhead of purely dynamic PDF generation!
We've used this to distribute thousands of personalised PDFs on a daily basis, as it quickly converts HTML pages to PDF. There are dependencies, but it works and is less computationally intensive than 'creating' PDFs individually.
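As a rough illustration of that approach, something along these lines would fill a per-person HTML template and shell out to wkhtmltopdf (the template path, field names and output location are placeholders):

```php
<?php
// Fill a per-person HTML template, then call wkhtmltopdf to render it.
// Paths, template and field names are placeholders.

function buildPdf(array $person, string $outFile): bool
{
    $html = strtr(file_get_contents('/var/app/templates/certificate.html'), [
        '{{name}}'    => htmlspecialchars($person['name']),
        '{{company}}' => htmlspecialchars($person['company']),
    ]);

    $htmlFile = tempnam(sys_get_temp_dir(), 'pdf_') . '.html';
    file_put_contents($htmlFile, $html);

    exec('wkhtmltopdf ' . escapeshellarg($htmlFile) . ' ' . escapeshellarg($outFile),
         $output, $exitCode);

    unlink($htmlFile);
    return $exitCode === 0;
}

// Example usage:
// buildPdf(['name' => 'Jane Doe', 'company' => 'Acme'], '/tmp/jane.pdf');
```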
Hope this helps.
If you are trying to call the script over HTTP, the script will time out based on the max_execution_time specified in php.ini.
You need to write a PHP script that can be run from the command line and then schedule it via a cron job. The script can handle one user at a time: put together their PDF file and email it to them. After that, you might have to run some performance checks to see whether the server can handle the process.
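As a sketch of that cron-driven batch (the table, its columns and the sendPdfTo() helper are invented for illustration; the LIMIT keeps each run small so the server is never swamped):

```php
<?php
// send_batch.php - run from cron, e.g. "*/5 * * * * php /var/app/send_batch.php".
// Table, columns and sendPdfTo() are hypothetical; sendPdfTo() stands in
// for the existing Zend_Pdf/Zend_Mail code.

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

$rows = $pdo->query(
    "SELECT id, name, email FROM registrations
     WHERE sent_at IS NULL
     ORDER BY id LIMIT 25"
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    if (sendPdfTo($row)) {
        $pdo->prepare("UPDATE registrations SET sent_at = NOW() WHERE id = ?")
            ->execute([$row['id']]);
    }
}
```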
I'm building a web application, and I need an architecture that allows me to run it across two servers. The application scrapes information from other sites, both periodically and on input from the end user. To do this I'm using PHP + curl to scrape the information, and PHP or Python to parse it and store the results in a MySQL DB.
Then I will use Python to run some algorithms on the data; this will happen both periodically and on input from the end user. I'm going to cache some of the results in the MySQL DB, and sometimes, if a result is specific to the user, skip storing it and serve it directly to the user.
I'm thinking of using PHP for the website front end on a separate web server, and running the PHP spider, the MySQL DB and Python on another server.
What framework(s) should I use for this kind of job? Are MVC and CakePHP a good solution? If so, will I be able to control and monitor the Python code with it?
Thanks
How do I go about implementing this?
Too big a question for a single answer here. Certainly you don't want two sets of code for the scraping (one for scheduled runs, one for on-demand). In addition to the added complication, you really don't want to be running a job that will take an indefinite time to complete within the thread generated by a request to your web server - user requests for a scrape should be run via the scheduling mechanism and reported back to users (although if necessary you could use Ajax polling to give the illusion that it's happening in the same thread).
What frame work(s) should I use?
Frameworks are not magic bullets. And you shouldn't be choosing a framework based primarily on the nature of the application you are writing. Certainly if specific, critical functionality is precluded by a specific framework, then you are using the wrong framework - but in my experience that has never been the case - you just need to write some code yourself.
using something more complex than a cron job
Yes, a cron job is probably not the right way to go, for lots of reasons. If it were me, I'd look at writing a daemon which would schedule scrapes (and accept connections from web page scripts to enqueue additional scrapes), but I'd run the scrapes themselves as separate processes.
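A very rough sketch of that shape, purely for illustration (the local port, the scrape.php worker script and the 15-minute schedule are all assumptions):

```php
<?php
// scrape_daemon.php - rough daemon sketch, not production code.
// Web scripts connect to a local socket and send one URL per line to
// enqueue a scrape; the daemon also fires periodic scrapes, and every
// scrape runs as a separate process so the daemon never blocks.

$server = stream_socket_server('tcp://127.0.0.1:9999', $errno, $errstr);
if (!$server) {
    die("Cannot listen: $errstr\n");
}

$queue = [];                 // URLs waiting to be scraped
$nextScheduled = time();

while (true) {
    // 1. Accept enqueue requests from web page scripts.
    if ($conn = @stream_socket_accept($server, 0)) {
        $url = trim(fgets($conn));
        if ($url !== '') {
            $queue[] = $url;
        }
        fclose($conn);
    }

    // 2. Add the periodic scrapes, e.g. every 15 minutes.
    if (time() >= $nextScheduled) {
        $queue[] = 'http://example.com/page-to-scrape';
        $nextScheduled = time() + 15 * 60;
    }

    // 3. Run each queued scrape as a separate background process.
    while ($url = array_shift($queue)) {
        exec('php /var/app/scrape.php ' . escapeshellarg($url) .
             ' > /dev/null 2>&1 &');
    }

    usleep(200000);          // don't spin the CPU
}
```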
Is MVC a good architecture for this? (I'm new to MVC, architectures etc.)
No. Don't start by thinking about whether a pattern fits the application - patterns are a useful teaching tool, but they describe what code is, not what it will be.
(Your application might include some MVC patterns - but it should also include lots of other ones.)
C.
I think you already have a clear idea of how to organize your layers.
First of all, you will need a web framework for your front end.
You have many choices here; CakePHP, as far as I know, is a good choice, and it is designed to make you follow the MVC design pattern.
Then you will need to design your database to store what users want to be spidered.
Your DB will be accessed by your web application to store user requests, by your PHP script to know what to scrape, and finally by your Python batch to confirm to the users that the requested data is available.
A possible, over-simplified scenario (sketched in code below):
A user registers on your site.
The user asks to grab a random page from Wikipedia.
The request is stored through the CakePHP application in the DB.
A cron PHP batch starts and checks the DB for new requests.
The batch finds the new request and scrapes the page from Wikipedia.
The batch updates the DB with a 'scraped' flag.
A cron Python batch starts and checks the DB for new 'scraped' flags.
The batch finds the new 'scraped' row and parses the Wikipedia content to extract some tags.
The batch updates the DB with a 'done' flag.
The user finds the requested information on his profile.
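For concreteness, the PHP side of steps 4-6 could be a cron script along these lines (the requests table, its columns and the status values are all hypothetical; the Python batch would then pick up rows with status 'scraped' and set them to 'done'):

```php
<?php
// scrape_batch.php - sketch of steps 4-6, run from cron.
// Hypothetical table:
//   CREATE TABLE requests (
//     id INT AUTO_INCREMENT PRIMARY KEY,
//     user_id INT NOT NULL,
//     url VARCHAR(255) NOT NULL,
//     status ENUM('new','scraped','done') DEFAULT 'new',
//     raw_html MEDIUMTEXT NULL
//   );

$pdo = new PDO('mysql:host=localhost;dbname=spider', 'user', 'secret');

$rows = $pdo->query(
    "SELECT id, url FROM requests WHERE status = 'new'"
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    $ch = curl_init($row['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $html = curl_exec($ch);
    curl_close($ch);

    $pdo->prepare(
        "UPDATE requests SET raw_html = ?, status = 'scraped' WHERE id = ?"
    )->execute([$html, $row['id']]);
}
```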
I have a database written in Access. The Access .mdb file connects via ODBC to a local MySQL database. I have a bunch of SQL and VBA code in the Access file. I don't expect the database to surpass 100 MB; currently it is around 10 MB. I will need multiple-user access (no more than 10 users at a time).
I need to convert this database from a local one to one on a web server, and I need to make a web interface for it.
How do I get the current local instance of the MySQL database to run off a web server? I am currently running it off WampServer 2.0. I don't have any experience putting a database on a web server.
I have an OK VB.NET background, but I have never done any web applications. Here's a picture of the Access form that I may need to replicate to work off a website:
(screenshot: http://img42.imageshack.us/img42/1025/83882488.jpg)
Which platform should I use as the front end for this thing?
Would it be possible to just run this Access file off a web server instead of programming a new front end for it? Is that not a smart idea?
Thank you for your help!
If your web server has TCP connectivity to your existing database server, and it is hosted in a suitable place (e.g. don't have your web server in a datacenter connecting to a database server over your office DSL connection), then no move is required.
If you do need to move it, it's as easy as creating a backup/dump, and restoring it elsewhere.
As for the front end, there are MANY technologies that will do what you need (ASP.NET, PHP, Python, Ruby, Perl and Java being the most popular ones, not necessarily in that order).
Use something you are comfortable with, or that you are interested in learning (provided you have the time to do so).
Use something that runs properly on your target webserver. Really, ASP.NET is the only one that has any major issue here, as it's limited to Windows.
Access itself has no direct web-accessible version. A Google search finds some apps that claim to convert Access forms to web-based ones, but I will not link to any because I don't know how well they work. I'm certainly leery of anything like that, because web apps are a different breed from Windows apps. If you are going to go that route, be sure they actually generate HTML output, produce sane, clean source, and offer a free trial so you can verify that they actually work.
Really though, a form like that is reasonably easy to reproduce with some basic knowledge of server-side programming and some HTML.
I don't have any experience migrating Access to a web-based interface, although I have heard of people going straight from Access to a web page. MySQL is exceptionally easy to migrate. The standard install of MySQL comes with a program called mysqldump that lets you export your database straight to a text file, which can then be imported on another server with the mysql command-line client. I don't believe the WAMP server comes with the command-line tools, although they can be downloaded from mysql.com. However, if it has phpMyAdmin, there is also an export feature there that will generate a .sql file which can be imported on the web server using phpMyAdmin. One thing to keep in mind, though, is that I have had very little success mixing and matching these methods: I've never been able to get a mysqldump-created file to work with phpMyAdmin, and vice versa.
Good luck!
The link will help you export and import a MySQL database.
There may be a way to run Access files on a Windows web server - you can check - but in any case, if you have some programming skills, I would say it is not difficult to create a PHP script that will query your database and let you edit the data.
Migrating an Access application to the web is quite difficult, because you can't translate an Access form 1:1 into a web page. Web apps are stateless, whereas Access is built around the concept of bound controls and bound datasets.
Secondly, there is no easy way to replicate an Access subform.
Third, you lose tons of events that Access forms and controls are built around.
In general, a web page that performs the same task as an Access form will bear little or no resemblance to the Access form, simply because the methods for accomplishing the same tasks and the UI widgets available to you are so completely different.
One thing to consider is whether your users need a web application or if they just need to use your existing Access application over the Internet. If the latter is the case, Windows Terminal Server/Citrix can do the job for a lot less money, since there's no conversion needed. You do need to provision a Windows Terminal Server, set up a VPN and purchase CALs for the users, but the costs of those are going to be much less than the cost of rebuilding the app for web deployment.
It may not be an appropriate solution, but it's one that you should consider, I think.