Is it possible to get data from Avaya CMS (16.3) (for call centers) with PHP?
I want to create real-time statistics with PHP, but I don't know how to get data from the database. :/
If anyone is searching for an answer: yes, it is possible, but only partially.
You need to create a script in CMS Supervisor that exports the data to a TXT file. Next, using PHP, you load this file into the database.
I have been using this solution without problems for the last 3 years.
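As a rough idea of the loading side, here is a minimal sketch; the file path, delimiter, and table layout are placeholders for whatever your Supervisor script actually exports:

```php
<?php
// Minimal sketch of the loading side: read a tab-delimited CMS Supervisor
// export and push each row into MySQL. File path, delimiter, table and
// column names are placeholders -- adjust them to your own export script.

$pdo = new PDO('mysql:host=localhost;dbname=callcenter;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$insert = $pdo->prepare(
    'INSERT INTO cms_stats (skill, calls_waiting, oldest_call, agents_avail)
     VALUES (?, ?, ?, ?)'
);

$handle = fopen('/var/exports/cms_realtime.txt', 'r');
if ($handle === false) {
    die('Export file not found');
}

while (($fields = fgetcsv($handle, 1024, "\t")) !== false) {
    if (count($fields) < 4 || !is_numeric($fields[1])) {
        continue; // skip the header line and anything malformed
    }
    $insert->execute(array_slice($fields, 0, 4));
}

fclose($handle);
```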
ODBC is of no use for real-time data, since there is no ODBC interface to the Real Time Database (RTDB) in CMS. The only practical solution to get that data is to run terminal reports with CLINT, parse them, and store the data in your database. See the Tek-Tips thread that has some information on how to do this.
Another option is to use CMS Webdash, which is a web interface for CMS but can be used as a data source too.
Sounds like an older Avaya switch? In that case, the console-based client (hopefully included) on the CMS server, clint, could be used for screen scraping. It's quite a project writing custom reports from scratch and then making another application log on, start clint, and begin scraping that report, but it works and may be an alternative if there is no database access.
I don't know much about newer Avaya switches, but they may have more features than this.
You can use clintSVR, which is a high-level tool based on CMS CLINT. With clintSVR, you can use CGI, OCX, and C++ interfaces to get the real-time data from CMS. For PHP, you can use the CGI interface to get the real-time data.
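For PHP, calling such a CGI interface is just an HTTP request. A minimal sketch follows; the URL, query parameters, and response format are hypothetical, so check the clintSVR documentation for the real ones:

```php
<?php
// Minimal sketch: pull real-time data from a CGI endpoint over HTTP.
// The URL and parameters are hypothetical placeholders.

$url = 'http://cms-server.example.com/cgi-bin/clintsvr?report=realtime&skill=5';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$response = curl_exec($ch);

if ($response === false) {
    die('Request failed: ' . curl_error($ch));
}
curl_close($ch);

// Whatever the interface returns (plain text assumed here) can then be
// parsed and pushed into your own database or straight into the page.
foreach (explode("\n", trim($response)) as $line) {
    echo htmlspecialchars($line) . "<br>\n";
}
```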
Related
I have been put in charge of building an IVR using VXML and ASP.NET. For some reason the voice server we are using requires ASP.NET and cannot use PHP in conjunction with VXML, so I am stuck learning ASP.NET. The application is pretty simple in that it runs an ASP.NET file with VXML and should pull data from a database based on user input.
Example:
User enters customer ID "23313"
It should then pull the data from our SQL Server 2012 DB that corresponds to that ID and read it back via prompts. Simple enough, I figured.
I have a couple of questions regarding possible solutions to this:
Is it possible for ASP/VXML to pull data from PHP dynamically (POST or GET requests) and use the data in the current VXML document? Or will I have to bite the bullet and figure out a second page?
If using PHP is not ideal or possible, would it be better or possible to add a DB connection to the ASP/VXML document and run the IVR that way?
I am not very familiar with ASP.NET, and am trying to find out the most efficient way to accomplish my goal without having to have an additional VXML page to run.
Any help appreciated.
EDIT
After further investigation and help from Jim, I was able to get inline PHP working. The server I was using was set to go specifically to this ASP.NET file and did not have PHP installed on the server itself. After installing PHP and changing where the server was looking for the file, I am able to run the latest PHP version in my app.
Deleted code sample as it was completely irrelevant
The ASP requirement seems odd, unless you are leveraging some type of library within the ASP.NET environment. VoiceXML browsers are just that: browsers. They should be able to process VXML from standard sources. I suspect you are working within a framework that requires the server-side ASP.
If your browser is VoiceXML 2.1 compliant, you should have access to the Data element. This element allows you to make GET and POST requests to a server, get XML back, and parse the data within JavaScript. Note that the returned data must be valid XML.
Any database connection would have to be on the ASP.NET side of the solution. VoiceXML gets data by transitioning to a new page (the goto or subdialog element) or via the Data element above.
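Whatever server-side language ends up serving it, the shape is the same: an endpoint that takes the customer ID and returns well-formed XML for the Data element to fetch. A rough PHP sketch, since the question is tagged PHP (the DSN, table, and column names are placeholders and assume the pdo_sqlsrv driver):

```php
<?php
// Minimal sketch of a server-side endpoint for the VoiceXML Data element.
// DSN, credentials, table and column names are placeholders; assumes the
// pdo_sqlsrv driver is installed for SQL Server 2012.

header('Content-Type: text/xml; charset=utf-8');

$customerId = isset($_GET['customer_id']) ? (int) $_GET['customer_id'] : 0;

$pdo  = new PDO('sqlsrv:Server=localhost;Database=ivr', 'user', 'pass');
$stmt = $pdo->prepare('SELECT name, balance FROM customers WHERE customer_id = ?');
$stmt->execute(array($customerId));
$row = $stmt->fetch(PDO::FETCH_ASSOC);

// The Data element requires well-formed XML in the response.
echo '<?xml version="1.0" encoding="UTF-8"?>';
if ($row) {
    printf(
        '<customer><name>%s</name><balance>%s</balance></customer>',
        htmlspecialchars($row['name']),
        htmlspecialchars($row['balance'])
    );
} else {
    echo '<customer/>';
}
```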
I have a PHP system that does everything a social media platform does, i.e. adding comments, uploading images, adding objects, logins, sessions, etc., storing all interactions in a MySQL database. So I've got a pretty good infrastructure to build on.
The next stage of my project is to develop the system so that notifications are sent to the "Networks" of "Contacts" that are associated with one another, like Facebook's notification system, e.g. "Chris has just commented on object N".
I'm looking at implementing this system for a lot of users (10,000+), so it has to be reliable. I've researched how Facebook does it, which turned up techniques such as memcache, sockets, and hashing.
Are there any systems that can be easily adapted to this functionality? I could do with a quick, reliable implementation.
P.S. One thought I had was just querying the database every 5 seconds, e.g. "select everything that has happened in the last 5 seconds", using jQuery, Ajax, and PHP, but that's stupid; it would exhaust the server and database, right?
I've seen this website and this article; can anyone reflect on these and tell me what the best approach is, as I am hesitant about which path to follow?
Thanks
This is not possible with just pure vanilla PHP/MySQL. What you can do is set MySQL triggers (http://dev.mysql.com/doc/refman/5.0/en/triggers.html) on your data and then use the UDF sys_exec (which you will need to install on your MySQL server) to run the notification PHP script. See this post: Invoking a PHP script from a MySQL trigger.
If you can get this set up, it should be pretty reliable and fast.
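A rough sketch of the trigger side, created here through PHP/PDO for convenience. It assumes the sys_exec UDF (lib_mysqludf_sys) is installed; the comments table and the notify.php script are placeholder names:

```php
<?php
// Minimal sketch: a trigger that shells out to a notification script whenever
// a comment row is inserted. Assumes the sys_exec UDF (lib_mysqludf_sys) is
// installed; the comments table and notify.php script are placeholders.

$pdo = new PDO('mysql:host=localhost;dbname=social;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec("
    CREATE TRIGGER notify_after_comment
    AFTER INSERT ON comments
    FOR EACH ROW
      SET @rc = sys_exec(
          CONCAT('php /var/www/scripts/notify.php ', NEW.id, ' > /dev/null 2>&1 &')
      )
");
```

The notify.php script would then look up the contacts associated with that comment and write the notification rows; backgrounding the command (the trailing &) keeps the INSERT from waiting on it.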
I have a WordPress plugin, which checks for an updated version of itself every hour with my website. On my website, I have a script running which listens for such update requests and responds with data.
What I want to implement is some basic analytics for this script, which can give me information like the number of requests per day, the number of unique requests per day/week/month, etc.
What is the best way to go about this?
Use some existing analytics script which can do the job for me
Log this information in a file on the server and process that file on my computer to get the information out
Log this information in a database on the server and use queries to fetch the information
Also there will be about 4000 to 5000 requests every hour, so whatever approach I take should not be too heavy on the server.
I know this is a very open ended question, but I couldn't find anything useful that can get me started in a particular direction.
Wow. I'm surprised this doesn't have any answers yet. Anyways, here goes:
1. Using an existing script / framework
Obviously, Google Analytics won't work for you since it is JavaScript based. I'm sure there are PHP analytics frameworks out there. Whether you use them or not is really a matter of personal choice. Do these existing frameworks record everything you need? If not, do they lend themselves to being easily modified? You could use a good existing framework and choose not to reinvent the wheel. Personally, I would write my own, just for the learning experience.
I don't know any such frameworks off the top of my head because I've never needed one. I could do a Google search and paste the first few results here, but then so could you.
2. Log in a file or MySQL
There is absolutely NO GOOD REASON to log to a file. You'd first log it to a file. Then write a script to parse this file. Tomorrow you decide you want to capture some additional information. You now need to modify your parsing script. This will get messy. What I'm getting at is: you do not need to use a file as an intermediate store before the database. 4-5K write requests an hour (I don't think there will be a lot of read requests apart from when you query the DB) is a breeze for MySQL. Furthermore, since this DB won't be used to serve up data to users, you don't care if it is slightly unoptimized. As I see it, you're the only one who'll be querying the database.
EDIT:
When you talked about using a file, I assumed you meant to use it as a temporary store only, until you process the file and transfer the contents to a DB. If you did not mean that, and instead meant to store the information permanently in files, that would be a nightmare. Imagine trying to query for certain information that is scattered across files. Not only would you have to write a script that can parse the files, you'd have to write a non-trivial script that can query them without loading all the contents into memory. That would get nasty very, very fast and tremendously impair your ability to spot trends in the data, etc.
Once again: 4-5K might seem like a lot of requests, but a well-optimized DB can handle it. Querying a reasonably optimized DB will be orders of magnitude faster than parsing and querying numerous files.
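To make the log-straight-to-MySQL suggestion concrete, here is a minimal sketch of what the update-check listener could do on each request; the table layout and the version parameter are assumptions, not anything from your existing script:

```php
<?php
// Minimal sketch: record one row per update-check request.
// Assumed table:
//   CREATE TABLE update_requests (
//     id INT AUTO_INCREMENT PRIMARY KEY,
//     site_hash CHAR(32),
//     plugin_version VARCHAR(20),
//     requested_at DATETIME
//   );

$pdo = new PDO('mysql:host=localhost;dbname=plugin_stats;charset=utf8', 'user', 'pass');

$stmt = $pdo->prepare(
    'INSERT INTO update_requests (site_hash, plugin_version, requested_at)
     VALUES (?, ?, NOW())'
);
$stmt->execute(array(
    md5($_SERVER['REMOTE_ADDR']),                            // hashed IP for unique counts
    isset($_GET['version']) ? $_GET['version'] : 'unknown',  // assumed query parameter
));
```

Daily totals and uniques then reduce to simple queries, e.g. SELECT COUNT(*), COUNT(DISTINCT site_hash) FROM update_requests WHERE requested_at >= CURDATE().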
I would recommend using an existing script or framework. It is always a good idea to use a specialized tool in which people have invested a lot of time and ideas. Since you are using PHP, Piwik seems to be one way to go. From their webpage:
Piwik is a downloadable, Free/Libre (GPLv3 licensed) real time web analytics software program. It provides you with detailed reports on your website visitors: the search engines and keywords they used, the language they speak, your popular pages…
Piwik provides a Tracking API, and you can track custom variables. The DB schema seems highly optimized; have a look at their testimonials page.
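If you go the Piwik route, the server-side Tracking API boils down to one HTTP request per hit. A rough sketch, where the Piwik installation URL and the site id are placeholders (see the Tracking API docs for the full parameter list):

```php
<?php
// Minimal sketch: register each update-check as a tracked action via Piwik's
// Tracking HTTP API. The Piwik installation URL and idsite are placeholders.

$params = array(
    'idsite'      => 1,                                        // your site id in Piwik
    'rec'         => 1,                                        // required by the API
    'url'         => 'http://example.com/plugin-update-check', // "page" being tracked
    'action_name' => 'Plugin update check',
    'rand'        => mt_rand(),                                // cache buster
);

$trackingUrl = 'http://piwik.example.com/piwik.php?' . http_build_query($params);

// Fire and forget with a short timeout so the update endpoint stays fast.
$context = stream_context_create(array('http' => array('timeout' => 2)));
@file_get_contents($trackingUrl, false, $context);
```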
I read some nice articles about how to connect to a remote MySQL database via Android.
Found some really interesting links here and here.
So the common way of getting data seems to be using some kind of web service (an interface, in this case a PHP script) which queries the DB and renders the result in JSON (or XML) format. Then it's possible to parse this output with Android's JSONObject implementation. So far so good.
Receiving data from the database and showing it in an Android ListView was done in a matter of minutes.
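For reference, a stripped-down version of that read side looks roughly like this (table and column names are just placeholders):

```php
<?php
// Minimal sketch of the "web service" read side: query MySQL and emit JSON
// for the Android client to parse. Table and column names are placeholders.

header('Content-Type: application/json');

$pdo  = new PDO('mysql:host=localhost;dbname=app;charset=utf8', 'user', 'pass');
$stmt = $pdo->query('SELECT id, player, score FROM highscores ORDER BY score DESC LIMIT 20');

echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```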
But what is the best practice for writing (inserting) data into tables?
Should a web service be used here too (rather than a direct MySQL connection)?
What is the best method to push data to a web service (for example, to insert a new entity into a database), and which format should be used?
In this case I do not use any HTML forms or anything to post the parameters. So how do I post these parameters to the PHP script (from within the Android app)?
Of course, this operation should be secure as well. Implementing a data manipulation mechanism is a bit riskier (in order to keep the DB consistent).
I think that many apps use some kind of DB to synchronize data (e.g. high scores).
So there should be a best practice for that.
I would recommend keeping anything database-specific hidden behind a web service.
If you build a dependency on MySQL into your application and later find that you need to change databases, the entire installed base has to be cut over. Think about the logistics of accomplishing that for a few minutes and you'll start to realize it's a nightmare.
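To make that concrete, the write side of such a web service can stay very small. A minimal sketch; the endpoint, table layout, and the shared-secret check are placeholders, so use real authentication (e.g. per-user tokens over HTTPS) in practice:

```php
<?php
// Minimal sketch of a write endpoint that keeps MySQL hidden from the app.
// The Android client POSTs plain form fields; the api_key check below is a
// placeholder for real authentication.

if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
    header('HTTP/1.1 405 Method Not Allowed');
    exit;
}

if (!isset($_POST['api_key']) || $_POST['api_key'] !== 'change-me') {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$pdo  = new PDO('mysql:host=localhost;dbname=app;charset=utf8', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO highscores (player, score) VALUES (?, ?)');
$stmt->execute(array(
    $_POST['player'],
    (int) $_POST['score'],
));

header('Content-Type: application/json');
echo json_encode(array('status' => 'ok'));
```

On the Android side the app just POSTs name/value pairs to that URL; if you later swap MySQL for something else, only this script changes.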
Premiumsoft's Navicat for MySQL comes with an HTTP tunnel (a PHP script) you might be able to use. It basically provides a method for doing anything to a MySQL database over HTTP.
I'd just make sure there are no licensing issues if you plan to distribute your app.
This is a general programming question.
What is the best way to make a light blogging system that can handle images, bbcode-ish styling and text without a database back end? Light means not more than 50 to 100 posts in extreme cases.
What language(s) should be used? Is there any preferred data format for the information? How does security play out?
EDIT: Client has no database, is on a shared server. Can't change that. Therefore, no DB.
EDIT2:
Someone mentioned SQL Compact - does that require anything more than copying files to the server? The key here is again that things shouldn't require any more permissions than FTP access.
If you're looking to do it yourself: store each post as a file in a directory. Then, to sort and limit the posts, you rely partially on the file names to order and limit them, and potentially (in the case of a search) on reading every last file. Don't go letting users make 10,000 posts, though. But yeah, the above is considered a flat-file data format. You can get fancy by using a standard format like JSON, YAML, or XML within each post file, and even fancier by requesting these with Ajax calls in mostly client-side code.
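A minimal sketch of that flat-file idea, assuming one JSON file per post named with a date prefix so that filename order is date order:

```php
<?php
// Minimal sketch of the flat-file approach: one JSON file per post in
// posts/, named like YYYY-MM-DD-slug.json so sorting filenames sorts by date.

$files = glob(__DIR__ . '/posts/*.json');
rsort($files);                        // newest first, thanks to the date prefix
$files = array_slice($files, 0, 10);  // crude paging: ten most recent posts

foreach ($files as $file) {
    $post = json_decode(file_get_contents($file), true);
    if ($post === null) {
        continue; // skip unreadable or corrupt post files
    }
    printf(
        "<article><h2>%s</h2><div>%s</div></article>\n",
        htmlspecialchars($post['title']),
        $post['html']                 // assumed pre-rendered, trusted markup
    );
}
```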
Now, if the reason you want to work with flat files is that you just don't want to install a database server, there's nothing stopping you from reading a local (to the server) file as a Berkeley DB, a Lucene index, or an SQLite DB from within your webapp using the appropriate client library. You'll find any of these approaches a little more sane (a bit faster, a bit more readable in code) than the aforementioned one, with all the same requirements for installing on the server (read-write file permissions). Many web frameworks or languages (like PHP) come with the option of an API to these client libraries; SQLite and Lucy (C Lucene) particularly.
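For the SQLite variant in particular, PHP's bundled PDO driver means the whole database is one writable file next to your scripts. A rough sketch:

```php
<?php
// Minimal sketch of the SQLite option: the entire "database server" is just
// a writable file, accessed through PHP's bundled PDO SQLite driver.

$pdo = new PDO('sqlite:' . __DIR__ . '/blog.sqlite');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE IF NOT EXISTS posts (
    id         INTEGER PRIMARY KEY AUTOINCREMENT,
    title      TEXT NOT NULL,
    body       TEXT NOT NULL,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
)');

// Insert a post...
$stmt = $pdo->prepare('INSERT INTO posts (title, body) VALUES (?, ?)');
$stmt->execute(array('Hello world', 'First post, no database server required.'));

// ...and list the ten most recent.
foreach ($pdo->query('SELECT title, created_at FROM posts ORDER BY id DESC LIMIT 10') as $row) {
    echo htmlspecialchars($row['title']) . ' (' . $row['created_at'] . ")\n";
}
```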
If you're just looking for examples of it being done, I first came across Blosxom (I think in 1999 or 2000), which is a Perl script that either runs as a CGI script per request or as a cron job. It builds a dated index of "posts" based on whatever you throw into the directory it's meant to scan. It also builds an RSS feed.
Jekyll or Blogofile are my favorite kind of solution for that, "compiling pages before upload".
I'm going to go out on a limb here and say that it's not always the destination, but the journey.
If you're going to set out to do this, I recommend using a language you are comfortable with. Personally, this would be C#/.NET for me, but from your tagging, I'll assume PHP would be the server-side scripting language you would choose.
I would lay out how I wanted my application to behave. If there is going to be a lot of data, you should consider (as dlamblin mentioned) a DB of some sort for lookup and retrieval. (Light blog, not so much data... 1,000 users can edit? Maybe you should consider a DB.) Once you've decided how to store the data, decide how to present it.
Write some proof of concept code for each of the features you want to implement (blog templating, bbcode, user authentication, text searching...) and start to work them all together.
Search for flat-file CMSes on Google, for example:
http://www.flatcms.org/
This has already been done, so there is no need to create such a CMS again. There are plenty of them.
I concur with dusoft that this has already been done.
DotNetBlogEngine.net is an ASP.NET (C#) based blogging system that has a nice XML back-end as an option.
Doesn't answer your question directly but check Unify.
If you do not want to write a new one or want to get some inspiration:
Flatpress
Simple PHP Blog
Ninja Designs are working on a DB-free WordPress clone
You could either use XML, or use SQL Server Compact (which allows for handling things just like SQL Server, but instead of a database server you use files on disk).