This is a fairly general question, but I'm not sure of the best way to go about things here.
I have two applications that need to interface. One is a Windows-based app that has a database and can send cURL requests. The other is a very simple website with a MySQL database.
The main feature is that these two apps can exchange database data. The Windows app currently uses SQL Anywhere but could be converted to MySQL.
Anyway: on the web app there is a JS function to dump all the requested data into a .txt file, essentially a MySQL dump. This function will be called by the Windows app via cURL. It will say,
"Hey, dump the data for this table into a txt file, then let me download it."
What I am unsure of is this: once the request to dump the data is made, the Windows app will want the file right away. How do I say back to it, "Wait until the file is completed, and then you can download it"?
I was thinking of writing to a dummy file first and renaming it to .txt when it's done, so the Windows app essentially polls in a loop (with a timeout) until the .txt file appears. Is this a good way to approach this?
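To illustrate the idea, here's a rough, untested sketch of what I picture the dump script doing on the PHP side (table name, paths and credentials are placeholders):

<?php
// Sketch: write the dump under a temporary name and rename it when finished,
// so the .txt file only ever exists in a complete state.
// Table name, output paths, and credentials are placeholders.
$pdo  = new PDO('mysql:host=localhost;dbname=webapp', 'user', 'pass');
$tmp  = 'exports/orders.txt.part';
$done = 'exports/orders.txt';

$fp = fopen($tmp, 'w');
foreach ($pdo->query('SELECT * FROM orders', PDO::FETCH_NUM) as $row) {
    fputcsv($fp, $row, "\t");   // one tab-separated line per row
}
fclose($fp);
rename($tmp, $done);            // the .txt only appears once the dump is complete

The Windows app would then poll for the .txt via cURL, retrying until a timeout.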
Thank you.
I want to implement a simple long-polling system in PHP. A simple scenario:
The project is based on two websites, Website A and Website B. There are two users: one on Website A (UserA) and one on Website B (UserB). On Website A there is a button. If UserA pushes the button, the color of Website B changes instantly.
Of course I can do this with a MySQL database, but that seems far too heavy, because I just want to transfer one bit.
Are there any other options for storing one bit on the server and having access to it from all PHP pages hosted there?
I thought I could use a simple .txt file, but I am not sure whether the server crashes if two different websites try to access the same file at once. Is this a problem?
Or do you have any other ideas how to solve this?
I would not recommend using a text file, since file I/O is pretty slow compared to other methods.
You would have to read the file on every page load/refresh, or even worse, with an AJAX request to make it instant. I would recommend something like Redis or Memcached and some sort of AJAX call to read from it (if you want it to be instant).
If you don't have access to the server to install that kind of software, I would use a MySQL database.
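For example, with the phpredis extension a long-poll endpoint could be as small as this sketch (the key name 'button_state' and the 30-second timeout are just placeholders):

<?php
// Sketch: hold the request open until the single "bit" in Redis changes
// or a timeout expires. Requires the phpredis extension; the key name
// and the timeout are placeholders.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$last = isset($_GET['last']) ? $_GET['last'] : '0';  // value the client saw last
for ($i = 0; $i < 30; $i++) {
    $current = (string) $redis->get('button_state');
    if ($current !== $last) {   // state changed: answer immediately
        echo $current;
        exit;
    }
    sleep(1);                   // otherwise wait a second and re-check
}
echo $last;                     // timeout: report no change

Website A's button handler would just do $redis->set('button_state', '1'); and UserB's page re-issues the AJAX call each time it returns.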
Hope it helps
I'm generating a new CSV file approximately every 2 minutes on my machine through a local application that I've written, and I need this file to update my database each time it is generated. I've successfully done this locally via a scheduled, repeating .bat file, and now I need to move this process online so my website has access to this data in as similar a time frame as possible.
I'm totally green on MySQL and learning it as I go, so I was wondering if there is anything I should be concerned about or any best practices I should follow for this task.
Can I connect directly to my server-side database from my cmd window (.bat file) and send this data once the process has run and generated the CSV file? Or do I need to upload the file via FTP/PHP to my web server and import it into the database there?
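To make the second option concrete, this is roughly the kind of PHP import script I have in mind, untested (table name, columns, and file path are placeholders):

<?php
// Sketch: read the uploaded CSV line by line and insert each row.
// Table name, column list, and file path are placeholders.
$pdo  = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO readings (col_a, col_b, col_c) VALUES (?, ?, ?)');

$fp = fopen('/path/to/uploaded.csv', 'r');
fgetcsv($fp);                              // skip the header row (assumption)
while (($row = fgetcsv($fp)) !== false) {
    $stmt->execute($row);                  // one INSERT per CSV line
}
fclose($fp);

I understand MySQL's LOAD DATA INFILE could do the same in a single statement, which may be faster for bulk loads.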
Any help/thoughts would be greatly appreciated.
Hope someone can help me. I made a Unity3D game for the web player. Now I need to store some data, e.g. how long someone has played the game, in a CSV file.
Does anyone know how I can test whether my PHP and JavaScript for writing to this CSV file work?
In my JavaScript I have this line:
var postDataURL = "myurl/data.php?";
But I don't have a URL yet, so I need to test this locally, and I don't know what to put in for the URL. I always get this error:
Could not resolve host: C; No data record of requested type
thanks
You will need to set up a web server to run locally. At that point you can target your form submission at http://localhost/data.php and things will work perfectly. Apache is really easy to set up, but you might want to look into something like WAMP (Windows, Apache, MySQL, PHP):
http://www.wampserver.com/en/
First of all, in order to access your PHP script to either send or retrieve information, you need to use the WWW class with your URL.
Next, is there a reason you want to write it to a CSV file? If you're just trying to store information that persists across all sessions, you can write data to PlayerPrefs:
PlayerPrefs.SetString("KeyName", Value);
Retrieve the data at any time using
PlayerPrefs.GetString("KeyName")
This is good for keeping high scores and such
Finally, to directly answer your question: my favorite is installing Apache, because it sets up in minutes without having to know what you're doing:
http://httpd.apache.org/
After installing, find your web root and place the file in there. You can then access the file at http://localhost/yourPHPfile.php.
Make sure you install PHP on your machine as well.
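For the data.php itself, a minimal sketch that appends the reported value to a CSV might look like this (the parameter name 'playtime' and the file location are assumptions):

<?php
// Hypothetical data.php: append the reported play time to a CSV file.
// The parameter name 'playtime' and the file location are assumptions.
$playtime = isset($_REQUEST['playtime']) ? $_REQUEST['playtime'] : '';
if ($playtime !== '') {
    $fp = fopen(__DIR__ . '/times.csv', 'a');
    fputcsv($fp, array(date('c'), $playtime));  // timestamp + play time
    fclose($fp);
}

Once the local server is running, you can exercise it directly with http://localhost/data.php?playtime=120.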
I have an iOS app that allows users to update the cover charges at local bars. The data is then displayed in the app for other users to see. The updates are made by sending a request to a PHP script, which then updates an XML file. What will happen if a user tries to read the XML while another user is updating it, i.e. while the file is being rewritten with a new update?
Thanks!
Depending on timing, the user may get the old version, the new version, or even a partially written file. I agree with AMayer: once the file gets big, it's going to be hard on your server to download and upload the ENTIRE XML file again and again! I would just set up a MySQL database now and use it instead of the XML.
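If you do keep the XML for now, one way to make sure readers never see a half-written file is to write to a temporary file and rename it into place, as in this sketch (file names and content are placeholders):

<?php
// Sketch: write the updated XML to a temporary file, then rename it over the
// old one. rename() is atomic on the same filesystem, so a reader gets either
// the old document or the new one, never a partial write.
$xml = '<covers><bar name="Example" charge="10"/></covers>';  // placeholder content
$tmp = 'covers.xml.tmp';
file_put_contents($tmp, $xml, LOCK_EX);   // exclusive lock while writing
rename($tmp, 'covers.xml');               // swap in the complete file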
Hey folks, this question can't be too complicated. Please provide a solution, or at least a way to figure out the ultimate root cause of the problem.
I'm currently writing an application which controls Excel through COM: the app creates a COM-based Excel instance, opens some XLS files and reads their contents.
Scenario I
On Windows 7, I start Apache and MySQL with the XAMPP control panel, with system administrator rights. All works as expected. The PHP-based controller script interacts with Excel as expected.
Scenario II
A problem appears if I start Apache and MySQL as 'background jobs'. Here is how:
I created two jobs using the Windows 7 Task Scheduler. One runs apache_start.bat, the other runs mysql_start.bat.
Both tasks run as SYSTEM with elevated privileges when Windows 7 boots.
Apache and MySQL work as expected. Specifically, Apache serves HTTP requests from clients and PHP is able to talk to MySQL.
However, when I call the PHP controller, which calls and interacts with Excel via COM, I receive an error.
The error message comes from Excel [not COM itself] and reads like this:
Excel can't read the specified Excel file
Excel failed to save the file due to an ill-named worksheet
Interestingly, during the first run of the PHP-based controller script it takes a few seconds to render the error message. Each subsequent run renders the error message immediately.
Windows system logs didn't show a single problem report entry.
Note that the PHP program and the Apache instance didn't change - except for the way Apache was started.
The PHP controller script is at least perfectly able to read the file system, since it obtains the paths to the XLS files through scandir() of a certain directory.
Concurrency issues can't be the cause of the problem. A single instance of the specific PHP controller interacts with Excel.
Question
Could someone explain why this happens? Or provide ways to isolate the ultimate cause of the problem (e.g. by means of a PowerShell 2 script)?
UPDATE-1 :: 2011-11-29
As proposed, I switched the Task Scheduler jobs from SYSTEM to a conventional user. That works: Apache and MySQL get started and process requests.
Unfortunately, the situation regarding Excel didn't change a bit. I still see the error.
As assumed earlier, the Excel COM server starts. I'm able to change various settings (e.g. suppress dialogs) through the COM instance without a problem.
The problem happens while calling this:
$excelComObject->Workbooks->Open( 'PathToXLSFile' );
UPDATE-2 :: 2011-11-30
Added the accounts USER, GUEST and EVERYONE with read permission to the access control list of the XLS file. No change.
Modified the app so that the PHP part copies the XLS file to a temporary file and works on that instead, just to ensure the problem isn't caused by odd file or path names.
Still, the problem persists.
UPDATE-3 :: 2011-12-05
I'm going to drive the Excel COM server methods in such a way that Excel creates a blank file and saves it to /tmp. Let's see whether Excel can't even read that file.
Go into the Task Scheduler and let everything run as a local user. This will probably require that you enter a password, so create one if you don't have one already.
Excel is a user-level application that shouldn't run as SYSTEM. I'm sure there are ways around it, but you should simply let everything run at the correct level.
Having Apache run on the user level isn't a problem.
Try creating the following directories:
C:\Windows\SysWOW64\Config\Systemprofile\Desktop
C:\Windows\System32\Config\Systemprofile\Desktop
It worked for me :-)
http://social.msdn.microsoft.com/Forums/en-US/innovateonoffice/thread/b81a3c4e-62db-488b-af06-44421818ef91
In the past (read: pre-Vista), services had an option called "Allow service to interact with desktop" which allowed services to spawn windows etc. Starting with Vista, this is no longer allowed.
I suspect Excel is failing because it can't function under this restriction. Thus, any attempt to run it as a service in your Win7 installation will fail.
You could go back to Windows XP and allow desktop interaction for your Apache process, which I don't really recommend for obvious reasons.
Another approach I would take is to create a PHP script that runs as a regular process and listens on a socket in an infinite loop. Your PHP script that runs under Apache would communicate with the secondary script through the local socket and have the secondary script spawn Excel.
This may sound complicated, but in fact it's not a lot of code, and it fixes a problem you will soon have anyway: you should only have one instance of Excel running, or you may run into problems. The secondary script could queue requests, handing them off to Excel one by one and then taking the next in the queue.
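A rough sketch of that secondary script, assuming a line-based protocol where each request is simply the path of an XLS file (the port and the protocol are placeholders):

<?php
// Sketch of the secondary script: a long-running PHP process that owns the
// single Excel instance and serves requests from the Apache-side script over
// a local socket. Port and one-path-per-line protocol are assumptions.
$server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
$excel  = new COM('Excel.Application');      // one Excel instance for everything

while ($conn = stream_socket_accept($server, -1)) {
    $path = trim(fgets($conn));              // request: the XLS path to open
    $workbook = $excel->Workbooks->Open($path);
    // ... extract whatever the caller needs here ...
    $workbook->Close(false);                 // close without saving
    fwrite($conn, "done\n");
    fclose($conn);
}

Because only this one process ever talks to Excel, requests are serialized automatically.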