I'm working on integrating a Magento store with an existing desktop Point of Sale application. My idea is that this desktop program would connect through Magento's REST API to gather the product list, inventory changes, etc., and would also commit new products and other updates through the same API.
The problem is that I don't want the person operating the PoS to know the API credentials, and I don't want to bother prompting for them either. Ideally they would be set up in a config file.
I thought about loading the API authorization page in the background and automatically posting the credentials to the login form, but that looks like a nasty approach.
Any ideas?
Not a solution but some words of experience in this matter...
Magento's API can be slow, and the user will wait forever for a task to finish, especially if the server is under load. We use a separate application, which I built in Java, that uses Magento's SOAP API to handle all updates/downloads between Magento and our POS. This way the user is never waiting on slow responses or blocked by a loss of connectivity.
We have adopted your queue approach, and another reason for having one application with a queue is that it handles all updates from all users and only allows one task to execute at a time. You need to do this to avoid database locks: if two users modify the same product simultaneously, you get a table lock error and the update fails. You can also overload the server by flooding it with lots of single-user requests. We still have event-driven processing, as opposed to scheduled sync scripts, by having our POS send messages to our local app describing the task; the app simply queues the task for processing. Our application has no user interface whatsoever; I run it as a system service on our server, with the API credentials stored in a config file.
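For illustration, here is a rough PHP sketch of that pattern: a headless worker that reads API credentials from a config file the POS user never sees, and drains a queue one task at a time (the queue helper functions are hypothetical; our actual app is Java, but the shape is the same):

```php
<?php
// Headless queue worker: credentials live in a config file, and only
// one task executes at a time to avoid database locks on the server.
$config = parse_ini_file('/etc/pos-sync/sync.ini'); // api_user, api_key, wsdl_url

// Magento 1 SOAP API v1 client.
$client  = new SoapClient($config['wsdl_url']);
$session = $client->login($config['api_user'], $config['api_key']);

while (true) {
    $task = dequeue_next_task(); // hypothetical: pull one message off the local queue
    if ($task === null) {
        sleep(5); // idle until the POS sends more work
        continue;
    }
    try {
        // Serial execution: two users editing the same product can't
        // collide and trigger a table lock error.
        $client->call($session, $task['resource'], $task['args']);
        mark_task_done($task);   // hypothetical
    } catch (Exception $e) {
        requeue_task($task);     // hypothetical: retry later, e.g. after lost connectivity
    }
}
```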
I have one application in Laravel which has an admin panel and a user panel.
What I need is that whenever a record is approved from the admin panel, the corresponding row on the user panel should change its color or show a notification automatically once the change has occurred.
I don't want to use AJAX calls polling the page continuously.
WebSockets are used to implement realtime, live-updating user interfaces. When some data is updated on the server, a message is typically sent over a WebSocket connection to be handled by the client. This provides a more robust, efficient alternative to continually polling your application for changes.
To assist you in building these types of applications, Laravel makes it easy to "broadcast" your events over a WebSocket connection. Broadcasting your Laravel events allows you to share the same event names between your server-side code and your client-side JavaScript application.
Check the Laravel docs on event broadcasting.
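For example, a minimal broadcastable event might look like this (names are illustrative; a broadcast driver such as Pusher or Redis still needs to be configured per the docs):

```php
<?php
// Illustrative sketch: fire this event when the admin approves a record,
// and the user panel receives it over a WebSocket without polling.

namespace App\Events;

use Illuminate\Broadcasting\Channel;
use Illuminate\Broadcasting\InteractsWithSockets;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;
use Illuminate\Foundation\Events\Dispatchable;

class RecordApproved implements ShouldBroadcast
{
    use Dispatchable, InteractsWithSockets;

    public $recordId; // the row the user panel should highlight

    public function __construct($recordId)
    {
        $this->recordId = $recordId;
    }

    public function broadcastOn()
    {
        return new Channel('records'); // use PrivateChannel for per-user panels
    }
}

// In the admin controller, after approval:
//     event(new RecordApproved($record->id));
//
// On the client, Laravel Echo can listen and update the row:
//     Echo.channel('records')
//         .listen('RecordApproved', e => highlightRow(e.recordId)); // highlightRow is yours to write
```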
Here's my scenario: I have 10 iOS apps with subscription in-app purchases, and I need one subscription purchase to be valid across all 10 apps, which requires server-side receipt validation. The flow is like this: when the customer pays for the subscription, the receipt is sent to the Firebase DB, and from there I need a PHP script that takes the receipt data as input and sends a POST request to the App Store. The App Store then validates the receipt and returns a JSON object, and we overwrite the old receipt with the latest copy. Whenever the user logs in to any of the apps, we repeat this process and update the receipt to make sure the user's subscription is still valid. My question is: is Firebase capable of dynamic script handling and HTTP requests?
Thanks for any help. :)
Firebase Hosting cannot execute PHP scripts. It is mostly a static hosting service, serving uninterpreted HTML, JavaScript, CSS, etc.
Recently Firebase added the ability to connect Cloud Functions to Firebase Hosting. But that still doesn't allow you to run PHP code on Firebase Hosting.
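For what it's worth, the App Store call itself is just an HTTPS POST that any PHP-capable host could run (or that you could port to a Node.js Cloud Function). A minimal sketch, assuming you have the base64 receipt and your app's shared secret:

```php
<?php
// Minimal sketch of server-side receipt validation against the App Store.
// Use https://sandbox.itunes.apple.com/verifyReceipt while testing.
function verifyReceipt($receiptData, $sharedSecret)
{
    $payload = json_encode([
        'receipt-data' => $receiptData,  // base64-encoded receipt from the app
        'password'     => $sharedSecret, // app-specific shared secret for subscriptions
    ]);

    $ch = curl_init('https://buy.itunes.apple.com/verifyReceipt');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $payload,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    ]);
    $response = curl_exec($ch);
    curl_close($ch);

    return json_decode($response, true); // 'status' === 0 means the receipt is valid
}
```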
Very shortly I will be required to do an integration between QuickBooks Desktop and a PHP website. I'm aware that there exists a PHP QuickBooks class that helps with integrating, but to my knowledge that only works when the PHP site is the one to initiate contact with the Desktop application. It's required of me that when a Purchase Order and/or Product is created on QuickBooks, it will automatically (and instantaneously) send the information over to my website using a REST API. Considering there will be multiple instances of QuickBooks Desktop that will be connected (we will allow customers to use a QuickBooks application that we will build), it is not practical to have to constantly check if ALL of those QuickBooks Desktop instances have any new Purchase Orders or Products that have been created since the last time we checked.
Is there a way to somehow add code to QuickBooks to send Purchase Orders and Products (upon creation) to my website using a REST API?
Thank you
QuickBooks itself doesn't really have any reliable method of catching events like you're talking about, and also doesn't have any way to then relay those events to an external REST API. So, you're not going to find exactly what you're looking for - it isn't possible.
With that said, you CAN get close by having an external application that polls QuickBooks periodically (as often as every few seconds) to grab new data from it, and then relays that data up to your REST API.
The easiest way to do this is via the Web Connector. It can poll as frequently as 1 minute, and is very capable of doing exactly what you're talking about. If you want to go with the Web Connector, your best bet is probably this open-source QuickBooks PHP DevKit (disclaimer: I'm the author). You could start with the Web Connector quick-start guide.
The harder, but more flexible, way is to write a custom QuickBooks SDK application that sits alongside QuickBooks, polls it periodically, and relays that data up to your app. If you want to do this, you should check out the QuickBooks SDK - it has some C# and VB.NET examples in it which should prove useful.
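To give you a feel for the Web Connector route, the DevKit pattern is a pair of handler functions: a request function that returns qbXML asking QuickBooks for new data, and a response function that receives the reply, which is where you'd relay it to your REST API. A loose sketch (handler names, the endpoint URL, and the server registration are illustrative; see the quick-start guide for the real wiring):

```php
<?php
// Loose sketch of Web Connector handlers (registration omitted; see the
// QuickBooks PHP DevKit quick-start for the full server setup).

// Request handler: ask QuickBooks for recently modified purchase orders.
function po_request($requestID, $user, $action, $ID, $extra, &$err,
    $last_action_time, $last_actionident_time, $version, $locale)
{
    // In practice you'd track the last successful poll instead of a fixed window.
    $since = date('Y-m-d\TH:i:s', strtotime('-5 minutes'));

    return '<?xml version="1.0" encoding="utf-8"?>
        <?qbxml version="13.0"?>
        <QBXML>
          <QBXMLMsgsRq onError="stopOnError">
            <PurchaseOrderQueryRq>
              <ModifiedDateRangeFilter>
                <FromModifiedDate>' . $since . '</FromModifiedDate>
              </ModifiedDateRangeFilter>
              <IncludeLineItems>true</IncludeLineItems>
            </PurchaseOrderQueryRq>
          </QBXMLMsgsRq>
        </QBXML>';
}

// Response handler: relay the returned qbXML up to your REST API.
function po_response($requestID, $user, $action, $ID, $extra, &$err,
    $last_action_time, $last_actionident_time, $xml, $idents)
{
    $ch = curl_init('https://example.com/api/purchase-orders'); // your endpoint
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $xml, // or transform to JSON first
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    curl_close($ch);
}
```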
Some specific notes:
but to my knowledge that only works when the PHP site is the one to initiate contact with the Desktop application.
Actually no - it only works when the Web Connector (which runs alongside QuickBooks) initiates the communication. But you can set it to run every 1 minute, which makes it pretty much constantly run and push data up to your app.
it will automatically
This is easily do-able with either the Web Connector or a custom SDK app.
(and instantaneously)
This isn't do-able. QuickBooks simply isn't fast enough to relay data instantaneously; you will never get instant data transfers from QuickBooks, so just forget about that now. (This is especially true when you realize that there are lots of things that can completely lock integrated applications out of even connecting to the data file: single-user mode, QuickBooks automatic updates, QuickBooks not running at all, too many users in QuickBooks, etc.)
send the information over to my website using a REST API.
If using the Web Connector, your website receives the data, and you can then transform it and send it to your REST API.
If using a custom SDK app, you can write custom code to do that no problem.
it is not practical to have to constantly check if ALL of those QuickBooks Desktop instances have any new Purchase Orders or Products that have been created since the last time we checked.
Are you sure? We do this every day for thousands and thousands of people on ridiculously under-powered hardware.
I am writing the spec for a complex business solution; it is basically a set of web apps that each run on their own server. I want them to be independent so that if one has a problem or becomes very busy, the rest are not affected.
There will be a central server that will act as the payment gateway for the apps as well as providing data to the apps themselves. The data is minimal; user ids, have they paid for that app etc.
The idea was that when an app was purchased then we'd just pass that data to the app in question.
The question is how to do this without holding up the user's experience while we wait for the app server to respond. The idea was to enter each purchase into a queue and process entries one by one via a cron job. However, there are concerns that this will not be fast enough and the user could have to wait before accessing the app.
The other idea is that the app just contacts the main server when the user tries to use it. The main server can then approve the user and this will be kept on the app server DB so it doesn't have to check again.
What do you all think about these ideas? Is there an obviously best way of doing it?
The system should be able to scale to 100+ apps and tens of thousands of app purchases an hour.
Very interested to see what you all think! Many thanks
I have a similar but slightly different situation here, supporting a potential competitor... have I gone mad?? haha
To the topic: we generally use cURL to connect the server requests, especially when we don't want information to be public. We have a specific VPS set up for payment handling, account functions, and financial functions; it posts to a centralized MySQL database used for access information only, so it supports a single sign-on across multiple apps on multiple server clusters.
To ensure the user is immediately moved to the app they want and that it works correctly, we use cURL to post the initial data, creating the default records in the specific app's database; we then use a PHP header redirect to move the user to the requested app, with the single sign-on already in place thanks to the cURL post performed earlier.
An access key is important to us as it keeps the single sign-on secure. It is generated once per account and never updated, though we can regenerate it if there is ever a security violation. We then use cURL in the user auth process to verify the user is still signed in using their key and user id. The key is never actually passed publicly; it is always posted server side via cURL, keeping it hidden in the PHP.
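For illustration, the auth check boils down to something like this (the URL and field names are made up; the key stays server side):

```php
<?php
// Sketch of the server-side single sign-on check described above.
// The access key never reaches the browser; it is posted server to server.
function verifyUserSession($userId, $accessKey)
{
    $ch = curl_init('https://auth.example.com/api/verify'); // central auth VPS (placeholder URL)
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query([
            'user_id'    => $userId,
            'access_key' => $accessKey, // generated once per account, stored server side
        ]),
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $response = curl_exec($ch);
    curl_close($ch);

    return $response === 'OK';
}

// After a successful check, hand the user off to the requested app:
//     header('Location: https://app1.example.com/dashboard');
//     exit;
```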
I hope this helps.
A friend and I would like to create a website to manipulate Facebook data.
The structure is:
a PHP web role (contains the web page, user OAuth login, interacts with queues, and interacts with SQL Azure database)
an F# worker role (does statistics and quite heavy data extractions)
The process is (assuming a new user):
user arrives on the web page and logs onto Facebook via OAuth; the PHP web role then posts a message to a worktodo queue with the login info and token.
F# worker role reads the message off the worktodo queue and starts doing data crunching (using the Facebook API) and stats, then it writes the results to a SQL Azure database. Finally it posts a message to the workdone queue stating it has succeeded in doing the data processing for the user.
The PHP web role then reads the workdone queue, notices the work is done, and displays the algorithm results (a queue sketch follows these steps).
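For the queue handoff in steps 1 and 3, the classic Azure SDK for PHP gives the web role a queue REST proxy; a rough sketch (queue names follow your description, the connection string is a placeholder):

```php
<?php
// Rough sketch of the PHP web role's queue handoff (classic Azure SDK for PHP).
require_once 'vendor/autoload.php';

use WindowsAzure\Common\ServicesBuilder;

$connectionString = getenv('AZURE_STORAGE_CONNECTION_STRING'); // placeholder
$queue = ServicesBuilder::getInstance()->createQueueService($connectionString);

// Step 1: after the OAuth login, enqueue the crunching job for the F# role.
$queue->createMessage('worktodo', json_encode([
    'fb_user_id' => $fbUserId,    // from the OAuth response
    'fb_token'   => $accessToken, // Facebook access token
]));

// Step 3: check the workdone queue to see whether the F# role has finished.
$result = $queue->listMessages('workdone');
foreach ($result->getQueueMessages() as $message) {
    $done = json_decode($message->getMessageText(), true);
    // ...display results for $done['fb_user_id'], then remove the message:
    $queue->deleteMessage('workdone', $message->getMessageId(), $message->getPopReceipt());
}
```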
I have two questions:
Is there a big flaw in this design?
What is the best way to collaborate: one person will write the PHP and another the F#, is there a way to use development storage from two different machines?
Thanks a lot! (Apologies if some find this stuff too basic, I am very much a beginner in all these matters).
If you wanted to follow a bit more experimental path, you could also try looking at Phalanger. This is a project that compiles PHP code to .NET, so it may be possible to run it directly on Azure and nicely collaborate with F# (Phalanger has a few language extensions that allow you to call any .NET objects and some API for calling Phalanger objects from C#).
I was involved in the project some time ago, but it is now being developed by other people (and as you can see from the check-ins, it is quite active again, and they would surely be interested in collaborating to resolve possible Azure issues). If you are interested, let me know - I can give you some contacts so that you can discuss Phalanger's status on Azure with them.
I don't see anything wrong with this plan.
I don't think there's a way to have two machines pointing at the same development storage, but you can just use cloud storage (even when running locally). I do that all of the time; you will pay for bandwidth and storage transactions, but for most apps in testing, this cost is trivial.
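Concretely, pointing both machines at the same account is just a connection-string change (account name and key below are placeholders):

```php
<?php
// Development storage: local emulator only, so a single machine.
$local = 'UseDevelopmentStorage=true';

// Shared cloud storage: both developers use the same real account.
$shared = 'DefaultEndpointsProtocol=https;'
        . 'AccountName=myteamstorage;'   // placeholder
        . 'AccountKey=BASE64KEYHERE';    // placeholder
```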