Suppose you're developing a small, independent sub-page for a big, well-frequented web portal.
The sub-page shows entries from a public event calendar and allows users to highlight the ones that especially interest them. The picked events should stay highlighted (and maybe be shown in a separate list) on each future visit by that user.
However, building a classical user registration system, or any other way of storing the user's event picks on the server, is not an option: the sub-module needs to be as self-contained as possible and require as little maintenance as possible. That is one of the conditions of the project.
The only way to do this without building a login system of some sort (as far as I can see) is to use cookies or some other local storage (Flash / HTML5...), which has the obvious and big downside that it's tied to the computer, not the user.
Is there a way of storing a few kilobytes of data on a per-person basis, without having to utilize a login or OpenID, that I am overlooking? A reliable web service, perhaps?
A "key/value" storage service, to which I pass a unique key (one that the user specified) and get the saved value in return, would be sufficient. There is no need for real security - the data in question is by no means confidential.
OpenID is not an option: It is not well known enough among the audience of the site.
Facebook would be an option, but I don't think they provide "storage" options like this.
As a workaround, I am contemplating offering the user their event picks as a text file download that can then be uploaded and turned back into cookies on another machine. But that is pretty cumbersome for the user, and thus not ideal.
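A minimal sketch of that workaround, assuming the picks live in a single cookie called event_picks (the cookie name, file name and format are placeholders):

<?php
// export.php (sketch): offer the current picks cookie as a text-file download
$picks = isset($_COOKIE['event_picks']) ? $_COOKIE['event_picks'] : '';
header('Content-Type: text/plain');
header('Content-Disposition: attachment; filename="event-picks.txt"');
echo $picks;

<?php
// import.php (sketch): turn an uploaded picks file back into the cookie
if (!empty($_FILES['picks']['tmp_name'])) {
    $picks = trim(file_get_contents($_FILES['picks']['tmp_name']));
    setcookie('event_picks', $picks, time() + 365 * 24 * 3600, '/'); // keep for a year
}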
We have a similar system on our site, where users can bookmark pages to a planner/wishlist function. The saved items are sent via a webservice and stored on our server, and there is a corresponding get webservice.
We have a 'lazy register' system. The first time a user saves an item, they are asked for their email (but no password, as nothing is confidential). This is hashed and saved locally using a cookie, then used to set/get the saved items. When the user uses a different computer they are again asked for their email.
The key is that registering and logging in are the same operation, so there is no need for password reminders or any reset functionality.
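A minimal sketch of that lazy-register flow in PHP, assuming a key_value table with user_hash and picks columns (the table, column and cookie names are placeholders):

<?php
// Identify the user by a hash of their email address, stored in a cookie.
if (isset($_POST['email'])) {
    $userHash = sha1(strtolower(trim($_POST['email'])));
    setcookie('user_hash', $userHash, time() + 365 * 24 * 3600, '/');
} elseif (isset($_COOKIE['user_hash'])) {
    $userHash = $_COOKIE['user_hash'];
} else {
    $userHash = null; // first visit on this machine: ask for the email
}

// Set/get the saved items keyed by that hash (PDO and MySQL assumed;
// user_hash is the table's primary key so REPLACE overwrites the old row).
function save_items(PDO $db, $userHash, $items) {
    $stmt = $db->prepare('REPLACE INTO key_value (user_hash, picks) VALUES (?, ?)');
    $stmt->execute(array($userHash, $items));
}

function load_items(PDO $db, $userHash) {
    $stmt = $db->prepare('SELECT picks FROM key_value WHERE user_hash = ?');
    $stmt->execute(array($userHash));
    return $stmt->fetchColumn();
}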
The Google Docs API provides programmatic access to Google Docs, where you can create and store documents and spreadsheets. Your application could have its own Google login, which it uses to create one or more documents per user. These documents could be used to store the user settings.
Provided you can get a unique ID from each user (an email address, or something more secure, perhaps), this should be fairly simple. You can even organize the files into folders—one per user.
Alternatively, you could combine Google Docs with the Google Spreadsheets API, where I have just noticed this rather handy feature:
Tables & Records: Interact with spreadsheets as if they're a database, using Tables and Records.
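As a rough sketch of how that could look with the (then-current) Zend_Gdata client library, treating one spreadsheet as a key/value store; the account, spreadsheet key, worksheet ID, column names and query syntax below are all assumptions, so check them against the Spreadsheets API docs:

<?php
require_once 'Zend/Loader.php';
Zend_Loader::loadClass('Zend_Gdata_Spreadsheets');
Zend_Loader::loadClass('Zend_Gdata_ClientLogin');

// The application logs in with its own Google account.
$client  = Zend_Gdata_ClientLogin::getHttpClient(
    'your-app-account@gmail.com', 'app-password',
    Zend_Gdata_Spreadsheets::AUTH_SERVICE_NAME);
$service = new Zend_Gdata_Spreadsheets($client);

$spreadsheetKey = 'SPREADSHEET_KEY'; // the spreadsheet used as storage
$worksheetId    = 'od6';             // usually the first worksheet

// Save: one row per user, with "userkey" and "settings" columns.
$service->insertRow(
    array('userkey' => 'user@example.com', 'settings' => '{"picks":[1,5,9]}'),
    $spreadsheetKey, $worksheetId);

// Load: query the list feed for that user's row.
$query = new Zend_Gdata_Spreadsheets_ListQuery();
$query->setSpreadsheetKey($spreadsheetKey);
$query->setWorksheetId($worksheetId);
$query->setSpreadsheetQuery('userkey = "user@example.com"');
$feed = $service->getListFeed($query);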
I have a client who wants to take orders via an online form. The idea is that an order can be submitted and stored in a database by my application while simultaneously generating an invoice in QuickBooks on submission.
How do I do this in PHP when the person entering the order is not my client but a client of my client? It seems like QuickBooks uses OAuth tokens and a JavaScript library to generate them to connect a company to an app, but I'm simply writing a backend for one company and want that backend to create invoices when saving an order. How should I think about this?
I'm not interested in anyone having to hit a button that says "connect to quickbooks", especially not the person filling in the order, because again, that person is a customer and doesn't need to know about the internals of my client's invoicing system.
I just really want to use the Accounting API to generate invoices. Is there no way to simply link my backend to my one company directly in the QuickBooks SDK configuration and achieve this, or do they need to use a JavaScript library to get tokens? I'm unclear about what direction I should be going in and don't want to waste time with a client-side library if I don't need it for backend logic.
Here's some example code that does exactly what you're looking for:
https://github.com/consolibyte/quickbooks-php
Along with a quick-start guide:
http://www.consolibyte.com/docs/index.php/PHP_DevKit_for_QuickBooks_-_Intuit_Partner_Platform_Quick-Start
Also see notes about your comments below -- you're on the right track, you're just misunderstanding how OAuth works:
It seems like QuickBooks uses OAuth tokens and a JavaScript library to generate them to connect a company to an app, but I'm simply writing a backend for one company and want that backend to create invoices when saving an order.
Correct, Intuit uses OAuth, and a little Javascript thing to kick off the OAuth process.
I'm not interested in anyone having to hit a button that says "connect to quickbooks"
Someone needs to hit this button... BUT only ONE PERSON needs to hit the button ONCE, EVER, and then NEVER again.
The owner of the company (e.g. your boss) needs to click the button ONCE, which gives the OAuth creds (and the realm ID) to you. Once your boss has done this ONCE, then you have the creds to use forever, for all of the actual customers.
Your customers (e.g. the people actually checking out/placing orders) DO NOT click any buttons, nor do they see or have any idea at all that you're even using QuickBooks.
I just really want to use the Accounting API to generate invoices.
Cool, you can totally do that!
Is there no way to simply link my backend to my one company directly in the QuickBooks SDK configuration and achieve this, or do they need to use a JavaScript library to get tokens?
Follow the quick-start above. It should take you about 15 minutes to get a working OAuth connection, and then you never need to use the client-side stuff ever again.
You only need to authenticate every 180 days btw.
If you use the reconnect script, you only need to authenticate ONCE, and can automatically renew the tokens every 180 days, no user-interaction required.
https://github.com/consolibyte/quickbooks-php/blob/master/docs/partner_platform/example_app_ipp_v3/reconnect.php
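For reference, once the one-time OAuth connection is done, adding an invoice from your backend looks roughly like the sketch below. It's modeled on the library's IPP v3 examples, but treat the config include, class names and the customer/item IDs as assumptions and check the bundled invoice example in the repo:

<?php
// Sketch: create an invoice for an order (consolibyte quickbooks-php, IPP v3).
require_once dirname(__FILE__) . '/config.php'; // example-style config that sets up $Context and $realm

$InvoiceService = new QuickBooks_IPP_Service_Invoice();

$Invoice = new QuickBooks_IPP_Object_Invoice();

$Line = new QuickBooks_IPP_Object_Line();
$Line->setDetailType('SalesItemLineDetail');
$Line->setAmount(100.00);
$Line->setDescription('Order #1234 from the website');

$SalesItemLineDetail = new QuickBooks_IPP_Object_SalesItemLineDetail();
$SalesItemLineDetail->setItemRef('8');   // QuickBooks item ID (placeholder)
$SalesItemLineDetail->setUnitPrice(100.00);
$SalesItemLineDetail->setQty(1);
$Line->addSalesItemLineDetail($SalesItemLineDetail);

$Invoice->addLine($Line);
$Invoice->setCustomerRef('67');          // QuickBooks customer ID (placeholder)

$resp = $InvoiceService->add($Context, $realm, $Invoice);
print('Created invoice ' . $resp);       // the new invoice's ID on success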
Well, with the realm_id for example, I don't understand how that relates to OAuth.
The realm ID is just a unique identifier for the particular QuickBooks Online company you're trying to connect to. Yes, you need to store it. If you use our libs, this is done for you automatically.
I guess I don't understand: if I'm developing for one client, why can't I just get their realm_id from them and then keep using it, rather than making them do some form of authentication?
Again, they only have to authenticate ONCE. That's Intuit's way of giving you the realm ID and credentials you need to connect. Once you've done it ONCE, you never need to do it again. It takes all of about 30 seconds.
If they were to just give you OAuth creds without any authentication, it would be a gigantic security hole. If you read the Wikipedia article on OAuth it talks in depth about this, and the goals of OAuth.
Okay I think I get it, so they have to authenticate once every 180 days?
Once every 180 days, UNLESS you use a reconnect script, in which case they just authenticate once and then never ever have to worry about it again.
So I can store the token and the realm_id in a database before it expires and just use that?
Yes.
In this way my client can authenticate and then my scripts can generate invoices for them when their customers visit our website?
Yes!
I've built two separate browser games and I want them to share the same login system, so that you have one account which you use for both games. I want this for payment reasons (if players buy something I have to pay fees per website, so I want all payments to come from one website) and for user experience (why sign up multiple times?).
But when they first sign up and log in, I do not want to redirect them to the centralized system, because I want the sign-up/login process to be as simple as possible; I'm afraid they'll lose interest otherwise.
So I was thinking about letting them sign in on the game's website normally and then using cURL to send the requests to the centralized website through an API I would build.
I am, however, wondering if this is a good approach, because none of the websites are on the same network. I guess I'll have to build the centralized login system in such a way that it only accepts requests from the domains the games are on. But are there any other things I have to worry about? How should I send passwords, for example? Sending them in plain text seems like a horrible idea, so I guess I'll have to hash them on the game's side. I also guess I'll have to keep a small copy of the users table for, for example, the username and user_id.
I don't want to use OpenID or Facebook Connect or something similar, because that means redirecting to an external system (which is bad for user experience), and the average age of players on the website is about fourteen, so they have no clue about OpenID, and I don't want to force them to connect their Facebook accounts to my websites.
Also, I'm sorry if I'm unclear or writing in bad English; I'm not a native speaker and I'm having a hard time expressing myself the right way.
Thanks in advance!
If I were you, I'd create an API for the registration system.
Website A (game 1) uses that API to create accounts and verify them, and so does website B.
The API would validate the data and respond with errors (username taken, etc.) or success messages. It should be protected (LAN access only, or some form of authentication).
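A minimal sketch of the game-side call to such an API, assuming a shared secret and an endpoint at accounts.example.com (the URL, field names and hashing scheme are placeholders):

<?php
// Game side (sketch): forward a registration to the central login API via cURL.
function central_register($username, $password) {
    $payload = array(
        'username'      => $username,
        // Never send the plain password across; hash it on the game's side first.
        'password_hash' => sha1($password),
        'api_key'       => 'SHARED-SECRET-FOR-THIS-GAME', // per-game credential
    );
    $ch = curl_init('https://accounts.example.com/api/register.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($payload));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    // e.g. array('ok' => false, 'error' => 'username taken')
    return json_decode($response, true);
}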
There's a news portal, and it's huge (site A). During development it was also extended with some social networking features - not launched yet, but about to launch soon (site B).
Packed together it looks great, but further down the road there could be a lot of problems maintaining the database, servers and so on, so I've been asked to separate the two, which I did. Site A goes on its own domain, site B too, and the databases are separated.
Now I need to do the following: when a user logs into site B (the social site) and then visits site A (the portal), they should be instantly logged in there too (on site A).
Any ideas how to do this without duplicating entries across user tables?
EDIT:
Any other ideas besides auth services? OpenID would just give us one user with ten accounts, fakes and so on. How about cookies or multi-database queries?
EDIT 2:
Well, this is something hot... unless it's an April 1st joke, it's worth a try:
http://www.shawnhogan.com/2005/12/cross-database-join-with-mysql.html
You can use OpenID.
OpenID is an open standard that describes how users can be authenticated in a decentralized manner, eliminating the need for services to provide their own ad hoc systems and allowing users to consolidate their digital identities.
If you want to share some profile information (e.g. posts or photos) without giving access to your login and password you can use OAuth.
See if my answer here is of any use.
It's pretty simple, and only requires user information to be kept on one side. Where you need some information on the non-database side, you can just pass that information there using values in a query string, alongside the encrypted string.
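As a rough illustration of that idea (not necessarily exactly what the linked answer does), site B can redirect the user to site A with the user data plus an HMAC signature in the query string, and site A verifies the signature before starting its own session; the shared secret, URL and parameter names are placeholders:

<?php
// Site B (sketch): build a signed login URL for site A.
$secret = 'SHARED-SECRET';                       // known only to both sites
$data   = array('user_id' => 42, 'name' => 'alice', 'ts' => time());
$query  = http_build_query($data);
$sig    = hash_hmac('sha256', $query, $secret);
$url    = 'http://site-a.example.com/sso.php?' . $query . '&sig=' . $sig;

// Site A, sso.php (sketch): verify the signature and timestamp, then log the user in.
session_start();
$secret = 'SHARED-SECRET';
$sig    = $_GET['sig'];
unset($_GET['sig']);
$query  = http_build_query($_GET);
if (hash_hmac('sha256', $query, $secret) === $sig
    && abs(time() - (int) $_GET['ts']) < 300) {  // 5-minute window against replays
    $_SESSION['user_id'] = (int) $_GET['user_id'];
}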
Can you believe this?
$DB->query("SELECT * FROM table..."); // works of course...
$DB->query("SELECT * FROM another_database.table..."); // WORKS ALSO!
This guy is a miracle:
http://www.shawnhogan.com/2005/12/cross-database-join-with-mysql.html
I can't believe it, so simple in the end!
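So, assuming both databases live on the same MySQL server and the connecting user has privileges on both, site A can even join against site B's users table directly via a database-qualified name (the schema, table and column names below are made up):

<?php
// Sketch: join site A's content against site B's users table on the same MySQL server.
$sql = "SELECT a.article_id, a.title, u.username
        FROM   portal_db.articles AS a
        JOIN   social_db.users    AS u ON u.user_id = a.author_id";
$result = $DB->query($sql); // same $DB wrapper as in the snippet above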
This could be me making extrapolations from the documentation (which, given how it's organized, isn't too difficult to do), but did Facebook make some big changes to how application developers create and utilize test users?
To my recollection (and I last did this about a month ago), developers used to be able to go to a URL from within a logged-in account that would convert that user to a 'test user', who had no privileges in the public system but could serve as a test entity for publishing, getting permissions, etc. I had a test user set up via this method and had generated access tokens for him/her that were saved to the database.
Recently this all stopped working. My first instinct was to print out the properties of the Facebook object I had created, and where once there had been the JSON-decoded user data, there was now an exception from the cURL process that accessed the /me/ API endpoint.
Facebook GraphAPI – Uncaught EntCannotSeeExistenceException: The entity (class EntTestUser) backed by id 12345 cannot be seen by the current viewer 12345 (EntID: 12345)
Values obviously changed to protect the innocent. The user IDs were identical, however, which made it strange that the user possessing ID 12345 for some reason had no rights to see 12345.
I looked through the Facebook documentation and found what seems to be their new logic for making the application itself create temporary test users and generate login URLs for them, something a lot of people here are probably familiar with.
http://developers.facebook.com/blog/post/429
Has this system superseded the old one? It seems that acting on behalf of test user accounts generated via the old method is no longer permitted, since they are not made "by" the application. I wrote some quick logic to test this new process with fixed access tokens and it worked. I should also mention that all the application logic functions as it originally did, with no errors for real user accounts. Has anyone else experienced this with their Facebook API apps? Can we definitively say that old access tokens/user IDs/accounts made via the old method are now effectively useless?
Thanks as ever.
http://developers.facebook.com/blog/post/475
I hate to break form and answer my own question, but from the comments enough people seem to be having this problem.
Digging through the developer updates today I finally found the above post (having already written workarounds, of course). Quoted:
We have removed the ability to turn user accounts into test accounts as mentioned here to prevent unintentional conversions of real accounts into test accounts. The proper way to create test accounts is by using the accounts connection of the App Graph Object.
This tiny notification was mixed in with the updates for March. I probably should subscribe to their developer blog's RSS feed so these things don't take me by surprise again! This particular update became effective March 4th. In any event, it's good to have a conclusive answer straight from the source.
I am a web developer and I want to build a commercial website to sell a customer's products. Buying and selling activities are important, and I need to keep user activity information in order to keep the site secure.
I want to write a dynamic website. I want to track all user activity and then decide what activity information to save in a database. Some of the site's visitors are registered users and some are anonymous. For my registered users I want to save information such as IP address, username, page name, and date/time.
I want to know:
How do I save a user's IP address?
What more do I need to save?
Saving the details of each HTTP request to a database will work for a low-traffic website, but you will run into performance issues on a popular site, since writing to the database is a relatively slow operation.
Why not use the server's HTTP logs instead?
All HTTP web servers create plain-text log files which record the remote user's IP address, the requested URL, etc. You can create an activity report by writing your own script or by using a log-file reporting tool. AWStats ( http://awstats.sourceforge.net/ ) is one of the most popular open-source tools for this.
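If you do go the roll-your-own route, parsing the standard Apache "combined" log format only takes a few lines; the log path below is a placeholder:

<?php
// Sketch: count requests per client IP from an Apache combined-format access log.
$pattern = '/^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3})/';
$counts  = array();
foreach (file('/var/log/apache2/access.log') as $line) {   // placeholder path
    if (preg_match($pattern, $line, $m)) {
        list(, $ip, $method, $url, $status) = $m;
        $counts[$ip] = isset($counts[$ip]) ? $counts[$ip] + 1 : 1;
    }
}
arsort($counts);
print_r(array_slice($counts, 0, 10, true)); // top ten client IPs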
On the client side you can use Google Analytics to track user activity. It also provides a means to track custom events:
_gaq.push(['_trackEvent', 'login', 'user_login', "custom data"]);
More info at: http://code.google.com/apis/analytics/docs/tracking/asyncUsageGuide.html
This option only tracks users with JavaScript enabled, so it won't show bots, crawlers, or users with analytics-blocking add-ons installed.
I'm not sure I understand all of your question... but to address at least one aspect of it: if the user is behind a proxy, you have no way of determining what their real IP is. That's the whole point; the proxy is the one making the request on the user's behalf, so without asking the proxy itself you cannot determine it. As for what else you need to save, that depends entirely on what you want to do, and you haven't really explained why you are saving this data. If you can clarify that, perhaps we can help you a bit more in determining what data you should be saving.
Edit: To address your clarification, if you wanted to go all out you could log everything a person does: every link they click, every product they view, etc. I don't necessarily advocate that, as I find it a bit creepy, but there are definitely sites that do it. At the bare minimum, I would suggest logging which products people look at and which products they then buy, and I would log that information on a per-session basis: basically, which products do people look at and then end up buying on the same trip to your store? I wouldn't worry too much about the "real" IP address; most people won't be behind a proxy, and for those who are, you can't do anything about it anyway.
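A minimal sketch of that per-session logging, assuming a product_views table with session_id, user_id, product_id and viewed_at columns (all the names are placeholders):

<?php
// Sketch: record which products a visitor looks at during one session.
session_start();

function log_product_view(PDO $db, $productId, $userId = null) {
    $stmt = $db->prepare(
        'INSERT INTO product_views (session_id, user_id, product_id, viewed_at)
         VALUES (?, ?, ?, NOW())');
    $stmt->execute(array(session_id(), $userId, $productId));
}

// Purchases logged the same way (e.g. into a product_purchases table) can then be
// joined on session_id to see what was viewed and then bought on the same visit.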
How do I save a user's IP address?
$_SERVER['REMOTE_ADDR']
What more do I need to save?
That's quite a strange question. It's your application, not someone else's. How can we guess what information you need?
However, there is at least one issue I can point out: a page name is not sufficient to log "all user activity". The query string and POST data usually contain important details about that activity.
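For completeness, a minimal sketch of capturing those request details on each hit, assuming an activity_log table (the table and column names are placeholders):

<?php
// Sketch: log the remote IP, the requested page, the query string and the POST body.
function log_request(PDO $db, $username = null) {
    $stmt = $db->prepare(
        'INSERT INTO activity_log (ip, username, uri, query_string, post_data, created_at)
         VALUES (?, ?, ?, ?, ?, NOW())');
    $stmt->execute(array(
        $_SERVER['REMOTE_ADDR'],
        $username,
        $_SERVER['REQUEST_URI'],    // page, e.g. /products/view.php
        $_SERVER['QUERY_STRING'],   // e.g. id=42&ref=homepage
        json_encode($_POST),        // take care not to log passwords and the like
    ));
}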