I was looking at jsfiddle and shrib.com, and the concept of saving and sharing your notes/code without logging in or making an account fascinated me. I noticed they create a different URL for every new entry. So do they save the entry in a database, associated with the corresponding code in the URL, and send cookies to remember the computer or something (using PHP)? I looked at the source code, of course, but obviously the website wouldn't be just HTML. I just wish to understand the concept behind it. I'm not sure what I should Google, so I came here. My friend thinks there would be no cookies involved.
Thank you for your feedback in advance.
They generate a unique ID which is embedded in the URL that you share; this is associated with the data in some back end storage so that it can be displayed when someone visits the URL.
Because of the requirement to share between users you can't use local storage or cookies, as this would only allow the original user to see the content, not share it.
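A minimal sketch of that idea in PHP (this is an assumption about how such a site might work, not the actual jsfiddle/shrib implementation; the notes table, column names, and database credentials are made up for illustration):
<?php
// save.php - store a submitted note under a freshly generated ID
// Hypothetical table: notes(id VARCHAR(16) PRIMARY KEY, content TEXT)
$pdo = new PDO('mysql:host=localhost;dbname=notes', 'user', 'pass');

// Generate a short unique ID that will be embedded in the URL
$id = bin2hex(random_bytes(8));

// Associate the submitted content with that ID in back-end storage
$stmt = $pdo->prepare('INSERT INTO notes (id, content) VALUES (?, ?)');
$stmt->execute([$id, $_POST['content']]);

// The shareable URL embeds only the ID; no login or cookie is required
echo 'https://example.com/view.php?id=' . $id;

// view.php?id=... then looks the content up again for anyone who visits:
// $stmt = $pdo->prepare('SELECT content FROM notes WHERE id = ?');
// $stmt->execute([$_GET['id']]);
// echo htmlspecialchars($stmt->fetchColumn());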
They probably use a database of some sort to keep track of the URLs and their content. I doubt it involves much more than that: as soon as you visit the URL, the request is checked and the corresponding page is shown to the user.
Also, they probably keep track of the last visit/activity and destroy the input after a certain time period, although I'm not sure about this.
Drawbacks of using cookies:
- Limited to a small amount of data, roughly 4 KB.
- Included with every HTTP request, so the same data is sent again and again, which is enough to slow down your web application.
- Data travels unencrypted over the internet (unless your entire web application is served over SSL).
You can use HTML5 local storage to save data:
if(typeof(Storage) !== "undefined") {
// Code for localStorage/sessionStorage.
localStorage.setItem("lastname", "Smith");
} else {
// Sorry! No Web Storage support..
}
For more detail about HTML5 local storage, you can refer here.
Related
At the moment, I'm working on a website that could use some extra usability, so I want to launch a couple of modal windows to aid users on their first visit to a couple of pages.
I want to check if it is a user's first time viewing a specific page. I've read about how you can run into problems when using cookies to do this: they can be deleted, the user can use a different PC or device, etc.
Also, I want to check whether it's their first time viewing for multiple pages, not only directly after login.
I'm guessing a good idea for this would be to make a separate table with the pages I need in it and set a boolean flag for whether each one has been viewed.
Would this be the best way going about doing this?
There isn't a highly reliable way of doing that:
You can use cookies, but as you said, they are not reliable; a user can change PC, delete cookies, change browser, etc.
You can try using an IP address, but that's also not reliable. If a user switches addresses (which can happen today as you walk down the street with your mobile phone) he'll see the page over and over again. Moreover, if some other user happens to stumble upon the IP address the first user used, he won't see your tour/tutorial.
What I can suggest is that you use cookies to detect whether the user is new, but don't automatically throw the help modules at him; instead, prompt him using a non-obstructive toolbar at the top or bottom (never a popup window or lightbox).
That way, you get most of the users (because many people use the same browser and computer and rarely delete all their cookies), and even a user who has deleted his cookies still won't be disturbed that much.
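A rough PHP sketch of that cookie check; the cookie name, the page identifier, and the one-year lifetime are just assumptions for illustration:
<?php
// Per-page "first visit" check using a cookie (not 100% reliable, as noted above)
$page       = 'dashboard';                 // hypothetical page identifier
$cookieName = 'seen_tour_' . $page;

$isFirstVisit = !isset($_COOKIE[$cookieName]);

if ($isFirstVisit) {
    // Remember that the user has now seen this page (expires in about a year)
    setcookie($cookieName, '1', time() + 365 * 24 * 60 * 60, '/');
}
?>
<?php if ($isFirstVisit): ?>
  <!-- A non-obstructive toolbar instead of a modal or lightbox -->
  <div class="help-toolbar">
    New here? <a href="/tour/dashboard">Take a quick tour of this page</a>.
  </div>
<?php endif; ?>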
There is no reliable approach if the user is not registered and logged in with a username & password.
As mentioned before, there is no reliable way of detecting users (and detecting whether the user is visiting the site for the first time). I also recommend Madara Uchiha's approach; you could additionally use HTML5 local storage alongside cookies, though neither is 100% reliable.
You can, however, try user recognition without relying on cookies or HTML5 storage, but this is extremely complicated; you don't want to do this.
Just to satisfy your curiosity about how to do this, check this epic answer on a related question:
User recognition without cookies or local storage
I don't think there is a complete solution. In my view, a possible approach consists of several parameters that first have to be defined; by considering those, we can talk about what is and isn't possible.
My parameters are below:
features of a webpage used for "user detection", described in detail
reactions on a webpage (I mean whether the user is quick to click on any elements of a page or not)
inspecting elements
URL injection
other reactions, such as clicks on certain spots placed on the page
time spent on the page, up to a limit defined for checking authorization
and so on, with solutions like the ones above.
I do have Google Analytics on my site (tied to adwords), and have tried many other packages for tracking visitors, but none give me the detail I need. My goal is to keep the exact path visited, per IP.
All those tools work by adding a piece of code to the page, that goes to their server to save the page visit.
It occurred to me that I could save the IP and breadcrumbs in a SESSION array, and scrape the session files directory every few minutes to save the files before they expire.
Later I could decode those and do all kinds of data mining.
This way I don't have to modify the code, don't have to make each page visit go to an external site, and I believe this is less taxing than, let's say... a MySQL record write on every page visit.
Would this be a bad idea for any reason I can't imagine yet?
Thanks for any input.
It is possible to do nefarious things with $_SESSION; however, it is pretty safe. Your plan looks good.
On the other hand, users usually don't like it when you track them without their consent. Make sure you disclose the fact that you'll be tracking their every move.
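A minimal PHP sketch of the session-breadcrumb idea from the question (the file and key names are placeholders); the session files under session.save_path could then be collected before garbage collection removes them, as described above:
<?php
// track.php - include at the top of every page to record the visit path
session_start();

// Record the visitor's IP once per session
if (!isset($_SESSION['ip'])) {
    $_SESSION['ip']          = $_SERVER['REMOTE_ADDR'];
    $_SESSION['breadcrumbs'] = [];
}

// Append the current page and a timestamp to the breadcrumb trail
$_SESSION['breadcrumbs'][] = [
    'uri'  => $_SERVER['REQUEST_URI'],
    'time' => time(),
];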
What I would suggest is to use the Apache access logs. Add any other information you need beyond the defaults, and be sure to log the session ID cookie in the access logs. This will allow you to group the data when data mining.
I am working on a geolocation project that is browser-based and geared specifically towards mobile browsers. I won't go deep into details, but overall the project uses geolocation across many pages and pieces of the project. The problem is that despite always hitting "Accept" and "Allow" in any given browser, when the user reloads a page they are prompted almost every time to re-allow the location access.
Which to me is weird; I've gone to sites with geolocation that truly remember that I allowed it, even if only for the duration of my visit. So with that, I am wondering: is there some special thing I need to do to store this authorization for a prolonged period in my project?
Currently I am using JavaScript-based geolocation through the Google Maps API and newer browser support for non-IP-based location. My project, at the user's discretion, tracks where they are so other people connected to them can see where they are, where they have been, or where they plan to go (if they enter that). There's more as well, but that's the basics. From a UX perspective, it's annoying to have to prompt my users every time the browser reloads to re-authorize the location.
So any advice anyone is able to give, or insight on how to handle this would be appreciated.
Cookies and sessions.
Right after detecting the user's geolocation you should store it in a cookie.
On every request the server should check whether that cookie (with the user's location) is set and act upon it.
Over-simplified code:
// get the user's location (via JavaScript)
navigator.geolocation.getCurrentPosition(function (position) {
    // keep the LatLng around for the map
    var initialLocation = new google.maps.LatLng(position.coords.latitude, position.coords.longitude);

    // ajax the server with the user's location once we actually have it
    $.post('ajax/set_user_location.php', {
        lat: position.coords.latitude,
        lng: position.coords.longitude
    });
});
Then on the server you grab the initialLocation and save it in a cookie.
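A sketch of what ajax/set_user_location.php might look like on the PHP side, assuming the JavaScript above posts lat and lng fields (the 30-day lifetime is arbitrary):
<?php
// ajax/set_user_location.php - remember the reported location in a cookie
$lat = isset($_POST['lat']) ? (float) $_POST['lat'] : null;
$lng = isset($_POST['lng']) ? (float) $_POST['lng'] : null;

if ($lat !== null && $lng !== null) {
    // Keep the location for about 30 days so we can reuse it on later requests
    setcookie('user_location', $lat . ',' . $lng, time() + 30 * 24 * 60 * 60, '/');
}
On later requests the server can check $_COOKIE['user_location'] and skip asking for the location again.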
Two years ago I had to design a system to share authentication data across multiple domains, all of which shared the same server/DB. I was able to pull this off with a complex system of cookie sharing which, to date, still works.
I'm now in the process of redesigning the system and I was wondering if there are better ways to achieve this without having to write cross domain cookies.
Basically, the system MUST do this:
Once logged in on one site, the user must be logged in on all of the other sites seamlessly, not only by following a link, but even by directly typing the domain name in the address bar.
To my knowledge the only way to achieve this is cross-domain cookies; if there are alternatives, please tell me.
Thank you very much
My idea would be to include a login JavaScript file from a third domain which gets included in all sites. This JavaScript sets and reads the session cookie and calls the current domain's server via AJAX with the result. (No validation should be done in the JS; it simply sets and reads the cookie.)
If cross-domain AJAX does not work, you can still call the third domain's server, which acts like a proxy and calls the current domain's server.
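A sketch of what the endpoint on the shared third domain might look like in PHP; the auth.example.com idea, the allowed-origins list, and the CORS headers are assumptions for illustration, not a description of any particular site's setup:
<?php
// whoami.php on the shared login domain (e.g. auth.example.com)
session_start();

// Cross-domain AJAX needs CORS, so only answer origins we trust
$allowed = ['https://site-a.example', 'https://site-b.example'];
if (isset($_SERVER['HTTP_ORIGIN']) && in_array($_SERVER['HTTP_ORIGIN'], $allowed, true)) {
    header('Access-Control-Allow-Origin: ' . $_SERVER['HTTP_ORIGIN']);
    header('Access-Control-Allow-Credentials: true');
}

// Report what the shared session cookie says about the login state
header('Content-Type: application/json');
echo json_encode([
    'loggedIn' => isset($_SESSION['user_id']),
    'userId'   => isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null,
]);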
The StackOverflow sites have implemented something similar to this. Check out the details at the following links.
Here is a post giving an outline of how they did it.
And here is even more detail.
For this you do have to use cookies, but you can vary what you store in the cookie. The cookie doesn't have to contain user credentials but can instead contain something more like a token that you use to "centralize" your sessions.
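For instance, the cookie might hold only an opaque signed token that every site's back end can verify; a rough PHP sketch (the HMAC secret, cookie name, and token format are purely illustrative):
<?php
// Issue and verify a signed token instead of storing credentials in the cookie
$secret = 'shared-secret-known-to-all-back-ends'; // assumption: shared via config, not hard-coded

function issueToken($userId, $secret) {
    $payload   = $userId . '|' . time();
    $signature = hash_hmac('sha256', $payload, $secret);
    return base64_encode($payload . '|' . $signature);
}

function verifyToken($token, $secret) {
    $parts = explode('|', base64_decode($token));
    if (count($parts) !== 3) {
        return false;
    }
    list($userId, $issuedAt, $signature) = $parts;
    $expected = hash_hmac('sha256', $userId . '|' . $issuedAt, $secret);
    return hash_equals($expected, $signature) ? $userId : false;
}

// On login:        setcookie('auth_token', issueToken($userId, $secret), time() + 86400, '/');
// On each request: $userId = verifyToken(isset($_COOKIE['auth_token']) ? $_COOKIE['auth_token'] : '', $secret);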
The easiest way would be to let all hosts share a single memcached server and use the content of the user's cookie as your key.
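With the php-memcached extension (an assumption; the host name below is made up) that can be as little as pointing every host's session handler at the same memcached instance:
<?php
// Identical configuration on every host, so they all read and write the same session store
ini_set('session.save_handler', 'memcached');
ini_set('session.save_path', 'sessions.internal.example:11211');

// The session ID still comes from the user's cookie, so each domain must
// somehow receive that same cookie value in order to look the session up.
session_start();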
I have been asked to write something that saves data to a secure place after a certain task is completed by a client.
My client has a lot of staff who do data entry in online forms provided by different vendors. After some steps, the entered data generates some results, i.e., after submitting the web form. Now he wants the generated results to be saved either on the local computer or on some online server.
Can it be done by:
1.) Creating a local web server which sits between the users and the main third-party online server and records all the generated results?
2.) Creating a browser extension [I'm thinking about Firefox] and forcing users to navigate via that specific browser, recording all the generated results?
I am pretty sure the second method can work, as Firebug already does this; I would only need to add some functionality to save the data.
Any ideas will be appreciated.
Sorry for misleading you, guys. To make it clearer, here is some more explanation.
a.) I am writing an application for one of my clients, who has many staff members entering data on the abc.com website.
b.) A user submits the data to abc.com.
c.) The website abc.com produces or generates some result based on the input data and sends it back to the user.
d.) Now I need a system which sits in between the staff/users and the abc.com website, tracks the responses from abc.com, and saves them to some location automatically.
e.) Currently the data entry user manually saves the result to his local computer, and if he/she forgets to save, we miss the result; I want to do it automatically so that we won't miss a single result.
A local web server wouldn't really help so much because of cross-domain issues, unless you want to go with something like JSONP. In that case, you might want to use a signed Java applet, which (assuming the user accepts the certificate) has any access an installed Java app would have, so it could potentially bind a port and serve as an HTTP server (I'm not sure how well this works on Linux or Unix).
Another idea would be to use Flash's local shared objects. It depends on how much data you want to save. You may need to make the Flash element visible, so the user can see the dialog for allowing the data to be saved. You can communicate from JavaScript to Flash using ExternalInterface.
Yet the simplest thing is to give the user a permanent cookie and save the data associated with that cookie on a web server.
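A rough PHP sketch of that last suggestion; the cookie name, lifetime, results table, and credentials are all assumptions:
<?php
// Give the visitor a long-lived identifier cookie on first contact
if (!isset($_COOKIE['visitor_id'])) {
    $visitorId = bin2hex(random_bytes(16));
    setcookie('visitor_id', $visitorId, time() + 10 * 365 * 24 * 60 * 60, '/');
} else {
    $visitorId = $_COOKIE['visitor_id'];
}

// Store each generated result against that identifier on the server
// Hypothetical table: results(visitor_id VARCHAR(32), payload TEXT, created_at DATETIME)
$pdo  = new PDO('mysql:host=localhost;dbname=capture', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO results (visitor_id, payload, created_at) VALUES (?, ?, NOW())');
$stmt->execute([$visitorId, $_POST['result']]);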
greetz
back2dos
Take a look at HTML5 local storage if you don't need a lot of data to be stored, or you can use local databases; see http://blog.darkcrimson.com/2010/05/local-databases/ .
You can also sync the data with a server once the client is online again.