Google Drive API Integration Issue - PHP

I'm trying to figure out a workflow for an app that allows users to upload files (PDFs, Docs, etc.) and share them with other users in their organization.
The users who need to view the document will not necessarily have permissions to do so through Google Drive, only through the app.
At present, users can upload documents but I want to allow them to link to documents in their Google Drive.
I'm unable to figure out how I would initially go about doing this.
At present, the best idea I have is to gain offline access to their Google Drive account and retrieve a copy of the document to the server. This doesn't seem like the best idea, though, as retrieving a document, saving it, and showing it (ironically, using the Google Docs viewer) on every page load would hog server resources.
I could get a copy when the user first adds the document, but then there's no guarantee that it's up to date when someone accesses it several months later.
Is there a correct way to do this?

Check out Drive Platform Best Practices and Performance Tips; they will help you build high-quality Google Drive apps and improve the performance of your application.
To detect changes, you may want to see the documentation on Detect Changes and Push Notifications.
For Google Drive apps that need to keep track of changes to files, the Changes collection provides an efficient way to detect changes to all files, including those that have been shared with a user. The collection works by providing the current state of each file, if and only if the file has changed since a given point in time.
The Drive API provides push notifications that let you watch for changes to resources. You can use this feature to improve the performance of your application. It allows you to eliminate the extra network and compute costs involved with polling resources to determine if they have changed. Whenever a watched resource changes, the Drive API notifies your application.
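Whether you poll periodically or react to a push notification, the actual deltas come from the Changes collection. As a rough sketch only, assuming the google/apiclient package and Drive API v3, where $client is an already-authorised Google_Client and persisting the page token between polls is left to you, it could look like this:

    <?php
    // Rough sketch: walking the Drive Changes collection with the PHP client
    // library (google/apiclient, Drive API v3 assumed).
    require_once 'vendor/autoload.php';

    $service = new Google_Service_Drive($client);

    // On first run, remember where "now" is so later polls only see new changes.
    $startToken = $service->changes->getStartPageToken()->getStartPageToken();

    // Later, on each poll (or when a notification arrives), walk the change list.
    $pageToken = $startToken;
    while ($pageToken !== null) {
        $changes = $service->changes->listChanges($pageToken);
        foreach ($changes->getChanges() as $change) {
            // React to the change, e.g. refresh the cached copy of this file.
            printf("File %s changed\n", $change->getFileId());
        }
        if ($changes->getNewStartPageToken() !== null) {
            // No more changes for now; save this token for the next poll.
            $startToken = $changes->getNewStartPageToken();
            $pageToken  = null;
        } else {
            $pageToken = $changes->getNextPageToken();
        }
    }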
As described in Manage Revisions, you can also check whether new revisions have been created, as discussed in the video of Google engineers covering related tips and tricks.
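For example, with the same client library (Drive API v3 assumed), listing a file's revisions looks roughly like this; the file ID is a placeholder and $client is again an already-authorised Google_Client:

    <?php
    // Sketch: inspecting a file's revisions so you can tell whether a
    // cached copy is stale.
    $service = new Google_Service_Drive($client);
    $fileId  = 'YOUR_FILE_ID'; // placeholder

    $revisions = $service->revisions->listRevisions($fileId)->getRevisions();
    foreach ($revisions as $revision) {
        // Compare against the revision ID you stored last time to decide
        // whether the cached copy needs refreshing.
        printf("%s modified %s\n", $revision->getId(), $revision->getModifiedTime());
    }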

Related

MySQL Database Design, in support of an Android App

I am working on an Android application that will allow users to gain access to certain sets of files and then sync them with their device for offline use. Thus, when they first download my app, they will have no account or files; they will have to create an account, then enter an access code to gain access to certain file directories for download.
I have the majority of the app's UI completed, and it now needs to be 'plugged in' to a backend. I am a little familiar with PHP, as I have done a few dynamic websites, so I am comfortable working with phpMyAdmin, but I really want to make this backend schema well made and able to handle business.
I don't know where to start designing the relational database, and there are probably other factors I haven't even thought about yet. Does anyone have any good references, tutorials, anything that may help me take this next step?
If you want to learn about database modeling, these websites can give you a good start: example2 example3
Or do you wish to learn about SQL coding?
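As a concrete starting point for the schema the question describes (accounts, access codes, directories of downloadable files), here is a rough sketch using PDO against MySQL. Every table and column name is illustrative only, not a recommendation of the final design:

    <?php
    // Illustrative sketch: users redeem an access code to unlock a
    // directory of files. Adjust names and columns to your real needs.
    $pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass', [
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    ]);

    $tables = [
        "CREATE TABLE users (
            id         INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
            email      VARCHAR(255) NOT NULL UNIQUE,
            pass_hash  VARCHAR(255) NOT NULL,
            created_at DATETIME     NOT NULL
        )",
        "CREATE TABLE directories (
            id          INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
            name        VARCHAR(255) NOT NULL,
            access_code CHAR(8)      NOT NULL UNIQUE
        )",
        "CREATE TABLE files (
            id           INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
            directory_id INT UNSIGNED NOT NULL,
            filename     VARCHAR(255) NOT NULL,
            updated_at   DATETIME     NOT NULL,
            FOREIGN KEY (directory_id) REFERENCES directories(id)
        )",
        // which users have redeemed which access codes
        "CREATE TABLE user_directories (
            user_id      INT UNSIGNED NOT NULL,
            directory_id INT UNSIGNED NOT NULL,
            PRIMARY KEY (user_id, directory_id),
            FOREIGN KEY (user_id)      REFERENCES users(id),
            FOREIGN KEY (directory_id) REFERENCES directories(id)
        )",
    ];

    foreach ($tables as $sql) {
        $pdo->exec($sql);
    }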

What is the concept behind Video Distribution Services?

I came across quite a few video distribution services (one is heyspread). According to the video on their home page, you upload your video only once and it is distributed to other video-sharing sites. To me, it looks like they have coded it in PHP.
Can anyone explain the logic behind it to me? How is it possible? Also, will it take the bandwidth of uploading it only once, or of uploading it to all the sites that we are using?
Also, if I had to code something like that, are there any links I could use as a starting point?
Here is my explanation based on what I know and your requirements:
Can anyone explain the logic behind it to me?
You basically act as a mediator for all the sites. Users give your service permission to use the credentials they have set up on these multiple sites, allowing you to upload videos on their behalf so they don't have to do it themselves, thereby saving them time.
How is it possible?
Many video-hosting websites operate over the HTTP protocol. In order to upload on your behalf, the video distribution service does the following for each website (I have generalized the steps; some sites may require more than these):
Authenticate using the credentials that you provide
Upload the video in one of the following ways:
If an API is available, that is the preferred way for the service to upload on your behalf, as the interfaces to authenticate and upload are clean and well defined
If no API is available for a particular website, the service has to simulate the HTTP sequence as if the requests were made from the browser by the user. This is not the best way, but sometimes it is the only way. This approach is not as robust as the first one, because the contract could change and you don't necessarily get a confirmation message (for upload success/failure, for example) other than by parsing the HTML (see the sketch after this list)
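To give a rough idea of that second approach, here is a sketch in PHP using cURL. The endpoint and form field names are invented for illustration; a real target site would also require a prior login step and whatever hidden/CSRF fields its upload form expects:

    <?php
    // Rough sketch of the "simulate the browser" approach with PHP cURL.
    $ch = curl_init('https://video-site.example/upload'); // hypothetical endpoint

    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_COOKIEFILE     => '/tmp/session-cookies',  // session from the login step
        CURLOPT_POSTFIELDS     => [
            'title'       => 'My video',
            'description' => 'Uploaded on the user\'s behalf',
            'video'       => new CURLFile('/path/to/video.mp4', 'video/mp4'),
        ],
    ]);

    $html = curl_exec($ch);
    curl_close($ch);

    // Without an API there is no structured response, so success or failure
    // has to be inferred by parsing the returned HTML.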
Also, will it take the bandwidth of uploading it only once, or of uploading it to all the sites that we are using?
It will use your bandwidth once, to upload the video the first time. In order to upload to the other websites on your behalf, the video distribution service will use its own datacenter/cloud bandwidth.
Also, if I had to code something like that, are there any links I could use as a starting point?
I don't know of any resources that cover exactly that, but you should first look at each site you want to mediate and see whether it has an API. You might want to start with the sites that do before trying to simulate user clicks for the sites that don't. For example, YouTube even goes further by providing a Client Library for you to use in your language of choice.

Syncing data and images with a client in a web application

I'm writing a web application in PHP which needs to store images and image meta data. In future, the application may need to work offline on the client. A user might need to download all the images and data to his laptop before going to a remote area without internet access. Whilst at the remote location the user could add new images to the system and be able to compare them with his local copy of the image database. When returning to an area with internet access, the user would run a sync operation which would copy his new images to the server and retrieve any new ones.
I've looked at the new Web Storage / localStorage options in HTML5 (Web SQL Database seems to have been dropped) and I think this is going to be too limited, as there is only 5 MB of space and one or two images could easily exceed that.
Is what I want to do actually possible / practical with a browser-based web application? Or should I be looking at writing a desktop/tablet application with local file storage capabilities for users without net access? Initially it does need to be a web application; I'm just trying to think ahead. Will I give myself more options in the future by using something like CouchDB for the backend from the start? As I understand it, it comes with good syncing functionality.
Thanks,
I decided to use Titanium Desktop.
http://www.appcelerator.com/products/titanium-desktop-application-development/

Suggestions on a solution for storing photos per client, which they can then have access to

I am trying to build an app for a photographer. These are his requirements:
Photographer can upload photos or files (files up to 20-30 MB).
Photographer can categorize photos and files by client.
Photographer can create client username and passwords.
Photographer can send an email from the web interface with a download link (which requires client login).
Client can login and view and download photos and files assigned to his account.
Photographer mentioned 1-2 terabytes of data needs to be stored.
So, my questions:
Is there an open-source system out there that already does this?
Is there an app already out there that does this? The photographer currently uses "yousendit", but the free solution is not sufficient since data is lost after 2 weeks.
He mentioned he could host it on a box at his office, but his connection is limited... thoughts? At that point storage space would not be an issue, but I would have to code this app.
GoDaddy hosting, for example, offers unlimited storage with one of its reasonably priced plans. If I coded my own app, this would be perfect. I do hate GoDaddy, though...
I will listen to any alternative suggestions.
Thanks!!
I'm not sure about building an app like this, but SmugMug is an existing app that hits most of your requirements.
You can upload unlimited JPGs as part of the base cost; other files can be uploaded at additional cost.
Photos can be categorized into galleries per client.
Galleries can be locked down, with clients getting unique user/pass to their gallery
Not 100% sure if you can email a link directly from the site, but I believe you can.
Additionally, if you get a pro account, clients can order prints online (no need to build your own payment processing), you can "theme" the galleries how you like, get automatic watermarking, etc.
The base cost is $40-$150/year, depending on what level you choose, plus whatever you need for additional file storage. Not sure if it meets all your needs, but just throwing it out there. Note: I am not affiliated w/ SmugMug, just a satisfied user.
The first question you need to answer is whether you will store the files in the database or on the file system. Given the amount of data in question, I would store the files on the file system and keep metadata about them, including their location, in the database. The catch to this approach is that you have to keep the two in sync with each other. It is not particularly difficult to build that system.
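To give a rough idea, a minimal sketch of that approach in PHP follows. Paths, table and column names are illustrative; validation, authentication and error handling are left out:

    <?php
    // Sketch: file goes on disk, metadata goes in the database.
    $pdo = new PDO('mysql:host=localhost;dbname=photos', 'user', 'pass');

    if (isset($_FILES['photo']) && $_FILES['photo']['error'] === UPLOAD_ERR_OK) {
        $clientId = (int) $_POST['client_id'];
        $original = basename($_FILES['photo']['name']);

        // Store under a generated name so the DB row, not the filename, is canonical.
        $storedName = bin2hex(random_bytes(16)) . '.' . pathinfo($original, PATHINFO_EXTENSION);
        $targetPath = '/var/storage/photos/' . $storedName;

        if (move_uploaded_file($_FILES['photo']['tmp_name'], $targetPath)) {
            // Keep the two in sync: only insert metadata once the file is on disk.
            $stmt = $pdo->prepare(
                'INSERT INTO photos (client_id, original_name, path, size_bytes, uploaded_at)
                 VALUES (?, ?, ?, ?, NOW())'
            );
            $stmt->execute([$clientId, $original, $targetPath, filesize($targetPath)]);
        }
    }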
The second question relates to where you store the files. You could store them at the same place where you host the site, or you could use a cloud storage option like Amazon's S3 (or DreamHost, mentioned by J_B). One advantage of using cloud storage is that the site can be hosted anywhere without affecting the location of the files.
DreamHost might not mind. They say they will give you all the storage your site needs with all plans (+ some extra for storing whatever you want).
I don't know about an app that does that. Doesn't sound REAL hard to write.

just a bit of strategy

I need some guidance around how to develop the app I'm working on.
It's basically a backend system to manage photos and slideshows (e.g. arrange photos in albums, decide which ones to publish, update names and captions, etc.).
I would like to avoid giving the source code to clients but would like to keep the actual photos and thumbnails on the client's server.
I'm not sure what would be the best way to achieve this. In my mind the steps are:
a) client uploads a photo to MY site
b) photo is registered into my DB
c) the original photo is moved to client's server
d) thumbnails are generated and saved on client's server
then the public site:
e) install the public website on my client's server;
f) when a user is browsing the client's website, the script gets the list of images to show from my database, and gets them from the local server.
(hope I made myself clear)
Basically the question is: what's the best way to give the client minimal or no access to the source code?
I agree with benjy; however, you can get away with using an API to manage the system-specific calls and just have an upload handler that communicates back to your API on the client's box. They still have some code, but it is minimal, and that code requires an API call to function. That way you reduce the database needs and the resources required to manage the client's code.
The API is used to authenticate and manage communication, while the upload/management scripts handle the upload and image processing.
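A minimal sketch of what that handler on the client's box could look like; the central API URL, key and endpoints here are invented for illustration, and thumbnail generation and error handling are omitted:

    <?php
    // Sketch: the only code living on the client's server. It does nothing
    // until the central API authorises the request, then stores the photo
    // locally and registers it back with the central database.
    $apiBase = 'https://your-service.example/api'; // hypothetical central API
    $apiKey  = getenv('PHOTO_APP_API_KEY');

    // 1. Ask the central API whether this upload is allowed.
    $auth = json_decode(file_get_contents(
        $apiBase . '/authorise?key=' . urlencode($apiKey)
                 . '&album=' . urlencode($_POST['album'] ?? '')
    ), true);

    if (!empty($auth['allowed']) && $_FILES['photo']['error'] === UPLOAD_ERR_OK) {
        // 2. Store the original (and, in a fuller version, thumbnails) locally.
        $target = __DIR__ . '/photos/' . basename($_FILES['photo']['name']);
        move_uploaded_file($_FILES['photo']['tmp_name'], $target);

        // 3. Register the stored file with the central database via the API.
        $ch = curl_init($apiBase . '/photos');
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_POSTFIELDS     => ['key' => $apiKey, 'path' => $target],
        ]);
        curl_exec($ch);
        curl_close($ch);
    }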
IMO, this seems a little unnecessary. What exactly is your concern about having the source code rest on a client's server? All you need is a signed license agreement between you and the client preventing them from doing anything with it.
Or, if you really don't trust them, just sell it as hosted software. No point in the above procedure, which is rather convoluted (no offense), when you can just have everything on one server.
Just my $.02.
You can obfuscate the code with a commercial tool like IonCube, or you can develop your application, license it using a SaaS model, and provide an API for the client software to use.
Zend Guard, SourceGuardian, IonCube, and similar are other viable options if you cannot keep it local but want to make it difficult to find out what the "source" is.
