We built a survey tool on top of Google Forms using WordPress.
In short: you create a Google Form, get its open link, and put it into a WordPress back-end page; the system then processes the form server-side and generates the necessary HTML file. When the user fills in the form and submits it, the server uses Zend Gdata (via AJAX) to write the results to the spreadsheet connected to the form, et voilà.
But this system is limited, partly because Google Forms itself is quite limited, and we want to improve it.
That's why I'm asking for your opinions on upgrading the system with some more features:
We want to be able to keep the form open so that users can fill it in on more than one occasion. In theory, then, we need to know which user each spreadsheet row belongs to. This could be done by saving some sort of ID key to recognize the user, but then we don't know how to re-fill the fields in the form, since the spreadsheet created from the form doesn't retain any sort of key connecting columns to form fields.
We need more field types! For example, a file-upload field that puts the uploaded file in a specific Google Drive folder.
We need to see the data for a single entry, while Google only gives you the whole spreadsheet, which is quite hard to read.
It's not an easy task! Which solutions should we use to solve these problems?
Many thanks!
UPGRADE:
We decided to go with a mix of Google Forms, Google Fusion Tables, and Google Charts via API access. Here's the simplified algorithm:
1. The admin user creates his form via Google Forms and saves the URL. To get more field types, the admin can put a tag in the field comment, e.g. [file] for, well, file uploads.
2. The URL is put into an admin page of our system. The page fetches the content of the form page and extracts, for every field, the title, the ID, the type, and the comment into an array; if there's a tag in the comment, that becomes the field type.
3. Using this data, the system creates (if it doesn't already exist) a folder with a Fusion Table inside. If file fields are present, another subfolder is generated. The addresses of these folders and files are saved.
4. Using the array data, a column is created in the Fusion Table for each of the array fields, with a column title of the form "[field_ID field_type]field_title", plus a column for the end-user ID.
5. The admin user can, moreover, open or close the form.
6. When a user goes to the form page, the array is used to generate the form. If the system doesn't have the user's ID in memory, it means the user has never filled in the form. Otherwise the system uses the user ID to fetch the data from the Fusion Table and populate the form.
7. When the user fills in the form, the entries are fed to the columns using the field ID as reference, plus the user ID. The user ID is also stored in the system to remember that the user has already filled in the form, as noted in the previous point. If files are uploaded, they are stored in a Google Drive folder.
8. The admin user can then go to the admin page and see how many people have filled in the form, ask for single-user data or for summary data via Google Charts, and download a PDF of the data for a single user, every user, or the summary.
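The tag-parsing and column-naming described above could be sketched like this (the array keys and function names are our own assumptions, not part of the actual system):

```php
<?php
// Sketch: derive the effective field type from a [tag] in the field comment,
// and build the Fusion Table column title "[field_ID field_type]field_title".

function effectiveFieldType(array $field) {
    // A tag like "[file]" in the comment overrides the native Forms type.
    if (preg_match('/\[(\w+)\]/', $field['comment'], $m)) {
        return $m[1];
    }
    return $field['type'];
}

function fusionColumnTitle(array $field) {
    return '[' . $field['id'] . ' ' . effectiveFieldType($field) . ']' . $field['title'];
}

$field = array(
    'id'      => 'entry_123',
    'title'   => 'Your CV',
    'type'    => 'text',
    'comment' => 'Please attach a PDF [file]',
);

echo fusionColumnTitle($field); // [entry_123 file]Your CV
```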
Of course this is just the idea; we still have to build it. A first question is whether we should use JavaScript or PHP to communicate with Google, i.e. do the processing on the client or the server side...
If you're asking about JavaScript vs. PHP, you should know that the JavaScript API can't write to a Google Spreadsheet because of cross-domain security issues.
PHP can, as it is a server-side language, and the Zend Framework makes it easy to interact with Google Spreadsheets: http://framework.zend.com/manual/1.12/en/zend.gdata.spreadsheets.html
So go with PHP if that was your question.
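A rough sketch of the write path with Zend Framework 1's Gdata component (credentials, spreadsheet key, and worksheet ID are placeholders; see the manual linked above):

```php
<?php
// Sketch only: write one form submission as a row in a Google Spreadsheet
// using Zend_Gdata (Zend Framework 1.x). Requires the ZF1 library on the
// include path; credentials and keys below are placeholders.

function writeSubmission(array $rowData, $spreadsheetKey, $worksheetId) {
    require_once 'Zend/Gdata/Spreadsheets.php';
    require_once 'Zend/Gdata/ClientLogin.php';

    $client = Zend_Gdata_ClientLogin::getHttpClient(
        'you@example.com', 'secret',
        Zend_Gdata_Spreadsheets::AUTH_SERVICE_NAME
    );
    $service = new Zend_Gdata_Spreadsheets($client);
    $service->insertRow($rowData, $spreadsheetKey, $worksheetId);
}

// The list-feed column keys are the sheet's header titles lowercased with
// spaces and punctuation removed, so normalize your field names first:
function listFeedKey($headerTitle) {
    return strtolower(preg_replace('/[^A-Za-z0-9]/', '', $headerTitle));
}
```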
I want to submit a form from my web site to a Gscript file on my Google Drive and have it take the form values, generate a PDF file, and email it. I have this working already using Google Forms and a Google Sheet, but the goal here is to have the data itself stored in a MySQL relational database where data is not duplicated: as we fill in our form, data will auto-populate if it already exists, or we can choose it from a list on the left to fill in the pertinent information. This is for a donation-receipt system for a nonprofit organization.
The key for us is to separate donor information from donations. We want to be able to store our donor info, with things like birthdays, anniversaries, and other miscellaneous details, so we can send out birthday cards, etc., but not store that data over and over again just because a donor made several donations.
Is there any way I can move the form itself over to a secured portion of my own server, and submit the data to my .gs file, which already parses the data, puts it into the PDF document, Mailchimp, and my MySQL database, and sends the appropriate emails? I could cut out much of that extra work on the Google side, doing everything except the PDF and email from my own server and handling the rest from Google, if I could just submit the data as a POST or GET request, or as JSON, to the .gs script.
I cannot find anything about it anywhere.
Apps Script has lots of reserved function names. Two of those reserved function names are:
doGet()
and
doPost()
Both of them are triggers, or event handlers, that fire when a GET or POST request is made to the published Web App URL. Data can be passed to Apps Script with the request. Here is a related Stack Overflow post: Call GAS from external source
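A minimal doPost() sketch for the published Web App (the parameter names and the buildReceipt() helper are just illustrations; e.parameter holds the POSTed form fields):

```javascript
// Sketch of an Apps Script Web App endpoint. Deploy as a Web App and
// POST your form fields to its URL from your own server.

// Hypothetical helper: shape the raw parameters before generating the
// PDF / sending mail (split out so the data shape is easy to see).
function buildReceipt(params) {
  return {
    donor:  params.donor  || '',
    amount: Number(params.amount || 0),
  };
}

function doPost(e) {
  var receipt = buildReceipt(e.parameter);
  // ...generate the PDF and send the email here, as your .gs file already does...
  return ContentService
    .createTextOutput(JSON.stringify({ status: 'ok', receipt: receipt }))
    .setMimeType(ContentService.MimeType.JSON);
}
```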
I have a separate server alongside my Moodle DB, which holds all user-course data. In some of my plug-ins (type: block) I fetch details from an API (which talks to that other server) and display them in blocks.
My requirement is to customize the course-completion behaviour for a user: e.g. if a user launches a course, I need to POST some data (for example: timestamp, course completion %, etc.) to the API when the user closes the course, whether or not they completed it.
I guess I need to modify the file "mod/scorm/locallib.php".
You can use the events in Moodle.
Have a look at this answer - Email moodle user data after registration -
but replace user_created with course_completed.
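A sketch of what that observer could look like. The plugin name, API URL, and helper functions are hypothetical; \core\event\course_completed is a real Moodle core event. You register the observer in your plugin's db/events.php and do the POST in the callback:

```php
<?php
// In a hypothetical local plugin "local_courseping", db/events.php would
// contain something like:
//
//   $observers = array(array(
//       'eventname' => '\core\event\course_completed',
//       'callback'  => '\local_courseping\observer::course_completed',
//   ));
//
// The callback then builds and POSTs the payload. buildPayload() is split
// out so the shape of the POSTed data is easy to see (and to test).

function buildPayload($userid, $courseid, $timestamp, $percent) {
    return array(
        'userid'    => $userid,
        'courseid'  => $courseid,
        'timestamp' => $timestamp,
        'completed' => $percent, // completion %, from your completion record
    );
}

function postToApi(array $payload) {
    // Plain cURL POST to the external API (URL is a placeholder).
    $ch = curl_init('https://example.com/api/course-completion');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($payload));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
```

This keeps you out of mod/scorm/locallib.php entirely: core fires the event, and your plugin reacts to it.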
I am new to Behat scenario testing and want to know how (or what the best practice is) to capture/store data from one site to use on another site to fill in user information. For example, site 1 can create a user profile, such as a user ID and password. Site 2 takes the user ID and password from site 1 and uses them to fill in the requested ID and password, logging the user into the site.
I can do this with Java and Selenium WebDriver by taking a string from site 1 and entering it at the user-ID location on site 2 with sendKeys(string). What is the best way to do this with a Behat PHP scenario and FeatureContext setup?
I have also reviewed, on this site, "How to test with behat two sites in the same test", which helps with the first stage of the scenario, but not with filling in the data on site two.
First, please tell us your version of Behat; there is a big difference between v2.5 and v3.0.
Do you use Mink? In that case you have CSS selectors to extract values.
Next you need to save them. You could do this in the context, saving the value in a given step, but you would have to repeat this every time for other data.
To fill the data into a form on the other site, you simply write a form-filling step in the context, but again you repeat this for each form.
I created a small universal library for holding data and passing it between steps. It's still in development, but it could help:
clipboard
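The idea can be sketched as a tiny store that one step writes to and a later step reads from (this is an illustration of the concept, not the linked library, and the step wordings are made up):

```php
<?php
// Minimal "clipboard" for sharing values between Behat steps.
class Clipboard {
    private static $data = array();

    public static function set($key, $value) {
        self::$data[$key] = $value;
    }

    public static function get($key) {
        return isset(self::$data[$key]) ? self::$data[$key] : null;
    }
}

// In a FeatureContext (Behat 3 + Mink) you would then write steps like:
//
//   @When I remember the value of :selector as :key
//     -> Clipboard::set($key, $page->find('css', $selector)->getValue());
//
//   @When I fill :field with the remembered :key
//     -> $page->fillField($field, Clipboard::get($key));
```

Because the storage is static, the value captured on site 1 survives the session switch and can be typed into the login form on site 2.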
I have the following case:
I have a third-party plugin in my ExpressionEngine site that handles my user registrations. It also handles profile-picture uploads (which are part of the registration process).
I want to add a feature for grabbing the user's profile picture from Facebook during registration. As I don't want to tinker with the third-party plugin, I'm looking for a way to solve this on the client side, by sending the profile pic from Facebook directly into the PHP $_FILES array when the registration form is submitted (so the third-party plugin can pick it up from there).
Is there any way to do this?
The answer is: no, you cannot. The same-origin policy (amongst other things) makes this impossible. The only way to do this is to take information about the user's Facebook account and fetch the picture server-side using the Facebook APIs.
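A server-side sketch using the Graph API's picture edge (the helper names and save path are our own; the endpoint redirects to the actual image file, so redirects must be followed):

```php
<?php
// Sketch: fetch a Facebook profile picture server-side.
// The Graph API "picture" edge redirects to the image itself.

function buildPictureUrl($facebookUserId, $type = 'large') {
    return 'https://graph.facebook.com/' . urlencode($facebookUserId)
         . '/picture?type=' . urlencode($type);
}

function fetchProfilePicture($facebookUserId, $savePath) {
    $ch = curl_init(buildPictureUrl($facebookUserId));
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow the redirect
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $image = curl_exec($ch);
    curl_close($ch);
    if ($image === false) {
        return false;
    }
    return file_put_contents($savePath, $image) !== false;
}
```

Once the file is on your server, you can hand it to the registration flow from there instead of trying to forge $_FILES in the browser.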
Hello, I am working on an educational app, just for my own learning, where a user can log in, upload photos, and then have them displayed on a website. I am a little confused about the general idea of how it should work. I know this isn't the best place for this type of question, but I can't find an answer anywhere else.
Basically I want a user to be able to log in with something like OpenID and then upload a photo (I am using PhoneGap, so they would use the PhoneGap API to do this, and I understand the mobile side here) to a server, have it hosted there, and be able to have a user's photos displayed together.
Do I need to have each photo submitted to a database, and if so, how would I store the photo info? Any input would be appreciated; I am a little lost. I know PHP for the server side; I just don't know what to do.
Do I need to have each photo be submitted to a database
Only if you want to; you would have to base64-encode it or store it as a BLOB, though. Your best bet is to store the image somewhere on your server outside of the database, and instead store its address or path in the database along with a unique ID.
how would I store the photo info?
What photo info, the EXIF data? Or are you talking about additional general information (date it was uploaded, user who uploaded it, etc.)? Either way, you would make a column in the database table for each piece of data you want to store.
For example, if you wanted to store the date each photo was uploaded and which user uploaded it, a row in the database would look something like this:

id | photo             | user      | date
---|-------------------|-----------|-----------
58 | uploads/img58.jpg | myuser192 | 1338483324
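Saving the file and recording its path plus metadata could then look like this (the table/column names match the example row above; the DSN, credentials, and helper names are placeholders):

```php
<?php
// Sketch: save an uploaded photo to disk and record its path + metadata
// in MySQL via PDO.

function storedPhotoPath($uploadDir, $originalName) {
    // Unique name so two users can upload "photo.jpg" without a clash.
    $ext = pathinfo($originalName, PATHINFO_EXTENSION);
    return rtrim($uploadDir, '/') . '/' . uniqid('img') . ($ext ? '.' . $ext : '');
}

function savePhoto(PDO $db, array $file, $user) {
    $path = storedPhotoPath('uploads', $file['name']);
    if (!move_uploaded_file($file['tmp_name'], $path)) {
        return false;
    }
    $stmt = $db->prepare('INSERT INTO photos (photo, user, date) VALUES (?, ?, ?)');
    return $stmt->execute(array($path, $user, time()));
}

// Usage (DSN and credentials are placeholders):
// $db = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
// savePhoto($db, $_FILES['photo'], 'myuser192');
```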
As for actually getting the photos onto your server, perhaps you could use an HTML5-friendly solution like Uploadify.