Objective-C upload to database using script - PHP

I am about to develop an application for iOS devices that will need to store information in a database on the web. For previous projects, I would just have PHP scripts, and my application would call those scripts, passing in the data to upload as _GET parameters, like this:
http://example.com?name=George&contents=iWillBePutIntoDB
However, this is not possible for my next project, as it will contain a rather large amount of text, and I could exceed the maximum allowed length of a URL.
So, how do I go about doing this? I cannot access the MySQL database directly from my app, since my shared hosting provider doesn't allow it for security reasons. So I can only access it using PHP scripts that are stored directly on the server.
In short: how do I upload a large amount of text to a MySQL database that doesn't allow access from anywhere except files on the server itself (PHP scripts)?
Thanks everyone!

One obvious way would be to use POST instead of GET.
But take into consideration that POST is not without its limits either.
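For illustration, a minimal sketch of the receiving script, assuming a hypothetical posts table with name and contents columns; the iOS side would then send an ordinary HTTP POST whose body carries name=George&contents=... instead of packing them into the URL:

    <?php
    // upload.php - reads from $_POST instead of $_GET; POST bodies are
    // not subject to URL length limits, though PHP's post_max_size
    // setting still caps them. Table/column names are made up.
    $pdo = new PDO(
        'mysql:host=localhost;dbname=mydb;charset=utf8',
        'dbuser', 'dbpass',
        array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
    );

    $name     = isset($_POST['name'])     ? $_POST['name']     : '';
    $contents = isset($_POST['contents']) ? $_POST['contents'] : '';

    // A prepared statement handles arbitrarily large text safely.
    $stmt = $pdo->prepare('INSERT INTO posts (name, contents) VALUES (?, ?)');
    $stmt->execute(array($name, $contents));

    echo 'OK';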

Related

PHP file for "Web Service" queries between iOS App and Database

I am developing an iOS application that communicates with a database via "web service" requests.
To do this, I write all the SQL queries in a single PHP file that serves as a "bridge" between the application and the database.
My question is the following:
Should I split the queries into several PHP files (grouped by feature), or can I write them all in one file (which then becomes quite large)?
In addition, some queries allow me to upload images to the server, with a rather long transfer time. If I keep these queries in the same PHP file, will it "block" or "slow down" access to this file for other users? Should I make separate PHP files for uploads?
For the moment, I am developing this application locally and I do not see any problem with a single PHP file. But I have doubts about putting it into production on a real server.
Thank you.
You can make several pages, one for each web service, or you can do it all in one page; it depends on your application's requirements. If it is a huge application, it would be better to use a different web service file for every API call.
The main drawback of a single-page web service is that if there is a problem in one function, the whole application may stop working, and you will be left wondering where the issue is.
If you use a different web service file for each piece of functionality, a failure will be limited to that one page and will not affect the other pages of your application.
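A middle ground is one entry point that dispatches to small, feature-ordered handler files, so a bug stays contained without the client needing many URLs. A rough sketch, with made-up action and file names:

    <?php
    // api.php - single entry point, per-feature handler files.
    $action = isset($_GET['action']) ? $_GET['action'] : '';

    // Whitelisting the handlers also prevents arbitrary file inclusion.
    $handlers = array(
        'get_user'     => 'handlers/get_user.php',
        'list_orders'  => 'handlers/list_orders.php',
        'upload_image' => 'handlers/upload_image.php',
    );

    if (!isset($handlers[$action])) {
        http_response_code(400);
        exit('Unknown action');
    }
    require $handlers[$action];

As for slow image uploads: each PHP request is normally served by its own process or thread, so a long transfer does not block other users of the same file (unless the requests share a session that stays locked for the duration).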

LOAD DATA LOCAL INFILE from Google Storage / Google App Engine?

I need to run a process that will perform about 10,000 MySQL inserts into a GoogleSQL instance. Normally, I would use a LOAD DATA LOCAL INFILE query for this to avoid the script timing out, but my app is running in Google App Engine, which has a read-only filesystem. Normally, when I need my GAE app to write to the filesystem, I can just use file names prefixed with gs:// and the PHP code will read/write to/from Google Storage transparently.
However, I doubt that MySQL will understand a file path of gs://path/to/my/file.
Is there another way that I can make a dynamically generated local file available in a Google App Engine environment so that I can load it into my GoogleSQL instance?
Otherwise, I feel like I'm going to need to build a looping AJAX system to insert X rows at a time until it has gone through however many I need (10,000... 20,000, etc.).
I know that I can put multiple value sets into a single INSERT to speed it all up, and I'm planning to do that, but with datasets as large as the ones I'm dealing with, that still won't speed things up enough to consistently avoid the timeouts.
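For what it's worth, a sketch of that batched multi-row INSERT approach, with made-up table and column names; chunking keeps each statement comfortably under MySQL's max_allowed_packet limit:

    <?php
    // Batched multi-row INSERTs as a LOAD DATA LOCAL INFILE substitute.
    $pdo = new PDO('mysql:host=HOST;dbname=mydb', 'user', 'pass',
                   array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

    // Stand-in for the dynamically generated dataset.
    $rows = array();
    for ($i = 1; $i <= 10000; $i++) {
        $rows[] = array("value$i", $i);
    }

    foreach (array_chunk($rows, 500) as $chunk) {
        // Build "INSERT ... VALUES (?, ?),(?, ?),..." for this chunk.
        $placeholders = implode(',', array_fill(0, count($chunk), '(?, ?)'));
        $stmt = $pdo->prepare(
            "INSERT INTO mytable (col_a, col_b) VALUES $placeholders");

        $params = array();
        foreach ($chunk as $row) {
            $params[] = $row[0];
            $params[] = $row[1];
        }
        $stmt->execute($params);
    }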

Referencing Files from the Cloud

I asked a question earlier about the difference between cloud apps and web apps, and the answers and links I received led me to believe that 'cloud' is more about the location of an application, and not just about specific applications. That prompts these questions:
1) If I'm developing an application that'll be based in the cloud using PHP and MySQL: a traditional server setup requires me to have PHP and MySQL engines on the server, otherwise they won't run. Is it the same with the cloud? Do I have to look for clouds with these engines, install them myself, or are they not needed at all?
2) When building applications, files are usually referenced relatively or absolutely, based on their location relative to the calling file. With the cloud, since you don't know the location of the files, how can you reference the required files? Do you have to use URLs for that?
I've pored over many of the cloud questions on here, and it seems that there are a lot of confused souls out there just like myself, and most of the answers don't seem too convincing. Hence, my reason for asking again.
Thanks.
The cloud doesn't mean you don't know the location of your files; it only means that the files are (possibly) not stored on the end user's computer. From your perspective as the developer of the web application, you still will (indeed must) know the locations of any stored files, since it is your application storing them.
To give your end user a reference URL to a file, you can do many different things. One method, for example, involves storing a unique identifier together with the file path of a stored file on your server in a database. You give the user a URL that references the unique identifier, and in your code you then retrieve the file from disk and stream it down to the user with the correct headers.
Another method is to store files in the database as binary BLOBs, and retrieve the data and send it down to the browser with the correct headers. Again, you as the application developer are still responsible for the fate of those files, even though the end user doesn't need to worry about where or how they're stored.
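A sketch of the first method, with a hypothetical files table holding the public identifier, on-disk path, MIME type, and original name:

    <?php
    // download.php?id=... - look up the stored path by its public
    // identifier and stream the bytes down with the correct headers.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass',
                   array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

    $stmt = $pdo->prepare(
        'SELECT path, mime_type, original_name FROM files WHERE uid = ?');
    $stmt->execute(array(isset($_GET['id']) ? $_GET['id'] : ''));
    $file = $stmt->fetch(PDO::FETCH_ASSOC);

    if (!$file || !is_readable($file['path'])) {
        http_response_code(404);
        exit('Not found');
    }

    header('Content-Type: ' . $file['mime_type']);
    header('Content-Length: ' . filesize($file['path']));
    header('Content-Disposition: attachment; filename="'
           . $file['original_name'] . '"');
    readfile($file['path']);   // stream the bytes down to the user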

PHP script to read XML data on a remote Unix server

I have a situation where I have lots of system configuration files and logs from which I have to generate a quick review of the system, useful for troubleshooting.
At first I'd like to build a kind of web interface (most probably a PHP site) that gives a rough snapshot of the system configuration using the available information from the support logs. The support logs reside on mirrored servers (call them the log servers), and the server on which I'll be hosting the site (call it the web server) will have to use ssh/sftp to access them.
My rough sketch:
The PHP script on the web server will make some kind of connection to the log server and go to the support logs' location.
It'll then trigger a Perl script on the log server, which will collect the relevant material from all the config/log files into some useful XML files (there'll be multiple of those).
Somehow these XML files are transferred to the web server, and PHP will use them to create the HTML.
I'm very new to PHP and would like to know if this is feasible, or if there's a better alternative way of doing this?
It would be great if someone could provide more details on this.
Thanks in advance.
EDIT:
Sorry, I forgot to mention that the logs aren't the ones generated on a live machine; I'm dealing with sustaining activities for a NAS storage device, and there'll be plenty of support logs coming from different end customers which folks from my team would like to look at.
Security is not a big concern here (I'm ok with using plain text authentication to log servers) as these servers can be accessed only through company's VPN.
Yes, PHP can process XML. A simple way is to use SimpleXML: http://php.net/manual/en/book.simplexml.php
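For instance, a minimal sketch; the file path and XML structure here are made up:

    <?php
    // A minimal SimpleXML sketch; the path and the structure
    // (<disk name="..."><capacity>...</capacity></disk>) are assumptions.
    $xml = simplexml_load_file('/var/data/logs/system-config.xml');
    if ($xml === false) {
        exit('Could not parse the XML file');
    }

    // Child elements become object properties; attributes use array syntax.
    foreach ($xml->disk as $disk) {
        echo $disk['name'], ': ', $disk->capacity, "\n";
    }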
While you could do this using something like Expect (there is a PHP extension for it too, as discussed below), I would recommend doing this in two separate steps:
A script, running via cron, retrieves data from the servers and stores it locally.
The PHP script reads only from the locally stored data in order to generate reports (see the sketch after the list of benefits below).
This way, you have these benefits:
You don't have to worry about how to make your PHP script connect via SSH to the servers.
You avoid the security risks of allowing your webserver user to log in to other servers (a high risk in case your script gets hacked).
In case of slow or absent connectivity to the servers, a long time to retrieve logs, etc., your PHP script will still be able to show the data quickly, perhaps along with an error message explaining what went wrong during the latest update.
In any case, your PHP script will terminate much more quickly, since it only has to retrieve data from local storage.
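A minimal sketch of step 2, assuming the cron job from step 1 drops the fetched logs into /var/cache/logs (a hypothetical path) and that anything older than an hour should be flagged as stale:

    <?php
    // Step 2 of the split: read only the local cache that the cron job
    // (step 1) keeps filled. The path and the one-hour staleness
    // threshold are assumptions. The cron side could be as simple as:
    //   0 * * * * scp logserver:/support/logs/*.xml /var/cache/logs/
    $cacheDir = '/var/cache/logs';

    foreach (glob("$cacheDir/*.xml") as $path) {
        $age = time() - filemtime($path);
        if ($age > 3600) {
            echo '<p>Warning: ', basename($path), ' is ',
                 round($age / 60), ' minutes old.</p>';
        }
        // ...parse with SimpleXML and render, as in the earlier example...
    }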
Update: SSH client via PHP
OK, from your latest comment I understand that what you need is more of a "front-end browser" to display the files than a report-generation tool or similar; in this case you can use Expect (as I stated before) in order to connect to the remote machines.
There is a PECL extension for PHP providing Expect functionality. Have a look at the PHP Expect manual, and in particular at the usage examples showing how to use it to make SSH connections.
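A hedged sketch adapted from that extension's documented usage: expect_popen(), expect_expectl(), and the EXP_* constants come from the PECL expect manual, but verify them against your installed version; the host, command, and password here are placeholders:

    <?php
    define('PASSWORD_PROMPT', 1);

    // Spawn ssh under expect; the returned stream behaves like a pipe.
    $stream = expect_popen(
        'ssh admin@logserver cat /support/logs/report.xml');

    while (true) {
        switch (expect_expectl($stream, array(
            array('password:', PASSWORD_PROMPT), // glob match on the prompt
        ))) {
            case PASSWORD_PROMPT:
                fwrite($stream, "secret\n");     // plain-text auth, per the question
                break 2;                         // authenticated; read output below
            case EXP_EOF:
                exit('Connection closed before the password prompt');
            case EXP_TIMEOUT:
                exit('Connection timed out');
        }
    }

    // Whatever the remote command prints can now be read normally.
    while (($line = fgets($stream)) !== false) {
        echo $line;
    }
    fclose($stream);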
Alternative way: taking files from an NFS/Samba share
Another way, which avoids SSH entirely, is to browse files on the remote machines via a locally mounted share.
This is especially useful if the interesting files are already shared by a NAS; I wouldn't recommend it if it would mean sharing the whole root filesystem or huge parts of it.

access + mysql converting to web platform = (php + asp.net + mysql)?

I have a database that is written in Access. The Access .mdb file connects via ODBC to a local MySQL database. I have a bunch of SQL and VBA code in the Access file. I don't expect the database to surpass 100 MB; currently it is around 10 MB. I will need to have multi-user access (no more than 10 users at a time).
I need to convert this database from a local one to one on a web server, and I need to make a web interface for it.
How do I get the current local instance of the MySQL database to run off a webserver? I am currently running it off WampServer 2.0. I don't have experience putting a database on a webserver.
I have an OK VB.NET background, but I have never done any web applications. Here's a picture of the Access form that I may need to replicate to work off a website:
(screenshot: http://img42.imageshack.us/img42/1025/83882488.jpg)
Which platform should I use as the front end to this thing?
Would it be possible to just run this Access file off a webserver instead of programming a new front end for it? Or is that not a smart idea?
Thank you for your help!
If your webserver has TCP connectivity to your existing database server, and it's hosted in a suitable place (e.g. don't have your webserver in a datacenter connecting to a database server over your office DSL connection), then no move is required.
If you do need to move it, it's as easy as creating a backup/dump, and restoring it elsewhere.
As far as the frontend, there are MANY technologies that will do what you need (ASP.NET, PHP, Python, Ruby, Perl, Java being the most popular ones, not necessarily in that order).
Use something you are comfortable with, or that you are interested in learning (provided you have the time to do so).
Use something that runs properly on your target webserver. Really, ASP.NET is the only one that has any major issue here, as it's limited to Windows.
Access itself has no direct web-accessible version. A Google search finds some apps that claim to convert Access forms to web-based ones, but I will not link to any because I don't know how well they work. I'm certainly leery of anything like that, because web apps are a different breed from Windows apps. If you are going to go that route, be sure they actually generate HTML output; make sane, clean source; and offer a free trial so you can verify it actually works.
Really though, a form like that is reasonably easy to reproduce with some basic knowledge of server-side programming and some HTML.
I don't have any experience migrating Access to a web-based interface, although I have heard of people going straight from Access to a web page. MySQL is exceptionally easy to migrate. The standard MySQL install comes with a command-line program called mysqldump that lets you export your database straight to a text file, which can then be imported on another server using the mysql command-line client. I don't believe WampServer comes with the command-line tools, although they can be downloaded from mysql.com. However, if it has phpMyAdmin, then there is also an export feature there that will generate a .sql file, which can be imported on the webserver using phpMyAdmin. One thing to keep in mind, though, is that I have had very little success mixing and matching these methods: I've never been able to get a mysqldump-created file to work with phpMyAdmin, and vice versa.
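Roughly, and assuming placeholder hostnames, credentials, and database name, the command-line round trip looks like this:

    # Export from the local WampServer MySQL, then import on the web host
    # (the import uses the mysql client, not mysqldump itself).
    mysqldump -u root -p mydb > mydb.sql
    mysql -h mysql.example-host.com -u webuser -p mydb < mydb.sql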
Good luck!
The link will help you to export and import a MySQL database.
Maybe on a Windows web server there is a way to run Access files; you can check. But anyway, if you have some programming skills, I would say that it is not difficult to create a PHP script which will query your database info and let you edit it.
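To give a flavor of what such a script involves, a minimal sketch that lists and edits rows from a hypothetical parts table (everything here, table and columns included, is made up):

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass',
                   array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

    // Save an edit when one of the row forms below is submitted.
    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        $stmt = $pdo->prepare(
            'UPDATE parts SET description = ? WHERE id = ?');
        $stmt->execute(array($_POST['description'], $_POST['id']));
    }

    // Render each row as its own small edit form.
    foreach ($pdo->query('SELECT id, description FROM parts') as $row) {
        printf('<form method="post">
                  <input type="hidden" name="id" value="%d">
                  <input name="description" value="%s">
                  <button>Save</button>
                </form>',
               $row['id'], htmlspecialchars($row['description']));
    }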
Migrating an Access application to the web is quite difficult, because you can't translate an Access form 1:1 into a web page. Web apps are stateless, whereas Access is built around the concept of bound controls and bound datasets.
Secondly, it is impossible to easily replicate an Access subform.
Third, you lose tons of events that Access forms and controls are built around.
In general, a web page that performs the same task as an Access form will bear little or no resemblance to the Access form, simply because the methods for accomplishing the same tasks and the UI widgets available to you are so completely different.
One thing to consider is whether your users need a web application or if they just need to use your existing Access application over the Internet. If the latter is the case, Windows Terminal Server/Citrix can do the job for a lot less money, since there's no conversion needed. You do need to provision a Windows Terminal Server, set up a VPN and purchase CALs for the users, but the costs of those are going to be much less than the cost of rebuilding the app for web deployment.
It may not be an appropriate solution, but it's one that you should consider, I think.
