I have a PHP-based GAE app that takes file uploads and uses the special App Engine handled upload URLs to have users upload files directly to Google Cloud Storage. This works great, but I just noticed that for some reason every file that has ever been uploaded also shows up in Blobstore. This is true even of files that I have already deleted from the storage bucket in question. Furthermore, when I try to delete these files from Blobstore, I get a message saying they could not be deleted. How can I delete these blobs? I don't want to be billed for them. Also, how can I prevent this from happening?
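For reference, the upload URLs are created roughly like this (a minimal sketch of the legacy App Engine PHP flow; the bucket name and handler path are placeholders, not my real ones):

    <?php
    // Sketch of the App Engine "handled upload" flow: createUploadUrl() returns
    // a one-time URL, App Engine stages the POST, and the file lands in GCS.
    use google\appengine\api\cloud_storage\CloudStorageTools;

    $options   = ['gs_bucket_name' => 'my-bucket']; // placeholder bucket
    $uploadUrl = CloudStorageTools::createUploadUrl('/upload_handler.php', $options);
    ?>
    <form action="<?= $uploadUrl ?>" method="post" enctype="multipart/form-data">
        <input type="file" name="file">
        <input type="submit" value="Upload">
    </form>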
Here is an image of the blobs:
This seems like an issue which should be reported in the Public Issue Tracker. If you create an issue there, you should see a quick response. It seems anomalous that you aren't able to delete these.
I'm stuck wondering what the best solution is for handling large file uploads and sending them to a third-party API. Any pointers on what would be a good solution would be very welcome. Thank you in advance.
The end goal is to send video files to this API - https://docs.bunny.net/reference/manage-videos#video_uploadvideo. The complication is that the files are often large, up to 5GB in size.
I have an existing website built in PHP7 that runs on a LAMP setup on Amazon Lightsail and I want to add a feature for users to upload video files.
Currently I'm uploading the files directly to Amazon S3 using a pre-signed URL. This part is working fine.
But I need to send the files to the API mentioned above. This is where I get stuck!
I think there are two options to explore: (1) find a way to upload directly to the API and skip the S3 upload, or (2) continue uploading to S3 first and then transfer to the API. But I'm not sure if option 1 is even possible, or how to do option 2!
With option 1, I'm wondering if there's a way to upload the files from the users directly to the API. If I do this using a regular HTML form upload, the files are stored temporarily on my server before I can use cURL through PHP to transfer them to the API. This is really time-consuming and feels very inefficient. But I don't know how else to send the files to the API without them first being on my server. Maybe there's an option here that I don't know about!
With option 2, I can already upload large files directly to S3 with pre-signed URLs, and this process seems to run fine. But I don't know how I would then send the file from S3 to the API. I can use an S3 trigger on new files, but when I looked at Lambda, it has a tiny file size limit. Because my site is hosted on Lightsail, I noticed they have a container option, but I don't know if that can be used for this purpose and, if so, how.
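For what it's worth, the naive transfer I can imagine looks like the sketch below (untested; the bucket, library id, video id and API key are placeholders, and the endpoint shape is just my reading of the linked docs). I just don't know if running something like this on my Lightsail box is sensible for 5GB files:

    <?php
    // Untested sketch: stream an object from S3 straight to the Bunny Stream
    // upload endpoint so the whole 5GB file is never buffered in memory.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);
    $s3->registerStreamWrapper(); // enables the s3:// stream wrapper

    $object = 's3://my-bucket/uploads/video.mp4'; // placeholder bucket/key
    $source = fopen($object, 'r');
    $size   = filesize($object);

    // PUT the stream to Bunny Stream; ids and the key are placeholders.
    $ch = curl_init('https://video.bunnycdn.com/library/LIBRARY_ID/videos/VIDEO_ID');
    curl_setopt_array($ch, [
        CURLOPT_PUT            => true,
        CURLOPT_INFILE         => $source,
        CURLOPT_INFILESIZE     => $size,
        CURLOPT_HTTPHEADER     => ['AccessKey: MY_BUNNY_API_KEY'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $response = curl_exec($ch);
    curl_close($ch);
    fclose($source);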
Basically, I'm not sure what solution is best, nor how to proceed with that. And maybe there's an option 3 that I'm not aware of!
I would welcome your input.
Many thanks in advance.
I've been spending a few days trying to figure out how to set AWS S3 as external storage for ResourceSpace, and I've been getting more confused with this app.
I'm using the open-source version and trying to customize it to my needs.
I've been through the web app's lengthy documentation but couldn't find anything about setting up storage (like other web apps out there). However, I found a feature called syncdir that sets an alternative external location for backup, but not as primary external storage. From the documentation, there doesn't seem to be a direct method to specify storage or integrate S3 with it.
I've tried the following:
I tried the AWS S3 integration the way you would integrate it into any PHP website, by changing the storage directory 'storagedir' and the directory 'syncdir' in the config.default file (I added the require for the S3 autoload file and added the AWS keys in the config file), but it's not working; the site is still storing files locally.
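Roughly, the changes I made look like this sketch (paths and keys are placeholders; the $aws_key / $aws_secret variables are ones I added myself, not ResourceSpace settings):

    // In config.php (overrides config.default.php)
    require_once __DIR__ . '/lib/aws/aws-autoloader.php'; // AWS SDK autoloader

    $storagedir = 's3://my-bucket/filestore';       // attempted S3 target
    $syncdir    = '/var/www/resourcespace/syncdir'; // alternative/backup dir

    // Variables I added myself; ResourceSpace core does not read these.
    $aws_key    = 'MY_AWS_KEY';
    $aws_secret = 'MY_AWS_SECRET';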
Note: I've successfully integrated AWS S3 with the Laravel 5.7 and CodeIgniter 3 frameworks before.
I tried adding the require for aws-autoload into the file where the upload functions are, and tried to look for the code responsible for uploading, but the code is confusing to me as to where the upload functionality lives (it's not a plain PHP function where $_FILES receives your upload).
I changed the place of the require for aws-autoload to include/general.php, but no luck.
I followed up with some forum threads on the matter, like:
using external storage
Amazon S3 integration
I'm assuming that by using the config file (to store the AWS credentials, with storage set to the S3 bucket URL) and including the aws-autoload in the general/upload file, it would automatically understand where it should upload, but no error or bug is reported to point me in the right direction.
But most of what I found relates to the paid version of the DAM system, where it seems to be already set up on Amazon.
Please advise; any help is appreciated.
I'm using WAMP on Windows 10, by the way.
Check this discussion out, it might help you:
https://groups.google.com/forum/#!topic/resourcespace/JT833klfwjc
It looks like it is still a work in progress, so you may see WIP code.
You will find links to the code in the thread mentioned above.
I'm relatively new to web development, but I'm trying to implement an image upload feature. The uploaded image will be previewed to the person (an administrator) uploading it, then stored in a database (and displayed to the end user on a different page).
I found a resource that uses the ImageShack API, and I was a bit confused about what this is and how the person implemented the API to achieve the image upload. The code for this is here: http://www.sceditor.com/posts/how-to-upload-and-insert-an-image/
When I googled "ImageShack API," I kept running across something that said I need to request a key. What does this mean, and do I have to do it? Is this the easiest way to create an image upload feature for my purposes?
Thank you all very much!
The ImageShack API is for uploading image files to your account hosted at ImageShack.com. It seems that you want to upload image files to your own website and store them on your own web servers (either in a cloud service such as AWS, or on co-located/managed servers at a data centre). So you probably do not want to use ImageShack.
As to how to upload image files using HTML & PHP, you may want to check out a short tutorial at:
www.w3schools.com/php/php_file_upload.asp
Also, by the way, storing image files in a database such as MySQL may not be a good idea: image files should be stored as files, with only their paths kept in the database. It is faster to serve image files directly from a web server than to fetch image contents stored in a database.
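For illustration, a minimal version of that flow might look like the sketch below; the uploads/ directory and the 'image' form field name are my own assumptions, not from the tutorial. The image goes to disk, and only its path would go into MySQL.

    <?php
    // Minimal upload handler: save the image to disk, keep only the path in
    // the database. 'uploads/' and the 'image' field name are assumptions.
    if ($_SERVER['REQUEST_METHOD'] === 'POST' && isset($_FILES['image'])) {
        $target = 'uploads/' . basename($_FILES['image']['name']);

        // Basic checks: the upload succeeded and the file really is an image.
        if ($_FILES['image']['error'] === UPLOAD_ERR_OK
            && getimagesize($_FILES['image']['tmp_name']) !== false
            && move_uploaded_file($_FILES['image']['tmp_name'], $target)) {
            echo "Stored at $target"; // insert $target into MySQL, not the bytes
        } else {
            echo 'Upload failed.';
        }
    }
    ?>
    <form method="post" enctype="multipart/form-data">
        <input type="file" name="image">
        <input type="submit" value="Upload">
    </form>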
Each time a file is uploaded to a specific Google Drive folder, I would like to create a new entry in my CRM.
Since I'm using PHP, I think I might have to run a cron job to check for new files in Google Drive, then report them to my CRM.
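The cron script I have in mind would be something like this rough, untested sketch using the google/apiclient package (the folder id, credentials path and timestamp handling are placeholders):

    <?php
    // Untested sketch of the cron approach: list files added to one folder
    // since the last run, then push them to the CRM. Ids are placeholders.
    require 'vendor/autoload.php';

    $client = new Google_Client();
    $client->setAuthConfig('credentials.json'); // placeholder credentials
    $client->addScope(Google_Service_Drive::DRIVE_READONLY);

    $service = new Google_Service_Drive($client);

    $lastRun = '2024-01-01T00:00:00'; // would be persisted between runs
    $results = $service->files->listFiles([
        'q'      => "'FOLDER_ID' in parents and createdTime > '$lastRun'",
        'fields' => 'files(id, name, createdTime)',
    ]);

    foreach ($results->getFiles() as $file) {
        // push $file->getName() / $file->getId() into the CRM here
    }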
But I was wondering: is there a way to tell Google to run my script each time a file is uploaded into a specific folder, thus avoiding crons altogether?
Thanks in advance.
It seems that Google provides a push notification system.
https://developers.google.com/drive/web/push
I still have to test it, but if there is a solution to my problem I think it's this feature ;-)
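From a first read of those docs, registering a channel would look roughly like the untested sketch below, using the google/apiclient package and the Drive v3 changes.watch method (the webhook URL, channel id and credentials are placeholders):

    <?php
    // Untested sketch: register a web_hook channel on the Drive changes feed;
    // Google then POSTs to the address below when something changes, and my
    // handler would filter for files in the folder of interest.
    require 'vendor/autoload.php';

    $client = new Google_Client();
    $client->setAuthConfig('credentials.json'); // placeholder credentials
    $client->addScope(Google_Service_Drive::DRIVE_READONLY);

    $service = new Google_Service_Drive($client);

    $pageToken = $service->changes->getStartPageToken()->getStartPageToken();

    $channel = new Google_Service_Drive_Channel();
    $channel->setId(uniqid()); // my channel identifier
    $channel->setType('web_hook');
    $channel->setAddress('https://example.com/drive-webhook.php'); // placeholder

    $service->changes->watch($pageToken, $channel);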
Thanks to @AndréSchild for his help.
We have our company website, and we often have job opportunities, so we have created a form through which interested candidates can apply directly. We also have a resume upload facility; right now, resumes are uploaded and stored on our server using PHP. We don't want to waste server space, so we were thinking of creating one dedicated folder for resumes on Google Drive, so that whenever a user uploads a resume from our website, it gets stored in that Google Drive folder. Is this possible? We don't want to do it with Google Forms or Form+, since we have a form matching our website theme.
It is possible to integrate the Google Drive SDK into your website, which can allow you to do just that. You should look here for the PHP tutorial.
So yes, you can directly store the documents on Google Drive using the API provided here. Look over here for the specifics of file uploading.
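A minimal sketch of such an upload with the google/apiclient package might look like this (the folder id, credentials file and 'resume' form field are assumptions, not something prescribed by the tutorials):

    <?php
    // Sketch: receive the resume from the site's own form, then push it into
    // a dedicated Drive folder. Folder id and credentials are placeholders.
    require 'vendor/autoload.php';

    $client = new Google_Client();
    $client->setAuthConfig('service-account.json'); // one auth option of several
    $client->addScope(Google_Service_Drive::DRIVE_FILE);

    $service = new Google_Service_Drive($client);

    $file = new Google_Service_Drive_DriveFile([
        'name'    => basename($_FILES['resume']['name']),
        'parents' => ['RESUME_FOLDER_ID'],
    ]);

    $service->files->create($file, [
        'data'       => file_get_contents($_FILES['resume']['tmp_name']),
        'mimeType'   => $_FILES['resume']['type'],
        'uploadType' => 'multipart',
    ]);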
Yes, it is possible with the PHP Google client class.
You can find a very good example at: http://hublog.hubmed.org/archives/001954.html