Ferris provides a few utilities for working with the Blobstore API and Cloud Storage to upload and serve binary files.
The Upload component takes the guesswork out of uploading binary files on App Engine: it automatically handles file upload fields that need to use the blobstore.
This works by:
- Detecting if you’re on an add or edit action (you can add additional actions with upload_actions, or set process_uploads to True)
- Adding the upload_url template variable that points to the blobstore
- Updating the form_action and form_encoding scaffolding variables to use the new blobstore action
- Processing uploads when they come back
- Adding each upload’s key to the form data so that it can be saved to the model
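For actions beyond add and edit, these settings can be configured on the controller. A minimal sketch, assuming (by analogy with the cloud_storage_bucket setting shown later) that upload_actions is read from the controller's Meta; the publish action here is hypothetical:

```python
# Sketch only: assumes upload_actions is read from Meta like other
# Upload component settings; the 'publish' action is hypothetical.
class Pictures(Controller):
    class Meta:
        components = (scaffold.Scaffolding, Upload)
        # Handle uploads on 'publish' in addition to the default add/edit.
        upload_actions = ('add', 'edit', 'publish')
```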
The component does not require that the controller subclass BlobstoreUploadHandler; however, to serve blobs you must either use the built-in Download controller or create a custom controller that subclasses BlobstoreDownloadHandler.
This component is designed to work instantly with scaffolding and forms. Almost no configuration is needed:
```python
from ferris import Model, ndb, Controller, scaffold
from ferris.components.upload import Upload


class Picture(Model):
    file = ndb.BlobKeyProperty()


class Pictures(Controller):
    class Meta:
        components = (scaffold.Scaffolding, Upload)

    add = scaffold.add
    edit = scaffold.edit
    list = scaffold.list
    view = scaffold.view
    delete = scaffold.delete
```
However, there are instances where you need more direct access, and this is possible as well. Upload happens in two phases. First, you generate an upload URL and provide it to the client. The client then uploads files to that URL. When the upload is successful, the special upload handler redirects back to your action with the blob data. Here’s an example of that flow for a JSON/REST API:
```python
import logging

from ferris import Controller, route
from ferris.components.upload import Upload


# The controller is named Uploads so it doesn't shadow the imported
# Upload component.
class Uploads(Controller):
    class Meta:
        components = (Upload,)

    @route
    def url(self):
        # Phase one: hand the client a blobstore upload URL that will
        # redirect to the 'complete' action when the upload finishes.
        return self.components.upload.generate_upload_url(action='complete')

    @route
    def complete(self):
        # Phase two: the blobstore redirects here with the blob data.
        uploads = self.components.upload.get_uploads()
        for blobinfo in uploads:
            logging.info(blobinfo.filename)
        return 200
```
get_uploads() retrieves all uploads sent to this controller.

Returns: A dictionary mapping field names to lists of blobinfo objects. Each blobinfo will have an additional cloud_storage property if it was uploaded to cloud storage, but be aware that this property is not persisted.
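To make that return shape concrete, here is a small plain-Python sketch of consuming such a dictionary; the field names are illustrative, and plain strings stand in for real BlobInfo objects:

```python
# Illustrative shape of get_uploads()'s return value: each form field
# name maps to the list of blobinfo objects uploaded under that field.
# Plain strings stand in for BlobInfo objects here.
uploads = {
    "file": ["kitty.jpg", "doggo.png"],
    "attachment": ["notes.txt"],
}

# Walk every field and every upload sent under it.
for field_name, blobinfos in uploads.items():
    for blobinfo in blobinfos:
        print("%s: %s" % (field_name, blobinfo))
```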
Ferris includes a download controller that is disabled by default for security reasons. To use it, first enable it in app/routes.py:
```python
from ferris.controllers.download import Download

routing.route_controller(Download)
```
You can now generate URLs to download files:
```python
uri("download", blobkey=blobkey)
uri("download-with-filename", blobkey=blobkey, filename="kitty.jpg")
```
Google Cloud Storage is mostly compatible with the existing blobstore API. This means you can upload and serve items in exactly the same way without any change. However, there are some caveats (see below).
To make all uploads for a controller go to cloud storage all you need to do is configure the bucket name:
```python
# Named Uploads so it doesn't shadow the imported Upload component.
class Uploads(Controller):
    class Meta:
        components = (Upload,)
        cloud_storage_bucket = "my-bucket"
```
Locally the App Engine SDK will emulate Cloud Storage, but once deployed you must ensure the App Engine Application has access to the given bucket.
Now all files will be stored with a unique name on cloud storage and a blobkey will be generated that points to that cloud storage item.
You can use the download handler as above to serve blobkeys that point to cloud storage objects. However, serving items this way does not take advantage of the cloud storage CDN or caching and can be very slow for small items such as images. To remedy this you should serve the item directly from cloud storage. In order to generate a serving URL, though, you need the cloud storage object name. Unfortunately, the App Engine blobkey does not provide this information, so you must acquire the object name during the upload step.
If you’re using the easy setup of a Model and Form, all you have to do is add a field to the model like so:
```python
class Picture(Model):
    file = ndb.BlobKeyProperty()
    file_cloud_storage = ndb.StringProperty()
```
The upload component will detect these [field]_cloud_storage properties and ensure that they are populated with the cloud storage object name.
If you’re doing things manually (as with the API example above) you’ll need to get the object name yourself:
```python
@route
def complete(self):
    uploads = self.components.upload.get_uploads()
    for blobinfo in uploads:
        logging.info(blobinfo.filename)
        # The object name is only available via the transient
        # cloud_storage property.
        logging.info(blobinfo.cloud_storage.gs_object_name)
    return 200
```
To generate a serving URL:
```python
serving_url = "https://storage.googleapis.com/%s/%s" % (bucket_name, object_name)
```
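Note that on App Engine the gs_object_name retrieved above is typically prefixed with /gs/&lt;bucket&gt;/. A hedged sketch of deriving the public serving URL from such a name; the sample object name is hypothetical:

```python
# Hypothetical gs_object_name as returned via blobinfo.cloud_storage;
# App Engine prefixes these names with "/gs/<bucket>/".
gs_object_name = "/gs/my-bucket/uploads/abc123-kitty.jpg"

# Strip the "/gs/" prefix to recover "<bucket>/<object>", then split once.
path = gs_object_name[len("/gs/"):]
bucket_name, object_name = path.split("/", 1)

serving_url = "https://storage.googleapis.com/%s/%s" % (bucket_name, object_name)
print(serving_url)  # https://storage.googleapis.com/my-bucket/uploads/abc123-kitty.jpg
```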
These URLs will not work locally as the SDK does not actually upload anything to cloud storage.