Uploads & Downloads¶
Ferris provides a few utilities for working with the Blobstore API and Cloud Storage to upload and serve binary files.
The Upload component can take the guesswork out of uploading binary files on App Engine.
- class ferris.components.upload.Upload(controller)¶
Automatically handles file upload fields that need to use the blobstore/cloud storage.
With the default configuration, this will upload to the application’s default Google Cloud Storage bucket. This behavior is configurable in app/settings.py.
This works by:
- Detecting if you’re on an add or edit action (you can add additional actions with upload_actions, or set process_uploads to True)
- Adding the upload_url template variable that points to the blobstore / cloud storage.
- Updating the form_action and form_encoding scaffolding variables to use the new upload url.
- Processing uploads when the upload handler redirects back to your action.
- Adding each upload’s key to the form data so that it can be saved to the model.
Does not require that the controller subclass BlobstoreUploadHandler; however, to serve blobs you must either use the built-in Download controller or create a custom controller that subclasses BlobstoreDownloadHandler.
This component is designed to work instantly with scaffolding and forms. Almost no configuration is needed:
from ferris import Model, ndb, Controller, scaffold
from ferris.components.upload import Upload


class Picture(Model):
    file = ndb.BlobKeyProperty()


class Pictures(Controller):
    class Meta:
        components = (scaffold.Scaffolding, Upload)

    add = scaffold.add
    edit = scaffold.edit
    list = scaffold.list
    view = scaffold.view
    delete = scaffold.delete
There are often instances where your project requires more direct access. Ferris allows for this via a two-stage process. First, generate an upload URL and provide it to the client. Second, upload the file to that URL with the client. Once the upload is successful, the special upload handler will redirect back to the action with the uploaded blob data. Here’s an example of that flow for a JSON/RESTful API:
import logging

from ferris import Controller, route
from ferris.components.upload import Upload


class Upload(Controller):
    class Meta:
        components = (Upload,)

    @route
    def url(self):
        return self.components.upload.generate_upload_url(
            uri=self.uri('upload:complete'))

    @route
    def complete(self):
        uploads = self.components.upload.get_uploads()
        for blobinfo in uploads:
            logging.info(blobinfo.filename)
        return 200
- get_uploads()¶
Get all uploads sent to this controller.
Returns: A dictionary mapping field names to a list of blobinfo objects. These blobinfo objects will have an additional cloud_storage property if they were uploaded to Cloud Storage, but be aware that this property is not persisted.
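Because get_uploads() returns a field-to-blobinfos mapping, iterating over the blobinfo objects themselves takes a nested loop. Here is a minimal sketch of that shape; the namedtuple stand-ins and field names are illustrative assumptions, not the real google.appengine.ext.blobstore.BlobInfo class:

```python
from collections import namedtuple

# Stand-in for blobstore.BlobInfo -- illustrative only.
FakeBlobInfo = namedtuple("FakeBlobInfo", ["filename", "size"])

# Shape of get_uploads()'s return value: field name -> list of blobinfos.
uploads = {
    "file": [FakeBlobInfo("kitty.jpg", 2048)],
    "attachments": [FakeBlobInfo("a.pdf", 100), FakeBlobInfo("b.pdf", 200)],
}


def filenames_by_field(uploads):
    # Flatten the mapping into (field, filename) pairs, sorted for stable output.
    return [(field, info.filename)
            for field, infos in sorted(uploads.items())
            for info in infos]


print(filenames_by_field(uploads))
# -> [('attachments', 'a.pdf'), ('attachments', 'b.pdf'), ('file', 'kitty.jpg')]
```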
Ferris includes a download controller that is disabled by default for security reasons. To begin using it, first enable it in app/routes.py:
from ferris.controllers.download import Download

routing.route_controller(Download)
You can now generate URLs to download files:
uri("download", blobkey=blobkey)
uri("download-with-filename", blobkey=blobkey, filename="kitty.jpg")
Google Cloud Storage is mostly compatible with the existing Blobstore API. This means you can upload and serve items the exact same way without any change. However, there are some caveats.
By default, all uploads go to the Default Cloud Storage Bucket for the application so no configuration is necessary to upload to that bucket. However, if you want to use a different bucket, you can configure it using Settings.
Locally the App Engine SDK will emulate Cloud Storage, but once deployed you must ensure the App Engine Application has access to the given bucket.
You can use the download handler as above to serve blobkeys that point to Cloud Storage objects. Note that serving items in this way does not take advantage of the Cloud Storage CDN or caching and may be very slow for small items such as images. Items should be served directly from Cloud Storage. To generate a serving URL, the Cloud Storage object name must be provided. Because the App Engine blobkey does not provide this information, you must acquire this object name during the upload step.
If you’re using the easy setup of a Model and Form, all you have to do is add a field to the model like so:
class Picture(Model):
    file = ndb.BlobKeyProperty()
    file_cloud_storage = ndb.StringProperty()
The upload component will detect these [field]_cloud_storage properties and ensure that these fields are populated with the cloud storage object name.
If you’re doing things manually (as with the API example above) you’ll need to get the object name yourself:
@route
def complete(self):
    uploads = self.components.upload.get_uploads()
    for blobinfo in uploads:
        logging.info(blobinfo.filename)
        logging.info(blobinfo.cloud_storage.gs_object_name)
    return 200
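App Engine typically reports Cloud Storage object names in the '/gs/<bucket>/<object>' form, so you may need to split that path into the bucket and object pieces the serving URL requires. A small helper, assuming that '/gs/' prefix (verify it against what your uploads actually return):

```python
def split_gs_object_name(gs_object_name):
    """Split '/gs/<bucket>/<object>' into (bucket, object_name).

    Assumes the '/gs/' prefix App Engine uses for Cloud Storage paths.
    """
    if not gs_object_name.startswith("/gs/"):
        raise ValueError("unexpected object name: %r" % gs_object_name)
    bucket, _, object_name = gs_object_name[len("/gs/"):].partition("/")
    return bucket, object_name


print(split_gs_object_name("/gs/my-bucket/photos/kitty.jpg"))
# -> ('my-bucket', 'photos/kitty.jpg')
```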
To generate a serving URL:
serving_url = "https://storage.cloud.google.com/%s/%s" % (bucket_name, object_name)
These URLs will not work locally. The SDK does not upload anything to Cloud Storage when the application is being run locally.
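Putting the pieces together, the format string above can be wrapped in a tiny helper. This mirrors the URL pattern shown earlier; remember that these URLs will not resolve against the local SDK:

```python
def gcs_serving_url(bucket_name, object_name):
    # Direct Cloud Storage URL; does not work on the local dev server.
    return "https://storage.cloud.google.com/%s/%s" % (bucket_name, object_name)


print(gcs_serving_url("my-bucket", "photos/kitty.jpg"))
# -> https://storage.cloud.google.com/my-bucket/photos/kitty.jpg
```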