The App Engine docs mention a 1 MB limit on both entity size and batch get requests (db.get()): http://code.google.com/appengine/docs/python/datastore/overview.html
Is there also a limit on the total size of all entities returned by a query for a single fetch() call?
Example query:
db.Model.all().fetch(1000)
Update: As of 1.4.0, batch get limits have been removed!
There's no longer a limit on the number of entities that can be returned by a query, but the 1 MB entity size limit still applies when you actually retrieve / iterate over the entities. It applies to a single entity at a time, though; it is not a limit on the total size of all entities returned by the query.
Bottom line: as long as you don't have a single entity larger than 1 MB, you should be fine with queries.
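If you'd also rather not hold an entire result set in memory at once, one option is to iterate the query instead of calling fetch(); the db query object retrieves results in smaller batches as you go. A minimal sketch (MyModel and handle() are illustrative names, not from the original post):
from google.appengine.ext import db

class MyModel(db.Model):
    data = db.TextProperty()

# Iterating the query retrieves entities from the datastore in batches,
# so only one batch is held in memory at a time.
for entity in MyModel.all():
    handle(entity)  # handle() is a placeholder for your own processing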
I tried it out in production and you can indeed exceed 1 MB total for a query. I stopped testing at around 20 MB total response size.
from app import models

# generate a ~1 MB string
a = 'a'
while len(a) < 1000000:
    a += 'a'

# text is a db.TextProperty()
c = models.Comment(text=a)
c.put()

for c in models.Comment.all().fetch(100):
    print c
Output:
<app.models.Comment object at 0xa98f8a68a482e9f8>
<app.models.Comment object at 0xa98f8a68a482e9b8>
<app.models.Comment object at 0xa98f8a68a482ea78>
<app.models.Comment object at 0xa98f8a68a482ea38>
....
Yes, there is a size limit; the quotas and limits section explicitly states there is a 1 megabyte limit on db API calls.
You will not be able to db.get(list_of_keys) if the total size of the entities in the batch exceeds 1 megabyte. Likewise, you will not be able to put a batch if the total size of the entities in the batch exceeds 1 megabyte.
The 1,000 entity limit has been removed, but (at present) you will need to ensure the total size of your batches stays under 1 megabyte yourself.
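If you do need to get or put more data than fits in a single call, a common workaround is to split the work into smaller batches yourself. A rough sketch, using a fixed number of items per batch as a stand-in for actually measuring entity sizes (chunks, list_of_keys, and entities_to_put are illustrative names):
from google.appengine.ext import db

def chunks(seq, size):
    # yield successive slices of at most `size` items
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# Keep each batch small enough that its combined entity size stays under
# 1 MB; 50 per batch is an arbitrary guess, not a measured value.
for key_batch in chunks(list_of_keys, 50):
    entities = db.get(key_batch)
    # ... work with this batch of entities ...

for entity_batch in chunks(entities_to_put, 50):
    db.put(entity_batch)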