
Can anyone offer a resolution to this error?

Our app stopped working for all users a few minutes ago, during a live educational event built around the app.

Unable to add/edit/delete row in table ‘Bills’. _ Error: Failed to read data “southeastPatriotAcademy-562936:1.0000090:Roles’ because: Google drive Rate Limiting Exception while reading docid=1Ks00gm…”

This error appeared for all users for about 5 minutes. Then it went away.

Exactly what is a “Google drive Rate Limiting Exception,” how was it triggered, and how do we resolve it?


I think @Brian_Sabino is also answering this on a private support thread.

I would suggest doing a community search on “rate limiting,” as this is a known issue that others have hit, and their workarounds may be instructive. In particular, I provide an explanation in this thread: “Google drive - image data transfer quota limit. I'm creating app where users…” (plus.google.com).

Copying my answer here:

“You can hit a rate limit simply on concurrent reads or updates if you have a large number of users concurrently using the app. I believe Google’s limit for each user account is 2000 requests per 100 seconds. That can get used up pretty fast if each user is, say, saving a row with five images; that by itself may be 10+ API requests. If you have hundreds of users adding data concurrently, you will have a problem. At that point, you should consider using something like DynamoDB or MySQL or SQL Server instead of a spreadsheet (we support this).”
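To put rough numbers on that, here is a quick back-of-envelope sketch. The per-action request cost is taken from the “row with five images ≈ 10+ requests” example above and is illustrative only, not a measured figure:

```python
# Back-of-envelope: total Drive API requests in one 100-second window.
# The per-action request count comes from the example above (a row save with
# five images ~= 10+ requests); it is illustrative, not a measured value.

QUOTA_PER_100_SECONDS = 2000  # per-account limit mentioned above

def requests_per_window(users, actions_per_user, requests_per_action):
    """Total Drive requests generated by all users in one 100-second window."""
    return users * actions_per_user * requests_per_action

# 200 users each saving one row with five attached images (~10 requests each)
total = requests_per_window(users=200, actions_per_user=1, requests_per_action=10)
print(total, ">= quota?", total >= QUOTA_PER_100_SECONDS)  # 2000 >= quota? True
```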

Thank you @praveen for your attention on this question. And thank you for building AppSheet! As you suggested, I went and read every post available on this topic.

What I still cannot determine, and nobody has explained, is how the 2000/100 limit is calculated.

Is that simply 2000 sync requests every 100 seconds?

If so, how is reaching that limit with only 30 users even possible?

My math:

Even if all 30 students synced simultaneously, each one would have to sync more than 60 times back to back within that window (2000 / 30 ≈ 67). With an average sync time of 8-12 seconds, the most one person can realistically do is 12-18 syncs in a row, and even that takes roughly 140 seconds. 18 * 30 = 540. That means the upper limit for students in my classroom should be somewhere around 110 (18 * 110 = 1,980).

(My max classroom size is 113 students, so this is perfect for us.)

Therefore, assuming I understand it correctly, this maximum limit is mathematically impossible to reach with a classroom of 30 people.
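To make my assumption explicit, here is the arithmetic I am doing. It treats one sync as exactly one request against the 2000/100 limit; if that assumption is wrong, the conclusion above changes completely:

```python
# My working assumption: one sync counts as exactly one request against the
# 2000-requests-per-100-seconds quota. If a sync actually costs several Drive
# reads, every number below shrinks accordingly.

QUOTA = 2000                 # requests per 100 seconds, as I understand it
MAX_SYNCS_PER_STUDENT = 18   # upper bound from an 8-12 second average sync time
CLASS_SIZE = 30

requests_from_my_class = CLASS_SIZE * MAX_SYNCS_PER_STUDENT  # 540, far below 2000
students_at_the_limit = QUOTA // MAX_SYNCS_PER_STUDENT       # ~111 students

print(requests_from_my_class, students_at_the_limit)  # 540 111
```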

This leads me to believe there is another factor in this equation that has to do with the SIZE of the data set each person is downloading. Is that a wrong assumption?

Our data:

There are exactly 637 rows of data in our Google Sheet across 10 tabs, each with a varying number of columns. There are exactly 8,812 cells containing individual pieces of data within the app. Almost all of this data is added prior to deployment; only a small amount is added during class, in a few columns containing enum lists. This is why users sync: so they can see those small changes.

Here’s a table of the totals from my app:

Tabs      #Rows   #Cols   Total
Student     210      18    3780
Bills        64      16    1024
Amends       17       7     119
Comm         27       6     162
Roles       200      16    3200
Rules        80       3     240
Maps          3       3       9
Staff        28       8     224
Cabinet       5       9      45
Rsrcs         3       3       9
Total       637      89    8812

Another fact to note: there are 5 apps in total, representing different events/classes, all using the same data. We are using security filters to limit which app displays which data. These apps are never run simultaneously; only one is used at a time.

Does the data size of each sync influence the 2000/100 rule?

Another fact that may be relevant in our case: there is a large number of photos in the app. Each student has a profile picture, so that's 210 photos, plus there are 27 committees that each have a photo as well. Would syncing those each time heavily influence Google’s limits?
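If each tab and each photo counted as its own Drive read on every sync, which is only a guess on my part and not something I have confirmed, the per-sync cost would look nothing like one request. A rough sketch of that worst case:

```python
# Worst-case guess: every tab and every photo is a separate Drive read on each
# sync. I have NOT confirmed that this is how AppSheet/Drive counts requests;
# it is only meant to show how 30 users might blow through 2000 requests/100 s.

TABS = 10                # tabs in our Google Sheet
STUDENT_PHOTOS = 210     # one profile picture per student
COMMITTEE_PHOTOS = 27    # one photo per committee
USERS = 30

requests_per_sync = TABS + STUDENT_PHOTOS + COMMITTEE_PHOTOS  # 247
one_round_of_syncs = USERS * requests_per_sync                # 7410

print(requests_per_sync, one_round_of_syncs)  # 247 7410
```

Even if caching cuts most of that down, it shows why I am asking whether data size and images factor into the 2000/100 rule.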
