Store Content for offline use

The Store content for offline use option refers to read-only content. If I have tables that are read-only for some users and can be updated by others, how does this affect storing for offline use?

Hi @Austin_Lambeth. The description of that feature is confusing and we’re going to change it.

When “offline use” is disabled, the app will download images and files on-demand. So if you have an image column, the images will be loaded one at a time, as needed. This requires an internet connection.

When “offline use” is enabled, all of the images and files will be downloaded when the app first syncs. That way you’ll be able to see the images and files without a network connection.


In other words: you don’t need to think about the read-only status of your tables.
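The two modes described above boil down to *when* files are downloaded: on demand as each one is viewed, or all at once at first sync. Here is a minimal Python sketch of that difference (an illustration only, not AppSheet's actual implementation; the `FileCache` class and its names are hypothetical):

```python
# Illustrative sketch of the two "offline use" modes: on-demand fetching
# vs. prefetching everything at first sync. Not AppSheet internals.

class FileCache:
    """Caches downloaded files; can prefetch everything or fetch on demand."""

    def __init__(self, remote_files, prefetch=False):
        self.remote = dict(remote_files)  # name -> bytes on the server
        self.local = {}                   # name -> bytes on the device
        self.downloads = 0
        if prefetch:
            # "Offline use" enabled: download everything at first sync.
            for name, data in self.remote.items():
                self.local[name] = data
                self.downloads += 1

    def get(self, name, online=True):
        if name in self.local:            # already on the device
            return self.local[name]
        if not online:                    # on-demand mode fails with no network
            raise ConnectionError(f"{name} is not cached and there is no network")
        # "Offline use" disabled: fetch one file, as needed.
        self.local[name] = self.remote[name]
        self.downloads += 1
        return self.local[name]
```

With `prefetch=True`, every `get()` succeeds even when `online=False`; without it, uncached files need a connection.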

Gotcha, yeah, I definitely agree that needs a rewrite, because I thought it meant it would store the rows and such for offline use.

All data (except that filtered out by security filters) is stored for offline use.

So if I were using a massive table, say 300k never-changing records, would it be better to filter those through slices rather than security filters? Since the rows would be stored locally, the app wouldn't attempt to re-fetch them the way it would with a security filter. Note: these 300k rows are the only data in the app other than the filter.

I can’t say. It depends on the total size of the data set, not just the number of rows, as well as the amount of memory on the device. That said, if performance is an issue, it would be prudent to use security filters to reduce the data set to only what is needed.

If the entire table is, in fact, “never changing”, consider making it explicitly read-only. I believe I recall someone (@praveen?) saying it offers advantages, though I don’t recall exactly what advantages. I don’t know if using an Are updates allowed? expression to make a table read-only for some users but not others provides the same benefits.

This is only a temporary app to allow us to search through these rows. We unfortunately can't cut down the dataset. This is a customer list and we are changing providers. We are guaranteed about 40k rows being dropped due to different data cleanliness standards. It's about 60-70 MB of data in our database. Even if I try to trim the datatypes in the database, I can only get it down 4-5 MB.
(This is a different app than the original question in this post.)

It’ll probably depend on the amount of memory available in the devices and the number and memory footprint of other running apps, then. Test, test, test!

My initial sync times seem to be much worse with the slice filters, but my filter changes seem to be faster with them. I know that with the raw amount of rows and data we have, we won't have a snappy app.

That makes complete sense, as without a security filter, the entire table is being loaded. Only after the table is loaded is the slice applied to it. The slice in no way reduces the memory footprint, and probably increases it some (though by how much I don't know).
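That ordering is the whole story: a security filter trims rows before they ever reach the device, while a slice is computed after the full table is already in memory. A minimal Python sketch of the difference (hypothetical helper functions and row data, not AppSheet internals):

```python
# Illustrative sketch: security filter (server-side, before load) vs.
# slice (client-side, after load). The function names are hypothetical.

def sync_with_security_filter(server_rows, predicate):
    """Only matching rows cross the network and land in device memory."""
    device_table = [r for r in server_rows if predicate(r)]
    return device_table, len(device_table)      # memory cost: filtered rows only

def sync_with_slice(server_rows, predicate):
    """The whole table loads first; the slice is a view computed afterward."""
    device_table = list(server_rows)            # full table in memory
    slice_view = [r for r in device_table if predicate(r)]
    return slice_view, len(device_table)        # memory cost: the full table
```

Both approaches show the user the same rows, but only the security filter shrinks what the device has to hold, which is why it helps when performance is an issue.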