Fastest Way to Create Large Number of Rows

I'm looking to build functionality that will create (for example) 700 rows upon execution, across 7 tables (Grandparent, parents, children). These are my options:

  1. A single grouped action that creates all 700 rows on the device, which will then take a very, very long time to save to the server (very bad UX)
  2. A single action that triggers a BOT to create the 700 rows on server, and sync with asynch mode OFF, such that the sync will not complete until the 700 rows are all created and ready to be sent to the device (long sync time, depending on how long the BOT takes to run)
  3. BOT with asynch mode ON, such that the BOT runs in the background (result not visible until some time later and user manually syncs)
  4. BOT to send webhook to Appsheet API (I suspect same time taken as BOT but not sure)
  5. BOT to send a webhook to our third-party webapp to insert the rows, returning when complete to allow the sync to finish. With this option we have some control over speed via multi-threading, virtual machine configuration, etc. (see the sketch below)
  6. Create some triggers on our MySQL DB to insert the rows (since all values in the new rows can be calculated from other tables)

Any other ideas? And in particular, is there experience out there as to how to create such a large number of rows as quickly as possible with good UX?
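For option 5, here's a minimal sketch of what that receiving webapp could look like, assuming Python/Flask and MySQL; the endpoint, table, columns, and payload shape are all made up for illustration:

```python
# Hypothetical receiver for option 5: the AppSheet bot posts a webhook
# here, we bulk-insert the rows, and we only respond once the commit is
# done, so the app's sync waits for the data to exist.
from flask import Flask, request, jsonify
import mysql.connector  # pip install mysql-connector-python

app = Flask(__name__)

@app.route("/create-rows", methods=["POST"])
def create_rows():
    payload = request.get_json()  # e.g. {"rows": [["id1", "parent1", "Name 1"], ...]}
    conn = mysql.connector.connect(
        host="db-host", user="app", password="...", database="appdb"
    )
    try:
        cur = conn.cursor()
        # executemany sends the whole batch instead of 700 single INSERTs
        cur.executemany(
            "INSERT INTO child (id, parent_id, name) VALUES (%s, %s, %s)",
            payload["rows"],
        )
        conn.commit()
    finally:
        conn.close()
    # Returning only after the commit means the sync waits for the data
    return jsonify({"inserted": len(payload["rows"])})
```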

@MultiTech @Aleksi @Bellave_Jayaram 


The best option is to use the AppSheet API, as it happens on the server side. Adding 700 rows probably takes 10+ secs (it will take about the same even with 7,000 rows) because they are created as a bulk insert. BUT.. if you have even one virtual column in your table, that will ruin this, as it needs to be calculated row by row and throws a timeout error. What I have used in a case like this.. create a dummy app on the same data source, as you can trigger the API in another app with the Bot. In this AppB you don't need those virtual columns, as it is just for the filling purpose.
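For reference, a sketch of what that API call can look like (based on the documented AppSheet REST API; the app ID, access key, table name, and columns below are placeholders):

```python
import requests

APP_ID = "your-app-id"          # placeholder
ACCESS_KEY = "your-access-key"  # placeholder, from the app's integration settings

# A single "Add" request can carry all 700 rows, so the server can
# write them as one bulk operation (assuming no virtual columns).
rows = [{"Key": f"row-{i}", "Parent": "parent-1", "Name": f"Item {i}"}
        for i in range(700)]

resp = requests.post(
    f"https://api.appsheet.com/api/v2/apps/{APP_ID}/tables/Child/Action",
    headers={"ApplicationAccessKey": ACCESS_KEY},
    json={"Action": "Add", "Properties": {"Locale": "en-US"}, "Rows": rows},
    timeout=120,
)
resp.raise_for_status()
```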

AMAZING. Thanks Aleksi! This is exactly the solution I needed. Very interesting to learn that VCs would prevent bulk inserts via the API - makes complete sense. I feel confident to move forward with the right experiment here now 😊 I will implement and report back..

You're welcome!

I'd be curious to know if there's a difference in processing time between bots and the API

  • Especially since you're using a bot to trigger the API!  (^_^)
  • Would a bot triggering a record-creation action process just as fast?  (Possibly faster?)

------------------------------------------------------------------------------------------------------------------------------------------------------------------------

FWIW:  How I Go About This

  • Given the following scenario:
      - I have a user inside an app and they need to "push a button" or something, which then kicks off the creation of a butt-ton of records, and they need to keep moving forward WITH the records created

  • I would implement the following:
    1. A composite action for the user to press; inside of which would be
      1. A manual-automation trigger action (to trigger my automation)
      2. A force-sync action to push things faster

    2. An automation, running a record-creation action, to create my records
      1. This would require some setup to get working appropriately in the app
      2. You'd need a blueprint table or something to tell the system what to create
      3. You'd need an action to take the records that need to be "copied" or "propagated forward" and actually make the copies

    3. You might need another action to "expire" out all the older records
    4. Anything else you want your automation to do

The idea being:  the user pushes the button and the app begins a forced sync while everything is happening in the background

  • This does mean the user has to wait for the sync before moving forward
      - But if built appropriately, this should only take 10-ish seconds (as Aleksi mentioned how fast the servers can create records)


@MultiTech wrote:

Would a bot triggering a record-creation action process just as fast?  (Possibly faster?)


Like a 'scheduled bot' that syncs X number of records in seconds?

Hi Matt,

In my use-case I have exactly what you're suggesting - basically I have about 2000 'template' records, some of which will be copied into various tables as described in my first post upon a button click.  Based on various logical rules, I'm expecting a max of circa 700 of those template records to be copied/inserted.

I have various other apps that do something similar, including a force-sync exactly as you've noted.  In my experience the force-sync doesn't speed up the saving of rows to the database, it simply forces the user to wait while the changes first save and the app then syncs, before proceeding with further tasks.

I think the key point here is Aleksi's comment that the rows

will be created as a bulk. BUT.. if you have even one virtual column in your table, that will ruin this as it needs to be calculated row by row

@Aleksi your solution is to create a dummy app with the tables the data gets inserted into, without any VCs in them, and then hit the API for that app to create the records there... the point being to enable bulk insert instead of row-by-row adds.

What if I add the same tables a second time within the same app (giving them slightly different names), configure the table defs with just the data types and no other refs or valid-ifs etc., have NO VCs in those repeated tables, and use a BOT to insert the rows there? Will the BOT/back-end function the same way as the API and do a bulk insert if there are no VCs? I could ALSO include a security filter on those repeated tables to remove all rows from the device, so that doubling up on my tables doesn't slow my app sync time down.

@MultiTech @Ed As per Matt's request, I played with actions and webhooks a little, and it seems that when using very simple row adds with just the key value, the combination of Bot/Action ("Add a new row to another table using values from this row") was a little bit quicker with 4000 test rows. Results were 7.5 vs 9 secs. When using more complex functionality, the webhook started to be quicker.. and of course you can do more complex work with a webhook more easily than with actions. So.. it depends 🙂

@Aleksi-AppSheet @MultiTech Circling back on this some months later!  (As I eventually had a need to test this out).

Confirming this option also works quite well:
What if I add the same tables a second time within the same app (giving them slightly different names), configure the table defs with just the data types and no other refs or valid-ifs etc., have NO VCs in those repeated tables, and use a BOT to insert the rows there? Will the BOT/back-end function the same way as the API and do a bulk insert if there are no VCs? I could ALSO include a security filter on those repeated tables to remove all rows from the device, so that doubling up on my tables doesn't slow my app sync time down.

However, here's an extra credit question! 😀
Is it possible to achieve a rapid Bulk-UPDATE???  Example: I want to update a Date column for say 1000 rows, by adding 3 days to the date in each row?

Hope you're both having fun still!

Cheers,

Ed

@Bellave_Jayaram @mateo 

Hi Ed, for others.. in general, you are not able to add the same table twice to your app. If using a real DB, you could create a view and then have that in your app. I assume you have done that.

To your extra question.. yes, you can do that in the same way.. either with the API or an action/Bot.
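For anyone reading along, a minimal sketch of the server-side version of both ideas, assuming MySQL and made-up table/column names (the view gives the app a second "copy" of the table; the UPDATE is Ed's 3-day shift done as one bulk statement):

```python
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="db-host", user="app", password="...", database="appdb"
)
cur = conn.cursor()

# A view over the base table, so the same data can be added to the app
# a second time (AppSheet won't take the identical table twice).
cur.execute("""
    CREATE OR REPLACE VIEW task_loader AS
    SELECT id, project_id, name, due_date FROM task
""")

# Bulk update: shift every matching date by 3 days in one statement,
# instead of queuing ~1000 row-by-row edits on the device.
cur.execute(
    "UPDATE task SET due_date = DATE_ADD(due_date, INTERVAL 3 DAY) "
    "WHERE project_id = %s",
    ("project-1",),
)
conn.commit()
conn.close()
```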

Hi Aleksi, yep I'm using SQL views to receive the server-side bulk inserts – it's very fast. (Using a BOT/action)

Good to know bulk update will work the same way.

For what it's worth (and for others) I'll test it and report back again – my use case is actually quite complex, in that the update amount for each row (the date 'shift') is calculated by querying two other tables for each row to be updated. It's not a simple case of adding 3 days to each date, or of updating based on templated data held elsewhere – my calc determines how far to shift each date based on assigned working/non-working day calendars.

It is for a Gantt chart where updating a task end date moves both the start and finish dates of all 'descendant' tasks instantly. This works fine in the app, but can sometimes result in 200-300 row changes queued up on the device. Also, the Chrome browser's memory gets exhausted by running so many calcs. I'm hoping the solution is in MySQL, where I can increase its memory. Selecting which rows to update (the 'descendants') is a simple select in the app.
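For the curious, the working-day shift described above could look roughly like this in isolation (a sketch only; the real calc queries the assigned calendars from two other tables):

```python
from datetime import date, timedelta

def shift_working_days(start: date, shift: int, non_working: set) -> date:
    """Move start forward by `shift` working days, skipping any date
    listed in the task's assigned non-working calendar."""
    current, remaining = start, shift
    while remaining > 0:
        current += timedelta(days=1)
        if current not in non_working:
            remaining -= 1
    return current

# Example: shifting Fri 2024-06-07 by 2 working days skips the weekend
weekend = {date(2024, 6, 8), date(2024, 6, 9)}
print(shift_working_days(date(2024, 6, 7), 2, weekend))  # -> 2024-06-11
```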

Cheers,

Ed

Thanks for the confirmation!

I've got a video you might check out that shows how you can use actions/automation to update a group of records.

  • The scenario: If you've implemented a blueprint table (where you use one table as a "template" and another as the place where you're copying data to), AND you've been using that system for a while...
    • What happens when you make a major change to the blueprint table?
    • How can I get all the derivative records (the copies) to update with the newest values?

This is a direct example of what you're wanting to do @Ed_Cottrell, and how to accomplish it inside the AppSheet editor (without any API stuff).

------------------------------------------------------------------------------------------------------------

Hope it helps!

@MultiTech , thank you for your insightful video. I've successfully applied the approach you shared in several apps. However, I've encountered a challenge where it ties up the sync function, and as some community members have pointed out, it can lead to timeouts during larger record updates. 

In my recent case, I am creating a finance schedule for a mortgage or loan. The term of the loan, measured in months, determines the number of records to be generated for the schedule. While implementing this technique works well for shorter durations like 2 years (24 records) or 5 years (60 records), it becomes noticeably slow for longer terms such as 25 years (300 records).

I've explored an alternative approach using a webhook as proposed by @Aleksi, and it seems more promising in scenarios where the record limit is determined by a select statement against another table. This table can act as a template, containing the correct number of records already. However, I acknowledge that this isn't an ideal solution when the term of the loan can vary from one loan to another.

Therefore, I would like to pose a question to the community: Is there a way to pass the desired number of records to create directly to the webhook, thus optimising the process for longer-term scenarios? Your insights and suggestions are greatly appreciated.

If you know the number of records to create, you can of course limit the qty with the TOP() expression.
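And to the question of passing the desired count straight to the process: one hypothetical way is to have the bot's webhook carry the term, and let the receiving service generate exactly that many rows (the endpoint, table, and payload shape below are invented for illustration):

```python
from datetime import date
from dateutil.relativedelta import relativedelta  # pip install python-dateutil
from flask import Flask, request, jsonify
import mysql.connector

app = Flask(__name__)

@app.route("/build-schedule", methods=["POST"])
def build_schedule():
    # e.g. {"loan_id": "L-1", "start": "2024-01-01", "term_months": 300}
    body = request.get_json()
    start = date.fromisoformat(body["start"])
    # One schedule row per month of the loan's term
    rows = [
        (f"{body['loan_id']}-{m}", body["loan_id"], start + relativedelta(months=m))
        for m in range(1, body["term_months"] + 1)
    ]
    conn = mysql.connector.connect(host="db-host", user="app",
                                   password="...", database="appdb")
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO schedule (id, loan_id, due_date) VALUES (%s, %s, %s)",
        rows,
    )
    conn.commit()
    conn.close()
    return jsonify({"created": len(rows)})
```

This way a 25-year loan creates exactly 300 rows without needing a template table of the right size.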

Hey man,

this is clearly some bollocks that Aleksi (and others) stated. 
The fastest way possible is always a bulk insert from a CSV file when dealing with a relational database.
I love how they stated that inserting 700 recs takes ONLY 10+ secs 😅
A bulk insert into a MySQL database from a local (or over-the-network) CSV file inserts 3.2 million recs in that same time.
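(For anyone curious, that kind of bulk load looks roughly like this; the file path, table, and columns are placeholders:)

```python
import mysql.connector

# LOAD DATA LOCAL INFILE requires local_infile to be enabled on the
# server and allow_local_infile=True on the client.
conn = mysql.connector.connect(
    host="db-host", user="app", password="...", database="appdb",
    allow_local_infile=True,
)
cur = conn.cursor()
cur.execute("""
    LOAD DATA LOCAL INFILE '/tmp/rows.csv'
    INTO TABLE child
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (id, parent_id, name)
""")
conn.commit()
conn.close()
```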

Oh lord, have mercy 😆

Well.. we are discussing here using the app for this process. Totally two different things!

Hey,

discussing, and leading people on the wrong track, are two different things yes lol 😆

If you want to discuss what is the quickest way to import data in general, please open a new thread for that purpose!


@Denny774 wrote:

Hey man,

this is clearly some bollocks that Aleksi (and others) stated. 
The fastest way possible is always a bulk insert from a CSV file when dealing with a relational database.
I love how they stated that inserting 700 recs takes ONLY 10+ secs 😅
A bulk insert into a MySQL database from a local (or over-the-network) CSV file inserts 3.2 million recs in that same time.

Oh lord, have mercy 😆


There's some quality advice in here, but your delivery needs work. 😉

You might find the following video helpful:

Absolutely right, @AleksiAlkio. We're focusing here on the automatic generation of X number of records, usually derived from a parent record to create child records once a parent record is amended or created. Just to clarify, our discussion doesn't involve importing data from a CSV file or any other file type.

And to @Denny774, while the efficient import topic is valuable, it might be better suited for a separate thread. Let's keep our current conversation centered on the automatic generation process. Thanks!
