I'm looking to build a functionality that will create (for example) 700 rows upon execution across 7 tables (Grandparent, parents, children). These are my options:
Any other ideas, and in particular any experience out there as to how to create such a large number of rows as quickly as possible with good UX?
@MultiTech @Aleksi @Bellave_Jayaram
The best option is to use the AppSheet API, as the processing happens on the server side. Adding 700 rows probably takes 10+ secs (and it will take about the same even with 7000 rows) because they are created as a bulk insert. BUT.. if you have even one virtual column in your table, that will ruin this, as each one needs to be calculated row by row and throws a timeout error. What I have used in a case like this.. create a dummy app for the same data source, as you can trigger the API in another app with a Bot. In this AppB you don't need to have those virtual columns, as it is just for the filling purpose.
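For anyone reading along, here's a minimal sketch of the kind of server-side bulk add Aleksi describes, using the AppSheet REST API. The app ID, table name, access key, and column names below are placeholders, not values from this thread; the key point is that all rows travel in one request body, so the server can insert them as a batch.

```python
import json
import urllib.request


def build_add_payload(rows):
    """Build the body for a single AppSheet API bulk-add request.

    `rows` is a list of dicts; the keys must match the table's
    column names.  All rows go in ONE payload so they can be
    inserted as a bulk, not row by row.
    """
    return {
        "Action": "Add",
        "Properties": {"Locale": "en-US"},
        "Rows": rows,
    }


def add_rows(app_id, access_key, table, rows):
    """POST all rows in a single request to the AppSheet API."""
    url = f"https://api.appsheet.com/api/v2/apps/{app_id}/tables/{table}/Action"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_add_payload(rows)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "ApplicationAccessKey": access_key,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

As Aleksi notes, this only stays fast if the target table has no virtual columns to recalculate per row.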
AMAZING. Thanks Aleksi! This is exactly the solution I needed. Very interesting to learn that VCs would prevent bulk inserts via the API - makes complete sense. I feel confident to move forward with the right experiment here now 😊 I will implement and report back..
You're welcome!
I'd be curious to know if there's a difference in processing time between bots and the API.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------
FWIW: How I Go About This
The idea being: the user pushes the button and the app begins a forced sync while everything is happening in the background
@MultiTech wrote:Would a bot triggering a record-creation action process just as fast? (Possibly faster?)
Like that 'scheduled bot' that syncs X number of records in seconds?
Hi Matt,
In my use-case I have exactly what you're suggesting - basically I have about 2000 'template' records, some of which will be copied into various tables as described in my first post upon a button click. Based on various logical rules, I'm expecting a max of circa 700 of those template records to be copied/inserted.
I have various other apps that do something similar, including a force-sync exactly as you've noted. In my experience the force-sync doesn't speed up the saving of rows to the database, it simply forces the user to wait while the changes first save and the app then syncs, before proceeding with further tasks.
I think the key point here is in Aleksi's comment in that the rows
will be created as a bulk. BUT.. if you have even one virtual column in your table, that will ruin this as it needs to be calculated row by row
@Aleksi your solution is to create a dummy app containing the tables the data will be inserted into, without any VCs in them, and then hit the API for that app to create the records there... the point being to enable bulk-insert instead of row-by-row adding.
What if I add the same tables a second time within the same app (giving them slightly different names), configure the table defs to be simply the data types only, with no other refs or valid-ifs etc., have NO VCs in those repeated tables, and use a BOT to insert the rows there? Will the BOT/back-end function the same way as the API and do a bulk insert if there are no VCs? I could ALSO include a security filter on those repeated tables to remove all rows from the device, so that doubling up on my tables doesn't slow my app's sync time down.
@MultiTech @Ed As per Matt's request, I played with actions and webhooks a little, and it seems that when using very simple row adds with just the key value, the combination of Bot/Action (Add a new row to another table using values from this row) was a little bit quicker with 4000 test rows. Results were 7.5 vs 9 secs. When using more complex functionality, the Webhook started to be quicker.. and of course you can do more complex work with a webhook more easily than with actions. So.. it depends 🙂
@Aleksi-AppSheet @MultiTech Circling back on this some months later! (As I eventually had a need to test this out).
Confirming this option also works quite well:
What if I add the same tables a second time within the same app (giving them slightly different names), configure the table defs to be simply the data types only, with no other refs or valid-ifs etc., have NO VCs in those repeated tables, and use a BOT to insert the rows there? Will the BOT/back-end function the same way as the API and do a bulk insert if there are no VCs? I could ALSO include a security filter on those repeated tables to remove all rows from the device, so that doubling up on my tables doesn't slow my app's sync time down.
However, here's an extra credit question! 😀
Is it possible to achieve a rapid Bulk-UPDATE??? Example: I want to update a Date column for say 1000 rows, by adding 3 days to the date in each row?
Hope you're both having fun still!
Cheers,
Ed
Hi Ed, for others.. in general, you are not able to add the same table twice to your app. If you're using a real DB, you could create a view and then have that in your app. I assume that's what you have done.
To your extra question.. yes, you can do that in the same way, either with the API or an action/Bot.
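A bulk update through the API looks almost identical to a bulk add: a single request whose action is "Edit", where each row carries its key value plus the columns to change. Here's a hedged sketch for Ed's "add 3 days to a date column" example; the key and column names are assumptions, not from his app.

```python
from datetime import date, timedelta


def build_edit_payload(keys_and_dates, shift_days=3,
                       key_col="ID", date_col="Due Date"):
    """Build ONE AppSheet API 'Edit' body that updates many rows at once.

    `keys_and_dates` is a list of (key, date) tuples read from the
    table beforehand.  Each output row contains the key column (so the
    server knows which record to update) and the shifted date.
    """
    rows = []
    for key, d in keys_and_dates:
        rows.append({
            key_col: key,
            date_col: (d + timedelta(days=shift_days)).isoformat(),
        })
    return {"Action": "Edit", "Properties": {"Locale": "en-US"}, "Rows": rows}
```

The same caveat applies as for bulk adds: virtual columns on the target table will force row-by-row recalculation.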
Hi Aleksi, yep I’m using SQL views to receive the server-side bulk inserts – it’s very fast. (Using a BOT/action)
Good to know bulk update will work the same way.
For what it’s worth (and for others) I’ll test it and revert back again – as actually my use case is quite complex in that the update amount for each row (date ‘shift’) is calculated from querying two other tables for each row to be updated. It’s not a simple case of adding 3 days to each date, or a simple case of updating based on templated data held elsewhere – my calc determines how far to shift each date based on assigned working/non-working day calendars.
It is for a Gantt chart where updating a task end date moves both the start and finish dates of all 'descendant' tasks instantly. This works fine in the app, but can sometimes result in 200-300 row changes queued up on the device. The Chrome browser's memory also gets exhausted by running so many calcs. I'm hoping the solution is in MySQL, where I can increase its memory. Selecting which rows to update (the 'descendants') is a simple select in the app.
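For what it's worth, the core of the calendar-aware shift Ed describes can be sketched roughly like this. This is a hedged sketch, not his actual implementation: it models a working-day calendar simply as a set of non-working dates and walks forward day by day.

```python
from datetime import date, timedelta


def shift_working_days(start, n, non_working):
    """Move `start` forward by `n` working days.

    `non_working` is a set of date objects (weekends, holidays, etc.
    from the assigned calendar); those dates are skipped and do not
    count toward the shift.
    """
    d = start
    moved = 0
    while moved < n:
        d += timedelta(days=1)
        if d not in non_working:
            moved += 1
    return d
```

Running this once per descendant row is the per-row "date shift" calc; pushing the resulting dates back in one bulk Edit request keeps the heavy lifting off the device.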
Cheers,
Ed
Thanks for the confirmation!
I've got a video you might check out that shows how you can use actions/automation to update a group of records.
This is a direct example of what you're wanting to do @Ed_Cottrell, and how to accomplish it inside the AppSheet editor (without any API stuff).
------------------------------------------------------------------------------------------------------------
Hope it helps!
@MultiTech , thank you for your insightful video. I've successfully applied the approach you shared in several apps. However, I've encountered a challenge where it ties up the sync function, and as some community members have pointed out, it can lead to timeouts during larger record updates.
In my recent case, I am creating a finance schedule for a mortgage or loan. The term of the loan, measured in months, determines the number of records to be generated for the schedule. While implementing this technique works well for shorter durations like 2 years (24 records) or 5 years (60 records), it becomes noticeably slow for longer terms such as 25 years (300 records).
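Generating the schedule rows themselves is cheap; the cost is in how they get inserted. A hedged sketch of building one row per month of the term, ready to send in a single bulk-add API call (the column names and start date are placeholders, not from this app):

```python
from datetime import date


def schedule_rows(loan_id, term_months, start=date(2024, 1, 1)):
    """Generate one schedule row per month of the loan term.

    Returns a list of dicts suitable for the Rows array of a single
    AppSheet API 'Add' request, so a 300-month term is still one call.
    """
    rows = []
    for m in range(term_months):
        # Advance `start` by m whole months (day-of-month kept as-is).
        due = date(start.year + (start.month - 1 + m) // 12,
                   (start.month - 1 + m) % 12 + 1,
                   start.day)
        rows.append({
            "Loan ID": loan_id,
            "Installment": m + 1,
            "Due Date": due.isoformat(),
        })
    return rows
```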
I've explored an alternative approach using a webhook as proposed by @Aleksi, and it seems more promising in scenarios where the record limit is determined by a select statement against another table. This table can act as a template, containing the correct number of records already. However, I acknowledge that this isn't an ideal solution where the term of the loan can vary from one loan to another.
Therefore, I would like to pose a question to the community: Is there a way to pass the desired number of records to create directly to the webhook, thus optimising the process for longer-term scenarios? Your insights and suggestions are greatly appreciated.
If you know the number of records to create, you can of course limit the qty with the TOP() expression.
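For example (the table and column names here are assumptions, not from this thread), the webhook's row selection could take only as many template rows as the loan term requires:

```
TOP(
  SELECT(Template[ID], [Type] = [_THISROW].[Type]),
  [_THISROW].[Term Months]
)
```

Since the term lives on the parent loan record, `[_THISROW].[Term Months]` effectively passes the desired record count into the webhook.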
Hey man,
this is clearly some bollocks that Aleksi (and others) stated.
The fastest way possible is always a bulk insert from a CSV file when dealing with a relational database.
I love when they stated that inserting 700 recs takes ONLY 10+ secs 😅
A bulk insert to a MySQL database from a LOCAL (or over-the-network) CSV file inserts 3.2 million recs in that same time.
Oh lord, have mercy 😆
Well.. we are discussing here what happens when using an app for this process. Totally two different things!
Hey,
discussing, and leading people on the wrong track, are two different things yes lol 😆
If you want to discuss what is the quickest way to import data in general, please open a new thread for that purpose!
@Denny774 wrote:
Hey man,
this is clearly some bollocks that Aleksi (and others) stated.
The fastest way possible is always a bulk insert from a CSV file when dealing with a relational database.
I love when they stated that inserting 700 recs takes ONLY 10+ secs
A bulk insert to a MySQL database from a LOCAL (or over-the-network) CSV file inserts 3.2 million recs in that same time.
Oh lord, have mercy
There's some quality advice in here, but your delivery needs work. 😉
You might find the following video helpful:
Absolutely right, @AleksiAlkio We're focusing here on the automatic generation of X number of records, usually derived from a parent record to create child records once a parent record is amended or created. Just to clarify, our discussion doesn't involve importing data from a CSV file or any other file type.
And to @Denny774 , while the efficient import topic is valuable, it might be better suited for a separate thread. Let's keep our current conversation centered on the automatic generation process. Thanks!