In this test app, we have a temp table from which we copy all the rows into another table, then delete all the rows after the copy operation, using "set of rows" actions for both steps.
No error is actually thrown when the delete step fails after the copy.
This is not directly on topic, but we are also failing when we try to copy and delete a large batch of rows, such as 50K rows, due to timeout-related errors at the end. I'm wondering what the capabilities of a bot (or the AppSheet API?) are for actions against a large number of rows.
Is there any technical documentation on the limits in this case?
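For what it's worth, one way to sidestep in-app bot limits for bulk operations is to drive the copy/delete from outside the app via the AppSheet REST API, which accepts multiple rows per request. Below is a minimal sketch, not a definitive implementation: the app ID, access key, and table name are placeholders, and while the endpoint and payload shape follow the published AppSheet API docs, you should verify them against your own app's API settings.

```python
import json
import urllib.request

API_URL = "https://api.appsheet.com/api/v2/apps/{app_id}/tables/{table}/Action"

def build_action_payload(action, rows):
    """Build the JSON body for a bulk AppSheet API action ("Add", "Delete", ...)."""
    return {
        "Action": action,                    # e.g. "Add" or "Delete"
        "Properties": {"Locale": "en-US"},   # adjust to your app's locale
        "Rows": rows,                        # list of row dicts keyed by column name
    }

def invoke_action(app_id, access_key, table, action, rows):
    """POST a bulk action for the given rows (app_id/access_key are placeholders)."""
    req = urllib.request.Request(
        API_URL.format(app_id=app_id, table=table),
        data=json.dumps(build_action_payload(action, rows)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "ApplicationAccessKey": access_key,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

A single request carrying many rows avoids repeated per-row bot invocations, though very large row sets may still need to be split across several requests.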
The root cause of the behavior you reported was that the reference action that called delete on a sequence of rows was trying to fetch their after-image after they had already been deleted. The bug is in the action implementation and reproduces with both the old workflows and the new bots.
I put a fix in and will let you know when it is deployed.
Thank you for investigating; we will wait for the fix to be released.
Please refer to the other post I made on the same day here.
First, we tried to import a CSV file into the target table, but it eventually failed, presumably just because of the number of rows and columns we attempted to import.
We then thought of an alternative approach, which was this bot.
We created one table with exactly the same schema as the target table. We then manually copied the rows from the CSV and pasted them directly into this table's spreadsheet.
Next, in the AppSheet app, we run a bot that notifies the user that the bulk record import process has started.
The next step adds the rows from this staging table to the target table using an action in the "set of rows" category. Once this step finishes, we truncate the source table, and when that in turn completes, we let the user know the whole process is finished.
If the "import CSV file" action ran fine, we wouldn't need to build this type of bot, but as you can see in the other post, the import CSV action keeps failing. When the CSV file is relatively large, we sometimes manage to import the rows, but the UX is not user friendly: after picking the CSV file, the app becomes unresponsive, with no indication that the CSV import has finished. (No dialog appears to say whether the bulk import succeeded or failed.)
Again, we gave up on the import CSV action for this use case, but once again the bot that does the same job also failed, presumably due to a bot timeout.
I looked into the Automation Limits documentation, but nothing was mentioned there, so it was news to us that a bot process has a maximum timeout of 2 minutes.
I would appreciate your guidance on how to make this use case work. I can share my sample sandbox app with you so you can look at how we might improve the bot construction, so please let me know.
I am also waiting for @phil to give me guidance on the CSV import file actions as well.
Thank you for attending to this case. Later today, we will re-test our sandbox app and report the results back to you.
In the meantime, we understand from this experience that there is a 2-minute timeout limit on bots. The question, then, is how that 2 minutes is calculated.
Is it the total time for the whole bot run, or is it calculated on a per-step basis?
The expensive steps within the bot would be either the "Change Data" or "Run Task" steps.
If the timeout restriction is imposed on a per-step basis, I thought we might split the data change action into separate chunks and chain them as consecutive steps under a single bot, just like building our own batching to carry out the action against a large data set.
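The batching idea above can be sketched in plain code: split the full row set into fixed-size chunks so that each chunk can be processed by its own step (or its own API call), keeping every unit of work well under the timeout. This is only an illustration; the chunk size of 1,000 is arbitrary, not a documented AppSheet limit.

```python
def chunk_rows(rows, chunk_size):
    """Split a list of rows into consecutive chunks of at most chunk_size rows."""
    return [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]

# For example, 50,000 rows in chunks of 1,000 yields 50 batches, each of
# which could be handled by a separate bot step or a separate API request.
batches = chunk_rows(list(range(50_000)), 1_000)
```

Whether chained steps actually reset the timeout is exactly the open question here, so this pattern is only useful if the 2-minute limit turns out to be per step rather than per bot run.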