Poor Performance/Process - What's a better way to accomplish this?

To give you some idea of the business use: I have one table with Locations and another table with Jobs. Jobs are performed at Locations and repeat on a somewhat unpredictable schedule. Crews are assigned a batch of jobs, and when they are nearing completion with the assigned jobs, I enter criteria in the search form for the next locations and then trigger the two actions and workflow described below, which assign the next batch of jobs.

I have a dashboard containing two views:

  • SEARCH FORM - A detail view of the search form with a few criteria dropdown fields.
  • SLICED DATA - A table view of a slice of the Locations table. The slice is filtered based on the criteria entered into the search form.

The search form also has a button which sets a DateTime field to Now(). When the DateTime field is updated, this change increments a ChangeCounter field whose Accumulator is set to Reset.
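For reference, the button itself is trivial; it's a set-values action along these lines (the column name is just an example, not my actual schema):

    Action type: Data: set the values of some columns in this row
    [SearchTimestamp] = NOW()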

Each update to the search form table triggers a workflow which has a condition that checks the ChangeCounter value. If the condition is true, the workflow will execute an action.
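The workflow's condition is just a comparison against that counter, something like this (the exact comparison depends on how the Reset accumulator leaves the counter, so treat it as a sketch):

    [ChangeCounter] > 0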

The action executed is set to Data: execute an action on a set of rows. The referenced rows are those contained in the slice mentioned above, i.e., the rows that meet the criteria entered in the search. The action executed on each of those rows copies fields from it to another table. Pretty simple once you understand what's happening.
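In case the configuration matters, the Referenced Rows expression simply returns the key values of the slice, roughly like this (the slice and column names here are illustrative, not my actual ones):

    Referenced Rows:   SELECT(Filtered Locations[LocationID], TRUE)
    Referenced Action: the copy-to-Jobs action described above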

The problem is, if a large number of rows are in the slice (400 or so), this can take 5 minutes or longer to fully execute.

I set this up initially about a week ago and while it wasn't super fast, I don't recall it being quite this slow. Today, a frozen snail seems like it would be faster...

If you've read this far and have some ideas about how I could improve this process, I would love to hear them.

@Steve @Aleksi @LeventK


Steve
Platinum 4

Let's start here...

This shouldn't be a workflow. Workflows are performed on the server, which occurs only after the change from the app has been synced. That's an unnecessary delay here. Instead, use the form's on-save event action to check the condition and fire off the action.

Thanks for taking the time @Steve

The form is actually a Detail view of the table with Quick Edit enabled for the search criteria fields. Setting it up this way makes an 'interactive' dashboard: as the user enters the search criteria (at the top of the dashboard), the slice view updates to display the results (at the bottom of the dashboard). I'd really like to keep this as it is rather than require the user to open, edit, and save a form. If the user enters the wrong criteria and saves the form, which adds the rows (400 or more), and then realizes an error was made, not only does the process have to be repeated, there are now 400+ rows that have to be removed.

Any other ideas?

I will also add, the delay I referenced in the initial post is significant regardless of whether the app is manually synced or the user waits for a background sync. Obviously, the delay is longer when waiting for the background sync.

Alrighty! Next thing is virtual columns. Are a large number of virtual columns involved? Do any virtual columns contain nested SELECT() (or the like) expressions?

Yep & yep...

I know I need to minimize/remove these where possible.

In my opinion, I don't see a need for either of these two things to influence this process, though. I understand that they do, but they should not.

The action that adds the rows already has all the data it needs; no computation should really be necessary:

Take the values entered in these X fields from the search form and the key value of the rows returned in the slice and add the rows to the Jobs table.
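In concrete terms, that per-row action is roughly this shape (the table and column names are placeholders for my actual ones):

    Action type:     Data: add a new row to another table using values from this row
    Table to add to: Jobs
    [Location] = [LocationID]                 (key of the slice row)
    [JobType]  = ANY(Search Form[JobType])    (criterion entered in the search form)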

The values in the search form ARE derived from SELECT() statements. I hope you're not going to tell me that the platform runs these statements for every row that is added? That would be HIGHLY inefficient...
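For example, one of the criteria dropdowns is populated with an expression like this as its Valid If (the column names are illustrative):

    SELECT(Locations[Region], TRUE, TRUE)

The third argument (distinct-only) just keeps the dropdown list de-duplicated.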

I have noticed the platform seems to evaluate relationships to tables that are not part of the equation. For example, it processes relationships, VCs, and SELECT()s even where there are no changes to the tables. Simply syncing the app causes all of these processes to run.

If your SELECT() expressions are in VCs then yes, each one, in each row, gets recalculated on sync. Because that is what VCs do. So highly intensive SELECT() expressions, especially ones whose end result doesn't actually change often, should be stored in real columns.

None of the fields in the search form are VCs. They are all real columns.

In any case, if a VC doesn't cause data to change or be added, there should be no reason to process it.

See, but they don't know that. Because of how VCs work, each one has to rerun its expression during the sync to see whether it needs to update. They have no way to know otherwise.

In the case of multiple Quick Edit fields that are real columns, each "change" triggers an edit to that row. Which means the app goes to the server, the server fetches your table, edits the row, runs every expression for that row, saves the table, runs all the VCs, and then comes back to the app. If you have delayed sync and automatic sync turned on, then if the user clicks two or three different things, each one has to go through, one by one. And the app waits for the first to come all the way back before even sending the next.

For every table in the app? Seems like a timestamp comparison for each table between the app and the server would/should be all that is required to determine whether any computation of the expressions is needed. That's one way 'they' would know... Right?

  • Timestamp of the table in the app = timestamp of the table on the server: no change occurred, no computation required.
  • Timestamp of the table in the app <> timestamp of the table on the server: a change occurred, computation required.
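In pseudo-expression terms, the check I'm imagining would be something like this; both of these table-level timestamp values are hypothetical, since AppSheet doesn't expose anything like them:

    IF([AppTableTimestamp] = [ServerTableTimestamp],
       "skip recomputation for this table",
       "recompute this table's expressions")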

The way you describe the process makes it sound as if the app has to go back and forth to the server for every change to the data? If that were the case, offline capability would not be possible. When the app syncs for the first time, the tables are downloaded to the app. Changes are made to the data in the app and then compared to the server's copy of the data at the next sync. The most recent change takes precedence.

Doesn't matter, since the data could have been changed by an external source. Syncing means it has to check everything.

Do any other tables use the data in this one?

Every change to a Quick Edit column pushes an update for that entire row. Every Quick Edit change is equivalent to a form submit without (I believe) the on-submit event action.

Expressions can also reference external values like the current date and time, and those may be part of some expression.

Yep.

Need or not, that's how it works.

I haven't tinkered with offline for a while, but when last I did, VCs didn't recompute except for rows that are updated through a form or by an action. We've been told that VCs are bulk-recalculated by the server, not the app. That corresponds with my experience.

I encourage you to engage support@appsheet.com directly to continue your hunt for performance problems. They'll be able to tell you exactly how everything works.

Bahbus
New Member

Side remark: if you're going to have a workflow executing an action on a data set of 400+ rows, then you should probably turn automatic updates and delayed updates on so that you don't have to wait on the sync screen. Or you can try to implement this workflow as a webhook to the REST API and run it asynchronously. Otherwise, I imagine a complete rewrite/overhaul of the data structure would be about the only other way to solve your issues.
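For the webhook route, the body template is plain JSON with AppSheet template expressions embedded, posted to the API endpoint for the target table. Something like the sketch below, where the app ID, slice, and column names are all placeholders for yours:

    URL:  https://api.appsheet.com/api/v2/apps/<appId>/tables/Jobs/Action
    Body:
    {
      "Action": "Add",
      "Properties": {},
      "Rows": [
        <<Start: SELECT(Filtered Locations[LocationID], TRUE)>>
        {
          "Location": "<<[LocationID]>>"
        }<<End>>
      ]
    }

Since the API call runs server-side, the app doesn't sit on the sync screen while the 400+ rows are written.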
