OCRTEXT() failure notification or logs

Running through a set of client test images for an app that uses OCRTEXT(), I had about 5% of images with pretty clear text fail to produce anything in the field that receives the OCRTEXT() result.

I ran a second test with the same set of images with zero failures.

Subjectively, and without any solid evidence, it felt like a service failure to me, as though nothing was coming back from the server. This is a leap in the dark, I know.

Audit history shows nothing of interest until I retook an image and got a result (which added a row), but of course those entries were all recorded as successful.

I'd be grateful for help understanding or some answers to these questions:

  • Can OCRTEXT() fail to be triggered or fail to get a response from the server?

<edit> Yes.  When monitoring network traffic, I found evidence of failures with this response: "ErrorMessage":"Image data not provided".

I also blocked the OCR request URL to see how AppSheet would behave and confirmed that AppSheet tries three times and then fails silently.  Nothing appears in the Audit Log. 

  • What happens if the OCRTEXT() suffers a server error for whatever reason?

<edit> See above.

  • Is there any way to tell there is a problem or show the user an error?

<edit> None of the errors I was able to trigger in testing were reported, so unless there is another log somewhere, the answer is no. As far as I can tell, nothing is shown in the UI to tell the user (a possible workaround is sketched below, after these questions). I wondered whether having the recognised text field hidden might be suppressing an error, but the field only appears once it receives some text.

Is there a log entry anywhere I can look at?  

<edit> It would be helpful if a failure produced an Audit Log entry and gave some indication to the user, for example on the field receiving the OCRTEXT() result.
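In the meantime, one possible in-app workaround is a virtual column that warns when an image has been captured but the OCR result is still blank. This is only a sketch, not a built-in AppSheet feature, and the column names [Receipt Image] and [Recognised Text] are placeholders for whatever the real app uses.

App formula for a warning virtual column (Text type):

  IF(
    AND(ISNOTBLANK([Receipt Image]), ISBLANK([Recognised Text])),
    "No text recognised - the OCR request may have failed, try retaking the image",
    ""
  )

The warning column's Show_If could then be ISNOTBLANK([_THIS]) so the message only appears when there is something to report.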

Next, I'll try a trained model and see if that behaves differently.

I'm still very grateful for the feature and super-impressed with the performance. This app is up against on-device OCR, and the one-second-or-so latency of the server-based OCR request is not a limiting factor at this stage. What does make a difference is the number of taps needed to capture an image: six taps and two hands versus zero taps and one hand for the augmented-reality approach.

A version of OCR for AppSheet with a live viewfinder would be very powerful but then latency for server-based OCR may well become an issue.

Sample Images:
The first image below is similar to one of the images that failed, and I should note that OCRTEXT() worked fine on a number of less clear images (including some in vertical orientation). I include the second image to suggest that the problem doesn't look like recognition itself.

99f7f863.Image.103127.jpg ❌

e8c5f483.Image.102913.jpg ✔️

3 REPLIES

Hi @MattJP, as far as I understand, you will always see this error message initially:
"ErrorMessage":"Image data not provided".

[Screenshot: Fabian_Weller_0-1661331341126.png]

When you then select an image, you get the result.

[Screenshot: Fabian_Weller_1-1661331425256.png]

BTW: is OCRTEXT() still working for you? For me it has stopped working. As the screenshots show, the correct response comes back, but AppSheet is not putting it into the right field; the column [OCRTEXT Initial Value] stays blank.
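For context, the setup described above appears to be a text column whose Initial value is an OCRTEXT() expression over the image column, roughly:

  Column: OCRTEXT Initial Value (type Text)
  Initial value: OCRTEXT([Image])

Here [Image] is a placeholder for whatever the image column is actually called in the app.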

I'm not actively using it at the moment. I'll take a look and reply next week. Thanks.

OCRTEXT is working again. AppSheet fixed this.
