One week into AppSheet and I’ve already built a cool app. I like how my app tightly integrates with my phone’s email, text messages, voice, maps, image capture, etc. I’ve just scratched the surface.
I want to anticipate customer questions, such as: how do you integrate specialized test equipment? In a previous position, our company performed rail testing, and there was more than one testing modality. One involved self-powered rail cars that scanned the rail continuously. Another employed trucks that could ride the rail and stop when a suspected defect was found; a crew member jumped out and performed hand-testing. A third modality extended the continuous testing I mentioned first: the test vehicle never stopped, but the data was analyzed in near real time. Suspected defects were identified, and batches of them were uploaded to crews with handheld testing equipment, who located the section of rail using GPS coordinates supplied by the analysis team.
The Engineering Department developed the ultrasound testing equipment, but the code for the handheld device (i.e., a smartphone) was developed by a member of IT. All of the code was specific to the phone's OS, and even to the phone brand, because the solution depended on the device's communication ports. This was 15 years ago, so it was a high-cost effort.
Fast-forward to today. I'm no longer active in that business domain, but I am certain there are other opportunities that call for integrating specialized equipment with a phone. AppSheet doesn't need to change; it can already sync images and text with the server. What needs improvement is the manner in which the phone captures images from the specialized test equipment, plus the associated metadata.
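To make the idea concrete, here is a minimal sketch of the pattern I have in mind: a small companion script (hypothetical, not an AppSheet API) receives an image from the test equipment, writes it to a folder, and appends a metadata row (device ID, GPS fix, timestamp) to a CSV. A sync layer such as an AppSheet-backed table could then pick up the image and the row. The function and file names are my own assumptions for illustration.

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

def save_capture(image_bytes, gps, device_id, out_dir="captures"):
    """Store one image from external test equipment plus a metadata row.

    image_bytes -- raw image data received from the equipment
    gps         -- (latitude, longitude) tuple from the phone's GPS
    device_id   -- identifier of the test instrument (hypothetical naming)
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)

    # Timestamped filename so each capture is unique and sortable.
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    img_path = out / f"{device_id}_{ts}.png"
    img_path.write_bytes(image_bytes)

    # Append a row to a CSV that a sync service could treat as a table.
    meta_path = out / "captures.csv"
    is_new = not meta_path.exists()
    with meta_path.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["file", "device_id", "lat", "lon", "captured_utc"])
        writer.writerow([img_path.name, device_id, gps[0], gps[1], ts])
    return img_path
```

The transport from the equipment to the phone (Bluetooth, Wi-Fi, USB) is deliberately left out; the point is only that once the image and metadata land in a synced folder or table, the existing sync machinery does the rest.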
Does anyone have any thoughts on best practices for interfacing specialized equipment? Any Case Studies or White Papers?