Moving Google Sheet to Google Cloud SQL database challenges

Has anyone else had issues moving a Google Sheets app to a Google Cloud SQL database? I’m curious if someone could tell me where I went wrong with mine. The automatic copy to SQL from AppSheet didn’t work for me; I'm not certain why. So I manually created all my tables and columns in my database and then attempted to import a CSV downloaded from Google Sheets, with no luck. I tried a variety of methods, with MySQL Workbench crashing every time.

Eventually, I tried downloading the Google Sheet in .ods format and then saving to CSV from there on my PC, instead of going straight from Excel. After that I was able to import all of my CSVs with no issue. I’m just curious, for the future, what the cause of this may have been. The files always showed as encoded UTF-8, if that matters.

One other thing: when I saved from the .ods file, I had to save the CSV as the MS-DOS type.
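For what it’s worth, the UTF-8/MS-DOS detail suggests the usual suspects are a leading byte-order mark or mixed line endings, either of which can trip up an importer. As a sketch (the function and file names here are made up, not from the thread), a few lines of Python can rewrite a CSV as BOM-free UTF-8 with CRLF ("MS-DOS") line endings before you hand it to the import tool:

```python
def normalize_csv(src, dst):
    """Rewrite a CSV as BOM-free UTF-8 with CRLF line endings."""
    # 'utf-8-sig' strips a leading byte-order mark if one is present.
    with open(src, encoding="utf-8-sig", newline="") as f:
        text = f.read()
    # Collapse any mix of \r\n / \r / \n down to plain \n first...
    text = text.replace("\r\n", "\n").replace("\r", "\n")
    # ...then write everything back out with CRLF ("MS-DOS") endings.
    # newline="" stops Python from translating the endings again.
    with open(dst, "w", encoding="utf-8", newline="") as f:
        f.write(text.replace("\n", "\r\n"))
```

Running the exported CSV through something like this, then importing the result, is one way to test whether the BOM or line endings were the actual cause.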

We were running in Smartsheets but now run in Google Cloud on Postgres. I didn’t transfer any files over; I just created new tables in Postgres. With Postgres there is an option to import a CSV into an existing table; I’m unsure about importing into a new table.
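For anyone going this route, the CSV-into-an-existing-table import can also be done from the psql command line with the `\copy` meta-command (the table and file names below are placeholders, not from the thread); as noted above, the table itself still has to exist first:

```sql
-- Run inside psql; \copy streams the client-side file to the server.
-- The target table must already exist with columns matching the CSV.
\copy my_table FROM 'data.csv' WITH (FORMAT csv, HEADER true)
```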

Also, it was a difficult transition, but the apps sync so much faster, and they scale to hundreds of users while still syncing fast. If you’re not fully committed to MySQL, I would recommend Postgres. pgAdmin works well.


We use MySQL and dbForge as our interface to our databases, and we have had no issues importing data from CSVs or creating new tables from CSVs.


I migrated to a Google Cloud SQL Postgres instance.

I installed pgAdmin on my desktop just to keep an eye on the database.

At first, I kept failing because one of my tables had text in an int column. I realized this after carefully reading the error I got.

Then it would just fail with random errors. I kept trying, and before each new attempt I would delete all the tables from the database via pgAdmin.

Then it worked!

Hope that helps if you are using a Postgres SQL server. pgAdmin might help you keep an eye on things.

I have not tried adding tables through the Google Cloud site, but I have added all of my tables through pgAdmin. I pretty much only use pgAdmin to manage the database.

I never did figure this out. I’m sure it’s a formatting issue of some sort. Uploading a CSV works fine, so I’m just sticking with that process for now.

I had an initial issue in that one of the columns in my sheet was destined to hold int values in my app and had been defined as such.

However, when editing the actual sheet in my Google Drive, I added a row of test data and did not realize that I had entered text into that column for that row.

It did not throw my app off when it was using the sheet as a data source.

But when I tried to import it into my database it threw an error.

TL;DR: check your source data columns and make certain that if a column is supposed to be an int column, every row holds an int value; same for text, dates, etc.
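That pre-import check can be automated. As a hedged sketch (the file and column names here are hypothetical), a few lines of Python will report which rows of a CSV fail to parse as ints before the database ever sees them:

```python
import csv

def check_int_column(path, column):
    """Return the file line numbers whose value in `column` is not a valid int."""
    bad_rows = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        # Data starts on line 2 of the file; line 1 is the header row.
        for line_no, row in enumerate(reader, start=2):
            try:
                int(row[column])
            except (ValueError, TypeError):
                bad_rows.append(line_no)
    return bad_rows
```

An empty result means the column is safe to import as int; a nonempty one points straight at the offending rows, like the stray text value described above.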


Hi Austin, I’m having lots of issues importing CSV files into Cloud SQL with MySQL Workbench.

dbForge is not really an option right now (it would need a VM, and it’s expensive).

Do you have any recommendation for an alternative?


We use an Excel add-in for MySQL to import data into MySQL. Open your CSV with Excel and then use the add-in to export to MySQL.

A bit of manual work is needed, but we do dump data from the spreadsheet (export the sheet to CSV) and then in turn export it down to MySQL.


Ok, will check this out and see if it works!

I am currently trying to create a new table with all columns as text, and I will then redefine each column to the right type. Maybe that could also work.
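That staging approach can be sketched in MySQL (the table and column names below are hypothetical, not from the thread): load everything as text so the import can’t fail on types, then tighten the types once the rows are in. Note that whether invalid values then raise an error or get coerced with a warning depends on the server’s `sql_mode`.

```sql
-- Hypothetical staging table: every column starts as TEXT,
-- so the CSV import cannot fail on type mismatches.
CREATE TABLE orders_staging (
  id  TEXT,
  qty TEXT
);

-- ...import the CSV into orders_staging here...

-- Then convert the columns in place; MySQL casts the existing
-- values as part of the ALTER.
ALTER TABLE orders_staging
  MODIFY id  INT,
  MODIFY qty INT;
```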

You can set the data type for each field of the new SQL table on export, so it should work.

Good luck.

I have only ever used dbForge and straight command-line interfacing, and only with MySQL databases.

What kind of issues?

Workbench’s CSV “import wizard” is always incredibly slow. It is better to use a SQL query, like this:

```sql
LOAD DATA LOCAL INFILE 'C:/full-path-to-local-file_with-forward-slashes'
INTO TABLE database.table;
```

This assumes the table has already been created with a structure matching the CSV. If it hasn’t, sometimes I’ll use the “import wizard” to load a CSV of just the headers and maybe one record of data to create the table.
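If the bare statement chokes on quoting or line endings (a plausible cause of the crashes described earlier in the thread), MySQL’s standard CSV-parsing clauses can be spelled out explicitly; the path and table name here are placeholders:

```sql
LOAD DATA LOCAL INFILE 'C:/path/to/data.csv'
INTO TABLE mydb.mytable
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'  -- use '\n' for Unix-style files
IGNORE 1 LINES;             -- skip the header row
```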

AppSheet’s new “copy data to new source” option for individual tables also works great.