Unable to read MySQL table without Primary Key #26
Any updates on this?
I believe this error comes from SQLAlchemy, not beam-nuggets: https://docs.sqlalchemy.org/en/13/faq/ormconfiguration.html#how-do-i-map-a-table-that-has-no-primary-key. SQLAlchemy's ORM requires every mapped table to have a primary key, though there are a few tricks to work around it. In any case, it is best practice when working with relational databases to give every table a primary key.
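The SQLAlchemy FAQ linked above suggests designating one or more non-nullable, effectively-unique columns as a "virtual" primary key at mapping time. A minimal sketch of that trick, with a hypothetical `data` table and an in-memory SQLite database standing in for MySQL:

```python
from sqlalchemy import Column, Integer, MetaData, String, Table, create_engine
from sqlalchemy.orm import Session, registry

engine = create_engine("sqlite://")  # in-memory stand-in for the MySQL database
metadata = MetaData()

# A table deliberately created WITHOUT a primary key, mirroring the failing case.
data = Table(
    "data",
    metadata,
    Column("device_id", Integer),
    Column("payload", String),
)
metadata.create_all(engine)


class Data:
    pass


# Override the mapper: tell SQLAlchemy to treat device_id as the primary key,
# even though the underlying DDL declares none.
registry().map_imperatively(Data, data, primary_key=[data.c.device_id])

with Session(engine) as session:
    session.execute(data.insert().values(device_id=1, payload="x"))
    rows = session.query(Data).all()  # mapping succeeds; no ArgumentError raised
```

Note this only works if the chosen column really is unique and non-null for every row; otherwise the ORM's identity map will silently merge rows.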
I dealt with that via a wild workaround: I created a mock table with a primary key and passed its name to `relational_db.ReadFromDB`:

```python
relational_db.ReadFromDB(
    source_config=data_base.get_source_config(),
    table_name='mock_table_with_primary',
    query=data_base.needed_query())
```

It worked even though my query touches 4 tables, none of which is the mock table I passed. Why can't I just query whatever I want (including cross-table joins etc.) from the DB without supplying a table name?
Thanks for the workaround @sann05, I'll try it out.
I created a simple table and ran a Dataflow job. But I got hit by this error:
sqlalchemy.exc.ArgumentError: Mapper mapped class TableClass->data could not assemble any primary key columns for mapped table 'data' [while running 'Reading from CloudSQLMySQL/ParDo(_ReadFromRelationalDBFn)']
Then I altered my table: I added a primary key, ran the job again, and it succeeded.
It appears that beam_nuggets cannot read a table without a primary key. If so, why is that? Is there a way to bypass this limitation? I'm trying to dump some TCP data into a table, and there is no natural way to generate a primary key for that data.
Here is the full stacktrace of the error: