Use unified DB-API in codebase #2821
Conversation
```diff
-                               hdfs_user=hdfs_user,
-                               hdfs_pass=hdfs_pass) as w:
+    with db.buffered_db_writer(conn, result_table, result_column_names,
+                               100) as w:
```
Why remove the other parameters?
Parameters like `hdfs_pass` are driver specific and can be passed through the connection URI. We have a unified way to store these params (in the `conn.params` dict), and we can retrieve them with `conn.param(param_name)` when needed.
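The approach described above can be sketched roughly as follows. This is a minimal illustration, not SQLFlow's actual implementation: the class name `ConnectionBase` and the exact signatures are assumptions; only the `params` dict and `param(name)` accessor mirror the conversation.

```python
from urllib.parse import urlparse, parse_qs


class ConnectionBase:
    """Sketch of a unified connection that keeps driver-specific
    settings (e.g. hdfs_user, hdfs_pass) from the URI query string
    in a single params dict instead of separate keyword arguments."""

    def __init__(self, conn_uri):
        parsed = urlparse(conn_uri)
        # parse_qs returns lists; keep the first value of each key.
        self.params = {k: v[0] for k, v in parse_qs(parsed.query).items()}

    def param(self, name, default=None):
        # Unified accessor: drivers look up what they need on demand.
        return self.params.get(name, default)


conn = ConnectionBase(
    "hive://root:root@localhost:10000/iris?hdfs_user=sqlflow&hdfs_pass=secret")
print(conn.param("hdfs_user"))  # sqlflow
```

With this shape, helpers like `buffered_db_writer` only need the connection object, since any driver-specific parameter travels inside it.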
I see, the `db.py` file was folded in the diff view and I missed it.
LGTM basically.
```diff
@@ -130,15 +132,18 @@ def get_oss_model_url(model_full_path):
     return "oss:https://%s/%s" % (oss.SQLFLOW_MODELS_BUCKET, model_full_path)


 def parse_maxcompute_dsn(datasource):
```
Do we need this method, or can callers just write `MaxComputeConnection.get_uri_parts(datasource)` directly?
Good idea. I will figure this out and make the modification if possible, in the next PR :)
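The redundancy the reviewer points at could look roughly like the sketch below. The field names, the `UriParts` tuple, and the example DSN are assumptions for illustration; only `MaxComputeConnection.get_uri_parts(datasource)` and `parse_maxcompute_dsn(datasource)` come from the conversation above.

```python
from collections import namedtuple
from urllib.parse import urlparse, parse_qs

# Hypothetical result shape; the real return type may differ.
UriParts = namedtuple("UriParts", ["user", "password", "database", "params"])


class MaxComputeConnection:
    @staticmethod
    def get_uri_parts(datasource):
        # e.g. "maxcompute://AK:SK@service.example.com/api?curr_project=test"
        parsed = urlparse(datasource)
        params = {k: v[0] for k, v in parse_qs(parsed.query).items()}
        return UriParts(parsed.username, parsed.password,
                        params.get("curr_project", ""), params)


# The wrapper under discussion adds nothing over the static method,
# which is why the reviewer suggests calling get_uri_parts directly.
def parse_maxcompute_dsn(datasource):
    return MaxComputeConnection.get_uri_parts(datasource)
```

Dropping the wrapper would leave one canonical place (the connection class) that knows how to interpret a MaxCompute DSN.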
Commits:

- Update OptFlow FSL generation code when len(variables) == 2 (#2778): change optflow api; polish; add more ut
- alps submitter and codegen (#2771): test alps submitter; add alps codegen; update
- Support download model to local in cli (#2779): support download model to local in cli; update; update; update
- Add query api to db.py (#2782): Add query to db.py; change delete with truncate
- Set EnableWindowFunc to be true by default for TiDB parser (#2786): Make the attribute check for XGBoost model compatible with reg:linear; set EnableWindowFunc to true by default for TiDB parser
- fix Tensorflow -> TensorFlow (#2783)
- Add optimization guide doc (#2785): add optimization guide doc; polish according to comments
- Generate workflow using runtime (#2784): WIP generate workflow using runtime; wip update; update; update; fix hive ci
- DB api base class (#2787): Add query to db.py; change delete with truncate; DB interface base class
- add to_dict and from_dict method (#2792)
- Fix develop jupyter image build (#2790): fix develop jupyter image build; update
- Generate workflow step code using runtime fea der (#2791): WIP generate workflow step code using runtime fea der; tested local; update; update; update; fix tests
- Add MySQL db-api implementation (#2793): Add query to db.py; change delete with truncate; DB interface base class; Add MySQL db-api implementation; remove unused import
- fix actions maxcompute test not running (#2795)
- Enable flake8 check on CI (#2788): test ci; test again; update; update and fix; fix travis ci env
- generate python feacol code (#2797)
- Add hive DB-API (#2798): Add query to db.py; change delete with truncate; DB interface base class; Add MySQL db-api implementation; remove unused import; polish mysql db-api; Add hive DB-API; modify doc; format code
- modify cora dataset to adapt csv format (#2780)
- Add json dump and load support for FeatureColumn (#2794): add json dump load support; update vocabulary type; update; update; update
- Add maxcompute DB-API (#2801): Add maxcompute DB-API; remove unused import; format code
- Push images on self hosted machine (#2799): push images on self hosted machine; update; update; update; update; test install.sh; fix go mirrors; clean up; add clean up; update clean up script
- fix pai xgboost package deps (#2803)
- Simplify TO RUN command - use filename instead of absolute path for the executable or script program (#2804): Make the attribute check for XGBoost model compatible with reg:linear; Derive the absolute path of the runnable program if users just input a file name; Use python -m command to invoke the TO RUN statement in default submitter; Move getRunnableProgramAbsPath to alisa.go
- Polish DB-API code, export unified connect function from package (#2808): Add maxcompute DB-API; remove unused import; format code; polish db-api
- add solved y to optimize (#2810)
- Generate couler code of workflow steps (#2806): wip; fix yaml generate; fix tests; fix package deps; fix pip package deps; update
- Refine metadata collect and save/load (#2807): move and refine metadata; fix ci ut; fix ut; follow lhw comment
- Adapt paiio with DB-API (#2809): Add maxcompute DB-API; remove unused import; format code; polish db-api; Adapt paiio with DB-API; Adapt paiio with DB-API; add try import paiio; fix typo
- disable actions maxcompute test (#2814)
- make constraint optional (#2812)
- fix typo (#2820)
- Install BARON solver in Docker image (#2811): install baron solver in Docker image; polish; add pyomo baron into step docker image
- Polish DB-API to support Python2 so can run on PAI (#2815): polish db-api to support Python2 so can run on PAI; enable unittest for hive db-api
- switch to github actions (#2818)
- Add experimental workflow end2end test (#2813): add experimental workflow end2end test; fix workflow ci env; update test code; pull latest step before running workflow
- Add Model.save_to_oss and Model.load_from_oss (#2817): add save_to_oss/load_from_oss; change pickle protocol; add more explanations on oss_model_dir doc; fix ut
- Fix relative importing cause error (#2823): fix relative importing cause error; clean up
- Use unified DB-API in codebase (#2821): Add maxcompute DB-API; remove unused import; format code; polish db-api; Adapt paiio with DB-API; Adapt paiio with DB-API; add try import paiio; use db-api in old code; DB-API support Python2 so can run on PAI; polish db-api to support Python2 so can run on PAI; polish db-api to support Python2 so can run on PAI; polish db-api to support Python2 so can run on PAI; Use unified DB-API in codebase; Use unified DB-API in codebase; polish code; remove debug info; fix ut
- Generate workflow step for normal statement run (#2824): generate workflow step for normal statement run; clean up; build step image before run workflow test; fix is_query
- Fix pai training with optimizer config (#2828): fix pai training with optimizer config; remove template
- Save the trained xgboost model (#2822): save trained xgboost model; fix flake8 check; fix ut; fix ut; fix workflow ut; fix cwd error

Co-authored-by: Wu Yi <[email protected]>
Co-authored-by: HongwuLin <[email protected]>
Co-authored-by: brightcoder01 <[email protected]>
Fixes #2781