SQLQuery: print error by default instead of throwing error stack #581
Conversation
cc @itamarst
Hello @stczwd, I am studying the issue of throwing errors instead of printing them in the Spark kernel (because I really like and use this project :) ). Do you think it would be better if https://github.com/jupyter-incubator/sparkmagic/blob/master/sparkmagic/sparkmagic/livyclientlib/sqlquery.py#L48 returned something like https://github.com/jupyter-incubator/sparkmagic/blob/master/sparkmagic/sparkmagic/magics/sparkmagicsbase.py#L44? Then the whole block

```python
if not success:
    if conf.spark_statement_errors_are_fatal():
        if conf.shutdown_session_on_spark_statement_errors():
            self.spark_controller.cleanup()
        raise SparkStatementException(out)
```

could be extracted into a single function instead of having the same code duplicated in different files. This whole logic could also move into a function in a separate file and become easily reusable, but my main point is to ask where this error throwing should be centralized; I think it should be centralized on the …
@PedroRossi Yep, this is a good suggestion. I will check it out.
@PedroRossi This is a great idea, but it needs a lot of changes. I will submit it in a new patch, where we can discuss it.
Could you describe the current status of this PR? What is the goal, and what does it try to do?
@itamarst This patch is mainly about controlling exception throwing in SQL queries. There will be another patch to handle the problem @PedroRossi discussed with me.
Please add tests for this
```python
result = records_to_dataframe(records_text, session.kind, self._coerce)
if conf.spark_statement_errors_are_fatal():
    raise SparkStatementException(records_text)
self.ipython_display.send_error(records_text)
```
If `conf.spark_statement_errors_are_fatal()` is `False` and a `SparkStatementException` is raised, then wouldn't this be handled and displayed by `handle_expected_exceptions`?
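For context on this question: `handle_expected_exceptions` in sparkmagic is a decorator that catches a known set of "expected" exception types raised by a magic and reports them through `ipython_display` rather than letting a traceback propagate. A simplified sketch of that pattern, not the library's exact code (`EXPECTED_EXCEPTIONS` is abbreviated here):

```python
from functools import wraps

from sparkmagic.livyclientlib.exceptions import SparkStatementException

# Abbreviated for the sketch; the real list in sparkmagic covers more
# livyclientlib exception types.
EXPECTED_EXCEPTIONS = (SparkStatementException,)


def handle_expected_exceptions(f):
    """Catch expected exceptions from a magic and display them as an
    error message instead of raising into the notebook."""
    @wraps(f)
    def wrapped(self, *args, **kwargs):
        try:
            return f(self, *args, **kwargs)
        except EXPECTED_EXCEPTIONS as e:
            self.ipython_display.send_error(str(e))
    return wrapped
```

Under this pattern, a raised `SparkStatementException` would indeed end up caught and displayed to the user, which is what the review question above is probing.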
By default, a failed Spark statement produces only the error message; no exception is thrown unless the user sets `spark_statement_errors_are_fatal`. Running SQL needs the same behavior. For most users, the error message is sufficient.
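As a usage illustration of the behavior described above; the `conf.override` call mirrors the pattern sparkmagic's own tests use to flip configuration values, so treat the exact calls as an assumption:

```python
import sparkmagic.utils.configuration as conf

# Default behavior: a failed statement only prints the error message,
# so the rest of the notebook keeps running.
print(conf.spark_statement_errors_are_fatal())  # expected: False

# Opt in to fatal errors: a failed statement now raises
# SparkStatementException and fails the cell loudly.
conf.override(conf.spark_statement_errors_are_fatal.__name__, True)
```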