This repository was archived by the owner on Feb 27, 2025. It is now read-only.

sql-spark-connector issue where the access token expires post 60 mins #253

Open
shyamalesh opened this issue Jan 9, 2024 · 1 comment

Even after setting the default token timeout in Azure Active Directory to 9 hours for this particular service principal, we still hit the token expiry issue after 1 hour of execution.

Connection code from Databricks to Azure SQL:

```python
(df.write
    .format(DBConnection.connector_type_spark)
    .mode("overwrite")
    .option("url", url)
    .option("dbtable", destination_tablename)
    .option("accessToken", access_token)
    .option("encrypt", "true")
    .option("databaseName", DBConnection.jdbc_database)
    .option("hostNameInCertificate", host_name_in_certificate)
    .save())
```

Error message - org.apache.spark.SparkException: Job aborted due to stage failure: Task 157 in stage 29.0 failed 4 times, most recent failure: Lost task 157.8 in stage 29.0 (TID 19900) (10.139.64.13 executor 0): com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user '<token-identified principal>'. Token is expired. ClientConnectionId:c79c98cb-8d3e-421f-88f0-971508e53a72
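A point worth checking: AAD access tokens for the `https://database.windows.net/` resource are typically valid for roughly 60-90 minutes regardless of directory session settings, so long-running jobs usually re-acquire the token shortly before each write rather than trying to extend its lifetime. A minimal sketch of that pattern, where `acquire_token` is a hypothetical callable (e.g. wrapping azure-identity's `ClientSecretCredential.get_token`) returning `(token_string, expires_on_epoch_seconds)`:

```python
import time

class TokenCache:
    """Caches an access token and re-acquires it when close to expiry."""

    def __init__(self, acquire_token, refresh_margin_s=300):
        self._acquire = acquire_token      # injected acquisition function
        self._margin = refresh_margin_s    # refresh this many seconds early
        self._token = None
        self._expires_on = 0.0

    def get(self):
        # Re-acquire when there is no token yet, or we are inside the
        # refresh margin before the cached token's expiry.
        if self._token is None or time.time() >= self._expires_on - self._margin:
            self._token, self._expires_on = self._acquire()
        return self._token
```

In the write path one would then pass `cache.get()` to `.option("accessToken", ...)` immediately before each `.save()`, so every batch starts with a fresh token.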

Is there any way to connect to Azure SQL Server from Databricks using ECDOMAIN without a password?

Example:

```python
(df.write
    .format("com.microsoft.sqlserver.jdbc.spark")
    .mode("overwrite")
    .option("url", url)
    .option("dbtable", dbtable)
    .option("user", r"ECDOMAIN\username")
    .save())
```
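For what it's worth, the underlying Microsoft JDBC driver does support passwordless sign-in via the `authentication=ActiveDirectoryIntegrated` connection property, but it relies on a Kerberos/Windows credential being available on the cluster nodes, which typically needs extra setup on Databricks. A sketch that only assembles the option map (the URL, table, and `ECDOMAIN\svc_user` values are placeholders):

```python
def integrated_auth_options(url, dbtable, user):
    """Assemble write options for AD-integrated (passwordless) auth.

    Assumes the JDBC driver's "authentication" property is passed through
    by the connector; cluster-side Kerberos configuration is still required.
    """
    return {
        "url": url,
        "dbtable": dbtable,
        "user": user,
        "authentication": "ActiveDirectoryIntegrated",
        "encrypt": "true",
    }

# With a live Spark session one would then call:
# df.write.format("com.microsoft.sqlserver.jdbc.spark") \
#     .mode("overwrite").options(**opts).save()
```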

mcbdx commented May 14, 2024

Hello,

I'm having a similar issue, except that the token works for the first 30 seconds only. Did you ever find a solution to this? How are you acquiring your access token?
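One way to narrow this down: an AAD access token is a JWT, so its actual lifetime can be read from the `exp` claim of the (unverified) payload. If `exp` is only ~30 seconds out, the token is being minted wrong rather than rejected early. A stdlib-only diagnostic sketch (the helper names are mine):

```python
import base64
import json
import time

def token_expiry(jwt):
    """Return the 'exp' epoch seconds from a JWT without verifying it."""
    payload_b64 = jwt.split(".")[1]
    # Re-pad base64url to a multiple of 4 before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims["exp"]

def seconds_remaining(jwt, now=None):
    """How many seconds the token has left (negative if already expired)."""
    return token_expiry(jwt) - (time.time() if now is None else now)
```

Printing `seconds_remaining(access_token)` right before the write shows whether the token was short-lived from the start or expired mid-job.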
