Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

azure - Using a JAR from a private Artifacts Feed in Data Factory Databricks task

My goal is to first publish JAR files of my Scala project to an Azure DevOps Artifacts feed and then use those JARs in my Databricks tasks in Azure Data Factory. I'm assuming I would have to use Maven as the library type and point the repository to the Artifacts feed.

The feed is private, and I haven't found a way to set up credentials for Data Factory to use it.

Is this possible in the first place? Could the credentials be embedded in the Maven repository URL?

[Screenshot: "Append libraries" options under Data Factory's Databricks task]



1 Answer


Private Maven repositories are not supported in Azure Databricks. If you want to install a private Maven package, you can upload the JAR to DBFS and install it via the Databricks Libraries CLI. See the example below:

Upload the JAR to DBFS from your local machine:

databricks fs cp "path/to/myPac.jar" dbfs:/mavenPrivate/jars/myPac.jar

Install the JAR from DBFS onto a cluster:

databricks libraries install --cluster-id $CLUSTER_ID --jar dbfs:/mavenPrivate/jars/myPac.jar
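If you want to fetch the JAR from the private Artifacts feed in a script (rather than copying it from a local build), Azure DevOps exposes Maven feeds over a predictable URL scheme that accepts basic auth with a personal access token. A minimal sketch, assuming hypothetical organization, feed, and artifact names (substitute your own); the download and upload commands are shown as comments since they need real credentials:

```shell
# All names below (organization, feed, group, artifact, version) are placeholders.
ORG="myorg"
FEED="myfeed"
GROUP_PATH="com/example"   # groupId with dots replaced by slashes
ARTIFACT="myPac"
VERSION="1.0.0"

# Azure DevOps serves Maven feeds under this URL scheme.
JAR_URL="https://pkgs.dev.azure.com/${ORG}/_packaging/${FEED}/maven/v1/${GROUP_PATH}/${ARTIFACT}/${VERSION}/${ARTIFACT}-${VERSION}.jar"
echo "${JAR_URL}"

# Download with a PAT (any username, token as password), then push to DBFS:
#   curl -fsSL -u "user:${AZURE_DEVOPS_PAT}" -o "${ARTIFACT}-${VERSION}.jar" "${JAR_URL}"
#   databricks fs cp "${ARTIFACT}-${VERSION}.jar" "dbfs:/mavenPrivate/jars/${ARTIFACT}-${VERSION}.jar"
```

Once the JAR is in DBFS, the `databricks libraries install` command above (or the cluster's Libraries UI) can install it without any feed credentials.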
