Merge branch 'master' into betterEmptyCheck
ohadbitt authored Dec 15, 2024
2 parents af9cd54 + 489254f commit 7fab3a7
Showing 22 changed files with 771 additions and 422 deletions.
1 change: 1 addition & 0 deletions .github/workflows/build.yml
@@ -45,6 +45,7 @@ jobs:
kustoCluster: ${{ secrets.CLUSTER }}
kustoAadAppId: ${{secrets.APP_ID}}
accessToken: ${{env.ACCESS_TOKEN}}
storageAccountUrl: ${{ secrets.STORAGE_CONTAINER_URL }}
run: |
mvn clean verify -DkustoAadAppId=${{ secrets.APP_ID }} -DkustoAadAuthorityID=${{ secrets.TENANT_ID }} -DkustoDatabase=${{ secrets.DATABASE }} -DkustoCluster=${{ secrets.CLUSTER }} -DaccessToken=${{env.ACCESS_TOKEN}}
- name: Publish Unit Test Results
16 changes: 9 additions & 7 deletions README.md
@@ -21,9 +21,12 @@ This connector works with the following spark environments:

## Changelog

**Breaking changes in versions 5.2.x** - From these versions onwards, the published packages are shaded and packaged as a self-contained jar. This is done to avoid conflicts with common OSS libraries, Spark runtimes, and application dependencies.

For major changes from previous releases, please refer to [Releases](https://github.com/Azure/azure-kusto-spark/releases).
For known or new issues, please refer to the [issues](https://github.com/Azure/azure-kusto-spark/issues) section.
> Note: Use the 4.x series only if you are using JDK 11. Versions 3.x and 5.x work with JDK 8 and all later JDK versions.

From version 5.2.0 onwards, the connector is packaged as an uber jar to avoid conflicts with other jars added as part of Spark job definitions.

## Usage

@@ -38,14 +41,14 @@ link your application with the artifact below to use the Azure Data Explorer Connector for Apache Spark in Spark
```
groupId = com.microsoft.azure.kusto
artifactId = kusto-spark_3.0_2.12
version = 5.0.6
version = 5.2.2
```
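For a quick interactive check that the artifact resolves, the same coordinates can be passed to `spark-shell` through Spark's standard `--packages` flag. A minimal sketch, assuming the artifact is reachable from your configured repositories (Maven Central by default):

```shell
# Pull the connector by its Maven coordinates and start an interactive shell with it on the classpath
spark-shell --packages com.microsoft.azure.kusto:kusto-spark_3.0_2.12:5.2.2
```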

**In Maven**:

Look for the following coordinates:
```
com.microsoft.azure.kusto:kusto-spark_3.0_2.12:5.0.6
com.microsoft.azure.kusto:kusto-spark_3.0_2.12:5.2.2
```

Or clone this repository and build it locally to add it to your local Maven repository.
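A minimal sketch of that local build, assuming a standard Maven toolchain; the `-DskipTests` flag is illustrative and can be dropped to run the test suite:

```shell
# Clone the connector repository and install the artifact into the local Maven repository (~/.m2)
git clone https://github.com/Azure/azure-kusto-spark.git
cd azure-kusto-spark
mvn clean install -DskipTests
```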
@@ -55,15 +58,15 @@ The jar can also be found under the [released package](https://github.com/Azure/
<dependency>
<groupId>com.microsoft.azure.kusto</groupId>
<artifactId>kusto-spark_3.0_2.12</artifactId>
<version>5.0.6</version>
<version>5.2.2</version>
</dependency>
```

**In SBT**:

```scala
libraryDependencies ++= Seq(
"com.microsoft.azure.kusto" %% "kusto-spark_3.0" % "5.0.6"
"com.microsoft.azure.kusto" %% "kusto-spark_3.0" % "5.2.2"
)
```

@@ -72,7 +75,7 @@ libraryDependencies ++= Seq(
Libraries -> Install New -> Maven -> copy the following coordinates:

```
com.microsoft.azure.kusto:kusto-spark_3.0_2.12:5.0.6
com.microsoft.azure.kusto:kusto-spark_3.0_2.12:5.2.2
```

#### Building Samples Module
@@ -115,8 +118,7 @@ To facilitate ramp-up from local jar on platforms such as Azure Databricks, pre-built libraries
are published under [GitHub Releases](https://github.com/Azure/azure-kusto-spark/releases).
These libraries include:
* Azure Data Explorer connector library
* Users may also need to include the Kusto Java SDK libraries (kusto-data and kusto-ingest), which are published under [GitHub Releases](https://github.com/Azure/azure-kusto-java/releases)
* Versions 5.2.0 and up of the library are published to Maven as uber jars. This avoids conflicts with custom jars added as part of the job, and the exclude/include process that would otherwise be needed to resolve them.
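One way to attach such a pre-built jar to a job outside of Databricks is `spark-submit`'s standard `--jars` flag. A minimal sketch, where the jar file name is an assumption based on the version above and the application class and jar are purely hypothetical:

```shell
# Submit an application with the pre-built connector uber jar downloaded from GitHub Releases
spark-submit \
  --jars kusto-spark_3.0_2.12-5.2.2-jar-with-dependencies.jar \
  --class com.example.MyKustoApp \
  my-kusto-app.jar
```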

## Dependencies
The Spark Azure Data Explorer connector depends on [Azure Data Explorer Data Client Library](https://mvnrepository.com/artifact/com.microsoft.azure.kusto/kusto-data)
