Upgrade to .NET 6 (#1112)
AFFogarty authored Feb 17, 2023
1 parent d772503 commit b63c08b
Showing 31 changed files with 131 additions and 137 deletions.
2 changes: 1 addition & 1 deletion .gitignore
@@ -49,7 +49,7 @@ dlldata.c
 # Benchmark Results
 BenchmarkDotNet.Artifacts/
 
-# .NET Core
+# .NET
 project.lock.json
 project.fragment.lock.json
 artifacts/
8 changes: 4 additions & 4 deletions README.md
@@ -8,7 +8,7 @@
 
 .NET for Apache Spark is compliant with .NET Standard - a formal specification of .NET APIs that are common across .NET implementations. This means you can use .NET for Apache Spark anywhere you write .NET code allowing you to reuse all the knowledge, skills, code, and libraries you already have as a .NET developer.
 
-.NET for Apache Spark runs on Windows, Linux, and macOS using .NET Core, or Windows using .NET Framework. It also runs on all major cloud providers including [Azure HDInsight Spark](deployment/README.md#azure-hdinsight-spark), [Amazon EMR Spark](deployment/README.md#amazon-emr-spark), [AWS](deployment/README.md#databricks) & [Azure](deployment/README.md#databricks) Databricks.
+.NET for Apache Spark runs on Windows, Linux, and macOS using .NET 6, or Windows using .NET Framework. It also runs on all major cloud providers including [Azure HDInsight Spark](deployment/README.md#azure-hdinsight-spark), [Amazon EMR Spark](deployment/README.md#amazon-emr-spark), [AWS](deployment/README.md#databricks) & [Azure](deployment/README.md#databricks) Databricks.
 
 **Note**: We currently have a Spark Project Improvement Proposal JIRA at [SPIP: .NET bindings for Apache Spark](https://issues.apache.org/jira/browse/SPARK-27006) to work with the community towards getting .NET support by default into Apache Spark. We highly encourage you to participate in the discussion.
 
@@ -61,7 +61,7 @@
 .NET for Apache Spark releases are available [here](https://github.com/dotnet/spark/releases) and NuGet packages are available [here](https://www.nuget.org/packages/Microsoft.Spark).
 
 ## Get Started
-These instructions will show you how to run a .NET for Apache Spark app using .NET Core.
+These instructions will show you how to run a .NET for Apache Spark app using .NET 6.
 - [Windows Instructions](docs/getting-started/windows-instructions.md)
 - [Ubuntu Instructions](docs/getting-started/ubuntu-instructions.md)
 - [MacOs Instructions](docs/getting-started/macos-instructions.md)
@@ -79,8 +79,8 @@ Building from source is very easy and the whole process (from cloning to being a
 
 | | | Instructions |
 | :---: | :--- | :--- |
-| ![Windows icon](docs/img/windows-icon-32.png) | **Windows** | <ul><li>Local - [.NET Framework 4.6.1](docs/building/windows-instructions.md#using-visual-studio-for-net-framework-461)</li><li>Local - [.NET Core 3.1](docs/building/windows-instructions.md#using-net-core-cli-for-net-core)</li><ul> |
-| ![Ubuntu icon](docs/img/ubuntu-icon-32.png) | **Ubuntu** | <ul><li>Local - [.NET Core 3.1](docs/building/ubuntu-instructions.md)</li><li>[Azure HDInsight Spark - .NET Core 3.1](deployment/README.md)</li></ul> |
+| ![Windows icon](docs/img/windows-icon-32.png) | **Windows** | <ul><li>Local - [.NET Framework 4.6.1](docs/building/windows-instructions.md#using-visual-studio-for-net-framework-461)</li><li>Local - [.NET 6](docs/building/windows-instructions.md#using-net-core-cli-for-net-core)</li><ul> |
+| ![Ubuntu icon](docs/img/ubuntu-icon-32.png) | **Ubuntu** | <ul><li>Local - [.NET 6](docs/building/ubuntu-instructions.md)</li><li>[Azure HDInsight Spark - .NET 6](deployment/README.md)</li></ul> |
 
 <a name="samples"></a>
 ## Samples
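The Get Started change above moves the README walkthrough from .NET Core to .NET 6. As a rough end-to-end sketch of what that looks like (the app name `MySparkApp` and the jar version are illustrative assumptions, not part of this commit):

```bash
# Sketch only: project name and jar version are assumptions.
# Build the app against the net6.0 target framework.
dotnet build MySparkApp.csproj --framework net6.0

# Launch it through spark-submit via the DotnetRunner entry point
# shipped in the microsoft-spark jar (file name varies by Spark version).
spark-submit \
  --class org.apache.spark.deploy.dotnet.DotnetRunner \
  --master local \
  microsoft-spark-3-2_2.12-<version>.jar \
  dotnet MySparkApp.dll
```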
2 changes: 1 addition & 1 deletion ROADMAP.md
@@ -12,7 +12,7 @@ The goal of the .NET for Apache Spark project is to provide an easy to use, .NET
 ### Performance Optimizations
 * Improvements to C# Pickling Library
 * Improvements to Arrow .NET Library
-* Exploiting .NET Core 3.0 Vectorization (*)
+* Exploiting .NET Vectorization (*)
 * Micro-benchmarking framework for Interop
 
 ### Benchmarks
8 changes: 4 additions & 4 deletions azure-pipelines-e2e-tests-template.yml
@@ -20,7 +20,7 @@ stages:
   - job: Run_${{ replace(option.pool, ' ', '_') }}
     ${{ if eq(lower(option.pool), 'windows') }}:
       pool:
-        vmImage: 'windows-2019'
+        vmImage: 'windows-2022'
     ${{ else }}:
       pool:
         ${{ if or(eq(variables['System.TeamProject'], 'public'), in(variables['Build.Reason'], 'PullRequest')) }}:
@@ -58,10 +58,10 @@
       mvn -version
 
   - task: UseDotNet@2
-    displayName: 'Use .NET Core sdk'
+    displayName: 'Use .NET 6 sdk'
     inputs:
       packageType: sdk
-      version: 3.1.x
+      version: 6.x
       installationPath: $(Agent.ToolsDirectory)/dotnet
 
   - task: DownloadBuildArtifacts@0
@@ -71,7 +71,7 @@
       downloadPath: $(Build.ArtifactStagingDirectory)
 
   - pwsh: |
-      $framework = "netcoreapp3.1"
+      $framework = "net6.0"
       if ($env:AGENT_OS -eq 'Windows_NT') {
         $runtimeIdentifier = "win-x64"
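The template changes above pin the pipeline to the .NET 6 SDK (`version: 6.x`) and point the test run at the `net6.0` framework moniker. A rough local equivalent of that test step might look like this (the test project path is an assumption, not taken from the pipeline):

```bash
# Sketch: project path assumed; the pipeline resolves its own paths.
dotnet test src/csharp/Microsoft.Spark.E2ETest/Microsoft.Spark.E2ETest.csproj \
  --framework net6.0 \
  --configuration Release
```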
72 changes: 36 additions & 36 deletions azure-pipelines.yml
@@ -56,7 +56,7 @@ stages:
   jobs:
   - job: Build
     pool:
-      vmImage: 'windows-2019'
+      vmImage: 'windows-2022'
 
     variables:
       ${{ if and(ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}:
@@ -171,7 +171,7 @@
     - Sign
     displayName: Publish Artifacts
     pool:
-      vmImage: 'windows-2019'
+      vmImage: 'windows-2022'
 
     variables:
       ${{ if and(ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}:
@@ -210,8 +210,8 @@
     forwardCompatibleRelease: $(forwardCompatibleRelease)
     tests:
     - version: '2.4.0'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -222,8 +222,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_2_4)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_2_4)
     - version: '2.4.1'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -234,8 +234,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_2_4)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_2_4)
     - version: '2.4.3'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -246,8 +246,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_2_4)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_2_4)
     - version: '2.4.4'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -258,8 +258,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_2_4)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_2_4)
     - version: '2.4.5'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -270,8 +270,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_2_4)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_2_4)
     - version: '2.4.6'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -282,8 +282,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_2_4)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_2_4)
     - version: '2.4.7'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -294,8 +294,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_2_4)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_2_4)
     - version: '2.4.8'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -306,8 +306,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_2_4)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_2_4)
     - version: '3.0.0'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -318,8 +318,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_3_0)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_3_0)
     - version: '3.0.1'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -330,8 +330,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_3_0)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_3_0)
     - version: '3.0.2'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -342,8 +342,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_3_0)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_3_0)
     - version: '3.1.1'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -354,8 +354,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_3_1)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_3_1)
     - version: '3.1.2'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -366,8 +366,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_3_1)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_3_1)
     - version: '3.2.0'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -378,8 +378,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_3_2)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_3_2)
     - version: '3.2.1'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -390,8 +390,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_3_2)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_3_2)
     - version: '3.2.2'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
@@ -402,8 +402,8 @@
         backwardCompatibleTestOptions: $(backwardCompatibleTestOptions_Linux_3_2)
         forwardCompatibleTestOptions: $(forwardCompatibleTestOptions_Linux_3_2)
     - version: '3.2.3'
-      enableForwardCompatibleTests: true
-      enableBackwardCompatibleTests: true
+      enableForwardCompatibleTests: false
+      enableBackwardCompatibleTests: false
       jobOptions:
       - pool: 'Windows'
         testOptions: ""
3 changes: 1 addition & 2 deletions benchmark/README.md
@@ -60,8 +60,7 @@ TPCH timing results is written to stdout in the following form: `TPCH_Result,<la
    <true for sql tests, false for functional tests>
    ```
 
-**Note**: Ensure that you build the worker and application with .NET Core 3.0 in order to run hardware acceleration queries.
-
+**Note**: Ensure that you build the worker and application with .NET 6 in order to run hardware acceleration queries.
 
 ## Python
 1. Upload [run_python_benchmark.sh](run_python_benchmark.sh) and all [python tpch benchmark](python/) files to the cluster.
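Per the note above, both the worker and the benchmark application now need to target .NET 6. A hedged sketch of those two builds (the Worker project path is an assumption; the Tpch path matches the project changed next in this commit):

```bash
# Sketch: Worker path is an assumption; the Tpch path appears in this commit.
dotnet publish -c Release -f net6.0 src/csharp/Microsoft.Spark.Worker/Microsoft.Spark.Worker.csproj
dotnet publish -c Release -f net6.0 benchmark/csharp/Tpch/Tpch.csproj
```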
6 changes: 3 additions & 3 deletions benchmark/csharp/Tpch/Tpch.csproj
@@ -2,8 +2,8 @@
 
   <PropertyGroup>
     <OutputType>Exe</OutputType>
-    <TargetFrameworks>net461;netcoreapp3.1</TargetFrameworks>
-    <TargetFrameworks Condition="'$(OS)' != 'Windows_NT'">netcoreapp3.1</TargetFrameworks>
+    <TargetFrameworks>net461;net6.0</TargetFrameworks>
+    <TargetFrameworks Condition="'$(OS)' != 'Windows_NT'">net6.0</TargetFrameworks>
     <RootNamespace>Tpch</RootNamespace>
     <AssemblyName>Tpch</AssemblyName>
   </PropertyGroup>
@@ -16,7 +16,7 @@
   </ItemGroup>
 
   <Choose>
-    <When Condition="'$(TargetFramework)' == 'netcoreapp3.1'">
+    <When Condition="'$(TargetFramework)' == 'net6.0'">
      <PropertyGroup>
        <AllowUnsafeBlocks>true</AllowUnsafeBlocks>
      </PropertyGroup>
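Since the project stays multi-targeted (`net461;net6.0` on Windows, `net6.0` elsewhere), a single target can still be selected explicitly at build time, for example:

```bash
# Build only the .NET 6 target of the multi-targeted benchmark project.
dotnet build benchmark/csharp/Tpch/Tpch.csproj --framework net6.0

# On Windows, the .NET Framework target remains selectable.
dotnet build benchmark/csharp/Tpch/Tpch.csproj --framework net461
```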
6 changes: 3 additions & 3 deletions deployment/README.md
@@ -63,7 +63,7 @@ Microsoft.Spark.Worker is a backend component that lives on the individual worke
 ## Azure HDInsight Spark
 [Azure HDInsight Spark](https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-overview) is the Microsoft implementation of Apache Spark in the cloud that allows users to launch and configure Spark clusters in Azure. You can use HDInsight Spark clusters to process your data stored in Azure (e.g., [Azure Storage](https://azure.microsoft.com/en-us/services/storage/) and [Azure Data Lake Storage](https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-introduction)).
 
-> **Note:** Azure HDInsight Spark is Linux-based. Therefore, if you are interested in deploying your app to Azure HDInsight Spark, make sure your app is .NET Standard compatible and that you use [.NET Core compiler](https://dotnet.microsoft.com/download) to compile your app.
+> **Note:** Azure HDInsight Spark is Linux-based. Therefore, if you are interested in deploying your app to Azure HDInsight Spark, make sure your app is .NET Standard compatible and that you use [.NET 6 compiler](https://dotnet.microsoft.com/download) to compile your app.
 
 ### Deploy Microsoft.Spark.Worker
 *Note that this step is required only once*
@@ -115,7 +115,7 @@ EOF
 ## Amazon EMR Spark
 [Amazon EMR](https://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-what-is-emr.html) is a managed cluster platform that simplifies running big data frameworks on AWS.
 
-> **Note:** AWS EMR Spark is Linux-based. Therefore, if you are interested in deploying your app to AWS EMR Spark, make sure your app is .NET Standard compatible and that you use [.NET Core compiler](https://dotnet.microsoft.com/download) to compile your app.
+> **Note:** AWS EMR Spark is Linux-based. Therefore, if you are interested in deploying your app to AWS EMR Spark, make sure your app is .NET Standard compatible and that you use [.NET 6 compiler](https://dotnet.microsoft.com/download) to compile your app.
 
 ### Deploy Microsoft.Spark.Worker
 *Note that this step is only required at cluster creation*
@@ -160,7 +160,7 @@ foo@bar:~$ aws emr add-steps \
 ## Databricks
 [Databricks](http://databricks.com) is a platform that provides cloud-based big data processing using Apache Spark.
 
-> **Note:** [Azure](https://azure.microsoft.com/en-us/services/databricks/) and [AWS](https://databricks.com/aws) Databricks is Linux-based. Therefore, if you are interested in deploying your app to Databricks, make sure your app is .NET Standard compatible and that you use [.NET Core compiler](https://dotnet.microsoft.com/download) to compile your app.
+> **Note:** [Azure](https://azure.microsoft.com/en-us/services/databricks/) and [AWS](https://databricks.com/aws) Databricks is Linux-based. Therefore, if you are interested in deploying your app to Databricks, make sure your app is .NET Standard compatible and that you use [.NET 6 compiler](https://dotnet.microsoft.com/download) to compile your app.
 
 Databricks allows you to submit Spark .NET apps to an existing active cluster or create a new cluster everytime you launch a job. This requires the **Microsoft.Spark.Worker** to be installed **first** before you submit a Spark .NET app.
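All three deployment notes above come down to the same step: compile the app with the .NET 6 SDK before uploading it to the Linux-based cluster. A minimal sketch (the app name and flags are illustrative assumptions):

```bash
# Sketch only: MySparkApp is a placeholder name.
# Produce a framework-dependent Linux build with the .NET 6 SDK.
dotnet publish MySparkApp.csproj -c Release -f net6.0 -r linux-x64 --self-contained false
```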