
Commit

Merge pull request #70 from eitam-ring/master
added Hadoop
kubemq authored Dec 22, 2020
2 parents e656a2d + c99fd94 commit 2b9bd04
Showing 16 changed files with 1,426 additions and 4 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -68,6 +68,7 @@ A list of supported targets is below.
| | [IBM-MQ](https://developer.ibm.com/components/ibm-mq) |messaging.ibmmq | [Usage](targets/messaging/ibmmq) | [Example](examples/messaging/ibmmq) |
| Storage | | | | |
| | [Minio/S3](https://min.io/) |storage.minio | [Usage](targets/storage/minio) | [Example](examples/storage/minio) |
| | [Hadoop/HDFS](https://hadoop.apache.org/) |storage.hdfs | [Usage](targets/storage/hdfs) | [Example](examples/storage/hdfs) |
| Serverless | | | | |
| | [OpenFaas](https://www.openfaas.com/) |serverless.openfaas | [Usage](targets/serverless/openfass) | [Example](examples/serverless/openfaas) |
| Http | | | | |
9 changes: 5 additions & 4 deletions config.yaml
@@ -1,12 +1,13 @@
 bindings:
-  - name: consulkv
+  - name: hdfs
     source:
       kind: kubemq.query
       properties:
         address: localhost:50000
-        channel: query.consulkv
+        channel: query.hdfs
     target:
-      kind: stores.consulkv
+      kind: storage.hdfs
       properties:
-        address: localhost:8500
+        address: localhost:9000
+        user: test_user
     properties: {}
24 changes: 24 additions & 0 deletions examples/storage/hdfs/config.yaml
@@ -0,0 +1,24 @@
bindings:
  - name: kubemq-query-hdfs
    source:
      kind: kubemq.query
      name: kubemq-query
      properties:
        address: "kubemq-cluster:50000"
        client_id: "kubemq-query-hdfs-connector"
        auth_token: ""
        channel: "query.hdfs"
        group: ""
        auto_reconnect: "true"
        reconnect_interval_seconds: "1"
        max_reconnects: "0"
    target:
      kind: storage.hdfs
      name: hdfs
      properties:
        address: "localhost:9000"
        user: "test_user"
1 change: 1 addition & 0 deletions examples/storage/hdfs/examplefile.txt
@@ -0,0 +1 @@
My example file to upload
37 changes: 37 additions & 0 deletions examples/storage/hdfs/main.go
@@ -0,0 +1,37 @@
package main

import (
	"context"
	"log"
	"time"

	"github.com/kubemq-hub/kubemq-targets/types"
	"github.com/kubemq-io/kubemq-go"
	"github.com/nats-io/nuid"
)

func main() {
	client, err := kubemq.NewClient(context.Background(),
		kubemq.WithAddress("kubemq-cluster", 50000),
		kubemq.WithClientId(nuid.Next()),
		kubemq.WithTransportType(kubemq.TransportTypeGRPC))
	if err != nil {
		log.Fatal(err)
	}
	// stat request - query the metadata of /test/foo.txt on HDFS
	statRequest := types.NewRequest().
		SetMetadataKeyValue("file_path", "/test/foo.txt").
		SetMetadataKeyValue("method", "stat")
	queryStatResponse, err := client.SetQuery(statRequest.ToQuery()).
		SetChannel("query.hdfs").
		SetTimeout(10 * time.Second).Send(context.Background())
	if err != nil {
		log.Fatal(err)
	}
	statResponse, err := types.ParseResponse(queryStatResponse.Body)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("stat executed, response: %s", statResponse.Data)
}
1 change: 1 addition & 0 deletions go.mod
@@ -28,6 +28,7 @@ require (
github.com/aws/aws-sdk-go v1.34.31
github.com/bradfitz/gomemcache v0.0.0-20190913173617-a41fca850d0b
github.com/cockroachdb/cockroach-go v2.0.1+incompatible
github.com/colinmarc/hdfs/v2 v2.1.1
github.com/couchbase/gocb/v2 v2.1.6
github.com/denisenkom/go-mssqldb v0.0.0-20200910202707-1e08a3fab204
github.com/eclipse/paho.mqtt.golang v1.2.0
201 changes: 201 additions & 0 deletions targets/storage/hdfs/README.md
@@ -0,0 +1,201 @@
# Kubemq Hadoop Target Connector

The Kubemq Hadoop target connector allows services using the Kubemq server to access a Hadoop HDFS service.

## Prerequisites
The following are required to run the Hadoop target connector:

- kubemq cluster
- active Hadoop (HDFS) server
- kubemq-source deployment

## Configuration

hadoop target connector configuration properties:

| Properties Key | Required | Description | Example |
|:---------------|:---------|:-------------------------------------------|:----------------------------|
| address | yes | hadoop address | "localhost:9000" |
| user | no | hadoop user | "my_user" |


Example:

```yaml
bindings:
  - name: kubemq-query-hadoop
    source:
      kind: kubemq.query
      name: kubemq-query
      properties:
        address: "kubemq-cluster:50000"
        client_id: "kubemq-query-hadoop-connector"
        auth_token: ""
        channel: "query.hdfs"
        group: ""
        auto_reconnect: "true"
        reconnect_interval_seconds: "1"
        max_reconnects: "0"
    target:
      kind: storage.hdfs
      name: hadoop
      properties:
        address: "localhost:9000"
        user: "my_user"
```
## Usage
### Read File
Read File:

| Metadata Key | Required | Description | Possible values |
|:------------------|:---------|:----------------------------------------|:-------------------------------------------|
| file_path | yes | path to file | "/test/foo2.txt" |
| method | yes | type of method | "read_file" |

Example:
```json
{
"metadata": {
"method": "read_file",
"file_path": "/test/foo2.txt"
},
"data": null
}
```
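
For orientation, a minimal Go sketch of sending this request through the kubemq-go client is shown below. It follows the same pattern as `examples/storage/hdfs/main.go`; the cluster address, channel name (`query.hdfs`), and file path are illustrative and assume the binding shown in the configuration example above.

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/kubemq-hub/kubemq-targets/types"
	"github.com/kubemq-io/kubemq-go"
	"github.com/nats-io/nuid"
)

func main() {
	// Connect to the KubeMQ cluster (address is illustrative).
	client, err := kubemq.NewClient(context.Background(),
		kubemq.WithAddress("kubemq-cluster", 50000),
		kubemq.WithClientId(nuid.Next()),
		kubemq.WithTransportType(kubemq.TransportTypeGRPC))
	if err != nil {
		log.Fatal(err)
	}
	// Build a read_file request for the hdfs target.
	readRequest := types.NewRequest().
		SetMetadataKeyValue("method", "read_file").
		SetMetadataKeyValue("file_path", "/test/foo2.txt")
	// Send it as a query on the channel bound to storage.hdfs.
	queryResponse, err := client.SetQuery(readRequest.ToQuery()).
		SetChannel("query.hdfs").
		SetTimeout(10 * time.Second).Send(context.Background())
	if err != nil {
		log.Fatal(err)
	}
	readResponse, err := types.ParseResponse(queryResponse.Body)
	if err != nil {
		log.Fatal(err)
	}
	// The file contents are returned in the response data.
	log.Printf("read_file executed, contents: %s", readResponse.Data)
}
```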


### Write File

Write File:

| Metadata Key | Required | Description | Possible values |
|:------------------|:---------|:----------------------------------------|:-------------------------------------------|
| file_path | yes | path to file | "/test/foo2.txt" |
| method | yes | type of method | "write_file" |
| file_mode          | no       | os permission mode, default (0777)       | "0777"                                      |
| data               | yes      | file content as a base64-encoded byte array | "TXkgZXhhbXBsZSBmaWxlIHRvIHVwbG9hZA=="  |




Example:

```json
{
"metadata": {
"method": "write_file",
"file_path": "/test/foo2.txt"
},
"data": "TXkgZXhhbXBsZSBmaWxlIHRvIHVwbG9hZA=="
}
```
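
The `data` field carries the raw file bytes; over JSON they appear base64-encoded, while from Go they can be attached with `SetData` on the request. Below is a minimal sketch under the same assumptions as the read example above (cluster address, channel, and paths are illustrative; `SetData` and the response fields follow the kubemq-targets `types` package as used elsewhere in this repository).

```go
package main

import (
	"context"
	"io/ioutil"
	"log"
	"time"

	"github.com/kubemq-hub/kubemq-targets/types"
	"github.com/kubemq-io/kubemq-go"
	"github.com/nats-io/nuid"
)

func main() {
	client, err := kubemq.NewClient(context.Background(),
		kubemq.WithAddress("kubemq-cluster", 50000),
		kubemq.WithClientId(nuid.Next()),
		kubemq.WithTransportType(kubemq.TransportTypeGRPC))
	if err != nil {
		log.Fatal(err)
	}
	// Read the local file to upload; when the request is expressed as JSON,
	// these bytes show up base64-encoded in the "data" field.
	contents, err := ioutil.ReadFile("examplefile.txt")
	if err != nil {
		log.Fatal(err)
	}
	// Build a write_file request and attach the file bytes.
	writeRequest := types.NewRequest().
		SetMetadataKeyValue("method", "write_file").
		SetMetadataKeyValue("file_path", "/test/foo2.txt").
		SetData(contents)
	queryResponse, err := client.SetQuery(writeRequest.ToQuery()).
		SetChannel("query.hdfs").
		SetTimeout(10 * time.Second).Send(context.Background())
	if err != nil {
		log.Fatal(err)
	}
	writeResponse, err := types.ParseResponse(queryResponse.Body)
	if err != nil {
		log.Fatal(err)
	}
	// Print the parsed response metadata for illustration.
	log.Printf("write_file executed, response metadata: %v", writeResponse.Metadata)
}
```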

### Remove File

Remove File:

| Metadata Key | Required | Description | Possible values |
|:------------------|:---------|:----------------------------------------|:-------------------------------------------|
| file_path | yes | path to file | "/test/foo2.txt" |
| method | yes | type of method | "remove_file" |




Example:

```json
{
"metadata": {
"method": "remove_file",
"file_path": "/test/foo2.txt"
},
"data": null
}
```

### Rename File

Rename File:

| Metadata Key | Required | Description | Possible values |
|:------------------|:---------|:----------------------------------------|:-------------------------------------------|
| file_path | yes | new path to file | "/test/foo3.txt" |
| old_file_path      | yes      | old path to file                         | "/test/foo2.txt"                            |
| method | yes | type of method | "rename_file" |




Example:

```json
{
"metadata": {
"method": "rename_file",
"file_path": "/test/foo3.txt",
"old_file_path": "/test/foo2.txt"
},
"data": null
}
```

### Make Dir

Make Dir:

| Metadata Key | Required | Description | Possible values |
|:------------------|:---------|:----------------------------------------|:-------------------------------------------|
| file_path          | yes      | path of directory to create              | "/test_folder"                              |
| file_mode          | no       | os permission mode, default (0777)       | "0777"                                      |
| method | yes | type of method | "mkdir" |




Example:

```json
{
"metadata": {
"method": "mkdir",
"file_path": "/test_folder"
},
"data": null
}
```

### Stat

Stat:

| Metadata Key | Required | Description | Possible values |
|:------------------|:---------|:----------------------------------------|:-------------------------------------------|
| file_path          | yes      | path to file                             | "/test/foo3.txt"                            |
| method | yes | type of method | "stat" |




Example:

```json
{
"metadata": {
"method": "stat",
"file_path": "/test/foo2.txt"
},
"data": null
}
```
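
A complete, runnable Go example that sends this stat request over the `query.hdfs` channel is included in this commit at `examples/storage/hdfs/main.go`.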
