author    earrage <eddie.arrage@huawei.com>  2018-10-22 13:53:54 -0700
committer earrage <eddie.arrage@huawei.com>  2018-10-22 14:04:38 -0700
commit    2139f983fbe71bf6411259e8ddb460a79663dcb8 (patch)
tree      0bf27000a9a1a73eebfab83f844b403fad45a5c1 /clover/monitoring/__init__.py
parent    ee2169ee4b8fb3539ad173fbc1557b54b2f2216f (diff)
Initial commit for Spark to analyze visibility data
- Add Apache Spark 2.3 with native Kubernetes support.
- Runs self-contained within the K8s cluster in the clover-system
namespace. One container (clover-spark) includes the Clover Spark
JAR artifact. This container interacts with the
K8s API to spawn a spark-driver pod, which in turn spawns executor
pods to execute Spark jobs.
- Currently the JAR is included in the source for convenience and must
be built with sbt (install sbt and execute sbt package).
- Includes a JAR from DataStax that provides the Cassandra connector,
used to analyze Cassandra schemas as RDDs (Resilient Distributed
Datasets).
- Includes a Redis interface JAR to write analyzed data back to
visibility (UI, CLI or API).
- Second container (clover-spark-submit) submits Spark jobs
continuously to allow Spark to be operated entirely within the cluster.
- Two Spark jobs (CloverSlow, CloverFast) allow some analytics to be
provided in real time and other analytics to be provided over longer
horizons.
- Each Spark job spawns two executor pods.
- Includes a yaml manifest to deploy clover-spark-submit with the
necessary RBAC permissions to interact with the K8s API.
- Data analyzed includes tracing and metrics schemas obtained by
clover-collector and written to Cassandra.
- Docker builds of clover-spark and clover-spark-submit are provided
and will be pushed as OPNFV DockerHub images in a separate patch.
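The RBAC manifest mentioned above could look roughly like the following sketch. The service-account, role and binding names are assumptions for illustration and are not taken from the actual manifest in this patch.

```yaml
# Illustrative RBAC sketch only: all names here are assumed.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: clover-spark
  namespace: clover-system
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: spark-driver
  namespace: clover-system
rules:
- apiGroups: [""]
  resources: ["pods", "services", "configmaps"]
  verbs: ["get", "list", "watch", "create", "delete"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: spark-driver-binding
  namespace: clover-system
subjects:
- kind: ServiceAccount
  name: clover-spark
  namespace: clover-system
roleRef:
  kind: Role
  name: spark-driver
  apiGroup: rbac.authorization.k8s.io
```

Permissions on pods are the minimum needed for the driver to spawn and tear down executor pods through the K8s API.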
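The submission flow described above (clover-spark-submit driving Spark 2.3's native Kubernetes scheduler) can be sketched as the assembly of a spark-submit invocation. The master URL, image name and JAR path below are illustrative assumptions, not values taken from this patch; only the job class names and the two-executor count come from the commit message.

```python
# Sketch of a spark-submit command for Spark 2.3 native Kubernetes support.
# MASTER, IMAGE and JAR are assumed placeholders, not values from the patch.
MASTER = "k8s://https://kubernetes.default.svc"  # in-cluster K8s API (assumed)
IMAGE = "clover-spark:latest"                    # assumed container image name
JAR = "local:///opt/clover-spark.jar"            # assumed JAR location in image

def build_submit_cmd(job_class, namespace="clover-system", executors=2):
    """Assemble the argument list for submitting one Spark job to K8s."""
    return [
        "spark-submit",
        "--master", MASTER,
        "--deploy-mode", "cluster",
        "--class", job_class,
        "--conf", f"spark.kubernetes.namespace={namespace}",
        "--conf", f"spark.executor.instances={executors}",
        "--conf", f"spark.kubernetes.container.image={IMAGE}",
        JAR,
    ]

# Each of the two jobs named in this commit would be submitted this way.
print(" ".join(build_submit_cmd("CloverSlow")))
print(" ".join(build_submit_cmd("CloverFast")))
```

With --deploy-mode cluster, spark-submit creates the spark-driver pod via the K8s API, and the driver then requests the executor pods, matching the pod flow described above.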
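As a rough illustration of the fast/slow analytics split (not the actual CloverSlow/CloverFast logic, which lives in the Scala JAR), the sample records and field names below are invented stand-ins for rows that clover-collector writes to Cassandra:

```python
# Hypothetical trace records; field names are invented for illustration.
spans = [
    {"service": "proxy-a", "duration_ms": 12},
    {"service": "proxy-a", "duration_ms": 20},
    {"service": "proxy-b", "duration_ms": 7},
]

def fast_metrics(batch):
    """Real-time style analytics: per-service span counts for one batch."""
    counts = {}
    for s in batch:
        counts[s["service"]] = counts.get(s["service"], 0) + 1
    return counts

def slow_metrics(batch):
    """Longer-horizon analytics: mean span duration per service."""
    totals, counts = {}, {}
    for s in batch:
        totals[s["service"]] = totals.get(s["service"], 0) + s["duration_ms"]
        counts[s["service"]] = counts.get(s["service"], 0) + 1
    return {svc: totals[svc] / counts[svc] for svc in totals}

print(fast_metrics(spans))  # per-service counts, cheap enough for real time
print(slow_metrics(spans))  # per-service mean latency over a longer window
```

In the real jobs these aggregations would run over Cassandra RDDs, with results written back to Redis for the visibility UI, CLI and API.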
Change-Id: I2e92c41fd75d4ebba948c0f8cb60face57005e50
Signed-off-by: earrage <eddie.arrage@huawei.com>
Diffstat (limited to 'clover/monitoring/__init__.py')
0 files changed, 0 insertions, 0 deletions