# hive-io-domain-example

An example CQRS/ES domain module to help describe implementation details when leveraging the Hiveio framework.
## Overview

This example evolves the previous hive-io-rest-example into a highly distributed architecture in order to handle different magnitudes of network traffic for `viewed` metrics and `text` management. It is a contrived, but slightly more robust, example to illustrate different ways to use Actors in the Hiveio framework.
## Endpoints

Once you get the app running using the setup instructions below, you can use the application from the following endpoints:

- `http://localhost/contents` (GET, POST)
- `http://localhost/contents/<id>` (GET, PATCH, DELETE)

NOTE: Network data models follow the Flux Standard Action specification for network transport. `type` and `payload` are derived from the routes and the data sent, respectively, in this example.
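Per the Flux Standard Action note above, each request is carried as an action whose `type` is derived from the route and whose `payload` is derived from the data sent. A minimal sketch of that shape (the `type` naming scheme and the `text` field are illustrative assumptions, not the framework's internal API):

```javascript
// Sketch: normalize an HTTP request into a Flux Standard Action.
// The "METHOD /route" type convention and the "text" payload field
// are assumptions for illustration only.
function toAction (method, route, body) {
  return {
    type: `${method} ${route}`, // derived from the route
    payload: body               // derived from the data sent
  }
}

const action = toAction('POST', '/contents', { text: 'hello world' })
console.log(action)
// { type: 'POST /contents', payload: { text: 'hello world' } }
```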
## Getting Started

This is a straightforward CQRS/ES example of a `Content` Entity that contains text, a couple of Boolean flags, and a count of how many views it has. It is a highly distributed application with the expectation that `viewed` traffic will be much larger than `text` management traffic. It stores these `Content`s in MongoDB. It leverages Hiveio's built-in telemetry solution with OpenTelemetry. Here's how to use it.

NOTE: This example does not include error handling, authentication, or other production strategies in order to keep it straightforward.
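The `Content` Entity above can be pictured as a small record. This sketch is hypothetical (the flag names `enabled` and `edited` are assumptions); it only illustrates why the high-volume `viewed` path is kept separate from `text` management:

```javascript
// Hypothetical shape of the Content Entity described above.
// Flag names "enabled" and "edited" are assumed for illustration.
function createContent (id, text) {
  return { id, text, enabled: true, edited: false, viewCount: 0 }
}

// High-volume "viewed" traffic only increments a counter...
function viewContent (content) {
  return { ...content, viewCount: content.viewCount + 1 }
}

// ...while lower-volume "text" management rewrites the text.
function editContent (content, text) {
  return { ...content, text, edited: true }
}

const content = editContent(viewContent(createContent('1', 'hello')), 'hello world')
console.log(content.viewCount, content.text) // 1 hello world
```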
### Prerequisites

To use, you'll need:

- Docker
- Docker Compose
### Installing

To start using:

- Create the following files:
  `Producer.dockerfile`

  ```dockerfile
  FROM fnalabs/hive-producer-js:latest
  RUN npm install hive-io-domain-example
  ```

  `Stream-Processor.dockerfile`

  ```dockerfile
  FROM fnalabs/hive-stream-processor-js:latest
  RUN npm install hive-io-domain-example
  ```

  `Consumer.dockerfile`

  ```dockerfile
  FROM fnalabs/hive-consumer-js:latest
  RUN npm install hive-io-domain-example
  ```

  `Rest.dockerfile`

  ```dockerfile
  FROM fnalabs/hive-base-js:latest
  RUN npm install hive-io-domain-example
  ```
  `docker-compose.yml`

  ```yml
  version: '3.5'

  services:
    proxy:
      image: haproxy:2.3.2-alpine
      container_name: proxy
      depends_on:
        - hive-base-js
        - hive-stream-processor-js
      ports:
        - 80:80
      networks:
        - hive-io
      restart: on-failure

    hive-producer-js:
      build:
        context: .
        dockerfile: Producer.dockerfile
      image: hive-producer-js:production
      container_name: hive-producer-js
      environment:
        ACTOR: ViewContentActor
        ACTOR_LIB: hive-io-domain-example
        ACTOR_URLS: "/contents/:id"
        CLUSTER_SIZE: 1
        HTTP_VERSION: 1
        SECURE: "false"
        TELEMETRY: "true"
        TELEMETRY_URL_METRICS: "http://collector:55681/v1/metrics"
        TELEMETRY_URL_TRACES: "http://collector:55681/v1/trace"
        EVENT_STORE_TOPIC: view
        EVENT_STORE_BROKERS: "kafka:29092"
        EVENT_STORE_ID: producer-client
      depends_on:
        - collector
        - kafka
      networks:
        - hive-io

    hive-stream-processor-js:
      build:
        context: .
        dockerfile: Stream-Processor.dockerfile
      image: hive-stream-processor-js:production
      container_name: hive-stream-processor-js
      environment:
        ACTOR: ContentCommandActor
        ACTOR_LIB: hive-io-domain-example
        ACTOR_URLS: "/contents,/contents/:id"
        CLUSTER_SIZE: 1
        HTTP_VERSION: 1
        SECURE: "false"
        TELEMETRY: "true"
        TELEMETRY_URL_METRICS: "http://collector:55681/v1/metrics"
        TELEMETRY_URL_TRACES: "http://collector:55681/v1/trace"
        CACHE_URL: "redis://redis:6379"
        EVENT_STORE_PRODUCER_TOPIC: content
        EVENT_STORE_BROKERS: "kafka:29092"
        EVENT_STORE_ID: stream-processor-client
      depends_on:
        - collector
        - kafka
        - redis
      networks:
        - hive-io

    redis:
      image: redis:6.0.9-alpine
      container_name: redis
      networks:
        - hive-io
      restart: on-failure

    kafka:
      image: confluentinc/cp-kafka:5.4.3
      container_name: kafka
      depends_on:
        - zookeeper
      environment:
        KAFKA_ZOOKEEPER_CONNECT: "zookeeper:32181"
        KAFKA_ADVERTISED_LISTENERS: "PLAINTEXT://kafka:29092"
        KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
        KAFKA_COMPRESSION_TYPE: gzip
      expose:
        - 29092
      networks:
        - hive-io
      restart: on-failure

    zookeeper:
      image: confluentinc/cp-zookeeper:5.4.3
      container_name: zookeeper
      environment:
        ZOOKEEPER_CLIENT_PORT: 32181
      expose:
        - 32181
      networks:
        - hive-io
      restart: on-failure

    hive-consumer-js:
      build:
        context: .
        dockerfile: Consumer.dockerfile
      image: hive-consumer-js:production
      container_name: hive-consumer-js
      environment:
        ACTOR: ContentEventActor
        ACTOR_LIB: hive-io-domain-example
        CLUSTER_SIZE: 1
        HTTP_VERSION: 1
        SECURE: "false"
        TELEMETRY: "true"
        TELEMETRY_PLUGINS: '{"mongodb":{"enabled":true,"path":"@opentelemetry/plugin-mongodb"},"mongoose":{"enabled":true,"path":"@wdalmut/opentelemetry-plugin-mongoose"}}'
        TELEMETRY_URL_METRICS: "http://collector:55681/v1/metrics"
        TELEMETRY_URL_TRACES: "http://collector:55681/v1/trace"
        EVENT_STORE_TOPIC: "content|view"
        EVENT_STORE_BROKERS: "kafka:29092"
        EVENT_STORE_ID: consumer-client
        EVENT_STORE_GROUP_ID: consumer-group
        EVENT_STORE_FROM_START: "true"
        MONGO_URL: "mongodb://mongo:27017/contents"
      depends_on:
        - collector
        - kafka
        - mongo
      networks:
        - hive-io

    mongo:
      image: mongo:4.4.2
      container_name: mongo
      networks:
        - hive-io
      restart: on-failure

    hive-base-js:
      build:
        context: .
        dockerfile: Rest.dockerfile
      image: hive-base-js:production
      container_name: hive-base-js
      environment:
        ACTOR: ContentQueryActor
        ACTOR_LIB: hive-io-domain-example
        ACTOR_URLS: "/contents,/contents/:id"
        CLUSTER_SIZE: 1
        HTTP_VERSION: 1
        SECURE: "false"
        TELEMETRY: "true"
        TELEMETRY_PLUGINS: '{"mongodb":{"enabled":true,"path":"@opentelemetry/plugin-mongodb"},"mongoose":{"enabled":true,"path":"@wdalmut/opentelemetry-plugin-mongoose"}}'
        TELEMETRY_URL_METRICS: "http://collector:55681/v1/metrics"
        TELEMETRY_URL_TRACES: "http://collector:55681/v1/trace"
        MONGO_URL: "mongodb://mongo:27017/contents"
      depends_on:
        - collector
        - hive-producer-js
        - mongo
      networks:
        - hive-io

    collector:
      image: otel/opentelemetry-collector:0.17.0
      container_name: collector
      command: ["--config=/conf/collector-config.yml", "--log-level=ERROR"]
      depends_on:
        - zipkin
      networks:
        - hive-io
      restart: on-failure

    zipkin:
      image: openzipkin/zipkin:2.23.1
      container_name: zipkin
      ports:
        - 9411:9411
      networks:
        - hive-io
      restart: on-failure

  networks:
    hive-io:
      driver: bridge
  ```
- Run the following command:

  ```sh
  docker-compose up
  ```

NOTE: There is a chicken-or-egg scenario when you run this example for the first time. In this example, the topics are not created until events are sent from `hive-producer-js` and `hive-stream-processor-js`. Therefore, you will need to restart `hive-consumer-js` after the topics are created to finally see events flow through the system.
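Once the topics exist, the consumer can be restarted in place with Docker Compose, using the service name from the `docker-compose.yml` above:

```shell
# restart only the consumer so it can subscribe to the newly created topics
docker-compose restart hive-consumer-js
```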
## Environment Variables

The table below contains a reference to the custom environment variables used in the example. Standard environment variables are documented for all service containers.

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| MONGO_URL | String | `'mongodb://mongo:27017/post'` | URL to connect to the MongoDB instance |
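Inside a service, a variable like `MONGO_URL` is typically read from `process.env` with a fallback to the documented default; a minimal sketch (the helper name is an assumption, and the fallback mirrors the default listed above):

```javascript
// Read MONGO_URL from the environment, falling back to the
// documented default. The helper name is hypothetical.
function getMongoUrl (env = process.env) {
  return env.MONGO_URL || 'mongodb://mongo:27017/post'
}

console.log(getMongoUrl({ MONGO_URL: 'mongodb://mongo:27017/contents' }))
// mongodb://mongo:27017/contents
console.log(getMongoUrl({}))
// mongodb://mongo:27017/post
```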