hive-io-domain-example


An example CQRS/ES domain module to help describe implementation details when leveraging the Hiveio framework.

Contents

  • Overview
  • Endpoints
  • Getting Started
  • Environment Variables

Overview

This example evolves the previous hive-io-rest-example into a highly distributed architecture in order to handle different magnitudes of network traffic for viewed metrics and text management. It is a contrived but slightly more robust example to illustrate different ways to use Actors in the Hiveio framework.

Endpoints

Once you have the app running using the setup instructions below, you can use the application from the following endpoints:

  • http://localhost/contents (GET, POST)
    • POST API JSON Schema
      {
        "text": "something"
      }
      
  • http://localhost/contents/<id> (GET, PATCH, DELETE)

NOTE: Network data models follow the Flux Standard Action specification for network transport. In this example, type and payload are derived from the routes and the data sent, respectively.
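
As a quick illustration of the routes above, here is a hedged sketch that exercises each documented method with curl once the stack from the Installing section is running. The POST body follows the JSON Schema shown above; the PATCH body shape is an assumption, and <id> stands in for the identifier of an existing Content entry.

    # create a new Content entry
    curl -X POST http://localhost/contents \
      -H 'Content-Type: application/json' \
      -d '{"text": "something"}'

    # list and read Content entries
    curl http://localhost/contents
    curl http://localhost/contents/<id>

    # update the text of an existing Content entry (body shape is an assumption)
    curl -X PATCH http://localhost/contents/<id> \
      -H 'Content-Type: application/json' \
      -d '{"text": "something else"}'

    # delete a Content entry
    curl -X DELETE http://localhost/contents/<id>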

Getting Started

This is a straightforward CQRS/ES example of a Content Entity that contains text, a couple of Boolean flags, and a count of how many views it has. It is a highly distributed application built with the expectation that viewed traffic will be much larger than text management traffic. It stores these Contents in MongoDB and leverages Hiveio's built-in telemetry solution with OpenTelemetry. Here's how to use it.
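
If you want to inspect the stored data while experimenting, the minimal sketch below (assuming the docker-compose.yml defined in the Installing section) queries the contents database in the bundled mongo container. The collection name is not documented here, so list the collections first; <collection> is a hypothetical placeholder.

    # list the collections the example has created in the "contents" database
    docker-compose exec mongo mongo contents --quiet --eval "db.getCollectionNames()"

    # print one document from a collection once you know its name
    docker-compose exec mongo mongo contents --quiet --eval "printjson(db.getCollection('<collection>').findOne())"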

NOTE: This does not include error handling, authentication, and other such strategies in order to keep the example straightforward.

Prerequisites

To use, you'll need:

  • Docker
  • Docker Compose

Installing

To start using:

  1. Create the following files:
    • Producer.dockerfile
      FROM fnalabs/hive-producer-js:latest
      RUN npm install hive-io-domain-example
      
    • Stream-Processor.dockerfile
      FROM fnalabs/hive-stream-processor-js:latest
      RUN npm install hive-io-domain-example
      
    • Consumer.dockerfile
      FROM fnalabs/hive-consumer-js:latest
      RUN npm install hive-io-domain-example
      
    • Rest.dockerfile
      FROM fnalabs/hive-base-js:latest
      RUN npm install hive-io-domain-example
      
    • docker-compose.yml
      version: '3.5'
      services:
        # proxy for layer 7 routing
        # NOTE: this is an example, you will need to define your own config
        #       ex. https://github.com/fnalabs/hive-io/tree/master/dev/proxy
        proxy:
          image: haproxy:2.3.2-alpine
          container_name: proxy
          depends_on:
            - hive-base-js
            - hive-stream-processor-js
          ports:
            - 80:80
          networks:
            - hive-io
          restart: on-failure
      
        # producers
        hive-producer-js:
          build:
            context: .
            dockerfile: Producer.dockerfile
          image: hive-producer-js:production
          container_name: hive-producer-js
          environment:
            ACTOR: ViewContentActor
            ACTOR_LIB: hive-io-domain-example
            ACTOR_URLS: "/contents/:id"
            CLUSTER_SIZE: 1
            HTTP_VERSION: 1
            SECURE: "false"
            TELEMETRY: "true"
            TELEMETRY_URL_METRICS: "http://collector:55681/v1/metrics"
            TELEMETRY_URL_TRACES: "http://collector:55681/v1/trace"
            EVENT_STORE_TOPIC: view
            EVENT_STORE_BROKERS: "kafka:29092"
            EVENT_STORE_ID: producer-client
          depends_on:
            - collector
            - kafka
          networks:
            - hive-io
      
        # stream processors
        hive-stream-processor-js:
          build:
            context: .
            dockerfile: Stream-Processor.dockerfile
          image: hive-stream-processor-js:production
          container_name: hive-stream-processor-js
          environment:
            ACTOR: ContentCommandActor
            ACTOR_LIB: hive-io-domain-example
            ACTOR_URLS: "/contents,/contents/:id"
            CLUSTER_SIZE: 1
            HTTP_VERSION: 1
            SECURE: "false"
            TELEMETRY: "true"
            TELEMETRY_URL_METRICS: "http://collector:55681/v1/metrics"
            TELEMETRY_URL_TRACES: "http://collector:55681/v1/trace"
            CACHE_URL: "redis://redis:6379"
            EVENT_STORE_PRODUCER_TOPIC: content
            EVENT_STORE_BROKERS: "kafka:29092"
            EVENT_STORE_ID: stream-processor-client
          depends_on:
            - collector
            - kafka
            - redis
          networks:
            - hive-io
        redis:
          image: redis:6.0.9-alpine
          container_name: redis
          networks:
            - hive-io
          restart: on-failure
      
        # log stream containers
        kafka:
          image: confluentinc/cp-kafka:5.4.3
          container_name: kafka
          depends_on:
            - zookeeper
          environment:
            KAFKA_ZOOKEEPER_CONNECT: "zookeeper:32181"
            KAFKA_ADVERTISED_LISTENERS: "PLAINTEXT://kafka:29092"
            KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
            KAFKA_COMPRESSION_TYPE: gzip
          expose:
            - 29092
          networks:
            - hive-io
          restart: on-failure
        zookeeper:
          image: confluentinc/cp-zookeeper:5.4.3
          container_name: zookeeper
          environment:
            ZOOKEEPER_CLIENT_PORT: 32181
          expose:
            - 32181
          networks:
            - hive-io
          restart: on-failure
      
        # consumers
        hive-consumer-js:
          build:
            context: .
            dockerfile: Consumer.dockerfile
          image: hive-consumer-js:production
          container_name: hive-consumer-js
          environment:
            ACTOR: ContentEventActor
            ACTOR_LIB: hive-io-domain-example
            CLUSTER_SIZE: 1
            HTTP_VERSION: 1
            SECURE: "false"
            TELEMETRY: "true"
            TELEMETRY_PLUGINS: '{"mongodb":{"enabled":true,"path":"@opentelemetry/plugin-mongodb"},"mongoose":{"enabled":true,"path":"@wdalmut/opentelemetry-plugin-mongoose"}}'
            TELEMETRY_URL_METRICS: "http://collector:55681/v1/metrics"
            TELEMETRY_URL_TRACES: "http://collector:55681/v1/trace"
            EVENT_STORE_TOPIC: "content|view"
            EVENT_STORE_BROKERS: "kafka:29092"
            EVENT_STORE_ID: consumer-client
            EVENT_STORE_GROUP_ID: consumer-group
            EVENT_STORE_FROM_START: "true"
            MONGO_URL: "mongodb://mongo:27017/contents"
          depends_on:
            - collector
            - kafka
            - mongo
          networks:
            - hive-io
        mongo:
          image: mongo:4.4.2
          container_name: mongo
          networks:
            - hive-io
          restart: on-failure
      
        # rest services
        hive-base-js:
          build:
            context: .
            dockerfile: Rest.dockerfile
          image: hive-base-js:production
          container_name: hive-base-js
          environment:
            ACTOR: ContentQueryActor
            ACTOR_LIB: hive-io-domain-example
            ACTOR_URLS: "/contents,/contents/:id"
            CLUSTER_SIZE: 1
            HTTP_VERSION: 1
            SECURE: "false"
            TELEMETRY: "true"
            TELEMETRY_PLUGINS: '{"mongodb":{"enabled":true,"path":"@opentelemetry/plugin-mongodb"},"mongoose":{"enabled":true,"path":"@wdalmut/opentelemetry-plugin-mongoose"}}'
            TELEMETRY_URL_METRICS: "http://collector:55681/v1/metrics"
            TELEMETRY_URL_TRACES: "http://collector:55681/v1/trace"
            MONGO_URL: "mongodb://mongo:27017/contents"
          depends_on:
            - collector
            - hive-producer-js
            - mongo
          networks:
            - hive-io
      
        # telemetry
        # NOTE: this is an example, you will need to define your own config
        #       ex. https://github.com/fnalabs/hive-io/tree/master/dev/collector
        collector:
          image: otel/opentelemetry-collector:0.17.0
          container_name: collector
          command: ["--config=/conf/collector-config.yml", "--log-level=ERROR"]
          depends_on:
            - zipkin
          networks:
            - hive-io
          restart: on-failure
        zipkin:
          image: openzipkin/zipkin:2.23.1
          container_name: zipkin
          ports:
            - 9411:9411
          networks:
            - hive-io
          restart: on-failure
      
      # networking
      networks:
        hive-io:
          driver: bridge
      
  2. Run the following command:
    docker-compose up
    

NOTE: There is a chicken-and-egg scenario when you run this example for the first time. The Kafka topics are not created until events are sent from hive-producer-js and hive-stream-processor-js. Therefore, you will need to restart hive-consumer-js after the topics are created to finally see events flow through the system.
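
With the docker-compose.yml above, that restart is a single command; the second line is an optional way to watch the consumer's logs and confirm that events are flowing. If the collector is configured to export traces to Zipkin, they can be viewed at http://localhost:9411.

    # restart only the consumer after the Kafka topics exist
    docker-compose restart hive-consumer-js

    # follow the consumer's logs to confirm events are being processed
    docker-compose logs -f hive-consumer-js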

Environment Variables

The table below is a reference for the custom environment variables used in this example. The standard environment variables for each service container are covered in that container's own documentation.

Name      | Type   | Default                      | Description
MONGO_URL | String | 'mongodb://mongo:27017/post' | URL to connect to the MongoDB instance
