
Add the possibility to publish all things programmatically

Description

  • It should be possible to mimic the behavior of the current frontend when you select all things and do a bulk save
  • As far as I can see, run_configdb_updater.py publishes a message for each saved thing, containing the thing_uuid, to the topic ${TOPIC_CONFIG_DB_UPDATE} (passed via env):
    payload = json.dumps({"thing": thing_uuid})
    client.publish(topic=pub_topic, payload=payload, qos=pub_qos)
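Mimicking a bulk save would mean repeating that per-thing publish for every thing. A minimal sketch, assuming the same payload format and a paho-style client with a `publish(topic, payload, qos)` method (the uuid list and client are placeholders):

```python
import json


def build_payloads(thing_uuids):
    """Build one MQTT payload per thing, matching the updater's format."""
    return [json.dumps({"thing": uuid}) for uuid in thing_uuids]


def publish_all(client, topic, thing_uuids, qos=1):
    """Publish one message per thing, as the frontend bulk save would."""
    for payload in build_payloads(thing_uuids):
        client.publish(topic=topic, payload=payload, qos=qos)
```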

Thoughts

  • My first idea was to use an existing service for this, but that seems wrong, as every service should do its own thing
  • So I'll create a new service for it
  • And an extra docker-compose.yml, so it won't be started by default

Deprecated Todo

  • create a new service in docker-compose.yml
    • it will have the same settings as the service worker-configdb-updater:
    image: "${TIMEIO_IMAGE_REGISTRY}/configdb-updater:${TIMEIO_CONFIGDB_UPDATER_IMAGE_TAG}"
    build:
      context: .
      dockerfile: configdb_updater/Dockerfile
      args:
        UID: "${UID}"
        BASE_IMAGE_TAG: "${CONFIGDB_UPDATER_PYTHON_BASE_IMAGE_TAG}"
    restart: "${SERVICE_RESTART_POLICY}"
    depends_on:
      mqtt-broker:
        condition: service_healthy
    environment:
      LOG_LEVEL: "${LOG_LEVEL}"
      MQTT_BROKER_HOST: mqtt-broker
      MQTT_BROKER_PORT: 1883
      MQTT_USER: "${MQTT_USER}"
      MQTT_PASSWORD: "${MQTT_PASSWORD}"
      MQTT_CLIENT_ID: all-things-publisher
      MQTT_CLEAN_SESSION: "${MQTT_CLEAN_SESSION}"
      MQTT_SUBSCRIBE_QOS: "${MQTT_QOS}"
      MQTT_PUBLISH_TOPIC: "${TOPIC_CONFIG_DB_UPDATE}"
      MQTT_PUBLISH_QOS: "${MQTT_QOS}"
      CONFIGDB_CONNECTION_INITIAL_TIMEOUT: 10
      CONFIGDB_DSN: "${DATABASE_ADMIN_DSN}"
    logging:
      options:
        max-size: "${DEFAULT_MAX_LOG_FILE_SIZE}"
        max-file: "${DEFAULT_MAX_LOG_FILE_COUNT}"
  • in the code, I'll create a class
    • the constructor should set up the database access like in sync_sms_materialized_views.py
    • it should loop through all things saved in configdb and publish them using publish_single from timeio.mqtt
  • create an extra docker-compose.yml for that service
  • create extra shell script
  • change image: "${TIMEIO_IMAGE_REGISTRY}/configdb-updater:${TIMEIO_CONFIGDB_UPDATER_IMAGE_TAG}" to its own env variables
  • build image in pipeline
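The class described in the todo could look roughly like this. It is a sketch under assumptions: the configdb query and the publish call are injected as callables, because the real table layout behind CONFIGDB_DSN and the exact signature of timeio.mqtt's publish_single are not shown in the source.

```python
import json
import os


class AllThingsPublisher:
    """Publish a config-db update message for every thing in configdb.

    fetch_uuids and publish are injected to keep the class testable; in
    the real service they would wrap the configdb query (via CONFIGDB_DSN)
    and publish_single from timeio.mqtt, whose exact signatures are
    assumptions here.
    """

    def __init__(self, fetch_uuids, publish, topic=None):
        self.fetch_uuids = fetch_uuids
        self.publish = publish
        self.topic = topic or os.environ.get("MQTT_PUBLISH_TOPIC", "")

    def run(self):
        """Publish one message per thing; return the number published."""
        count = 0
        for uuid in self.fetch_uuids():
            # Same payload format as in run_configdb_updater.py
            self.publish(self.topic, json.dumps({"thing": uuid}))
            count += 1
        return count
```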

New Idea

Just use the most recent image of the configdb_updater service to run the script
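With that approach no new service definition is needed; a one-off run against the existing service could look like this (the script name is hypothetical, and the service name is taken from the compose snippet above):

```shell
# Run the publish-all script as a one-off container using the
# existing configdb-updater image; the script path is an assumption.
docker compose run --rm worker-configdb-updater \
    python publish_all_things.py
```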

Edited by Joost Hemmen