Standard DAQ client
Python client and CLI tools for interacting with Standard DAQ.
Install with pip:
pip install std_daq_client
The only dependency is the requests library.
Getting started
Python client
from std_daq_client import StdDaqClient

# Address of the DAQ REST server; ask your DAQ integrator for the correct one.
rest_server_url = 'http://localhost:5000'
client = StdDaqClient(url_base=rest_server_url)

# Query and update the DAQ state and configuration.
client.get_status()
client.get_config()
client.set_config(daq_config={'bit_depth': 16, 'writer_user_id': 0})
client.get_logs(n_last_logs=5)
client.get_stats()
client.get_deployment_status()

# Start an acquisition (async returns immediately, sync blocks until the acquisition
# completes) and stop or interrupt the writer.
client.start_writer_async({'output_file': '/tmp/output.h5', 'n_images': 10})
client.start_writer_sync({'output_file': '/tmp/output.h5', 'n_images': 10})
client.stop_writer()
CLI interface
- std_cli_get_status
- std_cli_get_config
- std_cli_set_config [config_file]
- std_cli_get_logs
- std_cli_get_stats
- std_cli_get_deploy_status
- std_cli_write_async [output_file] [n_images]
- std_cli_write_sync [output_file] [n_images]
- std_cli_write_stop
All CLI tools accept the --url_base parameter, which points the client to the correct API base URL. Contact
your DAQ integrator to find out this address.
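For example, querying the DAQ status against a local REST server (the address below is only a placeholder; use the one
provided by your integrator):

std_cli_get_status --url_base http://localhost:5000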
Redis interface
System state is also available on Redis in the form of Redis Streams. Each Redis instance represents one DAQ instance.
Contact your DAQ integrator to find out the host and port on which Redis is running; generally this is the
same network interface as the REST API, on the default Redis port.
Currently available streams:
- daq:status (DAQ status updates at 1 Hz, DAQ status json)
- daq:config (DAQ config pushed on change, DAQ config json)
- daq:log (DAQ logs about created files, DAQ logs json)
- daq:stat (DAQ performance statistics, DAQ stats json)
The JSON object encoded in UTF-8 is stored in the b'json' field.
Example of how to access the status stream:
import json
from redis.client import Redis

redis = Redis()

last_status_id = '$'  # '$' means: read only entries added after we start listening
while True:
    # Block until at least one new entry appears on the daq:status stream.
    response = redis.xread({"daq:status": last_status_id}, block=0)
    stream_entries = response[0][1]
    for last_status_id, entry in stream_entries:
        status = json.loads(entry[b'json'])
        print(status)
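If you only need the most recent entry instead of a live feed, you can read it with XREVRANGE. A minimal sketch
(assuming the same local Redis instance) that fetches the latest configuration from the daq:config stream:

import json
from redis.client import Redis

redis = Redis()

# XREVRANGE returns entries newest first; count=1 gives only the latest one.
entries = redis.xrevrange("daq:config", count=1)
if entries:
    entry_id, fields = entries[0]
    config = json.loads(fields[b'json'])
    print(entry_id, config)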
Interface objects
Every call returns a dictionary. If there is a state or logic problem with your request,
an instance of StdDaqAdminException will be raised. Any other failure raises a RuntimeError.
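A minimal sketch of handling these errors when starting an acquisition (importing StdDaqAdminException from the
package top level is an assumption; adjust the import to your installation):

from std_daq_client import StdDaqClient, StdDaqAdminException  # import path assumed

client = StdDaqClient(url_base='http://localhost:5000')

try:
    client.start_writer_async({'output_file': '/tmp/output.h5', 'n_images': 10})
except StdDaqAdminException as e:
    # State or logic problem with the request (e.g. the writer is not READY).
    print('Request rejected:', e)
except RuntimeError as e:
    # Any other failure (e.g. the REST server could not be reached).
    print('DAQ communication error:', e)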
The sections below describe the returned objects and their fields.
DAQ status
Represents the current state of the DAQ together with the state of the last acquisition (either completed
or still running).
This object is returned by:
- get_status
- start_writer_async
- start_writer_sync
- stop_writer
- Redis stream daq:status
{
"acquisition": {
"info": {
"n_images": 100,
"output_file": "/tmp/test.h5",
"run_id": 1684930336122153839
},
"message": "Completed.",
"state": "FINISHED",
"stats": {
"n_write_completed": 100,
"n_write_requested": 100,
"start_time": 1684930336.1252322,
"stop_time": 1684930345.2723851
}
},
"state": "READY"
}
output_file
The path is always an absolute path, or null when no acquisition has happened on the system yet.
state
State | Description |
---|---|
READY | DAQ is ready to start writing. |
WRITING | DAQ is writing at the moment. Wait for it to finish or call Stop to interrupt. |
UNKNOWN | The DAQ is in an unknown state (usually after reboot). Call Stop to reset. |
When the state of the writer is READY, the writer can receive the next write request. Otherwise, the request will
be rejected. A Stop request can always be sent and will reset the writer status to READY (use the Stop request
when the writer state is UNKNOWN to try to reset it).
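A minimal sketch of this pattern, polling get_status until the writer is READY before submitting the next write
request (the polling interval and REST address are assumptions):

import time
from std_daq_client import StdDaqClient

client = StdDaqClient(url_base='http://localhost:5000')

# If the writer is stuck in UNKNOWN, a Stop request resets it to READY.
if client.get_status()['state'] == 'UNKNOWN':
    client.stop_writer()

# Wait until the writer can accept the next write request.
while client.get_status()['state'] != 'READY':
    time.sleep(0.5)

client.start_writer_async({'output_file': '/tmp/output.h5', 'n_images': 10})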
acquisition/state
State | Description |
---|---|
FINISHED | The acquisition has finished successfully. |
FAILED | The acquisition has failed. |
WRITING | Currently receiving and writing images. |
WAITING_IMAGES | Writer is ready and waiting for images. |
ACQUIRING_IMAGES | DAQ is receiving images but writer is not writing them yet. |
FLUSHING_IMAGES | All needed images acquired, writer is flushing the buffer. |
In case of a FAILED acquisition, the acquisition/message will be set to the error that caused it to fail.
acquisition/message
Message | Description |
---|---|
Completed. | The acquisition has written all the images. |
Interrupted. | The acquisition was interrupted before it acquired all the images. |
ERROR:... | An error happened during the acquisition. |
In case of ERROR, the message will reflect what caused the acquisition to fail.
acquisition/stats/start_time, acquisition/stats/stop_time
All timestamps are Unix timestamps generated on the DAQ machine
and are not synchronized with external clock sources (clock skew can be up to hundreds of milliseconds). In particular
cases either timestamp can be null (when no acquisition has happened on the system yet).
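For example, the acquisition duration and a human-readable start time can be derived directly from these fields
(a sketch based on the status object shown above; the REST address is a placeholder):

from datetime import datetime, timezone
from std_daq_client import StdDaqClient

client = StdDaqClient(url_base='http://localhost:5000')
stats = client.get_status()['acquisition']['stats']

if stats['start_time'] is not None and stats['stop_time'] is not None:
    duration_s = stats['stop_time'] - stats['start_time']
    started_at = datetime.fromtimestamp(stats['start_time'], tz=timezone.utc)
    print(f'Acquisition started at {started_at}, took {duration_s:.1f} s')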
DAQ config
Represents the DAQ configuration loaded by all services. It describes the data source and the
way the data source is processed by the stream processors.
This object is returned by:
- get_config
- set_config
- Redis stream daq:config
{
"bit_depth": 16,
"detector_name": "EG9M",
"detector_type": "eiger",
"image_pixel_height": 3264,
"image_pixel_width": 3106,
"n_modules": 2,
"start_udp_port": 50000,
"writer_user_id": 12345,
"module_positions": {
"0": [0, 3263, 513, 3008 ],
"1": [516, 3263, 1029, 3008 ]
}
}
writer_user_id
Must be an integer representing the user_id. For e-accounts, it is simply the number after the 'e': for example,
e12345 has a user_id of 12345. For other users you can find out their user_id by running:
id -u [user_name]
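The same lookup can be done from Python with the standard library pwd module before updating the configuration
(a sketch; the account name is a placeholder):

import pwd
from std_daq_client import StdDaqClient

client = StdDaqClient(url_base='http://localhost:5000')

# Resolve the numeric user id of the account that should own the written files.
user_id = pwd.getpwnam('e12345').pw_uid
client.set_config(daq_config={'writer_user_id': user_id})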
detector_type
Possible values: eiger, gigafrost, jungfrau, bsread
You usually do not change this unless the hardware on the beamline changes.
DAQ statistics
Current data flow statistics of the DAQ.
This object is returned by:
- get_stats
- Redis stream daq:stat
{
"detector": {
"bytes_per_second": 0.0,
"images_per_second": 0.0
},
"writer": {
"bytes_per_second": 0.0,
"images_per_second": 0.0
}
}
The statistics are refreshed and aggregated at 1 Hz.
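A small sketch that follows the writer throughput by polling get_stats once per second, matching the 1 Hz refresh
rate (the REST address is a placeholder):

import time
from std_daq_client import StdDaqClient

client = StdDaqClient(url_base='http://localhost:5000')

while True:
    writer = client.get_stats()['writer']
    print(f"writer: {writer['images_per_second']:.1f} images/s, "
          f"{writer['bytes_per_second'] / 1e6:.1f} MB/s")
    time.sleep(1)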
DAQ logs
Log of all acquisitions that produced a file. It is a list of acquisition objects in reverse chronological order.
This object is returned by:
- get_logs
- Redis stream daq:log
[
{
"info": {
"n_images": 100,
"output_file": "/tmp/test.h5",
"run_id": 1684930336122153839
},
"message": "Completed.",
"state": "FINISHED",
"stats": {
"n_write_completed": 100,
"n_write_requested": 100,
"start_time": 1684930336.1252322,
"stop_time": 1684930345.2723851
}
},
{ ... }
]
If the file could not be created or another error occurred, the acquisition will not appear in this log.
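For example, the last few logged acquisitions can be listed like this (a sketch, assuming the local REST server
from the earlier examples):

from std_daq_client import StdDaqClient

client = StdDaqClient(url_base='http://localhost:5000')

# Print one line per logged acquisition, newest first.
for acq in client.get_logs(n_last_logs=5):
    info, stats = acq['info'], acq['stats']
    print(f"{info['output_file']}: {acq['state']} "
          f"({stats['n_write_completed']}/{info['n_images']} images)")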