apstools
Various Python tools for use with Bluesky at the APS
Package Information

version:   1.6.1+3.gbe718c4.dirty
published: 2022-05-25 16:48
copyright: 2017-2022, UChicago Argonne, LLC
license:   ANL OPEN SOURCE LICENSE (see LICENSE.txt file)
author:    Pete R. Jemian <jemian@anl.gov>
Installation
The apstools package is available for installation by conda, pip, or from source.
conda
If you are using Anaconda Python and have conda installed, install with this command:
$ conda install -c aps-anl-tag apstools
pip
Released versions of apstools are available on PyPI.
If you have pip
installed, then you can install:
$ pip install apstools
source
The latest development versions of apstools can be downloaded from the GitHub repository:
$ git clone http://github.com/BCDA-APS/apstools.git
To install in the standard Python location:
$ cd apstools
$ python setup.py install
To install in the user's home directory:
$ python setup.py install --user
To install in an alternate location:
$ python setup.py install --prefix=/path/to/installation/dir
Required Libraries
The repository's environment.yml file lists the additional packages required by apstools. Most packages are available as conda packages from https://anaconda.org. The others are available on https://PyPI.python.org. Among the required packages:
python>=3.7
bluesky, databroker, ophyd
h5py
pandas
pyEpics
pyqt=5
pyRestTable
qt=5
spec2nexus
xlrd
Applications
bluesky_snapshot
Take a snapshot of a list of EPICS PVs and record it in the databroker.
Retrieve (and display) that snapshot later using
apstools.callbacks.snapshot_report.SnapshotReport
or the bluesky_snapshot_viewer graphical user interface.
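For example, here is a hedged sketch of retrieving the most recent snapshot (assuming a configured databroker Broker instance named db; the same pattern appears in the SnapshotReport documentation later in this document):
from apstools.callbacks import SnapshotReport

# query the databroker for snapshot runs in a date range
headers = db(plan_name="snapshot", since="2018-12-15", until="2018-12-21")
h = list(headers)[0]  # the first one is the most recent
SnapshotReport().print_report(h)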
Example - command line
Before using the command-line interface, find out what the bluesky_snapshot expects:
$ bluesky_snapshot -h
usage: bluesky_snapshot [-h] [-b BROKER_CONFIG] [-m METADATA_SPEC] [-r] [-v]
EPICS_PV [EPICS_PV ...]
record a snapshot of some PVs using Bluesky, ophyd, and databroker
version=0.0.40+26.g323cd35
positional arguments:
EPICS_PV EPICS PV name
optional arguments:
-h, --help show this help message and exit
-b BROKER_CONFIG YAML configuration for databroker, default:
mongodb_config
-m METADATA_SPEC, --metadata METADATA_SPEC
additional metadata, enclose in quotes, such as -m
"purpose=just tuned, situation=routine"
-r, --report suppress snapshot report
-v, --version show program's version number and exit
As the help shows, the default for BROKER_CONFIG is “mongodb_config”, a YAML file in one of the default locations where the databroker expects to find it. That’s what we have.
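In Python, that same named configuration is loaded like this (the identical call appears in the RunEngine setup later in this document):
# load config from ~/.config/databroker/mongodb_config.yml
from databroker import Broker
db = Broker.named("mongodb_config")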
We want to snapshot just a couple of PVs to show basic use. Here are their current values:
$ caget prj:IOC_CPU_LOAD prj:SYS_CPU_LOAD
prj:IOC_CPU_LOAD 0.900851
prj:SYS_CPU_LOAD 4.50426
Here’s the snapshot (we’ll also set a metadata that says this is an example):
$ bluesky_snapshot prj:IOC_CPU_LOAD prj:SYS_CPU_LOAD -m "purpose=example"
========================================
snapshot: 2019-01-03 17:02:42.922197
========================================
hints: {}
hostname: mint-vm
iso8601: 2019-01-03 17:02:42.922197
login_id: mintadmin@mint-vm
plan_description: archive snapshot of ophyd Signals (usually EPICS PVs)
plan_name: snapshot
plan_type: generator
purpose: example
scan_id: 1
software_versions: {'python': '3.6.6 |Anaconda custom (64-bit)| (default, Jun 28 2018, 17:14:51) \n[GCC 7.2.0]', 'PyEpics': '3.3.1', 'bluesky': '1.4.1', 'ophyd': '1.3.0', 'databroker': '0.11.3', 'apstools': '0.0.40+26.g323cd35.dirty'}
time: 1546556562.9231327
uid: 98a86a91-d41e-4965-a048-afa5b982a17c
username: mintadmin
========================== ====== ================ ==================
timestamp source name value
========================== ====== ================ ==================
2019-01-03 17:02:33.930067 PV prj:IOC_CPU_LOAD 0.8007421685989062
2019-01-03 17:02:33.930069 PV prj:SYS_CPU_LOAD 10.309472772459404
========================== ====== ================ ==================
exit_status: success
num_events: {'primary': 1}
run_start: 98a86a91-d41e-4965-a048-afa5b982a17c
time: 1546556563.1087885
uid: 026fa69c-45b7-4b45-a3b3-266aadbf7176
We have two more IOCs (gov and otz) that have the same PVs. Let’s get them, too:
$ bluesky_snapshot {gov,otz}:{IOC,SYS}_CPU_LOAD -m "purpose=this is an example, example=example 2"
========================================
snapshot: 2018-12-20 18:21:53.371995
========================================
example: example 2
hints: {}
iso8601: 2018-12-20 18:21:53.371995
plan_description: archive snapshot of ophyd Signals (usually EPICS PVs)
plan_name: snapshot
plan_type: generator
purpose: this is an example
scan_id: 1
software_versions: {'python': '3.6.2 |Continuum Analytics, Inc.| (default, Jul 20 2017, 13:51:32) \n[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)]', 'PyEpics': '3.3.1', 'bluesky': '1.4.1', 'ophyd': '1.3.0', 'databroker': '0.11.3', 'apstools': '0.0.37'}
time: 1545351713.3727024
uid: d5e15ba3-0393-4df3-8217-1b72d82b5cf9
========================== ====== ================ ===================
timestamp source name value
========================== ====== ================ ===================
2018-12-20 18:21:45.488033 PV gov:IOC_CPU_LOAD 0.22522293126578166
2018-12-20 18:21:45.488035 PV gov:SYS_CPU_LOAD 10.335244804189122
2018-12-20 18:21:46.910976 PV otz:IOC_CPU_LOAD 0.10009633509509736
2018-12-20 18:21:46.910973 PV otz:SYS_CPU_LOAD 11.360899731293234
========================== ====== ================ ===================
exit_status: success
num_events: {'primary': 1}
run_start: d5e15ba3-0393-4df3-8217-1b72d82b5cf9
time: 1545351713.3957422
uid: e033cd99-dcac-4b56-848c-62eede1e4d77
You can log text and arrays, too:
$ bluesky_snapshot {gov,otz}:{iso8601,HOSTNAME,{IOC,SYS}_CPU_LOAD} compress \
-m "purpose=this is an example, example=example 2, look=can snapshot text and arrays too, note=no commas in metadata"
========================================
snapshot: 2018-12-20 18:28:28.825551
========================================
example: example 2
hints: {}
iso8601: 2018-12-20 18:28:28.825551
look: can snapshot text and arrays too
note: no commas in metadata
plan_description: archive snapshot of ophyd Signals (usually EPICS PVs)
plan_name: snapshot
plan_type: generator
purpose: this is an example
scan_id: 1
software_versions: {'python': '3.6.2 |Continuum Analytics, Inc.| (default, Jul 20 2017, 13:51:32) \n[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)]', 'PyEpics': '3.3.1', 'bluesky': '1.4.1', 'ophyd': '1.3.0', 'databroker': '0.11.3', 'apstools': '0.0.37'}
time: 1545352108.8262713
uid: 7e77708e-9169-45ab-b2b6-4e31534d980a
========================== ====== ================ ===================
timestamp source name value
========================== ====== ================ ===================
2018-12-20 18:24:34.220028 PV compress [0.1, 0.2, 0.3]
2018-12-13 14:49:53.121188 PV gov:HOSTNAME otz.aps.anl.gov
2018-12-20 18:28:25.093941 PV gov:IOC_CPU_LOAD 0.1501490058473918
2018-12-20 18:28:25.093943 PV gov:SYS_CPU_LOAD 10.360270546421546
2018-12-20 18:28:28.817630 PV gov:iso8601 2018-12-20T18:28:28
2018-12-13 14:49:53.135016 PV otz:HOSTNAME otz.aps.anl.gov
2018-12-20 18:28:26.525208 PV otz:IOC_CPU_LOAD 0.10009727705620367
2018-12-20 18:28:26.525190 PV otz:SYS_CPU_LOAD 12.937574161543873
2018-12-20 18:28:28.830285 PV otz:iso8601 2018-12-20T18:28:28
========================== ====== ================ ===================
exit_status: success
num_events: {'primary': 1}
run_start: 7e77708e-9169-45ab-b2b6-4e31534d980a
time: 1545352108.8656788
uid: 0de0ec62-504e-4dbc-ad08-2507d4ed44f9
bluesky_snapshot_viewer
View a snapshot previously recorded by the databroker. This is a GUI program started with the command: bluesky_snapshot_viewer
Internally, this tool calls apstools.callbacks.SnapshotReport to make the report. There are no command line options or command line help.
[figure: screen shot of the bluesky_snapshot_viewer GUI]
See the source code documentation for details.
spec2ophyd
Read a SPEC config file and convert it to ophyd setup commands.
This is a tool to help migration from SPEC to bluesky. It reads the SPEC configuration file provided on the command line and converts the lines it recognizes into ophyd commands which create ophyd objects. These commands are printed to sys.stdout. The output can be copied into a setup file for ophyd.
Example
SPEC config file
# ID @(#)getinfo.c 6.6 01/15/16 CSS
# Device nodes
SDEV_0 = /dev/ttyUSB0 19200 raw
SDEV_1 = /dev/ttyUSB2 19200 raw
#SDEV_2 = /dev/ttyUSB2 19200 raw
VM_EPICS_M1 = 9idcLAX:m58:c0: 8
VM_EPICS_M1 = 9idcLAX:m58:c1: 8
VM_EPICS_M1 = 9idcLAX:m58:c2: 8
VM_EPICS_M1 = 9idcLAX:mxv:c0: 8
VM_EPICS_M1 = 9idcLAX:pi:c0: 4
VM_EPICS_M1 = 9idcLAX:xps:c0: 8
VM_EPICS_M1 = 9idcLAX:aero:c0: 1
VM_EPICS_M1 = 9idcLAX:mxv:c1: 8
VM_EPICS_M1 = 9idcLAX:aero:c1: 1
VM_EPICS_M1 = 9idcLAX:aero:c2: 1
PSE_MAC_MOT = kohzuE 1
VM_EPICS_M1 = 9ida: 60
VM_EPICS_M1 = 9idcLAX:aero:c3: 1
VM_EPICS_SC = 9idcLAX:vsc:c0 16
# CAMAC Slot Assignments
# CA_name_unit = slot [crate_number]
# Motor cntrl steps sign slew base backl accel nada flags mne name
MOT000 = EPICS_M2:0/2 2000 1 2000 200 50 125 0 0x003 mx mx
MOT001 = EPICS_M2:0/3 2000 1 2000 200 50 125 0 0x003 my my
MOT002 = EPICS_M2:1/1 2000 1 2000 200 50 125 0 0x003 msx msx
MOT003 = EPICS_M2:1/2 2000 1 2000 200 50 125 0 0x003 msy msy
MOT004 = EPICS_M2:1/3 2000 1 2000 200 50 125 0 0x003 art ART50-100
MOT005 = EPICS_M2:2/5 2000 1 2000 200 50 125 0 0x003 uslvcen uslitvercen
MOT006 = EPICS_M2:2/6 2000 1 2000 200 50 125 0 0x003 uslhcen uslithorcen
MOT007 = EPICS_M2:3/3 2000 1 2000 200 50 125 0 0x003 gslout GSlit_outb
MOTPAR:read_mode = 7
MOT008 = EPICS_M2:3/4 2000 1 2000 200 50 125 0 0x003 gslinb GSlit_inb
MOTPAR:read_mode = 7
MOT009 = EPICS_M2:3/5 2000 1 2000 200 50 125 0 0x003 gsltop GSlit_top
MOTPAR:read_mode = 7
MOT010 = EPICS_M2:3/6 2000 1 2000 200 50 125 0 0x003 gslbot GSlit_bot
MOTPAR:read_mode = 7
MOT011 = MAC_MOT:0/0 2000 1 2000 200 50 125 0 0x003 en en
MOTPAR:read_mode = 7
MOT012 = EPICS_M2:10/43 2000 1 2000 200 50 125 0 0x003 InbMS MonoSl_inb
# Counter ctrl unit chan scale flags mne name
CNT000 = EPICS_SC 0 0 10000000 0x001 sec seconds
CNT001 = EPICS_SC 0 1 1 0x002 I0 I0
CNT002 = EPICS_SC 0 2 1 0x000 I00 I00
CNT003 = EPICS_SC 0 3 1 0x000 upd2 photodiode
CNT004 = EPICS_SC 0 4 1 0x000 trd TR_diode
CNT005 = EPICS_SC 0 5 1 0x000 I000 I000
command line
Translate the SPEC config file in the current directory:
spec2ophyd ./config
output
mx = EpicsMotor('9idcLAX:m58:c0:m2', name='mx', labels=('motor',))
my = EpicsMotor('9idcLAX:m58:c0:m3', name='my', labels=('motor',))
msx = EpicsMotor('9idcLAX:m58:c1:m1', name='msx', labels=('motor',))
msy = EpicsMotor('9idcLAX:m58:c1:m2', name='msy', labels=('motor',))
art = EpicsMotor('9idcLAX:m58:c1:m3', name='art', labels=('motor',)) # ART50-100
uslvcen = EpicsMotor('9idcLAX:m58:c2:m5', name='uslvcen', labels=('motor',)) # uslitvercen
uslhcen = EpicsMotor('9idcLAX:m58:c2:m6', name='uslhcen', labels=('motor',)) # uslithorcen
gslout = EpicsMotor('9idcLAX:mxv:c0:m3', name='gslout', labels=('motor',)) # GSlit_outb # read_mode=7
gslinb = EpicsMotor('9idcLAX:mxv:c0:m4', name='gslinb', labels=('motor',)) # GSlit_inb # read_mode=7
gsltop = EpicsMotor('9idcLAX:mxv:c0:m5', name='gsltop', labels=('motor',)) # GSlit_top # read_mode=7
gslbot = EpicsMotor('9idcLAX:mxv:c0:m6', name='gslbot', labels=('motor',)) # GSlit_bot # read_mode=7
# Macro Motor: SpecMotor(mne='en', config_line='11', macro_prefix='kohzuE') # read_mode=7
InbMS = EpicsMotor('9ida:m43', name='InbMS', labels=('motor',)) # MonoSl_inb
c0 = ScalerCH('9idcLAX:vsc:c0', name='c0', labels=('detectors',))
# counter: sec = SpecCounter(mne='sec', config_line='0', name='seconds', unit='0', chan='0', pvname=9idcLAX:vsc:c0.S1)
# counter: I0 = SpecCounter(mne='I0', config_line='1', unit='0', chan='1', pvname=9idcLAX:vsc:c0.S2)
# counter: I00 = SpecCounter(mne='I00', config_line='2', unit='0', chan='2', pvname=9idcLAX:vsc:c0.S3)
# counter: upd2 = SpecCounter(mne='upd2', config_line='3', name='photodiode', unit='0', chan='3', pvname=9idcLAX:vsc:c0.S4)
# counter: trd = SpecCounter(mne='trd', config_line='4', name='TR_diode', unit='0', chan='4', pvname=9idcLAX:vsc:c0.S5)
# counter: I000 = SpecCounter(mne='I000', config_line='5', unit='0', chan='5', pvname=9idcLAX:vsc:c0.S6)
Cautions
spec2ophyd is a work-in-progress.
- spec2ophyd does not rely on any libraries of apstools.
- It is not necessarily robust.
- It is not packaged or installed with apstools.
- It is only available from the source code repository.
- It may be refactored or removed at any time.
Check the apstools Change History for more updates (https://github.com/BCDA-APS/apstools/blob/master/CHANGES.rst)
These are the applications provided by apstools:

=======================  ============================================================
application              purpose
=======================  ============================================================
bluesky_snapshot         Take a snapshot of a list of EPICS PVs and record it in the
                         databroker.
bluesky_snapshot_viewer  View a snapshot previously recorded by the databroker.
spec2ophyd               Read a SPEC config file and convert it to ophyd setup
                         commands.
=======================  ============================================================
Examples
Example: Pilatus EPICS Area Detector
In this example, we’ll show how to create an ophyd object that operates our Pilatus camera as a detector. We’ll show how to use the EPICS Area Detector support to save images into HDF5 data files. In the course of this example, we’ll describe how an ophyd Device, such as this area detector support, is configured (a.k.a. staged) for data acquisition and also describe how ophyd waits for acquisition to complete using a status object.
If you just want the final result, we present that first. Or skip ahead to Pilatus Support Code Explained if you want the full walk-through.
Pilatus Support Code
We built a Python class to describe our Pilatus area detector, then created an ophyd det object to talk with our EPICS IOC for the Pilatus. Finally, we configured the det object to save HDF files when we count with the detector in Bluesky. When Bluesky is not operating the detector, the controls revert to their settings from before Bluesky started.
Here is the complete support code:
from ophyd import ADComponent
from ophyd import ImagePlugin
from ophyd import PilatusDetector
from ophyd import SingleTrigger
from ophyd.areadetector.filestore_mixins import FileStoreHDF5IterativeWrite
from ophyd.areadetector.plugins import HDF5Plugin_V34
import os

PILATUS_FILES_ROOT = "/mnt/fileserver/data"
BLUESKY_FILES_ROOT = "/export/raid5/fileshare/data"
TEST_IMAGE_DIR = "test/pilatus/%Y/%m/%d/"

class MyHDF5Plugin(FileStoreHDF5IterativeWrite, HDF5Plugin_V34): ...

class MyPilatusDetector(SingleTrigger, PilatusDetector):
    """Pilatus detector"""

    image = ADComponent(ImagePlugin, "image1:")
    hdf1 = ADComponent(
        MyHDF5Plugin,
        "HDF1:",
        write_path_template=os.path.join(PILATUS_FILES_ROOT, TEST_IMAGE_DIR),
        read_path_template=os.path.join(BLUESKY_FILES_ROOT, TEST_IMAGE_DIR),
    )

det = MyPilatusDetector("Pilatus:", name="det")
det.hdf1.create_directory.put(-5)
det.cam.stage_sigs["image_mode"] = "Single"
det.cam.stage_sigs["num_images"] = 1
det.cam.stage_sigs["acquire_time"] = 0.1
det.cam.stage_sigs["acquire_period"] = 0.105
det.hdf1.stage_sigs["lazy_open"] = 1
det.hdf1.stage_sigs["compression"] = "LZ4"
det.hdf1.stage_sigs["file_template"] = "%s%s_%3.3d.h5"
del det.hdf1.stage_sigs["capture"]
det.hdf1.stage_sigs["capture"] = 1
Pilatus Support Code Explained
The EPICS Area Detector [1] has support for many different types of area detectors. While the full feature set varies among the supported camera types, the general approach to building the necessary device support in ophyd follows a common sequence.
An excellent first step in building the device for an area detector is to check the list of area detector cameras already supported in ophyd. [2] If your camera is not supported, your next step is to build custom support. [3] [4]
[2] https://blueskyproject.io/ophyd/area-detector.html#specific-hardware
[3] https://blueskyproject.io/ophyd/area-detector.html#custom-devices
[4] https://blueskyproject.io/ophyd/area-detector.html#custom-plugins-or-cameras
On the list of supported cameras, we find the PilatusDetector. [5]
Note that ophyd makes a distinction (using the Pilatus here as an example) between PilatusDetector and PilatusDetectorCam. We’ll clarify that distinction below.
Pay special attention to the Staging an Ophyd Device section. Staging is fundamental to using the detector for data acquisition.
General Structure
Before you can create an ophyd object for your Pilatus detector, you’ll need to create an ophyd class that describes the features of the EPICS Area Detector interface you plan to use, such as the camera (ADPilatus, in this case) and any plugins such as computations or file writers.
Tip
If your EPICS configuration uses any of the plugins, you must configure them in ophyd. Once you have created your detector object, you can check whether you missed any by calling its .missing_plugins() method. For example, where our example Pilatus IOC uses the Pilatus: PV prefix:
from ophyd import PilatusDetector
det = PilatusDetector("Pilatus:", name="det")
det.missing_plugins()
We expect to see an empty list [] as the result of this last command. Otherwise, the list will describe the plugins we’ll need to define.
The general support structure is a Python class such as this one, which provides for triggering and viewing the image (but not file saving):
class MyPilatusDetector(SingleTrigger, PilatusDetector):
    """Ophyd support class describing this detector"""

    # cam is already defined by PilatusDetector
    image = ADComponent(ImagePlugin, "image1:")
    # define other plugins here, as needed

det = MyPilatusDetector("Pilatus:", name="det")
The Python class derives from PilatusDetector and adds the SingleTrigger capabilities. Note that the class we are customizing is always listed last, with the additional features (also known as mixin classes) given first; that’s the way Python wants it.
Next comes a Python docstring that describes this structure.
Then come any additional attributes (class variable names) and their associated ADComponent constructions, such as the Image plugin shown. The second argument to ADComponent comes from the EPICS PV for that plugin, such as Pilatus:image1: for the Image plugin.
Finally, we show how the object is created with just the PV prefix for the EPICS IOC. The name="det" keyword argument is required. It is customary that the name match the object name for the MyPilatusDetector() object.
Staging an Ophyd Device
An important part of data acquisition is configuration of each ophyd Device [6] for the acquisition steps. In Bluesky, this is called staging [7] and the acquisition is called triggering. [8] The complete data acquisition sequence of any ophyd Device proceeds in this order:
=======  ======================================================================
step     actions
=======  ======================================================================
stage    save the current device settings, then prepare the device for trigger
trigger  tell the device to run its acquisition sequence (returns a status
         object [9] after starting acquisition)
wait     wait until the status object indicates done
read     get the data from the device (with timestamps)
unstage  restore the previous device settings (as saved in the stage step)
=======  ======================================================================
We won’t use the read step in this example (the Python steps to read the image are shown below in the Read the Image into Python section); the EPICS IOC saves the image to a file.
Area detector images, unlike most other data we might handle during data acquisition, consume large resources. We should only load that data into memory at a time we choose, not as a routine practice.
When using the detector in a Bluesky plan, the RunEngine will get the information about the image (name and directory of the file created and the address in the file for the image). This information about the image will be part of the document sent to the databroker.
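As a minimal sketch, here is that sequence written with ophyd calls directly (the RunEngine performs these steps for us during a plan; det is the detector object built below):
det.stage()             # save current settings, then apply stage_sigs
status = det.trigger()  # start acquisition; returns a status object
status.wait()           # wait until the status object indicates done
data = det.read()       # we skip this step in this example
det.unstage()           # restore the settings saved by stage()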
The ophyd Area Detector SingleTrigger mixin provides the configuration to stage and trigger the .cam module for acquisition. The staging settings, defined as a Python dictionary, will be applied in the order they were added to the dictionary (and restored in reverse order). The dictionary is in each Device’s .stage_sigs attribute. Without the SingleTrigger mixin:
>>> from ophyd import PilatusDetector
>>> det = PilatusDetector("Pilatus:", name="det")
>>> det.stage_sigs
OrderedDict()
With the SingleTrigger mixin:
>>> from ophyd import PilatusDetector
>>> from ophyd import SingleTrigger
>>> class MyPilatusDetector(SingleTrigger, PilatusDetector): ...
>>> det = MyPilatusDetector("Pilatus:", name="det")
>>> det.stage_sigs
OrderedDict([('cam.acquire', 0), ('cam.image_mode', 1)])
The ophyd documentation has more information about Staging.
Build the Support: MyPilatusDetector
In most cases, you’ll want to describe more than just the camera module that EPICS Area Detector supplies for your detector (such as ADPilatus [10]). We want to trigger the camera during data collection, view the image during collection, [11] and write the image to a file. [12]
The ophyd PilatusDetector class only provides an area detector with support for the cam module (the camera controls). Since the additional features we want are not supported by PilatusDetector, we’ll need to add them.
We’ll begin customizing the support in the sections below.
[9] https://blueskyproject.io/ophyd/status.html#status-objects-futures
[10] https://areadetector.github.io/master/ADPilatus/pilatusDoc.html
[11] https://areadetector.github.io/master/ADViewers/ad_viewers.html
[12] https://areadetector.github.io/master/ADCore/NDPluginFile.html
The MyPilatusDetector class
So, following the general structure shown above, we start our MyPilatusDetector class, importing the necessary ophyd packages:
from ophyd import ADComponent
from ophyd import ImagePlugin
from ophyd import PilatusDetector
from ophyd import SingleTrigger

class MyPilatusDetector(SingleTrigger, PilatusDetector):
    """Ophyd support class describing this detector"""

    image = ADComponent(ImagePlugin, "image1:")
We could get the same structure with this class instead:
from ophyd import ADComponent
from ophyd import AreaDetector
from ophyd import ImagePlugin
from ophyd import PilatusDetectorCam
from ophyd import SingleTrigger

class MyPilatusDetector(SingleTrigger, AreaDetector):
    """Ophyd support class describing this detector"""

    cam = ADComponent(PilatusDetectorCam, "cam1:")
    image = ADComponent(ImagePlugin, "image1:")
The PilatusDetectorCam class
The ophyd.areadetector.PilatusDetectorCam class provides an ophyd Device interface for the ADPilatus camera controls. This support is already included in the PilatusDetector class, so we do not need to add it (although there is no problem if we add it anyway).
Any useful implementation of an EPICS area detector will support the
camera module, which controls the features of the camera and image
acquisition. The detector classes defined in ophyd.areadetector.detectors
all support the cam module appropriate for that detector. They are convenience
classes for the repetitive step of adding cam
support.
HDF5Plugin: Writing images to an HDF5 File
The ophyd HDF5Plugin class [13] provides support for the HDF5 file writing plugin of EPICS Area Detector.
As the EPICS Area Detector support has changed between various releases, the PVs available have also changed. There are several versions of the ophyd HDF5Plugin class to track those changes. Pick the highest version of ophyd support that is equal to or less than the EPICS Area Detector version used in the IOC. For AD 3.7, the highest available ophyd plugin is ophyd.areadetector.plugins.HDF5Plugin_V34:
from ophyd.areadetector.plugins import HDF5Plugin_V34
We could just add this to our custom structure:
hdf1 = ADComponent(HDF5Plugin_V34, "HDF1:")
but we still need an additional mixin to control where the files should be written (by the IOC) and read (by Bluesky):
from ophyd.areadetector.filestore_mixins import FileStoreHDF5IterativeWrite
which means we need to define a custom plugin class to bring these two parts together:
class MyHDF5Plugin(FileStoreHDF5IterativeWrite, HDF5Plugin_V34): ...
The FileStoreHDF5IterativeWrite
mixin allows for the file directory
paths to be different on the two computers, but expects the files to be
available to both the EPICS IOC and the Bluesky session. Thus, the
paths may have different first parts, up to a point where they match.
The Pilatus detector is a good example that needs the two paths to be
different. It saves files to its own file systems. (If the paths are
the same on both computers, it is not necessary to specify the
read_path_template
.) For the Bluesky computer to see these files,
both computers must share the same filesystem. The exact mount point
for the shared filesystem can be different on each. Consider these
hypothetical mount points for the same shared data
directory:
PILATUS_FILES_ROOT = "/mnt/fileserver/data"
BLUESKY_FILES_ROOT = "/export/raid5/fileshare/data"
To configure the HDF5Plugin()
, we must configure the
write_path_template
for how the shared filesystem is mounted on the
Pilatus computer and the read_path_template
for how the same shared
filesystem is mounted on the Bluesky computer. To set these paths, we
modify the above line to be:
hdf1 = ADComponent(
MyHDF5Plugin,
"HDF1:",
write_path_template=f"{PILATUS_FILES_ROOT}/",
read_path_template=f"{BLUESKY_FILES_ROOT}/",
)
Tip
EPICS Area Detector file writers require the directory separator at the end of the path and will add one if it is not given. Because ophyd expects the PV to hold exactly the value it wrote, ophyd will time out when writing the path if the final directory separator is not provided.
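For example, when setting a file path PV directly (a hedged sketch; the same caution applies to the path templates):
det.hdf1.file_path.put("/mnt/fileserver/data/")   # OK: ends with "/"
# det.hdf1.file_path.put("/mnt/fileserver/data")  # ophyd put would time out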
Additionally, we add to the mount point the directory path where our files are to be stored on the shared filesystem. Bluesky allows this path to include datetime formatting. We use this formatting to add the year (%Y), month (%m), and day (%d) into the path for both write_path_template and read_path_template:
TEST_IMAGE_DIR = "test/pilatus/%Y/%m/%d"
With this change, our customization is complete:
hdf1 = ADComponent(
MyHDF5Plugin,
"HDF1:",
write_path_template=f"{PILATUS_FILES_ROOT}/{TEST_IMAGE_DIR}/",
read_path_template=f"{BLUESKY_FILES_ROOT}/{TEST_IMAGE_DIR}/",
)
Tip
Later, when you decide to change the directory for the HDF5 image files, be sure to set both templates, using the proper mount points for each. Follow the pattern shown:
path = "user_name/experiment/" # note the trailing slash
det.hdf1.write_path_template.put(os.path.join(PILATUS_FILES_ROOT, path))
det.hdf1.read_path_template.put(os.path.join(BLUESKY_FILES_ROOT, path))
Create the Ophyd object
With the custom support for our Pilatus, it is simple to create the ophyd object, once we know the PV prefix used by the EPICS IOC. For this example, we’ll assume the prefix is Pilatus: and create the object:
det = MyPilatusDetector("Pilatus:", name="det")
Directory for the HDF5 files
Previously, we set the write_path_template
and
read_path_template
to control the directory where the Pilatus IOC
writes the HDF5 files and where Bluesky expects to find them once they
are created.
If these directories do not exist, we’ll get an error when we try to write the HDF5 file. The EPICS AD HDF5 plugin will create those directories if the CreateDirectory PV (the create_directory attribute of the HDF5Plugin()) is set to a negative number whose absolute value is at least as large as the number of directory levels to be created. A value of -5 is usually sufficient. Such as:
det.hdf1.create_directory.put(-5)
Make this adjustment after creating the det
object and before
acquiring an image.
To change the directory for new HDF5 files:
path = "user_name/experiment/" # note the trailing slash
det.hdf1.write_path_template.put(os.path.join(PILATUS_FILES_ROOT, path))
det.hdf1.read_path_template.put(os.path.join(BLUESKY_FILES_ROOT, path))
Staging the Camera Settings
We want to control the number of image frames to be acquired so we
stage these cam
features:
>>> det.cam.stage_sigs["image_mode"] = "Single"
>>> det.cam.stage_sigs["num_images"] = 1
Also, we want to control the acquire time (the actual time the camera is collecting the image) and the period (total time between image frames) for the image:
>>> det.cam.stage_sigs["acquire_time"] = 0.1
>>> det.cam.stage_sigs["acquire_period"] = 0.105
Staging the HDF5Plugin
We need to configure hdf1
(the HDF5 plugin) for staging. The
defaults are:
>>> det.hdf1.stage_sigs
OrderedDict([('enable', 1),
('blocking_callbacks', 'Yes'),
('parent.cam.array_callbacks', 1),
('auto_increment', 'Yes'),
('array_counter', 0),
('auto_save', 'Yes'),
('num_capture', 0),
('file_template', '%s%s_%6.6d.h5'),
('file_write_mode', 'Stream'),
('capture', 1)])
These settings enable the HDF5 writer and will pause the next acquisition until the HDF5 file is written. They will increment the file numbering and will automatically save the file once the image is captured. By default, ophyd will choose a file name based on a random uuid. [14] It is possible to change this naming style, but those steps are beyond this example.
We want to enable the LazyOpen feature [15] (so we do not have to acquire an image into the HDF5 plugin before our first data acquisition):
>>> det.hdf1.stage_sigs["lazy_open"] = 1
and we want to add LZ4 compression:
>>> det.hdf1.stage_sigs["compression"] = "LZ4"
The LazyOpen setting must happen before the plugin is set to Capture, so we must delete that and then add it back as the last action:
>>> del det.hdf1.stage_sigs["capture"]
>>> det.hdf1.stage_sigs["capture"] = 1
We might reduce the number of digits written into the file name (this will change the value in place instead of moving the setting to the end of the actions):
>>> det.hdf1.stage_sigs["file_template"] = "%s%s_%3.3d.h5"
Acquire and Save an Image
Now that the det
object is ready for data acquisition,
let’s acquire an image using the ophyd tools:
>>> det.stage()
Ack. An upstream problem might appear in response to det.stage() as a long exception report, starting with UnprimedPlugin and ending with:
UnprimedPlugin: The plugin hdf1 on the area detector with name det has not been primed.
Until the upstream support in ophyd is corrected to watch for LazyOpen=1, you need to warm up the plugin (by acquiring an image and pushing it into the HDF plugin):
>>> det.warmup()
Then, proceed to acquire an image and save it to a file.
>>> st = det.trigger()
The return result was a Status object. If we check its value before the image is saved to an HDF5 file, the result looks like this:
>>> st
ADTriggerStatus(device=det, done=False, success=False)
Once the image acquisition is complete, the status object will indicate it is done. We can wait until then by checking it, or we can call the .wait() method of the status object:
>>> st.wait()
Once the acquisition is finished and the HDF5 file is written,
the wait()
method will return. We can check its value:
>>> st
ADTriggerStatus(device=det, done=True, success=True)
Acquisition is complete. Don’t forget to unstage()
:
>>> det.unstage()
When we use det
as a detector in a bluesky plan with the
RunEngine
, the RunEngine
will do all these steps (including
the wait for the status object to finish).
We can find the name of the HDF5 file that was written (by the IOC):
>>> det.hdf1.full_file_name.get()
/mnt/fileserver/data/test/pilatus/2021/01/22/4e26f601-df6d-4848-bf3f_000.h5
and we can get a local directory listing of the same file:
>>> !ls -lAFgh /export/raid5/fileshare/data/test/pilatus/2021/01/22/4e26f601-df6d-4848-bf3f_000.h5
-rw-r--r-- 1 root 2.2M Jan 22 00:41 /export/raid5/fileshare/data/test/pilatus/2021/01/22/4e26f601-df6d-4848-bf3f_000.h5
Note: The file size might be different for your detector.
Read the Image into Python
Our long-term plan is to use det for data acquisition with Bluesky and the databroker package. [16] Since this example focuses on the ophyd configuration of an area detector, we’ll show how to read the image directly from the HDF5 file.
Note
Keep in mind this is not the recommended way to get the image with Bluesky but we show this procedure since we have not used bluesky and databroker to record the image file details.
Once you have taken an image with det
and saved it to an HDF5 file,
we can read that file and get the image. By default, the EPICS area
detector HDF5 File Writer stores the image (using the NeXus schema) at
the address /entry/data/data
in the file.
First, we must get the name of the data file from the IOC:
>>> full_name_ioc = det.hdf1.full_file_name.get()
>>> print(f"IOC: {full_name_ioc}")
'/mnt/fileserver/data/test/pilatus/2021/01/22/4e26f601-df6d-4848-bf3f_000.h5'
This is the full path as the IOC sees the file system. We can simply remove the IOC’s root path and replace it with the local root path, using the constants defined earlier:
full_name_local = BLUESKY_FILES_ROOT + full_name_ioc[len(PILATUS_FILES_ROOT):]
Verify that we have such a file:
>>> print(f"local file: {full_name_local}")
'/export/raid5/fileshare/data/test/pilatus/2021/01/22/4e26f601-df6d-4848-bf3f_000.h5'
>>> print(f"exists:{os.path.exists(full_name_local)}")
exists:True
Open the file using the h5py [17] package:
>>> import h5py
>>> root = h5py.File(full_name_local, "r")
Read the image:
>>> image = root["/entry/data/data"]
Show the shape of the image:
>>> image.shape
(1, 1024, 1024)
Close the file:
>>> root.close()
Example: Perkin-Elmer EPICS Area Detector
In this example, we’ll show how to create an ophyd object that operates a Perkin-Elmer Camera as a detector. We’ll start with the Pilatus example.
The only difference from the Pilatus example is the detector class (PerkinElmerDetector), the PV prefix (we’ll use PE1: here), and possibly the plugin version. For simplicity, we’ll assume the same version of EPICS Area Detector (3.7) is used in this example. We’ll use a different object name, det_pe, for this detector and different directories (where the write and read directories are the same).
Follow the Pilatus Support Code Explained to learn more about the choices made below. The process is similar.
Perkin-Elmer Support Code
Here is the Perkin-Elmer support, derived from the Pilatus support (Example: Pilatus EPICS Area Detector).
from ophyd import ADComponent
from ophyd import ImagePlugin
from ophyd import PerkinElmerDetector
from ophyd import SingleTrigger
from ophyd.areadetector.filestore_mixins import FileStoreHDF5IterativeWrite
from ophyd.areadetector.plugins import HDF5Plugin_V34
import os

IMAGE_FILES_ROOT = "/local/data"
TEST_IMAGE_DIR = "test/"

class MyHDF5Plugin(FileStoreHDF5IterativeWrite, HDF5Plugin_V34): ...

class MyPerkinElmerDetector(SingleTrigger, PerkinElmerDetector):
    """Perkin-Elmer detector"""

    image = ADComponent(ImagePlugin, "image1:")
    hdf1 = ADComponent(
        MyHDF5Plugin,
        "HDF1:",
        write_path_template=os.path.join(
            IMAGE_FILES_ROOT, TEST_IMAGE_DIR
        ),
    )

det_pe = MyPerkinElmerDetector("PE1:", name="det_pe")
det_pe.hdf1.create_directory.put(-5)
det_pe.cam.stage_sigs["image_mode"] = "Single"
det_pe.cam.stage_sigs["num_images"] = 1
det_pe.cam.stage_sigs["acquire_time"] = 0.1
det_pe.cam.stage_sigs["acquire_period"] = 0.105
det_pe.hdf1.stage_sigs["lazy_open"] = 1
det_pe.hdf1.stage_sigs["compression"] = "LZ4"
det_pe.hdf1.stage_sigs["file_template"] = "%s%s_%3.3d.h5"
del det_pe.hdf1.stage_sigs["capture"]
det_pe.hdf1.stage_sigs["capture"] = 1
Example: the nscan() plan
We’ll use a Jupyter notebook to demonstrate the nscan() plan. An nscan is used to scan two or more axes together, such as a θ-2θ diffractometer scan.
Follow here: https://github.com/BCDA-APS/apstools/blob/master/docs/source/resources/demo_nscan.ipynb
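A minimal hedged sketch of calling nscan() (the scaler, m1, and m2 objects are hypothetical; the notebook linked above gives a complete demonstration):
from apstools.plans import nscan

# scan two motors together, reading the scaler at each of 11 points
RE(nscan([scaler], m1, 0, 1, m2, 0, 2, num=11))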
Example: the run_command_file() plan
You can use a text file or an Excel spreadsheet as a multi-sample batch scan tool using the run_command_file() plan.
This section is divided into these parts: the command file, the actions, registering execute_command_list(), testing, and running the command file.
The Command File
A command file can be written as either a plain text file or a spreadsheet (such as from Microsoft Excel or Libre Office).
Text Command File
For example, given a text command file (named sample_example.txt
)
with content as shown …
1  # example text command file
2
3  step_scan 5.07 8.3 0.0 "Water Blank, deionized"
4  other_scan 5.07 8.3 0.0 "Water Blank, deionized"
5
6  # all comment lines will be ignored
7
8  this line will not be recognized by execute_command_list()
… can be summarized in a bluesky ipython session:
In [1]: import apstools.plans
In [2]: apstools.plans.summarize_command_file("sample_example.txt")
Command file: sample_example.txt
====== ========== ===========================================================
line # action parameters
====== ========== ===========================================================
3 step_scan 5.07, 8.3, 0.0, Water Blank, deionized
4 other_scan 5.07, 8.3, 0.0, Water Blank, deionized
8 this line, will, not, be, recognized, by, execute_command_list()
====== ========== ===========================================================
Observe which lines (3, 4, and 8) in the text file are part of the summary. The other lines are either comments or blank.
The action on line 8 will be passed to execute_command_list(), which will then report that the action is not handled. (It looks like another type of comment.)
See parse_text_command_file() for more details.
Spreadsheet Command File
Follow the example spreadsheet (in the File Downloads section) and the accompanying Jupyter notebook [1] to write your own Excel_plan().
For example, given a spreadsheet (named sample_example.xlsx) with content as shown in the next figure:
[figure: image of the sample_example.xlsx spreadsheet file]
Tip
Place the column labels on the fourth row of the spreadsheet, starting in the first column. The actions start on the next row. The first blank row indicates the end of the command table within the spreadsheet. Use as many columns as you need, one column per argument.
This spreadsheet can be summarized in a bluesky ipython session:
In [1]: import apstools.plans
In [2]: apstools.plans.summarize_command_file("sample_example.xlsx")
Command file: sample_example.xlsx
====== ================================================================== ======================================
line # action parameters
====== ================================================================== ======================================
1 step_scan 5.07, 8.3, 0.0, Water Blank, deionized
2 other_scan 5.07, 8.3, 0.0, Water Blank, deionized
3 this will be ignored (and also the next blank row will be ignored)
====== ================================================================== ======================================
Note that lines 9 and 10 in the spreadsheet file are not part of the summary. The spreadsheet handler will stop reading the list of actions at the first blank line. The action described on line 3 will not be handled, since we will not define an action named this will be ignored (and also the next blank row will be ignored).
See parse_Excel_command_file() for more details.
The Actions
To use this example with the run_command_file() plan, it is necessary to redefine the execute_command_list() plan: we must write a plan to handle every different type of action described in the spreadsheet. For sample_example.xlsx, we need to handle the step_scan and other_scan actions.
Tip
Always write these actions as bluesky plans. To test your actions, use either bluesky.simulators.summarize_plan(step_scan()) or bluesky.simulators.summarize_plan(other_scan()).
Here are examples of those two actions (and a stub for an additional instrument procedure to make the example more realistic):
import bluesky.plan_stubs
import bluesky.plans

# devices (sample_stage, scaler, motor, areadetector) are defined
# elsewhere in the session

def step_scan(sx, sy, thickness, title, md={}):
    md["sample_title"] = title  # log this metadata
    md["sample_thickness"] = thickness  # log this metadata
    yield from bluesky.plan_stubs.mv(
        sample_stage.x, sx,
        sample_stage.y, sy,
    )
    yield from prepare_detector("scintillator")
    yield from bluesky.plans.scan([scaler], motor, 0, 180, 360, md=md)

def other_scan(sx, sy, thickness, title, md={}):
    md["sample_title"] = title  # log this metadata
    md["sample_thickness"] = thickness  # log this metadata
    yield from bluesky.plan_stubs.mv(
        sample_stage.x, sx,
        sample_stage.y, sy,
    )
    yield from prepare_detector("areadetector")
    yield from bluesky.plans.count([areadetector], md=md)

def prepare_detector(detector_name):
    # for this example, we do nothing
    # we should move the proper detector into position here
    yield from bluesky.plan_stubs.null()
Now, we replace execute_command_list() with our own definition to include those two actions:
import logging
import os
import apstools.utils
from bluesky import plan_stubs as bps

logger = logging.getLogger(__name__)

def execute_command_list(filename, commands, md={}):
    """our custom execute_command_list"""
    full_filename = os.path.abspath(filename)

    if len(commands) == 0:
        yield from bps.null()
        return

    text = f"Command file: {filename}\n"
    text += str(apstools.utils.command_list_as_table(commands))
    print(text)

    for command in commands:
        action, args, i, raw_command = command
        logger.info(f"file line {i}: {raw_command}")

        _md = {}
        _md["full_filename"] = full_filename
        _md["filename"] = filename
        _md["line_number"] = i
        _md["action"] = action
        _md["parameters"] = args  # args is shorter than parameters, means the same thing here

        _md.update(md or {})  # overlay with user-supplied metadata

        action = action.lower()
        if action == "step_scan":
            yield from step_scan(*args, md=_md)
        elif action == "other_scan":
            yield from other_scan(*args, md=_md)
        else:
            logger.info(f"no handling for line {i}: {raw_command}")
Register our own execute_command_list
Finally, we register our new version of execute_command_list (which replaces the default execute_command_list()). Here, APS_plans is an alias created by import apstools.plans as APS_plans:
APS_plans.register_command_handler(execute_command_list)
If you wish to verify that your own code has been installed, use this command:
print(APS_plans._COMMAND_HANDLER_)
If its output contains apstools.plans.execute_command_list, you have not registered your own function. However, if the output looks like either of these:
<function __main__.execute_command_list(filename, commands, md={})>
# or
<function execute_command_list at 0x7f4cf0d616a8>
then you have installed your own code.
Testing the command file
As you were developing plans for each of your actions, we showed you
how to test that each plan was free of basic syntax and bluesky procedural
errors. The bluesky.simulators.summarize_plan()
function
will run through your plan and show you the basic data acquisition steps
that will be executed during your plan. Becuase you did not write
any blocking code, no hardware should ever be changed by running
this plan summary.
To test our command file, run it through the
bluesky.simulators.summarize_plan()
function:
bluesky.simulators.summarize_plan(run_command_file("sample_example.txt"))
The output will be rather lengthy if there are no errors. Here are the first few lines:
Command file: sample_example.txt
====== ========== ===========================================================
line # action parameters
====== ========== ===========================================================
3 step_scan 5.07, 8.3, 0.0, Water Blank, deionized
4 other_scan 5.07, 8.3, 0.0, Water Blank, deionized
8 this line, will, not, be, recognized, by, execute_command_list()
====== ========== ===========================================================
file line 3: step_scan 5.07 8.3 0.0 "Water Blank, deionized"
sample_stage_x -> 5.07
sample_stage_y -> 8.3
=================================== Open Run ===================================
m3 -> 0.0
Read ['scaler', 'm3']
m3 -> 0.5013927576601671
Read ['scaler', 'm3']
m3 -> 1.0027855153203342
Read ['scaler', 'm3']
m3 -> 1.5041782729805013
and the last few lines:
m3 -> 178.99721448467966
Read ['scaler', 'm3']
m3 -> 179.49860724233983
Read ['scaler', 'm3']
m3 -> 180.0
Read ['scaler', 'm3']
================================== Close Run ===================================
file line 4: other_scan 5.07 8.3 0.0 "Water Blank, deionized"
sample_stage_x -> 5.07
sample_stage_y -> 8.3
=================================== Open Run ===================================
Read ['areadetector']
================================== Close Run ===================================
file line 8: this line will not be recognized by execute_command_list()
no handling for line 8: this line will not be recognized by execute_command_list()
Running the command file
Prepare the RE
These steps were used to prepare our bluesky ipython session to run the plan:
from bluesky import RunEngine
from bluesky.utils import get_history
RE = RunEngine(get_history())

# Import matplotlib and put it in interactive mode.
import matplotlib.pyplot as plt
plt.ion()

# load config from ~/.config/databroker/mongodb_config.yml
from databroker import Broker
db = Broker.named("mongodb_config")

# Subscribe metadatastore to documents.
# If this is removed, data is not saved to metadatastore.
RE.subscribe(db.insert)

# Set up SupplementalData.
from bluesky import SupplementalData
sd = SupplementalData()
RE.preprocessors.append(sd)

# Add a progress bar.
from bluesky.utils import ProgressBarManager
pbar_manager = ProgressBarManager()
RE.waiting_hook = pbar_manager

# Register bluesky IPython magics.
from bluesky.magics import BlueskyMagics
get_ipython().register_magics(BlueskyMagics)

# Set up the BestEffortCallback.
from bluesky.callbacks.best_effort import BestEffortCallback
bec = BestEffortCallback()
RE.subscribe(bec)
Also, since we are using an EPICS area detector (ADSimDetector) and have just
started its IOC, we must process at least one image from the CAM to each of the
file writing plugins we’ll use (just the HDF1 for us). A procedure has been added
to the ophyd.areadetector
code for this. Here is the command we used
for this procedure:
areadetector.hdf1.warmup()
Run the command file
To run the command file, you need to pass this to an instance of the
bluesky.RunEngine
, defined as RE
above:
RE(apstools.plans.run_command_file("sample_example.txt"))
The output will be rather lengthy. Here are the first few lines of the output on my system (your hardware may be different so the exact data columns and values will vary):
In [11]: RE(APS_plans.run_command_file("sample_example.txt"))
Command file: sample_example.txt
====== ========== ===========================================================
line # action parameters
====== ========== ===========================================================
3 step_scan 5.07, 8.3, 0.0, Water Blank, deionized
4 other_scan 5.07, 8.3, 0.0, Water Blank, deionized
8 this line, will, not, be, recognized, by, execute_command_list()
====== ========== ===========================================================
file line 3: step_scan 5.07 8.3 0.0 "Water Blank, deionized"
Transient Scan ID: 138 Time: 2019-07-04 16:52:15
Persistent Unique Scan ID: 'f40d1c3c-8361-4efc-83f1-072fcd44c1b2'
New stream: 'primary'
+-----------+------------+------------+------------+------------+------------+
| seq_num | time | m3 | clock | I0 | scint |
+-----------+------------+------------+------------+------------+------------+
| 1 | 16:52:34.5 | 0.00000 | 4000000 | 2 | 1 |
| 2 | 16:52:35.2 | 0.50000 | 4000000 | 1 | 2 |
| 3 | 16:52:36.0 | 1.00000 | 4000000 | 1 | 1 |
| 4 | 16:52:36.6 | 1.50000 | 4000000 | 2 | 1 |
| 5 | 16:52:37.3 | 2.01000 | 4000000 | 1 | 2 |
| 6 | 16:52:38.0 | 2.51000 | 4000000 | 2 | 2 |
| 7 | 16:52:38.7 | 3.01000 | 4000000 | 1 | 1 |
| 8 | 16:52:39.3 | 3.51000 | 4000000 | 1 | 2 |
and the last few lines:
| 352 | 16:56:53.4 | 175.99000 | 4000000 | 3 | 0 |
| 353 | 16:56:54.1 | 176.49000 | 4000000 | 3 | 1 |
| 354 | 16:56:55.0 | 176.99000 | 4000000 | 1 | 3 |
| 355 | 16:56:55.7 | 177.49000 | 4000000 | 2 | 1 |
| 356 | 16:56:56.4 | 177.99000 | 4000000 | 1 | 1 |
| 357 | 16:56:57.1 | 178.50000 | 4000000 | 2 | 1 |
| 358 | 16:56:57.8 | 179.00000 | 4000000 | 2 | 2 |
| 359 | 16:56:58.5 | 179.50000 | 4000000 | 1 | 2 |
| 360 | 16:56:59.2 | 180.00000 | 4000000 | 2 | 1 |
+-----------+------------+------------+------------+------------+------------+
generator scan ['f40d1c3c'] (scan num: 138)
file line 4: other_scan 5.07 8.3 0.0 "Water Blank, deionized"
Transient Scan ID: 139 Time: 2019-07-04 16:56:59
Persistent Unique Scan ID: '1f093f56-3413-4e67-8a1b-27e71881b855'
New stream: 'primary'
+-----------+------------+
| seq_num | time |
+-----------+------------+
| 1 | 16:56:59.7 |
+-----------+------------+
generator count ['1f093f56'] (scan num: 139)
file line 8: this line will not be recognized by execute_command_list()
no handling for line 8: this line will not be recognized by execute_command_list()
Out[11]:
('f40d1c3c-8361-4efc-83f1-072fcd44c1b2',
'1f093f56-3413-4e67-8a1b-27e71881b855')
In [12]:
Appendix: Other spreadsheet examples
You can use an Excel spreadsheet as a multi-sample batch scan tool.
SIMPLE: Your Excel spreadsheet could be rather simple…
[figure: unformatted Excel spreadsheet for batch scans]
See ExcelDatabaseFileGeneric for an example bluesky plan that reads from this spreadsheet.
FANCY: … or contain much more information, including formatting.
[figure: example Excel spreadsheet for multi-sample batch scans]
The idea is that your table will start with column labels in row 4 of the Excel spreadsheet. One of the columns will be the name of the action (in the example, it is Scan Type). The other columns will be parameters or other information. Each of the rows under the labels will describe one type of action such as a scan. Basically, whatever you handle in your Excel_plan().
Any rows that you do not handle will be reported to the console during execution
but will not result in any action.
Grow (or shrink) the table as needed.
Note
For now, make sure there is no content in any of the spreadsheet cells outside (either below or to the right) of your table. Such content will trigger a cryptic error about a numpy float that cannot be converted. Instead, put that content in a second spreadsheet page.
You’ll need to have an action plan for every different action your spreadsheet will specify. Call these plans from your Excel_plan() within an elif block, as shown in this example. The example Excel_plan() converts the Scan Type into lower case for simpler comparisons. Your plan can be different if you choose.
if scan_command == "step_scan":
    yield from step_scan(...)
elif scan_command == "energy_scan":
    yield from scan_energy(...)
elif scan_command == "radiograph":
    yield from AcquireImage(...)
else:
    print(f"no handling for table row {i+1}: {row}")
The example plan saves all row parameters as metadata to the row’s action. This may be useful for diagnostic purposes.
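Here is a hedged skeleton of such an Excel_plan(); the column labels (Scan Type, sx, sy, thickness, sample name) are hypothetical and must match your own spreadsheet:
import apstools.utils

def Excel_plan(xl_file, md={}):
    """Hedged sketch: run the action named in each spreadsheet row."""
    excel_table = apstools.utils.ExcelDatabaseFileGeneric(xl_file)
    for i, row in enumerate(excel_table.db.values()):
        scan_command = str(row["Scan Type"]).lower().strip()
        if scan_command == "step_scan":
            yield from step_scan(
                row["sx"], row["sy"], row["thickness"], row["sample name"], md=md
            )
        elif scan_command == "other_scan":
            yield from other_scan(
                row["sx"], row["sy"], row["thickness"], row["sample name"], md=md
            )
        else:
            print(f"no handling for table row {i+1}: {row}")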
Example: write SPEC data file
We’ll use the demo_specfile_example Jupyter notebook to demonstrate how to write one or more scans to a SPEC data file.
https://github.com/BCDA-APS/apstools/blob/master/docs/source/resources/demo_specfile_example.ipynb
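As a hedged sketch, the SPEC file writer is a callback subscribed to the RunEngine (import path per this version’s Callbacks section; see the notebook for the full demonstration):
from apstools.callbacks import SpecWriterCallback

specwriter = SpecWriterCallback()  # default file name is chosen for you
RE.subscribe(specwriter.receiver)  # every run is now also written to the SPEC file
print(f"SPEC data file: {specwriter.spec_filename}")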
Example: the TuneAxis() class
We’ll use a Jupyter notebook to demonstrate the TuneAxis() support that provides custom alignment of a signal against an axis.
Follow here: https://github.com/BCDA-APS/apstools/blob/master/docs/source/resources/demo_tuneaxis.ipynb
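A minimal hedged sketch of TuneAxis usage (the scaler and motor objects are hypothetical; the notebook linked above shows the full workflow):
from apstools.plans import TuneAxis

tuner = TuneAxis([scaler], motor, signal_name="I0")
RE(tuner.tune(width=0.5, num=21))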
File Downloads
The Jupyter notebooks and files related to this section may be downloaded from the following list:
- jupyter notebook: demo_excel_scan
- jupyter notebook: demo_nscan
- jupyter notebook: demo_tuneaxis
- jupyter notebook: demo_specfile_example
API Documentation
Callbacks (includes File Writers)
Callbacks
Document Collector
- DocumentCollectorCallback: Bluesky callback to collect all documents from most-recent plan
- document_contents_callback: prints document contents -- use for diagnosing a document stream

- class apstools.callbacks.doc_collector.DocumentCollectorCallback[source]
Bluesky callback to collect all documents from most-recent plan
Will reset when it receives a start document.
EXAMPLE:
from apstools.callbacks import DocumentCollectorCallback
doc_collector = DocumentCollectorCallback()
RE.subscribe(doc_collector.receiver)
...
RE(some_plan())
print(doc_collector.uids)
print(doc_collector.documents["stop"])
Snapshot Report
- SnapshotReport: Show the data from a snapshot plan.

- class apstools.callbacks.snapshot_report.SnapshotReport(*args: Any, **kwargs: Any)[source]
Show the data from an apstools.plans.snapshot().
Find the most recent snapshot between certain dates:
headers = db(plan_name="snapshot", since="2018-12-15", until="2018-12-21")
h = list(headers)[0]  # pick the first one, it's the most recent
apstools.callbacks.SnapshotReport().print_report(h)
Use as callback to a snapshot plan:
RE(
    apstools.plans.snapshot(ophyd_objects_list),
    apstools.callbacks.SnapshotReport()
)
File Writers
See the File Writers section.
Devices
(ophyd) Devices that might be useful at the APS using Bluesky
Also consult the Index under the Ophyd heading for links to the Devices, Exceptions, Mixins, Signals, and other support items described here.
Categories
APS General Support
- Get the APS cycle name from the APS Data Management system or a local file.
- Common operational parameters of the APS of general interest.
- APS PSS shutter
- APS PSS shutter with separate status PV
- Simulated APS PSS shutter
Area Detector Support
- custom class to define image file name from EPICS
- custom class to define image file name from EPICS
- Has area detector pushed an NDarray to the file writer plugin? True or False
- Prime this area detector's file writer plugin.
- configure so frames are identified & handled by type (dark, white, or image)
Detector & Scaler Support
|
Struck/SIS 3820 Multi-Channel Scaler (as used by USAXS) |
|
configure scaler for only the channels with names assigned in EPICS |
|
Evaluate a point on a pseudo-Voigt based on the value of a motor. |
Motors, Positioners, Axes, …
- Exception during execution of AxisTunerBase subclass
- Mixin class to provide tuning capabilities for an axis
- add a record's description field to a Device, such as EpicsMotor
- add motor record's dial coordinate fields to Device
- mixin providing access to motor enable/disable
- add motor record HLM & LLM fields & compatibility get_lim() and set_lim()
- add motor record's raw coordinate fields to Device
- Add motor record's resolution fields to motor.
- add motor record's servo loop controls to Device
- PVPositioner that computes done as a soft signal.
- PVPositionerSoftDone with stop() and inposition.
- Shutter, implemented with an EPICS motor moved between two positions
- Shutter using a single EPICS PV moved between two positions
Shutters
- APS PSS shutter
- APS PSS shutter with separate status PV
- Shutter, implemented with an EPICS motor moved between two positions
- Shutter using a single EPICS PV moved between two positions
- Shutter Device using one Signal for open and close.
- Base class for all shutter Devices.
- Simulated APS PSS shutter
Slits
- EPICS synApps optics xia_slit.db 2D support: inb out bot top ...
- EPICS synApps optics 2slit.db 1D support: xn, xp, size, center, sync
- EPICS synApps optics 2slit.db 2D support: h.xn, h.xp, v.xn, v.xp
- EPICS synApps optics 2slit.db 2D support: inb, out, bot, top
- Slit size and center as a named tuple
synApps Support
See separate synApps Support: Records, Databases, … section.
Temperature Controllers
- Eurotherm 2216e Temperature Controller
- LakeShore 336 temperature controller.
- LakeShore 340 temperature controller
- Linkam model CI94 temperature controller
- Linkam model T96 temperature controller
- SRS PTC10 AIO module
- SRS PTC10 RTD module channel
- SRS PTC10 Tc (thermocouple) module channel
- Mixin so SRS PTC10 can be used as a (temperature) positioner.
Other Support
- Provide current experiment info from the APS BSS.
- XIA PF4 Filter: one set of 4 filters (A).
- XIA PF4 Filter: two sets of 4 filters (A, B).
- XIA PF4 Filter: three sets of 4 filters (A, B, C).
- A single module of XIA PF4 filters (4-blades).
- XIA PF4 filters - common support.
- LEGACY (use Pf4FilterDual now): Dual (Al, Ti) Xia PF4 filter boxes
- add a record's description field to a Device, such as EpicsMotor
- synApps Kohzu double-crystal monochromator sequence control program
- Ophyd support for Stanford Research Systems 570 preamplifier from synApps.
- Struck/SIS 3820 Multi-Channel Scaler (as used by USAXS)
Internal Routines
- General messages from the APS main control room.
- Non-EPICS signal for use when coordinating Device actions.
- Base class for apstools Device mixin classes
All Submodules
APS User Proposal and ESAF Information
- Provide current experiment info from the APS BSS.
- class apstools.devices.aps_bss_user.ApsBssUserInfoDevice(*args: Any, **kwargs: Any)[source]
Provide current experiment info from the APS BSS.
BSS: Beamtime Scheduling System
EXAMPLE:
bss_user_info = ApsBssUserInfoDevice("9id_bss:", name="bss_user_info")
sd.baseline.append(bss_user_info)
NOTE: There is info provided by the APS proposal & ESAF systems.
Area Detector Support
- configure so frames are identified & handled by type (dark, white, or image)
- Has area detector pushed an NDarray to the file writer plugin? True or False
- Prime this area detector's file writer plugin.
- Prime this area detector's file writer plugin.
- custom class to define image file name from EPICS
- custom class to define image file name from EPICS
- class apstools.devices.area_detector_support.AD_EpicsHdf5FileName(*args: Any, **kwargs: Any)[source]
custom class to define image file name from EPICS
Caution
Caveat emptor applies here. You assume expertise!
Replaces the standard Bluesky behavior, in which file names are defined as UUID strings, virtually guaranteeing that no existing image files will ever be overwritten.
Also, this method decouples the data files from the databroker, which expects the files to be named by UUID.
Methods that override default behavior:

- make_filename(): get the file name info from the EPICS HDF5 plugin.
- generate_datum(key, timestamp, datum_kwargs): generate a uid and cache it with its key for later insertion.
- stage(): set EPICS items before the device is staged.

To allow users to control the file name, we override the make_filename() method here, and we need to override some intervening classes. To allow users to control the file number, we override the stage() method here, triple-comment out that line, and bring in sections from the methods we are replacing here. The image file name is set in FileStoreBase.make_filename() from ophyd.areadetector.filestore_mixins; this is called (during device staging) from FileStoreBase.stage().
EXAMPLE:
To use this custom class, we need to connect it to some intervening structure. Here are the steps:
1. override default file naming
2. use to make your custom iterative writer
3. use to make your custom HDF5 plugin
4. use to make your custom AD support
imports:
from bluesky import RunEngine, plans as bp
from ophyd.areadetector import SimDetector, SingleTrigger
from ophyd.areadetector import ADComponent, ImagePlugin, SimDetectorCam
from ophyd.areadetector import HDF5Plugin
from ophyd.areadetector.filestore_mixins import FileStoreIterativeWrite
override default file naming:
from apstools.devices import AD_EpicsHdf5FileName
make a custom iterative writer:
class myHdf5EpicsIterativeWriter(AD_EpicsHdf5FileName, FileStoreIterativeWrite):
    pass
make a custom HDF5 plugin:
class myHDF5FileNames(HDF5Plugin, myHdf5EpicsIterativeWriter):
    pass
define support for the detector (simulated detector here):
class MySimDetector(SingleTrigger, SimDetector):
    """SimDetector with HDF5 file names specified by EPICS"""

    cam = ADComponent(SimDetectorCam, "cam1:")
    image = ADComponent(ImagePlugin, "image1:")
    hdf1 = ADComponent(
        myHDF5FileNames,
        suffix="HDF1:",
        root="/",
        write_path_template="/",
    )
create an instance of the detector:
simdet = MySimDetector("13SIM1:", name="simdet")
if hasattr(simdet.hdf1.stage_sigs, "array_counter"):
    # remove this so array counter is not set to zero each staging
    del simdet.hdf1.stage_sigs["array_counter"]
simdet.hdf1.stage_sigs["file_template"] = '%s%s_%3.3d.h5'
setup the file names using the EPICS HDF5 plugin:
simdet.hdf1.file_path.put("/tmp/simdet_demo/")  # ! ALWAYS end with a "/" !
simdet.hdf1.file_name.put("test")
simdet.hdf1.array_counter.put(0)
If you have not already, create a bluesky RunEngine:
RE = RunEngine({})
take an image:
RE(bp.count([simdet]))
INTERNAL METHODS
- class apstools.devices.area_detector_support.AD_EpicsJpegFileName(*args: Any, **kwargs: Any)[source]
custom class to define image file name from EPICS
Caution
Caveat emptor applies here. You assume expertise!
Replaces the standard Bluesky behavior, in which file names are defined as UUID strings, virtually guaranteeing that no existing image files will ever be overwritten. Also, this method decouples the data files from the databroker, which expects the files to be named by UUID.
Methods that override default behavior:

- make_filename(): get the file name info from the EPICS JPEG plugin.
- generate_datum(key, timestamp, datum_kwargs): generate a uid and cache it with its key for later insertion.
- stage(): set EPICS items before the device is staged, then copy the EPICS naming template (and other items) to ophyd after staging.
Patterned on apstools.devices.AD_EpicsHdf5FileName(). (Follow that documentation from this point.)
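Following the HDF5 example above, a comparable set of intervening classes could be composed for the JPEG plugin. This is a minimal sketch; the class names are illustrative, not part of apstools:

from ophyd.areadetector.plugins import JPEGPlugin
from ophyd.areadetector.filestore_mixins import FileStoreIterativeWrite
from apstools.devices import AD_EpicsJpegFileName

# compose the mixin with the iterative-write file store support
class myJpegEpicsIterativeWriter(AD_EpicsJpegFileName, FileStoreIterativeWrite):
    pass

# then build the custom JPEG plugin class for use in a detector Device
class myJPEGFileNames(JPEGPlugin, myJpegEpicsIterativeWriter):
    pass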
- apstools.devices.area_detector_support.AD_plugin_primed(plugin)[source]
Has area detector pushed an NDarray to the file writer plugin? True or False
PARAMETERS
- plugin
obj : area detector plugin to be primed (such as detector.hdf1)
EXAMPLE:
AD_plugin_primed(detector.hdf1)
Works around an observed issue: #598 https://github.com/NSLS-II/ophyd/issues/598#issuecomment-414311372
If detector IOC has just been started and has not yet taken an image with the file writer plugin, then a TimeoutError will occur as the file writer plugin “Capture” is set to 1 (Start). In such case, first acquire at least one image with the file writer plugin enabled.
Also issue in apstools (needs a robust method to detect if primed): https://github.com/BCDA-APS/apstools/issues/464
Since Area Detector release 2.1 (2014-10-14), the prime process is not needed if you select the LazyOpen feature with Stream mode for the file plugin. LazyOpen defers file creation until the first frame arrives in the plugin, which removes the need to initialize the plugin with a dummy frame before starting capture.
- apstools.devices.area_detector_support.AD_prime_plugin(detector, plugin)[source]
Prime this area detector’s file writer plugin.
PARAMETERS
- detector
obj : area detector (such as detector)
- plugin
obj : area detector plugin to be primed (such as detector.hdf1)
EXAMPLE:
AD_prime_plugin(detector, detector.hdf1)
- apstools.devices.area_detector_support.AD_prime_plugin2(plugin)[source]
Prime this area detector’s file writer plugin.
Collect and push an NDarray to the file writer plugin. Works with all file writer plugins.
Based on ophyd.areadetector.plugins.HDF5Plugin.warmup().

PARAMETERS

- plugin
obj : area detector plugin to be primed (such as detector.hdf1)
EXAMPLE:
AD_prime_plugin2(detector.hdf1)
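A common pattern (a sketch, assuming the simdet detector from the HDF5 example above) is to test whether the plugin has already handled a frame and prime it only when needed:

from apstools.devices import AD_plugin_primed, AD_prime_plugin2

# prime the file writer plugin only if it has not yet pushed an NDarray
if not AD_plugin_primed(simdet.hdf1):
    AD_prime_plugin2(simdet.hdf1)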
- apstools.devices.area_detector_support.AD_setup_FrameType(prefix, scheme='NeXus')[source]
configure so frames are identified & handled by type (dark, white, or image)
PARAMETERS
- prefix
str : EPICS PV prefix of area detector, such as 13SIM1:
- scheme
str : any key in the AD_FrameType_schemes dictionary
This routine prepares the EPICS Area Detector to identify frames by image type for handling by clients, such as the HDF5 file writing plugin. With the HDF5 plugin, the FrameType PV is added to the NDattributes and then used in the layout file to direct the acquired frame to the chosen dataset. The FrameType PV value provides the HDF5 address to be used.

To use a different scheme than the defaults, add a new key to the AD_FrameType_schemes dictionary, defining storage values for the fields of the EPICS mbbo record that you will be using.

EXAMPLE:
AD_setup_FrameType("2bmbPG3:", scheme="DataExchange")
Call this function before creating the ophyd area detector object; it uses the lower-level PyEpics interface.
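For example, a custom scheme might be added like this. This is a sketch: the mbbo field names and HDF5 addresses shown are assumptions patterned on the built-in schemes, not values from this document:

from apstools.devices.area_detector_support import (
    AD_FrameType_schemes,
    AD_setup_FrameType,
)

# hypothetical scheme: direct each frame type to its own HDF5 address
AD_FrameType_schemes["MyScheme"] = {
    "ZRST": "/entry/data/data",    # image frames
    "ONST": "/entry/data/dark",    # dark frames
    "TWST": "/entry/data/white",   # white frames
}
AD_setup_FrameType("13SIM1:", scheme="MyScheme")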
APS cycles
|
Get the APS cycle name from the APS Data Management system or a local file. |
APS Machine Parameters
APS machine parameters
|
Common operational parameters of the APS of general interest. |
|
General messages from the APS main control room. |
- class apstools.devices.aps_machine.ApsMachineParametersDevice(*args: Any, **kwargs: Any)[source]
Common operational parameters of the APS of general interest.
EXAMPLE:
import apstools.devices as APS_devices

APS = APS_devices.ApsMachineParametersDevice(name="APS")
aps_current = APS.current

# make sure these values are logged at start and stop of every scan
sd.baseline.append(APS)

# record storage ring current as secondary stream during scans
# name: aps_current_monitor
# db[-1].table("aps_current_monitor")
sd.monitors.append(aps_current)
The sd.baseline and sd.monitors usage relies on this global setup:
from bluesky import SupplementalData

sd = SupplementalData()
RE.preprocessors.append(sd)
- aps_cycle
- property inUserOperations
determine if APS is in User Operations mode (boolean)
Use this property to configure ophyd Devices for direct or simulated hardware. See issue #49 (https://github.com/BCDA-APS/apstools/issues/49) for details.
EXAMPLE:
APS = apstools.devices.ApsMachineParametersDevice(name="APS")

if APS.inUserOperations:
    suspend_APS_current = bluesky.suspenders.SuspendFloor(APS.current, 2, resume_thresh=10)
    RE.install_suspender(suspend_APS_current)
else:
    # use pseudo shutter controls and no current suspenders
    pass
- operator_messages
alias of
apstools.devices.aps_machine.ApsOperatorMessagesDevice
APS undulator
- APS Undulator
- APS Undulator with upstream and downstream controls
- class apstools.devices.aps_undulator.ApsUndulator(*args: Any, **kwargs: Any)[source]
APS Undulator
EXAMPLE:
undulator = ApsUndulator("ID09ds:", name="undulator")
- tracking
Axis Tuner
- Exception during execution of AxisTunerBase subclass
- Mixin class to provide tuning capabilities for an axis
- exception apstools.devices.axis_tuner.AxisTunerException[source]
Exception during execution of AxisTunerBase subclass
- class apstools.devices.axis_tuner.AxisTunerMixin(*args: Any, **kwargs: Any)[source]
Mixin class to provide tuning capabilities for an axis
See the TuneAxis() example in this jupyter notebook: https://github.com/BCDA-APS/apstools/blob/master/docs/source/resources/demo_tuneaxis.ipynb
HOOK METHODS
There are two hook methods (pre_tune_method(), and post_tune_method()) for callers to add additional plan parts, such as opening or closing shutters, setting detector parameters, or other actions.
Each hook method must accept a single argument: an axis object such as EpicsMotor or SynAxis, such as:
# imports assumed for this example (not shown in the original summary);
# also assumes a `shutter` Device and a RunEngine `RE` are defined elsewhere
from bluesky import plan_stubs as bps
from ophyd.sim import SynAxis, SynGauss
from apstools.devices import AxisTunerMixin
from apstools.plans import TuneAxis

def my_pre_tune_hook(axis):
    yield from bps.mv(shutter, "open")

def my_post_tune_hook(axis):
    yield from bps.mv(shutter, "close")

class TunableSynAxis(AxisTunerMixin, SynAxis):
    pass

myaxis = TunableSynAxis(name="myaxis")
mydet = SynGauss('mydet', myaxis, 'myaxis', center=0.21, Imax=0.98e5, sigma=0.127)
myaxis.tuner = TuneAxis([mydet], myaxis)
myaxis.pre_tune_method = my_pre_tune_hook
myaxis.post_tune_method = my_post_tune_hook

def tune_myaxis():
    yield from myaxis.tune(md={"plan_name": "tune_myaxis"})

RE(tune_myaxis())
Mixin to add EPICS .DESC field
- add a record's description field to a Device, such as EpicsMotor
- class apstools.devices.description_mixin.EpicsDescriptionMixin(*args: Any, **kwargs: Any)[source]
add a record’s description field to a Device, such as EpicsMotor
EXAMPLE:
from ophyd import EpicsMotor
from apstools.devices import EpicsDescriptionMixin

class MyEpicsMotor(EpicsDescriptionMixin, EpicsMotor):
    pass

m1 = MyEpicsMotor('xxx:m1', name='m1')
print(m1.desc.get())
more ideas:
class TunableSynAxis(AxisTunerMixin, SynAxis):
    """synthetic axis that can be tuned"""

class TunableEpicsMotor(AxisTunerMixin, EpicsMotor):
    """EpicsMotor that can be tuned"""

class EpicsMotorWithDescription(EpicsDescriptionMixin, EpicsMotor):
    """EpicsMotor with description field"""

class EpicsMotorWithMore(
    EpicsDescriptionMixin,
    EpicsMotorLimitsMixin,
    EpicsMotorDialMixin,
    EpicsMotorRawMixin,
    EpicsMotor,
):
    """
    EpicsMotor with more fields

    * description (``desc``)
    * soft motor limits (``soft_limit_hi``, ``soft_limit_lo``)
    * dial coordinates (``dial``)
    * raw coordinates (``raw``)
    """
Eurotherm 2216e Temperature Controller
The 2216e is a temperature controller from Eurotherm.
- Eurotherm 2216e Temperature Controller
According to the vendor's website, the [Eurotherm 2216e Temperature Controller](https://www.eurothermcontrollers.com/eurotherm-2216e-series-controller-now-obsolete/) is obsolete; see the replacement [EPC3016](https://www.eurothermcontrollers.com/eurotherm-epc3016-1-16-din-process-and-temperature-controller/) in the [EPC3000 Series](https://www.eurothermcontrollers.com/epc3000-series).
New in apstools 1.6.0.
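A minimal sketch of creating the device (the PV prefix here is hypothetical):

from apstools.devices import Eurotherm2216e

# hypothetical PV prefix for the controller's IOC
furnace = Eurotherm2216e("2216e:", name="furnace")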
Kohzu double-crystal monochromator
- synApps Kohzu double-crystal monochromator sequence control program
- class apstools.devices.kohzu_monochromator.KohzuSeqCtl_Monochromator(*args: Any, **kwargs: Any)[source]
synApps Kohzu double-crystal monochromator sequence control program
- calibrate_energy(value)[source]
Calibrate the monochromator energy.
PARAMETERS
- value: float
New energy for the current monochromator position.
- energy
alias of
apstools.devices.kohzu_monochromator.KohzuSoftPositioner
- theta
alias of
apstools.devices.kohzu_monochromator.KohzuSoftPositioner
- wavelength
alias of
apstools.devices.kohzu_monochromator.KohzuSoftPositioner
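A minimal sketch of use (the PV prefix is hypothetical), reading the energy positioner and calibrating it with calibrate_energy():

from apstools.devices import KohzuSeqCtl_Monochromator

# hypothetical PV prefix
dcm = KohzuSeqCtl_Monochromator("ioc:KohzuSeqCtl:", name="dcm")
dcm.wait_for_connection()
print(dcm.energy.position)   # current energy
dcm.calibrate_energy(8.0)    # declare the current position to be 8.0 keV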
- class apstools.devices.kohzu_monochromator.KohzuSoftPositioner(*args: Any, **kwargs: Any)[source]
- cb_done(*args, **kwargs)[source]
Called when parent’s done signal changes (EPICS CA monitor event).
- cb_setpoint(*args, **kwargs)[source]
Called when setpoint changes (EPICS CA monitor event).
When the setpoint is changed, force done=False. For any move, done must transition to != done_value, then back to done_value. The next update will refresh the value from the parent device.
- property inposition
Report (boolean) if positioner is done.
Lakeshore temperature controllers
- LakeShore 336 temperature controller.
- LakeShore 340 temperature controller
- class apstools.devices.lakeshore_controllers.LS340_LoopBase(*args: Any, **kwargs: Any)[source]
Base settings for both sample and control loops.
- class apstools.devices.lakeshore_controllers.LS340_LoopControl(*args: Any, **kwargs: Any)[source]
Control specific
- class apstools.devices.lakeshore_controllers.LS340_LoopSample(*args: Any, **kwargs: Any)[source]
Sample specific
- class apstools.devices.lakeshore_controllers.LakeShore336Device(*args: Any, **kwargs: Any)[source]
LakeShore 336 temperature controller.
- loop 1: temperature positioner AND heater, PID, & ramp controls
- loop 2: temperature positioner AND heater, PID, & ramp controls
- loop 3: temperature positioner
- loop 4: temperature positioner
- loop3
alias of
apstools.devices.lakeshore_controllers.LakeShore336_LoopRO
- loop4
alias of
apstools.devices.lakeshore_controllers.LakeShore336_LoopRO
- serial
alias of
apstools.synApps.asyn.AsynRecord
- class apstools.devices.lakeshore_controllers.LakeShore336_LoopControl(*args: Any, **kwargs: Any)[source]
LakeShore 336 temperature controller – with heater control.
The LakeShore 336 accepts up to two heaters.
- class apstools.devices.lakeshore_controllers.LakeShore336_LoopRO(*args: Any, **kwargs: Any)[source]
LakeShore 336 temperature controller – Read-only loop (no heaters).
- class apstools.devices.lakeshore_controllers.LakeShore340Device(*args: Any, **kwargs: Any)[source]
LakeShore 340 temperature controller
- control
alias of
apstools.devices.lakeshore_controllers.LS340_LoopControl
- sample
alias of
apstools.devices.lakeshore_controllers.LS340_LoopSample
- serial
alias of
apstools.synApps.asyn.AsynRecord
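A minimal sketch of creating these devices (the PV prefixes are hypothetical):

from apstools.devices import LakeShore336Device, LakeShore340Device

# hypothetical PV prefixes
ls336 = LakeShore336Device("IOC:LS336:", name="ls336")
ls340 = LakeShore340Device("IOC:LS340:", name="ls340")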
Linkam temperature controllers
- Linkam model CI94 temperature controller
- Linkam model T96 temperature controller
- class apstools.devices.linkam_controllers.Linkam_CI94_Device(*args: Any, **kwargs: Any)[source]
Linkam model CI94 temperature controller
EXAMPLE:
ci94 = Linkam_CI94_Device("IOC:ci94:", name="ci94")
- temperature
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDoneWithStop
Base class for Device Mixins
- Base class for apstools Device mixin classes
Mixin classes for Motor Devices
- add motor record's dial coordinate fields to Device
- mixin providing access to motor enable/disable
- add motor record HLM & LLM fields & compatibility get_lim() and set_lim()
- add motor record's raw coordinate fields to Device
- Add motor record's resolution fields to motor.
- add motor record's servo loop controls to Device
- class apstools.devices.motor_mixins.EpicsMotorDialMixin(*args: Any, **kwargs: Any)[source]
add motor record’s dial coordinate fields to Device
EXAMPLE:
from ophyd import EpicsMotor
from apstools.devices import EpicsMotorDialMixin

class myEpicsMotor(EpicsMotorDialMixin, EpicsMotor):
    pass

m1 = myEpicsMotor('xxx:m1', name='m1')
print(m1.dial.read())
- class apstools.devices.motor_mixins.EpicsMotorEnableMixin(*args: Any, **kwargs: Any)[source]
mixin providing access to motor enable/disable
EXAMPLE:
from ophyd import EpicsMotor
from apstools.devices import EpicsMotorEnableMixin

class MyEpicsMotor(EpicsMotorEnableMixin, EpicsMotor):
    ...

m1 = MyEpicsMotor('xxx:m1', name='m1')
print(m1.enabled)
In a bluesky plan:
yield from bps.mv(m1.enable_disable, m1.MOTOR_DISABLE)
# ... other activities
yield from bps.mv(m1.enable_disable, m1.MOTOR_ENABLE)
- class apstools.devices.motor_mixins.EpicsMotorLimitsMixin(*args: Any, **kwargs: Any)[source]
add motor record HLM & LLM fields & compatibility get_lim() and set_lim()
EXAMPLE:
from ophyd import EpicsMotor
from apstools.devices import EpicsMotorLimitsMixin

class myEpicsMotor(EpicsMotorLimitsMixin, EpicsMotor):
    pass

m1 = myEpicsMotor('xxx:m1', name='m1')
lo = m1.get_lim(-1)
hi = m1.get_lim(1)
m1.set_lim(-25, -5)
print(m1.get_lim(-1), m1.get_lim(1))
m1.set_lim(lo, hi)
- class apstools.devices.motor_mixins.EpicsMotorRawMixin(*args: Any, **kwargs: Any)[source]
add motor record’s raw coordinate fields to Device
EXAMPLE:
from ophyd import EpicsMotor
from apstools.devices import EpicsMotorRawMixin

class myEpicsMotor(EpicsMotorRawMixin, EpicsMotor):
    pass

m1 = myEpicsMotor('xxx:m1', name='m1')
print(m1.raw.read())
- class apstools.devices.motor_mixins.EpicsMotorResolutionMixin(*args: Any, **kwargs: Any)[source]
Add motor record’s resolution fields to motor.
Usually, a facility will not provide such high-level access to calibration parameters since these are associated with fixed parameters of hardware. For simulators, it is convenient to provide access so that default settings (typically low-resolution) from the IOC can be changed as part of the device setup in bluesky.
EXAMPLE:
from ophyd import EpicsMotor
from apstools.devices import EpicsMotorResolutionMixin

class myEpicsMotor(EpicsMotorResolutionMixin, EpicsMotor):
    pass

m1 = myEpicsMotor('xxx:m1', name='m1')
print(f"resolution={m1.resolution.read()}")
print(f"steps_per_rev={m1.steps_per_rev.read()}")
print(f"units_per_rev={m1.units_per_rev.read()}")
- class apstools.devices.motor_mixins.EpicsMotorServoMixin(*args: Any, **kwargs: Any)[source]
add motor record’s servo loop controls to Device
EXAMPLE:
from ophyd import EpicsMotor
from apstools.devices import EpicsMotorServoMixin

class myEpicsMotor(EpicsMotorServoMixin, EpicsMotor):
    pass

m1 = myEpicsMotor('xxx:m1', name='m1')
print(m1.servo.read())
PVPositioner that computes done as a soft signal.

- PVPositioner that computes done as a soft signal.
- PVPositionerSoftDone with stop() and inposition.
- class apstools.devices.positioner_soft_done.PVPositionerSoftDone(*args: Any, **kwargs: Any)[source]
PVPositioner that computes done as a soft signal.

PARAMETERS
- prefix str, optional
The device prefix used for all sub-positioners. This is optional as it may be desirable to specify full PV names for PVPositioners.
- readback_pv str, optional
PV prefix of the readback signal. Disregarded if the readback attribute is created.
- setpoint_pv str, optional
PV prefix of the setpoint signal. Disregarded if the setpoint attribute is created.
- tolerance float, optional
Motion tolerance. The motion is considered done when abs(readback - setpoint) <= tolerance. Defaults to 10^(-1*precision), where precision = setpoint.precision.
- update_target bool
True when this object updates the target Component directly. Use False if the target Component will be updated externally, such as by the controller when target is an EpicsSignal. Defaults to True.
- kwargs :
Passed to ophyd.PVPositioner
ATTRIBUTES
- setpoint Signal
The setpoint (request) signal
- readback Signal or None
The readback PV (e.g., encoder position PV)
- actuate Signal or None
The actuation PV to set when movement is requested
- actuate_value any, optional
The actuation value, sent to the actuate signal when motion is requested
- stop_signal Signal or None
The stop PV to set when motion should be stopped
- stop_value any, optional
The value sent to stop_signal when a stop is requested
- target Signal
The target value of a move request.

Override (in subclass) with EpicsSignal to connect with a PV.

In some controllers (such as temperature controllers), the setpoint may be changed incrementally towards this target value (such as a ramp or controlled trajectory). In such cases, target will be the final value while setpoint will be the current desired position. Otherwise, both setpoint and target will be set to the same value.

(new in apstools 1.5.3)
- cb_readback(*args, **kwargs)[source]
Called when readback changes (EPICS CA monitor event).
Computes if the positioner is done moving:
done = |readback - setpoint| <= tolerance
- cb_setpoint(*args, **kwargs)[source]
Called when setpoint changes (EPICS CA monitor event).
When the setpoint is changed, force done=False. For any move, done must transition to != done_value, then back to done_value.

Without this response, a small move (within tolerance) will not return. The next update of readback will compute self.done.
- class apstools.devices.positioner_soft_done.PVPositionerSoftDoneWithStop(*args: Any, **kwargs: Any)[source]
PVPositionerSoftDone with stop() and inposition.
The stop() method sets the setpoint to the immediate readback value (only when inposition is True). This stops the positioner at the current position.

- property inposition
Report (boolean) if positioner is done.
- stop(*, success=False)[source]
Hold the current readback when stop() is called and not inposition().
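A minimal sketch of direct use, treating a temperature controller as a positioner (the PV names here are hypothetical):

from apstools.devices import PVPositionerSoftDoneWithStop

# hypothetical controller PVs: "IOC:ctrl:temperature" & "IOC:ctrl:setPoint"
temperature = PVPositionerSoftDoneWithStop(
    "IOC:ctrl:",
    readback_pv="temperature",
    setpoint_pv="setPoint",
    tolerance=0.5,   # done when |readback - setpoint| <= 0.5
    name="temperature",
)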
Generalized ophyd Device base class for preamplifiers.
- Generalized interface (base class) for preamplifiers.
- class apstools.devices.preamp_base.PreamplifierBaseDevice(*args: Any, **kwargs: Any)[source]
Generalized interface (base class) for preamplifiers.
All subclasses of PreamplifierBaseDevice must define how to update the gain with the correct value from the amplifier. An example is SRS570_PreAmplifier.
PTC10 Programmable Temperature Controller
The PTC10 is a programmable temperature controller from SRS (Stanford Research Systems). The PTC10 is a modular system consisting of a base unit and provision for addition of add-on boards.
A single, complete ophyd.Device subclass will not describe all the variations that could be installed, but the add-on boards each allow standardization. Each installation must build a custom class that matches its hardware configuration. The APS USAXS instrument has created a custom class based on ophyd.PVPositioner to use their PTC10 as a temperature positioner.
- SRS PTC10 AIO module
- SRS PTC10 RTD module channel
- SRS PTC10 Tc (thermocouple) module channel
- Mixin so SRS PTC10 can be used as a (temperature) positioner.
EXAMPLE:
# imports added so this example is self-contained
# (the EpicsSignalWithRBV import path may vary with ophyd version)
from ophyd import Component, EpicsSignalRO, PVPositioner
from ophyd.areadetector import EpicsSignalWithRBV
from apstools.devices import PTC10AioChannel, PTC10PositionerMixin, PTC10RtdChannel

class MyPTC10(PTC10PositionerMixin, PVPositioner):
    readback = Component(EpicsSignalRO, "2A:temperature", kind="hinted")
    setpoint = Component(EpicsSignalWithRBV, "5A:setPoint", kind="hinted")
    rtd = Component(PTC10RtdChannel, "3A:")
    pid = Component(PTC10AioChannel, "5A:")
ptc10 = MyPTC10("IOC_PREFIX:ptc10:", name="ptc10")
ptc10.report_dmov_changes.put(True) # a diagnostic
ptc10.tolerance.put(1.0) # done when |readback-setpoint|<=tolerance
New in apstools 1.5.3.
- class apstools.devices.ptc10_controller.PTC10AioChannel(*args: Any, **kwargs: Any)[source]
SRS PTC10 AIO module
- class apstools.devices.ptc10_controller.PTC10PositionerMixin(*args: Any, **kwargs: Any)[source]
Mixin so SRS PTC10 can be used as a (temperature) positioner.
- cb_readback(*args, **kwargs): called when readback changes (EPICS CA monitor event).
- cb_setpoint(*args, **kwargs): called when setpoint changes (EPICS CA monitor event).
- inposition: report (boolean) if positioner is done.
- stop(*[, success]): hold the current readback when the stop() method is called and not done.
- cb_setpoint(*args, **kwargs)[source]
Called when setpoint changes (EPICS CA monitor event).
When the setpoint is changed, force done=False. For any move, done MUST change to != done_value, then change back to done_value (True). Without this response, a small move (within tolerance) will not return. The next update of readback will compute self.done.
- property inposition
Report (boolean) if positioner is done.
Scaler support
- configure scaler for only the channels with names assigned in EPICS
Shutters
- APS PSS shutter
- APS PSS shutter with separate status PV
- Shutter, implemented with an EPICS motor moved between two positions
- Shutter using a single EPICS PV moved between two positions
- Shutter Device using one Signal for open and close.
- Base class for all shutter Devices.
- Simulated APS PSS shutter
- class apstools.devices.shutters.ApsPssShutter(*args: Any, **kwargs: Any)[source]
APS PSS shutter
- APS PSS shutters have separate bit PVs for open and close
- set either bit, the shutter moves, and the bit resets a short time later
- no indication from the bits that the shutter has actually moved (see ApsPssShutterWithStatus() for an alternative)

Since there is no direct indication that a shutter has moved, the state property will always return unknown and the isOpen and isClosed properties will always return False.

A consequence of the unknown state is that the shutter will always be commanded to move (and wait the delay_s time), even if it is already at that position. This device could keep track of the last commanded position, but that is not guaranteed to be true since the shutter could be moved from other software.

The default delay_s has been set at 1.2 s to allow for shutter motion. Change this as desired. Advise if this default should be changed.

EXAMPLE:
shutter_a = ApsPssShutter("2bma:A_shutter:", name="shutter")
shutter_a.open()
shutter_a.close()

shutter_a.set("open")
shutter_a.set("close")
When using the shutter in a plan, be sure to use yield from, such as:

def in_a_plan(shutter):
    yield from abs_set(shutter, "open", wait=True)
    # do something
    yield from abs_set(shutter, "close", wait=True)

RE(in_a_plan(shutter_a))
The strings accepted by set() are defined in two lists: valid_open_values and valid_close_values. These lists are treated (internally to set()) as lower case strings.
Example, add “o” & “x” as aliases for “open” & “close”:
shutter_a.addOpenValue("o")
shutter_a.addCloseValue("x")
shutter_a.set("o")
shutter_a.set("x")
- property state
is shutter “open”, “close”, or “unknown”?
- class apstools.devices.shutters.ApsPssShutterWithStatus(*args: Any, **kwargs: Any)[source]
APS PSS shutter with separate status PV
- APS PSS shutters have separate bit PVs for open and close
- set either bit, the shutter moves, and the bit resets a short time later
- a separate status PV tells if the shutter is open or closed (see ApsPssShutter() for an alternative)
EXAMPLE:
A_shutter = ApsPssShutterWithStatus(
    "2bma:A_shutter:",
    "PA:02BM:STA_A_FES_OPEN_PL",
    name="A_shutter")
B_shutter = ApsPssShutterWithStatus(
    "2bma:B_shutter:",
    "PA:02BM:STA_B_SBS_OPEN_PL",
    name="B_shutter")

A_shutter.open()
A_shutter.close()

# or
A_shutter.set("open")
A_shutter.set("close")
When using the shutter in a plan, be sure to use yield from:

def in_a_plan(shutter):
    yield from abs_set(shutter, "open", wait=True)
    # do something
    yield from abs_set(shutter, "close", wait=True)

RE(in_a_plan(A_shutter))
- property state
is shutter “open”, “close”, or “unknown”?
- wait_for_state(target, timeout=10, poll_s=0.01)[source]
wait for the PSS state to reach a desired target
PARAMETERS
- target
[str] : list of strings containing acceptable values
- timeout
non-negative number : maximum amount of time (seconds) to wait for PSS state to reach target
- poll_s
non-negative number : time to wait (seconds) in the first polling cycle. After the first poll, this will be increased by _poll_factor_ up to a maximum time of _poll_s_max_.
- class apstools.devices.shutters.EpicsMotorShutter(*args: Any, **kwargs: Any)[source]
Shutter, implemented with an EPICS motor moved between two positions
EXAMPLE:
tomo_shutter = EpicsMotorShutter("2bma:m23", name="tomo_shutter")
tomo_shutter.close_value = 1.0      # default
tomo_shutter.open_value = 0.0       # default
tomo_shutter.tolerance = 0.01       # default
tomo_shutter.open()
tomo_shutter.close()

# or, when used in a plan
def planA():
    yield from abs_set(tomo_shutter, "open", group="O")
    yield from wait("O")
    yield from abs_set(tomo_shutter, "close", group="X")
    yield from wait("X")

def planA():
    yield from abs_set(tomo_shutter, "open", wait=True)
    yield from abs_set(tomo_shutter, "close", wait=True)

def planA():
    yield from mv(tomo_shutter, "open")
    yield from mv(tomo_shutter, "close")
- property state
is shutter “open”, “close”, or “unknown”?
- class apstools.devices.shutters.EpicsOnOffShutter(*args: Any, **kwargs: Any)[source]
Shutter using a single EPICS PV moved between two positions
Use for a shutter controlled by a single PV which takes a value for the close command and a different value for the open command. The current position is determined by comparing the value of the control with the expected open and close values.
EXAMPLE:
bit_shutter = EpicsOnOffShutter("2bma:bit1", name="bit_shutter")
bit_shutter.close_value = 0      # default
bit_shutter.open_value = 1       # default
bit_shutter.open()
bit_shutter.close()

# or, when used in a plan
def planA():
    yield from mv(bit_shutter, "open")
    yield from mv(bit_shutter, "close")
- class apstools.devices.shutters.OneSignalShutter(*args: Any, **kwargs: Any)[source]
Shutter Device using one Signal for open and close.
PARAMETERS
- signal
EpicsSignal or Signal : (override in subclass) The signal is the communication to the hardware. In a subclass, the hardware may have more than one communication channel to use. See the ApsPssShutter as an example.

See ShutterBase for more parameters.

EXAMPLE
Create a simulated shutter:
shutter = OneSignalShutter(name="shutter")
open the shutter (interactively):
shutter.open()
Check the shutter is open:
In [144]: shutter.isOpen
Out[144]: True
Use the shutter in a Bluesky plan. Set a post-move delay time of 1.0 seconds. Be sure to use yield from, such as:

def in_a_plan(shutter):
    shutter.delay_s = 1.0
    t0 = time.time()
    print("Shutter state: " + shutter.state, time.time() - t0)
    yield from bps.abs_set(shutter, "open", wait=True)  # wait for completion is optional
    print("Shutter state: " + shutter.state, time.time() - t0)
    yield from bps.mv(shutter, "open")  # do it again
    print("Shutter state: " + shutter.state, time.time() - t0)
    yield from bps.mv(shutter, "close")  # ALWAYS waits for completion
    print("Shutter state: " + shutter.state, time.time() - t0)

RE(in_a_plan(shutter))
which gives this output:
Shutter state: close 1.7642974853515625e-05
Shutter state: open 1.0032124519348145
Shutter state: open 1.0057861804962158
Shutter state: close 2.009695529937744
The strings accepted by set() are defined in two lists: valid_open_values and valid_close_values. These lists are treated (internally to set()) as lower case strings.
Example, add “o” & “x” as aliases for “open” & “close”:
shutter.addOpenValue("o")
shutter.addCloseValue("x")
shutter.set("o")
shutter.set("x")
- property state
is shutter “open”, “close”, or “unknown”?
- class apstools.devices.shutters.ShutterBase(*args: Any, **kwargs: Any)[source]
Base class for all shutter Devices.
PARAMETERS
- value
str : any from self.choices (typically "open" or "close")
- valid_open_values
[str] : a list of lower-case text values that are acceptable for use with the set() command to open the shutter.
- valid_close_values
[str] : a list of lower-case text values that are acceptable for use with the set() command to close the shutter.
- open_value
number : the actual value to send to signal to open the shutter. (default = 1)
- close_value
number : the actual value to send to signal to close the shutter. (default = 0)
- delay_s
float : time to wait (s) after move is complete; does not wait if shutter is already in position (default = 0)
- busy
Signal : (internal) tells if a move is in progress
- unknown_state
str : (constant) text reported by state when not open or closed; cannot move to this position (default = "unknown")
- property choices
return list of acceptable choices for set()
- close()[source]
BLOCKING: request shutter to close, called by set().

Must implement in subclass of ShutterBase().

EXAMPLE:

if not self.isClosed:
    self.signal.put(self.close_value)
    if self.delay_s > 0:
        time.sleep(self.delay_s)    # blocking call OK here
- property isClosed
is the shutter closed?
- property isOpen
is the shutter open?
- open()[source]
BLOCKING: request shutter to open, called by set().

Must implement in subclass of ShutterBase().

EXAMPLE:

if not self.isOpen:
    self.signal.put(self.open_value)
    if self.delay_s > 0:
        time.sleep(self.delay_s)    # blocking call OK here
- set(value, **kwargs)[source]
plan: request the shutter to open or close
PARAMETERS
- value
str : any from self.choices (typically "open" or "close")
- kwargs
dict : ignored at this time
- property state
returns open, close, or unknown

Must implement in subclass of ShutterBase().

EXAMPLE:

if self.signal.get() == self.open_value:
    result = self.valid_open_values[0]
elif self.signal.get() == self.close_value:
    result = self.valid_close_values[0]
else:
    result = self.unknown_state
return result
Ophyd support for Stanford Research Systems 570 preamplifier from synApps
Public Structures
- Ophyd support for Stanford Research Systems 570 preamplifier from synApps.
This device connects with the SRS570 support from synApps. (https://github.com/epics-modules/ip/blob/master/ipApp/Db/SR570.db)
The SRS570 synApps support is part of the ip module: https://htmlpreview.github.io/?https://raw.githubusercontent.com/epics-modules/ip/R3-6-1/documentation/swaitRecord.html
- class apstools.devices.srs570_preamplifier.SRS570_PreAmplifier(*args: Any, **kwargs: Any)[source]
Ophyd support for Stanford Research Systems 570 preamplifier from synApps.
- property computed_gain
Amplifier gain (A/V), as floating-point number.
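A minimal sketch of creating the device and reading the computed gain (the PV prefix is hypothetical):

from apstools.devices import SRS570_PreAmplifier

# hypothetical PV prefix of the SR570 database instance
preamp = SRS570_PreAmplifier("xxx:A1", name="preamp")
print(preamp.computed_gain)   # gain as a floating-point number (A/V)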
Struck 3820
- Struck/SIS 3820 Multi-Channel Scaler (as used by USAXS)
Synthetic pseudo-Voigt function
EXAMPLES:
from apstools.devices import SynPseudoVoigt
from ophyd.sim import motor

det = SynPseudoVoigt('det', motor, 'motor',
    center=0, eta=0.5, scale=1, sigma=1, bkg=0)

# scan the "det" peak with the "motor" positioner
# RE(bp.scan([det], motor, -2, 2, 41))
import numpy as np
from apstools.devices import SynPseudoVoigt

synthetic_pseudovoigt = SynPseudoVoigt(
    'synthetic_pseudovoigt', m1, 'm1',
    center=-1.5 + 0.5*np.random.uniform(),
    eta=0.2 + 0.5*np.random.uniform(),
    sigma=0.001 + 0.05*np.random.uniform(),
    scale=1e5,
    bkg=0.01*np.random.uniform())

# scan the "synthetic_pseudovoigt" peak with the "m1" positioner
# RE(bp.scan([synthetic_pseudovoigt], m1, -2, 0, 219))
- Evaluate a point on a pseudo-Voigt based on the value of a motor.
- class apstools.devices.synth_pseudo_voigt.SynPseudoVoigt(*args: Any, **kwargs: Any)[source]
Evaluate a point on a pseudo-Voigt based on the value of a motor.
Provides a signal to be measured. Acts like a detector.
PARAMETERS
- name str :
name of detector signal
- motor positioner :
The independent coordinate
- motor_field str :
name of motor
- center float :
(optional) location of maximum value, default=0
- eta float :
(optional) 0 <= eta < 1.0: Lorentzian fraction, default=0.5
- scale float :
(optional) scale >= 1 : scale factor, default=1
- sigma float :
(optional) sigma > 0 : width, default=1
- bkg float :
(optional) bkg >= 0 : constant background, default=0
- noise
"poisson" or "uniform" or None : add noise to the result.
- noise_multiplier float :
Only relevant for ‘uniform’ noise. Multiply the random amount of noise by ‘noise_multiplier’
Tracking Signal for Device coordination
- Non-EPICS signal for use when coordinating Device actions.
XIA PF4 Filters
- XIA PF4 Filter: one set of 4 filters (A).
- XIA PF4 Filter: two sets of 4 filters (A, B).
- XIA PF4 Filter: three sets of 4 filters (A, B, C).
- XIA PF4 filters - common support.
- A single module of XIA PF4 filters (4-blades).
- LEGACY (use Pf4FilterDual now): Dual (Al, Ti) Xia PF4 filter boxes
- class apstools.devices.xia_pf4.DualPf4FilterBox(*args: Any, **kwargs: Any)[source]
LEGACY (use Pf4FilterDual now): Dual (Al, Ti) Xia PF4 filter boxes
Support from synApps (using Al, Ti foils)
EXAMPLE:
pf4 = DualPf4FilterBox("2bmb:pf4:", name="pf4")
pf4_AlTi = DualPf4FilterBox("9idcRIO:pf4:", name="pf4_AlTi")
- class apstools.devices.xia_pf4.Pf4FilterBank(*args: Any, **kwargs: Any)[source]
A single module of XIA PF4 filters (4-blades).
EXAMPLES:
pf4B = Pf4FilterBank("ioc:pf4:", name="pf4B", bank="B")

# -or-

class MyTriplePf4(Pf4FilterCommon):
    A = Component(Pf4FilterBank, "", bank="A")
    B = Component(Pf4FilterBank, "", bank="B")
    C = Component(Pf4FilterBank, "", bank="C")

pf4 = MyTriplePf4("ioc:pf4:", name="pf4")
- class apstools.devices.xia_pf4.Pf4FilterCommon(*args: Any, **kwargs: Any)[source]
XIA PF4 filters - common support.
Use Pf4FilterCommon to build support for a configuration of PF4 filters (such as 3 or 4 filter banks).

EXAMPLE:
class MyTriplePf4(Pf4FilterCommon):
    A = Component(Pf4FilterBank, "", bank="A")
    B = Component(Pf4FilterBank, "", bank="B")
    C = Component(Pf4FilterBank, "", bank="C")

pf4 = MyTriplePf4("ioc:pf4:", name="pf4")
- class apstools.devices.xia_pf4.Pf4FilterDual(*args: Any, **kwargs: Any)[source]
XIA PF4 Filter: two sets of 4 filters (A, B).
- B
- class apstools.devices.xia_pf4.Pf4FilterSingle(*args: Any, **kwargs: Any)[source]
XIA PF4 Filter: one set of 4 filters (A).
- A
XIA Slit from EPICS synApps optics: xia_slit.db
Coordinates (viewing from detector towards source):
      top
inb       out
      bot
Each blade [1] (in the XIA slit controller) travels in a _cylindrical_ coordinate system. Positive motion moves a blade outwards from the center with a backlash correction. No backlash correction is applied for negative motion (as the blades close). Size and center are computed by the underlying EPICS support.
hsize = inb + out
vsize = top + bot
[1] Note that the blade names here are different than the EPICS support. The difference is to make the names of the blades consistent with other slits in the Bluesky framework.
USAGE:
slit = XiaSlit2D("IOC:hsc1:", name="slit")
print(slit.geometry)
- EPICS synApps optics xia_slit.db 2D support: inb out bot top ...
- class apstools.devices.xia_slit.XiaSlit2D(*args: Any, **kwargs: Any)[source]
EPICS synApps optics xia_slit.db 2D support: inb out bot top …
- property geometry
Return the slit 2D size and center as a namedtuple.
- hcenter
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDone
- hsize
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDone
- vcenter
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDone
- vsize
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDone
File Writers
The file writer callbacks are:
- Base class for filewriter callbacks.
- Customize NXWriter with APS-specific content.
- General class for writing HDF5/NeXus file (using only NeXus base classes).
- Collect data from Bluesky RunEngine documents to write as SPEC data.
Overview
Each filewriter can be used as a callback to the Bluesky RunEngine for live data acquisition, or later as a handler for a document set from the databroker. Methods are provided to handle each document type. The callback method, receiver(document_type, document), receives the set of documents, one by one, and sends them to the appropriate handler. Once the stop document is received, the writer() method is called to write the output file.
Examples
Write SPEC file automatically from data acquisition:
specwriter = apstools.callbacks.SpecWriterCallback()
RE.subscribe(specwriter.receiver)
Write NeXus file automatically from data acquisition:
nxwriter = apstools.callbacks.NXWriter()
RE.subscribe(nxwriter.receiver)
Write APS-specific NeXus file automatically from data acquisition:
nxwriteraps = apstools.callbacks.NXWriterAPS()
RE.subscribe(nxwriteraps.receiver)
Programmer’s Note
Subclassing from object (or no superclass) avoids the need to import bluesky.callbacks.core.CallbackBase; that is one less import when only accessing the Databroker. The only advantage to subclassing from CallbackBase seems to be a simpler setup call to RE.subscribe().
superclass | subscription code
---|---
object | RE.subscribe(specwriter.receiver)
CallbackBase | RE.subscribe(specwriter)
HDF5/NeXus File Writers
FileWriterCallbackBase
Base class for filewriter callbacks.
Applications should subclass and rewrite the writer()
method.
The local buffers are cleared when a start document is received. Content is collected here from each document until the stop document. The content is written once the stop document is received.
The output file will be written to the file named in self.file_name. (Include the output directory if different from the current working directory.)

When self.file_name is None (the default), the make_file_name() method will construct the file name using a combination of the date and time (from the start document), the start document uid, and the scan_id. The default file extension (given in NEXUS_FILE_EXTENSION) is used in make_file_name(). The directory will be self.file_path (or the current working directory if self.file_path is None, which is the default).

Either specify self.file_name or override make_file_name() in a subclass to change the procedure for default output file names.
Almost all metadata keys (additional attributes added to the run’s
start
document) are completely optional. Certain keys are
specified by the RunEngine, some keys are specified by the plan
(or plan support methods), and other keys are supplied by the
user or the instrument team.
These are the keys used by this callback to help guide how information is stored in a NeXus HDF5 file structure.
key | creator | how is it used
---|---|---
detectors | inconsistent | name(s) of the signals used as detectors
motors | inconsistent | synonym for positioners
plan_args | inconsistent | parameters (arguments) given to the plan
plan_name | inconsistent | name of the plan used to collect data
positioners | inconsistent | name(s) of the signals used as positioners
scan_id | RunEngine | incrementing number of the run, user can reset
uid | RunEngine | unique identifier of the run
versions | instrument | documents the software versions used to collect data
For more information about Bluesky events and document types, see https://blueskyproject.io/event-model/data-model.html.
NXWriter
General class for writing HDF5/NeXus file (using only NeXus base classes).
One scan is written to one HDF5/NeXus file.

The output file will be written to the file named in self.file_name. (Include the output directory if different from the current working directory.)

When self.file_name is None (the default), the make_file_name() method will construct the file name using a combination of the date and time (from the start document), the start document uid, and the scan_id. The default file extension (given in NEXUS_FILE_EXTENSION) is used in make_file_name(). The directory will be self.file_path (or the current working directory if self.file_path is None, which is the default).

Either specify self.file_name or override make_file_name() in a subclass to change the procedure for default output file names.
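For example (a sketch, with a hypothetical output file name):

from apstools.callbacks import NXWriter

nxwriter = NXWriter()
nxwriter.file_name = "/tmp/sample.h5"   # hypothetical: pick the output file
RE.subscribe(nxwriter.receiver)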
Almost all metadata keys (additional attributes added to the run’s
start
document) are completely optional. Certain keys are
specified by the RunEngine, some keys are specified by the plan
(or plan support methods), and other keys are supplied by the
user or the instrument team.
These are the keys used by this callback to help guide how information is stored in a NeXus HDF5 file structure.
key | creator | how is it used
---|---|---
detectors | inconsistent | name(s) of the signals used as plottable values
motors | inconsistent | synonym for positioners
plan_args | inconsistent | parameters (arguments) given to the plan
plan_name | inconsistent | name of the plan used to collect data
positioners | inconsistent | name(s) of the positioners used for plotting
scan_id | RunEngine | incrementing number of the run, user can reset
subtitle | user | -tba-
title | user | /entry/title
uid | RunEngine | unique identifier of the run
versions | instrument | documents the software versions used to collect data
Notes:

- detectors[0] will be used as the /entry/data@signal attribute
- the complete list in positioners will be used as the /entry/data@axes attribute
NXWriterAPS
Customize NXWriter with APS-specific content.

- Adds /entry/instrument/undulator group if metadata exists.
- Adds APS information to /entry/instrument/source group.

APS instruments should subclass NXWriterAPS to make customizations for specific plans or other considerations.
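As one sketch of such a customization (the directory, naming rule, and use of the scan_id attribute collected from the start document are assumptions, not part of apstools):

import pathlib
from apstools.callbacks import NXWriterAPS

class MyNXWriter(NXWriterAPS):
    """Sketch: write every file into one directory, named by scan_id."""

    def make_file_name(self):
        # hypothetical naming rule replacing the default date/uid scheme
        return str(pathlib.Path("/tmp/data") / f"scan_{self.scan_id}.h5")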
HDF5/NeXus File Structures
Bluesky stores a wealth of information about a measurement in a run.
Raw data from the Bluesky run is stored in the HDF5/NeXus structure
under the /entry/instrument/bluesky
group as shown in this example:
bluesky:NXnote
  @NX_class = NXnote
  @target = /entry/instrument/bluesky
  plan_name --> /entry/instrument/bluesky/metadata/plan_name
  uid --> /entry/instrument/bluesky/metadata/run_start_uid
The NeXus structure is built using links to the raw data in
/entry/instrument/bluesky
.
Metadata
Metadata from the start
document is stored in the metadata
subgroup
as shown in this example:
metadata:NXnote
  @NX_class = NXnote
  @target = /entry/instrument/bluesky/metadata
  beamline_id:NX_CHAR = b'APS USAXS 9-ID-C'
    @target = /entry/instrument/bluesky/metadata/beamline_id
  datetime:NX_CHAR = b'2019-05-02 17:45:33.904824'
    @target = /entry/instrument/bluesky/metadata/datetime
  detectors:NX_CHAR = b'- I0_USAXS\n'
    @target = /entry/instrument/bluesky/metadata/detectors
    @text_format = yaml
  hints:NX_CHAR = b'dimensions:\n- - - m_stage_r\n  - primary\n'
    @target = /entry/instrument/bluesky/metadata/hints
    @text_format = yaml
  login_id:NX_CHAR = b'usaxs@usaxscontrol.xray.aps.anl.gov'
    @target = /entry/instrument/bluesky/metadata/login_id
  motors:NX_CHAR = b'- m_stage_r\n'
    @target = /entry/instrument/bluesky/metadata/motors
    @text_format = yaml
  pid:NX_INT64[] =
    @target = /entry/instrument/bluesky/metadata/pid
  plan_name:NX_CHAR = b'tune_mr'
    @target = /entry/instrument/bluesky/metadata/plan_name
  plan_type:NX_CHAR = b'generator'
    @target = /entry/instrument/bluesky/metadata/plan_type
  proposal_id:NX_CHAR = b'testing Bluesky installation'
    @target = /entry/instrument/bluesky/metadata/proposal_id
  purpose:NX_CHAR = b'tuner'
    @target = /entry/instrument/bluesky/metadata/purpose
  run_start_uid:NX_CHAR = 2ffe4d87-9f0c-464a-9d14-213ec71afaf7
    @long_name = bluesky run uid
    @target = /entry/instrument/bluesky/metadata/run_start_uid
  tune_md:NX_CHAR = b"initial_position: 8.824977\ntime_iso8601: '2019-05-02 17:45:33.923544'\nwidth: -0.004\n"
    @target = /entry/instrument/bluesky/metadata/tune_md
    @text_format = yaml
  tune_parameters:NX_CHAR = b'initial_position: 8.824977\nnum: 31\npeak_choice: com\nwidth: -0.004\nx_axis: m_stage_r\ny_axis: I0_USAXS\n'
    @target = /entry/instrument/bluesky/metadata/tune_parameters
    @text_format = yaml
  uid --> /entry/instrument/bluesky/run_start_uid
Note that complex structures (lists and dictionaries)
are written as YAML. YAML is easily converted back into the
python structure using yaml.load(yaml_text)
where yaml_text
is the YAML text from the HDF5 file.
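For example, a sketch of reading one YAML-encoded metadata item back (the file name is the one from the examples in this section):

import h5py
import yaml

# hypothetical file from the examples in this section
with h5py.File("20190502-174533-S00108-2ffe4d8.hdf", "r") as root:
    yaml_text = root["/entry/instrument/bluesky/metadata/hints"][()]
hints = yaml.load(yaml_text, Loader=yaml.SafeLoader)
print(hints["dimensions"])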
Streams
Data from each ophyd signal in a document stream is stored as datasets within a NXdata [1] subgroup of that stream. Bluesky collects value(s) and timestamp(s), which are stored in the datasets value and EPOCH, respectively. Since EPOCH is the absolute number of seconds from some time in the deep past, an additional time dataset repeats this data as time relative to the first time. (This makes it much easier for some visualization programs to display value v. time plots.) Additional information, as available (such as units), is added as NeXus attributes. The @target attribute is automatically added to simplify linking any item into the NeXus tree structure.
[1] NXdata is used since some visualization tools recognize it to make a plot.
For example, the main data in a run is usually stored in the primary
stream.
Here, we show the tree structure for one signal (I0_USAXS
) from the primary stream:
primary:NXnote
  @NX_class = NXnote
  @target = /entry/instrument/bluesky/streams/primary
  @uid = 90489e9b-d66e-4753-8c4f-849e7a809aeb
  I0_USAXS:NXdata
    @NX_class = NXdata
    @axes = time
    @signal = value
    @signal_type = detector
    @target = /entry/instrument/bluesky/streams/primary/I0_USAXS
    EPOCH:NX_FLOAT64[31] = [1556837134.972796, 1556837135.589462, 1556837135.989462, '...', 1556837147.656129]
      @long_name = epoch time (s)
      @target = /entry/instrument/bluesky/streams/primary/I0_USAXS/EPOCH
      @units = s
    time:NX_FLOAT64[31] = [0.0, 0.6166660785675049, 1.0166659355163574, '...', 12.683332920074463]
      @long_name = time since first data (s)
      @start_time = 1556837134.972796
      @start_time_iso = 2019-05-02T17:45:34.972796
      @target = /entry/instrument/bluesky/streams/primary/I0_USAXS/time
      @units = s
    value:NX_FLOAT64[31] = [127.0, 126.0, 127.0, '...', 127.0]
      @long_name = I0_USAXS
      @lower_ctrl_limit = 0.0
      @precision = 0
      @signal_type = detector
      @source = PV:9idcLAX:vsc:c0.S2
      @target = /entry/instrument/bluesky/streams/primary/I0_USAXS/value
      @units =
      @upper_ctrl_limit = 0.0
Baseline
In Bluesky, baseline streams record the value (and timestamp) of a
signal at the start and end of the run. Similar to the handling for
streams (above), a subgroup is created for each baseline stream.
The datasets include value
, EPOCH
, time
(as above) and
value_start
and value_end
. Here’s an example:
baseline:NXnote
  @NX_class = NXnote
  @target = /entry/instrument/bluesky/streams/baseline
  @uid = f5fce6ac-f3fa-4c34-b11d-9e33c263cd20
  aps_current:NXdata
    @NX_class = NXdata
    @axes = time
    @signal = value
    @target = /entry/instrument/bluesky/streams/baseline/aps_current
    EPOCH:NX_FLOAT64[2] = [1556837133.753769, 1556837147.753723]
      @long_name = epoch time (s)
      @target = /entry/instrument/bluesky/streams/baseline/aps_current/EPOCH
      @units = s
    time:NX_FLOAT64[2] = [0.0, 13.999953985214233]
      @long_name = time since first data (s)
      @start_time = 1556837133.753769
      @start_time_iso = 2019-05-02T17:45:33.753769
      @target = /entry/instrument/bluesky/streams/baseline/aps_current/time
      @units = s
    value:NX_FLOAT64[2] = [0.004512578244000032, 0.003944485484000011]
      @long_name = aps_current
      @lower_ctrl_limit = 0.0
      @precision = 1
      @source = PV:S:SRcurrentAI
      @target = /entry/instrument/bluesky/streams/baseline/aps_current/value
      @units = mA
      @upper_ctrl_limit = 310.0
    value_end:NX_FLOAT64[] =
      @long_name = aps_current
      @lower_ctrl_limit = 0.0
      @precision = 1
      @source = PV:S:SRcurrentAI
      @target = /entry/instrument/bluesky/streams/baseline/aps_current/value_end
      @units = mA
      @upper_ctrl_limit = 310.0
    value_start:NX_FLOAT64[] =
      @long_name = aps_current
      @lower_ctrl_limit = 0.0
      @precision = 1
      @source = PV:S:SRcurrentAI
      @target = /entry/instrument/bluesky/streams/baseline/aps_current/value_start
      @units = mA
      @upper_ctrl_limit = 310.0
Full Structure
The full structure of the example HDF5/NeXus file (omitting the attributes
and array data for brevity) is shown next. You can see that most of the
NeXus structure is completed by making links to data from either
/entry/instrument/bluesky/metadata
or /entry/instrument/bluesky/streams
:
20190502-174533-S00108-2ffe4d8.hdf : NeXus data file
  entry:NXentry
    duration:NX_FLOAT64[] = [ ... ]
    end_time:NX_CHAR = 2019-05-02T17:45:48.078618
    entry_identifier --> /entry/instrument/bluesky/uid
    plan_name --> /entry/instrument/bluesky/metadata/plan_name
    program_name:NX_CHAR = bluesky
    start_time:NX_CHAR = 2019-05-02T17:45:33.937294
    title:NX_CHAR = tune_mr-S0108-2ffe4d8
    contact:NXuser
      affiliation --> /entry/instrument/bluesky/streams/baseline/bss_user_info_institution/value_start
      email --> /entry/instrument/bluesky/streams/baseline/bss_user_info_email/value_start
      facility_user_id --> /entry/instrument/bluesky/streams/baseline/bss_user_info_badge/value_start
      name --> /entry/instrument/bluesky/streams/baseline/bss_user_info_contact/value_start
      role:NX_CHAR = contact
    data:NXdata
      EPOCH --> /entry/instrument/bluesky/streams/primary/scaler0_time/time
      I0_USAXS --> /entry/instrument/bluesky/streams/primary/I0_USAXS/value
      m_stage_r --> /entry/instrument/bluesky/streams/primary/m_stage_r/value
      m_stage_r_soft_limit_hi --> /entry/instrument/bluesky/streams/primary/m_stage_r_soft_limit_hi/value
      m_stage_r_soft_limit_lo --> /entry/instrument/bluesky/streams/primary/m_stage_r_soft_limit_lo/value
      m_stage_r_user_setpoint --> /entry/instrument/bluesky/streams/primary/m_stage_r_user_setpoint/value
      scaler0_display_rate --> /entry/instrument/bluesky/streams/primary/scaler0_display_rate/value
      scaler0_time --> /entry/instrument/bluesky/streams/primary/scaler0_time/value
    instrument:NXinstrument
      bluesky:NXnote
        plan_name --> /entry/instrument/bluesky/metadata/plan_name
        uid --> /entry/instrument/bluesky/metadata/run_start_uid
        metadata:NXnote
          APSTOOLS_VERSION:NX_CHAR = b'1.1.0'
          BLUESKY_VERSION:NX_CHAR = b'1.5.2'
          EPICS_CA_MAX_ARRAY_BYTES:NX_CHAR = b'1280000'
          EPICS_HOST_ARCH:NX_CHAR = b'linux-x86_64'
          OPHYD_VERSION:NX_CHAR = b'1.3.3'
          beamline_id:NX_CHAR = b'APS USAXS 9-ID-C'
          datetime:NX_CHAR = b'2019-05-02 17:45:33.904824'
          detectors:NX_CHAR = b'- I0_USAXS\n'
          hints:NX_CHAR = b'dimensions:\n- - - m_stage_r\n  - primary\n'
          login_id:NX_CHAR = b'usaxs@usaxscontrol.xray.aps.anl.gov'
          motors:NX_CHAR = b'- m_stage_r\n'
          pid:NX_INT64[] = [ ... ]
          plan_name:NX_CHAR = b'tune_mr'
          plan_type:NX_CHAR = b'generator'
          proposal_id:NX_CHAR = b'testing Bluesky installation'
          purpose:NX_CHAR = b'tuner'
          run_start_uid:NX_CHAR = 2ffe4d87-9f0c-464a-9d14-213ec71afaf7
          tune_md:NX_CHAR = b"initial_position: 8.824977\ntime_iso8601: '2019-05-02 17:45:33.923544'\nwidth: -0.004\n"
          tune_parameters:NX_CHAR = b'initial_position: 8.824977\nnum: 31\npeak_choice: com\nwidth: -0.004\nx_axis: m_stage_r\ny_axis: I0_USAXS\n'
          uid --> /entry/instrument/bluesky/run_start_uid
        streams:NXnote
          baseline:NXnote
            aps_current:NXdata
              EPOCH:NX_FLOAT64[2] = [ ... ]
              time:NX_FLOAT64[2] = [ ... ]
              value:NX_FLOAT64[2] = [ ... ]
              value_end:NX_FLOAT64[] = [ ... ]
              value_start:NX_FLOAT64[] = [ ... ]
            aps_fill_number:NXdata
              EPOCH:NX_FLOAT64[2] = [ ... ]
              time:NX_FLOAT64[2] = [ ... ]
              value:NX_FLOAT64[2] = [ ... ]
              value_end:NX_FLOAT64[] = [ ... ]
              value_start:NX_FLOAT64[] = [ ... ]
            aps_global_feedback:NXdata
              EPOCH:NX_FLOAT64[2] = [ ... ]
              time:NX_FLOAT64[2] = [ ... ]
              value:NX_CHAR[3,3] = ["Off", "Off"]
              value_end:NX_CHAR = b'Off'
              value_start:NX_CHAR = b'Off'
            # many baseline groups omitted for brevity
          primary:NXnote
            I0_USAXS:NXdata
              EPOCH:NX_FLOAT64[31] = [ ... ]
              time:NX_FLOAT64[31] = [ ... ]
              value:NX_FLOAT64[31] = [ ... ]
            m_stage_r:NXdata
              EPOCH:NX_FLOAT64[31] = [ ... ]
              time:NX_FLOAT64[31] = [ ... ]
              value:NX_FLOAT64[31] = [ ... ]
            m_stage_r_soft_limit_hi:NXdata
              EPOCH:NX_FLOAT64[31] = [ ... ]
              time:NX_FLOAT64[31] = [ ... ]
              value:NX_FLOAT64[31] = [ ... ]
            m_stage_r_soft_limit_lo:NXdata
              EPOCH:NX_FLOAT64[31] = [ ... ]
              time:NX_FLOAT64[31] = [ ... ]
              value:NX_FLOAT64[31] = [ ... ]
            m_stage_r_user_setpoint:NXdata
              EPOCH:NX_FLOAT64[31] = [ ... ]
              time:NX_FLOAT64[31] = [ ... ]
              value:NX_FLOAT64[31] = [ ... ]
            scaler0_display_rate:NXdata
              EPOCH:NX_FLOAT64[31] = [ ... ]
              time:NX_FLOAT64[31] = [ ... ]
              value:NX_FLOAT64[31] = [ ... ]
            scaler0_time:NXdata
              EPOCH:NX_FLOAT64[31] = [ ... ]
              time:NX_FLOAT64[31] = [ ... ]
              value:NX_FLOAT64[31] = [ ... ]
      detectors:NXnote
        I0_USAXS:NXdetector
          data --> /entry/instrument/bluesky/streams/primary/I0_USAXS
      monochromator:NXmonochromator
        energy --> /entry/instrument/bluesky/streams/baseline/monochromator_dcm_energy/value_start
        feedback_on --> /entry/instrument/bluesky/streams/baseline/monochromator_feedback_on/value_start
        mode --> /entry/instrument/bluesky/streams/baseline/monochromator_dcm_mode/value_start
        theta --> /entry/instrument/bluesky/streams/baseline/monochromator_dcm_theta/value_start
        wavelength --> /entry/instrument/bluesky/streams/baseline/monochromator_dcm_wavelength/value_start
        y_offset --> /entry/instrument/bluesky/streams/baseline/monochromator_dcm_y_offset/value_start
      positioners:NXnote
        m_stage_r:NXpositioner
          value --> /entry/instrument/bluesky/streams/primary/m_stage_r
      source:NXsource
        name:NX_CHAR = Bluesky framework
        probe:NX_CHAR = x-ray
        type:NX_CHAR = Synchrotron X-ray Source
APS-specific HDF5/NeXus File Structures
Examples of additional structure in NeXus file added by
NXWriterAPS()
:
source:NXsource
  @NX_class = NXsource
  @target = /entry/instrument/source
  current --> /entry/instrument/bluesky/streams/baseline/aps_current/value_start
  energy:NX_INT64[] =
    @units = GeV
  fill_number --> /entry/instrument/bluesky/streams/baseline/aps_fill_number/value_start
  name:NX_CHAR = Advanced Photon Source
    @short_name = APS
  probe:NX_CHAR = x-ray
  type:NX_CHAR = Synchrotron X-ray Source

undulator:NXinsertion_device
  @NX_class = NXinsertion_device
  @target = /entry/instrument/undulator
  device --> /entry/instrument/bluesky/streams/baseline/undulator_downstream_device/value_start
  energy --> /entry/instrument/bluesky/streams/baseline/undulator_downstream_energy/value_start
  energy_taper --> /entry/instrument/bluesky/streams/baseline/undulator_downstream_energy_taper/value_start
  gap --> /entry/instrument/bluesky/streams/baseline/undulator_downstream_gap/value_start
  gap_taper --> /entry/instrument/bluesky/streams/baseline/undulator_downstream_gap_taper/value_start
  harmonic_value --> /entry/instrument/bluesky/streams/baseline/undulator_downstream_harmonic_value/value_start
  location --> /entry/instrument/bluesky/streams/baseline/undulator_downstream_location/value_start
  total_power --> /entry/instrument/bluesky/streams/baseline/undulator_downstream_total_power/value_start
  type:NX_CHAR = undulator
  version --> /entry/instrument/bluesky/streams/baseline/undulator_downstream_version/value_start
SPEC File Structure
EXAMPLE : use as Bluesky callback:
from apstools import SpecWriterCallback
specwriter = SpecWriterCallback()
RE.subscribe(specwriter.receiver)
EXAMPLE : use as writer from Databroker:
from apstools import SpecWriterCallback
specwriter = SpecWriterCallback()
for key, doc in db.get_documents(db["a123456"]):
specwriter.receiver(key, doc)
print("Look at SPEC data file: "+specwriter.spec_filename)
EXAMPLE : use as writer from Databroker with customizations:
from apstools import SpecWriterCallback
# write into file: /tmp/cerium.spec
specwriter = SpecWriterCallback(filename="/tmp/cerium.spec")
for key, doc in db.get_documents(db["abcd123"]):
specwriter.receiver(key, doc)
# write into file: /tmp/barium.dat
specwriter.newfile("/tmp/barium.dat")
for key, doc in db.get_documents(db["b46b63d4"]):
specwriter.receiver(key, doc)
Example output from SpecWriterCallback():
#F test_specdata.txt
#E 1510948301
#D Fri Nov 17 13:51:41 2017
#C BlueSky user = mintadmin host = mint-vm

#S 233 scan(detectors=['synthetic_pseudovoigt'], num=20, motor=['m1'], start=-1.65, stop=-1.25, per_step=None)
#D Fri Nov 17 11:58:56 2017
#C Fri Nov 17 11:58:56 2017. plan_type = generator
#C Fri Nov 17 11:58:56 2017. uid = ddb81ac5-f3ee-4219-b047-c1196d08a5c1
#MD beamline_id = developer__YOUR_BEAMLINE_HERE
#MD login_id = mintadmin@mint-vm
#MD motors = ['m1']
#MD num_intervals = 19
#MD num_points = 20
#MD pid = 7133
#MD plan_pattern = linspace
#MD plan_pattern_args = {'start': -1.65, 'stop': -1.25, 'num': 20}
#MD plan_pattern_module = numpy
#MD proposal_id = None
#N 20
#L m1 m1_user_setpoint Epoch_float Epoch synthetic_pseudovoigt
-1.6500000000000001 -1.65 8.27465009689331 8 2155.6249784809206
-1.6288 -1.6289473684210525 8.46523666381836 8 2629.5229081466964
-1.608 -1.6078947368421053 8.665581226348877 9 3277.4074328018964
-1.5868 -1.5868421052631578 8.865738153457642 9 4246.145049452576
-1.5656 -1.5657894736842104 9.066259145736694 9 5825.186516381953
-1.5448000000000002 -1.5447368421052632 9.266754627227783 9 8803.414029867528
-1.5236 -1.5236842105263158 9.467074871063232 9 15501.419687691103
-1.5028000000000001 -1.5026315789473683 9.667330741882324 10 29570.38936784884
-1.4816 -1.4815789473684209 9.867793798446655 10 55562.3437459487
-1.4604000000000001 -1.4605263157894737 10.067811012268066 10 89519.64275090238
-1.4396 -1.4394736842105262 10.268356084823608 10 97008.97190269837
-1.4184 -1.418421052631579 10.470621824264526 10 65917.29757650592
-1.3972 -1.3973684210526316 10.669955730438232 11 36203.46726798266
-1.3764 -1.3763157894736842 10.870310306549072 11 18897.64061096024
-1.3552 -1.3552631578947367 11.070487976074219 11 10316.223844200193
-1.3344 -1.3342105263157895 11.271018743515015 11 6540.179615556269
-1.3132000000000001 -1.313157894736842 11.4724280834198 11 4643.555421314616
-1.292 -1.2921052631578946 11.673305034637451 12 3533.8582404216445
-1.2712 -1.2710526315789474 11.874176025390625 12 2809.1872596809008
-1.25 -1.25 12.074703216552734 12 2285.9226305883626
#C Fri Nov 17 11:59:08 2017. num_events_primary = 20
#C Fri Nov 17 11:59:08 2017. time = 2017-11-17 11:59:08.324011
#C Fri Nov 17 11:59:08 2017. exit_status = success
Source Code
Base Class for File Writer Callbacks
- class apstools.callbacks.callback_base.FileWriterCallbackBase(*args, **kwargs)[source]
Base class for filewriter callbacks.
New with apstools release 1.3.0.
Applications should subclass and rewrite the writer() method.
The local buffers are cleared when a start document is received. Content is collected from each document until the stop document. The content is written once the stop document is received.
User Interface methods
receiver(key, doc) : bluesky callback (handles a stream of documents)
Internal methods
clear() : delete any saved data from the cache and reinitialize
make_file_name() : generate a file name to be used as default
writer() : print summary of run as diagnostic
Document Handler methods
bulk_events(doc) : Deprecated.
datum(doc) : Like an event, but for data recorded outside of bluesky.
descriptor(doc) : description of the data stream to be acquired
event(doc) : a single "row" of data
resource(doc) : like a descriptor, but for data recorded outside of bluesky
start(doc) : beginning of a run, clear cache and collect metadata
stop(doc) : end of the run, end collection and initiate the writer() method
- datum(doc)[source]
Like an event, but for data recorded outside of bluesky.
Example:

Datum
=====
datum_id     : 621caa0f-70f1-4e3d-8718-b5123d434502/0
datum_kwargs : HDF5_file_name : /mnt/usaxscontrol/USAXS_data/2020-06/06_10_Minjee_waxs/AGIX3N1_0699.hdf
               point_number : 0
resource     : 621caa0f-70f1-4e3d-8718-b5123d434502
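For orientation, here is a minimal sketch of a subclass, assuming only the API documented above; the file name and log format are illustrative, RE is the session's RunEngine, and the scan_id and uid attributes are assumed to be collected by the base class from the start document:

from apstools.callbacks.callback_base import FileWriterCallbackBase

class TextSummaryCallback(FileWriterCallbackBase):
    """Illustrative subclass: log each completed run to a text file."""

    def writer(self):
        # called automatically once the stop document is received
        with open("run_log.txt", "a") as f:
            f.write(f"scan_id={self.scan_id}  uid={self.uid}\n")

callback = TextSummaryCallback()
RE.subscribe(callback.receiver)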
NeXus File Writer Callbacks
NXWriter : General class for writing HDF5/NeXus file (using only NeXus base classes).
NXWriterAPS : Customize NXWriter with APS-specific content.
- class apstools.callbacks.nexus_writer.NXWriter(*args, **kwargs)[source]
General class for writing HDF5/NeXus file (using only NeXus base classes).
New with apstools release 1.3.0.
One scan is written to one HDF5/NeXus file.
METHODS
writer() : write collected data to HDF5/NeXus data file
h5string(text) : Format string for h5py interface.
add_dataset_attributes(ds, v[, long_name]) : add attributes from v dictionary to dataset ds
assign_signal_type() : decide if a signal in the primary stream is a detector or a positioner
create_NX_group(parent, specification) : create an h5 group with named NeXus class (specification)
get_sample_title() : return the title for this sample
get_stream_link(signal[, stream, ref]) : return the h5 object for signal
write_data(parent) : group: /entry/data:NXdata
write_detector(parent) : group: /entry/instrument/detectors:NXnote/DETECTOR:NXdetector
write_entry() : group: /entry/data:NXentry
write_instrument(parent) : group: /entry/instrument:NXinstrument
write_metadata(parent) : group: /entry/instrument/bluesky/metadata:NXnote
write_monochromator(parent) : group: /entry/instrument/monochromator:NXmonochromator
write_positioner(parent) : group: /entry/instrument/positioners:NXnote/POSITIONER:NXpositioner
write_root(filename) : root of the HDF5 file
write_sample(parent) : group: /entry/sample:NXsample
write_slits(parent) : group: /entry/instrument/slits:NXnote/SLIT:NXslit
write_source(parent) : group: /entry/instrument/source:NXsource
write_streams(parent) : group: /entry/instrument/bluesky/streams:NXnote
write_user(parent) : group: /entry/contact:NXuser
- add_dataset_attributes(ds, v, long_name=None)[source]
add attributes from v dictionary to dataset ds
- assign_signal_type()[source]
decide if a signal in the primary stream is a detector or a positioner
- create_NX_group(parent, specification)[source]
create an h5 group with named NeXus class (specification)
- getResourceFile(resource_id)[source]
full path to the resource file specified by uid resource_id
override in subclass as needed
- get_sample_title()[source]
return the title for this sample
default title: {plan_name}-S{scan_id}-{short_uid}
- get_stream_link(signal, stream=None, ref=None)[source]
return the h5 object for signal
DEFAULTS
stream : baseline
key : value_start
- write_data(parent)[source]
group: /entry/data:NXdata
- write_entry()[source]
group: /entry/data:NXentry
- write_metadata(parent)[source]
group: /entry/instrument/bluesky/metadata:NXnote
metadata from the bluesky start document
- write_positioner(parent)[source]
group: /entry/instrument/positioners:NXnote/POSITIONER:NXpositioner
- write_slits(parent)[source]
group: /entry/instrument/slits:NXnote/SLIT:NXslit
override in subclass to store content, name patterns vary with each instrument
- write_source(parent)[source]
group: /entry/instrument/source:NXsource
Note: this is (somewhat) generic, override for a different source
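As one example of the intended customization, a subclass might implement write_slits() using the documented helpers create_NX_group() and get_stream_link(). This is a sketch only: the group layout follows the documented /entry/instrument/slits:NXnote/SLIT:NXslit pattern, and guard_slit_x_gap is a hypothetical baseline signal name:

from apstools.callbacks.nexus_writer import NXWriter

class MyInstrumentNXWriter(NXWriter):
    """Illustrative subclass: record one slit from the baseline stream."""

    def write_slits(self, parent):
        group = self.create_NX_group(parent, "slits:NXnote")
        slit = self.create_NX_group(group, "guard_slit:NXslit")
        # hard-link the baseline value_start dataset (hypothetical signal name)
        slit["x_gap"] = self.get_stream_link("guard_slit_x_gap")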
- class apstools.callbacks.nexus_writer.NXWriterAPS(*args, **kwargs)[source]
Customize NXWriter with APS-specific content.
New with apstools release 1.3.0.
Adds /entry/instrument/undulator group if metadata exists.
Adds APS information to /entry/instrument/source group.
write_instrument(parent) : group: /entry/instrument:NXinstrument
write_source(parent) : group: /entry/instrument/source:NXsource
write_undulator(parent) : group: /entry/instrument/undulator:NXinsertion_device
SPEC Data File Writer Callback
EXAMPLE:
Execution of this plan (with RE(myPlan())):
def myPlan():
    yield from bps.open_run()
    spec_comment("this is a start document comment", "start")
    spec_comment("this is a descriptor document comment", "descriptor")
    yield from bps.checkpoint()
    yield from bps.trigger_and_read([scaler])
    spec_comment("this is an event document comment after the first read")
    yield from bps.sleep(2)
    yield from bps.checkpoint()
    yield from bps.trigger_and_read([scaler])
    spec_comment("this is an event document comment after the second read")
    spec_comment("this is a stop document comment", "stop")
    yield from bps.close_run()
results in this SPEC file output:
#S 1145 myPlan()
#D Mon Jan 28 12:48:09 2019
#C Mon Jan 28 12:48:09 2019. plan_type = generator
#C Mon Jan 28 12:48:09 2019. uid = ef98648a-8e3a-4e7e-ac99-3290c9b5fca7
#C Mon Jan 28 12:48:09 2019. this is a start document comment
#C Mon Jan 28 12:48:09 2019. this is a descriptor document comment
#MD APSTOOLS_VERSION = 2019.0103.0+5.g0f4e8b2
#MD BLUESKY_VERSION = 1.4.1
#MD OPHYD_VERSION = 1.3.0
#MD SESSION_START = 2019-01-28 12:19:25.446836
#MD beamline_id = developer
#MD ipython_session_start = 2018-02-14 12:54:06.447450
#MD login_id = mintadmin@mint-vm
#MD pid = 21784
#MD proposal_id = None
#N 2
#L Epoch_float scaler_time Epoch
1.4297869205474854 1.1 1
4.596935987472534 1.1 5
#C Mon Jan 28 12:48:11 2019. this is an event document comment after the first read
#C Mon Jan 28 12:48:14 2019. this is an event document comment after the second read
#C Mon Jan 28 12:48:14 2019. this is a stop document comment
#C Mon Jan 28 12:48:14 2019. num_events_primary = 2
#C Mon Jan 28 12:48:14 2019. exit_status = success
SpecWriterCallback : Collect data from Bluesky RunEngine documents to write as SPEC data.
spec_comment : make it easy to add spec-style comments in a custom plan
- class apstools.callbacks.spec_file_writer.SpecWriterCallback(filename=None, auto_write=True, RE=None, reset_scan_id=False)[source]
Collect data from Bluesky RunEngine documents to write as SPEC data.
This gathers data from all documents in a scan and appends the scan to the file when the stop document is received. One or more scans can be written to the same file. The file format is text.
Note: SpecWriterCallback() does not inherit from FileWriterCallbackBase().
PARAMETERS
- filename
string : (optional) Local, relative or absolute name of SPEC data file to be used. If filename=None, defaults to a name of the format YYYYmmdd-HHMMSS.dat derived from the current system time.
- auto_write
boolean : (optional) If True (default), write_scan() is called when a stop document is received. If False, the caller is responsible for calling write_scan() before the next start document is received.
- RE
object : Instance of bluesky.RunEngine or None.
- reset_scan_id
boolean : (optional) If True, and filename exists, then sets RE.md.scan_id to the highest scan number in the existing SPEC data file. default: False
User Interface methods
receiver(key, document) : Bluesky callback: receive all documents for handling
newfile([filename, scan_id, RE]) : prepare to use a new SPEC data file
usefile(filename) : read from existing SPEC data file
make_default_filename() : generate a file name to be used as default
clear() : reset all scan data defaults
prepare_scan_contents() : format the scan for a SPEC data file
write_scan() : write the most recent (completed) scan to the file
Internal methods
write_header() : write the header section of a SPEC data file
start(doc) : handle start documents
descriptor(doc) : handle descriptor documents
event(doc) : handle event documents
bulk_events(doc) : handle bulk_events documents
datum(doc) : handle datum documents
resource(doc) : handle resource documents
stop(doc) : handle stop documents
- descriptor(doc)[source]
handle descriptor documents
prepare for primary scan data, ignore any other data stream
- newfile(filename=None, scan_id=None, RE=None)[source]
prepare to use a new SPEC data file
but don’t create it until we have data
- apstools.callbacks.spec_file_writer.spec_comment(comment, doc=None, writer=None)[source]
make it easy to add spec-style comments in a custom plan
These comments only go into the SPEC data file.
PARAMETERS
- comment string :
(optional) Comment text to be written. SPEC expects it to be only one line!
- doc string :
(optional) Bluesky RunEngine document type. One of: start descriptor event resource datum stop (default: event)
- writer obj :
(optional) Instance of SpecWriterCallback(), typically: specwriter = SpecWriterCallback()
Plans
Plans that might be useful at the APS when using Bluesky.
Plans and Support by Activity
Batch Scanning
execute_command_list : plan: execute the command list
get_command_list : return command list from either text or Excel file
parse_Excel_command_file : parse an Excel spreadsheet with commands, return as command list
parse_text_command_file : parse a text file with commands, return as command list
register_command_handler : Define the function called to execute the command list
run_command_file : plan: execute a list of commands from a text or Excel file
summarize_command_file : print the command list from a text or Excel file
Custom Scans
documentation_run : Save text as a bluesky run.
lineup : lineup and center a given axis, relative to current position
nscan : Scan over n variables moved together, each in equally spaced steps.
snapshot : bluesky plan: record current values of list of ophyd signals
sscan_1D : simple 1-D scan using EPICS synApps sscan record
TuneAxis : tune an axis with a signal
tune_axes : Bluesky plan to tune a list of axes in sequence
Overall
addDeviceDataAsStream : add an ophyd Device as an additional document stream
command_list_as_table : format a command list as a pyRestTable.Table object
documentation_run : Save text as a bluesky run.
execute_command_list : plan: execute the command list
get_command_list : return command list from either text or Excel file
lineup : lineup and center a given axis, relative to current position
nscan : Scan over n variables moved together, each in equally spaced steps.
parse_Excel_command_file : parse an Excel spreadsheet with commands, return as command list
parse_text_command_file : parse a text file with commands, return as command list
register_command_handler : Define the function called to execute the command list
run_command_file : plan: execute a list of commands from a text or Excel file
snapshot : bluesky plan: record current values of list of ophyd signals
sscan_1D : simple 1-D scan using EPICS synApps sscan record
summarize_command_file : print the command list from a text or Excel file
TuneAxis : tune an axis with a signal
tune_axes : Bluesky plan to tune a list of axes in sequence
Also consult the Index under the Bluesky heading for links to the Callbacks, Devices, Exceptions, and Plans described here.
Submodules
Alignment plans
lineup : lineup and center a given axis, relative to current position
tune_axes : Bluesky plan to tune a list of axes in sequence
TuneAxis : tune an axis with a signal
TuneResults : Provides bps.read() as a Device
- class apstools.plans.alignment.TuneAxis(signals, axis, signal_name=None)[source]
tune an axis with a signal
This class provides a tuning object so that a Device or other entity may gain its own tuning process, keeping track of the particulars needed to tune this device again. For example, one could add a tuner to a motor stage:
motor = EpicsMotor("xxx:motor", name="motor")
motor.tuner = TuneAxis([det], motor)
Then the motor could be tuned individually:

RE(motor.tuner.tune(md={"activity": "tuning"}))

or the tune() could be part of a plan with other steps.
Example:

tuner = TuneAxis([det], axis)
live_table = LiveTable(["axis", "det"])
RE(tuner.multi_pass_tune(width=2, num=9), live_table)
RE(tuner.tune(width=0.05, num=9), live_table)
Also see the jupyter notebook referenced here: Example: the TuneAxis() class.
tune([width, num, peak_factor, md]) : Bluesky plan to execute one pass through the current scan range
multi_pass_tune([width, step_factor, num, ...]) : Bluesky plan for tuning this axis with this signal
peak_detected([peak_factor]) : returns True if a peak was detected, otherwise False
SEE ALSO
tune_axes(axes) : Bluesky plan to tune a list of axes in sequence
- multi_pass_tune(width=None, step_factor=None, num=None, pass_max=None, peak_factor=None, snake=None, md=None)[source]
Bluesky plan for tuning this axis with this signal
Execute multiple passes to refine the centroid determination. Each subsequent pass will reduce the width of the scan by step_factor. If snake=True, the scan direction will reverse with each subsequent pass.
PARAMETERS
- width
float : width of the tuning scan in the units of self.axis. Default value in self.width (initially 1)
- num
int : number of steps. Default value in self.num (initially 10)
- step_factor
float : This reduces the width of the next tuning scan by the given factor. Default value in self.step_factor (initially 4)
- pass_max
int : Maximum number of passes to be executed (avoids runaway scans when a centroid is not found). Default value in self.pass_max (initially 4)
- peak_factor
float : peak maximum must be greater than peak_factor*minimum (default: 4)
- snake
bool : If True, reverse scan direction on next pass. Default value in self.snake (initially True)
- md
dict : (optional) metadata
- peak_detected(peak_factor=None)[source]
returns True if a peak was detected, otherwise False
The default algorithm identifies a peak when the maximum value is four times the minimum value. Change this routine by subclassing TuneAxis and overriding peak_detected().
- tune(width=None, num=None, peak_factor=None, md=None)[source]
Bluesky plan to execute one pass through the current scan range
Scan self.axis centered about the current position from -width/2 to +width/2 with num observations. If a peak was detected (default check is that max >= 4*min), then set self.tune_ok = True.
PARAMETERS
- width
float : width of the tuning scan in the units of self.axis. Default value in self.width (initially 1)
- num
int : number of steps. Default value in self.num (initially 10)
- md
dict : (optional) metadata
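Since peak_detected() is the documented override point, here is a sketch of a subclass that relaxes the default max >= 4*min test (the factor of 2 is illustrative):

from apstools.plans.alignment import TuneAxis

class GentleTuneAxis(TuneAxis):
    """Illustrative: accept a peak when max >= 2*min."""

    def peak_detected(self, peak_factor=None):
        # fall back to a smaller factor than the default of 4
        return super().peak_detected(peak_factor=peak_factor or 2)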
- class apstools.plans.alignment.TuneResults(*args: Any, **kwargs: Any)[source]
Provides bps.read() as a Device
- apstools.plans.alignment.lineup(counter, axis, minus, plus, npts, time_s=0.1, peak_factor=4, width_factor=0.8, md=None)[source]
lineup and center a given axis, relative to current position
PARAMETERS
- counter
object : instance of ophyd.Signal (or subclass such as ophyd.scaler.ScalerChannel) dependent measurement to be maximized
- axis
movable object : instance of ophyd.Signal (or subclass such as EpicsMotor) independent axis to use for alignment
- minus
float : first point of scan at this offset from starting position
- plus
float : last point of scan at this offset from starting position
- npts
int : number of data points in the scan
- time_s
float : count time per step (if counter is ScalerChannel object) (default: 0.1)
- peak_factor
float : peak maximum must be greater than peak_factor*minimum (default: 4)
- width_factor
float : fwhm must be less than width_factor*plot_range (default: 0.8)
EXAMPLE:
RE(lineup(diode, foemirror.theta, -30, 30, 30, 1.0))
Command list support
CommandFileReadError : Exception when reading a command file.
command_list_as_table : format a command list as a pyRestTable.Table object
execute_command_list : plan: execute the command list
get_command_list : return command list from either text or Excel file
parse_Excel_command_file : parse an Excel spreadsheet with commands, return as command list
parse_text_command_file : parse a text file with commands, return as command list
register_command_handler : Define the function called to execute the command list
run_command_file : plan: execute a list of commands from a text or Excel file
summarize_command_file : print the command list from a text or Excel file
- exception apstools.plans.command_list.CommandFileReadError[source]
Exception when reading a command file.
- apstools.plans.command_list.command_list_as_table(commands, show_raw=False)[source]
format a command list as a pyRestTable.Table object
- apstools.plans.command_list.execute_command_list(filename, commands, md=None)[source]
plan: execute the command list
The command list is a tuple described below.
Only recognized commands will be executed.
Unrecognized commands will be reported as comments.
See example implementation with APS USAXS instrument: https://github.com/APS-USAXS/ipython-usaxs/blob/5db882c47d935c593968f1e2144d35bec7d0181e/profile_bluesky/startup/50-plans.py#L381-L469
PARAMETERS
- filename
str : Name of input text file. Can be relative or absolute path, such as “actions.txt”, “../sample.txt”, or “/path/to/overnight.txt”.
- commands
[command] : List of command tuples for use in execute_command_list(), where
- command
tuple : (action, parameters, line_number, raw_command)
- action
str : names a known action to be handled
- parameters
list : List of parameters for the action. The list is empty if there are no values.
- line_number
int : line number (1-based) from the input text file
- raw_command
str or [str] : contents from input file, such as: SAXS 0 0 0 blank (a full command tuple is sketched below, after the SEE ALSO list)
SEE ALSO
execute_command_list(filename, commands[, md]) : plan: execute the command list
register_command_handler([handler]) : Define the function called to execute the command list
run_command_file(filename[, md]) : plan: execute a list of commands from a text or Excel file
summarize_command_file(filename) : print the command list from a text or Excel file
parse_Excel_command_file(filename) : parse an Excel spreadsheet with commands, return as command list
parse_text_command_file(filename) : parse a text file with commands, return as command list
new in apstools release 1.1.7
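To make the tuple structure concrete, here is one command as it might be parsed from the raw line above (values illustrative, with the second element shown as the parameters list described above):

command = (
    "SAXS",                # action
    [0, 0, 0, "blank"],    # parameters (empty list if none)
    5,                     # line_number (1-based) in the input file
    "SAXS 0 0 0 blank",    # raw_command
)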
- apstools.plans.command_list.get_command_list(filename)[source]
return command list from either text or Excel file
SEE ALSO
execute_command_list(filename, commands[, md]) : plan: execute the command list
get_command_list(filename) : return command list from either text or Excel file
register_command_handler([handler]) : Define the function called to execute the command list
run_command_file(filename[, md]) : plan: execute a list of commands from a text or Excel file
summarize_command_file(filename) : print the command list from a text or Excel file
parse_Excel_command_file(filename) : parse an Excel spreadsheet with commands, return as command list
parse_text_command_file(filename) : parse a text file with commands, return as command list
new in apstools release 1.1.7
- apstools.plans.command_list.parse_Excel_command_file(filename)[source]
parse an Excel spreadsheet with commands, return as command list
TEXT view of spreadsheet (Excel file line numbers shown):

[1] List of sample scans to be run
[2]
[3]
[4] scan    sx  sy  thickness  sample name
[5] FlyScan 0   0   0          blank
[6] FlyScan 5   2   0          blank
PARAMETERS
- filename
str : Name of input Excel spreadsheet file. Can be relative or absolute path, such as "actions.xlsx", "../sample.xlsx", or "/path/to/overnight.xlsx".
RETURNS
- list of commands
[command] : List of command tuples for use in
execute_command_list()
RAISES
- FileNotFoundError
if file cannot be found
SEE ALSO
get_command_list(filename) : return command list from either text or Excel file
register_command_handler([handler]) : Define the function called to execute the command list
run_command_file(filename[, md]) : plan: execute a list of commands from a text or Excel file
summarize_command_file(filename) : print the command list from a text or Excel file
parse_text_command_file(filename) : parse a text file with commands, return as command list
new in apstools release 1.1.7
- apstools.plans.command_list.parse_text_command_file(filename)[source]
parse a text file with commands, return as command list
The text file is interpreted line-by-line.
Blank lines are ignored.
A pound sign (#) marks the rest of that line as a comment.
All remaining lines are interpreted as commands with arguments.
Example of text file (no line numbers shown):
# List of sample scans to be run
# pound sign starts a comment (through end of line)

# action  value
mono_shutter open

# action  x  y  width  height
uslits 0 0 0.4 1.2

# action  sx  sy  thickness  sample name
FlyScan 0 0 0 blank
FlyScan 5 2 0 "empty container"

# action  sx  sy  thickness  sample name
SAXS 0 0 0 blank

# action  value
mono_shutter close
PARAMETERS
- filename
str : Name of input text file. Can be relative or absolute path, such as “actions.txt”, “../sample.txt”, or “/path/to/overnight.txt”.
RETURNS
- list of commands
[command] : List of command tuples for use in
execute_command_list()
RAISES
- FileNotFoundError
if file cannot be found
SEE ALSO
execute_command_list(filename, commands[, md]) : plan: execute the command list
get_command_list(filename) : return command list from either text or Excel file
register_command_handler([handler]) : Define the function called to execute the command list
run_command_file(filename[, md]) : plan: execute a list of commands from a text or Excel file
summarize_command_file(filename) : print the command list from a text or Excel file
parse_Excel_command_file(filename) : parse an Excel spreadsheet with commands, return as command list
new in apstools release 1.1.7
- apstools.plans.command_list.register_command_handler(handler=None)[source]
Define the function called to execute the command list
PARAMETERS
- handler obj :
Reference of the execute_command_list function to be used from run_command_file(). If None or not provided, will reset to execute_command_list(), which is also the initial setting.
SEE ALSO
execute_command_list(filename, commands[, md]) : plan: execute the command list
get_command_list(filename) : return command list from either text or Excel file
register_command_handler([handler]) : Define the function called to execute the command list
summarize_command_file(filename) : print the command list from a text or Excel file
parse_Excel_command_file(filename) : parse an Excel spreadsheet with commands, return as command list
parse_text_command_file(filename) : parse a text file with commands, return as command list
new in apstools release 1.1.7
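A sketch of a custom handler, assuming the command-tuple structure documented above; mono_shutter is a hypothetical device and the action names are illustrative:

from bluesky import plan_stubs as bps
from apstools.plans.command_list import (
    register_command_handler,
    run_command_file,
)

def my_handler(filename, commands, md=None):
    """Illustrative replacement for execute_command_list()."""
    for action, args, line_number, raw in commands:
        if action == "mono_shutter":
            yield from bps.mv(mono_shutter, args[0])  # hypothetical device
        else:
            print(f"line {line_number}: unrecognized: {raw}")

register_command_handler(my_handler)
RE(run_command_file("actions.txt"))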
- apstools.plans.command_list.run_command_file(filename, md=None)[source]
plan: execute a list of commands from a text or Excel file
Parse the file into a command list
yield the command list to the RunEngine (or other)
SEE ALSO
execute_command_list(filename, commands[, md]) : plan: execute the command list
get_command_list(filename) : return command list from either text or Excel file
register_command_handler([handler]) : Define the function called to execute the command list
summarize_command_file(filename) : print the command list from a text or Excel file
parse_Excel_command_file(filename) : parse an Excel spreadsheet with commands, return as command list
parse_text_command_file(filename) : parse a text file with commands, return as command list
new in apstools release 1.1.7
- apstools.plans.command_list.summarize_command_file(filename)[source]
print the command list from a text or Excel file
SEE ALSO
execute_command_list(filename, commands[, md]) : plan: execute the command list
get_command_list(filename) : return command list from either text or Excel file
run_command_file(filename[, md]) : plan: execute a list of commands from a text or Excel file
parse_Excel_command_file(filename) : parse an Excel spreadsheet with commands, return as command list
parse_text_command_file(filename) : parse a text file with commands, return as command list
new in apstools release 1.1.7
Documentation of batch runs
addDeviceDataAsStream : add an ophyd Device as an additional document stream
documentation_run : Save text as a bluesky run.
- apstools.plans.doc_run.addDeviceDataAsStream(devices, label)[source]
add an ophyd Device as an additional document stream
Use this within a custom plan, such as this example:
from apstools.plans import addDeviceDataAsStream
...
yield from bps.open_run()
# ...
yield from addDeviceDataAsStream(prescanDeviceList, "metadata_prescan")
# ...
yield from custom_scan_procedure()
# ...
yield from addDeviceDataAsStream(postscanDeviceList, "metadata_postscan")
# ...
yield from bps.close_run()
- apstools.plans.doc_run.documentation_run(text, stream=None, bec=None, md=None)[source]
Save text as a bluesky run.
PARAMETERS
- text
str : Text to be written.
- stream
str : document stream, default: “primary”
- bec
object : Instance of bluesky.BestEffortCallback, default: get from IPython shell
- md
dict (optional): metadata dictionary
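A minimal usage sketch (the note text is illustrative):

from apstools.plans.doc_run import documentation_run

RE(documentation_run("sample changed; realigned the stage before this run"))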
nscan plan
nscan : Scan over n variables moved together, each in equally spaced steps.
- apstools.plans.nscan_support.nscan(detectors, *motor_sets, num=11, per_step=None, md=None)[source]
Scan over n variables moved together, each in equally spaced steps.
PARAMETERS
- detectors list :
list of ‘readable’ objects
- motor_sets list :
sequence of one or more groups of: motor, start, finish
- motor object :
any ‘settable’ object (motor, temp controller, etc.)
- start float :
starting position of motor
- finish float :
ending position of motor
- num int :
number of steps (default = 11)
- per_step callable :
(optional) hook for customizing action of inner loop (messages per step) Expected signature:
f(detectors, step_cache, pos_cache)
- md dict
(optional) metadata
See the nscan() example in a Jupyter notebook: https://github.com/BCDA-APS/apstools/blob/master/docs/source/resources/demo_nscan.ipynb
snapshot Support
snapshot : bluesky plan: record current values of list of ophyd signals
sscan Record plans
sscan_1D : simple 1-D scan using EPICS synApps sscan record
- apstools.plans.sscan_support.sscan_1D(sscan, poll_delay_s=0.001, phase_timeout_s=60.0, running_stream='primary', final_array_stream=None, device_settings_stream='settings', md=None)[source]
simple 1-D scan using EPICS synApps sscan record
assumes the sscan record has already been setup properly for a scan
PARAMETERS
- sscan Device :
one EPICS sscan record (instance of apstools.synApps.sscanRecord)
- running_stream str or None :
Name of document stream to write positioners and detectors data made available while the sscan is running. This is typically the scan data, row by row. If set to None, this stream will not be written. (default: "primary")
- final_array_stream str or None :
Name of document stream to write positioners and detectors data posted after the sscan has ended. If set to None, this stream will not be written. (default: None)
- device_settings_stream str or None :
Name of document stream to write settings of the sscan device. This is all the information returned by sscan.read(). If set to None, this stream will not be written. (default: "settings")
- poll_delay_s float :
How long to sleep during each polling loop while collecting interim data values and waiting for sscan to complete. Must be a number between zero and 0.1 seconds. (default: 0.001 seconds)
- phase_timeout_s float :
How long to wait after the last update of sscan.FAZE. When scanning, we expect the scan phase to update regularly as positioners move and detectors are triggered. If the scan hangs for some reason, this is a way to end the plan early. To cancel this feature, set it to None. (default: 60 seconds)
NOTE about the document stream names
Make certain the names for the document streams are different from each other. If you make them all the same (such as primary), you will have difficulty when reading your data later on.
Don't cross the streams!
EXAMPLE
Assume that the chosen sscan record has already been setup.
from apstools.devices import sscanDevice
scans = sscanDevice(P, name="scans")

from apstools.plans import sscan_1D
RE(sscan_1D(scans.scan1), md=dict(purpose="demo"))
Utilities
Various utilities to help APS use the Bluesky framework.
Also consult the Index under the apstools heading for links to the Exceptions, and Utilities described here.
Utilities by Activity
Finding
findbyname : Find the ophyd (dotted name) object associated with the given ophyd name.
findbypv : Find all ophyd objects associated with the given EPICS PV.
Listing
listdevice : Describe the signal information from device
listobjects : Show all the ophyd Signal and Device objects defined as globals.
listplans : List all plans.
listRunKeys : Convenience function to list all keys (column names) in the scan's stream (default: primary).
ListRuns : List the runs from the given catalog according to some options.
listruns : List runs from catalog.
Reporting
print_RE_md : custom print the RunEngine metadata in a table
print_snapshot_list : print (stdout) a list of all snapshots in the databroker
SlitGeometry : Slit size and center as a named tuple
Other Utilities
cleanupText : convert text so it can be used as a dictionary key
connect_pvlist : Given list of EPICS PV names, return dict of EpicsSignal objects.
EmailNotifications : send email notifications when requested
select_live_plot : Get the first live plot that matches the signal
select_mpl_figure : Get the MatPlotLib Figure window for y vs x.
trim_plot_by_name : Find the plot(s) by name and replot with at most the last n lines.
trim_plot_lines : Find the plot with axes x and y and replot with at most the last n lines.
trim_string_for_EPICS : String must not exceed EPICS PV length.
unix : Run a UNIX command, returns (stdout, stderr).
General
cleanupText : convert text so it can be used as a dictionary key
command_list_as_table : format a command list as a pyRestTable.Table object
connect_pvlist : Given list of EPICS PV names, return dict of EpicsSignal objects.
copy_filtered_catalog : copy filtered runs from source_cat to target_cat
db_query : Searches the databroker v2 database.
dictionary_table : return a text table from dictionary
EmailNotifications : send email notifications when requested
ExcelDatabaseFileBase : base class: read-only support for Excel files, treat them like databases
ExcelDatabaseFileGeneric : Generic (read-only) handling of Excel spreadsheet-as-database
ExcelReadError : Exception when reading Excel spreadsheet.
findbyname : Find the ophyd (dotted name) object associated with the given ophyd name.
findbypv : Find all ophyd objects associated with the given EPICS PV.
full_dotted_name : Return the full dotted name
getDatabase : Return Bluesky database using keyword guides or default choice.
getDefaultDatabase : Find the "default" database (has the most recent run).
getDefaultNamespace : get the IPython shell's namespace dictionary (or globals() if not found)
getRunData : Convenience function to get the run's data.
getRunDataValue : Convenience function to get value of key in run stream.
getStreamValues : Get values from a previous scan stream in a databroker catalog.
ipython_profile_name : return the name of the current ipython profile or None
itemizer : Format a list of items.
listdevice : Describe the signal information from device
listobjects : Show all the ophyd Signal and Device objects defined as globals.
listplans : List all plans.
listRunKeys : Convenience function to list all keys (column names) in the scan's stream (default: primary).
ListRuns : List the runs from the given catalog according to some options.
listruns : List runs from catalog.
OverrideParameters : Define parameters that can be overridden from a user configuration file.
pairwise : break a list (or other iterable) into pairs
print_RE_md : custom print the RunEngine metadata in a table
print_snapshot_list : print (stdout) a list of all snapshots in the databroker
quantify_md_key_use : Print table of different key values and how many times each appears.
redefine_motor_position : Set EPICS motor record's user coordinate to new_position
replay : Replay the document stream from one (or more) scans (headers).
rss_mem : return memory used by this process
run_in_thread : (decorator) run func in thread
safe_ophyd_name : make text safe to be used as an ophyd object name
select_live_plot : Get the first live plot that matches the signal
select_mpl_figure : Get the MatPlotLib Figure window for y vs x.
split_quoted_line : splits a line into words some of which might be quoted
summarize_runs : Report bluesky run metrics from the databroker.
text_encode : Encode source using the default codepoint
trim_plot_by_name : Find the plot(s) by name and replot with at most the last n lines.
trim_plot_lines : Find the plot with axes x and y and replot with at most the last n lines.
trim_string_for_EPICS : String must not exceed EPICS PV length.
unix : Run a UNIX command, returns (stdout, stderr).
Submodules
Working with databroker catalogs
copy_filtered_catalog : copy filtered runs from source_cat to target_cat
getDatabase : Return Bluesky database using keyword guides or default choice.
getDefaultDatabase : Find the "default" database (has the most recent run).
getStreamValues : Get values from a previous scan stream in a databroker catalog.
quantify_md_key_use : Print table of different key values and how many times each appears.
- apstools.utils.catalog.copy_filtered_catalog(source_cat, target_cat, query=None)[source]
copy filtered runs from source_cat to target_cat
PARAMETERS
- source_cat
obj : instance of databroker.Broker or databroker.catalog[name]
- target_cat
obj : instance of databroker.Broker or databroker.catalog[name]
- query
dict : mongo query dictionary, used to filter the results (default:
{}
)see: https://docs.mongodb.com/manual/reference/operator/query/
example:
copy_filtered_catalog(
    databroker.Broker.named("mongodb_config"),
    databroker.catalog["test1"],
    {'plan_name': 'snapshot'})
- apstools.utils.catalog.getDatabase(db=None, catalog_name=None)[source]
Return Bluesky database using keyword guides or default choice.
PARAMETERS
- db
object : Bluesky database, an instance of databroker.catalog (default: see catalog_name keyword argument)
- catalog_name
str : Name of databroker v2 catalog, used when supplied db is None. (default: catalog with most recent run timestamp)
RETURNS
- object or None :
Bluesky database, an instance of databroker.catalog
(new in release 1.4.0)
- apstools.utils.catalog.getDefaultDatabase()[source]
Find the “default” database (has the most recent run).
Note that here, database and catalog mean the same.
This routine looks at all the database instances defined in the current session (console or notebook). If there is only one or no database instances defined as objects in the current session, the choice is simple. When there is more than one database instance in the current session, then the one with the most recent run timestamp is selected. In the case (as happens when starting with a new database) that the current database has no runs and another database instance is defined in the session and that additional database has runs in it (such as the previous database), then the database with the newest run timestamp (and not the newer empty database) will be chosen.
RETURNS
- object or None :
Bluesky database, an instance of databroker.catalog
(new in release 1.4.0)
- apstools.utils.catalog.getStreamValues(scan_id, key_fragment='', db=None, stream='baseline', query=None, use_v1=True)[source]
Get values from a previous scan stream in a databroker catalog.
Optionally, select only those data with names including key_fragment.
Tip: If the output is truncated, use pd.set_option('display.max_rows', 300) to increase the number of rows displayed.
PARAMETERS
- scan_id
int or str : Scan (run) identifier. Positive integer value is scan_id from run's metadata. Negative integer value is since most recent run in databroker. String is run's uid unique identifier (can abbreviate to the first characters needed to assure it is unique).
- key_fragment
str : Part or all of key name to be found in selected stream. For instance, if you specify key_fragment="lakeshore", it will return all the keys that include lakeshore.
- db
object : Bluesky database, an instance of databroker.catalog. Default: will search existing session for instance.
- stream
str : Name of the bluesky data stream to obtain the data. Default: 'baseline'
- query
dict : mongo query dictionary, used to filter the results. Default: {}
see: https://docs.mongodb.com/manual/reference/operator/query/
- use_v1
bool : Chooses databroker API version between 'v1' or 'v2'. Default: True (meaning use the v1 API)
RETURNS
- object :
pandas DataFrame with values from selected stream, search_string, and query
see: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.html
(new in apstools 1.5.1)
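A usage sketch; the key fragment is illustrative and -1 selects the most recent run:

from apstools.utils import getStreamValues

# baseline values from the most recent run whose key names include "lakeshore"
df = getStreamValues(-1, key_fragment="lakeshore")
print(df)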
- apstools.utils.catalog.quantify_md_key_use(key=None, db=None, catalog_name=None, since=None, until=None, query=None)[source]
Print table of different key values and how many times each appears.
PARAMETERS
- key str :
one of the metadata keys in a run's start document (default: plan_name)
- db object :
Instance of databroker v1 Broker or v2 catalog (default: see catalog_name keyword argument)
- catalog_name str :
Name of databroker v2 catalog, used when supplied db is None. (default: mongodb_config)
- since str :
include runs that started on or after this ISO8601 time (default: 1995-01-01)
- until str :
include runs that started before this ISO8601 time (default: 2100-12-31)
- query dict :
mongo query dictionary, used to filter the results (default: {})
see: https://docs.mongodb.com/manual/reference/operator/query/
EXAMPLES:

quantify_md_key_use(key="proposal_id")
quantify_md_key_use(key="plan_name", catalog_name="9idc", since="2020-07")
quantify_md_key_use(key="beamline_id", catalog_name="9idc")
quantify_md_key_use(key="beamline_id", catalog_name="9idc",
                    query={'plan_name': 'Flyscan'},
                    since="2020", until="2020-06-21 21:51")
quantify_md_key_use(catalog_name="8id", since="2020-01", until="2020-03")

In [8]: quantify_md_key_use(catalog_name="apstools_test")
========= =====
plan_name #runs
========= =====
count     26
scan      27
========= =====

In [9]: quantify_md_key_use(catalog_name="usaxs_test")
========================== =====
plan_name                  #runs
========================== =====
Flyscan                    1
TuneAxis.tune              1
count                      1
measure_USAXS_Transmission 1
run_Excel_file             1
snapshot                   1
tune_a2rp                  1
tune_ar                    1
tune_m2rp                  1
tune_mr                    1
========================== =====
Device information
|
Describe the signal information from device |
- apstools.utils.device_info.listdevice(obj, scope=None, cname=False, dname=True, show_pv=False, use_datetime=True, show_ancient=True)[source]
Describe the signal information from device obj in a pandas DataFrame.
Look through all subcomponents to find all the signals to be shown.
PARAMETERS
- obj
object : Instance of ophyd Signal or Device.
- scope
str or None : Scope of content to be shown.
"full" (or None) shows all Signal components
"epics" shows only EPICS-based Signals
"read" shows only the signals returned by obj.read()
default: None
- cname
bool : Show the _control_ (Python, dotted) name in column name.
default: False
- dname
bool : Show the _data_ (databroker, with underlines) name in column data name.
default: True
- show_pv
bool : Show the EPICS process variable (PV) name in column PV.
default: False
- use_datetime
bool : Show the EPICS timestamp (time of last update) in column timestamp.
default: True
- show_ancient
bool : Show uninitialized EPICS process variables.
In EPICS, an uninitialized PV has a timestamp of 1990-01-01 UTC. This option enables or suppresses ancient values identified by timestamp from 1989. These are values only defined in the original .db file.
default: True
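A usage sketch, assuming a motor m1 is defined in the session:

from apstools.utils import listdevice

# EPICS-based signals of the device, with their PV names
listdevice(m1, scope="epics", show_pv=True)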
email Support
|
send email notifications when requested |
- class apstools.utils.email.EmailNotifications(sender=None)[source]
send email notifications when requested
use default OS mail utility (so no credentials needed)
EXAMPLE
Send email(s) when feedback_limits_approached (a hypothetical boolean) is True:
# setup
from apstools.utils import EmailNotifications

SENDER_EMAIL = "instrument_user@email.host.tld"

email_notices = EmailNotifications(SENDER_EMAIL)
email_notices.add_addresses(
    # This list receives email when send() is called.
    "joe.user@goodmail.com",
    "instrument_team@email.host.tld",
    # others?
)

# ... later

if feedback_limits_approached:
    # send emails to list
    subject = "Feedback problem"
    message = "Feedback is very close to its limits."
    email_notices.send(subject, message)
- run_in_thread()
(decorator) run func in thread
USAGE:

@run_in_thread
def progress_reporting():
    logger.debug("progress_reporting is starting")
    # ...

#...
progress_reporting()   # runs in separate thread
#...
Directory of the known plans
listplans : List all plans.
- apstools.utils.list_plans.listplans(base=None, trunc=40)[source]
List all plans. (Actually, lists all generator functions).
NOTE: Can only detect generator functions. Bluesky plans are generator functions that generate bluesky.Msg objects. There is a PR to define a decorator that identifies a generator function as a bluesky plan.
- base
object or None : Object that contains plan methods (if None, use global namespace). (default: None)
- trunc
int : Truncate long docstrings to no more than trunc characters. (default: 40)
Directory of bluesky runs
getRunData : Convenience function to get the run's data.
getRunDataValue : Convenience function to get value of key in run stream.
listRunKeys : Convenience function to list all keys (column names) in the scan's stream (default: primary).
ListRuns : List the runs from the given catalog according to some options.
listruns : List runs from catalog.
summarize_runs : Report bluesky run metrics from the databroker.
- class apstools.utils.list_runs.ListRuns(cat: Optional[object] = None, query: Optional[dict] = None, keys: Optional[str] = None, missing: str = '', num: int = 20, reverse: bool = True, since: Optional[str] = None, sortby: str = 'time', timefmt: str = '%Y-%m-%d %H:%M:%S', until: Optional[str] = None, ids: Optional[Any] = None)[source]
List the runs from the given catalog according to some options.
EXAMPLE:
ListRuns(cat).to_dataframe()
PUBLIC METHODS
to_dataframe() : Output as pandas DataFrame object
to_table([fmt]) : Output as pyRestTable object.
parse_runs() : Parse the runs for the given metadata keys.
INTERNAL METHODS
_get_by_key(md, key) : Get run's metadata value by key.
_check_cat()
_apply_search_filters() : Search for runs from the catalog.
_check_keys() : Check that self.keys is a list of strings.
- apstools.utils.list_runs.getRunData(scan_id, db=None, stream='primary', query=None, use_v1=True)[source]
Convenience function to get the run's data. Default is the primary stream.
PARAMETERS
- scan_id
int or str : Scan (run) identifier. Positive integer value is scan_id from run's metadata. Negative integer value is since most recent run in databroker. String is run's uid unique identifier (can abbreviate to the first characters needed to assure it is unique).
- db
object : Bluesky database, an instance of databroker.catalog. Default: will search existing session for instance.
- stream
str : Name of the bluesky data stream to obtain the data. Default: 'primary'
- query
dict : mongo query dictionary, used to filter the results. Default: {}
see: https://docs.mongodb.com/manual/reference/operator/query/
- use_v1
bool : Chooses databroker API version between 'v1' or 'v2'. Default: True (meaning use the v1 API)
(new in apstools 1.5.1)
- apstools.utils.list_runs.getRunDataValue(scan_id, key, db=None, stream='primary', query=None, idx=-1, use_v1=True)[source]
Convenience function to get value of key in run stream.
Defaults are last value of key in primary stream.
PARAMETERS
- scan_id
int or str : Scan (run) identifier. Positive integer value is scan_id from run's metadata. Negative integer value is since most recent run in databroker. String is run's uid unique identifier (can abbreviate to the first characters needed to assure it is unique).
- key
str : Name of the key (data column) in the table of the stream's data. Must match identically.
- db
object : Bluesky database, an instance of databroker.catalog. Default: will search existing session for instance.
- stream
str : Name of the bluesky data stream to obtain the data. Default: 'primary'
- query
dict : mongo query dictionary, used to filter the results. Default: {}
see: https://docs.mongodb.com/manual/reference/operator/query/
- idx
int or str : List index of value to be returned from column of table. Can be 0 for first value, -1 for last value, "mean" for average value, or "all" for the full list of values. Default: -1
- use_v1
bool : Chooses databroker API version between 'v1' or 'v2'. Default: True (meaning use the v1 API)
(new in apstools 1.5.1)
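A usage sketch; the key name is illustrative and -1 selects the most recent run:

from apstools.utils import getRunDataValue

# mean of the "I0_USAXS" column from the most recent run's primary stream
value = getRunDataValue(-1, "I0_USAXS", idx="mean")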
- apstools.utils.list_runs.listRunKeys(scan_id, key_fragment='', db=None, stream='primary', query=None, strict=False, use_v1=True)[source]
Convenience function to list all keys (column names) in the scan’s stream (default: primary).
PARAMETERS
- scan_id
int or str : Scan (run) identifier. Positive integer value is scan_id from run's metadata. Negative integer value is since most recent run in databroker. String is run's uid unique identifier (can abbreviate to the first characters needed to assure it is unique).
- key_fragment
str : Part or all of key name to be found in selected stream. For instance, if you specify key_fragment="lakeshore", it will return all the keys that include lakeshore.
- db
object : Bluesky database, an instance of databroker.catalog. Default: will search existing session for instance.
- stream
str : Name of the bluesky data stream to obtain the data. Default: 'primary'
- query
dict : mongo query dictionary, used to filter the results. Default: {}
see: https://docs.mongodb.com/manual/reference/operator/query/
- strict
bool : Should the key_fragment be matched identically (strict=True) or matched by lower case comparison (strict=False)? Default: False
- use_v1
bool : Chooses databroker API version between 'v1' or 'v2'. Default: True (meaning use the v1 API)
(new in apstools 1.5.1)
- apstools.utils.list_runs.listruns(cat=None, keys=None, missing='', num=20, printing='smart', reverse=True, since=None, sortby='time', tablefmt='dataframe', timefmt='%Y-%m-%d %H:%M:%S', until=None, ids=None, **query)[source]
List runs from catalog.
This function provides a thin interface to the highly-reconfigurable ListRuns() class in this package.
- cat
object : Instance of databroker v1 or v2 catalog.
- keys
str or [str] or None: Include these additional keys from the start document. (default:
None
means"scan_id time plan_name detectors"
)- missing
str: Test to report when a value is not available. (default:
""
)- ids
[int] or [str]: List of
uid
orscan_id
value(s). Can mix different kinds in the same list. Also can specify offsets (e.g.,-1
). According to the rules fordatabroker
catalogs, a string is auid
(partial representations allowed), an int isscan_id
if positive or an offset if negative. (default:None
)- num
int : Make the table include the
num
most recent runs. (default:20
)- printing
bool or
"smart"
: IfTrue
, print the table to stdout. If"smart"
, then act as shown below. (default:True
)session
action(s)
python session
print and return
None
Ipython console
return
DataFrame
objectJupyter notebook
return
DataFrame
object- reverse
bool : If
True
, sort in descending order bysortby
. (default:True
)- since
str : include runs that started on or after this ISO8601 time (default:
"1995-01-01"
)- sortby
str : Sort columns by this key, found by exact match in either the
start
orstop
document. (default:"time"
)- tablefmt
str : When returning an object, specify which type of object to return. (default:
"dataframe",
)value
object
dataframe
pandas.DataFrame
table
str(pyRestTable.Table)
- timefmt
str : The
time
key (also includes keys"start.time"
and"stop.time"
) will be formatted by theself.timefmt
value. See https://strftime.org/ for examples. The specialtimefmt="raw"
is used to report time as the raw value (floating point time as used in python’stime.time()
). (default:"%Y-%m-%d %H:%M:%S",
)- until
str : include runs that started before this ISO8601 time (default:
2100-12-31
)**query
dict : Any additional keyword arguments will be passed to the databroker to refine the search for matching runs using the
mongoquery
package.
RETURNS
- object:
None
orstr
orpd.DataFrame()
object
EXAMPLE:
TODO
(new in release 1.5.0)
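A plausible call, with keyword values illustrative (plan_name is passed through **query to refine the search):

from apstools.utils import listruns

# the 10 most recent "scan" runs since the start of 2022
listruns(num=10, since="2022-01-01", plan_name="scan")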
- apstools.utils.list_runs.summarize_runs(since=None, db=None)[source]
Report bluesky run metrics from the databroker.
How many different plans?
How many runs?
How many times each run was used?
How frequently? (TODO:)
PARAMETERS
- since
str : Report all runs since this ISO8601 date & time (default: 1995)
- db
object : Instance of databroker.Broker() (default: db from the IPython shell)
Diagnostic Support for Memory
rss_mem : return memory used by this process
Miscellaneous Support
cleanupText : convert text so it can be used as a dictionary key
connect_pvlist : Given list of EPICS PV names, return dict of EpicsSignal objects.
dictionary_table : return a text table from dictionary
full_dotted_name : Return the full dotted name
itemizer : Format a list of items.
listobjects : Show all the ophyd Signal and Device objects defined as globals.
pairwise : break a list (or other iterable) into pairs
print_RE_md : custom print the RunEngine metadata in a table
print_snapshot_list : print (stdout) a list of all snapshots in the databroker
redefine_motor_position : Set EPICS motor record's user coordinate to new_position
replay : Replay the document stream from one (or more) scans (headers).
run_in_thread : (decorator) run func in thread
safe_ophyd_name : make text safe to be used as an ophyd object name
split_quoted_line : splits a line into words some of which might be quoted
text_encode : Encode source using the default codepoint
trim_string_for_EPICS : String must not exceed EPICS PV length.
unix : Run a UNIX command, returns (stdout, stderr).
- apstools.utils.misc.cleanupText(text)[source]
convert text so it can be used as a dictionary key
Given some input text string, return a clean version, removing troublesome characters and perhaps applying other cleanup as well. This is best done with regular expression pattern matching.
- apstools.utils.misc.connect_pvlist(pvlist, wait=True, timeout=2, poll_interval=0.1)[source]
Given list of EPICS PV names, return dict of EpicsSignal objects.
PARAMETERS
- pvlist
[str] : list of EPICS PV names
- wait
bool : should wait for EpicsSignal objects to connect (default:
True
)- timeout
float : maximum time to wait for PV connections, seconds (default: 2.0)
- poll_interval
float : time to sleep between checks for PV connections, seconds (default: 0.1)
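A usage sketch with hypothetical PV names:

from apstools.utils import connect_pvlist

pvs = connect_pvlist(["xxx:UPTIME", "xxx:datetime"], timeout=5)
for key, signal in pvs.items():
    print(key, signal.get())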
- apstools.utils.misc.dictionary_table(dictionary, **kwargs)[source]
return a text table from dictionary
PARAMETERS
- dictionary
dict : Python dictionary
Note: Keyword arguments parameters are kept for compatibility with previous versions of apstools. They are ignored now.
RETURNS
RETURNS
- table
object or None : pyRestTable.Table() object (multiline text table) or None if dictionary has no contents
EXAMPLE:
In [8]: RE.md
Out[8]: {'login_id': 'jemian:wow.aps.anl.gov', 'beamline_id': 'developer', 'proposal_id': None, 'pid': 19072, 'scan_id': 10, 'version': {'bluesky': '1.5.2', 'ophyd': '1.3.3', 'apstools': '1.1.5', 'epics': '3.3.3'}}

In [9]: print(dictionary_table(RE.md))
=========== =============================================================================
key         value
=========== =============================================================================
beamline_id developer
login_id    jemian:wow.aps.anl.gov
pid         19072
proposal_id None
scan_id     10
version     {'bluesky': '1.5.2', 'ophyd': '1.3.3', 'apstools': '1.1.5', 'epics': '3.3.3'}
=========== =============================================================================
- apstools.utils.misc.full_dotted_name(obj)[source]
Return the full dotted name
The .dotted_name property does not include the name of the root object. This routine adds that.
- apstools.utils.misc.listobjects(show_pv=True, printing=True, verbose=False, symbols=None)[source]
Show all the ophyd Signal and Device objects defined as globals.
PARAMETERS
- show_pv
bool : If True, also show relevant EPICS PV, if available. (default: True)
- printing
bool : If True, print table to stdout. (default: True)
- verbose
bool : If True, also show str(obj). (default: False)
- symbols
dict : If None, use global symbol table. If not None, use provided dictionary. (default: globals())
RETURNS
- object :
Instance of pyRestTable.Table()
EXAMPLE:
In [1]: listobjects()
======== ================================ =============
name     ophyd structure                  EPICS PV
======== ================================ =============
adsimdet MySingleTriggerSimDetector       vm7SIM1:
m1       EpicsMotor                       vm7:m1
m2       EpicsMotor                       vm7:m2
m3       EpicsMotor                       vm7:m3
m4       EpicsMotor                       vm7:m4
m5       EpicsMotor                       vm7:m5
m6       EpicsMotor                       vm7:m6
m7       EpicsMotor                       vm7:m7
m8       EpicsMotor                       vm7:m8
noisy    EpicsSignalRO                    vm7:userCalc1
scaler   ScalerCH                         vm7:scaler1
shutter  SimulatedApsPssShutterWithStatus
======== ================================ =============
Out[1]: <pyRestTable.rest_table.Table at 0x7fa4398c7cf8>

In [2]:
(new in apstools release 1.1.8)
- apstools.utils.misc.pairwise(iterable)[source]
break a list (or other iterable) into pairs
s -> (s0, s1), (s2, s3), (s4, s5), ...

In [71]: for item in pairwise("a b c d e fg".split()):
    ...:     print(item)
    ...:
('a', 'b')
('c', 'd')
('e', 'fg')
- apstools.utils.misc.print_RE_md(dictionary=None, fmt='simple', printing=True)[source]
custom print the RunEngine metadata in a table
PARAMETERS
- dictionary
dict : Python dictionary
EXAMPLE:
In [4]: print_RE_md()
RunEngine metadata dictionary:
======================== ===================================
key                      value
======================== ===================================
EPICS_CA_MAX_ARRAY_BYTES 1280000
EPICS_HOST_ARCH          linux-x86_64
beamline_id              APS USAXS 9-ID-C
login_id                 usaxs:usaxscontrol.xray.aps.anl.gov
pid                      67933
proposal_id              testing Bluesky installation
scan_id                  0
versions                 ======== =====
                         key      value
                         ======== =====
                         apstools 1.1.3
                         bluesky  1.5.2
                         epics    3.3.1
                         ophyd    1.3.3
                         ======== =====
======================== ===================================
- apstools.utils.misc.print_snapshot_list(db, printing=True, **search_criteria)[source]
print (stdout) a list of all snapshots in the databroker
USAGE:
print_snapshot_list(db)
print_snapshot_list(db, purpose="this is an example")
print_snapshot_list(db, since="2018-12-21", until="2019")
EXAMPLE:
In [16]: from apstools.utils import print_snapshot_list
    ...: from apstools.callbacks import SnapshotReport
    ...: print_snapshot_list(db, since="2018-12-21", until="2019")
    ...:
= ======== ========================== ==================
# uid      date/time                  purpose
= ======== ========================== ==================
0 d7831dae 2018-12-21 11:39:52.956904 this is an example
1 5049029d 2018-12-21 11:39:30.062463 this is an example
2 588e0149 2018-12-21 11:38:43.153055 this is an example
= ======== ========================== ==================

In [17]: SnapshotReport().print_report(db["5049029d"])

========================================
snapshot: 2018-12-21 11:39:30.062463
========================================

example: example 2
hints: {}
iso8601: 2018-12-21 11:39:30.062463
look: can snapshot text and arrays too
note: no commas in metadata
plan_description: archive snapshot of ophyd Signals (usually EPICS PVs)
plan_name: snapshot
plan_type: generator
purpose: this is an example
scan_id: 1
software_versions: {
    'python': '''3.6.2 |Continuum Analytics, Inc.| (default, Jul 20 2017, 13:51:32) [GCC 4.4.7 20120313 (Red Hat 4.4.7-1)]''',
    'PyEpics': '3.3.1',
    'bluesky': '1.4.1',
    'ophyd': '1.3.0',
    'databroker': '0.11.3',
    'apstools': '0.0.38'
    }
time: 1545413970.063167
uid: 5049029d-075c-453c-96d2-55431273852b

========================== ====== ================ ===================
timestamp                  source name             value
========================== ====== ================ ===================
2018-12-20 18:24:34.220028 PV     compress         [0.1, 0.2, 0.3]
2018-12-13 14:49:53.121188 PV     gov:HOSTNAME     otz.aps.anl.gov
2018-12-21 11:39:24.268148 PV     gov:IOC_CPU_LOAD 0.22522317161410768
2018-12-21 11:39:24.268151 PV     gov:SYS_CPU_LOAD 9.109026666525944
2018-12-21 11:39:30.017643 PV     gov:iso8601      2018-12-21T11:39:30
2018-12-13 14:49:53.135016 PV     otz:HOSTNAME     otz.aps.anl.gov
2018-12-21 11:39:27.705304 PV     otz:IOC_CPU_LOAD 0.1251210270549924
2018-12-21 11:39:27.705301 PV     otz:SYS_CPU_LOAD 11.611234438304471
2018-12-21 11:39:30.030321 PV     otz:iso8601      2018-12-21T11:39:30
========================== ====== ================ ===================

exit_status: success
num_events: {'primary': 1}
run_start: 5049029d-075c-453c-96d2-55431273852b
time: 1545413970.102147
uid: 6c1b2100-1ef6-404d-943e-405da9ada882
- apstools.utils.misc.redefine_motor_position(motor, new_position)[source]
Set EPICS motor record's user coordinate to new_position.
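This is a bluesky plan (it yields messages), so run it through the RunEngine. A minimal usage sketch, assuming an EpicsMotor m1 and a RunEngine RE already exist in the session:

from apstools.utils import redefine_motor_position

# declare the current physical position of m1 to be 0.0
# in the motor record's user coordinate system
RE(redefine_motor_position(m1, 0.0))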
- apstools.utils.misc.replay(headers, callback=None, sort=True)[source]
Replay the document stream from one (or more) scans (headers).
PARAMETERS
- headers
scan or [scan] : Scan(s) to be replayed through callback. A scan is an instance of a Bluesky databroker.Header. See: https://nsls-ii.github.io/databroker/api.html?highlight=header#header-api
- callback
object : The Bluesky callback to handle the stream of documents from a scan. If None, then use the bec (BestEffortCallback) from the IPython shell. (default: None)
- sort
bool : Sort the headers chronologically if True. (default: True)
(new in apstools release 1.1.11)
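A usage sketch, assuming a databroker instance db and the bec are available in the IPython session:

from apstools.utils import replay

replay(db[-1])                # replay the most recent run through the bec
replay(db(plan_name="scan"))  # replay all runs named "scan", oldest first (sort=True)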
- apstools.utils.misc.run_in_thread(func)[source]
(decorator) run func in a thread
USAGE:

@run_in_thread
def progress_reporting():
    logger.debug("progress_reporting is starting")
    # ...

#...
progress_reporting()   # runs in separate thread
#...
- apstools.utils.misc.safe_ophyd_name(text)[source]
make text safe to be used as an ophyd object name
Given some input text string, return a clean version with troublesome characters removed or replaced, using regular expression pattern matching.
The “sanitized” name fits this regular expression:
[A-Za-z_][\w_]*
Also can be used for safe HDF5 and NeXus names.
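A minimal sketch of such a sanitizer (illustrative only, not necessarily the package's exact implementation; the name sanitize is hypothetical):

import re

def sanitize(text):
    # replace characters that do not fit [A-Za-z_][\w_]* with underscores
    safe = re.sub(r"[^\w]", "_", text)
    if not re.match(r"[A-Za-z_]", safe):  # names must not start with a digit
        safe = "_" + safe
    return safe

print(sanitize("2theta (deg)"))  # --> _2theta__deg_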
- apstools.utils.misc.split_quoted_line(line)[source]
Split a line into words, some of which might be quoted.
TESTS:
FlyScan 0   0   0   blank
FlyScan 5   2   0   "empty container"
FlyScan 5   12  0   "even longer name"
SAXS 0 0 0 blank
SAXS 0 0 0 "blank"
RESULTS:
['FlyScan', '0', '0', '0', 'blank']
['FlyScan', '5', '2', '0', 'empty container']
['FlyScan', '5', '12', '0', 'even longer name']
['SAXS', '0', '0', '0', 'blank']
['SAXS', '0', '0', '0', 'blank']
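For example, calling the function directly in a session (assuming it is imported from apstools.utils):

In [1]: from apstools.utils import split_quoted_line

In [2]: split_quoted_line('FlyScan 5 2 0 "empty container"')
Out[2]: ['FlyScan', '5', '2', '0', 'empty container']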
OverrideParameters
Define parameters that can be overridden from a user configuration file.
EXAMPLE:
Create an overrides
object in a new file override_params.py
:
import apstools.utils
overrides = apstools.utils.OverrideParameters()
When code supports a parameter for which a user can provide
a local override, the code should import the overrides
object (from the override_params
module),
and then register the parameter name, such as this example:
from override_params import overrides
overrides.register("minimum_step")
Then later:
minstep = overrides.pick("minimum_step", 45e-6)
In the user’s configuration file that will override
the value of 45e-6
(such as can be loaded via
%run -i user.py
), import the overrides
object (from the override_params
module):
from override_params import overrides
and then override the attribute(s) as desired:
overrides.set("minimum_step", 1.0e-5)
With this override in place, the minstep
value
(from pick()
)
will be 1e-5
.
Get a pandas DataFrame object with all the overrides:
overrides.summary()
which returns this table:
parameter value
0 minimum_step 0.00001
- class apstools.utils.override_parameters.OverrideParameters[source]
Define parameters that can be overridden from a user configuration file.
NOTE: This is a pure Python object, not using ophyd.
pick(parameter, default)
Return either the override parameter value if defined, or the default.
register(parameter_name)
Register a new parameter name to be supported by user overrides.
reset(parameter_name)
Remove an override value for a known parameter.
reset_all()
Remove override values for all known parameters.
set(parameter_name, value)
Define an override value for a known parameter.
summary()
Return a pandas DataFrame with all overrides.
(new in apstools 1.5.2)
- pick(parameter, default)[source]
Return either the override parameter value if defined, or the default.
Plot Support
- select_mpl_figure(x, y)
Get the MatPlotLib Figure window for y vs x.
- select_live_plot(bec, signal)
Get the first live plot that matches signal.
- trim_plot_lines(bec, n, x, y)
Find the plot with axes x and y and replot with at most the last n lines.
- trim_plot_by_name(n, plots)
Find the plot(s) by name and replot with at most the last n lines.
- apstools.utils.plot.select_live_plot(bec, signal)[source]
Get the first live plot that matches signal.
PARAMETERS
- bec
object: instance of bluesky.callbacks.best_effort.BestEffortCallback
- signal
object: The Y axis object (an ophyd.Signal)
RETURNS
- object:
Instance of bluesky.callbacks.best_effort.LivePlotPlusPeaks() or None
- apstools.utils.plot.select_mpl_figure(x, y)[source]
Get the MatPlotLib Figure window for y vs x.
PARAMETERS
- x
object: X axis object (an ophyd.Signal)
- y
object: Y axis object (an ophyd.Signal)
RETURNS
- object or None :
Instance of matplotlib.pyplot.Figure()
- apstools.utils.plot.trim_plot_by_name(n=3, plots=None)[source]
Find the plot(s) by name and replot with at most the last n lines.
Note: this is not a bluesky plan. Call it as normal Python function.
It is recommended to call trim_plot_by_name() before the scan(s) that generate plots. Plots are generated from a RunEngine callback, executed after the scan completes.
PARAMETERS
- n
int : number of lines (traces) to keep in each plot
- plots
str, [str], or None : name(s) of plot windows to trim (default: all plot windows)
EXAMPLES:
trim_plot_by_name()   # default of n=3, apply to all plots
trim_plot_by_name(5)  # change from default of n=3
trim_plot_by_name(5, "noisy_det vs motor")  # just this plot
trim_plot_by_name(
    5, ["noisy_det vs motor", "det noisy_det vs motor"]
)
EXAMPLE:
# use simulators from ophyd
from bluesky import plans as bp
from bluesky import plan_stubs as bps
from ophyd.sim import *

snooze = 0.25

def scan_set():
    trim_plot_by_name()
    yield from bp.scan([noisy_det], motor, -1, 1, 5)
    yield from bp.scan([noisy_det, det], motor, -2, 1, motor2, 3, 1, 6)
    yield from bps.sleep(snooze)

# repeat scan_set 15 times
uids = RE(bps.repeat(scan_set, 15))
(new in release 1.3.5)
- apstools.utils.plot.trim_plot_lines(bec, n, x, y)[source]
Find the plot with axes x and y and replot with at most the last n lines.
Note: trim_plot_lines() is not a bluesky plan. Call it as a normal Python function.
EXAMPLE:

trim_plot_lines(bec, 1, m1, noisy)
PARAMETERS
- bec
object : instance of BestEffortCallback
- n
int : number of lines (traces) to keep in the plot
- x
object : instance of ophyd.Signal (or subclass), independent (x) axis
- y
object : instance of ophyd.Signal (or subclass), dependent (y) axis
(new in release 1.3.5)
Support for IPython profiles
- getDefaultNamespace(attr='user_ns')
get the IPython shell's namespace dictionary (or globals() if not found)
- ipython_profile_name()
return the name of the current ipython profile or None
- ipython_shell_namespace()
get the IPython shell's namespace dictionary (or empty if not found)
- apstools.utils.profile_support.getDefaultNamespace(attr='user_ns')[source]
get the IPython shell’s namespace dictionary (or globals() if not found)
EPICS PV Registry
- findbyname(oname, ...)
Find the ophyd (dotted name) object associated with the given ophyd name.
- findbypv(pvname, ...)
Find all ophyd objects associated with the given EPICS PV.
- PVRegistry(ns=None)
Cross-reference EPICS PVs with ophyd EpicsSignalBase objects.
- class apstools.utils.pvregistry.PVRegistry(ns=None)[source]
Cross-reference EPICS PVs with ophyd EpicsSignalBase objects.
- apstools.utils.pvregistry.findbyname(oname, force_rebuild=False, ns=None)[source]
Find the ophyd (dotted name) object associated with the given ophyd name.
PARAMETERS
- oname
str : ophyd name to search
- force_rebuild
bool : If
True
, rebuild the internal registry that maps ophyd names to ophyd objects.- ns
dict or None : Namespace dictionary of Python objects.
RETURNS
- str or None :
Name of the ophyd object.
EXAMPLE:
In [45]: findbyname("adsimdet_cam_acquire") Out[45]: 'adsimdet.cam.acquire'
(new in apstools 1.5.0)
- apstools.utils.pvregistry.findbypv(pvname, force_rebuild=False, ns=None)[source]
Find all ophyd objects associated with the given EPICS PV.
PARAMETERS
- pvname
str : EPICS PV name to search
- force_rebuild
bool : If
True
, rebuild the internal registry that maps EPICS PV names to ophyd objects.- ns
dict or None : Namespace dictionary of Python objects.
RETURNS
- dict or None :
Dictionary of matching ophyd objects, keyed by how the PV is used by the ophyd signal. The keys are read and write.
EXAMPLE:
In [45]: findbypv("ad:cam1:Acquire")
Out[45]: {'read': [], 'write': ['adsimdet.cam.acquire']}

In [46]: findbypv("ad:cam1:Acquire_RBV")
Out[46]: {'read': ['adsimdet.cam.acquire'], 'write': []}
Searching databroker catalogs
- db_query(db, query)
Searches the databroker v2 database.
- apstools.utils.query.db_query(db, query)[source]
Searches the databroker v2 database.
PARAMETERS
- db
object : Bluesky database, an instance of databroker.catalog.
- query
dict : Search parameters.
RETURNS
- object :
Bluesky database, an instance of databroker.catalog satisfying the query parameters.
See also
databroker.catalog.search()
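A usage sketch (assuming cat is a databroker v2 catalog configured in the session):

from apstools.utils import db_query

scans = db_query(cat, {"plan_name": "scan"})  # runs whose plan_name is "scan"
print(len(scans))  # number of matching runs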
Common support of slits
- SlitGeometry(width, height, x, y)
Slit size and center as a named tuple
- class apstools.utils.slit_core.SlitGeometry(width, height, x, y)
Slit size and center as a named tuple
- height
Alias for field number 1
- width
Alias for field number 0
- x
Alias for field number 2
- y
Alias for field number 3
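As a named tuple, fields are available by name or by position. A short, hypothetical session (assuming the name is importable from apstools.utils):

In [1]: from apstools.utils import SlitGeometry

In [2]: g = SlitGeometry(0.1, 0.2, 0, 0)

In [3]: g.width, g.height, g.x, g.y
Out[3]: (0.1, 0.2, 0, 0)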
Spreadsheet Support
- ExcelDatabaseFileBase(ignore_extra=True)
base class: read-only support for Excel files, treat them like databases
- ExcelDatabaseFileGeneric(filename, labels_row=3, ignore_extra=True)
Generic (read-only) handling of Excel spreadsheet-as-database
- ExcelReadError
Exception when reading Excel spreadsheet.
- class apstools.utils.spreadsheet.ExcelDatabaseFileBase(ignore_extra=True)[source]
base class: read-only support for Excel files, treat them like databases
Use this class when creating new, specific spreadsheet support.
EXAMPLE
Show how to read an Excel file where one of the columns contains a unique key. This allows for random access to each row of data by use of the key.
class ExhibitorsDB(ExcelDatabaseFileBase):
    '''
    content for exhibitors from the Excel file
    '''
    EXCEL_FILE = os.path.join("resources", "exhibitors.xlsx")
    LABELS_ROW = 2

    def handle_single_entry(self, entry):
        '''any special handling for a row from the Excel file'''
        pass

    def handleExcelRowEntry(self, entry):
        '''identify unique key (row of the Excel file)'''
        key = entry["Name"]
        self.db[key] = entry
- class apstools.utils.spreadsheet.ExcelDatabaseFileGeneric(filename, labels_row=3, ignore_extra=True)[source]
Generic (read-only) handling of Excel spreadsheet-as-database
Note
This is the class to use when reading Excel spreadsheets.
In the spreadsheet, the first sheet should contain the table to be used. By default (see keyword parameter labels_row), the table should start in cell A4. The column labels are given in row 4. A blank column should appear to the right of the table (see keyword parameter ignore_extra). The column labels will describe the action and its parameters. Additional columns may be added for metadata or other purposes.
The rows below the column labels should contain actions and parameters for those actions, one action per row.
To make a comment, place a # in the action column. A comment should be ignored by the bluesky plan that reads this table. The table will end with a row of empty cells.
While it's a good idea to put the action column first, that is not necessary. It is not even necessary to name the column action. You can re-arrange the order of the columns and change their names as long as the column names match what text strings your Python code expects to find.
A future upgrade 1 will allow the table boundaries to be named by Excel when using Excel's Format as Table 2 feature. For now, leave a blank row and column at the bottom and right edges of the table.
- 1
- 2
Excel's Format as Table: https://support.office.com/en-us/article/Format-an-Excel-table-6789619F-C889-495C-99C2-2F971C0E2370
PARAMETERS
- filename
str : name (absolute or relative) of Excel spreadsheet file
- labels_row
int : Row (zero-based numbering) of Excel file with column labels, default: 3 (Excel row 4)
- ignore_extra
bool : When True, ignore any cells outside of the table, default: True.
Note that when True, a row of empty cells within the table will be recognized as the end of the table, even if there are actions in following rows. To force an empty row, use a comment symbol # (actually, any non-empty content will work).
When False, cells with other information (in Sheet 1) will be made available, sometimes with unpredictable results.
EXAMPLE
See section Example: the run_command_file() plan for more examples.
(See also example screen shot.) Table (on Sheet 1) begins on row 4 in first column:
1  |  some text here, maybe a title
2  |  (could have content here)
3  |  (or even more content here)
4  |  action | sx | sy | sample | comments | | <-- leave empty column
5  |  close | | | close the shutter | |
6  |  image | 0 | 0 | dark | dark image | |
7  |  open | | | | open the shutter | |
8  |  image | 0 | 0 | flat | flat field image | |
9  |  image | 5.1 | -3.2 | 4140 steel | heat 9172634 | |
10 |  scan | 5.1 | -3.2 | 4140 steel | heat 9172634 | |
11 |  scan | 0 | 0 | blank | | |
12 |
13 |  ^^^ leave empty row ^^^
14 |  (could have content here)
Example python code to read this spreadsheet:
import os

from apstools.utils import ExcelDatabaseFileGeneric, cleanupText
from bluesky import plan_stubs as bps

def myExcelPlan(xl_file, md={}):
    excel_file = os.path.abspath(xl_file)
    xl = ExcelDatabaseFileGeneric(excel_file)
    for i, row in enumerate(xl.db.values()):
        # prepare the metadata
        _md = {cleanupText(k): v for k, v in row.items()}
        _md["xl_file"] = xl_file
        _md["excel_row_number"] = i + 1
        _md.update(md)  # overlay with user-supplied metadata

        # determine what action to take
        action = row["action"].lower()
        if action == "open":
            yield from bps.mv(shutter, "open")
        elif action == "close":
            yield from bps.mv(shutter, "close")
        elif action == "image":
            # your code to take an image, given **row as parameters
            yield from my_image(**row, md=_md)
        elif action == "scan":
            # your code to make a scan, given **row as parameters
            yield from my_scan(**row, md=_md)
        else:
            print(f"no handling for row {i+1}: action={action}")

# execute this plan through the RunEngine
RE(myExcelPlan("spreadsheet.xlsx", md=dict(purpose="apstools demo")))
synApps Support: Records, Databases, …
Ophyd-style support for EPICS synApps structures (records and databases).
EXAMPLES:
import apstools.synApps
calcs = apstools.synApps.userCalcsDevice("xxx:", name="calcs")
scans = apstools.synApps.SscanDevice("xxx:", name="scans")
scripts = apstools.synApps.userScriptsDevice("xxx:set1:", name="scripts")
xxxstats = apstools.synApps.IocStatsDevice("xxx:", name="xxxstats")
calc1 = calcs.calc1
apstools.synApps.swait_setup_random_number(calc1)
apstools.synApps.swait_setup_incrementer(calcs.calc2)
calc1.reset()
- 1
synApps XXX: https://github.com/epics-modules/xxx
Categories
Support the default structures as provided by the synApps template XXX 1 IOC. Also supports, as needed, structures from EPICS base.
Records
- AsynRecord
EPICS asyn record support in ophyd
- BusyRecord
EPICS synApps busy record
- CalcoutRecord
EPICS base calcout record support in ophyd
- EpidRecord
EPICS synApps epid record support in ophyd
- LuascriptRecord
EPICS synApps luascript record: used as $(P):userScript$(N)
- ScalcoutRecord
EPICS synApps calc scalcout record support in ophyd
- SscanRecord
EPICS synApps sscan record: used as $(P):scan$(N)
- SseqRecord
EPICS synApps sseq record support in ophyd
- SubRecord
EPICS base sub record support in ophyd
- SwaitRecord
EPICS synApps swait record: used as $(P):userCalc$(N)
- TransformRecord
EPICS transform record support in ophyd
The ophyd-style Devices for these records rely on common structures:
- EpicsRecordDeviceCommonAll
Many of the fields common to all EPICS records.
- EpicsRecordInputFields
Some fields common to EPICS input records.
- EpicsRecordOutputFields
Some fields common to EPICS output records.
- EpicsRecordFloatFields
Some fields common to EPICS records supporting floating point values.
Databases
- EditStringSequence
EPICS synApps sseq support to quickly re-arrange steps.
- Optics2Slit1D
EPICS synApps optics 2slit.db 1D support: xn, xp, size, center, sync
- Optics2Slit2D_HV
EPICS synApps optics 2slit.db 2D support: h.xn, h.xp, v.xn, v.xp
- Optics2Slit2D_InbOutBotTop
EPICS synApps optics 2slit.db 2D support: inb, out, bot, top
- SaveData
EPICS synApps saveData support.
- SscanDevice
EPICS synApps XXX IOC setup of sscan records: $(P):scan$(N)
- UserCalcN
Single instance of the userCalcN database.
- UserCalcsDevice
EPICS synApps XXX IOC setup of userCalcs: $(P):userCalc$(N)
- UserCalcoutDevice
EPICS synApps XXX IOC setup of user calcouts: $(P):userCalcOut$(N)
- UserCalcoutN
Single instance of the userCalcoutN database.
- UserScalcoutDevice
EPICS synApps XXX IOC setup of user scalcouts: $(P):userStringCalc$(N)
- UserScalcoutN
Single instance of the userStringCalcN database.
- UserScriptsDevice
EPICS synApps XXX IOC setup of user lua scripts: $(P):userScript$(N)
- UserStringSequenceDevice
EPICS synApps XXX IOC setup of userStringSeqs: $(P):userStringSeq$(N)
- UserStringSequenceN
Single instance of the userStringSeqN database.
- UserAverageN
EPICS synApps XXX IOC setup of user average: $(P):userAve$(N)
- UserAverageDevice
EPICS synApps XXX IOC setup of user averaging sub records: $(P):userAve$(N)
- UserTransformN
Single instance of the userTranN database.
- UserTransformsDevice
EPICS synApps XXX IOC setup of userTransforms
- EpicsSynAppsRecordEnableMixin
Supports the Enable record of synApps databases
- CalcoutRecordChannel
channel of a calcout record: A-L
- IocStatsDevice
synApps IOC stats
- LuascriptRecordNumberInput
number input of a synApps luascript record: A-J
- LuascriptRecordStringInput
string input of a synApps luascript record: AA-JJ
- ScalcoutRecordNumberChannel
Number channel of an scalcout record: A-L
- ScalcoutRecordStringChannel
String channel of an scalcout record: AA-LL
- SubRecordChannel
Number channel of a sub record: A-L
- SwaitRecordChannel
EPICS synApps swait record: single channel [A-L]
Other Support
These functions configure calcout or swait records for certain algorithms.
- setup_gaussian_calcout(calcout, ref_signal, ...)
setup calcout for noisy Gaussian
- setup_incrementer_calcout(calcout, ...)
setup calcout record as an incrementer
- setup_lorentzian_calcout(calcout, ref_signal, ...)
setup calcout record for noisy Lorentzian
- setup_gaussian_swait(swait, ref_signal, ...)
setup swait for noisy Gaussian
- setup_incrementer_swait(swait, ...)
setup swait record as an incrementer
- setup_lorentzian_swait(swait, ref_signal, ...)
setup swait record for noisy Lorentzian
- setup_random_number_swait(swait, ...)
setup swait record to generate random numbers
All Submodules
EPICS Record support: common structures
Some fields are common 1 to all 2 or some EPICS records. These are defined as mixin classes for the various ophyd-style devices.
Ophyd support for fields common to all EPICS records
Public Structures
- EpicsRecordDeviceCommonAll
Many of the fields common to all EPICS records.
- EpicsRecordInputFields
Some fields common to EPICS input records.
- EpicsRecordOutputFields
Some fields common to EPICS output records.
- EpicsRecordFloatFields
Some fields common to EPICS records supporting floating point values.
- EpicsSynAppsRecordEnableMixin
Supports the Enable record of synApps databases
- see
https://wiki-ext.aps.anl.gov/epics/index.php/RRM_3-14_dbCommon
- see
https://wiki-ext.aps.anl.gov/epics/index.php/RRM_3-14_Common
- class apstools.synApps._common.EpicsRecordDeviceCommonAll(*args: Any, **kwargs: Any)[source]
Many of the fields common to all EPICS records.
Some fields are not included because they are not interesting to an EPICS client or are already provided in other support.
- class apstools.synApps._common.EpicsRecordFloatFields(*args: Any, **kwargs: Any)[source]
Some fields common to EPICS records supporting floating point values.
- class apstools.synApps._common.EpicsRecordInputFields(*args: Any, **kwargs: Any)[source]
Some fields common to EPICS input records.
synApps asyn record
see the synApps asyn
module support:
https://github.com/epics-modules/asyn
Ophyd support for the EPICS asyn record
Public Structures
- AsynRecord
EPICS asyn record support in ophyd
synApps busy record
see the synApps busy
module support:
https://github.com/epics-modules/busy
Ophyd support for the EPICS busy record
Public Structures
- BusyRecord
EPICS synApps busy record
EPICS base calcout record
The calcout
record is part of EPICS base:
https://wiki-ext.aps.anl.gov/epics/index.php/RRM_3-14_Calcout
Ophyd support for the EPICS calcout record
https://wiki-ext.aps.anl.gov/epics/index.php/RRM_3-14_Calcout
Public Structures
- UserCalcoutDevice
EPICS synApps XXX IOC setup of user calcouts: $(P):userCalcOut$(N)
- UserCalcoutN
Single instance of the userCalcoutN database.
- CalcoutRecord
EPICS base calcout record support in ophyd
- CalcoutRecordChannel
channel of a calcout record: A-L
- setup_gaussian_calcout(calcout, ref_signal, ...)
setup calcout for noisy Gaussian
- setup_incrementer_calcout(calcout, ...)
setup calcout record as an incrementer
- setup_lorentzian_calcout(calcout, ref_signal, ...)
setup calcout record for noisy Lorentzian
- class apstools.synApps.calcout.CalcoutRecord(*args: Any, **kwargs: Any)[source]
EPICS base calcout record support in ophyd
reset()
set all fields to default values
- class apstools.synApps.calcout.CalcoutRecordChannel(*args: Any, **kwargs: Any)[source]
channel of a calcout record: A-L
reset()
set all fields to default values
- class apstools.synApps.calcout.UserCalcoutDevice(*args: Any, **kwargs: Any)[source]
EPICS synApps XXX IOC setup of user calcouts:
$(P):userCalcOut$(N)
reset()
set all fields to default values
- calcout1
- calcout10
- calcout2
- calcout3
- calcout4
- calcout5
- calcout6
- calcout7
- calcout8
- calcout9
- class apstools.synApps.calcout.UserCalcoutN(*args: Any, **kwargs: Any)[source]
Single instance of the userCalcoutN database.
- apstools.synApps.calcout.setup_gaussian_calcout(calcout, ref_signal, center=0, width=1, scale=1, noise=0.05)[source]
setup calcout for noisy Gaussian
calculation:
D*(0.95+E*RNDM)/exp(((A-B)/C)^2)
PARAMETERS
- calcout
object : instance of CalcoutRecord
- ref_signal
object : instance of EpicsSignal, used as A
- center
float : EPICS record field B, default = 0
- width
float : EPICS record field C, default = 1
- scale
float : EPICS record field D, default = 1
- noise
float : EPICS record field E, default = 0.05
- apstools.synApps.calcout.setup_incrementer_calcout(calcout, scan=None, limit=100000)[source]
setup calcout record as an incrementer
PARAMETERS
- calcout
object : instance of CalcoutRecord
- scan
text or int or None : any of the EPICS record .SCAN values, or the index number of the value; set to default if None. default: .1 second
- limit
int or None : set the incrementer back to zero when this number is reached (or passed), default: 100000
- apstools.synApps.calcout.setup_lorentzian_calcout(calcout, ref_signal, center=0, width=1, scale=1, noise=0.05)[source]
setup calcout record for noisy Lorentzian
calculation:
D*(0.95+E*RNDM)/(1+((A-B)/C)^2)
PARAMETERS
- calcout
object : instance of CalcoutRecord
- ref_signal
object : instance of EpicsSignal, used as A
- center
float : EPICS record field B, default = 0
- width
float : EPICS record field C, default = 1
- scale
float : EPICS record field D, default = 1
- noise
float : EPICS record field E, default = 0.05
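A usage sketch of these setup functions (the gp: IOC prefix and the m1 motor are illustrative assumptions):

from apstools.synApps.calcout import (
    UserCalcoutDevice,
    setup_gaussian_calcout,
    setup_incrementer_calcout,
)

user_calcouts = UserCalcoutDevice("gp:", name="user_calcouts")

# simulate a noisy detector: Gaussian response to the m1 motor readback
setup_gaussian_calcout(
    user_calcouts.calcout1,
    ref_signal=m1.user_readback,  # m1 is an EpicsMotor defined elsewhere
    center=0.5,
    width=0.1,
)

# a counter that increments every 0.1 second, rolling over at 100000
setup_incrementer_calcout(user_calcouts.calcout2)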
EPICS synApps optics 2slit.db
db_2slit: synApps optics 2slit.db
There are two implementations, corresponding to differing and competing opinions of how the support should be implemented.
Coordinates of Optics2Slit2D_InbOutBotTop
(viewing from detector towards source):
top
inb out
bot
Coordinates of Optics2Slit2D_HV
(viewing from detector towards source):
v.xp
h.xn h.xp
v.xn
Each blade 1 (in the XIA slit controller) travels in a _cartesian_ coordinate
system. Positive motion moves a blade outwards (towards the p
suffix).
Negative motion moves towards the n
suffix. Size and center are computed
by the underlying EPICS support.
hsize = out - inb
vsize = top - bot
- 1
Note that the blade names here may be different than the EPICS support. The difference is to make the names of the blades consistent with other slits with the Bluesky framework.
USAGE:
slit1 = Optics2Slit2D_HV("gp:Slit1", name="slit1")
slit1.geometry = 0.1, 0.1, 0, 0 # moves the slits
print(slit1.geometry)
slit2 = Optics2Slit2D_InbOutBotTop("gp:Slit2", name="slit2")
slit2.geometry = 0.1, 0.1, 0, 0 # moves the slits
print(slit2.geometry)
Public Structures
- Optics2Slit1D
EPICS synApps optics 2slit.db 1D support: xn, xp, size, center, sync
- Optics2Slit2D_HV
EPICS synApps optics 2slit.db 2D support: h.xn, h.xp, v.xn, v.xp
- Optics2Slit2D_InbOutBotTop
EPICS synApps optics 2slit.db 2D support: inb, out, bot, top
new in release 1.6.0
- class apstools.synApps.db_2slit.Optics2Slit1D(*args: Any, **kwargs: Any)[source]
EPICS synApps optics 2slit.db 1D support: xn, xp, size, center, sync
“sync” is used to tell the EPICS 2slit database to synchronize the virtual slit values with the actual motor positions.
- center
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDone
- class apstools.synApps.db_2slit.Optics2Slit2D_HV(*args: Any, **kwargs: Any)[source]
EPICS synApps optics 2slit.db 2D support: h.xn, h.xp, v.xn, v.xp
- property geometry
Return the slit 2D size and center as a namedtuple.
- h
- v
- class apstools.synApps.db_2slit.Optics2Slit2D_InbOutBotTop(*args: Any, **kwargs: Any)[source]
EPICS synApps optics 2slit.db 2D support: inb, out, bot, top
- property geometry
Return the slit 2D size and center as a namedtuple.
- hcenter
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDone
- hsize
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDone
- vcenter
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDone
- vsize
alias of
apstools.devices.positioner_soft_done.PVPositionerSoftDone
synApps epid record
The epid
record is part of the std
module:
https://epics.anl.gov/bcda/synApps/std/epidRecord.html
Ophyd support for the EPICS epid record
Public Structures
- EpidRecord
EPICS synApps epid record support in ophyd
synApps IOC statistics
The synApps iocStats support is in the synApps iocStats
module:
https://github.com/epics-modules/iocStats
Ophyd support for the iocStats support
Public Structures
- IocStatsDevice
synApps IOC stats
synApps luascript record
see the synApps luascript
module support:
https://epics-lua.readthedocs.io/en/latest/luascriptRecord.html
Ophyd support for the EPICS synApps luascript record
EXAMPLES:
import apstools.synApps
scripts = apstools.synApps.UserScriptsDevice("xxx:", name="scripts")
scripts.reset()
- UserScriptsDevice
EPICS synApps XXX IOC setup of user lua scripts: $(P):userScript$(N)
- LuascriptRecord
EPICS synApps luascript record: used as $(P):userScript$(N)
- LuascriptRecordNumberInput
number input of a synApps luascript record: A-J
- LuascriptRecordStringInput
string input of a synApps luascript record: AA-JJ
- class apstools.synApps.luascript.LuascriptRecord(*args: Any, **kwargs: Any)[source]
EPICS synApps luascript record: used as
$(P):userScript$(N)
reset()
set all fields to default values
- class apstools.synApps.luascript.LuascriptRecordNumberInput(*args: Any, **kwargs: Any)[source]
number input of a synApps luascript record: A-J
- class apstools.synApps.luascript.LuascriptRecordStringInput(*args: Any, **kwargs: Any)[source]
string input of a synApps luascript record: AA-JJ
synApps save data
The synApps SaveData support is in the synApps sscan
module:
https://github.com/epics-modules/sscan
Ophyd support for the EPICS synApps saveData support
see: https://epics.anl.gov/bcda/synApps/sscan/sscanRecord.html
EXAMPLE:
from apstools.synApps import SaveData
save_data = SaveData("xxx:saveData_", name="save_data")
Public Structures
- SaveData
EPICS synApps saveData support.
EPICS synApps calc scalcout record
The scalcout
record is part of EPICS synApps calc
:
http://htmlpreview.github.io/?https://github.com/epics-modules/calc/blob/R3-6-1/documentation/calcDocs.html
Ophyd support for the EPICS scalcout record
Public Structures
- UserScalcoutDevice
EPICS synApps XXX IOC setup of user scalcouts: $(P):userStringCalc$(N)
- UserScalcoutN
Single instance of the userStringCalcN database.
- ScalcoutRecord
EPICS synApps calc scalcout record support in ophyd
- ScalcoutRecordNumberChannel
Number channel of an scalcout record: A-L
- ScalcoutRecordStringChannel
String channel of an scalcout record: AA-LL
- class apstools.synApps.scalcout.ScalcoutRecord(*args: Any, **kwargs: Any)[source]
EPICS synApps calc scalcout record support in ophyd
reset()
set all fields to default values
- class apstools.synApps.scalcout.ScalcoutRecordNumberChannel(*args: Any, **kwargs: Any)[source]
Number channel of an scalcout record: A-L
reset()
set all fields to default values
- class apstools.synApps.scalcout.ScalcoutRecordStringChannel(*args: Any, **kwargs: Any)[source]
String channel of an scalcout record: AA-LL
reset()
set all fields to default values
- class apstools.synApps.scalcout.UserScalcoutDevice(*args: Any, **kwargs: Any)[source]
EPICS synApps XXX IOC setup of user scalcouts:
$(P):userStringCalc$(N)
reset()
set all fields to default values
- scalcout1
- scalcout10
- scalcout2
- scalcout3
- scalcout4
- scalcout5
- scalcout6
- scalcout7
- scalcout8
- scalcout9
synApps sscan record
see the synApps sscan
module support:
https://github.com/epics-modules/sscan
Ophyd support for the EPICS synApps sscan record
see: https://epics.anl.gov/bcda/synApps/sscan/SscanRecord.html
EXAMPLE:
import apstools.synApps
scans = apstools.synApps.SscanDevice("xxx:", name="scans")
scans.select_channels() # only the channels configured in EPICS
Public Structures
- SscanRecord
EPICS synApps sscan record: used as $(P):scan$(N)
- SscanDevice
EPICS synApps XXX IOC setup of sscan records: $(P):scan$(N)
Private Structures
- sscanPositioner
positioner of an EPICS sscan record
- sscanDetector
detector of an EPICS sscan record
- sscanTrigger
detector trigger of an EPICS sscan record
- class apstools.synApps.sscan.SscanDevice(*args: Any, **kwargs: Any)[source]
EPICS synApps XXX IOC setup of sscan records:
$(P):scan$(N)
reset()
set all fields to default values
select_channels()
Select only the scans that are configured in EPICS
- scan1
alias of
apstools.synApps.sscan.SscanRecord
- scan2
alias of
apstools.synApps.sscan.SscanRecord
- scan3
alias of
apstools.synApps.sscan.SscanRecord
- scan4
alias of
apstools.synApps.sscan.SscanRecord
- scanH
alias of
apstools.synApps.sscan.SscanRecord
- class apstools.synApps.sscan.SscanRecord(*args: Any, **kwargs: Any)[source]
EPICS synApps sscan record: used as
$(P):scan$(N)
defined_in_EPICS
True if will be used in EPICS
reset()
set all fields to default values
select_channels()
Select channels that are configured in EPICS
- property defined_in_EPICS
True if will be used in EPICS
- class apstools.synApps.sscan.sscanDetector(*args: Any, **kwargs: Any)[source]
detector of an EPICS sscan record
defined_in_EPICS
True if defined in EPICS
reset()
set all fields to default values
- property defined_in_EPICS
True if defined in EPICS
synApps sseq record
- The sseq (String Sequence) record is part of the calc module:
Ophyd support for the EPICS sseq (string sequence) record
Public Structures
- EditStringSequence
EPICS synApps sseq support to quickly re-arrange steps.
- SseqRecord
EPICS synApps sseq record support in ophyd
- UserStringSequenceDevice
EPICS synApps XXX IOC setup of userStringSeqs: $(P):userStringSeq$(N)
- UserStringSequenceN
Single instance of the userStringSeqN database.
- class apstools.synApps.sseq.EditStringSequence(*args: Any, **kwargs: Any)[source]
EPICS synApps sseq support to quickly re-arrange steps.
See the
editSseq_more
GUI screen for assistance.
- class apstools.synApps.sseq.SseqRecord(*args: Any, **kwargs: Any)[source]
EPICS synApps sseq record support in ophyd
abort()
.ABORT is a push button.
reset()
set all fields to default values
- class apstools.synApps.sseq.UserStringSequenceDevice(*args: Any, **kwargs: Any)[source]
EPICS synApps XXX IOC setup of userStringSeqs:
$(P):userStringSeq$(N)
Note: This will connect more than 1,000 EpicsSignal objects!
reset()
set all fields to default values
- sseq1
- sseq10
- sseq2
- sseq3
- sseq4
- sseq5
- sseq6
- sseq7
- sseq8
- sseq9
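A usage sketch (the gp: IOC prefix is illustrative). Because instantiation connects so many signals, allow extra time before first use:

from apstools.synApps.sseq import UserStringSequenceDevice

sseqs = UserStringSequenceDevice("gp:", name="sseqs")
sseqs.wait_for_connection(timeout=30)  # >1,000 EpicsSignals take a while
sseqs.reset()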
EPICS base sub record & synApps userAve records
The sub
record is part of EPICS base:
https://epics.anl.gov/base/R7-0/6-docs/subRecord.html
The user average databases ($(P)userAve$(N)) are from synApps.
Ophyd support for the EPICS sub record
https://epics.anl.gov/base/R7-0/6-docs/subRecord.html
Public Structures
- UserAverageN
EPICS synApps XXX IOC setup of user average: $(P):userAve$(N)
- UserAverageDevice
EPICS synApps XXX IOC setup of user averaging sub records: $(P):userAve$(N)
- SubRecord
EPICS base sub record support in ophyd
- SubRecordChannel
Number channel of a sub record: A-L
- class apstools.synApps.sub.SubRecord(*args: Any, **kwargs: Any)[source]
EPICS base sub record support in ophyd
reset()
set all fields to default values
- class apstools.synApps.sub.SubRecordChannel(*args: Any, **kwargs: Any)[source]
Number channel of a sub record: A-L
reset()
set all fields to default values
- class apstools.synApps.sub.UserAverageDevice(*args: Any, **kwargs: Any)[source]
EPICS synApps XXX IOC setup of user averaging sub records:
$(P):userAve$(N)
reset()
set all fields to default values
- average1
alias of
apstools.synApps.sub.UserAverageN
- average10
alias of
apstools.synApps.sub.UserAverageN
- average2
alias of
apstools.synApps.sub.UserAverageN
- average3
alias of
apstools.synApps.sub.UserAverageN
- average4
alias of
apstools.synApps.sub.UserAverageN
- average5
alias of
apstools.synApps.sub.UserAverageN
- average6
alias of
apstools.synApps.sub.UserAverageN
- average7
alias of
apstools.synApps.sub.UserAverageN
- average8
alias of
apstools.synApps.sub.UserAverageN
- average9
alias of
apstools.synApps.sub.UserAverageN
- class apstools.synApps.sub.UserAverageN(*args: Any, **kwargs: Any)[source]
EPICS synApps XXX IOC setup of user average:
$(P):userAve$(N)
This database uses a sub record for most features plus additional records to support done, acquire, clear, and other features.
It uses a sub record plus other records, hence is not exactly a SubRecord().
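A usage sketch (the gp: IOC prefix is illustrative):

from apstools.synApps.sub import UserAverageDevice

averages = UserAverageDevice("gp:", name="averages")
averages.reset()  # set all fields of the ten userAve databases to defaults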
synApps swait record
The swait
record is part of the calc
module:
https://htmlpreview.github.io/?https://raw.githubusercontent.com/epics-modules/calc/R3-6-1/documentation/swaitRecord.html
see the synApps calc
module support:
https://github.com/epics-modules/calc
Ophyd support for the EPICS synApps swait record
EXAMPLES:
import apstools.synApps
calcs = apstools.synApps.UserCalcsDevice("xxx:", name="calcs")
calc1 = calcs.calc1
apstools.synApps.setup_random_number_swait(calc1)
apstools.synApps.setup_incrementer_swait(calcs.calc2)
calc1.reset()
- UserCalcN
Single instance of the userCalcN database.
- UserCalcsDevice
EPICS synApps XXX IOC setup of userCalcs: $(P):userCalc$(N)
- SwaitRecord
EPICS synApps swait record: used as $(P):userCalc$(N)
- SwaitRecordChannel
EPICS synApps swait record: single channel [A-L]
- setup_random_number_swait(swait, ...)
setup swait record to generate random numbers
- setup_gaussian_swait(swait, ref_signal, ...)
setup swait for noisy Gaussian
- setup_lorentzian_swait(swait, ref_signal, ...)
setup swait record for noisy Lorentzian
- setup_incrementer_swait(swait, ...)
setup swait record as an incrementer
- class apstools.synApps.swait.SwaitRecord(*args: Any, **kwargs: Any)[source]
EPICS synApps swait record: used as
$(P):userCalc$(N)
reset()
set all fields to default values
- class apstools.synApps.swait.SwaitRecordChannel(*args: Any, **kwargs: Any)[source]
EPICS synApps swait record: single channel [A-L]
- class apstools.synApps.swait.UserCalcN(*args: Any, **kwargs: Any)[source]
Single instance of the userCalcN database.
- class apstools.synApps.swait.UserCalcsDevice(*args: Any, **kwargs: Any)[source]
EPICS synApps XXX IOC setup of userCalcs:
$(P):userCalc$(N)
reset()
set all fields to default values
- calc1
alias of
apstools.synApps.swait.UserCalcN
- calc10
alias of
apstools.synApps.swait.UserCalcN
- calc2
alias of
apstools.synApps.swait.UserCalcN
- calc3
alias of
apstools.synApps.swait.UserCalcN
- calc4
alias of
apstools.synApps.swait.UserCalcN
- calc5
alias of
apstools.synApps.swait.UserCalcN
- calc6
alias of
apstools.synApps.swait.UserCalcN
- calc7
alias of
apstools.synApps.swait.UserCalcN
- calc8
alias of
apstools.synApps.swait.UserCalcN
- calc9
alias of
apstools.synApps.swait.UserCalcN
- apstools.synApps.swait.setup_gaussian_swait(swait, ref_signal, center=0, width=1, scale=1, noise=0.05)[source]
setup swait for noisy Gaussian
calculation: D*(0.95+E*RNDM)/exp(((A-B)/C)^2)
PARAMETERS
- swait
object : instance of SwaitRecord
- ref_signal
object : instance of EpicsSignal, used as A
- center
float : EPICS record field B, default = 0
- width
float : EPICS record field C, default = 1
- scale
float : EPICS record field D, default = 1
- noise
float : EPICS record field E, default = 0.05
- apstools.synApps.swait.setup_incrementer_swait(swait, scan=None, limit=100000)[source]
setup swait record as an incrementer
PARAMETERS
- swait
object : instance of SwaitRecord
- scan
text or int or None : any of the EPICS record .SCAN values, or the index number of the value; set to default if None. default: .1 second
- limit
int or None : set the incrementer back to zero when this number is reached (or passed), default: 100000
- apstools.synApps.swait.setup_lorentzian_swait(swait, ref_signal, center=0, width=1, scale=1, noise=0.05)[source]
setup swait record for noisy Lorentzian
calculation: D*(0.95+E*RNDM)/(1+((A-B)/C)^2)
PARAMETERS
- swait
object : instance of SwaitRecord
- ref_signal
object : instance of EpicsSignal, used as A
- center
float : EPICS record field B, default = 0
- width
float : EPICS record field C, default = 1
- scale
float : EPICS record field D, default = 1
- noise
float : EPICS record field E, default = 0.05
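Continuing the EXAMPLES above (calcs is the UserCalcsDevice; the m1 motor is an illustrative assumption):

from apstools.synApps.swait import setup_gaussian_swait, setup_lorentzian_swait

# noisy Gaussian "detector" responding to the m1 motor readback
setup_gaussian_swait(calcs.calc3, m1.user_readback, center=0.5, width=0.1)

# noisy Lorentzian, with a larger noise term
setup_lorentzian_swait(calcs.calc4, m1.user_readback, center=0.5, width=0.1, noise=0.1)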
synApps transform record
- The transform record is part of the calc module:
Ophyd support for the EPICS transform record
Public Structures
- UserTransformN
Single instance of the userTranN database.
- UserTransformsDevice
EPICS synApps XXX IOC setup of userTransforms
- TransformRecord
EPICS transform record support in ophyd
- class apstools.synApps.transform.TransformRecord(*args: Any, **kwargs: Any)[source]
EPICS transform record support in ophyd
reset()
set all fields to default values
- class apstools.synApps.transform.UserTransformN(*args: Any, **kwargs: Any)[source]
Single instance of the userTranN database.
Change History
The project milestones describe the future plans.
1.6.1
release expected by 2022-01-26
Fixes
Move enable Component out from synApps Record devices.
Renew the unit tests for PVPositionerSoftDoneWithStop.
1.6.0
released 2022-01-20
Breaking Changes
Moved apsbss support to the new apsbss package (install with either pip or conda). See https://bcda-aps.github.io/apsbss/ for details.
Can use Python 3.7 - 3.9. Cannot use Python 3.10 yet due to upstream limitation from the databroker and intake packages.
Moved command_list_as_table() from utils into plans/command_list.
Removed BusyStatus from apstools.synApps.busy.
callbacks/: DocumentCollectorCallback, document_contents_callback, and SnapshotReport moved into callbacks/.
devices/: Reorganized all devices, including synApps/, into the devices/ subpackage.
devices/: SynPseudoVoigt() moved from signals/ to devices/.
plans/: Reorganized plans.py and _plans/ into the plans/ subpackage.
snapshot/: Moved the snapshot application and related files to a subdirectory.
utils/: Reorganized utils.py and _utils/ into the utils/ subpackage.
New Features and/or Enhancements
Add support for Eurotherm 2216e temperature controller
Add support for Lakeshore 336 temperature controller
Add support for Lakeshore 340 temperature controller
Add support for synApps calc scalcout record.
Add support for synApps calc sseq record.
Add support for EPICS base sub record.
Add support for synApps calc userAve database.
Add support for synApps calc userStringSeq database.
Add support for synApps calc userStringCalc database.
Add support for synApps optics 2slit database.
Fixes
Convert None to "null" when saving PeakStats to stream.
Maintenance
Now testing with Python versions 3.7 - 3.9. (Can’t use with Py3.10 yet due to upstream requirements.)
Update notebooks:
demo_specfile_example
demo_tuneaxis
Remove notebooks:
demo_specfile_databroker
Deprecations
Applications
apstools_plan_catalog application and related support.
Devices
ApsCycleComputedRO
move_energy() method in KohzuSeqCtl_Monochromator class
ProcessController
Utilities
device_read2table
json_export
json_import
listdevice_1_5_2
listruns_v1_4
object_explorer
Contributors
Gilberto Fabbris
Jan Ilavsky
Qingteng Zhang
1.5.4
release expected 2021-12-01
NOTE: The apsbss component will be moved out of apstools into its own package with the next release (1.6.0, ~Feb 2022) of apstools.
Notice
The Python version is limited to 3.7 due to aps-dm-api package. Expect this limitation to be relaxed, allowing any Python 3.7 and higher with the 1.6.0 release.
Fixes
Added table of APS run cycle dates. Use that if aps-dm-api not available.
Restricted python version to 3.7 due to upstream aps_dm_api package.
Renamed uid to token to avoid LGTM security false alert.
Deprecations
This support was marked as deprecated in release 1.5.4:
apstools.devices.ApsCycleComputedRO
1.5.3
released 2021-10-15
Notice
The apstools.beamtime
module and related content (includes apsbss
)
will be moved to a new repository for release 1.6.0. This will
remove the requirement that the APS data management tools (package aps-dm,
which only works on the APS computing network) be included. With this
change, users will be able to conda install apstools -c aps-anl-tag
on
computers outside of the APS computing network.
Breaking Changes
apstools.utils.listdevice has a new API (old version renamed to listdevice_1_5_2)
New Features and/or Enhancements
Kohzu monochromator energy, wavelength, and theta are each now a PVPositioner (subclass).
Linkam temperature controller CI94
Linkam temperature controller T96
Stanford Research Systems 570 current preamplifier
Stanford Research Systems PTC10 temperature controller
XIA PF4 filter now supports multiple PF4 units.
Generalize that amplifiers will have a gain Component attribute.
Generalize that temperature controllers will have a temperature Component attribute that is a positioner (subclass of ophyd.PVPositioner).
Enhanced positioners for EPICS Devices:
* apstools.devices.PVPositionerSoftDone
* apstools.devices.PVPositionerSoftDoneWithStop
Fixes
Fixed bug in devices.ApsCycleComputedRO and devices.ApsCycleDM involving datetime.
Maintenance
Moved all device support into individual modules under apstools._devices because apstools.devices module was getting too big. Will refactor all with release 1.6.0.
Add unit tests for devices.ApsCycle* Devices.
Add EPICS IOCs (ADSimDetector and synApps xxx) to continuous integration for use in unit testing.
Unit tests now use pytest package.
Suppress certain warnings during unit testing.
Deprecations
This support will be removed in release 1.6.0:
apstools.beamtime module and related content (includes apsbss) will be moved to a new repository
apstools.devices.ProcessController
apstools.utils.device_read2table
apstools.utils.listdevice_1_5_2
apstools.utils.object_explorer
Contributors
Fanny Rodolakis
Gilberto Fabbris
Jan Ilavsky
Qingteng Zhang
4-ID-C Polar
8-ID-I XPCS
9-ID-C USAXS
1.5.2 (and previous)
See this table for release change histories, highlighted by version control reference (pull request or issue):
- 1.5.2
released 2021-09-29
Drop Codacy (https://app.codacy.com/gh/BCDA-APS/apstools) as no longer needed.
- #540
Add apstools.utils.listplans() function.
- #534
Add apstools.utils.OverrideParameters class. Hoisted from APS USAXS instrument.
- #537
Enhancements to apstools.utils.listruns():
Add search by list of scan_id or uid values.
Optimize search speed.
- #534
Add apstools.plans.documentation_run() plan. Hoisted from APS USAXS instrument.
- #528
Add kind= kwarg to synApps Devices.
- #539
Break devices into submodule _devices.
- 1.5.1
released 2021-07-22
- #522
Deprecate apstools.devices.ProcessController. Suggest ophyd.PVPositioner instead.
- #521
Enhancement: new functions: getRunData(), getRunDataValue(), getStreamValues() & listRunKeys()
- #518
Bug fixed: TypeError from summary() of CalcoutRecord
- #517
Added support for python 3.9.
- #514
Refactor ‘SIGNAL.value’ to ‘SIGNAL.get()’
- 1.5.0
released 2021-04-02
- #504 comment
Dropped support for python 3.6.
- #495
Dropped diffractometer support code.
- #506
spec2ophyd
can now read SPEC config files from APS 17BM
- #504
Overhaul of listruns() using pandas. Previous code renamed to listruns_v1_4().
- #503
Unit tests with data now used msgpack-backed databroker.
- #495
remove hklpy requirement since all diffractometer support code will be moved to [hklpy](https://github.com/bluesky/hklpy) package.
- 1.4.1
released: 2021-01-23
- 1.4.0
released: 2021-01-15
- #483
Python code style must pass
flake8
test.
- #482
specwriter: Fix bug when plan_args structure includes a numpy ndarray.
- #474
apstools.utils.listruns() now defaults to the current catalog in use.
New functions:
apstools.utils.getDatabase()
apstools.utils.getDefaultDatabase()
- #470
Area Detector plugin preparation & detection.
apstools.devices.AD_plugin_primed()
re-written completely
apstools.devices.AD_prime_plugin()
replaced by
apstools.devices.AD_prime_plugin2()
- #463
Remove deprecated features.
apstools.suspenders.SuspendWhenChanged()
apstools.utils.plot_prune_fifo()
apstools.utils.show_ophyd_symbols()
apstools.synapps.asyn.AsynRecord.binary_output_maxlength()
apstools.devices.AD_warmed_up()
- #451
Undulator and Kohzu monochromator functionalities
apstools.devices.ApsUndulator()
Adds some Signal components (such as setting the kind kwarg) that are helpful in moving the undulator
- 1.3.9
released 2020-11-30
- 1.3.8
released: 2020-10-23
- #449
diffractometer wh() shows extra positioners
- #446
utils: device_read2table() renamed to listdevice()
- #445
synApps: add Device for iocStats
- #437
diffractometer add pa() report
- #426
diffractometer add simulated diffractometers
- #425
BUG fixed: listruns() when no stop document
- #423
BUG fixed: apsbss IOC starter script
- 1.3.7
released: 2020-09-18
- 1.3.6
released 2020-09-04
- 1.3.5
released 2020-08-25
- 1.3.4
released 2020-08-14
- #400
resolve warnings and example documentation inconsistency
- #399
parse iso8601 date for py36
- #398
DiffractometerMixin: add wh() method
- #396
provide spec2ophyd application
- #394
add utils.copy_filtered_catalog()
- #392
RTD make parameter lists clearer
- #390
improve formatting of parameter list in RTD
- #388
add utils.quantify_md_key_use()
- #385
spec2ophyd: make entry point
- 1.3.3
released 2020-07-22
- 1.3.2
released 2020-07-20
- #380
apsbss: fix object references
- 1.3.1
released 2020-07-18
- #378
apsbss_ioc.sh: add checkup (keep-alive feature for the IOC)
- #376
apsbss: example beam line-specific shell scripts
- #375
apsbss: add PVs for numbers of users
- #374
apsbss_ophyd: addDeviceDataAsStream() from USAXS
- #373
account for time zone when testing datetime-based file name
- #371
update & simplify the travis-ci setup
- #369
spec2ophyd: handle NONE in SPEC counters
- #368
spec2ophyd: config file as command-line argument
- #367
apsbss: move ophyd import from main
- #364
apsbss: add PVs for ioc_host and ioc_user
- #363
Handle when mailInFlag not provided
- #360
prefer logging to print
- 1.3.0
release expected by 2020-07-15
add NeXus writer callback
add
apsbss
: APS experiment metadata support- #351
apsbss: put raw info into PV
- #350
apsbss: clarify meaning of reported dates
- #349
apsbss: add “next” subcommand
- #347
some apsbss files not published
- #346
publish fails to push conda packages
- #344
listruns() uses databroker v2 API
- #343
review and update requirements
- #342
summarize runs in databroker by plan_name and frequency
- #341
tools to summarize activity
- #340
update copyright year
- #339
resolve Codacy code review issues
- #338
unit tests are leaving directories undeleted
- #337
Document new filewriter callbacks
- #336
add NeXus file writer from USAXS
- #335
update requirements
- #334
support APS proposal & ESAF systems to provide useful metadata
- #333
access APS proposal and ESAF information
- #332
listruns(): use databroker v2 API
- #329
add NeXus writer base class from USAXS
- 1.2.6
released 2020-06-26
- #331
listruns succeeds even when number of existing runs is less than requested
- #330
BUG: listruns: less than 20 runs in catalog
- #328
epid: add final_value (.VAL field)
- #327
epid: remove clock_ticks (.CT field)
- #326
BUG: epid failed to connect to .CT field
- #325
BUG: epid final_value signal not found
- #324
BUG: epid controlled_value signal name
- 1.2.5
released 2020-06-05
- 1.2.3
released 2020-05-07
- 1.2.2
released 2020-05-06
- DEPRECATION #306
apstools.plans.show_ophyd_symbols() will be removed by 2020-07-01. Use apstools.plans.listobjects() instead.
- #311
adapt to databroker v1
- #310
enhance listruns() search capabilities
- #308
manage diffractometer constraints
- #307
add diffractometer enhancements
- #306
rename show_ophyd_objects() as listobjects()
- #305
add utils.safe_ophyd_name()
- #299
set_lim() does not set low limit
- 1.2.1
released 2020-02-18 - bug fix
- #297
fix import error
- 1.2.0
released 2020-02-18 - remove deprecated functions
- 1.1.19
released 2020-02-15
- 1.1.18
released 2020-02-09
PyPI would not accept the 1.1.17 version: filename has already been used
see release notes for 1.1.17
- 1.1.17
released 2020-02-09 - hot fixes
- 1.1.16
released 2019-12-05
- #269
bug: shutter does not move when expected
- #268
add redefine_motor_position() plan
- #267
remove lineup() plan for now
- #266
bug fix for #265
- #265
refactor of #264
- #264
Limit number of traces shown on a plot - use a FIFO
- #263
device_read2table() should print unless optioned False
- #262
add lineup() plan (from APS 8-ID-I XPCS)
- 1.1.15
released 2019-11-21 : bug fixes, adds asyn record support
- 1.1.14
released 2019-09-03 : bug fixes, more synApps support
- #246
synApps: shorten name from synApps_ophyd
- #245
swait & calcout: change from EpicsMotor to any EpicsSignal
- #240
swait: refactor swait record & userCalc support
- #239
transform: add support for transform record
- #238
calcout: add support for calcout record & userCalcOuts
- #237
epid: add support for epid record
- #234
utils: replicate the unix() command
- #230
signals: resolve TypeError
- 1.1.13
released 2019-08-15 : enhancements, bug fix, rename
- 1.1.12
released 2019-08-05 : bug fixes & updates
- 1.1.11
released 2019-07-31 : updates & new utility
- 1.1.10
released 2019-07-30 : updates & bug fix
- 1.1.9
released 2019-07-28 : updates & bug fix
- 1.1.8
released 2019-07-25 : updates
- #196
spec2ophyd handle MOTPAR:read_misc_1
- #194
new
show_ophyd_symbols
shows table of global ophydSignal
andDevice
instances
- #193
spec2ophyd ignore None items in SPEC config file
- #192
spec2ophyd handles VM_EPICS_PV in SPEC config file
- #191
spec2ophyd handles PSE_MAC_MOT in SPEC config file
- #190
spec2ophyd handles MOTPAR in SPEC config file
- 1.1.7
released 2019-07-04
- DEPRECATION
apstools.plans.run_blocker_in_plan() will be removed by 2019-12-31. Do not write blocking code in bluesky plans.
Dropped python 3.5 from supported versions
- #175
move plans.run_in_thread() to utils.run_in_thread()
- #168
new spec2ophyd migrates SPEC config file to ophyd setup
- #166
device_read2table(): format device.read() results in a pyRestTable.Table
- #161
addDeviceDataAsStream(): add Device as named document stream event
- #159
convert xlrd.XLRDError into apstools.utils.ExcelReadError
- #158
new
run_command_file()
runs a command list from text file or Excel spreadsheet
- 1.1.6
released 2019-05-26
- 1.1.5
released 2019-05-14
- #135
add refresh button to snapshot GUI
- 1.1.4
released 2019-05-14
- 1.1.3
released 2019-05-10
adds packaging dependence on event-model
- #137
adds utils.json_export() and utils.json_import()
- 1.1.1
released 2019-05-09
adds packaging dependence on spec2nexus
- #136
get json document stream(s)
- #134
add build on travis-ci with py3.7
- #130
fix conda recipe and pip dependencies (thanks to Maksim Rakitin!)
- #128
SpecWriterCallback.newfile() problem with scan_id = 0
- #127
fixed: KeyError from SPEC filewriter
- #126
add uid to metadata
- #125
SPEC filewriter scan numbering when “new” data file exists
- #124
fixed: utils.trim_string_for_EPICS() trimmed string too long
- #100
fixed: SPEC file data columns in wrong places
- 1.1.0
released 2019-04-16
change release numbering to Semantic Versioning (remove all previous tags and releases)
batch scans using Excel spreadsheets
bluesky_snapshot_viewer and bluesky_snapshot
conda package available
License
Copyright (c) 2017-2022, UChicago Argonne, LLC
All Rights Reserved
apstools (previously: APS_BlueSky_tools)
BCDA, Advanced Photon Source, Argonne National Laboratory
OPEN SOURCE LICENSE
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer. Software changes,
modifications, or derivative works, should be noted with comments and
the author and organization's name.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. Neither the names of UChicago Argonne, LLC or the Department of Energy
nor the names of its contributors may be used to endorse or promote
products derived from this software without specific prior written
permission.
4. The software and the end-user documentation included with the
redistribution, if any, must include the following acknowledgment:
"This product includes software produced by UChicago Argonne, LLC
under Contract No. DE-AC02-06CH11357 with the Department of Energy."
****************************************************************************
DISCLAIMER
THE SOFTWARE IS SUPPLIED "AS IS" WITHOUT WARRANTY OF ANY KIND.
Neither the United States GOVERNMENT, nor the United States Department
of Energy, NOR uchicago argonne, LLC, nor any of their employees, makes
any warranty, express or implied, or assumes any legal liability or
responsibility for the accuracy, completeness, or usefulness of any
information, data, apparatus, product, or process disclosed, or
represents that its use would not infringe privately owned rights.
****************************************************************************
See Also
apstools home | apstools source | apsbss home | Bluesky home | Bluesky source