The time between the beam crossings in HERA (96 ns) is an order of
magnitude smaller than at previous accelerators.
This brief time interval puts stringent requirements on the electronic
systems of the experiments.
The trigger systems must reduce the large background rate of about
Hz and identify the interesting physics events, which occur at HERA at
a rate of only a few Hz.
The readout system of ZEUS must process about 3 Mbyte/event of raw
data from 250 000 electronic channels to produce approximately
120 kbyte/event of compressed data.
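A quick arithmetic check of the data reduction quoted above (a sketch using only the figures in the text, not measured values):

```python
# Per-event data reduction in the ZEUS readout, from the figures above.
raw_bytes = 3_000_000          # ~3 Mbyte/event of raw data
compressed_bytes = 120_000     # ~120 kbyte/event after compression
channels = 250_000             # electronic channels

reduction_factor = raw_bytes / compressed_bytes   # -> 25.0
bytes_per_channel = raw_bytes / channels          # -> 12.0 raw bytes/channel
print(reduction_factor, bytes_per_channel)
```

The readout therefore compresses the event by roughly a factor of 25, averaging about 12 raw bytes per electronic channel.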
The ZEUS trigger and data acquisition system (DAQ) is a highly parallel distributed real-time system and is shown schematically in Fig. 1 [1, 2]. It consists of 17 independent component readout systems and three levels of triggering. Most components have first and second level triggers, while the third level trigger consists of a farm of Silicon Graphics 4D/35S data station servers with the capability of processing data from all the components. In order to calculate global trigger decisions from the asynchronous input from the front-end of the data acquisition system, distributed synchronized buffering is needed. In addition, if significant dead-time is to be avoided, the reduction by the trigger system at the first and second levels can only be achieved by using pipelined-parallel processing.
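The pipelined buffering described above can be illustrated with a minimal sketch (this is an illustration of the principle, not the actual ZEUS firmware): each beam crossing shifts a sample into a fixed-depth pipeline, and a trigger accept must refer to a crossing that has not yet been shifted out.

```python
from collections import deque

PIPELINE_DEPTH = 58  # beam crossings stored while awaiting the FLT decision

class TriggerPipeline:
    """Fixed-depth pipeline: oldest sample drops off automatically."""

    def __init__(self, depth=PIPELINE_DEPTH):
        self.buffer = deque(maxlen=depth)

    def clock(self, sample):
        """Called once per beam crossing (every 96 ns at HERA)."""
        self.buffer.append(sample)

    def accept(self, crossings_ago):
        """Read out the sample for a trigger accept referring to a past crossing."""
        if crossings_ago >= len(self.buffer):
            raise LookupError("sample already shifted out of the pipeline")
        return self.buffer[-1 - crossings_ago]
```

A decision arriving more than 58 crossings after its event would find the data gone, which is why the global first level trigger latency must stay within the pipeline depth.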
The first level trigger (FLT) is designed to accept events at a rate of
up to 1 kHz, which is achieved using custom-built hardware processor
boards.
Component data is stored for 5.6 μs (58 beam crossings) in local
digital or analog pipelines until a global first level trigger decision is
received.
Both the trigger and readout are dead-time free at this level.
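The pipeline depth follows directly from the crossing interval and the trigger latency (a consistency check on the numbers quoted above):

```python
# 58 beam crossings at the 96 ns HERA bunch spacing.
bunch_spacing_ns = 96
pipeline_depth = 58

latency_ns = pipeline_depth * bunch_spacing_ns
print(latency_ns)   # 5568 ns, i.e. about 5.6 microseconds
```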
After a global first level trigger accept, the remaining analog signals are
digitized, some component data is zero-suppressed, and all data is stored
in dual-port memories on the front-end boards.
When the data is copied from the first level trigger pipelines to the
second level trigger pipelines, a small dead-time of about 1% is introduced.
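The quoted dead-time and accept rate together imply a per-event copy time of roughly 10 μs (an inference from the numbers in the text, not a measured figure):

```python
# Dead-time fraction = FLT accept rate x per-event copy time.
flt_rate_hz = 1_000          # design FLT accept rate
dead_time_fraction = 0.01    # ~1% dead-time quoted above

copy_time_s = dead_time_fraction / flt_rate_hz
print(copy_time_s * 1e6)     # -> 10.0 microseconds per accepted event
```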
The first level calorimeter trigger output data is processed by an additional trigger, the fast clear. This system operates in a non-pipelined mode between arrivals of the global first level triggers and may abort the event readout before second level trigger processing begins.
The task of the second level trigger (SLT) is to reduce the trigger rate by a further factor of 10 (to 100 Hz). Most components in ZEUS have many crates of front-end electronics and the data in the dual-port memories are transferred over the crate back-plane busses to a readout module in each crate. In the readout modules, the data is available for second level trigger processing and for further transport to the event-builder [3].
The ZEUS event-builder is a real-time data formatting and transport system responsible for routing the component event data over a switch to an appropriate branch of third level trigger processor nodes.
The third level trigger processor farm consists of six branches with five processing nodes each [4]. The FNAL ACP Branch Bus, with a bandwidth of 20 Mbyte/s, is used to feed 30 Silicon Graphics 4D/35S data station servers (SGI). The system can process up to 30 events in parallel, has a computing power of about 1000 MIPS and can sustain a transfer rate of 1.6 Mbyte/s to the DESY IBM.
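A back-of-the-envelope throughput budget for the third level trigger, using only the numbers quoted in the text (a sketch, not a measurement):

```python
# Third level trigger input budget from the figures above.
slt_rate_hz = 100            # event rate into the farm after the SLT
event_size_bytes = 120_000   # compressed event size
nodes = 30                   # SGI 4D/35S nodes (6 branches x 5 nodes)

input_bandwidth = slt_rate_hz * event_size_bytes   # -> 12 Mbyte/s total
time_budget_s = nodes / slt_rate_hz                # -> 0.3 s per node per event
print(input_bandwidth, time_budget_s)
```

With 30 events processed in parallel at a 100 Hz input rate, each node has a budget of about 300 ms per event, and the 12 Mbyte/s aggregate input is comfortably within the six 20 Mbyte/s branch busses.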