A classical way of acquiring data in radiation physics is to capture analog signals exceeding a certain threshold, using either analog discriminators or digital pulse processing on FPGAs. A temporal region of interest is then defined, in which the signal is integrated or its maximum is measured, either in the analog or in the digital domain. This information is stored and constitutes an event in the measurement run. The technique has two disadvantages: it yields erroneous data if two signals pile up within the same integration region, and it introduces processing dead time after each event, during which no further data can be acquired. These errors become relevant only when the event count rate is high compared to the inverse of the integration and processing times, and pile-up rejection and dead-time correction algorithms must then be applied to partially mitigate the resulting distortions.
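The scheme above can be sketched with a toy model. Both failure modes appear directly: a second pulse inside an open integration window corrupts the integral (pile-up), and a pulse arriving during the busy period is lost (dead time). All numbers and the function name are illustrative assumptions, not parameters of any real system.

```python
import numpy as np

def acquire_events(waveform, threshold, window, dead_time):
    """Toy model of classical threshold-triggered acquisition.

    When a sample exceeds `threshold`, the next `window` samples are
    integrated into one event, after which `dead_time` samples are
    ignored (processing dead time). Pulses arriving inside an open
    window pile up into the same integral; pulses arriving during
    the dead time are lost entirely.
    """
    events = []
    i = 0
    while i < len(waveform):
        if waveform[i] > threshold:
            events.append(float(waveform[i:i + window].sum()))
            i += window + dead_time  # busy: cannot trigger again
        else:
            i += 1
    return events

# Two well-separated pulses vs. two overlapping pulses (made-up shapes).
pulse = np.array([0.0, 5.0, 3.0, 1.0, 0.0, 0.0])
separated = np.concatenate([pulse, np.zeros(20), pulse])
piled_up = np.zeros(12)
piled_up[1:5] += [5.0, 3.0, 1.0, 0.0]
piled_up[3:7] += [5.0, 3.0, 1.0, 0.0]  # second pulse 2 samples later

print(acquire_events(separated, 1.0, 4, 10))  # two events, correct integrals
print(acquire_events(piled_up, 1.0, 4, 10))   # one event, distorted integral
```

In the separated case each event integrates to the true pulse sum; in the piled-up case the two pulses are merged into a single event whose integral matches neither one pulse nor the clean sum of both.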
In the context of cancer treatment with accelerated protons, secondary prompt gamma rays with high energies are emitted along the beam path and can be measured to monitor the quality of the therapy. Unfortunately, these gamma rays are produced at rates on the order of 1e9 per second. Assuming a detection efficiency of 1% with a fast scintillation detector, this leads to a count rate of 1e7 cps and thus to a significant dead-time and pile-up fraction, which degrades the measurement.
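As a rough illustration of why 1e7 cps is problematic, the standard Poisson pile-up estimate and the textbook non-paralyzable dead-time model can be evaluated for an assumed resolving/dead time of 100 ns, a typical order of magnitude for fast scintillation pulses (the 100 ns figure is an assumption, not a number from the abstract):

```python
import math

rate = 1e9 * 0.01   # detected count rate: 1e7 cps (1e9 cps * 1% efficiency)
tau = 100e-9        # assumed resolving/dead time of 100 ns (illustrative)

# Poisson probability that at least one further pulse arrives within tau
# of a given pulse, i.e. the fraction of events affected by pile-up.
pileup_fraction = 1 - math.exp(-rate * tau)

# Non-paralyzable dead-time model: measured rate m = n / (1 + n * tau),
# so a fraction 1 - m/n of the true events is lost.
measured = rate / (1 + rate * tau)
loss_fraction = 1 - measured / rate

print(f"pile-up fraction: {pileup_fraction:.1%}")  # about 63% here
print(f"dead-time loss:   {loss_fraction:.1%}")    # about 50% here
```

With rate * tau = 1 for these illustrative numbers, roughly two thirds of the events are piled up and half are lost to dead time, which is why classical correction algorithms are insufficient at this rate.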
To circumvent this limitation, we are designing a data acquisition system with zero dead time and with the ability to unravel piled-up events. The first experimental tests of this system, tailored (but not limited) to prompt gamma-ray measurements in proton therapy, will be presented in the seminar.
IFIC Experimental Seminar organizers