http://arxiv.org/abs/1504.04833
We consider a sample of ten GRBs with long-lasting ($\gtrsim 10^2\,\rm s$) emission detected by Fermi/LAT and for which X-ray data around $1\,$day are also available. We assume that both the X-rays and the GeV emission are produced by electrons accelerated at the external shock, and show that the X-ray and the LAT fluxes lead to very different estimates of the initial kinetic energy of the blast wave. The energy estimated from LAT is on average $\sim50$ times larger than the one estimated from the X-rays. We model the data (accounting also for optical detections around $1\,$day, where available) to uncover the reason for this discrepancy, and find that a good modelling within the external shock scenario is always possible and leads to two possibilities: either (i) the X-ray emitting electrons (unlike the GeV emitting electrons) are in the slow cooling regime, or (ii) the X-ray synchrotron flux is strongly suppressed by Compton cooling, whereas, owing to the Klein-Nishina suppression, this effect is much smaller at GeV energies. In both cases the X-ray flux is no longer a robust proxy for the blast wave kinetic energy. On average, both cases require weak magnetic fields ($10^{-6}\lesssim \epsilon_B \lesssim 10^{-3}$) and relatively large isotropic-equivalent kinetic blast wave energies, in the range $10^{53}\,{\rm erg} < E_{0,\rm kin} < 10^{55}\,{\rm erg}$. These energies are larger than those estimated from the X-ray flux alone, and imply smaller values of the prompt emission efficiency, reducing the requirements on the still uncertain mechanism responsible for the prompt emission.
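To make the energy comparison concrete, below is a minimal Python sketch of how two flux measurements map onto blast-wave kinetic-energy proxies, assuming the standard external-shock synchrotron scaling $F_\nu \propto E_{\rm kin}^{(p+2)/4}\,\nu^{-p/2}$ above the cooling frequency (the regime in which the flux is a proxy for the kinetic energy). The function name and all numerical values are illustrative placeholders, not taken from the paper.

# Sketch: ratio of kinetic-energy proxies implied by an X-ray and a LAT flux,
# assuming both bands are above the cooling frequency so that
#   F_nu  ~  E_kin^{(p+2)/4} * nu^{-p/2}
# (fast-cooling synchrotron; common prefactors cancel in the ratio).
# All numbers are illustrative placeholders, not values from the paper.

def energy_ratio(f_xray_mJy, nu_xray_Hz, f_lat_mJy, nu_lat_Hz, p=2.3):
    """Return E_kin(LAT) / E_kin(X-ray) implied by the two fluxes.

    Each flux is turned into a kinetic-energy proxy via
        E_kin  propto  (F_nu * nu^{p/2})^{4/(p+2)},
    so the result is independent of the common normalisation.
    """
    proxy_x = (f_xray_mJy * nu_xray_Hz**(p / 2.0))**(4.0 / (p + 2.0))
    proxy_lat = (f_lat_mJy * nu_lat_Hz**(p / 2.0))**(4.0 / (p + 2.0))
    return proxy_lat / proxy_x

if __name__ == "__main__":
    # Hypothetical fluxes around 1 day (placeholders): ~1 keV X-ray, ~100 MeV LAT.
    nu_x = 2.4e17      # Hz, roughly 1 keV
    nu_lat = 2.4e22    # Hz, roughly 100 MeV
    f_x = 1e-3         # mJy, illustrative
    f_lat = 1e-7       # mJy, illustrative
    print("E_LAT / E_X ~ {:.0f}".format(energy_ratio(f_x, nu_x, f_lat, nu_lat)))

A LAT-inferred proxy much larger than the X-ray one is, in this picture, the signature that the X-ray flux is suppressed (slow cooling or Compton cooling) and therefore underestimates the true kinetic energy.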
P. Beniamini, L. Nava, R. Duran, et al.
Tue, 21 Apr 15
Comments: 21 pages, 4 figures. Submitted to MNRAS