http://arxiv.org/abs/1609.03723
Aims. Although the temporal evolution of active regions (ARs) is relatively well understood, the processes involved continue to be the subject of investigation. We study how the magnetic field of a series of ARs evolves with time to better characterise how ARs emerge and disperse.

Methods. We examine the temporal variation in the magnetic field distribution of 37 emerging ARs. A kernel density estimation plot of the field distribution was created on a log-log scale for each AR at each time step. We found that the central portion of the distribution is typically linear and its slope was used to characterise the evolution of the magnetic field.

Results. The slopes were seen to evolve with time, becoming less steep as the fragmented emerging flux coalesces. The slopes reached a maximum value of ~ -1.5 just before the time of maximum flux before becoming steeper during the decay phase towards the quiet Sun value of ~ -3. This behaviour differs significantly from a classical diffusion model, which produces a slope of -1. These results suggest that simple classical diffusion is not responsible for the observed changes in field distribution, but that other processes play a significant role in flux dispersion.

Conclusions. We propose that the steep negative slope seen during the late decay phase is due to magnetic flux reprocessing by (super)granular convective cells.
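The slope measurement described in the Methods can be illustrated with a minimal sketch (not the authors' code): a Gaussian kernel density estimate of log10 |B| is evaluated on a grid, converted back to a density in |B|, and a straight line is fitted to the central, linear-looking part on log-log axes. The field range, sample size, and the input power-law index alpha = -1.5 (chosen to mimic the near-maximum-flux slope quoted above) are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic |B| sample from a power law p(B) ~ B**alpha on [b_min, b_max],
# drawn by inverse-transform sampling (all parameter values are assumptions).
alpha, b_min, b_max, n = -1.5, 10.0, 1000.0, 50_000
k = alpha + 1.0
u = rng.uniform(size=n)
b = (b_min**k + u * (b_max**k - b_min**k)) ** (1.0 / k)

# Gaussian KDE of x = log10(|B|) with a Scott's-rule bandwidth,
# evaluated on a regular grid in log10(B).
x = np.log10(b)
h = x.std() * n ** (-1.0 / 5.0)
grid = np.linspace(1.0, 3.0, 200)
q = np.array([np.mean(np.exp(-0.5 * ((g - x) / h) ** 2)) for g in grid])
q /= h * np.sqrt(2.0 * np.pi)

# Convert the log-space density q(x) back to p(B) = q(x) / (B ln 10),
# then fit a line to the central portion of log10 p(B) vs log10 B;
# its slope is the quantity the abstract tracks over time.
log_p = np.log10(q) - grid - np.log10(np.log(10.0))
central = (grid > 1.3) & (grid < 2.7)
slope, _ = np.polyfit(grid[central], log_p[central], 1)
print(f"fitted log-log slope: {slope:.2f}")  # recovers a value near alpha = -1.5
```

Fitting only the central portion avoids the grid edges, where kernel smoothing of the truncated sample biases the estimate, mirroring the paper's restriction to the linear part of the distribution.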
S. Dacie, P. Demoulin, L. Driel-Gesztelyi, et al.
Wed, 14 Sep 16
Comments: N/A