Orbits for eighteen visual binaries and two double-line spectroscopic binaries observed with HRCAM on the CTIO SOAR 4m telescope, using a new Bayesian orbit code based on Markov Chain Monte Carlo [SSA]


We present orbital elements and mass sums for eighteen visual binary stars of spectral types B to K (five of which are new orbits), with periods ranging from 20 to more than 500 yr. For two double-line spectroscopic binaries with no previous orbits, the individual component masses, derived from combined astrometric and radial velocity data, have formal uncertainties of ~0.1 MSun. Adopting published photometry and trigonometric parallaxes, plus our own measurements, we place these objects on an H-R diagram and discuss their evolutionary status. These objects are part of a survey to characterize the binary population of stars in the Southern Hemisphere, using the SOAR 4m telescope+HRCAM at CTIO. Orbital elements are computed with a newly developed Markov Chain Monte Carlo algorithm that delivers maximum likelihood estimates of the parameters, as well as posterior probability density functions that allow us to evaluate their uncertainties in a robust way. For spectroscopic binaries, our approach makes it possible to derive a self-consistent parallax for the system from the combined astrometric plus radial velocity data (the "orbital parallax"), which compares well with the trigonometric parallax. We also present a mathematical formalism that reduces the dimensionality of the feature space from seven to three search parameters (or from ten to seven dimensions, including the parallax, in the case of spectroscopic binaries with astrometric data), which allows us to explore fewer parameters in each case and improves the computational efficiency of our Markov Chain Monte Carlo code.
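The dimensionality reduction described above rests on a classic trick: parameters that enter the model linearly can be solved in closed form for any trial value of the nonlinear parameters, so the MCMC chain only has to explore the nonlinear ones. The sketch below illustrates this with an invented one-parameter "separation" model (the cosine model, noise level, and step sizes are all made up for illustration; this is not the paper's Keplerian formalism):

```python
import math
import random

random.seed(1)

# Toy "astrometric" data: rho(t) = a * (1 + 0.3*cos(2*pi*t/P)) + noise.
# This is a deliberate simplification, not a real orbit model.
P_true, a_true, noise = 40.0, 1.5, 0.02
ts = [2.0 * i for i in range(30)]
data = [a_true * (1 + 0.3 * math.cos(2 * math.pi * t / P_true))
        + random.gauss(0, noise) for t in ts]

def best_a(P):
    # The scale 'a' enters linearly, so for any trial period P it has a
    # closed-form least-squares solution -- the same idea that lets the
    # paper shrink the search space from seven to three parameters.
    num = den = 0.0
    for t, d in zip(ts, data):
        b = 1 + 0.3 * math.cos(2 * math.pi * t / P)
        num += b * d
        den += b * b
    return num / den

def log_like(P):
    a = best_a(P)
    return sum(-0.5 * ((d - a * (1 + 0.3 * math.cos(2 * math.pi * t / P)))
                       / noise) ** 2 for t, d in zip(ts, data))

# Metropolis-Hastings over the single remaining nonlinear parameter P.
P = 35.0
ll = log_like(P)
samples = []
for _ in range(20000):
    P_new = P + random.gauss(0, 0.5)
    if P_new > 0:
        ll_new = log_like(P_new)
        if math.log(random.random()) < ll_new - ll:
            P, ll = P_new, ll_new
    samples.append(P)

# Posterior mean from the chain, after discarding burn-in.
post_mean = sum(samples[5000:]) / len(samples[5000:])
```

The chain converges to the true period, and the retained samples trace the posterior density of P, from which uncertainty intervals follow directly.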

Read this paper on arXiv…

R. Mendez, R. Claveria, M. Orchard, et al.
Thu, 21 Sep 17

Comments: 32 pages, 9 figures, 6 tables. Detailed Appendix with methodology. Accepted by The Astronomical Journal

Field dynamics inference via spectral density estimation [CL]


Stochastic differential equations (SDEs) are of utmost importance in various scientific and industrial areas. They are the natural description of dynamical processes whose precise equations of motion are either not known or too expensive to solve, e.g., when modeling Brownian motion. In some cases, the equations governing the dynamics of a physical system on macroscopic scales turn out to be unknown, since they typically cannot be deduced from general principles. In this work, we describe how the underlying laws of a stochastic process can be approximated by the spectral density of the corresponding process. Furthermore, we show how the density can be inferred from possibly very noisy and incomplete measurements of the dynamical field. In general, inverse problems like these can be tackled with the help of Information Field Theory (IFT). For now, we restrict ourselves to linear and autonomous processes; however, this is not a conceptual limitation and may be lifted in future work. To demonstrate its applicability, we apply our reconstruction algorithm to time-series and spatio-temporal processes.
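The link between a linear, autonomous SDE and its spectral density can be seen with the simplest possible estimator, the periodogram. The sketch below (a naive baseline, nothing like the paper's IFT machinery; all numbers invented) simulates a discretized Ornstein-Uhlenbeck process and shows that its power concentrates at low frequencies, as the analytic spectrum 1/(gamma^2 + omega^2) predicts:

```python
import math
import random

random.seed(0)

# Discretized Ornstein-Uhlenbeck process, a linear autonomous SDE:
#   x[k+1] = (1 - gamma*dt) * x[k] + sqrt(dt) * xi,   xi ~ N(0, 1)
gamma, dt, n = 0.5, 0.1, 128
x = [0.0]
for _ in range(n - 1):
    x.append((1 - gamma * dt) * x[-1] + math.sqrt(dt) * random.gauss(0, 1))

def periodogram(xs):
    # Naive O(N^2) DFT periodogram: a simple but noisy estimator of the
    # spectral density of the process.
    N = len(xs)
    ps = []
    for k in range(N // 2):
        re = sum(v * math.cos(2 * math.pi * k * j / N) for j, v in enumerate(xs))
        im = sum(v * math.sin(2 * math.pi * k * j / N) for j, v in enumerate(xs))
        ps.append((re * re + im * im) / N)
    return ps

spec = periodogram(x)
# The OU spectrum falls off as 1/(gamma^2 + omega^2), so power should be
# concentrated at the low-frequency end.
low, high = sum(spec[1:10]), sum(spec[-10:])
```

The paper's contribution is to recover such a density from noisy, incomplete field measurements in a principled Bayesian way; the periodogram here just makes the object being inferred concrete.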

Read this paper on arXiv…

P. Frank, T. Steininger and T. Ensslin
Fri, 18 Aug 17

Comments: 12 pages, 9 figures

Verification of operational solar flare forecast: Case of Regional Warning Center Japan [SSA]


In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there is a significant performance difference between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting.
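For readers unfamiliar with forecast verification, the standard scalar measures are simple functions of a contingency table. The sketch below computes two common ones, with a Wald-type 95% confidence interval, from an invented 2x2 "flare / no-flare" table (the counts and the choice of measures are illustrative; the paper uses four forecast categories and several measures):

```python
import math

# Hypothetical 2x2 contingency table (counts invented for illustration):
# rows = forecast yes/no, columns = observed yes/no.
hits, false_alarms, misses, correct_rejects = 120, 80, 40, 760

pod = hits / (hits + misses)                            # probability of detection
pofd = false_alarms / (false_alarms + correct_rejects)  # prob. of false detection
tss = pod - pofd                                        # true skill statistic

# Wald-type 95% confidence interval on POD, in the spirit of reporting
# verification measures with confidence intervals.
se = math.sqrt(pod * (1 - pod) / (hits + misses))
ci = (pod - 1.96 * se, pod + 1.96 * se)
```

Comparing such scores against a persistence forecast (tomorrow = today), as the paper does, is what separates genuine forecasting skill from simply riding the autocorrelation of solar activity.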

Read this paper on arXiv…

Y. Kubo, M. Den and M. Ishii
Wed, 26 Jul 17

Comments: 29 pages, 7 figures and 6 tables. Accepted for publication in Journal of Space Weather and Space Climate (SWSC)

Big Data vs. complex physical models: a scalable inference algorithm [CL]


The data torrent unleashed by current and upcoming instruments requires scalable analysis methods. Machine Learning approaches scale well. However, separating the instrument measurement from the physical effects of interest, dealing with variable errors, and deriving parameter uncertainties is usually an afterthought. Classic forward-folding analyses with Markov Chain Monte Carlo or Nested Sampling enable parameter estimation and model comparison, even for complex and slow-to-evaluate physical models. However, these approaches require independent runs for each data set, implying an infeasible number of model evaluations in the Big Data regime. Here we present a new algorithm based on nested sampling, deriving parameter probability distributions for each observation. Importantly, in our method the number of physical model evaluations scales sub-linearly with the number of data sets, and we make no assumptions about homogeneous errors, Gaussianity, the form of the model or heterogeneity/completeness of the observations. Our method has immediate application in speeding up analyses of large surveys, integral-field-unit observations, and Monte Carlo simulations.
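To see what the paper accelerates, here is the classic nested sampling loop it builds on, in its most bare-bones form (this is textbook nested sampling, not the paper's sub-linear algorithm; the Gaussian likelihood and uniform prior are a toy choice where the true evidence is about 0.1):

```python
import math
import random

random.seed(2)

# Estimate the evidence Z = integral of L(theta)*pi(theta) dtheta for a
# unit Gaussian likelihood under a uniform prior on [-5, 5] (Z ~ 0.1).
def loglike(theta):
    return -0.5 * theta * theta - 0.5 * math.log(2 * math.pi)

def logaddexp(a, b):
    if a == float("-inf"):
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

nlive, iters = 100, 800
live = [random.uniform(-5, 5) for _ in range(nlive)]
logZ, logX = float("-inf"), 0.0

for _ in range(iters):
    # Remove the worst live point; it carries weight ~ X / nlive.
    i = min(range(nlive), key=lambda j: loglike(live[j]))
    logL_min = loglike(live[i])
    logw = logX - math.log(nlive)
    logZ = logaddexp(logZ, logw + logL_min)
    logX -= 1.0 / nlive            # prior volume shrinks ~ e^(-1/nlive)
    # Replace it with a prior draw above the likelihood threshold
    # (brute-force rejection sampling -- fine only for toy problems).
    while True:
        cand = random.uniform(-5, 5)
        if loglike(cand) > logL_min:
            live[i] = cand
            break
```

Running this once per observation is exactly the per-data-set cost that becomes prohibitive at survey scale, which is the bottleneck the paper's shared-evaluation scheme removes.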

Read this paper on arXiv…

J. Buchner
Mon, 17 Jul 17

Comments: Submitted to MNRAS. Comments welcome. Figure 6 demonstrates the scaling. Implementation at this https URL

Computing Entropies With Nested Sampling [CL]


The Shannon entropy, and related quantities such as mutual information, can be used to quantify uncertainty and relevance. However, in practice, it can be difficult to compute these quantities for arbitrary probability distributions, particularly if the probability mass functions or densities cannot be evaluated. This paper introduces a computational approach, based on Nested Sampling, to evaluate entropies of probability distributions that can only be sampled. I demonstrate the method on three examples: (i) a simple Gaussian example where the key quantities are available analytically; (ii) an experimental design example about scheduling observations in order to measure the period of an oscillating signal; and (iii) predicting the future from the past in a heavy-tailed scenario.
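As a baseline for the easy case the paper moves beyond: when the density can be evaluated, the entropy H = -E[log p] is just a Monte Carlo average over samples. The sketch below checks this against the analytic Gaussian entropy (the paper's Nested Sampling method addresses the much harder case where p can only be sampled, which this baseline cannot handle):

```python
import math
import random

random.seed(3)

# Monte Carlo estimate of H = -E[log p] for a zero-mean Gaussian,
# compared against the closed-form entropy 0.5*log(2*pi*e*sigma^2).
sigma = 2.0

def logp(x):
    return -0.5 * (x / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

n = 200000
est = -sum(logp(random.gauss(0, sigma)) for _ in range(n)) / n
exact = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
```

When logp is unavailable, this estimator has nothing to average, and that is precisely the gap the paper's sampling-only approach fills.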

Read this paper on arXiv…

B. Brewer
Thu, 13 Jul 17

Comments: Submitted to Entropy. 18 pages, 3 figures. Software available at this https URL

Radio-flaring Ultracool Dwarf Population Synthesis [SSA]


Over a dozen ultracool dwarfs (UCDs), low-mass objects of spectral types $\geq$M7, are known to be sources of radio flares. These typically several-minutes-long radio bursts can be up to 100\% circularly polarized and have high brightness temperatures, consistent with coherent emission via the electron cyclotron maser operating in $\sim$kG magnetic fields. Recently, the statistical properties of the bulk physical parameters that describe these UCDs have become sufficiently well determined to permit synthesis of the population of radio-flaring objects. For the first time, I construct a Monte Carlo simulator to model the population of these radio-flaring UCDs. This simulator is powered by Intel Secure Key (ISK), a new processor technology that uses a local entropy source to improve random number generation, which has heretofore been used mainly to improve cryptography. The results from this simulator indicate that only $\sim$5% of radio-flaring UCDs within the local interstellar neighborhood ($<$25 pc away) have been discovered. I discuss a number of scenarios which may explain this radio-flaring fraction, and suggest that the observed behavior is likely a result of several factors. The performance of ISK as compared to other pseudorandom number generators is also evaluated, and its potential utility for other astrophysical codes briefly described.
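The basic shape of such a population synthesis is easy to sketch: scatter objects uniformly in a volume, apply a flux limit and a flare duty cycle, and count detections. Every number below (flare luminosity, survey sensitivity, duty cycle) is invented for illustration and is not from the paper; Python's SystemRandom, which draws from the operating system's entropy pool, stands in loosely for the hardware entropy source (Intel Secure Key) the paper employs:

```python
import math
import random

# OS-entropy-backed generator (not seedable) as a rough analogue of a
# hardware entropy source like Intel Secure Key.
rng = random.SystemRandom()

def sample_distance(r_max=25.0):
    # Uniform in volume within r_max => CDF proportional to r^3.
    return r_max * rng.random() ** (1.0 / 3.0)

L_flare = 1.0       # hypothetical flare luminosity (arbitrary units)
S_limit = 0.0005    # hypothetical survey flux limit (same units per pc^2)
duty_cycle = 0.3    # hypothetical fraction of time an object is flaring

n, detected = 50000, 0
for _ in range(n):
    d = sample_distance()
    flux = L_flare / (4 * math.pi * d * d)
    if flux > S_limit and rng.random() < duty_cycle:
        detected += 1
frac = detected / n   # detected fraction of the simulated population
```

The paper's simulator does this with empirically calibrated distributions of UCD parameters, which is how it arrives at its ~5% discovered fraction within 25 pc.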

Read this paper on arXiv…

M. Route
Mon, 10 Jul 17

Comments: Accepted for publication in ApJ; 18 pages, 4 figures

Charged particle tracking without magnetic field: optimal measurement of track momentum by a Bayesian analysis of the multiple measurements of deflections due to multiple scattering [CL]


We revisit the precision of the measurement of track parameters (position, angle) with optimal methods in the presence of detector resolution, multiple scattering and zero magnetic field. We then obtain an optimal estimator of the track momentum by a Bayesian analysis of the filtering innovations of a series of Kalman filters applied to the track.
This work could pave the way to the development of autonomous high-performance gas time-projection chambers (TPC) or silicon wafer gamma-ray space telescopes and be a powerful guide in the optimisation of the design of the multi-kilo-ton liquid argon TPCs that are under development for neutrino studies.
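The core idea, that Kalman-filter innovations computed under a trial momentum carry momentum information through the multiple-scattering process noise, can be shown with a one-dimensional toy tracker. All numbers below (scattering constant k, true momentum, plane geometry, resolution) are invented, and the simple maximum over a likelihood scan stands in for the paper's full Bayesian analysis:

```python
import math
import random

random.seed(4)

# Toy 1D tracker: at each plane the slope receives a multiple-scattering
# kick with sigma = k/p, and the position is measured with resolution
# sigma_meas (no magnetic field, so the momentum enters ONLY via scattering).
k, p_true, sigma_meas, dz, nplanes = 0.01, 2.0, 0.05, 1.0, 40

def simulate():
    x, slope, hits = 0.0, 0.0, []
    for _ in range(nplanes):
        slope += random.gauss(0, k / p_true)   # scattering kick
        x += slope * dz                        # propagate to next plane
        hits.append(x + random.gauss(0, sigma_meas))
    return hits

def log_like(hits, p):
    # Kalman filter with state (x, slope); the trial momentum p fixes the
    # process noise q.  The Gaussian likelihood of the innovations peaks
    # near the true momentum.
    q = (k / p) ** 2
    x, s = 0.0, 0.0
    Pxx, Pxs, Pss = 1.0, 0.0, 1.0              # broad initial covariance
    ll = 0.0
    for z in hits:
        Pss += q                               # predict: scattering noise
        x += s * dz                            # predict: propagate state
        Pxx = Pxx + 2 * dz * Pxs + dz * dz * Pss
        Pxs = Pxs + dz * Pss
        S = Pxx + sigma_meas ** 2              # innovation variance
        r = z - x                              # innovation
        ll += -0.5 * (r * r / S + math.log(2 * math.pi * S))
        K_x, K_s = Pxx / S, Pxs / S            # Kalman gain
        x, s = x + K_x * r, s + K_s * r
        Pxx, Pxs, Pss = (1 - K_x) * Pxx, (1 - K_x) * Pxs, Pss - K_s * Pxs
    return ll

# Crude momentum estimate: scan trial momenta and keep the most likely one.
trial_ps = [0.5 + 0.1 * i for i in range(40)]
tracks = [simulate() for _ in range(5)]
p_best = max(trial_ps, key=lambda p: sum(log_like(h, p) for h in tracks))
```

A too-small trial momentum inflates the predicted innovation variance (penalized by the log-determinant term), while a too-large one makes the innovations improbably large; the likelihood balances the two near the true momentum.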

Read this paper on arXiv…

M. Frosini and D. Bernard
Tue, 20 Jun 17

Comments: 39 pages, 12 figures, submitted to Nuclear Inst. and Methods in Physics Research, A