The transient search pipeline realfast integrates with the real-time environment at the Very Large Array (VLA) to look for fast radio bursts, pulsars, and other rare astrophysical transients. The software monitors multicast messages, catches visibility data, and defines a fast transient search pipeline with rfpipe (ascl:1710.002). It indexes candidate transients and other metadata for the search interface, and writes and archives new visibility files for candidate transients. Among other tasks, realfast supports GPU algorithms, manages distributed futures, performs blind injection and management of mock transients, and rapidly distributes data products and transient alerts to the public.
2-DUST is a general-purpose dust radiative transfer code for axisymmetric systems. It reveals the global energetics of dust grains in the shell and the 2-D projected morphologies of the shell, which depend strongly on the combined effects of the axisymmetric dust distribution and the inclination angle. It can be used to model a variety of axisymmetric astronomical dust systems.
21cmDeepLearning extracts the underlying matter density map from a 21 cm intensity field using a convolutional neural network (CNN) with the U-Net architecture; the software is implemented in PyTorch. The astrophysical parameters of the simulations can be predicted with a secondary CNN. The simulations of matter density and 21 cm maps are performed with the code 21cmFAST (ascl:1102.023).
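As an illustration of the U-Net idea only (a minimal PyTorch sketch, not 21cmDeepLearning's actual architecture or interface; all layer sizes and names here are assumptions), an encoder-decoder with a single skip connection mapping a 21 cm map to a density map could look like:

    import torch
    import torch.nn as nn

    class MiniUNet(nn.Module):
        """Toy U-Net-like encoder-decoder with a single skip connection."""
        def __init__(self, in_ch=1, out_ch=1):
            super().__init__()
            self.enc = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
                                     nn.Conv2d(16, 16, 3, padding=1), nn.ReLU())
            self.down = nn.MaxPool2d(2)
            self.mid = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
            self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
            self.dec = nn.Sequential(nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
                                     nn.Conv2d(16, out_ch, 3, padding=1))

        def forward(self, x):
            e = self.enc(x)                        # encoder features
            m = self.mid(self.down(e))             # bottleneck at half resolution
            u = self.up(m)                         # upsample back to input resolution
            return self.dec(torch.cat([u, e], 1))  # skip connection, then decode

    model = MiniUNet()
    t21 = torch.randn(1, 1, 64, 64)   # mock 21 cm intensity map (batch, channel, x, y)
    density = model(t21)              # predicted matter-density map, same shape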
21cmEMU emulates 21cmFAST (ascl:1102.023) summary statistics, among them the 21-cm power spectrum, 21-cm global brightness temperature, IGM spin temperature, and neutral fraction. It also emulates the Thomson scattering optical depth and UV luminosity functions. With 21cmFAST installed, parameters can be supplied directly to 21cmEMU, and 21cmEMU can be used for, for example, analytic calculations of the Thomson optical depth (τ_e) and UV luminosity functions. The code is included as an alternative simulator in 21cmMC (ascl:1608.017).
21cmFAST is a powerful semi-numeric modeling tool designed to efficiently simulate the cosmological 21-cm signal. The code generates 3D realizations of evolved density, ionization, peculiar velocity, and spin temperature fields, which it then combines to compute the 21-cm brightness temperature. Although the physical processes are treated with approximate methods, the results were compared to a state-of-the-art large-scale hydrodynamic simulation, and the findings indicate good agreement on scales pertinent to the upcoming observations (>~ 1 Mpc). The power spectra from 21cmFAST agree with those generated from the numerical simulation to within tens of percent, down to the Nyquist frequency. Results were shown from a 1 Gpc simulation that tracks the cosmic 21-cm signal down from z=250, highlighting the various interesting epochs. Depending on the desired resolution, 21cmFAST can compute a redshift realization on a single processor in just a few minutes. The code is fast, efficient, customizable, and publicly available, making it a useful tool for 21-cm parameter studies.
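For orientation, the brightness temperature such codes compute is commonly written in the literature in the following standard form (quoted here for context rather than as the code's exact implementation), where x_HI is the neutral fraction, δ_nl the evolved density contrast, T_S the spin temperature, and T_γ the CMB temperature:

    \delta T_b(\nu) \approx 27\, x_{\rm HI}\,(1+\delta_{\rm nl})
      \left(\frac{H}{\mathrm{d}v_r/\mathrm{d}r + H}\right)
      \left(1-\frac{T_\gamma}{T_S}\right)
      \sqrt{\frac{1+z}{10}\,\frac{0.15}{\Omega_{\rm m} h^2}}
      \left(\frac{\Omega_{\rm b} h^2}{0.023}\right)\ {\rm mK}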
21cmFirstCLASS extends 21cmFAST (ascl:1102.023) and interfaces with CLASS (ascl:1106.020) to generate initial conditions at recombination that are consistent with the input cosmological model. These initial conditions can be set during the time of recombination, allowing one to compute the 21cm signal (and its spatial fluctuations) throughout the dark ages, as well as in the subsequent cosmic dawn and reionization epochs, just as in the standard 21cmFAST. 21cmFirstCLASS tracks both the CDM density field δc and the baryon density field δb. In addition, the user interface in 21cmFirstCLASS has been improved and allows one to easily plot the 21cm power spectrum while including noise from the output of 21cmSense (ascl:1609.013).
21CMMC is an efficient Python sampler of the semi-numerical reionization simulation code 21cmFAST (ascl:1102.023). It can recover constraints on astrophysical parameters from current or future 21 cm EoR experiments, accommodating a variety of EoR models, as well as priors on individual model parameters and the reionization history. By studying the resulting impact on the EoR astrophysical constraints, 21CMMC can be used to optimize foreground cleaning algorithms; interferometer designs; observing strategies; alternate statistics characterizing the 21cm signal; and synergies with other observational programs.
21cmSense calculates the expected sensitivities of 21cm experiments to the Epoch of Reionization power spectrum. Written in Python, it requires NumPy, SciPy, and AIPY (ascl:1609.012).
21cmvFAST demonstrates that including dark matter (DM)-baryon relative velocities produces velocity-induced acoustic oscillations (VAOs) in the 21-cm power spectrum. Based on 21cmFAST (ascl:1102.023) and 21CMMC (ascl:1608.017), 21cmvFAST accounts for molecular-cooling haloes, which are expected to drive star formation during cosmic dawn, as both relative velocities and Lyman-Werner feedback suppress halo formation. This yields accurate 21-cm predictions all the way to reionization (z>~10).
2cosmos is a modification of Monte Python (ascl:1307.002) and allows the user to write likelihood modules that can request two independent instances of CLASS (ascl:1106.020) and separate dictionaries and structures for all cosmological and nuisance parameters. The intention is to be able to evaluate two independent cosmological calculations and their respective parameters within the same likelihood. This is useful for evaluating a likelihood using correlated datasets (e.g. mutually exclusive subsets of the same dataset for which one wants to take into account all correlations between the subsets).
2D-FFTLog takes the FFTLog algorithm for 1D Hankel transforms and generalizes it for 2D Hankel transforms. The algorithm is useful for efficiently computing non-Gaussian covariance matrices of cosmological 2-point statistics in configuration space from Fourier space covariances. A fast bin-averaging method is also provided for both logarithmic and general binning choices. C and Python versions of the code are available.
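As context for what such a 2D Hankel transform computes, the configuration-space covariance of correlation-function multipoles follows from the Fourier-space power-spectrum covariance through a double spherical-Bessel integral of the schematic form below (a standard expression, up to phase conventions; not the code's internal notation):

    {\rm Cov}\!\left[\xi_{\ell_1}(r_1),\, \xi_{\ell_2}(r_2)\right] =
      \int_0^\infty \frac{k_1^2\,\mathrm{d}k_1}{2\pi^2}
      \int_0^\infty \frac{k_2^2\,\mathrm{d}k_2}{2\pi^2}\;
      j_{\ell_1}(k_1 r_1)\, j_{\ell_2}(k_2 r_2)\;
      {\rm Cov}\!\left[P_{\ell_1}(k_1),\, P_{\ell_2}(k_2)\right]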
2DBAT implements Bayesian fits of 2D tilted-ring models to derive rotation curves of galaxies. It performs 2D tilted-ring analysis based on a Bayesian Markov Chain Monte Carlo (MCMC) technique, thus quantifying the kinematic geometry of galaxy discs, and deriving high-quality rotation curves that can be used for mass modeling of baryons and dark matter halos.
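For reference, the basic relation a tilted-ring model fits is the textbook projection of circular rotation onto the line of sight (a generic form, not 2DBAT-specific notation): for a ring of radius R with inclination i, systemic velocity V_sys, and azimuthal angle θ in the disc plane,

    V_{\rm los}(x, y) = V_{\rm sys} + V_{\rm rot}(R)\,\cos\theta\,\sin i

with an optional radial-motion term V_rad(R) sin θ sin i added when non-circular flows are modeled.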
2dfdr is an automatic data reduction pipeline dedicated to reducing multi-fibre spectroscopy data, with current implementations for AAOmega (fed by the 2dF, KOALA-IFU, SAMI Multi-IFU or older SPIRAL front-ends), HERMES, 2dF (spectrograph), 6dF, and FMOS. A graphical user interface is provided to control data reduction and allow inspection of the reduced spectra.
2DFFT utilizes two-dimensional fast Fourier transformations of images of spiral galaxies to isolate and measure the pitch angles of their spiral arms; this provides a quantitative way to measure this morphological feature, allows comparison of spiral galaxy pitch angle to other galactic parameters, and enables tests of spiral arm genesis theories. 2DFFT requires fourn.c from Numerical Recipes in C (Press et al. 1989).
P2DFFT (ascl:1806.011) is a parallelized version of 2DFFT.
The Python module 2DFFTUtils implements tasks associated with measuring spiral galaxy pitch angle with 2DFFT (ascl:1608.015). Since most of the 2DFFT utilities are implemented in one place, it makes preparing images for 2DFFT and dealing with 2DFFT data interactively or in scripts even easier.
The vectorized physical domain structure function (SF) algorithm calculates the velocity anisotropy within two-dimensional molecular line emission observations. The vectorized approach is significantly faster than brute force iterative algorithms and is very efficient for even relatively large images. Furthermore, unlike frequency domain algorithms which require the input data to be fully integrable, this algorithm, implemented in Python, has no such requirements, making it a robust tool for observations with irregularities such as asymmetric boundaries and missing data.
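As a rough sketch of the idea (not the package's actual API; for brevity this version only evaluates lags along one image axis, whereas the published algorithm samples all directions to measure anisotropy), a vectorized second-order structure function over a 2D centroid-velocity map can be written with NumPy as:

    import numpy as np

    def structure_function(v, max_lag=20):
        """Second-order structure function SF2(l) = <|v(r+l) - v(r)|^2> of a 2D velocity map.
        Missing data should be set to NaN; irregular boundaries are handled naturally."""
        lags, sf2 = [], []
        for lag in range(1, max_lag + 1):
            # Vectorized: difference the whole map against a copy shifted by `lag` pixels
            diff = v[:, lag:] - v[:, :-lag]
            lags.append(lag)
            sf2.append(np.nanmean(diff**2))
        return np.array(lags), np.array(sf2)

    # Mock centroid-velocity field with a hole of missing data
    v = np.random.normal(size=(128, 128))
    v[40:60, 40:60] = np.nan
    lag, sf2 = structure_function(v)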
Setting initial conditions in numerical simulations using the standard procedure based on the Zel'dovich approximation (ZA) generates incorrect second and higher-order growth and therefore excites long-lived transients in the evolution of the statistical properties of density and velocity fields. Using more accurate initial conditions based on second-order Lagrangian perturbation theory (2LPT) reduces transients significantly; initial conditions based on 2LPT are thus much more appropriate for numerical simulations devoted to precision cosmology. The 2LPTIC code provides initial conditions for running cosmological simulations based on second-order Lagrangian Perturbation Theory (2LPT), rather than first-order (Zel'dovich approximation).
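For reference, the particle displacement implemented by 2LPT initial conditions adds a second-order correction to the Zel'dovich term (standard notation; φ^(1) and φ^(2) are the first- and second-order displacement potentials, and D_2 ≈ −(3/7) D_1² in an Einstein-de Sitter background):

    \mathbf{x}(\mathbf{q}) = \mathbf{q} - D_1 \nabla_q \phi^{(1)}(\mathbf{q}) + D_2 \nabla_q \phi^{(2)}(\mathbf{q})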
2MASS Kit is open source software for easily constructing a high-performance search server for important astronomical catalogs. It is tuned for optimal coordinate search performance (Radial Search, Box Search, Rectangular Search) of huge catalogs, increasing speed by more than an order of magnitude compared to simple indexing on a single table. Under optimal conditions it enables more than 3,000 searches per second for radial search of the 2MASS PSC. The kit is best characterized by its flexible tuning. Each table index is registered in one of six table spaces (each residing in a separate directory), allowing only the essential parts to be easily moved onto fast devices. Given the rapid performance gains of recent SSDs, moving some or all table indices to a fast SSD is a very cost-effective way of constructing high-performance servers.
This MATLAB tool generates a 3D model (WRL, textured with a false-color height map) of a defined region of the Martian surface. The user specifies the region of interest on the Mars surface (by latitude and longitude), the resolution of the MOLA DTMs to be used (with a minimum ground pixel size of 468 m), and a scale factor applied to the surface height to improve the visibility of features through bumping or shadowing effects.
3D-Barolo (3D-Based Analysis of Rotating Objects via Line Observations), or BBarolo, is a tool for fitting 3D tilted-ring models to emission-line datacubes. BBarolo works with 3D FITS files, i.e., image arrays with two spatial and one spectral dimensions. BBarolo recovers the true rotation curve and estimates the intrinsic velocity dispersion even in barely resolved galaxies (about 2 resolution elements) if the signal-to-noise ratio of the data is larger than 2-3. It has source-detection and first-estimate modules, making it suitable for analyzing large 3D datasets automatically, and is a useful tool for deriving reliable kinematics for both local and high-redshift galaxies.
3D-PDR is a three-dimensional photodissociation region code written in Fortran. It uses the Sundials package (written in C) to solve the set of ordinary differential equations and it is the successor of the one-dimensional PDR code UCL_PDR (ascl:1303.004). Using the HEALpix ray-tracing scheme (ascl:1107.018), 3D-PDR solves a three-dimensional escape probability routine and evaluates the attenuation of the far-ultraviolet radiation in the PDR and the propagation of FIR/submm emission lines out of the PDR. The code is parallelized (OpenMP) and can be applied to 1D and 3D problems.
3DCORE (3-Dimensional Coronal Rope Ejection) forward models the magnetic flux ropes of solar storms. The code is able to produce synthetic in situ observations of the magnetic cores of solar coronal mass ejections sweeping over planets and spacecraft. Near Earth, these data are currently taken by the Wind, ACE, and DSCOVR spacecraft. Other spacecraft that have carried magnetometers suitable for such solar-wind observations include MESSENGER, Venus Express, MAVEN, and even Helios.
High precision cosmology requires analysis of large-scale surveys in 3D spherical coordinates, i.e. Fourier-Bessel decomposition. Current methods are insufficient for future datasets from wide-field cosmology surveys. 3DEX (3D EXpansions) is a public code for fast Fourier-Bessel decomposition of 3D all-sky surveys which takes advantage of HEALPix for the calculation of tangential modes. For surveys with millions of galaxies, computation time is reduced by a factor of 4-12 depending on the desired scales and accuracy. The formulation is also suitable for pre-calculations and external storage of the spherical harmonics, which allows for further speed improvements. The 3DEX code can accommodate data with masked regions of missing data. It can be applied not only to cosmological data, but also to 3D data in spherical coordinates in other scientific fields.
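In one common convention, the Fourier-Bessel coefficients being computed are

    f_{\ell m}(k) = \sqrt{\frac{2}{\pi}} \int \mathrm{d}^3\mathbf{r}\; f(\mathbf{r})\, k\, j_\ell(k r)\, Y^*_{\ell m}(\theta, \varphi)

with j_ℓ the spherical Bessel functions and Y_ℓm the spherical harmonics; the HEALPix pixelization handles the angular (tangential) part of the integral.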
3DView creates visualizations of space physics data in their original 3D context. Time series, vectors, dynamic spectra, celestial body maps, magnetic field or flow lines, and 2D cuts in simulation cubes are among the variety of data representation enabled by 3DView. It offers direct connections to several large databases and uses VO standards; it also allows the user to upload data. 3DView's versatility covers a wide range of space physics contexts.
In cosmological N-body simulations, higher-order Lagrangian perturbation theory in the initial conditions affects the formation of nonlinear structure. This code converts an initial condition generated with the Zel'dovich approximation (linear Lagrangian perturbation theory) for the Gadget-2 code into an initial condition with second- or third-order Lagrangian perturbation theory (2LPT, 3LPT).
4DAO launches DAOSPEC (ascl:1011.002) for a large sample of spectra. Written in Fortran, the software allows one to easily manage the input and output files of DAOSPEC, optimize the main DAOSPEC parameters, and mask specific spectral regions. It also provides suitable graphical tools to evaluate the quality of the solution and provides final, normalized, zero radial velocity spectra.
We present corrections to the Schlegel, Finkbeiner, Davis (SFD98) reddening maps over the Sloan Digital Sky Survey northern Galactic cap area. To find these corrections, we employ what we dub the "standard crayon" method, in which we use passively evolving galaxies as color standards by which to measure deviations from the reddening map. We select these passively evolving galaxies spectroscopically, using limits on the H alpha and O II equivalent widths to remove all star-forming galaxies from the SDSS main galaxy catalog. We find that by correcting for known reddening, redshift, color-magnitude relation, and variation of color with environmental density, we can reduce the scatter in color to below 3% in the bulk of the 151,637 galaxies we select. Using these galaxies we construct maps of the deviation from the SFD98 reddening map at 4.5 degree resolution, with 1-sigma error of ~ 1.5 millimagnitudes E(B-V). We find that the SFD98 maps are largely accurate with most of the map having deviations below 3 millimagnitudes E(B-V), though some regions do deviate from SFD98 by as much as 50%. The maximum deviation found is 45 millimagnitudes in E(B-V), and the spatial structure of the deviation is strongly correlated with the observed dust temperature, such that SFD98 underpredicts reddening in regions of low dust temperature. The maps of these deviations, as well as their errors, are made available to the scientific community as a supplemental correction to SFD98 at the URL below.
Two neural networks were designed to identify hazardous planetesimals; they were trained on object trajectories calculated in a cloud computing environment. The first neural network was fully connected and was trained on the orbital elements (OEs) of real and simulated planetesimals, while the second was a 1-dimensional convolutional neural network trained on the Cartesian position coordinates of real and simulated planetesimals. Ultimately, the network trained on OEs performed better, identifying one-third of known potentially hazardous objects, including the 3 asteroids with the highest chance of impact with Earth (2009 FD, 1999 RQ36, 1950 DA) as established by NASA's Monte Carlo-based Sentry system.
Working with a GUI, or adding interactivity to plots, helps greatly in data analysis. However, common Python GUI toolkits are OS-dependent, while manually writing interactive code is complex. pltgui is a pseudo-GUI tool that adds buttons and checkboxes to a plot and assigns callback functions to them. The documentation for this package is currently in Chinese and will be provided in English in the next version. The program is published on PyPI and can be installed with 'pip install pltgui'.
Photon asymmetry is a novel robust substructure statistic for X-ray cluster observations with only a few thousand counts; it exhibits better stability than power ratios and centroid shifts and has a smaller statistical uncertainty than competing substructure parameters, allowing for low levels of substructure to be measured with confidence. A_phot computes the photon asymmetry (A_phot) parameter for morphological classification of clusters and allows quantifying substructure in samples of distant clusters covering a wide range of observational signal-to-noise ratios. The Python scripts are completely automatic and can be used to rapidly classify galaxy cluster morphology for large numbers of clusters without human intervention.
A-SLOTH (Ancient Stars and Local Observables by Tracing Halos) connects the formation of the first stars and galaxies to observables. The model is based on dark matter merger trees, on which A-SLOTH applies analytical recipes for baryonic physics to model the formation of both metal-free and metal-poor stars and the transition between them. The software samples individual stars and includes radiative, chemical, and mechanical feedback. A-SLOTH has versatile applications with moderate computational requirements. It can be used to constrain the properties of the first stars and high-z galaxies based on local observables, predicts properties of the oldest and most metal-poor stars in the Milky Way, can serve as a subgrid model for larger cosmological simulations, and predicts next-generation observables of the early Universe, such as supernova rates or gravitational wave events.
A-Track is a fast, open-source, cross-platform pipeline for detecting moving objects (asteroids and comets) in sequential telescope images in FITS format. The moving objects are detected using a modified line detection algorithm.
a3cosmos-gas-evolution calculates galaxies' cold molecular gas properties using gas scaling functions derived from the A3COSMOS project. Given galaxies' redshifts (or cosmic ages), stellar masses, and star formation enhancement relative to the star-forming main sequence (Delta MS), the gas scaling functions predict their gas-to-stellar mass ratio (gas fraction) and gas depletion time.
The ALeRCE anomaly detector cross-validates six anomaly detection algorithms for three classes (transient, periodic, and stochastic) of anomalous sources within the Zwicky Transient Facility (ZTF) data stream using the ALeRCE light curve features. A machine and deep learning-based framework is used for anomaly detection. For each class, a distinct anomaly detection model is constructed using only information about the known objects (i.e., inliers) for training. An anomaly score is computed from the probabilities that a light curve is of a transient, stochastic, or periodic nature.
AAOGlimpse is an experimental display program that uses OpenGL to display FITS data (and even JPEG images) as 3D surfaces that can be rotated and viewed from different angles, all in real-time. It is WCS-compliant and designed to handle three-dimensional data. Each plane in a data cube is surfaced in the same way, and the program allows the user to travel through a cube by 'peeling off' successive planes, or to look into a cube by suppressing the display of data below a given cutoff value. It can blink images and can superimpose images and contour maps from different sources using their world coordinate data. A limited socket interface allows communication with other programs.
This Python code automatically detects solar active regions (ARs). Based on morphological operations and region growing, it uses synoptic magnetograms from SOHO/MDI and SDO/HMI and calculates the parameters that characterize each AR, including the latitudes and longitudes of the flux-weighted centroids of the two polarities and of the whole AR, the area and flux of each polarity, and the initial and final dipole moments.
AART (Adaptive Analytical Ray Tracing) exploits the integrability properties of the Kerr spacetime to compute high-resolution black hole images and their visibility amplitude on long interferometric baselines. It implements a non-uniform adaptive grid on the image plane suitable to study black hole photon rings (narrow ring-shaped features, predicted by general relativity but not yet observed). The code implements all the relevant equations required to compute the appearance of equatorial sources on the (far) observer's screen.
aartfaac2ms converts raw Aartfaac correlator files to the casacore (ascl:1912.002) measurement set format. It phase rotates the data to a common phase center, and (optionally) flags, averages, and compresses the data. The code includes a tool, afedit, to splice a raw Aartfaac set based on LST.
autoregressive-bbh-inference, written in Python, models the distributions of binary black hole masses, spins, and redshifts to identify physical features appearing in these distributions without the need for strongly-parametrized population models. This allows not only agnostic study of the "known unknowns" of the black hole population but also reveals the "unknown unknowns," the unexpected and impactful features that may otherwise be missed by the standard building-block method.
abcpmc is a Python Approximate Bayesian Computing (ABC) Population Monte Carlo (PMC) implementation based on Sequential Monte Carlo (SMC) with Particle Filtering techniques. It is extendable with k-nearest neighbour (KNN) or optimal local covariance matrix (OLCM) perturbation kernels and has built-in support for massively parallelized sampling on a cluster using MPI.
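To illustrate the general ABC-PMC technique (a generic NumPy sketch of one iteration with a Gaussian perturbation kernel and a flat prior, not abcpmc's actual API; all function and variable names here are assumptions), each iteration draws particles from the weighted previous population, perturbs them, simulates data, and keeps candidates whose distance to the observations falls below a shrinking tolerance:

    import numpy as np

    def abc_pmc_step(theta_prev, weights_prev, simulate, distance, y_obs, eps, n_particles, rng):
        """One ABC-PMC iteration (1D parameter, flat prior, Gaussian perturbation kernel)."""
        sigma = 2.0 * np.cov(theta_prev, aweights=weights_prev)   # kernel width from previous population
        theta_new, w_new = [], []
        while len(theta_new) < n_particles:
            # Draw a particle from the weighted previous population and perturb it
            theta = rng.normal(theta_prev[rng.choice(len(theta_prev), p=weights_prev)], np.sqrt(sigma))
            # Keep it only if the simulated data land within the tolerance eps
            if distance(simulate(theta, rng), y_obs) < eps:
                kernel = np.exp(-0.5 * (theta - theta_prev) ** 2 / sigma)
                w_new.append(1.0 / np.sum(weights_prev * kernel))  # importance weight (flat prior)
                theta_new.append(theta)
        w_new = np.array(w_new)
        return np.array(theta_new), w_new / w_new.sum()

    # Toy usage: infer the mean of a Gaussian with known unit variance
    rng = np.random.default_rng(0)
    y_obs = rng.normal(1.5, 1.0, size=200)
    simulate = lambda th, r: r.normal(th, 1.0, size=200)
    distance = lambda a, b: abs(a.mean() - b.mean())
    theta, weights = rng.uniform(-5, 5, size=200), np.full(200, 1.0 / 200)
    for eps in [1.0, 0.5, 0.2, 0.1]:                               # shrinking tolerance schedule
        theta, weights = abc_pmc_step(theta, weights, simulate, distance, y_obs, eps, 200, rng)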
Line broadening cross sections for the broadening of spectral lines by collisions with neutral hydrogen atoms have been tabulated by Anstee & O’Mara (1995), Barklem & O’Mara (1997), and Barklem, O’Mara & Ross (1998) for s–p, p–s, p–d, d–p, d–f and f–d transitions. abo-cross, written in Fortran, interpolates in these tabulations to make these data more accessible to the end user. The code can be incorporated into existing spectrum synthesis programs or used in stand-alone mode to compute line broadening cross sections for specific transitions.
abundance, written in Fortran, provides driver and fitting routines to compute the predicted number of clusters in a ΛCDM cosmology that agrees with CMB, SN, BAO, and H0 measurements (up to 2010) at some specified parameter confidence and the mass that would rule out that cosmology at some specified sample confidence. It also computes the expected number of such clusters in the light cone and the Eddington bias factor that must be applied to observed masses.
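For context, the predicted cluster count above a limiting mass in a light cone is the usual integral of the halo mass function over comoving volume (a schematic form, not the code's exact notation):

    N(>M_{\rm lim}) = \int_0^{z_{\rm max}} \mathrm{d}z\, \frac{\mathrm{d}V}{\mathrm{d}z}
      \int_{M_{\rm lim}}^{\infty} \mathrm{d}M\; \frac{\mathrm{d}n}{\mathrm{d}M}(M, z)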
The AbundanceMatching Python module creates (interpolates and extrapolates) abundance functions and also provides fiducial deconvolution and abundance matching.
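The underlying idea is the standard abundance matching condition: a galaxy property x (luminosity or stellar mass) is assigned to halos of mass M by equating cumulative number densities, optionally after deconvolving scatter (generic statement of the technique, not the module's notation):

    n_{\rm gal}(> x) = n_{\rm halo}(> M)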
ACORNS-ADI, written in Python, is a parallelized software package that reduces high-contrast imaging data. Originally written for imaging data from Subaru/HiCIAO, it requires minimal modification to reduce data from other instruments. It is efficient, open source, and includes several optional features that may improve performance.
acorns generates a hierarchical system of clusters within discrete data by using an n-dimensional unsupervised machine-learning algorithm that clusters spectroscopic position-position-velocity data. The algorithm is based on a technique known as hierarchical agglomerative clustering. Although acorns was designed with the analysis of discrete spectroscopic position-position-velocity (PPV) data in mind (rather than uniformly spaced data cubes), clustering can be performed in n-dimensions and the algorithm can be readily applied to other data sets in addition to PPV measurements.
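To illustrate the underlying technique only (a generic SciPy sketch of hierarchical agglomerative clustering on mock (x, y, v) measurements, not acorns' own interface; the two-cluster cut and the mock data are assumptions):

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    # Mock discrete PPV catalog: columns are x offset, y offset, centroid velocity
    rng = np.random.default_rng(1)
    ppv = np.vstack([rng.normal([0.0, 0.0, -2.0], 0.5, size=(200, 3)),
                     rng.normal([5.0, 5.0, 3.0], 0.5, size=(200, 3))])

    # Agglomerative (bottom-up) clustering: repeatedly merge the closest points/clusters
    tree = linkage(ppv, method='average')                # full merger hierarchy (dendrogram)
    labels = fcluster(tree, t=2, criterion='maxclust')   # cut the tree into two clusters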
ALMA Common Software (ACS) provides a software infrastructure common to all ALMA partners and consists of a documented collection of common patterns and components that implement those patterns. The heart of ACS is based on a distributed Component-Container model, with ACS Components implemented as CORBA objects in any of the supported programming languages. ACS provides common CORBA-based services such as logging, error and alarm management, configuration database, and lifecycle management. Although designed for ALMA, ACS can be and is being used in other control systems and distributed software projects, since it implements proven design patterns using state-of-the-art, reliable technology. Through the use of well-known standard constructs and components, it also allows team members who are not authors of ACS to easily understand the architecture of software modules, making maintenance affordable even on a very large project.
The ACStools package contains Python tools to work with data from the Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS). The package has several calibration utilities and a zeropoints calculator, can detect satellite trails, and offers destriping, polarization, and photometric tools.
ActSNClass uses a parametric feature extraction method, a Random Forest classifier, and two learning strategies (uncertainty sampling and random sampling) to perform active learning for supernova photometric classification.
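As a generic sketch of the uncertainty-sampling strategy (not ActSNClass's actual API; feature extraction is skipped and ready-made feature arrays are assumed), each iteration trains a Random Forest on the current labeled set and queries the pool object with the least confident prediction:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def uncertainty_sampling_loop(X_train, y_train, X_pool, y_pool, n_queries=10):
        """Toy active-learning loop: query the least certain pool object at each iteration."""
        X_train, y_train = X_train.copy(), y_train.copy()
        X_pool, y_pool = X_pool.copy(), y_pool.copy()
        for _ in range(n_queries):
            clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
            proba = clf.predict_proba(X_pool)
            idx = np.argmin(proba.max(axis=1))            # most uncertain object in the pool
            X_train = np.vstack([X_train, X_pool[idx]])   # "follow-up" supplies its true label
            y_train = np.append(y_train, y_pool[idx])
            X_pool = np.delete(X_pool, idx, axis=0)
            y_pool = np.delete(y_pool, idx)
        return clf

    # Mock feature vectors and binary labels standing in for extracted light-curve features
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    clf = uncertainty_sampling_loop(X[:20], y[:20], X[20:], y[20:])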
ADAM (All-Data Asteroid Modeling) models asteroid shape reconstruction from observations. Developed in MATLAB with core routines in C, its features include general nonconvex and non-starlike parametric 3D shape supports and reconstruction of asteroid shape from any combination of lightcurves, adaptive optics images, HST/FGS data, disk-resolved thermal images, interferometry, and range-Doppler radar images. ADAM does not require boundary contour extraction for reconstruction and can be run in parallel.
AdaMet (Adaptive Metropolis) performs efficient Bayesian analysis. The user-friendly Python package is an implementation of the Adaptive Metropolis algorithm. In many real-world applications, it is more efficient and robust than emcee (ascl:1303.002), whose warm-up phase scales linearly with the number of walkers. For this reason, and because of its didactic value, the AdaMet code is provided as an alternative.
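For orientation, the Adaptive Metropolis idea (Haario et al. 2001) can be sketched in a few lines of NumPy; this is a bare-bones illustration of the algorithm, not AdaMet's interface, and all names and defaults here are assumptions:

    import numpy as np

    def adaptive_metropolis(log_post, x0, n_steps=5000, adapt_start=500, rng=None):
        """Random-walk Metropolis whose proposal covariance is learned from the chain history."""
        rng = rng or np.random.default_rng()
        x = np.array(x0, dtype=float)
        d = x.size
        cov = np.eye(d) * 0.1                  # initial (non-adaptive) proposal covariance
        scale = 2.38**2 / d                    # standard adaptive-Metropolis scaling factor
        chain, lp = [x.copy()], log_post(x)
        for i in range(1, n_steps):
            if i > adapt_start:                # start adapting once enough samples exist
                cov = np.cov(np.asarray(chain).T) + 1e-8 * np.eye(d)
            prop = rng.multivariate_normal(x, scale * cov)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis acceptance
                x, lp = prop, lp_prop
            chain.append(x.copy())
        return np.array(chain)

    # Toy usage: sample a correlated 2D Gaussian posterior
    icov = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
    samples = adaptive_metropolis(lambda p: -0.5 * p @ icov @ p, x0=[0.0, 0.0])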