The main function of this team is to perform design and analysis for a variety of systems and applications, including energy research (solar cells, reactors, transportation, and buildings), experimental installations of all types, centrifuge cascades for uranium enrichment and separation of stable isotopes, and industrial facilities. The analysis leads to improved performance and to the assessment of risks from accidents or terrorist attacks.
Dispersion models provided by the ARQA team support the processing and quality assurance of meteorological data for use in environmental compliance modeling. In addition, climate data processing is a key contributor to national and international research efforts related to global climate change.
The CSFM team performs independent integrity assessments of engineering structures using computational methods. They specialize in probabilistic risk assessments, as well as in defect assessments and predictions of the probability of failure of engineering structures due to the presence of defects. The team has extensive experience in the analysis of ferritic and austenitic steels, including welds and weld heat-affected zones, for domestic and international nuclear power plants, and performs independent reviews of probabilistic leak-before-break analyses for piping systems of different materials in nuclear power plants. Since the late 1960s, the CSFM team at ORNL has provided the technical bases and computational tools to the US Nuclear Regulatory Commission's Office of Nuclear Regulatory Research (NRC-RES), and it is positioned to offer a wide and varied suite of software tools and analysis services, backed by strong and credible research products that are well documented in technical publications.
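To illustrate the probabilistic approach to defect assessment, the sketch below estimates a probability of failure by Monte Carlo sampling: flaw depths and fracture toughness are drawn from assumed distributions, and a failure is counted whenever the sampled stress intensity factor exceeds the sampled toughness. The stress level, geometry factor, and all distribution parameters are hypothetical placeholders for illustration, not CSFM inputs or results.

```python
import math
import random

random.seed(0)

def failure_probability(n_samples=100_000,
                        stress_mpa=400.0,       # assumed applied stress
                        geometry_factor=1.12,   # assumed Y for a surface flaw
                        flaw_median_m=0.010,    # assumed median flaw depth (m)
                        flaw_sigma=0.5,         # lognormal shape parameter
                        kic_mean=120.0,         # assumed mean toughness (MPa*sqrt(m))
                        kic_sd=15.0):
    """Monte Carlo estimate of P(K_I > K_Ic) for a single sampled flaw."""
    failures = 0
    for _ in range(n_samples):
        a = random.lognormvariate(math.log(flaw_median_m), flaw_sigma)
        k_ic = random.gauss(kic_mean, kic_sd)
        # Stress intensity factor for the assumed flaw geometry
        k_i = geometry_factor * stress_mpa * math.sqrt(math.pi * a)
        if k_i > k_ic:
            failures += 1
    return failures / n_samples

print(f"Estimated probability of failure: {failure_probability():.4f}")
```

In a production assessment the flaw, toughness, and loading models would come from inspection data and validated correlations; the value of the sampling loop is that any such distributions can be substituted without changing its structure.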
Qubits must typically be kept isolated and very cold to minimize interactions with the external environment. These interactions lead to qubit decoherence, essentially the loss of quantum information, and adversely affect the efficiency of quantum computing schemes. However, it may be possible not only to control these environmental interactions but to harness them constructively to generate entanglement rather than destroy it. The result would be a scalable, more efficient quantum computing platform that does not require cryogenics to operate.
Reducing propagation loss while increasing electric field confinement is a major goal of nanophotonics for future high-bandwidth, high-speed computational requirements. However, in current state-of-the-art metal waveguides, the propagating signal suffers severe losses as component sizes are reduced to the nanoscale regime. In this project we seek to exploit the propagation of surface plasmon nanojets on nanostructured thin films to reduce propagation losses while retaining field confinement. This improvement will enable advances toward future nanophotonic computational platforms that leapfrog Moore's Law.
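For a rough feel of the loss problem, the sketch below computes the 1/e propagation length of a surface plasmon polariton on a flat metal/dielectric interface from the standard flat-interface dispersion relation. The complex permittivity used for gold near 800 nm is an illustrative placeholder; real nanostructured films, and the nanojet geometry studied here, behave differently.

```python
import cmath
import math

def spp_propagation_length(wavelength_m, eps_metal, eps_dielectric=1.0):
    """1/e intensity propagation length L = 1 / (2 Im(beta)) for a
    flat-interface SPP, with beta = k0 * sqrt(em*ed / (em + ed))."""
    k0 = 2 * math.pi / wavelength_m
    beta = k0 * cmath.sqrt(eps_metal * eps_dielectric /
                           (eps_metal + eps_dielectric))
    return 1.0 / (2.0 * beta.imag)

# Assumed permittivity of gold near 800 nm (illustrative value only)
L = spp_propagation_length(800e-9, complex(-24.0, 1.5))
print(f"SPP propagation length: {L * 1e6:.1f} um")
```

The tens-of-micrometers scale this yields is the core trade-off: confinement below the diffraction limit comes at the cost of propagation lengths far shorter than in dielectric waveguides.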
Neuromorphic computing is one proposed architecture for the beyond-Moore's-Law computing landscape. Neuromorphic computers are software/hardware systems whose implementations have features inspired by biological brains. Key research questions in neuromorphic computing are (1) how to program or train these architectures to perform tasks and (2) what supporting software is required to integrate these architectures into real systems and make them accessible to novice users. The goal of this project is to develop algorithms for training neuromorphic computers, as well as supporting software, including development environments, visualization tools, and hardware simulators.
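To make the hardware-simulator idea concrete, here is a minimal sketch of the kind of model such a simulator steps through: a single discrete-time leaky integrate-and-fire neuron. The threshold, leak, and reset values are arbitrary illustrative choices, not tied to any particular neuromorphic device or to this project's software.

```python
def simulate_lif(input_current, threshold=1.0, leak=0.9, v_reset=0.0):
    """Step one leaky integrate-and-fire neuron through discrete time.

    Returns a 0/1 spike train, one entry per input time step.
    """
    v = v_reset
    spikes = []
    for i_t in input_current:
        v = leak * v + i_t      # membrane potential leaks, then integrates input
        if v >= threshold:
            spikes.append(1)    # fire
            v = v_reset         # reset after a spike
        else:
            spikes.append(0)
    return spikes

# Constant drive below threshold: the neuron charges up and fires periodically
print(simulate_lif([0.5] * 10))  # -> [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

A full simulator generalizes this loop to networks of such neurons with weighted, delayed connections, which is what makes visualization tooling for spike activity valuable.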
Deep learning (DL) has recently seen a surge of success in imaging and speech applications, owing to its relatively automatic feature generation and, particularly for convolutional neural networks (CNNs), its high-accuracy classification abilities. While these models learn their parameters through data-driven methods, model selection (i.e., architecture construction) through hyper-parameter choices remains a tedious and highly intuition-driven task. To address this, Multi-node Evolutionary Neural Networks for Deep Learning (MENNDL) is proposed as a method for automating network selection on computational clusters through hyper-parameter optimization performed via genetic algorithms.
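A minimal sketch of the genetic-algorithm loop behind this idea follows. The search space, the surrogate fitness function, and the GA settings are all hypothetical stand-ins: MENNDL's actual encoding covers network topologies, and each real evaluation trains a candidate network on a cluster node rather than scoring a toy function.

```python
import random

random.seed(42)

# Hypothetical hyper-parameter search space (illustrative only)
SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "num_filters":   [16, 32, 64],
    "kernel_size":   [3, 5, 7],
}

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(ind):
    """Surrogate for validation accuracy; a real run trains the network."""
    return (-abs(ind["learning_rate"] - 1e-3)
            - abs(ind["num_filters"] - 32) / 64
            - abs(ind["kernel_size"] - 5) / 10)

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    # Resample each gene from the space with probability `rate`
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def evolve(generations=20, pop_size=12):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]   # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```

The appeal on a cluster is that every fitness evaluation in a generation is independent, so candidate networks can be trained in parallel across nodes.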
EDEN is a visual analytics tool for exploratory analysis of multivariate data sets. Based on an interactive variant of parallel coordinates, EDEN includes statistical analytics that guide the user to significant associations in complex data sets without information loss.
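The geometric core of a parallel-coordinates view can be sketched in a few lines: each multivariate record becomes a polyline with one min-max-normalized vertex per axis. This shows only the plotting geometry, not EDEN's interactive or statistical machinery, and the names below are illustrative.

```python
def to_polylines(records):
    """Convert multivariate records (dicts) into parallel-coordinates
    polylines: one (axis_index, normalized_value) vertex per variable."""
    axes = sorted(records[0])
    lo = {k: min(r[k] for r in records) for k in axes}
    hi = {k: max(r[k] for r in records) for k in axes}
    polylines = []
    for r in records:
        line = []
        for i, k in enumerate(axes):
            span = hi[k] - lo[k]
            # Constant columns map to the axis midpoint
            line.append((i, (r[k] - lo[k]) / span if span else 0.5))
        polylines.append(line)
    return axes, polylines

axes, lines = to_polylines([{"mpg": 18, "hp": 130},
                            {"mpg": 33, "hp": 65}])
print(axes, lines)
```

Because every record is drawn across the same shared axes, no dimension is projected away, which is the "without information loss" property that parallel coordinates provide.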