- API data.nasa.gov | Last Updated 2020-01-29T04:57:20.000Z
This dataset comprises asteroid flux data measured in 26 filters with the McCord dual-beam photometer, covering the range 0.32-1.08 microns for 285 numbered asteroids, as published in Chapman & Gaffey (1979b) and McFadden et al. (1984).
- API data.nasa.gov | Last Updated 2020-01-29T02:14:25.000Z
The objective of this research is to create a suite of tools for monitoring airport gate activities in order to improve aircraft turnaround. Airport ramp areas are the most crowded and cluttered spaces in the entire National Airspace System (NAS). Activities related to turning an aircraft around at the gate are a significant source of delay and therefore affect the predictability of NAS operations. Optimal Synthesis Inc. seeks to leverage its expertise in monitoring aircraft in ramp areas using video surveillance data and advanced computer vision algorithms to build an advanced gate activity monitoring capability that will in turn enable a gate turnaround prediction tool. The tool suite will identify the various stages of turnaround, such as refueling, luggage unloading/loading, catering, and deicing. It will further create a probabilistic model of the times associated with each of these events, which will be used to predict the future sequence of events and their expected completion times. Phase I research will demonstrate the core ideas of gate activity recognition using state-of-the-art computer vision and machine learning algorithms. Phase II research will raise the technology readiness level of this tool suite to work with real-time video surveillance streams.
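The probabilistic event-time model described in the abstract could, for instance, assign each turnaround stage a duration distribution and propagate it forward by Monte Carlo. A minimal sketch in Python, where the stage names echo the abstract but every distribution parameter is an illustrative assumption, not a value from the proposal:

```python
import random
import statistics

# Illustrative stage-duration model: each turnaround stage gets a lognormal
# duration distribution; (mu, sigma) of log-minutes are made-up values.
STAGES = {
    "refueling": (3.0, 0.30),
    "luggage_unload_load": (3.2, 0.40),
    "catering": (2.7, 0.35),
}

def predict_completion(n_samples=10_000, seed=42):
    """Monte Carlo estimate of total turnaround time, assuming stages run
    sequentially (a simplification; real stages overlap)."""
    rng = random.Random(seed)
    totals = [
        sum(rng.lognormvariate(mu, sigma) for mu, sigma in STAGES.values())
        for _ in range(n_samples)
    ]
    # Mean and approximate 95th-percentile completion time, in minutes
    return statistics.mean(totals), statistics.quantiles(totals, n=20)[18]

mean_t, p95_t = predict_completion()
```

A real tool would condition these distributions on observed events (e.g., refueling already started) and update predictions as the video analytics report each stage.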
Global Flood Risk From Advanced Modeling and Remote Sensing in Collaboration With Google Earth Engine - data.nasa.gov | Last Updated 2020-01-29T04:25:53.000Z
As predictive accuracy of the climate response to greenhouse emissions improves, measurements of sea level rise are being coupled with modeling to better understand coastal vulnerability to flooding. Predictions of increasing storm rainfall intensity and larger tropical storms also imply increased inland flooding, and many studies conclude this is already occurring in some regions. Most rivers experience some flooding each year: the seasonal discharge variation from low to high water can span 2-3 orders of magnitude. The mean annual flood is an important threshold: its level separates land flooded each year from land affected only by large floods. We lack adequate geospatial information, on a global basis, defining floodplains within the mean annual flood limit and the higher lands still subject to significant risk (e.g., with exceedance probability greater than 3.3%, the 30-year floodplain). This lack of knowledge concerning changing surface water affects many disciplines and remote sensing data sets, where, quite commonly, a static water 'mask' is employed to separate water from land. For example, inland bio-geochemical cycling of C and N is affected by flooding, but floodplain areas are not well constrained. Measurements and computer models of flood inundation over large areas have been difficult to produce because of a scarcity of observations in compatible formats and a lack of the detailed boundary conditions, in particular floodplain topography, required to run hydrodynamic models. However, the available data now allow such work, and the computational techniques needed to ingest this information are ready for development. Optical and SAR sensing are providing a near-global record of floodplain inundation, and passive microwave radiometry is producing a calibrated record of flood-associated discharge values, 1998-present.
Also, global topographic data are of increasingly fine resolution, and techniques have been developed to facilitate their incorporation into modeling. Several of us have already demonstrated the new capability to accurately model and map floodplains at continental scale using input discharges of various sizes and exceedance probabilities. Work is needed to produce global-scale products, wherein results are extended to all continents and downscaled to be locally accurate and useful. Floodplain mapping technologies and standards vary greatly among nations (many nations have neither): the planned effort will provide a global flood hazard infrastructure on which detailed local risk assessment can build. Our project brings together an experienced team of modeling, remote sensing, hydrology, and information technology scientists at JPL and the University of Colorado with the Google Earth Engine team to implement and disseminate a Global Floodplains and Flood Risk digital map product. This project addresses major priorities listed in the AIST program: with Google, we would identify, develop, and demonstrate advanced information system technologies that increase the accessibility and utility of NASA science data and enable new information products. The work will address the Core Topic 'Data-Centric Technologies', including 'Technologies that provide opportunities for more efficient interoperations with observations data systems, such as high end computing and modeling systems; and Capabilities that advance integrated Earth science missions by enabling discovery and access to Service Oriented Architecture'. It will also address the Special Subtopic 'Technology Enhancements for Applied Sciences Applications' with regard to natural disasters, and contribute to the GEOSS architecture for the use of remote sensing products in disaster management and risk assessment.
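The floodplain definitions used in this abstract rest on the standard relationship between annual exceedance probability and return period; a short sketch, with the 3.3%/30-year example from the text:

```python
def return_period(annual_exceedance_prob: float) -> float:
    """Return period T in years for a flood whose annual exceedance
    probability is p: T = 1 / p."""
    return 1.0 / annual_exceedance_prob

def prob_at_least_one(annual_exceedance_prob: float, years: int) -> float:
    """Probability that a flood of this size occurs at least once in the
    given number of years, assuming independent years:
    P = 1 - (1 - p)^n."""
    return 1.0 - (1.0 - annual_exceedance_prob) ** years

# The 3.3% exceedance probability cited above corresponds to the ~30-year
# floodplain: T = 1 / 0.033 ≈ 30.3 years.
T = return_period(0.033)
```

Note that a "30-year flood" is far from rare over a long horizon: `prob_at_least_one(1/30, 30)` is roughly 0.64, i.e., about a 64% chance of at least one such flood in any 30-year window.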
- API data.nasa.gov | Last Updated 2020-01-29T05:02:22.000Z
The Sloan Digital Sky Survey (SDSS) Moving Object Catalog 3rd release lists astrometric and photometric data for moving objects detected in the SDSS. The catalog includes various identification parameters, SDSS astrometric measurements, photometric measurements (five SDSS magnitudes and their errors), and orbital elements for previously cataloged asteroids. The data set also includes a list of the runs from which data are included, and filter response curves.
- API data.nasa.gov | Last Updated 2020-01-29T04:18:50.000Z
The objective of this project is to demonstrate intelligent health and maintenance status determination and predictive fault diagnosis techniques for NASA rocket engines under online and offline conditions, from either on-board or maintenance, test, and analytic data. AGNC proposes a Health and Maintenance Status Determination and Predictive Fault Diagnosis System (HMSD/PFDS). A fuzzy qualitative model for model-based residual generation and a rule-based evaluation of residuals using a neural-fuzzy combination are defined. Intelligent data fusion strategies for health and maintenance determination and predictive fault diagnosis are developed for rocket engine systems and subsystems. The goal is to ensure safety, cost reduction, graceful degradation, and re-optimization in the case of failures, malfunctions, and damage. Kalman-filter-based and rule-based evaluation of residuals using a neural-fuzzy combination are developed. The use of fuzzy qualitative models takes into account the uncertainties associated with behavior descriptions and incorporates available expert failure-symptom knowledge to recognize particular failure features. Actual or simulated rocket engine data, sensed or derived, are used to evaluate the effectiveness of the health and maintenance determination and fault prognosis approaches for NASA platforms. Phase I is devoted to the HMSD/PFDS design and simulation. Phase II will result in the development of a functional prototype.
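Residual-based diagnosis of the kind outlined above can be illustrated with a deliberately simplified sketch: a residual is the difference between a sensed value and a model prediction, and a rule then flags channels whose residuals are too large. The channel names, values, and crisp thresholds below are invented for illustration; the proposed HMSD/PFDS uses fuzzy qualitative models and neural-fuzzy evaluation rather than fixed thresholds:

```python
def generate_residuals(sensed, predicted):
    """Residual = sensed value minus model prediction, per sensor channel."""
    return {k: sensed[k] - predicted[k] for k in sensed}

def evaluate_residuals(residuals, thresholds):
    """Toy rule-based evaluation: flag any channel whose absolute residual
    exceeds its threshold. A neural-fuzzy evaluator would instead produce
    graded failure-symptom memberships rather than hard True/False flags."""
    return {k: abs(r) > thresholds[k] for k, r in residuals.items()}

# Illustrative channel names and values (not from the proposal):
sensed     = {"chamber_pressure": 101.5, "turbine_speed": 3020.0}
predicted  = {"chamber_pressure": 100.0, "turbine_speed": 3000.0}
thresholds = {"chamber_pressure": 1.0,   "turbine_speed": 50.0}

flags = evaluate_residuals(generate_residuals(sensed, predicted), thresholds)
# chamber_pressure residual (1.5) exceeds its threshold; turbine_speed (20) does not
```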
- API data.nasa.gov | Last Updated 2020-01-29T04:05:07.000Z
Artificial Intelligence (AI) is driving the fourth industrial revolution and permeating every aspect of day-to-day life, from big data analysis to language analysis and real-time translation, and from speech recognition to image recognition. The latter is a powerful and quite general application, with a scope that spans from medical imaging to autonomous driving to military applications. Mentium Technologies Inc., spun out of a UC Santa Barbara research lab in the Electrical and Computer Engineering department, is committed to embracing the AI revolution, building on its team's experience in neuromorphic hardware for AI. We will develop a neuromorphic chip able to perform faster-than-real-time image recognition and/or object classification on board the UAS. The chip will use 1/100th of the energy while reaching 100x the speed of the state of the art; the team has already demonstrated 1000x speed at 1/1000th the energy consumption in a smaller-scale experimental demo. From this experience, UCSB holds a patented technology licensed by Mentium Technologies Inc. Thanks to this technology and its development within this project, the neuromorphic chip will empower the UAS with cognitive functions enabling autonomous guidance, decision making, and complex image processing, while keeping power consumption low.
- API data.nasa.gov | Last Updated 2020-02-24T05:40:06.000Z
The Multi-angle Imaging SpectroRadiometer (MISR) is an instrument designed to view Earth with cameras pointed in 9 different directions. As the instrument flies overhead, each piece of Earth's surface below is successively imaged by all 9 cameras, in each of 4 wavelengths (blue, green, red, and near-infrared). The goal of MISR is to improve our understanding of the fate of sunlight in the Earth environment, as well as to distinguish different types of clouds, particles, and surfaces. Specifically, MISR monitors monthly, seasonal, and long-term trends in three areas: 1) the amount and type of atmospheric particles (aerosols), including those formed by natural sources and by human activities; 2) the amounts, types, and heights of clouds; and 3) the distribution of land surface cover, including vegetation canopy structure. This file contains the public MISR Level 3 Cloud Top Height-Optical Depth Product covering one month.
- API data.nasa.gov | Last Updated 2019-12-12T23:50:14.000Z
CAL_LID_L2_05kmCPro-Prov-V3-40 data are CALIPSO Lidar Level 2 Cloud Profile data. The Lidar Level 2 Cloud Profile data product contains cloud profile data and ancillary data. The cloud profile product is produced at 5 km horizontal resolution and is written in HDF. Note that there is no atmospheric volume characterization associated with the cloud profile products. Also, the 1064 nm calibration scheme assumes that both the extinction and the backscatter from clouds are spectrally independent; consistent with this assumption, extinction and backscatter profiles are reported for clouds only at 532 nm. Additionally, it is important to note that the aerosol profile product extends upward to 30.1 km, while the cloud profile product ceases at 20.2 km. Therefore, users interested in polar stratospheric clouds will need to order the aerosol profile data product. The science algorithms used to produce the V3.40 CALIOP data products are identical to those used to generate the V3.01 and V3.02 products; however, some of the ancillary data used in the V3.40 analyses differ. All CALIOP data products rely on meteorological data provided by NASA's Global Modeling and Assimilation Office (GMAO); the V3.01 and V3.02 data products were produced using the GMAO's GEOS 5.2 data products. The Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) satellite was launched on April 28, 2006 to study the impact of clouds and aerosols on the Earth's radiation budget and climate. It flies in the international A-Train constellation for coincident Earth observations. CALIPSO carries three instruments: the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP), the Imaging Infrared Radiometer (IIR), and the Wide Field Camera (WFC). CALIPSO is a joint satellite mission between NASA and the French space agency, CNES.
- API data.nasa.gov | Last Updated 2020-01-29T03:39:20.000Z
<p>The primary objective of this activity is to develop, design, and test (DD&T) the QUAD-core siTARA (QUATARA) computer to distribute computationally intensive processes such as communication, sensors, attitude determination, attitude control, cameras, robotic manipulators, and science payloads. An example of the current state of the art for a COTS CubeSat flight computer is a 16-bit, 80 MHz Microchip dsPIC33 microcontroller capable of managing the satellite's attitude determination and control system, communication system, power, and science payloads. Adding more capability to these COTS flight computers required the development, under a previous CIF proposal, of the Modular Attitude Determination System (MADS) board. MADS lessened the I/O load on the flight computer so it could focus on higher-priority tasks such as managing a Real-Time Operating System (RTOS) or carrying out an attitude control algorithm. The MADS board utilized a 32-bit, 80 MHz Texas Instruments ARM Cortex-M4 Stellaris microcontroller to execute the attitude determination algorithm independently of the dsPIC33 flight computer. Once the MADS board processes the data, the dsPIC33 receives the estimated states and determines the desired attitude control.</p><p>The addition of cameras, proximity sensors, robotic manipulators, thruster systems, complex science payloads, and video guidance systems would overwhelm current CubeSat flight computers. To expand the capabilities of CubeSats, the QUATARA architecture enhances data handling and computer processing by replacing the 16-bit, 80 MHz microcontrollers with four 64-bit, 1 GHz microprocessors.
The QUATARA allows tasks to be processed at a faster rate, not only because of the difference in clock speed between the platforms but also because the four individual microprocessors can run these tasks independently, without needing to serialize code execution as on a single microcontroller.</p><p>The QUATARA computer aims to be fault-tolerant by means of a software voting scheme that guards against Single Event Effects (SEE) such as Single Event Upsets (SEU). Each ‘node’ (a Gumstix Computer-On-Module (COM)) of the QUATARA computer will be connected to its own set of sensors and actuators. The individual nodes will collect their respective data and share it among themselves over a data bus (such as RS-485). Once each node has the data from all of the other nodes, it will process it and produce a result. This result can then be used to determine whether a node should be considered ‘failed’ and disabled (either by ignoring future data received from that node or by shutting it off completely). In the case that a node is lost, a support node is available to be switched in for the failed node; when not required as a voting node, this support node will focus on low-priority tasks such as housekeeping. Synchronization between the nodes can be maintained by having a precise timing source on each of the processors (such as a ticking timer interrupt routine) that ticks at a set interval. This timing information will be passed between the nodes, and the tick rate of the interrupt routine will be adjusted as required to ensure that all of the nodes’ data remains in sync.</p>
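The software voting scheme described above can be sketched as a simple majority vote over per-node results. This is an illustrative outline only; the flight implementation would also need to handle ties, timeouts, and stale data, and would run on each node against data received over the shared bus:

```python
from collections import Counter

def vote(node_results):
    """Majority vote over per-node computation results.

    Returns the agreed value and the set of nodes whose result disagrees
    with the majority (candidates for being marked 'failed' and disabled,
    per the scheme described in the abstract)."""
    counts = Counter(node_results.values())
    agreed, _ = counts.most_common(1)[0]
    failed = {node for node, result in node_results.items() if result != agreed}
    return agreed, failed

# Four nodes compute the same quantity; node "C" returns a corrupted value
# (node names and values are illustrative):
agreed, failed = vote({"A": 42, "B": 42, "C": 17, "D": 42})
```

With four voting nodes, a single upset node is outvoted 3-to-1, after which the support node could be switched in for it.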
- API data.nasa.gov | Last Updated 2020-01-29T04:12:36.000Z
Current-day capabilities for performing post operations analysis (POA) of air traffic operations at airports, airlines, and FAA facilities are mostly limited to report-style results that compare mean values of key performance indicators against their expected nominal levels (e.g., average daily delay). This single-point comparison does not directly enable a POA analyst to identify the root cause of a particular observed inefficiency, nor does it help in identifying a solution for mitigating that inefficiency. This SBIR develops a machine-learning-based approach for improving POA and for potentially making it more autonomous. We call this tool the Collective Inference based Data Analytics System for POA (CIDAS-P). CIDAS-P will provide airport, airline, FAA, and NASA personnel with a fast, flexible, and streamlined process for analyzing the day of operations, rapidly pinpointing exact causes of any observed inefficiencies, and recommending actions to avoid the same inefficiencies in the future. It does this through an innovative collective inference algorithm for cross-comparing the performance of the same facility on different days as well as performance across different facilities. The algorithm leverages probabilistic modeling techniques that account for the subtle ways in which cross-facility and cross-day operational scenarios differ, enabling apples-to-apples comparisons across traffic scenarios and identifying what works well and what does not in similar situations. User acceptance of NASA Trajectory Based Operations research products stands to benefit from CIDAS-P because its automated recommendations can help identify and fix problems with these products early in their deployment life cycle.
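The contrast with single-point mean comparison can be illustrated with a toy interval-level baseline check: instead of comparing one daily average against a nominal level, each interval of the day is compared against the historical distribution for that same interval, so the specific period driving an inefficiency is pinpointed. The KPI, interval granularity, and z-score threshold here are assumptions for illustration; this is not CIDAS-P's actual collective inference algorithm:

```python
import statistics

def flag_anomalous_intervals(day_values, historical, z_thresh=2.0):
    """Flag intervals of the current day whose KPI value lies more than
    z_thresh standard deviations from the historical mean for that interval.

    day_values: one KPI value per interval for the day under analysis.
    historical: for each interval, a list of KPI values from comparable
                prior days (the cross-day baseline)."""
    flagged = []
    for i, value in enumerate(day_values):
        mu = statistics.mean(historical[i])
        sd = statistics.stdev(historical[i])
        if sd > 0 and abs(value - mu) / sd > z_thresh:
            flagged.append(i)
    return flagged

# Hourly average taxi-out delay in minutes vs. five comparable prior days
# (all numbers invented for illustration):
hist = [[10, 11, 9, 10, 12]] * 4
today = [10, 11, 30, 9]  # interval 2 sits far outside the baseline
flagged = flag_anomalous_intervals(today, hist)
```

A daily mean of `today` (15 min vs. a baseline of about 10.4 min) would only say the day was "worse than nominal"; the interval-level check isolates the single hour responsible.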