Wednesdays at 12:45 p.m. – 1:45 p.m. —
UNLESS OTHERWISE NOTED.
Discussion of the National Weather Service’s polygon warnings has been ongoing since their introduction in October 2007. As originally conceived, polygon warnings were intended to spatially enclose the hazards accompanying severe weather without regard to county or other geopolitical boundaries. In fact, statistics and visual evidence confirm that many storm-based warnings are issued from this perspective. However, NOAA Weather Radio All Hazards and most other warning dissemination systems remain county-based. This, coupled with the fact that polygons often have unnecessarily large dimensions in space or time, can lead to the reality or perception of over-warning. Recent social science research indicates that the public perceives far more false alarms than the number of warnings actually issued, regardless of event verification. Adding improved polygonology and properly analyzed environmental data to the warning decision process allows forecasters to reduce the false alarm ratio without reducing the more important probability of detection and lead time. The strategies and best practices presented are intended to reduce the public’s perception of false alarms, thereby increasing overall confidence in warning decisions.
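The verification quantities the abstract weighs against each other come from the standard 2x2 contingency table. A minimal sketch in Python, with illustrative counts that are not figures from the talk:

```python
# Standard contingency-table warning-verification metrics.
# The counts below are invented for illustration only.

def pod(hits, misses):
    """Probability of detection: fraction of observed events that were warned."""
    return hits / (hits + misses)

def far(hits, false_alarms):
    """False alarm ratio: fraction of warnings that did not verify."""
    return false_alarms / (hits + false_alarms)

hits, misses, false_alarms = 80, 20, 40
print(f"POD = {pod(hits, misses):.2f}")        # POD = 0.80
print(f"FAR = {far(hits, false_alarms):.2f}")  # FAR = 0.33
```

The tension described in the abstract is visible here: shrinking polygons removes false alarms (lowering FAR) but risks converting hits into misses (lowering POD).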
Wednesday, September 2nd – Lightning: A Biological Perspective – (Tom Wallace Endowed Chair of Conservation) – Steve Yanoviak
Trees form the terrestrial interface with the atmosphere in forested regions, and lightning damages millions of trees worldwide each year. However, the biology of tree-lightning interactions remains poorly understood. Research in my lab addresses this problem by exploring three general topics: 1) the electrical properties of trees, 2) the frequency and mechanisms of tree mortality via lightning, and 3) the potential role of vines as passive lightning protection for trees. Electrical resistivity is the most important electrical property of trees in the context of lightning damage, and our field data show that vines tend to be more conductive than trees. Although often overlooked as an important source of tree mortality, estimates drawn from the literature suggest that lightning directly or indirectly kills up to 4% of large canopy trees in a stand annually. Finally, our ongoing efforts to monitor lightning in real time in a tropical forest of central Panama suggest that tropical vines dilute the damaging effects of lightning. Accurate quantification of lightning-induced tree mortality will improve forest turnover models and facilitate predictions of future forest structure.
Wednesday, September 9th – Ocean Weather Applications with GOES-R – (NOAA – NWS Ocean Prediction Center) – Joseph Sienkiewicz
The NOAA/NWS Ocean Prediction Center (OPC) issues wind warnings and forecasts for the extratropical portions of the North Atlantic and North Pacific from the sub-tropics to the low Arctic. The OPC mission partially fulfills the U.S. commitment to the International Convention for the Safety of Life at Sea (SOLAS). OPC areas of responsibility fall under the umbrella of four geostationary satellites, from Himawari-8/MTSAT to Meteosat, including GOES-West and GOES-East. The OPC, together with the Weather Prediction Center and the Satellite Analysis Branch, benefits from a Satellite Liaison to prepare for GOES-R and JPSS. This talk will discuss preparation activities and service focus areas that will benefit from the capabilities of GOES-R, including offshore convection, explosive cyclogenesis, reduced visibility due to fog, and oceanography.
In the Central United States, most heavy rain events are associated with MCSs that occur during the warm season, a time when QPF skill is especially poor. This presentation will examine the predictability of these events from several angles. First, the impact of radar data assimilation in convection-allowing Weather Research and Forecasting (WRF) runs on QPF for 12 extreme rainfall events in Iowa will be discussed. Although the assimilation generally improves forecasts, these improvements are restricted mostly to the first 12 hours, and impacts vary significantly among the different cases. Next, the relationship between the accuracy of WRF LLJ forecasts and the skill of QPF for MCSs occurring near the nose of the LLJ will be examined. Preliminary work suggests that the accuracy of WRF forecasts of ageostrophic wind direction correlates best with Equitable Threat Scores for MCS precipitation. Finally, the predictability of MCS morphology will be explored. In a study of 37 events in the central United States, the WRF model is found to simulate squall lines with trailing stratiform rain and bow echoes most poorly. Progressive bow echoes are found to be more poorly predicted than serial bow echoes. In a well-predicted bow event, an ensemble using mixed initial conditions results in the greatest spread, one using mixed microphysics less spread, and one using Stochastic Kinetic Energy Backscatter the least spread. However, despite these variations in spread, which are seen easily in the position of the systems, all ensembles show very large spread in the small-scale details of the systems, including the amount of bowing.
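The Equitable Threat Score used to assess the QPF skill above is the threat score adjusted for hits expected by chance. A small sketch, with invented counts for illustration:

```python
def ets(hits, misses, false_alarms, correct_negatives):
    """Equitable Threat Score: threat score discounted by random hits.
    All inputs are grid-point (or event) counts above some precipitation
    threshold; the example numbers below are illustrative only."""
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

print(f"ETS = {ets(50, 25, 30, 895):.3f}")  # ETS = 0.444
```

ETS ranges from -1/3 to 1, with 0 for a no-skill (random) forecast, which is why it is a common summary score for rare-event QPF verification.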
Wednesday, September 23rd – Climate Engineering: Marine Cloud Brightening – (NCAR & Manchester University (UK)) – John Latham
The Advanced Radar for Meteorological and Operational Research (ARMOR), which is located at the Huntsville International Airport, is jointly owned and operated by UAH and WHNT-TV. ARMOR has been a collaborative Huntsville enterprise since its inception. It was integrated into a dual-polarization (DP) capable radar system for UAH and WHNT-TV between 2004 and 2006 by the Huntsville-headquartered Baron Services. ARMOR has contributed to basic and applied research in the atmospheric sciences by UAH, NASA MSFC, and the NOAA NWS Huntsville (HUN) Weather Forecast Office (WFO) for about 10 years now; it has also been used actively for situational awareness and forecasting operations during high-impact weather over the Tennessee Valley by WHNT-TV and the HUN WFO. To celebrate just a few of ARMOR’s accomplishments, I presented an invited keynote talk on the use of ARMOR in Mesoscale and Severe Weather at the recent American Meteorological Society 37th Conference on Radar Meteorology. In this seminar, I will re-present this invited talk and provide the opportunity for others in NSSTC to recount their use of ARMOR in service to science and society over the last 10 years.
Predicting hazards associated with deep convection remains a considerable challenge for the weather community, but recent advances in storm-scale ensemble forecast system design have enabled considerable progress. Key developments in convective weather hazard prediction include 1) ensemble forecasts with sufficiently fine grid spacing to resolve the key attributes of the parent convection; 2) initialization approaches that yield ensemble forecast error growth characteristics that reliably approximate the predictability of the forecast event; and 3) severe storm surrogates that relate resolved model forecast features to convective weather hazards.
To develop robust statistics on the predictive skill of rare events, large sample sizes are needed. As such, since early April 2015, NCAR has produced daily 10-member convection-permitting ensemble forecasts over CONUS out to 48 hours (http://ensemble.ucar.edu). These daily forecasts are expected to continue at least through Summer 2016. This forecast system draws ensemble initial conditions from a continuously cycled ensemble analysis, built on the Data Assimilation Research Testbed (DART) toolkit, that uses the same forecast model to advance member states between cycled analyses. In the talk, I will give a brief description of the basis for the NCAR ensemble system design, followed by a discussion of areas of storm-scale ensemble design that NCAR is actively investigating, including higher-resolution ensembles (1-km horizontal grid spacing), more frequent (hourly) cycling of the ensemble analysis, and progress toward a fully cycled convection-permitting ensemble analysis on a CONUS grid. Each of these aspects will be considered with respect to making more reliable mesoscale predictions of convective weather hazards.
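A basic diagnostic for comparing ensemble configurations like those above is the grid-point spread (member-to-member standard deviation). A minimal sketch, with synthetic random fields standing in for a 10-member forecast; the field, grid, and distribution parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
# members: (n_members, ny, nx) — a synthetic stand-in for a 10-member
# convection-permitting forecast of, say, hourly precipitation.
members = rng.gamma(shape=2.0, scale=1.5, size=(10, 120, 160))

ens_mean = members.mean(axis=0)            # ensemble mean field
ens_spread = members.std(axis=0, ddof=1)   # grid-point ensemble spread

# A domain-average spread is one simple scalar summary used when
# comparing IC-perturbation, mixed-physics, and SKEB configurations.
print(f"domain-mean spread: {ens_spread.mean():.3f}")
```

In a well-calibrated ensemble, spread should on average match the root-mean-square error of the ensemble mean, which is the sense in which spread diagnostics speak to forecast reliability.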
About 20% of the Brazilian Amazon has now been converted to pasture or agriculture. This deforestation has led to a patchwork landscape, in which deforested patches have a typical size of about 20 km. Assessing the atmospheric response to deforestation at this scale is challenging because conventional GCMs run at much coarser resolution. To address this problem, we have carried out a set of simulations with a new, variable-resolution atmospheric model. These simulations were designed to capture the critical 20 km length scale in the Amazon in the context of a global atmospheric GCM. We have found that reductions in surface roughness associated with ~20 km scale deforestation can give rise to a previously unrecognized atmospheric circulation. This circulation is capable of convective triggering, but it also weakens the turbulent exchange between the land and atmosphere. Furthermore, this circulation has distinct impacts on the hydroclimate of the western and eastern halves of the deforested sector of the Amazon in the Brazilian state of Rondonia: shallow cloudiness is increased in the western half of Rondonia but reduced in the eastern half. Our analysis of satellite imagery paints a similar picture: increased cloudiness in western Rondonia and reduced cloudiness in eastern Rondonia. Overall, our results show that the atmospheric response to contemporary ~20 km scale deforestation is likely to be more influenced by differences in surface roughness between forests and pastures than by previously recognized mechanisms, such as impacts of changes in surface energy partitioning.
This presentation will consist of two parts. The first part will describe how vertical air motion and raindrop size distributions (DSDs) were retrieved from 449-MHz and 2.835-GHz (UHF and S-band) vertically pointing radars (VPRs) deployed side-by-side during the 2011 Mid-latitude Continental Convective Clouds Experiment (MC3E) held in Northern Oklahoma. The 449-MHz VPR can measure both turbulent air motions and raindrop motion while the S-band VPR can only measure raindrop motion. The difference in VPR sensitivities of these two instruments enables two peaks to be identified in 449-MHz VPR reflectivity-weighted Doppler velocity spectra facilitating vertical air motion and DSD parameter retrievals from near the surface to just below the melting layer.
The second part of this presentation will analyze the vertical evolution of falling raindrops by introducing liquid water content (LWC) Vertical Decomposition Diagrams (LWC-VDDs). Using VPR retrieved DSDs, the LWC is decomposed into two terms: one representing the number concentration and another representing the DSD shape. The LWC-VDD is a diagnostic tool used to investigate microphysical processes in the vertical column including raindrop evaporation and raindrop breakup or coalescence.
As an example, a stratiform rain event (20-May-2011) will be presented where the LWC-VDDs exhibited signatures of evaporation as well as raindrop coalescence as the raindrops fell 2 km below the radar brightband.
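The LWC-VDD decomposition itself is specific to this work, but the quantity it decomposes is the standard third moment of the DSD. A sketch of computing LWC from a binned DSD, using an illustrative Marshall-Palmer exponential distribution rather than MC3E retrievals:

```python
import numpy as np

RHO_W = 1000.0  # density of liquid water, kg m^-3

def lwc_g_m3(diam_mm, n_d):
    """Liquid water content (g m^-3) from a binned DSD.
    diam_mm: uniformly spaced drop diameters (mm);
    n_d: number concentration N(D) in m^-3 mm^-1."""
    d_d = diam_mm[1] - diam_mm[0]                   # bin width (mm)
    vol = (np.pi / 6.0) * (diam_mm * 1e-3) ** 3     # drop volume (m^3)
    return RHO_W * 1e3 * np.sum(n_d * vol) * d_d    # kg m^-3 -> g m^-3

# Illustrative Marshall-Palmer exponential DSD (not MC3E data):
d = np.arange(0.05, 8.0, 0.05)   # diameters, mm
n0, lam = 8000.0, 2.0            # intercept (m^-3 mm^-1), slope (mm^-1)
n_d = n0 * np.exp(-lam * d)
print(f"LWC = {lwc_g_m3(d, n_d):.2f} g m^-3")
```

Evaporation preferentially depletes small drops while coalescence shifts mass toward larger drops at fixed total number, which is why splitting LWC into a number-concentration term and a DSD-shape term lets the LWC-VDD distinguish the two processes in the vertical column.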
As a discipline, archaeology is poised to fully embrace both the power and the peril of big data analysis. Our datasets are growing ever larger, especially those generated via remote sensing and geospatial processing activities, as is the computational complexity of algorithms designed to exploit them. Analyses are quickly outpacing what can be done using a single processing core on a desktop computer, leveraging off-the-shelf commercial and open source software. Our research needs are becoming increasingly sophisticated, to the point where relying wholly on outside experts in computer science and related fields is untenable. While the above statements could be viewed primarily as challenges, it is better to think of them as opportunities for archaeology to grow technologically and retain more ownership of our hardest problems. High performance computing, i.e., supercomputing, is already having an impact on the field, but we are moving into an era that promises to put the power of the world’s largest and fastest computers at archaeologists’ fingertips. What does the state of the art look like? How could we use the power already available, much less what is coming next? What lines of inquiry and analysis could we pursue once long-standing technical limitations have been removed? How will fieldwork be transformed?
This presentation will focus on the present and the future of archaeological supercomputing, using several ongoing projects across a broad swath of the discipline as examples of where we are now and signposts for where we are heading, concluding with some thoughts on the art of the possible, given current and emerging technological trends. It is hoped that the audience will come away feeling less intimidated by the idea of using supercomputing to solve archaeological problems, and knowing that they can and should take full advantage of the computing power available today as well as help drive how the systems of tomorrow are designed.
Wednesday, October 28th – Snowfall Rate Retrieval using JPSS/Metop/POES Microwave Radiometers – (NOAA/NESDIS) – Huan Meng
Passive microwave measurements at certain high frequencies are sensitive to the scattering effect of snow particles and can be utilized to retrieve snowfall properties. Among the microwave sensors with snowfall-sensitive channels are the Advanced Technology Microwave Sounder (ATMS) aboard S-NPP and the Advanced Microwave Sounding Unit (AMSU) and Microwave Humidity Sounder (MHS) aboard POES and Metop satellites. ATMS is the follow-on sensor to AMSU and MHS. Currently, an AMSU- and MHS-based land snowfall rate (SFR) product runs operationally at NOAA/NESDIS. Building on the AMSU/MHS SFR, an ATMS SFR algorithm was developed in a project supported by the JPSS PGRR program. Much improvement has been made since the original ATMS SFR algorithm was developed. A major advancement is the addition of a cold-temperature component for snowfall detection, which extends the lower 2-meter temperature limit for SFR from about 22°F to about 7°F and drastically increases the probability of detecting snowfall in colder weather. Other algorithm development includes improving the accuracy of the snowfall rate retrieval by performing histogram matching with radar snowfall data. A validation study of snowfall detection and snowfall rate against ground snowfall observations and radar snowfall estimates demonstrated the robustness of the SFR algorithm. The ATMS and AMSU/MHS SFR products have been evaluated at several NWS Weather Forecast Offices (WFOs) in an operational environment. This is a collaborative effort among NASA SPoRT, NOAA, and the Cooperative Institute for Climate and Satellites (CICS) at the University of Maryland. The SFR products have been found particularly valuable for weather forecasting in areas with inadequate radar coverage.
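Histogram (CDF) matching, the calibration step mentioned for the SFR retrieval, can be sketched generically. The synthetic gamma-distributed rates below are placeholders, not ATMS or radar data:

```python
import numpy as np

def histogram_match(values, reference, n_q=101):
    """Map `values` so their distribution matches `reference` via
    quantile (CDF) matching. This is the general calibration technique;
    the quantile count and data here are illustrative."""
    qs = np.linspace(0, 100, n_q)
    src_q = np.percentile(values, qs)      # source quantiles
    ref_q = np.percentile(reference, qs)   # target quantiles
    return np.interp(values, src_q, ref_q)

rng = np.random.default_rng(1)
retrieved = rng.gamma(2.0, 0.5, 5000)   # synthetic satellite SFR (mm/h)
radar = rng.gamma(2.0, 0.8, 5000)       # synthetic radar SFR (mm/h)
calibrated = histogram_match(retrieved, radar)
print(round(float(np.median(calibrated)), 2), round(float(np.median(radar)), 2))
```

Because the mapping is monotonic, it adjusts the retrieval's rate distribution toward the radar reference without reordering which scenes snow harder than others.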
Wildfires are becoming more frequent and more intense in the western U.S. and are expected to increase in importance due to a shift in forest management practices and a projected warmer, drier regional climate. These fires have climate, air quality, and health impacts that result from the chemical and physical properties of their emissions. Several aspects of wildfire are not understood: why do some fire plumes produce ozone and others do not? Why does secondary organic aerosol increase in some fire plumes and decrease in others? What is the nature of brown carbon aerosol in wildfire plumes? What causes the health effects from fires? NOAA CSD is leading a multi-year, multi-platform research project, the Fire Influence on Regional and Global Environments Experiment (FIREX), aimed at addressing these research questions and more. I will talk about some current research on fire emissions that bears on these questions, focusing primarily on nitrogen species and their photochemical and multi-phase processing.
The Western U.S. is perceived to have relatively pristine air, but at present the west is facing a number of significant air quality issues, including:
In this presentation, I will summarize our knowledge on the issues above, with a particular emphasis on key scientific questions and the interactions between science and policy in the Western U.S.
Wednesday, November 18th –
Analysis of the Lightning Jump Algorithm Using Multiple Datasets for Hail Events
The Lightning Jump Algorithm (LJA) is a useful tool when assessing severe weather potential. This study seeks to combine the LJA with traditional radar methods to evaluate connections between lightning and radar trends. Additionally, hail reports from the Severe Hazards Analysis and Verification Experiment (SHAVE) help to enhance the verification process for each case.
Radar and Profiling Observations of the Interaction Between a Lake-effect Snow Band and a Shallow Cold Front During the Ontario Winter Lake-effect Systems (OWLeS) Experiment
The Ontario Winter Lake-effect Systems (OWLeS) experiment was conducted from December 2013 to January 2014. The field project utilized in-situ and remote sensing instruments to obtain multi-scale kinematic and thermodynamic measurements of lake-effect snow systems. A Long-Lake-Axis-Parallel (LLAP) band passed over the University of Alabama in Huntsville (UAH) Mobile Integrated Profiling System (MIPS) on 16 December 2013 from 0550-0630 UTC. An analysis of the LLAP band is presented herein. Doppler On Wheels (DOW) data were also analyzed in conjunction with MIPS observations to study notable features sampled during the event.
The Use of CYGNSS for Understanding the Onset of the Madden-Julian Oscillation
The Madden-Julian Oscillation (MJO) is a planetary-scale circulation in the tropics that is difficult to explicitly define given that it mostly takes place in data-void regions. To overcome sparse datasets, the Cyclone Global Navigation Satellite System (CYGNSS), a micro-satellite system set to launch in 2016, will capture ocean surface winds across the tropics, even beneath heavy precipitation. This study examines the ability of CYGNSS to effectively observe the MJO. This is achieved by using the CYGNSS End-to-End Simulator to simulate the weak December 2011 MJO event from the Dynamics of the MJO (DYNAMO) field campaign.
Wednesday, November 25th – Thanksgiving Break
Wednesday, December 2nd –
Applied Remote Sensing for Archaeological Excavation Preparation at the Oakville Mounds Site in Northern Alabama
Excavations were recently conducted on a large Middle Woodland (200 B.C. – A.D. 500) platform mound at the Oakville Mounds site in southeast Lawrence County, Alabama. These preliminary findings provide insights into the mound’s construction and contexts. Ongoing research to better understand these prehistoric monuments aims to derive excavation test locations using Geographic Information Systems (GIS) and remote sensing techniques.
Hydrometeor Analysis and Identification of Small and Large Hailstones
Hydrometeor Identification (HID) is used primarily to detect different polarimetric signatures in precipitation and then estimate, using fuzzy logic, what type of precipitation is falling. A background on HID schemes will be given, along with a case study that attempts to validate the HID. Small and large hail are the primary hydrometeors examined in this data set. Although not attempted at this time, this HID scheme will be essential in validating SHAVE hail data in order to optimize and solidify the general size of hail falling at a specific location.
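The fuzzy-logic step of an HID scheme can be illustrated with trapezoidal membership functions. The classes, variables, and thresholds below are invented placeholders, not those of any operational scheme:

```python
# Minimal fuzzy-logic hydrometeor classifier sketch over two polarimetric
# variables (reflectivity in dBZ, differential reflectivity ZDR in dB).
# All membership bounds are illustrative placeholders.

def trap(x, a, b, c, d):
    """Trapezoidal membership: 0 outside (a, d), 1 on [b, c], linear ramps between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Per-class membership bounds: [(dBZ trapezoid), (ZDR trapezoid)]
CLASSES = {
    "rain":       [(20, 30, 50, 55), (0.5, 1.0, 3.0, 4.0)],
    "small_hail": [(45, 50, 60, 65), (-0.5, 0.0, 1.0, 1.5)],
    "large_hail": [(55, 60, 75, 80), (-2.0, -1.0, 0.5, 1.0)],
}

def classify(dbz, zdr):
    """Return the class whose aggregated membership score is highest."""
    scores = {c: trap(dbz, *m[0]) * trap(zdr, *m[1]) for c, m in CLASSES.items()}
    return max(scores, key=scores.get)

print(classify(63.0, -0.5))  # large_hail, given these placeholder bounds
```

Real HID schemes aggregate many more variables (e.g., correlation coefficient, specific differential phase) and often weight or sum memberships rather than multiply them; the structure, however, is the same.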
Correlation of NASA EOS, Aerial Multispectral, and Lidar Data with Field Estimated Above-Ground Biomass:
A Case Study of the Mayan Biosphere Reserve
Conventional biomass estimations acquired through field survey campaigns and direct measurement of trees yield very accurate forest structure allometry but are not sustainable solutions for large-scale carbon stock inventories. Remote sensing offers a solution to field survey expenses, but statistical regression techniques need to be explored to minimize the inherent limitations of individual datasets that cannot account for complex forest structure. This research evaluates the statistical relations between in situ estimated biomass and remotely sensed imagery and derived products to test the applicability of remote sensing for above-ground biomass estimation in the Mayan Biosphere Reserve.
An Analysis of a Shallow Cold Front and Wave Interactions from the PLOWS Field Campaign
The Profiling of Winter Storms (PLOWS) field campaign occurred from February 2009 until March 2010 and used a variety of mobile instruments to investigate mesoscale structures and the dynamics of cyclonic weather systems. IOP 19 occurred on February 14-15, 2010, in southwestern Indiana, where a shallow cold front and associated wave features moved over the mobile instrumentation array. The impact of the cold front and associated waves on the surrounding environment is assessed using the mobile instrumentation.
Understanding How CYGNSS Will Depict Convective Variability Associated with the Madden-Julian Oscillation
The Madden-Julian Oscillation is a global, eastward-propagating mode of tropical atmospheric variability. In 2011, the DYNAMO field campaign collected a wide variety of data during several MJO events; these data are assimilated into the Weather Research and Forecasting (WRF) model to obtain a high-resolution wind field to ingest into the Cyclone Global Navigation Satellite System (CYGNSS) End-to-End Simulator. This study will examine the ability of CYGNSS to retrieve ocean surface wind speeds resulting from the convective variability associated with the MJO.
Wednesday, December 9th –
Where Are the Lightning Hotspots on Earth?
Dr. Rachel Albrecht
Professor, Department of Atmospheric Science
University of Sao Paulo, Brazil
Previous total lightning climatology studies using TRMM Lightning Imaging Sensor (LIS) observations used coarse resolution (0.5°) and employed significant spatial and temporal smoothing to account for the sampling limitations of the satellite’s low-earth-orbit coverage. As a result, several fine-scale local convection features are masked out, including the Earth’s principal lightning hotspot. It has been widely acknowledged that the Congo Basin in Africa is the place with the most lightning on Earth, but using a new analysis of 16 years of TRMM LIS observations binned onto a very high resolution (0.1°) grid, we reveal that the Earth’s principal lightning hotspot actually occurs over Lake Maracaibo in Venezuela, and surprisingly during the night. It is at night that very localized thunderstorms persistently develop over the warm lake due to convergent wind flow from mountain-valley and land-lake breezes. Several other inland lakes (e.g., Lake Victoria) with similar conditions, i.e., deep nocturnal convection driven by locally forced convergent flow over a warm lake surface, are also revealed. Although Africa does not hold the principal hotspot, it is the continent with the most lightning hotspots, followed by Asia, South America, North America, and Australia. We present a ranking of the top 500 lightning hotspots on Earth and show that most of the principal continental maxima are located near major mountain ranges, revealing the importance of local topography in thunderstorm development.
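The high-resolution gridding step behind such a ranking amounts to binning flash locations onto a 0.1° grid and sorting cells by flash count. A sketch with synthetic flash coordinates standing in for LIS data:

```python
import numpy as np

# Bin synthetic flash locations onto a 0.1-degree grid over a 1x1 degree
# box and rank cells by flash count. Coordinates are invented, not LIS data.
rng = np.random.default_rng(2)
lons = rng.uniform(-72.0, -71.0, 10000)  # synthetic flash longitudes
lats = rng.uniform(9.0, 10.0, 10000)     # synthetic flash latitudes

lon_edges = np.linspace(-72.0, -71.0, 11)  # 10 bins at 0.1-degree resolution
lat_edges = np.linspace(9.0, 10.0, 11)
counts, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])

# Rank grid cells by flash count, descending; the top cell is the "hotspot".
flat_rank = np.argsort(counts.ravel())[::-1]
top_cell = np.unravel_index(flat_rank[0], counts.shape)
print("top cell (lon_idx, lat_idx):", top_cell, "count:", int(counts[top_cell]))
```

A real climatology would additionally normalize each cell's count by the satellite's accumulated viewing time there to convert counts into a flash rate density, since low-earth-orbit sampling is uneven.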
These results are especially relevant in anticipation of the upcoming availability of continuous total lightning observations from the Geostationary Lightning Mapper (GLM) aboard GOES-R. This study provides forecasters with context on total lightning activity and hotspot locations within the GLM field of view as well as around the world.