ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


  • 1
    American Geophysical Union (AGU)
    In: Journal of Geophysical Research: Oceans, American Geophysical Union (AGU), 124(8), pp. 5503-5528, ISSN: 2169-9275
    Publication Date: 2022-11-02
    Repository Name: EPIC Alfred Wegener Institut
    Type: Article, NonPeerReviewed
    Format: application/pdf
  • 2
    Publication Date: 2023-03-31
    Description: Coeval changes in atmospheric CO2 and 14C contents during the last deglaciation are often attributed to ocean circulation changes that released carbon stored in the deep ocean during the Last Glacial Maximum (LGM). Work is being done to generate records that allow for the identification of the exact mechanisms leading to the accumulation and release of carbon from the oceanic reservoir, but these mechanisms are still the subject of debate. Here we present foraminifera 14C data from five cores in a transect across the Chilean continental margin between ~540 and ~3,100 m depth spanning the last 20,000 years. Our data reveal that during the LGM, waters at ~2,000 m were 50% to 80% more depleted in Δ14C than waters at ~1,500 m when compared to modern values, consistent with the hypothesis of a glacial deep ocean carbon reservoir that was isolated from the atmosphere. During the deglaciation, our intermediate water records reveal homogenization in the Δ14C values between ~800 and ~1,500 m from ~16.5–14.5 ka cal BP to ~14–12 ka cal BP, which we interpret as deeper penetration of Antarctic Intermediate Water. While many questions still remain, this process could aid the ventilation of the deep ocean at the beginning of the deglaciation, contributing to the observed ~40 ppm rise in atmospheric pCO2. (The Δ14C notation is sketched in the note following this record.)
    Repository Name: EPIC Alfred Wegener Institut
    Type: Article, PeerReviewed
    Format: application/pdf
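    A note on notation (ours, not part of the record): the Δ14C values quoted in the abstract above follow the standard radiocarbon convention of Stuiver and Polach (1977), the per mil deviation of a sample's fractionation- and decay-corrected 14C activity from the absolute standard:
    $$\Delta^{14}\mathrm{C} = \left( \frac{A_{\mathrm{SN}}}{A_{\mathrm{abs}}} - 1 \right) \times 1000\ \text{‰}$$
    Here A_SN is the normalized sample activity and A_abs the activity of the absolute international standard; more negative Δ14C indicates water that has been out of contact with the atmosphere longer and is therefore more 14C depleted.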
  • 3
    American Geophysical Union (AGU)
    Publication Date: 2018-03-06
    Description: Just like people are often divided into two opposing categories—early birds and night owls, introverts and extroverts—meteorologists designate two main types of storms. Storms that form through organized convection are long-lived, cover a large area, generate and accumulate more clusters of clouds with time, take place in environments with large-scale circulation, and are triggered by uplifting air from a passing front or low-pressure system. Meanwhile, storms that form through unorganized convection are triggered by a temperature anomaly or change near Earth’s surface, take place in environments without large-scale circulation, and are more chaotic and unpredictable. Of the two types, organized convection is better understood and more ubiquitous, especially in the tropics. But meteorologists still have much to learn about convective organization to better understand and predict the behavior of storms.
    Here Becker et al. investigated the impact of convective organization on entrainment—a process in which warm, buoyant parcels of air become saturated with moisture; form cumulus clouds; and mix with cooler, drier parcels of air. This causes some cloud droplets to evaporate, cooling down the clouds and making them less buoyant. At this point, either the clouds can dissipate into a popcorn-like formation—which is called unaggregated convection—or other cumulus clouds in the vicinity will pile onto them, forming a larger cloud cluster (aggregated convection).
    Using a numerical model developed by the Max Planck Institute for Meteorology and the German Weather Service, the team created two simulations of convective organization over a 312,000-square-kilometer grid with 1-kilometer spacing. One simulation started out with unaggregated convection that remained unaggregated throughout. The other simulated the same conditions but started out with aggregated convection and stayed fully aggregated throughout the simulation. The team found that in the lower levels of the troposphere, where our weather occurs, the rate of entrainment is higher when convection is aggregated. This is due to increased turbulence caused by updrafts. Meanwhile, more buoyancy is maintained during aggregated convection because of a moist layer, or shell, surrounding the convective cluster. The researchers advise that future modeling of convective organization should include simulations of this moist shell.
    This study sheds light on how convective systems interact with their environments and their behavior (whether by aggregating or dissipating) as they move farther skyward. (Geophysical Research Letters, https://doi.org/10.1002/2017GL076640, 2018) —Sarah Witman, Freelance Writer
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 4
    Publication Date: 2018-03-06
    Description: Joseph B. Walsh. Credit: Benjamin Bennett
    Joseph B. “Joe” Walsh died on 30 August 2017 at the age of 86 in Adamsville, R.I., where he had lived for many years. Joe was well known in the rock mechanics community, although perhaps underappreciated outside it. The influence of his work is, nonetheless, broad and profound. Seismologists who interpret high velocities of compressional waves compared with those of shear waves (high Vp/Vs ratios) as indicators of high pore pressures, oil explorers who recognize oil and gas zones in tomographic images, and geophysicists identifying high permeability and water content from electrical conductivity measurements all rely on Joe’s foundational work. The reason is that these scientists are not so much measuring the properties of the rock as measuring the influence of its cracks. Joe, in a series of classic papers in the 1960s and 1970s, did the fundamental work establishing the profound effect of cracks on the elastic and transport properties of rock. By recognizing this influence, Joe was able to provide, for example, rational explanations for relationships between the constitutive relations for permeability and for electrical resistivity, to predict how increasing effective pressure changes permeability, and to understand the influence of surface roughness on joint transmissivity or the coefficient of friction. Whole fields of study are based on those beginnings.
    Early Career
    Joseph B. Walsh was born in Utica, N.Y., the son of Joseph B. and Ann (née Bowman) Walsh. He grew up in upstate New York before moving to Massachusetts to attend the Massachusetts Institute of Technology (MIT), from which between 1952 and 1958 he received bachelor’s, master’s, and doctor of science degrees in mechanical engineering. His D.Sc. work yielded a paper with Frank McClintock in which they developed what became known as the modified Griffith theory for brittle fracture in compression. After graduating from MIT, Joe spent 2 years in industry, including a stint with a consulting company in Stockholm. This job morphed into a globe-circling trip in a VW bug, ending with Joe in California, substantially poorer financially but much richer in experience. Returning to Massachusetts, he applied his skills in solid mechanics as an engineer at the Woods Hole Oceanographic Institution, where he was responsible for the design of the pressure hull for the pioneering submersible Alvin. Joe joined the Geology and Geophysics (later Earth, Atmospheric and Planetary Sciences) Department at MIT in 1963, beginning a 25-year collaboration with W. F. Brace. It was a very fruitful combination: Joe did the theory, and Bill did the experiments. The Walsh-Brace period was one of rapid development in rock mechanics on many fronts. A host of graduate students and postdocs (including most of us) were trained under their guidance, many of whom went on to productive research careers in academia and industry.
    Theory Grounded in Reality
    Joe’s papers are both succinct and eloquent. He identified the most pertinent elements of each problem and focused his analysis exclusively on those aspects. His chosen tools were pencil, paper, and the fundamental principles of mechanics. Joe arrived at work each day impeccably attired in coat, tie, and slacks. Sitting at his desk, he would dive into his cool mathematical treatments, seemingly abstract but always securely attached to reality. As a result of his training with McClintock, Argon, and others in MIT’s Mechanical Engineering Department, Joe always tested his work with experimental or observational data, thus providing a perfect interface with Brace’s group.
    A Taste for Conversation and Rugby
    Despite his analytical proclivity, Joe was not detached socially. To all who knew him, he was a quiet, unassuming man with a wry sense of humor. He liked people and had catholic tastes in his choice of company. At lunchtime, he could be found at a local diner, talking to machinists, custodians, or academics. In the evening, he might dine at one of several classic private social clubs in the Back Bay. His gentle demeanor gave no clue that in his younger years he was an avid rugby player and the founding director of the U.S. Rugby Foundation. Joe was equally at ease bantering with a surly waitress or securing a large donation for the rugby foundation from an influential industrial magnate.
    A Very Active Retirement
    Joe retired from MIT in 1986 and settled in Rhode Island. In “retirement,” Joe continued to conduct theoretical studies of fluid flow in fractured rocks and of rock friction. He was appointed visiting scholar in the Department of Earth, Environmental and Planetary Sciences at Brown University in 1999, and he continued in that capacity until his passing, having been reappointed only a few months before. Joe eagerly mined experimental rock friction data at Brown, which he used as a starting point for his analyses. Armed with his famous and formidable yellow legal pad and No. 2 pencil, Joe worked closely with experimentalists at Brown and elsewhere to establish a physical basis for rate and state friction laws. In addition to scientific interactions, Joe also educated younger scientists at Brown by his example about the importance of a proper lunch, the joys of poetry (Joe could, and did, recite many poems from memory), and the meaning of savoir faire. Joe continued his scientific work up to the time of his death.
    American Geophysical Union president Ralph Cicerone congratulates Joseph Walsh on being named an AGU Fellow in 1993. Credit: AGU
    Joe’s work often took him overseas: He served as a visiting scientist at the University of Cambridge in England; the University of Edinburgh, Scotland; and the South African Chamber of Mines in Johannesburg. Although his name is not a household word outside the rock mechanics community, Joe has been well recognized for his accomplishments. In 2007, he received the Rock Mechanics Research Award; in 2000 he was honored as a Life Fellow at the University of Cambridge; and in 1993 he was named a Fellow by the American Geophysical Union.
    —Christopher H. Scholz (email: scholz@ldeo.columbia.edu), Lamont-Doherty Earth Observatory, Columbia University, Palisades, N.Y.; David L. Goldsby, Department of Earth and Environmental Science, University of Pennsylvania, Philadelphia; and Yves Bernabé and Brian Evans, Department of Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology, Cambridge
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 5
    Publication Date: 2018-03-06
    Description: Declining water quality in inland and coastal systems has become, and will continue to be, a major environmental, social, and economic problem as human populations increase, agricultural activities expand, and climate change effects on hydrological cycles and extreme events become more pronounced. Providing government and nongovernment groups with timely observations on the time and location of anomalous water quality conditions can lead to more informed decisions about the use, management, and protection of water resources. By observing the color of the water, satellite sensors provide information on the concentrations of the constituents that give rise to these colors. These constituents include chlorophyll a (the primary photosynthetic pigment in phytoplankton), total suspended solids (an indicator of sediments and other insoluble material), and dissolved organic matter. Other environmentally relevant optical characteristics include turbidity and water clarity. Varieties of physical and biological phenomena can be inferred from space.
    This map pinpoints potential biogeochemical hot spots that require further investigations through field sampling: (left) a Landsat 8 true-color image of Boston Harbor, Mass., alongside (right) the corresponding near-surface turbidity map produced via NASA’s Sea-Viewing Wide Field-of-View Sensor (SeaWiFS) Data Analysis System (SeaDAS). Warmer colors represent higher turbidity. The area in the top right of both images is Massachusetts Bay. Credit: NASA Goddard Space Flight Center
    A 1-day workshop at NASA’s Goddard Space Flight Center introduced the concept and potential capabilities of a satellite-based, near-real-time water quality monitoring tool. This tool will complement existing field monitoring programs by automatically alerting water resource and ecosystem managers to potentially hazardous water quality conditions, resulting in more timely and informed decision-making. The workshop brought together more than 340 environmental specialists, economists, scientists, industry representatives, and legal advisors from state and federal agencies and the private sector. The primary requirements that workshop attendees identified for developing this warning system include automated, near-real-time processing of Landsat-Sentinel imagery, the development of robust anomaly detection algorithms, and support for ongoing implementation and calibration and validation efforts. The workshop further aimed to identify the next steps toward making such a near-real-time system a reality with input and guidance from end users.
    The workshop featured a series of short presentations on the perspectives of end users on the potential value of satellite data for water quality monitoring. These presentations covered a broad range of topics, including monitoring harmful algal blooms in California, Utah, Oklahoma, Oregon, and Florida; identifying sites for aquacultures in New England; and concerns about pipeline leaks contaminating waterways. Other talks highlighted the need for improved satellite technology (e.g., hyperspectral missions) with sunglint mitigation strategies in the future to enable more precise and accurate estimations of water quality conditions from space.
    The NASA Goddard team is currently developing a prototype system for select regions (e.g., Florida’s Indian River Lagoon, Lake Mead, and Oregon reservoirs) to evaluate the performance of such an expedited service. The team, in collaboration with water authorities, will initiate algorithm development, prototyping, testing, and implementation of the system. All presentations are available on the meeting’s website. The Water Quality Workshop was sponsored by the NASA Goddard Applied Sciences office. We thank Steve R. Greb, Richard Stumpf, Maria Tzortziou, and Jeremy P. Werdell for serving on the organizing committee.
    —Nima Pahlevan (email: nima.pahlevan@nasa.gov; @nima_pahlevan), NASA, Greenbelt, Md.; also at Science Systems and Applications, Inc., Greenbelt, Md.; Steve G. Ackleson, Naval Research Laboratory, Washington, D.C.; and Blake A. Shaeffer, U.S. Environmental Protection Agency, Research Triangle Park, N.C.
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 6
    Publication Date: 2018-03-06
    Description: Caves may be dark and eerie, but now they are a little less mysterious. That is, at least, when it comes to what happens to the atmospheric methane that enters the caves. Researchers recently completed the most extensive published study to date of methane concentrations in karst caves. Cave environments are a global sink for methane, they assert, but other scientists have voiced some reservations about that claim.
    Karst terrains, such as caves, sinkholes, and other formations pockmarked by internal voids, form when soluble rocks such as limestone erode. The most widespread karst caves form when carbon dioxide dissolves in surface waters, creating carbonic acid that disintegrates the limestone. Caves and other karst environments underlie about 14% of Earth’s land, and it’s not well known how this terrestrial subsurface interacts with the atmosphere, said independent biogeochemist Kevin Webster of Tucson, Ariz., who led the new study and was formerly with the University of Arizona in Tucson.
    Methane is a potent greenhouse gas, and its concentration in the atmosphere is increasing. Scientists are tracing methane’s sources and sinks to help them understand the rate and expected amount of the gas’s accumulation in our planet’s atmosphere. Within the past decade, investigators started measuring methane levels in a few caves and associated karst landscapes. The scientists conducting those initial explorations expected to find higher methane concentrations inside the caves than in the surrounding atmosphere, but they found the opposite, according to Webster.
    Giuseppe Etiope measures the flux of methane from a cave surface in Kentucky. Credit: Agnieszka Drobniak
    To find out if methane depletion is unique to only a handful of caves, Webster and his colleagues recently measured the air inside 33 karst caves in the United States and 3 caves in New Zealand, essentially tripling the known data set. From May 2012 to September 2016, they measured methane concentrations, as well as ratios of stable carbon and hydrogen isotopes in the methane. Supporting measurements provided information on the cave air residence times and mixing processes. Thirty-five, or 97%, of the tested caves had methane concentrations lower than those in the outside atmosphere in at least one measurement location, the researchers report in the 1 March issue of Earth and Planetary Science Letters (EPSL).
    Methane Eaters
    So what’s happening to the methane? Earlier karst cave studies that initially observed the methane depletion proposed two main hypotheses: Methane is oxidized by ions and radicals from the radioactive decay of radon, or methane is consumed by methane-eating microorganisms. However, the researchers doing those studies couldn’t conduct all the measurements needed to distinguish between the two, Webster noted, whereas he and his colleagues made special provisions to collect the essential data. For instance, he and his team report in the EPSL paper, which was first published online on 8 January, that they measured the hydrogen isotope ratios of their methane samples, which had not been done previously in these cave environments, according to Webster. To be able to do so, Webster told Eos, the scientists custom built an inlet for a gas chromatograph–isotope ratio mass spectrometer and used the inlet to preconcentrate the methane. “Preconcentration is necessary because methane is at such low abundance in air” and even less abundant in cave air, Webster explained. Ultimately, the researchers found the isotopic signature of methanotrophic bacteria, which led them to conclude that those bacteria were the cause of the methane loss.
    The scientists’ novel approach also helped them to solve another mystery. It turns out that the team observed at least two sources of methane entering some of the caves: The main source was the atmosphere outside of the cave, but a minor contribution also came from two different microbial biochemistries. “That was very surprising,” said Webster, given that researchers have previously observed such processes co-occurring only in lake or fen environments. David Mattey, an isotope geochemist at Royal Holloway, University of London, in the United Kingdom, saluted the advance of using hydrogen isotopes to fingerprint different methanogenic sources. Mattey, an original proposer of the microbial depletion pathway who was not involved with the new cave study, noted that the findings by Webster and his team confirm his earlier work.
    To eliminate contamination, Arndt Schimmelmann fills a plastic bag with cave air to remove any air from external sources prior to collecting a cave air sample in a glass sampling container. Credit: Agnieszka Drobniak
    Global Sink
    The fact that nearly all of the studied caves consumed methane suggests to Webster that caves in general do the same. “There is nothing unusual about methane consumption in caves or the minor processes of methane production we observed,” he said. On the basis of their measurements, he and his coauthors contend in their paper that karst environments reduce methane’s atmospheric concentrations worldwide. Because the microbial sources of methane were minor, those emissions did little to offset the net loss of methane in the caves. The next step is to figure out how large the losses are, Webster said. Mattey needs more evidence. “You have to be careful claiming that it might have a global impact,” he said, “because it’s very, very difficult to upscale it from just one or two measurements.” He advised more long-term monitoring as well as reaching a better understanding of fluxes and how the caves are connected to the atmosphere.
    —Laura G. Shields (lgshields@gmail.com; @LauraGShields), Science Communication Program Graduate Student, University of California, Santa Cruz
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 7
    American Geophysical Union (AGU)
    Publication Date: 2018-03-06
    Description: Like river canyons, steep-sided submarine channels are effective transportation systems capable of carrying billions of tons of sediment across distances of hundreds of kilometers. Although previous studies have shown that helical (spiraling) flow around meander bends plays an important role in transporting sediment in rivers, a lack of field measurements from deep-ocean turbidity currents has led to competing models describing their motion around curves. To settle this controversy, Azpiroz-Zabala et al. present the first deep-ocean measurements of turbidity currents around a submarine channel bend. Using an acoustic Doppler current profiler anchored downstream of a meander at a depth of 2,000 meters in Congo Canyon, the team acquired the velocity-depth profiles of 10 flows that occurred between December 2009 and March 2010.
    Surprisingly, despite having variable thicknesses ranging from 16 to 75 meters and durations lasting from 8 hours to 10 days, nearly all of the turbidity currents displayed the same helical flow structure. It consisted of two stacked cells rotating in opposite directions, with the bottom cell revolving in the direction opposite to helical flows observed in rivers. These results are consistent with models of other types of stratified flows and support the hypothesis that the same mechanism that forms circulation cells in other geophysical flows (such as rivers and saline flows)—the interaction of competing pressure gradients—also applies to turbidity currents.
    These difficult-to-obtain measurements show that the type of circulation a large-scale flow will exhibit depends upon the extent to which the current is stratified. The resulting helical flow causes the sediment to slosh from side to side, to gather at the inner bend, or to be continuously overturned. In combination with fluid turbulence, these processes keep sediment in suspension across long distances and thus play a crucial role in the ability of turbidity currents to transport enormous amounts of sediment from the continental shelf all the way to the deep-ocean floor. (Geophysical Research Letters, https://doi.org/10.1002/2017GL075721, 2017) —Terri Cook, Freelance Writer
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 8
    Publication Date: 2018-03-06
    Description: A new scorecard that rates members of Congress on how they voted on environmental issues found that 46 Republican senators scored a 0% in 2017. The average score for all Republican senators was 1%, according to the League of Conservation Voters (LCV), a nonprofit environmental group based in Washington, D.C., that released the scorecard on Monday. This is the lowest average score for Republican senators since LCV began tracking this issue in 1970, according to the group. In contrast, 27 Democratic senators earned a 100% on the scorecard, with Democrats averaging 93%. The low Republican average score means that those senators “voted against the environment and public health at every opportunity,” the LCV report states. On the House side, 124 Republicans received a zero, with House GOP members overall receiving an average score of 5%. Among House Democrats, 84 earned a 100% score, and House Democrats overall earned a 94% average score.
    “At the federal level, 2017 was an unmitigated disaster for the environment and public health, with President Trump and his cabinet quickly becoming the most antienvironmental administration in our nation’s history,” Tiernan Sittenfeld, LCV’s senior vice president for government affairs, said at a briefing on Monday to release the report. “The Republican-led Congress repeatedly refused to stand up to President Trump’s extreme antienvironmental agenda and his attacks on our air, water, land, wildlife. This is particularly shameful in a year when climate change–fueled hurricanes and wildfires caused so much devastation. Fortunately, Senate Democrats, led by Sen. Schumer”—the Senate minority leader from New York—“maintained a green firewall of defense to block any egregious events throughout 2017.”
    Votes That Were Counted
    The report graded members of Congress on the basis of specific votes that LCV and other environmental and conservation organizations determined were key indicators. On the Senate side, members were scored on how they voted in 19 instances. These included eight votes to confirm the administration’s cabinet or subcabinet nominees, whom the report labeled as “historically anti-environmental.” Among them was Environmental Protection Agency administrator Scott Pruitt. The report said that Pruitt “has aggressively gutted the agency from the inside.” Other votes counted in the scorecard were the recent tax bill that opens the Arctic National Wildlife Refuge to fossil fuel development and legislation that would threaten drinking water and public lands.
    An environmental voting scorecard issued by the League of Conservation Voters shows a sharp drop in Senate Republican scores between 2016 and 2017. Credit: League of Conservation Voters
    In the Senate, “what this year’s results show is a dramatic crash on the Republican side of the aisle, which is in many respects a very sad testament to what has become of the GOP,” Sen. Sheldon Whitehouse (D-R.I.) said at the briefing. Only 14 Senate Republicans scored zero in 2016 compared with the 46 who did in 2017. House numbers were fairly stable, with 122 House members receiving a zero in 2016 compared with 124 with that score in 2017. Whitehouse said that the low scores for Republicans in both houses “very clearly show a party that has been completely captured by the polluting industries.”
    At least one Republican senator, Thad Cochran of Mississippi, paid little regard to his LCV grade of zero. “As a lifelong Republican, Sen. Cochran tends not to score highly with liberal activist groups,” a spokesperson for the senator told Eos. “Senator Cochran’s career reflects a careful understanding of the importance of protecting and preserving our nation’s natural resources. He has a strong record of making decisions on environmental issues that are in the best interests of Mississippi and our nation, and supporting legislation and policies that promote cooperative conservation programs.”
    Looking at Votes by Climate Caucus Members
    For House members, LCV graded 35 votes on issues related to public lands, climate change, water resources, clean air, deregulation of environmental rules, and other environment-relevant topics.
    The average environmental voting record for House Republicans falls far below that for Democrats, according to the League of Conservation Voters. Credit: League of Conservation Voters
    Republican House members who belong to the bipartisan Climate Solutions Caucus averaged a 16% score, which is more than 3 times higher than the overall Republican average, according to the report. However, the report concluded that members of the caucus, which was founded in 2016 to explore policy options on climate change, need to do more. “Joining the caucus can be an important step, but it’s simply not enough,” the report states. “We need these Republican members to vote for climate action, to lead on real solutions, and to push their colleagues and party leadership to do better.”
    The head of a group that has worked closely with the caucus said the scorecard is valuable but that it does not provide the whole picture about the importance of the caucus. “We think the scorecard plays the essential role of providing pressure on members of Congress to do better on environmental issues, especially climate change. However, we don’t think the scorecard accurately captures the emerging work being done by the caucus to develop bipartisan solutions to climate change,” Mark Reynolds, executive director of the Citizens’ Climate Lobby (CCL), said in a statement provided to Eos. The lobby is a grassroots advocacy organization based in Coronado, Calif., that focuses on national policies to address climate change. A CCL analysis published on Wednesday found that 15 of the 34 caucus members who are Republicans improved their environmental voting scores since joining the caucus, despite the fact that only 5 of the 35 tracked votes are “climate-relevant.” “Much is happening behind the scenes, thanks to the caucus, and we think patience will eventually be rewarded with major legislation to address climate change,” Reynolds added.
    —Randy Showstack (@RandyShowstack), Staff Writer
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 9
    Publication Date: 2018-03-06
    Description: Earth system models contain dozens if not hundreds of parameters that are not easily definable or accurately tunable over most of the Earth, meaning these models do not fulfill their true potential as tools for simulating or predicting weather and climate. Ricciuto et al. (2018) extend the mathematical technique called Polynomial Chaos (PC), a method to determine uncertainty in dynamical systems, with a new Bayesian compressive sensing (BCS) algorithm for applications at very high dimensions like those found in complex Earth system models. Interesting applications, such as sensitivity analysis, parameter optimization, and distribution estimation, become possible. With the help of PC-BCS, sensitivity indices can be computed directly from the expansion coefficients, and the approach is demonstrated to be an effective means of optimizing the land surface parameters of the Energy Exascale Earth System Model (E3SM). (A schematic sketch of the coefficient-based sensitivity idea follows this record.) Citation: Ricciuto, D., Sargsyan, K., & Thornton, P. (2018). The impact of parametric uncertainties on biogeochemistry in the E3SM land model. Journal of Advances in Modeling Earth Systems, 10. https://doi.org/10.1002/2017MS000962 —Paul A. Dirmeyer, Editor, JAMES
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
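    To make the coefficient-based sensitivity idea concrete, here is a minimal, self-contained sketch (ours, not the authors' code): it fits a polynomial chaos surrogate to a hypothetical three-parameter toy model by ordinary least squares (the paper's Bayesian compressive sensing plays the same role while keeping the expansion sparse in high dimensions) and reads first-order Sobol sensitivity indices directly off the squared PC coefficients. The model function and all parameter choices are illustrative assumptions, not values from the paper.
```python
import itertools

import numpy as np
from numpy.polynomial.legendre import legval

rng = np.random.default_rng(0)
dim, degree, n_samples = 3, 6, 400

def toy_model(x):
    # Hypothetical stand-in for an expensive land model: three uncertain
    # parameters, each scaled to [-1, 1].
    return np.sin(np.pi * x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2]

# All multi-indices (a_1, ..., a_dim) with total degree <= `degree`.
alphas = [a for a in itertools.product(range(degree + 1), repeat=dim)
          if sum(a) <= degree]

def phi(n, x):
    # Legendre polynomial of order n, orthonormal w.r.t. the uniform
    # density on [-1, 1].
    coef = np.zeros(n + 1)
    coef[n] = 1.0
    return np.sqrt(2 * n + 1) * legval(x, coef)

def design(X):
    # Tensor-product basis functions evaluated at the sample points.
    return np.column_stack([
        np.prod([phi(a_k, X[:, k]) for k, a_k in enumerate(alpha)], axis=0)
        for alpha in alphas])

X = rng.uniform(-1.0, 1.0, size=(n_samples, dim))
y = toy_model(X)
coeffs, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# With an orthonormal basis, the surrogate's variance is the sum of squared
# coefficients of all non-constant terms; each first-order Sobol index is the
# share contributed by terms that involve only that single input.
total_var = sum(c**2 for c, a in zip(coeffs, alphas) if any(a))
for i in range(dim):
    s_i = sum(c**2 for c, a in zip(coeffs, alphas)
              if a[i] > 0 and all(a[j] == 0 for j in range(dim) if j != i))
    print(f"First-order Sobol index S_{i + 1}: {s_i / total_var:.3f}")
```
    With a sparse solver substituted for the least-squares fit, the same bookkeeping scales to the hundreds of parameters the paper targets; the sensitivity indices come for free once the coefficients are known, with no extra model runs.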
  • 10
    American Geophysical Union (AGU)
    Publication Date: 2018-03-06
    Description: Rooftop gardens. Seedlings sprouting on windowsills. The clucking of chickens in a metropolitan backyard. These and more are small harbingers of the expansion of urban agriculture around the world. More than half of the world’s population lives in cities, a figure that the United Nations expects to increase to 67% by 2050—yet urbanized land makes up just 1% of the Earth’s surface. Because of this, urban planners are working to make cities more resilient, habitable, and adaptable to change. In a new paper, Clinton et al. have developed a framework to estimate the environmental benefits of urban agriculture on a global scale—current and future. The team envisions a scenario in which over the next few decades, cities around the world adopt intensive efforts to expand urban agriculture.
    Using Google Earth Engine, a free platform for processing global satellite data, the researchers analyzed data sets on population, urban landscapes, meteorology, terrain, and food and agriculture. They developed national estimates for the entire globe of ecosystem services provided by urban agriculture, finding that existing vegetation in urban areas provides the equivalent of about $33 billion each year. In more specific terms, the team estimates that urban agriculture, if deployed across all available vacant land, rooftops, and building façades, could produce 100–180 million tons of food, save about 14–15 billion kilowatt hours of energy, sequester 100,000–170,000 tons of nitrogen, and offset roughly 2 trillion cubic feet of storm runoff each year. Projected out, the researchers estimate that dramatically increasing urban agriculture efforts around the globe has the potential to positively influence food production, nitrogen fixation, energy savings, pollination, climate regulation, soil formation, and the biological control of pests, services that are worth, as a whole, as much as $160 billion.
    The team’s findings show that urban agriculture has the ability to improve food security and ecosystem health on a global scale. Although its impacts vary from country to country, the results are promising. This study is a thorough look at the importance of urban agriculture, especially in the face of global climate change and unsustainable urban development practices around the world. (Earth’s Future, https://doi.org/10.1002/2017EF000536, 2018) —Sarah Witman, Freelance Writer
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 11
    Publication Date: 2018-03-06
    Description: In 2017, more than 200 students participated in the Virtual Poster Showcase (VPS), a program of the American Geophysical Union (AGU) that is designed to help students share research without traveling to an in-person conference. This program has been continually advancing in size and scope since it was created in 2013. Four embellishments added to VPS in 2017 helped to scale up the program as well as to add value for the student participants and their faculty and research advisers.
    A year ago, two high school instructors helped pilot the participation of 21 groups of high school juniors and seniors in VPS. Some of these students stated that they expected the experience to strengthen their college applications.
    A university instructor incorporated VPS into the curriculum of her spring graduate course in science communication. This meant that all aspects of the VPS process (abstract submission, poster creation, a video presentation of the student’s project, participation in a peer evaluation component, and responses to feedback from professionals serving as judges) were considered for the students’ final grades in the course.
    As part of a partnership between AGU and the American Geosciences Institute, all 447 VPS abstracts from 2015 onward are now available and searchable within GeoRef, the world’s largest database of geoscience abstracts.
    A geographic information system (GIS) map of all VPS participants, their abstracts, and the location of each lead author’s institution has been created to share the abstracts with the entire Earth and space science community. This GIS visualization is a project between University of Texas at El Paso professor Raed Aldouri and his GIS class. The link will be publicized in the coming weeks after all the 2017 abstracts have been added to the GIS.
    Global Participation
    In 2017, VPS offered spring and fall showcases that have continued to draw students from around the globe. For the third year in a row, the fall showcase attracted U.S. undergraduate students who had completed summer research programs known as Research Experience for Undergraduates (REUs). The National Science Foundation funds those programs nationwide. VPS is not just for those in the United States or just for undergraduate students. Nearly half of the participants in VPS’s 2017 events were graduate students from around the world. Overall, VPS’s non-U.S. participants in 2017 came from 15 countries and five continents.
    Building Student Confidence
    Taking part in the Virtual Poster Showcase continues to increase students’ experience with preparing and presenting research. Nearly four out of five students in the 2017 showcases reported a boost in confidence in their poster preparation and presentation skills. Survey respondents also said that having a VPS abstract will ultimately help them in the next steps of their careers. During the two VPS events in 2017, presentations featured a wide array of research that spanned many subdisciplines within the Earth and space sciences, from environmental degradation caused by zinc smelting to modeling solar wind parameters in the Martian atmosphere. The first-place winners of the spring and fall showcases are listed below. Information about other winners can be found on the Virtual Poster Showcase recognition page.
    Spring 2017 winners:
    Graduate showcase: Babak Jalalzadeh Fard, Northeastern University, “Effective mitigation and adaptation strategies for public health impacts of heatwaves for Brookline, MA”
    Undergraduate showcase: Jacob Smith, Clemson University, “The effect of atmospheric CO2 on the chemical weathering of silicate minerals as measured by cation flux in the vadose zone”
    High school showcase: Hannah Kim, Thomas Jefferson High School for Science and Technology, “The effect of disease resistance on the bacterial community of the fecal microbiome of Crassostrea virginica”
    Fall 2017 winners:
    Graduate showcase: Ruadhan Magee, University of Queensland, Australia, “Magma dynamics recorded in clinopyroxene megacrysts: Investigating the destructive 1669 eruption of Mount Etna”
    Undergraduate showcase: Caitlin Hoeber, San Jose State University, “Spatial and temporal effects on diversity of Monterey Bay’s microbiome”
    Each first-place winner of the graduate and undergraduate showcases will receive a travel grant to attend the 2018 AGU Fall Meeting in Washington, D.C., along with complimentary meeting registration.
    Spring Showcase Open for Abstracts
    The 2018 spring showcase is accepting abstracts until Tuesday, 13 March, 11:59 p.m. Eastern Time. VPS offers an excellent means for students to share their research and get valuable feedback from their peers and professionals in the Earth and space sciences. Register and submit your abstracts today at http://vps.agu.org, or message vps@agu.org to learn more about how you can participate in future showcases.
    To everyone who helped make VPS a continued success this past year, AGU’s Virtual Poster Showcase staff offers its heartfelt thanks. It is only through the generous volunteerism of professionals who sign up as VPS judges and through the VPS program’s collaborations with other professional societies and individuals within the scientific community that the VPS program can continue to strive toward AGU’s mission of advancing Earth and space science.
    —Pranoti M. Asher (email: pasher@agu.org), Manager, Higher Education, AGU; and Nathaniel Janick, Career Services Coordinator, AGU
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 12
    American Geophysical Union (AGU)
    Publication Date: 2018-03-06
    Description: The 2017 hurricane season was especially severe in the Northern Hemisphere. When Hurricane Harvey hit Houston, Texas, in August, so much water fell on the city that the ground sank beneath the weight of it. Hurricane Irma followed closely on its heels, and Puerto Rico is still reeling from Hurricane Maria, which swept through in late September. There is no doubt that swiftly and accurately measuring hurricane conditions to predict their behavior before landfall could safeguard lives—and lifesaving infrastructure. With this goal in mind, Foti et al. demonstrate a new satellite remote sensing technique that uses reflected GPS signals to measure ocean wind speeds during hurricanes.
    The Global Navigation Satellite System–Reflectometry (GNSS-R) technique takes measurements of Earth surface conditions by reading GPS signals after they bounce off Earth’s surface. This method can sense many surface properties like ocean roughness, wind speed, soil moisture, and sea ice. A previous study used this technique to record hurricane winds from an aircraft, but the new research shows that the same technique also works from satellite altitudes to sense ocean surface roughness in hurricane wind conditions. To prove that the technique was sensitive enough to be used from space, the researchers used data from a GNSS Receiver Remote Sensing Instrument mounted onto a satellite 635 kilometers above the ocean. For this study, the scientists used data collected between May 2015 and October 2016, during which time Hurricane Joaquin, Hurricane Jimena, and Typhoon Chan-hom occurred.
    The researchers compared the GNSS-R satellite measurements with data from other sources, including tropical cyclone best track data from the National Oceanic and Atmospheric Administration’s National Centers for Environmental Information; two climate reanalysis products; and a spaceborne scatterometer, a tool that uses microwave radar to measure winds near the surface of the ocean. They found that the GNSS-R satellite data successfully sensed wind speed conditions in each hurricane compared with these other data sources. They also captured rapid changes in wind speed that occur around the eye of the cyclone, which did not appear to be affected by any data loss, like what can be caused by heavy precipitation in other satellite data.
    There are still uncertainties, however. The wind speed algorithm the scientists used was developed for low to moderate winds, below 67 miles per hour (30 meters per second). As a result, the authors conclude that the GNSS-R technique needs further validation at high wind speeds. Although the GNSS-R technique could improve wind speed measurements and hurricane research in the future, the scientists conclude that more work needs to be done to develop the technique and improve the accuracy of the high wind speeds it measures. (Geophysical Research Letters, https://doi.org/10.1002/2017GL076166, 2017) —Alexandra Branscombe
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 13
    Publication Date: 2018-03-06
    Description: A congressional panel yesterday heard testimonies about the impact of and fight against sexual harassment in the sciences. Four women prominent and successful in their fields spoke about the need to reform not just the laws but also a harmful culture that considers such behaviors permissible and fosters systemic inequity. “We talk a lot about getting more women in the sciences,” said Rep. Suzanne Bonamici (D-Ore.), but “we need to be able to keep them there when they get there.” Bonamici sits on the Subcommittee on Research and Technology of the House Committee on Science, Space, and Technology, which held the hearing.
    According to the witnesses, antiharassment policies must grow more comprehensive and include more input from experts; findings and procedures require greater transparency, and violations must provoke tangible consequences. Overall, the entire scientific community, especially those in leadership positions, must strive to change a culture that treats harassment as commonplace, they said. “We cannot afford to lose another brilliant scientist because she did not feel safe in her lab,” said Rep. Daniel Lipinski (D-Ill.), ranking member of the subcommittee.
    Clarity, Transparency, and Informed Policy Making
    No standard harassment policy prevails at American universities and research institutions, nor is there a consistent definition of what actions constitute harassment, some witnesses noted.
    The four witnesses who testified before the House Subcommittee on Research and Technology on 27 February. From left to right are Rhonda Davis, Kathryn Clancy, Kristina Larsen, and Chris McEntee. Credit: House Committee on Science, Space, and Technology
    Kathryn Clancy, an associate professor of anthropology at the University of Illinois at Urbana-Champaign, explained that antiharassment policies need to be explicit about acceptable and unacceptable behaviors, easily accessible to all, and taught as part of standard workplace training. They also need to address the problems actually occurring in that workplace, said Clancy, who conducts research on workplace climate in the sciences. “We need to do a lot more of the hard work, not just slapping on a policy and saying ‘OK, sexual harassment is fixed,’” Clancy said. Scientific institutions should ask themselves, “What is the culture at our organization, and is this the culture that we want?” she added.
    Attorney Kristina Larsen told the subcommittee that many antiharassment policies focus mainly on legality and the potential for litigation, instead of addressing more prevalent, but technically legal, smaller harassments. Larsen represents women and underrepresented minorities in science, technology, engineering, and mathematics (STEM) fields who are facing discrimination, harassment, and retaliation. “Don’t write a zero-tolerance policy until you’re really clear on what you’re not tolerating,” she advised in her testimony. We need to base policies on “the conduct that is actually damaging” to victims and not worry “about whether it is legal or illegal under the law,” she said.
    Fieldwork Amplifies Problems
    Field research conducted far from a formal academic environment increases the need to have clear and explicit ethical policies and codes of conduct, said Chris McEntee, executive director and CEO of the American Geophysical Union, publisher of Eos. “The Earth and space sciences typically involve remote field settings,” she noted in her testimony. “When coupled with a male-dominated environment and power structure, these situations can amplify the problem.”
    Clancy highlighted that field research brings added uncertainty about antiharassment policies. “In field sciences, we found that the majority of our respondents were not aware of a code of conduct or sexual harassment policy for their field site. And [only] a very small number of people, who were actually harassed, even knew what the reporting mechanism was,” she said. Principal investigators, supervisors, and field site directors should develop and enforce implicit and explicit codes of conduct and bear responsibility for them, she added.
    Making Consequences for Harassers Real
    Witnesses and members of Congress at the hearing lauded the National Science Foundation (NSF) for its 8 February decision requiring grant-seeking universities to maintain clear antiharassment policies and to report policy violations to NSF. “No taxpayer dollars should be awarded to a university researcher who engages in harassment and inappropriate behavior toward a colleague or student under their charge,” Rep. Lamar Smith (R-Texas), who chairs the House Committee on Science, Space, and Technology, said during yesterday’s hearing.
    Subcommittee chairwoman Barbara Comstock (left) speaks with witness Rhonda Davis of NSF (right) after the hearing. Credit: Kimberly M. S. Cartier
    Rhonda Davis, head of NSF’s Office of Diversity and Inclusion, who also testified at the hearing, noted that NSF’s new guidelines were prompted by the fact that American universities do not have a universal ethics policy regarding sexual or other types of harassment or any requirement for universities to develop such policies beyond the scope of federal protections, like Title IX. Consistent and visible enforcement of antiharassment policy will help mitigate the harassment “epidemic,” said Clancy, citing her research. “Across workplaces, it’s consistent that if you have consequences…you do see less harassment in those workplaces,” she explained.
    The fear of backlash for reporting harassment falls on the targets of harassment, not the harassers, said McEntee, who encouraged sanctions against harassers for violating ethics policies. “People don’t change because they feel the light; people change because they feel the heat,” said Larsen. “And there is no heat in academics….We have a problem with enforcement.” Davis said that NSF’s new policy includes independent and anonymous avenues for anyone, including students, to report harassment directly to NSF, which may reduce the fear of backlash.
    Culture Change Needed
    All of the witnesses called for culture change in the scientific community, where, they said, harassment is allowed to persist and is deemed tolerable. “Let’s move away from a culture of compliance and towards a culture of change,” Clancy said, by “focusing on the behaviors we want to see.” Clancy and McEntee called for more well informed training in how to recognize harassing and harmful behavior and how to safely defuse a situation from the outside. This type of bystander intervention, especially from those in leadership positions, they explained, would have a twofold effect: first, showing the harasser that such behavior is not acceptable or tolerated in the workplace and, second, demonstrating that vulnerable persons are visible, heard, and supported by those with the power to effect change.
    Speaking directly to victims of sexual harassment, Clancy added, “I see you, and I think of you, and I thank you for getting up every day, and I derive strength from you. I hope you know how much you mean to those of us who do this work.”
    —Kimberly M. S. Cartier (@AstroKimCartier), News Writing and Production Intern
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
  • 14
    Publication Date: 2018-03-06
Description: New details released yesterday about the administration's $7.47 billion funding request for the National Science Foundation (NSF) for fiscal year (FY) 2019 show that proposed budgets related to the geosciences are slated for two of the three biggest monetary and percentage increases among the agency's major funding accounts. The Directorate for Geosciences (GEO) would receive $853 million, a 3.3% increase above the FY 2017 spending level. The Office of Polar Programs, which operates as part of GEO, would get a 14.3% boost, bringing its requested funding to $534.5 million. This assumes that Congress goes along with the White House's request. Only the budget for NSF's Office of Integrative Activities fares better among the agency's major accounts. The office's $536.7 million requested budget, a 27.7% increase, includes funding for midscale research infrastructure and for several new areas known as "convergence accelerators," which are initiatives to leverage resources across the agency to support innovative science. This information (see Table 1) updates earlier budget materials released by NSF on 12 February. The budget request "would allow NSF to build on the important work done by our directorates within individual fields by encouraging convergence among different disciplines in science and engineering and collaboration with partners in different sectors," NSF director France Córdova said in a statement. "Investments that incorporate such an approach will accelerate U.S. innovation."
Table 1. National Science Foundation's FY 2019 Budget Request to Congress(a)
Account | FY 2017 Actual(b) | FY 2019 Request(b) | Change(b) | Percentage Change
Research and Related Activities | 6,006.5 | 6,150.7 | 144.2 | 2.4
Geosciences (GEO) | 825.6 | 853.0 | 27.4 | 3.3
Office of Polar Programs (OPP) | 467.9 | 534.5 | 66.7 | 14.3
Biological Sciences (BIO) | 742.2 | 738.2 | −4.1 | −0.5
Computer and Information Science and Engineering (CISE) | 935.9 | 925.4 | −10.5 | −1.1
Engineering (ENG) | 930.9 | 921.4 | −9.5 | −1.0
Mathematical and Physical Sciences (MPS) | 1,362.4 | 1,345.3 | −17.1 | −1.3
Social, Behavioral, and Economic Sciences (SBE) | 270.9 | 246.2 | −24.7 | −9.1
Office of International Science and Engineering (OISE) | 49.0 | 48.5 | −0.5 | −0.9
Integrative Activities (IA) | 420.3 | 536.7 | 116.5 | 27.7
U.S. Arctic Research Commission | 1.4 | 1.4 | 0.0 | −0.7
Education and Human Resources | 873.4 | 873.4 | 0.0 | 0.0
Major Research Equipment and Facilities Construction | 222.8 | 94.7 | −128.1 | −57.5
Agency Operations and Award Management | 382.1 | 333.6 | −48.4 | −12.7
National Science Board | 4.3 | 4.3 | 0.1 | 1.2
Office of Inspector General | 15.1 | 15.4 | 0.3 | 1.7
Total, NSF | 7,504.1 | 7,472.0 | −32.1 | −0.4
(a) Sources: "National Science Foundation FY 2019 Budget Request to Congress" and "Summary Table." (b) Values in millions of U.S. dollars, rounded to the nearest $0.1 million.
A Look at the Geosciences Budget
In the FY 2019 request, funding for the geosciences does well overall, although some divisions within GEO decline about 6% (see Table 2). "GEO supports the [budget] request and is confident that it will allow significant advancement of knowledge about how the Earth works" and in other specific programs, GEO directorate head William Easterling told Eos. However, "with an overall NSF budget that is flat from FY17, and with the priorities of investing in [other NSF initiatives], some reductions in other areas are inevitable."
As part of a whopping 37.4% increase, GEO's Division of Integrative and Collaborative Education and Research would receive $30 million to fund the Navigating the New Arctic (NNA) program. NNA would establish an observing network "to document and understand the Arctic's rapid biological, physical, chemical, and social changes," according to NSF budget documents. NSF calls NNA a "big idea." In all, the agency's proposed budget includes funding for its 10 big ideas, which are programs at the frontiers of science and engineering that the agency has selected for investment. In addition to NNA, GEO programs contribute to several other big ideas funded by other agency divisions: Harnessing the Data Revolution for 21st Century Science and Education (funded at $30 million), Understanding the Rules of Life ($30 million), and INCLUDES (Inclusion across the Nation of Communities of Learners of Underrepresented Discoverers in Engineering and Science; $20 million). "Virtually all areas of research supported by GEO are needed to advance the Big Ideas (NNA and others) and so we see broad opportunities for the Geosciences community within those investments," Easterling told Eos.
Table 2. Directorate for Geosciences Budget Request for FY 2019(a)
Division | FY 2017 Actual(b) | Revised FY 2019 Request(b) | Change(b) | Percentage Change
Atmospheric and Geospace Sciences (AGS) | 253 | 239 | −14 | −5.6
Earth Sciences (EAR) | 179 | 169 | −10 | −5.5
Integrative and Collaborative Education and Research (ICER) | 76 | 105 | 29 | 37.4
Ocean Sciences (OCE) | 317 | 340 | 23 | 7.2
Total | 826 | 853 | 27 | 3.3
(a) Sources: "National Science Foundation FY 2019 Budget Request to Congress" and "Directorate for Geosciences." (b) Values in millions of U.S. dollars, rounded to the nearest million.
In a budget proposed to go up 7.2%, GEO's Division of Ocean Sciences (OCE) would receive $174.8 million for infrastructure and $40 million for the Ocean Observatories Initiative. The ocean sciences also will benefit from a $28.7 million request for the $255.6 million Regional Class Research Vessel project, which is included in NSF's Major Research Equipment and Facilities Construction funding account. However, the envisioned FY 2019 funding supports construction of only two of the three regional class research vessels for which Congress appropriated money in FY 2017. "In FY 2017, [Public Law] 115-31 appropriated $121.88 million in funding to facilitate the planning and construction of three vessels. In the context of the President's overall fiscal goals intended to maintain spending restraint, this Budget Request supports construction of the two vessels," NSF budget documents note. Easterling expressed satisfaction with the proposed GEO budget despite some expected belt-tightening in certain areas. The proposal "will allow investments in NSF's Big Ideas to begin, and it will allow important investments in research infrastructure through the start of modernization of McMurdo Station and through commitment to a second Regional Class Research Vessel for modernizing the academic fleet," Easterling said.
Polar Funding
For its Office of Polar Programs (OPP), the NSF funding proposal includes $420.2 million for infrastructure, an increase of 21.3%. That includes $103.7 million in FY 2019 for the Antarctic Infrastructure Modernization for Science (AIMS) construction project. In the budget document, the agency describes the $355 million project, which will modernize major facilities at McMurdo Station, as "a necessity for maintaining U.S. scientific and geopolitical eminence across the continent of Antarctica." OPP director Kelly Falkner told Eos that "we would be delighted to take [on] the long-needed, major overhaul of McMurdo Station to set the [U.S. Antarctic Program] on a more robust, sustainable pathway." —Randy Showstack (@RandyShowstack), Staff Writer
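To make the table arithmetic concrete, the percentage change for each account is (FY 2019 request − FY 2017 actual) / FY 2017 actual × 100. A minimal Python sketch using figures transcribed from Table 1; small discrepancies against the table's change column reflect its rounding to the nearest $0.1 million:

    # Recompute Table 1's change columns from the actual and requested amounts.
    # Figures are in millions of USD, transcribed from the table above.
    accounts = {
        "Geosciences (GEO)": (825.6, 853.0),
        "Office of Polar Programs (OPP)": (467.9, 534.5),
        "Integrative Activities (IA)": (420.3, 536.7),
        "Total, NSF": (7504.1, 7472.0),
    }
    for name, (fy2017, fy2019) in accounts.items():
        change = fy2019 - fy2017
        pct = 100.0 * change / fy2017
        print(f"{name}: {change:+.1f} ({pct:+.1f}%)")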
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 15
    Publication Date: 2018-03-06
Description: Jules Verne's adventure novels Five Weeks in a Balloon and Around the World in 80 Days highlighted some of the great technological advances of the late 19th century that revolutionized travel and captured the imagination of the public [Verne, 1863, 1873]. Among those inspired by the novels was Nellie Bly, an American journalist for the New York World, who set off in November 1889 to complete a journey by rail and steamship, following Verne's imagined path around the world in a record 72 days [Bly, 1890] (Figure 1).
Fig. 1. In 1889–1890, real-life New York World reporter Nellie Bly completed Jules Verne's imagined path (shown here) around the world in slightly less than Verne's "80 days." Neither Bly's journey nor Verne's Around the World in 80 Days actually involved balloon travel, but Verne's book drew on his previous novel Five Weeks in a Balloon. The earlier novel inspired the idea of incorporating balloon travel for one leg of the trip in the 1956 movie Around the World in 80 Days, which has become a beloved misconception about Verne's later book. Credit: Roke/Wikimedia Commons CC BY-SA 3.0
Bly's accounts demonstrated how new technology, such as the transcountry railroads in the United States and India and the Suez Canal, brought exotic destinations within reach. The revolutionary development of submarine cables and the electric telegraph allowed Bly to keep her editors, and the larger connected world, aware of her progress in near-real time. The France-U.S. collaborative Stratéole 2 project is planning its own series of balloon trips, which will circle the world near the equator for 80 days (more or less), as did these fictional and factual 19th century adventurers, demonstrating new technology and sending new observations from the voyage back via satellite.
Drifting with the Winds
Scientists with the Stratéole 2 project will release superpressure balloons, designed to drift in the lower stratosphere, from the Seychelles islands in the Indian Ocean (Figure 2). Superpressure balloons contain a fixed amount of helium sealed inside an envelope that does not stretch. This type of balloon is not fully inflated when it is launched, but it expands to its full volume as it rises to an altitude where the gas density inside the balloon matches the density of the surrounding air, and there it drifts with the wind.
Fig. 2. Early test flights of the French National Center for Space Studies superpressure balloon system during February–May 2010 followed a tropical route. The flight durations of the three balloons were 92, 78, and (yes!) 80 days. The traces of the balloon paths show some wave structure, and the balloon paths reversed direction when the quasi-biennial oscillation, a periodic east–west oscillating feature in tropical lower stratospheric winds, changed phase. Credit: A. Hertzog
Each balloon will carry as many as four instruments. As the balloons collect high-accuracy measurements of meteorological variables, chemical tracers, clouds, and aerosols, their horizontal motions are nearly identical to those of the surrounding air mass. These measurements will advance our knowledge and understanding of cirrus clouds, aerosols, and equatorial waves in the tropical tropopause layer (TTL; the transition region between the troposphere and the stratosphere) and in the lower stratosphere.
A fully inflated superpressure balloon in the lab at the French National Center for Space Studies (CNES). Credit: Philippe Cocquerez, CNES
The Stratéole 2 research program will begin with a five-balloon technology validation campaign in Northern Hemisphere (boreal) fall–winter 2018–2019, followed by 20 balloon flights in boreal fall–winter 2020–2021. In the second campaign, 10 balloons will fly at an altitude near 20 kilometers, just above the TTL, and another 10 will fly near 18 kilometers, within the TTL. From past experience, we expect each balloon to fly for more than 2 months; typically, a balloon will fly for about 84 days before chaotic atmospheric motions or interactions with Rossby waves push it outside of the deep tropics. A final 20-balloon campaign in 2023–2024 will drift in the opposite direction because of the shifting phase of the quasi-biennial oscillation (QBO), a dominant, periodic east–west oscillating feature in tropical lower stratospheric winds.
Challenges Aloft
The Stratéole 2 campaign targets the TTL, the primary entry point for tropospheric air into the stratosphere. As air slowly ascends across the TTL, the coldest temperatures encountered at the cold point tropopause (CPT) freeze water vapor into ice crystals. The formation of ice crystals dehydrates the air and regulates the amount of humidity reaching the global stratosphere, giving the TTL an outsized importance considering its geographic extent. The ice crystals form thin cirrus clouds, which have a global impact on the balance between incoming solar radiation and radiation reflected back into space at tropical latitudes. Water vapor and cirrus feedbacks are extremely important in climate system models, yet the underlying processes that control the formation and sublimation (direct conversion of ice crystals to water vapor) of these clouds remain strongly debated. These processes involve the interplay of deep convection, microphysics, aerosols, wave-induced temperature variations with timescales ranging from minutes to weeks, and the balance of forces driving large-scale slow ascent of air in the tropics. The superposition of wave-induced fluctuations on the average upwelling motion forces the temperatures in the TTL to extreme values at the CPT—less than −94°C at times and well below those expected from radiative equilibrium. These same waves also drive the QBO, which has an important long-range indirect influence on high-latitude seasonal forecasts. The waves, generated by convection below, transport momentum vertically across the TTL and drive QBO wind variations as the momentum dissipates in the stratosphere. Satellite and in situ observations can track the wind reversals of the QBO, but most general circulation models cannot replicate the QBO using current methods. This shortcoming is due to a combination of inadequate spatial resolution and a lack of small-scale wave drag applied at the subgrid scale. Even when models do simulate the QBO, doubts remain about the contribution from various families of waves with different scales and frequencies. As a result, even models that internally generate a QBO were unable to forecast the anomalous disruption of the oscillation that occurred in February 2016 [Osprey et al., 2016].
Science Objectives
This superpressure balloon, shown here at launch, is not fully inflated. As it rises, the volume of helium sealed inside increases until the spherical balloon is fully inflated, giving the balloon a fixed density. Once the balloon has reached the atmospheric level where the air has the same density, it drifts with the wind, providing accurate wind measurements. Credit: Philippe Cocquerez, CNES
The overarching objectives of Stratéole 2 are to explore processes that control the transfer of trace gases and momentum between the equatorial upper troposphere and lower stratosphere. The instruments will provide fine-scale measurements of water vapor, temperature, and aerosol/ice at the balloon gondola and also within several kilometers below flight level, documenting air composition and investigating the formation of cirrus in the upper TTL. The balloons also provide unique measurements of equatorial waves over the full spectrum, from high-frequency buoyancy waves to planetary-scale equatorial waves, providing information needed to improve the representation of these waves in climate models. Stratéole 2 balloons will sample the whole equatorial band from 20°S to 15°N, thus complementing the widespread (but limited-resolution) spaceborne observations and the high-resolution (but geographically restricted) airborne and ground-based measurements from previous field missions. Past balloon campaign measurements sampling the Antarctic stratospheric vortex [see Podglajen et al., 2016] have been used to make accurate estimates of wave momentum fluxes as well as to explain springtime stratospheric ozone loss rates; we expect similar successes with our current campaigns.
Other Stratéole 2 science objectives include contributions to operational meteorology and satellite validation. Wind analyses and forecasts have notably large errors in the tropics because sparse tropical wind measurements cannot be modeled in a straightforward way through their dynamical relation to temperature, as they are at higher latitudes. Thus, reducing these errors requires a higher density of measurements. Stratéole 2 balloon flights will address this data shortage by providing unprecedented, accurate wind observations in the equatorial regions of the upper troposphere and lower stratosphere. In particular, the project will collect measurements over oceanic areas that are otherwise devoid of any stratospheric wind measurements. The data will also contribute to the validation of Atmospheric Dynamics Mission Aeolus (ADM-Aeolus) wind products. An innovative European Space Agency mission due to be launched in September 2018, ADM-Aeolus is designed to perform the first spaceborne wind lidar measurements, providing unprecedented global coverage.
The ensemble of Stratéole 2 instrumentation includes in situ measurements of pressure, temperature, and winds every 30 seconds and less frequently sampled observations of ozone, aerosols, water vapor, and carbon dioxide, plus remotely sensed cloud structure from microlidar and directional radiative fluxes. Instruments providing profiles will include GPS radio occultation receivers that measure temperature profiles to the side of the balloons. Novel reel-down devices suspended as far as 2 kilometers directly below the balloons will also provide profiles to explore the fine-scale distribution of temperature, aerosol/ice, and humidity. Capturing temperature variations in high-resolution profiles from this unique balloon platform will provide new insight into equatorial wave processes. Measuring ozone in combination with water vapor and carbon dioxide enables us to discover correlations among these tracers that describe transport processes at the top of the TTL, including convective overshoots that rapidly transport air from the surface into the TTL.
Data Dissemination
The Stratéole 2 data policy complies with World Meteorological Organization (WMO) Resolution 40 (WMO Cg-XII) on the policy and practice for the exchange of meteorological and related data and products. Within 12 months of the end of each balloon campaign, the Stratéole 2 data set will be freely available to the scientific community through the Stratéole 2 Data Archive Center (S2DAC), which is scheduled to launch its website in July 2018. S2DAC will collect and make available the balloon observations and associated ground-based and satellite data, reanalyses, and model outputs. S2DAC includes a primary, full repository at the Dynamic Meteorology Laboratory (LMD) in France and a secondary mirror site at the Laboratory for Atmospheric and Space Physics (LASP) in Boulder, Colo., in the United States. In addition, during the balloon campaigns, a subset of the Stratéole 2 data set, specifically flight-level winds, will be disseminated on the Global Telecommunication System for assimilation in numerical weather prediction systems. We invite and encourage the use of Stratéole 2 data by the broader scientific community, and potential users can watch for future campaign updates on the project website.
Up, Up, and Away
In the spirit of Verne's imagined use of new technologies and Bly's real-world application of those technologies to explore the world, the Stratéole 2 campaign will scientifically explore the tropical tropopause and lower stratosphere from a long-duration superpressure balloon platform. The use of multiple balloons will permit extensive exploration of the finely layered features and unique processes occurring in this remote part of the atmosphere. With the involvement of the broader scientific community, analyses of the Stratéole 2 measurements hold promise to provide a new and deeper understanding of these processes and the connections of this region to global chemistry, dynamics, and climate variability.
Acknowledgments
Major funding for the Stratéole 2 campaign is provided by France's National Center for Space Studies (CNES) and National Center for Scientific Research (CNRS), as well as the U.S. National Science Foundation (NSF).
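The float-altitude principle described above (a superpressure balloon rises until the ambient air density equals its fixed system density, total mass over fixed envelope volume) can be illustrated with a back-of-envelope Python sketch. The exponential density profile is a standard approximation, and the mass and volume figures are hypothetical, not actual Stratéole 2 specifications:

    import math

    # Hypothetical balloon: the system density, not these exact numbers, is what matters.
    m_total = 45.0      # kg, envelope + gondola + helium (assumed)
    volume = 500.0      # m^3, fixed envelope volume once fully inflated (assumed)
    rho_balloon = m_total / volume            # kg/m^3, fixed system density

    rho0 = 1.225        # kg/m^3, sea-level air density
    H = 7000.0          # m, approximate density scale height of the atmosphere

    # Idealized profile rho(z) = rho0 * exp(-z/H); solve rho(z) = rho_balloon for z.
    z_float = H * math.log(rho0 / rho_balloon)
    print(f"system density {rho_balloon:.3f} kg/m^3 -> float altitude ~{z_float/1000:.0f} km")

With these assumed numbers the balloon floats near 18 kilometers, in the TTL altitude range targeted by the campaign.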
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 16
    Publication Date: 2018-03-06
Description: Many of the natural disasters that make the news headlines are related to extreme or unusual weather events. In an open-access article recently published in Reviews of Geophysics, Steptoe et al. [2018] examine extreme atmospheric hazards affecting different countries and regions around the world, and their connections with the global climate system. The editor asked the authors to explain more about these hazards and describe how scientific insights can be used by governments, communities, and corporations involved in disaster risk reduction.
What do you mean by "extreme atmospheric hazards"?
Extreme atmospheric hazards are high impact weather events, typically judged by human or financial losses, caused by processes occurring in the Earth's atmosphere. The atmospheric processes responsible for extreme events are themselves often influenced by some other large-scale component of the Earth's atmosphere-ocean system, such as ocean-wide changes to sea-surface temperatures.
Why is it important to understand regional extreme atmospheric events in the wider context of large scale atmosphere-ocean processes?
In atmospheric science, the links that connect large scale changes in the atmosphere or ocean (such as widespread changes in temperature or humidity in an ocean basin) with localized hazards relating to regional weather conditions (such as extremes of rainfall or temperature) are collectively referred to as teleconnections. Most local extreme events may be related to temporal changes in the large scale dynamics of the climate system. Large scale changes are predicted by weather and climate models more skillfully than local extremes, so understanding the link is vital to understanding impacts. There are many different kinds of teleconnection, typically named after the geographic location in which they are observed. Because any one teleconnection may influence weather conditions in multiple remote locations, understanding the interplay between regional extremes and teleconnections helps us to understand how different extreme hazards occurring in widely separate locations can have a common origin. In our review, we examined 16 different regional hazards and their interplay with eight different teleconnections.
Connections between 10 key drivers of global weather and climate (acronyms around left of circle in bold) and 16 regional extreme weather events (right). Credit: Steptoe et al., 2018, Figure 3
Can you give a specific example of a regional atmospheric hazard and its connection to global teleconnections?
In our review, we find that rainfall over China shares the most connections with global drivers. We summarized academic papers that have identified links to six teleconnections, including large scale atmosphere-ocean processes in both Northern and Southern Hemispheres. The regional hazard with the strongest single linkage to a teleconnection is windstorms over Europe, through their connection to the North Atlantic Oscillation (NAO). The NAO describes a varying pattern in surface pressure across the North Atlantic. For European windstorms, the NAO pattern has a strong steering effect on winds high in the atmosphere, which in turn influences the path stormy weather takes as it approaches Europe.
Which is the most significant process that influences multiple hazards across different regions at the same time?
Our investigation finds that the El Niño–Southern Oscillation (ENSO) influences 15 regional hazards. ENSO describes variations in sea-surface temperatures in the equatorial Pacific. In some cases, this connection is relatively well understood (for example, the way it influences rainfall over South Africa), and in other cases work is still being carried out to better understand the connection (such as its influence on Mexican rainfall).
How does a scientific understanding of these teleconnections help to understand the risks and prepare for extreme events?
Extreme events are the occasions that pose the greatest risk to communities and livelihoods. Hence, understanding the sorts of climatic situations in which extreme events are more likely to happen represents one important facet of disaster risk management. By understanding the teleconnections and their associated hazards, it becomes possible to develop mitigation methods tailored to, and in advance of, potential risks. For example, rainfall in South and Southeast Asia is driven by connections with the Indian Ocean Dipole (IOD) and ENSO. Understanding this complex relationship may offer a predictive insight into rainfall and potential hazards, such as flood or drought, for the coming season. This predictive insight is one way the scientific community can enable advance planning to mitigate potential risks.
How may these insights influence organizations to better plan for, and respond to, multi-hazard risks?
Atmospheric science makes an important contribution to understanding hazard risk in areas such as California, which is susceptible to wildfires. Credit: U.S. Forest Service
International policies reflect the growing understanding of atmospheric hazards and their interconnectivity. Throughout the UN Sendai Framework for Disaster Risk Reduction 2015–2030, multi-hazard resilience is a consistent theme, reflected in guidance towards "inclusive and risk-informed" decision making and in the context of managing disaster risk effectively. In practice, these insights have contributed to multi-hazard approaches being adopted in early warning systems across the globe. The Regional Integrated Multi-Hazard Early Warning System for Africa and Asia (RIMES) provides monitoring and data services to local tsunami centers and national meteorological services, as well as partnering with research organizations on projects implementing early warning systems in-country, such as early flood warning in Bangladesh. For private sector groups, such as the insurance industry, knowledge of the relationship between teleconnections and hazards can be vitally important when underwriting exposure, as it may increase their risk of multi-hazard losses across different regions.
—Hamish Steptoe, Met Office, UK; email: hamish.steptoe@metoffice.gov.uk; Sarah Jones, JBA Risk Management, UK; and Helen Fox, Office for National Statistics, UK
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 17
    Publication Date: 2018-03-06
Description: Environmental remediation efforts in low- and middle-income countries have yet to be evaluated for their cost effectiveness. To address this gap, we calculate a cost per Disability Adjusted Life Year (DALY) averted following the environmental remediation of the former lead smelter and adjoining residential areas in Paraiso de Dios, Haina, the Dominican Republic, executed from 2009 to 2010. The remediation lowered surface soil lead concentrations to below 100 mg/kg and measured geometric mean blood lead levels (BLLs) from 20.6 μg/dL to 5.34 μg/dL. Because BLLs for the entire impacted population were not available, we use environmental data to calculate the resulting disease burden. We find that before the intervention, 176 people were exposed to elevated environmental lead levels at Paraiso de Dios, resulting in mean BLLs of 24.97 μg/dL (95% CI: 24.45–25.5) in children (0–7 years old) and 13.98 μg/dL (95% CI: 13.03–15) in adults. We calculate that without the intervention these exposures would have resulted in 133 to 1,096 DALYs and that all of these were averted at a cost of USD 392 to 3,238 per DALY, depending on the assumptions made. We use a societal perspective, meaning that we include all costs regardless of by whom they were incurred, and estimate costs in 2009 USD. Lead remediation in low- and middle-income countries is cost effective according to World Health Organization thresholds. Further research is required to compare the approach detailed here with other public health interventions.
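For concreteness, the metric reported above is simply total intervention cost divided by DALYs averted. In this minimal Python sketch the total cost of roughly USD 430,000 is back-calculated from the abstract's own ranges and is used purely for illustration, not as a figure reported in the study:

    # cost per DALY averted = total intervention cost / DALYs averted
    total_cost_usd = 430_000                # hypothetical total cost (2009 USD)
    dalys_low, dalys_high = 133, 1096       # range of DALYs averted, from the abstract

    worst_case = total_cost_usd / dalys_low    # fewest DALYs averted -> highest cost
    best_case = total_cost_usd / dalys_high    # most DALYs averted -> lowest cost
    print(f"USD {best_case:,.0f} to {worst_case:,.0f} per DALY averted")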
    Electronic ISSN: 2471-1403
    Topics: Geosciences , Agriculture, Forestry, Horticulture, Fishery, Domestic Science, Nutrition , Medicine
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 18
    Publication Date: 2018-03-06
Description: Rotavirus is the most common cause of diarrheal disease among children under 5. In South Asia especially, rotavirus remains the leading cause of mortality in children due to diarrhea. As climatic extremes and safe water availability significantly influence diarrheal disease impacts in human populations, hydroclimatic information can be a potential tool for disease preparedness. In this study, we conducted a multivariate temporal and spatial assessment of 34 climate indices calculated from ground and satellite Earth observations to examine the role of temperature and rainfall extremes in the seasonality of rotavirus transmission in Bangladesh. We extracted rainfall data from the Global Precipitation Measurement mission and temperature data from the Moderate Resolution Imaging Spectroradiometer sensors to validate the analyses and explore the potential of a satellite-based seasonal forecasting model. Our analyses found that the number of rainy days and nighttime temperatures in the range from 16°C to 21°C are particularly influential for the winter transmission cycle of rotavirus. A lower number of wet days, together with suitably cold temperatures persisting for an extended time, accelerates the onset and intensity of the outbreaks. Temporal analysis over Dhaka also suggested that waterlogging during monsoon precipitation influences rotavirus outbreaks during the summer transmission cycle. The proposed model includes lag components, which allowed us to forecast disease outbreaks 1 to 2 months in advance. The satellite data-driven forecasts also effectively captured the increased vulnerability of the dry-cold regions of the country compared to the wet-warm regions.
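The forecasting idea described above can be sketched as a lagged regression: this month's case count is predicted from hydroclimatic indices observed one and two months earlier, so the forecast leads the outbreak by 1 to 2 months. The Python sketch below uses entirely synthetic data and made-up coefficients, not the study's model:

    import numpy as np

    rng = np.random.default_rng(0)
    months = 120
    rainy_days = rng.poisson(8, months).astype(float)                  # hypothetical index
    night_temp = 18 + 3 * np.sin(np.arange(months) * 2 * np.pi / 12)   # hypothetical index

    # Predictors for month t: rainy days at t-2 and nighttime temperature at t-1.
    X = np.column_stack([
        np.ones(months - 2),
        rainy_days[:-2],     # 2-month lag
        night_temp[1:-1],    # 1-month lag
    ])
    cases = (50 - 2.0 * rainy_days[:-2]
             - 1.5 * np.abs(night_temp[1:-1] - 18.5)
             + rng.normal(0, 3, months - 2))   # synthetic observed case counts

    beta, *_ = np.linalg.lstsq(X, cases, rcond=None)
    forecast = beta @ [1.0, rainy_days[-2], night_temp[-1]]   # next month's forecast
    print(f"forecast for next month: {forecast:.0f} cases")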
    Electronic ISSN: 2471-1403
    Topics: Geosciences , Agriculture, Forestry, Horticulture, Fishery, Domestic Science, Nutrition , Medicine
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 19
    Publication Date: 2018-03-06
Description: New dam construction is known to exacerbate malaria transmission in Africa, as the vectors of malaria—Anopheles mosquitoes—use bodies of water as breeding sites. The precise environmental mechanisms by which reservoirs exacerbate malaria transmission are yet to be identified; understanding these mechanisms should lead to a better assessment of the impacts of dam construction and to new prevention strategies. Combining extensive multi-year field surveys around the Koka Reservoir in Ethiopia with rigorous model development and simulation studies, we examined the environmental mechanisms of malaria transmission around the reservoir. HYDREMATS, the most comprehensive and detailed malaria transmission model, was applied to a village adjacent to the reservoir. The dynamics of malaria transmission are shaped significantly by the wind profile, marginal pools, temperature, and shoreline location. Wind speed and wind direction influence Anopheles populations and malaria transmission during the major and secondary mosquito seasons. During the secondary mosquito season, a noticeable influence was also attributed to marginal pools. Temperature was found to play an important role, not so much in Anopheles population dynamics as in malaria transmission dynamics. Changes in shoreline location drive malaria transmission dynamics, with shorelines closer to the village making transmission more likely. The identified environmental mechanisms help in predicting malaria transmission seasons and in developing village relocation strategies upon dam construction to minimize the risk of malaria.
    Electronic ISSN: 2471-1403
    Topics: Geosciences , Agriculture, Forestry, Horticulture, Fishery, Domestic Science, Nutrition , Medicine
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 20
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    In: GeoHealth
    Publication Date: 2018-03-06
    Description: No abstract is available for this article.
    Electronic ISSN: 2471-1403
    Topics: Geosciences , Agriculture, Forestry, Horticulture, Fishery, Domestic Science, Nutrition , Medicine
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 21
    Publication Date: 2018-03-06
Description: Valley fever is endemic to the southwestern United States. Humans contract this fungal disease by inhaling spores of Coccidioides spp. Changes in the environment can influence the abundance and dispersal of Coccidioides spp., causing fluctuations in valley fever incidence. We combined county-level case records from state health agencies to create a regional valley fever database for the southwestern United States, including Arizona, California, Nevada, New Mexico, and Utah. We used this data set to explore how environmental factors influenced the spatial pattern and temporal dynamics of valley fever incidence during 2000–2015. We compiled climate and environmental geospatial data sets from multiple sources to compare with valley fever incidence. These variables included air temperature, precipitation, soil moisture, surface dust concentration, normalized difference vegetation index, and cropland area. We found that valley fever incidence was greater in areas with warmer air temperatures and drier soils. The mean annual cycle of incidence varied throughout the southwestern United States and peaked following periods of low precipitation and soil moisture. From year to year, however, autumn incidence was higher following cooler, wetter, and more productive springs in the San Joaquin Valley of California. In south-central Arizona, incidence increased significantly through time; by 2015, incidence in this region was more than double the rate in the San Joaquin Valley. Our analysis provides a framework for interpreting the influence of climate change on valley fever incidence dynamics. Our results may allow the U.S. Centers for Disease Control and Prevention to improve their estimates of the spatial pattern and intensity of valley fever endemicity.
    Electronic ISSN: 2471-1403
    Topics: Geosciences , Agriculture, Forestry, Horticulture, Fishery, Domestic Science, Nutrition , Medicine
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 22
    Publication Date: 2018-03-06
Description: Much concern has been raised about the increasing threat to air quality and human health posed by ammonia (NH3) emissions from agricultural systems, which are associated with the enrichment of reactive nitrogen (N) in southern Asia (SA), home to more than 60% of the world's population (i.e., the people of West, Central, East, South, and Southeast Asia). Southern Asia has consumed more than half of the global synthetic N fertilizer and has been the dominant region for livestock waste production since 2004. Excessive N application could lead to a rapid increase of NH3 in the atmosphere, resulting in severe air and water pollution in this region. However, there is still a lack of accurate estimates of NH3 emissions from agricultural systems. In this study, we simulated the agricultural NH3 fluxes in SA by coupling the bidirectional NH3 exchange module (Bi-NH3) from the Community Multiscale Air Quality model with the Dynamic Land Ecosystem Model. Our results indicated that NH3 emissions from SA agricultural systems were 21.3 ± 3.9 Tg N yr−1, with a rapidly increasing rate of ~0.3 Tg N yr−2 during 1961–2014. Among the emission sources in 2014, 10.8 Tg N yr−1 was released from synthetic N fertilizer use and 10.4 ± 3.9 Tg N yr−1 from manure production. Ammonia emissions from China and India together accounted for 64% of the total amount in SA during 2000–2014. Our results imply that the increased NH3 emissions associated with high N inputs to croplands will likely remain a significant threat to the environment and human health unless mitigation efforts are applied to reduce these emissions.
    Electronic ISSN: 2471-1403
    Topics: Geosciences , Agriculture, Forestry, Horticulture, Fishery, Domestic Science, Nutrition , Medicine
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 23
    Publication Date: 2018-03-06
Description: While there have been substantial efforts to quantify the health burden of exposure to PM2.5 from solid fuel use (SFU), the sensitivity of mortality estimates to uncertainties in input parameters has not been quantified. Moreover, previous studies separate mortality from household and ambient air pollution. In this study, we develop a new estimate of mortality attributable to SFU due to the joint exposure from household and ambient PM2.5 pollution and perform a variance-based sensitivity analysis on mortality attributable to SFU. In the joint exposure calculation, we estimate 2.81 (95% confidence interval: 2.48–3.28) million premature deaths in 2015 attributed to PM2.5 from SFU, which is 580,000 (18%) fewer deaths than would be calculated by summing separate household and ambient mortality calculations. Regarding the sources of uncertainty in these estimates, in China, India, and Latin America we find that 53–56% of the uncertainty in mortality attributable to SFU is due to uncertainty in the percentage of the population using solid fuels and 42–50% to the concentration-response function. In sub-Saharan Africa, the baseline mortality rate (72%) and the concentration-response function (33%) dominate the uncertainty space. Conversely, the sum of the variance contributed by ambient and household PM2.5 exposure ranges between 15 and 38% across all regions (the percentages do not sum to 100% as some uncertainty is shared between parameters). Our findings suggest that future studies should focus on more precise quantification of solid fuel use and of the concentration-response relationship to PM2.5, as well as mortality rates in Africa.
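A variance-based sensitivity analysis of the kind described apportions the variance of the mortality estimate among its uncertain inputs. This Python sketch estimates each input's first-order share, Var(E[y|x])/Var(y), by binning Monte Carlo samples; the multiplicative toy model and parameter ranges are hypothetical, not the study's:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    sfu = rng.uniform(0.3, 0.7, n)       # fraction of population using solid fuels
    crf = rng.uniform(0.5, 1.5, n)       # concentration-response slope multiplier
    mort = rng.uniform(0.8, 1.2, n)      # baseline mortality multiplier
    deaths = 2.0e6 * sfu * crf * mort    # toy mortality model

    def first_order_share(x, y, bins=50):
        # Var_x(E[y|x]) / Var(y), with E[y|x] estimated by averaging y in bins of x
        edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
        idx = np.digitize(x, edges)
        means = np.array([y[idx == b].mean() for b in range(bins)])
        weights = np.array([(idx == b).mean() for b in range(bins)])
        grand_mean = (weights * means).sum()
        return (weights * (means - grand_mean) ** 2).sum() / y.var()

    for name, x in [("solid fuel use", sfu),
                    ("concentration-response", crf),
                    ("baseline mortality", mort)]:
        print(f"{name}: {first_order_share(x, deaths):.0%} of output variance")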
    Electronic ISSN: 2471-1403
    Topics: Geosciences , Agriculture, Forestry, Horticulture, Fishery, Domestic Science, Nutrition , Medicine
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 24
    Publication Date: 2018-03-06
Description: New analyses are revealing the scale of pollution's impact on global health, with a disproportionate share of that impact borne by lower-income nations and by minority and marginalized individuals. Common themes emerge on the drivers of this pollution impact, including a lack of regulation and its enforcement, of research and expertise development, and of innovative funding mechanisms for mitigation. Creative approaches need to be developed and applied to address and overcome these obstacles. The existing "business as usual" modus operandi continues to externalize human health costs related to pollution, which exerts a negative influence on global environmental health.
    Electronic ISSN: 2471-1403
    Topics: Geosciences , Agriculture, Forestry, Horticulture, Fishery, Domestic Science, Nutrition , Medicine
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 25
    Publication Date: 2018-03-06
Description: Dengue is the most important human arboviral disease in Singapore. We classified residential areas into low-rise and high-rise housing and investigated the influence of urban drainage on the distribution of dengue incidence and outdoor breeding at neighborhood and country scales. In the Geylang area (August 2014 to August 2015), dengue incidence was higher in a subarea of low-rise housing than in a high-rise one, averaging 26.7 (standard error, SE = 4.83) versus 2.43 (SE = 0.67) cases per 1,000 people. Outdoor breeding drains of Aedes aegypti clustered in the low-rise housing subarea. Pupal density per population was also higher in the low-rise blocks than in the high-rise ones, at 246 (SE = 69.08) and 35.4 (SE = 25.49) per 1,000 people, respectively. The density of the urban drainage network in the low-rise blocks is double that in the high-rise ones, averaging 0.05 (SE = 0.0032) versus 0.025 (SE = 0.00245) per meter. Further, a holistic analysis at the country scale confirmed the role of urban hydrology in shaping dengue distribution in Singapore. Dengue incidence (2013–2015) is proportional to the fraction of the area (or population) of low-rise housing, and the drainage density in low-rise housing is about 4 times the corresponding estimate in high-rise areas, at 2.59 and 0.68 per meter, respectively. Public housing in agglomerations of high-rise buildings could thus have a positive impact on dengue control if this urban planning comes at the expense of low-rise housing. City planners in endemic regions should consider the density of drainage networks with respect to both the prevention of flooding and the breeding of mosquitoes.
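For reference, the incidence figures quoted above are cases per 1,000 residents over the study period; a short Python check with hypothetical counts chosen only to be consistent with the reported rates:

    cases_low, pop_low = 267, 10_000      # hypothetical low-rise counts
    cases_high, pop_high = 36, 15_000     # hypothetical high-rise counts
    print(1000 * cases_low / pop_low)     # 26.7 per 1,000 people
    print(1000 * cases_high / pop_high)   # ~2.4 per 1,000 people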
    Electronic ISSN: 2471-1403
    Topics: Geosciences , Agriculture, Forestry, Horticulture, Fishery, Domestic Science, Nutrition , Medicine
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 26
    Publication Date: 2018-01-25
Description: Communities in Australia's Murray-Darling Basin face the challenge of trying to achieve social, economic, and environmental sustainability, but they experience entrenched conflict about the best way to achieve a sustainable future, especially for small rural communities. Integral ecology is a philosophical concept that seeks to address community, economic, social, and environmental sustainability simultaneously, and its inclusive processes are designed to reduce stakeholder conflict. To date, however, the application of the integral ecology concept has been largely qualitative in nature. This study developed a quantitative integral ecology framework and applied it to a case study of the Riverina, in the Murray-Darling Basin. Seventy-seven community-focused initiatives were assessed, ranked, and quantified. The majority of the ranked initiatives did not exhibit all aspects of integral ecology: initiatives typically prioritized either (i) economic and community development or (ii) environmental health, rarely both together. The integral ecology framework developed here enables recommendations on future community initiatives and may provide a pathway for community leaders and other policy makers to more readily apply integral ecology objectives. Further research refining the framework's operationalization, application, and implementation at a wider scale may enhance communities' capacity to develop and grow sustainably.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 27
    Publication Date: 2018-02-09
Description: The 2015 Paris Agreement aims to limit global warming to well below 2 K above pre-industrial levels, and to pursue efforts to limit global warming to 1.5 K, in order to avert dangerous climate change. However, current greenhouse gas emissions targets are more compatible with scenarios exhibiting end-of-century global warming of 2.6–3.1 K, in clear contradiction to the 1.5 K target. In this study, we use a global climate model to investigate the climatic impacts of using solar geoengineering by stratospheric aerosol injection to stabilize global-mean temperature at 1.5 K for the duration of the 21st century against three scenarios spanning the range of plausible greenhouse gas mitigation pathways (RCP2.6, RCP4.5, RCP8.5). In addition to stabilizing global mean temperature and offsetting both Arctic sea-ice loss and thermosteric sea-level rise, we find that solar geoengineering could effectively counteract enhancements to the frequency of extreme storms in the North Atlantic and heatwaves in Europe, but would be less effective at counteracting hydrological changes in the Amazon basin and North Atlantic storm track displacement. In summary, solar geoengineering may reduce global mean impacts but is an imperfect solution at the regional level, where the effects of climate change are experienced. Our results should galvanize research into the regionality of climate responses to solar geoengineering.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 28
    Publication Date: 2018-02-13
Description: Even if global warming is kept below +2°C, European agriculture will be significantly impacted. Soil degradation may amplify these impacts substantially and thus hamper crop production further. We quantify the biophysical consequences, and bracket the uncertainty, of +2°C warming on calorie supply from ten major crops and vulnerability to soil degradation in Europe using crop modelling. The Environmental Policy Integrated Climate (EPIC) model, together with regional climate projections from the European branch of the Coordinated Regional Downscaling Experiment (EURO-CORDEX), was used for this purpose. A robustly positive calorie yield change was estimated for the EU Member States except for some regions in Southern and South-Eastern Europe. The mean impacts range from +30 Gcal ha−1 in the north, through +25 and +20 Gcal ha−1 in Western and Eastern Europe, respectively, to +10 Gcal ha−1 in the south if soil degradation and heat impacts are not accounted for. Elevated CO2 and increased temperature are the dominant drivers of the simulated yield changes in high-input agricultural systems. The growth stimulus due to elevated CO2 may offset the potentially negative yield impacts of a temperature increase of +2°C in most of Europe. Soil degradation causes a calorie vulnerability ranging from 0 to 80 Gcal ha−1 due to insufficient compensation for nutrient depletion, and this might undermine climate benefits in many regions, if not prevented by adaptation measures, especially in Eastern and North-Eastern Europe. Uncertainties due to future potentials for crop intensification are about two to fifty times higher than climate change impacts.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 29
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2018-02-16
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 30
    Publication Date: 2018-02-16
    Description: Systemic threats to food-energy-environment-water systems require national policy responses. Yet complete control of these complex systems is impossible and attempts to mitigate systemic risks can generate unexpected feedback effects. Perverse outcomes from national policy can emerge from the diverse responses of decision-makers across different levels and scales of resource governance. Participatory risk assessment processes can help planners to understand sub-national dynamics and ensure that policies do not undermine the resilience of social-ecological systems and infrastructure networks. Researchers can play an important role in participatory processes as both technical specialists and brokers of stakeholder knowledge on the feedbacks generated by systemic risks and policy decisions. Here, we evaluate the use of causal modeling and participatory risk assessment to develop national policy on systemic water risks. We present an application of the Risks and Options Assessment for Decision-Making (ROAD) process to a district of Vietnam where national agricultural water reforms are being piloted. The methods and results of this project provide general insights about how to support resilient decision-making, including the transfer of knowledge across administrative levels, identification of feedback effects, and the effective implementation of risk assessment processes.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 31
    Publication Date: 2018-01-25
Description: As global average sea level rises in the early part of this century, there is great interest in how much global and local sea level will change in the forthcoming decades. The Paris Climate Agreement's proposed temperature thresholds of 1.5 °C and 2 °C have directed the research community to ask what differences occur in the climate system between these two states. We have developed a novel approach to combine climate model outputs that follow specific temperature pathways to make probabilistic projections of sea level in a 1.5 °C and a 2 °C world. We find median global sea-level projections for the 1.5 °C and 2 °C temperature pathways of 44 cm and 50 cm, respectively; the 90% uncertainty ranges (5–95%) are both around 48 cm by 2100. In addition, we take an alternative approach to estimate the contribution from ice sheets by using a semi-empirical global sea-level model. Here we find median projections of 58 cm and 68 cm for the 1.5 °C and 2 °C temperature pathways, with 90% uncertainty ranges of 67 cm and 82 cm, respectively. Regional projections show similar patterns for both temperature pathways, though the differences between pathways vary from 2–10 cm (median) and 5–20 cm (95th percentile) for the bulk of the oceans using the process-based approach to 10–15 cm (median) and 15–25 cm (95th percentile) using the semi-empirical approach.
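A sketch of how an ensemble of projections is reduced to the quoted median and 90% (5–95%) range. The ensemble here is synthetic, a lognormal tuned roughly to the 1.5 °C headline numbers, purely to illustrate the percentile bookkeeping:

    import numpy as np

    rng = np.random.default_rng(2)
    # Hypothetical ensemble of 2100 global sea-level rise for a 1.5 degC pathway (cm)
    ensemble = rng.lognormal(mean=np.log(44), sigma=0.32, size=10_000)

    p5, median, p95 = np.percentile(ensemble, [5, 50, 95])
    print(f"median {median:.0f} cm, 5-95% range {p5:.0f}-{p95:.0f} cm "
          f"(width {p95 - p5:.0f} cm)")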
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 32
    Publication Date: 2018-01-25
Description: To contribute to a quantitative comparison of climate engineering (CE) methods, we assess atmosphere-, ocean-, and land-based CE measures with respect to Earth system effects consistently within one comprehensive model. We use the Max Planck Institute Earth System Model (MPI-ESM) with a prognostic carbon cycle to compare solar radiation management (SRM) by stratospheric sulfur injection and two carbon dioxide removal methods: afforestation and ocean alkalinization. The CE model experiments are designed to offset the effect of fossil-fuel burning on global mean surface air temperature under the RCP8.5 scenario to follow, or get closer to, the RCP4.5 scenario. Our results show the importance of feedbacks in the CE effects. For example, as a response to SRM, land carbon uptake is enhanced by 92 Gt by the year 2100 compared to the reference RCP8.5 scenario due to reduced soil respiration, thus reducing atmospheric CO2. Furthermore, we show that normalizations allow for better comparability of different CE methods. For example, we find that because of compensating processes such as the biogeophysical effects of afforestation, more carbon needs to be removed from the atmosphere by afforestation than by alkalinization to reach the same global warming reduction. Overall, we illustrate how different CE methods affect the components of the Earth system, we identify challenges arising in a CE comparison, and we thereby contribute to developing a framework for a comparative assessment of CE.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 33
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2018-02-13
    Description: Anthropogenic land use and land cover change, including local environmental disturbances, moderate rates of wind-driven soil erosion and dust emission. These human-dust cycle interactions impact ecosystems and agricultural production, air quality, human health, biogeochemical cycles, and climate. While the impacts of land use activities and land management on aeolian processes can be profound, the interactions are often complex and assessments of anthropogenic dust loads at all scales remain highly uncertain. Here, we critically review the drivers of anthropogenic dust emission and current evaluation approaches. We then identify and describe opportunities to: (1) develop new conceptual frameworks and interdisciplinary approaches that draw on ecological state-and-transition models to improve the accuracy and relevance of assessments of anthropogenic dust emissions; (2) improve model fidelity and capacity for change detection to quantify anthropogenic impacts on aeolian processes; and (3) enhance field research and monitoring networks to support dust model applications to evaluate the impacts of disturbance processes on local to global-scale wind erosion and dust emissions.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 34
    Publication Date: 2018-02-08
Description: Methane accounts for 20% of the global warming caused by greenhouse gases, and wastewater is a major anthropogenic source of methane. Based on the Intergovernmental Panel on Climate Change (IPCC) greenhouse gas inventory guidelines and current research findings, we calculated the amount of methane emitted from 2000 to 2014 by wastewater from different provinces in China. Methane emissions from wastewater increased from 1,349.01 Gg in 2000 to 3,430.03 Gg in 2014, a mean annual increase of 167.69 Gg. Methane emissions from industrial wastewater treated by wastewater treatment plants (EIt) accounted for the highest proportion of emissions. We also estimated the future trend of industrial wastewater methane emissions using an artificial neural network model. A comparison of the emissions for the years 2000, 2010, and 2020 showed an increasing trend in methane emissions in China and a spatial transition of industrial wastewater emissions from eastern and southern regions to central and southwestern regions and from coastal regions to inland regions. These changes were caused by changes in economics, demographics, and relevant policies.
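The inventory arithmetic behind such estimates follows the IPCC guidelines' basic relation for wastewater: emissions = (organic load − organics removed as sludge) × EF − methane recovered, with the emission factor EF = B0 × MCF (maximum methane-producing capacity times a methane correction factor for the treatment system). A Python sketch; B0 is the 2006 IPCC industrial default as best I recall, and all activity numbers are assumptions:

    B0 = 0.25     # kg CH4 per kg COD, maximum producing capacity (IPCC default)
    MCF = 0.3     # methane correction factor for the treatment pathway (assumed)
    EF = B0 * MCF                      # kg CH4 per kg COD

    cod_discharged = 5.0e9             # kg COD/yr in industrial wastewater (assumed)
    cod_in_sludge = 1.0e9              # kg COD/yr removed as sludge (assumed)
    ch4_recovered = 2.0e7              # kg CH4/yr recovered or flared (assumed)

    ch4_kg = (cod_discharged - cod_in_sludge) * EF - ch4_recovered
    print(f"{ch4_kg / 1e6:.0f} Gg CH4/yr")   # 1 Gg = 1e6 kg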
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 35
    Publication Date: 2018-02-14
Description: Risk-based water resources management is based on the premise that water managers should invest up to the point where the marginal benefit of risk reduction equals the marginal cost of achieving that benefit. However, this cost-benefit approach may not guarantee robustness under uncertain future conditions, for instance under climatic changes. In this paper, we expand risk-based decision analysis to explore possible ways of enhancing robustness in engineered water resources systems under different risk attitudes. Risk is measured as the expected annual cost of water use restrictions, whilst robustness is interpreted in the decision-theoretic sense as the ability of a water resource system to maintain performance—expressed as a tolerable risk of water use restrictions—under a wide range of possible future conditions. Linking risk attitudes with robustness allows stakeholders to explicitly trade off incremental increases in robustness against investment costs for a given level of risk. We illustrate the framework through a case study of London's water supply system, using state-of-the-art regional climate simulations to inform the estimation of risk and robustness.
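A minimal sketch of the risk metric named above: the expected annual cost of water use restrictions is a probability-weighted sum over restriction levels. The levels, probabilities, and costs below are hypothetical, not figures from the study:

    # expected annual cost = sum over levels of P(level imposed in a year) * cost(level)
    restriction_levels = [
        # (level, annual probability, cost in GBP millions)
        ("hosepipe ban", 0.10, 50.0),
        ("non-essential use ban", 0.02, 300.0),
        ("emergency rota cuts", 0.002, 1500.0),
    ]
    expected_annual_cost = sum(p * c for _, p, c in restriction_levels)
    print(f"expected annual cost: GBP {expected_annual_cost:.0f} million")

A portfolio of supply investments is then judged robust, for a given risk attitude, if this figure stays below the tolerable level across a wide range of simulated futures.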
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 36
    Publication Date: 2018-02-14
Description: Particulate matter with diameters smaller than 2.5 micrometers (PM2.5) poses health threats to human populations. Regardless of efforts to regulate the pollution sources, it is unclear how climate change caused by greenhouse gases (GHGs) would affect PM2.5 levels. Using century-long ensemble simulations with the Community Earth System Model 1 (CESM1), we show that, if anthropogenic emissions were to remain at the level of the year 2005, the global surface concentration and atmospheric column burden of sulfate, black carbon, and primary organic carbon would still increase by 5-10% at the end of the 21st century (2090-2100) due to global warming alone. The decrease in the wet removal flux of PM2.5, despite an increase in global precipitation, is the primary cause of the increase in the PM2.5 column burden. Regionally, over North America and East Asia, a shift of future precipitation toward more frequent heavy events contributes to weakened wet removal fluxes. Our results suggest that the impact of climate change needs to be accounted for when defining the future emission standards necessary to meet air quality standards.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 37
    Publication Date: 2018-02-14
Description: In light of the Paris Agreement, it is essential to identify regional impacts of half a degree additional global warming to inform climate adaptation and mitigation strategies. We investigate the effects of 1.5°C and 2.0°C global warming above pre-industrial conditions, relative to present day (2006–2015), over the Asian-Australian monsoon region (AAMR) using five models from the Half a degree Additional warming, Prognosis and Projected Impacts (HAPPI) project. There is considerable inter-model variability in projected changes to mean climate and extreme events in the 2.0°C and 1.5°C scenarios. There is high confidence in projected increases to mean and extreme surface temperatures over AAMR, as well as more frequent persistent daily temperature extremes over East Asia, Australia, and northern India with an additional 0.5°C warming, which are likely to occur. Mean and extreme monsoon precipitation amplify over AAMR, except over Australia at 1.5°C, where there is uncertainty in the sign of the change. Persistent daily extreme precipitation events are likely to become more frequent over parts of East Asia and India with an additional 0.5°C warming. There is lower confidence in projections of precipitation change than in projections of surface temperature change. These results highlight the benefits of limiting the global-mean temperature change to 1.5°C above pre-industrial, as the severity of the above effects increases with an extra 0.5°C warming.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 38
    Publication Date: 2018-02-14
Description: The most common approaches to detection and attribution (D&A) of extreme weather events, using FAR or RR (fraction of attributable risk or risk ratio), answer a particular form of research question, namely, "What is the probability of a certain class of weather events, given global climate change, relative to a world without?" In a set of recent papers, Kevin Trenberth et al. (2015) and Theodore Shepherd (2016) have argued that this is not always the best tool for analyzing causes, or for communicating with the public about climate events and extremes. Instead, they promote the idea of a "storyline" approach, which asks complementary questions, such as "How much did climate change affect the severity of a given storm?" From the vantage of history and philosophy of science, a proposal to introduce a new approach or to answer different research questions—especially those of public interest—does not appear particularly controversial. However, the proposal proved highly controversial, with the majority of detection and attribution scientists reacting in a very negative and even personal manner. Some suggested the proposed alternatives amount to a weakening of standards, or an abandonment of scientific method. Here, we address the question: Why is this such a controversial proposition? We argue that there is no "right" or "wrong" approach to D&A in any absolute sense, but rather that in different contexts society may have a greater or lesser concern with errors of a particular type. How we view the relative risk of overestimation versus underestimation of harm is context dependent.
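For readers unfamiliar with the metrics, both are simple functions of the probability of the event class with (p1) and without (p0) anthropogenic climate change: RR = p1/p0 and FAR = 1 − p0/p1. A tiny Python example with assumed probabilities:

    p1 = 0.012   # annual probability of the event class in the factual world (assumed)
    p0 = 0.004   # annual probability in a counterfactual world without warming (assumed)

    RR = p1 / p0           # risk ratio: how many times more likely the event class is
    FAR = 1.0 - p0 / p1    # fraction of the current risk attributable to climate change
    print(f"RR = {RR:.1f}, FAR = {FAR:.2f}")   # RR = 3.0, FAR = 0.67

The storyline approach instead asks a conditional question, for example how much a particular observed storm was made more severe by warming, so the two framings are complementary rather than competing.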
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 39
    Publication Date: 2018-02-14
    Description: The impacts of land use have been shown to have considerable influence on regional climate. With the recent international commitment to limit global warming to well below 2°C, emission reductions need to be ambitious and could involve major land-use change (LUC). Land-based mitigation efforts to curb emissions growth include increasing terrestrial carbon sequestration through reforestation or the adoption of bioenergy crops. These activities influence local climate through biogeophysical feedbacks; however, it is uncertain how important they are for a 1.5°C climate target. This was the motivation for HAPPI-Land: the Half a degree Additional warming, Prognosis and Projected Impacts – Land use scenario experiment. Using four Earth System Models, we present the first multi-model results from HAPPI-Land and demonstrate the critical role of land use for understanding the characteristics of regional climate extremes in low-emission scenarios. In particular, our results show that changes in temperature extremes due to LUC are comparable in magnitude to changes arising from half a degree of global warming. We also demonstrate that LUC contributes more than 20% of the change in temperature extremes over large land areas concentrated in the Northern Hemisphere. However, we also identify sources of uncertainty that influence the multi-model consensus of our results, including how LUC is implemented and the corresponding biogeophysical feedbacks that perturb climate. Therefore, our results highlight the urgent need to resolve the challenges of implementing LUC across models to quantify its impacts and to consider how LUC contributes to regional changes in extremes associated with sustainable development pathways.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 40
    Publication Date: 2018-02-13
    Description: Worldwide, humans face high risks from natural hazards, especially in coastal regions with high population densities. Rising sea levels due to global warming are making coastal communities’ infrastructure vulnerable to natural disasters. The present study aims to couple vulnerability and resilience through the restoration and conservation of coastal natural habitats lost or degraded through land reclamation, under different climate change scenarios. The Integrated Valuation of Ecosystems and Tradeoffs (InVEST) model is used to assess the current and future vulnerability of coastal communities. The model employs seven bio-geophysical variables to calculate a Natural Hazard Index (NHI) and to highlight the criticality of restoring natural habitats. The results show that roughly 25 percent of the coastline and more than 5 million residents are in highly vulnerable coastal areas of China, and these numbers are expected to double by 2100. Our study suggests that restoration and conservation in recently reclaimed areas could reduce this vulnerability by 45 percent. Hence, natural habitats have proved to be an effective defense against coastal hazards and should be prioritized in coastal planning and development. The findings confirm that natural habitats are critical for coastal resilience and can help recover lost coastal functionality. Therefore, we recommend that the Chinese government prioritize restoration where possible, and conservation of the remaining habitats, for the sake of coastal resilience, to prevent natural hazards from escalating into disasters.
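    As a sketch of the kind of index calculation behind such an NHI: InVEST's coastal vulnerability model is commonly described as ranking each bio-geophysical variable on a 1–5 scale and combining the ranks as a geometric mean. The formulation and the seven example ranks below are assumptions for illustration, not the paper's data.

        # Geometric-mean-of-ranks hazard index, assuming the InVEST-style
        # formulation; ranks run from 1 (low hazard) to 5 (high hazard).

        from math import prod

        def hazard_index(ranks: list[int]) -> float:
            """Geometric mean of per-variable hazard ranks."""
            return prod(ranks) ** (1.0 / len(ranks))

        # e.g., geomorphology, relief, habitats, sea level change, wind, waves, surge
        segment_ranks = [5, 4, 5, 3, 2, 4, 5]  # hypothetical shoreline segment
        print(f"NHI = {hazard_index(segment_ranks):.2f}")  # higher = more vulnerable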
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Publication Date: 2018-02-21
    Description: During the last few decades, global agricultural production has risen, and technological improvements continue to contribute to yield growth. However, population growth, water crises, deforestation, and climate change threaten global food security. An understanding of the variables that caused past changes in crop yields can help improve future crop prediction models. In this paper, we present a comprehensive global analysis of changes in crop yields and how they relate to different large-scale and regional climate variables, climate change variables, and technology in a unified framework. A new multilevel model for yield prediction at the country level is developed and demonstrated. The structural relationships between average yield and climate attributes, as well as trends, are estimated simultaneously. All countries are modeled in a single multilevel model with partial pooling to automatically group countries and reduce estimation uncertainties. El Niño-Southern Oscillation (ENSO), the Palmer Drought Severity Index (PDSI), geopotential height anomalies (GPH), historical carbon dioxide (CO2) concentrations, and country-based time series of GDP per capita, as an approximation of technology, are used as predictors to estimate annual agricultural crop yields for each country from 1961 to 2013. Results indicate that these variables can explain the variability in historical crop yields for most countries, and the model performs well under out-of-sample verification. While some countries were not generally affected by climatic factors, PDSI and GPH acted both positively and negatively on crop yields in different regions.
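    The partial pooling the abstract describes can be sketched with a random intercept per country, so that sparsely observed countries are shrunk toward the global mean. The sketch below uses statsmodels' mixed-effects API on a tiny synthetic dataset; the column names, data, and two-predictor structure are assumptions, far simpler than the paper's model.

        # Minimal partial-pooling sketch, assuming a random intercept by country.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 200
        data = pd.DataFrame({
            "country": rng.choice(["A", "B", "C", "D"], size=n),
            "pdsi": rng.normal(0, 2, size=n),     # drought index (synthetic)
            "co2": rng.normal(370, 20, size=n),   # ppm (synthetic)
        })
        country_effect = data["country"].map({"A": 2.0, "B": 3.5, "C": 1.0, "D": 2.8})
        data["crop_yield"] = (country_effect + 0.1 * data["pdsi"]
                              + 0.01 * data["co2"] + rng.normal(0, 0.5, size=n))

        # Random intercept per country: estimates for data-poor countries are
        # pulled toward the pooled mean, reducing estimation uncertainty.
        model = smf.mixedlm("crop_yield ~ pdsi + co2", data, groups=data["country"])
        print(model.fit().summary())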
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 42
    Publication Date: 2018-02-23
    Description: Bioenergy with Carbon Capture and Storage (BECCS) has been proposed to reduce atmospheric CO2 concentrations, but concerns remain about competition for arable land and freshwater. The synergistic integration of algae production, which requires neither arable land nor freshwater, with BECCS (called “ABECCS”) can reduce CO2 emissions without competing with agriculture. This study presents a techno-economic and life-cycle assessment of co-locating a 121-ha algae facility with a 2,680-ha eucalyptus forest for BECCS. The eucalyptus biomass fuels combined heat and power generation (CHP) with subsequent amine-based carbon capture and storage (CCS). A portion of the captured CO2 is used for growing algae and the remainder is sequestered. Biomass combustion supplies CO2, heat, and electricity, thus increasing the range of sites suitable for algae cultivation. Economic, energetic, and environmental impacts are considered. The system yields as much protein as soybeans while generating 61.5 TJ of electricity and sequestering 29,600 t of CO2 per year. More energy is generated than consumed, and the freshwater footprint is roughly equal to that of soybeans. Financial break-even is achieved for product value combinations ranging from (1) algal biomass sold for $1,780/t without a carbon credit to (2) algal biomass sold for $100/t with a carbon credit of $396/t. Sensitivity analysis shows that significant reductions in the cost of carbon sequestration are possible. The ABECCS system represents a unique technology for negative emissions without reducing protein production or increasing water demand, and it should therefore be included in the suite of technologies being considered to address global sustainability.
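    The two quoted break-even combinations define a straight line in (biomass price, carbon credit) space: any price pair on the line yields the same revenue. The sketch below simply interpolates between the two published endpoints; the linearity between them is an assumption of this illustration.

        # Break-even frontier implied by the abstract's two endpoints:
        # ($1,780/t biomass, $0/t CO2) and ($100/t biomass, $396/t CO2).

        def breakeven_carbon_credit(biomass_price: float) -> float:
            """Carbon credit ($/t CO2) needed to break even at a given biomass price ($/t)."""
            p1, c1 = 1780.0, 0.0
            p2, c2 = 100.0, 396.0
            slope = (c2 - c1) / (p2 - p1)  # ~-0.236 $/t CO2 per $/t biomass
            return c1 + slope * (biomass_price - p1)

        for p in (1780, 1000, 500, 100):
            print(f"biomass at ${p}/t -> carbon credit ~ ${breakeven_carbon_credit(p):.0f}/t CO2")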
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 43
    Publication Date: 2018-02-23
    Description: Extreme events are of interest worldwide given their potential for substantial impacts on social, ecological, and technical systems. Many climate-related extreme events are increasing in frequency and/or magnitude due to anthropogenic climate change, and the potential for impacts is growing due to the location and expansion of urban centers and infrastructure. Many disciplines are engaged in research on and management of these events. However, there is a lack of coherence in what constitutes and defines an extreme event across these fields, which impedes our ability to holistically understand and manage such events. Here, we review 10 years of academic literature and use text analysis to elucidate how six major disciplines (climatology, earth sciences, ecology, engineering, hydrology, and social sciences) define and communicate extreme events. Our results highlight critical disciplinary differences in the language used to communicate extreme events. Additionally, we found a wide range of definitions and thresholds, with more than half of the examined papers not providing an explicit definition, and disagreement over whether impacts are included in the definition. We urge a distinction between extreme events and their impacts, so that we can better assess when responses to extreme events have actually enhanced resilience. We also suggest that all researchers and managers of extreme events be more explicit in their definitions of such events and more cognizant of how they communicate them. Clearer and more consistent definitions and communication can support transdisciplinary understanding and management of extreme events.
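    The kind of cross-disciplinary vocabulary comparison the abstract describes can be sketched with TF-IDF weighting over per-discipline corpora. The two one-sentence "corpora" below are invented; a CountVectorizer/TF-IDF pipeline is one common approach, not necessarily the authors' exact method.

        # Toy vocabulary comparison between two discipline corpora via TF-IDF.

        from sklearn.feature_extraction.text import TfidfVectorizer

        corpus = {
            "hydrology": "extreme flood event exceeding the 100-year return period threshold",
            "social sciences": "extreme event causing disaster impacts on vulnerable communities",
        }
        vec = TfidfVectorizer(stop_words="english")
        X = vec.fit_transform(corpus.values())
        terms = vec.get_feature_names_out()

        for name, row in zip(corpus, X.toarray()):
            top = sorted(zip(row, terms), reverse=True)[:3]
            print(name, "->", [t for _, t in top])  # discipline-distinctive terms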
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 44
    Publication Date: 2018-02-24
    Description: To maintain the chance of keeping the average global temperature increase below 2°C and to limit long-term climate change, removing carbon dioxide from the atmosphere (Carbon Dioxide Removal, CDR) is becoming increasingly necessary. We analyze optimal and cost-effective climate policies in the Dynamic Integrated Assessment Model of Climate and the Economy (DICE2016R) and investigate (i) the utilization of (ocean) CDR under different climate objectives, (ii) the sensitivity of policies with respect to carbon cycle feedbacks, and (iii) how well carbon cycle feedbacks are captured in the carbon-cycle models used in state-of-the-art integrated assessment models. Overall, the carbon cycle model in DICE2016R shows clear improvements over its predecessor, DICE2013R, capturing long-term dynamics much better, as well as oceanic carbon outgassing due to the excess oceanic storage of carbon from CDR. However, this comes at the cost of a (too) tight short-term remaining emission budget, limiting the model's suitability for accurately analyzing low-emission scenarios. With DICE2016R, compliance with the 2°C goal is no longer feasible without negative emissions via CDR. Overall, the optimal amount of CDR has to take into account (i) the emission substitution effect and (ii) compensation for carbon cycle feedbacks.
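    DICE-style carbon cycles are linear box models: carbon moves between atmosphere, upper ocean/biosphere, and deep ocean through a fixed transfer matrix each time step. The sketch below shows that structure only; the transfer coefficients and reservoir sizes are illustrative placeholders, not DICE2016R's calibrated values.

        # Three-reservoir linear carbon cycle sketch (atmosphere AT, upper
        # ocean/biosphere UP, deep ocean LO); coefficients are illustrative.

        import numpy as np

        # Row-stochastic transfer matrix per 5-year step (rows: from AT, UP, LO).
        PHI = np.array([
            [0.880, 0.120, 0.000],
            [0.196, 0.797, 0.007],
            [0.000, 0.001, 0.999],
        ])

        M = np.array([851.0, 460.0, 1740.0])  # GtC per reservoir (illustrative)
        emissions = 36.0 * 5 / 3.666          # 5 yr of ~36 GtCO2/yr, converted to GtC

        for step in range(10):                # simulate 50 years
            M = M @ PHI                       # inter-reservoir exchange
            M[0] += emissions                 # anthropogenic input to the atmosphere
        print(f"Atmospheric carbon after 50 yr: {M[0]:.0f} GtC")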
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 45
    Publication Date: 2018-01-06
    Description: The 2015/2016 El Niño has been classified as one of the three most severe on record. El Niño teleconnections are commonly associated with droughts in southern Africa and high precipitation in eastern Africa. Despite their relatively frequent occurrence, evidence for their hydrological effects and impacts beyond agriculture is limited. We examine the hydrological response and impact pathways of the 2015/2016 El Niño in eastern and southern Africa, focusing on Botswana, Kenya, and Zambia. We use in situ and remotely sensed time series of precipitation, river flow, and lake levels, complemented by qualitative insights from interviews with key organizations in each country about awareness, impacts, and responses. Our results show that drought conditions prevailed in large parts of southern Africa, reducing runoff and contributing to unusually low lake levels in Botswana and Zambia. Key informants characterized this El Niño through record high temperatures and water supply disruption in Botswana and through hydroelectric load shedding in Zambia. Warnings of flood risk in Kenya were pronounced, but the El Niño teleconnection did not materialize as expected in 2015/2016; extreme precipitation was limited and caused only localized impacts. The hydrological impacts in southern Africa were severe and complex, strongly exacerbated by dry antecedent conditions, recent changes in exposure and sensitivity, and management decisions. Improved understanding of hydrological responses and of the complexity of differing impact pathways can support the design of more adaptive, region-specific management strategies.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 46
    Publication Date: 2018-01-11
    Description: Though urban agriculture (UA), defined here as the growing of crops in cities, is increasing in popularity and importance globally, little is known about the aggregate benefits of such natural capital in built-up areas. Here, we introduce a quantitative framework to assess the global aggregate ecosystem services provided by existing vegetation in cities and by an intensive UA adoption scenario based on data-driven estimates of urban morphology and vacant land. We analyzed global population, urban, meteorological, terrain, and Food and Agriculture Organization (FAO) datasets in Google Earth Engine to derive global-scale estimates, aggregated by country, of the services provided by UA. We estimate the value of four ecosystem services provided by existing vegetation in urban areas to be on the order of $33 billion annually. We project potential annual food production of 100–180 million tonnes, energy savings of 14 to 15 billion kilowatt hours, nitrogen sequestration of 100,000 to 170,000 tonnes, and avoided storm water runoff of 45 to 57 billion cubic meters annually. In addition, we estimate that food production, nitrogen fixation, energy savings, pollination, climate regulation, soil formation, and biological control of pests could be worth as much as $80–160 billion annually in a scenario of intensive UA implementation. Our results demonstrate significant country-to-country variability in UA-derived ecosystem services and in the reduction of food insecurity. These estimates represent the first effort to consistently quantify these benefits globally, and they highlight the potential of built environments to act as change agents that alleviate mounting concerns associated with global environmental change and unsustainable development.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 47
    Publication Date: 2018-02-01
    Description: The Paris Agreement of the United Nations Framework Convention on Climate Change aims not only at avoiding +2°C of warming (and even at limiting the temperature increase further, to +1.5°C), but also sets long-term goals to guide mitigation. Therefore, the best available science is required to inform policy makers on the importance of, and the adaptation needs in, a +1.5°C warmer world. Seven research institutes from Europe and Turkey integrated their competencies to provide a cross-sectoral assessment of the potential impacts at a pan-European scale. The initial findings of this initiative are presented and key messages communicated. The approach is to select periods based on global warming thresholds rather than the more typical approach of selecting fixed time periods (e.g., the end of the century). The results indicate that the world is likely to pass the +1.5°C threshold in the coming decades. Cross-sectoral dimensions are taken into account to show the impacts of global warming that occur in parallel in more than one sector. Impacts also differ across sectors and regions. Alongside the negative impacts for certain sectors and regions, some positive impacts are projected: summer tourism in parts of Western Europe may be favoured by climate change, energy demand decreases outweigh increases over most of Europe, and catchment yields in hydropower regions will increase. However, such positive findings should be interpreted carefully, as we do not take into account exogenous factors that can and will influence Europe, such as migration patterns, food production, and economic and political instability.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    Publication Date: 2018-02-03
    Description: Governments around the world have agreed to end hunger and food insecurity and to improve global nutrition, largely through changes to agriculture and food systems. However, they face considerable uncertainty when making policy decisions, since any agricultural change will influence social and biophysical systems and could yield either positive or negative nutrition outcomes. We outline a holistic probabilistic modeling approach using Bayesian Network (BN) models for the nutritional impacts of agricultural development policy. The approach includes the elicitation of expert knowledge for impact model development, including sensitivity analysis and value-of-information calculations. It aims to provide a generalizable methodology that can be applied in a wide range of contexts. To showcase this approach, we develop an impact model of Vision 2040, Uganda's development strategy, which, among other objectives, seeks to transform the country's agricultural landscape from traditional systems to large-scale commercial agriculture. Model results suggest that Vision 2040 is likely to have negative outcomes for the rural livelihoods it intends to support; it may have no appreciable influence on household hunger but, by influencing preferences for and access to quality nutritional foods, may increase the prevalence of micronutrient deficiency. The results highlight the tradeoffs that must be negotiated when making decisions about agriculture for nutrition, and the capacity of BNs to make these tradeoffs explicit. The work illustrates the value of BNs for supporting evidence-based agricultural development decisions.
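    A toy network shows how a BN makes such a tradeoff explicit: a policy node influences access to quality foods, which influences deficiency, and marginalizing over the intermediate node gives the outcome probability per scenario. All conditional probabilities below are invented for illustration, not the study's elicited expert values.

        # Hand-rolled two-edge Bayesian network sketch (no library dependencies).

        # P(access to quality nutritional foods | policy scenario)  -- invented
        p_access = {"vision2040": 0.35, "baseline": 0.50}

        # P(micronutrient deficiency | access)  -- invented
        p_deficiency = {True: 0.20, False: 0.45}

        def p_micronutrient_deficiency(policy: str) -> float:
            """Marginalize deficiency over the access node."""
            pa = p_access[policy]
            return pa * p_deficiency[True] + (1 - pa) * p_deficiency[False]

        for policy in ("baseline", "vision2040"):
            print(policy, "->", round(p_micronutrient_deficiency(policy), 3))
        # If the policy reduces access, deficiency prevalence rises: the
        # tradeoff structure the abstract describes, made explicit.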
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2018-02-03
    Description: An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socio-economic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of “challenges to mitigation” and “challenges to adaptation” to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial dataset of 33,750 scenarios generated using the Global Change Assessment Model (GCAM). We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most strongly tied to user-specified measures of policy-relevant outcomes of interest; in our example, high or low mitigation costs. We show that the current approach to selecting reference scenarios can miss policy-relevant scenario narratives, which often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs, depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show that agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy-relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support “bottom-up” scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.
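    The scenario-discovery step (which assumptions best predict a policy-relevant outcome) can be sketched with a classification tree over a synthetic ensemble. The data and the CART choice below are assumptions standing in for the 33,750 GCAM runs and the authors' actual statistical techniques.

        # Scenario discovery sketch: rank input assumptions by how well they
        # predict a binary "high mitigation cost" outcome.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(1)
        n = 5000
        X = rng.uniform(0, 1, size=(n, 3))  # cols: ag productivity, pop growth, GDP growth
        # Invented rule: costs are high when productivity is low AND population growth high.
        high_cost = ((X[:, 0] < 0.4) & (X[:, 1] > 0.6)).astype(int)

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, high_cost)
        names = ["ag_productivity", "pop_growth", "gdp_growth"]
        for name, imp in zip(names, tree.feature_importances_):
            print(f"{name}: {imp:.2f}")  # gdp_growth should score ~0 here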
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    American Geophysical Union (AGU)
    Publication Date: 2018-03-14
    Description: New oceanic crust is forged along Earth’s mid-ocean ridge system, a global chain of volcanic spreading centers whose segments are offset by perpendicular transform faults. Because these faults are not present during continental rifting, the first stage in the formation of a new ocean basin, but are pervasive features of newly minted seafloor, it remains unclear when they form and how their presence is perpetuated. Seismic anisotropy, the directional variation in the velocity of seismic waves, is influenced by the alignment of individual crystals and is therefore a potentially powerful tool for deducing patterns of mantle flow along divergent plate boundaries. To date, however, the difficulties and expense of instrumenting the seafloor have limited the application of this technique. Now Eakin et al. have refined a method for conducting such measurements along mid-ocean ridges and have used the results to elucidate the role of transform faults in seafloor spreading. By carefully eliminating seismic stations where the underlying mantle displays directionality, the team was able to measure the mantle properties beneath individual earthquake sources, rather than below each station, as is typical for this method. For the first time, this modification allowed the researchers to characterize mantle flow beneath active transform faults on a global scale. The observed patterns of nearly vertically aligned anisotropy suggest that widespread upwelling of the mantle is occurring beneath oceanic transform faults. The results, which are consistent with geodynamic models, imply that mantle upwelling warms, and consequently weakens, transform faults. These features thus appear to play an important role in localizing strain and ultimately helping to stabilize divergent plate boundaries. In addition to representing a significant advance in how measurements of mantle anisotropy are conducted, this study is noteworthy for predicting the dominant pattern of mantle flow beneath transform faults. As more ocean bottom studies are conducted along mid-ocean ridge systems, these results offer a working hypothesis for other researchers to test. ( Journal of Geophysical Research: Solid Earth , https://doi.org/10.1002/2017JB015176, 2018) —Terri Cook, Freelance Writer The post Widespread Mantle Upwelling Beneath Oceanic Transform Faults appeared first on Eos .
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 51
    Publication Date: 2018-03-14
    Description: A helicopter from an Argentinean icebreaker plucked four U.S. scientists, a support person, and all their gear from a scientific field camp on an island off the Antarctic coast on Sunday morning after ice and weather conditions prevented a U.S. vessel from retrieving them. The scientists, who were conducting research funded by the U.S. National Science Foundation (NSF), had completed their paleoclimate fieldwork, were safe, and were never in any danger because they had sufficient provisions to last about 2 more weeks, according to Kelly Falkner, director of NSF’s Office of Polar Programs (OPP). The incident “did catch the attention of diplomats, and that’s why it rose to a news item,” Falkner told Eos . A Difficult Combination of Fog and Ice The four-person research team, led by principal investigator Alexander Simms, an assistant professor and sedimentologist in the Department of Earth Science at the University of California, Santa Barbara, was on Joinville Island in the Weddell Sea. There, the scientists were sampling and conducting elevation and ground-penetrating radar surveys of raised beaches to determine their ages and to reconstruct past sea levels and climate. The U.S. Antarctic Program (USAP) research vessel Laurence M. Gould was expected to collect the scientists from the island. However, it is not an icebreaker (although it is ice reinforced) and proved unable to safely navigate to the island, Falkner told Eos . “A combination of fog and ice was making it very difficult for [the Gould ] to make headway close enough to put Zodiacs”—small motorized boats—“in the water to retrieve people,” she said. Scientists wait for a helicopter liftoff from Joinville Island. Credit: National Science Foundation After the USAP requested assistance, the Argentinean icebreaker Almirante Irízar , which had been operating nearby, steamed several hours to pick up the stranded researchers and a fifth person, an employee of ASC, an NSF support contractor, according to the Argentinean Ministry of Foreign Affairs and Worship. A Tough Work Environment “Antarctica remains a tough operational work area because of its extreme environment. You can plan for all kinds of things, but you can’t plan for everything,” Falkner said, adding that NSF does a lot of advance and contingency planning and works closely with its international counterparts on a regular basis through the Council of Managers of National Antarctic Programs. This kind of incident is pretty rare, Falkner said. “This is the first time I have requested such assistance following a camp that NSF put in in my seven years at OPP,” she noted. “This wasn’t an emergency, but we are very grateful that we can engage with our international partners to take care of situations that could become emergencies.” Carlos Bunge, an adviser to the manager of the Argentinean National Antarctic Program, told Eos from on board the Argentinean icebreaker that the research team on Monday was being transferred to the Gould by Zodiac. The plan is for the Gould to then return to Punta Arenas, Chile, the home port for the ship’s Antarctic operations. Bunge, who has been involved with coordinating science operations with logistical assistance, stressed that this was not a rescue operation.
“It was an assistance. They needed to be evacuated from the island. We helped them to get out before the conditions got worse, but there was no risk for human life during the operation. It was just preventing a problematic situation in the future,” Bunge said. “International cooperation is one of the most important objectives of the Antarctic treaty,” he added. “We all have the same spirit here of research and cooperation.” —Randy Showstack (@RandyShowstack), Staff Writer The post U.S. Scientists Safely Retrieved from Ice-Bound Antarctic Island appeared first on Eos .
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    American Geophysical Union (AGU)
    Publication Date: 2018-03-14
    Description: Deep summertime continental convection can lift boundary-layer air directly into the upper troposphere–lower stratosphere (UTLS), but measurements of tropopause-penetrating convection are limited to field campaigns or low vertical resolution satellite data. Cooney et al. [2018] provide an analysis of deep convection over North America that extends above the tropopause, based on synthesis of high-resolution, three-dimensional gridded radar reflectivity fields. Such an analysis at high spatial and temporal resolution has not been done before using NEXRAD or any other dataset. Overshooting primarily occurs during May, June, and July over the central United States, with likely impacts on composition of the lower stratosphere. These data have the potential to refine our detailed understanding of convective influence on the UTLS. Citation: Cooney, J. W., Bowman, K. P., Homeyer, C. R., & Fenske, T. M. [2018]. Ten year analysis of tropopause-overshooting convection using GridRad data. Journal of Geophysical Research: Atmospheres , 123, 329–343. https://doi.org/10.1002/2017JD027718 —William Randel, Editor, JGR: Atmospheres The post Continental Convection Reaches New Highs appeared first on Eos .
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    American Geophysical Union (AGU)
    Publication Date: 2018-03-14
    Description: From bold purple to brilliant yellow-orange bursts, the map above pinpoints regions along coastal Antarctica that have recently experienced moderate to severe ice loss. The map’s surface ice velocities, measured in meters per year, come from Landsat 7 and 8 data averaged over 2013–2015. As this map shows, ice loss in western Antarctica is faster and more extensive than in the east, particularly in the Ronne (upper) and Ross (lower) ice shelves. The data underlying this map of recent Antarctic ice flow come from a 13 February study published in The Cryosphere . However, the map doesn’t appear in the paper; the researchers created it for an article posted online by NASA about the new research. Not only is ice loss in western Antarctica already much higher than on eastern shores, but it’s also accelerating, according to the research team behind the map and paper. By comparing the most recently available Landsat data with earlier estimates of ice velocity from 2008, the team could determine where ice loss has sped up or remained steady across nearly the entire Antarctic Ice Sheet. The researchers’ comprehensive look revealed that the greatest acceleration of ice loss from 2008 to 2015 took place among glaciers that feed Marguerite Bay on the western Antarctic Peninsula, in an area too small to see on the large map above. (left) Present-day surface ice velocity in meters per year for (17) Prospect and (19) Seller glaciers, which flow into Marguerite Bay in the western Antarctic Peninsula. (right) The change in surface ice velocity from 2008 to 2015 in meters per year for that region, whose boundary is identified in both panels by a thick black curve. Credit: Gardner et al., 2018, Figs. 8 and 9, https://doi.org/10.5194/tc-12-521-2018; CC BY 3.0 As seen in another map below that zooms in on the Marguerite Bay area, the speed of ice loss there increased from about 2,600 to 3,000 meters per year in some locations. That’s a jump of roughly 400 meters per year (15%). Using the newest Landsat data each time, “we can map ice flow over nearly the entire continent, every year,” Alex Gardner, lead author on the paper, said in the NASA article. Gardner, who is a research scientist at the Jet Propulsion Laboratory in Pasadena, Calif., added that “with these new data, we can begin to unravel the mechanisms by which the ice flow is speeding up or slowing down in response to changing environmental conditions.” —Kimberly M. S. Cartier (@AstroKimCartier), News Writing and Production Intern The post New Maps Highlight Antarctica’s Flowing Ice appeared first on Eos .
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    Publication Date: 2018-03-07
    Description: To avoid the most dangerous consequences of anthropogenic climate change, the Paris Agreement provides a clear and agreed climate mitigation target: stabilizing global surface warming to under 2.0°C above preindustrial, and preferably closer to 1.5°C. However, policy makers do not currently know exactly what carbon emissions pathways to follow to stabilize warming below these agreed targets, because there is large uncertainty in the future temperature rise for any given pathway. This large uncertainty makes it difficult for a cautious policy maker to avoid either (1) allowing warming to exceed the agreed target or (2) cutting global emissions more than is required to satisfy the agreed target, each with its associated societal costs. This study presents a novel Adjusting Mitigation Pathway (AMP) approach to restrict future warming to policy-driven targets, in which future emissions reductions are not fully determined now but respond to future surface warming each decade in a self-adjusting manner. A large ensemble of Earth system model simulations, constrained by geological and historical observations of past climate change, demonstrates our self-adjusting mitigation approach for a range of climate stabilization targets from 1.5 to 4.5°C, and generates AMP scenarios up to year 2300 for surface warming, carbon emissions, atmospheric CO2, global mean sea level, and surface ocean acidification. We find that lower 21st-century warming targets will significantly reduce ocean acidification this century and will avoid up to 4 m of sea-level rise by year 2300 relative to a high-end scenario.
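    The self-adjusting idea lends itself to a tiny simulation: revisit the emission rate each decade based on how observed warming compares with the target. The sketch below is not the paper's constrained ensemble; the one-line climate (warming proportional to cumulative emissions via an illustrative TCRE-like constant) and the adjustment rule are both invented to show the feedback structure only.

        # Decadal self-adjusting emissions sketch under an invented toy climate.

        target = 1.5                 # degC stabilization target
        tcre = 0.0005                # degC per GtC of cumulative emissions (illustrative)
        cumulative = 1800.0          # GtC emitted to date (illustrative)
        warming = tcre * cumulative  # 0.9 degC to date in this toy climate
        emissions = 10.0             # GtC per year at the start

        for decade in range(2020, 2110, 10):
            gap = target - warming
            # Invented rule: scale the decadal emission rate by the remaining
            # headroom to the target; a negative gap forces rapid cuts.
            emissions = max(0.0, emissions * (0.5 + gap / target))
            cumulative += emissions * 10.0
            warming = tcre * cumulative
            print(decade, f"emissions={emissions:5.2f} GtC/yr  warming={warming:.2f} C")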
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    American Geophysical Union (AGU)
    Publication Date: 2018-03-09
    Description: Sometimes, when you’re designing an experiment or just trying to lug around a bunch of rocks, science makes you think outside the box. Necessity, after all, is the mother of invention. Like Department of External Services agent Angus “Mac” MacGyver, scientists around the world are jerry-rigging everyday objects into specialized equipment. And this being the digital age, there’s a hashtag for that! Here are 10 examples of scientists channeling their inner MacGyver with everyday household products, as told by tweets. Mr. Coffee? More Like Mr. Dirt Perfect for grinding up soil for carbon and nitrogen analysis. Gets just the right texture. Will only last about 2 months before it burns up, but easily replaceable at this price. Bonus points for not being as loud as a ball mill. #reviewforscience pic.twitter.com/ELlAUwUCMi — Jeff Atkins (@atkinsjeff) January 30, 2018   Have More Soil to Grind? Try Some Mesh Great for grinding soils to a fine powder for analysis. Seller said it’s for harvesting trichomes, so it has other lab applications too! Disclaimer: I really thought it meant harvesting trichomes for research & had to be informed otherwise.#reviewforscience #straightedgeproblems pic.twitter.com/Gp1xcnTnJu — MH Andrews Holmes (@skunkcabbages) January 30, 2018   For Those Arctic “Emergencies”… The great thing about the wide opening on a nalgene is you can urinate in it should the conditions become to extreme to leave your tent. The downside to this is if you drink while hiking or driving off-road water often pours onto your face #reviewforscience pic.twitter.com/BOEDEryA5h — Dani Rabaiotti (@DaniRabaiotti) January 30, 2018   And Here Are Some Gloves You Can Wear While “Using” the Bottle Excellent gloves when a surprise April blizzard catches you unprepared while installing PRS soil probes. Also good for insulating cold water bottles, frozen soil cores, and ice packs. Made my hands smell like feet, lack of thumbs did affect handwriting. 4/5 #reviewforscience pic.twitter.com/JIjZgn9mb5 — Cait Rottler (@Caitydid685) January 30, 2018   Storing Sediment Cores Freezer bags: These bags are extremely durable. They hold a large sediment core (16cm diam, 10cm deep) easily & do not tear when lobbed across the mudflat towards the shore or when tiredly piled in the van after fieldwork. #reviewforscience pic.twitter.com/qaalVFBJmt — Rachel ILoveWorms Hale (@_glitterworm) January 30, 2018   OK, So You’ve Got Your Core. How Do You Clean It? Best implement for clearing years of mould and rime off your sediment cores. 5/5. #reviewforscience https://t.co/4VG9PLe3SW — Andy Emery (@AndyDoggerBank) January 30, 2018   Speaking of Cleaning… Signstek Telescopic Golf Ball Retriever #reviewforscience. When wrapped with a few generous layers of #kimwipes, this handy device is perfect for cleaning snow off your tower mounted pyranometer: https://t.co/LK87wKvUpe … pic.twitter.com/Ptayt0qRGT — Elizabeth Burakowski (@LizBurakowski) January 30, 2018   It’s All About the Festive Colors just the right diameter for storing tree-ring cores in the field. Paper rather than plastic reduces chances of molding. Festive colors make tedious coring job less tedious. #reviewforscience pic.twitter.com/rYb4mSDgIM — Valerie Trouet (@epispheric) January 30, 2018   Need to Stay Dry During a Tropical Storm? The best thing about these trash bags is how well a 5’6″ frog scientist can fit inside to stay dry and warm during a tropical storm. 
Highly recommended for other similar sized field researchers #reviewforscience pic.twitter.com/ABH6sGNeIx — Jonathan Kolby (@MyFrogCroaked) January 31, 2018   The Other Kids Won’t Make Fun of You, I Swear These floaties come in a handy 2-pack and are great for floating temperature sensors in ponds and wetlands- just attach with fishing line! The major con is that they are not resistant to biting by dogs, after which they sink only to lose your sensor. #reviewforscience pic.twitter.com/hBBafcbmKA — Dr. Julia E. Earl (@Julia_E_Earl) January 30, 2018 —JoAnna Wendel (@JoAnnaScience), Staff Writer The post Ten Everyday Objects That Can Be Used for Science appeared first on Eos .
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    American Geophysical Union (AGU)
    Publication Date: 2018-03-09
    Description: During his confirmation hearing yesterday before the Senate Committee on Energy and Natural Resources, the Trump administration’s choice to lead the U.S. Geological Survey (USGS), James Reilly II, promised to uphold scientific integrity at the agency. “I am fully committed to scientific integrity. Science drives good policy, and good science has to be there for good policy to be made,” Reilly testified at the hearing, where he received support from Republicans and Democrats. Reilly is a former NASA astronaut who flew on three space shuttle missions. He also is a former petroleum geologist who holds a Ph.D. in the geosciences from the University of Texas at Dallas. He currently serves the U.S. military and allied militaries as a subject matter expert on space operations and works as a technical adviser supporting the National Security Space Institute of the U.S. Air Force. A Focus on Scientific Integrity “Scientific integrity has got to be a key element of the USGS because, as we mentioned, it’s an independent organization that is designed to deliver unbiased science to the decision makers, to you, for example,” Reilly said in response to questioning from Sen. Maria Cantwell (D-Wash.), the committee’s ranking Democrat, about how he would maintain scientific integrity at the agency. “That will be one of the highest priorities that I will have as the director.” If somebody were to ask him to change a document for political reasons, Reilly said, “I would politely decline.” Scientific integrity became a hot-button issue for USGS following revelations that a top official, Murray Hitzman, recently resigned out of concern that the agency provided final results of an assessment of the National Petroleum Reserve in Alaska to the U.S. secretary of the interior prior to its public release. Hitzman said that providing the early look contradicted USGS policy, but the Interior Department said that the secretary was acting within his authority. Hitzman had been the associate director for energy and minerals of USGS, which is within the Department of the Interior. Reilly said he didn’t know all the particulars related to Hitzman’s resignation but that in his other positions, he has always felt a responsibility to deliver information, particularly if it might be sensitive, to his leadership. He did so “with the understanding that the leadership would hold that [information] as tight as I would in terms of it being protected information,” Reilly noted. If, in the future, somebody told him that they were uncomfortable with such a stance, he would deal with the specific example at the time, Reilly added, “and hopefully wouldn’t get in the situation that occurred.” Sen. Lisa Murkowski (R-Alaska), chair of the committee, also weighed in on the topic. “I appreciate you mentioning the scientific integrity of the agency. I think USGS, we know, is known for its focus on seeking out the best science, the best data, and doing so in a way that is not biased and that we can certainly look to.
    And my hope, my ask, is that you maintain that integrity within the agency.” Budget Woes In response to questions from several Democratic senators, Reilly acknowledged the budget challenges facing USGS. The administration’s $857.7 million proposed budget for fiscal year 2019, which was sent to Congress in February, would cut funding for the agency by 21% and reduce appropriations for nearly all major areas within USGS. Some areas, however, such as mineral and energy resources, would see increases. Reilly said that although he doesn’t yet know all the budget details, if he is confirmed, fully understanding the budget would be a top priority. Among his concerns, Reilly said, is “where are we sensitive [and] what are the things that we need to be discussing more with the secretary [of the interior] in terms of the budget.” A Focus on the Agency’s Organic Act Some senators also pushed Reilly about what they said is the need for USGS to return to its core mission as outlined in the Organic Act of 1879. That act of Congress, which established the agency, provided for “the classification of the public lands and examination of the geological structure, mineral resources, and products of the national domain.” Responding to Murkowski’s question about whether the agency is on the right track and where it might need to be adjusted, Reilly said he would work with senior USGS managers to evaluate how well the agency’s core missions today align with the Organic Act. In an interview with reporters following the hearing, Reilly said his biggest priority would be “maintaining the focus on the mission statement that is identified in [the Organic Act]. That pretty much describes the USGS.” Extracting Promises from the Nominee Murkowski, who said she hopes the Senate confirms Reilly soon, also extracted commitments from him on mineral security and on retaining a series of seismographs in Alaska. On mineral security, Murkowski noted that last year the United States imported 100% of its supply of 21 different minerals and at least 50% of its supply of another 30 minerals. Reilly told Murkowski that he shared her concern and promised to review with her what USGS can do. He also committed to working with Murkowski to help her retain in Alaska a series of portable seismographs that are part of what’s known as the USArray. Other senators, including Sen. Joe Manchin (D-W.V.), drew commitments from Reilly to visit their states to learn about local issues. Manchin, in an expression of support for the nominee, noted to Reilly, “It’s hard sometimes to find good recruits [for government positions], and you seem to be the best of the best.” —Randy Showstack (@RandyShowstack), Staff Writer The post USGS Nominee Calls Scientific Integrity a High Priority appeared first on Eos .
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    American Geophysical Union (AGU)
    Publication Date: 2018-03-12
    Description: If solar geoengineering were to be deployed so as to mask a high level of global warming, and then stopped suddenly, there would be a rapid and damaging rise in temperatures. This effect is often referred to as termination shock, and it is an influential concept. Based on studies of its potential impacts, commentators often cite termination shock as one of the greatest risks of solar geoengineering. However, there has been little consideration of the likelihood of termination shock, so that conclusions about its risk are premature. This paper explores the physical characteristics of termination shock, then uses simple scenario analysis to plot out the pathways by which different driver events (such as terrorist attacks, natural disasters, or political action) could lead to termination. It then considers where timely policies could intervene to avert termination shock. We conclude that some relatively simple policies could protect a solar geoengineering system against most of the plausible drivers. If backup deployment hardware were maintained and if solar geoengineering were implemented by agreement among just a few powerful countries, then the system should be resilient against all but the most extreme catastrophes. If this analysis is correct, then termination shock should be much less likely, and therefore much less of a risk, than has previously been assumed. Much more sophisticated scenario analysis—going beyond simulations purely of worst-case scenarios—will be needed to allow for more insightful policy conclusions.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    American Geophysical Union (AGU)
    Publication Date: 2018-03-12
    Description: Saltier than any ocean, surrounded by a scrubby desert, and inhospitable to nearly all kinds of plant and animal life, the Dead Sea is actually a lake, although hardly a typical one. The Dead Sea is a hypersaline lake located in a hyperarid region at the lowest place on Earth (~430 meters below sea level), between Israel and Jordan, about 80 kilometers from the eastern coast of the Mediterranean Sea. Each evening during summer, sea breezes from the Mediterranean waft over its surface and die down again by sunup. This locale makes the Dead Sea a uniquely interesting research subject for scientists studying the evaporation of water over lakes. With other lakes, it is nearly impossible to separate the individual effects of solar radiation (or sunlight) and wind speed on evaporation, since the weather tends to be windy and sunny at the same time of day and both tend to be lacking at the same time of night. Satellite image of the Dead Sea and the Mediterranean Sea using data collected in 2015 by the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor aboard NASA’s Aqua satellite; the white arrow indicates the location of the eddy covariance station. Credit: NASA Here Lensky et al. use an established technique called eddy covariance to measure the diurnal course of the evaporation rate. With these measurements, the roles of solar radiation and wind speed could be treated as separate entities, so that when these two phenomena occur at different times of day—as they do over the Dead Sea, which is windy and dark at night but calm and sunny during the day—it is possible to determine their individual impacts on evaporation. Using eddy covariance systems, the researchers collected data at the Dead Sea as well as at two freshwater bodies in northern Israel, the Eshkol Reservoir and Lake Kinneret, in the middle of summer. Since the latter two sites are physically closer to the Mediterranean Sea, the sea breeze reaches them at different times of day, providing a helpful comparison. The researchers incorporated these data into a comparative framework of the rates of evaporation at all three sites over a 24-hour average (over periods of a few days in summer). In the Dead Sea, they found that the rate of evaporation peaks during the day, just a few hours after solar radiation peaks. At night, there is another evaporation peak at the same time as the wind speed peak. The rate of evaporation is lowest at both sunrise and sunset. In contrast, the Eshkol Reservoir and Lake Kinneret had just one evaporation peak, in the afternoon, which coincides with the Mediterranean Sea’s breeze reaching their shores. The researchers attributed this single, midafternoon evaporation peak to the fact that both the wind speed and solar radiation peaks happen around the same time. These results not only shed more light on one of Earth’s most unusual natural formations, the Dead Sea, but also provide scientists with a better understanding of how water evaporates from the surface of a lake. The study may be the first successful attempt to separate out two key drivers of lake water evaporation: sunlight and wind. ( Water Resources Research , https://doi.org/10.1002/2017WR021536, 2018) —Sarah Witman, Freelance Writer The post Dead Sea Provides Unique Insights on Water Evaporation appeared first on Eos .
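    The eddy covariance principle behind the article reduces to a covariance calculation: the latent heat flux is proportional to the covariance between high-frequency fluctuations of vertical wind speed w and specific humidity q. The sketch below uses synthetic 10 Hz data; real systems apply many further corrections (coordinate rotation, despiking, density corrections) that are omitted here.

        # Minimal eddy covariance sketch on synthetic data.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10 * 60 * 30                     # 30 minutes of 10 Hz samples
        w = rng.normal(0.0, 0.3, n)          # vertical wind fluctuations, m/s
        q = 0.010 + 0.5e-3 * w + rng.normal(0, 2e-4, n)  # kg/kg, correlated with updrafts

        rho_air = 1.2                        # air density, kg/m^3
        lv = 2.45e6                          # latent heat of vaporization, J/kg

        cov_wq = np.mean((w - w.mean()) * (q - q.mean()))
        latent_heat_flux = rho_air * lv * cov_wq          # W/m^2
        evaporation = latent_heat_flux / lv * 3600        # kg/m^2 per hour ~ mm/hr
        print(f"LE = {latent_heat_flux:.0f} W/m^2, E = {evaporation:.2f} mm/hr")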
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2018-03-12
    Description: Geology is, in essence, the history of the Earth, and any history depends on dates and rates. Geochronology supplies these. A new book, Geochronology and Thermochronology , recently published by the American Geophysical Union, presents the current state of this science, including its concepts, approaches, methods, and applications. Here, the authors answer some questions about the science of geochronology and its relevance, and describe how this field has evolved. What is the role of geochronology in Earth and planetary sciences? The Earth and planetary sciences are fundamentally interwoven with the importance of time. Understanding the basic workings of our Solar System requires knowledge of the evolution of the constituent planets and their interactions. This in turn requires sequencing events far back in time, orders of magnitude beyond the scope of human history. The Paria River, a slot canyon carved by fluvial incision in northern Arizona. Geochronology and thermochronology provide key understanding of the timing and rate of uplift of the Colorado Plateau that drove incision of its rivers such as the Paria and Colorado. Credit: Peter Reiners Geochronology expressly fills this need, often providing the key to causality hypotheses linking various phenomena in the distant prehistoric past. Many basic discoveries, from the theory of plate tectonics to the antiquity of our own species, were made possible by geochronology. Such discoveries are ongoing, as geochronology plays a key role in, for example, linking major extinctions of Earth’s biota to large meteoroid impacts and massive volcanic episodes. In such cases, causality requires temporal coincidence, which is what geochronology can establish (or refute). Geochronology and the related discipline of thermochronology are also invaluable for determining the rates at which processes happen; for example, the rates at which species evolve or become extinct. Ultimately, understanding the mechanisms and rates of various modes of planetary evolution enables humankind to better anticipate future events affecting our planet and beyond. What are some examples of the application of geochronology? Want to know when the Solar System formed? Uranium-lead ages for the first mineral grains to condense from the hot gases surrounding the proto-Sun date that event at 4.5675 billion years ago. Want more resolution on this time? The 700,000-year half-life for the radioactive decay of aluminum-26 to magnesium-26 can provide relative ages with precisions of tens of thousands of years for events occurring close to the time of Solar System formation. Want to know when Earth’s core formed? The 9-million-year half-life for the decay of hafnium-182 to tungsten-182 dates that event to within the first 30-100 million years of Solar System history, depending on the duration and mechanism of core formation. Want to know the oldest mineral found on Earth? Grains of the mineral zircon from sediments in western Australia have been dated using uranium-lead to 4.37 billion years. Want to know the rate of sedimentation in a Precambrian sedimentary basin? Either zircon uranium-lead or rhenium-osmium dating can provide the answer, depending on the type of sediment. Want to know the rate of biological evolution? A variety of geochronologic methods now place precise absolute dates on the boundaries of the geologic time scale that historically were marked by the appearance or disappearance of certain types of fossils.
    Geologists Jay Quade and Kendra Murray sample volcanic ash deposits in the Atacama Desert, Chile, to determine the age of sedimentary rocks bearing a record of tectonic processes in the Central Andes. Credit: Peter Reiners Want to know where the magmas erupted at volcanoes come from? Uranium-series dating tells us their sources, ascent rates, and residence times in magma chambers. Want to know the uplift or erosion rate of a terrane? Uranium-helium dating of apatite can determine the time when a rock cooled below 60 degrees Celsius, and cosmogenic nuclides created by the impact of cosmic rays on surface materials can be used to determine the rates of removal of surface layers from rocks or sediments. What is the specific focus of thermochronology? Thermochronology is a type of geochronology based on the fact that, in some minerals, some daughter products produced by radioactive decay are not fully retained in crystals until those crystals have cooled to low temperatures. The “closure temperature” at which this full retention starts depends on the decay system (for example, uranium to lead, potassium to argon, etc.) and the type of mineral. Examples include the uranium-lead system in titanite, the potassium-argon system in biotite mica, and the uranium-helium system in apatite, which have closure temperatures of about 600, 300, and 60 degrees Celsius, respectively. If we measure the ages of these minerals/systems in the same rock, we can map out the time-temperature path (thermal history) of the rock. This is useful for many applications, including understanding when and how fast rocks reach Earth’s surface by faulting or erosion during mountain building, when metamorphism occurs deep in the crust, or when surficial events like meteorite impacts or wildfires occurred in the past. How has the discipline of geochronology developed and advanced over the last century? Geochronology in its nascent years was mainly the playground of physicists using homemade instruments and methods. Today, the field is mainly populated by geoscientists using commercially available equipment and common protocols to analyze rocks and other materials. Cathodoluminescence images of zircon crystals from igneous rocks of the Cornucopia Stock in northeastern Oregon. Geochronology and thermochronology of zircon provide some of the most precise and accurate ages of Earth materials, as well as their thermal histories. Credit: Peter Reiners Progress in geochronology has evolved along several interrelated tracks in the last century. The most fundamental is conceptual, beginning with the discovery of radioactivity, which allowed sequences of events and roughly estimated times to be quantified. Remarkably, this enabled rudimentary geochronology (and actually foreshadowed thermochronology) even before the discovery of the neutron, which in turn led to the recognition of isotopes and major conceptual refinements in the field, as well as analytical methods such as mass spectrometry. Another major area of progress has been technical, notably accelerated by the invention of mass spectrometers, which have undergone continuous refinement in terms of stability, sensitivity, and throughput over the past half-century. These technical developments have enabled increasingly large quantities of data, of increasingly better quality from ever-smaller samples, to be obtained. A third avenue of progress, driven in part by the burgeoning data quality and volume occasioned by technical advances such as process automation, has been in data analysis and calibration.
Our knowledge of radioisotopes' half-lives is now in many cases the accuracy-limiting step in geochronology, and new measurements are ongoing to further improve this situation. Also ongoing are developments of numerical techniques for translating isotope measurements into ages and meaningful uncertainties in these ages. Measurements of diffusion parameters needed for thermochronology, and of cosmogenic production rates needed for cosmogenic nuclide dating, have dramatically strengthened these applications and are undergoing continuous refinement today. What are some of the most exciting new techniques in the field? Refinements in uranium-lead dating of zircon can now determine crystallization ages to better than 0.01%. Similar precision can be obtained with modern argon-argon techniques on young volcanic rocks. The use of a number of radioactive isotopes that were present when the Solar System formed, but have half-lives much shorter than the age of the Earth, now allows precise resolution of the events that were occurring in the first five million years or so of Solar System history, when the planets, including Earth, formed. Although once abandoned as a "leaky" geochronometer, the accumulation of helium from uranium and thorium decay has been turned into a versatile thermochronometer that can be used to measure the timing and rates of cooling of rocks to temperatures close to those at the Earth's surface. The use of cosmogenic isotopes, those generated by the interaction of Earth's surface with cosmic rays, allows wide-ranging applications from archeology, to paleoseismology, to studies of landscape evolution. Another technique experiencing exciting growth is dating of nucleogenic neon, which forms as a by-product of uranium and thorium decay and is opening up opportunities for dating minerals, including some types of hydrothermal ores, that have been difficult or impossible to date by other means. Geochronology and thermochronology are dynamic fields whose capabilities and range of applications will continue to expand for the foreseeable future. Geochronology and Thermochronology, 2017, 480 pp., ISBN: 978-1-118-45585-2, list price $150 (hardcover), $100 (paperback), $80.99 (e-book) —Peter W. Reiners, University of Arizona; email: reiners@email.arizona.edu; Richard W. Carlson, Carnegie Institution for Science; Paul R. Renne, Berkeley Geochronology Center and University of California; Kari M. Cooper, University of California, Davis; Darryl E. Granger, Purdue University; Noah M. McLean, University of Kansas; and Blair Schoene, Princeton University The post The Science of Dates and Rates appeared first on Eos.
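All of the ages quoted above follow from the same radioactive decay law: a parent isotope with decay constant λ = ln 2 / t½ yields an age t = ln(1 + D/P)/λ from the measured daughter/parent ratio D/P. A minimal sketch in Python; the isotope ratio and the two-thermochronometer cooling example are hypothetical illustrations built from the round-number half-lives and closure temperatures mentioned in the interview, not data from the book:

```python
import math

def decay_constant(half_life_yr):
    """Decay constant: lambda = ln 2 / half-life."""
    return math.log(2) / half_life_yr

def age_from_ratio(daughter_parent_ratio, half_life_yr):
    """Age t = ln(1 + D/P) / lambda, assuming no initial daughter and a
    closed system since the crystal passed its closure temperature."""
    return math.log(1.0 + daughter_parent_ratio) / decay_constant(half_life_yr)

# Illustrative: a zircon with 206Pb/238U ~ 0.97 (238U half-life ~4.468 Gyr)
# gives an age near 4.37 billion years, like the oldest zircons noted above.
print(f"{age_from_ratio(0.97, 4.468e9) / 1e9:.2f} Gyr")

# Thermochronology turns several such ages into a cooling history. A
# hypothetical rock with a 50 Myr titanite U-Pb age (closure ~600 C) and a
# 10 Myr apatite U-He age (closure ~60 C) cooled, on average, at:
print(f"{(600 - 60) / (50 - 10):.1f} degrees C per Myr")  # 13.5 C/Myr
```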
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2018-03-14
    Description: Beneath vast plains of Arctic tundra and swampy taiga forests lies permanently frozen ground, or permafrost. As northern polar regions continue to warm at a rate twice the global average, this permafrost begins to thaw. Unfrozen, waterlogged soils are like witches' cauldrons for methane, a greenhouse gas 25 times more potent than carbon dioxide. In these environments, organic material from plants and other sources slowly decays with the help of microorganisms called Archaea, releasing methane (CH4) into the atmosphere [Schuur et al., 2015]. Scientists know that this process is occurring, but the precise amount of Arctic carbon released as CH4 remains uncertain. Also, atmospheric measurements of the amounts of methane released by permafrost (a top-down approach) are far less than estimates of these amounts made using point-based field assessments and ecosystem modeling (bottom-up approaches). Thus, how a changing climate has affected and will affect future CH4 emissions remains a topic of debate among scientists. The Study of Environmental Arctic Change (SEARCH) launched a CH4 synthesis project with the goal of estimating contemporary budgets for CH4 in the Arctic and projecting rates of future release. This effort aims to outline the current CH4 budget and provide guidelines for monitoring future CH4 release from the northern permafrost region. The project was initiated at the International Workshop to Reconcile Northern Permafrost Region Methane Budgets, held in Seattle in March 2017, and it includes a broad consortium of more than 40 scientists. Fig. 1. This map displays permafrost regions of the Arctic. Dark purple indicates continuous permafrost coverage, whereas lighter colors indicate discontinuous, sporadic, or isolated coverage. The Stampede Trail research site is identified with a yellow star. Credit: NASA Earth Observatory using data from the National Snow and Ice Data Center Here we highlight what we know, as well as a selection of important knowledge gaps in our understanding of terrestrial, marine, and atmospheric environments that affect the development of Arctic CH4 budgets (Figure 1). We also summarize new work focused on improving our understanding of CH4 dynamics in this region. Inland Emissions What We Know. Total global CH4 emissions are likely 550–650 billion kilograms per year (550–650 teragrams, Tg). Overall, wetlands and lakes are likely the largest source of CH4 emissions, followed by contributions from submerged permafrost along the Arctic Ocean shelf. Human activities (e.g., oil and gas drilling), geologic seeps, and fire also contribute to the total budget. For latitudes above 60°N, emissions are estimated to be 18–29 Tg CH4 per year on the basis of top-down atmospheric model approaches. Wetlands, lakes, and other riparian areas are responsible for more than 70% of annual CH4 emissions from northern land regions, whereas seeps, fires, and fossil fuel burning account for the remainder (Figure 2). Fig. 2. Lakes and wetlands are the largest of the primary sources of CH4 emissions from northern high latitudes (60°N–90°N). Current emission estimates are shown in teragrams (Tg) per year. Permafrost thaw in the Arctic can initially lead to wetter landscapes, development of new lakes and wetlands, and increased CH4 emissions [Olefeldt et al., 2016]. Continued thaw results in draining of surface waters and drying of upper soil layers, which might mitigate CH4 loss to the atmosphere [Watts et al., 2014]. Knowledge Gaps.
The emissions estimates shown in Figure 2 are compiled from the Arctic Monitoring and Assessment Programme (AMAP) [2015], Walter Anthony et al. [2012], and expert opinion. These numbers are highly uncertain because of the remoteness of the northern regions and the limited extent of data networks for long-term monitoring. Ongoing changes in land components, including the appearance of new lakes and the disappearance of older water bodies as subsurface permafrost erodes and opens new drainage passages, can substantially affect localized CH4 fluxes and further complicate regional emission mapping. Resolving the magnitude and location of change in CH4 emissions, especially those from wetlands and lakes, remains a formidable task for researchers. Winter CH4 release could account for more than 50% of the annual budget, but more research is needed to better understand the magnitude of emissions occurring during cold seasons [Zona et al., 2016]. Scientists still lack landscape-scale monitoring and mapping systems capable of detecting short-term (e.g., monthly) and decadal changes in surface wetness and temperature. There's also a need for accurate soil carbon and land cover maps that distinguish between wetlands, lakes, and rivers to avoid double counting in emissions budgets [Wrona et al., 2016]. A better understanding of the ways that vegetation regulates CH4 production and mediates CH4 transport will help to inform models and explain why emission responses differ for different landscapes. Marine Methane What We Know. Ongoing changes in the Arctic Ocean will affect future CH4 emissions. A reduction in sea ice extent could increase the direct transfer of gas from the ocean to the atmosphere. Warming ocean temperatures can increase CH4 production as permafrost underlying the continental shelf begins to thaw. Bubbling from shallow shelf sediments creates hot spots of CH4 emissions to the atmosphere [Shakhova et al., 2015]. In deeper shelf regions, much of the produced CH4 is dissolved and oxidized in the water column or transported into deeper, dense waters [Myhre et al., 2016]. Knowledge Gaps. Bottom-up estimates for marine environments differ greatly depending on which offshore ocean shelf is investigated, reflecting different processes among the circum-Arctic shelves [AMAP, 2015]. New shipboard direct sampling methods for air-sea CH4 exchange will enable researchers to better quantify marine fluxes and will help the community address disparities in marine CH4 emissions estimates among different regions [Thornton et al., 2016]. Methane in the Atmosphere What We Know. Atmospheric scientists measure the amount of CH4 gas in the atmosphere and use these data, along with models of atmospheric transport, to estimate the amount of CH4 released at Earth's surface. Scientists observe atmospheric CH4 using a network of about 20 towers across the Arctic, and they intensively measure atmospheric CH4 from intermittent aircraft flights. Airborne sampling near Inuvik, Canada, provides a bird's-eye view of CH4 across a large region. Credit: NASA/Katy Mersmann Knowledge Gaps. These efforts produce regional- and continental-scale estimates of where, when, and how much CH4 is released, but estimates vary according to the method used.
For example, atmospheric studies indicate less CH4 from boreal forests in North America and more from cold, Arctic tundra relative to bottom-up estimates [e.g., Miller et al., 2016]. Integrating these CH4 observations is a key challenge. Various methods collect data at different scales: Chamber measurements collect data over square-meter areas, tall towers and aircraft observe larger areas, and satellites (e.g., the Greenhouse Gases Observing Satellite, or GOSAT) observe areas larger than a square kilometer. Top-down and bottom-up estimates that use only one of these data sources often arrive at very different CH4 totals, and they extrapolate emissions over time and space in ways that are difficult to compare. Chamber-based measurements near the Stampede Trail capture fine-scale relationships between CH4 fluxes and environmental drivers, providing a detailed understanding of CH4 release from Arctic tundra. Credit: Meghan Taylor A challenge for the scientific community is to find consensus in the methodologies used to scale CH4 from sample locations to the larger domain and to integrate information obtained from these various data sets. An integrated approach could better elucidate patterns and trends in CH4 fluxes and help pinpoint the underlying mechanisms driving these changes. These results could help flag the regions of greatest concern for future CH4 release. Overall, the total amount of CH4 emissions that atmospheric scientists see from high latitudes is half of that in bottom-up estimates, and scientists are working to understand this discrepancy [Bruhwiler et al., 2014]. Furthermore, the answer to the question of whether CH4 emissions are increasing remains elusive. In many regions, the atmospheric data record is too short (30–35 years at most) to conclusively quantify long-term CH4 trends [e.g., AMAP, 2015]. Efforts to Reconcile the Northern CH4 Budget Scientists participating in the SEARCH CH4 synthesis project are working to better constrain the CH4 budget in the northern permafrost region. Researchers are creating a comprehensive database of Arctic CH4 observations from disparate measurement platforms. Data availability has been an obstacle for existing studies, and the resulting database will allow future studies to better synthesize existing observations. Project members are working to improve the methodology used to extrapolate from site-level measurements to continental scales. These measurements are often sparse, and varying extrapolation methods can result in large differences in terrestrial and marine CH4 budgets. Scientists will conduct a data synthesis of understudied winter emissions, which may account for a large fraction of total emissions in the region. SEARCH participants will make recommendations for how to improve and expand existing CH4 observing networks. For example, they are evaluating whether the existing atmospheric observation network could detect a broad emissions trend, and they are outlining the new observations that would be needed to detect such a trend. Project members are also planning to incorporate new, forthcoming data into the SEARCH synthesis activities. These data include new aircraft observations from the northern permafrost region (e.g., the NASA Arctic-Boreal Vulnerability Experiment) and measurements of sea-air gas exchange (e.g., the U.S. Geological Survey (USGS) Gas Hydrates Project).
Taken together, these efforts will enable a better understanding of present-day CH4 budgets and the underlying environmental drivers, which will help scientists predict future CH4 release and the associated impacts on global climate. Acknowledgments We thank NASA, USGS, the U.S. Arctic Research Commission, and the Arctic Research Consortium of the United States (ARCUS) for sponsoring the workshop. Special thanks go to A. David McGuire, Charles Miller, Ted Schuur, and all workshop participants for the discussions that inspired this article. The post Understanding High-Latitude Methane in a Warming Climate appeared first on Eos.
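The budget arithmetic above is simple to make concrete. A minimal sketch that totals high-latitude source terms and converts them to CO2 equivalents using the 25× potency factor quoted in the article; the per-source values below are placeholders chosen to fall inside the article's 18–29 Tg/yr top-down range and are not the AMAP [2015] estimates themselves:

```python
# Illustrative high-latitude CH4 budget arithmetic. Source values are
# placeholders consistent with the 18-29 Tg/yr range quoted above; they
# are NOT the AMAP [2015] figures.
GWP_CH4 = 25  # warming potency relative to CO2, as stated in the article

sources_tg_per_yr = {
    "wetlands_and_lakes": 16.0,   # >70% of northern land emissions
    "geologic_seeps": 3.0,
    "fires": 1.5,
    "fossil_fuel_activities": 2.5,
}

total_ch4 = sum(sources_tg_per_yr.values())
co2_equivalent = total_ch4 * GWP_CH4

print(f"Total northern CH4 emissions: {total_ch4:.1f} Tg/yr")
print(f"CO2-equivalent: {co2_equivalent:.0f} Tg CO2-eq/yr")
print(f"Wetlands/lakes share: {sources_tg_per_yr['wetlands_and_lakes'] / total_ch4:.0%}")
```

Comparing such a bottom-up total against a top-down atmospheric estimate is, in essence, the reconciliation task the SEARCH project describes: the two sums should agree, and the current factor-of-two gap is the open problem.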
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 61
    Publication Date: 2018-03-14
    Description: Life permeates Earth's critical zone, where microorganisms have inhabited nearly all of our planet's surface and near surface for the last 3.5 billion years. Given the vast time that Earth has been teeming with life, it is hard to imagine what the planet would be like without its biosphere. But Earth without life is exactly what participants at a recent meeting sought to contemplate. More than 30 scientists from eight countries attended an international workshop hosted by the Earth-Life Science Institute Origins Network (EON) at the Tokyo Institute of Technology in September 2017. The participants contributed expertise in Earth science, planetary science, biology, chemistry, and mathematics. To begin this thought experiment, participants sought to answer the following question: What are the key characteristics of an abiotic Earth compared to the Earth that we know? Exploring this question may help uncover essential aspects of what makes our home planet habitable. What we learn may help us to assess the possibility of extraterrestrial life elsewhere in the universe. Attendees contemplated the hypothesis that "everything on Earth that is or has been influenced by water is inseparably coupled with life." Scientists debated questions such as whether any surface process on Earth is truly abiotic, to what degree a process has been influenced by life, and whether everything in the critical zone, deeper in the crust, and even in the mantle has been affected by life. Participants engaged in spirited debates about how best to evaluate abiotic processes. They concluded that developing a set of standards for abiotic and biotic characteristics could help advance community understanding by providing quantitative metrics for comparison across what are often very different data types and observed time frames. Long discussions questioned whether enough is presently known about the boundaries of life on Earth to make such assessments, especially in light of continuing revelations about the many challenging conditions to which extremophiles have adapted. Attendees agreed that evidence for life falls into three primary categories of biosignatures: objects (physical features such as mats, fossils, and concretions); substances (elements, isotopes, molecules, allotropes, enantiomers, mineral identities, and properties); and patterns (physical three-dimensional or conceptual n-dimensional relationships of chemistry, physical structures, etc.). Small breakout groups addressed many different expressions and the preservation potential of biosignatures in these three broad categories.
Participants also identified five key issues that warrant further development: (1) the criticality of examining phenomena at the "right" spatial scale, and how biosignatures may elude us if not examined with the appropriate instrumentation or modeling approach at that specific scale; (2) the need to identify the precise context across multiple spatial and temporal scales to understand how tangible biosignatures may or may not be preserved; (3) the desire to increase the community's capability to mine big data sets to reveal major relationships, for example, how Earth's mineral diversity may have evolved in conjunction with life; (4) the need to leverage cyberinfrastructure for data management of biosignature types, classifications, and relationships; and (5) the utility of 3-D to n-D representations of biotic and abiotic models overlain on multiple overlapping spatial and temporal relationships, which can provide new insights. The lively and engaged mood of the participants resulted in emerging collaborations to pursue these challenges into the future. —Marjorie A. Chan (email: marjorie.chan@utah.edu), Department of Geology and Geophysics, University of Utah, Salt Lake City; H. James Cleaves II, Earth-Life Science Institute, Tokyo Institute of Technology, Japan; and Penelope J. Boston, NASA Astrobiology Institute, Moffett Field, Calif. The post What Would Earth Be Like Without Life? appeared first on Eos.
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2018-03-16
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    Publication Date: 2018-03-06
    Description: In March 2005, a climate policy analyst at the U.S. Global Change Research Program resigned in protest over what he saw as political interference in science. After 10 years in federal service, Rick Piltz said he could no longer be complicit in what he called a "conspiracy of silence." The people around him, he claimed, were allowing politics to be injected into what should have been scientific work on climate change. Because Piltz spoke out, we learned that the Bush White House was indeed tampering with science [Doremus, 2008]. In a now widely known case, White House aide Philip A. Cooney, a nonscientist, was editing scientific documents to downplay the link between greenhouse gas emissions and climate change. This was a case of a loss of scientific integrity in the federal government. Such cases include instances where political, financial, and ideological forces interfere in the process of evidence-based decision-making. To be clear, many factors beyond scientific evidence (e.g., cultural values) legitimately go into policy decisions. However, undue influence that interferes with the ability of independent science to inform what should be science-based decisions is problematic [Wagner, 2015]. Evidence collected by the Union of Concerned Scientists suggests that science in the federal government is currently being conducted in an environment that discourages the use of scientists' knowledge in decision-making [Goldman et al., 2017]. But little is known about the scope and scale of such issues and how they affect scientists' ability to meet the goals of their science-based agencies' missions. One useful tool for obtaining a broader view is surveying federal scientists. This month, the Union of Concerned Scientists, working with Iowa State University, is conducting a scientific integrity survey to gain such a perspective. The survey includes more than 60,000 scientists in 16 federal agencies, and it asks vital questions about scientific integrity and the atmosphere within agencies. This will be the first scientist survey under the Trump administration, and we hope it will secure vital data about how the administration is doing on science. Our hope is that these data will pinpoint the challenges to scientific integrity that government scientists face in the current political climate. Whistle-Blowers and Anecdotes The written and verbal records of a few whistle-blowing federal scientists and others willing to share stories provide anecdotal evidence on the current state of scientific integrity in the U.S. government. For example, in a Washington Post piece last July, Department of the Interior scientist Joel Clement told the world about the Trump administration's reassignments of scientists within his department. Clement, whose work focused on helping native Alaskan communities mitigate and adapt to climate change impacts, was reassigned to an accounting office, collecting royalty checks from the oil and gas industry. Clement's action led to a wave of media inquiries, congressional investigations, and public outcry. In other cases, scientists have been removed from agency decision-making on scientific topics.
Across federal agencies, science advisory committees, which provide crucial independent scientific input to government decisions, have been dismissed, dismantled, and allowed to sit idle [Halpern, 2018]. For instance, in June 2017, the U.S. Environmental Protection Agency (EPA) decided not to ban certain uses of the pesticide chlorpyrifos, despite recommendations from EPA scientists to ban the pesticide because exposure is linked to neurological damage in children. Last year, reports from the EPA and the Department of the Interior revealed political interference in the selection of research grant recipients; administration appointees with minimal scientific training handpicked which grants to fund [Goldman et al., 2018]. These examples are just a few of many. But what's lacking is a fuller picture: Are these examples indicative of current trends? Or are they aberrations? A Broad Survey To understand the state of scientific integrity under the current administration, we need information beyond the accounts of a few whistle-blowers and leaked documents. Surveying federal scientists will help the public gain this understanding. To that end, 63,383 federal scientists have been contacted and asked to take our survey. If you are interested in learning more about the survey, please visit our survey's website. Results of the survey will be publicly released in early summer 2018 and will be shared with the scientific community, the media, elected officials, and federal agency staff. Past Surveys: Results, the Changes They Helped to Trigger, and Lessons Learned Since 2005, the Union of Concerned Scientists has conducted anonymous surveys of scientists across federal agencies. The surveys provide a needed pulse on the status of scientific integrity across the government, shedding light on how agencies differ from each other and over time, what policies and practices need changing, and the overall well-being of scientists working in the government. For example, are scientists able to do their jobs and communicate their science? Is political interference affecting their work? How is morale? Are scientific integrity policies being fully implemented? For more than a decade and across both the Bush and Obama administrations, surveys of federal scientists have provided answers to these questions—and those answers have led to concrete changes at federal agencies. In 2011, the National Science Foundation developed a media policy following survey responses and policy analysis developed by the Union of Concerned Scientists. In 2013, within hours of the release of a cross-agency social media communications assessment, the U.S. Geological Survey improved its social media policy to better ensure scientifically accurate agency communications. Scientist surveys have improved policies and practices across the government, providing crucial data on federal scientific integrity issues [Carroll et al., 2017]. The surveys also provide a window into the evolution of policies and practices within agencies over time. Between 2005 and 2007, the surveys indicated that federal scientists had few protections against overt political interference in their work.
Some 1,028 scientists (60% of respondents) from a survey of climate researchers at seven agencies and a separate survey of EPA researchers reported that they had personally experienced at least one incident of political interference in their work over the previous 5 years [Union of Concerned Scientists, 2009]. After the introduction of scientific integrity policies at more than 24 federal agencies by 2011, the surveys began to indicate a shift in scientists' experiences. In a 2015 survey of the Centers for Disease Control and Prevention (CDC), the U.S. Fish and Wildlife Service (FWS), the U.S. Food and Drug Administration (FDA), and the National Oceanic and Atmospheric Administration (NOAA), 67% (2,351 respondents) agreed or strongly agreed that their agency adhered to its scientific integrity policy [Goldman et al., 2015]. But we have also learned from surveys that policies don't equal practice. Despite strong policies in place, many scientists in the same 2015 survey reported scientific integrity challenges at their agency. For example, 1,412 respondents to the survey (23%) disagreed or strongly disagreed that they could openly express any concerns about the mission-driven work of their agency without fear of retaliation (Figure 1). Fig. 1. In a 2015 survey, 52%–59% of federal scientists across agencies said they agreed or strongly agreed with the statement "I can openly express concerns about the mission-driven work of my agency without fear of retaliation." Do federal scientists still feel this way? A 2018 survey will shed light. What can be effective, we learned—in addition to having a strong policy in place—is strong leadership that prioritizes scientific integrity and creates an environment where it can thrive. For example, one NOAA scientist stated in an open response to a question, "In general, I think there is a culture of scientific integrity at NOAA.…Although I do not connect much with the highest levels in the agency I feel that, with a few exceptions, they are making the best decisions they can under the circumstance." Overcoming a Culture of Fear Surveys like these provide a needed and anonymous voice for federal scientists, which is crucial because for every Rick Piltz or Joel Clement, many within the government stay silent. Currently, the reason may be, in part, that the administration has created a culture of fear. Thus, federal employees may feel they are best served by keeping their heads down and avoiding any attention from those in charge. We've already seen several cases where government employees who manage scientists or serve as the gatekeepers of scientific output chose to avoid certain politically contentious scientific work—even without being explicitly directed by political appointees. Before President Trump was even sworn in, the Centers for Disease Control and Prevention canceled a long-planned conference on climate change and public health. Both the U.S. Geological Survey and NOAA have released press releases on new climate-related research, with references to climate implications suspiciously absent. These individual acts of censorship serve the status quo, perhaps even allowing scientists and other government employees to continue their day-to-day work. But en masse, this censorship harms the public good [Ritchie et al., 2017; Branscomb, 2004]. The Public Good of Freely Conducted Federal Science All Americans depend on the science produced by federal agencies.
From weather forecasting to disaster preparedness, food safety inspections to drug approvals, federal scientists play a key role in protecting the public good. Pollution mitigation, medical research, and infectious disease monitoring by government scientists have saved countless lives. If scientists are unable to freely conduct research, if they avoid studying politically contentious topics, and if they can't talk to the public honestly about the work they do, the scientific enterprise and the lives that depend on it are at risk. The process by which scientific knowledge informs policy-making decisions works, but only if there is an environment that allows federal scientists to effectively produce such knowledge. Because the current political climate may discourage federal scientists from speaking out on their working environment, it is important for scientists outside of the government to investigate and assess these working conditions. Only through such knowledge can we effectively guide actions to ensure scientific integrity within our government. This year, as a flood of unsettling stories on the treatment of science and scientists within the federal government has raised alarms in the science community, this kind of broad view is incredibly important, and the input of federal scientists is crucial. We cannot again allow a "conspiracy of silence" to stop scientists from doing their jobs and to prevent science from serving the public. Our nation's future depends on it. Acknowledgment The authors would like to acknowledge the support of members of the Union of Concerned Scientists.
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2018-02-23
    Description: Climate change, globalization, urbanization, social isolation, and increased interconnectedness between physical, human, and technological systems (Cutter et al., 2015) pose major challenges to disaster risk reduction (DRR). Consequently, economic losses caused by natural hazards are increasing in many regions of the world (Figure 1), despite scientific progress, persistent policy action, and international cooperation (United Nations, 2015).
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 65
    Publication Date: 2018-02-27
    Description: Increasing demands for water, driven by population growth and socio-economic development, environmental regulation, and future climate uncertainty, are highlighting limitations on water supplies. This water-energy-food-environment nexus is not confined to semi-arid regions but is emerging as a key business, societal, and economic risk in humid and temperate countries, where abundant water supplies and regulation have historically coped with fluctuating demands between industry, power generation, agriculture, domestic supply, and the environment. In the UK, irrigation is supplemental to rainfall, consumptive in use, and concentrated in the driest years and most resource-stressed catchments. This paper describes an empirical application of a mixed-methods approach to integrate agriculture into a robust decision-making framework, focusing on a water-stressed region in England. The approach shows that competing demands between sectors can be reconciled and that potential options or portfolios compatible with multi-sectoral collaboration and investment can be identified. By combining model outputs to forecast the impacts of climate and socio-economic change on agricultural demand within a regional water resource simulator, future spatial estimates of demand were derived. A set of search and tracked metrics was used to drive multi-criteria searches to identify preferred supply- and demand-management-orientated portfolios. The methodological challenges in forecasting agricultural demand, defining acceptable 'trade-offs', managing scale and uncertainty issues, and the importance of engendering open dialogue between stakeholders are described. The study provides valuable insights for countries where similar emergent issues regarding conflicts over water demand exist.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    Publication Date: 2018-03-06
    Description: We use multiple synthetic mitigation sea-level scenarios, together with a non-mitigation sea-level scenario, from the Warming Acidification and Sea-level Projector model. We find sea-level rise continues to accelerate post 2100 for all but the most aggressive mitigation scenarios, indicative of 1.5°C and 2.0°C. Using the Dynamic Interactive Vulnerability Assessment modelling framework, we project land and population exposed in the 1 in 100 year coastal flood plain under sea-level rise and population change. In 2000, the flood plain is estimated at 540 × 10³ km². By 2100, under the mitigation scenarios, it ranges between 610 × 10³ km² and 640 × 10³ km² [580 × 10³ km² and 700 × 10³ km² for the 5th and 95th percentiles]. Thus, differences between the mitigation scenarios are small in 2100. However, by 2300, flood plains are projected to increase to between 700 × 10³ km² and 960 × 10³ km² [610 × 10³ km² and 1,290 × 10³ km²] for the mitigation scenarios, but 1,630 × 10³ km² [1,190 × 10³ km² and 2,220 × 10³ km²] for the non-mitigation scenario. The proportion of global population exposed to sea-level rise in 2300 is projected to be between 1.5% and 5.4% [1.2% to 7.6%] (assuming no population growth after 2100) for the aggressive mitigation and the non-mitigation scenarios, respectively. Hence, over centennial timescales there are significant benefits to climate change mitigation and temperature stabilization. However, sea levels will continue to rise, albeit at lower rates. Thus, potential impacts will keep increasing, necessitating adaptation of existing coastal infrastructure and the careful planning of new coastal developments.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 67
    Publication Date: 2018-03-06
    Description: Despite the societal relevance of sea-level research, a knowledge-to-action gap remains between researchers and coastal communities. In the agricultural and water-management sectors, intermediaries such as consultants and extension agencies have a long and well-documented history of helping to facilitate the application of scientific knowledge on the ground. However, the role of such intermediaries in adaptation to sea-level rise, though potentially of vital importance, has been less thoroughly explored. In this commentary, we describe three styles of science intermediation that can connect researchers working on sea-level projections with decision-makers relying on those projections. We illustrate these styles with examples of recent and ongoing contexts for the application of sea-level research, at different spatial scales and political levels ranging from urban development projects to international organizations. Our examples highlight opportunities and drawbacks for the researchers involved and communities adapting to rising seas.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2018-03-09
    Description: Many developing nations in earthquake-prone areas confront a tough problem: How much of their limited resources should they use to mitigate earthquake hazards? This decision is difficult because major earthquakes are infrequent, and it is unclear when one may happen, how big it could be, and how much harm it may cause. Moreover, these nations have profound immediate needs, including such ongoing rapid transformations as urbanization. Tough societal challenges for which crucial information is missing and for which proposed solutions involve complex interactions with other issues are called "wicked" problems [Rittel and Webber, 1973]. These contrast with "tame" problems, in which necessary information is available and solutions, even if difficult and expensive, are straightforward to identify and execute. A close look at issues involved with mitigating earthquake risk in Bangladesh illustrates what researchers and disaster managers can do to address wicked problems in disaster management. The examination shows that wicked problems, despite their complexity, can be approached with strategies that should reduce vulnerabilities and potentially save lives. Wicked or Tame? Updating the United States' aging infrastructure is a tame problem because what is wrong and how to fix it are clear. In contrast, addressing climate change is a wicked problem because its effects are uncertain and the best strategies to address them are unclear [Stang and Ujvari, 2015]. Natural hazard problems can be tame or wicked. Earthquake hazard mitigation for San Francisco is a relatively tame problem. Studies of regional geology and past earthquakes have been used to infer shaking in future earthquakes and develop mitigation approaches, including codes for earthquake-resistant construction. The population is affluent and aware enough to accept these measures, although financing and carrying out these measures is still challenging. In contrast, earthquake hazard mitigation in Bangladesh and its surroundings is a wicked problem (Figure 1). Bangladesh is the world's most densely populated nation, with 160 million people, approximately half the U.S. population, crowded into an area the size of Iowa. The region lies on the boundary between plates whose collision uplifts the Himalayas, but complex geology and sparse data make it difficult to assess earthquake hazard. Thus, it is difficult to decide how much of the limited resources available should be used for earthquake hazard mitigation, given other more immediate needs. Fig. 1. (a) Major tectonic boundaries of the northeast Indian plate. Numbers represent the years of major historic earthquakes. The base map shows population density. Bangladesh, at the northern end of the Bay of Bengal, has more than 1,000 people per square kilometer and is situated on a seismic gap. (b) Topographical image from NASA's Shuttle Radar Topography Mission (green) of the area around the seismic gap in Bangladesh, with an overlay of night lights as a proxy for population density (from C. Small). The black lines show the major thrust systems, with ticks on the upper plate. The red overlay shows the locked megathrust [from Steckler et al., 2016]. The striped area shows the uncertain downdip limit of the locked zone based on two models for the structure. The lighter coloring for the updip (western) portion of the megathrust is due to the uncertainty regarding whether the frontal part will rupture in the next megathrust earthquake.
(c) Schematic cross section showing the locked megathrust in red, with dashed portions indicating the uncertain updip and downdip limits. Credit: (a) Modified from Steckler et al. [2008]; (c) modified from Steckler et al. [2016] For example, 31% of Bangladeshis live below the national poverty line, according to data from 2010. Per capita gross domestic product is only about $1,200, so Bangladesh needs to devote resources to economic growth. Bangladesh also needs resources to address challenges resulting from the nation's low elevation. Almost half the population lives within 10 meters of sea level, so the country is very vulnerable to tropical cyclones, riverine flooding, and rising sea level. Hazards, Risks, and Vulnerability "Hazards" are the natural occurrence of earthquakes or other phenomena over which we have no control, whereas "risks" are the dangers they pose to lives and property. In this formulation, risk is the product of hazard and vulnerability. We want to assess hazards—to estimate their significance—and develop methods to reduce vulnerabilities and mitigate the resulting losses. We can only assess hazards as best we can, but risks are affected by human actions that increase or decrease vulnerability, such as where people live and how they build. A disaster occurs when, because of high vulnerability, a natural event has major negative consequences for society. Vulnerable Urban Areas Assessments of hazards, vulnerabilities, and risks illustrate another factor that makes the earthquake problem particularly wicked for developing countries: Many are rapidly urbanizing and thus increasing their vulnerability such that earthquake hazards will have amplified effects. For example, in their humid subtropical environment, rural Bangladeshis traditionally relied on modest homes with walls of mud or bamboo, which are less dangerous and more easily rebuilt than large concrete structures. Along the Himalayan plate boundary, more than 50 million people now live in cities of at least a million inhabitants, including the capitals of Bangladesh, Bhutan, India, Nepal, and Pakistan. These rapidly growing, crowded megacities are filled with multistory concrete buildings that are likely vulnerable to earthquakes. Dhaka, Bangladesh's capital, is one of the world's fastest growing megacities. Some 16 million people currently live in Dhaka, and the potential collapse of services and accessibility after an earthquake compounds their risks. This view of Dhaka, Bangladesh, shows the contrast between a waterfront neighborhood with densely packed small houses and the skyscrapers in the more affluent Gulshan neighborhood. Credit: Michael Steckler Small Shifts, Big Effects Urban vulnerabilities are only expressed when hazards trigger them. And in Bangladesh, hazards have the potential to be great. The Indian tectonic plate moves northward toward Eurasia at a pace of about 50 millimeters each year (Figure 1). This continuing collision has raised the great Himalayas and caused large destructive earthquakes along the plate boundary. Bangladesh is at the boundary's northeastern end, which is complicated and poorly understood. The plate boundary forms a roughly east–west arc along the Himalayas, bends 180° around the eastern Himalayan syntaxis, and then transitions into a broad zone of roughly north–south trending folds and thrusts, the Indo-Burma Ranges [Steckler et al., 2008].
The boundary continues southward to the Andaman-Sumatra subduction zone. Although the deformation zone that accommodates the motion between India and southeast Asia is often called the Burma platelet, multiple active structures indicate this "platelet" is not rigid. Until recently, it was unclear whether the India–Indo-Burma motion included convergence [Gahalaut et al., 2013] and caused megathrust earthquakes. However, GPS data show that although the motion is highly oblique, it has a significant convergence component [Steckler et al., 2016]. This deformation, at 13–17 millimeters per year, appears to be loading the locked shallow megathrust, along which India subducts beneath Burma (Figures 1b and 1c). The strain from this deformation will likely be released in future large earthquakes, like those at other subduction zones [Steckler et al., 2016]. What We Know and What We Don't These new data provide only some of the information needed to estimate the danger of future earthquakes. Scientists also need better estimates of how often large earthquakes may happen on sufficiently close active faults, how big they may be, and how much shaking they may cause. Results are compiled in earthquake hazard maps predicting how much shaking is expected to occur with a certain probability within a certain period of time. These maps are used to prepare for earthquakes, notably via building codes that prescribe earthquake-resistant construction. Although we have no way of knowing the future, we can make estimates with information about past earthquakes. For example, the Juan de Fuca plate subducts beneath northern California, Oregon, Washington, and British Columbia much as India subducts beneath the Indo-Burma Ranges. This area, known as the Cascadia subduction zone, was widely considered to be mostly aseismic until geological records became available. These records showed that large earthquakes happened some 530 years apart over the past 10,000 years, although the intervals are irregular [Goldfinger et al., 2012]. The most recent, in 1700 CE, is thought to have had a moment magnitude (Mw) of about 9. We can gain insight into what to expect if we assume that the future will resemble the past when we derive earthquake hazard maps, but Earth does not always cooperate, and surprises are inevitable [Stein et al., 2012]. Add to this a lack of information, which makes Bangladesh's situation much more challenging. Hazard assessment for the Indo-Burma boundary is like the assessments for Cascadia before evidence of past megathrust earthquakes became available. Dhaka has been shaken by both teleseismic (distant) and local earthquakes in recent times [Akhter, 2010], but there is little documentation of past megathrust earthquakes. As a result, there is no good way to estimate how often such earthquakes may occur, how big they may be, or how much shaking they may cause. The limited historical records we do have indicate that no megathrust earthquake has ruptured beneath Dhaka since 1610. If this is true, then the strain from more than 5 meters of motion has been stored on the megathrust. If this strain were released in one earthquake, it would have Mw ~8.2. If it has been longer since the last earthquake, the next temblor may be even bigger. Such a large earthquake seems possible: The plate boundary segment to the south ruptured in 1762 in an earthquake estimated at Mw 8.5–8.8 [Cummins, 2007; Wang et al., 2013]. Furthermore, the subduction zone here is extremely large and complex.
Field geology and seismic data [Sikder and Alam, 2003; Betka et al., 2016] indicate that the megathrust is unusually broad and shallow, but it is uncertain whether and how often it ruptures seismically. It's also unclear whether slip in an earthquake would taper to the west or whether the frontal zone would rupture in separate, less frequent earthquakes [Wang et al., 2014]. Might only some of the megathrust earthquakes propagate to the thrust tip near Dhaka? Splay faults rooting the folds and other faults within the plate boundary zone are also possible sources of damaging earthquakes [e.g., Debbarma et al., 2017]. The multiple scenarios increase the uncertainty in seismic hazard assessment. Tackling the Problem Although protecting millions of urban dwellers in Bangladesh might seem daunting, it is not hopeless; Bangladesh has tackled this kind of problem before. Over a span of decades, Bangladesh has successfully reduced the risk from tropical cyclones. Shelters have been built along the coast, and a network of volunteers warns people when to evacuate. A cyclone in 1970, before the program, killed 300,000–500,000 people. By 2007, 1.5 million people took refuge in shelters ahead of Cyclone Sidr, reducing the death toll to about 4,300. Efforts continue to increase the stock of cyclone shelters and promote recovery after storms. Similar efforts are beginning for earthquakes. The Ministry of Disaster Management and Relief has adopted 12 July as Earthquake Day, to be observed with earthquake drills and seminars to increase awareness. Scientists are preparing hazard maps for the country, although the maps are preliminary and are bound to have large uncertainties. Initial studies and planning efforts are devoted to exploring the consequences of large earthquakes [World Bank and Earthquakes and Megacities Initiative, 2014]. Assessment of the building stock typical in developing nations shows its vulnerability to earthquakes. In Dhaka, ~21% of the buildings are easily damaged, unreinforced masonry (brick) construction. About 77% are reinforced concrete but have not been designed to resist earthquake shaking. Moreover, in many cases the site preparation and construction are thought to be poor. Although a building code was enacted in 2006, enforcement is limited, and newer buildings may be as vulnerable as older ones. For example, Dhaka's Rana Plaza opened in 2009, when the code was in place. In 2013, it collapsed, killing more than 1,100 people. Typical skyscrapers in Dhaka, Bangladesh, are not constructed to resist earthquake damage: They are made of rectilinear reinforced concrete construction with bricked-in faces. Credit: T. L. Anderman A further problem is that Dhaka and most of Bangladesh are located on the sediments of the Ganges-Brahmaputra Delta. Earthquake shaking in thick sediments is generally enhanced relative to hard rock, but the amount depends on the size and shape of the basin and the sediment properties. Surface sediment is prone to liquefaction and sand boils, in which strong shaking causes saturated soil to lose strength or develop high pore pressure and sand eruptions. For example, the 2017 Mw 5.7 earthquake in Tripura, India [Debbarma et al., 2017], caused sand boils and damaged buildings in northeast Bangladesh about 40 kilometers away.
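The back-of-the-envelope magnitude mentioned above (Mw ~8.2 for more than 5 meters of stored slip) can be reproduced from the standard seismic moment relation M0 = μAD and the Hanks-Kanamori moment magnitude scale Mw = (2/3)(log10 M0 − 9.1), with M0 in newton meters. A minimal sketch; the rupture dimensions below are hypothetical round numbers for illustration, not values given in the article:

```python
import math

def moment_magnitude(shear_modulus_pa, area_m2, slip_m):
    """Mw from the seismic moment M0 = mu * A * D (Hanks-Kanamori scale)."""
    m0 = shear_modulus_pa * area_m2 * slip_m  # seismic moment in N*m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Hypothetical rupture patch (NOT from the article): 200 km x 85 km,
# a typical crustal shear modulus of 30 GPa, and the >5 m of slip
# accumulated since 1610 cited above.
mu = 3.0e10        # shear modulus, Pa
area = 200e3 * 85e3  # rupture area, m^2
slip = 5.0         # coseismic slip, m
print(f"Mw ~ {moment_magnitude(mu, area, slip):.1f}")  # prints Mw ~ 8.2
```

The cubic dependence hidden in this relation is why the uncertainty in rupture extent matters so much: doubling the patch area adds roughly 0.2 magnitude units, which is the difference between the Mw ~8.2 estimate and the 1762 event's Mw 8.5–8.8.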
Reasonable Risk Reduction Steps The case study of Bangladesh illustrates the challenge of how to address an uncertain hazard, given limited resources [Stein and Stein, 2014]. How much mitigation is enough? Mitigation is like buying insurance; we spend money today to reduce the consequences of possible future events. More mitigation reduces future losses but costs more now; resources used for mitigation are not available for other purposes. Money spent making existing schools earthquake resistant cannot be used to build schools or hire teachers for communities that have none [Kenny, 2009]. Ideally, if the hazard were well understood, economic models could be used to develop mitigation strategies. The total cost of natural disasters to society is the sum of the expected loss in future disasters and the cost of mitigation. This total depends on the amount of mitigation, shown schematically by the U-shaped curve in Figure 2. Fig. 2. The total cost to society of natural disasters depends on the amount invested in mitigation. The optimal mitigation level minimizes the total cost, the sum of the expected loss and the mitigation cost. In reality, a community is likely to spend less than the optimum, but spending less than the optimum is better than doing nothing. Credit: Stein and Stein [2014] If we undertake no mitigation, we have no mitigation costs (left side of the curve) but expect high losses, so it makes sense to invest more in mitigation. Increased mitigation should decrease losses, so the curve goes down. Eventually, however, the cost of more mitigation exceeds the reduced losses, and the curve rises again. These additional resources would be better invested otherwise. The optimum mitigation is the sweet spot at the bottom of the curve. Uncertainties in our ability to assess hazards and resulting losses limit our ability to determine an optimal strategy. Moreover, given limited resources, a community is likely to spend less than the optimum anyway. Fortunately, spending less is better than doing nothing (Figure 2), and we can still suggest strategies that make sense given the high uncertainty and limited resources. This approach follows the idea that "the best is the enemy of the good": Requiring too much safety would cost so much that nothing is likely to be done. Public education and understanding are needed to raise support for any level of investment. Recent nearby earthquakes, like the 2004 Sumatra, 2015 Gorkha, and 2016 Manipur earthquakes, which caused shaking and damage in Bangladesh, have raised earthquake awareness in the country. The scientific community is providing better understanding and monitoring of tectonics and earthquake processes in and around Bangladesh. These developments offer Bangladesh the opportunity to increase earthquake preparedness and reduce earthquake risk [Akhter, 2010]. Building New Versus Fixing Old As the population shifts from rural to urban, the extensive construction that follows provides an opportunity for earthquake risk reduction. This opportunity stems from one key idea: A crucial step to mitigating earthquake risk in Bangladesh is enforcing the building code. Studies show that a moderate degree of safety is achievable with a modest, perhaps 5%–10%, increase in building costs [Schulze et al., 1987]. Over time, natural turnover of buildings will make communities more resilient.
Thus, an approach to reducing risk is to plan the desired fraction of safer buildings over time and to incentivize new safer construction over modifying unsafe existing buildings. Because strengthening (retrofitting) an older building can cost between 25% and 70% of the building's value, we recommend this approach for only the most critical structures [Arikan et al., 2005; McMonies, 2016]. For example, the Bangladeshi government has decided to retrofit some fire stations. Outside of critical infrastructure, the ideal case is when tenants would pay more for ensuring the safety of their buildings. However, conditions aren't always ideal. Erdik and Durukal [2008] report on similar issues faced in Istanbul, a comparable setting. Assessments showed that retrofits would cost about 40% of replacement value. Their study showed that Istanbul residents viewed this "as an investment with no financial return and, as such, no conceivable reduction in insurance premium, property tax, or building permit fees would be sufficient to create an incentive for retrofitting." This response was rational, unless one postulates a high probability of major damage on a short timescale [Kenny, 2009]. Hence, a major retrofitting program would require a large investment of public funds, which is unrealistic given other needs. Putting It All Together Recommendations by the World Bank and Earthquakes and Megacities Initiative [2014] favor raising public earthquake awareness; building competency for architects, engineers, planners, and construction professionals; improving emergency response; and planning land use in a risk-sensitive manner. Ongoing programs, such as the annual U.S.-Bangladesh Pacific Resilience Disaster Response Exercise and Exchange, the Global Facility for Disaster Reduction and Recovery program, and the Comprehensive Disaster Management Program, build toward these goals. Robust risk management is practical, even for developing nations. It involves recognizing uncertainties and developing policies that should give a reasonable outcome for a range of possible hazard and loss scenarios. It requires accepting the need for humility in the face of the complexities and capriciousness of nature while making realistic policies that the public accepts. Although long-term investments in risk reduction compete with immediate needs, they will pay back handsomely should a major earthquake strike. Acknowledgments We thank the editors and reviewers for helping to improve this paper. This work was supported by NSF grant OISE 09-68354. LDEO contribution number 8192. The post The Wicked Problem of Earthquake Hazard in Developing Countries appeared first on Eos.
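Figure 2's U-shaped total-cost curve can also be made concrete with a toy model. If expected losses decline with mitigation spending x as L(x) = L0·exp(−x/s), while mitigation itself costs x, then the total C(x) = x + L0·exp(−x/s) is minimized where C′(x) = 0, giving x* = s·ln(L0/s). The exponential form and the dollar figures below are invented purely for illustration; Stein and Stein [2014] develop the actual framework:

```python
import math

def optimal_mitigation(expected_loss_no_mitigation, effectiveness_scale):
    """Minimize C(x) = x + L0 * exp(-x / s), the 'sweet spot' of Figure 2.
    Setting dC/dx = 0 gives x* = s * ln(L0 / s); clamp to 0 if L0 <= s,
    i.e. if mitigation never pays for itself."""
    L0, s = expected_loss_no_mitigation, effectiveness_scale
    return max(0.0, s * math.log(L0 / s))

# Toy numbers (illustrative only): L0 = $10B expected loss with no
# mitigation; each additional $1B spent cuts remaining losses by a factor e.
L0, s = 10.0, 1.0  # billions of dollars
x_star = optimal_mitigation(L0, s)
total = x_star + L0 * math.exp(-x_star / s)
print(f"Optimal spending: ${x_star:.2f}B, total cost: ${total:.2f}B")
# Spending anything up to x* still beats doing nothing: C(0) = $10B
# versus C(x*) ~ $3.3B, which is the article's point that partial
# mitigation is better than none.
```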
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 69
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2018-03-09
    Description: Internal gravity waves affect and determine ocean processes in many ways, from the ocean mixing that is crucial for a more complete understanding of the climate system to the supply of nutrients for photosynthesis that supports biological production. Yet because of their relatively short time scales (minutes to hours), small spatial scales (tens to hundreds of meters), and vast travel distances (hundreds to thousands of kilometers), internal gravity waves in the ocean are difficult to observe and measure in a comprehensive way. Tang et al. [2018] use seismic measurements to resolve fine-scale internal wave structures in the northern South China Sea. The technique of using seismic data to reveal ocean fine scales was developed more than 10 years ago [Holbrook et al., 2003], but this paper demonstrates how the technique helps reveal hydraulic jumps, wave breaking, and shear instability, potentially providing much greater insight into mixing in the ocean interior. At a resolution of tens of meters, the data in fact now pose a challenge to modelers! Citation: Tang, Q., Xu, M., Zheng, C., Xu, X., & Xu, J. [2018]. A locally generated high-mode nonlinear internal wave detected on the shelf of the northern South China Sea from marine seismic observations. Journal of Geophysical Research: Oceans, 123, https://doi.org/10.1002/2017JC013347 —Lie-Yauw Oey, Editor, JGR: Oceans The post Chaos Beneath a Calm Sea appeared first on Eos.
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 70
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2018-03-09
    Description: Wind is one of the two primary agents sculpting the Earth’s landscape, the other being water. Our understanding of aeolian processes is expanding from tropical and temperate deserts to cold high-latitude and high-altitude areas. The Tibetan Plateau is a particularly interesting place to study high-altitude aeolian processes: first, because the characteristics of the region are unique and some of its aeolian processes differ considerably from those elsewhere; second, because aeolian processes on the Tibetan Plateau have an impact far beyond its geographic area, including an influence on global climate. Our open access review article, recently published in Reviews of Geophysics, gives an overview of research that has improved our understanding of aeolian processes and landforms on the Tibetan Plateau. Here we give a brief overview of the characteristics and significance of this region. Past aeolian processes on the Tibetan Plateau left extensive sediments such as loess and fossil aeolian sand, with the oldest loess dating back 800,000 years. However, the reconstruction of past aeolian activity is short (mostly since the Late Glacial) and shows wide regional differences because the overall erosional environment does not favour the preservation of aeolian records. Processes that were active in the geological past continue in the present day, but the locations where aeolian processes occur depend strongly on the geomorphological conditions that determine sediment availability. A yardang in the Qaidam Basin, China, one of the characteristic aeolian landforms of the region. Credit: Dong et al., 2017, Figure 15b. Contemporary aeolian processes therefore occur primarily in several dry basins in the northern part of the plateau, including the Qaidam, Gonghe and Kumkuri Basins; in wide river valleys such as those of the Yarlung Zangbo River and the headwaters of the Yangtze and Yellow Rivers; on lakeshores such as Qinghai Lake and Donggi Cona; on mountain slopes; and on gravel pavements. Studies of aeolian sediment characteristics suggest a local origin. For example, loess on the Tibetan Plateau provides an interesting contrast with that of China’s Loess Plateau, which is transported from the arid regions of northwest China. Aeolian processes form various landforms, but aeolian geomorphology is much simpler on the Tibetan Plateau than in other deserts, such as those in northwest China, owing to limited sediment availability and a short development history; compound and complex dunes, for example, are absent. Aeolian processes on the Tibetan Plateau also exert a direct impact on the Earth system because aeolian dust is emitted into the high atmosphere and transported over long distances. However, the presence of a cryosphere makes the response of aeolian processes to global change on the Tibetan Plateau different from that in other areas. Sand blown by the wind near Beiluhe along the Qinghai-Tibetan railway. Wind-blown sand is causing serious damage to ecosystems on the Tibetan Plateau. Credit: Dong et al., 2017, Figure 3b. It used to be thought that the thawing of frozen ground in response to global warming led to the expansion of aeolian desertification. However, aeolian desertification has proved less severe than previously thought, because warming accompanied by slightly increased precipitation and reduced wind speed favours the restoration of desertified lands.
Aeolian desertification in general is driven by climate, although adverse human interference, such as over-grazing, exerts strong impacts in some localities; aeolian desertification may therefore be less severe in the future. The location of the Tibetan Plateau means that aeolian processes take place under generally low air temperature and low air density and in the presence of a cryosphere. Comparative studies of aeolian physics between the Tibetan Plateau and other areas are thus necessary to understand the characteristics of aeolian processes that are specific to the plateau, including the effect of freeze-thaw cycles. Another direction for comparative study is interplanetary. Wind is probably active in shaping the landscapes of other planets too, such as Mars and Venus, as well as Saturn’s moon, Titan. Recent findings on the Tibetan Plateau of analogues of Martian aeolian landforms have attracted researchers interested in comparative studies of aeolian processes between Mars and the Tibetan Plateau. Although aeolian processes on the Tibetan Plateau differ considerably from those elsewhere in the world, continued research in this area will help to improve our understanding of Earth-surface processes and the impacts of climate change. —Zhibao Dong, Shaanxi Normal University; email: zbdong@lzb.snnu.edu.cn, with contributions from co-authors The post A Landscape Shaped by Wind appeared first on Eos.
    Print ISSN: 0096-3941
    Electronic ISSN: 2324-9250
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    Publication Date: 2018-01-09
    Description: Robustness is being used increasingly for decision analysis in relation to deep uncertainty, and many metrics have been proposed for its quantification. Recent studies have shown that the application of different robustness metrics can result in different rankings of decision alternatives, but there has been little discussion of the potential causes. To shed some light on this issue, we present a unifying framework for the calculation of robustness metrics, which assists with understanding how robustness metrics work, when they should be used, and why they sometimes disagree. The framework categorizes the suitability of metrics to a decision-maker based on (i) the decision context (i.e., the suitability of using absolute performance or regret), (ii) the decision-maker's preferred level of risk aversion, and (iii) the decision-maker's preference towards maximizing performance, minimizing variance, or some higher-order moment. This paper also introduces a conceptual framework describing when the relative robustness values of decision alternatives obtained using different metrics are likely to agree and disagree; this serves as a measure of how “stable” the ranking of decision alternatives is when determined using different robustness metrics. The framework is tested on three case studies: water supply augmentation in Adelaide, Australia; the operation of a multipurpose regulated lake in Italy; and flood protection for a hypothetical river based on a reach of the river Rhine in the Netherlands. The proposed conceptual framework is confirmed by the case study results, providing insight into the reasons for disagreements between rankings obtained using different robustness metrics.
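To make the disagreement between metrics concrete, here is a minimal Python sketch (hypothetical performance values, not from the three case studies) in which a maximin decision-maker and a minimax-regret decision-maker rank the same two alternatives differently:

```python
import numpy as np

# Rows: decision alternatives; columns: plausible future scenarios.
# Hypothetical performance values (higher is better), illustration only.
perf = np.array([
    [9.0, 6.0, 1.0],   # alternative A: high upside, poor worst case
    [5.0, 5.0, 4.0],   # alternative B: steady performer
])

# Maximin (absolute performance, risk-averse): best worst-case performance.
print("maximin picks:", "AB"[int(np.argmax(perf.min(axis=1)))])           # -> B

# Minimax regret: regret = shortfall from the best alternative per scenario.
regret = perf.max(axis=0) - perf
print("minimax regret picks:", "AB"[int(np.argmin(regret.max(axis=1)))])  # -> A
```

Here the regret-based metric rewards A's strong performance in the first two scenarios, while the absolute worst-case metric prefers B, mirroring the framework's distinction between regret-based and absolute-performance decision contexts.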
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 72
    Publication Date: 2018-01-09
    Description: Reliable inventory information is critical in informing emission mitigation efforts. Using the latest officially released emission data, which are production-based, we take a consumption perspective to estimate the non-CO₂ greenhouse gas (GHG) emissions for China in 2012. The non-CO₂ GHG emissions, which cover CH₄, N₂O, HFCs, PFCs and SF₆, amounted to 2003.0 Mt CO₂-eq (including 1871.9 Mt CO₂-eq from economic activities), much larger than the total CO₂ emissions of some developed countries. Urban consumption (30.1%), capital formation (28.2%) and exports (20.6%) accounted for approximately four fifths of the total embodied emissions in final demand. Furthermore, the results from structural path analysis help identify critical embodied emission paths and key economic sectors in supply chains for mitigating non-CO₂ GHG emissions in the Chinese economic system. The top 20 paths were responsible for half of the national total embodied emissions. Several industrial sectors, such as Construction, Production and Supply of Electricity and Steam, Manufacture of Food and Tobacco and Manufacture of Chemicals and Chemical Products, served as important transmission channels. Examining both production-based and consumption-based non-CO₂ GHG emissions will enrich our understanding of the influence of industrial positions, final consumption demands and trade on national non-CO₂ GHG emissions by considering the comprehensive abatement potentials in supply chains.
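The consumption-based reallocation described here rests on standard environmentally extended input-output algebra. The sketch below shows that calculation on an invented two-sector economy (none of these numbers are from the paper); structural path analysis then expands the same product into individual supply-chain paths:

```python
import numpy as np

A = np.array([[0.2, 0.3],      # technical coefficients: inputs per unit of output
              [0.1, 0.4]])
y = np.array([100.0, 50.0])    # final demand (e.g., urban consumption, exports)
f = np.array([0.5, 1.2])       # direct emission intensity per unit of output

L = np.linalg.inv(np.eye(2) - A)   # Leontief inverse
embodied = (f @ L) * y             # emissions embodied in each final-demand category
print(embodied, embodied.sum())    # ~[93.3 123.3], total ~216.7

# Structural path analysis expands f @ L @ y as the series
#   f@y + f@A@y + f@A@A@y + ...
# and ranks the individual terms (supply-chain paths) by size.
```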
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 73
    Publication Date: 2018-01-10
    Description: Rapid economic and population growth over the last centuries have started to push the Earth out of its Holocene state into the Anthropocene. In this new era, ecosystems across the globe face mounting dual pressure from human land use change (LUC) and climate change (CC). With the Paris Agreement, the international community has committed to holding global warming below 2°C above preindustrial levels, yet current pledges by countries to reduce greenhouse gas emissions appear insufficient to achieve that goal. At the same time, the sustainable development goals strive to reduce inequalities between countries and provide sufficient food, feed, and clean energy to a growing world population likely to reach more than 9 billion by 2050. Here, we present a macro-scale analysis of the projected impacts of both CC and LUC on the terrestrial biosphere over the 21st century using the Representative Concentration Pathways (RCPs) to illustrate possible trajectories following the Paris Agreement. We find that CC may cause major impacts in landscapes covering between 16% and 65% of the global ice-free land surface by the end of the century, depending on the success or failure of achieving the Paris goal. Accounting for LUC impacts in addition, this number increases to 38%–80%. Thus, CC will likely replace LUC as the major driver of ecosystem change unless global warming can be limited to well below 2°C. We also find a substantial risk that impacts of agricultural expansion may offset some of the benefits of ambitious climate protection for ecosystems.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    Publication Date: 2018-01-12
    Description: China has suffered severe wintertime air pollution as industrialization and urbanization have developed rapidly over the past decades. Recent studies suggest that climate change has important impacts on extreme haze events in northern China. This study uses reanalysis datasets to analyze the trend and variability of Siberian High (SiH) intensity and its relationship with Arctic temperature and sea ice cover (SIC) over the past two decades. The results show that the Arctic is warming, accompanied by a rapid decline in SIC, while Eurasia is cooling and SiH intensity is gradually increasing. Statistically, the SiH has a significantly positive correlation with Arctic temperature (R = 0.70) and a significant anti-correlation with SIC (R = -0.69), because the warming Arctic and the shrinking SIC enhance the SiH. The enhanced SiH leads to strengthened northerly winds in the North China Plain (NCP). WRF-Chem model calculations reveal that the strengthened northerly winds during the strong SiH period in January 2016 produced PM₂.₅ concentrations 100–200 μg m⁻³ lower than those during the weak SiH period in January 2013. A sensitivity calculation shows that the reduction in PM₂.₅ concentrations from a 50% decrease in emissions is comparable to the change from the weak to the strong SiH condition, suggesting that extreme climate variability in the past few years can have an impact on wintertime air pollution in the NCP equivalent to that of a large emission reduction.
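For readers who want to reproduce this style of diagnostic, the sketch below computes the two correlations on synthetic annual series (the coupling strengths and noise levels are assumptions standing in for the reanalysis data):

```python
import numpy as np

rng = np.random.default_rng(0)
arctic_t = np.linspace(-1.0, 1.0, 20) + 0.3 * rng.standard_normal(20)  # warming Arctic
sic = -0.8 * arctic_t + 0.3 * rng.standard_normal(20)                  # declining sea ice
sih = 0.8 * arctic_t + 0.3 * rng.standard_normal(20)                   # assumed coupling

print("r(SiH, Arctic T) =", round(np.corrcoef(sih, arctic_t)[0, 1], 2))  # positive
print("r(SiH, SIC)      =", round(np.corrcoef(sih, sic)[0, 1], 2))       # negative
```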
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 75
    Publication Date: 2018-01-17
    Description: Rates of atmospheric deposition are declining across the United States, yet urban areas remain hotspots of atmospheric deposition. While past studies show elevated rates of inorganic nitrogen (N) deposition in cities, less is known about atmospheric inputs of organic N, organic carbon (C), and organic and inorganic phosphorus (P), all of which can affect ecosystem processes, water quality, and air quality. Further, the effect of the tree canopy on amounts and forms of nutrients reaching urban ground surfaces is not well-characterized. We measured growing season rates of total N, organic C, and total P in bulk atmospheric inputs, throughfall, and soil solution around the greater Boston area. We found that organic N constitutes a third of total N inputs, organic C inputs are comparable to rural inputs, and inorganic P inputs are 1.2 times higher than those in sewage effluent. Atmospheric inputs are enhanced two-to-eight times in late spring and are elevated beneath tree canopies, suggesting that trees augment atmospheric inputs to ground surfaces. Additionally, throughfall inputs may directly enter runoff when trees extend above impervious surfaces, as is the case with 26.1% of Boston's tree canopy. Our results indicate that the urban atmosphere is a significant source of elemental inputs that may impact urban ecosystems and efforts to improve water quality, particularly in terms of P. Further, as cities create policies encouraging tree planting to provide ecosystem services, locating trees above permeable surfaces to reduce runoff nutrient loads may be essential to managing urban biogeochemical cycling and water quality.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2018-01-18
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 77
    Publication Date: 2018-01-23
    Description: Trends in short-lived high-temperature extremes record a different dimension of change than the extensively studied annual and seasonal mean daily temperatures. They also have important socioeconomic, environmental, and human health implications. Here, we present an analysis of the highest temperature of the year for approximately 9,000 stations globally, focusing on quantifying spatially explicit exceedance probabilities during the recent 50- and 30-year periods. A global increase of 0.19°C per decade during the past 50 years (through 2015) accelerated to 0.25°C per decade during the last 30 years, a faster increase than in the mean annual temperature. Strong positive 30-year trends are detected in large regions of Eurasia and Australia, with rates higher than 0.60°C per decade. In cities with more than 5 million inhabitants, where most heat-related fatalities occur, the average change is 0.33°C per decade, while some East Asian cities, Paris, Moscow, and Houston have experienced changes higher than 0.60°C per decade.
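A per-station trend of the annual maximum reduces to an ordinary least-squares slope scaled to decades; the sketch below applies it to a synthetic 50-year record (the station values are invented, with the warming rate set near the reported 0.25°C per decade):

```python
import numpy as np

def decadal_trend(years, annual_max):
    """OLS slope of the highest temperature of the year, in degrees C per decade."""
    return 10.0 * np.polyfit(years, annual_max, 1)[0]

years = np.arange(1966, 2016)                         # 50-year station record
rng = np.random.default_rng(1)
tmax = 35.0 + 0.025 * (years - years[0]) + rng.standard_normal(years.size)
print(f"{decadal_trend(years, tmax):+.2f} C/decade")  # close to +0.25
```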
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 78
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    In:  EPIC3Geophysical Research Letters, American Geophysical Union (AGU), 45(23), pp. 12972-12981, ISSN: 0094-8276
    Publication Date: 2023-01-30
    Description: The Arctic Ocean is known to be contaminated by various persistent organic pollutants (POPs). The Fram Strait, the only deepwater passage to the Arctic Ocean (from the Atlantic Ocean), represents an unquantified gateway for POPs fluxes into and out of the Arctic. Polyethylene passive samplers were deployed in vertical profiles in the Fram Strait and in air and surface water in the Canadian Archipelago to determine the concentrations, profiles, and mass fluxes of dissolved polychlorinated biphenyls (PCBs) and organochlorine pesticides. In the Fram Strait, higher concentrations of ΣPCBs (1.3–3.6 pg/L) and dichlorodiphenyltrichloroethanes (ΣDDTs, 5.2–9.1 pg/L) were observed in the deepwater masses (below 1,000 m), similar to nutrient-like vertical profiles. There was net southward transport of hexachlorobenzene and hexachlorocyclohexanes (ΣHCHs) of 0.70 and 14 Mg/year but a net northward transport of ΣPCBs at 0.16 Mg/year through the Fram Strait.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Article , isiRev
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 79
    Publication Date: 2017-01-01
    Description: The historical developments are reviewed that have led from a bottom-up responsibility initiative of concerned scientists to the emergence of a nationwide interdisciplinary Priority Program on the assessment of Climate Engineering (CE), funded by the German Research Foundation (DFG). Given the perceived lack of comprehensive and comparative appraisals of different CE methods, the Priority Program was designed to encompass both solar radiation management (SRM) and carbon dioxide removal (CDR) ideas, and to cover the atmospheric, terrestrial and oceanic realms. First key findings obtained by the ongoing Priority Program are summarized and reveal that, compared to earlier assessments such as the 2009 Royal Society report, more detailed investigations tend to indicate less efficiency, lower effectiveness and often lower safety. Emerging research trends are discussed in the context of the recent Paris agreement to limit global warming to less than two degrees and the associated increasing reliance on negative emission technologies. Our results show that when deployed at scales large enough to have a significant impact on atmospheric CO₂, even CDR methods such as afforestation – often perceived as ‘benign’ – can have substantial side effects and may raise severe ethical, legal and governance issues. We argue that before being deployed at climatically relevant scales, any negative-emission or climate engineering method will require careful analysis of efficiency, effectiveness and undesired side effects.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 80
    Publication Date: 2017-01-01
    Description: Since the 1960s, India has experienced a series of extreme drought and flood events during the summer months. The Humid Subtropical Climatic Zone (HSTC), which comprises the Indo-Gangetic Plain (IGP) region of India, is highly vulnerable to climatic extremes. This region is home to ~40% of the total population and yields ~50% of the total agricultural production of India. We investigate the historical variation in dry/wet conditions and project future changes in extreme events under two different scenarios of the CMIP5 models. First, the model parameters, i.e., precipitation (P) and temperature (T), are bias corrected with respect to observational data, and six models that are in phase with the observations are selected for composite analysis. Next, we calculate potential evapotranspiration (PET) and the Standardized Potential Evapotranspiration Index (SPEI) to characterize the extreme events. Both P and PET are projected to increase in the HSTC zone, and both wet and dry conditions demonstrate a persistent increase in the future. In relative terms, P increases faster than PET along the Gangetic Plain (wet conditions) and falls behind PET in the southern and eastern parts of the region (dry conditions). The mitigating effect of the precipitation increase under the RCP4.5 scenario will be overridden by strengthened PET, and extreme dry conditions are projected markedly under the RCP8.5 scenario. These features are consistent with the increase/decrease in the multi-model mean SPEI and with the spatial pattern of P−PET. The areas affected by wet and dry events will be relatively larger under the RCP4.5 and RCP8.5 scenarios, respectively.
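The drought index at the heart of this analysis standardizes the climatic water balance D = P − PET. The sketch below uses a plain z-score as a simplified stand-in (the operational SPEI fits a log-logistic distribution to D, and the season totals here are invented):

```python
import numpy as np

def simple_spei(precip, pet):
    """Standardize D = P - PET; a z-score stand-in for the log-logistic SPEI fit."""
    d = np.asarray(precip, float) - np.asarray(pet, float)
    return (d - d.mean()) / d.std(ddof=1)

p   = [420, 380, 510, 300, 450, 270, 390]   # hypothetical seasonal precipitation (mm)
pet = [400, 410, 430, 440, 420, 460, 430]   # hypothetical seasonal PET (mm)
print(np.round(simple_spei(p, pet), 2))     # negative values flag dry seasons
```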
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 81
    Publication Date: 2017-09-06
    Description: In the 2000s, the rapid growth of CO₂ emitted in the production of exports from developing to developed countries, in which China accounted for the dominant share, led to concerns that climate policies had been undermined by international trade. Arguments on ‘carbon leakage’ and ‘competitiveness’ – which led to the refusal of the United States to ratify the Kyoto Protocol – put pressure on developing countries, especially China, to limit their emissions, with Border Carbon Adjustments used as one threat. After strong growth in the early 2000s, emissions exported from developing to developed countries plateaued and may even have decreased since 2007. These changes were mainly due to China: in 2002–2007, China’s exported emissions grew by 827 MtCO₂, amounting to almost all of the 892 MtCO₂ total increase in emissions exported from developing to developed countries, while in 2007–2012, emissions exported from China decreased by 229 MtCO₂, contributing to the total decrease of 172 MtCO₂ exported from developing to developed countries. We apply Structural Decomposition Analysis to find that, in addition to the diminishing effects of the global financial crisis, the slowdown and eventual plateau were largely explained by several potentially permanent changes in China: a decline in export volume growth, improvements in CO₂ intensity, and changes in production structure and the mix of exported products. We argue that growth in China’s exported emissions will not return to the high levels of the 2000s, and therefore the arguments for climate policies focused on embodied emissions, such as Border Carbon Adjustments, are now weakened.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 82
    Publication Date: 2017-09-02
    Description: We present new detail on how future SLR will modify nonlinear wave transformation processes, shoreline wave energy and wave-driven flooding on atoll islands. Frequent and destructive wave inundation is a primary climate-change hazard that may render atoll islands uninhabitable in the near future. However, limited research has examined the physical vulnerability of atoll islands to future SLR, and sparse information is available to implement process-based coastal management in coral reef environments. We utilize a field-verified numerical model capable of resolving all nonlinear wave transformation processes to simulate how future SLR will modify wave dissipation and overtopping on Funafuti Atoll, Tuvalu, accounting for static and accretionary reef adjustment morphologies. Results show that future SLR coupled with a static reef morphology will not only increase shoreline wave energy and overtopping but will fundamentally alter the spectral composition of shoreline energy by decreasing the contemporary influence of low-frequency infragravity waves. ‘Business-as-usual’ emissions (RCP 8.5) will result in annual wave overtopping on Funafuti Atoll by 2030, with overtopping at high tide under mean wave conditions occurring from 2090. Comparatively, vertical reef accretion in response to SLR will prevent any significant increase in shoreline wave energy and mitigate wave-driven flooding volume by 72%. Our results provide the first quantitative assessment of how effective future reef accretion can be at mitigating SLR-associated flooding on atoll islands and endorse active reef conservation and restoration for future coastal protection.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 83
    Publication Date: 2017-09-06
    Description: Risk perception research has played an influential role in supporting risk management and risk communication policy. Risk perception studies are popular across a range of disciplines in the social and natural sciences for a wide range of hazard types. Their results have helped to articulate the complex individual, relational, structural, and environmental factors influencing people's behavior. Connections between individual and collective behaviors and norms impacting global climate change, and consequently, local disaster risk, however, are infrequently included in disaster risk management. This paper presents results from two diverse and complementary European risk perception studies examining both natural and anthropogenic hazards. Research gaps and recommendations for developing more comprehensive risk management strategies are presented.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 84
    Publication Date: 2017-08-22
    Description: Multi-sectoral partnerships (MSPs) form an increasingly popular and important part of the global climate and disaster risk governance landscape, but the literature offers little critical investigation of this phenomenon. In particular, it remains unclear how MSPs can support the transition from agenda-setting to implementation in response to the multiple current and future pressures threatening the resilience of cities. Through the lens of the London Climate Change Partnership (LCCP) and drawing from other MSP examples, this paper investigates the scope for MSPs to enhance climate adaptation in an urban context. Our paper has two main aims: to expand understanding of the role of MSPs in the adaptation decision process in the context of the wider governance literature, and to shed some light on the complexities of transitioning through that process. To clarify the role of an MSP we propose a distinction between ‘first generation’ and ‘second generation’ MSPs, illustrating the progression from agenda-setting to implementation: ‘first generation’ MSPs are focused on agenda-setting and knowledge sharing in order to support decision-makers, while ‘second generation’ partnerships are aimed at implementing solutions. We consider this distinction from the perspective of the individual members and their perceptions, motivations and expectations. We find that the dynamic nature of urban adaptation, with a shifting focus from initial agenda-setting towards the implementation of actions, presents challenges for existing MSPs, particularly long-established ones such as the LCCP. Our investigation shows that ‘first generation’ MSPs can play important roles in agenda-setting, but finds little evidence of ‘second generation’ MSPs achieving implementation.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 85
    Publication Date: 2017-08-22
    Description: Improving society's ability to prepare for, respond to and recover from flooding requires integrated, anticipatory flood risk management (FRM). However, most countries still focus their efforts on responding to flooding events if and when they occur, rather than addressing their current and future vulnerability to flooding. Flood insurance is one mechanism that could encourage a more ex-ante approach to risk by supporting risk reduction activities. This paper uses an adapted version of Easton's System Theory to investigate the role of insurance in FRM in Germany and England. We introduce an anticipatory FRM framework, which allows us to consider flood insurance as part of a broader policy field. We analyse if and how flood insurance can catalyse a change towards a more anticipatory approach to FRM. In particular, we consider insurance's role in influencing five key components of anticipatory FRM: risk knowledge, prevention through better planning, property-level protection measures, structural protection and preparedness (for response). We find that in both countries FRM is still a reactive, event-driven process, while anticipatory FRM remains underdeveloped. However, collaboration between insurers and FRM decision-makers has already been successful, for example in improving risk knowledge and awareness, while in other areas insurance acts as a disincentive for more risk reduction action. In both countries there is evidence that insurance can play a significant role in encouraging anticipatory FRM, but this potential remains underutilized. Effective collaboration between insurers and government should not be seen as a cost, but as an investment to secure future insurability through flood resilience.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 86
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2017-08-30
    Description: Just as carbon fueled the Industrial Revolution, nitrogen has fueled an Agricultural Revolution. The use of synthetic nitrogen fertilizers and the cultivation of nitrogen-fixing crops both expanded exponentially during the last century, with most of the increase occurring after 1960. As a result, the current flux of reactive, or fixed, nitrogen compounds to the biosphere due to human activities is roughly equivalent to the total flux of fixed nitrogen from all natural sources, both on land masses and in the world's oceans. Natural fluxes of fixed nitrogen are subject to very large uncertainties, but anthropogenic production of reactive nitrogen has increased almost five-fold in the last half-century, and this rapid increase in anthropogenic fixed nitrogen has removed any uncertainty on the relative importance of anthropogenic fluxes to the natural budget. The increased use of nitrogen has been critical for increased crop yields and protein production needed to keep pace with the growing world population. However, similar to carbon, the release of fixed nitrogen into the natural environment is linked to adverse consequences at local, regional, and global scales. Anthropogenic contributions of fixed nitrogen continue to grow relative to the natural budget, with uncertain consequences.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 87
    Publication Date: 2017-09-13
    Description: Land surface albedo is a key parameter controlling the local energy budget, and altering the albedo of built surfaces has been proposed as a tool to mitigate high near-surface temperatures in the urban heat island. However, most research on albedo in urban landscapes has used coarse-resolution data, and few studies have attempted to relate albedo to other urban land cover characteristics. This study provides an empirical description of urban summertime albedo using 30 m remote sensing measurements in the metropolitan area around Boston, Massachusetts, relating albedo to metrics of impervious cover fraction, tree canopy coverage, population density, and land surface temperature (LST). At 30 m spatial resolution, median albedo over the study area (excluding open water) was 0.152 (0.112–0.187). Trends of lower albedo with increasing urbanization metrics and temperature emerged only after aggregating data to 500 m or to the boundaries of individual towns, at which scale a −0.01 change in albedo was associated with a 29 (25–35)% decrease in canopy cover, a 27 (24–30)% increase in impervious cover, and an increase in population density from 11 to 386 km⁻². The most intensively urbanized towns in the region showed albedo up to 0.035 lower than the least urbanized towns, and mean mid-morning LST 12.6°C higher. Trends in albedo derived from 500 m MODIS measurements were comparable, but indicated a strong contribution of open water at this coarser resolution. These results reveal linkages between albedo and urban land cover character, and offer empirical context for climate-resilient planning and future landscape functional changes with urbanization.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 88
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2017-08-18
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2017-09-12
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 90
    facet.materialart.
    Unknown
    American Geophysical Union (AGU)
    Publication Date: 2017-02-25
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 91
    Publication Date: 2017-02-28
    Description: In this study, we demonstrate skillful spring forecasts of detrended September Arctic sea ice extent using passive microwave observations of sea ice concentration (SIC) and melt onset (MO). We compare these to forecasts produced using data from a sophisticated melt pond model, and find similar or higher skill values, where the forecast skill is calculated relative to linear trend persistence. The MO forecasts show the highest skill in March–May, while the SIC forecasts produce the highest skill in June–August, especially when the forecasts are evaluated over recent years (since 2008). The high MO forecast skill in early spring appears to be driven primarily by the presence and timing of open water anomalies, while the high SIC forecast skill appears to be driven by both open water and surface melt processes. Spatial maps of detrended anomalies highlight the drivers of the different forecasts, and enable us to understand regions of predictive importance. Correctly capturing sea ice state anomalies, along with changes in open water coverage, appears to be a key process in skillfully forecasting summer Arctic sea ice.
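One common way to express skill relative to a reference forecast is one minus the ratio of mean squared errors; the sketch below implements that convention against a linear-trend baseline (the paper may define its score differently, and the series here are synthetic):

```python
import numpy as np

def skill_vs_trend(years, obs, forecast):
    """Skill = 1 - MSE(forecast) / MSE(linear-trend baseline); >0 beats the trend."""
    baseline = np.polyval(np.polyfit(years, obs, 1), years)
    return 1.0 - np.mean((obs - forecast) ** 2) / np.mean((obs - baseline) ** 2)

years = np.arange(2001, 2016)
rng = np.random.default_rng(2)
obs = 7.0 - 0.08 * (years - 2001) + rng.normal(0.0, 0.3, years.size)  # synthetic extents
forecast = obs + rng.normal(0.0, 0.15, years.size)                    # hypothetical forecasts
print(f"skill: {skill_vs_trend(years, obs, forecast):.2f}")           # positive -> useful
```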
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 92
    Publication Date: 2017-03-02
    Description: Society has set ambitious targets for stabilizing mean global temperature. To attain these targets, it will have to reduce CO₂ emissions to near zero by mid-century and subsequently remove CO₂ from the atmosphere during the latter half of the century. There is a recognized need to develop technologies for CO₂ removal; however, attempts to develop direct air capture systems have faced both energetic and financial constraints. Recently, BioEnergy with Carbon Capture and Storage (BECCS) has emerged as a leading candidate for removing CO₂ from the atmosphere. However, BECCS can have negative consequences on land, nutrient, and water use as well as biodiversity and food production. Here, we describe an alternative approach based on the large-scale industrial production of marine microalgae. When cultivated with proper attention to power, carbon, and nutrient sources, microalgae can be processed to produce a variety of biopetroleum products, including carbon neutral biofuels for the transportation sector and long-lived, potentially carbon-negative construction materials for the built environment. In addition to these direct roles in mitigating and potentially reversing the effects of fossil CO₂ emissions, microalgae can also play an important indirect role. Because microalgae exhibit much higher primary production rates than terrestrial plants, they require much less land area to produce an equivalent amount of bioenergy and/or food. On a global scale, the avoided emissions resulting from displacement of conventional agriculture may exceed the benefits of microalgae biofuels in achieving climate stabilization goals.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 93
    Publication Date: 2017-06-08
    Description: Biome-specific soil respiration (Rs) has important yet different roles in both the carbon cycle and climate change from regional to global scales. To date, no comparable studies related to global biome-specific Rs have been conducted applying comprehensive global Rs databases. The goal of this study was to develop artificial neural network (ANN) models capable of spatially estimating global Rs and to evaluate the effects of interannual climate variations on 10 major biomes. We used 1,976 annual Rs field records extracted from the global Rs literature to train and test the ANN models. We determined that the best ANN model for predicting biome-specific global annual Rs was the one that applied mean annual temperature (MAT), mean annual precipitation (MAP) and biome type as inputs (r² = 0.60). The ANN models reported an average global Rs of 93.3 ± 6.1 Pg C year⁻¹ from 1960 to 2012 and an increasing trend in average global annual Rs of 0.04 Pg C year⁻¹. Estimated annual Rs increased with increases in MAT and MAP in cropland, boreal forest, grassland, shrubland and wetland biomes. Additionally, estimated annual Rs decreased with increases in MAT and increased with increases in MAP in desert and tundra biomes, and only significantly decreased with increases in MAT (r² = 0.87) in the savannah biome. The developed biome-specific global Rs database for global land and soil carbon models will aid in understanding the mechanisms underlying variations in soil carbon dynamics and in quantifying uncertainty in the global soil carbon cycle.
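As a schematic of this kind of model setup (not the study's code or data): a small feed-forward network with MAT, MAP, and a one-hot biome type as inputs can be trained in a few lines with scikit-learn; all values below are invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Invented records: MAT (deg C), MAP (mm), biome, annual Rs (g C m-2 yr-1).
mat   = np.array([5.0, 12.0, 25.0, -3.0, 18.0, 8.0])
map_  = np.array([600., 900., 2200., 300., 1200., 750.])
biome = ["boreal", "grassland", "tropical", "tundra", "temperate", "grassland"]
rs    = np.array([450., 600., 1400., 200., 900., 580.])

labels = sorted(set(biome))                                   # one-hot encode biome type
onehot = np.eye(len(labels))[[labels.index(b) for b in biome]]
X = np.column_stack([mat, map_, onehot])
X = (X - X.mean(axis=0)) / X.std(axis=0)                      # crude scaling for training

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=20000, random_state=0)
model.fit(X, rs)
print(np.round(model.predict(X[:2])))                         # in-sample check only
```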
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Publication Date: 2017-06-08
    Description: Monitoring is science keeping our thumb on the pulse of the environment to detect any changes of concern for societies. Basic science is the question-driven search for fundamental processes and mechanisms. Given the firm root of monitoring in human interests and needs, basic sciences have often been regarded as scientifically “purer” – particularly within university-based research communities. We argue that the dichotomy between “research” and “monitoring” is an artificial one, and that this artificial split clouds the definition of scientific goals and leads to suboptimal use of resources. We claim that the synergy between the two scientific approaches is well distilled by science conducted under extreme logistic constraints, when scientists are forced to take full advantage of both the data and the infrastructure available. In evidence of this view, we present our experiences from two decades of uniting research and monitoring at the remote research facility Zackenberg in High Arctic Greenland. For this site, we show how the combination of insights from monitoring with the mechanistic understanding obtained from basic research has yielded the most complete understanding of the system – to the benefit of all, and as an example to follow. We therefore urge scientists from across the continuum from monitoring to research to come together, to disregard old division lines, and to work together to expose a comprehensive picture of ecosystem change and its consequences.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 95
    Publication Date: 2017-06-08
    Description: Although recent decades have been the warmest since 1850, and global mean temperatures during 2015 and 2016 beat all instrumental records, the rate of increase in global surface air temperature (GSAT) significantly decreased at the beginning of the 21st century. In this context, we examine the roles of ice melting and the associated increase in sea-water mass, both of which significantly increased at the same time as GSAT decreased. Specifically, we show that (1) the slowdown of the rate of increase in GSAT between the specific periods 1992–2001 and 2002–2011 exists in all three climate records analyzed and is statistically significant at the 5% level, amounting to between 0.029 and 0.036°C/yr and leaving an energy of 14.8 to 18.4 × 10¹⁹ J/yr available; (2) the increase of the atmosphere-related ice melt between these two periods amounts to 316 Gt/yr, which requires 10.5 × 10¹⁹ J/yr, i.e., between 57% and 71% of the energy left by the slowdown; and (3) the energy budget shows therefore that the heat required to melt this additional 316 Gt/yr of ice is of the same order as the energy needed to warm the atmosphere during the decade 2002–2011 as much as during the previous one, suggesting a redistribution of heat within the atmosphere-cryosphere system.
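The central melt-energy figure is easy to verify with back-of-envelope arithmetic, using the textbook latent heat of fusion of ice (a standard value assumed here, not quoted in the abstract):

```python
LATENT_HEAT_FUSION = 3.34e5   # J per kg of ice (standard textbook value)
melt_rate_kg = 316e12         # 316 Gt/yr of additional ice melt, in kg/yr

energy_j = melt_rate_kg * LATENT_HEAT_FUSION
print(f"{energy_j:.3g} J/yr")   # ~1.06e20 J/yr, i.e. ~10.5 x 10^19 J/yr
```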
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 96
    Publication Date: 2017-06-03
    Description: Climate change is a major driver of vegetation activity but its complex ecological relationships impede research efforts. In this study, the spatial distribution and dynamic characteristics of climate change effects on vegetation activity in China from the 1980s to the 2010s and from 2021 to 2050 were investigated using a geographically weighted regression (GWR) model. The GWR model was based on combined datasets of satellite vegetation index, climate observation and projection, and future vegetation productivity simulation. Our results revealed that the significantly positive precipitation-vegetation relationship was and will be mostly distributed in North China. However, the regions with temperature-dominated distribution of vegetation activity were and will be mainly located in South China. Due to the varying climate features and vegetation cover, the spatial correlation between vegetation activity and climate change may be altered. There will be different dominant climatic factors for vegetation activity distribution in some regions such as Northwest China, and even opposite correlations in Northeast China. Additionally, the response of vegetation activity to precipitation will move southward in the next three decades. In contrast, although the high warming rate will restrain the vegetation activity, precipitation variability could modify hydrothermal conditions for vegetation activity. This observation is exemplified in the projected future enhancement of vegetation activity in the Tibetan Plateau and weakened vegetation activity in East and Middle China. Furthermore, the vegetation in most parts of North China may adapt to an arid environment, whereas in many southern areas, vegetation will be repressed by water shortage in the future.
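At its core, GWR fits a separate weighted least-squares regression at each location, with weights that decay with distance. The sketch below shows that core step on invented data (Gaussian kernel, toy coordinates and anomalies; the study's actual specification, bandwidth selection, and variables will differ):

```python
import numpy as np

def gwr_betas(coords, X, y, site, bandwidth):
    """Local weighted least squares with a Gaussian distance kernel."""
    dist = np.linalg.norm(coords - site, axis=1)
    w = np.exp(-0.5 * (dist / bandwidth) ** 2)          # weights decay with distance
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)    # local coefficient estimates

# Toy grid cells: coordinates; predictors [1, precip anomaly, temp anomaly]; NDVI anomaly.
coords = np.array([[0., 0.], [1., 0.], [0., 1.], [2., 2.], [3., 1.]])
X = np.column_stack([np.ones(5), [0.2, 0.1, 0.3, -0.2, 0.0], [0.5, 0.4, 0.6, 0.9, 0.7]])
y = np.array([0.30, 0.22, 0.38, 0.15, 0.25])
print(gwr_betas(coords, X, y, site=np.array([0.5, 0.5]), bandwidth=1.5))
```

Mapping these local coefficients across sites is what produces the spatially varying precipitation- and temperature-dominance patterns described above.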
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 97
    Publication Date: 2017-06-08
    Description: We examine some unexpected epistemological conflicts that arise at the interfaces between ecological science, the ecosystem services framework, policy and industry. We use an example from our own research to motivate and illustrate our main arguments, while also reviewing standard approaches to ecological science using the ecosystem services framework. While we agree that the ecosystem services framework has benefits in its industrial applications because it may force economic decision makers to consider a broader range of costs and benefits than they would do otherwise, we find that many alignments of ecology with the ecosystem services framework are asking questions that are irrelevant to real-world applications, and generating data that does not serve real-world applications. We attempt to clarify why these problems arise and how to avoid them. We urge fellow ecologists to reflect on the kind of research that can lead to both scientific advances and applied relevance to society. In our view, traditional empirical approaches at landscape scales or with place-based emphases are necessary to provide applied knowledge for problem solving, which is needed once decision makers identify risks to ecosystem services. We conclude that the ecosystem services framework is a good policy tool when applied to decision-making contexts, but not a good theory either of social valuation or ecological interactions, and should not be treated as one.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 98
    Publication Date: 2017-07-14
    Description: Because of its low level of energy consumption and the small scale of its industrial development, the Tibet Autonomous Region has historically been excluded from China's reported energy statistics, including those regarding CO₂ emissions. In this paper, we estimate Tibet's energy consumption using limited online documents, and we calculate the 2014 energy-related and process-related CO₂ emissions of Tibet and its seven prefecture-level administrative divisions for the first time. Our results show that 5.52 million tons of CO₂ were emitted in Tibet in 2014; 33% of these emissions are associated with cement production. Tibet's emissions per capita amounted to 1.74 tons in 2014, which is substantially lower than the national average, although Tibet's emission intensity is relatively high at 0.60 tons per thousand yuan in 2014. Among Tibet's seven prefecture-level administrative divisions, Lhasa City and Shannan Region are the two largest CO₂ contributors and have the highest per capita emissions and emission intensities. The Nagqu and Nyingchi regions emit little CO₂ due to their farming/pasturing-dominated economies. This quantitative measure of Tibet's regional CO₂ emissions provides solid data support for Tibet's actions on climate change and emission reductions.
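The per-capita figure follows directly from the reported total; the population value in the sketch below is the one implied by dividing the quoted numbers, not a figure stated in the abstract:

```python
total_mt_co2 = 5.52        # reported 2014 emissions, Mt CO2
population_m = 3.17        # million inhabitants; implied by 5.52 Mt / 1.74 t per capita

per_capita_t = total_mt_co2 * 1e6 / (population_m * 1e6)   # tonnes per person
print(f"{per_capita_t:.2f} t CO2 per capita")              # ~1.74
```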
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 99
    Publication Date: 2017-07-27
    Description: Evapotranspiration is a key variable in hydrology, playing an important role in the water and energy balance of the land surface. There has been speculation on the direction of trends in potential and actual evapotranspiration (PET and AET) resulting from rising global temperatures, in both observational and derived records representing the historical climate. In this study, PET and AET trends of 8 different global model datasets were analyzed over two time periods, from 2003 to 2012 (short term) and from 1980 to 2012 (multi-decadal), to identify regions where the trends coincide or differ and to study the reasons behind these changes. The short-term analysis showed that considerable uncertainty exists in the detection and direction of significant trends in both PET and AET; there was little agreement amongst the datasets about the direction of the global trends. The multi-decadal study showed much more consistent trends across the datasets, particularly in relation to positive significant PET trends. During this period, the global PET mean increased by 0.091 mm/month/year, while the global AET rose at 0.045 mm/month/year. Many of the opposing PET/AET trends can be attributed to changes in precipitation. Most of the regions that present these trends are water-limited and show strong correlations between AET and precipitation trends. Some energy-limited regions showed an increasing gap between PET and AET, suggesting the influence of additional variables controlling AET.
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    Publication Date: 2017-07-27
    Description: As flood impacts are increasing in large parts of the world, understanding the primary drivers of changes in risk is essential for effective adaptation. To gain more knowledge on the basis of empirical case studies, we analyze eight paired floods, i.e., consecutive flood events that occurred in the same region, with the second flood causing significantly lower damage. These success stories of risk reduction were selected across different socio-economic and hydro-climatic contexts. The potential of societies to adapt is uncovered by describing triggered societal changes, as well as formal measures and spontaneous processes that reduced flood risk. This novel approach has the potential to build the basis for an international data collection and analysis effort to better understand and attribute changes in risk due to hydrological extremes in the framework of the IAHS Panta Rhei initiative. Across all case studies, we find that the lower damage caused by the second event was mainly due to significant reductions in vulnerability, e.g., via raised risk awareness, preparedness and improvements in organizational emergency management. Thus, vulnerability reduction plays an essential role in successful adaptation. Our work shows that there is a high potential to adapt, but there remains the challenge of stimulating measures that reduce vulnerability and risk in periods in which extreme events do not occur. Index Terms: 1821 Floods (4303); 4327 Resilience; 4328 Risk; 4330 Vulnerability; 4339 Disaster mitigation
    Electronic ISSN: 2328-4277
    Topics: Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...