

    Mongabay, a leading resource for news and perspectives on environmental and conservation issues related to the tropics, has launched Tropical Conservation Science - a new, open access academic e-journal. It will cover a wide variety of scientific and social studies on tropical ecosystems, their biodiversity and the threats posed to them. Tropical Conservation Science - March 8, 2008.

    At the 148th Meeting of the OPEC Conference, the oil exporting cartel decided to leave its production level unchanged, sending crude prices spiralling to new records (above $104). OPEC "observed that the market is well-supplied, with current commercial oil stocks standing above their five-year average. The Conference further noted, with concern, that the current price environment does not reflect market fundamentals, as crude oil prices are being strongly influenced by the weakness in the US dollar, rising inflation and significant flow of funds into the commodities market." OPEC - March 5, 2008.

    Kyushu University (Japan) is establishing what it says will be the world’s first graduate program in hydrogen energy technologies. The new master’s program for hydrogen engineering is to be offered at the university’s new Ito campus in Fukuoka Prefecture. Lectures will cover such topics as hydrogen energy and developing the fuel cells needed to convert hydrogen into heat or electricity. Of all the renewable pathways to produce hydrogen, bio-hydrogen based on the gasification of biomass is by far the most efficient, cost-effective and cleanest. Fuel Cell Works - March 3, 2008.


    An entrepreneur in Ivory Coast has developed a project to establish a network of Miscanthus giganteus farms aimed at producing biomass for use in power generation. In a first phase, the goal is to grow the crop on 200 hectares, after which expansion will start. The project is in an advanced stage, but the entrepreneur still seeks partners and investors. The plantation is to be located in an agro-ecological zone qualified as highly suitable for the grass species. Contact us - March 3, 2008.

    A 7.1MW biomass power plant to be built on the Hawaiian island of Kaua‘i has received approval from the local Planning Commission. The plant, owned and operated by Green Energy Hawaii, will use albizia trees, a hardy species that grows in poor soil on rainfall alone. The renewable power plant will meet 10 percent of the island's energy needs. Kauai World - February 27, 2008.




Saturday, September 13, 2008

Anaerobic digestion of livestock manure – an attractive option for renewable power

In the following guest contribution, Salman Zafar outlines the benefits of using livestock manure for the production of biogas. In many countries, livestock production generates large amounts of organic waste the treatment of which is either inefficient or non-existent. With high energy prices, using the resource as a feedstock for renewable methane production has become an interesting option.

Salman has been active in the field of renewable energy for the past few years. His areas of expertise include biomass utilization, waste-to-energy conversion and sustainable development. Since obtaining his Master's degree in Chemical Engineering in 2004, he has been involved in industrial research on biomass-to-bioenergy conversion processes in different waste sectors. He has been instrumental in the implementation and successful operation of a 1 MW biogas plant based on animal manure in Punjab (India).

The generation and disposal of organic waste without adequate treatment result in significant environmental pollution. Besides health concerns for the people in the vicinity of disposal sites, degradation of waste leads to uncontrolled release of greenhouse gases (GHGs) into the atmosphere.

Conventional treatment methods, such as aeration, are energy-intensive, expensive and also generate a significant quantity of biological sludge. In this context, anaerobic digestion offers potential energy savings and is a more stable process for medium- and high-strength organic effluents. Waste-to-Energy (WTE) plants based on anaerobic digestion of biomass are highly efficient in harnessing the untapped renewable energy potential of organic waste, converting the biodegradable fraction of the waste into high-calorific gases.

Apart from treating the wastewater, the methane produced from the biogas facilities can be recovered, with relative ease, for electricity generation and industrial/domestic heating.

Anaerobic digestion plants not only decrease GHG emissions but also reduce dependence on fossil fuels for energy requirements. The anaerobic process has several advantages over other methods of waste treatment. Most significantly, it is able to accommodate relatively high rates of organic loading. With the increasing use of anaerobic technology for treating various process streams, industries are expected to become more economically competitive through a more judicious use of natural resources. Anaerobic digestion technology is therefore almost certain to see increased use in the future.


Anaerobic digestion provides a wide range of advantages over other conversion processes. These can be classified into three groups: environmental, economic and energy benefits (Figure 1).

Feedstocks
A wide range of feedstocks is available for anaerobic digesters. In addition to municipal solid waste (MSW), large quantities of waste, in both solid and liquid form, are generated by industries such as breweries, sugar mills, distilleries, food-processing plants, tanneries, and pulp and paper mills. Of the total organic pollution contributed by industrial sub-sectors, nearly 40% comes from the food products industry alone.

Food products and agro-based industries together contribute 65% to 70% of total industrial wastewater in terms of organic load. Poultry waste has the highest electricity generation potential per tonne, but livestock as a whole offer the greatest potential for energy generation in the agricultural sector.

Most small-scale units, such as tanneries, textile bleaching and dyeing works, dairies and slaughterhouses, cannot afford effluent treatment plants of their own because of the economies of scale in pollution abatement. Recycling, recovery and re-use of products from the wastes of such small-scale units, using suitable technology, could be a viable proposition. Generating energy through anaerobic digestion has proved economically attractive in many such cases.


Urban municipal waste (both solid and liquid), industrial waste from dairies, distilleries, pressmud, tanneries, pulp and paper mills and food-processing industries, as well as agro-waste and biomass in its different forms, all have tremendous potential for energy generation if treated properly. Figure 2 lists possible feedstocks for waste-to-energy plants based on anaerobic digestion of biomass.

Anaerobic digestion of livestock manure – a case study
The livestock industry is an important contributor to the economy of any country. More than one billion tons of manure is produced annually by livestock in the United States. Animal manure is a valuable source of nutrients and renewable energy.

However, most of the manure is collected in lagoons or left to decompose in the open, which poses a significant environmental hazard. The air pollutants emitted from manure include methane, nitrous oxide, ammonia, hydrogen sulfide, volatile organic compounds and particulate matter, which can cause serious environmental concerns and health problems.

Anaerobic digestion is a unique treatment solution for animal agriculture as it can deliver positive benefits related to multiple issues, including renewable energy, water pollution, and air emissions. Anaerobic digestion of animal manure is gaining popularity as a means to protect the environment and to recycle materials efficiently back into farming systems. The biomass output of important domestic animals is listed in Figure 3.


The establishment of anaerobic digestion systems for livestock manure stabilization and energy production has accelerated substantially in the past several years. More than 111 digesters operate at commercial livestock facilities in the United States, generating around 215 million kWh equivalent of usable energy. Besides generating electricity (170 million kWh), the biogas is used as boiler and domestic fuel. Many of the projects that generate electricity also capture waste heat for various in-house requirements.

In the past, livestock waste was recovered and sold as a fertilizer or simply spread onto agricultural land. The introduction of tighter environmental controls on odour and water pollution means that some form of waste management is necessary, which provides further incentives for biomass-to-energy conversion.

Important factors to consider
The main factors that influence biogas production from livestock manure are the pH and temperature of the feedstock. It is well established that a biogas plant works optimally at a neutral pH and a mesophilic temperature of around 35 °C. The carbon-to-nitrogen (C:N) ratio of the feed material is also an important factor and should be in the range of 20:1 to 30:1.

Animal manure has a C:N ratio of about 25:1, which is considered ideal for maximum gas production. The solids concentration in the feed material is also crucial to ensure sufficient gas production, as well as easy mixing and handling. Hydraulic retention time (HRT) is the most important factor in determining the volume of the digester, which in turn determines the cost of the plant: the longer the retention time, the higher the construction cost.
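To make these rules of thumb concrete, here is a minimal sketch (not from Salman's article; all farm figures are hypothetical) that checks a feedstock against the ranges above and sizes a digester from the daily feed rate and the HRT:

```python
# Minimal sketch: feedstock checks and digester sizing based on the rules of
# thumb cited above. All numerical inputs are illustrative assumptions,
# not design figures from the article.

def check_feedstock(ph, temp_c, c_to_n):
    """Flag values outside the ranges cited above."""
    warnings = []
    if not 6.5 <= ph <= 7.5:
        warnings.append(f"pH {ph} is outside the near-neutral optimum")
    if not 30 <= temp_c <= 40:
        warnings.append(f"{temp_c} C is outside the mesophilic range (~35 C)")
    if not 20 <= c_to_n <= 30:
        warnings.append(f"C:N ratio {c_to_n}:1 is outside the 20:1 to 30:1 range")
    return warnings

def digester_volume_m3(daily_feed_m3, hrt_days):
    """Working volume = daily feed rate x hydraulic retention time (HRT)."""
    return daily_feed_m3 * hrt_days

# Hypothetical farm: 40 m3 of manure slurry per day, 25-day retention time.
print(check_feedstock(ph=7.0, temp_c=35, c_to_n=25))      # -> [] (all within range)
print(digester_volume_m3(daily_feed_m3=40, hrt_days=25))  # -> 1000 (m3)
```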

An emerging technological advance in anaerobic digestion that may lead to increased biogas yields is the use of ultrasound to increase volatile solids conversion. This process disintegrates solids in the influent, which increases surface area and, in turn, allows for efficient digestion of biodegradable waste.

Process description of WTE facility based on livestock manure
The layout of a typical biogas facility using livestock manure as raw material is shown in Figure 4. Fresh animal manure is stored in a collection tank before being transferred to a homogenization tank, which is equipped with a mixer to blend the waste stream. The uniformly mixed waste is passed through a macerator to obtain a uniform particle size of 5-10 mm and pumped into anaerobic digesters of suitable capacity, where stabilization of the organic waste takes place.


In anaerobic digestion, organic material is converted by successive groups of bacteria into biogas, a mixture of methane and carbon dioxide. The majority of commercially operating digesters are plug-flow and complete-mix reactors operating at mesophilic temperatures. The type of digester used varies with the consistency and solids content of the feedstock, with capital investment factors and with the primary purpose of digestion.

Biogas contains a significant amount of hydrogen sulfide (H2S), which needs to be stripped out because of its highly corrosive nature. The removal of H2S takes place in a biological desulphurization unit, in which a limited quantity of air is added to the biogas in the presence of specialized aerobic bacteria that oxidize the H2S into elemental sulfur.

The gas is dried and fed into a combined heat and power (CHP) unit to produce electricity and heat. The size of the CHP system depends on the amount of biogas produced daily. The digested substrate is passed through screw presses for dewatering and then subjected to solar drying and conditioning to give a high-quality organic fertilizer. The press water is treated in an effluent treatment plant based on the activated sludge process, which consists of an aeration tank and a secondary clarifier. The treated wastewater is recycled to meet in-house plant requirements. A chemical laboratory is necessary to continuously monitor important environmental parameters such as BOD, COD, VFA, pH, ammonia and the C:N ratio at different locations, to ensure efficient and proper functioning of the process.

The continuous monitoring of the biogas plant is achieved using a remote control system such as a Supervisory Control and Data Acquisition (SCADA) system. This remote system facilitates immediate feedback and adjustment, which can result in energy savings.

Utilization of biogas and digestate
An anaerobic digestion plant produces biogas as well as digestate, both of which can be further utilized to produce secondary outputs. Biogas can be used for producing electricity and heat, as a natural gas substitute and also as a transportation fuel. A combined heat and power (CHP) system not only generates power but also produces the heat needed to maintain the desired temperature in the digester during the cold season.

CHP systems cover a range of technologies, but indicative energy outputs per m3 of biogas are approximately 1.7 kWh of electricity and 2.5 kWh of heat. The combined production of electricity and heat is highly desirable because it displaces non-renewable energy demand elsewhere and therefore reduces the amount of carbon dioxide released into the atmosphere.
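As a quick back-of-the-envelope example using these indicative figures (the 2,000 m3/day biogas volume is an arbitrary assumption, not a number from the article):

```python
# Rough CHP output estimate using the indicative figures quoted above:
# ~1.7 kWh of electricity and ~2.5 kWh of heat per m3 of biogas.
ELEC_KWH_PER_M3 = 1.7
HEAT_KWH_PER_M3 = 2.5

def chp_output(biogas_m3_per_day):
    """Estimate daily electricity and heat output for a given biogas volume."""
    return {
        "electricity_kwh_per_day": biogas_m3_per_day * ELEC_KWH_PER_M3,
        "heat_kwh_per_day": biogas_m3_per_day * HEAT_KWH_PER_M3,
    }

# Assumed example: a digester producing 2,000 m3 of biogas per day.
print(chp_output(2000))
# -> {'electricity_kwh_per_day': 3400.0, 'heat_kwh_per_day': 5000.0}
```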

In Sweden, the compressed biogas is used as a transportation fuel for cars and buses. Biogas can also be upgraded and used in gas supply networks. The use of biogas in solid oxide fuel cells is being researched.

The surplus heat energy generated may be utilized through a district heating network. Thus, there is potential scope for biogas facilities in the proximity of new housing and development areas, particularly if the waste management system could utilise kitchen and green waste from the housing as a supplement to other feedstock.

Digestate can be further processed to produce liquor and a fibrous material. The fiber, which can be processed into compost, is a bulky material with low levels of nutrients and can be used as a soil conditioner or a low level fertilizer. A high proportion of the nutrients remain in the liquor, which can be used as a liquid fertilizer.


Conclusions
Anaerobic digestion of biomass offers two important benefits: environmentally safe waste management and disposal, and the generation of clean electric power. The growing use of digestion technology as a method to dispose of livestock manure has greatly reduced its environmental and economic impacts.

Biomass-to-biogas conversion mitigates GHG emissions and harnesses the untapped potential of a variety of organic wastes. Anaerobic digestion technology affords greater water quality benefits than standard slurry storage because of its lower pollution potential. It also provides additional benefits in terms of meeting targets under the Kyoto Protocol and other environmental legislation.

The livestock industry is a vitally important contributor to the economy of any country, regardless of the degree of industrialization. Animal manure is a valuable source of renewable energy; additionally, it has soil enhancement properties. Anaerobic digestion is a unique treatment solution for animal agriculture as it can deliver positive benefits related to multiple issues, including renewable energy, water pollution, and air emissions.

Anaerobic digestion of animal manure is gaining popularity as a means to protect the environment and to produce clean energy. There is an urgent need to integrate digesters with manure management systems for effective implementation of anaerobic digestion technology, to address the associated environmental concerns and to harness the renewable energy potential of livestock.


Salman Zafar is currently working as an independent renewable energy advisor. His articles and studies appear on a regular basis in reputed journals and magazines, both in India and abroad, and on leading web-portals. He can be reached at [email protected].





Indian government sets indicative biofuels target: 20% by 2017

The Indian Government today announced its ambitious new national biofuels policy: by 2017 all petrol sold in the country should be mixed with 20% ethanol and diesel should be doped with 20% biodiesel made from non-edible oils. The policy aims to cut the nation's dependence on expensive oil imports, which are damaging the economy and are fueling inflation.

The new rules include a provision that discourages imports of biofuels and instead stimulates the creation of plantations in India as a way to boost employment opportunities among the rural poor. The policy-making and implementation process includes input from a wide range of stakeholders, including representatives of India's lowest government level (the Panchayat Raj, i.e. the village).

India currently blends all gasoline with 5% ethanol (E5), and pilot projects are underway to check the viability of biodiesel based on non-edible oils. The percentage of ethanol in petrol is to double from next month onwards (E10).

The Indian cabinet has now approved the implementation of a new National Biofuel Policy that has set an indicative target of blending 20 per cent ethanol in petrol and 20 per cent biodiesel from non-edible oil (e.g. from jatropha) in diesel, by 2017.

The policy calls for scrapping taxes and duties on biodiesel, and 'declared goods status' will be conferred on biodiesel and bioethanol. The 'declared goods status' means that the two fuels will be taxed at a uniform central sales tax or VAT rate rather than at the varied sales tax rates prevalent in India's states. (Oil firms currently buy ethanol at a fixed price of Rs 21.50 per litre [€0.33/liter - US$3.78/gallon], but non-edible oil for biodiesel is purchased at a price linked to prevailing diesel prices.)

Instead of setting up a National Bio-Fuel Development Board, as had been recommended by a Group of Ministers headed by Agriculture Minister Sharad Pawar, the Cabinet constituted a new National Biofuel Coordination Committee headed by the Prime Minister.

Before the biofuels are blended with the fossil fuels, they should go through a series of protocols and certifications, for which the industry and oil marketing companies (OMCs) should jointly set up an appropriate mechanism and the required facilities.

Imports of free fatty acids are prohibited and no import duty rebate will be provided, as imports would hinder the promotion of indigenous plantations of non-edible oil seeds. The Indian government sees the establishment of local plantations as a way to generate employment opportunities in rural areas.

While biodiesel plantations on community or government wastelands are encouraged, the government now discourages the establishment of plantations on fertile or irrigated land areas.

The cabinet also approved the creation of a National Biofuel Coordination Committee, which would be chaired by the Prime Minister and have seven member ministers, while also giving its nod to setting up a Biofuel Steering Committee.

The Steering Committee would be chaired by the Cabinet Secretary and, along with the National Biofuel Coordination Committee, would be serviced by the Ministry of New and Renewable Energy. The Panchayati Raj (the decentralized government body representing the village level) would also be included as a member of the Steering Committee.

A sub-committee under the Steering Committee, comprising the Department of Biotechnology and the Ministries of Agriculture, New and Renewable Energy, and Rural Development, would aid research on biofuels.

A minimum support price (MSP) for oil seeds, to be overseen by the Steering Committee, will be determined and periodically revised to ensure a fair price for farmers.

A Statutory Minimum Price (SMP) mechanism similar to that currently operating for sugarcane will be examined and possibly extended to the oil seeds processing industry which will produce the feedstock for biodiesel.




Friday, September 12, 2008

Nexterra receives order for biomass gasification system at Oak Ridge National Laboratory - reduces 30,000t GHGs, saves $8.7m per year


Nexterra Energy Corp. announced today that it has received an order from Johnson Controls Inc. for the multi-million dollar biomass gasification system at the U.S. Department of Energy’s Oak Ridge National Laboratory (ORNL) located in Oak Ridge, Tennessee. The ORNL recently chose biomass as the renewable energy source to power its campuses (previous post).

The Nexterra biomass gasification system is the cornerstone of an $89 million contract for Johnson Controls to undertake a wide range of building management and energy conservation measures at ORNL. The ORNL campus is home to the U.S. Department of Energy’s largest science and energy laboratory as well as the DOE’s Bioenergy Science Center.

The Nexterra system will replace ORNL’s existing natural gas steam plant by converting locally sourced woody biomass into clean burning syngas. The syngas will produce 60,000 lbs/hr of saturated steam to displace 75 MMBtu/hr of fossil fuel traditionally used to heat the campus. Once operational, the system will reduce campus fossil fuel consumption by 80 per cent.
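A quick reader's cross-check of the figures quoted above (this is not Nexterra engineering data, just arithmetic on the numbers in the announcement):

```python
# How much heat per pound of steam do the quoted 60,000 lb/hr and
# 75 MMBtu/hr figures imply?
steam_lb_per_hr = 60_000
displaced_btu_per_hr = 75e6

btu_per_lb = displaced_btu_per_hr / steam_lb_per_hr
print(f"Implied heat input: {btu_per_lb:.0f} Btu per lb of steam")
# -> 1250 Btu/lb, a plausible value for raising saturated steam from boiler
#    feedwater (the exact figure depends on pressure and feedwater temperature).
```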

Nexterra is supplying the complete gasification system from fuel handling and storage through to the exhaust stack. Engineering design is underway and the system will be delivered in late 2009.

The core of Nexterra’s technology is a fixed-bed, updraft gasifier (see schematic). Fuel, sized to 3 inches or less, is bottom-fed into the centre of the dome-shaped, refractory-lined gasifier. Combustion air, steam and/or oxygen are introduced into the base of the fuel pile. Partial oxidation, pyrolysis and gasification occur at 1,500-1,800 °F, and the fuel is converted into “syngas” and non-combustible ash. The ash migrates to the base of the gasifier and is removed intermittently through an automated in-floor ash grate. The clean syngas can then be directed through energy recovery equipment or fired directly into boilers, dryers and kilns to produce useable heat, hot water, steam and/or electricity.

The Johnson Controls contract for ORNL was among the first awarded under the Department of Energy’s Transformational Energy Action Management (TEAM) Initiative. TEAM aims to reduce energy waste and greenhouse gases at DOE facilities nationwide by 30 per cent and have those facilities acquire at least 7.5 per cent of all energy from renewable sources by 2010.

The entire Johnson Controls-ORNL project will save ORNL an estimated $8.7 million annually and reduce greenhouse gas (GHG) emissions by more than 30,000 tons per year.

The Nexterra biomass gasification system will contribute about two thirds of the GHG emissions reductions, or the equivalent of taking 5,000 cars off the road each year.
ORNL is the flagship energy research institution in the country and we are very pleased to partner with Nexterra to deliver another state-of-the-art biomass gasification system. Nexterra has consistently demonstrated that its technology is a new standard for converting biomass into energy that is clean, reliable, versatile and ideally suited to institutional and urban environments. - Don Albinger, Vice President of Renewable Solutions at Johnson Controls
Jonathan Rhone, President and CEO of Nexterra, said the ORNL installation represents a tremendous opportunity to showcase the benefits of biomass gasification:
Nexterra’s biomass gasification system will not only assist ORNL to meet its cost and GHG reduction targets, but will also help raise the profile of biomass gasification and what it offers in terms of increased energy security, lower costs and improved air quality.
The Oak Ridge National Laboratory is a multi-program science and technology laboratory managed for the U.S. Department of Energy by UT-Battelle, LLC. Scientists and engineers at ORNL conduct basic and applied research and development to create scientific knowledge and technological solutions that strengthen the nation's leadership in key areas of science; increase the availability of clean, abundant energy; restore and protect the environment; and contribute to national security. With 4,200 staff, 3,000 guest researchers, 20 user facilities, and a budget of approximately $1.2 billion, ORNL supports the Department of Energy's mission through six major scientific competencies in energy, neutron science, high-performance computing, complex biological systems, materials research, and national security.

Nexterra Energy is a leading developer and supplier of advanced gasification systems that enable customers to self-generate clean, low cost heat and/or power using waste fuels "inside-the-fence" at institutional and industrial facilities. Nexterra gasification systems provide a unique combination of attributes including design simplicity, reliability, versatility, low emissions, low cost and full automation to provide customers with a superior value proposition compared to conventional solutions. Nexterra is a private company based in Vancouver, BC, Canada.

Nexterra's projects include biomass gasification projects that deliver power and heat for wood processing plants, a cogeneration plant at the University of South Carolina, a gasification system that delivers heat and hot water for a green residential development in Virginia, and a power generation system based on wood waste for multiple 10 MW plants servicing up to 15 communities in British Columbia's Interior.

References:
Biopact: ORNL chooses biomass to power its campuses - August 12, 2008



Scientists develop plant-powered desalination technology - allows land reclamation


A very interesting technology under development at the University of New South Wales could offer new hope to farmers in drought-affected and marginal areas by enabling crops to grow using salty groundwater. The technology desalinates the water by making use of the power of plant roots, and so makes it possible to reclaim deserted farm land. The innovation could be particularly welcome for some of the world's poorest farmers, who are confronted with soil salinization.

Global agriculture utilizes large amounts of fresh water. But continued cropping and irrigation can turn land gradually into a salt pan, making agriculture impossible over time. The problem of soil salinization is universal and few practical solutions currently exist. In the meantime, water is becoming a scarce resource. Making fresh water out of salt water is possible with desalination technologies, but these require large inputs of (fossil) energy.

A new technology has now been developed that succeeds in desalinating the salt water from agricultural land, without relying on external energy sources. Instead, the technology utilizes the power of plant roots.

Associate Professor Greg Leslie, a chemical engineer at UNSW's UNESCO Centre for Membrane Science and Technology, is working with the University of Sydney on the technology which uses reverse-osmosis membranes to turn previously useless, brackish groundwater into a valuable agricultural resource.

The team is looking at ways to grow plants on very salty water while restoring the soil. The key to the system lies in the incorporation of a reverse osmosis membrane into a sub-surface drip irrigation system. The irrigation system relies on the roots of the plant drawing salty groundwater through the membrane – in doing so removing the salt which would otherwise degrade the soil and make continued cropping unsustainable.

Desalination such as this requires a pressure gradient to draw clean water through the membrane. Professor Leslie has demonstrated that, by running irrigation lines under the ground beneath the plants, the root systems of the plants provide enough of a pressure gradient to draw up water without the high energy consumption usually required for desalination.

The scientists think the technology will make it possible to provide agriculture with a new tool to grow crops in drought years when there is limited access to run-off and surface water.

The plant-powered desalination technique could also make the reclamation of salinated soils viable, thus expanding the land resource that is available for agriculture.

The membrane technology, developed by Professor Leslie and the University of Sydney's Professor Bruce Sutton, has been patented by UNSW's commercial arm, NewSouth Innovations.

Image: plant roots provide enough pressure to make desalination via reverse osmosis work.

References:

UNESCO Centre for Membrane Science and Technology, University of New South Wales.

NewSouth Innovations.



Thursday, September 11, 2008

First CCS pilot plant online - carbon-negative energy ever closer


Energy company Vattenfall has commenced operations at the world's first pilot plant capable of capturing and sequestering CO2 from coal. This landmark event brings the era of carbon-negative energy ever closer.

The pilot unit has a thermal capacity of 30 megawatts, and was constructed over the last 15 months at the “Schwarze Pumpe” plant premises in the Lausitz region in the State of Brandenburg, Germany. Vattenfall has invested approximately €70 million in the construction of the unit. The carbon dioxide produced in this pilot plant will not be released into the atmosphere; instead, it will be almost completely separated, liquefied and further treated for long-term, secure underground storage.

The process used in the plant is called 'oxyfuel combustion', which consists of burning the fuel in a stream of oxygen, making the separation and capture of carbon much easier. Lignite and hard coal will be combusted in a mixture of oxygen and re-circulated CO2, which also contains water vapour. The flue gas will then be treated, and sulphur oxides, particles and other contaminants removed. Finally, the water will be condensed and the concentrated CO2 compressed into a liquid (see schematic).
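For a sense of scale, here is a reader's rough estimate (not a Vattenfall figure) of the CO2 stream such a 30 MW-thermal lignite unit would have to handle, using a generic emission factor for lignite of roughly 101 t CO2 per TJ of fuel energy and an assumed 8,000 operating hours per year:

```python
# Rough estimate of annual CO2 to be captured by a 30 MW-thermal lignite unit.
# Emission factor and operating hours are generic assumptions, not plant data.
thermal_mw = 30
emission_factor_t_per_tj = 101   # assumed, typical order of magnitude for lignite
hours_per_year = 8000            # assumed operating hours

fuel_energy_tj = thermal_mw * 3600 * hours_per_year / 1e6  # MW -> MJ/h -> TJ/yr
co2_t_per_year = fuel_energy_tj * emission_factor_t_per_tj
print(f"Fuel energy: {fuel_energy_tj:.0f} TJ/yr, CO2 stream: ~{co2_t_per_year:,.0f} t/yr")
# -> roughly 90,000 t of CO2 per year to separate, liquefy and store
```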

Negative emissions

With the advent of a new generation of solid biomass fuels - torrefied biomass - it will be possible to switch from coal to biomass in this type of CCS power plant. If this is done, the production of carbon-negative energy becomes a reality.

'Ordinary' renewables like biomass, wind or solar energy are all 'carbon-neutral' in theory (slightly carbon-positive in practice). That is, energy generated this way does not add CO2 to the atmosphere, or only small amounts over the lifecycle of the technology. Carbon-negative energy, however, goes much further, and actively removes CO2 from the atmosphere.

The difference is radical, making 'negative emissions' energy the most potent weapon in the climate fight. Depending on the technology, it can take up to 1000 tons of CO2 out of the atmosphere per GWh of electricity generated (that is: its carbon balance is -1000 tons CO2). Mildly carbon-positive renewables like wind add +30 tons CO2eq/GWh; solar PV adds around +100 tons CO2eq/GWh. In short, carbon-negative energy is capable of tackling climate change far more drastically than any other type of (renewable) energy.
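To illustrate what these lifecycle figures mean in practice, here is a small comparison using the per-GWh numbers quoted above; the 100 GWh of generation is an arbitrary example quantity:

```python
# Net lifecycle emissions for an arbitrary 100 GWh of generation, using the
# per-GWh carbon balances cited in the text (t CO2eq per GWh).
CARBON_BALANCE_T_PER_GWH = {
    "biomass + CCS (carbon-negative)": -1000,
    "wind": 30,
    "solar PV": 100,
}

generation_gwh = 100
for source, balance in CARBON_BALANCE_T_PER_GWH.items():
    print(f"{source:32s}: {balance * generation_gwh:+9,.0f} t CO2eq")
# The carbon-negative option removes on the order of 100,000 t of CO2 from the
# atmosphere, while the mildly carbon-positive renewables add 3,000-10,000 t.
```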

By relying on carbon-negative energy, it becomes possible to reach the goal of reducing atmospheric CO2 levels to 350 ppm, down from today's 384 ppm, as called for by leading climate scientists. In fact, new projections on how to aggressively tackle climate change have finally included negative emissions energy. The Bellona Foundation's recent scenario even shows it is the biggest wedge of all - bigger than the contribution of all other renewables combined (earlier post).

The development of CCS facilities makes this futuristic energyscape ever more realistic. Vattenfall's pilot plant is a key milestone. Following test operations extending over several years at this pilot unit, the company intends by 2015 at the latest to construct two demonstration power plants with an electrical capacity of up to 500 megawatts. From 2020 onwards the technology should then be economically viable and available for large-scale industrial applications - including the use of biomass for the production of negative emissions energy.

There are many advantages to using biomass in CCS operations. First of all, the major critique against this type of technology - namely that leaks of CO2 would be disastrous - is no longer valid, because the CO2 to be sequestered is biogenic in nature. Most scientists agree that leaks are highly unlikely, and so even the sequestration of CO2 from coal is seen as safe. But if the CO2 comes from biomass, the problem is entirely negated.

Another advantage is that biomass fuels are low in sulfur, which lowers the cost of cleaning and processing the CO2-rich gas to make it ready for separation and sequestration. Importantly, second-generation biomass can be fed into CCS power plants gradually - the fuel can be co-fired. Torrefied biomass is of such quality that it can make use of all existing infrastructure; there is thus no need for the construction of new fuel handling or processing facilities.

References:
Vattenfall: Vattenfall inaugurates World's First Pilot for a Coal-fired Power Plant with CO2 Capture - September 9, 2008.

Vattenfall: overview of Pilot Plant.

Vattenfall: CCS project overview.

Further reading:
James Hansen, Makiko Sato, Pushker Kharecha, David Beerling, Valerie Masson-Delmotte, Mark Pagani, Maureen Raymo, Dana L. Royer, James C. Zachos, "Target Atmospheric CO2: Where Should Humanity Aim?", March 2008, in press [but widely distributed on the net].

Bellona Foundation: It is fully possible to reduce emissions by 85 percent - June 5, 2008.

H. Audus and P. Freund, "Climate Change Mitigation by Biomass Gasification Combined with CO2 Capture and Storage", IEA Greenhouse Gas R&D Programme.

James S. Rhodes and David W. Keith, "Engineering economic analysis of biomass IGCC with carbon capture and storage", Biomass and Bioenergy, Volume 29, Issue 6, December 2005, Pages 440-450.

Noim Uddin and Leonardo Barreto, "Biomass-fired cogeneration systems with CO2 capture and storage", Renewable Energy, Volume 32, Issue 6, May 2007, Pages 1006-1019, doi:10.1016/j.renene.2006.04.009

Christian Azar, Kristian Lindgren, Eric Larson and Kenneth Möllersten, "Carbon Capture and Storage From Fossil Fuels and Biomass – Costs and Potential Role in Stabilizing the Atmosphere", Climatic Change, Volume 74, Numbers 1-3 / January, 2006, DOI 10.1007/s10584-005-3484-7

Peter Read and Jonathan Lermit, "Bio-Energy with Carbon Storage (BECS): a Sequential Decision Approach to the threat of Abrupt Climate Change", Energy, Volume 30, Issue 14, November 2005, Pages 2654-2671.

Stefan Grönkvist, Kenneth Möllersten, Kim Pingoud, "Equal Opportunity for Biomass in Greenhouse Gas Accounting of CO2 Capture and Storage: A Step Towards More Cost-Effective Climate Change Mitigation Regimes", Mitigation and Adaptation Strategies for Global Change, Volume 11, Numbers 5-6 / September, 2006, DOI 10.1007/s11027-006-9034-9

Biopact: The strange world of carbon-negative bioenergy: the more you drive your car, the more you tackle climate change - October 29, 2007

Biopact: Researchers find geosequestration of CO2 much safer than thought - February 18, 2008

Biopact: Carbon-negative bioenergy making headway, at last - June 06, 2008


Researchers find new key to tropical African climate


A team of researchers from the Netherlands and the US has found that tropical Africa's climate is not so much linked to ocean and atmospheric patterns of the Intertropical Convergence Zone (ITCZ), as was previously thought, but to patterns found in the Northern Hemisphere. The findings, published in Science Express, redefine views on Africa's climate and will be important for modelling the continent's response to climate change.

The Lake Tanganyika area, in southeast Africa, is home to nearly 130 million people living in four countries that bound the lake, the second deepest on Earth. Scientists have known that the region experiences dramatic wet and dry spells, and that rainfall profoundly affects the area's people, who depend on it for agriculture, drinking water and hydroelectric power.

Scientists thought they knew what caused those rains: a season-following belt of clouds along the equator known as the Intertropical Convergence Zone. Specifically, they believed the ITCZ and rainfall and temperature patterns in the Lake Tanganyika area marched more or less in lockstep. When the ITCZ moved north of the equator during the northern summer, the heat (and moisture) would follow, depriving southeast Africa of moisture and rainfall. When the ITCZ moved south of the equator during the northern winter, the moisture followed, and southeast Africa got rain.

Now a research team, led by scientists from Brown University, has discovered the ITCZ may not be the key to southeast Africa's climate after all. Examining data from core sediments taken from Lake Tanganyika covering the last 60,000 years, the researchers report in this week's Science Express that the region's climate instead appears to be linked with ocean and atmospheric patterns in the Northern Hemisphere. The finding underscores the interconnectedness of the Earth's climate — how weather in one part of the planet can affect local conditions half a world away.

The discovery also could help scientists understand how tropical Africa will respond to global warming, said Jessica Tierney, a graduate student in Brown's Geological Sciences Department and the paper's lead author. The findings imply the sensitivity of rainfall in eastern Africa is really high. It doesn't really take much to tip it.

The researchers, including James Russell and Yongsong Huang of Brown's Department of Geological Sciences faculty and scientists at the University of Arizona and the Royal Netherlands Institute for Sea Research, identified several time periods in which rainfall and temperature in southeast Africa did not correspond with the ITCZ's location. One such period was the early Holocene, extending roughly from 11,000 years ago to 6,000 years ago, in which the ITCZ's location north of the equator would have meant that tropical Africa would have been relatively dry. Instead, the team's core samples showed the region had been wet.

Two other notable periods — about 34,000 years ago and about 58,000 years ago — showed similar discrepancies, the scientists reported.

In addition, the team found that climatic changes during stadials (cold snaps that occur during glacial periods), such as the Younger Dryas, swung rainfall patterns in southeast Africa suddenly. Some of those swings occurred in less than 300 years, the team reported. This is really fast, the researchers note, adding that it shows precipitation in the region is "jumpy" and could react abruptly to changes wrought by global warming.

While the scientists concluded the ITCZ is not the dominant player in shaping tropical African climate, they say more research is needed to determine what drives rainfall and temperature patterns there. They suspect that a combination of winter winds in northern Asia and sea surface temperatures in the Indian Ocean have something to do with it. Under this scenario, the winds emanating from Asia would pick up moisture from the Indian Ocean as they swept southward toward tropical Africa. The warmer the waters the winds passed over, the more moisture would be gathered, and thus, more rain would fall in southeast Africa. The theory would help explain the dry conditions in southeast Africa during the stadials, Tierney and Russell said, because Indian Ocean surface temperatures would be cooler, and less moisture would be picked up by the prevailing winds.

What happens in southeast Africa appears to be really sensitive to the Indian Ocean's climate, the researchers say.

The team examined past temperature in the region using a proxy called TEX86, developed by the Dutch contributing authors. To measure past precipitation, the researchers examined fatty acid compounds contained in plant leaf waxes stored in lakebed sediments — a relatively new proxy, but one considered by scientists to be a reliable gauge for charting past rainfall.

Image: Researchers take cores from Lake Tanganyika, the world's second-deepest lake. Each core was 8 m (26 feet) long and taken at depths of 650 m (2,133 feet). The cores were collected in 2004 as part of the Nyanza Project and were analyzed in 2006 and 2007. Credit: The Nyanza Project, University of Arizona.


References:

Jessica E. Tierney, James E. Russell, Yongsong Huang, Jaap S. Sinninghe Damsté, Ellen C. Hopmans, and Andrew S. Cohen, "Northern Hemisphere Controls on Tropical Southeast African Climate During the Past 60,000 Years", Published online September 11, Science Express, 2008; 10.1126/science.1160485



NASA study illustrates how global "peak oil" could impact climate


The burning of fossil fuels - notably coal, oil and gas - has accounted for about 80 percent of the rise of atmospheric carbon dioxide since the pre-industrial era. Now, NASA researchers have identified feasible emission scenarios that could keep carbon dioxide below levels that some scientists have called dangerous for climate.

When and how global oil production will peak has been debated, making it difficult to anticipate emissions from the burning of fuel and to precisely estimate its impact on climate. To better understand how emissions might change in the future, Dr Pushker Kharecha and Dr James Hansen of NASA's Goddard Institute for Space Studies (GISS) in New York considered a wide range of fossil fuel consumption scenarios. The research was published August 5 in the American Geophysical Union's Global Biogeochemical Cycles [abstract]; a freely accessible version is available at the GISS, here. The study shows that the rise in carbon dioxide from burning fossil fuels can be kept below harmful levels as long as emissions from coal are phased out globally within the next few decades.
This is the first paper in the scientific literature that explicitly melds the two vital issues of global peak oil production and human-induced climate change. We're illustrating the types of action needed to get to target carbon dioxide levels. - Dr Pushker Kharecha, NASA Goddard Institute for Space Studies
Carbon dioxide is a greenhouse gas that concerns climate scientists because it can remain in the atmosphere for many centuries, and studies have indicated that the burning of fossil fuels has already caused those levels to rise for decades. Carbon dioxide also accounts for more than half of all human-caused greenhouse gases in the atmosphere.

Previously published research shows that a dangerous level of global warming will occur if carbon dioxide in the atmosphere exceeds a concentration of about 450 parts per million. That's equivalent to about a 61 percent increase from the pre-industrial level of 280 parts per million, but only 17 percent more than the current level of 385 parts per million. The carbon dioxide cap is related to a global temperature rise of about 1.8 degrees Fahrenheit above the 2000 global temperature, at or beyond which point the disintegration of the West Antarctic ice sheet and Arctic sea ice could set in motion feedbacks and lead to accelerated melting.
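A quick check of the percentage figures in the paragraph above:

```python
# Verifying the percentages quoted above for the 450 ppm threshold.
preindustrial, current, cap = 280, 385, 450  # ppm CO2

print(f"Cap vs pre-industrial level: {100 * (cap / preindustrial - 1):.0f}% higher")  # ~61%
print(f"Cap vs current level:        {100 * (cap / current - 1):.0f}% higher")        # ~17%
```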

To better understand the possible trajectory of future carbon dioxide, Kharecha and Hansen devised five carbon dioxide emissions scenarios that span the years 1850-2100 (Figure 1). Each scenario reflects a different estimate for the global production peak of fossil fuels, the timing of which depends on reserve size, recoverability and technology (Figure 2).
Even if we assume high-end estimates and unconstrained emissions from conventional oil and gas, we find that these fuels alone are not abundant enough to take carbon dioxide above 450 parts per million. - Dr Kharecha
The first scenario estimates carbon dioxide levels if emissions from fossil fuels are unconstrained and follow along "business as usual," growing by two percent annually until half of each reservoir has been recovered, after which emissions begin to decline by two percent annually.

The second scenario considers a situation in which emissions from coal are reduced first by developed countries starting in 2013 and then by developing countries a decade later, leading to a global phase out by 2050 of the emissions from burning coal that reach the atmosphere. The reduction of emissions to the atmosphere in this case can come from reducing coal consumption or from capturing and sequestering the carbon dioxide before it reaches the atmosphere.

The remaining three scenarios include the above-mentioned phase out of coal, but consider different scenarios for oil use and supply. One case considers a delay in the oil peak by about 21 years to 2037. Another considers the implications of fewer-than-expected additions to proven reserves due to overestimated reserves, or the addition of a price on emissions that makes the fuel too expensive to extract. The final scenario looks at emissions from oil fields that peak at different times, extending the peak into a plateau that lasts from 2020-2040.

Next, the team used a simplified mathematical model, called the Bern carbon cycle model, to convert carbon dioxide emissions from each scenario into estimates of future carbon dioxide concentrations in the atmosphere:

The unconstrained "business as usual" scenario resulted in atmospheric carbon dioxide more than doubling the pre-industrial level, with concentrations exceeding the study's 450 parts per million threshold from about 2035 onward. Even when low-end estimates of reserves were assumed, the threshold was exceeded from about 2050 onwards. The other four scenarios, however, resulted in carbon dioxide levels that peaked in various years but fell below the prescribed cap of 450 parts per million by about 2080 at the latest, with levels in two of the scenarios always staying below the threshold.
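For readers who want a feel for how such emissions-to-concentration conversions work, here is a minimal sketch built around a published impulse-response approximation of the Bern carbon cycle. The coefficients are representative values of the kind used in IPCC GWP calculations, and the emissions profile is a toy "grow two percent, then decline two percent" curve, not the study's own inputs:

```python
import math

# Impulse-response approximation of the Bern carbon cycle: each year's emission
# pulse decays as a sum of exponentials, and the concentration is the
# convolution of emissions with that response. Coefficients are representative
# published values, not necessarily the exact configuration of this study.
A0, TERMS = 0.217, [(0.259, 172.9), (0.338, 18.51), (0.186, 1.186)]
GTC_PER_PPM = 2.123  # approximate atmospheric conversion factor

def airborne_fraction(years_since_pulse):
    return A0 + sum(a * math.exp(-years_since_pulse / tau) for a, tau in TERMS)

def concentrations(emissions_gtc, baseline_ppm=280.0):
    """Convert annual emissions (GtC/yr) into atmospheric CO2 concentrations (ppm)."""
    out = []
    for t in range(len(emissions_gtc)):
        airborne_gtc = sum(e * airborne_fraction(t - s)
                           for s, e in enumerate(emissions_gtc[: t + 1]))
        out.append(baseline_ppm + airborne_gtc / GTC_PER_PPM)
    return out

# Toy "business as usual" profile: 2% annual growth for 50 years, then 2% decline.
emissions = [8.0 * 1.02 ** t for t in range(50)]
emissions += [emissions[-1] * 0.98 ** t for t in range(1, 51)]
print(f"Concentration after 100 years: {concentrations(emissions)[-1]:.0f} ppm")
```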

The researchers suggest that the results illustrated by each scenario have clear implications for reducing carbon dioxide emissions from coal, as well as "unconventional" fuels such as methane hydrates and tar sands, all of which contain much more fossil carbon than conventional oil and gas.
Because coal is much more plentiful than oil and gas, reducing coal emissions is absolutely essential to avoid 'dangerous' climate change brought about by atmospheric carbon dioxide concentration exceeding 450 parts per million. The most important mitigation strategy we recommend – a phase-out of carbon dioxide emissions from coal within the next few decades – is feasible using current or near-term technologies. - Dr Kharecha

Figure 1: Atmospheric carbon dioxide changes over time for the study's five fossil fuel scenarios: business-as-usual (a), coal phase-out (b) and oil use and supply (c-e). Credit: NASA/Kharecha and Hansen

Figure 2: Fossil fuel-related estimates used. Historical emissions are shown in purple, remaining reserves in blue, and potential near-term additions to reserves in yellow. Unconventional fuels (all-yellow bar) have uncertain magnitude, but are commonly assumed to be vast. Carbon units are shown on the left, and their CO2 equivalents on the right. (IPCC = Intergovernmental Panel on Climate Change; WEC = World Energy Council.)


References:

Kharecha P. A., J. E. Hansen (2008), "Implications of "peak oil" for atmospheric CO2 and climate", Global Biogeochem. Cycles, 22, GB3012, doi:10.1029/2007GB003142. [Freely accessible here - *.pdf].

Hansen, J., Mki. Sato, R. Ruedy, P. Kharecha, A. Lacis, R.L. Miller, L. Nazarenko, K. Lo, G.A. Schmidt, G. Russell, I. Aleinov, S. Bauer, E. Baum, B. Cairns, V. Canuto, M. Chandler, Y. Cheng, A. Cohen, A. Del Genio, G. Faluvegi, E. Fleming, A. Friend, T. Hall, C. Jackman, J. Jonas, M. Kelley, N.Y. Kiang, D. Koch, G. Labow, J. Lerner, S. Menon, T. Novakov, V. Oinas, Ja. Perlwitz, Ju. Perlwitz, D. Rind, A. Romanou, R. Schmunk, D. Shindell, P. Stone, S. Sun, D. Streets, N. Tausnev, D. Thresher, N. Unger, M. Yao, and S. Zhang, 2007: "Dangerous human-made interference with climate: A GISS modelE study". Atmos. Chem. Phys., 7, 2287-2312.

Pushker Kharecha: How Will the End of Cheap Oil Affect Future Global Climate? - NASA Goddard Institute for Space Studies - September 2008.

NASA Contributions to Carbon Management.



Wednesday, September 10, 2008

Study: solid biofuels 570% more efficient than corn ethanol in reducing GHG emissions

A study by Canadian and Dutch scientists, just released in a book by Springer Publishers, found that commercial second-generation solid biofuel technology has set a new Canadian benchmark in greenhouse gas displacement. The solid biofuel technology using biomass from energy crops for heat energy, developed by REAP-Canada, reduces GHG emissions by 7,600-13,100 kg CO2e/ha. By comparison, soybean biodiesel and corn ethanol were found to reduce GHGs by a mere 900 and 1,500 kg CO2e/ha respectively.
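Some reader's arithmetic on these per-hectare figures (a rough cross-check of the headline claim, not a calculation taken from the study itself):

```python
# How many times more GHG mitigation per hectare does the solid-biofuel
# pathway deliver, using the figures quoted above (kg CO2e per hectare)?
solid_low, solid_high = 7_600, 13_100
liquid = {"corn ethanol": 1_500, "soybean biodiesel": 900}

for name, value in liquid.items():
    lo, hi = solid_low / value, solid_high / value
    print(f"Solid biofuel vs {name}: {lo:.1f}x to {hi:.1f}x the mitigation per hectare")
# -> vs corn ethanol: ~5x-9x; vs soybean biodiesel: ~8x-15x, the same order of
#    magnitude as the study's "570% more efficient" headline figure.
```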

Biopact has often said that liquid biofuels are not the smartest idea, because the conversion process is inefficient and the fuels would be used in equally inefficient internal combustion engines. With the advent of electric vehicles, it will be far more interesting to use biomass (pellets) as a solid fuel to generate electricity and heat, in highly efficient cogeneration facilities. (However, note that electric vehicles may not penetrate markets of poor, developing countries. There, liquid biofuels may remain a temporary option that makes sense, given the fact that mobility is key to development, and that high oil prices are having devastating effects on all sectors of the economy of such energy-intensive countries.)

According to Roger Samson, Executive Director of REAP-Canada and lead author of the new study:
Solid biofuels produced from 2nd generation energy crops, such as switchgrass, effectively blow away liquid biofuels like corn ethanol as a serious greenhouse gas mitigation option.
REAP-Canada is calling on the federal and provincial governments to work jointly to implement a national solid biofuels GHG mitigation strategy.
Solid biofuels are 570% more efficient than liquid fuels in using farmland to mitigate GHGs, yet there are no subsidies in Canada for such technologies. In the race to reduce GHGs, solid biofuels are the winning horse while liquid biofuels are a donkey. - Roger Samson, Executive Director of REAP-Canada
The study also offers several recommendations on how energy from solid biofuels can be developed efficiently and equitably across Canada.

One of the key densification technologies to make next-generation solid biofuels work is found in a process known as torrefaction. By 'roasting' biomass, its energy density can be improved dramatically and its fuel properties are altered in such a way that the fuel can be (co-)fired in existing power infrastructures.

Torrefaction also makes long-distance transport of the biomass much more feasible (distances can be squared), allowing for a far more efficient planning process and use of the existing resources.

The full article titled "Developing Energy Crops for Thermal Applications" in Biofuels, Solar and Wind as Renewable Energy Systems: Benefits and Risks is available from Springer Publishers or online, here.

Resource Efficient Agricultural Production (REAP)-Canada is a not-for-profit organization that focuses on sustainable biofuel systems development. A world leader in developing bioenergy for greenhouse gas mitigation and rural development, REAP-Canada provides services in bioenergy research, policy and market development. The agency has 17 years of experience in energy crop development, biomass resource assessments and bioenergy conversion technologies in industrialized and developing nations.

References:


REAP-Canada: Analysing Ontario Biofuel Options: Greenhouse Gas Mitigation Efficiency and Costs (2008) [*.pdf].

Innovative research on Switchgrass and BioHeat - click here to see a CBC Market Place Video.

Biopact: Torrefaction gives biomass a 20% energy boost, makes logistics far more efficient - July 25, 2008



Governing agrobiodiversity: new book looks at international regimes and the management of crop genetic resources

Plant genetic diversity is crucial for food security and for fighting poverty. Nevertheless, crop plant varieties are disappearing fast, and access to genetic resources is increasingly restricted by commercial interests. A new book by FNI Senior Research Fellow Regine Andersen makes the first comprehensive analysis of how international agreements affect the management of crop genetic resources in developing countries, revealing that the interaction of the agreements has produced largely negative impacts, despite good intentions. The book, titled Governing Agrobiodiversity - Plant Genetics and Developing Countries, also highlights entry points to shape a better governance of agrobiodiversity.

Plant genetic diversity is crucial to the breeding of food crops and is therefore a central precondition for food security. Diverse genetic resources provide the genetic traits required to deal with crop pests and diseases, as well as changing climate conditions. It is also essential for the millions of people worldwide who depend on traditional small-scale farming for their livelihoods. As such, plant genetic diversity is an indispensable factor in the fight against poverty.

However, the diversity of domesticated plant varieties is disappearing at an alarming rate while the interest in the commercial use of genetic resources has increased in line with bio-technologies, followed by demands for intellectual property rights. The ensuing struggle over genetic resources has given rise to several international agreements. A new book by FNI Senior Research Fellow Regine Andersen provides the first comprehensive analysis of how the international agreements pertaining to crop genetic resources affect the management of these vital resources for food security and poverty eradication in developing countries.

The book analyses the international regimes and their interaction, traces the driving forces across scales, and examines the effects in developing countries. Finally, it identifies entry points for shaping better governance of agrobiodiversity.

A key conclusion is that the interaction between the various regimes has had largely negative effects on the management of crop genetic diversity in developing countries - despite the good intentions behind the individual agreements. The result of these developments is an emerging anti-commons tragedy: a situation in which multiple actors have the possibility to exclude each other from the use of plant genetic resources in agriculture.

Not only is this a threat to the conservation and sustainable use of these resources, but it may also seriously affect food security and the outlook for combating poverty in the world. With the International Treaty on Plant Genetic Resources for Food and Agriculture, which was adopted in 2001, the international community has an instrument with the potential to change this negative trend. Whether that will happen, however, depends crucially on the political will of the contracting parties to the Treaty.
It is my sincere hope that this book can contribute to the efforts already underway, aimed at breaking out of the vicious circle of today's management of plant genetic resources for food and agriculture, so that we may ensure the continued maintenance of these resources so vital to food security and poverty eradication. I also hope it will advance our understanding of how international regimes can better be employed as instruments for strengthening global governance in environmental issues. - Dr Regine Andersen
The "Governing Agrobiodiversity" project ran from 2000 to 2008 and was the base for the doctoral dissertation of Regine Andersen. The point of departure was a series of case studies in one country, the Philippines, and the results have been validated with regard to their relevance for other developing countries. The focus of the case studies was on how actors that were engaged in the field of plant genetic resources for food and agriculture have influenced the processes leading to the legislation on this issue, and on the factors that have influenced the positions of these actors. In this context, the influence of international regimes on actor positions was emphasised. Finally, the effects of these constellations on the implementation of the legislation were analysed.

The doctoral dissertation was submitted and approved in 2006, and was successfully defended on 23 February 2007. An edited version of the dissertation will be published by Ashgate in 2008.

The project was funded by the Research Council of Norway. Dr Andersen performed her research at the Fridtjof Nansen Institute, an independent Norwegian foundation engaged in research on international environmental, energy and resource management politics.

References:
Andersen, Regine, Governing Agrobiodiversity: Plant Genetics and Developing Countries. Aldershot, Ashgate, 2008, 420 p. ISBN 978-0-7546-4741-6

Fridtjof Nansen Institute: Governing Agrobiodiversity – Inter-Regime Conflicts on Plant Genetics and Developing Countries - project website.



Article continues

Tuesday, September 09, 2008

Landmark study reports breakdown in biotech patent system - scientists call for overhaul of IP laws worldwide


The world's intellectual property (IP) system is broken. It's stopping lifesaving technologies from reaching the people who need them most in developed and developing countries, according to the authors of a landmark report released in Ottawa today by an international coalition of experts. In 'Toward a New Era of Intellectual Property: From Confrontation to Negotiation' [*.pdf], the authors state that biotech is at the center of our progress in health care, agriculture, and in the environmental sciences. They note that the need for win-win IP systems based on cooperation is larger than ever before, but the chances of getting there look bleaker than ever. The experts call for a major overhaul of the current system.

Richard Gold, professor of intellectual property at McGill University and chair of the International Expert Group that produced the report:
We found the same stumbling blocks in the traditional communities of Brazil as we did in the boardroom of a corporation that holds the patent to a gene that can determine the chance a woman will develop breast cancer. Most striking is that no matter where we looked, the lack of trust played a vital role in blocking negotiations that could have benefited both sides, as well as the larger public.
The report is the result of seven years of work by Gold and his colleagues, experts in law, ethics and economics. Gold said that the authors based their report on revelations that came out of discussions with policy-makers, industry representatives, scientists and academics from around the world, as well as the outcomes of a series of case studies involving Brazil, Canada, Kenya, the United States, the European Union, Japan, Australia and India.

The authors portray a crucially important but increasingly dysfunctional industry that relies on a business model based on outdated conceptions of IP. In their report, the authors describe conclusions and recommendations based on data collected over the last seven years; the data itself will be released at an October 14 event in Washington, DC.
The old IP approach of the biotechnology community has failed to deliver on its potential to address disease and hunger in both developing and industrialised nations. We need to do better, and the IT world has shown us part of the solution. Look at the way that change has swept through the IT world and brought benefits to millions. - Professor Richard Gold
While biotech's potential seems unlimited, so do its problems. The report finds that a fixation on patents and privately-controlled research has frequently given rise to controversy and roadblocks to innovation. Recent examples include: the $612 million patent suit that almost shut down the world's Blackberries; Myriad Genetics' inability to introduce its breast cancer screening test in Canada and Europe; a pharmaceutical industry with an increasingly bare medicine cabinet; an ongoing failure to deliver life-saving medications to developing countries.
For better or for worse, biotechnology is at the heart of current debates about health care, the environment, food and development. It offers the promise of producing plants to resist drought and nourish the world's poor, and to offer new medicines and energy sources. Biotech is at the heart of not only today's economy but its security and well-being as well. - Professor Gold
The current crisis in biotechnology has given rise not only to economic problems but to endemic mistrust among its actors that is stifling innovation and preventing cutting-edge technologies from helping those who can most benefit. The report and case studies provide the following as illustrations:
:: :: :: :: :: :: :: :: :: ::

  • Findings from a study of stem cell researchers suggesting that, measured against standard indicators of research success, those who patent the most collaborate the least.
  • The reasons for the breakdown of negotiations between a US company, which had patented human genes for breast and ovarian cancer, and Canadian health authorities. A case study reveals that talks with Canada were in crisis after Myriad delivered threatening letters to the Ontario Minister of Health from US Senator Orrin Hatch and from the US Ambassador to Canada, Paul Cellucci.
  • Evidence based on a Brazilian case study that ensuring property rights to indigenous practices and knowledge has served as a significant barrier to research in Brazil and has not furthered the interests of the country's traditional communities.
  • Revelations from participants in discussions of Canada's legislation allowing generic manufacturers to produce drugs for poor countries—known as the Access to Medicines Regime. By the end of the negotiations, they said privately that they knew the regime would not work, yet they publicly applauded it.
While exposing a number of systemic failures associated with biotech and IP regimes, the Expert Group reports that the best innovative activity occurs when everyone – researchers, companies, government and NGOs – works together to ensure that new ideas reach the public, but are appropriately regulated and efficiently delivered to those who need them.

At an event on Tuesday in Ottawa, organized to present the report to Canadian policymakers, the findings of the Brazilian case study were presented by Maristela Basso of the Brazilian Institute of International Trade Law & Development (IDCID).
NGOs in Brazil help communities sue researchers and companies that use indigenous knowledge without consent, but no one is present to help communities change the legislation or enter into agreements with those same companies in advance, so that everyone can benefit. This leaves behind a culture of mistrust. The NGOs and local community leaders often distrust industry and are therefore reluctant to negotiate. On the other hand, researchers and industry feel so overburdened by a maze of unworkable rules and procedures that they trust neither the government nor the local communities. - Maristela Basso, associate professor of international law at the University of São Paulo School of Law.
The authors of the new report make a number of concrete recommendations that would address the problems she and her colleagues had documented in their case study. Pointing to governments, the private sector and universities as crucial players, the authors call for better management of scientific knowledge and new ways to measure whether technology transfers are working.

Chad Gaffield, president of the Social Sciences and Humanities Research Council of Canada (SSHRC), which funded the research activities that led to the report, noted the work of the same group in helping international organizations that are struggling with ways to improve access to biotech breakthroughs for poor countries. Most recently for UNITAID, an international governmental group, Gold and his colleagues have created the design for a patent pool to unblock patents so that needed fixed dose combination and pediatric antiretroviral medicines reach those suffering from HIV/AIDS.
The end of our old way of doing business does not mean we don't need a system for protecting intellectual knowledge. We need an IP system that will support collaborations among researchers and partners in industry and academia worldwide so that knowledge gets to those who need it most. This means the laws may have to be changed, but more importantly, it means that we have a lot of work to do to change behaviors and build trust among all the players. How people behave – in other words, their practices – and the effect of practices on innovation is critical. Public and private institutions – patent offices, courts, universities, governments, corporations and industry groups – that manage, award, review and hold intellectual property also play an essential role in shaping the IP system. - Professor Gold
Recommendations
The report released today, Toward a New Era of Intellectual Property: From Confrontation to Negotiation, documents a series of failed attempts to expand access to both traditional knowledge and the products of modern biotechnology. The authors, members of the International Expert Group on Biotechnology, Innovation and IP, make a number of concrete recommendations to address their findings. Pointing to governments, the private sector and universities as crucial players, they call for better management of scientific knowledge and new ways to measure whether technology transfers are working. The following are among the key recommendations of the group, which is organized under the aegis of McGill University and the non-profit The Innovation Partnership.

Governments should:
  • Seek other ways to encourage innovation—not just through IP, but through health and environmental regulations, the judicial system and tax rules, for instance.
  • Work with industry to help create respected and trusted entities whose members can be counted on to mediate disputes fairly, and encourage the participation of indigenous and local communities in policy development.
  • Develop Public-Private Partnerships to conduct early-stage research, including through the sharing of health-related data, so that risk can be shared across industry.
Patent offices should:
  • Collect standardized patent-related information, including licensing data, as is already done in Japan.
  • Assist developing countries and NGOs in finding out which patents exist in order to enable licensing
Industry should:
  • Establish an independent, non-profit technology assessment organization to evaluate new biotechnology products from developing countries
  • Participate actively in the creation of Public-Private Partnerships and other collaborative mechanisms
  • Be transparent about patent holdings
  • Develop new business models that promote partnerships and collaborations
Universities should:
  • Develop clear principles relating to the use and dissemination of intellectual property and promote greater access and broad licensing
  • Develop measures of the success of technology transfer based on social returns rather than on the number of patents held.
  • Enter into collaborations between developed and developing countries to ensure that developing country doctoral and post-doctoral students have opportunities to study and work at home.
The Innovation Partnership (TIP) is an independent non-profit consultancy with experts in developed and developing countries specializing in the understanding, use and management of intellectual property. TIP's mission is to foster innovation and creativity through the better use of intellectual property and its alternatives.

Funded by the Canadian government and organized through McGill University's Centre for Intellectual Property Policy, the International Expert Group on Biotechnology, Innovation and Intellectual Property concluded that government, industry and NGOs lack independent, empirically-based expertise on how best to adapt intellectual property to the needs of modern society.

References:

International Expert Group on Biotechnology, Innovation and Intellectual Property: Toward a New Era of Intellectual Property: From Confrontation to Negotiation - A Report from the International Expert Group on Biotechnology, Innovation and Intellectual Property [*.pdf] - September 2008.


Article continues

Sir David King: Green activists 'are keeping Africa poor'


Western do-gooders are impoverishing Africa by promoting traditional farming at the expense of modern scientific agriculture, according to Sir David King, Britain's former chief scientist. Anti-science attitudes among aid agencies, poverty campaigners and green activists are denying the continent access to technology that could improve millions of lives, the professor said. Sir David made his critique during his keynote lecture to the British Association for the Advancement of Science.

Biopact can agree with several elements of the chief scientist's sharp and scathing analysis. We too think Africa is too often used as a fantasy screen on which irrational desires instead of rational solutions are projected - Africa should somehow, in the name of 'tradition' remain a place of idyllic poverty and autarky, a laboratory for experiments with ideas that don't make it in the West, a locus where the anti-modern tendencies of dissatisfied, often reactionary and conservative forces, can be played out.

We would not hesitate to conclude that, contrary to what they say, many of the aid workers who are active in Africa, are not progressive, free-thinking forces who rely on science, technology and reason to help African communities determine their own future and reach key development goals, in full solidarity with the international community. On the contrary, their views often resemble those of reactionary, science-averse, and anti-modern conservatives who look at Africans as a 'species to be preserved' in its blissful state of traditionalism, out of touch with the modern world. What is more, many of the 'green' actions they think will contribute to environmental and social sustainability, are, objectively speaking, extremely inefficient, lead to poverty and spur environmental destruction (previous post).

Sir David zooms in on the way in which these Western activists on the continent deal with agriculture - the single most important field of development. Non-governmental organisations from Europe and America are turning African countries against sophisticated farming methods, in favour of 'indigenous' and 'organic' approaches that cannot deliver the continent's much needed 'green revolution', he believes.

The professor said that the slow pace of African development was linked directly to this irrational Western influence.
I'm going to suggest, and I believe this very strongly, that a big part has been played in the impoverishment of that continent by the focus on nontechnological agricultural techniques, on techniques of farming that pertain to the history of that continent rather than techniques that pertain to modern technological capability. Why has that continent not joined Asia in the big green revolutions that have taken place over the past few decades?

The suffering within that continent, I believe, is largely driven by attitudes developed in the West which are somewhat anti-science, anti-technology - attitudes that lead towards organic farming, for example, attitudes that lead against the use of genetic technology for crops that could deal with increased salinity in the water, that can deal with flooding for rice crops, that can deal with drought resistance. - Sir David King
Sir David, who stepped down in December as the Government's Chief Scientific Adviser, used his presidential address to the BA Festival of Science in Liverpool to accuse governments and NGOs of confused thinking about African development.
Solutions will only emerge if full use is made of modern agricultural technology methods, under progressive, scientifically informed regulation. The most advanced form of plant breeding, using modern genetic techniques, is now available to us. Plant breeding needs to meet a range of demands, including defences against evolving plant diseases, drought resistance, saline resistance, and flood tolerance. The problem is that the Western-world move toward organic farming - a lifestyle choice for a community with a lavish surplus of food - and against agricultural technology in general and GM in particular, has been adopted across Africa, with the exception of South Africa, with devastating consequences. - Sir David King
:: :: :: :: :: :: :: :: :: :: :: :: ::

His remarks place him in direct opposition to some of his former Whitehall colleagues. The British government recently endorsed the International Assessment of Agricultural Science and Technology, a report from 400 scientists and development experts published in April, which championed small-scale farming and traditional knowledge. The exercise was led by Professor Bob Watson, the chief scientist at the Department for Environment, Food and Rural Affairs. Sir David said that its findings were short-sighted. “I hesitate to criticise Bob Watson, who I admire enormously, but I think that we have been overwhelmed by attitudes to Africa that for some reason are qualitatively different to attitudes elsewhere.”

The professor also argues that, if one really scrutinizes the 'green' ideas and actions of many of these organisations, one finds that they achieve the contrary: concepts that in practice lead to unsustainable, environmentally destructive lifestyles:
We have the technology to feed the population of the planet. The question is do we have the ability to understand that we have it, and to deliver? I think there is a tremendous groundswell of feeling that we need to support 'tradition' in Africa. What that actually means in practice is if you go to a marketplace in a lovely town like Livingstone in Zambia, near Victoria Falls, you will see hundreds of people with little piles of their crops for sale.

This is an extremely inefficient process. The sort of thing we're seeing [and is promoted nowadays as 'organic farming'] existed in this country hundreds of years ago. I don't believe that will lead to the economic development of Africa. - Sir David King
Sir David cited the example of rice that can resist flooding, which has been developed by the International Rice Research Institute (IRRI) in the Philippines. Its development has been held up for several years because scientists felt they could not use GM techniques, such is the scale of Western-influenced opposition to the technology.

He also accused green groups such as the UN Environment Programme of agitating against new technologies on the basis of speculative risks, while ignoring potential benefits.
For example, Friends of the Earth in 1999 worried that drought-tolerant crops may have the potential to grow in 'habitats unavailable' to conventional crops. The priority of providing food to the area of the world in greatest need appears not to have been noted. For decades, approaches to international development have been dominated by this well-meaning but fatally flawed doctrine. - Sir David King
The strange thing about Sir David's position is that he will be branded a reactionary because he goes against the Western consensus on the way to approach Africa, and wants to stick to the recipes of modernity - the use of reason, science and an objective analysis of material conditions - to help achieve certain goals.
I’m from Africa. I feel very strongly about this issue. Africa needs economic development in just the same way as our country had economic development. What we see there existed here hundreds of years ago. African development will depend, as ours did, on modern skills right across the board. - Sir David King
Sir David A. King is the Director of the Smith School of Enterprise and the Environment at the University of Oxford. He was the UK Government’s Chief Scientific Adviser and Head of the Government Office of Science from October 2000 to 31 December 2007. In that time, he raised the profile of the need for governments to act on climate change and was instrumental in creating the new £1 billion Energy Technologies Institute. In 2008 he co-authored The Hot Topic on this subject.

As Director of the Government’s Foresight Programme, Sir David created an in-depth horizon-scanning process which advised government on a wide range of long-term issues, from flooding to obesity. He also chaired the government’s Global Science and Innovation Forum from its inception. He advised government on issues including the foot-and-mouth disease epidemic of 2001, post-9/11 risks to the UK, GM foods, energy provision, and innovation and wealth creation, and he was heavily involved in the Government’s Science and Innovation Strategy 2004-2014.

Professor King suggested that scientists should honour a Hippocratic Oath for Scientists.

Picture: women in the Democratic Republic of Congo, planting crops by hand. Should they use modern farming tools instead?

References:

British Association for the Advancement of Science: BA President accuses West of retarding African development - September 8, 2008.

The Times Online: Green activists 'are keeping Africa poor' - September 8, 2008.

Biopact: Researchers: lack of farming opportunities, Western subsidies key causes of conflict in third world - January 20, 2008


Article continues

Monday, September 08, 2008

New insights into plants' chemical defense mechanisms could lead to ecofriendly fungicides, pesticides, climate-proof crops


Even closely related plants produce their own natural chemical cocktails, each set uniquely adapted to the individual plant's specific habitat. Comparing anti-fungals produced by tobacco and henbane, an international team of researchers discovered that only a few mutations in a key enzyme are enough to shift the whole output to an entirely new product mixture. Making fewer changes led to a mixture of henbane and tobacco-specific molecules and even so-called "chemical hybrids," explaining how plants can tinker with their natural chemical factories and adjust their product line to a changing environment without shutting down intracellular chemical factories completely.

The findings not only gave the scientists a glimpse of the plants' evolutionary past, but may help them fine-tune the production of natural and environmentally friendly fungicides and pesticides as well as new flavors and fragrances by turning "enzymatic knobs" in the right direction. The insights may also help in the design of crops capable of withstanding climate change. The study was published ahead of print in the Sept. 7 online edition of Nature Chemical Biology.

Trying to make the best of their real estate, plants rely on an impressive arsenal of volatile and nonvolatile molecules, which diffuse easily through the membranes of the cells that produce them to communicate and interact with the outside world. Often highly aromatic and exceedingly specific for a particular ecological niche, these chemicals attract pollinators, summon natural predators of pests, defend against competitors or, through their antimicrobial properties, protect against natural plant pathogens such as fungi and bacteria.
Most people are familiar with the word biodiversity, but 'chemodiversity,'—the extraordinary tapestry of natural chemicals found in plants — is just as important for life, the appearance of new species and the survival of many different ecosystems on the earth. - Joseph P. Noel, Ph.D, director of the Jack H. Skirball Center for Chemical Biology and Proteomics, Howard Hughes Medical Institute investigator, lead author
For centuries, mankind has exploited this vast reservoir of natural chemicals for the discovery of new pharmaceuticals to treat disease. Understanding the chemistry and evolutionary principles that underlie this extraordinary biological diversity will show us how to alter biosynthetic pathways to equip crops with natural and environmentally friendly defenses against pests and diseases, to produce new pharmaceuticals, to enhance levels of naturally occurring health-promoting nutrients or to speed up plant adaptation in the face of global climatic change.

For the current study, postdoctoral researcher and first author Paul O'Maille, Ph.D., probed the metabolic pathways that members of the nightshade family, which includes tobacco, tomatoes, potatoes, peppers and henbane, use to produce terpenes — compounds that impart aromatic odors and flavors to foods. In many cases, they are also modified in the plant to produce so-called phytoalexins, which are natural forms of anti-fungal and antimicrobial compounds found in many different plants:
:: :: :: :: :: :: :: :: :: ::

Henbane (Hyoscyamus muticus) and tobacco (Nicotiana tabacum) each rely on a different phytoalexin to defend themselves against the fungi typical of their habitat. Yet the more than 500 amino acids that make up the chemical factories in each — known as sesquiterpene synthases — are nearly identical, with very minor differences accumulated over several million years of evolutionary change. Using structural analyses, O'Maille and his colleagues had earlier discovered that changing only 9 of the 550 amino acids shifts production from the tobacco-specific phytoalexins to the henbane versions and vice versa.

This time, they were trying to understand the many possible roads that cross the evolutionary divide between tobacco and henbane sesquiterpene synthases. O'Maille created a gene library that encoded all possible amino acid combinations, 512 in total, and produced and analyzed the mutant proteins, paying specific attention to the chemical output and efficiency of each enzyme. This was the first systematic effort to link DNA sequence variation with chemical complexity, says O'Maille.
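To see where the figure of 512 comes from: nine variable positions, each of which can carry either the tobacco-type or the henbane-type residue, give 2^9 = 512 possible combinations. The short sketch below simply enumerates such a combinatorial library; it is illustrative only, and the position numbers and residues are invented rather than taken from the study.

```python
# Illustrative sketch only: enumerating a combinatorial variant library.
# Hypothetical data -- position -> (tobacco-type residue, henbane-type residue).
from itertools import product

variable_positions = {
    264: ("T", "C"), 266: ("V", "I"), 268: ("A", "S"),
    270: ("L", "M"), 273: ("G", "D"), 402: ("Y", "F"),
    436: ("S", "T"), 438: ("I", "V"), 439: ("N", "D"),
}

def enumerate_variants(positions):
    """Yield one dict per variant, mapping each position to a chosen residue."""
    keys = sorted(positions)
    for choice in product(*(positions[k] for k in keys)):
        yield dict(zip(keys, choice))

variants = list(enumerate_variants(variable_positions))
print(len(variants))  # 512 = 2**9 variants, matching the library size described above
```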

This first glimpse revealed a rugged landscape of catalytic activities, where small changes gradually shift the equilibrium between both phytoalexins and in some cases cause rapid evolutionary jumps. It isn't the specific amino acid change that's important but rather the genetic context in which it occurs, O'Maille adds.

Now the researchers are planning to extend their studies to other members of the nightshade family, including tomato, potato, pepper and eggplant, to see how the simplified laboratory system is recapitulated by Mother Nature. This latter much larger study will take the researchers all over the globe to sample organisms and ecosystems harboring this large family of agriculturally important plants, says Noel, who predicts that it is almost certain that the highly simplified experimental system just published will undergo revision as the scientists peel back the layers of time and begin to understand the enormous biochemical potential of the plant kingdom.

Researchers who also contributed to the study include interns Arthur Malone and Iseult Sheehan and graduate student Nikki Dellas in the Noel lab; Professor B. Andes Hess Jr., Ph.D., Department of Chemistry at Vanderbilt University in Nashville; Professor Lidia Smentek, Ph.D., Institute of Physics, Nicolaus Copernicus University, Toruń, Poland; research scientist Bryan T. Greenhagen, Ph.D., Microbia Precision Engineering; Professor Joe Chappell, Ph.D., in the Department of Plant and Soil Sciences at the University of Kentucky in Lexington; and Gerard Manning, Ph.D., director of the Razavi Newman Center for Bioinformatics at the Salk Institute.

Picture: Subcellular phytoalexin localization. Credit: Purdue University - Botany and Plant Pathology.

References:
Paul E O'Maille, Arthur Malone, Nikki Dellas, B Andes Hess, Jr, Lidia Smentek, Iseult Sheehan, Bryan T Greenhagen, Joe Chappell, Gerard Manning & Joseph P Noel, "Quantitative exploration of the catalytic landscape separating divergent plant sesquiterpene synthases", [full article, open access], Nature Chemical Biology, Published online: 7 September 2008 | doi:10.1038/nchembio.113




Article continues

New software shows the most efficient way to transport goods: rail, road or waterway?

Researchers from the Fraunhofer Institute for Material Flow and Logistics (IML) in Dortmund are developing new comprehensive software that determines the cheapest, fastest or most environmentally compatible mode of transportation for a wide range of goods. Biomass as well as other agricultural products are being increasingly traded across countries and continents, but transport costs can be relatively high. The new software helps to lower costs by showing when it is best to make use of rail, road or waterway. It is the first highly integrated and detailed logistics model to do so, and it will be available online soon.

Most national and international freight is transported by road, because it is the least expensive option. But this is likely to change soon, due to road tolls and the rising cost of fuel. Even when it’s a question of making sure that the merchandise is delivered precisely on time, trucks are not always the most reliable solution. It can often take a long time to clear goods through the container terminals, and tailbacks on the motorways can cause additional delays.

So what is the optimum strategy for transporting goods over a particular route? Where could costs be saved by using inland waterways, and at what point would it be best to transship to a road or rail vehicle? What is the cheapest, fastest, or most environmentally compatible overall solution? All answers can be provided by the new software package.

The user enters the locations between which the goods are to be transported, as you would when using a route planner, says IML team leader Joachim Kochsiek. The system calculates different variants to find the optimum solution that fits the specified criterion: costs, time or, in a future version, least environmental burden. It even factors in the time and costs for transshipment.
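To make the idea concrete, the sketch below shows one way such a multimodal search could be set up: locations become graph nodes, each connection carries a transport mode together with cost, time and emission figures, and switching modes at a node adds a transshipment penalty before a shortest-path search is run for the chosen criterion. This is not the IML software itself; the network, prices and penalties are invented purely for illustration.

```python
# Minimal multimodal route search (illustrative only; all figures invented).
import heapq
from itertools import count

# (origin, destination, mode) -> weights per optimization criterion
edges = {
    ("Dortmund", "Duisburg", "road"):   {"cost": 120, "time": 1.0,  "co2": 90},
    ("Dortmund", "Duisburg", "rail"):   {"cost": 90,  "time": 2.0,  "co2": 30},
    ("Duisburg", "Rotterdam", "barge"): {"cost": 60,  "time": 14.0, "co2": 20},
    ("Duisburg", "Rotterdam", "road"):  {"cost": 200, "time": 2.5,  "co2": 150},
}
TRANSSHIP = {"cost": 40, "time": 3.0, "co2": 5}   # penalty for switching modes

def best_route(start, goal, criterion="cost"):
    """Dijkstra over (location, arrival mode) states for a single criterion."""
    tie = count()                                  # tie-breaker so the heap never compares paths
    queue = [(0.0, next(tie), start, None, [])]
    settled = {}
    while queue:
        score, _, loc, mode, path = heapq.heappop(queue)
        if loc == goal:
            return score, path
        if settled.get((loc, mode), float("inf")) <= score:
            continue
        settled[(loc, mode)] = score
        for (a, b, m), w in edges.items():
            if a != loc:
                continue
            extra = w[criterion]
            if mode is not None and m != mode:     # mode change => transshipment penalty
                extra += TRANSSHIP[criterion]
            heapq.heappush(queue, (score + extra, next(tie), b, m, path + [(a, b, m)]))
    return None

print(best_route("Dortmund", "Rotterdam", criterion="cost"))   # cheapest: rail, then barge
print(best_route("Dortmund", "Rotterdam", criterion="time"))   # fastest: road all the way
```

Adding a third criterion such as CO2 would then only require supplying the corresponding edge weights, which is essentially what the planned environmental-burden version of the software would need.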

Digital maps of road, rail and inland waterway networks can be purchased off the shelf, but the information they provide is not sufficiently detailed for the new software:
:: :: :: :: :: :: :: :: ::

There are different categories of train, and different pricing systems for different rail connections – you can’t simply apply a standard price per kilometer. Producers, traders and transportation companies need to know what rules to apply to the speed, width and height of trains, how many wagons are permitted on a particular section of railroad, and the maximum speed limit. Whereas this kind of information is included in road maps, it has to be compiled manually for the rail networks, Kochsiek explains.

For each mode of transportation, the system adapts its calculation of costs and fuel consumption to the degree of capacity utilization. For example, the lower the number of wagons pulled by a locomotive, the higher the costs.
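A toy calculation illustrates that utilization effect: a train run has a largely fixed cost, so the cost attributed to each wagon falls as more wagons share the run. The figures below are invented for illustration only.

```python
# Invented figures, purely to illustrate the utilization effect described above.
def cost_per_wagon(wagons, fixed_run_cost=3000.0, cost_per_wagon_km=0.8, km=400):
    variable = cost_per_wagon_km * km            # cost attributable to a single wagon
    return fixed_run_cost / wagons + variable

for n in (5, 15, 30):
    print(n, round(cost_per_wagon(n), 1))        # 5 -> 920.0, 15 -> 520.0, 30 -> 420.0 per wagon
```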

A prototype version of the software for optimizing time and costs is already available. The researchers are now working on the algorithms for calculating the environmental burden. A later version with online access will enable modified shipping timetables, for instance, to be instantly included in the calculations.

References:
Fraunhofer-Gesellschaft: Rail, road or waterway? - September 8, 2008.

Fraunhofer Institute for Material Flow and Logistics (IML) - institute website.

Article continues

Sunday, September 07, 2008

UN climate chief: less meat = less heat

Dr Rajendra Pachauri, Chair of the Intergovernmental Panel on Climate Change, and joint winner of the 2007 Nobel Peace Prize, will present the case that eating less meat can help us fight climate change in an efficient way. Pachauri will deliver the message tomorrow when he presents the Peter Roberts Memorial Lecture titled "Global Warning: the impact of meat production & consumption on climate change", at a conference organised by Compassion in World Farming (CIWF), an animal welfare organisation.

According to the FAO, global livestock production is responsible for more greenhouse gas emissions than the entire transport sector (previous post). Cutting back on it can prevent the release of large amounts of methane, CO2 and N2O, which are emitted during the long production chain of meat, from field to fork. A large fraction of these emissions arise from tropical forests that are cleared to make way for cattle feed production. Other main sources are the fermentation of feed by ruminants and the emissions from manure (graph, click to enlarge).

According to Henning Steinfeld, Chief of FAO’s Livestock Information and Policy Branch and senior author of a recent report entitled Livestock's long shadow - Environmental issues and options, “livestock are one of the most significant contributors to today’s most serious environmental problems. Urgent action is required to remedy the situation.”

With increased prosperity, people are consuming more meat and dairy products every year. Global meat production is projected to more than double from 229 million tonnes in 1999/2001 to 465 million tonnes in 2050, while milk output is set to climb from 580 to 1043 million tonnes.

The global livestock sector is growing faster than any other agricultural sub-sector. It provides livelihoods to about 1.3 billion people and contributes about 40 percent to global agricultural output. For many poor farmers in developing countries livestock are also a source of renewable energy for draft and an essential source of organic fertilizer for their crops.

But such rapid growth exacts a steep environmental price. “The environmental costs per unit of livestock production must be cut by one half, just to avoid the level of damage worsening beyond its present level,” the FAO report warns. When emissions from land use and land use change are included, the livestock sector accounts for 9 percent of CO2 deriving from human-related activities, but produces a much larger share of even more harmful greenhouse gases. It generates 65 percent of human-related nitrous oxide, which has 296 times the Global Warming Potential (GWP) of CO2. Most of this comes from manure.

The sector also accounts for 37 percent of all human-induced methane (23 times as warming as CO2), largely produced by the digestive systems of ruminants, and for 64 percent of ammonia, which contributes significantly to acid rain:
:: :: :: :: :: :: :: :: :: ::
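To see what those Global Warming Potential factors mean in practice, emissions of different gases are converted into CO2-equivalents by multiplying each quantity by its GWP. The sketch below uses the factors cited above (23 for methane, 296 for nitrous oxide); the tonnages themselves are invented purely for illustration.

```python
# GWP factors as quoted in the FAO report above; tonnages are hypothetical.
GWP = {"co2": 1, "ch4": 23, "n2o": 296}

def co2_equivalent(emissions_tonnes):
    """Convert a dict of gas -> tonnes into total tonnes of CO2-equivalent."""
    return sum(GWP[gas] * tonnes for gas, tonnes in emissions_tonnes.items())

herd = {"co2": 1000.0, "ch4": 120.0, "n2o": 4.0}   # invented annual figures for one operation
print(co2_equivalent(herd))   # 1000 + 120*23 + 4*296 = 4944.0 t CO2e
```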

There are various possibilities for reducing the greenhouse gas emissions associated with farming animals. They range from scientific approaches, such as genetically engineering strains of cattle that produce less methane flatus, to reducing the amount of transport involved through eating locally reared animals.

Farmers in Europe strongly support research aimed at reducing methane emissions from livestock farming by, for example, changing diets and using anaerobic digestion to produce biogas.


At the CIWF conference, Dr Pachauri will especially urge people in wealthy countries to lower their meat consumption. This will free up resources to feed the billions of poorer people who cannot currently afford animal protein, but who could benefit from it healthwise.

Changing diets in highly developed countries would also benefit the health of wealthier people. Overconsumption of meat products has contributed significantly to the obesity pandemic and its related health problems. As many a 'food pyramid' shows, red meat, dairy products, poultry and fish should only form a small part of a healthy diet.

CIWF's ambassador Joyce D'Silva said that thinking about climate change as well as the health benefits of eating less meat, could spur people to change their habits. "The climate change angle could be quite persuasive," she thinks. "Surveys show people are anxious about their personal carbon footprints and cutting back on car journeys and so on; but they may not realise that changing what's on their plate could have an even bigger effect."

Ms D'Silva believes that governments negotiating a successor to the Kyoto Protocol ought to take the emissions from livestock production into account. "I would like governments to set targets for reduction in meat production and consumption," she said.

That is something that should probably happen at a global level as part of a negotiated climate change treaty, and "it would be done fairly, so that people with little meat at the moment such as in sub-Saharan Africa would be able to eat more, and we in the west would eat less."

Dr Pachauri, however, sees it more as an issue of personal choice. "I'm not in favour of mandating things like this, but if there were a (global) price on carbon perhaps the price of meat would go up and people would eat less," he said. "But if we're honest, less meat is also good for the health, and would also at the same time reduce emissions of greenhouse gases."

Graph: Proportion of greenhouse-gas emissions from different parts of livestock production. Adapted from FAO.

References:

CIWF Peter Roberts Memorial Lecture: "Global Warning: the impact of meat production & consumption on climate change".

CIWF: Global warning: climate change and farm animal welfare [*.pdf] - 2007.

CIWF: Impact of livestock farming: solutions for animals, people and the planet [*.pdf] - 2007.

Biopact: FAO: Livestock a major environmental and climate threat; implications for bioenergy - December 06, 2006


Article continues