Crowdfunding for renewables comes of age

In recession-era Europe, much of the talk is of ‘innovative’ or ‘alternative’ financing for sustainable energy – meaning money from sources other than the public purse. In 2012, crowdfunding in Europe grew an estimated 65% over 2011, reaching €735 million. With industry analysts Massolution forecasting an 81% increase in global crowdfunding volumes in 2013, it looks like crowdfunding might just get serious.

Crowdfunding is the collective effort of many individuals who pool their resources to support efforts initiated by other people or organisations. This is usually done via or with the help of the Internet. Individual projects and businesses are financed with small contributions from a large number of people, allowing innovators, entrepreneurs and business owners to utilise their social networks to raise capital.

Crowdfunding has several things going for it compared with traditional funding, as a recent report by the European Capacity Building Initiative noted. First, crowdfunding can provide finance to small businesses and community organisations otherwise excluded from formal finance. This support for entrepreneurship is also touted as a leading advantage by lobby group the European Crowdfunding Network. Speed in mobilising funding is another characteristic of crowdfunding – as neatly demonstrated by the recent world record in which €1.3 million was raised in just 13 hours by selling shares in a wind turbine to 1,700 Dutch households, in a deal brokered by Windcentrale. Crowdsourced finance can also accommodate the risk-taking needed to bring novel renewable energy products, which still need to be tested at large scale, to market, since it taps into a more risk-tolerant segment of lenders and investors.
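As a rough sanity check on the Windcentrale figures, the implied average stake and fundraising rate can be worked out directly (the per-household average is our own back-of-the-envelope arithmetic, not a figure from the deal itself):

```python
# Back-of-the-envelope check on the Windcentrale deal:
# €1.3 million raised from 1,700 households in 13 hours.
total_raised_eur = 1_300_000
households = 1_700
hours = 13

avg_stake = total_raised_eur / households   # average investment per household
rate_per_hour = total_raised_eur / hours    # fundraising speed

print(f"Average stake: ~€{avg_stake:,.0f} per household")
print(f"Funding rate:  ~€{rate_per_hour:,.0f} per hour")
```

That works out to roughly €765 per household, raised at €100,000 per hour – small individual contributions, mobilised very quickly, which is exactly the pattern the article describes.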

Cooperatives

Accelerated through social media and online communication, crowdfunding is a financial power tool for energy cooperatives.

To realize the ambitions of local sustainable energy plans, ‘community finance’ – which may be regarded as a form of crowdfunding – could be a big part of the solution. Given the speed with which both crowdfunding and the energy cooperative sector are expanding (the number of European energy cooperatives grew from 1,200 last year to some 2,000 this year), community-financed cooperatives could seriously shake up the energy market in many countries.

Energy cooperatives have perhaps been most successful so far in Denmark. While many other countries are still struggling with local opposition to wind or other green energy projects, and with the integration of fluctuating energy from wind and solar, Denmark has largely overcome these problems by giving local communities a financial stake in local energy projects, by combining heat and power, and by implementing district heating infrastructure across the country. According to Dirk Vansintjan of Belgian renewable energy cooperative Ecopower, ‘For Danes, it is the natural way of organizing themselves. Since the Middle Ages they’ve been doing it, and today most renewable projects in the country are organized this way.’

Germany too is a leader in the field of community energy, with 65 % of its renewable energy capacity community-owned. There are over 600 energy cooperatives in Germany, the number having increased tenfold in the period from 2000-2010.

Despite a long tradition of cooperatives, Spain has only just gained its first in the energy sector – Som Energia. By June 2013, this cooperative had 8,000 members and had invested more than €3 million in renewable energy production projects – an impressive result, achieved in under two years in an acutely recession-struck country.

In the UK, where rising household energy prices are hitting the headlines and energy is set to become a major election issue in May 2015, local energy cooperatives are seen as a way to challenge the dominance of the ‘Big Six’ – Britain’s biggest energy suppliers. The movement is supported by some local authorities, such as Cornwall County Council, which has made a €1.2 million revolving loan fund available to help community groups build local renewable energy installations. Also at the forefront of this work is Cornish energy charity Community Energy Plus. Energy advisor Neil Farrington says, ‘We are currently working with fifteen community energy cooperatives across Cornwall, with more emerging every few weeks.’

In Croatia, the UNDP (United Nations Development Programme) plans to develop a crowdfunding platform for community energy projects and has issued a call for cooperatives to submit project proposals. According to Mak Đukan and Robert Pašičko of the UNDP Environmental Governance programme in Croatia, the design of the UNDP platform will incorporate the best elements of crowdfunding platforms specifically designed to support renewable energy projects, such as Solar Schools, Abundance Generation, SunFunder and Solar Mosaic.

Cohesion funding

If matched or leveraged with other funding, community finance could become a much bigger player in Europe’s energy transition. And it just so happens that between now and next summer there is a brief window of opportunity to access a big pot of money – upwards of €20 billion, in fact. That is the ballpark figure for what will be allocated to investments in energy efficiency and renewable energy in the 2014-2020 funding period of the EU Cohesion Funds.

Cohesion funds are part of the EU budget aimed at reducing regional disparities in income, wealth and opportunity. How the money is spent at national level is determined by negotiation between the Commission and the national Managing Authority. This is formalised through Partnership Agreements (essentially overarching national strategies setting out plans for use of the funds) and Operational Programmes (setting out a region’s priorities for delivering the funds). The point is, the negotiation process is taking place now, and for local sustainable energy leaders to have a shot at the money, they need to be aware of this process and be influencing it.

Whether or not this can happen depends on a number of factors – for example, how transparent or opaque the national managing authority is in public consultation, the degree of support and involvement from local authorities and, perhaps most importantly, the willingness and capacity of local initiatives to get to grips with the maze of bureaucracy involved. But given that many energy cooperatives have already tackled administrative and legal complexity in gaining grid access, they might just be able to handle this.

Earlier this month, representatives from DG Regio (the European Commission’s Directorate-General for Regional Policy) and associations of local authorities discussed the potential for community sustainable energy projects to access cohesion funding. It was concluded that although no precedents exist, there are no legal barriers to community finance providing the private finance element needed to leverage the EU Cohesion Funds.

To be first in this would be some achievement – and for now, the door is open. For example, community-led local development (CLLD) is one of the new aspects of cohesion policy that could support voluntary and community organisations and local authorities, among others. Crucially, CLLD must be included in the national Partnership Agreements, which are currently in draft form. Unless the box is ticked for CLLD, community-based crowdfunded initiatives and other third-sector enterprises (like renewable energy cooperatives) cannot access the big money available through cohesion funding. The devil is in the details – but with billions of euro on the table, there’s a lot to play for. It will be 2021 by the time the next negotiation period rolls around. What will the energy landscape look like then?

CEO Peter Terium of RWE – one of Europe’s largest utilities – stated in 2012: ‘Our core markets are changing remarkably fast. Almost no other industry is currently undergoing such dynamic change as the energy sector… The success of this transformation of the energy industry will be decided at the local level.’

Can the upstarts join forces with the bureaucrats? If the local can get organized quickly enough, distributed energy could become a game-changer faster than we think.

 

Read more:  

EU Cohesion policy support for sustainable energy

Structural and Investment funds 2014-2020

Renewable energy cooperatives


Genovation Cars Behind Advanced EV Battery Research Project


We interrupt our all-Tesla-all-the-time programming for this special announcement: Genovation Cars, a company that all but dropped off the radar a couple of years ago, is partnering in a new hybrid EV battery research project that combines a high density battery pack with an ultracapacitor pack and a DC/DC converter based on silicon carbide.

If all that sounds like a heavy load to carry, guess again. Genovation and its research partner, the University of Maryland, are set on producing an energy storage system that weighs less but lasts longer than a conventional battery pack alone.

What Is This DC/DC Converter Of Which You Speak?

To be honest, “DC/DC converter” hadn’t popped up in conversation before, so we turned to our friends over at the Society of Automotive Engineers (SAE) for an explanation.

DC/DC converters address a critical issue for electric vehicles, which is the plethora of different electrical systems drawing from the same battery, each with its own unique voltage requirements for optimum efficiency.
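The basic arithmetic of a step-down DC/DC converter is simple: for an ideal buck converter, the output voltage is the input voltage scaled by the switching duty cycle, V_out ≈ D × V_in. A minimal sketch (the 400 V pack and 12 V accessory rail are generic illustrative values, not figures from the Genovation project):

```python
def buck_duty_cycle(v_in: float, v_out: float) -> float:
    """Ideal buck converter: V_out = D * V_in, so D = V_out / V_in.
    Real converters run D slightly higher to cover switching/conduction losses."""
    if not 0 < v_out <= v_in:
        raise ValueError("a buck converter can only step voltage down")
    return v_out / v_in

# Illustrative EV case: stepping a high-voltage pack down to a 12 V accessory rail.
duty = buck_duty_cycle(v_in=400.0, v_out=12.0)
print(f"Duty cycle: {duty:.1%}")  # prints "Duty cycle: 3.0%"
```

The appeal of silicon carbide switches here is that higher switching frequencies allow the same conversion with much smaller inductors and capacitors, which is where the size and weight savings quoted below come from.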

G2 electric vehicles, courtesy of Genovation Cars.

That includes stop-start, power steering, air conditioning, safety systems, and any number of things SAE calls “comfort priorities.” That last item in particular is bound to grow as the EV market gains mainstream traction and EV manufacturers add more bells and whistles to stand out from the crowd.

DC/DC converters are an established technology for distributing voltage, but they are not ideal for the light weight/small size requirements of EVs.

This is the nexus of interest for the new EV battery research project, which is funded by a $438,418 grant from the National Science Foundation.

Carborundum Solves DC/DC Conundrum

The new project leaves behind the conventional silicon-based DC/DC converter and focuses on silicon carbide (SiC), a compound of silicon and carbon also called carborundum.

SiC enables higher switching speeds, which significantly reduces the size of related systems. The aerospace industry has been eyeballing SiC for some years now, so it’s not surprising that the EV field would catch on.

Not coincidentally a British R&D consortium spearheaded by the company Prodrive has been working on that very same thing. SAE cites Pete Tibbles, Research Manager for Prodrive, who explains another key advantage:

The very high efficiency of the new technology also reduces the need for heavy and complex cooling systems. We have been able to reduce the size and weight [of the DC/DC converter] by around two-thirds – from around that of a flight bag to more like a shoe box.

It’s A Horserace!

The US has some catching up to do when it comes to developing a next-generation DC/DC converter, but it appears we have a secret weapon on our side.


The University of Maryland end of the partnership is spearheaded by Professor Alireza Khaligh, who leads the school’s Power Electronics, Energy Harvesting and Renewable Energies Laboratory (PEHREL).

For the second year in a row this fall, Khaligh was awarded the Best Vehicular Electronics Paper Award by the IEEE Vehicle Technology Society, a leading global professional organization.

The award was for his paper, co-written with former student and current GE Global Research Center scientist Li Zhihao, “Battery, Ultracapacitor, Fuel Cell and Hybrid Energy Storage Systems for Electric, Hybrid Electric, Fuel Cell and Plug-In Hybrid Electric Vehicles.”

As For Tesla Motors…

Tesla seems to have recovered quite nicely from last year’s spat with the New York Times. Bad press over a couple of recent vehicle fires notwithstanding, the company has a stellar safety record and it is forging ahead on the R&D side (a new battery pack patent being one example) while opening retail stores hand over fist.





About the Author

Tina Casey specializes in military and corporate sustainability, advanced technology, emerging materials, biofuels, and water and wastewater issues. Tina’s articles are reposted frequently on Reuters, Scientific American, and many other sites. You can also follow her on Twitter @TinaMCasey and Google+.




Energy Storage On Silicon Chips — New Supercapacitor Creates Interesting Possibilities For Solar Cells

Silicon chip with porous surface next to the special furnace where it was coated with graphene to create a supercapacitor electrode. Image credit: Joe Howell / Vanderbilt

Published on October 30th, 2013 | by Nathan


The first supercapacitor composed of silicon was recently created by researchers at Vanderbilt University – the novel supercapacitor opens up a number of very interesting possibilities for solar cell technology and mobile electronics. In particular, the researchers note the possibility of developing solar cells that can provide electricity for a full 24 hours of the day, and of developing mobile phones that can recharge in seconds and work for weeks between charges.
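The trade-off behind claims like ‘recharge in seconds’ is energy versus power: a capacitor stores E = ½CV², which is why supercapacitors charge fast but hold far less energy per kilogram than batteries. A quick illustrative comparison (the 3000 F, 2.7 V, ~0.5 kg figures are generic commercial-cell values, not measurements from the Vanderbilt device):

```python
def capacitor_energy_wh(capacitance_f: float, voltage_v: float) -> float:
    """Energy stored in a capacitor: E = 1/2 * C * V^2 (joules), converted to Wh."""
    joules = 0.5 * capacitance_f * voltage_v ** 2
    return joules / 3600.0

# Generic large supercapacitor cell: 3000 F at 2.7 V, roughly 0.5 kg.
energy_wh = capacitor_energy_wh(3000, 2.7)
print(f"Stored energy:  {energy_wh:.1f} Wh")           # ~3.0 Wh
print(f"Energy density: {energy_wh / 0.5:.1f} Wh/kg")  # ~6 Wh/kg, vs ~150+ Wh/kg for Li-ion
```

Closing even part of that two-orders-of-magnitude gap is why improving supercapacitor energy density, as this research aims to do, matters so much.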

The great strength of the new supercapacitor is that, since it is made of silicon, it can simply be built into a silicon chip alongside, and at the same time as, the microelectronic circuitry it powers. The researchers even mention the possibility of constructing these power cells “out of the excess silicon that exists in the current generation of solar cells, sensors, mobile phones and a variety of other electromechanical devices, providing a considerable cost savings.”


“If you ask experts about making a supercapacitor out of silicon, they will tell you it is a crazy idea,” stated Cary Pint, the assistant professor of mechanical engineering who headed the development. “But we’ve found an easy way to do it.”

Most research to date on improving the energy density of supercapacitors has focused on carbon-based nanomaterials like graphene and nanotubes, but because of the great difficulty in “constructing high-performance, functional devices out of nanoscale building blocks with any level of control,” improvements have been slow. So the researchers decided to try something radically different – utilizing porous silicon, a material with a controllable and well-defined nanostructure, made by electrochemically etching the surface of a silicon wafer.


Vanderbilt University provides details:

This allowed the researchers to create surfaces with optimal nanostructures for supercapacitor electrodes, but it left them with a major problem. Silicon is generally considered unsuitable for use in supercapacitors because it reacts readily with some of the chemicals in the electrolytes that provide the ions that store the electrical charge.

With experience in growing carbon nanostructures, Pint’s group decided to try to coat the porous silicon surface with carbon. When the researchers pulled the porous silicon out of the furnace, they found that it had turned from orange to purple or black. When they inspected it under a powerful scanning electron microscope they found that it looked nearly identical to the original material but it was coated by a layer of graphene a few nanometers thick.

“We had no idea what would happen,” Pint explained. “Typically, researchers grow graphene from silicon-carbide materials at temperatures in excess of 1400 degrees Celsius. But at lower temperatures – 600 to 700 degrees Celsius – we certainly didn’t expect graphene-like material growth.”

After testing the coated material, the researchers found that it had chemically stabilized the silicon surface – and that, when it was used to create supercapacitors, the graphene coating “improved energy densities by over two orders of magnitude compared to those made from uncoated porous silicon and significantly better than commercial supercapacitors.”

The researchers think that this approach very likely isn’t specific to graphene. “The ability to engineer surfaces with atomically thin layers of materials combined with the control achieved in designing porous materials opens opportunities for a number of different applications beyond energy storage,” Pint argued.

“Despite the excellent device performance we achieved, our goal wasn’t to create devices with record performance,” Pint continued. “It was to develop a road map for integrated energy storage. Silicon is an ideal material to focus on because it is the basis of so much of our modern technology and applications. In addition, most of the silicon in existing devices remains unused since it is very expensive and wasteful to produce thin silicon wafers.”

The researchers are now pursuing this line of thought – looking to develop energy storage that can be built into the excess materials and/or unused back-sides of solar cells.

The new research was detailed in a paper published in the journal Scientific Reports.








Record Breaking Solar Cell Efficiency From A “Perfect Crystal”


Published on October 28th, 2013 | by Tina Casey


Gallium is already on its way to becoming the workhorse of the solar tech field, and now it looks like the soft metal is on track to become a thoroughbred. A team of US scientists has hit upon a new method for growing indium gallium nitride (InGaN) crystals that could lead to record-breaking solar cell efficiency. So far the method has produced a film of InGaN with “almost ideal characteristics.”

To ice the cake, an analysis of the film revealed the precise reason why the results of the new InGaN growing method were so good, which could lead to further improvements in LED technology as well as solar cells.

Nitride refers to a compound of nitrogen, in this case in conjunction with indium, a soft silvery-white, zinc-like metal, as well as gallium.

If InGaN already rings a bell, you might be thinking of the world record-setting concentrating solar cell module developed by the company Amonix. That module is based on a record-setting solar cell developed by Solar Junction, which incorporates a layer of antimony-doped InGaN.

Gallium in particular is an effective material for LEDs as well as solar cells due to its band gap characteristics, most familiarly in CIGS thin film solar cells (CIGS is the semiconductor copper-indium-gallium-(di)selenide). The potential has barely been scratched, though.
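Band gap engineering is what makes the InGaN alloy system attractive: InN (band gap around 0.7 eV) and GaN (around 3.4 eV) can in principle be blended to span most of the solar spectrum. The longest wavelength a given band gap can absorb follows from λ = hc/E_gap; a short sketch using the approximate end-member band gaps:

```python
PLANCK_EV_S = 4.135667696e-15   # Planck constant, eV·s
LIGHT_SPEED = 2.99792458e8      # speed of light, m/s

def bandgap_to_cutoff_nm(bandgap_ev: float) -> float:
    """Longest absorbable wavelength for a semiconductor: lambda = h*c / E_gap."""
    return PLANCK_EV_S * LIGHT_SPEED / bandgap_ev * 1e9

# Approximate end-member band gaps of the InGaN alloy system.
for name, gap_ev in [("InN", 0.7), ("GaN", 3.4)]:
    print(f"{name}: {gap_ev} eV -> absorption cutoff ~{bandgap_to_cutoff_nm(gap_ev):.0f} nm")
```

The InN end reaches into the infrared while the GaN end sits in the ultraviolet, which is why a well-controlled InGaN alloy is such a tantalizing solar cell (and LED) material.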

Arizona State University and the Georgia Institute of Technology collaborated on the new method, which addressed the problem at its core. The obstacle has been irregularities in the atomic structure of the crystal, as explained by ASU team leader Fernando Ponce:

Being able to ease the strain and increase the uniformity in the composition of InGaN is very desirable, but difficult to achieve. Growth of these layers is similar to trying to smoothly fit together two honeycombs with different cell sizes, where size difference disrupts a periodic arrangement of the cells.

The new method is called metal modulated epitaxy. It is a variation of the epitaxial deposition method first developed at Bell Labs in the 1960s, which involves depositing a thin layer of material onto a substrate so that the new layer takes on the crystal structure of the layer below.

The result was a film that more closely resembles a perfect crystal, both in its uniformity of structure and in the desirable trait of luminosity.

As for why the improvement occurred, the analysis credited “strain relaxation at the first atomic layer of crystal growth.”

We Built This Next-Generation Solar Cell

Solar cell efficiency is not the only factor leading to a drop in the cost of solar power, since the “soft costs” of installing a solar system still account for a considerable chunk of change.

However, solar cell efficiency is still a key factor, and if the new findings translate from the lab to commercial development, let’s throw ourselves a taxpayer appreciation party.


The latest development has roots in a 2008 paper published by Georgia Tech team leader Alan Doolittle with other collaborators, titled “Metal modulation epitaxy growth for extremely high hole concentrations above 10¹⁹ cm⁻³ in GaN.” It described how the metal modulated epitaxy method yielded an enhanced doping efficiency of up to 10 percent, which compares favorably to the 1 percent efficiency achieved under the conventional method.

That research was funded by grants from the Office of Naval Research, the Air Force Office of Scientific Research, and the Defense Advanced Research Projects Agency as well as the National Science Foundation.









Can the EUETS Combine Intensity-Based and Absolute Emissions Caps?

Emissions and Caps

An innovative reserve mechanism for the EUETS is being considered that moves allowances to and from the reserve based on the level of economic activity. This would give the EUETS elements of an intensity-based cap (limiting emissions per unit of output) within an absolute cap (limiting total tonnes emitted). Although potentially less economically efficient than managing reserves based on prevailing prices, it may prove politically more tractable.

The prevailing surplus of allowances in the EUETS is undermining the scheme’s effectiveness as a signal for abatement, and especially for low-carbon investment. The EU is considering a range of reform options to address this. One option is to cancel allowances currently due to be auctioned, although this is likely to face substantial opposition. Another option, not mutually exclusive, is to establish a reserve of allowances to stabilise the market. The reserve options now under discussion are all volume based, in that they seek to stabilise the market by using a reserve to manage the volume of available allowances. But they differ in the basis for the trigger mechanism used to determine the timing and number of allowances transferred to or from the reserve. There are three main bases for a trigger mechanism: the volume of allowances, the level of economic activity (such as GDP), and allowance prices. I assume here that the mechanism applies automatically, but it would be possible for it to be administered on a more discretionary basis, analogous to the types of functions performed by a central bank.

I have previously written about the advantages of allowance reserves that use price based triggers to implement soft floors and ceilings on the price, such as those already implemented in California.  However, even if such mechanisms represent the best way forward in principle they may prove politically impossible to introduce in Europe at the moment.  So what about the other two possible types of trigger?

A volume based trigger is essentially a form of temporary set-aside in which allowances are put into the reserve when the cumulative surplus exceeds a specified upper threshold, and moved back into the market when the surplus falls below a lower threshold.  Allowances thus automatically come back into the market as it tightens, with the timing and extent of the market’s return to scarcity largely unaffected.  Market participants will anticipate the return of allowances to the market and factor it into their pricing.  Such a mechanism thus fails to add significant market stability, and seems unlikely to be worth pursuing.

An economic activity based trigger is quite different. It seeks to make the supply-demand balance more stable by adjusting the supply of allowances in response to the level of economic activity. Allowances are placed into the reserve if economic activity is lower than expected and withdrawn from the reserve if activity is higher. This can reduce supply well into the long term after a recession, because allowances only come back into the market when a period of higher-than-expected growth lifts economic activity above expected levels. This may not occur, or may return only a proportion of the allowances. It turns the scheme into something more like an intensity-based scheme, where emissions per unit of activity are limited, but in this case still subject to an overall cap.

Such a mechanism does not address all of the causes of over- or under-supply. Consequently it may not prevent prices falling very low or rising very high in some circumstances, and so inefficiently low or high prices remain a possibility. However, a price-based trigger would probably be needed to avoid such risks completely.

This type of mechanism requires the measure of economic activity used as a basis for the trigger to be defined.  This may be GDP, which measures (however imperfectly) activity in the economy as a whole, or it may be something that represents activity in the sectors covered by the EUETS, for example a mix of electricity consumption and industrial output.

The expected level of economic activity that corresponds to no transfers to and from the reserve needs to be set.  It will likely be appropriate to reset this from time to time to take account of changed expectations, at least at the start of a new phase of the scheme.

The number of allowances transferred to or from the reserve in response to changes in the level of economic activity (the elasticity) also needs to be defined. For example, if economic activity is 1% lower than expected then 1% of the cap may be put into the reserve, or some other proportion such as 0.75%. The response may include limits on the number of allowances transferred in any one period, for example a quarter of auctioned volumes in any one year. And it would be possible to specify no transfer in the event of small changes in economic activity relative to the expected level.
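The elasticity rule described above is straightforward to state algorithmically. A minimal sketch, using the 0.75 elasticity discussed in the text and, as a simplification, expressing the per-period transfer limit as a fraction of the cap (the 2,000 Mt cap and 25% limit are illustrative numbers, not proposed EUETS parameters):

```python
def reserve_transfer(cap: float, activity_gap_pct: float,
                     elasticity: float = 0.75,
                     max_transfer_fraction: float = 0.25) -> float:
    """Allowances moved to (+) or from (-) the reserve in one period.

    activity_gap_pct: percentage by which economic activity falls short of
    expectations (positive = activity below expectations, so allowances
    go INTO the reserve; negative = above expectations, so they come out).
    """
    transfer = cap * activity_gap_pct * elasticity / 100.0
    limit = cap * max_transfer_fraction          # cap on any single period's move
    return max(-limit, min(limit, transfer))

# Activity 1% below expectations with a 2,000 Mt cap:
print(reserve_transfer(2000, 1.0))    # 15.0 (Mt into the reserve)
# An extreme 50% shortfall is clipped by the per-period limit:
print(reserve_transfer(2000, 50.0))   # 500.0 (capped at 25% of the cap)
```

A dead band for small deviations, as the text suggests, would simply return zero when `abs(activity_gap_pct)` falls below some threshold.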

The chart illustrates how such a mechanism might have worked had it been in place since the start of Phase 2. For simplicity it uses GDP as the indicator of economic activity. Expected GDP growth at the start of Phase 2 in 2008 was 2.3% p.a., with rising economic activity over time (dashed blue line). Actual GDP has been well below this (solid blue line), and is currently around 12% below expected levels. This leads to allowances being put into the reserve (green bars and lines, which show annual and cumulative totals, assuming an elasticity of 0.75). This closely matches the cumulative surplus due to lower emissions shown on the chart (dotted black line). The surplus is defined as the difference between the annual cap and actual emissions (dashed and solid grey lines), and excludes any surplus due to other factors such as the use of offsets. In a hypothetical future, the reserve continues to grow as GDP remains below expected values. Then, after expected GDP is reset at the start of Phase 4 in 2021, a period of more rapid than expected growth begins to reduce the size of the reserve. However, a large volume of allowances remains in the reserve even though the cap has significantly tightened and the market is likely to have returned to scarcity.

Chart: annual and cumulative transfers to the reserve under a GDP-based trigger, Phase 2 onwards.

It is not clear whether or not such a reserve mechanism, in effect providing an intensity-based limit subject to an overall absolute cap, is a good idea. But it seems well worth further consideration. If nothing else, it may at least help remove some allowances from the market if permanent set-aside (cancellation) of allowances is not politically feasible. And as an innovation among emissions trading schemes it could go some way to restoring the reputation of the EUETS, and would provide a signal to others around the world that the EU is willing to take action to address the problems with the EUETS as it now stands.

Photo Credit: Emissions Cap and Intensity/shutterstock

Authored by:

Adam Whitmore

Adam Whitmore has over 20 years' experience of the energy sector, and has been working on climate change issues for much of the last 15 years for companies, governments and regulatory agencies.  He writes about all aspects of climate policy.



Acidification: The Ocean's Changing Climate

Ocean Acidification

Scientists overwhelmingly agree that carbon dioxide (CO2) emissions in the atmosphere cause global temperatures to rise, with detrimental impacts on the environment through changes in climate patterns. The increase in ocean temperatures, particularly within the last 50 years, is thought to contribute to stronger hurricanes and tropical storms, as well as changes in ocean currents. Marine ecosystems face other damaging aspects of human-caused carbon emissions as well. A chemical change is taking place in our oceans, making waters more and more acidic. Ocean acidification occurs because excess CO2 entering our oceans creates a high concentration of carbonic acid: the product of the chemical reaction between water and carbon dioxide.
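The carbonic acid chemistry described above is the standard seawater carbonate system, which can be summarized as a chain of equilibria (a textbook summary, not drawn from the article itself):

```latex
\mathrm{CO_2 + H_2O \;\rightleftharpoons\; H_2CO_3}
\qquad
\mathrm{H_2CO_3 \;\rightleftharpoons\; H^+ + HCO_3^-}
\qquad
\mathrm{HCO_3^- \;\rightleftharpoons\; H^+ + CO_3^{2-}}
```

The released hydrogen ions lower the ocean’s pH, and the shifted equilibria leave less carbonate (CO₃²⁻) available to organisms that build calcium carbonate (CaCO₃) shells, which is why calcifiers are hit first.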

Oceans on Acid

Scientists have been studying the acidification of oceans for years.  In 2007, Scott Doney, a senior scientist with the Marine Chemistry & Geochemistry Department of the Woods Hole Oceanographic Institution testified before a U.S. Senate Subcommittee about the effects of climate change on the ocean.  Dr. Doney mentioned then that ocean life was facing an almost unprecedented environmental challenge.  He stated in his testimony:

[M]arine life has survived large climate and acidification variations in the past, but the projected rates of climate change and ocean acidification over the next century are much faster than experienced by the planet in the past.

The ocean has always absorbed CO2 from the air. An article published in Nature in August 2012 explains that about half of all human-generated CO2 is absorbed by the ocean. This chemical reaction is naturally occurring, but at the rate carbon is being absorbed, the concentration of carbonic acid is rising. This has devastating consequences, especially for shelled creatures. Animals such as clams, crabs, and corals need calcium carbonate to build their shells. Carbonic acid dissolves calcium carbonate, which means these animals are unable to maintain their calcium casings. Additional studies, such as one published in Biogeosciences in 2010, suggest that carbon sequestration by oceans and land is decreasing, which has the potential to further worsen the impact of carbon emissions.

Ecosystem Damage

Without calcium carbonate, shellfish and coral cannot survive.  The ocean’s ecosystem depends on these organisms because they provide a large source of food at the bottom of the food chain.  They need to be plentiful in order to feed higher predators, who in turn are the food for even larger predators.  Without a sufficient supply of feeder organisms at the bottom of the food chain, other animals go without food and their numbers diminish.  This collapses the marine ecosystem due to lack of sufficient food for marine animals to survive.

Recent information about the increasingly acidic ocean suggests that unless anthropogenic warming is mitigated, corals and other calcifying organisms will simply dissolve into the ocean.  Acidification is also exacerbated by warming ocean temperatures.  Coral bleaching has already been observed in several places around the world where warming waters have made habitats unsuitable for coral polyps and for the symbionts that give coral reefs their brilliant colors.  Coral polyps are tiny creatures that tend to live in colonies and secrete the calcium carbonate that eventually becomes a coral skeleton.  Over time, these organisms build massive colonies that become reefs.  Symbiotic algae live inside coral polyps and supply them with oxygen and essential nutrients in return for carbon dioxide and other nutrients secreted by the coral.  Bleaching occurs when waters become too warm for the polyps and symbiotic algae to survive; they die off, and only the calcium coral skeleton remains.

Impact for Economy and Sustainability

Coral reefs are often called “rainforests of the sea” because of their biodiversity and beauty.  Together with other calcifying organisms, these creatures support an estimated 25 percent of all marine species known to science.  They provide a large portion of the base of the ocean’s food web and are necessary for the survival of marine ecosystems.

Preserving coral reefs and other calcifying sea creatures has an ecological importance that touches the everyday lives of most humans.  Commercial fishing contributes to the world economy, supplies the manufacturing industry, and feeds millions of people.  Feeder organisms not only provide stability for species like whales, dolphins, and sharks; they also provide the food commercial fish need to maintain sustainable populations.

Not all fish caught are used as food for humans.  Chemicals derived from commercially caught fish are literally everywhere.  Omega-3 fish supplements are popular.  Everyday products such as fertilizers, gelatins, cosmetic ingredients, vitamins and even pigments can all be made from byproducts of commercially caught fish.

Ocean acidification will destroy marine ecosystems if it continues unabated.  Maintaining sustainable fishing industries will become impossible if the carbon dioxide absorbed by the world’s oceans is not drastically reduced.

Photo Credit: Acidification and Climate Change/shutterstock

Read More

Energy Efficiency Firms Lead Worldwide Clean Technology List

Peter Miller, Senior Scientist, San Francisco

The new Global Cleantech 100 list of private companies breaking barriers with their clean technology ideas and executions across the worldwide economy finds that energy efficiency companies not only continue to dominate the list but are growing in influence.

For the past five years, the Cleantech Group has published an inventory of 100 companies it believes are most likely to have a large commercial impact in the next 5 to 10 years. The recently released 2013 version indicates the world economy is waking up to energy efficiency’s vast business potential, and its reach is spreading.

Twenty-seven of the Global Cleantech 100 are energy-efficiency companies, which is seven more than last year. California, which leads the nation in overall clean energy development, is home to eight of the 13 U.S. energy efficiency firms named.

In all, the 2013 Global Cleantech list represents 15 different sectors, including solar, smart grid, transportation, biofuels and biochemical, and even agriculture and waste management. Compiled by 90 industry experts from more than 5,000 nominations, the list includes 56 U.S. companies while the rest come from 18 other nations.

Energy efficiency firms dominate

Although the San Francisco-based Cleantech Group has been issuing its list since 2009, the authors note that “energy efficiency’s dominant presence on the 2013 Global Cleantech 100 [marks] a 42 percent increase in representation since the list’s inception.”

The authors say the growing importance of energy efficiency comes out of “investors’ distinct preferences today for business models that most closely resemble those of traditional capital – lighter and faster-to-market tech and software startups – proven money winners of the past.”

Energy efficiency and the economy

It’s heartening to see more proof that energy efficiency – doing more with less energy – is an important economic driver as well as a way to save money and avoid pollution. This echoes the findings of NRDC’s new 1st Annual Energy and Environment Report, America’s (Amazingly) Good Energy News.

The new Cleantech report also underscores the expansion of energy efficiency as a job creator. The clean economy, according to a 2011 Brookings Institution report, employs more people than the fossil fuel industry. Earlier this year, Environmental Entrepreneurs (E2), a business group affiliated with NRDC that tracks new energy efficiency and other clean energy job announcements, launched a Clean Energy Works for US!  website that showcases the industry’s broad growth.

And, as Matt Lawson of Graybar Electric explained during an interview at the recent Chicago Energy Efficiency Expo, the efficiency business is growing rapidly, creating jobs of all kinds from installers to complex systems engineers.

EE companies are diverse

The Global Cleantech 100 list also shows diversity within the efficiency industry is expanding. Energy efficiency covers not only high-performing light bulbs but home energy management, waste heat recovery, and green information technology. The top subcategory in the Cleantech 100 list was heating and cooling, which the report says has big appeal “given that the maintenance of indoor climate conditions accounts for 75 percent of the building sector’s energy demand.”

The top-rated North American company was Nest, of Palo Alto, Calif., for showing “the value of design to industrial products and fostering consumer interest in a dull product like the thermostat.”  

Meanwhile, Tendril, a home-energy management company in Boulder, Colo., made its fifth straight appearance, making it one of only four companies (and the only efficiency firm) to appear on every Cleantech Global 100 list.

The other U.S. energy efficiency firms:

  • Alphabet Energy of Hayward, Calif., which manufactures a solid-state semiconductor that turns heat into electricity, “like solar panels that use heat – instead of light.” A car’s exhaust, for example, could be used to create electrical power and improve fuel efficiency.
  • Digital Lumens of Boston networks light-emitting diodes and software for commercial and industrial venues. The “Intelligent Lighting System” aims to generate the same amount of light for 10 percent of the energy cost by pinpointing how much light to use and where to put it.
  • Enlighted in Sunnyvale, Calif., develops sensors that can be attached to individual lights to control energy use throughout a building. The system can also be adapted to control heating and cooling.
  • Gridium of Menlo Park, Calif., offers software to measure a building’s energy use to cut costs.
  • Nexant of San Francisco counsels utility companies on how to promote energy efficiency.
  • Next Step Living of Boston works directly with consumers to make their homes energy efficient.
  • Opower of Arlington, Va., develops software-as-a-service for utility customer engagement and billing analytics.
  • OSIsoft, San Leandro, Calif., writes software that manages manufacturing processes.
  • Phononic Devices, Raleigh, N.C., has developed advanced thermoelectric devices that efficiently manage and monetize heat.
  • Project Frog of San Francisco builds energy-efficient manufactured housing, commercial buildings, and schools.
  • Transphorm, Goleta, Calif., says it has an efficiency breakthrough with a material known as gallium nitride, which can better process power and energy during the conversion of electricity from one form to another. 

Programmable thermostat installation photo by the U.S. Department of Energy, under public domain.

Read More

The Radical Right Wing Is Becoming An Unlikely Advocate For Solar Power

CREDIT: ANDY KROPA/INVISION/AP, FILE

Published on October 25th, 2013 | by Guest Contributor

Originally published on ClimateProgress
by Ryan Koronowski

Readers of Glenn Beck’s email list received a sponsored message from a source that might surprise some: a solar generator vendor. Solutions From Science sells heirloom seeds, emergency food, and solar products designed to “make people more self-reliant,” according to the Clinton Herald.

Bill Heid, the company’s president, wrote in the email that solar generators are great in emergencies, “run quietly, emit no dangerous fumes, and produce free electricity from the sun.” The email throws in a special deal for Beck’s readers after it makes the case for preparing to live off the grid:

And whether it’s hurricanes, ice storms, brownouts, or blackouts… with a Solar Generator, you won’t have to worry about painful power outages ever again. As I’m writing this, there are power outages in the news from wild fires, wind, flooding, heavy snow, copper thieves, cars hitting power poles and even an explosion on one college campus. Many experts are even saying the whole grid is going down.

A solar-powered generator that lets consumers get clean power during emergencies like wildfires, high winds, flooding, brownouts, and blackouts (among other, more apocalyptic scenarios), being sold under Glenn Beck’s name.

The name of the company and the products it advertises could lead people to believe it was just in the business of taking what top climate scientists say, going back to nature, getting off the grid, becoming energy-independent, and freeing its customers from reliance on carbon-based fuels and large agricultural corporations.

Until you get to the About Us page:

Over the last century, America has consciously turned away from its Christian heritage, and the effects have been devastating. The problems our nation currently faces are serious, not just for us, but for our children and grandchildren. Our liberties and freedoms are being threatened (and even slowly taken away) and there is a nation-felt concern for the direction our country is headed.

The road ahead is long and hard, but not impossible. We believe that the only way back for America is a return to the Biblical principles that brought us true freedom in the first place – freedoms that our Founding Fathers understood were ones given by a Creator, not a king or state. As we seek to restore America, we must remember that ‘unless the Lord builds the house, those who build it labor in vain.’ (Psalm 127).

Now the Beck endorsement makes a bit more sense.

Back when Glenn Beck was on Fox News, he said “there aren’t enough knives” for climate scientists to kill themselves with in response to the 2007 IPCC report. He brought a member of the Exxon-funded evangelical “Cornwall Alliance” onto his show to talk about how climate change is a “false religion.”

In 2009, he talked (video) on his show about “that green stuff” and how “I haven’t bought it for a long time.” He then breathed a dramatic sigh of relief over the failure of a ballot measure in Los Angeles that would have brought 400 megawatts to the city.

And yet earlier this year, Beck tweeted out a photo of his ranch being “almost 100% powered by ‘green energy.’”

Dig at Al Gore aside, he decided to invest his money in renewable energy. This isn’t to say he held back from mocking it: when he and his family went to his ranch to be “off the grid” during the summer of 2012, his solar power left his ice cream soggy, forcing him to run generators.

But the right-wing embrace of solar energy – however awkward – is taking place all over the country, with the Atlanta Tea Party pushing its utility to allow access to solar energy and Barry Goldwater Jr. advocating for solar in Arizona. What remains unclear is how a sincere embrace of renewable energy could affect the partisan and ideological schism over beliefs about climate change.

More of the Beck email below:

[Image: screenshot of the sponsored Beck email]






Read More

Catabolic Ephemeralization? Carson versus Greer

Last week, Kevin Carson, a political historian and theorist of the Mutualist tradition, took issue with the concept of catabolic collapse, a term coined a few years ago by the author John Michael Greer. Greer responded; the exchange that followed provided an illuminating look at two views of the future that actually share many qualities but which differ in important respects.

Greer is one of a handful of prescient observers (along with James Howard Kunstler, Dmitry Orlov and Richard Heinberg among others) who has taken a stab at trying to predict what the world might look like as the interconnected crises of resource depletion, climate change and economic collapse unfold in the coming decades.

While he pursues his own unique line of thinking, Greer’s work shares a few key convictions with those other authors, the most important being that no combination of alternative energy, conservation, or other technology can keep our globalized system running as it has. And along with those other writers, Greer believes that it’s not just car culture and the Interstate highway system that are doomed; the Internet itself is unlikely to survive for many more years.

Decaying infrastructure. By swanksalot via flickr.

Unlike some post-peak writers, however, Greer doesn’t believe that we are facing what some have called a secular apocalypse in which industrial civilization imminently and rapidly implodes over a course of months. Instead, he argues that we are likely to experience what he calls “catabolic collapse” in which industrial civilization reverses course, shedding layers of complexity, infrastructure and technological achievement in a series of painful downward steps, happening over time. Catabolic collapse begins at the point at which the available energy and other resources of a complex society are not enough to maintain its energy- and capital-intensive infrastructure.

According to Carson, the problem with the theory of catabolic collapse is that it ignores what he calls “one of the most central distinguishing characteristics of our technology: ephemerality.” The classic example from Buckminster Fuller, he writes, is the replacing of “a transoceanic cable system embodying God only knows how many thousand tons of metal with a few dozen communications satellites weighing a few tons each.”

“It’s quite true that the mass-production industrial civilization that peaked in the 20th century is falling into ruin, failing to invest in upkeep at sustainable levels, and generally eating its seed corn – just as happened with Rome. The difference is, the Interstate Highway System, the civil aviation infrastructure, and the old electrical grid aren’t something to mourn. They’re something that would decay anyway, because they’re increasingly irrelevant to the kinds of production technology and economic organization the emerging successor society will be based on.”

Thanks to technological advancement in recent years, Carson argues, distributed infrastructure – including distributed renewable energy and distributed manufacturing enabled by peer-to-peer open source design – is making that same collapsing infrastructure obsolete.

3D printer. By Opensourceway via flickr.

“Metaphorically speaking, we live in the early days of an emerging economy in which peasant villages – with a Star Trek molecular replicator in each cottage – live in the shadows of the decaying aqueducts.”

Having followed Carson’s work for a few years, I think I understand what he is saying; unfortunately his choice of metaphor here seems to have caused quite a bit of misunderstanding among Greer and his followers, who are so put off by the idea of Star Trek (“touchstone of the absurd,” according to Greer) that they fail to notice that the replicator “technofantasy” Carson mentions is in fact a metaphor.

More to the point: Greer takes issue with the idea that the ephemeral technologies Carson mentions are really less resource intensive, arguing that we only think they are because of mistaken accounting. Satellites are not possible without a space program, and space programs require so much infrastructure that it’s ludicrous to suggest that they require fewer resources than transoceanic cables. As for the Internet, “Descend from the airy realms of cyber-abstractions into the grubby underworld of hardware, and it’s an archipelago of huge server farms, each of which uses as much electricity as a small city …”

So which is it? Are we headed for a future in which short-wave radio returns and a rebuilt postal service takes over from failing server farms, as Greer would have it? Or will we be able to “leapfrog” away from our old imploding infrastructure toward a world of distributed, highly efficient, peer-to-peer manufacturing facilitated by open source design?

Mobile phone charging station in Uganda. Source: AdamCohn via flickr.

It is at this point when I feel it’s time to step back and ask: what do we really know, and what can we observe?

Some examples:

  • In spite of Greer’s claim that the infrastructure behind satellite communications is larger than that needed to lay transoceanic cable, we simply don’t know whether this is true. We do know that countries such as North Korea and India, whose energy consumption is orders of magnitude smaller than that of the United States, have managed to launch satellites.
  • Even so, it’s important to note that satellites are not necessary for cell phone communications. The vast majority of cell calls are not routed through satellites, but through local cell phone towers. As Hobert Pruitt puts it, “cellular phones are basically fancy radios that use cellular towers.”
  • Cell phone penetration in Africa is expected to exceed 80 percent in the coming year – 10 times the number of landline users. In Somalia, a country with an ongoing civil war and no government, there are six cell phone companies and 16.3 percent penetration, which suggests that cell phone access could be quite resilient and even grow in very dire situations. Mobile-money services in Somalia actually substitute for banking, which is non-existent.
  • Globally, more people have cell phones than have access to grid electricity and safe drinking water. Internet penetration globally is at 34 percent. If Greer is right that modern telecommunications is full of hidden embodied energy and capital costs, how is this possible?
  • The idea that the Internet is a huge energy hog is a myth. Claims that it is can be traced almost entirely to reports written by Mark Mills for the coal industry, presumably to promote the idea that without coal everyone would have to give up Facebook.
  • Even if current Internet infrastructure is vulnerable, there are alternatives. In Athens and around the world, for example, growing numbers of people have been creating parallel internets out of a “mesh” of rooftop wifi antennas. The fact that people are setting up systems like these amidst a collapsing economy is a hint at the direction things might go, at least in the short term.

Of course none of this obviates the need for things like food security, water and basic sanitation. But these are issues that are probably better addressed with existing site-specific permaculture design approaches and open source appropriate technology.

Greer is a big advocate of distributed renewable energy, mostly using a time-tested, small scale off-grid approach as opposed to the net metering/plug-in path that most people pursue.

There is a third option however. The rapidly falling cost of solar power, combined with the microgrid revolution and improving storage technology makes community-scaled, shared renewable electricity even more viable than in the past. Stitching these microgrids into the broader grid greatly increases the resilience and potential efficiency of both.

Finally, there is the potential of distributed manufacturing from open source design, which Carson has written about in great detail. The only thing I would emphasize is that Carson’s view (as I understand it) is not that distributed manufacturing allows continued consumption at our current level. Rather, as centralized production models collapse and overproduction ends, the need for a “push” economy fed by incessant advertising and consumerist addiction will fall away as well.

Open source, modular tractor designed by Open Source Ecology. Source: EchelonForce via Flickr.

Putting these and other elements together – high-tech distributed communications, distributed energy and manufacturing, local sustainable food systems, appropriate technology, and tactical urbanism, among others – sets the stage for a future that looks quite a bit different than the present one. One might describe it as a kind of postmodern pastiche that looks neither like the antiquated futurisms we once imagined nor like an idyllic return to preindustrial peasant society.

It’s a future that by current middle-class measures might look impoverished, but by other metrics is healthier, more resilient, more nourishing and more abundant in the ways that really matter.

Read More

Rising Energy Costs Lead to Recession; Eventually Collapse

How does the world reach limits? This is a question that few dare to examine. My analysis suggests that these limits will come in a very different way than most have expected – through financial stress that ultimately relates to rising unit energy costs, plus the need to use increasing amounts of energy for additional purposes:

  • To extract oil and other minerals from locations where extraction is very difficult, such as in shale formations, or very deep under the sea;
  • To mitigate water shortages and pollution issues, using processes such as desalination and long distance transport of food; and
  • To attempt to reduce future fossil fuel use, by building devices such as solar panels and electric cars that increase fossil fuel energy use now in the hope of reducing energy use later.

We have long known that the world is likely to eventually reach limits. In 1972, the book The Limits to Growth by Donella Meadows and others modeled the likely impact of growing population, limited resources, and rising pollution in a finite world. They considered a number of scenarios under a range of different assumptions. These models strongly suggested the world economy would begin to hit limits in the first half of the 21st century and would eventually collapse.
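The flavor of this kind of stock-and-flow modeling can be sketched in a few lines of code. The toy model below is my own illustration, far simpler than the actual World3 model and using arbitrary assumed parameters; it shows only the characteristic overshoot shape, where output grows while the resource is cheap and then peaks and declines as depletion drives costs up:

```python
# Toy overshoot model in the spirit of Limits to Growth (not World3 itself).
# Output grows while the resource is cheap to extract; as the stock is
# drawn down, extraction costs climb and growth turns negative.
resource = 1000.0  # finite resource stock (arbitrary units)
output = 1.0       # economic output per year (arbitrary units)

history = []
for year in range(200):
    depletion = 1.0 - resource / 1000.0      # fraction of the stock used up
    growth = 0.05 * (1.0 - 2.0 * depletion)  # turns negative past 50% depletion
    output *= 1.0 + growth
    resource = max(resource - output, 0.0)
    history.append(output)

peak_year = history.index(max(history))
print(f"Output peaks in year {peak_year}, then declines")
```

The exact numbers are meaningless; the point is the shape: growth, a peak roughly when half the resource is gone, then decline, which is the qualitative behavior the 1972 scenarios produced.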

The indications of the 1972 analysis were considered nonsense by most. Clearly, the world would work its way around limits of the type suggested. The world would find additional resources in short supply. It would become more efficient at using resources and would tackle the problem of rising pollution. The free market would handle any problems that might arise.

The Limits to Growth analysis modeled the world economy in terms of flows; it did not try to model the financial system. In recent years, I have been looking at the situation and have concluded that as we hit limits in a finite world, the financial system is the most vulnerable part of the system because it ties everything else together. Debt in particular is vulnerable, because the time-shifting aspect of debt “works” much better in a rapidly growing economy than in an economy that is barely growing or shrinking.

The problem that now looks like it has the potential to push the world into financial collapse is something no one would have thought of – high oil prices that take a slice out of the economy, with nothing to show in return. Consumers find that their salaries do not rise as oil prices rise. They find that they need to cut back on discretionary spending if they are to have adequate funds to pay for necessities produced using oil. Food is one such necessity; oil is used to run farm equipment, make herbicides and pesticides, and transport finished food products. The result of a cutback in discretionary spending is recession or near-recession, and less job availability. Governments find themselves in financial distress as they try to mitigate the recession-like impacts without adequate tax revenue.

One of our big problems now is a lack of cheap substitutes for oil. Highly touted renewable energy sources such as wind and solar PV are not cheap. They also do not substitute directly for oil, and they increase near-term fossil fuel consumption. Ethanol can act as an “oil extender,” but it is not cheap. Battery powered cars are also not cheap.

The issue of rising oil prices is really a two-sided issue. The least expensive sources of oil tend to be extracted first. Thus, the cost of producing oil tends to rise over time. As a result, oil producers tend to require ever-rising oil prices to cover their costs. It is the interaction of these two forces that leads to the likelihood of financial collapse in the near term:

  1. The need for ever-rising oil prices on the part of oil producers.
  2. The adverse impact of high energy prices on consumers.

If a cheap substitute for oil had already come along in adequate quantity, there would be no problem. The issue is that no suitable substitute has been found, and financial problems are here already. In fact, collapse may very well come from oil prices not rising high enough to satisfy the needs of those extracting the oil, because of worldwide recession.
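The two-sided squeeze can be caricatured numerically. In the sketch below (my own illustration with assumed numbers, not the author’s model), the price producers require rises with depletion while the price consumers can afford grows only with slow wage growth; trouble begins when the floor crosses the ceiling:

```python
# Hypothetical illustration of the two-sided oil price squeeze.
# All numbers are assumptions made for the sake of the sketch.
required = 40.0     # price per barrel producers need to cover costs
affordable = 100.0  # price per barrel consumers can absorb

year = 0
while required < affordable:
    required *= 1.07    # extraction costs rise ~7%/yr as cheap fields deplete
    affordable *= 1.01  # affordability grows only with ~1%/yr wage growth
    year += 1

# With these assumptions, the squeeze closes in 16 years.
print(f"Required price overtakes affordable price after {year} years")
```

The collapse scenario described above is simply the moment the two curves meet: producers cannot get the price they need without pushing consumers into recession.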

The Role of Inexpensive Energy

A fact that few stop to realize is that energy of the right type is absolutely essential for making goods and services of all kinds.  Even if the service is simply typing numbers into a computer, we need energy of precisely the right kind for several different purposes:

  1. To make the computer and transport it to the current location.
  2. To build the building where the worker works.
  3. To light the building where the worker works.
  4. To heat or cool the building where the worker works.
  5. To transport the worker to the location where he works.
  6. To produce the foods that the worker eats.
  7. To produce the clothing that the worker wears.

Furthermore, the energy used needs to be inexpensive, for many reasons: so that the worker’s salary goes farther; so that the goods or services created are competitive in a world market; and so that governments can gain adequate tax revenue from taxing energy products. We don’t think of fossil fuel energy products as a significant source of tax revenue, but they very often are, especially for exporters (Rodgers map of oil “government take” percentages).

Some of the energy listed above is paid for by the employer; some is paid for by the employee. This difference is irrelevant, since all are equally essential. Some energy is omitted from the above list, but is still very important. Energy to build roads, electric transmission lines, schools, and health care centers is essential if the current system is to be maintained. If energy prices rise, taxes and fees to pay for basic services such as these will likely need to rise.

How “Growth” Began

For most primates, such as chimpanzees and gorillas, population fluctuates up and down within a range, and total population isn’t very high. If human population followed that of other large primates, there wouldn’t be more than a few million humans worldwide, and they would likely live in one geographical area.

How did humans break out of this mold? In my view, a likely way humans improved their dominance over other animals and plants was through the controlled use of fire, a skill they learned over one million years ago (Luke 2012). Controlled use of fire served many purposes, including cooking food, providing heat in cool weather, and scaring away wild animals.

The earliest use of fire was in some sense very inexpensive. Dry sticks and leaves were close at hand, and with the right technique – such as twirling one stick against another – and the right kind of wood, a fire could be made in less than a minute (Hough 1890). Once humans had discovered how to make fire, they could use it to leverage their meager muscular strength.

The benefits of the controlled use of fire are perhaps not as obvious to us as they would have been to the early users. When it became possible to cook food, a much wider variety of potential foodstuffs could be eaten. The nutrition from food was also better. There is even some evidence that cooking food allowed the human body to evolve in the direction of smaller chewing and digestive apparatus and a bigger brain (Wrangham 2009). A bigger brain would allow humans to outsmart their prey. (Dilworth 2010)

Cooking food allowed humans to spend much less time chewing than previously – only one-tenth as much time, according to one study (4.7% of daily activity vs. 48%) (Organ et al. 2011). The reduction in chewing time left more time for other activities, such as making tools and clothing.

Humans gradually increased their control over many additional energy sources. Training dogs to help in hunting came very early. Humans learned to make sailboats using wind energy. They learned to domesticate plants and animals, so that they could provide more food energy in the location where it was needed. Domesticated animals could also be used to pull loads.

Humans learned to use wind mills and water mills made from wood, and eventually learned to use coal, petroleum (also called oil), natural gas, and uranium. The availability of fossil fuels vastly increased our ability to make substances that require heating, including metals, glass, and concrete. Prior to this time, wood had been used as an energy source, leading to widespread deforestation.

With the availability of metals, glass, and concrete in quantity, it became possible to develop modern hydroelectric power plants and transmission lines to transmit this electricity. It also became possible to build railroads, steam-powered ships, better plows, and many other useful devices.

Population rose dramatically after fossil fuels were added, enabling better food production and transportation. This started about 1800.

Figure 1. World population based on data from “Atlas of World History,” McEvedy and Jones, Penguin Reference Books, 1978, and UN Population Estimates.

All of these activities led to a very long history of what we today might call economic growth. Prior to the availability of fossil fuels, the majority of this growth was in population rather than in living standards. (The population was still very low compared to today.) In later years, increased energy use was still associated with increased population, but it was also associated with an increase in creature comforts – bigger homes, better transportation, heating and cooling of homes, and greater availability of services like education, medicine, and financial services.

How Cheap Energy and Technology Combine to Lead to Economic Growth

Without external energy, all we have is the energy from our own bodies. We can perhaps leverage this energy a bit by picking up a stick and using it to hit something, or by picking up a rock and throwing it. In total, this leveraging of our own energy doesn’t get us very far – many animals do the same thing. Such tools provide some leverage, but they are not quite enough.

The next step up in leverage comes if we can find some sort of external energy to supplement our own when making goods and services. One example might be heat from a fire built with sticks, used for baking bread; another might be energy from an animal pulling a cart. This additional energy can’t take too much of (1) our human energy, (2) resources from the ground, or (3) financial capital, or we will have little left to invest in what we really want: technology that gives us the many goods we use, and services such as education, health care, and recreation.

The use of inexpensive energy led to a positive feedback loop: the value of the goods and services produced was sufficient to yield a profit when all costs were considered, thanks to the low cost of the energy used. This profit allowed additional investment, and contributed to further energy development and further growth. It also often led to rising salaries. The combination of additional cheap energy and greater technology produced the impression that humans were becoming more “productive.”

For a very long time, we were able to ramp up the amount of energy we used, worldwide. There were many civilizations that collapsed along the way, but in total, for all civilizations in the world combined, energy consumption, population, and goods and services produced tended to rise over time.

In the 1970s, we had our first experience with oil limits. US oil production started dropping in 1971. The drop in production set us up as easy prey for an oil embargo in 1973-1974, and oil prices spiked. We got around this problem, and further high-price problems in the late 1970s, by:

  1. Starting work on new inexpensive oil production in the North Sea, Alaska, and Mexico.
  2. Adopting more fuel-efficient cars, already available in Japan.
  3. Switching from oil to nuclear or coal for electricity production.
  4. Cutting back on oil intensive activities, such as building new roads and doing heavy manufacturing in the United States.

The economy eventually more or less recovered, but men’s wages stagnated, and women found a need to join the workforce to maintain their families’ standards of living. Oil prices dropped back, though not quite as far as their prior level. The lack of energy-intensive industries (powered by cheap oil) likely contributed to the stagnation of men’s wages.

Recently, since about 2004, we have again been encountering high oil prices. Unfortunately, the easy options to fix them are mostly gone. We have run out of cheap energy options: tight oil from shale formations isn’t cheap. Wages are stagnating again, even worse than before. The positive feedback loop based on low energy prices isn’t working nearly as well, and economic growth rates are falling.

The technical name for the problem we are running into with oil is diminishing marginal returns. This describes a situation where more and more inputs are used in extraction, but the additional inputs add very little more of the desired output, which is oil. Oil companies find that an investment of a given amount, say $1,000, yields a much smaller amount of oil than it did in the past, often less than a fourth as much. There are often more up-front expenses in drilling the wells, and less certainty about how long oil can be extracted from a new well.
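The shape of diminishing marginal returns can be sketched numerically. This is a minimal illustration with made-up numbers, chosen only to mirror the claim above that a fixed investment now yields less than a fourth of the oil it once did:

```python
# Hypothetical illustration of diminishing marginal returns in oil extraction.
# The yield and decline numbers are invented; only the shape of the curve matters.

tranche_size = 1_000    # dollars per investment tranche
first_yield = 40.0      # barrels from the first $1,000 (hypothetical)
decline_factor = 0.7    # each tranche yields 70% of the previous one (hypothetical)

marginal_yield = first_yield
total_barrels = 0.0
for tranche in range(1, 6):
    total_barrels += marginal_yield
    print(f"${tranche * tranche_size:>6,}: +{marginal_yield:5.1f} barrels "
          f"(cumulative {total_barrels:6.1f})")
    marginal_yield *= decline_factor
```

With these numbers, the fifth $1,000 tranche adds about 9.6 barrels versus 40 from the first, i.e. the marginal yield has already fallen below a fourth of the original.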

Oil that requires high up-front investment needs a high price to justify its extraction. When consumers pay the high oil price, the amount they have left for discretionary goods drops. The feedback loop starts working in the wrong direction: toward more layoffs, and lower wages for those still working. Companies, including oil companies, have a harder time making a profit. They find outsourcing labor to lower-cost parts of the world more attractive.

Can this Growth Continue Indefinitely?

Even apart from the oil price problem, there are other reasons to think that growth cannot continue indefinitely in a finite world. For one thing, we are already running short of fresh water in many parts of the world, including China, India, and the Middle East. In addition, if population continues to rise, we will need a way to feed all of these people: either more arable land, or a way of getting more food per acre.

Pollution is another issue. One type is acidification of oceans; another leads to dead zones in oceans. Mercury pollution is a widespread problem. Fresh water that is available is often very polluted. Excess carbon dioxide in the atmosphere leads to concerns about climate change.

There is also the issue of humans crowding out other species. In the past, there have been five widespread die-offs of species, called “Mass Extinctions.” Humans now seem to be causing a Sixth Mass Extinction, which paleontologist Niles Eldredge describes as follows:

  • Phase One began when the first humans began to disperse to different parts of the world, about 100,000 years ago. [We were still hunter-gatherers at that point, but we killed off large species for food as we went.]
  • Phase Two began about 10,000 years ago, when humans turned to agriculture.

According to Eldredge, once we turned to agriculture, we stopped living within local ecosystems. We converted land to produce only one or two crops, and classified all unwanted species as “weeds.” Now, with fossil fuels, we are taking our attack on other species to a new, higher level: greater clearing of land for agriculture, overfishing, and heavier use of forests by humans (Eldredge 2005).

In many ways, the pattern of human population growth and resource use is like a cancer. Growth has to stop for one reason or another: smothering of other species, depletion of resources, or pollution.

Many Competing Wrong Diagnoses of our Current Problem

The problem we are running into now is not an easy one to figure out because the problem crosses many disciplines. Is it a financial problem? Or a climate change problem? Or an oil depletion problem? It is hard to find individuals with knowledge across a range of fields.

There is also a strong bias against really understanding the problem, if the answer appears to be in the “very bad to truly awful” range. Politicians want a problem that is easily solvable. So do sustainability folks, and peak oil folks, and people writing academic papers. Those selling newspapers want answers that will please their advertisers. Academic book publishers want books that won’t scare potential buyers.

Another issue is that nature works on a flow basis. All we have in a given year is the resources we pull out in that year. If we use more resources for one thing (extracting oil, say, or making solar panels), less is left for other purposes. Consumers likewise work mostly from the income in their current paychecks. Even if we come up with what look like wonderful solutions, in terms of an investment now for a payback later, nature and consumers aren’t very cooperative in producing them. Consumers need ever more debt to make the solutions sort of work. If one necessary resource (cheap oil) is in short supply, nature dictates that other resource uses shrink to stay within the available balance. So there is more pressure toward collapse.

Virtually no one understands our complex problem. As a result, we end up with all kinds of stories about how we can fix our problem, none of which make sense:

“Humans don’t need fossil fuels; we can just walk away.” But how do we feed 7 billion people? How long would our forests last before they were used for fuel?

“More wind and solar PV.” But these are built using fossil fuels now, and they don’t fix oil prices.

“Climate change is our only problem.” But climate change needs to be considered in conjunction with other limits, many of which are hitting very soon. Maybe there is good news about climate, but it will likely be more than offset by bad news from limits not considered in the model.


Burning the Carbon Sink

[Figure: Bureau of Meteorology heat map of Australian temperatures]

My home is burning at the moment. Not the bricks and mortar I live in here in the UK, but the place I call home. The coastal region of New South Wales. In fact there is a 750 ha fire about 20km from my parents’ house as I write this.

Bush fires are part of life in Australia, and they always have been.  But the fires at the moment are very early in the season, so people back home are asking if the fires are related to climate change.

Linking a single fire to climate change isn’t very sensible, particularly while people’s homes are still burning. But the long-term link between climate change and the risk of heat-related extreme weather events, like fires and heatwaves, is gradually being understood. It is a link that is better evidenced than for hurricanes and floods, where robust trends are virtually non-existent.

According to the CSIRO, “Australian annual average daily mean temperatures have increased by 0.9°C since 1910.” In January it was so hot that the Australian Bureau of Meteorology added new colors to the legend of its heat map (above), with purple indicating temperatures above 50°C (122°F). As summers become hotter and longer, we face increased risks of heat-related dangers, one of which is fire.

The current fires, and the bit of climate debate they have provoked, have gotten pretty ugly back home. But the vivid images remind me of something I never read about in the papers: the vulnerability of the carbon cycle to fire, and other feedbacks, in the coming century.

The importance of carbon sinks

In my recent climate science for beginners post I talked about why scientists are uncertain as to how much the planet will warm by the end of this century.  The two reasons that are often discussed are the path of future emissions and climate sensitivity.  But there is also a third important factor.  The future of carbon sinks.

If it wasn’t for the increased absorption of carbon dioxide by the ocean and land sinks since the industrial revolution, atmospheric concentrations of CO2 would already be above 500 ppm, and the world would be much warmer.

Of the total carbon dioxide emitted by human activity since 1750 about 44% remains in the atmosphere, 30% has been absorbed by the ocean and 26% by land sinks including trees, soils and fungi.
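As a back-of-the-envelope check, this split can be applied to a cumulative emissions total. The 44/30/26 shares are from the text above; the 2,000 Gt CO2 cumulative total since 1750 is an assumed round number used only for illustration:

```python
# Partitioning cumulative CO2 emissions among their destinations, using the
# sink fractions quoted in the text. The 2,000 Gt CO2 total is an assumption.

cumulative_gtco2 = 2_000  # assumed round number for cumulative emissions since 1750

fractions = {"atmosphere": 0.44, "ocean": 0.30, "land sinks": 0.26}
assert abs(sum(fractions.values()) - 1.0) < 1e-9  # the three shares cover everything

for destination, share in fractions.items():
    print(f"{destination:>10}: {share * cumulative_gtco2:6.0f} Gt CO2")
```

On that assumption, roughly 880 Gt CO2 would remain in the atmosphere, with 600 Gt taken up by the ocean and 520 Gt by land sinks.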

I’ve shown the annual sources of carbon emissions before, but where those emissions go in terms of sinks is hugely important too.

[Figure: Carbon Sinks]

As humans emit more and more carbon dioxide each year the atmospheric concentration of CO2 increases, while both land and ocean sinks also absorb more CO2.  While absorption from the ocean has grown steadily, there is large annual variability and uncertainty surrounding land sinks.

If the recent growth of land sinks is the result of CO2 fertilization and nitrogen deposition, they may continue to absorb more and more carbon. But if it is largely a response to past land-use change, in the form of regrowth and thickening, the sinks may weaken, exacerbating warming.

Such uncertainties seem common when looking at carbon sinks. Higher temperatures are generally associated with lower net absorption, while increased CO2 and rainfall are associated with more. Fires are an issue too.

During the 20th century, annual carbon emissions from fire increased by around 40%, driven largely by increased tropical forest fires (see Mouillot). Some fires, like the Amazonian fires of 1997/98 and the Black Dragon fires in China and Siberia in 1987, were big enough to have a discernible impact on global atmospheric concentrations.

The future of carbon sinks

If it wasn’t for the land and ocean sinks carbon dioxide concentrations would be growing at 4.8 ppm each year, rather than the recent average of 2.0 ppm. If you priced this mitigation service at $25/t CO2, it would be worth half a trillion dollars each year. But if you like the ocean, the price being paid in terms of acidification is virtually incalculable.
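The arithmetic behind the "half a trillion dollars" figure can be checked in a few lines. The 4.8 and 2.0 ppm values and the $25/t CO2 price come from the text; the roughly 7.81 Gt CO2 per ppm conversion (1 ppm ≈ 2.13 GtC, multiplied by 44/12) is a standard factor assumed here:

```python
# Valuing the CO2 absorbed by land and ocean sinks each year.
# The 4.8 ppm, 2.0 ppm, and $25/t figures are from the text;
# the Gt-per-ppm conversion factor is an assumed standard value.

ppm_without_sinks = 4.8   # annual growth rate with no sinks, from the text
ppm_observed = 2.0        # recent average annual growth rate, from the text
gtco2_per_ppm = 7.81      # assumed: 1 ppm ≈ 2.13 GtC × 44/12 ≈ 7.81 Gt CO2
price_per_tonne = 25      # $/t CO2, from the text

absorbed_gtco2 = (ppm_without_sinks - ppm_observed) * gtco2_per_ppm
value_usd = absorbed_gtco2 * 1e9 * price_per_tonne  # gigatonnes -> tonnes

print(f"Sinks absorb ~{absorbed_gtco2:.1f} Gt CO2 per year, "
      f"worth ~${value_usd / 1e12:.2f} trillion at $25/t")
```

That works out to roughly 22 Gt CO2 absorbed and about $0.55 trillion per year, consistent with the half-a-trillion figure above.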

Hopefully, land sinks will continue to absorb more carbon dioxide due to increased fertilization. But at the same time, rising temperatures put natural carbon sinks at risk. The threats range from the more immediate, like fire and the drying of peatlands, through to melting permafrost, methane hydrates, and declines in the ocean pump.

Vulnerable Carbon Pools

This image from the Global Carbon Project is a good visual summary of the many vulnerabilities carbon sinks face in the coming century.  You can see the northern and eastern parts of Australia subject to fire.

This map isn’t saying that these things are going to happen; there is a lot of uncertainty. It simply points out that as the world warms, the natural carbon sinks become more vulnerable to decline. For a proper summary, check out the Global Carbon Balance and its Vulnerabilities.

As the Eucalypts continue to burn in New South Wales I can’t help but wonder if at some point in the future the oceans and land are going to start throwing all that carbon dioxide back into the atmosphere.

Authored by:

Lindsay Wilson

Lindsay is the founder of Shrink That Footprint, a resource that helps people understand and reduce their carbon emissions.  He is also a member of the team at Maneas, a data driven corporate strategy group. With a background in economics he has previously worked as an analyst at Bloomberg New Energy Finance, and as a freelance consultant in energy strategy in the resources and government ...

