Thursday, December 13, 2012

Zombie LDK Stops Production, Fires Thousands

by Sneha Shah

LDK Solar (NYSE:LDK), which used to be the biggest solar wafer producer, has completely stopped production of polysilicon and sharply reduced shipments to preserve cash. The company is effectively bankrupt and surviving due to the largesse of state-owned Chinese banks, which have given $3 billion in loans to the company. LDK has almost no chance of paying down this monstrous debt given that it has been operating on negative gross margins for the last few quarters.

LDK's cost structure is way too high compared to its competitors
LDK has a much higher cost structure in most solar segments. Its wafer processing cost is 25c/watt compared to 12c/watt for Renesola (NYSE:SOL), and its polysilicon cost is around $30/kg compared to $23/kg for Renesola and $17/kg for GCL Poly (3800.HK). Given that even Renesola and GCL are making losses, the situation for LDK is dire. LDK has diversified into production of cells and modules as well as solar plant construction. However, these efforts, like the others, have also failed spectacularly, with its German acquisition also near bankruptcy.

LDK's Giant Polysilicon Plant Stops Production
LDK’s biggest failure has been in building up its polysilicon production. The plant took too much time and too much money to get built and never managed to reduce its costs to a competitive level. Despite building a plant with 15,000 tons of capacity, making it one of the top polysilicon players, the company has never managed to ramp production to make decent profits. Now its plant lies idle as polysilicon prices have crashed to $15/kg, almost half its cost of $30/kg. It remains to be seen whether this plant will ever restart. The only chance for it to do that is if the Chinese government imposes duties on imports of polysilicon.
LDK Management has no clue what to do
The management of LDK Solar has performed disastrously, right from building the polysilicon plant to diversification into thin film solar panels (Best Solar), acquiring solar system companies (Germany, USA), managing debt, etc. They kept on spending money and building capacity even as the whole house of cards collapsed around them. Even as recently as last year, they signed a deal to build a massive polysilicon plant in China’s Inner Mongolia province even though they could not run their poly plant profitably. The company had forecast more than $2.5 billion in sales at the beginning of the year and has now reduced that to less than $1 billion.

LDK has fired thousands of employees this year and will continue to do so
The company in its latest quarterly results reported a sharp decline in cash as it continued to burn through its reserves. LDK management fired 2,500 workers as its utilization fell sharply, and has fired almost 9,000 workers, or 40% of its workforce, this year. Given its uncompetitive cost structure, the company will continue to fire workers. LDK has almost 4 GW of wafer capacity and used less than 50% of that capacity this quarter.
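As a quick illustration of what those layoff figures imply, the numbers quoted above can be turned into a back-of-the-envelope headcount estimate (this is arithmetic on the article's figures only, not a reported number):

```python
# Back-of-the-envelope check: if ~9,000 layoffs are 40% of the
# workforce, what was the pre-layoff headcount? Figures are the
# article's; the calculation is only illustrative.
laid_off = 9_000
fraction_of_workforce = 0.40

implied_workforce = laid_off / fraction_of_workforce
print(f"Implied pre-layoff workforce: {implied_workforce:,.0f}")
```

That puts LDK's workforce at roughly 22,500 before the cuts began, consistent with the scale of the "thousands fired" in the headline.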

LDK is Bankrupt but the Chinese Government does not want to let go
LDK is bankrupt, and one can easily make that out going through its balance sheet. The company has almost $3.7 billion of plant assets which in reality are worth much less. If the company takes even a 10% writedown on its PP&E, it will have a negative net worth, given that it has only $50 million of equity listed on its balance sheet. LDK has more than $3 billion in loans, and it has been reported that a small bank, Shanghai Rural Commercial Bank, has already sued LDK to recover overdue loans worth 100 million yuan. LDK recently sold a 20% equity stake for a pittance to a state-owned vehicle, Heng Rui Xin Energy, and changed its management structure. The Xinyu government has also given it a grant. But given its massive problems, all loans and grants will only prolong the pain; LDK has no chance of coming out of this downturn unless a miracle happens.
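The writedown argument above can be sketched directly. The figures are the article's; the calculation is only an illustration of the claim, not LDK's actual accounting:

```python
# Sketch of the balance-sheet argument: even a modest PP&E writedown
# wipes out the listed equity. All figures taken from the article.
ppe = 3.7e9            # plant, property & equipment, in dollars
listed_equity = 50e6   # equity listed on the balance sheet, in dollars
writedown_rate = 0.10

writedown = writedown_rate * ppe
equity_after = listed_equity - writedown
print(f"Writedown: ${writedown/1e6:.0f}M, equity after: ${equity_after/1e6:.0f}M")
```

A 10% haircut on $3.7 billion of plant assets is $370 million, more than seven times the $50 million of listed equity, which is why even a small writedown leaves the company with a negative net worth.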


Sunday, December 9, 2012

US Energy Production Outpacing Consumption

[Editor's notes at the end.]
The US Energy Information Administration has just released its “Annual Energy Outlook 2013” report with projections for US energy markets through to 2040. The report shows that growth in the country’s energy production is outpacing the growth of consumption.
Specifically, the report found that renewable energy use is growing much faster than fossil fuel use.

“EIA’s updated Reference case shows how evolving consumer preferences, improved technology, and economic changes are pushing the nation toward more domestic energy production, greater vehicle efficiency, greater use of clean energy and reduced energy imports,” said EIA Administrator Adam Sieminski. “This combination has markedly reduced projected energy-related carbon dioxide emissions.”
Some key findings:

Crude oil production, particularly from tight oil plays, rises sharply over the next decade. The advent and continuing improvement of advanced crude oil production technologies continues to increase projected domestic supply. Domestic production of crude oil increases sharply in AEO2013, with annual growth averaging 234 thousand barrels per day (bpd) from 2011 through 2019, when production reaches 7.5 million bpd (Figure 1). The growth results largely from a significant increase in onshore crude oil production, particularly from shale and other tight formations. After about 2020, production begins declining gradually to 6.1 million bpd in 2040 as producers develop sweet spots first and then move to less productive or less profitable drilling areas.

Motor gasoline consumption is lower in AEO2013 relative to the level in AEO2012, reflecting the introduction of more stringent corporate average fuel economy (CAFE) standards; growth in diesel fuel consumption is moderated by increased use of natural gas in heavy-duty vehicles. AEO2013 incorporates the greenhouse gas (GHG) and CAFE standards for light-duty vehicles (LDVs) through the 2025 model year, which raise the new vehicle fuel economy requirement from 32.6 miles per gallon (mpg) in 2011 to 47.3 mpg in 2025. The increase in vehicle efficiency reduces gasoline use in the transportation sector by 0.5 million bpd in 2025 and by 1.0 million bpd in 2035 in AEO2013 compared to the AEO2012 Reference case (Figure 2). Furthermore, the improved economics of natural gas results in an increase in the use of liquefied natural gas (LNG) in heavy-duty vehicles that offsets a portion of diesel fuel consumption. The use of petroleum-based diesel fuel is also reduced by the increased use of diesel produced using gas-to-liquids (GTL) technology. Natural gas use in vehicles reaches 1.7 trillion cubic feet (including GTL) by 2040, displacing 0.7 million bpd of other motor fuels.
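The crude-production figures quoted above are internally consistent, which is easy to verify: this is just arithmetic on the EIA numbers as reported, not an independent projection.

```python
# Sanity check: average annual growth of 234,000 bpd from 2011 through
# 2019, ending at 7.5 million bpd, implies a 2011 starting level.
avg_growth = 0.234        # million bpd per year (EIA figure)
years = 2019 - 2011       # eight years of growth
production_2019 = 7.5     # million bpd in 2019 (EIA figure)

implied_2011 = production_2019 - avg_growth * years
print(f"Implied 2011 production: ~{implied_2011:.1f} million bpd")
```

The implied starting point of roughly 5.6 million bpd matches the scale of actual US crude output around 2011, so the growth-rate and endpoint figures hang together.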

The United States becomes a larger exporter of natural gas than projected in the AEO2012 Reference case. US natural gas production increases throughout the projection period (Figure 3), outpacing domestic consumption by 2020 and spurring net exports of natural gas. Higher volumes of shale gas production in AEO2013 are central to higher production volumes and an earlier transition to net exports than was projected in the AEO2012 Reference case. US exports of LNG from domestic sources rise to approximately 1.6 trillion cubic feet in 2027, double the 0.8 trillion cubic feet projected in AEO2012; the United States becomes a net exporter of LNG in 2016.

Renewable fuel use grows at a much faster rate than fossil fuel use. The share of electricity generation from renewables grows from 13 percent in 2011 to 16 percent in 2040. Electricity generation from solar and, to a lesser degree, wind energy sources grows as recent cost declines make them more economical. However, the AEO2013 projection is less optimistic about the ability of advanced biofuels to capture a rapidly growing share of the liquid fuels market than AEO2012. As a result, biomass use in AEO2013 totals 4.2 quadrillion Btu by 2035 (compared to 5.4 quadrillion Btu in AEO2012) and 4.9 quadrillion Btu in 2040, up from 2.7 quadrillion Btu in 2011.

With improved efficiency of energy use and a shift away from the most carbon-intensive fuels, US energy-related carbon dioxide (CO2) emissions remain more than 5 percent below their 2005 level through 2040 [Editor's note: recall, from our post moments ago, that "energy-related carbon dioxide (CO2) emissions" is a very deceiving term that hides simultaneous growth in CO2 emissions from natural gas flaring, among other things].

The projected growth rate for US energy-related CO2 emissions has declined successively in each Annual Energy Outlook since AEO2005, reflecting both market and policy drivers (Figure 4). Emissions from motor gasoline demand in AEO2013 are lower than in AEO2012 as a result of the adoption of fuel economy standards, biofuel mandates, and shifts in consumer behavior. Emissions from coal use in the generation of electricity are lower as power generation shifts from coal to lower-carbon fuels, including natural gas and renewables. The story is somewhat more complex for natural gas. Emissions from natural gas use are higher in the industrial and electric power sectors in AEO2013 than in AEO2012 as a result of increased consumption; however, the increase is partially offset by lower emissions from natural gas use in the residential and commercial sectors in AEO2013 as a result of the implementation of efficiency standards for energy-using equipment and other changes that affect demand.

Other AEO2013 Reference case highlights:

The Brent spot crude oil price declines from $111 per barrel (in 2011 dollars) in 2011 to $96 per barrel in 2015. After 2015, the Brent price increases, reaching $163 per barrel in 2040, as growing demand leads to the development of more costly resources. World liquids consumption grows from 88 million bpd in 2011 to 113 million bpd in 2040, driven by demand in China, India, Brazil, and other developing economies.

Total US primary energy consumption grows by 7 percent in the AEO2013 Reference case, from 98 quadrillion Btu in 2011 to 108 quadrillion Btu in 2040. The fossil fuel share of primary energy consumption falls from 82 percent in 2011 to 78 percent in 2040 as consumption of petroleum-based liquid fuels falls, largely because of the incorporation of new fuel efficiency standards for LDVs.

In the AEO2013 Reference case, energy use per capita declines by 15 percent from 2011 through 2040 as a result of improving energy efficiency (e.g., new appliance standards and CAFE) and changes in the way energy is used in the US economy. Energy use per 2005 dollar of gross domestic product (GDP) declines by 46 percent from 2011 to 2040 in AEO2013 as a result of a continued shift from manufacturing to services (and, even within manufacturing, to less energy-intensive manufacturing industries), rising energy prices, and the adoption of policies that promote energy efficiency. CO2 emissions per 2005 dollar of GDP have historically tracked closely with energy use per dollar of GDP. In the AEO2013 Reference case, however, as lower-carbon fuels account for a bigger share of total energy use, CO2 emissions per 2005 dollar of GDP decline more rapidly than energy use per 2005 dollar of GDP, falling by 56 percent from 2005 to 2040, at an annual rate of 2.3 percent.

Net imports of energy decline both in absolute terms and as a share of total US energy consumption.
The decline in energy imports reflects increased domestic petroleum and natural gas production, increased use of biofuels, and lower demand resulting from rising energy prices and the adoption of new efficiency standards for vehicles. The net import share of total US energy consumption is 9 percent in 2040, compared with 19 percent in 2011. (The share was 29 percent in 2007.)
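The report's pairing of a 56 percent total decline in CO2 per dollar of GDP (2005-2040) with a 2.3 percent annual rate is a compound-decline relationship, and the two figures can be checked against each other:

```python
# Check that a 56% total decline over 2005-2040 corresponds to the
# quoted ~2.3% annual rate. Pure arithmetic on the report's numbers.
total_decline = 0.56
years = 2040 - 2005   # 35 years

annual_rate = 1 - (1 - total_decline) ** (1 / years)
print(f"Implied annual decline: {annual_rate:.1%}")
```

The implied rate comes out at about 2.3 percent per year, confirming the two numbers are the same projection stated two ways.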

[Editor's notes: There's a lot of data above, and I'm sure much of it is confusing to the average reader. So, I'm just going to pull out a few underlying points and add some of my own:

It's pretty clear that the EIA is predicting a huge natural gas boom, not just in the coming years but in the coming decades. Whether or not this will come about is unclear, but that's the track we are currently on. The EIA is quite 'biased' in its energy projections because it gives a lot of credence to what has been happening in the past year and what is happening at the moment, and not so much to what is likely to happen as wind and solar policies and innovation make them more and more attractive from a financial perspective.

Natural gas is arguably much better than coal or oil. However, several researchers have also put up huge red flags regarding methane leaks and the true result of natural gas flaring, red flags which imply natural gas may not be so much better after all, and may not be better at all in a worst-case scenario.

The projected increase of renewable electricity from 13 percent in 2011 to 16 percent in 2040 seems like a joke. If we had that small of an increase in that time, I'd be shocked. I'd bet all my savings that projection is way off. And, again, it's a result of the EIA's narrow approach to making these projections. However, it is based on something -- a US Congress that has been completely horrid at doing what the public wants, which is climate action and a strong promotion of renewable energy production. Let's hope we don't stay stuck with a grid-locked Congress and society on this matter for long... and certainly not as long as the EIA is projecting!
Those are my main thoughts. Chime in with your own if you have them.]
Source: U.S. Energy Information Administration

Saturday, December 8, 2012

What Are the Near-Term Climate Pearl Harbors?

So they [the Government] go on in strange paradox, decided only to be undecided, resolved to be irresolute, adamant for drift, solid for fluidity, all-powerful to be impotent…. Owing to past neglect, in the face of the plainest warnings, we have entered upon a period of danger….  The era of procrastination, of half measures, of soothing and baffling expedience of delays, is coming to its close.  In its place we are entering a period of consequences….  We cannot avoid this period, we are in it now….
– Winston Churchill, November 12, 1936, House of Commons
What kind of climatic mini-catastrophes might move public and policymaker opinion over the next decade?  Please share your thoughts below.

“The battleship USS Arizona belches smoke as it topples over into the sea during a Japanese surprise attack on Pearl Harbor, Hawaii, in a Dec. 7, 1941 file photo. The ship sank with more than 80 percent of its 1,500-man crew, including Rear Admiral Isaac C. Kidd. The attack, which left 2,343 Americans dead and 916 missing, broke the backbone of the U.S. Pacific Fleet and forced America out of a policy of isolationism.” (AP Photo/File)
Yesterday marked the 71st anniversary of Pearl Harbor.  In the wake of the extreme weather in the past two years, including superstorm Sandy — all of which served to increase concern about global warming among the public and some politicians — I’m updating my post from 3 years ago, “What are the near-term climate Pearl Harbors?” (which I had already updated last year).
The genesis of the original piece started with an October 2008 post, “Is 450 ppm (or less) politically possible? Part 7: The harsh lessons of the financial bailout.”  It concluded that a key driver of serious government action is “bad things must be happening to regular people right now.”  Shortly after that I wrote a post on the paper “Target Atmospheric CO2: Where Should Humanity Aim?” by Hansen et al.  I noted the authors conclude:
The most difficult task, phase-out over the next 20-25 years of coal use that does not capture CO2, is herculean, yet feasible when compared with the efforts that went into World War II. The stakes, for all life on the planet, surpass those of any previous crisis. The greatest danger is continued ignorance and denial, which could make tragic consequences unavoidable.
A NY Times blogger posed this question, “What kind of wake-up call does Mr. Romm think is conceivable on a time scale relevant to near-term policy?”
My reply was “Multiple Pearl Harbors over the next decade — half or more of these happening” followed by a list of 9 items.
Before repeating that list, let me note that I pointed out that one of the media’s greatest failings is ‘underinforming’ people that “Bad things are happening to real people right now thanks in part to human-caused climate change — droughts, wildfires, flooding, extreme weather, and on and on.” I listed a perfect example: “My article criticizing the NYT on the bark beetle story“.  Things hadn’t changed much by last December, but the U.S. weather has been so relentlessly extreme that media coverage has improved a tad in recent months (see the July post, Every Network Gets Extreme Weather Story Right, ‘Now’s The Time We Start Limiting Manmade Greenhouse Gases’ — ABC.)

If FDR had said, “Yesterday, Dec. 7, 1941 – a date which will live in infamy – the United States of America was suddenly and deliberately attacked. But we’re still working to identify the perpetrators.”  Well, not bloody much would have happened.
Of course, the U.S. military had some warnings, but there was a massive volume of intelligence signals (“noise”) coming in.  Roberta Wohlstetter wrote in 1962: “To discriminate significant sound against this background of noise, one has to be listening for something or for one of several things….   One needs not only an ear but a variety of hypotheses that guide observation.”

The Japanese commander of the attack, Mitsuo Fuchida, was quite surprised he had achieved surprise.  Before the Russo-Japanese war of 1904, the Japanese Navy had used a surprise attack to destroy the Russian Pacific Fleet at anchor in Port Arthur.  Fuchida asked, “Had these Americans never heard of Port Arthur?“

So if you have the right hypothesis or worldview, you can make sense out of “noisy” warnings.  If you don’t, then you will be oblivious even to signs that in retrospect will seem quite obvious.  Certainly future generations will be stunned by our obliviousness.

In the case of the almost non-stop series of “off the charts” extreme climatic events that many opinion leaders seem shocked about over and over again — they aren’t merely “explainable and predictable” after the fact.  They were very often predicted or warned about well in advance by serious people.  The powers that be simply choose to ignore the warnings because they don’t fit their world view.

Unfortunately for the nation and the world, there is no American Churchill on climate.  Quite the reverse:
That lack of statesmanship means the country is not going to act on the basis of the increasingly dire warnings of scientists (see Lonnie Thompson on why climatologists are speaking out: “Virtually all of us are now convinced that global warming poses a clear and present danger to civilization”).
No, things are going to have to get worse.  And it certainly will take more than one climate Pearl Harbor.  I fear it will take most of these happening over the span of a few years:

1. Arctic goes [virtually] ice free before 2020. It would be a big, visible global shock.
2. Rapid warming over the next decade, as Nature and Science articles suggest is quite possible (posts here and here).
3. Continued (unexpected) surge in methane.
4. A [multi-year] megadrought hitting the SW [and Great Plains] comparable to what hit southern Australia.
5. More superstorms, like Katrina.
6. A heatwave as bad as Europe’s 2003 one [Russia's in 2010] but hitting the U.S. breadbasket.
7. Something unpredicted but clearly linked to climate, like the bark beetle devastation.
8. Accelerated mass loss in Greenland and/or Antarctica, perhaps with another huge ice shelf breaking off, but in any case coupled with another measurable rise in the rate of sea level rise.
9. The Fifth Assessment Report (2012-2013) really spelling out what we face with no punches pulled.
And no, to preempt comments similar to one I had in the original post, I’m not “hoping” for those things to happen. Quite the reverse.  I have been proposing strong emissions reductions for many, many years to minimize the chances of catastrophic impacts. In any case, hope can’t change what is to come — only strong action now can.

That was my original list [only slightly modified].  I think it holds up, except for number 9.  The IPCC has not only undermined its credibility but demonstrated time and time again that it is incapable of spelling out what we face with no punches pulled — see “Blockbuster IPCC Chart Hints at Dust-Bowlification, But Report Is Mostly Silent on Warming’s Gravest Threat to Humanity” and “IPCC’s Planned Obsolescence: Fifth Assessment Report Will Ignore Crucial Permafrost Carbon Feedback!”

The drought the U.S. has been experiencing is slowly getting to the level that can change thinking — let’s hope it doesn’t get to that level, though such Dust-Bowlification is inevitable if we don’t act soon.
I think it’s a little clearer what scale of monster heat wave starts to change people’s thinking (see Russian President Medvedev: “What is happening now in our central regions is evidence of this global climate change, because we have never in our history faced such weather conditions in the past”).  We know that there’s an 80% Chance Russia’s 2010 July Heat Record Would Not Have Occurred Without Climate Warming.  We also know that the Monster crop-destroying Russian heat wave is projected to be once-in-a-decade event by 2060s (or sooner).

Two years ago, Lester Brown explained to me that when the real food instability comes — if, for instance, the U.S. breadbasket gets hit with the type of 1000-year heat wave Russia did — then the big grain producers will ban exports, to make sure their people are fed.  In this scenario, if you don’t have your own food supplies or an important export item to barter — particularly oil — your country is going to have big, big problems feeding its people.  That might wake folks up a tad.

That may well be the biggest evolution of my thinking in the past 3 years: that it is food insecurity — and the daggers climate change threatens it with — that will ultimately force action (see “My Nature Piece On Dust-Bowlification And the Grave Threat It Poses to Food Security“).
Your ideas are welcome.  You can read the original reader comments here.

I did note in the original piece that preferably these “mini-catastrophes”  would not themselves be evidence that we had waited too long and passed dangerous, irreversible tipping points.

One can argue that a big surge in methane would be evidence that we had waited too long (see “Climate Experts Warn Thawing Permafrost Could Cause 2.5 Times the Warming of Deforestation!“), but the likely rate of emissions from the tundra doesn’t change the nature of the actions required, only their scale, and those actions are already quite intense (see “The full global warming solution: How the world can stabilize at 350 to 450 ppm“).
If you want 350 ppm — or if you want 450 ppm in a (likely) world where the permafrost has begun to turn into the permamelt — then because we have listened to the siren song of delay for so long, we will need a WWII-style and WWII-scale effort.  As I noted in the conclusion to my book:
This national (and global) re-industrialization effort would be on the scale of what we did during World War II, except it would last far longer. “In nine months, the entire capacity of the prolific automobile industry had been converted to the production of tanks, guns, planes, and bombs,” explains Doris Kearns Goodwin in her 1994 book on the World War II homefront, No Ordinary Time. “The industry that once built 4 million cars a year was now building three fourths of the nation’s aircraft engines, one half of all tanks, and one third of all machine guns.”
The scale of the war effort was astonishing. The physicist Edward Teller tells the story of how Niels Bohr had insisted in 1939 that making a nuclear bomb would take an enormous national effort, one without any precedent. When Bohr came to see the huge Los Alamos facility years later, he said to Teller, “You see, I told you it couldn’t be done without turning the whole country into a factory. You have done just that.” And we did it in under five years.
But of course we had been attacked at Pearl Harbor, the world was at war, and the entire country was united against a common enemy. This made possible tax increases, rationing of items like tires and gasoline, comprehensive wage and price controls, a War Production Board with broad powers (it could mandate what clothing could be made for civilians), and a Controlled Material Plan that set allotments of critical materials (steel, copper, and aluminum) for different contractors.
How ironic that denial, driven in large part by conservative fear of big government, has created an “era of procrastination, of half measures, of soothing and baffling expedience of delays” that will ultimately require somewhat bigger government (for several decades) to prevent catastrophe or, if the deniers truly “triumph,” then staggeringly huge government (for a century and probably much more) to “adapt” to [through a combination of abandonment, triage, and misery] a ruined world (see “Don’t believe in global warming? That’s not very conservative”).

Finally, Pearl Harbor #1 is increasingly likely (see Death Spiral Watch: Experts Warn ‘Near Ice-Free Arctic In Summer’ In A Decade If Volume Trends Continue). The fact that what’s happening in the Arctic (and its implications for sea level rise, the tundra, and our weather) isn’t one of the major media stories of the year — comparable to the fiscal cliff — may be the clearest evidence that the media is under- and mis-reporting the story of the century.

What I didn’t realize when I wrote the original list is that the shockingly fast loss of Arctic ice would itself lead to more superstorms and extreme weather (see “NOAA Bombshell: Warming-Driven Arctic Ice Loss Is Boosting Chance of Extreme U.S. Weather“). So the current bout of extreme weather is likely the “new normal.”

The Pearl Harbors are here. The Churchills and FDRs aren’t.

Wednesday, November 14, 2012

Man Lights House with Toyota Prius

An enterprising New Jersey man used his Toyota Prius as a generator to run his home’s lights, laptops, and a television after losing power in the recent Sandy outages. For one week he used an inverter and some extension cords to power his home, using only 75% of one tank of gas.

At one point, as you probably know, millions of homes were without power in New York and New Jersey.

There are instructions online explaining how to use a Prius as a generator. WikiHow has a six-step process, including a tip about proper ventilation, because car exhaust can be deadly if it accumulates in a space where people or animals are breathing.

The New Jersey man is not the only person to have used a Prius to power devices in his home. Last year, a man in Massachusetts did as well. “When it looked like we were going to be without power for awhile, I dug out an inverter (which takes 12v DC and creates 120v AC from it) and  wired it into our Prius… These inverters are available for about $100 many places online,” the man said. He only used five gallons of gas to power his home appliances for three days.
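A rough energy estimate shows why a few gallons can go such a long way when a car idles as a generator. The 33.7 kWh-per-gallon figure is the standard gasoline energy-equivalence value; the 20% end-to-end conversion efficiency is an assumed round number for this sketch, not a measured figure for the Prius:

```python
# Rough estimate of usable electricity from five gallons of gasoline
# run through a car engine + inverter setup.
# 33.7 kWh/gal: standard gasoline energy equivalence.
# 0.20: assumed overall engine-to-AC efficiency (a guess for illustration).
gallons = 5
kwh_per_gallon = 33.7
efficiency = 0.20

usable_kwh = gallons * kwh_per_gallon * efficiency
days = 3
print(f"~{usable_kwh:.0f} kWh usable over {days} days "
      f"(~{usable_kwh/days:.1f} kWh/day)")
```

Even under these conservative assumptions, five gallons yields on the order of 11 kWh per day, enough for lights, laptops, a TV, and a refrigerator, though not for heavy loads like electric heating, which matches how both men described using their setups.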

Many backup generators use two-cycle engines, which are known to create large amounts of air pollution, though they certainly are much cheaper than a Prius. Some electric vehicle owners also have solar panels on their homes and generate all or a portion of their own power.
The 2012 Prius has been rated at 536 miles per tank.

Tuesday, November 13, 2012

New Chemical Process Produces Biofuel Strong Enough to Power Jets

Thanks to scientists harnessing the power of chemistry, you may one day soon fly in a plane fueled by plants. An article published in the journal Nature last week describes a new technique developed by researchers at UC Berkeley that can create biofuels powerful enough to be used as jet fuel. Created using bacterial fermentation and chemical catalysis, the amped-up biofuel is ten times more powerful, and it can serve as a viable power source for large industrial vehicles and airplanes.

The researchers have created a two-step process that drastically increases the potency of biofuels. First, plant sugars are broken down through fermentation using the bacterium Clostridium acetobutylicum, producing acetone, butanol, and ethanol. Then, the resulting product is run through chemical catalysis in order to increase the amount of carbon in each molecule.

This new process ratchets up the amount of carbon present in normal ethanol by ten times, making it as powerful as diesel and jet fuel made from petrochemicals. The next challenge for the researchers is to find a way to duplicate their new methods on an industrial scale.


Monday, November 12, 2012

Natural Gas – Is It Stunting Innovative Thinking?

Let’s admit it, infrastructure is a boring word.   There’s nothing sexy about it.  It implies disruptions to our lives as we deal with delays and detours for construction and repair projects.  Yet it is absolutely necessary, and infrastructure is what needs to be upgraded in our water, gas, and electric grids.

My previous articles discussed investments that are ongoing or needed in the electrical grid to modernize generation, transmission, distribution, and consumption.  However, the same issues exist for gas and water too.  In some respects, the needs are even more striking.  But how we build our infrastructure and what we build for our infrastructure also says a great deal about how innovative our thinking is.  And unfortunately, right now that thinking is “like for like”, and merely replicates existing energy models with known weaknesses in reliability and resiliency instead of building infrastructure based on new models.

Natural gas is seen by some in the energy business as a panacea to all energy concerns.  It’s domestic.  It’s cleaner than coal.  However, it requires significant infrastructure investments.  No matter how much innovation you put into the extraction technologies for fossil fuels (which by the way had HUGE federal government assistance), the supply chains still require buildouts of pipelines to transport it to refineries and on to points of consumption.  We simply don’t have sufficient pipeline capacity to transport it to all the places that want it in the USA.  It’s an infrastructure play that has a number of challenges.

The natural gas that is extracted must be processed, just like oil must be refined, or electricity must be generated.  These industrial operations expend lots of energy in processing gas into what is considered pure gas for end use consumption.  The transport of processed natural gas in pipelines requires more energy to compress it and move it in pipelines, and compressor stations, like electricity substations, are placed along major transmission corridors to boost pressure.  This map shows the interstate natural gas pipelines that transmit highly compressed natural gas.  Pipelines have physical constraints – there is only so much space available for gas, and they require electricity to compress the gas in the pipelines.  Therefore, when there is a significant electricity outage in a region, it can also impact the transmission and distribution of natural gas.
According to the US Department of Transportation’s Pipeline and Hazardous Materials Safety Administration records, there are over 2 million miles of distribution pipeline.  As we saw in San Bruno, California two years ago, failure to properly monitor and maintain distribution pipelines has consequences.  Smart Grid technologies, including the colorfully named PIGs (pipeline inspection gauges), which can monitor and transmit measurements within pipes, can help reduce the odds of similar mechanical, technical, and human failures.

But with natural gas, we are once again relying on a model of centralized production, large-scale transport, and wide-scale distribution.  It has all the weaknesses of today’s electrical grid: acts of nature and human causes can both trigger disruptions.  And because natural gas is a conveniently transportable fuel, it is also a very exportable fuel.  Since we won’t see any federal or state laws requiring that natural gas produced in the USA be consumed in the USA, it will go to the highest bidder, on or offshore.  While gas is inexpensive now, it hasn’t always been, and if history is our guide, there’s no guarantee it will stay that way.

So at the cusp of grid modernization, we are placing much of our energy future in a source that we hope will remain cheap and be readily available at any point it is needed, which requires committed investments in new infrastructure and enhancements to existing infrastructure.  It is an energy source that also generates concerns about potential environmental degradation and seismic destabilization.  And somehow, this all looks better than clean domestic renewables that require a different infrastructure investment, but avoid those troubling questions about price fluctuations, exportability, and environmental impacts.  Yes, we have too much “like for like” thinking about infrastructure going on when we need truly revolutionary thinking.


Sunday, November 11, 2012

Voltaic DIY Solar Charger Kits: Off-Grid Power Your Way

Voltaic Systems makes some of the highest quality solar charging accessories on the market today. We’ve featured many of their gadgets in the past, from solar charging backpacks to iPad cases. Now they’ve launched a new line of products aimed at the DIY enthusiast: a selection of solar charging kits which allow users to design their own portable power systems.

One size fits all is so 1990s. Today’s consumers, even the conscious kind, want customization. The market has responded, giving us the ability to have everything, from cars to hamburgers, “our way”. While it may seem a little self-absorbed, there is some wisdom in building customization into today’s product designs. When people can customize, there’s less waste, and we come away more confident that our unique needs will be met. Voltaic’s newest products are designed to make it easier for people to create their own solar chargers based on their own power, cost and form factor requirements.

The new solar charging kits build on the path forged by Voltaic’s Fuse, a 10 W solar panel that comes with a strap system so it can be attached to any type of backpack. Especially convenient if you love your current backpack and don’t want to shell out $200+ for one with the panels already embedded.

Each DIY solar charging kit includes waterproof solar panels, connectors, and a universal battery. Kits range in size from 2 Watts of solar power for basic smartphone and small device charging, to 16 Watts for laptop, tablet and digital camera charging. Single-panel kits can be connected directly to a Voltaic battery for power storage. Those looking for more juice can build multi-panel kits, which require a circuit box (below) with two inputs for solar panels, an output for an optional LED wire, and two power output cables.

“We’ve discovered that one size doesn’t fit all. Our customers charge hundreds of different devices in wildly different conditions all over the world,” said Shayne McQuade, CEO of Voltaic Systems. “We created these solar charger kits so our customers can build solar systems tailored to their specific power, weight, cost, and form factor requirements.” Small kits start at $25.

Saturday, November 10, 2012



[Map: Top 10 States, 10 States Needing Most Improvement, and Most-Improved States highlighted: AK, AZ, CA, CT, KS, LA, MD, MA, MI, MN, MS, MO, MT, NE, NY, NC, ND, OK, OR, PA, RI, SC, SD, VT, WA, WV, WY]



Wednesday, November 7, 2012

Air Force Wind Turbines at Radar Station Convert Nay-Sayers to Cheerleaders

Here’s a crazy lede from a press release about Air Force wind turbines that came out earlier this week: “Change is blowing into Cape Cod Air Force Station as the 6th Space Warning Squadron receives two new wind turbines.” Crazy, because just a couple of years ago the U.S. Department of Defense expressed serious national security concerns about radar interference from wind farms, and now here they are plunking down a couple of wind turbines right in the middle of a radar station. However, before the alarm bells go off, take a look at what’s changed over the past couple of years.

First off, let’s note that the full lede in that press release goes like this:
“Change is blowing into Cape Cod Air Force Station as the 6th Space Warning Squadron receives two new wind turbines here saving an estimated $1 million in annual energy costs.”

The two turbines are expected to cut the station’s electricity costs in half and pay for themselves in about twelve years. After that, they will provide the station with free electricity for up to 13 more years, assuming they reach their 20- to 25-year lifespan.
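A few lines of arithmetic show how the stated savings, payback period, and lifespan fit together; note that the implied project cost below is derived from those figures, not a number from the press release:

```python
# Sanity-check the payback arithmetic from the press release's figures.
annual_savings = 1_000_000   # dollars per year, as stated
payback_years = 12           # simple payback period, as stated
lifespan_years = 25          # upper end of the 20- to 25-year lifespan

# Project cost implied by a 12-year simple payback (derived, not reported).
implied_project_cost = annual_savings * payback_years

# Years of essentially free electricity after the turbines pay for themselves.
free_power_years = lifespan_years - payback_years

print(f"Implied project cost: ${implied_project_cost:,}")
print(f"Post-payback years of free power: {free_power_years}")
```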

The press release is a bit vague on the details but apparently the station currently receives electricity from an oil-fired power plant. From that benchmark the Air Force expects that the two turbines combined will cut carbon dioxide, sulfur dioxide, and nitrogen oxide emissions by 2,000 metric tons per year.

So, what’s changed? In this instance, perhaps nothing. There are different kinds of radar systems for different purposes. This one is for “space situational awareness” and for tracking sea-launched intercontinental ballistic missiles as well as satellites, and it’s possible that wind turbines don’t make a difference for that kind of system.

In general, though, wind turbines are recognized as a threat to radar operations, and by the mid-2000s it became obvious that the growing wind industry in the U.S. was on a collision course with military radar systems. In response, the Department of Homeland Security commissioned a study on radar interference from wind turbines that was released in 2008.

The study noted that the conventional solution was location-based, meaning simply that wind turbines could not be located anywhere near a radar station.

That’s a rather primitive approach to solving a high-tech problem, and sure enough the study recommended exploring technological rather than geographical solutions.

Fast-forward just three years and you’ll see that one is at hand, at least in the UK. Within the past year, the UK’s Ministry of Defence has worked out a big deal with wind farm developers that has “unlocked” 4 gigawatts (GW) worth of blocked wind farms.

Something must be up in the U.S., too, because just last August the Department of Defense signed a memorandum of understanding with the Department of the Interior to explore the potential for wind power and other forms of alternative energy on millions of acres of land at western military bases.

Meanwhile, a UK company called Aveillant has come up with a new radar system for air traffic control at airports that it calls Holographic Radar, which uses 3-D imagery to distinguish between airplane wings and turbine blades.

In any case, if radar systems can be redesigned to coexist with wind turbines, perhaps a technological solution will soon be at hand to ensure that wind turbines can coexist more safely with birds, too.

Tuesday, October 16, 2012

Managing the Magnitude of the Smart Grid

The envisioned, next-generation smart grid is an evolution that builds on decades of accumulated engineering lessons learned to make the ways that we deliver and receive electricity more intelligent, robust and reliable. But utilities, industry, policymakers and even consumers must not underestimate what is being undertaken here. The smart grid is not just a next step; it’s an interrelated range of steps—and, sometimes, leaps—that are all pointed toward the same set of revolutionary goals.

It’s that potentially overwhelming magnitude of the enterprise that makes the emerging smart grid so complex and daunting. Consider the layers of change that are being carried out.
Let’s start with the shift from one-way to two-way power flow within a region of the grid. Instead of unidirectional power flowing—usually from a central-station plant to some sort of business or residential consumer—the long-range vision for the smart grid is predicated on bidirectional power flow, anywhere across the network. In this model, augmented with significantly more robust technologies for distributed generation and secure, real-time information exchange, any power user could also be a power producer. This shift also demands substantial change in regulations and business processes, as there are brand-new questions to be worked out in terms of who shoulders the costs of interconnection and how players on both ends of connections are to be equitably compensated.

Now let’s look at the proposed transition to a truly interstate and even international grid, in which power and information could be exchanged from one region to another. This, too, constitutes dramatic and multidimensional change. For example, the United States is today served by, effectively, three grids composed mostly of proprietary systems that are purchased, deployed and operated by more than 3,000 independent utilities, each with its own processes and legacy infrastructures. The smart grid envisions power and information flowing flexibly across existing regional jurisdictions, and that demands interoperability across equipment interfaces, data formats, content definitions, measurement units, etc.

These are historic changes that, in the end, stand to add up to ground-breaking benefits in terms of reliably satisfying unprecedented demand for power, reducing carbon footprint, keeping energy costs in check, enabling new business models and empowering consumer choice in the way power is used. The world’s smart-grid stakeholders will not only have to keep their eye on those long-term goals; for the smart grid to come about efficiently, they also will need to institute technology, business-process or regulatory changes within context of a comprehensive, long-range plan.

IEEE 2030® “IEEE Guide for Smart Grid Interoperability of Energy Technology and Information Technology Operation with the Electric Power System (EPS), End-Use Applications, and Loads,” for example, was created with just such a system-of-systems orientation. The document provides a roadmap to interconnection and interoperability, interface by interface across the grid. Utilities can use IEEE 2030 to inform their smart-grid infrastructure plans, and vendors can use it to help craft product strategies for the smart-grid market opportunity.

The basic model of electricity production, distribution and consumption has not fundamentally changed since the power industry’s inception, but the smart grid is bringing new engineering principles, technological capabilities and business relationships into play. This is a journey that will ultimately have to play out over decades.
It will take leadership, the will to invest and take risk and commitment to a long-range plan to make it happen.

Monday, October 15, 2012

Memo to Federal and State Highway Agencies: Keep CMAQ Funds On Track to Cut Pollution

Congress included an innovative program in the 1991 Intermodal Surface Transportation Efficiency Act (ISTEA) that for over 20 years has helped clean up the environment by providing funds for transportation projects designed to reduce traffic congestion and improve air quality.

The Congestion Mitigation and Air Quality Improvement (CMAQ) funds are provided to states based on the population of local areas in “non-compliance” or those “seeking to maintain compliance” with strict national standards for ozone and carbon monoxide set up under the landmark Clean Air Act. In the first 10 years of the program, the number of person-days of unhealthy air quality declined by 38 percent nationally, with California, the leading spender of the funds, accounting for 97 percent of that improvement.
But MAP-21 (Moving Ahead for Progress in the 21st Century), passed by Congress June 29th, included a number of provisions that put CMAQ funds in immediate jeopardy and could siphon as much as half of the program’s annual $3.3 billion funding away from regions facing public health threats due to air pollution by providing “flexibility” to states on how the money is spent.

Such implementation would eviscerate CMAQ as an important tool for preserving public health, diverting investments in projects that reduce pollution from tailpipes through technologies such as new rail cars and buses as well as diesel vehicle retrofit projects. CMAQ funds have gone towards improved public transit, traffic signalization and other traffic flow improvements, trip reduction and ride-sharing initiatives, and bicycle facilities.

Thankfully, federal and state highway administrations can stay the course towards cleaner air, and guidance from the U.S. Transportation Department can make sure we the public know what is happening with our tax dollars. In this spirit earlier this week the NRDC, along with seven other organizations, sent a letter to Transportation Secretary Ray LaHood urging him to implement the law to maximize CMAQ’s effectiveness and to direct his staff to ensure states continue to use CMAQ for projects that actually clean the air and improve public health for the sake of our communities and environment. We recommended that:
- State highway agencies be required to hold a 30-day comment period before diverting funding from regions with significant air pollution, since accountability and transparency are the least taxpayers deserve in exchange for the additional latitude the new law provides; and
- FHWA issue a special rule allowing substantial flexibility in determining what sources localities and states can use to provide the newly required local match. Under the 2007 Energy Independence and Security Act, CMAQ funds did not need to be matched.
When the EPA established new ground-level ozone standards in 2008 they mapped areas that have met or not met (attainment vs. nonattainment) the standards. The map is splattered with nonattainment areas from coast-to-coast, putting millions of Americans, especially those in major metropolitan areas, at risk from breathing air that contains ozone, a component of smog pollution that can trigger a variety of respiratory-related health problems, and is especially dangerous to people with lung disease, asthmatics, children, older adults and people who are active outdoors.

Ground-level ozone also damages vegetation and ecosystems, leads to reduced agricultural crop and commercial forest yields, reduced growth and survivability of tree seedlings, and increased susceptibility to diseases, pests and other stresses such as harsh weather. The science is also clear that particulate matter or soot, especially fine particles, can cause severe health damage as well, which explains a laudable change to CMAQ: 25 percent of the money must be used to reduce such pollution in states challenged by it.
EPA’s Final Nonattainment Areas for the 2008 Ozone Standards

I urge lawmakers to continue putting every penny of available CMAQ funds into improving our air quality, the original intent of the program. Those of us who breathe air can't afford the environmental or health-related consequences of not doing so.

Sunday, October 14, 2012

Tidal Power Capacity Potential in the UK Estimated at 153 GW

There are 153 GW of potential tidal and wave power capacity in the UK, according to a new report from the Crown Estate, commissioned to help predict the future of the technology.

The report from the Crown Estate underlines the enormous energy potential in the UK’s marine environment. To harness the tidal portion of this enormous capacity, three primary types of technology will be needed: tidal stream devices, tidal range barrage schemes, and tidal range lagoon schemes.

“The report predicts tidal stream devices could produce 95 terawatt hours (TWh) a year from 32GW of installed capacity, tidal range barrage schemes could supply 96 TWh/year from 45GW of capacity, and tidal range lagoon schemes could produce 25TWh/year, drawing on 14GW of capacity.”

And there is also the potential for “27GW of wave energy capacity, which could produce 69TWh of electricity a year.”
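The paired GW and TWh/year figures imply a capacity factor for each technology, which is a useful sanity check on the report’s numbers; the sketch below simply divides annual output by what each fleet would produce running flat out all year:

```python
# Implied capacity factors from the report's installed-capacity (GW) and
# annual-output (TWh/year) figures.
HOURS_PER_YEAR = 8760

resources = {
    "tidal stream": (32, 95),    # (GW installed, TWh per year)
    "tidal barrage": (45, 96),
    "tidal lagoon": (14, 25),
    "wave": (27, 69),
}

for name, (gw, twh) in resources.items():
    # Convert TWh to GWh, then divide by maximum possible annual output.
    cf = (twh * 1000) / (gw * HOURS_PER_YEAR)
    print(f"{name}: {cf:.0%} implied capacity factor")
```

The tidal barrage and lagoon figures come out in the low 20-percent range, consistent with the intermittent nature of tidal range generation.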

The authors of the report say that the figures for the different technologies should be interpreted separately, and that all of the results remain theoretical for now.

According to Rob Hastings, the director of the Crown Estate energy and infrastructure portfolio, the report is intended to be a reference to help in the development of the industry and associated policies.
“While the science of wave and tidal resource assessment is still emerging, and future work will clarify the resources that are practically available, it is clear that wave and tidal energy could contribute substantially to the UK’s electricity needs,” he said.

“Improving understanding about the extent and locations of resources will help to accelerate development in a sustainable way.”

The UK’s Secretary of State for Energy and Climate Change, Ed Davey, recently visited the European Marine Energy Centre (EMEC) in Orkney and had this to say:
“EMEC is a huge asset to the development of wave and tidal energy in the UK and has helped secure UK leadership in the global market.
“The UK has the largest wave and tidal resource in Europe, which could produce 20 per cent of current UK electricity demand and cut carbon emissions.”

Sunday, October 7, 2012

US 'Solar Zones' in Place, Ready for Big Projects

The Obama administration on Friday gave final approval to a plan that opens up 285,000 acres in 17 zones in six Western states for streamlined utility-scale solar power development. The Department of the Interior said the fast-track sites are “characterized by excellent solar resources, good energy transmission potential, and relatively low conflict with biological, cultural and historic resources.”

The Programmatic Environmental Impact Statement (PEIS) for solar energy development doesn’t limit such power plants to the solar energy zones, but the benefits of siting projects in them will be substantial. The government’s major land caretaker, the Bureau of Land Management, has committed to “facilitating faster and easier permitting in the SEZs, improving and facilitating mitigation, facilitating permitting of needed transmission to the SEZs, encouraging solar development on suitable adjacent nonfederal lands, and providing economic incentives for development in SEZs.”

image via BrightSource Energy

The Department of the Interior said that if fully built out, solar projects in the zones could produce some 23,700 megawatts of electricity, enough to power around 7 million American homes.
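As a rough check on the “around 7 million homes” claim, one can back out the capacity factor it implies; the 11,000 kWh/year average household consumption used below is an assumed round figure, not part of the Interior Department announcement:

```python
# Back out the capacity factor implied by "23,700 MW powers ~7 million homes".
capacity_mw = 23_700
homes = 7_000_000
kwh_per_home_year = 11_000   # assumed US average household use (not from the announcement)

# Total annual demand of those homes, in MWh.
annual_demand_mwh = homes * kwh_per_home_year / 1000

# Fraction of the fleet's maximum possible annual output needed to meet it.
implied_cf = annual_demand_mwh / (capacity_mw * 8760)
print(f"Implied capacity factor: {implied_cf:.0%}")
```

The result lands in the high-30-percent range, plausible for desert solar, particularly concentrating solar plants with thermal storage, though on the high side for plain photovoltaics.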
Secretary of the Interior Ken Salazar signed the Record of Decision codifying the plan in Las Vegas on Friday, joined there by Senate Majority Leader Harry Reid (D-Nev.), proving that even in the face of a recalcitrant Congress, the executive branch has tools to make things happen.

“Energy from sources like wind and solar have doubled since the President took office, and with today’s milestone, we are laying a sustainable foundation to keep expanding our nation’s domestic energy resources,” Salazar said in a statement. “This historic initiative provides a roadmap for landscape-level planning that will lead to faster, smarter utility-scale solar development on public lands and reflects President Obama’s commitment to grow American made energy and create jobs.”

image via the White House
There are zones in six states, but that’s a little bit misleading: Of the 285,000 acres, more than half – 147,910 – are in California’s Riverside County, which borders Orange County on its western flank and then stretches all the way east across the Mojave and Colorado deserts to Arizona.

Pre-Obama, no big solar energy projects had been permitted on public lands. But according to the Interior Department, under Obama 33 renewable energy projects have been approved for construction on or involving public lands, including 18 solar plants, seven wind farms and eight geothermal plants. In May, the first of those big projects –  Enbridge Silver State North, a 50-megawatt solar PV array 40 miles south of Las Vegas – went online.

Saturday, October 6, 2012

Windstrument Wind Turbine Provides Renewable Energy With its Orchid-like Design

Asahi Kasei Plastics N.A. is working with Unified Energies International to develop the Windstrument, a wind turbine that aims to bring affordable renewable energy to the world.

The Windstrument is designed for both residential use and utility-scale projects, including entire communities, industrial centers, and agricultural groups. The rooftop or pole-mounted system is affordable, quiet, powerful, bird-safe, and scalable. The device was extensively tested in the Jacobs/Ford Detroit wind tunnel and field-tested for over 3 years in one of the harshest climates on earth. Thanks to its beautiful orchid-like design (which can be adapted to local environments), the Windstrument is anything but an eyesore.


Friday, October 5, 2012

Vibrant Protection for Cyclists, Runners, Pedestrians at Night — Visibelt!

I love the way this attention-getting, eye-catching Visibelt for cyclists, runners, pedestrians, and any night-time traveler is made of energy-efficient LED lights, spinning out from one’s body like an aura. I want one. It would go great with an actual “Aura” system on my wheels when biking at night. This Visibelt will bring bright notice of me or you as the “other” (less armored) traveler sharing the road.

What exactly is Visibelt? Visibelt is a light with a huge surface area that wraps around your body or backpack to make sure you get seen. Using just two LED lights combined with the innovative plastic light-carrying tube means battery life is still competitive with smaller, less eye-catching alternatives, like under-seat, rear LED bike lights. Visibelt has three light modes: fast flash, normal flash, and constant light.

Visibelts will bring light into any driver’s view. Too many accidents are occurring in an age when the number of cyclists is increasing and cyclists need to be seen. As one roams around night-time wilderness or urban spaces, or travels home from school or work, this is a must-have for being seen. As a biker or runner goes flying across an intersection (all the while using good safety measures such as lights, stop signs, etc.), one will have another measure of protection: light, light, light.

Thursday, October 4, 2012

Online Service Gives Us New Ways to Access Key U.S. Electric Power Data

Originally published on the U.S. Energy Information Administration website.
The U.S. Energy Information Administration (EIA) today makes key electricity data more accessible than ever before with the release of a new online service. The agency’s first-ever Application Programming Interface (API) allows developers to design web and mobile apps that harness a wealth of information about the U.S. electricity sector.

The free API will give developers access to data on electricity generation, retail sales, and average prices, and the types of fuel that are used to generate electricity at the state and national levels. Electricity generation and fuel consumption data for individual power plants with more than 1 megawatt of capacity also are available. These data are structured into a hierarchical set of 39,000 categories, grouping related series and assisting in the exploration of EIA’s data.
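Because the data are organized as a hierarchy of categories containing series, client code typically walks that tree recursively. The sketch below illustrates the pattern; the dictionary shape and series IDs are hypothetical stand-ins, not EIA’s exact JSON schema:

```python
# Walk a nested category tree of the kind EIA's API exposes and collect
# series IDs. The structure here is illustrative only; EIA's actual JSON
# schema differs, but the traversal pattern is the same.
def collect_series(category):
    """Recursively gather series IDs from a category subtree."""
    series = list(category.get("series", []))
    for child in category.get("childcategories", []):
        series.extend(collect_series(child))
    return series

# A tiny hypothetical slice of the electricity category hierarchy.
electricity = {
    "name": "Electricity",
    "series": [],
    "childcategories": [
        {"name": "Net generation",
         "series": ["ELEC.GEN.ALL-US-99.M"],
         "childcategories": []},
        {"name": "Retail sales",
         "series": ["ELEC.SALES.US-ALL.M"],
         "childcategories": [
             {"name": "Residential",
              "series": ["ELEC.SALES.US-RES.M"],
              "childcategories": []},
         ]},
    ],
}

print(collect_series(electricity))
```

With roughly 39,000 categories in the real hierarchy, a traversal like this is how an app would build a browsable index of the available series.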

“EIA’s API will enable independent developers to create innovative information technology applications that can be used to improve energy decision-making. The value of EIA’s data will be enhanced even further when it is combined with data beyond what the agency collects, such as market or environmental data,” said EIA Administrator Adam Sieminski.

Of particular interest to developers will be the geographical metadata provided with each series (for example, the longitude and latitude information of individual electricity plants). Standards-based country and state codes are provided, where applicable. These metadata will permit advanced mapping applications.
Planned additions to EIA’s API include petroleum and natural gas data, along with state energy estimates. As these data sets are added over the coming months, the total number of data series available through EIA’s API will grow.

APIs are an important element of a government-wide Digital Strategy to make information more transparent and customer-centered.

Wednesday, October 3, 2012

Oxymoron of the Day: Nocturnal Photosynthesis

Here’s one that almost slipped by us: last month, the US Department of Energy granted $14 million to an international biofuel research team headed by the University of Nevada, with the goal of developing a new strain of poplar tree that can perform nocturnal photosynthesis. That sure sounds like a honey trap for certain pundits and federal legislators who don’t like government spending on biofuel research, especially when it scans like an oxymoron and involves spending millions on a common tree that your local nursery probably sells for less than fifty bucks. However, that relatively small investment of $14 million could make all the difference in the ability of the domestic biofuel industry to help power the US through a hotter, drier future.

The technical name for nocturnal photosynthesis is crassulacean acid metabolism (CAM). The phenomenon was discovered back in the 1950s, when researchers at Newcastle University in the UK noted that prickly pear, agave, and some other desert plants open up their pores to absorb carbon dioxide at night, rather than during the day as in normal photosynthesis.

With a store of carbon dioxide at hand, these plants have a power source for photosynthesis during the day while keeping their pores shut tight against water loss.
According to researchers at the University of Nevada, CAM plants can thrive on 8 to 16 inches of precipitation annually, compared to typical non-CAM biofuel crops that require 20 to 40 inches.

Why poplar? Well, as the US recovers from its corn ethanol hangover, the search is on for woody, drought-tolerant biofuel crops that don’t compete with food and animal feed crops. That makes the ideal biofuel crop a non-food plant that can be grown on marginal land that is not suitable for cultivating food crops.
That’s where poplar comes in. Biofuel from poplars is already a hot topic in biofuel research circles because the tree grows quickly in poor soil and it tolerates dry conditions.

As a perennial biofuel crop, poplar has a potential advantage over annual crops in terms of soil conservation and energy required for cultivation.

A poplar biofuel farm could do double duty as a managed forest for wildlife habitat and recreation. Poplar is also being tested as a form of soil remediation called phytoremediation, in which plants remove contaminants from soil as they grow.

That’s all well and good, but one thing that poplar lacks is the genetic mechanism for CAM, and that is exactly what the new DOE grant is designed to give it.

Helping to nudge things along, researchers at Oregon State University have been working on a genetic modification to create semi-dwarf trees, including dwarf poplar. The idea is to keep US forest industries viable in a hotter, drier world by introducing trees that are more drought-tolerant thanks to a larger proportion of root mass.

The increased root mass would also enhance the ability of semi-dwarf trees to perform soil conservation and phytoremediation tasks.

No surprise that the Oregon State research is partly funded by the Department of Energy as well as the Department of Agriculture, the National Science Foundation, and forest industry partners.

The University of Nevada project, by the way, is titled “Engineering CAM Photosynthetic Machinery into Bioenergy Crops for Biofuels Production in Marginal Environments.” The research team also includes the University of Liverpool, Newcastle University, the Oak Ridge National Laboratory, and the University of Tennessee, Knoxville.

Image: Poplar tree at night. Some rights reserved by Horia Varlan.

Tuesday, October 2, 2012

SoloPower Offers Relatively Expensive Panels with a Potentially Huge Cost Benefit

Editor’s Update October 2, 2012: Someone has notified me that SoloPower has not actually posted anything on the cost of SoloPower’s solar modules, and that the numbers below (which we retrieved from Greentech Media) are extrapolated from another technology. I am contacting SoloPower to try to get confirmation of that.

We wrote about SoloPower in March when it broke the efficiency record for CIGS solar modules, and we’ve actually covered the company a few times over the past few years. SoloPower charges $2.20 per watt for its flexible solar panels, if you purchase a whopping 10 MW (10,000 kW) of panels, that is. At that scale, typical rigid solar panels made of metal and glass (or plastic) can cost $1 per watt.

SoloPower’s flexible CIGS (copper indium gallium selenide) solar panels have some clear installation benefits. They can be pasted onto roofs without penetrating them. Penetration requires expensive contractors, and so does the construction of mounting equipment for solar panels.

SoloPower’s flexible CIGS solar panels.
Flexible solar panels can also be installed onto uneven surfaces much more easily than typical rigid metal panels, and they can even be installed onto surfaces that it would be impossible to mount rigid metal panels onto.

As you can see in the picture above, installation can actually be very simple, and this opens up a window of opportunity to install solar panels yourself, which is far cheaper than having a contractor install them, since professional installation can cost more than the panels themselves.
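Whether the flexible panels’ price premium pays off therefore comes down to installation savings. A minimal sketch of the break-even comparison, using assumed (not reported) per-watt installation costs for the two mounting styles:

```python
# Compare total installed cost per watt: pricier flexible panels with cheap
# peel-and-stick installation vs. cheaper rigid panels with contractor
# installation. Installation figures are assumptions for illustration.
flexible_panel = 2.20     # $/W, SoloPower at 10 MW scale (from the article)
rigid_panel = 1.00        # $/W, typical rigid module (from the article)
premium = flexible_panel - rigid_panel

rigid_install = 1.50      # $/W, assumed contractor install with racking
flexible_install = 0.20   # $/W, assumed DIY peel-and-stick install

install_saving = rigid_install - flexible_install
print(f"Panel premium: ${premium:.2f}/W, install saving: ${install_saving:.2f}/W")
print("Flexible wins" if install_saving > premium else "Rigid wins")
```

Under these assumed figures the installation saving slightly exceeds the panel premium; with a cheaper contractor quote the comparison flips, which is exactly the catch the article goes on to describe.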
There is a catch to flexible thin-film solar panels, though. They tend to be less efficient than rigid mono-crystalline panels, and they are more expensive.

The easy pasting installation concept also has a catch. Pasting panels on your roof may entail replacing your roof when the panels go bad, depending on the type of roof you have.

If you have an asphalt roof, the panels could outlast it, but asphalt roofs and these panels typically share a similar 20-year lifespan, so the panels are likely to need replacement at around the same time as the roof.
There are also adhesives that can easily be peeled back off, although I can’t verify how long such adhesives would last. And that convenience has one major catch: ease of theft. Solar panels that are permanently glued to a roof cannot easily be stolen, so their theft deterrence is superior.

For smaller scale projects that are up to 500 kW, SoloPower’s panels cost $2.95 per watt.
SoloPower is headquartered in San Jose, California.
Source: Green Tech Media

Photo Credit: SoloPower


Monday, October 1, 2012

Silevo’s Triex Hybrid Solar Cell Has Reached 21% Conversion Efficiency at Production

The solar cell innovator and photovoltaic (PV) solar module manufacturer Silevo recently announced that its Triex™ solar cell technology won The Solar Industry Award 2012 for Excellence in Innovation. The Triex technology is a powerful hybrid solar module that is able to perform with very high efficiency and ‘low temperature coefficients’ while costing considerably less to produce than was previously possible.

The Solar Industry Awards were created to cast a spotlight on the people, products, and services that are continuing to develop innovative manufacturing practices and products that “demonstrate technological development towards grid parity while reducing overall cost.”

Silevo’s winning of the award follows closely on the heels of its announcement that it has reached a conversion efficiency greater than 21 percent “with its Triex solar cells at its high volume manufacturing facility—one of the highest across the solar industry.”

“Silevo is honored to be presented with The Solar Industry Award for our advancements with Triex technology,” said Zheng Xu, founder and CEO of Silevo. “Now that we’ve begun commercial production of Silevo cells with greater than 21 percent conversion efficiency, this award reflects our determination to develop and bring to market a technology that makes sustainable, widespread solar adoption viable for the energy market. Silevo’s Triex technology is the first offering that can bring significant balance of system (BOS) savings, as well as an increased energy yield.”

The Triex technology is the first hybrid solar solution that mixes “high-performance crystalline silicon N-type substrates, thin-film passivation layers and a unique tunneling oxide layer—all in a single solar module, that is powered by breakthrough ‘tunneling junction’ architecture.” When these three materials are combined they allow the Triex module to deliver very high conversion efficiency and very competitive costs.

“The Solar Industry Awards continue to reward and recognize the people, process and products that make up the global PV and solar industry,” said David Ridsdale, editor in chief of Solar International. “Now in their fourth year, the awards are voted for by the industry ensuring these awards are decided by the industry. Silevo’s Triex technology exhibits the complexity and comprehensiveness of the PV market and was selected as a winner due to the confidence of the awards selection panel in recognition of the perceived value Silevo’s technology can add to the industry.”

Sunday, September 30, 2012

Restricting nuclear power has little effect on the cost of climate policies

ScienceDaily (Oct. 1, 2012) — By applying a global energy-economy computer simulation that fully captures the competition between alternative power supply technologies, a team of scientists analyzed trade-offs between nuclear and climate policies. According to the study, strong greenhouse-gas emissions reductions to mitigate global warming have a much larger economic impact than nuclear policy. Incremental costs due to policy options restricting the use of nuclear power do not significantly increase the cost of even stringent greenhouse-gas emissions reductions.

"Questions have been raised if restricting nuclear energy -- an option considered by some countries after the accident in Fukushima, Japan -- combined with climate policies might get extremely expensive. Our study is a first assessment of the consequences of a broad range of combinations of climate and nuclear policies," lead author Nico Bauer says. Restrictions on nuclear power could be political decisions, but also regulations imposed by safety authorities. The main concern, in sum, is that power generation capacities would have to be replaced while fossil fuels become costly due to a price on CO2 emissions.

"However, in case of restricted use of nuclear power, the flexibility of allocating a long-term carbon budget over time enables higher near-term emissions due to increased power generation of natural gas," Bauer says. Along with demand reductions and efficiency improvements, these provisions could help fill the gap on electricity. The price of natural gas is projected to decrease due to demand reductions, according to the study. Decommissioning existing plants will also avoid refurbishment costs for expanding lifetimes of old nuclear power plants.

As a result, early retirement of nuclear power plants would lead to cumulative global gross domestic product losses (GDP) that amount to about 10 percent of climate policy costs. If no new nuclear capacities are allowed, the costs would amount to 20 percent.

For their study, the scientists looked into different nuclear power policies. These cover a range of scenarios from "Renaissance," with a full utilization of existing power plants, a possible refurbishment for a lifetime expansion and investments in new nuclear power capacities, to "Full exit," with a decommissioning of existing power plants and no new investments. They contrasted each scenario with climate policies implemented via an inter-temporal global carbon budget which puts a price on carbon emissions. For the budget, the cumulative CO2 emissions from the global energy sector were limited to 300 gigatons of carbon from 2005 until the end of the century. This represents a climate mitigation policy consistent with the target of limiting global warming to 2 degrees Celsius.

"A surprising result of our study is the rather little difference between a 'Renaissance' or a 'Full exit' of nuclear power in combination with a carbon budget when it comes to GDP losses," Bauer says. While the 'no policy case' with a nuclear phase-out and no carbon budget has only a negligible effect on global GDP, the imposition of a carbon budget with no restrictions on nuclear policy implies a reduction of GDP that reaches 2.1 percent in 2050. The additional phase-out of nuclear power increases this loss by about 0.2 percentage points in 2050 and hence has little additional impact on the economy, because the contribution of nuclear power to electricity generation can be substituted relatively easily by alternative technology options, including the earlier deployment of renewables.
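To make those percentages concrete, here is a small illustrative calculation; the baseline GDP figure is a made-up placeholder, and only the 2.1 percent and 0.2 percent loss figures come from the study:

```python
# Illustrative arithmetic on the study's 2050 loss estimates.
# The baseline GDP value is hypothetical; the 2.1% (carbon budget)
# and 0.2% (additional nuclear phase-out) figures are from the study.
baseline_gdp = 100.0  # index units, hypothetical

carbon_budget_loss = baseline_gdp * 0.021    # GDP lost to climate policy alone
phase_out_extra_loss = baseline_gdp * 0.002  # extra loss from the phase-out

# The phase-out adds roughly a tenth on top of climate-policy costs,
# matching the "about 10 percent" figure for early retirement quoted earlier.
relative_share = phase_out_extra_loss / carbon_budget_loss
print(round(relative_share, 2))  # 0.1
```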

Saturday, September 29, 2012

The Importance Of Alternative Energy Sources

One of the biggest challenges the human race faces today is finding and using alternative energy sources. The push for alternative means of generating electricity has been around for over 100 years, but as long as oil- and coal-fired generators produced power inexpensively, the world put the search for alternative energy sources on the back burner.

We cannot procrastinate any longer, however, as many of the earth's natural resources, such as oil, are being depleted.

A Short History Lesson on Alternative Energy Sources

The need for an alternate energy source was rekindled in the 1970s by the oil shortage that created lines at gas stations and produced critical shortages throughout the United States. The search for alternate power generation is not limited to finding new ways of powering vehicles; supplying cheap power for homes and industries is a continuous endeavor. There have been many advances in the search for alternative energy sources, but the price of the power produced still remains too high.

Wind, water and sun are touted as renewable energy resources with claims that once the technology is perfected, making it more cost effective, they can replace the need for oil and natural gas to turn turbines in the generation process. Even geothermal power production is one of the alternate energy sources being researched.

The Source Of The Energy Depends on The Location

For many people, the switch to alternative energy sources is a matter of finding the type of alternative power that works best in their particular geographical location. Persons who live in areas with limited sun exposure, for example, may not be too excited about using solar panels to supply power. When the sun stays hidden for days at a time, the town can go dark.

In some of those areas, wind is not a problem, as it seems to blow nearly every day. Using wind power to turn turbines to generate electricity can work there, but may not work in areas that experience less windy conditions. Another of the alternative energy sources, hydropower, uses the power of rivers to turn generators, but the cost of the infrastructure to get power from the generator to the people may still be too high for long-range use.

With the three major alternative energy sources continuing to be researched and advanced, the need for an answer to our problem becomes more evident every time a person receives an electric bill or fills a car with gas.

The resources that we have left on the planet are running out. Do your part: stay educated on the latest changes in technology, and keep up to date with the issues at hand to learn what you can do to help solve the energy crisis.

Madison Greene is like anyone else. She is interested in saving the planet and finding alternatives for depleting natural resources. She has done diligent research and found a book that teaches you how you can safely make your own alternative fuel [] at home for less than one dollar a gallon. Learn how you can save money and the planet by visiting: []

Friday, September 28, 2012

Why is Alternative Energy Important?

The global economy is today far more intertwined than perhaps ever before. A natural calamity or terrorist attack affects the stock indexes of stock markets around the world. The mortgage crisis in the US has played havoc with the stock markets globally. Recession in the US and Europe causes a slide in the global economy. If these are rather obvious and agreed upon, it can also be accepted that the rising oil prices in the world market are pinching the lay consumer worldwide. Economic development and the consumerist culture have led to a spurt in the purchase of cars in several countries of late, notably India and China. Besides, car sales in the US and other developed nations show little sign of decreasing.

Environmental Concerns on the Rise

The environmental lobby is, in today's world, alive and kicking, if not influential as well. Indeed, there is growing awareness about the need for caring for the environment, among both governments and citizens. Global warming, the threat of an Arctic meltdown, and the like have acquired sinister overtones owing to unusual climatic phenomena being experienced in various parts of the world in recent years. When it doesn't rain in the rainy season, winter barely occurs, or it rains in deserts, you are wont to sit up and wonder just what the dickens might be happening. Hence, when the burning of fossil fuels is decried as adding to pollution, and depleting the ozone layer, it does acquire a negative tinge to it in the collective psyche.

Limited Reserves of Fossil Fuels

Fossil fuels have been formed over a period spanning millions of years. The entire known reserves of fossil fuels worldwide cannot last beyond perhaps a few centuries, and that is assuming constant prices and no price paid for the entailed environmental degradation. Even in view of the scarcity of the supply of fossil fuels as a source of power, it makes eminent sense to be actively considering alternative forms of energy. Once the economic, environmental and political issues are factored in, the quest for feasible alternative sources of power takes on an element of urgency.

Nations Toying with Alternative Energy

In recent years, we have been witnessing this urgent search for alternative energy the world over, whether it is France's adoption of nuclear energy, India's massive development of hydroelectricity, the Dutch fondness for wind energy, or the "corn for energy" experimental project in the US. Dependence on certain foreign nations for oil carries the risk of letting them hold the growth of national economies hostage. Wars have traditionally been fought over scarce resources, be they as varied as gold, land, spices, water or oil. The world might perhaps witness fewer conflicts if the crucial energy requirements of the various nations began to be met in greater proportion by renewable, locally prevalent and environment-friendly modes of alternative energy.

Inevitable Proliferation of Alternative Energy

To sum up, alternative energy is important because fossil fuels exist in limited reserves. Moreover, the consumption of fossil fuels is associated with unsavory environmental and medical consequences. Rising oil import bills are causing various nations to actively explore alternative forms of energy in a bid to buoy up their respective economies. Several types of alternative energy are easily available, commercially viable and practically applicable. The energy scenario the world over can be expected to turn much more variegated, innovative and conscious of environmental concerns in the times to come.


Thursday, September 27, 2012

Way Out Alternative Energy Sources

When we think of viable alternative energy sources, we think of solar energy, wind power and even wave power. But have you ever considered the possibility of making energy out of old pills, used diapers and molten salt? Alternative energy from garbage and molten salt is no longer the stuff of fiction; it has been successfully tried and tested.

Alternative energy prevents waste and emissions

Since companies in the UK have to comply with commercial EPC regulations, alternative energy is not only for bunny huggers; everyone can benefit from producing energy from new sources. The energy needed to power our daily activities can be transferred from almost anything. We are sitting with all this potential energy but without the means to tap into it.

Some innovative scientists and inventors have found ways to convert waste, which is usually difficult to dispose of, into energy, giving us the possibility of killing two birds with one stone. Not only does this lessen the risk of chemical leakage into water tables and reduce landfill problems, but it also takes the strain off our stocks of fossil fuels and oil.

Use an energy pill

Medication seems one of the most unlikely sources of alternative energy. Expired medication is notoriously difficult to get rid of. If people flush their medication down the toilet, it becomes part of the water system, and if it's thrown away as landfill, the potentially harmful chemicals seep into the soil and eventually into water tables. Governments are becoming increasingly concerned about pharmaceutical water pollution, as scientists have found high levels of many drugs in water sources. Some of these are hormones, which can cause cancers and mutations in animals.

A company in the USA that specialises in the disposal of expired medications sends expired drugs to an energy company that converts waste products into energy. Six and a half million pounds of pills were disposed of in 2006, producing enough energy to power hundreds of homes for over a year.
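The "hundreds of homes" claim can be sanity-checked with rough waste-to-energy arithmetic. In the sketch below, only the 6.5 million pounds figure comes from the article; the heating value, plant efficiency, and household consumption are all assumed typical values:

```python
# Sanity check on "hundreds of homes for over a year". Only the pill
# tonnage comes from the article; heating value, plant efficiency, and
# household consumption are assumed typical values, not reported figures.
PILLS_LB = 6_500_000
KG_PER_LB = 0.4536
HEATING_VALUE_MJ_PER_KG = 10.0  # assumed, roughly municipal-waste grade
PLANT_EFFICIENCY = 0.20         # assumed electric conversion efficiency
HOME_KWH_PER_YEAR = 10_000.0    # assumed average household usage

thermal_mj = PILLS_LB * KG_PER_LB * HEATING_VALUE_MJ_PER_KG
electric_kwh = thermal_mj * PLANT_EFFICIENCY / 3.6  # 1 kWh = 3.6 MJ
homes = electric_kwh / HOME_KWH_PER_YEAR

print(round(homes))  # 164 homes under these assumptions
```

Under these assumed values the result lands in the low hundreds, so the reported claim is at least plausible.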

Fill up the tank with dirty diapers

Another alternative energy company was looking for consistently produced waste sources from which to make diesel fuel. The answer came in the form of used diapers. The company now transforms 30,000 tons of diapers into 10,000 tons of diesel fuel, at 50 US cents per liter, in a low-emission closed system.
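The mass-to-volume conversion implied by those figures can be sketched as follows; the diesel density is a standard textbook value, not something from the article:

```python
# Rough conversion of the quoted diaper-to-diesel figures into yield and
# volume. Diesel density (~0.84 kg/L) is an assumed textbook value.
DIAPERS_TONNES = 30_000
DIESEL_TONNES = 10_000
DIESEL_DENSITY_KG_PER_L = 0.84  # assumption, typical for diesel fuel

mass_yield = DIESEL_TONNES / DIAPERS_TONNES               # fraction by mass
liters = DIESEL_TONNES * 1000 / DIESEL_DENSITY_KG_PER_L  # tonnes -> kg -> L

print(round(mass_yield, 2))    # 0.33, i.e. about a third by mass
print(round(liters / 1e6, 1))  # 11.9 million liters of diesel
```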

Molten salt versus fossil fuel

We've already looked to the elements, the air, sun and water, for alternative energy sources but a rocket building company and solar energy company in North America thought out of the box and came up with a method of making energy out of molten salt.

Molten salt has commonly been used to make alloys, but analysts say that the idea of combining solar power and molten salt is promising. Solar energy is collected by tilting mirrors that direct sunlight onto the molten salt, heating it to over 1,000 degrees Fahrenheit; the steam produced is used to drive a turbine. The molten salt can be reused to repeat the process, and no emissions are produced.
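The heat stored in the salt follows the sensible-heat relation Q = m · c · ΔT. A minimal sketch, assuming typical nitrate "solar salt" properties (the specific heat, tank mass, and temperature swing below are all assumptions, not figures from the article):

```python
# Sensible-heat sketch for molten-salt storage, Q = m * c * dT.
# Specific heat (~1.5 kJ/kg*K) and the temperature swing are assumed
# typical values for nitrate "solar salt", not figures from the article.
m = 1000.0   # kg of salt, a hypothetical one-tonne tank
c = 1.5      # kJ/(kg*K), assumed specific heat
dT = 260.0   # K temperature swing (e.g. 290 C -> 550 C), assumed

q_kj = m * c * dT
q_kwh = q_kj / 3600.0  # 1 kWh = 3600 kJ

print(round(q_kwh, 1))  # 108.3 kWh of thermal storage for this tank
```

The salt's ability to hold that heat for hours is what lets such plants keep producing steam after sunset.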

Drive with drink

Thousands of bottles of smuggled alcohol are confiscated in Sweden each year, and authorities have come up with a brilliant alternative energy use for it - by using alcoholic cocktails as the biogas source to power cars and buses. It seems like a noble use for the large quantity of hard liquor produced for consumption each year.

Alternative energy sources give us a way out of the oil crisis and let us feel all warm and fuzzy inside about saving the environment with renewable energy. However, at the moment, it's difficult to make alternative energy resources accessible to everyone, as new energy systems are expensive to produce. 

Frances wrote this article for National Energy Consultants Commercial EPC.