Solving Wicked Problems Poses New Challenges in Today's Funding Environment

3P Author ID
95
Primary Category
Content

On the shoulders of giants

In his 1962 address at Rice University on the nation’s fledgling space program, President John F. Kennedy famously said: “We choose to go to the Moon in this decade and do the other things, not because they are easy, but because they are hard; because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one we intend to win … ”

The leadership and rhetorical skill of JFK inspired a critical decade for government-funded R&D, as well as public-private research partnerships. And it worked. In 1962 the U.S. had only begun to send astronauts into low Earth orbit. On July 20, 1969, Neil Armstrong walked on the moon: a phenomenal achievement and a testament to focused, purpose-driven research and development.

Some five decades later, a private space company launches resupply rockets to the International Space Station; Moore’s Law now makes possible pocket-sized computing power far surpassing the room-sized computers that guided Eagle to its 1969 moon landing; and nearly the sum total of human knowledge is instantaneously available through a global communications network.

Arguably, none of these technologies would exist today were it not for the inspirational fire Kennedy lit on that warm September day in 1962. But we also confront a host of interconnected social and environmental challenges: “wicked problems” that defy simple or straightforward solutions.

In many ways these wicked problems are a consequence of our own ingenuity. The innovation that brought about the technology we now take for granted is part of the solution to these wicked problems, and part of the problem itself.

Funding research and development

You get what you pay for, or what you get depends on who pays for it

Government spending on R&D as a percentage of GDP peaked at 2.9 percent in 1964. More than 60 percent of all R&D expenditures were funded from federal government coffers. In the decades since then, that balance of R&D spending flipped. In Q1 2015, only 0.8 percent of GDP went to R&D.

Corporate R&D funding now represents 65 percent of all expenditures, a flip from the 1960s. Depending on whom you ask, this is either a good thing or the death of science as we know it, a divide that reflects the complexities of research and development.

On the one hand, the shifting balance of R&D funding toward industry helps bring new technologies to market more quickly. Applied and developmental research hastens the conversion of knowledge into useful products, creates jobs and economic growth, and gives investors a return on their money.

On the other hand, the declining share of federal funds for basic research has far-reaching consequences for the kind of wicked problems closing in around us. A 2015 MIT report titled The Future Postponed: Why Declining Investment in Basic Research Threatens a US Innovation Deficit warns of an "investment gap" in basic research. Marc Kastner, who chaired the committee that wrote the report, notes that science funding in the U.S. is "the lowest it has been since the Second World War as a fraction of the federal budget." In 1968, 10 percent of the federal budget was allocated to basic research. By 2015 it had dropped to 4 percent. Kastner contends this sharp decline poses serious threats to the nation's future.

The MIT report outlines 15 case studies in which important research suffers from a lack of adequate funding, including neurodegenerative diseases such as Alzheimer’s, infectious disease, robotics, and cybersecurity.

Contrast this with the $19 billion Facebook spent to acquire WhatsApp. Corporate acquisitions of tech startups can bring products to market quickly, but, many argue, at the expense of basic research and science. R&D begins to look more like M&A, or corporate venturing.

The economics of acquiring in-process R&D, maintaining a return for shareholders and meeting consumer demand, dictate where and how corporate R&D funds get spent.

The economics of R&D

Finding the next killer app

Applied research and development can push forward a new energy economy, improve the standard of living for tens of millions of people, and help put humanity on a path to sustainability. But it is basic research, where the payoff is uncertain and marketability lies far in the future, that makes possible the next "killer app."

Private funding of R&D, both philanthropic and corporate, is vital for converting new knowledge into benefits for society. The role of the private sector in research and development is implied in many of the recently adopted Sustainable Development Goals (SDGs). These are the wicked problems that we hope to solve. That we must solve.

Behind the shifting balance of R&D funding in the U.S. is perhaps the most vexing issue of all: a narrative of disdain for science, inside and outside the Beltway, and the need for a renewed public-private partnership in research and development.

In the end, R&D is a multistakeholder, interdisciplinary and collaborative endeavor. Funding for R&D has shifted dramatically since the days of Apollo, when government and industry forged a grand partnership to put a man on the moon. Can we do that again? That's how we solve the wicked problems.

Image credit: Wikimedia

3P ID
251078
Prime
Off

Wikileaks Gets Unabashedly Political

3P Author ID
8838
Primary Category
Content

For many critics, the once-heralded site for sharing information has become a political platform for its embattled founder and his vendetta against the U.S. Democratic Party -- and the losers are all of us.

Wikileaks has taken center stage in the U.S. presidential race. Over the past few months, the site released a slew of hacked emails from some of Hillary Clinton's closest advisors, most notably John Podesta. The timing -- and the fact that the emails may have been obtained by Russian hackers working with the country's authoritarian president, Vladimir Putin -- caused many to wonder what happened to the progressive Wikileaks of the past.

Things have changed so much that even one of the people who helped put Wikileaks on the map -- Edward Snowden -- criticized the platform in a recent tweet.

Modest curation would mean vetting both the source of information and the impact of the timing of its release. Wikileaks seems to be doing little of either.

The name itself is a misnomer. Despite the use of the word “wiki” in its title -- a la Wikipedia, the global, crowdsourced information platform that operates under little oversight and almost complete community control -- Wikileaks remains in the control of its celebrity founder, Julian Assange, who has been holed up at the Ecuadorian Embassy in London since 2012.

Assange has a documented personal dislike for the Clintons. Thus, he is using his site -- which was conceived as a global, community tool -- as a political platform to influence the U.S. elections.

In fact, Wikileaks has become embarrassingly political. A few weeks ago, Assange promised “breaking” information during an “October surprise” press conference, which instead turned into a two-hour rant with little substantive information. What has come out since then is interesting -- who knew Hillary at one point considered a carbon tax? -- but it's certainly not game-changing. The timing, though -- just weeks before the election -- is incredibly suspect, as is the source of this information.

Want more evidence that Wikileaks has become blatantly political? Take a look at the Wikileaks Twitter feed. At first glance, it could be that of the Republican National Committee. Nearly every tweet is about Clinton, Obama or the election, and the feed doesn't take a particularly neutral tone.

On top of that are numerous retweets of Fox News articles, and the use of unflattering pictures of Hillary that could have come straight from the Trump campaign.

As the Guardian noted last week, this is a striking departure from what Wikileaks once was – a platform that received praise from the left for shedding light on government surveillance and providing a real, powerful service to netizens around the world.

"The seeming alliance between Trump and WikiLeaks is an astonishing role reversal," wrote David Smith, a Washington, D.C.-based correspondent for the Guardian. "In 2010 it was lauded by transparency campaigners for releasing, in cooperation with publications including the Guardian, more than a quarter of a million classified cables from U.S. embassies around the world. WikiLeaks founder Julian Assange became a hero to many."

The truth is: We don't need to shed a tear for the site. In fact, Wikileaks may be unnecessary. The biggest leak this year is not the politically charged materials released by Assange, but the Panama Papers. That release utilized modern technology and a team of independent, global journalists to vet, verify and publish information in the public interest, in what many considered a model of journalistic integrity. Rather than focusing on a single country going through an election, reports were released in several countries simultaneously.

Perhaps Wikileaks could learn something from that undertaking. There's a right way to release information, and what is happening right now with Assange's once powerful site is anything but.

Image credit: Pamela Drew via Wikimedia Commons

3P ID
251029
Prime
Off

Fukushima Radiation in the Pacific (Revisited)

3P Author ID
365
Primary Category
Content

My recent post on the spread of radiation stemming from the Fukushima nuclear accident drew quite a few questioning comments. Specifically, the article suggested that radiation from the accident was drifting across the Pacific at levels high enough to cause alarm. It turns out that alarm was exaggerated, though there is still reason for concern. I appreciate the feedback. I acknowledge that I relied on sources with which I was unfamiliar and posted some information that has since been shown to be incorrect. I apologize.

To all who publish online, beware: Bad news travels fast. It gives credence to the old saying, “A lie can travel halfway around the world while the truth is still putting its pants on.” This is especially true on the Internet. I truly hope no one was harmed by this information. Now begins the task of earning back your trust, which is hard-won and quickly lost.

I think the best way to start is to post a revised story on what is actually happening in the waters around Fukushima, Japan, as well as those farther afield.

Let’s start by addressing the points made in the original story.

For starters, the initial source, PeakOil, used a bogus NOAA graphic to sensationalize the story, having carefully scrubbed out the legend showing that the colors actually represented wave heights at the peak of the tsunami, not radiation levels as the site would have you believe. I checked this image out, noticed the deception and chose not to use it in my post. Still, I continued to take the central thrust of the story as true.

Several people went to the generally reliable Snopes site to question the story and found confirmation of their suspicions. The blatant misuse of the NOAA chart is clearly called out and tossed into the trash where it belongs. An interesting thing about the Snopes post, however, is that while the site prominently displays a text clipping stating that “each day 300 tons of radioactive waste seeps into the ocean,” it never specifically addresses that claim.

I dug further and found that number actually comes from a quote by Yushi Yoneyama, an official with the Japanese Ministry of Economy, Trade and Industry, which oversees energy policy, as quoted by Reuters (generally considered unassailable) and elsewhere. In 2013, Yoneyama said, "We think that the volume of water [leaking into the Pacific] is about 300 tonnes a day." Of course, anyone could be wrong, but who am I to question Reuters or a Japanese government official? I don’t.

That’s not to say Japanese government officials, or officers of TEPCO, can always be counted on to tell the truth, but their interest has generally been to minimize the extent of the damage, not to embellish it.

As for that amount of leakage: 300 tonnes works out to roughly 80,000 gallons of radioactive water. That sounds like quite a bit, but compared to the volume of the Pacific Ocean, it’s not a lot at all. Still, when that much leaks out each day, it adds up to about 29 million gallons over the course of a year. And it’s been five years now.
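For anyone who wants to check my math, the conversion is simple. Here is a quick sketch, assuming the leak is essentially water, so one tonne occupies one cubic meter:

```python
# Back-of-the-envelope check: 300 tonnes of leaked water per day, in U.S. gallons.
# Assumes the leak is essentially water: 1 tonne = 1 cubic meter = 1,000 liters.
LITERS_PER_TONNE = 1_000
GALLONS_PER_LITER = 0.264172  # U.S. gallons per liter

tonnes_per_day = 300
gallons_per_day = tonnes_per_day * LITERS_PER_TONNE * GALLONS_PER_LITER

print(f"Per day:  {gallons_per_day:,.0f} gallons")        # ~79,000 gallons
print(f"Per year: {gallons_per_day * 365:,.0f} gallons")  # ~29 million gallons
```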

Even today, TEPCO only acknowledges that radioactive water threatens to flood out of the plant and into the ocean. Until recently, the company denied that any water had leaked from the plant at all, even after independent researchers from the University of Tokyo found fish contaminated with high levels of radiation near the plant, raising major concerns for local fishermen.

The story regarding radiation reaching the Canadian West Coast, which claimed levels of iodine-131 were 300 times background levels, was recently updated with an editor’s statement that the original figures were incorrect.

Reports of a wildlife biologist (Alexandra Morton) pulling hundreds of herring out of the waters off British Columbia with blood coming out of their eyes and gills have not been discredited. However, there is no evidence linking this observation directly to radiation from Fukushima or anywhere else.

The claim that radiation levels found in tuna off the Oregon coast had tripled also appears to be legitimate. However, those levels are still substantially below what would be considered a health threat.

Having sorted through all that, I would summarize as follows: Contaminated water continues to enter the ocean from the Fukushima site in significant volume. Traces of radiation have been found in various locations around the Pacific. It also appears that the levels detected at this time do not indicate any immediate threat to humans outside of Japan. That being said, our knowledge of the long-term impacts of these types of radiation on the oceans, and on ourselves, is far from complete.

Upon review, most of the statements in the original piece were in fact true, but I acknowledge the overall sense was that of an exaggerated cause for concern. What this shows is how easily a group of facts taken out of context can become a convincing story — a lesson for all of us. Putting it on the Internet is like putting a match to a dry grassland.

What is far less clear is what the actual levels are and where they can be found. What makes writing about this issue so difficult, and even dangerous, is the combination of two things: It’s a frightening subject, and there is very little solid information being made available.

In my efforts to bring in some more solid facts, I reached out to Greenpeace, which is monitoring the situation carefully. The group sent me some additional information in a press release with links to reports published outside the U.S.

Greenpeace’s famed ship, the Rainbow Warrior, sampled the waters around Fukushima in February of this year with former Japanese Prime Minister Naoto Kan on board. The crew found that radiation in the seabed off Fukushima “is hundreds of times above pre-2011 levels.” They also found levels in nearby rivers that were “up to 200 times higher than ocean sediment.”

Expressing concern, Ai Kashiwagi, energy campaigner for Greenpeace Japan, said: “These river samples were taken in areas where the Abe government is stating it is safe for people to live. But the results show there is no return to normal after this nuclear catastrophe.”

The areas sampled include the Niida River in Minami Soma, where readings measured as high as 29,800 becquerels per kilogram (Bq/kg) for radio-cesium. (For those new to the subject, a becquerel is the unit of radioactivity: one radioactive decay per second.) More samples taken at the estuary of the Abukuma River in Miyagi prefecture, more than 90 kilometers north of the Fukushima Daiichi plant, found levels in sediment as high as 6,500 Bq/kg. To put that in perspective, recorded levels in the seabed near the plant before the disaster were 0.65 Bq/kg.
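To make those readings concrete, it helps to express them as multiples of the pre-disaster baseline. A quick calculation using only the figures quoted above:

```python
# Reported radio-cesium levels in sediment, expressed as multiples of the
# pre-disaster baseline. All figures are the Bq/kg values quoted above.
baseline = 0.65          # Bq/kg, seabed near the plant before the disaster
niida_river = 29_800     # Bq/kg, Niida River in Minami Soma
abukuma_estuary = 6_500  # Bq/kg, Abukuma River estuary, ~90 km north

print(f"Niida River:     {niida_river / baseline:,.0f} times the old baseline")     # ~45,800x
print(f"Abukuma estuary: {abukuma_estuary / baseline:,.0f} times the old baseline")  # ~10,000x
```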

Kendra Ulrich, senior global energy campaigner for Greenpeace Japan, explained: “The sheer size of the Pacific Ocean combined with powerful complex currents means the largest single release of radioactivity into the marine environment has led to the widespread dispersal of contamination.”

Greenpeace argues that the order scheduled to allow people to return to these areas next March “cannot be permitted to stand.” The group claims that “these ecosystems cannot simply be decontaminated.”

Greenpeace's report, which came out in July of this year, concludes by saying the impact of the accident will persist for “decades to centuries.”

So, while we have not yet seen the global-scale consequences some predicted, the situation is indeed bad and getting worse. TEPCO continues to build steel tanks, at a rate of three per week, to house contaminated groundwater awaiting decontamination. But according to this PBS documentary, the company will run out of room for more tanks sometime next year. The gravity-fed water filtration system has been effective in removing most contaminants, except for tritium. Tritium is a relatively weak radionuclide with a half-life of about 12.3 years, which means it takes roughly a century for all but a fraction of a percent of it to decay away.
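To see where that century figure comes from, radioactive decay follows a simple halving rule: after each half-life, half of what remains is gone. A minimal sketch using tritium's roughly 12.3-year half-life:

```python
# Fraction of tritium remaining after t years, given a ~12.3-year half-life.
def fraction_remaining(t_years: float, t_half: float = 12.3) -> float:
    return 0.5 ** (t_years / t_half)

for years in (12.3, 25, 50, 100):
    print(f"After {years:>5} years: {fraction_remaining(years):.1%} remains")
# After 100 years (about eight half-lives), less than 0.5 percent is left --
# which is why "about a century" is fair shorthand for "effectively gone."
```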

Molten nuclear fuel still remains in three of the reactors, and the site will not be fully stabilized until it is removed. But the radioactivity level in those reactors is far too high for people to enter. TEPCO plans to develop robots to go in and retrieve the molten fuel; the company estimates that retrieval will begin in 2020.

In closing, while the level of concern suggested in the prior piece was overstated, I maintain that the situation at Fukushima is far from resolved and that it remains a serious concern, particularly in Japan. I further maintain that any plans to continue expanding nuclear power must include an in-depth review of what has happened in Fukushima, with the understanding that this story is far from over.

Image credit: DigitalGlobe, courtesy of Greenpeace.

3P ID
250560
Prime
Off

Summer Olympics Can't Compete in a Warming World

3P Author ID
100
Primary Category
Content

By Anna Johansson

In August, all eyes were on the Summer Olympic Games, the world’s favorite sporting event. The Games happen only once every four years, and athletes train for hours a day to compete.

But scientists speculate that if climate change continues on its current trajectory, there won’t be a future for the Summer Olympic Games. This warning comes from a recent study published in the Lancet, which models temperatures from now until 2085.

According to the findings, only eight cities outside of Western Europe will be able to comfortably host the summer games in 70 years. Everywhere else, temperatures will be too high for safe athletic competition.

Researchers from California, New Zealand and Cyprus used a combination of temperature and humidity data from the past several years to create a model that would show likely outcomes into the future. They also factored in physical abilities in relation to heat for events that require high physical exertion outdoors.

For example, runners competing in the U.S. Olympic Marathon Team Trials in Los Angeles were subjected to temperatures in the high 70s, a record high for that time of year. Extreme heat was a common factor in many outdoor Olympic trials across the globe, and overheating became a major challenge for competitors during the final events in Rio. Several athletes overheated in the marathon and triathlon, forcing them to drop out.

The temperature and humidity data gathered in this study primarily came from the cities most likely to host the Olympics in the future. It focused on the Northern Hemisphere because that’s where 90 percent of the world’s population lives. Most of that part of the planet is expected to become unbearably hot, at least where athletes are concerned.
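This article can't reproduce the researchers' exact model, but heat-stress work of this kind typically folds temperature and humidity into a single index. Purely as an illustration, here is one commonly used simplified approximation of wet-bulb globe temperature (WBGT); to be clear, this is the Australian Bureau of Meteorology shortcut, not the formula from the Lancet study:

```python
import math

def simplified_wbgt(temp_c: float, rel_humidity_pct: float) -> float:
    """Simplified wet-bulb globe temperature (WBGT) estimate from air
    temperature and relative humidity -- an illustrative approximation,
    not the Lancet study's model."""
    # Water vapor pressure in hPa.
    e = (rel_humidity_pct / 100.0) * 6.105 * math.exp(17.27 * temp_c / (237.7 + temp_c))
    return 0.567 * temp_c + 0.393 * e + 3.94

# Example: a 34 C afternoon at 60 percent humidity -- conditions a marathon
# organizer would have to take very seriously.
print(f"WBGT estimate: {simplified_wbgt(34, 60):.1f} C")  # ~35.7 C
```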

All in all, the findings were not encouraging. “Increasing restrictions on when, where, and how the Games can be held owing to extreme heat are a sign of a much bigger problem,” the research team wrote, hinting at the high risk that many cities will have to cancel outdoor events due to elevated temperatures.

“If you’re going to be spending billions of dollars to host an event, you’re going to want to have a level of certainty that you’re not going to have to cancel it at the last minute," they continued.

As you can imagine, the Summer Olympics wouldn’t be the only sporting event to suffer. “High-visibility international athletic events such as the summer Olympics represent just a small fraction of heavy exertion outdoors,” the study reads. Sporting events all over the world may face cancellation if the heat makes it impossible to play.

Outdoor sporting events, including the Olympics, bring in billions per year in revenue for the participating countries. Millions of people are involved in both the production and the playing of these globally beloved sports.

That should put a lot of pressure on us to ensure conditions are hospitable, but Mother Nature is pushing back. After years of enduring inconsiderate house guests, the world seems to be telling its residents that it won’t take any more.

“Climate change is going to force us to change our behavior from the way things have always been done,” said Kirk Smith of the University of California, Berkeley, the lead researcher on the study. “This includes sending your kids outside to play soccer or going out for a jog.” Physical activity of all kinds will have to be kept indoors, and large outdoor sporting events may become all but impossible.

Unfortunately, the signs of global warming may not be enough to convince the world to make a change. It’s more likely that corporations will turn to more destructive methods of fixing the problem, such as building enormous playing fields with indoor climate control.

The amount of non-renewable resources and energy needed to run such facilities would be astronomical. Others may ignore the problem completely: Rather than hold the Summer Olympics in August, when they are traditionally scheduled, perhaps organizers will move them to the late fall or early spring, when temperatures might still be bearable outside.

The study finishes with a plea of sorts to protect athletes and ordinary people alike as the climate rapidly drives median temperatures upward. If certain changes don’t get made, simply enjoying life as we know it will become more and more of a challenge. Boosting awareness and pushing for change is the only path to a world that can still witness and participate in sporting events and other outdoor activities 70 years from now.

Image credit: Flickr/Shawn Carpenter

Anna Johansson is a freelance writer, researcher, and business consultant from Olympia, WA. A columnist for Entrepreneur.com, HuffingtonPost.com and more, Anna specializes in entrepreneurship, technology, and social media trends. Follow her on Twitter and LinkedIn.

3P ID
250500
Prime
Off

Why Science-Based Targets Are Necessary For Meeting Paris Climate Agreement Goals

3P Author ID
100
Primary Category
Content

By Tobias Schultz

More than 190 countries have formally signed the Paris Climate Agreement since it was introduced in December 2015. Many have since unveiled plans regarding how they will reduce their greenhouse gas emissions.

The problem? Collectively, the proposed GHG reductions won’t achieve the Paris agreement’s goal of holding the rise in global mean temperature to no more than 2 degrees Celsius above pre-industrial levels. Instead, we are on track to see an increase of almost 4 degrees Celsius by 2100.

Beyond the obvious moral imperative to preserve our planet for present and future generations, your company’s suppliers and customers -- not to mention your own operations -- can and will be affected in many ways by a changing climate. Global warming means a world of ever-increasing uncertainty across supply chains, as rising sea levels, drought and other climate-related circumstances threaten to disrupt manufacturing and especially agriculture.

Consider the effect of current drought conditions in California, where farmers must either spend more for water or allow a portion of their acreage to go fallow. One choice pinches the farmers’ pocketbooks; the other directly disrupts purchasers’ supply chains. Or, if you’re a manufacturer active in a country like China, the increasingly severe weather that experts predict will likewise wreak havoc on your supply chains.

How can we craft mitigation policies to tackle problems of such enormity?

Fortunately, climate scientists have provided a thorough, credible roadmap in reports prepared by the Intergovernmental Panel on Climate Change (IPCC) and the United Nations Environment Program (UNEP). Each lays out the level of emissions reductions necessary to stabilize the climate below the +2C target.

One group of businesses has already begun this process by introducing the Science Based Targets initiative (SBT). These companies have publicly committed to reduction targets in line with the +2C pathway.

If you’re looking to set a science-based target for your own company, it must align with 50 percent reductions in both CO2 and methane emissions, plus an 80 percent decrease of black carbon, by 2035. Based on the nature of your business, you may want to take more aggressive steps toward sector-specific targets. For example, power generators including public utilities bear a greater burden for reducing CO2 emissions, while the agriculture industry must meet significantly higher methane reduction targets.
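A useful first exercise is to translate those long-horizon targets into the constant year-over-year cut they imply. A minimal sketch (the targets are the ones cited above; the 20-year span assumes a 2015 baseline, which is my assumption for illustration):

```python
# Constant annual percentage cut that compounds to a total reduction over n years.
def implied_annual_cut(total_reduction: float, years: int) -> float:
    return 1 - (1 - total_reduction) ** (1 / years)

YEARS = 20  # assumed 2015 baseline to the 2035 horizon cited above
for pollutant, target in (("CO2", 0.50), ("methane", 0.50), ("black carbon", 0.80)):
    print(f"{pollutant}: {target:.0%} by 2035 -> ~{implied_annual_cut(target, YEARS):.1%} per year")
# CO2 and methane work out to roughly 3.4 percent per year; black carbon to roughly 7.7 percent.
```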

These efforts present challenges as well as opportunities. My next post will dive into the details of setting up SBTs for your company, with an overview of Scope 1, 2 and 3 emissions. My concluding post will explain why integrating emissions reduction targets for short-lived climate pollutants is essential to establishing the credibility of your corporate sustainability platform. Check back with TriplePundit next week for the second installment.

Image credit: Pexels

Tobias Schultz is Manager of Corporate Sustainability Services for SCS Global Services, where he designs and implements corporate sustainability programs for clients, including the development of quantitative life cycle assessments (LCAs) and the analysis of the environmental performance of global supply chains. SCS is a worldwide leader in independent, third-party environmental certification.

Schultz can be reached at tschultz@scsglobalservices.com

3P ID
250546
Prime
Off

Connecting Your Startup With Others to Help Your Business Grow

3P Author ID
100
Primary Category
Content

By Gadiel Morantes

Blink and you might miss an added wrinkle to the startup game. Any number of factors contributes to its evolution, but there will always be one constant: the spirit of competition.

Plans and intentions for startups pop up constantly as the market becomes more and more saturated with founders trying to make their mark. A massive and diverse ecosystem of startups has developed. And, as in any environment, some species prosper while others wither away.

Just look at this telling statistic: Only 1 in 10 startups survives. There are several reasons for those failures, but one potential lifesaver I’d like to focus on is connecting with other startups and the role it plays in survival.

Sure, you’re fighting to keep your vision alive. But sometimes, the key to startup success is finding a balance between outworking the competition and working with it.

Stay alive by staying in the system

When you work with others and foster positive relationships, you lessen the chance of failure and open the door to plenty of good things coming your way. In my experience, startups generally gain the following invaluable benefits when they begin collaborating with and supporting other startups:


  • Shared resources: Scaling properly is crucial for startups. If you have strong connections, you can start small and share with others in a similar position in your network. Finding ways to share manufacturing space and equipment with another company while you’re both still growing can save a great deal of overhead.

  • Resume sharing: You’ll want to make scalable decisions in hiring as well. Comparing notes with another company on available talent in the field is a smart way to find out who would be a great employee for you.

  • A sounding board: Sharing ideas with a competitor might seem like it’s revealing your plan of attack to the enemy, but more often than not, there’s enough business to go around if both companies distinguish themselves. When that’s the case, sharing ideas about similar fields can lead to growth on both sides.

Startups are fundamentally unpredictable. But luckily, when you encounter an obstacle, someone else has probably been there already.

Finding your place in the startup system

It helps both sides when you share stories of surprising defeats or unlikely victories, but these benefits only come through strong bonds with other startups and their founders. And it takes work for those bonds to form.

Here are four tactics I’ve used to help young businesses find common ground with colleagues and thrive in a startup system that, at times, can be cutthroat:

1. Get acquainted with the system. When I network, I try to build relationships with other companies offering complementary services. That way, they refer clients who might need accounting services to me, and I do the same for clients who might want their services. It’s a huge win-win, but it only happens because we get along and make an effort to keep in touch.

Cast a wide net, be gregarious, and seek out people with similar interests. Put reminders on your calendar to catch up with them. It’s common business knowledge, but it provides so many benefits for those who take advantage of it.

2. Give time and value to others. I’ve seen mergers that began as two founders spending time together and trading ideas — sometimes, it’s that simple. Once you’ve established yourself and have knowledge and expertise to offer, start sharing it.

Spend time with industry colleagues and demonstrate that you’re informed and experienced. It’s a great way to build trust with your contacts, while also opening the doors to potential partnerships and mergers with other interested companies.

3. Put like-minded people together. A great way to build your network is by helping others construct theirs. By providing introductions to people who will benefit from meeting each other, you bring demonstrable value and position yourself as a well-connected resource. Again, that kind of worth will be remembered and rewarded.

4. Encourage healthy competition. When we coach founders on their pitch decks, a common issue is the clarity of their value proposition. Founders need to develop differentiation by showing how their product is unique.

One way to figure this out is to look at the competition. Networking with competitors often reveals uniqueness through contrast. There is often enough business for everybody, so when you discuss business with competitors, don’t see it as divulging secrets. Instead, look at it as a way to share different approaches to a common issue.

The startup environment can be a rat race, but it’s that very fluidity that sometimes keeps your company alive. Young companies can be competitive without being contentious. Whether your startup is well established or just beginning, try to find ways to make connections and help the startup ecosystem grow.

Image credit: Flickr/Dennis Skyley

Gadiel Morantes is the chief revenue officer at Early Growth Financial Services, which addresses the lack of on-demand financial support available to startups. With more than 15 years of experience in sales, marketing, and operations, Morantes helps founders streamline the relationship between the sales and business departments and coaches early-stage companies on setting up an optimal infrastructure for success. Follow him on LinkedIn.

3P ID
250663
Prime
Off

Ties to Labor and Human Rights Among Palm Oil Leaders and Laggards

3P Author ID
367
Primary Category
Content

When talking about how the private sector impacts labor rights worldwide, both positively and negatively, the center of such a discussion must start with the global palm oil industry. The sector has grown rapidly over the past 10 to 15 years as food processing and personal care product companies increasingly coveted this ingredient, which can improve and standardize products’ consistency as well as lengthen their shelf lives. According to WWF, global palm oil production was approximately 24 million tons in 1999; by 2019, that amount more than doubled to approximately 50 million tons; in the next few years, watch for annual production to surpass 70 million tons.

Indeed, other industries also touch our everyday way of life, such as seafood, timber and paper, and cotton. But as WWF pointed out, almost half of all consumer goods on the planet include palm oil as an ingredient, which in turn is why this ingredient affects so many people across its entire supply chain -- from farm to factory and then, finally, at the retail check-out counter.

Furthermore, palm oil has become the focal point of the global debate over how to best manage the environment while securing the most basic human freedoms. Concern is now mainstream: In the popular Netflix series "Grace and Frankie," a scene over the palm oil controversy leads one character to tell the lead, played by Lily Tomlin, to get over her hypocritical stance on the industry: “I bet I could open up your purse right now and find three things that have palm oil in them.”

Yet those countless things made from palm oil are wreaking havoc in communities across the world, say many NGOs, as countries including Indonesia, Malaysia, Guatemala and Colombia become the setting for what is often called “conflict palm oil.” These nations are among several emerging economies witnessing forced labor along with an increase in land grabbing and unfair wages.

Although business and consumer awareness of these problems has surged, many of the world’s leading food and consumer packaged goods companies continue to engage in business relationships with palm oil producers associated with labor and human rights violations -- thereby putting their reputations, and the lives of far too many people, at risk.

The fundamental problem is that far too many suppliers are refusing to abide by the principles of Free, Prior, and Informed Consent (FPIC). Under international law, FPIC grants local communities the right to accept or decline any projects that could impinge on lands they either own, have occupied for generations or use to secure their economic livelihood.

So, who are the leaders in the push for more fairly produced palm oil, and who is falling behind?

Deborah Lapidus, campaign director with the global environmental campaign organization Mighty, told us it is difficult to identify which companies are leading on the procurement of ethically sourced palm oil. Many companies, she insisted, remain stubbornly opaque when it comes to disclosing where they procure palm oil and how their supply chains operate.

“There is a whole set of brand-name companies that we consider major laggards when it comes to the implementation of their ‘no exploitation' policies, because they have failed to publish their plans and progress reports,” Lapidus told TriplePundit. “And they never respond to our inquiries when we flag problems in their supply chain.”

Companies Mighty has called out for their refusal to engage with environmental and human rights NGOs include some of the world’s most popular food brands: Krispy Kreme, Tim Hortons, Starbucks, Kraft Heinz, ConAgra, Burger King and Yum! Brands (which runs the Pizza Hut, KFC and Taco Bell restaurant chains). “All of these companies failed to respond to three serious inquiries of potential policy violations in the past few months,” Lapidus said.

The situation is especially dire as labor and human rights abuses in several countries continue to worsen, she explained, despite a boost in consumer awareness about palm oil's impact and hence a much louder outcry.

Mighty’s inquiries included palm oil suppliers such as BLD Kirana on Malaysian Borneo, which allegedly encroached on land long owned by local communities; Indonesia’s IOI Group, which continues to be accused of conducting human rights and environmental abuses; and Korindo, a Korean-owned palm oil producer that activists say occupied land owned by families for generations while deforesting areas on which communities were economically dependent for centuries.

Rainforest Action Network (RAN) is one of the most active NGOs monitoring the palm oil sector and advocating for the fairer treatment of workers within global palm oil supply chains. It is also highly critical of the industry.

“Unfortunately, the palm oil supply chain continues to be riddled with the worst forms of human and labor rights abuses,” said RAN’s Emma Lierley. “While activists have had great success in raising the profile of conflict palm oil in the last few years, what has often been overlooked is the conditions for workers and communities on the ground.”

One company RAN has repeatedly singled out for intense criticism is PepsiCo. The New York-based beverage and snack food giant is included on RAN's Snack Food 20, a group of companies the NGO says are behind both deforestation and the lack of protections for workers within the palm oil industry.

Two years ago, PepsiCo promised a strong commitment to the sourcing of sustainable and responsible palm oil. RAN counters that PepsiCo’s 2014 declaration, which the company updated last year, fails to secure strong human rights protections for workers and their local communities.

The culprit is Indofood, a PepsiCo joint venture partner that is one of the most prominent palm oil growers in the world. It's the largest Indonesian food processing company and the sole maker of PepsiCo’s food products in Indonesia. The list of abuses, from RAN’s point of view, runs long and includes:


  • The company’s reliance on contract workers, who often bring their families to palm oil plantations in their desperation to meet impossible quotas

  • Wages that can be up to 75 percent lower than what is generally accepted within the industry

  • A blind eye turned to child labor

  • Unsafe working conditions, ranging from the excessive use of pesticides to the refusal to provide safety equipment to workers

These ongoing controversies led RAN and other NGOs to urge the suspension of Indofood from the Roundtable on Sustainable Palm Oil (RSPO), the body tasked with defining global standards for ethical and responsibly sourced palm oil. RAN also led a campaign to thwart PepsiCo’s efforts to attract millennial workers at recruiting events on university campuses. Student activists confronted PepsiCo recruiters, saying young professionals will not work for a company with links to documented human rights abuses.

Are any companies driving positive change on how palm oil is grown, produced and distributed? When it comes to progress on supply chain issues, WWF touts the efforts of companies including Danone, General Mills, Kraft Heinz and Unilever in its most recent global palm oil scorecard.

And Lapidus of Mighty noted a turnaround within some of the larger palm oil suppliers in Southeast Asia. “I think the companies that have an open and transparent grievance mechanism for workers should get credit for being further along than their peers,” she said. Those companies include producers Wilmar, Musim Mas, GAR and Asian Agri.

Emma Lierley of RAN, however, insists the entire palm oil industry has a long road ahead before it can genuinely say it is a sector committed to the fair treatment of workers and local communities. “There are no true leaders on these issues, as conflict palm oil is too often associated with indigenous and community land grabbing, forced and child labor, poverty wages, union busting, and reckless exposure to toxic chemicals,” Lierley said.

Image credit: Mighty/Flickr

3P ID
251050
Prime
Off

U.S. Ups the Ante on Renewable Hydrogen Production

3P Author ID
4227
Primary Category
Content

When it comes to the sparkling green hydrogen economy of the future, the U.S. isn't letting any grass grow under its feet. The Energy Department's National Renewable Energy Laboratory now plans to lead a collaborative effort to accelerate the development of renewable hydrogen.

A total of six national laboratories, dubbed the HydroGEN Advanced Water Splitting Materials Consortium, will work together on faster, cheaper and more efficient ways to recover hydrogen from water. The end goal is to power fuel cells for electric vehicles and other zero-emission applications.

Public-private partnerships for sustainable hydrogen


Earlier this year, the Energy Department introduced the concept for a hydrogen-powered pathway to the "deep decarbonization" of the domestic economy. The new consortium takes it to the next level by making more public resources available to private-sector stakeholders.

NREL described the consortium in a press release issued earlier this week. The idea is to ramp up the development of the new commercial water-splitting techniques that are emerging in the nation's laboratories.

The list includes advanced electrolysis, photoelectrochemical processes, and solar thermochemical processes.

That shortlist is highly selective for a reason: Each of those technologies can be powered by solar and/or wind energy, and that is the key to sustainable hydrogen production. Without these clean, renewable resources, hydrogen is stuck on the fossil fuel track.
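For context on why cheap renewable electricity is the linchpin here: splitting water has a hard thermodynamic floor that no clever catalyst can beat. A quick sketch of the textbook numbers (standard conditions; real systems need considerably more):

```python
# Thermodynamic floor for water electrolysis: 2 H2O -> 2 H2 + O2.
# Textbook values at standard conditions, for context only.
FARADAY = 96_485            # coulombs per mole of electrons
GIBBS_PER_MOL_H2 = 237_100  # joules per mole of H2 (standard Gibbs free energy)
ELECTRONS_PER_H2 = 2

min_voltage = GIBBS_PER_MOL_H2 / (ELECTRONS_PER_H2 * FARADAY)
print(f"Minimum cell voltage: {min_voltage:.2f} V")  # ~1.23 V

# Energy per kilogram of hydrogen (molar mass ~2.016 g/mol) at that floor.
kwh_per_kg = GIBBS_PER_MOL_H2 / 2.016e-3 / 3.6e6
print(f"Floor: ~{kwh_per_kg:.0f} kWh per kg of H2")  # ~33 kWh/kg; practical electrolyzers use more
```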

The new director of the consortium is Dr. Huyen N. Dinh of the NREL Chemistry and Nanoscience Center. Dr. Dinh had this to say about the need for public resources to step in:

"HydroGEN brings together capabilities that can only be found in the national lab system and makes them easily available to material developers in academia and industry. Our research strategy integrates computational tools and modeling, material synthesis, process and manufacturing scale-up, characterization, system integration, data management, and analysis to accelerate advanced water splitting material development."

More group hugs for U.S. taxpayers


The taxpaying public can also take credit for a broader Obama administration initiative that makes the national laboratories' resources available to the business community, with the goal of accelerating the transition to hydrogen.

That initiative was introduced earlier this year as the Energy Materials Network. Part of its mission is to develop the sophisticated new materials needed to provide catalysts for water-splitting.

As part of that network, HydroGEN will work in these areas:


  • Making novel national lab capabilities, expertise, techniques, and equipment relevant to advanced water-splitting materials more accessible to external stakeholders, including researchers in industry, academia, and other laboratories.

  • Establishing robust online data portals that capture and share the results of non-proprietary research.

  • Facilitating collaboration between researchers working on the three water-splitting pathways and addressing common materials challenges and resource needs, such as high-throughput synthesis techniques and auxiliary component design.

That's just the tip of the advanced materials iceberg. Along with HydroGEN, the Energy Department established a whole raft of other research consortia aimed at pushing the hydrogen economy envelope.

The areas of focus include hydrogen storage, solar conversion efficiency and biomass conversion (biomass is another pathway to renewable hydrogen), among others.

Who's gonna pay for all this?


The HydroGEN initiative is designed to operate with about $10 million per year. The money will go to upgrading labs and providing new grant opportunities for innovators outside of the laboratory system. And the ripple effects could pump billions into the national economy.

For a real-life demonstration of the impact public investments can have on foundational energy research, just take a look at recent history. The wind and solar industries are accelerating rapidly, thanks to foundational research and financial backing for commercializing new technologies.

On the other hand, the $10 million is "subject to appropriations," according to the Energy Department. That means the future of HydroGEN is not assured, at least not until after the Nov. 8 presidential election.

Republican presidential candidate Donald Trump, like many who hold leadership positions in that party, is not a fan of renewable energy. It's a safe bet that entire research areas would be sloughed off under his administration.

Democratic nominee Hillary Clinton, in contrast, is poised to pick up where the Obama administration left off, with a slew of vigorous initiatives for decarbonizing the economy.

Based on the recent flurry of activity in the Energy Department, it looks like President Obama hopes renewable hydrogen will feature front and center in a Clinton administration.

Image: via U.S. Department of Energy.


3P ID
250952
Prime
Off

Columbia Scientist Discovers Plastic-Based Energy Storage Solution

3P Author ID
4227
Primary Category
Content

Lithium-ion batteries are still king in the energy storage market, but their shortcomings are holding back the transition to electric vehicles. One main problem is cost. Li-ion battery packs are expensive, and they push up the price of an EV. Their function as an energy storage unit also means they take up a lot of space, and their weight is a drag on efficiency.

The good news is that Li-ion batteries are constantly being improved. In one of the latest developments, a Columbia Engineering researcher tweaked the manufacturing process with the common plastic PMMA (polymethyl methacrylate). The result is a 10 to 30 percent jump in energy density, which translates into a smaller, lighter and less expensive energy storage product.

The Li-ion energy storage hiccup


The new development zeroes in on a hiccup that results from the conventional Li-ion manufacturing process. The problem comes up after the battery is completed, when it is charged for the very first time.

As described by Columbia researchers, part of the electrolyte transitions from a liquid to a solid during the first charge, and that solid adheres to one of the electrodes. That transition can reduce the energy of a Li-ion battery by 5 to 20 percent. It lowers the capacity of the battery and can interfere with its lifespan, too.

(For those of you new to the topic, the electrodes are the parts of the battery that receive and discharge an electrical current. The electrolyte is the medium that carries lithium ions between them.)

The numbers are even higher for more sophisticated, high-efficiency Li-ion batteries:

"The loss is approximately 10 percent for state-of-the-art negative electrodes, but can reach as high as 20 to 30 percent for next-generation negative electrodes with high capacity, such as silicon, because these materials have large volume expansion and high surface area."

One way to work around the problem is to add lithium-saturated materials during the manufacturing process to counterbalance the lithium "lost" during the first charge.
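The arithmetic behind that workaround is straightforward: if a known fraction of the lithium will be consumed on the first charge, the electrode needs a proportional surplus up front. A minimal sketch using the loss figures quoted above:

```python
# Extra lithium to preload so the design capacity survives the first-charge loss.
def lithium_surplus_needed(first_cycle_loss: float) -> float:
    # If a fraction f is lost, you need 1 / (1 - f) of the target amount,
    # i.e. a surplus of f / (1 - f).
    return first_cycle_loss / (1 - first_cycle_loss)

for label, loss in (("state-of-the-art electrode", 0.10), ("silicon-rich electrode", 0.30)):
    print(f"{label}: {loss:.0%} first-cycle loss -> ~{lithium_surplus_needed(loss):.0%} surplus")
# A 10 percent loss calls for ~11 percent extra lithium; a 30 percent loss, ~43 percent extra.
```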

However, lithium is unstable in ambient air. The conventional process requires a moisture-free manufacturing environment, which adds significant costs.

Plastic to the rescue


Columbia Engineering researcher Yuan Yang tackled the problem by developing a tightly focused way to protect lithium-based electrodes during the manufacturing process, eliminating the need to create a moisture-free environment in the factory. Here's the rundown from Columbia:
"In these electrodes, he protected the lithium with a layer of the polymer PMMA to prevent lithium from reacting with air and moisture, and then coated the PMMA with such active materials as artificial graphite or silicon nanoparticles."

Once the electrode comes into contact with the electrolyte, the PMMA dissolves away. There is no contact with ambient air during the entire process.

The new process is not ready for commercial development -- yet. The next steps involve refining the process to reduce the thickness of the PMMA coating, and scaling it up to high-volume production.

You can get all the details from the study, published online in the journal Nano Letters.

Energy storage and EVs


Getting back to the EV market: If Yang's new electrode makes it to market, auto manufacturers will have more flexibility. With a higher-capacity battery, an electric vehicle could have greater range without increasing the size and weight of the battery pack.

Another option is to shrink the size of the battery pack without sacrificing range.
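That tradeoff is easy to quantify. A back-of-the-envelope sketch using the 10 to 30 percent density gains reported above (the pack size and range are round numbers picked for illustration, not any real model's specs, and secondary effects like the lighter pack itself stretching range are ignored):

```python
# What a 10-30 percent jump in energy density buys an EV designer.
# Pack size and range are illustrative round numbers, not real vehicle specs.
base_pack_kwh = 60
base_range_miles = 200

for gain in (0.10, 0.20, 0.30):
    same_size_range = base_range_miles * (1 + gain)  # same-size pack, more energy
    pack_shrink = 1 - 1 / (1 + gain)                 # same energy, smaller pack
    print(f"+{gain:.0%} density: ~{same_size_range:.0f} miles from a same-size pack, "
          f"or the same {base_range_miles} miles from a {base_pack_kwh} kWh pack "
          f"~{pack_shrink:.0%} smaller and lighter")
```

Either direction, more range or a smaller pack, gives designers welcome room to maneuver.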

That could bring about some interesting options for auto manufacturers like Toyota. Last month I went on a "ride and drive" with Toyota on behalf of TriplePundit, and the company emphasized the extent to which plug-in hybrid EVs can satisfy the needs of most drivers, even with a relatively low driving range.

Toyota executives explained that the goal for the company's signature plug-in hybrid EV, the Prius, is to keep the battery pack relatively small in order to keep the price of the vehicle within an affordable range. (The company is dedicated to increasing fuel efficiency, too.)

Even with a small battery pack, the 2017 Prius can operate in full EV mode for the daily commute of up to 80 percent of U.S. drivers (the high end combines home and workplace charging).

Introducing a smaller energy storage unit with the same range would provide Toyota with an opportunity to keep pushing down the price of a Prius.

That would make the car even more affordable, and bring the zero-emission driving experience to more of the motoring public.

A smaller battery pack would also take up less space. That could provide hybrid manufacturers with an opportunity to introduce new design options and other aesthetic features that attract drivers to the EV experience.

Photo: 2017 Toyota plug-in Prius hybrid EV by Tina Casey.


3P ID
250763
Prime
Off

CDP: Business Commitments on Carbon Fall Short of Paris Climate Goals

3P Author ID
367
Primary Category
Content

Last year’s COP21 climate talks in Paris were historic for the level of commitment dozens of national governments made in order to limit global warming to 2 degrees Celsius this century.

But this treaty will go absolutely nowhere unless the international business community is on board. It is business that drives the economic activities that result in the greenhouse gas emissions causing climate change. But a punitive approach is not necessary: Recent years prove businesses understand that mitigating climate risk can generate economic opportunities.

A report that CDP issued this week, however, reveals that the world’s largest companies have a long way to go before society can come close to meeting COP21’s ambitious goals. In fact, the research suggests current company targets will help the world achieve only 25 percent of its targeted, science-based emissions reductions.

In fairness, the CDP analysis is based on what companies publicly disclosed during 2015, which means much of the data analyzed was released before the COP21 talks in December. CDP’s report is also dependent on companies replying to its questionnaire in the first place; considering the “survey fatigue” that understandably afflicts corporate sustainability executives, its report does not necessarily paint a complete picture.

Most likely, the next annual release of corporate sustainability goals, and CDP disclosures, will show a spike in commitments when compared to those of last year. Nevertheless, for now CDP’s survey indicates there is plenty of work that can be done on carbon pricing, investments in clean energy technologies and energy efficiency.

But progress is clearly underway. More companies proved that reductions in their environmental impact did not hurt their profits. CDP describes these firms as successful at “decoupling”: reducing their emissions by 10 percent annually while achieving at least 10 percent growth in revenues. Companies that decoupled their carbon emissions from their revenue have largely benefited financially because of reductions in their energy consumption, an increase in operational efficiency, or the development of new products that can result in a far healthier ledger.

The numbers alone should be a wake-up call for business: This group of companies in aggregate increased their revenues by 29 percent over the past five years, while decreasing their emissions by 26 percent, according to CDP estimates. Companies within this group are all over the map, and include the British retailer Sainsbury’s, India’s IT giant Wipro and the Japanese building materials manufacturer Lixil.
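Put those two aggregate figures together and the improvement in carbon intensity -- emissions per dollar of revenue -- is even more striking. A quick check using only the numbers above:

```python
# Carbon intensity change implied by CDP's aggregate figures:
# revenues up 29 percent and emissions down 26 percent over five years.
revenue_growth = 0.29
emissions_change = -0.26

intensity_change = (1 + emissions_change) / (1 + revenue_growth) - 1
print(f"Emissions per dollar of revenue: {intensity_change:.0%}")  # roughly -43%
```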

The largest takeaway from this report, however, is that COP21 may very well prove to be the catalyst that sparks more companies into action. This trend can be seen in the increasing number of companies making their commitments to climate action public via the We Mean Business platform. This coalition, in which organizations such as Business for Social Responsibility (BSR), Ceres, WWF and the Climate Group join CDP, is encouraging companies to take on initiatives such as science-based climate targets, the establishment of a carbon price for accounting purposes and investments in renewables.

In early 2015, only three U.S. companies declared commitments to such programs; by the end of the year, that number surged to 50. And it is not only North American and European companies that are taking action on climate change: CDP notes that large firms in Brazil, India and China also signed on to this agenda, and most encouragingly, many of these companies are within the carbon-intensive manufacturing sector.

The downside of CDP’s report is that it is top-heavy on testimonials, which are laden with narratives that are great for a public relations brochure, but cheapen CDP’s sobering case to push more companies to take action. The summary of L'Oréal’s sustainability report, Sharing Beauty with All, is one example that elicits more than a few eye-rolls. Those criticisms notwithstanding, the statistics demonstrate that CDP’s research offers plenty of substance.

And the bottom line is that many companies are feeling the heat from stakeholders who want bolder action on climate change, yet remain undecided on how to proceed. Such firms should consider internal carbon pricing, the report concludes. CDP estimates that 60 percent of companies surveyed have no immediate plans to implement one. But a carbon-pricing policy can help a company identify inefficiencies and eventually lead to investments in renewables and energy-efficient technologies, CDP asserted.

Microsoft, for example, is an early adopter when it comes to carbon pricing, and the company saved millions as a result. But at a higher level, the software multinational’s four-year-old carbon pricing initiative helped instill a culture of environmental awareness and innovation far beyond its headquarters in Redmond, Washington.
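Part of the appeal of an internal carbon fee like Microsoft's is how simple the mechanics are. A minimal illustrative sketch -- the price and the division figures below are hypothetical, not Microsoft's actual numbers:

```python
# Illustrative internal carbon fee: each division is charged for its emissions,
# and the pooled funds pay for efficiency and renewable energy projects.
# All figures are hypothetical, for illustration only.
PRICE_PER_TONNE = 10.0  # assumed internal price, USD per tonne CO2e

division_emissions_tonnes = {
    "data_centers": 120_000,
    "offices": 15_000,
    "business_travel": 8_000,
}

fees = {d: t * PRICE_PER_TONNE for d, t in division_emissions_tonnes.items()}
for division, fee in fees.items():
    print(f"{division}: ${fee:,.0f}")
print(f"Total fund for clean-energy projects: ${sum(fees.values()):,.0f}")
```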

If success breeds imitation, then CDP’s latest report offers plenty of case studies as to why a commitment to a lower-carbon business model is not only good for the environment, but can also secure a company’s leadership within its sector.

Image credit: CDP/Alain Buu (used with permission) 

3P ID
251008
Prime
Off