Featured Past Articles

What many consumers don’t realise is how complex the process of successfully bringing flowers from field to retail shelf is. Some of the most challenging parts of implementing a seamless floral supply chain are maintaining the cold chain, increasing speed to market, and controlling cost.

One of the main factors affecting the floral supply chain is the need for proper temperature control throughout transportation. Thirty-eight percent of the fresh flowers available in Europe are grown in Kenya and exported to the EU. When flowers are harvested and cut in the fields, they must immediately be cooled to make the flower dormant and prevent blooming. Next, flowers enter customs in preparation for their flight to European airport hubs, exposing them to warmer temperatures while they wait for clearance.

The flowers are cooled again on arrival at warehouses in preparation for their journey on refrigerated trucks to forward distribution centers. Keeping flowers cool throughout this process reduces the loss of vase life caused by these breaks in the cold chain.
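
As a purely hypothetical illustration of how such breaks might be caught, the sketch below scans timestamped temperature readings from a shipment logger and reports any window where the load sat above a chosen threshold. The 4 degree Celsius threshold, the Reading type and the sample log are all illustrative assumptions; the article specifies no particular monitoring setup.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Reading:
    """One timestamped temperature reading from a shipment logger (hypothetical format)."""
    timestamp: datetime
    temp_c: float

def cold_chain_breaks(readings, max_temp_c=4.0):
    """Return (start, end) windows where the logged temperature stayed above max_temp_c.

    The 4 C default is an illustrative threshold; real targets vary by species and shipper.
    """
    breaks, start = [], None
    for r in readings:
        if r.temp_c > max_temp_c and start is None:
            start = r.timestamp                     # excursion begins
        elif r.temp_c <= max_temp_c and start is not None:
            breaks.append((start, r.timestamp))     # excursion ends
            start = None
    if start is not None:                           # still warm at the last reading
        breaks.append((start, readings[-1].timestamp))
    return breaks

# Sample log: cooled after harvest, warm during customs, re-cooled at the warehouse.
log = [
    Reading(datetime(2024, 2, 1, 6, 0), 2.5),
    Reading(datetime(2024, 2, 1, 9, 0), 7.8),
    Reading(datetime(2024, 2, 1, 12, 0), 3.1),
]
print(cold_chain_breaks(log))   # one break, from 09:00 until re-cooling at 12:00
```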

Most parties involved in the supply chain, from retail customers to growers and shippers, seek ways to compress time in the floral supply chain. The increased demand associated with floral holidays can create challenges if stakeholders are not aligned behind a strong plan. As merchandising within non-traditional channels such as online, drugstores and convenience stores continues to grow, the ability to deliver straight loads from your facility to a customer distribution center is just the start.

Working in collaboration with strong transportation providers in floral distribution, ones with consolidation points in multiple origins, forward distribution capabilities, and the ability to execute direct-store-delivery solutions, will make this time-sensitive process more manageable.

Cost reduction will always be an area of focus within the floral supply chain, and increasing attention is being paid to pricing transparency, not just on product costs but on the amount spent to get product to market. In response, new and untraditional methods of getting floral products to end-users are being discussed. For example, consolidation tactics such as matching floral products with other temperature-compatible products help to increase volume, reduce cost, and provide more control.

Beyond affecting the bottom line, an improperly managed cold chain can lead to serious quality concerns. By tailoring supply chain best practices for sensitive cold chains, shippers around the world can mitigate risk and better control the final outcome.

Successfully managing the cold chain from field to consumer will always be a crucial component of delivering high-quality fresh products, and floral is no different. Customers who work with transportation companies that understand the intricacies of their floral program, have a deep understanding of their needs and expectations, and can provide visibility and management over the entire cold chain by utilising advanced technology solutions will give themselves the best opportunity to succeed.

On November 12th, project SMART was officially kicked off in Rwanda. The kick-off is part of the economic mission to Rwanda led by the Dutch Minister for Foreign Trade and Development Cooperation, Minister Ploumen. The aim of the project is to enable farmers in Rwanda to develop a sustainable and profitable business in which productivity and food safety are key. By combining Dutch technology and expertise with the local expertise of farmers and the knowledge of institutes, both parties see a clear win-win situation.

During the trade mission to Rwanda, Minister Ploumen stressed the importance of cooperation between Dutch suppliers, such as Bosman Van Zaal and Hoogendoorn, and local entrepreneurs running small-scale farms in Rwanda. SMART has projects in South Africa and Rwanda, focusing on different types of technological solutions for large, mid- and small-scale companies alike. The small-scale farm Rwanda Best is the project partner in Rwanda. The project is co-financed by the Dutch Ministry of Foreign Affairs. Project partners Bosman Van Zaal, a greenhouse constructor, and Hoogendoorn Growth Management, an automation supplier, will realize the greenhouse.

Kenya’s European Union (EU) market share is about 38%

Kenya’s flower industry is the oldest and largest in Africa, contributing 1.29% of national GDP. The sector has continued to record growth in the volume and value of cut flowers exported every year. According to the Kenya National Bureau of Statistics, in 2013 the floriculture industry exported 124,858 tons of flowers valued at Kshs 46.3 billion. It is estimated that over 500,000 people, including over 90,000 flower farm employees, depend on the floriculture industry.

The main production areas are around Lake Naivasha, Mt. Kenya, Nairobi, Thika, Kiambu, Athi River, Kitale, Nakuru, Kericho, Nyandarua, Trans Nzoia, Uasin Gishu, Kajiado and Eastern Kenya.

Kenya is the leading exporter of cut roses to the European Union (EU), with a market share of about 38%. In the United Kingdom, supermarkets are the main retail outlets. Other growing export destinations include Japan, Russia and the USA.

Going by the investment trends of the last couple of years, one is left to ask a simple question: is Kenya shifting towards direct wholesale markets and away from the auctions?

The trend of investment in flowers, both in new farms and in extensions of existing ones, has been 80% uplands and 20% lowlands. This means the target production is long-stem, big-head varieties. However, the market is slowly shifting from the auction to the relatively more lucrative wholesale trade. Available statistics show that fewer than five farms grow purely for the auction, with most of the remaining farms doing 65% direct sales and 35% auction. So, which way for Kenyan flowers?

Most of the growers have shifted to the wholesale markets, and some are serving retail customers who need the long-stem, big-head varieties. Given this trend, one is left asking: why are we shifting to these varieties? Is it because our traditional short-stem, small-head market is dwindling? Or is it because the other market is bigger and has less competition? The answer is both yes and no. Why? We all need to accept that Ecuador has slowly encroached on our traditional markets, and looking at their quality, the competition may be stiffer.

A marketing guy at a flower farm asked me the other day what the greatest marketing challenges facing the flower business today are, and whether I think these challenges are different from what they have been historically. That’s a great question, and I’ve pondered it for a while.

Ultimately, what I decided is that the challenges themselves really aren’t any different today than they have been historically, and the same basic marketing principles that have always applied still apply.

So, what are the greatest marketing challenges facing the flower business today, and why are they really the same as they ever were? In my opinion, at a high level, the challenges are:

  1. Identifying the most viable target markets.
  2. Effectively positioning what you have to offer against the competition.
  3. Selecting the right communication channels to appeal to your identified market.

Over 95% of exported cut flowers are transported by air, which makes securing air cargo space a priority. To cushion against this constraint, large exporters have been able to exercise some control over space through joint ventures with freight forwarders.

The freight forwarders inspect and document flower and temperature conditions, palletize packed flowers, store them in cold storage facilities at the airport, clear them through export customs, obtain phytosanitary certification, and load the cargo onto commercial or charter flights. Some forwarders also offer cooled transport for growers.

The idea that “Climate science is settled” runs through today’s popular and policy discussions. Unfortunately, that claim is misguided. It has not only distorted our public and policy debates on issues related to energy, greenhouse-gas emissions and the environment, but has inhibited the scientific and policy discussions that we need to have about our climate future.

My training as a computational physicist—together with a 40-year career of scientific research, advising and management in academia, government and the private sector—has afforded me an extended, up-close perspective on climate science. Detailed technical discussions during the past year with leading climate scientists have given me an even better sense of what we know, and don’t know, about climate. I have come to appreciate the daunting scientific challenge of answering the questions that policy makers and the public are asking.

The crucial scientific question for policy isn’t whether the climate is changing. That is a settled matter: The climate has always changed and always will. Geological and historical records show the occurrence of major climate shifts, sometimes over only a few decades. We know, for instance, that during the 20th century the Earth’s global average surface temperature rose 1.4 degrees Fahrenheit.

Neither is the crucial question whether humans are influencing the climate. That is no hoax: There is little doubt in the scientific community that continually growing amounts of greenhouse gases in the atmosphere, due largely to carbon-dioxide emissions from the conventional use of fossil fuels, are influencing the climate.

There is also little doubt that the carbon dioxide will persist in the atmosphere for several centuries. The impact today of human activity appears to be comparable to the intrinsic, natural variability of the climate system itself.

Rather, the crucial, unsettled scientific question for policy is, “How will the climate change over the next century under both natural and human influences?” Answers to that question at the global and regional levels, as well as to equally complex questions of how ecosystems and human activities will be affected, should inform our choices about energy and infrastructure.

But—here’s the catch—those questions are the hardest ones to answer. They challenge, in a fundamental way, what science can tell us about future climates.

Even though human influences could have serious consequences for the climate, they are physically small in relation to the climate system as a whole. For example, human additions to carbon dioxide in the atmosphere by the middle of the 21st century are expected to directly shift the atmosphere’s natural greenhouse effect by only 1% to 2%. Since the climate system is highly variable on its own, that smallness sets a very high bar for confidently projecting the consequences of human influences.
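
A quick back-of-the-envelope check of that 1%-to-2% figure is possible using round numbers that are commonly quoted but are assumptions here, not values from the article: the natural greenhouse effect supplies very roughly 150 W/m2 of surface forcing, and a full doubling of CO2 adds roughly 3.7 W/m2.

```python
# Rough check of the "1% to 2%" figure using commonly quoted round numbers
# (assumptions, not values given in the article).
natural_greenhouse_w_m2 = 150.0   # approximate forcing from the natural greenhouse effect
co2_doubling_w_m2 = 3.7           # approximate extra forcing from doubling CO2

shift = co2_doubling_w_m2 / natural_greenhouse_w_m2
print(f"Relative shift for a full doubling: {shift:.1%}")
# ~2.5%; mid-century additions fall short of a full doubling, so the
# resulting shift lands in the 1%-2% range quoted above.
```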

A second challenge to “knowing” future climate is today’s poor understanding of the oceans. The oceans, which change over decades and centuries, hold most of the climate’s heat and strongly influence the atmosphere. Unfortunately, precise, comprehensive observations of the oceans are available only for the past few decades; the reliable record is still far too short to adequately understand how the oceans will change and how that will affect climate.

A third fundamental challenge arises from feedbacks that can dramatically amplify or mute the climate’s response to human and natural influences. One important feedback, which is thought to approximately double the direct heating effect of carbon dioxide, involves water vapor, clouds and temperature.

But feedbacks are uncertain. They depend on the details of processes such as evaporation and the flow of radiation through clouds. They cannot be determined confidently from the basic laws of physics and chemistry, so they must be verified by precise, detailed observations that are, in many cases, not yet available.

Beyond these observational challenges are those posed by the complex computer models used to project future climate. These massive programmes attempt to describe the dynamics and interactions of the various components of the Earth system—the atmosphere, the oceans, the land, the ice and the biosphere of living things. While some parts of the models rely on well-tested physical laws, other parts involve technically informed estimation. Computer modeling of complex systems is as much an art as a science.

For instance, global climate models describe the Earth on a grid that is currently limited by computer capabilities to a resolution of no finer than 60 miles. (The distance from New York City to Washington, D.C., is thus covered by only four grid cells.) But processes such as cloud formation, turbulence and rain all happen on much smaller scales. These critical processes then appear in the model only through adjustable assumptions that specify, for example, how the average cloud cover depends on a grid box’s average temperature and humidity. In a given model, dozens of such assumptions must be adjusted (“tuned,” in the jargon of modelers) to reproduce both current observations and imperfectly known historical records.
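
The parenthetical arithmetic is easy to verify. The sketch below uses an approximate 225-mile New York-to-Washington distance, a round number assumed here rather than taken from the article, to show how coarsely a 60-mile grid samples that corridor.

```python
# How many 60-mile grid cells span the New York-Washington corridor?
# The ~225-mile straight-line distance is a round-number assumption.
grid_resolution_miles = 60
nyc_to_dc_miles = 225

cells = nyc_to_dc_miles / grid_resolution_miles
print(f"{cells:.2f} grid cells")   # ~3.75, i.e. "only four grid cells"
# Clouds, turbulence and rain occur on scales far below 60 miles, so they
# must enter the model through tuned sub-grid assumptions, as described above.
```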

We often hear that there is a “scientific consensus” about climate change. But as far as the computer models go, there isn’t a useful consensus at the level of detail relevant to assessing human influences. Since 1990, the United Nations Intergovernmental Panel on Climate Change, or IPCC, has periodically surveyed the state of climate science. Each successive report from that endeavor, with contributions from thousands of scientists around the world, has come to be seen as the definitive assessment of climate science at the time of its issue.

For the latest IPCC report (September 2013), its Working Group I, which focuses on physical science, uses an ensemble of some 55 different models. Although most of these models are tuned to reproduce the gross features of the Earth’s climate, the marked differences in their details and projections reflect all of the limitations that I have described.

For example:
• The models differ in their descriptions of the past century’s global average surface temperature by more than three times the entire warming recorded during that time. Such mismatches are also present in many other basic climate factors, including rainfall, which is fundamental to the atmosphere’s energy balance. As a result, the models give widely varying descriptions of the climate’s inner workings. Since they disagree so markedly, no more than one of them can be right.
• Although the Earth’s average surface temperature rose sharply by 0.9 degree Fahrenheit during the last quarter of the 20th century, it has increased much more slowly for the past 16 years, even as the human contribution to atmospheric carbon dioxide has risen by some 25%. This surprising fact demonstrates directly that natural influences and variability are powerful enough to counteract the present warming influence exerted by human activity.

Yet the models famously fail to capture this slowing in the temperature rise. Several dozen different explanations for this failure have been offered, with ocean variability most likely playing a major role. But the whole episode continues to highlight the limits of our modeling.
• The models roughly describe the shrinking extent of Arctic sea ice observed over the past two decades, but they fail to describe the comparable growth of Antarctic sea ice, which is now at a record high.
• The models predict that the lower atmosphere in the tropics will absorb much of the heat of the warming atmosphere. But that “hot spot” has not been confidently observed, casting doubt on our understanding of the crucial feedback of water vapor on temperature.
• Even though the human influence on climate was much smaller in the past, the models do not account for the fact that the rate of global sea-level rise 70 years ago was as large as what we observe today—about one foot per century.
• A crucial measure of our knowledge of feedbacks is climate sensitivity—that is, the warming induced by a hypothetical doubling of carbon-dioxide concentration. Today’s best estimate of the sensitivity (between 2.7 degrees Fahrenheit and 8.1 degrees Fahrenheit) is no different, and no more certain, than it was 30 years ago. And this is despite a heroic research effort costing billions of dollars.
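
To make the last point concrete, here is a minimal sketch applying the standard logarithmic rule of thumb that equilibrium warming scales with the base-2 logarithm of the CO2 concentration ratio. The 2.7-to-8.1 degree Fahrenheit endpoints come from the text above; the logarithmic form and the sample concentration ratios are textbook conventions assumed for illustration, not the author’s calculation.

```python
import math

def warming_f(sensitivity_f_per_doubling, concentration_ratio):
    """Equilibrium warming (F) under the standard logarithmic approximation:
    dT = S * log2(C / C0), where S is the climate sensitivity per doubling."""
    return sensitivity_f_per_doubling * math.log2(concentration_ratio)

# Sensitivity endpoints quoted above; the concentration ratios are illustrative.
for ratio in (1.5, 2.0):
    low, high = warming_f(2.7, ratio), warming_f(8.1, ratio)
    print(f"CO2 x{ratio}: {low:.1f} F to {high:.1f} F")
# A full doubling (x2.0) recovers the quoted 2.7-8.1 F range exactly; the
# threefold spread at any ratio is what "no more certain than it was 30
# years ago" means in practice.
```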

These and many other open questions are in fact described in the IPCC research reports, although a detailed and knowledgeable reading is sometimes required to discern them. They are not “minor” issues to be “cleaned up” by further research. Rather, they are deficiencies that erode confidence in the computer projections. Work to resolve these shortcomings in climate models should be among the top priorities for climate research.

Yet a public official reading only the IPCC’s “Summary for Policy Makers” would gain little sense of the extent or implications of these deficiencies. These are fundamental challenges to our understanding of human impacts on the climate, and they should not be dismissed with the mantra that “climate science is settled.”

While the past two decades have seen progress in climate science, the field is not yet mature enough to usefully answer the difficult and important questions being asked of it. This decidedly unsettled state highlights what should be obvious: Understanding climate, at the level of detail relevant to human influences, is a very, very difficult problem.

We can and should take steps to make climate projections more useful over time. An international commitment to a sustained global climate observation system would generate an ever-lengthening record of more precise observations. And increasingly powerful computers can allow a better understanding of the uncertainties in our models, finer model grids and more sophisticated descriptions of the processes that occur within them. The science is urgent, since we could be caught flat-footed if our understanding does not improve more rapidly than the climate itself changes.

A transparent rigor would also be a welcome development, especially given the momentous political and policy decisions at stake. That could be supported by regular, independent, “red team” reviews to stress-test and challenge the projections by focusing on their deficiencies and uncertainties; that would certainly be the best practice of the scientific method. But because the natural climate changes over decades, it will take many years to get the data needed to confidently isolate and quantify the effects of human influences.

Policy makers and the public may wish for the comfort of certainty in their climate science. But I fear that rigidly promulgating the idea that climate science is “settled” (or is a “hoax”) demeans and chills the scientific enterprise, retarding its progress in these important matters. Uncertainty is a prime mover and motivator of science and must be faced head-on. It should not be confined to hushed sidebar conversations at academic conferences.

Society’s choices in the years ahead will necessarily be based on uncertain knowledge of future climates. That uncertainty need not be an excuse for inaction. There is well-justified prudence in accelerating the development of low-emissions technologies and in cost-effective energy-efficiency measures.

But climate strategies beyond such “no regrets” efforts carry costs, risks and questions of effectiveness, so nonscientific factors inevitably enter the decision. These include our tolerance for risk and the priorities that we assign to economic development, poverty reduction, environmental quality, and intergenerational and geographical equity.

Individuals and countries can legitimately disagree about these matters, so the discussion should not be about “believing” or “denying” the science. Despite the statements of numerous scientific societies, the scientific community cannot claim any special expertise in addressing issues related to humanity’s deepest goals and values. The political and diplomatic spheres are best suited to debating and resolving such questions, and misrepresenting the current state of climate science does nothing to advance that effort.

Any serious discussion of the changing climate must begin by acknowledging not only the scientific certainties but also the uncertainties, especially in projecting the future. Recognizing those limits, rather than ignoring them, will lead to a more sober and ultimately more productive discussion of climate change and climate policies. To do otherwise is a great disservice to climate science itself.

Dr. Koonin was undersecretary for science in the Energy Department during President Barack Obama’s first term and is currently director of the Center for Urban Science and Progress at New York University. His previous positions include professor of theoretical physics and provost at Caltech, as well as chief scientist of BP, where his work focused on renewable and low-carbon energy technologies.