To save forests, researchers are hooking trees up to Twitter

Huge amounts of revealing data can be collected from sensors attached to trees.

Tim Rademacher, Harvard Kennedy School; Grace Field, University of Cambridge, and Kathy Steppe, Ghent University

In July 2018, a century-old red oak went live on Twitter. The account @awitnesstree, tweeting from the Harvard Forest in Petersham, Massachusetts, introduces itself in its bio:

Witnessing life as a tree in a changing environment for more than a century. Views are my own – sort of (data translated by scientists and communicators at HF).

Every few days, the tree updates its 9,118 followers. On February 24 2020 it posted: “The last 2 days were extremely hot for February. When is this heatwave going to end?”

The day before, it had complained even more about the unseasonable heat.

Now, after a hiatus due to COVID-related challenges, the Witness Tree is coming back online.

The tree’s messages are based on data from a suite of sensors on and around its trunk, using a real-time approach to tree monitoring pioneered by Witness Tree’s inspiration and sister project. That project, led by Ghent University, set up its first tweeting tree in 2016 and currently monitors sensor data from 21 trees across Belgium, Germany, India, the Netherlands and the UK.

The sensors fitted to Harvard’s Witness Tree include a ribbon embedded in its trunk to track water flow, a spring-loaded pin pushing against its bark to monitor shrinkage and swelling and a camera to capture leaf growth. Continuous data streams from these sensors tell us how the tree is affected by changes in its immediate environment. This technology is still in its infancy, but it shows exceptional promise.

Real-time sensors monitor the Witness Tree’s wellbeing.

By analysing data from Witness Tree and its sister project, we have already learned that drought can cause a tree’s stomata – the openings on the underside of its leaves – to close. Closed stomata block carbon dioxide intake, disrupting tree growth. More frequent droughts may therefore lead to less carbon uptake by trees and forests.

Harvard Forest’s Witness Tree.

Forthcoming studies even indicate that individual trees respond differently to the same heat waves, and that water transport in trees can react instantly to the presence of a solar eclipse. With the sun obscured by the moon, stomata close as they would do at night, immediately reducing water intake.

As we continue to assess incoming data from Witness Tree and its sister project, we will surely learn even more about how trees affect – and are affected by – their surroundings.

Science communication

The red oak at Harvard Forest, along with its Asian and European cousins, is first and foremost a rich source of scientific data. But that same data, when converted to tweets by custom-built algorithms, also turns the Witness Tree into a platform for science communication research.

Behind the scenes, a computer program analyses the incoming numbers from Witness Tree’s sensors: cross-checking against pre-programmed thresholds for normal activity, looking for abrupt changes and compiling summaries.

For each key data feature, including daily water use, sap flow dynamics, stem shrinkage and trunk growth, the researchers at Harvard Forest have provided the program with several different prewritten message templates. The program chooses one of these templates, inserts the relevant data, and posts the completed message on Twitter as if in the tree’s own voice.

Because the messages are chosen from templates at random, they can be used as a testing ground to study how the public prefers to engage with different topics and writing styles.
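The template-and-threshold pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the project’s actual code: the template wording, the threshold value and the function name are all invented for the example.

```python
import random

# Hypothetical message templates for one data feature (daily water use).
# The real program keeps several prewritten templates per feature.
WATER_USE_TEMPLATES = [
    "I drank about {litres:.0f} litres of water today.",
    "Today's tally: roughly {litres:.0f} litres of water moved up my trunk.",
]

# Invented threshold standing in for the pre-programmed bounds
# of "normal activity" that incoming sensor data is checked against.
HIGH_WATER_USE_LITRES = 400


def compose_tweet(daily_water_use_litres: float) -> str:
    """Pick a template at random and fill in the day's sensor summary."""
    template = random.choice(WATER_USE_TEMPLATES)
    message = template.format(litres=daily_water_use_litres)
    # Flag an abrupt departure from normal activity.
    if daily_water_use_litres > HIGH_WATER_USE_LITRES:
        message += " That is a lot, even for me!"
    return message


print(compose_tweet(420))
```

Because the template is drawn at random, later engagement with each wording can be compared fairly, which is what makes the account usable as a testing ground.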

Preliminary results suggest, somewhat surprisingly, that the Witness Tree’s followers engage equally with data-driven and narrative-based tweets. The addition of multimedia – through images, videos or data visualisation – generates more responses, likes and retweets. Any posts that directly concern climate change seem to attract the most attention.

The future

To gain access to even more data, both the Witness Tree project and its sister project are expanding. The single Witness Tree will soon become part of a forest network spread over urban, suburban and rural areas to study how trees function in different environments.

Future witness trees with fine particulate matter sensors sensitive to poor air quality could help grow awareness about environmental stress factors faced by humans and trees alike.

Newly monitored trees will measure carbon lost due to tree respiration, paving the way for more accurate carbon accounting. By cementing our understanding of how trees contribute to the carbon cycle, we will be in a better position to reduce carbon output globally.

Long-term, the two projects aim to work together to build a vast, international network of tweeting trees: in other words, an internet of trees. The data from this “internet” will provide invaluable insights into the wellbeing of our forest ecosystems – from detecting early signs of drought and tracking the impact of pests and pathogens to forecasting sap flow for maple syrup production.

The trees currently monitored by the network, spread across Europe and Asia.

As we have learned more about how trees interact with the ecosystems that they visually define, trees have often been represented as social creatures in recent research and popular writing. In a way, Witness Tree and its sister project play into this idea by giving their trees a human-like voice. They use personification as a tool to communicate effectively with a wide audience.

But it would be counterproductive to take this metaphor too seriously, because each tree’s voice is in fact a fiction fed by automated messages. Really, it’s the data talking – and the story that data tells is the brutally honest reality of environmental change.

Tim Rademacher, Postdoctoral Research Fellow, Harvard Kennedy School; Grace Field, PhD Candidate in History and Philosophy of Science, University of Cambridge, and Kathy Steppe, Professor of Applied Plant Ecophysiology, Ghent University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Source: To save forests, researchers are hooking trees up to Twitter

Trees talk to each other and scientists have mapped the network

By Robert Dalheim
Scientists discovered that trees talk to each other through the Wood Wide Web. And now, they’ve mapped it.

Do trees actually talk to each other? And if so, how do they do it?

Just over 20 years ago, ecologist Suzanne Simard discovered that trees do communicate with each other, and it’s through a fungal network scientists have nicknamed the Wood Wide Web.

And now, an international team of scientists has created the first global map of the vast underground network. They did this by creating a computer algorithm to analyze a database from the Global Forest Initiative, which includes 1.2 million trees in more than 70 countries.

The algorithm takes into account the different fungal species that associate with each tree species. It also takes into account local climate factors – which the scientists say have the biggest role to play.

“It’s the first time that we’ve been able to understand the world beneath our feet, but at a global scale,” Thomas Crowther, an author of the study from ETH Zurich, told the BBC. “Just like an MRI scan of the brain helps us to understand how the brain works, this global map of the fungi beneath the soil helps us to understand how global ecosystems work.

“What we find is that certain types of microorganisms live in certain parts of the world, and by understanding that we can figure out how to restore different types of ecosystems and also how the climate is changing,” he said.

Source: Trees talk to each other and scientists have mapped the network – Woodworking News, 2019-05-16

Foresters struggle to tell tree tales

By Rob Chaney
If a tree issue blows up in the forest, does anyone hear it?

Considering that eight of every 10 Americans live in big cities, that’s a problem for the Society of American Foresters. On Friday, the organization of forest professionals, loggers, mill workers, academics and government land managers gathered to puzzle over how to get their stories told more effectively.

Because while millions of Americans may never see a Ponderosa pine burn in a wildfire, they will breathe the smoke and may cancel their vacation plans and might pay more taxes for disaster relief. Meanwhile, the assembled society members at the University of Montana struggled with their own mixed messages, long-standing mistrust of opponents and unfamiliarity with a fast-changing media landscape.

“If we can’t get our collective act together, how can we expect the public to come around to broader agreement on forest issues?” asked Dave Atkins, a retired forester who now runs an online media outlet and serves on the National Association of Forest Professionals communication committee. He cited a recent NAFP survey that found 45 percent of U.S. and Canadian residents think that trees are harvested in national parks and protected areas (not true), and 64 percent believe deforestation is a major threat in North America (forests here are shrinking, but not at the rate of tropical forests in the Amazon or Indonesia).

“We have to take responsibility for the fact that people don’t understand what the forest condition really is,” Atkins said. “Seventy-one percent of the respondents had not heard about a forestry sector story in the past year. In places like Montana, we see this stuff all the time. But 83 percent of Americans live in urban areas.”

Source: Foresters struggle to tell tree tales – Missoulian, 2017-04-14

Bringing the forest to classroom, home, and office

Foresters have long faced the challenge of explaining their work to decision-makers, the public, and even peers who are far removed, physically or culturally, from the woods. Increasing urbanization exacerbates the problem, so much so that Richard Louv has coined the term “nature-deficit disorder”. How sustainable, then, can our forests be when the population has no direct experience with them, or even a frame of reference for understanding their benefits and importance? Communication, the great catalyst, can transmit knowledge, facilitate understanding and even spur action, but traditionally foresters have not been good at telling their stories. We need all the help we can get to gain the public’s attention and to make our message understood.

We know that not all communication media are created equal. Graphs and tables are powerful tools for conveying certain types of information among professionals, but will never capture the full complexity of a forest system. A well-written narrative or essay can capture the heart and stir the imagination, but our attention spans are becoming ever shorter and we are gathering information in smaller, graphically presented bites. Foresters, along with everyone else, must learn to use the new communication tools that are now available.

One tool that is finally getting off the ground is virtual reality (VR). This puts panoramic or spherical imagery in three dimensions, so the viewer feels fully immersed in the scene being presented. Game developers presenting computer generated imagery (CGI) are at the forefront, but even the average Jane or Joe can get in on the act thanks to our friends at Google. With just a smartphone, a free app (either Android or iOS), and a $15 viewer, anyone can produce and share VR content. This is a rapidly evolving area of development for the company, so the shelf life of the details that follow may be short. Still, the technology has come far enough to warrant some experimentation, and it is probably wise to jump on rather than be flattened by this oncoming train.

The most accessible region of Google’s VR ecosystem is known as Cardboard. I became aware of it when a precut piece of cardboard with two plastic lenses inserted showed up in my Sunday New York Times. I followed the directions to fold the cardboard into a stereo viewer and then used my phone to access a website with links to VR content, some CGI and some based on photography or video. The phone was placed in the viewer to explore the VR, complete with sound played through the phone’s speaker. An entertaining novelty to be sure, but one could also see the potential for a more substantive use of the medium.


The Google Cardboard app allows users to capture and share their own 3-D panoramas. Basically, the user launches the app, selects Cardboard Camera, and then rotates in place to capture the image. The software on the phone then processes the image and places it in a gallery accessible within the app. After the user selects one of the images from the gallery, it displays with a small icon of the Cardboard viewer. Touch the icon and the image converts to stereo mode. Place the phone in the viewer and the VR experience awaits. Images are sharable from within the app, but may be handled quite differently depending on the mode and target of delivery. Incorporating an image into a webpage takes some additional processing that will be described below.

While Cardboard Camera can certainly take vacation photos to a new level, it also seemed like an excellent mechanism to capture and convey the sights and sounds of the forest. It’s easy to imagine students peering into the viewer to learn about the redwoods or the rainforest. But could it also provide a valuable tool for resource professionals? In my own specialty of natural community restoration, I envisioned managers being able to visually compare their sites to reference communities documented in VR. It would not be a substitute for quantitative measures, but a way to convey details about a site that text and numbers cannot.

To test this vision, I took my Samsung Galaxy S7 smartphone to the woods, first holding it by hand at near arm’s length and rotating in place. As might be expected, the finished image, while perfectly readable, had waves where the stitching software could not compensate for my erratic movement. To correct this, I bought an inexpensive ($15) phone mount and placed it on my Promaster XC525 tripod. It worked, but I wondered whether rotating the phone on top of the center point would affect the stereo image. Subsequent trials confirmed that the 3-D effect depends on rotating the phone at some distance from the center point.

There may be more elegant solutions, but I chose the forester’s path by creating an arm for my tripod using a three-foot piece of 1×1″ poplar. I drilled a 1/4″ hole through it about one inch from the end. A 1 1/2″ long 1/4″ stainless steel bolt was run through the hole near the end of the stick. To this bolt I attached a small ball head from a TrekPod and then attached the phone mount. The ball head is necessary because the phone mount holds the phone horizontally, while the Cardboard Camera requires the phone to be held vertically for image capture. Another hole was drilled 25 inches from the first. I ran a second bolt through a quick release plate for my tripod before inserting it through the hole and securing it with a wing nut. I then attached the assembly to the tripod using the quick release plate and finally set the phone in the mount at the end of the arm. Enough of the poplar stick extended behind the tripod to provide a handle that aided in rotating the phone during shooting.


From my relatively brief experience, there are a few shooting tips to pass along:

  • Carefully consider sky conditions before venturing out. When full sun would produce too much contrast, an overcast sky may help even out an exposure. Some of the worst results seem to come when shooting toward backlit clouds.
  • The Cardboard Camera app has an exposure lock setting that is supposed to minimize banding in the image. However, disabling the exposure lock allows the camera to adjust as it is turned toward or away from the sun.
  • Typically, photographers are told to avoid the flat light of midday, but for VR panoramas, the sun must be high enough to be out of the field of view. Otherwise, the flare will wash out sections of the image.

As mentioned, there are a variety of ways to share VR images. Social media are immediate and built for mobile technology, but posting them on a website provides a stable platform that is accessible to everyone and searchable. This ties back to the idea of having a visual catalog of reference sites for natural communities. The Florida Natural Areas Inventory has begun such a catalog and VR images would be a valuable addition. It’s easy to envision a number of similar applications.

The images from the Cardboard Camera are saved in a vr.jpeg format. Normal image editing software seems to remove the file information that enables the VR. Therefore, the image files must be copied and saved without alteration. At the same time, these images will be displayed as a flat image by standard web browsers unless they are processed before posting. Google provides a drag-and-drop image conversion utility to produce a stereo-capable panorama for the web. The converter works in Chrome, but not Safari. (I have not tested other browsers.)
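Given that editors strip the enabling metadata, the safest workflow is to duplicate each capture byte-for-byte before opening it in any other program. A minimal sketch of that step follows; the file names and placeholder bytes are invented for illustration, with the placeholder standing in for a real capture.

```python
import shutil
from pathlib import Path

# Stand-in bytes for a Cardboard Camera capture; the real vr.jpeg file
# carries embedded metadata that enables the stereo display.
src = Path("PANO_woods.vr.jpg")
src.write_bytes(b"\xff\xd8\xff\xe1fake-vr-metadata")

# Duplicate byte-for-byte; opening and re-saving the file in an image
# editor could strip the VR metadata, so the original is never altered.
dst = Path("PANO_woods_copy.vr.jpg")
shutil.copyfile(src, dst)

assert src.read_bytes() == dst.read_bytes()  # identical copy, metadata intact
```

Editing, if needed at all, should then happen only on throwaway flat exports, never on the original or its archival copy.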

Once posted, the image is accessed using a JavaScript API called VR View. The script can be called on Google’s server, but may not always function properly. Therefore it is recommended that users host it on their own server. This is relatively straightforward for those with experience with web hosting, although much of the available scripting quickly exceeded my coding abilities.

My experimentation with the technology is still in its early days. A simple application of VR can be found on my website. One can view the panoramas on a desktop computer in full-screen mode and navigate them with a mouse. From a phone with Google Cardboard installed, one can launch the stereo image ready for 3-D viewing.

I’m certain that VR will become easier to produce and more integrated into all forms of media. The trick will be to capture and present engaging images that will increase our understanding of and appreciation for our forests.