Most of us are familiar with the concept of the ecosystem – the idea that all living things are interconnected with (and interdependent on) one another and the environment they live in. Can you think of something beyond nature that behaves like an ecosystem?
If you answered “the internet”, then give yourself a cookie – you had the same thought as a gang of biologists and ecologists who’ve just published a paper examining ways to use a computational algorithm – much like the one used by Google for calculating the search engine ranking of webpages – to determine which endangered species are most at risk, and which are most crucial to the survival of others.
In simple terms, PageRank rates the importance of websites and ranks them in a list compared to other websites. Sites rank higher when more sites link to them, and a link from a highly ranked site counts for more than one from an obscure site – so a page's importance reflects the importance of the pages pointing at it.
Adapting this approach to ordering the web of connections within an ecosystem allows species to be ranked in importance by virtue of how many other species are linked to them.
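To make the idea concrete, here is a minimal sketch of the power-iteration form of PageRank run over a toy food web. The species links below are hypothetical and loosely based on the cod example later in this post, not the actual network from the paper; edges point from a consumer to what it feeds on, so "importance" accumulates on species that many others ultimately depend on.

```python
def pagerank(links, damping=0.85, iterations=100):
    """Power-iteration PageRank over a dict of node -> list of outgoing links."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        # Every node keeps a small baseline share of importance.
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, targets in links.items():
            if targets:
                # Split this node's rank evenly among the nodes it links to.
                share = damping * rank[node] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # Dangling node (no outgoing links): spread its rank evenly.
                for other in nodes:
                    new_rank[other] += damping * rank[node] / n
        rank = new_rank
    return rank

# Hypothetical food web: each species points to what it feeds on.
food_web = {
    "cod": ["pelagic fish", "snow crab"],
    "pelagic fish": ["zooplankton"],
    "snow crab": ["zooplankton"],
    "zooplankton": ["phytoplankton"],
    "phytoplankton": [],
}

ranks = pagerank(food_web)
# Phytoplankton and zooplankton come out on top: every chain leads to them.
```

In this toy orientation the base of the food chain scores highest, which matches the intuition that losing it would cascade through every species above it.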
One example of species that depend on each other is the overfished Atlantic cod (Gadus morhua) and the smaller animals below it in the North Atlantic food chain. Because this predator has been depleted, species including smaller pelagic fish and northern snow crabs have boomed and are in turn depleting populations of phytoplankton and zooplankton.
It’s an innovative and useful tool, though other researchers are keen to underline its shortcomings:
Fraser Torpy, a microbial ecologist and statistician from the University of Technology Sydney, Australia, said the study is a “very useful adjunct to our ability to determine what makes a species important in terms of its position in its ecological community”.
However, he cautions that the method may only work for simple food webs. “Whilst [this is] an innovative and genuinely useful novel technique for endangered species assessment, it must be remembered that the true complexity of real ecosystems cannot be overestimated.”
With the caveat that I’m no ecologist (nor, for that matter, a search algorithm expert), it occurs to me that as limited a method of modelling ecosystems as this algorithm may be, its demonstrated ability to scale to the vast numbers of the web’s uncounted pages means it probably has the potential to outperform any other analytical method currently available. And as extinction rates increase in response to climate change and human intervention, maintaining the ecosystem that supports our own civilisation demands every tool we can get our hands on, regardless of how far short of perfect they might fall.
To what degree will computational algorithms be able to assist our understanding of natural systems? Where will their usefulness end… or will we eventually be able to reduce every system to equations, no matter how complex, once we have the necessary processing and memory resources available?