All the better to see you with

Think about the speed at which you can distinguish the things you see. Virtually instantly you can detect whether you are looking at a cat, a plant or a vehicle. To accomplish this, the signal received through your eyes must pass all the way to the back of your brain. Further neural pathways then enable you to recognise the signal and describe what you see. For us, recognition takes less than 100 milliseconds.

The neural pathways involved in sight. Source.

Researchers around the world are attempting to develop a computer model that can perform the same tasks as the human visual system. Simon Thorpe is one of the scientists examining the biological benchmarks that a computer must match.

In a recent seminar Thorpe highlighted that although progress in the field has been substantial, we are still some way off making a visual system that would be viable in a robot. First off, a computer needs to match the physical “power” of the brain's visual system. Of the 86 billion neurons in the human brain, four billion are used in the visual system. These neurons conduct signals at one to two meters per second and fire at around 1 kHz, with the whole brain running on about 20 watts. Thorpe said that modern computers are more than capable of matching this speed of processing. The performance of the computers he uses is measured in teraFLOPS (FLOPS stands for floating-point operations per second), which is a lot of power.
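
As a rough back-of-envelope check (my own arithmetic, not a figure from the seminar), four billion neurons firing at around 1 kHz amounts to a few trillion events per second, which is exactly the territory a teraFLOPS-class machine plays in:

```python
# Back-of-envelope comparison (my own arithmetic, not from the seminar).
visual_neurons = 4e9      # neurons devoted to the visual system
firing_rate_hz = 1e3      # ~1 kHz, as quoted above

brain_events_per_sec = visual_neurons * firing_rate_hz
print(f"Brain visual system: ~{brain_events_per_sec:.0e} events/s")   # ~4e+12

# A 10-teraFLOPS machine performs 1e13 floating-point operations per second,
# so raw throughput is comparable -- if we make the very crude assumption
# that one spike is roughly equivalent to one operation.
machine_flops = 10e12
print(f"10 TFLOPS machine:   ~{machine_flops:.0e} ops/s")              # ~1e+13
```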

With the required computing power achieved, attention is now turning to how to “teach” a computer to distinguish objects in a picture. The traditional method is to show a visual processor millions of labelled images over the course of about a year, adjusting the network after each one. This is called back-propagation, and it runs at speeds of up to 100 images every 100 milliseconds. For comparison, a human can easily recognise a picture that is displayed for only 25 ms. The main problem with back-propagation is that it does not simulate the learning process of a human.
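
For readers unfamiliar with the term: back-propagation adjusts a network's connection weights by propagating the error on each labelled example backwards from the output layer. Here is a minimal toy sketch of the idea in Python (a tiny two-layer network on fake data, nothing like the scale of the systems Thorpe described):

```python
import numpy as np

# Toy two-layer network trained by back-propagation (illustrative only;
# real image classifiers are far larger and train on millions of images).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 64))                   # 100 fake "images", 64 pixels
y = (X[:, 0] > 0).astype(float).reshape(-1, 1)   # fake binary labels

W1 = rng.normal(scale=0.1, size=(64, 16))
W2 = rng.normal(scale=0.1, size=(16, 1))
lr = 0.1

for step in range(500):
    # Forward pass
    h = np.tanh(X @ W1)
    out = 1 / (1 + np.exp(-(h @ W2)))   # sigmoid output
    # Backward pass: propagate the output error back through the layers
    d_out = (out - y) / len(X)          # gradient of mean cross-entropy loss
    dW2 = h.T @ d_out
    d_h = (d_out @ W2.T) * (1 - h**2)   # tanh derivative
    dW1 = X.T @ d_h
    W1 -= lr * dW1
    W2 -= lr * dW2

print("final accuracy:", ((out > 0.5) == y).mean())
```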

After a year of training via back-propagation, a computer was able to recognise jellyfish, bears, leopards, polyps and monkeys with nearly 100% accuracy. There is a competition, called ImageNet, run each year to test the accuracy of object recognition by computers; the list of objects the computers are meant to be able to recognise can be found here.

The winners of this competition in 2012 started a company, DNNResearch, which has since been bought by Google. Within six months, Google had added software that makes it possible to search your own images for particular objects, whether that be a particular flower, animal or vehicle.

Groups around the world are now trying to create object vision by methods other than back-propagation¹²³. By more closely mimicking the way that neurons in the retina fire, cameras are being built that can distinguish moving objects based on contrast and orientation. The example Thorpe showed in the seminar was a highway: when the camera was pointed at the highway for an extended period, it began to learn what was a car and what was not. Thorpe has found that most objects can be recognised when only 1% of retinal neurons have been stimulated, and the newer approaches to computer object vision match this efficiency.
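
The learning rule behind these spiking approaches is spike-timing-dependent plasticity (STDP), the subject of Masquelier and Thorpe's papers¹²: a synapse is strengthened when the input neuron fires just before the output neuron, and weakened when it fires just after. A minimal sketch of the update rule (the parameter values are illustrative, not taken from the papers):

```python
import math

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Return the new synaptic weight after a pre/post spike pair.

    dt = t_post - t_pre in milliseconds. A positive dt (pre fires before
    post) potentiates the synapse; a negative dt depresses it.
    Parameter values here are illustrative, not taken from the papers.
    """
    if dt > 0:
        w += a_plus * math.exp(-dt / tau)    # potentiation
    else:
        w -= a_minus * math.exp(dt / tau)    # depression
    return min(max(w, 0.0), 1.0)             # keep weight in [0, 1]

w = 0.5
w = stdp_update(w, dt=5.0)    # pre fired 5 ms before post -> strengthened
w = stdp_update(w, dt=-5.0)   # pre fired 5 ms after post  -> weakened
print(w)
```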

There are numerous potential applications for this technology. It could be used in manufacturing and industry to monitor the production of goods, in the navigation systems of driverless cars, trains and other modes of transport, or in medicine, aiding people with damaged eyes and perhaps improving the function of the bionic eye. In the immediate future it seems it will be applied in some of Google's latest products, such as Google Glass, where it may eventually identify objects for you through the glasses⁴⁵.

The final part of the seminar touched on the development of computer consciousness, or artificial intelligence. In Thorpe's opinion we will be able to develop a machine that can process and analyse sensory stimuli. However, the step to independent thought is still far away, if it is possible at all; it would require huge advances in the field.

But for the time being at least, you may take pride in your ability to see better than a computer.

References

1. Masquelier, T. & Thorpe, S. J. 2007. Unsupervised Learning of Visual Features through Spike Timing Dependent Plasticity. PLoS Comput Biol, 3, e31.

2. Masquelier, T. & Thorpe, S. J. 2010. Learning to recognize objects using waves of spikes and Spike Timing-Dependent Plasticity. The 2010 International Joint Conference on Neural Networks (IJCNN), 18-23 July 2010, 1-8.

3. VanRullen, R., Delorme, A. & Thorpe, S. 2001. Feed-forward contour integration in primary visual cortex based on asynchronous spike propagation. Neurocomputing, 38–40, 1003-1009.

4. Mishkin, M., Ungerleider, L. G. & Macko, K. A. 1983. Object vision and spatial vision: two cortical pathways. Trends in Neurosciences, 6, 414-417.

5. Applegate, R. A., Thibos, L. N. & Hilmantel, G. 2001. Optics of aberroscopy and super vision. Journal of Cataract & Refractive Surgery, 27, 1093-1107.

Featured image sourced from here.

The Biggest History

When David Christian meets people who claim to be “ancient” historians, he must chortle inwardly: Christian has developed a new educational course that encompasses the entire history of our universe – all 13.82 billion years of it.

A timeline of our universe. Source: http://palaeos.com/cosmic_evolution/bighistory.html

In a recent seminar Christian talked about the importance, as well as the contents, of a new syllabus he is aiming to introduce to high schools. In the 1980s Christian began to wonder why historians traditionally do not look back past human history. He reasoned that a more expansive study of history would help humans gain a perspective on, and a sense of belonging to, something bigger than themselves. Christian also strongly agrees with C.P. Snow who, over 50 years ago, said that there was a lack of communication between science, the humanities and the general public. From these thoughts Big History was born.

As Big History covers all known history up until the modern day, it is very interdisciplinary. When the course was first taught at Macquarie University in 1989, Christian said, he would often bring in specialist lecturers to talk about cosmology, chemistry, evolution or biology in greater depth. After sitting in on these lectures for a decade, Christian began to give some of them himself – not necessarily by choice, but because the specialist lecturer was unable to attend. As Christian delivered more and more of the material, the course developed a natural flow, which is perhaps now its biggest strength.

It was this connectivity between fields that prompted Bill Gates to contact Christian about 10 years ago. Gates had taken the course and was keen to develop Big History into a subject that could be taught in high schools. In 2011 some schools in the US and Australia began running Big History courses. Today over 300 US and 100 Australian schools have Big History included in their curriculum.

The Big History course is divided into eight compartments, called thresholds (see figure below).

The 8 thresholds of the Big History course. Source: http://www.geekwire.com/2013/learn-history-bill-gates-online/

Each threshold represents a significant advancement in the history of our universe. These thresholds are very anthropocentric; that is, they are the stages that needed to occur for human society to exist as it does today. When learning about this progression of events you begin to realise just how special our planet and our species are.

In the beginning of the course, which typically runs over a 13-week period, you learn about the Big Bang and the subsequent influence of gravity and dark matter in creating stars. With stars it became possible for new elements to be formed. Before stars, only the first four elements of the periodic table existed, over 99% of which was hydrogen and helium. Within young stars, elements up to iron (no. 26) could be formed via the fusion of smaller elements. Massive stars can then go supernova, making it possible for the heavier elements to be formed. As you can see from the diagrams below, the formation of stars and supernovas was necessary to form the Earth as well as the human body.

The elemental composition of the universe. Source: http://chandra.harvard.edu/resources/illustrations/chemistry_universe.html

The elemental composition of the Earth. Source: http://chem11project10-11.blogspot.com.au/2010/10/finding-out-about-matter.html

The elemental composition of the human body. Source.

That’s right! We are all stars…literally.

After stars facilitated the formation of solar systems – the next threshold (no. 4) – the conditions on Earth were such that life could exist. This means that the Earth was in the “Goldilocks” zone: Goldilocks zones are the regions of the universe where conditions are suitable for life – juuuuust right. The beginning of life on Earth, 3.5-3.8 billion years ago, is threshold five of the course.

As a biologist I feel there could have been another threshold or two during the development of life, such as multicellularity or the Cambrian explosion, just as a physicist would probably say there should be more thresholds for the formation of the universe. But in the interest of simplicity these are not major thresholds in the course.

The next threshold is the ability of species within the genus Homo to learn collectively. This means that communication had developed to a level of sophistication that allowed inter-generational knowledge accumulation. And so tools improved over hundreds of thousands of years until, after colonising the vast majority of landmasses, humans had asserted themselves as the dominant species on Earth.

The next threshold was agriculture at the end of the Pleistocene (10,000 years ago), which allowed population centers to generate a surplus of food. The final threshold brings us to the modern era – the burning of fossil fuels.

The burning of fossil fuels has allowed us to release vast amounts of energy that was stored 300 million years ago. It has been estimated that, thanks to fossil fuels, human society now controls an energy density around one million times greater than that of the sun.

Regarding this energy, Christian said, “I don't think we're really in charge of this huge machine we are driving”. He said this level of energy may be the undoing of civilisation, through climate change or other phenomena.

The full course of Big History is available free online here.

This is a video from the era of C.P. Snow about the issues of communicating with scientists, and of being scientifically literate. David Christian hopes to bridge the gap in communication between scientists and the community through Big History.

Note: I have been writing “our” universe throughout this piece as there are a growing number of scientists who believe multiple universes (multiverse) may exist.

Biography of David Christian

David Christian earned his BA at Oxford University, his MA at the University of Western Ontario, and by 1975 had completed his PhD at Oxford. His PhD was on 19th-century Russian history, and the influence of vodka on the Russian peasantry. He taught at Macquarie University from 1975 to 2000, and in that time developed his Big History course. In 2001 he moved to San Diego State University, but returned to Macquarie in 2009. During his time in America he was contacted by Bill Gates about Big History, and the Big History Project resulted. Since 2009 he has been teaching various courses at Macquarie, and he received a distinguished lecturer award in 2013. He has also authored the book Maps of Time.

References

Davies, P. 2008. The Goldilocks Enigma: Why is the Universe Just Right for Life? Houghton Mifflin Harcourt.

Bousso, R. & Susskind, L. 2012. The Multiverse Interpretation of Quantum Mechanics. Phys. Rev. D, 85, 045007.

Christian, D. 1991. The Case for “Big History”. Journal of World History, 2: 223-238.

Richerson, P. J., Boyd, R. & Bettinger, R. L. 2001. Was Agriculture Impossible during the Pleistocene but Mandatory during the Holocene? A Climate Change Hypothesis. American Antiquity, 66, 387-411.

Vagrant tropical fish moved south for the winter

The EAC (East Australian Current), popularised by the turtles in Finding Nemo, does actually exist – so well done, Hollywood fact-checkers. The current is constantly transporting marine creatures south, certainly as far as Sydney and often much further, to the Victorian coast. It is not, however, the turtles that interest Dr Will Figueira of the University of Sydney. Instead he focuses on the tropical fish that are involuntarily and irreversibly swept hundreds of kilometers down the coastline to temperate ecosystems.

The currents at play off the eastern coast of Australia. Image source.

As Dr Figueira explained in a recent seminar, this has drastic repercussions for the individuals' survival in their new surrounds. The phenomenon also provides some insight into the ecosystem changes that may occur as the climate and oceans warm. This long-distance dispersal is only possible during the larval stage of development, which all marine fish go through. Fish, and many other marine animals, have a larval stage to aid the dispersal of the new generation away from the breeding grounds. This helps increase the distribution of the species by colonising new habitats, and helps lower the chance of inbreeding¹. As larvae, the fish form part of the plankton clouds on which large marine organisms, like baleen whales, feed. It is not until the fish reach a mature stage that they will resist the current and attempt to settle in an appropriate habitat.

Life cycle of a typical marine fish. Image source.

Dispersal, especially when facilitated by strong currents like the EAC, will often move a fish outside its optimum habitat. Figueira has spent the last decade investigating the persistence of a number of tropical fish species in the sub-optimal temperate waters of the south-east coast of Australia. As these fish are not adapted to the cooler waters, they usually die. Figueira has found that the critical threshold for survival is around 17-19°C for most tropical fish². So, if the temperature of the water remains above this threshold over winter, a tropical species could establish a colony. So far this has not been observed, but warming temperatures may soon make it more common for tropical colonies to persist for multiple years. By 2080, Figueira estimates that 100% of winters will be survivable for 5 of the 8 species he has worked on². This will result in a range shift for these species, and may suggest that all marine ecosystems will (attempt to) shift several degrees towards the poles.
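
To make the threshold idea concrete, here is a small sketch of how survivable winters might be flagged from sea-surface temperature records (my own illustration with made-up numbers, not Figueira's actual analysis):

```python
# Illustrative only: made-up winter minimum sea-surface temperatures (deg C)
# for one site, checked against a hypothetical 18 degree survival threshold
# (chosen from within the 17-19 degree range quoted above).
winter_min_sst = [16.2, 17.8, 18.4, 15.9, 18.9, 19.1, 17.5]
THRESHOLD_C = 18.0

survivable = [t >= THRESHOLD_C for t in winter_min_sst]
fraction = sum(survivable) / len(survivable)
print(f"Survivable winters: {fraction:.0%}")   # 3 of 7, i.e. 43%
```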

While studying the tropical vagrants between the Solitary Islands (30°S) and Merimbula (37°S) – the southern Great Barrier Reef sits at 24°S – Figueira examined the characteristics that may change with temperature. As fish are cold-blooded, their metabolism slows at lower temperatures, which causes them to eat less and may result in starvation. Coupled with this are a lower growth rate and a slower burst speed, the speed used to escape predators³. All of this means that the tropical fish are disadvantaged compared to their temperate cousins.
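
The temperature dependence of metabolism in cold-blooded animals is often summarised by the textbook Q10 relation (a standard approximation, not something presented in the seminar):

$$ R(T) = R_0 \, Q_{10}^{(T - T_0)/10} $$

With a typical Q10 of about 2, a fish swept from 25°C reef water into 18°C temperate water would see its metabolic rate fall to roughly 60% of normal, consistent with the reduced feeding, growth and burst speed described above.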

This contributes to a selective bottleneck that has been observed in the lab: predators selectively kill the weaker members of the community. It is the second bottleneck the tropical fish must traverse, the first being the currents themselves. Larval duration strongly correlates with the distance of dispersal⁴; that is, the longer a fish spends in the larval stage, the further it is carried by the current. Therefore, the faster a fish can mature and settle, the less harsh its new ecosystem will be. This may mean that smaller tropical fish, which have shorter larval stages and can breed more rapidly, may more easily colonise regions further south than their historical distribution. It is also likely that fish from higher-latitude reefs will relocate with greater ease than the more equatorial species, which in turn may force the more polar species further from the reefs. Importantly, species that have low or no reliance on corals will be much more likely to avoid extinction⁵. For all of these reasons, breeding of tropical fish in temperate regions is yet to be observed.

One of the species Figueira studies (Abudefduf vaigiensis), which has become prevalent on the coast of NSW during the warmer months. Image source.

What does all this mean for the future? Well, so far 47 tropical species have been catalogued along the NSW coast, over 1700 km away from their usual habitat³. It is unknown whether climate change will increase the distance that the EAC moves vagrants: warmer oceans may increase the force of the current, but will also decrease the duration of the larval stage. It has also been found that, generally, when two species interact, one will have a competitive advantage in warmer waters and the other in cooler waters⁷. The probable influx of tropical fish, both predator and prey, will almost certainly change the dynamics of ecosystems along the coast⁸.

References Cited

1. Figueira, W. F., Biro, P., Booth, D. J. & Valenzuela, V. C. 2009. Performance of tropical fish recruiting to temperate habitats: role of ambient temperature and implications of climate change. Marine Ecology Progress Series, 384, 231-239.

2. Figueira, W. F. & Booth, D. J. 2010. Increasing ocean temperatures allow tropical fishes to survive overwinter in temperate waters. Global Change Biology, 16, 506-516.

3. Figueira, W. F., Booth, D. J. & Gregson, M. A. 2008. Selective mortality of a coral reef damselfish: role of predator-competitor synergisms. Oecologia, 156, 215-226.

4. Booth, D. J., Figueira, W. F., Gregson, M. A., Brown, L. & Beretta, G. 2007. Occurrence of tropical fishes in temperate southeastern Australia: Role of the East Australian Current. Estuarine Coastal and Shelf Science, 72, 102-114.

5. Feary, D. A., Pratchett, M. S., Emslie, M. J., Fowler, A. M., Figueira, W. F., Luiz, O. J., Nakamura, Y. & Booth, D. J. 2014. Latitudinal shifts in coral reef fishes: why some species do and others do not shift. Fish and Fisheries, 15, 593-615.

6. Curley, B. G., Jordan, A. R., Figueira, W. F. & Valenzuela, V. C. 2013. A review of the biology and ecology of key fishes targeted by coastal fisheries in south-east Australia: identifying critical knowledge gaps required to improve spatial management. Reviews in Fish Biology and Fisheries, 23, 435-458.

7. Galaiduk, R., Figueira, W. F., Kingsford, M. J. & Curley, B. G. 2013. Factors driving the biogeographic distribution of two temperate Australian damselfishes and ramifications for range shifts. Marine Ecology Progress Series, 484, 189-202.

8. Poloczanska, E. S., Brown, C. J., Sydeman, W. J., Kiessling, W., Schoeman, D. S., Moore, P. J., Brander, K., Bruno, J. F., Buckley, L. B., Burrows, M. T., Duarte, C. M., Halpern, B. S., Holding, J., Kappel, C. V., O'Connor, M. I., Pandolfi, J. M., Parmesan, C., Schwing, F., Thompson, S. A. & Richardson, A. J. 2013. Global imprint of climate change on marine life. Nature Climate Change, 3, 919-925.

Featured image sourced from here.

Climate Change Modelling – Projections of future climate conditions

Considering current climatic patterns, estimating carbon emissions for the next 100 years and anticipating future changes to ecosystems, all to produce one model – sounds difficult, doesn't it? Well, these are the major factors that climatologists have to consider when constructing climate models.

These models can then be adjusted and used to predict the impact of climate change on the habitat of individual organisms.

Dr Rebecca Harris, from the University of Tasmania, has to grapple with all of the variables involved in each of these factors, and then choose an appropriate model for a particular species.

In a recent seminar Dr Harris revealed some fundamental differences in the approach to science between climatologists and ecologists. For example, to an ecologist an error margin is just part of the results, she said; to climatologists, however, errors represent an unthinkable situation – mistakes in their algorithms.

Dr Harris is one of the brave few scientists trying to bring some ecology to climatology and some climatology to ecology. It is hoped this will improve species distribution models as well as global climate models (GCMs).

Climate models. Source.

Global climate models are computer programs that project future climates based on rainfall, temperature, winds and carbon emission patterns. The end product looks similar to the figure above, except that the patterns change across time. From these models climatologists are able to discern changes in climatic conditions that may occur over the next 50 or 100 years.

To explore the range of possibilities, the models can be run under best and worst case carbon emission scenarios, as well as anything in between. Unfortunately, to understand many of the guidelines in place we have to descend into the dark world of acronyms. Every few years the Intergovernmental Panel on Climate Change (IPCC) releases an assessment of the emission scenarios it considers likely. Some earlier GCMs are based on assessment report four (AR4), which listed four likely scenarios (A1, A2, B1 & B2 – see the IPCC website for information about the situations that lead to these scenarios). By 2014 another set of guidelines had been released (surprisingly – AR5), which revised the likely scenarios into Representative Concentration Pathways (RCPs). The frightening fact is that we have been tracking at the highest scenario from 1990 to the present day¹.

This graph shows the amount of warming expected under the different carbon emission scenarios. The increase in temperature is driven by radiative forcing: the change in the balance between incoming short-wave and outgoing long-wave radiation. Radiation from the sun enters the atmosphere as short-wave radiation, which passes through greenhouse gases. When it hits the ground it warms the earth, which then emits long-wave radiation. Long-wave radiation cannot pass back through the greenhouse gases; it is absorbed and re-emitted back towards the surface, resulting in a temperature increase. Image sourced from Harris et al (2014).
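
For reference, the standard simplified expression for the radiative forcing from CO₂, and the equilibrium warming it produces, is (a textbook approximation, not taken from Harris's seminar):

$$ \Delta F = 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}}, \qquad \Delta T \approx \lambda\,\Delta F $$

Here C is the CO₂ concentration, C₀ the pre-industrial baseline, and λ the climate sensitivity parameter (of order 0.5-1°C per W m⁻²). Doubling CO₂ gives ΔF ≈ 3.7 W m⁻², hence the familiar few degrees of warming per doubling.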

The newer global climate models are based on these scenarios. To date, over 50 GCMs have been archived by the IPCC for use by scientists and politicians. These newer models also incorporate additional information to try to capture the influence of land use and natural disasters on the climate system – adding yet more variables².

It is these variables, the huge number of assumptions and their inaccuracy over certain topographies that have led some to question the accuracy of GCMs, as well as their validity³. There is an unknown level of uncertainty in any climate model, and it could be argued that the variables that will alter the climate are too random to be predicted. However, despite their limitations, GCMs remain the most useful and complete tool for estimating the impacts of climate change on specific species and ecosystems.

However, one last alteration needs to be made to GCMs for them to have ecological or local significance: the models need to be downscaled. Most models work on a grid of 50-100 km cells, which is not at all helpful in an ecological sense. There are various methods used to downscale the models, which adds yet more scope for differences in conclusions.
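
As a concrete illustration, the very simplest form of downscaling is interpolating the coarse grid onto a finer one. Real methods add statistical and topographic corrections; this sketch is my own, not the approach Harris uses:

```python
import numpy as np
from scipy.ndimage import zoom

# A fake 4x4 "GCM" temperature field at ~100 km resolution (values in deg C).
coarse = np.array([[12.0, 12.5, 13.1, 13.4],
                   [11.2, 11.9, 12.6, 13.0],
                   [10.5, 11.1, 11.8, 12.3],
                   [ 9.8, 10.4, 11.0, 11.6]])

# Bilinear interpolation (order=1) onto a grid 10x finer (~10 km cells).
fine = zoom(coarse, 10, order=1)
print(coarse.shape, "->", fine.shape)   # (4, 4) -> (40, 40)
```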

Finally, once the models have been downscaled, they can be used to model potential habitats for specific species under climate change. They can also be used to predict how little snow will fall by 2050 (click here for the SMH article) or the future frequency and severity of cyclones⁴.

In conclusion, climate modelling gives us a range of possible scenarios for the next century under anthropogenic climate change. Despite possible inaccuracies, GCMs provide valuable information that may help us limit the damage to the environment. Hopefully the information garnered from these models will prompt all of us to aim for the best case scenario.

Question for the week

Should the data generated from GCMs be used to set up a conservation plan for a single species? Or would we be better served trying to focus conservation efforts on the low trophic levels and keystone species to try and conserve the entire ecosystem? Is either a realistic goal? Are more wholesale changes necessary?

References

Harris, R. M. B., Grose, M. R., Lee, G., Bindoff, N. L., Porfirio, L. L. & Fox-Hughes, P. 2014. Climate projections for ecologists. Wiley Interdisciplinary Reviews: Climate Change, 5, 621-637.

1. Rahmstorf, S., Cazenave, A., Church, J. A., Hansen, J. E., Keeling, R. F., Parker, D. E. & Somerville, R. C. J. 2007. Recent Climate Observations Compared to Projections. Science, 316, 709-709.

2. Pielke, R. A., Pitman, A., Niyogi, D., Mahmood, R., McAlpine, C., Hossain, F., Goldewijk, K. K., Nair, U., Betts, R., Fall, S., Reichstein, M., Kabat, P. & de Noblet, N. 2011. Land use/land cover changes and climate: modeling analysis and observational evidence. Wiley Interdisciplinary Reviews: Climate Change, 2, 828-850.

3. Refsgaard, J. C., Madsen, H., Andréassian, V., Arnbjerg-Nielsen, K., Davidson, T. A., Drews, M., Hamilton, D. P., Jeppesen, E., Kjellström, E., Olesen, J. E., Sonnenborg, T. O., Trolle, D., Willems, P. & Christensen, J. H. 2014. A framework for testing the ability of models to project climate change and its impacts. Climatic Change, 122, 271-282.

4. Emanuel, K. A. 2013. Downscaling CMIP5 climate models shows increased tropical cyclone activity over the 21st century. Proceedings of the National Academy of Sciences, 110, 12219-12224.

Featured image sourced from here.