Friday, 31 January, 7.30 PM at the Star and Shadow Cinema, Newcastle
In a world stressed by climate change and population growth, the issue of GM crops is again rising to the top of the socio-political agenda. Agri-business, as supported by some scientists, promotes GM as the only way forward.
The Star and Shadow Cinema in Newcastle, in collaboration with Durham University’s Institute of Hazard, Risk and Resilience, is holding a film-led debate with experts from multiple disciplines to help the community appreciate what is at stake. The discussion will be led by Executive Director of IHRR Professor Sarah Curtis, Professor John Gatehouse, Dr Susana Carro-Ripalda and Dr Philip Garnett. Read the rest of this entry »
Dr Philip Garnett, a researcher on the Tipping Points project, reports on the 30th Chaos Communication Congress he attended in Hamburg, Germany in December, which addressed topical issues such as mass surveillance, the future of the Internet and the Snowden affair.
The 30th Chaos Communication Congress (30C3) was always going to have at its heart the events surrounding the leaks of NSA files by Edward Snowden. In some ways I was banking on it. Why else would I drag myself away from Christmas leftovers and warm fires to get on a plane to Hamburg in the cold and rain? I’m glad I did. Amongst the talks on automatic fermentation of beer, and the vacuum tube races around the conference building, was a rich vein of presentations and discussions around (to paraphrase the opening speaker) the nightmare reality to which the community has woken up from a bad dream.
Despite this, there was still a sense of fun at the congress. There were many stalls where you could enjoy learning to pick a lock, program a board of LEDs with the message of your choice, or check out the latest DIY 3D printers. However, there was also a punishing timetable of talks reacting to, discussing, and trying to grasp the new post-Snowden future. One thing they all had in common was a sense of disbelief, or speechlessness. It wasn’t exactly the confirmation that we are being monitored (we all knew that); it was perhaps the scale of it, and a sense that nothing can be done to reverse it. Read the rest of this entry »
Dr Elias Lopez-Romero, a Marie Curie-IEF Fellow from the Department of Archaeology at Durham University, introduces the ALERT app that allows users to upload information about vulnerable archaeological sites near European Atlantic coastlines.
Present climatic change and anthropogenic pressure increasingly affect the coastal zone. Hundreds of archaeological sites are currently threatened along the European Atlantic coasts by the accelerated relative rise in sea level, erosion, and various anthropogenic modifications to the environment. In spite of this situation little attention has been paid to the development of methodologies for monitoring the vulnerability of this kind of heritage. This is particularly true in areas like Western France or the Iberian Peninsula where, unlike research initiatives in England, Ireland, Scotland or the Mediterranean Basin, there have not been long-term dedicated approaches to this topic.
Since 2006, the ALERT project has brought together researchers involved in coastal archaeology. This group quickly moved toward developing an interdisciplinary approach aiming at the construction of a vulnerability model for coastal heritage, developing assessment and monitoring maps, and assessing the strategies for research and action adapted to the local and regional scales. Read the rest of this entry »
The prospect of governing geoengineering is perplexing for a variety of reasons, many of which concern the nature of the technologies involved and the scale at which they are intended to be deployed. One approach that has potential for mitigating the effects of climate change is Solar Radiation Management (SRM) – spraying large quantities of reflective particles into the Earth’s stratosphere to reflect solar radiation back out to space. While in some ways similar in scope to other technological innovations such as nanotechnology or synthetic biology, the methods used for SRM are not novel; nevertheless, the end result may be the making of entirely new climate(s).
SRM has entered mainstream science-policy debates only very recently, primarily because of the political salience of climate change as a global environmental problem that threatens the existence of the human species itself, not to mention the elimination of biodiversity on a scale never experienced before. The UK has a research project on SRM known as SPICE (Stratospheric Particle Injection for Climate Engineering), which had its own problems, including questions about patenting the technology that came to light as a result of its stage-gate evaluation process. There are some examples of developing policy for geoengineering, such as the Solar Radiation Management Governance Initiative (SRMGI) and the Oxford Principles, a set of ethical guiding principles for governing geoengineering. Read the rest of this entry »
When it comes to controversial scientific research, many scientists can be dismissive or evasive in dealing with the public. But when it comes to the bigger picture of how the research could interface with policy, and in turn governance, it is actually non-scientists who may hold some of the answers, and not necessarily those in high positions of political or financial power either. Public dialogues about geoengineering seem like a model example of this, showing that engaging with non-scientists can lead to productive assessments of the actual risks involved and to judging whether or not the science or technology is even appropriate at all. Now this may seem problematic to some, but it could actually bring science, technology and democracy a little closer together.
The problem of granting patents for geoengineering technology was what prevented the SPICE project from continuing research beyond computer modelling, ending an experimental trial that could one day have led to engineering the Earth’s climate at a scale never before seen. Prof Phil Macnaghten at Durham University, who was an advisor on the SPICE project, oversaw its stage-gate process, which was in place to ensure that the project met the criteria for engaging with public values. Some puzzling questions arose during the stage-gate panel. If geoengineering did become mainstream and worked, who would own it? Would it stay in the public domain or fall under intellectual property laws and therefore be subject to commercial interests? Read the rest of this entry »
Governing scientific and technological innovations is tricky business. This is primarily due to the presence of uncertainty: the risks society must face if it chooses to intervene using methods that could have damaging consequences, fail entirely, or both. Everyone knows it’s a cliché of course, but we really do ‘live in exciting times’, as humanity has an array of advanced technologies at its disposal. But climate change is in a sense the antithesis of technological development, or at least of how it has proceeded thus far, mostly because the world is locked into using fossil fuels as its primary source of energy. Yet the controversial applications of geoengineering may prove a last resort for reducing the temperature of the planet, preventing devastating environmental impacts induced by climate change. Read more
The problem of brownfield land is universal. Countries throughout the world have problems with contaminants present in soil that prevent people from using the land. Large demand exists to improve soil health and to regenerate brownfield land for present and future generations. While brownfield land can clearly affect the physical health of people, plants and animals it may also affect people’s mental health or sense of well-being.
Land previously developed for industry or other uses may affect public health in a variety of ways that do not appear to be well understood at this time. IHRR’s research project ROBUST (Regenerating Brownfield Land Using Sustainable Technologies) at Durham University is investigating how to restore brownfield land sustainably, but is also researching how brownfield land affects the well-being of communities that live around it. Recently, the project began its first public field trial, testing a new technology for improving soil health that uses recycled minerals to strengthen the soil’s natural defences against contamination.
Remote sensing provides a unique perspective of disasters that allows their full impact to be viewed in great detail. It can help people manage disasters and is an effective way of understanding the impacts of a large-scale hazard such as a tsunami. NASA’s satellites are of course some of the most valuable tools available for remote sensing, but other organisations such as the European Space Agency and the China Academy of Space Technology also carry out high-resolution remote sensing. Here are some images of disasters acquired by Landsat 7, including the flooding of New Orleans after Hurricane Katrina and the Sendai coast in Japan before and after the Tohoku tsunami.
This image shows the path of destruction left by a series of tornadoes that tore through the Upper Midwest region of the US on 7 June 2007. The tornadoes flattened farm fields and strong winds uprooted trees sending them crashing into people’s homes.
An interesting infographic from the UN International Strategy for Disaster Reduction shows the annual damages caused by large-scale natural disasters. It provides context for people killed by disasters, a staggering 1.1 million, those affected, 2.7 billion, and the total cost in damage, 1.3 trillion USD. This information comes from EM-DAT, The International Disaster Database, which includes all disasters (both natural and technological) that meet at least one of these criteria: 10 or more people reported killed, 100 or more people reported affected, a declaration of a state of emergency, or a call for international assistance. Read more
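The EM-DAT inclusion rule above can be expressed as a simple check. This is a minimal sketch in Python; the function and parameter names are hypothetical illustrations, not EM-DAT’s actual schema or API.

```python
# Sketch of the EM-DAT entry criteria described above: an event is
# recorded if it satisfies at least one of the four conditions.
# All names here are illustrative, not part of any real EM-DAT tool.

def meets_emdat_criteria(killed: int, affected: int,
                         emergency_declared: bool,
                         international_aid_called: bool) -> bool:
    """Return True if an event qualifies for inclusion in the database:
    10+ people reported killed, 100+ reported affected, a declared
    state of emergency, or a call for international assistance."""
    return (killed >= 10
            or affected >= 100
            or emergency_declared
            or international_aid_called)

# A flood with 12 deaths qualifies; a storm with 50 affected and no
# emergency declaration does not.
print(meets_emdat_criteria(12, 0, False, False))   # True
print(meets_emdat_criteria(0, 50, False, False))   # False
```

Note that the criteria are disjunctive: a disaster with few casualties still enters the database if a state of emergency is declared.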
Anyone who has been through airport security in the US before and after 11 September 2001 knows how it has transformed politically, socially and technologically. Other countries, especially the UK, have followed suit, using similar scanning and surveillance technologies in large international airports such as Heathrow. But the majority of passengers are ‘low risk’, meaning that they are unlikely to commit an act of terrorism either at the airport or when airborne. Yet they are still often forced to submit to procedures built upon the premise that they could be a terrorist, or if not a terrorist then a potential threat or disruption to airport security, if not national security. This is problematic for a number of reasons, especially since the resources invested in securing airports from potentially anyone distract from the ‘real terrorists’, whoever they may be. In response to this conundrum, the TSA in the US is piloting ‘risk-based’ approaches to enhance airport security. While exempting obvious non-culprits such as children, people over 75 and those serving in the armed forces, the TSA plans to depend more on techniques that monitor passenger behaviour and on ways to ‘pre-check’ passengers, such as biometric verification. Read more