You are currently browsing the tag archive for the ‘technology’ tag.
Clean water is often taken for granted despite growing evidence that it is threatened in many parts of the world by environmental contamination, socioeconomic problems such as poverty, or both, issues that often go hand in hand. Arsenic-contaminated groundwater used for drinking and cooking is commonplace in many parts of Bangladesh. Like other chemical elements known to be poisonous to humans, arsenic is tolerated to some degree, but beyond certain thresholds ingesting it is toxic, leading to risk of disease and death.
Arsenic-contaminated groundwater currently threatens the health of 70 million people in 61 of Bangladesh's 64 districts. Many people living in districts plagued by arsenic-contaminated groundwater regularly drink water with arsenic concentrations far above national and WHO standards. An important study by Prof Peter Atkins and Dr Manzurul Hassan explores how groundwater arsenic concentration varies across areas of southwest Bangladesh. Understanding the scale of arsenic contamination, the complex processes that lead to arsenic in groundwater and how arsenic spreads over time is needed to reduce arsenic-related health risks. The study reveals a highly uneven spatial pattern of arsenic concentrations that can inform government policy on where high levels of contamination occur, in order to mitigate arsenic poisoning as a health and social hazard. Of the 375 tubewells sampled in the study, 358 had arsenic concentrations of at least 0.05 mg/L, and only 17 (4.5 percent) are considered arsenic-safe. This is a major health concern for people in areas of Bangladesh whose only source of water is contaminated with arsenic at levels exceeding not only the WHO guideline (0.01 mg/L) but also the less stringent limit set by the government of Bangladesh (0.05 mg/L). Read more
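The two thresholds mentioned above make the study's figures easy to reason about. As a minimal sketch, the readings below are hypothetical stand-ins, not the study's data, and the category labels are mine:

```python
# Classifying tubewell readings against the WHO guideline (0.01 mg/L)
# and the Bangladesh national limit (0.05 mg/L) for arsenic.
# Sample values are hypothetical; the study's raw data are not reproduced here.

WHO_GUIDELINE = 0.01   # mg/L
BD_LIMIT = 0.05        # mg/L

def classify(concentration_mg_l):
    """Return a coarse risk category for a single tubewell reading."""
    if concentration_mg_l <= WHO_GUIDELINE:
        return "arsenic-safe (within WHO guideline)"
    elif concentration_mg_l <= BD_LIMIT:
        return "above WHO guideline, within Bangladesh limit"
    else:
        return "above Bangladesh limit"

samples = [0.005, 0.03, 0.12, 0.07, 0.008]  # hypothetical readings, mg/L
for c in samples:
    print(f"{c:.3f} mg/L -> {classify(c)}")
```

Note that the study's 'arsenic-safe' category is judged against the national 0.05 mg/L limit, so wells between the two thresholds would still exceed the WHO guideline.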
Using LIDAR (light detection and ranging) remote sensing, geoscientists made a detailed scan of an earthquake zone in northern Mexico and compared it with a survey taken before the 2010 Sierra El Mayor Earthquake. They found that the quake did not occur on a single major fault, but along a series of smaller faults that ruptured together.
Here is a 3D visualisation of the earthquake zone mapped by researchers.
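The comparison the researchers made boils down to differencing two elevation grids, one from before the quake and one from after. Here is a minimal sketch of that idea with synthetic NumPy arrays; the grid values, the 35 cm offset and the 10 cm noise threshold are all illustrative, not the researchers' actual data or method:

```python
import numpy as np

# Differencing two digital elevation models (DEMs) to reveal surface
# displacement, as in pre-/post-earthquake LIDAR comparisons.
# Both grids here are synthetic stand-ins for real point-cloud-derived rasters.

rng = np.random.default_rng(0)
pre = rng.normal(100.0, 0.02, size=(50, 50))   # pre-event elevations (m)
post = pre.copy()
post[20:30, :] -= 0.35                         # hypothetical 35 cm down-dropped band

displacement = post - pre                      # vertical change per cell (m)
moved = np.abs(displacement) > 0.1             # cells above a 10 cm noise threshold

print(f"cells with significant change: {moved.sum()} of {moved.size}")
print(f"mean offset in changed cells: {displacement[moved].mean():.2f} m")
```

Real surveys add complications this sketch ignores, such as aligning the two point clouds and separating tectonic motion from vegetation or erosion changes.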
Despite past aid efforts, many victims of the 2010 floods are still homeless more than a year after the catastrophe. According to a report released by the People’s Accountability Commission on Floods, 1.5 million people remain without shelter in districts of Sindh province that were severely damaged by the floods. Shortages of basic supplies such as milk are also exposing infants to malnutrition and starvation. The government of Pakistan ended relief activities on 31 December and is no longer providing food, tents or temporary shelter.
Here are two images of southern Pakistan taken by NASA’s MODIS. The first image was taken on 24 January 2010 and the second on 23 January of this year.
The Ushahidi platform has been used by people all over the world to report information on large-scale hazard events in real time. During the intense flooding in Pakistan in 2010, Ushahidi enabled people to report via SMS where help was needed. It has also been used to track national elections, creating visualisations of data on the web or delivered via mobile devices. Ushahidi also created SwiftRiver, which can be used to analyse online information about emerging disasters, including reports communicated via text messaging.
This is a valuable tool for aid workers as well as journalists who are pressed for time when reporting on an ongoing disaster or crisis. The platform can also ‘crowdsource’ the assessment of how trustworthy incoming messages and other sources of information are. Ushahidi began in 2008, tracking the spread of post-election violence in Kenya, and was later used in the aftermath of the 2010 Haitian earthquake. While it can be used virtually anywhere, the software is clearly very useful for people in developing countries who have access to the internet and/or use mobile devices.
Ushahidi is also behind CrowdMap, which aggregates information about a particular disaster or other event so that it can be visualised on a map and timeline. CrowdMap requires no software installation and seems both straightforward and easy to use. What makes CrowdMap and other Ushahidi software platforms particularly powerful is that users can create their own ‘Deployment’, allowing them to report information about a disaster both locally and internationally. Here are a few examples of how CrowdMap has been used to report on disasters. Read more
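The map view described above rests on a simple operation: binning geotagged reports into grid cells and counting them. The sketch below illustrates that idea only; the reports, the 0.1-degree grid and the field names are hypothetical and are not Ushahidi's actual API or schema:

```python
from collections import Counter

# Aggregating geotagged incident reports into map-grid cells, in the spirit
# of how a CrowdMap-style deployment bins reports for a map view.

reports = [
    {"lat": 24.86, "lon": 67.01, "text": "water needed"},
    {"lat": 24.87, "lon": 67.04, "text": "medical help"},
    {"lat": 25.39, "lon": 68.37, "text": "road blocked"},
]

def grid_cell(lat, lon, step=0.1):
    """Snap a coordinate to the south-west corner of its grid cell."""
    return (round(lat // step * step, 1), round(lon // step * step, 1))

counts = Counter(grid_cell(r["lat"], r["lon"]) for r in reports)
for cell, n in counts.most_common():
    print(cell, n)
```

A real deployment would also attach timestamps for the timeline view and a verification status from the crowdsourced trust assessment.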
ProPublica, a US news agency specialising in investigative reporting, released an informative series of reports (here, here and here) on the use of body-scanning technology by the Transportation Security Administration, which is responsible for implementing and regulating travel security measures under the Homeland Security Act of 2002. After its 10th anniversary, many people are wondering what the TSA has actually accomplished in making US airports safe from terrorism. The articles focus on the use of body scanners, which are at the centre of controversies surrounding the TSA. Early on, before the body scanners were first introduced, there was concern that they could pose a significant health risk, as the x-ray scanners use ionising radiation that could cause cancer in a small minority of the airline passengers who pass through them: because millions of people enter the scanners, even a very small per-scan risk means a few of them can be expected to develop cancer.
“Even though it’s a very small risk, when you expose that number of people, there’s a potential for some of them to get cancer,” said Kathleen Kaufman, the former radiation management director in Los Angeles County… ProPublica
This health risk is considered low based on the amount of ionising radiation people receive from scanners, which is much less than what they receive while airborne. People receive far larger doses of ionising radiation from cosmic rays when travelling by air at high altitudes; in fact, pilots and flight attendants are classified as ‘radiation workers’. According to a study from NASA, flight routes at high latitudes can increase radiation exposure to passengers during solar storms. The x-ray body scanner is thus one of a number of risks that air passengers must endure. Read more
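The 'small risk, many people' argument is just an expected-value calculation: multiply a tiny per-scan probability by a huge number of screenings. The figures below are purely illustrative placeholders, not measured dose or risk values:

```python
# Why a tiny per-scan risk still matters at airport scale:
# expected cases = per-scan risk * number of scans.
# Both numbers here are hypothetical, chosen only to show the arithmetic.

per_scan_cancer_risk = 1e-8     # hypothetical probability per screening
scans_per_year = 100_000_000    # hypothetical annual screenings, nationwide

expected_cases = per_scan_cancer_risk * scans_per_year
print(f"expected additional cases per year: {expected_cases:.1f}")
```

With these placeholder numbers the expectation works out to one case per year, which is the shape of the argument Kaufman makes in the quote above: individually negligible, collectively non-zero.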
The American Geophysical Union Meeting is an immense science conference, the largest of its kind in the world. Taking place in the multilevel citadel known as the Moscone Center in downtown San Francisco, it lies within the city’s technological landscape and seedy urban environment. I was fortunate enough to catch the first session of a press conference on the Tohoku earthquake that devastated Japan earlier this year, in particular its most destructive secondary hazard, the tsunami, which slammed into the east coast, killing tens of thousands of people and causing catastrophic damage. Not surprisingly, much attention is being given to the earthquake and tsunami at the conference. The first press conference on the disaster focused on the technologies that were in place to track the tsunami, but also on public risk perception of tsunami events in Japan, which, needless to say, seems alarming with regard to preparedness for future tsunamis.
Prior to the tsunami in March of this year, there were four DART (Deep-ocean Assessment and Reporting of Tsunamis) buoys in place along the coast of Japan. Three of them were owned by the US, while the fourth was monitored by Russia. These buoys allowed researchers to see the tsunami 30 minutes after it first occurred, and empirical observations matched modelling of the tsunami, according to Dr Eddie Bernard of the Pacific Marine Environmental Laboratory in Seattle, Washington. The buoys take measurements at the sea floor, detecting changes in the pressure exerted by the weight of the water above. Bernard thinks that solely reporting tsunami wave heights is insufficient for evacuating populations before a tsunami strikes; instead there should be ‘flood forecasts’ that can inform people about the levels of flooding likely to occur, which will vary depending on where people live along the coastline. Bernard also argues that flood forecasting cannot be done without the information available from DART. ‘An earthquake shakes the earth for four minutes and a tsunami crashes the Earth for 12 hours’, he said.
‘For the people who deal with this along the coastline for 12 hours, any additional information you can provide them as soon as possible, whether it’s five minutes, 40 minutes or 50 minutes, would have been very useful’. Read more
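The physical principle behind the DART bottom pressure measurement is hydrostatic: a change in the weight of the overlying water column shows up as a pressure anomaly, so the implied height change is the anomaly divided by water density times gravity. A minimal sketch, with illustrative values rather than real buoy data:

```python
# Converting a bottom-pressure anomaly to the equivalent change in height of
# the overlying water column: delta_h = delta_p / (rho * g).
# This is the principle behind DART bottom pressure recorders; the 503 Pa
# anomaly below is an illustrative number, not a recorded observation.

RHO_SEAWATER = 1025.0   # kg/m^3, nominal seawater density
G = 9.81                # m/s^2

def height_change(delta_pressure_pa):
    """Water-column height change (m) implied by a pressure anomaly (Pa)."""
    return delta_pressure_pa / (RHO_SEAWATER * G)

print(f"{height_change(503.0):.3f} m")  # roughly a 5 cm open-ocean wave
```

Open-ocean tsunami amplitudes are only centimetres, so the instrument's job is to resolve pressure anomalies this small against tides and background noise.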
The River Eden is one of the most beautiful rivers in the UK, if not all of Europe. It is host to a wide variety of plant and animal species and is a Site of Special Scientific Interest. Many communities live on or near the River Eden or one of its tributaries. Human impact on the landscape often has unintended or unforeseen consequences for ecosystems, including rivers. Agriculture, primarily through the use of fertilisers, changes the ecology of river systems, affecting plants and wildlife as well as humans.
Over time, pollutants from farming practices accumulate in the River Eden, something that has been of concern to local communities and scientists alike. To monitor and develop ways of decreasing diffuse pollution from agriculture, researchers from Durham, Lancaster, Newcastle, the Centre for Ecology and Hydrology, Askham Bryan College (Newton Rigg) and the Eden Rivers Trust have come together with communities, including farmers, to develop new ways to monitor and improve river water quality. The project, known as the Eden Demonstration Test Catchment, or EdenDTC for short, has installed 10 monitoring stations to track the water quality of the River Eden. The EdenDTC has made live, real-time data about the River Eden freely available online. Read more
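One thing live monitoring data makes possible is automatic flagging of readings that exceed a pollution threshold. The sketch below shows the idea only; the station names, timestamps, readings and the 0.1 mg/L phosphorus alert level are all hypothetical, and this is not the EdenDTC's data format:

```python
# Flagging water-quality readings that exceed a threshold, a minimal sketch
# of what real-time catchment monitoring stations make possible.
# All values below are hypothetical.

THRESHOLD_P_MG_L = 0.1   # hypothetical total-phosphorus alert level (mg/L)

readings = [
    ("station-1", "2011-11-01T09:00", 0.04),
    ("station-2", "2011-11-01T09:00", 0.17),
    ("station-1", "2011-11-01T09:15", 0.06),
]

alerts = [(site, ts, value) for site, ts, value in readings
          if value > THRESHOLD_P_MG_L]
for site, ts, value in alerts:
    print(f"ALERT {site} at {ts}: {value} mg/L")
```

In a diffuse-pollution setting, spikes like this are often tied to rainfall events washing nutrients off fields, which is why continuous rather than occasional sampling matters.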
After the 10th anniversary of the 9/11 terrorist attacks on the World Trade Center (WTC) and the Pentagon, there is still much work to be done on how security, terror and risk are understood and prepared for in society. As with financial crises, it is not so much a matter of if it will happen, but when. The goal of increased security intelligence is to mitigate the risk of any terrorist attack, but a recent review of national security in the US, from the federal and state levels all the way to the security technologies used by airports, shows that vulnerabilities do exist and need to be addressed as soon as possible. The 9/11 attacks not only changed how the United States viewed the risk of terror, but resonated with countries throughout the world that have experienced terrorist attacks since that tragic event, including the 7/7 bombings in London, the bombing of commuter trains in Madrid, the Beslan school hostage crisis in Russia that killed 330 people (mostly children) and a whole list of others.
The 9/11 Commission’s report, while literary in tone and revealing of a number of important details about the attacks, still provides only a limited view of what actually took place before and after the attacks. A number of testimonies were left out of the report, including one given by a former FBI translator, Behrooz Sarshar, who said he had knowledge of a ‘kamikaze pilot’ plan to attack the US. He was formally interviewed by the 9/11 Commission, which had received pressure from the 9/11 Family Steering Committee to take Sarshar’s testimony, and although a memorandum of this meeting is available online, it is heavily redacted, with much of its content omitted. Sarshar said he had written to FBI Director Robert Mueller twice about what he knew, but not until November 2002 and again in January 2003, long after the attacks. When asked why he waited so long to bring forward this information, ‘he said he didn’t want to do any damage to the FBI’. Sarshar’s and other potentially useful testimonies were left out of the 9/11 Commission’s report. Read more
As new areas of scientific research and technology emerge within the private and public sectors, problems of governance and communication arise. If new scientific innovations continue to play a large role in society, how should the general public be informed about them? And if they are informed, how, if at all, can the public be involved in the decision-making processes that govern them? Science itself has come to be viewed as inseparable from the technologies associated with it, such as nanotechnology, GMOs, nuclear power and a host of others. New technologies make increasing demands of the sciences in turn, as funding is often driven by innovation.
In response to these perplexing issues, the Sciencewise Expert Resource Centre, funded by the UK Department for Business, Innovation and Skills, has initiated and guided a series of public dialogue activities over the past five years. Some of these dialogues are examples of ‘upstream engagement’, in which members of the public outside government and the scientific community can engage in conversation with scientists and policy makers about a range of complex issues raised by scientific advances at an early stage in the innovation process, such as geoengineering. The upstream model is one of three models of dialogue. Read more
The policing of terrorism at the University of Nottingham has been the subject of huge controversy, especially the filming of students suspected of involvement in terrorist activities. It started in May 2008, when Nottingham student Rizwaan Sabir downloaded a copy of an al-Qaeda training manual for his PhD proposal and sought the support of staff member Hicham Yezza, who worked in Nottingham’s school of modern languages. Both were arrested by counter-terrorist Met officers. Freedom of Information Act documents posted on the website Unileaks reveal that they were mentioned in a Home Office report, Islamist Terrorist Plots in Great Britain: Uncovering the Global Network. Read more