Science&Tech - Tehran Times - Iran's Leading International Daily http://tehrantimes.com/science Sat, 14 Sep 2013 20:18:36 +0000

Innovating our way to energy abundance and climate change http://tehrantimes.com/science/110745-innovating-our-way-to-energy-abundance-and-climate-change

In 1980, the biologist Paul Ehrlich and the economist Julian Simon made a simple wager. Ehrlich bet that the price of five common metals would rise over the next decade. Simon bet the price would fall. The loser would pay the difference in the value of a $1,000 bundle of the five metals.
 
As the Yale historian Paul Sabin describes in his new book The Bet: Paul Ehrlich, Julian Simon and Our Gamble Over Earth‘s Future, each man was essentially betting on his vision of the future. Ehrlich — a neo-Malthusian best known for his 1968 book The Population Bomb, which predicted that humanity would soon run out of food and resources — believed that rising prices for basic materials would show that the world was headed towards scarcity and catastrophe. 
 
The optimistic Simon—whose views might be described as “cornucopian” — thought that falling prices would demonstrate that human creativity was finding ways to make basic resources cheaper and more widely available.
 
Simon won the bet — prices of the metals dropped by about 50% between 1980 and 1990, even as the global population increased by 800 million. Simon won $576.07, and for a decade at least, human ingenuity triumphed over material scarcity.
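For readers who want to see the arithmetic, here is a minimal sketch of the settlement, assuming an equal $200 stake in each metal and a uniform 50% price drop across the board. These are simplifying assumptions; the actual 1990 payout of $576.07 reflects the real, uneven per-metal price changes.

```python
# Illustrative settlement of the Simon-Ehrlich bet under simplified
# assumptions: equal stakes and a uniform 50% price drop. The real
# payout ($576.07) came from the actual per-metal prices.
stake_per_metal = 1000 / 5    # $200 riding on each of the five metals
price_change = -0.50          # prices fell by roughly half, per the article

bundle_value_1990 = sum(stake_per_metal * (1 + price_change) for _ in range(5))
payout = 1000 - bundle_value_1990  # the loser pays the difference in value
print(f"1990 bundle value: ${bundle_value_1990:.2f}, payout: ${payout:.2f}")
```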
 
Some of that might have been due to fortunate timing—Sabin notes that when economists ran simulations for every 10-year period between 1900 and 2008, they found that Ehrlich would have won 63% of the time. Yet it’s hard to avoid the conclusion that fears of resource scarcity, which have recurred throughout history, are often overblown. Here’s Sabin in Slate:
 
Gloomy forecasts for soaring resource costs reveal an all-too-common tendency to overlook how scarcity and abundance relate to each other. Scarcity, by leading to increased prices, spurs innovation and investment. Efforts to locate new resources and design cheaper methods yield new technologies. New periods of abundance occur, even overabundance or a glut. We see that abundance today in natural gas markets.
 
Oil and gas fields
 
Another example: a new report from the research and consulting firm Wood Mackenzie estimates that there are nearly 1.4 trillion — with a “t” — barrels of oil equivalent (boe) of reserves in conventional but undeveloped oil and gas fields. That includes nearly 1.1 trillion boe of “technical reserves” — a term used for resources for which there are not yet development plans. (For comparison’s sake, the world uses about 33 billion barrels of oil a year.) Over half of those discoveries are classified as “good technicals,” which means they should be economic to recover at current price levels.
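A quick back-of-the-envelope check on those figures, treating all 1.4 trillion boe as if it were oil (which overstates the oil share, since boe also counts gas):

```python
# Rough comparison of the undeveloped reserves to annual consumption,
# using only the two figures quoted in the article.
total_reserves_boe = 1.4e12   # ~1.4 trillion barrels of oil equivalent
annual_oil_use_bbl = 33e9     # ~33 billion barrels of oil per year

years = total_reserves_boe / annual_oil_use_bbl
print(f"~{years:.0f} years of consumption at current rates")
```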
 
That much oil and gas would be worth approximately $760 billion alone. And it doesn’t count the new sources of oil and gas that have been unlocked by innovative technologies like hydrofracking and horizontal drilling, which have helped the U.S. produce more oil than it has since 1989. “This reemphasizes that there is still plenty of growth out there remaining,” says David Highton, principal analyst of Upstream Research at Wood Mackenzie.
 
That’s good for the global economy, which benefits from cheaper and more abundant energy. But it also shows that when prices rise—as they have for a number of commodities in recent years, most notably oil—it stimulates companies to find new ways to get at resources that, in cheaper days, wouldn’t have been worth the trouble. As technology for locating and exploiting resources improves, prices can even drop, which is exactly what has happened with natural gas in the U.S., thanks to fracking. “In the U.S., prices for natural gas peaked at around $13 per million cubic ft. in 2008,” says Jason Bordoff, the director of the Center for Global Energy Policy at Columbia University. “They’re now at $4 to $5, and we’ll have a fairly large supply of inexpensive gas for a while.”
 
So human ingenuity conquers the natural world? Not exactly. As Sabin told me, one lesson of the Simon-Ehrlich bet is that we should have a more “cautiously optimistic” attitude towards issues of resource scarcity. But that may not apply to the biggest environmental and economic challenge facing the world: climate change. 
 
The market-driven innovation
 
The market-driven innovation that has multiplied supplies of oil and natural gas will also make it that much more difficult to quit carbon. (It’s true that as natural gas replaces coal, it has reduced carbon emissions, but gas is still a fossil fuel.) “The flip side of this is we generate more fossil fuels and climate change worsens,” says Sabin. “It puts us in the position of making choices about what resources we want to develop and what world we want to live in.”
 
That’s why I was cheered by a bit of news that came out of China. The Chinese government pledged to reduce overall coal use in an effort to fight crippling air pollution. It would stop approving new coal-fired plants in industrial centers like Beijing and Tianjin in northern China, as well as in the Yangtze and Pearl River Deltas in eastern and southern China. The government said it would reduce coal use — which now accounts for 70% of the energy mix — to less than 65% by 2017, while retrofitting existing plants to cut pollution. The “key to preventing air pollution is to curb coal burning — China burns half of all the coal consumed in the world,” the Chinese environmentalist Ma Jun told the New York Times.
 
It’s true that China is making this move chiefly to reduce the brutally unhealthy levels of air pollution that cloak its cities — not so much to reduce carbon emissions. But the point is still the same. 
Human ingenuity has made it likely that we’ll have more than enough oil, coal and natural gas to power the world — and to cook ourselves if we let it happen. We need to choose to save ourselves.
 
(Source: Time)
By Bryan Walsh, Sat, 14 Sep 2013 15:06:26 +0000
In a breathtaking first, NASA’s Voyager 1 exits the Solar System http://tehrantimes.com/science/110744-in-a-breathtaking-first-nasas-voyager-1-exits-the-solar-system

PASADENA, Calif. (The New York Times) -- By today’s standards, the spacecraft’s technology is laughable: it carries an 8-track tape recorder and computers with one-240,000th the memory of a low-end iPhone. 
 
But Voyager 1 has become — thrillingly — the Little Spacecraft That Could. On Thursday, scientists declared that it had become the first probe to exit the solar system, a breathtaking achievement that NASA could only fantasize about back when Voyager was launched in 1977, the same year “Star Wars” was released. 
 
“I don’t know if it’s in the same league as landing on the moon, but it’s right up there — ‘Star Trek’ stuff, for sure,” said Donald A. Gurnett, a physics professor at the University of Iowa and the co-author of a paper published Thursday in the journal Science about Voyager’s feat. “I mean, consider the distance. It’s hard even for scientists to comprehend.” 
 
Even among planetary scientists, who tend to dream large, the idea that something they built could travel beyond the Sun’s empire and keep grinding away is impressive. 
 
Plenty of telescopes gaze at the far parts of the Milky Way, but Voyager 1 can now touch and feel the cold, unexplored region in between the stars and send back detailed dispatches about conditions there. It takes 17 hours and 22 minutes for Voyager’s signals to reach NASA’s Jet Propulsion Laboratory here. 
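The quoted signal delay is easy to sanity-check from the distance alone; this sketch assumes the round figure of 11.7 billion miles given below, which is why it lands a few minutes off the quoted 17 hours and 22 minutes:

```python
# One-way light travel time from Voyager 1 back to Earth, computed
# from the article's rounded distance figure.
SPEED_OF_LIGHT_MPS = 186_282   # miles per second
distance_miles = 11.7e9        # ~11.7 billion miles

seconds = distance_miles / SPEED_OF_LIGHT_MPS
hours, rem = divmod(seconds, 3600)
minutes = rem / 60
print(f"~{int(hours)} h {minutes:.0f} min")  # close to the quoted 17 h 22 min
```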
 
“This is historic stuff, a bit like the first exploration of Earth, and we had to look at the data very, very carefully,” said Edward C. Stone, 77, NASA’s top Voyager expert, who has been working on the project since 1972. He said he was excited about what comes next. “It’s now the start of a whole new mission,” he said. 
 
The lonely probe, which is 11.7 billion miles from Earth and hurtling away at 38,000 miles per hour, has long been on the cusp, treading a boundary between the bubble of hot, energetic particles around the solar system and the dark region beyond. There, in interstellar space, the plasma, or ionized gas, is noticeably denser. 
 
Analyzing the data
 
Dr. Gurnett and his team have spent the past few months analyzing their data, trying to nail down whether what they were seeing was solar plasma or the plasma of interstellar space. Now they are certain it was the latter, and have even pinpointed a date for the crossing: Aug. 25, 2012. 
 
At a news conference on Thursday, NASA scientists were a bit vague about what they hope to get from Voyager 1 from now on. The answer, to some extent, depends on what instruments continue to function as the power supply dwindles. Dr. Stone expects Voyager 1 to keep sending back data — with a 23-watt transmitter, about the equivalent of a refrigerator light bulb — until roughly 2025. 
 
One hope is that Voyager 1’s position will allow scientists to more accurately study galactic cosmic rays, which are high-energy particles that originate outside the solar system. They would use the information to make judgments about what interstellar space is like at even greater distances from Earth. 
 
In its heyday, Voyager 1 pumped out never-before-seen images of Jupiter and Saturn. But it stopped sending home pictures in 1990, to conserve energy and because there was no longer much to see. A companion spacecraft, Voyager 2, also launched in 1977, has stopped sending back images as well. Voyager 2 is moving in a different direction but is also expected to exit the solar system. 
 
Eventually, NASA said, the Voyagers will pass other stars, coasting and drifting and being pulled by gravity. The next big encounter for Voyager 1, in around 40,000 years, is expected to be a dwarf star dispassionately known as AC +79 3888 in the constellation Camelopardalis. 
 
But already, Voyager 1 has achieved what Dr. Gurnett called the “holy grail of heliosphere research.” 
 
Voyager 1 left Solar System
 
Voyager 1 left the solar system the same month that Curiosity, NASA’s state-of-the-art rover, landed on Mars and started sending home gorgeous snapshots. Curiosity’s exploration team, some 400 strong, promptly dazzled the world by driving the $2.5 billion robot across a patch of Martian terrain, a feat that turned the Red Bull-chugging engineers and scientists of Building 264 of the Jet Propulsion Laboratory campus into rock stars. By comparison, the Voyager mission looked like a Betamax in the era of Bluetooth. 
 
The 12-person Voyager staff was long ago moved from the Jet Propulsion Laboratory campus to cramped quarters down the street, next to a McDonald’s. In an interview last month at Voyager’s offices, Suzanne R. Dodd, the Voyager project manager, said that when she attended meetings in Building 264, she kept a low profile in deference to the Mars team. 
 
“I try to stay out of the elevator and take the stairs,” Ms. Dodd said. “They’re doing important work there, and I’ll only slow them down.” 
 
At 52, Ms. Dodd is a relative newcomer to Voyager, having first worked on the mission in 1984. Now she and her team seem poised to return to the spotlight. 
 
As the solar system’s edge grew tantalizingly close, NASA asked the Voyager scientists to increase the amount of data collection. The problem: the 8-track data recorders from 1977 were not exactly bursting with extra space. Could Ms. Dodd even find anyone who specialized in that piece of technology and could coax it to record more? 
 
“These younger engineers can write a lot of sloppy code, and it doesn’t matter, but here, with very limited capacity, you have to be extremely precise and have a real strategy,” she said. 
 
She was able to find her man: Lawrence J. Zottarelli, 77, a retired NASA engineer. He came up with a solution. But would it work? 
 
Zottarelli waited at Voyager mission control one afternoon last month to find out. The first of the newly programmed data dumps was set to come down. Dodd, Dr. Stone and Zottarelli watched two old Sun Microsystems computers like children watching for a chick to peck through an egg. “Nine, eight, seven,” Dr. Stone counted down. 
 
“Everything’s fine,” said Zottarelli, flashing a thumbs up. “You’re on your own now.” 
 
The relief was written all over Ms. Dodd’s face. “It’s not easy flying an old spacecraft,” she said. 
 
Her eyes moved to Dr. Stone, who was peering at a computer through his trifocals. 
 
“There are lots of old missions,” he responded with a sly smile. “But not many are doing exciting new things.”
By Brooks Barnes, Sat, 14 Sep 2013 15:04:04 +0000
Bermuda Triangle earthquake triggered 1817 tsunami http://tehrantimes.com/science/110713-bermuda-triangle-earthquake-triggered-1817-tsunami

A model predicted the tsunami wave height from a January 8, 1817, earthquake offshore South Carolina. 
 
A "tidal wave" violently tossed ships docked along the Delaware River south of Philadelphia at about 11 a.m. ET on January 8, 1817, according to newspapers of the time. Turns out, that tidal wave was actually a tsunami, launched by a powerful magnitude-7.4 earthquake that struck at approximately 4:30 a.m. ET near the northern tip of the Bermuda Triangle, a new study finds.
 
The study links the tsunami to a known Jan. 8, 1817, earthquake. 
 
The temblor shook the East Coast from Virginia south to Georgia, where the seismic waves made the State House bell ring several times. 
 
Based on archival accounts of the 1817 shaking, geologists had gauged the quake's size at magnitude 4.8 to magnitude 6. Now, with new geologic detective work and computer modeling of the tsunami, researchers have considerably revised the earthquake's size. A magnitude-7.4 quake releases almost 8,000 times more energy than a magnitude-4.8 earthquake.
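That energy comparison follows from the standard magnitude-to-energy scaling, in which each unit of magnitude corresponds to a factor of about 31.6 in radiated seismic energy:

```python
# Gutenberg-Richter energy scaling: radiated energy grows by a factor
# of 10**1.5 (~31.6x) for each unit of magnitude.
def energy_ratio(m1: float, m2: float) -> float:
    """Radiated-energy ratio of a magnitude-m1 quake to a magnitude-m2 quake."""
    return 10 ** (1.5 * (m1 - m2))

ratio = energy_ratio(7.4, 4.8)
print(f"{ratio:,.0f}x")  # the article's "almost 8,000 times"
```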
 
The size and location, or epicenter, of the 1817 earthquake has never been pinned down so closely before. 
 
U.S. Geological Survey research geophysicist Susan Hough and her colleagues zeroed in on the source from newly uncovered archival records, looking at where the shaking was strongest. But they weren't sure about the tsunami link: The 11 a.m. arrival time seemed too late for a 4:30 a.m. earthquake. So they created a computer model of the tsunami, testing different locations and magnitudes. The best fit to force a foot-high (30 centimeters) wave up the mouth of Delaware Bay by about 11 a.m. was a magnitude-7.4 earthquake offshore of South Carolina.
 
"That was the eureka moment," Hough told LiveScience's OurAmazingPlanet. "Darned if that wave doesn't hit the Delaware River and slow way down."
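The slowdown Hough describes falls out of the shallow-water approximation used in tsunami models, in which wave speed depends only on water depth, c = sqrt(g * d). The depths below are illustrative round numbers, not figures from the study:

```python
import math

# Shallow-water tsunami speed: c = sqrt(g * d), converted to km/h.
# Depths are illustrative, chosen to contrast open ocean with a river mouth.
G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed_kmh(depth_m: float) -> float:
    return math.sqrt(G * depth_m) * 3.6  # m/s -> km/h

print(f"open ocean (4000 m): {tsunami_speed_kmh(4000):.0f} km/h")
print(f"river mouth (10 m):  {tsunami_speed_kmh(10):.0f} km/h")
```

The roughly 20-fold drop in speed between deep water and a shallow bay is why a wave launched at 4:30 a.m. could plausibly take until 11 a.m. to work its way up the Delaware.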
 
A spooky source
 
The foot-high tsunami wave started about 800 miles (1,300 kilometers) south of Delaware Bay and 400 to 500 miles (650 to 800 km) offshore of South Carolina, according to the study, published in the September/October issue of the journal Seismological Research Letters. That's smack on the northwestern limb of the so-called Bermuda Triangle. 
 
"When we started to say, 'OK, it's the Bermuda Triangle Fault,' that did not go over well," Hough said. "Some of our colleagues didn't want us to get into all this hooey."
 
No obvious culprit jumps out of the seafloor topography, such as a linear feature that could be an earthquake-causing fault, Hough said. But according to ship records, the sea above the temblor's likely epicenter trembled for several years. 
 
Earthquakes can be felt at sea, and ship captains reported shaking before and after Jan. 8, 1817, that could have been foreshocks and aftershocks, the researchers said. Ships in the area also rocked or shook from earthquakes in 1858, 1877 and 1879.
 
"It was interesting enough to mention," Hough said. "People were feeling earthquakes on ships, and earthquakes can damage early ships. Maybe this is part of the thinking that there were strange things going on in that part of the ocean."
 
However, Hough's goal isn't to solve the mystery of the Bermuda Triangle, but rather to fill in the gaps in the East Coast's earthquake history. 
 
Before the new study of the 1817 earthquake, the only other big offshore temblor in recorded history was the 1929 Grand Banks quake, a magnitude-7.2 off the south coast of Newfoundland that unleashed a deadly tsunami.
 
"Grand Banks has been seen as an outlier or a fluke event," Hough said. "If our interpretation is correct, it points to a more distributed [seismic] hazard. Maybe we should expect this kind of earthquake all along the continental shelf."
 
(Source: Live Science)
By Becky Oskin, Sat, 14 Sep 2013 13:17:33 +0000
Arctic ice grows again in August after record 2012 melt http://tehrantimes.com/science/110712-arctic-ice-grows-again-in-august-after-record-2012-melt

The area of Arctic sea ice was nearly 30% greater in August than a year ago, according to recent satellite data, though projections based on longer-term trends suggest the sea ice will continue its decline over time.
 
Arctic sea ice covered 2.35 million square miles in August, up from 1.82 million square miles a year earlier, according to the National Snow and Ice Data Center, or NSIDC, in Boulder, Colo. The level recorded last year was a record low.
 
Arctic sea ice partially melts each summer and re-forms in the winter. "It's been much colder in the Arctic this summer, so not much ice has melted," said Julienne Stroeve, climatologist at NSIDC. The measurements were based on data obtained from U.S. weather satellites. The nearly 30% year-to-year increase partly reflects the extreme low level of sea ice in August 2012.
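The "nearly 30%" figure can be recomputed directly from the NSIDC numbers quoted above:

```python
# Year-over-year change in August Arctic sea-ice extent,
# from the NSIDC figures in the article.
extent_2012 = 1.82e6  # square miles, August 2012 (record low)
extent_2013 = 2.35e6  # square miles, August 2013

pct_increase = (extent_2013 - extent_2012) / extent_2012 * 100
print(f"{pct_increase:.1f}%")  # the article's "nearly 30%"
```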
 
"If you get a record one year, you don't expect another record the next year," said Chris Rapley, professor of climate science at University College London. He also noted that data on the area of sea ice doesn't capture the whole picture, because it doesn't include the thickness — and therefore volume — of sea ice. Scientists say they need to obtain better data to gauge changes to Arctic ice volumes.
 
Arctic sea ice will be a key issue addressed in an October report by the United Nations Intergovernmental Panel on Climate Change that is expected to reiterate a long-term declining trend in Arctic summer sea ice. 
 
NSIDC data show that monthly August ice extent in the Arctic declined 10.6% a decade from 1979 to 2013. Estimates for further declines vary, but some models suggest that the Arctic will lose its August ice cover entirely by 2060, according to Dr. Stroeve.
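For context on how such projections differ, here is a naive straight-line extrapolation of the quoted 10.6%-per-decade trend (measured against the start-of-record extent, an assumption on my part). The 2060 estimates come from physical models, not from a line like this:

```python
# Naive linear extrapolation: if August extent falls by 10.6% of the
# 1979 baseline each decade, when does it reach zero? This is only a
# straight-line illustration; models that project an ice-free 2060
# account for physics a trend line ignores.
baseline_year = 1979
decline_per_decade = 0.106

decades_to_zero = 1 / decline_per_decade
ice_free_year = baseline_year + decades_to_zero * 10
print(f"~{ice_free_year:.0f} under a straight-line trend")
```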
 
The primary significance of this year's increase is that the "narrative of the 'spiral of death' for the sea ice has been broken," according to Judith Curry, climatologist at the Georgia Institute of Technology. "It remains unclear as to what extent the decline in sea ice over the past decades is caused by natural variability versus greenhouse warming. Whether the increase in 2013 is a one year blip in a longer declining trend, or whether it portends a break in this trend remains to be seen."
 
Scientists are continuing to debate the cause of the decline in the rate of warming over the past 15 years. A significant contributing factor seems to be associated with a shift in Pacific Ocean circulation patterns. 
 
(Source: The Wall Street Journal)
By Gautam Naik, Sat, 14 Sep 2013 13:17:36 +0000
PayPal debuts its newest hardware, Beacon http://tehrantimes.com/science/110650-paypal-debuts-its-newest-hardware-beacon

One of David Marcus’s biggest challenges when taking on the role of president of PayPal was extending the platform into point of sale, and making it easier to pay with PayPal than by swiping a credit card. But changing consumer behavior is an enormously difficult task. So he reasoned that the only thing more alluring than swiping a card is simply doing nothing when you check out of a store. Today the company is one step closer to making that a reality. PayPal is debuting Beacon, a new add-on hardware device for merchants that leverages Bluetooth technology to let consumers pay at stores completely hands-free.
 
When thinking through the problem, PayPal played with geolocation leveraging GPS and Wi-Fi for iOS and Android. These technologies are what power Square’s hands-free payment system, ‘Pay with Square.’ But PayPal found that it ended up being a poor experience for the consumer because it sucked the battery life out of their phones.
 
Plus Marcus tells us that the company wanted to find a solution that would scale across point-of-sale systems.
 
PayPal began experimenting with Bluetooth Low Energy (aka BLE), which allows connected devices to communicate with each other while keeping the devices’ energy consumption at a very low level. Last June, PayPal recruited some of its best engineers and designers (and brought on a few hardware experts as well, says Marcus), including Mike Mettler, Hasty Granbery and Josh Bleecher Snyder, to start developing the connected device. The initial goal was to develop a prototype that would leverage BLE to enable a transaction to take place without having an app running, without GPS being turned on, and even without phone signal in places with thick concrete walls.
 
It’s important to note that in iOS 7, Apple is debuting iBeacon, which could provide similar services for apps. But VP of global product Hill Ferguson maintains that this does not extend to Android, and Beacon will work for any smartphone.
 
Beacon was developed, and testing began on PayPal’s San Jose campus in January. So what is Beacon? Essentially it is a small hardware device that runs on its own Wi-Fi, plugs into an outlet and serves as a ‘beacon’ to other connected devices. Any store running a compatible point-of-sale system, including Erply, Leaf, Leapset, Micros, NCR, ShopKeep and Vend, can simply plug a PayPal Beacon device into a power outlet, and the device can be integrated. The device runs its own updates, and Granbery says that the merchant doesn’t have to touch it at all. When plugged into an outlet, the device, which takes a triangular shape similar to PayPal Here’s, lights up. We’re told Yves Behar, who also designed Here, was the brains behind the design of Beacon.
 
Consumers have to have downloaded the PayPal app and opted in to allow retailers to use Beacon for hands-free check-in and payments. Once this is activated, any time a consumer walks into the store, the technology triggers a vibration or sound to confirm a successful check-in (this happens in milliseconds), and the shopper’s photo then appears on the screen of the merchant’s point-of-sale system so they can be greeted by name. The app does not have to be open on the consumer’s smartphone. You order or pay for your goods, paying requires only a verbal confirmation, and the checkout is complete. 
 
PayPal also says it is aware of the potential privacy issues, so unlike some other technologies, PayPal Beacon won’t constantly track your location. If you enter a store and decline to check in, or just ignore the prompt entirely, no information is transmitted to PayPal or the merchant. PayPal has also said that no ads will be served via the platform.
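As a toy model only, with entirely hypothetical names and structures (PayPal's actual Beacon protocol and APIs were not public), the opt-in behavior described above can be sketched as:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the check-in behavior the article describes:
# opted-in shoppers produce a check-in record for the point of sale;
# anyone who declined or ignored the prompt transmits nothing.
@dataclass
class Shopper:
    name: str
    photo_url: str
    opted_in: bool

def handle_entry(shopper: Shopper) -> Optional[dict]:
    """Return the check-in record shown on the point of sale, or None."""
    if not shopper.opted_in:
        # Declined or ignored the prompt: nothing sent to PayPal or merchant.
        return None
    return {"name": shopper.name, "photo": shopper.photo_url}

print(handle_entry(Shopper("Ada", "https://example.com/ada.jpg", True)))
print(handle_entry(Shopper("Bob", "https://example.com/bob.jpg", False)))  # None
```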
 
The broader vision for Beacon is an open platform from which merchants can create compelling, frictionless experiences for their consumers. As part of the announcement today, PayPal is giving developers access to the PayPal mobile in-store API. So, for example, a drugstore could pull up your prescriptions as you walk into the store so you could automatically pay your bill without having to swipe a card. Or you could automatically add your name to the wait list for tables at a restaurant by walking in.
 
Marcus says that PayPal has been showing the device to retailers, large and small, and they “love it.” Part of the reason, he says, is that not only does it help them connect and potentially bring in customers, but it also integrates seamlessly with their existing point of sale systems. “I want to create an operating system for the retail environment,” he tells me.
 
He also adds that the company focused particularly on design. “PayPal was not a hardware company,” he says. But steadily, with the debut of the mobile payments device and platform PayPal Here, and now Beacon, the payments giant is definitely digging deep into hardware. He adds that this is the company’s most sophisticated piece of hardware to ship to date, and that right now the company is not focused on making a profit on Beacon. PayPal says the cost of the device has not been determined yet, but it will be in the double digits.
 
PayPal says it will start piloting Beacon enabled shopping experiences in Q4, and the full roll-out is planned early next year.
 
It’s no secret Marcus has been working tirelessly to create more of a startup culture at PayPal to drive further innovation at the company. If there is anyone who can make this work, it is Marcus, who was a serial entrepreneur until PayPal acquired his mobile payments company Zong a few years ago. He, along with the engineers I spoke to, said that this launch was much different from launches past. 
 
As evidenced by PayPal’s latest app update from last week, the company is making a big bet on creating an in-store experience for both consumers and retailers. It’s ambitious, to say the least: not only does the payments company have to change consumer behavior (away from just swiping a card), but it also has to convince retailers and its more than 100 million users that it is worth it. Marcus says that providing a frictionless experience for both parties will be the key to changing that behavior.
 
(Source: Techcrunch)
By Leena Rao, Tue, 10 Sep 2013 15:38:49 +0000
Scientists help farmers create greener dairies http://tehrantimes.com/science/110649-scientists-help-farmers-create-greener-dairies

PRAIRIE DU SAC, Wis. (AP) -- Cows stand patiently in a tent-like chamber at a research farm in western Wisconsin, waiting for their breath to be tested. Outside, corrals have been set up with equipment to measure gas wafting from the ground. A nearby corn field contains tools that allow researchers to assess the effects of manure spread as fertilizer. 
 
Scientists based at the University of Wisconsin-Madison have started a slew of studies to determine how dairy farms can reduce their greenhouse gas emissions. They will look at what animals eat, how their waste is handled and the effects on soil, water and air. 
 
Their work is part of a government-sponsored effort to help farmers adapt to more extreme weather and reduce their impact on climate change. The studies also will support a dairy industry effort to make farms more environmentally friendly, profitable and attractive to consumers. 
 
The Innovation Center for U.S. Dairy is developing a computer program that will allow farmers to compare water consumption, energy use and greenhouse gas emissions from their farms to the national average and learn how improving their practices could help their bottom line. 
 
"We like to say sustainability makes cents — c-e-n-t-s," said Erin Fitzgerald, the center's senior vice president for sustainability. 
 
Environmentally speaking, the big issue for dairy farms for decades was manure. 
 
Karl Klessig remembers state agents coming to his farm in 2002 and handcuffing him after an unexpected rain washed manure spread several days earlier into nearby Lake Michigan. Klessig was told that if his family didn't immediately till the manure into the ground, tearing up the grass that feeds their cows, he'd soon be in jail. 
 

Environmental awareness
 
It was a big loss, but it "jump-started" their environmental awareness, Klessig said. The family welcomed researchers from UW-Madison and UW-Extension onto its property in Cleveland, about 70 miles north of Milwaukee, for tests that had some unexpected results. 
 
For example, the family had been leaving its pastures untilled for up to a decade to allow the grass to build up density, feeding the cows and reducing erosion. But scientists found that also allowed phosphorus to accumulate in the top layer of soil. Klessig said his family has been able to reduce phosphorus by tilling pastures more often and growing corn, which uses phosphorus to grow. 
 
They also learned the farm was losing hundreds of pounds of soil each year through its drainage system, and that worm holes were allowing manure to run into those pipes. It was nerve-racking to have researchers point out these problems, Klessig said. 
 
"Sometimes you feel like you're on top of the table, and you only have underwear on," he said. 
 
But the scientists also offered solutions, which Klessig said, "made us better farmers." 
 
Studies like the ones done at Klessig's farm helped provide the basis for the computer program being developed by the Innovation Center. The tool will be bolstered by data from a $10 million project led by UW-Madison but including scientists, engineers and scholars from multiple universities. 
 
It is one of four projects funded by the U.S. Department of Agriculture to help farmers in specific regions adapt to climate change while reducing their environmental impact, said Ray Knighton, national program leader for soil and air quality at USDA's National Institute of Food and Agriculture. The other projects involve the beef industry in the southern Great Plains and Southwest, wheat production in the Pacific Northwest and wood production in the Southeast. 
 
Five-year dairy project
 
The five-year dairy project focuses on a strip of the northern U.S. from New York to Wisconsin. It is climate-specific in part because things like temperature affect the amount of milk cows produce. 
 
At the federally owned research farm in Prairie du Sac, scientists are looking at the impact made by relatively small changes. For example, as cows digest, they essentially burp out methane, a greenhouse gas. So, does changing the animal's diet make its breath less toxic? 
 
They're also exploring possibilities like whether there's a relationship between the amount of milk a cow produces and how much methane it gives off. If so, it might be possible to one day tell farmers that cows with certain genes "will enhance your profits but also enhance the environment," said Mark Powell, the USDA soil scientist leading the team of researchers. 
 
His and others' work will eventually be combined into what's called a life cycle assessment that tallies the environmental impact of the entire industry — from the corn grown to feed cows to trucks that deliver milk to grocers. Farmers and others in the dairy industry can then use that information to assess how their decisions add up. 
 
"Engaging the dairy producers is the most important thing on this project," said lead researcher Matt Ruark, a UW-Madison assistant professor and extension soil scientist. "There is a public demand for milk. But cows don't just produce milk, they also produce manure and methane." 
 
Klessig, whose family owns a cheese-making business along with its dairy farm, said farmers are eager for such information because their success depends on making good choices that they can explain to customers. 
 
"We hear it from our customers at the creamery," he said. "It's not that we're organic or we're not organic. They actually want to understand what we're doing."
M.L. Johnson, World - Science & Tech, Tue, 10 Sep 2013 15:35:55 +0000
How is the mobile-security business doing? Don't ask
http://tehrantimes.com/science/110617-how-is-the-mobile-security-business-doing-dont-ask

While as much as 70 percent of PCs in use around the world have security tools installed on them, that's the case for just 5 percent of smartphones and tablets, according to Charles Kolodgy, an analyst at market researcher IDC.
 
"Users don't believe there is much of a threat to these devices," he said. "There has yet to be -- and probably never will be -- a massive worm, virus or Trojan."
 
Digital-security companies aren't exactly eager to highlight this divide. When asked for specifics about mobile usage, some companies obscured or inflated their numbers.
 
Of course, these same companies have used similar tactics in their main PC businesses. 
 
Companies such as Symantec and Intel's McAfee bundle their programs with PCs, as users who have been bombarded with warning messages will know. That can help attract new subscribers, but it has the added benefit of allowing companies to claim higher install rates even when many of those PCs aren't actually using the software. 
 
Even if you don't want antivirus software, odds are that you're going to get it, and you're going to like it -- or else.
 

Mobile-security software
 
Some makers of mobile-security software have similar deals in place with handset manufacturers and mobile-network operators that bundle the programs with their phones. Otherwise, users need to go to an app store on their own and download it. As you can probably guess, not many people do.
 
Here's what some of the biggest security companies said when I asked for mobile stats:
 
--- Symantec and Trend Micro declined to provide figures.
 
--- NQ Mobile, a Chinese company, said it has 372 million registered user accounts, which includes partnerships. Only a third -- 122 million -- are active monthly users. NQ Mobile's products work in the background and continue to provide protection for users even if they are no longer considered active, which is why both numbers are given, said spokesman Kim Titus.
 
--- McAfee said it "has secured more than 150 million handsets with its mobile security solutions." That figure also includes partnerships. As a measure of how often consumers independently seek out the software, McAfee's mobile app has been downloaded more than 4 million times from Google Play, the company said. McAfee said the higher number is used because consumers often don't get their mobile security software from app stores, so their usage isn't reflected in the download count.
 
--- AVG, based in Amsterdam, said it has 44 million active mobile users.
 
--- Lookout Security, based in San Francisco, said it has 45 million users. The figure only includes people who have gone through Lookout's registration process, the company said.
 
--- Kaspersky Lab, a Russian company, said its mobile app has been downloaded more than 1 million times.
 
Considering there are 4.6 billion mobile phones on the planet, the number of people actually using the typically free security apps is "minuscule," said Lawrence Pingree, an analyst at research firm Gartner.
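The gap Pingree describes is easy to quantify from the figures above. A rough back-of-the-envelope sketch (the user counts are the self-reported numbers quoted in this article, and overlap between vendors is ignored):

```python
# Back-of-the-envelope: what share of the world's mobile phones run
# the security apps named above? User counts are the self-reported
# figures quoted in this article; overlap between vendors is ignored.
PHONES_WORLDWIDE = 4_600_000_000

REPORTED_USERS = {
    "NQ Mobile (active monthly)": 122_000_000,
    "AVG (active)": 44_000_000,
    "Lookout (registered)": 45_000_000,
}

def adoption_percent(users, phones=PHONES_WORLDWIDE):
    """Reported users as a percentage of all mobile phones."""
    return 100.0 * users / phones

for vendor, users in REPORTED_USERS.items():
    print(f"{vendor}: {adoption_percent(users):.2f}%")
```

Even the largest self-reported base works out to under 3 percent of phones worldwide, which is the sense in which the usage is "minuscule."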
 
Some companies may have to come up with a new form of measurement.
 
(Source: Global Tech)
Jordan Robertson, World - Science & Tech, Mon, 09 Sep 2013 14:58:52 +0000
Sandy’s ‘freaky’ path may be less likely in future
http://tehrantimes.com/science/110616-sandys-freaky-path-may-be-less-likely-in-future

WASHINGTON (AP) -- Man-made global warming may further lessen the likelihood of the freak atmospheric steering currents that last year shoved Superstorm Sandy due west into New Jersey, a new study says.
But don’t celebrate a rare beneficial climate change prediction just yet. The study’s authors said the once-in-700-years path was only one factor in the massive $50 billion killer storm. They said other variables such as sea level rise and stronger storms will worsen with global warming and outweigh changes in steering currents predicted by the study’s computer models.
 
“Sandy was an extremely unusual storm in several respects and pretty freaky. And some of those things that make it more freaky may happen less in the future,” said Columbia University atmospheric scientist Adam Sobel, co-author of a new study on Sandy. But Sobel quickly added: “There’s nothing to get complacent about coming out of this research.”
 
The study published Tuesday in the journal Proceedings of the National Academy of Sciences looks at the giant atmospheric steering currents, such as the jet stream. 
 
A spate of recent and controversial studies has highlighted unusual kinks and meanders in the jet stream, linking them to extreme weather and the loss of Arctic sea ice. The new study looks only at the future and sees a lessening of some of that problematic jet stream swerving, putting it at odds with those earlier studies in a scientific debate that continues.
 
Both camps agree on what happened with the weird steering that shoved Sandy, a late season hurricane that merged with a conventional storm into a massive hybrid, into New Jersey. The jet stream plunged in an odd way. 
 
A high pressure system off the coast of Canada and Greenland blocked the storm from moving east, as most do.
 
That high pressure block now happens once or twice a year in August, September and October. Computer models show the jet stream will move further north, so the “giant blob of high pressure” will be even less frequent next century, said study lead author Elizabeth Barnes of Colorado State University.
 
But Barnes and Sobel said that because so many other factors are involved, this doesn’t mean fewer storms will hit the New York region. The blocked path is only one route; storms usually approach from the south rather than from the east as Sandy did.
 
Scientists agree that future storms will be slightly stronger because of global warming and that sea level is rising faster than researchers once thought, Sobel said. Those factors likely will overwhelm the predicted change in steering currents, he said.
 
Rutgers University climate scientist Jennifer Francis, one of the major proponents of the jet-stream-is-changing theory, said she doesn’t see the jet stream becoming stronger and moving north as Barnes says the models predict. Her work, and that of others, points to more Sandy-like storms, especially because there seem to be more late-season tropical storms.
Seth Borenstein, World - Science & Tech, Mon, 09 Sep 2013 14:49:39 +0000
Incredible technology: How to fight wildfires
http://tehrantimes.com/science/110583-incredible-technology-how-to-fight-wildfires-

Wildfires, like the Rim Fire raging in Yosemite, Calif., are some of nature's most awesome, and devastating, spectacles, devouring large swaths of forest and grassland in hours.
 
Battling such blazes requires firefighters to pair traditional techniques, such as firebreaks that contain the voracious flames, with newer technologies, like drones and satellite imaging, that monitor the fire's progress.
 
Wildfire activity has been 50 percent above average for the last five years, said Julie Hutchinson, battalion chief of the California Department of Forestry and Fire Protection (CAL FIRE). If uncontained, these fires pose a threat to human life and property. 
 
"We're always looking for technology that could benefit the public and firefighters and provide an additional layer of safety," Hutchinson said.
 
Containing the blaze 
 
Once a wildfire gets going, containing the blaze is the immediate priority. The standard response includes fire trucks (and related equipment), ground crews, bulldozers and aircraft. On the ground, firefighters lay down fire hoses along the fire's edge, every 100 feet (30 meters) or so. Then firefighter crews or bulldozers create what's known as a firebreak or fire line around the perimeter of the blaze, a strip of land or trench where any potential fuel — such as dry brush or grass — has been removed.
 
"We don't want the fire to come out of that area, and the only way to do that is to remove any fuel," Hutchinson told LiveScience.
 
When the media reports a fire is "X percent contained," X refers to the fraction of the fire's circumference around which a fire line extends. For example, if 9 miles (14 kilometers) of fire line surround a fire that is 10 miles (16 km) in circumference, the fire is 90 percent contained.
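The containment figure reported in the media is simply that ratio of completed fire line to fire perimeter. A minimal sketch of the calculation (the function name is my own, for illustration):

```python
def percent_contained(fire_line_miles, perimeter_miles):
    """Containment: the share of the fire's perimeter that has a
    completed fire line around it, expressed as a percentage."""
    if perimeter_miles <= 0:
        raise ValueError("perimeter must be positive")
    return 100.0 * fire_line_miles / perimeter_miles

# The article's example: 9 miles of fire line around a fire with a
# 10-mile perimeter means the fire is 90 percent contained.
print(percent_contained(9, 10))  # 90.0
```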
 
Sometimes, firefighters create a controlled burn to direct the fire's spread. In a technique called "firing out," a fire is created between the wildfire and a natural barrier, such as a road, to remove any vegetation in the wildfire's path.
 
Aircraft play an important role, too. Helicopters fly over and dump water or sometimes suppressant foam on fire hotspots. The foam acts as insulation to prevent unburned fuels from catching fire. Fixed-wing aircraft called air tankers fly over the blaze dumping flame-retardant chemicals, such as ammonium phosphate.
 
Surveillance from above
 
Air tankers are also used to monitor fires from above. Aerial camera footage and GPS data are fed into a computer system to improve models of the fire's behavior.
 
For the first time, firefighters have deployed a Predator drone to fly over the Rim Fire. In contrast to a manned airplane, the remotely piloted aircraft doesn't risk the life of the pilot, and can fly over the fire for much longer. Firefighters are using information gathered by the drone to guide the allocation of firefighting resources on the ground to where they are most needed. The aerial view also reveals the location of critical infrastructure such as power lines, gas lines and water systems in the fire's path. 
 
Computer models serve an important role in predicting how a wildfire will behave. The predictions take into account the weather, landscape and fuel conditions. These models provide a snapshot of the fire's potential, Hutchinson said. "Where it becomes important is when you start having multiple fires in a state, and you're having to allocate resources," she added. 
 
To better understand when and where wildfires occur, researchers comb through satellite imagery. The U.S. Forest Service and U.S. Geological Survey are using data from the Landsat Earth-observing satellites to study every major fire in the country since 1984, mapping its severity.
 
Mark Cochrane, a senior scientist at the Geospatial Sciences Center of Excellence at South Dakota State University, is using satellite data to determine the best techniques for preventing wildfires. "This information helps us understand how what we've done on the landscape affects fires now," Cochrane told LiveScience. Though it varies by region, forest thinning and prescribed burns — both of which aim to eliminate fire fuel before the fire occurs — seem to be the most effective methods, he said.
 
"It's inevitable wildfires will occur," Cochrane said. "Accepting that fire is part of the landscape, what can we do to inoculate [the land] so that areas where people are living or that are highly valued do not burn?"
 
(Source: Live Science)
Tanya Lewis, World - Science & Tech, Sun, 08 Sep 2013 15:33:24 +0000
Climate change: Warm words and cool waters
http://tehrantimes.com/science/110582-climate-change-warm-words-and-cool-waters

A report that the current "pause" in global warming could be linked to cyclic cooling in the Pacific will be interpreted by climate skeptics as evidence that global warming isn't happening, and by politicians as a reason to forget about climate change and carry on with business as usual. Both responses would be dangerously wrong.
 
There is no serious argument within climate science about the link between carbon dioxide levels and temperature. 
 
Between 1970 and 1998 the planet warmed at an average of 0.17C per decade, and from 1998 to 2012 at 0.04C per decade. 
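Those per-decade rates translate into total warming over each period as the rate multiplied by the elapsed decades. A simple sketch using the figures above:

```python
def total_warming(rate_per_decade, start_year, end_year):
    """Total warming in degrees C: per-decade rate times elapsed decades."""
    return rate_per_decade * (end_year - start_year) / 10.0

# Figures quoted in the article:
print(total_warming(0.17, 1970, 1998))  # about 0.48 C over 28 years
print(total_warming(0.04, 1998, 2012))  # about 0.06 C over 14 years
```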
 
Carbon dioxide levels in the atmosphere, however, continued to rise and are now higher than at any time in the last 800,000 years. Twelve of the 14 warmest years on record have occurred since 2000; the last two years have been marked by catastrophic floods in Australia and record-breaking temperatures in the U.S.; and the loss of north polar ice has accelerated at such a rate that climate modelers expect the Arctic Ocean to be routinely ice-free in September after 2040.
 
There is, however, a serious debate about why the observed temperatures have not kept pace with computer-modeled predictions and where the heat that should have registered on the global thermometer has hidden itself. One guess – supported by some sustained but still incomplete research – is that the deep oceans are warming: that is, the extra heat that should be measurable in the atmosphere has been absorbed by the sea. This is hardly good news: atmosphere and ocean play on each other, and any stored heat is likely to be returned to the atmosphere sooner or later, in unpredictable ways. 
 
Atmosphere and warmth
 
The real lesson from the latest finding is that there is a lot yet to be understood about how the planet works, and precisely how ocean and atmosphere distribute warmth from the equator to the poles.
 
The other message is that more warming is on the way. This is because the planet has still to experience the consequences of greenhouse gas emissions of 20 years ago or more. 
 
Just as the gas flame under the kettle takes a few minutes to warm the water, so the additional energy put into the atmosphere takes a decade or two to make itself felt. That is why climate modelers who have been puzzled by the lower-than-expected rise in temperatures have also warned that extremes of heat will increase in the next decade or two, whether or not governments start to reduce emissions.
 
But they also warn that if the world goes on burning coal, oil and gas at ever-increasing rates then by the second half of the century continents could move to a new climate regime in which the coldest summer months will be substantially hotter than the hottest experienced today. The news is not reassuring, and complacency is not an option.
 
(Source: The Guardian)
Science Desk, World - Science & Tech, Sun, 08 Sep 2013 15:25:04 +0000