Sci/Tech Review (Nov 2016)

10 Most Amazing Discoveries of 2016



Adobe Voco ‘Photoshop-for-voice’ causes concern

  • 7 November 2016

A new application that promises to be the “Photoshop of speech” is raising ethical and security concerns.

Adobe unveiled Project Voco last week. The software makes it possible to take an audio recording and rapidly alter it to include words and phrases the original speaker never uttered, in what sounds like their voice.

One expert warned that the tech could further undermine trust in journalism.

Another said it could pose a security threat.

However, the US software firm says it is taking action to address such risks.

Voice manipulation

At a live demo in San Diego on Thursday, Adobe took a digitised recording of a man saying “and I kissed my dogs and my wife” and changed it to say “and I kissed Jordan three times”.

The edit took seconds and simply involved the operator overtyping a transcript of the speech and then pressing a button to create the synthesised voice track.

“We have already revolutionised photo editing. Now it’s time for us to do the audio stuff,” said Adobe’s Zeyu Jin, to the applause of his audience.

He added that to make the process possible, the software needed to be provided with about 20 minutes' worth of a person's speech.

Dr Eddy Borges Rey – a lecturer in media and technology at the University of Stirling – was horrified by the development.

“It seems that Adobe’s programmers were swept along with the excitement of creating something as innovative as a voice manipulator, and ignored the ethical dilemmas brought up by its potential misuse,” he told the BBC.

“Inadvertently, in its quest to create software to manipulate digital media, Adobe has [already] drastically changed the way we engage with evidential material such as photographs.

“This makes it hard for lawyers, journalists, and other professionals who use digital media as evidence.

“In the same way that Adobe’s Photoshop has faced legal backlash after the continued misuse of the application by advertisers, Voco, if released commercially, will follow its predecessor with similar consequences.”

ID checks

The risks extend beyond people being fooled into thinking others said something they did not.

Banks and other businesses have started using voiceprint checks to verify customers are who they say they are when they phone in.

One cybersecurity researcher said the companies involved had long anticipated something like Adobe’s invention.

“The technology is new but its underlying principles have been understood for some time,” said Dr Steven Murdoch from University College London.

“Biometric companies say their products would not be tricked by this, because the things they are looking for are not the same things that humans look for when identifying people.

“But the only way to find out is to test them, and it will be some time before we know the answer.”

Watermark checks

Google’s DeepMind division showed off a rival voice-mimicking system called WaveNet in September.

But at the time, it suggested that the task needed too much processing power to find its way into a consumer product in the near future.

For its part, Adobe has talked of its customers using Voco to fix podcast and audio book recordings without having to rebook presenters or voiceover artists.

But a spokeswoman stressed that this did not mean its release was imminent.

“[It] may or may not be released as a product or product feature,” she told the BBC.

“No ship date has been announced.”

In the meantime, Adobe said it was researching ways to detect use of its software.

“Think about watermarking detection,” Mr Jin said at the demo, referring to a method used to hide identifiers in images and other media.
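Adobe has not published how Voco's watermark detection would work, but the general idea of hiding an identifier in media can be sketched in a few lines. The toy example below is not Adobe's method and all names are illustrative; it simply embeds a bit pattern in the least-significant bits of 16-bit audio samples, where it is inaudible to listeners but trivially machine-readable:

```python
# Toy audio watermark: hide an identifier in the least-significant bit
# (LSB) of successive samples. Changing the LSB alters each sample by at
# most 1/32768 of full scale, far below the threshold of hearing.

def embed_watermark(samples, mark_bits):
    """Overwrite the LSB of the first len(mark_bits) samples."""
    marked = list(samples)
    for i, bit in enumerate(mark_bits):
        marked[i] = (marked[i] & ~1) | bit
    return marked

def extract_watermark(samples, n_bits):
    """Read the LSBs back out of the first n_bits samples."""
    return [s & 1 for s in samples[:n_bits]]

mark = [1, 0, 1, 1, 0, 0, 1, 0]                      # 8-bit identifier
audio = [1200, -340, 55, 7801, -12, 0, 999, -4500]   # fake 16-bit samples
marked = embed_watermark(audio, mark)
recovered = extract_watermark(marked, 8)             # == mark
```

Real watermarking schemes are far more robust (surviving compression, resampling, and noise), but the detection step works the same way: look for a known hidden pattern that editing software deliberately left behind.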


The Feds Created a Helium Problem That’s Screwing Science

SO YOU MAY have heard about a helium shortage. Helium is indeed so light that it can float up and out of Earth’s atmosphere—but that’s not the real problem. The trouble, says a new report, is actually political, a string of bad decisions that threw helium prices into chaos. The result: headaches and canceled experiments for scientists, and a few new ideas for how to keep buying the profoundly useful element.

At one point, the government had stored a billion cubic meters of helium in a massive cavern in Amarillo, Texas: the Federal Helium Reserve, overseen by the Bureau of Land Management. In 1996 Congress passed a law to gradually shut the facility down and sell off the reserves, but this depressed prices, which screwed up the market and discouraged competition. A second bill in 2013 was supposed to help fix it, but—surprise—it ended up discouraging competition in different ways, according to a report last week from the Government Accountability Office. When the reserve shuts down in a few years, scientists expect even more volatility.

That’s a problem, because helium is more than just a delightful gas that floats balloons and gives us Mickey Mouse voices. It boils—which is to say, becomes a gas—at minus 452.2 degrees Fahrenheit. Or, put another way, it becomes a liquid at the lowest temperature of any element in the universe. So superchilled liquid helium plays an irreplaceable role in scientific research. Low-temperature physicists use it to power their dilution refrigerators, which can cool samples down to a fraction of a degree above absolute zero. At these temperatures, molecules have almost no kinetic energy and can barely move. Physicists can then measure tiny quantum effects obscured at higher temperatures. For similar reasons, liquid helium minimizes fluctuations in telescopes. The team behind the BICEP2 telescope in Antarctica, for example, lugged liquid helium to the South Pole, where it’s already pretty cold—just not liquid-helium-cold.
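Those two temperature figures are the same statement in different units; converting between them is simple arithmetic. A minimal sketch using the standard Fahrenheit-to-Kelvin formula (the function name is ours, not from the article):

```python
def fahrenheit_to_kelvin(f):
    """Standard conversion: K = (F + 459.67) * 5/9."""
    return (f + 459.67) * 5.0 / 9.0

# Helium's boiling point from the article, expressed in kelvin:
he_boil_k = fahrenheit_to_kelvin(-452.2)   # about 4.15 K
```

That is roughly 4 degrees above absolute zero (0 K, or minus 459.67 °F), which is why liquid helium is the coolant of last resort for dilution refrigerators and superconducting magnets.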

Liquid helium is also used to cool superconducting magnets in everything from magnetic resonance imaging (MRI) machines to the Large Hadron Collider. The materials that make those magnets only superconduct at temperatures a few degrees above absolute zero—temperatures only possible with liquid helium. “Helium is the only element we can use reliably. There is no alternative,” says Tom Rauch, a global sourcing manager for GE Healthcare, which makes and services MRI machines.

[Image: A helium-cooled GE Healthcare MRI under construction. The thermal shield of the MRI is wrapped in layers of aluminum mylar. Credit: GE Healthcare]
But what happens if labs can’t afford it, or can’t plan when to buy it? Industrial and military applications—such as semiconductor manufacturing, leak detection, and diving—actually account for most helium used in the US. And the military can handle price changes. But smaller users with fixed budgets, like physics labs, can’t. “It’s just killer when prices fluctuate,” says William Halperin, a physicist at Northwestern University.

Lance De Long, a physicist at the University of Kentucky, has been forced to abandon experiments because of helium prices. His lab makes new materials and then analyzes them using a machine with a helium-cooled superconducting magnet. This year, helium cost him $35 per liter—unusual to be sure, as other researchers have reported prices anywhere from $6.50 to $12. But that illustrates the variability in prices all over the country. Scientists have also been coping with a general upward trend, with prices rising 50 percent since 2000.

On the other hand, helium’s irreplaceability has forced some scientists to become much more creative in how they buy and use it. In 2014, the American Physical Society and the American Chemical Society connected with the Defense Logistics Agency, which buys helium for the military, to broker lower costs for researchers. The pilot is tiny—only seven universities—but it’ll expand if it’s successful.

Another possibility stems from a fundamental property of the element. It’s a noble gas, which means that it doesn’t react—or combine—well with almost anything else. Given the right kind of (expensive) capture systems, you can recycle and reuse helium. Labs and industrial facilities are installing those systems to grab back helium that escapes into the air.

For now, scientists are just hoping for more stable prices. International producers such as Qatar have recently stepped up production. But helium sellers around the world set their prices according to Federal Helium Reserve auctions, so all eyes are on the Bureau of Land Management to set better rules.

How Wolves Change Rivers

Mysterious ninth planet could one day tear apart the solar system


When scientists announced earlier this year that they might have discovered a ninth planet in Earth’s solar system, the finding brought comfort to those of us still mourning Pluto’s downgrade to “dwarf planet.” But new research out of the University of Warwick shows that the mysterious “Planet Nine,” which is believed to circle the sun once every 15,000 years, could eventually spell disaster for our solar system.

While it sounds terrifying, don’t get too worried – by the time Planet Nine could become a threat, the sun will have died of natural causes and Earth will no longer be habitable. Hopefully, humanity will have moved on to a new home by then. It all has to do with the lifecycle of our solar system, according to physicist Dr. Dimitri Veras. Roughly seven billion years from now, our sun will begin to die, inflating to a huge size as its mass begins to blow away. This enormous fireball will swallow the inner planets, including the Earth, before the star fades back into a smoldering ember known as a “white dwarf.” At that point, the sun will have shrunk down to a dense star about the same size as the Earth.

Related: Astronomers may have discovered a ninth planet in our solar system

In the past, scientists believed the outer planets of the solar system would survive this shift, continuing to orbit at a safe distance. Not a happy end for Earth or the inner planets, perhaps, but certainly not the death of the solar system itself. However, Dr. Veras has projected that the presence of a large planet beyond the known reaches of our solar system could cause some of the surviving giant planets to be ejected into space instead.

When the sun expands and becomes a red giant billions of years in the future, the four known gas giants will be pushed further out into the solar system. However, Planet Nine – which has been estimated to be 10 times more massive than Earth – may not experience the same shift. Instead, it might find itself thrust inward, interfering with the orbits of Uranus and Neptune and potentially tearing apart the solar system as we know it.

Related: New NASA tech could provide the entire solar system with internet

It’s important to stress that this is still a hypothetical scenario. While scientists are fairly certain that Planet Nine exists because of its observed influence on the orbits of objects beyond Neptune, it hasn’t been visually confirmed by telescope. That means we don’t know exactly how far out it might be located – and according to Dr. Veras, the location is crucial. The further out Planet Nine is orbiting, he says, the likelier it is that the solar system will meet a violent end.

Dr. Veras’ findings will be published in the Monthly Notices of the Royal Astronomical Society.



The Universe Has Almost 10 Times More Galaxies Than We Thought

This image shows a portion of the sky used to recalculate the total number of galaxies in the observable universe.

NASA, ESA/Hubble via AP

Counting all the galaxies in the universe is hard. So hard, it seems, that it’s possible to miss billions of them.

A new analysis of Hubble Space Telescope data finds there are almost 10 times more galaxies in the universe than we once thought there were — about 2 trillion of them, up from about 200 billion.

It’s the first major revision to the number since 1995, when scientists turned Hubble’s gaze on one section of sky for 10 days and created an image, unveiled in 1996, that NASA called “mankind’s deepest, most detailed optical view of the universe.”

Based on the single section of sky and the galaxies that showed up in it, astronomers extrapolated that the entire universe should have about 200 billion galaxies.
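That 200 billion figure came from exactly this kind of proportional scaling: count the galaxies visible in one patch, then divide by the fraction of the sky the patch covers. A minimal sketch of the arithmetic (the patch numbers below are illustrative round figures, not the survey's actual values):

```python
# Proportional extrapolation: if a patch covering a fraction f of the
# sky contains n galaxies, and the patch is representative, the whole
# sky should hold roughly n / f galaxies.

def extrapolate_total(galaxies_in_patch, patch_sky_fraction):
    return galaxies_in_patch / patch_sky_fraction

# Illustrative: 10,000 galaxies counted in a patch covering
# one twenty-millionth of the sky.
total = extrapolate_total(10_000, 1 / 20_000_000)   # about 2e11
```

The catch, as the new analysis shows, is the "representative" assumption: a count is only as complete as the telescope is sensitive, so galaxies too faint for Hubble were silently excluded from both the patch count and the extrapolated total.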

But that far from settled the question. Twenty years later, the new analysis begins by noting that the number of galaxies in the universe is still “a fundamental question.”

The study, led by Christopher Conselice of the University of Nottingham and accepted for publication in The Astrophysical Journal, used deep-space images from the Hubble telescope, as well as other deep-space data that had already been published, to create a 3-D image of the observable universe.

(The observable universe is just the universe that we can see from Earth.)

The results of the latest study suggest that 90 percent of the galaxies in the observable universe are too faint or too far away for current telescope technology to see. That means the 1996 estimate, which was based only on what Hubble could see, was way off.

“It boggles the mind that over 90 percent of the galaxies in the universe have yet to be studied. Who knows what interesting properties we will find when we discover these galaxies with future generations of telescopes?” Conselice said in a NASA press release.

“In the near future, the James Webb Space Telescope will be able to study these ultra-faint galaxies,” he said. The James Webb Space Telescope, one of the most expensive things NASA has ever built, is scheduled to launch in 2018, and will be able to peer deeper into space than its predecessor Hubble.




Elon Musk unveils ‘Solar Roof’








The Brightest Supermoon


