Channel: Popular Science | RSS

Hey Did You Guys Know There's an iPad Event Today?

iPad 3 Announcement Apple

Oh hey, looks like it's Apple Event Day! Apple will be announcing (almost certainly, anyway) a new iPad, which may be called the iPad 3 or the iPad HD, the latter moniker coming from the rumored inclusion of an ultra-high-resolution "Retina" display like the iPhone 4's. The fun starts at 1PM EST, and we recommend the liveblog run by the folks at GDGT. Check out our earlier post for a quick check on the rumor situation, but otherwise, sit back and enjoy the show (not that you'll really be able to avoid it).


An Apple TV With Siri Could Finally Replace Remote Controls

No More Remotes! Dan Nosowitz
Stop with the gimmicky remotes. Give me voice or give me buttons

One of the smaller rumors going around about today's Apple event predicts that Apple will release a new version of its little black set-top box, Apple TV. A sub-rumor suggests that this Apple TV might incorporate Siri, Apple's voice-command Lady of Wonder. Siri on Apple TV could legitimately be the first alternative way to control your TV that isn't actually worse than a black plastic stick with buttons on it.

The handheld, button-powered remote, evolutionarily, is kind of as good as it will ever be. Manufacturers of TVs, as well as assorted gadget and accessory makers, are all too willing to cram whatever trendy technology they can find onto remotes. Touch controls! Capacitive screens! Apps! Motion sensing! And none of them really work.

Meanwhile, the need for them to work is growing: As TVs and connected boxes do ever more things, we need a way to navigate them that's as effective as the way we navigate through our phones, our computers, and our tablets. New TVs have apps, they have games, they have news and photos and video chat and Netflix and Hulu and Pandora and Spotify and even live TV. It's time for a new system that feels like an elegant and appropriate way to take advantage of all that good stuff.

Samsung has shown off a new remote control that eliminates almost all of the buttons in exchange for a touchpad. I like eliminating buttons; one of the problems with the venerable clickers is that they've turned into forearm-sized slabs with dozens of unhelpfully labeled buttons, most of which you'd never use even if you understood what they did. It's time to tear it down and start over, so I'm happy to see some work on that front.

But this remote is not the answer. It has some very good ideas, to be fair. Samsung's new Smart TVs ape both the gesture and voice control from Microsoft's Kinect, and Samsung stuck a little microphone in the remote itself, so you can speak at a relatively normal volume instead of hollering across the room at your TV. Smart! But I didn't care for the remote itself. There's no easy way to "favorite" or "bookmark" channels, which means you have to scroll through a list of them with a downward swipey motion or type them in with a keypad. Only, there is no keypad; there's an onscreen grid with the numbers one through nine on it, which you scroll through with the touchpad. It's a pretty bad compromise. And the general problem with touch commands is a lack of physical feedback: there is no benefit, besides trendiness, to replacing a button that you can grope for and depress in the dark with a touch area that you cannot.

Other alternative methods of control are lacking, too--anyone who has tried to navigate through the Kinect's menus with their arms knows that it's silly and fun but so inefficient compared to an Xbox controller that no reasonable person would really rely on it as their go-to method of control. Roku's fancified new remote packs motion-sensing as well--and aside from its use to control a slightly awkward version of Angry Birds, you're better off sticking to the buttons.

There are quite a few smartphone apps that act as remotes as well. These are good in some ways--searching works pretty well, with a pop-up keyboard on your phone--but the problem with touch is that you can't look at your phone and at the TV at the same time. You'll either end up hitting the wrong parts of your digital control pad while trying to do it by feel, or you'll bob your head up and down like a filthy pigeon trying to maintain sight of two screens. And we are not pigeons, my friends. No, we are not!

For me, the Boxee Box's remote is about as good as it gets: a tiny stick, on one side boasting a simple five-way directional pad, a play/pause button, and a back button. Flip it over, and you have a QWERTY keyboard, with actual buttons. But is this what we've come to? Where's the future? Where are our mind-reading TVs?

The closest we've gotten to a true next-gen option is the Microsoft Kinect's voice control. It's come a very long way, and in some specific instances, it's better than the Xbox controller. Search especially comes to mind--scrolling through an on-screen alphabet is an annoyance that'll grow in frequency as more and more people hook their TVs up to the internet. But things like moving between screens, browsing through lists when you're not sure what to watch--they're just better suited to pressing a button. The Kinect takes some helpful steps toward that difficult task of browsing: you can search for some metadata, like directors and actors. But the Kinect is still a closed environment--it has, at the moment, a limited number of responses to a limited number of specific commands. Siri doesn't have that problem.

Siri's ability to parse human language and deliver results--theoretically, I mean--could be perfect for lounging on your couch on a Sunday afternoon, wondering what to watch. It's not a "voice command" situation, really: you don't give it trigger words, you just speak normally and ask for what you want, which is a definite step up from the current capabilities of Kinect. You could ask for movies similar to your favorites. You could ask for that movie about the magicians, or something, where David Bowie plays Nikola Tesla (that is a real movie). You could ask for a TV show with a mystery, made in the last ten years. Assuming Apple can get Siri to live up to expectations, this could be pretty amazing. And Siri isn't the be-all-end-all of voice command; it's not the only service to use natural speech recognition, and there's no particular reason why Microsoft (or Google, or anyone else) couldn't jump in and implement it just as well. Apple just has a head start, because Siri's already packaged and ready to do this kind of task.

Of course, there are some tasks for which I'm not sure a remote will ever be bested. Saying "Xbox! (or Siri!) Pause!" just isn't as quick and painless as hitting the play/pause button. But for browsing, searching, and discovering, voice command should be the future. We're not sure that future will be today--there's a pretty good chance today will be iPad-only--but the promise of Siri-powered TV is enticing whether or not it becomes a reality in a few hours.

Stop the Foolishness: Of Course Pink Is a Color

Flamingo Flock szeke via Flickr

There's an interesting volley online right now between two science journalists I admire, Michael Moyer and Robert Krulwich, over the color pink. In a blog post for NPR last week, Krulwich was shocked to learn from a video that pink does not exist, and on Monday Moyer responded with his own ontological take.

Pink is not found in the ROYGBIV electromagnetic spectrum. But that does not mean it isn't a color! There's a key distinction between colors of light and of pigment, between additive color and subtractive color, that everyone seems to be glossing over.

As Newton figured out, white light is a compendium of all the parts of the visible spectrum. This is the additive model of color. Take away the blue hues, and a white light will appear yellowish; and so on and so forth. Yet when you combine paints of every color, you get black. That is subtractive color theory, in which black, not white, represents the presence of all color. Additive color is the realm of transmitted light, while subtractive color is the realm of pigments, dyes and chemicals, explains Honolulu-based color expert Jill Morton. Morton is a former professor at the University of Hawaii who has consulted for Xerox, Kodak and others to whom color matters. "Of course pink is a color," Morton says, "but with that said, pink is indeed not part of the light spectrum. It's an extra-spectral color, and it has to be mixed to generate it."

A pink wavelength of light doesn't exist, and this is where the argument seems to have started. Specifically, in a 60-second video by Minute Physics, in which pink lies between red and violet on a curled-up light spectrum.

Thinking of pink the way we think of transmitted light is confusing, hence the controversy. Here's the thing: When you look at a pink object - that is to say, a thing that contains pigments or dyes which render it pink to light receptor cells in your eyes - you are not seeing pink wavelengths of light. An object appears pink because certain wavelengths of light are reflected, and others are absorbed, quenched, by the pigments. Pink is a reflective color, not a transmissive color - you can perceive it because your brain translates light bouncing off it. Color is a construct of our eyes and brains.

"If you take a tube of red paint and add white to it, you'll get pink. If you work with watercolors, take red paint and add a lot of water to it and put it on watercolor paper, that would be pink," Morton says. "Technically it's right that you can't generate pink in the rainbow colors. But you can mix other colors in light to get pink. ... This is about interpreting the visual world."
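Morton's point about mixing can be sketched in a few lines of code. This is a toy additive-mixing model, not anything from the article: the RGB channel values and the equal-intensity averaging are illustrative assumptions.

```python
# Toy illustration of additive color mixing: averaging the RGB
# components of red light and white light yields a pink that has
# no single-wavelength counterpart in the spectrum.

def mix_lights(*colors):
    """Additively mix equal-intensity RGB lights by averaging channels."""
    n = len(colors)
    return tuple(sum(c[i] for c in colors) // n for i in range(3))

red = (255, 0, 0)
white = (255, 255, 255)  # all spectral hues together

pink = mix_lights(red, white)
print(pink)  # → (255, 127, 127), a light pink
```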

So there you have it.

And for the record, my defense of pink has nothing to do with my personal double-X chromosomes. I sort of resent defending a hue society says I am biologically supposed to like. But I will come to the defense of any color.

Apple's New iPad: The Screen Is Better, and It's Faster

The New iPad Kevork Djansezian/Getty Images

Apple just announced the newest iPad, which will be called the iPad, and not the iPad 3 or iPad HD or iPad: Eddie Bauer Edition or with any other modifier. The big hits: it's got a better screen ("better" in this case meaning Apple has stolen all the pixels in the world and crammed them into the new iPad), a faster processor, an optional 4G LTE chip, and some software updates.

So the big news here is the screen resolution, which is bonkers. At 2048 x 1536 pixels, it has significantly more pixels than not just the old iPad, but also your 1080p HDTV and just about any other "HD" screen you've got lying around. Apple's again referring to it as a "Retina Display," which means that you won't be able to see pixels with your bare, weakened, organic eyeballs, at least from the 15-inch distance Apple estimates most people have between their eyes and their iPads. (Note: Apple is clearly excluding cats, which have more sensitive eyes and sit closer to the iPad anyway. This is offensive, probably.) That'll mean every publisher is going to have to re-do their apps for the new resolution, the same way they had to for the transition from the iPhone 3GS to the iPhone 4.
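For the curious, the pixel claims check out with some back-of-envelope arithmetic. The 9.7-inch, 4:3 screen geometry is Apple's spec; the 1080p figures are the standard HDTV numbers, included for comparison.

```python
# Back-of-envelope check of the new iPad's pixel claims.

ipad_px = 2048 * 1536
hdtv_px = 1920 * 1080
print(ipad_px, hdtv_px)  # 3145728 vs 2073600 pixels

diag_in = 9.7
width_in = diag_in * 4 / 5  # 3-4-5 triangle: width = diagonal * 4/5
ppi = 2048 / width_in
print(round(ppi))  # → 264 pixels per inch
```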

Also included: a new processor, dubbed the A5X, that provides quad-core power. The Tegra 3, which powers a few Android tablets, is also quad-core, but Apple needs the power to burn through the kinds of apps that will take advantage of the new display. Plus there's a 4G LTE antenna, either for AT&T or Verizon, which is pretty great--we're totally impressed by LTE, so it's great to see it here. And Apple has presumably worked some kind of gypsy magic to keep the battery life steady at 10 hours for Wi-Fi and nine hours for 4G--we've seen LTE absolutely slaughter battery life in phones, but hopefully Apple's worked that out.

Oh, and the company also announced a new Apple TV! Seems like basically just a firmware upgrade, though it's not clear if older Apple TVs will get it. There's a new version of the software, some nice bonuses like being able to use iCloud for 1080p movie streaming, and shows available the next day for purchase. No Siri, though, which bums us out because we spent all morning dreaming about how sweet that'd be.

The iPad and Apple TV both maintain pricing: iPad starts at $500, moving up from there due to storage and 4G inclusion, and the Apple TV costs $100.

Captured: The First-Ever Images of Atoms Moving Inside a Molecule


Ohio State University researchers have captured the first-ever images of atoms moving within a molecule using a novel technique that turns one of the molecule's own electrons into a kind of flash bulb. The technique has yielded a new way of imaging molecules, and could one day help scientists to intimately control chemical reactions at the atomic scale.

The images were snapped using an ultrafast laser, which fired 50-femtosecond pulses (a femtosecond is a quadrillionth of a second) at the molecule to knock a single electron out of the molecule's outer shell. This electron, having been knocked out of its proper place, comes crashing back into the molecule, and in doing so it provides the kind of illumination the researchers need to image the molecule itself.

By measuring the scattered signal of the electron as it collides with the molecule, they were able to reconstruct the inner workings of the molecule, including things like the positions of the atoms' nuclei. Moreover, because there is a very short lag between when the electron is knocked out and when it comes crashing back, the researchers are able to capture the movement of the atoms within that period, essentially allowing them to make a frame-by-frame film of atomic motion within molecules.

For their experiments the researchers used oxygen (O2) and nitrogen (N2) molecules because their molecular structures are well understood. The next steps will involve imaging more complex molecules and, eventually, figuring out the potential for precisely controlling chemical reactions right down to the atomic level.

Is Typing Changing the Way We Think?

Words you type with your right hand are happier

According to a new study exploring the interrelation of language and emotion, if a word consists of letters typed with predominantly the right hand, it is felt to be more positive in meaning; a word typed predominantly with the left hand is associated more with negative emotion.

The study included speakers of English, Spanish, and Dutch, both left- and right-handed, but all users of QWERTY keyboard layouts. Even made-up words like boshe, cheethe, plake, and throog were given value judgments that aligned with how they'd be typed.

What this seems to mean is that the everyday action of typing on our keyboards has infiltrated how we perceive and feel about the world.
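The study's hand-assignment measure is easy to sketch. The left/right letter split below is the conventional QWERTY touch-typing division--an assumption on our part, not necessarily the paper's exact coding.

```python
# A minimal sketch of scoring a word by which hand types its letters,
# assuming the standard QWERTY touch-typing zones.

LEFT = set("qwertasdfgzxcvb")
RIGHT = set("yuiophjklnm")

def right_hand_ratio(word):
    """Fraction of a word's letters struck by the right hand."""
    letters = [c for c in word.lower() if c.isalpha()]
    return sum(c in RIGHT for c in letters) / len(letters)

# The made-up words from the study, scored the same way:
for w in ["boshe", "cheethe", "plake", "throog"]:
    print(w, right_hand_ratio(w))
```

By this measure "plake" leans right-handed (3 of 5 letters) while "boshe" leans left-handed (2 of 5).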

Accordingly, the saddest story ever written is "Dexter Weaver Serves Breaded Crested Grebe" by Dallas Wiebe, which he wrote on only the left-hand side of his typewriter in the 1960s. An excerpt follows:

dear secret fasters #
easter exacts vast rewards # screw fasts # feasts create a freer texas # crab rears fast # water bearer deceases # stars agree a sweet taste averts graves # set feet faster # sweetwater deserves carvers stabbers eaters carafe drawers steadfast feeders # wear fast rags at 5 # 2 # 45 caveat regrets regarded bad#
dexter fred weaver # sweetwater texas

To Show Us All the Winds of Space, NASA Will Launch Five Rockets at the Same Time

Launch Madness at Wallops NASA/Wallops
The tracers will light up the sky on the East Coast

One rocket launch is a good time, but five rocket launches is a party. And at NASA's Wallops Flight Facility in Virginia, the party is on. This month, NASA will launch five sounding rockets within about five minutes from Wallops on a mission to test the winds in the high-altitude jet stream some 60 miles up.

The high-altitude jet stream is considerably higher than the one driving the weather down here at the surface, but it does drive the weather in the ionosphere. The charged particles and electric currents there can interfere with spacecraft and satellite communications, a problem exacerbated by the turbulence brought on by the jet stream. So NASA has a vested interest--both in terms of atmospheric science and in terms of spaceflight and satellite design--in gaining a better understanding of how this works.

So later this month, five sounding rockets will be launched in rapid-fire succession (for you rocket geeks, those will be two Terrier-Improved Malemutes, two Terrier-Improved Orions, and one Terrier-Oriole--more about those via the link below) carrying a milky white tracer chemical into the high-altitude jet stream where it will be released. That will let scientists -- and everyone else -- on the ground "see" the winds up there at the edge of space, where they typically gust at 200 to 300 miles per hour.

Electron Ptychography Could Create Tiniest, Sharpest Images Ever

Lily Anther, Still Alive Using a new microscopy technique, scientists can see living cells clearly without the need to stain them, a process which usually kills the cells. These cells are from a lily anther, the pollen-carrying reproductive portion of the flower. University of Sheffield

A new microscopy method that ditches lenses altogether could create the highest-resolution images ever seen. The system reconstructs an image from the electron waves scattered by a sample, and has no fundamental experimental limits imposed by constraints like blurry glass or wavelengths of visible light. It can even be used to image live cells without harming them.

It's called electron ptychography, and it enables a dramatic improvement in the resolution of the best-available microscopes. Transmission electron microscopes fire a beam of electrons to enable scientists to see atomic features, but they have limitations, especially at high magnifications. Unlike visible-light microscopes, they can't resolve detail at the scale of their imaging wavelength, because the images are inevitably blurred by tiny aberrations in the lenses.

To get around imperfect lenses, this new method measures the diffraction patterns of electrons, X-rays or even visible light waves. The researchers who developed it, led by John Rodenburg of the University of Sheffield in the UK, say it is a new paradigm in electron imaging.

"Provided the correct phase structure can be assigned to this diffraction data, it can then be inverted to realize a completely aberration-free image," Rodenburg and colleagues write. This is what they've figured out: How to calculate the intensity and phase of the imaging wavelength.
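That phase problem is easy to illustrate with a toy one-dimensional example--a sketch of the general principle, not the team's actual algorithm. A recorded diffraction pattern keeps only the magnitude of the scattered wave, and magnitude alone can't even distinguish an object from a shifted copy of itself:

```python
import cmath

# A diffraction pattern records |F|, not F itself. Toy demo: an
# object and a shifted copy produce identical magnitude spectra,
# so intensity alone cannot tell them apart -- recovering the
# phase is what breaks the tie.

def dft(xs):
    """Plain discrete Fourier transform (stand-in for the far field)."""
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(xs)) for k in range(n)]

obj = [0, 1, 2, 1, 0, 0, 0, 0]   # a toy "object"
shifted = obj[3:] + obj[:3]      # same object, moved over

mags = [abs(c) for c in dft(obj)]
mags_shifted = [abs(c) for c in dft(shifted)]

print(all(abs(a - b) < 1e-9 for a, b in zip(mags, mags_shifted)))  # True
print(obj == shifted)                                              # False
```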

The recorded image represents the pattern of electrons or light waves scattered by the object you're looking at through the microscope. Working backward, the team can figure out exactly what scattered the waves, creating a blur-free image. The goal is to create the highest-resolution transmission image ever, about one-tenth of an atomic diameter, Rodenburg said in a Sheffield news release. The team is apparently already working with the European Space Agency to send an electron ptychography microscope to the moon in 2018 to study lunar dust.

"This approach will enable us to look at how atoms sit next to one another in a solid object as if we're holding them in our hands," he said in a statement.

The paper appears in Nature Communications.

[via Science Daily]


Robotic Balloon Cranes Could Turn Any Shore Into a Seaport

Tethered Air's Idea for a Robotic Balloon Crane Jeremy Wiley, Tethered Air via LiveScience

The future of seaports could be not having any seaports at all. At least that's the vision of Jeremy Wiley, founder of Tethered Air. Wiley envisions a system of robot cranes tethered to huge heavy-lifting balloons that could be deployed anywhere to ferry shipping containers or other heavy cargo from ship to shore, or vice versa. Such systems would be relatively inexpensive, he says, and could create a near-instant seaport just about anywhere water meets the land.

From a technical standpoint, Wiley's idea is somewhat similar to the Skycam that zooms around overhead at NFL football games. Four anchor cables suspended from a lifting balloon connect with four payload cables that actually connect to the cargo. Those anchor cables reach from the balloon to the ground, and through the controlled shortening and lengthening of all of these cables by robotic mechanisms, the balloon and the payload can be moved around the space inside the four anchor cables.

Wiley dreamed up the idea as a U.S. Marine in Kuwait helping to move cargo from ship to shore prior to the invasion of Iraq. The military has long struggled with the logistics of moving all the men and materiel necessary to make war in a timely and efficient manner, and he began to see the potential of a robotic crane that could be deployed quickly. Likewise, in humanitarian crisis situations like the 2010 Haiti earthquake, moving cargo quickly into areas that have possibly been devastated (or developing areas that never had good deep water ports to begin with) can be the difference between life and death.

Using a robotic balloon lift, you could have a heavy-lift seaport crane in place in a matter of hours just about anywhere. It could be used in places where shipping tends to bottleneck--like at the Panama Canal--without creating any more costly permanent infrastructure. And, at an estimated $7-10 million for a robo-balloon-crane big enough to lift shipping containers from ship to shore and back, a working Tethered Air crane or something like it might just be a steal of a deal. That's dirt cheap next to the cost of building an actual seaport, and for coastal cities that otherwise could never even dream about building a deep water port facility, it could quickly pay for itself. Much more at LiveScience.

[LiveScience]

Video: Gigantic Solar Flare Will Wash Over the Earth This Morning

Solar Flare and CME, March 6, 2012 One of two coronal mass ejections associated with two flares on March 6 arrived at Earth this morning. NASA

We love a good solar storm, and a pair of flares that erupted late Tuesday are just the latest in a string of activity from our awakening sun. The second-biggest flare yet burst forth Tuesday and spewed a coronal mass ejection toward our planet, the leading edge of which arrived this morning.

The first flare was a powerful X5.4, making it the second largest flare of this cycle, after an X6.9 on August 9, 2011. Two hours later the sun blew out an X1.3 flare. Both flares expelled large coronal mass ejections, which are traveling 1,300 and 1,100 miles per second, respectively. Both of these new CMEs will hit us and Mars, as well as some spacecraft.
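Those speeds square with the "arrived this morning" timing. Here's a rough transit-time estimate, assuming (unrealistically) that each CME coasts at its quoted speed all the way to Earth; real CMEs decelerate, so treat these as lower bounds.

```python
# Rough Sun-to-Earth transit time at the quoted CME speeds.

SUN_EARTH_MILES = 93_000_000  # mean Earth-Sun distance

for speed_mps in (1300, 1100):  # miles per second, from the article
    hours = SUN_EARTH_MILES / speed_mps / 3600
    print(f"{speed_mps} mi/s -> about {hours:.0f} hours in transit")
```

About 20 to 23 hours, which fits a late-Tuesday eruption arriving the following morning.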

These solar eruptions are emanating from a pretty active stellar region that just rotated into Earth view, according to NASA. The sun is growing more active as it approaches solar maximum sometime in 2013.

Here are some videos of the new flares. Take note of the solar tsunami - the entire surface of the sun ripples with the force of the eruption.

[ESA, NASA]

Reconnaissance Orbiter Captures a Twister on the Martian Plains

A Twister on Mars NASA/JPL-Caltech/Univ. of Arizona

Pictured: a Martian dust devil twisting across the Amazonis Planitia region. The 100-foot-wide column of swirling air was captured by the Mars Reconnaissance Orbiter last month as it passed over the northern hemisphere of Mars.

Unlike an actual tornado, dust devils--which also occur on Earth--are generally spawned on clear days when warm air along the ground begins to spiral as it moves through cooler air above it. They are visible because of the dirt they kick up (hence the name), and against the monotone backdrop of Martian soil this massive devil sticks out. It may look tiny in the pic above, but the column towers half a mile high.

Just another day at the office for NASA's Mars orbiter, which has been in service since 2006. Fun fact: though it is the youngest Mars orbiter flying to date, SPACE reports that it has delivered more data back home to Earth than all other interplanetary missions combined. When you're recording that much imagery, you're bound to get an amazing shot like this one every once in a while.

[SPACE]

This Month, James Cameron Will Dive to the Deepest Point on Earth in a Custom-Built Sub

Deepsea Challenger Diving Crews continue in-water testing in Papua New Guinea of DEEPSEA CHALLENGER, the submersible that explorer and filmmaker James Cameron will pilot to the bottom of the Mariana Trench. The vessel is the centerpiece of DEEPSEA CHALLENGE, a joint scientific project by Cameron, the National Geographic Society and Rolex. Photo by Mark Thiessen/National Geographic

An expedition to the deepest place on this planet is set for later this month, in a custom-built one-man sub to be helmed by Hollywood director James Cameron. The vessel Deepsea Challenger will dive to the deepest part of the Mariana Trench, seven miles into the Challenger Deep, where Cameron will shoot 3-D video.

The 26-foot-long ship was built by a team of Australian engineers and fits just one person. It has several cameras and shares qualities of a race car and a torpedo, according to National Geographic, which is sponsoring the expedition along with Rolex.

Cameron will spend about six hours at the bottom of the trench, collecting samples for scientists and shooting footage for a documentary. He just completed a series of five-mile-deep untethered test dives off Papua New Guinea and is readying for the real thing later this month. Cameron says the mission is meant to inspire interest in the oceans and in exploration.

Suspiciously absent from the discussion is any mention of new footage for an "Avatar" sequel, which supposedly will be set in the oceans. But that's probably OK.

The director could become only the third person to dive to the bottom of the Mariana, following then-Navy lieutenant Don Walsh and Swiss oceanographer Jacques Piccard, who made the dive in the bathyscaphe Trieste on Jan. 23, 1960. The Deepsea Challenger would be the first manned vessel to return to the Challenger Deep since that 1960 dive.

Cameron could soon have plenty of company, however - as we read last week, Richard Branson's Virgin Oceanic venture is also building a sub that can withstand the deep's 1,100 atmospheres of pressure. Google executive chairman Eric Schmidt has donated money to marine technology company DOER Marine to build another ship, according to the BBC. But if Cameron's trip goes to plan, he'll be first, and along with glory he could win an X Prize.
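That 1,100-atmosphere figure is easy to sanity-check with the simple hydrostatic formula rho × g × h. The seawater density and the roughly 10,900-meter depth used below are typical textbook values, not numbers from the article.

```python
# Hydrostatic pressure at the bottom of the Challenger Deep.

RHO = 1025      # kg/m^3, typical seawater density (assumed)
G = 9.81        # m/s^2
DEPTH = 10_900  # m, commonly cited Challenger Deep depth (assumed)
ATM = 101_325   # Pa per standard atmosphere

pressure_pa = RHO * G * DEPTH
print(round(pressure_pa / ATM))  # → 1082 atm, close to the quoted 1,100
```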

Nat Geo built a nice website for the expedition, with plenty of photos and background on the mission - click here to see their coverage.

[via BBC]

Today in Good Reads: Inside the Mysterious Disorder of the Le Roy High School Girls

Le Roy, New York Oatka Creek, Le Roy, New York. Wikipedia

One of the more bizarre and sad television-news gawkeries in recent memory has centered on the uncontrollable tics of a group of high school girls in upstate New York. The afflicted patients have been shown flailing on the "Today Show," Erin Brockovich got involved, and the community has been up in arms. In next week's New York Times Magazine, journalist Susan Dominus offers a reasoned and sympathetic explanation of the psychology behind the girls' behavior, the mass hysteria that ensued, and the power of group behavior.

A couple of months ago in Le Roy, a working-class town near Rochester, a group of girls started waving their arms, stuttering or otherwise behaving abnormally, and no one could explain it. Le Roy residents even contacted Brockovich, the environmental crusader, to investigate claims of pollution from the town's shuttered factories. Psychologists offered varying opinions, but the mystery remains.

The Times story is well worth a read.

Reconstructed: a Dinosaur With Four Iridescent Wings

Reconstruction of Microraptor Jason Brougham, American Museum of Natural History
Microraptors like these used to roam the Earth

Dinosaur fans have barely recovered from the revelation that some dinosaurs were not scaly, but feathered. Now it looks like those fans will have to update their mental imagery once more: several of the same researchers who brought us vivid images of colorfully feathered velociraptors two years ago have come out with a paper today in Science offering conclusive evidence of the first known iridescent dinosaur.

That distinction belongs to the microraptor, one of a class of birdlike carnivores called dromaeosaurs (colloquially, "raptors") that lived roughly 120 million years ago. It stands out from other raptors thanks to its unusual four-winged form and to the fact that, weighing in at less than 1 kilogram, it was small enough to fly, at least in a sense: scientists believe that the air resistance provided by its feathers helped it glide from tree to tree.

Those four wings appear even more striking now that we know they were iridescent. Iridescence is a property of some surfaces - such as soap bubbles, oil slicks, and birds like peacocks, hummingbirds, and grackles - that makes them appear to change color when viewed from different angles. When it occurs in plumage, it's because of alternating layers of keratin, air, and melanin at the cellular level of the feather, all scattering light at different frequencies.

By studying a set of feathers from modern birds, for which both the cellular structure and the appearance are known, the Science team was able to derive a set of structural factors that tend to go along with iridescence. The most telltale sign? Long, narrow melanosomes, the organelles in a cell in which melanin is stored. After collecting 20 samples of fossilized melanosomes from a well-preserved microraptor specimen from China's Liaoning province, the researchers were able to demonstrate that its slender melanosomes unambiguously match those associated with iridescence in the modern bird species for which data are available.

We can't be nearly as confident, warn the researchers, about which type of iridescence the microraptor's feathers displayed. Iridescence can take many forms depending on the organization of the melanosomes and the keratin in which they are embedded. Examples in modern birds range from the grackle's simple blue-black sheen, produced by a smooth layer of melanosomes topped by a smooth layer of keratin, to the peacock's ever-shifting rainbow, produced by a spongy layer of keratin dotted with air bubbles.

Since keratin doesn't fossilize, that structural information has been lost for the microraptor. "Imagine if you had a brick building and you removed the cement. All the bricks would collapse together," said molecular paleobiologist and paper co-author Jakob Vinther. Lacking the ability to infer the microraptor's particular kind of iridescence, his team produced a conservative reconstruction that assumed the simplest kind, the glossy blue-on-black found on grackles and crows.

Knowing that microraptor's feathers were iridescent also sheds light on another recent debate. A 2011 paper in Science claimed to be able to infer whether a dinosaur was nocturnal (active at night) or diurnal (active during the day) based on the shape of its scleral ring, a feature of the eye, and used this method to declare the microraptor to be nocturnal. But this recent discovery raises a question: why would a nocturnal dinosaur have showy iridescent feathers that can only be seen in daylight? Vinther noted, "Nocturnal birds don't have feathers designed for display purposes, because in the daytime when their feathers could be seen, they're inactive." Think, for example, of the drab plumage of owls. "This is an example of why it's important to learn about coloration, because it can help us question other kinds of evidence," Vinther said.

The Mixed Blessings of the iPad's Retina Display

iPad 3 With LTE Apple
The iPad's screen is amazing--but for lovers of magazines, photos, and videos, there may be some unexpected downsides

The newest iPad's new Retina display is a marvel of engineering: a combination of exacting manufacturing, advancements in LCD technology (smaller transistors lead to smaller pixels, which equals higher pixel density at lower power), and possibly some gypsy magic paid for with Jonathan Ive's toenails. With four times the resolution to work with, apps are going to look almost painfully sharp. But it's not an immediate win/win: almost everything that currently looks crystal-clear on an iPad's screen will need a high-resolution overhaul to look equally good spread across 3.1 million pixels. And that comes with some drawbacks, both for app developers and consumers.

Every developer we've spoken to ultimately views the four-fold resolution increase on the new Retina display as a very good thing. But most now have their work cut out for them to fill all those pixels, which is not a trivial task.

For digital magazines and newspapers, the increased pixel density will bring the new iPad's screen close enough to the dots-per-inch of print that the distinction is basically unnoticeable, which will mean gorgeous visuals and razor-sharp text. Behind the scenes, it won't be tremendously difficult to update magazines (since publications that also exist in print are already dealing with high-resolution files), but it will take some time to adapt them to digital formats. All photos and videos have to be swapped, layouts will have to be adjusted, interfaces may have to be altered, and measures must be taken to ensure that the right version is delivered to the right customer (no sense sending the high resolution version to older iPads).

Then there's the problem of downloading. Many apps, after upgrading their interfaces and graphics to higher resolutions, are inevitably going to become larger downloads. Magazines will probably become much larger. A typical issue of a publication like The New Yorker weighs in at around 150MB. That's already large, especially for a weekly publication--and that's a mostly-text magazine. For many graphically-intensive magazines, like, say, Popular Science, issues can be twice that size.

The Retina display's resolution is four times bigger than before: twice as many pixels vertically, and twice as many horizontally. But thankfully, that doesn't mean an across-the-board four-fold increase in file sizes: Text, vector graphics and many interface structures will scale without much or any increase in file size. The largest increase will come from photos and videos, which need more resolution to scale up cleanly.
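The arithmetic behind that claim is quick to check; here is a minimal sketch using the two displays' published dimensions (1024 x 768 for the iPad 2, 2048 x 1536 for the new iPad):

```python
# Pixel counts for the iPad 2 versus the new iPad's Retina display.
ipad2 = (1024, 768)
ipad3 = (2048, 1536)

def pixel_count(dims):
    """Total number of pixels for a (width, height) pair."""
    w, h = dims
    return w * h

old_pixels = pixel_count(ipad2)   # 786,432
new_pixels = pixel_count(ipad3)   # 3,145,728 -- the "3.1 million pixels" above
scale = new_pixels / old_pixels   # 4.0: twice the width times twice the height
```

Doubling each linear dimension quadruples the pixel count, which is why photo and video assets, not text or vectors, dominate the file-size growth.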

Huge download sizes are a problem in their own right: you won't be able to store as many back issues on your iPad, and it's worth noting that Apple hasn't upgraded the iPad's storage capacity in this version.
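A rough back-of-the-envelope calculation shows the squeeze; this sketch assumes an entry-level 16GB iPad with roughly 14GB usable after the OS, and the issue sizes quoted above (these are illustrative assumptions, not measured figures):

```python
def issues_that_fit(free_space_gb, issue_size_mb):
    """How many back issues fit in the given free space."""
    return int(free_space_gb * 1024 // issue_size_mb)

issues_that_fit(14, 150)  # 95 issues at a New Yorker-sized 150MB
issues_that_fit(14, 300)  # 47 issues if Retina assets double that
```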

Beyond magazine and newspaper apps, the issue of upscaling content will rear its head elsewhere, especially with videos. Any video you watch now on your iPad 2 (say, one that's 960 x 540 pixels on the web) will appear much smaller on the new iPad's Retina display. That's because 960 pixels fills almost all of the iPad 2's 1024 pixels of horizontal resolution, yet only about half of the new iPad's 2048-pixel horizontal span. To view the same video at the same size on the new iPad, it must either be stretched to fit (losing quality) or replaced with a higher resolution version (making it a much larger file).
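The shrinkage is easy to quantify; a quick sketch using the 960-pixel-wide video from the example above:

```python
def fraction_of_width(video_width, screen_width):
    """Fraction of the screen's horizontal span a video covers at 1:1 pixels."""
    return video_width / screen_width

fraction_of_width(960, 1024)  # 0.9375: nearly full-width on an iPad 2
fraction_of_width(960, 2048)  # 0.46875: under half the new iPad's width
```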

For streamed videos from services like Netflix and Hulu, you'll get a helping hand from the 4G LTE antenna, but even with a solid connection on 4G or Wi-Fi, you'll be streaming at 1080p at best--which is still a slightly lower resolution than the iPad's screen, meaning full-screen playback will be stretched. And with a less-than-ideal connection, quality will drop further.

Nimrod Gat, a developer who's worked on Boxee's iPad app, is mostly optimistic about the new hardware. He also noted that there's more pressure now to take advantage of that screen: "Users expect breathtaking experiences," he said, "so you should definitely consider that when trying to create a new app." More breathtaking apps is certainly a good thing. But he also pointed out that the decreased quality of standard-def video--which currently looks just fine on previous iPads--will be "very noticeable" on the new Retina display.

From the developer's side, he said that users won't have to choose between an "iPad" version for the iPads 1 and 2 and an "iPad HD" version for the new iPad. "You can supply different assets--images, icons--and they'll do the magic for you," he said. The proper assets for your iPad are delivered automatically, but both sets come in a single universal download--meaning original iPad users will have to download all the high-resolution assets even though they won't be used. Gat noted that many smaller apps would then break the 20MB limit for downloads over 3G. Tellingly, Apple raised that limit to 50MB just yesterday.
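The "magic" Gat describes works roughly like iOS's scale-suffix convention, where a Retina device prefers an "@2x" variant of each image. Here's a minimal sketch of that selection logic (the `pick_asset` helper and file names are hypothetical illustrations, not Apple's actual API):

```python
def pick_asset(base_name, scale, available):
    """Return the @2x variant on a Retina screen if it exists, else the 1x file."""
    if scale >= 2:
        candidate = base_name.replace(".png", "@2x.png")
        if candidate in available:
            return candidate
    return base_name

files = ["icon.png", "icon@2x.png"]
pick_asset("icon.png", 2, files)  # "icon@2x.png" on the new iPad
pick_asset("icon.png", 1, files)  # "icon.png" on older iPads
```

Note that both files ship inside the one universal download either way; the selection happens on the device, which is exactly why older iPads end up storing assets they never draw.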

Photos will probably look incredible on the new iPad--if they're big enough. Our friends at Popular Photography delved into this more deeply, but the gist is that many photo services on the web deliver photos in sizes significantly smaller than the Retina display's native resolution. Flickr, for example, limits you to 1,000 pixels in height and width--roughly a third of the new iPad's screen area. Even crazier, the new Photoshop Touch app, which is otherwise pretty cool, only lets you work with photos up to 1,600 by 1,600 pixels--which still can't fill the whole screen.

Sending photos among friends will be tricky too--you're probably compressing just about every photo you send without realizing it. Facebook compresses photos. Twitpic too. So do services like Instagram and Picasa. We're definitely interested to see how images from these services look on the new screen; there's a fair chance that photos won't quite pop as much as you'd like them to--at least at full size.

In the end, more pixels on the screen and less space in between them is a very good thing for content creators. But don't be surprised if many of the services you enjoy now in crystal clarity on your iPad take a little while to start serving enough resolution to provide the same experience on a Retina display.


Lockheed's Space Fence Prototype Starts Tracking Space Junk With Advanced Radar

Space Fence Lockheed's Space Fence prototype is tracking orbital debris in a test program for the U.S. Air Force. Lockheed Martin

Orbital debris is a large and growing problem, and no one is quite sure how to deal with it - polar lasers, nets and other concepts are still merely ideas. But we should at least monitor all that space trash, to be certain where it is and whether it's heading for something we want to protect, like the ISS or a military satellite. The Air Force's new Space Fence, designed to keep an eye on space trash, is getting closer to reality.

Lockheed Martin, which is vying for the contract to build the new Space Fence, just switched on its prototype space-debris radar system. Lockheed's concept uses new ground-based radar arrays to scan the whole sky, tracking more than 200,000 objects as small as 0.8 inches across.

The radar is a solid-state S-band system, like those used on Navy warships and in some weather radar. Its higher frequency--and correspondingly shorter wavelength--lets it detect much smaller objects than the current space-debris tracker, the Mercury-era Air Force Space Surveillance System. That system can monitor only 20,000 of the 100 million or so objects in orbit, and it can't see anything smaller than 11 inches in diameter. Plenty of orbital debris is a lot smaller than that, especially after accidental collisions and deliberate explosions blast satellites to smithereens.
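The link between frequency and detectable object size comes down to wavelength; here's a rough sketch (S-band spans roughly 2-4 GHz, and treating the wavelength as a loose floor on comfortably detectable object size is a simplification of real radar physics):

```python
C = 299_792_458  # speed of light, m/s

def wavelength_cm(frequency_ghz):
    """Radar wavelength in centimeters for a given frequency in GHz."""
    return C / (frequency_ghz * 1e9) * 100

wavelength_cm(2.0)  # ~15 cm at the bottom of S-band
wavelength_cm(4.0)  # ~7.5 cm at the top; shorter waves pick out smaller debris
```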

Raytheon is also competing for the Space Fence contract, which will be worth $3.5 billion. This new Lockheed prototype is a demonstrator worth $107 million, and the Air Force formally approved its design a week ago.

The Air Force is expected to award the contract later this year, and the Space Fence will be operational by 2017. Lockheed made this explainer video to show off its new system.

New Breed of 'Nuclear Clocks' Could be 100 Times More Accurate than Today's Atomic Timekeepers

Measuring the Cosmos With a Better Atomic Clock NASA/ESA

Atomic clocks are the most accurate timekeepers in the world, but a "nuclear clock" would be even better. An international team of researchers from the University of New South Wales, the University of Nevada, and Georgia Tech has proposed a new kind of atomic timekeeper that wouldn't lose or gain 1/20th of a second in 14 billion years--roughly the age of the universe. It would be 100 times more accurate than the best atomic clocks we have right now, the researchers claim.
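That headline figure translates into a fractional accuracy; here's a quick check of the arithmetic using the numbers above (1/20th of a second over 14 billion years):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year in seconds

def fractional_error(drift_seconds, years):
    """Clock drift expressed as a fraction of total elapsed time."""
    return drift_seconds / (years * SECONDS_PER_YEAR)

fractional_error(0.05, 14e9)  # ~1e-19, versus roughly 1e-17 for today's best
```

A fractional error near 10^-19 is indeed about two orders of magnitude beyond the ~10^-17 class of the best atomic clocks of the day, consistent with the researchers' "100 times" claim.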

Conventional atomic clocks keep time by the motion of an electron around an atom's nucleus, using the electron as a sort of pendulum that ticks off units of time at highly regular intervals. The new proposal calls for using lasers to orient an atom's electrons so precisely that a clock could effectively see past them and use a neutron orbiting the atom's nucleus as the timekeeper instead.

Neutrons are held so tightly to the nucleus (unlike electrons, which orbit at an atomic distance) that they are nearly completely immune from any kind of outside interference, whereas loosely bound electrons can be affected by external influences ever so slightly. In terms of atomic clocks, "ever so slightly" is actually a troublesome margin of error even if it is just small fractions of a second over extremely long periods of time. The newly proposed clock could help test physical and quantum theories with unprecedented accuracy by improving the precision of timekeeping by a couple of orders of magnitude.

DARPA Wants Humanoid Robots That Can Drive Tractors, Open Doors and Save the Day


DARPA is poised to launch a new Grand Challenge for a humanoid robot, according to robotics insiders - and the result could be a souped-up metal soldier running alongside BigDog, driving an ATV, unlocking doors and clearing a path to safety for its human counterparts. There's no official agency announcement yet, but robotics companies heard all about it at a recent industry day.

An official broad agency announcement is reportedly due soon, but here's what we can expect, according to the robotics researchers at Hizook: a bipedal humanoid 'bot that can climb into and out of a tractor or open-frame utility vehicle, drive it, get out, walk to a locked door, open it and go inside; then walk through a rubble-strewn hallway, climb a ladder, find a leaky pipe, seal it and fix a broken pump. Wow. The robot will be remotely operated, at least in part, but DARPA also plans to introduce some communications latency so the robot will have no choice but to act on its own.

DARPA's tall orders have previously resulted in some pretty amazing creations, from inflatable robots to the super-fast Cheetah, so it's reasonable to expect that this challenge will yield some incredible results. Gill Pratt, who was part of the programs that led to ARM-H and Cheetah, among others, announced this new challenge at a DTRA Industry Day recently, reports Hizook's Travis Deyle.

DARPA will fund six robotics hardware teams and a dozen software teams, Pratt said. There's also some sort of competition between already-funded teams and unpaid software teams - if the unpaid guys perform better, they'll start receiving DARPA funding.

Given the difficulty of the challenge, the plan is to run the competition twice, one year apart. Either way, this sounds pretty amazing.

There are a few early contenders, most likely PETMAN - watch him run below. But who knows what else engineers will come up with? Like Deyle, this magazine has wondered why American bipedal robot development has generally lagged behind that of our Asian counterparts. This could be a step in the right direction.


[Hizook]

How It Works: An Underground Robot Library

Underground Robot Library Graham Murdoch
The research library of the future

Engineers from Dematic, a firm that builds automated parts and storage-retrieval systems for Boeing, Ford and IBM, designed a five-story underground storage area managed by five robotic cranes. Dematic has built 17 automated library systems worldwide, but the University of Chicago's is the most complex. The company has three more libraries under construction. See how it works here.

An App That Gives Apes the Ability to Control Robots (This Is a Real Thing)

Bonobo and Lexigrams Bonobo Hope Great Ape Trust Sanctuary

Did you know that bonobos have a "fascination with computers"? No? Neither did we. But a new Kickstarter project from the Bonobo Hope Great Ape Trust Sanctuary in Des Moines, Iowa needs funding to make every bonobo's technological dreams a reality--from operating vending machines to, improbably, controlling their own robots.

The bonobos are apparently already pretty adept at recognizing and communicating through lexigrams, a language used by apes in which words are replaced by specific images (which may or may not have anything to do with the actual word). At the moment, the bonobos use a big interactive board filled with lexigrams, but in Iowa, the researchers want to give them the ability to move about and control their own environment, which means the bonobos need a portable version. Looks like they're working on a version for Android Honeycomb--I see a Motorola Xoom pictured.

The app would give the bonobos the ability to control vending machines, interact with people on the go, play games (involving squirt guns, which, please give us video of squirt-gun-wielding bonobos), and for some reason, control robots with a terrifying bust of another bonobo. Imagine you get transported to some highly advanced alien planet and the aliens give you an unkempt, screaming mannequin on wheels to play with. That would freak me out, but perhaps this is one of the key differences between humans and bonobos!

[Kickstarter via @jennydeluxe]
