Hungry? Try A 3-D Printed Extra Cheese Pizza For Dinner

I can’t imagine this tastes that great, but it’s probably better than some of the bargain basement brands in the grocery store. This would also open the door to essentially most baked and pastry-type items since they’ve managed the basic “dough with toppings” delivery system.


Researchers have been talking about 3D printing food for a long time, but when NASA decided it was time to find new ways to provide food to astronauts on long trips (like to Mars), 3D printed food became more relevant than ever. As Earth’s population continues to grow rapidly, some people believe 3D printed food is the future of food. I have to admit, it sounds tastier than eating bugs for protein. This 3D printed cheese pizza actually looks pretty good.

It’s clear that in the future, we will no longer have huge grocery stores stocked with everything we could ever want to buy. We’ll have to get more creative about food as it relates to our survival. With over 7 billion people on the planet, and with more people born every day, there really is no other option. 3D printed food sounds good to me, especially when it’s a cheese pizza. According to news station KXAN:

“Powdered ingredients that can keep for years are mixed into individual vessels. A heated plate then receives a square of dough, a layer of sauce, and some cheese topping. Twelve minutes later – voila – an appetizing little pizza.”

The top layer of the printer is what melts the cheese. If NASA is able to send 3D printers into space, they’ll save a ton of room in the spaceship since they won’t have to pack all those boxes of food for the trip. Also, the astronauts would have a much more appealing diet than just eating space food for every meal.

This reminds me of the movie The Matrix, and how they talked about the goop they had to eat each day. I’m sure they would have liked to have a 3D printer to make a cheese pizza instead of that juicy oatmeal stuff they ate. If you want to see a 3D printer create even more food, click over here to 3D Printing Industry and watch a printer make some pretty creative looking pancakes. Someday we might all have a food printer in our kitchen. It’s not as far-fetched as you might think.

3D Printed Cheese Pizza Looks Delish

NASA Finds Water On Mars

This is a stunning discovery. Of course, finding clay (or water after a sample’s been super-heated) isn’t the same thing as finding drinking water, but it does mean more excitement and interest will be focused on NASA. I believe we need far more money spent on space (and undersea) exploration.

NASA’s Curiosity Rover Just Found Water in Martian Soil

Just when you thought ol’ Curiosity was digging in for the winter, the little discovery machine came up with a doozy: It discovered water in Martian soil. NASA scientists just published five papers in Science detailing the experiments that led to the discovery. That’s right. There’s water on Mars.

Impressive as it is, though, the discovery comes with some caveats. It’s not like Curiosity stumbled on a lost lake under a mountain or a stream trickling across the landscape. Rather, it found water molecules bound to other minerals in Martian soil. There’s kind of a lot of it, too. Researchers say that every cubic foot of Martian soil contains about two pints of liquid water. All told, about two percent of the Martian soil is made up of water.
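Those figures are easy to sanity-check with a quick unit conversion (a back-of-the-envelope sketch; the two-pints-per-cubic-foot figure comes from the article):

```python
# Convert "about two pints of water per cubic foot of soil" into metric units.
US_PINT_L = 0.473176       # liters in one US liquid pint
CUBIC_FOOT_M3 = 0.0283168  # cubic meters in one cubic foot

water_l = 2 * US_PINT_L                 # ~0.95 L of water per cubic foot of soil
liters_per_m3 = water_l / CUBIC_FOOT_M3

print(f"{water_l:.2f} L per cubic foot")          # 0.95 L
print(f"{liters_per_m3:.1f} L per cubic meter")   # 33.4 L
```

So a cubic meter of surface soil would yield on the order of 33 liters of water, which is why Leshin calls it "easily accessible."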

“We tend to think of Mars as this dry place—to find water fairly easy to get out of the soil at the surface was exciting to me,” Laurie Leshin, dean of science at the Rensselaer Polytechnic Institute, told The Guardian. She also explained how the discovery was made. Curiosity picked up and sieved a scoop of soil from the surface before dropping it into an on-board oven. “We heat [the soil] up to 835C and drive off all the volatiles and measure them,” she said. “We have a very sensitive way to sniff those and we can detect the water and other things that are released.”

Of course, this isn’t the first sign of water on the red planet. Back in June, Curiosity scooped up a rock specimen that contained a type of clay that can only be formed in neutral water, telling scientists that Mars was once home to running water. And of course, scientists have long suspected water once existed on the planet due to various formations across the Martian landscape. In fact, it’s widely believed that water existed in abundance on Mars, perhaps just as prominently as it does on Earth.

The discovery is important for a number of reasons, but especially exciting because of what this means for future missions to Mars. “We now know there should be abundant, easily accessible water on Mars,” says Leshin. “When we send people, they could scoop up the soil anywhere on the surface, heat it just a bit, and obtain water.” She makes life on Mars sound so easy; now we just have to figure out how to get around the deadly radiation we’ll encounter on the trip over. [Science via The Guardian]

Update (5pm): We reached out to Dean Leshin to ask what the discovery of water meant for the larger question of life on Mars, and she replied with a shade of optimism:

Although we found water bound up in the soil particles, it’s still pretty dry. Also, we didn’t find evidence of organic molecules in the soil. So, this doesn’t have a very big bearing on the life on Mars discussion. However, we now know that our instruments are working beautifully, and our next step is to drill into rocks that may have been better places to preserve evidence of organics and of wet environments that could be suitable for life.

Antimatter Particles Detected Erupting From Solar Flares

It no longer requires a science fiction movie to see antimatter. Here’s a photo from NASA showing antimatter particles streaming away from the sun back in May of this year.


2013 Solar Flare

This image shows a mid-size solar flare that peaked May 3, 2013. It’s been colorized teal. Credit: NASA/SDO/AIA

In the surge of energy of solar flares, physicists have now detected antimatter particles streaming away from the sun.

Researchers already knew that the reactions that fuel the sun create antimatter particles called positrons, among other particles. However, this is the first time the sun’s positrons have been detected in this way, according to the New Jersey Institute of Technology. The lead scientist in the discovery, Gregory Fleishman, is a professor at the institute.

Fleishman and his colleagues’ new measurements could help scientists better understand solar flares and the basic structure of matter. The techniques they worked out could also make it easier for other scientists to detect positrons coming from other objects in space. In a summary of their research, Fleishman and his colleagues sounded optimistic, saying that their discovery could soon make it routine to detect positrons in solar flares, which are brief, bright eruptions of energy on the sun’s surface. (Large solar flares may cause radio blackouts on Earth.)

Positrons are the antimatter counterparts to electrons. They have the same mass as electrons, but have a positive, instead of a negative, charge. They also emit microwave radiation with the opposite polarization from electrons. So Fleishman and his colleagues used data from NASA’s Solar and Heliospheric Observatory and the Nobeyama Radioheliograph in Japan to find instances of polarized radiation that matched positrons.

They’re presenting their work this week at a meeting hosted by the American Astronomical Society.

Original PopSci Article

Google Builds 108 Terapixel Portrait of Earth

A frame of Timelapse’s view of the growth of Las Vegas, Nevada.


In May, Google unveiled Earth Engine, a set of technologies and services that combine Google’s existing global mapping capabilities with decades of historical satellite data from both NASA and the US Geological Survey (USGS). One of the first products emerging from Earth Engine is Timelapse—a Web-based view of changes on the Earth’s surface over the past three decades, published in collaboration with Time magazine.

The “Global Timelapse” images are also viewable through the Earth Engine site, which allows you to pan and zoom to any location on the planet and watch 30 years of change, thanks to 66 million streaming video tiles. The result is “an incontrovertible description of what’s happened on our planet due to urban growth, climate change, et cetera,” said Google Vice President of Research and Special Initiatives Alfred Spector.

But that’s just the surface of what Google has created with Earth Engine. In an exclusive interview with Ars Technica, Spector and Google Visiting Scientist Randy Sargent drilled down on how Google, using software developed by Sargent’s team at Carnegie Mellon University’s CREATE Lab, has generated what amounts to an animated 108 terapixel time-lapse portrait of the planet. Here’s how the company did it.

The big picture

“We began to realize a few years back that Google Maps could be augmented to support all sorts of data,” Spector said. “We had this idea that we could extend it to support multispectral (imagery) data. And given that we were getting feeds over time, we could store them as time-based sets, so you could go back and forth in time to look at changes. That idea became the Earth Engine project.”

Over 45 years of NASA satellite data has been “ingested into Earth Engine,” said Sargent. “That’s been married to Google’s compute infrastructure, so you can detect deforestation or find land use changes.”

Sargent’s team used 909 terabytes of data from the Landsat 4, 5, and 7 satellites—with each of the images weighing in at more than 100 megapixels.

Landsat’s polar orbit allows each satellite to take a full set of images of the Earth’s surface every 16 days. But not all of those images are keepers due to weather and other factors. “It’s not as easy as just lining up the pixels,” Sargent said. “Most of the challenges involved dealing with the atmosphere—if it’s cloudy, you’re not seeing anything. And if it’s hazy, you have to look through it. So we had to build mosaics that excluded cloudy images and then correct for haze.”

To do that, Google used 20 terabytes of data from MODIS (MODerate resolution Imaging Spectroradiometer) sensors on NASA’s Earth Observing System Terra and Aqua satellites. “MODIS captures the entire planet daily,” said Sargent. “It has enough different spectral bands (ultraviolet through infrared) that it helps us analyze what it sees in the atmosphere.” Using MODIS’ MCD43A4 data (which provides information on ground and atmospheric reflectance), the Earth Engine team built a cloud-free, low-resolution model of the Earth for each year for which data was available.

That data was used to create statistical estimates for the color of each pixel of Landsat coverage and to correct for seasonal variance in vegetation, haze, and cloud cover.
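The article doesn’t spell out Google’s exact compositing algorithm, but a standard way to build a cloud-free mosaic from many overlapping passes is per-pixel median compositing: clouds show up as bright outliers, so the median value across a year of observations tends to be clear ground. A minimal single-band sketch (not Google’s actual pipeline):

```python
from statistics import median

def cloudfree_composite(observations):
    """Per-pixel median composite over a stack of same-sized images.

    observations: list of 2-D lists (rows x cols) of brightness values,
    one per satellite pass. Cloudy pixels appear as bright outliers,
    so the median across passes approximates a clear view.
    """
    rows, cols = len(observations[0]), len(observations[0][0])
    return [[median(img[r][c] for img in observations)
             for c in range(cols)]
            for r in range(rows)]

# Three passes over a 1x3 strip; the middle pass is cloud-covered.
passes = [
    [[10, 12, 14]],
    [[255, 255, 255]],   # clouds saturate the sensor
    [[11, 13, 15]],
]
print(cloudfree_composite(passes))  # [[11, 13, 15]]
```

The median rejects the cloudy pass without needing an explicit cloud mask; the MODIS reflectance data described above lets Earth Engine do the same job with far more spectral information.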

Early years of the dataset had gaps due to the 1987 failure of Landsat 5’s Ku-band transmitter—which prevented the downlink of imagery collected outside the range of US and cooperating international ground stations. This meant large chunks of Asia (particularly in China) were not covered by Landsat’s archives until 1999. So to get a complete picture for each year, the data was interpolated between years where images were available.
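Filling those downlink gaps by interpolating between available years can be sketched per pixel with a simple linear scheme (the article doesn’t specify Google’s exact method; this is an illustration):

```python
def fill_gaps(series):
    """Linearly interpolate None entries in a per-pixel yearly series.

    series: list of pixel values indexed by year offset; None marks a
    year with no usable imagery (e.g., the post-1987 Landsat 5 gaps).
    Assumes the first and last entries are present.
    """
    out = list(series)
    i = 0
    while i < len(out):
        if out[i] is None:
            lo = i - 1                 # last year with data
            hi = i
            while out[hi] is None:     # next year with data
                hi += 1
            span = hi - lo
            for j in range(lo + 1, hi):
                t = (j - lo) / span
                out[j] = out[lo] * (1 - t) + out[hi] * t
            i = hi
        else:
            i += 1
    return out

# Two missing years between known values 100 and 130:
print([round(v, 4) for v in fill_gaps([100, None, None, 130])])
# [100, 110.0, 120.0, 130]
```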

Processing all of the data to produce the final mosaics representing each of the 29 years covered—from 1984 to 2012—took under a day, using 260,000 core-hours of CPU time in Google’s compute cloud.

Serving up the time machine

With 29 world-spanning mosaics mapped to Google’s model of the Earth, the next step was to make the images explorable both in space and time. To achieve this, Sargent’s Carnegie Mellon research team extended the open source GigaPan Time Machine software developed by Carnegie Mellon’s CREATE Lab.

Time Machine ingests very high-resolution videos and converts them into multiple overlapping multi-resolution video tiles delivered as a stream, using manipulation of HTML5’s video tag in a way similar to how Google uses HTML image tags to pan and zoom in Google Maps.

Previous Time Machine projects had handled videos with billions of pixels of resolution. But Time-Lapse Earth pushed the envelope for Time Machine, because of the size of the data. The 30-meter-per-pixel video was generated from 29 Mercator-projected mosaics created by Earth Engine, and each frame had 1.78 trillion pixels.

In order to generate the millions of overlapping videos required and integrate them into Earth Engine’s geospatial search capabilities, CMU researchers had to connect Time Machine into Earth Engine and Google’s computing and storage infrastructure. Just encoding the videos for Global Timelapse consumed 1.4 million core-hours of compute time. The total process of creating the Timelapse application took 3 days of total processing time, 1.8 million core-hours, and at its peak used 66,000 cores simultaneously in Google’s cloud.
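Those figures imply a very high average degree of parallelism; a quick check using only the numbers quoted in the article:

```python
# Compute-cloud arithmetic for the Timelapse encoding job.
total_core_hours = 1_800_000   # 1.8 million core-hours overall
wall_clock_hours = 3 * 24      # 3 days of total processing time
peak_cores = 66_000            # simultaneous cores at peak

avg_cores = total_core_hours / wall_clock_hours
print(f"average cores in use: {avg_cores:,.0f}")            # 25,000
print(f"peak-to-average ratio: {peak_cores / avg_cores:.2f}")  # 2.64
```

In other words, the job averaged about 25,000 cores around the clock for three days, with bursts up to 66,000.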

In order to seamlessly present the final product through a Web browser as users zoom and pan through it, Time Machine created what Sargent called “a tree of tiles.” Each individual video that represents a “viewport” in the Global Timelapse data is a video file, indexed in a treed table of contents by Earth Engine.
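A “tree of tiles” is essentially a quadtree pyramid: the full-resolution frame sits at the deepest level, and each level up halves the resolution until one tile covers the whole frame. A sketch of the tile counts such a pyramid implies (the tile size and frame dimensions here are illustrative assumptions, not Time Machine’s actual values):

```python
import math

def pyramid_tile_counts(width, height, tile=1024):
    """Tiles per level of a quadtree tile pyramid.

    Starts at full resolution and halves width/height per level
    until a single tile covers the remaining image.
    """
    counts = []
    w, h = width, height
    while True:
        counts.append(math.ceil(w / tile) * math.ceil(h / tile))
        if counts[-1] == 1:
            break
        w, h = math.ceil(w / 2), math.ceil(h / 2)
    return counts

# A toy 8192 x 4096 frame with 1024-pixel tiles:
print(pyramid_tile_counts(8192, 4096))  # [32, 8, 2, 1]
```

Scaling the same construction up to a 1.78-trillion-pixel frame is how the project ends up with tens of millions of overlapping videos.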

“The client makes a request for the fragment of the video for the location a user is looking at, which includes the table of contents,” said Sargent. “As you’re switching resolution levels or locations, you have one set of video information on screen, and on the back end we’re cueing up other videos ahead of time to anticipate where you’re going to look next.”

The video frames extend beyond the boundary of the “viewport” given to the user so that the system has some slack in responding to a user panning around a location.

While the Global Timelapse is a powerful educational tool showing the impact we have on our environment, Google’s Earth Engine is also aimed at being a platform to provide researchers and policymakers worldwide with satellite data and other data sets they may not have had the resources to use before. “There are a number of partners who are currently using Earth Engine,” Spector said, “primarily earth scientists.” Google is also seeking other sources of geospatial data sets to add to Earth Engine to extend its usefulness.

Google is still exploring the potential applications of the underlying data. There’s also an Earth Engine API, based on Python—but it’s currently restricted to Google’s partners.

Original Arstechnica Article
