Tag: cameras

NASA’s high-altitude ER-2 scans California’s wildfires

For the second time this year, swaths of California are burning out of control thanks to unseasonably warm, dry conditions. To better study what's happening and assess the environmental impact, NASA deployed its high-altitude ER-2 aircraft with a host of scientific instruments on board. In the image above, sunlight glints off the aircraft as it flies over the Thomas Fire in Ventura County at around 65,000 feet.

The ER-2 has been scanning the blazes with a couple of interesting instruments that have flown, or will fly, aboard the International Space Station (ISS). One of them is the AVIRIS spectrometer, which can penetrate cloud, dust and smoke to see the ground below. While providing a clear image of the terrain, it can also measure fine details of vegetation, like water content and which plant species are growing. Eventually, a similar instrument will be launched into space.

On the flight pictured above, however, it's carrying another instrument, the Cloud-Aerosol Multi-Angle Lidar (CAMAL). It was originally developed to validate a space-based version of the instrument called CATS, which operated for 33 months aboard the ISS before going out of service last month. Now, CAMAL is being used for a similar purpose aboard the ER-2. Unlike the types of LiDAR used to scan the ground, CAMAL studies pollution, smoke, clouds and other atmospheric phenomena.

Another view of the Thomas Fire from space

Ideally, NASA flies the AVIRIS spectrometer over a region before a fire starts to get a baseline measurement, then overflies the same spot again afterwards. Comparing the before and after images gives researchers an idea of a fire's severity.
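The article doesn't say exactly how that before-and-after comparison is computed, but a standard way to turn paired pre- and post-fire imagery into a severity map is the differenced normalized burn ratio (dNBR), built from near-infrared and shortwave-infrared reflectance. Here's a minimal NumPy sketch of the idea, illustrative only rather than NASA's actual pipeline:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized burn ratio from near-infrared and shortwave-infrared reflectance."""
    return (nir - swir) / (nir + swir + 1e-9)  # epsilon guards against divide-by-zero

def burn_severity(pre_nir, pre_swir, post_nir, post_swir):
    """dNBR = pre-fire NBR minus post-fire NBR; higher values mean a more severe burn."""
    return nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)

# Toy pixel values: healthy vegetation reflects strongly in the NIR band,
# while burned ground reflects relatively more in the SWIR band.
pre_nir, pre_swir = np.array([0.45]), np.array([0.15])
post_nir, post_swir = np.array([0.20]), np.array([0.30])
print(burn_severity(pre_nir, pre_swir, post_nir, post_swir))  # ~0.7, a high-severity burn
```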

Meanwhile, while a blaze is active, the CAMAL LiDAR can scan the area to build a picture of the dust, smoke and cloud cover. Using the space-based version of the instrument, for instance, NASA scanned October's wildfires, finding plumes extending as high as 2 to 3 miles that created "the worst air quality ever recorded in many parts of the Bay Area," NASA's CATS team said.

"The vision is that these types of measurements could be available from space in the next decade," said JPL's Rob Green. "The resulting information would then be used to develop fuel maps in advance that could be used to make better predictions about where you could mitigate risk by clearing brush and trees." CAMAL can also be used by researchers to study cloud formations and learn more about climate change, which is helping fuel the wildfires in the first place.

Source: NASA


Facebook adds a sound and music library you can use for video

Facebook has a bunch of new tools for video creators. First up, it has launched a community hub for 360-degree video (which isn't live for everyone just yet) that gathers educational bits like how to use 360-degree cameras, how to edit the footage and a primer on spatial audio. Speaking of editing, the social network has also launched 360 Director, a page with tools for adding annotations, setting the zoom level and saving a video as a draft, among others. Facebook will also loan out the pricey cameras themselves, starting with the GoPro Fusion and ZCam S1 at launch.

Not into 360 video? Well, Zuckerberg and Co. have something for you too. Facebook Sound Collection is a gaggle of songs and sound effects you can use in your own videos, 360-degree ones included. That means you can use them without fear that your masterpiece will get muted over a copyright violation. Will that stop people from uploading video with licensed music? Probably not. Facebook's ambitions for video are pretty transparent at this point, but it's good to see that the company is willing to invest in its users in addition to its original programming aspirations.

Via: TechCrunch

Source: Facebook (1), (2), (3)


Lytro shuts down its multi-focus photo sharing platform

Remember Lytro's cool but kinda gimmicky "light-field" photography? It let you take pictures without a defined focus point and post them on a special online platform, where viewers could click on whichever part they wanted to be sharp. It never caught on with consumers, and Lytro discontinued its pricey Illum camera (above) and switched gears to video and VR. Up until recently, though, users could still post the interactive "living pictures," but Lytro has announced that as of November 30th, the platform (pictures.lytro.com) is dead.
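For anyone curious how those "living pictures" refocused after the fact, the basic trick is shift-and-add: the camera records many slightly offset sub-aperture views, and summing them with a chosen per-view shift synthesizes a new focal plane. Here's a toy NumPy sketch of that idea; the 4D array layout and the alpha parameter are illustrative assumptions, not Lytro's actual format or code.

```python
import numpy as np

def refocus(subapertures, alpha):
    """Synthetic refocus by shift-and-add over light-field sub-aperture views.

    subapertures : array of shape (U, V, H, W) holding the individual views
    alpha        : controls where the virtual focal plane lands
    """
    U, V, H, W = subapertures.shape
    cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each view in proportion to its offset from the central view
            dy = int(round(alpha * (u - cu)))
            dx = int(round(alpha * (v - cv)))
            out += np.roll(subapertures[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)  # average of the shifted views, focused at the chosen depth
```

Sweeping alpha moves the focal plane through the scene, which is essentially what clicking around a living picture did.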

The only way to view the multi-focus images now is via Lytro's desktop application or on the Illum camera itself. All the living pictures we posted in our Illum mini-review, for instance, have now disappeared, leaving a black box, as shown below. The best you can do is use the desktop app to export JPEGs or other conventional formats.

No other camera company ever adopted light-field-type tech, but it is being pursued by companies like Avegant, and by Lytro itself, for virtual and augmented reality. Shuttering the online platform effectively marks the end of Lytro's consumer camera adventure, and will no doubt leave the (very few) Illum buyers bitter.

Via: DP Review

Source: Lytro


Netgear recalls Arlo outdoor camera power adapters over fire risks

If you bought one of Netgear's Arlo outdoor cameras and then snapped up an extra power adapter just in case, you may want to sit up and take notice. Netgear and the Consumer Product Safety Commission are recalling Arlo's aftermarket power adapter after receiving seven reports of the cord overheating and melting, in one case leading to a fire. Only 7,700 affected adapters were sold in North America between June and October, but that still represents a sizeable risk.

This shouldn't affect the adapter that came with your camera, so you can keep using it as long as you didn't have to swap out the original. Still, this underscores the usefulness of having a backup for a security camera -- you don't have to leave your backyard unguarded, even if it's just for a short while.

Source: CPSC


The iPhone 8 goes up against the Samsung Galaxy S8 Plus

Before you start throwing down cash on new phones like the Grinch after his heart grew three sizes, watch our video to compare two of the most popular handsets on more than just brand name and price. The iPhone 8 and Galaxy S8 Plus both come in under $1,000 (no thanks, iPhone X) but are still expensive, starting at $699 and $825, respectively.

Either would make a great gift to yourself or someone else, but it all depends on what you're going for. The iPhone 8 looks a little ho-hum with its same-old design, but it feels zippier thanks to the new A11 Bionic chip, which Apple claims makes it 25 percent faster.

Alternatively, maybe you love Samsung, or you're newly open to it because of Apple's no-headphone-jack policy. The S8 and S8 Plus have a slick design that our phone reviewers absolutely love, and the S8 Plus' display (a dazzling 2,220 x 1,080) next to the iPhone 8's (a meh 1,334 x 750) puts it at the top of the visual-appearance heap.

And then there's the camera test. While the two phones' cameras seem very similar on paper (Apple with a 7-megapixel front-facing camera and a 12-megapixel rear one; the Galaxy S8 Plus with an 8-megapixel front-facing camera and a 12-megapixel rear one), in practice the selfies from the Galaxy S8 Plus look far superior.

After testing setup, call quality, video download times, playback, visual appearance and cameras on each of the phones, we picked the Samsung Galaxy S8 Plus as the winner of this particular head-to-head challenge. Let us know in the comments what we should test next!

This article was briefly removed from the site to update the video thumbnail and pricing. Prices now reflect MSRP rather than Amazon's "Buy Now" option in our database.


Instagram warns you if posts show harm to animals or nature

Protecting wildlife and sensitive natural areas is hard enough as it is, and it doesn't help that every brain-dead tourist wants to post a selfie with a koala or dolphin. Starting today, Instagram is adding friction to finding such content: if you search hashtags associated with images that could harm wildlife or the environment, the app will show a warning before letting you proceed.

"I think it's important for the community right now to be more aware," Instagram's Emily Cain told National Geographic. "We're trying to do our part to educate them."

At the same time, crowds of selfie-takers descending on newly Instagram-popular spots, like the Bonneville Salt Flats and Yellowstone National Park, can ravage those sensitive environments. That forces officials to either shut the spots down or make them more tourist-friendly, destroying their original character.

Now, if you search for any of several hundred terms, the app will throw up a flag saying "Protect Wildlife on Instagram," adding that "you are searching for a hashtag that may be associated with posts that encourage harmful behavior to animals or the environment." Only then can you choose to view the posts, learn more or cancel the search.

The decision followed an investigation by National Geographic and World Animal Protection into wildlife tourism. The investigators discovered that animals were being captured illegally from rain forests and kept in cages, then trotted out for selfies with tourists ignorant of their plight.

The warning will pop up for hundreds of hashtags, both in English and the languages of Thailand, Indonesia and other nations where selfie wildlife tourism is rampant. Instagram isn't saying which terms will trigger the flags, though, as it wants users to discover them on their own.

World Animal Protection's Cassandra Koenen points out that the animals people most want to pet or hold, like koalas and sloths, really don't like being handled. The problem is made worse because tourists are terrible at determining which attractions treat animals poorly.

Though Instagram's gesture doesn't seem like it'll be much of a deterrent, Koenen believes it will stop folks who don't mean harm and just don't know better. "If someone's behavior is interrupted, hopefully they'll think, maybe there's something more here, or maybe I shouldn't just automatically like something or forward something or repost something if Instagram is saying to me there's a problem with this photo," she said.

Source: Instagram


LiDAR strips landscapes down to their bare glory

LiDAR is having a moment right now helping self-driving cars and robots avoid hitting things, but don't forget what else it can do. In a study called The Bare Earth, scientists from the Washington Geological Survey used it to image the ground right down to the dirt and rocks. Stripped of trees and other distractions, the images provide not only valuable geological survey data but also stunning, otherworldly views of our planet.

The image above depicts a LiDAR relative elevation model (REM), showing current and previous channels carved out by the Sauk River in Washington State's Skagit and Snohomish counties. In the regular satellite image below, however, only the active, vegetation-free channels are clearly visible -- a striking display of what the technique can reveal.

"REMs are extremely useful in discerning where river channels have migrated in the past by vividly displaying fluvial features such as meander scars, terraces, and oxbow lakes," explains Washington State geologist Daniel E. Coe in a PDF. "This type of information is very informative in channel migration and flood studies, as well as a host of other engineering and habitat assessments."

Carried on special aircraft, a LiDAR instrument fires laser pulses at the ground below. While much of that light bounces off trees and vegetation, enough makes it through to the ground. By examining the raw data to see which returns traveled the farthest, scientists can "edit out" trees, vegetation and man-made structures. That reveals hidden seismic faults, glacial landforms, landslides, lava flows and other features that are invisible in a regular satellite photo.
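Production bare-earth processing uses dedicated ground-classification algorithms, but the core "keep the returns that made it to the ground" step can be approximated by gridding the point cloud and keeping only the lowest return in each cell. A minimal NumPy sketch under that simplifying assumption:

```python
import numpy as np

def bare_earth_grid(points, cell=1.0):
    """Crude bare-earth surface: keep the lowest lidar return in each grid cell.

    points : (N, 3) array of x, y, z returns from the point cloud
    cell   : grid cell size in the same units as x and y
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)

    dem = np.full((iy.max() + 1, ix.max() + 1), np.nan)
    for r, c, elev in zip(iy, ix, z):
        # The lowest return in a cell is the best guess at bare ground under the canopy
        if np.isnan(dem[r, c]) or elev < dem[r, c]:
            dem[r, c] = elev
    return dem
```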

Used in this way, the images also become a fantastic educational tool. Understanding things like drumlins, moraines and kettles can be pretty tough, but seeing a landform stripped to the bare ground brings the concepts to life. Suddenly, it becomes much easier to visualize how glaciers, lava flow (above), tsunamis and other natural phenomena have scarred (and will scar) our planet's surface over time.

Considering that they're made by scientific instruments, the LiDAR images also happen to be beautiful (seriously, check them out). You could even say that like other types of art, they strip away the surface to show things we may not want to see. A great example is the Toe Jam Hill fault scarp (part of the Seattle fault zone), first revealed by a LiDAR scan in the '90s.

Subsequent trenching showed that it was the site of a single large earthquake about 1,100 years ago, the great Seattle Fault quake of around 900 AD. That seismic event has become a part of the oral history and legend of the Coast Salish Native Americans in the region, and it remains an ongoing concern for folks in Washington's Puget Sound area. LiDAR has become an important tool for confirming those ancient stories, while helping us learn about future quakes -- hopefully before they happen.

Via: Kottke

Source: Washington State Geological Survey


DxO’s snap-on Android camera is now available to pre-order

The DxO One is a compact snap-on camera that drastically improves the quality of your smartphone photos, but so far only iPhone users have been able to benefit. It recently arrived on Android, however, and is now up for pre-order as a fairly attractive "Early Access Pack." For $499, you get the camera, a protective shell and DxO's PhotoLab software, effectively saving about $260. The caveats are that there's still no shipping date, and the device only works with newer Android phones that have USB Type-C ports.

The DxO One is one of the few survivors of the clip-on phone camera trend of a few years ago, with the most notable contenders at the time being Sony's QX10 and QX100 models. DxO's model likely struck a chord because of its more practical direct physical connection and relatively compact, pocketable size.

At the same time, you're getting a large 1-inch, 20.2-megapixel sensor and an f/1.8, 32mm-equivalent fixed lens that delivers photo quality akin to a nice compact camera like Sony's RX100 V -- better than any small-sensor smartphone can manage. However, if you're okay with pairing your phone wirelessly, you'd be better off spending a bit more on a dedicated camera like one of Sony's previous RX100 III or IV models or Panasonic's Lumix LX10.

Supported mobile phones are the HTC U11, Huawei Mate 9, Huawei P10, LG G6, LG V20, Moto Z, Nexus 5X, Nokia 8, Nubia Z11 mini, Samsung Galaxy A5 2017, Samsung Note 8, Samsung Galaxy S8 and Huawei Honor 9. You can also select "other" and specify your brand of phone, and hope for the best.

As mentioned, the DxO One for Android is on pre-order for $499 including the Outdoor Shell and cable back door, along with the DxO PhotoLab software. It'll be able to do most of what the iOS version can, but some recently announced features like time lapse and Facebook Live streaming will come later.

Via: Android Central

Source: DxO


Google caters to the DIY crowd with an AI camera kit for Raspberry Pi

Google created its AIY Projects initiative -- "artificial intelligence yourself" -- to encourage developers and DIY enthusiasts to learn about artificial intelligence. The first project in the series, the ready-to-assemble Raspberry Pi-based AIY Voice Kit, debuted as a giveaway with MagPi magazine. Now Google has a second project ready for release this year: the AIY Vision Kit.

The camera kit comes with a cardboard shell, an AI-capable circuit board, a light-up arcade button, a tiny speaker, a lens kit with both macro and wide settings, and various connection components, including a tripod mounting nut. You'll need to supply your own Raspberry Pi Zero W, Raspberry Pi Camera, SD card and power supply.

The VisionBonnet circuit board has an Intel Movidius MA2450 low-power vision processing unit, which can run neural network models right on the device. You'll get software, too, with three TensorFlow-based neural network models: one that recognizes a thousand common objects, another that recognizes faces and expressions and a third that detects people, cats and dogs. There's also a Python API that lets you adjust the arcade button's colors and the speaker's sounds.
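As a taste of what the bundled software looks like, here's a face-counting loop modeled on the demos that ship with the kit. The module names (aiy.vision.inference, aiy.vision.models.face_detection) reflect Google's AIY Python repo at the time and may differ between releases, so treat it as a sketch rather than a definitive example.

```python
from picamera import PiCamera
from aiy.vision.inference import CameraInference
from aiy.vision.models import face_detection

# Stream frames from the Pi camera through the face model running on the VisionBonnet.
with PiCamera(sensor_mode=4, framerate=30) as camera:
    with CameraInference(face_detection.model()) as inference:
        for result in inference.run():
            faces = face_detection.get_faces(result)
            if faces:
                # joy_score is the model's 0-to-1 estimate of how happy each face looks
                print('%d face(s), top joy score %.2f' %
                      (len(faces), max(f.joy_score for f in faces)))
```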

With this Raspberry Pi-based camera, Google says you can create a device that can identify different plant and animal species, be notified when your dog shows up at the back door, see if you left your car in the driveway, watch your holiday guests react to your decorations or even trip an alarm when your little brother enters your room. Of course, these are just examples. Developers and hackers will surely find even more exciting things to do with this device.

Google isn't the only company offering AI tools for developers to build with: Amazon just announced its own image-recognition camera. Google's more DIY-centric AIY Vision Kit is available for pre-order now via Micro Center for $45, and it will be available for delivery and in-store pickup on December 31st.

Source: Google


Essential Phone joins the portrait mode photo party

The Essential Phone's camera was one of its bigger weak spots at launch. A few months (and a handful of updates) later, things have gotten better. The latest patch adds a portrait mode along with an exposure compensation control. As Android Police notes, the patch also includes a tweak to the device's JPEG compression algorithm that'll hopefully boost image quality, along with the usual stability fixes.

These new features were announced when the speed-focused patch was released in October. At least Essential recognizes that its camera is far from perfect and seems intent on tweaking things until they're right. You can judge the results of today's patch for yourself below.

Via: Essential (Twitter)

Source: Google Play