
Facebook’s Hugo Barra says standalone headsets are key to social VR

Even though Oculus didn't have an official presence at CES this year, its leader, Hugo Barra, made a surprise appearance at Qualcomm's press conference to make an important announcement: Xiaomi would be its global hardware partner for Oculus Go, its first standalone VR headset. What's more, Xiaomi would also be making a special variant of the Go, the Mi VR Standalone, specifically for China. In an interview following the press conference, Barra explained the reason for the push into standalone headsets: social VR.

"This is a product category that will help us bring the most number of people into VR and really start unlocking these opportunities for social presence," said Barra. "It's the idea of having a completely self-contained product that you can just put on and start using it. You can do everything in one step."

"Everything is integrated together in one device," he continued. "That is the best experience that we think can be created."

When Facebook CEO Mark Zuckerberg stood on stage at last year's Oculus Connect and said that he wants to bring VR to a billion people, he really meant it. And a key reason for doing so isn't just to sell hardware or games -- although that's obviously a pretty good incentive as well -- but to further Facebook's agenda of connecting the world via social VR.

"The vision that we have is for VR to enable people to spend time together and to do things together that they otherwise wouldn't be able to," said Barra. "It could be something as simple as Facebook Spaces, where you get in a place together with your avatars and interact. But it could also be watching something together, or playing a game together or doing any number of things together."

"And, of course, the more people we have in VR, the more people are going to start spending time together [in VR]. Which brings us back to why standalone VR is a huge focus for us."

In 2016, Oculus first showed off Project Santa Cruz, a prototype of its higher-end standalone headset, to a select group of reporters. But last year, it announced that it was working on a lower-end and much more affordable version of it called the Oculus Go. It'll be priced at just $199, and importantly, it's the product that Facebook hopes will drive the VR category to the masses.

The other part of Oculus' quest to spread VR to as many people as possible is its new partnership with Xiaomi. "[Xiaomi] is a very exciting, innovative company," said Barra of his former employer. More important, he said, Xiaomi has a history of making high-quality products at very affordable prices. And while the Mi VR Standalone shares the same core features as the Oculus Go, Barra said there are aspects of its hardware and, more importantly, its software that make it highly localized for the Chinese market.

"It was very important for us to work with a partner who could bring a lot of leadership and expertise about the Chinese market," he added. "It's a market that is very important for us because we want to bring VR into as many hands as possible."

There's certainly evidence to suggest China's importance in the VR industry. According to a Canalys report, China accounted for 40 percent of VR shipments in 2016, while an IDC report suggests the country is on pace to become the world's largest market for virtual- and augmented-reality headsets by 2020. That's due in part to China's billion-plus consumer base, but also to the rapid growth of VR-related startups in the country in recent years.

Facebook isn't the only company investing in standalone VR. Google, for example, unveiled a standalone Daydream headset, the Lenovo Mirage Solo, at CES. The Mirage Solo sets itself apart from the Oculus Go with Google's WorldSense technology, which gives it six degrees of freedom and positional tracking without external cameras or sensors. But the Mirage Solo is also tentatively priced at around $300, which is $100 more than the Oculus Go.

Barra doesn't seem worried about the competition. "We pride ourselves first and foremost on being the leaders in VR," he said. "We've been doing this for a very long time. We have a pretty high degree of confidence that the Oculus VR experience is the best, by a wide margin, from anything else you see out there."

Facebook's heavy investment in standalone headsets doesn't mean that it's giving up PC or mobile-based VR. "Standalone is the newest one, and is obviously one that we're really excited about," said Barra. Mobile VR is still very important, because it's the lowest barrier to entry, while the Rift offers the highest-quality VR. "We believe in all three," he said.

But in the end, Barra believes that standalone headsets are the future. It will be, he says, the easiest way to get people to use VR. "The Oculus Go will be our most successful VR product on the market," he said. Whether or not that will ultimately lead to mass social VR adoption, however, remains unclear.



Nanoleaf wants you to control your smart home with a dodecahedron

A few months ago, Nanoleaf revealed a "Rhythm Starter Kit" that let you synchronize its colorful Aurora light panels with music. At CES, the company released a product designed to control it and more: the Nanoleaf Remote. But instead of a typical handheld remote control, the Nanoleaf Remote takes the shape of a 12-sided solid -- a dodecahedron, if you will -- and changes the lights depending on which side faces up. And the beauty of it is that you don't have to use it with just Nanoleaf products -- it'll work with compatible Apple HomeKit products as well.

So, for example, you could turn one of the sides to adjust the temperature, or you could flip it to another to lift your blinds. Or if you like, you can map one of the sides to a particular "scene" that will do things like turn on your lights and play a favorite song at the same time. And, of course, all of the functions can be designated and assigned using the Nanoleaf companion app.

I checked out the Nanoleaf Remote at the company's booth at CES and was enamored with its ease of use. Admittedly, I was also very amused by its die-like shape, and I asked a spokesperson whether you could roll the Remote like a die. He said you could, but warned that there's a chance it could break that way. At the same time, he said it's made of a durable plastic, so you can try it at your own risk.

Changing lights is really as easy as shifting the dodecahedron from side to side. Each side lights up in a different color once it's face up, as a visual cue that an action has been triggered. In our demo, all the Remote did was change the Nanoleaf light panels from one color palette to another, but it did so quickly and easily.


The Remote wasn't the only thing on display at Nanoleaf's booth. The company also previewed an upcoming product: new square light panels. Like the triangular panels, they work with other HomeKit products. The cool part, though, is that while you can only connect up to 30 triangular panels together, you could potentially connect up to 1,000 of these square panels to create a truly immersive experience.

What's more, the square displays are designed to be water-resistant, and you can touch and glide your hand across them to change their colors. The reason for the water resistance? According to a spokesperson, Nanoleaf envisions that these square displays could eventually be installed as a high-end backsplash in a kitchen, or perhaps even a bathroom. That sounds pretty crazy to me, but it would make for quite a conversation starter at your next dinner party.

The Nanoleaf Remote will go on sale later this spring for $50, while the availability and pricing of the square displays are still to be determined.



Meltdown and Spectre flaws loomed large over CES

The Meltdown and Spectre CPU vulnerabilities hung like a shadow over the festivities of CES. What's typically a celebration of consumer electronics was instead a stark reminder of just how far-reaching these issues are. And that's especially the case for Intel and AMD, both of which unveiled fast new processors that are still vulnerable to future Spectre exploits. Each company had statements about what it's doing to secure its hardware, but there was no escaping that the threat of Spectre is the new normal. That's particularly troubling when tech companies are hoping to launch smart home solutions that seep into every aspect of our lives.

Intel faced the brunt of the early criticism, when initial reports pegged the potential exploits as something that affected only its chips. It turns out that's not the whole story. The Meltdown vulnerability is specifically aimed at Intel's hardware, but Spectre will be an ongoing issue for every modern CPU. All the same, no massive security hole was going to put a stop to Intel CEO Brian Krzanich's opening CES keynote -- not when its big-budget show was being held at a giant music venue at the Monte Carlo hotel.

After an opening act that featured virtual instruments and a virtuoso child dancer, Krzanich went into crisis response mode almost immediately. "The collaboration among so many companies to address this industry-wide issue across several different processor architectures has been truly remarkable," he said, praising the unusual way competitors rallied together. "Security is job number one for Intel and our industry. So the primary focus of our decisions and discussions have been to keep our customers' data safe."

Krzanich went on to assure the audience that Intel hasn't heard about anyone using these exploits to steal customer data. And he also gave us more clarity about the company's response, noting that it plans to fully patch its product line from the past five years by the end of the month. As for reports of fixes slowing down processors, he reiterated Intel's line about the impact being "highly workload dependent." Microsoft gave us a bit more insight into what that means the next day -- basically, you can expect noticeable slowdowns with Intel's chips from 2015 and earlier.

As for AMD, its CTO, Mark Papermaster, told press and analysts that it still believes there is "near zero risk" for its users. Thanks to architectural differences from Intel, the Meltdown (aka "Rogue Data Cache Load") vulnerability doesn't affect AMD's chips. When it comes to the two Spectre vulnerabilities, he said Variant 1, otherwise known as "Bounds check bypass," will be fixed through OS and software patches.
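
For context, "bounds check bypass" refers to a code pattern like the one sketched below. This is a minimal, illustrative C snippet modeled on the example in the original Spectre paper, not a working exploit: the array names and sizes here are hypothetical, and a real attack also needs cache-timing measurement (such as flush+reload) to read out the leaked byte.

```c
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Hypothetical program data; the names follow the Spectre paper's example,
 * but the sizes are arbitrary. */
uint8_t array1[16];
uint8_t array2[256 * 4096];   /* acts as the cache side channel */
size_t  array1_size = 16;

uint8_t victim_function(size_t x)
{
    /* The bounds check looks safe, but a branch predictor trained to expect
     * "true" can speculatively run the body with an out-of-bounds x. The
     * secret byte array1[x] then decides which cache line of array2 gets
     * loaded, and an attacker can later infer that byte by timing accesses
     * to array2. */
    if (x < array1_size) {
        return array2[array1[x] * 4096];
    }
    return 0;
}

int main(void)
{
    /* Benign call just to show the pattern compiles and runs. */
    printf("%u\n", victim_function(3));
    return 0;
}
```

The OS and software patches Papermaster mentions generally work by inserting a speculation barrier or masking the index after the bounds check, so a mispredicted branch can't reach attacker-chosen memory.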

Papermaster reiterated that there's "near zero risk" to its architecture from Variant 2, or "branch target injection." Specifically, he noted, "vulnerability to Variant 2 has not been demonstrated on AMD processors to date." That carefully worded statement leaves room for the possibility that hackers could come up with new exploits that take advantage of the flaw.

This CES was a particularly ill-timed venue for the launch of one of the strangest collaborations in the tech industry: Intel's new 8th-generation Core CPU with AMD's RX Vega GPU. When we first heard about the chip, we were intrigued by the possibilities. It finally gives computer makers the flexibility to build ultraportables with solid gaming chops. But now, with the threat of Spectre, the chip's luster has dulled a bit. Similarly, it's just tough to get too excited about AMD's upcoming Ryzen desktop CPUs. Even its promising Radeon Mobile GPU, which could bring even faster laptop performance than the Intel collaboration, is tainted by its connection to AMD's affected processors.

In an interview with Engadget, Jim Anderson, AMD's Radeon head, said, "Regardless of Spectre and Meltdown, we are always focused on continuing to improve our security. ... It's key for two very important markets for us, both data center and the commercial PC market." As for any potential performance hits, Anderson said the impact should be "negligible." Since our chat with AMD, Microsoft has halted patches for Windows systems running the company's chips. It turns out the update ended up bricking some machines. Microsoft blamed AMD's documentation for not conforming with earlier instructions, and it's unclear when the patches will resume.

It'd be bad enough if Spectre affected only individual devices, but this year at CES, tech companies also doubled down on connected platforms built on user data. LG has its ThinQ AI, and Samsung is bringing Bixby and SmartThings to more products. And on a similar front, we're also seeing more companies integrating with smart assistants like Alexa and Google Assistant. It'll be more important than ever to ensure that smart home platforms are secure locally in your home, and that the servers powering all of the assistants are also as secure as possible. (Google, Amazon and Microsoft all say they've patched their servers against known exploits.)

The worry isn't that a hacker could discover your Netflix guilty-watch queue. Instead, there's the potential for them to tap into smart home platforms to track your location, use your home cameras to peep on your family and access the microphones spread throughout your home. Indeed, we've already seen how vulnerable connected baby monitors can be, letting strangers spy on kids and even talk to them. As gadgets reach deeper into our lives, so does the potential for serious attacks.

Tim Alessi, LG's director of product marketing for home entertainment, assured us that the company has "always had a history of making our devices as secure as possible." And when it comes to the widespread data collection that LG's ThinQ smart devices will employ, he noted, "We're not just collecting data for data's sake. It's to help people get the most out of their TVs. And, during setup, it's very clear during the opt-in process to make their own decision."

LG Electronics marketing VP, David Vanderwaal, showing off the company's new CLOi AI-powered robot and smart home devices.

Steve Marcus / Reuters

Going into CES this year, we knew the Meltdown and Spectre vulnerabilities would be something every major tech company would be thinking about. And their response was what you'd expect: They're working hard to fix the immediate issues, and they'll keep an extra eye on security in the future. Intel, which initially deflected blame, vowed to be more transparent with the public.

Other major chipmakers, like NVIDIA and Qualcomm, aren't worried about the implications of Spectre. The former claims that its GPUs are entirely immune, while Qualcomm president Cristiano Amon seems confident that the company's December patches were enough to mitigate any major issues. He also pointed out that mobile users download software from app stores, which are far more secure than desktops and servers that can run software from just about anywhere.

Until we start to explore entirely new processor designs, we won't be truly free from the dangers of Spectre. And that's no easy feat: the x86 CPU architecture powers nearly every desktop, notebook and server, and Spectre remains a flaw in ARM-based mobile processors as well. While there's a chance that chipmakers might be able to tweak their existing designs, that could have unintended consequences. Up until now, the main push for chip companies has been to shrink their existing technology down to smaller fabrication processes. But, more than ever, there's a need for whole new architectures, which could take years and untold amounts of R&D funding to develop.


LG CLOi photo: Steve Marcus/Reuters


Tobii proves that eye tracking is VR’s next killer feature

There are plenty of ways virtual reality headsets could get better. They could offer higher-resolution screens (like the new Vive Pro), a wider field of view and improved built-in tracking sensors. But another feature might be even more essential: eye tracking. It's not a new concept -- we've been following FOVE's eye-tracking headset, as well as 7Invensun's Vive accessory, for a few years now. But it seems more important than ever as consumer VR ramps up. Tobii, a company that's been exploring the potential of eye tracking for a while, is hoping to integrate its technology into the next generation of VR headsets. And based on some demos I saw, it's clearly not a question of if VR headsets will get eye tracking. It's when.

Tobii first showed off its technology integrated into the HTC Vive at GDC last year. But at CES, it unveiled some new experiences to demonstrate the benefits of eye tracking. I was surprised to find that, aside from some sensor rings around the Vive's lenses, it didn't look as if the company added much to the headset. Instead, its hardware is able to seamlessly fit inside the Vive.

To calibrate the tracking, I followed a dot around the display for a while using just my eyes. Then I was presented with a mirror that reflected my VR avatar. It tracked my head movement, as usual, but the eyes were blank and expressionless. Then I moved on to another mirror with eye tracking enabled. When I blinked, my avatar blinked. It's a small thing, but it went a long way toward making the experience feel more immersive. I wasn't constantly reminded of the limitations of expression in VR.

Then I moved on to a screen featuring two robots. When I glanced at them, they made direct eye contact and responded with text messages. There's an uncanny social awareness to them, as if they're actually aware of your intent to have a conversation. This sort of feedback could easily make it seem like you're chatting organically with game characters. And it could be even more useful in social VR environments -- just imagine how awkward it'd be if we were stuck with boring avatars that didn't reflect our eye movement.

A surprising demo just involved throwing rocks at far-off bottles. Without eye tracking, it was almost impossible to accurately knock down anything. But with the feature turned on, all I had to do was focus on one bottle, throw the rock with enough virtual momentum, and down it went. As I smoothly knocked down most of the bottles on the screen, I almost felt like I had superpowered accuracy.

New for CES was a trio of experiences showing off Tobii's technology. One was a virtual living room, where I was able to select something to watch just by moving my eyes across a media library. Today you'd have to either rely on a controller's touchpad or crane your entire head around to interact with virtual objects. Eye tracking isn't just a less clunky way to do something that's easy in real life, like scrolling through your Netflix queue; it adds a capability that simply wasn't possible before.


Next, I found myself sitting in a virtual loft playing an augmented reality game. On my left was Mars, while Earth was on my right. The goal was to launch rockets from Mars and make them hit alien ships floating around Earth. I could spin both planets, which changed the angle of the rockets and the ships, and I also had a button for turning Tobii's tech on and off. Naturally, the game was much easier to play when I could just look at a planet and rotate it with the Vive controller's touchpad. Doing that manually, by selecting a planet with the controller, was far less fluid and made the game nearly impossible to play.

I also played through a scenario similar to Star Trek: Bridge Crew, which involved manipulating a daunting number of buttons and dials on a spaceship. If you've played that Star Trek VR game, you'll know that one tough part of it is making sure you hit the right button at the right time. With eye tracking in Tobii's scenario, I only had to look at a button to select it. The company's tracking technology did a solid job of choosing the right button most of the time, even though the demo had plenty of other things to select nearby.
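
Tobii didn't detail how its demo decides which button you mean, but gaze-based selection in VR is commonly built around a dwell timer: keep looking at the same target for a fraction of a second and it activates. Below is a minimal C sketch of that idea; the GazeSelector structure, the 400 ms threshold and the per-frame update loop are assumptions for illustration, not Tobii's SDK.

```c
#include <stdio.h>

#define NO_TARGET  -1
#define DWELL_MS  400.0f   /* assumed activation threshold */

typedef struct {
    int   focused;   /* target currently under the gaze ray */
    float held_ms;   /* how long the gaze has stayed on it  */
} GazeSelector;

/* Call once per frame with the target the gaze ray currently hits (or
 * NO_TARGET) and the frame time in ms. Returns a target ID when the dwell
 * threshold is reached, otherwise NO_TARGET. */
int gaze_update(GazeSelector *s, int hit_target, float dt_ms)
{
    if (hit_target != s->focused) {   /* gaze moved: restart the timer */
        s->focused = hit_target;
        s->held_ms = 0.0f;
        return NO_TARGET;
    }
    if (hit_target == NO_TARGET)
        return NO_TARGET;

    s->held_ms += dt_ms;
    if (s->held_ms >= DWELL_MS) {     /* held long enough: select it */
        s->held_ms = 0.0f;
        return hit_target;
    }
    return NO_TARGET;
}

int main(void)
{
    GazeSelector s = { NO_TARGET, 0.0f };
    /* Simulate roughly half a second of 11 ms frames staring at button 7. */
    for (int frame = 0; frame < 45; frame++) {
        int selected = gaze_update(&s, 7, 11.0f);
        if (selected != NO_TARGET)
            printf("selected button %d at frame %d\n", selected, frame);
    }
    return 0;
}
```

A real system would also filter out brief glances and flicker between neighboring targets, which is presumably where Tobii's accuracy advantage comes in.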

In addition to simply making VR interaction more fluid, Tobii claims that eye tracking will also allow for more efficient foveated rendering. That's a technique that devotes most of your computer's graphics power to the part of the scene you're focused on, while rendering your peripheral vision at lower quality. Without eye tracking, foveated rendering has to be fixed to the center of each lens, no matter where you're actually looking. With it, the headset can keep the sharpest image right where your gaze lands and slightly downgrade everything around it. Tobii quietly enabled the feature during my last demo, and I was surprised that I didn't even notice it in action. The big benefit? It could make it easier to run VR on slower systems.
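
To make that concrete, here's a small C sketch of how a gaze-aware renderer might pick a shading-quality tier from the angle between a pixel's view direction and the tracked gaze direction. The three-tier scheme and the 5- and 15-degree cutoffs are illustrative assumptions, not how Tobii or any headset maker actually implements it.

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Shading-quality tiers, highest to lowest. */
typedef enum { QUALITY_FULL, QUALITY_HALF, QUALITY_QUARTER } Quality;

/* Pick a quality tier from the angle (degrees) between a pixel's view
 * direction and the user's gaze direction. The cutoffs roughly stand for
 * "fovea", "near periphery" and "far periphery". */
Quality foveation_tier(float angle_from_gaze_deg)
{
    if (angle_from_gaze_deg < 5.0f)  return QUALITY_FULL;
    if (angle_from_gaze_deg < 15.0f) return QUALITY_HALF;
    return QUALITY_QUARTER;
}

/* Angle in degrees between two normalized view directions. */
float angle_between_deg(const float a[3], const float b[3])
{
    float dot = a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    if (dot >  1.0f) dot =  1.0f;
    if (dot < -1.0f) dot = -1.0f;
    return acosf(dot) * 180.0f / (float)M_PI;
}

int main(void)
{
    float gaze[3] = { 0.0f, 0.0f, 1.0f };          /* looking straight ahead */
    float pixel_dirs[3][3] = {
        { 0.0f,  0.0f, 1.0f   },                   /* dead center            */
        { 0.17f, 0.0f, 0.985f },                   /* ~10 degrees off-gaze   */
        { 0.5f,  0.0f, 0.866f },                   /* ~30 degrees off-gaze   */
    };
    for (int i = 0; i < 3; i++) {
        float ang = angle_between_deg(gaze, pixel_dirs[i]);
        printf("pixel %d: %.1f deg from gaze -> tier %d\n",
               i, ang, foveation_tier(ang));
    }
    return 0;
}
```

In practice the lower tiers are applied by rendering peripheral regions at reduced resolution (or with variable-rate shading) and upscaling them, which is where the GPU savings come from.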

While VR is the most immediate and obvious fit for Tobii, the company is still aiming to work with more PC manufacturers to build eye tracking into their laptops. Currently, more than 100 games support the technology as well. You can also expect to see Tobii's eye tracking in even thinner laptops over the next few years. (Right now it's mainly relegated to beefy gaming notebooks.) The company let me take a glimpse at its upcoming "IS5" sensor design, which is significantly smaller and slimmer than its current solution. In particular, the camera has been dramatically shrunk down.

Tobii's CEO, Henrik Eskilsson, told us that eye tracking will eventually be viewed as a requirement for VR. And I'm inclined to believe him. Accurate eye tracking delivers a better sense of presence, which is really the ultimate goal for virtual reality. Trying Tobii's technology for just 30 minutes has already ruined me for every VR headset without it. I'd call that a success.



CES showed us smart displays will be the new normal

Before the start of CES 2018, the only real smart speakers with a display were the Amazon Echo Show and the Echo Spot. But now that Google has partnered with several manufacturers to make a whole line of Echo Show rivals, a bona fide new device category has been born: the smart display. And based on the devices revealed this week, I believe smart displays will gradually start to outnumber smart speakers and will likely be the norm going forward.

The simple reason for this argument is that the display makes such devices much more useful. Sure, you could have Alexa or Google Assistant tell you there's a Starbucks 1.5 miles away from you. But wouldn't it be nice to actually see where it is on a map? Or if you wanted to know the time, you could just, you know, look at the screen. The same goes for finding out which artist is playing without interrupting the track. That extra visual layer is really useful, especially for quick, glanceable information.

Of course, you could've made this same argument months ago when the Echo Show debuted. But these new Google Assistant displays are so much better in almost every way. For example, when you make a search query, the device won't just spit out a short generic answer with a transcript on-screen; the result actually appears in a way that makes sense. So if you search for "cornbread recipe," the display will offer an array of recipes to choose from. Tap on one and you'll be presented with a lovely step-by-step recipe guide, all without having to install any additional skill or action.

Or if you ask a Google Assistant smart display to play relaxing music, it won't pick out a random playlist and start playing a song you don't want (something that happens quite frequently with the Echo). Instead, it'll offer a visual selection of playlists, which you can then scroll through and pick the one you want. Perhaps my favorite feature is when you ask for directions. It will not only show you the map on the screen but also send those same directions straight to your phone without you having to ask.

Plus, Google has now opened the door for so many more companies to start making smart displays. At CES, we saw Lenovo, JBL and LG show off their versions, each with very different designs. Eventually, even more companies will join the fray, adding their own spin on what a smart display looks like. With so many options on the market, there'll soon be a smart display for every kind of home. Amazon might've introduced the smart-display concept, but Google will be the one to democratize it.

And this is just the beginning. Smart displays can be more than just a little 10-inch prop on the table. Personal assistants are already in smart fridges from LG and Samsung, so it doesn't take much imagination to see Alexa and Google Assistant displays taking over the rest of your home: not just on the front of your fridge, but built into the kitchen TV or even the bathroom mirror. Soon smart displays will be everywhere, and CES 2018 was only the start.



Engadget Today | CES 2018: It’s a wrap!

That's it, the show's over! It's been a wild ride, as usual. After landing here a week ago, we're glad to be packing up and heading back to our own homes, but we'll always have a soft spot in our hearts for the LVCC. We can't wait to see all the new gadgets from the show in the review lab, but for now, it's adios, see you next year.



Dolby knows what you’re watching based on your breath

If you thought it was creepy that technology lets networks know what you're watching, you'd better sit down. It turns out that Poppy Crum, chief scientist at Dolby Labs, has been researching how our bodies and emotions react to what we see and hear. Don't panic, though. All that information is being used to understand how to make us feel more when we watch a Hollywood epic. "In the cinema, we can measure exhalants [...] and be able to tell what movie they're watching, just by the chemical signature," Crum told Engadget on stage at CES. And you thought clearing your browser history was enough to cover your tracks.



Cubinote prints colorful sticky notes from your smartphone

Even with all the reminder and to-do apps out there, plenty of people are still fond of physical sticky notes. And if you don't want to pick one over the other, there's a startup trying to bring the best of both worlds together. Cubinote is a Bluetooth- and WiFi-enabled printer that pairs with an iOS or Android app to make sticky notes on the fly. The company says the product could come in handy if you, say, want to leave it in your home office and send reminders or other random notes to it from your workplace -- or if your parents live in another city and you want to send them messages every now and then.

This works well because, while the Cubinote features Bluetooth, it's also a WiFi device, so you can send notes to it from your phone no matter how far away you are -- essentially like connecting to a Nest Cam or another Internet of Things product. One of the best parts is that the sticky paper it prints doesn't leave any residue where you place it, so you don't have to worry about the notes dirtying up your monitor, desk or anywhere else you stick them.

The only downside is that it's not cheap: Cubinote is pricing its namesake sticky-note printer at $149, which seems like a lot of money for such a niche product. If you like it, though, you can pick it up in March from Amazon and other retailers.



Engadget Today | The darkness after the storm

Technology and innovation are amazing. It's why we do what we do here at Engadget every day. But it's all for naught if the electricity goes out. There was no timelier or more apt reminder of this than the huge power failure at CES on Wednesday. Exhibitors got angry, attendees got confused and companies got spicy on Twitter (oh, brands!). That didn't stop us from checking out the best of the rest so far, though. Enjoy.


Sphero spinoff Misty Robotics unveils its first robot for developers

Misty Robotics, the company that spun out from Sphero's advanced robotics division last year, is taking a big step toward its mission of bringing a mechanical helper to homes. Today at CES, the company unveiled the Misty 1 developer robot, a cute, handmade machine that'll be sold to a lucky group of customers. The company only plans to sell a few dozen units of the Misty 1 -- 50 at most, according to CEO Tim Enwall -- and developers will have to apply to buy the robot for $1,500.

Misty Robotics will judge entries based on who has an adequate amount of time to spend with the device, as well as who can commit to collaborating with its team down the line. It's an unconventional approach to releasing a new product, but it seems like a necessary step for Misty, since it needs to make sure it has developers committed to learning a whole new robotics platform. The company plans to release a mass-produced model, the Misty 2, later this year, but this early batch will let it see which aspects of its technology devs like the most.

The Misty 1 looks like an early concept version of WALL-E, with its mini tank treads, squat frame and large, expressive eyes. Naturally, there's a ton of gadgetry hidden underneath its adorable exterior. There's an Occipital structured-light sensor, an HD camera and a far-field microphone array up top, right above its 4.3-inch LCD. Below that, there's a speaker and an RGB LED, as well as sensors in the front and rear for avoiding obstacles. It's all powered by two Snapdragon processors. With all of that hardware, the robot will be able to see and recognize faces, as well as map indoor environments.

So what does it actually do right now? Well, the company still seems to be figuring that out. Misty engineers are able to remotely control it from a smartphone, and they've also programmed one to act a bit aggressive when you get in its face (it growls like an animal, then performs a randomized series of actions to scare you away).

Now that the hardware is pretty much set, the company is more interested in seeing what developers do with it. To make it even easier to program, the robot also works with Blockly, a simple visual language that's used in STEM classes. The company has found that even experienced coders appreciate having a quick way to tell the Misty 1 what to do.

Of course, this is just a start for Misty Robotics. The company recently held a "robothon" at its Boulder headquarters to give 25 developers a shot at building experiences for the Misty 1. We can expect to see that sort of experimentation on a larger scale once even more devs get their hands on it.
