
How Microsoft embraced ‘messy’ creativity with Windows Ink

Windows Ink isn't Microsoft's first stab at bringing stylus support to PCs -- that would be Windows XP Tablet PC Edition -- but it is the company's most successful. It made stylus support a core part of Windows 10, and it's a big reason you're seeing so many computer makers shipping digital pens of their own. While the company's renewed push into the space with its hybrid Surface tablets seemed baffling at first, it's ended up looking like a prescient move. It even convinced Apple to compete with the iPad Pro's Pencil.

With the Surface Pen and Windows Ink, Microsoft found a way to let PC users do something completely new: It gave them a way to break free from the constraints of the keyboard and mouse.

"I think it's [Windows Ink] the first time that technology has embraced 'the messy,'" Aaron Woodman, general manager of Windows Marketing, told Engadget. "For me, seeing Pen come to life in a way where you don't have to go from top to bottom, from left to right, you can create in a way before your thought is really complete. I don't think there's a ton of technology that's really embraced that fluidity."

He's got a point. The way we interact with computers hasn't changed much over the years. If you learned how to use a PC with a keyboard and mouse, you'd have no trouble using a modern machine. The advent of smartphones and tablets, with their capacitive touchscreens, was the biggest change over the past few decades. But what if you want to draw a detailed picture, jot down notes in your own handwriting or write out mathematical equations? You'd turn to one of our earliest writing tools: the stylus.

"We're embracing that, yes, [stylus support features with Windows Ink] are hardware-driven; yes, they require a platform that has to be broad in reach; and yes, for part of that, you need ecosystem partners," Woodman said. "That really starts to get people to understand it and see themselves using it in applications like Office. To see that come through in a way that customers don't feel like they're jumping over walls, I think it's really satisfying personally."


In particular, Woodman credits Microsoft's close partnership with Wacom, a company best known for its stylus tablets and displays, for the progress with Windows Ink so far. That allowed the two companies to build a sensor that "essentially allows you to go between pen protocols." For computer makers, that's helpful because they aren't locked into a single pen technology. Basically, it let Microsoft open up the market for styluses, just like Windows did for PCs decades ago.

Now, Woodman says retailers are selling twice as many pen-capable machines as ones without pen support. And 43 percent of consumers with stylus-equipped machines are using their pens monthly, according to Microsoft's stats. Given how well they're taking off, though, it's surprising that Microsoft chose to make the Surface Pen an additional $100 purchase for the Surface Laptop, Pro and upcoming Book 2.

Windows Ink's integration with Microsoft Office is a clear example of how stylus support can breathe new life into programs we've used for years. In Word and PowerPoint, you can use a stylus to edit documents as if you were marking up paper. And, as you can imagine, having a more natural input mechanism is a big help for OneNote. It's not only useful for jotting down your thoughts, but you can also use it for recording complex math equations — the sort of thing that would be tough to type out on a keyboard. OneNote can also convert your handwritten equation into something formatted for a computer, and you can then have it evaluate an equation, factor it and graph it.
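To make that concrete, here's a hypothetical example of the kind of input that feature is meant to handle (mine, not Microsoft's): scribble a quadratic such as x^2 - 5x + 6 = 0 in OneNote, and it can convert the handwriting to typed math, factor it as (x - 2)(x - 3) = 0, solve it to get x = 2 or x = 3, and graph the corresponding parabola.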

It was a long road getting here, though. The first "Tablet PCs" powered by Windows XP were woefully underpowered, heavy and generally just hard to use. It was difficult enough to get them to do basic Windows tasks, so there wasn't much chance consumers would spend time with their styluses. There were also some early digital pens available for Windows 8. Really, though, it took the launch of the Surface 2 and Pro 2 to see what a stylus could do in Windows. The Surface Pen was light, responsive and simply felt good to use. Microsoft steadily refined it with future Surface models, giving us better tips and more pressure sensitivity.

Even after the launch of Windows 10, it took over a year for Microsoft to make stylus support truly meaningful with last year's Anniversary Update. That introduced Windows Ink and its accompanying software, including built-in sticky notes and a sketchpad. More importantly, it also gave Microsoft's partners more of a reason to bundle styluses with their computers. Apple entered the fray with the iPad Pro's Pencil in 2015, which is a decent stylus but one that's only useful in a few creative apps. And you can forget about seeing it in macOS anytime soon -- Apple is focusing its touchscreen efforts entirely on iOS.

Embracing a new type of computing creativity seems a bit out of character for Microsoft — at least, the pre-Satya Nadella Microsoft. But the timing for the company's change of heart makes sense. Thanks to faster and more efficient computing hardware, it's finally turning its stylus ambitions into a reality. And more importantly, consumers and computer makers are finally paying attention.

"On some level, we have a responsibility to solve the challenges customers are facing," Woodman said. "Now, watching 3D objects in Powerpoint [via the Fall Creator's Update] is mind boggling. Not because you see it in 3D, but because it saves you infinite steps. I think Pen has the same type of promise. It's more about just feeling like you have that permission to go beyond the boundaries of how people have defined the products so far."


Razer’s new webcam and microphone are made for streamers

Razer is known as a gaming laptop, mouse and keyboard maker, but it actually offers a wide variety of products, like Xbox controllers, power banks and even an upcoming phone. Razer also makes webcams like the Stargazer, which is built for streaming video games. Now Razer is upping its streaming game with two new "streamer certified" peripherals: a webcam with a built-in ring light called the Kiyo and a USB condenser mic named the Seiren X.

The $100 Kiyo's built-in light has 12 levels of brightness to help light your face for those important picture-in-picture streams on Twitch. It also outputs high-def video at 720p with 60 frames per second (FPS) or 1080p at 30 FPS. The Seiren X also retails at $100 and comes with a removable desk stand so you can set it up anywhere you're streaming from. It connects via USB and has 25mm condenser capsules and a tighter recording angle that's optimized for streaming, according to the company.

"Streaming has become an integral part of the gaming community," said Razer CEO Min-Liang Tan in a statement. "We took a hard look at what streamers really needed, and engineered products to support those specific use cases. The result are products that produce professional quality streams while remaining accessible to beginner users."

Ring lights aren't anything new, of course. I had one that you could slide onto Apple's old standalone iSight camera years ago. Still, the Kiyo could be attractive to someone who has a darker room and needs to stream a better image. There are plenty of microphones to choose from, but if you're using other Razer gear, the affordable Seiren X might entice you, too.

Via: The Verge

Source: Razer


‘Aztez’: The bloody indie brawler that should’ve been big

Imagine: It's 2012 and Matthew Wegner is sitting at his desk in the back of a one-bedroom apartment in Tempe, Arizona, pounding away at a keyboard. It's night but thick black drapes are pulled over the window; the room is suffused with dim yellow light, casting sickly shadows over the papers tacked to the walls. Most of them are emblazoned with the name Aztez, depicting bloody battles among ancient Aztec warriors. Wegner's fingers fall still as he closes a line of code and reviews his work. His computer hums, hot.

A ball of blinding white light suddenly explodes in the middle of the room, shooting sparks to the ceiling and singeing the carpet -- Wegner jumps up and stares, wide-eyed, at the intrusion. As the glare fades, a familiar shape emerges. Wegner is looking at himself: a little older, a little more weathered, but definitely himself.

"Don't do Aztez!" the second Wegner says, frantic. "I'm you from five years in the future. Trust me, stop working on this game. It doesn't go well."

The original Wegner finds his voice. "But everyone says it's going to be great! We already have a lot of buzz."

"It's a trap. Quit Aztez. Now!" The light returns and swiftly envelops the second Wegner before popping out of existence entirely. His final words reverberate around the tiny, smoking room. Wegner blinks and shakes away his shock. He pulls out his chair and sits down. Moments later, his fingers are flying over the keyboard again, coding combat combos into Aztez.

"If, five years ago, future-me were to teleport into the room and be like, 'I'm Matthew from the future -- Aztez, just get off of it, don't do it,' I'd probably still finish it," Wegner says, back in the real world.

"Yeah, I would too," his partner, Ben Ruiz, agrees. Ruiz and Wegner are the founding (and only) members of Team Colorblind, an independent game-development studio based in the Phoenix area. Their first game, Aztez, hit Steam in August after seven years of work and five years of positive public attention. It's a stylish brawler with deep strategy elements set in the ancient Aztec empire, featuring black-and-white environments splattered with bright red blood.

During development, Ruiz and Wegner rode a rising wave of love for indie games, as consumers discovered the depth of experiences available outside of the big-budget, DLC-obsessed, AAA marketplace. They rode that indie wave until it crashed. It's difficult to say exactly when, because they didn't feel the impact -- they were trapped in a development bubble, building a game that the media, their friends and fellow developers all said would perform wonderfully once it was out in the wild.

At least, that's what they said five years ago.

The video game industry evolves rapidly, constantly adopting new technologies and taking advantage of fresh distribution methods. As Team Colorblind continued to work, the indie market lost its new-toy sheen and became an established, overcrowded haven for anyone with GameMaker and an idea.

"If I was paying attention to Steam, maybe I wouldn't be so blindsided by what happened, but I'm also not necessarily sure what I would've done differently," Ruiz says. "If I'd have known like, oh, it's a saturated market now -- what the fuck do you do?"

After seven years, Aztez emerged from its development bubble -- and it bombed. This time around, Ruiz and Wegner definitely felt the impact.

"Fucking madness," Ruiz says.

The method

Ruiz chose Aztez's release date extremely carefully. He knew they didn't want to launch during the holiday season, which is generally dominated by the AAA money machine, so he looked at the summer. He scoured the charts of upcoming Steam releases, searching for a day without any big-name games. August 1st looked good. He locked it in.

"There were 40 other games that launched on August 1st," Wegner says. One of those games was Slime Rancher, an adorable first-person title that had generated a rabid fanbase while it was still in Early Access. Another was Tacoma, the new game from the team behind indie darling Gone Home. Neither of these had shown up on the new-release charts Ruiz had studied.

"If I would've seen Slime Rancher, I would've been like fuck it, we're going to wait a month," Ruiz says. "Because there's no reason to compete with that."

Slime Rancher didn't single-handedly destroy Aztez's chances at success. However, with 39 other new games also hitting Steam that day, it was difficult for any title to truly stand out. Even an eye-catching brawler with a notable amount of name recognition.

This is a different world than the one Ruiz and Wegner operated in when they started working on Aztez in 2010. Back then, Steam was a curated space, where employees worked directly with developers to approve their games and get them on the store. A handful of titles went live every week and indie developers lucky enough to secure a Steam deal could generally bank on that release to see them through the fiscal year. Getting on Steam was like hitting the jackpot.

That changed in 2012. Indie games were all the rage, development tools were becoming increasingly accessible, and there were hundreds of new titles ready to be distributed every day. Steam set up Greenlight, a system where players themselves approved indie games for the store, and Early Access, where developers could publish games-in-progress for community feedback, allowing them to feed the hype beast from day one.

Meanwhile, Ruiz and Wegner continued working on Aztez, heads down, not paying much attention to the wider marketplace as it tilted around them. They had five years to go. Today, Greenlight is dead, but the Early Access model has spread to other platforms, including consoles.

"It's almost like in the last five years, everyone was on Steam, and then it refractured and the consoles got markets back again," Ruiz says. "Five years ago everyone was on Steam because new consoles would come out and they were like $1,000, right? So, it's just this sin wave apparently, because it's like oh yeah, that's what happened with the previous generation, too."

This tug-of-war between Steam and consoles continues today. Right now, it feels like there's more opportunity for indie games to succeed on the PlayStation 4 or Switch ("They don't say Xbox One. I don't think anyone owns those," Ruiz adds) than Steam. However, PS4 has been the reigning indie hub for a few years and its dominance might be coming to an end. Managerial shakeups have recently altered the company's approach to smaller studios -- Sony is losing the indie market and the Switch alone can't support this ecosystem on behalf of all three major consoles. The momentum is poised to shift back toward Steam any time now, but the platform still has to deal with its oversaturation problem.

"I think we came out on the bad part of that rollercoaster, because you know, there's a billion indie games that come out on Steam every single day," Ruiz says.

Ruiz and Wegner aren't making any money on Aztez and they've started picking up contract work again to pay the bills.

Back in 2010, they worked on the game exclusively on nights and weekends, but they managed to secure some investors early on, and Ruiz has been building Aztez full-time ever since. Wegner continued doing contract work and focused on the game in his spare time -- over the past year, however, he's been full-time on Aztez as well.

Team Colorblind has to repay its investors before Ruiz and Wegner see a cent from Aztez. So far, they've sold roughly 2,000 copies of the game across Steam and other, smaller distributors. It sells for $20 when it's not on sale.

"We have no money coming to us until we pay that back," Wegner says. "Which is super frustrating, because if you sell two copies a day on Steam, you're making $1,000 a month after Steam's cut. If you sell 10 copies a day you're making $5,000 a month."

All of that cash -- real and potential -- is being funneled to Aztez's investors for now.

"It'll make money eventually," Wegner says. "Funny thing is like, I'm 37, and I'll be probably in my 40s, like, oh, Aztez made me some money, that's cool."

"Oh no," Ruiz laments.

"Then I'll put it towards my medical class for being a 40 year old."

"I'm 33," Ruiz says. "I might be in my 40s when Aztez makes some money."

Ruiz and Wegner did a lot of things right when it comes to savvy indie distribution -- Ruiz sent out press releases, published YouTube videos and got some high-profile streamers to play Aztez -- but they were a few years behind the market. They launched on Steam when consoles might have been a better move.

Now, they're working on PS4, Xbox One and Switch versions of the game. Aztez is actually running on all of those platforms, but it probably won't launch until early next year, after the holiday rush.

By that time, it's hard to say where the market's energy will be. Aztez could easily miss out on the current console bubble, too. For instance, plenty of players today are excited about indie games on the Switch, but there's no telling how long that interest will last.

"Maybe by the time we launch on Switch, you know, in whatever January, February, we will also be amongst the crowd again," Ruiz says.

Ruiz and Wegner know all it takes is one good day to set Aztez on the path of financial solvency, and they know the console releases could be major. They're also painfully aware all of their plans could fail spectacularly. It's a tense waiting game.

"We're in the part of the metaphor where, like, we hit the other car or the boundary, and flew out the windshield, and our faces and arms just ate street," Ruiz says. "I kind of feel like we're still in the street. So intellectually it's like OK, we're alive, I know we'll be alive, this isn't going to kill us. ... I know we're going to be okay, but everything blows right now. Just blood everywhere, like oh, dang it."

Ruiz and Wegner may be disappointed, bleeding and sore -- but they're not defeated.

"Some day, there could be the spark that becomes the fire, and all of the sudden it gets into the right hands, and then the conversation starts and then it's like oh, fuck," Ruiz says. "We sold 10,000 units today and we did the next day, and then all of a sudden we're a normal, successful game developer. That could happen at any time. But the fact that that is the nature of the universe is just torture. Because every day it doesn't happen. And you know it's not going to happen. ... Once the consoles are done I'm going to be relieved to let that fall out of my brain. But as long as that's in the future I can't abandon it, I can't abandon the idea that like, oh, it might turn around."


Google and Microsoft troll each other over software vulnerabilities

Google has a history of not playing nicely with Microsoft. The company has previously posted publicly about its competitor's software vulnerabilities, and understandably, Microsoft hasn't been very happy about it. But now Microsoft has turned the tables on Google: it found a vulnerability within the Chrome browser, and while Google patched it in beta versions, the fix didn't reach the public release for roughly a month.

However, Google posted the fix to GitHub right away, before it was applied to the public release. According to Microsoft, the fix for this particular issue doesn't expose the underlying vulnerability, but that hasn't always been the case. Microsoft believes fixes should be applied before vulnerabilities become public knowledge.

Microsoft does have a point here. It took Google a month to patch this particular Chrome vulnerability; that's plenty of time for a hacker to examine it and exploit it. It's probably not the best judgment to put fixes for vulnerabilities on GitHub before they're patched in a browser.

That being said, though, are we really benefitting from this one-upmanship between Google and Microsoft? Sure, the issues are being identified and corrected, which is always a good thing. And a bit of friendly competition can certainly be helpful. But this may have veered beyond "friendly" territory and started endangering users' security in the process. Perhaps it's time for both companies to rethink their approach when it comes to these issues.

Source: Microsoft


The Surface Book 2’s secret weapon is ceramic, says Panos Panay

With the Surface Book, Microsoft delivered yet another way to rethink traditional computers. It resembled a laptop more than the earlier Surface devices, which were basically tablets with keyboard covers. But it also packed in one new trick: a large screen that you could easily remove at the touch of a button and use as a tablet. At the heart of that feature was a unique hinge that looked unlike anything else on the market. It had one big problem, though: it wasn't very stable.

It was something that was hard to ignore when you used it on your lap; the screen would shake as you typed, as if the display were barely hanging on for dear life. It made the Surface Book feel more like a prototype than an expensive high-end laptop -- not exactly inspiring. So when it came time for the sequel, it was one of the first things Microsoft addressed.


"The hinges are completely redesigned; it's all from the learnings of the first one, because you want more stability," Microsoft's Panos Panay, the creator of the Surface line and its VP of devices, said in an interview with Engadget. "We redesigned the connection mechanism, we went to ceramics, we lightened the whole product."

Yes, ceramics. That's not something you'd typically find in a notebook, but its ability to deal with high temperatures better than metal made it the ideal material. Specifically, the notebook uses a small ceramic part that works together with muscle wire to attach and detach the screen, as well as keep everything steady.

"We didn't invent muscle wire. But we went and found it and thought, how would you include it with a hinge that could lock these two together, with a mechanism that felt robust and premium," Panay said. "You had to hear it when it was open. You had to know when it was locked... That disconnect moment should be emotional, it should be connected to you, you should understand it."

The ceramic (center) and muscle wire hinge mechanism in the Surface Book 2.


Based on my hands-on time with the Surface Book 2, it's definitely much more stable than the original. And, oddly enough, removing and reattaching the display to the keyboard base felt easier than before. It makes a satisfying click when it connects to the base, but it also smoothly detaches. It might sound like a small change, but it's a truly meaningful one for Surface Book users. We were promised the laptop of the future -- and the future shouldn't have screens that wobble like a bobblehead on a dashboard.

Naturally, there's more to the Surface Book 2 than the hinge. It's more powerful than before, and there's a new 15-inch model joining the family. But is that enough to take on Apple's MacBook Pro, not to mention other powerful laptops?

"With the [Surface Book 2] hardware, we redesigned everything on the inside, period," Panay said. "To get to that next level of performance we needed -- it's three times more powerful -- we put in a quad-core Intel 8th generation CPU. Now you're in a totally different class of computing from gen 1 to gen 2."

Even though the Surface Book was generally well reviewed, early users were plagued with a variety of issues, including screen flickering, power problems and bouts of instability. We talked with Microsoft representatives about those problems shortly after the laptop's launch, and, for the most part, they acknowledged that they still had work to do.

"When we launched the Surface Book, we had some challenges from the silicon through the software," Panay said. "This is why the Surface Book 2 is so important... Right now we look at Surface Book quality and it's off the charts. Did it take some learning to get it to where we needed to be? Absolutely."

While he wouldn't point to any specific changes that helped stability, Panay noted that his team now has a better understanding of how it's pushing the CPU, GPU and hinge components. The Surface Book design is unique among laptops: it houses its CPU in the display but holds its graphics hardware (and additional battery) in the keyboard base. In particular, dealing with those early issues strengthened Microsoft's relationship with Intel, which was essential as it developed the Surface Book 2.

Panay didn't say much about what his team is cooking up next, after reinventing laptops three times, as well as all-in-one desktops with the Surface Studio. But, not surprisingly, he's excited about the vision of seamless computing Microsoft is pushing to consumers.

"There was a point in time where you had to switch between your pen, your touchscreen, and your keyboard," he said. "There was literally a break in flow... The most inspiring thing about our categories today, whether its Cortana with the dual array mic, or interacting with Office with a Pen and touch... As they continue to evolve into a seamless way, we're going to get the best out of people... We're now in a generation where, if you want to get something done: Start. Go. Move."

Photo credit: Mike Kane/Bloomberg/Getty Images (Panos Panay)


Samsung’s phone-as-desktop concept now runs Linux

Samsung's DeX is a clever way to turn your phone into a desktop computer. However, there's one overriding problem: you probably don't have a good reason to use it instead of a PC. And Samsung is trying to fix that. It's unveiling Linux on Galaxy, an app-based offering that (surprise) lets you run Linux distributions on your phone. Ostensibly, it's aimed at developers who want to bring their work environment with them wherever they go. You could dock at a remote office knowing that your setup will be the same as usual.

It's not quite the same as your typical Ubuntu or Debian install. Linux on Galaxy launches through an app, and it's using the same kernel as Android itself in order to maintain performance. And it almost goes without saying that you'll really want a DeX setup, since most Linux apps are expecting a large screen, mouse and keyboard.

As it stands, you'll have to be patient. Linux on Galaxy isn't available right now -- you can sign up for alerts, but it's not ready for public consumption. Even so, this is good evidence that Samsung thinks of DeX as considerably more than a novelty feature. It may be a long, long while (if ever) before many people are using their phones as desktops, but Samsung is willing to gradually build up its ecosystem and eventually give you an incentive to take a second look.

Source: Samsung, Linux on Galaxy


Adobe Photoshop adds support for Microsoft’s Surface Dial

As part of its Creative Cloud 2018 rollout, Adobe has revealed that Microsoft's Surface Dial, a device that seems tailor-made for Photoshop CC, is finally supported by the app. Adobe notes that for now, it's shipping as a "tech preview," meaning you'll have to turn the feature on first, and since it's not production-ready, there may be a few bugs. You'll need a Bluetooth-capable PC running the latest version of Windows 10, and functionality is limited to brush settings for now: you can use the Dial to adjust the brush size, opacity and other parameters.

It's unclear why it's taken Adobe and Microsoft so long to get together on the Surface Dial, or why the functionality is so limited. With Microsoft having just released its Surface Book 2, however, there may have been some pressure for Adobe to at least do something to help motivate creative folks to look at the Dial, which remains a niche product so far.

I've had a chance to use other physical dials -- notably the Palette Gear -- with multiple functions on Photoshop CC and other Adobe apps, so it doesn't seem like it's that hard to implement. Hopefully, the device will be fully functional by the time Microsoft releases its next Surface Studio desktop, whenever that might be.


Samsung leapfrogs Intel again with 8-nanometer chips

Samsung has qualified its 8-nanometer chip-making process for production three months ahead of schedule. It's the same "low power plus" (LPP) process used for its current 10-nanometer silicon, not the next-gen extreme ultraviolet (EUV) lithography for its future 7-nanometer tech. That'll yield chips that are ten percent more energy efficient and ten percent smaller than the 10-nanometer ones it's making right now. At the same time, since the 8-nanometer chips use the same process, Samsung will be able to "rapidly ramp up," it said.

Samsung said that the new process will be ideal for "mobile, cryptocurrency and network/server" applications. It notably worked again with Qualcomm, its 10-nanometer chip launch customer, to perfect the new tech. Rumors in Korea had it that Qualcomm would switch its 7-nanometer production to TSMC, which is reportedly slightly ahead of Samsung in developing that tech.

However, Samsung confirmed with ZDNet that Qualcomm will be using its 8-nanometer process, without providing any specific details. Given that information, it seems likely that Qualcomm will build its next-gen Snapdragon chips with Samsung, using the tried-and-true LPP process instead of bleeding-edge 7-nanometer tech, which necessitates a switch to extreme ultraviolet lithography.

By that time, Samsung should have its own 7-nanometer EUV process up to speed, with 6-nanometer chips set to follow after that. Anyway, Samsung Mobile is probably Qualcomm's biggest customer with its Galaxy S8 and Note 8 phones, so it would have been pretty awkward to split off to another foundry.

Though they don't compete much in the same markets, the news puts Intel even further behind Samsung, at least in terms of chip trace sizes. Intel has yet to release any 10-nanometer chips, though it has said that when it does (in 2018 or 2019), it will be "generations ahead" of Samsung thanks to better feature density. By then, however, Samsung might have closed that gap by being two or three actual generations ahead of Intel in terms of lithography. Samsung is expected to reveal its roadmap for 8- and 7-nanometer chips later today.

Source: Samsung


Earbud translators will bring us closer: The Future IRL

The moment Google Pixel Buds were used earlier this month to demonstrate real-time translation from Swedish to English, people started freaking out about potential use cases for this kind of technology. But the thing is, Google isn't the only company taking this on.

Doppler Labs offered me a chance to try the beta version of its translation software, which runs on its existing Here One earbuds. It plans to release the translation feature in a software update early next year. I jumped at the chance, and first exchanged pleasantries with a fluent Cantonese speaker, then let folks in San Francisco's Dolores Park use the buds to translate Spanish. Everyone who tried them in front of me loved them, but that doesn't mean they're perfect. Proper nouns are enormously difficult to translate across languages, and that was apparent when we asked one person in Spanish whether she preferred House Stark or House Targaryen in Game of Thrones. The translation spit out mostly gobbledygook. I struggled similarly when trying to understand where my conversational partner lived (near Ocean Beach in San Francisco, from what I could tell); it took about three tries to get there.

Doppler Labs plans to up its earbud ante even further in Q3 of 2018, when an updated earbud will offer longer battery life and enough power to handle some of the translation either on the earbuds or on a paired phone, without having to touch the cloud -- something most products like it rely on today.

The wise gadget lover might wait for that updated bud, or for that matter, v.2 of Google Pixel Buds or other competitors. But if you imagine yourself an intrepid explorer of the world, translation earbuds are probably already on your wish list. You could wait for generation two or later products from Google, Doppler, Bragi and more, but let's be real: This technology is simply too life-changing to make yourself wait.


Adobe remakes Lightroom CC as a hybrid app and 1TB cloud service

Adobe has unveiled a raft of new apps and updates for Max 2017, most notably a big revamp of Lightroom CC to make it more cloud-friendly for mobile users. The centerpiece is an all-new Lightroom CC with a 1TB cloud service -- the "Project Nimbus" app that leaked last year. It's a streamlined version of Lightroom that keeps images, edits and metadata synced via Creative Cloud across PC and Mac, Android and iOS. For desktop users who prefer the current, non-cloud app, Adobe has rebranded it as Lightroom Classic CC.

Lightroom CC and Lightroom Classic CC

To be clear, because Adobe's new naming system is pretty darn confusing, Lightroom CC is both a series of apps and a service. As Adobe describes it, Lightroom CC "is designed to be a cloud-based ecosystem of apps that are deeply integrated and work together seamlessly across desktop, mobile and web." Lightroom Classic CC, on the other hand, "is designed for desktop-based (file/folder) digital photography workflows."

Despite the fact that it's cloud-based, Adobe says Lightroom CC is "built on the same imaging technology that powers Photoshop and Lightroom." The desktop app has changed considerably, however. The new version for PC and Mac has an all-new, simplified interface with streamlined sliders, presets and quick-adjustment tools, and some of the features in the old version of Lightroom CC are missing.

The prime feature of Lightroom CC is the cloud sync, which works automatically to save all of your RAW images, edits and metadata, letting you pick up where you left off regardless of your location or device. Another key new feature is Adobe Sensei, an AI algorithm that figures out what's in your images and automatically tags them, much as Google Photos does. Adobe is also highlighting its built-in sharing tools that let you build custom galleries and share them on social media or through the new Lightroom CC Portfolio integration.

The mobile apps on iOS and Android have also been significantly updated, though they'll still work as they did before with Lightroom Classic CC. The iOS version gets Adobe Sensei search and tagging, an enhanced app layout and iOS 11 file support. Meanwhile, on Android, Adobe has finally added tablet support and a local adjustment brush, along with the same Sensei searching as on iOS.

Using a preview copy, I tried out the new desktop version on Windows 10, and the new user interface is completely different and more like the tablet version. Gone are the top "Library," "Develop," "Map," "Slideshow" and other menus, replaced simply by "My Photos" and "Edit." Photo organization has also been simplified, reduced to two grid sizes and a single-image view, eliminating the "Select/Candidate" and "Survey View" modes.

All of the tools from "Develop" are now in "Edit," but some popular tools like "Tone Curve," "Panorama" and "HDR Merge" are no longer available. There's now an "Edit in Photoshop" button that will presumably let you do more fine-tuned work. However, if you've got an established workflow and rely on those missing tools, you'll obviously want to stick with Lightroom Classic CC.

As for the Lightroom Classic CC desktop app, Adobe has made a few small changes, including faster boot times, image previews and file imports, plus new color range and luminance masking functionality. It emphasized that Lightroom Classic CC "continues to focus on a more traditional desktop-first workflow with local storage and file and folder control," compared to the "cloud-centric" operation of Lightroom CC.

I personally liked the new version of Lightroom CC, as I always found the "classic" version to be a bit confusing and cluttered. I generally prefer to use Photoshop, but I can now see myself using Lightroom CC for most of my photo-editing chores instead. The fact that you can pick up a photo edit where you left off, whether you're on the train, at home or at work, is also a life-changing feature for me.

However, there are a lot of users who depend on the app to make a living, and have often automated the use of it to a large degree. Many of those folks will have no interest in the new app, but Adobe is slowly but surely shifting everything to the cloud, so one day, you may have no choice. For now, Lightroom Classic CC users don't have to worry about it, as it's still available for the same price.

Photoshop CC and new apps

Adobe made some significant changes to Photoshop CC as well, most notably updates that will make it easier for cloud users to connect. It also added what it calls "major improvements to learning and getting started," thanks to interactive, step-by-step tutorials and rich tooltips. Other highlights include Lightroom photo access from the start screen, 360-degree spherical panoramic image editing, symmetry painting (tech preview), numerous brush tweaks, new font tools and much more.

The company unveiled three new apps that do three very different things. The first, Adobe XD CC, is aimed at users who want to design and prototype mobile apps and services, developed "in open partnership with the design community through a public beta," the company said.

For animation creators, Adobe also unveiled Character Animator CC. It lets you take graphics and characters from Photoshop or Illustrator, and add "visual puppet controls," pose-to-pose blending, physics behaviors and other 2D character animation tools. Finally, there's Adobe Dimension CC, basically a package that lets designers do quick-and-dirty 3D work for branding, packaging design, etc. "with the ease and simplicity of working with 2D."

Plans and Pricing

With the introduction of Lightroom CC, Adobe has rolled out several new plans that, it has to be said again, are bound to create some confusion because of its naming system. First off, know that all of its image-editing products -- including Photoshop CC, Lightroom CC, Lightroom Classic CC, Illustrator and others -- fall under the "Lightroom CC Photography Service" moniker.

With that in mind, there are three new photography plans, all available starting today. The first is the "Creative Cloud Photography plan with 1TB," which includes Lightroom CC (both the desktop and mobile versions) and Lightroom Classic CC, along with Photoshop CC, Adobe Spark with premium features, Adobe Portfolio and 1TB of cloud storage. That costs $19.99 per month, but Adobe's discounting it to $14.99 for the first year.

The $9.99 "Creative Cloud Photography Plan" gives you the same features but just 20GB of storage, while the all-new $9.99 "Lightroom CC" plan drops Lightroom Classic CC and Photoshop CC but gives you back the 1TB of storage. Adobe will continue to offer Lightroom 6 as "the last stand-alone version of Lightroom that can be purchased outside of a Creative Cloud membership." However, it "will no longer be updated with camera support or bug fixes after the end of 2017," it adds.