Gaming News

Japanese Sex Toy Turned Into A Transformer-Type Robot

August 17, 2018 — by Kotaku.com

Some transforming-type toys turn into cars or fighter jets. This one turns into a male masturbation aid.

Keep in mind, it’s not one-to-one scale and not designed for actual use. This is just Japanese sex-toy maker Tenga releasing an off-beat transforming-type toy.

As Otakuma reports, the regular Tenga Robo is 3,700 yen ($33.50). These are not sex toys in disguise and don’t come with the Tenga wank gear.

A few years back, Tenga also launched a Sony-published online comic about crime-fighting girls who were designed to look like its masturbators.

Tech News

Fox AI predicts a movie's audience based on its trailer

July 29, 2018 — by Engadget.com

doomu via Getty Images

Modern movie trailers are already cynical exercises in attention grabbing (such as the social media-friendly burst of imagery at the start of many clips), but they might be even more calculated in the future. Researchers at 20th Century Fox have produced a deep learning system that can predict who will be most likely to watch a movie based on its trailer. Thanks to training that linked hundreds of trailers to movie attendance records, the AI can draw a connection between visual elements in trailers (such as colors, faces, landscapes and lighting) and the performance of a film for certain demographics. A trailer with plenty of talking heads and warm colors may appeal to a different group than one with lots of bold colors and sweeping vistas.
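Fox hasn’t published its model, but the general shape of the approach can be illustrated with a toy sketch: summarize each trailer as a handful of aggregate visual features and fit one regressor per audience segment against attendance. Everything below is hypothetical, from the feature names to the demographic buckets and the randomly generated data; it shows the pattern rather than Fox’s actual system, which the researchers describe as a deep learning model trained on trailers and attendance records.

```python
# Toy sketch only: made-up features and data standing in for Fox's system.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical training set: one row per trailer.
# Columns: [mean_color_warmth, faces_per_minute, landscape_shot_ratio, mean_brightness]
trailer_features = rng.uniform(0.0, 1.0, size=(300, 4))

# Hypothetical attendance shares for two demographic buckets, loosely tied to
# the features purely for demonstration (real labels would come from
# box-office attendance records).
attendance = np.column_stack([
    0.6 * trailer_features[:, 1] + 0.2 * trailer_features[:, 0],  # crowd drawn to faces and warm colors
    0.5 * trailer_features[:, 2] + 0.3 * trailer_features[:, 3],  # crowd drawn to vistas and bright shots
]) + rng.normal(0.0, 0.05, size=(300, 2))

# Fit one regressor per demographic on the feature/attendance pairs.
models = [Ridge(alpha=1.0).fit(trailer_features, attendance[:, i]) for i in range(2)]

# Score an unreleased trailer: lots of faces and warm colors, few vistas.
new_trailer = np.array([[0.8, 0.9, 0.1, 0.4]])
print([round(float(m.predict(new_trailer)[0]), 3) for m in models])
```

In a production system the visual features would presumably be learned by the network itself rather than hand-picked columns, but the mapping from trailer statistics to per-demographic attendance is the same idea.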

Notably, the deep learning approach already appears to work in real world conditions. While Fox did use existing movies as a benchmark, it also had success anticipating the performance of future movies. Sure enough, the visual cues in a brand new movie trailer gave an idea as to what attendance would be like several months later.

There are flaws in this method. It doesn’t capture temporal info (Fox uses an explosion after a car chase as an example), and it would ideally combine both the video and text descriptions to get a fuller sense of the story.

However, Fox isn’t shy about the practical applications. The AI could help studios craft trailers they know will appeal to a movie’s intended audience, whether they’re casual moviegoers who stick to the blockbusters or aficionados who want something off the beaten path. You might well see trailers that play up specific imagery to increase the chances that you’ll buy tickets. And that’s important in the streaming era, where movie theaters have to compete for viewers who could easily stay home and watch something on Amazon or Netflix.

Tech News

DJI's leaked Mavic 2 drone will come in 'Pro' and 'Zoom' versions

July 29, 2018 — by Engadget.com

Monty_f, Twitter

What little mystery surrounded DJI’s upcoming Mavic 2 drone appears to have evaporated. Numerous UK residents have noticed that the latest Argos catalog includes a prominent ad for the Mavic 2 that reveals just about everything, including some clarifications on past leaks. It doesn’t appear to have a removable gimbal, alas. Instead, there will be separate Mavic 2 Pro and Mavic 2 Zoom models tailored to specific needs. The ‘regular’ Zoom model would include a 2X optical zoom lens for aerial close-ups, while the Pro would pack a Hasselblad camera with a 1-inch sensor and no zoom. You’d have to decide whether quality or flexibility is your main focus.

Other details? The listing confirms 360-degree collision awareness (here described as “omnidirectional obstacle sensing”), a brisk 45MPH top speed and a 31-minute flight time comparable to the Mavic Pro Platinum. There’s no mention of the video recording quality (presumably 4K), although the drones would transmit live 1080p video at distances of up to five miles. And pricing remains a mystery. It could come in around the existing Mavic Pro price range of $999 to $1,099, but that’s far from guaranteed.

Just when the Mavic 2 line would show up is another story. DJI has indefinitely postponed a launch event that was supposed to take place on July 18th, and there’s been nothing since then. The mere presence of the drone in the catalog doesn’t ensure that a launch is around the corner, either. You may still be waiting a while as a result — you’ll just have a decent idea of what to expect whenever the Mavic 2 does arrive.

@OsitaLV better quality image pic.twitter.com/UWuj6k56WE

— Monty_f (@monty_f) July 28, 2018

Tech News

Centauro is a disaster-response robot that looks like a horse

July 26, 2018 — by Engadget.com

IIT

These days, most robots under development seem to be based on humans. However, a tweak to these designs might actually make robots more effective and stable. Centauro is based on the design of a centaur, hence the name, and its four legs on wheels provide movement and freedom that have been unachievable with comparable bipedal models. Its human-like torso and arms allow it to perform fine motor functions.

The robot stands 1.5 meters (almost 5 feet) tall and weighs 93 kg (just over 200 lbs). Its skeleton is composed of lightweight metals, and its body is covered in 3D-printed plastic. Batteries keep Centauro operational for about 2.5 hours. The robot is not autonomous; it requires a human operator, a job that takes quite a bit of training.

Disaster response is the main goal for Centauro, and its leg structure gives it added mobility in precarious areas and situations. “It will be able to navigate in affected man-made environments, including the inside of buildings and stairs, which are cluttered with debris and partially collapsed,” the Centauro Project’s website says. Its four legs provide the much-coveted six degrees of freedom of movement.

Centauro is based on Momaro, which was introduced at the DARPA Robotics Challenge by the University of Bonn. It was the top-ranked robot at the competition, so it was only natural that Momaro’s ingenious design would provide inspiration for other robots, like Centauro. The platform was built by the Italian Institute of Technology (IIT) and is funded by the EU. The CENTAURO Consortium is coordinated by the team at the University of Bonn that developed Momaro.

Tech News

Adorable home robot Kuri is being discontinued

July 25, 2018 — by Engadget.com

Kuri

Cute mechanical companion Kuri is no more. In a blog post published today, manufacturer Mayfield Robotics said that operations have been paused while it evaluates the company’s future, and that pre-orders of the adorable home robot will not be filled (all pre-order deposits will be refunded).

Mayfield Robotics, part of the Bosch Startup Platform, was established in 2015 with a bold vision to domesticate robots. Kuri was designed to be neither traditionally functional (like a vacuum cleaner), nor educational, but was intended to enter the home as a family member, reading to kids, playing with pets and taking photos of precious family moments.

The first Kuri units were priced at $700 apiece — relatively affordable for the tech involved but nonetheless expensive for a robot that didn’t really do much. Interest in Kuri was high, but pre-orders were low. As Mayfield’s blog post notes, “there was not a business fit within Bosch to support and scale [the] business.” Crowdfunded “social robot” Jibo faced a similar issue, failing to scale as backers hoped it would.

The decision arguably reflects the wider robotics industry. Droids are improving, but putting a cute face on them doesn’t make them useful, and usefulness is what’s going to sell products. Look at virtual assistants on smart speakers, such as Amazon’s Alexa. These have become commonplace in modern homes because they have tangible purpose — and are, of course, considerably more affordable. At this stage in robotics R&D, poor adorable Kuri could never compete.

Tech News

Cell-sized robots could help find disease within your body

July 24, 2018 — by Engadget.com

MIT

Small robots aren’t anything new, from DARPA’s insect-sized disaster relief bots to diminutive inchworms powered by humidity. Now, though, researchers at MIT have likely created the smallest robots yet: microscopic, cell-sized electronic circuits made of two-dimensional materials that catch a ride on colloids, insoluble particles that stay suspended in liquid or even air.

Since these minuscule devices can sense their environment, store data and carry out computational tasks, they could eventually be found in oil and gas pipelines, checking for leaks. They could be deployed into the air at a chemical refinery to sense harmful byproducts, or even into the human digestive tract for early detection of illness. “We wanted to figure out methods to graft complete, intact electronic circuits onto colloidal particles,” MIT’s Michael Strano said in a blog post. “Colloids can access environments and travel in ways that other materials can’t.”

Instead of focusing on mobility like previous research efforts have, the current group made its robots more functional. The devices are self-powered, using a small photodiode that provides electricity to the robots’ circuits for computation and memory storage. Each little bot also carries tiny retroreflectors so it can be located after it has traveled through whatever substrate it needs to. Don’t plan on seeing these in your body anytime soon, of course, but according to Strano, they are definitely a “new field” of robotics.

Tech News

DARPA pushes for AI that can explain its decisions

July 23, 2018 — by Engadget.com

ValeryBrozhinsky via Getty Images

Companies like to flaunt their use of artificial intelligence to the point where it’s virtually meaningless, but the truth is that AI as we know it is still quite dumb. While it can generate useful results, it can’t explain why it produced those results in meaningful terms, or adapt to ever-evolving situations. DARPA thinks it can move AI forward, though. It’s launching an Artificial Intelligence Exploration program that will invest in new AI concepts, including “third wave” AI with contextual adaptation and an ability to explain its decisions in ways that make sense. If it identified a cat, for instance, it could explain that it detected fur, paws and whiskers in a familiar cat shape.
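DARPA hasn’t said what such an explanation system would look like under the hood, but the cat example hints at the simplest form of it: report which detected attributes pushed the model toward its decision. The sketch below is a toy with invented attribute names and weights; it illustrates attribute-level evidence reporting, not any actual “third wave” technique.

```python
# Toy sketch only: invented attributes and weights, not DARPA's program.
from dataclasses import dataclass

@dataclass
class Explanation:
    label: str
    score: float
    top_evidence: list  # (attribute, contribution) pairs, strongest first

# Hypothetical attribute weights that a separate detector would supply.
CAT_WEIGHTS = {"fur": 0.9, "paws": 0.8, "whiskers": 1.2, "cat_shape": 1.5, "wheels": -2.0}

def classify_with_explanation(detected: dict) -> Explanation:
    """detected maps attribute name -> detection confidence in [0, 1]."""
    contributions = {a: CAT_WEIGHTS.get(a, 0.0) * conf for a, conf in detected.items()}
    score = sum(contributions.values())
    evidence = sorted(contributions.items(), key=lambda kv: kv[1], reverse=True)[:3]
    label = "cat" if score > 1.5 else "not a cat"
    return Explanation(label, score, evidence)

# "It detected fur, paws and whiskers in a familiar cat shape."
print(classify_with_explanation({"fur": 0.9, "paws": 0.7, "whiskers": 0.8, "cat_shape": 0.95}))
```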

Importantly, DARPA also hopes to step up the pace. It’s promising “streamlined” processes that will lead to projects starting three months after a funding opportunity shows up, with feasibility becoming clear about 18 months after a team wins its contract. You might not have to wait several years or more just to witness an AI breakthrough.

The industry isn’t beholden to DARPA’s schedule, of course. It’s entirely possible that companies will develop third wave AI just as quickly on their own. This program could light a fire under those companies, mind you. And if nothing else, it suggests that AI pioneers are ready to move beyond today’s ‘basic’ machine learning and closer to AI that actually thinks instead of merely churning out data.

Tech News

Harvard's robot arm can grab squishy sea animals without hurting them

July 21, 2018 — by Engadget.com

Wyss Institute at Harvard University

As you might imagine, you can’t just grab extra-soft sea creatures like jellyfish or octopuses when you want to study them. Not if you want them to remain intact, anyway. Thankfully, researchers at Harvard’s Wyss Institute have a far more delicate solution. They’ve created a robot arm (the RAD sampler) whose petal-like fingers can quickly form a ball shape around an animal, capturing it without risking any harm. It’s simpler than it looks — it uses just a single motor to drive the entire jointed structure, so it’s easy to control and easier still to repair if something breaks.

To date, the arm has only been used for catch-and-release experiments. In the future, though, biologists could outfit the machine with cameras and sensors to collect information about whatever’s inside the sphere, whether that’s its material composition, size or genetic sequence. If that happens, researchers could study fragile undersea critters in their native habitats and glean insights that wouldn’t be available above water or with dead specimens.

Tech News

DARPA's insect-sized SHRIMP robots could aid disaster relief

July 19, 2018 — by Engadget.com

DARPA

DARPA’s efforts to propel military technology forward often manifest in a diverse fashion, spanning everything from drone submarine development to a biostasis program that aims to buy more time to rescue soldiers on the battlefield. The SHRIMP program, short for SHort-Range Independent Microrobotic Platforms, is another potentially life-saving initiative, with tiny robots designed to navigate hazardous natural disaster zones.

What differentiates SHRIMP from microrobotics limited by SWaP (size, weight and power) constraints is its size. DARPA has managed to shrink the tech down to the size of an insect — a scale of millimeters to centimeters. Program manager Dr. Ronald Polcawich says this smaller scale is what gives SHRIMP robots an advantage over larger robots, which are too bulky to inspect damaged environments.

Downsizing robotics comes with various trade-offs, which notably include the loss of technical power and control to effectively carry out tasks. To combat such challenges, DARPA plans to pursue actuator materials and mechanisms that would prioritize factors like strength-to-weight ratio and maximum work density. Advances in these areas could equip SHRIMP with both the endurance and ability required to execute critical tasks.

The SHRIMP robots are part of DARPA’s push to drive forward functional microrobotics that offer unrestricted mobility, dexterity, and maneuverability. They’re set to undergo rigorous “Olympic-style” trials that will scrutinize the robots’ capacity to jump, lift increasingly heavy masses and traverse inclines, as well as measure their overall efficacy. The tests are expected to begin in March 2019, and DARPA says it anticipates a total of $32 million in funding for research and development.

Tech News

'Robot chemist' could use AI to speed up medical breakthroughs

July 18, 2018 — by Engadget.com

Getty Images/iStockphoto

Scientists can only do so much to discover new chemical reactions on their own. Short of happy accidents, it can take years to find new drugs that might save lives. They might have a better way at the University of Glasgow, though: let robots do the hard work. A research team at the school has developed a “robot chemist” that uses machine learning to accelerate the discovery of chemical reactions and molecules. The bot predicts the outcomes of chemical reactions based on what it gleans from direct experience with just a fraction of those interactions. In a test with 1,000 possible reactions from 18 chemicals, the machine only needed to explore 100 of them to predict study-worthy reactions in the entire lot with about 80 percent accuracy.

The university said it found four reactions just through this test, and one of them was in the top one percent of unique reactions.

That may not sound like a great success rate, and it will ideally get better. However, it’s easy to see the robot dramatically speeding up the discovery process by letting scientists focus on the handful of reactions that are most likely to pan out. That could accelerate the development of new treatments, new battery formulas and extra-strong materials. And it wouldn’t necessarily cost jobs — rather, it could help chemists focus on the trickier aspects of research instead of plowing through mundane tests.
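The Glasgow team’s code isn’t reproduced here, but the test-a-sample-then-predict-the-rest workflow described above is easy to sketch. The toy below uses invented assumptions (three-chemical mixtures and a synthetic ground truth standing in for lab measurements) to mirror the idea of running roughly 100 of about 1,000 candidate reactions and letting a model flag which untested mixtures look worth studying.

```python
# Toy sketch only: synthetic "chemistry" standing in for real lab measurements.
import itertools
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
N_CHEMICALS = 18

# Candidate reactions: every 3-chemical mixture, encoded as an 18-bit indicator vector.
mixtures = list(itertools.combinations(range(N_CHEMICALS), 3))  # 816 combinations
X = np.zeros((len(mixtures), N_CHEMICALS))
for i, mix in enumerate(mixtures):
    X[i, list(mix)] = 1.0

# Invented ground-truth "reactivity" playing the role of what the lab would measure.
secret_weights = rng.normal(size=N_CHEMICALS)
y = (X @ secret_weights > 1.0).astype(int)

# "Run" only 100 experiments, chosen at random, and train on those outcomes.
tested = rng.choice(len(mixtures), size=100, replace=False)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[tested], y[tested])

# Predict the rest and check against the held-out measurements.
untested = np.setdiff1d(np.arange(len(mixtures)), tested)
accuracy = (model.predict(X[untested]) == y[untested]).mean()
print(f"Predicted reactivity for {len(untested)} untested mixtures, accuracy: {accuracy:.2f}")
```

In the real study the labels come from actual bench experiments, and the payoff is the one described above: chemists only have to run the reactions the model flags as promising.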