The team here at Unbounded Robotics is pleased and excited to introduce the UBR-1 today. UBR-1 is a state-of-the-art, ROS-based mobile manipulation platform designed for robotics researchers and businesses.
A bit of a story, and then a lot of pictures — by far the most interesting class I’ve ever taken was Advanced Brain Imaging in grad school. As a hands-on lab class, each week we’d have a bit of a lecture on a new imaging technique, and then head off to the imaging lab, where one of the grad students would often end up in the Magnetic Resonance Imager (MRI) and we’d see the technique we’d just learned about demonstrated. Before the class I was only aware of the structural images that most folks think of when they think of an MRI, as well as the functional MRI (or fMRI) scans that measure blood oxygenation levels correlated with brain activity and are often used in cognitive neuroscience experiments. But after learning about Diffusion Tensor Imaging, spin-labeling, and half a dozen other techniques, I decided that the MRI is probably one of the most amazing machines that humans have ever built. And I really wanted to build one.
A stepper motor to satisfy all your robotics needs! This 4-wire bipolar stepper has 1.8° per step for smooth motion and a nice holding torque. The motor was specified to have a max current of 350mA so that it could be driven easily with an Adafruit motor shield for Arduino (or other motor driver) and a wall adapter or lead-acid battery.
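A quick sanity check on those specs: 1.8° per step works out to 200 full steps per revolution. The helpers below (hypothetical names, plain C++ rather than an Arduino sketch) show the conversion a motor-driver library does internally when you ask it to rotate by a given angle:

```cpp
#include <cmath>

// Full steps per revolution for a given step angle in degrees.
// For a 1.8-degree motor: 360 / 1.8 = 200 steps per revolution.
int stepsPerRevolution(double stepAngleDeg) {
    return static_cast<int>(std::round(360.0 / stepAngleDeg));
}

// Number of full steps needed to rotate the shaft by targetDeg degrees.
int stepsForAngle(double targetDeg, double stepAngleDeg) {
    return static_cast<int>(std::round(targetDeg / stepAngleDeg));
}
```

With an Arduino motor shield you would pass the 200 steps-per-revolution figure to the library when constructing the stepper object, then command moves in step counts computed this way.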
One of the recent projects garnering a lot of attention in the 3D printing community, and in the open prosthetics movement in general (Jon Schull shared it in e-NABLE here, thanks for the tip), is a teaching project by Richard Hague. It is both a model of many of the elements involved in feedback in the human arm and an example of how an electronic and mechanical 3DP design might replicate this complexity within a tight form factor.
3D printing can render everyday artifacts in clear plastic, so we can see in unprecedented detail how they work – and this exquisite model of a prosthetic arm is a brilliant example. It is one of the highlights at the London Science Museum’s 3D printing exhibition, which features more than 600 printed objects.
Designed by Richard Hague, director of the Additive Manufacturing and 3D Printing Research Group at the University of Nottingham, UK, and his students, the arm shows how the printers can create strong structures, mobile joints and delicate sensors – like spiral-shaped metal touch-detectors – all in one process.
“It’s a mock-up but it shows circuits that sense temperature, feel objects and control the arm’s movement,” says Hague. “3D printing gives us the freedom to make complex, optimised shapes, and our research aim is focused on printing-in electrical, optical or even biological functions.”
Such techniques are also bringing prosthetics to people who previously could not afford them. For instance, the open-source “robohand” project, pioneered by South African carpenter Richard Van As, aims to print cheap, plastic customised prostheses for people who have lost fingers, or who were born with some digits missing or malformed. Some of his work – with the designs available online – is also on show at the Science Museum.
This is a non-profit video made solely for the purpose of disseminating the research involved in this project.
Here, we present an initial effort toward the design of tabletop game agents that are perceived as socially present and can interact socially with several human players. To achieve this, we started by building a custom digital table and performing some initial empirical studies. Based on these studies and on related work, we developed and evaluated a case study in which the digital table serves as an interface and a social robot plays Risk against three human opponents.
Advancements in robotics are continually taking place in the fields of space exploration, health care, public safety, entertainment, defense, and more. These machines — some fully autonomous, some requiring human input — extend our grasp, enhance our capabilities, and travel as our surrogates to places too dangerous or difficult for us to go. Gathered here are recent images of robotic technology at the beginning of the 21st century, including robotic insurgents, NASA’s Juno spacecraft on its way to Jupiter, and a machine inside an archaeological dig in Mexico.
Via designboom: this robotic arm installation by artist Jonathan Schipper prints out tiny salt-crystal structures that look like cities rising from dunes. And if that weren’t piquing visitors’ interest enough, the installation can be viewed from a hot tub, so you can get a jet-water massage while salt-pillared cities (or interpretations of Lot’s wife) rise around you. Named ‘detritus’, the installation is on display at the Boiler in Brooklyn until November 24th of this year.
Poppy is an open-source humanoid platform based on robust, flexible, easy-to-use hardware and software. Designed by the Flowers Lab at Inria Bordeaux (France), its development aims to provide an affordable humanoid robot for research and education. More information at poppy-project.org
The robot with a human face unveiled at the Smithsonian’s National Air and Space Museum was built by London’s Shadow Robot Co to showcase medical breakthroughs in bionic body parts and artificial organs. “This is not a gimmick. This is a real science development,” museum director John Dailey said.
Rich_O shared an Arduino robotics project he has been working on for a while on the Adafruit Forums — and has shared code he tuned for autonomous roving!
I wanted to share the learning project I have been working on to program the Arduino and control a robot.
I purchased the J-Bot rover chassis (4 wheel / 4 motor, DFROBOT ROB003), the PING))) ultrasonic range finder, a standard RC servo (HS-311), an Arduino Uno and the Adafruit V2 motor controller. I initially attempted this project with the Leonardo, but it was too difficult for me to use the USB interface with the Leo’s virtual USB port. I used a 5 AA battery holder (fitted inside the rover between the motors) to power the motors and a 6 AA battery pack to power the Arduino Uno and servo.
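The core of an autonomous rover like this is converting the ultrasonic sensor's echo time into a distance and deciding whether to drive or turn. A minimal sketch of that logic, assuming the standard PING))) timing (sound travels roughly 1 cm per 29 µs, and the echo covers the distance twice); the function and threshold names here are hypothetical, not from Rich_O's code:

```cpp
// Convert a PING))) echo pulse width (microseconds, round trip)
// to a distance in centimeters. Sound covers ~1 cm per 29 us,
// and the pulse travels out and back, so divide by 2.
long microsecondsToCm(long durationUs) {
    return durationUs / 29 / 2;
}

// Hypothetical obstacle-avoidance decision: turn when something
// is closer than the threshold, otherwise keep driving forward.
enum Action { FORWARD, TURN };

Action chooseAction(long distanceCm, long thresholdCm) {
    return (distanceCm < thresholdCm) ? TURN : FORWARD;
}
```

On the Arduino itself, the echo duration would come from timing the sensor's pulse, and the chosen action would map to motor-shield commands for the four drive motors.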
My next step is to add remote control via integration with my onboard Raspi via WiFi and the MobaXterm remote terminal.
Heather Knight is in the bully pulpit for artificial intelligence. According to her TED profile:
Her installations have been featured at the Smithsonian-Cooper Hewitt Design Museum, LACMA, SIGGRAPH, PopTech and the Fortezza da Basso in Florence, Italy. Her past work also includes robotics and instrumentation at NASA’s Jet Propulsion Laboratory, interactive installations with Syyn Labs (including the award-winning Rube Goldberg machine music video with OK Go), and sensor design at Aldebaran Robotics. She was recently named Assistant Director of Robotics at Humanity+.
Petting Zoo by Minimaforms is a speculative robotic environment populated by artificially intelligent creatures designed with the capacity to learn and explore behaviors through interaction with participants. Within this immersive installation, interaction with the pets fosters human curiosity and play, forging intimate exchanges that are emotive, that evolve over time, and that enable communication between people and their environment. The pets exhibit life-like attributes through conversational interaction, establishing emotive and sensorial communication with users. Social and synthetic forms of systemic interaction allow the pets to engage and evolve their behaviors over time, developing features and personalities formed through their encounters with the general public. The pets stimulate participation through animate behaviors communicated visually, haptically and aurally, and their interactions are triggered both by human users and by other pets within the population.
Jim Henson made this film in 1963 for The Bell System. Specifically, it was made for an elite seminar given for business owners on the then-brand-new topic of Data Communications. The seminar itself involved a lot of films and multimedia presentations, and took place in Chicago. A lengthy description of the planning of the Bell Data Communications Seminar — sans any mention of the Henson involvement — is on the blog of Inpro co-founder Jack Byrne. It was later renamed the Bell Business Communications Seminar.
The organizers of the seminar, Inpro, actually set the tone for the film in a three-page memo from one of Inpro’s principals, Ted Mills, to Henson. Mills outlined the nascent but growing relationship between man and machine, a relationship not without tension and resentment: “He [the robot] is sure that All Men Basically Want to Play Golf, and not run businesses — if he can do it better.” (Mills also later designed the ride for the Bell System at the 1964 World’s Fair.) Henson’s execution is not only true to Mills’ vision, but he also puts his own unique, irreverent spin on the material.
The robot narrator used in this film had previously starred in a skit for a food fair in Germany (video is silent), in 1961. It also may be the same robot that appeared on the Mike Douglas Show in 1966. Henson created a different — but similar — robot for the SKF Industries pavilion at the 1964 World’s Fair.
This film was found in the AT&T Archives. Thanks go to Karen Falk of the Henson Archives for providing help and supporting documentation to prove that it was, indeed, a Henson production.
Footage courtesy of AT&T Archives and History Center, Warren, NJ