Cheap, efficient and emotionally intelligent robots may edge out legions of workers over the next 20 years, experts have predicted.
More than a third of British workers will be displaced by machines and artificial intelligence by 2035, with job losses particularly acute among blue-collar employees, they say. In the US, nearly half the workforce is expected to be displaced.
New industrial robots will replace low-paid workers, while the improved creativity of artificial intelligence puts paid to white-collar jobs, Robot Revolution, a report from Bank of America Merrill Lynch, says. People earning less than £30,000 a year are five times more likely to be displaced than those on more than £100,000 a year, the report suggests.
“We are seeing the earliest cognitive stages of human and machine development, where robots are able to collect large amounts of data, analyse it and make optimum decisions, and potentially learn from past interactions,” the report says. “Looking out to the future, we are likely to see the evolution of intelligent machines that can sense and understand human emotion and also show adaptability to their surroundings, rendering them increasingly autonomous.”
The report suggests that companies begin to replace human workers with robots when the cost saving reaches 15 per cent. As the cost of employing robots falls, this tipping point is more easily reached.
In the US, the cost of employing a welding robot is now about £5 an hour, compared with £16 for a human equivalent.
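The report's tipping-point logic is simple arithmetic. Here is a minimal sketch using the article's own welding figures; the function names are invented for illustration, not taken from the report:

```python
def cost_saving(human_rate: float, robot_rate: float) -> float:
    """Fractional saving from substituting a robot for a human worker."""
    return (human_rate - robot_rate) / human_rate

def past_tipping_point(human_rate: float, robot_rate: float,
                       threshold: float = 0.15) -> bool:
    """The report's rule of thumb: replacement starts at a 15% saving."""
    return cost_saving(human_rate, robot_rate) >= threshold

# The article's welding example: £16/hour human vs £5/hour robot.
saving = cost_saving(16, 5)
print(saving)                     # 0.6875, i.e. a 69% saving
print(past_tipping_point(16, 5))  # True, well past the 15% threshold
```

On these figures the welding robot is far beyond the 15 per cent tipping point, which is why the report treats welding as an early case. As robot hourly costs fall, ever smaller wage gaps clear the threshold.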
In San Francisco, a small start-up company is developing the world’s first fully automated burger maker to replace workers in fast-food restaurants. The robot, which is being made by Momentum Machines, will shape burgers from minced meat, cook them to a specified level of chargrilling, toast the buns, add tomatoes, onions and pickles and place the finished products on a conveyor belt.
The Robot Cambrian Explosion
Half a billion years ago, Earth’s animal life rapidly evolved during the event known as the Cambrian explosion. In the future, growing swarms of robots all talking with one another could spark a similar “Cambrian explosion” for robotic evolution. A robotics expert who has worked for the U.S. military recently published a paper on the technological changes that could rapidly spawn the next generation of robots powered by advanced artificial intelligence. He also weighs the consequences of robots rapidly replacing huge numbers of human workers.
Two technologies could play the biggest roles in rapid robot and AI evolution, according to Gill Pratt, who has served as robotics program manager for the U.S. Defense Advanced Research Projects Agency (DARPA). First, “Cloud Robotics” could allow robots to share experiences and knowledge through wireless connections and the Internet. Second, “Deep Learning” algorithms allow robots to learn from experience and apply those lessons to more general scenarios. Together, they could lead to more capable robots with the AI brains to handle many more jobs currently done by humans, according to Pratt’s paper published in the Journal of Economic Perspectives.
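The "Cloud Robotics" idea Pratt describes, one robot's lessons becoming every robot's lessons, can be caricatured in a few lines. This is a toy sketch only; the class and method names are invented for illustration and do not come from Pratt's paper:

```python
class CloudRobot:
    """Toy model of a robot that pools learned experience via 'the cloud'."""

    shared_experience = []  # class-level list: the shared pool across all robots

    def __init__(self, name: str):
        self.name = name

    def learn(self, lesson: str) -> None:
        # A lesson learned locally by one robot is uploaded to the pool...
        CloudRobot.shared_experience.append((self.name, lesson))

    def known_lessons(self) -> list:
        # ...and is immediately available to every other robot.
        return [lesson for _, lesson in CloudRobot.shared_experience]

a, b = CloudRobot("arm-1"), CloudRobot("arm-2")
a.learn("grip eggs gently")
print(b.known_lessons())  # ['grip eggs gently'] -- b never learned it directly
```

The point of the sketch is the asymmetry with human workers: a human apprentice must be trained one at a time, while networked robots can, in principle, inherit each other's experience at the speed of a download.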
Predicting the Cambrian Explosion
Such considerations may seem far off in the future. But Pratt devotes most of his paper to describing progress in data storage, battery energy storage, electronics power efficiency, computation power and other technological areas that could help spark the Cambrian revolution in robotics. He also mentions the significant amounts of money tech companies have poured into developing better robots and AI.
“The timing of tipping points is hard to predict, and exactly when an explosion in robotics capabilities will occur is not clear,” Pratt writes. “Commercial investment in autonomy and robotics—including and especially in autonomous cars—has significantly accelerated, with high-profile firms like Amazon, Apple, Google, and Uber, as well as all the automotive companies, announcing significant projects in this area.”
The U.S. military and other government agencies have also been pushing the boundaries of robotics research. During his time with DARPA, Pratt oversaw the DARPA Robotics Challenge, which in the summer of 2015 awarded a total of $3.5 million to the most capable robots from teams around the world. Such robots had to demonstrate their capability to handle physical tasks such as turning a valve, tripping circuit breakers, climbing stairs and even driving a car.
In the end, nobody knows exactly when a Cambrian explosion in robotics could happen. But some AI researchers have begun looking to past examples of abrupt leaps in technological progress for clues about the future of AI and robots; they’re even paying “research bounties” worth between $50 and $500 for historical examples. Their hope is that a sudden rise of AI won’t catch humanity completely by surprise.
Robot Surgeons
SURGEONS will use robots to conduct kidney transplants for the first time in Britain this spring, as the machines take on an increasing number of medical procedures.
Some medics predict that, in the next five years, robots will make the decisions about where to cut into the patient.
From next month, surgeons at Guy’s and St Thomas’ NHS Foundation Trust in London will turn to robots to perform keyhole kidney transplants, because the machines can carry out the most crucial part of the operation more quickly than a human.
Keyhole surgery is less painful and allows the patient to recover more quickly. At the moment, however, many surgeons feel they are not able to sew the blood vessels attaching the donated kidney to the patient quickly enough during a keyhole procedure. If not done swiftly enough, the donor kidney becomes damaged because it does not have a blood supply from the patient.
Carefully attaching the blood vessels between the donor kidney and the patient is the most important element of a kidney transplant.
A transplant surgeon will control the robot, which should complete the sewing through the tiny keyhole-sized incision within 30 to 40 minutes.
Nizam Mamode, consultant transplant surgeon at Guy’s and St Thomas’, who will be doing the robotic transplants, said: “What the robot does is allow you to stitch much more quickly than you otherwise would be able to do.”
So far, about 300 such robotic transplants have been conducted in India. Smaller numbers have been carried out in America and Italy.
Initially, the da Vinci robot, already used for other types of surgery including bladder surgery, will be used for the transplants. Surgeons expect to soon have a robot designed specifically for transplants.
Mamode predicts that in future robots will not just be driven by surgeons but will make key decisions such as where to cut into the patient.
Mamode said: “I think, increasingly, the robots will be making those decisions ... about where to cut. So, if you just think about going through the abdominal wall, normally the way that we would do it is, take a knife, go through the skin, stop, seal all the bleeding vessels, cut a bit more, have a look to see if we are right down to the level we need to be, cut a bit more. We are very close to a stage where a robot could do that.”
Brian Davies, emeritus professor of medical robotics at Imperial College London, believes surgeons will still wish to take the final decision about where the series of cuts are made, but says robots will increasingly advise doctors.
Davies said: “What we will see more of in the next five years is the ability to advise the surgeon of the suggested location and sequence of cuts, together with alternative possibilities.”
Surgeons at the Royal Devon and Exeter NHS Foundation Trust are investigating whether imaging technology could be attached to the end of a robot arm so that during an operation to remove cancerous tissue, the machine could tell the surgeon where the cancer is and how much tissue should be removed.
John McGrath, consultant urological surgeon at Royal Devon and Exeter, says it is possible that in future robots could be programmed to perform such an operation on their own.
McGrath said: “If we knew on the MRI scan before surgery where the tumour was, you could take the surgeon out completely and just program into the robot a series of co-ordinates, and it would safely remove the area we have identified on the scan.”
Davies believes the surgeon will be reluctant to let robots work entirely on their own, however. He said: “Such systems are not totally autonomous in deciding what is the target tissue in real time on the patient. The surgeon, in taking responsibility for her patients, will want to have the final say.”
Robots are now commonly used to remove prostate glands, bladders and wombs. It is estimated that 80% of prostate cancer patients now have the prostate removed by a robot.
Personal Robot
“Meet Sally — she is your new best friend,” the ad for Persona Synthetics begins. “The help you’ve always wanted. She is faster, stronger, more capable than ever before. She can be just about anyone: a teacher, a helper, a carer, a friend.”
The words soothe, over images of a prim electronic Mary Poppins cooking, cleaning, watering the garden and saving children. It managed to fool Twitter for a while: “Anyone seen the creepy Persona Synthetics ad on C4? Scared the hell out of me!” chirped Umar Siddiqui.
In fact, it was a marketing stunt for Channel 4’s Humans, a sci-fi eight-parter set in a parallel world where androids are ubiquitous household gadgets.
Loosely based on the Scandi drama Real Humans, it unpicks the complex emotional events set in motion when Gemma Chan’s beautiful robot, or synth, starts working for a family with a near-absent career mum (an excellent Katherine Parkinson), struggling dad (Tom Goodman-Hill), young child and two hormonal teenagers.
“We were keen to avoid the typical sci-fi dystopia where you’d have the synths lay waste to humanity,” says Jon Brackley, who adapted the Swedish show with his Spooks writing partner, Sam Vincent. “This is a world we think could happen, and we’ve tried to portray it as realistically as possible. It’s basically what would happen if our iPhones came in human form.”
Domestic drudgery being the stuff of reality television, Humans has a thriller plot involving William Hurt, the inventor of the synths, who secretly gives a select few real human emotions. The story has echoes of Blade Runner, including an icy blonde synth who spends time as a sex worker, but, as Vincent points out: “There is, in essence, only one robot story, and that’s, ‘What are they going to do to us?’”
Humans seems distinct from our fascination with post-apocalyptic mechanical slugfests. Instead, like Alex Garland’s recent Ex Machina, it recalls the fertile era of Isaac Asimov stories, using future tech to satirise everything from the Cold War to sex as it riffs on what occurs when cheap or illegal labour is replaced on farms and in brothels. For Parkinson, it was this emotional element that appealed. She sees the show as being about the trade-off we’re making when we entrust our lives to others, from Apple to au pairs.
“I was breastfeeding throughout filming, so I was interested in the idea that when we delegate what we call menial jobs, we can be giving away more than we want,” she says. “If you get someone in to change your baby’s nappy because you’re going back to work, that’s great. But every nappy change is also an intimate moment with your child.”
Parkinson’s Laura is deeply suspicious of Chan’s Anita, guiltily aware that her career and what seems to be an affair are keeping her away from home and letting Anita in. She tries to take over stories at bedtime and checking in on her daughter at night, but falters in the face of Anita’s robotic responses. Parkinson found playing opposite Chan’s unruffled features unsettling: “I’d be trying to make her laugh, or waving my hands around like mad to overcompensate for her stillness.”
She found herself thinking of Chan/Anita as not quite human, and it had an eerie effect. “Although Anita is synthetic, she looks so beautiful, and it made me think of the silicone breasts in porn. You could say those breasts are synthetic, so surely they don’t arouse a man, right? Well, it doesn’t work like that.
“The thing about people is, we’ll see something beautiful and fall in love with it. That seems to be the risk we’re facing: loving our machines and hating each other.”
Humans
Imagine a world where robots look, act, and function like human beings, minus consciousness or free will. They’ve been programmed to take over sophisticated tasks, like housekeeping, manual labor, medical care and, yes, sex. Imagine the human equivalent of a Roomba.
This is the world depicted in AMC’s Humans, which recently wrapped up its first season. Set in a “parallel present” where highly sophisticated androids—called “synths”—have become ubiquitous, the show explores what it would be like for humans to one day experience intimacy, jealousy, and bigotry towards machines.
Critics have praised Humans for going beyond sci-fi’s typical doomsday portrayal of artificial intelligence. If the Terminator and Matrix franchises reflected our fear of one day being dominated by machines, Humans taps into a subtler anxiety about being replaced by them. In this parallel world, synths don’t just take jobs, they also excel at the subtleties of domestic labor: preparing home cooked meals, massaging a tired spouse, reading to a child before putting her to bed.
Writing for Engadget, Devindra Hardawar argues that we’re seeing a shift in representations of A.I. in popular culture like Ex Machina and Black Mirror, another U.K. TV import, both of which depict androids designed to evoke human emotions. For Hardawar, we seem to be preparing for an inevitable future. “We’re still facing our fears and anxieties of this new tech through science fiction,” Hardawar points out, “but now it’s on a much smaller and more intimate scale.”
But humans won’t be displaced by androids. As fascinating as this new wave of sci-fi is, it misses something important: our robotic usurpers are already here, they just don’t look like us. Any future involving the displacement of humans most likely belongs to “theroids,” a vast menagerie of mechanical beasts.
Robots taking jobs
Humans’ title sequence opens with grainy video clips documenting the progress of android dexterity and intelligence. A giant metal hulk gives way to articulating arms and fingers; soon robots are sewing, moving chess pieces, and playing the violin; and finally, snippets of advertising promise “extra help around the house,” and ask, “What could you accomplish if you had someone—something—like this?” A newspaper headline states: “Robots threaten 10 Million Jobs.”
In the world of Humans, a linear progression of android development has resulted in synths who match or exceed human dexterity and capability, enabling them to take over a number of jobs, from waiters to nurses to 9-1-1 operators. Low-skilled workers have been displaced en masse, spawning the “We Are People” movement. But Humans isn’t concerned with the politics of this world so much as it is with the emotional toll it might take on a middle-class family like the Hawkinses, who purchase a synth they call Anita in the pilot episode.
We see the mother, Laura Hawkins, cringe when her youngest daughter prefers that Anita read to her at bedtime. “I can take better care of your children than you can,” Anita tells Laura. And it’s true, of course, that synths are never tired, bored, angry, or intoxicated; that they bake better cakes, never get impatient when reading to a child, and can even attend to the needs of a lonely husband. But we often show love through acts of domestic labor. Humans asks how it would feel to outsource such care.
Meanwhile, Laura’s eldest child Mattie must navigate young adulthood in a rapidly changing world. She had wanted to be a doctor, but now synths do the job better than any human can. “There’s nothing I could do that a synth can’t do better,” she complains to her boyfriend, as they sit overlooking a golf course. The synth caddies, she remarks, can hit holes-in-one every time.
What will our usurpers look like?
This proliferation of androids forces us to ask a very specific set of existential questions about our relationship to technology and what remains of human purpose. Such questions emerge organically in the world of Humans because synths look just like us. They’re essentially a digital upgrade: Humanity 2.0.
But while androids make for a powerful literary device through which to explore our fear that technology might one day surpass us, the human form isn’t always best for the job. Take, for example, the synth telephone operator. Why create a physical robot for this job, which has to receive audio through a wired earpiece and then respond via a speaker into a microphone? Couldn’t synth software do the trick without complex parts that mimic the functions of ears and a mouth?
Furthermore, wouldn’t it make better economic sense to distribute artificial intelligence across multiple hardware platforms, instead of clustering so much precious technology into a single body? Wouldn’t it make more sense to have a dog-like Roomba, a wireless home operating system, and a self-driving car, since each component could be upgraded and replaced?
In our own world, this diversified approach to robot morphology is already the norm. Rather than build androids with broad intelligence and skill sets, manufacturers have been developing highly specialized robots for specific tasks. Many are modeled after existing creatures, and so it would be more accurate to call them “theroids,” or “animal-like” robots. For underwater spying the U.S. Navy built a drone that looks and swims like a bluefin tuna; Boston Dynamics’ “Big Dog” is a tireless pack mule that will walk alongside soldiers in the field; and robotic swans might soon be testing water quality near you, which makes sense since swans are well designed for floating along lakes.
Visit Japan’s first robot-staffed hotel, and you’ll interact with a number of theroids, including a robotic dinosaur. If the dinosaur accidentally hurts you, and you’re elderly, you might be taken care of by a bear. This cuddle bot is a patch of interactive fur, in case that’s your thing. Of course animals aren’t always the best shape for tasks, either. Hotels around the world are buying room service robots that happen to resemble floating trashcans.
In our own world we’re also seeing white-collar jobs outsourced to intelligent machines. Instead of being shaped like humans, however, ours are shaped like computers. In Rise of the Robots, Martin Ford details a number of white-collar careers that have been threatened, or outright replaced, by clever software. The process of legal discovery, for example, was once the job of trained lawyers and paralegals as it took a human mind to discern whether a certain document or fact had potential relevance to the case at hand. Today, “e-Discovery” software can analyze millions of electronic documents and isolate the relevant ones. They go beyond mere keyword searches, using machine learning to isolate concepts, even if specific phrases aren’t present.
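The gap between a plain keyword search and the concept-level matching Ford describes can be shown with a toy example. The synonym table, documents and function names below are invented for illustration; real e-Discovery systems learn such associations from data rather than relying on a hand-built word list:

```python
# Toy illustration: keyword search vs. concept-level search.
# Real e-Discovery tools use machine-learned models, not a fixed table.

CONCEPTS = {
    # Hypothetical hand-built "concept": words that signal the same idea.
    "payment": {"payment", "invoice", "remittance", "wire"},
}

docs = [
    "Please confirm the wire went out Friday.",
    "Lunch menu for the quarterly offsite.",
]

def keyword_hits(documents, term):
    """Naive keyword search: matches only the literal term."""
    return [d for d in documents if term in d.lower()]

def concept_hits(documents, concept):
    """Concept search: matches any word associated with the concept."""
    words = CONCEPTS[concept]
    return [d for d in documents
            if words & set(d.lower().replace(".", "").split())]

print(keyword_hits(docs, "payment"))  # [] -- the keyword never appears
print(concept_hits(docs, "payment"))  # finds the wire-transfer email
```

The first document is plainly about a payment, yet a keyword search for "payment" misses it; the concept search catches it because "wire" belongs to the same idea. Scaled up to millions of documents, that difference is what lets software displace discovery work once done by lawyers.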
Androids can’t replace pharmacists on their own, but pharmacists can be replaced by a complex automated system like the University of California San Francisco’s robotic pharmacy, which is capable of producing hundreds of thousands of labeled doses of medicine without error. This is part of the growing trend within large-scale manufacturing to replace teams of people with customized, automated systems. As John Markoff recently detailed, robotic arms are now picking the lettuce we eat, operating the grocery distribution systems that bring that lettuce to our neighborhoods, and building the cars that get us to the store.
Like any great work of science fiction, Humans draws from our current world to ask big questions about who we are now and what we might become. But although the synth may embody our collective fear of being replaced, the android scenario obscures the fact that we already share a world with robots that exceed us in a variety of capacities. While we sit on the sofa watching the rise of androids on screen, the Roomba quietly cleans the carpet around us.
SAM
Construction workers on some sites are getting new, non-union help. SAM – short for semi-automated mason – is a robotic bricklayer being used to increase productivity as it works with human masons.
In this human-robot team, the robot is responsible for the more rote tasks: picking up bricks, applying mortar, and placing them in their designated location. A human handles the more nuanced activities, like setting up the worksite, laying bricks in tricky areas, such as corners, and handling aesthetic details, like cleaning up excess mortar.
Even in completing repetitive tasks, SAM still has to be fairly adaptable. It’s able to complete precise and level work while mounted on a scaffold that sways slightly in the wind. The robot can correct for the differences between theoretical building specifications and what’s actually on site, says Scott Peters, cofounder of Construction Robotics, a company based in Victor, New York, that designed SAM as its debut product.
“In construction, your design will say that a window is located exactly 30 feet from the corner of a building, and in reality when you get to the building, nothing is ever where it says it’s supposed to be,” Peters says. “Masons know how to adapt to that, so we had to design a robot that knows how to do that, too.”
In its current iteration, the system is best suited to work on large swaths of flat walls, most commonly found in projects for universities, hospitals, and other large sites. But some amount of detailed work isn’t beyond the system’s abilities. SAM can emblazon a company logo in brick on a wall, for instance, by following a pixelated map of the image. It can also bump bricks in or out by about half an inch, to create a textured look to a wall face.
SAM is able to apply mortar to bricks before laying them, without human help.
The robot is able to do all of this using a set of algorithms, a handful of sensors that measure incline angles, velocity, and orientation, and a laser. The laser is rigged up between two poles at the extreme left and right sides of the robot’s work space, and moves up and down the wall as work progresses to act as an anchor point for the robot. Without this, the robot would not know exactly where to lay brick, or how to assess its motion on the scaffold relative to where the wall is.
Peters says SAM’s purpose is to augment human masons, not to replace them entirely: a human mason can lay about 300 to 500 bricks a day, while SAM can lay about 800 to 1,200. One human plus one SAM equals the productivity of having four or more masons on the job.
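Those productivity figures can be checked with a little arithmetic, using the article's own numbers; the variable names are invented for illustration:

```python
# Daily bricklaying ranges quoted in the article: (low, high) bricks per day.
HUMAN_BRICKS_PER_DAY = (300, 500)
SAM_BRICKS_PER_DAY = (800, 1200)

# A one-human-plus-one-SAM team, at each end of the quoted ranges.
team_low = HUMAN_BRICKS_PER_DAY[0] + SAM_BRICKS_PER_DAY[0]    # 1100
team_high = HUMAN_BRICKS_PER_DAY[1] + SAM_BRICKS_PER_DAY[1]   # 1700

# Express the team's output in "human mason equivalents".
print(team_low / HUMAN_BRICKS_PER_DAY[0])    # ≈3.7 masons at the low end
print(team_high / HUMAN_BRICKS_PER_DAY[1])   # 3.4 masons at the high end
```

Pairing the ranges this way gives roughly three and a half to four masons' worth of output per team, in line with the "four or more" claim once you allow for how the low and high ends of the ranges are combined.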
Also in the family of robot construction workers is an Australian-designed machine called Hadrian, which can purportedly build a house made of 15,000 bricks in about two days. Hadrian, however, is still a prototype, while Construction Robotics will do its first limited commercial release of SAM this fall. Three units are for sale, each with a price tag around half a million dollars. With this cost in mind, Peters stresses that this kind of system will have the most payback on major commercial projects.